Error adding custom connector to local Airbyte instance


When trying to add a custom connector to a local Airbyte instance, an ‘Internal Server Error: Get Spec job failed’ error is returned. The error appears to occur while fetching the spec from the custom connector.


Hi! I built a custom connector outside of the Airbyte repo and have its Docker image published. When I try to add it to my local Airbyte instance, it returns "Internal Server Error: Get Spec job failed":

```json
{
    "message": "Internal Server Error: Get Spec job failed.",
    "exceptionClassName": "java.lang.IllegalStateException",
    "exceptionStack": [
        "java.lang.IllegalStateException: Get Spec job failed.",
        "\tat io.airbyte.commons.server.converters.SpecFetcher.getSpecFromJob(",
        "\tat io.airbyte.commons.server.handlers.helpers.ActorDefinitionHandlerHelper.getSpecForImage(",
        "\tat io.airbyte.commons.server.handlers.helpers.ActorDefinitionHandlerHelper.defaultDefinitionVersionFromCreate(",
        "\tat io.airbyte.commons.server.handlers.DestinationDefinitionsHandler.createCustomDestinationDefinition(",
        "\tat io.airbyte.server.apis.DestinationDefinitionApiController.lambda$createCustomDestinationDefinition$0(",
        "\tat io.airbyte.server.apis.ApiHelper.execute(",
        "\tat io.airbyte.server.apis.DestinationDefinitionApiController.createCustomDestinationDefinition(",
        "\tat io.airbyte.server.apis.$DestinationDefinitionApiController$Definition$Exec.dispatch(Unknown Source)",
        "\tat io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(",
        "\tat io.micronaut.context.DefaultBeanContext$4.invoke(",
        "\tat io.micronaut.web.router.AbstractRouteMatch.execute(",
        "\tat io.micronaut.web.router.RouteMatch.execute(",
        "\tat io.micronaut.http.context.ServerRequestContext.with(",
        "\tat io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(",
        "\tat reactor.core.publisher.FluxDeferContextual.subscribe(",
        "\tat reactor.core.publisher.InternalFluxOperator.subscribe(",
        "\tat reactor.core.publisher.FluxSubscribeOn$",
        "\tat io.micronaut.reactive.reactor.instrument.ReactorInstrumentation.lambda$init$0(",
        "\tat java.base/",
        "\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(",
        "\tat java.base/java.util.concurrent.ThreadPoolExecutor$",
        "\tat java.base/"
    ]
}
```
But when I pull the Docker image and run the spec command manually, it works just fine.
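For reference, the manual check described above looks roughly like this; the image name is a placeholder, so substitute your published connector image:

```shell
# Placeholder image name (assumption) -- substitute your published connector image.
IMAGE="yourname/destination-custom:0.1.0"

if command -v docker >/dev/null 2>&1; then
  docker pull "$IMAGE" || true
  # Every Airbyte connector image must answer the `spec` command
  # with a SPEC message (JSON) on stdout:
  docker run --rm "$IMAGE" spec || true
else
  echo "docker is not available on this machine"
fi
```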



This topic has been created from a Slack thread to give it more visibility.
It will be in read-only mode here. [Click here]( if you want to access the original thread.

[Join the conversation on Slack](

Tags: custom-connector, docker-image, internal-server-error, get-spec-job-failed

The “Get Spec job failed” error you’re encountering when adding your custom connector to Airbyte can have several causes. Here are some potential issues and solutions that have been reported:

1. Image naming: Ensure that your Docker image name doesn’t contain any “bad” symbols, like underscores or nested folders in the Docker repository. This was mentioned in a forum post.
2. Docker image tag: If you’re using the `latest` tag for your Docker image, try switching to a real version number. This was a solution provided by a user in a GitHub issue comment.
3. Dependencies: Make sure all necessary dependencies are included in your build. A user in a forum post solved the issue by adding the missing dependencies.
4. Architecture compatibility: If you built your Docker image on a different architecture (e.g., an M1 Mac), it might have issues running in Airbyte. You can try building the image on the machine that runs Airbyte, or in the cloud, to work around this. This was suggested by a support agent in a forum post.

If none of these solutions work, check the server logs for more detailed error messages, or post your issue on the Airbyte GitHub or forum for further assistance.
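For points 2 and 4 above, you can compare the platform the image was built for against your host architecture; a quick sketch (the image name is a placeholder):

```shell
# Placeholder image name (assumption) -- use your actual connector image.
IMAGE="yourname/destination-custom:0.1.0"

# Architecture of this machine (arm64/aarch64 on Apple Silicon, x86_64 on Intel):
echo "host: $(uname -m)"

# Platform the image was built for, e.g. linux/amd64:
if command -v docker >/dev/null 2>&1; then
  docker inspect --format '{{.Os}}/{{.Architecture}}' "$IMAGE" || true
fi
```

If the two don't match (e.g. `linux/amd64` image on an arm64 host), that mismatch is a likely culprit.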

If you want to post the spec output, we could take a look. Likely something in the spec format or fields is not compatible?

It was an arch issue! My machine is arm64 and my Docker image was built for amd64. Thanks!
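For anyone hitting the same mismatch: one way to fix it is to rebuild the image for the platform your Airbyte host runs on (or publish a multi-arch image) with `docker buildx`. A sketch, with a placeholder image name:

```shell
# Placeholder image name (assumption).
IMAGE="yourname/destination-custom:0.1.0"

# Run from the connector's source directory (where its Dockerfile lives).
if command -v docker >/dev/null 2>&1 && [ -f Dockerfile ]; then
  # Build for one platform that matches the Airbyte host...
  docker buildx build --platform linux/arm64 -t "$IMAGE" .
  # ...or publish a single multi-arch image that covers both:
  # docker buildx build --platform linux/amd64,linux/arm64 -t "$IMAGE" --push .
fi
```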