Failing to set up a previously working Redshift connector after update

  • Is this your first time deploying Airbyte?: No
  • OS Version / Instance: Amazon Linux 2 AMI
  • Memory / Disk: t3a.xlarge / 2TB
  • Deployment: docker compose
  • Airbyte Version: 0.35.65-alpha
  • Source name/version: N/A
  • Destination name/version: redshift 0.3.30
  • Step: this happens while I’m setting up the destination, at the point where Airbyte tests the connection values
  • Description:
    Log attached, please advise :slight_smile:
2022-04-13 08:35:24 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/ca93f1e2-9bef-4e48-af86-7908e452fbcb/0/logs.log
2022-04-13 08:35:24 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.35.65-alpha
2022-04-13 08:35:24 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-redshift:0.3.30 exists...
2022-04-13 08:35:24 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-redshift:0.3.30 was found locally.
2022-04-13 08:35:24 INFO i.a.w.p.DockerProcessFactory(create):106 - Creating docker job ID: ca93f1e2-9bef-4e48-af86-7908e452fbcb
2022-04-13 08:35:24 INFO i.a.w.p.DockerProcessFactory(create):158 - Preparing command: docker run --rm --init -i -w /data/ca93f1e2-9bef-4e48-af86-7908e452fbcb/0 --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-redshift:0.3.30 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.35.65-alpha -e WORKER_JOB_ID=ca93f1e2-9bef-4e48-af86-7908e452fbcb airbyte/destination-redshift:0.3.30 check --config source_config.json
2022-04-13 08:35:25 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings.
2022-04-13 08:35:25 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-04-13 08:35:25 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-04-13 08:35:25 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-04-13 08:35:25 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-04-13 08:35:26 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:26 INFO i.a.i.d.r.RedshiftDestination(main):76 - starting destination: class io.airbyte.integrations.destination.redshift.RedshiftDestination
2022-04-13 08:35:26 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:26 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-04-13 08:35:26 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:26 INFO i.a.i.b.IntegrationRunner(runInternal):121 - Running integration: io.airbyte.integrations.destination.redshift.RedshiftDestination
2022-04-13 08:35:26 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:26 INFO i.a.i.b.IntegrationRunner(runInternal):122 - Command: CHECK
2022-04-13 08:35:26 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:26 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-04-13 08:35:27 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:27 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-04-13 08:35:27 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:27 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-04-13 08:35:27 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:27 INFO i.a.i.d.j.c.SwitchingDestination(check):55 - Using destination type: COPY_S3_WITH_SUPER_TMP_TYPE
2022-04-13 08:35:27 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:27 INFO i.a.i.d.s.S3DestinationConfig(createS3Client):165 - Creating S3 client...
2022-04-13 08:35:28 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:28 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):94 - Storage Object bi-airbyte-poc/bi-airbyte-poc does not exist in bucket; creating...
2022-04-13 08:35:28 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:28 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):96 - Storage Object bi-airbyte-poc/bi-airbyte-poc has been created in bucket.
2022-04-13 08:35:28 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:28 INFO i.a.i.d.s.S3Destination(testIAMUserHasListObjectPermission):159 - Started testing if IAM user can call listObjects on the destination bucket
2022-04-13 08:35:29 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:29 INFO i.a.i.d.s.S3Destination(testIAMUserHasListObjectPermission):162 - Finished checking for listObjects permission
2022-04-13 08:35:29 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-04-13 08:35:29 ERROR i.a.i.d.j.c.CopyDestination(check):64 - Exception attempting to connect to the warehouse:
2022-04-13 08:35:29 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - java.lang.UnsupportedOperationException: RedshiftCopyS3Destination.getSqlOperations() without arguments not supported in Redshift destination connector.
2022-04-13 08:35:29 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.destination.redshift.RedshiftCopyS3Destination.getSqlOperations(RedshiftCopyS3Destination.java:78) ~[io.airbyte.airbyte-integrations.connectors-destination-redshift-0.35.65-alpha.jar:?]
2022-04-13 08:35:29 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.destination.jdbc.copy.CopyDestination.check(CopyDestination.java:60) [io.airbyte.airbyte-integrations.connectors-destination-jdbc-0.35.65-alpha.jar:?]
2022-04-13 08:35:29 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.destination.jdbc.copy.SwitchingDestination.check(SwitchingDestination.java:56) [io.airbyte.airbyte-integrations.connectors-destination-jdbc-0.35.65-alpha.jar:?]
2022-04-13 08:35:29 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:138) [io.airbyte.airbyte-integrations.bases-base-java-0.35.65-alpha.jar:?]
2022-04-13 08:35:29 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:105) [io.airbyte.airbyte-integrations.bases-base-java-0.35.65-alpha.jar:?]
2022-04-13 08:35:29 INFO i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.destination.redshift.RedshiftDestination.main(RedshiftDestination.java:77) [io.airbyte.airbyte-integrations.connectors-destination-redshift-0.35.65-alpha.jar:?]
2022-04-13 08:35:29 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...

Also seeing this issue; it is a total showstopper for us.

Finally had success by downgrading the Redshift destination from 0.3.30 to 0.3.0, so the bug was introduced somewhere in between, I think?

How does one downgrade the destination? :thinking:

Settings β†’ Destinations β†’ Change To β†’ 0.3.0

That’s neat, didn’t know I could do that

0.3.28 is what the cloud version runs, which fixed a similar issue for me.

About the Redshift SUPER type upgrade problem:

I think one of the reasons people complain about the destination-redshift upgrade from 0.3.29 (or lower) to 0.3.31+ is the following:

We have two tightly coupled components: destination-redshift and base-normalization.

After SUPER type support was added, both components must be upgraded together:

  • destination-redshift 0.3.31+
  • base-normalization 0.1.77+ (Airbyte platform v0.36.2-alpha+)

This means that if you upgrade destination-redshift, you must also upgrade the Airbyte platform.
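Since two version floors are involved, a quick shell check can confirm whether your versions meet the minimums before upgrading. This is only a sketch: the minimum versions come from this thread, the `version_at_least` helper is hypothetical, and it assumes a `sort` that supports `-V` (GNU coreutils).

```shell
# Minimum versions once SUPER support is involved (per this thread):
#   destination-redshift >= 0.3.31
#   base-normalization   >= 0.1.77 (shipped with Airbyte platform v0.36.2-alpha+)

# version_at_least A B: succeeds when version A >= version B,
# using version-aware sorting (GNU sort -V).
version_at_least() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# The connector version from the original report (0.3.30) is below the floor:
version_at_least "0.3.30" "0.3.31" && echo "destination-redshift ok" || echo "destination-redshift too old"
# prints: destination-redshift too old

version_at_least "0.1.77" "0.1.77" && echo "base-normalization ok" || echo "base-normalization too old"
# prints: base-normalization ok
```

Plug in the image tags you are actually running (e.g. from `docker images`); if either check reports "too old", plan to bump the connector and the platform in the same maintenance window, since upgrading only one of them reproduces the mismatch described above.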