2022-06-06 12:32:30 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/0/logs.log
2022-06-06 12:32:30 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:32:31 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-file:0.2.10 exists...
2022-06-06 12:32:31 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-file:0.2.10 was found locally.
2022-06-06 12:32:31 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:32:31 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/0 --log-driver none --name source-file-check-1831-0-xjaoz --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/source-file:0.2.10 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/source-file:0.2.10 check --config source_config.json
2022-06-06 12:32:48 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):97 - Checking access to sftp://cleaned/FuzeBI_PayCode_Table.CSV...
2022-06-06 12:32:48 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):96 - ignoring unsupported keyword arguments: ['connect_kwargs']
2022-06-06 12:32:48 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):97 - Check succeeded
2022-06-06 12:32:50 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:32:51 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/0/logs.log
2022-06-06 12:32:51 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:32:52 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.26 exists...
2022-06-06 12:32:53 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.26 was found locally.
2022-06-06 12:32:53 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:32:53 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/0 --log-driver none --name destination-snowflake-check-1831-0-hipzd --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.26 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/destination-snowflake:0.4.26 check --config source_config.json
2022-06-06 12:33:00 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings.
2022-06-06 12:33:00 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:33:00 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:33:00 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:33:00 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-06-06 12:33:01 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-06-06 12:33:15 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:15 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-06-06 12:33:15 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:15 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-06-06 12:33:15 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:15 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK
2022-06-06 12:33:15 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:15 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-06-06 12:33:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:17 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:33:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:17 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:33:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:17 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:33:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:17 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:33:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:17 INFO i.a.i.d.j.c.SwitchingDestination(check):55 - Using destination type: INTERNAL_STAGING
2022-06-06 12:33:21 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:21 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-06-06 12:33:28 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:28 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@11eed657
2022-06-06 12:33:28 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:28 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-06-06 12:33:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:29 INFO i.a.d.j.DefaultJdbcDatabase(lambda$unsafeQuery$1):106 - closing connection
2022-06-06 12:33:30 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:30 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated...
2022-06-06 12:33:31 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:33:31 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed.
2022-06-06 12:33:32 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:33:39 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/0/logs.log
2022-06-06 12:33:39 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:33:40 INFO i.a.w.g.DefaultReplicationWorker(run):115 - start sync worker. job id: 1831 attempt id: 0
2022-06-06 12:33:40 INFO i.a.w.g.DefaultReplicationWorker(run):127 - configured sync modes: {}
2022-06-06 12:33:40 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination...
2022-06-06 12:33:40 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.26 exists...
2022-06-06 12:33:41 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.26 was found locally.
2022-06-06 12:33:41 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:33:41 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/0 --log-driver none --name destination-snowflake-write-1831-0-ixmwe --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.26 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/destination-snowflake:0.4.26 write --config destination_config.json --catalog destination_catalog.json
2022-06-06 12:33:41 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-file:0.2.10 exists...
2022-06-06 12:33:42 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-file:0.2.10 was found locally.
2022-06-06 12:33:42 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:33:42 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/0 --log-driver none --name source-file-read-1831-0-dxvnu --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/source-file:0.2.10 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/source-file:0.2.10 read --config source_config.json --catalog source_catalog.json --state input_state.json
2022-06-06 12:33:42 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):395 - Destination output thread started.
2022-06-06 12:33:43 INFO i.a.w.g.DefaultReplicationWorker(run):169 - Waiting for source and destination threads to complete.
2022-06-06 12:33:43 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):298 - Replication thread started.
2022-06-06 12:33:45 destination > SLF4J: Class path contains multiple SLF4J bindings.
2022-06-06 12:33:45 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:33:45 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:33:45 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:33:45 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-06-06 12:33:46 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-06-06 12:33:56 source > Reading PAYCODE (sftp://cleaned/FuzeBI_PayCode_Table.CSV)...
2022-06-06 12:33:56 source > ignoring unsupported keyword arguments: ['connect_kwargs']
2022-06-06 12:33:57 destination > 2022-06-06 12:33:57 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-06-06 12:33:58 destination > 2022-06-06 12:33:57 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-06-06 12:33:58 destination > 2022-06-06 12:33:57 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: WRITE
2022-06-06 12:33:58 destination > 2022-06-06 12:33:58 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-06-06 12:33:59 destination > 2022-06-06 12:33:59 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:33:59 destination > 2022-06-06 12:33:59 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:33:59 destination > 2022-06-06 12:33:59 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:33:59 destination > 2022-06-06 12:33:59 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:33:59 destination > 2022-06-06 12:33:59 INFO i.a.i.d.j.c.SwitchingDestination(getConsumer):65 - Using destination type: INTERNAL_STAGING
2022-06-06 12:34:03 destination > 2022-06-06 12:34:03 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-06-06 12:34:03 destination > 2022-06-06 12:34:03 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):117 - Preparing tmp tables in destination started for 0 streams
2022-06-06 12:34:03 destination > 2022-06-06 12:34:03 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):139 - Preparing tmp tables in destination completed.
2022-06-06 12:34:03 destination > 2022-06-06 12:34:03 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2022-06-06 12:34:03 destination > 2022-06-06 12:34:03 INFO i.a.i.d.b.BufferedStreamConsumer(close):170 - executing on success close procedure.
2022-06-06 12:34:03 destination > 2022-06-06 12:34:03 INFO i.a.i.d.r.SerializedBufferingStrategy(flushAll):92 - Flushing all 0 current buffers (0 bytes in total)
2022-06-06 12:34:03 destination > 2022-06-06 12:34:03 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):186 - Copying into tables in destination started for 0 streams
2022-06-06 12:34:03 destination > 2022-06-06 12:34:03 INFO i.a.i.d.j.SqlOperations(onDestinationCloseOperations):138 - No onDestinationCloseOperations required for this destination.
2022-06-06 12:34:03 destination > 2022-06-06 12:34:03 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):216 - Executing finalization of tables.
2022-06-06 12:34:03 destination > 2022-06-06 12:34:03 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-06-06 12:34:09 destination > 2022-06-06 12:34:09 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@41da3aee
2022-06-06 12:34:09 destination > 2022-06-06 12:34:09 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-06-06 12:34:10 destination > 2022-06-06 12:34:10 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):218 - Finalizing tables in destination completed.
2022-06-06 12:34:10 destination > 2022-06-06 12:34:10 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):220 - Cleaning up destination started for 0 streams
2022-06-06 12:34:10 destination > 2022-06-06 12:34:10 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):235 - Cleaning up destination completed.
2022-06-06 12:34:10 destination > 2022-06-06 12:34:10 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-06-06 12:34:11 ERROR i.a.w.g.DefaultReplicationWorker(run):180 - Sync worker failed.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:173) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:65) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:382) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    ... 1 more
Caused by: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at io.airbyte.workers.RecordSchemaValidator.validateSchema(RecordSchemaValidator.java:51) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:316) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    ... 1 more
2022-06-06 12:34:11 INFO i.a.w.g.DefaultReplicationWorker(run):239 - sync summary: io.airbyte.config.ReplicationAttemptSummary@3e5119ff[status=failed,recordsSynced=0,bytesSynced=0,startTime=1654518820516,endTime=1654518851452,totalStats=io.airbyte.config.SyncStats@3d599e27[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]]
2022-06-06 12:34:11 INFO i.a.w.g.DefaultReplicationWorker(run):268 - Source did not output any state messages
2022-06-06 12:34:11 WARN i.a.w.g.DefaultReplicationWorker(run):276 - State capture: No new state, falling back on input state: io.airbyte.config.State@4a1cd867[state={}]
2022-06-06 12:34:11 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:34:11 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):157 - sync summary: io.airbyte.config.StandardSyncOutput@2b4f6169[standardSyncSummary=io.airbyte.config.StandardSyncSummary@50479b68[status=failed,recordsSynced=0,bytesSynced=0,startTime=1654518820516,endTime=1654518851452,totalStats=io.airbyte.config.SyncStats@3d599e27[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]],normalizationSummary=,state=io.airbyte.config.State@4a1cd867[state={}],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@34252141[streams=[],additionalProperties={}],failures=[io.airbyte.config.FailureReason@1a785f22[failureOrigin=replication,failureType=,internalMessage=java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null,externalMessage=Something went wrong during replication,metadata=io.airbyte.config.Metadata@4a6c212d[additionalProperties={attemptNumber=0, jobId=1831}],stacktrace=java.util.concurrent.CompletionException: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
    at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:382)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
    ... 3 more
Caused by: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at io.airbyte.workers.RecordSchemaValidator.validateSchema(RecordSchemaValidator.java:51)
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:316)
    ... 4 more
,retryable=,timestamp=1654518837338]]]
2022-06-06 12:34:11 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
2022-06-06 12:34:11 INFO i.a.c.p.ConfigRepository(updateConnectionState):775 - Updating connection 1e621fb1-8bf6-4c1e-bc2e-2b1cf14fea66 state: io.airbyte.config.State@63f7b470[state={}]
2022-06-06 12:34:52 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/0/logs.log
2022-06-06 12:34:52 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:34:52 INFO i.a.w.g.DefaultNormalizationWorker(run):49 - Running normalization.
2022-06-06 12:34:52 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.2.1
2022-06-06 12:34:52 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.2.1 exists...
2022-06-06 12:34:53 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.2.1 was found locally.
2022-06-06 12:34:53 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:34:53 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/0/normalize --log-driver none --name normalization-snowflake-normalize-1831-0-ngufi --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha airbyte/normalization-snowflake:0.2.1 run --integration-type snowflake --config destination_config.json --catalog destination_catalog.json
2022-06-06 12:34:54 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/1831/0/normalize
2022-06-06 12:34:56 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/1831/0/normalize')
2022-06-06 12:34:56 normalization > transform_snowflake
2022-06-06 12:34:56 normalization > Running: transform-catalog --integration-type snowflake --profile-config-dir /data/1831/0/normalize --catalog destination_catalog.json --out /data/1831/0/normalize/models/generated/ --json-column _airbyte_data
2022-06-06 12:34:57 normalization > Processing destination_catalog.json...
2022-06-06 12:34:57 normalization > detected no config file for ssh, assuming ssh is off.
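Note on the sync failure above: `matchingSchema` is the per-stream JSON schema that `RecordSchemaValidator.validateSchema` looks up for each incoming record, and the NPE means that lookup returned nothing. The earlier `configured sync modes: {}` and `Preparing tmp tables in destination started for 0 streams` lines suggest the configured catalog for this connection was empty, so no stream (here `PAYCODE`) had a schema to match. A minimal sketch of that failure mode in Python for illustration only (function and key names are hypothetical, not Airbyte's actual code; the `additionalProperties` write stands in for the `ObjectNode.put` call that crashed):

```python
# Sketch: look up a record's stream schema in the configured catalog,
# then annotate it. With an empty catalog the lookup yields None and the
# annotation step would crash (the NullPointerException in the log);
# the guard below fails with an actionable message instead.

def validate_schema(stream_name: str, catalog: dict) -> dict:
    matching_schema = catalog.get(stream_name)  # None when stream not configured
    if matching_schema is None:
        raise ValueError(
            f"Stream '{stream_name}' is not in the configured catalog; "
            "refresh the source schema and re-select streams for the connection."
        )
    # Stand-in for the ObjectNode.put(String, String) call from the stack trace.
    matching_schema["additionalProperties"] = "false"
    return matching_schema
```

With `catalog = {}`, as the `configured sync modes: {}` line implies here, the guard raises the descriptive error; re-running discovery and re-selecting the streams in the connection settings is the usual remedy for this class of failure.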
2022-06-06 12:35:07 normalization > [--event-buffer-size EVENT_BUFFER_SIZE]
2022-06-06 12:35:07 normalization > --event-buffer-size EVENT_BUFFER_SIZE
2022-06-06 12:35:07 normalization >
2022-06-06 12:35:07 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2022-06-06 12:35:07 normalization >
2022-06-06 12:35:17 normalization > 12:35:17 Running with dbt=1.0.0
2022-06-06 12:35:17 normalization > 12:35:17 Partial parse save file not found. Starting full parse.
2022-06-06 12:35:22 normalization > 12:35:22 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2022-06-06 12:35:22 normalization > There are 6 unused configuration paths:
2022-06-06 12:35:22 normalization > - models.airbyte_utils
2022-06-06 12:35:22 normalization > - models.airbyte_utils.generated.airbyte_views
2022-06-06 12:35:22 normalization > - models.airbyte_utils.generated.airbyte_tables
2022-06-06 12:35:22 normalization > - models
2022-06-06 12:35:22 normalization > - models.airbyte_utils.generated.airbyte_ctes
2022-06-06 12:35:22 normalization > - models.airbyte_utils.generated.airbyte_incremental
2022-06-06 12:35:22 normalization >
2022-06-06 12:35:22 normalization > 12:35:22 Found 0 models, 0 tests, 0 snapshots, 0 analyses, 535 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
2022-06-06 12:35:22 normalization > 12:35:22
2022-06-06 12:35:22 normalization > 12:35:22 [WARNING]: Nothing to do. Try checking your model configs and model specification args
2022-06-06 12:35:23 INFO i.a.w.g.DefaultNormalizationWorker(run):73 - Normalization executed in 30 seconds.
2022-06-06 12:35:23 INFO i.a.w.g.DefaultNormalizationWorker(run):79 - Normalization summary: io.airbyte.config.NormalizationSummary@1fdc13da[startTime=1654518892800,endTime=1654518923194]
2022-06-06 12:35:23 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:35:23 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
2022-06-06 12:35:23 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/0/logs.log
2022-06-06 12:35:23 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:35:23 INFO i.a.w.g.DbtTransformationWorker(run):46 - Running dbt transformation.
2022-06-06 12:35:23 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.2.1
2022-06-06 12:35:23 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.2.1 exists...
2022-06-06 12:35:23 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.2.1 was found locally.
2022-06-06 12:35:23 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:35:23 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/0/transform --log-driver none --name normalization-snowflake-normalize-1831-0-nzetc --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha airbyte/normalization-snowflake:0.2.1 configure-dbt --integration-type snowflake --config destination_config.json --git-repo https://fuzeadmin:cleaned@dev.azure.com/FuzeHrSolutionInc/FUZE%20BI/_git/FUZE%20BI --git-branch main
2022-06-06 12:35:24 normalization > Running: git clone --depth 5 -b main --single-branch $GIT_REPO git_repo
2022-06-06 12:35:24 normalization > Cloning into 'git_repo'...
2022-06-06 12:35:28 normalization > Last 5 commits in git_repo:
2022-06-06 12:35:28 normalization > 7a0d093 reorg folder
2022-06-06 12:35:28 normalization > 3e1f607 Add PTB Transformation + CURA incremental + REF
2022-06-06 12:35:28 normalization > 279323b Merge branch 'main' of https://dev.azure.com/FuzeHrSolutionInc/FUZE%20BI/_git/FUZE%20BI
2022-06-06 12:35:28 normalization > c1ff6e1 add MDLITE/Create_table script
2022-06-06 12:35:28 normalization > 625123c Remove ref loads + set week count at account level + set cura with views
2022-06-06 12:35:28 normalization > /data/1831/0/transform
2022-06-06 12:35:28 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/1831/0/transform
2022-06-06 12:35:30 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/1831/0/transform')
2022-06-06 12:35:30 normalization > transform_snowflake
2022-06-06 12:35:30 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if fishtownanalytics/dbt:1.0.0 exists...
2022-06-06 12:35:30 INFO i.a.c.i.LineGobbler(voidCall):82 - fishtownanalytics/dbt:1.0.0 was found locally.
2022-06-06 12:35:30 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:35:30 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/0/transform --log-driver none --name dbt-custom-1831-0-ihshg --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha --entrypoint /bin/bash fishtownanalytics/dbt:1.0.0 entrypoint.sh run --select PTB_DIM_PAY_TYPE
2022-06-06 12:35:31 dbt > Running from /data/1831/0/transform/git_repo
2022-06-06 12:35:31 dbt > detected no config file for ssh, assuming ssh is off.
2022-06-06 12:35:31 dbt > Running: dbt run --select PTB_DIM_PAY_TYPE --profiles-dir=/data/1831/0/transform --project-dir=/data/1831/0/transform/git_repo
2022-06-06 12:35:39 dbt > 12:35:39 Running with dbt=1.0.0
2022-06-06 12:35:39 dbt > 12:35:39 Partial parse save file not found. Starting full parse.
2022-06-06 12:35:41 dbt > 12:35:41 Found 13 models, 0 tests, 0 snapshots, 0 analyses, 180 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
2022-06-06 12:35:41 dbt > 12:35:41
2022-06-06 12:35:45 dbt > 12:35:45 Concurrency: 5 threads (target='prod')
2022-06-06 12:35:45 dbt > 12:35:45
2022-06-06 12:35:45 dbt > 12:35:45 1 of 1 START view model AIRBYTE.PTB_DIM_PAY_TYPE........................................................................ [RUN]
2022-06-06 12:35:46 dbt > 12:35:46 1 of 1 OK created view model AIRBYTE.PTB_DIM_PAY_TYPE................................................................... [SUCCESS 1 in 1.34s]
2022-06-06 12:35:46 dbt > 12:35:46
2022-06-06 12:35:46 dbt > 12:35:46 Finished running 1 view model in 4.76s.
2022-06-06 12:35:46 dbt > 12:35:46
2022-06-06 12:35:46 dbt > 12:35:46 Completed successfully
2022-06-06 12:35:46 dbt > 12:35:46
2022-06-06 12:35:46 dbt > 12:35:46 Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
2022-06-06 12:35:47 INFO i.a.w.g.DbtTransformationWorker(run):66 - Dbt Transformation executed in 0.
2022-06-06 12:35:47 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:35:47 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.storage: must be a constant value HTTPS
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.storage: must be a constant value GCS
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.storage: must be a constant value S3
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.storage_account: is missing but it is required, $.storage: must be a constant value AzBlob
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected, $.storage: must be a constant value SSH
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected, $.storage: must be a constant value SCP
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.storage: must be a constant value local
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.access_token: is missing but it is required, $.refresh_token: is missing but it is required
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected
2022-06-06 12:35:47 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.method: does not have a value in the enumeration [Standard]
2022-06-06 12:35:47 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/1/logs.log
2022-06-06 12:35:47 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:35:48 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-file:0.2.10 exists...
2022-06-06 12:35:48 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-file:0.2.10 was found locally.
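Note on the `JSON schema validation failed` lines above: these INFO messages are routine noise, not errors. When a connector spec offers several `oneOf` variants (the storage types named in the log: HTTPS, GCS, S3, AzBlob, SSH, SCP, local), the validator tries the config against every variant and logs a mismatch for each non-matching branch; the branch that matches produces no message. A minimal pure-Python sketch of that try-each-variant behavior (variant names taken from the log; the matching logic is deliberately simplified to the `storage` constant check):

```python
# Each oneOf variant pins "storage" to a constant value; validating a
# config against all variants yields one failure message per variant
# that does not match, mirroring the repeated INFO lines in the log.

VARIANTS = ["HTTPS", "GCS", "S3", "AzBlob", "SSH", "SCP", "local"]

def try_variants(config: dict) -> list[str]:
    messages = []
    for storage in VARIANTS:
        if config.get("storage") != storage:
            messages.append(f"$.storage: must be a constant value {storage}")
    return messages
```

An SFTP-style config (as used by this source-file connector) matches none of the seven storage constants above, so every variant logs a mismatch, which is why the block of "failed" lines appears even though both connector checks succeed.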
2022-06-06 12:35:48 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:35:48 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/1 --log-driver none --name source-file-check-1831-1-xgkub --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=1 -e WORKER_CONNECTOR_IMAGE=airbyte/source-file:0.2.10 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/source-file:0.2.10 check --config source_config.json
2022-06-06 12:35:53 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):97 - Checking access to sftp://cleaned/FuzeBI_PayCode_Table.CSV...
2022-06-06 12:35:53 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):96 - ignoring unsupported keyword arguments: ['connect_kwargs']
2022-06-06 12:35:53 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):97 - Check succeeded
2022-06-06 12:35:54 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:35:54 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/1/logs.log
2022-06-06 12:35:54 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:35:54 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.26 exists...
2022-06-06 12:35:54 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.26 was found locally.
2022-06-06 12:35:54 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:35:54 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/1 --log-driver none --name destination-snowflake-check-1831-1-jfbax --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=1 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.26 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/destination-snowflake:0.4.26 check --config source_config.json
2022-06-06 12:35:56 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings.
2022-06-06 12:35:56 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:35:56 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:35:56 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:35:56 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-06-06 12:35:56 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-06-06 12:35:59 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:35:59 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-06-06 12:35:59 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:35:59 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-06-06 12:35:59 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:35:59 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK
2022-06-06 12:36:00 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:00 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-06-06 12:36:00 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:00 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:36:00 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:00 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:36:00 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:00 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:36:00 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:00 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:36:00 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:00 INFO i.a.i.d.j.c.SwitchingDestination(check):55 - Using destination type: INTERNAL_STAGING
2022-06-06 12:36:02 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:02 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-06-06 12:36:05 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:05 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@11eed657
2022-06-06 12:36:05 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:05 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-06-06 12:36:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:06 INFO i.a.d.j.DefaultJdbcDatabase(lambda$unsafeQuery$1):106 - closing connection
2022-06-06 12:36:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:08 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated...
2022-06-06 12:36:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:36:09 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed.
2022-06-06 12:36:10 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:36:10 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/1/logs.log
2022-06-06 12:36:10 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:36:10 INFO i.a.w.g.DefaultReplicationWorker(run):115 - start sync worker. job id: 1831 attempt id: 1
2022-06-06 12:36:10 INFO i.a.w.g.DefaultReplicationWorker(run):127 - configured sync modes: {}
2022-06-06 12:36:10 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination...
2022-06-06 12:36:11 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.26 exists...
2022-06-06 12:36:11 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.26 was found locally.
2022-06-06 12:36:11 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:36:11 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/1 --log-driver none --name destination-snowflake-write-1831-1-chqhg --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=1 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.26 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/destination-snowflake:0.4.26 write --config destination_config.json --catalog destination_catalog.json
2022-06-06 12:36:11 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-file:0.2.10 exists...
2022-06-06 12:36:11 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-file:0.2.10 was found locally.
2022-06-06 12:36:11 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:36:11 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/1 --log-driver none --name source-file-read-1831-1-zidhl --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=1 -e WORKER_CONNECTOR_IMAGE=airbyte/source-file:0.2.10 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/source-file:0.2.10 read --config source_config.json --catalog source_catalog.json --state input_state.json
2022-06-06 12:36:11 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):395 - Destination output thread started.
2022-06-06 12:36:11 INFO i.a.w.g.DefaultReplicationWorker(run):169 - Waiting for source and destination threads to complete.
2022-06-06 12:36:11 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):298 - Replication thread started.
2022-06-06 12:36:13 destination > SLF4J: Class path contains multiple SLF4J bindings.
2022-06-06 12:36:13 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:36:13 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:36:13 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:36:13 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-06-06 12:36:13 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-06-06 12:36:20 destination > 2022-06-06 12:36:19 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-06-06 12:36:20 destination > 2022-06-06 12:36:20 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-06-06 12:36:20 destination > 2022-06-06 12:36:20 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: WRITE
2022-06-06 12:36:20 destination > 2022-06-06 12:36:20 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-06-06 12:36:20 source > Reading PAYCODE (sftp://cleaned/FuzeBI_PayCode_Table.CSV)...
2022-06-06 12:36:20 source > ignoring unsupported keyword arguments: ['connect_kwargs']
2022-06-06 12:36:21 destination > 2022-06-06 12:36:21 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:36:21 destination > 2022-06-06 12:36:21 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:36:21 destination > 2022-06-06 12:36:21 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:36:21 destination > 2022-06-06 12:36:21 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:36:21 destination > 2022-06-06 12:36:21 INFO i.a.i.d.j.c.SwitchingDestination(getConsumer):65 - Using destination type: INTERNAL_STAGING
2022-06-06 12:36:23 destination > 2022-06-06 12:36:23 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-06-06 12:36:23 destination > 2022-06-06 12:36:23 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):117 - Preparing tmp tables in destination started for 0 streams
2022-06-06 12:36:23 destination > 2022-06-06 12:36:23 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):139 - Preparing tmp tables in destination completed.
2022-06-06 12:36:23 destination > 2022-06-06 12:36:23 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2022-06-06 12:36:23 destination > 2022-06-06 12:36:23 INFO i.a.i.d.b.BufferedStreamConsumer(close):170 - executing on success close procedure.
2022-06-06 12:36:23 destination > 2022-06-06 12:36:23 INFO i.a.i.d.r.SerializedBufferingStrategy(flushAll):92 - Flushing all 0 current buffers (0 bytes in total)
2022-06-06 12:36:23 destination > 2022-06-06 12:36:23 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):186 - Copying into tables in destination started for 0 streams
2022-06-06 12:36:23 destination > 2022-06-06 12:36:23 INFO i.a.i.d.j.SqlOperations(onDestinationCloseOperations):138 - No onDestinationCloseOperations required for this destination.
2022-06-06 12:36:23 destination > 2022-06-06 12:36:23 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):216 - Executing finalization of tables.
2022-06-06 12:36:23 destination > 2022-06-06 12:36:23 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-06-06 12:36:27 destination > 2022-06-06 12:36:27 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@41da3aee
2022-06-06 12:36:27 destination > 2022-06-06 12:36:27 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-06-06 12:36:27 destination > 2022-06-06 12:36:27 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):218 - Finalizing tables in destination completed.
2022-06-06 12:36:27 destination > 2022-06-06 12:36:27 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):220 - Cleaning up destination started for 0 streams
2022-06-06 12:36:27 destination > 2022-06-06 12:36:27 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):235 - Cleaning up destination completed.
2022-06-06 12:36:27 destination > 2022-06-06 12:36:27 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-06-06 12:36:29 ERROR i.a.w.g.DefaultReplicationWorker(run):180 - Sync worker failed.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:173) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:65) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:382) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    ... 1 more
Caused by: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at io.airbyte.workers.RecordSchemaValidator.validateSchema(RecordSchemaValidator.java:51) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:316) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    ... 1 more
2022-06-06 12:36:29 INFO i.a.w.g.DefaultReplicationWorker(run):239 - sync summary: io.airbyte.config.ReplicationAttemptSummary@783ad226[status=failed,recordsSynced=0,bytesSynced=0,startTime=1654518970992,endTime=1654518989615,totalStats=io.airbyte.config.SyncStats@fa8417a[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]]
2022-06-06 12:36:29 INFO i.a.w.g.DefaultReplicationWorker(run):268 - Source did not output any state messages
2022-06-06 12:36:29 WARN i.a.w.g.DefaultReplicationWorker(run):276 - State capture: No new state, falling back on input state: io.airbyte.config.State@111f16f2[state={}]
2022-06-06 12:36:29 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:36:29 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):157 - sync summary: io.airbyte.config.StandardSyncOutput@69317f18[standardSyncSummary=io.airbyte.config.StandardSyncSummary@2c33ec5a[status=failed,recordsSynced=0,bytesSynced=0,startTime=1654518970992,endTime=1654518989615,totalStats=io.airbyte.config.SyncStats@fa8417a[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]],normalizationSummary=,state=io.airbyte.config.State@111f16f2[state={}],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@772c7e0[streams=[],additionalProperties={}],failures=[io.airbyte.config.FailureReason@1dd50481[failureOrigin=replication,failureType=,internalMessage=java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null,externalMessage=Something went wrong during replication,metadata=io.airbyte.config.Metadata@7f7e719c[additionalProperties={attemptNumber=1, jobId=1831}],stacktrace=java.util.concurrent.CompletionException: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
    at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:382)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
    ... 3 more
Caused by: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    at io.airbyte.workers.RecordSchemaValidator.validateSchema(RecordSchemaValidator.java:51)
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:316)
    ... 4 more
,retryable=,timestamp=1654518980683]]]
2022-06-06 12:36:29 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
2022-06-06 12:36:29 INFO i.a.c.p.ConfigRepository(updateConnectionState):775 - Updating connection 12565a7b-a078-4cb8-9778-91753781c43a state: io.airbyte.config.State@647a1899[state={}]
2022-06-06 12:36:29 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/1/logs.log
2022-06-06 12:36:29 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:36:29 INFO i.a.w.g.DefaultNormalizationWorker(run):49 - Running normalization.
2022-06-06 12:36:29 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.2.1
2022-06-06 12:36:29 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.2.1 exists...
2022-06-06 12:36:29 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.2.1 was found locally.
2022-06-06 12:36:29 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:36:29 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/1/normalize --log-driver none --name normalization-snowflake-normalize-1831-1-zxwzx --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha airbyte/normalization-snowflake:0.2.1 run --integration-type snowflake --config destination_config.json --catalog destination_catalog.json
2022-06-06 12:36:30 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/1831/1/normalize
2022-06-06 12:36:31 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/1831/1/normalize')
2022-06-06 12:36:31 normalization > transform_snowflake
2022-06-06 12:36:31 normalization > Running: transform-catalog --integration-type snowflake --profile-config-dir /data/1831/1/normalize --catalog destination_catalog.json --out /data/1831/1/normalize/models/generated/ --json-column _airbyte_data
2022-06-06 12:36:32 normalization > Processing destination_catalog.json...
2022-06-06 12:36:32 normalization > detected no config file for ssh, assuming ssh is off.
2022-06-06 12:36:40 normalization > [--event-buffer-size EVENT_BUFFER_SIZE]
2022-06-06 12:36:40 normalization > --event-buffer-size EVENT_BUFFER_SIZE
2022-06-06 12:36:40 normalization >
2022-06-06 12:36:40 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2022-06-06 12:36:40 normalization >
2022-06-06 12:36:48 normalization > 12:36:48 Running with dbt=1.0.0
2022-06-06 12:36:48 normalization > 12:36:48 Partial parse save file not found. Starting full parse.
2022-06-06 12:36:51 normalization > 12:36:51 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2022-06-06 12:36:51 normalization > There are 6 unused configuration paths:
2022-06-06 12:36:51 normalization > - models.airbyte_utils.generated.airbyte_incremental
2022-06-06 12:36:51 normalization > - models.airbyte_utils.generated.airbyte_tables
2022-06-06 12:36:51 normalization > - models.airbyte_utils.generated.airbyte_views
2022-06-06 12:36:51 normalization > - models.airbyte_utils.generated.airbyte_ctes
2022-06-06 12:36:51 normalization > - models.airbyte_utils
2022-06-06 12:36:51 normalization > - models
2022-06-06 12:36:51 normalization >
2022-06-06 12:36:51 normalization > 12:36:51 Found 0 models, 0 tests, 0 snapshots, 0 analyses, 535 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
2022-06-06 12:36:51 normalization > 12:36:51
2022-06-06 12:36:51 normalization > 12:36:51 [WARNING]: Nothing to do. Try checking your model configs and model specification args
2022-06-06 12:36:52 INFO i.a.w.g.DefaultNormalizationWorker(run):73 - Normalization executed in 22 seconds.
2022-06-06 12:36:52 INFO i.a.w.g.DefaultNormalizationWorker(run):79 - Normalization summary: io.airbyte.config.NormalizationSummary@73310771[startTime=1654518989802,endTime=1654519012573]
2022-06-06 12:36:52 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:36:52 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
2022-06-06 12:36:52 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/1/logs.log
2022-06-06 12:36:52 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:36:52 INFO i.a.w.g.DbtTransformationWorker(run):46 - Running dbt transformation.
2022-06-06 12:36:52 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.2.1
2022-06-06 12:36:52 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.2.1 exists...
2022-06-06 12:36:53 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.2.1 was found locally.
2022-06-06 12:36:53 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:36:53 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/1/transform --log-driver none --name normalization-snowflake-normalize-1831-1-ggenh --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha airbyte/normalization-snowflake:0.2.1 configure-dbt --integration-type snowflake --config destination_config.json --git-repo https://fuzeadmin:cleaned@dev.azure.com/FuzeHrSolutionInc/FUZE%20BI/_git/FUZE%20BI --git-branch main
2022-06-06 12:36:53 normalization > Running: git clone --depth 5 -b main --single-branch $GIT_REPO git_repo
2022-06-06 12:36:53 normalization > Cloning into 'git_repo'...
2022-06-06 12:36:54 normalization > Last 5 commits in git_repo:
2022-06-06 12:36:54 normalization > 7a0d093 reorg folder
2022-06-06 12:36:54 normalization > 3e1f607 Add PTB Transformation + CURA incremental + REF
2022-06-06 12:36:54 normalization > 279323b Merge branch 'main' of https://dev.azure.com/FuzeHrSolutionInc/FUZE%20BI/_git/FUZE%20BI
2022-06-06 12:36:54 normalization > c1ff6e1 add MDLITE/Create_table script
2022-06-06 12:36:54 normalization > 625123c Remove ref loads + set week count at account level + set cura with views
2022-06-06 12:36:54 normalization > /data/1831/1/transform
2022-06-06 12:36:54 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/1831/1/transform
2022-06-06 12:36:55 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/1831/1/transform')
2022-06-06 12:36:55 normalization > transform_snowflake
2022-06-06 12:36:56 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if fishtownanalytics/dbt:1.0.0 exists...
2022-06-06 12:36:56 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:36:56 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/1/transform --log-driver none --name dbt-custom-1831-1-gaemv --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha --entrypoint /bin/bash fishtownanalytics/dbt:1.0.0 entrypoint.sh run --select PTB_DIM_PAY_TYPE
2022-06-06 12:36:56 INFO i.a.c.i.LineGobbler(voidCall):82 - fishtownanalytics/dbt:1.0.0 was found locally.
2022-06-06 12:36:56 dbt > Running from /data/1831/1/transform/git_repo
2022-06-06 12:36:56 dbt > detected no config file for ssh, assuming ssh is off.
2022-06-06 12:36:56 dbt > Running: dbt run --select PTB_DIM_PAY_TYPE --profiles-dir=/data/1831/1/transform --project-dir=/data/1831/1/transform/git_repo
2022-06-06 12:37:05 dbt > 12:37:05 Running with dbt=1.0.0
2022-06-06 12:37:05 dbt > 12:37:05 Partial parse save file not found. Starting full parse.
2022-06-06 12:37:07 dbt > 12:37:07 Found 13 models, 0 tests, 0 snapshots, 0 analyses, 180 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
2022-06-06 12:37:07 dbt > 12:37:07
2022-06-06 12:37:10 dbt > 12:37:10 Concurrency: 5 threads (target='prod')
2022-06-06 12:37:10 dbt > 12:37:10
2022-06-06 12:37:10 dbt > 12:37:10 1 of 1 START view model AIRBYTE.PTB_DIM_PAY_TYPE........................................................................ [RUN]
2022-06-06 12:37:12 dbt > 12:37:12 1 of 1 OK created view model AIRBYTE.PTB_DIM_PAY_TYPE................................................................... [SUCCESS 1 in 1.36s]
2022-06-06 12:37:12 dbt > 12:37:12
2022-06-06 12:37:12 dbt > 12:37:12 Finished running 1 view model in 4.78s.
2022-06-06 12:37:12 dbt > 12:37:12
2022-06-06 12:37:12 dbt > 12:37:12 Completed successfully
2022-06-06 12:37:12 dbt > 12:37:12
2022-06-06 12:37:12 dbt > 12:37:12 Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
2022-06-06 12:37:12 INFO i.a.w.g.DbtTransformationWorker(run):66 - Dbt Transformation executed in 0.
2022-06-06 12:37:12 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:37:12 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.storage: must be a constant value HTTPS
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.storage: must be a constant value GCS
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.storage: must be a constant value S3
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.storage_account: is missing but it is required, $.storage: must be a constant value AzBlob
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected, $.storage: must be a constant value SSH
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected, $.storage: must be a constant value SCP
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.storage: must be a constant value local
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.access_token: is missing but it is required, $.refresh_token: is missing but it is required
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected
2022-06-06 12:37:13 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.method: does not have a value in the enumeration [Standard]
2022-06-06 12:37:13 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/2/logs.log
2022-06-06 12:37:13 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:37:13 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-file:0.2.10 exists...
2022-06-06 12:37:13 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-file:0.2.10 was found locally.
2022-06-06 12:37:13 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:37:13 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/2 --log-driver none --name source-file-check-1831-2-vvour --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=2 -e WORKER_CONNECTOR_IMAGE=airbyte/source-file:0.2.10 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/source-file:0.2.10 check --config source_config.json
2022-06-06 12:37:19 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):97 - Checking access to sftp://cleaned/FuzeBI_PayCode_Table.CSV...
2022-06-06 12:37:19 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):96 - ignoring unsupported keyword arguments: ['connect_kwargs']
2022-06-06 12:37:19 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):97 - Check succeeded
2022-06-06 12:37:20 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:37:20 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/2/logs.log
2022-06-06 12:37:20 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:37:20 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.26 exists...
2022-06-06 12:37:20 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.26 was found locally.
2022-06-06 12:37:20 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:37:20 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/2 --log-driver none --name destination-snowflake-check-1831-2-nmlhr --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=2 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.26 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/destination-snowflake:0.4.26 check --config source_config.json
2022-06-06 12:37:22 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings.
2022-06-06 12:37:22 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:37:22 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:37:22 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:37:22 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-06-06 12:37:22 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-06-06 12:37:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:26 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-06-06 12:37:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:26 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-06-06 12:37:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:26 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK
2022-06-06 12:37:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:26 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-06-06 12:37:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:27 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:37:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:27 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:37:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:27 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:37:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:27 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:37:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:27 INFO i.a.i.d.j.c.SwitchingDestination(check):55 - Using destination type: INTERNAL_STAGING
2022-06-06 12:37:28 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:28 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-06-06 12:37:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:32 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@11eed657
2022-06-06 12:37:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:32 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-06-06 12:37:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:33 INFO i.a.d.j.DefaultJdbcDatabase(lambda$unsafeQuery$1):106 - closing connection
2022-06-06 12:37:34 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:34 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated...
2022-06-06 12:37:35 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-06-06 12:37:35 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed.
2022-06-06 12:37:35 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:37:36 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/2/logs.log
2022-06-06 12:37:36 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:37:36 INFO i.a.w.g.DefaultReplicationWorker(run):115 - start sync worker. job id: 1831 attempt id: 2
2022-06-06 12:37:36 INFO i.a.w.g.DefaultReplicationWorker(run):127 - configured sync modes: {}
2022-06-06 12:37:36 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination...
2022-06-06 12:37:36 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.26 exists...
2022-06-06 12:37:36 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:37:36 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.26 was found locally.
2022-06-06 12:37:36 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/2 --log-driver none --name destination-snowflake-write-1831-2-ubhny --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=2 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.26 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/destination-snowflake:0.4.26 write --config destination_config.json --catalog destination_catalog.json
2022-06-06 12:37:36 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-file:0.2.10 exists...
2022-06-06 12:37:37 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:37:37 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/2 --log-driver none --name source-file-read-1831-2-tqekl --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=2 -e WORKER_CONNECTOR_IMAGE=airbyte/source-file:0.2.10 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha -e WORKER_JOB_ID=1831 airbyte/source-file:0.2.10 read --config source_config.json --catalog source_catalog.json --state input_state.json
2022-06-06 12:37:37 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-file:0.2.10 was found locally.
2022-06-06 12:37:37 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):395 - Destination output thread started.
2022-06-06 12:37:37 INFO i.a.w.g.DefaultReplicationWorker(run):169 - Waiting for source and destination threads to complete.
2022-06-06 12:37:37 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):298 - Replication thread started.
2022-06-06 12:37:38 destination > SLF4J: Class path contains multiple SLF4J bindings.
2022-06-06 12:37:38 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:37:38 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:37:38 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-06-06 12:37:38 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-06-06 12:37:39 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-06-06 12:37:43 source > Reading PAYCODE (sftp://cleaned/FuzeBI_PayCode_Table.CSV)...
2022-06-06 12:37:43 source > ignoring unsupported keyword arguments: ['connect_kwargs']
2022-06-06 12:37:43 destination > 2022-06-06 12:37:43 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-06-06 12:37:43 destination > 2022-06-06 12:37:43 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-06-06 12:37:43 destination > 2022-06-06 12:37:43 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: WRITE
2022-06-06 12:37:43 destination > 2022-06-06 12:37:43 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-06-06 12:37:44 destination > 2022-06-06 12:37:44 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:37:44 destination > 2022-06-06 12:37:44 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:37:44 destination > 2022-06-06 12:37:44 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:37:44 destination > 2022-06-06 12:37:44 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-06-06 12:37:44 destination > 2022-06-06 12:37:44 INFO i.a.i.d.j.c.SwitchingDestination(getConsumer):65 - Using destination type: INTERNAL_STAGING
2022-06-06 12:37:45 destination > 2022-06-06 12:37:45 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-06-06 12:37:45 destination > 2022-06-06 12:37:45 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):117 - Preparing tmp tables in destination started for 0 streams
2022-06-06 12:37:45 destination > 2022-06-06 12:37:45 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):139 - Preparing tmp tables in destination completed.
2022-06-06 12:37:45 destination > 2022-06-06 12:37:45 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2022-06-06 12:37:45 destination > 2022-06-06 12:37:45 INFO i.a.i.d.b.BufferedStreamConsumer(close):170 - executing on success close procedure.
2022-06-06 12:37:45 destination > 2022-06-06 12:37:45 INFO i.a.i.d.r.SerializedBufferingStrategy(flushAll):92 - Flushing all 0 current buffers (0 bytes in total)
2022-06-06 12:37:45 destination > 2022-06-06 12:37:45 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):186 - Copying into tables in destination started for 0 streams
2022-06-06 12:37:45 destination > 2022-06-06 12:37:45 INFO i.a.i.d.j.SqlOperations(onDestinationCloseOperations):138 - No onDestinationCloseOperations required for this destination.
2022-06-06 12:37:46 destination > 2022-06-06 12:37:45 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):216 - Executing finalization of tables.
2022-06-06 12:37:46 destination > 2022-06-06 12:37:46 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-06-06 12:37:49 destination > 2022-06-06 12:37:49 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@41da3aee
2022-06-06 12:37:49 destination > 2022-06-06 12:37:49 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-06-06 12:37:51 destination > 2022-06-06 12:37:51 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):218 - Finalizing tables in destination completed.
2022-06-06 12:37:51 destination > 2022-06-06 12:37:51 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):220 - Cleaning up destination started for 0 streams
2022-06-06 12:37:51 destination > 2022-06-06 12:37:51 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):235 - Cleaning up destination completed.
2022-06-06 12:37:51 destination > 2022-06-06 12:37:51 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-06-06 12:37:52 ERROR i.a.w.g.DefaultReplicationWorker(run):180 - Sync worker failed.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.NullPointerException
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:173) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:65) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: java.lang.RuntimeException: java.lang.NullPointerException
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:382) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    ... 1 more
Caused by: java.lang.NullPointerException
2022-06-06 12:37:52 INFO i.a.w.g.DefaultReplicationWorker(run):239 - sync summary: io.airbyte.config.ReplicationAttemptSummary@45c3101c[status=failed,recordsSynced=0,bytesSynced=0,startTime=1654519056456,endTime=1654519072740,totalStats=io.airbyte.config.SyncStats@60b6ad5b[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]]
2022-06-06 12:37:52 INFO i.a.w.g.DefaultReplicationWorker(run):268 - Source did not output any state messages
2022-06-06 12:37:52 WARN i.a.w.g.DefaultReplicationWorker(run):276 - State capture: No new state, falling back on input state: io.airbyte.config.State@4a7f417b[state={}]
2022-06-06 12:37:52 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:37:52 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):157 - sync summary: io.airbyte.config.StandardSyncOutput@57a1837b[standardSyncSummary=io.airbyte.config.StandardSyncSummary@12a307eb[status=failed,recordsSynced=0,bytesSynced=0,startTime=1654519056456,endTime=1654519072740,totalStats=io.airbyte.config.SyncStats@60b6ad5b[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]],normalizationSummary=,state=io.airbyte.config.State@4a7f417b[state={}],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@a8ac04[streams=[],additionalProperties={}],failures=[io.airbyte.config.FailureReason@23fbd14c[failureOrigin=replication,failureType=,internalMessage=java.lang.RuntimeException: java.lang.NullPointerException,externalMessage=Something went wrong during replication,metadata=io.airbyte.config.Metadata@781a2413[additionalProperties={attemptNumber=2, jobId=1831}],stacktrace=java.util.concurrent.CompletionException: java.lang.RuntimeException: java.lang.NullPointerException
    at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
    at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.RuntimeException: java.lang.NullPointerException
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:382)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
    ... 3 more
Caused by: java.lang.NullPointerException
,retryable=,timestamp=1654519064095]]]
2022-06-06 12:37:52 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
2022-06-06 12:37:52 INFO i.a.c.p.ConfigRepository(updateConnectionState):775 - Updating connection 12565a7b-a078-4cb8-9778-91753781c43a state: io.airbyte.config.State@4a0d7460[state={}]
2022-06-06 12:37:52 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/2/logs.log
2022-06-06 12:37:52 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:37:52 INFO i.a.w.g.DefaultNormalizationWorker(run):49 - Running normalization.
2022-06-06 12:37:52 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.2.1
2022-06-06 12:37:52 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.2.1 exists...
2022-06-06 12:37:53 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.2.1 was found locally.
2022-06-06 12:37:53 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:37:53 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/2/normalize --log-driver none --name normalization-snowflake-normalize-1831-2-fceqi --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha airbyte/normalization-snowflake:0.2.1 run --integration-type snowflake --config destination_config.json --catalog destination_catalog.json
2022-06-06 12:37:53 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/1831/2/normalize
2022-06-06 12:37:54 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/1831/2/normalize')
2022-06-06 12:37:54 normalization > transform_snowflake
2022-06-06 12:37:54 normalization > Running: transform-catalog --integration-type snowflake --profile-config-dir /data/1831/2/normalize --catalog destination_catalog.json --out /data/1831/2/normalize/models/generated/ --json-column _airbyte_data
2022-06-06 12:37:55 normalization > Processing destination_catalog.json...
2022-06-06 12:37:55 normalization > detected no config file for ssh, assuming ssh is off.
2022-06-06 12:38:02 normalization > [--event-buffer-size EVENT_BUFFER_SIZE]
2022-06-06 12:38:02 normalization > --event-buffer-size EVENT_BUFFER_SIZE
2022-06-06 12:38:02 normalization >
2022-06-06 12:38:02 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2022-06-06 12:38:02 normalization >
2022-06-06 12:38:10 normalization > 12:38:10 Running with dbt=1.0.0
2022-06-06 12:38:10 normalization > 12:38:10 Partial parse save file not found. Starting full parse.
2022-06-06 12:38:14 normalization > 12:38:14 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2022-06-06 12:38:14 normalization > There are 6 unused configuration paths:
2022-06-06 12:38:14 normalization > - models.airbyte_utils.generated.airbyte_views
2022-06-06 12:38:14 normalization > - models.airbyte_utils.generated.airbyte_ctes
2022-06-06 12:38:14 normalization > - models.airbyte_utils.generated.airbyte_incremental
2022-06-06 12:38:14 normalization > - models.airbyte_utils
2022-06-06 12:38:14 normalization > - models
2022-06-06 12:38:14 normalization > - models.airbyte_utils.generated.airbyte_tables
2022-06-06 12:38:14 normalization >
2022-06-06 12:38:14 normalization > 12:38:14 Found 0 models, 0 tests, 0 snapshots, 0 analyses, 535 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
2022-06-06 12:38:14 normalization > 12:38:14
2022-06-06 12:38:14 normalization > 12:38:14 [WARNING]: Nothing to do. Try checking your model configs and model specification args
2022-06-06 12:38:15 INFO i.a.w.g.DefaultNormalizationWorker(run):73 - Normalization executed in 22 seconds.
2022-06-06 12:38:15 INFO i.a.w.g.DefaultNormalizationWorker(run):79 - Normalization summary: io.airbyte.config.NormalizationSummary@1bcebf74[startTime=1654519072880,endTime=1654519095249]
2022-06-06 12:38:15 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:38:15 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
2022-06-06 12:38:15 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/1831/2/logs.log
2022-06-06 12:38:15 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.7-alpha
2022-06-06 12:38:15 INFO i.a.w.g.DbtTransformationWorker(run):46 - Running dbt transformation.
2022-06-06 12:38:15 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.2.1
2022-06-06 12:38:15 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.2.1 exists...
2022-06-06 12:38:15 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.2.1 was found locally.
2022-06-06 12:38:15 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:38:15 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/2/transform --log-driver none --name normalization-snowflake-normalize-1831-2-zapuf --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha airbyte/normalization-snowflake:0.2.1 configure-dbt --integration-type snowflake --config destination_config.json --git-repo https://cleaned@dev.azure.com/FuzeHrSolutionInc/FUZE%20BI/_git/FUZE%20BI --git-branch main
2022-06-06 12:38:16 normalization > Cloning into 'git_repo'...
2022-06-06 12:38:16 normalization > Running: git clone --depth 5 -b main --single-branch $GIT_REPO git_repo
2022-06-06 12:38:16 normalization > Last 5 commits in git_repo:
2022-06-06 12:38:16 normalization > 7a0d093 reorg folder
2022-06-06 12:38:16 normalization > 3e1f607 Add PTB Transformation + CURA incremental + REF
2022-06-06 12:38:16 normalization > 279323b Merge branch 'main' of https://dev.azure.com/FuzeHrSolutionInc/FUZE%20BI/_git/FUZE%20BI
2022-06-06 12:38:16 normalization > c1ff6e1 add MDLITE/Create_table script
2022-06-06 12:38:16 normalization > 625123c Remove ref loads + set week count at account level + set cura with views
2022-06-06 12:38:16 normalization > /data/1831/2/transform
2022-06-06 12:38:16 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/1831/2/transform
2022-06-06 12:38:17 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/1831/2/transform')
2022-06-06 12:38:17 normalization > transform_snowflake
2022-06-06 12:38:18 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if fishtownanalytics/dbt:1.0.0 exists...
2022-06-06 12:38:18 INFO i.a.c.i.LineGobbler(voidCall):82 - fishtownanalytics/dbt:1.0.0 was found locally.
2022-06-06 12:38:18 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 1831
2022-06-06 12:38:18 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/1831/2/transform --log-driver none --name dbt-custom-1831-2-jpmsc --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.7-alpha --entrypoint /bin/bash fishtownanalytics/dbt:1.0.0 entrypoint.sh run --select PTB_DIM_PAY_TYPE
2022-06-06 12:38:18 dbt > Running from /data/1831/2/transform/git_repo
2022-06-06 12:38:18 dbt > detected no config file for ssh, assuming ssh is off.
2022-06-06 12:38:18 dbt > Running: dbt run --select PTB_DIM_PAY_TYPE --profiles-dir=/data/1831/2/transform --project-dir=/data/1831/2/transform/git_repo
2022-06-06 12:38:27 dbt > 12:38:27 Running with dbt=1.0.0
2022-06-06 12:38:27 dbt > 12:38:27 Partial parse save file not found. Starting full parse.
2022-06-06 12:38:29 dbt > 12:38:29 Found 13 models, 0 tests, 0 snapshots, 0 analyses, 180 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
2022-06-06 12:38:29 dbt > 12:38:29
2022-06-06 12:38:32 dbt > 12:38:32 Concurrency: 5 threads (target='prod')
2022-06-06 12:38:32 dbt > 12:38:32
2022-06-06 12:38:32 dbt > 12:38:32 1 of 1 START view model AIRBYTE.PTB_DIM_PAY_TYPE........................................................................ [RUN]
2022-06-06 12:38:33 dbt > 12:38:33 1 of 1 OK created view model AIRBYTE.PTB_DIM_PAY_TYPE................................................................... [SUCCESS 1 in 1.59s]
2022-06-06 12:38:33 dbt > 12:38:33
2022-06-06 12:38:33 dbt > 12:38:33 Finished running 1 view model in 4.86s.
2022-06-06 12:38:33 dbt > 12:38:33
2022-06-06 12:38:33 dbt > 12:38:33 Completed successfully
2022-06-06 12:38:33 dbt > 12:38:33
2022-06-06 12:38:33 dbt > 12:38:33 Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
2022-06-06 12:38:34 INFO i.a.w.g.DbtTransformationWorker(run):66 - Dbt Transformation executed in 0.
2022-06-06 12:38:34 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-06-06 12:38:34 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...