2022-05-12 10:52:51 INFO i.a.w.w.WorkerRun(call):49 - Executing worker wrapper. Airbyte version: 0.36.1-alpha
2022-05-12 10:52:51 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/911/0/logs.log
2022-05-12 10:52:51 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.36.1-alpha
2022-05-12 10:52:51 INFO i.a.w.DefaultReplicationWorker(run):104 - start sync worker. job id: 911 attempt id: 0
2022-05-12 10:52:51 INFO i.a.w.DefaultReplicationWorker(run):116 - configured sync modes: {null.owners=full_refresh - overwrite}
2022-05-12 10:52:51 INFO i.a.w.p.a.DefaultAirbyteDestination(start):69 - Running destination...
2022-05-12 10:52:51 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-bigquery:1.1.4 exists...
2022-05-12 10:52:51 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-bigquery:1.1.4 was found locally.
2022-05-12 10:52:51 INFO i.a.w.p.DockerProcessFactory(create):106 - Creating docker job ID: 911
2022-05-12 10:52:51 INFO i.a.w.p.DockerProcessFactory(create):158 - Preparing command: docker run --rm --init -i -w /data/911/0 --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:1.1.4 -e WORKER_JOB_ATTEMPT=0 -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.36.1-alpha -e WORKER_JOB_ID=911 airbyte/destination-bigquery:1.1.4 write --config destination_config.json --catalog destination_catalog.json
2022-05-12 10:52:51 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-hubspot:0.1.51 exists...
2022-05-12 10:52:51 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-hubspot:0.1.51 was found locally.
2022-05-12 10:52:51 INFO i.a.w.p.DockerProcessFactory(create):106 - Creating docker job ID: 911
2022-05-12 10:52:51 INFO i.a.w.p.DockerProcessFactory(create):158 - Preparing command: docker run --rm --init -i -w /data/911/0 --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_CONNECTOR_IMAGE=airbyte/source-hubspot:0.1.51 -e WORKER_JOB_ATTEMPT=0 -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.36.1-alpha -e WORKER_JOB_ID=911 airbyte/source-hubspot:0.1.51 read --config source_config.json --catalog source_catalog.json
2022-05-12 10:52:51 INFO i.a.w.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$6):339 - Destination output thread started.
2022-05-12 10:52:51 INFO i.a.w.DefaultReplicationWorker(run):158 - Waiting for source and destination threads to complete.
2022-05-12 10:52:51 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):279 - Replication thread started.
2022-05-12 10:52:52 destination > SLF4J: Class path contains multiple SLF4J bindings.
2022-05-12 10:52:52 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-05-12 10:52:52 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-05-12 10:52:52 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-05-12 10:52:52 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-05-12 10:52:52 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-05-12 10:52:53 source > Starting syncing SourceHubspot
2022-05-12 10:52:53 source > Syncing stream: owners
2022-05-12 10:52:53 destination > 2022-05-12 10:52:53 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-05-12 10:52:53 destination > 2022-05-12 10:52:53 INFO i.a.i.b.IntegrationRunner(runInternal):121 - Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2022-05-12 10:52:53 destination > 2022-05-12 10:52:53 INFO i.a.i.b.IntegrationRunner(runInternal):122 - Command: WRITE
2022-05-12 10:52:53 destination > 2022-05-12 10:52:53 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-05-12 10:52:53 destination > 2022-05-12 10:52:53 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-12 10:52:53 destination > 2022-05-12 10:52:53 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-12 10:52:53 destination > 2022-05-12 10:52:53 INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):295 - Selected loading method is set to: GCS
2022-05-12 10:52:53 destination > 2022-05-12 10:52:53 INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"format_type":"AVRO","flattening":"No flattening","part_size_mb":"5"}
2022-05-12 10:52:53 destination > 2022-05-12 10:52:53 INFO i.a.i.d.b.BigQueryUtils(isKeepFilesInGcs):311 - All tmp files will be removed from GCS when replication is finished
2022-05-12 10:52:54 destination > 2022-05-12 10:52:54 INFO i.a.i.d.b.BigQueryDestination(getGcsRecordConsumer):288 - Creating BigQuery staging message consumer with staging ID 51b2df61-0b43-4220-a492-cc4f7bdfaa83 at 2022-05-12T10:52:53.660Z
2022-05-12 10:52:54 source > Read 46 records from owners stream
2022-05-12 10:52:54 source > Finished syncing SourceHubspot
2022-05-12 10:52:54 source > SourceHubspot runtimes:
2022-05-12 10:52:54 source > Finished syncing SourceHubspot
2022-05-12 10:52:54 destination > 2022-05-12 10:52:54 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=test_owners, namespace=temp, datasetId=temp, datasetLocation=EU, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=temp, tableId=_airbyte_tmp_tll_test_owners}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=temp, tableId=_airbyte_raw_test_owners}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=overwrite, stagedFiles=[]]
2022-05-12 10:52:54 destination > 2022-05-12 10:52:54 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
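Note: before re-running the full sync, the destination container shown in the docker commands above can be exercised on its own with the Airbyte protocol's `check` command, which validates the BigQuery and GCS-staging credentials in isolation. A minimal sketch, assuming the job workspace is still present in the `airbyte_workspace` volume; the image tag and config file name are taken from this log, everything else is an assumption:

```python
# Hedged sketch: re-run the destination connector's credential check outside of a sync.
import json
import subprocess

cmd = [
    "docker", "run", "--rm", "-i",
    "-w", "/data/911/0",
    "-v", "airbyte_workspace:/data",
    "airbyte/destination-bigquery:1.1.4",
    "check", "--config", "destination_config.json",
]

# The connector prints Airbyte protocol messages (one JSON object per line) on stdout;
# the CONNECTION_STATUS message reports whether the BigQuery/GCS settings are usable.
result = subprocess.run(cmd, capture_output=True, text=True, check=False)
for line in result.stdout.splitlines():
    try:
        msg = json.loads(line)
    except json.JSONDecodeError:
        continue  # skip plain log lines
    if msg.get("type") == "CONNECTION_STATUS":
        print(msg["connectionStatus"])
```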
2022-05-12 10:52:54 destination > 2022-05-12 10:52:54 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$3):98 - Preparing tmp tables in destination started for 1 streams
2022-05-12 10:52:54 destination > 2022-05-12 10:52:54 INFO i.a.i.d.b.BigQueryGcsOperations(createSchemaIfNotExists):86 - Creating dataset temp
2022-05-12 10:52:54 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):305 - Total records read: 46 (10 KB)
2022-05-12 10:52:54 INFO i.a.w.DefaultReplicationWorker(run):163 - One of source or destination thread complete. Waiting on the other.
2022-05-12 10:52:55 destination > 2022-05-12 10:52:55 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=temp, tableId=_airbyte_tmp_tll_test_owners}}
2022-05-12 10:52:55 destination > 2022-05-12 10:52:55 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=temp, tableId=_airbyte_tmp_tll_test_owners}}
2022-05-12 10:52:55 destination > 2022-05-12 10:52:55 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream test_owners (dataset temp): airbyte/data_sync/temp_test_owners/2022/05/12/10/51b2df61-0b43-4220-a492-cc4f7bdfaa83/
2022-05-12 10:52:56 destination > 2022-05-12 10:52:56 ERROR i.a.i.b.FailureTrackingAirbyteMessageConsumer(start):39 - Exception while starting consumer
2022-05-12 10:52:56 destination > com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: null; S3 Extended Request ID: null; Proxy: null)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1819) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleServiceErrorResponse(AmazonHttpClient.java:1403) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1372) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1145) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:802) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:770) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:744) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:704) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:686) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:550) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:530) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5437) ~[aws-java-sdk-s3-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5384) ~[aws-java-sdk-s3-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1367) ~[aws-java-sdk-s3-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1341) ~[aws-java-sdk-s3-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.doesObjectExist(AmazonS3Client.java:1422) ~[aws-java-sdk-s3-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.s3.S3StorageOperations.createBucketObjectIfNotExists(S3StorageOperations.java:96) ~[io.airbyte.airbyte-integrations.connectors-destination-s3-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryGcsOperations.createStageIfNotExists(BigQueryGcsOperations.java:102) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onStartFunction$3(BigQueryStagingConsumerFactory.java:105) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.commons.concurrency.VoidCallable.call(VoidCallable.java:15) ~[io.airbyte-airbyte-commons-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.startTracked(BufferedStreamConsumer.java:118) ~[io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54) ~[io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.start(FailureTrackingAirbyteMessageConsumer.java:36) ~[io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.consumeWriteStream(IntegrationRunner.java:187) ~[io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.lambda$runConsumer$4(IntegrationRunner.java:201) ~[io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:230) ~[io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.runConsumer(IntegrationRunner.java:200) ~[io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$1(IntegrationRunner.java:163) ~[io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54) [io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:38) [io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:163) [io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:105) [io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:314) [io.airbyte.airbyte-integrations.connectors-destination-bigquery-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > 2022-05-12 10:52:56 WARN i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):63 - Airbyte message consumer: failed.
2022-05-12 10:52:56 destination > 2022-05-12 10:52:56 ERROR i.a.i.d.b.BufferedStreamConsumer(close):168 - executing on failed close procedure.
2022-05-12 10:52:56 destination > 2022-05-12 10:52:56 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):159 - Cleaning up destination started for 1 streams
2022-05-12 10:52:56 destination > 2022-05-12 10:52:56 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=temp, tableId=_airbyte_tmp_tll_test_owners}} (dataset temp)
2022-05-12 10:52:56 destination > 2022-05-12 10:52:56 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream test_owners (dataset temp): airbyte/data_sync/temp_test_owners
2022-05-12 10:52:56 destination > 2022-05-12 10:52:56 ERROR i.a.i.d.b.BufferedStreamConsumer(close):191 - Close failed.
2022-05-12 10:52:56 destination > com.amazonaws.services.s3.model.AmazonS3Exception: The request signature we calculated does not match the signature you provided. Check your Google secret key and signing method. (Service: Amazon S3; Status Code: 403; Error Code: SignatureDoesNotMatch; Request ID: null; S3 Extended Request ID: null; Proxy: null)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1819) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleServiceErrorResponse(AmazonHttpClient.java:1403) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1372) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1145) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:802) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:770) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:744) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:704) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:686) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:550) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:530) ~[aws-java-sdk-core-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5437) ~[aws-java-sdk-s3-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5384) ~[aws-java-sdk-s3-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5378) ~[aws-java-sdk-s3-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:927) ~[aws-java-sdk-s3-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:901) ~[aws-java-sdk-s3-1.12.14.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.s3.S3StorageOperations.cleanUpBucketObject(S3StorageOperations.java:260) ~[io.airbyte.airbyte-integrations.connectors-destination-s3-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.s3.S3StorageOperations.dropBucketObject(S3StorageOperations.java:208) ~[io.airbyte.airbyte-integrations.connectors-destination-s3-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryGcsOperations.dropStageIfExists(BigQueryGcsOperations.java:184) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$5(BigQueryStagingConsumerFactory.java:162) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:179) ~[io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.lambda$close$0(FailureTrackingAirbyteMessageConsumer.java:67) ~[io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54) [io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:67) [io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:162) [io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:105) [io.airbyte.airbyte-integrations.bases-base-java-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:314) [io.airbyte.airbyte-integrations.connectors-destination-bigquery-0.36.8-alpha.jar:?]
2022-05-12 10:52:56 destination > Exception in thread "main" com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: null; S3 Extended Request ID: null; Proxy: null), S3 Extended Request ID: null
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1819)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleServiceErrorResponse(AmazonHttpClient.java:1403)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1372)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1145)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:802)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:770)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:744)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:704)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:686)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:550)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:530)
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5437)
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5384)
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1367)
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1341)
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.doesObjectExist(AmazonS3Client.java:1422)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.s3.S3StorageOperations.createBucketObjectIfNotExists(S3StorageOperations.java:96)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryGcsOperations.createStageIfNotExists(BigQueryGcsOperations.java:102)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onStartFunction$3(BigQueryStagingConsumerFactory.java:105)
2022-05-12 10:52:56 destination > at io.airbyte.commons.concurrency.VoidCallable.call(VoidCallable.java:15)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.startTracked(BufferedStreamConsumer.java:118)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.start(FailureTrackingAirbyteMessageConsumer.java:36)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.consumeWriteStream(IntegrationRunner.java:187)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.lambda$runConsumer$4(IntegrationRunner.java:201)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:230)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.runConsumer(IntegrationRunner.java:200)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$1(IntegrationRunner.java:163)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:38)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:163)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:105)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:314)
2022-05-12 10:52:56 destination > Suppressed: com.amazonaws.services.s3.model.AmazonS3Exception: The request signature we calculated does not match the signature you provided. Check your Google secret key and signing method. (Service: Amazon S3; Status Code: 403; Error Code: SignatureDoesNotMatch; Request ID: null; S3 Extended Request ID: null; Proxy: null), S3 Extended Request ID: null
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1819)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleServiceErrorResponse(AmazonHttpClient.java:1403)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1372)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1145)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:802)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:770)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:744)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:704)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:686)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:550)
2022-05-12 10:52:56 destination > at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:530)
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5437)
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5384)
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5378)
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:927)
2022-05-12 10:52:56 destination > at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:901)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.s3.S3StorageOperations.cleanUpBucketObject(S3StorageOperations.java:260)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.s3.S3StorageOperations.dropBucketObject(S3StorageOperations.java:208)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryGcsOperations.dropStageIfExists(BigQueryGcsOperations.java:184)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.bigquery.BigQueryStagingConsumerFactory.lambda$onCloseFunction$5(BigQueryStagingConsumerFactory.java:162)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:179)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.lambda$close$0(FailureTrackingAirbyteMessageConsumer.java:67)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:67)
2022-05-12 10:52:56 destination > at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:162)
2022-05-12 10:52:56 destination > ... 2 more
2022-05-12 10:52:56 ERROR i.a.w.DefaultReplicationWorker(run):169 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.DefaultReplicationWorker$DestinationException: Destination process exited with non-zero exit code 1
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:164) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Suppressed: io.airbyte.workers.WorkerException: Destination process exit with code 1. This warning is normal if the job was cancelled.
at io.airbyte.workers.protocols.airbyte.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:119) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:126) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.DefaultReplicationWorker$DestinationException: Destination process exited with non-zero exit code 1
at io.airbyte.workers.DefaultReplicationWorker.lambda$getDestinationOutputRunnable$6(DefaultReplicationWorker.java:354) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
... 1 more
2022-05-12 10:52:56 INFO i.a.w.DefaultReplicationWorker(run):228 - sync summary: io.airbyte.config.ReplicationAttemptSummary@2dde16ad[status=failed,recordsSynced=46,bytesSynced=10938,startTime=1652352771388,endTime=1652352776734,totalStats=io.airbyte.config.SyncStats@7f055e1[recordsEmitted=46,bytesEmitted=10938,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[io.airbyte.config.StreamSyncStats@28a5168f[streamName=test_owners,stats=io.airbyte.config.SyncStats@63538619[recordsEmitted=46,bytesEmitted=10938,stateMessagesEmitted=,recordsCommitted=]]]]
2022-05-12 10:52:56 INFO i.a.w.DefaultReplicationWorker(run):250 - Source did not output any state messages
2022-05-12 10:52:56 WARN i.a.w.DefaultReplicationWorker(run):261 - State capture: No state retained.
2022-05-12 10:52:56 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
2022-05-12 10:52:56 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$1):147 - sync summary: io.airbyte.config.StandardSyncOutput@59d66480[standardSyncSummary=io.airbyte.config.StandardSyncSummary@27242256[status=failed,recordsSynced=46,bytesSynced=10938,startTime=1652352771388,endTime=1652352776734,totalStats=io.airbyte.config.SyncStats@7f055e1[recordsEmitted=46,bytesEmitted=10938,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[io.airbyte.config.StreamSyncStats@28a5168f[streamName=test_owners,stats=io.airbyte.config.SyncStats@63538619[recordsEmitted=46,bytesEmitted=10938,stateMessagesEmitted=,recordsCommitted=]]]],normalizationSummary=,state=,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@21602b60[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@7657cc49[stream=io.airbyte.protocol.models.AirbyteStream@1ccf4467[name=test_owners,jsonSchema={"type":["null","object"],"$schema":"http://json-schema.org/draft-07/schema#","properties":{"id":{"type":["null","string"]},"email":{"type":["null","string"]},"teams":{"type":["null","array"],"items":{"type":"object","properties":{"id":{"type":["null","string"]},"name":{"type":["null","string"]},"membership":{"type":["null","string"]}}}},"userId":{"type":["null","integer"]},"archived":{"type":["null","boolean"]},"lastName":{"type":["null","string"]},"createdAt":{"type":["null","string"],"format":"date-time"},"firstName":{"type":["null","string"]},"updatedAt":{"type":["null","string"],"format":"date-time"}}},supportedSyncModes=[full_refresh],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=temp,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@1a7ff33a[failureOrigin=destination,failureType=,internalMessage=io.airbyte.workers.DefaultReplicationWorker$DestinationException: Destination process exited with non-zero exit code 1,externalMessage=Something went wrong within the destination connector,metadata=io.airbyte.config.Metadata@3a64e03e[additionalProperties={attemptNumber=0, jobId=911}],stacktrace=java.util.concurrent.CompletionException: io.airbyte.workers.DefaultReplicationWorker$DestinationException: Destination process exited with non-zero exit code 1 at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315) at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) at java.base/java.lang.Thread.run(Thread.java:833) Caused by: io.airbyte.workers.DefaultReplicationWorker$DestinationException: Destination process exited with non-zero exit code 1 at io.airbyte.workers.DefaultReplicationWorker.lambda$getDestinationOutputRunnable$6(DefaultReplicationWorker.java:354) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ... 3 more ,retryable=,timestamp=1652352776734]]] 2022-05-12 10:52:56 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):235 - Stopping temporal heartbeating... 
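The 403 Forbidden and SignatureDoesNotMatch errors above come from the AWS S3 client that the GCS staging step uses (see the AmazonS3Client / S3StorageOperations frames), so they point at the HMAC key pair or bucket permissions rather than at BigQuery. A minimal sketch, assuming boto3 is available, that exercises the same key pair against the GCS S3-compatible XML endpoint; the bucket name and environment variable names are placeholders, not values from this log:

```python
# Hedged sketch: test the GCS HMAC key pair through the S3-compatible XML API,
# which is the path the GCS staging option takes via the AWS SDK.
import os

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",  # GCS interoperability endpoint
    aws_access_key_id=os.environ["GCS_HMAC_KEY_ID"],        # placeholder env var
    aws_secret_access_key=os.environ["GCS_HMAC_SECRET"],    # placeholder env var
    region_name="auto",  # region string is not meaningful for GCS
)

bucket = "my-airbyte-staging-bucket"  # placeholder
try:
    # A small write plus a listing, similar to what staging setup and cleanup do.
    s3.put_object(Bucket=bucket, Key="airbyte/data_sync/_connection_test", Body=b"ok")
    s3.list_objects_v2(Bucket=bucket, Prefix="airbyte/data_sync/", MaxKeys=1)
    print("HMAC credentials can read and write the staging prefix")
except ClientError as err:
    # 403 Forbidden / SignatureDoesNotMatch here reproduces the sync failure:
    # check the HMAC access ID, its secret, and the key's permissions on the bucket.
    print("staging check failed:", err.response["Error"])
```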
2022-05-12 10:52:56 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/911/0/logs.log
2022-05-12 10:52:56 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.36.1-alpha
2022-05-12 10:52:56 INFO i.a.w.DefaultNormalizationWorker(run):47 - Running normalization.
2022-05-12 10:52:56 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization:0.1.75
2022-05-12 10:52:56 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization:0.1.75 exists...
2022-05-12 10:52:56 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization:0.1.75 was found locally.
2022-05-12 10:52:56 INFO i.a.w.p.DockerProcessFactory(create):106 - Creating docker job ID: 911
2022-05-12 10:52:56 INFO i.a.w.p.DockerProcessFactory(create):158 - Preparing command: docker run --rm --init -i -w /data/911/0/normalize --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.36.1-alpha airbyte/normalization:0.1.75 run --integration-type bigquery --config destination_config.json --catalog destination_catalog.json
2022-05-12 10:52:57 normalization > Running: transform-config --config destination_config.json --integration-type bigquery --out /data/911/0/normalize
2022-05-12 10:52:57 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/911/0/normalize')
2022-05-12 10:52:57 normalization > transform_bigquery
2022-05-12 10:52:57 normalization > Running: transform-catalog --integration-type bigquery --profile-config-dir /data/911/0/normalize --catalog destination_catalog.json --out /data/911/0/normalize/models/generated/ --json-column _airbyte_data
2022-05-12 10:52:57 normalization > Processing destination_catalog.json...
2022-05-12 10:52:57 normalization > Generating airbyte_ctes/temp/test_owners_ab1.sql from test_owners
2022-05-12 10:52:57 normalization > Generating airbyte_ctes/temp/test_owners_ab2.sql from test_owners
2022-05-12 10:52:57 normalization > Generating airbyte_ctes/temp/test_owners_ab3.sql from test_owners
2022-05-12 10:52:57 normalization > Generating airbyte_tables/temp/test_owners.sql from test_owners
2022-05-12 10:52:57 normalization > Generating airbyte_ctes/temp/test_owners_teams_ab1.sql from test_owners/teams
2022-05-12 10:52:57 normalization > Generating airbyte_ctes/temp/test_owners_teams_ab2.sql from test_owners/teams
2022-05-12 10:52:57 normalization > Generating airbyte_ctes/temp/test_owners_teams_ab3.sql from test_owners/teams
2022-05-12 10:52:57 normalization > Generating airbyte_tables/temp/test_owners_teams.sql from test_owners/teams
2022-05-12 10:52:57 normalization > detected no config file for ssh, assuming ssh is off.
2022-05-12 10:53:00 normalization > [--event-buffer-size EVENT_BUFFER_SIZE]
2022-05-12 10:53:00 normalization > --event-buffer-size EVENT_BUFFER_SIZE
2022-05-12 10:53:00 normalization >
2022-05-12 10:53:00 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2022-05-12 10:53:00 normalization >
2022-05-12 10:53:03 normalization > 10:53:03 Running with dbt=1.0.0
2022-05-12 10:53:03 normalization > 10:53:03 Partial parse save file not found. Starting full parse.
2022-05-12 10:53:04 normalization > 10:53:04 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2022-05-12 10:53:04 normalization > There are 2 unused configuration paths:
2022-05-12 10:53:04 normalization > - models.airbyte_utils.generated.airbyte_incremental
2022-05-12 10:53:04 normalization > - models.airbyte_utils.generated.airbyte_views
2022-05-12 10:53:04 normalization >
2022-05-12 10:53:04 normalization > 10:53:04 Found 8 models, 0 tests, 0 snapshots, 0 analyses, 537 macros, 0 operations, 0 seed files, 1 source, 0 exposures, 0 metrics
2022-05-12 10:53:04 normalization > 10:53:04
2022-05-12 10:53:05 normalization > 10:53:05 Concurrency: 8 threads (target='prod')
2022-05-12 10:53:05 normalization > 10:53:05
2022-05-12 10:53:05 normalization > 10:53:05 1 of 2 START table model temp.test_owners............................................................................... [RUN]
2022-05-12 10:53:05 normalization > 10:53:05 1 of 2 ERROR creating table model temp.test_owners...................................................................... [ERROR in 0.30s]
2022-05-12 10:53:05 normalization > 10:53:05 2 of 2 SKIP relation temp.test_owners_teams............................................................................. [SKIP]
2022-05-12 10:53:05 normalization > 10:53:05
2022-05-12 10:53:05 normalization > 10:53:05 Finished running 2 table models in 1.08s.
2022-05-12 10:53:05 normalization > 10:53:05
2022-05-12 10:53:05 normalization > 10:53:05 Completed with 1 error and 0 warnings:
2022-05-12 10:53:05 normalization > 10:53:05
2022-05-12 10:53:05 normalization > 10:53:05 Runtime Error in model test_owners (models/generated/airbyte_tables/temp/test_owners.sql)
2022-05-12 10:53:05 normalization > 10:53:05 404 Not found: Table bi-dixa-staging:temp._airbyte_raw_test_owners was not found in location EU
2022-05-12 10:53:05 normalization > 10:53:05
2022-05-12 10:53:05 normalization > 10:53:05 (job ID: be1c91aa-7103-4af0-8744-1f787e8c36e8)
2022-05-12 10:53:05 normalization > 10:53:05
2022-05-12 10:53:05 normalization > 10:53:05 Done. PASS=0 WARN=0 ERROR=1 SKIP=1 TOTAL=2
2022-05-12 10:53:06 normalization >
2022-05-12 10:53:06 normalization > Diagnosing dbt debug to check if destination is available for dbt and well configured (1):
2022-05-12 10:53:06 normalization >
2022-05-12 10:53:08 normalization > 10:53:08 Running with dbt=1.0.0
2022-05-12 10:53:08 normalization > dbt version: 1.0.0
2022-05-12 10:53:08 normalization > python version: 3.9.9
2022-05-12 10:53:08 normalization > python path: /usr/local/bin/python
2022-05-12 10:53:08 normalization > os info: Linux-4.19.0-20-cloud-amd64-x86_64-with-glibc2.31
2022-05-12 10:53:08 normalization > Using profiles.yml file at /data/911/0/normalize/profiles.yml
2022-05-12 10:53:08 normalization > Using dbt_project.yml file at /data/911/0/normalize/dbt_project.yml
2022-05-12 10:53:08 normalization >
2022-05-12 10:53:09 normalization > Configuration:
2022-05-12 10:53:09 normalization > profiles.yml file [OK found and valid]
2022-05-12 10:53:09 normalization > dbt_project.yml file [OK found and valid]
2022-05-12 10:53:09 normalization >
2022-05-12 10:53:09 normalization > Required dependencies:
2022-05-12 10:53:09 normalization > - git [OK found]
2022-05-12 10:53:09 normalization >
2022-05-12 10:53:09 normalization > Connection:
2022-05-12 10:53:09 normalization > method: service-account-json
2022-05-12 10:53:09 normalization > database: bi-dixa-staging
2022-05-12 10:53:09 normalization > schema: dixa_support
2022-05-12 10:53:09 normalization > location: EU
2022-05-12 10:53:09 normalization > priority: interactive
2022-05-12 10:53:09 normalization > timeout_seconds: 300
2022-05-12 10:53:09 normalization > maximum_bytes_billed: None
2022-05-12 10:53:09 normalization > execution_project: bi-dixa-staging
2022-05-12 10:53:09 normalization > Connection test: [OK connection ok]
2022-05-12 10:53:09 normalization >
2022-05-12 10:53:09 normalization > All checks passed!
2022-05-12 10:53:09 normalization >
2022-05-12 10:53:09 normalization > Forward dbt output logs to diagnose/debug errors (0):
2022-05-12 10:53:09 normalization >
2022-05-12 10:53:09 normalization >
2022-05-12 10:53:09 normalization >
2022-05-12 10:53:09 normalization > ============================== 2022-05-12 10:53:03.176395 | e8502984-e380-4330-bb43-a4181fd1cf8b ==============================
2022-05-12 10:53:09 normalization > 10:53:03.176395 [info ] [MainThread]: Running with dbt=1.0.0
2022-05-12 10:53:09 normalization > 10:53:03.177068 [debug] [MainThread]: running dbt with arguments Namespace(record_timing_info=None, debug=None, log_format=None, write_json=None, use_colors=None, printer_width=None, warn_error=None, version_check=None, partial_parse=None, single_threaded=False, use_experimental_parser=None, static_parser=None, profiles_dir='/data/911/0/normalize', send_anonymous_usage_stats=None, fail_fast=None, event_buffer_size='10000', project_dir='/data/911/0/normalize', profile=None, target=None, vars='{}', log_cache_events=False, threads=None, select=None, exclude=None, selector_name=None, state=None, defer=None, full_refresh=False, cls=, which='run', rpc_method='run')
2022-05-12 10:53:09 normalization > 10:53:03.177443 [debug] [MainThread]: Tracking: do not track
2022-05-12 10:53:09 normalization > 10:53:03.223215 [info ] [MainThread]: Partial parse save file not found. Starting full parse.
2022-05-12 10:53:09 normalization > 10:53:03.261290 [debug] [MainThread]: Parsing macros/incremental.sql 2022-05-12 10:53:09 normalization > 10:53:03.269976 [debug] [MainThread]: Parsing macros/star_intersect.sql 2022-05-12 10:53:09 normalization > 10:53:03.277500 [debug] [MainThread]: Parsing macros/get_custom_schema.sql 2022-05-12 10:53:09 normalization > 10:53:03.278310 [debug] [MainThread]: Parsing macros/should_full_refresh.sql 2022-05-12 10:53:09 normalization > 10:53:03.284421 [debug] [MainThread]: Parsing macros/schema_tests/equality.sql 2022-05-12 10:53:09 normalization > 10:53:03.292334 [debug] [MainThread]: Parsing macros/schema_tests/equal_rowcount.sql 2022-05-12 10:53:09 normalization > 10:53:03.293779 [debug] [MainThread]: Parsing macros/cross_db_utils/type_conversions.sql 2022-05-12 10:53:09 normalization > 10:53:03.300859 [debug] [MainThread]: Parsing macros/cross_db_utils/concat.sql 2022-05-12 10:53:09 normalization > 10:53:03.303449 [debug] [MainThread]: Parsing macros/cross_db_utils/current_timestamp.sql 2022-05-12 10:53:09 normalization > 10:53:03.304240 [debug] [MainThread]: Parsing macros/cross_db_utils/json_operations.sql 2022-05-12 10:53:09 normalization > 10:53:03.344296 [debug] [MainThread]: Parsing macros/cross_db_utils/except.sql 2022-05-12 10:53:09 normalization > 10:53:03.345375 [debug] [MainThread]: Parsing macros/cross_db_utils/datatypes.sql 2022-05-12 10:53:09 normalization > 10:53:03.357962 [debug] [MainThread]: Parsing macros/cross_db_utils/surrogate_key.sql 2022-05-12 10:53:09 normalization > 10:53:03.360460 [debug] [MainThread]: Parsing macros/cross_db_utils/quote.sql 2022-05-12 10:53:09 normalization > 10:53:03.362415 [debug] [MainThread]: Parsing macros/cross_db_utils/hash.sql 2022-05-12 10:53:09 normalization > 10:53:03.363199 [debug] [MainThread]: Parsing macros/cross_db_utils/array.sql 2022-05-12 10:53:09 normalization > 10:53:03.379794 [debug] [MainThread]: Parsing macros/catalog.sql 2022-05-12 10:53:09 normalization > 10:53:03.386867 [debug] [MainThread]: Parsing macros/adapters.sql 2022-05-12 10:53:09 normalization > 10:53:03.415793 [debug] [MainThread]: Parsing macros/etc.sql 2022-05-12 10:53:09 normalization > 10:53:03.418199 [debug] [MainThread]: Parsing macros/materializations/incremental.sql 2022-05-12 10:53:09 normalization > 10:53:03.437728 [debug] [MainThread]: Parsing macros/materializations/seed.sql 2022-05-12 10:53:09 normalization > 10:53:03.440993 [debug] [MainThread]: Parsing macros/materializations/view.sql 2022-05-12 10:53:09 normalization > 10:53:03.444279 [debug] [MainThread]: Parsing macros/materializations/snapshot.sql 2022-05-12 10:53:09 normalization > 10:53:03.446372 [debug] [MainThread]: Parsing macros/materializations/copy.sql 2022-05-12 10:53:09 normalization > 10:53:03.450141 [debug] [MainThread]: Parsing macros/materializations/table.sql 2022-05-12 10:53:09 normalization > 10:53:03.455075 [debug] [MainThread]: Parsing macros/etc/datetime.sql 2022-05-12 10:53:09 normalization > 10:53:03.465701 [debug] [MainThread]: Parsing macros/etc/statement.sql 2022-05-12 10:53:09 normalization > 10:53:03.471253 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_schema.sql 2022-05-12 10:53:09 normalization > 10:53:03.474380 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_database.sql 2022-05-12 10:53:09 normalization > 10:53:03.476410 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_alias.sql 2022-05-12 10:53:09 normalization > 10:53:03.478266 [debug] [MainThread]: Parsing 
macros/adapters/columns.sql 2022-05-12 10:53:09 normalization > 10:53:03.490662 [debug] [MainThread]: Parsing macros/adapters/indexes.sql 2022-05-12 10:53:09 normalization > 10:53:03.494143 [debug] [MainThread]: Parsing macros/adapters/persist_docs.sql 2022-05-12 10:53:09 normalization > 10:53:03.499712 [debug] [MainThread]: Parsing macros/adapters/metadata.sql 2022-05-12 10:53:09 normalization > 10:53:03.509267 [debug] [MainThread]: Parsing macros/adapters/relation.sql 2022-05-12 10:53:09 normalization > 10:53:03.524389 [debug] [MainThread]: Parsing macros/adapters/freshness.sql 2022-05-12 10:53:09 normalization > 10:53:03.528284 [debug] [MainThread]: Parsing macros/adapters/schema.sql 2022-05-12 10:53:09 normalization > 10:53:03.531294 [debug] [MainThread]: Parsing macros/materializations/configs.sql 2022-05-12 10:53:09 normalization > 10:53:03.534346 [debug] [MainThread]: Parsing macros/materializations/hooks.sql 2022-05-12 10:53:09 normalization > 10:53:03.539426 [debug] [MainThread]: Parsing macros/materializations/tests/test.sql 2022-05-12 10:53:09 normalization > 10:53:03.545131 [debug] [MainThread]: Parsing macros/materializations/tests/helpers.sql 2022-05-12 10:53:09 normalization > 10:53:03.547438 [debug] [MainThread]: Parsing macros/materializations/tests/where_subquery.sql 2022-05-12 10:53:09 normalization > 10:53:03.549744 [debug] [MainThread]: Parsing macros/materializations/snapshots/snapshot_merge.sql 2022-05-12 10:53:09 normalization > 10:53:03.551895 [debug] [MainThread]: Parsing macros/materializations/snapshots/helpers.sql 2022-05-12 10:53:09 normalization > 10:53:03.566375 [debug] [MainThread]: Parsing macros/materializations/snapshots/strategies.sql 2022-05-12 10:53:09 normalization > 10:53:03.587898 [debug] [MainThread]: Parsing macros/materializations/snapshots/snapshot.sql 2022-05-12 10:53:09 normalization > 10:53:03.602835 [debug] [MainThread]: Parsing macros/materializations/seeds/seed.sql 2022-05-12 10:53:09 normalization > 10:53:03.610897 [debug] [MainThread]: Parsing macros/materializations/seeds/helpers.sql 2022-05-12 10:53:09 normalization > 10:53:03.632310 [debug] [MainThread]: Parsing macros/materializations/models/table/create_table_as.sql 2022-05-12 10:53:09 normalization > 10:53:03.636248 [debug] [MainThread]: Parsing macros/materializations/models/table/table.sql 2022-05-12 10:53:09 normalization > 10:53:03.645308 [debug] [MainThread]: Parsing macros/materializations/models/incremental/incremental.sql 2022-05-12 10:53:09 normalization > 10:53:03.658047 [debug] [MainThread]: Parsing macros/materializations/models/incremental/merge.sql 2022-05-12 10:53:09 normalization > 10:53:03.672487 [debug] [MainThread]: Parsing macros/materializations/models/incremental/is_incremental.sql 2022-05-12 10:53:09 normalization > 10:53:03.674438 [debug] [MainThread]: Parsing macros/materializations/models/incremental/on_schema_change.sql 2022-05-12 10:53:09 normalization > 10:53:03.694863 [debug] [MainThread]: Parsing macros/materializations/models/incremental/column_helpers.sql 2022-05-12 10:53:09 normalization > 10:53:03.700609 [debug] [MainThread]: Parsing macros/materializations/models/view/helpers.sql 2022-05-12 10:53:09 normalization > 10:53:03.702283 [debug] [MainThread]: Parsing macros/materializations/models/view/view.sql 2022-05-12 10:53:09 normalization > 10:53:03.711053 [debug] [MainThread]: Parsing macros/materializations/models/view/create_or_replace_view.sql 2022-05-12 10:53:09 normalization > 10:53:03.714551 [debug] [MainThread]: Parsing 
macros/materializations/models/view/create_view_as.sql 2022-05-12 10:53:09 normalization > 10:53:03.717552 [debug] [MainThread]: Parsing macros/generic_test_sql/unique.sql 2022-05-12 10:53:09 normalization > 10:53:03.718470 [debug] [MainThread]: Parsing macros/generic_test_sql/not_null.sql 2022-05-12 10:53:09 normalization > 10:53:03.719180 [debug] [MainThread]: Parsing macros/generic_test_sql/accepted_values.sql 2022-05-12 10:53:09 normalization > 10:53:03.720961 [debug] [MainThread]: Parsing macros/generic_test_sql/relationships.sql 2022-05-12 10:53:09 normalization > 10:53:03.722116 [debug] [MainThread]: Parsing tests/generic/builtin.sql 2022-05-12 10:53:09 normalization > 10:53:03.726230 [debug] [MainThread]: Parsing macros/sql/safe_add.sql 2022-05-12 10:53:09 normalization > 10:53:03.728327 [debug] [MainThread]: Parsing macros/sql/unpivot.sql 2022-05-12 10:53:09 normalization > 10:53:03.738291 [debug] [MainThread]: Parsing macros/sql/nullcheck_table.sql 2022-05-12 10:53:09 normalization > 10:53:03.740405 [debug] [MainThread]: Parsing macros/sql/get_relations_by_pattern.sql 2022-05-12 10:53:09 normalization > 10:53:03.745070 [debug] [MainThread]: Parsing macros/sql/pivot.sql 2022-05-12 10:53:09 normalization > 10:53:03.750890 [debug] [MainThread]: Parsing macros/sql/groupby.sql 2022-05-12 10:53:09 normalization > 10:53:03.752617 [debug] [MainThread]: Parsing macros/sql/get_relations_by_prefix.sql 2022-05-12 10:53:09 normalization > 10:53:03.756909 [debug] [MainThread]: Parsing macros/sql/get_query_results_as_dict.sql 2022-05-12 10:53:09 normalization > 10:53:03.759885 [debug] [MainThread]: Parsing macros/sql/get_table_types_sql.sql 2022-05-12 10:53:09 normalization > 10:53:03.761459 [debug] [MainThread]: Parsing macros/sql/nullcheck.sql 2022-05-12 10:53:09 normalization > 10:53:03.763776 [debug] [MainThread]: Parsing macros/sql/surrogate_key.sql 2022-05-12 10:53:09 normalization > 10:53:03.768169 [debug] [MainThread]: Parsing macros/sql/get_tables_by_prefix_sql.sql 2022-05-12 10:53:09 normalization > 10:53:03.770539 [debug] [MainThread]: Parsing macros/sql/get_column_values.sql 2022-05-12 10:53:09 normalization > 10:53:03.777065 [debug] [MainThread]: Parsing macros/sql/star.sql 2022-05-12 10:53:09 normalization > 10:53:03.782593 [debug] [MainThread]: Parsing macros/sql/date_spine.sql 2022-05-12 10:53:09 normalization > 10:53:03.787980 [debug] [MainThread]: Parsing macros/sql/get_tables_by_pattern_sql.sql 2022-05-12 10:53:09 normalization > 10:53:03.796110 [debug] [MainThread]: Parsing macros/sql/haversine_distance.sql 2022-05-12 10:53:09 normalization > 10:53:03.803467 [debug] [MainThread]: Parsing macros/sql/generate_series.sql 2022-05-12 10:53:09 normalization > 10:53:03.808853 [debug] [MainThread]: Parsing macros/sql/union.sql 2022-05-12 10:53:09 normalization > 10:53:03.822072 [debug] [MainThread]: Parsing macros/web/get_url_path.sql 2022-05-12 10:53:09 normalization > 10:53:03.825495 [debug] [MainThread]: Parsing macros/web/get_url_host.sql 2022-05-12 10:53:09 normalization > 10:53:03.828057 [debug] [MainThread]: Parsing macros/web/get_url_parameter.sql 2022-05-12 10:53:09 normalization > 10:53:03.830070 [debug] [MainThread]: Parsing macros/schema_tests/equality.sql 2022-05-12 10:53:09 normalization > 10:53:03.834744 [debug] [MainThread]: Parsing macros/schema_tests/mutually_exclusive_ranges.sql 2022-05-12 10:53:09 normalization > 10:53:03.846012 [debug] [MainThread]: Parsing macros/schema_tests/not_null_proportion.sql 2022-05-12 10:53:09 normalization > 10:53:03.848809 [debug] 
[MainThread]: Parsing macros/schema_tests/equal_rowcount.sql 2022-05-12 10:53:09 normalization > 10:53:03.850888 [debug] [MainThread]: Parsing macros/schema_tests/test_unique_where.sql 2022-05-12 10:53:09 normalization > 10:53:03.852715 [debug] [MainThread]: Parsing macros/schema_tests/test_not_null_where.sql 2022-05-12 10:53:09 normalization > 10:53:03.854515 [debug] [MainThread]: Parsing macros/schema_tests/sequential_values.sql 2022-05-12 10:53:09 normalization > 10:53:03.858128 [debug] [MainThread]: Parsing macros/schema_tests/unique_combination_of_columns.sql 2022-05-12 10:53:09 normalization > 10:53:03.861692 [debug] [MainThread]: Parsing macros/schema_tests/expression_is_true.sql 2022-05-12 10:53:09 normalization > 10:53:03.864033 [debug] [MainThread]: Parsing macros/schema_tests/fewer_rows_than.sql 2022-05-12 10:53:09 normalization > 10:53:03.866189 [debug] [MainThread]: Parsing macros/schema_tests/cardinality_equality.sql 2022-05-12 10:53:09 normalization > 10:53:03.868908 [debug] [MainThread]: Parsing macros/schema_tests/recency.sql 2022-05-12 10:53:09 normalization > 10:53:03.871132 [debug] [MainThread]: Parsing macros/schema_tests/at_least_one.sql 2022-05-12 10:53:09 normalization > 10:53:03.872765 [debug] [MainThread]: Parsing macros/schema_tests/not_constant.sql 2022-05-12 10:53:09 normalization > 10:53:03.874350 [debug] [MainThread]: Parsing macros/schema_tests/accepted_range.sql 2022-05-12 10:53:09 normalization > 10:53:03.877479 [debug] [MainThread]: Parsing macros/schema_tests/not_accepted_values.sql 2022-05-12 10:53:09 normalization > 10:53:03.880184 [debug] [MainThread]: Parsing macros/schema_tests/relationships_where.sql 2022-05-12 10:53:09 normalization > 10:53:03.882961 [debug] [MainThread]: Parsing macros/materializations/insert_by_period_materialization.sql 2022-05-12 10:53:09 normalization > 10:53:03.913300 [debug] [MainThread]: Parsing macros/jinja_helpers/pretty_time.sql 2022-05-12 10:53:09 normalization > 10:53:03.915113 [debug] [MainThread]: Parsing macros/jinja_helpers/log_info.sql 2022-05-12 10:53:09 normalization > 10:53:03.916654 [debug] [MainThread]: Parsing macros/jinja_helpers/pretty_log_format.sql 2022-05-12 10:53:09 normalization > 10:53:03.918112 [debug] [MainThread]: Parsing macros/jinja_helpers/slugify.sql 2022-05-12 10:53:09 normalization > 10:53:03.919671 [debug] [MainThread]: Parsing macros/cross_db_utils/dateadd.sql 2022-05-12 10:53:09 normalization > 10:53:03.923218 [debug] [MainThread]: Parsing macros/cross_db_utils/right.sql 2022-05-12 10:53:09 normalization > 10:53:03.926037 [debug] [MainThread]: Parsing macros/cross_db_utils/length.sql 2022-05-12 10:53:09 normalization > 10:53:03.927700 [debug] [MainThread]: Parsing macros/cross_db_utils/_is_relation.sql 2022-05-12 10:53:09 normalization > 10:53:03.929329 [debug] [MainThread]: Parsing macros/cross_db_utils/replace.sql 2022-05-12 10:53:09 normalization > 10:53:03.931085 [debug] [MainThread]: Parsing macros/cross_db_utils/date_trunc.sql 2022-05-12 10:53:09 normalization > 10:53:03.933222 [debug] [MainThread]: Parsing macros/cross_db_utils/safe_cast.sql 2022-05-12 10:53:09 normalization > 10:53:03.935623 [debug] [MainThread]: Parsing macros/cross_db_utils/intersect.sql 2022-05-12 10:53:09 normalization > 10:53:03.936961 [debug] [MainThread]: Parsing macros/cross_db_utils/concat.sql 2022-05-12 10:53:09 normalization > 10:53:03.938222 [debug] [MainThread]: Parsing macros/cross_db_utils/position.sql 2022-05-12 10:53:09 normalization > 10:53:03.940196 [debug] [MainThread]: Parsing 
macros/cross_db_utils/escape_single_quotes.sql 2022-05-12 10:53:09 normalization > 10:53:03.943022 [debug] [MainThread]: Parsing macros/cross_db_utils/current_timestamp.sql 2022-05-12 10:53:09 normalization > 10:53:03.947247 [debug] [MainThread]: Parsing macros/cross_db_utils/except.sql 2022-05-12 10:53:09 normalization > 10:53:03.948619 [debug] [MainThread]: Parsing macros/cross_db_utils/datatypes.sql 2022-05-12 10:53:09 normalization > 10:53:03.956183 [debug] [MainThread]: Parsing macros/cross_db_utils/bool_or.sql 2022-05-12 10:53:09 normalization > 10:53:03.958182 [debug] [MainThread]: Parsing macros/cross_db_utils/datediff.sql 2022-05-12 10:53:09 normalization > 10:53:03.970787 [debug] [MainThread]: Parsing macros/cross_db_utils/width_bucket.sql 2022-05-12 10:53:09 normalization > 10:53:03.977336 [debug] [MainThread]: Parsing macros/cross_db_utils/hash.sql 2022-05-12 10:53:09 normalization > 10:53:03.979161 [debug] [MainThread]: Parsing macros/cross_db_utils/last_day.sql 2022-05-12 10:53:09 normalization > 10:53:03.983604 [debug] [MainThread]: Parsing macros/cross_db_utils/cast_bool_to_text.sql 2022-05-12 10:53:09 normalization > 10:53:03.985673 [debug] [MainThread]: Parsing macros/cross_db_utils/identifier.sql 2022-05-12 10:53:09 normalization > 10:53:03.987853 [debug] [MainThread]: Parsing macros/cross_db_utils/literal.sql 2022-05-12 10:53:09 normalization > 10:53:03.989237 [debug] [MainThread]: Parsing macros/cross_db_utils/any_value.sql 2022-05-12 10:53:09 normalization > 10:53:03.990881 [debug] [MainThread]: Parsing macros/cross_db_utils/split_part.sql 2022-05-12 10:53:09 normalization > 10:53:03.993220 [debug] [MainThread]: Parsing macros/cross_db_utils/_is_ephemeral.sql 2022-05-12 10:53:09 normalization > 10:53:04.535043 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/temp/test_owners_ab2.sql 2022-05-12 10:53:09 normalization > 10:53:04.572470 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/temp/test_owners_ab2.sql 2022-05-12 10:53:09 normalization > 10:53:04.574292 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/temp/test_owners_teams_ab2.sql 2022-05-12 10:53:09 normalization > 10:53:04.583064 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/temp/test_owners_teams_ab2.sql 2022-05-12 10:53:09 normalization > 10:53:04.584719 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/temp/test_owners_ab3.sql 2022-05-12 10:53:09 normalization > 10:53:04.617790 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/temp/test_owners_ab3.sql 2022-05-12 10:53:09 normalization > 10:53:04.621057 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/temp/test_owners_ab1.sql 2022-05-12 10:53:09 normalization > 10:53:04.647135 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/temp/test_owners_ab1.sql 2022-05-12 10:53:09 normalization > 10:53:04.649100 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/temp/test_owners_teams_ab1.sql 2022-05-12 10:53:09 normalization > 10:53:04.668579 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/temp/test_owners_teams_ab1.sql 2022-05-12 10:53:09 normalization > 10:53:04.670465 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/temp/test_owners_teams_ab3.sql 2022-05-12 10:53:09 normalization > 10:53:04.679514 [debug] 
[MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/temp/test_owners_teams_ab3.sql 2022-05-12 10:53:09 normalization > 10:53:04.681239 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_tables/temp/test_owners_teams.sql 2022-05-12 10:53:09 normalization > 10:53:04.748645 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_tables/temp/test_owners_teams.sql 2022-05-12 10:53:09 normalization > 10:53:04.750520 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_tables/temp/test_owners.sql 2022-05-12 10:53:09 normalization > 10:53:04.758212 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_tables/temp/test_owners.sql 2022-05-12 10:53:09 normalization > 10:53:04.839442 [warn ] [MainThread]: [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources. 2022-05-12 10:53:09 normalization > There are 2 unused configuration paths: 2022-05-12 10:53:09 normalization > - models.airbyte_utils.generated.airbyte_incremental 2022-05-12 10:53:09 normalization > - models.airbyte_utils.generated.airbyte_views 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > 10:53:04.862141 [info ] [MainThread]: Found 8 models, 0 tests, 0 snapshots, 0 analyses, 537 macros, 0 operations, 0 seed files, 1 source, 0 exposures, 0 metrics 2022-05-12 10:53:09 normalization > 10:53:04.864517 [info ] [MainThread]: 2022-05-12 10:53:09 normalization > 10:53:04.865295 [debug] [MainThread]: Acquiring new bigquery connection "master" 2022-05-12 10:53:09 normalization > 10:53:04.866719 [debug] [ThreadPool]: Acquiring new bigquery connection "list_bi-dixa-staging" 2022-05-12 10:53:09 normalization > 10:53:04.867134 [debug] [ThreadPool]: Opening a new connection, currently in state init 2022-05-12 10:53:09 normalization > 10:53:05.087074 [debug] [ThreadPool]: Acquiring new bigquery connection "list_bi-dixa-staging_temp" 2022-05-12 10:53:09 normalization > 10:53:05.087585 [debug] [ThreadPool]: Opening a new connection, currently in state closed 2022-05-12 10:53:09 normalization > 10:53:05.539652 [info ] [MainThread]: Concurrency: 8 threads (target='prod') 2022-05-12 10:53:09 normalization > 10:53:05.540164 [info ] [MainThread]: 2022-05-12 10:53:09 normalization > 10:53:05.560759 [debug] [Thread-1 ]: Began running node model.airbyte_utils.test_owners_ab1 2022-05-12 10:53:09 normalization > 10:53:05.561525 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.test_owners_ab1" 2022-05-12 10:53:09 normalization > 10:53:05.561736 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.test_owners_ab1 2022-05-12 10:53:09 normalization > 10:53:05.561902 [debug] [Thread-1 ]: Compiling model.airbyte_utils.test_owners_ab1 2022-05-12 10:53:09 normalization > 10:53:05.576142 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.test_owners_ab1" 2022-05-12 10:53:09 normalization > 10:53:05.576747 [debug] [Thread-1 ]: finished collecting timing info 2022-05-12 10:53:09 normalization > 10:53:05.577256 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.test_owners_ab1 2022-05-12 10:53:09 normalization > 10:53:05.577932 [debug] [Thread-3 ]: Began running node model.airbyte_utils.test_owners_ab2 2022-05-12 10:53:09 normalization > 10:53:05.578564 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.test_owners_ab2" 2022-05-12 10:53:09 normalization > 10:53:05.578744 [debug] [Thread-3 ]: 
Began compiling node model.airbyte_utils.test_owners_ab2 2022-05-12 10:53:09 normalization > 10:53:05.578917 [debug] [Thread-3 ]: Compiling model.airbyte_utils.test_owners_ab2 2022-05-12 10:53:09 normalization > 10:53:05.608175 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.test_owners_ab2" 2022-05-12 10:53:09 normalization > 10:53:05.608654 [debug] [Thread-3 ]: finished collecting timing info 2022-05-12 10:53:09 normalization > 10:53:05.609187 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.test_owners_ab2 2022-05-12 10:53:09 normalization > 10:53:05.610031 [debug] [Thread-5 ]: Began running node model.airbyte_utils.test_owners_ab3 2022-05-12 10:53:09 normalization > 10:53:05.610577 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.test_owners_ab3" 2022-05-12 10:53:09 normalization > 10:53:05.610751 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.test_owners_ab3 2022-05-12 10:53:09 normalization > 10:53:05.610898 [debug] [Thread-5 ]: Compiling model.airbyte_utils.test_owners_ab3 2022-05-12 10:53:09 normalization > 10:53:05.636738 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.test_owners_ab3" 2022-05-12 10:53:09 normalization > 10:53:05.637232 [debug] [Thread-5 ]: finished collecting timing info 2022-05-12 10:53:09 normalization > 10:53:05.637745 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.test_owners_ab3 2022-05-12 10:53:09 normalization > 10:53:05.638620 [debug] [Thread-7 ]: Began running node model.airbyte_utils.test_owners 2022-05-12 10:53:09 normalization > 10:53:05.639054 [info ] [Thread-7 ]: 1 of 2 START table model temp.test_owners............................................................................... [RUN] 2022-05-12 10:53:09 normalization > 10:53:05.639695 [debug] [Thread-7 ]: Acquiring new bigquery connection "model.airbyte_utils.test_owners" 2022-05-12 10:53:09 normalization > 10:53:05.639880 [debug] [Thread-7 ]: Began compiling node model.airbyte_utils.test_owners 2022-05-12 10:53:09 normalization > 10:53:05.640021 [debug] [Thread-7 ]: Compiling model.airbyte_utils.test_owners 2022-05-12 10:53:09 normalization > 10:53:05.651756 [debug] [Thread-7 ]: Writing injected SQL for node "model.airbyte_utils.test_owners" 2022-05-12 10:53:09 normalization > 10:53:05.652225 [debug] [Thread-7 ]: finished collecting timing info 2022-05-12 10:53:09 normalization > 10:53:05.652402 [debug] [Thread-7 ]: Began executing node model.airbyte_utils.test_owners 2022-05-12 10:53:09 normalization > 10:53:05.698886 [debug] [Thread-7 ]: Writing runtime SQL for node "model.airbyte_utils.test_owners" 2022-05-12 10:53:09 normalization > 10:53:05.699599 [debug] [Thread-7 ]: Opening a new connection, currently in state init 2022-05-12 10:53:09 normalization > 10:53:05.704668 [debug] [Thread-7 ]: On model.airbyte_utils.test_owners: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.test_owners"} */ 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > create or replace table `bi-dixa-staging`.temp.`test_owners` 2022-05-12 10:53:09 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-05-12 10:53:09 normalization > cluster by _airbyte_emitted_at 2022-05-12 10:53:09 normalization > OPTIONS() 2022-05-12 10:53:09 normalization > as ( 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > with 
__dbt__cte__test_owners_ab1 as ( 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-05-12 10:53:09 normalization > -- depends_on: `bi-dixa-staging`.temp._airbyte_raw_test_owners 2022-05-12 10:53:09 normalization > select 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['id']") as id, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['email']") as email, 2022-05-12 10:53:09 normalization > json_extract_array(_airbyte_data, "$['teams']") as teams, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['userId']") as userId, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['archived']") as archived, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['lastName']") as lastName, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['createdAt']") as createdAt, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['firstName']") as firstName, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['updatedAt']") as updatedAt, 2022-05-12 10:53:09 normalization > _airbyte_ab_id, 2022-05-12 10:53:09 normalization > _airbyte_emitted_at, 2022-05-12 10:53:09 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-05-12 10:53:09 normalization > from `bi-dixa-staging`.temp._airbyte_raw_test_owners as table_alias 2022-05-12 10:53:09 normalization > -- test_owners 2022-05-12 10:53:09 normalization > where 1 = 1 2022-05-12 10:53:09 normalization > ), __dbt__cte__test_owners_ab2 as ( 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type 2022-05-12 10:53:09 normalization > -- depends_on: __dbt__cte__test_owners_ab1 2022-05-12 10:53:09 normalization > select 2022-05-12 10:53:09 normalization > cast(id as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ) as id, 2022-05-12 10:53:09 normalization > cast(email as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ) as email, 2022-05-12 10:53:09 normalization > teams, 2022-05-12 10:53:09 normalization > cast(userId as 2022-05-12 10:53:09 normalization > int64 2022-05-12 10:53:09 normalization > ) as userId, 2022-05-12 10:53:09 normalization > cast(archived as boolean) as archived, 2022-05-12 10:53:09 normalization > cast(lastName as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ) as lastName, 2022-05-12 10:53:09 normalization > cast(nullif(createdAt, '') as 2022-05-12 10:53:09 normalization > timestamp 2022-05-12 10:53:09 normalization > ) as createdAt, 2022-05-12 10:53:09 normalization > cast(firstName as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ) as firstName, 2022-05-12 10:53:09 normalization > cast(nullif(updatedAt, '') as 2022-05-12 10:53:09 normalization > timestamp 2022-05-12 10:53:09 normalization > ) as updatedAt, 2022-05-12 10:53:09 normalization > _airbyte_ab_id, 2022-05-12 10:53:09 normalization > _airbyte_emitted_at, 2022-05-12 10:53:09 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-05-12 10:53:09 normalization > from __dbt__cte__test_owners_ab1 2022-05-12 10:53:09 normalization > -- test_owners 2022-05-12 10:53:09 normalization > where 1 = 1 2022-05-12 
10:53:09 normalization > ), __dbt__cte__test_owners_ab3 as ( 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > -- SQL model to build a hash column based on the values of this record 2022-05-12 10:53:09 normalization > -- depends_on: __dbt__cte__test_owners_ab2 2022-05-12 10:53:09 normalization > select 2022-05-12 10:53:09 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(email as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(array_to_string(teams, "|", "") as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(userId as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(archived as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(lastName as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(createdAt as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(firstName as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(updatedAt as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), '')) as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ))) as _airbyte_test_owners_hashid, 2022-05-12 10:53:09 normalization > tmp.* 2022-05-12 10:53:09 normalization > from __dbt__cte__test_owners_ab2 tmp 2022-05-12 10:53:09 normalization > -- test_owners 2022-05-12 10:53:09 normalization > where 1 = 1 2022-05-12 10:53:09 normalization > )-- Final base SQL model 2022-05-12 10:53:09 normalization > -- depends_on: __dbt__cte__test_owners_ab3 2022-05-12 10:53:09 normalization > select 2022-05-12 10:53:09 normalization > id, 2022-05-12 10:53:09 normalization > email, 2022-05-12 10:53:09 normalization > teams, 2022-05-12 10:53:09 normalization > userId, 2022-05-12 10:53:09 normalization > archived, 2022-05-12 10:53:09 normalization > lastName, 2022-05-12 10:53:09 normalization > createdAt, 2022-05-12 10:53:09 normalization > firstName, 2022-05-12 10:53:09 normalization > updatedAt, 2022-05-12 10:53:09 normalization > _airbyte_ab_id, 2022-05-12 10:53:09 normalization > _airbyte_emitted_at, 2022-05-12 10:53:09 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-05-12 10:53:09 normalization > _airbyte_test_owners_hashid 2022-05-12 10:53:09 normalization > from __dbt__cte__test_owners_ab3 2022-05-12 10:53:09 normalization > -- test_owners from `bi-dixa-staging`.temp._airbyte_raw_test_owners 2022-05-12 10:53:09 normalization > where 1 = 1 2022-05-12 10:53:09 normalization > ); 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > 10:53:05.933147 [debug] [Thread-7 ]: BigQuery adapter: Unhandled error while running: 2022-05-12 10:53:09 normalization > /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.test_owners"} */ 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > create or replace table `bi-dixa-staging`.temp.`test_owners` 2022-05-12 10:53:09 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-05-12 10:53:09 normalization > cluster by _airbyte_emitted_at 
2022-05-12 10:53:09 normalization > OPTIONS() 2022-05-12 10:53:09 normalization > as ( 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > with __dbt__cte__test_owners_ab1 as ( 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-05-12 10:53:09 normalization > -- depends_on: `bi-dixa-staging`.temp._airbyte_raw_test_owners 2022-05-12 10:53:09 normalization > select 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['id']") as id, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['email']") as email, 2022-05-12 10:53:09 normalization > json_extract_array(_airbyte_data, "$['teams']") as teams, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['userId']") as userId, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['archived']") as archived, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['lastName']") as lastName, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['createdAt']") as createdAt, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['firstName']") as firstName, 2022-05-12 10:53:09 normalization > json_extract_scalar(_airbyte_data, "$['updatedAt']") as updatedAt, 2022-05-12 10:53:09 normalization > _airbyte_ab_id, 2022-05-12 10:53:09 normalization > _airbyte_emitted_at, 2022-05-12 10:53:09 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-05-12 10:53:09 normalization > from `bi-dixa-staging`.temp._airbyte_raw_test_owners as table_alias 2022-05-12 10:53:09 normalization > -- test_owners 2022-05-12 10:53:09 normalization > where 1 = 1 2022-05-12 10:53:09 normalization > ), __dbt__cte__test_owners_ab2 as ( 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type 2022-05-12 10:53:09 normalization > -- depends_on: __dbt__cte__test_owners_ab1 2022-05-12 10:53:09 normalization > select 2022-05-12 10:53:09 normalization > cast(id as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ) as id, 2022-05-12 10:53:09 normalization > cast(email as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ) as email, 2022-05-12 10:53:09 normalization > teams, 2022-05-12 10:53:09 normalization > cast(userId as 2022-05-12 10:53:09 normalization > int64 2022-05-12 10:53:09 normalization > ) as userId, 2022-05-12 10:53:09 normalization > cast(archived as boolean) as archived, 2022-05-12 10:53:09 normalization > cast(lastName as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ) as lastName, 2022-05-12 10:53:09 normalization > cast(nullif(createdAt, '') as 2022-05-12 10:53:09 normalization > timestamp 2022-05-12 10:53:09 normalization > ) as createdAt, 2022-05-12 10:53:09 normalization > cast(firstName as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ) as firstName, 2022-05-12 10:53:09 normalization > cast(nullif(updatedAt, '') as 2022-05-12 10:53:09 normalization > timestamp 2022-05-12 10:53:09 normalization > ) as updatedAt, 2022-05-12 10:53:09 normalization > _airbyte_ab_id, 2022-05-12 10:53:09 normalization > _airbyte_emitted_at, 2022-05-12 10:53:09 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-05-12 10:53:09 
normalization > from __dbt__cte__test_owners_ab1 2022-05-12 10:53:09 normalization > -- test_owners 2022-05-12 10:53:09 normalization > where 1 = 1 2022-05-12 10:53:09 normalization > ), __dbt__cte__test_owners_ab3 as ( 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > -- SQL model to build a hash column based on the values of this record 2022-05-12 10:53:09 normalization > -- depends_on: __dbt__cte__test_owners_ab2 2022-05-12 10:53:09 normalization > select 2022-05-12 10:53:09 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(email as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(array_to_string(teams, "|", "") as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(userId as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(archived as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(lastName as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(createdAt as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(firstName as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), ''), '-', coalesce(cast(updatedAt as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ), '')) as 2022-05-12 10:53:09 normalization > string 2022-05-12 10:53:09 normalization > ))) as _airbyte_test_owners_hashid, 2022-05-12 10:53:09 normalization > tmp.* 2022-05-12 10:53:09 normalization > from __dbt__cte__test_owners_ab2 tmp 2022-05-12 10:53:09 normalization > -- test_owners 2022-05-12 10:53:09 normalization > where 1 = 1 2022-05-12 10:53:09 normalization > )-- Final base SQL model 2022-05-12 10:53:09 normalization > -- depends_on: __dbt__cte__test_owners_ab3 2022-05-12 10:53:09 normalization > select 2022-05-12 10:53:09 normalization > id, 2022-05-12 10:53:09 normalization > email, 2022-05-12 10:53:09 normalization > teams, 2022-05-12 10:53:09 normalization > userId, 2022-05-12 10:53:09 normalization > archived, 2022-05-12 10:53:09 normalization > lastName, 2022-05-12 10:53:09 normalization > createdAt, 2022-05-12 10:53:09 normalization > firstName, 2022-05-12 10:53:09 normalization > updatedAt, 2022-05-12 10:53:09 normalization > _airbyte_ab_id, 2022-05-12 10:53:09 normalization > _airbyte_emitted_at, 2022-05-12 10:53:09 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-05-12 10:53:09 normalization > _airbyte_test_owners_hashid 2022-05-12 10:53:09 normalization > from __dbt__cte__test_owners_ab3 2022-05-12 10:53:09 normalization > -- test_owners from `bi-dixa-staging`.temp._airbyte_raw_test_owners 2022-05-12 10:53:09 normalization > where 1 = 1 2022-05-12 10:53:09 normalization > ); 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > 10:53:05.933484 [debug] [Thread-7 ]: BigQuery adapter: 404 Not found: Table bi-dixa-staging:temp._airbyte_raw_test_owners was not found in location EU 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > (job ID: be1c91aa-7103-4af0-8744-1f787e8c36e8) 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 normalization > -----Query Job SQL Follows----- 2022-05-12 10:53:09 normalization > 2022-05-12 10:53:09 
normalization > | . | . | . | . | . | . | . | . | . | . | . | . | . | . | 2022-05-12 10:53:09 normalization > 1:/* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.test_owners"} */ 2022-05-12 10:53:09 normalization > 2: 2022-05-12 10:53:09 normalization > 3: 2022-05-12 10:53:09 normalization > 4: create or replace table `bi-dixa-staging`.temp.`test_owners` 2022-05-12 10:53:09 normalization > 5: partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-05-12 10:53:09 normalization > 6: cluster by _airbyte_emitted_at 2022-05-12 10:53:09 normalization > 7: OPTIONS() 2022-05-12 10:53:09 normalization > 8: as ( 2022-05-12 10:53:09 normalization > 9: 2022-05-12 10:53:09 normalization > 10:with __dbt__cte__test_owners_ab1 as ( 2022-05-12 10:53:09 normalization > 11: 2022-05-12 10:53:09 normalization > 12:-- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-05-12 10:53:09 normalization > 13:-- depends_on: `bi-dixa-staging`.temp._airbyte_raw_test_owners 2022-05-12 10:53:09 normalization > 14:select 2022-05-12 10:53:09 normalization > 15: json_extract_scalar(_airbyte_data, "$['id']") as id, 2022-05-12 10:53:09 normalization > 16: json_extract_scalar(_airbyte_data, "$['email']") as email, 2022-05-12 10:53:09 normalization > 17: json_extract_array(_airbyte_data, "$['teams']") as teams, 2022-05-12 10:53:09 normalization > 18: json_extract_scalar(_airbyte_data, "$['userId']") as userId, 2022-05-12 10:53:09 normalization > 19: json_extract_scalar(_airbyte_data, "$['archived']") as archived, 2022-05-12 10:53:09 normalization > 20: json_extract_scalar(_airbyte_data, "$['lastName']") as lastName, 2022-05-12 10:53:09 normalization > 21: json_extract_scalar(_airbyte_data, "$['createdAt']") as createdAt, 2022-05-12 10:53:09 normalization > 22: json_extract_scalar(_airbyte_data, "$['firstName']") as firstName, 2022-05-12 10:53:09 normalization > 23: json_extract_scalar(_airbyte_data, "$['updatedAt']") as updatedAt, 2022-05-12 10:53:09 normalization > 24: _airbyte_ab_id, 2022-05-12 10:53:09 normalization > 25: _airbyte_emitted_at, 2022-05-12 10:53:09 normalization > 26: CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-05-12 10:53:09 normalization > 27:from `bi-dixa-staging`.temp._airbyte_raw_test_owners as table_alias 2022-05-12 10:53:09 normalization > 28:-- test_owners 2022-05-12 10:53:09 normalization > 29:where 1 = 1 2022-05-12 10:53:09 normalization > 30:), __dbt__cte__test_owners_ab2 as ( 2022-05-12 10:53:09 normalization > 31: 2022-05-12 10:53:09 normalization > 32:-- SQL model to cast each column to its adequate SQL type converted from the JSON schema type 2022-05-12 10:53:09 normalization > 33:-- depends_on: __dbt__cte__test_owners_ab1 2022-05-12 10:53:09 normalization > 34:select 2022-05-12 10:53:09 normalization > 35: cast(id as 2022-05-12 10:53:09 normalization > 36: string 2022-05-12 10:53:09 normalization > 37:) as id, 2022-05-12 10:53:09 normalization > 38: cast(email as 2022-05-12 10:53:09 normalization > 39: string 2022-05-12 10:53:09 normalization > 40:) as email, 2022-05-12 10:53:09 normalization > 41: teams, 2022-05-12 10:53:09 normalization > 42: cast(userId as 2022-05-12 10:53:09 normalization > 43: int64 2022-05-12 10:53:09 normalization > 44:) as userId, 2022-05-12 10:53:09 normalization > 45: cast(archived as boolean) as archived, 2022-05-12 10:53:09 normalization > 46: cast(lastName as 2022-05-12 10:53:09 normalization > 47: 
string 2022-05-12 10:53:09 normalization > 48:) as lastName, 2022-05-12 10:53:09 normalization > 49: cast(nullif(createdAt, '') as 2022-05-12 10:53:09 normalization > 50: timestamp 2022-05-12 10:53:09 normalization > 51:) as createdAt, 2022-05-12 10:53:09 normalization > 52: cast(firstName as 2022-05-12 10:53:09 normalization > 53: string 2022-05-12 10:53:09 normalization > 54:) as firstName, 2022-05-12 10:53:09 normalization > 55: cast(nullif(updatedAt, '') as 2022-05-12 10:53:09 normalization > 56: timestamp 2022-05-12 10:53:09 normalization > 57:) as updatedAt, 2022-05-12 10:53:09 normalization > 58: _airbyte_ab_id, 2022-05-12 10:53:09 normalization > 59: _airbyte_emitted_at, 2022-05-12 10:53:09 normalization > 60: CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-05-12 10:53:09 normalization > 61:from __dbt__cte__test_owners_ab1 2022-05-12 10:53:09 normalization > 62:-- test_owners 2022-05-12 10:53:09 normalization > 63:where 1 = 1 2022-05-12 10:53:09 normalization > 64:), __dbt__cte__test_owners_ab3 as ( 2022-05-12 10:53:09 normalization > 65: 2022-05-12 10:53:09 normalization > 66:-- SQL model to build a hash column based on the values of this record 2022-05-12 10:53:09 normalization > 67:-- depends_on: __dbt__cte__test_owners_ab2 2022-05-12 10:53:09 normalization > 68:select 2022-05-12 10:53:09 normalization > 69: to_hex(md5(cast(concat(coalesce(cast(id as 2022-05-12 10:53:09 normalization > 70: string 2022-05-12 10:53:09 normalization > 71:), ''), '-', coalesce(cast(email as 2022-05-12 10:53:09 normalization > 72: string 2022-05-12 10:53:09 normalization > 73:), ''), '-', coalesce(cast(array_to_string(teams, "|", "") as 2022-05-12 10:53:09 normalization > 74: string 2022-05-12 10:53:09 normalization > 75:), ''), '-', coalesce(cast(userId as 2022-05-12 10:53:09 normalization > 76: string 2022-05-12 10:53:09 normalization > 77:), ''), '-', coalesce(cast(archived as 2022-05-12 10:53:09 normalization > 78: string 2022-05-12 10:53:09 normalization > 79:), ''), '-', coalesce(cast(lastName as 2022-05-12 10:53:09 normalization > 80: string 2022-05-12 10:53:09 normalization > 81:), ''), '-', coalesce(cast(createdAt as 2022-05-12 10:53:09 normalization > 82: string 2022-05-12 10:53:09 normalization > 83:), ''), '-', coalesce(cast(firstName as 2022-05-12 10:53:09 normalization > 84: string 2022-05-12 10:53:09 normalization > 85:), ''), '-', coalesce(cast(updatedAt as 2022-05-12 10:53:09 normalization > 86: string 2022-05-12 10:53:09 normalization > 87:), '')) as 2022-05-12 10:53:09 normalization > 88: string 2022-05-12 10:53:09 normalization > 89:))) as _airbyte_test_owners_hashid, 2022-05-12 10:53:09 normalization > 90: tmp.* 2022-05-12 10:53:09 normalization > 91:from __dbt__cte__test_owners_ab2 tmp 2022-05-12 10:53:09 normalization > 92:-- test_owners 2022-05-12 10:53:09 normalization > 93:where 1 = 1 2022-05-12 10:53:09 normalization > 94:)-- Final base SQL model 2022-05-12 10:53:09 normalization > 95:-- depends_on: __dbt__cte__test_owners_ab3 2022-05-12 10:53:09 normalization > 96:select 2022-05-12 10:53:09 normalization > 97: id, 2022-05-12 10:53:09 normalization > 98: email, 2022-05-12 10:53:09 normalization > 99: teams, 2022-05-12 10:53:09 normalization > 100: userId, 2022-05-12 10:53:09 normalization > 101: archived, 2022-05-12 10:53:09 normalization > 102: lastName, 2022-05-12 10:53:09 normalization > 103: createdAt, 2022-05-12 10:53:09 normalization > 104: firstName, 2022-05-12 10:53:09 normalization > 105: updatedAt, 2022-05-12 10:53:09 normalization > 106: _airbyte_ab_id, 
2022-05-12 10:53:09 normalization > 107: _airbyte_emitted_at,
2022-05-12 10:53:09 normalization > 108: CURRENT_TIMESTAMP() as _airbyte_normalized_at,
2022-05-12 10:53:09 normalization > 109: _airbyte_test_owners_hashid
2022-05-12 10:53:09 normalization > 110:from __dbt__cte__test_owners_ab3
2022-05-12 10:53:09 normalization > 111:-- test_owners from `bi-dixa-staging`.temp._airbyte_raw_test_owners
2022-05-12 10:53:09 normalization > 112:where 1 = 1
2022-05-12 10:53:09 normalization > 113: );
2022-05-12 10:53:09 normalization > 114:
2022-05-12 10:53:09 normalization > | . | . | . | . | . | . | . | . | . | . | . | . | . | . |
2022-05-12 10:53:09 normalization > 10:53:05.933791 [debug] [Thread-7 ]: finished collecting timing info
2022-05-12 10:53:09 normalization > 10:53:05.934309 [debug] [Thread-7 ]: Runtime Error in model test_owners (models/generated/airbyte_tables/temp/test_owners.sql)
2022-05-12 10:53:09 normalization > 404 Not found: Table bi-dixa-staging:temp._airbyte_raw_test_owners was not found in location EU
2022-05-12 10:53:09 normalization >
2022-05-12 10:53:09 normalization > (job ID: be1c91aa-7103-4af0-8744-1f787e8c36e8)
2022-05-12 10:53:09 normalization > 10:53:05.934825 [error] [Thread-7 ]: 1 of 2 ERROR creating table model temp.test_owners...................................................................... [ERROR in 0.30s]
2022-05-12 10:53:09 normalization > 10:53:05.935296 [debug] [Thread-7 ]: Finished running node model.airbyte_utils.test_owners
2022-05-12 10:53:09 normalization > 10:53:05.936353 [debug] [Thread-2 ]: Began running node model.airbyte_utils.test_owners_teams_ab1
2022-05-12 10:53:09 normalization > 10:53:05.936674 [debug] [Thread-2 ]: Finished running node model.airbyte_utils.test_owners_teams_ab1
2022-05-12 10:53:09 normalization > 10:53:05.937432 [debug] [Thread-4 ]: Began running node model.airbyte_utils.test_owners_teams_ab2
2022-05-12 10:53:09 normalization > 10:53:05.937897 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.test_owners_teams_ab2
2022-05-12 10:53:09 normalization > 10:53:05.938742 [debug] [Thread-6 ]: Began running node model.airbyte_utils.test_owners_teams_ab3
2022-05-12 10:53:09 normalization > 10:53:05.939009 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.test_owners_teams_ab3
2022-05-12 10:53:09 normalization > 10:53:05.939597 [debug] [Thread-8 ]: Began running node model.airbyte_utils.test_owners_teams
2022-05-12 10:53:09 normalization > 10:53:05.940069 [info ] [Thread-8 ]: 2 of 2 SKIP relation temp.test_owners_teams............................................................................. [SKIP]
2022-05-12 10:53:09 normalization > 10:53:05.940669 [debug] [Thread-8 ]: Finished running node model.airbyte_utils.test_owners_teams
2022-05-12 10:53:09 normalization > 10:53:05.942854 [debug] [MainThread]: Acquiring new bigquery connection "master"
2022-05-12 10:53:09 normalization > 10:53:05.943429 [info ] [MainThread]:
2022-05-12 10:53:09 normalization > 10:53:05.943785 [info ] [MainThread]: Finished running 2 table models in 1.08s.
2022-05-12 10:53:09 normalization > 10:53:05.944231 [debug] [MainThread]: Connection 'master' was properly closed.
2022-05-12 10:53:09 normalization > 10:53:05.944492 [debug] [MainThread]: Connection 'model.airbyte_utils.test_owners_ab1' was properly closed.
2022-05-12 10:53:09 normalization > 10:53:05.944714 [debug] [MainThread]: Connection 'model.airbyte_utils.test_owners_ab2' was properly closed.
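The statement that failed is dbt's create or replace table for test_owners, which selects from `bi-dixa-staging`.temp._airbyte_raw_test_owners; a BigQuery "404 ... was not found in location EU" at this point generally means the raw table either was never created by the destination step or exists in a dataset/region other than the EU location declared in the write config (datasetLocation=EU with GCS staging). A minimal check, sketched with the google-cloud-bigquery client and assuming read access to the bi-dixa-staging project; the project, dataset, and table names are taken from the log above, everything else is an assumption:

# Diagnostic sketch, not part of the sync: confirm the dataset's location and whether
# the raw table exists at all. Names are copied from the log; access is assumed.
from google.cloud import bigquery
from google.cloud.exceptions import NotFound

client = bigquery.Client(project="bi-dixa-staging")

dataset = client.get_dataset("bi-dixa-staging.temp")
print("dataset location:", dataset.location)  # normalization queried this dataset in EU

try:
    table = client.get_table("bi-dixa-staging.temp._airbyte_raw_test_owners")
    print("raw table exists, rows:", table.num_rows)
except NotFound:
    # Same condition the dbt run hit: the destination never created the raw table
    # here, or it was created under a different dataset/location.
    print("_airbyte_raw_test_owners is missing from bi-dixa-staging.temp")

If the dataset reports a location other than EU, or the GCS staging bucket lives in a different region, aligning those locations (or recreating the dataset in the intended region) is the usual fix.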
2022-05-12 10:53:09 normalization > 10:53:05.944850 [debug] [MainThread]: Connection 'model.airbyte_utils.test_owners_ab3' was properly closed.
2022-05-12 10:53:09 normalization > 10:53:05.944966 [debug] [MainThread]: Connection 'model.airbyte_utils.test_owners' was properly closed.
2022-05-12 10:53:09 normalization > 10:53:05.959566 [info ] [MainThread]:
2022-05-12 10:53:09 normalization > 10:53:05.959912 [info ] [MainThread]: Completed with 1 error and 0 warnings:
2022-05-12 10:53:09 normalization > 10:53:05.960472 [info ] [MainThread]:
2022-05-12 10:53:09 normalization > 10:53:05.961054 [error] [MainThread]: Runtime Error in model test_owners (models/generated/airbyte_tables/temp/test_owners.sql)
2022-05-12 10:53:09 normalization > 10:53:05.961516 [error] [MainThread]: 404 Not found: Table bi-dixa-staging:temp._airbyte_raw_test_owners was not found in location EU
2022-05-12 10:53:09 normalization > 10:53:05.962003 [error] [MainThread]:
2022-05-12 10:53:09 normalization > 10:53:05.962440 [error] [MainThread]: (job ID: be1c91aa-7103-4af0-8744-1f787e8c36e8)
2022-05-12 10:53:09 normalization > 10:53:05.962841 [info ] [MainThread]:
2022-05-12 10:53:09 normalization > 10:53:05.963321 [info ] [MainThread]: Done. PASS=0 WARN=0 ERROR=1 SKIP=1 TOTAL=2
2022-05-12 10:53:09 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):158 - Completing future exceptionally...
io.airbyte.workers.WorkerException: Normalization Failed.
    at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:61) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:19) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.WorkerException: Normalization Failed.
    at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:58) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    ... 3 more
    Suppressed: io.airbyte.workers.WorkerException: Normalization process wasn't successful
        at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:160) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:46) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:19) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
2022-05-12 10:53:09 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
2022-05-12 10:53:09 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):235 - Stopping temporal heartbeating...
2022-05-12 10:53:09 WARN i.t.i.a.POJOActivityTaskHandler(activityFailureToResult):307 - Activity failure. ActivityId=7da5a44b-f65d-3246-9819-cfdae149d274, activityType=Normalize, attempt=1
java.lang.RuntimeException: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Normalization Failed.
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:233) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.normalize(NormalizationActivityImpl.java:72) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at jdk.internal.reflect.GeneratedMethodAccessor207.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityInboundCallsInterceptor.execute(POJOActivityTaskHandler.java:214) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityImplementation.execute(POJOActivityTaskHandler.java:180) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler.handle(POJOActivityTaskHandler.java:120) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:204) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:164) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.8.1.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Normalization Failed.
    at io.temporal.serviceclient.CheckedExceptionWrapper.wrap(CheckedExceptionWrapper.java:56) ~[temporal-serviceclient-1.8.1.jar:?]
    at io.temporal.internal.sync.WorkflowInternal.wrap(WorkflowInternal.java:448) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.activity.Activity.wrap(Activity.java:51) ~[temporal-sdk-1.8.1.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:135) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$1(NormalizationActivityImpl.java:98) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:228) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    ... 13 more
Caused by: java.util.concurrent.ExecutionException: io.airbyte.workers.WorkerException: Normalization Failed.
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:129) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$1(NormalizationActivityImpl.java:98) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:228) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    ... 13 more
Caused by: io.airbyte.workers.WorkerException: Normalization Failed.
    at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:61) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:19) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    ... 1 more
Caused by: io.airbyte.workers.WorkerException: Normalization Failed.
    at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:58) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:19) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    ... 1 more
    Suppressed: io.airbyte.workers.WorkerException: Normalization process wasn't successful
        at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:160) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:46) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at io.airbyte.workers.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:19) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
2022-05-12 10:53:09 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. errors: $.client_id: is missing but it is required, $.client_secret: is missing but it is required, $.refresh_token: is missing but it is required, $.credentials_title: must be a constant value OAuth Credentials, $.credentials_title: does not have a value in the enumeration [OAuth Credentials]
2022-05-12 10:53:09 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. errors: $.credential: is not defined in the schema and the schema does not allow additional properties, $.part_size_mb: is not defined in the schema and the schema does not allow additional properties, $.gcs_bucket_name: is not defined in the schema and the schema does not allow additional properties, $.gcs_bucket_path: is not defined in the schema and the schema does not allow additional properties, $.keep_files_in_gcs-bucket: is not defined in the schema and the schema does not allow additional properties, $.method: must be a constant value Standard
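The two JsonSchemaValidator entries at the end appear to be the platform checking the supplied source and destination configs against the different variants (oneOf options) in the connector specs, so they typically show up at INFO level even when the configured variant is valid but simply not the one being tested (for example, HubSpot credentials other than OAuth, or BigQuery GCS staging rather than Standard inserts, which matches the GCS loading method selected earlier in this log). For reference, a sketch of the credentials object the first message describes; the field names are copied from the log, the values are placeholders, and the surrounding structure is an assumption rather than the connector's published spec:

# Illustrative only: the fields the validator expects for an "OAuth Credentials"
# HubSpot config; values are placeholders.
hubspot_oauth_credentials = {
    "credentials_title": "OAuth Credentials",  # must be exactly this constant
    "client_id": "<client-id>",
    "client_secret": "<client-secret>",
    "refresh_token": "<refresh-token>",
}

If the connection is meant to authenticate with OAuth, all three secret fields must be present; otherwise these validation lines can be read as informational rather than as the cause of the normalization failure above.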