2022-07-11 15:46:23 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.plugin: is not defined in the schema and the schema does not allow additional properties, $.publication: is not defined in the schema and the schema does not allow additional properties, $.replication_slot: is not defined in the schema and the schema does not allow additional properties, $.method: does not have a value in the enumeration [Standard], $.method: must be a constant value Standard
2022-07-11 15:46:23 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.method: must be a constant value Standard
2022-07-11 15:46:23 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.credential.hmac_key_access_id: object found, string expected, $.credential.hmac_key_secret: object found, string expected
2022-07-11 15:46:23 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/0/logs.log
2022-07-11 15:46:23 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha
2022-07-11 15:46:23 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false'
2022-07-11 15:46:23 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-postgres:0.4.31 exists...
2022-07-11 15:46:23 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-postgres:0.4.31 was found locally.
2022-07-11 15:46:23 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696
2022-07-11 15:46:23 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/0 --log-driver none --name source-postgres-check-89696-0-qyscz --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/source-postgres:0.4.31 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/source-postgres:0.4.31 check --config source_config.json
2022-07-11 15:46:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:25 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(getSource):73 - Running source under deployment mode: OSS
2022-07-11 15:46:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:25 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):85 - Starting source: io.airbyte.integrations.base.ssh.SshWrappedSource
2022-07-11 15:46:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:25 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-07-11 15:46:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:25 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2022-07-11 15:46:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:25 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK
2022-07-11 15:46:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:25 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-07-11 15:46:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:25 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-11 15:46:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:25 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-11 15:46:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:25 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-11 15:46:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:25 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-11 15:46:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:26 INFO i.a.i.b.s.SshTunnel(getInstance):170 - Starting connection with method: NO_TUNNEL
2022-07-11 15:46:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:26 INFO c.z.h.HikariDataSource():80 - HikariPool-1 - Starting...
2022-07-11 15:46:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:26 INFO c.z.h.HikariDataSource():82 - HikariPool-1 - Start completed.
2022-07-11 15:46:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:26 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true
2022-07-11 15:46:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:26 INFO i.a.i.s.j.AbstractJdbcSource(lambda$getCheckOperations$1):93 - Attempting to get metadata from the database to see if we can connect.
2022-07-11 15:46:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:26 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$2):197 - Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@1637601612 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot_achilles' AND plugin = 'wal2json' AND database = 'achilles'
2022-07-11 15:46:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:26 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows
2022-07-11 15:46:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:27 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$4):214 - Attempting to find the publication using the query: HikariProxyPreparedStatement@2063786038 wrapping SELECT * FROM pg_publication WHERE pubname = 'achilles_publication'
2022-07-11 15:46:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:27 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows
2022-07-11 15:46:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:27 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated...
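The source check above verifies the CDC prerequisites by looking up the logical replication slot and the publication with the two queries quoted in the log; the JSON schema validation messages at the top appear to be emitted while the connection config is matched against the Standard and CDC variants of the replication method, after which the connector proceeds with CDC. The same lookups can be reproduced directly against the achilles database. A minimal SQL sketch, using the slot, plugin, and publication names from the log; the CREATE statements are illustrative only (they assume the publication should cover the six streams configured for this sync and are not needed when the objects already exist):

-- Mirrors the checks in the log above; run against the 'achilles' database.
SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot_achilles' AND plugin = 'wal2json' AND database = 'achilles';
SELECT * FROM pg_publication WHERE pubname = 'achilles_publication';

-- Illustrative setup only, for an environment where the slot or publication is missing:
SELECT pg_create_logical_replication_slot('airbyte_slot_achilles', 'wal2json');
CREATE PUBLICATION achilles_publication FOR TABLE
  public.bank_config, public.partner_config, public.files_in, public.files_out,
  public.transactions_in, public.transactions_out;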
2022-07-11 15:46:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:27 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed.
2022-07-11 15:46:27 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-11 15:46:27 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/0/logs.log
2022-07-11 15:46:27 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha
2022-07-11 15:46:27 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false'
2022-07-11 15:46:27 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-bigquery:1.1.11 exists...
2022-07-11 15:46:27 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-bigquery:1.1.11 was found locally.
2022-07-11 15:46:27 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696
2022-07-11 15:46:27 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/0 --log-driver none --name destination-bigquery-check-89696-0-znquy --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:1.1.11 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/destination-bigquery:1.1.11 check --config source_config.json
2022-07-11 15:46:28 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings.
2022-07-11 15:46:28 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-11 15:46:28 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-11 15:46:28 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-11 15:46:28 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-07-11 15:46:28 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-07-11 15:46:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:29 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-07-11 15:46:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:29 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2022-07-11 15:46:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:29 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK
2022-07-11 15:46:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:29 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-07-11 15:46:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:29 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-11 15:46:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:29 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-11 15:46:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:29 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-11 15:46:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:29 INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):301 - Selected loading method is set to: GCS
2022-07-11 15:46:31 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:31 INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"format_type":"CSV","flattening":"No flattening"}
2022-07-11 15:46:31 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:31 INFO i.a.i.d.s.S3Destination(testSingleUpload):81 - Started testing if all required credentials assigned to user for single file uploading
2022-07-11 15:46:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:32 INFO i.a.i.d.s.S3Destination(testSingleUpload):91 - Finished checking for normal upload mode
2022-07-11 15:46:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:32 INFO i.a.i.d.s.S3Destination(testMultipartUpload):95 - Started testing if all required credentials assigned to user for multipart upload
2022-07-11 15:46:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:32 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/test_1657554392131 with full ID ABPnzm62mJUaq8qVIq6oY1GUNNihfRU9_IPS1AHy6y6omGhYISc3ty1Flu9_dW8OpAaAUgg
2022-07-11 15:46:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:32 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-07-11 15:46:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:32 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-07-11 15:46:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:32 WARN a.m.s.MultiPartOutputStream(close):160 - [MultipartOutputStream for parts 1 - 10000] is already closed
2022-07-11 15:46:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:32 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/test_1657554392131 with id ABPnzm62m...8OpAaAUgg]: Uploading leftover stream [Part number 1 containing 3.34 MB]
2022-07-11 15:46:32 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:32 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/test_1657554392131 with id ABPnzm62m...8OpAaAUgg]: Finished uploading [Part number 1 containing 3.34 MB]
2022-07-11 15:46:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:33 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/test_1657554392131 with id ABPnzm62m...8OpAaAUgg]: Completed
2022-07-11 15:46:33 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:46:33 INFO i.a.i.d.s.S3Destination(testMultipartUpload):119 - Finished verification for multipart upload mode
2022-07-11 15:46:34 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-11 15:46:34 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/0/logs.log
2022-07-11 15:46:34 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha
2022-07-11 15:46:34 INFO i.a.w.g.DefaultReplicationWorker(run):115 - start sync worker. job id: 89696 attempt id: 0
2022-07-11 15:46:34 INFO i.a.w.g.DefaultReplicationWorker(run):127 - configured sync modes: {public.bank_config=incremental - append_dedup, public.files_out=incremental - append_dedup, public.transactions_out=incremental - append_dedup, public.partner_config=incremental - append_dedup, public.transactions_in=incremental - append_dedup, public.files_in=incremental - append_dedup}
2022-07-11 15:46:34 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination...
2022-07-11 15:46:34 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false'
2022-07-11 15:46:34 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-bigquery:1.1.11 exists...
2022-07-11 15:46:34 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-bigquery:1.1.11 was found locally.
2022-07-11 15:46:34 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696
2022-07-11 15:46:34 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/0 --log-driver none --name destination-bigquery-write-89696-0-aoxuq --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:1.1.11 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/destination-bigquery:1.1.11 write --config destination_config.json --catalog destination_catalog.json
2022-07-11 15:46:34 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false'
2022-07-11 15:46:34 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-postgres:0.4.31 exists...
2022-07-11 15:46:34 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-postgres:0.4.31 was found locally.
2022-07-11 15:46:34 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696
2022-07-11 15:46:34 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/0 --log-driver none --name source-postgres-read-89696-0-cjvtu --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/source-postgres:0.4.31 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/source-postgres:0.4.31 read --config source_config.json --catalog source_catalog.json --state input_state.json
2022-07-11 15:46:34 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):405 - Destination output thread started.
2022-07-11 15:46:34 INFO i.a.w.g.DefaultReplicationWorker(run):169 - Waiting for source and destination threads to complete.
2022-07-11 15:46:34 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):298 - Replication thread started.
2022-07-11 15:46:35 destination > SLF4J: Class path contains multiple SLF4J bindings.
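Each of the six streams listed in the configured sync modes above is staged into the raw_achilles BigQuery dataset; the write configs further down in the log show target tables named _airbyte_raw_<stream> with columns _airbyte_ab_id, _airbyte_emitted_at, and _airbyte_data, with append_dedup handled downstream. A minimal SQL sketch for inspecting what lands there, assuming standard BigQuery SQL with the default project set and that the serialized record carries the updated cursor field reported later in the log:

-- Hedged example: inspect raw records staged for the bank_config stream.
-- Dataset, table, and column names come from the BigQuery write configs below;
-- the '$.updated' path is the cursor field named later in the log and is assumed
-- to be present in the serialized record.
SELECT
  _airbyte_ab_id,
  _airbyte_emitted_at,
  JSON_EXTRACT_SCALAR(_airbyte_data, '$.updated') AS updated
FROM raw_achilles._airbyte_raw_bank_config
ORDER BY _airbyte_emitted_at DESC
LIMIT 10;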
2022-07-11 15:46:35 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:46:35 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:46:35 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:46:35 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 2022-07-11 15:46:35 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] 2022-07-11 15:46:37 source > 2022-07-11 15:46:37 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(getSource):73 - Running source under deployment mode: OSS 2022-07-11 15:46:37 source > 2022-07-11 15:46:37 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):85 - Starting source: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:46:37 destination > 2022-07-11 15:46:37 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json} 2022-07-11 15:46:37 source > 2022-07-11 15:46:37 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {read=null, catalog=source_catalog.json, state=input_state.json, config=source_config.json} 2022-07-11 15:46:37 source > 2022-07-11 15:46:37 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:46:37 source > 2022-07-11 15:46:37 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: READ 2022-07-11 15:46:37 destination > 2022-07-11 15:46:37 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2022-07-11 15:46:37 source > 2022-07-11 15:46:37 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='input_state.json'} 2022-07-11 15:46:37 destination > 2022-07-11 15:46:37 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: WRITE 2022-07-11 15:46:37 destination > 2022-07-11 15:46:37 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'} 2022-07-11 15:46:37 destination > 2022-07-11 15:46:37 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:46:37 destination > 2022-07-11 15:46:37 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:46:37 destination > 2022-07-11 15:46:37 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:46:37 source > 2022-07-11 15:46:37 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. 
If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:46:37 source > 2022-07-11 15:46:37 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:46:37 source > 2022-07-11 15:46:37 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:46:37 source > 2022-07-11 15:46:37 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:46:37 destination > 2022-07-11 15:46:37 INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):301 - Selected loading method is set to: GCS 2022-07-11 15:46:38 destination > 2022-07-11 15:46:38 INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"format_type":"AVRO","flattening":"No flattening"} 2022-07-11 15:46:38 destination > 2022-07-11 15:46:38 INFO i.a.i.d.b.BigQueryUtils(isKeepFilesInGcs):317 - All tmp files will be removed from GCS when replication is finished 2022-07-11 15:46:38 source > 2022-07-11 15:46:38 INFO i.a.i.b.s.SshTunnel(getInstance):170 - Starting connection with method: NO_TUNNEL 2022-07-11 15:46:38 source > 2022-07-11 15:46:38 INFO c.z.h.HikariDataSource():80 - HikariPool-1 - Starting... 2022-07-11 15:46:38 source > 2022-07-11 15:46:38 INFO c.z.h.HikariDataSource():82 - HikariPool-1 - Start completed. 2022-07-11 15:46:39 destination > 2022-07-11 15:46:39 INFO i.a.i.d.b.BigQueryDestination(getGcsRecordConsumer):289 - Creating BigQuery staging message consumer with staging ID 570c2fab-944c-4499-b774-309f87bdc1fd at 2022-07-11T15:46:38.294Z 2022-07-11 15:46:39 destination > 2022-07-11 15:46:39 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=bank_config, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_bwg_bank_config}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_bank_config}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:46:39 destination > 2022-07-11 15:46:39 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=files_in, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_nnj_files_in}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_in}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, 
description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:46:39 destination > 2022-07-11 15:46:39 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=files_out, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_ott_files_out}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_out}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:46:39 destination > 2022-07-11 15:46:39 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=partner_config, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_lgc_partner_config}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_partner_config}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:46:39 destination > 2022-07-11 15:46:39 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=transactions_in, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qek_transactions_in}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_in}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:46:39 destination > 2022-07-11 15:46:39 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=transactions_out, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wly_transactions_out}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_out}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, 
stagedFiles=[]] 2022-07-11 15:46:39 source > 2022-07-11 15:46:39 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:46:39 source > 2022-07-11 15:46:39 INFO i.a.i.s.j.AbstractJdbcSource(lambda$getCheckOperations$1):93 - Attempting to get metadata from the database to see if we can connect. 2022-07-11 15:46:39 source > 2022-07-11 15:46:39 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$2):197 - Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@1855734078 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot_achilles' AND plugin = 'wal2json' AND database = 'achilles' 2022-07-11 15:46:39 source > 2022-07-11 15:46:39 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:46:39 source > 2022-07-11 15:46:39 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$4):214 - Attempting to find the publication using the query: HikariProxyPreparedStatement@234223040 wrapping SELECT * FROM pg_publication WHERE pubname = 'achilles_publication' 2022-07-11 15:46:39 source > 2022-07-11 15:46:39 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:46:39 destination > 2022-07-11 15:46:39 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started. 2022-07-11 15:46:39 destination > 2022-07-11 15:46:39 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$3):98 - Preparing tmp tables in destination started for 6 streams 2022-07-11 15:46:39 source > 2022-07-11 15:46:39 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated... 2022-07-11 15:46:39 destination > 2022-07-11 15:46:39 INFO i.a.i.d.b.BigQueryGcsOperations(createSchemaIfNotExists):86 - Creating dataset raw_achilles 2022-07-11 15:46:39 source > 2022-07-11 15:46:39 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed. 2022-07-11 15:46:39 source > 2022-07-11 15:46:39 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.r.s.StateManagerFactory(createStateManager):51 - Global state manager selected to manage state object with type LEGACY. 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.r.s.StateManagerFactory(generateGlobalState):84 - Legacy state converted to global state. 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='files_in', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='partner_config', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='bank_config', namespace='public'}, New Cursor Field: updated. 
Resetting cursor value 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='files_out', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='transactions_out', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='transactions_in', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.r.CdcStateManager():29 - Initialized CDC state with: null 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO c.z.h.HikariDataSource():80 - HikariPool-2 - Starting... 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO c.z.h.HikariDataSource():82 - HikariPool-2 - Start completed. 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresSource(discoverRawTables):168 - Checking schema: public 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(discoverInternal):121 - Internal schemas to exclude: [catalog_history, information_schema, pg_catalog, pg_internal] 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column trace_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column account_no (type varchar[17]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column account_name (type varchar[22]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column transaction_code (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column dc_sign (type varchar[6]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column effective_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column originating_bank_routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO 
i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column destination_bank_routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_history (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column uuid (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_attempt (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column in_suspense (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_error (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column subtype (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column ach_entry (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column returned (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column partner_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column name (type varchar[23]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column account_prefix (type varchar[6]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table 
partner_config column config (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table schema_migrations column version (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table schema_migrations column dirty (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column name (type varchar[23]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column config (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column file_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column external_id (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column partner_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column trace_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column account_no (type varchar[17]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO 
i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column account_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column source_account_no (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column source_account_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column description (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column effective_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column destination_bank_routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column return_data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column reference_info (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column transaction_code (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column transaction_in_id (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column uuid (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column status (type varchar[30]) -> JsonSchemaType({type=string}) 2022-07-11 
15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column is_same_day (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column file_name (type varchar[255]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column batch_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column file_hash (type varchar[256]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column exchange_window (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column file_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column file_hash (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column company_id (type varchar[10]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column company_name (type varchar[16]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column company_entry_description (type varchar[10]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column batch_type (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO 
i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column batch_number (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column originating_dfi (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column sec_code (type varchar[3]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column settlement_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column entry_trace_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column transaction_code (type varchar[2]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column receiving_dfi (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column dfi_account_no (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column individual_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column individual_id_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_record_count (type varchar[4]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column external_id (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column partner_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column effective_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - 
Table transactions_in column returned (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column processing_history (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column uuid (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column return_data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column transaction_out_id (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column foreign_exchange_indicator (type varchar[2]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column destination_country_code (type varchar[2]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column originator_id (type varchar[10]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column originating_currency_code (type varchar[3]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column destination_currency_code (type varchar[3]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_99 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_98 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_02 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_05 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - 
Table transactions_in column addenda_10 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_11 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_12 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_13 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_14 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_15 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_16 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_17 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_18 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column future_dated (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column preprocessing_path (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column postprocessing_path (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column file_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column file_hash (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column started (type timestamp[29]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column ended (type timestamp[29]) -> JsonSchemaType({type=string, format=date-time, 
airbyte_type=timestamp_without_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column std_entries_processed (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column iat_entries_processed (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column iat_entry_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column std_entry_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_entry_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_batch_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_debit_amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_credit_amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.in_processing 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.partner_config 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.schema_migrations 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.bank_config 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.transactions_out 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.files_out 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.transactions_in 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.files_in 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.d.j.s.TwoStageSizeEstimator(getTargetBufferByteSize):72 - Max memory limit: 29578231808, JDBC buffer size: 1073741824 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO 
i.a.i.s.p.PostgresCdcCatalogHelper(getPublicizedTables):92 - For CDC, only tables in publication achilles_publication will be included in the sync: [pglogical.node, pglogical.replication_set_seq, public.partner_config, public.schema_migrations, public.files_in, pglogical.queue, pglogical.node_interface, public.transactions_out, pglogical.local_node, pglogical.subscription, pglogical.replication_set_table, pglogical.depend, public.bank_config, pglogical.local_sync_status, public.files_out, public.in_processing, pglogical.replication_set, pglogical.sequence_state, public.transactions_in] 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.s.p.PostgresCdcTargetPosition(targetPosition):45 - identified target lsn: PgLsn{lsn=365839818040} 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO i.a.i.d.AirbyteDebeziumHandler(getIncrementalIterators):99 - Using CDC: true 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 INFO o.a.k.c.c.AbstractConfig(logAll):376 - EmbeddedConfig values: 2022-07-11 15:46:40 source > access.control.allow.methods = 2022-07-11 15:46:40 source > access.control.allow.origin = 2022-07-11 15:46:40 source > admin.listeners = null 2022-07-11 15:46:40 source > bootstrap.servers = [localhost:9092] 2022-07-11 15:46:40 source > client.dns.lookup = use_all_dns_ips 2022-07-11 15:46:40 source > config.providers = [] 2022-07-11 15:46:40 source > connector.client.config.override.policy = All 2022-07-11 15:46:40 source > header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter 2022-07-11 15:46:40 source > key.converter = class org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:46:40 source > listeners = [http://:8083] 2022-07-11 15:46:40 source > metric.reporters = [] 2022-07-11 15:46:40 source > metrics.num.samples = 2 2022-07-11 15:46:40 source > metrics.recording.level = INFO 2022-07-11 15:46:40 source > metrics.sample.window.ms = 30000 2022-07-11 15:46:40 source > offset.flush.interval.ms = 1000 2022-07-11 15:46:40 source > offset.flush.timeout.ms = 5000 2022-07-11 15:46:40 source > offset.storage.file.filename = /tmp/cdc-state-offset4096452785080963400/offset.dat 2022-07-11 15:46:40 source > offset.storage.partitions = null 2022-07-11 15:46:40 source > offset.storage.replication.factor = null 2022-07-11 15:46:40 source > offset.storage.topic = 2022-07-11 15:46:40 source > plugin.path = null 2022-07-11 15:46:40 source > response.http.headers.config = 2022-07-11 15:46:40 source > rest.advertised.host.name = null 2022-07-11 15:46:40 source > rest.advertised.listener = null 2022-07-11 15:46:40 source > rest.advertised.port = null 2022-07-11 15:46:40 source > rest.extension.classes = [] 2022-07-11 15:46:40 source > ssl.cipher.suites = null 2022-07-11 15:46:40 source > ssl.client.auth = none 2022-07-11 15:46:40 source > ssl.enabled.protocols = [TLSv1.2, TLSv1.3] 2022-07-11 15:46:40 source > ssl.endpoint.identification.algorithm = https 2022-07-11 15:46:40 source > ssl.engine.factory.class = null 2022-07-11 15:46:40 source > ssl.key.password = null 2022-07-11 15:46:40 source > ssl.keymanager.algorithm = SunX509 2022-07-11 15:46:40 source > ssl.keystore.certificate.chain = null 2022-07-11 15:46:40 source > ssl.keystore.key = null 2022-07-11 15:46:40 source > ssl.keystore.location = null 2022-07-11 15:46:40 source > ssl.keystore.password = null 2022-07-11 15:46:40 source > ssl.keystore.type = JKS 2022-07-11 15:46:40 source > ssl.protocol = TLSv1.3 
2022-07-11 15:46:40 source > ssl.provider = null 2022-07-11 15:46:40 source > ssl.secure.random.implementation = null 2022-07-11 15:46:40 source > ssl.trustmanager.algorithm = PKIX 2022-07-11 15:46:40 source > ssl.truststore.certificates = null 2022-07-11 15:46:40 source > ssl.truststore.location = null 2022-07-11 15:46:40 source > ssl.truststore.password = null 2022-07-11 15:46:40 source > ssl.truststore.type = JKS 2022-07-11 15:46:40 source > task.shutdown.graceful.timeout.ms = 5000 2022-07-11 15:46:40 source > topic.creation.enable = true 2022-07-11 15:46:40 source > topic.tracking.allow.reset = true 2022-07-11 15:46:40 source > topic.tracking.enable = true 2022-07-11 15:46:40 source > value.converter = class org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:46:40 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $: unknown found, object expected 2022-07-11 15:46:40 ERROR i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: null 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 WARN o.a.k.c.r.WorkerConfig(logInternalConverterRemovalWarnings):316 - The worker has been configured with one or more internal converter properties ([internal.key.converter, internal.value.converter]). Support for these properties was deprecated in version 2.0 and removed in version 3.0, and specifying them will have no effect. Instead, an instance of the JsonConverter with schemas.enable set to false will be used. For more information, please visit http://kafka.apache.org/documentation/#upgrade and consult the upgrade notesfor the 3.0 release. 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 WARN o.a.k.c.r.WorkerConfig(logPluginPathConfigProviderWarning):334 - Variables cannot be used in the 'plugin.path' property, since the property is used by plugin scanning before the config providers that replace the variables are initialized. The raw value 'null' was used for plugin scanning, as opposed to the transformed value 'null', and this may cause unexpected results. 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 WARN i.d.c.p.PostgresConnectorConfig(validatePluginName):1394 - Logical decoder 'wal2json' is deprecated and will be removed in future versions 2022-07-11 15:46:40 source > 2022-07-11 15:46:40 WARN i.d.c.p.PostgresConnectorConfig(validateTruncateHandlingMode):1333 - Configuration property 'truncate.handling.mode' is deprecated and will be removed in future versions. Please use 'skipped.operations' instead. 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 WARN i.d.c.p.PostgresConnectorConfig(validateToastedValuePlaceholder):1384 - Configuration property 'toasted.value.placeholder' is deprecated and will be removed in future versions. Please use 'unavailable.value.placeholder' instead. 
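The warnings just above flag the 'wal2json' logical decoder as deprecated (along with the 'truncate.handling.mode' and 'toasted.value.placeholder' properties). Before moving this connection to a newer plugin, it can be worth confirming what the existing slot was actually created with. A minimal sketch, assuming psycopg2 and the connection details that appear elsewhere in this log (the password is a placeholder):

```python
# Sketch: confirm which logical decoding plugin the existing slot uses.
# Assumes psycopg2 is installed; the password below is a placeholder.
import psycopg2

conn = psycopg2.connect(host="10.58.160.3", port=5432, dbname="achilles",
                        user="airbyte_achilles", password="...")
with conn, conn.cursor() as cur:
    cur.execute(
        "SELECT slot_name, plugin, active, restart_lsn "
        "FROM pg_replication_slots WHERE slot_name = %s",
        ("airbyte_slot_achilles",),
    )
    for slot_name, plugin, active, restart_lsn in cur.fetchall():
        # A 'wal2json' result here matches the deprecation warning above;
        # a newly created slot would normally use 'pgoutput' instead.
        print(slot_name, plugin, active, restart_lsn)
conn.close()
```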
2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(start):124 - Starting PostgresConnectorTask with configuration: 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - connector.class = io.debezium.connector.postgresql.PostgresConnector 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - max.queue.size = 8192 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - slot.name = airbyte_slot_achilles 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - publication.name = achilles_publication 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.storage.file.filename = /tmp/cdc-state-offset4096452785080963400/offset.dat 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - decimal.handling.mode = string 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - converters = datetime 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - datetime.type = io.airbyte.integrations.debezium.internals.PostgresConverter 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - value.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - key.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - publication.autocreate.mode = disabled 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.user = airbyte_achilles 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.dbname = achilles 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.storage = org.apache.kafka.connect.storage.FileOffsetBackingStore 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.server.name = achilles 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.flush.timeout.ms = 5000 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - plugin.name = wal2json 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.port = 5432 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.flush.interval.ms = 1000 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - key.converter.schemas.enable = false 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - internal.key.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.hostname = 10.58.160.3 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.password = ******** 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - name = achilles 2022-07-11 15:46:41 source > 2022-07-11 
15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - value.converter.schemas.enable = false 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - internal.value.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - max.batch.size = 2048 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - table.include.list = public.bank_config,public.files_in,public.files_out,public.partner_config,public.transactions_in,public.transactions_out 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - snapshot.mode = initial 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.c.BaseSourceTask(getPreviousOffsets):318 - No previous offsets found 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.p.PostgresConnectorTask(start):108 - user 'airbyte_achilles' connected to database 'achilles' on PostgreSQL 12.10 on x86_64-pc-linux-gnu, compiled by Debian clang version 12.0.1, 64-bit with roles: 2022-07-11 15:46:41 source > role 'cloudsqlsuperuser' [superuser: false, replication: false, inherit: true, create role: true, create db: true, can log in: true] 2022-07-11 15:46:41 source > role 'pg_read_all_settings' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:46:41 source > role 'pg_stat_scan_tables' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:46:41 source > role 'pg_monitor' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:46:41 source > role 'airbyte_achilles' [superuser: false, replication: true, inherit: true, create role: true, create db: true, can log in: true] 2022-07-11 15:46:41 source > role 'pg_read_all_stats' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:46:41 source > role 'pg_signal_backend' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.p.c.PostgresConnection(readReplicationSlotInfo):251 - Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{53/E1155F98}, catalogXmin=19514215] 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.p.PostgresConnectorTask(start):117 - No previous offset found 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.p.s.InitialSnapshotter(shouldSnapshot):34 - Taking initial snapshot for new datasource 2022-07-11 15:46:41 destination > 2022-07-11 15:46:41 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_lgc_partner_config}} 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.u.Threads(threadFactory):270 - Requested thread factory for connector PostgresConnector, id = achilles named = change-event-source-coordinator 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.u.Threads$3(newThread):287 - Creating thread debezium-postgresconnector-achilles-change-event-source-coordinator 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.p.ChangeEventSourceCoordinator(lambda$start$0):103 - Metrics registered 
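The startup dump above is the effective Debezium configuration this connector runs with: the airbyte_slot_achilles slot, the achilles_publication publication, the wal2json plugin, an initial snapshot, and six included public tables. For reference, a sketch that collects the same core settings into a standalone .properties file; every value is copied from the lines above, and the file name is illustrative only:

```python
# Sketch: the connector settings echoed above, written out as a plain
# .properties file (file name and layout are illustrative only).
debezium_props = {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "plugin.name": "wal2json",
    "slot.name": "airbyte_slot_achilles",
    "publication.name": "achilles_publication",
    "publication.autocreate.mode": "disabled",
    "snapshot.mode": "initial",
    "decimal.handling.mode": "string",
    "database.hostname": "10.58.160.3",
    "database.port": "5432",
    "database.user": "airbyte_achilles",
    "database.dbname": "achilles",
    "database.server.name": "achilles",
    "table.include.list": (
        "public.bank_config,public.files_in,public.files_out,"
        "public.partner_config,public.transactions_in,public.transactions_out"
    ),
    "max.batch.size": "2048",
    "max.queue.size": "8192",
    "offset.flush.interval.ms": "1000",
    "offset.flush.timeout.ms": "5000",
}

with open("postgres-cdc.properties", "w") as fh:
    for key, value in sorted(debezium_props.items()):
        fh.write(f"{key}={value}\n")
```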
2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.p.ChangeEventSourceCoordinator(lambda$start$0):106 - Context created 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.p.s.InitialSnapshotter(shouldSnapshot):34 - Taking initial snapshot for new datasource 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.p.PostgresSnapshotChangeEventSource(getSnapshottingTask):64 - According to the connector configuration data will be snapshotted 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):87 - Snapshot step 1 - Preparing 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):96 - Snapshot step 2 - Determining captured tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.replication_set_table to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.bank_config to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.files_out to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.local_sync_status to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.partner_config to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.replication_set to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.node to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.schema_migrations to the list of capture schema tables 2022-07-11 15:46:41 destination > 2022-07-11 15:46:41 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_lgc_partner_config}} 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.local_node to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.depend to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.replication_set_seq to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.queue to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - 
Adding table pglogical.subscription to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.transactions_out to the list of capture schema tables 2022-07-11 15:46:41 destination > 2022-07-11 15:46:41 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream partner_config (dataset raw_achilles): airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.node_interface to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.in_processing to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.transactions_in to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.sequence_state to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.files_in to the list of capture schema tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):103 - Snapshot step 3 - Locking captured tables [public.bank_config, public.files_in, public.files_out, public.partner_config, public.transactions_in, public.transactions_out] 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):109 - Snapshot step 4 - Determining snapshot offset 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.p.PostgresOffsetContext(initialContext):231 - Creating initial offset context 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.p.PostgresOffsetContext(initialContext):234 - Read xlogStart at 'LSN{55/2DC09D38}' from transaction '20086007' 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.p.PostgresSnapshotChangeEventSource(updateOffsetForSnapshot):146 - Read xlogStart at 'LSN{55/2DC09D38}' from transaction '20086007' 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):112 - Snapshot step 5 - Reading structure of captured tables 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.c.p.PostgresSnapshotChangeEventSource(readTableStructure):192 - Reading structure of schema 'public' of catalog 'achilles' 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):116 - Snapshot step 6 - Persisting schema history 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):128 - Snapshot step 7 - Snapshotting data 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEvents):302 - Snapshotting contents of 6 tables while still in transaction 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.bank_config' (1 of 6 tables) 2022-07-11 15:46:41 source > 
2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.bank_config' using select statement: 'SELECT "bank_id", "name", "routing_no", "created", "updated", "config" FROM "public"."bank_config"' 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 3 records for table 'public.bank_config'; total duration '00:00:00.035' 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.files_in' (2 of 6 tables) 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.files_in' using select statement: 'SELECT "id", "preprocessing_path", "postprocessing_path", "file_name", "file_hash", "started", "ended", "std_entries_processed", "iat_entries_processed", "iat_entry_count", "std_entry_count", "total_entry_count", "total_batch_count", "total_debit_amount", "total_credit_amount", "updated" FROM "public"."files_in"' 2022-07-11 15:46:41 destination > 2022-07-11 15:46:41 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ does not exist in bucket; creating... 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 36 records for table 'public.files_in'; total duration '00:00:00.099' 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.files_out' (3 of 6 tables) 2022-07-11 15:46:41 source > 2022-07-11 15:46:41 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.files_out' using select statement: 'SELECT "id", "bank_id", "file_name", "batch_count", "file_hash", "created", "exchange_window", "updated" FROM "public"."files_out"' 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 34 records for table 'public.files_out'; total duration '00:00:00.036' 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.partner_config' (4 of 6 tables) 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.partner_config' using select statement: 'SELECT "bank_id", "partner_id", "name", "account_prefix", "created", "updated", "config", "routing_no" FROM "public"."partner_config"' 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 206 records for table 'public.partner_config'; total duration '00:00:00.117' 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.transactions_in' (5 of 6 tables) 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.transactions_in' using select statement: 'SELECT "id", "file_name", "file_hash", 
"company_id", "company_name", "company_entry_description", "batch_type", "batch_number", "originating_dfi", "sec_code", "settlement_date", "entry_trace_no", "transaction_code", "receiving_dfi", "dfi_account_no", "individual_name", "individual_id_no", "addenda_record_count", "external_id", "bank_id", "partner_id", "amount", "effective_date", "returned", "processing_history", "created", "updated", "uuid", "return_data", "transaction_out_id", "foreign_exchange_indicator", "destination_country_code", "originator_id", "originating_currency_code", "destination_currency_code", "addenda_99", "addenda_98", "addenda_02", "addenda_05", "addenda_10", "addenda_11", "addenda_12", "addenda_13", "addenda_14", "addenda_15", "addenda_16", "addenda_17", "addenda_18", "future_dated" FROM "public"."transactions_in"' 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 26 records for table 'public.transactions_in'; total duration '00:00:00.042' 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.transactions_out' (6 of 6 tables) 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.transactions_out' using select statement: 'SELECT "id", "file_id", "external_id", "bank_id", "partner_id", "trace_no", "account_no", "account_name", "amount", "source_account_no", "source_account_name", "description", "effective_date", "destination_bank_routing_no", "return_data", "reference_info", "transaction_code", "created", "updated", "transaction_in_id", "uuid", "data", "status", "is_same_day" FROM "public"."transactions_out"' 2022-07-11 15:46:42 destination > 2022-07-11 15:46:42 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ has been created in bucket. 
2022-07-11 15:46:42 destination > 2022-07-11 15:46:42 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_bwg_bank_config}} 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 113 records for table 'public.transactions_out'; total duration '00:00:00.089' 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.p.s.AbstractSnapshotChangeEventSource(execute):88 - Snapshot - Final stage 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.p.ChangeEventSourceCoordinator(doSnapshot):156 - Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.postgresql.Source:STRUCT}, sourceInfo=source_info[server='achilles'db='achilles', lsn=LSN{55/2DC09D38}, txId=20086007, timestamp=2022-07-11T15:46:42.250Z, snapshot=FALSE, schema=public, table=transactions_out], lastSnapshotRecord=true, lastCompletelyProcessedLsn=null, lastCommitLsn=null, streamingStoppingLsn=null, transactionContext=TransactionContext [currentTransactionId=null, perTableEventCount={}, totalEventCount=0], incrementalSnapshotContext=IncrementalSnapshotContext [windowOpened=false, chunkEndPosition=null, dataCollectionsToSnapshot=[], lastEventKeySent=null, maximumKey=null]]] 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.p.ChangeEventSourceCoordinator(streamingConnected):234 - Connected metrics set to 'true' 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.bank_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.partner_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.p.ChangeEventSourceCoordinator(streamEvents):173 - Starting streaming 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresStreamingChangeEventSource(execute):127 - Retrieved latest position from stored offset 'LSN{55/2DC09D38}' 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.c.WalPositionLocator():40 - Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{55/2DC09D38}' 2022-07-11 15:46:42 
destination > 2022-07-11 15:46:42 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_bwg_bank_config}} 2022-07-11 15:46:42 destination > 2022-07-11 15:46:42 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream bank_config (dataset raw_achilles): airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.c.PostgresConnection(readReplicationSlotInfo):251 - Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{53/E1155F98}, catalogXmin=19514215] 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:46:42 destination > 2022-07-11 15:46:42 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ does not exist in bucket; creating... 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.u.Threads(threadFactory):270 - Requested thread factory for connector PostgresConnector, id = achilles named = keep-alive 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.u.Threads$3(newThread):287 - Creating thread debezium-postgresconnector-achilles-keep-alive 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.bank_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.partner_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.c.p.PostgresStreamingChangeEventSource(searchWalPosition):314 - Searching for WAL resume position 2022-07-11 15:46:42 destination > 2022-07-11 15:46:42 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ has been created in bucket. 
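Snapshot step 7 above exports each of the six captured tables with a plain SELECT and logs a per-table record count (3, 36, 34, 206, 26 and 113 rows). A small sketch for tallying those counts from a saved copy of this job log, e.g. to cross-check them against the worker's "Total records read" line further down; the local file name is assumed:

```python
# Sketch: tally the per-table "Finished exporting N records" lines from a
# saved copy of this log (path is illustrative) and print the totals.
import re

pattern = re.compile(r"Finished exporting (\d+) records for table '([^']+)'")

counts = {}
with open("sync.log") as fh:          # assumed local copy of this job log
    for line in fh:
        for records, table in pattern.findall(line):
            counts[table] = counts.get(table, 0) + int(records)

for table, records in sorted(counts.items()):
    print(f"{table}: {records}")
print("snapshot total:", sum(counts.values()))   # 3+36+34+206+26+113 = 418 here
```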
2022-07-11 15:46:42 destination > 2022-07-11 15:46:42 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_nnj_files_in}} 2022-07-11 15:46:42 destination > 2022-07-11 15:46:42 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_nnj_files_in}} 2022-07-11 15:46:42 destination > 2022-07-11 15:46:42 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream files_in (dataset raw_achilles): airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:42 destination > 2022-07-11 15:46:42 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ does not exist in bucket; creating... 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.e.EmbeddedEngine(stop):1047 - Stopping the embedded engine 2022-07-11 15:46:42 source > 2022-07-11 15:46:42 INFO i.d.e.EmbeddedEngine(stop):1055 - Waiting for PT5M for connector to stop 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ has been created in bucket. 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qek_transactions_in}} 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.e.EmbeddedEngine(run):846 - Stopping the task and engine 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.c.c.BaseSourceTask(stop):238 - Stopping down connector 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qek_transactions_in}} 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream transactions_in (dataset raw_achilles): airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ does not exist in bucket; creating... 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ has been created in bucket. 
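The REPLICA IDENTITY notices above mean that UPDATE and DELETE change events for all six tables will carry previous values only for primary-key columns. If full before-images were ever needed downstream, the identity can be checked (and changed per table, at the cost of extra WAL) along these lines, reusing the placeholder connection from the earlier sketch:

```python
# Sketch: show REPLICA IDENTITY for the ordinary tables in schema public.
# 'd' = DEFAULT (PK columns only), 'f' = FULL, 'n' = NOTHING, 'i' = USING INDEX.
import psycopg2

conn = psycopg2.connect(host="10.58.160.3", port=5432, dbname="achilles",
                        user="airbyte_achilles", password="...")
with conn, conn.cursor() as cur:
    cur.execute(
        "SELECT c.relname, c.relreplident "
        "FROM pg_class c JOIN pg_namespace n ON n.oid = c.relnamespace "
        "WHERE n.nspname = 'public' AND c.relkind = 'r' ORDER BY c.relname"
    )
    for relname, ident in cur.fetchall():
        print(relname, ident)
    # Optional, per table, if full before-images are required (more WAL):
    # cur.execute("ALTER TABLE public.transactions_in REPLICA IDENTITY FULL")
conn.close()
```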
2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_ott_files_out}} 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.c.p.PostgresStreamingChangeEventSource(searchWalPosition):335 - WAL resume position 'null' discovered 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_ott_files_out}} 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream files_out (dataset raw_achilles): airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.u.Threads(threadFactory):270 - Requested thread factory for connector PostgresConnector, id = achilles named = keep-alive 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.u.Threads$3(newThread):287 - Creating thread debezium-postgresconnector-achilles-keep-alive 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.c.p.PostgresStreamingChangeEventSource(processMessages):202 - Processing messages 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.p.ChangeEventSourceCoordinator(streamEvents):175 - Finished streaming 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.p.ChangeEventSourceCoordinator(streamingConnected):234 - Connected metrics set to 'false' 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.a.i.d.i.DebeziumRecordPublisher(lambda$start$1):85 - Debezium engine shutdown. 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.a.i.s.p.PostgresCdcStateHandler(saveState):32 - debezium state: {"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839818040,\"txId\":20086007,\"ts_usec\":1657554402250000,\"snapshot\":true}"} 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):139 - Closing database connection pool. 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO c.z.h.HikariDataSource(close):350 - HikariPool-2 - Shutdown initiated... 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ does not exist in bucket; creating... 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO c.z.h.HikariDataSource(close):352 - HikariPool-2 - Shutdown completed. 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):141 - Closed database connection pool. 
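Before closing its connection pool, the source persists the Debezium offset as the doubly encoded JSON blob shown in the "debezium state:" line above; the decimal lsn inside it is the same position logged earlier as LSN{55/2DC09D38}. A small sketch that unpacks that blob and prints the LSN in PostgreSQL's high/low hex notation:

```python
# Sketch: decode the saved CDC state from the 'debezium state:' line above
# and print the LSN the way PostgreSQL writes it (high 32 bits / low 32 bits).
import json

debezium_state = {
    '{"schema":null,"payload":["achilles",{"server":"achilles"}]}':
        '{"last_snapshot_record":true,"lsn":365839818040,'
        '"txId":20086007,"ts_usec":1657554402250000,"snapshot":true}'
}

for key, value in debezium_state.items():
    partition = json.loads(key)["payload"]   # ['achilles', {'server': 'achilles'}]
    offset = json.loads(value)
    lsn = offset["lsn"]
    print(partition, offset["txId"], f"{lsn >> 32:X}/{lsn & 0xFFFFFFFF:X}")
    # -> ['achilles', {'server': 'achilles'}] 20086007 55/2DC09D38
```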
2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:46:43 source > 2022-07-11 15:46:43 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):87 - Completed source: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:46:43 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):327 - Source has no more messages, closing connection. 2022-07-11 15:46:43 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):335 - Total records read: 419 (288 KB) 2022-07-11 15:46:43 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publictransactions_out. Error messages: [$.file_id is of an incorrect type. Expected it to be number, $.transaction_in_id is of an incorrect type. Expected it to be string, $.return_data is of an incorrect type. Expected it to be string, $._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:46:43 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicfiles_in. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:46:43 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publictransactions_in. Error messages: [$.destination_country_code is of an incorrect type. Expected it to be string, $.addenda_11 is of an incorrect type. Expected it to be string, $.destination_currency_code is of an incorrect type. Expected it to be string, $.addenda_99 is of an incorrect type. Expected it to be string, $.foreign_exchange_indicator is of an incorrect type. Expected it to be string, $.addenda_12 is of an incorrect type. Expected it to be string, $.addenda_05 is of an incorrect type. Expected it to be string, $.addenda_15 is of an incorrect type. Expected it to be string, $.originator_id is of an incorrect type. Expected it to be string, $.addenda_10 is of an incorrect type. Expected it to be string, $.addenda_02 is of an incorrect type. Expected it to be string, $.addenda_18 is of an incorrect type. Expected it to be string, $.addenda_98 is of an incorrect type. Expected it to be string, $.individual_id_no is of an incorrect type. Expected it to be string, $.addenda_13 is of an incorrect type. Expected it to be string, $.addenda_record_count is of an incorrect type. Expected it to be string, $.transaction_out_id is of an incorrect type. Expected it to be string, $.addenda_16 is of an incorrect type. Expected it to be string, $.addenda_17 is of an incorrect type. Expected it to be string, $.return_data is of an incorrect type. Expected it to be string, $.originating_currency_code is of an incorrect type. Expected it to be string, $.future_dated is of an incorrect type. Expected it to be boolean, $._ab_cdc_deleted_at is of an incorrect type. Expected it to be string, $.addenda_14 is of an incorrect type. Expected it to be string] 2022-07-11 15:46:43 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicfiles_out. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:46:43 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicpartner_config. 
Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:46:43 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicbank_config. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:46:43 INFO i.a.w.g.DefaultReplicationWorker(run):174 - One of source or destination thread complete. Waiting on the other. 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ has been created in bucket. 2022-07-11 15:46:43 destination > 2022-07-11 15:46:43 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wly_transactions_out}} 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wly_transactions_out}} 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream transactions_out (dataset raw_achilles): airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ does not exist in bucket; creating... 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ has been created in bucket. 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$3):107 - Preparing tmp tables in destination completed. 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream bank_config (current state: 0 bytes in 0 buffers) 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream files_in (current state: 0 bytes in 1 buffers) 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream files_out (current state: 0 bytes in 2 buffers) 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 
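The "Schema validation errors found for stream ..." warnings a little further up come from the replication worker checking every record against the stream's JSON schema; a bare {"type": "string"} (or number/boolean) check fails whenever the value is null, which is typically the case for fields such as _ab_cdc_deleted_at on rows that were never deleted. A minimal reproduction, assuming the jsonschema package and a deliberately simplified schema:

```python
# Minimal sketch: why a null value fails a bare {"type": "string"} check.
# The schemas below are simplified stand-ins, not the actual catalog schemas.
from jsonschema import Draft7Validator

strict_schema  = {"properties": {"_ab_cdc_deleted_at": {"type": "string"}}}
relaxed_schema = {"properties": {"_ab_cdc_deleted_at": {"type": ["null", "string"]}}}

record = {"_ab_cdc_deleted_at": None}    # typical for rows that were not deleted

for name, schema in (("strict", strict_schema), ("relaxed", relaxed_schema)):
    messages = [e.message for e in Draft7Validator(schema).iter_errors(record)]
    print(name, messages)
# strict ["None is not of type 'string'"]
# relaxed []
```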
2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream partner_config (current state: 0 bytes in 3 buffers) 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream transactions_in (current state: 62 KB in 4 buffers) 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream transactions_out (current state: 125 KB in 5 buffers) 2022-07-11 15:46:44 destination > 2022-07-11 15:46:44 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded. 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.b.BufferedStreamConsumer(close):171 - executing on success close procedure. 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.r.SerializedBufferingStrategy(flushAll):103 - Flushing all 6 current buffers (188 KB in total) 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream partner_config (62 KB) 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream partner_config (62 KB) to staging 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to fb2e83da-69cc-4717-8230-53467e5f88d614449931737705220915.avro (111 KB) 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream partner_config (dataset raw_achilles): airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with full ID ABPnzm6b3l5XbiAiVMQSTJkvIewcAyQm2UxzAz_xuaOeqC1AiOiwyl0HUZtnAoZLeM8hKa0 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm6b3...ZLeM8hKa0]: Uploading leftover stream [Part number 1 containing 0.11 MB] 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm6b3...ZLeM8hKa0]: Finished uploading [Part number 1 containing 0.11 MB] 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO 
a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm6b3...ZLeM8hKa0]: Completed 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: fb2e83da-69cc-4717-8230-53467e5f88d614449931737705220915.avro -> airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro (filename: 1.avro) 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data fb2e83da-69cc-4717-8230-53467e5f88d614449931737705220915.avro 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream files_in (325 bytes) 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream files_in (325 bytes) to staging 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to edd96d1d-23f8-49a6-8ed0-3830815aaca31634975853622269956.avro (23 KB) 2022-07-11 15:46:45 destination > 2022-07-11 15:46:45 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream files_in (dataset raw_achilles): airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with full ID ABPnzm5vOXVHzbCuAD84vPi3LJEjNlSVlavdeNyZmi-6pIplawFmweCX2AedVQ5PmaAisno 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm5vO...5PmaAisno]: Uploading leftover stream [Part number 1 containing 0.02 MB] 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm5vO...5PmaAisno]: Finished uploading [Part number 1 containing 0.02 MB] 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm5vO...5PmaAisno]: Completed 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: edd96d1d-23f8-49a6-8ed0-3830815aaca31634975853622269956.avro -> airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro (filename: 1.avro) 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO 
i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data edd96d1d-23f8-49a6-8ed0-3830815aaca31634975853622269956.avro 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream bank_config (328 bytes) 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream bank_config (328 bytes) to staging 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to ba4d4096-dbce-4483-8f62-262b4b47e43f346927659161794810.avro (2 KB) 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream bank_config (dataset raw_achilles): airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with full ID ABPnzm4FyM2rtfUqkU1aibnaCXQGvXH_2gu00yJYGYi7RF7Gf7jkJr2UR6QeaDu6m0t0UwQ 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:46:46 destination > 2022-07-11 15:46:46 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm4Fy...u6m0t0UwQ]: Uploading leftover stream [Part number 1 containing 0.00 MB] 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm4Fy...u6m0t0UwQ]: Finished uploading [Part number 1 containing 0.00 MB] 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm4Fy...u6m0t0UwQ]: Completed 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: ba4d4096-dbce-4483-8f62-262b4b47e43f346927659161794810.avro -> airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro (filename: 1.avro) 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data ba4d4096-dbce-4483-8f62-262b4b47e43f346927659161794810.avro 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream transactions_in (62 KB) 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream transactions_in (62 KB) to 
staging 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to ace17335-2281-406e-ab56-3b4a832b2fc1966517866256468876.avro (62 KB) 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream transactions_in (dataset raw_achilles): airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with full ID ABPnzm4CGQzoVZOUOQDbsGS3H4b7g_G7Nl3HRlNlSIDF8MBWGoiqualYeBwFjaaqIiS8rLM 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:46:47 destination > 2022-07-11 15:46:47 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm4CG...aqIiS8rLM]: Uploading leftover stream [Part number 1 containing 0.06 MB] 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm4CG...aqIiS8rLM]: Finished uploading [Part number 1 containing 0.06 MB] 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm4CG...aqIiS8rLM]: Completed 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: ace17335-2281-406e-ab56-3b4a832b2fc1966517866256468876.avro -> airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro (filename: 1.avro) 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data ace17335-2281-406e-ab56-3b4a832b2fc1966517866256468876.avro 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream transactions_out (63 KB) 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream transactions_out (63 KB) to staging 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to 29a44119-34e1-4dbd-9930-7f047b4a900d15282054105029930350.avro (93 KB) 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream transactions_out (dataset raw_achilles): airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO 
a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with full ID ABPnzm4OsLc-v_ICLzRD62Fhxi28O7Yo9WBsx-DwovVvwmaV0urV61w9b_BekspBiWglyUs 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm4Os...pBiWglyUs]: Uploading leftover stream [Part number 1 containing 0.09 MB] 2022-07-11 15:46:48 destination > 2022-07-11 15:46:48 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm4Os...pBiWglyUs]: Finished uploading [Part number 1 containing 0.09 MB] 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm4Os...pBiWglyUs]: Completed 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: 29a44119-34e1-4dbd-9930-7f047b4a900d15282054105029930350.avro -> airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro (filename: 1.avro) 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data 29a44119-34e1-4dbd-9930-7f047b4a900d15282054105029930350.avro 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream files_out (326 bytes) 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream files_out (326 bytes) to staging 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to f2d09652-9ea9-4517-9d20-58f74de1ba474421847134820957223.avro (14 KB) 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream files_out (dataset raw_achilles): airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with full ID ABPnzm45RKUO27oRFeRkG0arvUlnH3H8xyUuzaS52occ93afjJjbHnhBYb-iJHsF45XHhy0 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to 
synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm45R...sF45XHhy0]: Uploading leftover stream [Part number 1 containing 0.01 MB] 2022-07-11 15:46:49 destination > 2022-07-11 15:46:49 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm45R...sF45XHhy0]: Finished uploading [Part number 1 containing 0.01 MB] 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro with id ABPnzm45R...sF45XHhy0]: Completed 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: f2d09652-9ea9-4517-9d20-58f74de1ba474421847134820957223.avro -> airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro (filename: 1.avro) 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data f2d09652-9ea9-4517-9d20-58f74de1ba474421847134820957223.avro 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream partner_config 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream files_in 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream bank_config 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream transactions_in 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream transactions_out 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream files_out 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):137 - Copying into tables in destination started for 6 streams 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_lgc_partner_config}} (dataset raw_achilles): [1.avro] 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=74be95ee-b2f0-43e8-be62-03562f75e7c8, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_lgc_partner_config}} (dataset raw_achilles): 
Job{job=JobId{project=mainapi-282501, job=74be95ee-b2f0-43e8-be62-03562f75e7c8, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554410497, endTime=null, startTime=1657554410623, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=zUTfVd4oV3hpLFI8ke/Glg==, generatedId=mainapi-282501:US.74be95ee-b2f0-43e8-be62-03562f75e7c8, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/74be95ee-b2f0-43e8-be62-03562f75e7c8?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_lgc_partner_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:46:50 destination > 2022-07-11 15:46:50 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=74be95ee-b2f0-43e8-be62-03562f75e7c8, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554410497, endTime=null, startTime=1657554410623, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=zUTfVd4oV3hpLFI8ke/Glg==, generatedId=mainapi-282501:US.74be95ee-b2f0-43e8-be62-03562f75e7c8, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/74be95ee-b2f0-43e8-be62-03562f75e7c8?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_lgc_partner_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, 
useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:46:57 destination > 2022-07-11 15:46:57 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=74be95ee-b2f0-43e8-be62-03562f75e7c8, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554410497, endTime=null, startTime=1657554410623, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=zUTfVd4oV3hpLFI8ke/Glg==, generatedId=mainapi-282501:US.74be95ee-b2f0-43e8-be62-03562f75e7c8, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/74be95ee-b2f0-43e8-be62-03562f75e7c8?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_lgc_partner_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:46:57 destination > 2022-07-11 15:46:57 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=74be95ee-b2f0-43e8-be62-03562f75e7c8, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_lgc_partner_config}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:46:57 destination > 2022-07-11 15:46:57 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_lgc_partner_config}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_partner_config}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:47:00 destination > 2022-07-11 15:47:00 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_lgc_partner_config}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_partner_config}} 2022-07-11 15:47:00 destination > 2022-07-11 15:47:00 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging 
files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_bwg_bank_config}} (dataset raw_achilles): [1.avro] 2022-07-11 15:47:00 destination > 2022-07-11 15:47:00 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:00 destination > 2022-07-11 15:47:00 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=b7754641-0319-423e-a557-b8aa8f89fba6, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_bwg_bank_config}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=b7754641-0319-423e-a557-b8aa8f89fba6, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554420259, endTime=null, startTime=1657554420376, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=X7/LwarnGsXvLdPUvYndjw==, generatedId=mainapi-282501:US.b7754641-0319-423e-a557-b8aa8f89fba6, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/b7754641-0319-423e-a557-b8aa8f89fba6?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_bwg_bank_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:47:00 destination > 2022-07-11 15:47:00 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=b7754641-0319-423e-a557-b8aa8f89fba6, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554420259, endTime=null, startTime=1657554420376, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=X7/LwarnGsXvLdPUvYndjw==, generatedId=mainapi-282501:US.b7754641-0319-423e-a557-b8aa8f89fba6, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/b7754641-0319-423e-a557-b8aa8f89fba6?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, 
projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_bwg_bank_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:47:02 destination > 2022-07-11 15:47:02 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=b7754641-0319-423e-a557-b8aa8f89fba6, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554420259, endTime=null, startTime=1657554420376, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=X7/LwarnGsXvLdPUvYndjw==, generatedId=mainapi-282501:US.b7754641-0319-423e-a557-b8aa8f89fba6, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/b7754641-0319-423e-a557-b8aa8f89fba6?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_bwg_bank_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:47:02 destination > 2022-07-11 15:47:02 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=b7754641-0319-423e-a557-b8aa8f89fba6, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_bwg_bank_config}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:47:02 destination > 2022-07-11 15:47:02 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table 
GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_bwg_bank_config}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_bank_config}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:47:05 destination > 2022-07-11 15:47:05 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_bwg_bank_config}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_bank_config}} 2022-07-11 15:47:05 destination > 2022-07-11 15:47:05 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_nnj_files_in}} (dataset raw_achilles): [1.avro] 2022-07-11 15:47:05 destination > 2022-07-11 15:47:05 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:06 destination > 2022-07-11 15:47:06 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=60f98067-476e-41af-95ff-dae86329b56a, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_nnj_files_in}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=60f98067-476e-41af-95ff-dae86329b56a, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554425845, endTime=null, startTime=1657554425978, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=BZr/u7Ct2O970Yzk8k82EA==, generatedId=mainapi-282501:US.60f98067-476e-41af-95ff-dae86329b56a, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/60f98067-476e-41af-95ff-dae86329b56a?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_nnj_files_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:47:06 destination > 2022-07-11 15:47:06 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting 
for job finish Job{job=JobId{project=mainapi-282501, job=60f98067-476e-41af-95ff-dae86329b56a, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554425845, endTime=null, startTime=1657554425978, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=BZr/u7Ct2O970Yzk8k82EA==, generatedId=mainapi-282501:US.60f98067-476e-41af-95ff-dae86329b56a, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/60f98067-476e-41af-95ff-dae86329b56a?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_nnj_files_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:47:09 destination > 2022-07-11 15:47:09 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=60f98067-476e-41af-95ff-dae86329b56a, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554425845, endTime=null, startTime=1657554425978, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=BZr/u7Ct2O970Yzk8k82EA==, generatedId=mainapi-282501:US.60f98067-476e-41af-95ff-dae86329b56a, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/60f98067-476e-41af-95ff-dae86329b56a?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_nnj_files_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, 
autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:47:09 destination > 2022-07-11 15:47:09 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=60f98067-476e-41af-95ff-dae86329b56a, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_nnj_files_in}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:47:09 destination > 2022-07-11 15:47:09 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_nnj_files_in}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_in}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:47:11 destination > 2022-07-11 15:47:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_nnj_files_in}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_in}} 2022-07-11 15:47:11 destination > 2022-07-11 15:47:11 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qek_transactions_in}} (dataset raw_achilles): [1.avro] 2022-07-11 15:47:11 destination > 2022-07-11 15:47:11 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:11 destination > 2022-07-11 15:47:11 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=11c2d48f-3b4a-4ade-8fcc-23ff996173f7, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qek_transactions_in}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=11c2d48f-3b4a-4ade-8fcc-23ff996173f7, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554431609, endTime=null, startTime=1657554431717, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=gAgPfzjR1TO9OfamGeiwXw==, generatedId=mainapi-282501:US.11c2d48f-3b4a-4ade-8fcc-23ff996173f7, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/11c2d48f-3b4a-4ade-8fcc-23ff996173f7?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_qek_transactions_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, 
formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:47:11 destination > 2022-07-11 15:47:11 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=11c2d48f-3b4a-4ade-8fcc-23ff996173f7, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554431609, endTime=null, startTime=1657554431717, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=gAgPfzjR1TO9OfamGeiwXw==, generatedId=mainapi-282501:US.11c2d48f-3b4a-4ade-8fcc-23ff996173f7, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/11c2d48f-3b4a-4ade-8fcc-23ff996173f7?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_qek_transactions_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. 
Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:47:16 destination > 2022-07-11 15:47:16 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=11c2d48f-3b4a-4ade-8fcc-23ff996173f7, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554431609, endTime=null, startTime=1657554431717, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=gAgPfzjR1TO9OfamGeiwXw==, generatedId=mainapi-282501:US.11c2d48f-3b4a-4ade-8fcc-23ff996173f7, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/11c2d48f-3b4a-4ade-8fcc-23ff996173f7?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_qek_transactions_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:47:16 destination > 2022-07-11 15:47:16 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=11c2d48f-3b4a-4ade-8fcc-23ff996173f7, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qek_transactions_in}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:47:16 destination > 2022-07-11 15:47:16 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qek_transactions_in}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_in}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:47:20 destination > 2022-07-11 15:47:20 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qek_transactions_in}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_in}} 2022-07-11 15:47:20 destination > 2022-07-11 15:47:20 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, 
tableId=_airbyte_tmp_ott_files_out}} (dataset raw_achilles): [1.avro] 2022-07-11 15:47:20 destination > 2022-07-11 15:47:20 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:20 destination > 2022-07-11 15:47:20 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=3d2d2ca4-848d-44e8-aa30-c089be40bbb6, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_ott_files_out}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=3d2d2ca4-848d-44e8-aa30-c089be40bbb6, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554440194, endTime=null, startTime=1657554440319, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=wP7Ir9XOChKwY7L5bDSkuA==, generatedId=mainapi-282501:US.3d2d2ca4-848d-44e8-aa30-c089be40bbb6, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/3d2d2ca4-848d-44e8-aa30-c089be40bbb6?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_ott_files_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:47:20 destination > 2022-07-11 15:47:20 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=3d2d2ca4-848d-44e8-aa30-c089be40bbb6, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554440194, endTime=null, startTime=1657554440319, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=wP7Ir9XOChKwY7L5bDSkuA==, generatedId=mainapi-282501:US.3d2d2ca4-848d-44e8-aa30-c089be40bbb6, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/3d2d2ca4-848d-44e8-aa30-c089be40bbb6?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_ott_files_out}}, 
decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:47:22 destination > 2022-07-11 15:47:22 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=3d2d2ca4-848d-44e8-aa30-c089be40bbb6, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554440194, endTime=null, startTime=1657554440319, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=wP7Ir9XOChKwY7L5bDSkuA==, generatedId=mainapi-282501:US.3d2d2ca4-848d-44e8-aa30-c089be40bbb6, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/3d2d2ca4-848d-44e8-aa30-c089be40bbb6?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_ott_files_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:47:22 destination > 2022-07-11 15:47:22 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=3d2d2ca4-848d-44e8-aa30-c089be40bbb6, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_ott_files_out}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:47:22 destination > 2022-07-11 15:47:22 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_ott_files_out}} to target table 
GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_out}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:47:24 destination > 2022-07-11 15:47:24 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_ott_files_out}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_out}} 2022-07-11 15:47:24 destination > 2022-07-11 15:47:24 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wly_transactions_out}} (dataset raw_achilles): [1.avro] 2022-07-11 15:47:24 destination > 2022-07-11 15:47:24 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:25 destination > 2022-07-11 15:47:25 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=090d04bf-98a0-4ab8-b5d0-0a805a14f120, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wly_transactions_out}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=090d04bf-98a0-4ab8-b5d0-0a805a14f120, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554444890, endTime=null, startTime=1657554445011, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=l3j1RXIH+WseluzOln/goQ==, generatedId=mainapi-282501:US.090d04bf-98a0-4ab8-b5d0-0a805a14f120, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/090d04bf-98a0-4ab8-b5d0-0a805a14f120?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_wly_transactions_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:47:25 destination > 2022-07-11 15:47:25 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=090d04bf-98a0-4ab8-b5d0-0a805a14f120, 
location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554444890, endTime=null, startTime=1657554445011, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=l3j1RXIH+WseluzOln/goQ==, generatedId=mainapi-282501:US.090d04bf-98a0-4ab8-b5d0-0a805a14f120, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/090d04bf-98a0-4ab8-b5d0-0a805a14f120?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_wly_transactions_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:47:27 destination > 2022-07-11 15:47:27 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=090d04bf-98a0-4ab8-b5d0-0a805a14f120, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554444890, endTime=null, startTime=1657554445011, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=l3j1RXIH+WseluzOln/goQ==, generatedId=mainapi-282501:US.090d04bf-98a0-4ab8-b5d0-0a805a14f120, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/090d04bf-98a0-4ab8-b5d0-0a805a14f120?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_wly_transactions_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, 
useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:47:27 destination > 2022-07-11 15:47:27 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=090d04bf-98a0-4ab8-b5d0-0a805a14f120, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wly_transactions_out}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:47:27 destination > 2022-07-11 15:47:27 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wly_transactions_out}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_out}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wly_transactions_out}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_out}} 2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):156 - Finalizing tables in destination completed 2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):159 - Cleaning up destination started for 6 streams 2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_lgc_partner_config}} (dataset raw_achilles) 2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream partner_config (dataset raw_achilles): airbyte/raw_achilles_partner_config 2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_partner_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_partner_config has been cleaned-up (2 objects were deleted)... 
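For orientation, the load job logged above (staged Avro file in GCS appended into an _airbyte_tmp_* table) can be approximated with the google-cloud-bigquery client. This is a minimal illustrative sketch, not the connector's actual code: the project, bucket, URI and table names are copied from the log entries above, and the options mirror the logged LoadJobConfiguration (AVRO source format, WRITE_APPEND, useAvroLogicalTypes=true).

    # Sketch of the kind of load job shown in the log above; illustrative only.
    from google.cloud import bigquery

    client = bigquery.Client(project="mainapi-282501")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.AVRO,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        use_avro_logical_types=True,  # matches useAvroLogicalTypes=true in the log
    )

    load_job = client.load_table_from_uri(
        "gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro",
        "mainapi-282501.raw_achilles._airbyte_tmp_wly_transactions_out",
        job_config=job_config,
    )
    load_job.result()  # block until the job finishes, like the waitForJobFinish entries above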
2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_bwg_bank_config}} (dataset raw_achilles) 2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream bank_config (dataset raw_achilles): airbyte/raw_achilles_bank_config 2022-07-11 15:47:33 destination > 2022-07-11 15:47:33 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:47:34 destination > 2022-07-11 15:47:34 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_bank_config/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:34 destination > 2022-07-11 15:47:34 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_bank_config has been cleaned-up (2 objects were deleted)... 2022-07-11 15:47:34 destination > 2022-07-11 15:47:34 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_nnj_files_in}} (dataset raw_achilles) 2022-07-11 15:47:34 destination > 2022-07-11 15:47:34 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream files_in (dataset raw_achilles): airbyte/raw_achilles_files_in 2022-07-11 15:47:34 destination > 2022-07-11 15:47:34 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:47:34 destination > 2022-07-11 15:47:34 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:34 destination > 2022-07-11 15:47:34 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_files_in has been cleaned-up (2 objects were deleted)... 2022-07-11 15:47:34 destination > 2022-07-11 15:47:34 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qek_transactions_in}} (dataset raw_achilles) 2022-07-11 15:47:34 destination > 2022-07-11 15:47:34 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream transactions_in (dataset raw_achilles): airbyte/raw_achilles_transactions_in 2022-07-11 15:47:35 destination > 2022-07-11 15:47:35 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:47:35 destination > 2022-07-11 15:47:35 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_in/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:35 destination > 2022-07-11 15:47:35 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_transactions_in has been cleaned-up (2 objects were deleted)... 
2022-07-11 15:47:35 destination > 2022-07-11 15:47:35 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_ott_files_out}} (dataset raw_achilles) 2022-07-11 15:47:35 destination > 2022-07-11 15:47:35 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream files_out (dataset raw_achilles): airbyte/raw_achilles_files_out 2022-07-11 15:47:35 destination > 2022-07-11 15:47:35 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:47:35 destination > 2022-07-11 15:47:35 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:35 destination > 2022-07-11 15:47:35 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_files_out has been cleaned-up (2 objects were deleted)... 2022-07-11 15:47:35 destination > 2022-07-11 15:47:35 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wly_transactions_out}} (dataset raw_achilles) 2022-07-11 15:47:36 destination > 2022-07-11 15:47:36 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream transactions_out (dataset raw_achilles): airbyte/raw_achilles_transactions_out 2022-07-11 15:47:36 destination > 2022-07-11 15:47:36 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/ 2022-07-11 15:47:36 destination > 2022-07-11 15:47:36 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/1.avro 2022-07-11 15:47:36 destination > 2022-07-11 15:47:36 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_transactions_out has been cleaned-up (2 objects were deleted)... 2022-07-11 15:47:36 destination > 2022-07-11 15:47:36 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):164 - Cleaning up destination completed. 
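The cleanup entries above delete every object under the per-sync staging prefix of each stream (airbyte/raw_achilles_<stream>/YYYY/MM/DD/HH/<sync-uuid>/). A rough equivalent with the google-cloud-storage client, assuming the bucket and prefix naming seen in the log; this is an illustrative sketch, not the GcsStorageOperations implementation:

    # Delete all staged objects for one stream of one sync attempt; illustrative only.
    from google.cloud import storage

    client = storage.Client(project="mainapi-282501")
    bucket = client.bucket("synctera-data-staging")

    prefix = "airbyte/raw_achilles_transactions_out/2022/07/11/15/570c2fab-944c-4499-b774-309f87bdc1fd/"
    for blob in bucket.list_blobs(prefix=prefix):
        print(f"Deleting object {blob.name}")
        blob.delete()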
2022-07-11 15:47:36 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):415 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@7a3da762[type=STATE,log=,spec=,connectionStatus=,catalog=,record=,state=io.airbyte.protocol.models.AirbyteStateMessage@74fa4655[type=GLOBAL,stream=,global=io.airbyte.protocol.models.AirbyteGlobalState@70c5cc12[sharedState={"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839818040,\"txId\":20086007,\"ts_usec\":1657554402250000,\"snapshot\":true}"}},streamStates=[io.airbyte.protocol.models.AirbyteStreamState@35c74c6e[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@22d6fb03[name=bank_config,namespace=public,additionalProperties={}],streamState={"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@4b8d9e2[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@5a43303b[name=files_in,namespace=public,additionalProperties={}],streamState={"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@55e8b321[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@4073447c[name=files_out,namespace=public,additionalProperties={}],streamState={"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@246f259[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@4147d8c2[name=partner_config,namespace=public,additionalProperties={}],streamState={"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@51a60a70[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@b2cf49c[name=transactions_in,namespace=public,additionalProperties={}],streamState={"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@3ecd0b9a[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@41b35549[name=transactions_out,namespace=public,additionalProperties={}],streamState={"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}]],additionalProperties={}],data={"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839818040,\"txId\":20086007,\"ts_usec\":1657554402250000,\"snapshot\":true}"}},"streams":[{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}]},additionalProperties={}],trace=,additionalProperties={}] 2022-07-11 15:47:36 destination > 2022-07-11 15:47:36 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2022-07-11 15:47:36 INFO 
i.a.w.g.DefaultReplicationWorker(run):176 - Source and destination threads complete. 2022-07-11 15:47:36 INFO i.a.w.g.DefaultReplicationWorker(run):239 - sync summary: io.airbyte.config.ReplicationAttemptSummary@7f40d97f[status=completed,recordsSynced=418,bytesSynced=295080,startTime=1657554394447,endTime=1657554456585,totalStats=io.airbyte.config.SyncStats@7d116c7d[recordsEmitted=418,bytesEmitted=295080,stateMessagesEmitted=1,recordsCommitted=418],streamStats=[io.airbyte.config.StreamSyncStats@8f270e3[streamName=bank_config,stats=io.airbyte.config.SyncStats@583cb1d5[recordsEmitted=3,bytesEmitted=1792,stateMessagesEmitted=,recordsCommitted=3]], io.airbyte.config.StreamSyncStats@21ce108c[streamName=transactions_out,stats=io.airbyte.config.SyncStats@57cae24c[recordsEmitted=113,bytesEmitted=90414,stateMessagesEmitted=,recordsCommitted=113]], io.airbyte.config.StreamSyncStats@4c24f53e[streamName=partner_config,stats=io.airbyte.config.SyncStats@712b4776[recordsEmitted=206,bytesEmitted=104843,stateMessagesEmitted=,recordsCommitted=206]], io.airbyte.config.StreamSyncStats@203073af[streamName=transactions_in,stats=io.airbyte.config.SyncStats@20e82d67[recordsEmitted=26,bytesEmitted=62878,stateMessagesEmitted=,recordsCommitted=26]], io.airbyte.config.StreamSyncStats@4b8ea141[streamName=files_in,stats=io.airbyte.config.SyncStats@60b55fb1[recordsEmitted=36,bytesEmitted=22085,stateMessagesEmitted=,recordsCommitted=36]], io.airbyte.config.StreamSyncStats@43bbe503[streamName=files_out,stats=io.airbyte.config.SyncStats@22597b06[recordsEmitted=34,bytesEmitted=13068,stateMessagesEmitted=,recordsCommitted=34]]]] 2022-07-11 15:47:36 INFO i.a.w.g.DefaultReplicationWorker(run):266 - Source output at least one state message 2022-07-11 15:47:36 INFO i.a.w.g.DefaultReplicationWorker(run):272 - State capture: Updated state to: 
Optional[io.airbyte.config.State@1c6352a5[state=[{"type":"GLOBAL","global":{"shared_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839818040,\"txId\":20086007,\"ts_usec\":1657554402250000,\"snapshot\":true}"}},"stream_states":[{"stream_descriptor":{"name":"bank_config","namespace":"public"},"stream_state":{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_in","namespace":"public"},"stream_state":{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_out","namespace":"public"},"stream_state":{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"partner_config","namespace":"public"},"stream_state":{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_in","namespace":"public"},"stream_state":{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_out","namespace":"public"},"stream_state":{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}}]},"data":{"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839818040,\"txId\":20086007,\"ts_usec\":1657554402250000,\"snapshot\":true}"}},"streams":[{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}]}}]]] 2022-07-11 15:47:36 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling... 
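As a quick sanity check, the per-stream counters in the sync summary above add up to the reported totals (recordsSynced=418, bytesSynced=295080); a few lines of Python confirm the arithmetic:

    # Per-stream stats copied from the sync summary above.
    stream_stats = {
        "bank_config":      (3,   1792),
        "transactions_out": (113, 90414),
        "partner_config":   (206, 104843),
        "transactions_in":  (26,  62878),
        "files_in":         (36,  22085),
        "files_out":        (34,  13068),
    }
    records = sum(r for r, _ in stream_stats.values())
    size = sum(b for _, b in stream_stats.values())
    assert (records, size) == (418, 295080)
    print(records, size)  # 418 295080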
2022-07-11 15:47:36 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):161 - sync summary: io.airbyte.config.StandardSyncOutput@3ec32f58[standardSyncSummary=io.airbyte.config.StandardSyncSummary@7bc1b6e4[status=completed,recordsSynced=418,bytesSynced=295080,startTime=1657554394447,endTime=1657554456585,totalStats=io.airbyte.config.SyncStats@7d116c7d[recordsEmitted=418,bytesEmitted=295080,stateMessagesEmitted=1,recordsCommitted=418],streamStats=[io.airbyte.config.StreamSyncStats@8f270e3[streamName=bank_config,stats=io.airbyte.config.SyncStats@583cb1d5[recordsEmitted=3,bytesEmitted=1792,stateMessagesEmitted=,recordsCommitted=3]], io.airbyte.config.StreamSyncStats@21ce108c[streamName=transactions_out,stats=io.airbyte.config.SyncStats@57cae24c[recordsEmitted=113,bytesEmitted=90414,stateMessagesEmitted=,recordsCommitted=113]], io.airbyte.config.StreamSyncStats@4c24f53e[streamName=partner_config,stats=io.airbyte.config.SyncStats@712b4776[recordsEmitted=206,bytesEmitted=104843,stateMessagesEmitted=,recordsCommitted=206]], io.airbyte.config.StreamSyncStats@203073af[streamName=transactions_in,stats=io.airbyte.config.SyncStats@20e82d67[recordsEmitted=26,bytesEmitted=62878,stateMessagesEmitted=,recordsCommitted=26]], io.airbyte.config.StreamSyncStats@4b8ea141[streamName=files_in,stats=io.airbyte.config.SyncStats@60b55fb1[recordsEmitted=36,bytesEmitted=22085,stateMessagesEmitted=,recordsCommitted=36]], io.airbyte.config.StreamSyncStats@43bbe503[streamName=files_out,stats=io.airbyte.config.SyncStats@22597b06[recordsEmitted=34,bytesEmitted=13068,stateMessagesEmitted=,recordsCommitted=34]]]],normalizationSummary=,state=io.airbyte.config.State@1c6352a5[state=[{"type":"GLOBAL","global":{"shared_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839818040,\"txId\":20086007,\"ts_usec\":1657554402250000,\"snapshot\":true}"}},"stream_states":[{"stream_descriptor":{"name":"bank_config","namespace":"public"},"stream_state":{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_in","namespace":"public"},"stream_state":{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_out","namespace":"public"},"stream_state":{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"partner_config","namespace":"public"},"stream_state":{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_in","namespace":"public"},"stream_state":{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_out","namespace":"public"},"stream_state":{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}}]},"data":{"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839818040,\"txId\":20086007,\"ts_usec\":1657554402250000,\"snapshot\":true}"}},"streams":[{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name"
:"transactions_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}]}}]],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@1535b7b0[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@4a70eb8d[stream=io.airbyte.protocol.models.AirbyteStream@6d6eb474[name=bank_config,jsonSchema={"type":"object","properties":{"name":{"type":"string"},"config":{"type":"string"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"routing_no":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[bank_id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[bank_id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@1bee49ab[stream=io.airbyte.protocol.models.AirbyteStream@23def551[name=files_in,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"ended":{"type":"string","format":"date-time","airbyte_type":"timestamp_without_timezone"},"started":{"type":"string","format":"date-time","airbyte_type":"timestamp_without_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"file_hash":{"type":"string"},"file_name":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"iat_entry_count":{"type":"number"},"std_entry_count":{"type":"number"},"total_batch_count":{"type":"number"},"total_entry_count":{"type":"number"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"preprocessing_path":{"type":"string"},"total_debit_amount":{"type":"number"},"postprocessing_path":{"type":"string"},"total_credit_amount":{"type":"number"},"iat_entries_processed":{"type":"number"},"std_entries_processed":{"type":"number"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@6e3f4e24[stream=io.airbyte.protocol.models.AirbyteStream@766a9a83[name=files_out,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"file_hash":{"type":"string"},"file_name":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"batch_count":{"type":"number"},"exchange_window":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@1fa036f4[stream=io.airbyte.protocol.models.AirbyteStream@4b802c2c[name=partner_config,jsonSchema={"type":"object","properties":{"name":{"type":"string"},"config":{"type":"string"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"partner_id":{"type":"number"},"routing_no":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"account_prefix":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[partner_id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[partner_id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@5f5a1516[stream=io.airbyte.protocol.models.AirbyteStream@7d4622fc[name=transactions_in,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"uuid":{"type":"string"},"amount":{"type":"number"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"returned":{"type":"boolean"},"sec_code":{"type":"string"},"file_hash":{"type":"string"},"file_name":{"type":"string"},"addenda_02":{"type":"string"},"addenda_05":{"type":"string"},"addenda_10":{"type":"string"},"addenda_11":{"type":"string"},"addenda_12":{"type":"string"},"addenda_13":{"type":"string"},"addenda_14":{"type":"string"},"addenda_15":{"type":"string"},"addenda_16":{"type":"string"},"addenda_17":{"type":"string"},"addenda_18":{"type":"string"},"addenda_98":{"type":"string"},"addenda_99":{"type":"string"},"batch_type":{"type":"string"},"company_id":{"type":"string"},"partner_id":{"type":"number"},"_ab_cdc_lsn":{"type":"number"},"external_id":{"type":"string"},"return_data":{"type":"string"},"batch_number":{"type":"number"},"company_name":{"type":"string"},"future_dated":{"type":"boolean"},"originator_id":{"type":"string"},"receiving_dfi":{"type":"string"},"dfi_account_no":{"type":"string"},"effective_date":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"entry_trace_no":{"type":"string"},"individual_name":{"type":"string"},"originating_dfi":{"type":"string"},"settlement_date":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"individual_id_no":{"type":"string"},"transaction_code":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"processing_history":{"type":"string"},"transaction_out_id":{"type":"string"},"addenda_record_count":{"type":"string"},"destination_country_code":{"type":"string"},"company_entry_description":{"type":"string"},"destination_currency_code":{"type":"string"},"originating_currency_code":{"type":"string"},"foreign_exchange_indicator":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@63cfbc36[stream=io.airbyte.protocol.models.AirbyteStream@29e41d2f[name=transactions_out,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"data":{"type":"string"},"uuid":{"type":"string"},"amount":{"type":"number"},"status":{"type":"string"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"file_id":{"type":"number"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"trace_no":{"type":"string"},"account_no":{"type":"string"},"partner_id":{"type":"number"},"_ab_cdc_lsn":{"type":"number"},"description":{"type":"string"},"external_id":{"type":"string"},"is_same_day":{"type":"boolean"},"return_data":{"type":"string"},"account_name":{"type":"string"},"effective_date":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"reference_info":{"type":"string"},"transaction_code":{"type":"number"},"source_account_no":{"type":"string"},"transaction_in_id":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"source_account_name":{"type":"string"},"destination_bank_routing_no":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[]] 2022-07-11 15:47:36 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating... 2022-07-11 15:47:36 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false' 2022-07-11 15:47:36 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/0/logs.log 2022-07-11 15:47:36 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha 2022-07-11 15:47:36 INFO i.a.w.g.DefaultNormalizationWorker(run):49 - Running normalization. 2022-07-11 15:47:36 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization:0.2.6 2022-07-11 15:47:36 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization:0.2.6 exists... 2022-07-11 15:47:36 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization:0.2.6 was found locally. 
2022-07-11 15:47:36 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:47:36 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/0/normalize --log-driver none --name normalization-normalize-89696-0-miowa --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.32-alpha airbyte/normalization:0.2.6 run --integration-type bigquery --config destination_config.json --catalog destination_catalog.json 2022-07-11 15:47:37 normalization > Running: transform-config --config destination_config.json --integration-type bigquery --out /data/89696/0/normalize 2022-07-11 15:47:37 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/89696/0/normalize') 2022-07-11 15:47:37 normalization > transform_bigquery 2022-07-11 15:47:37 normalization > Running: transform-catalog --integration-type bigquery --profile-config-dir /data/89696/0/normalize --catalog destination_catalog.json --out /data/89696/0/normalize/models/generated/ --json-column _airbyte_data 2022-07-11 15:47:38 normalization > Processing destination_catalog.json... 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/bank_config_ab1.sql from bank_config 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/bank_config_ab2.sql from bank_config 2022-07-11 15:47:38 normalization > Generating airbyte_views/raw_achilles/bank_config_stg.sql from bank_config 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/scd/raw_achilles/bank_config_scd.sql from bank_config 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/raw_achilles/bank_config.sql from bank_config 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/files_in_ab1.sql from files_in 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/files_in_ab2.sql from files_in 2022-07-11 15:47:38 normalization > Generating airbyte_views/raw_achilles/files_in_stg.sql from files_in 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/scd/raw_achilles/files_in_scd.sql from files_in 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/raw_achilles/files_in.sql from files_in 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/files_out_ab1.sql from files_out 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/files_out_ab2.sql from files_out 2022-07-11 15:47:38 normalization > Generating airbyte_views/raw_achilles/files_out_stg.sql from files_out 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/scd/raw_achilles/files_out_scd.sql from files_out 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/raw_achilles/files_out.sql from files_out 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/partner_config_ab1.sql from partner_config 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/partner_config_ab2.sql from partner_config 2022-07-11 15:47:38 normalization > Generating airbyte_views/raw_achilles/partner_config_stg.sql from partner_config 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/scd/raw_achilles/partner_config_scd.sql from partner_config 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/raw_achilles/partner_config.sql from partner_config 2022-07-11 15:47:38 normalization > Generating 
airbyte_ctes/raw_achilles/transactions_in_ab1.sql from transactions_in 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/transactions_in_ab2.sql from transactions_in 2022-07-11 15:47:38 normalization > Generating airbyte_views/raw_achilles/transactions_in_stg.sql from transactions_in 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql from transactions_in 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/raw_achilles/transactions_in.sql from transactions_in 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/transactions_out_ab1.sql from transactions_out 2022-07-11 15:47:38 normalization > Generating airbyte_ctes/raw_achilles/transactions_out_ab2.sql from transactions_out 2022-07-11 15:47:38 normalization > Generating airbyte_views/raw_achilles/transactions_out_stg.sql from transactions_out 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/scd/raw_achilles/transactions_out_scd.sql from transactions_out 2022-07-11 15:47:38 normalization > Generating airbyte_incremental/raw_achilles/transactions_out.sql from transactions_out 2022-07-11 15:47:38 normalization > detected no config file for ssh, assuming ssh is off. 2022-07-11 15:47:42 normalization > [--event-buffer-size EVENT_BUFFER_SIZE] 2022-07-11 15:47:42 normalization > --event-buffer-size EVENT_BUFFER_SIZE 2022-07-11 15:47:42 normalization > 2022-07-11 15:47:42 normalization > DBT >=1.0.0 detected; using 10K event buffer size 2022-07-11 15:47:42 normalization > 2022-07-11 15:47:46 normalization > 15:47:46 Running with dbt=1.0.0 2022-07-11 15:47:46 normalization > 15:47:46 Partial parse save file not found. Starting full parse. 2022-07-11 15:47:49 normalization > 15:47:49 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources. 2022-07-11 15:47:49 normalization > There are 1 unused configuration paths: 2022-07-11 15:47:49 normalization > - models.airbyte_utils.generated.airbyte_tables 2022-07-11 15:47:49 normalization > 2022-07-11 15:47:49 normalization > 15:47:49 Found 30 models, 0 tests, 0 snapshots, 0 analyses, 549 macros, 0 operations, 0 seed files, 6 sources, 0 exposures, 0 metrics 2022-07-11 15:47:49 normalization > 15:47:49 2022-07-11 15:47:50 normalization > 15:47:50 Concurrency: 8 threads (target='prod') 2022-07-11 15:47:50 normalization > 15:47:50 2022-07-11 15:47:51 normalization > 15:47:51 1 of 18 START view model _airbyte_raw_achilles.bank_config_stg.......................................................... [RUN] 2022-07-11 15:47:51 normalization > 15:47:51 2 of 18 START view model _airbyte_raw_achilles.transactions_in_stg...................................................... [RUN] 2022-07-11 15:47:51 normalization > 15:47:51 3 of 18 START view model _airbyte_raw_achilles.files_out_stg............................................................ [RUN] 2022-07-11 15:47:51 normalization > 15:47:51 4 of 18 START view model _airbyte_raw_achilles.partner_config_stg....................................................... [RUN] 2022-07-11 15:47:51 normalization > 15:47:51 5 of 18 START view model _airbyte_raw_achilles.files_in_stg............................................................. [RUN] 2022-07-11 15:47:51 normalization > 15:47:51 6 of 18 START view model _airbyte_raw_achilles.transactions_out_stg..................................................... 
[RUN] 2022-07-11 15:47:52 normalization > 15:47:52 1 of 18 OK created view model _airbyte_raw_achilles.bank_config_stg..................................................... [OK in 1.23s] 2022-07-11 15:47:52 normalization > 15:47:52 7 of 18 START incremental model raw_achilles.bank_config_scd............................................................ [RUN] 2022-07-11 15:47:53 normalization > 15:47:53 3 of 18 OK created view model _airbyte_raw_achilles.files_out_stg....................................................... [OK in 1.33s] 2022-07-11 15:47:53 normalization > 15:47:53 6 of 18 OK created view model _airbyte_raw_achilles.transactions_out_stg................................................ [OK in 1.33s] 2022-07-11 15:47:53 normalization > 15:47:53 8 of 18 START incremental model raw_achilles.transactions_out_scd....................................................... [RUN] 2022-07-11 15:47:53 normalization > 15:47:53 9 of 18 START incremental model raw_achilles.files_out_scd.............................................................. [RUN] 2022-07-11 15:47:53 normalization > 15:47:53 4 of 18 OK created view model _airbyte_raw_achilles.partner_config_stg.................................................. [OK in 1.40s] 2022-07-11 15:47:53 normalization > 15:47:53 5 of 18 OK created view model _airbyte_raw_achilles.files_in_stg........................................................ [OK in 1.40s] 2022-07-11 15:47:53 normalization > 15:47:53 10 of 18 START incremental model raw_achilles.partner_config_scd........................................................ [RUN] 2022-07-11 15:47:53 normalization > 15:47:53 11 of 18 START incremental model raw_achilles.files_in_scd.............................................................. [RUN] 2022-07-11 15:47:53 normalization > 15:47:53 2 of 18 OK created view model _airbyte_raw_achilles.transactions_in_stg................................................. [OK in 1.51s] 2022-07-11 15:47:53 normalization > 15:47:53 15:47:53 + `mainapi-282501`.raw_achilles.`bank_config_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:47:53 normalization > 15:47:53 12 of 18 START incremental model raw_achilles.transactions_in_scd....................................................... [RUN] 2022-07-11 15:47:53 normalization > 15:47:53 15:47:53 + `mainapi-282501`.raw_achilles.`transactions_out_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:47:53 normalization > 15:47:53 15:47:53 + `mainapi-282501`.raw_achilles.`files_out_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:47:53 normalization > 15:47:53 15:47:53 + `mainapi-282501`.raw_achilles.`partner_config_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:47:53 normalization > 15:47:53 15:47:53 + `mainapi-282501`.raw_achilles.`files_in_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:47:53 normalization > 15:47:53 15:47:53 + `mainapi-282501`.raw_achilles.`transactions_in_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:47:59 normalization > 15:47:59 7 of 18 OK created incremental model raw_achilles.bank_config_scd....................................................... 
[CREATE TABLE (3.0 rows, 1.9 KB processed) in 6.34s]
2022-07-11 15:47:59 normalization > 15:47:59 13 of 18 START incremental model raw_achilles.bank_config............................................................... [RUN]
2022-07-11 15:47:59 normalization > 15:47:59 15:47:59 + `mainapi-282501`.raw_achilles.`bank_config`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:47:59 normalization > 15:47:59 10 of 18 OK created incremental model raw_achilles.partner_config_scd................................................... [CREATE TABLE (206.0 rows, 112.0 KB processed) in 6.25s]
2022-07-11 15:47:59 normalization > 15:47:59 14 of 18 START incremental model raw_achilles.partner_config............................................................ [RUN]
2022-07-11 15:47:59 normalization > 15:47:59 15:47:59 + `mainapi-282501`.raw_achilles.`partner_config`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:47:59 normalization > 15:47:59 9 of 18 OK created incremental model raw_achilles.files_out_scd......................................................... [CREATE TABLE (34.0 rows, 14.4 KB processed) in 6.63s]
2022-07-11 15:47:59 normalization > 15:47:59 15 of 18 START incremental model raw_achilles.files_out................................................................. [RUN]
2022-07-11 15:47:59 normalization > 15:47:59 8 of 18 OK created incremental model raw_achilles.transactions_out_scd.................................................. [CREATE TABLE (113.0 rows, 93.6 KB processed) in 6.73s]
2022-07-11 15:47:59 normalization > 15:47:59 16 of 18 START incremental model raw_achilles.transactions_out.......................................................... [RUN]
2022-07-11 15:47:59 normalization > 15:47:59 15:47:59 + `mainapi-282501`.raw_achilles.`files_out`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:47:59 normalization > 15:47:59 15:47:59 + `mainapi-282501`.raw_achilles.`transactions_out`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:48:00 normalization > 15:48:00 12 of 18 ERROR creating incremental model raw_achilles.transactions_in_scd.............................................. [ERROR in 6.80s]
2022-07-11 15:48:00 normalization > 15:48:00 17 of 18 SKIP relation raw_achilles.transactions_in..................................................................... [SKIP]
2022-07-11 15:48:00 normalization > 15:48:00 11 of 18 OK created incremental model raw_achilles.files_in_scd......................................................... [CREATE TABLE (36.0 rows, 23.3 KB processed) in 7.14s]
2022-07-11 15:48:00 normalization > 15:48:00 18 of 18 START incremental model raw_achilles.files_in.................................................................. [RUN]
2022-07-11 15:48:00 normalization > 15:48:00 15:48:00 + `mainapi-282501`.raw_achilles.`files_in`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:48:01 normalization > 15:48:01 13 of 18 OK created incremental model raw_achilles.bank_config.......................................................... [CREATE TABLE (3.0 rows, 1.5 KB processed) in 2.45s]
2022-07-11 15:48:01 normalization > 15:48:01 14 of 18 OK created incremental model raw_achilles.partner_config....................................................... [CREATE TABLE (206.0 rows, 84.2 KB processed) in 2.40s]
2022-07-11 15:48:02 normalization > 15:48:02 15 of 18 OK created incremental model raw_achilles.files_out............................................................ [CREATE TABLE (34.0 rows, 10.1 KB processed) in 2.52s]
2022-07-11 15:48:02 normalization > 15:48:02 16 of 18 OK created incremental model raw_achilles.transactions_out..................................................... [CREATE TABLE (113.0 rows, 51.0 KB processed) in 2.53s]
2022-07-11 15:48:03 normalization > 15:48:03 18 of 18 OK created incremental model raw_achilles.files_in............................................................. [CREATE TABLE (36.0 rows, 13.3 KB processed) in 2.94s]
2022-07-11 15:48:03 normalization > 15:48:03
2022-07-11 15:48:03 normalization > 15:48:03 Finished running 6 view models, 12 incremental models in 13.49s.
2022-07-11 15:48:03 normalization > 15:48:03
2022-07-11 15:48:03 normalization > 15:48:03 Completed with 1 error and 0 warnings:
2022-07-11 15:48:03 normalization > 15:48:03
2022-07-11 15:48:03 normalization > 15:48:03 Database Error in model transactions_in_scd (models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql)
2022-07-11 15:48:03 normalization > 15:48:03 Invalid timestamp string "0000-12-30T00:00:00Z"
2022-07-11 15:48:03 normalization > 15:48:03 compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql
2022-07-11 15:48:03 normalization > 15:48:03
2022-07-11 15:48:03 normalization > 15:48:03 Done. PASS=16 WARN=0 ERROR=1 SKIP=1 TOTAL=18
2022-07-11 15:48:03 normalization >
2022-07-11 15:48:03 normalization > Diagnosing dbt debug to check if destination is available for dbt and well configured (1):
2022-07-11 15:48:03 normalization >
2022-07-11 15:48:07 normalization > 15:48:07 Running with dbt=1.0.0
2022-07-11 15:48:07 normalization > dbt version: 1.0.0
2022-07-11 15:48:07 normalization > python version: 3.9.9
2022-07-11 15:48:07 normalization > python path: /usr/local/bin/python
2022-07-11 15:48:07 normalization > os info: Linux-5.13.0-1024-gcp-x86_64-with-glibc2.31
2022-07-11 15:48:07 normalization > Using profiles.yml file at /data/89696/0/normalize/profiles.yml
2022-07-11 15:48:07 normalization > Using dbt_project.yml file at /data/89696/0/normalize/dbt_project.yml
2022-07-11 15:48:07 normalization >
2022-07-11 15:48:07 normalization > Configuration:
2022-07-11 15:48:07 normalization > profiles.yml file [OK found and valid]
2022-07-11 15:48:07 normalization > dbt_project.yml file [OK found and valid]
2022-07-11 15:48:07 normalization >
2022-07-11 15:48:07 normalization > Required dependencies:
2022-07-11 15:48:07 normalization > - git [OK found]
2022-07-11 15:48:07 normalization >
2022-07-11 15:48:07 normalization > Connection:
2022-07-11 15:48:07 normalization > method: service-account-json
2022-07-11 15:48:07 normalization > database: mainapi-282501
2022-07-11 15:48:07 normalization > schema: airbyte
2022-07-11 15:48:07 normalization > location: US
2022-07-11 15:48:07 normalization > priority: interactive
2022-07-11 15:48:07 normalization > timeout_seconds: 300
2022-07-11 15:48:07 normalization > maximum_bytes_billed: None
2022-07-11 15:48:07 normalization > execution_project: mainapi-282501
2022-07-11 15:48:08 normalization > Connection test: [OK connection ok]
2022-07-11 15:48:08 normalization >
2022-07-11 15:48:08 normalization > All checks passed!
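The normalization failure above is BigQuery rejecting the value "0000-12-30T00:00:00Z" while building raw_achilles.transactions_in_scd: BigQuery TIMESTAMP values must fall on or after 0001-01-01, so a year-0000 date (often the rendering of a zero or out-of-range date in the source) cannot be cast. One way to locate the offending rows is to probe the raw table with SAFE_CAST, which returns NULL instead of raising. The sketch below is illustrative only: it assumes the raw table follows the _airbyte_raw_<stream> naming seen in the log, with the record stored as a JSON string in _airbyte_data, and it uses the updated column purely as an example; the transactions_in schema above also has effective_date and settlement_date, which could be probed the same way.

    # Illustrative diagnostic: list transactions_in records whose 'updated' value
    # cannot be cast to a BigQuery TIMESTAMP (e.g. "0000-12-30T00:00:00Z").
    from google.cloud import bigquery

    client = bigquery.Client(project="mainapi-282501")
    sql = """
    SELECT
      JSON_EXTRACT_SCALAR(_airbyte_data, '$.id')      AS id,
      JSON_EXTRACT_SCALAR(_airbyte_data, '$.updated') AS updated
    FROM `mainapi-282501.raw_achilles._airbyte_raw_transactions_in`
    WHERE JSON_EXTRACT_SCALAR(_airbyte_data, '$.updated') IS NOT NULL
      AND SAFE_CAST(JSON_EXTRACT_SCALAR(_airbyte_data, '$.updated') AS TIMESTAMP) IS NULL
    """
    for row in client.query(sql).result():
        print(row["id"], row["updated"])

Once the out-of-range values are identified, correcting them at the source (or excluding the affected column from the sync) should let the transactions_in_scd model build on the next attempt.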
2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > Forward dbt output logs to diagnose/debug errors (0): 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ============================== 2022-07-11 15:47:46.569703 | a03d450a-ec3b-4494-adfd-b4845a43a5a2 ============================== 2022-07-11 15:48:08 normalization > 15:47:46.569703 [info ] [MainThread]: Running with dbt=1.0.0 2022-07-11 15:48:08 normalization > 15:47:46.570516 [debug] [MainThread]: running dbt with arguments Namespace(record_timing_info=None, debug=None, log_format=None, write_json=None, use_colors=None, printer_width=None, warn_error=None, version_check=None, partial_parse=None, single_threaded=False, use_experimental_parser=None, static_parser=None, profiles_dir='/data/89696/0/normalize', send_anonymous_usage_stats=None, fail_fast=None, event_buffer_size='10000', project_dir='/data/89696/0/normalize', profile=None, target=None, vars='{}', log_cache_events=False, threads=None, select=None, exclude=None, selector_name=None, state=None, defer=None, full_refresh=False, cls=, which='run', rpc_method='run') 2022-07-11 15:48:08 normalization > 15:47:46.570983 [debug] [MainThread]: Tracking: do not track 2022-07-11 15:48:08 normalization > 15:47:46.611023 [info ] [MainThread]: Partial parse save file not found. Starting full parse. 2022-07-11 15:48:08 normalization > 15:47:46.667858 [debug] [MainThread]: Parsing macros/configuration.sql 2022-07-11 15:48:08 normalization > 15:47:46.672970 [debug] [MainThread]: Parsing macros/should_full_refresh.sql 2022-07-11 15:48:08 normalization > 15:47:46.682405 [debug] [MainThread]: Parsing macros/incremental.sql 2022-07-11 15:48:08 normalization > 15:47:46.693483 [debug] [MainThread]: Parsing macros/get_custom_schema.sql 2022-07-11 15:48:08 normalization > 15:47:46.694591 [debug] [MainThread]: Parsing macros/star_intersect.sql 2022-07-11 15:48:08 normalization > 15:47:46.705379 [debug] [MainThread]: Parsing macros/schema_tests/equal_rowcount.sql 2022-07-11 15:48:08 normalization > 15:47:46.707533 [debug] [MainThread]: Parsing macros/schema_tests/equality.sql 2022-07-11 15:48:08 normalization > 15:47:46.719499 [debug] [MainThread]: Parsing macros/cross_db_utils/except.sql 2022-07-11 15:48:08 normalization > 15:47:46.720859 [debug] [MainThread]: Parsing macros/cross_db_utils/current_timestamp.sql 2022-07-11 15:48:08 normalization > 15:47:46.721842 [debug] [MainThread]: Parsing macros/cross_db_utils/columns.sql 2022-07-11 15:48:08 normalization > 15:47:46.726661 [debug] [MainThread]: Parsing macros/cross_db_utils/hash.sql 2022-07-11 15:48:08 normalization > 15:47:46.727695 [debug] [MainThread]: Parsing macros/cross_db_utils/json_operations.sql 2022-07-11 15:48:08 normalization > 15:47:46.796456 [debug] [MainThread]: Parsing macros/cross_db_utils/quote.sql 2022-07-11 15:48:08 normalization > 15:47:46.799271 [debug] [MainThread]: Parsing macros/cross_db_utils/type_conversions.sql 2022-07-11 15:48:08 normalization > 15:47:46.813654 [debug] [MainThread]: Parsing macros/cross_db_utils/surrogate_key.sql 2022-07-11 15:48:08 normalization > 15:47:46.817168 [debug] [MainThread]: Parsing macros/cross_db_utils/datatypes.sql 2022-07-11 15:48:08 normalization > 15:47:46.836447 [debug] [MainThread]: Parsing macros/cross_db_utils/concat.sql 2022-07-11 15:48:08 normalization > 15:47:46.841336 [debug] [MainThread]: Parsing macros/cross_db_utils/array.sql 2022-07-11 15:48:08 normalization > 
15:47:46.867690 [debug] [MainThread]: Parsing macros/adapters.sql 2022-07-11 15:48:08 normalization > 15:47:46.921325 [debug] [MainThread]: Parsing macros/catalog.sql 2022-07-11 15:48:08 normalization > 15:47:46.938628 [debug] [MainThread]: Parsing macros/etc.sql 2022-07-11 15:48:08 normalization > 15:47:46.941949 [debug] [MainThread]: Parsing macros/materializations/table.sql 2022-07-11 15:48:08 normalization > 15:47:46.949172 [debug] [MainThread]: Parsing macros/materializations/copy.sql 2022-07-11 15:48:08 normalization > 15:47:46.954080 [debug] [MainThread]: Parsing macros/materializations/seed.sql 2022-07-11 15:48:08 normalization > 15:47:46.958728 [debug] [MainThread]: Parsing macros/materializations/incremental.sql 2022-07-11 15:48:08 normalization > 15:47:46.988266 [debug] [MainThread]: Parsing macros/materializations/view.sql 2022-07-11 15:48:08 normalization > 15:47:46.993136 [debug] [MainThread]: Parsing macros/materializations/snapshot.sql 2022-07-11 15:48:08 normalization > 15:47:46.996114 [debug] [MainThread]: Parsing macros/etc/statement.sql 2022-07-11 15:48:08 normalization > 15:47:47.004817 [debug] [MainThread]: Parsing macros/etc/datetime.sql 2022-07-11 15:48:08 normalization > 15:47:47.021480 [debug] [MainThread]: Parsing macros/materializations/configs.sql 2022-07-11 15:48:08 normalization > 15:47:47.025783 [debug] [MainThread]: Parsing macros/materializations/hooks.sql 2022-07-11 15:48:08 normalization > 15:47:47.033313 [debug] [MainThread]: Parsing macros/materializations/tests/where_subquery.sql 2022-07-11 15:48:08 normalization > 15:47:47.036515 [debug] [MainThread]: Parsing macros/materializations/tests/helpers.sql 2022-07-11 15:48:08 normalization > 15:47:47.039656 [debug] [MainThread]: Parsing macros/materializations/tests/test.sql 2022-07-11 15:48:08 normalization > 15:47:47.048240 [debug] [MainThread]: Parsing macros/materializations/seeds/seed.sql 2022-07-11 15:48:08 normalization > 15:47:47.060680 [debug] [MainThread]: Parsing macros/materializations/seeds/helpers.sql 2022-07-11 15:48:08 normalization > 15:47:47.093460 [debug] [MainThread]: Parsing macros/materializations/models/table/table.sql 2022-07-11 15:48:08 normalization > 15:47:47.109136 [debug] [MainThread]: Parsing macros/materializations/models/table/create_table_as.sql 2022-07-11 15:48:08 normalization > 15:47:47.117505 [debug] [MainThread]: Parsing macros/materializations/models/incremental/incremental.sql 2022-07-11 15:48:08 normalization > 15:47:47.142784 [debug] [MainThread]: Parsing macros/materializations/models/incremental/column_helpers.sql 2022-07-11 15:48:08 normalization > 15:47:47.151367 [debug] [MainThread]: Parsing macros/materializations/models/incremental/merge.sql 2022-07-11 15:48:08 normalization > 15:47:47.173280 [debug] [MainThread]: Parsing macros/materializations/models/incremental/on_schema_change.sql 2022-07-11 15:48:08 normalization > 15:47:47.204513 [debug] [MainThread]: Parsing macros/materializations/models/incremental/is_incremental.sql 2022-07-11 15:48:08 normalization > 15:47:47.207263 [debug] [MainThread]: Parsing macros/materializations/models/view/create_or_replace_view.sql 2022-07-11 15:48:08 normalization > 15:47:47.212035 [debug] [MainThread]: Parsing macros/materializations/models/view/create_view_as.sql 2022-07-11 15:48:08 normalization > 15:47:47.216007 [debug] [MainThread]: Parsing macros/materializations/models/view/view.sql 2022-07-11 15:48:08 normalization > 15:47:47.229171 [debug] [MainThread]: Parsing macros/materializations/models/view/helpers.sql 
2022-07-11 15:48:08 normalization > 15:47:47.231577 [debug] [MainThread]: Parsing macros/materializations/snapshots/snapshot_merge.sql 2022-07-11 15:48:08 normalization > 15:47:47.234974 [debug] [MainThread]: Parsing macros/materializations/snapshots/snapshot.sql 2022-07-11 15:48:08 normalization > 15:47:47.256816 [debug] [MainThread]: Parsing macros/materializations/snapshots/helpers.sql 2022-07-11 15:48:08 normalization > 15:47:47.279217 [debug] [MainThread]: Parsing macros/materializations/snapshots/strategies.sql 2022-07-11 15:48:08 normalization > 15:47:47.310394 [debug] [MainThread]: Parsing macros/adapters/persist_docs.sql 2022-07-11 15:48:08 normalization > 15:47:47.324447 [debug] [MainThread]: Parsing macros/adapters/columns.sql 2022-07-11 15:48:08 normalization > 15:47:47.347654 [debug] [MainThread]: Parsing macros/adapters/indexes.sql 2022-07-11 15:48:08 normalization > 15:47:47.352916 [debug] [MainThread]: Parsing macros/adapters/relation.sql 2022-07-11 15:48:08 normalization > 15:47:47.372214 [debug] [MainThread]: Parsing macros/adapters/schema.sql 2022-07-11 15:48:08 normalization > 15:47:47.376434 [debug] [MainThread]: Parsing macros/adapters/freshness.sql 2022-07-11 15:48:08 normalization > 15:47:47.381706 [debug] [MainThread]: Parsing macros/adapters/metadata.sql 2022-07-11 15:48:08 normalization > 15:47:47.395295 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_database.sql 2022-07-11 15:48:08 normalization > 15:47:47.398246 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_schema.sql 2022-07-11 15:48:08 normalization > 15:47:47.402912 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_alias.sql 2022-07-11 15:48:08 normalization > 15:47:47.405479 [debug] [MainThread]: Parsing macros/generic_test_sql/not_null.sql 2022-07-11 15:48:08 normalization > 15:47:47.406516 [debug] [MainThread]: Parsing macros/generic_test_sql/accepted_values.sql 2022-07-11 15:48:08 normalization > 15:47:47.408937 [debug] [MainThread]: Parsing macros/generic_test_sql/unique.sql 2022-07-11 15:48:08 normalization > 15:47:47.410156 [debug] [MainThread]: Parsing macros/generic_test_sql/relationships.sql 2022-07-11 15:48:08 normalization > 15:47:47.411776 [debug] [MainThread]: Parsing tests/generic/builtin.sql 2022-07-11 15:48:08 normalization > 15:47:47.417148 [debug] [MainThread]: Parsing macros/web/get_url_path.sql 2022-07-11 15:48:08 normalization > 15:47:47.421843 [debug] [MainThread]: Parsing macros/web/get_url_host.sql 2022-07-11 15:48:08 normalization > 15:47:47.425320 [debug] [MainThread]: Parsing macros/web/get_url_parameter.sql 2022-07-11 15:48:08 normalization > 15:47:47.428217 [debug] [MainThread]: Parsing macros/materializations/insert_by_period_materialization.sql 2022-07-11 15:48:08 normalization > 15:47:47.475582 [debug] [MainThread]: Parsing macros/schema_tests/test_not_null_where.sql 2022-07-11 15:48:08 normalization > 15:47:47.478474 [debug] [MainThread]: Parsing macros/schema_tests/test_unique_where.sql 2022-07-11 15:48:08 normalization > 15:47:47.481027 [debug] [MainThread]: Parsing macros/schema_tests/at_least_one.sql 2022-07-11 15:48:08 normalization > 15:47:47.483283 [debug] [MainThread]: Parsing macros/schema_tests/not_constant.sql 2022-07-11 15:48:08 normalization > 15:47:47.485425 [debug] [MainThread]: Parsing macros/schema_tests/expression_is_true.sql 2022-07-11 15:48:08 normalization > 15:47:47.488609 [debug] [MainThread]: Parsing macros/schema_tests/recency.sql 2022-07-11 15:48:08 normalization > 15:47:47.491659 [debug] 
[MainThread]: Parsing macros/schema_tests/equal_rowcount.sql 2022-07-11 15:48:08 normalization > 15:47:47.494643 [debug] [MainThread]: Parsing macros/schema_tests/not_null_proportion.sql 2022-07-11 15:48:08 normalization > 15:47:47.498538 [debug] [MainThread]: Parsing macros/schema_tests/accepted_range.sql 2022-07-11 15:48:08 normalization > 15:47:47.503494 [debug] [MainThread]: Parsing macros/schema_tests/not_accepted_values.sql 2022-07-11 15:48:08 normalization > 15:47:47.507193 [debug] [MainThread]: Parsing macros/schema_tests/cardinality_equality.sql 2022-07-11 15:48:08 normalization > 15:47:47.510765 [debug] [MainThread]: Parsing macros/schema_tests/unique_combination_of_columns.sql 2022-07-11 15:48:08 normalization > 15:47:47.515877 [debug] [MainThread]: Parsing macros/schema_tests/mutually_exclusive_ranges.sql 2022-07-11 15:48:08 normalization > 15:47:47.533345 [debug] [MainThread]: Parsing macros/schema_tests/fewer_rows_than.sql 2022-07-11 15:48:08 normalization > 15:47:47.536751 [debug] [MainThread]: Parsing macros/schema_tests/equality.sql 2022-07-11 15:48:08 normalization > 15:47:47.543261 [debug] [MainThread]: Parsing macros/schema_tests/relationships_where.sql 2022-07-11 15:48:08 normalization > 15:47:47.547199 [debug] [MainThread]: Parsing macros/schema_tests/sequential_values.sql 2022-07-11 15:48:08 normalization > 15:47:47.552630 [debug] [MainThread]: Parsing macros/cross_db_utils/except.sql 2022-07-11 15:48:08 normalization > 15:47:47.554474 [debug] [MainThread]: Parsing macros/cross_db_utils/length.sql 2022-07-11 15:48:08 normalization > 15:47:47.556657 [debug] [MainThread]: Parsing macros/cross_db_utils/position.sql 2022-07-11 15:48:08 normalization > 15:47:47.559339 [debug] [MainThread]: Parsing macros/cross_db_utils/current_timestamp.sql 2022-07-11 15:48:08 normalization > 15:47:47.565633 [debug] [MainThread]: Parsing macros/cross_db_utils/intersect.sql 2022-07-11 15:48:08 normalization > 15:47:47.567663 [debug] [MainThread]: Parsing macros/cross_db_utils/replace.sql 2022-07-11 15:48:08 normalization > 15:47:47.569801 [debug] [MainThread]: Parsing macros/cross_db_utils/escape_single_quotes.sql 2022-07-11 15:48:08 normalization > 15:47:47.572924 [debug] [MainThread]: Parsing macros/cross_db_utils/any_value.sql 2022-07-11 15:48:08 normalization > 15:47:47.575131 [debug] [MainThread]: Parsing macros/cross_db_utils/last_day.sql 2022-07-11 15:48:08 normalization > 15:47:47.581456 [debug] [MainThread]: Parsing macros/cross_db_utils/cast_bool_to_text.sql 2022-07-11 15:48:08 normalization > 15:47:47.583976 [debug] [MainThread]: Parsing macros/cross_db_utils/dateadd.sql 2022-07-11 15:48:08 normalization > 15:47:47.589240 [debug] [MainThread]: Parsing macros/cross_db_utils/literal.sql 2022-07-11 15:48:08 normalization > 15:47:47.590853 [debug] [MainThread]: Parsing macros/cross_db_utils/safe_cast.sql 2022-07-11 15:48:08 normalization > 15:47:47.593986 [debug] [MainThread]: Parsing macros/cross_db_utils/date_trunc.sql 2022-07-11 15:48:08 normalization > 15:47:47.596535 [debug] [MainThread]: Parsing macros/cross_db_utils/bool_or.sql 2022-07-11 15:48:08 normalization > 15:47:47.599412 [debug] [MainThread]: Parsing macros/cross_db_utils/width_bucket.sql 2022-07-11 15:48:08 normalization > 15:47:47.609454 [debug] [MainThread]: Parsing macros/cross_db_utils/hash.sql 2022-07-11 15:48:08 normalization > 15:47:47.611929 [debug] [MainThread]: Parsing macros/cross_db_utils/_is_ephemeral.sql 2022-07-11 15:48:08 normalization > 15:47:47.615346 [debug] [MainThread]: Parsing 
macros/cross_db_utils/_is_relation.sql 2022-07-11 15:48:08 normalization > 15:47:47.617254 [debug] [MainThread]: Parsing macros/cross_db_utils/right.sql 2022-07-11 15:48:08 normalization > 15:47:47.621279 [debug] [MainThread]: Parsing macros/cross_db_utils/split_part.sql 2022-07-11 15:48:08 normalization > 15:47:47.624542 [debug] [MainThread]: Parsing macros/cross_db_utils/datediff.sql 2022-07-11 15:48:08 normalization > 15:47:47.644283 [debug] [MainThread]: Parsing macros/cross_db_utils/datatypes.sql 2022-07-11 15:48:08 normalization > 15:47:47.655329 [debug] [MainThread]: Parsing macros/cross_db_utils/concat.sql 2022-07-11 15:48:08 normalization > 15:47:47.657039 [debug] [MainThread]: Parsing macros/cross_db_utils/identifier.sql 2022-07-11 15:48:08 normalization > 15:47:47.659880 [debug] [MainThread]: Parsing macros/sql/get_tables_by_prefix_sql.sql 2022-07-11 15:48:08 normalization > 15:47:47.662766 [debug] [MainThread]: Parsing macros/sql/get_column_values.sql 2022-07-11 15:48:08 normalization > 15:47:47.672042 [debug] [MainThread]: Parsing macros/sql/get_query_results_as_dict.sql 2022-07-11 15:48:08 normalization > 15:47:47.676281 [debug] [MainThread]: Parsing macros/sql/get_relations_by_pattern.sql 2022-07-11 15:48:08 normalization > 15:47:47.682427 [debug] [MainThread]: Parsing macros/sql/get_relations_by_prefix.sql 2022-07-11 15:48:08 normalization > 15:47:47.688655 [debug] [MainThread]: Parsing macros/sql/haversine_distance.sql 2022-07-11 15:48:08 normalization > 15:47:47.699577 [debug] [MainThread]: Parsing macros/sql/get_tables_by_pattern_sql.sql 2022-07-11 15:48:08 normalization > 15:47:47.711986 [debug] [MainThread]: Parsing macros/sql/pivot.sql 2022-07-11 15:48:08 normalization > 15:47:47.720538 [debug] [MainThread]: Parsing macros/sql/date_spine.sql 2022-07-11 15:48:08 normalization > 15:47:47.728612 [debug] [MainThread]: Parsing macros/sql/star.sql 2022-07-11 15:48:08 normalization > 15:47:47.736803 [debug] [MainThread]: Parsing macros/sql/union.sql 2022-07-11 15:48:08 normalization > 15:47:47.757253 [debug] [MainThread]: Parsing macros/sql/get_table_types_sql.sql 2022-07-11 15:48:08 normalization > 15:47:47.759976 [debug] [MainThread]: Parsing macros/sql/safe_add.sql 2022-07-11 15:48:08 normalization > 15:47:47.763860 [debug] [MainThread]: Parsing macros/sql/surrogate_key.sql 2022-07-11 15:48:08 normalization > 15:47:47.770567 [debug] [MainThread]: Parsing macros/sql/groupby.sql 2022-07-11 15:48:08 normalization > 15:47:47.773239 [debug] [MainThread]: Parsing macros/sql/generate_series.sql 2022-07-11 15:48:08 normalization > 15:47:47.781677 [debug] [MainThread]: Parsing macros/sql/nullcheck.sql 2022-07-11 15:48:08 normalization > 15:47:47.784921 [debug] [MainThread]: Parsing macros/sql/unpivot.sql 2022-07-11 15:48:08 normalization > 15:47:47.800671 [debug] [MainThread]: Parsing macros/sql/nullcheck_table.sql 2022-07-11 15:48:08 normalization > 15:47:47.803908 [debug] [MainThread]: Parsing macros/jinja_helpers/log_info.sql 2022-07-11 15:48:08 normalization > 15:47:47.805995 [debug] [MainThread]: Parsing macros/jinja_helpers/pretty_time.sql 2022-07-11 15:48:08 normalization > 15:47:47.808224 [debug] [MainThread]: Parsing macros/jinja_helpers/slugify.sql 2022-07-11 15:48:08 normalization > 15:47:47.810312 [debug] [MainThread]: Parsing macros/jinja_helpers/pretty_log_format.sql 2022-07-11 15:48:08 normalization > 15:47:48.557280 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/transactions_out_scd.sql 2022-07-11 15:48:08 
normalization > 15:47:48.643452 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/transactions_out_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.646530 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/bank_config_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.683156 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/bank_config_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.686169 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.723959 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.727326 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/files_out_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.834473 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/files_out_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.837414 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/files_in_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.872626 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/files_in_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.875557 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/partner_config_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.910302 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/partner_config_scd.sql 2022-07-11 15:48:08 normalization > 15:47:48.912492 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/files_out.sql 2022-07-11 15:48:08 normalization > 15:47:48.928650 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/files_out.sql 2022-07-11 15:48:08 normalization > 15:47:48.930788 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/files_in.sql 2022-07-11 15:48:08 normalization > 15:47:48.944343 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/files_in.sql 2022-07-11 15:48:08 normalization > 15:47:48.946564 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/bank_config.sql 2022-07-11 15:48:08 normalization > 15:47:48.958481 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/bank_config.sql 2022-07-11 15:48:08 normalization > 15:47:48.960642 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/partner_config.sql 2022-07-11 15:48:08 normalization > 15:47:48.973343 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/partner_config.sql 2022-07-11 15:48:08 normalization > 15:47:48.975532 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/transactions_in.sql 2022-07-11 15:48:08 normalization > 15:47:48.987585 [debug] [MainThread]: 
1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/transactions_in.sql 2022-07-11 15:48:08 normalization > 15:47:48.989827 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/transactions_out.sql 2022-07-11 15:48:08 normalization > 15:47:49.001704 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/transactions_out.sql 2022-07-11 15:48:08 normalization > 15:47:49.004154 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_out_ab1.sql 2022-07-11 15:48:08 normalization > 15:47:49.040874 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_out_ab1.sql 2022-07-11 15:48:08 normalization > 15:47:49.043404 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_in_ab1.sql 2022-07-11 15:48:08 normalization > 15:47:49.079593 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_in_ab1.sql 2022-07-11 15:48:08 normalization > 15:47:49.081936 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_out_ab2.sql 2022-07-11 15:48:08 normalization > 15:47:49.111535 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_out_ab2.sql 2022-07-11 15:48:08 normalization > 15:47:49.114151 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/bank_config_ab2.sql 2022-07-11 15:48:08 normalization > 15:47:49.132868 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/bank_config_ab2.sql 2022-07-11 15:48:08 normalization > 15:47:49.135200 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/partner_config_ab1.sql 2022-07-11 15:48:08 normalization > 15:47:49.161657 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/partner_config_ab1.sql 2022-07-11 15:48:08 normalization > 15:47:49.163934 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/bank_config_ab1.sql 2022-07-11 15:48:08 normalization > 15:47:49.187411 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/bank_config_ab1.sql 2022-07-11 15:48:08 normalization > 15:47:49.189701 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/partner_config_ab2.sql 2022-07-11 15:48:08 normalization > 15:47:49.208346 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/partner_config_ab2.sql 2022-07-11 15:48:08 normalization > 15:47:49.210776 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_out_ab1.sql 2022-07-11 15:48:08 normalization > 15:47:49.258964 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_out_ab1.sql 2022-07-11 15:48:08 normalization > 15:47:49.261357 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_in_ab2.sql 2022-07-11 15:48:08 normalization > 15:47:49.286941 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_in_ab2.sql 2022-07-11 15:48:08 normalization > 15:47:49.289931 [debug] [MainThread]: 1603: static 
parser failed on generated/airbyte_ctes/raw_achilles/transactions_in_ab2.sql
2022-07-11 15:48:08 normalization > 15:47:49.341852 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_in_ab2.sql
2022-07-11 15:48:08 normalization > 15:47:49.345357 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_in_ab1.sql
2022-07-11 15:48:08 normalization > 15:47:49.429156 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_in_ab1.sql
2022-07-11 15:48:08 normalization > 15:47:49.431788 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_out_ab2.sql
2022-07-11 15:48:08 normalization > 15:47:49.463065 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_out_ab2.sql
2022-07-11 15:48:08 normalization > 15:47:49.465327 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/transactions_in_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.502884 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/transactions_in_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.505122 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/bank_config_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.520249 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/bank_config_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.522177 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/partner_config_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.539049 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/partner_config_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.541233 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/transactions_out_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.565168 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/transactions_out_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.567412 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/files_out_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.585234 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/files_out_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.587381 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/files_in_stg.sql
2022-07-11 15:48:08 normalization > 15:47:49.607319 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/files_in_stg.sql
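The 1603/1602 pairs above are dbt 1.0's static parser giving up on each generated model and re-rendering it with full Jinja. The static parser only handles a small subset of Jinja (essentially ref, source and config), so any model whose SQL body is produced mainly by macro calls triggers this fallback; it is noisy but harmless. As a rough illustration only, a generated staging model's source presumably looks something like the sketch below. This is a hypothetical sketch: the actual files under generated/airbyte_views/... are not included in this log, and the macro names shown are assumptions.

    -- Hypothetical dbt model source (illustration only, not taken from this log):
    -- the SQL is built from macro calls, which the static parser cannot evaluate,
    -- so dbt falls back to rendering the model with Jinja.
    {{ config(schema="_airbyte_raw_achilles", materialized="view") }}
    select
        {{ json_extract_scalar('_airbyte_data', ['name'], ['name']) }} as name,
        _airbyte_ab_id,
        _airbyte_emitted_at,
        {{ current_timestamp() }} as _airbyte_normalized_at
    from {{ source('raw_achilles', '_airbyte_raw_bank_config') }}
    where 1 = 1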
2022-07-11 15:48:08 normalization > 15:47:49.729928 [warn ] [MainThread]: [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2022-07-11 15:48:08 normalization > There are 1 unused configuration paths:
2022-07-11 15:48:08 normalization > - models.airbyte_utils.generated.airbyte_tables
2022-07-11 15:48:08 normalization >
2022-07-11 15:48:08 normalization > 15:47:49.767560 [info ] [MainThread]: Found 30 models, 0 tests, 0 snapshots, 0 analyses, 549 macros, 0 operations, 0 seed files, 6 sources, 0 exposures, 0 metrics
2022-07-11 15:48:08 normalization > 15:47:49.771991 [info ] [MainThread]:
2022-07-11 15:48:08 normalization > 15:47:49.773391 [debug] [MainThread]: Acquiring new bigquery connection "master"
2022-07-11 15:48:08 normalization > 15:47:49.776534 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501"
2022-07-11 15:48:08 normalization > 15:47:49.777141 [debug] [ThreadPool]: Opening a new connection, currently in state init
2022-07-11 15:48:08 normalization > 15:47:49.778082 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501"
2022-07-11 15:48:08 normalization > 15:47:49.779034 [debug] [ThreadPool]: Opening a new connection, currently in state init
2022-07-11 15:48:08 normalization > 15:47:50.433519 [debug] [ThreadPool]: Acquiring new bigquery connection "create_mainapi-282501__airbyte_raw_achilles"
2022-07-11 15:48:08 normalization > 15:47:50.434502 [debug] [ThreadPool]: Acquiring new bigquery connection "create_mainapi-282501__airbyte_raw_achilles"
2022-07-11 15:48:08 normalization > 15:47:50.434734 [debug] [ThreadPool]: BigQuery adapter: Creating schema "mainapi-282501._airbyte_raw_achilles".
2022-07-11 15:48:08 normalization > 15:47:50.434945 [debug] [ThreadPool]: Opening a new connection, currently in state closed
2022-07-11 15:48:08 normalization > 15:47:50.723307 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501__airbyte_raw_achilles"
2022-07-11 15:48:08 normalization > 15:47:50.724713 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501_raw_achilles"
2022-07-11 15:48:08 normalization > 15:47:50.725354 [debug] [ThreadPool]: Opening a new connection, currently in state closed
2022-07-11 15:48:08 normalization > 15:47:50.725733 [debug] [ThreadPool]: Opening a new connection, currently in state closed
2022-07-11 15:48:08 normalization > 15:47:50.872510 [info ] [MainThread]: Concurrency: 8 threads (target='prod')
2022-07-11 15:48:08 normalization > 15:47:50.873073 [info ] [MainThread]:
2022-07-11 15:48:08 normalization > 15:47:50.899979 [debug] [Thread-1 ]: Began running node model.airbyte_utils.bank_config_ab1
2022-07-11 15:48:08 normalization > 15:47:50.900481 [debug] [Thread-2 ]: Began running node model.airbyte_utils.files_in_ab1
2022-07-11 15:48:08 normalization > 15:47:50.901399 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_ab1"
2022-07-11 15:48:08 normalization > 15:47:50.901687 [debug] [Thread-3 ]: Began running node model.airbyte_utils.files_out_ab1
2022-07-11 15:48:08 normalization > 15:47:50.902098 [debug] [Thread-4 ]: Began running node model.airbyte_utils.partner_config_ab1
2022-07-11 15:48:08 normalization > 15:47:50.902995 [debug] [Thread-2 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_ab1"
2022-07-11 15:48:08 normalization > 15:47:50.903303 [debug] [Thread-5 ]: Began running node model.airbyte_utils.transactions_in_ab1
2022-07-11 15:48:08 normalization > 15:47:50.903697 [debug] [Thread-6 ]: Began running node model.airbyte_utils.transactions_out_ab1
2022-07-11 15:48:08 normalization > 15:47:50.904068 [debug] [Thread-1 ]: Began 
compiling node model.airbyte_utils.bank_config_ab1 2022-07-11 15:48:08 normalization > 15:47:50.905212 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_ab1" 2022-07-11 15:48:08 normalization > 15:47:50.906177 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_ab1" 2022-07-11 15:48:08 normalization > 15:47:50.906558 [debug] [Thread-2 ]: Began compiling node model.airbyte_utils.files_in_ab1 2022-07-11 15:48:08 normalization > 15:47:50.907436 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_ab1" 2022-07-11 15:48:08 normalization > 15:47:50.908475 [debug] [Thread-6 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_ab1" 2022-07-11 15:48:08 normalization > 15:47:50.908842 [debug] [Thread-1 ]: Compiling model.airbyte_utils.bank_config_ab1 2022-07-11 15:48:08 normalization > 15:47:50.909133 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.files_out_ab1 2022-07-11 15:48:08 normalization > 15:47:50.909428 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.partner_config_ab1 2022-07-11 15:48:08 normalization > 15:47:50.909747 [debug] [Thread-2 ]: Compiling model.airbyte_utils.files_in_ab1 2022-07-11 15:48:08 normalization > 15:47:50.910007 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.transactions_in_ab1 2022-07-11 15:48:08 normalization > 15:47:50.910311 [debug] [Thread-6 ]: Began compiling node model.airbyte_utils.transactions_out_ab1 2022-07-11 15:48:08 normalization > 15:47:50.931598 [debug] [Thread-3 ]: Compiling model.airbyte_utils.files_out_ab1 2022-07-11 15:48:08 normalization > 15:47:50.933625 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_ab1" 2022-07-11 15:48:08 normalization > 15:47:50.933974 [debug] [Thread-4 ]: Compiling model.airbyte_utils.partner_config_ab1 2022-07-11 15:48:08 normalization > 15:47:50.938565 [debug] [Thread-5 ]: Compiling model.airbyte_utils.transactions_in_ab1 2022-07-11 15:48:08 normalization > 15:47:50.939805 [debug] [Thread-6 ]: Compiling model.airbyte_utils.transactions_out_ab1 2022-07-11 15:48:08 normalization > 15:47:50.966565 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.files_out_ab1" 2022-07-11 15:48:08 normalization > 15:47:51.038772 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_ab1" 2022-07-11 15:48:08 normalization > 15:47:51.168967 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_ab1" 2022-07-11 15:48:08 normalization > 15:47:51.182391 [debug] [Thread-6 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_ab1" 2022-07-11 15:48:08 normalization > 15:47:51.188474 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.192197 [debug] [Thread-2 ]: Writing injected SQL for node "model.airbyte_utils.files_in_ab1" 2022-07-11 15:48:08 normalization > 15:47:51.192867 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.193970 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.transactions_in_ab1 2022-07-11 15:48:08 normalization > 15:47:51.194191 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.194617 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.194928 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:48:08 
normalization > 15:47:51.196002 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.bank_config_ab1 2022-07-11 15:48:08 normalization > 15:47:51.196318 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.197131 [debug] [Thread-8 ]: Began running node model.airbyte_utils.transactions_in_ab2 2022-07-11 15:48:08 normalization > 15:47:51.198065 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.partner_config_ab1 2022-07-11 15:48:08 normalization > 15:47:51.198947 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.files_out_ab1 2022-07-11 15:48:08 normalization > 15:47:51.199816 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.transactions_out_ab1 2022-07-11 15:48:08 normalization > 15:47:51.201273 [debug] [Thread-2 ]: Finished running node model.airbyte_utils.files_in_ab1 2022-07-11 15:48:08 normalization > 15:47:51.201615 [debug] [Thread-5 ]: Began running node model.airbyte_utils.bank_config_ab2 2022-07-11 15:48:08 normalization > 15:47:51.202653 [debug] [Thread-8 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.204335 [debug] [Thread-1 ]: Began running node model.airbyte_utils.files_out_ab2 2022-07-11 15:48:08 normalization > 15:47:51.204747 [debug] [Thread-7 ]: Began running node model.airbyte_utils.partner_config_ab2 2022-07-11 15:48:08 normalization > 15:47:51.205124 [debug] [Thread-2 ]: Began running node model.airbyte_utils.transactions_out_ab2 2022-07-11 15:48:08 normalization > 15:47:51.206217 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.206622 [debug] [Thread-3 ]: Began running node model.airbyte_utils.files_in_ab2 2022-07-11 15:48:08 normalization > 15:47:51.206952 [debug] [Thread-8 ]: Began compiling node model.airbyte_utils.transactions_in_ab2 2022-07-11 15:48:08 normalization > 15:47:51.207794 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.208759 [debug] [Thread-7 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.209603 [debug] [Thread-2 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.209902 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.bank_config_ab2 2022-07-11 15:48:08 normalization > 15:47:51.210789 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.211142 [debug] [Thread-8 ]: Compiling model.airbyte_utils.transactions_in_ab2 2022-07-11 15:48:08 normalization > 15:47:51.211447 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.files_out_ab2 2022-07-11 15:48:08 normalization > 15:47:51.211836 [debug] [Thread-7 ]: Began compiling node model.airbyte_utils.partner_config_ab2 2022-07-11 15:48:08 normalization > 15:47:51.212136 [debug] [Thread-2 ]: Began compiling node model.airbyte_utils.transactions_out_ab2 2022-07-11 15:48:08 normalization > 15:47:51.212439 [debug] [Thread-5 ]: Compiling model.airbyte_utils.bank_config_ab2 2022-07-11 15:48:08 normalization > 15:47:51.212761 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.files_in_ab2 2022-07-11 15:48:08 normalization > 15:47:51.295060 [debug] [Thread-1 ]: Compiling model.airbyte_utils.files_out_ab2 2022-07-11 
15:48:08 normalization > 15:47:51.323451 [debug] [Thread-7 ]: Compiling model.airbyte_utils.partner_config_ab2 2022-07-11 15:48:08 normalization > 15:47:51.349258 [debug] [Thread-2 ]: Compiling model.airbyte_utils.transactions_out_ab2 2022-07-11 15:48:08 normalization > 15:47:51.385894 [debug] [Thread-3 ]: Compiling model.airbyte_utils.files_in_ab2 2022-07-11 15:48:08 normalization > 15:47:51.508550 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.555429 [debug] [Thread-8 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.602280 [debug] [Thread-7 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.609144 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.files_out_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.646436 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.693710 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.697064 [debug] [Thread-2 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.700602 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.files_in_ab2" 2022-07-11 15:48:08 normalization > 15:47:51.702470 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.bank_config_ab2 2022-07-11 15:48:08 normalization > 15:47:51.703562 [debug] [Thread-8 ]: Finished running node model.airbyte_utils.transactions_in_ab2 2022-07-11 15:48:08 normalization > 15:47:51.703893 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.704295 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.705015 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.705908 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.706562 [debug] [Thread-4 ]: Began running node model.airbyte_utils.bank_config_stg 2022-07-11 15:48:08 normalization > 15:47:51.707209 [debug] [Thread-6 ]: Began running node model.airbyte_utils.transactions_in_stg 2022-07-11 15:48:08 normalization > 15:47:51.708176 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.files_out_ab2 2022-07-11 15:48:08 normalization > 15:47:51.709135 [debug] [Thread-7 ]: Finished running node model.airbyte_utils.partner_config_ab2 2022-07-11 15:48:08 normalization > 15:47:51.710038 [debug] [Thread-2 ]: Finished running node model.airbyte_utils.transactions_out_ab2 2022-07-11 15:48:08 normalization > 15:47:51.710967 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.files_in_ab2 2022-07-11 15:48:08 normalization > 15:47:51.711449 [info ] [Thread-4 ]: 1 of 18 START view model _airbyte_raw_achilles.bank_config_stg.......................................................... [RUN] 2022-07-11 15:48:08 normalization > 15:47:51.711875 [info ] [Thread-6 ]: 2 of 18 START view model _airbyte_raw_achilles.transactions_in_stg...................................................... 
[RUN] 2022-07-11 15:48:08 normalization > 15:47:51.712795 [debug] [Thread-8 ]: Began running node model.airbyte_utils.files_out_stg 2022-07-11 15:48:08 normalization > 15:47:51.713463 [debug] [Thread-5 ]: Began running node model.airbyte_utils.partner_config_stg 2022-07-11 15:48:08 normalization > 15:47:51.715842 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_stg" 2022-07-11 15:48:08 normalization > 15:47:51.716165 [debug] [Thread-7 ]: Began running node model.airbyte_utils.files_in_stg 2022-07-11 15:48:08 normalization > 15:47:51.717394 [debug] [Thread-6 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_stg" 2022-07-11 15:48:08 normalization > 15:47:51.717745 [debug] [Thread-1 ]: Began running node model.airbyte_utils.transactions_out_stg 2022-07-11 15:48:08 normalization > 15:47:51.718076 [info ] [Thread-8 ]: 3 of 18 START view model _airbyte_raw_achilles.files_out_stg............................................................ [RUN] 2022-07-11 15:48:08 normalization > 15:47:51.718576 [info ] [Thread-5 ]: 4 of 18 START view model _airbyte_raw_achilles.partner_config_stg....................................................... [RUN] 2022-07-11 15:48:08 normalization > 15:47:51.718885 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.bank_config_stg 2022-07-11 15:48:08 normalization > 15:47:51.719296 [info ] [Thread-7 ]: 5 of 18 START view model _airbyte_raw_achilles.files_in_stg............................................................. [RUN] 2022-07-11 15:48:08 normalization > 15:47:51.719604 [debug] [Thread-6 ]: Began compiling node model.airbyte_utils.transactions_in_stg 2022-07-11 15:48:08 normalization > 15:47:51.720005 [info ] [Thread-1 ]: 6 of 18 START view model _airbyte_raw_achilles.transactions_out_stg..................................................... 
[RUN] 2022-07-11 15:48:08 normalization > 15:47:51.721214 [debug] [Thread-8 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_stg" 2022-07-11 15:48:08 normalization > 15:47:51.722569 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_stg" 2022-07-11 15:48:08 normalization > 15:47:51.722926 [debug] [Thread-4 ]: Compiling model.airbyte_utils.bank_config_stg 2022-07-11 15:48:08 normalization > 15:47:51.724287 [debug] [Thread-7 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_stg" 2022-07-11 15:48:08 normalization > 15:47:51.724604 [debug] [Thread-6 ]: Compiling model.airbyte_utils.transactions_in_stg 2022-07-11 15:48:08 normalization > 15:47:51.725909 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_stg" 2022-07-11 15:48:08 normalization > 15:47:51.726289 [debug] [Thread-8 ]: Began compiling node model.airbyte_utils.files_out_stg 2022-07-11 15:48:08 normalization > 15:47:51.726603 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.partner_config_stg 2022-07-11 15:48:08 normalization > 15:47:51.747747 [debug] [Thread-7 ]: Began compiling node model.airbyte_utils.files_in_stg 2022-07-11 15:48:08 normalization > 15:47:51.766762 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_stg" 2022-07-11 15:48:08 normalization > 15:47:51.777800 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.transactions_out_stg 2022-07-11 15:48:08 normalization > 15:47:51.803877 [debug] [Thread-8 ]: Compiling model.airbyte_utils.files_out_stg 2022-07-11 15:48:08 normalization > 15:47:51.829856 [debug] [Thread-5 ]: Compiling model.airbyte_utils.partner_config_stg 2022-07-11 15:48:08 normalization > 15:47:51.855673 [debug] [Thread-7 ]: Compiling model.airbyte_utils.files_in_stg 2022-07-11 15:48:08 normalization > 15:47:51.892593 [debug] [Thread-1 ]: Compiling model.airbyte_utils.transactions_out_stg 2022-07-11 15:48:08 normalization > 15:47:51.961366 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:51.989654 [debug] [Thread-6 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_stg" 2022-07-11 15:48:08 normalization > 15:47:52.022258 [debug] [Thread-8 ]: Writing injected SQL for node "model.airbyte_utils.files_out_stg" 2022-07-11 15:48:08 normalization > 15:47:52.105770 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_stg" 2022-07-11 15:48:08 normalization > 15:47:52.106174 [debug] [Thread-4 ]: Began executing node model.airbyte_utils.bank_config_stg 2022-07-11 15:48:08 normalization > 15:47:52.175453 [debug] [Thread-7 ]: Writing injected SQL for node "model.airbyte_utils.files_in_stg" 2022-07-11 15:48:08 normalization > 15:47:52.192574 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:52.203768 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:52.224950 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:52.225379 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_stg" 2022-07-11 15:48:08 normalization > 15:47:52.246566 [debug] [Thread-8 ]: Began executing node model.airbyte_utils.files_out_stg 2022-07-11 15:48:08 normalization > 15:47:52.262011 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:52.271569 [debug] [Thread-4 ]: Writing 
runtime SQL for node "model.airbyte_utils.bank_config_stg" 2022-07-11 15:48:08 normalization > 15:47:52.272068 [debug] [Thread-6 ]: Began executing node model.airbyte_utils.transactions_in_stg 2022-07-11 15:48:08 normalization > 15:47:52.272430 [debug] [Thread-5 ]: Began executing node model.airbyte_utils.partner_config_stg 2022-07-11 15:48:08 normalization > 15:47:52.278576 [debug] [Thread-8 ]: Writing runtime SQL for node "model.airbyte_utils.files_out_stg" 2022-07-11 15:48:08 normalization > 15:47:52.279114 [debug] [Thread-7 ]: Began executing node model.airbyte_utils.files_in_stg 2022-07-11 15:48:08 normalization > 15:47:52.279373 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:52.284966 [debug] [Thread-6 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_in_stg" 2022-07-11 15:48:08 normalization > 15:47:52.292244 [debug] [Thread-5 ]: Writing runtime SQL for node "model.airbyte_utils.partner_config_stg" 2022-07-11 15:48:08 normalization > 15:47:52.297717 [debug] [Thread-7 ]: Writing runtime SQL for node "model.airbyte_utils.files_in_stg" 2022-07-11 15:48:08 normalization > 15:47:52.298391 [debug] [Thread-1 ]: Began executing node model.airbyte_utils.transactions_out_stg 2022-07-11 15:48:08 normalization > 15:47:52.298710 [debug] [Thread-4 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:47:52.299625 [debug] [Thread-8 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:47:52.300680 [debug] [Thread-6 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:47:52.301057 [debug] [Thread-5 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:47:52.301325 [debug] [Thread-7 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:47:52.306696 [debug] [Thread-1 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_out_stg" 2022-07-11 15:48:08 normalization > 15:47:52.316677 [debug] [Thread-4 ]: On model.airbyte_utils.bank_config_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_stg"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace view `mainapi-282501`._airbyte_raw_achilles.`bank_config_stg` 2022-07-11 15:48:08 normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as 2022-07-11 15:48:08 normalization > with __dbt__cte__bank_config_ab1 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_bank_config 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['name']") as name, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['config']") as config, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['created']") as created, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['updated']") as updated, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, 
"$['routing_no']") as routing_no, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from `mainapi-282501`.raw_achilles._airbyte_raw_bank_config as table_alias 2022-07-11 15:48:08 normalization > -- bank_config 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ), __dbt__cte__bank_config_ab2 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__bank_config_ab1 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > cast(name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as name, 2022-07-11 15:48:08 normalization > cast(config as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as config, 2022-07-11 15:48:08 normalization > cast(bank_id as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as bank_id, 2022-07-11 15:48:08 normalization > cast(nullif(created, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as created, 2022-07-11 15:48:08 normalization > cast(nullif(updated, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as updated, 2022-07-11 15:48:08 normalization > cast(routing_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as routing_no, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from __dbt__cte__bank_config_ab1 2022-07-11 15:48:08 normalization > -- bank_config 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > )-- SQL model to build a hash column based on the values of this record 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__bank_config_ab2 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > to_hex(md5(cast(concat(coalesce(cast(name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(config as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(bank_id as 2022-07-11 15:48:08 normalization > string 
2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(created as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(updated as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(routing_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), '')) as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ))) as _airbyte_bank_config_hashid, 2022-07-11 15:48:08 normalization > tmp.* 2022-07-11 15:48:08 normalization > from __dbt__cte__bank_config_ab2 tmp 2022-07-11 15:48:08 normalization > -- bank_config 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > ; 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:52.317139 [debug] [Thread-8 ]: On model.airbyte_utils.files_out_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_stg"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace view `mainapi-282501`._airbyte_raw_achilles.`files_out_stg` 2022-07-11 15:48:08 normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as 2022-07-11 15:48:08 normalization > with __dbt__cte__files_out_ab1 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_files_out 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['id']") as id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['created']") as created, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['updated']") as updated, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['file_hash']") as file_hash, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['file_name']") as file_name, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['batch_count']") as batch_count, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['exchange_window']") as exchange_window, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization 
> from `mainapi-282501`.raw_achilles._airbyte_raw_files_out as table_alias 2022-07-11 15:48:08 normalization > -- files_out 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ), __dbt__cte__files_out_ab2 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__files_out_ab1 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > cast(id as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as id, 2022-07-11 15:48:08 normalization > cast(bank_id as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as bank_id, 2022-07-11 15:48:08 normalization > cast(nullif(created, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as created, 2022-07-11 15:48:08 normalization > cast(nullif(updated, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as updated, 2022-07-11 15:48:08 normalization > cast(file_hash as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as file_hash, 2022-07-11 15:48:08 normalization > cast(file_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as file_name, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > cast(batch_count as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as batch_count, 2022-07-11 15:48:08 normalization > cast(nullif(exchange_window, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as exchange_window, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from __dbt__cte__files_out_ab1 2022-07-11 15:48:08 normalization > -- files_out 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > )-- SQL model to build a hash column based on the values of this record 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__files_out_ab2 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(bank_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(created as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(updated as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(file_hash as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', 
coalesce(cast(file_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(batch_count as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(exchange_window as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), '')) as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ))) as _airbyte_files_out_hashid, 2022-07-11 15:48:08 normalization > tmp.* 2022-07-11 15:48:08 normalization > from __dbt__cte__files_out_ab2 tmp 2022-07-11 15:48:08 normalization > -- files_out 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > ; 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:52.317438 [debug] [Thread-6 ]: On model.airbyte_utils.transactions_in_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_in_stg"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace view `mainapi-282501`._airbyte_raw_achilles.`transactions_in_stg` 2022-07-11 15:48:08 normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as 2022-07-11 15:48:08 normalization > with __dbt__cte__transactions_in_ab1 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_transactions_in 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['id']") as id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['uuid']") as uuid, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['amount']") as amount, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['created']") as created, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['updated']") as updated, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['returned']") as returned, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['sec_code']") as sec_code, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['file_hash']") as file_hash, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['file_name']") as file_name, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_02']") as addenda_02, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_05']") as addenda_05, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_10']") as addenda_10, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_11']") as 
addenda_11, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_12']") as addenda_12, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_13']") as addenda_13, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_14']") as addenda_14, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_15']") as addenda_15, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_16']") as addenda_16, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_17']") as addenda_17, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_18']") as addenda_18, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_98']") as addenda_98, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_99']") as addenda_99, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['batch_type']") as batch_type, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['company_id']") as company_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['partner_id']") as partner_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['external_id']") as external_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['return_data']") as return_data, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['batch_number']") as batch_number, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['company_name']") as company_name, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['future_dated']") as future_dated, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['originator_id']") as originator_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['receiving_dfi']") as receiving_dfi, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['dfi_account_no']") as dfi_account_no, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['effective_date']") as effective_date, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['entry_trace_no']") as entry_trace_no, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['individual_name']") as individual_name, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['originating_dfi']") as originating_dfi, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['settlement_date']") as settlement_date, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['individual_id_no']") as individual_id_no, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['transaction_code']") as transaction_code, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['processing_history']") as processing_history, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['transaction_out_id']") as transaction_out_id, 2022-07-11 
15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['addenda_record_count']") as addenda_record_count, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['destination_country_code']") as destination_country_code, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['company_entry_description']") as company_entry_description, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['destination_currency_code']") as destination_currency_code, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['originating_currency_code']") as originating_currency_code, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['foreign_exchange_indicator']") as foreign_exchange_indicator, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_in as table_alias 2022-07-11 15:48:08 normalization > -- transactions_in 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ), __dbt__cte__transactions_in_ab2 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__transactions_in_ab1 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > cast(id as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as id, 2022-07-11 15:48:08 normalization > cast(uuid as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as uuid, 2022-07-11 15:48:08 normalization > cast(amount as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as amount, 2022-07-11 15:48:08 normalization > cast(bank_id as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as bank_id, 2022-07-11 15:48:08 normalization > cast(nullif(created, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as created, 2022-07-11 15:48:08 normalization > cast(nullif(updated, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as updated, 2022-07-11 15:48:08 normalization > cast(returned as boolean) as returned, 2022-07-11 15:48:08 normalization > cast(sec_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as sec_code, 2022-07-11 15:48:08 normalization > cast(file_hash as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as file_hash, 2022-07-11 15:48:08 normalization > cast(file_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as file_name, 2022-07-11 15:48:08 normalization > cast(addenda_02 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_02, 2022-07-11 15:48:08 normalization > cast(addenda_05 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_05, 2022-07-11 15:48:08 normalization > cast(addenda_10 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_10, 2022-07-11 15:48:08 normalization > cast(addenda_11 as 2022-07-11 15:48:08 normalization > 
string 2022-07-11 15:48:08 normalization > ) as addenda_11, 2022-07-11 15:48:08 normalization > cast(addenda_12 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_12, 2022-07-11 15:48:08 normalization > cast(addenda_13 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_13, 2022-07-11 15:48:08 normalization > cast(addenda_14 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_14, 2022-07-11 15:48:08 normalization > cast(addenda_15 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_15, 2022-07-11 15:48:08 normalization > cast(addenda_16 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_16, 2022-07-11 15:48:08 normalization > cast(addenda_17 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_17, 2022-07-11 15:48:08 normalization > cast(addenda_18 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_18, 2022-07-11 15:48:08 normalization > cast(addenda_98 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_98, 2022-07-11 15:48:08 normalization > cast(addenda_99 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_99, 2022-07-11 15:48:08 normalization > cast(batch_type as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as batch_type, 2022-07-11 15:48:08 normalization > cast(company_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as company_id, 2022-07-11 15:48:08 normalization > cast(partner_id as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as partner_id, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > cast(external_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as external_id, 2022-07-11 15:48:08 normalization > cast(return_data as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as return_data, 2022-07-11 15:48:08 normalization > cast(batch_number as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as batch_number, 2022-07-11 15:48:08 normalization > cast(company_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as company_name, 2022-07-11 15:48:08 normalization > cast(future_dated as boolean) as future_dated, 2022-07-11 15:48:08 normalization > cast(originator_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as originator_id, 2022-07-11 15:48:08 normalization > cast(receiving_dfi as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as receiving_dfi, 2022-07-11 15:48:08 normalization > cast(dfi_account_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as dfi_account_no, 2022-07-11 15:48:08 normalization > cast(nullif(effective_date, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as effective_date, 2022-07-11 15:48:08 normalization > cast(entry_trace_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as entry_trace_no, 2022-07-11 15:48:08 
normalization > cast(individual_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as individual_name, 2022-07-11 15:48:08 normalization > cast(originating_dfi as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as originating_dfi, 2022-07-11 15:48:08 normalization > cast(nullif(settlement_date, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as settlement_date, 2022-07-11 15:48:08 normalization > cast(individual_id_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as individual_id_no, 2022-07-11 15:48:08 normalization > cast(transaction_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as transaction_code, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > cast(processing_history as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as processing_history, 2022-07-11 15:48:08 normalization > cast(transaction_out_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as transaction_out_id, 2022-07-11 15:48:08 normalization > cast(addenda_record_count as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as addenda_record_count, 2022-07-11 15:48:08 normalization > cast(destination_country_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as destination_country_code, 2022-07-11 15:48:08 normalization > cast(company_entry_description as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as company_entry_description, 2022-07-11 15:48:08 normalization > cast(destination_currency_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as destination_currency_code, 2022-07-11 15:48:08 normalization > cast(originating_currency_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as originating_currency_code, 2022-07-11 15:48:08 normalization > cast(foreign_exchange_indicator as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as foreign_exchange_indicator, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from __dbt__cte__transactions_in_ab1 2022-07-11 15:48:08 normalization > -- transactions_in 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > )-- SQL model to build a hash column based on the values of this record 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__transactions_in_ab2 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(uuid as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(amount as 2022-07-11 15:48:08 normalization > string 2022-07-11 
15:48:08 normalization > ), ''), '-', coalesce(cast(bank_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(created as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(updated as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(returned as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(sec_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(file_hash as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(file_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_02 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_05 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_10 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_11 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_12 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_13 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_14 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_15 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_16 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_17 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_18 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_98 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_99 as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(batch_type as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(company_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(partner_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(external_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(return_data as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(batch_number as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(company_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(future_dated as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(originator_id as 2022-07-11 
15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(receiving_dfi as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(dfi_account_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(effective_date as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(entry_trace_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(individual_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(originating_dfi as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(settlement_date as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(individual_id_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(transaction_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(processing_history as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(transaction_out_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(addenda_record_count as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(destination_country_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(company_entry_description as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(destination_currency_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(originating_currency_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(foreign_exchange_indicator as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), '')) as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ))) as _airbyte_transactions_in_hashid, 2022-07-11 15:48:08 normalization > tmp.* 2022-07-11 15:48:08 normalization > from __dbt__cte__transactions_in_ab2 tmp 2022-07-11 15:48:08 normalization > -- transactions_in 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > ; 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:52.317785 [debug] [Thread-5 ]: On model.airbyte_utils.partner_config_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_stg"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace view `mainapi-282501`._airbyte_raw_achilles.`partner_config_stg` 2022-07-11 15:48:08 normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as 2022-07-11 15:48:08 normalization > with 
__dbt__cte__partner_config_ab1 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_partner_config 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['name']") as name, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['config']") as config, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['created']") as created, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['updated']") as updated, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['partner_id']") as partner_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['routing_no']") as routing_no, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['account_prefix']") as account_prefix, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from `mainapi-282501`.raw_achilles._airbyte_raw_partner_config as table_alias 2022-07-11 15:48:08 normalization > -- partner_config 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ), __dbt__cte__partner_config_ab2 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__partner_config_ab1 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > cast(name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as name, 2022-07-11 15:48:08 normalization > cast(config as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as config, 2022-07-11 15:48:08 normalization > cast(bank_id as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as bank_id, 2022-07-11 15:48:08 normalization > cast(nullif(created, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as created, 2022-07-11 15:48:08 normalization > cast(nullif(updated, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as updated, 2022-07-11 15:48:08 normalization > cast(partner_id as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as partner_id, 2022-07-11 15:48:08 normalization > cast(routing_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as routing_no, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as 
_ab_cdc_lsn, 2022-07-11 15:48:08 normalization > cast(account_prefix as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as account_prefix, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from __dbt__cte__partner_config_ab1 2022-07-11 15:48:08 normalization > -- partner_config 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > )-- SQL model to build a hash column based on the values of this record 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__partner_config_ab2 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > to_hex(md5(cast(concat(coalesce(cast(name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(config as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(bank_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(created as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(updated as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(partner_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(routing_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(account_prefix as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), '')) as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ))) as _airbyte_partner_config_hashid, 2022-07-11 15:48:08 normalization > tmp.* 2022-07-11 15:48:08 normalization > from __dbt__cte__partner_config_ab2 tmp 2022-07-11 15:48:08 normalization > -- partner_config 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > ; 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:52.320578 [debug] [Thread-1 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:47:52.328846 [debug] [Thread-7 ]: On model.airbyte_utils.files_in_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_stg"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace view `mainapi-282501`._airbyte_raw_achilles.`files_in_stg` 2022-07-11 15:48:08 
normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as 2022-07-11 15:48:08 normalization > with __dbt__cte__files_in_ab1 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_files_in 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['id']") as id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['ended']") as ended, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['started']") as started, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['updated']") as updated, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['file_hash']") as file_hash, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['file_name']") as file_name, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['iat_entry_count']") as iat_entry_count, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['std_entry_count']") as std_entry_count, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['total_batch_count']") as total_batch_count, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['total_entry_count']") as total_entry_count, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['preprocessing_path']") as preprocessing_path, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['total_debit_amount']") as total_debit_amount, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['postprocessing_path']") as postprocessing_path, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['total_credit_amount']") as total_credit_amount, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['iat_entries_processed']") as iat_entries_processed, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['std_entries_processed']") as std_entries_processed, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from `mainapi-282501`.raw_achilles._airbyte_raw_files_in as table_alias 2022-07-11 15:48:08 normalization > -- files_in 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ), __dbt__cte__files_in_ab2 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__files_in_ab1 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > cast(id as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as id, 
2022-07-11 15:48:08 normalization > cast(nullif(ended, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as ended, 2022-07-11 15:48:08 normalization > cast(nullif(started, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as started, 2022-07-11 15:48:08 normalization > cast(nullif(updated, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as updated, 2022-07-11 15:48:08 normalization > cast(file_hash as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as file_hash, 2022-07-11 15:48:08 normalization > cast(file_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as file_name, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > cast(iat_entry_count as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as iat_entry_count, 2022-07-11 15:48:08 normalization > cast(std_entry_count as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as std_entry_count, 2022-07-11 15:48:08 normalization > cast(total_batch_count as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as total_batch_count, 2022-07-11 15:48:08 normalization > cast(total_entry_count as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as total_entry_count, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > cast(preprocessing_path as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as preprocessing_path, 2022-07-11 15:48:08 normalization > cast(total_debit_amount as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as total_debit_amount, 2022-07-11 15:48:08 normalization > cast(postprocessing_path as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as postprocessing_path, 2022-07-11 15:48:08 normalization > cast(total_credit_amount as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as total_credit_amount, 2022-07-11 15:48:08 normalization > cast(iat_entries_processed as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as iat_entries_processed, 2022-07-11 15:48:08 normalization > cast(std_entries_processed as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as std_entries_processed, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from __dbt__cte__files_in_ab1 2022-07-11 15:48:08 normalization > -- files_in 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > )-- SQL model to build a hash column based on the values of this record 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__files_in_ab2 2022-07-11 15:48:08 normalization > 
select 2022-07-11 15:48:08 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(ended as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(started as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(updated as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(file_hash as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(file_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(iat_entry_count as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(std_entry_count as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(total_batch_count as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(total_entry_count as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(preprocessing_path as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(total_debit_amount as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(postprocessing_path as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(total_credit_amount as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(iat_entries_processed as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(std_entries_processed as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), '')) as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ))) as _airbyte_files_in_hashid, 2022-07-11 15:48:08 normalization > tmp.* 2022-07-11 15:48:08 normalization > from __dbt__cte__files_in_ab2 tmp 2022-07-11 15:48:08 normalization > -- files_in 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > ; 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:52.342244 [debug] [Thread-1 ]: On model.airbyte_utils.transactions_out_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_stg"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace view `mainapi-282501`._airbyte_raw_achilles.`transactions_out_stg` 2022-07-11 15:48:08 normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as 2022-07-11 15:48:08 normalization > with __dbt__cte__transactions_out_ab1 as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- 
SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['id']") as id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['data']") as data, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['uuid']") as uuid, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['amount']") as amount, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['status']") as status, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['created']") as created, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['file_id']") as file_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['updated']") as updated, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['trace_no']") as trace_no, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['account_no']") as account_no, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['partner_id']") as partner_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['description']") as description, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['external_id']") as external_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['is_same_day']") as is_same_day, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['return_data']") as return_data, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['account_name']") as account_name, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['effective_date']") as effective_date, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['reference_info']") as reference_info, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['transaction_code']") as transaction_code, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['source_account_no']") as source_account_no, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['transaction_in_id']") as transaction_in_id, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['source_account_name']") as source_account_name, 2022-07-11 15:48:08 normalization > json_extract_scalar(_airbyte_data, "$['destination_bank_routing_no']") as destination_bank_routing_no, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out as table_alias 2022-07-11 15:48:08 normalization > -- transactions_out 
2022-07-11 15:48:08 normalization > where 1 = 1
2022-07-11 15:48:08 normalization >
2022-07-11 15:48:08 normalization > ), __dbt__cte__transactions_out_ab2 as (
2022-07-11 15:48:08 normalization >
2022-07-11 15:48:08 normalization > -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__transactions_out_ab1
2022-07-11 15:48:08 normalization > select
2022-07-11 15:48:08 normalization > cast(id as
2022-07-11 15:48:08 normalization > float64
2022-07-11 15:48:08 normalization > ) as id,
2022-07-11 15:48:08 normalization > cast(data as
2022-07-11 15:48:08 normalization > string
2022-07-11 15:48:08 normalization > ) as data,
2022-07-11 15:48:08 normalization > cast(uuid as
2022-07-11 15:48:08 normalization > string
2022-07-11 15:48:08 normalization > ) as uuid,
2022-07-11 15:48:08 normalization > cast(amount as
2022-07-11 15:48:08 normalization > float64
2022-07-11 15:48:08 normalization > ) as amount,
2022-07-11 15:48:08 normalization > cast(status as
2022-07-11 15:48:08 normalization > string
2022-07-11 15:48:08 normalization > ) as status,
2022-07-11 15:48:08 normalization > cast(bank_id as
2022-07-11 15:48:08 normalization > float64
2022-07-11 15:48:08 normalization > ) as bank_id,
2022-07-11 15:48:08 normalization > cast(nullif(created, '') as
2022-07-11 15:48:08 normalization > timestamp
2022-07-11 15:48:08 normalization > ) as created,
2022-07-11 15:48:08 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):161 - Completing future exceptionally...
io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:63) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:60) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    ... 3 more
    Suppressed: io.airbyte.workers.exception.WorkerException: Normalization process wasn't successful
        at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:162) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:48) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
2022-07-11 15:48:08 normalization > cast(file_id as
2022-07-11 15:48:08 normalization > float64
2022-07-11 15:48:08 normalization > ) as file_id,
2022-07-11 15:48:08 normalization > cast(nullif(updated, '') as
2022-07-11 15:48:08 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-11 15:48:08 normalization > timestamp
2022-07-11 15:48:08 normalization > ) as updated,
2022-07-11 15:48:08 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating...
2022-07-11 15:48:08 normalization > cast(trace_no as
2022-07-11 15:48:08 normalization > string
2022-07-11 15:48:08 normalization > ) as trace_no,
2022-07-11 15:48:08 normalization > cast(account_no as
2022-07-11 15:48:08 normalization > string
2022-07-11 15:48:08 normalization > ) as account_no,
2022-07-11 15:48:08 normalization > cast(partner_id as
2022-07-11 15:48:08 normalization > float64
2022-07-11 15:48:08 normalization > ) as partner_id,
2022-07-11 15:48:08 normalization > cast(_ab_cdc_lsn as
2022-07-11 15:48:08 normalization > float64
2022-07-11 15:48:08 WARN i.t.i.a.POJOActivityTaskHandler(activityFailureToResult):307 - Activity failure. ActivityId=3f836ec9-088a-3109-8e18-58f99df0f5f0, activityType=Normalize, attempt=1
java.lang.RuntimeException: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:289) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.normalize(NormalizationActivityImpl.java:75) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at jdk.internal.reflect.GeneratedMethodAccessor386.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityInboundCallsInterceptor.execute(POJOActivityTaskHandler.java:214) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityImplementation.execute(POJOActivityTaskHandler.java:180) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler.handle(POJOActivityTaskHandler.java:120) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:204) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:164) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.8.1.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.temporal.serviceclient.CheckedExceptionWrapper.wrap(CheckedExceptionWrapper.java:56) ~[temporal-serviceclient-1.8.1.jar:?]
    at io.temporal.internal.sync.WorkflowInternal.wrap(WorkflowInternal.java:448) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.activity.Activity.wrap(Activity.java:51) ~[temporal-sdk-1.8.1.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:138) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$3(NormalizationActivityImpl.java:103) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:284) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    ... 13 more
Caused by: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:132) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$3(NormalizationActivityImpl.java:103) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:284) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    ... 13 more
Caused by: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:63) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    ... 1 more
Caused by: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:60) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    ... 1 more
    Suppressed: io.airbyte.workers.exception.WorkerException: Normalization process wasn't successful
        at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:162) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:48) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
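Note on the generated SQL printed throughout this log: every *_stg model the normalization container echoes (the "normalization >" fragments above, which resume below this note) follows the same three-stage template produced by Airbyte's dbt normalization: an _ab1 CTE that parses the _airbyte_data JSON blob with json_extract_scalar, an _ab2 CTE that casts each extracted column to the type declared in the JSON schema, and a final select that builds an MD5 hash over all column values. The sketch below is not copied verbatim from the log; it condenses the logged transactions_out model down to three columns purely so the pattern is easier to read.

with __dbt__cte__transactions_out_ab1 as (
    -- stage 1: pull each field out of the raw JSON blob
    select
        json_extract_scalar(_airbyte_data, "$['id']") as id,
        json_extract_scalar(_airbyte_data, "$['amount']") as amount,
        json_extract_scalar(_airbyte_data, "$['created']") as created,
        _airbyte_ab_id,
        _airbyte_emitted_at,
        CURRENT_TIMESTAMP() as _airbyte_normalized_at
    from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out as table_alias
    where 1 = 1
), __dbt__cte__transactions_out_ab2 as (
    -- stage 2: cast the extracted strings to the declared column types
    select
        cast(id as float64) as id,
        cast(amount as float64) as amount,
        cast(nullif(created, '') as timestamp) as created,
        _airbyte_ab_id,
        _airbyte_emitted_at,
        CURRENT_TIMESTAMP() as _airbyte_normalized_at
    from __dbt__cte__transactions_out_ab1
    where 1 = 1
)
-- stage 3: hash every column value into a single change-detection key
select
    to_hex(md5(cast(concat(
        coalesce(cast(id as string), ''), '-',
        coalesce(cast(amount as string), ''), '-',
        coalesce(cast(created as string), '')
    ) as string))) as _airbyte_transactions_out_hashid,
    tmp.*
from __dbt__cte__transactions_out_ab2 tmp
where 1 = 1;

The full generated model does exactly this for every column listed in the log. The WorkerException traces above are the worker's generic "Normalization Failed" wrapper; the dbt debug output that continues below shows the individual view and incremental models being compiled and run.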
2022-07-11 15:48:08 normalization > ) as _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > cast(description as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as description, 2022-07-11 15:48:08 normalization > cast(external_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as external_id, 2022-07-11 15:48:08 normalization > cast(is_same_day as boolean) as is_same_day, 2022-07-11 15:48:08 normalization > cast(return_data as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as return_data, 2022-07-11 15:48:08 normalization > cast(account_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as account_name, 2022-07-11 15:48:08 normalization > cast(nullif(effective_date, '') as 2022-07-11 15:48:08 normalization > timestamp 2022-07-11 15:48:08 normalization > ) as effective_date, 2022-07-11 15:48:08 normalization > cast(reference_info as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as reference_info, 2022-07-11 15:48:08 normalization > cast(transaction_code as 2022-07-11 15:48:08 normalization > float64 2022-07-11 15:48:08 normalization > ) as transaction_code, 2022-07-11 15:48:08 normalization > cast(source_account_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as source_account_no, 2022-07-11 15:48:08 normalization > cast(transaction_in_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as transaction_in_id, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > cast(source_account_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as source_account_name, 2022-07-11 15:48:08 normalization > cast(destination_bank_routing_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ) as destination_bank_routing_no, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:48:08 normalization > from __dbt__cte__transactions_out_ab1 2022-07-11 15:48:08 normalization > -- transactions_out 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > )-- SQL model to build a hash column based on the values of this record 2022-07-11 15:48:08 normalization > -- depends_on: __dbt__cte__transactions_out_ab2 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(data as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(uuid as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(amount as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(status as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', 
coalesce(cast(bank_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(created as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(file_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(updated as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(trace_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(account_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(partner_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_lsn as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(description as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(external_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(is_same_day as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(return_data as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(account_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(effective_date as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(reference_info as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(transaction_code as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(source_account_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(transaction_in_id as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(source_account_name as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), ''), '-', coalesce(cast(destination_bank_routing_no as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ), '')) as 2022-07-11 15:48:08 normalization > string 2022-07-11 15:48:08 normalization > ))) as _airbyte_transactions_out_hashid, 2022-07-11 15:48:08 normalization > tmp.* 2022-07-11 15:48:08 normalization > from __dbt__cte__transactions_out_ab2 tmp 2022-07-11 15:48:08 normalization > -- transactions_out 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > ; 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:52.940127 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:52.941375 [info ] [Thread-4 ]: 1 of 18 OK created view model _airbyte_raw_achilles.bank_config_stg..................................................... 
2022-07-11 15:48:08 normalization > 15:47:52.941968 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.bank_config_stg
2022-07-11 15:48:08 normalization > 15:47:52.943038 [debug] [Thread-3 ]: Began running node model.airbyte_utils.bank_config_scd
2022-07-11 15:48:08 normalization > 15:47:52.943430 [info ] [Thread-3 ]: 7 of 18 START incremental model raw_achilles.bank_config_scd [RUN]
2022-07-11 15:48:08 normalization > 15:47:52.944558 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_scd"
2022-07-11 15:48:08 normalization > 15:47:52.944791 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.bank_config_scd
2022-07-11 15:48:08 normalization > 15:47:52.944998 [debug] [Thread-3 ]: Compiling model.airbyte_utils.bank_config_scd
2022-07-11 15:48:08 normalization > 15:47:52.975297 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_scd"
2022-07-11 15:48:08 normalization > 15:47:52.984821 [debug] [Thread-3 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:52.985137 [debug] [Thread-3 ]: Began executing node model.airbyte_utils.bank_config_scd
2022-07-11 15:48:08 normalization > 15:47:53.033765 [debug] [Thread-3 ]: Opening a new connection, currently in state closed
2022-07-11 15:48:08 normalization > 15:47:53.045253 [debug] [Thread-8 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:53.047393 [info ] [Thread-8 ]: 3 of 18 OK created view model _airbyte_raw_achilles.files_out_stg [OK in 1.33s]
2022-07-11 15:48:08 normalization > 15:47:53.050418 [debug] [Thread-1 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:53.054950 [info ] [Thread-1 ]: 6 of 18 OK created view model _airbyte_raw_achilles.transactions_out_stg [OK in 1.33s]
2022-07-11 15:48:08 normalization > 15:47:53.058059 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.transactions_out_stg
2022-07-11 15:48:08 normalization > 15:47:53.052333 [debug] [Thread-8 ]: Finished running node model.airbyte_utils.files_out_stg
2022-07-11 15:48:08 normalization > 15:47:53.059421 [debug] [Thread-4 ]: Began running node model.airbyte_utils.transactions_out_scd
2022-07-11 15:48:08 normalization > 15:47:53.060892 [info ] [Thread-4 ]: 8 of 18 START incremental model raw_achilles.transactions_out_scd [RUN]
2022-07-11 15:48:08 normalization > 15:47:53.062423 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_scd"
2022-07-11 15:48:08 normalization > 15:47:53.062755 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.transactions_out_scd
2022-07-11 15:48:08 normalization > 15:47:53.063039 [debug] [Thread-4 ]: Compiling model.airbyte_utils.transactions_out_scd
2022-07-11 15:48:08 normalization > 15:47:53.082558 [debug] [Thread-1 ]: Began running node model.airbyte_utils.files_out_scd
2022-07-11 15:48:08 normalization > 15:47:53.098491 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_scd"
2022-07-11 15:48:08 normalization > 15:47:53.099536 [info ] [Thread-1 ]: 9 of 18 START incremental model raw_achilles.files_out_scd [RUN]
2022-07-11 15:48:08 normalization > 15:47:53.104224 [debug] [Thread-5 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:53.104564 [debug] [Thread-4 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:53.107963 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_scd"
2022-07-11 15:48:08 normalization > 15:47:53.118693 [info ] [Thread-5 ]: 4 of 18 OK created view model _airbyte_raw_achilles.partner_config_stg [OK in 1.40s]
2022-07-11 15:48:08 normalization > 15:47:53.124436 [debug] [Thread-7 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:53.128012 [info ] [Thread-7 ]: 5 of 18 OK created view model _airbyte_raw_achilles.files_in_stg [OK in 1.40s]
2022-07-11 15:48:08 normalization > 15:47:53.125459 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.files_out_scd
2022-07-11 15:48:08 normalization > 15:47:53.129320 [debug] [Thread-1 ]: Compiling model.airbyte_utils.files_out_scd
2022-07-11 15:48:08 normalization > 15:47:53.124857 [debug] [Thread-4 ]: Began executing node model.airbyte_utils.transactions_out_scd
2022-07-11 15:48:08 normalization > 15:47:53.126436 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.partner_config_stg
2022-07-11 15:48:08 normalization > 15:47:53.150506 [debug] [Thread-4 ]: Opening a new connection, currently in state closed
2022-07-11 15:48:08 normalization > 15:47:53.151221 [debug] [Thread-7 ]: Finished running node model.airbyte_utils.files_in_stg
2022-07-11 15:48:08 normalization > 15:47:53.174236 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.files_out_scd"
2022-07-11 15:48:08 normalization > 15:47:53.175280 [debug] [Thread-8 ]: Began running node model.airbyte_utils.partner_config_scd
2022-07-11 15:48:08 normalization > 15:47:53.177211 [info ] [Thread-8 ]: 10 of 18 START incremental model raw_achilles.partner_config_scd [RUN]
2022-07-11 15:48:08 normalization > 15:47:53.176337 [debug] [Thread-5 ]: Began running node model.airbyte_utils.files_in_scd
2022-07-11 15:48:08 normalization > 15:47:53.178714 [info ] [Thread-5 ]: 11 of 18 START incremental model raw_achilles.files_in_scd [RUN]
2022-07-11 15:48:08 normalization > 15:47:53.180227 [debug] [Thread-8 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_scd"
2022-07-11 15:48:08 normalization > 15:47:53.180786 [debug] [Thread-8 ]: Began compiling node model.airbyte_utils.partner_config_scd
2022-07-11 15:48:08 normalization > 15:47:53.177996 [debug] [Thread-1 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:53.181955 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_scd"
2022-07-11 15:48:08 normalization > 15:47:53.182410 [debug] [Thread-8 ]: Compiling model.airbyte_utils.partner_config_scd
2022-07-11 15:48:08 normalization > 15:47:53.182912 [debug] [Thread-1 ]: Began executing node model.airbyte_utils.files_out_scd
2022-07-11 15:48:08 normalization > 15:47:53.183300 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.files_in_scd
2022-07-11 15:48:08 normalization > 15:47:53.217088 [debug] [Thread-1 ]: Opening a new connection, currently in state closed
2022-07-11 15:48:08 normalization > 15:47:53.220935 [debug] [Thread-8 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_scd"
2022-07-11 15:48:08 normalization > 15:47:53.227526 [debug] [Thread-6 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:53.227979 [debug] [Thread-5 ]: Compiling model.airbyte_utils.files_in_scd
2022-07-11 15:48:08 normalization > 15:47:53.228398 [debug] [Thread-3 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/bank_config_scd?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.bank_config_scd
2022-07-11 15:48:08 normalization > 15:47:53.231118 [info ] [Thread-6 ]: 2 of 18 OK created view model _airbyte_raw_achilles.transactions_in_stg [OK in 1.51s]
2022-07-11 15:48:08 normalization > 15:47:53.231645 [debug] [Thread-8 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:53.261203 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.files_in_scd"
2022-07-11 15:48:08 normalization > 15:47:53.279163 [info ] [Thread-3 ]: 15:47:53 + `mainapi-282501`.raw_achilles.`bank_config_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:48:08 normalization > 15:47:53.282318 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.transactions_in_stg
2022-07-11 15:48:08 normalization > 15:47:53.283218 [debug] [Thread-8 ]: Began executing node model.airbyte_utils.partner_config_scd
2022-07-11 15:48:08 normalization > 15:47:53.288411 [debug] [Thread-5 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:53.311268 [debug] [Thread-7 ]: Began running node model.airbyte_utils.transactions_in_scd
2022-07-11 15:48:08 normalization > 15:47:53.330178 [debug] [Thread-8 ]: Opening a new connection, currently in state closed
2022-07-11 15:48:08 normalization > 15:47:53.356507 [debug] [Thread-5 ]: Began executing node model.airbyte_utils.files_in_scd
2022-07-11 15:48:08 normalization > 15:47:53.360154 [debug] [Thread-3 ]: Writing runtime SQL for node "model.airbyte_utils.bank_config_scd"
2022-07-11 15:48:08 normalization > 15:47:53.361013 [info ] [Thread-7 ]: 12 of 18 START incremental model raw_achilles.transactions_in_scd [RUN]
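The get_columns_in_relation 404 and the "_airbyte_ab_id does not exist yet" notice above are dbt probing for an existing *_scd target; since none is found (a first run, or the tables were dropped), it falls back to a full create instead of an incremental merge. A minimal sketch, using the project and dataset names from this log, to confirm from the BigQuery side which SCD targets already exist:

    -- List the *_scd tables currently present in the target dataset.
    select table_name
    from `mainapi-282501`.raw_achilles.INFORMATION_SCHEMA.TABLES
    where table_name like '%_scd'
    order by table_name;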
2022-07-11 15:48:08 normalization > 15:47:53.375713 [debug] [Thread-7 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_scd"
2022-07-11 15:48:08 normalization > 15:47:53.368932 [debug] [Thread-5 ]: Opening a new connection, currently in state closed
2022-07-11 15:48:08 normalization > 15:47:53.376507 [debug] [Thread-7 ]: Began compiling node model.airbyte_utils.transactions_in_scd
2022-07-11 15:48:08 normalization > 15:47:53.377541 [debug] [Thread-3 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_scd"} */
    create or replace table `mainapi-282501`.raw_achilles.`bank_config_scd`
    partition by range_bucket(_airbyte_active_row, generate_array(0, 1, 1))
    cluster by _airbyte_unique_key_scd, _airbyte_emitted_at
    OPTIONS()
    as (
    -- depends_on: ref('bank_config_stg')
    with
    input_data as (
        select *
        from `mainapi-282501`._airbyte_raw_achilles.`bank_config_stg`
        -- bank_config from `mainapi-282501`.raw_achilles._airbyte_raw_bank_config
    ),
    scd_data as (
        -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key
        select
          to_hex(md5(cast(concat(coalesce(cast(bank_id as string), '')) as string))) as _airbyte_unique_key,
          name, config, bank_id, created, updated, routing_no,
          _ab_cdc_lsn, _ab_cdc_deleted_at, _ab_cdc_updated_at,
          updated as _airbyte_start_at,
          lag(updated) over (
            partition by cast(bank_id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) as _airbyte_end_at,
          case when row_number() over (
            partition by cast(bank_id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row,
          _airbyte_ab_id, _airbyte_emitted_at, _airbyte_bank_config_hashid
        from input_data
    ),
    dedup_data as (
        select
          -- we need to ensure de-duplicated rows for merge/update queries
          -- additionally, we generate a unique key for the scd table
          row_number() over (
            partition by _airbyte_unique_key, _airbyte_start_at, _airbyte_emitted_at,
              cast(_ab_cdc_deleted_at as string), cast(_ab_cdc_updated_at as string)
            order by _airbyte_active_row desc, _airbyte_ab_id
          ) as _airbyte_row_num,
          to_hex(md5(cast(concat(
            coalesce(cast(_airbyte_unique_key as string), ''), '-', coalesce(cast(_airbyte_start_at as string), ''), '-',
            coalesce(cast(_airbyte_emitted_at as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-',
            coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_unique_key_scd,
          scd_data.*
        from scd_data
    )
    select
      _airbyte_unique_key, _airbyte_unique_key_scd,
      name, config, bank_id, created, updated, routing_no,
      _ab_cdc_lsn, _ab_cdc_deleted_at, _ab_cdc_updated_at,
      _airbyte_start_at, _airbyte_end_at, _airbyte_active_row,
      _airbyte_ab_id, _airbyte_emitted_at,
      CURRENT_TIMESTAMP() as _airbyte_normalized_at,
      _airbyte_bank_config_hashid
    from dedup_data where _airbyte_row_num = 1
    );
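In the generated model above, _airbyte_active_row marks the current version of each bank_id and also serves as the integer range partitioning column, so the usual consumer query only needs one partition. A minimal sketch of that read path, using only names that appear in this log:

    -- Current version of each bank_config record; the _airbyte_active_row = 1
    -- predicate should also prune the range_bucket partition.
    select bank_id, name, routing_no, _airbyte_start_at, _airbyte_end_at
    from `mainapi-282501`.raw_achilles.`bank_config_scd`
    where _airbyte_active_row = 1;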
2022-07-11 15:48:08 normalization > 15:47:53.378005 [debug] [Thread-7 ]: Compiling model.airbyte_utils.transactions_in_scd
2022-07-11 15:48:08 normalization > 15:47:53.420011 [debug] [Thread-7 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_scd"
2022-07-11 15:48:08 normalization > 15:47:53.421240 [debug] [Thread-7 ]: finished collecting timing info
2022-07-11 15:48:08 normalization > 15:47:53.421607 [debug] [Thread-7 ]: Began executing node model.airbyte_utils.transactions_in_scd
2022-07-11 15:48:08 normalization > 15:47:53.426194 [debug] [Thread-7 ]: Opening a new connection, currently in state closed
2022-07-11 15:48:08 normalization > 15:47:53.452783 [debug] [Thread-4 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/transactions_out_scd?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.transactions_out_scd
2022-07-11 15:48:08 normalization > 15:47:53.456293 [info ] [Thread-4 ]: 15:47:53 + `mainapi-282501`.raw_achilles.`transactions_out_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:48:08 normalization > 15:47:53.461627 [debug] [Thread-4 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_out_scd"
2022-07-11 15:48:08 normalization > 15:47:53.463636 [debug] [Thread-4 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_scd"} */
    create or replace table `mainapi-282501`.raw_achilles.`transactions_out_scd`
    partition by range_bucket(_airbyte_active_row, generate_array(0, 1, 1))
    cluster by _airbyte_unique_key_scd, _airbyte_emitted_at
    OPTIONS()
    as (
    -- depends_on: ref('transactions_out_stg')
    with
    input_data as (
        select *
        from `mainapi-282501`._airbyte_raw_achilles.`transactions_out_stg`
        -- transactions_out from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out
    ),
    scd_data as (
        -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key
        select
          to_hex(md5(cast(concat(coalesce(cast(id as string), '')) as string))) as _airbyte_unique_key,
          id, data, uuid, amount, status, bank_id, created, file_id, updated,
          trace_no, account_no, partner_id, _ab_cdc_lsn, description, external_id,
          is_same_day, return_data, account_name, effective_date, reference_info,
          transaction_code, source_account_no, transaction_in_id,
          _ab_cdc_deleted_at, _ab_cdc_updated_at, source_account_name, destination_bank_routing_no,
          updated as _airbyte_start_at,
          lag(updated) over (
            partition by cast(id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) as _airbyte_end_at,
          case when row_number() over (
            partition by cast(id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row,
          _airbyte_ab_id, _airbyte_emitted_at, _airbyte_transactions_out_hashid
        from input_data
    ),
    dedup_data as (
        select
          -- we need to ensure de-duplicated rows for merge/update queries
          -- additionally, we generate a unique key for the scd table
          row_number() over (
            partition by _airbyte_unique_key, _airbyte_start_at, _airbyte_emitted_at,
              cast(_ab_cdc_deleted_at as string), cast(_ab_cdc_updated_at as string)
            order by _airbyte_active_row desc, _airbyte_ab_id
          ) as _airbyte_row_num,
          to_hex(md5(cast(concat(
            coalesce(cast(_airbyte_unique_key as string), ''), '-', coalesce(cast(_airbyte_start_at as string), ''), '-',
            coalesce(cast(_airbyte_emitted_at as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-',
            coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_unique_key_scd,
          scd_data.*
        from scd_data
    )
    select
      _airbyte_unique_key, _airbyte_unique_key_scd,
      id, data, uuid, amount, status, bank_id, created, file_id, updated,
      trace_no, account_no, partner_id, _ab_cdc_lsn, description, external_id,
      is_same_day, return_data, account_name, effective_date, reference_info,
      transaction_code, source_account_no, transaction_in_id,
      _ab_cdc_deleted_at, _ab_cdc_updated_at, source_account_name, destination_bank_routing_no,
      _airbyte_start_at, _airbyte_end_at, _airbyte_active_row,
      _airbyte_ab_id, _airbyte_emitted_at,
      CURRENT_TIMESTAMP() as _airbyte_normalized_at,
      _airbyte_transactions_out_hashid
    from dedup_data where _airbyte_row_num = 1
    );
2022-07-11 15:48:08 normalization > 15:47:53.472916 [debug] [Thread-1 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/files_out_scd?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.files_out_scd
2022-07-11 15:48:08 normalization > 15:47:53.474740 [info ] [Thread-1 ]: 15:47:53 + `mainapi-282501`.raw_achilles.`files_out_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
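The dedup_data CTE plus the final "where _airbyte_row_num = 1" filter in these models is what is supposed to guarantee at most one active row per primary key in each *_scd table. A minimal post-run sanity check for that invariant, using the transactions_out_scd name from this log (the same query can be repeated for the other SCD tables):

    -- Any result here means duplicate active versions of the same primary key.
    select _airbyte_unique_key, count(*) as active_versions
    from `mainapi-282501`.raw_achilles.`transactions_out_scd`
    where _airbyte_active_row = 1
    group by _airbyte_unique_key
    having count(*) > 1;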
2022-07-11 15:48:08 normalization > 15:47:53.479510 [debug] [Thread-1 ]: Writing runtime SQL for node "model.airbyte_utils.files_out_scd"
2022-07-11 15:48:08 normalization > 15:47:53.484148 [debug] [Thread-1 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */
    create or replace table `mainapi-282501`.raw_achilles.`files_out_scd`
    partition by range_bucket(_airbyte_active_row, generate_array(0, 1, 1))
    cluster by _airbyte_unique_key_scd, _airbyte_emitted_at
    OPTIONS()
    as (
    -- depends_on: ref('files_out_stg')
    with
    input_data as (
        select *
        from `mainapi-282501`._airbyte_raw_achilles.`files_out_stg`
        -- files_out from `mainapi-282501`.raw_achilles._airbyte_raw_files_out
    ),
    scd_data as (
        -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key
        select
          to_hex(md5(cast(concat(coalesce(cast(id as string), '')) as string))) as _airbyte_unique_key,
          id, bank_id, created, updated, file_hash, file_name,
          _ab_cdc_lsn, batch_count, exchange_window, _ab_cdc_deleted_at, _ab_cdc_updated_at,
          updated as _airbyte_start_at,
          lag(updated) over (
            partition by cast(id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) as _airbyte_end_at,
          case when row_number() over (
            partition by cast(id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row,
          _airbyte_ab_id, _airbyte_emitted_at, _airbyte_files_out_hashid
        from input_data
    ),
    dedup_data as (
        select
          -- we need to ensure de-duplicated rows for merge/update queries
          -- additionally, we generate a unique key for the scd table
          row_number() over (
            partition by _airbyte_unique_key, _airbyte_start_at, _airbyte_emitted_at,
              cast(_ab_cdc_deleted_at as string), cast(_ab_cdc_updated_at as string)
            order by _airbyte_active_row desc, _airbyte_ab_id
          ) as _airbyte_row_num,
          to_hex(md5(cast(concat(
            coalesce(cast(_airbyte_unique_key as string), ''), '-', coalesce(cast(_airbyte_start_at as string), ''), '-',
            coalesce(cast(_airbyte_emitted_at as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-',
            coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_unique_key_scd,
          scd_data.*
        from scd_data
    )
    select
      _airbyte_unique_key, _airbyte_unique_key_scd,
      id, bank_id, created, updated, file_hash, file_name,
      _ab_cdc_lsn, batch_count, exchange_window, _ab_cdc_deleted_at, _ab_cdc_updated_at,
      _airbyte_start_at, _airbyte_end_at, _airbyte_active_row,
      _airbyte_ab_id, _airbyte_emitted_at,
      CURRENT_TIMESTAMP() as _airbyte_normalized_at,
      _airbyte_files_out_hashid
    from dedup_data where _airbyte_row_num = 1
    );
2022-07-11 15:48:08 normalization > 15:47:53.567113 [debug] [Thread-8 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/partner_config_scd?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.partner_config_scd
2022-07-11 15:48:08 normalization > 15:47:53.568863 [info ] [Thread-8 ]: 15:47:53 + `mainapi-282501`.raw_achilles.`partner_config_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:48:08 normalization > 15:47:53.573223 [debug] [Thread-8 ]: Writing runtime SQL for node "model.airbyte_utils.partner_config_scd"
2022-07-11 15:48:08 normalization > 15:47:53.573715 [debug] [Thread-8 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */
    create or replace table `mainapi-282501`.raw_achilles.`partner_config_scd`
    partition by range_bucket(_airbyte_active_row, generate_array(0, 1, 1))
    cluster by _airbyte_unique_key_scd, _airbyte_emitted_at
    OPTIONS()
    as (
    -- depends_on: ref('partner_config_stg')
    with
    input_data as (
        select *
        from `mainapi-282501`._airbyte_raw_achilles.`partner_config_stg`
        -- partner_config from `mainapi-282501`.raw_achilles._airbyte_raw_partner_config
    ),
    scd_data as (
        -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key
        select
          to_hex(md5(cast(concat(coalesce(cast(partner_id as string), '')) as string))) as _airbyte_unique_key,
          name, config, bank_id, created, updated, partner_id, routing_no,
          _ab_cdc_lsn, account_prefix, _ab_cdc_deleted_at, _ab_cdc_updated_at,
          updated as _airbyte_start_at,
          lag(updated) over (
            partition by cast(partner_id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) as _airbyte_end_at,
          case when row_number() over (
            partition by cast(partner_id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row,
          _airbyte_ab_id, _airbyte_emitted_at, _airbyte_partner_config_hashid
        from input_data
    ),
    dedup_data as (
        select
          -- we need to ensure de-duplicated rows for merge/update queries
          -- additionally, we generate a unique key for the scd table
          row_number() over (
            partition by _airbyte_unique_key, _airbyte_start_at, _airbyte_emitted_at,
              cast(_ab_cdc_deleted_at as string), cast(_ab_cdc_updated_at as string)
            order by _airbyte_active_row desc, _airbyte_ab_id
          ) as _airbyte_row_num,
          to_hex(md5(cast(concat(
            coalesce(cast(_airbyte_unique_key as string), ''), '-', coalesce(cast(_airbyte_start_at as string), ''), '-',
            coalesce(cast(_airbyte_emitted_at as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-',
            coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_unique_key_scd,
          scd_data.*
        from scd_data
    )
    select
      _airbyte_unique_key, _airbyte_unique_key_scd,
      name, config, bank_id, created, updated, partner_id, routing_no,
      _ab_cdc_lsn, account_prefix, _ab_cdc_deleted_at, _ab_cdc_updated_at,
      _airbyte_start_at, _airbyte_end_at, _airbyte_active_row,
      _airbyte_ab_id, _airbyte_emitted_at,
      CURRENT_TIMESTAMP() as _airbyte_normalized_at,
      _airbyte_partner_config_hashid
    from dedup_data where _airbyte_row_num = 1
    );
2022-07-11 15:48:08 normalization > 15:47:53.580605 [debug] [Thread-5 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/files_in_scd?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.files_in_scd
2022-07-11 15:48:08 normalization > 15:47:53.582392 [info ] [Thread-5 ]: 15:47:53 + `mainapi-282501`.raw_achilles.`files_in_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:48:08 normalization > 15:47:53.583660 [debug] [Thread-7 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/transactions_in_scd?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.transactions_in_scd
2022-07-11 15:48:08 normalization > 15:47:53.588600 [debug] [Thread-5 ]: Writing runtime SQL for node "model.airbyte_utils.files_in_scd"
2022-07-11 15:48:08 normalization > 15:47:53.590300 [info ] [Thread-7 ]: 15:47:53 + `mainapi-282501`.raw_achilles.`transactions_in_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:48:08 normalization > 15:47:53.595065 [debug] [Thread-7 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_in_scd"
2022-07-11 15:48:08 normalization > 15:47:53.595631 [debug] [Thread-5 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */
    create or replace table `mainapi-282501`.raw_achilles.`files_in_scd`
    partition by range_bucket(_airbyte_active_row, generate_array(0, 1, 1))
    cluster by _airbyte_unique_key_scd, _airbyte_emitted_at
    OPTIONS()
    as (
    -- depends_on: ref('files_in_stg')
    with
    input_data as (
        select *
        from `mainapi-282501`._airbyte_raw_achilles.`files_in_stg`
        -- files_in from `mainapi-282501`.raw_achilles._airbyte_raw_files_in
    ),
    scd_data as (
        -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key
        select
          to_hex(md5(cast(concat(coalesce(cast(id as string), '')) as string))) as _airbyte_unique_key,
          id, ended, started, updated, file_hash, file_name, _ab_cdc_lsn,
          iat_entry_count, std_entry_count, total_batch_count, total_entry_count,
          _ab_cdc_deleted_at, _ab_cdc_updated_at, preprocessing_path, total_debit_amount,
          postprocessing_path, total_credit_amount, iat_entries_processed, std_entries_processed,
          updated as _airbyte_start_at,
          lag(updated) over (
            partition by cast(id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) as _airbyte_end_at,
          case when row_number() over (
            partition by cast(id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row,
          _airbyte_ab_id, _airbyte_emitted_at, _airbyte_files_in_hashid
        from input_data
    ),
    dedup_data as (
        select
          -- we need to ensure de-duplicated rows for merge/update queries
          -- additionally, we generate a unique key for the scd table
          row_number() over (
            partition by _airbyte_unique_key, _airbyte_start_at, _airbyte_emitted_at,
              cast(_ab_cdc_deleted_at as string), cast(_ab_cdc_updated_at as string)
            order by _airbyte_active_row desc, _airbyte_ab_id
          ) as _airbyte_row_num,
          to_hex(md5(cast(concat(
            coalesce(cast(_airbyte_unique_key as string), ''), '-', coalesce(cast(_airbyte_start_at as string), ''), '-',
            coalesce(cast(_airbyte_emitted_at as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-',
            coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_unique_key_scd,
          scd_data.*
        from scd_data
    )
    select
      _airbyte_unique_key, _airbyte_unique_key_scd,
      id, ended, started, updated, file_hash, file_name, _ab_cdc_lsn,
      iat_entry_count, std_entry_count, total_batch_count, total_entry_count,
      _ab_cdc_deleted_at, _ab_cdc_updated_at, preprocessing_path, total_debit_amount,
      postprocessing_path, total_credit_amount, iat_entries_processed, std_entries_processed,
      _airbyte_start_at, _airbyte_end_at, _airbyte_active_row,
      _airbyte_ab_id, _airbyte_emitted_at,
      CURRENT_TIMESTAMP() as _airbyte_normalized_at,
      _airbyte_files_in_hashid
    from dedup_data where _airbyte_row_num = 1
    );
2022-07-11 15:48:08 normalization > 15:47:53.595975 [debug] [Thread-7 ]: On model.airbyte_utils.transactions_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_in_scd"} */
    create or replace table `mainapi-282501`.raw_achilles.`transactions_in_scd`
    partition by range_bucket(_airbyte_active_row, generate_array(0, 1, 1))
    cluster by _airbyte_unique_key_scd, _airbyte_emitted_at
    OPTIONS()
    as (
    -- depends_on: ref('transactions_in_stg')
    with
    input_data as (
        select *
        from `mainapi-282501`._airbyte_raw_achilles.`transactions_in_stg`
        -- transactions_in from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_in
    ),
    scd_data as (
        -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key
        select
          to_hex(md5(cast(concat(coalesce(cast(id as string), '')) as string))) as _airbyte_unique_key,
          id, uuid, amount, bank_id, created, updated, returned, sec_code, file_hash, file_name,
          addenda_02, addenda_05,
          addenda_10, addenda_11, addenda_12, addenda_13, addenda_14, addenda_15,
          addenda_16, addenda_17, addenda_18, addenda_98, addenda_99,
          batch_type, company_id, partner_id, _ab_cdc_lsn, external_id, return_data,
          batch_number, company_name, future_dated, originator_id, receiving_dfi,
          dfi_account_no, effective_date, entry_trace_no, individual_name, originating_dfi,
          settlement_date, individual_id_no, transaction_code, _ab_cdc_deleted_at, _ab_cdc_updated_at,
          processing_history, transaction_out_id, addenda_record_count, destination_country_code,
          company_entry_description, destination_currency_code, originating_currency_code, foreign_exchange_indicator,
          updated as _airbyte_start_at,
          lag(updated) over (
            partition by cast(id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) as _airbyte_end_at,
          case when row_number() over (
            partition by cast(id as string)
            order by updated is null asc, updated desc, _ab_cdc_updated_at desc, _airbyte_emitted_at desc
          ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row,
          _airbyte_ab_id, _airbyte_emitted_at, _airbyte_transactions_in_hashid
        from input_data
    ),
    dedup_data as (
        select
          -- we need to ensure de-duplicated rows for merge/update queries
          -- additionally, we generate a unique key for the scd table
          row_number() over (
            partition by _airbyte_unique_key, _airbyte_start_at, _airbyte_emitted_at,
              cast(_ab_cdc_deleted_at as string), cast(_ab_cdc_updated_at as string)
            order by _airbyte_active_row desc, _airbyte_ab_id
          ) as _airbyte_row_num,
          to_hex(md5(cast(concat(
            coalesce(cast(_airbyte_unique_key as string), ''), '-', coalesce(cast(_airbyte_start_at as string), ''), '-',
            coalesce(cast(_airbyte_emitted_at as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-',
            coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_unique_key_scd,
          scd_data.*
        from scd_data
    )
    select
      _airbyte_unique_key, _airbyte_unique_key_scd,
      id, uuid, amount, bank_id, created, updated, returned, sec_code, file_hash, file_name,
      addenda_02, addenda_05, addenda_10, addenda_11, addenda_12, addenda_13, addenda_14, addenda_15,
      addenda_16, addenda_17, addenda_18, addenda_98, addenda_99,
      batch_type, company_id, partner_id, _ab_cdc_lsn, external_id, return_data,
      batch_number, company_name, future_dated, originator_id, receiving_dfi,
      dfi_account_no, effective_date, entry_trace_no, individual_name, originating_dfi,
      settlement_date, individual_id_no, transaction_code, _ab_cdc_deleted_at, _ab_cdc_updated_at,
      processing_history, transaction_out_id, addenda_record_count, destination_country_code,
      company_entry_description, destination_currency_code, originating_currency_code, foreign_exchange_indicator,
      _airbyte_start_at, _airbyte_end_at, _airbyte_active_row,
      _airbyte_ab_id, _airbyte_emitted_at,
      CURRENT_TIMESTAMP() as _airbyte_normalized_at,
      _airbyte_transactions_in_hashid
    from dedup_data where _airbyte_row_num = 1
    );
2022-07-11 15:48:08 normalization > 15:47:54.482052 [debug] [Thread-7 ]: BigQuery adapter: Retry attempt 1 of 3 after error: BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"')
2022-07-11 15:48:08 normalization > 15:47:56.283426 [debug] [Thread-1 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */
    -- We have to have a non-empty query, so just do a noop delete
    delete from `mainapi-282501`.raw_achilles.`files_out_scd` where 1=0
2022-07-11 15:48:08 normalization > 15:47:56.385693 [debug] [Thread-8 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */
    -- We have to have a non-empty query, so just do a noop delete
    delete from `mainapi-282501`.raw_achilles.`partner_config_scd` where 1=0
2022-07-11 15:48:08 normalization > 15:47:56.406771 [debug] [Thread-3 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_scd"} */
    -- We have to have a non-empty query, so just do a noop delete
    delete from `mainapi-282501`.raw_achilles.`bank_config_scd` where 1=0
2022-07-11 15:48:08 normalization > 15:47:56.435157
[debug] [Thread-4 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_scd"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- We have to have a non-empty query, so just do a noop delete 2022-07-11 15:48:08 normalization > delete from `mainapi-282501`.raw_achilles.`transactions_out_scd` where 1=0 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:56.436272 [debug] [Thread-7 ]: BigQuery adapter: Retry attempt 2 of 3 after error: BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"') 2022-07-11 15:48:08 normalization > 15:47:56.579038 [debug] [Thread-5 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- We have to have a non-empty query, so just do a noop delete 2022-07-11 15:48:08 normalization > delete from `mainapi-282501`.raw_achilles.`files_in_scd` where 1=0 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:58.381567 [debug] [Thread-7 ]: BigQuery adapter: Retry attempt 3 of 3 after error: BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"') 2022-07-11 15:48:08 normalization > 15:47:58.761372 [debug] [Thread-3 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_scd"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > drop view _airbyte_raw_achilles.bank_config_stg 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:58.858973 [debug] [Thread-8 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > drop view _airbyte_raw_achilles.partner_config_stg 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:59.116848 [debug] [Thread-4 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_scd"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > drop view _airbyte_raw_achilles.transactions_out_stg 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:59.192704 [debug] [Thread-1 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > drop view _airbyte_raw_achilles.files_out_stg 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:59.283184 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:59.284369 [info ] [Thread-3 ]: 7 of 18 OK created incremental model raw_achilles.bank_config_scd....................................................... 
[CREATE TABLE (3.0 rows, 1.9 KB processed) in 6.34s] 2022-07-11 15:48:08 normalization > 15:47:59.284928 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.bank_config_scd 2022-07-11 15:48:08 normalization > 15:47:59.286390 [debug] [Thread-6 ]: Began running node model.airbyte_utils.bank_config 2022-07-11 15:48:08 normalization > 15:47:59.287111 [info ] [Thread-6 ]: 13 of 18 START incremental model raw_achilles.bank_config............................................................... [RUN] 2022-07-11 15:48:08 normalization > 15:47:59.288674 [debug] [Thread-6 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config" 2022-07-11 15:48:08 normalization > 15:47:59.288928 [debug] [Thread-6 ]: Began compiling node model.airbyte_utils.bank_config 2022-07-11 15:48:08 normalization > 15:47:59.289156 [debug] [Thread-6 ]: Compiling model.airbyte_utils.bank_config 2022-07-11 15:48:08 normalization > 15:47:59.300628 [debug] [Thread-6 ]: Writing injected SQL for node "model.airbyte_utils.bank_config" 2022-07-11 15:48:08 normalization > 15:47:59.301397 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:59.301630 [debug] [Thread-6 ]: Began executing node model.airbyte_utils.bank_config 2022-07-11 15:48:08 normalization > 15:47:59.306767 [debug] [Thread-6 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:47:59.416643 [debug] [Thread-6 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/bank_config?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.bank_config 2022-07-11 15:48:08 normalization > 15:47:59.418801 [info ] [Thread-6 ]: 15:47:59 + `mainapi-282501`.raw_achilles.`bank_config`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:48:08 normalization > 15:47:59.423237 [debug] [Thread-6 ]: Writing runtime SQL for node "model.airbyte_utils.bank_config" 2022-07-11 15:48:08 normalization > 15:47:59.427100 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:59.428269 [info ] [Thread-8 ]: 10 of 18 OK created incremental model raw_achilles.partner_config_scd................................................... 
[CREATE TABLE (206.0 rows, 112.0 KB processed) in 6.25s] 2022-07-11 15:48:08 normalization > 15:47:59.428891 [debug] [Thread-8 ]: Finished running node model.airbyte_utils.partner_config_scd 2022-07-11 15:48:08 normalization > 15:47:59.429819 [debug] [Thread-6 ]: On model.airbyte_utils.bank_config: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace table `mainapi-282501`.raw_achilles.`bank_config` 2022-07-11 15:48:08 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:48:08 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:48:08 normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- Final base SQL model 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`bank_config_scd` 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > _airbyte_unique_key, 2022-07-11 15:48:08 normalization > name, 2022-07-11 15:48:08 normalization > config, 2022-07-11 15:48:08 normalization > bank_id, 2022-07-11 15:48:08 normalization > created, 2022-07-11 15:48:08 normalization > updated, 2022-07-11 15:48:08 normalization > routing_no, 2022-07-11 15:48:08 normalization > _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:48:08 normalization > _airbyte_bank_config_hashid 2022-07-11 15:48:08 normalization > from `mainapi-282501`.raw_achilles.`bank_config_scd` 2022-07-11 15:48:08 normalization > -- bank_config from `mainapi-282501`.raw_achilles._airbyte_raw_bank_config 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > and _airbyte_active_row = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ); 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:59.433062 [debug] [Thread-3 ]: Began running node model.airbyte_utils.partner_config 2022-07-11 15:48:08 normalization > 15:47:59.433483 [info ] [Thread-3 ]: 14 of 18 START incremental model raw_achilles.partner_config............................................................ 
[RUN] 2022-07-11 15:48:08 normalization > 15:47:59.434736 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config" 2022-07-11 15:48:08 normalization > 15:47:59.434964 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.partner_config 2022-07-11 15:48:08 normalization > 15:47:59.435199 [debug] [Thread-3 ]: Compiling model.airbyte_utils.partner_config 2022-07-11 15:48:08 normalization > 15:47:59.447359 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.partner_config" 2022-07-11 15:48:08 normalization > 15:47:59.450966 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:59.451264 [debug] [Thread-3 ]: Began executing node model.airbyte_utils.partner_config 2022-07-11 15:48:08 normalization > 15:47:59.458090 [debug] [Thread-3 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:47:59.544596 [debug] [Thread-3 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/partner_config?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.partner_config 2022-07-11 15:48:08 normalization > 15:47:59.546558 [info ] [Thread-3 ]: 15:47:59 + `mainapi-282501`.raw_achilles.`partner_config`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:48:08 normalization > 15:47:59.551450 [debug] [Thread-3 ]: Writing runtime SQL for node "model.airbyte_utils.partner_config" 2022-07-11 15:48:08 normalization > 15:47:59.552000 [debug] [Thread-3 ]: On model.airbyte_utils.partner_config: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace table `mainapi-282501`.raw_achilles.`partner_config` 2022-07-11 15:48:08 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:48:08 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:48:08 normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- Final base SQL model 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`partner_config_scd` 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > _airbyte_unique_key, 2022-07-11 15:48:08 normalization > name, 2022-07-11 15:48:08 normalization > config, 2022-07-11 15:48:08 normalization > bank_id, 2022-07-11 15:48:08 normalization > created, 2022-07-11 15:48:08 normalization > updated, 2022-07-11 15:48:08 normalization > partner_id, 2022-07-11 15:48:08 normalization > routing_no, 2022-07-11 15:48:08 normalization > _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > account_prefix, 2022-07-11 15:48:08 normalization > _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:48:08 normalization > _airbyte_partner_config_hashid 2022-07-11 15:48:08 normalization > from `mainapi-282501`.raw_achilles.`partner_config_scd` 2022-07-11 15:48:08 normalization > -- partner_config from 
`mainapi-282501`.raw_achilles._airbyte_raw_partner_config 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > and _airbyte_active_row = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ); 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:59.736499 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:59.737858 [info ] [Thread-1 ]: 9 of 18 OK created incremental model raw_achilles.files_out_scd......................................................... [CREATE TABLE (34.0 rows, 14.4 KB processed) in 6.63s] 2022-07-11 15:48:08 normalization > 15:47:59.738568 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.files_out_scd 2022-07-11 15:48:08 normalization > 15:47:59.739671 [debug] [Thread-8 ]: Began running node model.airbyte_utils.files_out 2022-07-11 15:48:08 normalization > 15:47:59.740118 [info ] [Thread-8 ]: 15 of 18 START incremental model raw_achilles.files_out................................................................. [RUN] 2022-07-11 15:48:08 normalization > 15:47:59.741489 [debug] [Thread-8 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out" 2022-07-11 15:48:08 normalization > 15:47:59.741734 [debug] [Thread-8 ]: Began compiling node model.airbyte_utils.files_out 2022-07-11 15:48:08 normalization > 15:47:59.741958 [debug] [Thread-8 ]: Compiling model.airbyte_utils.files_out 2022-07-11 15:48:08 normalization > 15:47:59.753896 [debug] [Thread-8 ]: Writing injected SQL for node "model.airbyte_utils.files_out" 2022-07-11 15:48:08 normalization > 15:47:59.759008 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:59.759530 [debug] [Thread-8 ]: Began executing node model.airbyte_utils.files_out 2022-07-11 15:48:08 normalization > 15:47:59.764969 [debug] [Thread-8 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:47:59.768662 [debug] [Thread-5 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > drop view _airbyte_raw_achilles.files_in_stg 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:59.795084 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:59.796228 [info ] [Thread-4 ]: 8 of 18 OK created incremental model raw_achilles.transactions_out_scd.................................................. [CREATE TABLE (113.0 rows, 93.6 KB processed) in 6.73s] 2022-07-11 15:48:08 normalization > 15:47:59.796809 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.transactions_out_scd 2022-07-11 15:48:08 normalization > 15:47:59.797887 [debug] [Thread-1 ]: Began running node model.airbyte_utils.transactions_out 2022-07-11 15:48:08 normalization > 15:47:59.798370 [info ] [Thread-1 ]: 16 of 18 START incremental model raw_achilles.transactions_out.......................................................... 
[RUN] 2022-07-11 15:48:08 normalization > 15:47:59.799661 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out" 2022-07-11 15:48:08 normalization > 15:47:59.799921 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.transactions_out 2022-07-11 15:48:08 normalization > 15:47:59.801054 [debug] [Thread-1 ]: Compiling model.airbyte_utils.transactions_out 2022-07-11 15:48:08 normalization > 15:47:59.815092 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out" 2022-07-11 15:48:08 normalization > 15:47:59.815794 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:47:59.816018 [debug] [Thread-1 ]: Began executing node model.airbyte_utils.transactions_out 2022-07-11 15:48:08 normalization > 15:47:59.820327 [debug] [Thread-1 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:47:59.880458 [debug] [Thread-8 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/files_out?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.files_out 2022-07-11 15:48:08 normalization > 15:47:59.882778 [info ] [Thread-8 ]: 15:47:59 + `mainapi-282501`.raw_achilles.`files_out`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:48:08 normalization > 15:47:59.887543 [debug] [Thread-8 ]: Writing runtime SQL for node "model.airbyte_utils.files_out" 2022-07-11 15:48:08 normalization > 15:47:59.888164 [debug] [Thread-8 ]: On model.airbyte_utils.files_out: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace table `mainapi-282501`.raw_achilles.`files_out` 2022-07-11 15:48:08 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:48:08 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:48:08 normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- Final base SQL model 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`files_out_scd` 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > _airbyte_unique_key, 2022-07-11 15:48:08 normalization > id, 2022-07-11 15:48:08 normalization > bank_id, 2022-07-11 15:48:08 normalization > created, 2022-07-11 15:48:08 normalization > updated, 2022-07-11 15:48:08 normalization > file_hash, 2022-07-11 15:48:08 normalization > file_name, 2022-07-11 15:48:08 normalization > _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > batch_count, 2022-07-11 15:48:08 normalization > exchange_window, 2022-07-11 15:48:08 normalization > _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:48:08 normalization > _airbyte_files_out_hashid 2022-07-11 15:48:08 normalization > from `mainapi-282501`.raw_achilles.`files_out_scd` 2022-07-11 15:48:08 normalization > -- files_out from `mainapi-282501`.raw_achilles._airbyte_raw_files_out 2022-07-11 15:48:08 
normalization > where 1 = 1 2022-07-11 15:48:08 normalization > and _airbyte_active_row = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ); 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:47:59.912181 [debug] [Thread-1 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/transactions_out?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.transactions_out 2022-07-11 15:48:08 normalization > 15:47:59.913987 [info ] [Thread-1 ]: 15:47:59 + `mainapi-282501`.raw_achilles.`transactions_out`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:48:08 normalization > 15:47:59.918885 [debug] [Thread-1 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_out" 2022-07-11 15:48:08 normalization > 15:47:59.919396 [debug] [Thread-1 ]: On model.airbyte_utils.transactions_out: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace table `mainapi-282501`.raw_achilles.`transactions_out` 2022-07-11 15:48:08 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:48:08 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:48:08 normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- Final base SQL model 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`transactions_out_scd` 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > _airbyte_unique_key, 2022-07-11 15:48:08 normalization > id, 2022-07-11 15:48:08 normalization > data, 2022-07-11 15:48:08 normalization > uuid, 2022-07-11 15:48:08 normalization > amount, 2022-07-11 15:48:08 normalization > status, 2022-07-11 15:48:08 normalization > bank_id, 2022-07-11 15:48:08 normalization > created, 2022-07-11 15:48:08 normalization > file_id, 2022-07-11 15:48:08 normalization > updated, 2022-07-11 15:48:08 normalization > trace_no, 2022-07-11 15:48:08 normalization > account_no, 2022-07-11 15:48:08 normalization > partner_id, 2022-07-11 15:48:08 normalization > _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > description, 2022-07-11 15:48:08 normalization > external_id, 2022-07-11 15:48:08 normalization > is_same_day, 2022-07-11 15:48:08 normalization > return_data, 2022-07-11 15:48:08 normalization > account_name, 2022-07-11 15:48:08 normalization > effective_date, 2022-07-11 15:48:08 normalization > reference_info, 2022-07-11 15:48:08 normalization > transaction_code, 2022-07-11 15:48:08 normalization > source_account_no, 2022-07-11 15:48:08 normalization > transaction_in_id, 2022-07-11 15:48:08 normalization > _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > source_account_name, 2022-07-11 15:48:08 normalization > destination_bank_routing_no, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:48:08 normalization > _airbyte_transactions_out_hashid 2022-07-11 15:48:08 normalization > from 
`mainapi-282501`.raw_achilles.`transactions_out_scd` 2022-07-11 15:48:08 normalization > -- transactions_out from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > and _airbyte_active_row = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ); 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:48:00.168363 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:48:00.169449 [debug] [Thread-7 ]: Database Error in model transactions_in_scd (models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql) 2022-07-11 15:48:08 normalization > Invalid timestamp string "0000-12-30T00:00:00Z" 2022-07-11 15:48:08 normalization > compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:48:08 normalization > 15:48:00.171012 [error] [Thread-7 ]: 12 of 18 ERROR creating incremental model raw_achilles.transactions_in_scd.............................................. [ERROR in 6.80s] 2022-07-11 15:48:08 normalization > 15:48:00.174791 [debug] [Thread-7 ]: Finished running node model.airbyte_utils.transactions_in_scd 2022-07-11 15:48:08 normalization > 15:48:00.176071 [debug] [Thread-4 ]: Began running node model.airbyte_utils.transactions_in 2022-07-11 15:48:08 normalization > 15:48:00.176417 [info ] [Thread-4 ]: 17 of 18 SKIP relation raw_achilles.transactions_in..................................................................... [SKIP] 2022-07-11 15:48:08 normalization > 15:48:00.176950 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.transactions_in 2022-07-11 15:48:08 normalization > 15:48:00.317214 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:48:00.318418 [info ] [Thread-5 ]: 11 of 18 OK created incremental model raw_achilles.files_in_scd......................................................... [CREATE TABLE (36.0 rows, 23.3 KB processed) in 7.14s] 2022-07-11 15:48:08 normalization > 15:48:00.318950 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.files_in_scd 2022-07-11 15:48:08 normalization > 15:48:00.319949 [debug] [Thread-7 ]: Began running node model.airbyte_utils.files_in 2022-07-11 15:48:08 normalization > 15:48:00.320339 [info ] [Thread-7 ]: 18 of 18 START incremental model raw_achilles.files_in.................................................................. 
[RUN] 2022-07-11 15:48:08 normalization > 15:48:00.321402 [debug] [Thread-7 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in" 2022-07-11 15:48:08 normalization > 15:48:00.321630 [debug] [Thread-7 ]: Began compiling node model.airbyte_utils.files_in 2022-07-11 15:48:08 normalization > 15:48:00.321851 [debug] [Thread-7 ]: Compiling model.airbyte_utils.files_in 2022-07-11 15:48:08 normalization > 15:48:00.416542 [debug] [Thread-7 ]: Writing injected SQL for node "model.airbyte_utils.files_in" 2022-07-11 15:48:08 normalization > 15:48:00.417153 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:48:00.417385 [debug] [Thread-7 ]: Began executing node model.airbyte_utils.files_in 2022-07-11 15:48:08 normalization > 15:48:00.437354 [debug] [Thread-7 ]: Opening a new connection, currently in state closed 2022-07-11 15:48:08 normalization > 15:48:00.565305 [debug] [Thread-7 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/files_in?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.files_in 2022-07-11 15:48:08 normalization > 15:48:00.566993 [info ] [Thread-7 ]: 15:48:00 + `mainapi-282501`.raw_achilles.`files_in`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:48:08 normalization > 15:48:00.571504 [debug] [Thread-7 ]: Writing runtime SQL for node "model.airbyte_utils.files_in" 2022-07-11 15:48:08 normalization > 15:48:00.572011 [debug] [Thread-7 ]: On model.airbyte_utils.files_in: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in"} */ 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > create or replace table `mainapi-282501`.raw_achilles.`files_in` 2022-07-11 15:48:08 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:48:08 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:48:08 normalization > OPTIONS() 2022-07-11 15:48:08 normalization > as ( 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > -- Final base SQL model 2022-07-11 15:48:08 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`files_in_scd` 2022-07-11 15:48:08 normalization > select 2022-07-11 15:48:08 normalization > _airbyte_unique_key, 2022-07-11 15:48:08 normalization > id, 2022-07-11 15:48:08 normalization > ended, 2022-07-11 15:48:08 normalization > started, 2022-07-11 15:48:08 normalization > updated, 2022-07-11 15:48:08 normalization > file_hash, 2022-07-11 15:48:08 normalization > file_name, 2022-07-11 15:48:08 normalization > _ab_cdc_lsn, 2022-07-11 15:48:08 normalization > iat_entry_count, 2022-07-11 15:48:08 normalization > std_entry_count, 2022-07-11 15:48:08 normalization > total_batch_count, 2022-07-11 15:48:08 normalization > total_entry_count, 2022-07-11 15:48:08 normalization > _ab_cdc_deleted_at, 2022-07-11 15:48:08 normalization > _ab_cdc_updated_at, 2022-07-11 15:48:08 normalization > preprocessing_path, 2022-07-11 15:48:08 normalization > total_debit_amount, 2022-07-11 15:48:08 normalization > postprocessing_path, 2022-07-11 15:48:08 normalization > total_credit_amount, 2022-07-11 15:48:08 normalization > iat_entries_processed, 2022-07-11 15:48:08 normalization > std_entries_processed, 2022-07-11 15:48:08 normalization > _airbyte_ab_id, 2022-07-11 
15:48:08 normalization > _airbyte_emitted_at, 2022-07-11 15:48:08 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:48:08 normalization > _airbyte_files_in_hashid 2022-07-11 15:48:08 normalization > from `mainapi-282501`.raw_achilles.`files_in_scd` 2022-07-11 15:48:08 normalization > -- files_in from `mainapi-282501`.raw_achilles._airbyte_raw_files_in 2022-07-11 15:48:08 normalization > where 1 = 1 2022-07-11 15:48:08 normalization > and _airbyte_active_row = 1 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > ); 2022-07-11 15:48:08 normalization > 2022-07-11 15:48:08 normalization > 15:48:01.739906 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:48:01.741378 [info ] [Thread-6 ]: 13 of 18 OK created incremental model raw_achilles.bank_config.......................................................... [CREATE TABLE (3.0 rows, 1.5 KB processed) in 2.45s] 2022-07-11 15:48:08 normalization > 15:48:01.741999 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.bank_config 2022-07-11 15:48:08 normalization > 15:48:01.828816 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:48:01.829973 [info ] [Thread-3 ]: 14 of 18 OK created incremental model raw_achilles.partner_config....................................................... [CREATE TABLE (206.0 rows, 84.2 KB processed) in 2.40s] 2022-07-11 15:48:08 normalization > 15:48:01.830542 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.partner_config 2022-07-11 15:48:08 normalization > 15:48:02.257512 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:48:02.260114 [info ] [Thread-8 ]: 15 of 18 OK created incremental model raw_achilles.files_out............................................................ [CREATE TABLE (34.0 rows, 10.1 KB processed) in 2.52s] 2022-07-11 15:48:08 normalization > 15:48:02.260800 [debug] [Thread-8 ]: Finished running node model.airbyte_utils.files_out 2022-07-11 15:48:08 normalization > 15:48:02.330550 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:48:02.331809 [info ] [Thread-1 ]: 16 of 18 OK created incremental model raw_achilles.transactions_out..................................................... [CREATE TABLE (113.0 rows, 51.0 KB processed) in 2.53s] 2022-07-11 15:48:08 normalization > 15:48:02.332365 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.transactions_out 2022-07-11 15:48:08 normalization > 15:48:03.258592 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:48:08 normalization > 15:48:03.259748 [info ] [Thread-7 ]: 18 of 18 OK created incremental model raw_achilles.files_in............................................................. [CREATE TABLE (36.0 rows, 13.3 KB processed) in 2.94s] 2022-07-11 15:48:08 normalization > 15:48:03.260337 [debug] [Thread-7 ]: Finished running node model.airbyte_utils.files_in 2022-07-11 15:48:08 normalization > 15:48:03.264547 [debug] [MainThread]: Acquiring new bigquery connection "master" 2022-07-11 15:48:08 normalization > 15:48:03.265445 [info ] [MainThread]: 2022-07-11 15:48:08 normalization > 15:48:03.265832 [info ] [MainThread]: Finished running 6 view models, 12 incremental models in 13.49s. 2022-07-11 15:48:08 normalization > 15:48:03.266508 [debug] [MainThread]: Connection 'master' was properly closed. 
2022-07-11 15:48:08 normalization > 15:48:03.266854 [debug] [MainThread]: Connection 'model.airbyte_utils.transactions_out' was properly closed. 2022-07-11 15:48:08 normalization > 15:48:03.267079 [debug] [MainThread]: Connection 'model.airbyte_utils.transactions_out_ab2' was properly closed. 2022-07-11 15:48:08 normalization > 15:48:03.267249 [debug] [MainThread]: Connection 'model.airbyte_utils.partner_config' was properly closed. 2022-07-11 15:48:08 normalization > 15:48:03.267407 [debug] [MainThread]: Connection 'model.airbyte_utils.transactions_out_scd' was properly closed. 2022-07-11 15:48:08 normalization > 15:48:03.267563 [debug] [MainThread]: Connection 'model.airbyte_utils.files_in_scd' was properly closed. 2022-07-11 15:48:08 normalization > 15:48:03.267732 [debug] [MainThread]: Connection 'model.airbyte_utils.bank_config' was properly closed. 2022-07-11 15:48:08 normalization > 15:48:03.267914 [debug] [MainThread]: Connection 'model.airbyte_utils.files_out' was properly closed. 2022-07-11 15:48:08 normalization > 15:48:03.268071 [debug] [MainThread]: Connection 'model.airbyte_utils.files_in' was properly closed. 2022-07-11 15:48:08 normalization > 15:48:03.291330 [info ] [MainThread]: 2022-07-11 15:48:08 normalization > 15:48:03.291819 [info ] [MainThread]: Completed with 1 error and 0 warnings: 2022-07-11 15:48:08 normalization > 15:48:03.292511 [info ] [MainThread]: 2022-07-11 15:48:08 normalization > 15:48:03.293065 [error] [MainThread]: Database Error in model transactions_in_scd (models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql) 2022-07-11 15:48:08 normalization > 15:48:03.293622 [error] [MainThread]: Invalid timestamp string "0000-12-30T00:00:00Z" 2022-07-11 15:48:08 normalization > 15:48:03.294238 [error] [MainThread]: compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:48:08 normalization > 15:48:03.294783 [info ] [MainThread]: 2022-07-11 15:48:08 normalization > 15:48:03.295232 [info ] [MainThread]: Done. PASS=16 WARN=0 ERROR=1 SKIP=1 TOTAL=18 2022-07-11 15:48:09 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.plugin: is not defined in the schema and the schema does not allow additional properties, $.publication: is not defined in the schema and the schema does not allow additional properties, $.replication_slot: is not defined in the schema and the schema does not allow additional properties, $.method: does not have a value in the enumeration [Standard], $.method: must be a constant value Standard 2022-07-11 15:48:09 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.method: must be a constant value Standard 2022-07-11 15:48:09 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.credential.hmac_key_access_id: object found, string expected, $.credential.hmac_key_secret: object found, string expected 2022-07-11 15:48:09 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/1/logs.log 2022-07-11 15:48:09 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha 2022-07-11 15:48:09 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false' 2022-07-11 15:48:09 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-postgres:0.4.31 exists... 
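The one model that errors out in the summary above, transactions_in_scd, is the model whose compiled SQL is interleaved with the normalization timestamps at the start of this excerpt. The window logic it applies is the usual Airbyte SCD shape: the newest non-deleted version of each id becomes the active row, and lag(updated) closes out the older versions, after which dedup_data keeps one row per unique-key/start-at/emitted-at combination. A minimal, self-contained BigQuery sketch of that pattern on two toy rows (the toy values are invented for illustration; the window clauses mirror the logged SQL):

with input_data as (
  -- two CDC versions of the same source row (invented values)
  select 1 as id, timestamp '2022-07-01' as updated,
         cast(null as timestamp) as _ab_cdc_deleted_at,
         timestamp '2022-07-01' as _ab_cdc_updated_at,
         timestamp '2022-07-11 15:47:00' as _airbyte_emitted_at,
         'a1' as _airbyte_ab_id
  union all
  select 1, timestamp '2022-07-05', null, timestamp '2022-07-05',
         timestamp '2022-07-11 15:47:00', 'a2'
)
select
  id,
  updated as _airbyte_start_at,
  -- the next-newer version's cursor value closes out this version
  lag(updated) over (
    partition by cast(id as string)
    order by updated is null asc, updated desc,
             _ab_cdc_updated_at desc, _airbyte_emitted_at desc
  ) as _airbyte_end_at,
  -- only the newest non-deleted version stays active
  case when row_number() over (
    partition by cast(id as string)
    order by updated is null asc, updated desc,
             _ab_cdc_updated_at desc, _airbyte_emitted_at desc
  ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row
from input_data

On these two rows the 2022-07-05 version comes back with _airbyte_active_row = 1 and a null _airbyte_end_at, while the 2022-07-01 version is closed out at 2022-07-05; the final select ... where _airbyte_row_num = 1 in the logged model then reads only the deduplicated rows.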
2022-07-11 15:48:09 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-postgres:0.4.31 was found locally. 2022-07-11 15:48:09 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:48:09 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/1 --log-driver none --name source-postgres-check-89696-1-jpryd --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/source-postgres:0.4.31 -e WORKER_JOB_ATTEMPT=1 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/source-postgres:0.4.31 check --config source_config.json 2022-07-11 15:48:10 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:10 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(getSource):73 - Running source under deployment mode: OSS 2022-07-11 15:48:10 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:10 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):85 - Starting source: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:48:10 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:10 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json} 2022-07-11 15:48:10 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:10 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:48:10 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:10 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK 2022-07-11 15:48:10 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:10 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'} 2022-07-11 15:48:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:11 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:11 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:11 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:11 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. 
If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:11 INFO i.a.i.b.s.SshTunnel(getInstance):170 - Starting connection with method: NO_TUNNEL 2022-07-11 15:48:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:11 INFO c.z.h.HikariDataSource():80 - HikariPool-1 - Starting... 2022-07-11 15:48:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:11 INFO c.z.h.HikariDataSource():82 - HikariPool-1 - Start completed. 2022-07-11 15:48:12 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:12 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:48:12 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:12 INFO i.a.i.s.j.AbstractJdbcSource(lambda$getCheckOperations$1):93 - Attempting to get metadata from the database to see if we can connect. 2022-07-11 15:48:12 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:12 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$2):197 - Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@1637601612 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot_achilles' AND plugin = 'wal2json' AND database = 'achilles' 2022-07-11 15:48:12 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:12 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:48:12 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:12 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$4):214 - Attempting to find the publication using the query: HikariProxyPreparedStatement@2063786038 wrapping SELECT * FROM pg_publication WHERE pubname = 'achilles_publication' 2022-07-11 15:48:12 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:12 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:48:12 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:12 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated... 2022-07-11 15:48:12 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:12 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed. 2022-07-11 15:48:12 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling... 2022-07-11 15:48:12 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/1/logs.log 2022-07-11 15:48:12 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha 2022-07-11 15:48:12 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false' 2022-07-11 15:48:12 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-bigquery:1.1.11 exists... 2022-07-11 15:48:12 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-bigquery:1.1.11 was found locally. 
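The source check on this retry verifies the CDC prerequisites with the two queries quoted in the log entries above. To reproduce that verification by hand, the same statements can be run in psql against the achilles database (copied from the log; the slot, plugin, and publication names are the ones this connection is configured with):

SELECT * FROM pg_replication_slots
WHERE slot_name = 'airbyte_slot_achilles'
  AND plugin = 'wal2json'
  AND database = 'achilles';

SELECT * FROM pg_publication
WHERE pubname = 'achilles_publication';

Each query is expected to return a row for the configured slot and publication; in this log the check completes and the job proceeds to the BigQuery destination check below.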
2022-07-11 15:48:12 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:48:12 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/1 --log-driver none --name destination-bigquery-check-89696-1-bwxbn --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:1.1.11 -e WORKER_JOB_ATTEMPT=1 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/destination-bigquery:1.1.11 check --config source_config.json 2022-07-11 15:48:13 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings. 2022-07-11 15:48:13 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:48:13 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:48:13 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:48:13 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 2022-07-11 15:48:13 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] 2022-07-11 15:48:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:14 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json} 2022-07-11 15:48:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:14 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2022-07-11 15:48:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:14 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK 2022-07-11 15:48:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:14 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'} 2022-07-11 15:48:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:14 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:14 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:14 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. 
If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:14 INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):301 - Selected loading method is set to: GCS 2022-07-11 15:48:16 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:16 INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"format_type":"CSV","flattening":"No flattening"} 2022-07-11 15:48:16 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:16 INFO i.a.i.d.s.S3Destination(testSingleUpload):81 - Started testing if all required credentials assigned to user for single file uploading 2022-07-11 15:48:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:17 INFO i.a.i.d.s.S3Destination(testSingleUpload):91 - Finished checking for normal upload mode 2022-07-11 15:48:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:17 INFO i.a.i.d.s.S3Destination(testMultipartUpload):95 - Started testing if all required credentials assigned to user for multipart upload 2022-07-11 15:48:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:17 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/test_1657554497360 with full ID ABPnzm6k0c6LjZXsOCiMZe9HVfv5vsuwF65xKNyqMBh47iADritowLvop2wKI9CkICG8SOk 2022-07-11 15:48:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:17 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:48:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:17 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:48:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:17 WARN a.m.s.MultiPartOutputStream(close):160 - [MultipartOutputStream for parts 1 - 10000] is already closed 2022-07-11 15:48:17 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:17 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/test_1657554497360 with id ABPnzm6k0...CkICG8SOk]: Uploading leftover stream [Part number 1 containing 3.34 MB] 2022-07-11 15:48:18 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:18 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/test_1657554497360 with id ABPnzm6k0...CkICG8SOk]: Finished uploading [Part number 1 containing 3.34 MB] 2022-07-11 15:48:18 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:18 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/test_1657554497360 with id ABPnzm6k0...CkICG8SOk]: Completed 2022-07-11 15:48:18 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:48:18 INFO i.a.i.d.s.S3Destination(testMultipartUpload):119 - Finished verification for multipart upload mode 2022-07-11 15:48:19 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling... 
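Both checks pass, so the job proceeds to the replication attempt below, which will eventually re-run normalization over the same raw records, including whichever record carries the "0000-12-30T00:00:00Z" value that broke transactions_in_scd. BigQuery TIMESTAMP only accepts values from 0001-01-01 through 9999-12-31 UTC, which is why that literal cannot be cast. One way to locate the offending records ahead of the next normalization run is to probe the raw table for values the cast rejects. This is a sketch only: it assumes the usual Airbyte raw layout (a JSON _airbyte_data column), infers the raw table name from the _airbyte_raw_* pattern visible for the other streams in this log, and probes only the updated field, so the stream's other date/timestamp columns may need the same check:

select
  _airbyte_ab_id,
  _airbyte_emitted_at,
  json_extract_scalar(_airbyte_data, '$.updated') as updated_raw
from `mainapi-282501`.raw_achilles.`_airbyte_raw_transactions_in`
where json_extract_scalar(_airbyte_data, '$.updated') is not null
  and safe_cast(json_extract_scalar(_airbyte_data, '$.updated') as timestamp) is null

Any row returned here is one the SCD model cannot cast; safe_cast turns the hard BadRequest seen in the log into a null instead of an error, which makes it usable as a filter.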
2022-07-11 15:48:19 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/1/logs.log 2022-07-11 15:48:19 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha 2022-07-11 15:48:19 INFO i.a.w.g.DefaultReplicationWorker(run):115 - start sync worker. job id: 89696 attempt id: 1 2022-07-11 15:48:19 INFO i.a.w.g.DefaultReplicationWorker(run):127 - configured sync modes: {public.bank_config=incremental - append_dedup, public.files_out=incremental - append_dedup, public.transactions_out=incremental - append_dedup, public.partner_config=incremental - append_dedup, public.transactions_in=incremental - append_dedup, public.files_in=incremental - append_dedup} 2022-07-11 15:48:19 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination... 2022-07-11 15:48:19 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false' 2022-07-11 15:48:19 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-bigquery:1.1.11 exists... 2022-07-11 15:48:19 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-bigquery:1.1.11 was found locally. 2022-07-11 15:48:19 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:48:19 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/1 --log-driver none --name destination-bigquery-write-89696-1-asdee --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:1.1.11 -e WORKER_JOB_ATTEMPT=1 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/destination-bigquery:1.1.11 write --config destination_config.json --catalog destination_catalog.json 2022-07-11 15:48:19 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false' 2022-07-11 15:48:19 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-postgres:0.4.31 exists... 2022-07-11 15:48:19 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-postgres:0.4.31 was found locally. 2022-07-11 15:48:19 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:48:19 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/1 --log-driver none --name source-postgres-read-89696-1-takyd --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/source-postgres:0.4.31 -e WORKER_JOB_ATTEMPT=1 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/source-postgres:0.4.31 read --config source_config.json --catalog source_catalog.json --state input_state.json 2022-07-11 15:48:19 INFO i.a.w.g.DefaultReplicationWorker(run):169 - Waiting for source and destination threads to complete. 2022-07-11 15:48:19 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):405 - Destination output thread started. 2022-07-11 15:48:19 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):298 - Replication thread started. 2022-07-11 15:48:20 destination > SLF4J: Class path contains multiple SLF4J bindings. 
2022-07-11 15:48:20 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:48:20 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:48:20 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:48:20 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 2022-07-11 15:48:20 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] 2022-07-11 15:48:22 source > 2022-07-11 15:48:22 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(getSource):73 - Running source under deployment mode: OSS 2022-07-11 15:48:22 source > 2022-07-11 15:48:22 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):85 - Starting source: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:48:22 source > 2022-07-11 15:48:22 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {read=null, catalog=source_catalog.json, state=input_state.json, config=source_config.json} 2022-07-11 15:48:22 source > 2022-07-11 15:48:22 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:48:22 source > 2022-07-11 15:48:22 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: READ 2022-07-11 15:48:22 source > 2022-07-11 15:48:22 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='input_state.json'} 2022-07-11 15:48:22 destination > 2022-07-11 15:48:22 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json} 2022-07-11 15:48:22 destination > 2022-07-11 15:48:22 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2022-07-11 15:48:22 destination > 2022-07-11 15:48:22 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: WRITE 2022-07-11 15:48:22 destination > 2022-07-11 15:48:22 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'} 2022-07-11 15:48:22 source > 2022-07-11 15:48:22 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:22 source > 2022-07-11 15:48:22 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:22 source > 2022-07-11 15:48:22 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:22 source > 2022-07-11 15:48:22 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. 
If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:22 destination > 2022-07-11 15:48:22 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:22 destination > 2022-07-11 15:48:22 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:22 destination > 2022-07-11 15:48:22 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:48:22 destination > 2022-07-11 15:48:22 INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):301 - Selected loading method is set to: GCS 2022-07-11 15:48:23 destination > 2022-07-11 15:48:23 INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"format_type":"AVRO","flattening":"No flattening"} 2022-07-11 15:48:23 source > 2022-07-11 15:48:23 INFO i.a.i.b.s.SshTunnel(getInstance):170 - Starting connection with method: NO_TUNNEL 2022-07-11 15:48:23 destination > 2022-07-11 15:48:23 INFO i.a.i.d.b.BigQueryUtils(isKeepFilesInGcs):317 - All tmp files will be removed from GCS when replication is finished 2022-07-11 15:48:23 source > 2022-07-11 15:48:23 INFO c.z.h.HikariDataSource():80 - HikariPool-1 - Starting... 2022-07-11 15:48:23 source > 2022-07-11 15:48:23 INFO c.z.h.HikariDataSource():82 - HikariPool-1 - Start completed. 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.j.AbstractJdbcSource(lambda$getCheckOperations$1):93 - Attempting to get metadata from the database to see if we can connect. 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$2):197 - Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@615853374 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot_achilles' AND plugin = 'wal2json' AND database = 'achilles' 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$4):214 - Attempting to find the publication using the query: HikariProxyPreparedStatement@465152579 wrapping SELECT * FROM pg_publication WHERE pubname = 'achilles_publication' 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated... 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed. 
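Before reading, the source repeats the same two CDC prerequisite checks seen during the earlier CHECK run: it looks up the replication slot and then the publication. If a sync fails at this stage, the same lookups can be run by hand against the source database; a minimal SQL sketch, reusing the slot, plugin, publication, and database names shown in the log above:

    -- confirm the replication slot exists for the expected plugin and database
    SELECT slot_name, plugin, database, active
    FROM pg_replication_slots
    WHERE slot_name = 'airbyte_slot_achilles'
      AND plugin = 'wal2json'
      AND database = 'achilles';

    -- confirm the publication exists
    SELECT pubname
    FROM pg_publication
    WHERE pubname = 'achilles_publication';

Each statement should return exactly one row; an empty result is what makes this step fail.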
2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:48:24 destination > 2022-07-11 15:48:24 INFO i.a.i.d.b.BigQueryDestination(getGcsRecordConsumer):289 - Creating BigQuery staging message consumer with staging ID 9afd5930-86b5-40f9-85b2-1934e0a22ee0 at 2022-07-11T15:48:23.503Z 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.r.s.StateManagerFactory(createStateManager):51 - Global state manager selected to manage state object with type LEGACY. 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.r.s.StateManagerFactory(generateGlobalState):84 - Legacy state converted to global state. 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='files_in', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='partner_config', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='bank_config', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='files_out', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='transactions_out', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='transactions_in', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.r.CdcStateManager():29 - Initialized CDC state with: null 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO c.z.h.HikariDataSource():80 - HikariPool-2 - Starting... 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO c.z.h.HikariDataSource():82 - HikariPool-2 - Start completed. 
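The CDC state above is initialized with null, so the connector has no saved offset and will take a full initial snapshot before switching to log reading (Debezium confirms this further down with "Taking initial snapshot for new datasource"). To watch the slot advance across such a run, the replication views can be queried directly; a minimal sketch, not part of the sync itself:

    -- confirmed_flush_lsn moves forward as Airbyte acknowledges consumed WAL
    SELECT slot_name,
           active,
           restart_lsn,
           confirmed_flush_lsn,
           pg_current_wal_lsn() AS current_lsn
    FROM pg_replication_slots
    WHERE slot_name = 'airbyte_slot_achilles';

A confirmed_flush_lsn that never moves between syncs while current_lsn keeps growing is the usual sign of WAL accumulating behind an idle slot.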
2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.p.PostgresSource(discoverRawTables):168 - Checking schema: public 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.i.s.j.AbstractJdbcSource(discoverInternal):121 - Internal schemas to exclude: [catalog_history, information_schema, pg_catalog, pg_internal] 2022-07-11 15:48:24 destination > 2022-07-11 15:48:24 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=bank_config, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xha_bank_config}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_bank_config}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:48:24 destination > 2022-07-11 15:48:24 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=files_in, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_huf_files_in}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_in}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:48:24 destination > 2022-07-11 15:48:24 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=files_out, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kol_files_out}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_out}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:48:24 destination > 2022-07-11 15:48:24 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=partner_config, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_geb_partner_config}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_partner_config}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, 
policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:48:24 destination > 2022-07-11 15:48:24 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=transactions_in, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_cil_transactions_in}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_in}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:48:24 destination > 2022-07-11 15:48:24 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=transactions_out, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kdw_transactions_out}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_out}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:48:24 source > 2022-07-11 15:48:24 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:48:25 destination > 2022-07-11 15:48:25 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started. 
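Every write config above targets the raw_achilles dataset with the same three-column raw layout (_airbyte_ab_id, _airbyte_emitted_at, _airbyte_data), staged first in an _airbyte_tmp_* table and then moved to the _airbyte_raw_* target. A rough BigQuery DDL sketch of that layout follows; the time-partitioning clause is an assumption inferred from the "Partitioned table created successfully" message later in the log rather than something spelled out in the write config:

    CREATE TABLE IF NOT EXISTS `raw_achilles._airbyte_raw_bank_config` (
      _airbyte_ab_id      STRING,
      _airbyte_emitted_at TIMESTAMP,
      _airbyte_data       STRING
    )
    PARTITION BY DATE(_airbyte_emitted_at);  -- partitioning column assumed

The six streams listed in the configured sync modes differ only in table name; the schema is identical for each.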
2022-07-11 15:48:25 destination > 2022-07-11 15:48:25 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$3):98 - Preparing tmp tables in destination started for 6 streams 2022-07-11 15:48:25 destination > 2022-07-11 15:48:25 INFO i.a.i.d.b.BigQueryGcsOperations(createSchemaIfNotExists):86 - Creating dataset raw_achilles 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column trace_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column account_no (type varchar[17]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column account_name (type varchar[22]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column transaction_code (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column dc_sign (type varchar[6]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column effective_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column originating_bank_routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column destination_bank_routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_history (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column uuid (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_attempt (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column in_suspense (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_error (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column subtype (type text[2147483647]) -> 
JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column ach_entry (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column returned (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column partner_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column name (type varchar[23]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column account_prefix (type varchar[6]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column config (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table schema_migrations column version (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table schema_migrations column dirty (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column name (type varchar[23]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table 
bank_config column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column config (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column file_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column external_id (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column partner_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column trace_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column account_no (type varchar[17]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column account_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column source_account_no (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column source_account_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column description (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column effective_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column 
destination_bank_routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column return_data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column reference_info (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column transaction_code (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column transaction_in_id (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column uuid (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column status (type varchar[30]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column is_same_day (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column file_name (type varchar[255]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column batch_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column file_hash (type varchar[256]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, 
airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column exchange_window (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column file_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column file_hash (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column company_id (type varchar[10]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column company_name (type varchar[16]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column company_entry_description (type varchar[10]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column batch_type (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column batch_number (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column originating_dfi (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column sec_code (type varchar[3]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column settlement_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column entry_trace_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column transaction_code (type varchar[2]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column receiving_dfi (type varchar[9]) -> 
JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column dfi_account_no (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column individual_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column individual_id_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_record_count (type varchar[4]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column external_id (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column partner_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column effective_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column returned (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column processing_history (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column uuid (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column return_data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column transaction_out_id (type uuid[2147483647]) 
-> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column foreign_exchange_indicator (type varchar[2]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column destination_country_code (type varchar[2]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column originator_id (type varchar[10]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column originating_currency_code (type varchar[3]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column destination_currency_code (type varchar[3]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_99 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_98 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_02 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_05 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_10 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_11 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_12 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_13 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_14 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_15 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_16 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO 
i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_17 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_18 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column future_dated (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column preprocessing_path (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column postprocessing_path (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column file_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column file_hash (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column started (type timestamp[29]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column ended (type timestamp[29]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column std_entries_processed (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column iat_entries_processed (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column iat_entry_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column std_entry_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_entry_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_batch_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_debit_amount (type int8[19]) -> 
JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_credit_amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.in_processing 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.partner_config 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.schema_migrations 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.bank_config 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.transactions_out 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.files_out 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.transactions_in 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.files_in 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.d.j.s.TwoStageSizeEstimator(getTargetBufferByteSize):72 - Max memory limit: 29578231808, JDBC buffer size: 1073741824 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresCdcCatalogHelper(getPublicizedTables):92 - For CDC, only tables in publication achilles_publication will be included in the sync: [pglogical.node, pglogical.replication_set_seq, public.partner_config, public.schema_migrations, public.files_in, pglogical.queue, pglogical.node_interface, public.transactions_out, pglogical.local_node, pglogical.subscription, pglogical.replication_set_table, pglogical.depend, public.bank_config, pglogical.local_sync_status, public.files_out, public.in_processing, pglogical.replication_set, pglogical.sequence_state, public.transactions_in] 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.s.p.PostgresCdcTargetPosition(targetPosition):45 - identified target lsn: PgLsn{lsn=365839851224} 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.a.i.d.AirbyteDebeziumHandler(getIncrementalIterators):99 - Using CDC: true 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO o.a.k.c.c.AbstractConfig(logAll):376 - EmbeddedConfig values: 2022-07-11 15:48:25 source > access.control.allow.methods = 2022-07-11 15:48:25 source > access.control.allow.origin = 2022-07-11 15:48:25 source > admin.listeners = null 2022-07-11 15:48:25 source > bootstrap.servers = [localhost:9092] 2022-07-11 15:48:25 source > client.dns.lookup = use_all_dns_ips 2022-07-11 15:48:25 source > config.providers = [] 2022-07-11 15:48:25 
source > connector.client.config.override.policy = All 2022-07-11 15:48:25 source > header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter 2022-07-11 15:48:25 source > key.converter = class org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:48:25 source > listeners = [http://:8083] 2022-07-11 15:48:25 source > metric.reporters = [] 2022-07-11 15:48:25 source > metrics.num.samples = 2 2022-07-11 15:48:25 source > metrics.recording.level = INFO 2022-07-11 15:48:25 source > metrics.sample.window.ms = 30000 2022-07-11 15:48:25 source > offset.flush.interval.ms = 1000 2022-07-11 15:48:25 source > offset.flush.timeout.ms = 5000 2022-07-11 15:48:25 source > offset.storage.file.filename = /tmp/cdc-state-offset313680603054582523/offset.dat 2022-07-11 15:48:25 source > offset.storage.partitions = null 2022-07-11 15:48:25 source > offset.storage.replication.factor = null 2022-07-11 15:48:25 source > offset.storage.topic = 2022-07-11 15:48:25 source > plugin.path = null 2022-07-11 15:48:25 source > response.http.headers.config = 2022-07-11 15:48:25 source > rest.advertised.host.name = null 2022-07-11 15:48:25 source > rest.advertised.listener = null 2022-07-11 15:48:25 source > rest.advertised.port = null 2022-07-11 15:48:25 source > rest.extension.classes = [] 2022-07-11 15:48:25 source > ssl.cipher.suites = null 2022-07-11 15:48:25 source > ssl.client.auth = none 2022-07-11 15:48:25 source > ssl.enabled.protocols = [TLSv1.2, TLSv1.3] 2022-07-11 15:48:25 source > ssl.endpoint.identification.algorithm = https 2022-07-11 15:48:25 source > ssl.engine.factory.class = null 2022-07-11 15:48:25 source > ssl.key.password = null 2022-07-11 15:48:25 source > ssl.keymanager.algorithm = SunX509 2022-07-11 15:48:25 source > ssl.keystore.certificate.chain = null 2022-07-11 15:48:25 source > ssl.keystore.key = null 2022-07-11 15:48:25 source > ssl.keystore.location = null 2022-07-11 15:48:25 source > ssl.keystore.password = null 2022-07-11 15:48:25 source > ssl.keystore.type = JKS 2022-07-11 15:48:25 source > ssl.protocol = TLSv1.3 2022-07-11 15:48:25 source > ssl.provider = null 2022-07-11 15:48:25 source > ssl.secure.random.implementation = null 2022-07-11 15:48:25 source > ssl.trustmanager.algorithm = PKIX 2022-07-11 15:48:25 source > ssl.truststore.certificates = null 2022-07-11 15:48:25 source > ssl.truststore.location = null 2022-07-11 15:48:25 source > ssl.truststore.password = null 2022-07-11 15:48:25 source > ssl.truststore.type = JKS 2022-07-11 15:48:25 source > task.shutdown.graceful.timeout.ms = 5000 2022-07-11 15:48:25 source > topic.creation.enable = true 2022-07-11 15:48:25 source > topic.tracking.allow.reset = true 2022-07-11 15:48:25 source > topic.tracking.enable = true 2022-07-11 15:48:25 source > value.converter = class org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:48:25 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $: unknown found, object expected 2022-07-11 15:48:25 ERROR i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: null 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 WARN o.a.k.c.r.WorkerConfig(logInternalConverterRemovalWarnings):316 - The worker has been configured with one or more internal converter properties ([internal.key.converter, internal.value.converter]). Support for these properties was deprecated in version 2.0 and removed in version 3.0, and specifying them will have no effect. Instead, an instance of the JsonConverter with schemas.enable set to false will be used. 
For more information, please visit http://kafka.apache.org/documentation/#upgrade and consult the upgrade notesfor the 3.0 release. 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 WARN o.a.k.c.r.WorkerConfig(logPluginPathConfigProviderWarning):334 - Variables cannot be used in the 'plugin.path' property, since the property is used by plugin scanning before the config providers that replace the variables are initialized. The raw value 'null' was used for plugin scanning, as opposed to the transformed value 'null', and this may cause unexpected results. 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 WARN i.d.c.p.PostgresConnectorConfig(validatePluginName):1394 - Logical decoder 'wal2json' is deprecated and will be removed in future versions 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 WARN i.d.c.p.PostgresConnectorConfig(validateTruncateHandlingMode):1333 - Configuration property 'truncate.handling.mode' is deprecated and will be removed in future versions. Please use 'skipped.operations' instead. 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 WARN i.d.c.p.PostgresConnectorConfig(validateToastedValuePlaceholder):1384 - Configuration property 'toasted.value.placeholder' is deprecated and will be removed in future versions. Please use 'unavailable.value.placeholder' instead. 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(start):124 - Starting PostgresConnectorTask with configuration: 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - connector.class = io.debezium.connector.postgresql.PostgresConnector 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - max.queue.size = 8192 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - slot.name = airbyte_slot_achilles 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - publication.name = achilles_publication 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.storage.file.filename = /tmp/cdc-state-offset313680603054582523/offset.dat 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - decimal.handling.mode = string 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - converters = datetime 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - datetime.type = io.airbyte.integrations.debezium.internals.PostgresConverter 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - value.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - key.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - publication.autocreate.mode = disabled 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.user = airbyte_achilles 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.dbname = achilles 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.storage = org.apache.kafka.connect.storage.FileOffsetBackingStore 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO 
i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.server.name = achilles 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.flush.timeout.ms = 5000 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - plugin.name = wal2json 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.port = 5432 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.flush.interval.ms = 1000 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - key.converter.schemas.enable = false 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - internal.key.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.hostname = 10.58.160.3 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.password = ******** 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - name = achilles 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - value.converter.schemas.enable = false 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - internal.value.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - max.batch.size = 2048 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - table.include.list = public.bank_config,public.files_in,public.files_out,public.partner_config,public.transactions_in,public.transactions_out 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - snapshot.mode = initial 2022-07-11 15:48:25 source > 2022-07-11 15:48:25 INFO i.d.c.c.BaseSourceTask(getPreviousOffsets):318 - No previous offsets found 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.c.p.PostgresConnectorTask(start):108 - user 'airbyte_achilles' connected to database 'achilles' on PostgreSQL 12.10 on x86_64-pc-linux-gnu, compiled by Debian clang version 12.0.1, 64-bit with roles: 2022-07-11 15:48:26 source > role 'cloudsqlsuperuser' [superuser: false, replication: false, inherit: true, create role: true, create db: true, can log in: true] 2022-07-11 15:48:26 source > role 'pg_read_all_settings' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:48:26 source > role 'pg_stat_scan_tables' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:48:26 source > role 'pg_monitor' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:48:26 source > role 'airbyte_achilles' [superuser: false, replication: true, inherit: true, create role: true, create db: true, can log in: true] 2022-07-11 15:48:26 source > role 'pg_read_all_stats' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:48:26 source > role 'pg_signal_backend' [superuser: false, replication: false, inherit: true, create role: false, create 
db: false, can log in: false] 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.c.p.c.PostgresConnection(readReplicationSlotInfo):251 - Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{53/E1155F98}, catalogXmin=19514215] 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.c.p.PostgresConnectorTask(start):117 - No previous offset found 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.c.p.s.InitialSnapshotter(shouldSnapshot):34 - Taking initial snapshot for new datasource 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.u.Threads(threadFactory):270 - Requested thread factory for connector PostgresConnector, id = achilles named = change-event-source-coordinator 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.u.Threads$3(newThread):287 - Creating thread debezium-postgresconnector-achilles-change-event-source-coordinator 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.p.ChangeEventSourceCoordinator(lambda$start$0):103 - Metrics registered 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.p.ChangeEventSourceCoordinator(lambda$start$0):106 - Context created 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.c.p.s.InitialSnapshotter(shouldSnapshot):34 - Taking initial snapshot for new datasource 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.c.p.PostgresSnapshotChangeEventSource(getSnapshottingTask):64 - According to the connector configuration data will be snapshotted 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):87 - Snapshot step 1 - Preparing 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):96 - Snapshot step 2 - Determining captured tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.replication_set_table to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.bank_config to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.files_out to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.local_sync_status to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.partner_config to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.replication_set to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.node to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.schema_migrations to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table 
pglogical.local_node to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.depend to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.replication_set_seq to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.queue to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.subscription to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.transactions_out to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.node_interface to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.in_processing to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.transactions_in to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.sequence_state to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.files_in to the list of capture schema tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):103 - Snapshot step 3 - Locking captured tables [public.bank_config, public.files_in, public.files_out, public.partner_config, public.transactions_in, public.transactions_out] 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):109 - Snapshot step 4 - Determining snapshot offset 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.c.p.PostgresOffsetContext(initialContext):231 - Creating initial offset context 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.c.p.PostgresOffsetContext(initialContext):234 - Read xlogStart at 'LSN{55/2DC11ED8}' from transaction '20086033' 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.c.p.PostgresSnapshotChangeEventSource(updateOffsetForSnapshot):146 - Read xlogStart at 'LSN{55/2DC11ED8}' from transaction '20086033' 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):112 - Snapshot step 5 - Reading structure of captured tables 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.c.p.PostgresSnapshotChangeEventSource(readTableStructure):192 - Reading structure of schema 'public' of catalog 'achilles' 2022-07-11 15:48:26 destination > 2022-07-11 15:48:26 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], 
{datasetId=raw_achilles, tableId=_airbyte_tmp_geb_partner_config}} 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):116 - Snapshot step 6 - Persisting schema history 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):128 - Snapshot step 7 - Snapshotting data 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEvents):302 - Snapshotting contents of 6 tables while still in transaction 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.bank_config' (1 of 6 tables) 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.bank_config' using select statement: 'SELECT "bank_id", "name", "routing_no", "created", "updated", "config" FROM "public"."bank_config"' 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 3 records for table 'public.bank_config'; total duration '00:00:00.025' 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.files_in' (2 of 6 tables) 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.files_in' using select statement: 'SELECT "id", "preprocessing_path", "postprocessing_path", "file_name", "file_hash", "started", "ended", "std_entries_processed", "iat_entries_processed", "iat_entry_count", "std_entry_count", "total_entry_count", "total_batch_count", "total_debit_amount", "total_credit_amount", "updated" FROM "public"."files_in"' 2022-07-11 15:48:26 destination > 2022-07-11 15:48:26 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_geb_partner_config}} 2022-07-11 15:48:26 destination > 2022-07-11 15:48:26 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream partner_config (dataset raw_achilles): airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 36 records for table 'public.files_in'; total duration '00:00:00.063' 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.files_out' (3 of 6 tables) 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.files_out' using select statement: 'SELECT "id", "bank_id", "file_name", "batch_count", "file_hash", "created", "exchange_window", "updated" FROM "public"."files_out"' 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 34 records for table 'public.files_out'; total duration '00:00:00.049' 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - 
Exporting data from table 'public.partner_config' (4 of 6 tables) 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.partner_config' using select statement: 'SELECT "bank_id", "partner_id", "name", "account_prefix", "created", "updated", "config", "routing_no" FROM "public"."partner_config"' 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 206 records for table 'public.partner_config'; total duration '00:00:00.293' 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.transactions_in' (5 of 6 tables) 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.transactions_in' using select statement: 'SELECT "id", "file_name", "file_hash", "company_id", "company_name", "company_entry_description", "batch_type", "batch_number", "originating_dfi", "sec_code", "settlement_date", "entry_trace_no", "transaction_code", "receiving_dfi", "dfi_account_no", "individual_name", "individual_id_no", "addenda_record_count", "external_id", "bank_id", "partner_id", "amount", "effective_date", "returned", "processing_history", "created", "updated", "uuid", "return_data", "transaction_out_id", "foreign_exchange_indicator", "destination_country_code", "originator_id", "originating_currency_code", "destination_currency_code", "addenda_99", "addenda_98", "addenda_02", "addenda_05", "addenda_10", "addenda_11", "addenda_12", "addenda_13", "addenda_14", "addenda_15", "addenda_16", "addenda_17", "addenda_18", "future_dated" FROM "public"."transactions_in"' 2022-07-11 15:48:26 destination > 2022-07-11 15:48:26 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ does not exist in bucket; creating... 
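Editor's note: the snapshot entries around this point report the exact SELECT Debezium generated per table and how many rows it exported (3 for bank_config, 36 for files_in, 34 for files_out, 206 for partner_config, and so on). A minimal sketch, not part of the log, for re-running those counts against the same source to sanity-check the export totals; host, port, database and user are copied from the connector config logged above, the password is a placeholder.

```python
# Sketch: recount the snapshotted tables and compare with the
# "Finished exporting N records" lines in the log. Connection details mirror
# the logged config (host 10.58.160.3, db achilles, user airbyte_achilles);
# the password is a placeholder, the table list is table.include.list.
import psycopg2

TABLES = [
    "public.bank_config",
    "public.files_in",
    "public.files_out",
    "public.partner_config",
    "public.transactions_in",
    "public.transactions_out",
]

conn = psycopg2.connect(
    host="10.58.160.3", port=5432, dbname="achilles",
    user="airbyte_achilles", password="<redacted>",
)
with conn, conn.cursor() as cur:
    for table in TABLES:
        schema, name = table.split(".")
        # Double-quoted identifiers, matching the generated statements above.
        cur.execute(f'SELECT count(*) FROM "{schema}"."{name}"')
        print(table, cur.fetchone()[0])
conn.close()
```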
2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 26 records for table 'public.transactions_in'; total duration '00:00:00.052' 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.transactions_out' (6 of 6 tables) 2022-07-11 15:48:26 source > 2022-07-11 15:48:26 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.transactions_out' using select statement: 'SELECT "id", "file_id", "external_id", "bank_id", "partner_id", "trace_no", "account_no", "account_name", "amount", "source_account_no", "source_account_name", "description", "effective_date", "destination_bank_routing_no", "return_data", "reference_info", "transaction_code", "created", "updated", "transaction_in_id", "uuid", "data", "status", "is_same_day" FROM "public"."transactions_out"' 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 113 records for table 'public.transactions_out'; total duration '00:00:00.137' 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.p.s.AbstractSnapshotChangeEventSource(execute):88 - Snapshot - Final stage 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.p.ChangeEventSourceCoordinator(doSnapshot):156 - Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.postgresql.Source:STRUCT}, sourceInfo=source_info[server='achilles'db='achilles', lsn=LSN{55/2DC11ED8}, txId=20086033, timestamp=2022-07-11T15:48:27.069Z, snapshot=FALSE, schema=public, table=transactions_out], lastSnapshotRecord=true, lastCompletelyProcessedLsn=null, lastCommitLsn=null, streamingStoppingLsn=null, transactionContext=TransactionContext [currentTransactionId=null, perTableEventCount={}, totalEventCount=0], incrementalSnapshotContext=IncrementalSnapshotContext [windowOpened=false, chunkEndPosition=null, dataCollectionsToSnapshot=[], lastEventKeySent=null, maximumKey=null]]] 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.p.ChangeEventSourceCoordinator(streamingConnected):234 - Connected metrics set to 'true' 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.bank_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.partner_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 
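Editor's note: the repeated REPLICA IDENTITY messages here mean UPDATE and DELETE change events will carry only primary-key columns for the old row image. If full before-images are wanted downstream, the identity can be widened per table; a sketch (placeholder credentials, schema qualification omitted for brevity) that inspects the current setting and shows the optional ALTER.

```python
# Sketch: inspect (and optionally widen) REPLICA IDENTITY for the captured
# tables. relreplident codes: 'd' = default (PK only), 'f' = full row,
# 'i' = index, 'n' = nothing. Credentials are placeholders.
import psycopg2

conn = psycopg2.connect(host="10.58.160.3", dbname="achilles",
                        user="airbyte_achilles", password="<redacted>")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT relname, relreplident
        FROM pg_class
        WHERE relkind = 'r'
          AND relname IN ('bank_config', 'files_in', 'files_out',
                          'partner_config', 'transactions_in', 'transactions_out')
    """)
    for relname, ident in cur.fetchall():
        print(relname, ident)
    # Uncomment to include full before-images in UPDATE/DELETE events,
    # at the cost of larger WAL records:
    # cur.execute("ALTER TABLE public.bank_config REPLICA IDENTITY FULL")
conn.close()
```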
2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 destination > 2022-07-11 15:48:27 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ has been created in bucket. 2022-07-11 15:48:27 destination > 2022-07-11 15:48:27 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xha_bank_config}} 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.p.ChangeEventSourceCoordinator(streamEvents):173 - Starting streaming 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresStreamingChangeEventSource(execute):127 - Retrieved latest position from stored offset 'LSN{55/2DC11ED8}' 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.c.WalPositionLocator():40 - Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{55/2DC11ED8}' 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.c.PostgresConnection(readReplicationSlotInfo):251 - Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{53/E1155F98}, catalogXmin=19514215] 2022-07-11 15:48:27 destination > 2022-07-11 15:48:27 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xha_bank_config}} 2022-07-11 15:48:27 destination > 2022-07-11 15:48:27 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream bank_config (dataset raw_achilles): airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.e.EmbeddedEngine(stop):1047 - Stopping the embedded engine 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.e.EmbeddedEngine(stop):1055 - Waiting for PT5M for connector to stop 2022-07-11 15:48:27 destination > 2022-07-11 15:48:27 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ does not exist in bucket; creating... 
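Editor's note: readReplicationSlotInfo above reports the slot as inactive with latestFlushedLsn LSN{53/E1155F98}, which lags the snapshot offset LSN{55/2DC11ED8}. A sketch for checking the slot directly; the plugin (wal2json) and database (achilles) filters come from the logged connector config, credentials are placeholders, and the exact slot name can be added if known.

```python
# Sketch: inspect the logical replication slot used for CDC and how far it
# lags the current WAL position. Plugin and database names are taken from the
# logged config; add "AND slot_name = '<your slot>'" if needed.
import psycopg2

conn = psycopg2.connect(host="10.58.160.3", dbname="achilles",
                        user="airbyte_achilles", password="<redacted>")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT slot_name, plugin, active, restart_lsn, confirmed_flush_lsn,
               pg_wal_lsn_diff(pg_current_wal_lsn(), confirmed_flush_lsn) AS lag_bytes
        FROM pg_replication_slots
        WHERE plugin = 'wal2json'
          AND database = 'achilles'
    """)
    for row in cur.fetchall():
        print(row)
conn.close()
```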
2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.u.Threads(threadFactory):270 - Requested thread factory for connector PostgresConnector, id = achilles named = keep-alive 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.u.Threads$3(newThread):287 - Creating thread debezium-postgresconnector-achilles-keep-alive 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.bank_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.partner_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.p.PostgresStreamingChangeEventSource(searchWalPosition):314 - Searching for WAL resume position 2022-07-11 15:48:27 destination > 2022-07-11 15:48:27 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ has been created in bucket. 2022-07-11 15:48:27 destination > 2022-07-11 15:48:27 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_huf_files_in}} 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.e.EmbeddedEngine(run):846 - Stopping the task and engine 2022-07-11 15:48:27 source > 2022-07-11 15:48:27 INFO i.d.c.c.BaseSourceTask(stop):238 - Stopping down connector 2022-07-11 15:48:27 destination > 2022-07-11 15:48:27 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_huf_files_in}} 2022-07-11 15:48:27 destination > 2022-07-11 15:48:27 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream files_in (dataset raw_achilles): airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ does not exist in bucket; creating... 
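Editor's note: the destination stages each stream under a prefix of the form airbyte/raw_achilles_&lt;stream&gt;/YYYY/MM/DD/HH/&lt;sync-uuid&gt;/ in the synctera-data-staging bucket, as the createStageIfNotExists entries show. A sketch, assuming the google-cloud-storage client and Application Default Credentials, for listing what actually landed under one of those prefixes.

```python
# Sketch: list staged objects under one staging prefix created above.
# Bucket name and prefix layout are copied from the log; authentication is
# assumed to come from Application Default Credentials.
from google.cloud import storage

client = storage.Client()
prefix = ("airbyte/raw_achilles_files_in/2022/07/11/15/"
          "9afd5930-86b5-40f9-85b2-1934e0a22ee0/")
for blob in client.list_blobs("synctera-data-staging", prefix=prefix):
    print(blob.name, blob.size)
```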
2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.d.c.p.PostgresStreamingChangeEventSource(searchWalPosition):335 - WAL resume position 'null' discovered 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ has been created in bucket. 2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_cil_transactions_in}} 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.d.u.Threads(threadFactory):270 - Requested thread factory for connector PostgresConnector, id = achilles named = keep-alive 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.d.u.Threads$3(newThread):287 - Creating thread debezium-postgresconnector-achilles-keep-alive 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.d.c.p.PostgresStreamingChangeEventSource(processMessages):202 - Processing messages 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.d.p.ChangeEventSourceCoordinator(streamEvents):175 - Finished streaming 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.d.p.ChangeEventSourceCoordinator(streamingConnected):234 - Connected metrics set to 'false' 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.a.i.d.i.DebeziumRecordPublisher(lambda$start$1):85 - Debezium engine shutdown. 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.a.i.s.p.PostgresCdcStateHandler(saveState):32 - debezium state: {"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839851224,\"txId\":20086033,\"ts_usec\":1657554507069000,\"snapshot\":true}"} 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):139 - Closing database connection pool. 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO c.z.h.HikariDataSource(close):350 - HikariPool-2 - Shutdown initiated... 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO c.z.h.HikariDataSource(close):352 - HikariPool-2 - Shutdown completed. 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):141 - Closed database connection pool. 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:48:28 source > 2022-07-11 15:48:28 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):87 - Completed source: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:48:28 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):327 - Source has no more messages, closing connection. 
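Editor's note: the saved debezium state above records the commit point as an integer ("lsn":365839851224), which is just the 64-bit form of the hi/lo LSN printed earlier (LSN{55/2DC11ED8}). A short, self-contained sketch for converting between the two notations, useful when comparing the persisted Airbyte state against pg_replication_slots.

```python
# Sketch: convert the integer LSN stored in the Debezium state back to
# PostgreSQL's hi/lo notation, and the reverse.
def lsn_int_to_str(lsn: int) -> str:
    return f"{lsn >> 32:X}/{lsn & 0xFFFFFFFF:X}"

def lsn_str_to_int(lsn: str) -> int:
    hi, lo = lsn.split("/")
    return (int(hi, 16) << 32) | int(lo, 16)

print(lsn_int_to_str(365839851224))   # -> 55/2DC11ED8, the snapshot offset above
print(lsn_str_to_int("53/E1155F98"))  # latestFlushedLsn of the replication slot
```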
2022-07-11 15:48:28 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):335 - Total records read: 419 (288 KB) 2022-07-11 15:48:28 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publictransactions_out. Error messages: [$.file_id is of an incorrect type. Expected it to be number, $.transaction_in_id is of an incorrect type. Expected it to be string, $.return_data is of an incorrect type. Expected it to be string, $._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:48:28 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicfiles_in. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:48:28 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publictransactions_in. Error messages: [$.destination_country_code is of an incorrect type. Expected it to be string, $.addenda_11 is of an incorrect type. Expected it to be string, $.destination_currency_code is of an incorrect type. Expected it to be string, $.addenda_99 is of an incorrect type. Expected it to be string, $.foreign_exchange_indicator is of an incorrect type. Expected it to be string, $.addenda_12 is of an incorrect type. Expected it to be string, $.addenda_05 is of an incorrect type. Expected it to be string, $.addenda_15 is of an incorrect type. Expected it to be string, $.originator_id is of an incorrect type. Expected it to be string, $.addenda_10 is of an incorrect type. Expected it to be string, $.addenda_02 is of an incorrect type. Expected it to be string, $.addenda_18 is of an incorrect type. Expected it to be string, $.addenda_98 is of an incorrect type. Expected it to be string, $.individual_id_no is of an incorrect type. Expected it to be string, $.addenda_13 is of an incorrect type. Expected it to be string, $.addenda_record_count is of an incorrect type. Expected it to be string, $.transaction_out_id is of an incorrect type. Expected it to be string, $.addenda_16 is of an incorrect type. Expected it to be string, $.addenda_17 is of an incorrect type. Expected it to be string, $.return_data is of an incorrect type. Expected it to be string, $.originating_currency_code is of an incorrect type. Expected it to be string, $.future_dated is of an incorrect type. Expected it to be boolean, $._ab_cdc_deleted_at is of an incorrect type. Expected it to be string, $.addenda_14 is of an incorrect type. Expected it to be string] 2022-07-11 15:48:28 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicfiles_out. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:48:28 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicpartner_config. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:48:28 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicbank_config. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:48:28 INFO i.a.w.g.DefaultReplicationWorker(run):174 - One of source or destination thread complete. Waiting on the other. 
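Editor's note: the schema-validation warnings above are non-fatal and most likely come from null values (for example _ab_cdc_deleted_at on rows that were never deleted) being checked against a declared type that does not include "null". A minimal sketch with the jsonschema package reproducing that mismatch; the field name mirrors the warnings, but both schemas are illustrative, not the connector's actual catalog.

```python
# Sketch: reproduce the "is of an incorrect type. Expected it to be string"
# warning. A null value fails a plain "string" type but passes once "null"
# is allowed. Schemas here are illustrative, not the real Airbyte catalog.
from jsonschema import Draft7Validator

record = {"_ab_cdc_deleted_at": None}   # null for rows that were not deleted

strict = {"type": "object",
          "properties": {"_ab_cdc_deleted_at": {"type": "string"}}}
nullable = {"type": "object",
            "properties": {"_ab_cdc_deleted_at": {"type": ["null", "string"]}}}

print([e.message for e in Draft7Validator(strict).iter_errors(record)])
# e.g. ["None is not of type 'string'"]
print([e.message for e in Draft7Validator(nullable).iter_errors(record)])
# []
```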
2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_cil_transactions_in}} 2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream transactions_in (dataset raw_achilles): airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ does not exist in bucket; creating... 2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ has been created in bucket. 2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kol_files_out}} 2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kol_files_out}} 2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream files_out (dataset raw_achilles): airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:28 destination > 2022-07-11 15:48:28 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ does not exist in bucket; creating... 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ has been created in bucket. 
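Editor's note: each stream gets a partitioned _airbyte_tmp_* table that the staged Avro is later appended into; the three-column raw schema (_airbyte_ab_id, _airbyte_emitted_at, _airbyte_data) is echoed in the load-job configuration further down. A sketch with google-cloud-bigquery that builds an equivalent table; the project and dataset are taken from the log, the table name is a placeholder, and partitioning on _airbyte_emitted_at is an assumption since the log only says "Partitioned table created".

```python
# Sketch: create a raw table equivalent to the _airbyte_tmp_* tables above.
# Project and dataset come from the log; the table name is a placeholder and
# the partition column (_airbyte_emitted_at) is an assumption.
from google.cloud import bigquery

client = bigquery.Client(project="mainapi-282501")

table = bigquery.Table(
    "mainapi-282501.raw_achilles._airbyte_tmp_example_stream",
    schema=[
        bigquery.SchemaField("_airbyte_ab_id", "STRING"),
        bigquery.SchemaField("_airbyte_emitted_at", "TIMESTAMP"),
        bigquery.SchemaField("_airbyte_data", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="_airbyte_emitted_at"
)
client.create_table(table, exists_ok=True)
```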
2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kdw_transactions_out}} 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kdw_transactions_out}} 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream transactions_out (dataset raw_achilles): airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ does not exist in bucket; creating... 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ has been created in bucket. 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$3):107 - Preparing tmp tables in destination completed. 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream bank_config (current state: 0 bytes in 0 buffers) 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream files_in (current state: 0 bytes in 1 buffers) 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream files_out (current state: 0 bytes in 2 buffers) 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream partner_config (current state: 0 bytes in 3 buffers) 2022-07-11 15:48:29 destination > 2022-07-11 15:48:29 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream transactions_in (current state: 62 KB in 4 buffers) 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream transactions_out (current state: 125 KB in 5 buffers) 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 
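Editor's note: each stream buffer is serialized to a local .avro file and then, as the flush and StreamTransferManager entries that follow show, pushed to the synctera-data-staging bucket with an S3-style multipart upload. A rough sketch of that step using boto3 against GCS's S3-interoperability endpoint; the endpoint and HMAC credentials are assumptions, the bucket and key layout mirror the log, and boto3 switches to multipart automatically above its size threshold.

```python
# Sketch: upload one buffered Avro file to a staging prefix the way an
# S3-style multipart client would. Endpoint and HMAC credentials are
# assumptions; bucket and key layout are copied from the log.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",  # GCS S3-interop endpoint (assumed)
    aws_access_key_id="<hmac-access-id>",
    aws_secret_access_key="<hmac-secret>",
)
s3.upload_file(
    Filename="/tmp/buffer.avro",  # local buffer file, placeholder name
    Bucket="synctera-data-staging",
    Key="airbyte/raw_achilles_files_out/2022/07/11/15/"
        "9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro",
)
```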
2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded. 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.d.b.BufferedStreamConsumer(close):171 - executing on success close procedure. 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.d.r.SerializedBufferingStrategy(flushAll):103 - Flushing all 6 current buffers (188 KB in total) 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream partner_config (62 KB) 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream partner_config (62 KB) to staging 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to ed9fd2a4-68b8-4227-9cc6-a6667a85af109571263729087191514.avro (111 KB) 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream partner_config (dataset raw_achilles): airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with full ID ABPnzm5oN8UDQcD1-2CFjBlBjx6oRtMIvUgviryCsVMMjJTKJkzGpfoZgk4oHIJ1LdvckSc 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm5oN...J1LdvckSc]: Uploading leftover stream [Part number 1 containing 0.11 MB] 2022-07-11 15:48:30 destination > 2022-07-11 15:48:30 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm5oN...J1LdvckSc]: Finished uploading [Part number 1 containing 0.11 MB] 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm5oN...J1LdvckSc]: Completed 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: ed9fd2a4-68b8-4227-9cc6-a6667a85af109571263729087191514.avro -> airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro (filename: 1.avro) 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data ed9fd2a4-68b8-4227-9cc6-a6667a85af109571263729087191514.avro 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer 
of stream files_in (325 bytes) 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream files_in (325 bytes) to staging 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to 20925192-caf4-4392-8f30-b370c57118cf14394547386503295326.avro (23 KB) 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream files_in (dataset raw_achilles): airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with full ID ABPnzm6CzwYIm2zN-2FxNtuRHktnVzhWiCRKkFqPwYQe2ur3xm5UlrlDvxpgWDllEREec_o 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm6Cz...llEREec_o]: Uploading leftover stream [Part number 1 containing 0.02 MB] 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm6Cz...llEREec_o]: Finished uploading [Part number 1 containing 0.02 MB] 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm6Cz...llEREec_o]: Completed 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: 20925192-caf4-4392-8f30-b370c57118cf14394547386503295326.avro -> airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro (filename: 1.avro) 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data 20925192-caf4-4392-8f30-b370c57118cf14394547386503295326.avro 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream bank_config (328 bytes) 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream bank_config (328 bytes) to staging 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to bb4c118c-5cf9-4fea-8f64-67f8ef59ad057120792287449995175.avro (2 KB) 2022-07-11 15:48:31 destination > 2022-07-11 15:48:31 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream bank_config (dataset raw_achilles): 
airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with full ID ABPnzm4in60Ug2jwOnEjod25WY49dpaesmxS9AZ8SLVN-mvXBfMRikMBUI_OMrN_7P9Hshg 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm4in...N_7P9Hshg]: Uploading leftover stream [Part number 1 containing 0.00 MB] 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm4in...N_7P9Hshg]: Finished uploading [Part number 1 containing 0.00 MB] 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm4in...N_7P9Hshg]: Completed 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: bb4c118c-5cf9-4fea-8f64-67f8ef59ad057120792287449995175.avro -> airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro (filename: 1.avro) 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data bb4c118c-5cf9-4fea-8f64-67f8ef59ad057120792287449995175.avro 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream transactions_in (62 KB) 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream transactions_in (62 KB) to staging 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to a6873292-a031-409c-9230-28fb4e0136cb11955991491072591562.avro (62 KB) 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream transactions_in (dataset raw_achilles): airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with full ID ABPnzm72MhwFQURU5jA2FXvQiM66pwL3IOjG-XpNYr-tfHyWbmXp1EyVET6SkQw330lVWiQ 2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 
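Editor's note: once a stream's staged files are in place, the destination runs one BigQuery load job per tmp table; the LoadJobConfiguration echoed further down shows AVRO format, WRITE_APPEND and useAvroLogicalTypes=true. A sketch of an equivalent load with google-cloud-bigquery, reusing a gs:// URI and table name that appear later in the log; this is an illustration, not the connector's own code.

```python
# Sketch: load one staged Avro file into its tmp table, mirroring the
# LoadJobConfiguration echoed below (AVRO, WRITE_APPEND, Avro logical types).
# Project, dataset, table and URI are copied from the log.
from google.cloud import bigquery

client = bigquery.Client(project="mainapi-282501")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.AVRO,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    use_avro_logical_types=True,
)
uri = ("gs://synctera-data-staging/airbyte/raw_achilles_partner_config/"
       "2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro")
load_job = client.load_table_from_uri(
    uri,
    "mainapi-282501.raw_achilles._airbyte_tmp_geb_partner_config",
    job_config=job_config,
)
load_job.result()  # block until the job completes, as waitForJobFinish does below
```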
2022-07-11 15:48:32 destination > 2022-07-11 15:48:32 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm72M...w330lVWiQ]: Uploading leftover stream [Part number 1 containing 0.06 MB] 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm72M...w330lVWiQ]: Finished uploading [Part number 1 containing 0.06 MB] 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm72M...w330lVWiQ]: Completed 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: a6873292-a031-409c-9230-28fb4e0136cb11955991491072591562.avro -> airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro (filename: 1.avro) 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data a6873292-a031-409c-9230-28fb4e0136cb11955991491072591562.avro 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream transactions_out (63 KB) 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream transactions_out (63 KB) to staging 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to 9b874bec-501b-4f97-b91b-1d8711625c153832140059545158227.avro (93 KB) 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream transactions_out (dataset raw_achilles): airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with full ID ABPnzm6McIzAQc3JGH3ysgkXfgJ9YbaLQeeIetQ6B5CUO9-D0R80gVhFA3hmoT5LDTHSgBs 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm6Mc...5LDTHSgBs]: Uploading leftover stream [Part number 1 containing 0.09 MB] 2022-07-11 15:48:33 destination > 2022-07-11 15:48:33 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to 
synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm6Mc...5LDTHSgBs]: Finished uploading [Part number 1 containing 0.09 MB] 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm6Mc...5LDTHSgBs]: Completed 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: 9b874bec-501b-4f97-b91b-1d8711625c153832140059545158227.avro -> airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro (filename: 1.avro) 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data 9b874bec-501b-4f97-b91b-1d8711625c153832140059545158227.avro 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream files_out (326 bytes) 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream files_out (326 bytes) to staging 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to fa34c1a6-f8ef-4ab8-bd43-c7796249f80a1717092182851751482.avro (14 KB) 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream files_out (dataset raw_achilles): airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with full ID ABPnzm68t8JJgbmSxV_ErAjLKej6qUumiG6_GYljXuEP2QdLpn4IB2AWbHPSEFuGgHXa47I 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm68t...uGgHXa47I]: Uploading leftover stream [Part number 1 containing 0.01 MB] 2022-07-11 15:48:34 destination > 2022-07-11 15:48:34 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm68t...uGgHXa47I]: Finished uploading [Part number 1 containing 0.01 MB] 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro with id ABPnzm68t...uGgHXa47I]: Completed 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to 
storage: fa34c1a6-f8ef-4ab8-bd43-c7796249f80a1717092182851751482.avro -> airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro (filename: 1.avro) 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data fa34c1a6-f8ef-4ab8-bd43-c7796249f80a1717092182851751482.avro 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream partner_config 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream files_in 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream bank_config 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream transactions_in 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream transactions_out 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream files_out 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):137 - Copying into tables in destination started for 6 streams 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_geb_partner_config}} (dataset raw_achilles): [1.avro] 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=f16a92bf-d375-4a87-8156-66a9424836df, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_geb_partner_config}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=f16a92bf-d375-4a87-8156-66a9424836df, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554515214, endTime=null, startTime=1657554515340, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=Gz98DM3shQPtBKvjYT7/wA==, generatedId=mainapi-282501:US.f16a92bf-d375-4a87-8156-66a9424836df, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/f16a92bf-d375-4a87-8156-66a9424836df?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_geb_partner_config}}, decimalTargetTypes=null, 
destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:48:35 destination > 2022-07-11 15:48:35 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=f16a92bf-d375-4a87-8156-66a9424836df, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554515214, endTime=null, startTime=1657554515340, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=Gz98DM3shQPtBKvjYT7/wA==, generatedId=mainapi-282501:US.f16a92bf-d375-4a87-8156-66a9424836df, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/f16a92bf-d375-4a87-8156-66a9424836df?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_geb_partner_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. 
Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:48:40 destination > 2022-07-11 15:48:40 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=f16a92bf-d375-4a87-8156-66a9424836df, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554515214, endTime=null, startTime=1657554515340, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=Gz98DM3shQPtBKvjYT7/wA==, generatedId=mainapi-282501:US.f16a92bf-d375-4a87-8156-66a9424836df, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/f16a92bf-d375-4a87-8156-66a9424836df?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_geb_partner_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:48:40 destination > 2022-07-11 15:48:40 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=f16a92bf-d375-4a87-8156-66a9424836df, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_geb_partner_config}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:48:40 destination > 2022-07-11 15:48:40 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_geb_partner_config}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_partner_config}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:48:44 destination > 2022-07-11 15:48:44 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_geb_partner_config}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_partner_config}} 2022-07-11 15:48:44 destination > 2022-07-11 15:48:44 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, 
tableId=_airbyte_tmp_xha_bank_config}} (dataset raw_achilles): [1.avro] 2022-07-11 15:48:44 destination > 2022-07-11 15:48:44 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:48:44 destination > 2022-07-11 15:48:44 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=ee9f2681-8364-40d2-91a8-f2b4e32a64c7, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xha_bank_config}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=ee9f2681-8364-40d2-91a8-f2b4e32a64c7, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554524239, endTime=null, startTime=1657554524556, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=jV1/1LpIz2LjJXy/2YG4cQ==, generatedId=mainapi-282501:US.ee9f2681-8364-40d2-91a8-f2b4e32a64c7, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/ee9f2681-8364-40d2-91a8-f2b4e32a64c7?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_xha_bank_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:48:44 destination > 2022-07-11 15:48:44 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=ee9f2681-8364-40d2-91a8-f2b4e32a64c7, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554524239, endTime=null, startTime=1657554524556, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=jV1/1LpIz2LjJXy/2YG4cQ==, generatedId=mainapi-282501:US.ee9f2681-8364-40d2-91a8-f2b4e32a64c7, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/ee9f2681-8364-40d2-91a8-f2b4e32a64c7?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, 
tableId=_airbyte_tmp_xha_bank_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:48:47 destination > 2022-07-11 15:48:47 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=ee9f2681-8364-40d2-91a8-f2b4e32a64c7, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554524239, endTime=null, startTime=1657554524556, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=jV1/1LpIz2LjJXy/2YG4cQ==, generatedId=mainapi-282501:US.ee9f2681-8364-40d2-91a8-f2b4e32a64c7, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/ee9f2681-8364-40d2-91a8-f2b4e32a64c7?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_xha_bank_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:48:47 destination > 2022-07-11 15:48:47 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=ee9f2681-8364-40d2-91a8-f2b4e32a64c7, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xha_bank_config}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:48:47 destination > 2022-07-11 15:48:47 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, 
tableId=_airbyte_tmp_xha_bank_config}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_bank_config}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:48:49 destination > 2022-07-11 15:48:49 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xha_bank_config}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_bank_config}} 2022-07-11 15:48:49 destination > 2022-07-11 15:48:49 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_huf_files_in}} (dataset raw_achilles): [1.avro] 2022-07-11 15:48:49 destination > 2022-07-11 15:48:49 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:48:50 destination > 2022-07-11 15:48:50 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=065ec457-d404-4ccb-aa11-644985a91b25, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_huf_files_in}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=065ec457-d404-4ccb-aa11-644985a91b25, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554529784, endTime=null, startTime=1657554530102, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=+WKKWVicBbbY8SSsDzZ9Ug==, generatedId=mainapi-282501:US.065ec457-d404-4ccb-aa11-644985a91b25, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/065ec457-d404-4ccb-aa11-644985a91b25?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_huf_files_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:48:50 destination > 2022-07-11 15:48:50 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, 
job=065ec457-d404-4ccb-aa11-644985a91b25, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554529784, endTime=null, startTime=1657554530102, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=+WKKWVicBbbY8SSsDzZ9Ug==, generatedId=mainapi-282501:US.065ec457-d404-4ccb-aa11-644985a91b25, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/065ec457-d404-4ccb-aa11-644985a91b25?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_huf_files_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:48:53 destination > 2022-07-11 15:48:53 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=065ec457-d404-4ccb-aa11-644985a91b25, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554529784, endTime=null, startTime=1657554530102, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=+WKKWVicBbbY8SSsDzZ9Ug==, generatedId=mainapi-282501:US.065ec457-d404-4ccb-aa11-644985a91b25, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/065ec457-d404-4ccb-aa11-644985a91b25?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_huf_files_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, 
useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:48:53 destination > 2022-07-11 15:48:53 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=065ec457-d404-4ccb-aa11-644985a91b25, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_huf_files_in}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:48:53 destination > 2022-07-11 15:48:53 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_huf_files_in}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_in}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:48:57 destination > 2022-07-11 15:48:57 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_huf_files_in}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_in}} 2022-07-11 15:48:57 destination > 2022-07-11 15:48:57 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_cil_transactions_in}} (dataset raw_achilles): [1.avro] 2022-07-11 15:48:57 destination > 2022-07-11 15:48:57 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:48:58 destination > 2022-07-11 15:48:58 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=41737e65-56d2-4dcf-ade7-9f345623e04e, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_cil_transactions_in}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=41737e65-56d2-4dcf-ade7-9f345623e04e, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554538051, endTime=null, startTime=1657554538488, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=coebKBG2tpXOdZcvWsGIoQ==, generatedId=mainapi-282501:US.41737e65-56d2-4dcf-ade7-9f345623e04e, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/41737e65-56d2-4dcf-ade7-9f345623e04e?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_cil_transactions_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, 
maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:48:58 destination > 2022-07-11 15:48:58 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=41737e65-56d2-4dcf-ade7-9f345623e04e, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554538051, endTime=null, startTime=1657554538488, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=coebKBG2tpXOdZcvWsGIoQ==, generatedId=mainapi-282501:US.41737e65-56d2-4dcf-ade7-9f345623e04e, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/41737e65-56d2-4dcf-ade7-9f345623e04e?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_cil_transactions_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. 
Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:49:01 destination > 2022-07-11 15:49:01 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=41737e65-56d2-4dcf-ade7-9f345623e04e, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554538051, endTime=null, startTime=1657554538488, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=coebKBG2tpXOdZcvWsGIoQ==, generatedId=mainapi-282501:US.41737e65-56d2-4dcf-ade7-9f345623e04e, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/41737e65-56d2-4dcf-ade7-9f345623e04e?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_cil_transactions_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:49:01 destination > 2022-07-11 15:49:01 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=41737e65-56d2-4dcf-ade7-9f345623e04e, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_cil_transactions_in}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:49:01 destination > 2022-07-11 15:49:01 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_cil_transactions_in}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_in}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:49:06 destination > 2022-07-11 15:49:06 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_cil_transactions_in}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_in}} 2022-07-11 15:49:06 destination > 2022-07-11 15:49:06 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, 
tableId=_airbyte_tmp_kol_files_out}} (dataset raw_achilles): [1.avro] 2022-07-11 15:49:06 destination > 2022-07-11 15:49:06 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:49:07 destination > 2022-07-11 15:49:07 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=6300e869-1951-4562-8185-834ee598376b, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kol_files_out}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=6300e869-1951-4562-8185-834ee598376b, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554546624, endTime=null, startTime=1657554547061, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=19bcdhcu49eDCOikZY3hCQ==, generatedId=mainapi-282501:US.6300e869-1951-4562-8185-834ee598376b, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/6300e869-1951-4562-8185-834ee598376b?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_kol_files_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:49:07 destination > 2022-07-11 15:49:07 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=6300e869-1951-4562-8185-834ee598376b, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554546624, endTime=null, startTime=1657554547061, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=19bcdhcu49eDCOikZY3hCQ==, generatedId=mainapi-282501:US.6300e869-1951-4562-8185-834ee598376b, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/6300e869-1951-4562-8185-834ee598376b?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_kol_files_out}}, 
decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:49:10 destination > 2022-07-11 15:49:10 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=6300e869-1951-4562-8185-834ee598376b, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554546624, endTime=null, startTime=1657554547061, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=19bcdhcu49eDCOikZY3hCQ==, generatedId=mainapi-282501:US.6300e869-1951-4562-8185-834ee598376b, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/6300e869-1951-4562-8185-834ee598376b?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_kol_files_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:49:10 destination > 2022-07-11 15:49:10 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=6300e869-1951-4562-8185-834ee598376b, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kol_files_out}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:49:10 destination > 2022-07-11 15:49:10 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kol_files_out}} to target table 
GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_out}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:49:15 destination > 2022-07-11 15:49:15 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kol_files_out}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_out}} 2022-07-11 15:49:15 destination > 2022-07-11 15:49:15 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kdw_transactions_out}} (dataset raw_achilles): [1.avro] 2022-07-11 15:49:15 destination > 2022-07-11 15:49:15 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:49:15 destination > 2022-07-11 15:49:15 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=682e31c6-2bdd-422c-aa5c-b09bd25c89c5, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kdw_transactions_out}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=682e31c6-2bdd-422c-aa5c-b09bd25c89c5, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554555230, endTime=null, startTime=1657554555414, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=CsVVaQIUvPQ9v+KSkVBOyw==, generatedId=mainapi-282501:US.682e31c6-2bdd-422c-aa5c-b09bd25c89c5, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/682e31c6-2bdd-422c-aa5c-b09bd25c89c5?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_kdw_transactions_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:49:15 destination > 2022-07-11 15:49:15 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=682e31c6-2bdd-422c-aa5c-b09bd25c89c5, 
location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554555230, endTime=null, startTime=1657554555414, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=CsVVaQIUvPQ9v+KSkVBOyw==, generatedId=mainapi-282501:US.682e31c6-2bdd-422c-aa5c-b09bd25c89c5, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/682e31c6-2bdd-422c-aa5c-b09bd25c89c5?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_kdw_transactions_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:49:17 destination > 2022-07-11 15:49:17 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=682e31c6-2bdd-422c-aa5c-b09bd25c89c5, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554555230, endTime=null, startTime=1657554555414, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=CsVVaQIUvPQ9v+KSkVBOyw==, generatedId=mainapi-282501:US.682e31c6-2bdd-422c-aa5c-b09bd25c89c5, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/682e31c6-2bdd-422c-aa5c-b09bd25c89c5?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_kdw_transactions_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, 
useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:49:17 destination > 2022-07-11 15:49:17 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=682e31c6-2bdd-422c-aa5c-b09bd25c89c5, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kdw_transactions_out}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:49:17 destination > 2022-07-11 15:49:17 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kdw_transactions_out}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_out}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:49:21 destination > 2022-07-11 15:49:21 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kdw_transactions_out}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_out}} 2022-07-11 15:49:21 destination > 2022-07-11 15:49:21 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):156 - Finalizing tables in destination completed 2022-07-11 15:49:21 destination > 2022-07-11 15:49:21 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):159 - Cleaning up destination started for 6 streams 2022-07-11 15:49:21 destination > 2022-07-11 15:49:21 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_geb_partner_config}} (dataset raw_achilles) 2022-07-11 15:49:21 destination > 2022-07-11 15:49:21 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream partner_config (dataset raw_achilles): airbyte/raw_achilles_partner_config 2022-07-11 15:49:21 destination > 2022-07-11 15:49:21 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:49:21 destination > 2022-07-11 15:49:21 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_partner_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:49:22 destination > 2022-07-11 15:49:22 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_partner_config has been cleaned-up (2 objects were deleted)... 
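Note: the staging pattern in the load entries above repeats per stream: a single Avro file is staged on GCS, loaded into an _airbyte_tmp_* table by a BigQuery load job (format AVRO, WRITE_APPEND, useAvroLogicalTypes=true), and the tmp table is then copied into the matching _airbyte_raw_* table. The sketch below reproduces that load-then-copy flow with the google-cloud-bigquery Python client purely for illustration; it is not Airbyte's actual implementation (the Java code in io.airbyte.integrations.destination.bigquery), and it reuses the partner_config names from the log.

```python
from google.cloud import bigquery

# Names taken from the log entries above (illustrative sketch, not Airbyte's code).
PROJECT = "mainapi-282501"
DATASET = "raw_achilles"
STAGED_URI = (
    "gs://synctera-data-staging/airbyte/raw_achilles_partner_config/"
    "2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro"
)

client = bigquery.Client(project=PROJECT)

# 1. Load the staged Avro file into the tmp table, mirroring the
#    LoadJobConfiguration printed in the log (AVRO + WRITE_APPEND + logical types).
load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.AVRO,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    use_avro_logical_types=True,
)
load_job = client.load_table_from_uri(
    STAGED_URI,
    f"{PROJECT}.{DATASET}._airbyte_tmp_geb_partner_config",
    job_config=load_config,
)
load_job.result()  # block until the load job is DONE (cf. BigQueryUtils.waitForJobFinish)

# 2. Copy the tmp table into the raw table, as in copyIntoTargetTable.
copy_config = bigquery.CopyJobConfig(
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
copy_job = client.copy_table(
    f"{PROJECT}.{DATASET}._airbyte_tmp_geb_partner_config",
    f"{PROJECT}.{DATASET}._airbyte_raw_partner_config",
    job_config=copy_config,
)
copy_job.result()
```

The "Job finish ... with status JobStatus{state=RUNNING}" lines above are not errors: the printed status appears to be the snapshot captured when the job object was created, and the subsequent copy and cleanup steps confirm the loads completed.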
2022-07-11 15:49:22 destination > 2022-07-11 15:49:22 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xha_bank_config}} (dataset raw_achilles) 2022-07-11 15:49:22 destination > 2022-07-11 15:49:22 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream bank_config (dataset raw_achilles): airbyte/raw_achilles_bank_config 2022-07-11 15:49:22 destination > 2022-07-11 15:49:22 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:49:22 destination > 2022-07-11 15:49:22 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_bank_config/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:49:22 destination > 2022-07-11 15:49:22 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_bank_config has been cleaned-up (2 objects were deleted)... 2022-07-11 15:49:22 destination > 2022-07-11 15:49:22 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_huf_files_in}} (dataset raw_achilles) 2022-07-11 15:49:22 destination > 2022-07-11 15:49:22 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream files_in (dataset raw_achilles): airbyte/raw_achilles_files_in 2022-07-11 15:49:22 destination > 2022-07-11 15:49:22 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:49:22 destination > 2022-07-11 15:49:22 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:49:23 destination > 2022-07-11 15:49:23 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_files_in has been cleaned-up (2 objects were deleted)... 2022-07-11 15:49:23 destination > 2022-07-11 15:49:23 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_cil_transactions_in}} (dataset raw_achilles) 2022-07-11 15:49:23 destination > 2022-07-11 15:49:23 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream transactions_in (dataset raw_achilles): airbyte/raw_achilles_transactions_in 2022-07-11 15:49:23 destination > 2022-07-11 15:49:23 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:49:23 destination > 2022-07-11 15:49:23 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_in/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:49:23 destination > 2022-07-11 15:49:23 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_transactions_in has been cleaned-up (2 objects were deleted)... 
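Note: each cleanup cycle here pairs a delete of the stream's _airbyte_tmp_* table with deletion of the two staged objects (the prefix placeholder and 1.avro) under airbyte/raw_achilles_<stream>/ in the synctera-data-staging bucket, which is why every stream reports "2 objects were deleted". A rough Python equivalent using the bank_config names from the log (illustrative only, not Airbyte's own code):

```python
from google.cloud import bigquery, storage

PROJECT = "mainapi-282501"

bq = bigquery.Client(project=PROJECT)
gcs = storage.Client(project=PROJECT)

# Drop the tmp table (cf. dropTableIfExists); not_found_ok makes it idempotent.
bq.delete_table(f"{PROJECT}.raw_achilles._airbyte_tmp_xha_bank_config", not_found_ok=True)

# Delete the staged Avro objects for the stream (cf. dropStageIfExists / cleanUpObjects).
bucket = gcs.bucket("synctera-data-staging")
for blob in bucket.list_blobs(prefix="airbyte/raw_achilles_bank_config/"):
    blob.delete()
```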
2022-07-11 15:49:23 destination > 2022-07-11 15:49:23 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kol_files_out}} (dataset raw_achilles) 2022-07-11 15:49:23 destination > 2022-07-11 15:49:23 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream files_out (dataset raw_achilles): airbyte/raw_achilles_files_out 2022-07-11 15:49:23 destination > 2022-07-11 15:49:23 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:49:24 destination > 2022-07-11 15:49:24 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:49:24 destination > 2022-07-11 15:49:24 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_files_out has been cleaned-up (2 objects were deleted)... 2022-07-11 15:49:24 destination > 2022-07-11 15:49:24 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_kdw_transactions_out}} (dataset raw_achilles) 2022-07-11 15:49:24 destination > 2022-07-11 15:49:24 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream transactions_out (dataset raw_achilles): airbyte/raw_achilles_transactions_out 2022-07-11 15:49:24 destination > 2022-07-11 15:49:24 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/ 2022-07-11 15:49:24 destination > 2022-07-11 15:49:24 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_out/2022/07/11/15/9afd5930-86b5-40f9-85b2-1934e0a22ee0/1.avro 2022-07-11 15:49:25 destination > 2022-07-11 15:49:25 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_transactions_out has been cleaned-up (2 objects were deleted)... 2022-07-11 15:49:25 destination > 2022-07-11 15:49:25 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):164 - Cleaning up destination completed. 
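Note: after cleanup, each stream's data sits in a three-column raw table (_airbyte_ab_id STRING, _airbyte_emitted_at TIMESTAMP, _airbyte_data STRING, per the load schemas above), with the source row serialized as JSON in _airbyte_data. An illustrative spot-check query; the $.bank_id and $.updated paths are taken from the bank_config schema in the output catalog further down:

```python
from google.cloud import bigquery

client = bigquery.Client(project="mainapi-282501")

# Pull the most recent raw records and unpack two fields from the JSON payload.
query = """
SELECT
  _airbyte_emitted_at,
  JSON_EXTRACT_SCALAR(_airbyte_data, '$.bank_id') AS bank_id,
  JSON_EXTRACT_SCALAR(_airbyte_data, '$.updated') AS updated
FROM `mainapi-282501.raw_achilles._airbyte_raw_bank_config`
ORDER BY _airbyte_emitted_at DESC
LIMIT 10
"""

for row in client.query(query).result():
    print(row["_airbyte_emitted_at"], row["bank_id"], row["updated"])
```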
2022-07-11 15:49:25 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):415 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@717888a3[type=STATE,log=,spec=,connectionStatus=,catalog=,record=,state=io.airbyte.protocol.models.AirbyteStateMessage@7fbf68fc[type=GLOBAL,stream=,global=io.airbyte.protocol.models.AirbyteGlobalState@7b80e3d3[sharedState={"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839851224,\"txId\":20086033,\"ts_usec\":1657554507069000,\"snapshot\":true}"}},streamStates=[io.airbyte.protocol.models.AirbyteStreamState@3e8262c7[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@b8ac824[name=bank_config,namespace=public,additionalProperties={}],streamState={"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@ff6b921[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@73144d77[name=files_in,namespace=public,additionalProperties={}],streamState={"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@5ead9d88[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@33ba591[name=files_out,namespace=public,additionalProperties={}],streamState={"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@393e3b66[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@6810a6c3[name=partner_config,namespace=public,additionalProperties={}],streamState={"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@5ae32794[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@4aa1674[name=transactions_in,namespace=public,additionalProperties={}],streamState={"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@4c07f88c[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@214c6971[name=transactions_out,namespace=public,additionalProperties={}],streamState={"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}]],additionalProperties={}],data={"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839851224,\"txId\":20086033,\"ts_usec\":1657554507069000,\"snapshot\":true}"}},"streams":[{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}]},additionalProperties={}],trace=,additionalProperties={}] 2022-07-11 15:49:25 destination > 2022-07-11 15:49:25 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2022-07-11 15:49:25 INFO 
i.a.w.g.DefaultReplicationWorker(run):176 - Source and destination threads complete. 2022-07-11 15:49:25 INFO i.a.w.g.DefaultReplicationWorker(run):239 - sync summary: io.airbyte.config.ReplicationAttemptSummary@6c12c8a2[status=completed,recordsSynced=418,bytesSynced=295072,startTime=1657554499560,endTime=1657554565143,totalStats=io.airbyte.config.SyncStats@1603ea2c[recordsEmitted=418,bytesEmitted=295072,stateMessagesEmitted=1,recordsCommitted=418],streamStats=[io.airbyte.config.StreamSyncStats@3f29e9d5[streamName=bank_config,stats=io.airbyte.config.SyncStats@6c1bccb7[recordsEmitted=3,bytesEmitted=1792,stateMessagesEmitted=,recordsCommitted=3]], io.airbyte.config.StreamSyncStats@16364e4e[streamName=transactions_out,stats=io.airbyte.config.SyncStats@1f6d942e[recordsEmitted=113,bytesEmitted=90406,stateMessagesEmitted=,recordsCommitted=113]], io.airbyte.config.StreamSyncStats@6f7ab157[streamName=partner_config,stats=io.airbyte.config.SyncStats@5db6b67b[recordsEmitted=206,bytesEmitted=104843,stateMessagesEmitted=,recordsCommitted=206]], io.airbyte.config.StreamSyncStats@7956d7bd[streamName=transactions_in,stats=io.airbyte.config.SyncStats@ae9f05d[recordsEmitted=26,bytesEmitted=62878,stateMessagesEmitted=,recordsCommitted=26]], io.airbyte.config.StreamSyncStats@30a3085e[streamName=files_in,stats=io.airbyte.config.SyncStats@58ce755f[recordsEmitted=36,bytesEmitted=22085,stateMessagesEmitted=,recordsCommitted=36]], io.airbyte.config.StreamSyncStats@17eabac4[streamName=files_out,stats=io.airbyte.config.SyncStats@52c4cb7c[recordsEmitted=34,bytesEmitted=13068,stateMessagesEmitted=,recordsCommitted=34]]]] 2022-07-11 15:49:25 INFO i.a.w.g.DefaultReplicationWorker(run):266 - Source output at least one state message 2022-07-11 15:49:25 INFO i.a.w.g.DefaultReplicationWorker(run):272 - State capture: Updated state to: 
Optional[io.airbyte.config.State@7a8dd882[state=[{"type":"GLOBAL","global":{"shared_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839851224,\"txId\":20086033,\"ts_usec\":1657554507069000,\"snapshot\":true}"}},"stream_states":[{"stream_descriptor":{"name":"bank_config","namespace":"public"},"stream_state":{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_in","namespace":"public"},"stream_state":{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_out","namespace":"public"},"stream_state":{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"partner_config","namespace":"public"},"stream_state":{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_in","namespace":"public"},"stream_state":{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_out","namespace":"public"},"stream_state":{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}}]},"data":{"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839851224,\"txId\":20086033,\"ts_usec\":1657554507069000,\"snapshot\":true}"}},"streams":[{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}]}}]]] 2022-07-11 15:49:25 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling... 
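Note: the per-stream stats in the sync summary above (repeated in the ReplicationActivityImpl output below) add up exactly to the attempt totals of 418 records and 295,072 bytes, and the GLOBAL state pins the CDC position at lsn 365839851224 with snapshot=true, which is the offset a subsequent sync would resume from. A quick check of the arithmetic, with the numbers copied from the log:

```python
# recordsEmitted / bytesEmitted per stream, copied from the sync summary above.
stream_stats = {
    "bank_config":      (3,   1_792),
    "transactions_out": (113, 90_406),
    "partner_config":   (206, 104_843),
    "transactions_in":  (26,  62_878),
    "files_in":         (36,  22_085),
    "files_out":        (34,  13_068),
}

records = sum(r for r, _ in stream_stats.values())
size    = sum(b for _, b in stream_stats.values())

assert (records, size) == (418, 295_072)  # matches recordsSynced / bytesSynced
print(records, size)
```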
2022-07-11 15:49:25 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):161 - sync summary: io.airbyte.config.StandardSyncOutput@1f52dfe3[standardSyncSummary=io.airbyte.config.StandardSyncSummary@5a50526d[status=completed,recordsSynced=418,bytesSynced=295072,startTime=1657554499560,endTime=1657554565143,totalStats=io.airbyte.config.SyncStats@1603ea2c[recordsEmitted=418,bytesEmitted=295072,stateMessagesEmitted=1,recordsCommitted=418],streamStats=[io.airbyte.config.StreamSyncStats@3f29e9d5[streamName=bank_config,stats=io.airbyte.config.SyncStats@6c1bccb7[recordsEmitted=3,bytesEmitted=1792,stateMessagesEmitted=,recordsCommitted=3]], io.airbyte.config.StreamSyncStats@16364e4e[streamName=transactions_out,stats=io.airbyte.config.SyncStats@1f6d942e[recordsEmitted=113,bytesEmitted=90406,stateMessagesEmitted=,recordsCommitted=113]], io.airbyte.config.StreamSyncStats@6f7ab157[streamName=partner_config,stats=io.airbyte.config.SyncStats@5db6b67b[recordsEmitted=206,bytesEmitted=104843,stateMessagesEmitted=,recordsCommitted=206]], io.airbyte.config.StreamSyncStats@7956d7bd[streamName=transactions_in,stats=io.airbyte.config.SyncStats@ae9f05d[recordsEmitted=26,bytesEmitted=62878,stateMessagesEmitted=,recordsCommitted=26]], io.airbyte.config.StreamSyncStats@30a3085e[streamName=files_in,stats=io.airbyte.config.SyncStats@58ce755f[recordsEmitted=36,bytesEmitted=22085,stateMessagesEmitted=,recordsCommitted=36]], io.airbyte.config.StreamSyncStats@17eabac4[streamName=files_out,stats=io.airbyte.config.SyncStats@52c4cb7c[recordsEmitted=34,bytesEmitted=13068,stateMessagesEmitted=,recordsCommitted=34]]]],normalizationSummary=,state=io.airbyte.config.State@7a8dd882[state=[{"type":"GLOBAL","global":{"shared_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839851224,\"txId\":20086033,\"ts_usec\":1657554507069000,\"snapshot\":true}"}},"stream_states":[{"stream_descriptor":{"name":"bank_config","namespace":"public"},"stream_state":{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_in","namespace":"public"},"stream_state":{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_out","namespace":"public"},"stream_state":{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"partner_config","namespace":"public"},"stream_state":{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_in","namespace":"public"},"stream_state":{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_out","namespace":"public"},"stream_state":{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}}]},"data":{"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839851224,\"txId\":20086033,\"ts_usec\":1657554507069000,\"snapshot\":true}"}},"streams":[{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name"
:"transactions_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}]}}]],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@1a9e7b98[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@6b402f75[stream=io.airbyte.protocol.models.AirbyteStream@62645839[name=bank_config,jsonSchema={"type":"object","properties":{"name":{"type":"string"},"config":{"type":"string"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"routing_no":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[bank_id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[bank_id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@3a7fad22[stream=io.airbyte.protocol.models.AirbyteStream@5354434e[name=files_in,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"ended":{"type":"string","format":"date-time","airbyte_type":"timestamp_without_timezone"},"started":{"type":"string","format":"date-time","airbyte_type":"timestamp_without_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"file_hash":{"type":"string"},"file_name":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"iat_entry_count":{"type":"number"},"std_entry_count":{"type":"number"},"total_batch_count":{"type":"number"},"total_entry_count":{"type":"number"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"preprocessing_path":{"type":"string"},"total_debit_amount":{"type":"number"},"postprocessing_path":{"type":"string"},"total_credit_amount":{"type":"number"},"iat_entries_processed":{"type":"number"},"std_entries_processed":{"type":"number"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@3985e5d6[stream=io.airbyte.protocol.models.AirbyteStream@7960d74f[name=files_out,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"file_hash":{"type":"string"},"file_name":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"batch_count":{"type":"number"},"exchange_window":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@3e3173fc[stream=io.airbyte.protocol.models.AirbyteStream@4f0e0947[name=partner_config,jsonSchema={"type":"object","properties":{"name":{"type":"string"},"config":{"type":"string"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"partner_id":{"type":"number"},"routing_no":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"account_prefix":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[partner_id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[partner_id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@59b3fe25[stream=io.airbyte.protocol.models.AirbyteStream@261680f3[name=transactions_in,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"uuid":{"type":"string"},"amount":{"type":"number"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"returned":{"type":"boolean"},"sec_code":{"type":"string"},"file_hash":{"type":"string"},"file_name":{"type":"string"},"addenda_02":{"type":"string"},"addenda_05":{"type":"string"},"addenda_10":{"type":"string"},"addenda_11":{"type":"string"},"addenda_12":{"type":"string"},"addenda_13":{"type":"string"},"addenda_14":{"type":"string"},"addenda_15":{"type":"string"},"addenda_16":{"type":"string"},"addenda_17":{"type":"string"},"addenda_18":{"type":"string"},"addenda_98":{"type":"string"},"addenda_99":{"type":"string"},"batch_type":{"type":"string"},"company_id":{"type":"string"},"partner_id":{"type":"number"},"_ab_cdc_lsn":{"type":"number"},"external_id":{"type":"string"},"return_data":{"type":"string"},"batch_number":{"type":"number"},"company_name":{"type":"string"},"future_dated":{"type":"boolean"},"originator_id":{"type":"string"},"receiving_dfi":{"type":"string"},"dfi_account_no":{"type":"string"},"effective_date":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"entry_trace_no":{"type":"string"},"individual_name":{"type":"string"},"originating_dfi":{"type":"string"},"settlement_date":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"individual_id_no":{"type":"string"},"transaction_code":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"processing_history":{"type":"string"},"transaction_out_id":{"type":"string"},"addenda_record_count":{"type":"string"},"destination_country_code":{"type":"string"},"company_entry_description":{"type":"string"},"destination_currency_code":{"type":"string"},"originating_currency_code":{"type":"string"},"foreign_exchange_indicator":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@130db95e[stream=io.airbyte.protocol.models.AirbyteStream@2c76de49[name=transactions_out,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"data":{"type":"string"},"uuid":{"type":"string"},"amount":{"type":"number"},"status":{"type":"string"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"file_id":{"type":"number"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"trace_no":{"type":"string"},"account_no":{"type":"string"},"partner_id":{"type":"number"},"_ab_cdc_lsn":{"type":"number"},"description":{"type":"string"},"external_id":{"type":"string"},"is_same_day":{"type":"boolean"},"return_data":{"type":"string"},"account_name":{"type":"string"},"effective_date":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"reference_info":{"type":"string"},"transaction_code":{"type":"number"},"source_account_no":{"type":"string"},"transaction_in_id":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"source_account_name":{"type":"string"},"destination_bank_routing_no":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[]] 2022-07-11 15:49:25 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating... 2022-07-11 15:49:25 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false' 2022-07-11 15:49:25 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/1/logs.log 2022-07-11 15:49:25 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha 2022-07-11 15:49:25 INFO i.a.w.g.DefaultNormalizationWorker(run):49 - Running normalization. 2022-07-11 15:49:25 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization:0.2.6 2022-07-11 15:49:25 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization:0.2.6 exists... 2022-07-11 15:49:25 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization:0.2.6 was found locally. 
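The sync summary above ends with the connection's GLOBAL state: a shared Debezium offset (lsn, txId, ts_usec, snapshot flag) plus a per-stream list of cursor fields. Below is a minimal Python sketch of how one might decode that blob when debugging CDC/incremental behaviour; the state_message literal is a trimmed, illustrative copy of the payload logged above, not the full object.

import json

# Trimmed copy of the GLOBAL state emitted in the sync summary above.
state_message = {
    "type": "GLOBAL",
    "global": {
        "shared_state": {
            "state": {
                '{"schema":null,"payload":["achilles",{"server":"achilles"}]}':
                    '{"last_snapshot_record":true,"lsn":365839851224,'
                    '"txId":20086033,"ts_usec":1657554507069000,"snapshot":true}'
            }
        },
        "stream_states": [
            {"stream_descriptor": {"name": "bank_config", "namespace": "public"},
             "stream_state": {"cursor_field": ["updated"]}},
            {"stream_descriptor": {"name": "transactions_in", "namespace": "public"},
             "stream_state": {"cursor_field": ["updated"]}},
        ],
    },
}

# The Debezium offset is a JSON string keyed by another JSON string, so both
# layers need to be decoded before the LSN is usable.
for offset_key, offset_val in state_message["global"]["shared_state"]["state"].items():
    server = json.loads(offset_key)["payload"][0]
    offset = json.loads(offset_val)
    print(f"server={server} lsn={offset['lsn']} txId={offset['txId']} snapshot={offset['snapshot']}")

# Per-stream cursors used for the incremental (non-CDC) part of the state.
for entry in state_message["global"]["stream_states"]:
    desc, st = entry["stream_descriptor"], entry["stream_state"]
    print(f"{desc['namespace']}.{desc['name']}: cursor_field={st['cursor_field']}")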
2022-07-11 15:49:25 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:49:25 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/1/normalize --log-driver none --name normalization-normalize-89696-1-sdskj --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.32-alpha airbyte/normalization:0.2.6 run --integration-type bigquery --config destination_config.json --catalog destination_catalog.json 2022-07-11 15:49:25 normalization > Running: transform-config --config destination_config.json --integration-type bigquery --out /data/89696/1/normalize 2022-07-11 15:49:26 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/89696/1/normalize') 2022-07-11 15:49:26 normalization > transform_bigquery 2022-07-11 15:49:26 normalization > Running: transform-catalog --integration-type bigquery --profile-config-dir /data/89696/1/normalize --catalog destination_catalog.json --out /data/89696/1/normalize/models/generated/ --json-column _airbyte_data 2022-07-11 15:49:27 normalization > Processing destination_catalog.json... 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/bank_config_ab1.sql from bank_config 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/bank_config_ab2.sql from bank_config 2022-07-11 15:49:27 normalization > Generating airbyte_views/raw_achilles/bank_config_stg.sql from bank_config 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/scd/raw_achilles/bank_config_scd.sql from bank_config 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/raw_achilles/bank_config.sql from bank_config 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/files_in_ab1.sql from files_in 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/files_in_ab2.sql from files_in 2022-07-11 15:49:27 normalization > Generating airbyte_views/raw_achilles/files_in_stg.sql from files_in 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/scd/raw_achilles/files_in_scd.sql from files_in 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/raw_achilles/files_in.sql from files_in 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/files_out_ab1.sql from files_out 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/files_out_ab2.sql from files_out 2022-07-11 15:49:27 normalization > Generating airbyte_views/raw_achilles/files_out_stg.sql from files_out 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/scd/raw_achilles/files_out_scd.sql from files_out 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/raw_achilles/files_out.sql from files_out 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/partner_config_ab1.sql from partner_config 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/partner_config_ab2.sql from partner_config 2022-07-11 15:49:27 normalization > Generating airbyte_views/raw_achilles/partner_config_stg.sql from partner_config 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/scd/raw_achilles/partner_config_scd.sql from partner_config 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/raw_achilles/partner_config.sql from partner_config 2022-07-11 15:49:27 normalization > Generating 
airbyte_ctes/raw_achilles/transactions_in_ab1.sql from transactions_in 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/transactions_in_ab2.sql from transactions_in 2022-07-11 15:49:27 normalization > Generating airbyte_views/raw_achilles/transactions_in_stg.sql from transactions_in 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql from transactions_in 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/raw_achilles/transactions_in.sql from transactions_in 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/transactions_out_ab1.sql from transactions_out 2022-07-11 15:49:27 normalization > Generating airbyte_ctes/raw_achilles/transactions_out_ab2.sql from transactions_out 2022-07-11 15:49:27 normalization > Generating airbyte_views/raw_achilles/transactions_out_stg.sql from transactions_out 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/scd/raw_achilles/transactions_out_scd.sql from transactions_out 2022-07-11 15:49:27 normalization > Generating airbyte_incremental/raw_achilles/transactions_out.sql from transactions_out 2022-07-11 15:49:27 normalization > detected no config file for ssh, assuming ssh is off. 2022-07-11 15:49:31 normalization > [--event-buffer-size EVENT_BUFFER_SIZE] 2022-07-11 15:49:31 normalization > --event-buffer-size EVENT_BUFFER_SIZE 2022-07-11 15:49:31 normalization > 2022-07-11 15:49:31 normalization > DBT >=1.0.0 detected; using 10K event buffer size 2022-07-11 15:49:31 normalization > 2022-07-11 15:49:35 normalization > 15:49:35 Running with dbt=1.0.0 2022-07-11 15:49:35 normalization > 15:49:35 Partial parse save file not found. Starting full parse. 2022-07-11 15:49:38 normalization > 15:49:38 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources. 2022-07-11 15:49:38 normalization > There are 1 unused configuration paths: 2022-07-11 15:49:38 normalization > - models.airbyte_utils.generated.airbyte_tables 2022-07-11 15:49:38 normalization > 2022-07-11 15:49:38 normalization > 15:49:38 Found 30 models, 0 tests, 0 snapshots, 0 analyses, 549 macros, 0 operations, 0 seed files, 6 sources, 0 exposures, 0 metrics 2022-07-11 15:49:38 normalization > 15:49:38 2022-07-11 15:49:38 normalization > 15:49:38 Concurrency: 8 threads (target='prod') 2022-07-11 15:49:38 normalization > 15:49:38 2022-07-11 15:49:39 normalization > 15:49:39 1 of 18 START view model _airbyte_raw_achilles.bank_config_stg.......................................................... [RUN] 2022-07-11 15:49:39 normalization > 15:49:39 2 of 18 START view model _airbyte_raw_achilles.files_out_stg............................................................ [RUN] 2022-07-11 15:49:39 normalization > 15:49:39 3 of 18 START view model _airbyte_raw_achilles.partner_config_stg....................................................... [RUN] 2022-07-11 15:49:39 normalization > 15:49:39 4 of 18 START view model _airbyte_raw_achilles.files_in_stg............................................................. [RUN] 2022-07-11 15:49:39 normalization > 15:49:39 5 of 18 START view model _airbyte_raw_achilles.transactions_out_stg..................................................... [RUN] 2022-07-11 15:49:39 normalization > 15:49:39 6 of 18 START view model _airbyte_raw_achilles.transactions_in_stg...................................................... 
[RUN] 2022-07-11 15:49:41 normalization > 15:49:41 4 of 18 OK created view model _airbyte_raw_achilles.files_in_stg........................................................ [OK in 1.32s] 2022-07-11 15:49:41 normalization > 15:49:41 7 of 18 START incremental model raw_achilles.files_in_scd............................................................... [RUN] 2022-07-11 15:49:41 normalization > 15:49:41 3 of 18 OK created view model _airbyte_raw_achilles.partner_config_stg.................................................. [OK in 1.38s] 2022-07-11 15:49:41 normalization > 15:49:41 8 of 18 START incremental model raw_achilles.partner_config_scd......................................................... [RUN] 2022-07-11 15:49:41 normalization > 15:49:41 1 of 18 OK created view model _airbyte_raw_achilles.bank_config_stg..................................................... [OK in 1.42s] 2022-07-11 15:49:41 normalization > 15:49:41 9 of 18 START incremental model raw_achilles.bank_config_scd............................................................ [RUN] 2022-07-11 15:49:41 normalization > 15:49:41 5 of 18 OK created view model _airbyte_raw_achilles.transactions_out_stg................................................ [OK in 1.39s] 2022-07-11 15:49:41 normalization > 15:49:41 10 of 18 START incremental model raw_achilles.transactions_out_scd...................................................... [RUN] 2022-07-11 15:49:41 normalization > 15:49:41 6 of 18 OK created view model _airbyte_raw_achilles.transactions_in_stg................................................. [OK in 1.44s] 2022-07-11 15:49:41 normalization > 15:49:41 11 of 18 START incremental model raw_achilles.transactions_in_scd....................................................... [RUN] 2022-07-11 15:49:41 normalization > 15:49:41 2 of 18 OK created view model _airbyte_raw_achilles.files_out_stg....................................................... [OK in 1.55s] 2022-07-11 15:49:41 normalization > 15:49:41 12 of 18 START incremental model raw_achilles.files_out_scd............................................................. [RUN] 2022-07-11 15:49:41 normalization > 15:49:41 15:49:41 + `mainapi-282501`.raw_achilles.`transactions_in_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:49:48 normalization > 15:49:48 11 of 18 ERROR creating incremental model raw_achilles.transactions_in_scd.............................................. [ERROR in 6.86s] 2022-07-11 15:49:48 normalization > 15:49:48 13 of 18 SKIP relation raw_achilles.transactions_in..................................................................... [SKIP] 2022-07-11 15:49:52 normalization > 15:49:52 12 of 18 OK created incremental model raw_achilles.files_out_scd........................................................ [MERGE (68.0 rows, 35.5 KB processed) in 11.63s] 2022-07-11 15:49:52 normalization > 15:49:52 14 of 18 START incremental model raw_achilles.files_out................................................................. [RUN] 2022-07-11 15:49:53 normalization > 15:49:53 10 of 18 OK created incremental model raw_achilles.transactions_out_scd................................................. [MERGE (226.0 rows, 170.3 KB processed) in 12.00s] 2022-07-11 15:49:53 normalization > 15:49:53 9 of 18 OK created incremental model raw_achilles.bank_config_scd....................................................... 
[MERGE (6.0 rows, 5.0 KB processed) in 12.02s] 2022-07-11 15:49:53 normalization > 15:49:53 15 of 18 START incremental model raw_achilles.transactions_out.......................................................... [RUN] 2022-07-11 15:49:53 normalization > 15:49:53 16 of 18 START incremental model raw_achilles.bank_config............................................................... [RUN] 2022-07-11 15:49:53 normalization > 15:49:53 7 of 18 OK created incremental model raw_achilles.files_in_scd.......................................................... [MERGE (72.0 rows, 45.5 KB processed) in 12.23s] 2022-07-11 15:49:53 normalization > 15:49:53 17 of 18 START incremental model raw_achilles.files_in.................................................................. [RUN] 2022-07-11 15:49:53 normalization > 15:49:53 8 of 18 OK created incremental model raw_achilles.partner_config_scd.................................................... [MERGE (412.0 rows, 284.3 KB processed) in 12.57s] 2022-07-11 15:49:53 normalization > 15:49:53 18 of 18 START incremental model raw_achilles.partner_config............................................................ [RUN] 2022-07-11 15:49:58 normalization > 15:49:58 16 of 18 OK created incremental model raw_achilles.bank_config.......................................................... [MERGE (3.0 rows, 3.1 KB processed) in 5.66s] 2022-07-11 15:49:59 normalization > 15:49:59 15 of 18 OK created incremental model raw_achilles.transactions_out..................................................... [MERGE (113.0 rows, 101.9 KB processed) in 5.80s] 2022-07-11 15:49:59 normalization > 15:49:59 18 of 18 OK created incremental model raw_achilles.partner_config....................................................... [MERGE (206.0 rows, 168.4 KB processed) in 5.85s] 2022-07-11 15:49:59 normalization > 15:49:59 17 of 18 OK created incremental model raw_achilles.files_in............................................................. [MERGE (36.0 rows, 26.6 KB processed) in 6.52s] 2022-07-11 15:50:00 normalization > 15:50:00 14 of 18 OK created incremental model raw_achilles.files_out............................................................ [MERGE (34.0 rows, 20.2 KB processed) in 7.54s] 2022-07-11 15:50:00 normalization > 15:50:00 2022-07-11 15:50:00 normalization > 15:50:00 Finished running 6 view models, 12 incremental models in 22.37s. 2022-07-11 15:50:00 normalization > 15:50:00 2022-07-11 15:50:00 normalization > 15:50:00 Completed with 1 error and 0 warnings: 2022-07-11 15:50:00 normalization > 15:50:00 2022-07-11 15:50:00 normalization > 15:50:00 Database Error in model transactions_in_scd (models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql) 2022-07-11 15:50:00 normalization > 15:50:00 Invalid timestamp string "0000-12-30T00:00:00Z" 2022-07-11 15:50:00 normalization > 15:50:00 compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:50:00 normalization > 15:50:00 2022-07-11 15:50:00 normalization > 15:50:00 Done. 
PASS=16 WARN=0 ERROR=1 SKIP=1 TOTAL=18 2022-07-11 15:50:00 normalization > 2022-07-11 15:50:00 normalization > Diagnosing dbt debug to check if destination is available for dbt and well configured (1): 2022-07-11 15:50:00 normalization > 2022-07-11 15:50:04 normalization > 15:50:04 Running with dbt=1.0.0 2022-07-11 15:50:04 normalization > dbt version: 1.0.0 2022-07-11 15:50:04 normalization > python version: 3.9.9 2022-07-11 15:50:04 normalization > python path: /usr/local/bin/python 2022-07-11 15:50:04 normalization > os info: Linux-5.13.0-1024-gcp-x86_64-with-glibc2.31 2022-07-11 15:50:04 normalization > Using profiles.yml file at /data/89696/1/normalize/profiles.yml 2022-07-11 15:50:04 normalization > Using dbt_project.yml file at /data/89696/1/normalize/dbt_project.yml 2022-07-11 15:50:04 normalization > 2022-07-11 15:50:04 normalization > Configuration: 2022-07-11 15:50:04 normalization > profiles.yml file [OK found and valid] 2022-07-11 15:50:04 normalization > dbt_project.yml file [OK found and valid] 2022-07-11 15:50:04 normalization > 2022-07-11 15:50:04 normalization > Required dependencies: 2022-07-11 15:50:04 normalization > - git [OK found] 2022-07-11 15:50:04 normalization > 2022-07-11 15:50:04 normalization > Connection: 2022-07-11 15:50:04 normalization > method: service-account-json 2022-07-11 15:50:04 normalization > database: mainapi-282501 2022-07-11 15:50:04 normalization > schema: airbyte 2022-07-11 15:50:04 normalization > location: US 2022-07-11 15:50:04 normalization > priority: interactive 2022-07-11 15:50:04 normalization > timeout_seconds: 300 2022-07-11 15:50:04 normalization > maximum_bytes_billed: None 2022-07-11 15:50:04 normalization > execution_project: mainapi-282501 2022-07-11 15:50:05 normalization > Connection test: [OK connection ok] 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > All checks passed! 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > Forward dbt output logs to diagnose/debug errors (0): 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ============================== 2022-07-11 15:49:35.020659 | 366b306d-ffb4-46ac-8e11-f1004ac48b3b ============================== 2022-07-11 15:50:05 normalization > 15:49:35.020659 [info ] [MainThread]: Running with dbt=1.0.0 2022-07-11 15:50:05 normalization > 15:49:35.021416 [debug] [MainThread]: running dbt with arguments Namespace(record_timing_info=None, debug=None, log_format=None, write_json=None, use_colors=None, printer_width=None, warn_error=None, version_check=None, partial_parse=None, single_threaded=False, use_experimental_parser=None, static_parser=None, profiles_dir='/data/89696/1/normalize', send_anonymous_usage_stats=None, fail_fast=None, event_buffer_size='10000', project_dir='/data/89696/1/normalize', profile=None, target=None, vars='{}', log_cache_events=False, threads=None, select=None, exclude=None, selector_name=None, state=None, defer=None, full_refresh=False, cls=, which='run', rpc_method='run') 2022-07-11 15:50:05 normalization > 15:49:35.021857 [debug] [MainThread]: Tracking: do not track 2022-07-11 15:50:05 normalization > 15:49:35.060292 [info ] [MainThread]: Partial parse save file not found. Starting full parse. 
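The only failure in this run is the Database Error reported just above: the transactions_in_scd incremental model aborts because BigQuery rejects the timestamp string "0000-12-30T00:00:00Z" (BigQuery's TIMESTAMP type only accepts years 0001 through 9999), which in turn skips the transactions_in model. A minimal sketch of how one might screen raw records for such values before normalization follows; the sample rows and the effective_date column name are hypothetical, used only for illustration.

from datetime import datetime, timezone

# BigQuery's TIMESTAMP range: 0001-01-01 00:00:00 .. 9999-12-31 23:59:59.999999 UTC.
BQ_MIN = datetime(1, 1, 1, tzinfo=timezone.utc)
BQ_MAX = datetime(9999, 12, 31, 23, 59, 59, 999999, tzinfo=timezone.utc)

def bigquery_safe(ts: str) -> bool:
    """True if ts parses as ISO-8601 and falls inside BigQuery's TIMESTAMP range."""
    try:
        parsed = datetime.fromisoformat(ts.replace("Z", "+00:00"))
    except ValueError:
        # Year 0000 (and other malformed values) cannot even be represented by
        # datetime, which is exactly the class of value dbt choked on above.
        return False
    if parsed.tzinfo is None:
        parsed = parsed.replace(tzinfo=timezone.utc)
    return BQ_MIN <= parsed <= BQ_MAX

# Hypothetical rows standing in for the raw transactions_in records.
records = [
    {"id": 1, "effective_date": "2022-07-08T00:00:00Z"},
    {"id": 2, "effective_date": "0000-12-30T00:00:00Z"},  # the value that failed
]
for row in records:
    if not bigquery_safe(row["effective_date"]):
        print(f"row {row['id']}: effective_date={row['effective_date']!r} would be rejected by BigQuery")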
2022-07-11 15:50:05 normalization > 15:49:35.110354 [debug] [MainThread]: Parsing macros/configuration.sql 2022-07-11 15:50:05 normalization > 15:49:35.115359 [debug] [MainThread]: Parsing macros/should_full_refresh.sql 2022-07-11 15:50:05 normalization > 15:49:35.124649 [debug] [MainThread]: Parsing macros/incremental.sql 2022-07-11 15:50:05 normalization > 15:49:35.135719 [debug] [MainThread]: Parsing macros/get_custom_schema.sql 2022-07-11 15:50:05 normalization > 15:49:35.136719 [debug] [MainThread]: Parsing macros/star_intersect.sql 2022-07-11 15:50:05 normalization > 15:49:35.147362 [debug] [MainThread]: Parsing macros/schema_tests/equal_rowcount.sql 2022-07-11 15:50:05 normalization > 15:49:35.149422 [debug] [MainThread]: Parsing macros/schema_tests/equality.sql 2022-07-11 15:50:05 normalization > 15:49:35.160803 [debug] [MainThread]: Parsing macros/cross_db_utils/except.sql 2022-07-11 15:50:05 normalization > 15:49:35.162114 [debug] [MainThread]: Parsing macros/cross_db_utils/current_timestamp.sql 2022-07-11 15:50:05 normalization > 15:49:35.163223 [debug] [MainThread]: Parsing macros/cross_db_utils/columns.sql 2022-07-11 15:50:05 normalization > 15:49:35.169103 [debug] [MainThread]: Parsing macros/cross_db_utils/hash.sql 2022-07-11 15:50:05 normalization > 15:49:35.170384 [debug] [MainThread]: Parsing macros/cross_db_utils/json_operations.sql 2022-07-11 15:50:05 normalization > 15:49:35.241789 [debug] [MainThread]: Parsing macros/cross_db_utils/quote.sql 2022-07-11 15:50:05 normalization > 15:49:35.244632 [debug] [MainThread]: Parsing macros/cross_db_utils/type_conversions.sql 2022-07-11 15:50:05 normalization > 15:49:35.259779 [debug] [MainThread]: Parsing macros/cross_db_utils/surrogate_key.sql 2022-07-11 15:50:05 normalization > 15:49:35.263293 [debug] [MainThread]: Parsing macros/cross_db_utils/datatypes.sql 2022-07-11 15:50:05 normalization > 15:49:35.281884 [debug] [MainThread]: Parsing macros/cross_db_utils/concat.sql 2022-07-11 15:50:05 normalization > 15:49:35.286381 [debug] [MainThread]: Parsing macros/cross_db_utils/array.sql 2022-07-11 15:50:05 normalization > 15:49:35.312227 [debug] [MainThread]: Parsing macros/adapters.sql 2022-07-11 15:50:05 normalization > 15:49:35.364975 [debug] [MainThread]: Parsing macros/catalog.sql 2022-07-11 15:50:05 normalization > 15:49:35.378279 [debug] [MainThread]: Parsing macros/etc.sql 2022-07-11 15:50:05 normalization > 15:49:35.381716 [debug] [MainThread]: Parsing macros/materializations/table.sql 2022-07-11 15:50:05 normalization > 15:49:35.389512 [debug] [MainThread]: Parsing macros/materializations/copy.sql 2022-07-11 15:50:05 normalization > 15:49:35.394164 [debug] [MainThread]: Parsing macros/materializations/seed.sql 2022-07-11 15:50:05 normalization > 15:49:35.398665 [debug] [MainThread]: Parsing macros/materializations/incremental.sql 2022-07-11 15:50:05 normalization > 15:49:35.426593 [debug] [MainThread]: Parsing macros/materializations/view.sql 2022-07-11 15:50:05 normalization > 15:49:35.431321 [debug] [MainThread]: Parsing macros/materializations/snapshot.sql 2022-07-11 15:50:05 normalization > 15:49:35.434180 [debug] [MainThread]: Parsing macros/etc/statement.sql 2022-07-11 15:50:05 normalization > 15:49:35.442183 [debug] [MainThread]: Parsing macros/etc/datetime.sql 2022-07-11 15:50:05 normalization > 15:49:35.459505 [debug] [MainThread]: Parsing macros/materializations/configs.sql 2022-07-11 15:50:05 normalization > 15:49:35.463631 [debug] [MainThread]: Parsing macros/materializations/hooks.sql 2022-07-11 15:50:05 
normalization > 15:49:35.470495 [debug] [MainThread]: Parsing macros/materializations/tests/where_subquery.sql 2022-07-11 15:50:05 normalization > 15:49:35.473576 [debug] [MainThread]: Parsing macros/materializations/tests/helpers.sql 2022-07-11 15:50:05 normalization > 15:49:35.476628 [debug] [MainThread]: Parsing macros/materializations/tests/test.sql 2022-07-11 15:50:05 normalization > 15:49:35.484943 [debug] [MainThread]: Parsing macros/materializations/seeds/seed.sql 2022-07-11 15:50:05 normalization > 15:49:35.496035 [debug] [MainThread]: Parsing macros/materializations/seeds/helpers.sql 2022-07-11 15:50:05 normalization > 15:49:35.527825 [debug] [MainThread]: Parsing macros/materializations/models/table/table.sql 2022-07-11 15:50:05 normalization > 15:49:35.541082 [debug] [MainThread]: Parsing macros/materializations/models/table/create_table_as.sql 2022-07-11 15:50:05 normalization > 15:49:35.546006 [debug] [MainThread]: Parsing macros/materializations/models/incremental/incremental.sql 2022-07-11 15:50:05 normalization > 15:49:35.565405 [debug] [MainThread]: Parsing macros/materializations/models/incremental/column_helpers.sql 2022-07-11 15:50:05 normalization > 15:49:35.573327 [debug] [MainThread]: Parsing macros/materializations/models/incremental/merge.sql 2022-07-11 15:50:05 normalization > 15:49:35.595218 [debug] [MainThread]: Parsing macros/materializations/models/incremental/on_schema_change.sql 2022-07-11 15:50:05 normalization > 15:49:35.626177 [debug] [MainThread]: Parsing macros/materializations/models/incremental/is_incremental.sql 2022-07-11 15:50:05 normalization > 15:49:35.628964 [debug] [MainThread]: Parsing macros/materializations/models/view/create_or_replace_view.sql 2022-07-11 15:50:05 normalization > 15:49:35.633708 [debug] [MainThread]: Parsing macros/materializations/models/view/create_view_as.sql 2022-07-11 15:50:05 normalization > 15:49:35.637626 [debug] [MainThread]: Parsing macros/materializations/models/view/view.sql 2022-07-11 15:50:05 normalization > 15:49:35.650848 [debug] [MainThread]: Parsing macros/materializations/models/view/helpers.sql 2022-07-11 15:50:05 normalization > 15:49:35.653126 [debug] [MainThread]: Parsing macros/materializations/snapshots/snapshot_merge.sql 2022-07-11 15:50:05 normalization > 15:49:35.656069 [debug] [MainThread]: Parsing macros/materializations/snapshots/snapshot.sql 2022-07-11 15:50:05 normalization > 15:49:35.677848 [debug] [MainThread]: Parsing macros/materializations/snapshots/helpers.sql 2022-07-11 15:50:05 normalization > 15:49:35.700277 [debug] [MainThread]: Parsing macros/materializations/snapshots/strategies.sql 2022-07-11 15:50:05 normalization > 15:49:35.732751 [debug] [MainThread]: Parsing macros/adapters/persist_docs.sql 2022-07-11 15:50:05 normalization > 15:49:35.741264 [debug] [MainThread]: Parsing macros/adapters/columns.sql 2022-07-11 15:50:05 normalization > 15:49:35.760164 [debug] [MainThread]: Parsing macros/adapters/indexes.sql 2022-07-11 15:50:05 normalization > 15:49:35.765263 [debug] [MainThread]: Parsing macros/adapters/relation.sql 2022-07-11 15:50:05 normalization > 15:49:35.783871 [debug] [MainThread]: Parsing macros/adapters/schema.sql 2022-07-11 15:50:05 normalization > 15:49:35.788451 [debug] [MainThread]: Parsing macros/adapters/freshness.sql 2022-07-11 15:50:05 normalization > 15:49:35.793699 [debug] [MainThread]: Parsing macros/adapters/metadata.sql 2022-07-11 15:50:05 normalization > 15:49:35.807346 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_database.sql 
2022-07-11 15:50:05 normalization > 15:49:35.810227 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_schema.sql 2022-07-11 15:50:05 normalization > 15:49:35.814757 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_alias.sql 2022-07-11 15:50:05 normalization > 15:49:35.817210 [debug] [MainThread]: Parsing macros/generic_test_sql/not_null.sql 2022-07-11 15:50:05 normalization > 15:49:35.818099 [debug] [MainThread]: Parsing macros/generic_test_sql/accepted_values.sql 2022-07-11 15:50:05 normalization > 15:49:35.820461 [debug] [MainThread]: Parsing macros/generic_test_sql/unique.sql 2022-07-11 15:50:05 normalization > 15:49:35.821658 [debug] [MainThread]: Parsing macros/generic_test_sql/relationships.sql 2022-07-11 15:50:05 normalization > 15:49:35.823224 [debug] [MainThread]: Parsing tests/generic/builtin.sql 2022-07-11 15:50:05 normalization > 15:49:35.828565 [debug] [MainThread]: Parsing macros/web/get_url_path.sql 2022-07-11 15:50:05 normalization > 15:49:35.833198 [debug] [MainThread]: Parsing macros/web/get_url_host.sql 2022-07-11 15:50:05 normalization > 15:49:35.836675 [debug] [MainThread]: Parsing macros/web/get_url_parameter.sql 2022-07-11 15:50:05 normalization > 15:49:35.839396 [debug] [MainThread]: Parsing macros/materializations/insert_by_period_materialization.sql 2022-07-11 15:50:05 normalization > 15:49:35.886297 [debug] [MainThread]: Parsing macros/schema_tests/test_not_null_where.sql 2022-07-11 15:50:05 normalization > 15:49:35.888965 [debug] [MainThread]: Parsing macros/schema_tests/test_unique_where.sql 2022-07-11 15:50:05 normalization > 15:49:35.891467 [debug] [MainThread]: Parsing macros/schema_tests/at_least_one.sql 2022-07-11 15:50:05 normalization > 15:49:35.893607 [debug] [MainThread]: Parsing macros/schema_tests/not_constant.sql 2022-07-11 15:50:05 normalization > 15:49:35.895743 [debug] [MainThread]: Parsing macros/schema_tests/expression_is_true.sql 2022-07-11 15:50:05 normalization > 15:49:35.898939 [debug] [MainThread]: Parsing macros/schema_tests/recency.sql 2022-07-11 15:50:05 normalization > 15:49:35.901987 [debug] [MainThread]: Parsing macros/schema_tests/equal_rowcount.sql 2022-07-11 15:50:05 normalization > 15:49:35.904884 [debug] [MainThread]: Parsing macros/schema_tests/not_null_proportion.sql 2022-07-11 15:50:05 normalization > 15:49:35.908782 [debug] [MainThread]: Parsing macros/schema_tests/accepted_range.sql 2022-07-11 15:50:05 normalization > 15:49:35.914134 [debug] [MainThread]: Parsing macros/schema_tests/not_accepted_values.sql 2022-07-11 15:50:05 normalization > 15:49:35.917911 [debug] [MainThread]: Parsing macros/schema_tests/cardinality_equality.sql 2022-07-11 15:50:05 normalization > 15:49:35.921525 [debug] [MainThread]: Parsing macros/schema_tests/unique_combination_of_columns.sql 2022-07-11 15:50:05 normalization > 15:49:35.926551 [debug] [MainThread]: Parsing macros/schema_tests/mutually_exclusive_ranges.sql 2022-07-11 15:50:05 normalization > 15:49:35.943966 [debug] [MainThread]: Parsing macros/schema_tests/fewer_rows_than.sql 2022-07-11 15:50:05 normalization > 15:49:35.947152 [debug] [MainThread]: Parsing macros/schema_tests/equality.sql 2022-07-11 15:50:05 normalization > 15:49:35.953664 [debug] [MainThread]: Parsing macros/schema_tests/relationships_where.sql 2022-07-11 15:50:05 normalization > 15:49:35.957664 [debug] [MainThread]: Parsing macros/schema_tests/sequential_values.sql 2022-07-11 15:50:05 normalization > 15:49:35.963121 [debug] [MainThread]: Parsing macros/cross_db_utils/except.sql 2022-07-11 
15:50:05 normalization > 15:49:35.964973 [debug] [MainThread]: Parsing macros/cross_db_utils/length.sql 2022-07-11 15:50:05 normalization > 15:49:35.967826 [debug] [MainThread]: Parsing macros/cross_db_utils/position.sql 2022-07-11 15:50:05 normalization > 15:49:35.970742 [debug] [MainThread]: Parsing macros/cross_db_utils/current_timestamp.sql 2022-07-11 15:50:05 normalization > 15:49:35.977029 [debug] [MainThread]: Parsing macros/cross_db_utils/intersect.sql 2022-07-11 15:50:05 normalization > 15:49:35.979208 [debug] [MainThread]: Parsing macros/cross_db_utils/replace.sql 2022-07-11 15:50:05 normalization > 15:49:35.981363 [debug] [MainThread]: Parsing macros/cross_db_utils/escape_single_quotes.sql 2022-07-11 15:50:05 normalization > 15:49:35.984551 [debug] [MainThread]: Parsing macros/cross_db_utils/any_value.sql 2022-07-11 15:50:05 normalization > 15:49:35.986855 [debug] [MainThread]: Parsing macros/cross_db_utils/last_day.sql 2022-07-11 15:50:05 normalization > 15:49:35.993662 [debug] [MainThread]: Parsing macros/cross_db_utils/cast_bool_to_text.sql 2022-07-11 15:50:05 normalization > 15:49:35.996208 [debug] [MainThread]: Parsing macros/cross_db_utils/dateadd.sql 2022-07-11 15:50:05 normalization > 15:49:36.001508 [debug] [MainThread]: Parsing macros/cross_db_utils/literal.sql 2022-07-11 15:50:05 normalization > 15:49:36.003133 [debug] [MainThread]: Parsing macros/cross_db_utils/safe_cast.sql 2022-07-11 15:50:05 normalization > 15:49:36.006396 [debug] [MainThread]: Parsing macros/cross_db_utils/date_trunc.sql 2022-07-11 15:50:05 normalization > 15:49:36.008983 [debug] [MainThread]: Parsing macros/cross_db_utils/bool_or.sql 2022-07-11 15:50:05 normalization > 15:49:36.011811 [debug] [MainThread]: Parsing macros/cross_db_utils/width_bucket.sql 2022-07-11 15:50:05 normalization > 15:49:36.022081 [debug] [MainThread]: Parsing macros/cross_db_utils/hash.sql 2022-07-11 15:50:05 normalization > 15:49:36.024661 [debug] [MainThread]: Parsing macros/cross_db_utils/_is_ephemeral.sql 2022-07-11 15:50:05 normalization > 15:49:36.028187 [debug] [MainThread]: Parsing macros/cross_db_utils/_is_relation.sql 2022-07-11 15:50:05 normalization > 15:49:36.030159 [debug] [MainThread]: Parsing macros/cross_db_utils/right.sql 2022-07-11 15:50:05 normalization > 15:49:36.034343 [debug] [MainThread]: Parsing macros/cross_db_utils/split_part.sql 2022-07-11 15:50:05 normalization > 15:49:36.037536 [debug] [MainThread]: Parsing macros/cross_db_utils/datediff.sql 2022-07-11 15:50:05 normalization > 15:49:36.057566 [debug] [MainThread]: Parsing macros/cross_db_utils/datatypes.sql 2022-07-11 15:50:05 normalization > 15:49:36.068804 [debug] [MainThread]: Parsing macros/cross_db_utils/concat.sql 2022-07-11 15:50:05 normalization > 15:49:36.070694 [debug] [MainThread]: Parsing macros/cross_db_utils/identifier.sql 2022-07-11 15:50:05 normalization > 15:49:36.073551 [debug] [MainThread]: Parsing macros/sql/get_tables_by_prefix_sql.sql 2022-07-11 15:50:05 normalization > 15:49:36.076514 [debug] [MainThread]: Parsing macros/sql/get_column_values.sql 2022-07-11 15:50:05 normalization > 15:49:36.086766 [debug] [MainThread]: Parsing macros/sql/get_query_results_as_dict.sql 2022-07-11 15:50:05 normalization > 15:49:36.091300 [debug] [MainThread]: Parsing macros/sql/get_relations_by_pattern.sql 2022-07-11 15:50:05 normalization > 15:49:36.097657 [debug] [MainThread]: Parsing macros/sql/get_relations_by_prefix.sql 2022-07-11 15:50:05 normalization > 15:49:36.104074 [debug] [MainThread]: Parsing macros/sql/haversine_distance.sql 
2022-07-11 15:50:05 normalization > 15:49:36.115204 [debug] [MainThread]: Parsing macros/sql/get_tables_by_pattern_sql.sql 2022-07-11 15:50:05 normalization > 15:49:36.127875 [debug] [MainThread]: Parsing macros/sql/pivot.sql 2022-07-11 15:50:05 normalization > 15:49:36.135656 [debug] [MainThread]: Parsing macros/sql/date_spine.sql 2022-07-11 15:50:05 normalization > 15:49:36.143244 [debug] [MainThread]: Parsing macros/sql/star.sql 2022-07-11 15:50:05 normalization > 15:49:36.151334 [debug] [MainThread]: Parsing macros/sql/union.sql 2022-07-11 15:50:05 normalization > 15:49:36.171427 [debug] [MainThread]: Parsing macros/sql/get_table_types_sql.sql 2022-07-11 15:50:05 normalization > 15:49:36.173898 [debug] [MainThread]: Parsing macros/sql/safe_add.sql 2022-07-11 15:50:05 normalization > 15:49:36.176756 [debug] [MainThread]: Parsing macros/sql/surrogate_key.sql 2022-07-11 15:50:05 normalization > 15:49:36.182834 [debug] [MainThread]: Parsing macros/sql/groupby.sql 2022-07-11 15:50:05 normalization > 15:49:36.185096 [debug] [MainThread]: Parsing macros/sql/generate_series.sql 2022-07-11 15:50:05 normalization > 15:49:36.193257 [debug] [MainThread]: Parsing macros/sql/nullcheck.sql 2022-07-11 15:50:05 normalization > 15:49:36.196131 [debug] [MainThread]: Parsing macros/sql/unpivot.sql 2022-07-11 15:50:05 normalization > 15:49:36.210978 [debug] [MainThread]: Parsing macros/sql/nullcheck_table.sql 2022-07-11 15:50:05 normalization > 15:49:36.213743 [debug] [MainThread]: Parsing macros/jinja_helpers/log_info.sql 2022-07-11 15:50:05 normalization > 15:49:36.215649 [debug] [MainThread]: Parsing macros/jinja_helpers/pretty_time.sql 2022-07-11 15:50:05 normalization > 15:49:36.217824 [debug] [MainThread]: Parsing macros/jinja_helpers/slugify.sql 2022-07-11 15:50:05 normalization > 15:49:36.219823 [debug] [MainThread]: Parsing macros/jinja_helpers/pretty_log_format.sql 2022-07-11 15:50:05 normalization > 15:49:36.957977 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/transactions_out_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.045629 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/transactions_out_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.049096 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/bank_config_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.089407 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/bank_config_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.092653 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.134802 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.137910 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/files_out_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.244736 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/files_out_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.247594 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/files_in_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.283486 
[debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/files_in_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.286395 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/partner_config_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.322505 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/partner_config_scd.sql 2022-07-11 15:50:05 normalization > 15:49:37.324759 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/files_out.sql 2022-07-11 15:50:05 normalization > 15:49:37.340836 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/files_out.sql 2022-07-11 15:50:05 normalization > 15:49:37.342969 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/files_in.sql 2022-07-11 15:50:05 normalization > 15:49:37.355750 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/files_in.sql 2022-07-11 15:50:05 normalization > 15:49:37.357797 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/bank_config.sql 2022-07-11 15:50:05 normalization > 15:49:37.368779 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/bank_config.sql 2022-07-11 15:50:05 normalization > 15:49:37.370812 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/partner_config.sql 2022-07-11 15:50:05 normalization > 15:49:37.382102 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/partner_config.sql 2022-07-11 15:50:05 normalization > 15:49:37.384380 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/transactions_in.sql 2022-07-11 15:50:05 normalization > 15:49:37.396128 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/transactions_in.sql 2022-07-11 15:50:05 normalization > 15:49:37.398473 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/transactions_out.sql 2022-07-11 15:50:05 normalization > 15:49:37.409749 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/transactions_out.sql 2022-07-11 15:50:05 normalization > 15:49:37.411899 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_out_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.449821 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_out_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.452259 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_in_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.489794 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_in_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.492326 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_out_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.521259 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_out_ab2.sql 2022-07-11 15:50:05 
normalization > 15:49:37.523506 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/bank_config_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.540669 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/bank_config_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.542793 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/partner_config_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.568940 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/partner_config_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.571103 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/bank_config_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.592879 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/bank_config_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.595058 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/partner_config_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.613827 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/partner_config_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.616324 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_out_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.664788 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_out_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.667247 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_in_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.691962 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_in_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.694827 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_in_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.744157 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_in_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.747605 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_in_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.825789 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_in_ab1.sql 2022-07-11 15:50:05 normalization > 15:49:37.828348 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_out_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.858195 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_out_ab2.sql 2022-07-11 15:50:05 normalization > 15:49:37.860455 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/transactions_in_stg.sql 2022-07-11 15:50:05 normalization > 15:49:37.898373 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/transactions_in_stg.sql 2022-07-11 15:50:05 normalization > 15:49:37.900512 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/bank_config_stg.sql 2022-07-11 15:50:05 
normalization > 15:49:37.915984 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/bank_config_stg.sql 2022-07-11 15:50:05 normalization > 15:49:37.918092 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/partner_config_stg.sql 2022-07-11 15:50:05 normalization > 15:49:37.934237 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/partner_config_stg.sql 2022-07-11 15:50:05 normalization > 15:49:37.936434 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/transactions_out_stg.sql 2022-07-11 15:50:05 normalization > 15:49:37.959858 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/transactions_out_stg.sql 2022-07-11 15:50:05 normalization > 15:49:37.962015 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/files_out_stg.sql 2022-07-11 15:50:05 normalization > 15:49:37.980320 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/files_out_stg.sql 2022-07-11 15:50:05 normalization > 15:49:37.982550 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/files_in_stg.sql 2022-07-11 15:50:05 normalization > 15:49:38.002685 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/files_in_stg.sql 2022-07-11 15:50:05 normalization > 15:49:38.124904 [warn ] [MainThread]: [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources. 2022-07-11 15:50:05 normalization > There are 1 unused configuration paths: 2022-07-11 15:50:05 normalization > - models.airbyte_utils.generated.airbyte_tables 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:38.159814 [info ] [MainThread]: Found 30 models, 0 tests, 0 snapshots, 0 analyses, 549 macros, 0 operations, 0 seed files, 6 sources, 0 exposures, 0 metrics 2022-07-11 15:50:05 normalization > 15:49:38.164095 [info ] [MainThread]: 2022-07-11 15:50:05 normalization > 15:49:38.165317 [debug] [MainThread]: Acquiring new bigquery connection "master" 2022-07-11 15:50:05 normalization > 15:49:38.168368 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501" 2022-07-11 15:50:05 normalization > 15:49:38.169518 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501" 2022-07-11 15:50:05 normalization > 15:49:38.169840 [debug] [ThreadPool]: Opening a new connection, currently in state init 2022-07-11 15:50:05 normalization > 15:49:38.170378 [debug] [ThreadPool]: Opening a new connection, currently in state init 2022-07-11 15:50:05 normalization > 15:49:38.443324 [debug] [ThreadPool]: Acquiring new bigquery connection "create_mainapi-282501__airbyte_raw_achilles" 2022-07-11 15:50:05 normalization > 15:49:38.444234 [debug] [ThreadPool]: Acquiring new bigquery connection "create_mainapi-282501__airbyte_raw_achilles" 2022-07-11 15:50:05 normalization > 15:49:38.444477 [debug] [ThreadPool]: BigQuery adapter: Creating schema "mainapi-282501._airbyte_raw_achilles". 
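The debug entries above show dbt's BigQuery adapter opening connections and creating the mainapi-282501._airbyte_raw_achilles schema before building models, against the connection settings reported by dbt debug earlier (project mainapi-282501, location US, service-account-json auth). As a rough equivalent outside dbt, one might check or pre-create those datasets with the google-cloud-bigquery client; the sketch below assumes credentials are supplied via GOOGLE_APPLICATION_CREDENTIALS and is illustrative, not the adapter's own code.

from google.cloud import bigquery
from google.cloud.exceptions import NotFound

PROJECT = "mainapi-282501"
# Datasets referenced in this run: the default schema plus the raw/normalized ones.
DATASETS = ["airbyte", "raw_achilles", "_airbyte_raw_achilles"]

client = bigquery.Client(project=PROJECT)  # credentials via GOOGLE_APPLICATION_CREDENTIALS
for name in DATASETS:
    dataset_id = f"{PROJECT}.{name}"
    try:
        client.get_dataset(dataset_id)
        print(f"{dataset_id}: already exists")
    except NotFound:
        ds = bigquery.Dataset(dataset_id)
        ds.location = "US"  # matches the location reported by dbt debug above
        client.create_dataset(ds)
        print(f"{dataset_id}: created")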
2022-07-11 15:50:05 normalization > 15:49:38.444697 [debug] [ThreadPool]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:38.717960 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501_raw_achilles" 2022-07-11 15:50:05 normalization > 15:49:38.719312 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501__airbyte_raw_achilles" 2022-07-11 15:50:05 normalization > 15:49:38.720039 [debug] [ThreadPool]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:38.720498 [debug] [ThreadPool]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:38.879270 [info ] [MainThread]: Concurrency: 8 threads (target='prod') 2022-07-11 15:50:05 normalization > 15:49:38.880253 [info ] [MainThread]: 2022-07-11 15:50:05 normalization > 15:49:38.908563 [debug] [Thread-1 ]: Began running node model.airbyte_utils.bank_config_ab1 2022-07-11 15:50:05 normalization > 15:49:38.908972 [debug] [Thread-2 ]: Began running node model.airbyte_utils.files_in_ab1 2022-07-11 15:50:05 normalization > 15:49:38.909468 [debug] [Thread-3 ]: Began running node model.airbyte_utils.files_out_ab1 2022-07-11 15:50:05 normalization > 15:49:38.910545 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_ab1" 2022-07-11 15:50:05 normalization > 15:49:38.910804 [debug] [Thread-4 ]: Began running node model.airbyte_utils.partner_config_ab1 2022-07-11 15:50:05 normalization > 15:49:38.911129 [debug] [Thread-5 ]: Began running node model.airbyte_utils.transactions_in_ab1 2022-07-11 15:50:05 normalization > 15:49:38.911496 [debug] [Thread-6 ]: Began running node model.airbyte_utils.transactions_out_ab1 2022-07-11 15:50:05 normalization > 15:49:38.912311 [debug] [Thread-2 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_ab1" 2022-07-11 15:50:05 normalization > 15:49:38.913469 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_ab1" 2022-07-11 15:50:05 normalization > 15:49:38.913810 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.bank_config_ab1 2022-07-11 15:50:05 normalization > 15:49:38.914789 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_ab1" 2022-07-11 15:50:05 normalization > 15:49:38.915820 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_ab1" 2022-07-11 15:50:05 normalization > 15:49:38.916782 [debug] [Thread-6 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_ab1" 2022-07-11 15:50:05 normalization > 15:49:38.917133 [debug] [Thread-2 ]: Began compiling node model.airbyte_utils.files_in_ab1 2022-07-11 15:50:05 normalization > 15:49:38.917413 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.files_out_ab1 2022-07-11 15:50:05 normalization > 15:49:38.917766 [debug] [Thread-1 ]: Compiling model.airbyte_utils.bank_config_ab1 2022-07-11 15:50:05 normalization > 15:49:38.918095 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.partner_config_ab1 2022-07-11 15:50:05 normalization > 15:49:38.918427 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.transactions_in_ab1 2022-07-11 15:50:05 normalization > 15:49:38.918727 [debug] [Thread-6 ]: Began compiling node model.airbyte_utils.transactions_out_ab1 2022-07-11 15:50:05 normalization > 15:49:38.919084 [debug] [Thread-2 ]: Compiling model.airbyte_utils.files_in_ab1 2022-07-11 15:50:05 
normalization > 15:49:38.919356 [debug] [Thread-3 ]: Compiling model.airbyte_utils.files_out_ab1 2022-07-11 15:50:05 normalization > 15:49:38.940656 [debug] [Thread-4 ]: Compiling model.airbyte_utils.partner_config_ab1 2022-07-11 15:50:05 normalization > 15:49:38.942419 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_ab1" 2022-07-11 15:50:05 normalization > 15:49:38.942806 [debug] [Thread-5 ]: Compiling model.airbyte_utils.transactions_in_ab1 2022-07-11 15:50:05 normalization > 15:49:38.943186 [debug] [Thread-6 ]: Compiling model.airbyte_utils.transactions_out_ab1 2022-07-11 15:50:05 normalization > 15:49:38.984743 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.files_out_ab1" 2022-07-11 15:50:05 normalization > 15:49:39.003078 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_ab1" 2022-07-11 15:50:05 normalization > 15:49:39.124200 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.130026 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.151858 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.bank_config_ab1 2022-07-11 15:50:05 normalization > 15:49:39.163603 [debug] [Thread-2 ]: Writing injected SQL for node "model.airbyte_utils.files_in_ab1" 2022-07-11 15:50:05 normalization > 15:49:39.171520 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.201309 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_ab1" 2022-07-11 15:50:05 normalization > 15:49:39.203995 [debug] [Thread-6 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_ab1" 2022-07-11 15:50:05 normalization > 15:49:39.205005 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.files_out_ab1 2022-07-11 15:50:05 normalization > 15:49:39.206601 [debug] [Thread-8 ]: Began running node model.airbyte_utils.bank_config_ab2 2022-07-11 15:50:05 normalization > 15:49:39.207768 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.partner_config_ab1 2022-07-11 15:50:05 normalization > 15:49:39.208399 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.210296 [debug] [Thread-1 ]: Began running node model.airbyte_utils.files_out_ab2 2022-07-11 15:50:05 normalization > 15:49:39.210673 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.211069 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.212047 [debug] [Thread-8 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.213507 [debug] [Thread-2 ]: Finished running node model.airbyte_utils.files_in_ab1 2022-07-11 15:50:05 normalization > 15:49:39.214120 [debug] [Thread-3 ]: Began running node model.airbyte_utils.partner_config_ab2 2022-07-11 15:50:05 normalization > 15:49:39.215165 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.216034 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.transactions_in_ab1 2022-07-11 15:50:05 normalization > 15:49:39.216991 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.transactions_out_ab1 2022-07-11 15:50:05 normalization > 15:49:39.217336 [debug] [Thread-8 ]: Began compiling node model.airbyte_utils.bank_config_ab2 
2022-07-11 15:50:05 normalization > 15:49:39.218150 [debug] [Thread-4 ]: Began running node model.airbyte_utils.files_in_ab2 2022-07-11 15:50:05 normalization > 15:49:39.219111 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.219421 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.files_out_ab2 2022-07-11 15:50:05 normalization > 15:49:39.220280 [debug] [Thread-2 ]: Began running node model.airbyte_utils.transactions_in_ab2 2022-07-11 15:50:05 normalization > 15:49:39.221257 [debug] [Thread-5 ]: Began running node model.airbyte_utils.transactions_out_ab2 2022-07-11 15:50:05 normalization > 15:49:39.221548 [debug] [Thread-8 ]: Compiling model.airbyte_utils.bank_config_ab2 2022-07-11 15:50:05 normalization > 15:49:39.222479 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.222800 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.partner_config_ab2 2022-07-11 15:50:05 normalization > 15:49:39.223130 [debug] [Thread-1 ]: Compiling model.airbyte_utils.files_out_ab2 2022-07-11 15:50:05 normalization > 15:49:39.223992 [debug] [Thread-2 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.224857 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.330369 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.files_in_ab2 2022-07-11 15:50:05 normalization > 15:49:39.356357 [debug] [Thread-3 ]: Compiling model.airbyte_utils.partner_config_ab2 2022-07-11 15:50:05 normalization > 15:49:39.361700 [debug] [Thread-8 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.383261 [debug] [Thread-2 ]: Began compiling node model.airbyte_utils.transactions_in_ab2 2022-07-11 15:50:05 normalization > 15:49:39.404259 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.files_out_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.404645 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.transactions_out_ab2 2022-07-11 15:50:05 normalization > 15:49:39.405122 [debug] [Thread-4 ]: Compiling model.airbyte_utils.files_in_ab2 2022-07-11 15:50:05 normalization > 15:49:39.448859 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.449412 [debug] [Thread-2 ]: Compiling model.airbyte_utils.transactions_in_ab2 2022-07-11 15:50:05 normalization > 15:49:39.449684 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.450448 [debug] [Thread-5 ]: Compiling model.airbyte_utils.transactions_out_ab2 2022-07-11 15:50:05 normalization > 15:49:39.466831 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.499392 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.533647 [debug] [Thread-8 ]: Finished running node model.airbyte_utils.bank_config_ab2 2022-07-11 15:50:05 normalization > 15:49:39.552220 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.files_in_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.590271 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.files_out_ab2 2022-07-11 15:50:05 normalization > 15:49:39.621865 [debug] 
[Thread-3 ]: Finished running node model.airbyte_utils.partner_config_ab2 2022-07-11 15:50:05 normalization > 15:49:39.732486 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.738344 [debug] [Thread-6 ]: Began running node model.airbyte_utils.bank_config_stg 2022-07-11 15:50:05 normalization > 15:49:39.752401 [debug] [Thread-2 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_ab2" 2022-07-11 15:50:05 normalization > 15:49:39.753528 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.753898 [debug] [Thread-3 ]: Began running node model.airbyte_utils.files_out_stg 2022-07-11 15:50:05 normalization > 15:49:39.755049 [info ] [Thread-6 ]: 1 of 18 START view model _airbyte_raw_achilles.bank_config_stg.......................................................... [RUN] 2022-07-11 15:50:05 normalization > 15:49:39.755599 [debug] [Thread-7 ]: Began running node model.airbyte_utils.partner_config_stg 2022-07-11 15:50:05 normalization > 15:49:39.756218 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.757417 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.files_in_ab2 2022-07-11 15:50:05 normalization > 15:49:39.757728 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.758050 [info ] [Thread-3 ]: 2 of 18 START view model _airbyte_raw_achilles.files_out_stg............................................................ [RUN] 2022-07-11 15:50:05 normalization > 15:49:39.759346 [debug] [Thread-6 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_stg" 2022-07-11 15:50:05 normalization > 15:49:39.759880 [info ] [Thread-7 ]: 3 of 18 START view model _airbyte_raw_achilles.partner_config_stg....................................................... [RUN] 2022-07-11 15:50:05 normalization > 15:49:39.760737 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.transactions_out_ab2 2022-07-11 15:50:05 normalization > 15:49:39.762373 [debug] [Thread-2 ]: Finished running node model.airbyte_utils.transactions_in_ab2 2022-07-11 15:50:05 normalization > 15:49:39.762729 [debug] [Thread-8 ]: Began running node model.airbyte_utils.files_in_stg 2022-07-11 15:50:05 normalization > 15:49:39.764488 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_stg" 2022-07-11 15:50:05 normalization > 15:49:39.764818 [debug] [Thread-6 ]: Began compiling node model.airbyte_utils.bank_config_stg 2022-07-11 15:50:05 normalization > 15:49:39.766262 [debug] [Thread-7 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_stg" 2022-07-11 15:50:05 normalization > 15:49:39.767090 [debug] [Thread-4 ]: Began running node model.airbyte_utils.transactions_out_stg 2022-07-11 15:50:05 normalization > 15:49:39.768203 [info ] [Thread-8 ]: 4 of 18 START view model _airbyte_raw_achilles.files_in_stg............................................................. 
[RUN] 2022-07-11 15:50:05 normalization > 15:49:39.768459 [debug] [Thread-5 ]: Began running node model.airbyte_utils.transactions_in_stg 2022-07-11 15:50:05 normalization > 15:49:39.768790 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.files_out_stg 2022-07-11 15:50:05 normalization > 15:49:39.769100 [debug] [Thread-6 ]: Compiling model.airbyte_utils.bank_config_stg 2022-07-11 15:50:05 normalization > 15:49:39.769379 [debug] [Thread-7 ]: Began compiling node model.airbyte_utils.partner_config_stg 2022-07-11 15:50:05 normalization > 15:49:39.769753 [info ] [Thread-4 ]: 5 of 18 START view model _airbyte_raw_achilles.transactions_out_stg..................................................... [RUN] 2022-07-11 15:50:05 normalization > 15:49:39.770965 [debug] [Thread-8 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_stg" 2022-07-11 15:50:05 normalization > 15:49:39.771441 [info ] [Thread-5 ]: 6 of 18 START view model _airbyte_raw_achilles.transactions_in_stg...................................................... [RUN] 2022-07-11 15:50:05 normalization > 15:49:39.771778 [debug] [Thread-3 ]: Compiling model.airbyte_utils.files_out_stg 2022-07-11 15:50:05 normalization > 15:49:39.792877 [debug] [Thread-7 ]: Compiling model.airbyte_utils.partner_config_stg 2022-07-11 15:50:05 normalization > 15:49:39.810804 [debug] [Thread-6 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_stg" 2022-07-11 15:50:05 normalization > 15:49:39.812339 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_stg" 2022-07-11 15:50:05 normalization > 15:49:39.812628 [debug] [Thread-8 ]: Began compiling node model.airbyte_utils.files_in_stg 2022-07-11 15:50:05 normalization > 15:49:39.813804 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_stg" 2022-07-11 15:50:05 normalization > 15:49:39.872870 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.files_out_stg" 2022-07-11 15:50:05 normalization > 15:49:39.892084 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.transactions_out_stg 2022-07-11 15:50:05 normalization > 15:49:39.905115 [debug] [Thread-7 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_stg" 2022-07-11 15:50:05 normalization > 15:49:39.905525 [debug] [Thread-8 ]: Compiling model.airbyte_utils.files_in_stg 2022-07-11 15:50:05 normalization > 15:49:39.905888 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.906154 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.transactions_in_stg 2022-07-11 15:50:05 normalization > 15:49:39.906726 [debug] [Thread-4 ]: Compiling model.airbyte_utils.transactions_out_stg 2022-07-11 15:50:05 normalization > 15:49:39.907246 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.923291 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:39.928896 [debug] [Thread-6 ]: Began executing node model.airbyte_utils.bank_config_stg 2022-07-11 15:50:05 normalization > 15:49:39.954740 [debug] [Thread-5 ]: Compiling model.airbyte_utils.transactions_in_stg 2022-07-11 15:50:05 normalization > 15:49:39.974110 [debug] [Thread-8 ]: Writing injected SQL for node "model.airbyte_utils.files_in_stg" 2022-07-11 15:50:05 normalization > 15:49:39.995277 [debug] [Thread-3 ]: Began executing node model.airbyte_utils.files_out_stg 2022-07-11 15:50:05 normalization > 15:49:40.021852 [debug] 
[Thread-7 ]: Began executing node model.airbyte_utils.partner_config_stg
2022-07-11 15:50:05 normalization > 15:49:40.140978 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_stg"
2022-07-11 15:50:05 normalization > 15:49:40.148047 [debug] [Thread-6 ]: Writing runtime SQL for node "model.airbyte_utils.bank_config_stg"
2022-07-11 15:50:05 normalization > 15:49:40.206723 [debug] [Thread-7 ]: Writing runtime SQL for node "model.airbyte_utils.partner_config_stg"
2022-07-11 15:50:05 normalization > 15:49:40.207096 [debug] [Thread-8 ]: finished collecting timing info
2022-07-11 15:50:05 normalization > 15:49:40.208252 [debug] [Thread-3 ]: Writing runtime SQL for node "model.airbyte_utils.files_out_stg"
2022-07-11 15:50:05 normalization > 15:49:40.233962 [debug] [Thread-4 ]: finished collecting timing info
2022-07-11 15:50:05 normalization > 15:49:40.240277 [debug] [Thread-8 ]: Began executing node model.airbyte_utils.files_in_stg
2022-07-11 15:50:05 normalization > 15:49:40.262640 [debug] [Thread-4 ]: Began executing node model.airbyte_utils.transactions_out_stg
2022-07-11 15:50:05 normalization > 15:49:40.268851 [debug] [Thread-3 ]: Opening a new connection, currently in state closed
2022-07-11 15:50:05 normalization > 15:49:40.279030 [debug] [Thread-6 ]: Opening a new connection, currently in state closed
2022-07-11 15:50:05 normalization > 15:49:40.279561 [debug] [Thread-7 ]: Opening a new connection, currently in state init
2022-07-11 15:50:05 normalization > 15:49:40.291960 [debug] [Thread-8 ]: Writing runtime SQL for node "model.airbyte_utils.files_in_stg"
2022-07-11 15:50:05 normalization > 15:49:40.319054 [debug] [Thread-4 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_out_stg"
2022-07-11 15:50:05 normalization > 15:49:40.341197 [debug] [Thread-4 ]: Opening a new connection, currently in state closed
2022-07-11 15:50:05 normalization > 15:49:40.341638 [debug] [Thread-3 ]: On model.airbyte_utils.files_out_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_stg"} */
  create or replace view `mainapi-282501`._airbyte_raw_achilles.`files_out_stg`
  OPTIONS()
  as
  with __dbt__cte__files_out_ab1 as (
  -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema
  -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_files_out
  select
    json_extract_scalar(_airbyte_data, "$['id']") as id,
    json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id,
    json_extract_scalar(_airbyte_data, "$['created']") as created,
    json_extract_scalar(_airbyte_data, "$['updated']") as updated,
    json_extract_scalar(_airbyte_data, "$['file_hash']") as file_hash,
    json_extract_scalar(_airbyte_data, "$['file_name']") as file_name,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn,
    json_extract_scalar(_airbyte_data, "$['batch_count']") as batch_count,
    json_extract_scalar(_airbyte_data, "$['exchange_window']") as exchange_window,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
  from `mainapi-282501`.raw_achilles._airbyte_raw_files_out as table_alias
  -- files_out
  where 1 = 1
  ), __dbt__cte__files_out_ab2 as (
  -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
  -- depends_on: __dbt__cte__files_out_ab1
  select
    cast(id as float64) as id,
    cast(bank_id as float64) as bank_id,
    cast(nullif(created, '') as timestamp) as created,
    cast(nullif(updated, '') as timestamp) as updated,
    cast(file_hash as string) as file_hash,
    cast(file_name as string) as file_name,
    cast(_ab_cdc_lsn as float64) as _ab_cdc_lsn,
    cast(batch_count as float64) as batch_count,
    cast(nullif(exchange_window, '') as timestamp) as exchange_window,
    cast(_ab_cdc_deleted_at as string) as _ab_cdc_deleted_at,
    cast(_ab_cdc_updated_at as string) as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
  from __dbt__cte__files_out_ab1
  -- files_out
  where 1 = 1
  )-- SQL model to build a hash column based on the values of this record
  -- depends_on: __dbt__cte__files_out_ab2
  select
    to_hex(md5(cast(concat(coalesce(cast(id as string), ''), '-', coalesce(cast(bank_id as string), ''), '-', coalesce(cast(created as string), ''), '-',
      coalesce(cast(updated as string), ''), '-', coalesce(cast(file_hash as string), ''), '-', coalesce(cast(file_name as string), ''), '-',
      coalesce(cast(_ab_cdc_lsn as string), ''), '-', coalesce(cast(batch_count as string), ''), '-', coalesce(cast(exchange_window as string), ''), '-',
      coalesce(cast(_ab_cdc_deleted_at as string), ''), '-', coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_files_out_hashid,
    tmp.*
  from __dbt__cte__files_out_ab2 tmp
  -- files_out
  where 1 = 1
  ;
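(For reference: a quick way to sanity-check a staging view like the one above is to compare its row count with the raw table it reads from. The query below is only an illustrative sketch using the dataset and view names visible in this log; it is not part of the SQL generated by normalization.)
  select
    (select count(*) from `mainapi-282501`.raw_achilles._airbyte_raw_files_out) as raw_rows,
    (select count(*) from `mainapi-282501`._airbyte_raw_achilles.files_out_stg) as stg_rows;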
2022-07-11 15:50:05 normalization > 15:49:40.355728 [debug] [Thread-6 ]: On model.airbyte_utils.bank_config_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_stg"} */
  create or replace view `mainapi-282501`._airbyte_raw_achilles.`bank_config_stg`
  OPTIONS()
  as
  with __dbt__cte__bank_config_ab1 as (
  -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema
  -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_bank_config
  select
    json_extract_scalar(_airbyte_data, "$['name']") as name,
    json_extract_scalar(_airbyte_data, "$['config']") as config,
    json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id,
    json_extract_scalar(_airbyte_data, "$['created']") as created,
    json_extract_scalar(_airbyte_data, "$['updated']") as updated,
    json_extract_scalar(_airbyte_data, "$['routing_no']") as routing_no,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
  from `mainapi-282501`.raw_achilles._airbyte_raw_bank_config as table_alias
  -- bank_config
  where 1 = 1
  ), __dbt__cte__bank_config_ab2 as (
  -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
  -- depends_on: __dbt__cte__bank_config_ab1
  select
    cast(name as string) as name,
    cast(config as string) as config,
    cast(bank_id as float64) as bank_id,
    cast(nullif(created, '') as timestamp) as created,
    cast(nullif(updated, '') as timestamp) as updated,
    cast(routing_no as string) as routing_no,
    cast(_ab_cdc_lsn as float64) as _ab_cdc_lsn,
    cast(_ab_cdc_deleted_at as string) as _ab_cdc_deleted_at,
    cast(_ab_cdc_updated_at as string) as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
  from __dbt__cte__bank_config_ab1
  -- bank_config
  where 1 = 1
  )-- SQL model to build a hash column based on the values of this record
  -- depends_on: __dbt__cte__bank_config_ab2
  select
    to_hex(md5(cast(concat(coalesce(cast(name as string), ''), '-', coalesce(cast(config as string), ''), '-', coalesce(cast(bank_id as string), ''), '-',
      coalesce(cast(created as string), ''), '-', coalesce(cast(updated as string), ''), '-', coalesce(cast(routing_no as string), ''), '-',
      coalesce(cast(_ab_cdc_lsn as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-',
      coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_bank_config_hashid,
    tmp.*
  from __dbt__cte__bank_config_ab2 tmp
  -- bank_config
  where 1 = 1
  ;
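(Because this connection replicates via CDC, each staging view keeps the _ab_cdc_* metadata columns extracted above. The query below is a sketch only, assuming the usual Airbyte CDC convention that deleted source rows arrive with a non-null _ab_cdc_deleted_at; it is not emitted by dbt in this run.)
  select bank_id, name, _ab_cdc_deleted_at, _ab_cdc_updated_at
  from `mainapi-282501`._airbyte_raw_achilles.bank_config_stg
  where _ab_cdc_deleted_at is not null
  order by _ab_cdc_updated_at desc;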
2022-07-11 15:50:05 normalization > 15:49:40.361561 [debug] [Thread-7 ]: On model.airbyte_utils.partner_config_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_stg"} */
  create or replace view `mainapi-282501`._airbyte_raw_achilles.`partner_config_stg`
  OPTIONS()
  as
  with __dbt__cte__partner_config_ab1 as (
  -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema
  -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_partner_config
  select
    json_extract_scalar(_airbyte_data, "$['name']") as name,
    json_extract_scalar(_airbyte_data, "$['config']") as config,
    json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id,
    json_extract_scalar(_airbyte_data, "$['created']") as created,
    json_extract_scalar(_airbyte_data, "$['updated']") as updated,
    json_extract_scalar(_airbyte_data, "$['partner_id']") as partner_id,
    json_extract_scalar(_airbyte_data, "$['routing_no']") as routing_no,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn,
    json_extract_scalar(_airbyte_data, "$['account_prefix']") as account_prefix,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
  from `mainapi-282501`.raw_achilles._airbyte_raw_partner_config as table_alias
  -- partner_config
  where 1 = 1
  ), __dbt__cte__partner_config_ab2 as (
  -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
  -- depends_on: __dbt__cte__partner_config_ab1
  select
    cast(name as string) as name,
    cast(config as string) as config,
    cast(bank_id as float64) as bank_id,
    cast(nullif(created, '') as timestamp) as created,
    cast(nullif(updated, '') as timestamp) as updated,
    cast(partner_id as float64) as partner_id,
    cast(routing_no as string) as routing_no,
    cast(_ab_cdc_lsn as float64) as _ab_cdc_lsn,
    cast(account_prefix as string) as account_prefix,
    cast(_ab_cdc_deleted_at as string) as _ab_cdc_deleted_at,
    cast(_ab_cdc_updated_at as string) as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
  from __dbt__cte__partner_config_ab1
  -- partner_config
  where 1 = 1
  )-- SQL model to build a hash column based on the values of this record
  -- depends_on: __dbt__cte__partner_config_ab2
  select
    to_hex(md5(cast(concat(coalesce(cast(name as string), ''), '-', coalesce(cast(config as string), ''), '-', coalesce(cast(bank_id as string), ''), '-',
      coalesce(cast(created as string), ''), '-', coalesce(cast(updated as string), ''), '-', coalesce(cast(partner_id as string), ''), '-',
      coalesce(cast(routing_no as string), ''), '-', coalesce(cast(_ab_cdc_lsn as string), ''), '-', coalesce(cast(account_prefix as string), ''), '-',
      coalesce(cast(_ab_cdc_deleted_at as string), ''), '-', coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_partner_config_hashid,
    tmp.*
  from __dbt__cte__partner_config_ab2 tmp
  -- partner_config
  where 1 = 1
  ;
2022-07-11 15:50:05 normalization > 15:49:40.371447 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_stg"
2022-07-11 15:50:05 normalization > 15:49:40.371862 [debug] [Thread-8 ]: Opening a new connection, currently in state closed
2022-07-11 15:50:05 normalization > 15:49:40.373653 [debug] [Thread-4 ]: On model.airbyte_utils.transactions_out_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_stg"} */
  create or replace view `mainapi-282501`._airbyte_raw_achilles.`transactions_out_stg`
  OPTIONS()
  as
  with __dbt__cte__transactions_out_ab1 as (
  -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema
  -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out
  select
    json_extract_scalar(_airbyte_data, "$['id']") as id,
    json_extract_scalar(_airbyte_data, "$['data']") as data,
    json_extract_scalar(_airbyte_data, "$['uuid']") as uuid,
    json_extract_scalar(_airbyte_data, "$['amount']") as amount,
    json_extract_scalar(_airbyte_data, "$['status']") as status,
    json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id,
    json_extract_scalar(_airbyte_data, "$['created']") as created,
    json_extract_scalar(_airbyte_data, "$['file_id']") as file_id,
    json_extract_scalar(_airbyte_data, "$['updated']") as updated,
    json_extract_scalar(_airbyte_data, "$['trace_no']") as trace_no,
    json_extract_scalar(_airbyte_data, "$['account_no']") as account_no,
    json_extract_scalar(_airbyte_data, "$['partner_id']") as partner_id,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn,
    json_extract_scalar(_airbyte_data, "$['description']") as description,
    json_extract_scalar(_airbyte_data, "$['external_id']") as external_id,
    json_extract_scalar(_airbyte_data, "$['is_same_day']") as is_same_day,
    json_extract_scalar(_airbyte_data, "$['return_data']") as return_data,
    json_extract_scalar(_airbyte_data, "$['account_name']") as account_name,
    json_extract_scalar(_airbyte_data, "$['effective_date']") as effective_date,
    json_extract_scalar(_airbyte_data, "$['reference_info']") as reference_info,
    json_extract_scalar(_airbyte_data, "$['transaction_code']") as transaction_code,
    json_extract_scalar(_airbyte_data, "$['source_account_no']") as source_account_no,
    json_extract_scalar(_airbyte_data, "$['transaction_in_id']") as transaction_in_id,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at,
    json_extract_scalar(_airbyte_data, "$['source_account_name']") as source_account_name,
    json_extract_scalar(_airbyte_data, "$['destination_bank_routing_no']") as destination_bank_routing_no,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
  from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out as table_alias
  -- transactions_out
  where 1 = 1
  ), __dbt__cte__transactions_out_ab2 as (
  -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
  -- depends_on: __dbt__cte__transactions_out_ab1
  select
    cast(id as float64) as id,
    cast(data as string) as data,
    cast(uuid as string) as uuid,
    cast(amount as float64) as amount,
    cast(status as string) as status,
    cast(bank_id as float64) as bank_id,
    cast(nullif(created, '') as timestamp) as created,
    cast(file_id as float64) as file_id,
    cast(nullif(updated, '') as timestamp) as updated,
    cast(trace_no as string) as trace_no,
    cast(account_no as string) as account_no,
    cast(partner_id as float64) as partner_id,
    cast(_ab_cdc_lsn as float64) as _ab_cdc_lsn,
    cast(description as string) as description,
    cast(external_id as string) as external_id,
    cast(is_same_day as boolean) as is_same_day,
    cast(return_data as string) as return_data,
    cast(account_name as string) as account_name,
    cast(nullif(effective_date, '') as timestamp) as effective_date,
    cast(reference_info as string) as reference_info,
    cast(transaction_code as float64) as transaction_code,
    cast(source_account_no as string) as source_account_no,
    cast(transaction_in_id as string) as transaction_in_id,
    cast(_ab_cdc_deleted_at as string) as _ab_cdc_deleted_at,
    cast(_ab_cdc_updated_at as string) as _ab_cdc_updated_at,
    cast(source_account_name as string) as source_account_name,
    cast(destination_bank_routing_no as string) as destination_bank_routing_no,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
  from __dbt__cte__transactions_out_ab1
  -- transactions_out
  where 1 = 1
  )-- SQL model to build a hash column based on the values of this record
  -- depends_on: __dbt__cte__transactions_out_ab2
  select
    to_hex(md5(cast(concat(coalesce(cast(id as string), ''), '-', coalesce(cast(data as string), ''), '-', coalesce(cast(uuid as string), ''), '-',
      coalesce(cast(amount as string), ''), '-', coalesce(cast(status as string), ''), '-', coalesce(cast(bank_id as string), ''), '-',
      coalesce(cast(created as string), ''), '-', coalesce(cast(file_id as string), ''), '-', coalesce(cast(updated as string), ''), '-',
      coalesce(cast(trace_no as string), ''), '-', coalesce(cast(account_no as string), ''), '-', coalesce(cast(partner_id as string), ''), '-',
      coalesce(cast(_ab_cdc_lsn as string), ''), '-', coalesce(cast(description as string), ''), '-', coalesce(cast(external_id as string), ''), '-',
      coalesce(cast(is_same_day as string), ''), '-', coalesce(cast(return_data as string), ''), '-', coalesce(cast(account_name as string), ''), '-',
      coalesce(cast(effective_date as string), ''), '-', coalesce(cast(reference_info as string), ''), '-', coalesce(cast(transaction_code as string), ''), '-',
      coalesce(cast(source_account_no as string), ''), '-', coalesce(cast(transaction_in_id as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-',
      coalesce(cast(_ab_cdc_updated_at as string), ''), '-', coalesce(cast(source_account_name as string), ''), '-',
      coalesce(cast(destination_bank_routing_no as string), '')) as string))) as _airbyte_transactions_out_hashid,
    tmp.*
  from __dbt__cte__transactions_out_ab2 tmp
  -- transactions_out
  where 1 = 1
  ;
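(The _airbyte_transactions_out_hashid computed above is an md5 over every field of the record, which the downstream normalization models can use to identify identical rows. As an illustrative sketch only, using names from this log, duplicate records in the staging view could be inspected like this:)
  select _airbyte_transactions_out_hashid, count(*) as copies
  from `mainapi-282501`._airbyte_raw_achilles.transactions_out_stg
  group by _airbyte_transactions_out_hashid
  having count(*) > 1
  order by copies desc;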
2022-07-11 15:50:05 normalization > 15:49:40.381915 [debug] [Thread-5 ]: finished collecting timing info
2022-07-11 15:50:05 normalization > 15:49:40.391063 [debug] [Thread-5 ]: Began executing node model.airbyte_utils.transactions_in_stg
2022-07-11 15:50:05 normalization > 15:49:40.398480 [debug] [Thread-8 ]: On model.airbyte_utils.files_in_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_stg"} */
  create or replace view `mainapi-282501`._airbyte_raw_achilles.`files_in_stg`
  OPTIONS()
  as
  with __dbt__cte__files_in_ab1 as (
  -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema
  -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_files_in
  select
    json_extract_scalar(_airbyte_data, "$['id']") as id,
    json_extract_scalar(_airbyte_data, "$['ended']") as ended,
    json_extract_scalar(_airbyte_data, "$['started']") as started,
    json_extract_scalar(_airbyte_data, "$['updated']") as updated,
    json_extract_scalar(_airbyte_data, "$['file_hash']") as file_hash,
    json_extract_scalar(_airbyte_data, "$['file_name']") as file_name,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn,
    json_extract_scalar(_airbyte_data, "$['iat_entry_count']") as iat_entry_count,
    json_extract_scalar(_airbyte_data, "$['std_entry_count']") as std_entry_count,
    json_extract_scalar(_airbyte_data, "$['total_batch_count']") as total_batch_count,
    json_extract_scalar(_airbyte_data, "$['total_entry_count']") as total_entry_count,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at,
    json_extract_scalar(_airbyte_data, "$['preprocessing_path']") as preprocessing_path,
    json_extract_scalar(_airbyte_data, "$['total_debit_amount']") as total_debit_amount,
    json_extract_scalar(_airbyte_data, "$['postprocessing_path']") as postprocessing_path,
    json_extract_scalar(_airbyte_data, "$['total_credit_amount']") as total_credit_amount,
    json_extract_scalar(_airbyte_data, "$['iat_entries_processed']") as iat_entries_processed,
    json_extract_scalar(_airbyte_data, "$['std_entries_processed']") as std_entries_processed,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
  from `mainapi-282501`.raw_achilles._airbyte_raw_files_in as table_alias
  -- files_in
  where 1 = 1
  ), __dbt__cte__files_in_ab2 as (
  -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
  -- depends_on: __dbt__cte__files_in_ab1
  select
    cast(id as float64) as id,
    cast(nullif(ended, '') as timestamp) as ended,
    cast(nullif(started, '') as timestamp) as started,
    cast(nullif(updated, '') as timestamp) as updated,
    cast(file_hash as string) as file_hash,
    cast(file_name as string) as file_name,
    cast(_ab_cdc_lsn as float64) as _ab_cdc_lsn,
    cast(iat_entry_count as float64) as iat_entry_count,
    cast(std_entry_count as float64) as std_entry_count,
    cast(total_batch_count as float64) as total_batch_count,
    cast(total_entry_count as float64) as total_entry_count,
    cast(_ab_cdc_deleted_at as string) as _ab_cdc_deleted_at,
    cast(_ab_cdc_updated_at as string) as _ab_cdc_updated_at,
    cast(preprocessing_path as string) as preprocessing_path,
    cast(total_debit_amount as float64) as total_debit_amount,
    cast(postprocessing_path as string) as postprocessing_path,
    cast(total_credit_amount as float64) as total_credit_amount,
    cast(iat_entries_processed as float64) as iat_entries_processed,
    cast(std_entries_processed as float64) as std_entries_processed,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
  from __dbt__cte__files_in_ab1
  -- files_in
  where 1 = 1
  )-- SQL model to build a hash column based on the values of this record
  -- depends_on: __dbt__cte__files_in_ab2
  select
    to_hex(md5(cast(concat(coalesce(cast(id as string), ''), '-', coalesce(cast(ended as string), ''), '-', coalesce(cast(started as string), ''), '-',
      coalesce(cast(updated as string), ''), '-', coalesce(cast(file_hash as string), ''), '-', coalesce(cast(file_name as string), ''), '-',
      coalesce(cast(_ab_cdc_lsn as string), ''), '-', coalesce(cast(iat_entry_count as string), ''), '-', coalesce(cast(std_entry_count as string), ''), '-',
      coalesce(cast(total_batch_count as string), ''), '-', coalesce(cast(total_entry_count as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-',
      coalesce(cast(_ab_cdc_updated_at as string), ''), '-', coalesce(cast(preprocessing_path as string), ''), '-', coalesce(cast(total_debit_amount as string), ''), '-',
      coalesce(cast(postprocessing_path as string), ''), '-', coalesce(cast(total_credit_amount as string), ''), '-', coalesce(cast(iat_entries_processed as string), ''), '-',
      coalesce(cast(std_entries_processed as string), '')) as string))) as _airbyte_files_in_hashid,
    tmp.*
  from __dbt__cte__files_in_ab2 tmp
  -- files_in
  where 1 = 1
  ;
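(Each staging view also carries the _airbyte_emitted_at marker selected above. The query below is a minimal sketch, again using only names visible in this log and assuming _airbyte_emitted_at is a TIMESTAMP column, for checking how recently records landed in files_in relative to this normalization run:)
  select
    max(_airbyte_emitted_at) as last_emitted_at,
    timestamp_diff(current_timestamp(), max(_airbyte_emitted_at), minute) as minutes_since_last_record
  from `mainapi-282501`._airbyte_raw_achilles.files_in_stg;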
normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_files_in_hashid, 2022-07-11 15:50:05 normalization > tmp.* 2022-07-11 15:50:05 normalization > from __dbt__cte__files_in_ab2 tmp 2022-07-11 15:50:05 normalization > -- files_in 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > ; 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:40.400946 [debug] [Thread-5 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_in_stg" 2022-07-11 15:50:05 normalization > 15:49:40.406702 [debug] [Thread-5 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:40.416519 [debug] [Thread-5 ]: On model.airbyte_utils.transactions_in_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_in_stg"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace view `mainapi-282501`._airbyte_raw_achilles.`transactions_in_stg` 2022-07-11 15:50:05 normalization > OPTIONS() 2022-07-11 15:50:05 normalization > as 2022-07-11 15:50:05 normalization > with __dbt__cte__transactions_in_ab1 as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-07-11 15:50:05 normalization > -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_transactions_in 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['id']") as id, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['uuid']") as uuid, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['amount']") as amount, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['created']") as created, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['updated']") as updated, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['returned']") as returned, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['sec_code']") as sec_code, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['file_hash']") as file_hash, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['file_name']") as file_name, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_02']") as addenda_02, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_05']") as addenda_05, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_10']") as addenda_10, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_11']") as addenda_11, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_12']") as addenda_12, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_13']") as addenda_13, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_14']") as addenda_14, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_15']") as addenda_15, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, 
"$['addenda_16']") as addenda_16, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_17']") as addenda_17, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_18']") as addenda_18, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_98']") as addenda_98, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_99']") as addenda_99, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['batch_type']") as batch_type, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['company_id']") as company_id, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['partner_id']") as partner_id, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['external_id']") as external_id, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['return_data']") as return_data, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['batch_number']") as batch_number, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['company_name']") as company_name, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['future_dated']") as future_dated, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['originator_id']") as originator_id, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['receiving_dfi']") as receiving_dfi, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['dfi_account_no']") as dfi_account_no, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['effective_date']") as effective_date, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['entry_trace_no']") as entry_trace_no, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['individual_name']") as individual_name, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['originating_dfi']") as originating_dfi, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['settlement_date']") as settlement_date, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['individual_id_no']") as individual_id_no, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['transaction_code']") as transaction_code, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['processing_history']") as processing_history, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['transaction_out_id']") as transaction_out_id, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['addenda_record_count']") as addenda_record_count, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['destination_country_code']") as destination_country_code, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['company_entry_description']") as company_entry_description, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['destination_currency_code']") as 
destination_currency_code, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['originating_currency_code']") as originating_currency_code, 2022-07-11 15:50:05 normalization > json_extract_scalar(_airbyte_data, "$['foreign_exchange_indicator']") as foreign_exchange_indicator, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_in as table_alias 2022-07-11 15:50:05 normalization > -- transactions_in 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ), __dbt__cte__transactions_in_ab2 as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type 2022-07-11 15:50:05 normalization > -- depends_on: __dbt__cte__transactions_in_ab1 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > cast(id as 2022-07-11 15:50:05 normalization > float64 2022-07-11 15:50:05 normalization > ) as id, 2022-07-11 15:50:05 normalization > cast(uuid as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as uuid, 2022-07-11 15:50:05 normalization > cast(amount as 2022-07-11 15:50:05 normalization > float64 2022-07-11 15:50:05 normalization > ) as amount, 2022-07-11 15:50:05 normalization > cast(bank_id as 2022-07-11 15:50:05 normalization > float64 2022-07-11 15:50:05 normalization > ) as bank_id, 2022-07-11 15:50:05 normalization > cast(nullif(created, '') as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) as created, 2022-07-11 15:50:05 normalization > cast(nullif(updated, '') as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) as updated, 2022-07-11 15:50:05 normalization > cast(returned as boolean) as returned, 2022-07-11 15:50:05 normalization > cast(sec_code as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as sec_code, 2022-07-11 15:50:05 normalization > cast(file_hash as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as file_hash, 2022-07-11 15:50:05 normalization > cast(file_name as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as file_name, 2022-07-11 15:50:05 normalization > cast(addenda_02 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_02, 2022-07-11 15:50:05 normalization > cast(addenda_05 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_05, 2022-07-11 15:50:05 normalization > cast(addenda_10 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_10, 2022-07-11 15:50:05 normalization > cast(addenda_11 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_11, 2022-07-11 15:50:05 normalization > cast(addenda_12 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_12, 2022-07-11 15:50:05 normalization > cast(addenda_13 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_13, 2022-07-11 15:50:05 normalization > cast(addenda_14 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization 
> ) as addenda_14, 2022-07-11 15:50:05 normalization > cast(addenda_15 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_15, 2022-07-11 15:50:05 normalization > cast(addenda_16 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_16, 2022-07-11 15:50:05 normalization > cast(addenda_17 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_17, 2022-07-11 15:50:05 normalization > cast(addenda_18 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_18, 2022-07-11 15:50:05 normalization > cast(addenda_98 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_98, 2022-07-11 15:50:05 normalization > cast(addenda_99 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_99, 2022-07-11 15:50:05 normalization > cast(batch_type as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as batch_type, 2022-07-11 15:50:05 normalization > cast(company_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as company_id, 2022-07-11 15:50:05 normalization > cast(partner_id as 2022-07-11 15:50:05 normalization > float64 2022-07-11 15:50:05 normalization > ) as partner_id, 2022-07-11 15:50:05 normalization > cast(_ab_cdc_lsn as 2022-07-11 15:50:05 normalization > float64 2022-07-11 15:50:05 normalization > ) as _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > cast(external_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as external_id, 2022-07-11 15:50:05 normalization > cast(return_data as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as return_data, 2022-07-11 15:50:05 normalization > cast(batch_number as 2022-07-11 15:50:05 normalization > float64 2022-07-11 15:50:05 normalization > ) as batch_number, 2022-07-11 15:50:05 normalization > cast(company_name as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as company_name, 2022-07-11 15:50:05 normalization > cast(future_dated as boolean) as future_dated, 2022-07-11 15:50:05 normalization > cast(originator_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as originator_id, 2022-07-11 15:50:05 normalization > cast(receiving_dfi as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as receiving_dfi, 2022-07-11 15:50:05 normalization > cast(dfi_account_no as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as dfi_account_no, 2022-07-11 15:50:05 normalization > cast(nullif(effective_date, '') as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) as effective_date, 2022-07-11 15:50:05 normalization > cast(entry_trace_no as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as entry_trace_no, 2022-07-11 15:50:05 normalization > cast(individual_name as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as individual_name, 2022-07-11 15:50:05 normalization > cast(originating_dfi as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as originating_dfi, 2022-07-11 15:50:05 normalization > cast(nullif(settlement_date, '') as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) as settlement_date, 2022-07-11 
15:50:05 normalization > cast(individual_id_no as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as individual_id_no, 2022-07-11 15:50:05 normalization > cast(transaction_code as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as transaction_code, 2022-07-11 15:50:05 normalization > cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > cast(processing_history as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as processing_history, 2022-07-11 15:50:05 normalization > cast(transaction_out_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as transaction_out_id, 2022-07-11 15:50:05 normalization > cast(addenda_record_count as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as addenda_record_count, 2022-07-11 15:50:05 normalization > cast(destination_country_code as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as destination_country_code, 2022-07-11 15:50:05 normalization > cast(company_entry_description as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as company_entry_description, 2022-07-11 15:50:05 normalization > cast(destination_currency_code as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as destination_currency_code, 2022-07-11 15:50:05 normalization > cast(originating_currency_code as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as originating_currency_code, 2022-07-11 15:50:05 normalization > cast(foreign_exchange_indicator as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) as foreign_exchange_indicator, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:50:05 normalization > from __dbt__cte__transactions_in_ab1 2022-07-11 15:50:05 normalization > -- transactions_in 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > )-- SQL model to build a hash column based on the values of this record 2022-07-11 15:50:05 normalization > -- depends_on: __dbt__cte__transactions_in_ab2 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(uuid as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(amount as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(bank_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(created as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(updated as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(returned as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > 
), ''), '-', coalesce(cast(sec_code as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(file_hash as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(file_name as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_02 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_05 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_10 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_11 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_12 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_13 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_14 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_15 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_16 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_17 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_18 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_98 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_99 as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(batch_type as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(company_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(partner_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_lsn as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(external_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(return_data as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(batch_number as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(company_name as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(future_dated as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(originator_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(receiving_dfi as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(dfi_account_no as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(effective_date as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(entry_trace_no as 2022-07-11 
15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(individual_name as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(originating_dfi as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(settlement_date as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(individual_id_no as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(transaction_code as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(processing_history as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(transaction_out_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(addenda_record_count as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(destination_country_code as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(company_entry_description as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(destination_currency_code as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(originating_currency_code as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(foreign_exchange_indicator as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_transactions_in_hashid, 2022-07-11 15:50:05 normalization > tmp.* 2022-07-11 15:50:05 normalization > from __dbt__cte__transactions_in_ab2 tmp 2022-07-11 15:50:05 normalization > -- transactions_in 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > ; 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:41.085481 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:41.086902 [info ] [Thread-8 ]: 4 of 18 OK created view model _airbyte_raw_achilles.files_in_stg........................................................ [OK in 1.32s] 2022-07-11 15:50:05 normalization > 15:49:41.088223 [debug] [Thread-8 ]: Finished running node model.airbyte_utils.files_in_stg 2022-07-11 15:50:05 normalization > 15:49:41.089724 [debug] [Thread-1 ]: Began running node model.airbyte_utils.files_in_scd 2022-07-11 15:50:05 normalization > 15:49:41.090410 [info ] [Thread-1 ]: 7 of 18 START incremental model raw_achilles.files_in_scd............................................................... 
[RUN] 2022-07-11 15:50:05 normalization > 15:49:41.091826 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_scd" 2022-07-11 15:50:05 normalization > 15:49:41.092063 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.files_in_scd 2022-07-11 15:50:05 normalization > 15:49:41.092287 [debug] [Thread-1 ]: Compiling model.airbyte_utils.files_in_scd 2022-07-11 15:50:05 normalization > 15:49:41.135909 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:41.142811 [debug] [Thread-1 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:41.144135 [info ] [Thread-7 ]: 3 of 18 OK created view model _airbyte_raw_achilles.partner_config_stg.................................................. [OK in 1.38s] 2022-07-11 15:50:05 normalization > 15:49:41.146342 [debug] [Thread-7 ]: Finished running node model.airbyte_utils.partner_config_stg 2022-07-11 15:50:05 normalization > 15:49:41.147439 [debug] [Thread-8 ]: Began running node model.airbyte_utils.partner_config_scd 2022-07-11 15:50:05 normalization > 15:49:41.148029 [info ] [Thread-8 ]: 8 of 18 START incremental model raw_achilles.partner_config_scd......................................................... [RUN] 2022-07-11 15:50:05 normalization > 15:49:41.149517 [debug] [Thread-8 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_scd" 2022-07-11 15:50:05 normalization > 15:49:41.149756 [debug] [Thread-8 ]: Began compiling node model.airbyte_utils.partner_config_scd 2022-07-11 15:50:05 normalization > 15:49:41.149966 [debug] [Thread-8 ]: Compiling model.airbyte_utils.partner_config_scd 2022-07-11 15:50:05 normalization > 15:49:41.170806 [debug] [Thread-8 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:41.182129 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:41.183520 [info ] [Thread-6 ]: 1 of 18 OK created view model _airbyte_raw_achilles.bank_config_stg..................................................... [OK in 1.42s] 2022-07-11 15:50:05 normalization > 15:49:41.184869 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.bank_config_stg 2022-07-11 15:50:05 normalization > 15:49:41.195562 [debug] [Thread-7 ]: Began running node model.airbyte_utils.bank_config_scd 2022-07-11 15:50:05 normalization > 15:49:41.196767 [info ] [Thread-7 ]: 9 of 18 START incremental model raw_achilles.bank_config_scd............................................................ [RUN] 2022-07-11 15:50:05 normalization > 15:49:41.199206 [debug] [Thread-7 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_scd" 2022-07-11 15:50:05 normalization > 15:49:41.202955 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:41.203211 [debug] [Thread-7 ]: Began compiling node model.airbyte_utils.bank_config_scd 2022-07-11 15:50:05 normalization > 15:49:41.204925 [info ] [Thread-4 ]: 5 of 18 OK created view model _airbyte_raw_achilles.transactions_out_stg................................................ 
[OK in 1.39s] 2022-07-11 15:50:05 normalization > 15:49:41.205285 [debug] [Thread-7 ]: Compiling model.airbyte_utils.bank_config_scd 2022-07-11 15:50:05 normalization > 15:49:41.206289 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.transactions_out_stg 2022-07-11 15:50:05 normalization > 15:49:41.220288 [debug] [Thread-6 ]: Began running node model.airbyte_utils.transactions_out_scd 2022-07-11 15:50:05 normalization > 15:49:41.220961 [info ] [Thread-6 ]: 10 of 18 START incremental model raw_achilles.transactions_out_scd...................................................... [RUN] 2022-07-11 15:50:05 normalization > 15:49:41.222641 [debug] [Thread-6 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_scd" 2022-07-11 15:50:05 normalization > 15:49:41.234057 [debug] [Thread-6 ]: Began compiling node model.airbyte_utils.transactions_out_scd 2022-07-11 15:50:05 normalization > 15:49:41.235596 [debug] [Thread-7 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:41.235974 [debug] [Thread-6 ]: Compiling model.airbyte_utils.transactions_out_scd 2022-07-11 15:50:05 normalization > 15:49:41.254993 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:41.257289 [info ] [Thread-5 ]: 6 of 18 OK created view model _airbyte_raw_achilles.transactions_in_stg................................................. [OK in 1.44s] 2022-07-11 15:50:05 normalization > 15:49:41.276923 [debug] [Thread-6 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:41.281907 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.transactions_in_stg 2022-07-11 15:50:05 normalization > 15:49:41.283648 [debug] [Thread-4 ]: Began running node model.airbyte_utils.transactions_in_scd 2022-07-11 15:50:05 normalization > 15:49:41.286092 [info ] [Thread-4 ]: 11 of 18 START incremental model raw_achilles.transactions_in_scd....................................................... [RUN] 2022-07-11 15:50:05 normalization > 15:49:41.287962 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_scd" 2022-07-11 15:50:05 normalization > 15:49:41.288303 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.transactions_in_scd 2022-07-11 15:50:05 normalization > 15:49:41.288639 [debug] [Thread-4 ]: Compiling model.airbyte_utils.transactions_in_scd 2022-07-11 15:50:05 normalization > 15:49:41.302341 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:41.313009 [info ] [Thread-3 ]: 2 of 18 OK created view model _airbyte_raw_achilles.files_out_stg....................................................... [OK in 1.55s] 2022-07-11 15:50:05 normalization > 15:49:41.348725 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_scd" 2022-07-11 15:50:05 normalization > 15:49:41.350850 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.files_out_stg 2022-07-11 15:50:05 normalization > 15:49:41.355188 [debug] [Thread-5 ]: Began running node model.airbyte_utils.files_out_scd 2022-07-11 15:50:05 normalization > 15:49:41.355978 [info ] [Thread-5 ]: 12 of 18 START incremental model raw_achilles.files_out_scd............................................................. 
[RUN] 2022-07-11 15:50:05 normalization > 15:49:41.357641 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_scd" 2022-07-11 15:50:05 normalization > 15:49:41.358150 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.files_out_scd 2022-07-11 15:50:05 normalization > 15:49:41.361891 [debug] [Thread-5 ]: Compiling model.airbyte_utils.files_out_scd 2022-07-11 15:50:05 normalization > 15:49:41.362400 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:41.379600 [debug] [Thread-4 ]: Began executing node model.airbyte_utils.transactions_in_scd 2022-07-11 15:50:05 normalization > 15:49:41.387370 [debug] [Thread-5 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:41.435461 [debug] [Thread-4 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:41.636707 [debug] [Thread-4 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/transactions_in_scd?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.transactions_in_scd 2022-07-11 15:50:05 normalization > 15:49:41.653741 [info ] [Thread-4 ]: 15:49:41 + `mainapi-282501`.raw_achilles.`transactions_in_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:50:05 normalization > 15:49:41.732458 [debug] [Thread-4 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_in_scd" 2022-07-11 15:50:05 normalization > 15:49:41.771170 [debug] [Thread-4 ]: On model.airbyte_utils.transactions_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_in_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`transactions_in_scd` 2022-07-11 15:50:05 normalization > partition by range_bucket( 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > generate_array(0, 1, 1) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS() 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- depends_on: ref('transactions_in_stg') 2022-07-11 15:50:05 normalization > with 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > input_data as ( 2022-07-11 15:50:05 normalization > select * 2022-07-11 15:50:05 normalization > from `mainapi-282501`._airbyte_raw_achilles.`transactions_in_stg` 2022-07-11 15:50:05 normalization > -- transactions_in from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_in 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > scd_data as ( 2022-07-11 15:50:05 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key, 2022-07-11 
15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > uuid, 2022-07-11 15:50:05 normalization > amount, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > returned, 2022-07-11 15:50:05 normalization > sec_code, 2022-07-11 15:50:05 normalization > file_hash, 2022-07-11 15:50:05 normalization > file_name, 2022-07-11 15:50:05 normalization > addenda_02, 2022-07-11 15:50:05 normalization > addenda_05, 2022-07-11 15:50:05 normalization > addenda_10, 2022-07-11 15:50:05 normalization > addenda_11, 2022-07-11 15:50:05 normalization > addenda_12, 2022-07-11 15:50:05 normalization > addenda_13, 2022-07-11 15:50:05 normalization > addenda_14, 2022-07-11 15:50:05 normalization > addenda_15, 2022-07-11 15:50:05 normalization > addenda_16, 2022-07-11 15:50:05 normalization > addenda_17, 2022-07-11 15:50:05 normalization > addenda_18, 2022-07-11 15:50:05 normalization > addenda_98, 2022-07-11 15:50:05 normalization > addenda_99, 2022-07-11 15:50:05 normalization > batch_type, 2022-07-11 15:50:05 normalization > company_id, 2022-07-11 15:50:05 normalization > partner_id, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > external_id, 2022-07-11 15:50:05 normalization > return_data, 2022-07-11 15:50:05 normalization > batch_number, 2022-07-11 15:50:05 normalization > company_name, 2022-07-11 15:50:05 normalization > future_dated, 2022-07-11 15:50:05 normalization > originator_id, 2022-07-11 15:50:05 normalization > receiving_dfi, 2022-07-11 15:50:05 normalization > dfi_account_no, 2022-07-11 15:50:05 normalization > effective_date, 2022-07-11 15:50:05 normalization > entry_trace_no, 2022-07-11 15:50:05 normalization > individual_name, 2022-07-11 15:50:05 normalization > originating_dfi, 2022-07-11 15:50:05 normalization > settlement_date, 2022-07-11 15:50:05 normalization > individual_id_no, 2022-07-11 15:50:05 normalization > transaction_code, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > processing_history, 2022-07-11 15:50:05 normalization > transaction_out_id, 2022-07-11 15:50:05 normalization > addenda_record_count, 2022-07-11 15:50:05 normalization > destination_country_code, 2022-07-11 15:50:05 normalization > company_entry_description, 2022-07-11 15:50:05 normalization > destination_currency_code, 2022-07-11 15:50:05 normalization > originating_currency_code, 2022-07-11 15:50:05 normalization > foreign_exchange_indicator, 2022-07-11 15:50:05 normalization > updated as _airbyte_start_at, 2022-07-11 15:50:05 normalization > lag(updated) over ( 2022-07-11 15:50:05 normalization > partition by cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) as _airbyte_end_at, 2022-07-11 15:50:05 normalization > case when row_number() over ( 2022-07-11 15:50:05 normalization > partition by cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 
normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > _airbyte_transactions_in_hashid 2022-07-11 15:50:05 normalization > from input_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > dedup_data as ( 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:50:05 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:50:05 normalization > row_number() over ( 2022-07-11 15:50:05 normalization > partition by 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:50:05 normalization > ) as _airbyte_row_num, 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > scd_data.* 2022-07-11 15:50:05 normalization > from scd_data 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > uuid, 2022-07-11 15:50:05 normalization > amount, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > returned, 2022-07-11 15:50:05 normalization > sec_code, 2022-07-11 15:50:05 normalization > file_hash, 2022-07-11 15:50:05 normalization > file_name, 2022-07-11 15:50:05 normalization > addenda_02, 2022-07-11 15:50:05 normalization > addenda_05, 2022-07-11 15:50:05 normalization > addenda_10, 2022-07-11 15:50:05 normalization > addenda_11, 2022-07-11 15:50:05 normalization > addenda_12, 2022-07-11 15:50:05 normalization > addenda_13, 2022-07-11 15:50:05 normalization > addenda_14, 2022-07-11 15:50:05 normalization > addenda_15, 2022-07-11 15:50:05 normalization > addenda_16, 2022-07-11 15:50:05 normalization > addenda_17, 2022-07-11 15:50:05 normalization > addenda_18, 2022-07-11 
15:50:05 normalization > addenda_98, 2022-07-11 15:50:05 normalization > addenda_99, 2022-07-11 15:50:05 normalization > batch_type, 2022-07-11 15:50:05 normalization > company_id, 2022-07-11 15:50:05 normalization > partner_id, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > external_id, 2022-07-11 15:50:05 normalization > return_data, 2022-07-11 15:50:05 normalization > batch_number, 2022-07-11 15:50:05 normalization > company_name, 2022-07-11 15:50:05 normalization > future_dated, 2022-07-11 15:50:05 normalization > originator_id, 2022-07-11 15:50:05 normalization > receiving_dfi, 2022-07-11 15:50:05 normalization > dfi_account_no, 2022-07-11 15:50:05 normalization > effective_date, 2022-07-11 15:50:05 normalization > entry_trace_no, 2022-07-11 15:50:05 normalization > individual_name, 2022-07-11 15:50:05 normalization > originating_dfi, 2022-07-11 15:50:05 normalization > settlement_date, 2022-07-11 15:50:05 normalization > individual_id_no, 2022-07-11 15:50:05 normalization > transaction_code, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > processing_history, 2022-07-11 15:50:05 normalization > transaction_out_id, 2022-07-11 15:50:05 normalization > addenda_record_count, 2022-07-11 15:50:05 normalization > destination_country_code, 2022-07-11 15:50:05 normalization > company_entry_description, 2022-07-11 15:50:05 normalization > destination_currency_code, 2022-07-11 15:50:05 normalization > originating_currency_code, 2022-07-11 15:50:05 normalization > foreign_exchange_indicator, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_end_at, 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_transactions_in_hashid 2022-07-11 15:50:05 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:41.935183 [debug] [Thread-8 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_scd" 2022-07-11 15:50:05 normalization > 15:49:41.935773 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:41.935999 [debug] [Thread-8 ]: Began executing node model.airbyte_utils.partner_config_scd 2022-07-11 15:50:05 normalization > 15:49:41.981331 [debug] [Thread-6 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_scd" 2022-07-11 15:50:05 normalization > 15:49:41.992676 [debug] [Thread-7 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_scd" 2022-07-11 15:50:05 normalization > 15:49:41.994261 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:41.994837 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:41.995144 [debug] [Thread-6 ]: Began executing node model.airbyte_utils.transactions_out_scd 2022-07-11 15:50:05 normalization > 15:49:42.000539 [debug] [Thread-7 ]: Began executing node model.airbyte_utils.bank_config_scd 2022-07-11 15:50:05 normalization > 15:49:42.010054 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.files_in_scd" 2022-07-11 15:50:05 normalization > 
15:49:42.025837 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:42.026115 [debug] [Thread-1 ]: Began executing node model.airbyte_utils.files_in_scd 2022-07-11 15:50:05 normalization > 15:49:42.092872 [debug] [Thread-8 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`partner_config_scd__dbt_tmp` 2022-07-11 15:50:05 normalization > partition by range_bucket( 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > generate_array(0, 1, 1) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS( 2022-07-11 15:50:05 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- depends_on: ref('partner_config_stg') 2022-07-11 15:50:05 normalization > with 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > new_data as ( 2022-07-11 15:50:05 normalization > -- retrieve incremental "new" data 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > * 2022-07-11 15:50:05 normalization > from `mainapi-282501`._airbyte_raw_achilles.`partner_config_stg` 2022-07-11 15:50:05 normalization > -- partner_config from `mainapi-282501`.raw_achilles._airbyte_raw_partner_config 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from `mainapi-282501`.raw_achilles.`partner_config_scd`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > new_data_ids as ( 2022-07-11 15:50:05 normalization > -- build a subset of _airbyte_unique_key from rows that are new 2022-07-11 15:50:05 normalization > select distinct 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(partner_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key 2022-07-11 15:50:05 normalization > from new_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > empty_new_data as ( 2022-07-11 15:50:05 normalization > -- build an empty table to only keep the table's column types 2022-07-11 15:50:05 normalization > select * from new_data where 1 = 0 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > previous_active_scd_data as ( 
2022-07-11 15:50:05 normalization > -- retrieve "incomplete old" data that needs to be updated with an end date because of new changes 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > this_data.`_airbyte_partner_config_hashid`, 2022-07-11 15:50:05 normalization > this_data.`name`, 2022-07-11 15:50:05 normalization > this_data.`config`, 2022-07-11 15:50:05 normalization > this_data.`bank_id`, 2022-07-11 15:50:05 normalization > this_data.`created`, 2022-07-11 15:50:05 normalization > this_data.`updated`, 2022-07-11 15:50:05 normalization > this_data.`partner_id`, 2022-07-11 15:50:05 normalization > this_data.`routing_no`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > this_data.`account_prefix`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_normalized_at` 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`partner_config_scd` as this_data 2022-07-11 15:50:05 normalization > -- make a join with new_data using primary key to filter active data that need to be updated only 2022-07-11 15:50:05 normalization > join new_data_ids on this_data._airbyte_unique_key = new_data_ids._airbyte_unique_key 2022-07-11 15:50:05 normalization > -- force left join to NULL values (we just need to transfer column types only for the star_intersect macro on schema changes) 2022-07-11 15:50:05 normalization > left join empty_new_data as inc_data on this_data._airbyte_ab_id = inc_data._airbyte_ab_id 2022-07-11 15:50:05 normalization > where _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > input_data as ( 2022-07-11 15:50:05 normalization > select `_airbyte_partner_config_hashid`, 2022-07-11 15:50:05 normalization > `name`, 2022-07-11 15:50:05 normalization > `config`, 2022-07-11 15:50:05 normalization > `bank_id`, 2022-07-11 15:50:05 normalization > `created`, 2022-07-11 15:50:05 normalization > `updated`, 2022-07-11 15:50:05 normalization > `partner_id`, 2022-07-11 15:50:05 normalization > `routing_no`, 2022-07-11 15:50:05 normalization > `_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > `account_prefix`, 2022-07-11 15:50:05 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > `_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > `_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > `_airbyte_normalized_at` from new_data 2022-07-11 15:50:05 normalization > union all 2022-07-11 15:50:05 normalization > select `_airbyte_partner_config_hashid`, 2022-07-11 15:50:05 normalization > `name`, 2022-07-11 15:50:05 normalization > `config`, 2022-07-11 15:50:05 normalization > `bank_id`, 2022-07-11 15:50:05 normalization > `created`, 2022-07-11 15:50:05 normalization > `updated`, 2022-07-11 15:50:05 normalization > `partner_id`, 2022-07-11 15:50:05 normalization > `routing_no`, 2022-07-11 15:50:05 normalization > `_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > `account_prefix`, 2022-07-11 15:50:05 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > `_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > `_airbyte_emitted_at`, 2022-07-11 
15:50:05 normalization > `_airbyte_normalized_at` from previous_active_scd_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > scd_data as ( 2022-07-11 15:50:05 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(partner_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key, 2022-07-11 15:50:05 normalization > name, 2022-07-11 15:50:05 normalization > config, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > partner_id, 2022-07-11 15:50:05 normalization > routing_no, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > account_prefix, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > updated as _airbyte_start_at, 2022-07-11 15:50:05 normalization > lag(updated) over ( 2022-07-11 15:50:05 normalization > partition by cast(partner_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) as _airbyte_end_at, 2022-07-11 15:50:05 normalization > case when row_number() over ( 2022-07-11 15:50:05 normalization > partition by cast(partner_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > _airbyte_partner_config_hashid 2022-07-11 15:50:05 normalization > from input_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > dedup_data as ( 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:50:05 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:50:05 normalization > row_number() over ( 2022-07-11 15:50:05 normalization > partition by 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by _airbyte_active_row desc, 
_airbyte_ab_id 2022-07-11 15:50:05 normalization > ) as _airbyte_row_num, 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > scd_data.* 2022-07-11 15:50:05 normalization > from scd_data 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > name, 2022-07-11 15:50:05 normalization > config, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > partner_id, 2022-07-11 15:50:05 normalization > routing_no, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > account_prefix, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_end_at, 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_partner_config_hashid 2022-07-11 15:50:05 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:42.095799 [debug] [Thread-7 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`bank_config_scd__dbt_tmp` 2022-07-11 15:50:05 normalization > partition by range_bucket( 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > generate_array(0, 1, 1) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS( 2022-07-11 15:50:05 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- depends_on: 
ref('bank_config_stg') 2022-07-11 15:50:05 normalization > with 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > new_data as ( 2022-07-11 15:50:05 normalization > -- retrieve incremental "new" data 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > * 2022-07-11 15:50:05 normalization > from `mainapi-282501`._airbyte_raw_achilles.`bank_config_stg` 2022-07-11 15:50:05 normalization > -- bank_config from `mainapi-282501`.raw_achilles._airbyte_raw_bank_config 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from `mainapi-282501`.raw_achilles.`bank_config_scd`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > new_data_ids as ( 2022-07-11 15:50:05 normalization > -- build a subset of _airbyte_unique_key from rows that are new 2022-07-11 15:50:05 normalization > select distinct 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(bank_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key 2022-07-11 15:50:05 normalization > from new_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > empty_new_data as ( 2022-07-11 15:50:05 normalization > -- build an empty table to only keep the table's column types 2022-07-11 15:50:05 normalization > select * from new_data where 1 = 0 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > previous_active_scd_data as ( 2022-07-11 15:50:05 normalization > -- retrieve "incomplete old" data that needs to be updated with an end date because of new changes 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > this_data.`_airbyte_bank_config_hashid`, 2022-07-11 15:50:05 normalization > this_data.`name`, 2022-07-11 15:50:05 normalization > this_data.`config`, 2022-07-11 15:50:05 normalization > this_data.`bank_id`, 2022-07-11 15:50:05 normalization > this_data.`created`, 2022-07-11 15:50:05 normalization > this_data.`updated`, 2022-07-11 15:50:05 normalization > this_data.`routing_no`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_normalized_at` 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`bank_config_scd` as this_data 2022-07-11 15:50:05 normalization > -- make a join with new_data using primary key to filter active data that need to be updated only 2022-07-11 15:50:05 normalization > join new_data_ids on this_data._airbyte_unique_key = new_data_ids._airbyte_unique_key 2022-07-11 15:50:05 normalization > -- force left join to NULL values (we just need to transfer column types only for the 
star_intersect macro on schema changes) 2022-07-11 15:50:05 normalization > left join empty_new_data as inc_data on this_data._airbyte_ab_id = inc_data._airbyte_ab_id 2022-07-11 15:50:05 normalization > where _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > input_data as ( 2022-07-11 15:50:05 normalization > select `_airbyte_bank_config_hashid`, 2022-07-11 15:50:05 normalization > `name`, 2022-07-11 15:50:05 normalization > `config`, 2022-07-11 15:50:05 normalization > `bank_id`, 2022-07-11 15:50:05 normalization > `created`, 2022-07-11 15:50:05 normalization > `updated`, 2022-07-11 15:50:05 normalization > `routing_no`, 2022-07-11 15:50:05 normalization > `_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > `_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > `_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > `_airbyte_normalized_at` from new_data 2022-07-11 15:50:05 normalization > union all 2022-07-11 15:50:05 normalization > select `_airbyte_bank_config_hashid`, 2022-07-11 15:50:05 normalization > `name`, 2022-07-11 15:50:05 normalization > `config`, 2022-07-11 15:50:05 normalization > `bank_id`, 2022-07-11 15:50:05 normalization > `created`, 2022-07-11 15:50:05 normalization > `updated`, 2022-07-11 15:50:05 normalization > `routing_no`, 2022-07-11 15:50:05 normalization > `_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > `_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > `_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > `_airbyte_normalized_at` from previous_active_scd_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > scd_data as ( 2022-07-11 15:50:05 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(bank_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key, 2022-07-11 15:50:05 normalization > name, 2022-07-11 15:50:05 normalization > config, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > routing_no, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > updated as _airbyte_start_at, 2022-07-11 15:50:05 normalization > lag(updated) over ( 2022-07-11 15:50:05 normalization > partition by cast(bank_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) as _airbyte_end_at, 2022-07-11 15:50:05 normalization > case when row_number() over ( 2022-07-11 15:50:05 normalization > partition by 
cast(bank_id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > _airbyte_bank_config_hashid 2022-07-11 15:50:05 normalization > from input_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > dedup_data as ( 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:50:05 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:50:05 normalization > row_number() over ( 2022-07-11 15:50:05 normalization > partition by 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:50:05 normalization > ) as _airbyte_row_num, 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > scd_data.* 2022-07-11 15:50:05 normalization > from scd_data 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > name, 2022-07-11 15:50:05 normalization > config, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > routing_no, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_end_at, 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as 
_airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_bank_config_hashid 2022-07-11 15:50:05 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:42.117706 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.files_out_scd" 2022-07-11 15:50:05 normalization > 15:49:42.118530 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:42.118779 [debug] [Thread-5 ]: Began executing node model.airbyte_utils.files_out_scd 2022-07-11 15:50:05 normalization > 15:49:42.133718 [debug] [Thread-1 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`files_in_scd__dbt_tmp` 2022-07-11 15:50:05 normalization > partition by range_bucket( 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > generate_array(0, 1, 1) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS( 2022-07-11 15:50:05 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- depends_on: ref('files_in_stg') 2022-07-11 15:50:05 normalization > with 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > new_data as ( 2022-07-11 15:50:05 normalization > -- retrieve incremental "new" data 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > * 2022-07-11 15:50:05 normalization > from `mainapi-282501`._airbyte_raw_achilles.`files_in_stg` 2022-07-11 15:50:05 normalization > -- files_in from `mainapi-282501`.raw_achilles._airbyte_raw_files_in 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from `mainapi-282501`.raw_achilles.`files_in_scd`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > new_data_ids as ( 2022-07-11 15:50:05 normalization > -- build a subset of _airbyte_unique_key from rows that are new 2022-07-11 15:50:05 normalization > select distinct 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key 2022-07-11 15:50:05 normalization > from new_data 
2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > empty_new_data as ( 2022-07-11 15:50:05 normalization > -- build an empty table to only keep the table's column types 2022-07-11 15:50:05 normalization > select * from new_data where 1 = 0 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > previous_active_scd_data as ( 2022-07-11 15:50:05 normalization > -- retrieve "incomplete old" data that needs to be updated with an end date because of new changes 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > this_data.`_airbyte_files_in_hashid`, 2022-07-11 15:50:05 normalization > this_data.`id`, 2022-07-11 15:50:05 normalization > this_data.`ended`, 2022-07-11 15:50:05 normalization > this_data.`started`, 2022-07-11 15:50:05 normalization > this_data.`updated`, 2022-07-11 15:50:05 normalization > this_data.`file_hash`, 2022-07-11 15:50:05 normalization > this_data.`file_name`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > this_data.`iat_entry_count`, 2022-07-11 15:50:05 normalization > this_data.`std_entry_count`, 2022-07-11 15:50:05 normalization > this_data.`total_batch_count`, 2022-07-11 15:50:05 normalization > this_data.`total_entry_count`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > this_data.`preprocessing_path`, 2022-07-11 15:50:05 normalization > this_data.`total_debit_amount`, 2022-07-11 15:50:05 normalization > this_data.`postprocessing_path`, 2022-07-11 15:50:05 normalization > this_data.`total_credit_amount`, 2022-07-11 15:50:05 normalization > this_data.`iat_entries_processed`, 2022-07-11 15:50:05 normalization > this_data.`std_entries_processed`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_normalized_at` 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`files_in_scd` as this_data 2022-07-11 15:50:05 normalization > -- make a join with new_data using primary key to filter active data that need to be updated only 2022-07-11 15:50:05 normalization > join new_data_ids on this_data._airbyte_unique_key = new_data_ids._airbyte_unique_key 2022-07-11 15:50:05 normalization > -- force left join to NULL values (we just need to transfer column types only for the star_intersect macro on schema changes) 2022-07-11 15:50:05 normalization > left join empty_new_data as inc_data on this_data._airbyte_ab_id = inc_data._airbyte_ab_id 2022-07-11 15:50:05 normalization > where _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > input_data as ( 2022-07-11 15:50:05 normalization > select `_airbyte_files_in_hashid`, 2022-07-11 15:50:05 normalization > `id`, 2022-07-11 15:50:05 normalization > `ended`, 2022-07-11 15:50:05 normalization > `started`, 2022-07-11 15:50:05 normalization > `updated`, 2022-07-11 15:50:05 normalization > `file_hash`, 2022-07-11 15:50:05 normalization > `file_name`, 2022-07-11 15:50:05 normalization > `_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > `iat_entry_count`, 2022-07-11 15:50:05 normalization > `std_entry_count`, 2022-07-11 15:50:05 normalization > `total_batch_count`, 2022-07-11 15:50:05 normalization > `total_entry_count`, 2022-07-11 15:50:05 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:50:05 
normalization > `_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > `preprocessing_path`, 2022-07-11 15:50:05 normalization > `total_debit_amount`, 2022-07-11 15:50:05 normalization > `postprocessing_path`, 2022-07-11 15:50:05 normalization > `total_credit_amount`, 2022-07-11 15:50:05 normalization > `iat_entries_processed`, 2022-07-11 15:50:05 normalization > `std_entries_processed`, 2022-07-11 15:50:05 normalization > `_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > `_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > `_airbyte_normalized_at` from new_data 2022-07-11 15:50:05 normalization > union all 2022-07-11 15:50:05 normalization > select `_airbyte_files_in_hashid`, 2022-07-11 15:50:05 normalization > `id`, 2022-07-11 15:50:05 normalization > `ended`, 2022-07-11 15:50:05 normalization > `started`, 2022-07-11 15:50:05 normalization > `updated`, 2022-07-11 15:50:05 normalization > `file_hash`, 2022-07-11 15:50:05 normalization > `file_name`, 2022-07-11 15:50:05 normalization > `_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > `iat_entry_count`, 2022-07-11 15:50:05 normalization > `std_entry_count`, 2022-07-11 15:50:05 normalization > `total_batch_count`, 2022-07-11 15:50:05 normalization > `total_entry_count`, 2022-07-11 15:50:05 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > `preprocessing_path`, 2022-07-11 15:50:05 normalization > `total_debit_amount`, 2022-07-11 15:50:05 normalization > `postprocessing_path`, 2022-07-11 15:50:05 normalization > `total_credit_amount`, 2022-07-11 15:50:05 normalization > `iat_entries_processed`, 2022-07-11 15:50:05 normalization > `std_entries_processed`, 2022-07-11 15:50:05 normalization > `_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > `_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > `_airbyte_normalized_at` from previous_active_scd_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > scd_data as ( 2022-07-11 15:50:05 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key, 2022-07-11 15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > ended, 2022-07-11 15:50:05 normalization > started, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > file_hash, 2022-07-11 15:50:05 normalization > file_name, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > iat_entry_count, 2022-07-11 15:50:05 normalization > std_entry_count, 2022-07-11 15:50:05 normalization > total_batch_count, 2022-07-11 15:50:05 normalization > total_entry_count, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > preprocessing_path, 2022-07-11 15:50:05 normalization > total_debit_amount, 2022-07-11 15:50:05 normalization > postprocessing_path, 2022-07-11 15:50:05 normalization > total_credit_amount, 2022-07-11 15:50:05 normalization > iat_entries_processed, 2022-07-11 15:50:05 normalization > std_entries_processed, 2022-07-11 15:50:05 normalization > 
updated as _airbyte_start_at, 2022-07-11 15:50:05 normalization > lag(updated) over ( 2022-07-11 15:50:05 normalization > partition by cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) as _airbyte_end_at, 2022-07-11 15:50:05 normalization > case when row_number() over ( 2022-07-11 15:50:05 normalization > partition by cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > _airbyte_files_in_hashid 2022-07-11 15:50:05 normalization > from input_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > dedup_data as ( 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:50:05 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:50:05 normalization > row_number() over ( 2022-07-11 15:50:05 normalization > partition by 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:50:05 normalization > ) as _airbyte_row_num, 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > scd_data.* 2022-07-11 15:50:05 normalization > from scd_data 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > ended, 2022-07-11 15:50:05 
normalization > started, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > file_hash, 2022-07-11 15:50:05 normalization > file_name, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > iat_entry_count, 2022-07-11 15:50:05 normalization > std_entry_count, 2022-07-11 15:50:05 normalization > total_batch_count, 2022-07-11 15:50:05 normalization > total_entry_count, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > preprocessing_path, 2022-07-11 15:50:05 normalization > total_debit_amount, 2022-07-11 15:50:05 normalization > postprocessing_path, 2022-07-11 15:50:05 normalization > total_credit_amount, 2022-07-11 15:50:05 normalization > iat_entries_processed, 2022-07-11 15:50:05 normalization > std_entries_processed, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_end_at, 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_files_in_hashid 2022-07-11 15:50:05 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:42.151646 [debug] [Thread-6 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`transactions_out_scd__dbt_tmp` 2022-07-11 15:50:05 normalization > partition by range_bucket( 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > generate_array(0, 1, 1) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS( 2022-07-11 15:50:05 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- depends_on: ref('transactions_out_stg') 2022-07-11 15:50:05 normalization > with 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > new_data as ( 2022-07-11 15:50:05 normalization > -- retrieve incremental "new" data 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > * 2022-07-11 15:50:05 normalization > from `mainapi-282501`._airbyte_raw_achilles.`transactions_out_stg` 2022-07-11 15:50:05 normalization > -- transactions_out from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= 
(select max(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from `mainapi-282501`.raw_achilles.`transactions_out_scd`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > new_data_ids as ( 2022-07-11 15:50:05 normalization > -- build a subset of _airbyte_unique_key from rows that are new 2022-07-11 15:50:05 normalization > select distinct 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key 2022-07-11 15:50:05 normalization > from new_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > empty_new_data as ( 2022-07-11 15:50:05 normalization > -- build an empty table to only keep the table's column types 2022-07-11 15:50:05 normalization > select * from new_data where 1 = 0 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > previous_active_scd_data as ( 2022-07-11 15:50:05 normalization > -- retrieve "incomplete old" data that needs to be updated with an end date because of new changes 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > this_data.`_airbyte_transactions_out_hashid`, 2022-07-11 15:50:05 normalization > this_data.`id`, 2022-07-11 15:50:05 normalization > this_data.`data`, 2022-07-11 15:50:05 normalization > this_data.`uuid`, 2022-07-11 15:50:05 normalization > this_data.`amount`, 2022-07-11 15:50:05 normalization > this_data.`status`, 2022-07-11 15:50:05 normalization > this_data.`bank_id`, 2022-07-11 15:50:05 normalization > this_data.`created`, 2022-07-11 15:50:05 normalization > this_data.`file_id`, 2022-07-11 15:50:05 normalization > this_data.`updated`, 2022-07-11 15:50:05 normalization > this_data.`trace_no`, 2022-07-11 15:50:05 normalization > this_data.`account_no`, 2022-07-11 15:50:05 normalization > this_data.`partner_id`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > this_data.`description`, 2022-07-11 15:50:05 normalization > this_data.`external_id`, 2022-07-11 15:50:05 normalization > this_data.`is_same_day`, 2022-07-11 15:50:05 normalization > this_data.`return_data`, 2022-07-11 15:50:05 normalization > this_data.`account_name`, 2022-07-11 15:50:05 normalization > this_data.`effective_date`, 2022-07-11 15:50:05 normalization > this_data.`reference_info`, 2022-07-11 15:50:05 normalization > this_data.`transaction_code`, 2022-07-11 15:50:05 normalization > this_data.`source_account_no`, 2022-07-11 15:50:05 normalization > this_data.`transaction_in_id`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > this_data.`source_account_name`, 2022-07-11 15:50:05 normalization > this_data.`destination_bank_routing_no`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_normalized_at` 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`transactions_out_scd` as this_data 2022-07-11 15:50:05 normalization > -- make a join 
with new_data using primary key to filter active data that need to be updated only 2022-07-11 15:50:05 normalization > join new_data_ids on this_data._airbyte_unique_key = new_data_ids._airbyte_unique_key 2022-07-11 15:50:05 normalization > -- force left join to NULL values (we just need to transfer column types only for the star_intersect macro on schema changes) 2022-07-11 15:50:05 normalization > left join empty_new_data as inc_data on this_data._airbyte_ab_id = inc_data._airbyte_ab_id 2022-07-11 15:50:05 normalization > where _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > input_data as ( 2022-07-11 15:50:05 normalization > select `_airbyte_transactions_out_hashid`, 2022-07-11 15:50:05 normalization > `id`, 2022-07-11 15:50:05 normalization > `data`, 2022-07-11 15:50:05 normalization > `uuid`, 2022-07-11 15:50:05 normalization > `amount`, 2022-07-11 15:50:05 normalization > `status`, 2022-07-11 15:50:05 normalization > `bank_id`, 2022-07-11 15:50:05 normalization > `created`, 2022-07-11 15:50:05 normalization > `file_id`, 2022-07-11 15:50:05 normalization > `updated`, 2022-07-11 15:50:05 normalization > `trace_no`, 2022-07-11 15:50:05 normalization > `account_no`, 2022-07-11 15:50:05 normalization > `partner_id`, 2022-07-11 15:50:05 normalization > `_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > `description`, 2022-07-11 15:50:05 normalization > `external_id`, 2022-07-11 15:50:05 normalization > `is_same_day`, 2022-07-11 15:50:05 normalization > `return_data`, 2022-07-11 15:50:05 normalization > `account_name`, 2022-07-11 15:50:05 normalization > `effective_date`, 2022-07-11 15:50:05 normalization > `reference_info`, 2022-07-11 15:50:05 normalization > `transaction_code`, 2022-07-11 15:50:05 normalization > `source_account_no`, 2022-07-11 15:50:05 normalization > `transaction_in_id`, 2022-07-11 15:50:05 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > `source_account_name`, 2022-07-11 15:50:05 normalization > `destination_bank_routing_no`, 2022-07-11 15:50:05 normalization > `_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > `_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > `_airbyte_normalized_at` from new_data 2022-07-11 15:50:05 normalization > union all 2022-07-11 15:50:05 normalization > select `_airbyte_transactions_out_hashid`, 2022-07-11 15:50:05 normalization > `id`, 2022-07-11 15:50:05 normalization > `data`, 2022-07-11 15:50:05 normalization > `uuid`, 2022-07-11 15:50:05 normalization > `amount`, 2022-07-11 15:50:05 normalization > `status`, 2022-07-11 15:50:05 normalization > `bank_id`, 2022-07-11 15:50:05 normalization > `created`, 2022-07-11 15:50:05 normalization > `file_id`, 2022-07-11 15:50:05 normalization > `updated`, 2022-07-11 15:50:05 normalization > `trace_no`, 2022-07-11 15:50:05 normalization > `account_no`, 2022-07-11 15:50:05 normalization > `partner_id`, 2022-07-11 15:50:05 normalization > `_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > `description`, 2022-07-11 15:50:05 normalization > `external_id`, 2022-07-11 15:50:05 normalization > `is_same_day`, 2022-07-11 15:50:05 normalization > `return_data`, 2022-07-11 15:50:05 normalization > `account_name`, 2022-07-11 15:50:05 normalization > `effective_date`, 2022-07-11 15:50:05 normalization > `reference_info`, 2022-07-11 15:50:05 normalization > `transaction_code`, 2022-07-11 15:50:05 normalization > `source_account_no`, 2022-07-11 15:50:05 normalization > 
`transaction_in_id`, 2022-07-11 15:50:05 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > `source_account_name`, 2022-07-11 15:50:05 normalization > `destination_bank_routing_no`, 2022-07-11 15:50:05 normalization > `_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > `_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > `_airbyte_normalized_at` from previous_active_scd_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > scd_data as ( 2022-07-11 15:50:05 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key, 2022-07-11 15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > data, 2022-07-11 15:50:05 normalization > uuid, 2022-07-11 15:50:05 normalization > amount, 2022-07-11 15:50:05 normalization > status, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > file_id, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > trace_no, 2022-07-11 15:50:05 normalization > account_no, 2022-07-11 15:50:05 normalization > partner_id, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > description, 2022-07-11 15:50:05 normalization > external_id, 2022-07-11 15:50:05 normalization > is_same_day, 2022-07-11 15:50:05 normalization > return_data, 2022-07-11 15:50:05 normalization > account_name, 2022-07-11 15:50:05 normalization > effective_date, 2022-07-11 15:50:05 normalization > reference_info, 2022-07-11 15:50:05 normalization > transaction_code, 2022-07-11 15:50:05 normalization > source_account_no, 2022-07-11 15:50:05 normalization > transaction_in_id, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > source_account_name, 2022-07-11 15:50:05 normalization > destination_bank_routing_no, 2022-07-11 15:50:05 normalization > updated as _airbyte_start_at, 2022-07-11 15:50:05 normalization > lag(updated) over ( 2022-07-11 15:50:05 normalization > partition by cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) as _airbyte_end_at, 2022-07-11 15:50:05 normalization > case when row_number() over ( 2022-07-11 15:50:05 normalization > partition by cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 
else 0 end as _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > _airbyte_transactions_out_hashid 2022-07-11 15:50:05 normalization > from input_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > dedup_data as ( 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:50:05 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:50:05 normalization > row_number() over ( 2022-07-11 15:50:05 normalization > partition by 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:50:05 normalization > ) as _airbyte_row_num, 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > scd_data.* 2022-07-11 15:50:05 normalization > from scd_data 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > data, 2022-07-11 15:50:05 normalization > uuid, 2022-07-11 15:50:05 normalization > amount, 2022-07-11 15:50:05 normalization > status, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > file_id, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > trace_no, 2022-07-11 15:50:05 normalization > account_no, 2022-07-11 15:50:05 normalization > partner_id, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > description, 2022-07-11 15:50:05 normalization > external_id, 2022-07-11 15:50:05 normalization > is_same_day, 2022-07-11 15:50:05 normalization > return_data, 2022-07-11 15:50:05 normalization > account_name, 2022-07-11 15:50:05 normalization > effective_date, 2022-07-11 15:50:05 normalization > reference_info, 2022-07-11 15:50:05 normalization > transaction_code, 2022-07-11 15:50:05 normalization > source_account_no, 2022-07-11 15:50:05 normalization > transaction_in_id, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 
2022-07-11 15:50:05 normalization > source_account_name, 2022-07-11 15:50:05 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):161 - Completing future exceptionally... io.airbyte.workers.exception.WorkerException: Normalization Failed. at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:63) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at java.lang.Thread.run(Thread.java:833) [?:?] Caused by: io.airbyte.workers.exception.WorkerException: Normalization Failed. at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:60) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] ... 3 more Suppressed: io.airbyte.workers.exception.WorkerException: Normalization process wasn't successful at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:162) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:48) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at java.lang.Thread.run(Thread.java:833) [?:?] 2022-07-11 15:50:05 normalization > destination_bank_routing_no, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_end_at, 2022-07-11 15:50:05 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling... 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating... 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_transactions_out_hashid 2022-07-11 15:50:05 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:42.202850 [debug] [Thread-5 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 WARN i.t.i.a.POJOActivityTaskHandler(activityFailureToResult):307 - Activity failure. ActivityId=ac542d01-4013-3b4f-aea8-3a7b10fbf311, activityType=Normalize, attempt=1 java.lang.RuntimeException: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed. 
at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:289) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.normalize(NormalizationActivityImpl.java:75) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at jdk.internal.reflect.GeneratedMethodAccessor386.invoke(Unknown Source) ~[?:?] at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?] at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?] at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityInboundCallsInterceptor.execute(POJOActivityTaskHandler.java:214) ~[temporal-sdk-1.8.1.jar:?] at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityImplementation.execute(POJOActivityTaskHandler.java:180) ~[temporal-sdk-1.8.1.jar:?] at io.temporal.internal.activity.POJOActivityTaskHandler.handle(POJOActivityTaskHandler.java:120) ~[temporal-sdk-1.8.1.jar:?] at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:204) ~[temporal-sdk-1.8.1.jar:?] at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:164) ~[temporal-sdk-1.8.1.jar:?] at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.8.1.jar:?] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?] at java.lang.Thread.run(Thread.java:833) [?:?] Caused by: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed. at io.temporal.serviceclient.CheckedExceptionWrapper.wrap(CheckedExceptionWrapper.java:56) ~[temporal-serviceclient-1.8.1.jar:?] at io.temporal.internal.sync.WorkflowInternal.wrap(WorkflowInternal.java:448) ~[temporal-sdk-1.8.1.jar:?] at io.temporal.activity.Activity.wrap(Activity.java:51) ~[temporal-sdk-1.8.1.jar:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:138) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$3(NormalizationActivityImpl.java:103) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:284) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] ... 13 more Caused by: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed. at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?] at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:132) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$3(NormalizationActivityImpl.java:103) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:284) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] ... 13 more Caused by: io.airbyte.workers.exception.WorkerException: Normalization Failed. at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:63) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] 
at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] ... 1 more Caused by: io.airbyte.workers.exception.WorkerException: Normalization Failed. at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:60) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] ... 1 more Suppressed: io.airbyte.workers.exception.WorkerException: Normalization process wasn't successful at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:162) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:48) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?] at java.lang.Thread.run(Thread.java:833) [?:?] 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`files_out_scd__dbt_tmp` 2022-07-11 15:50:05 normalization > partition by range_bucket( 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > generate_array(0, 1, 1) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS( 2022-07-11 15:50:05 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- depends_on: ref('files_out_stg') 2022-07-11 15:50:05 normalization > with 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > new_data as ( 2022-07-11 15:50:05 normalization > -- retrieve incremental "new" data 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > * 2022-07-11 15:50:05 normalization > from `mainapi-282501`._airbyte_raw_achilles.`files_out_stg` 2022-07-11 15:50:05 normalization > -- files_out from `mainapi-282501`.raw_achilles._airbyte_raw_files_out 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from `mainapi-282501`.raw_achilles.`files_out_scd`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 
normalization > ), 2022-07-11 15:50:05 normalization > new_data_ids as ( 2022-07-11 15:50:05 normalization > -- build a subset of _airbyte_unique_key from rows that are new 2022-07-11 15:50:05 normalization > select distinct 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key 2022-07-11 15:50:05 normalization > from new_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > empty_new_data as ( 2022-07-11 15:50:05 normalization > -- build an empty table to only keep the table's column types 2022-07-11 15:50:05 normalization > select * from new_data where 1 = 0 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > previous_active_scd_data as ( 2022-07-11 15:50:05 normalization > -- retrieve "incomplete old" data that needs to be updated with an end date because of new changes 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > this_data.`_airbyte_files_out_hashid`, 2022-07-11 15:50:05 normalization > this_data.`id`, 2022-07-11 15:50:05 normalization > this_data.`bank_id`, 2022-07-11 15:50:05 normalization > this_data.`created`, 2022-07-11 15:50:05 normalization > this_data.`updated`, 2022-07-11 15:50:05 normalization > this_data.`file_hash`, 2022-07-11 15:50:05 normalization > this_data.`file_name`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > this_data.`batch_count`, 2022-07-11 15:50:05 normalization > this_data.`exchange_window`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > this_data.`_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > this_data.`_airbyte_normalized_at` 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`files_out_scd` as this_data 2022-07-11 15:50:05 normalization > -- make a join with new_data using primary key to filter active data that need to be updated only 2022-07-11 15:50:05 normalization > join new_data_ids on this_data._airbyte_unique_key = new_data_ids._airbyte_unique_key 2022-07-11 15:50:05 normalization > -- force left join to NULL values (we just need to transfer column types only for the star_intersect macro on schema changes) 2022-07-11 15:50:05 normalization > left join empty_new_data as inc_data on this_data._airbyte_ab_id = inc_data._airbyte_ab_id 2022-07-11 15:50:05 normalization > where _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > input_data as ( 2022-07-11 15:50:05 normalization > select `_airbyte_files_out_hashid`, 2022-07-11 15:50:05 normalization > `id`, 2022-07-11 15:50:05 normalization > `bank_id`, 2022-07-11 15:50:05 normalization > `created`, 2022-07-11 15:50:05 normalization > `updated`, 2022-07-11 15:50:05 normalization > `file_hash`, 2022-07-11 15:50:05 normalization > `file_name`, 2022-07-11 15:50:05 normalization > `_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > `batch_count`, 2022-07-11 15:50:05 normalization > `exchange_window`, 2022-07-11 15:50:05 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > `_airbyte_ab_id`, 2022-07-11 15:50:05 
normalization > `_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > `_airbyte_normalized_at` from new_data 2022-07-11 15:50:05 normalization > union all 2022-07-11 15:50:05 normalization > select `_airbyte_files_out_hashid`, 2022-07-11 15:50:05 normalization > `id`, 2022-07-11 15:50:05 normalization > `bank_id`, 2022-07-11 15:50:05 normalization > `created`, 2022-07-11 15:50:05 normalization > `updated`, 2022-07-11 15:50:05 normalization > `file_hash`, 2022-07-11 15:50:05 normalization > `file_name`, 2022-07-11 15:50:05 normalization > `_ab_cdc_lsn`, 2022-07-11 15:50:05 normalization > `batch_count`, 2022-07-11 15:50:05 normalization > `exchange_window`, 2022-07-11 15:50:05 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:50:05 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:50:05 normalization > `_airbyte_ab_id`, 2022-07-11 15:50:05 normalization > `_airbyte_emitted_at`, 2022-07-11 15:50:05 normalization > `_airbyte_normalized_at` from previous_active_scd_data 2022-07-11 15:50:05 normalization > ), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > scd_data as ( 2022-07-11 15:50:05 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key, 2022-07-11 15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > file_hash, 2022-07-11 15:50:05 normalization > file_name, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > batch_count, 2022-07-11 15:50:05 normalization > exchange_window, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > updated as _airbyte_start_at, 2022-07-11 15:50:05 normalization > lag(updated) over ( 2022-07-11 15:50:05 normalization > partition by cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) as _airbyte_end_at, 2022-07-11 15:50:05 normalization > case when row_number() over ( 2022-07-11 15:50:05 normalization > partition by cast(id as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by 2022-07-11 15:50:05 normalization > updated is null asc, 2022-07-11 15:50:05 normalization > updated desc, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at desc 2022-07-11 15:50:05 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > _airbyte_files_out_hashid 2022-07-11 15:50:05 normalization > from input_data 2022-07-11 15:50:05 
normalization > ), 2022-07-11 15:50:05 normalization > dedup_data as ( 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:50:05 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:50:05 normalization > row_number() over ( 2022-07-11 15:50:05 normalization > partition by 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:50:05 normalization > ) as _airbyte_row_num, 2022-07-11 15:50:05 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ), '')) as 2022-07-11 15:50:05 normalization > string 2022-07-11 15:50:05 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > scd_data.* 2022-07-11 15:50:05 normalization > from scd_data 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > _airbyte_unique_key_scd, 2022-07-11 15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > file_hash, 2022-07-11 15:50:05 normalization > file_name, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > batch_count, 2022-07-11 15:50:05 normalization > exchange_window, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > _airbyte_start_at, 2022-07-11 15:50:05 normalization > _airbyte_end_at, 2022-07-11 15:50:05 normalization > _airbyte_active_row, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_files_out_hashid 2022-07-11 15:50:05 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:42.657095 [debug] [Thread-4 ]: BigQuery adapter: Retry attempt 1 of 3 after error: BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"') 2022-07-11 15:50:05 normalization > 15:49:44.586695 [debug] [Thread-4 ]: BigQuery adapter: Retry attempt 2 of 3 after error: BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"') 2022-07-11 15:50:05 
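Editor's note: the BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"') retries above (and the Database Error in model transactions_in_scd that the same thread reports further down) point at a source value below the range BigQuery's TIMESTAMP type supports, which starts at 0001-01-01 00:00:00 UTC, so the cast inside the generated model fails before any merge logic runs. A minimal diagnostic sketch for locating the offending raw records follows; the raw table name _airbyte_raw_transactions_in is assumed from the _airbyte_raw_* convention visible elsewhere in this log, and `updated` is only a guess at the affected column.

-- Diagnostic sketch (hypothetical table/column names): SAFE_CAST returns NULL
-- where a plain CAST would raise the BadRequest above, so rows with a non-null
-- raw value but a NULL SAFE_CAST are the ones BigQuery cannot represent.
SELECT
  _airbyte_ab_id,
  JSON_EXTRACT_SCALAR(_airbyte_data, '$.updated') AS updated_raw
FROM `mainapi-282501`.raw_achilles.`_airbyte_raw_transactions_in`
WHERE JSON_EXTRACT_SCALAR(_airbyte_data, '$.updated') IS NOT NULL
  AND SAFE_CAST(JSON_EXTRACT_SCALAR(_airbyte_data, '$.updated') AS TIMESTAMP) IS NULL;

The fix belongs upstream (correcting the out-of-range values in the source database) or in a custom dbt override; nothing in this log identifies the affected column for certain.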
normalization > 15:49:45.884139 [debug] [Thread-6 ]: 2022-07-11 15:50:05 normalization > In `mainapi-282501`.`raw_achilles`.`transactions_out_scd`: 2022-07-11 15:50:05 normalization > Schema changed: False 2022-07-11 15:50:05 normalization > Source columns not in target: [] 2022-07-11 15:50:05 normalization > Target columns not in source: [] 2022-07-11 15:50:05 normalization > New column types: [] 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:45.953169 [debug] [Thread-1 ]: 2022-07-11 15:50:05 normalization > In `mainapi-282501`.`raw_achilles`.`files_in_scd`: 2022-07-11 15:50:05 normalization > Schema changed: False 2022-07-11 15:50:05 normalization > Source columns not in target: [] 2022-07-11 15:50:05 normalization > Target columns not in source: [] 2022-07-11 15:50:05 normalization > New column types: [] 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:45.949548 [debug] [Thread-6 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_out_scd" 2022-07-11 15:50:05 normalization > 15:49:45.961075 [debug] [Thread-1 ]: Writing runtime SQL for node "model.airbyte_utils.files_in_scd" 2022-07-11 15:50:05 normalization > 15:49:45.963266 [debug] [Thread-6 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > merge into `mainapi-282501`.raw_achilles.`transactions_out_scd` as DBT_INTERNAL_DEST 2022-07-11 15:50:05 normalization > using ( 2022-07-11 15:50:05 normalization > select * from `mainapi-282501`.raw_achilles.`transactions_out_scd__dbt_tmp` 2022-07-11 15:50:05 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:50:05 normalization > on 2022-07-11 15:50:05 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when matched then update set 2022-07-11 15:50:05 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`_airbyte_unique_key_scd` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key_scd`,`id` = DBT_INTERNAL_SOURCE.`id`,`data` = DBT_INTERNAL_SOURCE.`data`,`uuid` = DBT_INTERNAL_SOURCE.`uuid`,`amount` = DBT_INTERNAL_SOURCE.`amount`,`status` = DBT_INTERNAL_SOURCE.`status`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`file_id` = DBT_INTERNAL_SOURCE.`file_id`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`trace_no` = DBT_INTERNAL_SOURCE.`trace_no`,`account_no` = DBT_INTERNAL_SOURCE.`account_no`,`partner_id` = DBT_INTERNAL_SOURCE.`partner_id`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`description` = DBT_INTERNAL_SOURCE.`description`,`external_id` = DBT_INTERNAL_SOURCE.`external_id`,`is_same_day` = DBT_INTERNAL_SOURCE.`is_same_day`,`return_data` = DBT_INTERNAL_SOURCE.`return_data`,`account_name` = DBT_INTERNAL_SOURCE.`account_name`,`effective_date` = DBT_INTERNAL_SOURCE.`effective_date`,`reference_info` = DBT_INTERNAL_SOURCE.`reference_info`,`transaction_code` = DBT_INTERNAL_SOURCE.`transaction_code`,`source_account_no` = 
DBT_INTERNAL_SOURCE.`source_account_no`,`transaction_in_id` = DBT_INTERNAL_SOURCE.`transaction_in_id`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`source_account_name` = DBT_INTERNAL_SOURCE.`source_account_name`,`destination_bank_routing_no` = DBT_INTERNAL_SOURCE.`destination_bank_routing_no`,`_airbyte_start_at` = DBT_INTERNAL_SOURCE.`_airbyte_start_at`,`_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,`_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_transactions_out_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_transactions_out_hashid` 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when not matched then insert 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `data`, `uuid`, `amount`, `status`, `bank_id`, `created`, `file_id`, `updated`, `trace_no`, `account_no`, `partner_id`, `_ab_cdc_lsn`, `description`, `external_id`, `is_same_day`, `return_data`, `account_name`, `effective_date`, `reference_info`, `transaction_code`, `source_account_no`, `transaction_in_id`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `source_account_name`, `destination_bank_routing_no`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_transactions_out_hashid`) 2022-07-11 15:50:05 normalization > values 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `data`, `uuid`, `amount`, `status`, `bank_id`, `created`, `file_id`, `updated`, `trace_no`, `account_no`, `partner_id`, `_ab_cdc_lsn`, `description`, `external_id`, `is_same_day`, `return_data`, `account_name`, `effective_date`, `reference_info`, `transaction_code`, `source_account_no`, `transaction_in_id`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `source_account_name`, `destination_bank_routing_no`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_transactions_out_hashid`) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:45.966167 [debug] [Thread-1 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > merge into `mainapi-282501`.raw_achilles.`files_in_scd` as DBT_INTERNAL_DEST 2022-07-11 15:50:05 normalization > using ( 2022-07-11 15:50:05 normalization > select * from `mainapi-282501`.raw_achilles.`files_in_scd__dbt_tmp` 2022-07-11 15:50:05 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:50:05 normalization > on 2022-07-11 15:50:05 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 
normalization > 2022-07-11 15:50:05 normalization > when matched then update set 2022-07-11 15:50:05 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`_airbyte_unique_key_scd` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key_scd`,`id` = DBT_INTERNAL_SOURCE.`id`,`ended` = DBT_INTERNAL_SOURCE.`ended`,`started` = DBT_INTERNAL_SOURCE.`started`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`file_hash` = DBT_INTERNAL_SOURCE.`file_hash`,`file_name` = DBT_INTERNAL_SOURCE.`file_name`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`iat_entry_count` = DBT_INTERNAL_SOURCE.`iat_entry_count`,`std_entry_count` = DBT_INTERNAL_SOURCE.`std_entry_count`,`total_batch_count` = DBT_INTERNAL_SOURCE.`total_batch_count`,`total_entry_count` = DBT_INTERNAL_SOURCE.`total_entry_count`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`preprocessing_path` = DBT_INTERNAL_SOURCE.`preprocessing_path`,`total_debit_amount` = DBT_INTERNAL_SOURCE.`total_debit_amount`,`postprocessing_path` = DBT_INTERNAL_SOURCE.`postprocessing_path`,`total_credit_amount` = DBT_INTERNAL_SOURCE.`total_credit_amount`,`iat_entries_processed` = DBT_INTERNAL_SOURCE.`iat_entries_processed`,`std_entries_processed` = DBT_INTERNAL_SOURCE.`std_entries_processed`,`_airbyte_start_at` = DBT_INTERNAL_SOURCE.`_airbyte_start_at`,`_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,`_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_files_in_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_files_in_hashid` 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when not matched then insert 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `ended`, `started`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `iat_entry_count`, `std_entry_count`, `total_batch_count`, `total_entry_count`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `preprocessing_path`, `total_debit_amount`, `postprocessing_path`, `total_credit_amount`, `iat_entries_processed`, `std_entries_processed`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_in_hashid`) 2022-07-11 15:50:05 normalization > values 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `ended`, `started`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `iat_entry_count`, `std_entry_count`, `total_batch_count`, `total_entry_count`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `preprocessing_path`, `total_debit_amount`, `postprocessing_path`, `total_credit_amount`, `iat_entries_processed`, `std_entries_processed`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_in_hashid`) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:46.012850 [debug] [Thread-5 ]: 2022-07-11 15:50:05 normalization > In `mainapi-282501`.`raw_achilles`.`files_out_scd`: 2022-07-11 15:50:05 normalization > Schema changed: False 2022-07-11 15:50:05 normalization > Source columns not in target: [] 2022-07-11 15:50:05 
normalization > Target columns not in source: [] 2022-07-11 15:50:05 normalization > New column types: [] 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:46.020165 [debug] [Thread-5 ]: Writing runtime SQL for node "model.airbyte_utils.files_out_scd" 2022-07-11 15:50:05 normalization > 15:49:46.020900 [debug] [Thread-5 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > merge into `mainapi-282501`.raw_achilles.`files_out_scd` as DBT_INTERNAL_DEST 2022-07-11 15:50:05 normalization > using ( 2022-07-11 15:50:05 normalization > select * from `mainapi-282501`.raw_achilles.`files_out_scd__dbt_tmp` 2022-07-11 15:50:05 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:50:05 normalization > on 2022-07-11 15:50:05 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when matched then update set 2022-07-11 15:50:05 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`_airbyte_unique_key_scd` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key_scd`,`id` = DBT_INTERNAL_SOURCE.`id`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`file_hash` = DBT_INTERNAL_SOURCE.`file_hash`,`file_name` = DBT_INTERNAL_SOURCE.`file_name`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`batch_count` = DBT_INTERNAL_SOURCE.`batch_count`,`exchange_window` = DBT_INTERNAL_SOURCE.`exchange_window`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_start_at` = DBT_INTERNAL_SOURCE.`_airbyte_start_at`,`_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,`_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_files_out_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_files_out_hashid` 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when not matched then insert 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `bank_id`, `created`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `batch_count`, `exchange_window`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_out_hashid`) 2022-07-11 15:50:05 normalization > values 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `bank_id`, `created`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `batch_count`, `exchange_window`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, 
`_airbyte_normalized_at`, `_airbyte_files_out_hashid`) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:46.249816 [debug] [Thread-4 ]: BigQuery adapter: Retry attempt 3 of 3 after error: BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"') 2022-07-11 15:50:05 normalization > 15:49:46.316113 [debug] [Thread-7 ]: 2022-07-11 15:50:05 normalization > In `mainapi-282501`.`raw_achilles`.`bank_config_scd`: 2022-07-11 15:50:05 normalization > Schema changed: False 2022-07-11 15:50:05 normalization > Source columns not in target: [] 2022-07-11 15:50:05 normalization > Target columns not in source: [] 2022-07-11 15:50:05 normalization > New column types: [] 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:46.319855 [debug] [Thread-7 ]: Writing runtime SQL for node "model.airbyte_utils.bank_config_scd" 2022-07-11 15:50:05 normalization > 15:49:46.320691 [debug] [Thread-7 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > merge into `mainapi-282501`.raw_achilles.`bank_config_scd` as DBT_INTERNAL_DEST 2022-07-11 15:50:05 normalization > using ( 2022-07-11 15:50:05 normalization > select * from `mainapi-282501`.raw_achilles.`bank_config_scd__dbt_tmp` 2022-07-11 15:50:05 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:50:05 normalization > on 2022-07-11 15:50:05 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when matched then update set 2022-07-11 15:50:05 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`_airbyte_unique_key_scd` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key_scd`,`name` = DBT_INTERNAL_SOURCE.`name`,`config` = DBT_INTERNAL_SOURCE.`config`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`routing_no` = DBT_INTERNAL_SOURCE.`routing_no`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_start_at` = DBT_INTERNAL_SOURCE.`_airbyte_start_at`,`_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,`_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_bank_config_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_bank_config_hashid` 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when not matched then insert 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `name`, `config`, `bank_id`, `created`, `updated`, `routing_no`, `_ab_cdc_lsn`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, 
`_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_bank_config_hashid`) 2022-07-11 15:50:05 normalization > values 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `name`, `config`, `bank_id`, `created`, `updated`, `routing_no`, `_ab_cdc_lsn`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_bank_config_hashid`) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:46.815348 [debug] [Thread-8 ]: 2022-07-11 15:50:05 normalization > In `mainapi-282501`.`raw_achilles`.`partner_config_scd`: 2022-07-11 15:50:05 normalization > Schema changed: False 2022-07-11 15:50:05 normalization > Source columns not in target: [] 2022-07-11 15:50:05 normalization > Target columns not in source: [] 2022-07-11 15:50:05 normalization > New column types: [] 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:46.819001 [debug] [Thread-8 ]: Writing runtime SQL for node "model.airbyte_utils.partner_config_scd" 2022-07-11 15:50:05 normalization > 15:49:46.819758 [debug] [Thread-8 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > merge into `mainapi-282501`.raw_achilles.`partner_config_scd` as DBT_INTERNAL_DEST 2022-07-11 15:50:05 normalization > using ( 2022-07-11 15:50:05 normalization > select * from `mainapi-282501`.raw_achilles.`partner_config_scd__dbt_tmp` 2022-07-11 15:50:05 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:50:05 normalization > on 2022-07-11 15:50:05 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when matched then update set 2022-07-11 15:50:05 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`_airbyte_unique_key_scd` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key_scd`,`name` = DBT_INTERNAL_SOURCE.`name`,`config` = DBT_INTERNAL_SOURCE.`config`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`partner_id` = DBT_INTERNAL_SOURCE.`partner_id`,`routing_no` = DBT_INTERNAL_SOURCE.`routing_no`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`account_prefix` = DBT_INTERNAL_SOURCE.`account_prefix`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_start_at` = DBT_INTERNAL_SOURCE.`_airbyte_start_at`,`_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,`_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = 
DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_partner_config_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_partner_config_hashid` 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when not matched then insert 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `name`, `config`, `bank_id`, `created`, `updated`, `partner_id`, `routing_no`, `_ab_cdc_lsn`, `account_prefix`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_partner_config_hashid`) 2022-07-11 15:50:05 normalization > values 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `name`, `config`, `bank_id`, `created`, `updated`, `partner_id`, `routing_no`, `_ab_cdc_lsn`, `account_prefix`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_partner_config_hashid`) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:48.148115 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:48.149156 [debug] [Thread-4 ]: Database Error in model transactions_in_scd (models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql) 2022-07-11 15:50:05 normalization > Invalid timestamp string "0000-12-30T00:00:00Z" 2022-07-11 15:50:05 normalization > compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:50:05 normalization > 15:49:48.149811 [error] [Thread-4 ]: 11 of 18 ERROR creating incremental model raw_achilles.transactions_in_scd.............................................. [ERROR in 6.86s] 2022-07-11 15:50:05 normalization > 15:49:48.150515 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.transactions_in_scd 2022-07-11 15:50:05 normalization > 15:49:48.151948 [debug] [Thread-3 ]: Began running node model.airbyte_utils.transactions_in 2022-07-11 15:50:05 normalization > 15:49:48.152484 [info ] [Thread-3 ]: 13 of 18 SKIP relation raw_achilles.transactions_in..................................................................... 
[SKIP] 2022-07-11 15:50:05 normalization > 15:49:48.153078 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.transactions_in 2022-07-11 15:50:05 normalization > 15:49:49.151602 [debug] [Thread-6 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- Delete records which are no longer active: 2022-07-11 15:50:05 normalization > -- This query is equivalent, but the left join version is more performant: 2022-07-11 15:50:05 normalization > -- delete from final_table where unique_key in ( 2022-07-11 15:50:05 normalization > -- select unique_key from scd_table where 1 = 1 2022-07-11 15:50:05 normalization > -- ) and unique_key not in ( 2022-07-11 15:50:05 normalization > -- select unique_key from scd_table where active_row = 1 2022-07-11 15:50:05 normalization > -- ) 2022-07-11 15:50:05 normalization > -- We're incremental against normalized_at rather than emitted_at because we need to fetch the SCD 2022-07-11 15:50:05 normalization > -- entries that were _updated_ recently. This is because a deleted record will have an SCD record 2022-07-11 15:50:05 normalization > -- which was emitted a long time ago, but recently re-normalized to have active_row = 0. 2022-07-11 15:50:05 normalization > delete from `mainapi-282501`.`raw_achilles`.`transactions_out` final_table where final_table._airbyte_unique_key in ( 2022-07-11 15:50:05 normalization > select recent_records.unique_key 2022-07-11 15:50:05 normalization > from ( 2022-07-11 15:50:05 normalization > select distinct _airbyte_unique_key as unique_key 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`transactions_out_scd` 2022-07-11 15:50:05 normalization > where 1=1 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from raw_achilles.`transactions_out`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ) recent_records 2022-07-11 15:50:05 normalization > left join ( 2022-07-11 15:50:05 normalization > select _airbyte_unique_key as unique_key, count(_airbyte_unique_key) as active_count 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`transactions_out_scd` 2022-07-11 15:50:05 normalization > where _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from raw_achilles.`transactions_out`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > group by _airbyte_unique_key 2022-07-11 15:50:05 normalization > ) active_counts 2022-07-11 15:50:05 normalization > on recent_records.unique_key = active_counts.unique_key 2022-07-11 15:50:05 
normalization > where active_count is null or active_count = 0 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:49.205856 [debug] [Thread-5 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- Delete records which are no longer active: 2022-07-11 15:50:05 normalization > -- This query is equivalent, but the left join version is more performant: 2022-07-11 15:50:05 normalization > -- delete from final_table where unique_key in ( 2022-07-11 15:50:05 normalization > -- select unique_key from scd_table where 1 = 1 2022-07-11 15:50:05 normalization > -- ) and unique_key not in ( 2022-07-11 15:50:05 normalization > -- select unique_key from scd_table where active_row = 1 2022-07-11 15:50:05 normalization > -- ) 2022-07-11 15:50:05 normalization > -- We're incremental against normalized_at rather than emitted_at because we need to fetch the SCD 2022-07-11 15:50:05 normalization > -- entries that were _updated_ recently. This is because a deleted record will have an SCD record 2022-07-11 15:50:05 normalization > -- which was emitted a long time ago, but recently re-normalized to have active_row = 0. 2022-07-11 15:50:05 normalization > delete from `mainapi-282501`.`raw_achilles`.`files_out` final_table where final_table._airbyte_unique_key in ( 2022-07-11 15:50:05 normalization > select recent_records.unique_key 2022-07-11 15:50:05 normalization > from ( 2022-07-11 15:50:05 normalization > select distinct _airbyte_unique_key as unique_key 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`files_out_scd` 2022-07-11 15:50:05 normalization > where 1=1 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from raw_achilles.`files_out`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ) recent_records 2022-07-11 15:50:05 normalization > left join ( 2022-07-11 15:50:05 normalization > select _airbyte_unique_key as unique_key, count(_airbyte_unique_key) as active_count 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`files_out_scd` 2022-07-11 15:50:05 normalization > where _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from raw_achilles.`files_out`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > group by _airbyte_unique_key 2022-07-11 15:50:05 normalization > ) active_counts 2022-07-11 15:50:05 normalization > on recent_records.unique_key = active_counts.unique_key 2022-07-11 15:50:05 normalization > where active_count is null or active_count = 
0 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:49.266776 [debug] [Thread-1 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- Delete records which are no longer active: 2022-07-11 15:50:05 normalization > -- This query is equivalent, but the left join version is more performant: 2022-07-11 15:50:05 normalization > -- delete from final_table where unique_key in ( 2022-07-11 15:50:05 normalization > -- select unique_key from scd_table where 1 = 1 2022-07-11 15:50:05 normalization > -- ) and unique_key not in ( 2022-07-11 15:50:05 normalization > -- select unique_key from scd_table where active_row = 1 2022-07-11 15:50:05 normalization > -- ) 2022-07-11 15:50:05 normalization > -- We're incremental against normalized_at rather than emitted_at because we need to fetch the SCD 2022-07-11 15:50:05 normalization > -- entries that were _updated_ recently. This is because a deleted record will have an SCD record 2022-07-11 15:50:05 normalization > -- which was emitted a long time ago, but recently re-normalized to have active_row = 0. 2022-07-11 15:50:05 normalization > delete from `mainapi-282501`.`raw_achilles`.`files_in` final_table where final_table._airbyte_unique_key in ( 2022-07-11 15:50:05 normalization > select recent_records.unique_key 2022-07-11 15:50:05 normalization > from ( 2022-07-11 15:50:05 normalization > select distinct _airbyte_unique_key as unique_key 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`files_in_scd` 2022-07-11 15:50:05 normalization > where 1=1 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from raw_achilles.`files_in`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ) recent_records 2022-07-11 15:50:05 normalization > left join ( 2022-07-11 15:50:05 normalization > select _airbyte_unique_key as unique_key, count(_airbyte_unique_key) as active_count 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`files_in_scd` 2022-07-11 15:50:05 normalization > where _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from raw_achilles.`files_in`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > group by _airbyte_unique_key 2022-07-11 15:50:05 normalization > ) active_counts 2022-07-11 15:50:05 normalization > on recent_records.unique_key = active_counts.unique_key 2022-07-11 15:50:05 normalization > where active_count is null or active_count = 0 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 
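Editor's note: the three DELETE statements above, and the two that follow for bank_config and partner_config, implement the pattern their comments describe: remove final-table rows whose unique key no longer has an active row in the SCD table, using a left join rather than the commented-out NOT IN form. Stripped of the incremental _airbyte_normalized_at filter, the pattern reduces to the sketch below (generic final_table / scd_table names taken from that commented-out equivalent).

-- Generic form of the cleanup above (sketch only; incremental filter omitted).
-- A key with no remaining active SCD row finds no match in active_counts,
-- so its active_count is NULL and the key is deleted from the final table.
DELETE FROM final_table
WHERE _airbyte_unique_key IN (
  SELECT recent_records.unique_key
  FROM (
    SELECT DISTINCT _airbyte_unique_key AS unique_key
    FROM scd_table
  ) AS recent_records
  LEFT JOIN (
    SELECT _airbyte_unique_key AS unique_key,
           COUNT(_airbyte_unique_key) AS active_count
    FROM scd_table
    WHERE _airbyte_active_row = 1
    GROUP BY _airbyte_unique_key
  ) AS active_counts
  ON recent_records.unique_key = active_counts.unique_key
  WHERE active_counts.active_count IS NULL OR active_counts.active_count = 0
);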
normalization > 2022-07-11 15:50:05 normalization > 15:49:49.407016 [debug] [Thread-7 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- Delete records which are no longer active: 2022-07-11 15:50:05 normalization > -- This query is equivalent, but the left join version is more performant: 2022-07-11 15:50:05 normalization > -- delete from final_table where unique_key in ( 2022-07-11 15:50:05 normalization > -- select unique_key from scd_table where 1 = 1 2022-07-11 15:50:05 normalization > -- ) and unique_key not in ( 2022-07-11 15:50:05 normalization > -- select unique_key from scd_table where active_row = 1 2022-07-11 15:50:05 normalization > -- ) 2022-07-11 15:50:05 normalization > -- We're incremental against normalized_at rather than emitted_at because we need to fetch the SCD 2022-07-11 15:50:05 normalization > -- entries that were _updated_ recently. This is because a deleted record will have an SCD record 2022-07-11 15:50:05 normalization > -- which was emitted a long time ago, but recently re-normalized to have active_row = 0. 2022-07-11 15:50:05 normalization > delete from `mainapi-282501`.`raw_achilles`.`bank_config` final_table where final_table._airbyte_unique_key in ( 2022-07-11 15:50:05 normalization > select recent_records.unique_key 2022-07-11 15:50:05 normalization > from ( 2022-07-11 15:50:05 normalization > select distinct _airbyte_unique_key as unique_key 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`bank_config_scd` 2022-07-11 15:50:05 normalization > where 1=1 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from raw_achilles.`bank_config`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ) recent_records 2022-07-11 15:50:05 normalization > left join ( 2022-07-11 15:50:05 normalization > select _airbyte_unique_key as unique_key, count(_airbyte_unique_key) as active_count 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`bank_config_scd` 2022-07-11 15:50:05 normalization > where _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from raw_achilles.`bank_config`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > group by _airbyte_unique_key 2022-07-11 15:50:05 normalization > ) active_counts 2022-07-11 15:50:05 normalization > on recent_records.unique_key = active_counts.unique_key 2022-07-11 15:50:05 normalization > where active_count is null or active_count = 0 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 
normalization > 15:49:49.871570 [debug] [Thread-8 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- Delete records which are no longer active: 2022-07-11 15:50:05 normalization > -- This query is equivalent, but the left join version is more performant: 2022-07-11 15:50:05 normalization > -- delete from final_table where unique_key in ( 2022-07-11 15:50:05 normalization > -- select unique_key from scd_table where 1 = 1 2022-07-11 15:50:05 normalization > -- ) and unique_key not in ( 2022-07-11 15:50:05 normalization > -- select unique_key from scd_table where active_row = 1 2022-07-11 15:50:05 normalization > -- ) 2022-07-11 15:50:05 normalization > -- We're incremental against normalized_at rather than emitted_at because we need to fetch the SCD 2022-07-11 15:50:05 normalization > -- entries that were _updated_ recently. This is because a deleted record will have an SCD record 2022-07-11 15:50:05 normalization > -- which was emitted a long time ago, but recently re-normalized to have active_row = 0. 2022-07-11 15:50:05 normalization > delete from `mainapi-282501`.`raw_achilles`.`partner_config` final_table where final_table._airbyte_unique_key in ( 2022-07-11 15:50:05 normalization > select recent_records.unique_key 2022-07-11 15:50:05 normalization > from ( 2022-07-11 15:50:05 normalization > select distinct _airbyte_unique_key as unique_key 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`partner_config_scd` 2022-07-11 15:50:05 normalization > where 1=1 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from raw_achilles.`partner_config`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ) recent_records 2022-07-11 15:50:05 normalization > left join ( 2022-07-11 15:50:05 normalization > select _airbyte_unique_key as unique_key, count(_airbyte_unique_key) as active_count 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`partner_config_scd` 2022-07-11 15:50:05 normalization > where _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from raw_achilles.`partner_config`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > group by _airbyte_unique_key 2022-07-11 15:50:05 normalization > ) active_counts 2022-07-11 15:50:05 normalization > on recent_records.unique_key = active_counts.unique_key 2022-07-11 15:50:05 normalization > where active_count is null or active_count = 0 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 
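Editor's note: the MERGE statements earlier in this log (transactions_out_scd, files_in_scd, files_out_scd, bank_config_scd, partner_config_scd) all share one shape, abbreviated below with a hypothetical some_table_scd target and only a handful of the columns the generated SQL actually sets. In BigQuery's MERGE the INSERT ... VALUES list can only draw on the source row, so the unqualified backticked column names there resolve against DBT_INTERNAL_SOURCE, which is why dbt can emit them without a prefix.

-- Abbreviated sketch of the generated upsert (hypothetical names, partial column list):
-- rows whose _airbyte_unique_key_scd already exists are updated in place, new keys
-- are inserted from the temporary *__dbt_tmp relation built just before the merge.
MERGE INTO `mainapi-282501`.raw_achilles.`some_table_scd` AS DBT_INTERNAL_DEST
USING (
  SELECT * FROM `mainapi-282501`.raw_achilles.`some_table_scd__dbt_tmp`
) AS DBT_INTERNAL_SOURCE
ON DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd
WHEN MATCHED THEN UPDATE SET
  `_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,
  `_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,
  `_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`
WHEN NOT MATCHED THEN INSERT
  (`_airbyte_unique_key_scd`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_normalized_at`)
VALUES
  (`_airbyte_unique_key_scd`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_normalized_at`);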
15:49:52.477299 [debug] [Thread-5 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > drop view _airbyte_raw_achilles.files_out_stg 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:52.593453 [debug] [Thread-7 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > drop view _airbyte_raw_achilles.bank_config_stg 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:52.616811 [debug] [Thread-6 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > drop view _airbyte_raw_achilles.transactions_out_stg 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:52.749361 [debug] [Thread-1 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > drop view _airbyte_raw_achilles.files_in_stg 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:52.981924 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:52.983090 [info ] [Thread-5 ]: 12 of 18 OK created incremental model raw_achilles.files_out_scd........................................................ [MERGE (68.0 rows, 35.5 KB processed) in 11.63s] 2022-07-11 15:50:05 normalization > 15:49:52.983691 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.files_out_scd 2022-07-11 15:50:05 normalization > 15:49:52.984751 [debug] [Thread-4 ]: Began running node model.airbyte_utils.files_out 2022-07-11 15:50:05 normalization > 15:49:52.985176 [info ] [Thread-4 ]: 14 of 18 START incremental model raw_achilles.files_out................................................................. 
[RUN] 2022-07-11 15:50:05 normalization > 15:49:52.986338 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out" 2022-07-11 15:50:05 normalization > 15:49:52.986565 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.files_out 2022-07-11 15:50:05 normalization > 15:49:52.986792 [debug] [Thread-4 ]: Compiling model.airbyte_utils.files_out 2022-07-11 15:50:05 normalization > 15:49:53.000367 [debug] [Thread-4 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:53.179501 [debug] [Thread-8 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > drop view _airbyte_raw_achilles.partner_config_stg 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:53.194153 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.files_out" 2022-07-11 15:50:05 normalization > 15:49:53.194974 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:53.195262 [debug] [Thread-4 ]: Began executing node model.airbyte_utils.files_out 2022-07-11 15:50:05 normalization > 15:49:53.216449 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:53.218664 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:53.219814 [info ] [Thread-6 ]: 10 of 18 OK created incremental model raw_achilles.transactions_out_scd................................................. [MERGE (226.0 rows, 170.3 KB processed) in 12.00s] 2022-07-11 15:50:05 normalization > 15:49:53.220875 [info ] [Thread-7 ]: 9 of 18 OK created incremental model raw_achilles.bank_config_scd....................................................... [MERGE (6.0 rows, 5.0 KB processed) in 12.02s] 2022-07-11 15:50:05 normalization > 15:49:53.222069 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.transactions_out_scd 2022-07-11 15:50:05 normalization > 15:49:53.223111 [debug] [Thread-7 ]: Finished running node model.airbyte_utils.bank_config_scd 2022-07-11 15:50:05 normalization > 15:49:53.224211 [debug] [Thread-3 ]: Began running node model.airbyte_utils.transactions_out 2022-07-11 15:50:05 normalization > 15:49:53.224472 [debug] [Thread-5 ]: Began running node model.airbyte_utils.bank_config 2022-07-11 15:50:05 normalization > 15:49:53.225695 [info ] [Thread-3 ]: 15 of 18 START incremental model raw_achilles.transactions_out.......................................................... [RUN] 2022-07-11 15:50:05 normalization > 15:49:53.226105 [info ] [Thread-5 ]: 16 of 18 START incremental model raw_achilles.bank_config............................................................... 
[RUN] 2022-07-11 15:50:05 normalization > 15:49:53.227428 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out" 2022-07-11 15:50:05 normalization > 15:49:53.228711 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config" 2022-07-11 15:50:05 normalization > 15:49:53.229016 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.transactions_out 2022-07-11 15:50:05 normalization > 15:49:53.229315 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.bank_config 2022-07-11 15:50:05 normalization > 15:49:53.229646 [debug] [Thread-3 ]: Compiling model.airbyte_utils.transactions_out 2022-07-11 15:50:05 normalization > 15:49:53.229950 [debug] [Thread-5 ]: Compiling model.airbyte_utils.bank_config 2022-07-11 15:50:05 normalization > 15:49:53.241313 [debug] [Thread-3 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:53.251552 [debug] [Thread-5 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:53.276813 [debug] [Thread-4 ]: On model.airbyte_utils.files_out: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`files_out__dbt_tmp` 2022-07-11 15:50:05 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS( 2022-07-11 15:50:05 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- Final base SQL model 2022-07-11 15:50:05 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`files_out_scd` 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > file_hash, 2022-07-11 15:50:05 normalization > file_name, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > batch_count, 2022-07-11 15:50:05 normalization > exchange_window, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_files_out_hashid 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`files_out_scd` 2022-07-11 15:50:05 normalization > -- files_out from `mainapi-282501`.raw_achilles._airbyte_raw_files_out 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > and _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 
normalization > cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from `mainapi-282501`.raw_achilles.`files_out`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:53.322139 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:53.324237 [info ] [Thread-1 ]: 7 of 18 OK created incremental model raw_achilles.files_in_scd.......................................................... [MERGE (72.0 rows, 45.5 KB processed) in 12.23s] 2022-07-11 15:50:05 normalization > 15:49:53.324950 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.files_in_scd 2022-07-11 15:50:05 normalization > 15:49:53.327300 [debug] [Thread-6 ]: Began running node model.airbyte_utils.files_in 2022-07-11 15:50:05 normalization > 15:49:53.332739 [info ] [Thread-6 ]: 17 of 18 START incremental model raw_achilles.files_in.................................................................. [RUN] 2022-07-11 15:50:05 normalization > 15:49:53.336534 [debug] [Thread-6 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in" 2022-07-11 15:50:05 normalization > 15:49:53.336975 [debug] [Thread-6 ]: Began compiling node model.airbyte_utils.files_in 2022-07-11 15:50:05 normalization > 15:49:53.339151 [debug] [Thread-6 ]: Compiling model.airbyte_utils.files_in 2022-07-11 15:50:05 normalization > 15:49:53.353948 [debug] [Thread-6 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:53.379417 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.bank_config" 2022-07-11 15:50:05 normalization > 15:49:53.380027 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:53.380280 [debug] [Thread-5 ]: Began executing node model.airbyte_utils.bank_config 2022-07-11 15:50:05 normalization > 15:49:53.395944 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out" 2022-07-11 15:50:05 normalization > 15:49:53.396892 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:53.397160 [debug] [Thread-3 ]: Began executing node model.airbyte_utils.transactions_out 2022-07-11 15:50:05 normalization > 15:49:53.483130 [debug] [Thread-5 ]: On model.airbyte_utils.bank_config: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`bank_config__dbt_tmp` 2022-07-11 15:50:05 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS( 2022-07-11 15:50:05 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:50:05 
normalization > ) 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- Final base SQL model 2022-07-11 15:50:05 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`bank_config_scd` 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > name, 2022-07-11 15:50:05 normalization > config, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > routing_no, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_bank_config_hashid 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`bank_config_scd` 2022-07-11 15:50:05 normalization > -- bank_config from `mainapi-282501`.raw_achilles._airbyte_raw_bank_config 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > and _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from `mainapi-282501`.raw_achilles.`bank_config`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:53.490934 [debug] [Thread-6 ]: Writing injected SQL for node "model.airbyte_utils.files_in" 2022-07-11 15:50:05 normalization > 15:49:53.491431 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:53.491655 [debug] [Thread-6 ]: Began executing node model.airbyte_utils.files_in 2022-07-11 15:50:05 normalization > 15:49:53.507837 [debug] [Thread-3 ]: On model.airbyte_utils.transactions_out: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`transactions_out__dbt_tmp` 2022-07-11 15:50:05 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS( 2022-07-11 15:50:05 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- Final base SQL model 2022-07-11 15:50:05 normalization > -- depends_on: 
`mainapi-282501`.raw_achilles.`transactions_out_scd` 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > data, 2022-07-11 15:50:05 normalization > uuid, 2022-07-11 15:50:05 normalization > amount, 2022-07-11 15:50:05 normalization > status, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > file_id, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > trace_no, 2022-07-11 15:50:05 normalization > account_no, 2022-07-11 15:50:05 normalization > partner_id, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > description, 2022-07-11 15:50:05 normalization > external_id, 2022-07-11 15:50:05 normalization > is_same_day, 2022-07-11 15:50:05 normalization > return_data, 2022-07-11 15:50:05 normalization > account_name, 2022-07-11 15:50:05 normalization > effective_date, 2022-07-11 15:50:05 normalization > reference_info, 2022-07-11 15:50:05 normalization > transaction_code, 2022-07-11 15:50:05 normalization > source_account_no, 2022-07-11 15:50:05 normalization > transaction_in_id, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > source_account_name, 2022-07-11 15:50:05 normalization > destination_bank_routing_no, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_transactions_out_hashid 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`transactions_out_scd` 2022-07-11 15:50:05 normalization > -- transactions_out from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > and _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from `mainapi-282501`.raw_achilles.`transactions_out`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:53.549753 [debug] [Thread-6 ]: On model.airbyte_utils.files_in: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`files_in__dbt_tmp` 2022-07-11 15:50:05 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS( 
2022-07-11 15:50:05 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- Final base SQL model 2022-07-11 15:50:05 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`files_in_scd` 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > id, 2022-07-11 15:50:05 normalization > ended, 2022-07-11 15:50:05 normalization > started, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > file_hash, 2022-07-11 15:50:05 normalization > file_name, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > iat_entry_count, 2022-07-11 15:50:05 normalization > std_entry_count, 2022-07-11 15:50:05 normalization > total_batch_count, 2022-07-11 15:50:05 normalization > total_entry_count, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > preprocessing_path, 2022-07-11 15:50:05 normalization > total_debit_amount, 2022-07-11 15:50:05 normalization > postprocessing_path, 2022-07-11 15:50:05 normalization > total_credit_amount, 2022-07-11 15:50:05 normalization > iat_entries_processed, 2022-07-11 15:50:05 normalization > std_entries_processed, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_files_in_hashid 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`files_in_scd` 2022-07-11 15:50:05 normalization > -- files_in from `mainapi-282501`.raw_achilles._airbyte_raw_files_in 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > and _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from `mainapi-282501`.raw_achilles.`files_in`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:53.718579 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:53.719801 [info ] [Thread-8 ]: 8 of 18 OK created incremental model raw_achilles.partner_config_scd.................................................... [MERGE (412.0 rows, 284.3 KB processed) in 12.57s] 2022-07-11 15:50:05 normalization > 15:49:53.720354 [debug] [Thread-8 ]: Finished running node model.airbyte_utils.partner_config_scd 2022-07-11 15:50:05 normalization > 15:49:53.721347 [debug] [Thread-2 ]: Began running node model.airbyte_utils.partner_config 2022-07-11 15:50:05 normalization > 15:49:53.721727 [info ] [Thread-2 ]: 18 of 18 START incremental model raw_achilles.partner_config............................................................ 
[RUN] 2022-07-11 15:50:05 normalization > 15:49:53.722937 [debug] [Thread-2 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config" 2022-07-11 15:50:05 normalization > 15:49:53.723169 [debug] [Thread-2 ]: Began compiling node model.airbyte_utils.partner_config 2022-07-11 15:50:05 normalization > 15:49:53.723408 [debug] [Thread-2 ]: Compiling model.airbyte_utils.partner_config 2022-07-11 15:50:05 normalization > 15:49:53.735822 [debug] [Thread-2 ]: Opening a new connection, currently in state closed 2022-07-11 15:50:05 normalization > 15:49:53.854017 [debug] [Thread-2 ]: Writing injected SQL for node "model.airbyte_utils.partner_config" 2022-07-11 15:50:05 normalization > 15:49:53.854684 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:50:05 normalization > 15:49:53.854918 [debug] [Thread-2 ]: Began executing node model.airbyte_utils.partner_config 2022-07-11 15:50:05 normalization > 15:49:53.924438 [debug] [Thread-2 ]: On model.airbyte_utils.partner_config: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > create or replace table `mainapi-282501`.raw_achilles.`partner_config__dbt_tmp` 2022-07-11 15:50:05 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:50:05 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:50:05 normalization > OPTIONS( 2022-07-11 15:50:05 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:50:05 normalization > ) 2022-07-11 15:50:05 normalization > as ( 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > -- Final base SQL model 2022-07-11 15:50:05 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`partner_config_scd` 2022-07-11 15:50:05 normalization > select 2022-07-11 15:50:05 normalization > _airbyte_unique_key, 2022-07-11 15:50:05 normalization > name, 2022-07-11 15:50:05 normalization > config, 2022-07-11 15:50:05 normalization > bank_id, 2022-07-11 15:50:05 normalization > created, 2022-07-11 15:50:05 normalization > updated, 2022-07-11 15:50:05 normalization > partner_id, 2022-07-11 15:50:05 normalization > routing_no, 2022-07-11 15:50:05 normalization > _ab_cdc_lsn, 2022-07-11 15:50:05 normalization > account_prefix, 2022-07-11 15:50:05 normalization > _ab_cdc_deleted_at, 2022-07-11 15:50:05 normalization > _ab_cdc_updated_at, 2022-07-11 15:50:05 normalization > _airbyte_ab_id, 2022-07-11 15:50:05 normalization > _airbyte_emitted_at, 2022-07-11 15:50:05 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:50:05 normalization > _airbyte_partner_config_hashid 2022-07-11 15:50:05 normalization > from `mainapi-282501`.raw_achilles.`partner_config_scd` 2022-07-11 15:50:05 normalization > -- partner_config from `mainapi-282501`.raw_achilles._airbyte_raw_partner_config 2022-07-11 15:50:05 normalization > where 1 = 1 2022-07-11 15:50:05 normalization > and _airbyte_active_row = 1 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > and coalesce( 2022-07-11 15:50:05 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 
15:50:05 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:50:05 normalization > timestamp 2022-07-11 15:50:05 normalization > )) from `mainapi-282501`.raw_achilles.`partner_config`), 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > true) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > ); 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:55.999111 [debug] [Thread-5 ]: 2022-07-11 15:50:05 normalization > In `mainapi-282501`.`raw_achilles`.`bank_config`: 2022-07-11 15:50:05 normalization > Schema changed: False 2022-07-11 15:50:05 normalization > Source columns not in target: [] 2022-07-11 15:50:05 normalization > Target columns not in source: [] 2022-07-11 15:50:05 normalization > New column types: [] 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:56.001173 [debug] [Thread-5 ]: Writing runtime SQL for node "model.airbyte_utils.bank_config" 2022-07-11 15:50:05 normalization > 15:49:56.001860 [debug] [Thread-5 ]: On model.airbyte_utils.bank_config: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > merge into `mainapi-282501`.raw_achilles.`bank_config` as DBT_INTERNAL_DEST 2022-07-11 15:50:05 normalization > using ( 2022-07-11 15:50:05 normalization > select * from `mainapi-282501`.raw_achilles.`bank_config__dbt_tmp` 2022-07-11 15:50:05 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:50:05 normalization > on 2022-07-11 15:50:05 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when matched then update set 2022-07-11 15:50:05 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`name` = DBT_INTERNAL_SOURCE.`name`,`config` = DBT_INTERNAL_SOURCE.`config`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`routing_no` = DBT_INTERNAL_SOURCE.`routing_no`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_bank_config_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_bank_config_hashid` 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when not matched then insert 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `name`, `config`, `bank_id`, `created`, `updated`, `routing_no`, `_ab_cdc_lsn`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_bank_config_hashid`) 2022-07-11 15:50:05 normalization > values 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `name`, `config`, `bank_id`, `created`, `updated`, 
`routing_no`, `_ab_cdc_lsn`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_bank_config_hashid`) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:56.025364 [debug] [Thread-3 ]: 2022-07-11 15:50:05 normalization > In `mainapi-282501`.`raw_achilles`.`transactions_out`: 2022-07-11 15:50:05 normalization > Schema changed: False 2022-07-11 15:50:05 normalization > Source columns not in target: [] 2022-07-11 15:50:05 normalization > Target columns not in source: [] 2022-07-11 15:50:05 normalization > New column types: [] 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:56.028061 [debug] [Thread-3 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_out" 2022-07-11 15:50:05 normalization > 15:49:56.028720 [debug] [Thread-3 ]: On model.airbyte_utils.transactions_out: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > merge into `mainapi-282501`.raw_achilles.`transactions_out` as DBT_INTERNAL_DEST 2022-07-11 15:50:05 normalization > using ( 2022-07-11 15:50:05 normalization > select * from `mainapi-282501`.raw_achilles.`transactions_out__dbt_tmp` 2022-07-11 15:50:05 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:50:05 normalization > on 2022-07-11 15:50:05 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when matched then update set 2022-07-11 15:50:05 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`id` = DBT_INTERNAL_SOURCE.`id`,`data` = DBT_INTERNAL_SOURCE.`data`,`uuid` = DBT_INTERNAL_SOURCE.`uuid`,`amount` = DBT_INTERNAL_SOURCE.`amount`,`status` = DBT_INTERNAL_SOURCE.`status`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`file_id` = DBT_INTERNAL_SOURCE.`file_id`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`trace_no` = DBT_INTERNAL_SOURCE.`trace_no`,`account_no` = DBT_INTERNAL_SOURCE.`account_no`,`partner_id` = DBT_INTERNAL_SOURCE.`partner_id`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`description` = DBT_INTERNAL_SOURCE.`description`,`external_id` = DBT_INTERNAL_SOURCE.`external_id`,`is_same_day` = DBT_INTERNAL_SOURCE.`is_same_day`,`return_data` = DBT_INTERNAL_SOURCE.`return_data`,`account_name` = DBT_INTERNAL_SOURCE.`account_name`,`effective_date` = DBT_INTERNAL_SOURCE.`effective_date`,`reference_info` = DBT_INTERNAL_SOURCE.`reference_info`,`transaction_code` = DBT_INTERNAL_SOURCE.`transaction_code`,`source_account_no` = DBT_INTERNAL_SOURCE.`source_account_no`,`transaction_in_id` = DBT_INTERNAL_SOURCE.`transaction_in_id`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`source_account_name` = DBT_INTERNAL_SOURCE.`source_account_name`,`destination_bank_routing_no` = DBT_INTERNAL_SOURCE.`destination_bank_routing_no`,`_airbyte_ab_id` = 
DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_transactions_out_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_transactions_out_hashid` 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > when not matched then insert 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `id`, `data`, `uuid`, `amount`, `status`, `bank_id`, `created`, `file_id`, `updated`, `trace_no`, `account_no`, `partner_id`, `_ab_cdc_lsn`, `description`, `external_id`, `is_same_day`, `return_data`, `account_name`, `effective_date`, `reference_info`, `transaction_code`, `source_account_no`, `transaction_in_id`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `source_account_name`, `destination_bank_routing_no`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_transactions_out_hashid`) 2022-07-11 15:50:05 normalization > values 2022-07-11 15:50:05 normalization > (`_airbyte_unique_key`, `id`, `data`, `uuid`, `amount`, `status`, `bank_id`, `created`, `file_id`, `updated`, `trace_no`, `account_no`, `partner_id`, `_ab_cdc_lsn`, `description`, `external_id`, `is_same_day`, `return_data`, `account_name`, `effective_date`, `reference_info`, `transaction_code`, `source_account_no`, `transaction_in_id`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `source_account_name`, `destination_bank_routing_no`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_transactions_out_hashid`) 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:56.587808 [debug] [Thread-6 ]: 2022-07-11 15:50:05 normalization > In `mainapi-282501`.`raw_achilles`.`files_in`: 2022-07-11 15:50:05 normalization > Schema changed: False 2022-07-11 15:50:05 normalization > Source columns not in target: [] 2022-07-11 15:50:05 normalization > Target columns not in source: [] 2022-07-11 15:50:05 normalization > New column types: [] 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 15:49:56.590432 [debug] [Thread-6 ]: Writing runtime SQL for node "model.airbyte_utils.files_in" 2022-07-11 15:50:05 normalization > 15:49:56.591337 [debug] [Thread-6 ]: On model.airbyte_utils.files_in: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in"} */ 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:05 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > merge into `mainapi-282501`.raw_achilles.`files_in` as DBT_INTERNAL_DEST 2022-07-11 15:50:06 normalization > using ( 2022-07-11 15:50:06 normalization > select * from `mainapi-282501`.raw_achilles.`files_in__dbt_tmp` 2022-07-11 15:50:06 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:50:06 normalization > on 2022-07-11 15:50:06 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > when matched then update set 2022-07-11 15:50:06 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`id` = 
DBT_INTERNAL_SOURCE.`id`,`ended` = DBT_INTERNAL_SOURCE.`ended`,`started` = DBT_INTERNAL_SOURCE.`started`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`file_hash` = DBT_INTERNAL_SOURCE.`file_hash`,`file_name` = DBT_INTERNAL_SOURCE.`file_name`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`iat_entry_count` = DBT_INTERNAL_SOURCE.`iat_entry_count`,`std_entry_count` = DBT_INTERNAL_SOURCE.`std_entry_count`,`total_batch_count` = DBT_INTERNAL_SOURCE.`total_batch_count`,`total_entry_count` = DBT_INTERNAL_SOURCE.`total_entry_count`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`preprocessing_path` = DBT_INTERNAL_SOURCE.`preprocessing_path`,`total_debit_amount` = DBT_INTERNAL_SOURCE.`total_debit_amount`,`postprocessing_path` = DBT_INTERNAL_SOURCE.`postprocessing_path`,`total_credit_amount` = DBT_INTERNAL_SOURCE.`total_credit_amount`,`iat_entries_processed` = DBT_INTERNAL_SOURCE.`iat_entries_processed`,`std_entries_processed` = DBT_INTERNAL_SOURCE.`std_entries_processed`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_files_in_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_files_in_hashid` 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > when not matched then insert 2022-07-11 15:50:06 normalization > (`_airbyte_unique_key`, `id`, `ended`, `started`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `iat_entry_count`, `std_entry_count`, `total_batch_count`, `total_entry_count`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `preprocessing_path`, `total_debit_amount`, `postprocessing_path`, `total_credit_amount`, `iat_entries_processed`, `std_entries_processed`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_in_hashid`) 2022-07-11 15:50:06 normalization > values 2022-07-11 15:50:06 normalization > (`_airbyte_unique_key`, `id`, `ended`, `started`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `iat_entry_count`, `std_entry_count`, `total_batch_count`, `total_entry_count`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `preprocessing_path`, `total_debit_amount`, `postprocessing_path`, `total_credit_amount`, `iat_entries_processed`, `std_entries_processed`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_in_hashid`) 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 15:49:56.609846 [debug] [Thread-4 ]: 2022-07-11 15:50:06 normalization > In `mainapi-282501`.`raw_achilles`.`files_out`: 2022-07-11 15:50:06 normalization > Schema changed: False 2022-07-11 15:50:06 normalization > Source columns not in target: [] 2022-07-11 15:50:06 normalization > Target columns not in source: [] 2022-07-11 15:50:06 normalization > New column types: [] 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 15:49:56.612360 [debug] [Thread-4 ]: Writing runtime SQL for node "model.airbyte_utils.files_out" 2022-07-11 15:50:06 normalization > 15:49:56.612913 [debug] [Thread-4 ]: On model.airbyte_utils.files_out: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out"} */ 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 
2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > merge into `mainapi-282501`.raw_achilles.`files_out` as DBT_INTERNAL_DEST 2022-07-11 15:50:06 normalization > using ( 2022-07-11 15:50:06 normalization > select * from `mainapi-282501`.raw_achilles.`files_out__dbt_tmp` 2022-07-11 15:50:06 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:50:06 normalization > on 2022-07-11 15:50:06 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > when matched then update set 2022-07-11 15:50:06 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`id` = DBT_INTERNAL_SOURCE.`id`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`file_hash` = DBT_INTERNAL_SOURCE.`file_hash`,`file_name` = DBT_INTERNAL_SOURCE.`file_name`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`batch_count` = DBT_INTERNAL_SOURCE.`batch_count`,`exchange_window` = DBT_INTERNAL_SOURCE.`exchange_window`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_files_out_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_files_out_hashid` 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > when not matched then insert 2022-07-11 15:50:06 normalization > (`_airbyte_unique_key`, `id`, `bank_id`, `created`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `batch_count`, `exchange_window`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_out_hashid`) 2022-07-11 15:50:06 normalization > values 2022-07-11 15:50:06 normalization > (`_airbyte_unique_key`, `id`, `bank_id`, `created`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `batch_count`, `exchange_window`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_out_hashid`) 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 15:49:56.704057 [debug] [Thread-2 ]: 2022-07-11 15:50:06 normalization > In `mainapi-282501`.`raw_achilles`.`partner_config`: 2022-07-11 15:50:06 normalization > Schema changed: False 2022-07-11 15:50:06 normalization > Source columns not in target: [] 2022-07-11 15:50:06 normalization > Target columns not in source: [] 2022-07-11 15:50:06 normalization > New column types: [] 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 15:49:56.706479 [debug] [Thread-2 ]: Writing runtime SQL for node "model.airbyte_utils.partner_config" 2022-07-11 15:50:06 normalization > 15:49:56.707137 [debug] [Thread-2 ]: On model.airbyte_utils.partner_config: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config"} */ 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 
normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > merge into `mainapi-282501`.raw_achilles.`partner_config` as DBT_INTERNAL_DEST 2022-07-11 15:50:06 normalization > using ( 2022-07-11 15:50:06 normalization > select * from `mainapi-282501`.raw_achilles.`partner_config__dbt_tmp` 2022-07-11 15:50:06 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:50:06 normalization > on 2022-07-11 15:50:06 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > when matched then update set 2022-07-11 15:50:06 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`name` = DBT_INTERNAL_SOURCE.`name`,`config` = DBT_INTERNAL_SOURCE.`config`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`partner_id` = DBT_INTERNAL_SOURCE.`partner_id`,`routing_no` = DBT_INTERNAL_SOURCE.`routing_no`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`account_prefix` = DBT_INTERNAL_SOURCE.`account_prefix`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_partner_config_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_partner_config_hashid` 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > when not matched then insert 2022-07-11 15:50:06 normalization > (`_airbyte_unique_key`, `name`, `config`, `bank_id`, `created`, `updated`, `partner_id`, `routing_no`, `_ab_cdc_lsn`, `account_prefix`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_partner_config_hashid`) 2022-07-11 15:50:06 normalization > values 2022-07-11 15:50:06 normalization > (`_airbyte_unique_key`, `name`, `config`, `bank_id`, `created`, `updated`, `partner_id`, `routing_no`, `_ab_cdc_lsn`, `account_prefix`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_partner_config_hashid`) 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 2022-07-11 15:50:06 normalization > 15:49:58.883712 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:50:06 normalization > 15:49:58.884975 [info ] [Thread-5 ]: 16 of 18 OK created incremental model raw_achilles.bank_config.......................................................... [MERGE (3.0 rows, 3.1 KB processed) in 5.66s] 2022-07-11 15:50:06 normalization > 15:49:58.885507 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.bank_config 2022-07-11 15:50:06 normalization > 15:49:59.022962 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:50:06 normalization > 15:49:59.024231 [info ] [Thread-3 ]: 15 of 18 OK created incremental model raw_achilles.transactions_out..................................................... 
[MERGE (113.0 rows, 101.9 KB processed) in 5.80s] 2022-07-11 15:50:06 normalization > 15:49:59.024775 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.transactions_out 2022-07-11 15:50:06 normalization > 15:49:59.572259 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:50:06 normalization > 15:49:59.573341 [info ] [Thread-2 ]: 18 of 18 OK created incremental model raw_achilles.partner_config....................................................... [MERGE (206.0 rows, 168.4 KB processed) in 5.85s] 2022-07-11 15:50:06 normalization > 15:49:59.573852 [debug] [Thread-2 ]: Finished running node model.airbyte_utils.partner_config 2022-07-11 15:50:06 normalization > 15:49:59.855543 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:50:06 normalization > 15:49:59.856894 [info ] [Thread-6 ]: 17 of 18 OK created incremental model raw_achilles.files_in............................................................. [MERGE (36.0 rows, 26.6 KB processed) in 6.52s] 2022-07-11 15:50:06 normalization > 15:49:59.857543 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.files_in 2022-07-11 15:50:06 normalization > 15:50:00.522461 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:50:06 normalization > 15:50:00.523710 [info ] [Thread-4 ]: 14 of 18 OK created incremental model raw_achilles.files_out............................................................ [MERGE (34.0 rows, 20.2 KB processed) in 7.54s] 2022-07-11 15:50:06 normalization > 15:50:00.524311 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.files_out 2022-07-11 15:50:06 normalization > 15:50:00.528738 [debug] [MainThread]: Acquiring new bigquery connection "master" 2022-07-11 15:50:06 normalization > 15:50:00.529753 [info ] [MainThread]: 2022-07-11 15:50:06 normalization > 15:50:00.530230 [info ] [MainThread]: Finished running 6 view models, 12 incremental models in 22.37s. 2022-07-11 15:50:06 normalization > 15:50:00.530785 [debug] [MainThread]: Connection 'master' was properly closed. 2022-07-11 15:50:06 normalization > 15:50:00.531115 [debug] [MainThread]: Connection 'model.airbyte_utils.partner_config' was properly closed. 2022-07-11 15:50:06 normalization > 15:50:00.531352 [debug] [MainThread]: Connection 'model.airbyte_utils.files_in_scd' was properly closed. 2022-07-11 15:50:06 normalization > 15:50:00.531524 [debug] [MainThread]: Connection 'model.airbyte_utils.transactions_out' was properly closed. 2022-07-11 15:50:06 normalization > 15:50:00.531703 [debug] [MainThread]: Connection 'model.airbyte_utils.files_out' was properly closed. 2022-07-11 15:50:06 normalization > 15:50:00.531864 [debug] [MainThread]: Connection 'model.airbyte_utils.bank_config' was properly closed. 2022-07-11 15:50:06 normalization > 15:50:00.532022 [debug] [MainThread]: Connection 'model.airbyte_utils.files_in' was properly closed. 2022-07-11 15:50:06 normalization > 15:50:00.532181 [debug] [MainThread]: Connection 'model.airbyte_utils.partner_config_scd' was properly closed. 2022-07-11 15:50:06 normalization > 15:50:00.532339 [debug] [MainThread]: Connection 'model.airbyte_utils.bank_config_scd' was properly closed. 
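For readability, the append-dedup pattern that each of the compiled statements above follows can be condensed as in the sketch below (illustrated with bank_config; table and column names are copied from the log, while the OPTIONS clause and the full column lists are trimmed, so this is the shape of the generated SQL, not the exact statement): dbt first materializes a temporary table holding only the currently-active SCD rows that are at least as new as what the final table already contains, then merges that table into the final table on _airbyte_unique_key.

-- Condensed sketch of the generated append-dedup SQL (bank_config stream), not the exact statement:
create or replace table `mainapi-282501`.raw_achilles.`bank_config__dbt_tmp`
partition by timestamp_trunc(_airbyte_emitted_at, day)
cluster by _airbyte_unique_key, _airbyte_emitted_at
as (
  select
    _airbyte_unique_key, name, config, bank_id, created, updated, routing_no,
    _ab_cdc_lsn, _ab_cdc_deleted_at, _ab_cdc_updated_at,
    _airbyte_ab_id, _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at,
    _airbyte_bank_config_hashid
  from `mainapi-282501`.raw_achilles.`bank_config_scd`
  where _airbyte_active_row = 1                      -- keep only the latest version of each record
    and coalesce(
          cast(_airbyte_emitted_at as timestamp) >=
            (select max(cast(_airbyte_emitted_at as timestamp))
               from `mainapi-282501`.raw_achilles.`bank_config`),
          true)                                      -- only rows newer than the current max in the target
);

merge into `mainapi-282501`.raw_achilles.`bank_config` as DBT_INTERNAL_DEST
using (select * from `mainapi-282501`.raw_achilles.`bank_config__dbt_tmp`) as DBT_INTERNAL_SOURCE
on DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key
when matched then update set
  -- the generated SQL overwrites every column; only two are shown here
  `name` = DBT_INTERNAL_SOURCE.`name`, `config` = DBT_INTERNAL_SOURCE.`config`
when not matched then insert row;  -- the generated SQL lists all columns explicitly; `insert row` is shorthand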
2022-07-11 15:50:06 normalization > 15:50:00.555009 [info ] [MainThread]:
2022-07-11 15:50:06 normalization > 15:50:00.555529 [info ] [MainThread]: Completed with 1 error and 0 warnings:
2022-07-11 15:50:06 normalization > 15:50:00.556237 [info ] [MainThread]:
2022-07-11 15:50:06 normalization > 15:50:00.556853 [error] [MainThread]: Database Error in model transactions_in_scd (models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql)
2022-07-11 15:50:06 normalization > 15:50:00.557603 [error] [MainThread]: Invalid timestamp string "0000-12-30T00:00:00Z"
2022-07-11 15:50:06 normalization > 15:50:00.558086 [error] [MainThread]: compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql
2022-07-11 15:50:06 normalization > 15:50:00.558746 [info ] [MainThread]:
2022-07-11 15:50:06 normalization > 15:50:00.559259 [info ] [MainThread]: Done. PASS=16 WARN=0 ERROR=1 SKIP=1 TOTAL=18
2022-07-11 15:50:06 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.plugin: is not defined in the schema and the schema does not allow additional properties, $.publication: is not defined in the schema and the schema does not allow additional properties, $.replication_slot: is not defined in the schema and the schema does not allow additional properties, $.method: does not have a value in the enumeration [Standard], $.method: must be a constant value Standard
2022-07-11 15:50:06 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.method: must be a constant value Standard
2022-07-11 15:50:06 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.credential.hmac_key_access_id: object found, string expected, $.credential.hmac_key_secret: object found, string expected
2022-07-11 15:50:06 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/2/logs.log
2022-07-11 15:50:06 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha
2022-07-11 15:50:06 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false'
2022-07-11 15:50:06 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-postgres:0.4.31 exists...
2022-07-11 15:50:06 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-postgres:0.4.31 was found locally.
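The Database Error reported above is what fails this attempt: a value feeding transactions_in_scd, "0000-12-30T00:00:00Z", is not a timestamp BigQuery will parse, so dbt reports one error and skips the dependent model (PASS=16 WARN=0 ERROR=1 SKIP=1) while the other models succeed. A minimal way to locate the offending records, sketched below under the assumption that the failing value sits in a timestamp-like field of the raw CDC payload (the `$.created` path and the `_airbyte_raw_transactions_in` table name are placeholders; substitute the stream's real raw table and timestamp columns), is to scan the raw table with SAFE_CAST, which returns NULL instead of raising the error:

-- Hypothetical diagnostic: list raw records whose assumed `created` field does not parse as a TIMESTAMP.
select
  _airbyte_ab_id,
  _airbyte_emitted_at,
  json_extract_scalar(_airbyte_data, '$.created') as created_raw
from `mainapi-282501`.raw_achilles.`_airbyte_raw_transactions_in`
where json_extract_scalar(_airbyte_data, '$.created') is not null
  and safe_cast(json_extract_scalar(_airbyte_data, '$.created') as timestamp) is null
limit 100;

A "0000-12-30" value like this likely comes from a zero or out-of-range date in the source column; once the rows are identified they can be corrected or excluded upstream before the normalization step is retried.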
2022-07-11 15:50:06 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:50:06 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/2 --log-driver none --name source-postgres-check-89696-2-tcegq --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/source-postgres:0.4.31 -e WORKER_JOB_ATTEMPT=2 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/source-postgres:0.4.31 check --config source_config.json 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(getSource):73 - Running source under deployment mode: OSS 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):85 - Starting source: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json} 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'} 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:08 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:08 INFO i.a.i.b.s.SshTunnel(getInstance):170 - Starting connection with method: NO_TUNNEL 2022-07-11 15:50:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:09 INFO c.z.h.HikariDataSource():80 - HikariPool-1 - Starting... 
2022-07-11 15:50:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:09 INFO c.z.h.HikariDataSource():82 - HikariPool-1 - Start completed. 2022-07-11 15:50:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:09 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:50:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:09 INFO i.a.i.s.j.AbstractJdbcSource(lambda$getCheckOperations$1):93 - Attempting to get metadata from the database to see if we can connect. 2022-07-11 15:50:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:09 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$2):197 - Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@1637601612 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot_achilles' AND plugin = 'wal2json' AND database = 'achilles' 2022-07-11 15:50:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:09 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:50:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:09 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$4):214 - Attempting to find the publication using the query: HikariProxyPreparedStatement@2063786038 wrapping SELECT * FROM pg_publication WHERE pubname = 'achilles_publication' 2022-07-11 15:50:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:09 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:50:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:09 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated... 2022-07-11 15:50:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:09 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed. 2022-07-11 15:50:09 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling... 2022-07-11 15:50:09 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/2/logs.log 2022-07-11 15:50:09 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha 2022-07-11 15:50:09 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false' 2022-07-11 15:50:09 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-bigquery:1.1.11 exists... 2022-07-11 15:50:09 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-bigquery:1.1.11 was found locally. 
2022-07-11 15:50:09 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:50:09 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/2 --log-driver none --name destination-bigquery-check-89696-2-oucam --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:1.1.11 -e WORKER_JOB_ATTEMPT=2 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/destination-bigquery:1.1.11 check --config source_config.json 2022-07-11 15:50:10 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings. 2022-07-11 15:50:10 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:50:10 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:50:10 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:50:10 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 2022-07-11 15:50:10 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] 2022-07-11 15:50:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:11 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json} 2022-07-11 15:50:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:11 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2022-07-11 15:50:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:11 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK 2022-07-11 15:50:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:11 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'} 2022-07-11 15:50:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:11 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:11 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:11 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. 
If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:12 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:12 INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):301 - Selected loading method is set to: GCS 2022-07-11 15:50:13 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:13 INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"format_type":"CSV","flattening":"No flattening"} 2022-07-11 15:50:13 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:13 INFO i.a.i.d.s.S3Destination(testSingleUpload):81 - Started testing if all required credentials assigned to user for single file uploading 2022-07-11 15:50:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:14 INFO i.a.i.d.s.S3Destination(testSingleUpload):91 - Finished checking for normal upload mode 2022-07-11 15:50:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:14 INFO i.a.i.d.s.S3Destination(testMultipartUpload):95 - Started testing if all required credentials assigned to user for multipart upload 2022-07-11 15:50:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:14 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/test_1657554614570 with full ID ABPnzm5czhMXPiQntCxoXLKtvBetLPybquJd-2Xzi3Q2xBnEFu1In1fgry3_EnHmdIJdniI 2022-07-11 15:50:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:14 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:50:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:14 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:50:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:14 WARN a.m.s.MultiPartOutputStream(close):160 - [MultipartOutputStream for parts 1 - 10000] is already closed 2022-07-11 15:50:14 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:14 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/test_1657554614570 with id ABPnzm5cz...HmdIJdniI]: Uploading leftover stream [Part number 1 containing 3.34 MB] 2022-07-11 15:50:15 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:15 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/test_1657554614570 with id ABPnzm5cz...HmdIJdniI]: Finished uploading [Part number 1 containing 3.34 MB] 2022-07-11 15:50:15 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:15 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/test_1657554614570 with id ABPnzm5cz...HmdIJdniI]: Completed 2022-07-11 15:50:15 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-11 15:50:15 INFO i.a.i.d.s.S3Destination(testMultipartUpload):119 - Finished verification for multipart upload mode 2022-07-11 15:50:16 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling... 
2022-07-11 15:50:16 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/2/logs.log 2022-07-11 15:50:16 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha 2022-07-11 15:50:16 INFO i.a.w.g.DefaultReplicationWorker(run):115 - start sync worker. job id: 89696 attempt id: 2 2022-07-11 15:50:16 INFO i.a.w.g.DefaultReplicationWorker(run):127 - configured sync modes: {public.bank_config=incremental - append_dedup, public.files_out=incremental - append_dedup, public.transactions_out=incremental - append_dedup, public.partner_config=incremental - append_dedup, public.transactions_in=incremental - append_dedup, public.files_in=incremental - append_dedup} 2022-07-11 15:50:16 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination... 2022-07-11 15:50:16 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false' 2022-07-11 15:50:16 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-bigquery:1.1.11 exists... 2022-07-11 15:50:16 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-bigquery:1.1.11 was found locally. 2022-07-11 15:50:16 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:50:16 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/2 --log-driver none --name destination-bigquery-write-89696-2-tqqmn --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:1.1.11 -e WORKER_JOB_ATTEMPT=2 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/destination-bigquery:1.1.11 write --config destination_config.json --catalog destination_catalog.json 2022-07-11 15:50:16 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false' 2022-07-11 15:50:16 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-postgres:0.4.31 exists... 2022-07-11 15:50:16 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-postgres:0.4.31 was found locally. 2022-07-11 15:50:16 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:50:16 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/2 --log-driver none --name source-postgres-read-89696-2-seyif --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/source-postgres:0.4.31 -e WORKER_JOB_ATTEMPT=2 -e AIRBYTE_VERSION=0.39.32-alpha -e WORKER_JOB_ID=89696 airbyte/source-postgres:0.4.31 read --config source_config.json --catalog source_catalog.json --state input_state.json 2022-07-11 15:50:16 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):405 - Destination output thread started. 2022-07-11 15:50:16 INFO i.a.w.g.DefaultReplicationWorker(run):169 - Waiting for source and destination threads to complete. 2022-07-11 15:50:16 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):298 - Replication thread started. 2022-07-11 15:50:17 destination > SLF4J: Class path contains multiple SLF4J bindings. 
2022-07-11 15:50:17 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:50:17 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:50:17 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] 2022-07-11 15:50:17 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 2022-07-11 15:50:17 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] 2022-07-11 15:50:19 source > 2022-07-11 15:50:19 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(getSource):73 - Running source under deployment mode: OSS 2022-07-11 15:50:19 source > 2022-07-11 15:50:19 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):85 - Starting source: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:50:19 source > 2022-07-11 15:50:19 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {read=null, catalog=source_catalog.json, state=input_state.json, config=source_config.json} 2022-07-11 15:50:19 source > 2022-07-11 15:50:19 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:50:19 source > 2022-07-11 15:50:19 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: READ 2022-07-11 15:50:19 source > 2022-07-11 15:50:19 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='input_state.json'} 2022-07-11 15:50:19 destination > 2022-07-11 15:50:19 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json} 2022-07-11 15:50:19 destination > 2022-07-11 15:50:19 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2022-07-11 15:50:19 destination > 2022-07-11 15:50:19 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: WRITE 2022-07-11 15:50:19 destination > 2022-07-11 15:50:19 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'} 2022-07-11 15:50:19 source > 2022-07-11 15:50:19 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:19 source > 2022-07-11 15:50:19 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:19 source > 2022-07-11 15:50:19 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:19 source > 2022-07-11 15:50:19 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. 
If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:19 destination > 2022-07-11 15:50:19 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:19 destination > 2022-07-11 15:50:19 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:19 destination > 2022-07-11 15:50:19 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2022-07-11 15:50:20 destination > 2022-07-11 15:50:20 INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):301 - Selected loading method is set to: GCS 2022-07-11 15:50:20 destination > 2022-07-11 15:50:20 INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"format_type":"AVRO","flattening":"No flattening"} 2022-07-11 15:50:20 destination > 2022-07-11 15:50:20 INFO i.a.i.d.b.BigQueryUtils(isKeepFilesInGcs):317 - All tmp files will be removed from GCS when replication is finished 2022-07-11 15:50:20 source > 2022-07-11 15:50:20 INFO i.a.i.b.s.SshTunnel(getInstance):170 - Starting connection with method: NO_TUNNEL 2022-07-11 15:50:20 source > 2022-07-11 15:50:20 INFO c.z.h.HikariDataSource():80 - HikariPool-1 - Starting... 2022-07-11 15:50:20 source > 2022-07-11 15:50:20 INFO c.z.h.HikariDataSource():82 - HikariPool-1 - Start completed. 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.j.AbstractJdbcSource(lambda$getCheckOperations$1):93 - Attempting to get metadata from the database to see if we can connect. 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$2):197 - Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@615853374 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot_achilles' AND plugin = 'wal2json' AND database = 'achilles' 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.p.PostgresSource(lambda$getCheckOperations$4):214 - Attempting to find the publication using the query: HikariProxyPreparedStatement@465152579 wrapping SELECT * FROM pg_publication WHERE pubname = 'achilles_publication' 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated... 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed. 
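Note: the two CDC prerequisites the source re-checks above (the named replication slot and the publication) can be verified by hand with the same queries it logs. A rough sketch using psycopg2 follows; psycopg2 and the password placeholder are my choices, while the host, port, database, user, slot name, plugin, and publication name are taken directly from the log.

    # Sketch: run the same replication-slot and publication checks the source
    # connector performs during CHECK/READ. PASSWORD is a placeholder (masked in the log).
    import psycopg2

    conn = psycopg2.connect(
        host="10.58.160.3", port=5432, dbname="achilles",
        user="airbyte_achilles", password="PASSWORD",
    )
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT * FROM pg_replication_slots "
            "WHERE slot_name = %s AND plugin = %s AND database = %s",
            ("airbyte_slot_achilles", "wal2json", "achilles"),
        )
        print("replication slot rows:", cur.rowcount)   # expect 1

        cur.execute(
            "SELECT * FROM pg_publication WHERE pubname = %s",
            ("achilles_publication",),
        )
        print("publication rows:", cur.rowcount)        # expect 1
    conn.close()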
2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:50:21 destination > 2022-07-11 15:50:21 INFO i.a.i.d.b.BigQueryDestination(getGcsRecordConsumer):289 - Creating BigQuery staging message consumer with staging ID e88dc468-12f8-44ae-b1e2-27336f786e43 at 2022-07-11T15:50:20.512Z 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.r.s.StateManagerFactory(createStateManager):51 - Global state manager selected to manage state object with type LEGACY. 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.r.s.StateManagerFactory(generateGlobalState):84 - Legacy state converted to global state. 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='files_in', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='partner_config', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='bank_config', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='files_out', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='transactions_out', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.r.s.CursorManager(createCursorInfoForStream):161 - No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='transactions_in', namespace='public'}, New Cursor Field: updated. Resetting cursor value 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.r.CdcStateManager():29 - Initialized CDC state with: null 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO c.z.h.HikariDataSource():80 - HikariPool-2 - Starting... 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO c.z.h.HikariDataSource():82 - HikariPool-2 - Start completed. 
2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.p.PostgresSource(discoverRawTables):168 - Checking schema: public 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.i.s.j.AbstractJdbcSource(discoverInternal):121 - Internal schemas to exclude: [catalog_history, information_schema, pg_catalog, pg_internal] 2022-07-11 15:50:21 destination > 2022-07-11 15:50:21 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=bank_config, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_mnk_bank_config}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_bank_config}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:50:21 destination > 2022-07-11 15:50:21 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=files_in, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xzl_files_in}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_in}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:50:21 destination > 2022-07-11 15:50:21 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=files_out, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wcj_files_out}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_out}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:50:21 source > 2022-07-11 15:50:21 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:50:21 destination > 2022-07-11 15:50:21 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=partner_config, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_hdh_partner_config}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, 
tableId=_airbyte_raw_partner_config}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:50:21 destination > 2022-07-11 15:50:21 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=transactions_in, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qvp_transactions_in}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_in}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:50:21 destination > 2022-07-11 15:50:21 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$createWriteConfigs$1):86 - BigQuery write config: BigQueryWriteConfig[streamName=transactions_out, namespace=raw_achilles, datasetId=raw_achilles, datasetLocation=US, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qpv_transactions_out}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_out}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=append_dedup, stagedFiles=[]] 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column trace_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column account_no (type varchar[17]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column account_name (type varchar[22]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column transaction_code (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column dc_sign (type varchar[6]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column effective_date (type timestamptz[35]) -> 
JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column originating_bank_routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column destination_bank_routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_history (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column uuid (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_attempt (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column in_suspense (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_error (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column subtype (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column ach_entry (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column returned (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table in_processing column processing_data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column partner_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column name (type varchar[23]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 destination > 2022-07-11 15:50:22 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started. 
2022-07-11 15:50:22 destination > 2022-07-11 15:50:22 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$3):98 - Preparing tmp tables in destination started for 6 streams 2022-07-11 15:50:22 destination > 2022-07-11 15:50:22 INFO i.a.i.d.b.BigQueryGcsOperations(createSchemaIfNotExists):86 - Creating dataset raw_achilles 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column account_prefix (type varchar[6]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column config (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table partner_config column routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table schema_migrations column version (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table schema_migrations column dirty (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column name (type varchar[23]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table bank_config column config (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table 
transactions_out column file_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column external_id (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column partner_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column trace_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column account_no (type varchar[17]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column account_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column source_account_no (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column source_account_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column description (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column effective_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column destination_bank_routing_no (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column return_data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column reference_info (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column transaction_code (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column created (type timestamptz[35]) -> JsonSchemaType({type=string, 
format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column transaction_in_id (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column uuid (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column status (type varchar[30]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_out column is_same_day (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column file_name (type varchar[255]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column batch_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column file_hash (type varchar[256]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column exchange_window (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_out column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column file_name (type text[2147483647]) -> 
JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column file_hash (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column company_id (type varchar[10]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column company_name (type varchar[16]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column company_entry_description (type varchar[10]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column batch_type (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column batch_number (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column originating_dfi (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column sec_code (type varchar[3]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column settlement_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column entry_trace_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column transaction_code (type varchar[2]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column receiving_dfi (type varchar[9]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column dfi_account_no (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column individual_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column individual_id_no (type varchar[15]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_record_count (type varchar[4]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO 
i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column external_id (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column bank_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column partner_id (type int4[10]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column effective_date (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column returned (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column processing_history (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column created (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column uuid (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column return_data (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column transaction_out_id (type uuid[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column foreign_exchange_indicator (type varchar[2]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column destination_country_code (type varchar[2]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column originator_id (type varchar[10]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column originating_currency_code (type varchar[3]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 
15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column destination_currency_code (type varchar[3]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_99 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_98 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_02 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_05 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_10 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_11 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_12 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_13 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_14 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_15 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_16 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_17 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column addenda_18 (type jsonb[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table transactions_in column future_dated (type bool[1]) -> JsonSchemaType({type=boolean}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column id (type bigserial[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column preprocessing_path (type text[2147483647]) -> 
JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column postprocessing_path (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column file_name (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column file_hash (type text[2147483647]) -> JsonSchemaType({type=string}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column started (type timestamp[29]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column ended (type timestamp[29]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column std_entries_processed (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column iat_entries_processed (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column iat_entry_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column std_entry_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_entry_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_batch_count (type int2[5]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_debit_amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column total_credit_amount (type int8[19]) -> JsonSchemaType({type=number}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.j.AbstractJdbcSource(lambda$discoverInternal$5):143 - Table files_in column updated (type timestamptz[35]) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone}) 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.in_processing 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.partner_config 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.schema_migrations 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO 
i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.bank_config 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.transactions_out 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.files_out 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.transactions_in 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresSource(discoverRawTables):172 - Found table: public.files_in 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.d.j.s.AdaptiveStreamingQueryConfig(initialize):38 - Set initial fetch size: 10 rows 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.d.j.s.TwoStageSizeEstimator(getTargetBufferByteSize):72 - Max memory limit: 29578231808, JDBC buffer size: 1073741824 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresCdcCatalogHelper(getPublicizedTables):92 - For CDC, only tables in publication achilles_publication will be included in the sync: [pglogical.node, pglogical.replication_set_seq, public.partner_config, public.schema_migrations, public.files_in, pglogical.queue, pglogical.node_interface, public.transactions_out, pglogical.local_node, pglogical.subscription, pglogical.replication_set_table, pglogical.depend, public.bank_config, pglogical.local_sync_status, public.files_out, public.in_processing, pglogical.replication_set, pglogical.sequence_state, public.transactions_in] 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresUtils(isCdc):25 - using CDC: true 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.s.p.PostgresCdcTargetPosition(targetPosition):45 - identified target lsn: PgLsn{lsn=365839866712} 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.a.i.d.AirbyteDebeziumHandler(getIncrementalIterators):99 - Using CDC: true 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO o.a.k.c.c.AbstractConfig(logAll):376 - EmbeddedConfig values: 2022-07-11 15:50:22 source > access.control.allow.methods = 2022-07-11 15:50:22 source > access.control.allow.origin = 2022-07-11 15:50:22 source > admin.listeners = null 2022-07-11 15:50:22 source > bootstrap.servers = [localhost:9092] 2022-07-11 15:50:22 source > client.dns.lookup = use_all_dns_ips 2022-07-11 15:50:22 source > config.providers = [] 2022-07-11 15:50:22 source > connector.client.config.override.policy = All 2022-07-11 15:50:22 source > header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter 2022-07-11 15:50:22 source > key.converter = class org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:50:22 source > listeners = [http://:8083] 2022-07-11 15:50:22 source > metric.reporters = [] 2022-07-11 15:50:22 source > metrics.num.samples = 2 2022-07-11 15:50:22 source > metrics.recording.level = INFO 2022-07-11 15:50:22 source > metrics.sample.window.ms = 30000 2022-07-11 15:50:22 source > offset.flush.interval.ms = 1000 2022-07-11 15:50:22 source > offset.flush.timeout.ms = 5000 2022-07-11 15:50:22 source > offset.storage.file.filename = /tmp/cdc-state-offset8679596912831471251/offset.dat 2022-07-11 15:50:22 source > offset.storage.partitions = null 2022-07-11 15:50:22 source > offset.storage.replication.factor = null 2022-07-11 15:50:22 source > offset.storage.topic = 2022-07-11 
15:50:22 source > plugin.path = null 2022-07-11 15:50:22 source > response.http.headers.config = 2022-07-11 15:50:22 source > rest.advertised.host.name = null 2022-07-11 15:50:22 source > rest.advertised.listener = null 2022-07-11 15:50:22 source > rest.advertised.port = null 2022-07-11 15:50:22 source > rest.extension.classes = [] 2022-07-11 15:50:22 source > ssl.cipher.suites = null 2022-07-11 15:50:22 source > ssl.client.auth = none 2022-07-11 15:50:22 source > ssl.enabled.protocols = [TLSv1.2, TLSv1.3] 2022-07-11 15:50:22 source > ssl.endpoint.identification.algorithm = https 2022-07-11 15:50:22 source > ssl.engine.factory.class = null 2022-07-11 15:50:22 source > ssl.key.password = null 2022-07-11 15:50:22 source > ssl.keymanager.algorithm = SunX509 2022-07-11 15:50:22 source > ssl.keystore.certificate.chain = null 2022-07-11 15:50:22 source > ssl.keystore.key = null 2022-07-11 15:50:22 source > ssl.keystore.location = null 2022-07-11 15:50:22 source > ssl.keystore.password = null 2022-07-11 15:50:22 source > ssl.keystore.type = JKS 2022-07-11 15:50:22 source > ssl.protocol = TLSv1.3 2022-07-11 15:50:22 source > ssl.provider = null 2022-07-11 15:50:22 source > ssl.secure.random.implementation = null 2022-07-11 15:50:22 source > ssl.trustmanager.algorithm = PKIX 2022-07-11 15:50:22 source > ssl.truststore.certificates = null 2022-07-11 15:50:22 source > ssl.truststore.location = null 2022-07-11 15:50:22 source > ssl.truststore.password = null 2022-07-11 15:50:22 source > ssl.truststore.type = JKS 2022-07-11 15:50:22 source > task.shutdown.graceful.timeout.ms = 5000 2022-07-11 15:50:22 source > topic.creation.enable = true 2022-07-11 15:50:22 source > topic.tracking.allow.reset = true 2022-07-11 15:50:22 source > topic.tracking.enable = true 2022-07-11 15:50:22 source > value.converter = class org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:50:22 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $: unknown found, object expected 2022-07-11 15:50:22 ERROR i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: null 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 WARN o.a.k.c.r.WorkerConfig(logInternalConverterRemovalWarnings):316 - The worker has been configured with one or more internal converter properties ([internal.key.converter, internal.value.converter]). Support for these properties was deprecated in version 2.0 and removed in version 3.0, and specifying them will have no effect. Instead, an instance of the JsonConverter with schemas.enable set to false will be used. For more information, please visit http://kafka.apache.org/documentation/#upgrade and consult the upgrade notesfor the 3.0 release. 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 WARN o.a.k.c.r.WorkerConfig(logPluginPathConfigProviderWarning):334 - Variables cannot be used in the 'plugin.path' property, since the property is used by plugin scanning before the config providers that replace the variables are initialized. The raw value 'null' was used for plugin scanning, as opposed to the transformed value 'null', and this may cause unexpected results. 
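Note: the source reports above that only tables in publication achilles_publication will be included in the sync and then identifies a target LSN in decimal form (PgLsn{lsn=365839866712}). Under the same hypothetical connection as the earlier sketch, the snippet below lists the publication's member tables via pg_publication_tables and prints the decimal LSN in the usual X/Y hex notation so it can be compared with the LSN values Debezium logs later.

    # Sketch: inspect which tables the publication actually exposes, and render
    # the decimal LSN from the log in Postgres's hex notation.
    import psycopg2

    conn = psycopg2.connect(
        host="10.58.160.3", port=5432, dbname="achilles",
        user="airbyte_achilles", password="PASSWORD",  # placeholder
    )
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT schemaname, tablename FROM pg_publication_tables "
            "WHERE pubname = %s ORDER BY schemaname, tablename",
            ("achilles_publication",),
        )
        for schema, table in cur.fetchall():
            print(f"{schema}.{table}")

        cur.execute("SELECT pg_current_wal_lsn()")
        print("current WAL LSN:", cur.fetchone()[0])
    conn.close()

    # Decimal -> standard X/Y hex notation for a Postgres LSN
    lsn = 365839866712
    print(f"target LSN: {lsn >> 32:X}/{lsn & 0xFFFFFFFF:X}")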
2022-07-11 15:50:22 source > 2022-07-11 15:50:22 WARN i.d.c.p.PostgresConnectorConfig(validatePluginName):1394 - Logical decoder 'wal2json' is deprecated and will be removed in future versions 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 WARN i.d.c.p.PostgresConnectorConfig(validateTruncateHandlingMode):1333 - Configuration property 'truncate.handling.mode' is deprecated and will be removed in future versions. Please use 'skipped.operations' instead. 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 WARN i.d.c.p.PostgresConnectorConfig(validateToastedValuePlaceholder):1384 - Configuration property 'toasted.value.placeholder' is deprecated and will be removed in future versions. Please use 'unavailable.value.placeholder' instead. 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(start):124 - Starting PostgresConnectorTask with configuration: 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - connector.class = io.debezium.connector.postgresql.PostgresConnector 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - max.queue.size = 8192 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - slot.name = airbyte_slot_achilles 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - publication.name = achilles_publication 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.storage.file.filename = /tmp/cdc-state-offset8679596912831471251/offset.dat 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - decimal.handling.mode = string 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - converters = datetime 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - datetime.type = io.airbyte.integrations.debezium.internals.PostgresConverter 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - value.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - key.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - publication.autocreate.mode = disabled 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.user = airbyte_achilles 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.dbname = achilles 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.storage = org.apache.kafka.connect.storage.FileOffsetBackingStore 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.server.name = achilles 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - offset.flush.timeout.ms = 5000 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - plugin.name = wal2json 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.port = 5432 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - 
offset.flush.interval.ms = 1000 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - key.converter.schemas.enable = false 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - internal.key.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.hostname = 10.58.160.3 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - database.password = ******** 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - name = achilles 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - value.converter.schemas.enable = false 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - internal.value.converter = org.apache.kafka.connect.json.JsonConverter 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - max.batch.size = 2048 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - table.include.list = public.bank_config,public.files_in,public.files_out,public.partner_config,public.transactions_in,public.transactions_out 2022-07-11 15:50:22 source > 2022-07-11 15:50:22 INFO i.d.c.c.BaseSourceTask(lambda$start$0):126 - snapshot.mode = initial 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.c.BaseSourceTask(getPreviousOffsets):318 - No previous offsets found 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.p.PostgresConnectorTask(start):108 - user 'airbyte_achilles' connected to database 'achilles' on PostgreSQL 12.10 on x86_64-pc-linux-gnu, compiled by Debian clang version 12.0.1, 64-bit with roles: 2022-07-11 15:50:23 source > role 'cloudsqlsuperuser' [superuser: false, replication: false, inherit: true, create role: true, create db: true, can log in: true] 2022-07-11 15:50:23 source > role 'pg_read_all_settings' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:50:23 source > role 'pg_stat_scan_tables' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:50:23 source > role 'pg_monitor' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:50:23 source > role 'airbyte_achilles' [superuser: false, replication: true, inherit: true, create role: true, create db: true, can log in: true] 2022-07-11 15:50:23 source > role 'pg_read_all_stats' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:50:23 source > role 'pg_signal_backend' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.p.c.PostgresConnection(readReplicationSlotInfo):251 - Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{53/E1155F98}, catalogXmin=19514215] 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.p.PostgresConnectorTask(start):117 - No previous offset found 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.p.s.InitialSnapshotter(shouldSnapshot):34 - Taking initial snapshot for new datasource 2022-07-11 
15:50:23 source > 2022-07-11 15:50:23 INFO i.d.u.Threads(threadFactory):270 - Requested thread factory for connector PostgresConnector, id = achilles named = change-event-source-coordinator 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.u.Threads$3(newThread):287 - Creating thread debezium-postgresconnector-achilles-change-event-source-coordinator 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.p.ChangeEventSourceCoordinator(lambda$start$0):103 - Metrics registered 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.p.ChangeEventSourceCoordinator(lambda$start$0):106 - Context created 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.p.s.InitialSnapshotter(shouldSnapshot):34 - Taking initial snapshot for new datasource 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.p.PostgresSnapshotChangeEventSource(getSnapshottingTask):64 - According to the connector configuration data will be snapshotted 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):87 - Snapshot step 1 - Preparing 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):96 - Snapshot step 2 - Determining captured tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.replication_set_table to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.bank_config to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.files_out to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.local_sync_status to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.partner_config to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.replication_set to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.node to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.schema_migrations to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.local_node to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.depend to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.replication_set_seq to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO 
i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.queue to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.subscription to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.transactions_out to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.node_interface to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.in_processing to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.transactions_in to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table pglogical.sequence_state to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(determineCapturedTables):189 - Adding table public.files_in to the list of capture schema tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):103 - Snapshot step 3 - Locking captured tables [public.bank_config, public.files_in, public.files_out, public.partner_config, public.transactions_in, public.transactions_out] 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):109 - Snapshot step 4 - Determining snapshot offset 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.p.PostgresOffsetContext(initialContext):231 - Creating initial offset context 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.p.PostgresOffsetContext(initialContext):234 - Read xlogStart at 'LSN{55/2DC15B88}' from transaction '20086061' 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.p.PostgresSnapshotChangeEventSource(updateOffsetForSnapshot):146 - Read xlogStart at 'LSN{55/2DC15B88}' from transaction '20086061' 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):112 - Snapshot step 5 - Reading structure of captured tables 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.c.p.PostgresSnapshotChangeEventSource(readTableStructure):192 - Reading structure of schema 'public' of catalog 'achilles' 2022-07-11 15:50:23 destination > 2022-07-11 15:50:23 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_hdh_partner_config}} 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):116 - Snapshot step 6 - Persisting schema history 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(doExecute):128 - Snapshot step 7 - Snapshotting data 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEvents):302 - Snapshotting contents of 6 tables 
while still in transaction 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.bank_config' (1 of 6 tables) 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.bank_config' using select statement: 'SELECT "bank_id", "name", "routing_no", "created", "updated", "config" FROM "public"."bank_config"' 2022-07-11 15:50:23 destination > 2022-07-11 15:50:23 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_hdh_partner_config}} 2022-07-11 15:50:23 destination > 2022-07-11 15:50:23 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream partner_config (dataset raw_achilles): airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 3 records for table 'public.bank_config'; total duration '00:00:00.029' 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.files_in' (2 of 6 tables) 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.files_in' using select statement: 'SELECT "id", "preprocessing_path", "postprocessing_path", "file_name", "file_hash", "started", "ended", "std_entries_processed", "iat_entries_processed", "iat_entry_count", "std_entry_count", "total_entry_count", "total_batch_count", "total_debit_amount", "total_credit_amount", "updated" FROM "public"."files_in"' 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 36 records for table 'public.files_in'; total duration '00:00:00.087' 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.files_out' (3 of 6 tables) 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.files_out' using select statement: 'SELECT "id", "bank_id", "file_name", "batch_count", "file_hash", "created", "exchange_window", "updated" FROM "public"."files_out"' 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 34 records for table 'public.files_out'; total duration '00:00:00.127' 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.partner_config' (4 of 6 tables) 2022-07-11 15:50:23 source > 2022-07-11 15:50:23 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.partner_config' using select statement: 'SELECT "bank_id", "partner_id", "name", "account_prefix", "created", "updated", "config", "routing_no" FROM "public"."partner_config"' 2022-07-11 15:50:24 destination > 2022-07-11 15:50:24 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object 
synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ does not exist in bucket; creating... 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 206 records for table 'public.partner_config'; total duration '00:00:00.21' 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.transactions_in' (5 of 6 tables) 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.transactions_in' using select statement: 'SELECT "id", "file_name", "file_hash", "company_id", "company_name", "company_entry_description", "batch_type", "batch_number", "originating_dfi", "sec_code", "settlement_date", "entry_trace_no", "transaction_code", "receiving_dfi", "dfi_account_no", "individual_name", "individual_id_no", "addenda_record_count", "external_id", "bank_id", "partner_id", "amount", "effective_date", "returned", "processing_history", "created", "updated", "uuid", "return_data", "transaction_out_id", "foreign_exchange_indicator", "destination_country_code", "originator_id", "originating_currency_code", "destination_currency_code", "addenda_99", "addenda_98", "addenda_02", "addenda_05", "addenda_10", "addenda_11", "addenda_12", "addenda_13", "addenda_14", "addenda_15", "addenda_16", "addenda_17", "addenda_18", "future_dated" FROM "public"."transactions_in"' 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 26 records for table 'public.transactions_in'; total duration '00:00:00.041' 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):338 - Exporting data from table 'public.transactions_out' (6 of 6 tables) 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):346 - For table 'public.transactions_out' using select statement: 'SELECT "id", "file_id", "external_id", "bank_id", "partner_id", "trace_no", "account_no", "account_name", "amount", "source_account_no", "source_account_name", "description", "effective_date", "destination_bank_routing_no", "return_data", "reference_info", "transaction_code", "created", "updated", "transaction_in_id", "uuid", "data", "status", "is_same_day" FROM "public"."transactions_out"' 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.r.RelationalSnapshotChangeEventSource(createDataEventsForTable):392 - Finished exporting 113 records for table 'public.transactions_out'; total duration '00:00:00.069' 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.p.s.AbstractSnapshotChangeEventSource(execute):88 - Snapshot - Final stage 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.p.ChangeEventSourceCoordinator(doSnapshot):156 - Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.postgresql.Source:STRUCT}, sourceInfo=source_info[server='achilles'db='achilles', lsn=LSN{55/2DC15B88}, txId=20086061, timestamp=2022-07-11T15:50:24.170Z, snapshot=FALSE, schema=public, table=transactions_out], lastSnapshotRecord=true, lastCompletelyProcessedLsn=null, lastCommitLsn=null, streamingStoppingLsn=null, transactionContext=TransactionContext 
[currentTransactionId=null, perTableEventCount={}, totalEventCount=0], incrementalSnapshotContext=IncrementalSnapshotContext [windowOpened=false, chunkEndPosition=null, dataCollectionsToSnapshot=[], lastEventKeySent=null, maximumKey=null]]] 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.p.ChangeEventSourceCoordinator(streamingConnected):234 - Connected metrics set to 'true' 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.bank_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.partner_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.p.ChangeEventSourceCoordinator(streamEvents):173 - Starting streaming 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresStreamingChangeEventSource(execute):127 - Retrieved latest position from stored offset 'LSN{55/2DC15B88}' 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.c.WalPositionLocator():40 - Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{55/2DC15B88}' 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.c.PostgresConnection(readReplicationSlotInfo):251 - Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{53/E1155F98}, catalogXmin=19514215] 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:50:24 destination > 2022-07-11 15:50:24 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ has been created in bucket. 
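The connector configuration logged above (slot.name = airbyte_slot_achilles, publication.name = achilles_publication, plugin.name = wal2json) and the REPLICA IDENTITY notices printed just before streaming starts can both be checked directly against the source database. The sketch below is not Airbyte's own code; it assumes psycopg2 and reuses the connection values from the log, with the password as a placeholder. relreplident 'd' is DEFAULT (only primary-key columns in the old image of UPDATE/DELETE events); switching a table to FULL is left commented out because it increases WAL volume.

    # Sketch (assumes psycopg2): verify the replication slot, the publication, and the
    # REPLICA IDENTITY of the captured tables referenced in the log above.
    import psycopg2

    conn = psycopg2.connect(host="10.58.160.3", port=5432, dbname="achilles",
                            user="airbyte_achilles", password="<placeholder>")
    conn.autocommit = True
    with conn.cursor() as cur:
        cur.execute("SELECT slot_name, plugin, active FROM pg_replication_slots "
                    "WHERE slot_name = 'airbyte_slot_achilles'")
        print(cur.fetchall())

        cur.execute("SELECT pubname, schemaname, tablename FROM pg_publication_tables "
                    "WHERE pubname = 'achilles_publication'")
        print(cur.fetchall())

        # 'd' = DEFAULT (PK columns only), 'f' = FULL (entire old row)
        cur.execute("""
            SELECT c.relname, c.relreplident
            FROM pg_class c
            JOIN pg_namespace n ON n.oid = c.relnamespace
            WHERE n.nspname = 'public'
              AND c.relname IN ('bank_config', 'files_in', 'files_out',
                                'partner_config', 'transactions_in', 'transactions_out')
        """)
        print(cur.fetchall())
        # cur.execute("ALTER TABLE public.transactions_out REPLICA IDENTITY FULL")
    conn.close()

If the publication query returns fewer tables than the table.include.list in the configuration above, the publication was likely not updated after new tables were added.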
2022-07-11 15:50:24 destination > 2022-07-11 15:50:24 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_mnk_bank_config}} 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.u.Threads(threadFactory):270 - Requested thread factory for connector PostgresConnector, id = achilles named = keep-alive 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.u.Threads$3(newThread):287 - Creating thread debezium-postgresconnector-achilles-keep-alive 2022-07-11 15:50:24 destination > 2022-07-11 15:50:24 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_mnk_bank_config}} 2022-07-11 15:50:24 destination > 2022-07-11 15:50:24 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream bank_config (dataset raw_achilles): airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.bank_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_out' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.partner_config' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.transactions_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresSchema(printReplicaIdentityInfo):103 - REPLICA IDENTITY for 'public.files_in' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.p.PostgresStreamingChangeEventSource(searchWalPosition):314 - Searching for WAL resume position 2022-07-11 15:50:24 destination > 2022-07-11 15:50:24 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ does not exist in bucket; creating... 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.e.EmbeddedEngine(stop):1047 - Stopping the embedded engine 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.e.EmbeddedEngine(stop):1055 - Waiting for PT5M for connector to stop 2022-07-11 15:50:24 destination > 2022-07-11 15:50:24 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ has been created in bucket. 
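The streaming phase above retrieves the last stored position LSN{55/2DC15B88} and then searches for a WAL resume position while the slot still reports latestFlushedLsn=LSN{53/E1155F98}. A small sketch (same assumed psycopg2 connection as above, not Airbyte code) for watching how far that slot trails the server's current WAL write position:

    # Sketch: report the slot's confirmed flush position and its lag behind the
    # current WAL write position, in bytes.
    import psycopg2

    conn = psycopg2.connect(host="10.58.160.3", port=5432, dbname="achilles",
                            user="airbyte_achilles", password="<placeholder>")
    with conn.cursor() as cur:
        cur.execute("""
            SELECT slot_name,
                   active,
                   confirmed_flush_lsn,
                   pg_current_wal_lsn() AS current_wal_lsn,
                   pg_wal_lsn_diff(pg_current_wal_lsn(), confirmed_flush_lsn) AS lag_bytes
            FROM pg_replication_slots
            WHERE slot_name = 'airbyte_slot_achilles'
        """)
        print(cur.fetchone())
    conn.close()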
2022-07-11 15:50:24 destination > 2022-07-11 15:50:24 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xzl_files_in}} 2022-07-11 15:50:24 destination > 2022-07-11 15:50:24 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xzl_files_in}} 2022-07-11 15:50:24 destination > 2022-07-11 15:50:24 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream files_in (dataset raw_achilles): airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.e.EmbeddedEngine(run):846 - Stopping the task and engine 2022-07-11 15:50:24 source > 2022-07-11 15:50:24 INFO i.d.c.c.BaseSourceTask(stop):238 - Stopping down connector 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ does not exist in bucket; creating... 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.d.c.p.PostgresStreamingChangeEventSource(searchWalPosition):335 - WAL resume position 'null' discovered 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.d.u.Threads(threadFactory):270 - Requested thread factory for connector PostgresConnector, id = achilles named = keep-alive 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.d.u.Threads$3(newThread):287 - Creating thread debezium-postgresconnector-achilles-keep-alive 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.d.c.p.PostgresStreamingChangeEventSource(processMessages):202 - Processing messages 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ has been created in bucket. 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qvp_transactions_in}} 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.d.p.ChangeEventSourceCoordinator(streamEvents):175 - Finished streaming 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.d.p.ChangeEventSourceCoordinator(streamingConnected):234 - Connected metrics set to 'false' 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.d.j.JdbcConnection(lambda$doClose$3):956 - Connection gracefully closed 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.a.i.d.i.DebeziumRecordPublisher(lambda$start$1):85 - Debezium engine shutdown. 
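In parallel, the destination is creating per-stream staging paths under gs://synctera-data-staging/airbyte/raw_achilles_<stream>/2022/07/11/15/... before any Avro files are uploaded. A minimal sketch for listing what lands under one of those prefixes, assuming the google-cloud-storage client and credentials for the mainapi-282501 project named later in this log; this is an external check, not the connector's own GCS code:

    # Sketch: list staged objects for the files_in stream's staging prefix.
    from google.cloud import storage

    client = storage.Client(project="mainapi-282501")
    prefix = "airbyte/raw_achilles_files_in/2022/07/11/15/"
    for blob in client.list_blobs("synctera-data-staging", prefix=prefix):
        print(blob.name, blob.size, blob.updated)

An empty listing after the "has been created in bucket" messages simply means the buffers have not been flushed yet, as happens later in this log.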
2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.a.i.s.p.PostgresCdcStateHandler(saveState):32 - debezium state: {"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839866760,\"txId\":20086061,\"ts_usec\":1657554624170000,\"snapshot\":true}"} 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):139 - Closing database connection pool. 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO c.z.h.HikariDataSource(close):350 - HikariPool-2 - Shutdown initiated... 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO c.z.h.HikariDataSource(close):352 - HikariPool-2 - Shutdown completed. 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):141 - Closed database connection pool. 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:50:25 source > 2022-07-11 15:50:25 INFO i.a.i.b.a.AdaptiveSourceRunner$Runner(run):87 - Completed source: io.airbyte.integrations.base.ssh.SshWrappedSource 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qvp_transactions_in}} 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream transactions_in (dataset raw_achilles): airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:25 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):327 - Source has no more messages, closing connection. 2022-07-11 15:50:25 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):335 - Total records read: 419 (288 KB) 2022-07-11 15:50:25 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publictransactions_out. Error messages: [$.file_id is of an incorrect type. Expected it to be number, $.transaction_in_id is of an incorrect type. Expected it to be string, $.return_data is of an incorrect type. Expected it to be string, $._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:50:25 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicfiles_in. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:50:25 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publictransactions_in. Error messages: [$.destination_country_code is of an incorrect type. Expected it to be string, $.addenda_11 is of an incorrect type. Expected it to be string, $.destination_currency_code is of an incorrect type. Expected it to be string, $.addenda_99 is of an incorrect type. Expected it to be string, $.foreign_exchange_indicator is of an incorrect type. Expected it to be string, $.addenda_12 is of an incorrect type. Expected it to be string, $.addenda_05 is of an incorrect type. Expected it to be string, $.addenda_15 is of an incorrect type. Expected it to be string, $.originator_id is of an incorrect type. Expected it to be string, $.addenda_10 is of an incorrect type. 
Expected it to be string, $.addenda_02 is of an incorrect type. Expected it to be string, $.addenda_18 is of an incorrect type. Expected it to be string, $.addenda_98 is of an incorrect type. Expected it to be string, $.individual_id_no is of an incorrect type. Expected it to be string, $.addenda_13 is of an incorrect type. Expected it to be string, $.addenda_record_count is of an incorrect type. Expected it to be string, $.transaction_out_id is of an incorrect type. Expected it to be string, $.addenda_16 is of an incorrect type. Expected it to be string, $.addenda_17 is of an incorrect type. Expected it to be string, $.return_data is of an incorrect type. Expected it to be string, $.originating_currency_code is of an incorrect type. Expected it to be string, $.future_dated is of an incorrect type. Expected it to be boolean, $._ab_cdc_deleted_at is of an incorrect type. Expected it to be string, $.addenda_14 is of an incorrect type. Expected it to be string] 2022-07-11 15:50:25 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicfiles_out. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:50:25 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicpartner_config. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:50:25 WARN i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$5):338 - Schema validation errors found for stream publicbank_config. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2022-07-11 15:50:25 INFO i.a.w.g.DefaultReplicationWorker(run):174 - One of source or destination thread complete. Waiting on the other. 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ does not exist in bucket; creating... 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ has been created in bucket. 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wcj_files_out}} 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wcj_files_out}} 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream files_out (dataset raw_achilles): airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:25 destination > 2022-07-11 15:50:25 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ does not exist in bucket; creating... 
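The saved Debezium state above stores the snapshot offset as a packed 64-bit integer, "lsn":365839866760. Splitting it into its high and low 32 bits reproduces the LSN{55/2DC15B88} reported when the snapshot offset was read, which is a quick way to sanity-check a persisted CDC state:

    # Sketch: decode the numeric lsn from the saved Debezium state into
    # Postgres' high/low hex notation.
    lsn = 365839866760
    print(f"{lsn >> 32:X}/{lsn & 0xFFFFFFFF:X}")  # prints 55/2DC15B88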
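The schema validation warnings above come from the replication worker comparing each record against the stream's configured JSON schema; they are logged as WARN here rather than treated as failures, and they commonly appear when a field is null or arrives with a different JSON type than the declared one (for example _ab_cdc_deleted_at). A sketch for spot-checking what was actually written, assuming the _airbyte_raw_<stream> table naming and the three-column raw layout shown later in this log:

    # Sketch: spot-check raw records in BigQuery; _airbyte_data holds the record as a
    # JSON string. The table name is assumed from the _airbyte_raw_<stream> pattern.
    from google.cloud import bigquery

    client = bigquery.Client(project="mainapi-282501")
    sql = """
        SELECT
          JSON_EXTRACT_SCALAR(_airbyte_data, '$._ab_cdc_deleted_at') AS ab_cdc_deleted_at,
          JSON_EXTRACT_SCALAR(_airbyte_data, '$.transaction_in_id')  AS transaction_in_id
        FROM `mainapi-282501.raw_achilles._airbyte_raw_transactions_out`
        LIMIT 10
    """
    for row in client.query(sql).result():
        print(dict(row))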
2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ has been created in bucket. 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.b.BigQueryGcsOperations(createTmpTableIfNotExists):94 - Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qpv_transactions_out}} 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):131 - Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qpv_transactions_out}} 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.b.BigQueryGcsOperations(createStageIfNotExists):101 - Creating staging path for stream transactions_out (dataset raw_achilles): airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):97 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ does not exist in bucket; creating... 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):99 - Storage Object synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ has been created in bucket. 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onStartFunction$3):107 - Preparing tmp tables in destination completed. 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream bank_config (current state: 0 bytes in 0 buffers) 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream files_in (current state: 0 bytes in 1 buffers) 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream files_out (current state: 0 bytes in 2 buffers) 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream partner_config (current state: 0 bytes in 3 buffers) 2022-07-11 15:50:26 destination > 2022-07-11 15:50:26 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream transactions_in (current state: 62 KB in 4 buffers) 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 
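At this point the destination has created one partitioned tmp table per stream in the raw_achilles dataset and starts buffering records. A minimal sketch of building a table with the same three-column raw layout using google-cloud-bigquery; the table id and the choice of day partitioning on _airbyte_emitted_at are illustrative assumptions rather than the connector's exact settings:

    # Sketch: create a time-partitioned table with Airbyte's raw three-column layout.
    # The table name and partition field here are illustrative assumptions.
    from google.cloud import bigquery

    client = bigquery.Client(project="mainapi-282501")
    table = bigquery.Table(
        "mainapi-282501.raw_achilles._airbyte_tmp_example_stream",
        schema=[
            bigquery.SchemaField("_airbyte_ab_id", "STRING"),
            bigquery.SchemaField("_airbyte_emitted_at", "TIMESTAMP"),
            bigquery.SchemaField("_airbyte_data", "STRING"),
        ],
    )
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY, field="_airbyte_emitted_at"
    )
    client.create_table(table)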
2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):48 - Starting a new buffer for stream transactions_out (current state: 125 KB in 5 buffers) 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.d.g.u.GcsUtils(getDefaultAvroSchema):25 - Default schema. 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded. 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.d.b.BufferedStreamConsumer(close):171 - executing on success close procedure. 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.d.r.SerializedBufferingStrategy(flushAll):103 - Flushing all 6 current buffers (188 KB in total) 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream partner_config (62 KB) 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream partner_config (62 KB) to staging 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to d0fbccf0-7418-47b5-be2e-79432d4c616b13800022824006392984.avro (111 KB) 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream partner_config (dataset raw_achilles): airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with full ID ABPnzm74uVI4M705gak3pWsPSWQjqC6PZV5O5HLKK4P0jdWrcPvm5NFcSrzQ3dr7SHsWdoc 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm74u...r7SHsWdoc]: Uploading leftover stream [Part number 1 containing 0.11 MB] 2022-07-11 15:50:27 destination > 2022-07-11 15:50:27 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm74u...r7SHsWdoc]: Finished uploading [Part number 1 containing 0.11 MB] 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm74u...r7SHsWdoc]: Completed 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: d0fbccf0-7418-47b5-be2e-79432d4c616b13800022824006392984.avro -> airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro (filename: 1.avro) 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added 
staged file: 1.avro 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data d0fbccf0-7418-47b5-be2e-79432d4c616b13800022824006392984.avro 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream files_in (325 bytes) 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream files_in (325 bytes) to staging 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to ac609a90-e4c4-4720-ba83-fa227950524d15111250756990867871.avro (23 KB) 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream files_in (dataset raw_achilles): airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with full ID ABPnzm6TAcGfX4fJ4wBKl33-hXcN5ARwNbj_NvsAfebPXFgGEbV0kGNobqFLSHWviPXsiwg 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm6TA...WviPXsiwg]: Uploading leftover stream [Part number 1 containing 0.02 MB] 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm6TA...WviPXsiwg]: Finished uploading [Part number 1 containing 0.02 MB] 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm6TA...WviPXsiwg]: Completed 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: ac609a90-e4c4-4720-ba83-fa227950524d15111250756990867871.avro -> airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro (filename: 1.avro) 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data ac609a90-e4c4-4720-ba83-fa227950524d15111250756990867871.avro 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream bank_config (328 bytes) 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream bank_config (328 bytes) to staging 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO 
i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to bbb22eb4-ae68-4255-976e-593d05b241ca14161809809289262948.avro (2 KB) 2022-07-11 15:50:28 destination > 2022-07-11 15:50:28 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream bank_config (dataset raw_achilles): airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with full ID ABPnzm7hQ8RYf_quM7fj32vqQqVIQ1VKZs1wf1yxj0WBrjEqWCX7ip_rFLbYqskEnL7eN8Q 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm7hQ...kEnL7eN8Q]: Uploading leftover stream [Part number 1 containing 0.00 MB] 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm7hQ...kEnL7eN8Q]: Finished uploading [Part number 1 containing 0.00 MB] 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm7hQ...kEnL7eN8Q]: Completed 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: bbb22eb4-ae68-4255-976e-593d05b241ca14161809809289262948.avro -> airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro (filename: 1.avro) 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data bbb22eb4-ae68-4255-976e-593d05b241ca14161809809289262948.avro 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream transactions_in (62 KB) 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream transactions_in (62 KB) to staging 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to c1d01780-4ffd-44b2-b6fc-aac916c2fe6810671385356873506774.avro (62 KB) 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream transactions_in (dataset raw_achilles): airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to 
synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with full ID ABPnzm7255vqnzRWpyMsTYJJ4nDRuXehhxVub7GepmbOOgYXF7mklh3xAIYGtas51WHN53w 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:50:29 destination > 2022-07-11 15:50:29 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm725...s51WHN53w]: Uploading leftover stream [Part number 1 containing 0.06 MB] 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm725...s51WHN53w]: Finished uploading [Part number 1 containing 0.06 MB] 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm725...s51WHN53w]: Completed 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: c1d01780-4ffd-44b2-b6fc-aac916c2fe6810671385356873506774.avro -> airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro (filename: 1.avro) 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data c1d01780-4ffd-44b2-b6fc-aac916c2fe6810671385356873506774.avro 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream transactions_out (63 KB) 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream transactions_out (63 KB) to staging 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to c8db20ed-73b1-4a2a-99ae-cabf78f8d60513271910756154874241.avro (93 KB) 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream transactions_out (dataset raw_achilles): airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with full ID ABPnzm7DV34wAvduqqXaWZFurlYdh8Jd2hW9WIXD-H7cu3NLJKD693hEsHEkFTH_8QOEEV8 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to 
synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm7DV...H_8QOEEV8]: Uploading leftover stream [Part number 1 containing 0.09 MB] 2022-07-11 15:50:30 destination > 2022-07-11 15:50:30 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm7DV...H_8QOEEV8]: Finished uploading [Part number 1 containing 0.09 MB] 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm7DV...H_8QOEEV8]: Completed 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: c8db20ed-73b1-4a2a-99ae-cabf78f8d60513271910756154874241.avro -> airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro (filename: 1.avro) 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data c8db20ed-73b1-4a2a-99ae-cabf78f8d60513271910756154874241.avro 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$flushAll$2):106 - Flushing buffer of stream files_out (326 bytes) 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$flushBufferFunction$4):115 - Flushing buffer for stream files_out (326 bytes) to staging 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to 33658f20-a8a8-420e-9a05-42e73b53d7fa13054383740504693559.avro (14 KB) 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.b.BigQueryGcsOperations(uploadRecordsToStage):108 - Uploading records to staging for stream files_out (dataset raw_achilles): airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with full ID ABPnzm49bwRr9bQ4g-0qgt0nQXBzWNE1tz-s2Y2GgMPUPakuSA9VD3lRp96peIYYDbF8rlk 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000] 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm49b...YYDbF8rlk]: Uploading leftover stream [Part number 1 containing 0.01 MB] 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm49b...YYDbF8rlk]: Finished uploading [Part number 1 containing 0.01 MB] 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO 
a.m.s.StreamTransferManager(complete):395 - [Manager uploading to synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro with id ABPnzm49b...YYDbF8rlk]: Completed 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.s.S3StorageOperations(loadDataIntoBucket):178 - Uploaded buffer file to storage: 33658f20-a8a8-420e-9a05-42e73b53d7fa13054383740504693559.avro -> airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro (filename: 1.avro) 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.b.BigQueryWriteConfig(addStagedFile):53 - Added staged file: 1.avro 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.r.FileBuffer(deleteFile):81 - Deleting tempFile data 33658f20-a8a8-420e-9a05-42e73b53d7fa13054383740504693559.avro 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream partner_config 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream files_in 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream bank_config 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream transactions_in 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream transactions_out 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.r.SerializedBufferingStrategy(close):127 - Closing buffer for stream files_out 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):137 - Copying into tables in destination started for 6 streams 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_hdh_partner_config}} (dataset raw_achilles): [1.avro] 2022-07-11 15:50:31 destination > 2022-07-11 15:50:31 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:50:32 destination > 2022-07-11 15:50:32 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=b10181d7-27bf-41b0-8c89-14c9297a718d, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_hdh_partner_config}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=b10181d7-27bf-41b0-8c89-14c9297a718d, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554632115, endTime=null, startTime=1657554632215, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=hyJePzDGprVC6gwUvi3NQg==, generatedId=mainapi-282501:US.b10181d7-27bf-41b0-8c89-14c9297a718d, 
selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/b10181d7-27bf-41b0-8c89-14c9297a718d?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_hdh_partner_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:50:32 destination > 2022-07-11 15:50:32 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=b10181d7-27bf-41b0-8c89-14c9297a718d, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554632115, endTime=null, startTime=1657554632215, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=hyJePzDGprVC6gwUvi3NQg==, generatedId=mainapi-282501:US.b10181d7-27bf-41b0-8c89-14c9297a718d, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/b10181d7-27bf-41b0-8c89-14c9297a718d?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_hdh_partner_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. 
Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:50:35 destination > 2022-07-11 15:50:35 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=b10181d7-27bf-41b0-8c89-14c9297a718d, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554632115, endTime=null, startTime=1657554632215, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=hyJePzDGprVC6gwUvi3NQg==, generatedId=mainapi-282501:US.b10181d7-27bf-41b0-8c89-14c9297a718d, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/b10181d7-27bf-41b0-8c89-14c9297a718d?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_hdh_partner_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:50:35 destination > 2022-07-11 15:50:35 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=b10181d7-27bf-41b0-8c89-14c9297a718d, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_hdh_partner_config}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:50:35 destination > 2022-07-11 15:50:35 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_hdh_partner_config}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_partner_config}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:50:39 destination > 2022-07-11 15:50:39 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_hdh_partner_config}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_partner_config}} 2022-07-11 15:50:39 destination > 2022-07-11 15:50:39 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, 
tableId=_airbyte_tmp_mnk_bank_config}} (dataset raw_achilles): [1.avro] 2022-07-11 15:50:39 destination > 2022-07-11 15:50:39 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:50:39 destination > 2022-07-11 15:50:39 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=045ad092-dbbc-4aaf-a7e1-531a3259ce36, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_mnk_bank_config}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=045ad092-dbbc-4aaf-a7e1-531a3259ce36, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554639492, endTime=null, startTime=1657554639591, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=/7loNKBgzgIOsWPaA8SMvA==, generatedId=mainapi-282501:US.045ad092-dbbc-4aaf-a7e1-531a3259ce36, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/045ad092-dbbc-4aaf-a7e1-531a3259ce36?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_mnk_bank_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:50:39 destination > 2022-07-11 15:50:39 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=045ad092-dbbc-4aaf-a7e1-531a3259ce36, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554639492, endTime=null, startTime=1657554639591, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=/7loNKBgzgIOsWPaA8SMvA==, generatedId=mainapi-282501:US.045ad092-dbbc-4aaf-a7e1-531a3259ce36, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/045ad092-dbbc-4aaf-a7e1-531a3259ce36?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, 
tableId=_airbyte_tmp_mnk_bank_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:50:42 destination > 2022-07-11 15:50:42 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=045ad092-dbbc-4aaf-a7e1-531a3259ce36, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554639492, endTime=null, startTime=1657554639591, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=/7loNKBgzgIOsWPaA8SMvA==, generatedId=mainapi-282501:US.045ad092-dbbc-4aaf-a7e1-531a3259ce36, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/045ad092-dbbc-4aaf-a7e1-531a3259ce36?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_mnk_bank_config}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:50:42 destination > 2022-07-11 15:50:42 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=045ad092-dbbc-4aaf-a7e1-531a3259ce36, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_mnk_bank_config}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:50:42 destination > 2022-07-11 15:50:42 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, 
tableId=_airbyte_tmp_mnk_bank_config}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_bank_config}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:50:45 destination > 2022-07-11 15:50:45 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_mnk_bank_config}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_bank_config}} 2022-07-11 15:50:45 destination > 2022-07-11 15:50:45 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xzl_files_in}} (dataset raw_achilles): [1.avro] 2022-07-11 15:50:45 destination > 2022-07-11 15:50:45 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:50:48 destination > 2022-07-11 15:50:48 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=ca45f933-8d5b-4b1b-a50a-04783ba40439, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xzl_files_in}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=ca45f933-8d5b-4b1b-a50a-04783ba40439, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554646030, endTime=null, startTime=1657554647998, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=dSUoyLl6NSnOgQ25RGX4FA==, generatedId=mainapi-282501:US.ca45f933-8d5b-4b1b-a50a-04783ba40439, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/ca45f933-8d5b-4b1b-a50a-04783ba40439?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_xzl_files_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:50:48 destination > 2022-07-11 15:50:48 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, 
job=ca45f933-8d5b-4b1b-a50a-04783ba40439, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554646030, endTime=null, startTime=1657554647998, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=dSUoyLl6NSnOgQ25RGX4FA==, generatedId=mainapi-282501:US.ca45f933-8d5b-4b1b-a50a-04783ba40439, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/ca45f933-8d5b-4b1b-a50a-04783ba40439?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_xzl_files_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:50:53 destination > 2022-07-11 15:50:53 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=ca45f933-8d5b-4b1b-a50a-04783ba40439, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554646030, endTime=null, startTime=1657554647998, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=dSUoyLl6NSnOgQ25RGX4FA==, generatedId=mainapi-282501:US.ca45f933-8d5b-4b1b-a50a-04783ba40439, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/ca45f933-8d5b-4b1b-a50a-04783ba40439?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_xzl_files_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, 
useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:50:53 destination > 2022-07-11 15:50:53 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=ca45f933-8d5b-4b1b-a50a-04783ba40439, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xzl_files_in}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:50:53 destination > 2022-07-11 15:50:53 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xzl_files_in}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_in}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:50:55 destination > 2022-07-11 15:50:55 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xzl_files_in}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_in}} 2022-07-11 15:50:55 destination > 2022-07-11 15:50:55 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qvp_transactions_in}} (dataset raw_achilles): [1.avro] 2022-07-11 15:50:55 destination > 2022-07-11 15:50:55 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:50:56 destination > 2022-07-11 15:50:56 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=5f30e8ab-8a4f-475c-9048-f5feb09a8fbe, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qvp_transactions_in}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=5f30e8ab-8a4f-475c-9048-f5feb09a8fbe, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554655875, endTime=null, startTime=1657554656034, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=+cjWAuf4yPflYsfNAMVw1A==, generatedId=mainapi-282501:US.5f30e8ab-8a4f-475c-9048-f5feb09a8fbe, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/5f30e8ab-8a4f-475c-9048-f5feb09a8fbe?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_qvp_transactions_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, 
maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:50:56 destination > 2022-07-11 15:50:56 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=5f30e8ab-8a4f-475c-9048-f5feb09a8fbe, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554655875, endTime=null, startTime=1657554656034, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=+cjWAuf4yPflYsfNAMVw1A==, generatedId=mainapi-282501:US.5f30e8ab-8a4f-475c-9048-f5feb09a8fbe, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/5f30e8ab-8a4f-475c-9048-f5feb09a8fbe?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_qvp_transactions_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. 
Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:50:59 destination > 2022-07-11 15:50:59 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=5f30e8ab-8a4f-475c-9048-f5feb09a8fbe, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554655875, endTime=null, startTime=1657554656034, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=+cjWAuf4yPflYsfNAMVw1A==, generatedId=mainapi-282501:US.5f30e8ab-8a4f-475c-9048-f5feb09a8fbe, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/5f30e8ab-8a4f-475c-9048-f5feb09a8fbe?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_qvp_transactions_in}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:50:59 destination > 2022-07-11 15:50:59 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=5f30e8ab-8a4f-475c-9048-f5feb09a8fbe, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qvp_transactions_in}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:50:59 destination > 2022-07-11 15:50:59 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qvp_transactions_in}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_in}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:51:01 destination > 2022-07-11 15:51:01 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qvp_transactions_in}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_in}} 2022-07-11 15:51:01 destination > 2022-07-11 15:51:01 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, 
tableId=_airbyte_tmp_wcj_files_out}} (dataset raw_achilles): [1.avro] 2022-07-11 15:51:01 destination > 2022-07-11 15:51:01 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:51:01 destination > 2022-07-11 15:51:01 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=f62b7e1e-c23d-45fe-9c6b-27036ba13320, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wcj_files_out}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=f62b7e1e-c23d-45fe-9c6b-27036ba13320, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554661206, endTime=null, startTime=1657554661317, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=FNLy23hTanizCnMG2Y+LEg==, generatedId=mainapi-282501:US.f62b7e1e-c23d-45fe-9c6b-27036ba13320, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/f62b7e1e-c23d-45fe-9c6b-27036ba13320?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_wcj_files_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:51:01 destination > 2022-07-11 15:51:01 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=f62b7e1e-c23d-45fe-9c6b-27036ba13320, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554661206, endTime=null, startTime=1657554661317, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=FNLy23hTanizCnMG2Y+LEg==, generatedId=mainapi-282501:US.f62b7e1e-c23d-45fe-9c6b-27036ba13320, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/f62b7e1e-c23d-45fe-9c6b-27036ba13320?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_wcj_files_out}}, 
decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:51:05 destination > 2022-07-11 15:51:05 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=f62b7e1e-c23d-45fe-9c6b-27036ba13320, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554661206, endTime=null, startTime=1657554661317, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=FNLy23hTanizCnMG2Y+LEg==, generatedId=mainapi-282501:US.f62b7e1e-c23d-45fe-9c6b-27036ba13320, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/f62b7e1e-c23d-45fe-9c6b-27036ba13320?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_wcj_files_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:51:05 destination > 2022-07-11 15:51:05 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=f62b7e1e-c23d-45fe-9c6b-27036ba13320, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wcj_files_out}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:51:05 destination > 2022-07-11 15:51:05 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wcj_files_out}} to target table 
GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_out}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:51:08 destination > 2022-07-11 15:51:08 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wcj_files_out}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_files_out}} 2022-07-11 15:51:08 destination > 2022-07-11 15:51:08 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTmpTableFromStage):122 - Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qpv_transactions_out}} (dataset raw_achilles): [1.avro] 2022-07-11 15:51:08 destination > 2022-07-11 15:51:08 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):126 - Uploading staged file: gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:51:08 destination > 2022-07-11 15:51:08 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):135 - [JobId{project=mainapi-282501, job=34425d6e-744c-4b28-9ca1-5bfedd0b4eb2, location=US}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qpv_transactions_out}} (dataset raw_achilles): Job{job=JobId{project=mainapi-282501, job=34425d6e-744c-4b28-9ca1-5bfedd0b4eb2, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554668487, endTime=null, startTime=1657554668631, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=l3//9zyeIt+ioQriVAW9xw==, generatedId=mainapi-282501:US.34425d6e-744c-4b28-9ca1-5bfedd0b4eb2, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/34425d6e-744c-4b28-9ca1-5bfedd0b4eb2?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_qpv_transactions_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} 2022-07-11 15:51:08 destination > 2022-07-11 15:51:08 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):325 - Waiting for job finish Job{job=JobId{project=mainapi-282501, job=34425d6e-744c-4b28-9ca1-5bfedd0b4eb2, 
location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554668487, endTime=null, startTime=1657554668631, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=l3//9zyeIt+ioQriVAW9xw==, generatedId=mainapi-282501:US.34425d6e-744c-4b28-9ca1-5bfedd0b4eb2, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/34425d6e-744c-4b28-9ca1-5bfedd0b4eb2?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_qpv_transactions_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:51:11 destination > 2022-07-11 15:51:11 INFO i.a.i.d.b.BigQueryUtils(waitForJobFinish):327 - Job finish Job{job=JobId{project=mainapi-282501, job=34425d6e-744c-4b28-9ca1-5bfedd0b4eb2, location=US}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1657554668487, endTime=null, startTime=1657554668631, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@mainapi-282501.iam.gserviceaccount.com, etag=l3//9zyeIt+ioQriVAW9xw==, generatedId=mainapi-282501:US.34425d6e-744c-4b28-9ca1-5bfedd0b4eb2, selfLink=https://www.googleapis.com/bigquery/v2/projects/mainapi-282501/jobs/34425d6e-744c-4b28-9ca1-5bfedd0b4eb2?location=US, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, projectId=mainapi-282501, tableId=_airbyte_tmp_qpv_transactions_out}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://synctera-data-staging/airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, 
useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null} 2022-07-11 15:51:11 destination > 2022-07-11 15:51:11 INFO i.a.i.d.b.BigQueryGcsOperations(lambda$copyIntoTmpTableFromStage$0):139 - [JobId{project=mainapi-282501, job=34425d6e-744c-4b28-9ca1-5bfedd0b4eb2, location=US}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qpv_transactions_out}} (dataset raw_achilles) is successfully appended with staging files 2022-07-11 15:51:11 destination > 2022-07-11 15:51:11 INFO i.a.i.d.b.BigQueryGcsOperations(copyIntoTargetTable):162 - Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qpv_transactions_out}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_out}} (dataset raw_achilles, sync mode append_dedup) 2022-07-11 15:51:14 destination > 2022-07-11 15:51:14 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):185 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qpv_transactions_out}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_raw_transactions_out}} 2022-07-11 15:51:14 destination > 2022-07-11 15:51:14 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):156 - Finalizing tables in destination completed 2022-07-11 15:51:14 destination > 2022-07-11 15:51:14 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):159 - Cleaning up destination started for 6 streams 2022-07-11 15:51:14 destination > 2022-07-11 15:51:14 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_hdh_partner_config}} (dataset raw_achilles) 2022-07-11 15:51:14 destination > 2022-07-11 15:51:14 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream partner_config (dataset raw_achilles): airbyte/raw_achilles_partner_config 2022-07-11 15:51:14 destination > 2022-07-11 15:51:14 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:51:14 destination > 2022-07-11 15:51:14 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:51:15 destination > 2022-07-11 15:51:15 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_partner_config has been cleaned-up (2 objects were deleted)... 
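
For anyone tracing the BigQueryGcsOperations entries above: for each stream the destination creates a LOAD job that appends the staged Avro file to a _airbyte_tmp_* table, waits for it to finish, then copies the tmp table into the matching _airbyte_raw_* table. Below is a minimal sketch of the equivalent calls, assuming the google-cloud-bigquery Java client. The project, dataset, table, and GCS URI literals are copied from the partner_config entries in the log; the class and variable names are illustrative only and are not the connector's actual code. The three-column raw schema (_airbyte_ab_id, _airbyte_emitted_at, _airbyte_data) is spelled out explicitly because it appears in every LoadJobConfiguration above.

// Sketch only: approximates the "Created a new job to upload records to tmp table ..."
// and "Copying data from tmp table ... to target table ..." entries above.
// Assumes the google-cloud-bigquery client; not the connector's real implementation.
import com.google.cloud.bigquery.*;

public class StagedAvroLoadSketch {
  public static void main(String[] args) throws InterruptedException {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    TableId tmpTable = TableId.of("mainapi-282501", "raw_achilles", "_airbyte_tmp_hdh_partner_config");
    TableId rawTable = TableId.of("mainapi-282501", "raw_achilles", "_airbyte_raw_partner_config");
    String stagedUri =
        "gs://synctera-data-staging/airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro";

    // The three-column raw schema logged in every LoadJobConfiguration.
    Schema rawSchema = Schema.of(
        Field.of("_airbyte_ab_id", StandardSQLTypeName.STRING),
        Field.of("_airbyte_emitted_at", StandardSQLTypeName.TIMESTAMP),
        Field.of("_airbyte_data", StandardSQLTypeName.STRING));

    // LOAD job: append the staged Avro file to the tmp table
    // (formatOptions=AVRO, writeDisposition=WRITE_APPEND, useAvroLogicalTypes=true, as logged).
    LoadJobConfiguration load = LoadJobConfiguration.newBuilder(tmpTable, stagedUri)
        .setFormatOptions(FormatOptions.avro())
        .setWriteDisposition(JobInfo.WriteDisposition.WRITE_APPEND)
        .setUseAvroLogicalTypes(true)
        .setSchema(rawSchema)
        .build();
    Job loadJob = bigquery.create(JobInfo.of(load)).waitFor();
    if (loadJob == null || loadJob.getStatus().getError() != null) {
      throw new RuntimeException("Load into tmp table failed: "
          + (loadJob == null ? "job disappeared" : loadJob.getStatus().getError()));
    }

    // COPY job: append the tmp table into the target raw table.
    CopyJobConfiguration copy = CopyJobConfiguration.newBuilder(rawTable, tmpTable)
        .setWriteDisposition(JobInfo.WriteDisposition.WRITE_APPEND)
        .build();
    Job copyJob = bigquery.create(JobInfo.of(copy)).waitFor();
    if (copyJob == null || copyJob.getStatus().getError() != null) {
      throw new RuntimeException("Copy to raw table failed: "
          + (copyJob == null ? "job disappeared" : copyJob.getStatus().getError()));
    }
  }
}

The "sync mode append_dedup" noted in the copy entries refers to how the raw table is later deduplicated downstream; the copy itself is a plain WRITE_APPEND, which is all the sketch shows.
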
2022-07-11 15:51:15 destination > 2022-07-11 15:51:15 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_mnk_bank_config}} (dataset raw_achilles) 2022-07-11 15:51:15 destination > 2022-07-11 15:51:15 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream bank_config (dataset raw_achilles): airbyte/raw_achilles_bank_config 2022-07-11 15:51:15 destination > 2022-07-11 15:51:15 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:51:15 destination > 2022-07-11 15:51:15 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_bank_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:51:15 destination > 2022-07-11 15:51:15 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_bank_config has been cleaned-up (2 objects were deleted)... 2022-07-11 15:51:15 destination > 2022-07-11 15:51:15 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_xzl_files_in}} (dataset raw_achilles) 2022-07-11 15:51:15 destination > 2022-07-11 15:51:15 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream files_in (dataset raw_achilles): airbyte/raw_achilles_files_in 2022-07-11 15:51:15 destination > 2022-07-11 15:51:15 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:51:16 destination > 2022-07-11 15:51:16 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:51:16 destination > 2022-07-11 15:51:16 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_files_in has been cleaned-up (2 objects were deleted)... 2022-07-11 15:51:16 destination > 2022-07-11 15:51:16 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qvp_transactions_in}} (dataset raw_achilles) 2022-07-11 15:51:16 destination > 2022-07-11 15:51:16 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream transactions_in (dataset raw_achilles): airbyte/raw_achilles_transactions_in 2022-07-11 15:51:16 destination > 2022-07-11 15:51:16 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:51:16 destination > 2022-07-11 15:51:16 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_in/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:51:16 destination > 2022-07-11 15:51:16 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_transactions_in has been cleaned-up (2 objects were deleted)... 
2022-07-11 15:51:16 destination > 2022-07-11 15:51:16 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_wcj_files_out}} (dataset raw_achilles) 2022-07-11 15:51:16 destination > 2022-07-11 15:51:16 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream files_out (dataset raw_achilles): airbyte/raw_achilles_files_out 2022-07-11 15:51:16 destination > 2022-07-11 15:51:16 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:51:17 destination > 2022-07-11 15:51:17 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_files_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:51:17 destination > 2022-07-11 15:51:17 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_files_out has been cleaned-up (2 objects were deleted)... 2022-07-11 15:51:17 destination > 2022-07-11 15:51:17 INFO i.a.i.d.b.BigQueryGcsOperations(dropTableIfExists):172 - Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=raw_achilles, tableId=_airbyte_tmp_qpv_transactions_out}} (dataset raw_achilles) 2022-07-11 15:51:17 destination > 2022-07-11 15:51:17 INFO i.a.i.d.b.BigQueryGcsOperations(dropStageIfExists):183 - Cleaning up staging path for stream transactions_out (dataset raw_achilles): airbyte/raw_achilles_transactions_out 2022-07-11 15:51:17 destination > 2022-07-11 15:51:17 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/ 2022-07-11 15:51:17 destination > 2022-07-11 15:51:17 INFO i.a.i.d.g.GcsStorageOperations(cleanUpObjects):43 - Deleting object airbyte/raw_achilles_transactions_out/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/1.avro 2022-07-11 15:51:17 destination > 2022-07-11 15:51:17 INFO i.a.i.d.s.S3StorageOperations(cleanUpBucketObject):270 - Storage bucket airbyte/raw_achilles_transactions_out has been cleaned-up (2 objects were deleted)... 2022-07-11 15:51:17 destination > 2022-07-11 15:51:17 INFO i.a.i.d.b.BigQueryStagingConsumerFactory(lambda$onCloseFunction$5):164 - Cleaning up destination completed. 
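
The cleanup phase traced above does two things per stream: drop the _airbyte_tmp_* table and delete the staged objects under the stream's GCS prefix (the folder marker plus 1.avro, hence the "2 objects were deleted" lines). A rough sketch of the equivalent calls, assuming the google-cloud-bigquery and google-cloud-storage Java clients and reusing the partner_config identifiers from the log for illustration; again, this is not the connector's actual code.

// Sketch only: approximates the "Deleting tmp table ...", "Cleaning up staging path ..."
// and "Deleting object ..." entries above. Assumes the Google Cloud Java clients.
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.storage.Blob;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class StagingCleanupSketch {
  public static void main(String[] args) {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
    Storage storage = StorageOptions.getDefaultInstance().getService();

    // Drop the per-stream tmp table (returns false if it was already gone).
    bigquery.delete(TableId.of("mainapi-282501", "raw_achilles", "_airbyte_tmp_hdh_partner_config"));

    // Delete every staged object under the stream's prefix: the "<folder>/" marker
    // and the 1.avro file seen in the log, i.e. 2 objects per stream.
    String bucket = "synctera-data-staging";
    String prefix = "airbyte/raw_achilles_partner_config/2022/07/11/15/e88dc468-12f8-44ae-b1e2-27336f786e43/";
    int deleted = 0;
    for (Blob blob : storage.list(bucket, Storage.BlobListOption.prefix(prefix)).iterateAll()) {
      storage.delete(blob.getBlobId());
      deleted++;
    }
    System.out.printf("Cleaned up %s/%s (%d objects deleted)%n", bucket, prefix, deleted);
  }
}
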
2022-07-11 15:51:17 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):415 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@15af32eb[type=STATE,log=,spec=,connectionStatus=,catalog=,record=,state=io.airbyte.protocol.models.AirbyteStateMessage@1b68ce2e[type=GLOBAL,stream=,global=io.airbyte.protocol.models.AirbyteGlobalState@1f57599b[sharedState={"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839866760,\"txId\":20086061,\"ts_usec\":1657554624170000,\"snapshot\":true}"}},streamStates=[io.airbyte.protocol.models.AirbyteStreamState@5a55c415[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@36018dde[name=bank_config,namespace=public,additionalProperties={}],streamState={"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@6bc53c8e[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@4e7b2bec[name=files_in,namespace=public,additionalProperties={}],streamState={"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@3a7a3b5a[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@af2eb6[name=files_out,namespace=public,additionalProperties={}],streamState={"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@42cc5361[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@557f249f[name=partner_config,namespace=public,additionalProperties={}],streamState={"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@3e346581[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@4717366f[name=transactions_in,namespace=public,additionalProperties={}],streamState={"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}], io.airbyte.protocol.models.AirbyteStreamState@67e032b7[streamDescriptor=io.airbyte.protocol.models.StreamDescriptor@4b17a908[name=transactions_out,namespace=public,additionalProperties={}],streamState={"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]},additionalProperties={}]],additionalProperties={}],data={"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839866760,\"txId\":20086061,\"ts_usec\":1657554624170000,\"snapshot\":true}"}},"streams":[{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}]},additionalProperties={}],trace=,additionalProperties={}] 2022-07-11 15:51:17 destination > 2022-07-11 15:51:17 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination 2022-07-11 15:51:17 INFO 
i.a.w.g.DefaultReplicationWorker(run):176 - Source and destination threads complete. 2022-07-11 15:51:17 INFO i.a.w.g.DefaultReplicationWorker(run):239 - sync summary: io.airbyte.config.ReplicationAttemptSummary@4d7e6d40[status=completed,recordsSynced=418,bytesSynced=295072,startTime=1657554616676,endTime=1657554677872,totalStats=io.airbyte.config.SyncStats@44a78b73[recordsEmitted=418,bytesEmitted=295072,stateMessagesEmitted=1,recordsCommitted=418],streamStats=[io.airbyte.config.StreamSyncStats@46e4f891[streamName=bank_config,stats=io.airbyte.config.SyncStats@1768360e[recordsEmitted=3,bytesEmitted=1792,stateMessagesEmitted=,recordsCommitted=3]], io.airbyte.config.StreamSyncStats@29424dce[streamName=transactions_out,stats=io.airbyte.config.SyncStats@51c6a786[recordsEmitted=113,bytesEmitted=90414,stateMessagesEmitted=,recordsCommitted=113]], io.airbyte.config.StreamSyncStats@3007fd5f[streamName=partner_config,stats=io.airbyte.config.SyncStats@66897517[recordsEmitted=206,bytesEmitted=104835,stateMessagesEmitted=,recordsCommitted=206]], io.airbyte.config.StreamSyncStats@5d9e7835[streamName=transactions_in,stats=io.airbyte.config.SyncStats@39000e97[recordsEmitted=26,bytesEmitted=62878,stateMessagesEmitted=,recordsCommitted=26]], io.airbyte.config.StreamSyncStats@3662f1ed[streamName=files_in,stats=io.airbyte.config.SyncStats@1bee1bfb[recordsEmitted=36,bytesEmitted=22085,stateMessagesEmitted=,recordsCommitted=36]], io.airbyte.config.StreamSyncStats@359f8763[streamName=files_out,stats=io.airbyte.config.SyncStats@c5253f1[recordsEmitted=34,bytesEmitted=13068,stateMessagesEmitted=,recordsCommitted=34]]]] 2022-07-11 15:51:17 INFO i.a.w.g.DefaultReplicationWorker(run):266 - Source output at least one state message 2022-07-11 15:51:17 INFO i.a.w.g.DefaultReplicationWorker(run):272 - State capture: Updated state to: 
Optional[io.airbyte.config.State@2d9e260f[state=[{"type":"GLOBAL","global":{"shared_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839866760,\"txId\":20086061,\"ts_usec\":1657554624170000,\"snapshot\":true}"}},"stream_states":[{"stream_descriptor":{"name":"bank_config","namespace":"public"},"stream_state":{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_in","namespace":"public"},"stream_state":{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_out","namespace":"public"},"stream_state":{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"partner_config","namespace":"public"},"stream_state":{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_in","namespace":"public"},"stream_state":{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_out","namespace":"public"},"stream_state":{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}}]},"data":{"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839866760,\"txId\":20086061,\"ts_usec\":1657554624170000,\"snapshot\":true}"}},"streams":[{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}]}}]]] 2022-07-11 15:51:17 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling... 
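
The GLOBAL state captured above pairs a shared CDC offset (a Debezium-style map whose value is itself a JSON string carrying last_snapshot_record, lsn, txId, ts_usec, snapshot) with one per-stream cursor entry. A small illustrative sketch, assuming Jackson, of pulling the lsn back out of that shared_state blob; this is not Airbyte's own state handling, just a way to read the structure logged above.

// Sketch only: parses the [{"type":"GLOBAL","global":{...}}] state array logged above
// and extracts the CDC offset fields from shared_state. Assumes Jackson (com.fasterxml.jackson).
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Iterator;
import java.util.Map;

public class GlobalStateSketch {
  public static void main(String[] args) throws Exception {
    String stateJson = args[0]; // the state array string from the "State capture" entry

    ObjectMapper mapper = new ObjectMapper();
    JsonNode sharedState = mapper.readTree(stateJson)
        .get(0).get("global").get("shared_state").get("state");

    // shared_state.state maps a Debezium offset key (itself a JSON string) to a
    // JSON string containing last_snapshot_record, lsn, txId, ts_usec, snapshot.
    Iterator<Map.Entry<String, JsonNode>> offsets = sharedState.fields();
    while (offsets.hasNext()) {
      Map.Entry<String, JsonNode> offset = offsets.next();
      JsonNode value = mapper.readTree(offset.getValue().asText());
      System.out.println("lsn=" + value.get("lsn").asLong()
          + " txId=" + value.get("txId").asLong()
          + " snapshot=" + value.get("snapshot").asBoolean());
    }
  }
}
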
2022-07-11 15:51:17 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):161 - sync summary: io.airbyte.config.StandardSyncOutput@49076dc4[standardSyncSummary=io.airbyte.config.StandardSyncSummary@c05ea80[status=completed,recordsSynced=418,bytesSynced=295072,startTime=1657554616676,endTime=1657554677872,totalStats=io.airbyte.config.SyncStats@44a78b73[recordsEmitted=418,bytesEmitted=295072,stateMessagesEmitted=1,recordsCommitted=418],streamStats=[io.airbyte.config.StreamSyncStats@46e4f891[streamName=bank_config,stats=io.airbyte.config.SyncStats@1768360e[recordsEmitted=3,bytesEmitted=1792,stateMessagesEmitted=,recordsCommitted=3]], io.airbyte.config.StreamSyncStats@29424dce[streamName=transactions_out,stats=io.airbyte.config.SyncStats@51c6a786[recordsEmitted=113,bytesEmitted=90414,stateMessagesEmitted=,recordsCommitted=113]], io.airbyte.config.StreamSyncStats@3007fd5f[streamName=partner_config,stats=io.airbyte.config.SyncStats@66897517[recordsEmitted=206,bytesEmitted=104835,stateMessagesEmitted=,recordsCommitted=206]], io.airbyte.config.StreamSyncStats@5d9e7835[streamName=transactions_in,stats=io.airbyte.config.SyncStats@39000e97[recordsEmitted=26,bytesEmitted=62878,stateMessagesEmitted=,recordsCommitted=26]], io.airbyte.config.StreamSyncStats@3662f1ed[streamName=files_in,stats=io.airbyte.config.SyncStats@1bee1bfb[recordsEmitted=36,bytesEmitted=22085,stateMessagesEmitted=,recordsCommitted=36]], io.airbyte.config.StreamSyncStats@359f8763[streamName=files_out,stats=io.airbyte.config.SyncStats@c5253f1[recordsEmitted=34,bytesEmitted=13068,stateMessagesEmitted=,recordsCommitted=34]]]],normalizationSummary=,state=io.airbyte.config.State@2d9e260f[state=[{"type":"GLOBAL","global":{"shared_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839866760,\"txId\":20086061,\"ts_usec\":1657554624170000,\"snapshot\":true}"}},"stream_states":[{"stream_descriptor":{"name":"bank_config","namespace":"public"},"stream_state":{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_in","namespace":"public"},"stream_state":{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"files_out","namespace":"public"},"stream_state":{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"partner_config","namespace":"public"},"stream_state":{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_in","namespace":"public"},"stream_state":{"stream_name":"transactions_in","stream_namespace":"public","cursor_field":["updated"]}},{"stream_descriptor":{"name":"transactions_out","namespace":"public"},"stream_state":{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}}]},"data":{"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"achilles\",{\"server\":\"achilles\"}]}":"{\"last_snapshot_record\":true,\"lsn\":365839866760,\"txId\":20086061,\"ts_usec\":1657554624170000,\"snapshot\":true}"}},"streams":[{"stream_name":"bank_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"files_out","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"partner_config","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":
"transactions_in","stream_namespace":"public","cursor_field":["updated"]},{"stream_name":"transactions_out","stream_namespace":"public","cursor_field":["updated"]}]}}]],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@37f48712[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@3d43846f[stream=io.airbyte.protocol.models.AirbyteStream@4f58b74e[name=bank_config,jsonSchema={"type":"object","properties":{"name":{"type":"string"},"config":{"type":"string"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"routing_no":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[bank_id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[bank_id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@6c2a15cf[stream=io.airbyte.protocol.models.AirbyteStream@7f75d34f[name=files_in,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"ended":{"type":"string","format":"date-time","airbyte_type":"timestamp_without_timezone"},"started":{"type":"string","format":"date-time","airbyte_type":"timestamp_without_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"file_hash":{"type":"string"},"file_name":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"iat_entry_count":{"type":"number"},"std_entry_count":{"type":"number"},"total_batch_count":{"type":"number"},"total_entry_count":{"type":"number"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"preprocessing_path":{"type":"string"},"total_debit_amount":{"type":"number"},"postprocessing_path":{"type":"string"},"total_credit_amount":{"type":"number"},"iat_entries_processed":{"type":"number"},"std_entries_processed":{"type":"number"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@5eb44032[stream=io.airbyte.protocol.models.AirbyteStream@545c7e6d[name=files_out,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"file_hash":{"type":"string"},"file_name":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"batch_count":{"type":"number"},"exchange_window":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@68648d44[stream=io.airbyte.protocol.models.AirbyteStream@395ac4ac[name=partner_config,jsonSchema={"type":"object","properties":{"name":{"type":"string"},"config":{"type":"string"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"partner_id":{"type":"number"},"routing_no":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"account_prefix":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[partner_id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[partner_id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@4593b665[stream=io.airbyte.protocol.models.AirbyteStream@720b17ac[name=transactions_in,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"uuid":{"type":"string"},"amount":{"type":"number"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"returned":{"type":"boolean"},"sec_code":{"type":"string"},"file_hash":{"type":"string"},"file_name":{"type":"string"},"addenda_02":{"type":"string"},"addenda_05":{"type":"string"},"addenda_10":{"type":"string"},"addenda_11":{"type":"string"},"addenda_12":{"type":"string"},"addenda_13":{"type":"string"},"addenda_14":{"type":"string"},"addenda_15":{"type":"string"},"addenda_16":{"type":"string"},"addenda_17":{"type":"string"},"addenda_18":{"type":"string"},"addenda_98":{"type":"string"},"addenda_99":{"type":"string"},"batch_type":{"type":"string"},"company_id":{"type":"string"},"partner_id":{"type":"number"},"_ab_cdc_lsn":{"type":"number"},"external_id":{"type":"string"},"return_data":{"type":"string"},"batch_number":{"type":"number"},"company_name":{"type":"string"},"future_dated":{"type":"boolean"},"originator_id":{"type":"string"},"receiving_dfi":{"type":"string"},"dfi_account_no":{"type":"string"},"effective_date":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"entry_trace_no":{"type":"string"},"individual_name":{"type":"string"},"originating_dfi":{"type":"string"},"settlement_date":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"individual_id_no":{"type":"string"},"transaction_code":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"processing_history":{"type":"string"},"transaction_out_id":{"type":"string"},"addenda_record_count":{"type":"string"},"destination_country_code":{"type":"string"},"company_entry_description":{"type":"string"},"destination_currency_code":{"type":"string"},"originating_currency_code":{"type":"string"},"foreign_exchange_indicator":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@3e49aa04[stream=io.airbyte.protocol.models.AirbyteStream@515966c5[name=transactions_out,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"data":{"type":"string"},"uuid":{"type":"string"},"amount":{"type":"number"},"status":{"type":"string"},"bank_id":{"type":"number"},"created":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"file_id":{"type":"number"},"updated":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"trace_no":{"type":"string"},"account_no":{"type":"string"},"partner_id":{"type":"number"},"_ab_cdc_lsn":{"type":"number"},"description":{"type":"string"},"external_id":{"type":"string"},"is_same_day":{"type":"boolean"},"return_data":{"type":"string"},"account_name":{"type":"string"},"effective_date":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"},"reference_info":{"type":"string"},"transaction_code":{"type":"number"},"source_account_no":{"type":"string"},"transaction_in_id":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"source_account_name":{"type":"string"},"destination_bank_routing_no":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=raw_achilles,additionalProperties={}],syncMode=incremental,cursorField=[updated],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[]] 2022-07-11 15:51:17 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating... 2022-07-11 15:51:17 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):43 - Using default value for environment variable USE_STREAM_CAPABLE_STATE: 'false' 2022-07-11 15:51:17 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/89696/2/logs.log 2022-07-11 15:51:17 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.32-alpha 2022-07-11 15:51:17 INFO i.a.w.g.DefaultNormalizationWorker(run):49 - Running normalization. 2022-07-11 15:51:17 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization:0.2.6 2022-07-11 15:51:17 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization:0.2.6 exists... 2022-07-11 15:51:18 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization:0.2.6 was found locally. 
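The sync output above also includes the configured catalog that is handed to normalization (destination_catalog.json in the docker command below): every stream is configured with syncMode=incremental, destinationSyncMode=append_dedup, cursorField=[updated], a source-defined primary key, and namespace raw_achilles. A minimal sketch for summarizing such a catalog, assuming it is available in the protocol's snake_case JSON form (the Java toString above prints the same fields in camelCase):

```python
import json

# Minimal sketch: summarize the configured catalog passed to normalization.
# Assumes a local copy of destination_catalog.json in protocol JSON form.
with open("destination_catalog.json") as f:
    catalog = json.load(f)

for configured in catalog["streams"]:
    stream = configured["stream"]
    print(
        f"{stream.get('namespace')}.{stream['name']}: "
        f"sync_mode={configured.get('sync_mode')}, "
        f"destination_sync_mode={configured.get('destination_sync_mode')}, "
        f"cursor={configured.get('cursor_field')}, "
        f"primary_key={configured.get('primary_key')}"
    )
```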
2022-07-11 15:51:18 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 89696 2022-07-11 15:51:18 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/89696/2/normalize --log-driver none --name normalization-normalize-89696-2-gcjsu --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.39.32-alpha airbyte/normalization:0.2.6 run --integration-type bigquery --config destination_config.json --catalog destination_catalog.json 2022-07-11 15:51:18 normalization > Running: transform-config --config destination_config.json --integration-type bigquery --out /data/89696/2/normalize 2022-07-11 15:51:19 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/89696/2/normalize') 2022-07-11 15:51:19 normalization > transform_bigquery 2022-07-11 15:51:19 normalization > Running: transform-catalog --integration-type bigquery --profile-config-dir /data/89696/2/normalize --catalog destination_catalog.json --out /data/89696/2/normalize/models/generated/ --json-column _airbyte_data 2022-07-11 15:51:20 normalization > Processing destination_catalog.json... 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/bank_config_ab1.sql from bank_config 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/bank_config_ab2.sql from bank_config 2022-07-11 15:51:20 normalization > Generating airbyte_views/raw_achilles/bank_config_stg.sql from bank_config 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/scd/raw_achilles/bank_config_scd.sql from bank_config 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/raw_achilles/bank_config.sql from bank_config 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/files_in_ab1.sql from files_in 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/files_in_ab2.sql from files_in 2022-07-11 15:51:20 normalization > Generating airbyte_views/raw_achilles/files_in_stg.sql from files_in 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/scd/raw_achilles/files_in_scd.sql from files_in 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/raw_achilles/files_in.sql from files_in 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/files_out_ab1.sql from files_out 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/files_out_ab2.sql from files_out 2022-07-11 15:51:20 normalization > Generating airbyte_views/raw_achilles/files_out_stg.sql from files_out 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/scd/raw_achilles/files_out_scd.sql from files_out 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/raw_achilles/files_out.sql from files_out 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/partner_config_ab1.sql from partner_config 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/partner_config_ab2.sql from partner_config 2022-07-11 15:51:20 normalization > Generating airbyte_views/raw_achilles/partner_config_stg.sql from partner_config 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/scd/raw_achilles/partner_config_scd.sql from partner_config 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/raw_achilles/partner_config.sql from partner_config 2022-07-11 15:51:20 normalization > Generating 
airbyte_ctes/raw_achilles/transactions_in_ab1.sql from transactions_in 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/transactions_in_ab2.sql from transactions_in 2022-07-11 15:51:20 normalization > Generating airbyte_views/raw_achilles/transactions_in_stg.sql from transactions_in 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql from transactions_in 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/raw_achilles/transactions_in.sql from transactions_in 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/transactions_out_ab1.sql from transactions_out 2022-07-11 15:51:20 normalization > Generating airbyte_ctes/raw_achilles/transactions_out_ab2.sql from transactions_out 2022-07-11 15:51:20 normalization > Generating airbyte_views/raw_achilles/transactions_out_stg.sql from transactions_out 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/scd/raw_achilles/transactions_out_scd.sql from transactions_out 2022-07-11 15:51:20 normalization > Generating airbyte_incremental/raw_achilles/transactions_out.sql from transactions_out 2022-07-11 15:51:20 normalization > detected no config file for ssh, assuming ssh is off. 2022-07-11 15:51:23 normalization > [--event-buffer-size EVENT_BUFFER_SIZE] 2022-07-11 15:51:23 normalization > --event-buffer-size EVENT_BUFFER_SIZE 2022-07-11 15:51:23 normalization > 2022-07-11 15:51:23 normalization > DBT >=1.0.0 detected; using 10K event buffer size 2022-07-11 15:51:23 normalization > 2022-07-11 15:51:27 normalization > 15:51:27 Running with dbt=1.0.0 2022-07-11 15:51:27 normalization > 15:51:27 Partial parse save file not found. Starting full parse. 2022-07-11 15:51:30 normalization > 15:51:30 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources. 2022-07-11 15:51:30 normalization > There are 1 unused configuration paths: 2022-07-11 15:51:30 normalization > - models.airbyte_utils.generated.airbyte_tables 2022-07-11 15:51:30 normalization > 2022-07-11 15:51:30 normalization > 15:51:30 Found 30 models, 0 tests, 0 snapshots, 0 analyses, 549 macros, 0 operations, 0 seed files, 6 sources, 0 exposures, 0 metrics 2022-07-11 15:51:30 normalization > 15:51:30 2022-07-11 15:51:31 normalization > 15:51:31 Concurrency: 8 threads (target='prod') 2022-07-11 15:51:31 normalization > 15:51:31 2022-07-11 15:51:32 normalization > 15:51:32 1 of 18 START view model _airbyte_raw_achilles.bank_config_stg.......................................................... [RUN] 2022-07-11 15:51:32 normalization > 15:51:32 2 of 18 START view model _airbyte_raw_achilles.files_in_stg............................................................. [RUN] 2022-07-11 15:51:32 normalization > 15:51:32 3 of 18 START view model _airbyte_raw_achilles.files_out_stg............................................................ [RUN] 2022-07-11 15:51:32 normalization > 15:51:32 4 of 18 START view model _airbyte_raw_achilles.partner_config_stg....................................................... [RUN] 2022-07-11 15:51:32 normalization > 15:51:32 5 of 18 START view model _airbyte_raw_achilles.transactions_out_stg..................................................... [RUN] 2022-07-11 15:51:32 normalization > 15:51:32 6 of 18 START view model _airbyte_raw_achilles.transactions_in_stg...................................................... 
[RUN]
2022-07-11 15:51:34 normalization > 15:51:34 3 of 18 OK created view model _airbyte_raw_achilles.files_out_stg.............................. [OK in 1.25s]
2022-07-11 15:51:34 normalization > 15:51:34 2 of 18 OK created view model _airbyte_raw_achilles.files_in_stg............................... [OK in 1.26s]
2022-07-11 15:51:34 normalization > 15:51:34 7 of 18 START incremental model raw_achilles.files_out_scd..................................... [RUN]
2022-07-11 15:51:34 normalization > 15:51:34 8 of 18 START incremental model raw_achilles.files_in_scd...................................... [RUN]
2022-07-11 15:51:34 normalization > 15:51:34 5 of 18 OK created view model _airbyte_raw_achilles.transactions_out_stg....................... [OK in 1.36s]
2022-07-11 15:51:34 normalization > 15:51:34 1 of 18 OK created view model _airbyte_raw_achilles.bank_config_stg............................ [OK in 1.40s]
2022-07-11 15:51:34 normalization > 15:51:34 9 of 18 START incremental model raw_achilles.transactions_out_scd.............................. [RUN]
2022-07-11 15:51:34 normalization > 15:51:34 10 of 18 START incremental model raw_achilles.bank_config_scd.................................. [RUN]
2022-07-11 15:51:34 normalization > 15:51:34 4 of 18 OK created view model _airbyte_raw_achilles.partner_config_stg......................... [OK in 1.44s]
2022-07-11 15:51:34 normalization > 15:51:34 11 of 18 START incremental model raw_achilles.partner_config_scd............................... [RUN]
2022-07-11 15:51:34 normalization > 15:51:34 6 of 18 OK created view model _airbyte_raw_achilles.transactions_in_stg........................ [OK in 1.54s]
2022-07-11 15:51:34 normalization > 15:51:34 12 of 18 START incremental model raw_achilles.transactions_in_scd.............................. [RUN]
2022-07-11 15:51:34 normalization > 15:51:34 15:51:34 + `mainapi-282501`.raw_achilles.`transactions_in_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-07-11 15:51:40 normalization > 15:51:40 12 of 18 ERROR creating incremental model raw_achilles.transactions_in_scd.................... [ERROR in 6.48s]
2022-07-11 15:51:40 normalization > 15:51:40 13 of 18 SKIP relation raw_achilles.transactions_in............................................ [SKIP]
2022-07-11 15:51:46 normalization > 15:51:46 11 of 18 OK created incremental model raw_achilles.partner_config_scd.......................... [MERGE (412.0 rows, 298.0 KB processed) in 12.09s]
2022-07-11 15:51:46 normalization > 15:51:46 14 of 18 START incremental model raw_achilles.partner_config................................... [RUN]
2022-07-11 15:51:50 normalization > 15:51:50 9 of 18 OK created incremental model raw_achilles.transactions_out_scd......................... [MERGE (226.0 rows, 177.8 KB processed) in 16.15s]
2022-07-11 15:51:50 normalization > 15:51:50 15 of 18 START incremental model raw_achilles.transactions_out................................. [RUN]
2022-07-11 15:51:52 normalization > 15:51:52 14 of 18 OK created incremental model raw_achilles.partner_config.............................. [MERGE (206.0 rows, 168.4 KB processed) in 5.94s]
2022-07-11 15:51:55 normalization > 15:51:55 15 of 18 OK created incremental model raw_achilles.transactions_out............................ [MERGE (113.0 rows, 101.9 KB processed) in 5.64s]
2022-07-11 15:52:00 normalization > 15:52:00 8 of 18 OK created incremental model raw_achilles.files_in_scd................................. [MERGE (72.0 rows, 47.9 KB processed) in 26.65s]
2022-07-11 15:52:00 normalization > 15:52:00 16 of 18 START incremental model raw_achilles.files_in......................................... [RUN]
2022-07-11 15:52:06 normalization > 15:52:06 16 of 18 OK created incremental model raw_achilles.files_in.................................... [MERGE (36.0 rows, 26.6 KB processed) in 5.73s]
2022-07-11 15:52:09 normalization > 15:52:09 10 of 18 OK created incremental model raw_achilles.bank_config_scd............................. [MERGE (6.0 rows, 5.2 KB processed) in 35.41s]
2022-07-11 15:52:09 normalization > 15:52:09 17 of 18 START incremental model raw_achilles.bank_config...................................... [RUN]
2022-07-11 15:52:16 normalization > 15:52:16 17 of 18 OK created incremental model raw_achilles.bank_config................................. [MERGE (3.0 rows, 3.1 KB processed) in 6.34s]
2022-07-11 15:52:39 normalization > 15:52:39 7 of 18 OK created incremental model raw_achilles.files_out_scd................................ [MERGE (68.0 rows, 37.8 KB processed) in 65.25s]
2022-07-11 15:52:39 normalization > 15:52:39 18 of 18 START incremental model raw_achilles.files_out........................................ [RUN]
2022-07-11 15:52:49 normalization > 15:52:49 18 of 18 OK created incremental model raw_achilles.files_out................................... [MERGE (34.0 rows, 20.2 KB processed) in 10.17s]
2022-07-11 15:52:49 normalization > 15:52:49
2022-07-11 15:52:49 normalization > 15:52:49 Finished running 6 view models, 12 incremental models in 78.57s.
2022-07-11 15:52:49 normalization > 15:52:49
2022-07-11 15:52:49 normalization > 15:52:49 Completed with 1 error and 0 warnings:
2022-07-11 15:52:49 normalization > 15:52:49
2022-07-11 15:52:49 normalization > 15:52:49 Database Error in model transactions_in_scd (models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql)
2022-07-11 15:52:49 normalization > 15:52:49 Invalid timestamp string "0000-12-30T00:00:00Z"
2022-07-11 15:52:49 normalization > 15:52:49 compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql
2022-07-11 15:52:49 normalization > 15:52:49
2022-07-11 15:52:49 normalization > 15:52:49 Done. PASS=16 WARN=0 ERROR=1 SKIP=1 TOTAL=18
2022-07-11 15:52:49 normalization >
2022-07-11 15:52:49 normalization > Diagnosing dbt debug to check if destination is available for dbt and well configured (1):
2022-07-11 15:52:49 normalization >
2022-07-11 15:52:53 normalization > 15:52:53 Running with dbt=1.0.0
2022-07-11 15:52:53 normalization > dbt version: 1.0.0
2022-07-11 15:52:53 normalization > python version: 3.9.9
2022-07-11 15:52:53 normalization > python path: /usr/local/bin/python
2022-07-11 15:52:53 normalization > os info: Linux-5.13.0-1024-gcp-x86_64-with-glibc2.31
2022-07-11 15:52:53 normalization > Using profiles.yml file at /data/89696/2/normalize/profiles.yml
2022-07-11 15:52:53 normalization > Using dbt_project.yml file at /data/89696/2/normalize/dbt_project.yml
2022-07-11 15:52:53 normalization >
2022-07-11 15:52:54 normalization > Configuration:
2022-07-11 15:52:54 normalization > profiles.yml file [OK found and valid]
2022-07-11 15:52:54 normalization > dbt_project.yml file [OK found and valid]
2022-07-11 15:52:54 normalization >
2022-07-11 15:52:54 normalization > Required dependencies:
2022-07-11 15:52:54 normalization > - git [OK found]
2022-07-11 15:52:54 normalization >
2022-07-11 15:52:54 normalization > Connection:
2022-07-11 15:52:54 normalization > method: service-account-json
2022-07-11 15:52:54 normalization > database: mainapi-282501
2022-07-11 15:52:54 normalization > schema: airbyte
2022-07-11 15:52:54 normalization > location: US
2022-07-11 15:52:54 normalization > priority: interactive
2022-07-11 15:52:54 normalization > timeout_seconds: 300
2022-07-11 15:52:54 normalization > maximum_bytes_billed: None
2022-07-11 15:52:54 normalization > execution_project: mainapi-282501
2022-07-11 15:52:54 normalization > Connection test: [OK connection ok]
2022-07-11 15:52:54 normalization >
2022-07-11 15:52:54 normalization > All checks passed!
2022-07-11 15:52:54 normalization >
2022-07-11 15:52:54 normalization > Forward dbt output logs to diagnose/debug errors (0):
2022-07-11 15:52:54 normalization >
2022-07-11 15:52:54 normalization > ============================== 2022-07-11 15:51:27.820281 | 7b27d0bf-4635-4aea-a843-790631b6ecfc ==============================
2022-07-11 15:52:54 normalization > 15:51:27.820281 [info ] [MainThread]: Running with dbt=1.0.0
2022-07-11 15:52:54 normalization > 15:51:27.821004 [debug] [MainThread]: running dbt with arguments Namespace(record_timing_info=None, debug=None, log_format=None, write_json=None, use_colors=None, printer_width=None, warn_error=None, version_check=None, partial_parse=None, single_threaded=False, use_experimental_parser=None, static_parser=None, profiles_dir='/data/89696/2/normalize', send_anonymous_usage_stats=None, fail_fast=None, event_buffer_size='10000', project_dir='/data/89696/2/normalize', profile=None, target=None, vars='{}', log_cache_events=False, threads=None, select=None, exclude=None, selector_name=None, state=None, defer=None, full_refresh=False, cls=, which='run', rpc_method='run')
2022-07-11 15:52:54 normalization > 15:51:27.821486 [debug] [MainThread]: Tracking: do not track
2022-07-11 15:52:54 normalization > 15:51:27.857624 [info ] [MainThread]: Partial parse save file not found. Starting full parse.
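The single failure in the dbt run above is a BigQuery type error: transactions_in_scd tried to cast "0000-12-30T00:00:00Z", a year-zero value that falls outside BigQuery's supported TIMESTAMP range (years 0001 to 9999). A minimal diagnostic sketch for locating the offending raw rows; the raw table location below is an assumption (adjust project/dataset/table to where the raw records actually landed), _airbyte_data is the JSON column named in the transform-catalog call above, and the column list is taken from the transactions_in schema logged earlier:

```python
from google.cloud import bigquery

# Minimal diagnostic sketch. Assumptions: the raw table location and the list of
# timestamp columns; both may need adjusting for your deployment. A value is
# suspicious when it is present but SAFE_CAST to TIMESTAMP returns NULL, which is
# what happens to year-zero strings like "0000-12-30T00:00:00Z".
RAW_TABLE = "mainapi-282501.raw_achilles._airbyte_raw_transactions_in"  # assumed location
TIMESTAMP_COLUMNS = ["created", "updated", "effective_date", "settlement_date"]

client = bigquery.Client(project="mainapi-282501")

select_cols = ", ".join(
    f"JSON_EXTRACT_SCALAR(_airbyte_data, '$.{c}') AS {c}" for c in TIMESTAMP_COLUMNS
)
checks = " OR ".join(
    f"(JSON_EXTRACT_SCALAR(_airbyte_data, '$.{c}') IS NOT NULL"
    f" AND SAFE_CAST(JSON_EXTRACT_SCALAR(_airbyte_data, '$.{c}') AS TIMESTAMP) IS NULL)"
    for c in TIMESTAMP_COLUMNS
)

sql = f"SELECT _airbyte_ab_id, {select_cols} FROM `{RAW_TABLE}` WHERE {checks}"

for row in client.query(sql).result():
    print(dict(row))
```

Rows surfaced this way usually point at a placeholder or miskeyed date in the source table; correcting or nulling those values upstream lets the SCD model build.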
2022-07-11 15:52:54 normalization > 15:51:27.906147 [debug] [MainThread]: Parsing macros/configuration.sql 2022-07-11 15:52:54 normalization > 15:51:27.911081 [debug] [MainThread]: Parsing macros/should_full_refresh.sql 2022-07-11 15:52:54 normalization > 15:51:27.920035 [debug] [MainThread]: Parsing macros/incremental.sql 2022-07-11 15:52:54 normalization > 15:51:27.931027 [debug] [MainThread]: Parsing macros/get_custom_schema.sql 2022-07-11 15:52:54 normalization > 15:51:27.932071 [debug] [MainThread]: Parsing macros/star_intersect.sql 2022-07-11 15:52:54 normalization > 15:51:27.942557 [debug] [MainThread]: Parsing macros/schema_tests/equal_rowcount.sql 2022-07-11 15:52:54 normalization > 15:51:27.944659 [debug] [MainThread]: Parsing macros/schema_tests/equality.sql 2022-07-11 15:52:54 normalization > 15:51:27.957164 [debug] [MainThread]: Parsing macros/cross_db_utils/except.sql 2022-07-11 15:52:54 normalization > 15:51:27.958647 [debug] [MainThread]: Parsing macros/cross_db_utils/current_timestamp.sql 2022-07-11 15:52:54 normalization > 15:51:27.959921 [debug] [MainThread]: Parsing macros/cross_db_utils/columns.sql 2022-07-11 15:52:54 normalization > 15:51:27.965786 [debug] [MainThread]: Parsing macros/cross_db_utils/hash.sql 2022-07-11 15:52:54 normalization > 15:51:27.966953 [debug] [MainThread]: Parsing macros/cross_db_utils/json_operations.sql 2022-07-11 15:52:54 normalization > 15:51:28.044711 [debug] [MainThread]: Parsing macros/cross_db_utils/quote.sql 2022-07-11 15:52:54 normalization > 15:51:28.047897 [debug] [MainThread]: Parsing macros/cross_db_utils/type_conversions.sql 2022-07-11 15:52:54 normalization > 15:51:28.064469 [debug] [MainThread]: Parsing macros/cross_db_utils/surrogate_key.sql 2022-07-11 15:52:54 normalization > 15:51:28.068044 [debug] [MainThread]: Parsing macros/cross_db_utils/datatypes.sql 2022-07-11 15:52:54 normalization > 15:51:28.091198 [debug] [MainThread]: Parsing macros/cross_db_utils/concat.sql 2022-07-11 15:52:54 normalization > 15:51:28.098562 [debug] [MainThread]: Parsing macros/cross_db_utils/array.sql 2022-07-11 15:52:54 normalization > 15:51:28.125257 [debug] [MainThread]: Parsing macros/adapters.sql 2022-07-11 15:52:54 normalization > 15:51:28.171560 [debug] [MainThread]: Parsing macros/catalog.sql 2022-07-11 15:52:54 normalization > 15:51:28.184219 [debug] [MainThread]: Parsing macros/etc.sql 2022-07-11 15:52:54 normalization > 15:51:28.187463 [debug] [MainThread]: Parsing macros/materializations/table.sql 2022-07-11 15:52:54 normalization > 15:51:28.194651 [debug] [MainThread]: Parsing macros/materializations/copy.sql 2022-07-11 15:52:54 normalization > 15:51:28.199166 [debug] [MainThread]: Parsing macros/materializations/seed.sql 2022-07-11 15:52:54 normalization > 15:51:28.203606 [debug] [MainThread]: Parsing macros/materializations/incremental.sql 2022-07-11 15:52:54 normalization > 15:51:28.231869 [debug] [MainThread]: Parsing macros/materializations/view.sql 2022-07-11 15:52:54 normalization > 15:51:28.236624 [debug] [MainThread]: Parsing macros/materializations/snapshot.sql 2022-07-11 15:52:54 normalization > 15:51:28.239506 [debug] [MainThread]: Parsing macros/etc/statement.sql 2022-07-11 15:52:54 normalization > 15:51:28.247340 [debug] [MainThread]: Parsing macros/etc/datetime.sql 2022-07-11 15:52:54 normalization > 15:51:28.263489 [debug] [MainThread]: Parsing macros/materializations/configs.sql 2022-07-11 15:52:54 normalization > 15:51:28.267526 [debug] [MainThread]: Parsing macros/materializations/hooks.sql 2022-07-11 15:52:54 
normalization > 15:51:28.274406 [debug] [MainThread]: Parsing macros/materializations/tests/where_subquery.sql 2022-07-11 15:52:54 normalization > 15:51:28.277489 [debug] [MainThread]: Parsing macros/materializations/tests/helpers.sql 2022-07-11 15:52:54 normalization > 15:51:28.280567 [debug] [MainThread]: Parsing macros/materializations/tests/test.sql 2022-07-11 15:52:54 normalization > 15:51:28.288884 [debug] [MainThread]: Parsing macros/materializations/seeds/seed.sql 2022-07-11 15:52:54 normalization > 15:51:28.300074 [debug] [MainThread]: Parsing macros/materializations/seeds/helpers.sql 2022-07-11 15:52:54 normalization > 15:51:28.331591 [debug] [MainThread]: Parsing macros/materializations/models/table/table.sql 2022-07-11 15:52:54 normalization > 15:51:28.344742 [debug] [MainThread]: Parsing macros/materializations/models/table/create_table_as.sql 2022-07-11 15:52:54 normalization > 15:51:28.349765 [debug] [MainThread]: Parsing macros/materializations/models/incremental/incremental.sql 2022-07-11 15:52:54 normalization > 15:51:28.368282 [debug] [MainThread]: Parsing macros/materializations/models/incremental/column_helpers.sql 2022-07-11 15:52:54 normalization > 15:51:28.376382 [debug] [MainThread]: Parsing macros/materializations/models/incremental/merge.sql 2022-07-11 15:52:54 normalization > 15:51:28.398495 [debug] [MainThread]: Parsing macros/materializations/models/incremental/on_schema_change.sql 2022-07-11 15:52:54 normalization > 15:51:28.430498 [debug] [MainThread]: Parsing macros/materializations/models/incremental/is_incremental.sql 2022-07-11 15:52:54 normalization > 15:51:28.433341 [debug] [MainThread]: Parsing macros/materializations/models/view/create_or_replace_view.sql 2022-07-11 15:52:54 normalization > 15:51:28.438171 [debug] [MainThread]: Parsing macros/materializations/models/view/create_view_as.sql 2022-07-11 15:52:54 normalization > 15:51:28.442105 [debug] [MainThread]: Parsing macros/materializations/models/view/view.sql 2022-07-11 15:52:54 normalization > 15:51:28.454860 [debug] [MainThread]: Parsing macros/materializations/models/view/helpers.sql 2022-07-11 15:52:54 normalization > 15:51:28.458311 [debug] [MainThread]: Parsing macros/materializations/snapshots/snapshot_merge.sql 2022-07-11 15:52:54 normalization > 15:51:28.461242 [debug] [MainThread]: Parsing macros/materializations/snapshots/snapshot.sql 2022-07-11 15:52:54 normalization > 15:51:28.482957 [debug] [MainThread]: Parsing macros/materializations/snapshots/helpers.sql 2022-07-11 15:52:54 normalization > 15:51:28.505420 [debug] [MainThread]: Parsing macros/materializations/snapshots/strategies.sql 2022-07-11 15:52:54 normalization > 15:51:28.536822 [debug] [MainThread]: Parsing macros/adapters/persist_docs.sql 2022-07-11 15:52:54 normalization > 15:51:28.544990 [debug] [MainThread]: Parsing macros/adapters/columns.sql 2022-07-11 15:52:54 normalization > 15:51:28.563436 [debug] [MainThread]: Parsing macros/adapters/indexes.sql 2022-07-11 15:52:54 normalization > 15:51:28.568509 [debug] [MainThread]: Parsing macros/adapters/relation.sql 2022-07-11 15:52:54 normalization > 15:51:28.586397 [debug] [MainThread]: Parsing macros/adapters/schema.sql 2022-07-11 15:52:54 normalization > 15:51:28.590576 [debug] [MainThread]: Parsing macros/adapters/freshness.sql 2022-07-11 15:52:54 normalization > 15:51:28.596570 [debug] [MainThread]: Parsing macros/adapters/metadata.sql 2022-07-11 15:52:54 normalization > 15:51:28.609979 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_database.sql 
2022-07-11 15:52:54 normalization > 15:51:28.612863 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_schema.sql 2022-07-11 15:52:54 normalization > 15:51:28.617966 [debug] [MainThread]: Parsing macros/get_custom_name/get_custom_alias.sql 2022-07-11 15:52:54 normalization > 15:51:28.620688 [debug] [MainThread]: Parsing macros/generic_test_sql/not_null.sql 2022-07-11 15:52:54 normalization > 15:51:28.621625 [debug] [MainThread]: Parsing macros/generic_test_sql/accepted_values.sql 2022-07-11 15:52:54 normalization > 15:51:28.624183 [debug] [MainThread]: Parsing macros/generic_test_sql/unique.sql 2022-07-11 15:52:54 normalization > 15:51:28.625412 [debug] [MainThread]: Parsing macros/generic_test_sql/relationships.sql 2022-07-11 15:52:54 normalization > 15:51:28.627047 [debug] [MainThread]: Parsing tests/generic/builtin.sql 2022-07-11 15:52:54 normalization > 15:51:28.632466 [debug] [MainThread]: Parsing macros/web/get_url_path.sql 2022-07-11 15:52:54 normalization > 15:51:28.637400 [debug] [MainThread]: Parsing macros/web/get_url_host.sql 2022-07-11 15:52:54 normalization > 15:51:28.641243 [debug] [MainThread]: Parsing macros/web/get_url_parameter.sql 2022-07-11 15:52:54 normalization > 15:51:28.644016 [debug] [MainThread]: Parsing macros/materializations/insert_by_period_materialization.sql 2022-07-11 15:52:54 normalization > 15:51:28.689962 [debug] [MainThread]: Parsing macros/schema_tests/test_not_null_where.sql 2022-07-11 15:52:54 normalization > 15:51:28.692777 [debug] [MainThread]: Parsing macros/schema_tests/test_unique_where.sql 2022-07-11 15:52:54 normalization > 15:51:28.695410 [debug] [MainThread]: Parsing macros/schema_tests/at_least_one.sql 2022-07-11 15:52:54 normalization > 15:51:28.697606 [debug] [MainThread]: Parsing macros/schema_tests/not_constant.sql 2022-07-11 15:52:54 normalization > 15:51:28.699744 [debug] [MainThread]: Parsing macros/schema_tests/expression_is_true.sql 2022-07-11 15:52:54 normalization > 15:51:28.702947 [debug] [MainThread]: Parsing macros/schema_tests/recency.sql 2022-07-11 15:52:54 normalization > 15:51:28.706033 [debug] [MainThread]: Parsing macros/schema_tests/equal_rowcount.sql 2022-07-11 15:52:54 normalization > 15:51:28.709046 [debug] [MainThread]: Parsing macros/schema_tests/not_null_proportion.sql 2022-07-11 15:52:54 normalization > 15:51:28.713068 [debug] [MainThread]: Parsing macros/schema_tests/accepted_range.sql 2022-07-11 15:52:54 normalization > 15:51:28.718399 [debug] [MainThread]: Parsing macros/schema_tests/not_accepted_values.sql 2022-07-11 15:52:54 normalization > 15:51:28.722301 [debug] [MainThread]: Parsing macros/schema_tests/cardinality_equality.sql 2022-07-11 15:52:54 normalization > 15:51:28.725891 [debug] [MainThread]: Parsing macros/schema_tests/unique_combination_of_columns.sql 2022-07-11 15:52:54 normalization > 15:51:28.731808 [debug] [MainThread]: Parsing macros/schema_tests/mutually_exclusive_ranges.sql 2022-07-11 15:52:54 normalization > 15:51:28.749947 [debug] [MainThread]: Parsing macros/schema_tests/fewer_rows_than.sql 2022-07-11 15:52:54 normalization > 15:51:28.753368 [debug] [MainThread]: Parsing macros/schema_tests/equality.sql 2022-07-11 15:52:54 normalization > 15:51:28.760200 [debug] [MainThread]: Parsing macros/schema_tests/relationships_where.sql 2022-07-11 15:52:54 normalization > 15:51:28.764238 [debug] [MainThread]: Parsing macros/schema_tests/sequential_values.sql 2022-07-11 15:52:54 normalization > 15:51:28.769573 [debug] [MainThread]: Parsing macros/cross_db_utils/except.sql 2022-07-11 
15:52:54 normalization > 15:51:28.771393 [debug] [MainThread]: Parsing macros/cross_db_utils/length.sql 2022-07-11 15:52:54 normalization > 15:51:28.773561 [debug] [MainThread]: Parsing macros/cross_db_utils/position.sql 2022-07-11 15:52:54 normalization > 15:51:28.776240 [debug] [MainThread]: Parsing macros/cross_db_utils/current_timestamp.sql 2022-07-11 15:52:54 normalization > 15:51:28.782293 [debug] [MainThread]: Parsing macros/cross_db_utils/intersect.sql 2022-07-11 15:52:54 normalization > 15:51:28.784313 [debug] [MainThread]: Parsing macros/cross_db_utils/replace.sql 2022-07-11 15:52:54 normalization > 15:51:28.786436 [debug] [MainThread]: Parsing macros/cross_db_utils/escape_single_quotes.sql 2022-07-11 15:52:54 normalization > 15:51:28.789572 [debug] [MainThread]: Parsing macros/cross_db_utils/any_value.sql 2022-07-11 15:52:54 normalization > 15:51:28.791824 [debug] [MainThread]: Parsing macros/cross_db_utils/last_day.sql 2022-07-11 15:52:54 normalization > 15:51:28.798398 [debug] [MainThread]: Parsing macros/cross_db_utils/cast_bool_to_text.sql 2022-07-11 15:52:54 normalization > 15:51:28.800967 [debug] [MainThread]: Parsing macros/cross_db_utils/dateadd.sql 2022-07-11 15:52:54 normalization > 15:51:28.806136 [debug] [MainThread]: Parsing macros/cross_db_utils/literal.sql 2022-07-11 15:52:54 normalization > 15:51:28.807795 [debug] [MainThread]: Parsing macros/cross_db_utils/safe_cast.sql 2022-07-11 15:52:54 normalization > 15:51:28.810967 [debug] [MainThread]: Parsing macros/cross_db_utils/date_trunc.sql 2022-07-11 15:52:54 normalization > 15:51:28.813522 [debug] [MainThread]: Parsing macros/cross_db_utils/bool_or.sql 2022-07-11 15:52:54 normalization > 15:51:28.816370 [debug] [MainThread]: Parsing macros/cross_db_utils/width_bucket.sql 2022-07-11 15:52:54 normalization > 15:51:28.826719 [debug] [MainThread]: Parsing macros/cross_db_utils/hash.sql 2022-07-11 15:52:54 normalization > 15:51:28.829123 [debug] [MainThread]: Parsing macros/cross_db_utils/_is_ephemeral.sql 2022-07-11 15:52:54 normalization > 15:51:28.832542 [debug] [MainThread]: Parsing macros/cross_db_utils/_is_relation.sql 2022-07-11 15:52:54 normalization > 15:51:28.834473 [debug] [MainThread]: Parsing macros/cross_db_utils/right.sql 2022-07-11 15:52:54 normalization > 15:51:28.838885 [debug] [MainThread]: Parsing macros/cross_db_utils/split_part.sql 2022-07-11 15:52:54 normalization > 15:51:28.842065 [debug] [MainThread]: Parsing macros/cross_db_utils/datediff.sql 2022-07-11 15:52:54 normalization > 15:51:28.861248 [debug] [MainThread]: Parsing macros/cross_db_utils/datatypes.sql 2022-07-11 15:52:54 normalization > 15:51:28.872001 [debug] [MainThread]: Parsing macros/cross_db_utils/concat.sql 2022-07-11 15:52:54 normalization > 15:51:28.873708 [debug] [MainThread]: Parsing macros/cross_db_utils/identifier.sql 2022-07-11 15:52:54 normalization > 15:51:28.876525 [debug] [MainThread]: Parsing macros/sql/get_tables_by_prefix_sql.sql 2022-07-11 15:52:54 normalization > 15:51:28.879402 [debug] [MainThread]: Parsing macros/sql/get_column_values.sql 2022-07-11 15:52:54 normalization > 15:51:28.888799 [debug] [MainThread]: Parsing macros/sql/get_query_results_as_dict.sql 2022-07-11 15:52:54 normalization > 15:51:28.893067 [debug] [MainThread]: Parsing macros/sql/get_relations_by_pattern.sql 2022-07-11 15:52:54 normalization > 15:51:28.899192 [debug] [MainThread]: Parsing macros/sql/get_relations_by_prefix.sql 2022-07-11 15:52:54 normalization > 15:51:28.905370 [debug] [MainThread]: Parsing macros/sql/haversine_distance.sql 
2022-07-11 15:52:54 normalization > 15:51:28.916071 [debug] [MainThread]: Parsing macros/sql/get_tables_by_pattern_sql.sql 2022-07-11 15:52:54 normalization > 15:51:28.928059 [debug] [MainThread]: Parsing macros/sql/pivot.sql 2022-07-11 15:52:54 normalization > 15:51:28.935561 [debug] [MainThread]: Parsing macros/sql/date_spine.sql 2022-07-11 15:52:54 normalization > 15:51:28.942960 [debug] [MainThread]: Parsing macros/sql/star.sql 2022-07-11 15:52:54 normalization > 15:51:28.950737 [debug] [MainThread]: Parsing macros/sql/union.sql 2022-07-11 15:52:54 normalization > 15:51:28.970793 [debug] [MainThread]: Parsing macros/sql/get_table_types_sql.sql 2022-07-11 15:52:54 normalization > 15:51:28.973279 [debug] [MainThread]: Parsing macros/sql/safe_add.sql 2022-07-11 15:52:54 normalization > 15:51:28.976078 [debug] [MainThread]: Parsing macros/sql/surrogate_key.sql 2022-07-11 15:52:54 normalization > 15:51:28.982104 [debug] [MainThread]: Parsing macros/sql/groupby.sql 2022-07-11 15:52:54 normalization > 15:51:28.984377 [debug] [MainThread]: Parsing macros/sql/generate_series.sql 2022-07-11 15:52:54 normalization > 15:51:28.992210 [debug] [MainThread]: Parsing macros/sql/nullcheck.sql 2022-07-11 15:52:54 normalization > 15:51:28.995389 [debug] [MainThread]: Parsing macros/sql/unpivot.sql 2022-07-11 15:52:54 normalization > 15:51:29.010191 [debug] [MainThread]: Parsing macros/sql/nullcheck_table.sql 2022-07-11 15:52:54 normalization > 15:51:29.013105 [debug] [MainThread]: Parsing macros/jinja_helpers/log_info.sql 2022-07-11 15:52:54 normalization > 15:51:29.015013 [debug] [MainThread]: Parsing macros/jinja_helpers/pretty_time.sql 2022-07-11 15:52:54 normalization > 15:51:29.017115 [debug] [MainThread]: Parsing macros/jinja_helpers/slugify.sql 2022-07-11 15:52:54 normalization > 15:51:29.019558 [debug] [MainThread]: Parsing macros/jinja_helpers/pretty_log_format.sql 2022-07-11 15:52:54 normalization > 15:51:29.748307 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/transactions_out_scd.sql 2022-07-11 15:52:54 normalization > 15:51:29.834062 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/transactions_out_scd.sql 2022-07-11 15:52:54 normalization > 15:51:29.836969 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/bank_config_scd.sql 2022-07-11 15:52:54 normalization > 15:51:29.871783 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/bank_config_scd.sql 2022-07-11 15:52:54 normalization > 15:51:29.874595 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:52:54 normalization > 15:51:29.911167 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:52:54 normalization > 15:51:29.914057 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/files_out_scd.sql 2022-07-11 15:52:54 normalization > 15:51:30.007567 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/files_out_scd.sql 2022-07-11 15:52:54 normalization > 15:51:30.010476 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/files_in_scd.sql 2022-07-11 15:52:54 normalization > 15:51:30.045945 
[debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/files_in_scd.sql 2022-07-11 15:52:54 normalization > 15:51:30.048780 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/scd/raw_achilles/partner_config_scd.sql 2022-07-11 15:52:54 normalization > 15:51:30.084809 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/scd/raw_achilles/partner_config_scd.sql 2022-07-11 15:52:54 normalization > 15:51:30.087289 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/files_out.sql 2022-07-11 15:52:54 normalization > 15:51:30.104275 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/files_out.sql 2022-07-11 15:52:54 normalization > 15:51:30.106442 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/files_in.sql 2022-07-11 15:52:54 normalization > 15:51:30.120294 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/files_in.sql 2022-07-11 15:52:54 normalization > 15:51:30.122582 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/bank_config.sql 2022-07-11 15:52:54 normalization > 15:51:30.134604 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/bank_config.sql 2022-07-11 15:52:54 normalization > 15:51:30.136776 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/partner_config.sql 2022-07-11 15:52:54 normalization > 15:51:30.148961 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/partner_config.sql 2022-07-11 15:52:54 normalization > 15:51:30.151211 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/transactions_in.sql 2022-07-11 15:52:54 normalization > 15:51:30.162792 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/transactions_in.sql 2022-07-11 15:52:54 normalization > 15:51:30.164842 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_incremental/raw_achilles/transactions_out.sql 2022-07-11 15:52:54 normalization > 15:51:30.175951 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_incremental/raw_achilles/transactions_out.sql 2022-07-11 15:52:54 normalization > 15:51:30.178053 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_out_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.214488 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_out_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.216868 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_in_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.253251 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_in_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.255567 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_out_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.284778 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_out_ab2.sql 2022-07-11 15:52:54 
normalization > 15:51:30.287077 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/bank_config_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.304613 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/bank_config_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.306840 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/partner_config_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.333952 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/partner_config_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.336296 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/bank_config_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.358752 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/bank_config_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.360904 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/partner_config_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.380047 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/partner_config_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.382528 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_out_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.430523 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_out_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.432892 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/files_in_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.457599 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/files_in_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.460616 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_in_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.509394 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_in_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.512830 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_in_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.589539 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_in_ab1.sql 2022-07-11 15:52:54 normalization > 15:51:30.592114 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_ctes/raw_achilles/transactions_out_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.622633 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_ctes/raw_achilles/transactions_out_ab2.sql 2022-07-11 15:52:54 normalization > 15:51:30.625034 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/transactions_in_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.662917 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/transactions_in_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.665055 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/bank_config_stg.sql 2022-07-11 15:52:54 
normalization > 15:51:30.680517 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/bank_config_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.682757 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/partner_config_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.699583 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/partner_config_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.701771 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/transactions_out_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.725402 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/transactions_out_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.727631 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/files_out_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.745691 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/files_out_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.747916 [debug] [MainThread]: 1603: static parser failed on generated/airbyte_views/raw_achilles/files_in_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.768137 [debug] [MainThread]: 1602: parser fallback to jinja rendering on generated/airbyte_views/raw_achilles/files_in_stg.sql 2022-07-11 15:52:54 normalization > 15:51:30.887915 [warn ] [MainThread]: [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources. 2022-07-11 15:52:54 normalization > There are 1 unused configuration paths: 2022-07-11 15:52:54 normalization > - models.airbyte_utils.generated.airbyte_tables 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 15:51:30.923248 [info ] [MainThread]: Found 30 models, 0 tests, 0 snapshots, 0 analyses, 549 macros, 0 operations, 0 seed files, 6 sources, 0 exposures, 0 metrics 2022-07-11 15:52:54 normalization > 15:51:30.927541 [info ] [MainThread]: 2022-07-11 15:52:54 normalization > 15:51:30.928818 [debug] [MainThread]: Acquiring new bigquery connection "master" 2022-07-11 15:52:54 normalization > 15:51:30.931817 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501" 2022-07-11 15:52:54 normalization > 15:51:30.932440 [debug] [ThreadPool]: Opening a new connection, currently in state init 2022-07-11 15:52:54 normalization > 15:51:30.933359 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501" 2022-07-11 15:52:54 normalization > 15:51:30.934249 [debug] [ThreadPool]: Opening a new connection, currently in state init 2022-07-11 15:52:54 normalization > 15:51:31.602445 [debug] [ThreadPool]: Acquiring new bigquery connection "create_mainapi-282501__airbyte_raw_achilles" 2022-07-11 15:52:54 normalization > 15:51:31.603333 [debug] [ThreadPool]: Acquiring new bigquery connection "create_mainapi-282501__airbyte_raw_achilles" 2022-07-11 15:52:54 normalization > 15:51:31.603562 [debug] [ThreadPool]: BigQuery adapter: Creating schema "mainapi-282501._airbyte_raw_achilles". 
2022-07-11 15:52:54 normalization > 15:51:31.603775 [debug] [ThreadPool]: Opening a new connection, currently in state closed 2022-07-11 15:52:54 normalization > 15:51:31.818461 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501__airbyte_raw_achilles" 2022-07-11 15:52:54 normalization > 15:51:31.819170 [debug] [ThreadPool]: Opening a new connection, currently in state closed 2022-07-11 15:52:54 normalization > 15:51:31.819996 [debug] [ThreadPool]: Acquiring new bigquery connection "list_mainapi-282501_raw_achilles" 2022-07-11 15:52:54 normalization > 15:51:31.821040 [debug] [ThreadPool]: Opening a new connection, currently in state closed 2022-07-11 15:52:54 normalization > 15:51:31.940439 [info ] [MainThread]: Concurrency: 8 threads (target='prod') 2022-07-11 15:52:54 normalization > 15:51:31.940982 [info ] [MainThread]: 2022-07-11 15:52:54 normalization > 15:51:31.969461 [debug] [Thread-1 ]: Began running node model.airbyte_utils.bank_config_ab1 2022-07-11 15:52:54 normalization > 15:51:31.969848 [debug] [Thread-2 ]: Began running node model.airbyte_utils.files_in_ab1 2022-07-11 15:52:54 normalization > 15:51:31.971109 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_ab1" 2022-07-11 15:52:54 normalization > 15:51:31.971396 [debug] [Thread-3 ]: Began running node model.airbyte_utils.files_out_ab1 2022-07-11 15:52:54 normalization > 15:51:31.971832 [debug] [Thread-4 ]: Began running node model.airbyte_utils.partner_config_ab1 2022-07-11 15:52:54 normalization > 15:51:31.972738 [debug] [Thread-2 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_ab1" 2022-07-11 15:52:54 normalization > 15:51:31.973008 [debug] [Thread-5 ]: Began running node model.airbyte_utils.transactions_in_ab1 2022-07-11 15:52:54 normalization > 15:51:31.973347 [debug] [Thread-6 ]: Began running node model.airbyte_utils.transactions_out_ab1 2022-07-11 15:52:54 normalization > 15:51:31.973699 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.bank_config_ab1 2022-07-11 15:52:54 normalization > 15:51:31.974747 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_ab1" 2022-07-11 15:52:54 normalization > 15:51:31.975727 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_ab1" 2022-07-11 15:52:54 normalization > 15:51:31.976057 [debug] [Thread-2 ]: Began compiling node model.airbyte_utils.files_in_ab1 2022-07-11 15:52:54 normalization > 15:51:31.976962 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_ab1" 2022-07-11 15:52:54 normalization > 15:51:31.977820 [debug] [Thread-6 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_ab1" 2022-07-11 15:52:54 normalization > 15:51:31.978239 [debug] [Thread-1 ]: Compiling model.airbyte_utils.bank_config_ab1 2022-07-11 15:52:54 normalization > 15:51:31.978664 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.files_out_ab1 2022-07-11 15:52:54 normalization > 15:51:31.978977 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.partner_config_ab1 2022-07-11 15:52:54 normalization > 15:51:31.979329 [debug] [Thread-2 ]: Compiling model.airbyte_utils.files_in_ab1 2022-07-11 15:52:54 normalization > 15:51:31.979628 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.transactions_in_ab1 2022-07-11 15:52:54 normalization > 15:51:31.979908 [debug] [Thread-6 ]: Began compiling node model.airbyte_utils.transactions_out_ab1 2022-07-11 15:52:54 
normalization > 15:51:31.995932 [debug] [Thread-3 ]: Compiling model.airbyte_utils.files_out_ab1 2022-07-11 15:52:54 normalization > 15:51:32.004208 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_ab1" 2022-07-11 15:52:54 normalization > 15:51:32.004750 [debug] [Thread-4 ]: Compiling model.airbyte_utils.partner_config_ab1 2022-07-11 15:52:54 normalization > 15:51:32.008186 [debug] [Thread-5 ]: Compiling model.airbyte_utils.transactions_in_ab1 2022-07-11 15:52:54 normalization > 15:51:32.008989 [debug] [Thread-6 ]: Compiling model.airbyte_utils.transactions_out_ab1 2022-07-11 15:52:54 normalization > 15:51:32.041109 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.files_out_ab1" 2022-07-11 15:52:54 normalization > 15:51:32.132042 [debug] [Thread-2 ]: Writing injected SQL for node "model.airbyte_utils.files_in_ab1" 2022-07-11 15:52:54 normalization > 15:51:32.159547 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_ab1" 2022-07-11 15:52:54 normalization > 15:51:32.175016 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.284646 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.300082 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.318615 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.bank_config_ab1 2022-07-11 15:52:54 normalization > 15:51:32.338797 [debug] [Thread-2 ]: Finished running node model.airbyte_utils.files_in_ab1 2022-07-11 15:52:54 normalization > 15:51:32.340272 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_ab1" 2022-07-11 15:52:54 normalization > 15:51:32.342582 [debug] [Thread-6 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_ab1" 2022-07-11 15:52:54 normalization > 15:51:32.344261 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.files_out_ab1 2022-07-11 15:52:54 normalization > 15:51:32.344738 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.346746 [debug] [Thread-8 ]: Began running node model.airbyte_utils.bank_config_ab2 2022-07-11 15:52:54 normalization > 15:51:32.347649 [debug] [Thread-7 ]: Began running node model.airbyte_utils.files_in_ab2 2022-07-11 15:52:54 normalization > 15:51:32.350063 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.350417 [debug] [Thread-2 ]: Began running node model.airbyte_utils.files_out_ab2 2022-07-11 15:52:54 normalization > 15:51:32.350673 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.352119 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.partner_config_ab1 2022-07-11 15:52:54 normalization > 15:51:32.353351 [debug] [Thread-8 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.354395 [debug] [Thread-7 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.355507 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.transactions_out_ab1 2022-07-11 15:52:54 normalization > 15:51:32.356415 [debug] [Thread-2 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.357276 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.transactions_in_ab1 
2022-07-11 15:52:54 normalization > 15:51:32.358188 [debug] [Thread-3 ]: Began running node model.airbyte_utils.partner_config_ab2 2022-07-11 15:52:54 normalization > 15:51:32.358566 [debug] [Thread-8 ]: Began compiling node model.airbyte_utils.bank_config_ab2 2022-07-11 15:52:54 normalization > 15:51:32.358957 [debug] [Thread-7 ]: Began compiling node model.airbyte_utils.files_in_ab2 2022-07-11 15:52:54 normalization > 15:51:32.360024 [debug] [Thread-4 ]: Began running node model.airbyte_utils.transactions_out_ab2 2022-07-11 15:52:54 normalization > 15:51:32.360396 [debug] [Thread-2 ]: Began compiling node model.airbyte_utils.files_out_ab2 2022-07-11 15:52:54 normalization > 15:51:32.361253 [debug] [Thread-6 ]: Began running node model.airbyte_utils.transactions_in_ab2 2022-07-11 15:52:54 normalization > 15:51:32.362148 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.362592 [debug] [Thread-8 ]: Compiling model.airbyte_utils.bank_config_ab2 2022-07-11 15:52:54 normalization > 15:51:32.362910 [debug] [Thread-7 ]: Compiling model.airbyte_utils.files_in_ab2 2022-07-11 15:52:54 normalization > 15:51:32.363709 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.364046 [debug] [Thread-2 ]: Compiling model.airbyte_utils.files_out_ab2 2022-07-11 15:52:54 normalization > 15:51:32.364850 [debug] [Thread-6 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.365140 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.partner_config_ab2 2022-07-11 15:52:54 normalization > 15:51:32.423973 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.transactions_out_ab2 2022-07-11 15:52:54 normalization > 15:51:32.438084 [debug] [Thread-8 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.494534 [debug] [Thread-7 ]: Writing injected SQL for node "model.airbyte_utils.files_in_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.494896 [debug] [Thread-6 ]: Began compiling node model.airbyte_utils.transactions_in_ab2 2022-07-11 15:52:54 normalization > 15:51:32.516017 [debug] [Thread-2 ]: Writing injected SQL for node "model.airbyte_utils.files_out_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.516379 [debug] [Thread-3 ]: Compiling model.airbyte_utils.partner_config_ab2 2022-07-11 15:52:54 normalization > 15:51:32.516718 [debug] [Thread-4 ]: Compiling model.airbyte_utils.transactions_out_ab2 2022-07-11 15:52:54 normalization > 15:51:32.517665 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.517973 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.518190 [debug] [Thread-6 ]: Compiling model.airbyte_utils.transactions_in_ab2 2022-07-11 15:52:54 normalization > 15:51:32.539661 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.574191 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.575272 [debug] [Thread-8 ]: Finished running node model.airbyte_utils.bank_config_ab2 2022-07-11 15:52:54 normalization > 15:51:32.601145 [debug] [Thread-7 ]: Finished running node model.airbyte_utils.files_in_ab2 2022-07-11 15:52:54 normalization > 15:51:32.658358 [debug] 
[Thread-2 ]: Finished running node model.airbyte_utils.files_out_ab2 2022-07-11 15:52:54 normalization > 15:51:32.677119 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.709690 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.747453 [debug] [Thread-5 ]: Began running node model.airbyte_utils.bank_config_stg 2022-07-11 15:52:54 normalization > 15:51:32.753331 [debug] [Thread-1 ]: Began running node model.airbyte_utils.files_in_stg 2022-07-11 15:52:54 normalization > 15:51:32.770555 [debug] [Thread-7 ]: Began running node model.airbyte_utils.files_out_stg 2022-07-11 15:52:54 normalization > 15:51:32.793269 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.partner_config_ab2 2022-07-11 15:52:54 normalization > 15:51:32.795172 [debug] [Thread-6 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_ab2" 2022-07-11 15:52:54 normalization > 15:51:32.795479 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.796115 [info ] [Thread-5 ]: 1 of 18 START view model _airbyte_raw_achilles.bank_config_stg.......................................................... [RUN] 2022-07-11 15:52:54 normalization > 15:51:32.796601 [info ] [Thread-1 ]: 2 of 18 START view model _airbyte_raw_achilles.files_in_stg............................................................. [RUN] 2022-07-11 15:52:54 normalization > 15:51:32.797025 [info ] [Thread-7 ]: 3 of 18 START view model _airbyte_raw_achilles.files_out_stg............................................................ [RUN] 2022-07-11 15:52:54 normalization > 15:51:32.798072 [debug] [Thread-2 ]: Began running node model.airbyte_utils.partner_config_stg 2022-07-11 15:52:54 normalization > 15:51:32.798718 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:32.799631 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.transactions_out_ab2 2022-07-11 15:52:54 normalization > 15:51:32.800936 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_stg" 2022-07-11 15:52:54 normalization > 15:51:32.802244 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_stg" 2022-07-11 15:52:54 normalization > 15:51:32.803349 [info ] [Thread-2 ]: 4 of 18 START view model _airbyte_raw_achilles.partner_config_stg....................................................... 
[RUN] 2022-07-11 15:52:54 normalization > 15:51:32.804435 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.transactions_in_ab2 2022-07-11 15:52:54 normalization > 15:51:32.805147 [debug] [Thread-7 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_stg" 2022-07-11 15:52:54 normalization > 15:51:32.806024 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.bank_config_stg 2022-07-11 15:52:54 normalization > 15:51:32.806288 [debug] [Thread-3 ]: Began running node model.airbyte_utils.transactions_out_stg 2022-07-11 15:52:54 normalization > 15:51:32.806595 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.files_in_stg 2022-07-11 15:52:54 normalization > 15:51:32.807858 [debug] [Thread-2 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_stg" 2022-07-11 15:52:54 normalization > 15:51:32.808681 [debug] [Thread-4 ]: Began running node model.airbyte_utils.transactions_in_stg 2022-07-11 15:52:54 normalization > 15:51:32.808904 [debug] [Thread-7 ]: Began compiling node model.airbyte_utils.files_out_stg 2022-07-11 15:52:54 normalization > 15:51:32.809274 [debug] [Thread-5 ]: Compiling model.airbyte_utils.bank_config_stg 2022-07-11 15:52:54 normalization > 15:51:32.809693 [info ] [Thread-3 ]: 5 of 18 START view model _airbyte_raw_achilles.transactions_out_stg..................................................... [RUN] 2022-07-11 15:52:54 normalization > 15:51:32.809995 [debug] [Thread-1 ]: Compiling model.airbyte_utils.files_in_stg 2022-07-11 15:52:54 normalization > 15:51:32.810357 [debug] [Thread-2 ]: Began compiling node model.airbyte_utils.partner_config_stg 2022-07-11 15:52:54 normalization > 15:51:32.810743 [info ] [Thread-4 ]: 6 of 18 START view model _airbyte_raw_achilles.transactions_in_stg...................................................... 
[RUN] 2022-07-11 15:52:54 normalization > 15:51:32.811032 [debug] [Thread-7 ]: Compiling model.airbyte_utils.files_out_stg 2022-07-11 15:52:54 normalization > 15:51:32.833258 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_stg" 2022-07-11 15:52:54 normalization > 15:51:32.874266 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_stg" 2022-07-11 15:52:54 normalization > 15:51:32.874616 [debug] [Thread-2 ]: Compiling model.airbyte_utils.partner_config_stg 2022-07-11 15:52:54 normalization > 15:51:32.896600 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_stg" 2022-07-11 15:52:54 normalization > 15:51:32.938737 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.transactions_out_stg 2022-07-11 15:52:54 normalization > 15:51:32.941478 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.files_in_stg" 2022-07-11 15:52:54 normalization > 15:51:32.965249 [debug] [Thread-7 ]: Writing injected SQL for node "model.airbyte_utils.files_out_stg" 2022-07-11 15:52:54 normalization > 15:51:32.991902 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:33.009970 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.transactions_in_stg 2022-07-11 15:52:54 normalization > 15:51:33.013773 [debug] [Thread-2 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_stg" 2022-07-11 15:52:54 normalization > 15:51:33.014184 [debug] [Thread-3 ]: Compiling model.airbyte_utils.transactions_out_stg 2022-07-11 15:52:54 normalization > 15:51:33.015277 [debug] [Thread-5 ]: Began executing node model.airbyte_utils.bank_config_stg 2022-07-11 15:52:54 normalization > 15:51:33.015531 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:33.015858 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:33.016103 [debug] [Thread-4 ]: Compiling model.airbyte_utils.transactions_in_stg 2022-07-11 15:52:54 normalization > 15:51:33.027532 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:33.075167 [debug] [Thread-1 ]: Began executing node model.airbyte_utils.files_in_stg 2022-07-11 15:52:54 normalization > 15:51:33.148243 [debug] [Thread-5 ]: Writing runtime SQL for node "model.airbyte_utils.bank_config_stg" 2022-07-11 15:52:54 normalization > 15:51:33.153984 [debug] [Thread-7 ]: Began executing node model.airbyte_utils.files_out_stg 2022-07-11 15:52:54 normalization > 15:51:33.167969 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_stg" 2022-07-11 15:52:54 normalization > 15:51:33.189219 [debug] [Thread-2 ]: Began executing node model.airbyte_utils.partner_config_stg 2022-07-11 15:52:54 normalization > 15:51:33.220220 [debug] [Thread-1 ]: Writing runtime SQL for node "model.airbyte_utils.files_in_stg" 2022-07-11 15:52:54 normalization > 15:51:33.252839 [debug] [Thread-7 ]: Writing runtime SQL for node "model.airbyte_utils.files_out_stg" 2022-07-11 15:52:54 normalization > 15:51:33.285673 [debug] [Thread-2 ]: Writing runtime SQL for node "model.airbyte_utils.partner_config_stg" 2022-07-11 15:52:54 normalization > 15:51:33.285998 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:33.286660 [debug] [Thread-5 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:54 normalization > 15:51:33.323713 [debug] 
[Thread-7 ]: Opening a new connection, currently in state closed
2022-07-11 15:52:54 normalization > 15:51:33.324156 [debug] [Thread-1 ]: Opening a new connection, currently in state closed
2022-07-11 15:52:54 normalization > 15:51:33.366515 [debug] [Thread-2 ]: Opening a new connection, currently in state closed
2022-07-11 15:52:54 normalization > 15:51:33.368288 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_stg"
2022-07-11 15:52:54 normalization > 15:51:33.368573 [debug] [Thread-3 ]: Began executing node model.airbyte_utils.transactions_out_stg
2022-07-11 15:52:54 normalization > 15:51:33.370619 [debug] [Thread-4 ]: finished collecting timing info
2022-07-11 15:52:54 normalization > 15:51:33.384095 [debug] [Thread-4 ]: Began executing node model.airbyte_utils.transactions_in_stg
2022-07-11 15:52:54 normalization > 15:51:33.383229 [debug] [Thread-7 ]: On model.airbyte_utils.files_out_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_stg"} */

create or replace view `mainapi-282501`._airbyte_raw_achilles.`files_out_stg`
OPTIONS()
as
with __dbt__cte__files_out_ab1 as (
-- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema
-- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_files_out
select
    json_extract_scalar(_airbyte_data, "$['id']") as id,
    json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id,
    json_extract_scalar(_airbyte_data, "$['created']") as created,
    json_extract_scalar(_airbyte_data, "$['updated']") as updated,
    json_extract_scalar(_airbyte_data, "$['file_hash']") as file_hash,
    json_extract_scalar(_airbyte_data, "$['file_name']") as file_name,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn,
    json_extract_scalar(_airbyte_data, "$['batch_count']") as batch_count,
    json_extract_scalar(_airbyte_data, "$['exchange_window']") as exchange_window,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
from `mainapi-282501`.raw_achilles._airbyte_raw_files_out as table_alias
-- files_out
where 1 = 1
), __dbt__cte__files_out_ab2 as (
-- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
-- depends_on: __dbt__cte__files_out_ab1
select
    cast(id as float64) as id,
    cast(bank_id as float64) as bank_id,
    cast(nullif(created, '') as timestamp) as created,
    cast(nullif(updated, '') as timestamp) as updated,
    cast(file_hash as string) as file_hash,
    cast(file_name as string) as file_name,
    cast(_ab_cdc_lsn as float64) as _ab_cdc_lsn,
    cast(batch_count as float64) as batch_count,
    cast(nullif(exchange_window, '') as timestamp) as exchange_window,
    cast(_ab_cdc_deleted_at as string) as _ab_cdc_deleted_at,
    cast(_ab_cdc_updated_at as string) as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
from __dbt__cte__files_out_ab1
-- files_out
where 1 = 1
)-- SQL model to build a hash column based on the values of this record
-- depends_on: __dbt__cte__files_out_ab2
select
    to_hex(md5(cast(concat(coalesce(cast(id as string), ''), '-', coalesce(cast(bank_id as string), ''), '-', coalesce(cast(created as string), ''), '-', coalesce(cast(updated as string), ''), '-', coalesce(cast(file_hash as string), ''), '-', coalesce(cast(file_name as string), ''), '-', coalesce(cast(_ab_cdc_lsn as string), ''), '-', coalesce(cast(batch_count as string), ''), '-', coalesce(cast(exchange_window as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-', coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_files_out_hashid,
    tmp.*
from __dbt__cte__files_out_ab2 tmp
-- files_out
where 1 = 1
;

2022-07-11 15:52:54 normalization > 15:51:33.383674 [debug] [Thread-2 ]: On model.airbyte_utils.partner_config_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_stg"} */

create or replace view `mainapi-282501`._airbyte_raw_achilles.`partner_config_stg`
OPTIONS()
as
with __dbt__cte__partner_config_ab1 as (
-- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema
-- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_partner_config
select
    json_extract_scalar(_airbyte_data, "$['name']") as name,
    json_extract_scalar(_airbyte_data, "$['config']") as config,
    json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id,
    json_extract_scalar(_airbyte_data, "$['created']") as created,
    json_extract_scalar(_airbyte_data, "$['updated']") as updated,
    json_extract_scalar(_airbyte_data, "$['partner_id']") as partner_id,
    json_extract_scalar(_airbyte_data, "$['routing_no']") as routing_no,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn,
    json_extract_scalar(_airbyte_data, "$['account_prefix']") as account_prefix,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
from `mainapi-282501`.raw_achilles._airbyte_raw_partner_config as table_alias
-- partner_config
where 1 = 1
), __dbt__cte__partner_config_ab2 as (
-- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
-- depends_on: __dbt__cte__partner_config_ab1
select
    cast(name as string) as name,
    cast(config as string) as config,
    cast(bank_id as float64) as bank_id,
    cast(nullif(created, '') as timestamp) as created,
    cast(nullif(updated, '') as timestamp) as updated,
    cast(partner_id as float64) as partner_id,
    cast(routing_no as string) as routing_no,
    cast(_ab_cdc_lsn as float64) as _ab_cdc_lsn,
    cast(account_prefix as string) as account_prefix,
    cast(_ab_cdc_deleted_at as string) as _ab_cdc_deleted_at,
    cast(_ab_cdc_updated_at as string) as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
from __dbt__cte__partner_config_ab1
-- partner_config
where 1 = 1
)-- SQL model to build a hash column based on the values of this record
-- depends_on: __dbt__cte__partner_config_ab2
select
    to_hex(md5(cast(concat(coalesce(cast(name as string), ''), '-', coalesce(cast(config as string), ''), '-', coalesce(cast(bank_id as string), ''), '-', coalesce(cast(created as string), ''), '-', coalesce(cast(updated as string), ''), '-', coalesce(cast(partner_id as string), ''), '-', coalesce(cast(routing_no as string), ''), '-', coalesce(cast(_ab_cdc_lsn as string), ''), '-', coalesce(cast(account_prefix as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-', coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_partner_config_hashid,
    tmp.*
from __dbt__cte__partner_config_ab2 tmp
-- partner_config
where 1 = 1
;

2022-07-11 15:52:54 normalization > 15:51:33.382648 [debug] [Thread-3 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_out_stg"
2022-07-11 15:52:54 normalization > 15:51:33.392101 [debug] [Thread-4 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_in_stg"
2022-07-11 15:52:54 normalization > 15:51:33.392651 [debug] [Thread-5 ]: On model.airbyte_utils.bank_config_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_stg"} */

create or replace view `mainapi-282501`._airbyte_raw_achilles.`bank_config_stg`
OPTIONS()
as
with __dbt__cte__bank_config_ab1 as (
-- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema
-- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_bank_config
select
    json_extract_scalar(_airbyte_data, "$['name']") as name,
    json_extract_scalar(_airbyte_data, "$['config']") as config,
    json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id,
    json_extract_scalar(_airbyte_data, "$['created']") as created,
    json_extract_scalar(_airbyte_data, "$['updated']") as updated,
    json_extract_scalar(_airbyte_data, "$['routing_no']") as routing_no,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
from `mainapi-282501`.raw_achilles._airbyte_raw_bank_config as table_alias
-- bank_config
where 1 = 1
), __dbt__cte__bank_config_ab2 as (
-- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
-- depends_on: __dbt__cte__bank_config_ab1
select
    cast(name as string) as name,
    cast(config as string) as config,
    cast(bank_id as float64) as bank_id,
    cast(nullif(created, '') as timestamp) as created,
    cast(nullif(updated, '') as timestamp) as updated,
    cast(routing_no as string) as routing_no,
    cast(_ab_cdc_lsn as float64) as _ab_cdc_lsn,
    cast(_ab_cdc_deleted_at as string) as _ab_cdc_deleted_at,
    cast(_ab_cdc_updated_at as string) as _ab_cdc_updated_at,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
from __dbt__cte__bank_config_ab1
-- bank_config
where 1 = 1
)-- SQL model to build a hash column based on the values of this record
-- depends_on: __dbt__cte__bank_config_ab2
select
    to_hex(md5(cast(concat(coalesce(cast(name as string), ''), '-', coalesce(cast(config as string), ''), '-', coalesce(cast(bank_id as string), ''), '-', coalesce(cast(created as string), ''), '-', coalesce(cast(updated as string), ''), '-', coalesce(cast(routing_no as string), ''), '-', coalesce(cast(_ab_cdc_lsn as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-', coalesce(cast(_ab_cdc_updated_at as string), '')) as string))) as _airbyte_bank_config_hashid,
    tmp.*
from __dbt__cte__bank_config_ab2 tmp
-- bank_config
where 1 = 1
;

2022-07-11 15:52:54 normalization > 15:51:33.394347 [debug] [Thread-1 ]: On model.airbyte_utils.files_in_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_stg"} */

create or replace view `mainapi-282501`._airbyte_raw_achilles.`files_in_stg`
OPTIONS()
as
with __dbt__cte__files_in_ab1 as (
-- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema
-- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_files_in
select
    json_extract_scalar(_airbyte_data, "$['id']") as id,
    json_extract_scalar(_airbyte_data, "$['ended']") as ended,
    json_extract_scalar(_airbyte_data, "$['started']") as started,
    json_extract_scalar(_airbyte_data, "$['updated']") as updated,
    json_extract_scalar(_airbyte_data, "$['file_hash']") as file_hash,
    json_extract_scalar(_airbyte_data, "$['file_name']") as file_name,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn,
    json_extract_scalar(_airbyte_data, "$['iat_entry_count']") as iat_entry_count,
    json_extract_scalar(_airbyte_data, "$['std_entry_count']") as std_entry_count,
    json_extract_scalar(_airbyte_data, "$['total_batch_count']") as total_batch_count,
    json_extract_scalar(_airbyte_data, "$['total_entry_count']") as total_entry_count,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at,
    json_extract_scalar(_airbyte_data, "$['preprocessing_path']") as preprocessing_path,
    json_extract_scalar(_airbyte_data, "$['total_debit_amount']") as total_debit_amount,
    json_extract_scalar(_airbyte_data, "$['postprocessing_path']") as postprocessing_path,
    json_extract_scalar(_airbyte_data, "$['total_credit_amount']") as total_credit_amount,
    json_extract_scalar(_airbyte_data, "$['iat_entries_processed']") as iat_entries_processed,
    json_extract_scalar(_airbyte_data, "$['std_entries_processed']") as std_entries_processed,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
from `mainapi-282501`.raw_achilles._airbyte_raw_files_in as table_alias
-- files_in
where 1 = 1
), __dbt__cte__files_in_ab2 as (
-- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
-- depends_on: __dbt__cte__files_in_ab1
select
    cast(id as float64) as id,
    cast(nullif(ended, '') as timestamp) as ended,
    cast(nullif(started, '') as timestamp) as started,
    cast(nullif(updated, '') as timestamp) as updated,
    cast(file_hash as string) as file_hash,
    cast(file_name as string) as file_name,
    cast(_ab_cdc_lsn as float64) as _ab_cdc_lsn,
    cast(iat_entry_count as float64) as iat_entry_count,
    cast(std_entry_count as float64) as std_entry_count,
    cast(total_batch_count as float64) as total_batch_count,
    cast(total_entry_count as float64) as total_entry_count,
    cast(_ab_cdc_deleted_at as string) as _ab_cdc_deleted_at,
    cast(_ab_cdc_updated_at as string) as _ab_cdc_updated_at,
    cast(preprocessing_path as string) as preprocessing_path,
    cast(total_debit_amount as float64) as total_debit_amount,
    cast(postprocessing_path as string) as postprocessing_path,
    cast(total_credit_amount as float64) as total_credit_amount,
    cast(iat_entries_processed as float64) as iat_entries_processed,
    cast(std_entries_processed as float64) as std_entries_processed,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
from __dbt__cte__files_in_ab1
-- files_in
where 1 = 1
)-- SQL model to build a hash column based on the values of this record
-- depends_on: __dbt__cte__files_in_ab2
select
    to_hex(md5(cast(concat(coalesce(cast(id as string), ''), '-', coalesce(cast(ended as string), ''), '-', coalesce(cast(started as string), ''), '-', coalesce(cast(updated as string), ''), '-', coalesce(cast(file_hash as string), ''), '-', coalesce(cast(file_name as string), ''), '-', coalesce(cast(_ab_cdc_lsn as string), ''), '-', coalesce(cast(iat_entry_count as string), ''), '-', coalesce(cast(std_entry_count as string), ''), '-', coalesce(cast(total_batch_count as string), ''), '-', coalesce(cast(total_entry_count as string), ''), '-', coalesce(cast(_ab_cdc_deleted_at as string), ''), '-', coalesce(cast(_ab_cdc_updated_at as string), ''), '-', coalesce(cast(preprocessing_path as string), ''), '-', coalesce(cast(total_debit_amount as string), ''), '-', coalesce(cast(postprocessing_path as string), ''), '-', coalesce(cast(total_credit_amount as string), ''), '-', coalesce(cast(iat_entries_processed as string), ''), '-', coalesce(cast(std_entries_processed as string), '')) as string))) as _airbyte_files_in_hashid,
    tmp.*
from __dbt__cte__files_in_ab2 tmp
-- files_in
where 1 = 1
;
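Every *_stg view compiled above follows the same three-stage shape: an _ab1 CTE that pulls each field out of the _airbyte_data JSON blob as a string with json_extract_scalar, an _ab2 CTE that casts those strings to the types declared in the stream's JSON schema, and a final select that adds a deterministic md5 surrogate key over all of the record's values. A minimal sketch of that pattern, using a hypothetical stream called some_stream with a single some_field column and placeholder project/dataset names (none of these are part of this sync), would be:

create or replace view `my-project`.`_airbyte_raw_example`.`some_stream_stg`
as
with some_stream_ab1 as (
    -- parse the raw JSON blob emitted by Airbyte into string columns
    select
        json_extract_scalar(_airbyte_data, "$['some_field']") as some_field,
        _airbyte_ab_id,
        _airbyte_emitted_at,
        CURRENT_TIMESTAMP() as _airbyte_normalized_at
    from `my-project`.`example`.`_airbyte_raw_some_stream`
), some_stream_ab2 as (
    -- cast each extracted string to the type declared in the JSON schema
    select
        cast(some_field as float64) as some_field,
        _airbyte_ab_id,
        _airbyte_emitted_at,
        CURRENT_TIMESTAMP() as _airbyte_normalized_at
    from some_stream_ab1
)
-- add a deterministic hash id built from the record's values
select
    to_hex(md5(cast(coalesce(cast(some_field as string), '') as string))) as _airbyte_some_stream_hashid,
    tmp.*
from some_stream_ab2 tmp;

The transactions_in_stg view that follows is the same pattern, just with many more columns.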
2022-07-11 15:52:54 normalization > 15:51:33.398517 [debug] [Thread-3 ]: Opening a new connection, currently in state closed
2022-07-11 15:52:54 normalization > 15:51:33.400609 [debug] [Thread-4 ]: Opening a new connection, currently in state closed
2022-07-11 15:52:54 normalization > 15:51:33.420032 [debug] [Thread-4 ]: On model.airbyte_utils.transactions_in_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_in_stg"} */

create or replace view `mainapi-282501`._airbyte_raw_achilles.`transactions_in_stg`
OPTIONS()
as
with __dbt__cte__transactions_in_ab1 as (
-- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema
-- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_transactions_in
select
    json_extract_scalar(_airbyte_data, "$['id']") as id,
    json_extract_scalar(_airbyte_data, "$['uuid']") as uuid,
    json_extract_scalar(_airbyte_data, "$['amount']") as amount,
    json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id,
    json_extract_scalar(_airbyte_data, "$['created']") as created,
    json_extract_scalar(_airbyte_data, "$['updated']") as updated,
    json_extract_scalar(_airbyte_data, "$['returned']") as returned,
    json_extract_scalar(_airbyte_data, "$['sec_code']") as sec_code,
    json_extract_scalar(_airbyte_data, "$['file_hash']") as file_hash,
    json_extract_scalar(_airbyte_data, "$['file_name']") as file_name,
    json_extract_scalar(_airbyte_data, "$['addenda_02']") as addenda_02,
    json_extract_scalar(_airbyte_data, "$['addenda_05']") as addenda_05,
    json_extract_scalar(_airbyte_data, "$['addenda_10']") as addenda_10,
    json_extract_scalar(_airbyte_data, "$['addenda_11']") as addenda_11,
    json_extract_scalar(_airbyte_data, "$['addenda_12']") as addenda_12,
    json_extract_scalar(_airbyte_data, "$['addenda_13']") as addenda_13,
    json_extract_scalar(_airbyte_data, "$['addenda_14']") as addenda_14,
    json_extract_scalar(_airbyte_data, "$['addenda_15']") as addenda_15,
    json_extract_scalar(_airbyte_data, "$['addenda_16']") as addenda_16,
    json_extract_scalar(_airbyte_data, "$['addenda_17']") as addenda_17,
    json_extract_scalar(_airbyte_data, "$['addenda_18']") as addenda_18,
    json_extract_scalar(_airbyte_data, "$['addenda_98']") as addenda_98,
    json_extract_scalar(_airbyte_data, "$['addenda_99']") as addenda_99,
    json_extract_scalar(_airbyte_data, "$['batch_type']") as batch_type,
    json_extract_scalar(_airbyte_data, "$['company_id']") as company_id,
    json_extract_scalar(_airbyte_data, "$['partner_id']") as partner_id,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn,
    json_extract_scalar(_airbyte_data, "$['external_id']") as external_id,
    json_extract_scalar(_airbyte_data, "$['return_data']") as return_data,
    json_extract_scalar(_airbyte_data, "$['batch_number']") as batch_number,
    json_extract_scalar(_airbyte_data, "$['company_name']") as company_name,
    json_extract_scalar(_airbyte_data, "$['future_dated']") as future_dated,
    json_extract_scalar(_airbyte_data, "$['originator_id']") as originator_id,
    json_extract_scalar(_airbyte_data, "$['receiving_dfi']") as receiving_dfi,
    json_extract_scalar(_airbyte_data, "$['dfi_account_no']") as dfi_account_no,
    json_extract_scalar(_airbyte_data, "$['effective_date']") as effective_date,
    json_extract_scalar(_airbyte_data, "$['entry_trace_no']") as entry_trace_no,
    json_extract_scalar(_airbyte_data, "$['individual_name']") as individual_name,
    json_extract_scalar(_airbyte_data, "$['originating_dfi']") as originating_dfi,
    json_extract_scalar(_airbyte_data, "$['settlement_date']") as settlement_date,
    json_extract_scalar(_airbyte_data, "$['individual_id_no']") as individual_id_no,
    json_extract_scalar(_airbyte_data, "$['transaction_code']") as transaction_code,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at,
    json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at,
    json_extract_scalar(_airbyte_data, "$['processing_history']") as processing_history,
    json_extract_scalar(_airbyte_data, "$['transaction_out_id']") as transaction_out_id,
    json_extract_scalar(_airbyte_data, "$['addenda_record_count']") as addenda_record_count,
    json_extract_scalar(_airbyte_data, "$['destination_country_code']") as destination_country_code,
    json_extract_scalar(_airbyte_data, "$['company_entry_description']") as company_entry_description,
    json_extract_scalar(_airbyte_data, "$['destination_currency_code']") as destination_currency_code,
    json_extract_scalar(_airbyte_data, "$['originating_currency_code']") as originating_currency_code,
    json_extract_scalar(_airbyte_data, "$['foreign_exchange_indicator']") as foreign_exchange_indicator,
    _airbyte_ab_id,
    _airbyte_emitted_at,
    CURRENT_TIMESTAMP() as _airbyte_normalized_at
from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_in as table_alias
-- transactions_in
where 1 = 1
), __dbt__cte__transactions_in_ab2 as (
-- SQL model to cast each column to its adequate SQL type converted from the JSON schema type
-- depends_on: __dbt__cte__transactions_in_ab1
select
    cast(id as float64) as id,
    cast(uuid as string) as uuid,
    cast(amount as float64) as amount,
    cast(bank_id as float64) as bank_id,
    cast(nullif(created, '') as timestamp) as created,
    cast(nullif(updated, '') as timestamp) as updated,
    cast(returned as boolean) as returned,
    cast(sec_code as string) as sec_code,
    cast(file_hash as string) as file_hash,
    cast(file_name as string) as file_name,
    cast(addenda_02 as string) as addenda_02,
    cast(addenda_05 as string) as addenda_05,
    cast(addenda_10 as string) as addenda_10,
    cast(addenda_11 as string) as addenda_11,
    cast(addenda_12 as string) as addenda_12,
    cast(addenda_13 as string) as addenda_13,
    cast(addenda_14 as string) as addenda_14,
    cast(addenda_15 as string) as addenda_15,
    cast(addenda_16 as string) as addenda_16,
    cast(addenda_17 as string) as addenda_17,
    cast(addenda_18 as string) as addenda_18,
    cast(addenda_98 as string) as addenda_98,
    cast(addenda_99 as string) as addenda_99,
    cast(batch_type as string) as batch_type,
    cast(company_id as string) as company_id,
    cast(partner_id as float64) as partner_id,
    cast(_ab_cdc_lsn as float64) as _ab_cdc_lsn,
    cast(external_id as string) as external_id,
    cast(return_data as string) as return_data,
    cast(batch_number as float64) as batch_number,
    cast(company_name as string) as company_name,
    cast(future_dated as boolean) as future_dated,
    cast(originator_id as string) as originator_id,
    cast(receiving_dfi as string) as receiving_dfi,
    cast(dfi_account_no as string) as dfi_account_no,
    cast(nullif(effective_date, '') as timestamp) as effective_date,
    cast(entry_trace_no as string) as entry_trace_no,
    cast(individual_name as string) as individual_name,
    cast(originating_dfi as string) as originating_dfi,
    cast(nullif(settlement_date, '') as timestamp) as settlement_date,
    cast(individual_id_no as string) as individual_id_no,
    cast(transaction_code as string) as transaction_code,
    cast(_ab_cdc_deleted_at as string
2022-07-11 15:52:54 normalization > ) as _ab_cdc_deleted_at, 2022-07-11 15:52:54 normalization > cast(_ab_cdc_updated_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as _ab_cdc_updated_at, 2022-07-11 15:52:54 normalization > cast(processing_history as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as processing_history, 2022-07-11 15:52:54 normalization > cast(transaction_out_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as transaction_out_id, 2022-07-11 15:52:54 normalization > cast(addenda_record_count as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as addenda_record_count, 2022-07-11 15:52:54 normalization > cast(destination_country_code as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as destination_country_code, 2022-07-11 15:52:54 normalization > cast(company_entry_description as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as company_entry_description, 2022-07-11 15:52:54 normalization > cast(destination_currency_code as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as destination_currency_code, 2022-07-11 15:52:54 normalization > cast(originating_currency_code as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as originating_currency_code, 2022-07-11 15:52:54 normalization > cast(foreign_exchange_indicator as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as foreign_exchange_indicator, 2022-07-11 15:52:54 normalization > _airbyte_ab_id, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, 2022-07-11 15:52:54 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:52:54 normalization > from __dbt__cte__transactions_in_ab1 2022-07-11 15:52:54 normalization > -- transactions_in 2022-07-11 15:52:54 normalization > where 1 = 1 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > )-- SQL model to build a hash column based on the values of this record 2022-07-11 15:52:54 normalization > -- depends_on: __dbt__cte__transactions_in_ab2 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(uuid as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(amount as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(bank_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(created as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(updated as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(returned as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(sec_code as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(file_hash as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(file_name as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_02 as 2022-07-11 15:52:54 
normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_05 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_10 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_11 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_12 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_13 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_14 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_15 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_16 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_17 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_18 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_98 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_99 as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(batch_type as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(company_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(partner_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_lsn as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(external_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(return_data as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(batch_number as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(company_name as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(future_dated as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(originator_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(receiving_dfi as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(dfi_account_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(effective_date as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(entry_trace_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(individual_name as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(originating_dfi as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(settlement_date as 2022-07-11 15:52:54 normalization > string 
2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(individual_id_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(transaction_code as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(processing_history as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(transaction_out_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(addenda_record_count as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(destination_country_code as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(company_entry_description as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(destination_currency_code as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(originating_currency_code as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(foreign_exchange_indicator as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ))) as _airbyte_transactions_in_hashid, 2022-07-11 15:52:54 normalization > tmp.* 2022-07-11 15:52:54 normalization > from __dbt__cte__transactions_in_ab2 tmp 2022-07-11 15:52:54 normalization > -- transactions_in 2022-07-11 15:52:54 normalization > where 1 = 1 2022-07-11 15:52:54 normalization > ; 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 15:51:33.420551 [debug] [Thread-3 ]: On model.airbyte_utils.transactions_out_stg: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_stg"} */ 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > create or replace view `mainapi-282501`._airbyte_raw_achilles.`transactions_out_stg` 2022-07-11 15:52:54 normalization > OPTIONS() 2022-07-11 15:52:54 normalization > as 2022-07-11 15:52:54 normalization > with __dbt__cte__transactions_out_ab1 as ( 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > -- SQL model to parse JSON blob stored in a single column and extract into separated field columns as described by the JSON Schema 2022-07-11 15:52:54 normalization > -- depends_on: `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['id']") as id, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['data']") as data, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['uuid']") as uuid, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['amount']") as amount, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['status']") as status, 2022-07-11 15:52:54 normalization 
> json_extract_scalar(_airbyte_data, "$['bank_id']") as bank_id, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['created']") as created, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['file_id']") as file_id, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['updated']") as updated, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['trace_no']") as trace_no, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['account_no']") as account_no, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['partner_id']") as partner_id, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_lsn']") as _ab_cdc_lsn, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['description']") as description, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['external_id']") as external_id, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['is_same_day']") as is_same_day, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['return_data']") as return_data, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['account_name']") as account_name, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['effective_date']") as effective_date, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['reference_info']") as reference_info, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['transaction_code']") as transaction_code, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['source_account_no']") as source_account_no, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['transaction_in_id']") as transaction_in_id, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_deleted_at']") as _ab_cdc_deleted_at, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['_ab_cdc_updated_at']") as _ab_cdc_updated_at, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['source_account_name']") as source_account_name, 2022-07-11 15:52:54 normalization > json_extract_scalar(_airbyte_data, "$['destination_bank_routing_no']") as destination_bank_routing_no, 2022-07-11 15:52:54 normalization > _airbyte_ab_id, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, 2022-07-11 15:52:54 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:52:54 normalization > from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out as table_alias 2022-07-11 15:52:54 normalization > -- transactions_out 2022-07-11 15:52:54 normalization > where 1 = 1 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > ), __dbt__cte__transactions_out_ab2 as ( 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > -- SQL model to cast each column to its adequate SQL type converted from the JSON schema type 2022-07-11 15:52:54 normalization > -- depends_on: __dbt__cte__transactions_out_ab1 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > cast(id as 2022-07-11 15:52:54 normalization > float64 2022-07-11 15:52:54 normalization > ) as id, 2022-07-11 15:52:54 normalization > cast(data as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as data, 2022-07-11 15:52:54 normalization > cast(uuid as 2022-07-11 
15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as uuid, 2022-07-11 15:52:54 normalization > cast(amount as 2022-07-11 15:52:54 normalization > float64 2022-07-11 15:52:54 normalization > ) as amount, 2022-07-11 15:52:54 normalization > cast(status as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as status, 2022-07-11 15:52:54 normalization > cast(bank_id as 2022-07-11 15:52:54 normalization > float64 2022-07-11 15:52:54 normalization > ) as bank_id, 2022-07-11 15:52:54 normalization > cast(nullif(created, '') as 2022-07-11 15:52:54 normalization > timestamp 2022-07-11 15:52:54 normalization > ) as created, 2022-07-11 15:52:54 normalization > cast(file_id as 2022-07-11 15:52:54 normalization > float64 2022-07-11 15:52:54 normalization > ) as file_id, 2022-07-11 15:52:54 normalization > cast(nullif(updated, '') as 2022-07-11 15:52:54 normalization > timestamp 2022-07-11 15:52:54 normalization > ) as updated, 2022-07-11 15:52:54 normalization > cast(trace_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as trace_no, 2022-07-11 15:52:54 normalization > cast(account_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as account_no, 2022-07-11 15:52:54 normalization > cast(partner_id as 2022-07-11 15:52:54 normalization > float64 2022-07-11 15:52:54 normalization > ) as partner_id, 2022-07-11 15:52:54 normalization > cast(_ab_cdc_lsn as 2022-07-11 15:52:54 normalization > float64 2022-07-11 15:52:54 normalization > ) as _ab_cdc_lsn, 2022-07-11 15:52:54 normalization > cast(description as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as description, 2022-07-11 15:52:54 normalization > cast(external_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as external_id, 2022-07-11 15:52:54 normalization > cast(is_same_day as boolean) as is_same_day, 2022-07-11 15:52:54 normalization > cast(return_data as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as return_data, 2022-07-11 15:52:54 normalization > cast(account_name as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as account_name, 2022-07-11 15:52:54 normalization > cast(nullif(effective_date, '') as 2022-07-11 15:52:54 normalization > timestamp 2022-07-11 15:52:54 normalization > ) as effective_date, 2022-07-11 15:52:54 normalization > cast(reference_info as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as reference_info, 2022-07-11 15:52:54 normalization > cast(transaction_code as 2022-07-11 15:52:54 normalization > float64 2022-07-11 15:52:54 normalization > ) as transaction_code, 2022-07-11 15:52:54 normalization > cast(source_account_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as source_account_no, 2022-07-11 15:52:54 normalization > cast(transaction_in_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as transaction_in_id, 2022-07-11 15:52:54 normalization > cast(_ab_cdc_deleted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as _ab_cdc_deleted_at, 2022-07-11 15:52:54 normalization > cast(_ab_cdc_updated_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as _ab_cdc_updated_at, 2022-07-11 15:52:54 normalization > cast(source_account_name as 2022-07-11 15:52:54 normalization > string 
2022-07-11 15:52:54 normalization > ) as source_account_name, 2022-07-11 15:52:54 normalization > cast(destination_bank_routing_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) as destination_bank_routing_no, 2022-07-11 15:52:54 normalization > _airbyte_ab_id, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, 2022-07-11 15:52:54 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at 2022-07-11 15:52:54 normalization > from __dbt__cte__transactions_out_ab1 2022-07-11 15:52:54 normalization > -- transactions_out 2022-07-11 15:52:54 normalization > where 1 = 1 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > )-- SQL model to build a hash column based on the values of this record 2022-07-11 15:52:54 normalization > -- depends_on: __dbt__cte__transactions_out_ab2 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(data as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(uuid as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(amount as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(status as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(bank_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(created as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(file_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(updated as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(trace_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(account_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(partner_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_lsn as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(description as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(external_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(is_same_day as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(return_data as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(account_name as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(effective_date as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(reference_info as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(transaction_code as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(source_account_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 
normalization > ), ''), '-', coalesce(cast(transaction_in_id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(source_account_name as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(destination_bank_routing_no as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ))) as _airbyte_transactions_out_hashid, 2022-07-11 15:52:54 normalization > tmp.* 2022-07-11 15:52:54 normalization > from __dbt__cte__transactions_out_ab2 tmp 2022-07-11 15:52:54 normalization > -- transactions_out 2022-07-11 15:52:54 normalization > where 1 = 1 2022-07-11 15:52:54 normalization > ; 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 15:51:34.054501 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.057053 [info ] [Thread-7 ]: 3 of 18 OK created view model _airbyte_raw_achilles.files_out_stg....................................................... [OK in 1.25s] 2022-07-11 15:52:54 normalization > 15:51:34.057723 [debug] [Thread-7 ]: Finished running node model.airbyte_utils.files_out_stg 2022-07-11 15:52:54 normalization > 15:51:34.062173 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.063159 [debug] [Thread-6 ]: Began running node model.airbyte_utils.files_out_scd 2022-07-11 15:52:54 normalization > 15:51:34.064170 [info ] [Thread-1 ]: 2 of 18 OK created view model _airbyte_raw_achilles.files_in_stg........................................................ [OK in 1.26s] 2022-07-11 15:52:54 normalization > 15:51:34.064650 [info ] [Thread-6 ]: 7 of 18 START incremental model raw_achilles.files_out_scd.............................................................. [RUN] 2022-07-11 15:52:54 normalization > 15:51:34.065329 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.files_in_stg 2022-07-11 15:52:54 normalization > 15:51:34.066819 [debug] [Thread-6 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out_scd" 2022-07-11 15:52:54 normalization > 15:51:34.067593 [debug] [Thread-7 ]: Began running node model.airbyte_utils.files_in_scd 2022-07-11 15:52:54 normalization > 15:51:34.067946 [debug] [Thread-6 ]: Began compiling node model.airbyte_utils.files_out_scd 2022-07-11 15:52:54 normalization > 15:51:34.068353 [info ] [Thread-7 ]: 8 of 18 START incremental model raw_achilles.files_in_scd............................................................... 
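The transactions_in_stg and transactions_out_stg view definitions logged above both follow the same three-step staging pattern that normalization applies to every stream: extract fields from the _airbyte_data JSON blob, cast the extracted strings to typed columns, then hash the concatenated values into a per-record id. The following is a minimal sketch of that pattern only; the project, dataset, table and column names are illustrative placeholders, not objects taken from this log.

    -- Sketch of the staging-view pattern seen above (placeholder names throughout).
    create or replace view `my_project`.staging.`my_stream_stg` as
    with ab1 as (
      -- step 1: parse the raw JSON blob into individual string fields
      select
        json_extract_scalar(_airbyte_data, "$['id']")      as id,
        json_extract_scalar(_airbyte_data, "$['amount']")  as amount,
        json_extract_scalar(_airbyte_data, "$['updated']") as updated,
        _airbyte_ab_id,
        _airbyte_emitted_at,
        current_timestamp() as _airbyte_normalized_at
      from `my_project`.raw.`_airbyte_raw_my_stream`
    ),
    ab2 as (
      -- step 2: cast each extracted field to its declared type
      select
        cast(id as float64)                    as id,
        cast(amount as float64)                as amount,
        cast(nullif(updated, '') as timestamp) as updated,
        _airbyte_ab_id,
        _airbyte_emitted_at,
        _airbyte_normalized_at
      from ab1
    )
    -- step 3: derive a row hash over the value columns
    select
      to_hex(md5(concat(
        coalesce(cast(id as string), ''), '-',
        coalesce(cast(amount as string), ''), '-',
        coalesce(cast(updated as string), '')
      ))) as _airbyte_my_stream_hashid,
      tmp.*
    from ab2 as tmp;

In the generated models the cast and hash steps enumerate every column of the stream, which is why the statements in the log run so long.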
[RUN] 2022-07-11 15:52:54 normalization > 15:51:34.068689 [debug] [Thread-6 ]: Compiling model.airbyte_utils.files_out_scd 2022-07-11 15:52:54 normalization > 15:51:34.070076 [debug] [Thread-7 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in_scd" 2022-07-11 15:52:54 normalization > 15:51:34.091134 [debug] [Thread-7 ]: Began compiling node model.airbyte_utils.files_in_scd 2022-07-11 15:52:54 normalization > 15:51:34.101854 [debug] [Thread-7 ]: Compiling model.airbyte_utils.files_in_scd 2022-07-11 15:52:54 normalization > 15:51:34.110040 [debug] [Thread-6 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:54 normalization > 15:51:34.131076 [debug] [Thread-7 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:54 normalization > 15:51:34.179123 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.184191 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.195418 [info ] [Thread-3 ]: 5 of 18 OK created view model _airbyte_raw_achilles.transactions_out_stg................................................ [OK in 1.36s] 2022-07-11 15:52:54 normalization > 15:51:34.196853 [info ] [Thread-5 ]: 1 of 18 OK created view model _airbyte_raw_achilles.bank_config_stg..................................................... [OK in 1.40s] 2022-07-11 15:52:54 normalization > 15:51:34.200083 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.transactions_out_stg 2022-07-11 15:52:54 normalization > 15:51:34.200777 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.bank_config_stg 2022-07-11 15:52:54 normalization > 15:51:34.202313 [debug] [Thread-1 ]: Began running node model.airbyte_utils.transactions_out_scd 2022-07-11 15:52:54 normalization > 15:51:34.203035 [info ] [Thread-1 ]: 9 of 18 START incremental model raw_achilles.transactions_out_scd....................................................... [RUN] 2022-07-11 15:52:54 normalization > 15:51:34.205132 [debug] [Thread-1 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out_scd" 2022-07-11 15:52:54 normalization > 15:51:34.205486 [debug] [Thread-1 ]: Began compiling node model.airbyte_utils.transactions_out_scd 2022-07-11 15:52:54 normalization > 15:51:34.205842 [debug] [Thread-1 ]: Compiling model.airbyte_utils.transactions_out_scd 2022-07-11 15:52:54 normalization > 15:51:34.230064 [debug] [Thread-3 ]: Began running node model.airbyte_utils.bank_config_scd 2022-07-11 15:52:54 normalization > 15:51:34.244040 [debug] [Thread-1 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:54 normalization > 15:51:34.245023 [info ] [Thread-3 ]: 10 of 18 START incremental model raw_achilles.bank_config_scd........................................................... [RUN] 2022-07-11 15:52:54 normalization > 15:51:34.248278 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.249996 [debug] [Thread-3 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config_scd" 2022-07-11 15:52:54 normalization > 15:51:34.251144 [info ] [Thread-2 ]: 4 of 18 OK created view model _airbyte_raw_achilles.partner_config_stg.................................................. 
[OK in 1.44s] 2022-07-11 15:52:54 normalization > 15:51:34.251496 [debug] [Thread-3 ]: Began compiling node model.airbyte_utils.bank_config_scd 2022-07-11 15:52:54 normalization > 15:51:34.252298 [debug] [Thread-3 ]: Compiling model.airbyte_utils.bank_config_scd 2022-07-11 15:52:54 normalization > 15:51:34.260790 [debug] [Thread-2 ]: Finished running node model.airbyte_utils.partner_config_stg 2022-07-11 15:52:54 normalization > 15:51:34.276963 [debug] [Thread-3 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:54 normalization > 15:51:34.291261 [debug] [Thread-5 ]: Began running node model.airbyte_utils.partner_config_scd 2022-07-11 15:52:54 normalization > 15:51:34.294895 [info ] [Thread-5 ]: 11 of 18 START incremental model raw_achilles.partner_config_scd........................................................ [RUN] 2022-07-11 15:52:54 normalization > 15:51:34.297902 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config_scd" 2022-07-11 15:52:54 normalization > 15:51:34.298434 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.partner_config_scd 2022-07-11 15:52:54 normalization > 15:51:34.299279 [debug] [Thread-5 ]: Compiling model.airbyte_utils.partner_config_scd 2022-07-11 15:52:54 normalization > 15:51:34.323849 [debug] [Thread-5 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:54 normalization > 15:51:34.421397 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.434137 [info ] [Thread-4 ]: 6 of 18 OK created view model _airbyte_raw_achilles.transactions_in_stg................................................. [OK in 1.54s] 2022-07-11 15:52:54 normalization > 15:51:34.444641 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.transactions_in_stg 2022-07-11 15:52:54 normalization > 15:51:34.446692 [debug] [Thread-2 ]: Began running node model.airbyte_utils.transactions_in_scd 2022-07-11 15:52:54 normalization > 15:51:34.447119 [info ] [Thread-2 ]: 12 of 18 START incremental model raw_achilles.transactions_in_scd....................................................... 
[RUN] 2022-07-11 15:52:54 normalization > 15:51:34.448261 [debug] [Thread-2 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_in_scd" 2022-07-11 15:52:54 normalization > 15:51:34.448477 [debug] [Thread-2 ]: Began compiling node model.airbyte_utils.transactions_in_scd 2022-07-11 15:52:54 normalization > 15:51:34.448684 [debug] [Thread-2 ]: Compiling model.airbyte_utils.transactions_in_scd 2022-07-11 15:52:54 normalization > 15:51:34.477743 [debug] [Thread-2 ]: Writing injected SQL for node "model.airbyte_utils.transactions_in_scd" 2022-07-11 15:52:54 normalization > 15:51:34.487263 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.487551 [debug] [Thread-2 ]: Began executing node model.airbyte_utils.transactions_in_scd 2022-07-11 15:52:54 normalization > 15:51:34.523972 [debug] [Thread-2 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:54 normalization > 15:51:34.658154 [debug] [Thread-2 ]: BigQuery adapter: get_columns_in_relation error: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/mainapi-282501/datasets/raw_achilles/tables/transactions_in_scd?prettyPrint=false: Not found: Table mainapi-282501:raw_achilles.transactions_in_scd 2022-07-11 15:52:54 normalization > 15:51:34.674436 [info ] [Thread-2 ]: 15:51:34 + `mainapi-282501`.raw_achilles.`transactions_in_scd`._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh 2022-07-11 15:52:54 normalization > 15:51:34.722814 [debug] [Thread-2 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_in_scd" 2022-07-11 15:52:54 normalization > 15:51:34.745701 [debug] [Thread-2 ]: On model.airbyte_utils.transactions_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_in_scd"} */ 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > create or replace table `mainapi-282501`.raw_achilles.`transactions_in_scd` 2022-07-11 15:52:54 normalization > partition by range_bucket( 2022-07-11 15:52:54 normalization > _airbyte_active_row, 2022-07-11 15:52:54 normalization > generate_array(0, 1, 1) 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:52:54 normalization > OPTIONS() 2022-07-11 15:52:54 normalization > as ( 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > -- depends_on: ref('transactions_in_stg') 2022-07-11 15:52:54 normalization > with 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > input_data as ( 2022-07-11 15:52:54 normalization > select * 2022-07-11 15:52:54 normalization > from `mainapi-282501`._airbyte_raw_achilles.`transactions_in_stg` 2022-07-11 15:52:54 normalization > -- transactions_in from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_in 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > scd_data as ( 2022-07-11 15:52:54 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > 
))) as _airbyte_unique_key, 2022-07-11 15:52:54 normalization > id, 2022-07-11 15:52:54 normalization > uuid, 2022-07-11 15:52:54 normalization > amount, 2022-07-11 15:52:54 normalization > bank_id, 2022-07-11 15:52:54 normalization > created, 2022-07-11 15:52:54 normalization > updated, 2022-07-11 15:52:54 normalization > returned, 2022-07-11 15:52:54 normalization > sec_code, 2022-07-11 15:52:54 normalization > file_hash, 2022-07-11 15:52:54 normalization > file_name, 2022-07-11 15:52:54 normalization > addenda_02, 2022-07-11 15:52:54 normalization > addenda_05, 2022-07-11 15:52:54 normalization > addenda_10, 2022-07-11 15:52:54 normalization > addenda_11, 2022-07-11 15:52:54 normalization > addenda_12, 2022-07-11 15:52:54 normalization > addenda_13, 2022-07-11 15:52:54 normalization > addenda_14, 2022-07-11 15:52:54 normalization > addenda_15, 2022-07-11 15:52:54 normalization > addenda_16, 2022-07-11 15:52:54 normalization > addenda_17, 2022-07-11 15:52:54 normalization > addenda_18, 2022-07-11 15:52:54 normalization > addenda_98, 2022-07-11 15:52:54 normalization > addenda_99, 2022-07-11 15:52:54 normalization > batch_type, 2022-07-11 15:52:54 normalization > company_id, 2022-07-11 15:52:54 normalization > partner_id, 2022-07-11 15:52:54 normalization > _ab_cdc_lsn, 2022-07-11 15:52:54 normalization > external_id, 2022-07-11 15:52:54 normalization > return_data, 2022-07-11 15:52:54 normalization > batch_number, 2022-07-11 15:52:54 normalization > company_name, 2022-07-11 15:52:54 normalization > future_dated, 2022-07-11 15:52:54 normalization > originator_id, 2022-07-11 15:52:54 normalization > receiving_dfi, 2022-07-11 15:52:54 normalization > dfi_account_no, 2022-07-11 15:52:54 normalization > effective_date, 2022-07-11 15:52:54 normalization > entry_trace_no, 2022-07-11 15:52:54 normalization > individual_name, 2022-07-11 15:52:54 normalization > originating_dfi, 2022-07-11 15:52:54 normalization > settlement_date, 2022-07-11 15:52:54 normalization > individual_id_no, 2022-07-11 15:52:54 normalization > transaction_code, 2022-07-11 15:52:54 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:54 normalization > processing_history, 2022-07-11 15:52:54 normalization > transaction_out_id, 2022-07-11 15:52:54 normalization > addenda_record_count, 2022-07-11 15:52:54 normalization > destination_country_code, 2022-07-11 15:52:54 normalization > company_entry_description, 2022-07-11 15:52:54 normalization > destination_currency_code, 2022-07-11 15:52:54 normalization > originating_currency_code, 2022-07-11 15:52:54 normalization > foreign_exchange_indicator, 2022-07-11 15:52:54 normalization > updated as _airbyte_start_at, 2022-07-11 15:52:54 normalization > lag(updated) over ( 2022-07-11 15:52:54 normalization > partition by cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > order by 2022-07-11 15:52:54 normalization > updated is null asc, 2022-07-11 15:52:54 normalization > updated desc, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:54 normalization > ) as _airbyte_end_at, 2022-07-11 15:52:54 normalization > case when row_number() over ( 2022-07-11 15:52:54 normalization > partition by cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > order by 2022-07-11 15:52:54 normalization > updated 
is null asc, 2022-07-11 15:52:54 normalization > updated desc, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:54 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:52:54 normalization > _airbyte_ab_id, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, 2022-07-11 15:52:54 normalization > _airbyte_transactions_in_hashid 2022-07-11 15:52:54 normalization > from input_data 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > dedup_data as ( 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:52:54 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:52:54 normalization > row_number() over ( 2022-07-11 15:52:54 normalization > partition by 2022-07-11 15:52:54 normalization > _airbyte_unique_key, 2022-07-11 15:52:54 normalization > _airbyte_start_at, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:52:54 normalization > ) as _airbyte_row_num, 2022-07-11 15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:52:54 normalization > scd_data.* 2022-07-11 15:52:54 normalization > from scd_data 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > _airbyte_unique_key, 2022-07-11 15:52:54 normalization > _airbyte_unique_key_scd, 2022-07-11 15:52:54 normalization > id, 2022-07-11 15:52:54 normalization > uuid, 2022-07-11 15:52:54 normalization > amount, 2022-07-11 15:52:54 normalization > bank_id, 2022-07-11 15:52:54 normalization > created, 2022-07-11 15:52:54 normalization > updated, 2022-07-11 15:52:54 normalization > returned, 2022-07-11 15:52:54 normalization > sec_code, 2022-07-11 15:52:54 normalization > file_hash, 2022-07-11 15:52:54 normalization > file_name, 2022-07-11 15:52:54 normalization > addenda_02, 2022-07-11 15:52:54 normalization > addenda_05, 2022-07-11 15:52:54 normalization > addenda_10, 2022-07-11 15:52:54 normalization > addenda_11, 2022-07-11 15:52:54 normalization > addenda_12, 2022-07-11 15:52:54 normalization > addenda_13, 2022-07-11 15:52:54 normalization > addenda_14, 2022-07-11 15:52:54 normalization > addenda_15, 2022-07-11 15:52:54 normalization > addenda_16, 2022-07-11 15:52:54 normalization > addenda_17, 2022-07-11 15:52:54 
normalization > addenda_18, 2022-07-11 15:52:54 normalization > addenda_98, 2022-07-11 15:52:54 normalization > addenda_99, 2022-07-11 15:52:54 normalization > batch_type, 2022-07-11 15:52:54 normalization > company_id, 2022-07-11 15:52:54 normalization > partner_id, 2022-07-11 15:52:54 normalization > _ab_cdc_lsn, 2022-07-11 15:52:54 normalization > external_id, 2022-07-11 15:52:54 normalization > return_data, 2022-07-11 15:52:54 normalization > batch_number, 2022-07-11 15:52:54 normalization > company_name, 2022-07-11 15:52:54 normalization > future_dated, 2022-07-11 15:52:54 normalization > originator_id, 2022-07-11 15:52:54 normalization > receiving_dfi, 2022-07-11 15:52:54 normalization > dfi_account_no, 2022-07-11 15:52:54 normalization > effective_date, 2022-07-11 15:52:54 normalization > entry_trace_no, 2022-07-11 15:52:54 normalization > individual_name, 2022-07-11 15:52:54 normalization > originating_dfi, 2022-07-11 15:52:54 normalization > settlement_date, 2022-07-11 15:52:54 normalization > individual_id_no, 2022-07-11 15:52:54 normalization > transaction_code, 2022-07-11 15:52:54 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:54 normalization > processing_history, 2022-07-11 15:52:54 normalization > transaction_out_id, 2022-07-11 15:52:54 normalization > addenda_record_count, 2022-07-11 15:52:54 normalization > destination_country_code, 2022-07-11 15:52:54 normalization > company_entry_description, 2022-07-11 15:52:54 normalization > destination_currency_code, 2022-07-11 15:52:54 normalization > originating_currency_code, 2022-07-11 15:52:54 normalization > foreign_exchange_indicator, 2022-07-11 15:52:54 normalization > _airbyte_start_at, 2022-07-11 15:52:54 normalization > _airbyte_end_at, 2022-07-11 15:52:54 normalization > _airbyte_active_row, 2022-07-11 15:52:54 normalization > _airbyte_ab_id, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, 2022-07-11 15:52:54 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:54 normalization > _airbyte_transactions_in_hashid 2022-07-11 15:52:54 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:52:54 normalization > ); 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 15:51:34.776177 [debug] [Thread-6 ]: Writing injected SQL for node "model.airbyte_utils.files_out_scd" 2022-07-11 15:52:54 normalization > 15:51:34.776680 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.776903 [debug] [Thread-6 ]: Began executing node model.airbyte_utils.files_out_scd 2022-07-11 15:52:54 normalization > 15:51:34.809768 [debug] [Thread-3 ]: Writing injected SQL for node "model.airbyte_utils.bank_config_scd" 2022-07-11 15:52:54 normalization > 15:51:34.812968 [debug] [Thread-7 ]: Writing injected SQL for node "model.airbyte_utils.files_in_scd" 2022-07-11 15:52:54 normalization > 15:51:34.820068 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.821293 [debug] [Thread-7 ]: Began executing node model.airbyte_utils.files_in_scd 2022-07-11 15:52:54 normalization > 15:51:34.821663 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.832995 [debug] [Thread-3 ]: Began executing node model.airbyte_utils.bank_config_scd 2022-07-11 15:52:54 normalization > 15:51:34.854308 [debug] [Thread-6 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": 
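The transactions_in_scd statement logged above shows the Type 2 SCD shape in isolation: lag(updated) supplies the end date of a superseded version, a row_number() window marks the single newest non-deleted version of each primary key as the active row, and a second row_number() pass de-duplicates before the final projection. Below is a condensed sketch of just those window functions, assuming a placeholder staging relation that exposes id, amount, updated, _ab_cdc_deleted_at, _airbyte_ab_id and _airbyte_emitted_at (none of these objects come from this log).

    -- Sketch of the SCD window logic seen above (placeholder names throughout).
    with scd_data as (
      select
        to_hex(md5(coalesce(cast(id as string), ''))) as _airbyte_unique_key,
        id,
        amount,
        updated as _airbyte_start_at,
        -- end date = "updated" of the next newer version of the same id
        lag(updated) over (
          partition by cast(id as string)
          order by updated is null asc, updated desc, _airbyte_emitted_at desc
        ) as _airbyte_end_at,
        -- newest non-deleted version per id is flagged as the active row
        case when row_number() over (
          partition by cast(id as string)
          order by updated is null asc, updated desc, _airbyte_emitted_at desc
        ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row,
        _airbyte_ab_id,
        _airbyte_emitted_at
      from `my_project`.staging.`my_stream_stg`
    ),
    dedup_data as (
      select
        -- collapse exact duplicates of (key, start date, emission time)
        row_number() over (
          partition by _airbyte_unique_key, _airbyte_start_at, _airbyte_emitted_at
          order by _airbyte_active_row desc, _airbyte_ab_id
        ) as _airbyte_row_num,
        scd_data.*
      from scd_data
    )
    select * except (_airbyte_row_num)
    from dedup_data
    where _airbyte_row_num = 1;

The 404 from get_columns_in_relation logged a few entries earlier indicates the target SCD table did not exist yet, which is why transactions_in_scd is built here with a plain create or replace table rather than the incremental __dbt_tmp route used for files_out_scd below.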
"normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */ 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > create or replace table `mainapi-282501`.raw_achilles.`files_out_scd__dbt_tmp` 2022-07-11 15:52:54 normalization > partition by range_bucket( 2022-07-11 15:52:54 normalization > _airbyte_active_row, 2022-07-11 15:52:54 normalization > generate_array(0, 1, 1) 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:52:54 normalization > OPTIONS( 2022-07-11 15:52:54 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > as ( 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > -- depends_on: ref('files_out_stg') 2022-07-11 15:52:54 normalization > with 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > new_data as ( 2022-07-11 15:52:54 normalization > -- retrieve incremental "new" data 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > * 2022-07-11 15:52:54 normalization > from `mainapi-282501`._airbyte_raw_achilles.`files_out_stg` 2022-07-11 15:52:54 normalization > -- files_out from `mainapi-282501`.raw_achilles._airbyte_raw_files_out 2022-07-11 15:52:54 normalization > where 1 = 1 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > and coalesce( 2022-07-11 15:52:54 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:52:54 normalization > timestamp 2022-07-11 15:52:54 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:52:54 normalization > timestamp 2022-07-11 15:52:54 normalization > )) from `mainapi-282501`.raw_achilles.`files_out_scd`), 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > true) 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > new_data_ids as ( 2022-07-11 15:52:54 normalization > -- build a subset of _airbyte_unique_key from rows that are new 2022-07-11 15:52:54 normalization > select distinct 2022-07-11 15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ))) as _airbyte_unique_key 2022-07-11 15:52:54 normalization > from new_data 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > empty_new_data as ( 2022-07-11 15:52:54 normalization > -- build an empty table to only keep the table's column types 2022-07-11 15:52:54 normalization > select * from new_data where 1 = 0 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > previous_active_scd_data as ( 2022-07-11 15:52:54 normalization > -- retrieve "incomplete old" data that needs to be updated with an end date because of new changes 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > this_data.`_airbyte_files_out_hashid`, 2022-07-11 15:52:54 normalization > this_data.`id`, 2022-07-11 15:52:54 normalization > this_data.`bank_id`, 2022-07-11 15:52:54 normalization > 
this_data.`created`, 2022-07-11 15:52:54 normalization > this_data.`updated`, 2022-07-11 15:52:54 normalization > this_data.`file_hash`, 2022-07-11 15:52:54 normalization > this_data.`file_name`, 2022-07-11 15:52:54 normalization > this_data.`_ab_cdc_lsn`, 2022-07-11 15:52:54 normalization > this_data.`batch_count`, 2022-07-11 15:52:54 normalization > this_data.`exchange_window`, 2022-07-11 15:52:54 normalization > this_data.`_ab_cdc_deleted_at`, 2022-07-11 15:52:54 normalization > this_data.`_ab_cdc_updated_at`, 2022-07-11 15:52:54 normalization > this_data.`_airbyte_ab_id`, 2022-07-11 15:52:54 normalization > this_data.`_airbyte_emitted_at`, 2022-07-11 15:52:54 normalization > this_data.`_airbyte_normalized_at` 2022-07-11 15:52:54 normalization > from `mainapi-282501`.raw_achilles.`files_out_scd` as this_data 2022-07-11 15:52:54 normalization > -- make a join with new_data using primary key to filter active data that need to be updated only 2022-07-11 15:52:54 normalization > join new_data_ids on this_data._airbyte_unique_key = new_data_ids._airbyte_unique_key 2022-07-11 15:52:54 normalization > -- force left join to NULL values (we just need to transfer column types only for the star_intersect macro on schema changes) 2022-07-11 15:52:54 normalization > left join empty_new_data as inc_data on this_data._airbyte_ab_id = inc_data._airbyte_ab_id 2022-07-11 15:52:54 normalization > where _airbyte_active_row = 1 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > input_data as ( 2022-07-11 15:52:54 normalization > select `_airbyte_files_out_hashid`, 2022-07-11 15:52:54 normalization > `id`, 2022-07-11 15:52:54 normalization > `bank_id`, 2022-07-11 15:52:54 normalization > `created`, 2022-07-11 15:52:54 normalization > `updated`, 2022-07-11 15:52:54 normalization > `file_hash`, 2022-07-11 15:52:54 normalization > `file_name`, 2022-07-11 15:52:54 normalization > `_ab_cdc_lsn`, 2022-07-11 15:52:54 normalization > `batch_count`, 2022-07-11 15:52:54 normalization > `exchange_window`, 2022-07-11 15:52:54 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:52:54 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:52:54 normalization > `_airbyte_ab_id`, 2022-07-11 15:52:54 normalization > `_airbyte_emitted_at`, 2022-07-11 15:52:54 normalization > `_airbyte_normalized_at` from new_data 2022-07-11 15:52:54 normalization > union all 2022-07-11 15:52:54 normalization > select `_airbyte_files_out_hashid`, 2022-07-11 15:52:54 normalization > `id`, 2022-07-11 15:52:54 normalization > `bank_id`, 2022-07-11 15:52:54 normalization > `created`, 2022-07-11 15:52:54 normalization > `updated`, 2022-07-11 15:52:54 normalization > `file_hash`, 2022-07-11 15:52:54 normalization > `file_name`, 2022-07-11 15:52:54 normalization > `_ab_cdc_lsn`, 2022-07-11 15:52:54 normalization > `batch_count`, 2022-07-11 15:52:54 normalization > `exchange_window`, 2022-07-11 15:52:54 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:52:54 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:52:54 normalization > `_airbyte_ab_id`, 2022-07-11 15:52:54 normalization > `_airbyte_emitted_at`, 2022-07-11 15:52:54 normalization > `_airbyte_normalized_at` from previous_active_scd_data 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > scd_data as ( 2022-07-11 15:52:54 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:52:54 normalization > select 2022-07-11 
15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ))) as _airbyte_unique_key, 2022-07-11 15:52:54 normalization > id, 2022-07-11 15:52:54 normalization > bank_id, 2022-07-11 15:52:54 normalization > created, 2022-07-11 15:52:54 normalization > updated, 2022-07-11 15:52:54 normalization > file_hash, 2022-07-11 15:52:54 normalization > file_name, 2022-07-11 15:52:54 normalization > _ab_cdc_lsn, 2022-07-11 15:52:54 normalization > batch_count, 2022-07-11 15:52:54 normalization > exchange_window, 2022-07-11 15:52:54 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:54 normalization > updated as _airbyte_start_at, 2022-07-11 15:52:54 normalization > lag(updated) over ( 2022-07-11 15:52:54 normalization > partition by cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > order by 2022-07-11 15:52:54 normalization > updated is null asc, 2022-07-11 15:52:54 normalization > updated desc, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:54 normalization > ) as _airbyte_end_at, 2022-07-11 15:52:54 normalization > case when row_number() over ( 2022-07-11 15:52:54 normalization > partition by cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > order by 2022-07-11 15:52:54 normalization > updated is null asc, 2022-07-11 15:52:54 normalization > updated desc, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:54 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:52:54 normalization > _airbyte_ab_id, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, 2022-07-11 15:52:54 normalization > _airbyte_files_out_hashid 2022-07-11 15:52:54 normalization > from input_data 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > dedup_data as ( 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:52:54 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:52:54 normalization > row_number() over ( 2022-07-11 15:52:54 normalization > partition by 2022-07-11 15:52:54 normalization > _airbyte_unique_key, 2022-07-11 15:52:54 normalization > _airbyte_start_at, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:52:54 normalization > ) as _airbyte_row_num, 2022-07-11 15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:52:54 
normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:52:54 normalization > scd_data.* 2022-07-11 15:52:54 normalization > from scd_data 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > _airbyte_unique_key, 2022-07-11 15:52:54 normalization > _airbyte_unique_key_scd, 2022-07-11 15:52:54 normalization > id, 2022-07-11 15:52:54 normalization > bank_id, 2022-07-11 15:52:54 normalization > created, 2022-07-11 15:52:54 normalization > updated, 2022-07-11 15:52:54 normalization > file_hash, 2022-07-11 15:52:54 normalization > file_name, 2022-07-11 15:52:54 normalization > _ab_cdc_lsn, 2022-07-11 15:52:54 normalization > batch_count, 2022-07-11 15:52:54 normalization > exchange_window, 2022-07-11 15:52:54 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:54 normalization > _airbyte_start_at, 2022-07-11 15:52:54 normalization > _airbyte_end_at, 2022-07-11 15:52:54 normalization > _airbyte_active_row, 2022-07-11 15:52:54 normalization > _airbyte_ab_id, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, 2022-07-11 15:52:54 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:54 normalization > _airbyte_files_out_hashid 2022-07-11 15:52:54 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:52:54 normalization > ); 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 15:51:34.872462 [debug] [Thread-1 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out_scd" 2022-07-11 15:52:54 normalization > 15:51:34.878743 [debug] [Thread-7 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */ 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > create or replace table `mainapi-282501`.raw_achilles.`files_in_scd__dbt_tmp` 2022-07-11 15:52:54 normalization > partition by range_bucket( 2022-07-11 15:52:54 normalization > _airbyte_active_row, 2022-07-11 15:52:54 normalization > generate_array(0, 1, 1) 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:52:54 normalization > OPTIONS( 2022-07-11 15:52:54 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > as ( 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > -- depends_on: ref('files_in_stg') 2022-07-11 15:52:54 normalization > with 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > new_data as ( 2022-07-11 15:52:54 normalization > -- retrieve incremental "new" data 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > * 2022-07-11 15:52:54 
normalization > from `mainapi-282501`._airbyte_raw_achilles.`files_in_stg` 2022-07-11 15:52:54 normalization > -- files_in from `mainapi-282501`.raw_achilles._airbyte_raw_files_in 2022-07-11 15:52:54 normalization > where 1 = 1 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > and coalesce( 2022-07-11 15:52:54 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:52:54 normalization > timestamp 2022-07-11 15:52:54 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:52:54 normalization > timestamp 2022-07-11 15:52:54 normalization > )) from `mainapi-282501`.raw_achilles.`files_in_scd`), 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > true) 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > new_data_ids as ( 2022-07-11 15:52:54 normalization > -- build a subset of _airbyte_unique_key from rows that are new 2022-07-11 15:52:54 normalization > select distinct 2022-07-11 15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ))) as _airbyte_unique_key 2022-07-11 15:52:54 normalization > from new_data 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > empty_new_data as ( 2022-07-11 15:52:54 normalization > -- build an empty table to only keep the table's column types 2022-07-11 15:52:54 normalization > select * from new_data where 1 = 0 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > previous_active_scd_data as ( 2022-07-11 15:52:54 normalization > -- retrieve "incomplete old" data that needs to be updated with an end date because of new changes 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > this_data.`_airbyte_files_in_hashid`, 2022-07-11 15:52:54 normalization > this_data.`id`, 2022-07-11 15:52:54 normalization > this_data.`ended`, 2022-07-11 15:52:54 normalization > this_data.`started`, 2022-07-11 15:52:54 normalization > this_data.`updated`, 2022-07-11 15:52:54 normalization > this_data.`file_hash`, 2022-07-11 15:52:54 normalization > this_data.`file_name`, 2022-07-11 15:52:54 normalization > this_data.`_ab_cdc_lsn`, 2022-07-11 15:52:54 normalization > this_data.`iat_entry_count`, 2022-07-11 15:52:54 normalization > this_data.`std_entry_count`, 2022-07-11 15:52:54 normalization > this_data.`total_batch_count`, 2022-07-11 15:52:54 normalization > this_data.`total_entry_count`, 2022-07-11 15:52:54 normalization > this_data.`_ab_cdc_deleted_at`, 2022-07-11 15:52:54 normalization > this_data.`_ab_cdc_updated_at`, 2022-07-11 15:52:54 normalization > this_data.`preprocessing_path`, 2022-07-11 15:52:54 normalization > this_data.`total_debit_amount`, 2022-07-11 15:52:54 normalization > this_data.`postprocessing_path`, 2022-07-11 15:52:54 normalization > this_data.`total_credit_amount`, 2022-07-11 15:52:54 normalization > this_data.`iat_entries_processed`, 2022-07-11 15:52:54 normalization > this_data.`std_entries_processed`, 2022-07-11 15:52:54 normalization > this_data.`_airbyte_ab_id`, 2022-07-11 15:52:54 normalization > this_data.`_airbyte_emitted_at`, 2022-07-11 15:52:54 normalization > this_data.`_airbyte_normalized_at` 2022-07-11 15:52:54 normalization > from `mainapi-282501`.raw_achilles.`files_in_scd` as this_data 2022-07-11 15:52:54 normalization > -- make a join 
with new_data using primary key to filter active data that need to be updated only 2022-07-11 15:52:54 normalization > join new_data_ids on this_data._airbyte_unique_key = new_data_ids._airbyte_unique_key 2022-07-11 15:52:54 normalization > -- force left join to NULL values (we just need to transfer column types only for the star_intersect macro on schema changes) 2022-07-11 15:52:54 normalization > left join empty_new_data as inc_data on this_data._airbyte_ab_id = inc_data._airbyte_ab_id 2022-07-11 15:52:54 normalization > where _airbyte_active_row = 1 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > input_data as ( 2022-07-11 15:52:54 normalization > select `_airbyte_files_in_hashid`, 2022-07-11 15:52:54 normalization > `id`, 2022-07-11 15:52:54 normalization > `ended`, 2022-07-11 15:52:54 normalization > `started`, 2022-07-11 15:52:54 normalization > `updated`, 2022-07-11 15:52:54 normalization > `file_hash`, 2022-07-11 15:52:54 normalization > `file_name`, 2022-07-11 15:52:54 normalization > `_ab_cdc_lsn`, 2022-07-11 15:52:54 normalization > `iat_entry_count`, 2022-07-11 15:52:54 normalization > `std_entry_count`, 2022-07-11 15:52:54 normalization > `total_batch_count`, 2022-07-11 15:52:54 normalization > `total_entry_count`, 2022-07-11 15:52:54 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:52:54 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:52:54 normalization > `preprocessing_path`, 2022-07-11 15:52:54 normalization > `total_debit_amount`, 2022-07-11 15:52:54 normalization > `postprocessing_path`, 2022-07-11 15:52:54 normalization > `total_credit_amount`, 2022-07-11 15:52:54 normalization > `iat_entries_processed`, 2022-07-11 15:52:54 normalization > `std_entries_processed`, 2022-07-11 15:52:54 normalization > `_airbyte_ab_id`, 2022-07-11 15:52:54 normalization > `_airbyte_emitted_at`, 2022-07-11 15:52:54 normalization > `_airbyte_normalized_at` from new_data 2022-07-11 15:52:54 normalization > union all 2022-07-11 15:52:54 normalization > select `_airbyte_files_in_hashid`, 2022-07-11 15:52:54 normalization > `id`, 2022-07-11 15:52:54 normalization > `ended`, 2022-07-11 15:52:54 normalization > `started`, 2022-07-11 15:52:54 normalization > `updated`, 2022-07-11 15:52:54 normalization > `file_hash`, 2022-07-11 15:52:54 normalization > `file_name`, 2022-07-11 15:52:54 normalization > `_ab_cdc_lsn`, 2022-07-11 15:52:54 normalization > `iat_entry_count`, 2022-07-11 15:52:54 normalization > `std_entry_count`, 2022-07-11 15:52:54 normalization > `total_batch_count`, 2022-07-11 15:52:54 normalization > `total_entry_count`, 2022-07-11 15:52:54 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:52:54 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:52:54 normalization > `preprocessing_path`, 2022-07-11 15:52:54 normalization > `total_debit_amount`, 2022-07-11 15:52:54 normalization > `postprocessing_path`, 2022-07-11 15:52:54 normalization > `total_credit_amount`, 2022-07-11 15:52:54 normalization > `iat_entries_processed`, 2022-07-11 15:52:54 normalization > `std_entries_processed`, 2022-07-11 15:52:54 normalization > `_airbyte_ab_id`, 2022-07-11 15:52:54 normalization > `_airbyte_emitted_at`, 2022-07-11 15:52:54 normalization > `_airbyte_normalized_at` from previous_active_scd_data 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > scd_data as ( 2022-07-11 15:52:54 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their 
primary key 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ))) as _airbyte_unique_key, 2022-07-11 15:52:54 normalization > id, 2022-07-11 15:52:54 normalization > ended, 2022-07-11 15:52:54 normalization > started, 2022-07-11 15:52:54 normalization > updated, 2022-07-11 15:52:54 normalization > file_hash, 2022-07-11 15:52:54 normalization > file_name, 2022-07-11 15:52:54 normalization > _ab_cdc_lsn, 2022-07-11 15:52:54 normalization > iat_entry_count, 2022-07-11 15:52:54 normalization > std_entry_count, 2022-07-11 15:52:54 normalization > total_batch_count, 2022-07-11 15:52:54 normalization > total_entry_count, 2022-07-11 15:52:54 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:54 normalization > preprocessing_path, 2022-07-11 15:52:54 normalization > total_debit_amount, 2022-07-11 15:52:54 normalization > postprocessing_path, 2022-07-11 15:52:54 normalization > total_credit_amount, 2022-07-11 15:52:54 normalization > iat_entries_processed, 2022-07-11 15:52:54 normalization > std_entries_processed, 2022-07-11 15:52:54 normalization > updated as _airbyte_start_at, 2022-07-11 15:52:54 normalization > lag(updated) over ( 2022-07-11 15:52:54 normalization > partition by cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > order by 2022-07-11 15:52:54 normalization > updated is null asc, 2022-07-11 15:52:54 normalization > updated desc, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:54 normalization > ) as _airbyte_end_at, 2022-07-11 15:52:54 normalization > case when row_number() over ( 2022-07-11 15:52:54 normalization > partition by cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > order by 2022-07-11 15:52:54 normalization > updated is null asc, 2022-07-11 15:52:54 normalization > updated desc, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:54 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:52:54 normalization > _airbyte_ab_id, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, 2022-07-11 15:52:54 normalization > _airbyte_files_in_hashid 2022-07-11 15:52:54 normalization > from input_data 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > dedup_data as ( 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:52:54 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:52:54 normalization > row_number() over ( 2022-07-11 15:52:54 normalization > partition by 2022-07-11 15:52:54 normalization > _airbyte_unique_key, 2022-07-11 15:52:54 normalization > _airbyte_start_at, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ) 
2022-07-11 15:52:54 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:52:54 normalization > ) as _airbyte_row_num, 2022-07-11 15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:52:54 normalization > scd_data.* 2022-07-11 15:52:54 normalization > from scd_data 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > _airbyte_unique_key, 2022-07-11 15:52:54 normalization > _airbyte_unique_key_scd, 2022-07-11 15:52:54 normalization > id, 2022-07-11 15:52:54 normalization > ended, 2022-07-11 15:52:54 normalization > started, 2022-07-11 15:52:54 normalization > updated, 2022-07-11 15:52:54 normalization > file_hash, 2022-07-11 15:52:54 normalization > file_name, 2022-07-11 15:52:54 normalization > _ab_cdc_lsn, 2022-07-11 15:52:54 normalization > iat_entry_count, 2022-07-11 15:52:54 normalization > std_entry_count, 2022-07-11 15:52:54 normalization > total_batch_count, 2022-07-11 15:52:54 normalization > total_entry_count, 2022-07-11 15:52:54 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:54 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:54 normalization > preprocessing_path, 2022-07-11 15:52:54 normalization > total_debit_amount, 2022-07-11 15:52:54 normalization > postprocessing_path, 2022-07-11 15:52:54 normalization > total_credit_amount, 2022-07-11 15:52:54 normalization > iat_entries_processed, 2022-07-11 15:52:54 normalization > std_entries_processed, 2022-07-11 15:52:54 normalization > _airbyte_start_at, 2022-07-11 15:52:54 normalization > _airbyte_end_at, 2022-07-11 15:52:54 normalization > _airbyte_active_row, 2022-07-11 15:52:54 normalization > _airbyte_ab_id, 2022-07-11 15:52:54 normalization > _airbyte_emitted_at, 2022-07-11 15:52:54 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:54 normalization > _airbyte_files_in_hashid 2022-07-11 15:52:54 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:52:54 normalization > ); 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 15:51:34.879182 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.881856 [debug] [Thread-1 ]: Began executing node model.airbyte_utils.transactions_out_scd 2022-07-11 15:52:54 normalization > 15:51:34.923989 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.partner_config_scd" 2022-07-11 15:52:54 normalization > 15:51:34.928095 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:52:54 normalization > 15:51:34.935006 [debug] [Thread-1 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": 
"model.airbyte_utils.transactions_out_scd"} */ 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > create or replace table `mainapi-282501`.raw_achilles.`transactions_out_scd__dbt_tmp` 2022-07-11 15:52:54 normalization > partition by range_bucket( 2022-07-11 15:52:54 normalization > _airbyte_active_row, 2022-07-11 15:52:54 normalization > generate_array(0, 1, 1) 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:52:54 normalization > OPTIONS( 2022-07-11 15:52:54 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:52:54 normalization > ) 2022-07-11 15:52:54 normalization > as ( 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > -- depends_on: ref('transactions_out_stg') 2022-07-11 15:52:54 normalization > with 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > new_data as ( 2022-07-11 15:52:54 normalization > -- retrieve incremental "new" data 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > * 2022-07-11 15:52:54 normalization > from `mainapi-282501`._airbyte_raw_achilles.`transactions_out_stg` 2022-07-11 15:52:54 normalization > -- transactions_out from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out 2022-07-11 15:52:54 normalization > where 1 = 1 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > and coalesce( 2022-07-11 15:52:54 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:52:54 normalization > timestamp 2022-07-11 15:52:54 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:52:54 normalization > timestamp 2022-07-11 15:52:54 normalization > )) from `mainapi-282501`.raw_achilles.`transactions_out_scd`), 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > true) 2022-07-11 15:52:54 normalization > 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > new_data_ids as ( 2022-07-11 15:52:54 normalization > -- build a subset of _airbyte_unique_key from rows that are new 2022-07-11 15:52:54 normalization > select distinct 2022-07-11 15:52:54 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ), '')) as 2022-07-11 15:52:54 normalization > string 2022-07-11 15:52:54 normalization > ))) as _airbyte_unique_key 2022-07-11 15:52:54 normalization > from new_data 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > empty_new_data as ( 2022-07-11 15:52:54 normalization > -- build an empty table to only keep the table's column types 2022-07-11 15:52:54 normalization > select * from new_data where 1 = 0 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > previous_active_scd_data as ( 2022-07-11 15:52:54 normalization > -- retrieve "incomplete old" data that needs to be updated with an end date because of new changes 2022-07-11 15:52:54 normalization > select 2022-07-11 15:52:54 normalization > this_data.`_airbyte_transactions_out_hashid`, 2022-07-11 15:52:54 normalization > this_data.`id`, 2022-07-11 15:52:54 normalization > this_data.`data`, 2022-07-11 15:52:54 normalization > 
this_data.`uuid`, 2022-07-11 15:52:54 normalization > this_data.`amount`, 2022-07-11 15:52:54 normalization > this_data.`status`, 2022-07-11 15:52:54 normalization > this_data.`bank_id`, 2022-07-11 15:52:54 normalization > this_data.`created`, 2022-07-11 15:52:54 normalization > this_data.`file_id`, 2022-07-11 15:52:54 normalization > this_data.`updated`, 2022-07-11 15:52:54 normalization > this_data.`trace_no`, 2022-07-11 15:52:54 normalization > this_data.`account_no`, 2022-07-11 15:52:54 normalization > this_data.`partner_id`, 2022-07-11 15:52:54 normalization > this_data.`_ab_cdc_lsn`, 2022-07-11 15:52:54 normalization > this_data.`description`, 2022-07-11 15:52:54 normalization > this_data.`external_id`, 2022-07-11 15:52:54 normalization > this_data.`is_same_day`, 2022-07-11 15:52:54 normalization > this_data.`return_data`, 2022-07-11 15:52:54 normalization > this_data.`account_name`, 2022-07-11 15:52:54 normalization > this_data.`effective_date`, 2022-07-11 15:52:54 normalization > this_data.`reference_info`, 2022-07-11 15:52:54 normalization > this_data.`transaction_code`, 2022-07-11 15:52:54 normalization > this_data.`source_account_no`, 2022-07-11 15:52:54 normalization > this_data.`transaction_in_id`, 2022-07-11 15:52:54 normalization > this_data.`_ab_cdc_deleted_at`, 2022-07-11 15:52:54 normalization > this_data.`_ab_cdc_updated_at`, 2022-07-11 15:52:54 normalization > this_data.`source_account_name`, 2022-07-11 15:52:54 normalization > this_data.`destination_bank_routing_no`, 2022-07-11 15:52:54 normalization > this_data.`_airbyte_ab_id`, 2022-07-11 15:52:54 normalization > this_data.`_airbyte_emitted_at`, 2022-07-11 15:52:54 normalization > this_data.`_airbyte_normalized_at` 2022-07-11 15:52:54 normalization > from `mainapi-282501`.raw_achilles.`transactions_out_scd` as this_data 2022-07-11 15:52:54 normalization > -- make a join with new_data using primary key to filter active data that need to be updated only 2022-07-11 15:52:54 normalization > join new_data_ids on this_data._airbyte_unique_key = new_data_ids._airbyte_unique_key 2022-07-11 15:52:54 normalization > -- force left join to NULL values (we just need to transfer column types only for the star_intersect macro on schema changes) 2022-07-11 15:52:54 normalization > left join empty_new_data as inc_data on this_data._airbyte_ab_id = inc_data._airbyte_ab_id 2022-07-11 15:52:54 normalization > where _airbyte_active_row = 1 2022-07-11 15:52:54 normalization > ), 2022-07-11 15:52:54 normalization > input_data as ( 2022-07-11 15:52:54 normalization > select `_airbyte_transactions_out_hashid`, 2022-07-11 15:52:55 normalization > `id`, 2022-07-11 15:52:55 normalization > `data`, 2022-07-11 15:52:55 normalization > `uuid`, 2022-07-11 15:52:55 normalization > `amount`, 2022-07-11 15:52:55 normalization > `status`, 2022-07-11 15:52:55 normalization > `bank_id`, 2022-07-11 15:52:55 normalization > `created`, 2022-07-11 15:52:55 normalization > `file_id`, 2022-07-11 15:52:55 normalization > `updated`, 2022-07-11 15:52:55 normalization > `trace_no`, 2022-07-11 15:52:55 normalization > `account_no`, 2022-07-11 15:52:55 normalization > `partner_id`, 2022-07-11 15:52:55 normalization > `_ab_cdc_lsn`, 2022-07-11 15:52:55 normalization > `description`, 2022-07-11 15:52:55 normalization > `external_id`, 2022-07-11 15:52:55 normalization > `is_same_day`, 2022-07-11 15:52:55 normalization > `return_data`, 2022-07-11 15:52:55 normalization > `account_name`, 2022-07-11 15:52:55 normalization > `effective_date`, 2022-07-11 15:52:55 
normalization > `reference_info`, 2022-07-11 15:52:55 normalization > `transaction_code`, 2022-07-11 15:52:55 normalization > `source_account_no`, 2022-07-11 15:52:55 normalization > `transaction_in_id`, 2022-07-11 15:52:55 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:52:55 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:52:55 normalization > `source_account_name`, 2022-07-11 15:52:55 normalization > `destination_bank_routing_no`, 2022-07-11 15:52:55 normalization > `_airbyte_ab_id`, 2022-07-11 15:52:55 normalization > `_airbyte_emitted_at`, 2022-07-11 15:52:55 normalization > `_airbyte_normalized_at` from new_data 2022-07-11 15:52:55 normalization > union all 2022-07-11 15:52:55 normalization > select `_airbyte_transactions_out_hashid`, 2022-07-11 15:52:55 normalization > `id`, 2022-07-11 15:52:55 normalization > `data`, 2022-07-11 15:52:55 normalization > `uuid`, 2022-07-11 15:52:55 normalization > `amount`, 2022-07-11 15:52:55 normalization > `status`, 2022-07-11 15:52:55 normalization > `bank_id`, 2022-07-11 15:52:55 normalization > `created`, 2022-07-11 15:52:55 normalization > `file_id`, 2022-07-11 15:52:55 normalization > `updated`, 2022-07-11 15:52:55 normalization > `trace_no`, 2022-07-11 15:52:55 normalization > `account_no`, 2022-07-11 15:52:55 normalization > `partner_id`, 2022-07-11 15:52:55 normalization > `_ab_cdc_lsn`, 2022-07-11 15:52:55 normalization > `description`, 2022-07-11 15:52:55 normalization > `external_id`, 2022-07-11 15:52:55 normalization > `is_same_day`, 2022-07-11 15:52:55 normalization > `return_data`, 2022-07-11 15:52:55 normalization > `account_name`, 2022-07-11 15:52:55 normalization > `effective_date`, 2022-07-11 15:52:55 normalization > `reference_info`, 2022-07-11 15:52:55 normalization > `transaction_code`, 2022-07-11 15:52:55 normalization > `source_account_no`, 2022-07-11 15:52:55 normalization > `transaction_in_id`, 2022-07-11 15:52:55 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:52:55 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:52:55 normalization > `source_account_name`, 2022-07-11 15:52:55 normalization > `destination_bank_routing_no`, 2022-07-11 15:52:55 normalization > `_airbyte_ab_id`, 2022-07-11 15:52:55 normalization > `_airbyte_emitted_at`, 2022-07-11 15:52:55 normalization > `_airbyte_normalized_at` from previous_active_scd_data 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > scd_data as ( 2022-07-11 15:52:55 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > to_hex(md5(cast(concat(coalesce(cast(id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), '')) as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ))) as _airbyte_unique_key, 2022-07-11 15:52:55 normalization > id, 2022-07-11 15:52:55 normalization > data, 2022-07-11 15:52:55 normalization > uuid, 2022-07-11 15:52:55 normalization > amount, 2022-07-11 15:52:55 normalization > status, 2022-07-11 15:52:55 normalization > bank_id, 2022-07-11 15:52:55 normalization > created, 2022-07-11 15:52:55 normalization > file_id, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > trace_no, 2022-07-11 15:52:55 normalization > account_no, 2022-07-11 15:52:55 normalization > partner_id, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 
normalization > description, 2022-07-11 15:52:55 normalization > external_id, 2022-07-11 15:52:55 normalization > is_same_day, 2022-07-11 15:52:55 normalization > return_data, 2022-07-11 15:52:55 normalization > account_name, 2022-07-11 15:52:55 normalization > effective_date, 2022-07-11 15:52:55 normalization > reference_info, 2022-07-11 15:52:55 normalization > transaction_code, 2022-07-11 15:52:55 normalization > source_account_no, 2022-07-11 15:52:55 normalization > transaction_in_id, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > source_account_name, 2022-07-11 15:52:55 normalization > destination_bank_routing_no, 2022-07-11 15:52:55 normalization > updated as _airbyte_start_at, 2022-07-11 15:52:55 normalization > lag(updated) over ( 2022-07-11 15:52:55 normalization > partition by cast(id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > order by 2022-07-11 15:52:55 normalization > updated is null asc, 2022-07-11 15:52:55 normalization > updated desc, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:55 normalization > ) as _airbyte_end_at, 2022-07-11 15:52:55 normalization > case when row_number() over ( 2022-07-11 15:52:55 normalization > partition by cast(id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > order by 2022-07-11 15:52:55 normalization > updated is null asc, 2022-07-11 15:52:55 normalization > updated desc, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:55 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > _airbyte_transactions_out_hashid 2022-07-11 15:52:55 normalization > from input_data 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > dedup_data as ( 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:52:55 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:52:55 normalization > row_number() over ( 2022-07-11 15:52:55 normalization > partition by 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > _airbyte_start_at, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:52:55 normalization > ) as _airbyte_row_num, 2022-07-11 15:52:55 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), 
'-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), '')) as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:52:55 normalization > scd_data.* 2022-07-11 15:52:55 normalization > from scd_data 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > _airbyte_unique_key_scd, 2022-07-11 15:52:55 normalization > id, 2022-07-11 15:52:55 normalization > data, 2022-07-11 15:52:55 normalization > uuid, 2022-07-11 15:52:55 normalization > amount, 2022-07-11 15:52:55 normalization > status, 2022-07-11 15:52:55 normalization > bank_id, 2022-07-11 15:52:55 normalization > created, 2022-07-11 15:52:55 normalization > file_id, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > trace_no, 2022-07-11 15:52:55 normalization > account_no, 2022-07-11 15:52:55 normalization > partner_id, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 normalization > description, 2022-07-11 15:52:55 normalization > external_id, 2022-07-11 15:52:55 normalization > is_same_day, 2022-07-11 15:52:55 normalization > return_data, 2022-07-11 15:52:55 normalization > account_name, 2022-07-11 15:52:55 normalization > effective_date, 2022-07-11 15:52:55 normalization > reference_info, 2022-07-11 15:52:55 normalization > transaction_code, 2022-07-11 15:52:55 normalization > source_account_no, 2022-07-11 15:52:55 normalization > transaction_in_id, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > source_account_name, 2022-07-11 15:52:55 normalization > destination_bank_routing_no, 2022-07-11 15:52:55 normalization > _airbyte_start_at, 2022-07-11 15:52:55 normalization > _airbyte_end_at, 2022-07-11 15:52:55 normalization > _airbyte_active_row, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:55 normalization > _airbyte_transactions_out_hashid 2022-07-11 15:52:55 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:52:55 normalization > ); 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:34.941846 [debug] [Thread-3 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > create or replace table `mainapi-282501`.raw_achilles.`bank_config_scd__dbt_tmp` 2022-07-11 15:52:55 normalization > partition by range_bucket( 2022-07-11 15:52:55 normalization > _airbyte_active_row, 2022-07-11 15:52:55 normalization > generate_array(0, 1, 1) 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:52:55 normalization > 
OPTIONS( 2022-07-11 15:52:55 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > as ( 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- depends_on: ref('bank_config_stg') 2022-07-11 15:52:55 normalization > with 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > new_data as ( 2022-07-11 15:52:55 normalization > -- retrieve incremental "new" data 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > * 2022-07-11 15:52:55 normalization > from `mainapi-282501`._airbyte_raw_achilles.`bank_config_stg` 2022-07-11 15:52:55 normalization > -- bank_config from `mainapi-282501`.raw_achilles._airbyte_raw_bank_config 2022-07-11 15:52:55 normalization > where 1 = 1 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from `mainapi-282501`.raw_achilles.`bank_config_scd`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > new_data_ids as ( 2022-07-11 15:52:55 normalization > -- build a subset of _airbyte_unique_key from rows that are new 2022-07-11 15:52:55 normalization > select distinct 2022-07-11 15:52:55 normalization > to_hex(md5(cast(concat(coalesce(cast(bank_id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), '')) as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ))) as _airbyte_unique_key 2022-07-11 15:52:55 normalization > from new_data 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > empty_new_data as ( 2022-07-11 15:52:55 normalization > -- build an empty table to only keep the table's column types 2022-07-11 15:52:55 normalization > select * from new_data where 1 = 0 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > previous_active_scd_data as ( 2022-07-11 15:52:55 normalization > -- retrieve "incomplete old" data that needs to be updated with an end date because of new changes 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > this_data.`_airbyte_bank_config_hashid`, 2022-07-11 15:52:55 normalization > this_data.`name`, 2022-07-11 15:52:55 normalization > this_data.`config`, 2022-07-11 15:52:55 normalization > this_data.`bank_id`, 2022-07-11 15:52:55 normalization > this_data.`created`, 2022-07-11 15:52:55 normalization > this_data.`updated`, 2022-07-11 15:52:55 normalization > this_data.`routing_no`, 2022-07-11 15:52:55 normalization > this_data.`_ab_cdc_lsn`, 2022-07-11 15:52:55 normalization > this_data.`_ab_cdc_deleted_at`, 2022-07-11 15:52:55 normalization > this_data.`_ab_cdc_updated_at`, 2022-07-11 15:52:55 normalization > this_data.`_airbyte_ab_id`, 2022-07-11 15:52:55 normalization > this_data.`_airbyte_emitted_at`, 2022-07-11 15:52:55 normalization > this_data.`_airbyte_normalized_at` 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`bank_config_scd` as this_data 2022-07-11 15:52:55 normalization > -- make a join with new_data using primary key to filter active 
data that need to be updated only 2022-07-11 15:52:55 normalization > join new_data_ids on this_data._airbyte_unique_key = new_data_ids._airbyte_unique_key 2022-07-11 15:52:55 normalization > -- force left join to NULL values (we just need to transfer column types only for the star_intersect macro on schema changes) 2022-07-11 15:52:55 normalization > left join empty_new_data as inc_data on this_data._airbyte_ab_id = inc_data._airbyte_ab_id 2022-07-11 15:52:55 normalization > where _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > input_data as ( 2022-07-11 15:52:55 normalization > select `_airbyte_bank_config_hashid`, 2022-07-11 15:52:55 normalization > `name`, 2022-07-11 15:52:55 normalization > `config`, 2022-07-11 15:52:55 normalization > `bank_id`, 2022-07-11 15:52:55 normalization > `created`, 2022-07-11 15:52:55 normalization > `updated`, 2022-07-11 15:52:55 normalization > `routing_no`, 2022-07-11 15:52:55 normalization > `_ab_cdc_lsn`, 2022-07-11 15:52:55 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:52:55 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:52:55 normalization > `_airbyte_ab_id`, 2022-07-11 15:52:55 normalization > `_airbyte_emitted_at`, 2022-07-11 15:52:55 normalization > `_airbyte_normalized_at` from new_data 2022-07-11 15:52:55 normalization > union all 2022-07-11 15:52:55 normalization > select `_airbyte_bank_config_hashid`, 2022-07-11 15:52:55 normalization > `name`, 2022-07-11 15:52:55 normalization > `config`, 2022-07-11 15:52:55 normalization > `bank_id`, 2022-07-11 15:52:55 normalization > `created`, 2022-07-11 15:52:55 normalization > `updated`, 2022-07-11 15:52:55 normalization > `routing_no`, 2022-07-11 15:52:55 normalization > `_ab_cdc_lsn`, 2022-07-11 15:52:55 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:52:55 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:52:55 normalization > `_airbyte_ab_id`, 2022-07-11 15:52:55 normalization > `_airbyte_emitted_at`, 2022-07-11 15:52:55 normalization > `_airbyte_normalized_at` from previous_active_scd_data 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > scd_data as ( 2022-07-11 15:52:55 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > to_hex(md5(cast(concat(coalesce(cast(bank_id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), '')) as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ))) as _airbyte_unique_key, 2022-07-11 15:52:55 normalization > name, 2022-07-11 15:52:55 normalization > config, 2022-07-11 15:52:55 normalization > bank_id, 2022-07-11 15:52:55 normalization > created, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > routing_no, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > updated as _airbyte_start_at, 2022-07-11 15:52:55 normalization > lag(updated) over ( 2022-07-11 15:52:55 normalization > partition by cast(bank_id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > order by 2022-07-11 15:52:55 normalization > updated is null asc, 2022-07-11 15:52:55 normalization > updated desc, 2022-07-11 
15:52:55 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:55 normalization > ) as _airbyte_end_at, 2022-07-11 15:52:55 normalization > case when row_number() over ( 2022-07-11 15:52:55 normalization > partition by cast(bank_id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > order by 2022-07-11 15:52:55 normalization > updated is null asc, 2022-07-11 15:52:55 normalization > updated desc, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:55 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > _airbyte_bank_config_hashid 2022-07-11 15:52:55 normalization > from input_data 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > dedup_data as ( 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:52:55 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:52:55 normalization > row_number() over ( 2022-07-11 15:52:55 normalization > partition by 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > _airbyte_start_at, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:52:55 normalization > ) as _airbyte_row_num, 2022-07-11 15:52:55 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), '')) as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ))) as _airbyte_unique_key_scd, 2022-07-11 15:52:55 normalization > scd_data.* 2022-07-11 15:52:55 normalization > from scd_data 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > _airbyte_unique_key_scd, 2022-07-11 15:52:55 normalization > name, 2022-07-11 15:52:55 normalization > config, 2022-07-11 15:52:55 normalization > bank_id, 2022-07-11 15:52:55 normalization > created, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > routing_no, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > _airbyte_start_at, 
2022-07-11 15:52:55 normalization > _airbyte_end_at, 2022-07-11 15:52:55 normalization > _airbyte_active_row, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:55 normalization > _airbyte_bank_config_hashid 2022-07-11 15:52:55 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:52:55 normalization > ); 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:34.942406 [debug] [Thread-5 ]: Began executing node model.airbyte_utils.partner_config_scd 2022-07-11 15:52:55 normalization > 15:51:35.034613 [debug] [Thread-5 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > create or replace table `mainapi-282501`.raw_achilles.`partner_config_scd__dbt_tmp` 2022-07-11 15:52:55 normalization > partition by range_bucket( 2022-07-11 15:52:55 normalization > _airbyte_active_row, 2022-07-11 15:52:55 normalization > generate_array(0, 1, 1) 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > cluster by _airbyte_unique_key_scd, _airbyte_emitted_at 2022-07-11 15:52:55 normalization > OPTIONS( 2022-07-11 15:52:55 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > as ( 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- depends_on: ref('partner_config_stg') 2022-07-11 15:52:55 normalization > with 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > new_data as ( 2022-07-11 15:52:55 normalization > -- retrieve incremental "new" data 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > * 2022-07-11 15:52:55 normalization > from `mainapi-282501`._airbyte_raw_achilles.`partner_config_stg` 2022-07-11 15:52:55 normalization > -- partner_config from `mainapi-282501`.raw_achilles._airbyte_raw_partner_config 2022-07-11 15:52:55 normalization > where 1 = 1 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from `mainapi-282501`.raw_achilles.`partner_config_scd`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > new_data_ids as ( 2022-07-11 15:52:55 normalization > -- build a subset of _airbyte_unique_key from rows that are new 2022-07-11 15:52:55 normalization > select distinct 2022-07-11 15:52:55 normalization > to_hex(md5(cast(concat(coalesce(cast(partner_id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), '')) as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 
normalization > ))) as _airbyte_unique_key
2022-07-11 15:52:55 normalization > from new_data
2022-07-11 15:52:55 normalization > ),
2022-07-11 15:52:55 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):161 - Completing future exceptionally...
io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:63) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:60) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    ... 3 more
    Suppressed: io.airbyte.workers.exception.WorkerException: Normalization process wasn't successful
        at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:162) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:48) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
2022-07-11 15:52:55 normalization > empty_new_data as (
2022-07-11 15:52:55 normalization > -- build an empty table to only keep the table's column types
2022-07-11 15:52:55 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-11 15:52:55 normalization > select * from new_data where 1 = 0
2022-07-11 15:52:55 normalization > ),
2022-07-11 15:52:55 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating...
2022-07-11 15:52:55 normalization > previous_active_scd_data as (
2022-07-11 15:52:55 normalization > -- retrieve "incomplete old" data that needs to be updated with an end date because of new changes
2022-07-11 15:52:55 normalization > select
2022-07-11 15:52:55 normalization > this_data.`_airbyte_partner_config_hashid`,
2022-07-11 15:52:55 normalization > this_data.`name`,
2022-07-11 15:52:55 normalization > this_data.`config`,
2022-07-11 15:52:55 normalization > this_data.`bank_id`,
2022-07-11 15:52:55 normalization > this_data.`created`,
2022-07-11 15:52:55 normalization > this_data.`updated`,
2022-07-11 15:52:55 normalization > this_data.`partner_id`,
2022-07-11 15:52:55 normalization > this_data.`routing_no`,
2022-07-11 15:52:55 normalization > this_data.`_ab_cdc_lsn`,
2022-07-11 15:52:55 normalization > this_data.`account_prefix`,
2022-07-11 15:52:55 normalization > this_data.`_ab_cdc_deleted_at`,
2022-07-11 15:52:55 WARN i.t.i.a.POJOActivityTaskHandler(activityFailureToResult):307 - Activity failure. ActivityId=5a671dcb-14c0-3011-a673-44205ea2aa80, activityType=Normalize, attempt=1
java.lang.RuntimeException: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:289) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.normalize(NormalizationActivityImpl.java:75) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at jdk.internal.reflect.GeneratedMethodAccessor386.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityInboundCallsInterceptor.execute(POJOActivityTaskHandler.java:214) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler$POJOActivityImplementation.execute(POJOActivityTaskHandler.java:180) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.activity.POJOActivityTaskHandler.handle(POJOActivityTaskHandler.java:120) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:204) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:164) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.8.1.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.temporal.serviceclient.CheckedExceptionWrapper.wrap(CheckedExceptionWrapper.java:56) ~[temporal-serviceclient-1.8.1.jar:?]
    at io.temporal.internal.sync.WorkflowInternal.wrap(WorkflowInternal.java:448) ~[temporal-sdk-1.8.1.jar:?]
    at io.temporal.activity.Activity.wrap(Activity.java:51) ~[temporal-sdk-1.8.1.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:138) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$3(NormalizationActivityImpl.java:103) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:284) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    ... 13 more
Caused by: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:132) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.sync.NormalizationActivityImpl.lambda$normalize$3(NormalizationActivityImpl.java:103) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalUtils.withBackgroundHeartbeat(TemporalUtils.java:284) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    ... 13 more
Caused by: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:63) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    ... 1 more
Caused by: io.airbyte.workers.exception.WorkerException: Normalization Failed.
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:60) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
    ... 1 more
    Suppressed: io.airbyte.workers.exception.WorkerException: Normalization process wasn't successful
        at io.airbyte.workers.normalization.DefaultNormalizationRunner.close(DefaultNormalizationRunner.java:162) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:48) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.general.DefaultNormalizationWorker.run(DefaultNormalizationWorker.java:21) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.32-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
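Note that the WorkerException above only says that the dbt process exited unsuccessfully; the underlying failure is the BigQuery error recorded further down in this log at 15:51:40 for model raw_achilles.transactions_in_scd: BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"'). A minimal destination-side diagnostic sketch follows; it assumes the bad value arrives through a raw table named `mainapi-282501`.raw_achilles.`_airbyte_raw_transactions_in` and sits in an `updated`/`id` field of the JSON payload (these names are inferred by analogy with the raw tables and columns mentioned elsewhere in this log, not confirmed by it):

    -- Sketch only: list raw records whose `updated` value cannot be parsed as a BigQuery TIMESTAMP
    -- (values such as "0000-12-30T00:00:00Z" fall outside the supported range and show up here).
    -- `_airbyte_raw_transactions_in`, `id`, and `updated` are assumed names; adjust to the actual stream.
    select
      json_extract_scalar(_airbyte_data, '$.id')      as id,
      json_extract_scalar(_airbyte_data, '$.updated') as updated_raw
    from `mainapi-282501`.raw_achilles.`_airbyte_raw_transactions_in`
    where json_extract_scalar(_airbyte_data, '$.updated') is not null
      and safe_cast(json_extract_scalar(_airbyte_data, '$.updated') as timestamp) is null
    limit 100;

Rows returned this way identify the records whose timestamp the destination cannot represent; other timestamp fields (for example `created`) can be checked the same way.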
2022-07-11 15:52:55 normalization > this_data.`_ab_cdc_updated_at`, 2022-07-11 15:52:55 normalization > this_data.`_airbyte_ab_id`, 2022-07-11 15:52:55 normalization > this_data.`_airbyte_emitted_at`, 2022-07-11 15:52:55 normalization > this_data.`_airbyte_normalized_at` 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`partner_config_scd` as this_data 2022-07-11 15:52:55 normalization > -- make a join with new_data using primary key to filter active data that need to be updated only 2022-07-11 15:52:55 normalization > join new_data_ids on this_data._airbyte_unique_key = new_data_ids._airbyte_unique_key 2022-07-11 15:52:55 normalization > -- force left join to NULL values (we just need to transfer column types only for the star_intersect macro on schema changes) 2022-07-11 15:52:55 normalization > left join empty_new_data as inc_data on this_data._airbyte_ab_id = inc_data._airbyte_ab_id 2022-07-11 15:52:55 normalization > where _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > input_data as ( 2022-07-11 15:52:55 normalization > select `_airbyte_partner_config_hashid`, 2022-07-11 15:52:55 normalization > `name`, 2022-07-11 15:52:55 normalization > `config`, 2022-07-11 15:52:55 normalization > `bank_id`, 2022-07-11 15:52:55 normalization > `created`, 2022-07-11 15:52:55 normalization > `updated`, 2022-07-11 15:52:55 normalization > `partner_id`, 2022-07-11 15:52:55 normalization > `routing_no`, 2022-07-11 15:52:55 normalization > `_ab_cdc_lsn`, 2022-07-11 15:52:55 normalization > `account_prefix`, 2022-07-11 15:52:55 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:52:55 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:52:55 normalization > `_airbyte_ab_id`, 2022-07-11 15:52:55 normalization > `_airbyte_emitted_at`, 2022-07-11 15:52:55 normalization > `_airbyte_normalized_at` from new_data 2022-07-11 15:52:55 normalization > union all 2022-07-11 15:52:55 normalization > select `_airbyte_partner_config_hashid`, 2022-07-11 15:52:55 normalization > `name`, 2022-07-11 15:52:55 normalization > `config`, 2022-07-11 15:52:55 normalization > `bank_id`, 2022-07-11 15:52:55 normalization > `created`, 2022-07-11 15:52:55 normalization > `updated`, 2022-07-11 15:52:55 normalization > `partner_id`, 2022-07-11 15:52:55 normalization > `routing_no`, 2022-07-11 15:52:55 normalization > `_ab_cdc_lsn`, 2022-07-11 15:52:55 normalization > `account_prefix`, 2022-07-11 15:52:55 normalization > `_ab_cdc_deleted_at`, 2022-07-11 15:52:55 normalization > `_ab_cdc_updated_at`, 2022-07-11 15:52:55 normalization > `_airbyte_ab_id`, 2022-07-11 15:52:55 normalization > `_airbyte_emitted_at`, 2022-07-11 15:52:55 normalization > `_airbyte_normalized_at` from previous_active_scd_data 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > scd_data as ( 2022-07-11 15:52:55 normalization > -- SQL model to build a Type 2 Slowly Changing Dimension (SCD) table for each record identified by their primary key 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > to_hex(md5(cast(concat(coalesce(cast(partner_id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), '')) as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ))) as _airbyte_unique_key, 2022-07-11 15:52:55 normalization > name, 2022-07-11 15:52:55 normalization > config, 2022-07-11 15:52:55 normalization > bank_id, 2022-07-11 15:52:55 normalization > 
created, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > partner_id, 2022-07-11 15:52:55 normalization > routing_no, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 normalization > account_prefix, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > updated as _airbyte_start_at, 2022-07-11 15:52:55 normalization > lag(updated) over ( 2022-07-11 15:52:55 normalization > partition by cast(partner_id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > order by 2022-07-11 15:52:55 normalization > updated is null asc, 2022-07-11 15:52:55 normalization > updated desc, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:55 normalization > ) as _airbyte_end_at, 2022-07-11 15:52:55 normalization > case when row_number() over ( 2022-07-11 15:52:55 normalization > partition by cast(partner_id as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > order by 2022-07-11 15:52:55 normalization > updated is null asc, 2022-07-11 15:52:55 normalization > updated desc, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at desc, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at desc 2022-07-11 15:52:55 normalization > ) = 1 and _ab_cdc_deleted_at is null then 1 else 0 end as _airbyte_active_row, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > _airbyte_partner_config_hashid 2022-07-11 15:52:55 normalization > from input_data 2022-07-11 15:52:55 normalization > ), 2022-07-11 15:52:55 normalization > dedup_data as ( 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > -- we need to ensure de-duplicated rows for merge/update queries 2022-07-11 15:52:55 normalization > -- additionally, we generate a unique key for the scd table 2022-07-11 15:52:55 normalization > row_number() over ( 2022-07-11 15:52:55 normalization > partition by 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > _airbyte_start_at, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, cast(_ab_cdc_deleted_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), cast(_ab_cdc_updated_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > order by _airbyte_active_row desc, _airbyte_ab_id 2022-07-11 15:52:55 normalization > ) as _airbyte_row_num, 2022-07-11 15:52:55 normalization > to_hex(md5(cast(concat(coalesce(cast(_airbyte_unique_key as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_airbyte_start_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_ab_cdc_deleted_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), ''), '-', coalesce(cast(_ab_cdc_updated_at as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ), '')) as 2022-07-11 15:52:55 normalization > string 2022-07-11 15:52:55 normalization > ))) as 
_airbyte_unique_key_scd, 2022-07-11 15:52:55 normalization > scd_data.* 2022-07-11 15:52:55 normalization > from scd_data 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > _airbyte_unique_key_scd, 2022-07-11 15:52:55 normalization > name, 2022-07-11 15:52:55 normalization > config, 2022-07-11 15:52:55 normalization > bank_id, 2022-07-11 15:52:55 normalization > created, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > partner_id, 2022-07-11 15:52:55 normalization > routing_no, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 normalization > account_prefix, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > _airbyte_start_at, 2022-07-11 15:52:55 normalization > _airbyte_end_at, 2022-07-11 15:52:55 normalization > _airbyte_active_row, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:55 normalization > _airbyte_partner_config_hashid 2022-07-11 15:52:55 normalization > from dedup_data where _airbyte_row_num = 1 2022-07-11 15:52:55 normalization > ); 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:35.715659 [debug] [Thread-2 ]: BigQuery adapter: Retry attempt 1 of 3 after error: BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"') 2022-07-11 15:52:55 normalization > 15:51:37.394519 [debug] [Thread-2 ]: BigQuery adapter: Retry attempt 2 of 3 after error: BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"') 2022-07-11 15:52:55 normalization > 15:51:37.908818 [debug] [Thread-7 ]: 2022-07-11 15:52:55 normalization > In `mainapi-282501`.`raw_achilles`.`files_in_scd`: 2022-07-11 15:52:55 normalization > Schema changed: False 2022-07-11 15:52:55 normalization > Source columns not in target: [] 2022-07-11 15:52:55 normalization > Target columns not in source: [] 2022-07-11 15:52:55 normalization > New column types: [] 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:37.938245 [debug] [Thread-7 ]: Writing runtime SQL for node "model.airbyte_utils.files_in_scd" 2022-07-11 15:52:55 normalization > 15:51:37.939283 [debug] [Thread-7 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > merge into `mainapi-282501`.raw_achilles.`files_in_scd` as DBT_INTERNAL_DEST 2022-07-11 15:52:55 normalization > using ( 2022-07-11 15:52:55 normalization > select * from `mainapi-282501`.raw_achilles.`files_in_scd__dbt_tmp` 2022-07-11 15:52:55 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:52:55 normalization > on 2022-07-11 15:52:55 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when matched then update set 
2022-07-11 15:52:55 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`_airbyte_unique_key_scd` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key_scd`,`id` = DBT_INTERNAL_SOURCE.`id`,`ended` = DBT_INTERNAL_SOURCE.`ended`,`started` = DBT_INTERNAL_SOURCE.`started`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`file_hash` = DBT_INTERNAL_SOURCE.`file_hash`,`file_name` = DBT_INTERNAL_SOURCE.`file_name`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`iat_entry_count` = DBT_INTERNAL_SOURCE.`iat_entry_count`,`std_entry_count` = DBT_INTERNAL_SOURCE.`std_entry_count`,`total_batch_count` = DBT_INTERNAL_SOURCE.`total_batch_count`,`total_entry_count` = DBT_INTERNAL_SOURCE.`total_entry_count`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`preprocessing_path` = DBT_INTERNAL_SOURCE.`preprocessing_path`,`total_debit_amount` = DBT_INTERNAL_SOURCE.`total_debit_amount`,`postprocessing_path` = DBT_INTERNAL_SOURCE.`postprocessing_path`,`total_credit_amount` = DBT_INTERNAL_SOURCE.`total_credit_amount`,`iat_entries_processed` = DBT_INTERNAL_SOURCE.`iat_entries_processed`,`std_entries_processed` = DBT_INTERNAL_SOURCE.`std_entries_processed`,`_airbyte_start_at` = DBT_INTERNAL_SOURCE.`_airbyte_start_at`,`_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,`_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_files_in_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_files_in_hashid` 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when not matched then insert 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `ended`, `started`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `iat_entry_count`, `std_entry_count`, `total_batch_count`, `total_entry_count`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `preprocessing_path`, `total_debit_amount`, `postprocessing_path`, `total_credit_amount`, `iat_entries_processed`, `std_entries_processed`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_in_hashid`) 2022-07-11 15:52:55 normalization > values 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `ended`, `started`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `iat_entry_count`, `std_entry_count`, `total_batch_count`, `total_entry_count`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `preprocessing_path`, `total_debit_amount`, `postprocessing_path`, `total_credit_amount`, `iat_entries_processed`, `std_entries_processed`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_in_hashid`) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:38.975721 [debug] [Thread-2 ]: BigQuery adapter: Retry attempt 3 of 3 after error: BadRequest('Invalid timestamp string "0000-12-30T00:00:00Z"') 2022-07-11 15:52:55 normalization > 15:51:39.393376 [debug] [Thread-5 ]: 2022-07-11 15:52:55 normalization > In `mainapi-282501`.`raw_achilles`.`partner_config_scd`: 2022-07-11 15:52:55 normalization > Schema 
changed: False 2022-07-11 15:52:55 normalization > Source columns not in target: [] 2022-07-11 15:52:55 normalization > Target columns not in source: [] 2022-07-11 15:52:55 normalization > New column types: [] 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:39.395601 [debug] [Thread-5 ]: Writing runtime SQL for node "model.airbyte_utils.partner_config_scd" 2022-07-11 15:52:55 normalization > 15:51:39.396210 [debug] [Thread-5 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > merge into `mainapi-282501`.raw_achilles.`partner_config_scd` as DBT_INTERNAL_DEST 2022-07-11 15:52:55 normalization > using ( 2022-07-11 15:52:55 normalization > select * from `mainapi-282501`.raw_achilles.`partner_config_scd__dbt_tmp` 2022-07-11 15:52:55 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:52:55 normalization > on 2022-07-11 15:52:55 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when matched then update set 2022-07-11 15:52:55 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`_airbyte_unique_key_scd` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key_scd`,`name` = DBT_INTERNAL_SOURCE.`name`,`config` = DBT_INTERNAL_SOURCE.`config`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`partner_id` = DBT_INTERNAL_SOURCE.`partner_id`,`routing_no` = DBT_INTERNAL_SOURCE.`routing_no`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`account_prefix` = DBT_INTERNAL_SOURCE.`account_prefix`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_start_at` = DBT_INTERNAL_SOURCE.`_airbyte_start_at`,`_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,`_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_partner_config_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_partner_config_hashid` 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when not matched then insert 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `name`, `config`, `bank_id`, `created`, `updated`, `partner_id`, `routing_no`, `_ab_cdc_lsn`, `account_prefix`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_partner_config_hashid`) 2022-07-11 15:52:55 normalization > values 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `name`, `config`, `bank_id`, `created`, `updated`, `partner_id`, `routing_no`, `_ab_cdc_lsn`, `account_prefix`, 
`_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_partner_config_hashid`) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:40.924030 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:51:40.925036 [debug] [Thread-2 ]: Database Error in model transactions_in_scd (models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql) 2022-07-11 15:52:55 normalization > Invalid timestamp string "0000-12-30T00:00:00Z" 2022-07-11 15:52:55 normalization > compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql 2022-07-11 15:52:55 normalization > 15:51:40.925610 [error] [Thread-2 ]: 12 of 18 ERROR creating incremental model raw_achilles.transactions_in_scd.............................................. [ERROR in 6.48s] 2022-07-11 15:52:55 normalization > 15:51:40.926140 [debug] [Thread-2 ]: Finished running node model.airbyte_utils.transactions_in_scd 2022-07-11 15:52:55 normalization > 15:51:40.927201 [debug] [Thread-4 ]: Began running node model.airbyte_utils.transactions_in 2022-07-11 15:52:55 normalization > 15:51:40.927526 [info ] [Thread-4 ]: 13 of 18 SKIP relation raw_achilles.transactions_in..................................................................... [SKIP] 2022-07-11 15:52:55 normalization > 15:51:40.928068 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.transactions_in 2022-07-11 15:52:55 normalization > 15:51:41.947048 [debug] [Thread-1 ]: 2022-07-11 15:52:55 normalization > In `mainapi-282501`.`raw_achilles`.`transactions_out_scd`: 2022-07-11 15:52:55 normalization > Schema changed: False 2022-07-11 15:52:55 normalization > Source columns not in target: [] 2022-07-11 15:52:55 normalization > Target columns not in source: [] 2022-07-11 15:52:55 normalization > New column types: [] 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:41.950106 [debug] [Thread-1 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_out_scd" 2022-07-11 15:52:55 normalization > 15:51:41.950790 [debug] [Thread-1 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > merge into `mainapi-282501`.raw_achilles.`transactions_out_scd` as DBT_INTERNAL_DEST 2022-07-11 15:52:55 normalization > using ( 2022-07-11 15:52:55 normalization > select * from `mainapi-282501`.raw_achilles.`transactions_out_scd__dbt_tmp` 2022-07-11 15:52:55 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:52:55 normalization > on 2022-07-11 15:52:55 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when matched then update set 2022-07-11 15:52:55 normalization > `_airbyte_unique_key` = 
DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`_airbyte_unique_key_scd` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key_scd`,`id` = DBT_INTERNAL_SOURCE.`id`,`data` = DBT_INTERNAL_SOURCE.`data`,`uuid` = DBT_INTERNAL_SOURCE.`uuid`,`amount` = DBT_INTERNAL_SOURCE.`amount`,`status` = DBT_INTERNAL_SOURCE.`status`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`file_id` = DBT_INTERNAL_SOURCE.`file_id`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`trace_no` = DBT_INTERNAL_SOURCE.`trace_no`,`account_no` = DBT_INTERNAL_SOURCE.`account_no`,`partner_id` = DBT_INTERNAL_SOURCE.`partner_id`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`description` = DBT_INTERNAL_SOURCE.`description`,`external_id` = DBT_INTERNAL_SOURCE.`external_id`,`is_same_day` = DBT_INTERNAL_SOURCE.`is_same_day`,`return_data` = DBT_INTERNAL_SOURCE.`return_data`,`account_name` = DBT_INTERNAL_SOURCE.`account_name`,`effective_date` = DBT_INTERNAL_SOURCE.`effective_date`,`reference_info` = DBT_INTERNAL_SOURCE.`reference_info`,`transaction_code` = DBT_INTERNAL_SOURCE.`transaction_code`,`source_account_no` = DBT_INTERNAL_SOURCE.`source_account_no`,`transaction_in_id` = DBT_INTERNAL_SOURCE.`transaction_in_id`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`source_account_name` = DBT_INTERNAL_SOURCE.`source_account_name`,`destination_bank_routing_no` = DBT_INTERNAL_SOURCE.`destination_bank_routing_no`,`_airbyte_start_at` = DBT_INTERNAL_SOURCE.`_airbyte_start_at`,`_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,`_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_transactions_out_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_transactions_out_hashid` 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when not matched then insert 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `data`, `uuid`, `amount`, `status`, `bank_id`, `created`, `file_id`, `updated`, `trace_no`, `account_no`, `partner_id`, `_ab_cdc_lsn`, `description`, `external_id`, `is_same_day`, `return_data`, `account_name`, `effective_date`, `reference_info`, `transaction_code`, `source_account_no`, `transaction_in_id`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `source_account_name`, `destination_bank_routing_no`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_transactions_out_hashid`) 2022-07-11 15:52:55 normalization > values 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `data`, `uuid`, `amount`, `status`, `bank_id`, `created`, `file_id`, `updated`, `trace_no`, `account_no`, `partner_id`, `_ab_cdc_lsn`, `description`, `external_id`, `is_same_day`, `return_data`, `account_name`, `effective_date`, `reference_info`, `transaction_code`, `source_account_no`, `transaction_in_id`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `source_account_name`, `destination_bank_routing_no`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_transactions_out_hashid`) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 
15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:42.567694 [debug] [Thread-5 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- Delete records which are no longer active: 2022-07-11 15:52:55 normalization > -- This query is equivalent, but the left join version is more performant: 2022-07-11 15:52:55 normalization > -- delete from final_table where unique_key in ( 2022-07-11 15:52:55 normalization > -- select unique_key from scd_table where 1 = 1 2022-07-11 15:52:55 normalization > -- ) and unique_key not in ( 2022-07-11 15:52:55 normalization > -- select unique_key from scd_table where active_row = 1 2022-07-11 15:52:55 normalization > -- ) 2022-07-11 15:52:55 normalization > -- We're incremental against normalized_at rather than emitted_at because we need to fetch the SCD 2022-07-11 15:52:55 normalization > -- entries that were _updated_ recently. This is because a deleted record will have an SCD record 2022-07-11 15:52:55 normalization > -- which was emitted a long time ago, but recently re-normalized to have active_row = 0. 2022-07-11 15:52:55 normalization > delete from `mainapi-282501`.`raw_achilles`.`partner_config` final_table where final_table._airbyte_unique_key in ( 2022-07-11 15:52:55 normalization > select recent_records.unique_key 2022-07-11 15:52:55 normalization > from ( 2022-07-11 15:52:55 normalization > select distinct _airbyte_unique_key as unique_key 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`partner_config_scd` 2022-07-11 15:52:55 normalization > where 1=1 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from raw_achilles.`partner_config`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ) recent_records 2022-07-11 15:52:55 normalization > left join ( 2022-07-11 15:52:55 normalization > select _airbyte_unique_key as unique_key, count(_airbyte_unique_key) as active_count 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`partner_config_scd` 2022-07-11 15:52:55 normalization > where _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from raw_achilles.`partner_config`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > group by _airbyte_unique_key 2022-07-11 15:52:55 normalization > ) active_counts 2022-07-11 15:52:55 normalization > on recent_records.unique_key = active_counts.unique_key 2022-07-11 15:52:55 normalization > where active_count is null or active_count = 0 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > 
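Because the Database Error reported above (`Invalid timestamp string "0000-12-30T00:00:00Z"` while building raw_achilles.transactions_in_scd) originates in the replicated data rather than in the generated SQL, it is also worth checking the upstream Postgres side. A rough source-side sketch follows; the `transactions_in` table and its `id`/`created`/`updated` columns are assumed here by analogy with the column lists in the generated models above, so adjust them to the real schema:

    -- Sketch only, intended for the upstream Postgres database rather than BigQuery.
    -- Finds rows whose timestamps predate 0001-01-01, which BigQuery cannot store.
    select id, created, updated
    from transactions_in
    where created < timestamp '0001-01-01 00:00:00'
       or updated < timestamp '0001-01-01 00:00:00'
    limit 100;

Any rows found can then be corrected or excluded at the source so that the incremental SCD models can complete.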
2022-07-11 15:52:55 normalization > 15:51:45.140334 [debug] [Thread-1 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- Delete records which are no longer active: 2022-07-11 15:52:55 normalization > -- This query is equivalent, but the left join version is more performant: 2022-07-11 15:52:55 normalization > -- delete from final_table where unique_key in ( 2022-07-11 15:52:55 normalization > -- select unique_key from scd_table where 1 = 1 2022-07-11 15:52:55 normalization > -- ) and unique_key not in ( 2022-07-11 15:52:55 normalization > -- select unique_key from scd_table where active_row = 1 2022-07-11 15:52:55 normalization > -- ) 2022-07-11 15:52:55 normalization > -- We're incremental against normalized_at rather than emitted_at because we need to fetch the SCD 2022-07-11 15:52:55 normalization > -- entries that were _updated_ recently. This is because a deleted record will have an SCD record 2022-07-11 15:52:55 normalization > -- which was emitted a long time ago, but recently re-normalized to have active_row = 0. 2022-07-11 15:52:55 normalization > delete from `mainapi-282501`.`raw_achilles`.`transactions_out` final_table where final_table._airbyte_unique_key in ( 2022-07-11 15:52:55 normalization > select recent_records.unique_key 2022-07-11 15:52:55 normalization > from ( 2022-07-11 15:52:55 normalization > select distinct _airbyte_unique_key as unique_key 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`transactions_out_scd` 2022-07-11 15:52:55 normalization > where 1=1 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from raw_achilles.`transactions_out`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ) recent_records 2022-07-11 15:52:55 normalization > left join ( 2022-07-11 15:52:55 normalization > select _airbyte_unique_key as unique_key, count(_airbyte_unique_key) as active_count 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`transactions_out_scd` 2022-07-11 15:52:55 normalization > where _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from raw_achilles.`transactions_out`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > group by _airbyte_unique_key 2022-07-11 15:52:55 normalization > ) active_counts 2022-07-11 15:52:55 normalization > on recent_records.unique_key = active_counts.unique_key 2022-07-11 15:52:55 normalization > where active_count is null or active_count = 0 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > 2022-07-11 
15:52:55 normalization > 15:51:45.911678 [debug] [Thread-5 ]: On model.airbyte_utils.partner_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > drop view _airbyte_raw_achilles.partner_config_stg 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:46.384944 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:51:46.386090 [info ] [Thread-5 ]: 11 of 18 OK created incremental model raw_achilles.partner_config_scd................................................... [MERGE (412.0 rows, 298.0 KB processed) in 12.09s] 2022-07-11 15:52:55 normalization > 15:51:46.386647 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.partner_config_scd 2022-07-11 15:52:55 normalization > 15:51:46.387979 [debug] [Thread-2 ]: Began running node model.airbyte_utils.partner_config 2022-07-11 15:52:55 normalization > 15:51:46.388768 [info ] [Thread-2 ]: 14 of 18 START incremental model raw_achilles.partner_config............................................................ [RUN] 2022-07-11 15:52:55 normalization > 15:51:46.390474 [debug] [Thread-2 ]: Acquiring new bigquery connection "model.airbyte_utils.partner_config" 2022-07-11 15:52:55 normalization > 15:51:46.390826 [debug] [Thread-2 ]: Began compiling node model.airbyte_utils.partner_config 2022-07-11 15:52:55 normalization > 15:51:46.391183 [debug] [Thread-2 ]: Compiling model.airbyte_utils.partner_config 2022-07-11 15:52:55 normalization > 15:51:46.413282 [debug] [Thread-2 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:55 normalization > 15:51:46.552915 [debug] [Thread-2 ]: Writing injected SQL for node "model.airbyte_utils.partner_config" 2022-07-11 15:52:55 normalization > 15:51:46.553732 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:51:46.553972 [debug] [Thread-2 ]: Began executing node model.airbyte_utils.partner_config 2022-07-11 15:52:55 normalization > 15:51:46.721606 [debug] [Thread-2 ]: On model.airbyte_utils.partner_config: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > create or replace table `mainapi-282501`.raw_achilles.`partner_config__dbt_tmp` 2022-07-11 15:52:55 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:52:55 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:52:55 normalization > OPTIONS( 2022-07-11 15:52:55 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > as ( 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- Final base SQL model 2022-07-11 15:52:55 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`partner_config_scd` 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > name, 2022-07-11 15:52:55 normalization > config, 2022-07-11 
15:52:55 normalization > bank_id, 2022-07-11 15:52:55 normalization > created, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > partner_id, 2022-07-11 15:52:55 normalization > routing_no, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 normalization > account_prefix, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:55 normalization > _airbyte_partner_config_hashid 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`partner_config_scd` 2022-07-11 15:52:55 normalization > -- partner_config from `mainapi-282501`.raw_achilles._airbyte_raw_partner_config 2022-07-11 15:52:55 normalization > where 1 = 1 2022-07-11 15:52:55 normalization > and _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from `mainapi-282501`.raw_achilles.`partner_config`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ); 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:47.089059 [debug] [Thread-6 ]: 2022-07-11 15:52:55 normalization > In `mainapi-282501`.`raw_achilles`.`files_out_scd`: 2022-07-11 15:52:55 normalization > Schema changed: False 2022-07-11 15:52:55 normalization > Source columns not in target: [] 2022-07-11 15:52:55 normalization > Target columns not in source: [] 2022-07-11 15:52:55 normalization > New column types: [] 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:47.091350 [debug] [Thread-6 ]: Writing runtime SQL for node "model.airbyte_utils.files_out_scd" 2022-07-11 15:52:55 normalization > 15:51:47.091943 [debug] [Thread-6 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > merge into `mainapi-282501`.raw_achilles.`files_out_scd` as DBT_INTERNAL_DEST 2022-07-11 15:52:55 normalization > using ( 2022-07-11 15:52:55 normalization > select * from `mainapi-282501`.raw_achilles.`files_out_scd__dbt_tmp` 2022-07-11 15:52:55 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:52:55 normalization > on 2022-07-11 15:52:55 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when matched then update set 2022-07-11 15:52:55 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`_airbyte_unique_key_scd` = 
DBT_INTERNAL_SOURCE.`_airbyte_unique_key_scd`,`id` = DBT_INTERNAL_SOURCE.`id`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`file_hash` = DBT_INTERNAL_SOURCE.`file_hash`,`file_name` = DBT_INTERNAL_SOURCE.`file_name`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`batch_count` = DBT_INTERNAL_SOURCE.`batch_count`,`exchange_window` = DBT_INTERNAL_SOURCE.`exchange_window`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_start_at` = DBT_INTERNAL_SOURCE.`_airbyte_start_at`,`_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,`_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_files_out_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_files_out_hashid` 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when not matched then insert 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `bank_id`, `created`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `batch_count`, `exchange_window`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_out_hashid`) 2022-07-11 15:52:55 normalization > values 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `id`, `bank_id`, `created`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `batch_count`, `exchange_window`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_out_hashid`) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:48.635991 [debug] [Thread-1 ]: On model.airbyte_utils.transactions_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > drop view _airbyte_raw_achilles.transactions_out_stg 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:49.684538 [debug] [Thread-2 ]: 2022-07-11 15:52:55 normalization > In `mainapi-282501`.`raw_achilles`.`partner_config`: 2022-07-11 15:52:55 normalization > Schema changed: False 2022-07-11 15:52:55 normalization > Source columns not in target: [] 2022-07-11 15:52:55 normalization > Target columns not in source: [] 2022-07-11 15:52:55 normalization > New column types: [] 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:49.687127 [debug] [Thread-2 ]: Writing runtime SQL for node "model.airbyte_utils.partner_config" 2022-07-11 15:52:55 normalization > 15:51:49.687722 [debug] [Thread-2 ]: On model.airbyte_utils.partner_config: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.partner_config"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 
normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > merge into `mainapi-282501`.raw_achilles.`partner_config` as DBT_INTERNAL_DEST 2022-07-11 15:52:55 normalization > using ( 2022-07-11 15:52:55 normalization > select * from `mainapi-282501`.raw_achilles.`partner_config__dbt_tmp` 2022-07-11 15:52:55 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:52:55 normalization > on 2022-07-11 15:52:55 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when matched then update set 2022-07-11 15:52:55 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`name` = DBT_INTERNAL_SOURCE.`name`,`config` = DBT_INTERNAL_SOURCE.`config`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`partner_id` = DBT_INTERNAL_SOURCE.`partner_id`,`routing_no` = DBT_INTERNAL_SOURCE.`routing_no`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`account_prefix` = DBT_INTERNAL_SOURCE.`account_prefix`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_partner_config_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_partner_config_hashid` 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when not matched then insert 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `name`, `config`, `bank_id`, `created`, `updated`, `partner_id`, `routing_no`, `_ab_cdc_lsn`, `account_prefix`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_partner_config_hashid`) 2022-07-11 15:52:55 normalization > values 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `name`, `config`, `bank_id`, `created`, `updated`, `partner_id`, `routing_no`, `_ab_cdc_lsn`, `account_prefix`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_partner_config_hashid`) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:50.349226 [debug] [Thread-1 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:51:50.351233 [info ] [Thread-1 ]: 9 of 18 OK created incremental model raw_achilles.transactions_out_scd.................................................. [MERGE (226.0 rows, 177.8 KB processed) in 16.15s] 2022-07-11 15:52:55 normalization > 15:51:50.351933 [debug] [Thread-1 ]: Finished running node model.airbyte_utils.transactions_out_scd 2022-07-11 15:52:55 normalization > 15:51:50.353229 [debug] [Thread-4 ]: Began running node model.airbyte_utils.transactions_out 2022-07-11 15:52:55 normalization > 15:51:50.354015 [info ] [Thread-4 ]: 15 of 18 START incremental model raw_achilles.transactions_out.......................................................... 
[RUN] 2022-07-11 15:52:55 normalization > 15:51:50.355851 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.transactions_out" 2022-07-11 15:52:55 normalization > 15:51:50.356299 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.transactions_out 2022-07-11 15:52:55 normalization > 15:51:50.356749 [debug] [Thread-4 ]: Compiling model.airbyte_utils.transactions_out 2022-07-11 15:52:55 normalization > 15:51:50.370971 [debug] [Thread-4 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:55 normalization > 15:51:50.486426 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.transactions_out" 2022-07-11 15:52:55 normalization > 15:51:50.487020 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:51:50.487279 [debug] [Thread-4 ]: Began executing node model.airbyte_utils.transactions_out 2022-07-11 15:52:55 normalization > 15:51:50.565598 [debug] [Thread-4 ]: On model.airbyte_utils.transactions_out: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > create or replace table `mainapi-282501`.raw_achilles.`transactions_out__dbt_tmp` 2022-07-11 15:52:55 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:52:55 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:52:55 normalization > OPTIONS( 2022-07-11 15:52:55 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > as ( 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- Final base SQL model 2022-07-11 15:52:55 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`transactions_out_scd` 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > id, 2022-07-11 15:52:55 normalization > data, 2022-07-11 15:52:55 normalization > uuid, 2022-07-11 15:52:55 normalization > amount, 2022-07-11 15:52:55 normalization > status, 2022-07-11 15:52:55 normalization > bank_id, 2022-07-11 15:52:55 normalization > created, 2022-07-11 15:52:55 normalization > file_id, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > trace_no, 2022-07-11 15:52:55 normalization > account_no, 2022-07-11 15:52:55 normalization > partner_id, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 normalization > description, 2022-07-11 15:52:55 normalization > external_id, 2022-07-11 15:52:55 normalization > is_same_day, 2022-07-11 15:52:55 normalization > return_data, 2022-07-11 15:52:55 normalization > account_name, 2022-07-11 15:52:55 normalization > effective_date, 2022-07-11 15:52:55 normalization > reference_info, 2022-07-11 15:52:55 normalization > transaction_code, 2022-07-11 15:52:55 normalization > source_account_no, 2022-07-11 15:52:55 normalization > transaction_in_id, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > source_account_name, 2022-07-11 15:52:55 normalization > 
destination_bank_routing_no, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:55 normalization > _airbyte_transactions_out_hashid 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`transactions_out_scd` 2022-07-11 15:52:55 normalization > -- transactions_out from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_out 2022-07-11 15:52:55 normalization > where 1 = 1 2022-07-11 15:52:55 normalization > and _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from `mainapi-282501`.raw_achilles.`transactions_out`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ); 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:52.328887 [debug] [Thread-2 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:51:52.330150 [info ] [Thread-2 ]: 14 of 18 OK created incremental model raw_achilles.partner_config....................................................... [MERGE (206.0 rows, 168.4 KB processed) in 5.94s] 2022-07-11 15:52:55 normalization > 15:51:52.330847 [debug] [Thread-2 ]: Finished running node model.airbyte_utils.partner_config 2022-07-11 15:52:55 normalization > 15:51:53.021833 [debug] [Thread-4 ]: 2022-07-11 15:52:55 normalization > In `mainapi-282501`.`raw_achilles`.`transactions_out`: 2022-07-11 15:52:55 normalization > Schema changed: False 2022-07-11 15:52:55 normalization > Source columns not in target: [] 2022-07-11 15:52:55 normalization > Target columns not in source: [] 2022-07-11 15:52:55 normalization > New column types: [] 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:53.024577 [debug] [Thread-4 ]: Writing runtime SQL for node "model.airbyte_utils.transactions_out" 2022-07-11 15:52:55 normalization > 15:51:53.025101 [debug] [Thread-4 ]: On model.airbyte_utils.transactions_out: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.transactions_out"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > merge into `mainapi-282501`.raw_achilles.`transactions_out` as DBT_INTERNAL_DEST 2022-07-11 15:52:55 normalization > using ( 2022-07-11 15:52:55 normalization > select * from `mainapi-282501`.raw_achilles.`transactions_out__dbt_tmp` 2022-07-11 15:52:55 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:52:55 normalization > on 2022-07-11 15:52:55 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when matched then update set 2022-07-11 15:52:55 normalization > 
`_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`id` = DBT_INTERNAL_SOURCE.`id`,`data` = DBT_INTERNAL_SOURCE.`data`,`uuid` = DBT_INTERNAL_SOURCE.`uuid`,`amount` = DBT_INTERNAL_SOURCE.`amount`,`status` = DBT_INTERNAL_SOURCE.`status`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`file_id` = DBT_INTERNAL_SOURCE.`file_id`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`trace_no` = DBT_INTERNAL_SOURCE.`trace_no`,`account_no` = DBT_INTERNAL_SOURCE.`account_no`,`partner_id` = DBT_INTERNAL_SOURCE.`partner_id`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`description` = DBT_INTERNAL_SOURCE.`description`,`external_id` = DBT_INTERNAL_SOURCE.`external_id`,`is_same_day` = DBT_INTERNAL_SOURCE.`is_same_day`,`return_data` = DBT_INTERNAL_SOURCE.`return_data`,`account_name` = DBT_INTERNAL_SOURCE.`account_name`,`effective_date` = DBT_INTERNAL_SOURCE.`effective_date`,`reference_info` = DBT_INTERNAL_SOURCE.`reference_info`,`transaction_code` = DBT_INTERNAL_SOURCE.`transaction_code`,`source_account_no` = DBT_INTERNAL_SOURCE.`source_account_no`,`transaction_in_id` = DBT_INTERNAL_SOURCE.`transaction_in_id`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`source_account_name` = DBT_INTERNAL_SOURCE.`source_account_name`,`destination_bank_routing_no` = DBT_INTERNAL_SOURCE.`destination_bank_routing_no`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_transactions_out_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_transactions_out_hashid` 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when not matched then insert 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `id`, `data`, `uuid`, `amount`, `status`, `bank_id`, `created`, `file_id`, `updated`, `trace_no`, `account_no`, `partner_id`, `_ab_cdc_lsn`, `description`, `external_id`, `is_same_day`, `return_data`, `account_name`, `effective_date`, `reference_info`, `transaction_code`, `source_account_no`, `transaction_in_id`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `source_account_name`, `destination_bank_routing_no`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_transactions_out_hashid`) 2022-07-11 15:52:55 normalization > values 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `id`, `data`, `uuid`, `amount`, `status`, `bank_id`, `created`, `file_id`, `updated`, `trace_no`, `account_no`, `partner_id`, `_ab_cdc_lsn`, `description`, `external_id`, `is_same_day`, `return_data`, `account_name`, `effective_date`, `reference_info`, `transaction_code`, `source_account_no`, `transaction_in_id`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `source_account_name`, `destination_bank_routing_no`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_transactions_out_hashid`) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:51:55.990940 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:51:55.992375 [info ] [Thread-4 ]: 15 of 18 OK created incremental model raw_achilles.transactions_out..................................................... 
[MERGE (113.0 rows, 101.9 KB processed) in 5.64s] 2022-07-11 15:52:55 normalization > 15:51:55.993102 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.transactions_out 2022-07-11 15:52:55 normalization > 15:51:57.284913 [debug] [Thread-7 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- Delete records which are no longer active: 2022-07-11 15:52:55 normalization > -- This query is equivalent, but the left join version is more performant: 2022-07-11 15:52:55 normalization > -- delete from final_table where unique_key in ( 2022-07-11 15:52:55 normalization > -- select unique_key from scd_table where 1 = 1 2022-07-11 15:52:55 normalization > -- ) and unique_key not in ( 2022-07-11 15:52:55 normalization > -- select unique_key from scd_table where active_row = 1 2022-07-11 15:52:55 normalization > -- ) 2022-07-11 15:52:55 normalization > -- We're incremental against normalized_at rather than emitted_at because we need to fetch the SCD 2022-07-11 15:52:55 normalization > -- entries that were _updated_ recently. This is because a deleted record will have an SCD record 2022-07-11 15:52:55 normalization > -- which was emitted a long time ago, but recently re-normalized to have active_row = 0. 2022-07-11 15:52:55 normalization > delete from `mainapi-282501`.`raw_achilles`.`files_in` final_table where final_table._airbyte_unique_key in ( 2022-07-11 15:52:55 normalization > select recent_records.unique_key 2022-07-11 15:52:55 normalization > from ( 2022-07-11 15:52:55 normalization > select distinct _airbyte_unique_key as unique_key 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`files_in_scd` 2022-07-11 15:52:55 normalization > where 1=1 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from raw_achilles.`files_in`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ) recent_records 2022-07-11 15:52:55 normalization > left join ( 2022-07-11 15:52:55 normalization > select _airbyte_unique_key as unique_key, count(_airbyte_unique_key) as active_count 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`files_in_scd` 2022-07-11 15:52:55 normalization > where _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from raw_achilles.`files_in`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > group by _airbyte_unique_key 2022-07-11 15:52:55 normalization > ) active_counts 2022-07-11 15:52:55 normalization > on recent_records.unique_key = active_counts.unique_key 2022-07-11 15:52:55 normalization > where 
active_count is null or active_count = 0 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:00.185010 [debug] [Thread-7 ]: On model.airbyte_utils.files_in_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > drop view _airbyte_raw_achilles.files_in_stg 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:00.718616 [debug] [Thread-7 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:52:00.719873 [info ] [Thread-7 ]: 8 of 18 OK created incremental model raw_achilles.files_in_scd.......................................................... [MERGE (72.0 rows, 47.9 KB processed) in 26.65s] 2022-07-11 15:52:55 normalization > 15:52:00.721063 [debug] [Thread-7 ]: Finished running node model.airbyte_utils.files_in_scd 2022-07-11 15:52:55 normalization > 15:52:00.722395 [debug] [Thread-8 ]: Began running node model.airbyte_utils.files_in 2022-07-11 15:52:55 normalization > 15:52:00.722797 [info ] [Thread-8 ]: 16 of 18 START incremental model raw_achilles.files_in.................................................................. [RUN] 2022-07-11 15:52:55 normalization > 15:52:00.724001 [debug] [Thread-8 ]: Acquiring new bigquery connection "model.airbyte_utils.files_in" 2022-07-11 15:52:55 normalization > 15:52:00.724303 [debug] [Thread-8 ]: Began compiling node model.airbyte_utils.files_in 2022-07-11 15:52:55 normalization > 15:52:00.724543 [debug] [Thread-8 ]: Compiling model.airbyte_utils.files_in 2022-07-11 15:52:55 normalization > 15:52:00.735854 [debug] [Thread-8 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:55 normalization > 15:52:00.838413 [debug] [Thread-8 ]: Writing injected SQL for node "model.airbyte_utils.files_in" 2022-07-11 15:52:55 normalization > 15:52:00.838975 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:52:00.839229 [debug] [Thread-8 ]: Began executing node model.airbyte_utils.files_in 2022-07-11 15:52:55 normalization > 15:52:00.920678 [debug] [Thread-8 ]: On model.airbyte_utils.files_in: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > create or replace table `mainapi-282501`.raw_achilles.`files_in__dbt_tmp` 2022-07-11 15:52:55 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:52:55 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:52:55 normalization > OPTIONS( 2022-07-11 15:52:55 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > as ( 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- Final base SQL model 2022-07-11 15:52:55 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`files_in_scd` 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > id, 2022-07-11 15:52:55 
normalization > ended, 2022-07-11 15:52:55 normalization > started, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > file_hash, 2022-07-11 15:52:55 normalization > file_name, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 normalization > iat_entry_count, 2022-07-11 15:52:55 normalization > std_entry_count, 2022-07-11 15:52:55 normalization > total_batch_count, 2022-07-11 15:52:55 normalization > total_entry_count, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > preprocessing_path, 2022-07-11 15:52:55 normalization > total_debit_amount, 2022-07-11 15:52:55 normalization > postprocessing_path, 2022-07-11 15:52:55 normalization > total_credit_amount, 2022-07-11 15:52:55 normalization > iat_entries_processed, 2022-07-11 15:52:55 normalization > std_entries_processed, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:55 normalization > _airbyte_files_in_hashid 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`files_in_scd` 2022-07-11 15:52:55 normalization > -- files_in from `mainapi-282501`.raw_achilles._airbyte_raw_files_in 2022-07-11 15:52:55 normalization > where 1 = 1 2022-07-11 15:52:55 normalization > and _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from `mainapi-282501`.raw_achilles.`files_in`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ); 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:01.420197 [debug] [Thread-3 ]: 2022-07-11 15:52:55 normalization > In `mainapi-282501`.`raw_achilles`.`bank_config_scd`: 2022-07-11 15:52:55 normalization > Schema changed: False 2022-07-11 15:52:55 normalization > Source columns not in target: [] 2022-07-11 15:52:55 normalization > Target columns not in source: [] 2022-07-11 15:52:55 normalization > New column types: [] 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:01.422415 [debug] [Thread-3 ]: Writing runtime SQL for node "model.airbyte_utils.bank_config_scd" 2022-07-11 15:52:55 normalization > 15:52:01.423042 [debug] [Thread-3 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > merge into `mainapi-282501`.raw_achilles.`bank_config_scd` as DBT_INTERNAL_DEST 2022-07-11 15:52:55 normalization > using ( 2022-07-11 15:52:55 normalization > select * from `mainapi-282501`.raw_achilles.`bank_config_scd__dbt_tmp` 2022-07-11 15:52:55 normalization > ) as DBT_INTERNAL_SOURCE 
2022-07-11 15:52:55 normalization > on 2022-07-11 15:52:55 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key_scd = DBT_INTERNAL_DEST._airbyte_unique_key_scd 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when matched then update set 2022-07-11 15:52:55 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`_airbyte_unique_key_scd` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key_scd`,`name` = DBT_INTERNAL_SOURCE.`name`,`config` = DBT_INTERNAL_SOURCE.`config`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`routing_no` = DBT_INTERNAL_SOURCE.`routing_no`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_start_at` = DBT_INTERNAL_SOURCE.`_airbyte_start_at`,`_airbyte_end_at` = DBT_INTERNAL_SOURCE.`_airbyte_end_at`,`_airbyte_active_row` = DBT_INTERNAL_SOURCE.`_airbyte_active_row`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_bank_config_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_bank_config_hashid` 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when not matched then insert 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `name`, `config`, `bank_id`, `created`, `updated`, `routing_no`, `_ab_cdc_lsn`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_bank_config_hashid`) 2022-07-11 15:52:55 normalization > values 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `_airbyte_unique_key_scd`, `name`, `config`, `bank_id`, `created`, `updated`, `routing_no`, `_ab_cdc_lsn`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_start_at`, `_airbyte_end_at`, `_airbyte_active_row`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_bank_config_hashid`) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:03.899658 [debug] [Thread-8 ]: 2022-07-11 15:52:55 normalization > In `mainapi-282501`.`raw_achilles`.`files_in`: 2022-07-11 15:52:55 normalization > Schema changed: False 2022-07-11 15:52:55 normalization > Source columns not in target: [] 2022-07-11 15:52:55 normalization > Target columns not in source: [] 2022-07-11 15:52:55 normalization > New column types: [] 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:03.902107 [debug] [Thread-8 ]: Writing runtime SQL for node "model.airbyte_utils.files_in" 2022-07-11 15:52:55 normalization > 15:52:03.902722 [debug] [Thread-8 ]: On model.airbyte_utils.files_in: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_in"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > merge 
into `mainapi-282501`.raw_achilles.`files_in` as DBT_INTERNAL_DEST 2022-07-11 15:52:55 normalization > using ( 2022-07-11 15:52:55 normalization > select * from `mainapi-282501`.raw_achilles.`files_in__dbt_tmp` 2022-07-11 15:52:55 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:52:55 normalization > on 2022-07-11 15:52:55 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when matched then update set 2022-07-11 15:52:55 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`id` = DBT_INTERNAL_SOURCE.`id`,`ended` = DBT_INTERNAL_SOURCE.`ended`,`started` = DBT_INTERNAL_SOURCE.`started`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`file_hash` = DBT_INTERNAL_SOURCE.`file_hash`,`file_name` = DBT_INTERNAL_SOURCE.`file_name`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`iat_entry_count` = DBT_INTERNAL_SOURCE.`iat_entry_count`,`std_entry_count` = DBT_INTERNAL_SOURCE.`std_entry_count`,`total_batch_count` = DBT_INTERNAL_SOURCE.`total_batch_count`,`total_entry_count` = DBT_INTERNAL_SOURCE.`total_entry_count`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`preprocessing_path` = DBT_INTERNAL_SOURCE.`preprocessing_path`,`total_debit_amount` = DBT_INTERNAL_SOURCE.`total_debit_amount`,`postprocessing_path` = DBT_INTERNAL_SOURCE.`postprocessing_path`,`total_credit_amount` = DBT_INTERNAL_SOURCE.`total_credit_amount`,`iat_entries_processed` = DBT_INTERNAL_SOURCE.`iat_entries_processed`,`std_entries_processed` = DBT_INTERNAL_SOURCE.`std_entries_processed`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_files_in_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_files_in_hashid` 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when not matched then insert 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `id`, `ended`, `started`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `iat_entry_count`, `std_entry_count`, `total_batch_count`, `total_entry_count`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `preprocessing_path`, `total_debit_amount`, `postprocessing_path`, `total_credit_amount`, `iat_entries_processed`, `std_entries_processed`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_in_hashid`) 2022-07-11 15:52:55 normalization > values 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `id`, `ended`, `started`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `iat_entry_count`, `std_entry_count`, `total_batch_count`, `total_entry_count`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `preprocessing_path`, `total_debit_amount`, `postprocessing_path`, `total_credit_amount`, `iat_entries_processed`, `std_entries_processed`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_in_hashid`) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:05.837359 [debug] [Thread-3 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": 
"model.airbyte_utils.bank_config_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- Delete records which are no longer active: 2022-07-11 15:52:55 normalization > -- This query is equivalent, but the left join version is more performant: 2022-07-11 15:52:55 normalization > -- delete from final_table where unique_key in ( 2022-07-11 15:52:55 normalization > -- select unique_key from scd_table where 1 = 1 2022-07-11 15:52:55 normalization > -- ) and unique_key not in ( 2022-07-11 15:52:55 normalization > -- select unique_key from scd_table where active_row = 1 2022-07-11 15:52:55 normalization > -- ) 2022-07-11 15:52:55 normalization > -- We're incremental against normalized_at rather than emitted_at because we need to fetch the SCD 2022-07-11 15:52:55 normalization > -- entries that were _updated_ recently. This is because a deleted record will have an SCD record 2022-07-11 15:52:55 normalization > -- which was emitted a long time ago, but recently re-normalized to have active_row = 0. 2022-07-11 15:52:55 normalization > delete from `mainapi-282501`.`raw_achilles`.`bank_config` final_table where final_table._airbyte_unique_key in ( 2022-07-11 15:52:55 normalization > select recent_records.unique_key 2022-07-11 15:52:55 normalization > from ( 2022-07-11 15:52:55 normalization > select distinct _airbyte_unique_key as unique_key 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`bank_config_scd` 2022-07-11 15:52:55 normalization > where 1=1 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from raw_achilles.`bank_config`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ) recent_records 2022-07-11 15:52:55 normalization > left join ( 2022-07-11 15:52:55 normalization > select _airbyte_unique_key as unique_key, count(_airbyte_unique_key) as active_count 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`bank_config_scd` 2022-07-11 15:52:55 normalization > where _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from raw_achilles.`bank_config`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > group by _airbyte_unique_key 2022-07-11 15:52:55 normalization > ) active_counts 2022-07-11 15:52:55 normalization > on recent_records.unique_key = active_counts.unique_key 2022-07-11 15:52:55 normalization > where active_count is null or active_count = 0 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:06.450266 [debug] [Thread-8 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:52:06.451765 [info ] [Thread-8 ]: 16 of 18 OK created incremental model 
raw_achilles.files_in............................................................. [MERGE (36.0 rows, 26.6 KB processed) in 5.73s] 2022-07-11 15:52:55 normalization > 15:52:06.452453 [debug] [Thread-8 ]: Finished running node model.airbyte_utils.files_in 2022-07-11 15:52:55 normalization > 15:52:09.061233 [debug] [Thread-3 ]: On model.airbyte_utils.bank_config_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > drop view _airbyte_raw_achilles.bank_config_stg 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:09.659421 [debug] [Thread-3 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:52:09.660621 [info ] [Thread-3 ]: 10 of 18 OK created incremental model raw_achilles.bank_config_scd...................................................... [MERGE (6.0 rows, 5.2 KB processed) in 35.41s] 2022-07-11 15:52:55 normalization > 15:52:09.661124 [debug] [Thread-3 ]: Finished running node model.airbyte_utils.bank_config_scd 2022-07-11 15:52:55 normalization > 15:52:09.662298 [debug] [Thread-5 ]: Began running node model.airbyte_utils.bank_config 2022-07-11 15:52:55 normalization > 15:52:09.662716 [info ] [Thread-5 ]: 17 of 18 START incremental model raw_achilles.bank_config............................................................... [RUN] 2022-07-11 15:52:55 normalization > 15:52:09.663963 [debug] [Thread-5 ]: Acquiring new bigquery connection "model.airbyte_utils.bank_config" 2022-07-11 15:52:55 normalization > 15:52:09.664232 [debug] [Thread-5 ]: Began compiling node model.airbyte_utils.bank_config 2022-07-11 15:52:55 normalization > 15:52:09.664488 [debug] [Thread-5 ]: Compiling model.airbyte_utils.bank_config 2022-07-11 15:52:55 normalization > 15:52:09.678073 [debug] [Thread-5 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:55 normalization > 15:52:09.764249 [debug] [Thread-5 ]: Writing injected SQL for node "model.airbyte_utils.bank_config" 2022-07-11 15:52:55 normalization > 15:52:09.764853 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:52:09.765087 [debug] [Thread-5 ]: Began executing node model.airbyte_utils.bank_config 2022-07-11 15:52:55 normalization > 15:52:09.826444 [debug] [Thread-5 ]: On model.airbyte_utils.bank_config: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > create or replace table `mainapi-282501`.raw_achilles.`bank_config__dbt_tmp` 2022-07-11 15:52:55 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:52:55 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:52:55 normalization > OPTIONS( 2022-07-11 15:52:55 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > as ( 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- Final base SQL model 2022-07-11 15:52:55 normalization > -- depends_on: 
`mainapi-282501`.raw_achilles.`bank_config_scd` 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > name, 2022-07-11 15:52:55 normalization > config, 2022-07-11 15:52:55 normalization > bank_id, 2022-07-11 15:52:55 normalization > created, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > routing_no, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:55 normalization > _airbyte_bank_config_hashid 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`bank_config_scd` 2022-07-11 15:52:55 normalization > -- bank_config from `mainapi-282501`.raw_achilles._airbyte_raw_bank_config 2022-07-11 15:52:55 normalization > where 1 = 1 2022-07-11 15:52:55 normalization > and _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from `mainapi-282501`.raw_achilles.`bank_config`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ); 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:13.115130 [debug] [Thread-5 ]: 2022-07-11 15:52:55 normalization > In `mainapi-282501`.`raw_achilles`.`bank_config`: 2022-07-11 15:52:55 normalization > Schema changed: False 2022-07-11 15:52:55 normalization > Source columns not in target: [] 2022-07-11 15:52:55 normalization > Target columns not in source: [] 2022-07-11 15:52:55 normalization > New column types: [] 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:13.117776 [debug] [Thread-5 ]: Writing runtime SQL for node "model.airbyte_utils.bank_config" 2022-07-11 15:52:55 normalization > 15:52:13.118443 [debug] [Thread-5 ]: On model.airbyte_utils.bank_config: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.bank_config"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > merge into `mainapi-282501`.raw_achilles.`bank_config` as DBT_INTERNAL_DEST 2022-07-11 15:52:55 normalization > using ( 2022-07-11 15:52:55 normalization > select * from `mainapi-282501`.raw_achilles.`bank_config__dbt_tmp` 2022-07-11 15:52:55 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:52:55 normalization > on 2022-07-11 15:52:55 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when matched then update set 2022-07-11 15:52:55 normalization 
> `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`name` = DBT_INTERNAL_SOURCE.`name`,`config` = DBT_INTERNAL_SOURCE.`config`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`routing_no` = DBT_INTERNAL_SOURCE.`routing_no`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_bank_config_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_bank_config_hashid` 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when not matched then insert 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `name`, `config`, `bank_id`, `created`, `updated`, `routing_no`, `_ab_cdc_lsn`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_bank_config_hashid`) 2022-07-11 15:52:55 normalization > values 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `name`, `config`, `bank_id`, `created`, `updated`, `routing_no`, `_ab_cdc_lsn`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_bank_config_hashid`) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:16.002501 [debug] [Thread-5 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:52:16.003796 [info ] [Thread-5 ]: 17 of 18 OK created incremental model raw_achilles.bank_config.......................................................... [MERGE (3.0 rows, 3.1 KB processed) in 6.34s] 2022-07-11 15:52:55 normalization > 15:52:16.004409 [debug] [Thread-5 ]: Finished running node model.airbyte_utils.bank_config 2022-07-11 15:52:55 normalization > 15:52:35.157971 [debug] [Thread-6 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- Delete records which are no longer active: 2022-07-11 15:52:55 normalization > -- This query is equivalent, but the left join version is more performant: 2022-07-11 15:52:55 normalization > -- delete from final_table where unique_key in ( 2022-07-11 15:52:55 normalization > -- select unique_key from scd_table where 1 = 1 2022-07-11 15:52:55 normalization > -- ) and unique_key not in ( 2022-07-11 15:52:55 normalization > -- select unique_key from scd_table where active_row = 1 2022-07-11 15:52:55 normalization > -- ) 2022-07-11 15:52:55 normalization > -- We're incremental against normalized_at rather than emitted_at because we need to fetch the SCD 2022-07-11 15:52:55 normalization > -- entries that were _updated_ recently. This is because a deleted record will have an SCD record 2022-07-11 15:52:55 normalization > -- which was emitted a long time ago, but recently re-normalized to have active_row = 0. 
2022-07-11 15:52:55 normalization > delete from `mainapi-282501`.`raw_achilles`.`files_out` final_table where final_table._airbyte_unique_key in ( 2022-07-11 15:52:55 normalization > select recent_records.unique_key 2022-07-11 15:52:55 normalization > from ( 2022-07-11 15:52:55 normalization > select distinct _airbyte_unique_key as unique_key 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`files_out_scd` 2022-07-11 15:52:55 normalization > where 1=1 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from raw_achilles.`files_out`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ) recent_records 2022-07-11 15:52:55 normalization > left join ( 2022-07-11 15:52:55 normalization > select _airbyte_unique_key as unique_key, count(_airbyte_unique_key) as active_count 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`files_out_scd` 2022-07-11 15:52:55 normalization > where _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_normalized_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from raw_achilles.`files_out`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > group by _airbyte_unique_key 2022-07-11 15:52:55 normalization > ) active_counts 2022-07-11 15:52:55 normalization > on recent_records.unique_key = active_counts.unique_key 2022-07-11 15:52:55 normalization > where active_count is null or active_count = 0 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:38.823223 [debug] [Thread-6 ]: On model.airbyte_utils.files_out_scd: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out_scd"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > drop view _airbyte_raw_achilles.files_out_stg 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:39.310684 [debug] [Thread-6 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:52:39.311815 [info ] [Thread-6 ]: 7 of 18 OK created incremental model raw_achilles.files_out_scd......................................................... [MERGE (68.0 rows, 37.8 KB processed) in 65.25s] 2022-07-11 15:52:55 normalization > 15:52:39.312443 [debug] [Thread-6 ]: Finished running node model.airbyte_utils.files_out_scd 2022-07-11 15:52:55 normalization > 15:52:39.313818 [debug] [Thread-4 ]: Began running node model.airbyte_utils.files_out 2022-07-11 15:52:55 normalization > 15:52:39.314476 [info ] [Thread-4 ]: 18 of 18 START incremental model raw_achilles.files_out................................................................. 
[RUN] 2022-07-11 15:52:55 normalization > 15:52:39.316038 [debug] [Thread-4 ]: Acquiring new bigquery connection "model.airbyte_utils.files_out" 2022-07-11 15:52:55 normalization > 15:52:39.316278 [debug] [Thread-4 ]: Began compiling node model.airbyte_utils.files_out 2022-07-11 15:52:55 normalization > 15:52:39.316504 [debug] [Thread-4 ]: Compiling model.airbyte_utils.files_out 2022-07-11 15:52:55 normalization > 15:52:39.328599 [debug] [Thread-4 ]: Opening a new connection, currently in state closed 2022-07-11 15:52:55 normalization > 15:52:39.485587 [debug] [Thread-4 ]: Writing injected SQL for node "model.airbyte_utils.files_out" 2022-07-11 15:52:55 normalization > 15:52:39.486382 [debug] [Thread-4 ]: finished collecting timing info 2022-07-11 15:52:55 normalization > 15:52:39.486634 [debug] [Thread-4 ]: Began executing node model.airbyte_utils.files_out 2022-07-11 15:52:55 normalization > 15:52:39.540855 [debug] [Thread-4 ]: On model.airbyte_utils.files_out: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > create or replace table `mainapi-282501`.raw_achilles.`files_out__dbt_tmp` 2022-07-11 15:52:55 normalization > partition by timestamp_trunc(_airbyte_emitted_at, day) 2022-07-11 15:52:55 normalization > cluster by _airbyte_unique_key, _airbyte_emitted_at 2022-07-11 15:52:55 normalization > OPTIONS( 2022-07-11 15:52:55 normalization > expiration_timestamp=TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 12 hour) 2022-07-11 15:52:55 normalization > ) 2022-07-11 15:52:55 normalization > as ( 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > -- Final base SQL model 2022-07-11 15:52:55 normalization > -- depends_on: `mainapi-282501`.raw_achilles.`files_out_scd` 2022-07-11 15:52:55 normalization > select 2022-07-11 15:52:55 normalization > _airbyte_unique_key, 2022-07-11 15:52:55 normalization > id, 2022-07-11 15:52:55 normalization > bank_id, 2022-07-11 15:52:55 normalization > created, 2022-07-11 15:52:55 normalization > updated, 2022-07-11 15:52:55 normalization > file_hash, 2022-07-11 15:52:55 normalization > file_name, 2022-07-11 15:52:55 normalization > _ab_cdc_lsn, 2022-07-11 15:52:55 normalization > batch_count, 2022-07-11 15:52:55 normalization > exchange_window, 2022-07-11 15:52:55 normalization > _ab_cdc_deleted_at, 2022-07-11 15:52:55 normalization > _ab_cdc_updated_at, 2022-07-11 15:52:55 normalization > _airbyte_ab_id, 2022-07-11 15:52:55 normalization > _airbyte_emitted_at, 2022-07-11 15:52:55 normalization > CURRENT_TIMESTAMP() as _airbyte_normalized_at, 2022-07-11 15:52:55 normalization > _airbyte_files_out_hashid 2022-07-11 15:52:55 normalization > from `mainapi-282501`.raw_achilles.`files_out_scd` 2022-07-11 15:52:55 normalization > -- files_out from `mainapi-282501`.raw_achilles._airbyte_raw_files_out 2022-07-11 15:52:55 normalization > where 1 = 1 2022-07-11 15:52:55 normalization > and _airbyte_active_row = 1 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > and coalesce( 2022-07-11 15:52:55 normalization > cast(_airbyte_emitted_at as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > ) >= (select max(cast(_airbyte_emitted_at 
as 2022-07-11 15:52:55 normalization > timestamp 2022-07-11 15:52:55 normalization > )) from `mainapi-282501`.raw_achilles.`files_out`), 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > true) 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > ); 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:46.268981 [debug] [Thread-4 ]: 2022-07-11 15:52:55 normalization > In `mainapi-282501`.`raw_achilles`.`files_out`: 2022-07-11 15:52:55 normalization > Schema changed: False 2022-07-11 15:52:55 normalization > Source columns not in target: [] 2022-07-11 15:52:55 normalization > Target columns not in source: [] 2022-07-11 15:52:55 normalization > New column types: [] 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 15:52:46.271124 [debug] [Thread-4 ]: Writing runtime SQL for node "model.airbyte_utils.files_out" 2022-07-11 15:52:55 normalization > 15:52:46.271611 [debug] [Thread-4 ]: On model.airbyte_utils.files_out: /* {"app": "dbt", "dbt_version": "1.0.0", "profile_name": "normalize", "target_name": "prod", "node_id": "model.airbyte_utils.files_out"} */ 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > merge into `mainapi-282501`.raw_achilles.`files_out` as DBT_INTERNAL_DEST 2022-07-11 15:52:55 normalization > using ( 2022-07-11 15:52:55 normalization > select * from `mainapi-282501`.raw_achilles.`files_out__dbt_tmp` 2022-07-11 15:52:55 normalization > ) as DBT_INTERNAL_SOURCE 2022-07-11 15:52:55 normalization > on 2022-07-11 15:52:55 normalization > DBT_INTERNAL_SOURCE._airbyte_unique_key = DBT_INTERNAL_DEST._airbyte_unique_key 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when matched then update set 2022-07-11 15:52:55 normalization > `_airbyte_unique_key` = DBT_INTERNAL_SOURCE.`_airbyte_unique_key`,`id` = DBT_INTERNAL_SOURCE.`id`,`bank_id` = DBT_INTERNAL_SOURCE.`bank_id`,`created` = DBT_INTERNAL_SOURCE.`created`,`updated` = DBT_INTERNAL_SOURCE.`updated`,`file_hash` = DBT_INTERNAL_SOURCE.`file_hash`,`file_name` = DBT_INTERNAL_SOURCE.`file_name`,`_ab_cdc_lsn` = DBT_INTERNAL_SOURCE.`_ab_cdc_lsn`,`batch_count` = DBT_INTERNAL_SOURCE.`batch_count`,`exchange_window` = DBT_INTERNAL_SOURCE.`exchange_window`,`_ab_cdc_deleted_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_deleted_at`,`_ab_cdc_updated_at` = DBT_INTERNAL_SOURCE.`_ab_cdc_updated_at`,`_airbyte_ab_id` = DBT_INTERNAL_SOURCE.`_airbyte_ab_id`,`_airbyte_emitted_at` = DBT_INTERNAL_SOURCE.`_airbyte_emitted_at`,`_airbyte_normalized_at` = DBT_INTERNAL_SOURCE.`_airbyte_normalized_at`,`_airbyte_files_out_hashid` = DBT_INTERNAL_SOURCE.`_airbyte_files_out_hashid` 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > 2022-07-11 15:52:55 normalization > when not matched then insert 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, `id`, `bank_id`, `created`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `batch_count`, `exchange_window`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_out_hashid`) 2022-07-11 15:52:55 normalization > values 2022-07-11 15:52:55 normalization > (`_airbyte_unique_key`, 
`id`, `bank_id`, `created`, `updated`, `file_hash`, `file_name`, `_ab_cdc_lsn`, `batch_count`, `exchange_window`, `_ab_cdc_deleted_at`, `_ab_cdc_updated_at`, `_airbyte_ab_id`, `_airbyte_emitted_at`, `_airbyte_normalized_at`, `_airbyte_files_out_hashid`)
2022-07-11 15:52:55 normalization >
2022-07-11 15:52:55 normalization >
2022-07-11 15:52:55 normalization >
2022-07-11 15:52:55 normalization > 15:52:49.488414 [debug] [Thread-4 ]: finished collecting timing info
2022-07-11 15:52:55 normalization > 15:52:49.489712 [info ] [Thread-4 ]: 18 of 18 OK created incremental model raw_achilles.files_out............................................................ [MERGE (34.0 rows, 20.2 KB processed) in 10.17s]
2022-07-11 15:52:55 normalization > 15:52:49.490408 [debug] [Thread-4 ]: Finished running node model.airbyte_utils.files_out
2022-07-11 15:52:55 normalization > 15:52:49.494374 [debug] [MainThread]: Acquiring new bigquery connection "master"
2022-07-11 15:52:55 normalization > 15:52:49.495469 [info ] [MainThread]:
2022-07-11 15:52:55 normalization > 15:52:49.495882 [info ] [MainThread]: Finished running 6 view models, 12 incremental models in 78.57s.
2022-07-11 15:52:55 normalization > 15:52:49.496508 [debug] [MainThread]: Connection 'master' was properly closed.
2022-07-11 15:52:55 normalization > 15:52:49.496985 [debug] [MainThread]: Connection 'model.airbyte_utils.partner_config' was properly closed.
2022-07-11 15:52:55 normalization > 15:52:49.497390 [debug] [MainThread]: Connection 'model.airbyte_utils.transactions_out_scd' was properly closed.
2022-07-11 15:52:55 normalization > 15:52:49.497582 [debug] [MainThread]: Connection 'model.airbyte_utils.bank_config_scd' was properly closed.
2022-07-11 15:52:55 normalization > 15:52:49.497782 [debug] [MainThread]: Connection 'model.airbyte_utils.files_out' was properly closed.
2022-07-11 15:52:55 normalization > 15:52:49.497947 [debug] [MainThread]: Connection 'model.airbyte_utils.bank_config' was properly closed.
2022-07-11 15:52:55 normalization > 15:52:49.498108 [debug] [MainThread]: Connection 'model.airbyte_utils.files_out_scd' was properly closed.
2022-07-11 15:52:55 normalization > 15:52:49.498332 [debug] [MainThread]: Connection 'model.airbyte_utils.files_in' was properly closed.
2022-07-11 15:52:55 normalization > 15:52:49.498504 [debug] [MainThread]: Connection 'model.airbyte_utils.files_in_scd' was properly closed.
2022-07-11 15:52:55 normalization > 15:52:49.520885 [info ] [MainThread]:
2022-07-11 15:52:55 normalization > 15:52:49.521381 [info ] [MainThread]: Completed with 1 error and 0 warnings:
2022-07-11 15:52:55 normalization > 15:52:49.522043 [info ] [MainThread]:
2022-07-11 15:52:55 normalization > 15:52:49.522897 [error] [MainThread]: Database Error in model transactions_in_scd (models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql)
2022-07-11 15:52:55 normalization > 15:52:49.523580 [error] [MainThread]: Invalid timestamp string "0000-12-30T00:00:00Z"
2022-07-11 15:52:55 normalization > 15:52:49.524209 [error] [MainThread]: compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/raw_achilles/transactions_in_scd.sql
2022-07-11 15:52:55 normalization > 15:52:49.524859 [info ] [MainThread]:
2022-07-11 15:52:55 normalization > 15:52:49.525482 [info ] [MainThread]: Done. PASS=16 WARN=0 ERROR=1 SKIP=1 TOTAL=18
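Note on the failure: the run only errors on transactions_in_scd. BigQuery's TIMESTAMP type does not accept year 0, so the cast inside the generated SCD model rejects "0000-12-30T00:00:00Z" and the MERGE for that stream aborts. One way to locate the offending source rows is to scan the Airbyte raw table with SAFE_CAST, which returns NULL instead of raising an error. This is a minimal diagnostic sketch, not part of the run above: it assumes the raw table follows the same naming as the other streams in this log (`mainapi-282501`.raw_achilles._airbyte_raw_transactions_in, with the record stored as a JSON string in _airbyte_data), and `created` is only a placeholder for whichever timestamp column actually carries the bad value.

-- Hypothetical diagnostic query (assumed table/column names, see note above).
select
  _airbyte_ab_id,
  _airbyte_emitted_at,
  json_extract_scalar(_airbyte_data, '$.created') as suspect_value
from `mainapi-282501`.raw_achilles._airbyte_raw_transactions_in
where json_extract_scalar(_airbyte_data, '$.created') is not null
  -- SAFE_CAST yields NULL for values TIMESTAMP cannot parse, such as year-0 dates.
  and safe_cast(json_extract_scalar(_airbyte_data, '$.created') as timestamp) is null;

Once the offending rows are identified, they can be corrected in the source Postgres table or excluded before normalization; the query itself does not modify any Airbyte-generated model.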
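For readers tracing where the repeated DBT_INTERNAL_DEST / DBT_INTERNAL_SOURCE statements come from: each stream is materialized as a dbt incremental model, and on BigQuery dbt's merge strategy compiles such a model into the create-temp-table-then-MERGE sequence seen throughout this log. The snippet below is a hand-written sketch of that mechanism, not the model Airbyte actually generates; the table and column names are copied from the log, everything else is illustrative.

{{ config(
    materialized = 'incremental',
    incremental_strategy = 'merge',
    unique_key = '_airbyte_unique_key',
    partition_by = {'field': '_airbyte_emitted_at', 'data_type': 'timestamp', 'granularity': 'day'},
    cluster_by = ['_airbyte_unique_key', '_airbyte_emitted_at']
) }}

-- Keep only the active SCD rows; on incremental runs, only rows emitted since the last load.
select *
from `mainapi-282501`.raw_achilles.transactions_out_scd
where _airbyte_active_row = 1
{% if is_incremental() %}
  and _airbyte_emitted_at >= (select max(_airbyte_emitted_at) from {{ this }})
{% endif %}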