2022-07-27 13:18:02 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.access_token: is missing but it is required, $.refresh_token: is missing but it is required
2022-07-27 13:18:02 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected
2022-07-27 13:18:02 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.method: does not have a value in the enumeration [Standard]
2022-07-27 13:18:02 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/0/logs.log
2022-07-27 13:18:02 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:18:02 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-stripe:0.1.34 exists...
2022-07-27 13:18:02 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-stripe:0.1.34 was found locally.
2022-07-27 13:18:02 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:18:02 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/0 --log-driver none --name source-stripe-check-6-0-aoxau --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/source-stripe:0.1.34 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/source-stripe:0.1.34 check --config source_config.json
2022-07-27 13:18:04 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):97 - Check succeeded
2022-07-27 13:18:04 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-27 13:18:04 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/0/logs.log
2022-07-27 13:18:04 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:18:04 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.31 exists...
2022-07-27 13:18:04 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.31 was found locally.
2022-07-27 13:18:04 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:18:04 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/0 --log-driver none --name destination-snowflake-check-6-0-nfgqo --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.31 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/destination-snowflake:0.4.31 check --config source_config.json
2022-07-27 13:18:05 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings.
2022-07-27 13:18:05 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:05 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:05 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:05 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:05 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-07-27 13:18:05 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-07-27 13:18:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:07 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-07-27 13:18:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:07 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-07-27 13:18:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:07 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK
2022-07-27 13:18:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:07 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-07-27 13:18:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:07 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:07 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:07 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:07 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:07 INFO i.a.i.d.j.c.SwitchingDestination(check):55 - Using destination type: INTERNAL_STAGING
2022-07-27 13:18:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:07 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-07-27 13:18:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:09 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@7a45d714
2022-07-27 13:18:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:09 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-07-27 13:18:09 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:09 INFO i.a.d.j.DefaultJdbcDatabase(lambda$unsafeQuery$1):105 - closing connection
2022-07-27 13:18:10 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:10 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated...
2022-07-27 13:18:10 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:10 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed.
2022-07-27 13:18:11 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-27 13:18:12 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/0/logs.log
2022-07-27 13:18:12 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:18:12 INFO i.a.w.g.DefaultReplicationWorker(run):115 - start sync worker. job id: 6 attempt id: 0
2022-07-27 13:18:12 INFO i.a.w.g.DefaultReplicationWorker(run):127 - configured sync modes: {null.subscription_items=full_refresh - overwrite}
2022-07-27 13:18:12 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination...
2022-07-27 13:18:12 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.31 exists...
2022-07-27 13:18:12 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.31 was found locally.
2022-07-27 13:18:12 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:18:12 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/0 --log-driver none --name destination-snowflake-write-6-0-vxcth --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.31 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/destination-snowflake:0.4.31 write --config destination_config.json --catalog destination_catalog.json
2022-07-27 13:18:12 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-stripe:0.1.34 exists...
2022-07-27 13:18:12 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-stripe:0.1.34 was found locally.
2022-07-27 13:18:12 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:18:12 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/0 --log-driver none --name source-stripe-read-6-0-bcrny --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/source-stripe:0.1.34 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/source-stripe:0.1.34 read --config source_config.json --catalog source_catalog.json --state input_state.json
2022-07-27 13:18:12 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):405 - Destination output thread started.
2022-07-27 13:18:12 INFO i.a.w.g.DefaultReplicationWorker(run):169 - Waiting for source and destination threads to complete.
2022-07-27 13:18:12 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):298 - Replication thread started.
2022-07-27 13:18:13 destination > SLF4J: Class path contains multiple SLF4J bindings.
2022-07-27 13:18:13 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:13 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:13 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:13 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:13 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-07-27 13:18:13 source > Starting syncing SourceStripe
2022-07-27 13:18:13 source > Syncing stream: subscription_items
2022-07-27 13:18:13 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-07-27 13:18:13 source > { "error": { "message": "Invalid status: must be one of active, past_due, unpaid, incomplete, incomplete_expired, trialing, or paused", "param": "status", "type": "invalid_request_error" } }
2022-07-27 13:18:13 source > Encountered an exception while reading stream subscription_items
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 114, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 173, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 267, in _read_full_refresh
    for record in records:
  File "/airbyte/integration_code/source_stripe/streams.py", line 266, in read_records
    for record in parent_stream.read_records(sync_mode=SyncMode.full_refresh):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 421, in read_records
    response = self._send_request(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 339, in _send_request
    return backoff_handler(user_backoff_handler)(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 306, in _send
    raise exc
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 303, in _send
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
2022-07-27 13:18:13 source > Finished syncing subscription_items
2022-07-27 13:18:13 source > SourceStripe runtimes: Syncing stream subscription_items 0:00:00.283210
2022-07-27 13:18:13 source > 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 114, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 173, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 267, in _read_full_refresh
    for record in records:
  File "/airbyte/integration_code/source_stripe/streams.py", line 266, in read_records
    for record in parent_stream.read_records(sync_mode=SyncMode.full_refresh):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 421, in read_records
    response = self._send_request(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 339, in _send_request
    return backoff_handler(user_backoff_handler)(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 306, in _send
    raise exc
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 303, in _send
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/airbyte/integration_code/main.py", line 13, in
    launch(source, sys.argv[1:])
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 123, in launch
    for message in source_entrypoint.run(parsed_args):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 114, in run
    for message in generator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 127, in read
    raise AirbyteTracedException.from_exception(e, message=display_message) from e
airbyte_cdk.utils.traced_exception.AirbyteTracedException: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
2022-07-27 13:18:13 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):327 - Source has no more messages, closing connection.
2022-07-27 13:18:14 destination > 2022-07-27 13:18:14 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-07-27 13:18:14 destination > 2022-07-27 13:18:14 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-07-27 13:18:14 destination > 2022-07-27 13:18:14 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: WRITE
2022-07-27 13:18:14 destination > 2022-07-27 13:18:14 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-07-27 13:18:15 destination > 2022-07-27 13:18:15 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:15 destination > 2022-07-27 13:18:15 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:15 destination > 2022-07-27 13:18:15 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:15 destination > 2022-07-27 13:18:15 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:15 destination > 2022-07-27 13:18:15 INFO i.a.i.d.j.c.SwitchingDestination(getConsumer):65 - Using destination type: INTERNAL_STAGING
2022-07-27 13:18:15 destination > 2022-07-27 13:18:15 INFO i.a.i.d.s.StagingConsumerFactory(lambda$toWriteConfig$0):99 - Write config: WriteConfig{streamName=stripe_subscription_items, namespace=null, outputSchemaName=AIRBYTE, tmpTableName=_airbyte_tmp_rlk_stripe_subscription_items, outputTableName=_airbyte_raw_stripe_subscription_items, syncMode=overwrite}
2022-07-27 13:18:15 destination > 2022-07-27 13:18:15 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-07-27 13:18:15 destination > 2022-07-27 13:18:15 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):117 - Preparing tmp tables in destination started for 1 streams
2022-07-27 13:18:15 destination > 2022-07-27 13:18:15 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):125 - Preparing staging area in destination started for schema AIRBYTE stream stripe_subscription_items: tmp table: _airbyte_tmp_rlk_stripe_subscription_items, stage: 2022/07/27/13/10AA7D26-C81D-4570-A26A-65BE13ABE5F8/
2022-07-27 13:18:15 destination > 2022-07-27 13:18:15 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@4833eff3
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 INFO i.a.d.j.DefaultJdbcDatabase(lambda$unsafeQuery$1):105 - closing connection
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):136 - Preparing staging area in destination completed for schema AIRBYTE stream stripe_subscription_items
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):139 - Preparing tmp tables in destination completed.
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 WARN i.a.i.d.b.BufferedStreamConsumer(acceptTracked):145 - Unexpected message: TRACE
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):241 - The main thread is exiting while children non-daemon threads from a connector are still active.
2022-07-27 13:18:17 destination > Ideally, this situation should not happen...
2022-07-27 13:18:17 destination > Please check with maintainers if the connector or library code should safely clean up its threads before quitting instead.
2022-07-27 13:18:17 destination > The main thread is: main (RUNNABLE)
2022-07-27 13:18:17 destination > Thread stacktrace: java.base/java.lang.Thread.getStackTrace(Thread.java:1610)
2022-07-27 13:18:17 destination >         at io.airbyte.integrations.base.IntegrationRunner.dumpThread(IntegrationRunner.java:276)
2022-07-27 13:18:17 destination >         at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:245)
2022-07-27 13:18:17 destination >         at io.airbyte.integrations.base.IntegrationRunner.runConsumer(IntegrationRunner.java:202)
2022-07-27 13:18:17 destination >         at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$1(IntegrationRunner.java:165)
2022-07-27 13:18:17 destination >         at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54)
2022-07-27 13:18:17 destination >         at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:38)
2022-07-27 13:18:17 destination >         at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:165)
2022-07-27 13:18:17 destination >         at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:107)
2022-07-27 13:18:17 destination >         at io.airbyte.integrations.destination.snowflake.SnowflakeDestination.main(SnowflakeDestination.java:30)
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):255 - Active non-daemon thread: pool-4-thread-1 (TIMED_WAITING)
2022-07-27 13:18:17 destination > Thread stacktrace: java.base@17.0.1/jdk.internal.misc.Unsafe.park(Native Method)
2022-07-27 13:18:17 destination >         at java.base@17.0.1/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:252)
2022-07-27 13:18:17 destination >         at java.base@17.0.1/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:1672)
2022-07-27 13:18:17 destination >         at java.base@17.0.1/java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:460)
2022-07-27 13:18:17 destination >         at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1061)
2022-07-27 13:18:17 destination >         at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1122)
2022-07-27 13:18:17 destination >         at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2022-07-27 13:18:17 destination >         at java.base@17.0.1/java.lang.Thread.run(Thread.java:833)
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 INFO i.a.i.d.b.BufferedStreamConsumer(close):171 - executing on success close procedure.
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 INFO i.a.i.d.r.SerializedBufferingStrategy(flushAll):103 - Flushing all 0 current buffers (0 bytes in total)
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):186 - Copying into tables in destination started for 1 streams
2022-07-27 13:18:17 destination > 2022-07-27 13:18:17 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):195 - Copying stream stripe_subscription_items of schema AIRBYTE into tmp table _airbyte_tmp_rlk_stripe_subscription_items to final table _airbyte_raw_stripe_subscription_items from stage path 2022/07/27/13/10AA7D26-C81D-4570-A26A-65BE13ABE5F8/ with 0 file(s) []
2022-07-27 13:18:18 destination > 2022-07-27 13:18:18 INFO i.a.i.d.j.SqlOperations(onDestinationCloseOperations):137 - No onDestinationCloseOperations required for this destination.
2022-07-27 13:18:18 destination > 2022-07-27 13:18:18 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):216 - Executing finalization of tables.
2022-07-27 13:18:19 destination > 2022-07-27 13:18:19 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):218 - Finalizing tables in destination completed.
2022-07-27 13:18:19 destination > 2022-07-27 13:18:19 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):220 - Cleaning up destination started for 1 streams
2022-07-27 13:18:19 destination > 2022-07-27 13:18:19 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):224 - Cleaning tmp table in destination started for stream stripe_subscription_items. schema AIRBYTE, tmp table name: _airbyte_tmp_rlk_stripe_subscription_items
2022-07-27 13:18:19 destination > 2022-07-27 13:18:19 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):230 - Cleaning stage in destination started for stream stripe_subscription_items. schema AIRBYTE, stage: AIRBYTE_STRIPE_SUBSCRIPTION_ITEMS
2022-07-27 13:18:19 destination > 2022-07-27 13:18:19 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):235 - Cleaning up destination completed.
2022-07-27 13:18:19 destination > 2022-07-27 13:18:19 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-07-27 13:18:21 ERROR i.a.w.g.DefaultReplicationWorker(run):180 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:173) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:65) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
    Suppressed: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
        at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:137) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:65) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:331) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    ... 1 more
Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:329) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    ... 1 more
2022-07-27 13:18:21 INFO i.a.w.g.DefaultReplicationWorker(run):239 - sync summary: io.airbyte.config.ReplicationAttemptSummary@d5e7752[status=failed,recordsSynced=0,bytesSynced=0,startTime=1658927892177,endTime=1658927901439,totalStats=io.airbyte.config.SyncStats@717453bb[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]]
2022-07-27 13:18:21 INFO i.a.w.g.DefaultReplicationWorker(run):268 - Source did not output any state messages
2022-07-27 13:18:21 WARN i.a.w.g.DefaultReplicationWorker(run):276 - State capture: No new state, falling back on input state: io.airbyte.config.State@18966e8b[state={}]
2022-07-27 13:18:21 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
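The actual failure in attempt 0 is the Stripe API rejecting `status=all` on `GET /v1/subscriptions` with a 400 `invalid_request_error`; everything after that (SourceException, "Source cannot be stopped!") is the worker reacting to the source process exiting. The request is easy to reproduce outside Airbyte. A minimal sketch, assuming a Stripe secret key in a `STRIPE_API_KEY` environment variable (an illustrative name, not something Airbyte sets):

```python
# Reproduce the request from the traceback above:
#   GET https://api.stripe.com/v1/subscriptions?limit=100&created[gte]=1467331200&status=all
# Stripe authenticates with HTTP basic auth: secret key as the username, empty password.
import os

import requests

resp = requests.get(
    "https://api.stripe.com/v1/subscriptions",
    params={"limit": 100, "created[gte]": 1467331200, "status": "all"},
    auth=(os.environ["STRIPE_API_KEY"], ""),  # illustrative env var
)
print(resp.status_code)                 # expect 400 on this account
print(resp.json()["error"]["message"])  # "Invalid status: must be one of ..."
```

The error text lists every status value the account accepts, and `all` is not among them, which suggests the account is pinned to a Stripe API version that predates the `status=all` filter.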
2022-07-27 13:18:21 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):161 - sync summary: io.airbyte.config.StandardSyncOutput@287bc27b[standardSyncSummary=io.airbyte.config.StandardSyncSummary@231e2b44[status=failed,recordsSynced=0,bytesSynced=0,startTime=1658927892177,endTime=1658927901439,totalStats=io.airbyte.config.SyncStats@717453bb[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]],normalizationSummary=,state=io.airbyte.config.State@18966e8b[state={}],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@5049af16[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@72e9e7c6[stream=io.airbyte.protocol.models.AirbyteStream@4779bfa[name=stripe_subscription_items,jsonSchema={"type":["null","object"],"properties":{"id":{"type":["null","string"]},"plan":{"type":["null","object","string"],"properties":{"id":{"type":["null","string"]},"name":{"type":["null","string"]},"tiers":{"type":["null","array"],"items":{"type":["null","string","object"],"properties":{"up_to":{"type":["null","integer"]},"flat_amount":{"type":["null","integer"]},"unit_amount":{"type":["null","integer"]}}}},"active":{"type":["null","boolean"]},"amount":{"type":["null","integer"]},"object":{"type":["null","string"]},"created":{"type":["null","integer"]},"product":{"type":["null","string"]},"updated":{"type":["null","number"]},"currency":{"type":["null","string"]},"interval":{"type":["null","string"]},"livemode":{"type":["null","boolean"]},"metadata":{"type":["null","object"],"properties":{}},"nickname":{"type":["null","string"]},"tiers_mode":{"type":["null","string"]},"usage_type":{"type":["null","string"]},"billing_scheme":{"type":["null","string"]},"interval_count":{"type":["null","integer"]},"aggregate_usage":{"type":["null","string"]},"transform_usage":{"type":["null","string"]},"trial_period_days":{"type":["null","integer"]},"statement_descriptor":{"type":["null","string"]},"statement_description":{"type":["null","string"]}}},"start":{"type":["null","integer"]},"object":{"type":["null","string"]},"status":{"type":["null","string"]},"created":{"type":["null","integer"]},"customer":{"type":["null","string"]},"discount":{"type":["null","object"],"properties":{}},"ended_at":{"type":["null","number"]},"livemode":{"type":["null","boolean"]},"metadata":{"type":["null","object"],"properties":{}},"quantity":{"type":["null","integer"]},"trial_end":{"type":["null","number"]},"canceled_at":{"type":["null","string"],"format":"date-time"},"tax_percent":{"type":["null","number"]},"trial_start":{"type":["null","integer"]},"subscription":{"type":["null","string"]},"current_period_end":{"type":["null","string"],"format":"date-time"},"cancel_at_period_end":{"type":["null","boolean"]},"current_period_start":{"type":["null","integer"]},"application_fee_percent":{"type":["null","number"]}}},supportedSyncModes=[full_refresh],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@56e1c43b[failureOrigin=source,failureType=system_error,internalMessage=400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all,externalMessage=Invalid status: must be one of active, past_due, unpaid, incomplete, incomplete_expired, trialing, or paused,metadata=io.airbyte.config.Metadata@4b4de0ee[additionalProperties={attemptNumber=0, jobId=6, from_trace_message=true}],stacktrace=Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 114, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 173, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 267, in _read_full_refresh
    for record in records:
  File "/airbyte/integration_code/source_stripe/streams.py", line 266, in read_records
    for record in parent_stream.read_records(sync_mode=SyncMode.full_refresh):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 421, in read_records
    response = self._send_request(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 339, in _send_request
    return backoff_handler(user_backoff_handler)(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 306, in _send
    raise exc
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 303, in _send
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
,retryable=,timestamp=1658927893707], io.airbyte.config.FailureReason@76a7f996[failureOrigin=source,failureType=,internalMessage=io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!,externalMessage=Something went wrong within the source connector,metadata=io.airbyte.config.Metadata@4e3704f3[additionalProperties={attemptNumber=0, jobId=6}],stacktrace=java.util.concurrent.CompletionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
    at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:331)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
    ... 3 more
Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136)
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:329)
    ... 4 more
,retryable=,timestamp=1658927893984]]]
2022-07-27 13:18:21 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating...
2022-07-27 13:18:21 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/0/logs.log
2022-07-27 13:18:21 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:18:21 INFO i.a.w.g.DefaultNormalizationWorker(run):49 - Running normalization.
2022-07-27 13:18:21 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.2.8
2022-07-27 13:18:21 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.2.8 exists...
2022-07-27 13:18:21 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.2.8 was found locally.
2022-07-27 13:18:21 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:18:21 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/0/normalize --log-driver none --name normalization-snowflake-normalize-6-0-wesat --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.39.37-alpha airbyte/normalization-snowflake:0.2.8 run --integration-type snowflake --config destination_config.json --catalog destination_catalog.json
2022-07-27 13:18:22 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/6/0/normalize
2022-07-27 13:18:22 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/6/0/normalize')
2022-07-27 13:18:22 normalization > transform_snowflake
2022-07-27 13:18:22 normalization > Running: transform-catalog --integration-type snowflake --profile-config-dir /data/6/0/normalize --catalog destination_catalog.json --out /data/6/0/normalize/models/generated/ --json-column _airbyte_data
2022-07-27 13:18:23 normalization > Processing destination_catalog.json...
2022-07-27 13:18:23 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_AB1.sql from stripe_subscription_items
2022-07-27 13:18:23 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_AB2.sql from stripe_subscription_items
2022-07-27 13:18:23 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_AB3.sql from stripe_subscription_items
2022-07-27 13:18:23 normalization > Generating airbyte_tables/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS.sql from stripe_subscription_items
2022-07-27 13:18:23 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_AB1.sql from stripe_subscription_items/plan
2022-07-27 13:18:23 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_AB2.sql from stripe_subscription_items/plan
2022-07-27 13:18:23 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_AB3.sql from stripe_subscription_items/plan
2022-07-27 13:18:23 normalization > Generating airbyte_tables/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN.sql from stripe_subscription_items/plan
2022-07-27 13:18:23 normalization > Ignoring stream 'discount' from stripe_subscription_items/discount because properties list is empty
2022-07-27 13:18:23 normalization > Ignoring stream 'metadata' from stripe_subscription_items/metadata because properties list is empty
2022-07-27 13:18:23 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS_AB1.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:18:23 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS_AB2.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:18:23 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS_AB3.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:18:23 normalization > Generating airbyte_tables/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:18:23 normalization > Ignoring stream 'metadata' from stripe_subscription_items/plan/metadata because properties list is empty
2022-07-27 13:18:23 normalization > detected no config file for ssh, assuming ssh is off.
2022-07-27 13:18:27 normalization > [--event-buffer-size EVENT_BUFFER_SIZE]
2022-07-27 13:18:27 normalization > --event-buffer-size EVENT_BUFFER_SIZE
2022-07-27 13:18:27 normalization >
2022-07-27 13:18:27 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2022-07-27 13:18:27 normalization >
2022-07-27 13:18:31 normalization > 13:18:31 Running with dbt=1.0.0
2022-07-27 13:18:31 normalization > 13:18:31 Partial parse save file not found. Starting full parse.
2022-07-27 13:18:33 normalization > 13:18:33 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2022-07-27 13:18:33 normalization > There are 2 unused configuration paths:
2022-07-27 13:18:33 normalization > - models.airbyte_utils.generated.airbyte_incremental
2022-07-27 13:18:33 normalization > - models.airbyte_utils.generated.airbyte_views
2022-07-27 13:18:33 normalization >
2022-07-27 13:18:33 normalization > 13:18:33 Found 12 models, 0 tests, 0 snapshots, 0 analyses, 544 macros, 0 operations, 0 seed files, 1 source, 0 exposures, 0 metrics
2022-07-27 13:18:33 normalization > 13:18:33
2022-07-27 13:18:35 normalization > 13:18:35 Concurrency: 5 threads (target='prod')
2022-07-27 13:18:35 normalization > 13:18:35
2022-07-27 13:18:35 normalization > 13:18:35 1 of 3 START table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS.............................................................. [RUN]
2022-07-27 13:18:37 normalization > 13:18:37 1 of 3 OK created table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS......................................................... [SUCCESS 1 in 1.89s]
2022-07-27 13:18:37 normalization > 13:18:37 2 of 3 START table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN......................................................... [RUN]
2022-07-27 13:18:39 normalization > 13:18:39 2 of 3 OK created table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN.................................................... [SUCCESS 1 in 1.61s]
2022-07-27 13:18:39 normalization > 13:18:39 3 of 3 START table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS................................................... [RUN]
2022-07-27 13:18:40 normalization > 13:18:40 3 of 3 OK created table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS.............................................. [SUCCESS 1 in 1.22s]
2022-07-27 13:18:40 normalization > 13:18:40
2022-07-27 13:18:40 normalization > 13:18:40 Finished running 3 table models in 7.00s.
2022-07-27 13:18:40 normalization > 13:18:40
2022-07-27 13:18:40 normalization > 13:18:40 Completed successfully
2022-07-27 13:18:40 normalization > 13:18:40
2022-07-27 13:18:40 normalization > 13:18:40 Done. PASS=3 WARN=0 ERROR=0 SKIP=0 TOTAL=3
2022-07-27 13:18:41 INFO i.a.w.g.DefaultNormalizationWorker(run):73 - Normalization executed in 19 seconds.
2022-07-27 13:18:41 INFO i.a.w.g.DefaultNormalizationWorker(run):79 - Normalization summary: io.airbyte.config.NormalizationSummary@60f58641[startTime=1658927901679,endTime=1658927921211]
2022-07-27 13:18:41 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-27 13:18:41 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating...
2022-07-27 13:18:42 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.access_token: is missing but it is required, $.refresh_token: is missing but it is required
2022-07-27 13:18:42 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected
2022-07-27 13:18:42 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.method: does not have a value in the enumeration [Standard]
2022-07-27 13:18:42 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/1/logs.log
2022-07-27 13:18:42 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:18:42 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-stripe:0.1.34 exists...
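The three JsonSchemaValidator INFO lines above ($.access_token / $.refresh_token missing, $.password object found where a string was expected, $.method not in [Standard]) recur before each attempt. They are logged at INFO and do not fail the job; they look like the platform test-validating a config against spec variants it does not match. A rough Python analogue of how such messages arise (the platform's actual validator is the Java networknt json-schema-validator, whose wording differs, and the spec and config below are made-up stand-ins, not any real connector spec):

```python
# Validate a config against a spec it does not match, producing errors of the
# same shape as the INFO lines above. Hypothetical spec/config for illustration.
from jsonschema import Draft7Validator

spec = {
    "type": "object",
    "required": ["access_token", "refresh_token"],
    "properties": {
        "access_token": {"type": "string"},
        "refresh_token": {"type": "string"},
        "password": {"type": "string"},
        "method": {"enum": ["Standard"]},
    },
}
config = {"password": {"value": "hunter2"}, "method": "Internal Staging"}

for err in Draft7Validator(spec).iter_errors(config):
    print(err.json_path, "-", err.message)
# Prints, in some order:
#   $ - 'access_token' is a required property
#   $ - 'refresh_token' is a required property
#   $.password - {'value': 'hunter2'} is not of type 'string'
#   $.method - 'Internal Staging' is not one of ['Standard']
```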
2022-07-27 13:18:42 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:18:42 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/1 --log-driver none --name source-stripe-check-6-1-bgupm --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/source-stripe:0.1.34 -e WORKER_JOB_ATTEMPT=1 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/source-stripe:0.1.34 check --config source_config.json
2022-07-27 13:18:42 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-stripe:0.1.34 was found locally.
2022-07-27 13:18:43 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):97 - Check succeeded
2022-07-27 13:18:44 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-27 13:18:44 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/1/logs.log
2022-07-27 13:18:44 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:18:44 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.31 exists...
2022-07-27 13:18:44 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.31 was found locally.
2022-07-27 13:18:44 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:18:44 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/1 --log-driver none --name destination-snowflake-check-6-1-qodxr --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.31 -e WORKER_JOB_ATTEMPT=1 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/destination-snowflake:0.4.31 check --config source_config.json
2022-07-27 13:18:45 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings.
2022-07-27 13:18:45 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:45 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:45 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:45 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:45 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-07-27 13:18:45 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-07-27 13:18:46 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:46 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-07-27 13:18:46 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:46 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-07-27 13:18:46 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:46 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK
2022-07-27 13:18:46 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:46 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-07-27 13:18:46 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:46 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:46 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:46 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:46 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:46 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:46 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:46 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:46 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:46 INFO i.a.i.d.j.c.SwitchingDestination(check):55 - Using destination type: INTERNAL_STAGING
2022-07-27 13:18:47 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:47 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-07-27 13:18:49 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:49 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@7a45d714
2022-07-27 13:18:49 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:49 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-07-27 13:18:49 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:49 INFO i.a.d.j.DefaultJdbcDatabase(lambda$unsafeQuery$1):105 - closing connection
2022-07-27 13:18:49 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:49 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated...
2022-07-27 13:18:50 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:18:50 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed.
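Attempt 1 below replays the same failure. It is worth noting why the 400 is not retried: the backoff/_sync.py frames in the attempt-0 traceback show the CDK wrapping each HTTP send in the backoff library, and the CDK's default policy retries only 429 and 5xx responses, so a 400 surfaces on the first try. A minimal sketch of that pattern (the names are illustrative, not the CDK's own):

```python
# Retry transient HTTP errors with exponential backoff, but give up
# immediately on non-retryable client errors such as the 400 in this log.
import backoff
import requests


def giveup_on_client_error(exc: requests.exceptions.HTTPError) -> bool:
    # Mirrors the CDK default: retry 429 and 5xx, surface everything else.
    code = exc.response.status_code
    return code != 429 and not 500 <= code < 600


@backoff.on_exception(
    backoff.expo,
    requests.exceptions.HTTPError,
    max_tries=5,
    giveup=giveup_on_client_error,
)
def send(url: str, **kwargs) -> requests.Response:
    response = requests.get(url, **kwargs)
    response.raise_for_status()  # a 400 raises HTTPError here
    return response
```

With giveup returning True for a 400, backoff re-raises on the first call, which matches the roughly 0.3 s stream runtime reported in both attempts.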
2022-07-27 13:18:51 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling... 2022-07-27 13:18:51 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/1/logs.log 2022-07-27 13:18:51 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha 2022-07-27 13:18:51 INFO i.a.w.g.DefaultReplicationWorker(run):115 - start sync worker. job id: 6 attempt id: 1 2022-07-27 13:18:51 INFO i.a.w.g.DefaultReplicationWorker(run):127 - configured sync modes: {null.subscription_items=full_refresh - overwrite} 2022-07-27 13:18:51 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination... 2022-07-27 13:18:51 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.31 exists... 2022-07-27 13:18:52 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.31 was found locally. 2022-07-27 13:18:52 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6 2022-07-27 13:18:52 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/1 --log-driver none --name destination-snowflake-write-6-1-alsci --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.31 -e WORKER_JOB_ATTEMPT=1 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/destination-snowflake:0.4.31 write --config destination_config.json --catalog destination_catalog.json 2022-07-27 13:18:52 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-stripe:0.1.34 exists... 2022-07-27 13:18:52 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-stripe:0.1.34 was found locally. 2022-07-27 13:18:52 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6 2022-07-27 13:18:52 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/1 --log-driver none --name source-stripe-read-6-1-xbjrl --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/source-stripe:0.1.34 -e WORKER_JOB_ATTEMPT=1 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/source-stripe:0.1.34 read --config source_config.json --catalog source_catalog.json --state input_state.json 2022-07-27 13:18:52 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):405 - Destination output thread started. 2022-07-27 13:18:52 INFO i.a.w.g.DefaultReplicationWorker(run):169 - Waiting for source and destination threads to complete. 2022-07-27 13:18:52 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):298 - Replication thread started. 2022-07-27 13:18:53 destination > SLF4J: Class path contains multiple SLF4J bindings. 
2022-07-27 13:18:53 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:53 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:53 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:53 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:18:53 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-07-27 13:18:53 source > Starting syncing SourceStripe
2022-07-27 13:18:53 source > Syncing stream: subscription_items
2022-07-27 13:18:53 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-07-27 13:18:53 source > { "error": { "message": "Invalid status: must be one of active, past_due, unpaid, incomplete, incomplete_expired, trialing, or paused", "param": "status", "type": "invalid_request_error" } }
2022-07-27 13:18:53 source > Encountered an exception while reading stream subscription_items
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 114, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 173, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 267, in _read_full_refresh
    for record in records:
  File "/airbyte/integration_code/source_stripe/streams.py", line 266, in read_records
    for record in parent_stream.read_records(sync_mode=SyncMode.full_refresh):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 421, in read_records
    response = self._send_request(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 339, in _send_request
    return backoff_handler(user_backoff_handler)(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 306, in _send
    raise exc
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 303, in _send
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
2022-07-27 13:18:53 source > Finished syncing subscription_items
2022-07-27 13:18:53 source > SourceStripe runtimes: Syncing stream subscription_items 0:00:00.292038
2022-07-27 13:18:53 source > 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 114, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 173, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 267, in _read_full_refresh
    for record in records:
  File "/airbyte/integration_code/source_stripe/streams.py", line 266, in read_records
    for record in parent_stream.read_records(sync_mode=SyncMode.full_refresh):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 421, in read_records
    response = self._send_request(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 339, in _send_request
    return backoff_handler(user_backoff_handler)(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 306, in _send
    raise exc
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 303, in _send
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/airbyte/integration_code/main.py", line 13, in <module>
    launch(source, sys.argv[1:])
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 123, in launch
    for message in source_entrypoint.run(parsed_args):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 114, in run
    for message in generator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 127, in read
    raise AirbyteTracedException.from_exception(e, message=display_message) from e
airbyte_cdk.utils.traced_exception.AirbyteTracedException: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
2022-07-27 13:18:53 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):327 - Source has no more messages, closing connection.
2022-07-27 13:18:54 destination > 2022-07-27 13:18:54 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-07-27 13:18:54 destination > 2022-07-27 13:18:54 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-07-27 13:18:54 destination > 2022-07-27 13:18:54 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: WRITE
2022-07-27 13:18:54 destination > 2022-07-27 13:18:54 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-07-27 13:18:55 destination > 2022-07-27 13:18:55 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:55 destination > 2022-07-27 13:18:55 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:55 destination > 2022-07-27 13:18:55 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:55 destination > 2022-07-27 13:18:55 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:18:55 destination > 2022-07-27 13:18:55 INFO i.a.i.d.j.c.SwitchingDestination(getConsumer):65 - Using destination type: INTERNAL_STAGING
2022-07-27 13:18:55 destination > 2022-07-27 13:18:55 INFO i.a.i.d.s.StagingConsumerFactory(lambda$toWriteConfig$0):99 - Write config: WriteConfig{streamName=stripe_subscription_items, namespace=null, outputSchemaName=AIRBYTE, tmpTableName=_airbyte_tmp_iah_stripe_subscription_items, outputTableName=_airbyte_raw_stripe_subscription_items, syncMode=overwrite}
2022-07-27 13:18:55 destination > 2022-07-27 13:18:55 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-07-27 13:18:55 destination > 2022-07-27 13:18:55 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):117 - Preparing tmp tables in destination started for 1 streams
2022-07-27 13:18:55 destination > 2022-07-27 13:18:55 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):125 - Preparing staging area in destination started for schema AIRBYTE stream stripe_subscription_items: tmp table: _airbyte_tmp_iah_stripe_subscription_items, stage: 2022/07/27/13/44C5F4DF-B545-4902-A387-25EA3BB3F033/
2022-07-27 13:18:55 destination > 2022-07-27 13:18:55 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-07-27 13:18:57 destination > 2022-07-27 13:18:57 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@5d4e13e1
2022-07-27 13:18:57 destination > 2022-07-27 13:18:57 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-07-27 13:18:57 destination > 2022-07-27 13:18:57 INFO i.a.d.j.DefaultJdbcDatabase(lambda$unsafeQuery$1):105 - closing connection
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):136 - Preparing staging area in destination completed for schema AIRBYTE stream stripe_subscription_items
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):139 - Preparing tmp tables in destination completed.
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 WARN i.a.i.d.b.BufferedStreamConsumer(acceptTracked):145 - Unexpected message: TRACE
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):241 - The main thread is exiting while children non-daemon threads from a connector are still active.
2022-07-27 13:18:58 destination > Ideally, this situation should not happen...
2022-07-27 13:18:58 destination > Please check with maintainers if the connector or library code should safely clean up its threads before quitting instead.
2022-07-27 13:18:58 destination > The main thread is: main (RUNNABLE)
2022-07-27 13:18:58 destination > Thread stacktrace: java.base/java.lang.Thread.getStackTrace(Thread.java:1610)
2022-07-27 13:18:58 destination > at io.airbyte.integrations.base.IntegrationRunner.dumpThread(IntegrationRunner.java:276)
2022-07-27 13:18:58 destination > at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:245)
2022-07-27 13:18:58 destination > at io.airbyte.integrations.base.IntegrationRunner.runConsumer(IntegrationRunner.java:202)
2022-07-27 13:18:58 destination > at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$1(IntegrationRunner.java:165)
2022-07-27 13:18:58 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54)
2022-07-27 13:18:58 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:38)
2022-07-27 13:18:58 destination > at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:165)
2022-07-27 13:18:58 destination > at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:107)
2022-07-27 13:18:58 destination > at io.airbyte.integrations.destination.snowflake.SnowflakeDestination.main(SnowflakeDestination.java:30)
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):255 - Active non-daemon thread: pool-4-thread-1 (TIMED_WAITING)
2022-07-27 13:18:58 destination > Thread stacktrace: java.base@17.0.1/jdk.internal.misc.Unsafe.park(Native Method)
2022-07-27 13:18:58 destination > at java.base@17.0.1/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:252)
2022-07-27 13:18:58 destination > at java.base@17.0.1/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:1672)
2022-07-27 13:18:58 destination > at java.base@17.0.1/java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:460)
2022-07-27 13:18:58 destination > at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1061)
2022-07-27 13:18:58 destination > at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1122)
2022-07-27 13:18:58 destination > at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2022-07-27 13:18:58 destination > at java.base@17.0.1/java.lang.Thread.run(Thread.java:833)
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 INFO i.a.i.d.b.BufferedStreamConsumer(close):171 - executing on success close procedure.
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 INFO i.a.i.d.r.SerializedBufferingStrategy(flushAll):103 - Flushing all 0 current buffers (0 bytes in total)
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):186 - Copying into tables in destination started for 1 streams
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):195 - Copying stream stripe_subscription_items of schema AIRBYTE into tmp table _airbyte_tmp_iah_stripe_subscription_items to final table _airbyte_raw_stripe_subscription_items from stage path 2022/07/27/13/44C5F4DF-B545-4902-A387-25EA3BB3F033/ with 0 file(s) []
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 INFO i.a.i.d.j.SqlOperations(onDestinationCloseOperations):137 - No onDestinationCloseOperations required for this destination.
2022-07-27 13:18:58 destination > 2022-07-27 13:18:58 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):216 - Executing finalization of tables.
2022-07-27 13:18:59 destination > 2022-07-27 13:18:59 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):218 - Finalizing tables in destination completed.
2022-07-27 13:18:59 destination > 2022-07-27 13:18:59 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):220 - Cleaning up destination started for 1 streams
2022-07-27 13:18:59 destination > 2022-07-27 13:18:59 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):224 - Cleaning tmp table in destination started for stream stripe_subscription_items. schema AIRBYTE, tmp table name: _airbyte_tmp_iah_stripe_subscription_items
2022-07-27 13:18:59 destination > 2022-07-27 13:18:59 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):230 - Cleaning stage in destination started for stream stripe_subscription_items. schema AIRBYTE, stage: AIRBYTE_STRIPE_SUBSCRIPTION_ITEMS
2022-07-27 13:18:59 destination > 2022-07-27 13:18:59 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):235 - Cleaning up destination completed.
2022-07-27 13:18:59 destination > 2022-07-27 13:18:59 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-07-27 13:19:00 ERROR i.a.w.g.DefaultReplicationWorker(run):180 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:173) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:65) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
    Suppressed: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
        at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:137) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:65) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:331) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    ... 1 more
Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:329) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    ... 1 more
2022-07-27 13:19:00 INFO i.a.w.g.DefaultReplicationWorker(run):239 - sync summary: io.airbyte.config.ReplicationAttemptSummary@57fc1da0[status=failed,recordsSynced=0,bytesSynced=0,startTime=1658927931928,endTime=1658927940949,totalStats=io.airbyte.config.SyncStats@52491dd[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]]
2022-07-27 13:19:00 INFO i.a.w.g.DefaultReplicationWorker(run):268 - Source did not output any state messages
2022-07-27 13:19:00 WARN i.a.w.g.DefaultReplicationWorker(run):276 - State capture: No new state, falling back on input state: io.airbyte.config.State@20e48cc[state={}]
2022-07-27 13:19:00 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
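Both tracebacks above end in the same place: Stripe returns 400 for GET /v1/subscriptions with status=all, and the error body lists the only subscription statuses this account accepts ("all" is not among them), which usually points at the Stripe account being pinned to an API version that predates status=all. The request is easy to replay outside Airbyte to confirm the rejection is API-side rather than a connector bug. A minimal sketch; STRIPE_API_KEY is a placeholder for the key configured in source_config.json:

    import os
    import requests

    # Same endpoint and query parameters as the failing URL in the log;
    # requests URL-encodes "created[gte]" to created%5Bgte%5D.
    resp = requests.get(
        "https://api.stripe.com/v1/subscriptions",
        params={"limit": 100, "created[gte]": 1467331200, "status": "all"},
        auth=(os.environ["STRIPE_API_KEY"], ""),  # Stripe uses the key as the basic-auth username
    )
    print(resp.status_code)  # 400 here would match the log
    print(resp.json().get("error", {}).get("message"))

If this returns the same 400 and message, the problem sits between the connector's request and the account's pinned API version; the Snowflake destination completes its side cleanly everywhere in this log.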
2022-07-27 13:19:00 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):161 - sync summary: io.airbyte.config.StandardSyncOutput@3d9bfc53[standardSyncSummary=io.airbyte.config.StandardSyncSummary@78997584[status=failed,recordsSynced=0,bytesSynced=0,startTime=1658927931928,endTime=1658927940949,totalStats=io.airbyte.config.SyncStats@52491dd[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]],normalizationSummary=,state=io.airbyte.config.State@20e48cc[state={}],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@53e82f3b[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@49abeb7e[stream=io.airbyte.protocol.models.AirbyteStream@2b30ef7c[name=stripe_subscription_items,jsonSchema={"type":["null","object"],"properties":{"id":{"type":["null","string"]},"plan":{"type":["null","object","string"],"properties":{"id":{"type":["null","string"]},"name":{"type":["null","string"]},"tiers":{"type":["null","array"],"items":{"type":["null","string","object"],"properties":{"up_to":{"type":["null","integer"]},"flat_amount":{"type":["null","integer"]},"unit_amount":{"type":["null","integer"]}}}},"active":{"type":["null","boolean"]},"amount":{"type":["null","integer"]},"object":{"type":["null","string"]},"created":{"type":["null","integer"]},"product":{"type":["null","string"]},"updated":{"type":["null","number"]},"currency":{"type":["null","string"]},"interval":{"type":["null","string"]},"livemode":{"type":["null","boolean"]},"metadata":{"type":["null","object"],"properties":{}},"nickname":{"type":["null","string"]},"tiers_mode":{"type":["null","string"]},"usage_type":{"type":["null","string"]},"billing_scheme":{"type":["null","string"]},"interval_count":{"type":["null","integer"]},"aggregate_usage":{"type":["null","string"]},"transform_usage":{"type":["null","string"]},"trial_period_days":{"type":["null","integer"]},"statement_descriptor":{"type":["null","string"]},"statement_description":{"type":["null","string"]}}},"start":{"type":["null","integer"]},"object":{"type":["null","string"]},"status":{"type":["null","string"]},"created":{"type":["null","integer"]},"customer":{"type":["null","string"]},"discount":{"type":["null","object"],"properties":{}},"ended_at":{"type":["null","number"]},"livemode":{"type":["null","boolean"]},"metadata":{"type":["null","object"],"properties":{}},"quantity":{"type":["null","integer"]},"trial_end":{"type":["null","number"]},"canceled_at":{"type":["null","string"],"format":"date-time"},"tax_percent":{"type":["null","number"]},"trial_start":{"type":["null","integer"]},"subscription":{"type":["null","string"]},"current_period_end":{"type":["null","string"],"format":"date-time"},"cancel_at_period_end":{"type":["null","boolean"]},"current_period_start":{"type":["null","integer"]},"application_fee_percent":{"type":["null","number"]}}},supportedSyncModes=[full_refresh],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@18b68acb[failureOrigin=source,failureType=system_error,internalMessage=400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all,externalMessage=Invalid status: must be one of active, past_due, unpaid, incomplete, incomplete_expired, trialing, or paused,metadata=io.airbyte.config.Metadata@ab5ff11[additionalProperties={attemptNumber=1, jobId=6, from_trace_message=true}],stacktrace=Traceback (most recent call last): File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 114, in read yield from self._read_stream( File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 173, in _read_stream for record in record_iterator: File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 267, in _read_full_refresh for record in records: File "/airbyte/integration_code/source_stripe/streams.py", line 266, in read_records for record in parent_stream.read_records(sync_mode=SyncMode.full_refresh): File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 421, in read_records response = self._send_request(request, request_kwargs) File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 339, in _send_request return backoff_handler(user_backoff_handler)(request, request_kwargs) File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry ret = target(*args, **kwargs) File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry ret = target(*args, **kwargs) File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 306, in _send raise exc File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 303, in _send response.raise_for_status() File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status raise HTTPError(http_error_msg, response=self) requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all ,retryable=,timestamp=1658927933685], io.airbyte.config.FailureReason@1cd31522[failureOrigin=source,failureType=,internalMessage=io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped!,externalMessage=Something went wrong within the source connector,metadata=io.airbyte.config.Metadata@303442cb[additionalProperties={attemptNumber=1, jobId=6}],stacktrace=java.util.concurrent.CompletionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped! at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315) at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) at java.base/java.lang.Thread.run(Thread.java:833) Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source cannot be stopped! at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:331) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ... 3 more Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled. at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:329) ... 4 more ,retryable=,timestamp=1658927933941]]]
2022-07-27 13:19:00 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating...
2022-07-27 13:19:01 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/1/logs.log
2022-07-27 13:19:01 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:19:01 INFO i.a.w.g.DefaultNormalizationWorker(run):49 - Running normalization.
2022-07-27 13:19:01 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.2.8
2022-07-27 13:19:01 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.2.8 exists...
2022-07-27 13:19:01 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.2.8 was found locally.
2022-07-27 13:19:01 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:19:01 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/1/normalize --log-driver none --name normalization-snowflake-normalize-6-1-rtkzb --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.39.37-alpha airbyte/normalization-snowflake:0.2.8 run --integration-type snowflake --config destination_config.json --catalog destination_catalog.json
2022-07-27 13:19:01 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/6/1/normalize
2022-07-27 13:19:02 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/6/1/normalize')
2022-07-27 13:19:02 normalization > transform_snowflake
2022-07-27 13:19:02 normalization > Running: transform-catalog --integration-type snowflake --profile-config-dir /data/6/1/normalize --catalog destination_catalog.json --out /data/6/1/normalize/models/generated/ --json-column _airbyte_data
2022-07-27 13:19:02 normalization > Processing destination_catalog.json...
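The transform-catalog step that starts here is what produces the generation lines below: each nested object with properties, and each array of objects, in the stream's jsonSchema (dumped in the sync summary above) is unnested into its own model, while objects with an empty properties list are skipped. A rough sketch of that rule, assuming the schema has been loaded as a dict; this illustrates the idea, not Airbyte's actual implementation:

    def nested_streams(name, schema, path=()):
        # Recursively list the child streams normalization will expand,
        # mirroring the "Generating ..." / "Ignoring ..." lines below.
        for field, sub in schema.get("properties", {}).items():
            types = sub.get("type", [])
            if "object" in types and sub.get("properties"):
                yield "/".join(path + (name, field))
                yield from nested_streams(field, sub, path + (name,))
            elif "array" in types and sub.get("items", {}).get("properties"):
                yield "/".join(path + (name, field))
                yield from nested_streams(field, sub["items"], path + (name,))

    # For the subscription_items schema above this yields
    #   stripe_subscription_items/plan
    #   stripe_subscription_items/plan/tiers
    # while discount and the metadata objects are skipped ("properties list is empty").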
2022-07-27 13:19:02 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_AB1.sql from stripe_subscription_items
2022-07-27 13:19:02 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_AB2.sql from stripe_subscription_items
2022-07-27 13:19:02 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_AB3.sql from stripe_subscription_items
2022-07-27 13:19:02 normalization > Generating airbyte_tables/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS.sql from stripe_subscription_items
2022-07-27 13:19:02 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_AB1.sql from stripe_subscription_items/plan
2022-07-27 13:19:02 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_AB2.sql from stripe_subscription_items/plan
2022-07-27 13:19:02 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_AB3.sql from stripe_subscription_items/plan
2022-07-27 13:19:02 normalization > Generating airbyte_tables/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN.sql from stripe_subscription_items/plan
2022-07-27 13:19:02 normalization > Ignoring stream 'discount' from stripe_subscription_items/discount because properties list is empty
2022-07-27 13:19:02 normalization > Ignoring stream 'metadata' from stripe_subscription_items/metadata because properties list is empty
2022-07-27 13:19:02 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS_AB1.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:19:02 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS_AB2.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:19:02 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS_AB3.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:19:02 normalization > Generating airbyte_tables/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:19:02 normalization > Ignoring stream 'metadata' from stripe_subscription_items/plan/metadata because properties list is empty
2022-07-27 13:19:03 normalization > detected no config file for ssh, assuming ssh is off.
2022-07-27 13:19:06 normalization > [--event-buffer-size EVENT_BUFFER_SIZE]
2022-07-27 13:19:06 normalization > --event-buffer-size EVENT_BUFFER_SIZE
2022-07-27 13:19:06 normalization >
2022-07-27 13:19:06 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2022-07-27 13:19:06 normalization >
2022-07-27 13:19:11 normalization > 13:19:11 Running with dbt=1.0.0
2022-07-27 13:19:11 normalization > 13:19:11 Partial parse save file not found. Starting full parse.
2022-07-27 13:19:13 normalization > 13:19:13 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2022-07-27 13:19:13 normalization > There are 2 unused configuration paths:
2022-07-27 13:19:13 normalization > - models.airbyte_utils.generated.airbyte_views
2022-07-27 13:19:13 normalization > - models.airbyte_utils.generated.airbyte_incremental
2022-07-27 13:19:13 normalization >
2022-07-27 13:19:13 normalization > 13:19:13 Found 12 models, 0 tests, 0 snapshots, 0 analyses, 544 macros, 0 operations, 0 seed files, 1 source, 0 exposures, 0 metrics
2022-07-27 13:19:13 normalization > 13:19:13
2022-07-27 13:19:15 normalization > 13:19:15 Concurrency: 5 threads (target='prod')
2022-07-27 13:19:15 normalization > 13:19:15
2022-07-27 13:19:15 normalization > 13:19:15 1 of 3 START table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS.............................................................. [RUN]
2022-07-27 13:19:17 normalization > 13:19:17 1 of 3 OK created table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS......................................................... [SUCCESS 1 in 1.90s]
2022-07-27 13:19:17 normalization > 13:19:17 2 of 3 START table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN......................................................... [RUN]
2022-07-27 13:19:19 normalization > 13:19:19 2 of 3 OK created table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN.................................................... [SUCCESS 1 in 1.67s]
2022-07-27 13:19:19 normalization > 13:19:19 3 of 3 START table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS................................................... [RUN]
2022-07-27 13:19:20 normalization > 13:19:20 3 of 3 OK created table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS.............................................. [SUCCESS 1 in 1.06s]
2022-07-27 13:19:20 normalization > 13:19:20
2022-07-27 13:19:20 normalization > 13:19:20 Finished running 3 table models in 7.17s.
2022-07-27 13:19:20 normalization > 13:19:20
2022-07-27 13:19:20 normalization > 13:19:20 Completed successfully
2022-07-27 13:19:20 normalization > 13:19:20
2022-07-27 13:19:20 normalization > 13:19:20 Done. PASS=3 WARN=0 ERROR=0 SKIP=0 TOTAL=3
2022-07-27 13:19:21 INFO i.a.w.g.DefaultNormalizationWorker(run):73 - Normalization executed in 19 seconds.
2022-07-27 13:19:21 INFO i.a.w.g.DefaultNormalizationWorker(run):79 - Normalization summary: io.airbyte.config.NormalizationSummary@7c3b6924[startTime=1658927941197,endTime=1658927961166]
2022-07-27 13:19:21 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-27 13:19:21 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating...
2022-07-27 13:19:21 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.access_token: is missing but it is required, $.refresh_token: is missing but it is required
2022-07-27 13:19:21 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.password: object found, string expected
2022-07-27 13:19:21 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed. errors: $.method: does not have a value in the enumeration [Standard]
2022-07-27 13:19:22 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/2/logs.log
2022-07-27 13:19:22 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:19:22 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-stripe:0.1.34 exists...
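Note that the dbt run above succeeds even though the sync itself failed: normalization only builds models from whatever is in the raw table, and with recordsEmitted=0 and the stage copied "with 0 file(s)", the three tables it just created are built over zero rows. That is quick to confirm from Snowflake before the next attempt; a sketch with placeholder credentials (the real values live in destination_config.json):

    import snowflake.connector  # pip install snowflake-connector-python

    conn = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        warehouse="<warehouse>", database="<database>", schema="AIRBYTE",
    )
    cur = conn.cursor()
    # Unquoted identifiers resolve case-insensitively in Snowflake.
    cur.execute("SELECT COUNT(*) FROM _airbyte_raw_stripe_subscription_items")
    print(cur.fetchone()[0])  # expected 0 for this job: nothing was emitted upstream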
2022-07-27 13:19:22 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-stripe:0.1.34 was found locally.
2022-07-27 13:19:22 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:19:22 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/2 --log-driver none --name source-stripe-check-6-2-oliif --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/source-stripe:0.1.34 -e WORKER_JOB_ATTEMPT=2 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/source-stripe:0.1.34 check --config source_config.json
2022-07-27 13:19:23 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):97 - Check succeeded
2022-07-27 13:19:23 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-27 13:19:23 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/2/logs.log
2022-07-27 13:19:23 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:19:24 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.31 exists...
2022-07-27 13:19:24 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.31 was found locally.
2022-07-27 13:19:24 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:19:24 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/2 --log-driver none --name destination-snowflake-check-6-2-vklxh --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.31 -e WORKER_JOB_ATTEMPT=2 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/destination-snowflake:0.4.31 check --config source_config.json
2022-07-27 13:19:24 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Class path contains multiple SLF4J bindings.
2022-07-27 13:19:24 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:19:24 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:19:24 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:19:24 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:19:24 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-07-27 13:19:24 ERROR i.a.c.i.LineGobbler(voidCall):82 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-07-27 13:19:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:26 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-07-27 13:19:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:26 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-07-27 13:19:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:26 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK
2022-07-27 13:19:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:26 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-07-27 13:19:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:26 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:19:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:26 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:19:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:26 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:19:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:26 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:19:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:26 INFO i.a.i.d.j.c.SwitchingDestination(check):55 - Using destination type: INTERNAL_STAGING
2022-07-27 13:19:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:27 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-07-27 13:19:28 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:28 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@6c0905f6
2022-07-27 13:19:28 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:28 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-07-27 13:19:28 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:28 INFO i.a.d.j.DefaultJdbcDatabase(lambda$unsafeQuery$1):105 - closing connection
2022-07-27 13:19:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:29 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated...
2022-07-27 13:19:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-07-27 13:19:29 INFO c.z.h.HikariDataSource(close):352 - HikariPool-1 - Shutdown completed.
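The "Preparing command" lines show the exact docker invocation the worker uses, so a connector check can be replayed by hand while iterating on credentials, instead of going through repeated job attempts in the UI. A sketch based on the command above; /tmp/airbyte-debug/config.json is a hypothetical local copy of the connector config:

    import json
    import subprocess

    proc = subprocess.run(
        ["docker", "run", "--rm", "--init", "-i",
         "-v", "/tmp/airbyte-debug:/data",       # mount the directory holding config.json
         "airbyte/destination-snowflake:0.4.31",
         "check", "--config", "/data/config.json"],
        capture_output=True, text=True,
    )
    for line in proc.stdout.splitlines():
        try:
            msg = json.loads(line)
        except ValueError:
            continue  # connectors also print plain log lines (e.g. the SLF4J noise above)
        if msg.get("type") == "CONNECTION_STATUS":
            print(msg["connectionStatus"])  # e.g. {'status': 'SUCCEEDED'}

The same pattern works for the source image with "check", "--config", pointing at a copy of the Stripe config.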
2022-07-27 13:19:30 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-27 13:19:31 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/2/logs.log
2022-07-27 13:19:31 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:19:31 INFO i.a.w.g.DefaultReplicationWorker(run):115 - start sync worker. job id: 6 attempt id: 2
2022-07-27 13:19:31 INFO i.a.w.g.DefaultReplicationWorker(run):127 - configured sync modes: {null.subscription_items=full_refresh - overwrite}
2022-07-27 13:19:31 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination...
2022-07-27 13:19:31 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.31 exists...
2022-07-27 13:19:31 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.31 was found locally.
2022-07-27 13:19:31 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:19:31 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/2 --log-driver none --name destination-snowflake-write-6-2-jfvrw --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.31 -e WORKER_JOB_ATTEMPT=2 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/destination-snowflake:0.4.31 write --config destination_config.json --catalog destination_catalog.json
2022-07-27 13:19:31 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-stripe:0.1.34 exists...
2022-07-27 13:19:31 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-stripe:0.1.34 was found locally.
2022-07-27 13:19:31 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:19:31 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/2 --log-driver none --name source-stripe-read-6-2-pnyvi --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=false -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_CONNECTOR_IMAGE=airbyte/source-stripe:0.1.34 -e WORKER_JOB_ATTEMPT=2 -e AIRBYTE_VERSION=0.39.37-alpha -e WORKER_JOB_ID=6 airbyte/source-stripe:0.1.34 read --config source_config.json --catalog source_catalog.json --state input_state.json
2022-07-27 13:19:31 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):405 - Destination output thread started.
2022-07-27 13:19:31 INFO i.a.w.g.DefaultReplicationWorker(run):169 - Waiting for source and destination threads to complete.
2022-07-27 13:19:31 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):298 - Replication thread started.
2022-07-27 13:19:32 destination > SLF4J: Class path contains multiple SLF4J bindings.
2022-07-27 13:19:32 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:19:32 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:19:32 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:19:32 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-07-27 13:19:32 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-07-27 13:19:32 source > Starting syncing SourceStripe
2022-07-27 13:19:32 source > Syncing stream: subscription_items
2022-07-27 13:19:32 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-07-27 13:19:33 source > { "error": { "message": "Invalid status: must be one of active, past_due, unpaid, incomplete, incomplete_expired, trialing, or paused", "param": "status", "type": "invalid_request_error" } }
2022-07-27 13:19:33 source > Encountered an exception while reading stream subscription_items
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 114, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 173, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 267, in _read_full_refresh
    for record in records:
  File "/airbyte/integration_code/source_stripe/streams.py", line 266, in read_records
    for record in parent_stream.read_records(sync_mode=SyncMode.full_refresh):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 421, in read_records
    response = self._send_request(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 339, in _send_request
    return backoff_handler(user_backoff_handler)(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 306, in _send
    raise exc
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 303, in _send
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
2022-07-27 13:19:33 source > Finished syncing subscription_items
2022-07-27 13:19:33 source > SourceStripe runtimes: Syncing stream subscription_items 0:00:00.321002
2022-07-27 13:19:33 source > 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 114, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 173, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 267, in _read_full_refresh
    for record in records:
  File "/airbyte/integration_code/source_stripe/streams.py", line 266, in read_records
    for record in parent_stream.read_records(sync_mode=SyncMode.full_refresh):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 421, in read_records
    response = self._send_request(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 339, in _send_request
    return backoff_handler(user_backoff_handler)(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 306, in _send
    raise exc
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 303, in _send
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/airbyte/integration_code/main.py", line 13, in <module>
    launch(source, sys.argv[1:])
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 123, in launch
    for message in source_entrypoint.run(parsed_args):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 114, in run
    for message in generator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 127, in read
    raise AirbyteTracedException.from_exception(e, message=display_message) from e
airbyte_cdk.utils.traced_exception.AirbyteTracedException: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
2022-07-27 13:19:33 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):335 - Total records read: 1 (0 bytes)
2022-07-27 13:19:34 destination > 2022-07-27 13:19:34 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-07-27 13:19:34 destination > 2022-07-27 13:19:34 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-07-27 13:19:34 destination > 2022-07-27 13:19:34 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: WRITE
2022-07-27 13:19:34 destination > 2022-07-27 13:19:34 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-07-27 13:19:34 destination > 2022-07-27 13:19:34 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:19:34 destination > 2022-07-27 13:19:34 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:19:34 destination > 2022-07-27 13:19:34 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:19:34 destination > 2022-07-27 13:19:34 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-07-27 13:19:34 destination > 2022-07-27 13:19:34 INFO i.a.i.d.j.c.SwitchingDestination(getConsumer):65 - Using destination type: INTERNAL_STAGING
2022-07-27 13:19:35 destination > 2022-07-27 13:19:35 INFO i.a.i.d.s.StagingConsumerFactory(lambda$toWriteConfig$0):99 - Write config: WriteConfig{streamName=stripe_subscription_items, namespace=null, outputSchemaName=AIRBYTE, tmpTableName=_airbyte_tmp_dxr_stripe_subscription_items, outputTableName=_airbyte_raw_stripe_subscription_items, syncMode=overwrite}
2022-07-27 13:19:35 destination > 2022-07-27 13:19:35 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-07-27 13:19:35 destination > 2022-07-27 13:19:35 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):117 - Preparing tmp tables in destination started for 1 streams
2022-07-27 13:19:35 destination > 2022-07-27 13:19:35 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):125 - Preparing staging area in destination started for schema AIRBYTE stream stripe_subscription_items: tmp table: _airbyte_tmp_dxr_stripe_subscription_items, stage: 2022/07/27/13/07E8DDA8-9A46-49A9-B8B4-F192B4D961B7/
2022-07-27 13:19:35 destination > 2022-07-27 13:19:35 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-07-27 13:19:36 destination > 2022-07-27 13:19:36 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@3976ebfa
2022-07-27 13:19:36 destination > 2022-07-27 13:19:36 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-07-27 13:19:36 destination > 2022-07-27 13:19:36 INFO i.a.d.j.DefaultJdbcDatabase(lambda$unsafeQuery$1):105 - closing connection
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):136 - Preparing staging area in destination completed for schema AIRBYTE stream stripe_subscription_items
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):139 - Preparing tmp tables in destination completed.
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 WARN i.a.i.d.b.BufferedStreamConsumer(acceptTracked):145 - Unexpected message: TRACE
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):241 - The main thread is exiting while children non-daemon threads from a connector are still active.
2022-07-27 13:19:37 destination > Ideally, this situation should not happen...
2022-07-27 13:19:37 destination > Please check with maintainers if the connector or library code should safely clean up its threads before quitting instead.
2022-07-27 13:19:37 destination > The main thread is: main (RUNNABLE)
2022-07-27 13:19:37 destination > Thread stacktrace: java.base/java.lang.Thread.getStackTrace(Thread.java:1610)
2022-07-27 13:19:37 destination > at io.airbyte.integrations.base.IntegrationRunner.dumpThread(IntegrationRunner.java:276)
2022-07-27 13:19:37 destination > at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:245)
2022-07-27 13:19:37 destination > at io.airbyte.integrations.base.IntegrationRunner.runConsumer(IntegrationRunner.java:202)
2022-07-27 13:19:37 destination > at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$1(IntegrationRunner.java:165)
2022-07-27 13:19:37 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54)
2022-07-27 13:19:37 destination > at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:38)
2022-07-27 13:19:37 destination > at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:165)
2022-07-27 13:19:37 destination > at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:107)
2022-07-27 13:19:37 destination > at io.airbyte.integrations.destination.snowflake.SnowflakeDestination.main(SnowflakeDestination.java:30)
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):255 - Active non-daemon thread: pool-4-thread-1 (TIMED_WAITING)
2022-07-27 13:19:37 destination > Thread stacktrace: java.base@17.0.1/jdk.internal.misc.Unsafe.park(Native Method)
2022-07-27 13:19:37 destination > at java.base@17.0.1/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:252)
2022-07-27 13:19:37 destination > at java.base@17.0.1/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:1672)
2022-07-27 13:19:37 destination > at java.base@17.0.1/java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:460)
2022-07-27 13:19:37 destination > at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1061)
2022-07-27 13:19:37 destination > at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1122)
2022-07-27 13:19:37 destination > at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2022-07-27 13:19:37 destination > at java.base@17.0.1/java.lang.Thread.run(Thread.java:833)
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 INFO i.a.i.d.b.BufferedStreamConsumer(close):171 - executing on success close procedure.
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 INFO i.a.i.d.r.SerializedBufferingStrategy(flushAll):103 - Flushing all 0 current buffers (0 bytes in total)
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):186 - Copying into tables in destination started for 1 streams
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):195 - Copying stream stripe_subscription_items of schema AIRBYTE into tmp table _airbyte_tmp_dxr_stripe_subscription_items to final table _airbyte_raw_stripe_subscription_items from stage path 2022/07/27/13/07E8DDA8-9A46-49A9-B8B4-F192B4D961B7/ with 0 file(s) []
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 INFO i.a.i.d.j.SqlOperations(onDestinationCloseOperations):137 - No onDestinationCloseOperations required for this destination.
2022-07-27 13:19:37 destination > 2022-07-27 13:19:37 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):216 - Executing finalization of tables.
2022-07-27 13:19:38 destination > 2022-07-27 13:19:38 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):218 - Finalizing tables in destination completed.
2022-07-27 13:19:38 destination > 2022-07-27 13:19:38 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):220 - Cleaning up destination started for 1 streams
2022-07-27 13:19:38 destination > 2022-07-27 13:19:38 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):224 - Cleaning tmp table in destination started for stream stripe_subscription_items. schema AIRBYTE, tmp table name: _airbyte_tmp_dxr_stripe_subscription_items
2022-07-27 13:19:38 destination > 2022-07-27 13:19:38 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):230 - Cleaning stage in destination started for stream stripe_subscription_items. schema AIRBYTE, stage: AIRBYTE_STRIPE_SUBSCRIPTION_ITEMS
2022-07-27 13:19:38 destination > 2022-07-27 13:19:38 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):235 - Cleaning up destination completed.
2022-07-27 13:19:38 destination > 2022-07-27 13:19:38 INFO i.a.i.b.IntegrationRunner(runInternal):171 - Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-07-27 13:19:39 ERROR i.a.w.g.DefaultReplicationWorker(run):180 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:173) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:65) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
    Suppressed: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
        at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
2022-07-27 13:19:39 ERROR i.a.w.g.DefaultReplicationWorker(run):180 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
        at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
        at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:173) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:65) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
        Suppressed: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
                at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
                at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:137) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
                at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:65) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
                at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:158) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
                at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
        at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:348) ~[io.airbyte-airbyte-workers-0.39.37-alpha.jar:?]
        at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
        ... 1 more
2022-07-27 13:19:39 INFO i.a.w.g.DefaultReplicationWorker(run):239 - sync summary: io.airbyte.config.ReplicationAttemptSummary@f9b727d[status=failed,recordsSynced=0,bytesSynced=0,startTime=1658927971260,endTime=1658927979918,totalStats=io.airbyte.config.SyncStats@2c3f320d[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]]
2022-07-27 13:19:39 INFO i.a.w.g.DefaultReplicationWorker(run):268 - Source did not output any state messages
2022-07-27 13:19:39 WARN i.a.w.g.DefaultReplicationWorker(run):276 - State capture: No new state, falling back on input state: io.airbyte.config.State@68910277[state={}]
2022-07-27 13:19:39 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
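
The WARN about state capture reflects a simple fallback: an attempt that emitted no STATE messages carries its input state forward, so the next attempt resumes from the same cursor rather than losing progress. A hedged sketch of that logic (function and argument names are illustrative, not the worker's actual API):

    def capture_state(emitted_state_messages, input_state):
        # If the attempt emitted any STATE messages, the latest one wins;
        # otherwise fall back on the state the attempt started with
        # ("No new state, falling back on input state").
        if emitted_state_messages:
            return emitted_state_messages[-1]
        return input_state
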
2022-07-27 13:19:39 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):161 - sync summary: io.airbyte.config.StandardSyncOutput@37f30115[standardSyncSummary=io.airbyte.config.StandardSyncSummary@444a8d7f[status=failed,recordsSynced=0,bytesSynced=0,startTime=1658927971260,endTime=1658927979918,totalStats=io.airbyte.config.SyncStats@2c3f320d[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]],normalizationSummary=,state=io.airbyte.config.State@68910277[state={}],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@4fa628b9[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@15da0530[stream=io.airbyte.protocol.models.AirbyteStream@3a0e5537[name=stripe_subscription_items,jsonSchema={"type":["null","object"],"properties":{"id":{"type":["null","string"]},"plan":{"type":["null","object","string"],"properties":{"id":{"type":["null","string"]},"name":{"type":["null","string"]},"tiers":{"type":["null","array"],"items":{"type":["null","string","object"],"properties":{"up_to":{"type":["null","integer"]},"flat_amount":{"type":["null","integer"]},"unit_amount":{"type":["null","integer"]}}}},"active":{"type":["null","boolean"]},"amount":{"type":["null","integer"]},"object":{"type":["null","string"]},"created":{"type":["null","integer"]},"product":{"type":["null","string"]},"updated":{"type":["null","number"]},"currency":{"type":["null","string"]},"interval":{"type":["null","string"]},"livemode":{"type":["null","boolean"]},"metadata":{"type":["null","object"],"properties":{}},"nickname":{"type":["null","string"]},"tiers_mode":{"type":["null","string"]},"usage_type":{"type":["null","string"]},"billing_scheme":{"type":["null","string"]},"interval_count":{"type":["null","integer"]},"aggregate_usage":{"type":["null","string"]},"transform_usage":{"type":["null","string"]},"trial_period_days":{"type":["null","integer"]},"statement_descriptor":{"type":["null","string"]},"statement_description":{"type":["null","string"]}}},"start":{"type":["null","integer"]},"object":{"type":["null","string"]},"status":{"type":["null","string"]},"created":{"type":["null","integer"]},"customer":{"type":["null","string"]},"discount":{"type":["null","object"],"properties":{}},"ended_at":{"type":["null","number"]},"livemode":{"type":["null","boolean"]},"metadata":{"type":["null","object"],"properties":{}},"quantity":{"type":["null","integer"]},"trial_end":{"type":["null","number"]},"canceled_at":{"type":["null","string"],"format":"date-time"},"tax_percent":{"type":["null","number"]},"trial_start":{"type":["null","integer"]},"subscription":{"type":["null","string"]},"current_period_end":{"type":["null","string"],"format":"date-time"},"cancel_at_period_end":{"type":["null","boolean"]},"current_period_start":{"type":["null","integer"]},"application_fee_percent":{"type":["null","number"]}}},supportedSyncModes=[full_refresh],sourceDefinedCursor=,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@2a3f067c[failureOrigin=source,failureType=system_error,internalMessage=400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all,externalMessage=Invalid status: must be one of active, past_due, unpaid, incomplete, incomplete_expired, trialing, or paused,metadata=io.airbyte.config.Metadata@54a20062[additionalProperties={attemptNumber=2, jobId=6, from_trace_message=true}],stacktrace=Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 114, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 173, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 267, in _read_full_refresh
    for record in records:
  File "/airbyte/integration_code/source_stripe/streams.py", line 266, in read_records
    for record in parent_stream.read_records(sync_mode=SyncMode.full_refresh):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 421, in read_records
    response = self._send_request(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 339, in _send_request
    return backoff_handler(user_backoff_handler)(request, request_kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 306, in _send
    raise exc
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/http.py", line 303, in _send
    response.raise_for_status()
  File "/usr/local/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.stripe.com/v1/subscriptions?limit=100&created%5Bgte%5D=1467331200&status=all
,retryable=,timestamp=1658927973088], io.airbyte.config.FailureReason@11947c43[failureOrigin=source,failureType=,internalMessage=io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1,externalMessage=Something went wrong within the source connector,metadata=io.airbyte.config.Metadata@597b24ab[additionalProperties={attemptNumber=2, jobId=6}],stacktrace=java.util.concurrent.CompletionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
        at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
        at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
        at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
        at io.airbyte.workers.general.DefaultReplicationWorker.lambda$getReplicationRunnable$6(DefaultReplicationWorker.java:348)
        at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
        ... 3 more
,retryable=,timestamp=1658927973356]]]
2022-07-27 13:19:39 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating...
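
The root cause is the first FailureReason: the source's request to /v1/subscriptions with status=all was rejected with a 400, and the error message enumerates the statuses this API accepts. A minimal reproduction-and-fix sketch with requests (the URL and query parameters come from the traceback; the key placeholder and the choice of "active" are assumptions):

    import requests

    # Same request the connector built (see the traceback's URL), but with a
    # status value from the list the API's error message enumerates.
    # "STRIPE_API_KEY" is a placeholder; Stripe takes the secret key as the
    # basic-auth username with an empty password.
    resp = requests.get(
        "https://api.stripe.com/v1/subscriptions",
        params={"limit": 100, "created[gte]": 1467331200, "status": "active"},
        auth=("STRIPE_API_KEY", ""),
    )
    resp.raise_for_status()  # the CDK's http.py:303 fails here on the 400

The sketch only demonstrates the parameter the server rejected; in practice the fix is on the connector side (e.g. upgrading airbyte/source-stripe to a version whose status handling matches the account's Stripe API version) rather than hand-editing requests.
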
2022-07-27 13:19:40 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/6/2/logs.log
2022-07-27 13:19:40 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.37-alpha
2022-07-27 13:19:40 INFO i.a.w.g.DefaultNormalizationWorker(run):49 - Running normalization.
2022-07-27 13:19:40 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.2.8
2022-07-27 13:19:40 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.2.8 exists...
2022-07-27 13:19:40 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.2.8 was found locally.
2022-07-27 13:19:40 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 6
2022-07-27 13:19:40 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/6/2/normalize --log-driver none --name normalization-snowflake-normalize-6-2-rjpgw --network host -v /app/airbyte/airbyte-workspace:/data -v /tmp/airbyte-local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.39.37-alpha airbyte/normalization-snowflake:0.2.8 run --integration-type snowflake --config destination_config.json --catalog destination_catalog.json
2022-07-27 13:19:40 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/6/2/normalize
2022-07-27 13:19:41 normalization > Namespace(config='destination_config.json', integration_type=<DestinationType.snowflake: 'snowflake'>, out='/data/6/2/normalize')
2022-07-27 13:19:41 normalization > transform_snowflake
2022-07-27 13:19:41 normalization > Running: transform-catalog --integration-type snowflake --profile-config-dir /data/6/2/normalize --catalog destination_catalog.json --out /data/6/2/normalize/models/generated/ --json-column _airbyte_data
2022-07-27 13:19:41 normalization > Processing destination_catalog.json...
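
The transform-config step above turns destination_config.json into a dbt profile for Snowflake (the transform_snowflake line names that code path). A hedged sketch of the mapping; the input key names and exact output shape are assumptions, though the 'prod' target and 5 threads match the dbt output further down:

    import yaml  # used only to emit a profiles.yml-shaped document

    def transform_config(cfg: dict) -> dict:
        # Illustrative mapping from a Snowflake destination config to a
        # dbt profile; key names on the input side are assumptions.
        return {
            "normalize": {
                "target": "prod",
                "outputs": {
                    "prod": {
                        "type": "snowflake",
                        "account": cfg["host"].replace(".snowflakecomputing.com", ""),
                        "database": cfg["database"],
                        "schema": cfg["schema"],
                        "warehouse": cfg["warehouse"],
                        "role": cfg["role"],
                        "user": cfg["username"],
                        "password": cfg["password"],
                        "threads": 5,  # matches "Concurrency: 5 threads (target='prod')"
                    }
                },
            }
        }

    # e.g.: yaml.safe_dump(transform_config(config), open("profiles.yml", "w"))
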
2022-07-27 13:19:41 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_AB1.sql from stripe_subscription_items
2022-07-27 13:19:41 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_AB2.sql from stripe_subscription_items
2022-07-27 13:19:41 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_AB3.sql from stripe_subscription_items
2022-07-27 13:19:41 normalization > Generating airbyte_tables/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS.sql from stripe_subscription_items
2022-07-27 13:19:41 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_AB1.sql from stripe_subscription_items/plan
2022-07-27 13:19:41 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_AB2.sql from stripe_subscription_items/plan
2022-07-27 13:19:41 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_AB3.sql from stripe_subscription_items/plan
2022-07-27 13:19:41 normalization > Generating airbyte_tables/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN.sql from stripe_subscription_items/plan
2022-07-27 13:19:41 normalization > Ignoring stream 'discount' from stripe_subscription_items/discount because properties list is empty
2022-07-27 13:19:41 normalization > Ignoring stream 'metadata' from stripe_subscription_items/metadata because properties list is empty
2022-07-27 13:19:41 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS_AB1.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:19:41 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS_AB2.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:19:41 normalization > Generating airbyte_ctes/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS_AB3.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:19:41 normalization > Generating airbyte_tables/AIRBYTE/STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS.sql from stripe_subscription_items/plan/tiers
2022-07-27 13:19:41 normalization > Ignoring stream 'metadata' from stripe_subscription_items/plan/metadata because properties list is empty
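
The generated files above follow a predictable scheme: each stream, and each nested object or array with a non-empty properties list, gets three staging CTEs (_AB1 through _AB3) plus a final table model, with the nested path flattened into an underscore-joined, upper-cased Snowflake name. A small sketch of that naming rule (the rule is inferred from the log lines above; real normalization also truncates and hashes over-long names, which this case does not hit):

    def model_names(stream: str, path: list[str]) -> list[str]:
        # stripe_subscription_items/plan/tiers ->
        # STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS{_AB1.._AB3,}.sql
        base = "_".join([stream, *path]).upper()
        return [f"{base}_AB{i}.sql" for i in (1, 2, 3)] + [f"{base}.sql"]

    assert model_names("stripe_subscription_items", ["plan", "tiers"])[-1] \
        == "STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS.sql"
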
2022-07-27 13:19:41 normalization > detected no config file for ssh, assuming ssh is off.
2022-07-27 13:19:45 normalization > [--event-buffer-size EVENT_BUFFER_SIZE]
2022-07-27 13:19:45 normalization > --event-buffer-size EVENT_BUFFER_SIZE
2022-07-27 13:19:45 normalization >
2022-07-27 13:19:45 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2022-07-27 13:19:45 normalization >
2022-07-27 13:19:49 normalization > 13:19:49 Running with dbt=1.0.0
2022-07-27 13:19:49 normalization > 13:19:49 Partial parse save file not found. Starting full parse.
2022-07-27 13:19:52 normalization > 13:19:52 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2022-07-27 13:19:52 normalization > There are 2 unused configuration paths:
2022-07-27 13:19:52 normalization > - models.airbyte_utils.generated.airbyte_views
2022-07-27 13:19:52 normalization > - models.airbyte_utils.generated.airbyte_incremental
2022-07-27 13:19:52 normalization >
2022-07-27 13:19:52 normalization > 13:19:52 Found 12 models, 0 tests, 0 snapshots, 0 analyses, 544 macros, 0 operations, 0 seed files, 1 source, 0 exposures, 0 metrics
2022-07-27 13:19:52 normalization > 13:19:52
2022-07-27 13:19:53 normalization > 13:19:53 Concurrency: 5 threads (target='prod')
2022-07-27 13:19:53 normalization > 13:19:53
2022-07-27 13:19:54 normalization > 13:19:54 1 of 3 START table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS.............................................................. [RUN]
2022-07-27 13:19:55 normalization > 13:19:55 1 of 3 OK created table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS......................................................... [SUCCESS 1 in 1.73s]
2022-07-27 13:19:56 normalization > 13:19:56 2 of 3 START table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN......................................................... [RUN]
2022-07-27 13:19:57 normalization > 13:19:57 2 of 3 OK created table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN.................................................... [SUCCESS 1 in 1.43s]
2022-07-27 13:19:57 normalization > 13:19:57 3 of 3 START table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS................................................... [RUN]
2022-07-27 13:19:58 normalization > 13:19:58 3 of 3 OK created table model AIRBYTE.STRIPE_SUBSCRIPTION_ITEMS_PLAN_TIERS.............................................. [SUCCESS 1 in 1.18s]
2022-07-27 13:19:58 normalization > 13:19:58
2022-07-27 13:19:58 normalization > 13:19:58 Finished running 3 table models in 6.54s.
2022-07-27 13:19:58 normalization > 13:19:58
2022-07-27 13:19:58 normalization > 13:19:58 Completed successfully
2022-07-27 13:19:58 normalization > 13:19:58
2022-07-27 13:19:58 normalization > 13:19:58 Done. PASS=3 WARN=0 ERROR=0 SKIP=0 TOTAL=3
2022-07-27 13:19:59 INFO i.a.w.g.DefaultNormalizationWorker(run):73 - Normalization executed in 19 seconds.
2022-07-27 13:19:59 INFO i.a.w.g.DefaultNormalizationWorker(run):79 - Normalization summary: io.airbyte.config.NormalizationSummary@53e2b14f[startTime=1658927980164,endTime=1658927999189]
2022-07-27 13:19:59 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-07-27 13:19:59 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):291 - Stopping temporal heartbeating...