2022-05-16 12:09:38 INFO i.a.w.w.WorkerRun(call):49 - Executing worker wrapper. Airbyte version: 0.36.1-alpha
2022-05-16 12:09:38 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/47/0/logs.log
2022-05-16 12:09:38 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.36.1-alpha
2022-05-16 12:09:39 INFO i.a.w.DefaultReplicationWorker(run):104 - start sync worker. job id: 47 attempt id: 0
2022-05-16 12:09:39 INFO i.a.w.DefaultReplicationWorker(run):116 - configured sync modes: {null.Attribute_Bundle_Item_Mapping__Tag=incremental - append_dedup}
2022-05-16 12:09:39 INFO i.a.w.p.a.DefaultAirbyteDestination(start):69 - Running destination...
2022-05-16 12:09:39 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.24 exists...
2022-05-16 12:09:39 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.24 was found locally.
2022-05-16 12:09:39 INFO i.a.w.p.DockerProcessFactory(create):106 - Creating docker job ID: 47
2022-05-16 12:09:39 INFO i.a.w.p.DockerProcessFactory(create):158 - Preparing command: docker run --rm --init -i -w /data/47/0 --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.24 -e WORKER_JOB_ATTEMPT=0 -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.36.1-alpha -e WORKER_JOB_ID=47 airbyte/destination-snowflake:0.4.24 write --config destination_config.json --catalog destination_catalog.json
2022-05-16 12:09:39 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-salesforce:1.0.9 exists...
2022-05-16 12:09:39 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-salesforce:1.0.9 was found locally.
2022-05-16 12:09:39 INFO i.a.w.p.DockerProcessFactory(create):106 - Creating docker job ID: 47
2022-05-16 12:09:39 INFO i.a.w.p.DockerProcessFactory(create):158 - Preparing command: docker run --rm --init -i -w /data/47/0 --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_CONNECTOR_IMAGE=airbyte/source-salesforce:1.0.9 -e WORKER_JOB_ATTEMPT=0 -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.36.1-alpha -e WORKER_JOB_ID=47 airbyte/source-salesforce:1.0.9 read --config source_config.json --catalog source_catalog.json
2022-05-16 12:09:39 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):279 - Replication thread started.
2022-05-16 12:09:39 INFO i.a.w.DefaultReplicationWorker(run):158 - Waiting for source and destination threads to complete.
2022-05-16 12:09:39 INFO i.a.w.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$6):339 - Destination output thread started.
2022-05-16 12:09:39 destination > SLF4J: Class path contains multiple SLF4J bindings.
2022-05-16 12:09:39 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-05-16 12:09:39 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-05-16 12:09:39 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-05-16 12:09:39 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-05-16 12:09:40 destination > 2022-05-16 12:09:40 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-05-16 12:09:40 destination > 2022-05-16 12:09:40 INFO i.a.i.b.IntegrationRunner(runInternal):121 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-05-16 12:09:40 destination > 2022-05-16 12:09:40 INFO i.a.i.b.IntegrationRunner(runInternal):122 - Command: WRITE
2022-05-16 12:09:40 destination > 2022-05-16 12:09:40 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-05-16 12:09:40 destination > 2022-05-16 12:09:40 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-16 12:09:40 destination > 2022-05-16 12:09:40 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-16 12:09:40 destination > 2022-05-16 12:09:40 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-16 12:09:40 destination > 2022-05-16 12:09:40 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-16 12:09:40 destination > 2022-05-16 12:09:40 INFO i.a.i.d.j.c.SwitchingDestination(getConsumer):65 - Using destination type: INTERNAL_STAGING
2022-05-16 12:09:41 source > Starting generating streams
2022-05-16 12:09:41 destination > 2022-05-16 12:09:41 INFO i.a.i.d.s.StagingConsumerFactory(lambda$toWriteConfig$0):96 - Write config: WriteConfig{streamName=Attribute_Bundle_Item_Mapping__Tag, namespace=null, outputSchemaName=SALESFORCE_TEST, tmpTableName=_airbyte_tmp_iba_Attribute_Bundle_Item_Mapping__Tag, outputTableName=_airbyte_raw_Attribute_Bundle_Item_Mapping__Tag, syncMode=append_dedup}
2022-05-16 12:09:41 destination > 2022-05-16 12:09:41 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-05-16 12:09:41 destination > 2022-05-16 12:09:41 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):114 - Preparing tmp tables in destination started for 1 streams
2022-05-16 12:09:41 destination > 2022-05-16 12:09:41 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):122 - Preparing staging area in destination started for schema SALESFORCE_TEST stream Attribute_Bundle_Item_Mapping__Tag: tmp table: _airbyte_tmp_iba_Attribute_Bundle_Item_Mapping__Tag, stage: 2022/05/16/12/222743A4-B752-45C8-98AD-903354D4D94A/
2022-05-16 12:09:41 destination > 2022-05-16 12:09:41 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-05-16 12:09:41 source > Starting syncing SourceSalesforce
2022-05-16 12:09:41 source > Syncing stream: Attribute_Bundle_Item_Mapping__Tag
2022-05-16 12:09:41 source > error body: [{"errorCode":"INVALIDENTITY","message":"Entity '01I0e000000FMSK.Tag' is not supported by the Bulk API."}], sobject options: {'activateable': False, 'associateEntityType': 'Tag', 'associateParentEntity': 'Attribute_Bundle_Item_Mapping__c', 'createable': True, 'custom': False, 'customSetting': False, 'deepCloneable': False, 'deletable': True, 'deprecatedAndHidden': False, 'feedEnabled': False, 'hasSubtypes': False, 'isInterface': False, 'isSubtype': False, 'keyPrefix': None, 'label': 'Tag: Attribute To Bundle Item Mapping', 'labelPlural': 'Tag: Attribute To Bundle Item Mapping', 'layoutable': False, 'mergeable': False, 'mruEnabled': False, 'queryable': True, 'replicateable': False, 'retrieveable': True, 'searchable': False, 'triggerable': False, 'undeletable': False, 'updateable': False, 'urls': {'rowTemplate': '/services/data/v52.0/sobjects/Attribute_Bundle_Item_Mapping__Tag/{ID}', 'describe': '/services/data/v52.0/sobjects/Attribute_Bundle_Item_Mapping__Tag/describe', 'sobject': '/services/data/v52.0/sobjects/Attribute_Bundle_Item_Mapping__Tag'}}
2022-05-16 12:09:41 source > Giving up for returned HTTP status: 400, body: [{"errorCode":"INVALIDENTITY","message":"Entity '01I0e000000FMSK.Tag' is not supported by the Bulk API."}]
2022-05-16 12:09:41 source > Giving up _send_http_request(...) after 1 tries (requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://global.my.salesforce.com/services/data/v52.0/jobs/query)
2022-05-16 12:09:41 source > Cannot receive data for stream 'Attribute_Bundle_Item_Mapping__Tag' using BULK API, sobject options: {'activateable': False, 'associateEntityType': 'Tag', 'associateParentEntity': 'Attribute_Bundle_Item_Mapping__c', 'createable': True, 'custom': False, 'customSetting': False, 'deepCloneable': False, 'deletable': True, 'deprecatedAndHidden': False, 'feedEnabled': False, 'hasSubtypes': False, 'isInterface': False, 'isSubtype': False, 'keyPrefix': None, 'label': 'Tag: Attribute To Bundle Item Mapping', 'labelPlural': 'Tag: Attribute To Bundle Item Mapping', 'layoutable': False, 'mergeable': False, 'mruEnabled': False, 'queryable': True, 'replicateable': False, 'retrieveable': True, 'searchable': False, 'triggerable': False, 'undeletable': False, 'updateable': False, 'urls': {'rowTemplate': '/services/data/v52.0/sobjects/Attribute_Bundle_Item_Mapping__Tag/{ID}', 'describe': '/services/data/v52.0/sobjects/Attribute_Bundle_Item_Mapping__Tag/describe', 'sobject': '/services/data/v52.0/sobjects/Attribute_Bundle_Item_Mapping__Tag'}}, error message: 'Entity '01I0e000000FMSK.Tag' is not supported by the Bulk API.'
2022-05-16 12:09:41 source > Encountered an exception while reading stream SourceSalesforce
Traceback (most recent call last):
  File "/airbyte/integration_code/source_salesforce/source.py", line 111, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 159, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 215, in _read_incremental
    for record_counter, record_data in enumerate(records, start=1):
  File "/airbyte/integration_code/source_salesforce/streams.py", line 381, in read_records
    raise SalesforceException(f"Job for {self.name} stream using BULK API was failed.")
source_salesforce.exceptions.SalesforceException: Job for Attribute_Bundle_Item_Mapping__Tag stream using BULK API was failed.
2022-05-16 12:09:41 source > Job for Attribute_Bundle_Item_Mapping__Tag stream using BULK API was failed.
Traceback (most recent call last):
  File "/airbyte/integration_code/main.py", line 13, in <module>
    launch(source, sys.argv[1:])
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 129, in launch
    for message in source_entrypoint.run(parsed_args):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 120, in run
    for message in generator:
  File "/airbyte/integration_code/source_salesforce/source.py", line 128, in read
    raise e
  File "/airbyte/integration_code/source_salesforce/source.py", line 111, in read
    yield from self._read_stream(
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 159, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py", line 215, in _read_incremental
    for record_counter, record_data in enumerate(records, start=1):
  File "/airbyte/integration_code/source_salesforce/streams.py", line 381, in read_records
    raise SalesforceException(f"Job for {self.name} stream using BULK API was failed.")
source_salesforce.exceptions.SalesforceException: Job for Attribute_Bundle_Item_Mapping__Tag stream using BULK API was failed.
2022-05-16 12:09:41 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. errors: $.type: does not have a value in the enumeration [RECORD, STATE, LOG, SPEC, CONNECTION_STATUS, CATALOG]
2022-05-16 12:09:41 ERROR i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: {"type":"TRACE","trace":{"type":"ERROR","emitted_at":1.65270298180575E12,"error":{"message":"Something went wrong in the connector. See the logs for more details.","internal_message":"Job for Attribute_Bundle_Item_Mapping__Tag stream using BULK API was failed.","stack_trace":"Traceback (most recent call last):\n File \"/airbyte/integration_code/main.py\", line 13, in \n launch(source, sys.argv[1:])\n File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 129, in launch\n for message in source_entrypoint.run(parsed_args):\n File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py\", line 120, in run\n for message in generator:\n File \"/airbyte/integration_code/source_salesforce/source.py\", line 128, in read\n raise e\n File \"/airbyte/integration_code/source_salesforce/source.py\", line 111, in read\n yield from self._read_stream(\n File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py\", line 159, in _read_stream\n for record in record_iterator:\n File \"/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/abstract_source.py\", line 215, in _read_incremental\n for record_counter, record_data in enumerate(records, start=1):\n File \"/airbyte/integration_code/source_salesforce/streams.py\", line 381, in read_records\n raise SalesforceException(f\"Job for {self.name} stream using BULK API was failed.\")\nsource_salesforce.exceptions.SalesforceException: Job for Attribute_Bundle_Item_Mapping__Tag stream using BULK API was failed.\n","failure_type":"system_error"}}}
2022-05-16 12:09:42 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):305 - Total records read: 0 (0 bytes)
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 INFO c.z.h.p.HikariPool(checkFailFast):565 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@480b57e2
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 INFO c.z.h.HikariDataSource(getConnection):123 - HikariPool-1 - Start completed.
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 INFO i.a.d.j.DefaultJdbcDatabase(lambda$unsafeQuery$1):106 - closing connection
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):133 - Preparing staging area in destination completed for schema SALESFORCE_TEST stream Attribute_Bundle_Item_Mapping__Tag
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):136 - Preparing tmp tables in destination completed.
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):229 - The main thread is exiting while children non-daemon threads from a connector are still active.
2022-05-16 12:09:42 destination > Ideally, this situation should not happen...
2022-05-16 12:09:42 destination > Please check with maintainers if the connector or library code should safely clean up its threads before quitting instead.
2022-05-16 12:09:42 destination > The main thread is: main (RUNNABLE)
2022-05-16 12:09:42 destination > Thread stacktrace: java.base/java.lang.Thread.getStackTrace(Thread.java:1610)
2022-05-16 12:09:42 destination >     at io.airbyte.integrations.base.IntegrationRunner.dumpThread(IntegrationRunner.java:264)
2022-05-16 12:09:42 destination >     at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:233)
2022-05-16 12:09:42 destination >     at io.airbyte.integrations.base.IntegrationRunner.runConsumer(IntegrationRunner.java:190)
2022-05-16 12:09:42 destination >     at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$1(IntegrationRunner.java:163)
2022-05-16 12:09:42 destination >     at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54)
2022-05-16 12:09:42 destination >     at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:38)
2022-05-16 12:09:42 destination >     at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:163)
2022-05-16 12:09:42 destination >     at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:105)
2022-05-16 12:09:42 destination >     at io.airbyte.integrations.destination.snowflake.SnowflakeDestination.main(SnowflakeDestination.java:30)
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 WARN i.a.i.b.IntegrationRunner(watchForOrphanThreads):243 - Active non-daemon thread: pool-4-thread-1 (RUNNABLE)
2022-05-16 12:09:42 destination > Thread stacktrace: java.base@17.0.1/sun.nio.ch.Net.poll(Native Method)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/sun.nio.ch.NioSocketImpl.park(NioSocketImpl.java:181)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/sun.nio.ch.NioSocketImpl.timedRead(NioSocketImpl.java:285)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:309)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:350)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:803)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/java.net.Socket$SocketInputStream.read(Socket.java:966)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:478)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:472)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1455)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1059)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.internal.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.RestRequest.execute(RestRequest.java:160)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.core.HttpUtil.executeRequestInternal(HttpUtil.java:639)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.core.HttpUtil.executeRequest(HttpUtil.java:584)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.core.HttpUtil.executeGeneralRequest(HttpUtil.java:551)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.telemetry.TelemetryClient.sendBatch(TelemetryClient.java:256)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.telemetry.TelemetryClient.lambda$sendBatchAsync$0(TelemetryClient.java:204)
2022-05-16 12:09:42 destination >     at app//net.snowflake.client.jdbc.telemetry.TelemetryClient$$Lambda$254/0x0000000800fbfa68.call(Unknown Source)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/java.util.concurrent.FutureTask.run(FutureTask.java:264)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2022-05-16 12:09:42 destination >     at java.base@17.0.1/java.lang.Thread.run(Thread.java:833)
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 INFO i.a.i.d.b.BufferedStreamConsumer(close):170 - executing on success close procedure.
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 INFO i.a.i.d.r.SerializedBufferingStrategy(flushAll):92 - Flushing all 0 current buffers (0 bytes in total)
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):182 - Copying into tables in destination started for 1 streams
2022-05-16 12:09:42 destination > 2022-05-16 12:09:42 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):191 - Copying stream Attribute_Bundle_Item_Mapping__Tag of schema SALESFORCE_TEST into tmp table _airbyte_tmp_iba_Attribute_Bundle_Item_Mapping__Tag to final table _airbyte_raw_Attribute_Bundle_Item_Mapping__Tag from stage path 2022/05/16/12/222743A4-B752-45C8-98AD-903354D4D94A/ with 0 file(s) []
2022-05-16 12:09:43 destination > 2022-05-16 12:09:43 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):213 - Executing finalization of tables.
2022-05-16 12:09:44 destination > 2022-05-16 12:09:44 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):215 - Finalizing tables in destination completed.
2022-05-16 12:09:44 destination > 2022-05-16 12:09:44 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):217 - Cleaning up destination started for 1 streams
2022-05-16 12:09:44 destination > 2022-05-16 12:09:44 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):221 - Cleaning tmp table in destination started for stream Attribute_Bundle_Item_Mapping__Tag. schema SALESFORCE_TEST, tmp table name: _airbyte_tmp_iba_Attribute_Bundle_Item_Mapping__Tag
2022-05-16 12:09:44 destination > 2022-05-16 12:09:44 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):226 - Cleaning stage in destination started for stream Attribute_Bundle_Item_Mapping__Tag. schema SALESFORCE_TEST, stage: SALESFORCE_TEST_ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG
2022-05-16 12:09:44 destination > 2022-05-16 12:09:44 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onCloseFunction$4):230 - Cleaning up destination completed.
2022-05-16 12:09:44 destination > 2022-05-16 12:09:44 INFO i.a.i.b.IntegrationRunner(runInternal):169 - Completed integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-05-16 12:09:45 ERROR i.a.w.DefaultReplicationWorker(run):169 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:162) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
    Suppressed: io.airbyte.workers.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
        at io.airbyte.workers.protocols.airbyte.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:126) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
        at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
    at io.airbyte.workers.DefaultReplicationWorker.lambda$getReplicationRunnable$5(DefaultReplicationWorker.java:312) ~[io.airbyte-airbyte-workers-0.36.1-alpha.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    ... 1 more
2022-05-16 12:09:45 INFO i.a.w.DefaultReplicationWorker(run):228 - sync summary: io.airbyte.config.ReplicationAttemptSummary@748f77[status=failed,recordsSynced=0,bytesSynced=0,startTime=1652702979014,endTime=1652702985692,totalStats=io.airbyte.config.SyncStats@3129fd87[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]]
2022-05-16 12:09:45 INFO i.a.w.DefaultReplicationWorker(run):250 - Source did not output any state messages
2022-05-16 12:09:45 WARN i.a.w.DefaultReplicationWorker(run):261 - State capture: No state retained.
2022-05-16 12:09:45 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
2022-05-16 12:09:45 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$1):147 - sync summary: io.airbyte.config.StandardSyncOutput@67a9c841[standardSyncSummary=io.airbyte.config.StandardSyncSummary@6dc90704[status=failed,recordsSynced=0,bytesSynced=0,startTime=1652702979014,endTime=1652702985692,totalStats=io.airbyte.config.SyncStats@3129fd87[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]],normalizationSummary=,state=,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@1f556ace[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@3cf0047d[stream=io.airbyte.protocol.models.AirbyteStream@15b23862[name=Attribute_Bundle_Item_Mapping__Tag,jsonSchema={"type":"object","$schema":"http://json-schema.org/draft-07/schema#","properties":{"Id":{"type":["string","null"]},"Name":{"type":["string","null"]},"Type":{"type":["string","null"]},"ItemId":{"type":["string","null"]},"IsDeleted":{"type":["boolean","null"]},"CreatedDate":{"type":["string","null"],"format":"date-time"},"SystemModstamp":{"type":["string","null"],"format":"date-time"},"TagDefinitionId":{"type":["string","null"]}},"additionalProperties":true},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[SystemModstamp],sourceDefinedPrimaryKey=[[Id]],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[SystemModstamp],destinationSyncMode=append_dedup,primaryKey=[[Id]],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@30e6ecf7[failureOrigin=source,failureType=,internalMessage=io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1,externalMessage=Something went wrong within the source connector,metadata=io.airbyte.config.Metadata@451e33f[additionalProperties={attemptNumber=0, jobId=47}],stacktrace=java.util.concurrent.CompletionException: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1 at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315) at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) at java.base/java.lang.Thread.run(Thread.java:833) Caused by: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1 at io.airbyte.workers.DefaultReplicationWorker.lambda$getReplicationRunnable$5(DefaultReplicationWorker.java:312) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ... 3 more ,retryable=,timestamp=1652702982046]]]
2022-05-16 12:09:45 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):235 - Stopping temporal heartbeating...
2022-05-16 12:09:45 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/47/0/logs.log
2022-05-16 12:09:45 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.36.1-alpha
2022-05-16 12:09:45 INFO i.a.w.DefaultNormalizationWorker(run):47 - Running normalization.
2022-05-16 12:09:45 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):122 - Running with normalization version: airbyte/normalization-snowflake:0.1.75
2022-05-16 12:09:45 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/normalization-snowflake:0.1.75 exists...
2022-05-16 12:09:45 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/normalization-snowflake:0.1.75 was found locally.
2022-05-16 12:09:45 INFO i.a.w.p.DockerProcessFactory(create):106 - Creating docker job ID: 47
2022-05-16 12:09:45 INFO i.a.w.p.DockerProcessFactory(create):158 - Preparing command: docker run --rm --init -i -w /data/47/0/normalize --log-driver none --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.36.1-alpha airbyte/normalization-snowflake:0.1.75 run --integration-type snowflake --config destination_config.json --catalog destination_catalog.json
2022-05-16 12:09:46 normalization > Running: transform-config --config destination_config.json --integration-type snowflake --out /data/47/0/normalize
2022-05-16 12:09:46 normalization > Namespace(config='destination_config.json', integration_type=, out='/data/47/0/normalize')
2022-05-16 12:09:46 normalization > transform_snowflake
2022-05-16 12:09:46 normalization > Running: transform-catalog --integration-type snowflake --profile-config-dir /data/47/0/normalize --catalog destination_catalog.json --out /data/47/0/normalize/models/generated/ --json-column _airbyte_data
2022-05-16 12:09:46 normalization > Processing destination_catalog.json...
2022-05-16 12:09:46 normalization > Generating airbyte_ctes/SALESFORCE_TEST/ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG_AB1.sql from Attribute_Bundle_Item_Mapping__Tag
2022-05-16 12:09:46 normalization > Generating airbyte_ctes/SALESFORCE_TEST/ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG_AB2.sql from Attribute_Bundle_Item_Mapping__Tag
2022-05-16 12:09:46 normalization > Generating airbyte_views/SALESFORCE_TEST/ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG_STG.sql from Attribute_Bundle_Item_Mapping__Tag
2022-05-16 12:09:46 normalization > Generating airbyte_incremental/scd/SALESFORCE_TEST/ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG_SCD.sql from Attribute_Bundle_Item_Mapping__Tag
2022-05-16 12:09:46 normalization > Generating airbyte_incremental/SALESFORCE_TEST/ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG.sql from Attribute_Bundle_Item_Mapping__Tag
2022-05-16 12:09:46 normalization > detected no config file for ssh, assuming ssh is off.
2022-05-16 12:09:48 normalization > [--event-buffer-size EVENT_BUFFER_SIZE]
2022-05-16 12:09:48 normalization > --event-buffer-size EVENT_BUFFER_SIZE
2022-05-16 12:09:48 normalization >
2022-05-16 12:09:48 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2022-05-16 12:09:48 normalization >
2022-05-16 12:09:50 normalization > 12:09:50 Running with dbt=1.0.0
2022-05-16 12:09:50 normalization > 12:09:50 Partial parse save file not found. Starting full parse.
2022-05-16 12:09:52 normalization > 12:09:52 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2022-05-16 12:09:52 normalization > There are 1 unused configuration paths:
2022-05-16 12:09:52 normalization > - models.airbyte_utils.generated.airbyte_tables
2022-05-16 12:09:52 normalization >
2022-05-16 12:09:52 normalization > 12:09:52 Found 5 models, 0 tests, 0 snapshots, 0 analyses, 528 macros, 0 operations, 0 seed files, 1 source, 0 exposures, 0 metrics
2022-05-16 12:09:52 normalization > 12:09:52
2022-05-16 12:09:54 normalization > 12:09:54 Concurrency: 5 threads (target='prod')
2022-05-16 12:09:54 normalization > 12:09:54
2022-05-16 12:09:54 normalization > 12:09:54 1 of 3 START view model _AIRBYTE_SALESFORCE_TEST.ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG_STG............................ [RUN]
2022-05-16 12:09:55 normalization > 12:09:55 1 of 3 OK created view model _AIRBYTE_SALESFORCE_TEST.ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG_STG....................... [SUCCESS 1 in 0.94s]
2022-05-16 12:09:55 normalization > 12:09:55 2 of 3 START incremental model SALESFORCE_TEST.ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG_SCD.............................. [RUN]
2022-05-16 12:09:55 normalization > 12:09:55 12:09:55 + "AIRBYTE_DATABASE".SALESFORCE_TEST."ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG_SCD"._AIRBYTE_AB_ID does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-05-16 12:09:56 normalization > 12:09:56 2 of 3 OK created incremental model SALESFORCE_TEST.ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG_SCD......................... [SUCCESS 1 in 1.75s]
2022-05-16 12:09:56 normalization > 12:09:56 3 of 3 START incremental model SALESFORCE_TEST.ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG.................................. [RUN]
2022-05-16 12:09:57 normalization > 12:09:57 12:09:57 + "AIRBYTE_DATABASE".SALESFORCE_TEST."ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG"._AIRBYTE_AB_ID does not exist yet. The table will be created or rebuilt with dbt.full_refresh
2022-05-16 12:09:58 normalization > 12:09:58 3 of 3 OK created incremental model SALESFORCE_TEST.ATTRIBUTE_BUNDLE_ITEM_MAPPING__TAG............................. [SUCCESS 1 in 1.36s]
2022-05-16 12:09:58 normalization > 12:09:58
2022-05-16 12:09:58 normalization > 12:09:58 Finished running 1 view model, 2 incremental models in 6.19s.
2022-05-16 12:09:58 normalization > 12:09:58
2022-05-16 12:09:58 normalization > 12:09:58 Completed successfully
2022-05-16 12:09:58 normalization > 12:09:58
2022-05-16 12:09:58 normalization > 12:09:58 Done. PASS=3 WARN=0 ERROR=0 SKIP=0 TOTAL=3
2022-05-16 12:09:58 INFO i.a.w.DefaultNormalizationWorker(run):71 - Normalization executed in 12 seconds.
2022-05-16 12:09:58 INFO i.a.w.DefaultNormalizationWorker(run):77 - Normalization summary: io.airbyte.config.NormalizationSummary@326ea0de[startTime=1652702985747,endTime=1652702998605]
2022-05-16 12:09:58 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
2022-05-16 12:09:58 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):235 - Stopping temporal heartbeating...
2022-05-16 12:09:58 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. errors: $.access_token: is missing but it is required, $.refresh_token: is missing but it is required
2022-05-16 12:09:58 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. errors: $.method: does not have a value in the enumeration [Standard]
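The two JsonSchemaValidator failures at the end of the log point at the stored connector config: required fields `access_token` and `refresh_token` are absent, and `method` holds a value outside the allowed enumeration `[Standard]`. A minimal sketch of that required-field / enum check in plain Python may make the messages easier to read; only the field names come from the log, and the example configs below are hypothetical, not the actual job's configs:

```python
# Sketch of a required-field / enum-value check in the style of the log's
# JsonSchemaValidator messages. Field names are taken from the error output;
# the sample configs are hypothetical.

def validate(config, required, enums):
    """Return schema-style error strings for missing keys and bad enum values."""
    errors = [f"$.{key}: is missing but it is required"
              for key in required if key not in config]
    for key, allowed in enums.items():
        if key in config and config[key] not in allowed:
            errors.append(f"$.{key}: does not have a value in the enumeration {allowed}")
    return errors

# A config lacking the OAuth tokens yields errors like the first log message:
print(validate({"client_id": "xxx"},
               required=["access_token", "refresh_token"],
               enums={}))

# A config whose "method" is outside the enumeration yields one like the second:
print(validate({"method": "Internal"},
               required=[],
               enums={"method": ["Standard"]}))
```

Under this reading, the sync's source config would need to be re-authenticated (or re-saved) so that both tokens are present and `method` matches an allowed value.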