Issue syncing data from Google Sheets to Snowflake

2022-05-10 04:10:35 INFO i.a.w.w.WorkerRun(call):49 - Executing worker wrapper. Airbyte version: 0.37.0-alpha
2022-05-10 04:10:45 WARN i.t.i.r.GrpcSyncRetryer(retry):56 - Retrying after failure
io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.999791100s. [closed=[], open=[[buffered_nanos=9999932500, waiting_for_connection]]]
at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:262) ~[grpc-stub-1.44.1.jar:1.44.1]
at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:243) ~[grpc-stub-1.44.1.jar:1.44.1]
at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:156) ~[grpc-stub-1.44.1.jar:1.44.1]
at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.startWorkflowExecution(WorkflowServiceGrpc.java:2631) ~[temporal-serviceclient-1.8.1.jar:?]
at io.temporal.internal.client.external.GenericWorkflowClientExternalImpl.lambda$start$0(GenericWorkflowClientExternalImpl.java:88) ~[temporal-sdk-1.8.1.jar:?]
at io.temporal.internal.retryer.GrpcSyncRetryer.retry(GrpcSyncRetryer.java:61) ~[temporal-serviceclient-1.8.1.jar:?]
at io.temporal.internal.retryer.GrpcRetryer.retryWithResult(GrpcRetryer.java:51) ~[temporal-serviceclient-1.8.1.jar:?]
at io.temporal.internal.client.external.GenericWorkflowClientExternalImpl.start(GenericWorkflowClientExternalImpl.java:81) ~[temporal-sdk-1.8.1.jar:?]
at io.temporal.internal.client.RootWorkflowClientInvoker.start(RootWorkflowClientInvoker.java:55) ~[temporal-sdk-1.8.1.jar:?]
at io.temporal.internal.sync.WorkflowStubImpl.startWithOptions(WorkflowStubImpl.java:113) ~[temporal-sdk-1.8.1.jar:?]
at io.temporal.internal.sync.WorkflowStubImpl.start(WorkflowStubImpl.java:138) ~[temporal-sdk-1.8.1.jar:?]
at io.temporal.internal.sync.WorkflowInvocationHandler.startWorkflow(WorkflowInvocationHandler.java:192) ~[temporal-sdk-1.8.1.jar:?]
at io.temporal.internal.sync.WorkflowInvocationHandler.access$300(WorkflowInvocationHandler.java:48) ~[temporal-sdk-1.8.1.jar:?]
at io.temporal.internal.sync.WorkflowInvocationHandler$SyncWorkflowInvocationHandler.startWorkflow(WorkflowInvocationHandler.java:314) ~[temporal-sdk-1.8.1.jar:?]
at io.temporal.internal.sync.WorkflowInvocationHandler$SyncWorkflowInvocationHandler.invoke(WorkflowInvocationHandler.java:270) ~[temporal-sdk-1.8.1.jar:?]
at io.temporal.internal.sync.WorkflowInvocationHandler.invoke(WorkflowInvocationHandler.java:178) ~[temporal-sdk-1.8.1.jar:?]
at jdk.proxy2.$Proxy40.run(Unknown Source) ~[?:?]
at io.airbyte.workers.temporal.TemporalClient.lambda$submitSync$3(TemporalClient.java:151) ~[io.airbyte-airbyte-workers-0.37.0-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalClient.execute(TemporalClient.java:498) ~[io.airbyte-airbyte-workers-0.37.0-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalClient.submitSync(TemporalClient.java:150) ~[io.airbyte-airbyte-workers-0.37.0-alpha.jar:?]
at io.airbyte.workers.worker_run.TemporalWorkerRunFactory.lambda$createSupplier$0(TemporalWorkerRunFactory.java:49) ~[io.airbyte-airbyte-workers-0.37.0-alpha.jar:?]
at io.airbyte.workers.worker_run.WorkerRun.call(WorkerRun.java:51) [io.airbyte-airbyte-workers-0.37.0-alpha.jar:?]
at io.airbyte.workers.worker_run.WorkerRun.call(WorkerRun.java:22) [io.airbyte-airbyte-workers-0.37.0-alpha.jar:?]
at io.airbyte.commons.concurrency.LifecycledCallable.execute(LifecycledCallable.java:94) [io.airbyte-airbyte-commons-0.37.0-alpha.jar:?]
at io.airbyte.commons.concurrency.LifecycledCallable.call(LifecycledCallable.java:78) [io.airbyte-airbyte-commons-0.37.0-alpha.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
2022-05-10 04:10:45 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/101/2/logs.log
2022-05-10 04:10:45 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.37.0-alpha
2022-05-10 04:10:45 INFO i.a.w.DefaultReplicationWorker(run):104 - start sync worker. job id: 101 attempt id: 2
2022-05-10 04:10:45 INFO i.a.w.DefaultReplicationWorker(run):116 - configured sync modes: {null.salaries=full_refresh - overwrite}
2022-05-10 04:10:45 INFO i.a.w.p.a.DefaultAirbyteDestination(start):69 - Running destination...
2022-05-10 04:10:45 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-snowflake:0.4.25 exists...
2022-05-10 04:10:45 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-snowflake:0.4.25 was found locally.
2022-05-10 04:10:45 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 101
2022-05-10 04:10:45 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/101/2 --log-driver none --name destination-snowflake-write-101-2-bgfwu --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=2 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-snowflake:0.4.25 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.37.0-alpha -e WORKER_JOB_ID=101 airbyte/destination-snowflake:0.4.25 write --config destination_config.json --catalog destination_catalog.json
2022-05-10 04:10:45 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-google-sheets:0.2.12 exists...
2022-05-10 04:10:45 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-google-sheets:0.2.12 was found locally.
2022-05-10 04:10:45 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 101
2022-05-10 04:10:45 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/101/2 --log-driver none --name source-google-sheets-read-101-2-ouipp --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_JOB_ATTEMPT=2 -e WORKER_CONNECTOR_IMAGE=airbyte/source-google-sheets:0.2.12 -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.37.0-alpha -e WORKER_JOB_ID=101 airbyte/source-google-sheets:0.2.12 read --config source_config.json --catalog source_catalog.json
2022-05-10 04:10:45 INFO i.a.w.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$6):346 - Destination output thread started.
2022-05-10 04:10:45 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):279 - Replication thread started.
2022-05-10 04:10:45 INFO i.a.w.DefaultReplicationWorker(run):158 - Waiting for source and destination threads to complete.
2022-05-10 04:10:46 destination > SLF4J: Class path contains multiple SLF4J bindings.
2022-05-10 04:10:46 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-05-10 04:10:46 destination > SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2022-05-10 04:10:46 destination > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2022-05-10 04:10:46 destination > SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 INFO i.a.i.b.IntegrationRunner(runInternal):121 - Running integration: io.airbyte.integrations.destination.snowflake.SnowflakeDestination
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 INFO i.a.i.b.IntegrationRunner(runInternal):122 - Command: WRITE
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 INFO i.a.i.d.j.c.SwitchingDestination(getConsumer):65 - Using destination type: INTERNAL_STAGING
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 INFO i.a.i.d.s.StagingConsumerFactory(lambda$toWriteConfig$0):99 - Write config: WriteConfig{streamName=salaries, namespace=null, outputSchemaName=shazly_test_googlesheet, tmpTableName=_airbyte_tmp_gfn_salaries, outputTableName=_airbyte_raw_salaries, syncMode=overwrite}
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):117 - Preparing tmp tables in destination started for 1 streams
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 INFO i.a.i.d.s.StagingConsumerFactory(lambda$onStartFunction$2):125 - Preparing staging area in destination started for schema shazly_test_googlesheet stream salaries: tmp table: _airbyte_tmp_gfn_salaries, stage: 2022/05/10/04/7A8B06E0-100E-4628-91A5-8A32B7E7FA76/
2022-05-10 04:10:47 destination > 2022-05-10 04:10:47 INFO c.z.h.HikariDataSource(getConnection):110 - HikariPool-1 - Starting...
2022-05-10 04:10:48 source > Starting syncing spreadsheet 1SiEn_vS_YNRu2uBdYLPF_v4YoiS653kaJg0MFKRf4lo
2022-05-10 04:10:53 source > Unable to find the server at sheets.googleapis.com
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/httplib2/__init__.py", line 1343, in _conn_request
    conn.connect()
  File "/usr/local/lib/python3.9/site-packages/httplib2/__init__.py", line 1119, in connect
    address_info = socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM)
  File "/usr/local/lib/python3.9/socket.py", line 954, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -3] Try again

During handling of the above exception, another exception occurred:
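The `socket.gaierror: [Errno -3] Try again` above is an EAI_AGAIN error: DNS resolution for sheets.googleapis.com failed inside the source container, before any HTTP request was made. As a minimal sketch (standard library only; run it inside the source container or on the host to narrow down where resolution breaks), the failing lookup from the traceback can be reproduced directly:

```python
# Reproduce the getaddrinfo() call that failed in the traceback above.
import socket

host = "sheets.googleapis.com"
try:
    results = socket.getaddrinfo(host, 443, 0, socket.SOCK_STREAM)
    print(f"resolved {host} to {len(results)} address(es)")
except socket.gaierror as err:
    # [Errno -3] is EAI_AGAIN: a temporary DNS failure, often caused by
    # a misconfigured resolver or broken networking in the container.
    print(f"DNS lookup failed for {host}: {err}")
```

If this fails in the container but succeeds on the host, the problem is likely the container's DNS configuration rather than the connector itself.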

Hey, it looks to me like there may be a resource crunch. Can you share the resource details, like CPU and RAM?
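To gather those details, a small standard-library snippet (a sketch, Linux-only since it reads /proc/meminfo; not part of Airbyte) can report the core count and memory available to the host or container:

```python
# Print CPU core count and total/available memory (Linux).
import os

print("CPU cores:", os.cpu_count())
with open("/proc/meminfo") as f:
    for line in f:
        if line.startswith(("MemTotal", "MemAvailable")):
            print(line.strip())
```

Running this inside the worker container (rather than on the host) shows the resources the connector actually sees.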
