Some users occasionally encounter this error in the Temporal container:
io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 9.983793757s.
Workaround
Currently, the best advice when dealing with this error is to increase the resources (memory/disk/CPU) of your server and restart Airbyte.
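If you deployed with docker-compose, a minimal restart sketch (assuming the default Airbyte repo layout, with the resize happening while Airbyte is down) looks like:
$ cd airbyte                # repo root containing docker-compose.yaml
$ docker-compose down       # stop and remove the Airbyte containers
$ # ...resize the instance (more memory/CPU/disk) via your cloud console...
$ docker-compose up -d      # bring Airbyte back up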
Further investigation: understand the root cause of this error, or whether a Temporal configuration change could prevent it.
Hi @marcosmarxm
Is there any update on this topic?
It's not clear yet @boggdan, but increasing resources (memory/disk/CPU) solved the issue for a couple of users.
Hi @marcosmarxm ,
I faced this issue last night. I'm fine with increasing the resources of the EC2 instance where my Airbyte instance is deployed, but is there any way to see precisely what happened? Note that I had to kill the whole instance to restart the Docker containers. Is there a way to access the Docker logs from before the restart?
Thanks
If you already restarted, it's not possible to see the logs unless your deployment is saving them to a file.
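For next time, one way to save the container logs to files before tearing anything down (a sketch; the container names assume the default docker-compose setup, and the logs are lost once the containers are removed):
$ docker logs --timestamps airbyte-temporal > airbyte-temporal.log 2>&1
$ docker logs --timestamps airbyte-server > airbyte-server.log 2>&1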
@marcosmarxm increasing resources is not an option for us; is there anything else we can do?
Ulan (June 1, 2022, 1:49pm):
Hi, having the same issue here on the local WSL2 docker-compose.
airbyte-server | io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: Deadline exceeded after 4.998909226s.
airbyte-server | at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:262) ~[grpc-stub-1.44.1.jar:1.44.1]
airbyte-server | at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:243) ~[grpc-stub-1.44.1.jar:1.44.1]
airbyte-server | at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:156) ~[grpc-stub-1.44.1.jar:1.44.1]
airbyte-server | at io.grpc.health.v1.HealthGrpc$HealthBlockingStub.check(HealthGrpc.java:252) ~[grpc-services-1.44.1.jar:1.44.1]
airbyte-server | at io.temporal.serviceclient.WorkflowServiceStubsImpl.lambda$checkHealth$2(WorkflowServiceStubsImpl.java:286) ~[temporal-serviceclient-1.8.1.jar:?]
airbyte-server | at io.temporal.internal.retryer.GrpcSyncRetryer.retry(GrpcSyncRetryer.java:61) ~[temporal-serviceclient-1.8.1.jar:?]
airbyte-server | at io.temporal.internal.retryer.GrpcRetryer.retryWithResult(GrpcRetryer.java:51) ~[temporal-serviceclient-1.8.1.jar:?]
airbyte-server | at io.temporal.serviceclient.WorkflowServiceStubsImpl.checkHealth(WorkflowServiceStubsImpl.java:279) ~[temporal-serviceclient-1.8.1.jar:?]
airbyte-server | at io.temporal.serviceclient.WorkflowServiceStubsImpl.<init>(WorkflowServiceStubsImpl.java:186) ~[temporal-serviceclient-1.8.1.jar:?]
airbyte-server | at io.temporal.serviceclient.WorkflowServiceStubs.newInstance(WorkflowServiceStubs.java:51) ~[temporal-serviceclient-1.8.1.jar:?]
airbyte-server | at io.temporal.serviceclient.WorkflowServiceStubs.newInstance(WorkflowServiceStubs.java:41) ~[temporal-serviceclient-1.8.1.jar:?]
airbyte-server | at io.airbyte.workers.temporal.TemporalUtils.lambda$createTemporalService$0(TemporalUtils.java:61) ~[io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
airbyte-server | at io.airbyte.workers.temporal.TemporalUtils.getTemporalClientWhenConnected(TemporalUtils.java:190) [io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
airbyte-server | at io.airbyte.workers.temporal.TemporalUtils.createTemporalService(TemporalUtils.java:57) [io.airbyte-airbyte-workers-0.39.7-alpha.jar:?]
airbyte-server | at io.airbyte.server.ServerApp.getServer(ServerApp.java:204) [io.airbyte-airbyte-server-0.39.7-alpha.jar:?]
airbyte-server | at io.airbyte.server.ServerApp.main(ServerApp.java:296) [io.airbyte-airbyte-server-0.39.7-alpha.jar:?]
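The trace shows the server's startup health check against the Temporal frontend (WorkflowServiceStubsImpl.checkHealth) timing out, rather than a failure inside a sync job. One way to probe Temporal directly (assuming the container is named airbyte-temporal and ships tctl, as the temporalio/auto-setup image does):
$ docker exec airbyte-temporal tctl --address 127.0.0.1:7233 cluster health
A healthy frontend should report SERVING.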
And it doesn't look like an issue related to RAM etc.:
$ free -m
              total        used        free      shared  buff/cache   available
Mem:          29079        1198       27297          20         582       27512
Swap:          8192           0        8192
$ docker stats
CONTAINER ID   NAME                CPU %   MEM USAGE / LIMIT    MEM %   NET I/O           BLOCK I/O   PIDS
84a4feb41a59   airbyte-server      0.09%   374.8MiB / 28.4GiB   1.29%   36.6kB / 28.7kB   0B / 0B     49
af84793aa480   airbyte-worker      0.10%   228.2MiB / 28.4GiB   0.78%   31.7kB / 25.2kB   0B / 0B     49
d514e4fc4d6b   init                0.00%   0B / 0B              0.00%   0B / 0B           0B / 0B     0
9dc329a30392   airbyte-db          0.04%   86.82MiB / 28.4GiB   0.30%   844kB / 830kB     0B / 0B     46
a4ce7ed65bf3   airbyte-scheduler   0.71%   148.8MiB / 28.4GiB   0.51%   2.52kB / 1.86kB   0B / 0B     52
b4d1e96e038e   airbyte-webapp      0.00%   11.59MiB / 28.4GiB   0.04%   1.03kB / 0B       0B / 0B     17
9a5ad81f9970   airbyte-temporal    1.84%   47.53MiB / 28.4GiB   0.16%   231kB / 262kB     0B / 0B     23
keshav-multi and Ulan Yisaev, what are the resources of the environments running your Airbyte instances?
Ulan (June 2, 2022, 11:57am):
I have standard .env files (attached). I also tried to increase the memory limits via the *_MEMORY_LIMIT=8G parameters, without success.
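The parameters in question are presumably the job-container limits in Airbyte's .env, something like the sketch below; note that these cap the sync-job containers, not airbyte-server or airbyte-temporal, where the health check above is failing:
# In .env (caps sync-job containers only):
JOB_MAIN_CONTAINER_MEMORY_REQUEST=
JOB_MAIN_CONTAINER_MEMORY_LIMIT=8G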
Sorry, what is the instance size you're using?
Ulan (June 2, 2022, 6:21pm):
I'm trying to launch Airbyte on my local Ubuntu WSL2. My PC has the following configuration:
Processor: AMD Ryzen 7 PRO 5850U with Radeon Graphics, 1.90 GHz
Installed RAM: 32.0 GB (30.8 GB usable)
SSD: 500 GB
Ulan (June 5, 2022, 6:30pm):
Hi team,
I was able to solve the problem by installing Docker Desktop. Before that, I had docker-ce under Ubuntu.
Thank you!
16-core CPU, 64 GB RAM, 8 TB SSD
Sorry to keep asking individual questions, Keshav: what OS? It looks like this problem is related to your Docker setup/OS.
OS:
Distributor ID: Ubuntu
Description: Ubuntu 20.04.2 LTS
Release: 20.04
Codename: focal
Docker:
Client: Docker Engine - Community
Version: 20.10.8
API version: 1.41
Go version: go1.16.6
Git commit: 3967b7d
Built: Fri Jul 30 19:54:27 2021
OS/Arch: linux/amd64
Context: default
Experimental: true
Server: Docker Engine - Community
Engine:
Version: 20.10.8
API version: 1.41 (minimum version 1.12)
Go version: go1.16.6
Git commit: 75249d8
Built: Fri Jul 30 19:52:33 2021
OS/Arch: linux/amd64
Experimental: false
containerd:
Version: 1.4.9
GitCommit: e25210fe30a0a703442421b0f60afac609f950a3
runc:
Version: 1.0.1
GitCommit: v1.0.1-0-g4144b63
docker-init:
Version: 0.19.0
GitCommit: de40ad0
Docker Compose:
docker-compose version 1.29.2, build 5becea4c
We have been running Airbyte on the same machine for close to 8 months now and never faced this problem. It's not a VPS, so we can't temporarily increase resources. We'd be happy to help track down the issue.
This issue shouldn't happen in the latest version of Airbyte, Keshav, and this is the first time I've seen it on Ubuntu. Are you still having the problem?
@marcosmarxm we have this issue too. Our EC2 (Docker) instance dies intermittently. The instance is an m5.4xlarge, i.e. 64 GB of memory. We never come close to hitting those limits.
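One thing worth checking on the host after a crash (a sketch; assumes you can still SSH in) is whether the kernel OOM killer terminated a container even though docker stats showed headroom:
$ dmesg -T | grep -iE 'oom|killed process'
$ docker inspect --format '{{.State.OOMKilled}}' airbyte-temporal
The second command prints true if Docker recorded an OOM kill for that container.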