Error syncing data to BigQuery after upgrading Airbyte on GKE cluster

Summary

Encountering an error when syncing data to BigQuery after upgrading Airbyte on a GKE cluster


Question

Hi team, I’ve deployed Airbyte on a GKE cluster using Helm. However, after upgrading Airbyte to the latest version, I am encountering an error when syncing data from various data sources to BigQuery.
Any help would be greatly appreciated. Thanks!



This topic has been created from a Slack thread to give it more visibility. It will be in read-only mode here.

["error", "syncing-data", "bigquery", "upgrading", "gke-cluster", "airbyte"]

can you do a kubectl get pods -n <AIRBYTE_NAMESPACE> when you start the job and watch to see what happens with the job pods?
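A minimal sketch of watching the job pods, assuming the namespace is `airbyte` (replace it with your actual Airbyte namespace):

```shell
# Watch pod lifecycle in the Airbyte namespace while the sync job starts.
# "airbyte" is a placeholder for your actual namespace.
kubectl get pods -n airbyte --watch

# Or poll with wide output to see node placement and restart counts:
kubectl get pods -n airbyte -o wide
```

Watching while the sync starts shows whether the job pods are created at all, stick in Pending/ImagePullBackOff, or crash-loop.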

Also, I tried to restart the pods; after that, when I tried to restart the sync, it gave me another error.


ah, can you do a kubectl get deployments and find the right name
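To find the actual deployment name before restarting it, something like this (the `airbyte` namespace and the `workload-launcher` filter are assumptions; an abctl install typically prefixes resources with `airbyte-abctl-`):

```shell
# List all deployments in the Airbyte namespace.
kubectl get deployments -n airbyte

# Narrow down to the workload launcher:
kubectl get deployments -n airbyte | grep workload-launcher
```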

Should I access my airbyte instance now?

getting this on my Kubernetes UI

Airbyte is working fine now <@U07C8CCC68Y>. Now I tried to create a connection for MongoDB → BigQuery, but when I tried to create the MongoDB source it gave me an error.
I have added the error log as well.

I do not; you might want to ask again in this channel as a separate issue. I do not know much about Mongo or the BQ connector.

Thanks <@U07C8CCC68Y> for all the help

What CPU and memory sizes do you have for your nodes? This could be a resource issue.
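Node capacity can be checked like this (a sketch; `kubectl top` only works if metrics-server is installed in the cluster):

```shell
# Show allocatable CPU/memory per node:
kubectl describe nodes | grep -A 5 "Allocatable"

# Current usage per node, if metrics-server is running:
kubectl top nodes
```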

Hi <@U07C8CCC68Y>, bumping this message on top of your other messages.
Did you find any workaround for the above issue that I am facing?

Thanks in advance

Hi <@U07C8CCC68Y>
Below I have added a screenshot in answer to your question, for your reference.

ok that should be plenty

can you do a kubectl rollout restart deployment/airbyte-abctl-workload-launcher again and see if that fixes the permission issue. We have a fix coming for that soon.
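A sketch of that restart, assuming the deployment is named `airbyte-abctl-workload-launcher` in the `airbyte` namespace (verify both with `kubectl get deployments` first):

```shell
# Restart the workload launcher and wait for the new pods to become ready.
kubectl rollout restart deployment/airbyte-abctl-workload-launcher -n airbyte
kubectl rollout status deployment/airbyte-abctl-workload-launcher -n airbyte --timeout=120s
```

If `rollout status` times out, describing the new pod usually shows why it is not becoming ready.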

Yeah the permission issue looks to be the problem here. If the restart is not working, it might take a day or two to get a fix out.

yeah, you can try again and see if that issue has been resolved

can you look at the logs of those pods?
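Pod logs can be pulled like this (`<POD_NAME>` and the `airbyte` namespace are placeholders):

```shell
# Logs from a job pod; --previous shows the prior container if it restarted.
kubectl logs <POD_NAME> -n airbyte
kubectl logs <POD_NAME> -n airbyte --previous

# Events often explain scheduling or permission failures:
kubectl describe pod <POD_NAME> -n airbyte
```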

Hi <@U07C8CCC68Y>, as per the screenshot added, one possible solution is to do something with the role bindings. Can you please help me with the role bindings?
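For reference, binding a role to a service account generally looks like the manifest below. This is a generic RBAC sketch, not the exact fix: the role, service account, and namespace names here are placeholders, and the permissions Airbyte actually needs depend on your install.

```shell
# Inspect the existing roles and bindings in the Airbyte namespace first:
kubectl get roles,rolebindings -n airbyte

# A generic RoleBinding manifest (all names are placeholders):
cat <<'EOF' | kubectl apply -f -
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: airbyte-admin-binding   # placeholder binding name
  namespace: airbyte            # your Airbyte namespace
subjects:
  - kind: ServiceAccount
    name: airbyte-admin         # placeholder service account
    namespace: airbyte
roleRef:
  kind: Role
  name: airbyte-admin-role      # placeholder role granting pod/job access
  apiGroup: rbac.authorization.k8s.io
EOF
```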