Destination error during tabular normalization to BigQuery

  • Is this your first time deploying Airbyte?: Yes
  • Deployment: Docker
  • Airbyte Version: Open Source
  • Source name/version: HubSpot
  • Destination name/version: BigQuery
    I am having trouble setting up tabular normalization for my BigQuery destination. I keep getting this error; here is a sample:

    Clients have non-trivial state that is local and unpickleable.
    2022-10-14 01:57:26 normalization > 23 of 77 ERROR creating table model Marketing.ticket_pipelines… [ERROR in 0.94s]
    2022-10-14 01:57:26 normalization > Unhandled error while executing model.airbyte_utils.tickets
    Pickling client objects is explicitly not supported.

    In the past I was able to sync the raw JSON, but as soon as I enable normalization this error appears and nothing gets stored; the sync keeps failing. Has anyone encountered this issue?

Hello there! You are receiving this message because none of your fellow community members has stepped in to respond to your topic post. (If you are a community member and you are reading this response, feel free to jump in if you have the answer!) As a result, the Community Assistance Team has been made aware of this topic and will be investigating and responding as quickly as possible.
Some important considerations that will help you get your issue solved faster:

  • It is best to use our topic creation template; if you haven’t yet, we recommend posting a followup with the requested information. With that information the team will be able to more quickly search for similar issues with connectors and the platform, and troubleshoot your specific question or problem more quickly.
  • Make sure to upload the complete log file; a common investigation roadblock is that sometimes the error for the issue happens well before the problem is surfaced to the user, and so having the tail of the log is less useful than having the whole log to scan through.
  • Be as descriptive and specific as possible; when investigating it is extremely valuable to know what steps were taken to encounter the issue, what version of connector / platform / Java / Python / docker / k8s was used, etc. The more context supplied, the quicker the investigation can start on your topic and the faster we can drive towards an answer.
  • We in the Community Assistance Team are glad you’ve made yourself part of our community, and we’ll do our best to answer your questions and resolve the problems as quickly as possible. Expect to hear from a specific team member as soon as possible.

Thank you for your time and attention.
Best,
The Community Assistance Team

Hi @novice_analyst, could you please provide the full logs you’re getting? That would help to look further into this issue.

@natalyjazzviolin Thank you for your response. I was able to resolve the issue today. I think the issue was with my destination connection. Every time I performed a destination test everything seemed fine, but in reality it wasn’t. This is the error log.

Thank you
huspot_bq_errorlog (2).txt (1.2 MB)

Thanks for the update!

Hi, could you please elaborate on your solution? I keep getting the same error.

Thanks!

@Albinas_Plesnys I am happy to walk you through how I solved the issue.

  1. If the error is a destination error, go back to your BigQuery account and make sure that the dataset location you picked is the same location as the one you picked when setting up your destination in Airbyte.
  2. In BigQuery I noticed that I only needed a bucket, and that I did not need to create any folders inside it, as shown in the image below.
    [image: GCS staging bucket with no folders]
  3. In Airbyte I am using the Mirror Source Structure option.
  4. Probably the most important step is to make sure that you set up everything correctly with BigQuery.
    If you are still having issues after trying all of these, I am happy to have a video call with you so we can troubleshoot it together.
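The location check in step 1 can be sketched as a small script. This is a hypothetical helper, not something Airbyte provides: the project, dataset, and bucket names below are placeholders, and the client calls assume the google-cloud-bigquery and google-cloud-storage packages are installed with working application-default credentials.

```python
def locations_match(dataset_location: str, bucket_location: str) -> bool:
    """GCP location strings compare case-insensitively, but a multi-region
    such as "US" is still a different location than a region such as
    "us-east1", so an exact (case-folded) match is required."""
    return dataset_location.strip().upper() == bucket_location.strip().upper()


def main() -> None:
    # Imports kept inside main() so the helper above works even without
    # the Google Cloud client libraries installed.
    from google.cloud import bigquery, storage

    project = "my-project"             # placeholder
    dataset_id = "Marketing"           # dataset name seen in the error log
    bucket_name = "my-staging-bucket"  # placeholder

    # Fetch the actual locations of the dataset and the staging bucket.
    dataset_loc = bigquery.Client(project=project).get_dataset(dataset_id).location
    bucket_loc = storage.Client(project=project).get_bucket(bucket_name).location

    if not locations_match(dataset_loc, bucket_loc):
        print(f"Mismatch: dataset is in {dataset_loc!r}, bucket is in "
              f"{bucket_loc!r}; use the same location in the Airbyte "
              "destination settings.")
    else:
        print(f"OK: dataset and bucket are both in {dataset_loc!r}.")


if __name__ == "__main__":
    main()
```

Note that the comparison is deliberately strict: a bucket in the `US` multi-region and a dataset in `us-east1` are treated as a mismatch, which matches the behavior reported in this thread.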

Oh wow, that was actually it! I can confirm that the problem was the dataset location setting. Curiously, the actual dataset location was us-east1, and that is what I had set in Airbyte; but the bucket location was US (multi-region). Changing the Airbyte setting to US did the trick. Thanks again!!!

You are welcome. Glad I was able to help.