Normalization error and final table not created with Jira to Redshift

  • Is this your first time deploying Airbyte?: Yes
  • OS Version / Instance: Ubuntu
  • Memory / Disk:
  • Deployment: Kubernetes
  • Airbyte Version: 0.36.0-alpha
  • Source name/version: Jira (0.2.20)
  • Destination name/version: Redshift (0.3.28)
  • Step: The issue is happening during sync
  • Description:

After creating the connector, it was not possible to ingest data into Redshift with Normalized tabular data mode. Here are some sample logs:
logs-2919.txt (68.1 KB)

I also thought that, with Raw data mode selected, Airbyte would create a final destination table on Redshift using the Destination Stream Prefix, but that table is not being created. Here are the logs for this sync:
logs-2917.txt (36.0 KB).

Is there anything that can be done to use tabular data and get the desired final table?
Let me know if anything about this issue is unclear.

Hi @luccafialho, thanks for posting and all the info. I’m looking into this and should have a few ideas for you soon!

2 Likes

I am getting a similar error, except mine is:
Unsupported nested cursor field fields.updated for stream board_issues

Logs attached
logs-101.txt (1.6 MB)

  • Is this your first time deploying Airbyte?: Yes
  • OS Version / Instance: AWS Linux 2
  • Memory / Disk:
  • Deployment: Docker
  • Airbyte Version: 0.39.38-alpha
  • Source name/version: Jira (0.2.21)
  • Destination name/version: Snowflake (0.1.13)
  • Step: The issue is happening during sync

Hi @ndavies-om1 and @luccafialho, I’ve found a related issue on GitHub:
https://github.com/airbytehq/airbyte/issues/8473

Keep an eye on it, but also try the workaround mentioned and let me know if that helps!

Thanks @natalyjazzviolin. I’ve already commented on this issue. I’ll keep looking at it!

Ah, I see it now! I know the engineering team is prioritizing database connectors right now, so this might fall into Q4 or so. Would you be interested in submitting a PR yourself?

Our workaround is to export the raw JSON, then flatten it and extract what we want in Snowflake.
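To illustrate the idea behind that workaround, here is a minimal sketch of flattening a nested record such as the ones in Airbyte's raw JSON output (the Snowflake side would typically use SQL path notation or `FLATTEN` instead; the sample record and its field names below are hypothetical, loosely shaped like a Jira issue):

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested dict into a single-level dict.

    Nested keys are joined with `sep`, e.g. {"fields": {"updated": ...}}
    becomes {"fields_updated": ...}. Lists are serialized to JSON strings
    so they can still be loaded into a semi-structured column later.
    """
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        elif isinstance(value, list):
            items[new_key] = json.dumps(value)
        else:
            items[new_key] = value
    return items

# Hypothetical raw record, roughly shaped like a Jira issue:
raw = {"id": "10001", "fields": {"updated": "2022-07-01T12:00:00Z", "labels": ["bug"]}}
print(flatten(raw))
# {'id': '10001', 'fields_updated': '2022-07-01T12:00:00Z', 'fields_labels': '["bug"]'}
```

This also sidesteps the nested-cursor problem above, since a flattened `fields_updated` column is a top-level field.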

Thanks for bringing that workaround to our attention, @ndavies-om1!