Source Snowflake - Arrays/objects are converted to escaped strings in Elasticsearch/local JSON destinations

  • Is this your first time deploying Airbyte?: No
  • OS Version / Instance: Ubuntu
  • Memory / Disk: 32 GB / 1 TB
  • Deployment: Docker
  • Airbyte Version: 0.40.18
  • Source name/version: Snowflake 0.1.24
  • Destination name/version: Elasticsearch 0.1.6 / Local JSON 0.2.11
  • Step: The issue is happening during sync
  • Description:

When I try to sync array/object/variant Snowflake fields to either Elasticsearch or Local JSON, the content of the fields is converted to an escaped string, whereas I expect a valid array/JSON object.
e.g. I get `"COUNTRY":"[\n  \"FR\",\n  \"DE\",\n  \"GB\",\n  \"IT\",\n  \"CN\",\n  \"HK\"\n]"` instead of a valid JSON array `"COUNTRY":["FR","DE","GB","IT","CN","HK"]`

Is this normal? Am I missing something in the setup (jdbc parameter maybe)?

After some further investigation, it may be a bug in the Snowflake source connector, as described here: https://github.com/airbytehq/airbyte/issues/19122
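Until the connector is fixed, one possible downstream workaround (a sketch, not an Airbyte feature — it assumes the escaped value is itself well-formed JSON, as in the example above) is to parse the stringified field after the sync:

```python
import json

# A record as it arrives in the destination: the COUNTRY field is an
# escaped JSON string rather than a native array (the behavior reported above).
record = {"COUNTRY": "[\n  \"FR\",\n  \"DE\",\n  \"GB\",\n  \"IT\",\n  \"CN\",\n  \"HK\"\n]"}

# Because the escaped value is still valid JSON, json.loads recovers the array.
record["COUNTRY"] = json.loads(record["COUNTRY"])

print(record["COUNTRY"])  # ['FR', 'DE', 'GB', 'IT', 'CN', 'HK']
```

This only papers over the issue in post-processing; the proper fix belongs in the source connector's type conversion, as tracked in the linked issue.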

Hey Marc,

Thanks for the question and welcome to the community. Sorry for the delay, we’ve been experiencing a large influx of posts and questions from the community and are trying our best to reply to everyone.

Thanks for creating the issue, I see that it’s already been triaged. If we can write a quick fix to improve the type conversions, we will.