Data Inserted into Single JSON Column in Databricks Lakehouse

Summary

Data from MySQL is being inserted into a single JSON column in Databricks Lakehouse instead of in tabular format. The user is asking why this happens and whether tabular data should also be available in the Databricks SQL warehouse. The user also notes that normalization and transformation operations are not supported for this connection.
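
Until normalization is supported for this destination, the raw JSON payload can still be flattened with a query in the Databricks SQL warehouse. Below is a minimal sketch, assuming the raw table Airbyte created is named `_airbyte_raw_customers` under `main.default` and that the stream carries `id`, `name`, and `created_at` fields; all of these names are placeholders to adjust to your own streams.

```sql
-- Minimal sketch: flatten Airbyte's raw JSON column in Databricks SQL.
-- Table, catalog/schema, and field names below are assumptions; replace
-- them with the names of your own raw table and stream columns.
SELECT
  _airbyte_data:id::bigint            AS id,          -- extract a JSON field and cast it
  _airbyte_data:name::string          AS name,
  _airbyte_data:created_at::timestamp AS created_at,
  _airbyte_emitted_at                                 -- Airbyte sync metadata column
FROM main.default._airbyte_raw_customers;
```

Such a query can also be wrapped in a view so downstream consumers see a tabular shape without waiting for normalization support.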


Question

Hello Airbyte community,
I set up a connection from MySQL to Databricks Lakehouse with a full refresh sync in Airbyte 0.50.47.
I don't understand why all the MySQL tabular data is inserted as a single JSON object into the _airbyte_data column in my Databricks destination.
From what I see in the docs (https://docs.airbyte.com/integrations/destinations/databricks), I should also get the tabular data in the Databricks SQL warehouse, no?
I also see that "Normalization and Transformation operations are not supported for this connection" in the connection setup.
Thanks for your time and help :slightly_smiling_face:



This topic has been created from a Slack thread to give it more visibility. It will be in read-only mode here; the original thread is available on Slack.

["mysql", "databricks-lakehouse", "full-refresh-sync", "airbyte-0.50.47", "json-column", "tabular-data", "databricks-sql-warehouse", "normalization", "transformation"]