MySQL to Databricks Lakehouse Data Insertion Issue

Summary

A user's MySQL-to-Databricks Lakehouse sync writes all tabular data into a single JSON `_airbyte_data` column instead of separate columns. The user references the destination documentation and notes that normalization is not supported for this connection.


Question

Hello Airbyte community,
I set up a connection from MySQL to Databricks Lakehouse with a full refresh sync in Airbyte 0.50.47.
I don't understand why all the MySQL tabular data is inserted as a single JSON value in an `_airbyte_data` column in my Databricks destination.
From what I see in the docs (https://docs.airbyte.com/integrations/destinations/databricks), I should also get the data in tabular form in the Databricks SQL warehouse. No?
I also see the message "Normalization and Transformation operations are not supported for this connection" in the connection setup.
Thanks for your time and help 🙂
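Since normalization is unsupported for this destination, each synced record lands as a JSON payload in the `_airbyte_data` column rather than as typed columns. As a minimal sketch of what recovering tabular rows from that column involves, here is a plain-Python illustration that parses hypothetical raw rows (the sample field names `id`, `name`, and `city` are invented for illustration; only the `_airbyte_data` column name comes from the question above):

```python
import json

# Hypothetical raw rows as they might appear in the destination table:
# the whole source record is serialized into the _airbyte_data column.
raw_rows = [
    {"_airbyte_data": '{"id": 1, "name": "Alice", "city": "Paris"}'},
    {"_airbyte_data": '{"id": 2, "name": "Bob", "city": "Lyon"}'},
]

def flatten(rows):
    """Parse each row's JSON payload back into a flat dict of columns."""
    return [json.loads(r["_airbyte_data"]) for r in rows]

flat = flatten(raw_rows)
print(flat[0])  # -> {'id': 1, 'name': 'Alice', 'city': 'Paris'}
```

In a Databricks SQL warehouse the equivalent flattening would typically be done in SQL (for example by extracting fields from the JSON column into a view), but the idea is the same: the tabular structure has to be reconstructed from the JSON payload when normalization is not available.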



This topic has been created from a Slack thread to give it more visibility.
It will be in read-only mode here. Click here if you want
to access the original thread.

Join the conversation on Slack

Tags: mysql, databricks-lakehouse, airbyte-0.50.47, data-insertion, normalization, transformation