Summary
POC encountering SSL error when adding Databricks as destination connector
Question
Hi there. Has anyone used the Databricks destination connector? We are doing a POC and getting an SSL error while adding Databricks as a destination.
This topic has been created from a Slack thread to give it more visibility.
It will be in read-only mode here.
["ssl-error", "databricks-destination-connector", "poc"]
I’m sorry to hear that you’re experiencing issues with the Databricks destination connector. Based on the information provided in the extracts, other users have encountered similar issues. One user reported a timeout error when trying to create a Databricks connector: the connector could not reach the Databricks cluster before it was fully spun up. The connector worked successfully when the cluster was already in a running state before the connector started using it; if the cluster was not ready, the connector would spin up the cluster and run the job, but it would fail while waiting for the cluster to become ready. You can find more details about this issue on this forum post.

In terms of SSL errors, note that Airbyte currently only supports TLS 1.2 and TLS 1.3. If you’re using a lower version of TLS, you might encounter SSL errors. To resolve this, enable a higher version of TLS on your data source; once the appropriate TLS version is enabled, the connection should pass successfully. This information was provided by a support agent in response to a user who was encountering a TLS error when creating a Microsoft SQL Server source in Airbyte. You can find more details about that issue on this forum post.

I hope this information helps! If you’re still encountering issues, I would recommend checking the log files for error messages that provide more detail. You might also want to check your network configuration and firewall settings to ensure that connections to the Databricks cluster are not being blocked.
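Since the connector fails when the cluster is not yet running, one workaround is to wait until the cluster reports a `RUNNING` state before kicking off the sync. This is only a sketch of that polling pattern: `get_state` is a placeholder you would back with a call to the Databricks Clusters API (e.g. `GET /api/2.0/clusters/get`), not something from the thread itself.

```python
import time
from typing import Callable


def wait_for_cluster_running(get_state: Callable[[], str],
                             timeout_s: float = 600.0,
                             poll_s: float = 15.0) -> bool:
    """Poll the cluster state until it is RUNNING, or give up after timeout_s.

    get_state is any zero-argument callable returning the current cluster
    state string (e.g. "PENDING", "RUNNING", "TERMINATED"); in practice it
    would wrap a Databricks REST API call with your workspace credentials.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if get_state() == "RUNNING":
            return True
        time.sleep(poll_s)  # avoid hammering the API between checks
    return False
```

Running this guard before triggering the Airbyte sync mirrors what the user in the thread observed: the connector only succeeded when the cluster was already up.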
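To confirm whether the SSL error is really a TLS-version mismatch, you can probe what your Databricks endpoint negotiates when the client refuses anything below TLS 1.2 (the minimum Airbyte accepts). A minimal sketch using Python's standard `ssl` module; the hostname below is a placeholder, not one from the thread:

```python
import socket
import ssl


def make_min_tls12_context() -> ssl.SSLContext:
    """Build a client context that refuses anything below TLS 1.2."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx


def negotiated_tls_version(host: str, port: int = 443,
                           timeout: float = 10.0) -> str:
    """Open a TLS connection and report the version the server agreed to."""
    ctx = make_min_tls12_context()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g. "TLSv1.2" or "TLSv1.3"


# Usage (replace with your actual workspace host):
# negotiated_tls_version("your-workspace.cloud.databricks.com")
```

If this handshake fails while a default context succeeds, the endpoint (or something in between, such as a proxy) is negotiating a TLS version below 1.2, which matches the failure mode described above.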