SSL error when testing S3 source with custom CA certificate in Airbyte OSS version locally using Docker

Summary

When testing the S3 source in Airbyte OSS running locally in Docker behind a proxy with a custom CA certificate, the connector fails SSL validation because it does not know the path to the CA cert. The question is whether Airbyte supports defining a custom CA certificate path and, if so, how to configure it.


Question

Hi, I’m testing the Airbyte OSS version locally using Docker. For outbound traffic, I have a proxy and a custom CA certificate. When testing the S3 source, I encounter an SSL error (botocore.exceptions.SSLError: SSL validation failed). I’m pretty sure it’s because I need to define the path to the CA cert (for example AWS_CA_BUNDLE). Does Airbyte support defining a custom CA certificate path? If so, how can I configure it?



This topic has been created from a Slack thread to give it more visibility. It is in read-only mode here.

["ssl-error", "s3-source", "custom-ca-certificate", "docker", "airbyte-oss", "proxy", "ssl-validation", "ca-cert-path"]

https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-s3/source_s3/v4/stream_reader.py#L266

I have a bad feeling that custom CA certs aren’t supported for most of the existing Airbyte connectors, correct? And I may have to create my own version of source-s3, for example (and of any other connector I need).

Since this is an outbound connection, I don’t think your local CA should matter in this context.

It’s possible that you have trust chain issues in your local environment, so I would try connecting via curl and seeing if you get the same error.
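For example, something along these lines (the bundle path is a placeholder, not something from your setup):

```
# Without the corporate bundle: should reproduce the SSL failure
# if the trust chain is the problem.
curl -v https://s3.ca-central-1.amazonaws.com

# With the corporate bundle: should succeed.
curl -v --cacert /path/to/corp-ca.pem https://s3.ca-central-1.amazonaws.com
```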

Is the proxy just inbound, or are you actually rewriting the outbound requests/responses?

Hi Justin, thanks for your response.

I am actually referring to my internet/DLP proxy; all API calls to public endpoints go through it. In this particular case with source-s3 it’s s3.ca-central-1.amazonaws.com, which is the AWS S3 service endpoint (https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-s3/source_s3/v4/stream_reader.py#L199).

Because the DLP proxy is a man-in-the-middle, its CA cert has to be defined/added. My local env is totally fine, as I have this in my bash profile:

```
export REQUESTS_CA_BUNDLE=$SSL_CERT_FILE
export CURL_CA_BUNDLE=$SSL_CERT_FILE
export AWS_CA_BUNDLE=$SSL_CERT_FILE
```
Now, when it comes to *Airbyte*: when a user creates a source-s3 connection and clicks `Set up source` in the UI,
behind the scenes Airbyte launches an `airbyte/source-s3:4.7.1` container and executes the connector code (stream_reader.py).
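
You can reproduce that step outside the UI; a minimal sketch, assuming the source config has been saved to `/tmp/s3_config.json` (that path is just for illustration):

```
# Run the connector's connection check directly; this is the same
# operation the UI triggers on "Set up source".
docker run --rm \
  -v /tmp/s3_config.json:/secrets/config.json \
  airbyte/source-s3:4.7.1 check --config /secrets/config.json
```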

I assume the solution would be to copy the certificate into the `airbyte/source-s3:4.7.1` container’s file system, or, if there is a single base image, to update just that. I was hoping to get more guidance from the Airbyte team or related docs (which I didn’t find).
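
For what it’s worth, here is a minimal sketch of that wrapper-image idea. The Debian-style trust store, the availability of `update-ca-certificates`, the certificate filename, and the `airbyte` user name are all assumptions about the base image, not anything from Airbyte docs:

```
# Wrap the stock connector image and add the corporate CA certificate.
FROM airbyte/source-s3:4.7.1

# Assumption: the base image is Debian-based and ships update-ca-certificates.
USER root
COPY corp-ca.crt /usr/local/share/ca-certificates/corp-ca.crt
RUN update-ca-certificates

# Point botocore and requests at the merged system bundle.
ENV AWS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt \
    REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt

# Assumption: the base image normally runs as a non-root "airbyte" user.
USER airbyte
```

Then build and tag it (`docker build -t source-s3-custom-ca:4.7.1 .`) and point Airbyte at the custom image; in OSS, I believe you can edit a connector’s image name and tag under Settings > Sources.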