Unable to connect to Kafka with SASL_SSL + SCRAM

• Is this your first time deploying Airbyte?: No
• OS Version: Linux Mint 20.3 host and Lubuntu 22.04 on VM
• Memory / Disk / CPU: VM with 2 cores, 4GB RAM and 30GB disk space
• Deployment: docker-compose + Kubernetes
• Airbyte Version: 0.39.23-alpha
• Source name/version: BigQuery v2
• Destination name/version: Kafka 3.2.0 
• Step: create Kafka destination 

Hello everyone.
I am trying to connect to Kafka in a secure way and am having a problem with SSL. The Kafka servers in my organization use SASL + SCRAM authentication, but I am unable to connect to them with Airbyte, since neither SASL option in the UI has fields for a keystore.


I tested the connection with the Kafka CLI from the VM and successfully connected with SASL_SSL + SCRAM-SHA-256 using a keystore, and with SASL_PLAINTEXT + SCRAM-SHA-256 without any keystore. But in Airbyte I get errors like the ones in the attached log files. Of course, I've checked the URLs, ports, and jaas.config, and I'm sure they're correct.
sasl-plaintext.log (4.9 KB)
sasl-ssl-scram.log (20.9 KB)
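
For reference, a CLI check along these lines looks roughly like the following (the broker address, topic, and file name are placeholders, not our real values):

kafka-console-producer.sh --bootstrap-server kafka.example.com:9093 \
  --topic test-topic \
  --producer.config client-sasl-ssl.properties

where client-sasl-ssl.properties holds the security.protocol, sasl.mechanism, sasl.jaas.config, and ssl.truststore.* settings.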

For now I'm using a VM, since docker-compose conflicts with the VPN on my host machine; but we've deployed Airbyte on Kubernetes and are getting similar errors there, so I want to test everything locally first.
Right now I don't see any solution for an SSL connection to Kafka in Airbyte that is suitable for my case. So I want to ask: is there any workaround, like passing the keystore with a volume or overriding the sasl.mechanism or security.protocol parameters? Of course, I could just use plaintext without encryption, but that would require adding some kind of Kafka proxy because of our security policies, so I want to avoid it.

Hey @SAmser93,
Thank you for raising this problem and for your investigation.
Our Kafka destination connector is still in alpha and indeed lacks a number of features needed to cover the variety of authentication mechanisms Kafka offers.
I'm not a Kafka user myself, so I'd love a bit of detail about your ideal auth flow for your use case, so that I can file an issue on our repo to make sure the current limitations are tracked.
From what I read, you want to use the SASL_SSL protocol with the SCRAM-SHA-256 mechanism and a custom keystore (which is not supported yet). You can also open the issue yourself and link it here, and I'll take care of triaging it and assigning it to the right technical team.

I would suggest you take a look at how the connector builds the Kafka producer and identify what is missing for your case there.

I want to ask: is there any workaround, like passing the keystore with a volume

The /tmp/airbyte path of your Docker host is mounted to /local on your job containers. If you are able to set the keystore location in your JAAS config, you can indeed try to have the keystore mounted into your container.
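
For example, a rough sketch (the file name is a placeholder, and this assumes the connector lets you point at a keystore path at all):

# on the Docker host: /tmp/airbyte is mounted as /local inside job containers
cp my.keystore.jks /tmp/airbyte/
# the connector side would then reference it as /local/my.keystore.jks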

I hope this helps!

Hello, @alafanechere, thanks for the help, and sorry for the late response: I didn't have a chance to get to my workspace for the last three working days.

I'd love a bit of detail about your ideal auth flow

In my tests I use these two .properties files to connect to Kafka in our company, passing them to the CLI (as described here: Encrypt and authenticate Confluent Platform and resources with TLS | Confluent Documentation):

  • the one without SSL:
sasl.mechanism=SCRAM-SHA-256
security.protocol=SASL_PLAINTEXT
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="john_smith" password="not_a_password";

This is unachievable in Airbyte, since in the SASL_PLAINTEXT submenu I can't choose the SASL mechanism and it's always «PLAIN».

  • the one with SSL:
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="john_smith" password="not_a_password";
ssl.truststore.location=/home/kafka/keys/my.truststore.jks
ssl.truststore.password=changeit

This is also unachievable because of the lack of truststore support.

So I see two ways to improve this:

  1. Make the «Protocol» and «SASL Mechanism» drop-downs independent (or at least don't hide the second one in «SASL_PLAINTEXT» mode).
  2. Add support for the «ssl.truststore.*» fields in KafkaDestinationConfig.java (their names can be taken from the org.apache.kafka.common.config.SslConfigs class, and it looks like it can be done the same way as in the «propertiesByProtocol» function) and add a TextField in the UI. This is more complicated than the first option, but truststore support is necessary in some organizations.

If you are able to set the keystore location in your JAAS config, you can indeed try to have the keystore mounted into your container.

I thought about this, but couldn't find a solution. Judging by Configure SASL/SCRAM for Confluent Platform | Confluent Documentation, ScramLoginModule only takes a username and password, and I'm not sure that other modules which support keystores, like Krb5LoginModule, can be used with Kafka.
My colleagues suggested adding my certificate to the Java ca-certificates used inside the containers, so that the Java inside Airbyte could use it by default. I guess this is my last hope right now, now that you've told me which directory I can work with.
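
A minimal sketch of that approach, assuming the image's default cacerts password (the alias is arbitrary):

keytool -importcert -cacerts -alias my-org-ca \
  -file certnew.cer -storepass changeit -noprompt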

I guess I'll experiment with that some more and then create a GitHub issue, since this looks like an important feature to have.

My colleagues suggested adding my certificate to the Java ca-certificates used inside the containers, so that the Java inside Airbyte could use it by default.

Yes, it is definitely worth a try. Let me know if it works.

then create a GitHub issue, since this looks like an important feature to have

Yes, please feel free to open an issue with the details you shared above. Even if the workaround you're trying works, it's not a convenient one, and we'd want to improve this connector in the direction you suggested. Please share the link to the issue here. Thank you!

So, I found a workaround for this:

  1. I put my certificate into this image - airbyte/airbyte-integrations/connectors/destination-kafka at master · airbytehq/airbyte · GitHub - by adding these lines to the Dockerfile:
COPY ./certnew.cer $JAVA_HOME/lib/security/
RUN cd $JAVA_HOME/lib/security/ && keytool -import -trustcacerts -alias <your_alias> -file certnew.cer -cacerts -storepass <your_password> -noprompt
  2. Then it's necessary to run a Gradle build for this module, because the image uses compiled binaries in the line ADD bin/${APPLICATION}-${VERSION}.tar /app (the command is sketched after this list). I found that this step requires up-to-date versions of Gradle and Java, so keep that in mind.
  3. Finally, I built the image locally and tagged it to force Docker to use it instead of pulling one from the repo:
docker build -t airbyte/destination-kafka:<latest_image_version> .
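
For reference, the Gradle step from point 2 looks roughly like this (the task path follows Airbyte's connector module layout, so treat it as a sketch):

# from the root of the airbyte repo checkout
./gradlew :airbyte-integrations:connectors:destination-kafka:build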

And now, when creating the Kafka destination, I'm able to get through the SSL handshake and successfully pass the topic-creation test. The only thing left to worry about is that the user mentioned in sasl.jaas.config must have the rights to create topics and write to them on that Kafka server.
Hope this helps somebody with a similar problem.

Thank you for sharing this workaround. So the volume approach did not work?
The downside of this approach is that you are building a custom image and can't easily benefit from upgrades made to the connector.
Do you mind opening an issue on our repo with the details you previously shared?

The volume solution doesn't work because the connection happens in the airbyte/destination-kafka image, which is only used at the moment of the connection test and the sync. So I would have to somehow rebuild the image at that moment, and I wasn't able to find a way to do that.
Add support of keystores for Kafka destination/source · Issue #14474 · airbytehq/airbyte · GitHub - I created the issue as a “Feature request” because I'm not sure this can be called a bug.

Thank you for creating the issue! It has been triaged and added to our team's backlog.

I did not see any updates on the linked issue itself, so I wanted to check on it here. Is a custom image still the only option?