Kafka connection fails

I am trying to send data to a Kafka cluster running on a Kubernetes cluster on GCP. The destination connection is established successfully, but when I send data from Snowflake to Kafka it fails with errors. Below I have copied the lines from where the error messages start:

WARN o.a.k.c.NetworkClient$DefaultMetadataUpdater(handleServerDisconnect):1060 - [Producer clientId=producer-1] Bootstrap broker 10.99.246.222:9093 (id: -3 rack: null) disconnected

2022-04-20 16:17:33 destination > 2022-04-20 16:17:33 WARN o.a.k.c.NetworkClient$DefaultMetadataUpdater(handleServerDisconnect):1060 - [Producer clientId=producer-1] Bootstrap broker 10.99.246.222:9091 (id: -1 rack: null) disconnected

2022-04-20 16:18:23 destination > 2022-04-20 16:18:23 ERROR i.a.i.d.k.KafkaRecordConsumer(lambda$sendRecord$2):95 - Error sending message to topic.

2022-04-20 16:18:23 destination > org.apache.kafka.common.errors.TimeoutException: Topic allergies not present in metadata after 60000 ms.

2022-04-20 16:18:23 destination > 2022-04-20 16:18:23 WARN i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):61 - Airbyte message consumer: failed.

2022-04-20 16:18:23 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed.

errors: $: null found, object expected

Please upload the complete log file and use the template suggested when creating the topic.

logs-40.txt (60.3 KB)

  • Is this your first time deploying Airbyte?: No
  • OS Version / Instance: Ubuntu
  • Deployment: Kafka on Kubernetes
  • Airbyte Version: 0.35.65-alpha
  • Source name/version: Snowflake
  • Destination name/version: Kafka
  • Step: The issue is happening during sync
2022-04-20 21:26:16 destination > 2022-04-20 21:26:16 ERROR i.a.i.d.k.KafkaRecordConsumer(lambda$sendRecord$2):95 - Error sending message to topic.
2022-04-20 21:26:16 destination > org.apache.kafka.common.errors.TimeoutException: Topic test_topic not present in metadata after 60000 ms.

Can you confirm you have the topic created in Kafka?
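For reference, a quick way to verify is with the kafka-topics.sh tool from inside a broker pod. This is just a sketch; the pod name, namespace, and tool path are assumptions for a typical Kafka-on-Kubernetes install and will differ in yours:

# List the topics the broker knows about (pod/namespace names are assumptions)
kubectl exec -n kafka my-kafka-0 -- bin/kafka-topics.sh --bootstrap-server localhost:9092 --list

# Describe the specific topic the sync writes to
kubectl exec -n kafka my-kafka-0 -- bin/kafka-topics.sh --bootstrap-server localhost:9092 --describe --topic allergies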

The topic is created on Kafka. I have tried dynamic topic creation as well, but I am still getting this error. Does it have to do with the warning messages that say Bootstrap broker 10.99.246.222:9091 (id: -1 rack: null) disconnected? For the bootstrap servers I am using the cluster IP and ports.

I have looked into it; the Kafka topic is created and can be accessed. I changed the bootstrap server address and tried again, but now the sync fails and closes the stream. I have attached the logs here as well.
logs-53.txt (56.6 KB)

I think the Airbyte server is not resolving the DNS name of your Kafka cluster.

2022-04-22 15:38:23 destination > 2022-04-22 15:38:23 WARN o.a.k.c.ClientUtils(parseAndValidateAddresses):75 - Couldn't resolve server testany7-kafka-strimzi-kafka-bootstrap:9092 from bootstrap.servers as DNS resolution failed for testany7-kafka-strimzi-kafka-bootstrap
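If Airbyte runs inside the same Kubernetes cluster, the fully qualified service name often resolves where the short name does not; if Airbyte runs outside the cluster, the in-cluster service DNS will never resolve and you would need an external listener or NodePort instead. A sketch of the checks (the namespace is an assumption):

# Confirm the bootstrap service exists and note its namespace
kubectl get svc --all-namespaces | grep kafka-bootstrap

# From a pod in the cluster, try the fully qualified service name
# (replace <namespace> with the namespace found above)
nslookup testany7-kafka-strimzi-kafka-bootstrap.<namespace>.svc.cluster.local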

Any suggestions on how we can resolve this issue?

Can you confirm you can ping the Kafka DNS name from the Airbyte instance? Note that ping takes only the hostname, not the port:

ping testany7-kafka-strimzi-kafka-bootstrap
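ping exercises DNS and ICMP but not the broker port itself, so a TCP check is worth adding; a sketch, assuming nc (netcat) is available on the Airbyte instance:

# TCP connectivity to the broker port (ping cannot test this)
nc -vz testany7-kafka-strimzi-kafka-bootstrap 9092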

I have tried transferring data to my local Kafka cluster but am still getting an error. I have attached the log file below.
logs-214.txt (230.3 KB)

Are you trying to connect using the Docker container name? The log shows Error connecting to node b2481f68dbfb:9092 (id: 1001 rack: null).
Did you try to create a Docker bridge network between Airbyte and Kafka? The problem is that Airbyte can’t connect to your Kafka cluster.
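Something along these lines, assuming default container names (yours will differ):

# Create a user-defined bridge network (the network name is an assumption)
docker network create airbyte-kafka

# Attach the Kafka container and the Airbyte worker to it
# (container names are assumptions; check with `docker ps`)
docker network connect airbyte-kafka kafka
docker network connect airbyte-kafka airbyte-worker

# Then point the destination's bootstrap server at the container name, e.g. kafka:9092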

I am using a Dockerized version of Kafka. It’s a bitnami/kafka installation.

I have provided the name of the topic I want to send data to in the topic pattern field, and the bootstrap server, which is localhost:9092. All the other settings are defaults.

Komal, did you try to change the server address? See some tips here: https://docs.airbyte.com/troubleshooting/new-connection#connection-refused-errors-when-connecting-to-a-local-db

By changing the server address, do you mean I have to add my public IP as extra_hosts in the Docker YAML file of Kafka?

You have Kafka and Airbyte deployed on the same machine using Docker, right? If so, you should try a different host/address: inside the Airbyte container, localhost refers to the container itself, not to the host where Kafka’s port is published.
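For example, with bitnami/kafka the advertised listener must point at an address the Airbyte container can actually reach, not localhost. A minimal sketch, assuming the bridge network above, a zookeeper container on the same network, and the Bitnami KAFKA_CFG_ variable mapping:

# Advertise an address reachable from other containers instead of localhost
# (container and network names are assumptions)
docker run -d --name kafka --network airbyte-kafka \
  -e ALLOW_PLAINTEXT_LISTENER=yes \
  -e KAFKA_CFG_ZOOKEEPER_CONNECT=zookeeper:2181 \
  -e KAFKA_CFG_LISTENERS=PLAINTEXT://0.0.0.0:9092 \
  -e KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092 \
  bitnami/kafka

# Then set the bootstrap server in the Airbyte destination to kafka:9092 instead of localhost:9092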

Here’s a little update. I have downgraded to older versions of ZooKeeper and Kafka, and now I am getting a success message with zero bytes of data transferred. In the log file the destination is accessible, and it shows a sync summary as well. Why is there a success message with 0 data transferred? Also, which versions of Kafka and ZooKeeper are compatible with Airbyte?
I have attached the latest log file here.
logs-355.txt (8.4 KB)

But you had executed a Reset operation, which cleans the data in your destination (Kafka). You need to trigger a Sync now to try sending data from Snowflake to Kafka.
