Kafka Source JSON schema validation failed and topic_partitions is missing

  • Is this your first time deploying Airbyte?: Yes
  • OS Version / Instance: Ubuntu
  • Memory / Disk: 4GB / 30GB
  • Deployment: Docker
  • Airbyte Version: v0.35.59-alpha
  • Source name/version: kafka v0.1.4
  • Destination name/version: databricks (dev)
  • Step: The issue is happening during sync, creating the connection or a new source? During sync
  • Description: The Kafka source to Databricks destination connection succeeds but does not ingest any data; the logs show a JSON schema validation failed error reporting that topic_partitions is missing but it is required.

Verified that the Kafka topic was created with 1 partition and could be consumed by a test consumer.
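
For reference, a test consumer along these lines can confirm that messages are readable from the topic. This is a minimal sketch assuming the confluent-kafka library; the broker address, group id, and topic name are placeholders, not the actual deployment values:

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker address
    "group.id": "airbyte-debug-check",      # throwaway consumer group for the test
    "auto.offset.reset": "earliest",        # read from the beginning of the topic
})
consumer.subscribe(["my_topic"])

msg = consumer.poll(timeout=10.0)  # wait up to 10 seconds for a single record
if msg is None:
    print("No message received")
elif msg.error():
    print(f"Consumer error: {msg.error()}")
else:
    print(f"Consumed from partition {msg.partition()}: {msg.value()!r}")
consumer.close()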

The Python producer code used is shown below:
self.producer.produce(self.topic, msg, partition=0, callback=lambda err, original_msg=msg: self.delivery_report(err, original_msg),)
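
For completeness, here is a minimal, self-contained sketch of how that produce call fits together, assuming the confluent-kafka library; the broker address, topic name, payload, and delivery_report body are placeholders rather than the poster's actual code:

from confluent_kafka import Producer

# Placeholder broker address; the real deployment's bootstrap servers are not known here.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def delivery_report(err, msg):
    # confluent-kafka invokes this with (KafkaError or None, Message) once delivery is resolved.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}] at offset {msg.offset()}")

payload = b'{"example": "value"}'  # example JSON payload; real messages come from the application
producer.produce("my_topic", payload, partition=0, callback=delivery_report)
producer.flush()  # block until all queued messages have been delivered or failed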

logs-11.txt (16.8 KB)

This is only a spec validation warning, @Anew5082. Could you try another source (like PokeAPI) just to confirm whether the Databricks destination is receiving data, or whether the problem is with the Kafka source?
