Failing to ingest HubSpot data

  • Is this your first time deploying Airbyte?: No
  • OS Version / Instance: Kubernetes with airbyte image
  • Memory / Disk:
  • Deployment: Kubernetes
  • Airbyte Version: 0.39.11
  • Source name/version: Hubspot 0.1.68
  • Destination name/version: GCS 0.2.6
  • Step: During Sync
  • Description: When syncing HubSpot to GCS I get the following error:
tech.allegro.schema.json2avro.converter.AvroConversionException: Failed to convert JSON to Avro: Could not evaluate union, field timestamp is expected to be one of these: NULL, LONG, STRING. If this is a complex type, check if offending field (path: timestamp) adheres to schema: 1654779932443
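For context, here is a minimal sketch (in Python, not the converter's actual Java code) of the union check that fails here: the converter accepts a value only if it matches one of the declared union branches, so a timestamp that arrives as, say, a float or a nested object is rejected even though it looks numeric.

```python
# Hypothetical illustration of an Avro union check like the one
# json2avro performs; this is NOT the library's actual code.

def matches_union(value, branches):
    """Return True if `value` fits one of the declared union branches."""
    checks = {
        "NULL": lambda v: v is None,
        "LONG": lambda v: isinstance(v, int) and not isinstance(v, bool),
        "STRING": lambda v: isinstance(v, str),
    }
    return any(checks[b](value) for b in branches)

branches = ["NULL", "LONG", "STRING"]
print(matches_union(1654779932443, branches))    # int epoch-millis fits LONG -> True
print(matches_union(1654779932443.0, branches))  # float fits no branch -> False
```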

I don’t know how to solve this; it would be awesome to get some help :slight_smile: Thanks!

You could try using the JSON format instead? Avro/Parquet have limited support for complex schemas.
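As a sketch, switching the GCS destination's output format would look roughly like this in its connection settings (the `format_type` field name is an assumption from the destination's spec; verify in your UI):

```json
{
  "format": {
    "format_type": "JSONL"
  }
}
```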

Hi, it does work better with the JSON format indeed.
However, I still have an issue in the k8s environment: it seems that Temporal fails (is it possible to get the logs from Temporal?). With docker-compose locally it works pretty well :slight_smile:
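In case it helps others: the Temporal logs can usually be pulled straight from its pod with kubectl (the deployment and namespace names below are assumptions from a default install; adjust to yours):

```shell
# Find the Temporal pod (namespace name assumed; adjust to your install)
kubectl get pods -n airbyte | grep temporal

# Tail its logs
kubectl logs -n airbyte deploy/airbyte-temporal --tail=200 -f
```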

Update: I increased the pod resources and now I get this error:

java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null

Any ideas?

The 143 exit code means OOM; you need to increase the resources for the connector. For k8s, are you using the JSON format in the destination too, and the same connector versions as with docker-compose?
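For reference, in a Kubernetes deployment the connector pods' resources are controlled by environment variables on the worker; a sketch (the values are examples only, the variable names are from Airbyte's configuration reference):

```yaml
# Set on the airbyte-worker deployment (values are examples)
env:
  - name: JOB_MAIN_CONTAINER_MEMORY_REQUEST
    value: "1Gi"
  - name: JOB_MAIN_CONTAINER_MEMORY_LIMIT
    value: "2Gi"
  - name: JOB_MAIN_CONTAINER_CPU_REQUEST
    value: "0.5"
  - name: JOB_MAIN_CONTAINER_CPU_LIMIT
    value: "1"
```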

It works fine with the latest version of the schema; there may have been a drift between versions. I'll close the thread.