Error in Mongo -> Clickhouse integration

Based on the error message you've shared, it seems the memory limit is being exceeded during the data sync. This is a common issue when dealing with large datasets or complex transformations. In one of the forum posts extracted above, a user hit a similar problem while syncing a 1GB table from MySQL to ClickHouse. The support agent suggested checking the memory configuration of the source worker, since it reads up to 10,000 records into memory at a time; this can cause problems for database sources whose tables have large row sizes.

The Java connectors in Airbyte follow Java's default behaviour with container memory and will only use up to 1/4 of the memory allocated to the host. If memory usage goes higher, it may result in Out-of-Memory exceptions. You can customize this by setting the JOB_MAIN_CONTAINER_MEMORY_REQUEST and JOB_MAIN_CONTAINER_MEMORY_LIMIT environment variables to custom values (a hedged example follows the references below).

Here are the relevant documents for your reference:

- Scaling Airbyte
- Configuring Connector Resources

Please review these documents and adjust your settings accordingly. If the issue persists, it would be helpful to know more about your instance and the connector versions you're using.

Source: Forum Post: Mysql to clickhouse fails with Memory exception for incremental dedup data for 1GB table > Answers Part 1
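For illustration, here is a minimal sketch of setting those two variables in the `.env` file of a Docker Compose deployment. The variable names come from the answer above; the `2Gi`/`4Gi` values are illustrative placeholders rather than recommendations, so size them to your own host, and check the linked docs for the equivalent settings if you deploy on Kubernetes instead.

```
# .env for a Docker Compose Airbyte deployment -- illustrative values only.
# Values use Kubernetes-style resource quantities (e.g. 2Gi); note the JVM
# inside the job container still defaults to using ~1/4 of the granted
# limit unless tuned further.
JOB_MAIN_CONTAINER_MEMORY_REQUEST=2Gi
JOB_MAIN_CONTAINER_MEMORY_LIMIT=4Gi
```

After editing `.env`, restart the Airbyte containers (e.g. `docker compose down && docker compose up -d`) so newly launched job containers pick up the new limits.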