Summary
User is experiencing out-of-memory errors when loading large tables from ClickHouse with the Airbyte JDBC connector. They are looking for a way to configure the connector to chunk its full and incremental loads to avoid the memory limit.
Question
Anyone have experience loading large tables using Airbyte? I’m specifically loading from ClickHouse, and the way the JDBC connector works is it runs
> SELECT col1, col2, col3 FROM table ORDER BY time ASC
but this causes an out-of-memory error: `DB::Exception: Memory limit (total) exceeded`.
Any way to tell the Airbyte connector to chunk up its full + incremental load?
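For context, one generic workaround pattern for this kind of single-scan memory blowup is keyset pagination: instead of one `SELECT … ORDER BY time` over the whole table, fetch bounded chunks and resume from the last cursor value seen. This is not a confirmed Airbyte connector feature, just a sketch of the technique, using `sqlite3` as a stand-in for ClickHouse; the `events` table, `time` column, and `chunk_size` are illustrative assumptions:

```python
import sqlite3


def read_in_chunks(conn, chunk_size=1000):
    """Keyset-paginate over `events` ordered by `time`, fetching at most
    `chunk_size` rows per query so no single result set has to be sorted
    and buffered in full. Assumes `time` values are unique and strictly
    increasing; with duplicate cursor values rows could be skipped, so a
    real implementation would page on a unique (time, id) pair."""
    last_time = None
    while True:
        if last_time is None:
            rows = conn.execute(
                "SELECT time, col1 FROM events ORDER BY time ASC LIMIT ?",
                (chunk_size,),
            ).fetchall()
        else:
            rows = conn.execute(
                "SELECT time, col1 FROM events WHERE time > ? "
                "ORDER BY time ASC LIMIT ?",
                (last_time, chunk_size),
            ).fetchall()
        if not rows:
            break
        yield from rows
        last_time = rows[-1][0]  # resume after the last row of this chunk


# Demo with an in-memory database standing in for ClickHouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (time INTEGER, col1 TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(i, f"row{i}") for i in range(10)],
)
rows = list(read_in_chunks(conn, chunk_size=3))
print(len(rows))  # 10
```

The same idea is what "chunked full + incremental load" would amount to on the SQL side; whether the connector exposes a knob for it is exactly what the question asks.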
This topic has been created from a Slack thread to give it more visibility.
It will be in read-only mode here. Click here if you want to access the original thread.