Airbyte source read batch size for uploading records to S3 (Redshift target)

Summary

A question about the Airbyte source read batch size when uploading records to S3 with Redshift as the target. The user is unable to insert records into the Redshift airbyte_internal table because of an error reporting that _airbyte_data exceeds the maximum value size.


Question

What is the Airbyte source read batch size when uploading records to S3 (for Redshift as the target)?
In my case Airbyte is reading too many records and cannot insert them into the Redshift airbyte_internal table; I get the error: _airbyte_data exceeds max value size
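A note on the error itself: Redshift stores the raw record JSON in the _airbyte_data column, and Redshift caps the size of a single SUPER value (16 MB on recent clusters, 1 MB on older ones), so this error usually points at one oversized record rather than at the batch size. Assuming you can load a sample of your source records as Python dicts, a minimal sketch to locate records whose serialized JSON exceeds a given limit (the 16 MB constant here is an assumption; check your cluster's documented SUPER limit):

```python
import json

# Assumed per-value limit for Redshift's SUPER type (16 MB on recent
# clusters, 1 MB on older ones) -- verify against your cluster.
MAX_SUPER_BYTES = 16 * 1024 * 1024

def oversized_records(records, limit=MAX_SUPER_BYTES):
    """Yield (index, size_in_bytes) for records whose serialized
    JSON exceeds the given limit."""
    for i, rec in enumerate(records):
        size = len(json.dumps(rec).encode("utf-8"))
        if size > limit:
            yield i, size

# Example with an artificially small limit so the first record trips it
sample = [{"id": 1, "blob": "x" * 2000}, {"id": 2, "blob": "y"}]
print(list(oversized_records(sample, limit=1024)))
```

Records flagged this way would fail to load regardless of how Airbyte batches the S3 upload, so trimming or excluding the offending fields at the source is the more direct fix.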



This topic has been created from a Slack thread to give it more visibility.
It is in read-only mode here; the original thread is on Slack.
