Handling large data streams in an Airbyte connector to avoid the temporary file size limit error

Summary

The user is hitting PostgreSQL's temp_file_limit error when syncing large data streams. Increasing database memory several times has not helped, so they are looking for a way to configure the connector to regulate the flow and send data in chunks.


Question

java.lang.RuntimeException: org.postgresql.util.PSQLException: ERROR: temporary file size exceeds temp_file_limit (30904768kB)
Basically, my issue is that the database cannot handle the amount of data being streamed into it. Can I update the connector in any way to regulate the flow and send the data in chunks? I have increased my database memory several times now, and this issue keeps recurring.
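
For illustration, a common way to keep a JDBC pipeline from overwhelming a Postgres database is to read with a bounded fetch size (which makes the Postgres driver use a server-side cursor instead of loading the whole result set) and write in fixed-size batches, committing per batch so the destination's per-transaction work stays bounded. The sketch below shows that chunking pattern in plain JDBC; it is not Airbyte's actual connector code, and the connection URLs, table names (source_events, dest_events), and column names are hypothetical placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class ChunkedCopy {
    private static final int CHUNK_SIZE = 10_000; // rows per batch; tune as needed

    public static void main(String[] args) throws SQLException {
        // Hypothetical JDBC URLs and credentials -- replace with real ones.
        try (Connection src = DriverManager.getConnection("jdbc:postgresql://source/db", "user", "pass");
             Connection dst = DriverManager.getConnection("jdbc:postgresql://dest/db", "user", "pass")) {

            // With autocommit off and a positive fetch size, the Postgres JDBC
            // driver streams rows through a server-side cursor in chunks
            // instead of materializing the entire result set in memory.
            src.setAutoCommit(false);
            dst.setAutoCommit(false);

            try (Statement read = src.createStatement()) {
                read.setFetchSize(CHUNK_SIZE);
                try (ResultSet rs = read.executeQuery("SELECT id, payload FROM source_events");
                     PreparedStatement write = dst.prepareStatement(
                             "INSERT INTO dest_events (id, payload) VALUES (?, ?)")) {
                    int pending = 0;
                    while (rs.next()) {
                        write.setLong(1, rs.getLong("id"));
                        write.setString(2, rs.getString("payload"));
                        write.addBatch();
                        if (++pending == CHUNK_SIZE) {
                            write.executeBatch();
                            dst.commit(); // keep each destination transaction small
                            pending = 0;
                        }
                    }
                    if (pending > 0) { // flush the final partial batch
                        write.executeBatch();
                        dst.commit();
                    }
                }
            }
        }
    }
}
```

Separately, temp_file_limit is a server-side Postgres setting, so if you have superuser access on the database you can also raise it (e.g. ALTER SYSTEM SET temp_file_limit = '100GB', then reload the configuration) or set it to -1 to remove the cap entirely, assuming disk space allows.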



This topic was created from a Slack thread to give it more visibility. It is read-only here; the original conversation is on Slack.

["large-data-streams", "temporary-file-size-limit", "database-memory", "data-chunks", "connector-update"]