Summary
The sync between an SFTP source and BigQuery halts after 50,000 rows per CSV file, even though more records remain to transfer. Decreasing the chunk size did not resolve the issue.
Question
Hello! I’ve set up a connection between an SFTP source and BigQuery. The sync itself appears to run correctly, but the transfer halts after reaching 50,000 rows per CSV file. The logs show that it reads 50,000 rows and reports 15,697 remaining records, yet it never transfers those additional rows.

I’ve tried decreasing the chunk size from 15 to 12, but it still cuts off at 50,000 rows per file. For example, a full refresh with three files loads exactly 150,000 rows even though the three files contain more than that. How do I fix this?
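In case it helps, below is a rough sketch of how the gap could be verified by comparing row counts on both sides. It's only illustrative: it assumes paramiko for SFTP access and the google-cloud-bigquery client, and the host, credentials, file paths, and table name are placeholders rather than anything from this setup.

```python
import paramiko
from google.cloud import bigquery

# All of these values are placeholders for illustration only.
SFTP_HOST = "sftp.example.com"
SFTP_USERNAME = "sync_user"
SFTP_PASSWORD = "change-me"
REMOTE_FILES = ["/uploads/part1.csv", "/uploads/part2.csv", "/uploads/part3.csv"]
BQ_TABLE = "my-project.my_dataset.my_table"


def count_source_rows() -> int:
    """Count data lines (excluding the header) in each remote CSV.

    Rough count only: it assumes no quoted field contains an embedded newline.
    """
    transport = paramiko.Transport((SFTP_HOST, 22))
    transport.connect(username=SFTP_USERNAME, password=SFTP_PASSWORD)
    sftp = paramiko.SFTPClient.from_transport(transport)
    total = 0
    try:
        for path in REMOTE_FILES:
            with sftp.open(path, "r") as handle:
                lines = sum(1 for _ in handle)  # iterate the file line by line
            total += max(lines - 1, 0)          # drop the header line
    finally:
        sftp.close()
        transport.close()
    return total


def count_destination_rows() -> int:
    """Count rows that actually landed in the BigQuery table."""
    client = bigquery.Client()
    result = client.query(f"SELECT COUNT(*) AS n FROM `{BQ_TABLE}`").result()
    return next(iter(result)).n


if __name__ == "__main__":
    source = count_source_rows()
    loaded = count_destination_rows()
    print(f"rows in source files:    {source}")
    print(f"rows loaded to BigQuery: {loaded}")
    print(f"rows missing:            {source - loaded}")
```

Running this against the three-file full refresh above would be expected to show exactly 150,000 loaded rows and a non-zero "missing" count, confirming the per-file 50,000-row cut-off.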