Optimizing MongoDB to Postgres Data Transfer

Summary

User inquires about ways to reduce the transfer time of 20M records from a MongoDB connector to a Postgres destination while maintaining data integrity. They also seek guidance on configuring the batch size for data transfer from a temporary table to the main table.


Question

hi community,
I am trying to start a connector from MongoDB to Postgres.
I have around 20M records that will be transferred to Postgres.
Loading from Mongo into the tmp table takes around 30 minutes, but moving from the tmp table to the main table takes around 9 hours, due to JSON parsing and the large number of records.
Questions:

  1. How can I reduce the time without compromising on data or columns?
  2. How do I configure the batch size for the data moved from the tmp table to the main table?
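One common way to tame a long tmp-to-main copy is to split the single large `INSERT ... SELECT` into fixed-size batches keyed on an indexed column, so each statement commits quickly and can be retried on its own. The sketch below is only illustrative, not the connector's actual mechanism: the table names (`tmp_table`, `main_table`), the integer `id` key, and the batch size are all assumptions.

```python
# Hypothetical sketch: batch the tmp -> main copy by primary-key range
# instead of one giant INSERT ... SELECT. Table and column names are
# illustrative and must match your own schema.

def key_ranges(min_id: int, max_id: int, batch_size: int):
    """Yield inclusive (lo, hi) id ranges covering [min_id, max_id]."""
    lo = min_id
    while lo <= max_id:
        hi = min(lo + batch_size - 1, max_id)
        yield lo, hi
        lo = hi + 1

def batch_statements(min_id: int, max_id: int, batch_size: int):
    """Render one INSERT ... SELECT per batch (illustrative SQL text)."""
    return [
        "INSERT INTO main_table SELECT * FROM tmp_table "
        f"WHERE id BETWEEN {lo} AND {hi};"
        for lo, hi in key_ranges(min_id, max_id, batch_size)
    ]

if __name__ == "__main__":
    # 20M rows in 500k-row batches -> 40 statements to run and commit
    # one at a time (e.g. via psycopg2), rather than one 9-hour insert.
    for stmt in batch_statements(1, 20_000_000, 500_000)[:3]:
        print(stmt)
```

Each statement can then be executed and committed in its own transaction; smaller batches trade a little overhead for shorter locks and easier restarts if one batch fails.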


This topic has been created from a Slack thread to give it more visibility.
It will be on Read-Only mode here. Click here if you want
to access the original thread.


Tags: mongodb, postgres, data-transfer, batch-size, json-parsing