MSSQL CDC connector hangs for a long time

Hi All,
I am new to Airbyte. I have configured the Airbyte MSSQL connector for CDC and scheduled it to run every hour; however, it just hangs for more than an hour. Is there any timeout if there are no new records? Could you share best practices for scheduling CDC syncs with MSSQL?

Thanks
Siva
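
For reference, the MSSQL source can only stream changes when CDC is enabled on the database and on every table included in the connection, and the SQL Server Agent capture job is running. A minimal sketch for checking this, assuming pyodbc and placeholder connection details:

```python
import pyodbc  # pip install pyodbc

# Placeholder connection details; substitute your own server, database, and credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mssql.example.com,1433;"
    "DATABASE=MyDatabase;"
    "UID=airbyte_user;PWD=changeme"
)
cur = conn.cursor()

# CDC must be enabled at the database level...
cur.execute("SELECT name, is_cdc_enabled FROM sys.databases WHERE name = DB_NAME()")
print(cur.fetchone())

# ...and each table in the connection must be tracked by CDC.
cur.execute(
    "SELECT s.name AS schema_name, t.name AS table_name "
    "FROM sys.tables t JOIN sys.schemas s ON t.schema_id = s.schema_id "
    "WHERE t.is_tracked_by_cdc = 1"
)
for row in cur.fetchall():
    print(row.schema_name, row.table_name)
```

On scheduling, a common pattern is to keep the sync interval comfortably longer than a typical sync run, and to keep the CDC cleanup/retention window longer than the sync interval so change records are not purged before Airbyte reads them.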

Hey, what do you mean by hanging for more than an hour? If the sync is hung, could you share the logs for that sync?

logs-5010.txt (1.1 MB)

Attached the logs from Airbyte.

The job has been stuck in the same position for ~1 hour, when it is supposed to finish in a couple of minutes:

2022-06-10 01:15:55 destination > 2022-06-10 01:15:55 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):55 - Starting a new buffer for stream ConsumerAttribute (current state: 0 bytes in 7 buffers)
2022-06-10 02:48:18 INFO i.a.w.t.TemporalAttemptExecution(lambda$getCancellationChecker$3):194 - Running sync worker cancellation…

Hey, it looks like there is some error with the connection. Can you retry and check?

It got stuck in the same place. I have tried cancelling 3 times. Current run status:


2022-06-10 03:56:58 destination > 2022-06-10 03:56:58 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):55 - Starting a new buffer for stream CreditProfileState (current state: 0 bytes in 0 buffers)

2022-06-10 03:56:58 destination > 2022-06-10 03:56:58 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):55 - Starting a new buffer for stream Consumer (current state: 0 bytes in 1 buffers)

2022-06-10 03:56:58 destination > 2022-06-10 03:56:58 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):55 - Starting a new buffer for stream Document (current state: 0 bytes in 2 buffers)

2022-06-10 03:56:58 destination > 2022-06-10 03:56:58 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):55 - Starting a new buffer for stream ConsumerStats (current state: 0 bytes in 3 buffers)

2022-06-10 03:56:58 destination > 2022-06-10 03:56:58 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):55 - Starting a new buffer for stream ConsumerIdentification (current state: 0 bytes in 4 buffers)

2022-06-10 03:56:58 destination > 2022-06-10 03:56:58 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):55 - Starting a new buffer for stream ConsumerAddress (current state: 0 bytes in 5 buffers)

2022-06-10 03:56:58 destination > 2022-06-10 03:56:58 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):55 - Starting a new buffer for stream VerifiedAddress (current state: 0 bytes in 6 buffers)

2022-06-10 03:56:58 destination > 2022-06-10 03:56:58 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):55 - Starting a new buffer for stream ConsumerAttribute (current state: 0 bytes in 7 buffers)


Hey, could you check resources like CPU/RAM usage and see if there is anything there that could help?
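
A quick way to spot-check CPU/RAM on the host running the Airbyte containers is a short psutil sketch (psutil is a third-party package; `docker stats` works too if you prefer the CLI):

```python
import psutil  # pip install psutil

# Snapshot of overall CPU and memory usage on the Airbyte host.
print(f"CPU: {psutil.cpu_percent(interval=1)}%")

mem = psutil.virtual_memory()
print(f"RAM: {mem.percent}% used "
      f"({mem.used / 1024**3:.1f} GiB of {mem.total / 1024**3:.1f} GiB)")
```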

@harshith Thank you for your reply. CPU/memory usage is ~20% of total. I am expecting only around ~100 MB of data on this run.

The attempt failed and shows “Caused by: java.net.SocketException: No route to host”. Could you shed some light on this? Attaching the full logs.
logs-5014.txt (4.8 MB)
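
For reference, “No route to host” is a network-level failure between the Airbyte worker and the database (or destination) host rather than a connector bug. A minimal reachability check from the worker host, with placeholder host and port:

```python
import socket

# Placeholder endpoint; substitute the SQL Server host and port configured in the source.
HOST, PORT = "mssql.example.com", 1433

try:
    with socket.create_connection((HOST, PORT), timeout=10):
        print(f"TCP connection to {HOST}:{PORT} succeeded")
except OSError as exc:
    # "No route to host" surfaces here as an OSError (EHOSTUNREACH), which usually
    # points at firewall rules, routing, or DNS rather than Airbyte itself.
    print(f"TCP connection to {HOST}:{PORT} failed: {exc}")
```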

Hey, I have created an issue for this: https://github.com/airbytehq/airbyte/issues/13678. Could you add the Airbyte version, source version, and destination version there?

@harshith Thank you so much for your help. Just posting the solution here: upgrading the Airbyte version to 0.39.x solved the hanging issue.
