Sync fails on Oracle Destination connector

  • Is this your first time deploying Airbyte?: No
  • OS Version / Instance: Windows
  • Memory / Disk: 4 GB / 1 TB
  • Deployment: Docker
  • Airbyte Version: 0.39.32-alpha
  • Source name/version: custom connector which produces a CSV file
  • Destination name/version: Oracle/0.1.16
  • Step: The issue is happening during sync
  • Description: Sync fails with the Oracle destination connector. It reads around 56,000 records, then it fails.
    Sync logs are attached for your reference.
    The CSV file used to read the data is attached as well.

logs-188 (2).txt (875.8 KB)

Please change the extension of the files to zip.
As zip files are not allowed, I have changed the extension to txt.

ucmfa00675437.txt (27.4 KB)

One more observation: I have tested the same source with the MySQL destination connector, and it works fine.

Attached logs for the same.
logs-192.txt (50.2 KB)

The CSV file is the same.

It seems to be an issue with the Oracle destination connector.

Can you please suggest?

Hey @rnncredit,
I spotted the following error in your logs:

2022-07-05 09:00:49 destination > 2022-07-05 09:00:49 ERROR i.a.i.b.FailureTrackingAirbyteMessageConsumer(accept):52 - Exception while accepting message
2022-07-05 09:00:49 destination > java.sql.SQLSyntaxErrorException: ORA-01745: invalid host/bind variable name

This looks like a syntax error in the generated SQL statements. Could you please share the version of your Oracle DB destination?

According to our documentation:

To use the Oracle destination, you’ll need:

  • An Oracle server version 18 or above

  • It’s possible to use Oracle 12+, but you need to configure the table name length to 120 chars.

Hey @alafanechere
We are using Oracle 19c to load the data from csv file.
It’s strange that it processed around 56,000 records and then started throwing this weird exception.

Do you still face this error if you cut your CSV and only keep the first ~50k rows? Maybe a special character is causing the error :thinking:

@alafanechere I have tested with a CSV file having 50K records. This time I used the File connector to load the CSV file rather than our custom connector. It still throws the same error.

Attached files for your reference.
50krecordsLogsandCsvFile.txt (426.4 KB)
Please change the extension to zip to go over the logs and CSV data.

To narrow down the issue, I tried with 1,000 records, but it failed with a different error.

Attached logs and csv for your reference.
logs-210.txt (54.2 KB)
ucmperiods_1K.txt (35.0 KB)

Please change the extension of the file ucmperiods_1k.txt to csv.

Exception in thread "main" java.sql.SQLRecoverableException: No more data to read from socket
It probably means the connection to your destination database was dropped because a timeout was exceeded. Could you monitor the connection on the Oracle side while the sync runs and check whether increasing a timeout configuration on Oracle changes this behavior?
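If the drop does turn out to be timeout-related, the Oracle thin JDBC driver also exposes client-side timeout properties worth experimenting with alongside any server-side settings. A minimal sketch, assuming the standard thin-driver property names; the user, password, and millisecond values are placeholders, not recommendations:

```java
import java.util.Properties;

public class OracleTimeoutProps {
    // Build connection properties with longer timeouts for the Oracle thin
    // JDBC driver. Both property names are documented driver settings; the
    // millisecond values below are placeholders to experiment with.
    public static Properties timeoutProps(String user, String password) {
        Properties props = new Properties();
        props.setProperty("user", user);
        props.setProperty("password", password);
        // Time allowed to establish the TCP connection (ms).
        props.setProperty("oracle.net.CONNECT_TIMEOUT", "60000");
        // Time allowed to wait for data on the socket (ms) before the driver
        // gives up reading from it.
        props.setProperty("oracle.jdbc.ReadTimeout", "600000");
        return props;
    }

    public static void main(String[] args) {
        Properties p = timeoutProps("airbyte", "secret");
        // These props would then be passed to
        // DriverManager.getConnection(jdbcUrl, p);
        System.out.println(p.getProperty("oracle.jdbc.ReadTimeout"));
    }
}
```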

@alafanechere I tried with another Oracle database; it still errors out with the same message.
Attached logs for your reference.

Can you check from your end once?
logs-222.txt (35.2 KB)

Could you check if you have the same error using the PokeAPI source and also with a single-line CSV file?

It looks like there is a limit on Oracle bind variables in a PreparedStatement: 65536 (2^16).

In the attached logs, it looks like that limit was crossed:

2022-06-20 | 08:53:14.661 +0000 | ERROR | pool-314-thread-1              | P: 394     | S: 50      | i.a.w.internal.DefaultAirbyteDestination | INTO AIRBYTE_USER.airbyte_tmp_DIM_LookupValuesPVO_90d7fa6125e14205abdee558c71a92c5 ("_AIRBYTE_AB_ID", "_AIRBYTE_DATA", "_AIRBYTE_EMITTED_AT") VALUES (:68386 , :68387 , :68388 )
2022-06-20 | 08:53:14.661 +0000 | ERROR | pool-314-thread-1              | P: 394     | S: 50      | i.a.w.internal.DefaultAirbyteDestination |  SELECT 1 FROM DUAL, OriginalSql = INSERT ALL INTO AIRBYTE_USER.airbyte_tmp_DIM_LookupValuesPVO_90d7fa6125e14205abdee558c71a92c5 ("_AIRBYTE_AB_ID", "_AIRBYTE_DATA", "_AIRBYTE_EMITTED_AT") VALUES (?, ?, ?)

I was able to put together a small JDBC test program and could simulate this scenario.
PreparedStatementLimitTest.txt (3.4 KB)
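The attached test program isn't reproduced in this thread, but a minimal sketch of the scenario could look like the following. It rebuilds the `INSERT ALL … SELECT 1 FROM DUAL` statement shape visible in the log excerpt above (the temp-table name is shortened here) and counts the bind markers, no live database required:

```java
public class InsertAllSketch {
    // Reconstruct the INSERT ALL statement shape seen in the logs
    // (raw-table column names taken from the log excerpt above).
    static String insertAll(String table, int rows) {
        StringBuilder sql = new StringBuilder("INSERT ALL ");
        for (int i = 0; i < rows; i++) {
            sql.append("INTO ").append(table)
               .append(" (\"_AIRBYTE_AB_ID\", \"_AIRBYTE_DATA\", \"_AIRBYTE_EMITTED_AT\")")
               .append(" VALUES (?, ?, ?) ");
        }
        sql.append("SELECT 1 FROM DUAL");
        return sql.toString();
    }

    // Each '?' becomes one bind variable when the statement is prepared.
    static long bindCount(String sql) {
        return sql.chars().filter(c -> c == '?').count();
    }

    public static void main(String[] args) {
        String sql = insertAll("AIRBYTE_USER.airbyte_tmp_DIM", 30_000);
        // 30,000 rows x 3 columns = 90,000 binds, well past the 65536 limit.
        System.out.println(bindCount(sql));
        // Preparing this statement against a real Oracle connection is what
        // raises java.sql.SQLSyntaxErrorException: ORA-01745.
    }
}
```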

Good catch @vamsikurre1!
Could you please share the type of your region_id column?
Do you also have this java.sql.SQLSyntaxErrorException: ORA-01745: invalid host/bind variable name error?

  • _AIRBYTE_AB_ID is VARCHAR(64)
  • _AIRBYTE_DATA is NCLOB
  • _AIRBYTE_EMITTED_AT is TIMESTAMP WITH TIME ZONE
    Do you think inserting a value > 65536 into one of these columns would raise this error?

@alafanechere Let's say we're trying to load 30k records through the Oracle destination connector.
The connector creates a single prepared statement with bind parameters, i.e. 30k rows × 3 columns = 90k bind parameters, so it throws the ORA-01745 exception above. Oracle does not support more than 65536 binds.
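A sketch of that arithmetic, assuming the 65536 (2^16) limit quoted earlier in the thread; `maxRowsPerBatch` is a hypothetical helper, not connector code — it shows the largest per-statement batch that would stay under the limit:

```java
public class SafeBatchSize {
    // Limit quoted in this thread: Oracle rejects statements with more
    // than 65536 (2^16) bind variables.
    static final int ORACLE_MAX_BINDS = 65536;

    // Largest number of rows per INSERT ALL statement that keeps the total
    // bind count (rows * columns) within the limit.
    static int maxRowsPerBatch(int columns) {
        return ORACLE_MAX_BINDS / columns;
    }

    public static void main(String[] args) {
        int perBatch = maxRowsPerBatch(3); // 21845 rows x 3 cols = 65535 binds
        System.out.println(perBatch);
        // 30,000 rows would therefore need at least two statements:
        int batches = (int) Math.ceil(30_000 / (double) perBatch);
        System.out.println(batches);
    }
}
```

In other words, chunking the rows so each generated statement carries at most `maxRowsPerBatch(3)` rows would keep every statement under the bind limit.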

@alafanechere in my test program, region_id is just a NUMBER column.

In OracleOperations.java we try to insert the raw data into the database with a single query carrying a lot of bind variables. When the bind parameters exceed 65536, we immediately get this error: java.sql.SQLSyntaxErrorException: ORA-01745: invalid host/bind variable name

Also, I observed that the Oracle XE (Express Edition) database has an even lower effective limit for bind variables: 5k bind variables work fine, but 50k bind variables fail with a java.sql.SQLRecoverableException: No more data to read from socket exception. Once the count exceeds 65536, the exception changes to java.sql.SQLSyntaxErrorException: ORA-01745: invalid host/bind variable name.

Thank you for re-explaining :slight_smile: I got it now. I opened an issue on our repo. Please subscribe to it to receive updates!