Facebook Marketing -> BigQuery Schema Errors

  • Is this your first time deploying Airbyte?: No
  • OS Version / Instance: Debian GNU/Linux 10 (buster)
  • Memory / Disk: e2-medium (5GB / 100GB)
  • Deployment: Docker
  • Airbyte Version: 0.35.65-alpha
  • Source name/version: Facebook Marketing (0.2.42)
  • Destination name/version: BigQuery (1.0.2)
  • Step: The issue is happening during sync, creating the connection, or a new source? Sync
  • Description:

The sync starts to run and all tables are created in the destination, but it fails no matter which replication mode I choose. It seems to be related to some sort of schema error. The errors from the last attempt's log are below (a minimal sketch of what appears to be failing follows the log):

2022-05-03 14:26:04 source > Encountered an exception while reading stream ad_account
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/abstract_source.py", line 119, in read
    internal_config=internal_config,
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/abstract_source.py", line 159, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/abstract_source.py", line 249, in _read_full_refresh
    yield self._as_airbyte_record(configured_stream.stream.name, record)
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/abstract_source.py", line 281, in _as_airbyte_record
    transformer.transform(data, schema)  # type: ignore
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/utils/transform.py", line 172, in transform
    for e in normalizer.iter_errors(record):
  File "/usr/local/lib/python3.7/site-packages/jsonschema/validators.py", line 328, in iter_errors
    for error in errors:
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/utils/transform.py", line 152, in normalizator
    instance[k] = self.__normalize(instance[k], subschema)
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/utils/transform.py", line 79, in __normalize
    original_item = self.default_convert(original_item, subschema)
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/utils/transform.py", line 110, in default_convert
    return float(original_item)
TypeError: float() argument must be a string or a number, not 'dict'

2022-05-03 14:26:04 source > Finished syncing ad_account
2022-05-03 14:26:04 source > SourceFacebookMarketing runtimes:
Syncing stream ad_account 0:00:01.031227
2022-05-03 14:26:04 source > float() argument must be a string or a number, not 'dict'
Traceback (most recent call last):
  File "/airbyte/integration_code/main.py", line 13, in <module>
    launch(source, sys.argv[1:])
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/entrypoint.py", line 127, in launch
    for message in source_entrypoint.run(parsed_args):
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/entrypoint.py", line 118, in run
    for message in generator:
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/abstract_source.py", line 123, in read
    raise e
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/abstract_source.py", line 119, in read
    internal_config=internal_config,
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/abstract_source.py", line 159, in _read_stream
    for record in record_iterator:
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/abstract_source.py", line 249, in _read_full_refresh
    yield self._as_airbyte_record(configured_stream.stream.name, record)
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/abstract_source.py", line 281, in _as_airbyte_record
    transformer.transform(data, schema)  # type: ignore
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/utils/transform.py", line 172, in transform
    for e in normalizer.iter_errors(record):
  File "/usr/local/lib/python3.7/site-packages/jsonschema/validators.py", line 328, in iter_errors
    for error in errors:
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/utils/transform.py", line 152, in normalizator
    instance[k] = self.__normalize(instance[k], subschema)
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/utils/transform.py", line 79, in __normalize
    original_item = self.default_convert(original_item, subschema)
  File "/usr/local/lib/python3.7/site-packages/airbyte_cdk/sources/utils/transform.py", line 110, in default_convert
    return float(original_item)
TypeError: float() argument must be a string or a number, not 'dict'
2022-05-03 14:26:05 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):305 - Total records read: 0 (0 bytes)
2022-05-03 14:26:06 destination > 2022-05-03 14:26:06 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-05-03 14:26:06 destination > 2022-05-03 14:26:06 INFO i.a.i.b.IntegrationRunner(runInternal):121 - Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2022-05-03 14:26:06 destination > 2022-05-03 14:26:06 INFO i.a.i.b.IntegrationRunner(runInternal):122 - Command: WRITE
2022-05-03 14:26:06 destination > 2022-05-03 14:26:06 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-05-03 14:26:06 destination > 2022-05-03 14:26:06 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-03 14:26:06 destination > 2022-05-03 14:26:06 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-05-03 14:26:07 destination > 2022-05-03 14:26:07 INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):284 - Selected loading method is set to: STANDARD
2022-05-03 14:26:09 destination > 2022-05-03 14:26:09 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):124 - Partitioned Table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=facebook_marketing, tableId=_airbyte_tmp_hsl_ad_account}} created successfully
2022-05-03 14:26:09 destination > 2022-05-03 14:26:09 INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):284 - Selected loading method is set to: STANDARD
2022-05-03 14:26:09 destination > 2022-05-03 14:26:09 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):124 - Partitioned Table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=facebook_marketing, tableId=_airbyte_tmp_xhh_ads_insights}} created successfully
2022-05-03 14:26:10 destination > 2022-05-03 14:26:10 INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):284 - Selected loading method is set to: STANDARD
2022-05-03 14:26:10 destination > 2022-05-03 14:26:10 INFO i.a.i.d.b.BigQueryUtils(createPartitionedTable):124 - Partitioned Table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=facebook_marketing, tableId=_airbyte_tmp_hks_campaigns}} created successfully
2022-05-03 14:26:10 destination > 2022-05-03 14:26:10 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2022-05-03 14:26:10 destination > 2022-05-03 14:26:10 INFO i.a.i.d.b.BigQueryRecordConsumer(close):58 - Started closing all connections
2022-05-03 14:26:10 destination > 2022-05-03 14:26:10 INFO i.a.i.d.b.u.AbstractBigQueryUploader(close):76 - Field fails during format : 
2022-05-03 14:26:10 destination > 2022-05-03 14:26:10 INFO i.a.i.d.b.f.BigQueryRecordFormatter(printAndCleanFieldFails):70 - No field fails during record format.
2022-05-03 14:26:10 destination > 2022-05-03 14:26:10 INFO i.a.i.d.b.u.AbstractBigQueryUploader(close):79 - Closing connector:AbstractBigQueryUploader{table=_airbyte_raw_campaigns, tmpTable=_airbyte_tmp_hks_campaigns, syncMode=WRITE_TRUNCATE, writer=class io.airbyte.integrations.destination.bigquery.writer.BigQueryTableWriter, recordFormatter=class io.airbyte.integrations.destination.bigquery.formatter.DefaultBigQueryRecordFormatter}
2022-05-03 14:26:10 destination > 2022-05-03 14:26:10 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadData):96 - Uploading data from the tmp table _airbyte_tmp_hks_campaigns to the source table _airbyte_raw_campaigns.
2022-05-03 14:26:10 destination > 2022-05-03 14:26:10 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadDataToTableFromTmpTable):121 - Replication finished with no explicit errors. Copying data from tmp tables to permanent
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):187 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=facebook_marketing, tableId=_airbyte_tmp_hks_campaigns}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=facebook_marketing, tableId=_airbyte_raw_campaigns}}
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadData):98 - Data is successfully loaded to the source table _airbyte_raw_campaigns!
2022-05-03 14:26:11 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
errors: $: null found, object expected
2022-05-03 14:26:11 ERROR i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: null
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadData):100 - Final state message is accepted.
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(dropTmpTable):112 - Removing tmp tables...
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(dropTmpTable):114 - Finishing destination process...completed
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(close):86 - Closed connector:AbstractBigQueryUploader{table=_airbyte_raw_campaigns, tmpTable=_airbyte_tmp_hks_campaigns, syncMode=WRITE_TRUNCATE, writer=class io.airbyte.integrations.destination.bigquery.writer.BigQueryTableWriter, recordFormatter=class io.airbyte.integrations.destination.bigquery.formatter.DefaultBigQueryRecordFormatter}
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(close):76 - Field fails during format : 
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.f.BigQueryRecordFormatter(printAndCleanFieldFails):70 - No field fails during record format.
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(close):79 - Closing connector:AbstractBigQueryUploader{table=_airbyte_raw_ads_insights, tmpTable=_airbyte_tmp_xhh_ads_insights, syncMode=WRITE_TRUNCATE, writer=class io.airbyte.integrations.destination.bigquery.writer.BigQueryTableWriter, recordFormatter=class io.airbyte.integrations.destination.bigquery.formatter.DefaultBigQueryRecordFormatter}
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadData):96 - Uploading data from the tmp table _airbyte_tmp_xhh_ads_insights to the source table _airbyte_raw_ads_insights.
2022-05-03 14:26:11 destination > 2022-05-03 14:26:11 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadDataToTableFromTmpTable):121 - Replication finished with no explicit errors. Copying data from tmp tables to permanent
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):187 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=facebook_marketing, tableId=_airbyte_tmp_xhh_ads_insights}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=facebook_marketing, tableId=_airbyte_raw_ads_insights}}
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadData):98 - Data is successfully loaded to the source table _airbyte_raw_ads_insights!
2022-05-03 14:26:13 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
errors: $: null found, object expected
2022-05-03 14:26:13 ERROR i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: null
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadData):100 - Final state message is accepted.
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.u.AbstractBigQueryUploader(dropTmpTable):112 - Removing tmp tables...
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.u.AbstractBigQueryUploader(dropTmpTable):114 - Finishing destination process...completed
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.u.AbstractBigQueryUploader(close):86 - Closed connector:AbstractBigQueryUploader{table=_airbyte_raw_ads_insights, tmpTable=_airbyte_tmp_xhh_ads_insights, syncMode=WRITE_TRUNCATE, writer=class io.airbyte.integrations.destination.bigquery.writer.BigQueryTableWriter, recordFormatter=class io.airbyte.integrations.destination.bigquery.formatter.DefaultBigQueryRecordFormatter}
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.u.AbstractBigQueryUploader(close):76 - Field fails during format : 
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.f.BigQueryRecordFormatter(printAndCleanFieldFails):70 - No field fails during record format.
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.u.AbstractBigQueryUploader(close):79 - Closing connector:AbstractBigQueryUploader{table=_airbyte_raw_ad_account, tmpTable=_airbyte_tmp_hsl_ad_account, syncMode=WRITE_TRUNCATE, writer=class io.airbyte.integrations.destination.bigquery.writer.BigQueryTableWriter, recordFormatter=class io.airbyte.integrations.destination.bigquery.formatter.DefaultBigQueryRecordFormatter}
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadData):96 - Uploading data from the tmp table _airbyte_tmp_hsl_ad_account to the source table _airbyte_raw_ad_account.
2022-05-03 14:26:13 destination > 2022-05-03 14:26:13 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadDataToTableFromTmpTable):121 - Replication finished with no explicit errors. Copying data from tmp tables to permanent
2022-05-03 14:26:16 destination > 2022-05-03 14:26:16 INFO i.a.i.d.b.u.AbstractBigQueryUploader(copyTable):187 - successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=facebook_marketing, tableId=_airbyte_tmp_hsl_ad_account}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=facebook_marketing, tableId=_airbyte_raw_ad_account}}
2022-05-03 14:26:16 destination > 2022-05-03 14:26:16 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadData):98 - Data is successfully loaded to the source table _airbyte_raw_ad_account!
2022-05-03 14:26:16 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
errors: $: null found, object expected
2022-05-03 14:26:16 ERROR i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: null
2022-05-03 14:26:16 destination > 2022-05-03 14:26:16 INFO i.a.i.d.b.u.AbstractBigQueryUploader(uploadData):100 - Final state message is accepted.
2022-05-03 14:26:16 destination > 2022-05-03 14:26:16 INFO i.a.i.d.b.u.AbstractBigQueryUploader(dropTmpTable):112 - Removing tmp tables...
2022-05-03 14:26:16 destination > 2022-05-03 14:26:16 INFO i.a.i.d.b.u.AbstractBigQueryUploader(dropTmpTable):114 - Finishing destination process...completed
2022-05-03 14:26:16 destination > 2022-05-03 14:26:16 INFO i.a.i.d.b.u.AbstractBigQueryUploader(close):86 - Closed connector:AbstractBigQueryUploader{table=_airbyte_raw_ad_account, tmpTable=_airbyte_tmp_hsl_ad_account, syncMode=WRITE_TRUNCATE, writer=class io.airbyte.integrations.destination.bigquery.writer.BigQueryTableWriter, recordFormatter=class io.airbyte.integrations.destination.bigquery.formatter.DefaultBigQueryRecordFormatter}
2022-05-03 14:26:16 destination > 2022-05-03 14:26:16 INFO i.a.i.b.IntegrationRunner(runInternal):169 - Completed integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2022-05-03 14:26:16 ERROR i.a.w.DefaultReplicationWorker(run):169 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
	at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:162) ~[io.airbyte-airbyte-workers-0.35.65-alpha.jar:?]
	at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.35.65-alpha.jar:?]
	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.65-alpha.jar:?]
	at java.lang.Thread.run(Thread.java:833) [?:?]
	Suppressed: io.airbyte.workers.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
		at io.airbyte.workers.protocols.airbyte.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.35.65-alpha.jar:?]
		at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:126) ~[io.airbyte-airbyte-workers-0.35.65-alpha.jar:?]
		at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.35.65-alpha.jar:?]
		at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.65-alpha.jar:?]
		at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
	at io.airbyte.workers.DefaultReplicationWorker.lambda$getReplicationRunnable$5(DefaultReplicationWorker.java:312) ~[io.airbyte-airbyte-workers-0.35.65-alpha.jar:?]
	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
	... 1 more
2022-05-03 14:26:16 INFO i.a.w.DefaultReplicationWorker(run):228 - sync summary: io.airbyte.config.ReplicationAttemptSummary@11f5a092[status=failed,recordsSynced=0,bytesSynced=0,startTime=1651587962467,endTime=1651587976330,totalStats=io.airbyte.config.SyncStats@3e078cbd[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]]
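For context, the source traceback ends inside the Airbyte CDK's type transformer: the stream schema apparently declares a field as a plain number, but the API is returning a nested object for it, so the CDK ends up calling float() on a dict. A minimal sketch of that failure mode (the field name and record shape below are made up for illustration, not taken from my account):

# Simplified stand-in for the CDK's default type conversion
# (airbyte_cdk.sources.utils.transform), just to show the failure mode.
def default_convert(value, subschema):
    if subschema.get("type") == "number":
        return float(value)  # raises TypeError when the value is a nested object
    return value

# Hypothetical ad_account record: a field the schema declares as "number"
# now arrives from the API as an object.
schema = {"properties": {"amount_spent": {"type": "number"}}}
record = {"amount_spent": {"value": "123.45", "currency": "USD"}}

for field, subschema in schema["properties"].items():
    default_convert(record[field], subschema)
    # TypeError: float() argument must be a string or a number, not 'dict'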

Hi @phjohnson08,
This error looks related to a change in a field's type in the API response.
Could you please try upgrading the source-facebook-marketing connector to its latest version (0.2.45) and let me know if the error still occurs?
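If the upgrade alone doesn't resolve it, you can also pinpoint which field changed by validating one failing record against the stream's JSON schema locally. A rough sketch using the jsonschema library (the two file paths are placeholders you would need to fill in yourself):

import json
from jsonschema import Draft7Validator

# Load the ad_account stream schema and one sample record pulled from the API
# (both file paths are hypothetical placeholders).
with open("ad_account_schema.json") as f:
    schema = json.load(f)
with open("sample_ad_account_record.json") as f:
    record = json.load(f)

# Print every field whose value no longer matches its declared type.
for error in Draft7Validator(schema).iter_errors(record):
    print(list(error.path), "->", error.message)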
