Is this your first time deploying Airbyte?: No
OS Version / Instance: AWS ubuntu - focal - 20.04 - amd64 - server
Memory / Disk: you can use something like 8GB
Deployment: Docker
Airbyte Version: 0.35.51-alpha
Step: sync
Hi, I’ve been having some problems since updating the Redshift destination to 0.3.32, as shown by several GitHub issues such as "All Syncs with redshift destination 0.3.32 fail, rollback to version 0.3.28 resolves the issue" (airbytehq/airbyte#12265). I decided to revert to 0.3.27, which was working previously. However, the same error still shows:
function json_extract_path_text(super, "unknown", boolean) does not exist
Then
JSON schema validation failed.
Debug info:
Airbyte Version: 0.35.51-alpha
Source: Google Sheets (0.2.9)
Destination: Redshift (0.3.27)
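For context, this error usually means the generated normalization SQL is calling json_extract_path_text on a column that is now SUPER; that function only accepts a VARCHAR first argument. A hedged sketch of the mismatch (schema, table, and field names here are hypothetical):

```sql
-- Fails once _airbyte_data has been migrated to SUPER:
-- json_extract_path_text expects a VARCHAR first argument.
SELECT json_extract_path_text(_airbyte_data, 'some_field', true)
FROM my_schema._airbyte_raw_my_stream;

-- A SUPER column is queried with PartiQL navigation instead:
SELECT _airbyte_data.some_field
FROM my_schema._airbyte_raw_my_stream;
```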
I think the normalization (dbt file) is not being generated again when I downgrade or upgrade. I am not sure what to do to completely roll back the configuration.
I’ve attached one of the logs.
logs-379.txt (153.5 KB)
Did you try resetting the data after rolling back to the previous version?
This problem can happen if normalization detected your main JSON field _airbyte_data
as VARCHAR, but in reality its type is SUPER.
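One way to confirm which case you are in is to check the column type in Redshift's system views. A minimal sketch, assuming the raw tables live in a schema named my_schema (adjust the name to your setup):

```sql
-- Shows VARCHAR if the raw table was created by an older connector,
-- or SUPER if destination-redshift 0.3.31+ already migrated it.
SELECT table_name, data_type
FROM svv_columns
WHERE table_schema = 'my_schema'
  AND column_name = '_airbyte_data';
```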
It seems like something that should work. I modified the "Namespace Custom Format" and it works, but I do not want to erase the past data or create another table.
I’d like to know if there is something like a config file I could delete or modify to have the same effect. The spreadsheets I’m loading don’t matter that much, but I do not want to erase the Zendesk data, whose connector has the same problem (Redshift destination).
In reality you cannot revert back, because the new destination-redshift 0.3.32 converted your table _airbyte_raw_name
field _airbyte_data
to the Redshift SUPER type (from VARCHAR).
If you downgrade destination-redshift you will only add some mess to your system.
Can you please send not only the logs but all contents of the airbyte_workspace:/data
directory?
You can pack it this way:
docker run -ti --rm -v airbyte_workspace:/data -v `pwd`:/result ubuntu tar cfz /result/data.tar.gz /data/379
Hi,
Given this answer, I talked with my team about possible solutions. I think we are going to reset from scratch during the weekend, since it’s the simplest solution.
Regarding the airbyte_workspace:/data directory: I can provide it if you still need it for something, but I would have to delete some data first, as I saw it contains credentials and similar information inside the .tar file.
Thanks for your answers.
Andres, if you only delete the raw table from your destination, the sync should work again. But you’re going to lose the data already synced.
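If you go that route, deleting the raw table is a single statement; on the next sync the connector recreates it with the column type it expects. A sketch with hypothetical schema and stream names (as noted above, this discards the already-synced raw data):

```sql
-- Dropping the raw table forces the connector to recreate it on the next sync.
DROP TABLE IF EXISTS my_schema._airbyte_raw_my_stream;
```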
If possible, it would be cool to get /data
to try to understand the source of this problem.
/data contains a lot of logs related to sync and normalization.
Yes, you are right: /data
contains credentials in destination_config.json.
This file must be removed before compressing the archive.
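Instead of unpacking and repacking by hand, tar’s --exclude flag can drop the credentials file while the archive is being created; the same flag can be appended to the docker command shown earlier. A minimal local sketch of the flag’s behavior (all paths and file contents here are illustrative):

```shell
# Build the archive but skip any destination_config.json, which holds credentials.
mkdir -p /tmp/airbyte_pack_demo/data/379
echo '{"password":"secret"}' > /tmp/airbyte_pack_demo/data/379/destination_config.json
echo 'sync log' > /tmp/airbyte_pack_demo/data/379/logs.txt
tar cfz /tmp/airbyte_pack_demo/data.tar.gz \
    --exclude='destination_config.json' \
    -C /tmp/airbyte_pack_demo data/379
# List the archive contents: the log file is present, the credentials file is not.
tar tzf /tmp/airbyte_pack_demo/data.tar.gz
```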
One of the reasons people complain about the destination-redshift
upgrade from 0.3.29 (or less)
→ 0.3.31+
is the following:
We have two closely coupled components: destination-redshift
and base-normalization.
After we added the SUPER type, these two components both have to be upgraded:
destination-redshift 0.3.31+
base-normalization 0.1.77+
(Airbyte platform v0.36.2-alpha+
)
This means that if you upgrade destination-redshift
you must also upgrade the Airbyte platform to at least v0.36.2-alpha.
Thanks. I’ll talk with the devops team to upgrade the platform.
About the archive: it seems I can’t upload a zip file here. Do you have an email address I can send it to?
Can you please upload it to this GitHub issue?
## Environment
- Is this your first time deploying Airbyte?: No
- OS Version / Instance: Amazon Linux
- Memory / Disk: you can use something like 8Gb / 60 Tb
- Deployment: Docker
- Airbyte Version: 0.36.2-alpha
- Source name/version:
Google sheets: 0.2.12
Hubspot: 0.1.53
Stripe: 0.1.31 etc.....
- **Severity**: Critical
- **Step where error happened**: Sync job
## Current Behavior
All syncs fail with version 0.3.32 of the Redshift destination connector; going back to version 0.3.28 solves the issue.
## Expected Behavior
Sync should work
## Logs
<details>
<summary>LOG</summary>
```
example from sheets:
2022-04-22 06:38:17 destination > 2022-04-22 06:38:17 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to feedr-airbyte-prod-eu-west-2//Google_Sheets_Import/Tracker/2022_04_22_1650609493302_d1405323-22e6-4e58-bd39-60a3cf95c6c1.csv with id 0G4kNjyAV...msdos6H88]: Uploading leftover stream [Part number 1 containing 0.71 MB]
2022-04-22 06:38:17 destination > 2022-04-22 06:38:17 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to feedr-airbyte-prod-eu-west-2//Google_Sheets_Import/Tracker/2022_04_22_1650609493302_d1405323-22e6-4e58-bd39-60a3cf95c6c1.csv with id 0G4kNjyAV...msdos6H88]: Finished uploading [Part number 1 containing 0.71 MB]
2022-04-22 06:38:17 destination > 2022-04-22 06:38:17 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to feedr-airbyte-prod-eu-west-2//Google_Sheets_Import/Tracker/2022_04_22_1650609493302_d1405323-22e6-4e58-bd39-60a3cf95c6c1.csv with id 0G4kNjyAV...msdos6H88]: Completed
2022-04-22 06:38:17 destination > 2022-04-22 06:38:17 INFO i.a.i.d.s.w.BaseS3Writer(close):115 - Upload completed for stream 'Tracker'.
2022-04-22 06:38:17 destination > 2022-04-22 06:38:17 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationSchema):155 - Creating schema in destination if it doesn't exist: google_sheets_import
2022-04-22 06:38:18 destination > 2022-04-22 06:38:18 INFO i.a.i.d.j.c.s.S3StreamCopier(createTemporaryTable):161 - Preparing tmp table in destination for stream: Tracker, schema: google_sheets_import, tmp table name: _airbyte_tmp_rsd_tracker.
2022-04-22 06:38:18 destination > 2022-04-22 06:38:18 INFO i.a.i.d.r.RedshiftStreamCopier(copyStagingFileToTemporaryTable):86 - Starting copy to tmp table: _airbyte_tmp_rsd_tracker in destination for stream: Tracker, schema: google_sheets_import, .
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 INFO i.a.i.d.r.RedshiftStreamCopier(copyStagingFileToTemporaryTable):90 - Copy to tmp table _airbyte_tmp_rsd_tracker in destination for stream Tracker complete.
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationTable):177 - Preparing table _airbyte_raw_tracker in destination.
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationTable):179 - Table _airbyte_tmp_rsd_tracker in destination prepared.
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 INFO i.a.i.d.j.c.s.S3StreamCopier(generateMergeStatement):186 - Preparing to merge tmp table _airbyte_tmp_rsd_tracker to dest table: _airbyte_raw_tracker, schema: google_sheets_import, in destination.
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 INFO i.a.i.d.j.c.s.S3StreamCopier(generateMergeStatement):190 - Destination OVERWRITE mode detected. Dest table: _airbyte_raw_tracker, schema: google_sheets_import, truncated.
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 INFO i.a.i.d.r.RedshiftSqlOperations(onDestinationCloseOperations):110 - Executing operations for Redshift Destination DB engine...
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 INFO i.a.i.d.r.RedshiftSqlOperations(discoverNotSuperTables):129 - Discovering NOT SUPER table types...
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 INFO i.a.i.d.r.RedshiftSqlOperations(onDestinationCloseOperations):118 - Executing operations for Redshift Destination DB engine completed.
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):201 - S3 staging file /Google_Sheets_Import/Tracker/2022_04_22_1650609493302_d1405323-22e6-4e58-bd39-60a3cf95c6c1.csv cleaned.
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):205 - Begin cleaning _airbyte_tmp_rsd_tracker tmp table in destination.
2022-04-22 06:38:19 destination > Exception in thread "main" java.sql.SQLException: [Amazon](500310) Invalid operation: current transaction is aborted, commands ignored until end of transaction block;
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.messages.inbound.ErrorResponse.toErrorException(Unknown Source)
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGMessagingContext.handleErrorResponse(Unknown Source)
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGMessagingContext.handleMessage(Unknown Source)
2022-04-22 06:38:19 destination > at com.amazon.jdbc.communications.InboundMessagesPipeline.getNextMessageOfClass(Unknown Source)
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGMessagingContext.doMoveToNextClass(Unknown Source)
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGMessagingContext.moveThroughMetadata(Unknown Source)
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGMessagingContext.getNoData(Unknown Source)
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGClient.directExecuteExtraMetadataWithMessage(Unknown Source)
2022-04-22 06:38:19 destination > 2022-04-22 06:38:19 ERROR i.a.i.d.b.BufferedStreamConsumer(close):191 - Close failed.
2022-04-22 06:38:19 destination > at com.amazon.redshift.dataengine.PGQueryExecutor$CallableExecuteTask.call(Unknown Source)
2022-04-22 06:38:19 destination > at com.amazon.redshift.dataengine.PGQueryExecutor$CallableExecuteTask.call(Unknown Source)
2022-04-22 06:38:19 destination > at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
2022-04-22 06:38:19 destination > at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
2022-04-22 06:38:19 destination > at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2022-04-22 06:38:19 destination > Caused by: com.amazon.support.exceptions.ErrorException: [Amazon](500310) Invalid operation: current transaction is aborted, commands ignored until end of transaction block;
2022-04-22 06:38:19 destination > ... 13 more
2022-04-22 06:38:19 destination > java.sql.SQLException: [Amazon](500310) Invalid operation: current transaction is aborted, commands ignored until end of transaction block;
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.messages.inbound.ErrorResponse.toErrorException(Unknown Source) ~[redshift-jdbc42-no-awssdk-1.2.51.1078.jar:RedshiftJDBC_1.2.51.1078]
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGMessagingContext.handleErrorResponse(Unknown Source) ~[redshift-jdbc42-no-awssdk-1.2.51.1078.jar:RedshiftJDBC_1.2.51.1078]
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGMessagingContext.handleMessage(Unknown Source) ~[redshift-jdbc42-no-awssdk-1.2.51.1078.jar:RedshiftJDBC_1.2.51.1078]
2022-04-22 06:38:19 destination > at com.amazon.jdbc.communications.InboundMessagesPipeline.getNextMessageOfClass(Unknown Source) ~[redshift-jdbc42-no-awssdk-1.2.51.1078.jar:RedshiftJDBC_1.2.51.1078]
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGMessagingContext.doMoveToNextClass(Unknown Source) ~[redshift-jdbc42-no-awssdk-1.2.51.1078.jar:RedshiftJDBC_1.2.51.1078]
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGMessagingContext.moveThroughMetadata(Unknown Source) ~[redshift-jdbc42-no-awssdk-1.2.51.1078.jar:RedshiftJDBC_1.2.51.1078]
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGMessagingContext.getNoData(Unknown Source) ~[redshift-jdbc42-no-awssdk-1.2.51.1078.jar:RedshiftJDBC_1.2.51.1078]
2022-04-22 06:38:19 destination > at com.amazon.redshift.client.PGClient.directExecuteExtraMetadataWithMessage(Unknown Source) ~[redshift-jdbc42-no-awssdk-1.2.51.1078.jar:RedshiftJDBC_1.2.51.1078]
2022-04-22 06:38:19 destination > at com.amazon.redshift.dataengine.PGQueryExecutor$CallableExecuteTask.call(Unknown Source) ~[redshift-jdbc42-no-awssdk-1.2.51.1078.jar:RedshiftJDBC_1.2.51.1078]
2022-04-22 06:38:19 destination > at com.amazon.redshift.dataengine.PGQueryExecutor$CallableExecuteTask.call(Unknown Source) ~[redshift-jdbc42-no-awssdk-1.2.51.1078.jar:RedshiftJDBC_1.2.51.1078]
2022-04-22 06:38:19 destination > at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
2022-04-22 06:38:19 destination > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
2022-04-22 06:38:19 destination > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
2022-04-22 06:38:19 destination > Caused by: com.amazon.support.exceptions.ErrorException: [Amazon](500310) Invalid operation: current transaction is aborted, commands ignored until end of transaction block;
2022-04-22 06:38:19 destination > ... 13 more
2022-04-22 06:38:19 ERROR i.a.w.DefaultReplicationWorker(run):169 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.DefaultReplicationWorker$DestinationException: Destination process exited with non-zero exit code 1
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:164) ~[io.airbyte-airbyte-workers-0.36.2-alpha.jar:?]
at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.36.2-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.2-alpha.jar:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Suppressed: io.airbyte.workers.WorkerException: Destination process exit with code 1. This warning is normal if the job was cancelled.
at io.airbyte.workers.protocols.airbyte.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:119) ~[io.airbyte-airbyte-workers-0.36.2-alpha.jar:?]
at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:126) ~[io.airbyte-airbyte-workers-0.36.2-alpha.jar:?]
at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.36.2-alpha.jar:?]
at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.2-alpha.jar:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.DefaultReplicationWorker$DestinationException: Destination process exited with non-zero exit code 1
at io.airbyte.workers.DefaultReplicationWorker.lambda$getDestinationOutputRunnable$6(DefaultReplicationWorker.java:354) ~[io.airbyte-airbyte-workers-0.36.2-alpha.jar:?]
at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
... 1 more
2022-04-22 06:38:19 INFO i.a.w.DefaultReplicationWorker(run):228 - sync summary: io.airbyte.config.ReplicationAttemptSummary@7e000863[status=failed,recordsSynced=1968,bytesSynced=556599,startTime=1650609487582,endTime=1650609499928,totalStats=io.airbyte.config.SyncStats@1d01a85f[recordsEmitted=1968,bytesEmitted=556599,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[io.airbyte.config.StreamSyncStats@7fcb5e82[streamName=Tracker,stats=io.airbyte.config.SyncStats@680d96eb[recordsEmitted=1968,bytesEmitted=556599,stateMessagesEmitted=<null>,recordsCommitted=<null>]]]]
2022-04-22 06:38:19 INFO i.a.w.DefaultReplicationWorker(run):250 - Source did not output any state messages
2022-04-22 06:38:19 WARN i.a.w.DefaultReplicationWorker(run):258 - State capture: No new state, falling back on input state: io.airbyte.config.State@16b0c899[state={}]
2022-04-22 06:38:19 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
example from mail chimp:
2022-04-20 17:46:30 destination > 2022-04-20 17:46:30 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-04-20 17:46:30 destination > 2022-04-20 17:46:30 WARN a.m.s.MultiPartOutputStream(close):160 - [MultipartOutputStream for parts 1 - 10000] is already closed
2022-04-20 17:46:31 destination > 2022-04-20 17:46:31 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to feedr-airbyte-prod-eu-west-2//mailchimp/email_activity/2022_04_20_1650474601884_123c2740-22da-43f7-81c5-c22aac1a7719.csv with id 5HLzePKXZ...bGXe7yNdi]: Finished uploading [Part number 3 containing 5.68 MB]
2022-04-20 17:46:31 destination > 2022-04-20 17:46:31 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to feedr-airbyte-prod-eu-west-2//mailchimp/email_activity/2022_04_20_1650474601884_123c2740-22da-43f7-81c5-c22aac1a7719.csv with id 5HLzePKXZ...bGXe7yNdi]: Completed
2022-04-20 17:46:31 destination > 2022-04-20 17:46:31 INFO i.a.i.d.s.w.BaseS3Writer(close):115 - Upload completed for stream 'email_activity'.
2022-04-20 17:46:31 destination > 2022-04-20 17:46:31 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationSchema):155 - Creating schema in destination if it doesn't exist: mailchimp
2022-04-20 17:46:31 destination > 2022-04-20 17:46:31 INFO i.a.i.d.j.c.s.S3StreamCopier(createTemporaryTable):161 - Preparing tmp table in destination for stream: email_activity, schema: mailchimp, tmp table name: _airbyte_tmp_jpk_email_activity.
2022-04-20 17:46:32 destination > 2022-04-20 17:46:32 INFO i.a.i.d.r.RedshiftStreamCopier(copyStagingFileToTemporaryTable):86 - Starting copy to tmp table: _airbyte_tmp_jpk_email_activity in destination for stream: email_activity, schema: mailchimp, .
2022-04-20 17:46:54 destination > 2022-04-20 17:46:54 INFO i.a.i.d.r.RedshiftStreamCopier(copyStagingFileToTemporaryTable):90 - Copy to tmp table _airbyte_tmp_jpk_email_activity in destination for stream email_activity complete.
2022-04-20 17:46:54 destination > 2022-04-20 17:46:54 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationTable):177 - Preparing table _airbyte_raw_email_activity in destination.
2022-04-20 17:46:54 destination > 2022-04-20 17:46:54 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationTable):179 - Table _airbyte_tmp_jpk_email_activity in destination prepared.
2022-04-20 17:46:54 destination > 2022-04-20 17:46:54 INFO i.a.i.d.j.c.s.S3StreamCopier(generateMergeStatement):186 - Preparing to merge tmp table _airbyte_tmp_jpk_email_activity to dest table: _airbyte_raw_email_activity, schema: mailchimp, in destination.
2022-04-20 17:46:54 destination > 2022-04-20 17:46:54 INFO i.a.i.d.j.c.s.S3StreamCopier(generateMergeStatement):190 - Destination OVERWRITE mode detected. Dest table: _airbyte_raw_email_activity, schema: mailchimp, truncated.
2022-04-20 17:46:54 destination > 2022-04-20 17:46:54 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):201 - S3 staging file /mailchimp/lists/2022_04_20_1650474601889_60e21a7f-b060-4cb1-8a8a-0776d8dec37d.csv cleaned.
2022-04-20 17:46:54 destination > 2022-04-20 17:46:54 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):205 - Begin cleaning _airbyte_tmp_kff_lists tmp table in destination.
2022-04-20 17:46:54 destination > Exception in thread "main" java.sql.SQLException: [Amazon](500310) Invalid operation: current transaction is aborted, commands ignored until end of transaction block;
2022-04-20 17:46:54 destination > 2022-04-20 17:46:54 ERROR i.a.i.d.b.BufferedStreamConsumer(close):191 - Close failed.
```
</details>
Please remove sensitive data from the archive.
I have looked at the logs and I see the following versions:
airbyte/destination-redshift:0.3.28
airbyte/normalization:0.1.69
These are old versions (which do not support SUPER), but you already converted the raw table to the SUPER type.
Anyway, just try to upgrade the platform and airbyte/destination-redshift
to the latest versions and run the sync again. If you encounter the problem again, please upload the data folder to GitHub for investigation.
Thank you