Cannot insert a message longer than 32K to CLOB (Oracle) #690

Open

lobashin opened this issue Aug 5, 2019 · 15 comments

@lobashin

lobashin commented Aug 5, 2019

Hello,

While testing the kafka-connect-jdbc sink with Oracle, I found that a message longer than 32767 bytes cannot be inserted into a CLOB column. I think the problem is the binding of the CLOB column to a string in GenericDatabaseDialect.java. If the field is shorter than 32767 bytes, inserting works fine; otherwise setClob must be used instead of setString.

Maybe I'm wrong; if so, please tell me how to work around this problem.
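
For illustration, here is a minimal sketch of the length-dependent binding I have in mind (this is not the connector's actual code; the class name, method name, and character-based threshold are assumptions):

import java.io.StringReader;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Types;

public class ClobBindingSketch {
  // Oracle rejects setString binds above 32767 bytes with ORA-01461.
  // Note: String.length() counts characters, not bytes, so multi-byte
  // data would need a stricter threshold.
  private static final int MAX_STRING_BIND = 32767;

  public static void bindString(PreparedStatement stmt, int index, String value)
      throws SQLException {
    if (value == null) {
      stmt.setNull(index, Types.CLOB);
    } else if (value.length() < MAX_STRING_BIND) {
      // Short values: a plain string bind works fine.
      stmt.setString(index, value);
    } else {
      // Long values: stream through setClob so the driver does not
      // fall back to a LONG bind.
      stmt.setClob(index, new StringReader(value), value.length());
    }
  }
}

The same setClob-for-big-strings approach was adopted in the Spring Framework (see the issue I link below).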

Error:
ORA-01461: can bind a LONG value only for insert into a LONG column

Driver:
ojdbc7.jar

Sink:
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
table.name.format=TEST
topics.regex=^CONNECT-TEST$
auto.create=true
auto.evolve=true
name=TEST-SINK
insert.mode=upsert
pk.mode=record_value
pk.fields=RECORD_ID
connection.url=jdbc:oracle:thin:@x.x.x.x:1521:x
connection.user=x
connection.password=x

Table:
CREATE TABLE "TEST" (
"RECORD_ID" NUMBER(*,0) NOT NULL,
"SOURCECODE" CLOB NULL,
PRIMARY KEY("RECORD_ID"))
 
Schema:
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
{
  "subject": "CONNECT-TEST-value",
  "version": 1,
  "id": 963,
  "schema": "{\"type\":\"record\",\"name\":\"TEST\",\"fields\":[{\"name\":\"RECORD_ID\",\"type\":{\"type\":\"bytes\",\"scale\":0,\"precision\":64,\"connect.version\":1,\"connect.parameters\":{\"scale\":\"0\"},\"connect.name\":\"org.apache.kafka.connect.data.Decimal\",\"logicalType\":\"decimal\"}},{\"name\":\"SOURCECODE\",\"type\":[\"null\",\"string\"],\"default\":null}],\"connect.name\":\"TEST\"}"
}
 
Topic:
kafka-avro-console-consumer --bootstrap-server x.x.x.x:9092 --topic CONNECT-TEST --offset=0 --partition 0 --max-messages=1
{"RECORD_ID":"\u0001","SOURCECODE":{"string":"GGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGG…....More 32K.......GGGG}}

@wicknicks
Member

I think the ORA-01461 error is occurring because of how RECORD_ID is formatted. I don't think this has anything to do with the SOURCECODE string.

@lobashin
Author

> I think the ORA-01461 error is occurring because of how RECORD_ID is formatted. I don't think this has anything to do with the SOURCECODE string.

Hi wicknicks,

I do not think so. Why, then, are there no errors when SOURCECODE is shorter than 32K?

See a similar issue in the Spring Framework: spring-projects/spring-framework#16854. They started using setClob for big strings.

@gprince19

Hello, I have the same issue in "upsert" mode, but in "insert" mode it works fine.
I hope this detail helps.

Regards

@gaetancollaud

gaetancollaud commented Jan 7, 2020

Any update or workaround on this? I would like something like "insert and ignore on constraint", so I have to use merge. Insert throws a lot of ORA-00001: unique constraint (AOO_TESTS.SYS_C005543) violated errors and makes the task fail.

@CsCallister

No solution yet? I have run into the same issue. Note that since the error is thrown by the sink connector, the errors.tolerance=all property will not help.

@gauravpatel1833

Did we get any solution for this problem? We cannot go with insert.mode = "insert", as it will cause problems during updates.

@CsCallister

Because I did not find any solution to this, I decided to go with a dirty hack:
https://docs.confluent.io/platform/current/connect/transforms/filter-confluent.html

In the connector configuration I added:
transforms: filterSpammer
transforms.filterSpammer.type: io.confluent.connect.transforms.Filter$Value
transforms.filterSpammer.filter.condition: $[?(@.mail_body =~ /.{32001,}/i)]
transforms.filterSpammer.filter.type: exclude
transforms.filterSpammer.missing.or.null.behavior: include

This will skip messages from the topic that have a field longer than 32000 bytes.
Of course, this only keeps the connector from breaking, which was my case; you will lose all messages with fields over 32000 bytes.

@monnetchr

If the messages are JSON, that will make them unparseable.

@whatsupbros

whatsupbros commented Mar 11, 2021

Have the same issue here: cannot insert any JSON data into a CLOB database column because of this exception in UPSERT mode:

java.sql.BatchUpdateException: ORA-01461: can bind a LONG value only for insert into a LONG column

INSERT mode seems to work okay.

Do we have any news on the issue? It seems to have also been mentioned in #714.

@whatsupbros

It seems to have been fixed and merged in #925, and released with version 10.0.2. At least in my tests with Oracle it works now, after I upgraded from 10.0.1 to 10.0.2.

@aashokggupta

aashokggupta commented Sep 5, 2021

Any solution for this issue? I am facing the same problem.
Confluent version: 5.5.x
JDBC driver version: OJDBC8.jar
Oracle version: 12.2.0.1

@whatsupbros

@aashokggupta, which version of JDBC connector are you using?

Upgrade to the latest version of the connector; it should work there.

@aashokggupta

Where can I check the version of the JDBC connector? Can you please share? I am using version 5.5.1 of the JDBC connector.

@18015290123

I solved this problem; I hope it can serve as a reference for you. If the target table's column is defined as NCLOB instead of CLOB, the error does not occur. I do not yet know why.

@Anatoliy3000

Hi all,
Any news about this issue? I get the error on version 10.7.4. With the NCLOB type it also does not work for me.
