When serializing protobuf messages, the schema that this library publishes/checks against the registry (i.e. the printed protobuf source-file string) does not match the schema published/checked by other Confluent Kafka libraries.
In my specific case, I am using confluent-kafka-python, which uploads the base64-encoded file descriptor; the registry then converts it into a source string server-side. The difference is that the schema created by the schema registry contains the json_name field options, whereas the one created by the Confluent Go library does not. Here's a basic example reproducing/demonstrating the difference:
(screenshot: on the left is the schema produced by the Go library, and on the right the one produced by the Python library)
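Since the screenshot doesn't survive in text form, here is a hypothetical sketch of the kind of difference involved (the message and field names are made up, not taken from the original report):

```proto
// Schema string as printed by confluent-kafka-go:
message Example {
  string my_field = 1;
}

// Schema string as rendered server-side from the Python client's
// uploaded file descriptor:
message Example {
  string my_field = 1 [json_name = "myField"];
}
```

The two schemas are semantically identical; `myField` is just the default json_name that protobuf derives from `my_field`, made explicit in one rendering and omitted in the other.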
A fix may be to update this library so that the generated schema (i.e. the protobuf file printed from the file descriptor) matches the one generated server-side by the schema registry. Alternatively, if the json_name field options are the only difference, the schema registry could be made to recognize the two protobuf schemas in the screenshot above as semantically equivalent (when using normalize=true).
How to reproduce
Publish the same protobuf message using this library and confluent-kafka-python. Despite being the same message, the two registered schemas end up different.
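For context, the json_name values that show up in the registry-rendered schema are just protobuf's defaults: the lowerCamelCase form of each field name. A minimal sketch of that derivation (my own helper for illustration, not part of either client library):

```go
package main

import (
	"fmt"
	"strings"
)

// defaultJSONName mimics how protobuf derives a field's default
// json_name: underscores are dropped and the letter following each
// underscore is upper-cased (snake_case -> lowerCamelCase).
func defaultJSONName(fieldName string) string {
	var b strings.Builder
	upperNext := false
	for _, r := range fieldName {
		switch {
		case r == '_':
			upperNext = true
		case upperNext:
			b.WriteString(strings.ToUpper(string(r)))
			upperNext = false
		default:
			b.WriteRune(r)
		}
	}
	return b.String()
}

func main() {
	fmt.Println(defaultJSONName("my_field"))   // myField
	fmt.Println(defaultJSONName("request_id")) // requestId
}
```

So a schema with `string my_field = 1 [json_name = "myField"];` carries no extra information over one without the option, which is why normalization could plausibly treat them as equal.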
Checklist
Please provide the following information:
- confluent-kafka-go and librdkafka version (`LibraryVersion()`):
  - from go.mod: `github.com/confluentinc/confluent-kafka-go/v2 v2.3.0`
  - from pyproject.toml for the confluent-kafka-python library: `confluent-kafka = "^2.3.0"`
- Apache Kafka broker version: using this image: `confluentinc/cp-kafka:7.3.0`
- Client configuration: `ConfigMap{...}`; using `normalize = true` for the registry
- Operating system:
- Provide client logs (with `"debug": ".."` as necessary): N/A
- Provide broker log excerpts: N/A
- Critical issue