Python SchemaRegistryClient failed to connect to SchemaRegistry Server #1850
Comments
I have seen the same exact error when I try to connect to my brokers with the Confluent Schema Producer class. I think I'm definitely missing something. The broker log shows:

```
javax.net.ssl|ERROR|32|data-plane-kafka-network-thread-0-ListenerName(SSL)-SSL-5|2024-11-15 16:17:35.292 CET|TransportContext.java:352|Fatal (HANDSHAKE_FAILURE): Insufficient buffer remaining for AEAD cipher fragment (2). Needs to be more than tag size (16) (
"throwable" : {
javax.crypto.BadPaddingException: Insufficient buffer remaining for AEAD cipher fragment (2). Needs to be more than tag size (16)
at java.base/sun.security.ssl.SSLCipher$T13GcmReadCipherGenerator$GcmReadCipher.decrypt(SSLCipher.java:1894)
at java.base/sun.security.ssl.SSLEngineInputRecord.decodeInputRecord(SSLEngineInputRecord.java:240)
at java.base/sun.security.ssl.SSLEngineInputRecord.decode(SSLEngineInputRecord.java:197)
at java.base/sun.security.ssl.SSLEngineInputRecord.decode(SSLEngineInputRecord.java:160)
at java.base/sun.security.ssl.SSLTransport.decode(SSLTransport.java:111)
at java.base/sun.security.ssl.SSLEngineImpl.decode(SSLEngineImpl.java:681)
at java.base/sun.security.ssl.SSLEngineImpl.readRecord(SSLEngineImpl.java:636)
at java.base/sun.security.ssl.SSLEngineImpl.unwrap(SSLEngineImpl.java:454)
at java.base/sun.security.ssl.SSLEngineImpl.unwrap(SSLEngineImpl.java:433)
at java.base/javax.net.ssl.SSLEngine.unwrap(SSLEngine.java:637)
at org.apache.kafka.common.network.SslTransportLayer.handshakeUnwrap(SslTransportLayer.java:527)
at org.apache.kafka.common.network.SslTransportLayer.doHandshake(SslTransportLayer.java:381)
at org.apache.kafka.common.network.SslTransportLayer.handshake(SslTransportLayer.java:301)
at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:178)
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:543)
```
This suggests that the Python client is not handling the certificate the same way curl does, since you explicitly passed the CA to curl.
Could you please share the result if you disable hostname verification and client auth? There might be a different option for disabling certificate verification available on the Confluent producer configuration page. That would allow the Python client to bypass SSL verification, similar to how curl behaves when you pass `-k`/`--insecure`.
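For reference, the librdkafka-level properties being alluded to are presumably along these lines; this is only a sketch with a placeholder broker address and CA path, and, as the next reply points out, these keys are not accepted by the SchemaRegistryClient:

```python
from confluent_kafka import Producer

# Sketch only: broker address and CA path are placeholders.
producer = Producer({
    "bootstrap.servers": "broker.example.com:9093",
    "security.protocol": "SSL",
    "ssl.ca.location": "/path/to/ca-chain.pem",
    # librdkafka options that relax verification (not recommended outside testing):
    "enable.ssl.certificate.verification": False,
    "ssl.endpoint.identification.algorithm": "none",
})
```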
There is no such option for this client; it raises `ValueError: Unrecognized properties: enable.ssl.certificate.verification`.
You're absolutely correct, and I apologize for the confusion; I was thinking of a different library, but I mentioned there is probably an equivalent option depending on how the client is implemented. After reviewing the relevant documentation and code at https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/_modules/confluent_kafka/schema_registry/schema_registry_client.html, my understanding is that these clients are all built on librdkafka, which clearly has `enable.ssl.certificate.verification`: https://docs.confluent.io/platform/current/clients/librdkafka/html/md_CONFIGURATION.html. I don't claim to be a Kafka expert, but I was trying to connect the dots. Additionally, can you try the following and let me know if it works for you? This is based on my analysis of the client code.
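The exact snippet suggested above is not preserved in this thread; a rough sketch of a CA-pinned SchemaRegistryClient configuration, using the documented `ssl.ca.location` key and placeholder URL and paths, would look roughly like this:

```python
from confluent_kafka.schema_registry import SchemaRegistryClient

# Sketch only: URL and file paths are placeholders.
sr_client = SchemaRegistryClient({
    "url": "https://schema-registry.example.com:8081",
    "ssl.ca.location": "/path/to/ca-chain.pem",
    # Only needed if the registry requires mutual TLS:
    # "ssl.certificate.location": "/path/to/client.pem",
    # "ssl.key.location": "/path/to/client.key",
})
print(sr_client.get_subjects())
```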
I would also verify, which I assume you did, that all CA certificates, including the intermediates, were imported into the trust store. In other words, make sure your trust store really contains the complete chain.
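A quick way to check that the CA bundle handed to the Python client really carries the complete chain (intermediates included) is to open a TLS connection against it with the standard library; host, port, and path below are placeholders:

```python
import socket
import ssl

# Placeholder host/port and CA bundle path.
ctx = ssl.create_default_context(cafile="/path/to/ca-chain.pem")
with socket.create_connection(("schema-registry.example.com", 8081)) as sock:
    with ctx.wrap_socket(sock, server_hostname="schema-registry.example.com") as tls:
        # Raises ssl.SSLCertVerificationError if the chain is incomplete or untrusted.
        print(tls.getpeercert()["subject"])
```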
The `schema.keystore.jks` is properly set up. If it weren't, I wouldn't have been able to successfully run the curl command with the `--cacert` option. While setting `sr_client._rest_client.session.verify = False` does fix the issue, it's not the solution I'm aiming for. My goal is for the SchemaRegistryClient to properly verify the server using the certificate provided via the `ssl.ca.location` configuration. In this case, it feels like the behavior is equivalent to providing `ssl.ca.location: ''`.
There are differences between the library clients and curl in how they handle and process SSL verification and certificate chains. Limitations exist, and the libraries are probably stricter. I think looking more closely at the CA and keystore files won't hurt; I wanted to make sure we explore all possible angles. Another possibility is that the ...
Doing what you said, I just get an SSL error because it tries to verify the server certificate, and `sr_client._rest_client.session.verify` is not a good place to put the CA cert.
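Side note, hedged: the underlying REST client is a plain `requests` Session, and a Session's `verify` attribute accepts a CA-bundle path as well as a boolean, so the workaround can at least keep verification enabled. It still relies on the private `_rest_client` attribute, so it may break between library versions:

```python
from confluent_kafka.schema_registry import SchemaRegistryClient

# Sketch only: URL and CA path are placeholders.
sr_client = SchemaRegistryClient({"url": "https://schema-registry.example.com:8081"})

# Point the underlying requests session at the CA bundle instead of
# disabling verification entirely (private attribute, fragile by nature).
sr_client._rest_client.session.verify = "/path/to/ca-chain.pem"
print(sr_client.get_subjects())
```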
You're right. I initially thought the class had a method to handle this directly as well, but I was mistaken. As a workaround, I submitted a pull request to add an option for it.
Hi @Jay-boo, I am seeing the same issue.
No 😭😭😭
Description
Hi, I have a Schema Registry server running with the following `server.properties`:
Here is my test Python script:
I get the following logs:
The problem is that when I try the same thing with a simple curl command, it works :/
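The working curl call is presumably of the form `curl --cacert /path/to/ca-chain.pem https://schema-registry.example.com:8081/subjects` (host and paths are placeholders). The same check can be reproduced from Python with `requests`, which is also what the SchemaRegistryClient uses underneath:

```python
import requests

# Placeholder URL and CA bundle path; mirrors `curl --cacert ...`.
resp = requests.get(
    "https://schema-registry.example.com:8081/subjects",
    verify="/path/to/ca-chain.pem",
)
print(resp.status_code, resp.json())
```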
How to reproduce
confluent-schema-registry version is 7.7.1, and the Python package confluent-kafka is version 2.5.3.
Checklist
Please provide the following information:
- confluent-kafka-python and librdkafka version (`confluent_kafka.version()` and `confluent_kafka.libversion()`):
- Client configuration: `{...}`
- Client logs (with `'debug': '..'` as necessary)