Add logic for inferring newer broker versions #2038
Conversation
Force-pushed from 4c9095b to 4ea0677
- New Fetch / ListOffsets request / response objects
- Add new test cases to infer code based on mentioned objects
- Add unit test to check inferred version against whatever resides in KAFKA_VERSION
- Add new kafka broker versions to integration list
- Add more kafka broker versions to travis task list
- Add support for broker version 2.5 id
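For context, here is a rough sketch of the probing idea behind the version-inference logic described above (the helper names and version/request pairs are purely illustrative, not the exact test cases added in this PR): try probe requests newest-first and report the first broker release that accepts one.

```python
# Newest-first probe table: (inferred broker version, request only that release understands).
# The entries below are placeholders, not the real mapping used by kafka-python.
PROBE_CASES = [
    ((2, 5), "probe_request_only_2_5_understands"),
    ((2, 4), "probe_request_only_2_4_understands"),
    ((2, 3), "probe_request_only_2_3_understands"),
]

def infer_broker_version(broker_accepts):
    """broker_accepts(request_name) -> True if the broker parses and answers the request."""
    for version, request_name in PROBE_CASES:
        if broker_accepts(request_name):
            return version
    raise RuntimeError("broker is older than every version in the probe table")
```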
Force-pushed from 4ea0677 to 3952b9a
LGTM, but let's not add too much to Travis, it runs really long.
@gabriel-tincu Great job!
Thanks for working on this, I don't have time today but will try to look at this over the weekend.
@@ -24,9 +24,12 @@
 from kafka.future import Future
 from kafka.metrics.stats import Avg, Count, Max, Rate
 from kafka.oauth.abstract import AbstractTokenProvider
-from kafka.protocol.admin import SaslHandShakeRequest
+from kafka.protocol.admin import SaslHandShakeRequest, DescribeAclsRequest_v2
This plus my last commit is in response to the broker not recognizing the new V2 request even though the schema is unchanged per
https://github.com/apache/kafka/blob/2.5/clients/src/main/resources/common/message/DescribeAclsRequest.json
This leads me to believe that objects are serialized and deserialized differently when flexible versions are enabled, but I need some alone time with the Java client and/or Wireshark to figure out how that actually happens.
UPDATE: According to https://cwiki.apache.org/confluence/display/KAFKA/KIP-482%3A+The+Kafka+Protocol+should+Support+Optional+Tagged+Fields, the request header needs to be updated to account for (0) tagged fields at a minimum, otherwise the server will not be able to parse the request.
Specifically, a good place to start would be a varint base type, on which the compact string data type and new header format depend.
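To make the KIP-482 point concrete, below is a minimal sketch, assuming hypothetical helper names (this is not kafka-python's actual protocol code), of the pieces involved: an unsigned-varint encoder, a COMPACT_STRING, and a flexible (v2) request header that ends with an empty tagged-fields section.

```python
import struct

def encode_unsigned_varint(value):
    """Unsigned varint: 7 data bits per byte, MSB set on every byte except the last."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def encode_compact_string(s):
    """COMPACT_STRING: unsigned varint of (length + 1) followed by the UTF-8 bytes."""
    data = s.encode("utf-8")
    return encode_unsigned_varint(len(data) + 1) + data

def encode_request_header_v2(api_key, api_version, correlation_id, client_id):
    """Flexible request header: the v1 fields plus an empty tagged-fields block at the end."""
    header = struct.pack(">hhi", api_key, api_version, correlation_id)
    # client_id remains a plain (non-compact) string in the flexible header
    header += struct.pack(">h", len(client_id)) + client_id.encode("utf-8")
    # the broker expects at least the tagged-field count, so append varint(0)
    header += encode_unsigned_varint(0)
    return header
```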
@gabriel-tincu Ohhhh, so they basically changed the protocol specification in version 2.4.0. All new versions after that would require us to have support for tagged fields at the protocol level. For now, we can leave the auto-discovery up to 2.5, that's not a problem, but the change in the admin client above should be reverted if version 2 does not work.
@gabriel-tincu As for the compact versions of each type, it should not be too hard, as we already have a varint implementation for length in protocol utils. It was used for the message v2 parser.
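A rough sketch of the decoding side, assuming a varint reader along the lines of the one mentioned above for the message v2 parser (the function names here are hypothetical, not the existing protocol utils API):

```python
def decode_unsigned_varint(buf, pos=0):
    """Read an unsigned varint from buf starting at pos; return (value, next_pos)."""
    value = shift = 0
    while True:
        byte = buf[pos]
        pos += 1
        value |= (byte & 0x7F) << shift
        if not byte & 0x80:
            return value, pos
        shift += 7

def decode_compact_string(buf, pos=0):
    """COMPACT_STRING stores varint(length + 1); a stored 0 means a null string."""
    length_plus_one, pos = decode_unsigned_varint(buf, pos)
    if length_plus_one == 0:
        return None, pos
    end = pos + length_plus_one - 1
    return buf[pos:end].decode("utf-8"), end
```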
reverted the admin code
Force-pushed from cc53f44 to 8976815
Force-pushed from 8976815 to 3c68df5
…ve unused older versions for inference code, remove one minor version from known server list.
Do not use newly created ACL request / responses in allowed version lists, due to flexible versions being enabled in Kafka actually requiring a serialization protocol header update.
Revert admin client file change.
Force-pushed from 3c68df5 to f1aa128
@jeffwidman FYI. I will merge this PR with one minor note: it adds DescribeAclsRequest_v2 and DescribeAclsResponse_v2 that are not fully correct, as they don't abide by the new protocol format that supports tagged fields. We need to add support for it before we can finish these classes.
@gabriel-tincu Thanks greatly for the contributions!
Can this please be released? cc @dpkp |
* Add logic for inferring newer broker versions
  - New Fetch / ListOffsets request / response objects
  - Add new test cases to infer code based on mentioned objects
  - Add unit test to check inferred version against whatever resides in KAFKA_VERSION
  - Add new kafka broker versions to integration list
  - Add more kafka broker versions to travis task list
  - Add support for broker version 2.5 id
* Implement PR change requests: fewer versions for travis testing, remove unused older versions for inference code, remove one minor version from known server list. Do not use newly created ACL request / responses in allowed version lists, due to flexible versions being enabled in Kafka actually requiring a serialization protocol header update. Revert admin client file change.
As a side note, is there a ticket open, or are there plans, to use the same method the Java client uses for keeping the request/response classes up to date? It would seem much easier to import the folder with the JSON API files and have our classes generated from that information, rather than doing a copy-paste cycle multiple times, which can get very error-prone.
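To illustrate the idea, here is a purely hypothetical sketch (no such generator exists in kafka-python; the path and field layout are assumptions based on the upstream repository): read the upstream JSON message definitions and derive field specs from them instead of hand-copying each request/response class.

```python
import glob
import json
import re

def load_message_specs(path_glob):
    """Parse Kafka's *.json message definitions, stripping the // comment lines they contain."""
    specs = {}
    for path in glob.glob(path_glob):
        with open(path) as f:
            text = re.sub(r"^\s*//.*$", "", f.read(), flags=re.MULTILINE)
        spec = json.loads(text)
        specs[spec["name"]] = spec
    return specs

def field_summary(spec):
    """Return (name, type, versions) for each top-level field of one message spec."""
    return [(f["name"], f["type"], f["versions"]) for f in spec.get("fields", [])]

# e.g. load_message_specs("kafka/clients/src/main/resources/common/message/*.json")
```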
@jeffwidman