Commit 0c2523c: Release 1.4.0
dpkp committed Feb 7, 2018 (1 parent: acc3a0f)
Showing 6 changed files with 189 additions and 7 deletions.

85 changes: 85 additions & 0 deletions CHANGES.md
@@ -1,3 +1,88 @@
# 1.4.0 (Feb 6, 2018)

This is a substantial release. Although there are no known 'showstopper' bugs as of release,
we do recommend you test any planned upgrade to your application prior to running in production.

Some of the major changes include:
* We have officially dropped python 2.6 support
* The KafkaConsumer now includes a background thread to handle coordinator heartbeats
* API protocol handling has been separated from networking code into a new class, KafkaProtocol
* Added support for kafka message format v2
* Refactored DNS lookups during kafka broker connections
* SASL authentication is working (we think); a configuration sketch follows this list
* Removed several circular references to improve gc on close()
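
As a quick orientation for the SASL work, a minimal PLAIN-mechanism consumer might be configured
as below. This is a hedged sketch: the broker address, topic, group, and credentials are
placeholders, not defaults.

```python
from kafka import KafkaConsumer

# Hypothetical broker/topic/credentials -- substitute your own.
consumer = KafkaConsumer(
    'my-topic',
    bootstrap_servers='broker:9092',
    security_protocol='SASL_PLAINTEXT',  # or 'SASL_SSL' together with ssl_* options
    sasl_mechanism='PLAIN',              # 'GSSAPI' is also supported
    sasl_plain_username='alice',
    sasl_plain_password='secret',
    group_id='my-group',
)
for message in consumer:
    print(message.topic, message.partition, message.offset, message.value)
```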

Thanks to all contributors -- the state of the kafka-python community is strong!

A detailed changelog is listed below:

Client
* Fixes for SASL support
* Refactor SASL/gssapi support (dpkp #1248 #1249 #1257 #1262 #1280)
* Add security layer negotiation to the GSSAPI authentication (asdaraujo #1283)
* Fix overriding sasl_kerberos_service_name in KafkaConsumer / KafkaProducer (natedogs911 #1264)
* Fix typo in _try_authenticate_plain (everpcpc #1333)
* Fix for Python 3 byte string handling in SASL auth (christophelec #1353)
* Move callback processing from BrokerConnection to KafkaClient (dpkp #1258)
* Use socket timeout of request_timeout_ms to prevent blocking forever on send (dpkp #1281)
* Refactor dns lookup in BrokerConnection (dpkp #1312)
* Read all available socket bytes (dpkp #1332)
* Honor reconnect_backoff in conn.connect() (dpkp #1342); a small config sketch follows this list
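
Several of the connection changes above are driven by existing client settings. A minimal
sketch with illustrative values (the same options are accepted by KafkaConsumer):

```python
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers='broker:9092',  # placeholder address
    request_timeout_ms=30000,         # also bounds blocking socket sends (#1281)
    reconnect_backoff_ms=50,          # initial delay between reconnect attempts
    reconnect_backoff_max_ms=1000,    # cap for the exponential backoff (#1352)
)
```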

Consumer
* KAFKA-3977: Defer fetch parsing for space efficiency, and to raise exceptions to user (dpkp #1245)
* KAFKA-4034: Avoid unnecessary consumer coordinator lookup (dpkp #1254)
* Handle lookup_coordinator send failures (dpkp #1279)
* KAFKA-3888 Use background thread to process consumer heartbeats (dpkp #1266); see the sketch after this list
* Improve KafkaConsumer cleanup (dpkp #1339)
* Fix coordinator join_future race condition (dpkp #1338)
* Avoid KeyError when filtering fetchable partitions (dpkp #1344)
* Name heartbeat thread with group_id; use backoff when polling (dpkp #1345)
* KAFKA-3949: Avoid race condition when subscription changes during rebalance (dpkp #1364)
* Fix #1239 regression to avoid consuming duplicate compressed messages from mid-batch (dpkp #1367)
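
Because coordinator heartbeats now run on a background thread, heartbeating no longer depends
on how often your code calls poll(); the timing settings below control it. A hedged sketch with
placeholder servers, topic, group, and handler:

```python
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'my-topic',
    bootstrap_servers='broker:9092',
    group_id='my-group',
    session_timeout_ms=10000,    # coordinator waits this long for heartbeats
    heartbeat_interval_ms=3000,  # background thread heartbeats at this cadence
)
try:
    for message in consumer:
        handle(message)          # hypothetical application handler
finally:
    consumer.close()             # cleanup also stops the heartbeat thread
```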

Producer
* Fix timestamp not passed to RecordMetadata (tvoinarovskyi #1273); see the sketch after this list
* Raise non-API exceptions (jeffwidman #1316)
* Fix reconnect_backoff_max_ms default config bug in KafkaProducer (YaoC #1352)
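
The timestamp fix is easiest to see on the RecordMetadata returned by a blocking send. A hedged
sketch; the broker address and topic are placeholders:

```python
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers='broker:9092')  # placeholder address
future = producer.send('my-topic', b'hello')
metadata = future.get(timeout=10)  # RecordMetadata for the produced message
# timestamp is now populated (#1273) alongside topic/partition/offset
print(metadata.topic, metadata.partition, metadata.offset, metadata.timestamp)
producer.close()
```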

Core / Protocol
* Add kafka.protocol.parser.KafkaProtocol w/ receive and send (dpkp #1230)
* Refactor MessageSet and Message into LegacyRecordBatch to later support v2 message format (tvoinarovskyi #1252)
* Add DefaultRecordBatch implementation aka V2 message format parser/builder. (tvoinarovskyi #1185)
* optimize util.crc32 (ofek #1304)
* Raise better struct pack/unpack errors (jeffwidman #1320)
* Add Request/Response structs for kafka broker 1.0.0 (dpkp #1368)

Bugfixes
* use python standard max value (lukekingbru #1303)
* changed for to use enumerate() (TheAtomicOption #1301)
* Explicitly check for None rather than falsey (jeffwidman #1269)
* Minor Exception cleanup (jeffwidman #1317)
* Use non-deprecated exception handling (jeffwidman a699f6a)
* Remove assertion with side effect in client.wakeup() (bgedik #1348)
* use absolute imports everywhere (kevinkjt2000 #1362)

Test Infrastructure
* Use 0.11.0.2 kafka broker for integration testing (dpkp #1357 #1244)
* Add a Makefile to help build the project, generate docs, and run tests (tvoinarovskyi #1247)
* Add fixture support for 1.0.0 broker (dpkp #1275)
* Add kafka 1.0.0 to travis integration tests (dpkp #1365)
* Change fixture default host to localhost (asdaraujo #1305)
* Minor test cleanups (dpkp #1343)
* Use latest pytest 3.4.0, but drop pytest-sugar due to incompatibility (dpkp #1361)

Documentation
* Expand metrics docs (jeffwidman #1243)
* Fix docstring (jeffwidman #1261)
* Added controlled thread shutdown to example.py (TheAtomicOption #1268)
* Add license to wheel (jeffwidman #1286)
* Use correct casing for MB (jeffwidman #1298)

Logging / Error Messages
* Fix two bugs in printing bytes instance (jeffwidman #1296)


# 1.3.5 (Oct 7, 2017)

Bugfixes
4 changes: 2 additions & 2 deletions README.rst
@@ -1,7 +1,7 @@
Kafka Python client
------------------------

-.. image:: https://img.shields.io/badge/kafka-0.11%2C%200.10%2C%200.9%2C%200.8-brightgreen.svg
+.. image:: https://img.shields.io/badge/kafka-1.0%2C%200.11%2C%200.10%2C%200.9%2C%200.8-brightgreen.svg
:target: https://kafka-python.readthedocs.io/compatibility.html
.. image:: https://img.shields.io/pypi/pyversions/kafka-python.svg
:target: https://pypi.python.org/pypi/kafka-python
@@ -141,7 +141,7 @@
for interacting with kafka brokers via the python repl. This is useful for
testing, probing, and general experimentation. The protocol support is
leveraged to enable a KafkaClient.check_version() method that
probes a kafka broker and attempts to identify which version it is running
-(0.8.0 to 0.11).
+(0.8.0 to 1.0).
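
A quick probe might look like this hedged sketch (the address is a placeholder and the returned
tuple depends on the broker you point it at):

.. code:: python

    from kafka.client_async import KafkaClient

    client = KafkaClient(bootstrap_servers='localhost:9092')
    print(client.check_version())  # inferred broker version, as a tuple
    client.close()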

Low-level
*********
97 changes: 97 additions & 0 deletions docs/changelog.rst
@@ -1,6 +1,103 @@
Changelog
=========

1.4.0 (Feb 6, 2018)
###################

This is a substantial release. Although there are no known 'showstopper' bugs as of release,
we do recommend you test any planned upgrade to your application prior to running in production.

Some of the major changes include:

* We have officially dropped python 2.6 support
* The KafkaConsumer now includes a background thread to handle coordinator heartbeats
* API protocol handling has been separated from networking code into a new class, KafkaProtocol
* Added support for kafka message format v2
* Refactored DNS lookups during kafka broker connections
* SASL authentication is working (we think)
* Removed several circular references to improve gc on close()

Thanks to all contributors -- the state of the kafka-python community is strong!

A detailed changelog is listed below:

Client
------
* Fixes for SASL support

* Refactor SASL/gssapi support (dpkp #1248 #1249 #1257 #1262 #1280)
* Add security layer negotiation to the GSSAPI authentication (asdaraujo #1283)
* Fix overriding sasl_kerberos_service_name in KafkaConsumer / KafkaProducer (natedogs911 #1264)
* Fix typo in _try_authenticate_plain (everpcpc #1333)
* Fix for Python 3 byte string handling in SASL auth (christophelec #1353)

* Move callback processing from BrokerConnection to KafkaClient (dpkp #1258)
* Use socket timeout of request_timeout_ms to prevent blocking forever on send (dpkp #1281)
* Refactor dns lookup in BrokerConnection (dpkp #1312)
* Read all available socket bytes (dpkp #1332)
* Honor reconnect_backoff in conn.connect() (dpkp #1342)

Consumer
--------
* KAFKA-3977: Defer fetch parsing for space efficiency, and to raise exceptions to user (dpkp #1245)
* KAFKA-4034: Avoid unnecessary consumer coordinator lookup (dpkp #1254)
* Handle lookup_coordinator send failures (dpkp #1279)
* KAFKA-3888 Use background thread to process consumer heartbeats (dpkp #1266)
* Improve KafkaConsumer cleanup (dpkp #1339)
* Fix coordinator join_future race condition (dpkp #1338)
* Avoid KeyError when filtering fetchable partitions (dpkp #1344)
* Name heartbeat thread with group_id; use backoff when polling (dpkp #1345)
* KAFKA-3949: Avoid race condition when subscription changes during rebalance (dpkp #1364)
* Fix #1239 regression to avoid consuming duplicate compressed messages from mid-batch (dpkp #1367)

Producer
--------
* Fix timestamp not passed to RecordMetadata (tvoinarovskyi #1273)
* Raise non-API exceptions (jeffwidman #1316)
* Fix reconnect_backoff_max_ms default config bug in KafkaProducer (YaoC #1352)

Core / Protocol
---------------
* Add kafka.protocol.parser.KafkaProtocol w/ receive and send (dpkp #1230)
* Refactor MessageSet and Message into LegacyRecordBatch to later support v2 message format (tvoinarovskyi #1252)
* Add DefaultRecordBatch implementation aka V2 message format parser/builder. (tvoinarovskyi #1185)
* optimize util.crc32 (ofek #1304)
* Raise better struct pack/unpack errors (jeffwidman #1320)
* Add Request/Response structs for kafka broker 1.0.0 (dpkp #1368)

Bugfixes
--------
* use python standard max value (lukekingbru #1303)
* changed for to use enumerate() (TheAtomicOption #1301)
* Explicitly check for None rather than falsey (jeffwidman #1269)
* Minor Exception cleanup (jeffwidman #1317)
* Use non-deprecated exception handling (jeffwidman a699f6a)
* Remove assertion with side effect in client.wakeup() (bgedik #1348)
* use absolute imports everywhere (kevinkjt2000 #1362)

Test Infrastructure
-------------------
* Use 0.11.0.2 kafka broker for integration testing (dpkp #1357 #1244)
* Add a Makefile to help build the project, generate docs, and run tests (tvoinarovskyi #1247)
* Add fixture support for 1.0.0 broker (dpkp #1275)
* Add kafka 1.0.0 to travis integration tests (dpkp #1365)
* Change fixture default host to localhost (asdaraujo #1305)
* Minor test cleanups (dpkp #1343)
* Use latest pytest 3.4.0, but drop pytest-sugar due to incompatibility (dpkp #1361)

Documentation
-------------
* Expand metrics docs (jeffwidman #1243)
* Fix docstring (jeffwidman #1261)
* Added controlled thread shutdown to example.py (TheAtomicOption #1268)
* Add license to wheel (jeffwidman #1286)
* Use correct casing for MB (jeffwidman #1298)

Logging / Error Messages
------------------------
* Fix two bugs in printing bytes instance (jeffwidman #1296)


1.3.5 (Oct 7, 2017)
####################

4 changes: 2 additions & 2 deletions docs/compatibility.rst
@@ -1,12 +1,12 @@
Compatibility
-------------

-.. image:: https://img.shields.io/badge/kafka-0.11%2C%200.10%2C%200.9%2C%200.8-brightgreen.svg
+.. image:: https://img.shields.io/badge/kafka-1.0%2C%200.11%2C%200.10%2C%200.9%2C%200.8-brightgreen.svg
:target: https://kafka-python.readthedocs.io/compatibility.html
.. image:: https://img.shields.io/pypi/pyversions/kafka-python.svg
:target: https://pypi.python.org/pypi/kafka-python

-kafka-python is compatible with (and tested against) broker versions 0.11
+kafka-python is compatible with (and tested against) broker versions 1.0
through 0.8.0 . kafka-python is not compatible with the 0.8.2-beta release.

kafka-python is tested on python 2.7, 3.4, 3.5, 3.6 and pypy.
4 changes: 2 additions & 2 deletions docs/index.rst
@@ -1,7 +1,7 @@
kafka-python
############

-.. image:: https://img.shields.io/badge/kafka-0.11%2C%200.10%2C%200.9%2C%200.8-brightgreen.svg
+.. image:: https://img.shields.io/badge/kafka-1.0%2C%200.11%2C%200.10%2C%200.9%2C%200.8-brightgreen.svg
:target: https://kafka-python.readthedocs.io/compatibility.html
.. image:: https://img.shields.io/pypi/pyversions/kafka-python.svg
:target: https://pypi.python.org/pypi/kafka-python
@@ -136,7 +136,7 @@
for interacting with kafka brokers via the python repl. This is useful for
testing, probing, and general experimentation. The protocol support is
leveraged to enable a :meth:`~kafka.KafkaClient.check_version()`
method that probes a kafka broker and
-attempts to identify which version it is running (0.8.0 to 0.11).
+attempts to identify which version it is running (0.8.0 to 1.0).


Low-level
2 changes: 1 addition & 1 deletion kafka/version.py
@@ -1 +1 @@
-__version__ = '1.3.6.dev'
+__version__ = '1.4.0'

1 comment on commit 0c2523c

@jeffwidman (Collaborator):

🍰