
Introduced pyproject.toml and moved static metadata from setup.py #1592

Merged (5 commits) on Oct 18, 2024
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -4,6 +4,7 @@

v2.6.1 is a maintenance release with the following fixes and enhancements:

- Migrated the build system from `setup.py` to `pyproject.toml`, in accordance with PEP 517 and PEP 518. This improves project configuration, management of build-system requirements, and compatibility with modern Python packaging tools such as `pip` and `build`.
- Added an example for OAUTH OIDC producer with support for confluent cloud (#1769, @sarwarbhuiyan)

confluent-kafka-python is based on librdkafka v2.6.1, see the
6 changes: 3 additions & 3 deletions DEVELOPER.md
@@ -5,19 +5,19 @@ This document provides information useful to developers working on confluent-kafka-python.

## Build

-$ python setup.py build
+$ python -m build

If librdkafka is installed in a non-standard location provide the include and library directories with:

-$ C_INCLUDE_PATH=/path/to/include LIBRARY_PATH=/path/to/lib python setup.py ...
+$ C_INCLUDE_PATH=/path/to/include LIBRARY_PATH=/path/to/lib python -m build

**Note**: On Windows the variables for Visual Studio are named INCLUDE and LIB

## Generate Documentation

Install sphinx and sphinx_rtd_theme packages:
Contributor comment, suggested change:

-Install sphinx and sphinx_rtd_theme packages:
+Install docs dependencies:


-$ pip install sphinx sphinx_rtd_theme
+$ pip install .[docs]

Build HTML docs:

File renamed without changes.
4 changes: 2 additions & 2 deletions MANIFEST.in
@@ -1,4 +1,4 @@
include README.md
include LICENSE.txt
(Conversation marked as resolved by emasab.)
include test-requirements.txt
include src/confluent_kafka/src/*.[ch]
prune tests
prune docs
3 changes: 2 additions & 1 deletion Makefile
@@ -5,7 +5,8 @@ all:


clean:
-	python setup.py clean
+	pip cache purge
+	rm -rf dist
make -C docs clean

.PHONY: docs
8 changes: 0 additions & 8 deletions docs/conf.py
@@ -13,14 +13,6 @@
# serve to show the default.

import sphinx_rtd_theme
-import sys
-import os
-from glob import glob
-
-# If extensions (or modules to document with autodoc) are in another directory,
-# add these directories to sys.path here. If the directory is relative to the
-# documentation root, use os.path.abspath to make it absolute, like shown here.
-sys.path[:0] = [os.path.abspath(x) for x in glob('../build/lib.*')]

# -- General configuration ------------------------------------------------

8 changes: 3 additions & 5 deletions examples/README.md
@@ -30,8 +30,8 @@ To setup a venv with the latest release version of confluent-kafka and dependencies:
```
$ python3 -m venv venv_examples
$ source venv_examples/bin/activate
-$ cd examples
-$ pip install -r requirements.txt
+$ pip install confluent_kafka
+$ pip install -r requirements/requirements-examples.txt
```

To setup a venv that uses the current source tree version of confluent_kafka, you
@@ -42,9 +42,7 @@ need to have a C compiler and librdkafka installed
```
$ python3 -m venv venv_examples
$ source venv_examples/bin/activate
-$ python setup.py develop
-$ cd examples
-$ pip install -r requirements.txt
+$ pip install .[examples]
```

When you're finished with the venv:
7 changes: 4 additions & 3 deletions examples/docker/Dockerfile.alpine
@@ -73,9 +73,10 @@ RUN \
mkdir -p /usr/src/confluent-kafka-python && \
cd /usr/src/confluent-kafka-python && \
rm -rf build && \
-    python3 setup.py clean -a && \
-    python3 setup.py build && \
-    python3 setup.py install && \
+    rm -rf dist && \
+    python3 -m pip install build && \
+    python3 -m build && \
+    python3 -m pip install dist/confluent_kafka*whl && \
cd / && \
rm -rf /usr/src/confluent-kafka-python

41 changes: 41 additions & 0 deletions pyproject.toml
@@ -0,0 +1,41 @@
[build-system]
requires = ["setuptools>=62", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "confluent-kafka"
version = "2.6.0.post1.dev1"
description = "Confluent's Python client for Apache Kafka"
classifiers = [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Topic :: Software Development :: Libraries :: Python Modules"]
readme = "README.md"
license = { file = "LICENSE" }
requires-python = ">=3.7"
dynamic = ["optional-dependencies","dependencies"]
Contributor comment (nit), suggested change:

-dynamic = ["optional-dependencies","dependencies"]
+dynamic = ["dependencies", "optional-dependencies"]


[[project.authors]]
name = "Confluent Inc."
email = "support@confluent.io"

[project.urls]
Homepage = "https://github.com/confluentinc/confluent-kafka-python"

[tool.setuptools]
include-package-data = false

[tool.setuptools.dynamic]
dependencies = {file = ["requirements/requirements.txt"]}
optional-dependencies.schemaregistry = { file = ["requirements/requirements-schemaregistry.txt"] }
optional-dependencies.avro = { file = ["requirements/requirements-avro.txt", "requirements/requirements-schemaregistry.txt"] }
optional-dependencies.json = { file = ["requirements/requirements-json.txt", "requirements/requirements-schemaregistry.txt"] }
optional-dependencies.protobuf = { file = ["requirements/requirements-protobuf.txt", "requirements/requirements-schemaregistry.txt"] }
optional-dependencies.dev = { file = ["requirements/requirements-tests.txt", "requirements/requirements-docs.txt", "requirements/requirements-examples.txt"] }
Contributor comment: Please add the SR dependencies to dev (for testing) and break these long lines, as it's not possible to use requirements-tests-install.txt, containing `-r`, here.

Suggested change:
-optional-dependencies.dev = { file = ["requirements/requirements-tests.txt", "requirements/requirements-docs.txt", "requirements/requirements-examples.txt"] }
+optional-dependencies.dev = { file = [
+    "requirements/requirements-docs.txt",
+    "requirements/requirements-examples.txt",
+    "requirements/requirements-tests.txt",
+    "requirements/requirements-schemaregistry.txt",
+    "requirements/requirements-avro.txt",
+    "requirements/requirements-json.txt",
+    "requirements/requirements-protobuf.txt"] }

optional-dependencies.docs = { file = ["requirements/requirements-docs.txt"] }
Contributor comment: For generating docs Sphinx also needs the SR dependencies.

Suggested change:
-optional-dependencies.docs = { file = ["requirements/requirements-docs.txt"] }
+optional-dependencies.docs = { file = [
+    "requirements/requirements-docs.txt",
+    "requirements/requirements-schemaregistry.txt",
+    "requirements/requirements-avro.txt",
+    "requirements/requirements-json.txt",
+    "requirements/requirements-protobuf.txt"] }

Contributor comment: Add an `all` optional dependency:

optional-dependencies.all = { file = [
    "requirements/requirements-soaktest.txt",
    "requirements/requirements-docs.txt",
    "requirements/requirements-examples.txt",
    "requirements/requirements-tests.txt",
    "requirements/requirements-schemaregistry.txt",
    "requirements/requirements-avro.txt",
    "requirements/requirements-json.txt",
    "requirements/requirements-protobuf.txt"] }

optional-dependencies.tests = { file = ["requirements/requirements-tests.txt"] }
Contributor comment: Add the SR dependencies here.

Suggested change:
-optional-dependencies.tests = { file = ["requirements/requirements-tests.txt"] }
+optional-dependencies.tests = { file = [
+    "requirements/requirements-tests.txt",
+    "requirements/requirements-schemaregistry.txt",
+    "requirements/requirements-avro.txt",
+    "requirements/requirements-json.txt",
+    "requirements/requirements-protobuf.txt"
+] }

optional-dependencies.examples = { file = ["requirements/requirements-examples.txt"] }
optional-dependencies.soaktest = { file = ["requirements/requirements-soaktest.txt"] }
9 changes: 9 additions & 0 deletions requirements/requirements-all.txt
@@ -0,0 +1,9 @@
-r requirements.txt
-r requirements-schemaregistry.txt
-r requirements-avro.txt
-r requirements-protobuf.txt
-r requirements-json.txt
-r requirements-examples.txt
-r requirements-tests.txt
-r requirements-docs.txt
-r requirements-soaktest.txt
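The `-r` include lines above are a pip feature; setuptools' `file` directive cannot follow them (a limitation the reviewers raise elsewhere in this PR), so any tool that wants the flat list has to expand them itself. A hypothetical resolver sketch:

```python
import tempfile
from pathlib import Path


def flatten_requirements(path):
    """Recursively expand `-r other.txt` includes into a flat requirement list."""
    reqs = []
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if line.startswith("-r "):
            # Includes are resolved relative to the including file.
            reqs.extend(flatten_requirements(Path(path).parent / line[3:].strip()))
        else:
            reqs.append(line)
    return reqs


# Demo with a temporary layout mimicking requirements-all.txt.
with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    (root / "requirements-avro.txt").write_text("fastavro>=0.23.0\navro>=1.11.1,<2\n")
    (root / "requirements-all.txt").write_text("-r requirements-avro.txt\nrequests\n")
    flat = flatten_requirements(root / "requirements-all.txt")
    print(flat)  # ['fastavro>=0.23.0', 'avro>=1.11.1,<2', 'requests']
```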
@@ -1,3 +1,3 @@
fastavro>=0.23.0
Contributor comment: There was a problem with fastavro and 3.7 in versions higher than 1.8.0, but maybe it's been fixed in 1.8.1.

Suggested change:
-fastavro>=0.23.0
+fastavro < 1.8.0; python_version == "3.7"
+fastavro < 2; python_version > "3.7"
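The `python_version == "3.7"` suffixes in the suggestion above are PEP 508 environment markers: pip evaluates them against the running interpreter and skips requirements whose marker is false. Real tools use the `packaging` library for this; the helper below is a hypothetical toy evaluator for just this one marker form:

```python
import sys


def marker_applies(marker, python_version=None):
    """Evaluate a 'python_version <op> "X.Y"' marker (tiny PEP 508 subset)."""
    if python_version is None:
        python_version = "%d.%d" % sys.version_info[:2]
    ops = {
        "==": lambda a, b: a == b,
        ">":  lambda a, b: a > b,
        "<":  lambda a, b: a < b,
        ">=": lambda a, b: a >= b,
        "<=": lambda a, b: a <= b,
    }
    _, op, quoted = marker.split()
    target = quoted.strip('"')
    # Compare as version tuples, not strings, so "3.12" > "3.7" holds.
    key = lambda v: tuple(int(p) for p in v.split("."))
    return ops[op](key(python_version), key(target))


# 'fastavro < 1.8.0; python_version == "3.7"' applies only on 3.7:
print(marker_applies('python_version == "3.7"', "3.7"))   # True
print(marker_applies('python_version > "3.7"', "3.12"))   # True
print(marker_applies('python_version > "3.7"', "3.7"))    # False
```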

requests
avro>=1.11.1,<2
File renamed without changes.
File renamed without changes.
2 changes: 2 additions & 0 deletions requirements/requirements-json.txt
@@ -0,0 +1,2 @@
pyrsistent
jsonschema
1 change: 1 addition & 0 deletions requirements/requirements-protobuf.txt
@@ -0,0 +1 @@
protobuf
1 change: 1 addition & 0 deletions requirements/requirements-schemaregistry.txt
@@ -0,0 +1 @@
requests
File renamed without changes.
@@ -1,14 +1,17 @@
# core test requirements
urllib3<2.0.0;python_version<="3.7"
urllib3
Contributor comment, suggested change:

-urllib3
+urllib3 >= 2.0.0,<3; python_version > "3.7"

flake8
-pytest==4.6.9;python_version<="3.0"
-pytest>=6.0.0;python_version>="3.0"
+pytest
pytest-timeout
requests-mock
trivup>=0.8.3

# other requirements
-fastavro<1.8.0;python_version=="3.7"
-fastavro>=1.8.4;python_version>"3.7"
+fastavro
requests
avro>=1.11.1,<2
jsonschema
-protobuf
+protobuf>=3.6.0
Contributor comment: These dependencies would be duplicates; please remove them and add a requirements/requirements-tests-install.txt that one can use to install test dependencies plus the SR ones:

-r requirements-tests.txt
-r requirements-schemaregistry.txt
-r requirements-avro.txt
-r requirements-protobuf.txt
-r requirements-json.txt

Empty file added requirements/requirements.txt
Empty file.
72 changes: 2 additions & 70 deletions setup.py
@@ -1,41 +1,14 @@
#!/usr/bin/env python

import os
-from setuptools import setup, find_packages
+from setuptools import setup
from distutils.core import Extension
import platform

work_dir = os.path.dirname(os.path.realpath(__file__))
mod_dir = os.path.join(work_dir, 'src', 'confluent_kafka')
ext_dir = os.path.join(mod_dir, 'src')

-INSTALL_REQUIRES = [
-    'futures;python_version<"3.2"',
-    'enum34;python_version<"3.4"',
-]

-TEST_REQUIRES = [
-    'pytest==4.6.4;python_version<"3.0"',
-    'pytest;python_version>="3.0"',
-    'pytest-timeout',
-    'flake8'
-]

-DOC_REQUIRES = ['sphinx', 'sphinx-rtd-theme']

-SCHEMA_REGISTRY_REQUIRES = ['requests']

-AVRO_REQUIRES = ['fastavro>=0.23.0,<1.0;python_version<"3.0"',
-                 'fastavro>=1.0;python_version>"3.0"',
-                 'avro>=1.11.1,<2',
-                 ] + SCHEMA_REGISTRY_REQUIRES

-JSON_REQUIRES = ['pyrsistent==0.16.1;python_version<"3.0"',
-                 'pyrsistent;python_version>"3.0"',
-                 'jsonschema'] + SCHEMA_REGISTRY_REQUIRES

-PROTO_REQUIRES = ['protobuf'] + SCHEMA_REGISTRY_REQUIRES

# On Un*x the library is linked as -lrdkafka,
# while on windows we need the full librdkafka name.
if platform.system() == 'Windows':
@@ -52,45 +25,4 @@
os.path.join(ext_dir, 'AdminTypes.c'),
os.path.join(ext_dir, 'Admin.c')])


-def get_install_requirements(path):
-    content = open(os.path.join(os.path.dirname(__file__), path)).read()
-    return [
-        req
-        for req in content.split("\n")
-        if req != '' and not req.startswith('#')
-    ]

-trove_classifiers = [
-    'Development Status :: 5 - Production/Stable',
-    'Intended Audience :: Developers',
-    'License :: OSI Approved :: Apache Software License',
-    'Programming Language :: Python',
-    'Programming Language :: Python :: 2.7',
-    'Programming Language :: Python :: 3',
-    'Topic :: Software Development :: Libraries :: Python Modules',
-]

-setup(name='confluent-kafka',
-      # Make sure to bump CFL_VERSION* in confluent_kafka/src/confluent_kafka.h
-      # and version in docs/conf.py.
-      version='2.6.0',
-      description='Confluent\'s Python client for Apache Kafka',
-      author='Confluent Inc',
-      author_email='support@confluent.io',
-      url='https://github.com/confluentinc/confluent-kafka-python',
-      ext_modules=[module],
-      packages=find_packages('src'),
-      package_dir={'': 'src'},
-      data_files=[('', [os.path.join(work_dir, 'LICENSE.txt')])],
-      install_requires=INSTALL_REQUIRES,
-      classifiers=trove_classifiers,
-      extras_require={
-          'schema-registry': SCHEMA_REGISTRY_REQUIRES,
-          'avro': AVRO_REQUIRES,
-          'json': JSON_REQUIRES,
-          'protobuf': PROTO_REQUIRES,
-          'dev': TEST_REQUIRES + AVRO_REQUIRES,
-          'doc': DOC_REQUIRES + AVRO_REQUIRES
-      })
+setup(ext_modules=[module])
2 changes: 0 additions & 2 deletions src/confluent_kafka/requirements.txt

This file was deleted.

6 changes: 0 additions & 6 deletions src/confluent_kafka/schema_registry/requirements.txt

This file was deleted.

2 changes: 1 addition & 1 deletion src/confluent_kafka/src/confluent_kafka.h
@@ -36,7 +36,7 @@


/**
- * @brief confluent-kafka-python version, must match that of setup.py.
+ * @brief confluent-kafka-python version, must match that of pyproject.toml.
*
* Hex version representation:
* 0xMMmmRRPP
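The header comment above describes librdkafka-style hex versions: one byte each for major (MM), minor (mm), and revision (RR), plus a final byte (PP). A small illustrative helper for packing a version string this way; the `0xff` default for the final byte is an assumption borrowed from librdkafka's convention for final releases, not something this PR states:

```python
def cfl_hex_version(version, prerelease_byte=0xff):
    """Pack an 'M.m.R' version string into 0xMMmmRRPP form."""
    major, minor, revision = (int(p) for p in version.split("."))
    return (major << 24) | (minor << 16) | (revision << 8) | prerelease_byte


print(hex(cfl_hex_version("2.6.1")))  # 0x20601ff
```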
5 changes: 2 additions & 3 deletions tests/README.md
@@ -18,9 +18,8 @@ A python3 env suitable for running tests:

$ python3 -m venv venv_test
$ source venv_test/bin/activate
-$ pip install -r tests/requirements.txt
-$ python setup.py build
-$ python setup.py install
+$ pip install -r requirements/requirements-tests.txt
+$ python3 -m pip install .
Contributor comment, suggested change:

-$ pip install -r requirements/requirements-tests.txt
-$ python3 -m pip install .
+$ python3 -m pip install .[tests]


When you're finished with it:

4 changes: 1 addition & 3 deletions tests/soak/bootstrap.sh
@@ -45,9 +45,7 @@ source $venv/bin/activate

pip install -U pip

-pip install -v .
-
-pip install -r tests/soak/requirements.txt
+pip install -v .[soaktest]

popd # ..python

2 changes: 0 additions & 2 deletions tests/soak/build.sh
@@ -32,8 +32,6 @@ set -u
pushd confluent-kafka-python
git fetch --tags
git checkout $cflpy_version
-python3 setup.py clean -a
-python3 setup.py build
python3 -m pip install .
popd

29 changes: 14 additions & 15 deletions tools/RELEASE.md
@@ -140,7 +140,7 @@ RCs, so it only needs to be set once for each release.
* `src/confluent_kafka/src/confluent_kafka.h`
update both `CFL_VERSION` and `CFL_VERSION_STR`.
* `docs/conf.py` - change `version` variable.
-* `setup.py` - change `version` argument to `setup()`.
+* `pyproject.toml` - change `version` field.

Commit these changes with a commit-message containing the version:

@@ -290,30 +290,29 @@ With the PR merged to master, check out and update master:
Now go back to 5.1 and start the final RELEASE ITERATION.


-### 5.6. Upload wheel packages to PyPi
+### 5.6. Create source distribution

-**CANDIDATE ITERATION:** To upload binary packages to test.pypi.org, use:
+When creating the source packages make sure to have checked out the correct tag,
+that you do not have any uncommitted modifications, and that the `dist/`
+directory is empty.

-    $ twine upload -r test dl-v0.11.4rc1/*
+    $ python -m build -s

-**RELEASE ITERATION:** To upload binary packages to the proper pypi.org (WARNING!), use:
+The above command will create the necessary source distribution. Move the
+generated sdist file to the correct `tools/dl-<tag>` folder:

-    $ twine upload dl-v0.11.4rc1/*
+    $ mv dist/confluent-kafka-0.11.4rc1.tar.gz tools/dl-v0.11.4rc1/


-### 5.7. Upload source packages to PyPi
+### 5.7. Upload wheel packages and sdist to PyPi

-When uploading source packages make sure to have checked out the correct tag
-and that you do not have any uncommited modifications and that the `build/`
-directory is empty.

-**CANDIDATE ITERATION:** Upload source packages to test.pypi.org:
+**CANDIDATE ITERATION:** To upload binary packages to test.pypi.org, use:

-    $ python setup.py sdist upload -r test
+    $ twine upload -r test tools/dl-v0.11.4rc1/*

-**RELEASE ITERATION:** Upload source packages to the proper pypi.org (WARNING!):
+**RELEASE ITERATION:** To upload binary packages to the proper pypi.org (WARNING!), use:

-    $ python setup.py sdist upload
+    $ twine upload tools/dl-v0.11.4rc1/*


### 5.8. Verify installation from PyPi
2 changes: 1 addition & 1 deletion tools/build-manylinux.sh
@@ -15,7 +15,7 @@
# docker run -t -v $(pwd):/io quay.io/pypa/manylinux2010_x86_64:latest /io/tools/build-manylinux.sh <librdkafka_tag>

LIBRDKAFKA_VERSION=$1
-PYTHON_VERSIONS=("cp36" "cp37" "cp38" "cp39" "cp310" "cp311" "cp312" "cp313")
+PYTHON_VERSIONS=("cp37" "cp38" "cp39" "cp310" "cp311" "cp312" "cp313")

if [[ -z "$LIBRDKAFKA_VERSION" ]]; then
echo "Usage: $0 <librdkafka_tag>"