
Introduced pyproject.toml and moved static metadata from setup.py #1592

Merged · 5 commits · Oct 18, 2024
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -4,6 +4,7 @@

v2.6.1 is a maintenance release with the following fixes and enhancements:

- Migrated build system from `setup.py` to `pyproject.toml` in accordance with `PEP 517` and `PEP 518`, improving project configuration, build system requirements management, and compatibility with modern Python packaging tools like `pip` and `build`.
- Added an example for OAUTH OIDC producer with support for confluent cloud (#1769, @sarwarbhuiyan)

confluent-kafka-python is based on librdkafka v2.6.1, see the
8 changes: 4 additions & 4 deletions DEVELOPER.md
@@ -5,19 +5,19 @@ This document provides information useful to developers working on confluent-kaf

## Build

$ python setup.py build
$ python -m build

If librdkafka is installed in a non-standard location provide the include and library directories with:

$ C_INCLUDE_PATH=/path/to/include LIBRARY_PATH=/path/to/lib python setup.py ...
$ C_INCLUDE_PATH=/path/to/include LIBRARY_PATH=/path/to/lib python -m build

**Note**: On Windows the variables for Visual Studio are named INCLUDE and LIB

## Generate Documentation

Install sphinx and sphinx_rtd_theme packages:
Install docs dependencies:

$ pip install sphinx sphinx_rtd_theme
$ pip install .[docs]

Build HTML docs:

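Aside: the same wheel build that `python -m build` performs can also be driven from the `build` package's Python API. A minimal sketch, not part of this diff, assuming `build` is installed; unlike the CLI it runs in the current (non-isolated) environment, so setuptools, wheel and librdkafka must already be available:

```python
import os
from build import ProjectBuilder  # pip install build

# If librdkafka lives in a non-standard location, point the compiler at it
# (on Windows the equivalent variables are INCLUDE and LIB).
os.environ.setdefault("C_INCLUDE_PATH", "/path/to/include")
os.environ.setdefault("LIBRARY_PATH", "/path/to/lib")

builder = ProjectBuilder(".")             # project root containing pyproject.toml
wheel = builder.build("wheel", "dist/")   # compiles the C extension, writes dist/*.whl
print(f"built {wheel}")
```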
File renamed without changes.
4 changes: 2 additions & 2 deletions MANIFEST.in
@@ -1,4 +1,4 @@
include README.md
include LICENSE.txt
include test-requirements.txt
include src/confluent_kafka/src/*.[ch]
prune tests
prune docs
3 changes: 2 additions & 1 deletion Makefile
@@ -5,7 +5,8 @@ all:


clean:
python setup.py clean
pip cache purge
rm -rf dist
make -C docs clean

.PHONY: docs
8 changes: 0 additions & 8 deletions docs/conf.py
@@ -13,14 +13,6 @@
# serve to show the default.

import sphinx_rtd_theme
import sys
import os
from glob import glob

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
sys.path[:0] = [os.path.abspath(x) for x in glob('../build/lib.*')]

# -- General configuration ------------------------------------------------

8 changes: 3 additions & 5 deletions examples/README.md
@@ -30,8 +30,8 @@ To setup a venv with the latest release version of confluent-kafka and dependenc
```
$ python3 -m venv venv_examples
$ source venv_examples/bin/activate
$ cd examples
$ pip install -r requirements.txt
$ pip install confluent_kafka
$ pip install -r requirements/requirements-examples.txt
```

To setup a venv that uses the current source tree version of confluent_kafka, you
@@ -42,9 +42,7 @@ need to have a C compiler and librdkafka installed
```
$ python3 -m venv venv_examples
$ source venv_examples/bin/activate
$ python setup.py develop
$ cd examples
$ pip install -r requirements.txt
$ pip install .[examples]
```

When you're finished with the venv:
7 changes: 4 additions & 3 deletions examples/docker/Dockerfile.alpine
@@ -73,9 +73,10 @@ RUN \
mkdir -p /usr/src/confluent-kafka-python && \
cd /usr/src/confluent-kafka-python && \
rm -rf build && \
python3 setup.py clean -a && \
python3 setup.py build && \
python3 setup.py install && \
rm -rf dist && \
python3 -m pip install build && \
python3 -m build && \
python3 -m pip install dist/confluent_kafka*whl && \
cd / && \
rm -rf /usr/src/confluent-kafka-python

67 changes: 67 additions & 0 deletions pyproject.toml
@@ -0,0 +1,67 @@
[build-system]
requires = [ "setuptools>=62", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "confluent-kafka"
version = "2.6.0.post1.dev1"
description = "Confluent's Python client for Apache Kafka"
classifiers = [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Topic :: Software Development :: Libraries :: Python Modules"]
readme = "README.md"
license = { file = "LICENSE" }
requires-python = ">=3.7"
dynamic = ["dependencies", "optional-dependencies"]

[[project.authors]]
name = "Confluent Inc."
email = "support@confluent.io"

[project.urls]
Homepage = "https://github.com/confluentinc/confluent-kafka-python"

[tool.setuptools]
include-package-data = false

[tool.setuptools.dynamic]
dependencies = {file = ["requirements/requirements.txt"]}
optional-dependencies.schemaregistry = { file = ["requirements/requirements-schemaregistry.txt"] }
optional-dependencies.avro = { file = ["requirements/requirements-avro.txt", "requirements/requirements-schemaregistry.txt"] }
optional-dependencies.json = { file = ["requirements/requirements-json.txt", "requirements/requirements-schemaregistry.txt"] }
optional-dependencies.protobuf = { file = ["requirements/requirements-protobuf.txt", "requirements/requirements-schemaregistry.txt"] }
optional-dependencies.dev = { file = [
"requirements/requirements-docs.txt",
"requirements/requirements-examples.txt",
"requirements/requirements-tests.txt",
"requirements/requirements-schemaregistry.txt",
"requirements/requirements-avro.txt",
"requirements/requirements-json.txt",
"requirements/requirements-protobuf.txt"] }
optional-dependencies.docs = { file = [
"requirements/requirements-docs.txt",
"requirements/requirements-schemaregistry.txt",
"requirements/requirements-avro.txt",
"requirements/requirements-json.txt",
"requirements/requirements-protobuf.txt"] }
optional-dependencies.tests = { file = [
"requirements/requirements-tests.txt",
"requirements/requirements-schemaregistry.txt",
"requirements/requirements-avro.txt",
"requirements/requirements-json.txt",
"requirements/requirements-protobuf.txt"] }
optional-dependencies.examples = { file = ["requirements/requirements-examples.txt"] }
optional-dependencies.soaktest = { file = ["requirements/requirements-soaktest.txt"] }
optional-dependencies.all = { file = [
"requirements/requirements-soaktest.txt",
"requirements/requirements-docs.txt",
"requirements/requirements-examples.txt",
"requirements/requirements-tests.txt",
"requirements/requirements-schemaregistry.txt",
"requirements/requirements-avro.txt",
"requirements/requirements-json.txt",
"requirements/requirements-protobuf.txt"] }
9 changes: 9 additions & 0 deletions requirements/requirements-all.txt
@@ -0,0 +1,9 @@
-r requirements.txt
-r requirements-schemaregistry.txt
-r requirements-avro.txt
-r requirements-protobuf.txt
-r requirements-json.txt
-r requirements-examples.txt
-r requirements-tests.txt
-r requirements-docs.txt
-r requirements-soaktest.txt
4 changes: 4 additions & 0 deletions requirements/requirements-avro.txt
@@ -0,0 +1,4 @@
fastavro < 1.8.0; python_version == "3.7"
fastavro < 2; python_version > "3.7"
requests
avro>=1.11.1,<2
File renamed without changes.
File renamed without changes.
2 changes: 2 additions & 0 deletions requirements/requirements-json.txt
@@ -0,0 +1,2 @@
pyrsistent
jsonschema
1 change: 1 addition & 0 deletions requirements/requirements-protobuf.txt
@@ -0,0 +1 @@
protobuf
1 change: 1 addition & 0 deletions requirements/requirements-schemaregistry.txt
@@ -0,0 +1 @@
requests
File renamed without changes.
5 changes: 5 additions & 0 deletions requirements/requirements-tests-install.txt
@@ -0,0 +1,5 @@
-r requirements-tests.txt
-r requirements-schemaregistry.txt
-r requirements-avro.txt
-r requirements-protobuf.txt
-r requirements-json.txt
8 changes: 8 additions & 0 deletions requirements/requirements-tests.txt
@@ -0,0 +1,8 @@
# core test requirements
urllib3<2.0.0;python_version<="3.7"
urllib3 >= 2.0.0,<3; python_version > "3.7"
flake8
pytest
pytest-timeout
requests-mock
trivup>=0.8.3
Empty file added requirements/requirements.txt
72 changes: 2 additions & 70 deletions setup.py
@@ -1,41 +1,14 @@
#!/usr/bin/env python

import os
from setuptools import setup, find_packages
from setuptools import setup
from distutils.core import Extension
import platform

work_dir = os.path.dirname(os.path.realpath(__file__))
mod_dir = os.path.join(work_dir, 'src', 'confluent_kafka')
ext_dir = os.path.join(mod_dir, 'src')

INSTALL_REQUIRES = [
'futures;python_version<"3.2"',
'enum34;python_version<"3.4"',
]

TEST_REQUIRES = [
'pytest==4.6.4;python_version<"3.0"',
'pytest;python_version>="3.0"',
'pytest-timeout',
'flake8'
]

DOC_REQUIRES = ['sphinx', 'sphinx-rtd-theme']

SCHEMA_REGISTRY_REQUIRES = ['requests']

AVRO_REQUIRES = ['fastavro>=0.23.0,<1.0;python_version<"3.0"',
'fastavro>=1.0;python_version>"3.0"',
'avro>=1.11.1,<2',
] + SCHEMA_REGISTRY_REQUIRES

JSON_REQUIRES = ['pyrsistent==0.16.1;python_version<"3.0"',
'pyrsistent;python_version>"3.0"',
'jsonschema'] + SCHEMA_REGISTRY_REQUIRES

PROTO_REQUIRES = ['protobuf'] + SCHEMA_REGISTRY_REQUIRES

# On Un*x the library is linked as -lrdkafka,
# while on windows we need the full librdkafka name.
if platform.system() == 'Windows':
@@ -52,45 +25,4 @@
os.path.join(ext_dir, 'AdminTypes.c'),
os.path.join(ext_dir, 'Admin.c')])


def get_install_requirements(path):
content = open(os.path.join(os.path.dirname(__file__), path)).read()
return [
req
for req in content.split("\n")
if req != '' and not req.startswith('#')
]


trove_classifiers = [
'Development Status :: 5 - Production/Stable',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Topic :: Software Development :: Libraries :: Python Modules',
]

setup(name='confluent-kafka',
# Make sure to bump CFL_VERSION* in confluent_kafka/src/confluent_kafka.h
# and version in docs/conf.py.
version='2.6.0',
description='Confluent\'s Python client for Apache Kafka',
author='Confluent Inc',
author_email='support@confluent.io',
url='https://github.com/confluentinc/confluent-kafka-python',
ext_modules=[module],
packages=find_packages('src'),
package_dir={'': 'src'},
data_files=[('', [os.path.join(work_dir, 'LICENSE.txt')])],
install_requires=INSTALL_REQUIRES,
classifiers=trove_classifiers,
extras_require={
'schema-registry': SCHEMA_REGISTRY_REQUIRES,
'avro': AVRO_REQUIRES,
'json': JSON_REQUIRES,
'protobuf': PROTO_REQUIRES,
'dev': TEST_REQUIRES + AVRO_REQUIRES,
'doc': DOC_REQUIRES + AVRO_REQUIRES
})
setup(ext_modules=[module])
3 changes: 0 additions & 3 deletions src/confluent_kafka/avro/requirements.txt

This file was deleted.

2 changes: 0 additions & 2 deletions src/confluent_kafka/requirements.txt

This file was deleted.

6 changes: 0 additions & 6 deletions src/confluent_kafka/schema_registry/requirements.txt

This file was deleted.

2 changes: 1 addition & 1 deletion src/confluent_kafka/src/confluent_kafka.h
@@ -36,7 +36,7 @@


/**
* @brief confluent-kafka-python version, must match that of setup.py.
* @brief confluent-kafka-python version, must match that of pyproject.toml.
*
* Hex version representation:
* 0xMMmmRRPP
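Since the version constant in `confluent_kafka.h` now has to track `pyproject.toml` rather than `setup.py`, a quick sanity check could compare the two at runtime. A sketch, assuming an installed package; post/dev suffixes such as `.post1.dev1` only appear in the packaging metadata:

```python
import confluent_kafka
from importlib.metadata import version

c_version, c_hex = confluent_kafka.version()   # version string and hex constant from the C extension
pkg_version = version("confluent-kafka")       # version declared in pyproject.toml

# The two should agree on the leading X.Y.Z part.
print(f"extension: {c_version} ({c_hex:#010x})")
print(f"package:   {pkg_version}")
```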
4 changes: 1 addition & 3 deletions tests/README.md
@@ -18,9 +18,7 @@ A python3 env suitable for running tests:

$ python3 -m venv venv_test
$ source venv_test/bin/activate
$ pip install -r tests/requirements.txt
$ python setup.py build
$ python setup.py install
$ python3 -m pip install .[tests]

When you're finished with it:

14 changes: 0 additions & 14 deletions tests/requirements.txt

This file was deleted.

4 changes: 1 addition & 3 deletions tests/soak/bootstrap.sh
@@ -45,9 +45,7 @@ source $venv/bin/activate

pip install -U pip

pip install -v .

pip install -r tests/soak/requirements.txt
pip install -v .[soaktest]

popd # ..python

2 changes: 0 additions & 2 deletions tests/soak/build.sh
@@ -32,8 +32,6 @@ set -u
pushd confluent-kafka-python
git fetch --tags
git checkout $cflpy_version
python3 setup.py clean -a
python3 setup.py build
python3 -m pip install .
popd
