
Migrate all packages to openssl3 and python 3.11 #5820

Merged: 86 commits, Aug 12, 2023

Conversation

@th0ma7 (Contributor) commented Jul 22, 2023

Description

Fixes #5800, #5818

Checklist

  • Build rule all-supported completed successfully
  • New installation of package completed successfully
  • Package upgrade completed successfully (Manually install the package again)
  • Package functionality was tested
  • Any needed documentation is updated/created

Type of change

  • Bug fix
  • New Package
  • Package update
  • Includes small framework changes
  • This change requires a documentation update (e.g. Wiki)

Package testing (x86_64)

Package migrated to spksrc.python.mk

  • python310 --> Not Applicable
  • python311 --> Not Applicable
  • bazarr
  • beets
  • borgbackup
  • deluge
  • duplicity
  • flexget
  • headphones --> Nothing to do
  • homeassistant --> Not Applicable (homeassistant v2023.1.7 must stay on Python 3.10)
  • mercurial
  • octoprint
  • plexpy-custom --> Nothing to do
  • rdiff-backup
  • rutorrent --> Nothing to do
  • sabnzbd
  • salt-master
  • salt-minion
  • sickchill
  • znc
  • tvheadend --> Nothing to do

Build rule all-supported completed successfully

  • python310
  • python311
  • bazarr
  • beets
  • borgbackup
  • deluge
  • duplicity
  • flexget
  • headphones --> Nothing to do
  • homeassistant
  • mercurial
  • octoprint
  • plexpy-custom --> Nothing to do
  • rdiff-backup
  • rutorrent --> Nothing to do
  • sabnzbd
  • salt-master
  • salt-minion
  • sickchill
  • znc
  • tvheadend --> To be tested through another PR

PUBLISHING DAY 1

  • deluge
  • rutorrent
  • sabnzbd
  • sickchill

PUBLISHING DAY 3

PUBLISHING any day from DAY 5

  • bazarr
  • beets
  • borgbackup - Borgbackup update v1.2.6 #5877
  • duplicity
  • flexget
  • headphones
  • mercurial
  • octoprint
  • plexpy-custom
  • rdiff-backup
  • salt-master - pending saltgui enablement PR ...
  • salt-minion - pending saltgui enablement PR ...
  • znc
  • vim

Note to self

How to get the list of supported language standards:
https://stackoverflow.com/questions/34836775/compiler-standards-support-c11-c14-c17

$ gcc -v --help 2> /dev/null | sed -n '/^ *-std=\([^<][^ ]\+\).*/ {s//\1/p}'
...
c++0x
c++11
c++14
c++17
c++1y
c++1z
c++2a
c++98
...

@th0ma7 requested a review from hgy59 on July 22, 2023 20:09
@hgy59 (Contributor) commented Jul 22, 2023

@th0ma7 homeassistant can't be updated to python311 without updating homeassistant to at least 2023.5.x.
This is already WIP in #5757.

@hgy59 (Contributor) commented Jul 22, 2023

@th0ma7 octoprint needs to be updated to at least v1.9.0 to work with python311 (I already have a local build with octoprint 1.9.0, but the latest is v1.9.2).

@th0ma7 (Contributor, Author) commented Jul 23, 2023

@th0ma7 homeassistant can't be updated to python311 without updating homeassistant to at least 2023.5.x. This is already WIP in #5757.

Thanks for the heads-up; in the meantime I'll re-map it to py310 but enforce the newer openssl3 version. We can check afterwards whether it's worth publishing or not, depending on your WIP PR.

@th0ma7 changed the title from "Migrate all packages to openssl3 and python 3.11" to "[WIP] Migrate all packages to openssl3 and python 3.11" on Jul 23, 2023
@hgy59 (Contributor) commented Jul 23, 2023

@th0ma7 found that cryptography < 40.0 has a static binding to openssl, and the prebuilt wheels of version 38.0.3 are linked to openssl 1.1.

currently trying to build cryptography==38.0.3 with cross/cryptography_38 and cross/openssl3 for homeassistant 2023.1.7.

just noting that with cryptography==39.0.0, openssl 1.1 support was dropped:
BACKWARDS INCOMPATIBLE: Support for OpenSSL 1.1.0 has been removed. Users on older version of OpenSSL will need to upgrade.

EDIT: oops - OpenSSL 1.1.0 has been dropped, but not OpenSSL 1.1.1
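
For what it's worth, a quick way to check which OpenSSL a prebuilt cryptography wheel was linked against is to grep the statically linked extension for version strings (module path and output below are illustrative; the exact .so name varies between cryptography versions):

$ strings site-packages/cryptography/hazmat/bindings/_rust.abi3.so | grep -E '^OpenSSL [0-9]'
OpenSSL 1.1.1u  30 May 2023

A wheel built against cross/openssl3 should report a 3.x version string there instead.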

Review comment on spk/homeassistant/Makefile (outdated, resolved)
@hgy59 (Contributor) commented Jul 23, 2023

@th0ma7 headphones has a known issue with python 3.11
I got the same error as in rembo10/headphones#3320

@th0ma7 (Contributor, Author) commented Jul 24, 2023

headphones has a known issue with python 3.11 I got the same error as in rembo10/headphones#3320

@hgy59 Reading through the thread it appears that the develop branch has a working fix for it. So I simply switched to this branch and at first glance things appear to be working well. Can you confirm on your end?

@hgy59 (Contributor) commented Jul 24, 2023

@th0ma7 sorry, I missed that you already fixed the codec2 URL...

@hgy59 (Contributor) commented Jul 24, 2023

@th0ma7 fixed protobuf wheel for duplicity.
The build (github action in your fork) still fails for arch-hi3535-6.2.4 and arch-88f6281-6.2.4.

This is strange since my local build successfully builds all archs (except OLD_PPC_ARCHS) like:

  • duplicity_hi3535-6.2.4_1.2.3-10.spk
  • duplicity_88f6281-6.2.4_1.2.3-10.spk
  • duplicity_88f6281-5.2_1.2.3-10.spk

for OLD_PPC_ARCHS even Python 3.11 fails to build with configure warning:

Platform "powerpc-none-linux-gnuspe" with compiler "gcc" is not supported by the
CPython core team, see https://peps.python.org/pep-0011/ for more information.

@th0ma7 (Contributor, Author) commented Jul 25, 2023

@hgy59 I've hit a really interesting case where greenlet >= 2.x would fail using gcc-4.9 at first attempt but work on the second try when resuming the build process?!

It happens that it had worked for DSM7 using the newer gcc, and resuming the DSM6 build re-used the pip cache, thus finding the already-built wheel.

In this specific case, greenlet also needed -fpermissive to work correctly on gcc-4.9.
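
For reference, a standalone equivalent of that workaround (a sketch only; it assumes the greenlet setuptools build folds CFLAGS into its C/C++ compile lines, and the toolchain prefix is illustrative) would look like:

$ CC=arm-unknown-linux-gnueabi-gcc CFLAGS="-O2 -fpermissive" \
  pip wheel --no-binary :all: --no-deps --wheel-dir ./wheelhouse greenlet==2.0.1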

@th0ma7 (Contributor, Author) commented Jul 25, 2023

fixed protobuf wheel for duplicity. The build (github action in your fork) still fails for arch-hi3535-6.2.4 and arch-88f6281-6.2.4.

@hgy59 thanks for catching that.

Note that with that train of thought, I've added a section in the top summary for build all-supported to confirm all modified packages build as expected for all archs. If you have a few spare cycles, feel free to test-build a few. In the meantime I'm working on the last bit (e.g. salt-*) in hope of getting it to work (confirmed to be compatible, just missing the exact howto)... Many thanks in advance (and if others want to chime in and help, much appreciated 😃 )

EDIT: Honestly I'd remove salt-* entirely as it is a pain to maintain... while well documented, adequate application startup is pure black magic to me... Also, I doubt there are any active users, based on the lack of issues reported.

@hgy59 (Contributor) commented Jul 25, 2023

@hgy59 I've hit a really interesting case where greenlet >= 2.x would fail using gcc-4.9 at first attempt but work on the second try when resuming the build process?!

It happens that it had worked for DSM7 using the newer gcc, and resuming the DSM6 build re-used the pip cache, thus finding the already-built wheel.

That was my finding too!
Finally all archs worked (except qoriq). I thought that greenlet 2.0.1 failed but greenlet 2.0.2 succeeded (but flexget requires greenlet==2.0.1)

BTW I have updated flexget to 3.7.10 too, but was delayed by validating whether the transmission integration works, and found that the wheel installation does not work as expected (cross-env wheels are not taken from the package but downloaded from the index, and the pip cache is ignored on updates with DSM 6 due to the wrong user [install runs as root, but all folders under var are changed to sc-flexget]).

@hgy59 (Contributor) commented Jul 25, 2023

Note that with that train of thought, I've added a section in the top summary for build all-supported to confirm all modified packages build as expected for all archs. If you have a few spare cycles, feel free to test-build a few.

It just popped up that it would be very useful if we could reuse a previously built python (incl. crossenv) when building packages with wheels. Otherwise you need to build python for each package and arch...

@hgy59 (Contributor) commented Jul 25, 2023

ffmpeg et al. fail to build as nasm.us is currently down (https://updownradar.com/status/nasm.us)

EDIT: and we are not the only one (microsoft/vcpkg#32741)

@th0ma7 (Contributor, Author) commented Jul 25, 2023

Note that with that train of thought, I've added a section in the top summary for build all-supported to confirm all modified packages build as expected for all archs. If you have a few spare cycles, feel free to test-build a few.

It just popped up that it would be very useful if we could reuse a previously built python (incl. crossenv) when building packages with wheels. Otherwise you need to build python for each package and arch...

That is something I'd really, really like doing, similar to ffmpeg... That would save so many build cycles.

I'll tentatively give that a shot and see if I can make it work. It's the perfect PR for doing that...

@th0ma7 (Contributor, Author) commented Jul 25, 2023

@hgy59 I've hit a really interesting case where greenlet >= 2.x would fail using gcc-4.9 at first attempt but work on the second try when resuming the build process?!

It happens that it had worked for DSM7 using the newer gcc, and resuming the DSM6 build re-used the pip cache, thus finding the already-built wheel.

That was my finding too!
Finally all archs worked (except qoriq). I thought that greenlet 2.0.1 failed but greenlet 2.0.2 succeeded (but flexget requires greenlet==2.0.1)

Qoriq was working yesterday on my build. It's hi3535 that was not.

BTW I have updated flexget to 3.7.10 too, but was delayed by validating whether the transmission integration works, and found that the wheel installation does not work as expected (cross-env wheels are not taken from the package but downloaded from the index, and the pip cache is ignored on updates with DSM 6 due to the wrong user [install runs as root, but all folders under var are changed to sc-flexget]).

Crossenv is used locally, but mainly on non-x86_64 archs and for wheels that don't provide a pre-built download option. Otherwise you are right, it fetches them all.

For DSM6, I wonder if a chown hook somewhere couldn't fix that?
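
A chown hook would just need to restore the package user's ownership of the var tree before pip runs; a minimal sketch, with the user, group and path taken from the flexget example above as placeholders:

# hypothetical DSM6 service-setup/postupgrade snippet (user/group and path are placeholders)
chown -R sc-flexget:sc-flexget /var/packages/flexget/var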

@th0ma7 (Contributor, Author) commented Jul 25, 2023

@hgy59 I got the basics of preparing the build environment working, using salt-master as a prototype, in order to always re-use the spk/python311 pre-built libraries and crossenv. Currently, for an unknown reason, it completes the preparation portion then stops, although the working directory shows that it is all set to work.

I'll have more cycles tonight to further look into it, but feel free to dig-in a little in the meantime if you have spare cycles.

Comment on lines 14 to 32
SPK_DEPENDS := "python$(PYTHON_VERSION)>=$(shell sed -n 's/^SPK_VERS = \(.*\)/\1/p' $(shell pwd)/../python$(PYTHON_VERSION)/Makefile)-$(shell sed -n 's/^SPK_REV = \(.*\)/\1/p' $(shell pwd)/../python$(PYTHON_VERSION)/Makefile)"

Contributor:

IMHO SPK_DEPENDS must be defined by the respective package.

This would not work for packages that depend on more than just python (like SPK_DEPENDS = "python311:git").
And it is probably not desirable to include the python package revision in the dependency.

Contributor Author:

I have a different view on this, or at least food for thought... I'm doing something in tvheadend that sort of solves this and would automate populating the SPK_DEPENDS variable value.

Something like the following in spksrc.python.mk would ensure python is always included in the SPK_DEPENDS:

PYTHON_DEPENDS = python$(PYTHON_VERSION)>=$(shell sed -n 's/^SPK_VERS = \(.*\)/\1/p' $(shell pwd)/../python$(PYTHON_VERSION)/Makefile)-$(shell sed -n 's/^SPK_REV = \(.*\)/\1/p' $(shell pwd)/../python$(PYTHON_VERSION)/Makefile)
# append to an existing SPK_DEPENDS, otherwise set it (strip tests for a non-empty value)
ifneq ($(strip $(SPK_DEPENDS)),)
SPK_DEPENDS := "$(SPK_DEPENDS):$(PYTHON_DEPENDS)"
else
SPK_DEPENDS = "$(PYTHON_DEPENDS)"
endif
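
To illustrate what the two sed calls extract (version and revision values below are made up), running them against a python311 Makefile gives something like:

$ sed -n 's/^SPK_VERS = \(.*\)/\1/p' ../python311/Makefile
3.11.4
$ sed -n 's/^SPK_REV = \(.*\)/\1/p' ../python311/Makefile
7

which would expand PYTHON_DEPENDS to python311>=3.11.4-7.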

Three further review comments on mk/spksrc.python.mk (outdated, resolved)
@hgy59 (Contributor) commented Jul 25, 2023

@th0ma7 I have a working version of spksrc.python.mk.

It just needs some verification...

@hgy59 (Contributor) commented Jul 25, 2023

Verification done.

It needed some investigation, as the *.so files in the cross-compiled wheels have binary differences when reusing the prebuilt python.

The difference is caused by a different RPATH in the .so files (i.e. the library path).
Tested with spk/octoprint:

  • regular build: RPATH [/var/packages/octoprint/target/lib]
  • built with reuse of python311: RPATH [/var/packages/octoprint/target/lib:/var/packages/python311/target/lib]

IMHO this is not an issue that must be fixed since the package folder has precedence over the python311 folder.
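
For reference, the RPATH of a cross-compiled extension can be checked with readelf (the module name and output formatting below are illustrative):

$ readelf -d some_module.cpython-311-aarch64-linux-gnu.so | grep -E 'RPATH|RUNPATH'
 0x000000000000000f (RPATH)  Library rpath: [/var/packages/octoprint/target/lib:/var/packages/python311/target/lib]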

@hgy59 (Contributor) commented Jul 25, 2023

IMHO this is not an issue that must be fixed since the package folder has precedence over the python311 folder.

There is a possible source of runtime errors for cross compiled wheels:
Due to the python wheel cache (in distrib/pip/wheels) we must ensure that there is no re-use of wheels built for another package, otherwise those will fail to find dependent libraries (like libssl.so) due to the wrong RPATH.
As long as only one package is built by github action, this is not an issue.
But for local builds it only works when the pip wheel cache is cleared for every new package built with (cross) wheels.
And when the github build action builds multiple packages with cross compiled wheels, the pip wheel cache must be cleared for each package.

The only solution is to disable the pip wheel cache (we can keep the cache for source files in distrib/pip/http)
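
Concretely, clearing only the built-wheels part of the cache between package builds (assuming the distrib/pip/wheels and distrib/pip/http layout described above) would keep the source cache while avoiding RPATH carry-over, e.g.:

$ rm -rf distrib/pip/wheels    # discard wheels built for the previous package
# distrib/pip/http (downloaded source archives) is left untouched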

@th0ma7 any other suggestion?

@hgy59 (Contributor) commented Aug 15, 2023

or just drop python311 for ARMv5 (as we did for OLD_PPC_ARCHS)

@th0ma7 (Contributor, Author) commented Aug 15, 2023

@hgy59 honestly, I'm thinking along the same lines: just go ahead with deleting the armv5 packages online and marking them as such in python311.

I'd push things even a bit further, declaring qoriq as deprecated as well from a py311 standpoint. Unless you (or @SynoCommunity/developers) have one close by that we could try to isolate the segfault core dump from?

@Safihre (Contributor) commented Aug 16, 2023

We should then also mark all Python packages (like SABnzbd) as not supporting those archs anymore.

@hgy59 (Contributor) commented Aug 16, 2023

has one close by that we could try to isolate the segfault core dump from?

IMO a segfault indicates that the bigendian arch is not considered by the source of the affected library.
My single PPC (bigendian) model is DS210+ with ppc853x-5.2, which has the same issue on openssl3 for python311 as ARMv7 (lacking libatomic due to gcc < 4.8).

The only models with qoriq are DS213+ and DS413.


I like to support old models, as it helps to keep those usable and reduces resource waste...
As long as code does not depend on kernel versions, we might get further by creation of our own toolchains with support for newer gcc (follow up of #5225 and #4469) - and we should have a look at crosstool-ng.

Or we reinvest into chroot packages (at least for models without docker support) - but this might not be a solution for (32-bit) PPC archs (and ARMv5 was only supported until debian 9 aka stretch with EOL 2022.06)

@th0ma7 (Contributor, Author) commented Aug 16, 2023

Available options for armv5 are:

  1. add an if/else to fall back to openssl v1 and keep python311 packages supported until openssl v1 gets deprecated (which is due September 11th 2023, almost tomorrow), and perhaps even afterwards with a message in the changelog stating that it is now unsupported, or that this is the last version to be supplied for ARMv5?
  2. mark it as deprecated right away, delete the packages online, and add unsupported flags to python311

As for qoriq, the options relative to sabnzbd are (I presume this is cryptography related):

  1. Find where the issue is? FYI, endianness looks OK:
$ file ./Cheetah/_namemapper.cpython-311-powerpc-linux-gnuspe.so ./cryptography/hazmat/bindings/_rust.abi3.so
./Cheetah/_namemapper.cpython-311-powerpc-linux-gnuspe.so: ELF 32-bit MSB shared object, PowerPC or cisco 4500, version 1 (SYSV), dynamically linked, with debug_info, not stripped
./cryptography/hazmat/bindings/_rust.abi3.so:              ELF 32-bit MSB shared object, PowerPC or cisco 4500, version 1 (SYSV), dynamically linked, with debug_info, not stripped
  2. similar to armv5 for python311: if/else, but this time relative to cryptography, to keep using the older C version? And for this one as well, mark it as the last version to be supplied for qoriq
  3. mark it as deprecated right away, delete the packages online, and add unsupported flags to cross/cryptography

As for supporting a newer gcc, I've tried that and came to the conclusion that to build our own cross-compiler we would have to restart from scratch and go through the stage1 and stage2 builds. It's a hell of a road to get to something that works. crosstool-ng may be a lot easier to use?

@hgy59 (Contributor) commented Aug 16, 2023

@th0ma7
I am trying to build homeassistant (currently 2023.8.2) that depends on Pillow==10.0.0 and I get the following error:

===>  _PYTHON_HOST_PLATFORM=aarch64-unknown-linux-gnueabi /spksrc/spk/python311/work-aarch64-6.2.4/crossenv/cross/bin/pip wheel --disable-pip-version-check --no-binary :all: --find-links /spksrc/spk/homeassistant/../../distrib/pip --cache-dir /spksrc/spk/homeassistant/work-aarch64-6.2.4/pip --no-deps --wheel-dir /spksrc/spk/homeassistant/work-aarch64-6.2.4/wheelhouse --no-index --global-option=build_ext --global-option=--disable-platform-guessing --global-option=--enable-freetype --global-option=--enable-jpeg --global-option=--enable-zlib --no-use-pep517 --no-build-isolation Pillow==10.0.0
DEPRECATION: --build-option and --global-option are deprecated. pip 23.3 will enforce this behaviour change. A possible replacement is to use --config-settings. Discussion can be found at https://github.com/pypa/pip/issues/11859
WARNING: Implying --no-binary=:all: due to the presence of --build-option / --global-option.
Looking in links: /spksrc/spk/homeassistant/../../distrib/pip
Processing /spksrc/distrib/pip/Pillow-10.0.0.tar.gz
ERROR: Disabling PEP 517 processing is invalid: project specifies a build backend of backend in pyproject.toml

If I remove the following line in spksrc.wheel.mk:

         pip_global_option=$${pip_global_option}" --no-use-pep517" ; \

then the cross build of the wheel does not work. It takes headers from the build system instead of the toolchain:
/usr/include/bits/floatn.h:87:9: error: unknown type name ‘__float128’

Any hint?

@th0ma7 (Contributor, Author) commented Aug 16, 2023

Yup, I hit that same issue a few weeks ago while updating python packages. I had to stick with the previous 9.x version as it didn't work.

My investigations led me to exactly the same point. I would argue for either trying it out from cross and/or opening a ticket on their github project page...

@hgy59 (Contributor) commented Aug 16, 2023

My investigations led me to exactly the same point. I would argue for either trying it out from cross and/or opening a ticket on their github project page...

Thanks, I prefer to stay on homeassistant 2023.7.3 for the moment.

hgy59 added a commit to hgy59/spksrc that referenced this pull request Aug 19, 2023
- update homeassistant to v2023.7.3
- update to python311 with openssl3
- this PR now depends on SynoCommunity#5820 to be merged first
hgy59 added a commit that referenced this pull request Aug 20, 2023
* homeassistant: update to v2023.5.4
- update homeassistant to v2023.5.4
- update HACS integration to v1.32.1
- update DTLSSocket to v0.1.15
- update and adjust requirements

* fix build of greenlet wheel for older compilers
* fix build of protobuf wheel for older compilers

* fix build of webrtcvad for arch-qoriq-6.1
- thanks to @th0ma7 for wiseman/py-webrtcvad#78
- thanks to @smaarn for wiseman/py-webrtcvad@3bd7613

* homeassistant: update to v2023.7.3
- update homeassistant to v2023.7.3
- update to python311 with openssl3
- this PR now depends on #5820 to be merged first

* fix some integrations
- add cross/libstdc++ to fix grpcio for DSM < 7
- add cross/opus to fix voice over ip integration
- fix Google Generative AI Conversation integration (requires grpcio fix and google_generativeai)

* migrate to PYTHON_PACKAGE
@smaarn mentioned this pull request Aug 27, 2023
@th0ma7 mentioned this pull request Sep 2, 2023
@th0ma7 (Contributor, Author) commented Sep 2, 2023

I am trying to build homeassistant (currently 2023.8.2) that depends on Pillow==10.0.0 and I get the following error:

@hgy59 I've looked at this and may have found a few interesting bits... some of it pertains to the next release of numpy, which will have to migrate to scikit-build-core (which in turn uses cmake) or meson-python (which uses meson+ninja). It's totally unclear how to pass our cross-environment files to them so they use the Synology toolchain properly.

Building python wheels is a mixed bag of magic, voodoo and luck... it seems that starting with newer pillow and numpy we just got luckier :)

@hgy59 (Contributor) commented Sep 8, 2023

Yup, I hit that same issue a few weeks ago while updating python packages. I had to stick with the previous 9.x version as it didn't work.

My investigations led me to exactly the same point. I would argue for either trying it out from cross and/or opening a ticket on their github project page...

I tried again to build Pillow==10.0.0.
It is as easy as removing --no-use-pep517 from pip_global_option in spksrc.wheel.mk.

The error message says that this option is not valid when building with pyproject.toml:

ERROR: Disabling PEP 517 processing is invalid: project specifies a build backend of backend in pyproject.toml

I think we should fully rely on PEP 517 and remove the --no-use-pep517 option when building wheels.

@th0ma7 (Contributor, Author) commented Sep 8, 2023

There will be some wheels that will refuse to build. I recall when I added it initially there were many failures.

Although the situation may have changed since then, and only a few may still fail, for which we could then add the option (i.e. disabling pep517) on a per-wheel basis.
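
A per-wheel opt-out could stay very close to plain pip, i.e. only wheels that still need a legacy setup.py build keep the flag while everything else goes through PEP 517 (package names below are placeholders):

$ pip wheel --no-deps Pillow==10.0.0                   # pyproject.toml backend, PEP 517 required
$ pip wheel --no-deps --no-use-pep517 legacy-package   # placeholder for a wheel still needing setup.py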

@hgy59 (Contributor) commented Sep 9, 2023

There will be some wheels that will refuse to build. I recall when I added it initially there were many failures.

Although the situation may have changed since then, and only a few may still fail, for which we could then add the option (i.e. disabling pep517) on a per-wheel basis.

OK, one issue I found with PEP517:

      In file included from /usr/include/stdlib.h:55:0,
                       from /spksrc/spk/python311/work-armv7-6.2.4/install/var/packages/python311/target/include/python3.11/Python.h:23,
                       from src/_imagingft.c:22:
      /usr/include/bits/floatn.h:75:1: error: unknown machine mode ‘__TC__’

Cross builds need investigation. The error above shows that it takes include files from the system, instead of the toolchain.
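
One way to see which include directories a compiler actually searches (and to spot /usr/include leaking into a cross build) is to dump its preprocessor search list; the toolchain prefix below is illustrative:

$ arm-unknown-linux-gnueabi-gcc -E -Wp,-v -xc /dev/null 2>&1 | sed -n '/#include <...> search starts here:/,/End of search list./p'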

@th0ma7 (Contributor, Author) commented Sep 10, 2023

@hgy59 was that from pillow build or another one entirely?

@th0ma7 (Contributor, Author) commented Sep 10, 2023

@hgy59 if I recall you had looked into salt-master along with enabling the saltgui (which probably needs updating).

Is this something you have cycles to complete before I publish the salt-* packages? Otherwise I could probably publish them as-is, but I'm unsure of the value of that.

@th0ma7 mentioned this pull request Sep 10, 2023
@hgy59 (Contributor) commented Sep 11, 2023

@hgy59 was that from pillow build or another one entirely?

Yes, it is when building the Pillow==10.0.0 wheel for homeassistant with arch-armv7-6.2.4, without the --no-use-pep517 parameter in spksrc.wheel.mk.

The code in Python.h (from spk/python311/work-armv7-6.2.4/install/...) looks like this

// stdlib.h, stdio.h, errno.h and string.h headers are not used by Python
// headers, but kept for backward compatibility. They are excluded from the
// limited C API of Python 3.11.
#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000
#  include <stdlib.h>
#  include <stdio.h>              // FILE*
#  include <errno.h>              // errno
#  include <string.h>             // memcpy()
#endif

@mreid-tt (Contributor) commented:

@th0ma7, are there any other packages to publish from this? Just wondering if we should change the label to "published" or not.

@th0ma7 (Contributor, Author) commented Nov 22, 2023

@mreid-tt all packages have been migrated and published. The only remaining items relate to:

  1. the salt-* packages require attention. Re-using someone else's code I was able to migrate them from python2 to python3, then to python 3.11. But under the hood I ended up noticing they are missing the actual web gui enablement, which was never provided with the package (not even with python2). So I did introduce the web portion, which is now packaged, but I didn't enable it, as web apps aren't really an area I can contribute much to. I'd need someone else to look at that specifically, or we just call it a day and stop supporting it (as I doubt it ever fully worked without the web portal being enabled anyway?!?).
  2. figuring out how to get powerpc to work with rust, as otherwise, similarly to armv5, that arch will have to become unsupported. I've spent numerous hours on it and may be close to a result ... that will hopefully actually work.
  3. before too long we will have to reconcile how to manage (or not) the --no-use-pep517 flag using pip. As mentioned previously, saying that cross-compiling python wheels is a challenge is an understatement, as the Python community doesn't agree on or commit to a standard toolset (ref. https://lwn.net/Articles/924114/)

So long-story-short:

  1. salt-* needs someone else to look into it, or we decide to drop it
  2. if I can find a few more cycles I may have a tentative solution, probably a bit sloppy at first
  3. when @hgy59 looks into the next homeassistant update we'll need to revisit the --no-use-pep517 flag issue

@mreid-tt (Contributor) commented Nov 22, 2023

@th0ma7 thanks for the update. I've never used salt or dug very deeply into python packages but I have touched on a few with web gui components. If you have a branch or a PR that I can look at I can try to help.

For now based on your comments, I'll leave the labels on this PR as is.

@th0ma7 (Contributor, Author) commented Nov 22, 2023

@th0ma7 thanks for the update. I've never used salt or dug very deeply into python packages but I have touched on a few with web gui components. If you have a branch or a PR that I can look at I can try to help.

I don't have anything started on the salt packages. They are already migrated to python 3.11 and openssl3 as-is. If you do have spare cycles, you'll find the web portion as a dependency of the salt-gui makefile, if I recall. What's missing is enabling it and confirming that it actually "works" somehow.
