
Switch sickchill to use a pypi based install to match other distributions #5431

Closed
wants to merge 18 commits into from

Conversation

miigotu
Contributor

@miigotu miigotu commented Sep 14, 2022

Uses a PyPI-based wheel rather than a git install. This allows us to skip the cross step and match the Syno install with Windows and Linux.

Added the .idea folder to .gitignore because it always ends up in my projects from PyCharm =P

@ymartin59 what are your thoughts on us disabling the internal updater completely, and on me checking out spksrc, updating requirements-pure.txt with our newly released versions, and creating a PR to spksrc? That way, users never get a broken install inside a working spk install, and they get their updates through the Syno package manager as designed.

Maybe we could work on a template for other projects to do the same thing, so they can use it in their GitHub Actions and automate the process.

Signed-off-by: miigotu miigotu@gmail.com

Description

  • Switched to a wheel install rather than a problematic git install
  • Possibly requires documentation, guidance, and wiki updates to remove old suggestions about git resets, pulls, etc. Currently, if an update fails, a user can uninstall and reinstall the package (we haven't disabled the updater in SC for PyPI-based installs yet, but we might need to)
  • Fixes BROKEN because we removed old tags on SC.

Checklist

  • Build rule all-supported completed successfully
  • New installation of package completed successfully
  • Package upgrade completed successfully (Manually install the package again)
  • Package functionality was tested
  • Any needed documentation is updated/created

Type of change

  • Bug fix
  • New Package
  • Package update
  • Includes small framework changes
  • This change requires a documentation update (e.g. Wiki)

@BKSteve @sickchill/collaborators @sickchill/moderators

@BKSteve
Contributor

BKSteve commented Sep 15, 2022

I built this and it isn't grabbing all the pip install items before crashing with

Starting sickchill command /volume1/@appstore/sickchill/env/bin/SickChill --daemon --nolaunch --pidfile /volume1/@appdata/sickchill/sickchill.pid --config /volume1/@appdata/sickchill/data/config.ini --datadir /volume1/@appdata/sickchill/data
/var/packages/sickchill/scripts/start-stop-status: line 76: /volume1/@appstore/sickchill/env/bin/SickChill: No such file or directory
sickchill is not running

The error on line 76 is because the file doesn't exist.
Only activate, pip and python are in env/bin.

Only got pip and setuptools in env/lib/python3.10/site-packages/ too.

@miigotu
Contributor Author

miigotu commented Sep 15, 2022

Strange, AFAIK the wheels are supposed to be installed in post_install, by install_python_wheels on line 40 of the service setup script.

I copied the way they do it in the beets package.
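
For reference, a minimal sketch of what a wheelhouse-based post_install step usually boils down to; the paths mirror the ones in the log above, and the real install_python_wheels helper in spksrc may do more than this:

WHEELHOUSE=/volume1/@appstore/sickchill/share/wheelhouse
# Install only from the wheels shipped inside the package, never from PyPI.
/volume1/@appstore/sickchill/env/bin/pip install --no-index \
    --find-links "$WHEELHOUSE" --requirement "$WHEELHOUSE/requirements.txt"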

@miigotu
Contributor Author

miigotu commented Sep 15, 2022

By the way, you don't need to build them manually. Go to the Actions tab and look at the run for this PR; the spk is in the artifacts for each arch. Download the zip.

@miigotu
Contributor Author

miigotu commented Sep 15, 2022

Maybe @publicarray or someone else knows what's going on.

@BKSteve
Contributor

BKSteve commented Sep 15, 2022

Same result with the x64 DSM 7 build from https://github.com/SynoCommunity/spksrc/actions/runs/3056555847

Further review shows an error in the packages: a requirements-pure issue.

2022/09/15 21:59:22	INFO: pip is looking at multiple versions of cachecontrol to determine which version is compatible with other requirements. This could take a while.
2022/09/15 21:59:22	ERROR: Cannot install -r /volume1/@appstore/sickchill/share/wheelhouse/requirements.txt (line 42), -r /volume1/@appstore/sickchill/share/wheelhouse/requirements.txt (line 77) and beekeeper-alt==2021.7.16 because these package versions have conflicting dependencies.
2022/09/15 21:59:22	The conflict is caused by:
2022/09/15 21:59:22	    The user requested beekeeper-alt==2021.7.16
2022/09/15 21:59:22	    kodipydent-alt 2021.7.16 depends on beekeeper-alt<2022.0.0 and >=2021.7.16
2022/09/15 21:59:22	    sickchill 2022.9.14 depends on beekeeper-alt>=2022.9.3
2022/09/15 21:59:22	To fix this you could try to:
2022/09/15 21:59:22	1. loosen the range of package versions you've specified
2022/09/15 21:59:22	2. remove package versions to allow pip attempt to solve the dependency conflict
2022/09/15 21:59:22	ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts

@th0ma7
Contributor

th0ma7 commented Sep 15, 2022

I may be able to give it a look over the weekend if that helps.

@BKSteve
Contributor

BKSteve commented Sep 15, 2022

Adjusted the requirements files; this needs a pull into @miigotu's repo.

Update requirements-crossenv.txt
Update requirements-pure.txt
@miigotu
Contributor Author

miigotu commented Sep 15, 2022

There must be a nice way to generate the requirements files automatically when they change lol.

I'm curious how I should make a PR to this repo automatically from GitHub actions if I have no easy way to tell that a package is pure python or requires cross compilation.

@miigotu
Contributor Author

miigotu commented Sep 15, 2022

Maybe I can make a PR to https://pypi.org/project/johnnydep/#description to add an option to check whether each requirement is pure Python, idk. What is the reason that all of the wheels can't just be built with the build environment available? If gcc or rustc or whatever isn't needed, it just won't be used. All of the wheels could be built and included from only one requirements.txt, which would let me export it in a release action on SickChill.
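
One crude way to probe that today without patching johnnydep; the probe directory and the $package placeholder are arbitrary:

# Download just this package's own wheel and look at its tag: a *-none-any.whl
# is pure Python, while a platform-tagged wheel (e.g. *-x86_64*.whl) means it
# needs cross-compiling for other arches.
pip download --no-deps --only-binary=:all: "$package" -d /tmp/wheelprobe
ls /tmp/wheelprobe/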

@BKSteve
Contributor

BKSteve commented Sep 15, 2022

do we even need version numbers in the requirements files if we want the current latest?

@miigotu
Contributor Author

miigotu commented Sep 15, 2022

do we even need version numbers in the requirements files if we want the current latest?

Probably not, but it is best practice. There shouldn't need to be requirements files at all: if the sickchill wheel is installed into the virtualenv at build time rather than in post_install, you wouldn't need one. Just pip install sickchill with the build environment available.
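
A minimal sketch of that idea, assuming the build environment can compile whatever binary dependencies come up:

# Build-time install straight from PyPI; pip resolves and builds sickchill's
# dependencies itself, so no separate requirements files are needed.
python3 -m venv env
env/bin/pip install --upgrade pip wheel
env/bin/pip install sickchill==2022.9.14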

@miigotu
Contributor Author

miigotu commented Sep 15, 2022

I bet if you emptied the requirements files and put sickchill==2022.9.14 into requirements-cross.txt, it would work just like that. It should build all of the required dependencies, even the ones that need to be compiled, and put them in the wheelhouse.

It might make the spk larger though, because pure wheels don't really need to be in the package. There needs to be a PEP for determining whether a package needs to be compiled...

To work around the larger spk, you could delete any *-none-any.whl wheels in the makefile after the build, before packaging.
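
As a rough illustration of that workaround (the wheelhouse path is an assumption, and this would live in the package Makefile or a post-build step):

# Pure-Python wheels don't need to ship in the spk; drop them from the
# wheelhouse before the package is assembled.
find share/wheelhouse -name '*-none-any.whl' -delete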

@BKSteve
Contributor

BKSteve commented Sep 15, 2022

cryptography is being a pain in the butt again.

38.0.1 fails in build and requires > 35.0
ERROR: Failed building wheel for cryptography

pyopenssl 22.0.0 depends on cryptography>=35.0

@miigotu
Contributor Author

miigotu commented Sep 15, 2022

I made a branch on my repo to test building it differently. Maybe I can add something to the build process here to make it easier for upstream projects to help keep their packages up to date automatically.

@miigotu
Contributor Author

miigotu commented Sep 15, 2022

cryptography is being a pain in the butt again.

38.0.1 fails in build and requires > 35.0 ERROR: Failed building wheel for cryptography

pyopenssl 22.0.0 depends on cryptography>=35.0

Is cross/cryptography supposed to be specified in our makefile?

@miigotu
Contributor Author

miigotu commented Sep 15, 2022

There's something odd going on: on my test PR I removed requirements-pure.txt, and somehow it's regenerated in the build with poetry-date-version-plugin, which nothing should rely on. I'll need to run a dependency tree to see where that comes from; maybe new-rtorrent-python or new-qbittorrent.
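
For the dependency-tree check, something like pipdeptree can show what pulls a package in; the package name is the one mentioned above, the rest is just a suggestion:

pip install pipdeptree
# Show which installed packages depend on poetry-date-version-plugin.
pipdeptree --reverse --packages poetry-date-version-plugin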

@hgy59
Contributor

hgy59 commented Sep 15, 2022

Is cross/cryptography supposed to be specified in our makefile?

cryptography must be cross-compiled and depends on setuptools-rust (and the Rust compiler).
I already started to build the cryptography wheel in #5399, but no success so far.

@miigotu
Contributor Author

miigotu commented Sep 15, 2022

@BKSteve probably because I removed the extra index URL that has prebuilt wheels for cryptography. I would really rather it build natively in this toolchain, but it would work just fine from their repo:
https://github.com/SynoCommunity/spksrc/pull/5431/files#diff-ba37a95b64f7df4409a5def77572e53c876a9792d2690bc6d062f50047467dd8L45

@BKSteve
Contributor

BKSteve commented Sep 16, 2022

I bet if you emptied the requirements files and put the sickchill==2022.9.14 into requirements-cross.txt

I did a build with this suggestion. On my DSM 7 VM x64 machine it loaded perfectly and is running.
Cryptography 38.0.1 loaded with no issues.

@miigotu
Contributor Author

miigotu commented Sep 16, 2022

I did it by hand and forgot to modify the cryptography==3.3.2 pin; I then changed it and tried my own builds with 38.0.1 and 36.0.1, and they failed.

With SC only and nothing else, it built and loaded.

You are on an x86_64 arch though, which has prebuilt wheels for cryptography on PyPI. armv7l doesn't, for example, so it would not run for people who don't have x64/amd64, arm64/aarch64, etc.

https://pypi.org/project/cryptography/38.0.1/#files

@BKSteve
Contributor

BKSteve commented Sep 16, 2022

cryptography must be cross compiled and depends on setuptools-rust (and rust compiler).
I already started to build cryptography wheel in #5399, but no success so far.

OK. In 8160 I have a link to the spk that those who are lucky enough to be on x64, where cryptography is available, can use.

@miigotu
Contributor Author

miigotu commented Sep 16, 2022

Is cross/cryptography supposed to be specified in our makefile?

cryptography must be cross-compiled and depends on setuptools-rust (and the Rust compiler). I already started to build the cryptography wheel in #5399, but no success so far.

Took a look at that PR and I see it's nearly done. Maybe when I'm at an actual PC I can try to build it 😂

@miigotu
Contributor Author

miigotu commented Sep 16, 2022

@hgy59 as a side note, do you have any insight on how I can export our requirements easily to split them into requirements-cross and pure?

I could move them all into requirements-cross (or idk if requirements.txt works directly) which would probably double the size of the spk. I know that seems bad because the initial download of the package takes longer, but install would definitely be faster. I also don't know the requirements your org has to manage artifact storage or syno repo storage, and if that comes into account.

My thing is, I want to be able to write a GitHub Actions step in our release process that automatically checks out spksrc, exports our requirements so you can prebuild the wheels, edits the sickchill version string in the makefile, and opens a PR here (see the sketch below). Of course, it would also check whether an existing PR is open from our spksrc:sickchill branch before opening one, and if so just push to the open PR.

It feels much cleaner to me to disable the sickchill in-app self updater and only allow updates through syno package center. Maybe we can create a script on spksrc that can filter a requirements.txt into pure and cross requirements files? Even if we have to hard code the list of cross requirements to check for it would help.

I'd much rather lessen the load on us both of trying to keep this package up to date manually.

Thoughts?
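
A rough sketch of that release-workflow step using the GitHub CLI; the branch name, remote, and version variable are placeholders, and the requirements export itself would happen in an earlier step that isn't shown:

# Push the refreshed spksrc changes to a dedicated branch on our fork.
git checkout -B sickchill-spk-update
git commit -am "sickchill: update to $NEW_VERSION"
git push --force origin sickchill-spk-update

# Only open a new PR if one from this branch isn't already open.
if [ -z "$(gh pr list --head sickchill-spk-update --state open)" ]; then
    gh pr create --title "sickchill: update to $NEW_VERSION" \
        --body "Automated requirements refresh from the SickChill release workflow"
fi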

@th0ma7
Contributor

th0ma7 commented Sep 16, 2022

@hgy59 as a side note, do you have any insight on how I can export our requirements easily to split them into requirements-cross and pure?

Yes, no, maybe. There could probably be something done based on the other existing spk/*/src/requirements*.txt files, but beyond that, not much that I can think of.

I could move them all into requirements-cross (or idk if requirements.txt works directly) which would probably double the size of the spk. I know that seems bad because the initial download of the package takes longer, but install would definitely be faster. I also don't know the requirements your org has to manage artifact storage or syno repo storage, and if that comes into account.

If you move everything under cross you'll end up getting build issues, often related to pip's inability to find a matching version. Although if you want your package to include pure Python wheels as well, you can use WHEELS_PURE_PYTHON_PACKAGING_ENABLE = 1.

EDIT: One of the many reasons is also the objective of reducing file sizes on our repository, as we're quite short on space.

My thing is, I want to be able to write a GitHub Actions step in our release process that automatically checks out spksrc, exports our requirements so you can prebuild the wheels, edits the sickchill version string in the makefile, and opens a PR here. Of course, it would also check whether an existing PR is open from our spksrc:sickchill branch before opening one, and if so just push to the open PR.

Once you know which requirements need to be cross-compiled, everything else is pure. With that in mind, you could generate your new requirement files by grep'ing your previous requirements-cross.txt release and assuming everything else is always pure.

It feels much cleaner to me to disable the sickchill in-app self updater and only allow updates through syno package center. Maybe we can create a script on spksrc that can filter a requirements.txt into pure and cross requirements files? Even if we have to hard code the list of cross requirements to check for it would help.

Again, the existing package requirement files could be used as a hard-coded list. Far from perfect, but that's a start.

Food for thought: at the framework level, since there were a lot of improvements over the past year for building wheels, perhaps requirements could be re-combined and built in a first pass using cross-compile mode; if that hits a build error, automatically assume the wheel must be retried as a pure Python build, and then either build OK or exit with an error code.
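
A rough sketch of that fallback logic, purely illustrative; the two helpers below stand in for whatever the framework's real cross and pure wheel targets are:

# Hypothetical wrappers around the framework's wheel-build targets.
build_cross() { make "wheel-crossenv-$1"; }   # placeholder target name
build_pure()  { make "wheel-pure-$1"; }       # placeholder target name

# Try cross-compiling each wheel first; fall back to a pure build on failure,
# and stop with an error if neither works.
while read -r wheel; do
    build_cross "$wheel" || build_pure "$wheel" || exit 1
done < requirements.txt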

@miigotu
Contributor Author

miigotu commented Sep 16, 2022

Food for thought: at the framework level, since there were a lot of improvements over the past year for building wheels, perhaps requirements could be re-combined and built in a first pass using cross-compile mode; if that hits a build error, automatically assume the wheel must be retried as a pure Python build, and then either build OK or exit with an error code.

I'm curious what would cause a pure Python package to fail to build simply because a compiler and build dependencies are present. Without seeing the error, that seems like a flaw in the process somewhere. I'm just trying to imagine how that happens.

@miigotu
Contributor Author

miigotu commented Sep 16, 2022

@th0ma7 do you think it would be accepted if I made a global file in spksrc that we can check against for packages that need cross?

I could create it from all of the requirements-cross.txt files, strip the package versions, remove duplicates, and then make a script that splits requirements.txt by comparing against that master list of packages that need cross. My question is whether I should do that entirely on my end or contribute that script and list to spksrc via a PR.

@th0ma7
Contributor

th0ma7 commented Sep 18, 2022

@th0ma7 do you think it would be accepted if I made a global file in spksrc that we can check against for packages that need cross?

Well, that's something a simple grep -li <wheel> spk/*/src/requirements-crossenv.txt would do, no? (See the sketch at the end of this comment.) Maintaining a file for this will most probably end up being out of date at one point or another.

I could create it from all of the requirements-cross.txt files, strip the package versions, remove duplicates, and then make a script that splits requirements.txt by comparing against that master list of packages that need cross. My question is whether I should do that entirely on my end or contribute that script and list to spksrc via a PR.

I would keep that dynamic, based on the status of the existing packages.

But as I mentioned previously, I'll check whether something could be done to allow a single requirements.txt file when absolutely needed.
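
A minimal sketch of that dynamic approach, using the existing spk/*/src/requirements-crossenv.txt files as the source of truth; the input and output file names are only examples:

# Anything some spksrc package already cross-compiles goes to crossenv;
# everything else is treated as pure.
: > requirements-crossenv.txt
: > requirements-pure.txt
while read -r req; do
    name=${req%%[=<>~]*}
    if grep -qi "^${name}[=<>~ ]" spk/*/src/requirements-crossenv.txt; then
        echo "$req" >> requirements-crossenv.txt
    else
        echo "$req" >> requirements-pure.txt
    fi
done < requirements.txt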

@miigotu
Contributor Author

miigotu commented Sep 18, 2022

If the package was still using py38 or 39 we could use wheels from piwheels for all the versions pyca doesn't build 😭
https://www.piwheels.org/simple/cryptography/

@BKSteve
Contributor

BKSteve commented Sep 22, 2022

Easy enough to drop back to Python38

@miigotu
Contributor Author

miigotu commented Sep 22, 2022

I just haven't had time to mess with it. I guess I need to set up a Syno VM again and mess with it. I haven't had one set up in years, and testing non-standard architectures is going to be a hassle for either of us. Maybe I can help with the Rust install for the toolchain at least.

@th0ma7
Contributor

th0ma7 commented Sep 22, 2022

Moving back to python38 until Rust wheel support gets into the framework is certainly an option. We're doing our best to add this, but it will take a few more of our limited cycles. The more I dive into it, the more it shows that Rust is still relatively young, as we're clearly not the only ones hitting roadblocks when cross-compiling. But the upcoming adoption of Rust in the kernel will most probably ease this as time passes.

@miigotu
Contributor Author

miigotu commented Sep 22, 2022

Moving back to python38 until Rust wheel support gets into the framework is certainly an option. We're doing our best to add this, but it will take a few more of our limited cycles. The more I dive into it, the more it shows that Rust is still relatively young, as we're clearly not the only ones hitting roadblocks when cross-compiling. But the upcoming adoption of Rust in the kernel will most probably ease this as time passes.

Yeah, it's young for sure, but it's absolutely going to see huge adoption. I'll see what I can do; I actually want to learn Rust 😂

@th0ma7
Contributor

th0ma7 commented Oct 12, 2022

@miigotu rebasing against master should now allow you to build rust wheels :)

@miigotu
Contributor Author

miigotu commented Oct 12, 2022

Sweet, I'll go over the whole app tonight.

@th0ma7
Contributor

th0ma7 commented Oct 13, 2022

I've added your PR to the META issue #5043.
If it helps, what I normally do for requirements is the following:

  1. Create an initially empty requirements.txt (no version pins).
  2. Install the package on x86_64 and let pip install everything by default. You may need some trial and error for older wheels that have no pre-built x86_64 binaries online, adding those manually, but on large packages this normally only represents a handful of them.
  3. Once the installation completes, copy/paste the install summary from /var/log/packages/sickchill.log into requirements-pure.txt. That will be your real starting point.
  4. From there, right at the top you'll notice all the ones that were pre-existing from the default python310 install. Again, just a handful of them. What I normally do is comment them out, such as #certifi==a.b.c ==> py310.
  5. Then, using grep, catch all the x86_64 downloaded wheels (e.g. $ grep x86_64 sickchill.log). I comment those the same way, such as #PyYAML==6.0 ==> crossenv.
  6. Then grep all crossenv entries out of your pure file to create a crossenv file, e.g. $ grep crossenv requirements-pure.txt > requirements-crossenv.txt (condensed into a short sketch after this comment).
  7. Then re-test your build on x86_64 and, if possible, on an ARM arch.

You can find a really easy build at #5447.

Hope this helps and let me know if you require assistance.

EDIT: @miigotu also note that many of the usual complex wheel-building examples are available under spk/python310, commented, for you to import into your project or test bed.
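
The grep steps above, condensed into a short shell sketch; the log path and file names follow the examples in the list:

# Step 5: wheels pip downloaded as x86_64 binaries are the ones that need crossenv.
grep x86_64 /var/log/packages/sickchill.log
# Mark those lines in requirements-pure.txt as "#PyYAML==6.0 ==> crossenv" and so on,
# then pull everything tagged crossenv out into its own file (step 6):
grep crossenv requirements-pure.txt > requirements-crossenv.txt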

@BKSteve
Contributor

BKSteve commented Oct 13, 2022

Thanks for the step-by-step. I have started and will update the files here once tested.

@BKSteve
Contributor

BKSteve commented Oct 13, 2022

cryptography is throwing errors when it is only in cross.
SickChill_x64-7.0_crypto-error.log

I added cross/cryptography to the makefile, as having it only in requirements-cross.txt was also throwing a similar error.

# [cryptography]
DEPENDS += cross/cryptography
ENV += PYO3_CROSS_LIB_DIR=$(STAGING_INSTALL_PREFIX)/lib/
ENV += PYO3_CROSS_INCLUDE_DIR=$(STAGING_INSTALL_PREFIX)/include/

Error:
ImportError: libffi.so.6: cannot open shared object file: No such file or directory

Full log: build-x64-7.0.zip

Maybe my bad: certifi was still in pure, not commented out as being in py310.

@BKSteve
Contributor

BKSteve commented Oct 13, 2022

x64 issues resolved.
armv7 failed.
Added cross/cryptography to resolve it.
Raised my own pull request for build tests.

@miigotu
Contributor Author

miigotu commented Oct 13, 2022

@BKSteve you can either tag the new PR here and close this one or push those changes here. Good job.

@BKSteve
Contributor

BKSteve commented Oct 13, 2022

Let's get some feedback on 8247 and move on from there. I only did my own pull request to SynoCommunity to build quicker. I'd built 10, but it built 13 in the same time.

@BKSteve
Contributor

BKSteve commented Oct 14, 2022

@miigotu maybe you close this one and we move to #5449, which is building; I've just reduced the pure file, as python310 already loads a few other packages.

@miigotu miigotu closed this Oct 14, 2022
@miigotu
Contributor Author

miigotu commented Oct 14, 2022

Sure
