I use poetry for many projects, including some PySpark projects. For these, I am used to leveraging conda-pack to create a standalone tar.gz containing the Python binary and all the dependencies, which I can then use with spark-submit.
To generate a conda environment, I am currently forced to use `poetry export` to generate a requirements.txt and then load it with conda.
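The two-step workaround above might look like the following (a sketch; the environment name `myproject` and Python version are placeholders):

```shell
# Export the locked dependency pins from poetry:
poetry export -f requirements.txt --output requirements.txt

# Create a bare conda environment and install the pins into it via pip:
conda create -n myproject python=3.10 pip -y
conda run -n myproject pip install -r requirements.txt
```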
Conda supports a yaml environment specification file, and it would be useful if poetry export could emit a file satisfying this specification.
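For illustration, such an exported environment.yml might look like the following; the project name, channel, and package pins are purely illustrative, and the pins are placed under a `pip:` subsection since poetry's lock file records PyPI-style requirements:

```yaml
# Hypothetical output of a conda-flavoured poetry export:
name: my-pyspark-project
channels:
  - conda-forge
dependencies:
  - python=3.10
  - pip
  - pip:
      - pyspark==3.5.0
      - pandas==2.1.4
```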
It does not seem extremely complicated to implement but before jumping in and writing code I wanted to ask:
Would this functionality be useful?
Should this functionality be added to this repository or as a separate plugin?
Do you see any blockers to implementing this?
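To show that the conversion itself is indeed simple, here is a minimal sketch. The function name and its placement of all pins under a `pip:` subsection are my own assumptions, not an existing poetry API:

```python
def requirements_to_conda_env(name: str, python_version: str,
                              requirements: list[str]) -> str:
    """Render a conda environment.yml from pip-style requirement pins.

    Pins exported by `poetry export` are PyPI-style, so the simplest
    faithful translation places them under a `pip:` subsection rather
    than trying to map them onto conda package names.
    """
    lines = [
        f"name: {name}",
        "dependencies:",
        f"  - python={python_version}",
        "  - pip",
        "  - pip:",
    ]
    lines += [f"      - {req}" for req in requirements]
    return "\n".join(lines) + "\n"


# Hypothetical usage with illustrative pins:
env_yaml = requirements_to_conda_env(
    "demo", "3.10", ["numpy==1.26.4", "pyspark==3.5.0"]
)
print(env_yaml)
```

The real implementation would of course read the pins from poetry's lock file rather than take them as a list, but the rendering step would be essentially this.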
Thank you in advance for your inputs!
Over at conda-forge/wetterdienst-feedstock#771, we have started searching for a solution to this matter, and happily discovered your discussion here.
We will be all ears when there is something usable and will try to apply it to the dependency definitions of Wetterdienst quickly.