Context
We are using dbt for Python "models", which handles the Python stored procedures that create tables and abstracts them from users.
However, dbt leaves it to users (at least for now) to push any code living outside the model into a stage.
That said, I think the feature requested here applies more broadly to reusing logic across stored procedures/UDFs.
Request
The ability to create a package from a user-defined Python library that depends on a number of external libraries (in Anaconda or, alternatively, on PyPI), with the packaging of the library and its PyPI dependencies handled for the user and the result submitted to a Snowflake stage.
E.g. snow package lib <lib location>, which packages a local library defined in <lib location> alongside its dependencies.
Example
I managed to build one using your wrapper for packaging PyPI dependencies:
But I misused snow package create <package> by depending on only a single external package (pyjokes) and "knowing" that your implementation compresses the local folder (which includes my lib) into a zip and then conveniently submits it to a stage. Ideally, the user lists a number of dependencies (e.g. in pyproject.toml); Anaconda ones are ignored, while PyPI-only ones are pulled, compressed alongside the lib into a zip, and pushed to a stage.
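A rough sketch of that ideal flow in plain Python: `split_dependencies`, `package_lib`, the `mylib` layout, and the Anaconda package set are all illustrative assumptions rather than CLI internals, and the actual fetch of the PyPI-only packages (a pip step) is left as a comment:

```python
import tempfile
import zipfile
from pathlib import Path

def split_dependencies(deps, anaconda_packages):
    """Split declared dependencies: packages available in the Snowflake
    Anaconda channel can be left to Snowflake, while PyPI-only ones must
    be bundled alongside the library."""
    anaconda = sorted(d for d in deps if d in anaconda_packages)
    pypi_only = sorted(d for d in deps if d not in anaconda_packages)
    return anaconda, pypi_only

def package_lib(lib_dir, zip_path):
    """Zip a local library directory, preserving its top-level package name."""
    lib_dir, zip_path = Path(lib_dir), Path(zip_path)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for file in sorted(lib_dir.rglob("*")):
            if file.is_file():
                zf.write(file, file.relative_to(lib_dir.parent))
    return zip_path

# Demo with a throwaway library named 'mylib' (the name is illustrative).
root = Path(tempfile.mkdtemp())
(root / "mylib").mkdir()
(root / "mylib" / "__init__.py").write_text("def hello():\n    return 'hi'\n")

# 'pyjokes' is PyPI-only; 'numpy' is assumed to be in the Anaconda channel.
anaconda, pypi_only = split_dependencies(["numpy", "pyjokes"], {"numpy"})
# pypi_only (here ['pyjokes']) would then be fetched with pip into the same
# folder before zipping, and the resulting zip uploaded to a stage via PUT.
archive = package_lib(root / "mylib", root / "app.zip")
```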
Actually, inspecting the code, I found that there is a route: using the package command on a wheel of the user-created library (which has third-party dependencies). The wheel file knows the lib's dependencies; you seem to invoke pip, which pulls the wheel's dependencies, and your CLI takes care of the rest. E.g.:
Starting from your solution, @n-batalha, ... package create ... can be replaced with pip install -t and zip -r. This puts the top-level module (named example below) in the right format for ... package upload ....
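A minimal stdlib sketch of that recipe: `zip_install_dir` mirrors `cd build_dir && zip -r app.zip .`, while the `pip install -t build_dir <wheel>` step is only simulated here by creating empty packages (`example` and `pyjokes` are stand-ins for the top-level module and a resolved dependency):

```python
import tempfile
import zipfile
from pathlib import Path

def zip_install_dir(build_dir, zip_path):
    """Stdlib equivalent of `cd build_dir && zip -r app.zip .`: every
    installed module ends up at the top level of the archive, which is
    the layout the upload step expects."""
    build_dir, zip_path = Path(build_dir), Path(zip_path)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for file in sorted(build_dir.rglob("*")):
            if file.is_file():
                zf.write(file, file.relative_to(build_dir))
    return zip_path

# Simulate what `pip install -t build_dir <wheel>` would leave behind:
# the top-level module plus its (PyPI-only) dependencies, side by side.
build_dir = Path(tempfile.mkdtemp())
for pkg in ("example", "pyjokes"):
    (build_dir / pkg).mkdir()
    (build_dir / pkg / "__init__.py").write_text("")

app_zip = zip_install_dir(build_dir, build_dir.with_suffix(".zip"))
```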
@n-batalha I'm closing this one, as there were numerous fixes to the package commands in version 2.0. If this still persists in 2.0, please reopen this issue.