
Including companion dependency only for specific versions of another dependency #8813

gerbenoostra opened this issue Dec 21, 2023 · 3 comments
Labels
kind/feature Feature requests/implementations status/triage This issue needs to be triaged

Comments

@gerbenoostra

  • I have searched the issues of this repo and believe that this is not a duplicate.
  • I have searched the FAQ and general documentation and believe that my question is not already covered.

Feature Request

I have an optional dev dependency (a companion package containing stubs), which I only want to include for certain versions of a main dependency.

For example, when using pyspark~=3.0, I want to use pyspark-stubs~=3.0. However, when using pyspark~=3.3, I don't want the stubs at all (that version already includes them).

My attempt:

```toml
[tool.poetry.dependencies]
pyspark = [
  {version = "~3.0", optional = true, python = "<=3.8"},
  {version = "~3.3", optional = true, python = ">3.8"}
]

[tool.poetry.group.test.dependencies]
pyspark-stubs = {version = "~3.0", python = "<=3.8", optional = true}
```

This fails to resolve, because pyspark-stubs depends on pyspark ~3.0 while the project depends on both pyspark ~3.0 and pyspark ~3.3.

The problem, of course, is that no compatible pyspark-stubs is specified for pyspark 3.3. In that case I want the resolution to simply omit pyspark-stubs. I tried to enforce this by applying the same python restriction to both dependencies, but that didn't work.

How can I specify such a companion dependency (like stubs) only for specific versions of my main dependency? Or is there a way to group them together in another way?

@dimbleby
Contributor

Duplicate of #8499; both seem unlikely ever to happen.

@gerbenoostra
Author

gerbenoostra commented Dec 21, 2023

It does work with transitive dependencies though, I assume.

If I depend on library A at version 1 or 2, where each version has different dependencies (perhaps varying by Python version), then whichever version is selected brings in its own dependency set: either version 1's or version 2's gets installed.

Maybe I can work around this by releasing two versions of a private (empty) package: one with pyspark 3.0 and pyspark-stubs as dependencies, and a newer version that depends only on pyspark 3.3.
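A rough sketch of what the two releases of such a hypothetical shim package might look like (the name `pyspark-shim` and all metadata below are assumptions for illustration; these are two separate `pyproject.toml` files, shown in one block for brevity):

```toml
# --- pyproject.toml of pyspark-shim 1.0.0 (hypothetical name) ---
# Released for environments that should get pyspark 3.0 plus its stubs.
[tool.poetry]
name = "pyspark-shim"
version = "1.0.0"
description = "Empty shim pinning pyspark and its companion stubs"

[tool.poetry.dependencies]
python = "<=3.8"
pyspark = "~3.0"
pyspark-stubs = "~3.0"

# --- pyproject.toml of pyspark-shim 2.0.0 ---
# Released for environments on pyspark 3.3, which bundles its own stubs.
# [tool.poetry]
# name = "pyspark-shim"
# version = "2.0.0"
#
# [tool.poetry.dependencies]
# python = ">3.8"
# pyspark = "~3.3"
```

The consuming project would then depend only on `pyspark-shim` and let the solver pick whichever release fits the environment.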

@radoering
Member

> Which fails to resolve, as pyspark-stubs depends on pyspark ~3.0, and the project depends on both pyspark ~3.0 and pyspark ~3.3.

If I understand correctly, this example should resolve, because the intersection of the markers of pyspark-stubs and pyspark ~3.3 is empty. However, this example might be a duplicate of #5506.
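To illustrate the marker-intersection point: the two `python` constraints from the example translate to PEP 508 environment markers that no Python version can satisfy simultaneously. This can be checked with the `packaging` library (a sketch; `packaging` must be installed, and the marker strings here are my reading of the constraints in the pyproject above):

```python
from packaging.markers import Marker

# Markers corresponding to the two python constraints in the example.
stubs_marker = Marker('python_version <= "3.8"')    # pyspark ~3.0 + pyspark-stubs
pyspark33_marker = Marker('python_version > "3.8"')  # pyspark ~3.3

# No Python version satisfies both markers, so a resolver should never
# need pyspark-stubs and pyspark ~3.3 in the same environment.
for version in ["3.7", "3.8", "3.9", "3.11"]:
    env = {"python_version": version}
    both = stubs_marker.evaluate(env) and pyspark33_marker.evaluate(env)
    print(version, stubs_marker.evaluate(env), pyspark33_marker.evaluate(env), both)
```

For every version, `both` comes out False, which is the sense in which the intersection of the markers is empty.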

> Maybe I can create a workaround where I release 2 versions of a private (empty) package: one with pyspark 3.0 & pyspark-stubs as dependencies, and release another new version that depends only on pyspark 3.3.

You may run into the same solver issue or it may work. I'd like to hear whether this changes anything. 🙂
