
Resolution fails if no versions can be found for a conditioned package #28

Open
uranusjr opened this issue Sep 1, 2018 · 3 comments
Labels
bug Something isn't working

Comments

@uranusjr
Member

uranusjr commented Sep 1, 2018

This fails if locked on 2.7:

black ; python_version >= '3.6'

I’m not completely sure what the best approach is. Should passa just always ignore Requires-Python when resolving? Should it still respect it, but choose the lowest possible version when resolution would otherwise fail?

Erroring out is not an option: pip-shims, for example, specifies modutil; python_version >= '3.7', and locking would fail on every Python version below 3.7 for any project that depends on it. But pip-shims is intended to be used on those lower versions.
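For concreteness, here is a minimal sketch (using the `packaging` library) of why the candidate set comes up empty when locking on 2.7: the marker evaluates to false in that environment, so every candidate is filtered out before resolution even starts:

```python
# Sketch only: shows marker evaluation, not Passa's actual code path.
from packaging.markers import Marker

marker = Marker("python_version >= '3.6'")

# Evaluate against a hypothetical 2.7 environment; the dict overrides
# the corresponding value from the running interpreter's environment.
print(marker.evaluate({"python_version": "2.7"}))  # False
print(marker.evaluate({"python_version": "3.6"}))  # True
```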

@uranusjr uranusjr added the bug Something isn't working label Sep 1, 2018
@techalchemy
Member

So currently, passa locks pip-shims to that marker as well, which simply isn't true. Parents shouldn't inherit the markers of their children, the relationship should only flow in the other direction.

In this case, the children of modutil should inherit the python_version marker from modutil, unless those children are specified somewhere else without a marker

I know modutil has no markers on it; I’m just reusing the same example.
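The inheritance direction described above can be sketched as follows. This is a hypothetical helper, not Passa's API: a child requirement picks up its parent's marker (AND-ed with its own, if any), while a parent never picks up markers from its children:

```python
# Hypothetical sketch of parent -> child marker inheritance.
def inherit_marker(parent_marker, child_marker):
    """Combine marker strings flowing from parent to child."""
    if parent_marker and child_marker:
        # AND the two constraints; parentheses keep precedence explicit.
        return "({}) and ({})".format(parent_marker, child_marker)
    return parent_marker or child_marker

# modutil carries python_version >= '3.7'; a (hypothetical) child of
# modutil inherits that constraint on top of its own marker.
print(inherit_marker("python_version >= '3.7'", None))
print(inherit_marker("python_version >= '3.7'", "os_name == 'posix'"))
```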

@uranusjr
Member Author

uranusjr commented Sep 2, 2018

I agree that modutil’s children should inherit the marker from modutil (Passa already does this), but what I’m describing is that when Passa finds versions on PyPI for modutil, it should consult its parent’s marker. Note that this is not about finding dependencies, but about finding versions of a requirement.

modutil is not a very good example here to describe the problem, so I’ll invent something else. Django has the following Python version support matrix:

  • Django 1.11: Python 2.7, 3.4, 3.5, 3.6
  • Django 2.0: Python 3.4, 3.5, 3.6, 3.7

Django did not actually specify Requires-Python on PyPI for 1.11, but let’s pretend it did. Also, let’s pretend Django 2.0 is the latest release, for simplicity’s sake.

Say I’m running Python 2.7, and have a package specifying this dependency:

django >= 2.0; python_version >= '3.7'

What version of Django should it resolve to? Currently it won’t resolve at all, since the resolver checks whether the running interpreter satisfies a given version’s Requires-Python, and therefore won’t be able to find a version.

The above-mentioned #37 removes this restriction, finds 2.0 correctly, and slaps the marker on it. But this raises another problem. What if I have a dependency graph like this:

foo
 ↓
bar ; python_version < '3'
 ↓
django>=1.11

Previously this would correctly resolve Django to 1.11, but with #37 it would resolve into an uninstallable result (Django 2.0; python_version < '3').

The solution is that when the resolver finds versions for Django, it needs to compare the parent’s markers (bar’s, here) with Django’s Requires-Python, and find only versions inside the intersected range.
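A sketch of that intersection check (again not Passa's actual code), using `packaging.specifiers` and the pretend Requires-Python data from this comment: a candidate version survives only if some Python version satisfies both the parent's python_version constraint and the candidate's Requires-Python:

```python
# Sketch: intersect a parent's python_version constraint with each
# candidate's Requires-Python. Version data below is the pretend
# matrix from this comment, not real PyPI metadata.
from packaging.specifiers import SpecifierSet

candidates = {
    "1.11": SpecifierSet(">=2.7,<3.7"),  # pretend: 2.7, 3.4-3.6
    "2.0": SpecifierSet(">=3.4"),        # pretend: 3.4+
}

parent_constraint = SpecifierSet("<3")   # from `bar ; python_version < '3'`
pythons = ["2.7", "3.4", "3.5", "3.6", "3.7"]  # coarse probe set

def acceptable(requires_python):
    # Keep the candidate if any probed Python satisfies both ranges.
    return any(p in parent_constraint and p in requires_python
               for p in pythons)

good = [v for v, rp in candidates.items() if acceptable(rp)]
print(good)  # only "1.11" survives under python_version < '3'
```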

This is low priority for me, since you can simply provide additional hints in the Pipfile to work around the problem. But it is still theoretically solvable, and I want to write it down so it is known and can be worked on if possible.

@techalchemy
Member

Say I’m running Python 2.7, and have a package specifying this dependency:

 django >= 2.0; python_version >= '3.7'

What version of Django should it resolve to? Currently it won’t resolve at all, since the resolver checks whether the running interpreter satisfies a given version’s Requires-Python, and therefore won’t be able to find a version.

Given this set of requirements it would be correct not to resolve anything. There is an explicit marker on django which essentially says 'don't install it at all unless you can install this version'. This is an issue with actually transferring and pinning the parent marker to the child requirement, especially in cases where the markers can lead you down a path that is exclusive with another path (python_version, sys_platform, os_name, etc.). The trouble is that you really only want to inherit the parent's markers if you are also inheriting the parent, which is why we decided not to do any of this in pipenv and just dropped the marker entirely for the children.

The above-mentioned #37 removes this restriction, finds 2.0 correctly, and slaps the marker on it. But this raises another problem. What if I have a dependency graph like this:

foo
 ↓
bar ; python_version < '3'
 ↓
django>=1.11

Previously this would correctly resolve Django to 1.11, but with #37 it would resolve into an uninstallable result (Django 2.0; python_version < '3').

So here's the real question, and since I know we talked about this in the past, I'll keep it short. What we really need is the ability to keep two copies of the same package in the lockfile, with different markers. In this example, we'd want to run the lock twice: once with Requires-Python >= 3.0, and then once with the inverse of the operator we used for the first pass, i.e. Requires-Python < 3.0, dropping all the pins and re-resolving. That way we would already know how to handle both cases before we ever hand off the lockfile.
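The two-pass idea can be sketched roughly like this. `resolve` is a stand-in for a real resolver (hard-coded here to the Django example from this thread), not any actual Passa function; the point is only the shape of the lockfile, with one entry per side of the split marker:

```python
# Rough sketch of locking once per side of a split marker and keeping
# both results. `resolve` is a hypothetical stand-in, not Passa API.
def resolve(requirements, python_constraint):
    """Pretend single-pass resolver; returns {name: version}."""
    # Hard-coded to the pretend Django support matrix in this thread.
    if python_constraint == "python_version >= '3'":
        return {"django": "2.0"}
    return {"django": "1.11"}

def lock_both_sides(requirements, marker, inverse_marker):
    lockfile = {}
    for m in (marker, inverse_marker):
        for name, version in resolve(requirements, m).items():
            # Two entries for the same package, distinguished by marker.
            lockfile.setdefault(name, []).append(
                {"version": version, "marker": m})
    return lockfile

lock = lock_both_sides(["django>=1.11"],
                       "python_version >= '3'",
                       "python_version < '3'")
print(lock)
```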
