
Updating only one locked dependency #966

Closed
k4nar opened this issue Oct 24, 2017 · 83 comments
Labels
Type: Question ❔ This is a question or a request for support.

Comments

@k4nar
Contributor

k4nar commented Oct 24, 2017

Sometimes I'm doing a PR and I want to update a specific dependency but I don't want to deal with updates of all my dependencies (aiohttp, flake8, etc…). If any breaking change was introduced in those dependencies, I want to deal with it in another PR.

As far as I know, the only way to do that would be to pin all the dependencies that I don't want to update in the Pipfile. But I find that this defeats the purpose of Pipenv in the first place :).

So my feature request would be to be able to do something like:

$ pipenv lock --only my-awesome-dep

That would generate a Pipfile.lock with updates for only my-awesome-dep and its dependencies.

I can probably make a PR for that, but I would like to get some feedback first.

@k4nar
Contributor Author

k4nar commented Oct 24, 2017

That could also be useful for pipenv install, as sometimes I want to install a new dependency without updating others.

@vphilippon
Member

vphilippon commented Oct 24, 2017

There's a little thing to take into account here: changing a single dependency could change the overall set of requirements.
Ex: updating foo from 1.0 to 2.0 could require updating bar to >=2.0 (while it was <2.0 before), and so on.
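
To illustrate with hypothetical pins (requirements-style notation):

    # Pipfile.lock before: foo 1.x requires bar<2.0
    foo==1.0
    bar==1.5

    # foo 2.0 declares bar>=2.0, so updating "only" foo forces bar to move too:
    foo==2.0
    bar==2.1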

I know that in the context of pip-tools itself (from which pipenv takes its dependency resolution algorithm), running the dependency resolution will only "update" the required packages when "re-locking" if there's an existing lock file. It does so by first checking whether the existing pins in the lockfile are valid candidates when selecting candidates during resolution. pipenv could probably do the same.

I think it's a reasonable idea. Otherwise, if you want to update absolutely only one dependency, pipenv would have to have a mode that blocks if changing a dependency causes other changes, or else you would lose the guarantee of a valid environment.

I hope this helps!

@erinxocon erinxocon added the Type: Question ❔ This is a question or a request for support. label Oct 24, 2017
@k4nar
Contributor Author

k4nar commented Oct 24, 2017

Indeed, that was what I meant by:

That would generate a Pipfile.lock with updates for only my-awesome-dep and its dependencies.

@brettdh

brettdh commented Oct 24, 2017

Agree 100% - and I'll go a bit farther: this should be the default.

That is, pipenv install foo should never touch anything besides foo and its dependencies. And pipenv lock should certainly never upgrade anything - it should just lock what's already installed.

AFAICT, this is how npm, yarn, gem, etc. work; it makes no sense to have a lockfile that doesn't actually lock packages, but trusts package authors to not break things in patch releases, and therefore upgrades them without being asked. I can see the use of allowing upgrades, but that should be opt-in, since it's more surprising than not upgrading them.

I apologize if I'm hijacking this issue for something else, but since this is so closely related to an issue I was about to create, I thought I'd start the conversation here. Feel free to tell me I should make a new one.

@brettdh

brettdh commented Oct 24, 2017

Just found this related issue as well: https://github.com/kennethreitz/pipenv/issues/418

Being able to specify pipenv install --upgrade-strategy=only-if-needed seems like what I'm looking for, though of course as I mentioned I think that should be the default, as it's becoming in pip 10. But being able to specify it semi-permanently via env var would be something, anyway.

I would be surprised if that change breaks anyone's workflow (famous last words), since it's more conservative than --upgrade-strategy=eager.

@brettdh

brettdh commented Oct 25, 2017

Tried to work around this by setting export PIP_UPGRADE_STRATEGY=only-if-needed in my shell config. This doesn't work, and pipenv lock exhibits these surprising behaviors:

  1. It "upgrades" packages that don't need to be upgraded (but...)
  2. It actually doesn't upgrade the installed versions! i.e. pip freeze and Pipfile.lock show different versions!

Guessing pipenv is delegating to pip for the install, and pip respects its environment variable settings, but pipenv lock doesn't.

@techalchemy
Member

@k4nar What happens right now that you are finding undesirable? Because if you upgrade a dependency that has cascading requirements, it will obviously have consequences for other dependencies. Are you suggesting some kind of resolver logic to determine the most current version of a specific package in the context of the current lockfile? I am hesitant to encourage too many hacks to resolution logic, which is already complicated and difficult to debug.

@brettdh I think I can shed some light, because you have most of the pieces. pipenv lock doesn't install anything, and it doesn't claim to. It only generates the lockfile given your host environment, python version, and a provided Pipfile. If you manipulate your environment in some other way (using pip directly, changing pip settings outside of pipenv, not using pipenv run, or running pip freeze outside a pipenv subshell), it is quite easy for the lockfile to be out of sync with pip freeze. The two aren't really related.

To be clear:

  1. Pipfile.lock is a strictly-pinned dependency resolution using the pip-tools resolver based on the user's Pipfile
  2. If you want to maintain strict pins of everything while upgrading only one package, I believe you can do this by strictly pinning everything in your Pipfile except for the one thing you want to upgrade (correct me if I'm wrong @vphilippon)
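
A hedged illustration of point 2, with hypothetical packages and versions in Pipfile (TOML) form:

    [packages]
    requests = "*"          # the one package allowed to move on the next lock
    aiohttp = "==2.3.10"    # everything else strictly pinned
    flake8 = "==3.5.0"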

As for your lockfile and pip freeze disagreeing with one another, I'd have to know more information, but I believe we have an open issue regarding our lockfile resolver when using non-system versions of python to resolve.

@k4nar
Contributor Author

k4nar commented Oct 26, 2017

@techalchemy : If I have a Pipfile.lock with A, B and C where B is a dependency of A, I would like to be able to update A and B without updating C, or C without updating A and B.
Again, of course, I can pin all my dependencies and their dependencies in my Pipfile in order to do that, but that would be a burden to maintain (like most requirements.txt files are).

@brettdh

brettdh commented Oct 26, 2017 via email

@techalchemy
Member

techalchemy commented Oct 26, 2017

Hm I see what you guys are saying. The premise of passing a setting to pip is not what I’m worried about, it’s resolving with pip-tools that concerns me. What does this behavior look like right now?

@brettdh

brettdh commented Oct 26, 2017

@techalchemy I mentioned the pip freeze difference as a shorthand for "the package versions that pipenv install installs differ from the package versions that pipenv lock saves to Pipfile.lock."

True, this only happens when I've changed pip's default args via environment variable; I was just pointing out that it was surprising that pipenv delegated to pip for installation but not for version locking; i.e. rather than locking what's installed, it locks what it thinks should be installed, potentially with unrequested upgrades.

Could you clarify your question a bit? I think "resolving with pip-tools" is referring to what pipenv lock is doing, and the reason it's not affected when I set pip defaults? And could you be more specific about what you mean by "this behavior"?

@vphilippon
Member

vphilippon commented Oct 26, 2017

@brettdh The locking mechanism includes a notion of "dependency resolution" that does not exist in pip. It's handled by pip-tools (or rather, a patched version of it, integrated in a special way by pipenv that brings a few differences with the original tool). In short, the locking mechanism reads the Pipfile and performs a full dependency resolution to select a full set of packages that will meet every constraint defined by the required packages and their dependencies.
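
For concreteness, a hedged example of what that step produces (hypothetical version, hash truncated):

    # Pipfile (input: abstract specifier)
    [packages]
    requests = "*"

    # Pipfile.lock (output: exact pin; excerpt)
    "requests": {
        "hashes": ["sha256:..."],
        "version": "==2.18.4"
    }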

@techalchemy

[...] it’s resolving with pip-tools that concerns me.

I'm not sure how those --upgrade-strategy options would affect pip-tools, because it works on some low-level internals of pip. I have the feeling this would not give the expected result, as these options take into account what's installed, and that's not what's being dealt with in that mechanism. But there is another approach to this in pip-tools that could be applied here.

The "original" pip-tools behavior is that it only updates what's is needed in the lockfile (in its context, its the requirements.txt), but this was "lost" in the way the resolver was integrated in pipenv. Let me explain why.

Pointing back to my summary of how pip-tools works: https://github.com/kennethreitz/pipenv/issues/875#issuecomment-337717817
Remember the "select a candidate" part? That's done by querying the Repository object.
In pipenv, we directly configure a PyPIRepository for the Resolver, but pip-tools does something else, it uses a LocalRequirementsRepository object, which keeps the existing pins from the previously existing requirements.txt (if found), and "fallbacks" on PyPIRepository.

So in pip-tools, the following happens when selecting a candidate:

  1. Query LocalRequirementsRepository for a candidate that matches foobar>=1.0,<2.0.
  2. Check if an existing pin meets that requirement:
    • If yes, return that pin as the candidate.
    • If not, query the proxied repository (PyPIRepository) for the candidate.
  3. Use the candidate returned.

Effectively, it means that existing pins are given a "priority" as candidate to try first.

But in pipenv, currently, it simply does this:

  1. Query PyPIRepository (directly) for a candidate that matches foobar>=1.0,<2.0.
  2. Use the candidate returned.

So, I think the same behavior for the locking in pipenv could be achieved by parsing the Pipfile.lock to get the existing pins and using a LocalRequirementsRepository, like pip-tools does in its pip-compile command.
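
A minimal Python sketch of that pin-priority idea (assuming the packaging library; names are hypothetical, and this is not pip-tools' actual LocalRequirementsRepository code):

    from packaging.requirements import Requirement
    from packaging.version import Version

    def find_best_match(req, existing_pins, index_lookup):
        """Prefer the already-locked pin when it still satisfies the specifier."""
        pinned = existing_pins.get(req.name)
        if pinned is not None and req.specifier.contains(str(pinned)):
            return pinned             # keep the lockfile stable
        return index_lookup(req)      # otherwise fall back to querying the index

    # Usage sketch: the existing pin 1.4 still matches, so it wins over 1.9.
    best = find_best_match(
        Requirement("foobar>=1.0,<2.0"),
        {"foobar": Version("1.4")},   # pins parsed from the existing lockfile
        lambda r: Version("1.9"),     # stand-in for a PyPIRepository query
    )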

@techalchemy
Member

@vphilippon do you have a sense of how difficult that would be to implement?

@vphilippon
Member

@techalchemy

  • Parsing the Pipfile.lock to extract the existing pins: I haven't looked at that. It depends on how things are structured in pipenv. We need a set of InstallRequirements that represent the pins in the Pipfile.lock.
  • Using LocalRequirementsRepository: fairly easy: swap our current PyPIRepository for a LocalRequirementsRepository.
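
A hedged sketch of the pin-extraction half, relying only on the Pipfile.lock JSON layout ("default"/"develop" sections whose entries carry a "version": "==x.y.z" field); turning these strings into actual InstallRequirement objects is left out, since that depends on pip internals:

    import json

    def existing_pins(lockfile_path="Pipfile.lock"):
        """Collect name==version pin strings from an existing Pipfile.lock."""
        with open(lockfile_path) as f:
            lock = json.load(f)
        pins = {}
        for section in ("default", "develop"):
            for name, entry in lock.get(section, {}).items():
                version = entry.get("version")   # e.g. "==2.18.4"
                if version:
                    pins[name] = name + version  # e.g. "requests==2.18.4"
        return pins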

But as I'm looking into this, and following @brettdh's comments, I realize a few things:

  1. The current default pipenv install behavior doesn't match the pipenv lock behavior. Doing pipenv install requests alone won't update requests if a new version comes out (much like straight pip install). However, doing pipenv lock will update the Pipfile.lock with the latest version of requests that matches the Pipfile specifier and the dependency constraints.
    There are two main ways to see this:

    • A) The Pipfile.lock should stay as stable as possible by default, not changing pins unless required, in order to mirror the current environment, and only change in the event that we change the environment.
    • B) The Pipfile.lock should get the newest versions that respect the environment constraints/dependencies, in order to freely benefit from the open ranges in the Pipfile and library dependencies, allowing you to continuously acquire new compatible versions in your environment. You can then run pipenv update to benefit from the fresh lock.

    IMHO, I would align the default behavior on A). Because right now, every time a lock is performed (i.e. after each installation), new versions can come in, which makes the lockfile drive the update of the environment, which seems weird. But this is arguable, of course. While in development, I might want to continuously update my requirements to not get stale, as in B), so that should also be easily doable.

  2. Even if we use LocalRequirementsRepository to avoid updating correct existing pins, and end up aligning the default behaviors, we then need to address the equivalent of --upgrade and --upgrade-strategy for the locking part. Currently, defining some environment variables (like PIP_UPGRADE and PIP_UPGRADE_STRATEGY) will affect the pipenv install behavior, but will not affect pipenv lock, as it doesn't affect the behavior of pip-tools (I confirmed that, as I was unsure at first).
    Otherwise, there will be no way to update the environment without either deleting the Pipfile.lock (feels clunky, and "all or nothing") or requiring a newer version (I mean doing an explicit pipenv install requests>2.18.4, which requires you to know that a new version is out, and changes the specifier in the Pipfile itself, increasing the lower bound), which is wrong. As the "original pip-tools" doesn't defer to pip to deal with this (it's not related to what is currently installed), it offers an option to specify the dependencies to update in the lockfile, and simply removes the pins for those packages (or all of them) from the existing_pins list, effectively falling back to querying PyPI. I'm not sure how we can match the notion of "--upgrade-strategy" with this.


@techalchemy
So while I was saying it was fairly easy to just "align the default behavior", I now realize that this would cause a major issue with being able to update packages at all (as in: just fetch the latest version that matches my current constraints).

If there's something unclear, ask away; a lot of editing went on while writing this.

(Dependency resolution is not easy. Good and practical dependency resolution is even worse 😄 )

@techalchemy
Member

@vphilippon that's exactly what I meant. Keeping the things that pip installs in sync with the things that pip-tools resolves is non-trivial unless you drive the process backwards, using the resolved lockfile to do the installation. I'm pretty sure that was why things were designed the way they were.

B) The Pipfile.lock should get the newest versions that respect the environment constraints/dependencies, in order to freely benefit from the open ranges in the Pipfile and library dependencies, allowing you to continuously acquire new compatible versions in your environment. You can then run pipenv update to benefit from the fresh lock.

This workflow can possibly work with the current configuration. You can use pipenv lock to generate a lockfile, but pipenv update will reinstall the whole environment. I'm pretty sure we can use one of our various output formats to resolve the dependency graph (we already have a json format, as you know) and only reinstall things that don't align with the lockfile. This might be more sensible, but I would be curious about the input of @nateprewitt or @erinxocon before making a decision.

@brettdh

brettdh commented Oct 27, 2017

@vphilippon Totally agree that A and B are desirable workflows in different situations. Some of your phrasing around B confused me a bit, though, seeming to say that pipenv lock might result in a lockfile that doesn't actually match the environment. I particularly heard this in the suggestion that one would need to "run pipenv update to benefit from the fresh lock", as if the lock is "ahead" of the environment rather than matching it.

Regardless of whether you are in an A workflow or a B workflow, a few things seem constant to me, and I think this squares with what @techalchemy is saying as well:

  • The result of pipenv lock should always be a lockfile that matches the environment.
  • The result of pipenv install should always be an environment that matches the lockfile.

I'm ignoring implementation details, but that's kind of the baseline behavior I expect from a package manager with a lockfile feature.

Running pipenv update periodically allows you to stay in B mode as long as you want everything to be fresh, and having the ability to pipenv install --upgrade requests would allow specific updates of one package and its dependencies, without unnecessarily touching packages that don't need to be upgraded.

Am I missing any use cases? I can think of optimizations for B - e.g. a flag or env var that tells it to always update eagerly - but I think that covers the basics. I also know I'm retreading ground you've already covered; it's just helpful for me to make sure I understand what you're talking about. :)

@techalchemy
Member

Some of your phrasing around B confused me a bit, though, seeming to say that pipenv lock might result in a lockfile that doesn't actually match the environment

@brettdh this is correct -- the pip-tools resolver we use to generate Pipfile.lock doesn't ask the virtualenv for a list of which packages are installed. Instead, it compiles a list of packages that meet the criteria specified in the list of pins from the Pipfile. Because the resolver itself runs using the system or outer python / pipenv / pip-tools install, we are doing some supreme fuckery to convince it to resolve packages with the same version of python used in the virtualenv. The assumption is that pip install would resolve things similarly, but that isn't always the case, although even I'm not 100% sure about that. But yes, pipenv lock is not generated based on the virtualenv; it is generated based on the Pipfile. It is a dependency resolution lockfile, not an environment state pin.

@ncoghlan
Member

As a potential resolution to this: something that pip itself currently supports, but pip-compile doesn't, is the notion of a constraints file.

A constraints file differs from a requirements file, in that it says "If this component is installed, then it must meet this version constraint". However, if a particular package in the constraints file doesn't show up in the dependency tree anywhere, it doesn't get added to the set of packages to be installed.

This is the feature that's currently missing from pipenv, as the desired inputs to the Pipfile.lock generation are:

  1. The updated Pipfile contents as a new requirements input file
  2. The full set of existing dependencies from Pipfile.lock as a constraints file, excluding the packages specifically named in the current command

Constraints file support at the pip-tools resolver level would then be enough for pipenv to support a mode where attempted implicit upgrades of dependencies would fail as a constraint violation, allowing the user to decide whether or not they wanted to add that package to the set being updated.
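
To make that concrete, here is how the idea maps onto pip's existing -c/--constraint option (hypothetical package names; the pip-tools/pipenv plumbing for this is exactly what's missing):

    # constraints.txt -- every pin from Pipfile.lock except my-awesome-dep
    aiohttp==2.3.10
    flake8==3.5.0

    $ pip install --upgrade my-awesome-dep -c constraints.txt
    # my-awesome-dep may move; anything else must keep its constrained
    # version, or the install fails as a constraint violation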

@kennethreitz
Contributor

currently not supported, thanks for the feedback

@taion

taion commented Nov 22, 2017

@kennethreitz

Do you mean:

  1. This behavior should be changed, but it's not currently a priority,
  2. This behavior should be added as something optional, but it's not currently a priority, or
  3. This behavior should not be added?

This is a sufficiently large inconvenience, given the inconsistency with how other similar locking package managers work, that it would be good to keep this open as a solicitation for PRs.

If instead it's (3), and this will not be added, then I think a number of us on the issue will need to adjust our plans for our choice of Python package management tools.

@kennethreitz
Contributor

I mean that this is currently not supported, and I appreciate the feedback.

@taion

taion commented Nov 22, 2017

I understand that it's not supported. Are you also saying that you would not accept PRs either changing this behavior or adding this as an option?

@kennethreitz
Contributor

I have no idea.

@brettdh

brettdh commented Nov 27, 2017

@k4nar still interested in doing a PR for this? Specifically, something like pipenv install --only <dep-to-update>, which prevents unrelated deps from being updated. Since @kennethreitz seems uninterested in discussing further, it seems to me that that's the only way to find out whether that behavior addition/change could be acceptable (and, by extension, whether folks like @taion and I can continue using pipenv).

@k4nar
Contributor Author

k4nar commented Nov 27, 2017

I'm interested, but I'm not sure what the best way to implement this would be. There are a lot of components in action (pip, pip-tools, pipfile, pipenv…) and probably a lot of possible solutions.

@tilgovi

tilgovi commented Aug 15, 2018

The --keep-outdated flag does not seem to be working as intended, as was stated when the issue was re-opened. Whether that behavior should or should not be the default, and how it aligns with other package managers, is not really the central issue here. Let's fix the thing first.

@alecbz

alecbz commented Aug 15, 2018

@brettdh

on reflection, it now occurs to me that this is what npm/yarn do by default when you install a package; they find the latest major.minor version and specify ^major.minor.0 in package.json, which prevents unexpected major version upgrades, even when an upgrade-to-latest is explicitly requested. I wonder if Pipenv should do the same - but that would be a separate issue.

Yeah that's along the lines of what I was trying to suggest in #966 (comment)
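
For reference, a sketch of the npm/yarn default being described (hypothetical package; ^1.3.0 allows >=1.3.0 <2.0.0, so no unexpected major version bumps):

    {
      "dependencies": {
        "left-pad": "^1.3.0"
      }
    }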

@benkuhn

benkuhn commented Aug 20, 2018

Really excited to hear this is being worked on!

In the meantime, does anyone have a suggested workaround that's less laborious and error-prone than running pipenv lock and hand-reverting the resulting lockfile changes that I don't want to apply?

@wichert

wichert commented Aug 20, 2018

@benkuhn Not that I know of - I do the same lock & revert dance all the time.

@benkuhn

benkuhn commented Aug 20, 2018

Ah ok, you can at least sometimes avoid hand-reverting:

  1. pipenv lock
  2. git commit -m "FIXME: revert"
  3. pipenv install packagename
  4. git commit -m 'Add dependency on packagename'
  5. git rebase -i
  6. Drop the FIXME: revert commit

Unfortunately it's still possible to create an inconsistent Pipfile.lock if your Pipfile.lock starts out containing a version of a package that's too old to satisfy the requirements of packagename, but perhaps pipenv will complain about this if it happens?

@jhrmnn

jhrmnn commented Aug 21, 2018

--keep-outdated seems to systematically keep outdated only the explicit dependencies that are specified (unpinned) in the Pipfile, while all the implicit dependencies are updated.

@max-arnold

Am I correct that it is not possible to update/install a single dependency using pipenv==2018.7.1 without updating other dependencies? I tried different combinations of --selective-upgrade and --keep-outdated with no success.

Editing Pipfile.lock manually is no fun...

@mrsarm

mrsarm commented Aug 28, 2018

Same as @max-arnold: it's my first day using the tool in an existing project, and I have to say I'm really disappointed. Before I started to use it, I checked the doc site and the video demo, and it looked impressive to me, and now this: in a real project, working with pip or pipenv is almost the same. I don't see the point. Like many others have said in this thread, if I have a lock file, why are you updating my other dependencies when there is no need to update them?

Of course, if the update is mandatory, it's OK to update all the necessary dependencies, but just those, not all the outdated ones.

Also, it's not clear what the options --selective-upgrade and --keep-outdated are useful for; there is another issue highlighting this here (install's --keep-outdated does not seem to be respected #1554), and nobody has been able to explain what these options do, which is incredible.

But my major disappointment is that this package was recommended by the official Python documentation itself; these recommendations should be conducted more carefully. I know this can be a great project in the future, it has a lot of potential, but simple things like this (we are not talking about a bug or a minor feature) make this project not eligible for production environments. Yet suddenly, because it was recommended by the Python docs, everybody is trying to use it, instead of looking for other tools that may work better, or just sticking with pip, which doesn't solve these issues either, but at least is very minimalist and is included in almost every environment (it does not add extra dependencies).

@uranusjr
Member

uranusjr commented Aug 28, 2018

@mrsarm Thank you for your opinion. Sorry things don’t work for you. I don’t understand where the disappointment comes from, however. Nobody is forcing Pipenv on anyone; if it doesn’t work for you, don’t use it. That is how recommendations work.

Your rant also has nothing particularly to do with this issue. I understand it requires a little self-control not to dump trash on people when things don't go your way, but please show some respect, and refrain from doing so.

@mrsarm

mrsarm commented Aug 28, 2018

@uranusjr it's not trash, it's an opinion, and sometimes it's not an option, like in my case, where somebody chose pipenv to create a project that I've now started to work on, and I have to deal with this.

But things got worse just now, and what I'm going to say is not an opinion, it's a fact.

After setting aside one dependency to avoid dealing with this issue (because it's a dev dependency, I created a second environment with pip and the old requirements-dev.txt approach, just for that tool), I needed to add another dependency.

The new dependency is PyYAML, let's say the latest version. If you install it in any new environment with pip, you will see that the library does not pull in any dependencies, so only PyYAML is installed; it's that simple in these cases with pip. But adding the dependency with Pipenv (because a project that I didn't create is managed with Pipenv), the same issue happened: despite PyYAML having no dependencies, and no older version of it being previously installed in the project, pipenv updated all my dependencies in the lock file and the virtual environment. I don't want to update the other dependencies; I just want to add a single new module without any dependencies.

So the conclusion (and again, this is an opinion, unlike the fact that pipenv broke all my dependencies) is that Pipenv, instead of helping me deal with dependency management, turns it into hell.

@dfee
Contributor

dfee commented Aug 28, 2018

I've followed this thread for months, and I think any real project will ultimately stumble upon this issue, because the behavior is unexpected, counter-intuitive, and yes: dangerous.

About a month ago I tried out a more-comprehensive alternative to pipenv, poetry; it solved the problems I needed to solve:

  1. managing one set of dependencies (setup.py, setup.cfg, pip, and pipfile -> pyproject.toml)
  2. future oriented, backwards-compatible (again pyproject.toml)
  3. fairly un-opinionated (no i'm really not asking to install redis)
  4. and the solution to the classic Pipenv problem: "Also, you have to explicitly tell it [pipenv] to not update the locked packages when you install new ones. This should be the default." [1] [2]

I weighed sharing these thoughts on the pipenv issue, but as @uranusjr said, "no one is forcing Pipenv on anyone", and I'm not forcing Poetry. I like it, it works well, and it solves my problems, but I'm just sharing an alternative, more-comprehensive solution to the problem I was having. Just take all that as my 2¢.

  • as a disclaimer, I am not a member of the Poetry team or affiliated with them.

p.s. I think the concern about Pipenv being the "official" solution is due to its first-class integrations. What you, @uranusjr, might see as a simple recommendation, the industry at large is taking as the "blessed approach going forward". Frankly, that recommendation is more authoritative in the community than certain PEPs that have been around for more than a year.

@techalchemy
Member

Nobody is forcing you to participate in our issue tracker; if you don’t have a productive comment please find a forum that is not for triaging issues.

For users who are interested in trying the alternative resolver @uranusjr and I have been implementing for several weeks now, please try out https://github.com/sarugaku/passa, which will generate compatible lockfiles. Poetry does a lot of different things, but it also has limitations and issues of its own, and we have a design philosophy disagreement about scope.

This is a project we manage in our spare time; if you want to see something fixed or you have a better approach, we are happy to accept contributions. If you are here to simply tell us we ruined your day and your project, I will ask you only once to see yourself out.

We have not forgotten or ignored this issue; we have a full implementation of a fix in the resolver linked above. Have patience, be courteous, or find somewhere else to talk. To those who have been waiting patiently for a fix, please do try the resolver mentioned above; we are eager to see if it meets your needs. It implements proper backtracking and resolution, and should handle this upgrade strategy.

In the shorter term, I think we can get a band-aid for this into pipenv if we don't wind up cutting over first.

@techalchemy
Member

@dfee I am not really sure that blurring lines between applications and libraries is the correct answer to dependency management, so I don’t see poetry’s approach as an advantage. I wasn’t involved in whatever your issue was with the recommendation engine, but we took that out some time ago...

@sdispater

@techalchemy

I am not really sure that blurring lines between applications and libraries is the correct answer to dependency management, so I don’t see poetry’s approach as an advantage.

Why though? I never understood this idea that you should manage the dependencies of a library and an application differently. The only difference between the two is the lock file, which is needed for an application to ensure a reproducible environment. Other than that, it's the same thing. This is the standard in most other languages; Python seems to be the exception here for some reason, and this is bad from a user experience standpoint, since it makes things more complex than they should be.

it also has limitations and issues itself

Which ones? I am really curious about the issues or limitations you encountered while using Poetry.

@mrsarm

mrsarm commented Aug 29, 2018

My apologies for being so rude. Now, reading my comments again, I realize that despite the info I provided, and although some of my opinions are still valid (IMHO), the way I wrote what I wanted to say wasn't appropriate.

I understand that the issue tracker is mostly a place to discuss bugs and improvements, and whether this is a bug or an error by design is not clear in the thread, but again, my apologies.

I think there are two strong topics here:

  • Should pipenv update all your outdated dependencies when you are just trying to install a new dependency: the ones that don't need to be updated because the new package/version we are trying to install can work with the existing dependencies, and even the ones that aren't dependencies of the new package we are trying to install? Maybe this is out of scope for this ticket, but it's a really important topic to discuss.
  • Does one of the parameters --keep-outdated or --selective-upgrade allow us to avoid this behaviour? It's not clear what these options do, there is a lack of documentation about them, and even in the related issue (install's --keep-outdated does not seem to be respected #1554) nobody is answering that.

If it is a bug in one of these params (--keep-outdated, --selective-upgrade), I still think that not making the param that avoids the unnecessary update of dependencies the default is a really bad idea.

To compare with a similar scenario, imagine that you execute apt-get install vim to install just the vim tool on your system (plus the necessary vim dependencies or updates, if applicable), but imagine that in this situation apt also updates all the other dependencies of your system: python, the QT system, the Linux kernel... and so on. It's not that apt shouldn't allow us to update other dependencies, but there is a clear command to do that: apt-get upgrade, while apt-get install PACKAGE just installs/updates PACKAGE and its dependencies.

@techalchemy
Member

@sdispater the distinction is at the heart of every disagreement we've ever had and it's incredibly subtle but I'd point you at https://caremad.io/posts/2013/07/setup-vs-requirement/ or a good article for the elixir use case: http://blog.plataformatec.com.br/2016/07/understanding-deps-and-applications-in-your-mixfile/

pyproject.toml isn't really supported for library definition metadata -- and not at all by any version of pip that doesn't implement PEPs 517 and 518 (both of which are still having implementation details worked out) as an authoritative library declaration file. setup.cfg exists for that purpose (the actual successor to setup.py), and IMHO both of those should really be supported. A library is published and intended for consumption with abstract dependencies so that it can play nice in the sandbox with others; applications are usually large, complex beasts with sometimes hundreds of direct dependencies. So one of our main divergences is that when we design and build our tooling, we take this into account as well.

@mrsarm For your first question: the update behavior was intentional (and was discussed extensively at the time, /cc @ncoghlan, and related to OWASP security concerns). On the second point: the behavior is currently not properly implemented, which is why the issue is still open, and which led us to rewrite the backing resolver behind pipenv, as I mentioned above. It simply didn't support this behavior. --selective-upgrade is supposed to selectively upgrade only things that are dependencies of the new package, while --keep-outdated would hold back anything that satisfies the dependencies required by a new package. Slightly different, but I am fairly sure neither works correctly right now.

@mmerickel

pyproject.toml isn't really supported for library definition metadata -- and not at all by any version of pip that doesn't implement PEPs 517 and 518 (both of which are still having implementation details worked out) as an authoritative library declaration file. setup.cfg exists for that purpose (the actual successor to setup.py), and IMHO both of those should really be supported.

Well this is certainly off topic but it's an important discussion so I can't help myself.

There is actually no standard around setup.cfg right now other than the conventions established by distutils and setuptools. pyproject.toml is absolutely for library metadata, as the successor to setup.py; otherwise the community would have placed build requirements in setup.cfg instead.

pyproject.toml describes how to build a project (PEP 518), and part of building is describing metadata. I'm NOT saying that pyproject.toml needs a standard location for this metadata, but PEP 518 uses this file to install a build tool and from there it's very reasonable to expect that the build tool will use declarative configuration from somewhere else in the file to determine how to build the project.

Anyway, going back to pipenv vs poetry - there seems to be some idea floating around that applications don't need certain features that libraries get, like entry points, and this is just incorrect. It should be straightforward for an application to be a python package.

The only true difference between an application and a library, in my experience with python and with other ecosystems, is whether you're using a lockfile or not. Of course there's a third case where you really just want a requirements.txt or Pipfile and no actual code, and that seems to be all that pipenv has focused on so far (pipenv install -e . falls into this category, as pipenv is still afraid to try and support the package metadata). Unfortunately, while the design of pipenv is cleaner with this approach, it's also way less useful for most applications, because PEP 518 decided to punt on how to install projects in editable mode; so, in order to continue using pipenv, we will be stuck on setuptools quite a while longer, as you cannot use pyproject.toml to switch away from setuptools and still use pipenv install -e ..

@techalchemy
Member

There is actually no standard around setup.cfg right now other than the conventions established by distutils and setuptools. pyproject.toml is absolutely for library metadata, as the successor to setup.py; otherwise the community would have placed build requirements in setup.cfg instead.

Distutils is part of the standard library and setuptools is installed with pip now, so saying that there is no standard is a bit silly. Not to mention it uses the standard outlined in PEP 345 for metadata, among others, and can also be used to specify build requirements.

the community would have placed build requirements in setup.cfg instead.

Do you mean the PEP authors? You can ask them why they made their decision; they outline it all in the PEP.

pyproject.toml describes how to build a project (PEP 518), and part of building is describing metadata. I'm NOT saying that pyproject.toml needs a standard location for this metadata, but PEP 518 uses this file to install a build tool and from there it's very reasonable to expect that the build tool will use declarative configuration from somewhere else in the file to determine how to build the project.

This came up on the mailing list recently -- nothing anywhere has declared a standard around pyproject.toml other than that it will be used to declare build system requirements. Anything else is an assumption; you can call that "library definition metadata", but it isn't. Try defining only a build system, with no additional information about your project (i.e. no PEP 345-compliant metadata), upload it to PyPI, and let me know how that goes.

Anyway, going back to pipenv vs poetry - there seems to be some idea floating around that applications don't need certain features that libraries get, like entry points, and this is just incorrect. It should be straightforward for an application to be a python package.

Who is saying that applications don't require entry points? Pipenv has an entire construct to handle this.

so in order to continue using pipenv we will be stuck on setuptools quite a while longer as you cannot use pyproject.toml to switch away from setuptools and still use pipenv install -e .

Not following here... we are not going to leave pip vendored at version 10 forever, I've literally been describing our new resolver, and the actual installer just falls back to pip directly... how does this prevent people from using editable installs?

@digitalresistor
Contributor

digitalresistor commented Aug 29, 2018

This came up on the mailing list recently -- nothing anywhere has declared a standard around pyproject.toml

That's correct, it is not a "standard"; yet in that same thread it was recognised that by calling it pyproject.toml, they likely invited people to use this file for other project-related settings/config.

So by the same logic you invoked here:

Distutils is part of the standard library and setuptools is installed with pip now, so saying that there is no standard is a bit silly.

pyproject.toml is a standard, and the community has adopted it as the standard location to place information related to the build system, and other parts of a Python project.

@digitalresistor
Contributor

Not following here... we are not going to leave pip vendored at version 10 forever, I've literally been describing our new resolver, and the actual installer just falls back to pip directly... how does this prevent people from using editable installs?

PEP 517 punted on editable installs... which means there is no standard way to install a project in editable mode if you are not using setuptools (which has a concept known as develop mode that installs the project in editable mode).

@mmerickel

This comment has been minimized.

@techalchemy
Member

techalchemy commented Aug 29, 2018

@bertjwregeer:

pyproject.toml is a standard, and the community has adopted it as the standard location to place information related to the build system, and other parts of a Python project.

Great, and we are happy to accommodate sdists and wheels built using this system, and until there is a standard for editable installs we will continue to use pip to build sdists and wheels and handle dependency resolution that way. Please read my responses in full. The authors and maintainers of pip, of the PEPs in question, and myself and @uranusjr are pretty well versed in the differences between editable installs and the implications of building them under the constraints of PEPs 517 and 518. So far, all I'm seeing is that the PEPs in question didn't specifically address how to build them because they leave it up to the tooling, which for some reason everyone takes to mean that pipenv will never be able to use anything but setuptools.

I've said already this is not correct. If you are actually interested in the implementation and in having a productive conversation, I'm happy to have that. If you are simply here to say that we don't know what we're doing, without being interested in first learning what it is we are doing, this is your only warning. We are volunteers with limited time, and I am practicing a zero-tolerance policy for toxic engagements. I do not pretend my work is perfect and I don't pretend that pipenv is perfect. I will be happy to contribute my time and effort to these kinds of discussions; in exchange, I ask that they be kept respectful, that they stick to facts, and that those who participate also be willing to learn, listen, and hear me out. If you are here just to soapbox, you will have to find another platform; this is an issue tracker. I will moderate it as necessary.

This discussion is wildly off topic. If anyone has something constructive to say about the issue at hand, please feel free to continue that discussion. If anyone has issues or questions about our build system implementations, please open a new issue. If you have issues with our documentation, we accept many pull requests around documentation and we are aware it needs work. Please defer all of that discussion to new issues for those topics. And please note: the same rules will still apply -- this is not a soapbox, it is an issue tracker.

@mmerickel

https://pipenv.readthedocs.io/en/latest/advanced/#custom-script-shortcuts
Please read the documentation.

Entry points are a more general concept than just console scripts, and this link does not address those concerns at all. <soapbox>Ban away - you're not the only maintainer of large open source projects on here, and none of my comments have been a personal attack on you or the project. People commenting here are doing so because they want to use pipenv and appreciate a lot of what it does. My comment was not the first off-topic post on this thread, yet it is the only one marked. Your snarky comments indicating that you think I don't know what I'm talking about are embarrassing and toxic.

@techalchemy
Member

techalchemy commented Aug 29, 2018

In the project we maintain, we can soapbox. And yes, pip will support all compliant build systems, which, as you both seem to understand full well, will produce consumable metadata; and as pipenv uses pip as the backing tool to drive its installation process, as I described, yes, pipenv will support all compliant tooling. I already said this.

So yeah, please take your toxicity somewhere else. Your attitude is not welcome here. Final warning. Persistent attempts to incite conflict won’t be tolerated.

@pypa pypa deleted a comment from digitalresistor Aug 29, 2018
@pypa pypa locked as off-topic and limited conversation to collaborators Aug 29, 2018
@matteius
Member

matteius commented Jun 16, 2022

There is too much to parse here and I would just be echoing the sentiment that:

There's a little thing to take into account here: changing a single dependency could change the overall set of requirements.
Ex: updating foo from 1.0 to 2.0 could require updating bar to >=2.0 (while it was <2.0 before), and so on.

or that

upgrade a dependency that has cascading requirements obviously it will have consequences for other dependencies

In fact -- we have pretty great resolution of dependencies now: it will update, within the constraints you specify in your Pipfile, to the latest versions allowed where all specifiers and constraints are met, or otherwise it will error out and let you know that you cannot do that. There is also the --skip-lock flag that might be helpful for some.
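
For example (a hedged sketch; command and flag availability varies across pipenv releases, so check pipenv --help on your version):

    $ pipenv update requests        # re-lock and upgrade requests within the Pipfile constraints
    $ pipenv install --skip-lock    # install from the Pipfile without regenerating Pipfile.lock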

Anyway, I am closing this one down for the history books. If there is anything relevant in this thread that is not addressed or needs a new issue, feel free to open one, but keep in mind that the current behavior is correct with respect to dependency resolution and keeping the constraints of the Pipfile happy and satisfied.
