Add nuget level option for "resolve highest" #1192
Packages should depend on a minimum version which is known to work. Assuming a future version will always work often leads to problems over time when the dependency is owned by a different author and it goes through breaking changes. Project.json allows floating versions to solve the problem of always needing the highest, but only the user can set this. If each package could specify its own floating behavior for a dependency it would quickly get out of hand and lead to behavior conflicts when packages had different settings for the same package. |
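For context, floating versions in project.json look roughly like this; the package IDs and versions are illustrative, and exactly which floating forms are accepted depends on the NuGet version in use:

```json
{
  "dependencies": {
    "Fody": "1.29.*",
    "SomeOtherPackage": "2.0.0-*"
  },
  "frameworks": {
    "net46": {}
  }
}
```

On each restore NuGet picks the highest available version matching the pattern, which is the user-controlled floating behavior referred to above.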
How would this get out of hand? Isn't this very similar to a user setting? Either way, resolving to "the highest in a dependency range" is most likely what people want. For example, I manage several Fody nugets https://www.nuget.org/profiles/simoncropp. I regularly get people saying
Each time I respond with
So I would much prefer that the default resolution were "HighestMinor". That is the most likely to be correct from a consumer's perspective. Failing that, I would like to control dependency resolution at the nuspec level. |
also
I think you are conflating "what a package says is its minimum" and "what is a sensible approach when first installing dependencies". Assume a package says it is compatible with version 1.0 from 2 years ago and there is a 1.0.15 from 2 weeks ago. So there are at least 15 bug fixes in that dependency. Why would you resolve the clearly out-of-date 1.0 version? Also, assuming that package follows SemVer, the highest 1.x should be resolved. |
I see that many of the Fody packages do not specify a version range for the Fody dependency. This is going to cause users to get the oldest available version. You can fix this by setting the minimum version to the current release of Fody when releasing new packages. Users typically want their packages to "just work", and for most packages that happens when using the minimum versions specified, because that is what the package author created and tested their package with. NuGet encourages packages to follow the SemVer rules, but this isn't always the case, and even when a package does use SemVer a "bug fix" from one author can end up changing behaviors that break packages from another author depending on them.

In your scenario, if the 1.0.15 package broke the parent package, the other author would need to go in and release another package with a maximum version of 1.0.14 just to keep new installs working. Users can set the flag you mentioned, and it is also available in a UI drop-down to make it easy for users who want to install the latest dependencies to get just that. For your packages it makes sense to always take the highest version, since you also own the dependency package and are careful about not breaking things, but for the majority of packages this isn't the case. |
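For reference, the suggested fix maps to something like this in the addin's nuspec; the package ID and version number are illustrative, and in NuGet a bare dependency version means "this version or higher":

```xml
<!-- illustrative nuspec fragment for a Fody addin -->
<dependencies>
  <!-- a bare version is treated as an inclusive minimum, so new installs get at least 1.29.4 -->
  <dependency id="Fody" version="1.29.4" />
  <!-- an explicit range such as "[1.29.4, 2.0.0)" would also cap the upper bound -->
</dependencies>
```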
No, the NuGet practice of resolving to the oldest causes this.
Yes, and this results in the unmaintainable scenario of deploying a new nuget every time its dependencies deploy, just to update the version range.
This is an assumption that is plainly false, since people will never get bug fixes for dependencies. It is basically a stance that ignores SemVer.
Yes, this is true, but this is a problem for the people managing packages to handle, not NuGet. For example http://docs.particular.net/nservicebus/rabbitmq/rabbitmqclient-nuget
Yes, but this is a setting they need to switch every single time they install a package. It is also a setting they need to remember when using the command line.
What are you basing this on? For all the packages I consume (with the exception of the RabbitMQ client) I want the newest dependencies. |
I'm with @SimonCropp on this. I just recently went through the unnecessarily gargantuan task of setting up a new framework within a few of our products at my job, and ran into many cases where, because of this "oldest" default, I was installing ancient versions of NuGet packages, or, in cases where I had already installed a dependency, multiple different versions of the same package. I ended up doing a mass update run, and was forced to reset the "Highest Minor" switch every time I went into a new project.
I'd point out that this might make a good case for NuGet users not to default to "Highest Minor", but package owners are probably much more sure about how the packages they depend on are versioned.
Agreed. With the exception of some edge cases (jQuery), I usually want the newest minor version always, and the newest major version often.
I would argue then that this is the author's responsibility. I would also argue for allowing a user to override a package's dependency resolution hint to make it more conservative if they so desire. I'd argue that making it more liberal (highest minor/patch) is more often desirable than making it more conservative. |
@RichiCoder1 Would setting the default dependency behavior in your nuget.config have helped? Or is the global setting too broad for the packages you were installing?
If a new package specified that an already installed package should be the highest version, would you expect it to force an update? What if the user installed that specific version manually? |
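For reference, the nuget.config setting being referred to is, to the best of my knowledge, the dependencyVersion key, which applies to packages.config-style installs; the value shown is only an example:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- How install resolves dependency versions when none is pinned:
         Lowest (default), HighestPatch, HighestMinor, or Highest -->
    <add key="dependencyVersion" value="HighestMinor" />
  </config>
</configuration>
```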
If that setting worked it would mitigate it. Unfortunately I have never gotten that to work; my nuget.config has had this setting for ages
and from my NuGet "dialog", so clearly this is not true.
But this only fixes the problem for me, not for the many people who use the ~80 packages I manage. As I said, I want to reduce the occurrence of "your nuget is broken since it resolved a version with bugs". |
It would have been a little too broad, but I'll admit I didn't even know that config setting existed. That's a little off the beaten path and out of the way of the happy path.
Hmm. That's actually a good question. If they explicitly said that they want the highest when installing a dependent package, I would, but if it was unspecified I'd just keep the installed version as long as it fit the dependency version check. Though that also brings up the argument of being able to "pin" package versions so a package has to be explicitly and manually updated. |
Defaulting to the oldest known compatible version is the safest resolution a package manager can do when it doesn't know for sure that package authors apply SemVer (which isn't an exact science either; mistakes happen, assumptions are...). When such mistakes happen, all packages depending on it are affected when choosing highest-version resolution, and people will blame the tool for not protecting against it (I believe this happened in the past?). Consumers have the experience of dealing with specific packages and have the option to choose differently, typically before the install operation starts. Providing the author with the option to define resolution behavior is a dangerous thing to do if you depend on other packages you don't control, and it will ultimately break consumers if one of those gets updated to a breaking change without reason (you might not have changed the minimum version of your dependency, but the resolution behavior you defined would overrule that). I'm sure you'd want NuGet to respect that constraint instead?
This is something I believe NuGet can be smarter about. See this issue I logged when experiencing the same unconsolidated install when newer dependencies were already installed in the solution: #835. |
Which is why I would argue for allowing the author to hint a better default if they're certain that the 99% case is that it'll be better for the user, but still allow the user to override that hint should they wish.
That would be awesome, and would resolve my specific issue. Managing packages across projects, especially when updating them or installing new packages, feels very messy at the moment. |
@SimonCropp good catch, that looks like a UI bug to me. Please copy that into a new issue so it can get fixed.
This would help new installs, but at some point the user needs to update their Fody package themselves. For example, if they install BasicFodyAddin.Fody today they get the current highest version of Fody, but if Fody is updated tomorrow they are once again missing out on bug fixes. project.json solves this by allowing users to float to the highest version of a package on every restore/build. |
I would also consider that a bug if enabled by default. It means that OOTB I can get different build behavior with no code changes. |
This always confuses me. People shouldn't have to narrow the range of versions they support just to avoid people getting buggy versions. If a jQuery plugin supports "every version of jQuery ever", it would never make sense to install an old, buggy version. It seems valid for a package to want to express a preference that is narrower than the supported range. Dragging the minimum supported version forwards will only make resolution harder. The smaller the ranges of supported versions, the more likely you'll end up with an unresolvable set of dependencies. |
I would be happy with the setting being a "soft recommendation" |
@DanTup yeah, for most of the Particular packages we have adopted a "current minor range", e.g. this https://www.nuget.org/packages/nservicebus.autofac is
But my issue is that clearly, when someone installs that package, we want the newest minor of both those dependencies. |
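To make "current minor range" concrete, a dependency pinned that way looks roughly like this in a nuspec; the package ID and version numbers here are purely illustrative:

```xml
<dependencies>
  <!-- accept any 6.0.x patch release, but nothing from 6.1 onwards -->
  <dependency id="NServiceBus" version="[6.0.0, 6.1.0)" />
</dependencies>
```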
Users have to opt into the floating behavior. |
Installing a minimum version that is not buggy is the only sane way to build packages, particularly when these packages are resolved automatically. If you want to build both an opinionated and a not-so-opinionated package, you can do it today by building a core package whose implementation depends on the minimal working/tested dependency, and another meta package that always pulls in the latest/most-tested package they work together with. Practically speaking, pulling in the latest possible version of a package is very dangerous.
|
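A minimal sketch of that core/meta split, with hypothetical package IDs and versions:

```xml
<!-- MyLib.Core.nuspec (hypothetical): conservative, depends only on the minimum tested version -->
<dependencies>
  <dependency id="SomeDependency" version="1.0.0" />
</dependencies>

<!-- MyLib.nuspec (hypothetical meta package): re-published whenever a newer dependency version is validated -->
<dependencies>
  <dependency id="MyLib.Core" version="[1.0.0]" />
  <dependency id="SomeDependency" version="[1.0.15]" />
</dependencies>
```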
@yishaigalatzer you have closed this without addressing many of the concerns people have raised.
I never asked for that. I asked for "highest version within the dependency range". In most cases this would effectively be HighestMinor. This is the most sensible default for packages that follow SemVer, which as I understand it is something NuGet is trying to adopt/push.
No, in the current NuGet approach dependencies are all resolved at install time. There is no "might" involved.
This stance ignores what people are actually forced to do at the moment, which is to follow every install with an update to check whether any dependencies are buggy.
I see no way of doing this. Can you please provide some more details or point to some documentation? So please re-open this issue. |
A package author can still ask for a range in this case that goes past HighestMinor.
That's incorrect: for both ASP.NET 5 and project.json scenarios in NuGet 3, dependencies are resolved at restore time, and they are resolved as a whole. That means this feature would make builds non-repeatable. It also means that, even in the install case, it makes the behavior unpredictable.
In Visual Studio 2015 we introduced dependency control, where the user can pick how far they want to go with dependencies. This solves the issue, but forces the user to do it.
You can build a core package
Let's continue the discussion. Opening means we plan to fix it; at this point, based on everything I see, I still think we are not taking a fix for this, but it is worth continuing the discussion. |
I could not agree more with @SimonCropp; there is nothing inherently safer about resolving to the lowest than there is about resolving to the highest. Not resolving to the latest and greatest because it might introduce bugs ignores the fact that the latest and greatest might also fix a thing or two. Making an assumption either way is bad, but assuming the latest version breaks things causes folks to stay on older versions and creates more overhead for OSS library developers.
This is not true, and it is why I am over the moon to find out today that project.lock.json will be split in two. I would even argue one step further: I can pin my direct package versions but not transitive ones:
Given LibC released, "But what if LibC does not adhere to SemVer?" is the same question as "But what if it does?". We have to break one scenario, so why not design for best practices? |
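For what it's worth, one pattern that does exist (this is general NuGet behavior, not something proposed in this thread): a consuming project can pin or lift a transitively acquired package by adding a direct reference to it, since the direct reference wins over the transitively resolved version. Package names and versions here are illustrative:

```xml
<ItemGroup>
  <!-- LibB is my direct dependency; it pulls in LibC transitively -->
  <PackageReference Include="LibB" Version="1.0.0" />
  <!-- adding LibC directly overrides the transitively resolved version, e.g. to pick up a bug fix -->
  <PackageReference Include="LibC" Version="1.0.1" />
</ItemGroup>
```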
Can we re-start this discussion? I think the key point being made here is that package authors are asking for an opt-in default behavior for packages that know they're keeping strict compat to a patch update. The problem is made somewhat worse with project.json as transitive references are not always obvious and are hidden by default. This can lead to security vulnerabilities that a NuGet package author has NO ability to patch (unlike Microsoft with WU).

Suppose I have a 1.0.0 lib that was later found to contain a serious vulnerability. Now as a responsible author, I fix it and release 1.0.1. I'd previously marked my package to suggest that NuGet take the highest patch. I'm in reasonably good shape as I can be sure that the consumers that use my library transitively will automatically get the fix. Now look at it without that. I release 1.0.1 and consumers that get my library transitively won't even know that they need to update and will be vulnerable. This is really bad too.

I get what everyone is saying about assumptions and making guesses, and no one is saying this should be default behavior. It should be opt-in. NuGet can then disclaim all responsibility for any future "breaks". NuGet is the delivery vehicle; blame should lie with the package author. I'd really like to see this re-opened with a discussion going around how this can be addressed. |
We are tracking a separate mechanism to track vulnerability or deprecation. This is not it. |
Would love to hear your thoughts / plans for that |
The plan is to enable marking a package as deprecated by the owner, and in turn package restore or install will barf and guide you to the new package. |
@SimonCropp we are coming up with ideas that could enable this scenario; stay tuned for when we come up with the lineup design docs. |
@yishaigalatzer awesome, and thanks. Let me know if you need a beta tester? |
We are a couple months out, this is a pretty fundamental feature we want to add, and this feature can be built on top of it. But I will definitely ping you when it becomes a thing. |
🎉 🍰 thanks for revisiting this ticket! |
Is this the latest issue for this feature? I understand that lockfiles address the repeatable builds aspect. |
I am going to take a different but safer position on this: leave this up to the user. Why on earth should Microsoft FORCE users to adopt the "safest is best" strategy in the first place? Is there any reason why we cannot add a switch to nuget.exe and NPM that allows the user to decide their own package restore strategy? This way consumers of NuGet can align NuGet behavior with their own business strategies, be they conservative or aggressive.

What I don't understand is why Microsoft has to decide this for consumers in the first place. To me this is a straight-up consumer decision based on private business strategy and policy. It makes absolutely no sense why a tool vendor should allow themselves to unilaterally decide business strategy for all NuGet consumers on the planet without an option to override that.

It's been two years and there's no apparent movement on this issue. Leave lowest version as the default strategy if you want, Microsoft, but for goodness' sake offer a switch to allow consumers to make their own business decisions. |
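For what it's worth, a consumer-side switch does exist for packages.config-style installs (Some.Package below is a placeholder); as far as I know it does not affect PackageReference restore, where the lowest applicable version is always chosen:

```
nuget install Some.Package -DependencyVersion Highest
```

The same behavior can be set once via the dependencyVersion key in NuGet.Config, as sketched earlier in the thread.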
@emgarten Was this fixed/planned? |
Why is this closed? I am confused. |
This is ridiculous. NuGet dev response: "It's by design" |
@erikpowa yeah, I fought the good fight and lost. |
And here I am facing the same issue. Except it is 10x worse, because one can't really remove a version from NuGet. Once you have uploaded PKG-1.0.0 to NuGet.org and somebody makes a Consumer-1.0.0 that requires PKG [1.0, 2.0), there is no way to have even fresh installs of Consumer-1.0.0 use the fixed version PKG-1.0.1, because NuGet will restore PKG-1.0.0 despite it being unlisted. I am unsure what to make of it: if you need repeatable builds, use lock files. |
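On the repeatable-builds point, NuGet's lock file support is opt-in per project; a minimal sketch:

```xml
<PropertyGroup>
  <!-- Generates and honors packages.lock.json so restores are repeatable -->
  <RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>
</PropertyGroup>
```

On CI, `dotnet restore --locked-mode` can then be used to fail the build if the lock file no longer matches the project.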
Well, looks like @yishaigalatzer has moved on from Microsoft. Perhaps time to revisit? @SimonCropp? I'm completely in agreement with @lostmsu that repeatable builds are established using lock files, not by trying to do both with a single step. |
@clairernovotny - hey... now that you're the dotnet boss can we get an executive decision on this? 🥇 |
I develop a library that uses the NuGet package Microsoft.EntityFrameworkCore.SqlServer 3.1.

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="[3.1, 3.2)" />
  </ItemGroup>
</Project>
```

But when my library is installed in a project, it resolves Microsoft.EntityFrameworkCore.SqlServer 3.1.0, which is an old version. Ideally (in my case) NuGet would resolve the highest version of the range defined by a dependency. Maybe this could be:

```xml
<PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="[3.1, *, 3.2)" />
```

I understand that a new version of Microsoft.EntityFrameworkCore.SqlServer 3.1 may cause a regression.

```xml
<PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="[3.1, 3.1.22, 3.2)" />
```

It explicitly says:
|
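As a point of comparison, the floating syntax that exists today applies, to the best of my knowledge, only to the consuming project's own PackageReference, not to the dependency range a library publishes:

```xml
<!-- in the consuming application's project file: resolves to the highest available 3.1.x at restore time -->
<PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="3.1.*" />
```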
So, how do I get NuGet to pull in the highest version for transitive dependencies these days? |
@CumpsD you need to manually keep track of them |
Right, no practical solution then :) I've migrated everything to Paket and it's solved there too ;) They allow you to specify a resolution strategy. |
I want a way for a NuGet package to say it wants the "highest version within the dependency range".
There are many packages where this makes the most sense, for example bootstrap packages that help people get started: https://www.nuget.org/packages/NServiceBus.Bootstrap.WindowsService
It really should be up to the package author how the dependency is resolved.
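Purely as an illustration of the request (the resolve attribute below does not exist in NuGet; it is hypothetical), the nuspec-level hint being asked for might look like this:

```xml
<dependencies>
  <!-- hypothetical "resolve" attribute: ask NuGet to prefer the highest version inside the declared range -->
  <dependency id="NServiceBus" version="[6.0.0, 7.0.0)" resolve="Highest" />
</dependencies>
```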