
Confusion around the netstandard.dll #146

Closed
kimbell opened this issue Jan 5, 2017 · 15 comments

@kimbell

kimbell commented Jan 5, 2017

I've been following @terrajobst's YouTube videos, particularly the one on .NET Standard.

In this video you mention netstandard.dll, and that makes things a bit confusing for me.
Let's say we are using MemoryStream from a .NET Core app targeting netstandard 2.0, running on a Windows machine with the latest .NET Framework installed.

Am I correct in the following:

  1. My custom assembly is linked to netstandard.dll
  2. Things are then type forwarded to System.IO.dll
  3. System.IO.dll then type forwards things to mscorlib

When decompiling an assembly, you can easily see what other assemblies are referenced. How will this work in a netstandard.dll world? Will all I see be the netstandard.dll?

@terrajobst
Member

terrajobst commented Jan 7, 2017

Hey @kimbell,

Here is the relevant part from the video.

Yes, when you compile an assembly against .NET Standard 2.x, your resulting assembly will (mostly) have references to netstandard.dll, as opposed to mscorlib or System.Runtime. So as far as .NET Standard 2.x is concerned, MemoryStream lives in netstandard.

When your library is consumed on a specific platform, say, .NET Core, there is a corresponding netstandard.dll which type forwards all the .NET Standard types to wherever they live on that particular .NET platform. For .NET Core, this means System.IO; on .NET Framework, it means mscorlib.
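
To make that concrete: the mechanism behind such a facade is the TypeForwardedTo attribute. A minimal sketch of what the .NET Framework flavor amounts to (illustrative only; the shipped netstandard.dll contains far more forwards than this):

// Illustrative sketch of a type-forwarding facade, not the shipped file.
// Each attribute redirects a type to wherever the referenced platform
// defines it (mscorlib on .NET Framework, System.IO etc. on .NET Core).
using System.Runtime.CompilerServices;

[assembly: TypeForwardedTo(typeof(System.IO.MemoryStream))]
[assembly: TypeForwardedTo(typeof(System.IO.Stream))]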

When decompiling an assembly, you can easily see what other assemblies are referenced. How will this work in a netstandard.dll world? Will all I see be the netstandard.dll?

It will work the same way. It's just that when you compile against .NET Standard, the compiler will mostly only ever see netstandard.dll. You'll only see additional assemblies when you're adding references to components that are available for .NET Standard as extensions, such as the Windows registry or JSON.NET.
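
If you'd rather verify than decompile, a small reflection sketch (the library path is hypothetical) prints the same reference list a decompiler would show:

using System;
using System.Reflection;

class Program
{
    static void Main()
    {
        // Hypothetical path to a library compiled against .NET Standard 2.x.
        Assembly asm = Assembly.LoadFrom("MyLibrary.dll");

        // For such a library this typically prints little more than
        // "netstandard, Version=2.0.0.0", plus any extension packages.
        foreach (AssemblyName reference in asm.GetReferencedAssemblies())
            Console.WriteLine($"{reference.Name}, Version={reference.Version}");
    }
}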

@kimbell
Author

kimbell commented Jan 7, 2017

OK, let's assume we remain in the .NET Core context. In order to run the application, we need to package up all the assemblies (think xcopy deployment). When you add a reference to a .NET Standard 2.x NuGet package, I'm assuming you also get a reference to the System.IO NuGet package and any other relevant packages?

Can we then consider the .NET Standard 2.x package as a 'meta package'? If I understand things correctly, it will bring in a bunch of assemblies you may not actually need for your application. Doesn't this go against the 'pay for play' idea for .NET core? Or is there some other build/packaging tool that figures out the minimum set of assemblies that we need?

One of the arguments for splitting things over multiple packages is that one can update them individually. Let's say MemoryStream gets a new method I'd like to use, but it hasn't made it into the .NET Standard yet. Does type forwarding ignore assembly versions? Will it automatically use the latest one? If you add a reference to the new System.IO package, how does the compiler know which assembly to use?

@terrajobst
Member

terrajobst commented Jan 9, 2017

OK, let's assume we remain in the .NET Core context. In order to run the application, we need to package up all the assemblies (think xcopy deployment). When you add a reference to a .NET Standard 2.x NuGet package, I'm assuming you also get a reference to the System.IO NuGet package and any other relevant packages?

Sort of. With .NET Core 1.x we've learned that a super granular package graph is causing more grief than it is worth. That's why we plan on collapsing the package graph. Basically, the idea is to flatten the meta packages for .NET Core 2.0 and .NET Standard 2.0, meaning you wouldn't need a reference for the System.IO package because the System.IO binary is provided by the .NET Core meta package.

Generally speaking, for XCOPY deployment you shouldn't have to worry about missing dependencies. The idea is that if you can consume the types at compile time, building the app will ensure you have the right artifacts in the output folder.

Can we then consider the .NET Standard 2.x package as a 'meta package'?

Yes. But we're planning on going even a step further and will remove the requirement from the project to include the meta package itself -- it will come from the built-in targets and is implied by the combination of the SDK and TFM. In other words, you can basically assume the platform to be "just there".
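
Assuming that plan lands as described, a minimal library project would look roughly like this; note that no platform package is referenced explicitly:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
  <!-- No explicit NETStandard.Library meta package reference: the
       platform is implied by the combination of the SDK and the TFM. -->
</Project>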

One of the arguments for splitting things over multiple packages is that one can update them individually.

Agreed, but we've learned that it doesn't quite work for the platform layer as you often have to update several packages in combination in order to get to a sane state.

Let's say MemoryStream gets a new method I'd like to use, but it hasn't made it into the .NET Standard yet.

That's a completely different problem, which has to do with how fast .NET Standard can evolve. Generally speaking, the thinking is that new APIs will likely come to .NET Core first, because that's the platform whose BCL has the biggest community momentum. From there, they will flow to other platforms (probably Mono and Xamarin first, then .NET Framework). We'll definitely not wait until all .NET platforms have the new APIs before revving the .NET Standard, but on the other hand you need at least two platforms in order to make an updated .NET Standard useful.

Does type forwarding ignore assembly versions? Will it automatically use the latest one? If you add a reference to the new System.IO package, how does the compiler know which assembly to use?

Remember that .NET Standard is just a spec. In general, creative use of type forwarding or tricks in assembly factoring doesn't allow us to add members to types that are part of a .NET platform. But to answer your question: no, type forwarding doesn't ignore assembly versions. The type forwarder uses a versioned assembly reference. These references are resolved like any other assembly references and are thus subject to the unification behavior provided by the .NET runtime (binding redirects, host policy, the GAC, etc.).
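
Those versioned references are visible from the forwarding attributes themselves; here is a reflection sketch (the facade path is hypothetical):

using System;
using System.Reflection;
using System.Runtime.CompilerServices;

class Program
{
    static void Main()
    {
        // Hypothetical path to a type-forwarding facade such as netstandard.dll.
        Assembly facade = Assembly.LoadFrom("netstandard.dll");

        // Each forward resolves through a versioned assembly reference,
        // subject to the usual binding rules (redirects, host policy, GAC).
        foreach (TypeForwardedToAttribute forward in
                 facade.GetCustomAttributes<TypeForwardedToAttribute>())
        {
            AssemblyName destination = forward.Destination.Assembly.GetName();
            Console.WriteLine($"{forward.Destination.FullName} -> {destination}");
        }
    }
}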

@kimbell
Author

kimbell commented Jan 10, 2017

In the current .NET Core project files, we can reference .NETCoreApp 1.x (EXE) or .NETStandard 1.x (DLL).

Since type forwarding is bound to a specific assembly version, selecting .NET Standard 2.x locks me to a specific set of assembly versions. If I want to leverage an API that is not part of the standard, I need to choose something else. For EXE projects, I imagine a .NETCoreApp 2.x will become available, but what about DLLs? Will there be a .NETCoreDll 2.x that we haven't seen yet?

You mention new APIs probably coming to .NET Core first, then later to .NET Framework. If you run a .NET Core app on a Windows machine with .NET Framework, aren't most of the types in the end forwarded to mscorlib? Or am I missing something? I take it I won't be having a reference to netstandard.dll, but something else. Do we then need a reference to an assembly that contains an implementation for some types and forwards the rest to mscorlib?

@mellinoe

mellinoe commented Jan 10, 2017

".NET Core App" (or netcoreapp) is just the NuGet moniker we've given to the platform itself, it doesn't have anything to do with the type of assembly you are building (e.g. whether or not it has an entrypoint), even though it has the word "app" in it. Projects targeting .NET Core will use the netcoreapp moniker regardless of whether they are a library or application.

@kimbell
Author

kimbell commented Jan 10, 2017

That's interesting, but still a bit confusing. Why not just use 'netcore'?

In VS2017 RC, selecting a class library project seems to be locked to .NET Standard, whilst console apps, test projects and web apps are .NET Core. For a class library I created, I cannot select .NETCoreApp 1.x as the target framework. Is this a change that is coming, but hasn't made it into the published bits yet? This class library was created in VS2017 and not upgraded from a previous version.

I'm currently running 1.0.0-preview5-004384

@akoeplinger
Member

akoeplinger commented Jan 10, 2017

If you run a .NET Core app on a Windows machine with .NET Framework, aren't most of the types in the end forwarded to mscorlib?

No, .NET Core is a separate product in this case and doesn't rely on .NET Framework (that's why it also works on Windows Nano Server, which doesn't have .NET Framework), so there's no forwarding from a .NET Core app to .NET Framework libraries.

I take it I won't be having a reference to netstandard.dll, but something else.

Correct, if you target netcoreapp then you won't get a netstandard.dll reference; instead you get the API surface area of netcoreapp and can use APIs that were only introduced there, at the expense of not being able to run on other .NET Standard platforms.

@kimbell
Author

kimbell commented Jan 10, 2017

Writing software for a specific .NET Standard version is one thing; this allows it to work on multiple platforms. How a given platform chooses to implement the standard is a different matter.

If you decompile C:\Users....nuget\packages\System.IO\4.4.0-beta-24903-02\lib\net461\System.IO.dll (which I believe is part of .NET Core), all the types are forwarded to mscorlib.dll.

If you decompile C:\Users....nuget\packages\System.IO\4.4.0-beta-24903-02\ref\netcoreapp1.1\System.IO.dll, types are forwarded to a System.Runtime.dll reference assembly.

Am I barking up the wrong tree? Am I in the right forest?

@akoeplinger
Member

akoeplinger commented Jan 10, 2017

The lib\net461\System.IO.dll assembly is specifically for when you reference the System.IO nuget package in an app/library targeting .NET Framework 4.6.1, i.e. the full framework. In that case, yes, the types are forwarded to mscorlib.dll since you're (supposed to be) running on the .NET Framework.

The System.IO nuget package is not strictly related to .NET Core, it "works" on regular .NET Framework too.

Aside: The ref folder is a new concept that only newer NuGet v3 based systems understand: https://docs.nuget.org/ndocs/create-packages/project.json-and-uwp#ref. Assemblies in the ref folder are used during compilation; assemblies in the lib folder are used at runtime.
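
Putting your two paths from above together, the package layout looks roughly like this:

System.IO/4.4.0-beta-24903-02/
  ref/
    netcoreapp1.1/System.IO.dll    <-- compile-time reference assembly
  lib/
    net461/System.IO.dll           <-- runtime assembly; forwards to mscorlib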

@kimbell
Author

kimbell commented Jan 10, 2017

Thanks @akoeplinger for the useful information about the ref folder.

On my machine, the System.IO package only contains a DLL under lib\net462. This is the full framework. You mentioned that .NET Core applications should be able to run on Nano Server. Which DLL will be used there? I can't find a target for Nano Server. I get that Unix and Mac are not present since that is a different world, but Nano is still based on Windows.

@terrajobst wrote

Basically, the idea is to flatten the meta packages for .NET Core 2.0 and .NET Standard 2.0, meaning you wouldn't need a reference for the System.IO package because the System.IO binary is provided by the .NET Core meta package.

So when I add a reference to the .NET Core meta package, I get an implicit reference to a specific version of System.IO.dll. Correct? Or is this namespace provided in a different assembly, as is the case with mscorlib.dll on the full .NET Framework?

Assume System.IO.dll will still exist as a separate DLL and NuGet package in the 2.0 time frame. For my .NET Core application, I have a reference through the meta package, but I want to use some new functionality available in a newer version. Do I add an explicit reference to the System.IO NuGet package? With two different versions, how does the compiler know which to use? Or will such a case in practice lead to a newer version of the .NET Core meta package?

@terrajobst
Member

terrajobst commented Jan 10, 2017

@kimbell:

Since type forwarding is bound to a specific assembly version, selecting .NET Standard 2.x locks me to a specific set of assembly versions. If I want to leverage an API that is not part of the standard, I need to choose something else. For EXE projects, I imagine a .NETCoreApp 2.x will become available, but what about DLL's? Will there be a .NETCoreDll 2.x that we haven't seen yet?

I think I now understand the confusion. Here is how to think about this:

  • .NET Core. This is a concrete .NET platform that implements .NET Standard. You can use .NET Core to build applications and libraries. .NET Core libraries can access the entirety of the .NET Core API set, but they can only be consumed by .NET Core applications.
  • .NET Standard. A specification of APIs that are implemented by multiple .NET platforms. Thus, libraries targeting .NET Standard can be consumed by any .NET platform that implements the same (or a higher) version of the .NET Standard you built the library for. However, you can't build apps targeting .NET Standard because an application (by definition) requires a concrete .NET platform to run on.

In the upcoming developer experience for .NET Core and .NET Standard library projects, you'll be able to change a project from targeting .NET Standard to .NET Core (and vice versa) by changing the TFM in the project file. For example you can go from .NET Standard:

<Project Sdk="Microsoft.NET.Sdk" ToolsVersion="15.0">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>

to .NET Core:

<Project Sdk="Microsoft.NET.Sdk" ToolsVersion="15.0">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>
</Project>

(the actual syntax and project file might still change, but you get the idea)

Libraries should generally target .NET Standard, as this ensures that they can be consumed by any app. There will be circumstances where you need to access .NET Core specific APIs, either because the API is new and not implemented anywhere else, or because the concept is .NET Core only. That's why I believe we should make it easy to retarget between .NET Standard and .NET Core so that developers never have to fear being "locked in": start with .NET Standard, retarget if necessary, and revert back once a new version of the standard is available that has all the APIs you need.

Also, we plan on making it easier to cross-compile, meaning you'll be able to compile a given project multiple times, for different TFMs. This way you can provide a mostly portable implementation and light up platform-specific features using #if.
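
As a sketch of what that could look like (the same syntax caveat as above applies): one project file listing several TFMs, with #if keyed off the per-TFM compilation symbols.

<Project Sdk="Microsoft.NET.Sdk" ToolsVersion="15.0">
  <PropertyGroup>
    <!-- Note the plural element name: one project, compiled once per TFM. -->
    <TargetFrameworks>netstandard2.0;netcoreapp2.0</TargetFrameworks>
  </PropertyGroup>
</Project>

and in code:

public static class PlatformInfo
{
    public static string Describe()
    {
#if NETCOREAPP2_0
        // Compiled only into the netcoreapp2.0 build; .NET Core specific
        // APIs can light up here.
        return ".NET Core 2.0 build";
#else
        // Portable fallback used by the netstandard2.0 build.
        return ".NET Standard 2.0 build";
#endif
    }
}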

Does this help?

@kimbell
Author

kimbell commented Jan 10, 2017

Things are starting to get clearer :)

Based on the information provided by @terrajobst

Basically, the idea is to flatten the meta packages for .NET Core 2.0 and .NET Standard 2.0, meaning you wouldn't need a reference for the System.IO package because the System.IO binary is provided by the .NET Core meta package

can I assume that you will never release an updated version of System.IO unless it's part of a new netcoreapp2x TFM? That is, .NET Core will be spread over multiple assemblies, but released as a single package?

@terrajobst
Member

terrajobst commented Jan 11, 2017

can I assume that you will never release an updated version of System.IO unless it's part of a new netcoreapp2x TFM?

Pretty much. The current plan is to no longer provide (and thus no longer update) granular packages like System.IO. We will probably still have to ship updated versions of the meta package. And, of course, we'll continue to ship "out-of-band" packages that extend the platform. These are more fine-grained than the platform, but that's OK.
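
Out-of-band packages would then stay as explicit references in the project file, e.g. (using the registry extension mentioned earlier in this thread; the version number is illustrative):

<ItemGroup>
  <!-- Out-of-band package that extends the platform; version illustrative. -->
  <PackageReference Include="Microsoft.Win32.Registry" Version="4.4.0" />
</ItemGroup>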

@kimbell
Author

kimbell commented Jan 12, 2017

Thank you for an interesting and enlightening discussion.

@kimbell kimbell closed this as completed Jan 12, 2017
@terrajobst
Member

Absolutely! Thanks for engaging!
