System.IO.FileNotFoundException when using IKVMC #65
Not the only one here.
Possibly related: the FrameworkDir is always hard-coded.
Thanks for the report. A few things stand out as potential issues, although it appears the primary problem is that the ikvmc tool depends on a specific version of the .NET SDK being installed so that the reference assemblies exist (see below).
Thanks for pointing to the source. It looks like it is hard-coded to use the "3.1.0" directory instead of choosing the highest version that is installed. But then, the approach we use to resolve the directory just seems wrong. The quick workaround is to install a version of the .NET Core SDK that creates that "3.1.0" directory. On my machine, the folder was created on 2019-12-09, but I currently only have the following .NET Core 3.x SDKs installed:
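The "choose the highest version that is installed" idea can be sketched in shell. This is illustration only, not IKVM code; the helper name and the targeting-pack layout are assumptions:

```shell
# Sketch: pick the highest installed 3.1.x reference-pack directory instead
# of hard-coding "3.1.0".
# Usage: highest_ref_pack /path/to/dotnet/packs/Microsoft.NETCore.App.Ref
highest_ref_pack() {
  # List version directories, keep only numeric 3.1.x names,
  # sort numerically on the patch field, take the highest.
  ls "$1" | grep -E '^3\.1\.[0-9]+$' | sort -t. -k3,3n | tail -n1
}
```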
For a longer term solution, IMO it doesn't make a lot of sense to depend on a specific version of an SDK. There are at least a couple of options:
Since the latter option only increases the distribution size by 3.71 MB, IMO it would be better to deploy the reference assemblies with the ikvm tools for netcoreapp3.1. On the .NET Framework side, we could also deploy reference assemblies using Microsoft.NETFramework.ReferenceAssemblies.net461, but I am not sure we gain anything there, because that package is useful in cases where the .NET Framework Developer Packs are not installed, such as when building on Linux/macOS. I suppose building for .NET Framework on Mono is still supported, but that seems like an edge case we don't need to cover.
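For reference, pulling in the Framework reference assemblies via that package is a one-line PackageReference. This is a sketch of a consuming project, not anything from the IKVM build; the version number is an assumption (check NuGet for the current one):

```xml
<!-- Hypothetical project fragment: provides net461 reference assemblies
     without the .NET Framework Developer Pack installed (e.g. on Linux/macOS). -->
<ItemGroup>
  <PackageReference Include="Microsoft.NETFramework.ReferenceAssemblies.net461"
                    Version="1.0.3"
                    PrivateAssets="all" />
</ItemGroup>
```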
Already did that, but it doesn't work; I suspect that relying on GetRuntimeDirectory is the problem. More context for GetRuntimeDirectory: the returns description is the directory where the common language runtime is installed, whereas the method actually returns the directory the currently executing runtime was loaded from. For a self-contained app, that is the application's own directory.
Thank you for your response.
I'll try to play with the .NET Core 3.1 versions now, but I strongly support the idea of including the ref assemblies (at least until the dotnet tool becomes a thing). Thanks again!
Alright. Since @wasabii and I both have it functioning, there appears to be another dependency somewhere that isn't being accounted for. Going through the GitHub Actions config may yield some clues, but at first glance I don't see any tests that confirm the functionality of ikvmc. I can confirm that I have .NET Core 3.1, .NET 5, and .NET 6 SDKs installed locally:
Builds are working because of how they are invoked: via dotnet.
I've installed your recommended version (3.1.100 SDK). It didn't fix my problem automatically, but after I manually copied the files into the directory ikvmc expects, it works. My dotnet --info currently is:
Nice, that's a pretty clean solution, which should work with my build pipeline. Thank you.
Beware! That doesn't work with the distributed binaries zip file: those include the runtime already, and thus have the same issue as running ikvmc.exe.
I see it now. I thought that executing the library directly via dotnet would reference the missing libraries magically. Nevermind. 🙂
Well … there we go: looking at IKVM.artifacts.msbuildproj reveals that the publish specifies a runtime identifier, thus the dotnet tooling creates a framework-independent (self-contained) package.
Eh, I probably should have spent more time fixing this when I came across it, but it was a week ago and it slipped my mind since. Run it with -nostdlib, and point it to the reference assemblies you WANT to point it to. That will cause it to not even bother coming up with that reference assembly path, and it'll link only with the assemblies you specify as references.

This makes more sense to me, as there's no requirement that you be linking to .NET Core 3 for your assembly... even if you're running the .NET Core 3 version of ikvmc. This is how the build of IKVM.Java works. It actually piggybacks on MSBuild's resolution of assemblies, and passes those to ikvmc, ignoring whatever environment ikvmc itself might have been distributed with. Its dependencies are not necessarily your dependencies.
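A sketch of that workaround at the command line. The -nostdlib and -r: options are the ones named above; every path and assembly name below is illustrative only, and the actual reference set depends on your target framework:

```shell
# Hypothetical invocation: compile a jar with -nostdlib, supplying the
# reference assemblies explicitly (paths below are examples, not real IKVM docs).
REFS="$HOME/.nuget/packages/microsoft.netcore.app.ref/3.1.0/ref/netcoreapp3.1"

ikvmc -nostdlib \
      -r:"$REFS/System.Runtime.dll" \
      -r:"$REFS/System.Console.dll" \
      -target:library \
      mylib.jar
```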
So it is completely up to ikvmc users to specify which reference framework should be used?
It's half and half at this point. The GOAL is for any version of ikvmc to be able to create any version of assembly referencing any framework. That is, you should be able to run the .NET Core 3.1 version of ikvmc, on Linux, to generate a Framework 4.0 assembly, because that's somewhat of a required capability for cross-targeting.

That said, there are some places in the code where I know it makes assumptions based on whether it's running on Core or Framework, which currently prevent this. The default case (not specifying -nostdlib) should work. But I know that how it goes about locating the required reference assemblies is a bit wonky. Needs to be fixed. But as a workaround, you should be able to just turn that off and specify exactly what you need in your build process. You should be able to point to your own reference assemblies.
Another thing... I think this is just going to naturally be more work on Core, forever. It was easy on Framework: you just specify -r:System.Runtime and go with it. Types never moved; they were always there.

However, on Core they've been moving things around constantly. Everything isn't piled into mscorlib or System or System.Runtime anymore. It's been split into 30+ packages. And then combined back! And then moved around! And it keeps changing. They do provide type forwards from old assembly names to new ones, but that only helps for runtime binding. When compiling code (like MSBuild does, ultimately invoking csc.exe/Roslyn), you actually have to trace down the correct reference assemblies for the correct things. And to top it off, you can't just look at some well-known path to find 'em all. They aren't hanging out in C:\Windows\Microsoft.NET\Framework, a path you can find in the registry. They differ not only by runtime, but by where you install the runtime, or by whether you even have a single installed runtime at all, as in the case of runtime-independent apps.

I'm going to fix up ikvmc to properly find its own /ref directory, in the simple case. So the Core 3.1 build will probably find the Core 3.1 reference assemblies. But this just makes the "I want to convert an assembly to .NET Core 3.1 from the command line, one off" use case easier. I think any more complicated use (building for later runtimes, incorporating .NET code) is going to require something more complicated to wrap ikvmc, determine what the actual target and build environment looks like, and pass it the necessary options.
One option could be creating a build for each target runtime. So one build for net4x, netcoreapp3.1, and net6.0 (and only support intermediate runtimes until the next LTS comes around).
Yeah, but the cross-targeting use case isn't fixed by that, and I think that's very important. You have to be able to run dotnet build on Linux and expect it to properly output Framework assemblies. CI/CD pipelines demand it now.

I'm using csc.exe as a model here. Roslyn is written in C#, and built against whatever version of the runtime it is built against at the time. But it's more than capable of compiling earlier versions of C#, or linking them against assemblies from Framework, or Core, or different versions of Core. If you look at the command-line arguments ultimately passed to Roslyn, it's like 50+ items that are just the paths to the reference assemblies of your target at the time. The build environment for Roslyn doesn't bleed into the execution environment. But the users' build environment does then have to do a lot of heavy lifting.

Another consideration is MSBuild. We are eventually going to have MSBuild tasks for the functionality of ikvmc. We're going to need/want it. Well, people are going to expect to be able to open that in Visual Studio and hit Build. That's going to run the ikvmc code inside Framework, but be producing .NET Core assemblies. And likewise, the same task loaded into 'dotnet build' should be able to produce Framework assemblies even though that's a Core environment.

Another example: you are using the .NET 6 SDK to build .NET Core 3.1 projects. This works just fine right now. All the MSBuild code runs in .NET 6, but is perfectly capable of building .NET Core 3.1 output.

Basically, the version of ikvmc you are running is dictated by where you are, and the version of the references and assemblies you're outputting is dictated by what you're trying to make.
To the OP: So the best way to integrate this into a build right now, if you are using MSBuild, is to construct the ikvmc command line from the data available to MSBuild, because it already figures out the right dependencies based on the TFM and puts them into an ItemGroup. Here's a snippet from IKVM.Java.targets:
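The snippet itself did not survive in this capture. The pattern described, feeding MSBuild's resolved references to ikvmc, looks roughly like this; the target and command below are a hypothetical sketch, not the actual contents of IKVM.Java.targets:

```xml
<!-- Hypothetical sketch only; the real IKVM.Java.targets differs. -->
<Target Name="IkvmCompile" DependsOnTargets="ResolveAssemblyReferences">
  <!-- @(ReferencePath) is MSBuild's resolved reference set for the target TFM;
       each item is transformed into a -r: argument for ikvmc. -->
  <Exec Command="ikvmc -nostdlib @(ReferencePath->'-r:%(FullPath)', ' ') -target:library -out:$(OutputPath)mylib.dll mylib.jar" />
</Target>
```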
@wasabii - Thanks for the clarification. So, for the lower-level tool, the references are expected to come from the build process. For the short term, it seems we will need better documentation to walk through the process of using ikvmc on .NET Core a bit more. Two possible options for documentation are:
Both of them require the docs to be in Markdown, so we could easily start out with option 2 and then switch to option 1. I think option 2 is simpler to get started with because it requires no extra setup to get source control for docs.

Another possible way to go for a shorter-term/mid-term fix would be to provide a wrapper around ikvmc. I may be way off in suggesting this, but I am curious what @wasabii can add to this idea, or if he will just flat out reject it because he has a long-term idea that can be implemented in about the same timeframe this would take.

At the end of the day, though, having a hard-coded SDK version at this low level is a bug. I am just not sure what to suggest to fix the code, or if this is something all users must work around until we go through the next iteration of build toolchain improvements. Any thoughts?
@NightOwl888 I agree with option 2.
#54 is what you're talking about. I am working on fixing ikvmc for the simple case. It should be, unless you specify something else, searching for reference assemblies in its own '/ref' directory. And Core has support for navigating this structure from .deps.json, instead of trying to build some crazy path by replacing strings.
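For context, a trimmed .deps.json looks roughly like the fragment below; the package name and version are made up for illustration, but the top-level keys (runtimeTarget, targets, dependencies, runtime) are the real shape of the file. The point is that the tool's own dependency closure is enumerable from this file rather than from string-munged paths:

```json
{
  "runtimeTarget": { "name": ".NETCoreApp,Version=v3.1" },
  "targets": {
    ".NETCoreApp,Version=v3.1": {
      "ExampleTool/1.0.0": {
        "dependencies": { "ExampleRuntime": "1.0.0" },
        "runtime": { "ExampleTool.dll": {} }
      }
    }
  }
}
```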
So, did some research. The early arguments of ikvmc, and the way it did things, was modeled after csc.exe of the time. Not surprising. This is still mostly applicable when viewing it from a .NET 4 perspective. csc.exe still exists on .NET 4. And works about the same. For instance, when you compile a simple test.cs file using csc.exe on .NET 4, it works. However, looking at Core, the picture is quite different. There is no longer a csc.exe. Roslyn, in dotnet\sdk\v\Roslyn\bincore, has a csc.dll, which models the old csc.exe utility. But it doesn't actually work. In fact it works quite like I've been saying ikvmc should work.
It can't even compile a simple single file on its own. It's not even capable of locating its own standard lib (netstandard) without the user specifying the path to it. You have to give it everything. It doesn't even try.
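To illustrate what "you have to give it everything" means in practice: the paths below assume one particular SDK layout and are examples only, not a command from this thread:

```shell
# Hypothetical: invoking Roslyn's csc.dll directly on Core requires every
# reference to be spelled out explicitly; nothing is located for you.
DOTNET=/usr/share/dotnet
SDK="$DOTNET/sdk/3.1.100"
REFS="$DOTNET/packs/Microsoft.NETCore.App.Ref/3.1.0/ref/netcoreapp3.1"

dotnet "$SDK/Roslyn/bincore/csc.dll" \
  -nostdlib \
  -r:"$REFS/netstandard.dll" \
  -r:"$REFS/System.Runtime.dll" \
  -r:"$REFS/System.Console.dll" \
  test.cs
```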
The .NET Framework 4 version works the same when -nostdlib is specified. Basically, the Core version has -nostdlib always enabled. No way to disable it either.
I should note it is the Roslyn version of csc. So a couple of options, actually:

a) Include the compilation context in the ikvmc tool output, and consider it the standard library and library search path, but only in the case where -nostdlib is not specified. We'd bundle the entire .NET Core 3 reference set in the tool ZIP, inside /ref (we're doing this now, it just can't find it).

Part of me is favoring (d). It's the easiest on us; we actually only have to go remove some code. It produces the cleanest tool, and doesn't allow a user to accidentally include a platform reference he doesn't actually intend to use his assembly with. Basically the tool (every version) would act like csc.exe does on Core. I think I favor this over having the tool behave differently depending on whether you're running the Framework or Core version.

First, we would want to push everybody to the Core way of doing things. It's just the way it is now. Get over it? We don't really want to make it easier to accidentally pick Framework. And people can still use the Framework tool: they just have to specify the Framework references (mscorlib) by hand, the same way MSBuild does. It means the Framework tool and the Core tool would operate the same. The point of choosing one over the other wouldn't be whether you want to produce Framework or Core assemblies, but whether you were executing it on a machine that forced you to pick one version over the other.

Worth noting: on Framework MSBuild, when not using the compiler server, and using the new SDK, csc.exe is invoked with -nostdlib by default, even when targeting Framework, since MSBuild knows the references anyway. Basically the only people not using -nostdlib are people who drop to the command line and type 'csc.exe' on Framework 4 without knowing any better. We'd also reduce the size of the tools drop by removing the refs directory. This would be a breaking change of user expectations with the Framework version of the tool.
But now's the time for those things. And it's not like MS didn't make this same change in their Core versions. Anybody have any opinions?
Being that the path from .NET Framework build tools to .NET Core was a long and winding road with many bumps along the way, I am in favor of trying to avoid as many of those bumps as possible. IMO, the best way to do that is to make a contract first, with a standard (but somewhat limited) set of options that cover most use cases.

So, with that in mind, I agree with you that we should keep the ikvmc tool in line with what .NET Core is doing, even if it is more work to deal with manually than it was previously and it may result in breaking changes from prior versions of IKVM. Either of these two scenarios makes sense to me:
Either way, the earlier we create a contract so that IKVM builds don't break every time there is an upgrade, the better. I favor the latter approach in this regard because it avoids the issue of users having to migrate from option 1 to option 2 later. But of course, this assumes that users will wait for the tool to be developed, which may or may not be the case. |
I remain a bit confused about that. (1) creates a contract that doesn't break in the future. I also prioritize shipping something vs shipping nothing. :) |
Not sure whether you actually mean (a) or (1) with this comment. If you meant (1), I am fine with that, but users are going to need some hand-holding to migrate from the old tool behavior. But frankly, I don't have a strong opinion either way.
So I think this bug is resolved at this point. Because the auto directory detection stuff has been removed. You have to explicitly specify the paths to the references in Core. So I'm closing it out at this point. |
Hello,
first of all - thank you for your effort. I've been following your work and I really appreciate it. I really wish I could understand everything you've done in the last few weeks and help you at least a little bit. 🙂
But to the problem - I've been trying to use the new release but unfortunately with no success. I'm using Windows 10 so I downloaded the IKVM-8.2.0-prerelease.392-tools-netcoreapp3.1-win7-x86.zip artifact and tried to recompile my .jar with the new IKVMC but this error popped up:
Maybe I'm just missing something or doing something stupidly wrong. If that's the case could you please tell me what?🙂
Thanks