Enable buildInfoStaticCompiled on ScalaJSModule and ScalaNativeModule #2562
Conversation
IMHO, we should make static compilation the general default, on all platforms. After all, this feature is a developer-experience optimization, and not without runtime costs.
The issue with making static compilation the default is that, with common practice (setting versions based on a dirty git SHA, including the version as a buildinfo field, and using buildinfo in some upstream util module), it ends up invalidating all compilation caches everywhere if you touch any file in the repo, even an unrelated, unused readme.md file. It's really problematic, and made working with the com-lihaoyi/mill repo very annoying for me. It totally breaks the incremental nature of Mill. I don't think we should make that the common-case default for users running Scala on the JVM.

FWIW, we hit the same problem at work, with Bazel instead of Mill and in C++ rather than Scala. But the fundamental problem is the same: compiled buildinfo + version in buildinfo + dirty git SHA used in version = everything is invalidated all the time. The solution was the same too. So this does seem like a fundamental issue, and not a quirk of our current implementation.
I see it differently, but it is configurable after all. I think we should add the implications (pros, cons) to the docs of the setting, as these are far from obvious to a user. If you both agree, I'm not against merging this as-is.

IMHO, a project should test the same artifacts it delivers, and using a fine-grained classpath in (integration) tests but a jar in production always has some potential to fail due to the mismatch. E.g. in one of my client projects OSGi is used, which has enriched metadata in the jar manifest, and it is key to use the full jars downstream when testing. This has an impact on the developer experience, but instead of compromising accuracy, we just changed the way we format dirty git hashes in version numbers (we keep the "dirty", but not the hash).

And as another side note: I still have to fix
Another possibility is to ship it without a default and let users decide. It's not that nice for UX, but it allows users to make informed decisions.
I think a sane default is better than none, in this case. I'd assume most projects don't include a dirty git hash (which changes with every file change) in their versions, and are not affected at all. Those projects will profit from a static default. Projects that have some highly dynamic content in their build info, like Mill, might want to flip that setting. My own projects also have dynamic build-info content, yet I still prefer the safer setting, as it can't be the cause of any trouble. The safer setting also makes the implementation simpler, as we don't need to match on the module type, which was needed to implement this PR. Also, there are already a lot of settings to override when using the BuildInfo plugin. That was the main reason I typically just wrote the generated files by hand instead of using the BuildInfo plugin.
I think this is fine to merge as-is, or with some of the discussion captured in the scaladoc. The added complexity isn't too bad. In future, we could even bundle up some JSON with the Scala.js JavaScript to support out-of-band buildinfo bundling there, and come up with some solution for Scala Native: not sure what, but definitely possible. People have been bundling data with their executables since forever.

I think "version string includes dirty without the SHA" isn't a good compromise. The SHA is useful. If I'm running

It's true that compiling things into bytecode is simpler, but it's too simple: given the common-usage-pattern of

People with advanced use cases would already be equipped with the skills for dealing with libraries requiring auxiliary non-classfile classpath entries, so it won't come as too big a surprise: SPI files, conf files, manifests, etc. They would also have larger projects that would suffer most from the lack of incrementality, which might not matter that much to you, but does matter to many.

Let's keep the flag for the people who prefer it, and this PR flipping the default for JS/Native is fine since we don't have an alternative for now. But for the bulk of people with JVM code with uncomplicated classpaths, or with complicated classpaths but the skills to manage them, leaving the flag off by default is the right thing to do.
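For reference, the flag under discussion is a per-module setting. A minimal `build.sc` sketch, assuming the contrib BuildInfo API (the module name, version value, and member are illustrative, not taken from this PR):

```scala
// build.sc — illustrative sketch only
import mill._, scalalib._
import $ivy.`com.lihaoyi::mill-contrib-buildinfo:`
import mill.contrib.buildinfo.BuildInfo

object app extends ScalaModule with BuildInfo {
  def scalaVersion = "2.13.10"
  def buildInfoPackageName = "app"
  def buildInfoMembers = Seq(
    // If this value embeds a dirty git SHA, it changes whenever any file in
    // the repo changes — which is exactly what invalidates downstream
    // compilation caches in the scenario described above.
    BuildInfo.Value("version", "0.1.0")
  )
  // Compile the members into bytecode instead of shipping them as a
  // resource file: a simpler classpath, at the cost of incrementality.
  def buildInfoStaticCompiled = true
}
```

With the flag left `false`, the values travel as a classpath resource instead, so changing a member does not recompile downstream modules.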
```diff
@@ -729,8 +729,8 @@ object contrib extends Module {
   }

   object buildinfo extends ContribModule {
-    def compileModuleDeps = Seq(scalalib, scalajslib, scalanativelib)
-    def testModuleDeps = super.testModuleDeps ++ Seq(scalalib, scalajslib, scalanativelib)
+    def moduleDeps = Seq(scalalib, scalajslib, scalanativelib)
```
This means, if somebody includes this module into their project, it will also transitively include the Mill dependencies. And if the versions do not match exactly, we end up with two different versions on the classpath.

Mill plugins typically always keep dependencies on Mill as `compileIvyDeps` or `compileModuleDeps`.
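The convention described here, keeping Mill itself compile-only so a plugin does not drag a second copy of the Mill jars into the host build, looks roughly like this (a sketch; the module name and version are illustrative):

```scala
// build.sc of a hypothetical Mill plugin — illustrative only
import mill._, scalalib._

object myPlugin extends ScalaModule {
  def scalaVersion = "2.13.10"
  // Compile-only dependency: the plugin compiles against Mill's API, but
  // the jars are not propagated to downstream resolution. At runtime the
  // host Mill process supplies its own (bundled) copy of these classes,
  // so there is no risk of two Mill versions meeting on one classpath.
  def compileIvyDeps = Agg(
    ivy"com.lihaoyi::mill-scalalib:0.11.0"
  )
}
```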
But as a contrib module, wouldn't the versions always line up, since contrib modules are co-versioned with Mill?
Yeah, if used as we suggest. But I've also seen other constructs in the wild, and sometimes these are even valid. We have no guard against it. The errors are typically some strange compiler error messages and aren't helpful.
Got it. I'll open a follow-up to move back to `compileIvyDeps`.
> And if the versions do not match exactly, we end up with two different versions in the classpath.

Doesn't the version get evicted to the newer one?
Now that we control the whole build process of `build.sc` scripts with Mill, we could probably enhance our dependency management. E.g. we could apply some `forceVersion` to all Mill dependencies and such things.
Yeah that works too. Or just skip them entirely during resolution (is that possible???) since they'll all be bundled anyway
> Doesn't the version get evicted to the newer one?

It wouldn't, because the version bundled in the Mill assembly isn't resolved through Ivy/Maven and thus can't be evicted, nor can it evict the other.
> Doesn't the version get evicted to the newer one?
>
> It wouldn't, because the version bundled in the Mill assembly isn't resolved through Ivy/Maven and thus can't be evicted, nor can it evict the other.

Also, in case we include a newer transitive Mill, it could still fail.

Maybe we should try to resolve them from coursier and reduce the size of our bundle further. If we only come with our runner and coursier, we could load everything else via coursier on demand.
Opened a PR to go back to the earlier behavior via #2568
The non-transitive nature of `compileModuleDeps` is definitely unfortunate; it means that anyone who depends on the `buildinfo` plugin needs to manually depend on `scalajslib` and `scalanativelib` as well. This isn't a problem for user code, since the Mill assembly always contains both, but it can be an annoyance for plugins; e.g. in my PR we needed to add the dependencies to `contrib.scoverage` rather than having them be pulled in automatically.

I think the status quo is probably fine for 0.11.0. After that, we can consider other options, e.g. making use of `exclude` to allow normal dependencies while automatically removing the Mill jars from dependency resolution.

Shrinking the bundle is possible. With the current bootstrapping mechanism, we would need to keep at least `scalalib`, since that's required for `MillBootstrapRootModule`. With an alternative bootstrapping mechanism, we could cut it further, though it's unclear whether that's worth the complexity: for small size, the current bootstrap bash/bat script already wins. We can leave that debate for later.
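The `exclude` idea could look roughly like this; a hedged sketch assuming Mill's `Dep.exclude` for filtering org/name pairs out of transitive resolution (the plugin coordinates and artifact names are illustrative, not real):

```scala
// Illustrative sketch: depend on a plugin normally, but strip Mill's own
// artifacts out of transitive resolution so they can't clash with the
// versions already bundled inside the Mill assembly.
def ivyDeps = Agg(
  ivy"com.example::some-mill-plugin:0.1.0"
    .exclude(
      "com.lihaoyi" -> "mill-scalalib_2.13",
      "com.lihaoyi" -> "mill-main_2.13"
    )
)
```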
Status quo fails linking with a cryptic error message where the previous version was working out of the box. This sets the default to the static behavior on Scala.js and Scala Native, so users don't need to care.