Create workflow file for Windows nightly builds #13070
Conversation
Well, that might be useful... |
To be fair to them, they don't seem to be against the idea of nightly builds. They expressed opposition to providing official releases this way. But this is out of scope for now - the purpose of the workflow in this PR is really just "experimental" nightly builds for Windows (for the benefit of Windows users willing to test new commits). This is not meant to replace/supersede the current official binary releases. Though in time, a GitHub Actions based, fully automated (and as reproducible as possible) release workflow is what we should strive for. We might have to convince them of that still... In the meantime, this can be thought of as a stepping-stone towards that goal. The prolonged absence, however, is now a real problem. |
I am not against providing nightly builds (if this is not clear from my previous comment).
I approve this PR generally. Although I can't check its internals. |
it was clear, no problem.
I also don't want most users to get too used to just using the nightly builds. But I think it is preferable to have people using nightly builds vs. a "rollback culture"/"I'll just stay on an old version" mindset. Ubuntu users already have this available to them in practice, via the unstable PPA. This is just a nice way of making development snapshots available to Windows users as well (and, in the future, other Linux distros and macOS).
A large part of it is bookkeeping around setting up the build environment: enabling devcmd, generating a proper […], and so on. Here is the last example run from my fork. And here is a slightly older example, where I was able to get all cache hits (this is the best-case and very unlikely scenario since, as mentioned above, all 4 of these caches together would exceed the 5 GiB maximum. GitHub probably took long enough to delete the excess for this run to complete while hitting all 4 caches). Since there is not enough space for all caches, at least one cache will miss, causing the total workflow time to reach about 1h 45min, but the jobs that hit the cache will finish in 10-15 min. |
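For orientation, a dependency cache step in a GitHub Actions workflow of this kind might look roughly like the sketch below. This is a minimal illustration only: the `path`, the `key` format, and the matrix variable names (`matrix.qt_version`, `matrix.libtorrent`) are assumptions, not necessarily what this PR's workflow file actually uses.

```yaml
# Illustrative cache step for the vcpkg-installed dependencies.
- name: Cache vcpkg packages
  uses: actions/cache@v2
  with:
    path: ${{ github.workspace }}/vcpkg/installed
    # Keyed on the variant so each of the 4 builds gets its own cache entry.
    # Together the 4 entries can exceed the 5 GiB per-repository limit,
    # so at least one of them is likely to be evicted and miss.
    key: ${{ runner.os }}-vcpkg-qt${{ matrix.qt_version }}-lt${{ matrix.libtorrent }}
```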
Qt 5.15 is the next LTS release. Once the versions are bumped, do you plan to drop to 2 builds or still use 5.12 LTS + 5.15 latest? |
Currently, the latest LTS and the latest release are different versions, hence the 4 builds. You can guess that continuing with this strategy would amount to periodically switching between 2 and 4 builds, depending on whether the latest LTS happens to also be the latest release overall. However, keep in mind that we are on the cusp of a major transition (Qt 5.x -> Qt 6) [1].

[1]: If necessary, one can always have two separate […] |
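To make the 2-vs-4 builds point concrete, the variant matrix being discussed could be expressed along these lines (a sketch only; the variable names and exact version strings are illustrative, not copied from the workflow file):

```yaml
strategy:
  fail-fast: false
  matrix:
    # Latest Qt release + latest Qt LTS. While these differ, there are 4 builds;
    # when the latest release is itself the LTS, this collapses to 2 builds.
    qt_version: ["5.15.0", "5.12.9"]
    # Latest libtorrent 1.2.x release + RC_1_2 branch HEAD.
    libtorrent: ["v1.2.10", "RC_1_2"]
```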
Force push: |
Have you tried dynamic linking? |
Soon we will know: https://github.com/FranciscoPombal/qBittorrent/actions/runs/158671381 This run will fail at building qBittorrent itself, but I'm really just checking the build time of the dependencies, which is what dominates the cost anyway. I believe it will take about the same time (1h20 - 1h30 currently). Even if it shaves off 20 minutes, it's not worth the trouble - the improvement would have to be very significant to be worth it IMO. |
I thought you used existing binaries to link qBittorrent against. |
Yeah this is not the case.
Yep. The builds will only be significantly faster if they are cached. As mentioned above, the Github Actions cache is far too small to be as useful as we'd like it to be (but not a deal-breaker considering the scope of this PR). Is there anything left to address before merging this? |
It looks suboptimal/wasteful to rebuild the same things (dependency libraries) each time you need to build qBittorrent... |
As I mentioned, only 2/4 of the builds can be cached due to space constraints. The whole workflow will of course still take as long as the slowest build. We don't really care though, since, again, this is not for CI, and it is done on some Github server automagically, so it's not a waste of anyone's time. |
FYI, I have been using your linked builds. So far, it has worked out well. Great job! 👍 |
What about maintainability of this feature? What regular tasks are required here? How often should we change anything in the config files, and what exactly? |
Basically, updating the versions of the dependencies, as needed.
Just the vcpkg commit and the Qt version names, really.
I can write a wiki page about it (in fact, I should do that), to increase the bus factor of the maintenance of this feature. But the gist of it is really simple: checkout repository -> install dependencies -> build -> upload artifacts. In this case, vcpkg is used to install dependencies, but it could have been something else. I just decided that it was the best option for the job right now. |
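As a rough outline (not the actual workflow file from this PR), the checkout -> install dependencies -> build -> upload artifacts flow could be sketched like this. The action versions, vcpkg package list, build commands, and paths below are assumptions for illustration; the real workflow also needs the MSVC environment and Qt tooling on PATH before building.

```yaml
name: Windows nightly (sketch)
on:
  push:
    branches: [master]

jobs:
  build:
    runs-on: windows-2019
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Install dependencies with vcpkg
        # Abbreviated package list; the real build needs more Qt components.
        run: |
          git clone https://github.com/microsoft/vcpkg.git
          .\vcpkg\bootstrap-vcpkg.bat
          .\vcpkg\vcpkg install zlib openssl boost libtorrent qt5-base --triplet x64-windows-static

      - name: Build qBittorrent
        # Indicative qmake/nmake commands only; assumes devcmd and Qt are set up.
        run: |
          qmake CONFIG+=release qbittorrent.pro
          nmake

      - name: Upload artifacts
        uses: actions/upload-artifact@v2
        with:
          name: qBittorrent-nightly
          path: src\release\qbittorrent.exe
```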
I can write a wiki page about it (in fact, I should do that), to increase the bus factor of the maintenance of this feature.
👍
@FranciscoPombal: While you are at this, any chance you would be willing to post a variant of qBittorrent's AppVeyor script that builds everything from source, instead of relying on the cached packages used for the official qBT AppVeyor scripts? (qBittorrent's build process is way too complex for me to even think about building it myself, and I prefer to build projects using AppVeyor anyway.) |
Here, by "cached packages [currently] used for the official qBT AppVeyor scripts" I assume you mean the ones pulled from https://builds.shiki.hu/, correct? Where are you suggesting I pull the dependencies from instead?
Isn't this what this PR is already doing? It provides the built binaries for each commit or PR against the master branch. |
Actually, something occurred to me which I should have thought of earlier. It's only worth caching a workflow's job dependencies if all of them can be cached. Otherwise the total workflow time will be that of the slowest build anyway, so, for example, caching dependencies for 2 of the 4 jobs (as is the case here) ends up being irrelevant. It only wastes valuable caching space on caches that offer no real benefit in terms of actual workflow duration. So, I should probably remove all caching from this workflow and create another workflow file for Windows CI purposes. For CI, we only really need one (Qt version, libtorrent version) set of dependencies to achieve feature parity with the current AppVeyor setup. There is enough caching space for that. This means there will also be enough space for caching the dependencies of a possible macOS CI workflow file. Ubuntu CI builds only need to cache a libtorrent build at most (everything else is just a quick apt install away). This can all be done afterwards though, don't let these musings block this PR. I can later come up with workflows for CI implementing what I describe, plus a small patch to remove the caching from this workflow. But for now, IMO the priority is to get this one merged to get nightly Windows builds to users ASAP. |
In the case of the alternate version of the script, I would want everything built from source. And yes, I am aware that such a comprehensive building process would be insanely slow, but it would be the easiest way to guarantee that each qBittorrent build I test has the absolute latest version of every dependency (which is my greatest gripe about waiting months for each official qBittorrent release). P.S. I admit that this request is purely for my own personal convenience. By having AppVeyor do all the work of building qBittorrent for me, I can avoid cluttering up my own system with a build environment that I barely know how to use. |
Building all of qBittorrent's main dependencies (zlib, OpenSSL, Boost, Qt5, libtorrent) manually is a pretty complex process. It also involves manually editing the build files and the like, which requires keeping up with the ideal options for compiling each of them and with what to change/edit in the build files. This is not expected to change too often or too much between, say, different Qt5 versions, but still. While that process can be replicated in a CI script, it would make the script pretty unmaintainable IMO. We don't really care exactly how any of the "fundamental" dependencies is built (meaning everything except libtorrent, which we might want to tweak more thoroughly); we just want good general-purpose builds of those dependencies. This is the reason I chose to use vcpkg for dependency management: everything gets built with a consistent and predictable configuration and toolchain. Ideally, we would be able to get appropriate prebuilt binaries from somewhere, but having everything easily built from source with the help of vcpkg is the second best thing IMO. Plus, it's easy to cache vcpkg artifacts, making subsequent builds very fast. The only constraint is the available caching space on GH Actions (AppVeyor probably also has some kind of space limitation for caching build-time artifacts).
Probably not any slower than with vcpkg, but much more complex to write out and maintain - this is the problem.
vcpkg kind of solves this, though. Previously, you'd have to wait for someone to update the prebuilt artifacts on https://builds.shiki.hu/ on their own time. OTOH, vcpkg packages are updated quite frequently. In fact, just yesterday or the day before, Qt was updated to 5.15 on vcpkg. It's not exactly "updated on release day", but it's fast enough IMO. Boost, OpenSSL and zlib are typically updated even faster. To update the dependency versions in the workflow, we just have to update the vcpkg commit and the Qt version names.
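In concrete terms, that kind of version bump could amount to editing a couple of pinned values near the top of the workflow file, roughly like this (the variable names and values are hypothetical placeholders, not the actual ones used here):

```yaml
env:
  # Pinned vcpkg commit; bumping it pulls in newer port versions (e.g. Qt 5.15).
  # The hash below is a placeholder, not a real commit.
  VCPKG_COMMIT: "0000000000000000000000000000000000000000"
  # Qt version name used when naming caches/artifacts; bump alongside the vcpkg commit.
  QT_VERSION: "5.15.0"
```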
This is what this PR does, just with GH Actions, not AppVeyor. |
If you weren't fixated on static linking, you could use the binaries provided by Qt. Then you would only need to build Boost, zlib and libtorrent. |
Later we can focus on other strategies. The goal of this PR is just to automate https://github.com/qbittorrent/qBittorrent/wiki/Compiling-with-MSVC-2019-(static-linkage).
You'd still need OpenSSL as well for libtorrent, unless Qt comes with prebuilt OpenSSL binaries we can use for that. |
Yeah, it's not optimal, but it's 2 hours on Github's time for free so ¯\_(ツ)_/¯ |
Rebased to take advantage of lukka/run-vcpkg#32. Cache is no longer used, since, as mentioned in #13070 (comment), it is useless in this workflow. This way, we still have the full 5 GiB available to use for caching artifacts for other workflows, like CI, for example, where builds really need to be fast and really need to use cached artifacts to be fast. |
vcpkg is a bit of a foreign concept to me but looking at the GitHub repo, it appears that vcpkg has updated to Qt 5.15. |
Just a random thought... wouldn't it be ideal to append the build date or the last commit hash to the version string, to make it easier to figure out which version a user is running for testing? |
Yeah, it's on my TODO list for after the CMake PR lands... As a stop-gap measure, at least the artifact names currently contain all the relevant commit hashes. If you want a sneak peek of what I'm thinking about implementing, take a look at pobrn/mktorrent#50, in particular the […]. |
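As for the stop-gap of having the commit hashes in the artifact names, that can be done directly in the upload step, along these lines (a sketch; the name format, the matrix variable, and the path are illustrative assumptions):

```yaml
- name: Upload artifacts
  uses: actions/upload-artifact@v2
  with:
    # github.sha is the commit being built; matrix.libtorrent is a hypothetical matrix variable.
    name: qBittorrent-${{ github.sha }}-libtorrent-${{ matrix.libtorrent }}
    path: src\release\qbittorrent.exe  # illustrative path
```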
Well, I ended up just bumping it now... @Chocobo1 when can this finally be merged? It would have proven useful multiple times by now... why stall this any longer? |
Sorry, I'm withholding (more like abstaining from) my vote here: I presented the same concept to sledgehammer999 in the past, but he rejected the idea, and currently I don't really feel the need for it. |
OK, I was not aware of that - glassez previously claimed that as far as sledge was concerned, there was no issue (#13070 (comment)). For reference, can you please link such posts? I'd like to know on what grounds @sledgehammer999 has denied something like this before. I think the circumstances are much different now.
I disagree, I think this is much needed. It gets tiresome to manually start a run on my own branches or to ask @xavier2k6 to provide test builds every time a build of the latest master commits is needed. This would ease that task. Many bug reports would be closed/fixed much faster if users had easy access to such builds. |
The exact link is hard to find. I only recall asking him whether we could provide build artifacts from the AppVeyor CI to users, and he expressed that he preferred not to. |
OK, as expected, that was a long time ago (5 years), and they didn't even justify the reason. Presumably they were concerned with CI completion time/occupied space? In particular, if the latter was a very limiting factor, it made sense to reject the idea, since they were confident in their ability to provide builds to make up for it. But that assumption is invalid now. There are now CI options where artifact space is not a problem, and that provide a fully automated build process not dependent on anyone's personally maintained bundle. That was the trade-off taken previously, and now we're actually suffering its consequences. Plus, I should point out that this PR is not about CI; it's about test builds independent of CI builds. The one specifically about CI is this one: #13288. This one is simply about automating the work that @xavier2k6 and I do manually right now. |
Builds 4 variants with qmake:
- Qt5 latest + libtorrent RC_1_2 HEAD
- Qt5 latest + libtorrent latest 1.2.x release
- Qt5 LTS + libtorrent RC_1_2 HEAD
- Qt5 LTS + libtorrent latest 1.2.x release

Dependencies are installed with vcpkg. Common to all builds: x64, static linkage, release build type.

Co-authored-by: jagannatharjun <jagannatharjun11@gmail.com>
Force-pushed from ef4c5f5 to c0576ac.
Force push: bump the vcpkg version to include libtorrent 1.2.10 as the stable version, and remove the toolset specification in the dev cmd setup. Now the latest toolchain is always used (previous toolchain versions are not guaranteed to exist anyway). |
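For reference, a "dev cmd" setup step without a pinned toolset might look like the following sketch. ilammy/msvc-dev-cmd is one commonly used action for this; whether this workflow uses exactly that action and these inputs is an assumption here.

```yaml
- name: Set up MSVC developer command prompt
  uses: ilammy/msvc-dev-cmd@v1
  with:
    arch: x64
    # No "toolset:" input: the runner's latest MSVC toolchain is used,
    # since older toolset versions are not guaranteed to be installed anyway.
```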
Personally, I support the idea of providing test builds in a similar way. But if we do switch to GitHub Actions CI in the end, and the build artifacts are available, then it is unnecessary to have another such thing. |
@glassez CI is meant to use cache and be quick. These nightlies are supposed to cover a broader range of combinations (e.g. both the latest 1.2.x libtorrent release and the latest RC_1_2 commit + the latest Qt and the latest LTS Qt, ....), for users to test in case they run into problems with a specific version of Qt, for example (as has happened in the past, for instance, with RSS download and screen scaling stuff). |
@FranciscoPombal P.S. Of course, it doesn't make sense to duplicate variants from the CI (if we do migrate to GitHub Actions CI and the build artifacts are also made available for testing). |
Now that #13327 is merged, I have submitted an improved version of this that leverages the better CMake build system. It is a shame that this (or something similar) wasn't merged in the 3 months that have passed until now; it would have been very useful during that time 👎. |
@FranciscoPombal |
@glassez Yes, using CMake, which will be much simpler. |
Follow-up to #12357
I think this is now good enough to be useful. Unfortunately, not all 4 variants can always be cached. There is enough space for at least two builds to always hit the cache, but more than that is not guaranteed. The goal is to get fresh builds into the hands of end users for testing/early access reasonably quickly and automatically. Caching here is just a cherry on top. It does not really matter if not every build is cached, as this is not meant for CI at this time, so the builds don't need to be super fast.
For reference:
After #12746 is merged, I'll think about more variants/workflows to add, including builds for other OSs.