
Add rid for Asianux Server 8 #2129

Closed

m10k opened this issue Jan 24, 2020 · 9 comments

Comments

m10k (Contributor) commented Jan 24, 2020

Hello,

I was wondering if it's possible to have an RID for Asianux added to the upstream source. I have already added it to my own repository, but before I send a pull request, is there anything else I should do or keep in mind?

The repo+branch in question is: https://github.com/m10k/runtime/tree/asianux

Thank you!

@Dotnet-GitSync-Bot added the untriaged (New issue has not been triaged by the area owner) label Jan 24, 2020
ViktorHofer (Member) commented:

@ericstj @richlander

@ViktorHofer removed the untriaged label Jul 8, 2020
@ViktorHofer added this to the Future milestone Jul 8, 2020
ericstj (Member) commented Jul 8, 2020

This seems like a reasonable addition (similar to ol, the Oracle Linux RID). /cc @wfurt @bartonjs

@m10k for the purposes of discussion: how do you imagine this distro being supported? Would it just use our portable-linux build of the runtime? Do you plan to build and release a runtime?
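
(For reference: the RID graph ships as runtime.json in the Microsoft.NETCore.Platforms package. A new distro entry is a small set of nodes whose #import arrays name the RIDs it falls back to. The asianux entries below are a hypothetical sketch modeled on how the existing ol entries import rhel, not the actual change.)

```json
{
  "runtimes": {
    "asianux":       { "#import": [ "rhel" ] },
    "asianux-x64":   { "#import": [ "asianux", "rhel-x64" ] },
    "asianux.8":     { "#import": [ "asianux", "rhel.8" ] },
    "asianux.8-x64": { "#import": [ "asianux.8", "asianux-x64", "rhel.8-x64" ] }
  }
}
```

With nodes like these, restore on asianux.8-x64 can fall back to rhel assets when no asianux-specific package exists.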

m10k (Contributor, Author) commented Jul 20, 2020

@ericstj we have already built and released a runtime together with the release of Asianux Server 8, and we intend to release updates on a regular basis. At the moment we apply Asianux-specific patches on top of the upstream source during the RPM build, but the patches are somewhat awkward, since they work around Asianux being unknown to the bootstrap CLI used during the build.

In short, we were hoping to get the runtime identifier for Asianux mainlined, since that would probably eliminate the need for our patches and make releasing updates easier for us.
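
(Background on why the bootstrap CLI stumbles: on Linux, the non-portable RID is derived from the ID and VERSION_ID fields of /etc/os-release, so a distro the graph has never heard of yields a RID that restore cannot resolve. The values below are an illustrative sketch, not Asianux's verified os-release contents.)

```
# Illustrative /etc/os-release (values assumed, not verified):
NAME="Asianux Server"
ID="asianux"
VERSION_ID="8"
# .NET derives the machine RID as "<ID>.<VERSION_ID>-<arch>",
# here asianux.8-x64; if neither that RID nor a fallback for it
# exists in the graph, restoring RID-specific assets fails.
```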

dagood (Member) commented Jul 20, 2020

You might be interested in https://github.com/dotnet/source-build, where we add the current machine's RID to the graph during the build rather than as a Git patch. That repository is intended for distro package maintainers, so it has tooling to bootstrap and then reuse previously built assets in subsequent builds (if re-bootstrapping is not acceptable in your distro). Feel free to file issues or chat on Gitter if you give it a try and hit problems; it would help us learn what we need to do to make source-build more approachable. 🙂

Side note: I'm not able to find much info about Asianux, but it seems to be based on Red Hat Enterprise Linux, and RHEL has .NET Core in the distribution. Maybe I'm a bit naive about Linux distributions, but my understanding is that RHEL's .NET Core could be used here rather than an independent effort to build from source. /cc @omajid

m10k (Contributor, Author) commented Jul 22, 2020

@dagood thanks for the heads-up. As it happens, that is exactly where our tarballs come from! :)
The only issue I have experienced with those tarballs is that our build system is recognized as Asianux during the build, and the prepackaged bootstrap CLI then stumbles over the missing asianux RID in the graph, which is why I opened this issue. If I remember correctly, there were two ways to work around the problem: one was to turn off the TestSharedFramework, and the other was to rebuild the bootstrap CLI before building the entire tarball. Either workaround required patches that we would ideally like to keep out of the SRPM.

By the way, you are correct that Asianux is RHEL-based, and as far as I can tell it is compatible with RHEL in many regards. Truth be told, I am still somewhat new to the game (getting .NET Core to compile in our build environment was my first task at my new job), so I can't say much about the pros and cons of rebuilding from source. I imagine the reasons are not purely technical, but that is outside my realm.

Either way, what would be the correct way to go about adding the Asianux RID? Should I rebase my fork and submit a pull request to this repository, or is the change needed in the repository that the distribution tarballs are built from?
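
(For anyone following along: in dotnet/runtime, RID entries are not hand-edited in runtime.json; they are generated from RuntimeGroup items in runtimeGroups.props under Microsoft.NETCore.Platforms, so a distro addition is roughly the item below. The asianux values are again a hypothetical sketch, and the metadata names mirror the existing entries.)

```xml
<!-- Hypothetical sketch of a new RuntimeGroup; Parent picks the RID
     family to fall back to, Architectures and Versions expand into
     the concrete RIDs (asianux.8-x64, etc.). -->
<RuntimeGroup Include="asianux">
  <Parent>rhel</Parent>
  <Architectures>x64</Architectures>
  <Versions>8</Versions>
</RuntimeGroup>
```

The runtime.json graph is then regenerated from these items as part of the build.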

dagood (Member) commented Jul 22, 2020

Interesting, thanks for the extra info. 🙂 We have some extra validation planned in source-build to make sure it works on distros that aren't in the upstream RID graph (dotnet/source-build#297), but it seems reasonable not to tackle that yet. Actually, it might give us a head start if you could describe what you ran into over in that issue.

> is the change needed in the repository that the tarballs for distributions are built from?

I wouldn't expect any changes to be necessary in source-build to add a new RID. It may naturally take some time for your merged PR to make it into a release available through source-build. (Until then, though, a patch seems easier to justify once the change is already merged into a future branch.)

m10k (Contributor, Author) commented Nov 8, 2021

I'm very sorry for unearthing this old thread, but since this issue keeps coming up for us whenever we build .NET RPMs, I figured I'd give it another go.

A couple of minutes ago, I created PR #61304, which adds RIDs for MIRACLE LINUX 8. Is this the correct way to get the RIDs upstreamed?

A bit of additional information: we renamed the distribution back to MIRACLE LINUX, since the companies that used to develop Asianux have parted ways. MIRACLE LINUX is a Red Hat derivative like CentOS or Oracle Linux, so we figured it makes sense to upstream our RIDs the same way the other RHEL derivatives do.

m10k (Contributor, Author) commented Feb 24, 2022

I'm sorry about the long delay between messages. We're a rather small team, so I didn't have time to figure out why the automated tests were failing for my PR.

Anyway, last week I finally figured out what the problem was and fixed it (and added RIDs for Miracle Linux 9 on that occasion). Some automated tests are still failing (see below), but they seem unrelated to this pull request; am I right?
Is there anything else that needs to be addressed in the PR?

Thank you for any feedback!

```
.packages\microsoft.dotnet.helix.sdk\7.0.0-beta.22110.7\tools\Microsoft.DotNet.Helix.Sdk.MultiQueue.targets#L55

.packages\microsoft.dotnet.helix.sdk\7.0.0-beta.22110.7\tools\Microsoft.DotNet.Helix.Sdk.MultiQueue.targets(55,5): error : (NETCORE_ENGINEERING_TELEMETRY=Helix) RestApiException`1: The response contained an invalid status code 500 Internal Server Error

Body: {"Message":"An error occured.","ActivityId":"21462b09d525ae42b066af4c9de03c3c"}
   at Microsoft.DotNet.Helix.Client.Job.OnDetailsFailed(Request req, Response res) in /_/src/Microsoft.DotNet.Helix/Client/CSharp/generated-code/Job.cs:line 490
   at Microsoft.DotNet.Helix.Client.Job.DetailsAsync(String job, CancellationToken cancellationToken) in /_/src/Microsoft.DotNet.Helix/Client/CSharp/generated-code/Job.cs:line 452
   at Microsoft.DotNet.Helix.Sdk.WaitForHelixJobCompletion.WaitForHelixJobAsync(String jobName, String queueName, String helixJobCancellationToken, CancellationToken cancellationToken) in /_/src/Microsoft.DotNet.Helix/Sdk/WaitForHelixJobCompletion.cs:line 49
   at Microsoft.DotNet.Helix.Sdk.WaitForHelixJobCompletion.ExecuteCore(CancellationToken cancellationToken) in /_/src/Microsoft.DotNet.Helix/Sdk/WaitForHelixJobCompletion.cs:line 31
   at Microsoft.DotNet.Helix.Sdk.HelixTask.Execute() in /_/src/Microsoft.DotNet.Helix/Sdk/HelixTask.cs:line 58
```

m10k (Contributor, Author) commented Mar 4, 2022

#61304 has been merged, so I'll close this issue. Thank you everyone!

@m10k closed this as completed Mar 4, 2022
@ghost locked this conversation as resolved and limited it to collaborators Apr 3, 2022