
Issue #3783: CI Check Builds #3792

Merged: 1 commit merged into antlr:dev on Jul 16, 2022

Conversation

@hs-apotell (Collaborator)

Issue #3783: CI Check Builds

Adding a workflow for GitHub-hosted CI check builds. It covers all three
supported platforms with all runtime languages except Swift.

It also includes C++ native builds across all supported platforms, i.e.
independent of the test-run builds.

Every configuration build generates an artifact (an archive of the entire
repository) that is uploaded as a result of the build (regardless of
success or failure).

Signed-off-by: HS <hs@apotell.com>

@ericvergnaud (Contributor)

Looks great: it doesn't break the self-hosted builds.
What's in the way of also having Swift?

@ericvergnaud (Contributor)

And let's find out how many free builds we get

@parrt (Member) commented Jul 16, 2022

@hs-apotell Amazing. Pulling it in! I should merge that file into dev as well, right, so PRs and our dev branch get tested?

@parrt merged commit 88b8e76 into antlr:dev on Jul 16, 2022
@parrt (Member) commented Jul 16, 2022

Oh, it's in dev now. Sorry. Thanks!

@KvanTTT (Member) left a comment

Is it also possible to include tests for the tool itself (not the runtime)? See https://app.circleci.com/pipelines/github/antlr/antlr4/1885/workflows/cb672c7b-206e-4390-b823-0a3f6f6454da/jobs/20238 for instance.

.github/workflows/hosted.yml — review thread (resolved)
if: always()
run: |
  cd ${{ github.workspace }}/..
  tar czfp antlr_${{ matrix.os }}_${{ matrix.compiler }}.tgz antlr4
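The PR description says each archive is uploaded as a build artifact regardless of success or failure. A companion step for the snippet above might look like the following sketch; it is not taken from this PR's diff, and the artifact name and path are illustrative assumptions:

```yaml
# Hypothetical follow-up step (illustration only, not from this PR):
# publish the archive created above as a build artifact, even when an
# earlier step in the job failed.
- name: Upload build artifact
  if: always()                      # run regardless of success or failure
  uses: actions/upload-artifact@v3
  with:
    name: antlr_${{ matrix.os }}_${{ matrix.compiler }}
    path: ${{ github.workspace }}/../antlr_${{ matrix.os }}_${{ matrix.compiler }}.tgz
```

The `if: always()` condition is what makes the upload happen for failed builds too, which matches the stated goal of keeping an archive from every configuration build.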
@KvanTTT (Member)

Is it possible to use the more standard zip format instead of tgz?

@hs-apotell (Collaborator, Author)

I am not sure whether the default VM images for all platforms have zip installed. I will have to investigate.

BTW, "standard" is very much platform dependent: on Windows it would be zip, but on POSIX it would be tar/tgz.
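One way to sidestep the uncertainty about zip availability is to probe for it at runtime and fall back to tar/gzip, which ships on all GitHub-hosted images. A minimal sketch, using hypothetical directory and archive names rather than anything from this PR:

```shell
#!/bin/sh
# Sketch: prefer zip when the runner image provides it, otherwise fall
# back to tar/gzip. Names below (demo_repo, demo_archive.*) are stand-ins.
set -e
mkdir -p demo_repo
echo "hello" > demo_repo/file.txt          # stand-in for the repository

if command -v zip >/dev/null 2>&1; then
  zip -rq demo_archive.zip demo_repo
  echo "created demo_archive.zip"
else
  tar czf demo_archive.tgz demo_repo
  echo "created demo_archive.tgz"
fi
```

`command -v` is POSIX, so the same step works unchanged on the Linux and macOS runners; on Windows it would run under the bash shell the runner provides.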

.github/workflows/hosted.yml — review thread (resolved)
@KvanTTT (Member) commented Jul 16, 2022

Do I understand correctly that CircleCI is no longer required at all, since GitHub now supports all the operating systems?

@hs-apotell (Collaborator, Author)

I would let the current change bake for about a week and follow up later with the others (Swift and the tool).

I would like to fix all the compile warnings in the native build and also update the builder and test runner to post the logs. If you can assist with the latter (it's Java code), that would be great.

@parrt (Member) commented Aug 13, 2022

Hi @hs-apotell, it's getting confusing with all of the hosted versus remote testing... It looks like hosted is more reliable than remote. Could you check the actions and see if you concur? @ericvergnaud, are you OK if we turn off the remote CI?

Something is screwy, because some of the targets are now failing, but it's hard to tell from all of the actions what's going on, so I wanted to simplify. It seems we get a lot of timeouts from Maven :(

@parrt (Member) commented Aug 13, 2022

(PS: I just tweaked it to use Go 1.19)

@hs-apotell (Collaborator, Author)

Could you list a few runs that have failed for me to investigate?

In general, my experience has been to the contrary. Self-hosted runners need recurring maintenance and updates, which is time consuming. GitHub-hosted runners are managed; they occasionally break because of updates and such, but there is always help out there.

Regardless, I will be sure to see what I can do to make the builds more reliable.

@parrt (Member) commented Aug 13, 2022

Hi. All pass but PHP. The PHP run just failed again due to a timeout:

https://github.com/antlr/antlr4/runs/7821060959?check_suite_focus=true

Run stCarolas/setup-maven@v4.4
  with:
    maven-version: 3.5.4
  env:
    JAVA_HOME: /Users/runner/hostedtoolcache/Java_Zulu_jdk/11.0.16-8/x64
downloading https://archive.apache.org/dist/maven/maven-3/3.5.4/binaries/apache-maven-3.5.4-bin.tar.gz
Error: connect ETIMEDOUT 138.201.131.134:443

@parrt (Member) commented Aug 13, 2022

@hs-apotell (Collaborator, Author)

The timeout issue is fixable. The current workflow doesn't cache the dependencies yet. Here's an action that can be incorporated so the builds don't reach out to a remote server for dependencies. I will incorporate this and submit a PR.

https://github.com/actions/cache
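For the Maven timeouts specifically, a caching step built on that action could look like the following sketch. This is an illustration of the technique, not the actual change from the follow-up PR; the cache key and path are the conventional ones for a local Maven repository:

```yaml
# Hypothetical caching step (sketch, not the actual PR change): keep the
# local Maven repository between runs so builds don't re-download
# dependencies from remote mirrors every time.
- name: Cache Maven dependencies
  uses: actions/cache@v3
  with:
    path: ~/.m2/repository
    key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
    restore-keys: |
      ${{ runner.os }}-maven-
```

Keying on the hash of the `pom.xml` files means the cache is reused until the dependency set changes, with `restore-keys` allowing a partial match to seed the cache after such a change.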

@parrt (Member) commented Aug 13, 2022

It looks like the timeout did not happen upon rerunning it. I'm going to turn off the remote actions, as it looks like we are good to go with hosted.

@parrt mentioned this pull request on Aug 13, 2022
@hs-apotell (Collaborator, Author)

I am not sure what your criteria are for judging the reliability of the builds on GitHub-hosted vs. self-hosted, so I can't really comment on that. Not a single build has succeeded in the last 20 days, whether hosted or otherwise. I checked nearly 40 builds on the dev branch over the last 20 days, and most failures are the typical network timeouts for GitHub-hosted. For self-hosted, some of the supported runtimes are not even being checked (Java and Dart are disabled, C++ isn't even included, and Go has been consistently failing).

The network timeout issue is fixable: remote sites tend to block repeated requests from the same IP to avoid spamming and other attacks. This is a well understood problem with proven solutions.

My personal experience says otherwise, but if you feel self-hosted builds are more reliable and are the way forward, please do so.

@parrt (Member) commented Aug 13, 2022

I'm saying the opposite and agreeing with you. :) I have turned off the remote and left the hosted running, which only fails due to timeouts at the moment. Maybe I should say self-hosted instead of remote.

@hs-apotell (Collaborator, Author)

Your use of the term "remote" was confusing. Let's just stick to the official terminology of GitHub-hosted and self-hosted.

I will post a PR with the cache action to avoid the network issue shortly.

@parrt (Member) commented Aug 13, 2022

Sounds good. Sorry for the confusion: self-hosted vs. hosted/GitHub-hosted. Thanks for the assist.

@hs-apotell (Collaborator, Author)

PR #3820

Please promote the workflow changes to master as well. The empty workflow is failing on master.

@parrt (Member) commented Aug 14, 2022

Done.

@parrt (Member) commented Aug 14, 2022

@hs-apotell Looks like we might have issues with master: https://github.com/antlr/antlr4/runs/7828046200?check_suite_focus=true

C:\Program Files\Microsoft Visual Studio\2022\Enterprise\VC\Tools\Llvm\x64\bin\clang.exe
clang: error: unknown argument '-version'; did you mean '--version'?

@hs-apotell (Collaborator, Author)

Ahh, there are other changes that will have to come to master for the builds to succeed there. Master and dev have diverged too far. The issues reported in the build failures are all fixed in dev.

Is a release due any time soon? If yes, then I would leave master as is and let the merge take care of it. If not, then I can disable the failing builds for master. Fixing those builds in master alone would be throwaway work, since it's going to get overwritten by dev anyway.

Alternatively, you could just leave it as is and let the future merge take care of it. The workflow was failing before the merge and will continue to fail, just for different reasons. The hope was that the merge would get the master builds to succeed, but that won't happen unless all the other relevant changes are also merged.

P.S. I don't pay attention to this repository on a regular basis, so if there's something that needs my attention in the context of CI builds, please file a bug and assign it to me.

@parrt (Member) commented Aug 15, 2022

Ok, sounds good.
