
Upgrade wasmvm to v1.0.0-beta9; use libwasmvm_muslc.aarch64.a in docker #790

Merged
merged 1 commit into master from use-libwasmvm_muslc.aarch64.a on Mar 30, 2022

Conversation

@webmaster128 (Member) commented Mar 22, 2022

This is working now 🎉

Steps to test:

  • Check out this branch
  • Build x86_64 image: docker build --pull -t "cosmwasm/wasmd:manual" . (unchanged)
  • Build aarch64 image: docker build --pull -t "cosmwasm/wasmd:manual" --build-arg arch=aarch64 .
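
For context, a minimal sketch of how the Dockerfile's arch build-arg might select the matching static library (the file names, version tag, and placement are assumptions; the Dockerfile in this branch is authoritative, and the real one also pins the download by checksum):

ARG arch=x86_64
# Hypothetical: fetch the muslc static library matching the target arch and
# place it where the linker will find it for -lwasmvm_muslc.
ADD https://github.com/CosmWasm/wasmvm/releases/download/v1.0.0-beta9/libwasmvm_muslc.${arch}.a /lib/libwasmvm_muslc.a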

@codecov (bot) commented Mar 22, 2022

Codecov Report

Merging #790 (35d86bd) into master (16ea5b0) will increase coverage by 0.03%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master     #790      +/-   ##
==========================================
+ Coverage   58.64%   58.68%   +0.03%     
==========================================
  Files          50       50              
  Lines        5845     5845              
==========================================
+ Hits         3428     3430       +2     
+ Misses       2166     2165       -1     
+ Partials      251      250       -1     
Impacted Files            Coverage Δ
x/wasm/keeper/keeper.go   87.90% <0.00%> (+0.35%) ⬆️

@webmaster128 webmaster128 changed the title Try using libwasmvm_muslc.aarch64.a in docker image Upgrade wasmvm to v1.0.0-beta8; use libwasmvm_muslc.aarch64.a in docker Mar 24, 2022
@webmaster128 webmaster128 marked this pull request as ready for review March 24, 2022 15:37
@webmaster128 webmaster128 requested a review from alpe as a code owner March 24, 2022 15:37
@alpe (Contributor) left a comment


This is a milestone 🏆 and will make many devs happy.

I have built and manually tested on OSX Intel, but it would be good to have @maurolacy also confirm for M1.
With this working, let's remove "M1 macs are not fully supported." from https://github.com/CosmWasm/wasmd#supported-systems

RUN LEDGER_ENABLED=false BUILD_TAGS=muslc make build
RUN LEDGER_ENABLED=false BUILD_TAGS=muslc LINK_STATICALLY=true make build
RUN echo "Ensuring binary is statically linked ..." \
&& (file /code/build/wasmd | grep "statically linked")
Contributor

nice check 👏
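
For anyone wanting to repeat that check by hand against a built image, one option is to copy the binary out and inspect it on the host (the binary path inside the image is an assumption):

# Hypothetical manual verification that the shipped binary is statically linked.
id=$(docker create cosmwasm/wasmd:manual)
docker cp "$id":/usr/bin/wasmd ./wasmd-from-image
docker rm "$id"
file ./wasmd-from-image   # expect output containing "statically linked"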

@maurolacy (Contributor)

I'm getting

#19 73.08 /usr/lib/gcc/aarch64-alpine-linux-musl/10.3.1/../../../../aarch64-alpine-linux-musl/bin/ld: skipping incompatible /lib/../lib/libwasmvm_muslc.a when searching for -lwasmvm_muslc
#19 73.08 /usr/lib/gcc/aarch64-alpine-linux-musl/10.3.1/../../../../aarch64-alpine-linux-musl/bin/ld: skipping incompatible /lib/libwasmvm_muslc.a when searching for -lwasmvm_muslc
#19 73.08 /usr/lib/gcc/aarch64-alpine-linux-musl/10.3.1/../../../../aarch64-alpine-linux-musl/bin/ld: cannot find -lwasmvm_muslc
#19 73.08 collect2: error: ld returned 1 exit status
#19 96.30 make: *** [Makefile:85: build] Error 2

when building for x86_64 (docker build --pull -t "cosmwasm/wasmd:manual" .).

The arm64 build worked.

@webmaster128 (Member, Author)

when building for x86_64 (docker build --pull -t "cosmwasm/wasmd:manual" .).

This is expected, since you start with an ARM base image when working on an M1 machine. There are two options to build the images now:

  1. Build each arch on a matching host natively
  2. Use docker cross arch building

This is going to be the next question for the deployment: do we build the ARM version on an ARM machine on CircleCI (1)? Or do we use cross builds and build both from the Intel machines (2)?

@maurolacy (Contributor)

Makes sense. In fact, I now tried with

docker buildx build --platform linux/amd64 --pull -t "cosmwasm/wasmd:manual" .

and it worked. It takes ~9 minutes to build the amd64 image this way, though.
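
If a buildx builder with multi-platform support is configured (e.g. docker buildx create --use plus QEMU/binfmt emulation), both images could in principle come out of a single invocation; a sketch, untested here, and assuming the Dockerfile derives its arch build-arg from buildx's TARGETARCH rather than requiring it manually:

# Sketch: build both architectures at once. Multi-platform results cannot be
# loaded into the local image store, so they are typically pushed with --push.
docker buildx build --platform linux/amd64,linux/arm64 --pull -t "cosmwasm/wasmd:manual" --push .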

@webmaster128 webmaster128 changed the title Upgrade wasmvm to v1.0.0-beta8; use libwasmvm_muslc.aarch64.a in docker Upgrade wasmvm to v1.0.0-beta9; use libwasmvm_muslc.aarch64.a in docker Mar 26, 2022
@alpe (Contributor) commented Mar 28, 2022

This is going to be the next question for the deployment: do we build the ARM version on an ARM machine on CircleCI (1)? Or do we use cross builds and build both from the Intel machines (2)?

We do not build release artifacts with wasmd but leave this to the implementing chains.

@webmaster128 (Member, Author)

We do not build release artifacts with wasmd but leave this to the implementing chains.

I'm thinking of https://hub.docker.com/r/cosmwasm/wasmd, which is heavily used for development. I think it's worth considering having both an x86_64 and an aarch64 build using docker's multi-arch functionality. But this is a separate ticket.

@ethanfrey (Member)

I think it's worth considering having both an x86_64 and an aarch64 build using docker's multi-arch functionality.

Yeah, I did some multi-arch builds for some of the tinygo deps when getting things working on arm64. Once you figure out the process (build both images, then make a linking image that references them), it's actually relatively easy. Just a bit of a time sink the first time.
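
A rough sketch of that linking step, assuming the per-arch images were already pushed under hypothetical -amd64 and -arm64 tags (docker manifest is one way to do it; buildx with --push achieves the same in one step):

# Hypothetical: tie the two per-arch images together under one multi-arch tag.
docker manifest create cosmwasm/wasmd:manual cosmwasm/wasmd:manual-amd64 cosmwasm/wasmd:manual-arm64
docker manifest push cosmwasm/wasmd:manual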

@webmaster128 webmaster128 merged commit 09629ab into master Mar 30, 2022
@webmaster128 webmaster128 deleted the use-libwasmvm_muslc.aarch64.a branch March 30, 2022 13:55
@webmaster128 (Member, Author)

Thank you @maurolacy for the cross build instructions. I collected them in this wiki page for all 4 cases: https://github.com/CosmWasm/wasmd/wiki/Maintainer-notes
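
For reference, the four cases presumably pair native and cross builds for each architecture; a hypothetical summary based on the commands in this thread (the wiki page above is authoritative):

# Native builds (run on a matching host):
docker build --pull -t "cosmwasm/wasmd:manual" .
docker build --pull -t "cosmwasm/wasmd:manual" --build-arg arch=aarch64 .
# Cross builds via buildx (run on either host):
docker buildx build --platform linux/amd64 --pull -t "cosmwasm/wasmd:manual" .
docker buildx build --platform linux/arm64 --pull -t "cosmwasm/wasmd:manual" --build-arg arch=aarch64 .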
