SPAdes fails to compile from source on an AWS r7i.2xlarge instance #1362

Open · wiebepo opened this issue Aug 27, 2024 · 7 comments


wiebepo commented Aug 27, 2024

Description of bug

We migrated our Jenkins server to an r7i.2xlarge instance (8 vCPU, 64 GiB DDR5 memory). One of our Jenkins jobs builds SPAdes from source. Here is an excerpt of the Dockerfile:

FROM library/ubuntu:22.04

RUN apt-get update && \
    apt-get upgrade -y && \
    apt-get install -y libbz2-dev liblz1 g++ cmake python-is-python3

RUN git clone --quiet https://github.com/ablab/spades.git -b main && \
    cd spades && \
    PREFIX=/usr/local ./spades_compile.sh && \
    cd .. &&   \
    rm -fr spades && \
    spades.py --test

spades.log

params.txt

No log available; the failure occurs during compilation.

SPAdes version

SPAdes 4.0.0

Operating System

Ubuntu 22.04

Python Version

3.10.12

Method of SPAdes installation

compile from source

No errors reported in spades.log

  • Yes
asl (Member) commented Aug 27, 2024

@wiebepo Does this reproduce only on this kind of instance? And no problem on smaller ones?

wiebepo (Author) commented Aug 27, 2024

> @wiebepo Does this reproduce only on this kind of instance? And no problem on smaller ones?

We cloned the server onto an r6i.2xlarge instance, and the job succeeds there. The r6i is the predecessor: also 8 vCPUs and 64 GiB of RAM, but with DDR4 memory. Not sure if this matters.

wiebepo (Author) commented Aug 27, 2024

FWIW, this error from the attached log seems to be the most relevant:

/spades/ext/include/blaze/blaze/system/CacheSize.h:77:1: error: static assertion failed: Compile time condition violated
   77 | BLAZE_STATIC_ASSERT( blaze::cacheSize > 100000UL && blaze::cacheSize < 100000000UL );
      | ^~~~~~~~~~~~~~~~~~~
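
For context, the condition on line 77 bounds blaze::cacheSize to the open interval (100000, 100000000) bytes, i.e. roughly 100 KB to 100 MB. A minimal standalone sketch of the failure mode, assuming BLAZE_STATIC_ASSERT reduces to a plain static_assert (the value is illustrative, matching the L3 size reported further down this thread), is expected to fail compilation with the same message:

// Sketch only, not blaze's actual header. Any cacheSize outside the
// interval fails at compile time, which is how the SPAdes build dies.
#include <cstddef>

constexpr std::size_t cacheSize = 110100480UL;  // 107520 KiB in bytes

static_assert(cacheSize > 100000UL && cacheSize < 100000000UL,
              "Compile time condition violated");

int main() { return 0; }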

asl (Member) commented Aug 27, 2024

> FWIW, this error from the attached log seems to be the most relevant:

Yes, exactly. This is why I asked whether the problem reproduces on a smaller instance.

asl (Member) commented Aug 27, 2024

Could you please provide the output of the following commands on that instance?

cat /sys/devices/system/cpu/cpu0/cache/index3/size
cat /sys/devices/system/cpu/cpu0/cache/index2/size
cat /sys/devices/system/cpu/cpu0/cache/index1/size

Thanks!

wiebepo (Author) commented Aug 27, 2024

Added to the Dockerfile and reran:

RUN cat /sys/devices/system/cpu/cpu0/cache/index3/size && \
    cat /sys/devices/system/cpu/cpu0/cache/index2/size && \
    cat /sys/devices/system/cpu/cpu0/cache/index1/size

Output:

107520K
2048K
32K
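
That L3 reading is what trips the assert: 107520 KiB is 107520 × 1024 = 110,100,480 bytes, above the 100,000,000-byte upper bound. A quick runnable check of the arithmetic, with the bounds copied from the failing line (a sketch, not project code):

// Convert the sysfs value to bytes and test it against the bounds
// used in blaze/system/CacheSize.h:77.
#include <iostream>

int main() {
    unsigned long l3 = 107520UL * 1024UL;            // 110100480 bytes
    bool ok = (l3 > 100000UL) && (l3 < 100000000UL); // blaze's condition
    std::cout << l3 << " bytes -> "
              << (ok ? "within bounds" : "violates the assert") << '\n';
    return 0;
}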

asl (Member) commented Aug 27, 2024

I filed an upstream issue with blaze; let's see what they say. IMO this static assert does not make much sense.
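
In the meantime, a possible local workaround, assuming the vendored blaze honors its documented BLAZE_CACHE_SIZE configuration macro (blaze's config/CacheSize.h only supplies a default when the macro is not already defined), would be to pin an in-range value before any blaze header is included. A sketch, not an official fix; the 32 MiB figure is arbitrary:

// Hypothetical workaround sketch, not a maintainer-endorsed fix:
// pre-define BLAZE_CACHE_SIZE to a value inside (100000, 100000000)
// so the static assert passes; tune the value for your machine.
#define BLAZE_CACHE_SIZE 33554432UL   // 32 MiB
#include <blaze/Blaze.h>

int main() { return 0; }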
