native-image memory usage higher in vm-21.2.0 compiled from source compared to the released binary #3697

Closed
zakkak opened this issue Aug 19, 2021 · 4 comments

zakkak commented Aug 19, 2021

Describe the issue

Looking into #3435, I tried compiling HelloWorld with vm-21.2.0 downloaded from https://github.com/graalvm/graalvm-ce-builds/releases/tag/vm-21.2.0 and noticed that native-image didn't use more memory than in vm-21.1.0. However, when compiling vm-21.2.0 from source using:

mx --primary-suite-path substratevm --components='Native Image' --native-image=native-image build

I can still observe increased memory usage.

21.2.0 release:            2.4 GB
21.2.0 built from source:  3.2 GB

How is this possible?

Comparing 10ff481 downloaded from https://github.com/graalvm/graalvm-ce-dev-builds/releases/tag/21.3.0-dev-20210817_2030 with 10ff481 built from source, I don't see such a difference (both report 3.2 GB).

Steps to reproduce the issue
Build native-image from source and compare its memory usage to that of the corresponding native-image binary released upstream. (Run multiple times to ensure the measurements are stable.)

For the measurements I compiled:

public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello World!");
    }
}

using
native-image HelloWorld
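
The issue does not spell out how the peak-memory figures were collected; native-image itself prints a running memory figure per build stage (visible in the logs further down), and an OS-level cross-check is also possible. A minimal sketch, assuming GNU time is available as /usr/bin/time (not the shell builtin):

# Report the builder's peak resident set size in addition to native-image's per-stage figures.
# GNU time prints "Maximum resident set size" (in kB) on stderr once the build finishes.
/usr/bin/time -v native-image HelloWorld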

@christianwimmer

@zakkak I have no idea what would be different, but if you find something, I'm interested to hear what is going on.

jerboaa commented Sep 17, 2021

@zakkak You mention "Build native-image from source". How exactly are you building it? Specifically, is this building libgraal native?

zakkak commented Sep 20, 2021

> @zakkak You mention "Build native-image from source". How exactly are you building it?

The command I used to build from source is:

mx --primary-suite-path substratevm --components='Native Image' --native-image=native-image build

> Specifically, is this building libgraal native?

Good catch, no it doesn't!

Building with:

mx --primary-suite-path substratevm --components='Native Image',LibGraal --native-images=native-image,lib:jvmcicompiler build

does so, but I am still getting similar results.

Enabling GC logs, I see that the actual heap usage is not that different between the two builds, despite the consistently higher heap capacity in the case of the built-from-source GraalVM.
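
For reference, GC logging output like the [info][gc] lines below can be produced by passing a unified-logging flag through to the JVM that runs the image builder via native-image's -J prefix; the exact invocation isn't quoted in this issue, so take this as a sketch:

# -J hands the flag to the image-builder JVM; -Xlog:gc yields the [info][gc] lines shown below.
native-image -J-Xlog:gc HelloWorld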

GraalVM 21.2.0 built from source with native libgraal

[0.016s][info][gc] Using Parallel
[0.507s][info][gc] GC(0) Pause Young (Metadata GC Threshold) 138M->8M(981M) 4.839ms
[0.528s][info][gc] GC(1) Pause Full (Metadata GC Threshold) 8M->8M(981M) 21.432ms
[0.903s][info][gc] GC(2) Pause Young (Metadata GC Threshold) 77M->18M(981M) 6.608ms
[0.947s][info][gc] GC(3) Pause Full (Metadata GC Threshold) 18M->17M(981M) 43.954ms
[1.381s][info][gc] GC(4) Pause Young (Metadata GC Threshold) 117M->31M(981M) 8.141ms
[1.494s][info][gc] GC(5) Pause Full (Metadata GC Threshold) 31M->29M(981M) 112.252ms
[helloworld:236347]    classlist:   1,259.74 ms,  0.96 GB
[helloworld:236347]        (cap):     396.66 ms,  0.96 GB
[helloworld:236347]        setup:   1,780.92 ms,  0.96 GB
[4.252s][info][gc] GC(6) Pause Young (Allocation Failure) 285M->70M(981M) 20.240ms
[5.199s][info][gc] GC(7) Pause Young (Allocation Failure) 326M->110M(981M) 34.386ms
[5.948s][info][gc] GC(8) Pause Young (Allocation Failure) 366M->151M(1249M) 44.884ms
[7.459s][info][gc] GC(9) Pause Young (Allocation Failure) 640M->225M(1271M) 71.958ms
[8.990s][info][gc] GC(10) Pause Young (Allocation Failure) 715M->279M(1784M) 74.979ms
[helloworld:236347]     (clinit):     162.36 ms,  1.74 GB
[helloworld:236347]   (typeflow):   3,978.54 ms,  1.74 GB
[helloworld:236347]    (objects):   2,901.01 ms,  1.74 GB
[helloworld:236347]   (features):     351.57 ms,  1.74 GB
[helloworld:236347]     analysis:   7,624.95 ms,  1.74 GB
[helloworld:236347]     universe:     692.29 ms,  1.74 GB
[12.432s][info][gc] GC(11) Pause Young (Allocation Failure) 1234M->356M(1808M) 98.267ms
[helloworld:236347]      (parse):     680.72 ms,  1.77 GB
[13.779s][info][gc] GC(12) Pause Young (Allocation Failure) 1311M->439M(2407M) 160.348ms
[helloworld:236347]     (inline):   1,098.49 ms,  2.35 GB
[19.376s][info][gc] GC(13) Pause Young (Allocation Failure) 1945M->456M(2429M) 160.039ms
[20.522s][info][gc] GC(14) Pause Young (Metadata GC Threshold) 925M->463M(3325M) 15.969ms
[21.262s][info][gc] GC(15) Pause Full (Metadata GC Threshold) 463M->261M(3325M) 740.533ms
[helloworld:236347]    (compile):  11,532.26 ms,  3.25 GB
[helloworld:236347]      compile:  14,199.87 ms,  3.25 GB
[27.396s][info][gc] GC(16) Pause Young (Allocation Failure) 2670M->320M(3328M) 24.389ms
[helloworld:236347]        image:   1,836.08 ms,  3.25 GB
[helloworld:236347]        write:     239.65 ms,  3.25 GB
[helloworld:236347]      [total]:  27,840.22 ms,  3.25 GB

GraalVM 21.2.0 downloaded from releases (with native libgraal as well)

[0.004s][info][gc] Using Parallel
[0.505s][info][gc] GC(0) Pause Young (Metadata GC Threshold) 159M->8M(981M) 8.801ms
[0.531s][info][gc] GC(1) Pause Full (Metadata GC Threshold) 8M->8M(981M) 25.212ms
[0.814s][info][gc] GC(2) Pause Young (Metadata GC Threshold) 84M->18M(981M) 4.074ms
[0.861s][info][gc] GC(3) Pause Full (Metadata GC Threshold) 18M->17M(981M) 47.066ms
[1.267s][info][gc] GC(4) Pause Young (Metadata GC Threshold) 115M->31M(981M) 8.739ms
[1.386s][info][gc] GC(5) Pause Full (Metadata GC Threshold) 31M->30M(981M) 118.520ms
[helloworld:237044]    classlist:   1,227.69 ms,  0.96 GB
[helloworld:237044]        (cap):     440.13 ms,  0.96 GB
[helloworld:237044]        setup:   1,994.59 ms,  0.96 GB
[4.503s][info][gc] GC(6) Pause Young (Allocation Failure) 286M->70M(981M) 48.718ms
[5.696s][info][gc] GC(7) Pause Young (Allocation Failure) 326M->110M(981M) 36.204ms
[6.520s][info][gc] GC(8) Pause Young (Allocation Failure) 366M->145M(1252M) 46.954ms
[7.878s][info][gc] GC(9) Pause Young (Allocation Failure) 638M->221M(1271M) 72.533ms
[9.585s][info][gc] GC(10) Pause Young (Allocation Failure) 714M->275M(1790M) 85.279ms
[helloworld:237044]     (clinit):     235.08 ms,  1.75 GB
[helloworld:237044]   (typeflow):   4,196.75 ms,  1.75 GB
[helloworld:237044]    (objects):   3,014.55 ms,  1.75 GB
[helloworld:237044]   (features):     386.21 ms,  1.75 GB
[helloworld:237044]     analysis:   8,093.87 ms,  1.75 GB
[helloworld:237044]     universe:     884.13 ms,  1.75 GB
[13.759s][info][gc] GC(11) Pause Young (Allocation Failure) 1238M->372M(1813M) 146.518ms
[helloworld:237044]      (parse):   1,109.51 ms,  1.77 GB
[helloworld:237044]     (inline):   1,759.35 ms,  1.77 GB
[16.199s][info][gc] GC(12) Pause Young (Allocation Failure) 1335M->436M(2429M) 194.098ms
[17.646s][info][gc] GC(13) Pause Young (Metadata GC Threshold) 608M->440M(2447M) 109.376ms
[18.362s][info][gc] GC(14) Pause Full (Metadata GC Threshold) 440M->245M(2447M) 715.395ms
[23.276s][info][gc] GC(15) Pause Young (Allocation Failure) 1768M->262M(2470M) 13.477ms
[26.177s][info][gc] GC(16) Pause Young (Allocation Failure) 1808M->279M(2465M) 20.019ms
[helloworld:237044]    (compile):  11,609.34 ms,  2.41 GB
[helloworld:237044]      compile:  15,188.41 ms,  2.41 GB
[helloworld:237044]        image:   1,728.64 ms,  2.41 GB
[helloworld:237044]        write:     210.80 ms,  2.41 GB
[helloworld:237044]      [total]:  29,587.77 ms,  2.41 GB

zakkak commented Sep 23, 2021

I am closing this, as it seems related to a slightly different allocation pattern that leads the GC to grow the heap a bit more aggressively in one of the two cases, while the actual memory usage is not that different.
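
If the heap-sizing explanation ever needs to be verified explicitly, one option (not tried in this issue, purely a suggestion) is to pin the image-builder JVM's heap to identical bounds in both setups via the -J pass-through and re-run the comparison:

# Fix the initial and maximum heap of the image-builder JVM so adaptive sizing cannot diverge,
# then compare the GC logs and the reported per-stage memory between the two builds.
native-image -J-Xms1g -J-Xmx3g -J-Xlog:gc HelloWorld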

zakkak closed this as completed Sep 23, 2021