
Intermittent Upload Failure of Hardhat Artifact Files to Sourcify Verifier #1270

Closed
mshakeg opened this issue Feb 15, 2024 · 14 comments

mshakeg commented Feb 15, 2024

Description

I've been experiencing intermittent failures when trying to verify contracts by uploading Hardhat artifact files (build-info/{hash}.json) to the Sourcify Verifier (https://sourcify.dev/#/verifier). This issue only started occurring recently; I had been able to upload files successfully in the past without any problem. The failures are sporadic: some attempts succeed while others fail (or never complete "Checking contracts"), with no clear pattern or identifiable cause.

Steps to Reproduce

  1. Go to https://sourcify.dev/#/verifier.
  2. Click on "Verify Contracts".
  3. Select the appropriate network.
  4. Upload the build-info/{hash}.json file generated by Hardhat, such as the attached file.
  5. Observe that the upload never completes "Checking contracts".

Expected Behavior

The expected behavior is for the file to be uploaded successfully every time, allowing the contract to be verified without issue.

Actual Behavior

The upload process intermittently fails. Sometimes it completes as expected, but at other times the upload never finishes and verification cannot proceed. No error message is displayed in the UI.

4be0a250f2e0bdcedfcf348751eb11c6.json


mshakeg commented Feb 15, 2024

Sometimes the following error popup shows up:

[Screenshot: sourcify upload artifact issue]

@kuzdogan kuzdogan self-assigned this Feb 16, 2024

kuzdogan commented:

Thanks for the detailed description!

From what I see, we have some performance issues when adding large files. At first sight, the hash function we use to generate identifiers for the files in a session seems to take quite a long time. I couldn't fully reproduce it on https://staging.sourcify.dev/#/verifier, but there it also seems to take a long time.

We should replace the keccaks with more lightweight identifier generators.
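
For illustration, a minimal sketch of a lighter identifier, assuming only that session file IDs need to be stable per content rather than keccak256 (the function name here is hypothetical, not Sourcify's actual code):

import { createHash } from "crypto";

// Session file identifiers only need to be stable and unique per content;
// Node's native hash bindings are far cheaper than a pure-JS keccak256.
function sessionFileId(content: string): string {
  return createHash("sha1").update(content).digest("hex");
}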

kuzdogan commented:

I can save a couple of seconds locally by changing how we create identifiers for the session: 5800643

But the main issue seems to come in lib-sourcify when we are generating variations:

for (const pathContent of files) {
  for (const variation of generateVariations(pathContent)) {
    const calculatedHash = keccak256str(variation.content);
    byHash.set(calculatedHash, variation);
  }
}

Here we have to use keccak256 because the source file identifiers in the metadata file are keccak hashes. We generate variations of each extracted source file (adding whitespace etc.) and hash every single one of them. For this artifact that results in a whopping 1360 variations, meaning 1360 hashes. Locally it takes a couple of seconds, but on a busy server I guess this takes much longer than the request timeout, which is why you are seeing that error.
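
For context, a rough sketch of the matching step this hashing enables, with hypothetical types loosely modeled on lib-sourcify (the Solidity metadata's sources section keys each file by its keccak256 hash, which is why a cheaper hash can't be substituted here):

type Variation = { path: string; content: string };

// byHash is the map built by the loop above: keccak256(content) -> variation.
// Looking up each metadata hash recovers the exact bytes the compiler saw.
function findSourcesByMetadataHash(
  metadataSources: Record<string, { keccak256: string }>,
  byHash: Map<string, Variation>
): Map<string, Variation> {
  const found = new Map<string, Variation>();
  for (const [path, { keccak256 }] of Object.entries(metadataSources)) {
    const variation = byHash.get(keccak256);
    if (variation) found.set(path, variation);
  }
  return found;
}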

In the meantime, I recommend using hardhat-verify directly if possible. That way you don't have to use the session API, and you don't send source files that aren't needed, which can happen when dumping the whole build artifact.
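
For reference, a minimal hardhat.config.ts sketch for verifying through hardhat-verify's Sourcify support (the network entry and compiler version are illustrative):

import "@nomicfoundation/hardhat-verify";
import { HardhatUserConfig } from "hardhat/config";

const config: HardhatUserConfig = {
  solidity: "0.8.19",
  networks: {
    avalanche: {
      url: "https://api.avax.network/ext/bc/C/rpc",
      chainId: 43114,
    },
  },
  // Verify against Sourcify rather than (or alongside) Etherscan-style APIs.
  sourcify: { enabled: true },
};

export default config;

Verification is then a single command, e.g. npx hardhat verify --network avalanche <address> <constructor args>, which sends only the sources the contract actually needs.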

mshakeg commented Feb 16, 2024

@kuzdogan ok, thanks for the update. I'm not sure what the Sourcify cloud setup is like, but if each instance has multiple cores available, you could look at parallelizing the execution of keccak256str across multiple worker threads.
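
A rough sketch of that suggestion, assuming the js-sha3 package and a hypothetical hash-worker module (this illustrates the idea being proposed, not Sourcify's code):

// hash-worker.ts -- runs in each worker thread, compiled to hash-worker.js
import { parentPort } from "worker_threads";
import { keccak256 } from "js-sha3";

parentPort!.on("message", ({ id, content }: { id: number; content: string }) => {
  parentPort!.postMessage({ id, hash: "0x" + keccak256(content) });
});

// main thread: spread the variation contents across a small worker pool
import { Worker } from "worker_threads";
import * as os from "os";

function hashAllParallel(contents: string[]): Promise<string[]> {
  if (contents.length === 0) return Promise.resolve([]);
  const poolSize = Math.min(os.cpus().length, 4);
  const workers = Array.from(
    { length: poolSize },
    () => new Worker("./hash-worker.js")
  );
  const results: string[] = new Array(contents.length);
  let done = 0;
  return new Promise((resolve) => {
    for (const w of workers) {
      w.on("message", ({ id, hash }: { id: number; hash: string }) => {
        results[id] = hash;
        if (++done === contents.length) {
          workers.forEach((x) => x.terminate());
          resolve(results);
        }
      });
    }
    contents.forEach((content, id) =>
      workers[id % poolSize].postMessage({ id, content })
    );
  });
}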

mshakeg commented Feb 24, 2024

@kuzdogan not much has changed in my experience. I'm not sure if autoscaling is enabled to handle increased load? Maybe tighten the per-IP rate limits, as the service really hasn't been functional since I created this issue.

@kuzdogan kuzdogan assigned marcocastignoli and unassigned kuzdogan Feb 26, 2024

kuzdogan commented:

Yes, sorry, we found a general performance issue with the session APIs, affecting not only this contract but almost all of them. Since the session APIs are not used much, it seems to have gone unnoticed. If the second keccak generation were the issue, it would have manifested in the non-session API too, so the problem is not there.

In the meantime, can you please share the chainId and the address of the contract you want to verify?

If verifying just this contract solves your issue, you can verify via the non-session API, or we can verify it for you.

I'm off this week so @marcocastignoli will take over this issue.

mshakeg commented Feb 26, 2024

@kuzdogan sure, the contracts are on Avalanche C-Chain (chainId 43114). I've verified them on the https://snowtrace.io/ explorer.

ICHIVaultFactory: 0xDD2346e0dA9540792C2F2E86016bc44Ba39DC72d
UV3Math: 0x921aCCA39e8D3519A503EE4A11b56d6eEACbb2Aa
ICHIVaultDeployer: 0xf3145E8Cd87E94B65cF5Ba336292d557aD380e5B

@kuzdogan kuzdogan assigned kuzdogan and unassigned marcocastignoli Mar 5, 2024

kuzdogan commented Mar 5, 2024

I was able to manually verify the ICHIVaultFactory at 0xDD2346e0dA9540792C2F2E86016bc44Ba39DC72d, but the other two have nested auxdatas, which is why their verification failed. See #851

We are going to handle these cases soon; it's just not added to the verification yet. cc: @marcocastignoli

We still need to fix the performance issue in the session verification.

@kuzdogan kuzdogan moved this to Todo in Sourcify Public Mar 6, 2024
@kuzdogan kuzdogan moved this from Todo to In Progress in Sourcify Public Mar 6, 2024
@kuzdogan kuzdogan moved this from In Progress to Todo in Sourcify Public Mar 11, 2024
mshakeg commented Mar 18, 2024

Hey @kuzdogan, I just wanted to follow up on the progress of this issue. Sourcify is basically non-functional, at least whenever I try to use it.

kuzdogan commented Mar 28, 2024

@mshakeg Sorry for the late response. I'm trying to debug this further.

Could you please try verifying in an incognito browser window, or after deleting the cookies?

The issue seems to be with the existing sessions.

marcocastignoli commented Apr 3, 2024

I discovered that we are creating a new session for every request (not only the ones calling the "/session" endpoints).

I think that in production a new session is created for each of the hundreds of requests we receive every second, causing the session storage to become very slow. Before moving on with #1321, I would try limiting session usage to only the "/session" endpoints and see if it fixes the problem.
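
A sketch of that fix under an assumed Express setup (Sourcify's actual wiring may differ): mount express-session only on the /session router so stateless requests never touch the session store, and set saveUninitialized: false so empty sessions aren't persisted:

import express from "express";
import session from "express-session";

const app = express();

const sessionMiddleware = session({
  secret: process.env.SESSION_SECRET ?? "dev-only",
  resave: false,
  saveUninitialized: false, // don't store sessions that were never written to
});

const sessionRouter = express.Router();
sessionRouter.use(sessionMiddleware); // session state only exists under /session
sessionRouter.post("/input-files", (req, res) => {
  // stateful verification flow goes here (endpoint name is illustrative)
  res.sendStatus(200);
});
app.use("/session", sessionRouter);

// stateless endpoints bypass the session store entirely
app.post("/verify", (req, res) => res.sendStatus(200));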

kuzdogan commented Apr 3, 2024

@marcocastignoli Great, let's see if this solves the issue, and if so, do a quick release.

We should move away from the MemorySession regardless, but fixing this should happen ASAP.
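
On the longer-term point, a sketch of swapping the default in-memory store for a shared one, assuming the connect-redis v7 API (express-session's own docs flag its MemoryStore as unsuitable for production):

import session from "express-session";
import RedisStore from "connect-redis";
import { createClient } from "redis";

const redisClient = createClient({ url: process.env.REDIS_URL });
await redisClient.connect();

// Sessions survive restarts and are shared across instances behind a
// load balancer, instead of living in one process's heap.
const sessionMiddleware = session({
  store: new RedisStore({ client: redisClient }),
  secret: process.env.SESSION_SECRET ?? "dev-only",
  resave: false,
  saveUninitialized: false,
});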

kuzdogan commented Apr 4, 2024

The issue should be fixed with the latest release, @mshakeg. Sorry for taking so long to fix this.

@kuzdogan kuzdogan closed this as completed Apr 4, 2024
@github-project-automation github-project-automation bot moved this from Todo to Done in Sourcify Public Apr 4, 2024
mshakeg commented Apr 4, 2024

@kuzdogan thanks, it seems to be much better now.
