
refactor: replace direct SNARK with batched SNARK #197

Open · 2 of 3 tasks
adr1anh opened this issue Dec 20, 2023 · 2 comments
Labels: enhancement (New feature or request)

adr1anh (Contributor) commented Dec 20, 2023

The implementations of the batched SNARKs should be equivalent to the non-batched versions. We can replace the direct implementation with the batched one by using a number of instances = 1.

  • direct: Conditionally squeeze the outer_r challenge in the case where the number of instances = 1.
  • pre-processing: Move all SumcheckEngine related structs from ppsnark.rs to sumcheck.rs
  • Implement RelaxedR1CSSNARK for batched_ppsnark::RelaxedR1CSSNARK and batched::BatchedRelaxedR1CSSNARK using vectors of size 1.

This should help with maintenance going forward as we will only have to maintain 2 SNARKs rather than 4.
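
A minimal sketch of the delegation pattern described above, using deliberately simplified, hypothetical trait shapes (the actual traits in the codebase take prover/verifier keys, commitment engines, and a transcript, which are omitted here); the only point illustrated is that a single-instance SNARK can be obtained by calling the batched one with vectors (here, slices) of size 1:

```rust
// Hypothetical, simplified trait shapes, used only to illustrate the wiring.
// The real traits carry prover/verifier keys, commitment engines, and a
// transcript, which are omitted in this sketch.
pub trait BatchedSnark {
    type Instance;
    type Witness;
    type Proof;
    type Error;

    /// Prove a batch of instances; the "direct" case is a batch of size 1.
    fn batch_prove(
        instances: &[Self::Instance],
        witnesses: &[Self::Witness],
    ) -> Result<Self::Proof, Self::Error>;

    /// Verify a proof against the same batch of instances.
    fn batch_verify(
        proof: &Self::Proof,
        instances: &[Self::Instance],
    ) -> Result<(), Self::Error>;
}

/// A single-instance SNARK obtained by delegating to a batched SNARK with
/// vectors of size 1, as proposed in this issue.
pub struct DirectFromBatched<S: BatchedSnark>(core::marker::PhantomData<S>);

impl<S: BatchedSnark> DirectFromBatched<S> {
    pub fn prove(
        instance: &S::Instance,
        witness: &S::Witness,
    ) -> Result<S::Proof, S::Error> {
        // Wrap the single instance/witness into length-1 slices and delegate.
        S::batch_prove(
            core::slice::from_ref(instance),
            core::slice::from_ref(witness),
        )
    }

    pub fn verify(proof: &S::Proof, instance: &S::Instance) -> Result<(), S::Error> {
        S::batch_verify(proof, core::slice::from_ref(instance))
    }
}
```

With this shape, the only batching-specific logic that would need a single-instance special case is the conditional outer_r squeeze from the first checkbox above.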

adr1anh added the enhancement label on Dec 20, 2023
huitseeker (Member) commented
  1. See Refactor pre-processing SNARK #220 for the pre-processing step.
  2. I think we could simply delete the direct SNARK, as it is unused and there are no plans for using it.

adr1anh (Contributor, Author) commented Jan 1, 2024

+1 on removing the direct SNARK. It would also allow us to remove a lot of the SumcheckProof::prove_* methods and focus on extending the SumcheckEngine approach.
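
For context, the SumcheckEngine approach could be centred on an interface roughly like the sketch below; this is a pared-down, hypothetical rendering (the trait of the same name in ppsnark.rs has a richer interface), but it shows why per-instance hooks driven by one generic sum-check prover would make the ad-hoc SumcheckProof::prove_* entry points redundant:

```rust
// Hypothetical, pared-down sketch of a SumcheckEngine-style interface;
// the real trait in ppsnark.rs exposes more (e.g. initial and final claims)
// and is parameterized over the engine's scalar field.
pub trait SumcheckEngineSketch<Scalar> {
    /// Number of variables (rounds) this instance contributes to.
    fn num_rounds(&self) -> usize;
    /// Degree of the univariate polynomial sent in each round.
    fn degree(&self) -> usize;
    /// Evaluations of the current round polynomial at the required points.
    fn evaluation_points(&self) -> Vec<Scalar>;
    /// Fold the internal polynomials with the verifier challenge `r`.
    fn bound(&mut self, r: &Scalar);
}
```

A generic prover could then loop over the rounds, combine the contributions of each engine, and call bound with the transcript challenge, rather than keeping a separate SumcheckProof::prove_* variant per polynomial shape.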
