feat(noir_js): Expose UltraHonk and integration tests #5656

Merged · 9 commits · Aug 2, 2024
142 changes: 141 additions & 1 deletion compiler/integration-tests/test/node/prove_and_verify.test.ts
@@ -2,7 +2,12 @@ import { expect } from 'chai';
import assert_lt_json from '../../circuits/assert_lt/target/assert_lt.json' assert { type: 'json' };
import fold_fibonacci_json from '../../circuits/fold_fibonacci/target/fold_fibonacci.json' assert { type: 'json' };
import { Noir } from '@noir-lang/noir_js';
import { BarretenbergBackend as Backend, BarretenbergVerifier as Verifier } from '@noir-lang/backend_barretenberg';
import {
BarretenbergBackend as Backend,
BarretenbergVerifier as Verifier,
UltraHonkBackend,
UltraHonkVerifier,
} from '@noir-lang/backend_barretenberg';
import { CompiledCircuit } from '@noir-lang/types';

const assert_lt_program = assert_lt_json as CompiledCircuit;
@@ -150,3 +155,138 @@ it('end-to-end proof creation and verification for multiple ACIR circuits (inner
const isValid = await backend.verifyProof(proof);
expect(isValid).to.be.true;
});

const honkBackend = new UltraHonkBackend(assert_lt_program);

it('UltraHonk end-to-end proof creation and verification (outer)', async () => {
// Noir.Js part
const inputs = {
x: '2',
y: '3',
};

const program = new Noir(assert_lt_program);

const { witness } = await program.execute(inputs);

// bb.js part
//
// Proof creation
const proof = await honkBackend.generateProof(witness);

// Proof verification
const isValid = await honkBackend.verifyProof(proof);
expect(isValid).to.be.true;
});

it('UltraHonk end-to-end proof creation and verification (outer) -- Verifier API', async () => {
// Noir.Js part
const inputs = {
x: '2',
y: '3',
};

// Execute program
const program = new Noir(assert_lt_program);
const { witness } = await program.execute(inputs);

// Generate proof
const proof = await honkBackend.generateProof(witness);

const verificationKey = await honkBackend.getVerificationKey();

// Proof verification
const verifier = new UltraHonkVerifier();
const isValid = await verifier.verifyProof(proof, verificationKey);
expect(isValid).to.be.true;
});

it('UltraHonk end-to-end proof creation and verification (inner)', async () => {
// Noir.Js part
const inputs = {
x: '2',
y: '3',
};

const program = new Noir(assert_lt_program);

const { witness } = await program.execute(inputs);

// bb.js part
//
// Proof creation
const proof = await honkBackend.generateProof(witness);

// Proof verification
const isValid = await honkBackend.verifyProof(proof);
expect(isValid).to.be.true;
});

it('UltraHonk end-to-end proving and verification with different instances', async () => {
// Noir.Js part
const inputs = {
x: '2',
y: '3',
};

const program = new Noir(assert_lt_program);

const { witness } = await program.execute(inputs);

// bb.js part
const proof = await honkBackend.generateProof(witness);

const verifier = new UltraHonkBackend(assert_lt_program);
const proof_is_valid = await verifier.verifyProof(proof);
expect(proof_is_valid).to.be.true;
});

it('[BUG] -- UltraHonk bb.js null function or function signature mismatch (outer-inner)', async () => {
// Noir.Js part
const inputs = {
x: '2',
y: '3',
};

const program = new Noir(assert_lt_program);

const { witness } = await program.execute(inputs);

// bb.js part
//
// Proof creation
//
// Create two proofs with the same proving system; most of the time
// one would only generate a single outer proof.
const proofOuter = await honkBackend.generateProof(witness);
const proofInner = await honkBackend.generateProof(witness);

// Proof verification
//
const isValidOuter = await honkBackend.verifyProof(proofOuter);
expect(isValidOuter).to.be.true;
// Verifying the second proof with the same backend also succeeds.
const isValidInner = await honkBackend.verifyProof(proofInner);
expect(isValidInner).to.be.true;
});

it('UltraHonk end-to-end proof creation and verification for multiple ACIR circuits (inner)', async () => {
// Noir.Js part
const inputs = {
x: '10',
};

const program = new Noir(fold_fibonacci_program);

const { witness } = await program.execute(inputs);

// bb.js part
//
// Proof creation
const honkBackend = new UltraHonkBackend(fold_fibonacci_program);
const proof = await honkBackend.generateProof(witness);

// Proof verification
const isValid = await honkBackend.verifyProof(proof);
expect(isValid).to.be.true;
});
14 changes: 14 additions & 0 deletions docs/docs/tutorials/noirjs_app.md
@@ -346,3 +346,17 @@ You have successfully generated a client-side Noir web app!
You can see how NoirJS is used in a full stack Next.js hardhat application in the [noir-starter repo here](https://github.com/noir-lang/noir-starter/tree/main/vite-hardhat). The example shows how to calculate a proof in the browser and verify it with a deployed Solidity verifier contract from NoirJS.

You should also check out the more advanced examples in the [noir-examples repo](https://github.com/noir-lang/noir-examples), where you'll find reference usage for some cool apps.

## UltraHonk Backend

Barretenberg has recently exposed a new UltraHonk backend, which can be used in NoirJS from version 0.33.0 onwards. Everything works the same as in the tutorial above, except for the class you import:
```js
import { UltraHonkBackend, UltraHonkVerifier as Verifier } from '@noir-lang/backend_barretenberg';
```
The backend will then be instantiated as such:
```js
const backend = new UltraHonkBackend(circuit);
```
All the commands to prove and verify your circuit then stay the same.

The only feature currently unsupported with UltraHonk is [recursive proofs](../explainers/explainer-recursion.md).
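Because both backends expose the same proving interface, the rest of your code is unchanged when you swap classes. The sketch below illustrates the idea; note that the `ProvingBackend` interface and `StubBackend` here are hypothetical stand-ins for explanation, not the real `@noir-lang/backend_barretenberg` types:

```typescript
// Illustrative only: a minimal interface capturing the shape shared by the
// two backends. Not the real @noir-lang/backend_barretenberg types.
interface ProvingBackend {
  generateProof(witness: Uint8Array): Promise<Uint8Array>;
  verifyProof(proof: Uint8Array): Promise<boolean>;
}

// Written once against the interface, this flow works with either backend.
async function proveAndVerify(backend: ProvingBackend, witness: Uint8Array): Promise<boolean> {
  const proof = await backend.generateProof(witness);
  return backend.verifyProof(proof);
}

// Stub backend standing in for BarretenbergBackend or UltraHonkBackend.
class StubBackend implements ProvingBackend {
  async generateProof(witness: Uint8Array): Promise<Uint8Array> {
    return new Uint8Array(witness); // pretend the witness bytes are the proof
  }
  async verifyProof(proof: Uint8Array): Promise<boolean> {
    return proof.length > 0;
  }
}
```

Swapping in the real `UltraHonkBackend` for `StubBackend` is then a one-line change at the construction site.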
144 changes: 141 additions & 3 deletions tooling/noir_js_backend_barretenberg/src/backend.ts
@@ -2,8 +2,8 @@ import { decompressSync as gunzip } from 'fflate';
import { acirToUint8Array } from './serialize.js';
import { Backend, CompiledCircuit, ProofData, VerifierBackend } from '@noir-lang/types';
import { BackendOptions } from './types.js';
import { deflattenPublicInputs } from './public_inputs.js';
import { reconstructProofWithPublicInputs } from './verifier.js';
import { deflattenFields } from './public_inputs.js';
import { reconstructProofWithPublicInputs, reconstructProofWithPublicInputsHonk } from './verifier.js';
import { type Barretenberg } from '@aztec/bb.js';

// This is the number of bytes in an UltraPlonk proof
@@ -50,6 +50,7 @@ export class BarretenbergBackend implements Backend, VerifierBackend {
this.acirUncompressedBytecode,
honkRecursion,
);

const crs = await Crs.new(subgroupSize + 1);
await api.commonInitSlabAllocator(subgroupSize);
await api.srsInitSrs(new RawBuffer(crs.getG1Data()), crs.numPoints, new RawBuffer(crs.getG2Data()));
@@ -73,7 +74,7 @@

const publicInputsConcatenated = proofWithPublicInputs.slice(0, splitIndex);
const proof = proofWithPublicInputs.slice(splitIndex);
const publicInputs = deflattenPublicInputs(publicInputsConcatenated);
const publicInputs = deflattenFields(publicInputsConcatenated);

return { proof, publicInputs };
}
@@ -143,3 +144,140 @@ export class BarretenbergBackend implements Backend, VerifierBackend {
await this.api.destroy();
}
}

// Buffers are prepended with their size. The size takes 4 bytes.
const serializedBufferSize = 4;
const fieldByteSize = 32;
const publicInputOffset = 3;
const publicInputsOffsetBytes = publicInputOffset * fieldByteSize;

export class UltraHonkBackend implements Backend, VerifierBackend {
// These type assertions are used so that we don't
// have to initialize `api` in the constructor.
// These are initialized asynchronously in the `init` function,
// constructors cannot be asynchronous which is why we do this.

protected api!: Barretenberg;
protected acirUncompressedBytecode: Uint8Array;

constructor(
acirCircuit: CompiledCircuit,
protected options: BackendOptions = { threads: 1 },
) {
const acirBytecodeBase64 = acirCircuit.bytecode;
this.acirUncompressedBytecode = acirToUint8Array(acirBytecodeBase64);
}

/** @ignore */
async instantiate(): Promise<void> {
if (!this.api) {
if (typeof navigator !== 'undefined' && navigator.hardwareConcurrency) {
this.options.threads = navigator.hardwareConcurrency;
} else {
try {
const os = await import('os');
this.options.threads = os.cpus().length;
} catch (e) {
console.log('Could not detect environment. Falling back to one thread.', e);
}
}
const { Barretenberg, RawBuffer, Crs } = await import('@aztec/bb.js');
const api = await Barretenberg.new(this.options);

const honkRecursion = true;
const [_exact, _total, subgroupSize] = await api.acirGetCircuitSizes(
this.acirUncompressedBytecode,
honkRecursion,
);
const crs = await Crs.new(subgroupSize + 1);
await api.commonInitSlabAllocator(subgroupSize);
await api.srsInitSrs(new RawBuffer(crs.getG1Data()), crs.numPoints, new RawBuffer(crs.getG2Data()));

// We don't init a proving key here in the Honk API
// await api.acirInitProvingKey(this.acirComposer, this.acirUncompressedBytecode);
this.api = api;
}
}

async generateProof(compressedWitness: Uint8Array): Promise<ProofData> {
await this.instantiate();
const proofWithPublicInputs = await this.api.acirProveUltraHonk(
this.acirUncompressedBytecode,
gunzip(compressedWitness),
);
const proofAsStrings = deflattenFields(proofWithPublicInputs.slice(4));

// The second field of the proof encodes the number of public inputs
const numPublicInputs = Number(proofAsStrings[1]);

// Account for the serialized buffer size at start
const publicInputsOffset = publicInputsOffsetBytes + serializedBufferSize;
// Get the part before and after the public inputs
const proofStart = proofWithPublicInputs.slice(0, publicInputsOffset);
const publicInputsSplitIndex = numPublicInputs * fieldByteSize;
const proofEnd = proofWithPublicInputs.slice(publicInputsOffset + publicInputsSplitIndex);
// Construct the proof without the public inputs
const proof = new Uint8Array([...proofStart, ...proofEnd]);

// Extract the public inputs from the proof
const publicInputsConcatenated = proofWithPublicInputs.slice(
publicInputsOffset,
publicInputsOffset + publicInputsSplitIndex,
);
const publicInputs = deflattenFields(publicInputsConcatenated);

return { proof, publicInputs };
}

async verifyProof(proofData: ProofData): Promise<boolean> {
const { RawBuffer } = await import('@aztec/bb.js');

const proof = reconstructProofWithPublicInputsHonk(proofData);

await this.instantiate();
const vkBuf = await this.api.acirWriteVkUltraHonk(this.acirUncompressedBytecode);

return await this.api.acirVerifyUltraHonk(proof, new RawBuffer(vkBuf));
}

async getVerificationKey(): Promise<Uint8Array> {
await this.instantiate();
return await this.api.acirWriteVkUltraHonk(this.acirUncompressedBytecode);
}

// TODO(https://github.com/noir-lang/noir/issues/5661): Update this to handle Honk recursive aggregation in the browser once it is ready in the backend itself
async generateRecursiveProofArtifacts(
_proofData: ProofData,
_numOfPublicInputs: number,
): Promise<{ proofAsFields: string[]; vkAsFields: string[]; vkHash: string }> {
await this.instantiate();
// TODO(https://github.com/noir-lang/noir/issues/5661): This needs to be updated to handle recursive aggregation.
// There is still a proofAsFields method but we could consider getting rid of it as the proof itself
// is a list of field elements.
// UltraHonk also does not have public inputs directly prepended to the proof and they are still instead
// inserted at an offset.
// const proof = reconstructProofWithPublicInputs(proofData);
// const proofAsFields = (await this.api.acirProofAsFieldsUltraHonk(proof)).slice(numOfPublicInputs);

// TODO: perhaps we should put this in the init function. Need to benchmark
// how long it takes.
const vkBuf = await this.api.acirWriteVkUltraHonk(this.acirUncompressedBytecode);
const vk = await this.api.acirVkAsFieldsUltraHonk(vkBuf);

return {
// TODO(https://github.com/noir-lang/noir/issues/5661)
proofAsFields: [],
vkAsFields: vk.map((field) => field.toString()),
// We use an empty string for the vk hash here as it is unneeded as part of the recursive artifacts.
// The user can be expected to hash the vk inside their circuit to check whether the vk matches the
// circuit they expect.
vkHash: '',
};
}

async destroy(): Promise<void> {
if (!this.api) {
return;
}
await this.api.destroy();
}
}
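The slicing arithmetic in `generateProof` above can be sanity-checked in isolation. The following standalone sketch mirrors the same layout logic: a 4-byte size prefix, public inputs starting at a 3-field offset, and the proof reassembled without them. `splitHonkProof` is a hypothetical helper name introduced here for illustration, not part of the backend:

```typescript
// Standalone sketch of the proof layout handled by UltraHonkBackend.generateProof.
// Constants mirror the ones defined in backend.ts.
const serializedBufferSize = 4; // buffers are prepended with a 4-byte size
const fieldByteSize = 32; // one field element
const publicInputOffset = 3; // public inputs start 3 fields into the proof
const publicInputsOffsetBytes = publicInputOffset * fieldByteSize;

// Hypothetical helper: split a proof-with-public-inputs buffer into the
// bare proof and the public input bytes.
function splitHonkProof(
  proofWithPublicInputs: Uint8Array,
  numPublicInputs: number,
): { proof: Uint8Array; publicInputs: Uint8Array } {
  const offset = publicInputsOffsetBytes + serializedBufferSize;
  const size = numPublicInputs * fieldByteSize;
  const proofStart = proofWithPublicInputs.slice(0, offset);
  const publicInputs = proofWithPublicInputs.slice(offset, offset + size);
  const proofEnd = proofWithPublicInputs.slice(offset + size);
  // Reassemble the proof without the public inputs in the middle
  const proof = new Uint8Array(proofStart.length + proofEnd.length);
  proof.set(proofStart, 0);
  proof.set(proofEnd, proofStart.length);
  return { proof, publicInputs };
}

// Synthetic buffer: size prefix + 6 fields, with the 2 public input fields set to 1.
const buf = new Uint8Array(serializedBufferSize + 6 * fieldByteSize);
buf.fill(
  1,
  serializedBufferSize + publicInputsOffsetBytes,
  serializedBufferSize + publicInputsOffsetBytes + 2 * fieldByteSize,
);
const { proof, publicInputs } = splitHonkProof(buf, 2);
```

Splitting this way keeps `ProofData` consistent across backends even though UltraHonk embeds public inputs at an offset rather than prepending them.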
4 changes: 2 additions & 2 deletions tooling/noir_js_backend_barretenberg/src/index.ts
@@ -1,5 +1,5 @@
export { BarretenbergBackend } from './backend.js';
export { BarretenbergVerifier } from './verifier.js';
export { BarretenbergBackend, UltraHonkBackend } from './backend.js';
export { BarretenbergVerifier, UltraHonkVerifier } from './verifier.js';

// typedoc exports
export { Backend, CompiledCircuit, ProofData } from '@noir-lang/types';
10 changes: 5 additions & 5 deletions tooling/noir_js_backend_barretenberg/src/public_inputs.ts
@@ -1,16 +1,16 @@
import { WitnessMap } from '@noir-lang/types';

export function flattenPublicInputsAsArray(publicInputs: string[]): Uint8Array {
const flattenedPublicInputs = publicInputs.map(hexToUint8Array);
export function flattenFieldsAsArray(fields: string[]): Uint8Array {
const flattenedPublicInputs = fields.map(hexToUint8Array);
return flattenUint8Arrays(flattenedPublicInputs);
}

export function deflattenPublicInputs(flattenedPublicInputs: Uint8Array): string[] {
export function deflattenFields(flattenedFields: Uint8Array): string[] {
const publicInputSize = 32;
const chunkedFlattenedPublicInputs: Uint8Array[] = [];

for (let i = 0; i < flattenedPublicInputs.length; i += publicInputSize) {
const publicInput = flattenedPublicInputs.slice(i, i + publicInputSize);
for (let i = 0; i < flattenedFields.length; i += publicInputSize) {
const publicInput = flattenedFields.slice(i, i + publicInputSize);
chunkedFlattenedPublicInputs.push(publicInput);
}

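The rename above reflects that these helpers now serialize arbitrary fields, not just public inputs. A rough standalone sketch of the round trip (the hex-conversion details here are illustrative assumptions, not copied from the library): each field is a 32-byte chunk, and flatten/deflatten convert between hex strings and one concatenated byte array.

```typescript
// Sketch of the renamed helpers from public_inputs.ts; hex handling is an
// illustrative assumption, not the library's exact implementation.
const FIELD_SIZE = 32; // bytes per field element

function hexToField(hex: string): Uint8Array {
  const stripped = hex.startsWith('0x') ? hex.slice(2) : hex;
  const padded = stripped.padStart(FIELD_SIZE * 2, '0');
  const bytes = new Uint8Array(FIELD_SIZE);
  for (let i = 0; i < FIELD_SIZE; i++) {
    bytes[i] = parseInt(padded.slice(i * 2, i * 2 + 2), 16);
  }
  return bytes;
}

function fieldToHex(bytes: Uint8Array): string {
  return '0x' + Array.from(bytes, (b) => b.toString(16).padStart(2, '0')).join('');
}

// Flatten an array of hex-encoded fields into one contiguous byte array.
function flattenFieldsAsArray(fields: string[]): Uint8Array {
  const out = new Uint8Array(fields.length * FIELD_SIZE);
  fields.forEach((f, i) => out.set(hexToField(f), i * FIELD_SIZE));
  return out;
}

// Split a contiguous byte array back into 32-byte hex-encoded fields.
function deflattenFields(flattened: Uint8Array): string[] {
  const fields: string[] = [];
  for (let i = 0; i < flattened.length; i += FIELD_SIZE) {
    fields.push(fieldToHex(flattened.slice(i, i + FIELD_SIZE)));
  }
  return fields;
}

const fields = ['0x' + '00'.repeat(31) + '02', '0x' + '00'.repeat(31) + '03'];
const flattened = flattenFieldsAsArray(fields);
const roundTrip = deflattenFields(flattened);
```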