
Katana: Add state update DA encodings #2474

Merged · 4 commits · Sep 25, 2024

Conversation


@kariy kariy commented Sep 24, 2024

Summary by CodeRabbit

  • New Features

    • Introduced a new recover function for data recovery from blobs.
    • Added functionality for encoding and decoding state updates related to blockchain state changes.
    • Implemented an inverse Fast Fourier Transform (IFFT) for BigUint vectors.
    • New module structure for better organization, including blob, eip4844, encoding, math, and serde.
    • Added a function to parse hexadecimal strings into blob data.
  • Bug Fixes

    • Enhanced error handling in encoding and decoding processes.
  • Tests

    • Added tests for parsing and encoding state updates from blob data.
    • Introduced a helper function for reading blob data in tests.


coderabbitai bot commented Sep 24, 2024

Walkthrough

Ohayo, sensei! This pull request introduces several new files and modules in the katana/primitives crate, focusing on data availability and cryptographic operations. Key changes include the addition of functions for encoding and decoding state updates, the implementation of the inverse Fast Fourier Transform (IFFT), and the introduction of constants related to EIP-4844. Additionally, the Cargo.toml file has been updated to manage dependencies more effectively.

Changes

  • crates/katana/primitives/Cargo.toml: Added num-traits and num-bigint = "0.4.6" as dependencies; removed num-traits from the [dev-dependencies] section; added rstest to [dev-dependencies].
  • crates/katana/primitives/src/da/blob.rs: Added pub fn recover(data: Vec<BigUint>) -> Vec<BigUint> for recovering original data from a blob, and pub fn transform(data: Vec<BigUint>) -> Vec<BigUint> for transforming data using FFT.
  • crates/katana/primitives/src/da/eip4844.rs: Added constants pub const BLOB_LEN: usize, pub static ref BLS_MODULUS: BigUint, pub static ref GENERATOR: BigUint, and pub static ref TWO: BigUint for EIP-4844 related operations.
  • crates/katana/primitives/src/da/encoding.rs: Added pub fn encode_state_updates(value: StateUpdates) -> Vec<BigUint> and pub fn decode_state_updates(value: &[BigUint]) -> Result<StateUpdates, EncodingError>; introduced the Metadata and ContractUpdate structs.
  • crates/katana/primitives/src/da/math.rs: Added pub fn ifft(arr: Vec<BigUint>, xs: Vec<BigUint>, p: &BigUint) -> Vec<BigUint> and pub fn div_mod(a: BigUint, b: BigUint, p: &BigUint) -> BigUint for mathematical operations; introduced pub static ref TWO: BigUint.
  • crates/katana/primitives/src/da/mod.rs: Added new modules: pub mod blob;, pub mod eip4844;, pub mod encoding;, pub mod math;, and pub mod serde;.
  • crates/katana/primitives/src/da/serde.rs: Added pub fn parse_str_to_blob_data(data: &str) -> Vec<BigUint> for parsing hexadecimal string data.
  • crates/katana/primitives/src/lib.rs: Added pub mod da; to expose the da module.
  • crates/katana/primitives/tests/blobs.rs: Introduced test functions fn parse_blobs_rt(#[case] blob: &str) -> Result<()> and fn read(path: &str) -> Vec<BigUint> for testing blob data parsing and encoding.
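The div_mod helper in math.rs divides in a prime field via Fermat's little theorem (a / b ≡ a · b^(p-2) mod p, for prime p and b ≠ 0). A minimal sketch of the same idea, using u64 with u128 intermediates in place of the crate's BigUint:

```rust
// Illustrative only: div_mod over a small prime field using Fermat's
// little theorem. The crate's version operates on BigUint; u64 with
// u128 intermediates stands in here to stay stdlib-only.
fn mod_pow(mut base: u64, mut exp: u64, p: u64) -> u64 {
    let mut acc: u64 = 1;
    base %= p;
    while exp > 0 {
        if exp & 1 == 1 {
            acc = ((acc as u128 * base as u128) % p as u128) as u64;
        }
        base = ((base as u128 * base as u128) % p as u128) as u64;
        exp >>= 1;
    }
    acc
}

fn div_mod(a: u64, b: u64, p: u64) -> u64 {
    // b^(p-2) is the modular inverse of b when p is prime and b != 0.
    ((a as u128 * mod_pow(b, p - 2, p) as u128) % p as u128) as u64
}

fn main() {
    // In GF(17): the inverse of 4 is 13 (4 * 13 = 52 ≡ 1), so 10 / 4 ≡ 10 * 13 ≡ 11.
    assert_eq!(div_mod(10, 4, 17), 11);
    println!("10 / 4 mod 17 = {}", div_mod(10, 4, 17));
}
```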

Suggested labels

sozo



@kariy kariy changed the title Katana: Add state update data availability encodings Katana: Add state update DA encodings Sep 24, 2024
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 10

Outside diff range and nitpick comments (11)
crates/katana/primitives/src/da/mod.rs (1)

1-5: Ohayo, sensei! The module structure looks sharp!

The organization of these modules seems well-thought-out, covering various aspects of data availability and cryptographic operations. It's a clean way to separate concerns and maintain modularity.

To enhance code clarity, consider adding documentation comments for each module. This will help other developers understand the purpose and contents of each module at a glance. For example:

/// Module for handling binary large objects (blobs)
pub mod blob;

/// Implementation of EIP-4844 related functionality
pub mod eip4844;

/// Utilities for encoding and decoding data
pub mod encoding;

/// Mathematical operations and utilities
pub mod math;

/// Serialization and deserialization functionality
pub mod serde;

What do you think, sensei? Would you like me to open a GitHub issue to track this documentation task?

crates/katana/primitives/src/da/serde.rs (1)

6-11: Ohayo, sensei! The function signature looks good, but let's enhance the documentation!

The function name and signature are clear and appropriate. However, we could improve the documentation to provide more context and details:

Consider expanding the documentation as follows:

/// Parse a hexadecimal string into a vector of `BigUint` representing blob data.
/// 
/// This function expects a hexadecimal string of length 64 * BLOB_LEN characters.
/// It splits the string into BLOB_LEN segments of 64 characters each and converts
/// each segment into a `BigUint`.
///
/// # Arguments
/// * `data` - A hexadecimal string representing the blob data.
///
/// # Returns
/// A vector of `BigUint` values, where each `BigUint` represents a 32-byte word of the blob.
///
/// # Panics
/// This function will panic if the input string contains invalid hexadecimal characters
/// or if its length is not exactly 64 * BLOB_LEN.
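The parsing behavior described in that doc comment can be illustrated with a stdlib-only sketch; parse_hex_words and the [u8; 32] word type below are hypothetical stand-ins for the crate's parse_str_to_blob_data over BigUint:

```rust
// Hypothetical stand-in for parse_str_to_blob_data: split a hex string
// into 64-character segments and decode each into a 32-byte word.
// The real function produces BigUint values; [u8; 32] keeps this sketch
// dependency-free.
fn parse_hex_words(data: &str) -> Vec<[u8; 32]> {
    data.as_bytes()
        .chunks(64)
        .map(|chunk| {
            assert_eq!(chunk.len(), 64, "input must be a multiple of 64 hex chars");
            let mut word = [0u8; 32];
            for (i, pair) in chunk.chunks(2).enumerate() {
                let byte_str = std::str::from_utf8(pair).expect("valid utf-8");
                word[i] = u8::from_str_radix(byte_str, 16).expect("valid hex");
            }
            word
        })
        .collect()
}

fn main() {
    // Two 32-byte words: the first ends in 0x2a, the second is all zeros.
    let hex = "00".repeat(31) + "2a" + &"00".repeat(32);
    let words = parse_hex_words(&hex);
    assert_eq!(words.len(), 2);
    assert_eq!(words[0][31], 0x2a);
    assert_eq!(words[1], [0u8; 32]);
}
```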
crates/katana/primitives/Cargo.toml (1)

14-14: Ohayo, sensei! New dependencies look good, but let's tweak num-bigint.

The addition of num-traits as a workspace dependency is great! However, for num-bigint, consider using workspace versioning for consistency:

-num-bigint = "0.4.6"
+num-bigint.workspace = true

This change will make it easier to manage versions across the workspace, sensei!

Also applies to: 27-27

crates/katana/primitives/src/da/eip4844.rs (2)

6-10: Ohayo, sensei! The constant looks good, but let's add a bit more flavor to the comment!

The BLOB_LEN constant is correctly defined and aligns with EIP-4844 specifications. Great job! However, we could make the comment more informative.

Consider expanding the comment to provide more context:

-/// Length of the blob.
+/// Number of field elements per blob, as specified in EIP-4844.
+/// Each element is a 32-byte word, so one blob carries 4096 * 32 bytes of data.
 pub const BLOB_LEN: usize = 4096;

12-26: Ohayo, sensei! The lazy_static definitions are mostly on point, but let's sharpen that katana a bit more!

The BLS_MODULUS and GENERATOR constants are correctly defined and align with EIP-4844 specifications. Excellent use of lazy_static for these large values!

However, the TWO constant seems a bit unnecessary as a lazy_static definition.

Consider replacing the TWO constant with a plain const declared outside the lazy_static! block (const items are not valid inside the macro):

-    pub static ref TWO: BigUint = 2u32.to_biguint().unwrap();
+pub const TWO: u32 = 2;

If you need TWO as a BigUint in multiple places, you could create a function to convert it on demand:

pub fn two_as_biguint() -> BigUint {
    2u32.to_biguint().unwrap()
}

This approach would be more efficient and clearer in its intent.

crates/katana/primitives/src/da/math.rs (4)

35-38: Nitpick: Avoid unnecessary cloning of TWO

Ohayo, sensei! You can improve performance by borrowing TWO instead of cloning it. Note that this assumes div_mod is changed to take its divisor by reference (b: &BigUint), and that the lazy_static value is explicitly dereferenced where operators are involved.

Apply this change:

-        res0.push(div_mod(a + b, TWO.clone(), p));
+        res0.push(div_mod(a + b, &TWO, p));

-        res1.push(div_mod(diff, TWO.clone() * x, p));
+        res1.push(div_mod(diff, &*TWO * x, p));

40-40: Nitpick: Use &TWO to avoid cloning

Ohayo, sensei! Similarly, you can avoid cloning TWO here by using a reference.

Apply this change:

-        new_xs.push(x.modpow(&TWO.clone(), p));
+        new_xs.push(x.modpow(&TWO, p));

68-68: Nitpick: Prevent unnecessary cloning in div_mod

Ohayo, sensei! You can avoid cloning TWO in the div_mod function as well; dereference the lazy_static value explicitly so the subtraction operates on &BigUint:

Apply this change:

-    a * b.modpow(&(p - TWO.clone()), p) % p
+    a * b.modpow(&(p - &*TWO), p) % p

19-69: Offer Assistance: Add unit tests for mathematical functions

Ohayo, sensei! To ensure the correctness of ifft and div_mod, it's beneficial to include unit tests covering various cases, including edge conditions.

Would you like me to help draft some unit tests for these functions?

crates/katana/primitives/src/da/encoding.rs (2)

261-283: Remove commented-out code to improve code cleanliness

Ohayo, sensei! The ContractUpdate::decode function is commented out (lines 261-283). If this code is no longer needed, consider removing it to keep the codebase clean and maintainable. Alternatively, if it's a work in progress, you might want to implement it fully or add a TODO comment.


190-197: Use proper Rust doc comments for documentation

Ohayo, sensei! The documentation for the Metadata struct's encoding format (lines 190-197) uses // comments. To generate documentation properly and follow Rust conventions, please use /// for struct-level documentation or //! for module-level documentation.

Suggested change:

 /// Metadata information about the contract update.
-// Encoding format:
-//
-// ┌───────────────┬───────────────┬───────────────┬───────────────────────────┐
-// │    padding    │  class flag   │   new nonce   │   no. storage updates     │
-// ├───────────────┼───────────────┼───────────────┼───────────────────────────┤
-// │    127 bits   │    1 bit      │    64 bits    │         64 bits           │
-// └───────────────┴───────────────┴───────────────┴───────────────────────────┘
+ /// Encoding format:
+ ///
+ /// ┌───────────────┬───────────────┬───────────────┬───────────────────────────┐
+ /// │    padding    │  class flag   │   new nonce   │   no. storage updates     │
+ /// ├───────────────┼───────────────┼───────────────┼───────────────────────────┤
+ /// │    127 bits   │    1 bit      │    64 bits    │         64 bits           │
+ /// └───────────────┴───────────────┴───────────────┴───────────────────────────┘
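For intuition, the layout in that table can be packed into a 256-bit word. A hypothetical sketch using four big-endian u64 limbs (the names and limb representation are illustrative; the actual Metadata encoding in encoding.rs may differ in detail):

```rust
// Hypothetical sketch of the 256-bit metadata word described above:
// | padding: 127 bits | class flag: 1 bit | new nonce: 64 bits | storage count: 64 bits |
// Represented as four big-endian u64 limbs.
fn encode_metadata(class_flag: bool, new_nonce: u64, n_storage_updates: u64) -> [u64; 4] {
    let mut word = [0u64; 4];
    // Bit 128 of the word (lowest bit of limb 1) carries the class flag;
    // limb 0 and the upper bits of limb 1 are padding.
    word[1] = class_flag as u64;
    word[2] = new_nonce;
    word[3] = n_storage_updates;
    word
}

fn decode_metadata(word: [u64; 4]) -> (bool, u64, u64) {
    ((word[1] & 1) == 1, word[2], word[3])
}

fn main() {
    let w = encode_metadata(true, 7, 3);
    assert_eq!(w, [0, 1, 7, 3]);
    assert_eq!(decode_metadata(w), (true, 7, 3));
    println!("metadata word: {:?}", w);
}
```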
Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 8f4bcbb and 526ebc5.

Files ignored due to path filters (1)
  • Cargo.lock is excluded by !**/*.lock
Files selected for processing (9)
  • crates/katana/primitives/Cargo.toml (2 hunks)
  • crates/katana/primitives/src/da/blob.rs (1 hunks)
  • crates/katana/primitives/src/da/eip4844.rs (1 hunks)
  • crates/katana/primitives/src/da/encoding.rs (1 hunks)
  • crates/katana/primitives/src/da/math.rs (1 hunks)
  • crates/katana/primitives/src/da/mod.rs (1 hunks)
  • crates/katana/primitives/src/da/serde.rs (1 hunks)
  • crates/katana/primitives/src/lib.rs (1 hunks)
  • crates/katana/primitives/tests/blobs.rs (1 hunks)
Additional comments not posted (6)
crates/katana/primitives/src/lib.rs (1)

7-7: Ohayo, sensei! Welcome to the new da module!

The addition of the da module looks great and follows the existing structure of the codebase. It's placed in the correct alphabetical order among the other module declarations.

crates/katana/primitives/src/da/serde.rs (1)

1-4: Ohayo, sensei! The imports look good!

The necessary types and constants are imported correctly for the function's implementation.

crates/katana/primitives/tests/blobs.rs (1)

1-6: Ohayo, sensei! The imports look good!

The necessary modules and types are imported correctly, providing all the required functionality for blob parsing and encoding tests.

crates/katana/primitives/Cargo.toml (1)

31-31: Ohayo again, sensei! Dev-dependencies look sharp!

The changes to the dev-dependencies section are on point:

  • Removing num-traits is correct as it's now a regular dependency.
  • Adding rstest as a workspace dependency is consistent with the project structure.

These changes will keep our testing setup clean and maintainable. Excellent work, sensei!

crates/katana/primitives/src/da/eip4844.rs (1)

1-4: Ohayo, sensei! The imports look sharp and focused!

The necessary imports for FromStr, lazy_static, and BigUint are all present and accounted for. No unnecessary imports detected. Excellent work!

crates/katana/primitives/src/da/blob.rs (1)

7-18: Ohayo, sensei! Great job on the documentation

The function is well-documented with clear explanations of its purpose, arguments, and return values.

Comment on lines 15 to 42
#[rstest]
#[case("./tests/test-data/blobs/block_636262.txt")]
#[case("./tests/test-data/blobs/block_636263.txt")]
#[case("./tests/test-data/blobs/block_636264.txt")]
fn parse_blobs_rt(#[case] blob: &str) -> Result<()> {
let encoded = blob::recover(read(blob));
let state_update = encoding::decode_state_updates(&encoded)?;
let _ = encode_state_updates(state_update);
Ok(())
}

🛠️ Refactor suggestion

Ohayo, sensei! Let's enhance the parse_blobs_rt test function.

The function structure looks good, but we can improve it to make it a more effective test:

  1. Add assertions to verify the correctness of the operations.
  2. Consider comparing the re-encoded state updates with the original encoded data.

Here's a suggested improvement:

#[rstest]
#[case("./tests/test-data/blobs/block_636262.txt")]
#[case("./tests/test-data/blobs/block_636263.txt")]
#[case("./tests/test-data/blobs/block_636264.txt")]
fn parse_blobs_rt(#[case] blob: &str) -> Result<()> {
    let original_encoded = blob::recover(read(blob));
    let state_update = encoding::decode_state_updates(&original_encoded)?;
    let re_encoded = encode_state_updates(state_update);

    assert_eq!(original_encoded, re_encoded, "Re-encoded data should match the original");

    Ok(())
}

This change makes the test more robust by verifying that the encoding and decoding processes are reversible.

return arr;
}

let n = arr.len() / 2;

⚠️ Potential issue

Issue: Ensure the input length is a power of two

Ohayo, sensei! The function assumes that arr.len() is divisible by 2. If arr.len() is odd, the calculation let n = arr.len() / 2; may lead to incorrect behavior. Please add a check to confirm that the length of arr is a power of two.

Consider adding this validation:

+    assert!(
+        arr.len().is_power_of_two(),
+        "Input array length must be a power of two"
+    );

Committable suggestion was skipped due to low confidence.

Comment on lines +31 to +49
let a = &arr[i];
let b = &arr[i + 1];
let x = &xs[i];

💡 Codebase verification

Missing Length Check for xs in ifft Calls

Ohayo, sensei! It appears there are no validations ensuring that xs.len() is at least as long as arr.len() before invoking ifft. This could lead to potential out-of-bounds errors. Please add the necessary length checks to ensure safe usage.

Analysis chain

Verify: Check that xs has sufficient length

Ohayo, sensei! The code uses xs[i] in the loop, but there's no check to ensure xs is long enough. Please verify that xs.len() is at least as long as arr.len().

Run the following script to confirm:


Ohayo, sensei! Let's ensure that every ifft call is safe by confirming that xs is indeed long enough. Running the above script will help us verify that xs.len() is checked against arr.len() wherever ifft is invoked. Once you have the results, feel free to share them for the next steps!


Ohayo, sensei! The previous check didn’t yield any results. Let’s run this new script to comprehensively verify that xs is sufficiently long before each ifft call. Share the results once it's executed, and we'll proceed from there!

Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Ensure that 'xs' vectors provided to 'ifft' have appropriate lengths.

# Test: Search for all calls to 'ifft' and compare the lengths of 'arr' and 'xs'.
rg --type rust 'ifft\((?P<arr>[^,]+),\s*(?P<xs>[^,]+),' -A1

Length of output: 623


Script:

#!/bin/bash
# Description: Verify that 'xs.len()' is at least as long as 'arr.len()' before 'ifft' calls.

# Search for 'ifft(' and include 5 lines of context above each match
rg --type rust -C5 'ifft\(' | \
# Filter the results to include only those where 'xs.len() >= arr.len()' exists in the context
grep -B5 'xs\.len\(\)\s*>=\s*arr\.len\(\)'

Length of output: 74


Script:

#!/bin/bash
# Description: Verify that 'xs.len()' is at least as long as 'arr.len()' before 'ifft' calls with broader search patterns.

# Search for 'ifft(' and include 5 lines of context above each match
rg --type rust -C5 'ifft\(' | \
# Look for any comparison between 'xs.len()' and 'arr.len()' in the context
grep -E 'xs\.len\(\)\s*(>=|>|==)\s*arr\.len\(\)|arr\.len\(\)\s*(<=|<|==)\s*xs\.len\(\)'

Length of output: 119


codecov bot commented Sep 24, 2024

Codecov Report

Attention: Patch coverage is 99.17012% with 2 lines in your changes missing coverage. Please review.

Project coverage is 68.61%. Comparing base (731ff47) to head (07ddee3).
Report is 1 commit behind head on main.

Files with missing lines Patch % Lines
crates/katana/primitives/src/da/encoding.rs 98.80% 2 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2474      +/-   ##
==========================================
+ Coverage   68.45%   68.61%   +0.15%     
==========================================
  Files         368      372       +4     
  Lines       48185    48426     +241     
==========================================
+ Hits        32987    33226     +239     
- Misses      15198    15200       +2     

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

Commits

Files that changed from the base of the PR and between c26f76b and 07ddee3.

Files ignored due to path filters (1)
  • Cargo.lock is excluded by !**/*.lock
Files selected for processing (9)
  • crates/katana/primitives/Cargo.toml (2 hunks)
  • crates/katana/primitives/src/da/blob.rs (1 hunks)
  • crates/katana/primitives/src/da/eip4844.rs (1 hunks)
  • crates/katana/primitives/src/da/encoding.rs (1 hunks)
  • crates/katana/primitives/src/da/math.rs (1 hunks)
  • crates/katana/primitives/src/da/mod.rs (1 hunks)
  • crates/katana/primitives/src/da/serde.rs (1 hunks)
  • crates/katana/primitives/src/lib.rs (1 hunks)
  • crates/katana/primitives/tests/blobs.rs (1 hunks)
Files skipped from review as they are similar to previous changes (8)
  • crates/katana/primitives/Cargo.toml
  • crates/katana/primitives/src/da/eip4844.rs
  • crates/katana/primitives/src/da/encoding.rs
  • crates/katana/primitives/src/da/math.rs
  • crates/katana/primitives/src/da/mod.rs
  • crates/katana/primitives/src/da/serde.rs
  • crates/katana/primitives/src/lib.rs
  • crates/katana/primitives/tests/blobs.rs
Additional comments not posted (1)
crates/katana/primitives/src/da/blob.rs (1)

1-5: LGTM! Imports are well-organized.

Ohayo, sensei! The import statements are concise and relevant to the functionality implemented in this file. Good job on keeping the imports clean and organized.

Comment on lines +31 to +41
pub fn transform(data: Vec<BigUint>) -> Vec<BigUint> {
let xs: Vec<BigUint> = (0..BLOB_LEN)
.map(|i| {
let bin = format!("{:012b}", i);
let bin_rev = bin.chars().rev().collect::<String>();
GENERATOR.modpow(&BigUint::from_str_radix(&bin_rev, 2).unwrap(), &BLS_MODULUS)
})
.collect();

fft(data, xs, &BLS_MODULUS)
}

🛠️ Refactor suggestion

⚠️ Potential issue

Improve robustness and reduce code duplication

Ohayo, sensei! The transform function looks good, but we can make it more robust and reduce code duplication:

  1. Add input validation and handle potential panic (similar to recover):
 pub fn transform(data: Vec<BigUint>) -> Vec<BigUint> {
+    if data.len() != BLOB_LEN {
+        panic!("Invalid data length: expected {}, got {}", BLOB_LEN, data.len());
+    }
     let xs: Vec<BigUint> = (0..BLOB_LEN)
         .map(|i| {
             let bin = format!("{:012b}", i);
             let bin_rev = bin.chars().rev().collect::<String>();
-            GENERATOR.modpow(&BigUint::from_str_radix(&bin_rev, 2).unwrap(), &BLS_MODULUS)
+            let exponent = BigUint::from_str_radix(&bin_rev, 2)
+                .expect("Failed to parse reversed binary string into BigUint");
+            GENERATOR.modpow(&exponent, &BLS_MODULUS)
         })
         .collect();

     fft(data, xs, &BLS_MODULUS)
 }
  2. To reduce code duplication, consider extracting the common logic into a separate function:
fn generate_xs() -> Vec<BigUint> {
    (0..BLOB_LEN)
        .map(|i| {
            let bin = format!("{:012b}", i);
            let bin_rev = bin.chars().rev().collect::<String>();
            let exponent = BigUint::from_str_radix(&bin_rev, 2)
                .expect("Failed to parse reversed binary string into BigUint");
            GENERATOR.modpow(&exponent, &BLS_MODULUS)
        })
        .collect()
}

pub fn recover(data: Vec<BigUint>) -> Vec<BigUint> {
    if data.len() != BLOB_LEN {
        panic!("Invalid data length: expected {}, got {}", BLOB_LEN, data.len());
    }
    let xs = generate_xs();
    ifft(data, xs, &BLS_MODULUS)
}

pub fn transform(data: Vec<BigUint>) -> Vec<BigUint> {
    if data.len() != BLOB_LEN {
        panic!("Invalid data length: expected {}, got {}", BLOB_LEN, data.len());
    }
    let xs = generate_xs();
    fft(data, xs, &BLS_MODULUS)
}

These changes will improve the robustness of both functions and make the code more maintainable.
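The format!("{:012b}") reverse-and-parse trick used in generate_xs is a 12-bit bit-reversal permutation (4096 = 2^12 indices). A hypothetical integer-only sketch of the same permutation, with illustrative names:

```rust
// Bit-reversal of a 12-bit index: equivalent to formatting i as 12
// binary digits, reversing the string, and parsing it back, as the
// blob code does before exponentiating GENERATOR.
fn reverse_bits_12(i: u32) -> u32 {
    let mut out = 0;
    for k in 0..12 {
        out |= ((i >> k) & 1) << (11 - k);
    }
    out
}

fn main() {
    // 0b000000000001 reversed is 0b100000000000 = 2048.
    assert_eq!(reverse_bits_12(1), 2048);
    // The permutation is an involution on 12-bit values.
    assert_eq!(reverse_bits_12(reverse_bits_12(1234)), 1234);
    println!("ok");
}
```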

Comment on lines +19 to +29
pub fn recover(data: Vec<BigUint>) -> Vec<BigUint> {
let xs: Vec<BigUint> = (0..BLOB_LEN)
.map(|i| {
let bin = format!("{:012b}", i);
let bin_rev = bin.chars().rev().collect::<String>();
GENERATOR.modpow(&BigUint::from_str_radix(&bin_rev, 2).unwrap(), &BLS_MODULUS)
})
.collect();

ifft(data, xs, &BLS_MODULUS)
}

⚠️ Potential issue

Add input validation and handle potential panic

Ohayo, sensei! The recover function looks good overall, but I have two suggestions to improve its robustness:

  1. Add input validation to ensure the correct data length:
 pub fn recover(data: Vec<BigUint>) -> Vec<BigUint> {
+    if data.len() != BLOB_LEN {
+        panic!("Invalid data length: expected {}, got {}", BLOB_LEN, data.len());
+    }
     let xs: Vec<BigUint> = (0..BLOB_LEN)
  2. Handle the potential panic from unwrap():
         .map(|i| {
             let bin = format!("{:012b}", i);
             let bin_rev = bin.chars().rev().collect::<String>();
-            GENERATOR.modpow(&BigUint::from_str_radix(&bin_rev, 2).unwrap(), &BLS_MODULUS)
+            let exponent = BigUint::from_str_radix(&bin_rev, 2)
+                .expect("Failed to parse reversed binary string into BigUint");
+            GENERATOR.modpow(&exponent, &BLS_MODULUS)
         })

These changes will make the function more robust and easier to debug if issues arise.

