
feat(engine): add zstd compression support for geometry attributes #530

Merged
merged 2 commits into main from feature/modify-geometry-extractor
Sep 26, 2024

Conversation

miseyu
Contributor

@miseyu miseyu commented Sep 26, 2024

Overview

What I've done

What I haven't done

How I tested

Screenshot

Which point I want you to review particularly

Memo

Summary by CodeRabbit

  • New Features

    • Introduced a new compression and decompression module using Zstandard (zstd).
    • Added new action nodes in the workflow for enhanced geometry data processing, including GeometryExtractorCityGml and FeatureFilterByLineOverlap.
    • Enhanced workflow with a new GeometryExtractor2D action for improved geometrical data extraction.
  • Bug Fixes

    • Improved error handling in the base64 decoding function to manage invalid UTF-8 sequences.
  • Documentation

    • Updated action nodes and attributes in the workflow configuration for clarity and improved functionality.

@miseyu miseyu self-assigned this Sep 26, 2024
@miseyu miseyu requested a review from a team as a code owner September 26, 2024 03:23
Contributor

coderabbitai bot commented Sep 26, 2024

Caution

Review failed

The pull request is closed.

Walkthrough

The pull request introduces several changes across multiple files in the project. A new dependency on the zstd crate is added to the Cargo.toml files, enhancing data processing capabilities. Modifications are made to the GeometryExtractor and GeometryReplacer implementations, replacing Base64 encoding with Zstandard compression and decompression methods. A new module for compression and decompression is created, along with updates to error handling. Additionally, the workflow configuration is refined with new action nodes and updated attributes, improving the overall geometry processing logic.

Changes

  • engine/Cargo.toml, engine/runtime/common/Cargo.toml: Added the new dependency zstd = "0.13.0" and zstd.workspace = true.
  • engine/runtime/action-processor/src/geometry/extractor.rs: Replaced the base64_encode call with the compress function from reearth_flow_common::compress.
  • engine/runtime/action-processor/src/geometry/replacer.rs: Updated the decoding logic, replacing base64_decode with the decode function from the new compression module.
  • engine/runtime/common/src/compress.rs: Introduced a new module for compressing and decompressing strings using Zstandard, with functions compress and decode (a rough sketch follows this list).
  • engine/runtime/common/src/lib.rs: Added a new Compress variant to the Error enum and a method to create Compress errors.
  • engine/runtime/common/src/str.rs: Modified base64_decode to return a String using String::from_utf8_lossy and added a new function base64_decode_byte.
  • engine/runtime/examples/plateau/testdata/workflow/quality-check/...: Updated workflow.yml with renamed action nodes, new attributes, and connections to enhance geometry data processing.
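
For orientation, a minimal sketch of what the new compress/decode pair might look like, assuming zstd at its default level wrapped in base64 so the result stays printable (the base64 wrapping is inferred from the base64_encode and base64_decode_byte imports shown in the review comments below; the actual implementation in engine/runtime/common/src/compress.rs may differ in details such as the compression level, signatures, and error messages):

use crate::{
    str::{base64_decode_byte, base64_encode},
    Error, Result,
};

/// Compress a string with zstd, then base64-encode the bytes so the result can
/// be stored in a text attribute. A level of 0 selects zstd's default level.
pub fn compress<T: AsRef<str>>(text: T) -> Result<String> {
    let compressed = zstd::encode_all(text.as_ref().as_bytes(), 0)
        .map_err(|e| Error::compress(format!("{e}")))?;
    Ok(base64_encode(compressed))
}

/// Reverse of `compress`: base64-decode, zstd-decompress, and return the text,
/// replacing any invalid UTF-8 sequences.
pub fn decode<T: AsRef<[u8]>>(encoded: T) -> Result<String> {
    let bytes = base64_decode_byte(encoded)?;
    let decompressed = zstd::decode_all(bytes.as_slice())
        .map_err(|e| Error::compress(format!("{e}")))?;
    Ok(String::from_utf8_lossy(&decompressed).to_string())
}

With a helper like this, the extractor side simply calls compress on the serialized geometry where it previously called base64_encode, and the replacer calls decode before deserializing.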

Possibly related PRs

Suggested labels

workflow

Poem

🐇 In the meadow where data flows,
A zesty zstd breeze now blows.
Compressing bytes with joyful cheer,
Geometry dances, crystal clear.
With new actions, our workflow sings,
Hopping high on data's wings! 🌼




netlify bot commented Sep 26, 2024

Deploy Preview for reearth-flow canceled.

  • 🔨 Latest commit: 8298cb5
  • 🔍 Latest deploy log: https://app.netlify.com/sites/reearth-flow/deploys/66f4d6331fa0a3000846015e

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 5

🧹 Outside diff range and nitpick comments (7)
engine/runtime/common/src/compress.rs (1)

1-4: Add missing import for zstd.

The zstd crate is used in the code but not explicitly imported. Consider adding it to the import list for clarity.

Add the following import at the beginning of the file:

+use zstd;

use crate::{
    str::{base64_decode_byte, base64_encode},
    Error, Result,
};
engine/runtime/common/src/lib.rs (1)

30-31: LGTM! Consider adding a doc comment for the new variant.

The addition of the Compress variant to the Error enum is consistent with the existing error handling structure. This suggests that compression functionality is being introduced to the project, which aligns with the PR objectives.

Consider adding a doc comment to explain when this error variant is used, similar to other variants in the enum. For example:

/// Error occurred during compression or decompression operations.
#[error("CompressError: {0}")]
Compress(String),
engine/runtime/action-processor/src/geometry/extractor.rs (2)

101-101: LGTM: Zstd compression implemented correctly

The change from base64_encode to compress correctly implements the zstd compression for the geometry attribute. The overall structure of the process method remains intact, and error handling is consistent.

However, consider the following points:

  1. Data size: While zstd compression generally provides better compression ratios than base64 encoding, it would be beneficial to measure and document the impact on data size for typical geometry attributes.
  2. Processing time: Compression may introduce additional processing overhead. It might be worth benchmarking to ensure that the performance impact is acceptable, especially for large datasets.
  3. Backwards compatibility: Ensure that any existing systems or processes that expect base64 encoded data are updated to handle the new compressed format.

Consider adding configuration options to allow users to choose between compression and base64 encoding, or to adjust compression levels, to provide flexibility for different use cases and performance requirements.
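
As one illustration of such an option, the zstd level could be exposed as a parameter on the common compress helper; the name and signature below are assumptions for the sketch, not part of this PR:

/// Hypothetical variant of `compress` that lets the caller pick the zstd level
/// (1 = fastest, 19 = smallest output; 0 = zstd's default).
pub fn compress_with_level<T: AsRef<str>>(text: T, level: i32) -> Result<String> {
    let compressed = zstd::encode_all(text.as_ref().as_bytes(), level)
        .map_err(|e| Error::compress(format!("{e}")))?;
    Ok(base64_encode(compressed))
}

The GeometryExtractor parameters could then carry an optional compression level (or a flag to fall back to plain base64) that is forwarded to this helper.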


Line range hint 1-116: Summary: Zstd compression successfully implemented with some considerations

The changes in this file successfully implement zstd compression for geometry attributes, aligning well with the PR objectives. The implementation is correct and maintains the existing error handling and method structure.

However, there are a few points to consider for future improvements:

  1. Performance impact: It would be beneficial to benchmark the compression process, especially for large datasets, to ensure acceptable performance.
  2. Data size impact: Measure and document the effect on data size for typical geometry attributes when using zstd compression compared to the previous base64 encoding.
  3. Flexibility: Consider adding configuration options to allow users to choose between compression and base64 encoding, or to adjust compression levels.
  4. Backwards compatibility: Ensure that systems expecting base64 encoded data are updated to handle the new compressed format.

To address these points, consider the following actions:

  1. Add performance benchmarks for the compression process.
  2. Implement configuration options for compression vs. base64 encoding.
  3. Document the data size impact of zstd compression vs. base64 encoding.
  4. Create a migration guide for updating systems that expect base64 encoded data.
engine/runtime/common/src/str.rs (2)

20-21: Approve change with suggestion for documentation.

The use of String::from_utf8_lossy improves the robustness of the base64_decode function by handling invalid UTF-8 sequences. This change prevents the function from failing when encountering such sequences.

Consider adding a comment or updating the function documentation to mention that invalid UTF-8 sequences will be replaced with the Unicode replacement character (U+FFFD). This information helps users understand the behavior when dealing with potentially malformed input.

 pub fn base64_decode<T: AsRef<[u8]>>(s: T) -> crate::Result<String> {
     let result = general_purpose::STANDARD
         .decode(s)
         .map_err(|e| crate::Error::Str(format!("{}", e)))?;
+    // Convert to String, replacing invalid UTF-8 sequences with the Unicode replacement character
     Ok(String::from_utf8_lossy(&result).to_string())
 }

23-26: Approve new function with suggestion for error handling.

The addition of base64_decode_byte is a good improvement, allowing for direct decoding of base64 data into bytes without UTF-8 validation. This is particularly useful for handling non-text data.

Consider using map_err with Into::into for more concise error handling:

 pub fn base64_decode_byte<T: AsRef<[u8]>>(s: T) -> crate::Result<Vec<u8>> {
     general_purpose::STANDARD
         .decode(s)
-        .map_err(|e| crate::Error::Str(format!("{}", e)))
+        .map_err(|e| crate::Error::Str(e.to_string()))
 }

This change maintains the same functionality while slightly improving readability.

engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml (1)

Line range hint 1-481: Overall workflow improvements

The changes made to this workflow file represent a significant improvement in clarity, specificity, and functionality. The introduction of more granular geometry processing, specific naming conventions, and additional filtering steps enhances the overall quality check process for transportation-related features.

Key improvements:

  1. More specific node naming (e.g., GeometryExtractorCityGml)
  2. Separate handling of 2D geometry
  3. Enhanced line overlap analysis and filtering

These changes should make the workflow more robust and easier to maintain. However, to ensure the changes have been implemented correctly and don't introduce any unintended consequences, please:

  1. Review the results of the verification scripts provided in the previous comments.
  2. Conduct thorough testing of the updated workflow with various input datasets.
  3. Update any documentation or comments within the code to reflect these changes.

Consider creating a visual representation of this workflow (e.g., a flowchart) to aid in understanding and future maintenance. This could be particularly helpful for new team members or for quickly identifying potential areas for further optimization.

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

📥 Commits

Files that changed from the base of the PR and between 78d9177 and 1501b49.

🔇 Files ignored due to path filters (1)
  • engine/Cargo.lock is excluded by !**/*.lock
📒 Files selected for processing (8)
  • engine/Cargo.toml (1 hunks)
  • engine/runtime/action-processor/src/geometry/extractor.rs (2 hunks)
  • engine/runtime/action-processor/src/geometry/replacer.rs (2 hunks)
  • engine/runtime/common/Cargo.toml (1 hunks)
  • engine/runtime/common/src/compress.rs (1 hunks)
  • engine/runtime/common/src/lib.rs (2 hunks)
  • engine/runtime/common/src/str.rs (1 hunks)
  • engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml (5 hunks)
🔇 Additional comments not posted (14)
engine/runtime/common/Cargo.toml (1)

37-37: LGTM! Verify workspace configuration.

The addition of zstd.workspace = true is correct and aligns with the PR objective of adding Zstandard compression support. The dependency is added in the correct alphabetical order, maintaining consistency with other dependencies.

To ensure the zstd crate is properly configured in the workspace, please run the following script:

✅ Verification successful

LGTM! Workspace configuration verified.

The zstd.workspace = true setting in engine/runtime/common/Cargo.toml correctly references the zstd dependency defined in the workspace ./engine/Cargo.toml.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify zstd configuration in the workspace Cargo.toml

# Test: Check if zstd is defined in the workspace Cargo.toml
rg --type toml 'zstd.*=.*' ./Cargo.toml

# Test: Ensure the workspace Cargo.toml exists
if [ ! -f "./Cargo.toml" ]; then
  echo "Workspace Cargo.toml not found at the expected location."
fi

Length of output: 247


Script:

#!/bin/bash
# Description: Locate workspace Cargo.toml and verify zstd configuration

# Find all Cargo.toml files excluding the one in engine/runtime/common/
find . -path "./engine/runtime/common/Cargo.toml" -prune -o -name "Cargo.toml" -print > cargo_files.txt

# Iterate through found Cargo.toml files to check for zstd dependency
while IFS= read -r file; do
  echo "Checking $file for zstd dependency..."
  rg 'zstd\s*=\s*.*' "$file"
done < cargo_files.txt

Length of output: 5303

engine/runtime/common/src/compress.rs (1)

1-37: Overall assessment: Good foundation with room for improvement.

The implementation provides a solid foundation for zstd compression and decompression functionality. However, there are several areas where the code could be enhanced:

  1. Add explicit import for the zstd crate.
  2. Parameterize the compression level in the compress function.
  3. Improve error messages to be more specific about the compression/decompression method.
  4. Use stricter UTF-8 validation in the decode function.
  5. Consider using string slices instead of taking ownership where possible.
  6. Expand test coverage to include edge cases and error scenarios.

Addressing these points will improve the robustness, flexibility, and maintainability of the code.
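
On point 6, a couple of round-trip and failure-path tests would be a cheap way to lock in the behaviour. A sketch, assuming the compress/decode signatures outlined earlier (test names and the sample payload are illustrative):

#[cfg(test)]
mod tests {
    use super::*;

    // Round trip: compressing and then decoding should return the original text.
    #[test]
    fn compress_then_decode_roundtrips() {
        let input = r#"{"type":"Polygon","coordinates":[[[0,0],[1,0],[1,1],[0,0]]]}"#;
        let encoded = compress(input).unwrap();
        assert_eq!(decode(encoded).unwrap(), input);
    }

    // Failure path: input that is not valid base64 should surface an error.
    #[test]
    fn decode_rejects_non_base64_input() {
        assert!(decode("not base64 at all!!").is_err());
    }
}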

engine/runtime/common/src/lib.rs (2)

67-69: LGTM! The compress method is well-implemented.

The new compress method in the Error implementation is consistent with other error creation methods. It correctly allows for creating a Compress error with a custom message, following Rust best practices.
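
A plausible shape for that constructor, mirroring the other error helpers (the exact generic bound is an assumption):

impl Error {
    /// Build a Compress error from any displayable message.
    pub fn compress<T: ToString>(message: T) -> Self {
        Self::Compress(message.to_string())
    }
}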


76-76: LGTM! Verify the implementation of the new compress module.

The addition of the compress module to the public modules list is consistent with the project structure and maintains the alphabetical order. This change aligns with the introduction of compression functionality mentioned in the PR objectives.

To ensure the new compress module is properly implemented, please run the following script:

This script will help verify:

  1. The existence of the compress.rs file.
  2. The presence of basic compression-related functions.
  3. The use of the zstd crate in the new module.
engine/runtime/action-processor/src/geometry/extractor.rs (1)

3-3: LGTM: Import change aligns with PR objective

The change from base64_encode to compress aligns well with the PR objective of adding zstd compression support. Moving the compression logic to a common module (reearth_flow_common::compress) is a good practice for code reusability.

engine/runtime/action-processor/src/geometry/replacer.rs (3)

Line range hint 1-124: Overall assessment: Changes align with PR objectives

The modifications to GeometryReplacer successfully implement zstd compression support for geometry attributes, aligning with the PR objectives. The overall structure and logic of the implementation remain sound.

Key points:

  1. The switch from base64 to zstd compression is correctly implemented.
  2. Error handling using the ? operator is concise but may benefit from custom error mapping.
  3. Minor improvements in code style, such as variable naming and avoiding unnecessary clones, could enhance readability and performance.

The changes are generally well-implemented, with room for small refinements as suggested in the previous comments.


102-103: Verify error handling compatibility

The use of the ? operator for error propagation is correct. However, it's important to ensure that the error types returned by decode and serde_json::from_str are compatible with the BoxedError return type of the process method.

To verify error handling compatibility and potentially improve it, please run the following script:

#!/bin/bash
# Description: Verify error handling in the GeometryReplacer process method

# Test 1: Check the return type of the decode function
rg --type rust -A 3 "pub fn decode" engine/runtime/common/src/compress.rs

# Test 2: Verify if there's any custom error handling or mapping for decode errors
rg --type rust -A 10 "fn process" engine/runtime/action-processor/src/geometry/replacer.rs

# Test 3: Check for any custom error types or error mapping in the module
rg --type rust "pub enum .*Error" engine/runtime/action-processor/src/geometry/

Consider adding custom error handling to map specific error types to your module's error type, improving error reporting and handling.
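
A hedged sketch of such mapping, pulled out into a helper; decode_geometry_attribute and GeometryProcessorError::Replacer are illustrative names, not taken from the module:

use reearth_flow_common::compress::decode;
use serde_json::Value;

// Hypothetical module-level error type, shown here only to keep the sketch
// self-contained.
#[derive(Debug)]
enum GeometryProcessorError {
    Replacer(String),
}

// Decode a zstd+base64 geometry attribute and parse it as JSON, mapping both
// failure modes onto the module error instead of bubbling up raw errors.
fn decode_geometry_attribute(compressed: &str) -> Result<Value, GeometryProcessorError> {
    let json = decode(compressed)
        .map_err(|e| GeometryProcessorError::Replacer(format!("failed to decompress geometry: {e}")))?;
    serde_json::from_str(&json)
        .map_err(|e| GeometryProcessorError::Replacer(format!("invalid geometry JSON: {e}")))
}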


3-3: Approve import change and verify decode function compatibility

The change from base64_decode to decode aligns with the PR objective of adding zstd compression support for geometry attributes. This modification appears to be correct.

To ensure the new decode function is compatible and properly implemented, please run the following verification script:

engine/Cargo.toml (1)

165-165: New dependency added: zstd

The addition of the zstd dependency aligns with the PR objective of introducing Zstandard compression support for geometry attributes. Here are some points to consider:

  1. The zstd crate is a well-maintained and popular Rust binding for the Zstandard compression library, which is appropriate for the stated goal.
  2. Version 0.13.0 is currently the latest stable release (as of September 2024), which is good practice.
  3. Adding this dependency may increase the binary size slightly and introduce a new external dependency to the project. Ensure that this aligns with the project's goals and constraints.
  4. The version specification style (using quotes) is consistent with other dependencies in this file.

To ensure this addition doesn't conflict with existing code or dependencies, run the following script:

engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml (5)

257-261: Improved clarity in GeometryExtractor node

The renaming of the node from "GeometryExtractor" to "GeometryExtractorCityGml" and the corresponding change in the outputAttribute provide better specificity about the type of geometry being extracted. This change enhances the overall clarity of the workflow.


311-312: Improved attribute naming in GeometryCoercerByLineString

The change from "overlap" to "lineOverlap" in the outputAttribute provides better specificity about the type of overlap being measured. This improvement in naming clarity is commendable.

Please verify that this change is properly reflected in any downstream nodes that may use this attribute. Run the following script to check for any remaining references to the old attribute name:

#!/bin/bash
# Description: Check for any remaining references to the old 'overlap' attribute

# Test: Search for any remaining 'overlap' references in the workflow file
rg --type yaml 'overlap' engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml

422-431: Updated workflow connections

The changes in the workflow connections appropriately reflect the addition of new nodes and the restructuring of the data flow. The new sequence, with 2D geometry extraction preceding line overlap analysis, appears logical.

Please verify the overall integrity of the workflow with these changes. Run the following script to visualize the updated workflow structure:

#!/bin/bash
# Description: Generate a simplified view of the workflow structure

# Test: Extract and display the node connections
echo "Workflow Structure:"
rg --type yaml -N '^      - id:' engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml | sed 's/^      - id: //' > /tmp/node_ids
rg --type yaml -N '        from:' engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml | sed 's/^        from: //' > /tmp/from_ids
rg --type yaml -N '        to:' engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml | sed 's/^        to: //' > /tmp/to_ids
paste /tmp/from_ids /tmp/to_ids | while read from to; do
  from_name=$(grep $from /tmp/node_ids | cut -d- -f1)
  to_name=$(grep $to /tmp/node_ids | cut -d- -f1)
  echo "$from_name -> $to_name"
done | sort | uniq

Ensure that the generated workflow structure aligns with the intended data processing pipeline and that there are no unintended breaks or cycles in the flow.

Also applies to: 444-451


279-285: New GeometryExtractor2D node added

The addition of the GeometryExtractor2D node enhances the workflow's capability to handle 2D geometry separately. This is a good improvement for more granular geometry processing.

Please ensure that this new node is properly integrated into the workflow. Run the following script to verify its connections:

✅ Verification successful

GeometryExtractor2D node integration verified

The GeometryExtractor2D node is properly connected within the workflow's edges. All necessary connections are established correctly.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the connections of the new GeometryExtractor2D node

# Test: Check if the new node is connected in the edges section
rg --type yaml -A 5 'from:.*357febd2-3f43-45aa-b9cf-a10e4e3b59aa' engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml
rg --type yaml -A 5 'to:.*357febd2-3f43-45aa-b9cf-a10e4e3b59aa' engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml

Length of output: 832


313-321: New FeatureFilterByLineOverlap node added

The addition of the FeatureFilterByLineOverlap node introduces a valuable filtering step based on line overlap. This can help identify specific geometric conditions in the data.

Please confirm the following:

  1. Is the filtering condition (lineOverlap > 1) appropriate for the workflow's objectives?
  2. Is the threshold value of 1 correct, or should it be adjusted?

Additionally, run the following script to check how this new filter affects the data flow:

✅ Verification successful

Please run the following script to successfully extract the 'from' and 'to' connections for the new FeatureFilterByLineOverlap node:


Please run the following script to retrieve all incoming and outgoing connections for the FeatureFilterByLineOverlap node:


FeatureFilterByLineOverlap node connections verified

The FeatureFilterByLineOverlap node no longer has a self-referential connection and now correctly connects to 69f22583-c0c8-4343-b83b-79842ad0d9ce. This ensures proper data flow without the risk of infinite loops.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check the impact of the new FeatureFilterByLineOverlap node

# Test: Check the connections and potential data flow through the new node
rg --type yaml -A 5 'from:.*6f4352f3-0397-4672-966d-67637a6c9b98' engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml
rg --type yaml -A 5 'to:.*6f4352f3-0397-4672-966d-67637a6c9b98' engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml

Length of output: 675


Script:

#!/bin/bash
# Description: Extract 'from' and 'to' connections for the new FeatureFilterByLineOverlap node

NODE_ID="6f4352f3-0397-4672-966d-67637a6c9b98"
WORKFLOW_FILE="engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml"

# Extract 'from' connections
rg --type yaml "from:.*$NODE_ID" "$WORKFLOW_FILE" -A 1 | awk -F": " '/from:/ {print $2}' > /tmp/from_ids

# Extract 'to' connections
rg --type yaml "to:.*$NODE_ID" "$WORKFLOW_FILE" -A 1 | awk -F": " '/to:/ {print $2}' > /tmp/to_ids

# Combine 'from' and 'to' connections
awk 'BEGIN {while ((getline < "/tmp/from_ids") > 0) from[++i] = $0}
     {to[++j] = $0}
     END {for (k=1; k<=i || k<=j; k++) print from[k] "\t" to[k]}' /tmp/from_ids /tmp/to_ids

Length of output: 862


Script:

#!/bin/bash
# Description: Retrieve all 'from' and 'to' connections for the FeatureFilterByLineOverlap node

NODE_ID="6f4352f3-0397-4672-966d-67637a6c9b98"
WORKFLOW_FILE="engine/runtime/examples/plateau/testdata/workflow/quality-check/03-tran-rwy-trk-squr-wwy/workflow.yml"

echo "Incoming Connections (from):"
rg --type yaml "to:.*$NODE_ID" "$WORKFLOW_FILE" -A 1 | awk -F": " '/from:/ {print $2}'

echo -e "\nOutgoing Connections (to):"
rg --type yaml "from:.*$NODE_ID" "$WORKFLOW_FILE" -A 1 | awk -F": " '/to:/ {print $2}'

Length of output: 741

Resolved review threads:
  • engine/runtime/common/src/compress.rs (2 threads)
  • engine/runtime/common/src/compress.rs (outdated)
  • engine/runtime/action-processor/src/geometry/replacer.rs (outdated)
  • engine/runtime/common/src/str.rs (outdated)
@miseyu miseyu merged commit f6c4b4a into main Sep 26, 2024
15 checks passed
@miseyu miseyu deleted the feature/modify-geometry-extractor branch September 26, 2024 03:38