nydusify: introduce chunkdict generate subcommand #1572
Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files:

```
@@            Coverage Diff             @@
##           master    #1572      +/-   ##
==========================================
+ Coverage   61.23%   61.42%   +0.18%
==========================================
  Files         144      145       +1
  Lines       47100    47984     +884
  Branches    44602    45960    +1358
==========================================
+ Hits        28843    29474     +631
- Misses      16778    16951     +173
- Partials     1479     1559      +80
```
Thanks for the PR, let's add SOB (Signed-off-by) for all commits to pass https://github.com/dragonflyoss/nydus/pull/1572/checks?check_run_id=24019241046.
LGTM, thanks for the work!
LGTM!
@cslinwang The broken CI needs to be fixed: https://github.com/dragonflyoss/nydus/actions/runs/9747241399/job/26899471236?pr=1572
Signed-off-by: Lin Wang <l.wang@mail.dlut.edu.cn>
Brief introduction
Add functionality to generate a chunk dictionary from a database file using the "exponential_smoothing" algorithm.
Implement the `nydus-image chunkdict generate --database` subcommand.
Basic Usage
nydus-image chunkdict generate --database ./metadata.db
Details
- Add the `nydus-image chunkdict generate` command entry
- Call the database interface to fetch chunk information from the database
- Implement the exponential smoothing algorithm to obtain the chunk dictionary
- To be continued: dump and save the chunk dictionary, then compact chunks against it
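The selection step above can be sketched as follows. This is a minimal, hypothetical illustration of exponential smoothing applied to chunk selection, not the PR's actual implementation: the function names (`smoothed_score`, `select_chunks`), the presence-per-image representation, and the `alpha`/`threshold` values are all assumptions for the example. The idea is that each chunk's recent occurrence across image versions is smoothed by `s_t = alpha * x_t + (1 - alpha) * s_{t-1}`, and chunks whose score stays high are worth keeping in the shared dictionary.

```rust
use std::collections::HashMap;

/// Exponentially smooth a chunk's presence signal (1.0 = present in
/// that image version, 0.0 = absent) and return the final score.
fn smoothed_score(presence: &[f64], alpha: f64) -> f64 {
    let mut s = presence.first().copied().unwrap_or(0.0);
    for &x in presence.iter().skip(1) {
        s = alpha * x + (1.0 - alpha) * s;
    }
    s
}

/// Keep the chunks whose smoothed score meets the threshold.
/// `history` maps a chunk digest to its presence per image version.
fn select_chunks(
    history: &HashMap<String, Vec<f64>>,
    alpha: f64,
    threshold: f64,
) -> Vec<String> {
    let mut picked: Vec<String> = history
        .iter()
        .filter(|(_, p)| smoothed_score(p, alpha) >= threshold)
        .map(|(digest, _)| digest.clone())
        .collect();
    picked.sort();
    picked
}

fn main() {
    let mut history = HashMap::new();
    // A chunk present in every recent image keeps a high score.
    history.insert("chunk-a".to_string(), vec![1.0, 1.0, 1.0, 1.0]);
    // A chunk that disappeared early decays toward zero
    // (1.0 -> 0.5 -> 0.25 -> 0.125 with alpha = 0.5).
    history.insert("chunk-b".to_string(), vec![1.0, 0.0, 0.0, 0.0]);
    let picked = select_chunks(&history, 0.5, 0.5);
    println!("{:?}", picked); // ["chunk-a"]
}
```

With this shape of scoring, chunks shared across many recent images survive the threshold, while chunks unique to old images decay out, which matches the goal of a compact dictionary for deduplication.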