
fix the missing modifier issue of dp compress #2591

Merged 1 commit into deepmodeling:devel on Jun 6, 2023

Conversation

Yi-FanLi
Collaborator

@Yi-FanLi Yi-FanLi commented Jun 6, 2023

The current model compression feature misses the data modifier when it is used to compress a DPLR model. This PR builds the modifier during the compression process if the uncompressed model contains one.
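The fix can be sketched as follows. This is an illustrative helper, not the actual DeePMD-kit implementation: the function name `carry_over_modifier` and the example dicts are hypothetical, though the `"modifier"` / `"dipole_charge"` keys mirror the DeePMD-kit input JSON used by DPLR models.

```python
# Hypothetical sketch of the fix: when building the compressed model's
# configuration, carry over the "modifier" section (e.g. the dipole charge
# modifier used by DPLR) from the uncompressed model if one is present.

def carry_over_modifier(uncompressed_model: dict, compressed_model: dict) -> dict:
    """Return the compressed-model config with the modifier preserved."""
    merged = dict(compressed_model)
    if "modifier" in uncompressed_model:
        # Without this step the compressed model silently loses the
        # modifier, which is the bug this PR fixes.
        merged["modifier"] = uncompressed_model["modifier"]
    return merged

# Example configs (hypothetical contents, DeePMD-kit-style keys):
uncompressed = {
    "descriptor": {"type": "se_e2_a"},
    "modifier": {"type": "dipole_charge", "model_name": "dw.pb"},
}
compressed = {"descriptor": {"type": "se_e2_a", "compress": True}}

result = carry_over_modifier(uncompressed, compressed)
# result now contains both the compressed descriptor and the modifier.
```

The input dicts are not mutated; the helper returns a shallow copy with the modifier section attached, so a compressed DPLR model keeps its dipole charge modifier.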

@github-actions github-actions bot added the Python label Jun 6, 2023

codecov bot commented Jun 6, 2023

Codecov Report

Patch coverage: 66.66% and no project coverage change.

Comparison is base (77596f8) 76.66% compared to head (22de18b) 76.66%.

Additional details and impacted files
@@           Coverage Diff           @@
##            devel    #2591   +/-   ##
=======================================
  Coverage   76.66%   76.66%           
=======================================
  Files         233      233           
  Lines       24175    24176    +1     
  Branches     1697     1711   +14     
=======================================
+ Hits        18533    18535    +2     
+ Misses       4519     4518    -1     
  Partials     1123     1123           
Impacted Files                 | Coverage Δ
deepmd/entrypoints/train.py    | 86.69% <50.00%> (-0.34%) ⬇️
source/lmp/fix_dplr.cpp        | 78.00% <75.00%> (+0.25%) ⬆️

... and 1 file with indirect coverage changes


@wanghan-iapcm wanghan-iapcm merged commit 6c439c7 into deepmodeling:devel Jun 6, 2023
3 participants