tf: add fparam/aparam support for finetune (#3313)
Fix #3256.

Signed-off-by: Jinzhe Zeng <jinzhe.zeng@rutgers.edu>
(cherry picked from commit d629616)
njzjz committed Apr 6, 2024
1 parent 0341466 · commit 36a93f2
Showing 1 changed file with 16 additions and 1 deletion.
deepmd/fit/ener.py (16 additions, 1 deletion):
```diff
@@ -850,7 +850,22 @@ def change_energy_bias(
                 box = test_data["box"][:numb_test]
             else:
                 box = None
-            ret = dp.eval(coord, box, atype, mixed_type=mixed_type)
+            if dp.get_dim_fparam() > 0:
+                fparam = test_data["fparam"][:numb_test]
+            else:
+                fparam = None
+            if dp.get_dim_aparam() > 0:
+                aparam = test_data["aparam"][:numb_test]
+            else:
+                aparam = None
+            ret = dp.eval(
+                coord,
+                box,
+                atype,
+                mixed_type=mixed_type,
+                fparam=fparam,
+                aparam=aparam,
+            )
             energy_predict.append(ret[0].reshape([numb_test, 1]))
         type_numbs = np.concatenate(type_numbs)
         energy_ground_truth = np.concatenate(energy_ground_truth)
```
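For context, the pattern added here only forwards frame parameters (fparam) and atomic parameters (aparam) to the evaluator when the model actually declares those dimensions; otherwise it passes None so the call behaves exactly as before. Below is a minimal sketch of that idea as a standalone helper. The function name and signature are illustrative, not repository code; it assumes `dp` is a loaded DeepPotential-style evaluator exposing `get_dim_fparam()`, `get_dim_aparam()`, and `eval()`, and that `test_data` is a dict of test arrays as used in `change_energy_bias`.

```python
def eval_energy(dp, test_data, coord, box, atype, numb_test, mixed_type=False):
    # Slice fparam/aparam from the test data only if the model was
    # trained with them; otherwise hand None to dp.eval().
    fparam = test_data["fparam"][:numb_test] if dp.get_dim_fparam() > 0 else None
    aparam = test_data["aparam"][:numb_test] if dp.get_dim_aparam() > 0 else None

    ret = dp.eval(
        coord,
        box,
        atype,
        mixed_type=mixed_type,
        fparam=fparam,
        aparam=aparam,
    )
    # ret[0] holds the predicted energies; reshape to one value per frame.
    return ret[0].reshape([numb_test, 1])
```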
