
Incremental training for Deep Learning and Wapiti models #971

Merged: 7 commits from incremental-training into master on Nov 25, 2022

Conversation

@kermitt2 (Owner) commented on Nov 24, 2022

Support incremental training for Deep Learning models.

We simply add -i to the training command; training then resumes from the existing model instead of starting from scratch with a new one.

> java -Xmx1024m -jar grobid-trainer/build/libs/grobid-trainer-<current version>-onejar.jar 0 <name of the model> -gH grobid-home -i

See kermitt2/delft#147

This also works for CRF Wapiti models.
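
As a concrete usage sketch: the model name below (header) is only an illustration, substitute whichever Grobid model you want to continue training, and keep <current version> as the placeholder used above.

> # illustrative: resume (incremental) training of the "header" model; the model name is an assumption
> java -Xmx1024m -jar grobid-trainer/build/libs/grobid-trainer-<current version>-onejar.jar 0 header -gH grobid-home -i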

@kermitt2 changed the title from "Incremental training for DL models" to "Incremental training for Deep Learning and Wapiti models" on Nov 24, 2022
@coveralls commented on Nov 24, 2022


Coverage decreased (-0.08%) to 39.586% when pulling fc701f7 on incremental-training into b9d92c6 on master.

@kermitt2 merged commit 2bfadfd into master on Nov 25, 2022