[onert-micro] Introduce OMTrainingStorage and OMTrainingHandler #13146
Conversation
This PR introduces OMTrainingStorage and OMTrainingHandler entities.

for issue #12873
from draft: #13107

ONE-DCO-1.0-Signed-off-by: Artem Balyshev <a.balyshev@samsung.com>
  switch (config.training_context.optimizer)
  {
    case SGD:
    {
      auto *sgd_optimizer = _training_storage.getSGD();
      assert(sgd_optimizer != nullptr);
      if (sgd_optimizer == nullptr)
        return UnknownError;

      status = sgd_optimizer->updateWeights(config.training_context, context);
      assert(status == Ok);
      // Reset values
      sgd_optimizer->reset();
      break;
    }
    case ADAM:
    {
      auto *adam_optimizer = _training_storage.getAdam();
      assert(adam_optimizer != nullptr);
      if (adam_optimizer == nullptr)
        return UnknownError;

      status = adam_optimizer->updateWeights(config.training_context, context);
      assert(status == Ok);
      // Reset values
      adam_optimizer->reset();
      break;
    }
    default:
    {
      assert(false && "Unsupported type");
      return UnsupportedType;
    }
  }

  return status;
}
If we make a map like
{ config.training_context.optimizer : std::unique_ptr... }
I think we can remove all these switch cases and code dependencies like getSGD, getAdam, ... for later extensibility.
Does it have any risks or costs?
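As an illustration of this suggestion (not the actual onert-micro code), a rough sketch of map-based dispatch is below. Status, Context, OptimizerType and the Optimizer interface are placeholder names; the key detail is that a single map value type needs a common base class.

#include <memory>
#include <unordered_map>

enum class OptimizerType { SGD, ADAM };
enum class Status { Ok, UnknownError, UnsupportedType };
struct Context {};

// A common interface is needed so every optimizer fits one map value type;
// this is exactly the virtual dispatch discussed below.
struct Optimizer
{
  virtual Status updateWeights(Context &ctx) = 0;
  virtual void reset() = 0;
  virtual ~Optimizer() = default;
};

struct SGDOptimizer final : Optimizer
{
  Status updateWeights(Context &) override { return Status::Ok; } // placeholder body
  void reset() override {}
};

struct AdamOptimizer final : Optimizer
{
  Status updateWeights(Context &) override { return Status::Ok; } // placeholder body
  void reset() override {}
};

// Map-based dispatch: no switch and no getSGD()/getAdam() accessors.
Status updateWeights(std::unordered_map<OptimizerType, std::unique_ptr<Optimizer>> &optimizers,
                     OptimizerType type, Context &ctx)
{
  auto it = optimizers.find(type);
  if (it == optimizers.end())
    return Status::UnsupportedType;

  Status status = it->second->updateWeights(ctx);
  it->second->reset(); // reset accumulated values, as the switch version does
  return status;
}

With such a layout, supporting a new optimizer would only mean registering another map entry, which is the extensibility benefit raised here.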
Yes, I think it is a good idea, thank you. I will try it now and push these changes :)
@ljwoo94
Ah, to store it in std::unordered_map<optimizer_type, std::unique_ptr<Optimizer>> I need a common Optimizer type and the virtual mechanism to call the right functions. But in onert-micro we disable the virtual mechanism in order to save binary size.
So I think such a switch case is the most readable way to use different optimizers :(
Or do you have some ideas?
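For context, a minimal sketch of the non-virtual layout this refers to (with simplified placeholder declarations, not the real onert-micro classes) could look like the following; because SGD and Adam share no base class, there is no single pointer type a map could store, so the caller has to branch in the switch shown above.

class SGD
{
public:
  void updateWeights() {} // placeholder signature
  void reset() {}
};

class Adam
{
public:
  void updateWeights() {} // placeholder signature
  void reset() {}
};

// The storage keeps each optimizer as its own concrete type. With no shared
// Optimizer interface there is no vtable, all calls resolve statically, and
// dispatch on the configured optimizer kind happens in the switch statement.
class OMTrainingStorage
{
public:
  SGD *getSGD() { return &_sgd; }
  Adam *getAdam() { return &_adam; }

private:
  SGD _sgd;
  Adam _adam;
};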
Or do you have some ideas?
I guess not :'(
Approved PR. LGTM :)
LGTM