[onert-micro] Introduce OMTrainingStorage and OMTrainingHandler #13146

Merged

Conversation

BalyshevArtem (Contributor):

This PR introduces the OMTrainingStorage and OMTrainingHandler entities.

For issue: #12873
From draft: #13107

ONE-DCO-1.0-Signed-off-by: Artem Balyshev a.balyshev@samsung.com
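
From the diff discussed below, the split is roughly: OMTrainingHandler drives weight updates and owns an OMTrainingStorage, which holds the per-optimizer state exposed via getSGD/getAdam. A speculative outline follows; only getSGD, getAdam, _training_storage, updateWeights, and reset are visible in the PR, and every other name is a stand-in.

// Speculative outline inferred from the diff below; all names other than
// those visible in the PR snippet are hypothetical stand-ins.
enum OMStatus { Ok, UnknownError, UnsupportedType };

struct SGDOptimizer  { /* SGD state, e.g. accumulated gradients */ };
struct AdamOptimizer { /* Adam state, e.g. moment estimates */ };

// Owns per-optimizer training state. Here the getters always return a valid
// pointer; in the real code they may return nullptr when that optimizer was
// not configured, hence the null checks in the diff below.
class OMTrainingStorage
{
public:
  SGDOptimizer *getSGD() { return &_sgd; }
  AdamOptimizer *getAdam() { return &_adam; }

private:
  SGDOptimizer _sgd;
  AdamOptimizer _adam;
};

// Drives training steps and dispatches weight updates to the optimizer
// selected in the config (see the switch in the diff below).
class OMTrainingHandler
{
private:
  OMTrainingStorage _training_storage;
};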

Comment on lines +109 to +145
switch (config.training_context.optimizer)
{
  case SGD:
  {
    auto *sgd_optimizer = _training_storage.getSGD();
    // Catch a missing optimizer in debug builds; in release builds fall
    // back to an error code instead of dereferencing a null pointer.
    assert(sgd_optimizer != nullptr);
    if (sgd_optimizer == nullptr)
      return UnknownError;

    status = sgd_optimizer->updateWeights(config.training_context, context);
    assert(status == Ok);
    // Reset accumulated values before the next training step
    sgd_optimizer->reset();
    break;
  }
  case ADAM:
  {
    auto *adam_optimizer = _training_storage.getAdam();
    assert(adam_optimizer != nullptr);
    if (adam_optimizer == nullptr)
      return UnknownError;

    status = adam_optimizer->updateWeights(config.training_context, context);
    assert(status == Ok);
    // Reset accumulated values before the next training step
    adam_optimizer->reset();
    break;
  }
  default:
  {
    assert(false && "Unsupported type");
    return UnsupportedType;
  }
}

return status;
}
ljwoo94 (Contributor):
If we make a map like
{ config.training_context.optimizer : std::unique_ptr... }
I think we can remove all these switch cases and code dependencies like getSGD, getAdam, ... for later extensibility.

Does it have any risks or costs?
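
A minimal, self-contained sketch of this map-based dispatch. All names here (OMStatus, OptimizerType, Optimizer, SGDOptimizer, AdamOptimizer) are stand-ins rather than the actual onert-micro types, and, as the reply below points out, the map's value type presumes a common Optimizer interface:

#include <memory>
#include <unordered_map>

// Hypothetical stand-ins, not onert-micro's real types.
enum OMStatus { Ok, UnsupportedType };
enum OptimizerType { SGD_TYPE, ADAM_TYPE };

// The map needs one common interface, which means virtual dispatch.
struct Optimizer
{
  virtual ~Optimizer() = default;
  virtual OMStatus updateWeights() = 0;
  virtual void reset() = 0;
};

struct SGDOptimizer : Optimizer
{
  OMStatus updateWeights() override { return Ok; } // apply gradients
  void reset() override {}                         // clear accumulated state
};

struct AdamOptimizer : Optimizer
{
  OMStatus updateWeights() override { return Ok; } // apply gradients with moments
  void reset() override {}                         // clear moment buffers
};

// The whole switch collapses into a single lookup.
OMStatus updateWeights(OptimizerType type,
                       std::unordered_map<OptimizerType, std::unique_ptr<Optimizer>> &optimizers)
{
  auto it = optimizers.find(type);
  if (it == optimizers.end())
    return UnsupportedType; // replaces the switch's default branch

  OMStatus status = it->second->updateWeights();
  it->second->reset();
  return status;
}

Registering an optimizer then becomes optimizers[SGD_TYPE] = std::make_unique<SGDOptimizer>(); the trade-off, as discussed below, is the vtable this common interface introduces.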

BalyshevArtem (Contributor, Author):

Yes, I think it is a good idea, thank you. I will try it now and push the changes :)

BalyshevArtem (Contributor, Author):

@ljwoo94
Ah, to store it in a std::unordered_map<optimizer_type, std::unique_ptr<Optimizer>> I would need a common Optimizer base type and the virtual mechanism to call the right functions. But in onert-micro we disable the virtual mechanism in order to save binary size.
So I think such a switch case is the most readable way to use different optimizers :(
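
To illustrate the constraint with hypothetical stand-in types: with the virtual mechanism compiled out, the SGD and Adam optimizers stay unrelated concrete classes, so there is no common value type for such a map to hold:

// Hypothetical stand-ins: two unrelated concrete optimizers, no base class.
struct SGDOptimizer  { void updateWeights() {} void reset() {} };
struct AdamOptimizer { void updateWeights() {} void reset() {} };

// Will not compile: there is no common Optimizer type to erase both to.
// std::unordered_map<OptimizerType, std::unique_ptr<Optimizer>> optimizers;

// A common base with virtual functions would fix this, but its vtable is
// exactly the binary-size overhead that onert-micro opts out of, so the
// explicit switch over concrete types remains.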
Or do you have some ideas?

ljwoo94 (Contributor), Jun 17, 2024:

> Or do you have some ideas?

I guess not :'(

Approved PR. LGTM :)

Torrero (Contributor) left a comment:

LGTM

@chunseoklee chunseoklee requested a review from ljwoo94 June 17, 2024 01:04
@BalyshevArtem BalyshevArtem merged commit 6482288 into Samsung:master Jun 17, 2024
4 checks passed