
[ETHOSN] Throw error message when inference fails #13022

Merged 2 commits into apache:main on Nov 3, 2022

Conversation

lhutton1 (Contributor) commented:

Previously the runtime would silently skip inference failures and return random values as the result, which made spotting inference failures challenging. The runtime now throws a fatal error when inference does not complete successfully, along with an error message giving some details about the failure.

cc @Leo-arm @ashutosh-arm @leandron

Previously the runtime would silently skip inference failures and return
random values as the result. This can make spotting inference failures
challenging. The runtime now throws a fatal error when inference did not
complete successfully along with an error message that gives some
details about the error that occurred.

Change-Id: Iadb6da04ad1c906e3ec49959eb3da0978295aebf
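The change described in the commit message can be sketched in spirit as follows. The enum, function names, and message strings here are illustrative placeholders, not the actual TVM or Ethos-N driver library API; the real runtime surfaces the driver's own error details.

```cpp
#include <sstream>
#include <stdexcept>
#include <string>

// Illustrative stand-in for the driver-stack wait result; the real
// Ethos-N runtime uses the driver library's own status type.
enum class WaitStatus { kSuccess, kError, kTimeout, kRunning };

std::string DescribeStatus(WaitStatus s) {
  switch (s) {
    case WaitStatus::kSuccess: return "inference completed successfully";
    case WaitStatus::kError:   return "driver reported an error";
    case WaitStatus::kTimeout: return "inference timed out";
    case WaitStatus::kRunning: return "inference still running";
  }
  return "unknown status";
}

// Previously a failed wait was effectively ignored and whatever was in
// the output buffers was returned; here any non-success status raises
// an error carrying a human-readable description instead.
void CheckInferenceResult(WaitStatus status) {
  if (status != WaitStatus::kSuccess) {
    std::ostringstream msg;
    msg << "Ethos-N inference failed: " << DescribeStatus(status);
    throw std::runtime_error(msg.str());
  }
}
```

The key design point is that the failure is no longer recoverable-by-accident: callers either get valid outputs or an exception with a diagnostic message.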
@ashutosh-arm (Contributor) left a comment:
Thanks Luke. Indeed a great change. A nit, a question and a suggestion below 😅

Review threads (resolved):
* src/runtime/contrib/ethosn/ethosn_device.cc
* tests/cpp/runtime/contrib/ethosn/inference_test.cc (2 threads)
Address comments:

* clarify test file brief
* add test case for running status
* add driver stack reference to WaitStatus class

Change-Id: I792742892b761534904816135ae2ffcb3f028b2c
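The review items above ask for a test case covering the "running" status and a driver-stack reference on the WaitStatus class. A minimal sketch of that status-to-message mapping, with illustrative names rather than the real driver library types used by inference_test.cc:

```cpp
#include <string>

// Hypothetical mirror of the statuses the Ethos-N driver stack can
// report while an inference is in flight; names are illustrative,
// not the real driver library API.
enum class InferenceStatus { kSuccess, kError, kRunning, kTimeout };

// Map each status to the message a failed or incomplete inference
// would carry, so the "running" case is reported rather than ignored.
std::string StatusMessage(InferenceStatus s) {
  switch (s) {
    case InferenceStatus::kSuccess: return "success";
    case InferenceStatus::kRunning: return "inference still running";
    case InferenceStatus::kTimeout: return "inference timed out";
    case InferenceStatus::kError:   return "driver error";
  }
  return "unknown";
}
```

A unit test along these lines would assert on each status, including the in-flight "running" case the reviewer asked to cover.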
@lhutton1 (Contributor, Author) commented:

@tvm-bot rerun


tvm-bot commented Oct 14, 2022

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

Generated by tvm-bot

@areusch added and then removed the needs-triage label ("PRs or issues that need to be investigated by maintainers to find the right assignees to address it") on Oct 19, 2022
@ashutosh-arm (Contributor) left a comment:
@lhutton1 LGTM! Thanks for extending the test coverage 👍

@Leo-arm (Contributor) left a comment:
LGTM

@leandron (Contributor) left a comment:
LGTM, thanks @lhutton1 @Leo-arm

@leandron leandron merged commit 47da418 into apache:main Nov 3, 2022
xinetzone pushed a commit to daobook/tvm that referenced this pull request Nov 10, 2022
xinetzone pushed a commit to daobook/tvm that referenced this pull request Nov 25, 2022
@lhutton1 lhutton1 deleted the runtime-error-msg branch February 10, 2023 10:28
6 participants