
Fixed gelu, added layernorm. Added timvx version gelu and layernorm #1415

Merged

4 commits merged into OAID:tengine-lite on May 18, 2023

Conversation

shijie-nv (Contributor)

Fixed parts that were missing when the GELU operator was added in a previous version.
Added an implementation of the LayerNorm operator.

Added TIM-VX implementations of GELU and LayerNorm.

In my tests, the CPU and NPU inference results match and are correct.
Some networks that require these two operators can now run entirely on the NPU.

gelu_NPU.txt
gelu_uint8_CPU.txt
layernorm_NPU.txt
layernorm_uint8_CPU.txt

@shijie-nv shijie-nv changed the title Feat 001 Fixed gelu, added layernorm. Added timvx version gelu and layernorm May 17, 2023
BUG1989 (Contributor) left a comment


lgtm!

@BUG1989 BUG1989 merged commit c73708c into OAID:tengine-lite May 18, 2023