Fix GELU & sigmoid activation precision #214

Merged

rouson merged 2 commits into develop from fix-activation-precision on Oct 10, 2024
Conversation

rouson (Contributor) commented on Oct 10, 2024

This PR fixes several compile-time constants so that the double-precision versions of the GELU and sigmoid activation functions use double precision throughout their evaluation.
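The diff itself is not shown in this conversation, but the class of bug being fixed is a familiar one in Fortran (the assumed implementation language here): an unsuffixed literal such as `0.5` or `0.044715` has default (single) precision, so a nominally double-precision activation function that mixes in such constants is silently limited to roughly seven significant digits. The sketch below is illustrative only; the names `dp`, `gelu_mixed`, `gelu_fixed`, and `sigmoid_fixed` are assumptions, not the repository's actual identifiers.

```fortran
program activation_precision_demo
  !! Illustrative sketch (assumed names), not the repository's code:
  !! contrasts a GELU evaluation that mixes in default-precision literals
  !! with one whose constants all carry the double-precision kind suffix.
  implicit none
  integer, parameter :: dp = kind(1.d0)
  real(dp) :: x

  x = 2.5_dp
  print *, "GELU with single-precision constants:", gelu_mixed(x)
  print *, "GELU with double-precision constants:", gelu_fixed(x)
  print *, "sigmoid with double-precision constants:", sigmoid_fixed(x)

contains

  pure function gelu_mixed(x) result(y)
    !! Buggy variant: 0.5, 1., 2., and 0.044715 are default-precision
    !! literals, so sqrt(2/pi) and the cubic coefficient carry ~7 digits.
    real(dp), intent(in) :: x
    real(dp) :: y
    y = 0.5*x*(1. + tanh(sqrt(2./acos(-1.))*(x + 0.044715*x**3)))
  end function gelu_mixed

  pure function gelu_fixed(x) result(y)
    !! Fixed variant: every constant carries the dp kind suffix.
    real(dp), intent(in) :: x
    real(dp) :: y
    y = 0.5_dp*x*(1._dp + tanh(sqrt(2._dp/acos(-1._dp))*(x + 0.044715_dp*x**3)))
  end function gelu_fixed

  pure function sigmoid_fixed(x) result(y)
    !! Sigmoid with kinded constants throughout.
    real(dp), intent(in) :: x
    real(dp) :: y
    y = 1._dp/(1._dp + exp(-x))
  end function sigmoid_fixed

end program activation_precision_demo
```

With the constants kinded consistently, the double-precision path agrees with a reference evaluation to machine precision; the mixed variant typically diverges around the seventh significant digit.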

rouson changed the title from "Fix gelu & sigmoid activation precision" to "Fix GELU & sigmoid activation precision" on Oct 10, 2024
rouson merged commit 8f686f6 into develop on Oct 10, 2024
4 checks passed
rouson deleted the fix-activation-precision branch on October 10, 2024 at 03:43