
Update loss.py --refactored fluid_softmax_with_cross_entropy #56686

Merged (2 commits) on Aug 28, 2023

Conversation

@zeus2x7 (Contributor) commented Aug 26, 2023

PR types

Function optimization

PR changes

OPs

Description

Refactored the fluid_softmax_with_cross_entropy loss function.
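
For context, here is a minimal NumPy sketch of what a softmax-with-cross-entropy loss computes; the helper name and example values are illustrative assumptions, not the code actually changed in this PR:

```python
# Illustrative only: a tiny sketch of the softmax-with-cross-entropy computation.
# The helper name and values are hypothetical, not this PR's diff.
import numpy as np

def softmax_cross_entropy(logits, labels):
    """Per-sample cross-entropy over softmax(logits) for integer class labels."""
    shifted = logits - logits.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))  # log-softmax
    return -log_probs[np.arange(labels.shape[0]), labels]  # negative log-prob of the true class

logits = np.array([[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]])
labels = np.array([0, 1])
print(softmax_cross_entropy(logits, labels))  # per-sample loss values
```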

@paddle-bot (bot) commented Aug 26, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@CLAassistant commented Aug 26, 2023

CLA assistant check
All committers have signed the CLA.


@paddle-bot added the contributor (External developers) label on Aug 26, 2023
@zeus2x7 (Contributor, Author) commented Aug 27, 2023

@CLAassistant @PaddleCI @PaddlePM Can you please assign someone to review this PR?

@luotao1 merged commit 3568a99 into PaddlePaddle:develop on Aug 28, 2023
BeingGod pushed a commit to BeingGod/Paddle that referenced this pull request on Sep 9, 2023:
…addle#56686)

* Update loss.py --refactored fluid_softmax_with_cross_entropy
* Update loss.py
Labels: contributor (External developers)
Projects: None yet
3 participants