Conversation

Contributor

@ooooo-create ooooo-create commented Aug 19, 2025

PR Category

Feature Enhancement

Description

Provide a way to bypass vmap and autocast during graph extraction.
https://github.com/huggingface/transformers/blob/6b5bd117231f969713ed79fd4870903ab3c93edf/docs/source/en/attention_interface.md?plain=1#L194-L195 The transformers library already provides an sdpa_mask_without_vmap (from what I saw, the vmap-based version was only added this May, apparently in connection with flex attention, probably for performance reasons). It exists for model export; although that targets torch.export, that makes no difference for extracting a computation graph, so I reused it here.
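For context, a minimal sketch of the vmap-free route. The import paths and the registration call follow the pattern in the linked attention_interface.md; the registered name "sdpa_without_vmap" is only illustrative:

```python
# Sketch: make mask construction use transformers' vmap-free SDPA mask so
# tracing never has to step through torch.vmap internals.
from transformers import AttentionMaskInterface
from transformers.masking_utils import sdpa_mask_without_vmap

# Register the vmap-free variant under a custom name; transformers resolves
# mask functions by name against the model's configured attention
# implementation (selection mechanism assumed, see the linked docs).
AttentionMaskInterface.register("sdpa_without_vmap", sdpa_mask_without_vmap)
```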
autocast breaks tracing when we trace again over the FxGraph produced by the first trace; since autocast does not affect the extracted computation graph either, it is simply replaced with an empty context manager.
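The replacement can be as simple as monkey-patching autocast to contextlib.nullcontext for the duration of the trace. A minimal sketch of the idea; the patch target torch.autocast and the toy model are illustrative (the real code must patch whichever autocast alias the traced program actually calls):

```python
# Sketch: swap autocast for an empty context manager while retracing.
# autocast only toggles dtype state and adds no structure to the extracted
# graph, so dropping it is safe for graph extraction.
import contextlib
from unittest import mock

import torch
import torch.fx as fx


class ToyModel(torch.nn.Module):
    def forward(self, x):
        with torch.autocast("cpu"):  # would otherwise re-enter autocast state
            return torch.matmul(x, x)


# Patch torch.autocast to a no-op context manager only while tracing.
with mock.patch.object(torch, "autocast",
                       lambda *args, **kwargs: contextlib.nullcontext()):
    gm = fx.symbolic_trace(ToyModel())

print(gm.code)  # the produced code contains no autocast
```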

Important

The model is only there for CI verification; there is no need to merge my model into the repo, this PR just contributes a script.

@paddle-bot

paddle-bot bot commented Aug 19, 2025

Thanks for your contribution!

@paddle-bot paddle-bot bot added the contributor External developers label Aug 19, 2025

Labels

contributor External developers · HappyOpenSource (Happy Open Source activity issues and PRs)
