
Commit dd12e14
fix (#6796)
xuxinyi389 authored Jul 29, 2024
1 parent 841c177 commit dd12e14
Showing 1 changed file with 1 addition and 1 deletion.
@@ -459,7 +459,7 @@
| No. | Latest PyTorch release | Paddle develop | Mapping category | Notes |
| ----- | ----------- | ----------------- | ----------- | ------- |
|MANUAL_MAINTAINING-ITEM(`flash_attn.flash_attn_interface.flash_attn_func`,https://github.com/Dao-AILab/flash-attention/blob/72e27c6320555a37a83338178caa25a388e46121/flash_attn/flash_attn_interface.py#L808, `paddle.nn.functional.flash_attention.flash_attention`, https://github.com/PaddlePaddle/Paddle/blob/900d27c40ef4567d7ea6342f3f0eedd394885ecb/python/paddle/nn/functional/flash_attention.py#L248, torch 参数更多 , https://github.com/PaddlePaddle/docs/tree/develop/docs/guides/model_convert/convert_from_pytorch/api_difference_third_party/flash_attn/flash_attn.flash_attn_interface.flash_attn_func.md) |
- |MANUAL_MAINTAINING-ITEM(`flash_attn.flash_attn_interface.flash_attn_unpadded_func`,https://github.com/Dao-AILab/flash-attention/blob/d0787acc16c3667156b51ce5b01bdafc7594ed39/flash_attn/flash_attn_interface.py#L1050 `paddle.nn.functional.flash_attention.flash_attn_unpadded`, https://github.com/PaddlePaddle/Paddle/blob/b32b51b7c21ad62bf794512c849a603c8c0ece44/python/paddle/nn/functional/flash_attention.py#L664, torch 参数更多 , https://github.com/PaddlePaddle/docs/tree/develop/docs/guides/model_convert/convert_from_pytorch/api_difference_third_party/flash_attn/flash_attn.flash_attn_interface.flash_attn_unpadded_func.md) |
+ |MANUAL_MAINTAINING-ITEM(`flash_attn.flash_attn_interface.flash_attn_unpadded_func`,https://github.com/Dao-AILab/flash-attention/blob/d0787acc16c3667156b51ce5b01bdafc7594ed39/flash_attn/flash_attn_interface.py#L1050, `paddle.nn.functional.flash_attention.flash_attn_unpadded`, https://github.com/PaddlePaddle/Paddle/blob/b32b51b7c21ad62bf794512c849a603c8c0ece44/python/paddle/nn/functional/flash_attention.py#L664, torch 参数更多 , https://github.com/PaddlePaddle/docs/tree/develop/docs/guides/model_convert/convert_from_pytorch/api_difference_third_party/flash_attn/flash_attn.flash_attn_interface.flash_attn_unpadded_func.md) |
|MANUAL_MAINTAINING-ITEM(`flash_attn.layers.rotary.apply_rotary_emb_func`,https://github.com/Dao-AILab/flash-attention/blob/d0787acc16c3667156b51ce5b01bdafc7594ed39/flash_attn/layers/rotary.py#L94, ` `, , 组合替代实现 , https://github.com/PaddlePaddle/docs/tree/develop/docs/guides/model_convert/convert_from_pytorch/api_difference_third_party/flash_attn/flash_attn.layers.rotary.apply_rotary_emb_func.md) |
|MANUAL_MAINTAINING-ITEM(`flash_attn.ops.rms_norm.rms_norm`,https://github.com/Dao-AILab/flash-attention/blob/d0787acc16c3667156b51ce5b01bdafc7594ed39/flash_attn/ops/rms_norm.py#L14, `paddle.incubate.nn.functional.fused_rms_norm`, https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/incubate/nn/functional/fused_rms_norm_cn.html, 仅 paddle 参数更多 , https://github.com/PaddlePaddle/docs/tree/develop/docs/guides/model_convert/convert_from_pytorch/api_difference_third_party/flash_attn/flash_attn.ops.rms_norm.rms_norm.md) |
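Each table row above records one flash_attn → Paddle API mapping plus a category ("torch 参数更多" = torch has more parameters, "组合替代实现" = composite replacement, "仅 paddle 参数更多" = only paddle has more parameters). As a minimal sketch of the data each row carries (the class and field names here are hypothetical; the real docs tooling parses the MANUAL_MAINTAINING-ITEM entries directly):

```python
# Hypothetical in-memory form of the mapping rows above.
# The real PaddlePaddle docs build parses MANUAL_MAINTAINING-ITEM(...)
# entries; this dataclass only illustrates the fields each row encodes.
from dataclasses import dataclass

@dataclass
class ApiMapping:
    torch_api: str   # fully qualified source-side API
    paddle_api: str  # Paddle counterpart ("" when only a composite replacement exists)
    category: str    # mapping category from the table

MAPPINGS = [
    ApiMapping("flash_attn.flash_attn_interface.flash_attn_func",
               "paddle.nn.functional.flash_attention.flash_attention",
               "torch has more parameters"),
    ApiMapping("flash_attn.flash_attn_interface.flash_attn_unpadded_func",
               "paddle.nn.functional.flash_attention.flash_attn_unpadded",
               "torch has more parameters"),
    ApiMapping("flash_attn.layers.rotary.apply_rotary_emb_func",
               "",  # no direct counterpart: composite replacement
               "composite replacement"),
    ApiMapping("flash_attn.ops.rms_norm.rms_norm",
               "paddle.incubate.nn.functional.fused_rms_norm",
               "only paddle has more parameters"),
]

def lookup(torch_api: str) -> ApiMapping:
    """Return the mapping row for a given torch-side API name."""
    for m in MAPPINGS:
        if m.torch_api == torch_api:
            return m
    raise KeyError(torch_api)
```

This is only an illustration of the table's structure; consult the linked per-API difference pages for the actual parameter-level mappings.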

