most model types now support flash attention 2 regardless of multipack support #4046
Annotations

1 warning

PyTest (3.11): succeeded Aug 22, 2024 in 4m 54s