
Allow flash attention 2 and upgrade to transformers 4.34.1 #2591

Triggered via pull request October 24, 2023 17:50
Status Success
Total duration 21m 34s
Artifacts 3

pr-cpu.yaml

on: pull_request
Matrix: pytest-cpu
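
The run comes from the pr-cpu.yaml workflow, triggered on pull_request with a pytest-cpu matrix job. A minimal sketch of what such a workflow could look like follows; the job layout, matrix values, action versions, and step commands are assumptions inferred from the run summary and artifact names, not the repository's actual file.

```yaml
# Hypothetical sketch of pr-cpu.yaml; the real workflow may differ.
name: PR CPU tests

on:
  pull_request:

jobs:
  pytest-cpu:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # Assumed matrix entries, inferred from the artifact suffixes
        # (cpu-2.0.1, cpu-2.1.0, cpu-latest); actual values may differ.
        version: ["2.0.1", "2.1.0", "latest"]
    steps:
      - uses: actions/checkout@v4
      - name: Run CPU tests with coverage
        run: |
          pip install coverage pytest
          coverage run -m pytest
      - name: Upload coverage artifact
        uses: actions/upload-artifact@v3
        with:
          # Mirrors the observed artifact naming: coverage-<sha>-cpu-<version>
          name: coverage-${{ github.sha }}-cpu-${{ matrix.version }}
          path: .coverage
```

The three coverage artifacts listed below would then correspond to the three matrix entries, each uploaded by its own job instance.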
Coverage Results / coverage
11s

Artifacts

Produced during runtime
Name                                                          Size
coverage-07ce0854730b7f5f84085088d153d1eb0213bb7a-cpu-2.0.1   236 KB (expired)
coverage-07ce0854730b7f5f84085088d153d1eb0213bb7a-cpu-2.1.0   236 KB (expired)
coverage-07ce0854730b7f5f84085088d153d1eb0213bb7a-cpu-latest  236 KB (expired)