Add `attention_bias` argument in transformer block and transformer layer modules, addressing change in MCore #6344
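The change threads an additive attention bias through the transformer layer down to the attention computation. The sketch below is a hypothetical, simplified illustration of that pattern (plain NumPy, single head, no projections or norms); it is not the actual Megatron-Core implementation, and all function names here are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v, attention_bias=None):
    # Scaled dot-product attention. If given, attention_bias is added to
    # the attention logits before the softmax (e.g. a relative-position
    # or ALiBi-style bias), which is the role of the new argument.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    if attention_bias is not None:
        scores = scores + attention_bias
    return softmax(scores) @ v

def transformer_layer(hidden_states, attention_bias=None):
    # Hypothetical minimal layer: self-attention plus residual connection.
    # The layer simply forwards attention_bias to the attention module.
    out = attention(hidden_states, hidden_states, hidden_states, attention_bias)
    return hidden_states + out
```

A zero bias leaves the output unchanged, so callers that do not pass `attention_bias` keep the previous behavior.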