Add attention_bias argument in transformer block and transformer layer modules, addressing a change in MCore #6344
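For context, the change named in the title amounts to threading a new `attention_bias` keyword through the forward signatures of the transformer layer and transformer block modules so it reaches the attention computation. Below is a minimal sketch of that plumbing in plain PyTorch; the class names, signatures, and the use of `attn_mask` as the additive-bias carrier are assumptions for illustration and do not reproduce the actual MCore or NeMo modules.

```python
from typing import Optional

import torch
import torch.nn as nn


class TransformerLayer(nn.Module):
    """Single layer whose forward accepts an optional additive attention bias
    and passes it through to self-attention (illustrative, not the MCore API)."""

    def __init__(self, hidden_size: int, num_heads: int):
        super().__init__()
        self.norm = nn.LayerNorm(hidden_size)
        self.self_attention = nn.MultiheadAttention(
            hidden_size, num_heads, batch_first=True
        )

    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_bias: Optional[torch.Tensor] = None,  # new pass-through argument
    ) -> torch.Tensor:
        residual = hidden_states
        x = self.norm(hidden_states)
        # nn.MultiheadAttention accepts a float attn_mask that is added to the
        # attention scores, so the bias can be supplied through that argument.
        attn_out, _ = self.self_attention(
            x, x, x, attn_mask=attention_bias, need_weights=False
        )
        return residual + attn_out


class TransformerBlock(nn.Module):
    """Stack of layers; forward relays attention_bias to every layer."""

    def __init__(self, num_layers: int, hidden_size: int, num_heads: int):
        super().__init__()
        self.layers = nn.ModuleList(
            TransformerLayer(hidden_size, num_heads) for _ in range(num_layers)
        )

    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_bias: Optional[torch.Tensor] = None,
    ) -> torch.Tensor:
        for layer in self.layers:
            hidden_states = layer(hidden_states, attention_bias=attention_bias)
        return hidden_states
```

A bias shaped like the attention-score matrix (for example, a relative-position bias broadcast over heads) can then be built once at the block's call site and reused by every layer, rather than each layer recomputing it.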

L2_ASR_Adapters_RelPos_MHA_Adapters / main: succeeded Nov 14, 2024 in 47s