Add attention_bias argument in transformer block and transformer layer modules, addressing change in MCore #6344

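For readers unfamiliar with the change, the sketch below illustrates what adding an `attention_bias` argument to a transformer block and its layer modules can look like: the block accepts an optional additive bias over attention scores and threads it down into each layer's attention call. The class names, signatures, and the use of `nn.MultiheadAttention` here are illustrative assumptions, not the actual Megatron Core (MCore) implementation.

```python
# Illustrative sketch only -- not the actual MCore code.
# Shows how an optional attention_bias tensor can be threaded from a
# transformer block down into each transformer layer's attention op.
from typing import Optional

import torch
import torch.nn as nn


class TransformerLayer(nn.Module):
    """Hypothetical layer that forwards attention_bias to its attention op."""

    def __init__(self, hidden_size: int, num_heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(hidden_size, 4 * hidden_size),
            nn.GELU(),
            nn.Linear(4 * hidden_size, hidden_size),
        )

    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
        attention_bias: Optional[torch.Tensor] = None,  # new argument
    ) -> torch.Tensor:
        # nn.MultiheadAttention accepts an additive float attn_mask, so the
        # bias is passed through that hook in this toy example.
        attn_out, _ = self.attn(
            hidden_states, hidden_states, hidden_states,
            attn_mask=attention_bias,
            key_padding_mask=attention_mask,
            need_weights=False,
        )
        hidden_states = hidden_states + attn_out
        return hidden_states + self.mlp(hidden_states)


class TransformerBlock(nn.Module):
    """Hypothetical block that accepts attention_bias and passes it to layers."""

    def __init__(self, num_layers: int, hidden_size: int, num_heads: int):
        super().__init__()
        self.layers = nn.ModuleList(
            TransformerLayer(hidden_size, num_heads) for _ in range(num_layers)
        )

    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
        attention_bias: Optional[torch.Tensor] = None,  # new argument
    ) -> torch.Tensor:
        # Thread the bias through every layer unchanged.
        for layer in self.layers:
            hidden_states = layer(hidden_states, attention_mask, attention_bias)
        return hidden_states


# Example usage (shapes are arbitrary):
block = TransformerBlock(num_layers=2, hidden_size=64, num_heads=4)
x = torch.randn(8, 16, 64)   # (batch, seq, hidden)
bias = torch.zeros(16, 16)   # additive bias over (query, key) attention scores
out = block(x, attention_bias=bias)
```

The design point is that the new argument defaults to `None` and is simply forwarded, so existing call sites keep working while callers that need a bias (for example, relative-position schemes such as ALiBi or T5-style biases) can supply one.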

CI check L2_Prune_Width_Llama2 / main: succeeded on Nov 14, 2024 in 2m 19s.