
Add attention_bias argument in transformer block and transformer layer modules, addressing change in MCore #1101

Triggered via pull request: November 14, 2024 19:47
Status: Success
Total duration: 28s
Artifacts: —

copyright-check.yml

on: pull_request
copyright-check / main — 17s