added attn and mlp bias #83

Closed
wants to merge 3 commits

Conversation

JRosenkranz
Collaborator

Motivation

The Calico models set the MLP and attention bias to true, but both were hard-coded to false in the flash and paged Llama implementations. This change uses the config params introduced in huggingface/transformers#30031 to set those values properly.

Modifications

  • added attention_bias and mlp_bias to the config for the Flash and Paged Llama implementations (default is False)
  • set the bias in attention and MLP to the config value (a sketch of the wiring follows this list)
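
A minimal sketch of the wiring, assuming PyTorch-style linear projections; the class names, field lookups, and defaults below are illustrative rather than the actual Flash/Paged Llama code in this repository:

```python
import torch.nn as nn


class FlashLlamaAttention(nn.Module):
    def __init__(self, config):
        super().__init__()
        # Fall back to False when the config predates the new field
        attention_bias = getattr(config, "attention_bias", False)
        self.q_proj = nn.Linear(config.hidden_size, config.hidden_size, bias=attention_bias)
        self.k_proj = nn.Linear(config.hidden_size, config.hidden_size, bias=attention_bias)
        self.v_proj = nn.Linear(config.hidden_size, config.hidden_size, bias=attention_bias)
        self.o_proj = nn.Linear(config.hidden_size, config.hidden_size, bias=attention_bias)


class FlashLlamaMLP(nn.Module):
    def __init__(self, config):
        super().__init__()
        # Same pattern for the MLP projections, keyed off mlp_bias
        mlp_bias = getattr(config, "mlp_bias", False)
        self.gate_proj = nn.Linear(config.hidden_size, config.intermediate_size, bias=mlp_bias)
        self.up_proj = nn.Linear(config.hidden_size, config.intermediate_size, bias=mlp_bias)
        self.down_proj = nn.Linear(config.intermediate_size, config.hidden_size, bias=mlp_bias)
```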

Result

Models that include attention and MLP bias weights should now load properly.
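
One way to check the config side, assuming a transformers version that includes huggingface/transformers#30031 (the values below are hypothetical):

```python
from transformers import LlamaConfig

# Build a Calico-style config with bias enabled and confirm the fields are
# present before loading a checkpoint that carries bias weights.
config = LlamaConfig(attention_bias=True, mlp_bias=True)
print(config.attention_bias, config.mlp_bias)  # expected: True True
```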

Related Issues

NA

@JRosenkranz JRosenkranz closed this May 6, 2024
Xaenalt pushed a commit to Xaenalt/text-generation-inference that referenced this pull request Aug 1, 2024
[pull] main from IBM:main