I would like to better understand how `lm_head` is used in llama.cpp. Did I understand correctly that it is not used? I have a model fine-tuned from `llama-3.2-3b-instruct` with some tricks to restrict the response to yes/no and to improve the probability distribution of those tokens.

Does the llama.cpp converter disable the `lm_head` when converting?

Sorry if my message is confusing; any help is welcome.
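For reference, this is roughly how I checked whether the converted GGUF still contains a separate output head. It is just a minimal sketch: the file path is a placeholder, and I am assuming the `gguf` Python package that ships with the llama.cpp repo, and that `lm_head` is the tensor that gets mapped to `output.weight` during conversion.

```python
# Sketch: list the tensors in a converted GGUF and check whether a separate
# output head ("output.weight", which I believe is where lm_head ends up)
# exists, or whether only the token embeddings ("token_embd.weight") are there.
# Assumes the gguf Python package bundled with llama.cpp (pip install gguf).
from gguf import GGUFReader

reader = GGUFReader("my-finetune-f16.gguf")  # placeholder path to the converted model

names = [t.name for t in reader.tensors]
print("has output.weight (separate lm_head):", "output.weight" in names)
print("has token_embd.weight (input embeddings):", "token_embd.weight" in names)
```

If `output.weight` is missing, I assume the runtime would fall back to the token embeddings for the output projection, which is what worries me for my fine-tune.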