Recreated settings changes - Adds several options for llamacpp and ollama #1703
Original PR here:
#1677
llama-cpp - https://llama-cpp-python.readthedocs.io/en/latest/api-reference/
https://docs.llamaindex.ai/en/stable/examples/llm/llama_2_llama_cpp.html#
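For reference, the llama-index example in the second link builds LlamaCPP roughly like this. A minimal sketch; the model path and values are illustrative:

```python
# Sketch of the llama-index LlamaCPP wrapper from the docs linked above;
# the model path and values are illustrative.
from llama_index.llms.llama_cpp import LlamaCPP

llm = LlamaCPP(
    model_path="./models/llama-2-13b-chat.Q4_0.gguf",  # any local GGUF model
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
    # passed to the llama_cpp.Llama completion call (generation-time options)
    generate_kwargs={"top_k": 40, "top_p": 0.9, "repeat_penalty": 1.1},
    # passed to llama_cpp.Llama.__init__ (load-time options)
    model_kwargs={"n_gpu_layers": 1},
    verbose=True,
)
print(llm.complete("Hello, world!"))
```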
ollama - https://github.com/run-llama/llama_index/blob/eeb2a60387b8ae1994005ad0eebb672ee02074ff/llama-index-integrations/llms/llama-index-llms-ollama/llama_index/llms/ollama/base.py
No configurable changes.
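Per the linked base.py, the Ollama wrapper exposes a handful of explicit fields and forwards the additional_kwargs dict to the Ollama API as options, which is where the modelfile parameters go. A sketch, with illustrative values:

```python
# Sketch of the llama-index Ollama wrapper from the base.py linked above;
# model name and option values are illustrative.
from llama_index.llms.ollama import Ollama

llm = Ollama(
    model="llama2",
    base_url="http://localhost:11434",
    temperature=0.75,
    request_timeout=120.0,
    # Forwarded to the Ollama API as "options"; valid keys are the
    # modelfile parameters (num_predict, top_k, top_p, repeat_penalty, ...).
    additional_kwargs={
        "num_predict": 128,
        "top_k": 40,
        "top_p": 0.9,
        "repeat_penalty": 1.1,
    },
)
print(llm.complete("Hello, world!"))
```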
openailike - https://docs.llamaindex.ai/en/stable/examples/llm/localai.html#localai
Not sure about the model_kwargs. The value is referenced for openai, but I could not find documentation on what values are allowed.
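The LocalAI page linked above drives a local OpenAI-compatible endpoint through OpenAILike. A sketch, with the endpoint and model name as placeholders:

```python
# Sketch of OpenAILike pointed at a local OpenAI-compatible endpoint, as in
# the LocalAI docs linked above; endpoint and model name are placeholders.
from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    model="local-model",
    api_base="http://localhost:8080/v1",
    api_key="not-needed",  # a local endpoint may ignore the key, but one is required
    is_chat_model=True,
    temperature=0.1,
    max_tokens=256,
)
print(llm.complete("Hello, world!"))
```

For what it's worth, the OpenAI base class that OpenAILike inherits from names its generic pass-through dict additional_kwargs, which may be what the model_kwargs reference maps to.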
openai - https://github.com/run-llama/llama_index/blob/eeb2a60387b8ae1994005ad0eebb672ee02074ff/llama-index-integrations/llms/llama-index-llms-openai/llama_index/llms/openai/base.py
https://docs.llamaindex.ai/en/stable/examples/llm/openai.html
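The OpenAI wrapper in the linked base.py takes the common sampling fields directly and routes everything else through additional_kwargs. A sketch (assumes OPENAI_API_KEY is set in the environment):

```python
# Sketch of the llama-index OpenAI wrapper from the base.py linked above.
from llama_index.llms.openai import OpenAI

llm = OpenAI(
    model="gpt-3.5-turbo",
    temperature=0.1,
    max_tokens=256,
    # Anything the wrapper has no explicit field for goes here and is
    # passed through to the OpenAI API.
    additional_kwargs={"top_p": 0.9, "seed": 42},
)
print(llm.complete("Hello, world!"))
```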
For the text/description I used the values found here:
https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values
LlamaCPP, where it uses the same key/value pairs, has the same values. However, my setup currently uses ollama, so LlamaCPP still needs some testing.
I also added temperature under the main llm settings. This should allow the value to be edited/changed for the models that support it.
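As a sketch of what the shared temperature setting implies, assuming a pydantic-style settings object like the project uses (the field names here are illustrative, not necessarily the PR's exact ones):

```python
# Sketch: one llm-level temperature setting feeding whichever backend is
# active. The settings field names are illustrative, not the PR's exact ones.
from llama_index.llms.llama_cpp import LlamaCPP
from llama_index.llms.ollama import Ollama


def build_llm(settings):
    temperature = settings.llm.temperature  # the new shared setting
    if settings.llm.mode == "ollama":
        return Ollama(
            model=settings.ollama.model,       # hypothetical field
            base_url=settings.ollama.api_base,  # hypothetical field
            temperature=temperature,
        )
    # default: local llama.cpp
    return LlamaCPP(
        model_path=settings.llamacpp.model_path,  # hypothetical field
        temperature=temperature,
    )
```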