PROBLEM
Currently, sherpa can only interface with the OpenAI API, which is very limiting for development and testing, and also for the range of use cases it can handle (e.g. using local files as a knowledge base without sending information to third-party providers).
SOLUTION
Refactor the LLM handling components into their own module and, in parallel, research and set up a library that allows serving local LLMs behind an API (e.g. https://ollama.ai/).
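The refactor above could look something like the following minimal sketch, assuming sherpa is in Python. A thin common interface hides which provider is in use; Ollama serves local models over HTTP on its default port 11434 via its `/api/generate` endpoint. The class and function names (`BaseLLM`, `OllamaLLM`, `get_llm`, etc.) are illustrative assumptions, not sherpa's actual API.

```python
# Hypothetical sketch of the proposed LLM module; names are illustrative.
import json
import urllib.request
from abc import ABC, abstractmethod


class BaseLLM(ABC):
    """Common interface every backend must implement."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class OllamaLLM(BaseLLM):
    """Talks to a locally running Ollama server (https://ollama.ai/)."""

    def __init__(self, model: str = "llama2",
                 url: str = "http://localhost:11434/api/generate"):
        self.model, self.url = model, url

    def _payload(self, prompt: str) -> dict:
        # Ollama's generate endpoint takes a model name, the prompt,
        # and a stream flag; stream=False returns one JSON object.
        return {"model": self.model, "prompt": prompt, "stream": False}

    def complete(self, prompt: str) -> str:
        req = urllib.request.Request(
            self.url,
            data=json.dumps(self._payload(prompt)).encode(),
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["response"]


class OpenAILLM(BaseLLM):
    """Placeholder for the existing OpenAI-backed client."""

    def __init__(self, model: str = "gpt-3.5-turbo"):
        self.model = model

    def complete(self, prompt: str) -> str:
        # In the real module this would wrap the current OpenAI call.
        raise NotImplementedError("wraps the existing OpenAI client")


def get_llm(backend: str, **kwargs) -> BaseLLM:
    """Factory: the rest of sherpa only ever sees BaseLLM."""
    registry = {"openai": OpenAILLM, "ollama": OllamaLLM}
    return registry[backend](**kwargs)
```

With this shape, swapping providers is a one-line change at the call site (`get_llm("ollama")` vs. `get_llm("openai")`) rather than a scattered refactor.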
Challenges:
How will this impact the default prompts in the system? Do we need to keep track of several sets of prompts?
For deployed sherpa we will continue using OpenAI; will this switch introduce too many logistical and manual steps between development and deployment?
What tests and evaluation guardrails are necessary to ensure the system doesn't run into a lot of integration errors and misbehaviors?
ALTERNATIVES
One idea we considered was finding a way to use OpenAI for free or at a lower cost (for example through their research grant programs). This does not solve the latter problem mentioned above (local use), and such a grant might also take a long time to acquire.
This discussion was converted from issue #192 on April 04, 2024 20:03.
OTHER INFO
n/a