This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

[Backend]: start building LLM function calls #94

Open
tobySolutions opened this issue Apr 25, 2024 · 0 comments
Labels
⭐ goal: addition (Addition of new feature)
🟧 priority: high (Stalls work on the project or its dependents)
💪 skill: api (Requires proficiency with APIs)
🏁 status: ready for dev (Ready for work)

Comments

@tobySolutions
Member

In the server's codebase, we need to start implementing the functions that will call the different LLMs.
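A minimal sketch of what one of these functions could look like, assuming a Node/TypeScript server and providers that expose OpenAI-compatible chat-completions endpoints. The provider list, model names, and environment variable names below are illustrative placeholders, not the project's actual configuration:

```ts
// Sketch only: dispatch a prompt to one of several OpenAI-compatible LLM APIs.
// Provider names, env vars, models, and URLs are placeholders for illustration.

type Provider = "openai" | "mistral";

interface ProviderConfig {
  baseUrl: string;
  apiKeyEnv: string;
  model: string;
}

const PROVIDERS: Record<Provider, ProviderConfig> = {
  openai: {
    baseUrl: "https://api.openai.com/v1",
    apiKeyEnv: "OPENAI_API_KEY",
    model: "gpt-3.5-turbo",
  },
  mistral: {
    baseUrl: "https://api.mistral.ai/v1",
    apiKeyEnv: "MISTRAL_API_KEY",
    model: "mistral-small-latest",
  },
};

export async function callLLM(provider: Provider, prompt: string): Promise<string> {
  const cfg = PROVIDERS[provider];
  const apiKey = process.env[cfg.apiKeyEnv];
  if (!apiKey) throw new Error(`Missing ${cfg.apiKeyEnv}`);

  // Both example providers accept an OpenAI-style /chat/completions request.
  const res = await fetch(`${cfg.baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: cfg.model,
      messages: [{ role: "user", content: prompt }],
    }),
  });

  if (!res.ok) {
    throw new Error(`${provider} request failed: ${res.status} ${await res.text()}`);
  }

  const data = await res.json();
  return data.choices[0].message.content;
}
```

Route handlers elsewhere in the server could then call something like `await callLLM("openai", userMessage)` and swap providers by changing the first argument.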

@tobySolutions added the 🟧 priority: high, 🏁 status: ready for dev, ⭐ goal: addition, and 💪 skill: api labels on Apr 25, 2024
Development

No branches or pull requests

1 participant