Add /embeddings api #41
We'll definitely add it. It seems llama.cpp almost supports embeddings.
@pedro-devv ggerganov/llama.cpp#5423
Nomic Embed also seems to be supported by llama.cpp now: ggerganov/llama.cpp#5468
You should use nomic-embed-text-v1.5 (https://blog.nomic.ai/posts/nomic-embed-matryoshka). Here is the v1.5 GGUF: https://huggingface.co/nomic-ai/nomic-embed-text-v1.5-GGUF
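For anyone who wants to try that GGUF locally before an API lands here, a minimal sketch using the llama-cpp-python bindings (assuming the package is installed and the model has been downloaded from the Hugging Face repo above; the file name below is hypothetical):

```python
# Sketch: local embeddings from the nomic-embed-text-v1.5 GGUF via llama-cpp-python.
# Assumes `pip install llama-cpp-python` and that the GGUF has been downloaded;
# adjust model_path to wherever your copy lives.
from llama_cpp import Llama

llm = Llama(
    model_path="./nomic-embed-text-v1.5.Q4_K_M.gguf",  # hypothetical file name
    embedding=True,  # load the model for embeddings rather than text generation
)

result = llm.create_embedding("The sky is blue because of Rayleigh scattering")
vector = result["data"][0]["embedding"]
print(len(vector))  # embedding dimensionality
```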
Ollama added support for nomic-embed-text since it is not supported by llama.cpp, but unfortunately its API is not compatible with the OpenAI APIs. Perf is really good too: https://ollama.com/library/nomic-embed-text

```sh
curl http://localhost:11434/api/embeddings -d '{
  "model": "nomic-embed-text",
  "prompt": "The sky is blue because of Rayleigh scattering"
}'
```
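For contrast, an OpenAI-compatible /embeddings route would let standard clients work unchanged. A hedged sketch with the official openai Python client pointed at a hypothetical local server (the base URL, port, and model name are assumptions, since no such endpoint exists in this project yet):

```python
# Sketch: how a client would call an OpenAI-compatible /v1/embeddings endpoint.
# The base_url/port and model name are placeholders, not an existing API here.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed-locally")

response = client.embeddings.create(
    model="nomic-embed-text-v1.5",  # assumed model name
    input="The sky is blue because of Rayleigh scattering",
)
print(response.data[0].embedding[:5])  # first few values of the vector
```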
Add support for the embeddings API (https://platform.openai.com/docs/guides/embeddings/what-are-embeddings). This can be useful for implementing RAG and vector search.
Maybe we could start with nomic-embed-text-v1.
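To illustrate why this helps with RAG and vector search: once an embeddings endpoint exists, retrieval is just nearest-neighbour search over document vectors. A minimal sketch, assuming a hypothetical embed() helper that wraps whichever embeddings API ends up being exposed:

```python
# Sketch: brute-force vector search over embeddings with cosine similarity.
# embed() is a hypothetical helper wrapping the future /embeddings endpoint.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query: str, documents: list[str], embed) -> str:
    query_vec = np.array(embed(query))
    doc_vecs = [np.array(embed(doc)) for doc in documents]
    scores = [cosine_similarity(query_vec, v) for v in doc_vecs]
    return documents[int(np.argmax(scores))]  # best-matching document
```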