How to use the RAG function offline? I only found the OpenAI KEY in the docs and search results. #585
Replies: 2 comments 1 reply
-
Hi @duwenlong2, there's no out-of-the-box support for custom models, so you'd need to write a couple of classes implementing ITextGenerator and ITextEmbeddingGenerator, wrapping the custom model you've chosen. https://github.com/microsoft/kernel-memory/blob/main/service/Abstractions/AI/ITextGenerator.cs Once you have those classes, it's easy to plug them in, e.g. using the KM builder methods.
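For anyone landing here, a minimal sketch of what those two classes might look like. The member names follow the abstractions linked above, but exact signatures differ between Kernel Memory versions (for example, the return type of GenerateTextAsync has changed over time), so check ITextGenerator.cs and ITextEmbeddingGenerator.cs in the version you're using. The token limits, namespaces and local-model calls below are placeholders, not the library's prescribed implementation:

```csharp
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.KernelMemory;     // Embedding
using Microsoft.KernelMemory.AI;  // ITextGenerator, ITextEmbeddingGenerator, TextGenerationOptions

// Wraps a locally hosted completion model (e.g. Gemma 2b behind a local HTTP endpoint).
public class LocalTextGenerator : ITextGenerator
{
    // Context window of the local model -- adjust to whatever you're running.
    public int MaxTokenTotal => 8192;

    // Placeholder tokenizer; swap in the real tokenizer for your model.
    public int CountTokens(string text) => text.Length / 4;

    public IReadOnlyList<string> GetTokens(string text) => new[] { text };

    public async IAsyncEnumerable<string> GenerateTextAsync(
        string prompt,
        TextGenerationOptions options,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        // TODO: call the local model here (llama.cpp server, Ollama, LM Studio, ...)
        // and stream back the generated text.
        yield return await Task.FromResult($"[local model output for a prompt of {CountTokens(prompt)} tokens]");
    }
}

// Wraps a local embedding model (e.g. m3e-base served locally).
public class LocalEmbeddingGenerator : ITextEmbeddingGenerator
{
    // Max input length of the embedding model -- adjust as needed.
    public int MaxTokens => 512;

    public int CountTokens(string text) => text.Length / 4;

    public IReadOnlyList<string> GetTokens(string text) => new[] { text };

    public Task<Embedding> GenerateEmbeddingAsync(string text, CancellationToken cancellationToken = default)
    {
        // TODO: call the local embedding endpoint and return the vector it produces.
        // m3e-base outputs 768-dimensional vectors; zeros here are only a placeholder.
        return Task.FromResult(new Embedding(new float[768]));
    }
}
```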
-
Sorry, I'm not very familiar with AI. After several days of tinkering, I still couldn't get this working. Do you have a similar example or more information?
-
Context / Scenario
Use Kernel Memory to process notes on a local computer, build a knowledge base from my own organized notes, and then review them.
Question
I want to build a local knowledge base feature. Following the reference documentation, I added Gemma 2b via Semantic Kernel. But I want to use RAG for question answering over books and my knowledge base, and because I cannot connect to the internet, I am unable to use OpenAI's remote service. I downloaded m3e-base, but I don't know how to use an offline model for RAG in Kernel Memory. Is there an example?
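Putting this scenario together: once custom generators wrapping Gemma 2b and m3e-base are registered (see the class sketch in the first reply), the offline flow would roughly look like the snippet below. WithCustomTextGenerator, WithCustomEmbeddingGenerator and the serverless build call are the KM builder methods expected here, and the file path is only an example; verify both against the Kernel Memory version in use.

```csharp
using System;
using Microsoft.KernelMemory;

// LocalTextGenerator / LocalEmbeddingGenerator are the sketch classes from the first reply.
var memory = new KernelMemoryBuilder()
    .WithCustomTextGenerator(new LocalTextGenerator())             // Gemma 2b, running locally
    .WithCustomEmbeddingGenerator(new LocalEmbeddingGenerator())   // m3e-base, running locally
    .Build<MemoryServerless>();

// Ingest locally organized notes, then ask questions about them -- no internet access required.
await memory.ImportDocumentAsync("notes/reading-notes.md", documentId: "notes-001");

var answer = await memory.AskAsync("What are the key points in my reading notes?");
Console.WriteLine(answer.Result);
```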