From 854e49cf31210078483b9c5707773d7e8c023564 Mon Sep 17 00:00:00 2001
From: Harshit Mehta
Date: Wed, 26 Apr 2023 12:26:39 +0530
Subject: [PATCH] docs: fix README

Signed-off-by: Harshit Mehta
---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index f9ece1a537..4c744ae2eb 100644
--- a/README.md
+++ b/README.md
@@ -310,6 +310,8 @@ _Analysis with serve mode_
 curl -X GET "http://localhost:8080/analyze?namespace=k8sgpt&explain=false"
 ```
 
+
+
 ## Running local models
 
 To run local models, it is possible to use OpenAI compatible APIs, for instance [LocalAI](https://github.com/go-skynet/LocalAI) which uses [llama.cpp](https://github.com/ggerganov/llama.cpp) and [ggml](https://github.com/ggerganov/ggml) to run inference on consumer-grade hardware. Models supported by LocalAI for instance are Vicuna, Alpaca, LLaMA, Cerebras, GPT4ALL, GPT4ALL-J and koala.
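
The README context above mentions that local models can be reached through an OpenAI compatible API such as LocalAI. As a rough illustration of what "OpenAI compatible" means on the wire, the sketch below builds a `/v1/chat/completions` request in the OpenAI format; the base URL and model name are assumptions for illustration, not values taken from this patch or from LocalAI's defaults.

```python
# Minimal sketch of an OpenAI-compatible chat-completion request.
# BASE_URL and MODEL are assumptions -- adjust to your local setup.
import json
import urllib.request

BASE_URL = "http://localhost:8080"  # hypothetical LocalAI endpoint
MODEL = "ggml-gpt4all-j"            # hypothetical local model name

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build a POST request in the OpenAI chat-completions wire format."""
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("Explain this Kubernetes error: CrashLoopBackOff")
print(req.full_url)  # the endpoint the request would be sent to
```

Because the request shape matches OpenAI's API, an OpenAI client pointed at the local base URL would work the same way; sending the request is then a single `urllib.request.urlopen(req)` call against a running server.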