
# Simple Chat to Local LLM Application

This project implements a simple chat interface to a local Large Language Model (LLM) served by Ollama. Its purpose is to provide hands-on experience building an LLM application from scratch.
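The core exchange with the local model can be sketched against Ollama's HTTP API. This is a minimal sketch, not code from this repo: the model name `llama3` is an assumption, and `http://localhost:11434` is Ollama's default port.

```python
import json
import urllib.request

# Ollama's default chat endpoint (assumes `ollama serve` is running locally).
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model, messages):
    """Build the JSON body for Ollama's /api/chat endpoint.

    `messages` is a list of {"role": ..., "content": ...} dicts --
    the same shape a chat UI would accumulate turn by turn.
    """
    return {"model": model, "messages": messages, "stream": False}

def chat(model, messages, url=OLLAMA_URL):
    """Send one chat turn to a locally running Ollama server and return the reply text."""
    body = json.dumps(build_chat_payload(model, messages)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

# Example usage (requires a running Ollama with the model pulled,
# e.g. `ollama pull llama3`):
#   reply = chat("llama3", [{"role": "user", "content": "Hello!"}])
```

In this project the call sits behind a gRPC server rather than being made directly from the browser, which keeps the model endpoint off the public network surface.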

## Installation

TBA (Docker support is planned).

## How to Run

Assuming the protobuf stubs have been generated and the Python environment is set up:

1. `make serve` runs the Python gRPC server.
2. `make grpc-proxy` runs the proxy between the gRPC server and the web app.
3. `make ui-dev` runs the Angular dev server (a very simple app for now).
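The three targets above might look like the following in the Makefile. This is a hypothetical sketch: the file names, port, and proxy binary are assumptions, not copied from this repo.

```make
# Hypothetical Makefile sketch -- target names match the run steps above,
# but every command below is an assumption about how the repo is wired.

serve:          # run the Python gRPC server
	python server.py

grpc-proxy:     # bridge browser-friendly gRPC-Web traffic to the gRPC server
	grpcwebproxy --backend_addr=localhost:50051 --run_tls_server=false

ui-dev:         # start the Angular dev server
	cd ui && npm start
```

A proxy such as `grpcwebproxy` is needed because browsers cannot speak native gRPC; the Angular app sends gRPC-Web requests, which the proxy translates for the Python server.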

## Future

To improve this project, consider adding features such as:

- Support for multiple users
- Persistent user sessions
- Integration with other NLP models (e.g., sentiment analysis, embeddings, function execution)
