LocalAgent – Terminal AI Assistant (Prototype)

A lightweight terminal-based AI agent prototype powered by Ollama, created to explore how local language models can assist with chat, code analysis, and basic file operations.


Overview

LocalAgent is a minimal, experimental implementation designed to understand:

  • How terminal AI agents operate
  • How local LLMs (via Ollama) respond to structured prompts
  • How AI-assisted file and code generation works in practice

This project is intentionally scoped as a learning-focused prototype, not a production system.
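To illustrate the second point above, a structured prompt can be sent to a locally running Ollama instance over its REST API (default endpoint http://localhost:11434/api/generate). This is a minimal sketch using only the standard library; the helper names are illustrative, not the project's actual code:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the request body for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON response instead of a token stream
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_model(prompt: str, model: str = "qwen2.5-coder:1.5b") -> str:
    """Send a single prompt to a locally running Ollama model and return its reply."""
    data = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `ask_local_model("Explain recursion in one sentence.")` requires Ollama to be running locally with the named model pulled.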


Scope & Intent

  • ✅ Built to experiment and learn
  • ✅ Demonstrates core agent workflows
  • ✅ Useful for general-purpose chat
  • ❌ Not production-ready
  • ❌ No guarantees of correctness or safety

This prototype is meant for internal evaluation, learning, and discussion.


Features

  • Multi-Model Support
    Compatible with any locally available Ollama model

  • Chat Mode
    General-purpose AI chat and explanations

  • Build Mode
    Generates files and folder structures
    (general-purpose scaffolding only)

  • Analyze Mode
    Basic codebase inspection and statistics

  • Safe File Operations
    Requires confirmation before creating files

  • Rich Terminal UI
    Uses rich for readable and interactive CLI output
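The confirmation-before-write pattern behind Safe File Operations can be sketched as follows. This is a minimal, hypothetical helper (not the project's actual implementation): nothing touches disk until the user explicitly approves.

```python
from pathlib import Path


def confirmed_write(path: str, content: str, ask=input) -> bool:
    """Write a generated file only after the user explicitly approves it.

    Returns True if the file was written, False if the user declined.
    """
    target = Path(path)
    if target.exists():
        prompt = f"Overwrite existing file {target}? [y/N] "
    else:
        prompt = f"Create file {target}? [y/N] "
    if ask(prompt).strip().lower() != "y":
        return False  # user declined; nothing is written
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(content, encoding="utf-8")
    return True
```

Passing `ask` as a parameter keeps the prompt testable; a real agent would route it through the Rich console instead of bare `input`.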


Important Notes & Safety

  • Build mode may be slow, depending on hardware
  • Local models can be outdated or inaccurate
  • Generated files may be incomplete or incorrect
  • Outputs should always be reviewed manually
  • Not suitable for production or automated deployment

Chat mode is generally reliable for exploration and discussion,
but build outputs should be treated as assistive suggestions only.


Quick Start

Prerequisites

  • Ollama installed and running
  • Python 3.8+

Installation

git clone https://github.com/Adii0906/Local-Agent-CLI.git
cd Local-Agent-CLI
pip install -r requirements.txt
python agent.py

Commands

Command    Description

/chat      Interact with the AI
/build     Generate files/folders
/analyze   Analyze a codebase
/model     Switch Ollama model
/help      Show help
/exit      Exit the agent
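As a sketch of what an analyze pass (like the /analyze command above) might compute, a minimal version could walk a directory and tally file and line counts per extension. This is an illustrative helper, not the project's actual implementation:

```python
from collections import Counter
from pathlib import Path


def codebase_stats(root: str) -> dict:
    """Tally file count and total line count per file extension under `root`."""
    files = Counter()
    lines = Counter()
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        ext = path.suffix or "(no extension)"
        files[ext] += 1
        try:
            text = path.read_text(encoding="utf-8", errors="ignore")
            lines[ext] += len(text.splitlines())
        except OSError:
            pass  # unreadable file; skip it
    return {ext: {"files": files[ext], "lines": lines[ext]} for ext in files}
```

A real analyze mode would likely feed a summary like this into the model as context, but even the raw statistics are useful on their own.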

Recommended Models

  • qwen2.5-coder:1.5b – Lightweight, good for coding
  • deepseek-coder:1.3b – Strong code-focused model
  • mistral:7b – Fast general-purpose model
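Any of these can be fetched ahead of time with the standard Ollama CLI (these are setup commands, so they require a working Ollama install):

```shell
# Download a model into the local Ollama store (one-time per model)
ollama pull qwen2.5-coder:1.5b

# List the models available locally
ollama list
```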

Acknowledgements

Built using Python and Rich for an interactive and user-friendly experience.

Special thanks to Ollama for providing powerful free local AI models, making local experimentation and learning possible without relying on cloud-based APIs.
