Terminal AI Commander at your fingertips
Check out the demos for more examples.
NOTE: This is a work in progress; expect bugs and API changes.
Instrukt is a terminal-based AI integrated environment. It offers a platform where users can:
- 🤖 Create and instruct modular AI agents
- 🗂️ Generate document indexes for question-answering
- 🧰 Create and attach tools to any agent
Agents are simple drop-in Python packages that can be extended, shared with others, attached to tools and augmented with document indexes.
Instruct them in natural language and, for safety, run them inside secure containers (currently implemented with Docker) to perform tasks in their dedicated, sandboxed space 🛡️.
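The class and method names below are purely illustrative and are not Instrukt's actual agent API; the sketch only conveys the idea of a modular agent that tools can be attached to and that is driven by natural-language instructions:

```python
# Hypothetical sketch -- NOT Instrukt's real API. Illustrates a modular
# agent that tools can be attached to at runtime.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Tool:
    name: str
    description: str
    run: Callable[[str], str]

@dataclass
class Agent:
    name: str
    tools: Dict[str, Tool] = field(default_factory=dict)

    def attach_tool(self, tool: Tool) -> None:
        self.tools[tool.name] = tool

    def instruct(self, instruction: str) -> str:
        # A real agent would route the instruction through an LLM; here we
        # simply dispatch to the first tool whose name appears in the text.
        for tool in self.tools.values():
            if tool.name in instruction:
                return tool.run(instruction)
        return f"{self.name}: no tool matched"

agent = Agent("demo")
agent.attach_tool(Tool("echo", "Echo the instruction back", lambda s: s.upper()))
print(agent.instruct("use echo on this"))  # USE ECHO ON THIS
```

A real drop-in agent package would also declare metadata so the platform can discover and load it, but the attach-and-dispatch shape is the core idea.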
Built with: Langchain, Textual, Chroma
Consulting Services: Need help with Langchain or AI integration? You can reach out to me at contact@blob42.xyz
- Usage
- Features
- Supported Platforms
- LLM Models
- Document Indexes and Question-Answering
- Roadmap
- Contributing
- Patreon
- Social
- Vision
- License
- Install: `pip install "instrukt[all]"`
- Export `OPENAI_API_KEY` with your OpenAI API key.
- Run `instrukt`.
- If the color scheme is broken, export `TERM=xterm-256color`.

A configuration file will be created at `~/.config/instrukt/instrukt.yml`.
You can run Instrukt on a headless server or in a Docker container with CUDA support.
NOTE: if you are starting from a bare container, you need at least `g++` and `libmagic`.
Check the quickstart and install guide for more details.
- Make sure the latest version of `poetry` is installed.
- Set up your virtualenv.
- Clone the repository.
- Run `poetry install -E all --with dev,test`. This will install Instrukt along with the extra tools for agents.

See the installation guide for more details.
Coding AI: A coding assistant. Create indexes over any code base and attach them to the agent for RAG (Retrieval Augmented Generation).
Chat Q&A: A simple conversational agent.
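As a rough picture of what RAG means for the coding assistant, here is a toy, stdlib-only sketch of the retrieval step. A real index uses embeddings and a vector store such as Chroma; plain word overlap stands in for semantic similarity here, and none of these function names come from Instrukt:

```python
# Toy sketch of RAG's retrieval step: rank indexed documents against the
# question, then pack the best matches into the prompt sent to the LLM.
# Word overlap is a crude stand-in for embedding similarity.

def score(query: str, doc: str) -> int:
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list, k: int = 2) -> list:
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list) -> str:
    context = "\n".join(retrieve(query, docs))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

docs = [
    "def connect(host): opens a database connection",
    "class Parser: parses config files",
    "def close(conn): closes a database connection",
]
print(build_prompt("open a database connection", docs))
```

The generation half simply sends the assembled prompt to the model, so the quality of the index and the retrieval step largely determines the quality of the answer.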
- A terminal-based interface for power keyboard users to instruct AI agents without ever leaving the keyboard.
- Rich colorful agent outputs with markdown and source code support thanks to the Textual TUI library.
- Run Instrukt on bare metal or docker containers with CUDA support.
- Remote access with SSH and terminal multiplexers.
- Design custom agents and tools.
- Agents are simple Python packages that can be shared and loaded by other users.
- Index your data and let agents retrieve it for question-answering.
- Create and organize your indexes with an easy UI.
- Index creation will auto detect programming languages and optimize the splitting/chunking strategy accordingly.
- Fuzzy-select (`fzf`, `dmenu`, ...) the source documents that were used for retrieval (`ctrl+p`).
- Use the pre-defined toolset or design your own tools.
- Attach or detach tools to agents on-the-go, tailoring your AI workflows to your needs.
- Integrated REPL-Prompt for quick interaction with agents, and a fast feedback loop for development and testing.
- Automate repetitive tasks with custom commands.
- Builtin prompt/chat history.
- Use `vim`, `emacs`, or any external `$EDITOR` to edit messages.
- Leverage the LangChain ecosystem to automate anything.
- WIP: Extensible API for integrating with other frameworks.
- Run agents inside secure docker containers for safety and privacy.
- Use the gVisor runtime for full isolation of the agent.
note: The docker agent is only available to Patreon supporters as an early preview.
- Debug and introspect agents using a built-in IPython console (`ctrl+d`).
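The custom-command idea above can be pictured as a small registry mapping command names to handlers. This is an illustrative sketch only, not Instrukt's actual command mechanism; every name here is hypothetical:

```python
# Hypothetical command registry -- illustrative only, not Instrukt's API.
COMMANDS = {}

def command(name: str):
    """Register a handler under a command name."""
    def decorator(fn):
        COMMANDS[name] = fn
        return fn
    return decorator

@command("summarize")
def summarize(arg: str) -> str:
    return f"summarizing: {arg}"

def dispatch(line: str) -> str:
    # Split "<command> <rest>" and route to the registered handler.
    name, _, arg = line.partition(" ")
    handler = COMMANDS.get(name)
    return handler(arg) if handler else f"unknown command: {name}"

print(dispatch("summarize chapter 1"))  # summarizing: chapter 1
```

A decorator-based registry like this keeps new commands as one-function additions, which is the property that makes repetitive tasks cheap to automate.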
- Indexes can be created using OpenAI or local embedding models.
- Chroma is used for managing indexes.
- Create and manage indexes using the Index Management UI (press `I`).
- Indexing a directory will auto-detect programming languages and use a splitting strategy optimized for the target language.
- Indexes can be attached to any agent as a retrieval tool using the `index` menu at the top of the agent's window.
- Agents can use attached indexes for question-answering.
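The language-aware splitting described above can be pictured as mapping file extensions to language-specific boundaries before chunking. This is a simplified stand-in for what language-aware splitters (such as LangChain's) do, not Instrukt's actual implementation:

```python
# Simplified sketch of language-aware chunking: pick split boundaries by
# file extension, then cut oversized chunks at a hard character limit.
LANG_SEPARATORS = {
    ".py": ["\nclass ", "\ndef "],
    ".go": ["\nfunc ", "\ntype "],
    ".md": ["\n## ", "\n# "],
}

def split_keep(text: str, sep: str) -> list:
    # Split on sep but keep it at the start of each following piece.
    parts = text.split(sep)
    return [parts[0]] + [sep + p for p in parts[1:]]

def chunk(text: str, ext: str, max_len: int = 400) -> list:
    seps = LANG_SEPARATORS.get(ext, ["\n\n"])  # generic fallback
    chunks = [text]
    for sep in seps:
        chunks = [piece for c in chunks for piece in split_keep(c, sep)]
    # Hard-cut any chunk that is still too long.
    return [c[i:i + max_len] for c in chunks for i in range(0, len(c), max_len)]

src = "import os\n\ndef a():\n    pass\n\ndef b():\n    pass\n"
print(len(chunk(src, ".py")))  # 3: the imports plus one chunk per function
```

Splitting at declaration boundaries rather than fixed offsets keeps each retrieved chunk a self-contained unit of code, which is what makes the retrieved context useful to the agent.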
- Linux/Mac.
- Windows tested under WSL2.
- Currently only OpenAI is supported.
- Using private local models is the next milestone.
- Private local LLM models
  - Local embeddings
    - HF SentenceTransformers supported embeddings
    - Instructor Embeddings
  - Use the text-generation-webui API
  - Emulate PrivateGPT
  - Use self-hosted go-skynet/LocalAI
- Indexing and embeddings
  - Index directories and auto-detect content (see `AutoDirLoader`)
  - Detect programming languages and use the appropriate splitter
  - Load a git repository from a URL
  - Load any webpage/website
- Documentation
  - Creating agents
  - Creating tools
  - Indexing and chatting with documents and source code
  - Example use cases
  - Tutorials
Any contribution, feedback, or PR is welcome!
You can help with:
- Testing and creating Issues for bugs or features that would be useful.
- If you have technical skills, you are welcome to create a PR.
- If you don't have technical skills, you can help with documentation by adding examples and tutorials, or by creating new user stories.
By becoming a patron, you will help me continue committing time to the development of Instrukt and bring to life all the planned features. Check out the Patreon page for more details on the rewards for early supporters.
Join the Discord server to keep updated on the progress or ask for help.
AI should be accessible to everyone and not a walled garden for big corporations and SaaS services.
Instrukt is a modest contribution to create tools that empower users without compromising their freedoms. The short-term goal is to make it usable with minimal reliance on external APIs and services, giving users the choice to opt for local models and self-hosted services.
Copyright (c) 2023 Chakib Ben Ziane. All Rights Reserved.
Instrukt is licensed under the AGPL. In short, this means it can be used by anyone for any purpose. However, if you make a publicly available instance, your users are entitled to a copy of the source code, including all modifications you have made (which must be accessible through an interface such as a button on your website). You may also not distribute this project in a form that does not contain the source code, such as compiling or encrypting the code and distributing that version without the source code that includes your changes. You may distribute it in a closed form only if you also provide a separate archive with the source code.