Text Loom:📝🧵

💬 What?

Text Loom is a fun workspace for creating networks that manage queries and build on them.
All from the comfort of your terminal!

📃 How?

Text flows from one node to the next.
The Text Loom philosophy, and its backend, is all about text.
Specifically, lists of text.

Nodes pass text to each other (see the sketch after this list):

  • One node creates text: (Text)
  • Some nodes read and write text files: (FileIn, FileOut)
  • Some nodes create lists: (Section, Split, MakeList)
  • One node combines lists: (Merge)
  • One node talks to an LLM: (Query)
  • One node can contain other nodes and iterate over them in loops: (Looper)
  • And one node does nothing at all except pass the text along: (Null)
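To make that concrete, here is a minimal sketch of the idea in plain Python. The function names below are hypothetical stand-ins, not Text Loom's real node classes; the point is only that every node consumes and produces lists of strings, so nodes can be chained freely.

```python
from __future__ import annotations

# Hypothetical stand-ins for a few node types; every "node" takes and
# returns a list of strings, which is all that flows through the network.

def text_node(content: str) -> list[str]:
    """Text: create text as a one-item list."""
    return [content]

def split_node(items: list[str], sep: str = "\n") -> list[str]:
    """Split: turn each item into several items."""
    return [piece for item in items for piece in item.split(sep)]

def merge_node(*lists: list[str]) -> list[str]:
    """Merge: combine several lists into one."""
    return [item for lst in lists for item in lst]

def null_node(items: list[str]) -> list[str]:
    """Null: do nothing at all except pass the text along."""
    return items

# Chain them like a tiny network:
questions = split_node(text_node("What is a loom?\nWhat is a thread?"))
print(null_node(merge_node(questions, ["Why the terminal?"])))
# ['What is a loom?', 'What is a thread?', 'Why the terminal?']
```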

🚀 Start (automagically)

curl -fsSL https://raw.githubusercontent.com/kleer001/Text_Loom/master/install.sh | bash ; cd Text_Loom

✨ Start (manual)

  • Make sure you have **git** installed and **python3** (version 3.8 or higher)
  • **Clone** the repository: git clone https://github.com/kleer001/Text_Loom ; cd Text_Loom
  • **Create** a local venv: python3 -m venv .venv
  • **Activate** it and set PYTHONPATH: source .venv/bin/activate ; export PYTHONPATH=$PYTHONPATH:$(pwd)/src
  • **Install** in development mode: pip install -e .
  • **Run** the program: python3 src/TUI/tui_skeleton.py

Note for Windows users:
Replace source .venv/bin/activate with .venv\Scripts\activate
and export PYTHONPATH=$PYTHONPATH:$(pwd)/src with set PYTHONPATH=%PYTHONPATH%;%cd%\src

📦 Currently supported LLM platforms

Configured in src/core/settings.cfg

| LLM Platform | URL | Endpoint |
| --- | --- | --- |
| Ollama | localhost:11434 | /api/generate |
| LM Studio | localhost:1234 | /v1/chat/completions |
| GPT4All | localhost:4891 | /v1/completions |
| LocalAI | localhost:8080 | /v1/chat/completions |
| llama.cpp | localhost:8080 | /completion |
| oobabooga | localhost:5000 | /v1/chat/completions |
| ChatGPT | https://api.openai.com | /v1/chat/completions |
| Perplexity | https://api.perplexity.ai | /v1/chat/completions |
| Claude | https://api.anthropic.com | /v1/messages |
| Gemini | https://generativelanguage.googleapis.com | /v1/models/gemini-1.5-pro:generateContent |
  • Suggestions for more free local LLM platforms are welcome. Feel free to change your local settings.cfg to fit your own purposes; the structure should be self-evident from the examples in it.
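For a sense of how one of these endpoints is called, the sketch below sends a prompt to the Ollama entry from the table (localhost:11434 with /api/generate) using only the Python standard library. It illustrates the public Ollama API, not the Query node's actual code, and the model name "llama3" is an assumption; use whatever model you have pulled locally.

```python
import json
import urllib.request

# Assumed model name; replace with a model you have pulled in Ollama.
payload = {
    "model": "llama3",
    "prompt": "Summarize a node-based text workspace in one sentence.",
    "stream": False,  # request a single JSON reply instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```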

🚶 GUI WALK THROUGH

👀 MAIN WINDOW

(GIF: demo of MakeList functionality)

Primary Workspace

Each primary window can be navigated to with the key command CTRL+(n/a/g)

  • Node Network - Central workspace for creating and connecting nodes. Displays node states, connections, and hierarchies using visual indicators.
  • Parameters - Center Top panel showing properties of selected nodes.
  • Globals - Right Top panel. System-wide variables accessible across the network.

Execution and Output

  • Output Display - Center bottom. Shows formatted results from node evaluations with clear item separation.
  • Status Window - Right bottom. Real-time system message monitoring, capturing stdout and stderr streams.
  • Help window - Bottom. Shows the key commands available for the active window.
  • Mode Line - Gutter. Shows the active window, current filename, last window switched to, and last key pressed.

Please see the extensive wiki for more detailed information.
