# Chainlit + OpenAI Responses Demo (Multimodal • Reasoning • Streaming • Functions)

A lean Chainlit app that wires the **OpenAI Responses API** to a real **local Python executor** with **multimodal I/O (text, images, files)**, **reasoning traces**, **token+function streaming**, and **tool functions** (web search preview, image generation, Python runner, file ops, optional file-search).  

---

## Features (at a glance)

* **Multimodal input**: accepts text, images, and file uploads in one message. 
* **Multimodal output**: renders **images** inline and **CSVs as DataFrames** (with a download link), falling back to a readable text preview.
* **Reasoning**: configurable effort + **streamed reasoning summary** you can show in the UI when available.  
* **Streaming**: assistant text streams token-by-token; **function call metadata & arguments stream** too, so you can live-render generated Python code.  
* **Functions / Tools**: registered tools include `execute_python_code`, `upload_file_to_workspace`, `list_workspace_files`, a simple calculator, `web_search_preview`, `image_generation`, and an optional `file_search` when a vector store is present.
* **Per-chat workspace**: each chat gets its own `.files/{session_id}/pyws` folder; Python runs there and files persist for the chat’s lifetime. 
* **Nice UX**: progress steps for web search, image generation, and Python runs; optional panels for the full conversation and the streamed reasoning summary.

---

## What’s in the repo

* **`app.py`** — Chainlit lifecycle, dev-prompt injection on first turn, vector-store hookup on upload, Responses **streaming loop** (text + function args), live “Python Code Being Generated” pane, and a multi-iteration tool loop.  
* **`tools.py`** — Tool registry and implementations:
  * `execute_python_code` (**persistent workspace**, returns stdout/stderr + collected files).
  * `upload_file_to_workspace` / `list_workspace_files` helpers.
  * File renderer that shows **images inline**, **CSV as DataFrame**, and **text previews** with download buttons.
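
The executor at the heart of `tools.py` can be sketched roughly like this (the function name matches the tool, but the signature, the `_run.py` scratch-file name, and the return shape are illustrative):

```python
import subprocess
import sys
from pathlib import Path

def execute_python_code(code: str, workspace: Path, timeout: int = 30) -> dict:
    """Run code inside the per-chat workspace; report stdout/stderr plus new files."""
    workspace.mkdir(parents=True, exist_ok=True)
    before = {p.name for p in workspace.iterdir()}
    script = workspace / "_run.py"      # scratch-file name is illustrative
    script.write_text(code)
    proc = subprocess.run(
        [sys.executable, script.name],
        cwd=workspace,                  # relative paths stay inside the workspace
        capture_output=True,
        text=True,
        timeout=timeout,                # time limit on generated code
    )
    new_files = sorted({p.name for p in workspace.iterdir()} - before - {"_run.py"})
    return {"stdout": proc.stdout, "stderr": proc.stderr, "files": new_files}
```

Running in `cwd=workspace` is what makes files created by generated code land in the per-chat folder and persist between runs.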

---

## Quick start

```bash
python -m venv .venv && source .venv/bin/activate # Windows: .venv\Scripts\activate
pip install chainlit openai pandas matplotlib
export OPENAI_API_KEY=sk-... # Windows PowerShell: $env:OPENAI_API_KEY="sk-..."
chainlit run app.py -w
```

Open [http://localhost:8000](http://localhost:8000).

---

## How it works

1. **Dev prompt + settings**
First turn injects your developer instructions; reasoning effort and summary are toggled in settings.  
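
A minimal sketch of that first-turn assembly (the wrapper function is an assumption; the `input`, `stream`, and `reasoning` fields follow the Responses API):

```python
def build_request(messages, first_turn: bool, dev_prompt: str, effort: str = "medium") -> dict:
    """Assemble kwargs for responses.create; the dev prompt rides along only once."""
    input_items = list(messages)
    if first_turn:
        input_items = [{"role": "developer", "content": dev_prompt}] + input_items
    return {
        "input": input_items,
        "stream": True,
        # Effort and the streamed summary are the two reasoning settings in the UI.
        "reasoning": {"effort": effort, "summary": "auto"},
    }
```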

2. **Multimodal in**
You can send text, images, and files together; non-image files are converted to **function calls** to place them in the workspace.  
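
One way to sort incoming attachments, roughly as the app does it (the tool name `upload_file_to_workspace` comes from the repo; the element shape and helper are illustrative):

```python
def split_attachments(elements):
    """Images stay in the user message; other files become workspace-upload calls."""
    image_items, upload_calls = [], []
    for el in elements:
        if el["mime"].startswith("image/"):
            image_items.append({"type": "input_image", "image_url": el["url"]})
        else:
            upload_calls.append(
                {"name": "upload_file_to_workspace", "arguments": {"path": el["path"]}}
            )
    return image_items, upload_calls
```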

3. **Streaming loop**
   The app calls `responses.create(stream=True)` and processes an event stream:

   * assistant text tokens,
   * function call *creation* events,
   * **function argument deltas** (used to live-render code for `execute_python_code`).
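
The dispatch over those events can be sketched as follows (the event `type` strings follow the Responses streaming API; the accumulator shape is illustrative):

```python
def consume_stream(events):
    """Accumulate assistant text and per-call function arguments from stream events."""
    text_parts, calls = [], {}
    for ev in events:
        if ev["type"] == "response.output_text.delta":
            text_parts.append(ev["delta"])       # assistant text token
        elif ev["type"] == "response.output_item.added" and ev["item"]["type"] == "function_call":
            calls[ev["item"]["id"]] = {"name": ev["item"]["name"], "arguments": ""}
        elif ev["type"] == "response.function_call_arguments.delta":
            # Argument deltas are what let the UI live-render generated Python code.
            calls[ev["item_id"]]["arguments"] += ev["delta"]
    return "".join(text_parts), calls
```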

4. **Tool execution**
Detected calls are executed, then their outputs are returned via `function_call_output` and the loop continues until the model is done. 
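
The shape of that loop, with the model call and tool dispatch injected so the control flow stands alone (the `function_call_output` item follows the Responses API; the rest is a sketch):

```python
def tool_loop(ask_model, run_tool, input_items, max_iterations=5):
    """Alternate model turns and tool runs until the model stops requesting tools."""
    output = []
    for _ in range(max_iterations):
        output = ask_model(input_items)      # one model turn: a list of output items
        calls = [item for item in output if item["type"] == "function_call"]
        if not calls:
            break                            # no tools requested: the model is done
        input_items = input_items + output
        for call in calls:
            input_items.append({
                "type": "function_call_output",
                "call_id": call["call_id"],
                "output": run_tool(call["name"], call["arguments"]),
            })
    return output
```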

5. **Results rendering**
Python runs can emit images/CSVs/text; these are previewed inline and attached as downloads.  
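
The renderer's fallback chain can be sketched like this (the extension sets and the 2000-character preview cap are illustrative):

```python
from pathlib import Path

def preview_kind(path: Path):
    """Choose a preview: inline image, pandas DataFrame for CSV, else a text snippet."""
    suffix = path.suffix.lower()
    if suffix in {".png", ".jpg", ".jpeg", ".gif"}:
        return ("image", path)
    if suffix == ".csv":
        try:
            import pandas as pd
            return ("dataframe", pd.read_csv(path))
        except ImportError:
            pass                             # no pandas: fall through to text
    return ("text", path.read_text(errors="replace")[:2000])
```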

6. **Optional RAG**
If you upload docs at start, the app creates a vector store and enables the `file_search` tool for the model.

---

## Configuration

* **Model**: change the model in `_ask_gpt`. 
* **Reasoning**: set effort, enable summary. 
* **Workspace**: path = `.files/{session_id}/pyws`; auto-cleaned on chat end.  
* **Tools**: edit `build_tools(...)` to add/remove tools; `file_search` is enabled only when a vector store is present.
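
That gating amounts to something like the following (tool `type` values follow the Responses API; the function body is a sketch of the repo's `build_tools`):

```python
def build_tools(vector_store_id=None, function_schemas=()):
    """Assemble the tool list for responses.create; file_search only with a store."""
    tools = [{"type": "web_search_preview"}, {"type": "image_generation"}]
    tools.extend(function_schemas)           # execute_python_code, file ops, ...
    if vector_store_id:
        tools.append({"type": "file_search", "vector_store_ids": [vector_store_id]})
    return tools
```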

---

## Troubleshooting

* **No code preview while functions run** → ensure you’re calling Responses with `stream=True` and watch for `function_call_arguments.delta` events.  
* **CSV not rendering as a table** → pandas must be importable; otherwise it falls back to a text preview.  
* **Files didn’t persist** → files persist only for the chat’s lifetime; cleanup runs on `@cl.on_chat_end`.

---

## Security

Python executes **locally** inside the per-chat workspace. Treat generated code as untrusted; time limits are applied in the executor and files are confined to that directory. (See the Python tool descriptions and file-handling utilities.)  
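
A minimal confinement check of the kind the file utilities rely on (the function name is illustrative; requires Python 3.9+ for `Path.is_relative_to`):

```python
from pathlib import Path

def resolve_in_workspace(workspace: Path, name: str) -> Path:
    """Resolve a filename and refuse anything that escapes the per-chat workspace."""
    candidate = (workspace / name).resolve()
    if not candidate.is_relative_to(workspace.resolve()):
        raise ValueError(f"path escapes workspace: {name!r}")
    return candidate
```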

---

## File map

* `app.py` — chat lifecycle, dev prompt, vector store & `file_search`, event-driven streaming, multi-iteration function loop, code preview.
* `tools.py` — tool registry, Python executor, upload/list helpers, image/CSV/text previewers, progress + reasoning summary steps.  

Add a license (MIT recommended) and you’re set.