Polish docs #88

Merged · 2 commits · Nov 14, 2024
9 changes: 8 additions & 1 deletion docs/tutorials/chat-on-tabular-data.md
@@ -1,6 +1,13 @@
# Chat on Tabular Data

Beyond simple conversations, `tablegpt-agent` can also analyze and process tabular data. For data analysis, preprocess the data first with TableGPT. Use a fixed session_id to keep the agent within the same execution context, and set a save point to enable memory retention:
TableGPT Agent excels at analyzing and processing tabular data. To perform data analysis, you need to first let the agent "see" the dataset. This is done by a specific "file-reading" workflow. In short, you begin by "uploading" the dataset and let the agent read it. Once the data is read, you can ask the agent questions about it.

> To learn more about the file-reading workflow, see [File Reading](./explanation/file-reading.md).

For data analysis tasks, we introduce two important parameters when creating the agent: `checkpointer` and `session_id`.

- The `checkpointer` should be an instance of `langgraph.checkpoint.base.BaseCheckpointSaver`, which acts as a versioned "memory" for the agent. (See [langgraph's persistence concept](https://langchain-ai.github.io/langgraph/concepts/persistence) for more details.)
- The `session_id` is a unique identifier for the current session. It ties the agent's execution to a specific kernel, ensuring that the agent's results are retained across multiple invocations.

```pycon
>>> from langgraph.checkpoint.memory import MemorySaver
```
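
The rest of this example is collapsed in the diff. As an illustration of how `checkpointer` and `session_id` might fit together, here is a minimal sketch, assuming the agent is built with a `create_tablegpt_graph` factory, a `pybox.LocalPyBoxManager` executor, and a messages-style invoke payload:

```python
# Illustrative sketch only -- the factory name, import paths, and invoke payload
# are assumptions rather than confirmed API.
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver
from pybox import LocalPyBoxManager               # assumed local code-execution backend
from tablegpt.agent import create_tablegpt_graph  # assumed factory exposed by tablegpt-agent

llm = ChatOpenAI(base_url="http://localhost:8000/v1", api_key="whatever", model="TableGPT2-7B")

# MemorySaver keeps checkpoints in process memory; session_id pins every
# invocation to the same sandboxed kernel so intermediate results survive turns.
agent = create_tablegpt_graph(
    llm=llm,
    pybox_manager=LocalPyBoxManager(),
    checkpointer=MemorySaver(),
    session_id="demo-session",
)

# The thread_id groups checkpoints, so a follow-up question reuses earlier state.
config = {"configurable": {"thread_id": "demo-session"}}
response = agent.invoke(
    {"messages": [HumanMessage(content="Show basic statistics of the uploaded table.")]},
    config,
)
print(response["messages"][-1].content)
```

With the same `thread_id` and `session_id`, a follow-up `invoke` call picks up the checkpointed conversation and the variables produced by the previous turn.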
9 changes: 5 additions & 4 deletions docs/tutorials/quickstart.md
@@ -2,7 +2,7 @@

## Installation

To install `tablegpt-agent`, use the following command:
To install TableGPT Agent, use the following command:

```sh
pip install tablegpt-agent
```
@@ -16,14 +16,14 @@ pip install tablegpt-agent[local]

## Setup LLM Service

Before using `tablegpt-agent`, ensure that you have an OpenAI-compatible server set up to host TableGPT2. We recommend using [vllm](https://github.com/vllm-project/vllm) for this:
Before using TableGPT Agent, ensure that you have an OpenAI-compatible server set up to host TableGPT2. We recommend using [vllm](https://github.com/vllm-project/vllm) for this:

> **Note:** If you need to use `tablegpt-agent` to analyze tabular data, please ensure you have `vllm>=0.5.5` installed.

```sh
pip install 'vllm>=0.5.5'
```


```sh
python -m vllm.entrypoints.openai.api_server --served-model-name TableGPT2-7B --model path/to/weights
```
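
Once the server is running, it can be reached with any OpenAI-compatible client. A minimal sanity check, assuming vllm's default host and port (`http://localhost:8000/v1`):

```python
# Optional check that the server started above is reachable.
# Assumes vllm's defaults; adjust the base_url if you changed host or port.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="whatever")
print([m.id for m in client.models.list().data])  # should list "TableGPT2-7B"
```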
@@ -32,8 +32,9 @@ python -m vllm.entrypoints.openai.api_server --served-model-name TableGPT2-7B --

## Chat with TableGPT Agent

To create a `TablegptAgent`, you'll need both an `LLM` and a `PyBoxManager` instance:
To create an agent, you'll need both an `LLM` and a `PyBoxManager` instance:
> **NOTE:** The `llm` is created using `langchain-openai`, so install it first:

```sh
pip install langchain-openai
```
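
A minimal sketch of how the two prerequisites come together, assuming the agent factory is `create_tablegpt_graph` and the local executor is `pybox.LocalPyBoxManager`:

```python
# Sketch only -- the LocalPyBoxManager import path and the create_tablegpt_graph
# factory name are assumptions.
from langchain_openai import ChatOpenAI
from pybox import LocalPyBoxManager               # assumed local code-execution backend
from tablegpt.agent import create_tablegpt_graph  # assumed agent factory

# Point the LLM at the vllm server started in the previous section.
llm = ChatOpenAI(base_url="http://localhost:8000/v1", api_key="whatever", model="TableGPT2-7B")
pybox_manager = LocalPyBoxManager()

agent = create_tablegpt_graph(llm=llm, pybox_manager=pybox_manager)
```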