Create TSI

create-tsi is a low-code generative AI toolkit that generates RAG (retrieval-augmented generation) AI Applications using LlamaIndex.

AI Applications generated by create-tsi use LLMs hosted by T-Systems on the Open Telekom Cloud.

The purpose of create-tsi is to make the AI Application creation process easy, flexible, and fast. With create-tsi you can generate bots, write agents, and customize them for specific use cases.

Please Note

To get started with create-tsi, you need a T-Systems API key. You can request trial access via this form.

Once you have the key, just run

npx create-tsi@latest

to get started. Once your app is generated, see its README.md file for instructions on starting the app.

What you'll get

  • A Next.js-powered frontend: the app is set up as a chat interface that can answer questions about your data (see below)
  • A Python FastAPI backend powered by the llama-index Python package
  • The backend exposes a single endpoint that accepts the current state of your chat and returns additional responses
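The "state of your chat" contract can be sketched in plain Python. The message shape below (role/content pairs in an OpenAI-style chat format) is an assumption for illustration, not the generated app's exact API schema:

```python
from dataclasses import dataclass, asdict

# Hypothetical message shape for illustration; the generated app's
# actual request schema may differ.
@dataclass
class ChatMessage:
    role: str      # "user" or "assistant"
    content: str

def chat_request_payload(history: list[ChatMessage]) -> dict:
    """Serialize the full chat state for a single POST to the backend.

    The backend receives the entire conversation in one request and
    returns the next assistant message(s), so the client stays stateless.
    """
    return {"messages": [asdict(m) for m in history]}

history = [
    ChatMessage("user", "What does the data folder contain?"),
]
payload = chat_request_payload(history)
```

Sending the whole history on every request keeps the endpoint simple: the server needs no session storage, and the frontend owns the conversation state.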

Using your data

Unless you chose to generate a simple chat, you can supply your own data; the app will index it and answer questions about it.

Your generated app will have a folder called data in the backend directory. The app will ingest any supported files you put in this directory.
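File discovery in such an app could be sketched as follows. This is a stdlib-only illustration, not the generated backend's actual code (which delegates ingestion to llama-index readers); the extension list is an assumption mirroring the file types the CLI prompt offers:

```python
from pathlib import Path

# Illustrative extension allowlist; the real backend's supported
# formats are determined by its llama-index readers.
SUPPORTED = {".pdf", ".doc", ".docx", ".xls", ".xlsx", ".csv", ".txt"}

def discover_documents(data_dir: str) -> list[Path]:
    """Recursively collect every supported file under the data directory."""
    root = Path(data_dir)
    return sorted(
        p for p in root.rglob("*")
        if p.is_file() and p.suffix.lower() in SUPPORTED
    )
```

Dropping a new file into the data folder and re-running ingestion is all that is needed to make it queryable; unsupported file types are simply skipped.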

Example

The simplest thing to do is run create-tsi in interactive mode:

npx create-tsi@latest
# or
npm create tsi@latest
# or
yarn create tsi
# or
pnpm create tsi@latest

You will be asked for the name of your project, along with other configuration options, something like this:

>> npx create-tsi@latest
✔ What is your project named? … my-app
✔ Would you like to generate a NextJS frontend for your FastAPI (Python) backend? … No / Yes
✔ Please provide your T-Systems API key (or reuse TSI_API_KEY env variable): …
✔ Which model would you like to use? › Mixtral-8x7B-Instruct-v0.1
✔ Which embedding model would you like to use? › paraphrase-multilingual-mpnet-base-v2
? Which data source would you like to use? › - Use arrow-keys. Return to submit.
   No data, just a simple chat
❯  Use an example PDF
   Use local files (.pdf, .doc, .docx, .xls, .xlsx, .csv)
   Use local folders
   Use website content (requires Chrome)
   Use data from a database (MySQL)

Code of Conduct

This project has adopted version 2.1 of the Contributor Covenant as its code of conduct; see CODE_OF_CONDUCT.md for details.

By participating in this project, you agree to abide by the Code of Conduct at all times.

Licensing

This project follows the REUSE standard for software licensing.
Each file contains copyright and license information, and license texts can be found in the ./LICENSES folder. For more information visit https://reuse.software/.
You can find a guide for developers at https://telekom.github.io/reuse-template/.

To annotate your files with licensing information, run:

pipx run reuse annotate --copyright="Deutsche Telekom AG, LlamaIndex, Vercel, Inc." --license="MIT" --recursive --fallback-dot-license --skip-existing .
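Roughly speaking, this command prepends an SPDX comment header to each file that lacks one (falling back to a `.license` sidecar file for formats without comments). For a Python file, the result looks something like this; the exact year is filled in by the tool:

```python
# SPDX-FileCopyrightText: Deutsche Telekom AG, LlamaIndex, Vercel, Inc.
#
# SPDX-License-Identifier: MIT
```

The `--skip-existing` flag leaves files that already carry a header untouched, so the command is safe to re-run.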

LlamaIndex Documentation

Inspired by and adapted from create-next-app