This example demonstrates how to use LlamaIndex to build a document index and query it with an OpenAI GPT model routed through LangDB.
- Python 3.8+
- Access to the LangDB.ai API
- LangDB.ai API key
pip install llama-index openai
- Create a data directory in the same folder as main.py and add your documents there.
The main.py script (see the sketch after this list):
- Initializes LlamaIndex with LangDB.ai configuration
- Loads documents describing LangDB.ai features from the data directory
- Creates a vector store index from the documents
- Persists the index to disk in a storage directory
- Creates a query engine to interact with the index
- Performs a sample query about LangDB.ai features
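A minimal sketch of what main.py might look like. The LangDB endpoint URL, model name, and the choice to route embeddings through the same gateway are assumptions for illustration, not details taken from this example:

```python
# main.py -- minimal sketch; endpoint URL and model name are assumptions.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI

LANGDB_API_BASE = "https://api.us-east-1.langdb.ai/v1"  # assumed endpoint
LANGDB_API_KEY = "your-langdb-api-key"

# Point the OpenAI-compatible LLM client at LangDB.
Settings.llm = OpenAI(model="gpt-4o-mini", api_base=LANGDB_API_BASE, api_key=LANGDB_API_KEY)
# Routing embeddings through the same gateway is an assumption; you could
# instead use OpenAI directly for the embedding model.
Settings.embed_model = OpenAIEmbedding(api_base=LANGDB_API_BASE, api_key=LANGDB_API_KEY)

# Load documents, build the vector index, and persist it to ./storage.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
index.storage_context.persist(persist_dir="storage")

# Create a query engine and run the sample query.
query_engine = index.as_query_engine()
response = query_engine.query("what are the features of langdb.ai?")
print(response)
```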
- Place your documents in the data directory
- Run the script:
python main.py
- SimpleDirectoryReader: Loads documents from a directory
- VectorStoreIndex: Creates and manages document embeddings
- OpenAI: LLM configuration for queries
- query_engine: Interface for querying the indexed documents
The script will print the response to the query "what are the features of langdb.ai?" based on the content of your indexed documents.
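Because the index is persisted to disk, subsequent runs can skip re-embedding the documents by loading it back from the storage directory. A minimal sketch, assuming the same Settings configuration (LLM and embedding model) as in main.py has already been applied:

```python
# Reload the persisted index instead of rebuilding it (assumes ./storage
# exists from a previous run and Settings are configured as in main.py).
from llama_index.core import StorageContext, load_index_from_storage

storage_context = StorageContext.from_defaults(persist_dir="storage")
index = load_index_from_storage(storage_context)
query_engine = index.as_query_engine()
print(query_engine.query("what are the features of langdb.ai?"))
```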