After successful installation of all dependencies, we can start writing code.
First, import all required packages:
```python
import os

from langchain.indexes import VectorstoreIndexCreator
from langchain_community.utilities import ApifyWrapper
from langchain_core.document_loaders.base import Document
from langchain_openai import OpenAI
from langchain_openai.embeddings import OpenAIEmbeddings
```
Find your [Apify API token](https://console.apify.com/account/integrations) and [OpenAI API key](https://platform.openai.com/account/api-keys), set them as environment variables, and then run the [Website Content Crawler](https://apify.com/apify/website-content-crawler) Actor to crawl the LangChain documentation and load the results into a document loader:

```python
os.environ["OPENAI_API_KEY"] = "Your OpenAI API key"
os.environ["APIFY_API_TOKEN"] = "Your Apify API token"

apify = ApifyWrapper()

loader = apify.call_actor(
    actor_id="apify/website-content-crawler",
    run_input={"startUrls": [{"url": "https://python.langchain.com/"}]},
    dataset_mapping_function=lambda item: Document(
        page_content=item["text"] or "",
        metadata={"source": item["url"]},
    ),
)
```
:::note Crawling may take some time

The Actor call may take some time as it crawls the LangChain documentation website.

:::
Initialize the vector index from the crawled documents:
```python
index = VectorstoreIndexCreator(embedding=OpenAIEmbeddings()).from_loaders([loader])
```
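Under the hood, the index embeds each document and answers a query by retrieving the documents whose embedding vectors are closest to the query's. A toy plain-Python sketch of that nearest-neighbor step, with made-up 3-dimensional vectors standing in for real OpenAI embeddings:

```python
import math


def cosine_similarity(a: list, b: list) -> float:
    # Cosine similarity: dot product divided by the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))


# Toy "embeddings": in the real index these come from OpenAIEmbeddings.
doc_vectors = {
    "doc-about-langchain": [0.9, 0.1, 0.0],
    "doc-about-cooking": [0.0, 0.2, 0.9],
}
query_vector = [0.8, 0.2, 0.1]  # pretend embedding of "What is LangChain?"

# Retrieve the document most similar to the query.
best = max(doc_vectors, key=lambda name: cosine_similarity(query_vector, doc_vectors[name]))
print(best)  # doc-about-langchain
```

The real vector store performs the same comparison over high-dimensional embeddings and returns the top matches as context for the LLM.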
And finally, query the vector index:
```python
query = "What is LangChain?"
result = index.query_with_sources(query, llm=OpenAI())

print("answer:", result["answer"])
print("source:", result["sources"])
```
If you want to test the whole example, you can simply create a new file, `langchain_integration.py`, and copy the whole code into it.
```python
import os

from langchain.indexes import VectorstoreIndexCreator
from langchain_community.utilities import ApifyWrapper
from langchain_core.document_loaders.base import Document
from langchain_openai import OpenAI
from langchain_openai.embeddings import OpenAIEmbeddings

os.environ["OPENAI_API_KEY"] = "Your OpenAI API key"
os.environ["APIFY_API_TOKEN"] = "Your Apify API token"

apify = ApifyWrapper()

loader = apify.call_actor(
    actor_id="apify/website-content-crawler",
    run_input={"startUrls": [{"url": "https://python.langchain.com/"}]},
    dataset_mapping_function=lambda item: Document(
        page_content=item["text"] or "",
        metadata={"source": item["url"]},
    ),
)

print("Compute embeddings...")
index = VectorstoreIndexCreator(embedding=OpenAIEmbeddings()).from_loaders([loader])

query = "What is LangChain?"
result = index.query_with_sources(query, llm=OpenAI())

print("answer:", result["answer"])
print("source:", result["sources"])
```
To run it, you can use the following command: `python langchain_integration.py`
After running the code, you should see the following output:
```text
answer: LangChain is a framework for developing applications powered by language models. It provides standard, extendable interfaces, external integrations, and end-to-end implementations for off-the-shelf use. It also integrates with other LLMs, systems, and products to create a vibrant and thriving ecosystem.

source: https://python.langchain.com
```
LangChain is a standard interface through which you can interact with a variety of large language models (LLMs). It provides modules you can use to build language model applications. It also provides chains and agents with memory capabilities.