docs[patch]: Fix .png badge bug and var naming #6304

Merged — 1 commit, merged on Aug 1, 2024
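The badge fix appends a trailing `&` to each shields.io badge URL in the integration tables. A likely reading of the ".png badge bug" (an assumption, not stated in the diff) is that the docs tooling appends `.png` to badge image URLs, which would otherwise be absorbed into the final `label=%20` query parameter; with the trailing `&`, the appended suffix lands in an empty, ignored parameter instead. A minimal before/after sketch using the `@langchain/community` downloads badge from the diff below:

```md
<!-- Before: an appended ".png" would become part of the label value (label=%20.png) -->
![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/community?style=flat-square&label=%20)

<!-- After: the trailing "&" opens an empty parameter, so an appended ".png" no longer touches the label -->
![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/community?style=flat-square&label=%20&)
```

The remaining changes rename numbered variables in the MistralAI notebook (e.g. `calculatorSchema2`, `llm3`, `prompt4`) to descriptive names such as `calculatorSchema`, `llmForWSO`, and `promptForJsonSchema`.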
10 changes: 7 additions & 3 deletions docs/core_docs/docs/integrations/chat/fireworks.ipynb
@@ -3,10 +3,14 @@
{
"cell_type": "raw",
"id": "afaf8039",
"metadata": {},
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"sidebar_label: ChatFireworks\n",
"sidebar_label: Fireworks\n",
"---"
]
},
@@ -24,7 +28,7 @@
"\n",
"| Class | Package | Local | Serializable | [PY support](https:/python.langchain.com/v0.2/docs/integrations/chat/fireworks) | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
"| [ChatFireworks](https://api.js.langchain.com/classes/langchain_community_chat_models_fireworks.ChatFireworks.html) | [@langchain/community](https://api.js.langchain.com/modules/langchain_community_chat_models_fireworks.html) | ❌ | ✅ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/community?style=flat-square&label=%20) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/community?style=flat-square&label=%20) |\n",
"| [ChatFireworks](https://api.js.langchain.com/classes/langchain_community_chat_models_fireworks.ChatFireworks.html) | [@langchain/community](https://api.js.langchain.com/modules/langchain_community_chat_models_fireworks.html) | ❌ | ✅ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/community?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/community?style=flat-square&label=%20&) |\n",
"\n",
"### Model features\n",
"| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
115 changes: 60 additions & 55 deletions docs/core_docs/docs/integrations/chat/mistral.ipynb
@@ -3,10 +3,14 @@
{
"cell_type": "raw",
"id": "afaf8039",
"metadata": {},
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"sidebar_label: ChatMistralAI\n",
"sidebar_label: MistralAI\n",
"---"
]
},
@@ -24,7 +28,7 @@
"\n",
"| Class | Package | Local | Serializable | [PY support](https:/python.langchain.com/v0.2/docs/integrations/chat/mistralai) | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
"| [ChatMistralAI](https://api.js.langchain.com/classes/langchain_mistralai.ChatMistralAI.html) | [@langchain/mistralai](https://api.js.langchain.com/modules/langchain_mistralai.html) | ❌ | ❌ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/mistralai?style=flat-square&label=%20) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/mistralai?style=flat-square&label=%20) |\n",
"| [ChatMistralAI](https://api.js.langchain.com/classes/langchain_mistralai.ChatMistralAI.html) | [@langchain/mistralai](https://api.js.langchain.com/modules/langchain_mistralai.html) | ❌ | ❌ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/mistralai?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/mistralai?style=flat-square&label=%20&) |\n",
"\n",
"### Model features\n",
"| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
@@ -250,7 +254,7 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 1,
"id": "98d9034c",
"metadata": {},
"outputs": [
@@ -263,7 +267,7 @@
" name: 'calculator',\n",
" args: { operation: 'add', number1: 2, number2: 2 },\n",
" type: 'tool_call',\n",
" id: '2HFfjvCvo'\n",
" id: 'Tn8X3UCSP'\n",
" }\n",
"]\n"
]
@@ -275,32 +279,31 @@
"import { z } from \"zod\";\n",
"import { tool } from \"@langchain/core/tools\";\n",
"\n",
"const calculatorSchema2 = z.object({\n",
"const calculatorSchema = z.object({\n",
" operation: z\n",
" .enum([\"add\", \"subtract\", \"multiply\", \"divide\"])\n",
" .describe(\"The type of operation to execute.\"),\n",
" number1: z.number().describe(\"The first number to operate on.\"),\n",
" number2: z.number().describe(\"The second number to operate on.\"),\n",
"});\n",
"\n",
"const calculatorTool2 = tool((input) => {\n",
"const calculatorTool = tool((input) => {\n",
" return JSON.stringify(input);\n",
"}, {\n",
" name: \"calculator\",\n",
" description: \"A simple calculator tool\",\n",
" schema: calculatorSchema2,\n",
" schema: calculatorSchema,\n",
"});\n",
"\n",
"const llm2 = new ChatMistralAI({\n",
"// Bind the tool to the model\n",
"const modelWithTool = new ChatMistralAI({\n",
" model: \"mistral-large-latest\",\n",
"}).bind({\n",
" tools: [calculatorTool],\n",
"});\n",
"\n",
"// Bind the tool to the model\n",
"const modelWithTool2 = llm2.bind({\n",
" tools: [calculatorTool2],\n",
"});\n",
"\n",
"const prompt2 = ChatPromptTemplate.fromMessages([\n",
"const calcToolPrompt = ChatPromptTemplate.fromMessages([\n",
" [\n",
" \"system\",\n",
" \"You are a helpful assistant who always needs to use a calculator.\",\n",
@@ -309,12 +312,12 @@
"]);\n",
"\n",
"// Chain your prompt, model, and output parser together\n",
"const chain2 = prompt2.pipe(modelWithTool2);\n",
"const chainWithCalcTool = calcToolPrompt.pipe(modelWithTool);\n",
"\n",
"const response2 = await chain2.invoke({\n",
"const calcToolRes = await chainWithCalcTool.invoke({\n",
" input: \"What is 2 + 2?\",\n",
"});\n",
"console.log(response2.tool_calls);"
"console.log(calcToolRes.tool_calls);"
]
},
{
@@ -337,7 +340,7 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": 3,
"id": "a8638d82",
"metadata": {},
"outputs": [
@@ -354,7 +357,7 @@
"import { ChatPromptTemplate } from \"@langchain/core/prompts\";\n",
"import { z } from \"zod\";\n",
"\n",
"const calculatorSchema3 = z\n",
"const calculatorSchemaForWSO = z\n",
" .object({\n",
" operation: z\n",
" .enum([\"add\", \"subtract\", \"multiply\", \"divide\"])\n",
@@ -364,14 +367,16 @@
" })\n",
" .describe(\"A simple calculator tool\");\n",
"\n",
"const llm3 = new ChatMistralAI({\n",
"const llmForWSO = new ChatMistralAI({\n",
" model: \"mistral-large-latest\",\n",
"});\n",
"})\n",
"\n",
"// Pass the schema and tool name to the withStructuredOutput method\n",
"const modelWithTool3 = llm3.withStructuredOutput(calculatorSchema3);\n",
"const modelWithStructuredOutput = llmForWSO.withStructuredOutput(calculatorSchemaForWSO, {\n",
" name: \"calculator\",\n",
"});\n",
"\n",
"const prompt3 = ChatPromptTemplate.fromMessages([\n",
"const promptForWSO = ChatPromptTemplate.fromMessages([\n",
" [\n",
" \"system\",\n",
" \"You are a helpful assistant who always needs to use a calculator.\",\n",
@@ -380,12 +385,12 @@
"]);\n",
"\n",
"// Chain your prompt and model together\n",
"const chain3 = prompt3.pipe(modelWithTool3);\n",
"const chainWSO = promptForWSO.pipe(modelWithStructuredOutput);\n",
"\n",
"const response3 = await chain3.invoke({\n",
"const responseWSO = await chainWSO.invoke({\n",
" input: \"What is 2 + 2?\",\n",
"});\n",
"console.log(response3);"
"console.log(responseWSO);"
]
},
{
@@ -398,7 +403,7 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 4,
"id": "9786b41a",
"metadata": {},
"outputs": [
@@ -416,14 +421,14 @@
" name: 'calculator',\n",
" args: { operation: 'add', number1: 2, number2: 2 },\n",
" type: 'tool_call',\n",
" id: 'qVxKofNLR'\n",
" id: 'w48T6Nc3d'\n",
" }\n",
" ],\n",
" invalid_tool_calls: [],\n",
" additional_kwargs: {\n",
" tool_calls: [\n",
" {\n",
" id: 'qVxKofNLR',\n",
" id: 'w48T6Nc3d',\n",
" function: {\n",
" name: 'calculator',\n",
" arguments: '{\"operation\": \"add\", \"number1\": 2, \"number2\": 2}'\n",
@@ -441,7 +446,7 @@
" additional_kwargs: {\n",
" tool_calls: [\n",
" {\n",
" id: 'qVxKofNLR',\n",
" id: 'w48T6Nc3d',\n",
" function: {\n",
" name: 'calculator',\n",
" arguments: '{\"operation\": \"add\", \"number1\": 2, \"number2\": 2}'\n",
@@ -460,7 +465,7 @@
" name: 'calculator',\n",
" args: { operation: 'add', number1: 2, number2: 2 },\n",
" type: 'tool_call',\n",
" id: 'qVxKofNLR'\n",
" id: 'w48T6Nc3d'\n",
" }\n",
" ],\n",
" invalid_tool_calls: [],\n",
@@ -472,16 +477,16 @@
}
],
"source": [
"const includeRawModel3 = llm3.withStructuredOutput(calculatorSchema3, {\n",
"const includeRawModel = llmForWSO.withStructuredOutput(calculatorSchemaForWSO, {\n",
" name: \"calculator\",\n",
" includeRaw: true,\n",
"});\n",
"const includeRawChain3 = prompt3.pipe(includeRawModel3);\n",
"const includeRawChain = promptForWSO.pipe(includeRawModel);\n",
"\n",
"const includeRawResponse3 = await includeRawChain3.invoke({\n",
"const includeRawResponse = await includeRawChain.invoke({\n",
" input: \"What is 2 + 2?\",\n",
"});\n",
"console.dir(includeRawResponse3, { depth: null });"
"console.dir(includeRawResponse, { depth: null });"
]
},
{
@@ -494,7 +499,7 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 5,
"id": "9f1dc9bd",
"metadata": {},
"outputs": [
@@ -510,7 +515,7 @@
"import { ChatMistralAI } from \"@langchain/mistralai\";\n",
"import { ChatPromptTemplate } from \"@langchain/core/prompts\";\n",
"\n",
"const calculatorJsonSchema4 = {\n",
"const calculatorJsonSchema = {\n",
" type: \"object\",\n",
" properties: {\n",
" operation: {\n",
@@ -528,14 +533,14 @@
" description: \"A simple calculator tool\",\n",
"};\n",
"\n",
"const llm4 = new ChatMistralAI({\n",
"const llmForJsonSchema = new ChatMistralAI({\n",
" model: \"mistral-large-latest\",\n",
"});\n",
"\n",
"// Pass the schema and tool name to the withStructuredOutput method\n",
"const modelWithTool4 = llm4.withStructuredOutput(calculatorJsonSchema4);\n",
"const modelWithJsonSchemaTool = llmForJsonSchema.withStructuredOutput(calculatorJsonSchema);\n",
"\n",
"const prompt4 = ChatPromptTemplate.fromMessages([\n",
"const promptForJsonSchema = ChatPromptTemplate.fromMessages([\n",
" [\n",
" \"system\",\n",
" \"You are a helpful assistant who always needs to use a calculator.\",\n",
@@ -544,12 +549,12 @@
"]);\n",
"\n",
"// Chain your prompt and model together\n",
"const chain4 = prompt4.pipe(modelWithTool4);\n",
"const chainWithJsonSchema = promptForJsonSchema.pipe(modelWithJsonSchemaTool);\n",
"\n",
"const response4 = await chain4.invoke({\n",
"const responseFromJsonSchema = await chainWithJsonSchema.invoke({\n",
" input: \"What is 2 + 2?\",\n",
"});\n",
"console.log(response4);\n"
"console.log(responseFromJsonSchema);\n"
]
},
{
@@ -565,15 +570,15 @@
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 6,
"id": "76bd0061",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The weather in Paris is 28 °C.\n"
"It's 28 °C in Paris.\n"
]
}
],
@@ -585,21 +590,21 @@
"\n",
"import { ChatPromptTemplate } from \"@langchain/core/prompts\";\n",
"\n",
"const llm5 = new ChatMistralAI({\n",
"const llmForAgent = new ChatMistralAI({\n",
" temperature: 0,\n",
" model: \"mistral-large-latest\",\n",
"});\n",
"\n",
"// Prompt template must have \"input\" and \"agent_scratchpad input variables\"\n",
"const prompt5 = ChatPromptTemplate.fromMessages([\n",
"const agentPrompt = ChatPromptTemplate.fromMessages([\n",
" [\"system\", \"You are a helpful assistant\"],\n",
" [\"placeholder\", \"{chat_history}\"],\n",
" [\"human\", \"{input}\"],\n",
" [\"placeholder\", \"{agent_scratchpad}\"],\n",
"]);\n",
"\n",
"// Mocked tool\n",
"const currentWeatherTool5 = tool(async () => \"28 °C\", {\n",
"const currentWeatherToolForAgent = tool(async () => \"28 °C\", {\n",
" name: \"get_current_weather\",\n",
" description: \"Get the current weather in a given location\",\n",
" schema: z.object({\n",
@@ -608,20 +613,20 @@
"});\n",
"\n",
"const agent = createToolCallingAgent({\n",
" llm: llm5,\n",
" tools: [currentWeatherTool5],\n",
" prompt: prompt5,\n",
" llm: llmForAgent,\n",
" tools: [currentWeatherToolForAgent],\n",
" prompt: agentPrompt,\n",
"});\n",
"\n",
"const agentExecutor = new AgentExecutor({\n",
" agent,\n",
" tools: [currentWeatherTool5],\n",
" tools: [currentWeatherToolForAgent],\n",
"});\n",
"\n",
"const input = \"What's the weather like in Paris?\";\n",
"const { output } = await agentExecutor.invoke({ input });\n",
"const agentInput = \"What's the weather like in Paris?\";\n",
"const agentRes = await agentExecutor.invoke({ input: agentInput });\n",
"\n",
"console.log(output);\n"
"console.log(agentRes.output);\n"
]
},
{