scripts[minor],docs[minor]: Add LLM integration doc template and updated OpenAI LLM doc #6327

Merged (3 commits, Aug 1, 2024)
260 changes: 260 additions & 0 deletions docs/core_docs/docs/integrations/llms/openai.ipynb
@@ -0,0 +1,260 @@
{
"cells": [
{
"cell_type": "raw",
"id": "67db2992",
"metadata": {},
"source": [
"---\n",
"sidebar_label: OpenAI\n",
"---"
]
},
{
"cell_type": "markdown",
"id": "9597802c",
"metadata": {},
"source": [
"# OpenAI\n",
"\n",
"```{=mdx}\n",
"\n",
":::caution\n",
"You are currently on a page documenting the use of OpenAI [text completion models](/docs/concepts/#llms). The latest and most popular OpenAI models are [chat completion models](/docs/concepts/#chat-models).\n",
"\n",
"Unless you are specifically using `gpt-3.5-turbo-instruct`, you are probably looking for [this page instead](/docs/integrations/chat/openai/).\n",
":::\n",
"\n",
"```\n",
"\n",
"This will help you get started with OpenAI completion models (LLMs) using LangChain. For detailed documentation on `OpenAI` features and configuration options, please refer to the [API reference](https://api.js.langchain.com/classes/langchain_openai.OpenAI.html).\n",
"\n",
"## Overview\n",
"### Integration details\n",
"\n",
"| Class | Package | Local | Serializable | [PY support](https://python.langchain.com/docs/integrations/llms/openai) | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
"| [OpenAI](https://api.js.langchain.com/classes/langchain_openai.OpenAI.html) | [@langchain/openai](https://api.js.langchain.com/modules/langchain_openai.html) | ❌ | ✅ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/openai?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/openai?style=flat-square&label=%20&) |\n",
"\n",
"## Setup\n",
"\n",
"To access OpenAI models you'll need to create an OpenAI account, get an API key, and install the `@langchain/openai` integration package.\n",
"\n",
"### Credentials\n",
"\n",
"Head to [platform.openai.com](https://platform.openai.com/) to sign up to OpenAI and generate an API key. Once you've done this set the `OPENAI_API_KEY` environment variable:\n",
"\n",
"```bash\n",
"export OPENAI_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
"\n",
"```bash\n",
"# export LANGCHAIN_TRACING_V2=\"true\"\n",
"# export LANGCHAIN_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"### Installation\n",
"\n",
"The LangChain OpenAI integration lives in the `@langchain/openai` package:\n",
"\n",
"```{=mdx}\n",
"import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
"import Npm2Yarn from \"@theme/Npm2Yarn\";\n",
"\n",
"<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
"\n",
"<Npm2Yarn>\n",
" @langchain/openai\n",
"</Npm2Yarn>\n",
"\n",
"```"
]
},
{
"cell_type": "markdown",
"id": "0a760037",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"Now we can instantiate our model object and generate completions:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "a0562a13",
"metadata": {},
"outputs": [],
"source": [
"import { OpenAI } from \"@langchain/openai\"\n",
"\n",
"const llm = new OpenAI({\n",
" model: \"gpt-3.5-turbo-instruct\",\n",
" temperature: 0,\n",
" maxTokens: undefined,\n",
" timeout: undefined,\n",
" maxRetries: 2,\n",
" apiKey: process.env.OPENAI_API_KEY,\n",
" // other params...\n",
"})"
]
},
{
"cell_type": "markdown",
"id": "0ee90032",
"metadata": {},
"source": [
"## Invocation"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "035dea0f",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"develops and promotes friendly AI for the benefit of humanity. It was founded in 2015 by Elon Musk, Sam Altman, Greg Brockman, Ilya Sutskever, Wojciech Zaremba, John Schulman, and Chris Olah. The company's mission is to create and promote artificial general intelligence (AGI) that is safe and beneficial to humanity.\n",
"\n",
"OpenAI conducts research in various areas of AI, including deep learning, reinforcement learning, robotics, and natural language processing. The company also develops and releases open-source tools and platforms for AI research, such as the GPT-3 language model and the Gym toolkit for reinforcement learning.\n",
"\n",
"One of the main goals of OpenAI is to ensure that the development of AI is aligned with human values and does not pose a threat to humanity. To this end, the company has established a set of principles for safe and ethical AI development, and it actively collaborates with other organizations and researchers in the field.\n",
"\n",
"OpenAI has received funding from various sources, including tech giants like Microsoft and Amazon, as well as individual investors. It has also partnered with companies and organizations such as Google, IBM, and the United Nations to advance its research and promote responsible AI development.\n",
"\n",
"In addition to its research and development\n"
]
}
],
"source": [
"const inputText = \"OpenAI is an AI company that \"\n",
"\n",
"const completion = await llm.invoke(inputText)\n",
"completion"
]
},
{
"cell_type": "markdown",
"id": "add38532",
"metadata": {},
"source": [
"## Chaining\n",
"\n",
"We can [chain](/docs/how_to/sequence/) our completion model with a prompt template like so:"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "078e9db2",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"Ich liebe Programmieren.\n"
]
}
],
"source": [
"import { PromptTemplate } from \"@langchain/core/prompts\"\n",
"\n",
"const prompt = new PromptTemplate({\n",
" template: \"How to say {input} in {output_language}:\\n\",\n",
" inputVariables: [\"input\", \"output_language\"],\n",
"})\n",
"\n",
"const chain = prompt.pipe(llm);\n",
"await chain.invoke(\n",
" {\n",
" output_language: \"German\",\n",
" input: \"I love programming.\",\n",
" }\n",
")"
]
},
{
"cell_type": "markdown",
"id": "e99eef30",
"metadata": {},
"source": [
"If you're part of an organization, you can set `process.env.OPENAI_ORGANIZATION` to your OpenAI organization id, or pass it in as `organization` when\n",
"initializing the model (see the sketch after the custom URL example below).\n",
"\n",
"## Custom URLs\n",
"\n",
"You can customize the base URL the SDK sends requests to by passing a `configuration` parameter like this:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d958ab00",
"metadata": {},
"outputs": [],
"source": [
"const llmCustomURL = new OpenAI({\n",
" temperature: 0.9,\n",
" configuration: {\n",
" baseURL: \"https://your_custom_url.com\",\n",
" },\n",
"});"
]
},
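{
"cell_type": "markdown",
"id": "b3c1d8a2",
"metadata": {},
"source": [
"Below is a minimal sketch of the `organization` option described above. The id is a placeholder, and the model setting mirrors the instantiation example:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c4d2e9b3",
"metadata": {},
"outputs": [],
"source": [
"// Minimal sketch: passing an OpenAI organization id directly to the constructor.\n",
"// \"org-your-organization-id\" is a placeholder, not a real id.\n",
"const llmWithOrg = new OpenAI({\n",
"  model: \"gpt-3.5-turbo-instruct\",\n",
"  organization: \"org-your-organization-id\",\n",
"});"
]
},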
{
"cell_type": "markdown",
"id": "81a5e2ea",
"metadata": {},
"source": [
"You can also pass other `ClientOptions` parameters accepted by the official SDK.\n",
"\n",
"If you are hosting on Azure OpenAI, see the [dedicated page instead](/docs/integrations/llms/azure).\n"
]
},
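{
"cell_type": "markdown",
"id": "d5e3f0c4",
"metadata": {},
"source": [
"As a minimal sketch of passing another `ClientOptions` field through `configuration` (`defaultHeaders` is assumed from the official SDK, and the header shown is an arbitrary placeholder):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e6f4a1d5",
"metadata": {},
"outputs": [],
"source": [
"// Sketch under assumptions: defaultHeaders is a ClientOptions field from the official\n",
"// openai SDK; the header name and value below are placeholders.\n",
"const llmWithHeaders = new OpenAI({\n",
"  temperature: 0.9,\n",
"  configuration: {\n",
"    defaultHeaders: {\n",
"      \"X-Example-Header\": \"placeholder-value\",\n",
"    },\n",
"  },\n",
"});"
]
},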
{
"cell_type": "markdown",
"id": "e9bdfcef",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"For detailed documentation of all OpenAI features and configurations head to the API reference: https://api.js.langchain.com/classes/langchain_openai.OpenAI.html"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "TypeScript",
"language": "typescript",
"name": "tslab"
},
"language_info": {
"codemirror_mode": {
"mode": "typescript",
"name": "javascript",
"typescript": true
},
"file_extension": ".ts",
"mimetype": "text/typescript",
"name": "typescript",
"version": "3.7.2"
},
"vscode": {
"interpreter": {
"hash": "e971737741ff4ec9aff7dc6155a1060a59a8a6d52c757dbbe66bf8ee389494b1"
}
}
},
"nbformat": 4,
"nbformat_minor": 5
}
55 changes: 0 additions & 55 deletions docs/core_docs/docs/integrations/llms/openai.mdx

This file was deleted.

9 changes: 0 additions & 9 deletions libs/langchain-scripts/src/cli/docs/document_loaders.ts
@@ -96,15 +96,6 @@ async function promptExtraFields(): Promise<ExtraFields> {
};
}

/**
* Will always be community. We care about:
* Module name
* file name
* Is web loader (boolean)
* Is node only (boolean)
* Does it run locally (boolean)
*/

export async function fillDocLoaderIntegrationDocTemplate(fields: {
packageName: string;
moduleName: string;
10 changes: 9 additions & 1 deletion libs/langchain-scripts/src/cli/docs/index.ts
@@ -4,6 +4,7 @@
import { Command } from "commander";
import { fillChatIntegrationDocTemplate } from "./chat.js";
import { fillDocLoaderIntegrationDocTemplate } from "./document_loaders.js";
import { fillLLMIntegrationDocTemplate } from "./llms.js";

type CLIInput = {
package: string;
@@ -49,9 +50,16 @@ async function main() {
moduleName,
});
break;
case "llm":
await fillLLMIntegrationDocTemplate({
packageName,
moduleName,
isCommunity,
});
break;
default:
console.error(
`Invalid type: ${type}.\nOnly 'chat' and 'doc_loader' are supported at this time.`
`Invalid type: ${type}.\nOnly 'chat', 'llm' and 'doc_loader' are supported at this time.`
);
process.exit(1);
}
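The new `llm` branch forwards the parsed CLI fields to `fillLLMIntegrationDocTemplate`, mirroring the existing `chat` and `doc_loader` cases. As a rough sketch of that call when invoked directly (the values below are illustrative placeholders, not taken from this PR):

```typescript
// Sketch only: the field names come from the diff above; the values are placeholders.
import { fillLLMIntegrationDocTemplate } from "./llms.js";

await fillLLMIntegrationDocTemplate({
  packageName: "@langchain/openai", // package the generated doc targets (placeholder)
  moduleName: "OpenAI",             // exported class to document (placeholder)
  isCommunity: false,               // presumably whether the integration lives in @langchain/community
});
```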