Commit

docs[patch],examples[patch]: Add tooltip with note about unified model params (langchain-ai#5042)

* docs[patch],examples[patch]: Add tooltip with note about unified model params

* chore: lint files

* revert moving back to model name changes

* more

* chore: lint files
bracesproul authored Apr 10, 2024
1 parent 0623819 commit 1c270f4
Showing 29 changed files with 595 additions and 493 deletions.
4 changes: 4 additions & 0 deletions docs/core_docs/docs/expression_language/get_started.mdx
@@ -27,6 +27,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/openai @langchain/community
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

<CodeBlock language="typescript">{BasicExample}</CodeBlock>

:::tip
4 changes: 4 additions & 0 deletions docs/core_docs/docs/get_started/quickstart.mdx
@@ -54,6 +54,10 @@ We will cover these at a high level, but keep in mind there is a lot more to eac

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

<Tabs groupId="preferredModel">
<TabItem value="openai" label="OpenAI" default>

4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/anthropic.mdx
@@ -21,6 +21,10 @@ Set it as an environment variable named `ANTHROPIC_API_KEY`, or pass it into the

## Usage

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

You can initialize an instance like this:

import CodeBlock from "@theme/CodeBlock";
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/anthropic_tools.mdx
@@ -27,6 +27,10 @@ npm install @langchain/anthropic

You can initialize this wrapper the same way you'd initialize a standard `ChatAnthropic` instance:

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

```typescript
import { ChatAnthropicTools } from "@langchain/anthropic/experimental";

4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/azure.mdx
@@ -29,6 +29,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/openai
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

<CodeBlock language="typescript">{AzureOpenAI}</CodeBlock>

If your instance is hosted under a domain other than the default `openai.azure.com`, you'll need to use the alternate `AZURE_OPENAI_BASE_PATH` environment variable.
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/bedrock.mdx
@@ -34,6 +34,10 @@ npm install @aws-crypto/sha256-js @smithy/protocol-http @smithy/signature-v4 @sm

## Usage

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

Currently, only Anthropic, Cohere, and Mistral models are supported with the chat model integration. For foundation models from AI21 or Amazon, see [the text generation Bedrock variant](/docs/integrations/llms/bedrock).

import CodeBlock from "@theme/CodeBlock";
@@ -25,4 +25,8 @@ npm install @langchain/cloudflare
import CodeBlock from "@theme/CodeBlock";
import Example from "@examples/models/chat/integration_cloudflare_workersai.ts";

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

<CodeBlock language="typescript">{Example}</CodeBlock>
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/fireworks.mdx
@@ -16,6 +16,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/community
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

import Fireworks from "@examples/models/chat/integration_fireworks.ts";

<CodeBlock language="typescript">{Fireworks}</CodeBlock>
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/google_generativeai.mdx
@@ -31,6 +31,10 @@ npm install @langchain/google-genai

## Usage

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

import GoogleGenerativeAI from "@examples/models/chat/googlegenerativeai.ts";

<CodeBlock language="typescript">{GoogleGenerativeAI}</CodeBlock>
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/google_palm.mdx
@@ -22,6 +22,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install google-auth-library @google-ai/generativelanguage @langchain/community
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

Create an **API key** from [Google MakerSuite](https://makersuite.google.com/app/apikey). You can then set
the key as `GOOGLE_PALM_API_KEY` environment variable or pass it as `apiKey` parameter while instantiating
the model.
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/google_vertex_ai.mdx
@@ -25,6 +25,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/google-vertexai
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

You should make sure the Vertex AI API is
enabled for the relevant project and that you've authenticated to
Google Cloud using one of these methods:
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/groq.mdx
@@ -20,6 +20,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/groq
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

## Usage

import ChatGroqExample from "@examples/models/chat/chat_groq.ts";
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/minimax.mdx
@@ -22,6 +22,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/community
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

## Basic usage

import Minimax from "@examples/models/chat/integration_minimax.ts";
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/mistral.mdx
@@ -37,6 +37,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/mistralai
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

## Usage

When sending chat messages to Mistral, there are a few requirements to follow:
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/openai.mdx
@@ -18,6 +18,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/openai
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

<CodeBlock language="typescript">{OpenAI}</CodeBlock>

If you're part of an organization, you can set `process.env.OPENAI_ORGANIZATION` with your OpenAI organization id, or pass it in as `organization` when
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/chat/togetherai.mdx
@@ -25,6 +25,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/community
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

import TogetherAI from "@examples/models/chat/integration_togetherai.ts";

<CodeBlock language="typescript">{TogetherAI}</CodeBlock>
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/llms/azure.mdx
@@ -16,6 +16,10 @@ You'll first need to install the [`@langchain/azure-openai`](https://www.npmjs.c
npm install -S @langchain/azure-openai
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

You'll also need to have an Azure OpenAI instance deployed. You can deploy a version on Azure Portal following [this guide](https://learn.microsoft.com/azure/ai-services/openai/how-to/create-resource?pivots=web-portal).

Once you have your instance running, make sure you have the endpoint and key. You can find them in the Azure Portal, under the "Keys and Endpoint" section of your instance.
@@ -6,6 +6,10 @@ Here's an example of calling a HuggingFaceInference model as an LLM:
npm install @huggingface/inference@2
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

```typescript
import { HuggingFaceInference } from "langchain/llms/hf";

4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/llms/openai.mdx
@@ -10,6 +10,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/openai
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

```typescript
import { OpenAI } from "@langchain/openai";

@@ -22,6 +22,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install -S @langchain/azure-openai
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

You'll also need to have an Azure OpenAI instance deployed. You can deploy a version on Azure Portal following [this guide](https://learn.microsoft.com/azure/ai-services/openai/how-to/create-resource?pivots=web-portal).

Once you have your instance running, make sure you have the endpoint and key. You can find them in the Azure Portal, under the "Keys and Endpoint" section of your instance.
4 changes: 4 additions & 0 deletions docs/core_docs/docs/integrations/text_embedding/bedrock.mdx
@@ -24,6 +24,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/community
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

## Usage

The `BedrockEmbeddings` class uses the AWS Bedrock API to generate embeddings for a given text. It strips new line characters from the text as recommended.
@@ -0,0 +1,3 @@
:::tip
We're unifying model params across all packages. We now suggest using `model` instead of `modelName`, and `apiKey` for API keys.
:::
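The rename the tooltip describes can be sketched in a few lines of TypeScript. This is an illustrative, self-contained sketch only, independent of any LangChain package: the `UnifiedParams` interface and `normalizeParams` helper below are hypothetical names, shown purely to demonstrate how a legacy `modelName` field maps onto the unified `model` name alongside `apiKey`.

```typescript
// Hypothetical sketch of unified model params: the legacy `modelName`
// field is still accepted, but is normalized onto the unified `model`
// name; API keys are passed via the unified `apiKey` field.
interface UnifiedParams {
  model?: string;
  apiKey?: string;
  /** @deprecated use `model` instead */
  modelName?: string;
}

function normalizeParams(params: UnifiedParams): { model?: string; apiKey?: string } {
  return {
    // Prefer the new unified field; fall back to the legacy one.
    model: params.model ?? params.modelName,
    apiKey: params.apiKey,
  };
}

// Old style still resolves, but the new style is what the docs now suggest:
const legacy = normalizeParams({ modelName: "gpt-3.5-turbo" });
const unified = normalizeParams({ model: "gpt-3.5-turbo", apiKey: "sk-..." });
console.log(legacy.model); // "gpt-3.5-turbo"
console.log(unified.model === legacy.model); // true
```

In the actual docs changes below, this simply means constructor calls are written as e.g. `{ model: "...", apiKey: "..." }` rather than the older provider-specific spellings.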
4 changes: 4 additions & 0 deletions docs/core_docs/docs/modules/model_io/chat/caching.mdx
@@ -9,6 +9,10 @@ LangChain provides an optional caching layer for chat models. This is useful for
It can save you money by reducing the number of API calls you make to the LLM provider, if you're often requesting the same completion multiple times.
It can speed up your application by reducing the number of API calls you make to the LLM provider.

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

import CodeBlock from "@theme/CodeBlock";

```typescript
2 changes: 1 addition & 1 deletion docs/core_docs/docs/modules/model_io/chat/custom_chat.mdx
@@ -15,7 +15,7 @@ You can also implement the following optional method:

- A `_streamResponseChunks` method that returns an `AsyncGenerator` and yields [`ChatGenerationChunks`](https://api.js.langchain.com/classes/langchain_core_outputs.ChatGenerationChunk.html). This allows the LLM to support streaming outputs.

- Lets implement a very simple custom chat model that just echoes back the first `n` characters of the input.
+ Let's implement a very simple custom chat model that just echoes back the first `n` characters of the input.

```typescript
import {
4 changes: 4 additions & 0 deletions docs/core_docs/docs/modules/model_io/chat/quick_start.mdx
@@ -14,6 +14,10 @@ import Tabs from "@theme/Tabs";
import TabItem from "@theme/TabItem";
import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

<Tabs groupId="preferredModel">
<TabItem value="openai" label="OpenAI" default>

4 changes: 4 additions & 0 deletions docs/core_docs/docs/modules/model_io/chat/streaming.mdx
@@ -22,6 +22,10 @@ import IntegrationInstallTooltip from "@mdx_components/integration_install_toolt
npm install @langchain/openai
```

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

<CodeBlock language="typescript">{StreamMethodExample}</CodeBlock>

For models that do not support streaming, the entire response will be returned as a single chunk.
4 changes: 4 additions & 0 deletions docs/core_docs/docs/modules/model_io/quick_start.mdx
@@ -16,6 +16,10 @@ import TabItem from "@theme/TabItem";
import CodeBlock from "@theme/CodeBlock";
import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";

import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx";

<UnifiedModelParamsTooltip></UnifiedModelParamsTooltip>

<Tabs groupId="preferredModel">
<TabItem value="openai" label="OpenAI" default>
