
docs: Update LlamaCPP install instructions (#7029)
jacoblee93 authored Oct 19, 2024
1 parent 3df59ce commit 270b447
Showing 3 changed files with 6 additions and 6 deletions.
docs/core_docs/docs/integrations/chat/llama_cpp.mdx (4 changes: 2 additions & 2 deletions)

@@ -12,14 +12,14 @@

## Setup

-You'll need to install the [node-llama-cpp](https://github.com/withcatai/node-llama-cpp) module to communicate with your local model.
+You'll need to install major version `2` of the [node-llama-cpp](https://github.com/withcatai/node-llama-cpp) module to communicate with your local model.

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";

<IntegrationInstallTooltip></IntegrationInstallTooltip>

```bash npm2yarn
-npm install -S node-llama-cpp @langchain/community @langchain/core
+npm install -S node-llama-cpp@2 @langchain/community @langchain/core
```

You will also need a local Llama 2 model (or a model supported by [node-llama-cpp](https://github.com/withcatai/node-llama-cpp)). You will need to pass the path to this model to the LlamaCpp module as a part of the parameters (see example).
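A note on the pinned version (an editorial aside, not part of the commit): the `@2` suffix in `npm install -S node-llama-cpp@2` is an npm semver range asking for the latest published release in the `2.x` line. After installing, the saved `package.json` entry would look roughly like this (the exact patch version depends on what the registry resolves at install time):

```json
{
  "dependencies": {
    "node-llama-cpp": "^2.0.0"
  }
}
```

The caret range keeps future installs within major version 2, which is what the updated docs require.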
docs/core_docs/docs/integrations/llms/llama_cpp.mdx (4 changes: 2 additions & 2 deletions)

@@ -12,10 +12,10 @@

## Setup

-You'll need to install the [node-llama-cpp](https://github.com/withcatai/node-llama-cpp) module to communicate with your local model.
+You'll need to install major version `2` of the [node-llama-cpp](https://github.com/withcatai/node-llama-cpp) module to communicate with your local model.

```bash npm2yarn
-npm install -S node-llama-cpp
+npm install -S node-llama-cpp@2
```

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
docs/core_docs/docs/integrations/text_embedding/llama_cpp.mdx (4 changes: 2 additions & 2 deletions)

@@ -12,10 +12,10 @@

## Setup

-You'll need to install the [node-llama-cpp](https://github.com/withcatai/node-llama-cpp) module to communicate with your local model.
+You'll need to install major version `2` of the [node-llama-cpp](https://github.com/withcatai/node-llama-cpp) module to communicate with your local model.

```bash npm2yarn
-npm install -S node-llama-cpp
+npm install -S node-llama-cpp@2
```

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
