
embeddings: remove TEI embeddings client. #396

Merged
merged 67 commits into from
Dec 9, 2023
Changes from 61 commits
Commits
67 commits
e16b777
vectorstores: Add remote chroma "vectorstore" adapter (#282)
noodnik2 Sep 29, 2023
adf0f6c
vectorstore: openai apikey is must required only when embedder is nil…
bjwswang Oct 14, 2023
41f2824
bugfix: openai fix model option (#319)
Saerdna Oct 16, 2023
cd7ebe8
build(deps): bump postcss from 8.4.24 to 8.4.31 in /docs (#309)
dependabot[bot] Oct 16, 2023
5620c5b
deps: update golang.org/x/net
tmc Oct 16, 2023
a02d4fd
examples: sync to main
tmc Oct 16, 2023
949349d
vectorstores: disable debug in chromadb client (#324)
bjwswang Oct 17, 2023
c636b3d
chains: added constitutional chain. refactored few shot prompt (#316)
AMK9978 Oct 19, 2023
b33244e
build(deps): bump @babel/traverse from 7.22.5 to 7.23.2 in /docs (#326)
dependabot[bot] Oct 20, 2023
370d210
examples: Sync and update deps
tmc Oct 28, 2023
ce0bbea
ci: Pin golangci-lint to 1.54 (#340)
tmc Oct 28, 2023
6d09939
Update README.md (#338)
tmc Oct 28, 2023
b3845c9
build(deps): bump google.golang.org/grpc from 1.57.0 to 1.57.1 (#330)
dependabot[bot] Oct 28, 2023
db76786
Adding ollama support (#327)
CyrilPeponnet Oct 28, 2023
5f44515
example: Sync
tmc Oct 28, 2023
be6f87d
example: Sync
tmc Oct 28, 2023
440d668
examples: Include full path in examples
tmc Oct 28, 2023
e3f64b1
docs: Add docs publishing, include Ollama quickstart guide. (#342)
tmc Oct 28, 2023
0ded982
docs: Minor edits to ollama guide
tmc Oct 29, 2023
eaf002b
docs: Simplify quickstart material
tmc Oct 29, 2023
34b4fdc
readme: Add link to docs site
tmc Oct 29, 2023
8e0c552
docs: Minor tweak
tmc Oct 29, 2023
139e263
Update docusaurus.config.js
tmc Oct 29, 2023
04a7ba0
ci: Pin golangci-lint to 1.55
Ismail14098 Oct 29, 2023
12c41dc
fix: golangci-lint v1.55 errors
Ismail14098 Oct 29, 2023
0b9c047
Merge pull request #343 from Ismail14098/main
FluffyKebab Oct 29, 2023
0fb9f07
make: Update golangci-lint version in make helper
tmc Oct 30, 2023
2aaf433
callbacks: handle streaming func (#347)
Struki84 Nov 7, 2023
168acd7
examples: Add gpt4 turbo example (#349)
tmc Nov 7, 2023
a85bad2
serpapi: remove unnecessary replacement from url (#344)
Ismail14098 Nov 7, 2023
0cff34d
agents: add parser error handler (#345)
FluffyKebab Nov 7, 2023
09a09b3
llms: add enum for ernie 4.0 API path (#350)
leehaoze Nov 10, 2023
65725eb
embedding/openai: fix method-dependent embedding discrepancy (#357)
eliben Nov 18, 2023
26bd994
embeddings: refactor EmbedderClient interface to reduce code duplicat…
eliben Nov 20, 2023
22f9723
embeddings: use EmbedderClient for ollama embeddings (#360)
eliben Nov 21, 2023
ce67575
embeddings: use EmbedderClient for openai embeddings (#359)
eliben Nov 21, 2023
551b595
embeddings: remove unnecessary mostly-duplicated implementations (#362)
eliben Nov 21, 2023
386e181
embeddings: use EmbedderClient for ernie embeddings (#363)
eliben Nov 21, 2023
2eb6f54
examples/ernie-completion-examples: make this example a separate modu…
eliben Nov 22, 2023
7889a0f
examples/ernie-completion-examples: make this example a separate modu…
eliben Nov 25, 2023
365efd2
github: Run docs only on main
tmc Nov 25, 2023
e59b72f
readme: Fix discord link
tmc Nov 25, 2023
51a3a0a
build(deps): bump axios, @docusaurus/core and @docusaurus/preset-clas…
dependabot[bot] Nov 25, 2023
ace6dd6
embeddings: refactor embedding interfaces to enable creation from Cha…
eliben Nov 30, 2023
80874c4
examples: move go.work creation to root (#378)
eliben Nov 30, 2023
2567a2f
LLM: Add support for Ernie chat_completions (#355)
sxk10812139 Nov 30, 2023
b52d04e
vectorstores: add Milvus (#352)
pattonjp Nov 30, 2023
fc423fa
examples: update examples (#380)
tmc Nov 30, 2023
bfb0d75
examples: fix milvus example to not ask to run ollama (#381)
tmc Nov 30, 2023
9983a59
embeddings: TEI input truncation (enhancement) (#384)
pattonjp Nov 30, 2023
98fa24d
Simplify milvus example (#383)
pattonjp Nov 30, 2023
00bd62f
embeddings: use NewEmbedder for OpenAI embeddings (#385)
eliben Dec 1, 2023
2bb1ca0
examples/pinecone-vectorstore-example: use new embedder (#386)
eliben Dec 1, 2023
008d109
Update README.md (#387)
tmc Dec 1, 2023
db67bc5
embeddings: new NewEmbedder for Ollama embeddings
eliben Dec 1, 2023
ebe0f43
Merge pull request #390 from tmc/fixollamaemb
eliben Dec 1, 2023
4182ba4
embeddings: use NewEmbedder for Vertex embeddings (#391)
eliben Dec 1, 2023
bef6d7e
embeddings: use NewEmbedder for Ernie embeddings (#392)
eliben Dec 1, 2023
9508a34
embeddings: add example of proper usage, and tweak doc.go accordingly…
eliben Dec 1, 2023
2ac3643
embeddings: factor to new embedder client pattern
pattonjp Dec 4, 2023
da040f2
lint errors
pattonjp Dec 4, 2023
ddf6d5c
merge latest changes
pattonjp Dec 8, 2023
2a45fb2
fix missed conflict
pattonjp Dec 8, 2023
a9142b9
remove tei embeddings
pattonjp Dec 8, 2023
0651d8a
mod tidy
pattonjp Dec 8, 2023
16e0cbd
rollback wokflow change
pattonjp Dec 8, 2023
12f90a3
removing rebase artifact
pattonjp Dec 8, 2023
3 changes: 2 additions & 1 deletion .github/workflows/ci.yaml
Original file line number Diff line number Diff line change
@@ -18,9 +18,10 @@ jobs:
cache: false
- uses: actions/checkout@v3
- name: golangci-lint
uses: golangci/golangci-lint-action@v3.6.0
uses: golangci/golangci-lint-action@v3.7.0
with:
args: --timeout=4m
version: v1.55.1
build-examples:
runs-on: ubuntu-latest
steps:
47 changes: 47 additions & 0 deletions .github/workflows/publish-docs.yaml
@@ -0,0 +1,47 @@
name: Deploy to GitHub Pages

on:
push: main

permissions:
contents: read
pages: write
id-token: write

concurrency:
group: "pages"
cancel-in-progress: false

jobs:
# Single deploy job since we're just deploying
deploy:
environment:
name: github-pages
url: ${{ steps.deployment.outputs.page_url }}
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: 18
cache: yarn
cache-dependency-path: 'docs/package-lock.json'

- name: Install dependencies
working-directory: docs
run: yarn install --frozen-lockfile
- name: Build website
working-directory: docs
run: yarn build

- name: Setup Pages
uses: actions/configure-pages@v3
- name: Upload artifact
uses: actions/upload-pages-artifact@v2
with:
# Upload entire repository
path: 'docs/build'
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@v2
14 changes: 13 additions & 1 deletion Makefile
@@ -7,7 +7,7 @@ test:
go test ./...

.PHONY: lint
lint:
lint: lint-deps
golangci-lint run --color=always --sort-results ./...

.PHONY: lint-exp
@@ -22,6 +22,13 @@ lint-fix:
lint-all:
golangci-lint run --color=always --sort-results ./...

.PHONY: lint-deps
lint-deps:
@command -v golangci-lint >/dev/null 2>&1 || { \
echo >&2 "golangci-lint not found. Installing..."; \
go install github.com/golangci/golangci-lint/cmd/golangci-lint@v1.55.1; \
}

.PHONY: test-race
test-race:
go run test -race ./...
@@ -45,3 +52,8 @@ clean-lint-cache:
build-examples:
for example in $(shell find ./examples -mindepth 1 -maxdepth 1 -type d); do \
(cd $$example; echo Build $$example; go mod tidy; go build -o /dev/null) || exit 1; done

.PHONY: add-go-work
add-go-work:
go work init .
go work use -r .
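The new `lint-deps` target relies on a common shell idiom: probe for a binary with `command -v` and run the install step only when it is missing, so `make lint` becomes self-bootstrapping. A minimal standalone sketch of that guard (the probed tool name below is a placeholder; the real target installs golangci-lint v1.55.1, and the install line is left commented to keep the sketch side-effect free):

```shell
# Install-if-missing guard, in the style of the lint-deps target.
# "$1" is the command to probe for.
ensure_tool() {
  command -v "$1" >/dev/null 2>&1 || {
    echo >&2 "$1 not found. Installing..."
    # go install "github.com/golangci/golangci-lint/cmd/golangci-lint@v1.55.1"
  }
}

ensure_tool definitely-not-a-real-tool-xyz
echo "guard finished"
```

Because the probe and the message go to stderr, `make` output stays clean when the tool is already installed.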
12 changes: 9 additions & 3 deletions README.md
@@ -2,17 +2,22 @@

[![go.dev reference](https://img.shields.io/badge/go.dev-reference-007d9c?logo=go&logoColor=white&style=flat-square)](https://pkg.go.dev/github.com/tmc/langchaingo)
[![scorecard](https://goreportcard.com/badge/github.com/tmc/langchaingo)](https://goreportcard.com/report/github.com/tmc/langchaingo)
[![](https://dcbadge.vercel.app/api/server/2NgDkQDjpQ?compact=true&style=flat)](https://discord.gg/2NgDkQDjpQ)
[![Open in Dev Containers](https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/tmc/langchaingo)
[<img src="https://github.com/codespaces/badge.svg" title="Open in Github Codespace" width="150" height="20">](https://codespaces.new/tmc/langchaingo)

⚡ Building applications with LLMs through composability ⚡
⚡ Building applications with LLMs through composability, with Go!

## 🤔 What is this?

This is the Go language implementation of LangChain.

## 📖 Documentation

- [Documentation Site](https://tmc.github.io/langchaingo/docs/)
- [API Reference](https://pkg.go.dev/github.com/tmc/langchaingo)


## 🎉 Examples

See [./examples](./examples) for example usage.
@@ -51,5 +56,6 @@ Socktastic!

Here are some links to blog posts and articles on using Langchain Go:

- [Creating a simple ChatGPT clone with Go](https://sausheong.com/creating-a-simple-chatgpt-clone-with-go-c40b4bec9267?sk=53a2bcf4ce3b0cfae1a4c26897c0deb0)
- [Creating a ChatGPT Clone that Runs on Your Laptop with Go](https://sausheong.com/creating-a-chatgpt-clone-that-runs-on-your-laptop-with-go-bf9d41f1cf88?sk=05dc67b60fdac6effb1aca84dd2d654e)
- [Using Ollama with LangChainGo](https://eli.thegreenplace.net/2023/using-ollama-with-langchaingo/) - Nov 2023
- [Creating a simple ChatGPT clone with Go](https://sausheong.com/creating-a-simple-chatgpt-clone-with-go-c40b4bec9267?sk=53a2bcf4ce3b0cfae1a4c26897c0deb0) - Aug 2023
- [Creating a ChatGPT Clone that Runs on Your Laptop with Go](https://sausheong.com/creating-a-chatgpt-clone-that-runs-on-your-laptop-with-go-bf9d41f1cf88?sk=05dc67b60fdac6effb1aca84dd2d654e) - Aug 2023
26 changes: 21 additions & 5 deletions agents/conversational.go
@@ -6,6 +6,7 @@ import (
"regexp"
"strings"

"github.com/tmc/langchaingo/callbacks"
"github.com/tmc/langchaingo/chains"
"github.com/tmc/langchaingo/llms"
"github.com/tmc/langchaingo/schema"
@@ -31,6 +32,8 @@ type ConversationalAgent struct {
Tools []tools.Tool
// Output key is the key where the final output is placed.
OutputKey string
// CallbacksHandler is the handler for callbacks.
CallbacksHandler callbacks.Handler
}

var _ Agent = (*ConversationalAgent)(nil)
@@ -42,9 +45,10 @@ func NewConversationalAgent(llm llms.LanguageModel, tools []tools.Tool, opts ...
}

return &ConversationalAgent{
Chain: chains.NewLLMChain(llm, options.getConversationalPrompt(tools)),
Tools: tools,
OutputKey: options.outputKey,
Chain: chains.NewLLMChain(llm, options.getConversationalPrompt(tools)),
Tools: tools,
OutputKey: options.outputKey,
CallbacksHandler: options.callbacksHandler,
}
}

@@ -61,11 +65,21 @@ func (a *ConversationalAgent) Plan(

fullInputs["agent_scratchpad"] = constructScratchPad(intermediateSteps)

var stream func(ctx context.Context, chunk []byte) error

if a.CallbacksHandler != nil {
stream = func(ctx context.Context, chunk []byte) error {
a.CallbacksHandler.HandleStreamingFunc(ctx, chunk)
return nil
}
}

output, err := chains.Predict(
ctx,
a.Chain,
fullInputs,
chains.WithStopWords([]string{"\nObservation:", "\n\tObservation:"}),
chains.WithStreamingFunc(stream),
)
if err != nil {
return nil, nil, err
@@ -110,12 +124,14 @@ func (a *ConversationalAgent) parseOutput(output string) ([]schema.AgentAction,
if strings.Contains(output, _conversationalFinalAnswerAction) {
splits := strings.Split(output, _conversationalFinalAnswerAction)

return nil, &schema.AgentFinish{
finishAction := &schema.AgentFinish{
ReturnValues: map[string]any{
a.OutputKey: splits[len(splits)-1],
},
Log: output,
}, nil
}

return nil, finishAction, nil
}

r := regexp.MustCompile(`Action: (.*?)[\n]*Action Input: (.*)`)
2 changes: 1 addition & 1 deletion agents/conversational_test.go
@@ -19,7 +19,7 @@ func TestConversationalWithMemory(t *testing.T) {
t.Skip("OPENAI_API_KEY not set")
}

llm, err := openai.New()
llm, err := openai.New(openai.WithModel("gpt-4"))
require.NoError(t, err)

executor, err := Initialize(
19 changes: 18 additions & 1 deletion agents/errors.go
@@ -17,7 +17,24 @@ var (

// ErrUnableToParseOutput is returned if the output of the llm is unparsable.
ErrUnableToParseOutput = errors.New("unable to parse agent output")
// ErrInvalidChainReturnType is returned if the internal chain of the agent eturns a value in the
// ErrInvalidChainReturnType is returned if the internal chain of the agent returns a value in the
// "text" field that is not a string.
ErrInvalidChainReturnType = errors.New("agent chain did not return a string")
)

// ParserErrorHandler is the struct used to handle parse errors from the agent in the executor. If
// an executor has a ParserErrorHandler, parsing errors will be formatted using the formatter
// function and added as an observation. In the next executor step the agent then has the
// opportunity to fix the error.
type ParserErrorHandler struct {
// The formatter function can be used to format the parsing error. If nil the error will be given
// as an observation directly.
Formatter func(err string) string
}

// NewParserErrorHandler creates a new parser error handler.
func NewParserErrorHandler(formatFunc func(string) string) *ParserErrorHandler {
return &ParserErrorHandler{
Formatter: formatFunc,
}
}
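The handler's formatter is optional: when it is nil, the raw error string becomes the observation directly. A standalone sketch of that dispatch (the types below mimic the real `agents` types rather than import them, and the wrapped message is an illustrative example, not a library default):

```go
package main

import (
	"errors"
	"fmt"
)

// errUnableToParse mimics agents.ErrUnableToParseOutput.
var errUnableToParse = errors.New("unable to parse agent output")

// parserErrorHandler mirrors agents.ParserErrorHandler.
type parserErrorHandler struct {
	formatter func(err string) string
}

// observation applies the formatter when one is set, otherwise
// passes the error text through unchanged.
func (h *parserErrorHandler) observation(err error) string {
	if h.formatter == nil {
		return err.Error()
	}
	return h.formatter(err.Error())
}

func main() {
	plain := &parserErrorHandler{}
	wrapped := &parserErrorHandler{
		formatter: func(e string) string {
			return "Parsing failed (" + e + "); respond using the Action/Action Input format."
		},
	}
	fmt.Println(plain.observation(errUnableToParse))
	fmt.Println(wrapped.observation(errUnableToParse))
}
```

A formatter like the one above lets the next prompt tell the model *how* to recover, instead of echoing a bare error.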
59 changes: 45 additions & 14 deletions agents/executor.go
@@ -2,6 +2,7 @@ package agents

import (
"context"
"errors"
"fmt"
"strings"

@@ -19,6 +20,7 @@ type Executor struct {
Tools []tools.Tool
Memory schema.Memory
CallbacksHandler callbacks.Handler
ErrorHandler *ParserErrorHandler

MaxIterations int
ReturnIntermediateSteps bool
@@ -43,6 +45,7 @@ func NewExecutor(agent Agent, tools []tools.Tool, opts ...CreationOption) Execut
MaxIterations: options.maxIterations,
ReturnIntermediateSteps: options.returnIntermediateSteps,
CallbacksHandler: options.callbacksHandler,
ErrorHandler: options.errorHandler,
}
}

@@ -55,28 +58,56 @@ func (e Executor) Call(ctx context.Context, inputValues map[string]any, _ ...cha

steps := make([]schema.AgentStep, 0)
for i := 0; i < e.MaxIterations; i++ {
actions, finish, err := e.Agent.Plan(ctx, steps, inputs)
if err != nil {
return nil, err
var finish map[string]any
steps, finish, err = e.doIteration(ctx, steps, nameToTool, inputs)
if finish != nil || err != nil {
return finish, err
}
}

if len(actions) == 0 && finish == nil {
return nil, ErrAgentNoReturn
}
return e.getReturn(
&schema.AgentFinish{ReturnValues: make(map[string]any)},
steps,
), ErrNotFinished
}

if finish != nil {
return e.getReturn(finish, steps), nil
func (e Executor) doIteration(
ctx context.Context,
steps []schema.AgentStep,
nameToTool map[string]tools.Tool,
inputs map[string]string,
) ([]schema.AgentStep, map[string]any, error) {
actions, finish, err := e.Agent.Plan(ctx, steps, inputs)
if errors.Is(err, ErrUnableToParseOutput) && e.ErrorHandler != nil {
formattedObservation := err.Error()
if e.ErrorHandler.Formatter != nil {
formattedObservation = e.ErrorHandler.Formatter(formattedObservation)
}
steps = append(steps, schema.AgentStep{
Observation: formattedObservation,
})
return steps, nil, nil
}
if err != nil {
return steps, nil, err
}

if len(actions) == 0 && finish == nil {
return steps, nil, ErrAgentNoReturn
}

for _, action := range actions {
steps, err = e.doAction(ctx, steps, nameToTool, action)
if err != nil {
return nil, err
}
if finish != nil {
return steps, e.getReturn(finish, steps), nil
}

for _, action := range actions {
steps, err = e.doAction(ctx, steps, nameToTool, action)
if err != nil {
return steps, nil, err
}
}

return nil, ErrNotFinished
return steps, nil, nil
}

func (e Executor) doAction(
57 changes: 56 additions & 1 deletion agents/executor_test.go
@@ -10,11 +10,66 @@ import (
"github.com/tmc/langchaingo/agents"
"github.com/tmc/langchaingo/chains"
"github.com/tmc/langchaingo/llms/openai"
"github.com/tmc/langchaingo/schema"
"github.com/tmc/langchaingo/tools"
"github.com/tmc/langchaingo/tools/serpapi"
)

func TestMRKL(t *testing.T) {
type testAgent struct {
actions []schema.AgentAction
finish *schema.AgentFinish
err error
inputKeys []string
outputKeys []string

recordedIntermediateSteps []schema.AgentStep
recordedInputs map[string]string
numPlanCalls int
}

func (a *testAgent) Plan(
_ context.Context,
intermediateSteps []schema.AgentStep,
inputs map[string]string,
) ([]schema.AgentAction, *schema.AgentFinish, error) {
a.recordedIntermediateSteps = intermediateSteps
a.recordedInputs = inputs
a.numPlanCalls++

return a.actions, a.finish, a.err
}

func (a testAgent) GetInputKeys() []string {
return a.inputKeys
}

func (a testAgent) GetOutputKeys() []string {
return a.outputKeys
}

func TestExecutorWithErrorHandler(t *testing.T) {
t.Parallel()

a := &testAgent{
err: agents.ErrUnableToParseOutput,
}
executor := agents.NewExecutor(
a,
nil,
agents.WithMaxIterations(3),
agents.WithParserErrorHandler(agents.NewParserErrorHandler(nil)),
)

_, err := chains.Call(context.Background(), executor, nil)
require.ErrorIs(t, err, agents.ErrNotFinished)
require.Equal(t, 3, a.numPlanCalls)
require.Equal(t, []schema.AgentStep{
{Observation: agents.ErrUnableToParseOutput.Error()},
{Observation: agents.ErrUnableToParseOutput.Error()},
}, a.recordedIntermediateSteps)
}

func TestExecutorWithMRKLAgent(t *testing.T) {
t.Parallel()

if openaiKey := os.Getenv("OPENAI_API_KEY"); openaiKey == "" {