refactor: unify response and agent response #930
Conversation
🦋 Changeset detected. Latest commit: 0899592. The changes in this PR will be included in the next version bump. This PR includes changesets to release 10 packages.
  sources: ToolOutput[];
};

export type AgentChatResponse<Options extends object> = {
AgentChatResponse is a better name than Response, but I assume most people are using normal chat engines, so we'd better improve the existing Response class (see above).
packages/core/src/Response.ts
Outdated
 * Response is the output of a LLM
 */
-export class Response {
+export class Response implements ChatResponse {
Idea: we include message and raw from ChatResponse instead of adding chatResponse: ChatResponse as an attribute to Response.
Note: as ChatResponse supports extension with Options, we have to add this to this PR (after we agree on the contract).
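A minimal sketch of that idea, using simplified stand-in types (the real llamaindex ChatResponse and ChatMessage shapes differ):

```typescript
// Hypothetical, simplified shapes for illustration only.
type ChatMessage = { role: "user" | "assistant"; content: string };

interface ChatResponse<Options extends object = object> {
  message: ChatMessage;
  raw: object | null;
  options?: Options;
}

// Response carries message/raw directly (implements ChatResponse)
// instead of wrapping a chatResponse: ChatResponse attribute.
class Response<Options extends object = object>
  implements ChatResponse<Options>
{
  constructor(
    public message: ChatMessage,
    public raw: object | null = null,
    public options?: Options,
  ) {}
}

const r = new Response({ role: "assistant", content: "hi" });
console.log(r.message.content);
```

With this shape, any code written against ChatResponse accepts a Response directly, with no unwrapping.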
Another advantage is that Response behaves like ChatResponse.
TBD: how to deal with chunks? We have ChatResponseChunks as the type returned by LLMs, but chat engines are just streaming Response objects.
Suggestion: we add an optional delta attribute to Response.
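A sketch of the delta suggestion, with a hypothetical streaming engine (names and shapes are illustrative, not the actual llamaindex API):

```typescript
type ChatMessage = { role: "user" | "assistant"; content: string };

class Response {
  constructor(
    public message: ChatMessage,
    // Set only on streamed chunks: the text this chunk added.
    public delta?: string,
  ) {}
}

// A chat engine could stream Response objects, each carrying the
// accumulated message plus the delta for that chunk:
function* streamResponse(chunks: string[]): Generator<Response> {
  let content = "";
  for (const chunk of chunks) {
    content += chunk;
    yield new Response({ role: "assistant", content }, chunk);
  }
}

const parts = [...streamResponse(["Hel", "lo"])];
console.log(parts[1].message.content, parts[1].delta);
```

Non-streaming callers simply ignore delta, so the same class serves both paths.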
packages/core/src/Response.ts
Outdated
 */
-export class Response {
+export class Response implements ChatResponse {
+  // @deprecated use 'message' instead
We tell the user not to use response; instead, migrate to message.
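One way to keep the old field working during the migration is a deprecated getter that forwards to the new field (a sketch, assuming a simplified message shape):

```typescript
type ChatMessage = { role: "user" | "assistant"; content: string };

class Response {
  constructor(public message: ChatMessage) {}

  /** @deprecated use 'message' instead */
  get response(): string {
    return this.message.content;
  }
}

const r = new Response({ role: "assistant", content: "42" });
// Old call sites keep working, but editors flag them as deprecated:
console.log(r.response);
// New call sites read the message directly:
console.log(r.message.content);
```

The @deprecated JSDoc tag makes TypeScript-aware editors strike through r.response at every remaining call site, which helps users find what to migrate.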
packages/core/src/Response.ts
Outdated
 * Response is the output of a LLM
 */
-export class Response {
+export class Response implements ChatResponse {
Could we avoid naming it Response? It's already a Web API.
How about EngineResponse (see new code)?
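The rename sidesteps the shadowing problem: a class named EngineResponse can coexist with the global Fetch API Response in the same module (a hypothetical sketch):

```typescript
type ChatMessage = { role: "user" | "assistant"; content: string };

// EngineResponse does not shadow the Fetch API's global Response.
class EngineResponse {
  constructor(public message: ChatMessage) {}
}

const engineRes = new EngineResponse({ role: "assistant", content: "ok" });

// The Web API Response stays reachable on globalThis, unshadowed
// (available as a global in Node 18+ and in browsers):
const webResponseExists = typeof (globalThis as any).Response === "function";
console.log(engineRes.message.content, webResponseExists);
```

With a class literally named Response exported from the package, any file importing it loses easy access to the built-in fetch Response; the rename avoids that footgun entirely.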
@@ -0,0 +1,5 @@
---
"llamaindex": minor
@himself65 minor, because the agent chat interface change is breaking
  message: "What is the weather in San Francisco?",
});
consola.debug("response:", response.message.content);

strictEqual(sources.length, 1);
@himself65 I removed sources (the tool output): 1. the name is confusing; 2. it can be retrieved with events.
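The event-based retrieval could look roughly like this. This is a generic illustration using Node's EventEmitter, not the actual llamaindex callback API, and the "tool-output" event name and ToolOutput shape are hypothetical:

```typescript
import { EventEmitter } from "node:events";

// Hypothetical shape of a tool output event payload.
type ToolOutput = { tool: string; output: string };

const events = new EventEmitter();

// Callers who still want tool outputs subscribe before the run:
const collected: ToolOutput[] = [];
events.on("tool-output", (out: ToolOutput) => collected.push(out));

// During an agent run, each tool call would emit its output:
events.emit("tool-output", { tool: "weather", output: "sunny" });

console.log(collected.length);
```

This keeps the response object lean while still letting interested callers observe every tool call as it happens.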
No description provided.