
Conversation

theoniko (Contributor)

Fixes #4282
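
For context, the referenced deprecation warning points to RunnableSequence composition (the prompt | llm pipe) as the replacement for LLMChain. A minimal, self-contained sketch of that pattern; the prompt text and model choice below are illustrative, not the repository's actual code:

    from langchain_core.prompts import ChatPromptTemplate
    from langchain_openai import ChatOpenAI

    # Illustrative prompt and model; the real code uses its own templates and LLM.
    prompt = ChatPromptTemplate.from_template("Summarize this patch:\n\n{patch}")
    llm = ChatOpenAI(model="gpt-4o-mini")

    # Deprecated:  chain = LLMChain(llm=llm, prompt=prompt)
    # Replacement: compose Runnables with the pipe operator (a RunnableSequence).
    chain = prompt | llm

    result = chain.invoke({"patch": "diff --git a/foo.py b/foo.py ..."})
    print(result.content)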

Diff lines under review:

    verbose=verbose,

    self.further_context_chain = RunnableSequence(
        [further_context_prompt, self.llm], verbose=verbose
    )
theoniko (Contributor Author)

Can verbose be used here?

Member

I don't think so; you could check the documentation and test it.
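
For reference, RunnableSequence does not accept a verbose keyword argument; verbosity is instead switched on process-wide through langchain.globals. A minimal sketch, assuming the chain is rebuilt with the LCEL pipe operator and reusing the class's existing further_context_prompt and self.llm:

    # Sketch only; imagine this inside the class's __init__, where
    # further_context_prompt and self.llm are already defined.
    from langchain.globals import set_verbose

    set_verbose(True)  # global switch, replaces the per-chain verbose= flag

    # LCEL equivalent of LLMChain(prompt=further_context_prompt, llm=self.llm):
    self.further_context_chain = further_context_prompt | self.llm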

Further down the diff:

        formatted_patch,
    )

    output_further_info = self.further_info_chain.invoke(
theoniko (Contributor Author), Feb 23, 2025

Is it returning a string, JSON or something else?

Member

One way to determine this is by running it.
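
For what it's worth, the return type changes with the migration: invoking a prompt | chat_model sequence yields the model's message object rather than the dict-style output LLMChain produced, and a StrOutputParser can be appended when a plain string is wanted. A rough sketch, assuming self.llm is a chat model; the "patch" input key and further_info_prompt name are illustrative:

    from langchain_core.output_parsers import StrOutputParser

    # With a chat model and no output parser, .invoke() returns an AIMessage;
    # the generated text is on .content, not in a {"text": ...} dict.
    # The "patch" key is illustrative; use the prompt's actual input variable.
    output_further_info = self.further_info_chain.invoke({"patch": formatted_patch})
    print(type(output_further_info).__name__)  # e.g. "AIMessage"
    print(output_further_info.content)

    # Appending a parser would make the chain return a plain str instead:
    # self.further_info_chain = further_info_prompt | self.llm | StrOutputParser()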

suhaibmujahid (Member)

@theoniko, are you still interested in following up on this PR?


Labels: None yet

Projects: None yet

Development

Successfully merging this pull request may close these issues.

LangChainDeprecationWarning: The class LLMChain was deprecated

2 participants