How to configure multiple LLM nodes in one flow? #2437
flashslothless started this conversation in General
Replies: 1 comment
-
You can set the Output of the LLM Chain node to Output Prediction, connect it to a prompt node's Format Prompt Values, and feed that prompt into another LLM Chain. That's the chain.
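For reference, the same pattern can be expressed in code. Flowise builds its nodes on LangChain components, so below is a minimal sketch using the Python LangChain library, purely to illustrate the underlying chaining pattern; the model choice, prompts, and variable names are illustrative placeholders, not Flowise configuration.

```python
# Minimal sketch: the first chain's prediction becomes the second chain's prompt input.
from langchain.chains import LLMChain, SimpleSequentialChain
from langchain.prompts import PromptTemplate
from langchain_openai import OpenAI  # any LangChain-compatible LLM works here

llm = OpenAI(temperature=0)

# First chain: query -> LLM -> intermediate output
first_prompt = PromptTemplate.from_template(
    "Answer the following question in one paragraph:\n{question}"
)
first_chain = LLMChain(llm=llm, prompt=first_prompt)

# Second chain: the first chain's output is formatted into this prompt
second_prompt = PromptTemplate.from_template(
    "Summarize the following answer as three bullet points:\n{answer}"
)
second_chain = LLMChain(llm=llm, prompt=second_prompt)

# SimpleSequentialChain pipes each chain's single output into the next
# chain's single input, i.e. query -> LLM -> output -> LLM -> output.
overall = SimpleSequentialChain(chains=[first_chain, second_chain], verbose=True)

print(overall.run("How does retrieval-augmented generation work?"))
```

On the Flowise canvas, this corresponds to wiring the first LLM Chain's Output Prediction into a Prompt Template's Format Prompt Values, then using that Prompt Template as the prompt of the second LLM Chain.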
-
In a complicated workflow, I need to configure more than one LLM node.
For example: query -> LLM -> output -> LLM -> output.
However, it looks like only one runnable node (such as LLMChain, Retrieval-QA Chain, etc.) can run.
How can I link several Chain objects to each other in a workflow?