Fix multiline LLM output syntax error for dynamic flow generation #748
This PR adds one more demonstration to the sample conversation default prompts of v2, showing multiline text output with the `bot say` flow. Before this PR, any dynamic flow generation with LLMs that needed more than one line of output resulted in a syntax error.

Example Colang application:
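A minimal Colang 2.0 sketch of the kind of application affected, assuming the standard `activate llm continuation` pattern from the Colang 2.0 standard library (the flow names and messages here are illustrative, not the PR's exact example):

```colang
import core
import llm

flow main
  # Unhandled user input is passed to the LLM, which generates
  # a new Colang flow dynamically at runtime.
  activate llm continuation

# Before this PR, a generated flow with a multiline bot response
# could contain an invalid statement such as:
#
#   bot say "An old silent pond
#   A frog jumps into the pond"
#
# With the added demonstration, the LLM is steered to emit the
# line breaks as \n inside a single, valid statement:
#
#   bot say "An old silent pond\nA frog jumps into the pond"
```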
Example application trace:
By adding one more sample conversation that demonstrates how multiline outputs should be concatenated with `\n`, the aforementioned problem is mostly fixed. An alternative workaround would be post-processing the LLM-generated flow; however, I think it is better to rely on the LLM itself to generate valid Colang flows.
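For reference, the added demonstration follows the sample-conversation format used by the v2 default prompts; the exact wording is in the PR diff, but it is roughly of this shape (the user request and bot reply below are hypothetical):

```colang
# user action: user said "Can you write a two-line poem?"
# user intent: user requested a poem
# bot intent: bot respond with poem
# bot action: bot say "Roses are red\nViolets are blue"
```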