
Commit

Logical and grammatical error in prompt_llm_parser.ipynb (shorthills-ai#95)

* Update modal.py

feat: Raise KeyError when 'prompt' key is missing in JSON response

This commit updates the error handling in the code to raise a KeyError when
the 'prompt' key is not found in the JSON response. This makes the nature of
the failure explicit and easier to debug.

* prompt_llm_parser.ipynb

---------

Co-authored-by: Aashish Saini <141953346+ShorthillsAI@users.noreply.github.com>
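
The modal.py change described in the message above is not part of the diff shown below. As a rough illustration only, here is a minimal sketch of that error handling, assuming the response is parsed from JSON into a Python dict inside a hypothetical parse_prompt helper (the actual modal.py code is not included in this commit):

    import json

    def parse_prompt(raw_response: str) -> str:
        """Extract the 'prompt' field from a JSON response string."""
        data = json.loads(raw_response)
        if "prompt" not in data:
            # Raise KeyError explicitly so callers see which key was missing,
            # instead of failing later with a less informative error.
            raise KeyError("'prompt' key is missing in the JSON response")
        return data["prompt"]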
2 people authored and AkshayTripathiShorthillsAI committed Jan 16, 2024
1 parent e976bcf commit 942fe98
Showing 1 changed file with 1 addition and 1 deletion.
@@ -30,7 +30,7 @@
"source": [
"## PromptTemplate + LLM\n",
"\n",
"The simplest composition is just combining a prompt and model to create a chain that takes user input, adds it to a prompt, passes it to a model, and returns the raw model output.\n",
"The simplest composition is just combing a prompt and model to create a chain that takes user input, adds it to a prompt, passes it to a model, and returns the raw model output.\n",
"\n",
"Note, you can mix and match PromptTemplate/ChatPromptTemplates and LLMs/ChatModels as you like here."
]
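
For context, the composition described in the changed markdown cell is LangChain's pipe-style chaining of a prompt template into a model. The notebook's code cells are not part of this diff; the following is a minimal sketch of that composition, under the assumption that the langchain package (as of early 2024) is installed and an OpenAI API key is configured:

    from langchain.prompts import ChatPromptTemplate
    from langchain.chat_models import ChatOpenAI

    # Prompt template that fills in the user's input via the {topic} variable.
    prompt = ChatPromptTemplate.from_template("Tell me a short joke about {topic}")
    model = ChatOpenAI()

    # Piping the prompt into the model builds a chain:
    # input dict -> formatted prompt -> model -> raw model output.
    chain = prompt | model
    print(chain.invoke({"topic": "bears"}))

Swapping ChatPromptTemplate for PromptTemplate, or the chat model for a plain LLM class, follows the same pattern, which is what the "mix and match" note in the cell refers to.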
