diff --git a/_posts/2024-08-05-Using-SLM-with-Sidecar.md b/_posts/2024-08-05-Using-SLM-with-Sidecar.md
index ef06e671e..fddc5cb34 100644
--- a/_posts/2024-08-05-Using-SLM-with-Sidecar.md
+++ b/_posts/2024-08-05-Using-SLM-with-Sidecar.md
@@ -131,6 +131,7 @@ app.post("/api/generate", (req, res) => {
 3. **Verify the deployment**
 
    Once your deployment is complete, you can browse to your application URL and see the chat frontend.
 
+   ![Website output]({{site.baseurl}}/media/2024/08/phi-output.jpg)
 
 *Note: Since we are deploying a language model, please be aware that the application might take a little longer to start up the first time. This delay is due to the initial setup and loading of the Phi-3 model, which ensures that it is ready to handle requests efficiently. Subsequent startups should be faster once the model is properly initialized.*
diff --git a/media/2024/08/phi-output.jpg b/media/2024/08/phi-output.jpg
new file mode 100644
index 000000000..91d56b25a
Binary files /dev/null and b/media/2024/08/phi-output.jpg differ