Update perplexity how-to #204

Open · wants to merge 4 commits into `main`
2 changes: 1 addition & 1 deletion plugins/perplexity-online-search/.codeblocks/block_0.sh
```diff
@@ -4,7 +4,7 @@ curl --request POST \
   --header 'content-type: application/json' \
   --data '
 {
-  "model": "pplx-70b-online",
+  "model": "llama-3.1-sonar-large-128k-online",
   "messages": [
     {
       "role": "system",
```
```diff
@@ -1,5 +1,5 @@
 {
-  "model": "pplx-70b-online",
+  "model": "llama-3.1-sonar-large-128k-online",
   "messages": [
     {
       "role": "system",
```
8 changes: 4 additions & 4 deletions plugins/perplexity-online-search/README.md
@@ -43,7 +43,7 @@ Based on the needs of this use case, we should build a **Custom RAG Query**.

There’s only 1 API needed to build this use case. If we look at Perplexity’s API reference, there’s only one endpoint: [Chat Completions](https://docs.perplexity.ai/reference/post_chat_completions).

```diff
-Based on the [Supported Models](https://docs.perplexity.ai/docs/model-cards#online-llms) documentation, we want to do an online search, so we should use either `pplx-7b-online` or `pplx-70b-online`. We should be careful about [pricing](https://docs.perplexity.ai/docs/pricing) for these models.
+Based on the [Supported Models](https://docs.perplexity.ai/guides/model-cards) documentation, we want to do an online search, so we should use either `llama-3.1-sonar-small-128k-online` or `llama-3.1-sonar-large-128k-online`. We should be careful about [pricing](https://docs.perplexity.ai/docs/pricing) for these models.
```
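As a side note for reviewers, the request body the README's curl snippets send can be sketched in Python. This is a minimal illustration, not part of the plugin: the endpoint URL and updated model name come from the Perplexity docs linked above, while the helper name, the system prompt, and the example question are made up for this sketch.

```python
import json

# Endpoint from the Perplexity API reference linked in the README.
API_URL = "https://api.perplexity.ai/chat/completions"

def build_payload(question: str,
                  model: str = "llama-3.1-sonar-large-128k-online") -> dict:
    """Build the JSON body for a Chat Completions request (hypothetical helper)."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "Be precise and concise."},
            {"role": "user", "content": question},
        ],
    }

payload = build_payload("How many stars are there in our galaxy?")
print(json.dumps(payload, indent=2))
```

Actually sending it would add an `authorization: Bearer <API key>` header, as the curl snippet in this diff does.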

# Prerequisites

```diff
@@ -75,7 +75,7 @@
   --header 'content-type: application/json' \
   --data '
 {
-  "model": "pplx-70b-online",
+  "model": "llama-3.1-sonar-large-128k-online",
   "messages": [
     {
       "role": "system",
```
````diff
@@ -96,7 +96,7 @@
 
 ```json
 {
-  "model": "pplx-70b-online",
+  "model": "llama-3.1-sonar-large-128k-online",
   "messages": [
     {
       "role": "system",
```
````
```diff
@@ -167,4 +167,4 @@ Note: It could take a couple minutes before your flow shows up in your copilot.
 
 # Congratulations!
 
-You just added Perplexity Online Search results to your Copilot! Check out our other guides for inspiration on what to build next.
\ No newline at end of file
+You just added Perplexity Online Search results to your Copilot! Check out our other guides for inspiration on what to build next.
```