prompt
The site has this example code for sending ChatML messages:
from openai import OpenAI
from os import getenv

# reads the API key from the environment variable OPENROUTER_API_KEY
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=getenv("OPENROUTER_API_KEY"),
)

completion = client.chat.completions.create(
    model="anthropic/claude-3-opus:beta",
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        },
    ],
)
print(completion.choices[0].message.content)
But I can't find how to send a simple prompt (with no instruct formatting) in Python and get the response streamed.
Related:
I think it was changed to messages now, and to send a traditional prompt, you send it with the "system" role:
from openai import OpenAI

client = OpenAI()

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

print(completion.choices[0].message)
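That still doesn't cover the streaming half of the question, though. Here is a minimal sketch, assuming OpenRouter honors the OpenAI SDK's standard stream=True flag (with stream=True the create call returns an iterator of chunks, and the incremental text arrives in each chunk's delta.content):

from openai import OpenAI
from os import getenv

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=getenv("OPENROUTER_API_KEY"),
)

# stream=True makes the SDK return an iterator of chunks instead of a single response
stream = client.chat.completions.create(
    model="anthropic/claude-3-opus:beta",
    messages=[{"role": "user", "content": "Say this is a test"}],
    stream=True,
)

for chunk in stream:
    # each chunk carries an incremental delta; content can be None on some chunks
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()

For a truly raw prompt with no chat formatting, the SDK also still exposes the legacy text-completions resource (client.completions.create(model=..., prompt=..., stream=True)); whether a given OpenRouter model accepts that endpoint is an assumption you'd want to verify against their docs.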