
Add o1 support #570

Closed
kevinburkesegment opened this issue Sep 12, 2024 · 8 comments
Labels
enhancement New feature or request

Comments

@kevinburkesegment

I believe the new models need to be added to default_models.py, along with some documentation.

I just tried a local patch, and the API reported that o1-preview does not exist or that I don't have access to it, so I'm not sure whether API access hasn't been enabled yet or our company just doesn't have it.

@Quantisan

You can add it locally with this configuration: https://llm.datasette.io/en/stable/openai-models.html#adding-more-openai-models

@kevinburkesegment
Author

The docs there are not quite right: you need to specify both model_name and model_id; without model_name I get errors.
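For what it's worth, an entry along these lines, with both keys present, worked for me (a minimal sketch based on the linked docs; adjust the model name as needed):

```yaml
# extra-openai-models.yaml
- model_id: o1-preview
  model_name: o1-preview
```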

@kevinburkesegment
Author

kevinburkesegment commented Sep 12, 2024

OK, it's there for me now; however, I'm getting this: "Error: Error code: 400 - {'error': {'message': "Unsupported value: 'stream' does not support true with this model. Only the default (false) value is supported.", 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}}"

So it seems you need to pass --no-stream to get any output. I wonder if there is a way to make that the default.

@Quantisan

The docs there are not quite right: you need to specify both model_name and model_id; without model_name I get errors.

Good catch! I submitted a PR to fix the doc.

@simonw simonw added the enhancement New feature or request label Sep 12, 2024
@simonw
Owner

simonw commented Sep 12, 2024

So it seems you need to pass --no-stream to get any output. I wonder if there is a way to make that the default.

I think we need a new model option for setting can_stream = False.
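Roughly the shape of it, as an illustrative sketch (the class and function names here are hypothetical, not llm's actual internals):

```python
# Hypothetical sketch: a can_stream attribute on a model class that
# forces stream=False in the request, regardless of what the caller asked for.
class Model:
    can_stream = True  # most chat models support streaming responses

class O1Preview(Model):
    can_stream = False  # o1 models reject stream=true

def build_request_kwargs(model, stream_requested=True):
    # Silently fall back to a non-streaming request when the model
    # does not support streaming.
    return {"stream": stream_requested and model.can_stream}

print(build_request_kwargs(O1Preview()))  # {'stream': False}
print(build_request_kwargs(Model()))      # {'stream': True}
```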

simonw added a commit that referenced this issue Sep 12, 2024
@simonw
Owner

simonw commented Sep 12, 2024

I've published a branch with this implemented, but I can't test it myself yet as I don't have a tier 5 API key.

Could someone else test this for me? You can install the branch like this:

pip install https://github.com/simonw/llm/archive/refs/heads/openai-o1.zip
# Or maybe even this:
llm install https://github.com/simonw/llm/archive/refs/heads/openai-o1.zip

Then:

llm -m o1-preview 'hello to o1-preview'
llm -m o1-mini 'hello to o1-preview'

If someone reports those running without incident, I'll ship a release with them.

@Quantisan

$ llm -m o1-mini 'hello to o1-preview'
Hello! It sounds like you're referring to "o1-preview." Could you provide a bit more context or let me know how I can assist you with it?

$ llm -m o1-preview 'hello to o1-preview'
Hello! How can I assist you today?

@simonw your branch works!

@simonw
Owner

simonw commented Sep 12, 2024

Merged that, about to release it.


3 participants