
Feature request: setting seed parameter of OpenAI's chat completions API #85

haukelicht opened this issue Feb 14, 2024 · 1 comment

@haukelicht

Thank you for creating and maintaining this awesome project!

OpenAI recently introduced the seed parameter to make their models' text generation and chat completion behavior (more) reproducible (see https://cookbook.openai.com/examples/reproducible_outputs_with_the_seed_parameter).

I think it would be great if you could enable users of your package to control this parameter when using OpenAI models as a backend (i.e., in the files here: https://github.com/iryna-kondr/scikit-llm/tree/main/skllm/models/gpt).

The seed parameter could be hard-coded, similar to how temperature=0.0 is already set in the current call:

temperature=0.0, messages=messages, **model_dict

Alternatively, users could pass seed=<SEED> via **kwargs.
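For illustration, forwarding an optional seed through to the completion call might look like the sketch below. The function and variable names (`build_completion_kwargs`, `model_dict`) are hypothetical and not scikit-llm's actual internals; the sketch only shows how a user-supplied `seed` could be merged with the existing `temperature=0.0` default:

```python
def build_completion_kwargs(messages, model_dict, seed=None):
    """Assemble keyword arguments for client.chat.completions.create().

    Hypothetical helper: merges the existing defaults with an optional
    seed, which OpenAI uses for best-effort deterministic sampling.
    """
    kwargs = dict(temperature=0.0, messages=messages, **model_dict)
    if seed is not None:
        kwargs["seed"] = seed  # only included when the user asks for it
    return kwargs


# Example: a caller passing seed=42 via **kwargs
kwargs = build_completion_kwargs(
    messages=[{"role": "user", "content": "Classify: great movie"}],
    model_dict={"model": "gpt-3.5-turbo"},
    seed=42,
)
print(kwargs["seed"])  # → 42
```

Leaving `seed` out of the call entirely when it is `None` keeps the request payload identical to today's behavior for users who don't opt in.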

@OKUA1 (Collaborator) commented Feb 14, 2024

Hello @haukelicht,

Thank you for your suggestion. Do you know whether the seed parameter plays any role when the temperature is set to 0? I have to admit I have not dug deeper into this topic, but I always had the impression that the seed is only relevant when the temperature is positive; otherwise the model is already (almost) deterministic.

But again, I might be completely wrong here.
