Streaming response support #23
I don't have the free quota anymore, so it will be hard for me to test it out. I never tried the code referenced at Line 46 in 2019408.
Here's the function I'm using to test it:

```elisp
(defun spiel--chat ()
  (let ((messages `[((role . "user")
                     (content . "How are you?"))]))
    (openai-chat
     messages
     (lambda (data)
       (with-output-to-temp-buffer "*Chat*"
         (mapc (lambda (message)
                 (let-alist message
                   (princ (format "%s: %s\n" .role (string-trim .content)))))
               messages)
         (let ((choices (let-alist data .choices)))
           (mapc (lambda (choice)
                   (let-alist choice
                     (let-alist .message
                       (princ (format "%s: %s\n" .role (string-trim .content))))))
                 choices))))
     :stream t
     :parameters '(("api-version" . "2023-05-15")))))
```
For large responses, it would be nice to provide a streaming API. I'm not sure the current request function's API is ideal for this, though.
I'm fairly new to elisp, but perhaps it could be either a lambda that is expected to be called multiple times with parts of the response, or a stream type that is common among elisp packages?
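To make the first suggestion concrete, here's a minimal sketch of a callback invoked once per fragment. It bypasses openai.el's request machinery entirely and talks to the endpoint with a raw curl subprocess, so everything here (the names `my/openai-chat-stream` and `my/openai-api-key`, the use of curl, the hardcoded model) is hypothetical and not part of the package. It relies only on the documented behavior of the OpenAI API with `"stream": true`: the server replies with server-sent events, one `data: {...}` line per fragment, terminated by `data: [DONE]`.

```elisp
;;; -*- lexical-binding: t; -*-
(require 'json)
(require 'subr-x)

(defvar my/openai-api-key nil
  "Hypothetical place to put the API key for this sketch.")

(defun my/openai-chat-stream (messages on-delta &optional on-done)
  "Stream a chat completion for MESSAGES via a raw curl subprocess.
ON-DELTA is called with each content fragment as it arrives;
ON-DONE, if non-nil, is called once after the final event."
  (let ((payload (json-encode `((model . "gpt-3.5-turbo")
                                (stream . t)
                                (messages . ,messages))))
        (pending ""))  ; buffers partial SSE lines across chunks
    (make-process
     :name "openai-stream"
     :command (list "curl" "-sN" "https://api.openai.com/v1/chat/completions"
                    "-H" "Content-Type: application/json"
                    "-H" (concat "Authorization: Bearer " my/openai-api-key)
                    "-d" payload)
     :filter
     (lambda (_proc chunk)
       (setq pending (concat pending chunk))
       ;; Consume only complete lines; keep any partial line for later.
       (while (string-match "\\(.*\\)\n" pending)
         (let ((line (match-string 1 pending)))
           (setq pending (substring pending (match-end 0)))
           (when (string-prefix-p "data: " line)
             (let ((data (substring line 6)))
               (if (string= data "[DONE]")
                   (when on-done (funcall on-done))
                 (let-alist (json-parse-string data :object-type 'alist)
                   ;; Each event carries a fragment in
                   ;; choices[0].delta.content (absent on role/finish
                   ;; events, hence the guards).
                   (when-let* ((choice (and (vectorp .choices)
                                            (> (length .choices) 0)
                                            (aref .choices 0)))
                               (delta (alist-get 'delta choice))
                               (text (alist-get 'content delta)))
                     (funcall on-delta text))))))))))))
```

Example use, appending fragments to a buffer as they arrive:

```elisp
(my/openai-chat-stream
 [((role . "user") (content . "How are you?"))]
 (lambda (text)
   (with-current-buffer (get-buffer-create "*Chat*")
     (goto-char (point-max))
     (insert text)))
 (lambda () (message "Stream finished.")))
```

The same callback shape could presumably be grafted onto `openai-chat` itself, e.g. by having the existing callback invoked once per event when `:stream t` is set, or by adding a separate per-fragment callback argument.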