
Streaming response support #23

Open
lilactown opened this issue Jul 21, 2023 · 2 comments
Comments

@lilactown
Contributor

For large responses, it would be nice to provide a streaming API. I'm not sure the current request function API is ideal for this, though.

I'm fairly new to elisp, but perhaps the callback could be a lambda that is expected to be called multiple times with parts of the response, or a stream type that is common among elisp packages?
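To make the "lambda called multiple times" idea concrete, here is a minimal, self-contained sketch (the `my/feed-chunks` helper is hypothetical and not part of openai.el; it only simulates a driver delivering response fragments to a callback one at a time):

```elisp
;; Toy illustration of a streaming callback: a driver that feeds
;; response fragments to CALLBACK one at a time, the way a streaming
;; HTTP filter would.
(defun my/feed-chunks (chunks callback)
  "Call CALLBACK once per element of CHUNKS, simulating a streamed response."
  (dolist (chunk chunks)
    (funcall callback chunk)))

;; Usage: accumulate partial content as it "arrives".
(let ((acc ""))
  (my/feed-chunks '("Hel" "lo, " "world")
                  (lambda (part) (setq acc (concat acc part))))
  acc)  ;; => "Hello, world"
```

A real implementation would hook this into the HTTP client's process filter rather than a list, but the callback contract would be the same.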

@jcs090218
Member

I don't have the free quota anymore, so it will be hard for me to test this out.

I've never tried the stream option, so I'd have to guess. Can you try setting the stream option to true? I believe that's how users can call the API and receive data as a stream.
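For what it's worth, the flag is a boolean in the JSON request body, so from elisp it would be `:stream t`; `json-encode` serializes the elisp `t` as JSON `true` (a sketch assuming the payload is built with the stock `json` library, which is what the log below suggests):

```elisp
(require 'json)

;; How a :stream t flag ends up in the request body: json-encode
;; turns the elisp t into JSON true, not the string "True".
(json-encode '(("model" . "gpt-3.5-turbo")
               ("stream" . t)))
;; The result contains "stream":true
```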


@lilactown
Contributor Author

Setting :stream t results in the following messages:

[error] request--callback: JSON readtable error: 100
Internal error: 200

With (setq openai--show-log t):

[ERROR]: #s(request-response 200 nil nil (json-readtable-error 100) parse-error <REMOVED> nil (:error #<subr F616e6f6e796d6f75732d6c616d626461_anonymous_lambda_9> :type POST :params ((api-version . 2023-05-15)) :headers ((Content-Type . application/json) (api-key . <REMOVED>)) :data {"model":"gpt-3.5-turbo","messages":[{"role":"user","content":"How are you?"}],"stream":true} :parser json-read :complete #[128 \301�\302"A@\300�!\207 [(closure ((messages . [((role . user) (content . How are you?))]) t) (data) (let* ((old-dir default-directory) (buf (save-current-buffer (set-buffer (get-buffer-create *Chat*)) (prog1 (current-buffer) (kill-all-local-variables) (setq default-directory old-dir) (setq buffer-read-only nil) (setq buffer-file-name nil) (setq buffer-undo-list t) (let ((inhibit-read-only t) (inhibit-modification-hooks t)) (erase-buffer) (run-hooks 'temp-buffer-setup-hook))))) (standard-output buf)) (prog1 (progn (mapc #'(lambda (message) (let ((alist message)) (let ((.role (cdr (assq 'role alist))) (.content (cdr (assq 'content alist)))) (princ (format %s: %s
 .role (string-trim .content)))))) messages) (let ((choices (let ((alist data)) (let ((.choices (cdr (assq 'choices alist)))) .choices)))) (mapc #'(lambda (choice) (let ((alist choice)) (let ((.message (cdr (assq 'message alist)))) (let ((alist .message)) (let ((.role (cdr (assq 'role alist))) (.content (cdr (assq 'content alist)))) (princ (format %s: %s
 .role (string-trim .content)))))))) choices))) (internal-temp-output-buffer-show buf)))) plist-member :data] 4 

(fn &key DATA &allow-other-keys)] :url https://amperity-engineering.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-05-15 :response #0 :encoding utf-8) #<killed buffer> HTTP/2 200 
cache-control: no-cache, must-revalidate
content-type: text/event-stream
access-control-allow-origin: *
apim-request-id: ebb01177-a5bc-46a7-8874-50af3a78f55a
openai-model: gpt-35-turbo
x-content-type-options: nosniff
openai-processing-ms: 58.5459
x-ms-region: East US
x-accel-buffering: no
x-request-id: 5234a4b0-c1f3-4b00-9e43-923ba6fa0ad3
x-ms-client-request-id: ebb01177-a5bc-46a7-8874-50af3a78f55a
strict-transport-security: max-age=31536000; includeSubDomains; preload
azureml-model-session: aoai-gpt35-05242023
azureml-model-group: online
date: Fri, 21 Jul 2023 23:06:19 GMT
 nil curl)
Internal error: 200

Here's the function I'm using to test it:

(defun spiel--chat ()
  (let ((messages `[((role . "user")
                     (content . "How are you?"))]))
    (openai-chat
     messages
     (lambda (data)
       (with-output-to-temp-buffer "*Chat*"
         (mapc (lambda (message)
                 (let-alist message
                   (princ (format "%s: %s\n" .role (string-trim .content)))))
               messages)
         (let ((choices (let-alist data .choices)))
           (mapc (lambda (choice)
                   (let-alist choice
                     (let-alist .message
                       (princ (format "%s: %s\n" .role (string-trim .content))))))
                 choices))))
     :stream t
     :parameters '(("api-version" . "2023-05-15")))))
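Note the `content-type: text/event-stream` header in the log above: with streaming enabled the body is a series of `data: {...}` lines (server-sent events) rather than a single JSON object, so handing the whole body to `json-read` fails (100 is the character code of the `d` in `data:`). A minimal sketch of splitting such a body into JSON payloads (`my/parse-sse-body` is a hypothetical helper, not part of openai.el):

```elisp
(require 'json)

(defun my/parse-sse-body (body)
  "Split an SSE BODY into the parsed JSON payloads of its `data:' events.
Skips the [DONE] sentinel.  Returns the parsed objects, oldest first."
  (let (events)
    (dolist (line (split-string body "\n" t))
      (when (string-prefix-p "data: " line)
        (let ((payload (substring line (length "data: "))))
          (unless (string= payload "[DONE]")
            (push (json-read-from-string payload) events)))))
    (nreverse events)))

;; Usage with a body shaped like a streamed chat completion:
(my/parse-sse-body
 "data: {\"choices\":[{\"delta\":{\"content\":\"Hi\"}}]}\n\ndata: [DONE]\n")
```

A streaming-aware client would run something like this from the process filter as chunks arrive, buffering any partial trailing line until the next chunk completes it.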
