
Gradio Streaming with Chat Interface with OpenAI: Expecting value: line 1 column 1 (char 0) #5371

Closed
1 task done
XariZaru opened this issue Aug 29, 2023 · 4 comments
Labels
bug Something isn't working

Comments

@XariZaru

Describe the bug

My Gradio app was working fine before I switched it to streaming. I am working on an OpenAI streaming chatbot concept and it keeps giving me an error. I've also set up my proxy configuration, since I work behind one.

Every time I send a message, an error pops up: Expecting value: line 1 column 1 (char 0)
Nothing is printed in the console either.
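
(For reference, this is the message Python's json module raises when it is handed an empty or non-JSON body, which suggests the client received something other than JSON back from the server:)

import json

json.loads("")  # json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)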

Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Reproduction

This requires an OpenAI subscription and an LLM that streams a response object. It could also be reproduced with any other asynchronous request.

import os

import gradio as gr
from llama_index import StorageContext, load_index_from_storage

# I work behind a proxy, so configure it before anything makes outbound requests
proxy = '>>my proxy and port<<'
os.environ['http_proxy'] = proxy
os.environ['HTTP_PROXY'] = proxy
os.environ['https_proxy'] = proxy
os.environ['HTTPS_PROXY'] = proxy

# Load the previously persisted index and wrap it in a streaming chat engine
sc = StorageContext.from_defaults(persist_dir='demo')
index = load_index_from_storage(sc, 'vector_index')
chat_engine = index.as_chat_engine(verbose=True, streaming=True, similarity_top_k=1)

def chat(input_text, history):
    response = chat_engine.stream_chat(input_text)
    partial = ""
    for text in response.response_gen:
        # accumulate the streamed tokens and yield the running answer
        partial += text
        yield partial

gr.ChatInterface(chat).queue().launch(share=False)

Screenshot

No response

Logs

No response

System Info

Operating System: Windows
gradio version: 3.41.2
gradio_client version: 0.5.0

Severity

Blocking usage of gradio

@XariZaru XariZaru added the bug Something isn't working label Aug 29, 2023
@abidlabs
Member

Hi @XariZaru, can you try a simpler example using queuing/streaming? For example, does this work for you?

import gradio as gr
import numpy as np
import time

# core fn: a generator that yields a random image {steps} times before yielding the final image
def fake_diffusion(steps):
    for _ in range(steps):
        time.sleep(1)
        image = np.random.random((600, 600, 3))
        yield image
    image = "https://gradio-builds.s3.amazonaws.com/diffusion_image/cute_dog.jpg"
    yield image


demo = gr.Interface(fake_diffusion, inputs=gr.Slider(1, 10, 3), outputs="image")

# define queue - required for generators
demo.queue()

demo.launch()

Also, it would be helpful to know whether this issue is related to your environment (with the proxies), or whether you see the same behavior when you run on Colab, for example.
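
A similarly minimal check that stays closer to the original report would be a generator-based gr.ChatInterface with no LLM and no proxy involved (a sketch only; the echo function here is just a stand-in for the real chat engine):

import time
import gradio as gr

# stand-in chat fn: streams the user's message back word by word
def echo_stream(message, history):
    partial = ""
    for word in message.split():
        time.sleep(0.2)
        partial += word + " "
        yield partial

gr.ChatInterface(echo_stream).queue().launch()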

@XariZaru
Author

I ran the above code in my Jupyter Notebook and the same error occurred. I will try in Colab next.

@XariZaru
Author

Hi, it works on Colab, just not locally.

@abidlabs
Member

Hi @XariZaru, the issue is likely that your proxy doesn't allow the websocket connections that are required for queuing/streaming. See #4243 for more information. I'll go ahead and close this issue, as it is almost certainly not a Gradio issue but rather a proxy-related one.
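
One workaround that sometimes helps in proxied environments (not verified for this setup, and only relevant if the proxy also intercepts traffic to the local server) is to exclude localhost from the proxy before launching:

import os

# Assumption: the corporate proxy is also intercepting requests to the local
# Gradio server; excluding localhost lets those requests bypass the proxy.
os.environ['no_proxy'] = 'localhost,127.0.0.1'
os.environ['NO_PROXY'] = 'localhost,127.0.0.1'

# then launch as before:
# gr.ChatInterface(chat).queue().launch(share=False)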
