
Clicking on tab (Select event) stacks events and doesn't respond #7189

Closed
1 task done
skye0402 opened this issue Jan 27, 2024 · 4 comments · Fixed by #7470
Labels
bug Something isn't working

Comments

skye0402 commented Jan 27, 2024

Describe the bug

I have 3 tabs in my gradio blocks app. When I select a tab, I want to change the visibility of 2 Accordion elements on the right side of the page. The tabs are on the left side.

def genai_tab_selected(state: dict):
    """ Handles the click on the Generative AI Playground """
    return [
        gr.Accordion(visible=True),
        gr.Accordion(visible=True),
    ]

This is how I handle the visibility (showing or hiding the accordions depending on the tab). The first tab doesn't trigger the select event at all. When I click the second or third tab (whichever is not currently active), Gradio first calls the select handler of the first tab and then immediately afterwards the handler of the clicked tab. So the first tab's event seems to be queued, but it only executes together with the second event.

If I click around a few times I get the error shown in the log.

Any idea what that could be?

Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Reproduction

import gradio as gr
from gradio import Blocks

TEST_MODELS = [
  {
        "model": "gpt-4-32k", 
        "name": "Azure OpenAI GPT-4 32k", 
        "desc": "GPT-4 is a large multimodal model (accepting text or image inputs and outputting text) that can solve difficult problems with greater accuracy than any of our previous models, thanks to its broader general knowledge and advanced reasoning capabilities.", 
        "platform": "SAP Generative AI Hub",
        "maxToken": 16000,
        "hasImage": False,
        "restricted": False,
        "active": True
    }
]

# SQL Playground functions ---------------------
        
def sql_tab_selected(state: dict):
    """ Handles the click on the SQL Generation tab """
    return [
        gr.Accordion(visible=False),
        gr.Accordion(visible=False),
    ]
        
# User tab functions ----------------------------------------

def user_tab_selected(state: dict):
    """ Set user list dataframe """
    login_data = []
    return [
        login_data,
        gr.Accordion(visible=False),
        gr.Accordion(visible=False)
    ]
    
# General AI Playground functions ---------------------------

def genai_tab_selected(state: dict):
    """ Handles the click on the Generative AI Playground """
    return [
        gr.Accordion(visible=True),
        gr.Accordion(visible=True),
    ]
    
def set_user_data(state: dict):
    """ Sets the tabs visibility """
    return [
        gr.Tab(visible=True),
        gr.Tab(visible=True),
    ]

def build_chat_view(models: list, sys_roles: list) -> Blocks:
    """ Build the view with Gradio blocks """
    with gr.Blocks(
            title="Architecture & Platform Advisory - Generative AI Chat", 
            theme=gr.themes.Soft(),
            # css=block_css
        ) as chat_view:
        state = gr.State({})
        with gr.Row(elem_id="overall_row") as main_screen:
            with gr.Column(scale=10, elem_id="column_left"):
                with gr.Tab(label="Generative AI", elem_id="genai_tab", visible=False) as genai_tab:
                    chatbot = gr.Chatbot(
                        label="AI Chat",
                        elem_id="chat_window",
                        bubble_full_width=False,
                        show_copy_button=True,
                        show_share_button=True,
                        avatar_images=(None, "./img/saplogo.png")
                    )
                    msg_box = gr.Textbox(
                        elem_id="msg_box",
                        show_label=False,
                        max_lines=5,
                        placeholder="Enter text and press ENTER",
                        container=False
                    )
                    msg_box.autofocus = True
                with gr.Tab(label="Architecture", elem_id="arch_info_tab", visible=False) as arch_info_tab:
                    login_count_df1 = gr.Dataframe(
                        headers=["Username", "Full name", "Email", "Login Count"],
                        datatype=["str", "str", "str", "number"],
                        # row_count=5,
                        col_count=(4, "fixed"),
                    )
                with gr.Tab(label="SQL Generation", elem_id="sql_gen_tab", visible=True) as sql_gen_tab:
                    with gr.Row(elem_id="sql_row1") as sql_row1:
                        with gr.Column(scale=10, elem_id="sql_col_left1") as sql_col_left1:
                            gr.Markdown("### AI Generated SQL Query")
                            sql_query_tb = gr.Markdown(
                                elem_id="sql_query_tb",
                                label="AI Generated SQL Query",
                                visible=False,
                                show_label=True
                            )
                        with gr.Column(scale=3, elem_id="sql_col_right1") as sql_col_right1:
                            sql_result_tb = gr.Textbox(
                                elem_id="sql_result_tb",
                                label="AI generated response",
                                placeholder="No generation requested.",
                                container=True,
                                visible=False
                            )
                            request_tb = gr.Textbox(
                                elem_id="request_tb",
                                label="Your request",
                                placeholder="No generation requested.",
                                container=True,
                                visible=False
                            )
                    with gr.Row(elem_id="sql_row2") as sql_row2:
                        sql_result_df = gr.Dataframe(
                            elem_id="sql_result_df",
                            label="SQL Retrieval Result",
                            headers=["Username", "Full name", "Email"],
                            datatype=["str", "str", "str"],
                            col_count=(3, "fixed"),
                            visible=False
                        )
                        sql_input_tb = gr.Textbox(
                            elem_id="sql_input_tb",
                            show_label=False,
                            placeholder="Enter query and press ENTER",
                            container=False
                        )
                with gr.Tab(label="User Info", elem_id="user_info_tab", visible=False) as user_info_tab:
                    login_count_df = gr.Dataframe(
                        headers=["Username", "Full name", "Email", "Login Count"],
                        datatype=["str", "str", "str", "number"],
                        row_count=20,
                        col_count=(4, "fixed"),
                    )    
            with gr.Column(scale=3, elem_id="column_right") as column_right:
                with gr.Group(elem_id="model_group") as model_group:
                    model_names = [item["name"] for item in models]
                    state.value["models"] = models
                    model_selector = gr.Dropdown(
                        choices=model_names, 
                        container=True,
                        label="🗨️ Language Model",
                        show_label=True,
                        interactive=True,
                        value=model_names[0] if len(model_names) > 0 else ""
                    )
                    model_info_box = gr.Textbox(value=models[0]["desc"], lines=3, label=f"🆗 {models[0]['platform']}", interactive=False, elem_id="model_info")
                    imagebox = gr.Image(type="filepath", elem_id="image_box", visible=False)
                    def model_change(model_name, state):
                        """ Toggle image upload visibility """
                        is_visible = False
                        result = next((item for item in models if item["name"] == model_name), None)
                        if result["model"] in ["llava","bakllava"]:
                            is_visible = True
                        try:
                            state["model"] = None
                        except Exception as e:
                            pass # No action needed
                        gr.Info("Switching and loading model - first AI response will have a delay.")
                        usage_hint = ""
                        if result['platform'] in ["SAP Generative AI Hub", "Ollama on SAP AI Core"]:
                            usage_hint = "int"
                            usage_icon = "🆗"
                        elif result['platform'] in ["SAP BTP Proxy"]:
                            usage_hint = "pub"
                            usage_icon = "⛔"
                        else:
                            usage_hint = "int"
                            usage_icon = "⛔"
                        logo_update = "## Hello 2"
                        return gr.update(visible = is_visible), gr.update(maximum=result["maxToken"], value=int(result["maxToken"]*0.6)), gr.update(value=result["desc"], label=f"{usage_icon} {result['platform']}"), gr.update(autofocus=True), gr.update(value = logo_update), state
                with gr.Accordion("🔍 Document Talk (RAG)", open=False, elem_id="rag_accordion") as rag_accordion:
                    uplbtn_file = gr.UploadButton(label="Upload",  file_types=[".docx", ".pptx", ".pdf", ".txt"], file_count="multiple", elem_id="rag_btn_upload")
                    file_output = gr.File(elem_id="rag_file_view")
                    cbox_use_document = gr.Checkbox(label="Use document", visible=False, value=False, elem_id="cb_rag_active")
                    btn_remove = gr.Button(value="Remove", visible=False, elem_id="rag_btn_remove")
                    with gr.Accordion("🎚️ Embedding Options", open=False, elem_id="embedding_options") as embedding_accordion:
                        sldr_chunk_size = gr.Slider(minimum=100, maximum=1000, value=1000, step=100, label="Chunk Size")
                        sldr_chunk_overlap = gr.Slider(minimum=0, maximum=500, value=100, step=100, label="Overlap")
                        cbox_use_ocr = gr.Checkbox(label="⚠️ Use Image OCR", visible=True, value=False, elem_id="cb_image_ocr")
                        sldr_k_top = gr.Slider(minimum=1, maximum=20, value=8, step=1, label="k-top from Vector Store")
                with gr.Accordion("🎚️ Model parameters", open=False, elem_id="parameter_accordion") as parameter_accordion:
                    system_role = gr.Dropdown(           
                        allow_custom_value=True,             
                        choices=sys_roles,
                        container=False,
                        label="Select or enter a system role",
                        interactive=True,
                        value=sys_roles[0] if len(sys_roles) > 0 else "",
                    )
                    image_process_mode = gr.Radio(
                        ["Crop", "Resize", "Pad", "Default"],
                        value="Resize",
                        label="Preprocess for non-square image", visible=True)
                    temperature = gr.Slider(minimum=0.0, maximum=1.0, value=0.1, step=0.1, interactive=True, label="Temperature",)
                    top_p = gr.Slider(minimum=0.0, maximum=1.0, value=0.7, step=0.1, interactive=True, label="Top P",)
                    max_output_tokens = gr.Slider(minimum=0, maximum=models[0]["maxToken"], value=int(models[0]["maxToken"]*0.6), step=64, interactive=True, label="Max output tokens",)
                clear = gr.Button(value="Clear history")
                logo_box = gr.Markdown(value="### Hello",elem_id="logo_box")
        # Event handling playground tab
        genai_tab.select(
            genai_tab_selected,
            inputs=[state],
            outputs=[rag_accordion, parameter_accordion]
        )
        # Event handling SQL tab
        sql_gen_tab.select(
            sql_tab_selected,
            inputs=[state],
            outputs=[rag_accordion, parameter_accordion]
        )
        # Event handling user tab
        user_info_tab.select(
            user_tab_selected,
            inputs=[state],
            outputs=[login_count_df, rag_accordion, parameter_accordion]
        )        
        # Event handling: Loading page
        chat_view.load(set_user_data, [state], [genai_tab, user_info_tab])
    return chat_view
    

def main()->None:
       
    chat_view = build_chat_view(models=TEST_MODELS, sys_roles=["123","123","125"])
    # chat_view.queue()
    chat_view.launch(
        debug=True,
        show_api=False,
        server_name="0.0.0.0",
        server_port=51010
    )

if __name__ == "__main__":
    main()

Screenshot

(screenshot omitted)

Green tabs work; the red one doesn't trigger the select event. Green-marked accordions are toggled for visibility.

Logs

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/routing.py", line 69, in app
    await response(scope, receive, send)
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/responses.py", line 273, in wrap
    await func()
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/responses.py", line 262, in stream_response
    async for chunk in self.body_iterator:
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/gradio/routes.py", line 662, in sse_stream
    raise e
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/gradio/routes.py", line 603, in sse_stream
    raise HTTPException(
fastapi.exceptions.HTTPException

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/fastapi/applications.py", line 1106, in __call__
    await super().__call__(scope, receive, send)
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 83, in __call__
    await self.app(scope, receive, send)
  File "/home/gunter/ollama-aicore/.venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 83, in __call__
    raise RuntimeError(msg) from exc
RuntimeError: Caught handled exception, but response already started.

System Info

Gradio 4.16 (was with 4.15 as well)

Severity

Blocking usage of gradio

@skye0402 skye0402 added the bug Something isn't working label Jan 27, 2024
abidlabs (Member) commented:
Hi @skye0402 can you please provide a minimal code example that we can use to reproduce the issue above? See: https://stackoverflow.com/help/minimal-reproducible-example

@abidlabs abidlabs added the needs repro Awaiting full reproduction label Jan 27, 2024
skye0402 (Author) commented Jan 28, 2024

@abidlabs - I was able to make it reproducible. The code above is stripped of most of its functions, so it looks odd because the bulk of the logic has been removed.
The problem is here (I was able to track it down):

  • The tab visibility is set to False for 2 tabs at application start and is then statically set to True in the load() event. In the real app there are checks whether the user is allowed to see each tab.
  • Setting tab visibility (or possibly any property) inside the load() event conflicts with the select() event of the tab. Just run the code and you will see it.

@abidlabs abidlabs removed the bug Something isn't working label Jan 30, 2024
skye0402 (Author) commented:

@abidlabs did you want to remove the other label? I think it's a bug. And it's reproducible.

@abidlabs abidlabs added bug Something isn't working and removed needs repro Awaiting full reproduction labels Jan 31, 2024
abidlabs (Member) commented:

Removed, we'll take a look at this issue as soon as we can.
