This repository has been archived by the owner on Sep 15, 2024. It is now read-only.

GPT-4 Vision Preview Cutting Off Result Message [Bug] #303

Closed
1 of 3 tasks
senja24 opened this issue Mar 7, 2024 · 2 comments
Labels
bug Something isn't working

Comments


senja24 commented Mar 7, 2024

Bug Description

I'm writing to report a bug I've encountered while using the GPT-4 Vision Preview feature in the app. The result message is cut off: only a single line is shown instead of the full response.

Steps to Reproduce

  1. Open the app and change the model to GPT-4 Vision Preview.
  2. Enter a prompt and an image in the input field.
  3. Click the "Send" button and view the result.

Expected Behavior

The GPT-4 Vision Preview should display the full result message, without cutting it off or truncating it.

Screenshots

No response

Deployment Method

  • Docker
  • Vercel
  • Server

Desktop OS

No response

Desktop Browser

No response

Desktop Browser Version

No response

Smartphone Device

No response

Smartphone OS

No response

Smartphone Browser

No response

Smartphone Browser Version

No response

Additional Logs

No response

@senja24 senja24 added the bug Something isn't working label Mar 7, 2024
@H0llyW00dzZ (Owner)


You have to enable this manually.

image

@H0llyW00dzZ (Owner)

For instance, you can set the max tokens in a specific chat instead of in the settings, because the settings apply it as a global value.

image
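To illustrate why this setting matters, here is a minimal sketch of a chat-completion request in the shape the OpenAI Chat Completions API expects for vision input. The function name, model string, and values are illustrative assumptions, not the app's actual code; the key point is that a `max_tokens` value that is too low makes the API stop generating early (the response then carries `finish_reason: "length"`), which looks like a cut-off message.

```python
import json

# Hypothetical helper sketching the request payload the app might send.
# Field names follow the OpenAI Chat Completions API; everything else
# (function name, URLs, values) is an assumption for illustration.
def build_vision_request(prompt: str, image_url: str, max_tokens: int) -> str:
    payload = {
        "model": "gpt-4-vision-preview",
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {"type": "image_url", "image_url": {"url": image_url}},
                ],
            }
        ],
        # If this limit is too low, generation stops early and the reply
        # is truncated -- the symptom reported in this issue.
        "max_tokens": max_tokens,
    }
    return json.dumps(payload)

request = build_vision_request(
    "Describe this image.", "https://example.com/photo.png", 4096
)
```

Raising the per-chat limit (rather than the global setting) keeps other chats unaffected while giving vision responses room to complete.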
