An error occurred when I enabled the 'Event Stream' in the GLM model. #9301
Labels: 🐞 bug (Something isn't working)
dsl:

app:
  description: t
  icon: 🤖
  icon_background: '#FFEAD5'
  mode: advanced-chat
  name: t
  use_icon_as_answer_icon: false
kind: app
version: 0.1.2
workflow:
  conversation_variables: []
  environment_variables: []
  features:
    file_upload:
      image:
        enabled: false
        number_limits: 3
        transfer_methods:
        - local_file
        - remote_url
    opening_statement: ''
    retriever_resource:
      enabled: true
    sensitive_word_avoidance:
      enabled: false
    speech_to_text:
      enabled: false
    suggested_questions: []
    suggested_questions_after_answer:
      enabled: false
    text_to_speech:
      enabled: false
      language: ''
      voice: ''
  graph:
    edges:
    - data:
        sourceType: start
        targetType: llm
      id: 1726711439620-llm
      source: '1726711439620'
      sourceHandle: source
      target: llm
      targetHandle: target
      type: custom
    - data:
        isInIteration: false
        sourceType: llm
        targetType: answer
      id: llm-source-answer-target
      source: llm
      sourceHandle: source
      target: answer
      targetHandle: target
      type: custom
      zIndex: 0
    nodes:
    - data:
        desc: ''
        selected: false
        title: 开始  # "Start"
        type: start
        variables: []
      height: 54
      id: '1726711439620'
      position:
        x: 80
        y: 282
      positionAbsolute:
        x: 80
        y: 282
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        context:
          enabled: false
          variable_selector: []
        desc: ''
        memory:
          role_prefix:
            assistant: ''
            user: ''
          window:
            enabled: false
            size: 10
        model:
          completion_params:
            stream: false
            temperature: 0.7
          mode: chat
          name: glm-4-plus
          provider: zhipuai
        prompt_template:
        - id: a78c7665-932b-49be-bffa-63f46ff014c6
          role: system
          text: ''
        selected: true
        title: LLM
        type: llm
        variables: []
        vision:
          enabled: false
      height: 98
      id: llm
      position:
        x: 348
        y: 194
      positionAbsolute:
        x: 348
        y: 194
      selected: true
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        answer: '{{#llm.text#}}'
        desc: ''
        selected: false
        title: 直接回复  # "Direct Reply"
        type: answer
        variables: []
      height: 107
      id: answer
      position:
        x: 769
        y: 100
      positionAbsolute:
        x: 769
        y: 100
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    viewport:
      x: 254.75402520766323
      y: 397.50899601162155
      zoom: 0.9835020740262486
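Note that the DSL's LLM node calls glm-4-plus through the zhipuai provider with stream: false in completion_params, while the error only appears once 'Event Stream' is enabled. To check whether streaming against the provider works at all, independently of Dify, here is a minimal sketch using the zhipuai Python SDK (v2); the API key and prompt are placeholders and are not taken from this issue.

# Minimal streaming check against the GLM provider, outside of Dify.
# Assumes the zhipuai Python SDK (v2) is installed and a valid API key is available.
from zhipuai import ZhipuAI

client = ZhipuAI(api_key="YOUR_ZHIPUAI_API_KEY")  # placeholder key

response = client.chat.completions.create(
    model="glm-4-plus",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,  # streaming mode, which is what 'Event Stream' requests
)

# With stream=True the SDK yields chunks; print the incremental deltas.
for chunk in response:
    delta = chunk.choices[0].delta
    if delta.content:
        print(delta.content, end="", flush=True)
print()

If this streams correctly, the failure is more likely in how Dify handles the provider's streamed chunks than in the GLM API itself.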
Self Checks
Dify version
latest
Cloud or Self Hosted
Self Hosted (Source)
Steps to reproduce
It works as expected if I disable 'Event Stream'; the error only occurs when it is enabled (a reproduction sketch follows below).
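For reference, the same toggle can be exercised through Dify's service API by switching response_mode between 'blocking' and 'streaming'. The sketch below is a reproduction aid under stated assumptions, not part of the original report: the base URL and app API key are placeholders for a self-hosted deployment.

import json
import requests

API_BASE = "http://localhost/v1"  # placeholder: adjust to your self-hosted Dify API base
API_KEY = "app-YOUR-APP-KEY"      # placeholder: the app's service API key

def chat(query: str, streaming: bool) -> None:
    """Send one chat message, toggling between blocking and SSE streaming responses."""
    resp = requests.post(
        f"{API_BASE}/chat-messages",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        json={
            "inputs": {},
            "query": query,
            "user": "repro-user",
            "response_mode": "streaming" if streaming else "blocking",
        },
        stream=streaming,
    )
    if streaming:
        # Streaming mode returns server-sent events: lines prefixed with "data: "
        for line in resp.iter_lines():
            if line.startswith(b"data:"):
                print(json.loads(line[len(b"data:"):].strip()))
    else:
        print(resp.json())

chat("hello", streaming=False)  # works according to the report
chat("hello", streaming=True)   # the configuration that reportedly errors with the GLM model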
✔️ Expected Behavior
No response
❌ Actual Behavior
No response