
Commit

Merge pull request #119 from alipay/dev_weizj
Dev weizj
LandJerry authored Jul 9, 2024
2 parents (1edb7a5 + 1648552), commit c9612a8
Showing 4 changed files with 13 additions and 5 deletions.
agentuniverse/agent_serve/web/request_task.py (11 changes: 8 additions & 3 deletions)
@@ -73,8 +73,13 @@ def receive_steps(self):
                                        ensure_ascii=False) + "\n\n"
             if self.canceled():
                 return
-        yield "data:" + json.dumps({"result": self.thread.result()},
-                                   ensure_ascii=False) + "\n\n"
+        try:
+            result = self.thread.result()
+            yield "data:" + json.dumps({"result": result},
+                                       ensure_ascii=False) + "\n\n "
+        except Exception as e:
+            LOGGER.error("request task execute Fail: " + str(e))
+            yield "data:" + json.dumps({"error": {"error_msg": str(e)}}) + "\n\n "

     def append_steps(self):
         """Tracing async service running state and update it to database."""
@@ -225,7 +230,7 @@ def finished(self):
         self.__request_do__.state = TaskStateEnum.FINISHED.value

     @staticmethod
-    def query_request_state(request_id: str) -> dict|None:
+    def query_request_state(request_id: str) -> dict | None:
         """Query the request data in database by given request_id.

         Args:
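With this change, a failure inside the worker thread reaches the streaming client as a regular SSE error event instead of abruptly breaking the generator. For context, below is a minimal sketch of how a client might consume such a stream; the endpoint URL, port, and payload fields are illustrative assumptions, not part of this commit.

import json

import requests


def consume_stream(url: str, payload: dict) -> None:
    """Read server-sent "data:" events until the HTTP stream closes."""
    with requests.post(url, json=payload, stream=True) as resp:
        for raw in resp.iter_lines(decode_unicode=True):
            if not raw:
                continue
            line = raw.strip()
            if not line.startswith("data:"):
                continue  # skip blank separators between SSE events
            event = json.loads(line[len("data:"):])
            if "result" in event:
                print("final result:", event["result"])
            elif "error" in event:
                # New in this commit: a failed task is reported as an
                # error event instead of aborting the stream.
                print("task failed:", event["error"]["error_msg"])
            else:
                print("intermediate event:", event)


if __name__ == "__main__":
    # Hypothetical endpoint and payload, for illustration only.
    consume_stream("http://127.0.0.1:8888/service_run_stream",
                   {"service_id": "demo_service", "params": {"input": "hello"}})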
agentuniverse/agent_serve/web/thread_with_result.py (3 changes: 3 additions & 0 deletions)
@@ -30,6 +30,9 @@ def run(self):
             self._return = self.target(*self.args, **self.kwargs)
         except Exception as e:
             self.error = e
+        finally:
+            if 'output_stream' in self.kwargs:
+                self.kwargs['output_stream'].put('{"type": "EOF"}')

     def result(self):
         """Wait for target func finished, then return the result or raise an
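The finally block added above guarantees that the output queue always receives an EOF marker, even when the wrapped target function raises, so a consumer blocked on the queue is released. Below is a small self-contained sketch of this result-carrying-thread pattern; the class and helper names are illustrative and simplified, not the project's actual implementation.

import queue
import threading


class ResultThread(threading.Thread):
    """Thread that captures its target's return value or error and always
    pushes an EOF sentinel so stream readers can stop waiting."""

    def __init__(self, target, args=(), kwargs=None):
        super().__init__()
        self.target = target
        self.args = args
        self.kwargs = kwargs or {}
        self._return = None
        self.error = None

    def run(self):
        try:
            self._return = self.target(*self.args, **self.kwargs)
        except Exception as e:
            self.error = e
        finally:
            # Mirrors the change above: signal end-of-stream even on failure.
            if 'output_stream' in self.kwargs:
                self.kwargs['output_stream'].put('{"type": "EOF"}')

    def result(self):
        # Wait for the target to finish, then re-raise its error or return
        # its value.
        self.join()
        if self.error is not None:
            raise self.error
        return self._return


def work(x, output_stream=None):
    output_stream.put(f"step with x={x}")
    return x * 2


stream = queue.Queue()
t = ResultThread(work, args=(21,), kwargs={'output_stream': stream})
t.start()
print(t.result())  # 42
while True:
    item = stream.get()
    print(item)
    if item == '{"type": "EOF"}':
        break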
docs/guidebook/en/2_2_2_LLM_component_define_and_usage.md (2 changes: 1 addition & 1 deletion)
@@ -467,7 +467,7 @@ module: 'agentuniverse.llm.default.default_openai_llm'
 class: 'DefaultOpenAILLM'
 ```

-If we need to configure and define an LLM instance based on the `gpt-3.5-turbo model`, with a maximum token limit of 1000 and a retry count of 2 for failures, the configuration would be as follows:
+If we need to configure and define an LLM instance based on the `gpt-3.5-turbo model`, with a maximum token limit of 1000 and a retry count is 2, the configuration would be as follows:
 ```yaml
 name: 'demo_llm'
 description: 'demo openai'
docs/guidebook/zh/2_2_2_模型定义与使用.md (2 changes: 1 addition & 1 deletion)
@@ -462,7 +462,7 @@ module: 'agentuniverse.llm.default.default_openai_llm'
 class: 'DefaultOpenAILLM'
 ```

-假如我们需要配置定义一个基于`gpt-3.5-turbo`模型,最大token限制为1000,失败重试次数未2次的LLM实例,其配置如下:
+假如我们需要配置定义一个基于`gpt-3.5-turbo`模型,最大token限制为1000,失败重试次数为2次的LLM实例,其配置如下:
 ```yaml
 name: 'demo_llm'
 description: 'demo openai'

(This hunk fixes a typo in the Chinese sentence, 未 changed to 为, so it reads "a retry count of 2".)
