How to stream text output? #587

@ProjCRys

Description

I tried adding "stream": True to the llm config, but it doesn't stream any text and instead crashes. I was planning to use the streamed output for my TTS to reduce response latency (speak each sentence as soon as it is formed), so I set stream=True to see whether it would stream the output.

Code used:


import autogen
import random

config_list = [
    {
        "api_type": "open_ai",
        "api_base": "http://localhost:1234/v1",
        "api_key": "NULL"
    }
]

random_seed = random.randint(0, 10000)  # Generate a random seed

llm_config = {
    "request_timeout": 1000,
    "seed": random_seed,  # Use the random seed here
    "config_list": config_list,
    "stream": True,
    "temperature": 0
}

assistant = autogen.AssistantAgent(
    name="assistant",
    system_message="You are a coder specialized in Python",
    llm_config=llm_config
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="ALWAYS",
    max_consecutive_auto_reply=10,
    is_termination_msg=lambda x: "TERMINATE" in x.get("content", ""),
    code_execution_config={"work_dir": "web"},
    llm_config=llm_config,
    system_message="""End with TERMINATE if the task has been solved to full satisfaction. Otherwise, reply CONTINUE or the reason why the task is not solved yet."""
)

task = input("Please write a task: ")

user_proxy.initiate_chat(assistant, message=task)


Output:


D:\AI\ChatBots\Autogen>python instruct.py
Please write a task: Hello
user_proxy (to assistant):

Hello


Traceback (most recent call last):
  File "D:\AI\ChatBots\Autogen\instruct.py", line 40, in <module>
    user_proxy.initiate_chat(assistant, message=task)
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\site-packages\autogen\agentchat\conversable_agent.py", line 531, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\site-packages\autogen\agentchat\conversable_agent.py", line 334, in send
    recipient.receive(message, self, request_reply, silent)
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\site-packages\autogen\agentchat\conversable_agent.py", line 462, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\site-packages\autogen\agentchat\conversable_agent.py", line 781, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\site-packages\autogen\agentchat\conversable_agent.py", line 606, in generate_oai_reply
    response = oai.ChatCompletion.create(
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\site-packages\autogen\oai\completion.py", line 803, in create
    response = cls.create(
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\site-packages\autogen\oai\completion.py", line 834, in create
    return cls._get_response(params, raise_on_ratelimit_or_timeout=raise_on_ratelimit_or_timeout)
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\site-packages\autogen\oai\completion.py", line 272, in _get_response
    cls._cache.set(key, response)
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\site-packages\diskcache\core.py", line 772, in set
    size, mode, filename, db_value = self._disk.store(value, read, key=key)
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\site-packages\diskcache\core.py", line 221, in store
    result = pickle.dumps(value, protocol=self.pickle_protocol)
TypeError: cannot pickle 'generator' object
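For context on why this crashes: with stream=True, the underlying OpenAI client returns a generator of chunks rather than a complete response object, and autogen's disk cache then fails when it tries to pickle that generator (the TypeError at the end of the traceback). One way to get sentence-level TTS regardless of whether autogen supports streaming is to consume a text stream yourself and buffer fragments into complete sentences before speaking them. The sketch below is hypothetical glue code (sentences_from_stream is not an autogen API, and the sentence-boundary rule is a simplification); it is shown against a fake stream so it runs offline:

```python
import re

def sentences_from_stream(chunks):
    """Accumulate streamed text fragments and yield complete sentences.

    `chunks` is any iterable of strings (e.g. the delta content fields
    of a streaming chat completion). A sentence is considered complete
    when it ends in '.', '!', or '?' followed by whitespace or the end
    of the buffer. This is a deliberate simplification; abbreviations
    like "e.g." would be split early.
    """
    buffer = ""
    for chunk in chunks:
        buffer += chunk
        # Split off every complete sentence currently in the buffer.
        while True:
            match = re.search(r"[.!?](\s|$)", buffer)
            if not match:
                break
            end = match.end()
            sentence = buffer[:end].strip()
            buffer = buffer[end:]
            if sentence:
                yield sentence
    # Flush any trailing partial sentence when the stream ends.
    tail = buffer.strip()
    if tail:
        yield tail

# Simulated stream, standing in for streamed model output.
fake_stream = ["Hel", "lo there. How ", "are you? Fi", "ne."]
print(list(sentences_from_stream(fake_stream)))
# → ['Hello there.', 'How are you?', 'Fine.']
```

In a real setup you would feed this generator the token deltas from a streaming completion and call your TTS function on each yielded sentence, so speech can start after the first sentence instead of after the full reply.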
