Description
What happened
**Title:** TypeError: Object of type Event is not JSON serializable in `openai_source.py`

**Version:** v4.22.1

**Description:**
`ToolLoopAgentRunner._iter_llm_responses()` passes `abort_signal=asyncio.Event()` in the payload kwargs (line 225). In `openai_source._prepare_chat_payload()`, this gets merged into `payloads` via `{**kwargs, ...}` (line 626). Then in `_query_stream()`, all unrecognized keys are moved to `extra_body` (lines 312-318), which gets JSON-serialized → crash.

**Reproduction:** Send any message in WebChat with tool-calling enabled.
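The failure mechanism can be reproduced in isolation, with no AstrBot code: the stdlib JSON encoder has no handler for `asyncio.Event`, so any payload it leaks into is unserializable. A minimal sketch (model name and message content are placeholders):

```python
import asyncio
import json

# Mimic what ends up happening in _query_stream: an internal control
# object (abort_signal) leaks into the payload, is routed into
# extra_body, and the OpenAI client JSON-serializes the whole request.
payload = {
    "model": "qwen3-vl",
    "messages": [{"role": "user", "content": "hi"}],
    "extra_body": {"abort_signal": asyncio.Event()},  # not JSON serializable
}

try:
    json.dumps(payload)
except TypeError as e:
    print(e)  # Object of type Event is not JSON serializable
```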
**Fix** (in `openai_source.py`, `_prepare_chat_payload()`):

```diff
- payloads = {**kwargs, "messages": context_query, "model": model}
+ _internal_keys = {"abort_signal", "extra_user_content_parts"}
+ filtered_kwargs = {k: v for k, v in kwargs.items() if k not in _internal_keys}
+ payloads = {**filtered_kwargs, "messages": context_query, "model": model}
```
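The patched behaviour can be sketched as a standalone helper (a sketch mirroring the diff above, not the actual AstrBot method, which takes more parameters):

```python
import asyncio

# Keys used internally by the agent runner that must never reach the
# OpenAI API payload (names taken from the diff).
_INTERNAL_KEYS = {"abort_signal", "extra_user_content_parts"}

def prepare_chat_payload(context_query, model, **kwargs):
    """Drop internal control keys before merging kwargs into the payload."""
    filtered_kwargs = {k: v for k, v in kwargs.items() if k not in _INTERNAL_KEYS}
    return {**filtered_kwargs, "messages": context_query, "model": model}

payload = prepare_chat_payload(
    [{"role": "user", "content": "hi"}],
    "qwen3-vl",
    temperature=0.7,               # legitimate API kwarg: kept
    abort_signal=asyncio.Event(),  # internal control object: dropped
)
assert "abort_signal" not in payload and payload["temperature"] == 0.7
```

An allow-list of known OpenAI parameters would be stricter, but the deny-list above matches the minimal fix proposed in the diff.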
Reproduce
Send any message in WebChat with tool-calling enabled.
AstrBot version, deployment method (e.g., Windows Docker Desktop deployment), provider used, and messaging platform used
AstrBot v4.22.1 on Windows; provider: ollama/qwen3-vl:235b-cloud (OpenAI-compatible provider); messaging platform adapter: WebChat (see logs below).
OS
Windows
Logs
[15:31:04.635] [Core] [WARN] [v4.22.1] [runners.tool_loop_agent_runner:283]: Chat Model ollama/qwen3-vl:235b-cloud request error: Object of type Event is not JSON serializable
Traceback (most recent call last):
File "C:\DollyAI_Soul_V8\AstrBot\main.py", line 141, in
asyncio.run(main_async(args.webui_dir))
│ │ │ │ └ None
│ │ │ └ Namespace(webui_dir=None)
│ │ └ <function main_async at 0x000001A3B219A8E0>
│ └ <function run at 0x000001A3AE60B4C0>
└ <module 'asyncio' from 'C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\asyncio\__init__....
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\asyncio\runners.py", line 195, in run
return runner.run(main)
│ │ └ <coroutine object main_async at 0x000001A3A8534660>
│ └ <function Runner.run at 0x000001A3AEAB1620>
└ <asyncio.runners.Runner object at 0x000001A3A86A28A0>
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\asyncio\runners.py", line 118, in run
return self._loop.run_until_complete(task)
│ │ │ └ <Task pending name='Task-1' coro=<main_async() running at C:\DollyAI_Soul_V8\AstrBot\main.py:121> wait_for=<_GatheringFuture ...
│ │ └ <function BaseEventLoop.run_until_complete at 0x000001A3AEAAB1A0>
│ └
└ <asyncio.runners.Runner object at 0x000001A3A86A28A0>
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\asyncio\base_events.py", line 678, in run_until_complete
self.run_forever()
│ └ <function ProactorEventLoop.run_forever at 0x000001A3AEB63060>
└
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\asyncio\windows_events.py", line 322, in run_forever
super().run_forever()
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\asyncio\base_events.py", line 645, in run_forever
self._run_once()
│ └ <function BaseEventLoop._run_once at 0x000001A3AEAB0F40>
└
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\asyncio\base_events.py", line 1999, in _run_once
handle._run()
│ └ <function Handle._run at 0x000001A3AE5EACA0>
└ <Handle Task.task_wakeup()>
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\asyncio\events.py", line 88, in _run
self._context.run(self._callback, *self._args)
│ │ │ │ │ └ <member '_args' of 'Handle' objects>
│ │ │ │ └ <Handle Task.task_wakeup()>
│ │ │ └ <member '_callback' of 'Handle' objects>
│ │ └ <Handle Task.task_wakeup()>
│ └ <member '_context' of 'Handle' objects>
└ <Handle Task.task_wakeup()>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\pipeline\scheduler.py", line 87, in execute
await self._process_stages(event)
│ │ └ <astrbot.core.platform.sources.webchat.webchat_event.WebChatMessageEvent object at 0x000001A3AB49EBD0>
│ └ <function PipelineScheduler._process_stages at 0x000001A3A8502200>
└ <astrbot.core.pipeline.scheduler.PipelineScheduler object at 0x000001A3AA0C7380>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\pipeline\scheduler.py", line 61, in _process_stages
await self._process_stages(event, i + 1)
│ │ │ └ 6
│ │ └ <astrbot.core.platform.sources.webchat.webchat_event.WebChatMessageEvent object at 0x000001A3AB49EBD0>
│ └ <function PipelineScheduler._process_stages at 0x000001A3A8502200>
└ <astrbot.core.pipeline.scheduler.PipelineScheduler object at 0x000001A3AA0C7380>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\pipeline\scheduler.py", line 72, in _process_stages
await coroutine
└ <coroutine object RespondStage.process at 0x000001A3AA869640>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\pipeline\respond\stage.py", line 201, in process
await event.send_streaming(result.async_stream, realtime_segmenting)
│ │ │ │ └ True
│ │ │ └ <async_generator object run_agent at 0x000001A3AB778B30>
│ │ └ MessageEventResult(chain=[], use_t2i_=None, type=None, result_type=<EventResultType.CONTINUE: 1>, result_content_type=<Result...
│ └ <function WebChatMessageEvent.send_streaming at 0x000001A3A84B59E0>
└ <astrbot.core.platform.sources.webchat.webchat_event.WebChatMessageEvent object at 0x000001A3AB49EBD0>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\platform\sources\webchat\webchat_event.py", line 159, in send_streaming
async for chain in generator:
└ <async_generator object run_agent at 0x000001A3AB778B30>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\astr_agent_run_util.py", line 124, in run_agent
async for resp in agent_runner.step():
│ └ <function ToolLoopAgentRunner.step at 0x000001A3B5F9A2A0>
└ <astrbot.core.agent.runners.tool_loop_agent_runner.ToolLoopAgentRunner object at 0x000001A3AB52A180>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\agent\runners\tool_loop_agent_runner.py", line 389, in step
async for llm_response in self._iter_llm_responses_with_fallback():
│ └ <function ToolLoopAgentRunner._iter_llm_responses_with_fallback at 0x000001A3B5F99EE0>
└ <astrbot.core.agent.runners.tool_loop_agent_runner.ToolLoopAgentRunner object at 0x000001A3AB52A180>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\agent\runners\tool_loop_agent_runner.py", line 258, in _iter_llm_responses_with_fallback
async for resp in self._iter_llm_responses(include_model=idx == 0):
│ │ └ 0
│ └ <function ToolLoopAgentRunner._iter_llm_responses at 0x000001A3B5F99E40>
└ <astrbot.core.agent.runners.tool_loop_agent_runner.ToolLoopAgentRunner object at 0x000001A3AB52A180>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\agent\runners\tool_loop_agent_runner.py", line 232, in _iter_llm_responses
async for resp in stream: # type: ignore
└ <async_generator object ProviderOpenAIOfficial.text_chat_stream at 0x000001A3AA7D92D0>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\provider\sources\openai_source.py", line 869, in text_chat_stream
) = await self._handle_api_error(
│ └ <function ProviderOpenAIOfficial._handle_api_error at 0x000001A3A9C489A0>
└ <astrbot.core.provider.sources.openai_source.ProviderOpenAIOfficial object at 0x000001A3AA168AA0>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\provider\sources\openai_source.py", line 753, in _handle_api_error
raise e
└ TypeError('Object of type Event is not JSON serializable')
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\provider\sources\openai_source.py", line 856, in text_chat_stream
async for response in self._query_stream(payloads, func_tool):
│ │ │ └ ToolSet(tools=[FuncTool(name=get_current_time, parameters={'type': 'object', 'properties': {}}, description=Obtem a data e a ...
│ │ └ {'messages': [{'role': 'system', 'content': 'You are running in Safe Mode.\n\nRules:\n- Do NOT generate pornographic, sexuall...
│ └ <function ProviderOpenAIOfficial._query_stream at 0x000001A3A9C48540>
└ <astrbot.core.provider.sources.openai_source.ProviderOpenAIOfficial object at 0x000001A3AA168AA0>
File "C:\DollyAI_Soul_V8\AstrBot\astrbot\core\provider\sources\openai_source.py", line 321, in _query_stream
stream = await self.client.chat.completions.create(
│ │ │ │ └ <function AsyncCompletions.create at 0x000001A3A9D8C5E0>
│ │ │ └ <openai.resources.chat.completions.completions.AsyncCompletions object at 0x000001A3AA169040>
│ │ └ <openai.resources.chat.chat.AsyncChat object at 0x000001A3AA168C20>
│ └ <openai.AsyncOpenAI object at 0x000001A3AA168A10>
└ <astrbot.core.provider.sources.openai_source.ProviderOpenAIOfficial object at 0x000001A3AA168AA0>
File "C:\DollyAI_Soul_V8\AstrBot\.venv\Lib\site-packages\openai\resources\chat\completions\completions.py", line 2714, in create
return await self._post(
│ └ <bound method AsyncAPIClient._post of <openai.AsyncOpenAI object at 0x000001A3AA168A10>>
└ <openai.resources.chat.completions.completions.AsyncCompletions object at 0x000001A3AA169040>
File "C:\DollyAI_Soul_V8\AstrBot\.venv\Lib\site-packages\openai\_base_client.py", line 1884, in _post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
│ │ │ │ │ └ openai.AsyncStream[openai.types.chat.chat_completion_chunk.ChatCompletionChunk]
│ │ │ │ └ True
│ │ │ └ FinalRequestOptions(method='post', url='/chat/completions', params={}, headers=NOT_GIVEN, max_retries=NOT_GIVEN, timeout=NOT...
│ │ └ <class 'openai.types.chat.chat_completion.ChatCompletion'>
│ └ <function AsyncAPIClient.request at 0x000001A3B5DB2160>
└ <openai.AsyncOpenAI object at 0x000001A3AA168A10>
File "C:\DollyAI_Soul_V8\AstrBot\.venv\Lib\site-packages\openai\_base_client.py", line 1590, in request
request = self._build_request(options, retries_taken=retries_taken)
│ │ │ └ 0
│ │ └ FinalRequestOptions(method='post', url='/chat/completions', params={}, headers=NOT_GIVEN, max_retries=NOT_GIVEN, timeout=NOT...
│ └ <function BaseClient._build_request at 0x000001A3B5DA77E0>
└ <openai.AsyncOpenAI object at 0x000001A3AA168A10>
File "C:\DollyAI_Soul_V8\AstrBot\.venv\Lib\site-packages\openai\_base_client.py", line 563, in _build_request
kwargs["content"] = openapi_dumps(json_data) if is_given(json_data) and json_data is not None else None
│ │ │ │ │ └ {'messages': [{'role': 'system', 'content': 'You are running in Safe Mode.\n\nRules:\n- Do NOT generate pornographic, sexuall...
│ │ │ │ └ {'messages': [{'role': 'system', 'content': 'You are running in Safe Mode.\n\nRules:\n- Do NOT generate pornographic, sexuall...
│ │ │ └ <function is_given at 0x000001A3B57CDB20>
│ │ └ {'messages': [{'role': 'system', 'content': 'You are running in Safe Mode.\n\nRules:\n- Do NOT generate pornographic, sexuall...
│ └ <function openapi_dumps at 0x000001A3B5D77920>
└ {}
File "C:\DollyAI_Soul_V8\AstrBot\.venv\Lib\site-packages\openai\_utils\_json.py", line 18, in openapi_dumps
return json.dumps(
│ └ <function dumps at 0x000001A3AEC1AF20>
└ <module 'json' from 'C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\json\__init__.py'>
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\json\__init__.py", line 238, in dumps
**kw).encode(obj)
│ └ {'messages': [{'role': 'system', 'content': 'You are running in Safe Mode.\n\nRules:\n- Do NOT generate pornographic, sexuall...
└ {}
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\json\encoder.py", line 200, in encode
chunks = self.iterencode(o, _one_shot=True)
│ │ └ {'messages': [{'role': 'system', 'content': 'You are running in Safe Mode.\n\nRules:\n- Do NOT generate pornographic, sexuall...
│ └ <function JSONEncoder.iterencode at 0x000001A3AEC1B2E0>
└ <openai._utils._json._CustomEncoder object at 0x000001A3AB529E80>
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\json\encoder.py", line 258, in iterencode
return _iterencode(o, 0)
│ └ {'messages': [{'role': 'system', 'content': 'You are running in Safe Mode.\n\nRules:\n- Do NOT generate pornographic, sexuall...
└ <_json.Encoder object at 0x000001A3AB331000>
File "C:\DollyAI_Soul_V8\AstrBot\.venv\Lib\site-packages\openai\_utils\_json.py", line 35, in default
return super().default(o)
└ <asyncio.locks.Event object at 0x000001A3AB49D0A0 [unset]>
File "C:\Users\J\AppData\Roaming\uv\python\cpython-3.12-windows-x86_64-none\Lib\json\encoder.py", line 180, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
│ │ └ <member '__name__' of 'getset_descriptor' objects>
│ └ <attribute '__class__' of 'object' objects>
└ <asyncio.locks.Event object at 0x000001A3AB49D0A0 [unset]>
TypeError: Object of type Event is not JSON serializable
[15:31:04.656] [Core] [INFO] [result_decorate.stage:189]: 流式输出已启用,跳过结果装饰阶段 (streaming output is enabled; skipping the result-decoration stage)
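Independent of the one-line fix, a cheap regression guard would be to check payload serializability before the request is sent, so a leaked internal object fails fast with the offending key instead of deep inside the OpenAI client's request builder. A hypothetical helper (not existing AstrBot code):

```python
import asyncio
import json

def assert_json_serializable(payload: dict) -> None:
    """Raise a TypeError naming the offending top-level key if any
    payload value cannot be JSON-serialized (hypothetical helper)."""
    for key, value in payload.items():
        try:
            json.dumps(value)
        except TypeError as e:
            raise TypeError(f"payload key {key!r} is not JSON serializable: {e}") from e

good = {"model": "m", "messages": [{"role": "user", "content": "hi"}]}
assert_json_serializable(good)  # passes silently

bad = {"model": "m", "abort_signal": asyncio.Event()}
try:
    assert_json_serializable(bad)
except TypeError as e:
    print(e)  # names the offending key: 'abort_signal'
```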
Are you willing to submit a PR?
- Yes!
Code of Conduct
- I have read and agree to abide by the project's Code of Conduct.