
Workflow LLM not working #146

@humboldthills

Description


I have entered the API key and URL for OpenAI; the key verifies successfully, and I have tried many of the model options from the drop-down, but no matter what I choose the logs show: `The model us.anthropic.claude-sonnet-4-20250514-v1:0 does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'`.

So I tried to verify my Anthropic API key instead, to choose one of their models. It will not verify with https://api.anthropic.com, https://api.anthropic.com/v1, or https://api.anthropic.com/v1/. It always fails, and in the logs the Anthropic key verification appears to go through the OpenAI code path:

ERROR    | llm_api.py:verify_openai_key:115 | API validation failed with status code: 404
2026-04-18 02:54:16 | ERROR    | llm_api.py:verify_openai_key:115 | API validation failed with status code: 401
2026-04-18 02:55:04 | ERROR    | llm_api.py:verify_openai_key:115 | API validation failed with status code: 404
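For what it's worth, the key itself can be checked directly against Anthropic's API, outside the plugin, to rule out a bad key. The endpoint (`GET /v1/models`) and the `x-api-key` / `anthropic-version` headers are Anthropic's documented conventions; the helper function below is just my own illustration, not the plugin's code:

```python
# Minimal sketch of a direct Anthropic key check, independent of the plugin.
# Anthropic authenticates with an x-api-key header (not a Bearer token, which
# is what an OpenAI-style verifier would send -- one plausible reason the
# verification above fails with 401/404).

ANTHROPIC_BASE = "https://api.anthropic.com"

def build_anthropic_verify_request(api_key: str) -> tuple[str, dict]:
    """Return (url, headers) for a GET /v1/models key check."""
    url = f"{ANTHROPIC_BASE}/v1/models"
    headers = {
        "x-api-key": api_key,              # Anthropic-specific auth header
        "anthropic-version": "2023-06-01", # required API version header
    }
    return url, headers

# To actually verify (needs a real key and network access):
# import requests
# url, headers = build_anthropic_verify_request("sk-ant-...")
# print(requests.get(url, headers=headers).status_code)  # 200 => key is valid
```

If a plain request like this succeeds while the plugin's verification fails, that would support the read that the verifier is sending OpenAI-style auth to the Anthropic endpoint.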

That was me trying to authenticate with Anthropic to get around the error I hit using OpenAI. Here is some output from when I use the OpenAI key, since that one does authenticate: the chat works fine, but as soon as it tries to spin up an agent the log shows:

2026-04-18 03:14:33 | ERROR    | mcp_client.py:comfyui_agent_invoke:503 | Non-retryable streaming error or max retries reached: Error code: 404 - {'error': {'message': 'The model `us.anthropic.claude-sonnet-4-20250514-v1:0` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
2026-04-18 03:14:33 | ERROR    | mcp_client.py:comfyui_agent_invoke:504 | Traceback: Traceback (most recent call last):
  File "/workspace/ComfyUI/custom_nodes/ComfyUI-Copilot/backend/service/mcp_client.py", line 460, in comfyui_agent_invoke
    async for stream_data in process_stream_events(result):
  File "/workspace/ComfyUI/custom_nodes/ComfyUI-Copilot/backend/service/mcp_client.py", line 455, in process_stream_events
    raise e
  File "/workspace/ComfyUI/custom_nodes/ComfyUI-Copilot/backend/service/mcp_client.py", line 338, in process_stream_events
    async for event in stream_result.stream_events():
  File "/workspace/miniconda3/envs/comfyui/lib/python3.11/site-packages/agents/result.py", line 773, in stream_events
    raise self._stored_exception
  File "/workspace/miniconda3/envs/comfyui/lib/python3.11/site-packages/agents/run_internal/run_loop.py", line 1012, in start_streaming
    turn_result = await run_single_turn_streamed(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/miniconda3/envs/comfyui/lib/python3.11/site-packages/agents/run_internal/run_loop.py", line 1473, in run_single_turn_streamed
    async for event in retry_stream:
  File "/workspace/miniconda3/envs/comfyui/lib/python3.11/site-packages/agents/run_internal/model_retry.py", line 652, in stream_response_with_retry
    event = await stream.__anext__()
            ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/miniconda3/envs/comfyui/lib/python3.11/site-packages/agents/models/openai_chatcompletions.py", line 248, in stream_response
    response, stream = await self._fetch_response(
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/miniconda3/envs/comfyui/lib/python3.11/site-packages/agents/models/openai_chatcompletions.py", line 441, in _fetch_response
    ret = await self._get_client().chat.completions.create(**create_kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/miniconda3/envs/comfyui/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 2714, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/workspace/miniconda3/envs/comfyui/lib/python3.11/site-packages/openai/_base_client.py", line 1913, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/miniconda3/envs/comfyui/lib/python3.11/site-packages/openai/_base_client.py", line 1698, in request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `us.anthropic.claude-sonnet-4-20250514-v1:0` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

2026-04-18 03:14:33 | INFO     | mcp_client.py:comfyui_agent_invoke:530 | Total tool results: 0
2026-04-18 03:14:33 | INFO     | mcp_client.py:comfyui_agent_invoke:540 | === End Tool Results Summary ===

2026-04-18 03:14:33 | INFO     | conversation_api.py:invoke_chat:310 | -- Received ext data: None, finished: True
2026-04-18 03:14:33 | INFO     | conversation_api.py:invoke_chat:335 | -- Sending final response: 44 chars, ext: False, finished: True
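The model ID in that 404 (`us.anthropic.claude-sonnet-4-20250514-v1:0`) looks like an AWS Bedrock cross-region inference-profile ID (the `us.` prefix and `-v1:0` suffix), which the OpenAI endpoint will never recognize regardless of which drop-down option is selected. A rough sanity check one could run against the configured base URL and model, sketched below; the function names and classification heuristics are entirely my own illustration, not anything from the plugin:

```python
# Rough heuristic to flag provider/model mismatches like the one in the logs.
# The prefix/suffix rules are illustrative guesses, not an authoritative list.

def guess_provider(model_id: str) -> str:
    if model_id.startswith(("us.", "eu.", "apac.")) and model_id.endswith("-v1:0"):
        return "bedrock"       # AWS Bedrock cross-region inference profile
    if model_id.startswith("claude-"):
        return "anthropic"     # native Anthropic API model name
    if model_id.startswith(("gpt-", "o1", "o3", "o4")):
        return "openai"
    return "unknown"

def check_config(base_url: str, model_id: str) -> str:
    provider = guess_provider(model_id)
    if "api.openai.com" in base_url and provider != "openai":
        return f"mismatch: {model_id} looks like a {provider} model, not OpenAI"
    return "ok"

print(check_config("https://api.openai.com/v1",
                   "us.anthropic.claude-sonnet-4-20250514-v1:0"))
```

Run against the values from the logs, a check like this flags the mismatch, which is consistent with the agent ignoring the drop-down selection and sending a hard-coded Bedrock model ID to the OpenAI endpoint.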

This looks like the same behavior reported in issue #116, but I am still hitting it unless I am missing something in the config/setup.

I can provide more info if it would help.
