Conversation

@Vidit-Ostwal (Contributor) commented on Sep 25, 2025

This is an attempted solution for a None/empty response from the LLM.

This PR assumes the cause may be that the context length was exceeded on the server side, which returns an empty response instead of raising an error. The PR catches the ValueError that is raised downstream and retries after summarizing the conversation.
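
A minimal sketch of that flow, assuming the helper names from the summary below; the import path, the `llm.call` interface, and the signatures are illustrative assumptions, not the actual API:

```python
# Hedged sketch of the catch-and-retry flow; the import path and the
# signatures below are assumptions based on the PR summary, not the real API.
from crewai.utilities.agent_utils import (  # assumed module path
    handle_context_length,
    is_null_response_because_context_length_exceeded,
)

def invoke_with_overflow_recovery(llm, messages: list[dict]) -> str:
    try:
        # Assumed call interface; the executor raises ValueError when the
        # provider returns a None/empty response.
        return llm.call(messages)
    except ValueError as e:
        if is_null_response_because_context_length_exceeded(e, messages, llm):
            # Summarize the oversized history, then retry once.
            messages = handle_context_length(messages)
            return llm.call(messages)
        raise
```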


Note

Detect None/empty LLM responses as context-window overflows, trigger summarization/retry in executors, add utility and tests, and refine type hints.

  • Error handling & resilience
    • Treat ValueError with empty/None LLM responses as potential context overflow via new is_null_response_because_context_length_exceeded.
    • Update executors (crewai/agents/crew_agent_executor.py, crewai/lite_agent.py) to handle this case alongside is_context_length_exceeded, invoking handle_context_length to summarize and retry.
    • Enhance the agent_utils.is_context_length_exceeded signature and docs; keep the core detection; adopt LLMMessage typing.
  • Utilities
    • Add is_null_response_because_context_length_exceeded(exception, messages, llm) in agent_utils.py to infer overflow by chunking messages against llm.get_context_window_size() (see the sketch after this list).
  • Typing/Interfaces
    • Use BaseAgent in TaskEvaluator and broaden tool execution types to accept Agent | BaseAgent.
  • Tests
    • New tests validating empty-response overflow handling in agent execution and the new utility (tests/agents/test_agent.py, tests/utilities/test_agent_utils.py).
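
For reference, a hedged sketch of the overflow heuristic named above; the real implementation lives in agent_utils.py, and the empty-response check and the chars-to-tokens approximation here are illustrative assumptions:

```python
# Hedged sketch of the heuristic only; the actual code in agent_utils.py
# may detect empty responses and count tokens differently.
def is_null_response_because_context_length_exceeded(
    exception: ValueError, messages: list[dict], llm
) -> bool:
    """Guess whether an empty/None LLM response was caused by context overflow."""
    # Only classify errors that look like a null/empty response (assumed check).
    text = str(exception).lower()
    if "empty" not in text and "none" not in text:
        return False
    # get_context_window_size() is named in the PR summary; the ~4 chars-per-token
    # ratio is a rough illustrative approximation.
    window = llm.get_context_window_size()
    approx_tokens = sum(len(m.get("content") or "") for m in messages) // 4
    return approx_tokens >= window
```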

Written by Cursor Bugbot for commit 62d1a85. This will update automatically on new commits.

@Vidit-Ostwal (Contributor, Author) commented:

Hey @lucasgomide, do you think this could be a possible solution for this?
#2885 (comment)

I got some feedback from folks who tried the feature branch.

@Vidit-Ostwal changed the title from vo/fix/none_empty_response to WIP Fix no response / null response from LLM on Sep 27, 2025.

@Vidit-Ostwal (Contributor, Author) commented:

Gentle ping on this one @lucasgomide

@lucasgomide (Contributor) commented:

Added to my list to review later today

@Vidit-Ostwal force-pushed the Invalid_response_from_LLM branch from 3807072 to 5c9ac8f on October 30, 2025 at 12:44.

@lucasgomide (Contributor) commented:

Mind checking the CI errors?

@Vidit-Ostwal (Contributor, Author) commented:

> Mind checking the CI errors?

All fixed now.
