Conversation

devin-ai-integration bot commented Dec 5, 2025

Summary

Fixes #4036 - Models with native function calling (like Ollama) fail with a `'list' object has no attribute 'rstrip'` error.

When a model returned tool_calls without text content and no available_functions were provided, the code previously returned the raw tool_calls list. That list then caused the error when passed to format_message_for_llm(), which expects a string.

This PR converts tool_calls to a human-readable string representation containing the tool name and arguments, allowing the agent to see what tool the model wanted to call.

Changes:

  • Modified _handle_non_streaming_response to convert tool_calls to string
  • Modified _ahandle_non_streaming_response with the same fix
  • Added warning log when this conversion occurs
  • Added 4 tests covering sync/async paths and edge cases
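
The conversion described above can be sketched roughly as follows. This is an illustrative stand-in, not the exact crewAI implementation; the helper name `tool_calls_to_text` and the dict/attribute handling are assumptions, while the `Tool: {name}\nArguments: {args}` format and the broad fallback to `str(tool_calls)` come from this PR's description:

```python
import logging

logger = logging.getLogger(__name__)


def tool_calls_to_text(tool_calls):
    """Render raw tool_calls as a human-readable string so downstream code
    that expects text (e.g. format_message_for_llm) does not crash."""
    try:
        parts = []
        for call in tool_calls:
            # LiteLLM-style tool calls expose .function.name / .arguments;
            # dict-shaped calls are handled the same way here (assumption).
            fn = call["function"] if isinstance(call, dict) else call.function
            name = fn["name"] if isinstance(fn, dict) else fn.name
            args = fn["arguments"] if isinstance(fn, dict) else fn.arguments
            parts.append(f"Tool: {name}\nArguments: {args}")
        logger.warning("Converting tool_calls to text: no available_functions provided")
        return "\n\n".join(parts)
    except Exception:
        # Intentional broad fallback: if the tool_call structure is
        # unexpected, degrade to str() rather than crash.
        return str(tool_calls)


calls = [{"function": {"name": "search", "arguments": '{"q": "crewai"}'}}]
print(tool_calls_to_text(calls))
```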

Review & Testing Checklist for Human

  • Test with actual Ollama model: The fix was only tested with mocked responses. Please verify with a real Ollama model that has native function calling enabled (e.g., ollama_chat/gpt-oss:20b as mentioned in the issue)
  • Verify string format is useful: The format `Tool: {name}\nArguments: {args}` is somewhat arbitrary. Confirm that this format works well with the agent's text-based tool parsing, or whether a different format (e.g., JSON) would be more appropriate
  • Check for other code paths: Verify there are no other places in the codebase that might return raw tool_calls lists that could cause similar issues
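
To make the "mocked responses" caveat concrete, the existing tests presumably look something like this pytest-style sketch. `handle_tool_calls` here is a simplified stand-in for the patched response handler, and `SimpleNamespace` mimics a LiteLLM-style tool-call object; none of these names are the actual crewAI test fixtures:

```python
from types import SimpleNamespace


def handle_tool_calls(content, tool_calls, available_functions=None):
    """Simplified stand-in for the patched response handler."""
    if content:
        return content
    if tool_calls and not available_functions:
        # The fix under test: return text instead of the raw list.
        return "\n\n".join(
            f"Tool: {c.function.name}\nArguments: {c.function.arguments}"
            for c in tool_calls
        )
    return content


def test_tool_calls_without_functions_returns_string():
    call = SimpleNamespace(
        function=SimpleNamespace(name="lookup", arguments='{"id": 1}')
    )
    result = handle_tool_calls(content=None, tool_calls=[call])
    assert isinstance(result, str)  # no raw list reaches format_message_for_llm
    assert "Tool: lookup" in result


test_tool_calls_without_functions_returns_string()
```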

Recommended test plan:

  1. Set up an Ollama model with native function calling support
  2. Create a simple agent with tools
  3. Run a task that triggers the model to use a tool
  4. Verify no crash occurs and the agent can see the tool call information

Notes

  • This fix improves the failure mode but does NOT enable native function calling for the standard agent tool loop; it converts tool calls to text so the agent can handle them through text-based parsing
  • The broad `except Exception` catch in the conversion is intentional: it falls back to `str(tool_calls)` when the tool_call structure is unexpected
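
To illustrate why the text form is still usable by the agent, a text-based parser can recover the tool name and arguments from the converted string. The layout mirrors the `Tool: {name}\nArguments: {args}` format from this PR; the parser itself is hypothetical, not crewAI code:

```python
import json
import re


def parse_tool_call_text(text):
    """Extract (name, parsed_arguments) pairs from the converted
    tool-call text produced by this PR's string format."""
    pattern = re.compile(r"Tool: (?P<name>.+)\nArguments: (?P<args>.+)")
    return [
        (m.group("name"), json.loads(m.group("args")))
        for m in pattern.finditer(text)
    ]


converted = 'Tool: search\nArguments: {"q": "ollama"}'
print(parse_tool_call_text(converted))  # [('search', {'q': 'ollama'})]
```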

Link to Devin run: https://app.devin.ai/sessions/196112f708ab4589be19e3987b8c584c
Requested by: João ([email protected])

devin-ai-integration bot commented:

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring



Development

Successfully merging this pull request may close these issues.

[BUG] Models with native function calling fail when using Ollama
