Skip to content

Fix JSON serialization of LiteLLM tool calls#5277

Closed
STHITAPRAJNAS wants to merge 1 commit intogoogle:mainfrom
STHITAPRAJNAS:fix/litellm-serialization
Closed

Fix JSON serialization of LiteLLM tool calls#5277
STHITAPRAJNAS wants to merge 1 commit intogoogle:mainfrom
STHITAPRAJNAS:fix/litellm-serialization

Conversation

@STHITAPRAJNAS
Copy link
Copy Markdown

This PR addresses Issue #158 - a TypeError that occurs when using LiteLlm with third-party loggers or tracers (such as Opik). The issue was caused by LiteLLM tool call objects (ChatCompletionMessageToolCall and Function) not
being natively JSON serializable by standard Python json.dumps().

Changes

  • src/google/adk/models/lite_llm.py:
    • Refactored _content_to_message_param to use plain Python dictionaries for tool call definitions. This ensures that the message history passed to LiteLLM (and intercepted by loggers) contains only
      standard, serializable types.
    • Updated _message_to_generate_content_response to defensively handle both dictionary and object-based tool calls, ensuring robust conversion to ADK's types.Part.
    • Modified streaming logic in generate_content_async to assemble tool calls as plain dictionaries, preventing non-serializable objects from entering the response stream.
    • Initialized last_model_version in generate_content_async to resolve a potential UnboundLocalError and improve linting scores.

Impact

  • Fixed TypeError: Object of type ChatCompletionMessageToolCall is not JSON serializable.
  • Improved compatibility with observability tools that rely on standard JSON serialization for tracing.
  • No breaking changes to the public ADK API.

Reproduction
The issue was reproduced using a standalone script that integrated google-adk with litellm and the opik tracer. When an agent triggered a tool call, the OpikLogger attempted to serialize the model response using
json.dumps(), which failed on the ChatCompletionMessageToolCall object.

Reproduction Snippet:

from litellm.integrations.opik.opik import OpikLogger
import litellm
import json
from litellm.types.utils import ChatCompletionMessageToolCall, Function

# LiteLLM's internal object which is not JSON serializable
tool_call = ChatCompletionMessageToolCall(
    id="call_123",
    function=Function(arguments='{"city": "New York"}', name="get_weather"),
    type="function"
)

This fails in Opik or any standard JSON logger

json.dumps({"tool_calls": [tool_call]}) 
     Raises: TypeError: Object of type ChatCompletionMessageToolCall is not JSON serializable

Verification & Testing
After applying the fix, the following validation steps were performed:

  1. Manual Verification:

    • Created a verification script that uses plain dictionaries for tool calls (mimicking the new behavior).
    • Confirmed that json.dumps() successfully serializes the data without any TypeError.
    • Result: {"tool_calls": [{"id": "call_123", "function": {"arguments": "...", "name": "..."}, "type": "function"}]}
  2. Unit Testing:

    • Ran the specific LiteLLM test suite: pytest tests/unittests/models/test_litellm.py.
    • Result: 245 passed, 0 failed.
  3. Regression Testing:

    • Ran the entire project unit test suite: pytest tests/unittests.
    • Result: 5321 passed, 0 failed.
  4. Linting & Standards:

    • Ran pylint src/google/adk/models/lite_llm.py. (Score: 9.62/10).
    • Executed ./autoformat.sh to ensure compliance with Google's Python style guide.

@adk-bot adk-bot added the models [Component] Issues related to model support label Apr 11, 2026
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

models [Component] Issues related to model support

Projects

None yet

Development

Successfully merging this pull request may close these issues.

2 participants