Tool Calling

Use OpenAI's tool calling format across providers. OpenGateway translates tools and tool_choice for Anthropic Claude and Google Gemini automatically.

OpenGateway accepts the OpenAI tools and tool_choice format on every model it routes to. When the underlying provider uses a different shape — Anthropic's tool_use blocks or Google's function_declarations — the gateway converts at the edge. Your code only sees the OpenAI shape.

What it is#

A single tool calling format that works with openai/, anthropic/, and google/ models. You write the tool definitions once, in OpenAI JSON schema. The gateway handles translation. The response always comes back as OpenAI's tool_calls array.

What it solves#

Different providers have different tool calling conventions:

  • OpenAI uses tools[].function with JSON schema and returns tool_calls
  • Anthropic uses tools[].input_schema and returns content blocks of type tool_use
  • Google uses function_declarations inside tools and returns function_call

Without a gateway, supporting all three means three code paths and three sets of bugs. With OpenGateway, you write the OpenAI form once.
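
The three shapes carry the same information, just arranged differently. Here is a rough sketch of the conversion the gateway performs at the edge — these helpers are illustrative, not OpenGateway's actual code:

```python
# Sketch of the OpenAI-shape -> provider-shape translation.
# Illustrative only; the gateway's real implementation is not public.

openai_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def to_anthropic(tool):
    # Anthropic: top-level name/description, JSON schema under input_schema
    fn = tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "input_schema": fn["parameters"],
    }

def to_google(tool):
    # Google: entries live in tools[].function_declarations,
    # with the schema under parameters
    fn = tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "parameters": fn["parameters"],
    }

anthropic_tool = to_anthropic(openai_tool)
google_tool = {"function_declarations": [to_google(openai_tool)]}
```

The schema itself passes through untouched in every case; only the envelope around it changes, which is why the translation can be lossless.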

Define a tool#

from openai import OpenAI
 
client = OpenAI(
    api_key="og_live_...",
    base_url="https://api.opengateway.ai/v1",
)
 
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. Seoul"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    },
}]
 
response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4",
    messages=[{"role": "user", "content": "Weather in Seoul?"}],
    tools=tools,
    tool_choice="auto",
)

The response shape matches OpenAI exactly:

{
  "choices": [{
    "message": {
      "role": "assistant",
      "tool_calls": [{
        "id": "call_abc123",
        "type": "function",
        "function": {
          "name": "get_weather",
          "arguments": "{\"city\":\"Seoul\",\"unit\":\"celsius\"}"
        }
      }]
    }
  }]
}

The same code returns the same shape whether the model is OpenAI, Anthropic, or Google.

Handle the tool call#

Run the tool, then send the result back as a role: "tool" message:

import json
 
# 1. Model asked for a tool call
tool_call = response.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)
 
# 2. Run the tool
weather_result = get_weather(args["city"], args.get("unit", "celsius"))  # unit is optional in the schema
 
# 3. Send the result back
follow_up = client.chat.completions.create(
    model="anthropic/claude-sonnet-4",
    messages=[
        {"role": "user", "content": "Weather in Seoul?"},
        response.choices[0].message,  # The model's tool call message
        {
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": json.dumps(weather_result),
        },
    ],
    tools=tools,
)

The model's final answer ("It's 18°C and clear in Seoul.") comes back in follow_up.choices[0].message.content.

tool_choice#

Same as OpenAI:

| Value | Behavior |
| --- | --- |
| "auto" (default) | Model decides whether to call a tool. |
| "none" | Model returns a normal text response. Tools are ignored. |
| "required" | Model must call at least one tool. |
| {"type": "function", "function": {"name": "X"}} | Model must call tool X. |
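
The forced form is easy to build programmatically. A small helper — force_tool is my name for it, not part of any SDK:

```python
def force_tool(name):
    # tool_choice value that forces a named tool, in OpenAI's shape.
    # OpenGateway translates this for Anthropic and Google as well.
    return {"type": "function", "function": {"name": name}}

choice = force_tool("get_weather")
# Pass as: client.chat.completions.create(..., tool_choice=choice)
```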

Streaming tool calls#

Stream tool call deltas exactly like content deltas:

stream = client.chat.completions.create(
    model="openai/gpt-4o",
    messages=[...],
    tools=tools,
    stream=True,
)
 
for chunk in stream:
    delta = chunk.choices[0].delta
    if delta.tool_calls:
        for tc in delta.tool_calls:
            print(tc.function.arguments, end="", flush=True)

Tool call argument JSON arrives in pieces. Concatenate the deltas and parse once finish_reason: "tool_calls" arrives.
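
A minimal sketch of that accumulation, using simulated deltas in the shape OpenAI chunks use (a list of partial tool calls keyed by index, with id and name on the first piece and argument fragments on later ones):

```python
import json

def accumulate_tool_calls(deltas):
    # Rebuild complete tool calls from streamed deltas (sketch).
    calls = {}
    for delta in deltas:
        for tc in delta:
            entry = calls.setdefault(
                tc["index"], {"id": None, "name": None, "arguments": ""}
            )
            if tc.get("id"):
                entry["id"] = tc["id"]
            fn = tc.get("function", {})
            if fn.get("name"):
                entry["name"] = fn["name"]
            entry["arguments"] += fn.get("arguments", "")
    # Parse the argument JSON only once all fragments are in
    return [
        {"id": c["id"], "name": c["name"], "arguments": json.loads(c["arguments"])}
        for c in calls.values()
    ]

# Simulated deltas as they might arrive over the stream
deltas = [
    [{"index": 0, "id": "call_1", "function": {"name": "get_weather", "arguments": ""}}],
    [{"index": 0, "function": {"arguments": '{"city":'}}],
    [{"index": 0, "function": {"arguments": '"Seoul"}'}}],
]
calls = accumulate_tool_calls(deltas)
```

In real code you would run the accumulator over the chunks from the stream and call it done when finish_reason is "tool_calls".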

Provider notes#

| Provider | Tool calling | Streaming tool calls |
| --- | --- | --- |
| OpenAI (openai/) | Native | Yes |
| Anthropic (anthropic/) | Translated from OpenAI shape | Yes |
| Google (google/) | Translated from OpenAI shape | Yes |

The translation is lossless for tools that match JSON schema. Provider-specific tool features (Anthropic's computer use, Google's grounding) are not exposed through the OpenAI shape — call those through the provider's native API if you need them.

What to know#

Can a tool call return a structured object?#

Yes. Set response_format={"type": "json_object"} on the follow-up call, or response_format={"type": "json_schema", ...} on models that support it.
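
For the json_schema variant, the shape follows OpenAI's structured outputs format. A sketch with an illustrative schema — the weather_report name and fields are made up for this example:

```python
# Sketch: a json_schema response_format for the follow-up call.
# Schema name and fields are illustrative, not prescribed by the API.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "weather_report",
        "schema": {
            "type": "object",
            "properties": {
                "temperature_c": {"type": "number"},
                "conditions": {"type": "string"},
            },
            "required": ["temperature_c", "conditions"],
        },
    },
}
# Pass as: client.chat.completions.create(..., response_format=response_format)
```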

What if my tool throws?#

Send the error back as the tool message content. Models handle "the tool errored, here is the error" natively and usually retry with corrected input.
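
A minimal sketch of that pattern. The helper name and the {"error": ...} envelope are conventions chosen for this example, not a prescribed format:

```python
import json

def tool_result_message(tool_call_id, result=None, error=None):
    # Build the role:"tool" follow-up message for a success or a failure.
    # The error envelope is a convention, not part of any spec.
    body = {"error": str(error)} if error is not None else result
    return {
        "role": "tool",
        "tool_call_id": tool_call_id,
        "content": json.dumps(body),
    }

try:
    raise TimeoutError("weather service unreachable")  # simulated tool failure
except Exception as exc:
    msg = tool_result_message("call_abc123", error=exc)
# Append msg to the messages list and call the model again,
# exactly as with a successful tool result.
```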

Does tool calling work with fallbacks?#

Yes, for Chat Completions. If the primary model fails, the gateway retries on the next model listed in extra.fallbacks, carrying the same tool definitions with it.
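
With the OpenAI SDK, non-standard request fields like extra.fallbacks travel in the request body via extra_body. A sketch, assuming that convention — check OpenGateway's own reference for the exact parameter name:

```python
# Sketch: passing gateway fallbacks alongside tools via the OpenAI SDK.
# The "extra"/"fallbacks" body shape follows this doc; verify it against
# the gateway's API reference before relying on it.
extra_body = {"extra": {"fallbacks": ["openai/gpt-4o"]}}
# Pass as: client.chat.completions.create(..., tools=tools, extra_body=extra_body)
```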