Base class for interacting with providers that implement the Open Responses API specification. This provides a foundation for multi-provider, interoperable LLM interfaces based on the OpenAI Responses API. Providers that implement this spec include Ollama (v0.13.3+) and OpenRouter.

Key Differences from OpenAI Responses

  • Configurable base_url for pointing to different API endpoints
  • Stateless by default (no previous_response_id chaining)
  • Flexible api_key handling for providers that don’t require authentication

Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `id` | `str` | `"not-provided"` | The ID of the model to use |
| `name` | `str` | `"OpenResponses"` | The name of the model |
| `provider` | `str` | `"OpenResponses"` | The provider of the model |
| `api_key` | `Optional[str]` | `"not-provided"` | The API key for authentication |
| `store` | `Optional[bool]` | `False` | Whether to store responses (disabled by default for broad provider compatibility) |

Usage

For most use cases, prefer the provider-specific classes; use OpenResponses directly when connecting to a custom or self-hosted endpoint:
```python
from agno.agent import Agent
from agno.models.openai import OpenResponses

agent = Agent(
    model=OpenResponses(
        id="your-model-id",
        base_url="https://your-provider.com/v1",
        api_key="your-api-key",
    ),
)

agent.print_response("Hello!")
```
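Since Ollama (v0.13.3+) implements the spec, the same class can point at a local Ollama server. This is a sketch under two assumptions: the model id is a placeholder for whatever model you have pulled, and `http://localhost:11434/v1` is Ollama's default local endpoint; no `api_key` is passed because a local server typically requires none:

```python
from agno.agent import Agent
from agno.models.openai import OpenResponses

agent = Agent(
    model=OpenResponses(
        id="your-model-id",  # placeholder: any model pulled into Ollama
        base_url="http://localhost:11434/v1",  # Ollama's default local endpoint
        # api_key omitted: a local server typically needs no authentication
    ),
)

agent.print_response("Hello!")
```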