Ollama
Agent with Structured Outputs
Code
cookbook/models/ollama/structured_output.py
import asyncio
from typing import List

from agno.agent import Agent
from agno.models.ollama import Ollama
from pydantic import BaseModel, Field


class MovieScript(BaseModel):
    name: str = Field(..., description="Give a name to this movie")
    setting: str = Field(
        ..., description="Provide a nice setting for a blockbuster movie."
    )
    ending: str = Field(
        ...,
        description="Ending of the movie. If not available, provide a happy ending.",
    )
    genre: str = Field(
        ...,
        description="Genre of the movie. If not available, select action, thriller or romantic comedy.",
    )
    characters: List[str] = Field(..., description="Name of characters for this movie.")
    storyline: str = Field(
        ..., description="3 sentence storyline for the movie. Make it exciting!"
    )


# Agent that returns a structured output
structured_output_agent = Agent(
    model=Ollama(id="llama3.2"),
    description="You write movie scripts.",
    response_model=MovieScript,
    structured_outputs=True,
)

# Run the agent synchronously
structured_output_agent.print_response("Llamas ruling the world")


# Run the agent asynchronously
async def run_agents_async():
    await structured_output_agent.aprint_response("Llamas ruling the world")


asyncio.run(run_agents_async())
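If you want the parsed object rather than printed output, you can call the agent's run() method instead of print_response(). This is a minimal sketch that assumes the run response exposes the validated response_model instance via its content attribute (check your agno version):

# Get the structured result as a Python object.
# Assumption: the response's `content` holds the validated MovieScript instance.
run_response = structured_output_agent.run("Llamas ruling the world")
movie: MovieScript = run_response.content
print(movie.name)
print(movie.storyline)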
Usage
1. Create a virtual environment
Open the Terminal and create a Python virtual environment.
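For example, using the standard venv module (activation command shown for macOS/Linux; adjust for your shell):

python3 -m venv .venv
source .venv/bin/activate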
2. Install Ollama
Follow the installation guide and run:
ollama pull llama3.2
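Make sure the Ollama server is running (for example via ollama serve, or the Ollama desktop app) before pulling the model.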
3. Install libraries
pip install -U ollama agno
4. Run Agent
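Run the script using the cookbook path shown above:

python cookbook/models/ollama/structured_output.py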