Code

cookbook/11_models/ollama/async_basic.py
import asyncio

from agno.agent import Agent
from agno.models.ollama import Ollama

agent = Agent(
    model=Ollama(id="llama3.1:8b"),
    description="You help people with their health and fitness goals.",
    instructions=["Recipes should be under 5 ingredients"],
)
# Print the response to the terminal, awaiting the model asynchronously
asyncio.run(agent.aprint_response("Share a breakfast recipe.", markdown=True))
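Because `aprint_response` is a coroutine, it must be driven by an event loop rather than called directly, which is what `asyncio.run(...)` does in the last line. The sketch below illustrates that pattern with a hypothetical stand-in coroutine (`fake_aprint_response` is not part of agno; it only mimics an awaitable agent call), so it runs without a model server:

```python
import asyncio

# Hypothetical stand-in for an awaitable agent call such as
# Agent.aprint_response -- it only illustrates the asyncio.run()
# pattern and does not need a running Ollama server.
async def fake_aprint_response(prompt: str) -> str:
    await asyncio.sleep(0)  # yield control, as a real network call would
    return f"response to: {prompt}"

# asyncio.run() creates an event loop, runs the coroutine to
# completion, and closes the loop -- the same shape as the
# asyncio.run(agent.aprint_response(...)) call above.
result = asyncio.run(fake_aprint_response("Share a breakfast recipe."))
print(result)
```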

Usage

1. Set up your virtual environment

   uv venv --python 3.12
   source .venv/bin/activate

2. Install Ollama

   Follow the Ollama installation guide, then pull the model:

   ollama pull llama3.1:8b

3. Install dependencies

   uv pip install -U ollama agno

4. Run the agent

   python cookbook/11_models/ollama/async_basic.py
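The main payoff of the async API is overlapping several model calls on one event loop. The sketch below shows that pattern with `asyncio.gather`; `ask` is a hypothetical stand-in coroutine (in a real script you would await the agent's async method there instead), so the example runs without a model server:

```python
import asyncio

# Hypothetical stand-in for an awaitable agent call; a real script
# would await the agent's async method here instead (assumption:
# agno exposes awaitable methods, as aprint_response above suggests).
async def ask(prompt: str) -> str:
    await asyncio.sleep(0.01)  # simulate model latency
    return f"answer: {prompt}"

async def main() -> list[str]:
    prompts = ["Share a breakfast recipe.", "Share a lunch recipe."]
    # gather() schedules both coroutines on the event loop at once,
    # so their waits overlap instead of running back to back.
    return await asyncio.gather(*(ask(p) for p in prompts))

results = asyncio.run(main())
print(results)
```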