Code

cookbook/models/ollama/ollama_cloud.py
from agno.agent import Agent
from agno.models.ollama import Ollama

agent = Agent(
    model=Ollama(id="deepseek-v3.1:671b", host="https://ollama.com"),
)

agent.print_response("How many r's in the word 'strawberry'?", stream=True)
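
If you want the answer back as a value instead of streaming it to the terminal, the same agent can also be called with run(); this is a minimal sketch of that variant, assuming agno's RunResponse exposes the generated text on .content:

from agno.agent import Agent
from agno.models.ollama import Ollama

agent = Agent(
    model=Ollama(id="deepseek-v3.1:671b", host="https://ollama.com"),
)

# run() returns the full response object rather than printing it;
# .content is assumed to hold the generated text.
response = agent.run("How many r's in the word 'strawberry'?")
print(response.content)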

Usage

1. Create a virtual environment

Open the Terminal and create a Python virtual environment.
python3 -m venv .venv
source .venv/bin/activate

2. Set up Ollama Cloud API Key

Sign up at ollama.com and get your API key, then export it:
export OLLAMA_API_KEY=your_api_key_here
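
Before running the agent, a quick check that the key is actually visible to Python can save a confusing authentication error later (an optional sketch, not part of the cookbook file):

import os

# Fail fast with a clear message if the key was not exported in this shell.
if not os.getenv("OLLAMA_API_KEY"):
    raise RuntimeError("OLLAMA_API_KEY is not set; export it before running the agent.")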

3. Install libraries

pip install -U ollama agno

4. Run Agent

python cookbook/models/ollama/ollama_cloud.py

Key Features

  • No local setup required: Access powerful models instantly without downloading or managing local installations
  • Production-ready: Enterprise-grade infrastructure with reliable uptime and performance
  • Wide model selection: Access to popular models like DeepSeek, Qwen, Phi, and more
  • Automatic configuration: When api_key is provided, the host automatically defaults to https://ollama.com
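
As a concrete illustration of the last point, the cookbook example can omit host entirely and pass only the key (a minimal sketch; the literal key string is a placeholder):

from agno.agent import Agent
from agno.models.ollama import Ollama

# With api_key set, the host defaults to https://ollama.com automatically,
# so it does not need to be passed explicitly.
agent = Agent(
    model=Ollama(id="deepseek-v3.1:671b", api_key="your_api_key_here"),
)

agent.print_response("How many r's in the word 'strawberry'?", stream=True)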