Code

cookbook/models/ollama/basic.py
from agno.agent import Agent, RunResponse  # noqa
from agno.models.ollama import Ollama

agent = Agent(model=Ollama(id="llama3.1:8b"), markdown=True)

# Get the response in a variable
# run: RunResponse = agent.run("Share a 2 sentence horror story")
# print(run.content)

# Print the response in the terminal
agent.print_response("Share a 2 sentence horror story")

Usage

1. Create a virtual environment

Open the Terminal and create a Python virtual environment.

python3 -m venv .venv
source .venv/bin/activate
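
On Windows, activate the environment with the Scripts path instead:

.venv\Scripts\activate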

2. Install Ollama

Follow the Ollama installation guide and run:

ollama pull llama3.1:8b
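
To confirm the model was downloaded and that the local Ollama server is running, you can list the available models:

ollama list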

3. Install libraries

pip install -U ollama agno

4. Run Agent

python cookbook/models/ollama/basic.py
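
By default the agent talks to a local Ollama server at http://localhost:11434. If your server runs elsewhere, the model can be pointed at it when it is constructed; this is a hedged sketch, and the host parameter is an assumption based on the underlying ollama client rather than something shown in the snippet above:

from agno.agent import Agent
from agno.models.ollama import Ollama

# host is assumed to be forwarded to the underlying ollama client;
# adjust the URL to wherever your Ollama server is listening.
agent = Agent(model=Ollama(id="llama3.1:8b", host="http://localhost:11434"), markdown=True)
agent.print_response("Share a 2 sentence horror story")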