This example shows how to add content from an Amazon S3 bucket to your knowledge base, letting you index documents stored in cloud storage without having to download them yourself.

Code

s3.py
import asyncio
from agno.agent import Agent
from agno.db.postgres import PostgresDb
from agno.knowledge.knowledge import Knowledge
from agno.knowledge.remote_content.remote_content import S3Content
from agno.vectordb.pgvector import PgVector

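# Database for tracking the contents added to the knowledge base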
contents_db = PostgresDb(
    db_url="postgresql+psycopg://ai:ai@localhost:5532/ai",
    knowledge_table="knowledge_contents",
)

# Create Knowledge Instance
knowledge = Knowledge(
    name="Basic SDK Knowledge Base",
    description="Agno 2.0 Knowledge Implementation",
    contents_db=contents_db,
    vector_db=PgVector(
        table_name="vectors", db_url="postgresql+psycopg://ai:ai@localhost:5532/ai"
    ),
)

# Add from S3 bucket
asyncio.run(
    knowledge.add_content_async(
        name="S3 PDF",
        remote_content=S3Content(
            bucket_name="agno-public", key="recipes/ThaiRecipes.pdf"
        ),
        metadata={"remote_content": "S3"},
    )
)

agent = Agent(
    name="My Agent",
    description="Agno 2.0 Agent Implementation",
    knowledge=knowledge,
    search_knowledge=True,
    debug_mode=True,
)

agent.print_response(
    "What is the best way to make a Thai curry?",
    markdown=True,
)
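
If you need to ingest several objects from the same bucket, you can reuse the same call in a small coroutine. This is a sketch that reuses the knowledge instance defined above; the second key is a hypothetical placeholder.

async def add_recipes():
    # The second key is hypothetical and used only for illustration
    for key in ["recipes/ThaiRecipes.pdf", "recipes/CapeRecipes.pdf"]:
        await knowledge.add_content_async(
            name=key,
            remote_content=S3Content(bucket_name="agno-public", key=key),
            metadata={"remote_content": "S3"},
        )

asyncio.run(add_recipes())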

Usage

1. Install libraries

pip install -U agno sqlalchemy psycopg pgvector boto3

2. Configure AWS credentials

Set up your AWS credentials using one of these methods:
  • AWS CLI: aws configure
  • Environment variables: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
  • IAM roles (if running on AWS infrastructure)
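
Whichever method you use, you can confirm that your credentials resolve before running the example with a quick boto3 STS call (not part of the example script):

import boto3

# Prints your AWS account ID if credentials are picked up correctly
print(boto3.client("sts").get_caller_identity()["Account"])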

3. Run PgVector

docker run -d \
  -e POSTGRES_DB=ai \
  -e POSTGRES_USER=ai \
  -e POSTGRES_PASSWORD=ai \
  -e PGDATA=/var/lib/postgresql/data/pgdata \
  -v pgvolume:/var/lib/postgresql/data \
  -p 5532:5432 \
  --name pgvector \
  agno/pgvector:16
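
Optionally, verify that the database is reachable before running the example. The check below assumes the same connection string used in the code above:

from sqlalchemy import create_engine, text

# Assumes the pgvector container from the previous step is listening on port 5532
engine = create_engine("postgresql+psycopg://ai:ai@localhost:5532/ai")
with engine.connect() as conn:
    print(conn.execute(text("SELECT version()")).scalar())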

4. Run the example

python cookbook/knowledge/basic_operations/06_from_s3.py

Params

Parameter    Type                Default  Description
bucket_name  str                 -        Name of the S3 bucket containing the file.
bucket       Optional[S3Bucket]  -        Pre-configured S3 bucket object.
key          str                 -        S3 object key (file path) within the bucket.
object       Optional[S3Object]  -        Pre-configured S3 object.
prefix       Optional[str]       -        Path prefix for organizing files in the bucket.
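
For example, to point S3Content at a folder-like prefix instead of a single key, you could pass prefix. This is a sketch: the prefix value below is hypothetical, and whether every object under the prefix is ingested may vary by Agno version.

asyncio.run(
    knowledge.add_content_async(
        name="S3 Recipes Folder",
        remote_content=S3Content(bucket_name="agno-public", prefix="recipes/"),
        metadata={"remote_content": "S3"},
    )
)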