Agno supports Google Cloud Storage (GCS) as a storage backend for Agents via the GcsJsonDb class, which stores session data as JSON blobs in a GCS bucket.

Usage

Configure your agent with GCS storage to enable cloud-based session persistence.
gcs_for_agent.py
import uuid
import google.auth
from agno.agent import Agent
from agno.db.base import SessionType
from agno.db.gcs_json import GcsJsonDb
from agno.tools.duckduckgo import DuckDuckGoTools

# Obtain the default credentials and project ID from your gcloud CLI session.
credentials, project_id = google.auth.default()

# Generate a unique bucket name using a base name and a UUID4 suffix.
base_bucket_name = "example-gcs-bucket"
unique_bucket_name = f"{base_bucket_name}-{uuid.uuid4().hex[:12]}"
print(f"Using bucket: {unique_bucket_name}")

# Initialize GcsJsonDb with explicit credentials, a unique bucket name, and the project.
db = GcsJsonDb(
    bucket_name=unique_bucket_name,
    prefix="agent/",
    project=project_id,
    credentials=credentials,
)

# Initialize the Agno agent with the new storage backend and a DuckDuckGo search tool.
agent1 = Agent(
    db=db,
    tools=[DuckDuckGoTools()],
    add_history_to_context=True,
    debug_mode=False,
)

# Execute sample queries.
agent1.print_response("How many people live in Canada?")
agent1.print_response("What is their national anthem called?")

# Create a new agent with the same session_id and confirm it continues the conversation.
agent2 = Agent(
    db=db,
    session_id=agent1.session_id,
    tools=[DuckDuckGoTools()],
    add_history_to_context=True,
    debug_mode=False,
)

agent2.print_response("What's the name of the country we discussed?")
agent2.print_response("What is that country's national sport?")

Prerequisites

1. Google Cloud SDK Setup

  1. Install the Google Cloud SDK
  2. Run gcloud init to configure your account and project
2. GCS Permissions

Ensure your account has sufficient permissions (e.g., Storage Admin) to create and manage GCS buckets:
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member="user:YOUR_EMAIL@example.com" \
    --role="roles/storage.admin"
3. Authentication

Use default credentials from your gcloud CLI session:
gcloud auth application-default login
Alternatively, if using a service account, set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of your service account JSON file.
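For example, here is a minimal sketch that loads a service-account key explicitly and hands it to GcsJsonDb (the key path and bucket name are illustrative):

from google.oauth2 import service_account
from agno.db.gcs_json import GcsJsonDb

# Illustrative path; point this at your own service account key file.
creds = service_account.Credentials.from_service_account_file("/path/to/service-account.json")

db = GcsJsonDb(
    bucket_name="example-gcs-bucket",
    prefix="agent/",
    project=creds.project_id,  # service account keys embed the project ID
    credentials=creds,
)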
4. Python Dependencies

Install the required Python packages:
pip install google-auth google-cloud-storage openai ddgs
Local Testing with Fake GCS

1. Setup with Docker

For local testing without using real GCS, you can use fake-gcs-server. Create a docker-compose.yml file:
version: '3.8'
services:
  fake-gcs-server:
    image: fsouza/fake-gcs-server:latest
    ports:
      - "4443:4443"
    command: ["-scheme", "http", "-port", "4443", "-public-host", "localhost"]
    volumes:
      - ./fake-gcs-data:/data
Start the fake GCS server:
docker-compose up -d
2. Using Fake GCS with Docker

Set the environment variable to direct API calls to the emulator:
export STORAGE_EMULATOR_HOST="http://localhost:4443"
python gcs_for_agent.py
When using Fake GCS, authentication isn't enforced, and the client library picks up the emulator endpoint from the STORAGE_EMULATOR_HOST environment variable.
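To sanity-check the emulator before running the agent, you can point a client at it with anonymous credentials. A minimal sketch (the bucket name is illustrative):

import os
from google.auth.credentials import AnonymousCredentials
from google.cloud import storage

# Direct the client library at the local emulator.
os.environ["STORAGE_EMULATOR_HOST"] = "http://localhost:4443"

# Fake GCS does not enforce auth, so anonymous credentials are enough.
client = storage.Client(credentials=AnonymousCredentials(), project="test-project")
bucket = client.create_bucket("smoke-test-bucket")
print(f"Created bucket: {bucket.name}")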

Params

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| bucket_name | str | - | Name of the GCS bucket where JSON files will be stored. |
| prefix | Optional[str] | "agno/" | Path prefix for organizing files in the bucket. |
| session_table | Optional[str] | - | Name of the JSON file to store sessions (without the .json extension). |
| memory_table | Optional[str] | - | Name of the JSON file to store user memories. |
| metrics_table | Optional[str] | - | Name of the JSON file to store metrics. |
| eval_table | Optional[str] | - | Name of the JSON file to store evaluation runs. |
| knowledge_table | Optional[str] | - | Name of the JSON file to store knowledge content. |
| project | Optional[str] | None | GCP project ID. If None, the default project is used. |
| credentials | Optional[Any] | None | GCP credentials. If None, the default credentials are used. |
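For example, here is a sketch that overrides the default file names (the bucket name and prefix are illustrative):

from agno.db.gcs_json import GcsJsonDb

db = GcsJsonDb(
    bucket_name="my-agno-bucket",
    prefix="prod/",
    session_table="prod_sessions",  # stored as prod_sessions.json under the prefix (assumed layout)
    memory_table="prod_memories",
)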

Developer Resources