XAI
Live Search Streaming Agent
Code
cookbook/models/xai/live_search_agent_stream.py
from agno.agent import Agent
from agno.models.xai import xAI

# Agent backed by xAI's grok-3 with Live Search enabled via search_parameters.
agent = Agent(
    model=xAI(
        id="grok-3",
        search_parameters={
            "mode": "on",               # always perform a live search
            "max_search_results": 20,   # cap the number of search results used
            "return_citations": True,   # include source citations in the response
        },
    ),
    markdown=True,
)

# Stream the response to the terminal as it is generated.
agent.print_response(
    "Provide me a digest of world news in the last 24 hours.", stream=True
)
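If you want to capture the output instead of printing a stream, the same agent can be invoked without streaming. The snippet below is a minimal sketch that assumes agno's standard Agent.run method returning a response object with a content attribute; the prompt is illustrative.

# Minimal sketch (assumes Agent.run returns a RunResponse with .content).
response = agent.run("Summarize today's top technology headlines.")
print(response.content)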
Usage
1. Create a virtual environment
Open the Terminal and create a Python virtual environment.
python3 -m venv .venv
source .venv/bin/activate
2. Set your API key
export XAI_API_KEY=xxx
3. Install libraries
pip install -U openai agno
4. Run the Agent
python cookbook/models/xai/live_search_agent_stream.py