Pipedream LinkedIn
This example shows how to use the LinkedIn Pipedream MCP server with Agno Agents.
Code
"""
💻 Pipedream LinkedIn MCP
This example shows how to use Pipedream MCP servers (in this case the LinkedIn one) with Agno Agents.
1. Connect your Pipedream and LinkedIn accounts: https://mcp.pipedream.com/app/linkedin
2. Get your Pipedream MCP server url: https://mcp.pipedream.com/app/linkedin
3. Set the MCP_SERVER_URL environment variable to the MCP server url you got above
4. Install dependencies: pip install agno mcp-sdk
"""
import asyncio
import os
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.tools.mcp import MCPTools
from agno.utils.log import log_exception
mcp_server_url = os.getenv("MCP_SERVER_URL")
async def run_agent(task: str) -> None:
try:
async with MCPTools(
url=mcp_server_url, transport="sse", timeout_seconds=20
) as mcp:
agent = Agent(
model=OpenAIChat(id="gpt-4o-mini"),
tools=[mcp],
markdown=True,
)
await agent.aprint_response(
message=task,
stream=True,
)
except Exception as e:
log_exception(f"Unexpected error: {e}")
if __name__ == "__main__":
asyncio.run(
run_agent("Check the Pipedream organization on LinkedIn and tell me about it")
)