# MCP
The [Model Context Protocol](https://modelcontextprotocol.io/) (MCP) is an open standard for connecting AI applications to external resources. Mirascope's `llm.mcp` module allows you to connect to MCP servers and use their tools with any LLM provider.
<Info>
MCP tools are always async. You'll want to read the [Tools](/docs/learn/llm/tools) and [Async](/docs/learn/llm/async) guides first.
</Info>
In the example below, we connect to the MCP server of [FastMCP](https://gofastmcp.com/getting-started/welcome), a library for creating MCP servers. Their server exposes a `"SearchFastMcp"` tool, which our agent uses to research FastMCP on its own.
<TabbedSection>
<Tab value="Call">
```python
import asyncio

from mirascope import llm


async def main():
    async with llm.mcp.streamable_http_client("https://gofastmcp.com/mcp") as client:
        tools = await client.list_tools()

        @llm.call("openai/gpt-5-mini", tools=tools)
        async def assistant(query: str):
            return query

        response = await assistant("Give me a getting started primer on FastMCP.")
        while response.tool_calls:
            tool_outputs = await response.execute_tools()
            response = await response.resume(tool_outputs)
        print(response.pretty())


asyncio.run(main())
```
</Tab>
<Tab value="Prompt">
```python
import asyncio

from mirascope import llm


async def main():
    async with llm.mcp.streamable_http_client("https://gofastmcp.com/mcp") as client:
        tools = await client.list_tools()

        @llm.prompt(tools=tools)
        async def assistant(query: str):
            return query

        response = await assistant(
            "openai/gpt-5-mini", "Give me a getting started primer on FastMCP."
        )
        while response.tool_calls:
            tool_outputs = await response.execute_tools()
            response = await response.resume(tool_outputs)
        print(response.pretty())


asyncio.run(main())
```
</Tab>
<Tab value="Model">
```python
import asyncio

from mirascope import llm


async def main():
    async with llm.mcp.streamable_http_client("https://gofastmcp.com/mcp") as client:
        tools = await client.list_tools()

        model = llm.use_model("openai/gpt-5-mini")
        response = await model.call_async(
            "Give me a getting started primer on FastMCP.",
            tools=tools,
        )
        while response.tool_calls:
            tool_outputs = await response.execute_tools()
            response = await response.resume(tool_outputs)
        print(response.pretty())


asyncio.run(main())
```
</Tab>
</TabbedSection>
<Note>
The call definition and invocation must happen inside the `async with` block. The MCP client maintains a live connection to the server, and tools retrieved from `client.list_tools()` use that connection when executed: each tool call runs on the MCP server itself, which returns the tool output.
</Note>
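You don't have to inline the whole agent loop, though. As long as the call is invoked while the connection is open, you can factor the loop into a helper that takes the tools as a parameter. A minimal sketch of this pattern, reusing the FastMCP server from above (the `run_agent` helper is our own illustration, not a Mirascope API):
```python
import asyncio

from mirascope import llm


async def run_agent(tools, query: str) -> str:
    # The call is defined here, but it is only ever invoked while the caller
    # still holds the open MCP connection that produced `tools`.
    @llm.call("openai/gpt-5-mini", tools=tools)
    async def assistant(query: str):
        return query

    response = await assistant(query)
    while response.tool_calls:
        tool_outputs = await response.execute_tools()
        response = await response.resume(tool_outputs)
    return response.pretty()


async def main():
    async with llm.mcp.streamable_http_client("https://gofastmcp.com/mcp") as client:
        tools = await client.list_tools()
        # The helper runs to completion before the block exits, so every tool
        # execution happens over the live connection.
        print(await run_agent(tools, "Give me a getting started primer on FastMCP."))


asyncio.run(main())
```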
## The MCP Client
The `llm.mcp` module provides `MCPClient`, a wrapper around the MCP `ClientSession`. You create a client using one of three async context managers, depending on how your MCP server is hosted. Each returns an `MCPClient` with a `list_tools()` method that retrieves the server's tools as Mirascope `AsyncTool`s.
### Streamable HTTP
For MCP servers exposed via HTTP endpoints:
```python
async with llm.mcp.streamable_http_client("https://example.com/mcp") as client:
    tools = await client.list_tools()
```
### stdio
For local MCP servers that run as subprocesses. The client launches the process and communicates via stdin/stdout:
```python
from mcp.client.stdio import StdioServerParameters

server_params = StdioServerParameters(
    command="python",
    args=["path/to/server.py"],
)

async with llm.mcp.stdio_client(server_params) as client:
    tools = await client.list_tools()
```
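The `path/to/server.py` above can be any MCP server that speaks stdio. For example, a minimal FastMCP server might look roughly like this (a sketch based on FastMCP's documented `@mcp.tool` decorator; the `add` tool is purely illustrative):
```python
# server.py -- a minimal FastMCP server sketch (illustrative only)
from fastmcp import FastMCP

mcp = FastMCP("Demo")


@mcp.tool
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b


if __name__ == "__main__":
    mcp.run()  # stdio is FastMCP's default transport
```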
### SSE (Server-Sent Events)
For servers using server-sent events:
```python
async with llm.mcp.sse_client("http://localhost:8000/sse") as client:
    tools = await client.list_tools()
```
## Combining with Mirascope Tools
MCP tools can be combined with regular Mirascope tools. Since MCP tools are always async, any Mirascope tools you combine with them must also be async.
In this example, we add a local `search_codebase` tool that searches our own code, then combine it with MCP tools. This lets the agent research external documentation while also exploring our local implementation:
```python
import asyncio

from mirascope import llm


@llm.tool
async def search_codebase(pattern: str) -> str:
    """Search the local codebase for a pattern using ripgrep."""
    proc = await asyncio.create_subprocess_exec(
        "rg",
        "--max-count=5",
        pattern,
        "./",
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    stdout, _ = await proc.communicate()
    return stdout.decode() or "No matches found."


async def main():
    async with llm.mcp.streamable_http_client("https://gofastmcp.com/mcp") as client:
        mcp_tools = await client.list_tools()
        all_tools = [search_codebase, *mcp_tools]

        @llm.call("openai/gpt-5-mini", tools=all_tools)
        async def assistant(query: str):
            return query

        response = await assistant(
            "How does FastMCP handle tool registration? "
            "Search the FastMCP docs, then check our codebase for similar patterns."
        )
        while response.tool_calls:
            print(response.pretty())
            tool_outputs = await response.execute_tools()
            response = await response.resume(tool_outputs)
        print(response.pretty())


asyncio.run(main())
```
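The same idea extends to tools from multiple MCP servers: open one client per server and merge the tool lists, keeping every connection open while the tools are in use. A minimal sketch (the second server URL is purely illustrative):
```python
import asyncio

from mirascope import llm


async def main():
    async with (
        llm.mcp.streamable_http_client("https://gofastmcp.com/mcp") as docs_client,
        llm.mcp.streamable_http_client("https://example.com/mcp") as other_client,
    ):
        # Both connections stay open for the whole agent loop below.
        tools = [
            *(await docs_client.list_tools()),
            *(await other_client.list_tools()),
        ]

        @llm.call("openai/gpt-5-mini", tools=tools)
        async def assistant(query: str):
            return query

        response = await assistant("Summarize what each of these servers offers.")
        while response.tool_calls:
            tool_outputs = await response.execute_tools()
            response = await response.resume(tool_outputs)
        print(response.pretty())


asyncio.run(main())
```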
## Next Steps
- [Tools](/docs/learn/llm/tools) — Tool calling fundamentals
- [Async](/docs/learn/llm/async) — Async patterns
- [Agents](/docs/learn/llm/agents) — Build agents with MCP tools