# Agents

Agents are LLM-powered systems that use tools to accomplish tasks on your behalf. The core pattern is simple: call the LLM, execute any requested tools, and repeat until done.

Here's a fully functional coding agent built with Mirascope. It uses [Thinking](/docs/learn/llm/thinking) to reason through problems, [Tools](/docs/learn/llm/tools) to read and write files, and [Streaming](/docs/learn/llm/streaming) for real-time feedback as it works.

```python
import json
import tempfile
from pathlib import Path

from mirascope import llm

# For safety, the agent may only operate within this workspace.
WORKSPACE = Path(tempfile.mkdtemp(prefix="agent_"))


def resolve_path(path_str: str) -> Path | None:
    """Resolve a path within the workspace, returning None if it escapes."""
    path = (WORKSPACE / path_str).resolve()
    if WORKSPACE.resolve() not in path.parents and path != WORKSPACE.resolve():
        return None
    return path


@llm.tool
def list_files(directory: str = ".") -> str:
    """List files in a directory."""
    path = resolve_path(directory)
    if path is None:
        return "Error: Path is outside the workspace"
    if not path.exists():
        return f"Directory not found: {directory}"
    files = [f.name + ("/" if f.is_dir() else "") for f in path.iterdir()]
    return "\n".join(files) if files else "(empty directory)"


@llm.tool
def read_file(filepath: str) -> str:
    """Read the contents of a file."""
    path = resolve_path(filepath)
    if path is None:
        return "Error: Path is outside the workspace"
    if not path.exists():
        return f"File not found: {filepath}"
    return path.read_text()


@llm.tool
def write_file(filepath: str, content: str) -> str:
    """Write content to a file."""
    path = resolve_path(filepath)
    if path is None:
        return "Error: Path is outside the workspace"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(content)
    return f"Wrote {len(content)} bytes to {filepath}"


def display_tool_call(tool_call: llm.ToolCall) -> str:
    args = json.loads(tool_call.args)
    match tool_call.name:
        case "list_files":
            return f"[Tool] List files in '{args.get('directory', '.')}'"
        case "read_file":
            return f"[Tool] Read '{args['filepath']}'"
        case "write_file":
            return f"[Tool] Write '{args['filepath']}'"
        case _:
            return f"[Tool]: {tool_call.name}"


def run_agent(model_id: llm.ModelId, query: str):
    model = llm.model(model_id, thinking={"level": "medium", "include_thoughts": True})
    response = model.stream(query, tools=[list_files, read_file, write_file])

    while True:  # The Agent Loop
        for stream in response.streams():
            match stream.content_type:
                case "text":
                    for chunk in stream:
                        print(chunk, flush=True, end="")
                    print("\n")
                case "thought":
                    print("<Thinking>\n", flush=True)
                    for chunk in stream:
                        print(chunk, flush=True, end="")
                    print("</Thinking>\n", flush=True)
                case "tool_call":
                    tool_call = stream.collect()
                    print(display_tool_call(tool_call) + "\n")

        if not response.tool_calls:
            break  # Agent is finished.
        response = response.resume(response.execute_tools())


run_agent(
    "anthropic/claude-sonnet-4-5",
    "Create a calculator module (calc.py) with add, subtract, multiply, divide "
    "functions, then create a test file (test_calc.py) that tests each function.",
)

print(f"View the agent's work in this directory: {WORKSPACE}\n")
```

This example is simple, but it demonstrates the power of Mirascope's abstractions. It serves as a starting point for building more sophisticated agents.
Some directions to explore:

- [Context](/docs/learn/llm/context) — Encapsulate the workspace the agent operates in
- [MCP](/docs/learn/llm/mcp) — Add capabilities like a language server for smarter code assistance
- [Reliability](/docs/learn/llm/reliability) — Handle failures gracefully with retries and fallbacks
- Human-in-the-loop — Let the agent ask clarifying questions or request feedback (see the sketch below)

## Next Steps

- [Tools](/docs/learn/llm/tools) — Tool calling fundamentals
- [Streaming](/docs/learn/llm/streaming) — Streaming patterns in depth
- [Async](/docs/learn/llm/async) — Concurrent tool execution
- [Context](/docs/learn/llm/context) — Inject dependencies into agent tools
- [MCP](/docs/learn/llm/mcp) — Connect to external tool servers
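As a minimal sketch of the human-in-the-loop direction above, one approach is to give the agent a tool that pauses and asks the human operator for input. This is an illustration rather than part of Mirascope's API: the `ask_user` name and its use of Python's built-in `input()` are assumptions made for this example, and only the `@llm.tool` decorator comes from the agent above.

```python
from mirascope import llm


@llm.tool
def ask_user(question: str) -> str:
    """Ask the human operator a clarifying question and return their answer."""
    # Illustrative only: blocks on stdin. A real application might route the
    # question through a UI, chat channel, or task queue instead.
    print(f"[Agent asks] {question}")
    return input("> ")
```

Passing `ask_user` alongside the file tools (for example, `tools=[list_files, read_file, write_file, ask_user]`) lets the model pause to request clarification; the agent loop itself needs no changes, because the answer flows back through `response.execute_tools()` like any other tool result.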
