# Welcome to Mirascope

The complete toolkit for building LLM-powered applications.

- **Provider-agnostic**: One API for OpenAI, Anthropic, Google, and more
- **Observable by default**: Tracing, versioning, and analytics built in
- **Production-ready**: Tools, structured outputs, and reliable error handling
- **Cloud-native**: Dashboard, cost tracking, and team collaboration

## Get Started

### 1. Create a Mirascope Cloud account

[Sign up for Mirascope Cloud](/cloud/login) to get your API key. Your key will be generated during onboarding, or you can create one later in [settings](/cloud/settings/api-keys).

### 2. Install Mirascope

<TabbedSection>
<Tab value="uv">

```bash
uv add "mirascope[all]"
```

</Tab>
<Tab value="pip">

```bash
pip install "mirascope[all]"
```

</Tab>
</TabbedSection>

### 3. Set your API key

```bash
export MIRASCOPE_API_KEY="your-api-key"
```

If the key doesn't seem to be picked up when you run the examples below, see the environment check sketch at the end of this page.

<Info>
Your Mirascope API key enables two features:

- **Router**: A single key for multiple LLM providers (OpenAI, Anthropic, Google)
- **Cloud**: Automatic tracing and analytics in your dashboard

To use the Router, purchase credits in [Billing](/cloud/settings/billing). Prefer to use your own provider keys? See [Providers](/docs/learn/llm/providers) for configuration.
</Info>

## Your First Agent

Here's a complete agent that uses tools, tracing, and versioning:

```python
from typing import Literal

from mirascope import llm, ops

# Connect to Mirascope Cloud for tracing and analytics
ops.configure()
ops.instrument_llm()


@llm.tool
@ops.trace
def calculate(
    operation: Literal["add", "subtract", "multiply", "divide"],
    a: float,
    b: float,
) -> str:
    """Perform a mathematical operation on two numbers."""
    match operation:
        case "add":
            return str(a + b)
        case "subtract":
            return str(a - b)
        case "multiply":
            return str(a * b)
        case "divide":
            return str(a / b) if b != 0 else "Cannot divide by zero"


@ops.version  # Automatically versions `math_agent` and traces its execution
@llm.call("openai/gpt-4o-mini", tools=[calculate])
def math_agent(query: str) -> str:
    return f"Help the user with: {query}"


@ops.trace
def run_math_agent(query: str) -> str:
    response = math_agent(query)
    while response.tool_calls:
        tool_outputs = response.execute_tools()
        response = response.resume(tool_outputs)
    return response.text()


print(run_math_agent("What's 42 * 17?"))
```

This example shows the core Mirascope patterns:

- **Tools** let the LLM call your functions (traced with `@ops.trace`)
- **Versioning** with `@ops.version` tracks changes to your prompts and traces automatically
- **Tracing** on `run_math_agent` captures the entire agent loop as a nested trace
- **Router** routes `"openai/gpt-4o-mini"` through your Mirascope API key

## View Your Traces

After running the example, view your traces at [Mirascope Cloud](/cloud/dashboard). You'll see execution time, token usage, costs, and the full call flow.

## What's Next

| Learning Path | Topics |
| --- | --- |
| [LLM Quickstart](/docs/quickstart) | Messages, Calls, Tools, Structured Output, Streaming, Agents |
| [Ops Overview](/docs/learn/ops) | Configuration, Tracing, Sessions, Versioning |
| [API Reference](/docs/api) | Full API documentation |
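One troubleshooting note: if the agent example fails to authenticate, the key from step 3 may not be visible to your Python process (for example, when your editor or notebook was launched before the `export`). The sketch below uses only the standard library and is not part of the Mirascope API; it assumes, as step 3 implies, that the key is read from the environment, and simply checks for it before you call `ops.configure()`:

```python
import os

# Environment check (a sketch, not part of the Mirascope API): confirm the
# MIRASCOPE_API_KEY exported in step 3 is visible to this Python process.
api_key = os.environ.get("MIRASCOPE_API_KEY")
if not api_key:
    raise RuntimeError(
        "MIRASCOPE_API_KEY is not set. Export it in your shell or load it "
        "from a .env file before calling ops.configure()."
    )
print(f"Found MIRASCOPE_API_KEY ({len(api_key)} characters).")
```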
