# LLM Overview

<div className="badge-container">
  <a href="https://github.com/Mirascope/mirascope/actions/workflows/ci.yml" target="_blank"><img src="https://github.com/Mirascope/mirascope/actions/workflows/ci.yml/badge.svg?branch=main" alt="Tests"/></a>
  <a href="https://app.codecov.io/github/Mirascope/mirascope" target="_blank"><img src="https://codecov.io/github/Mirascope/mirascope/graph/badge.svg?token=HAEAWT3KC9" alt="Coverage"/></a>
  <a href="https://pypi.org/project/mirascope/" target="_blank"><img src="https://img.shields.io/pypi/v/mirascope.svg" alt="PyPI Version"/></a>
  <a href="https://pypi.org/project/mirascope/" target="_blank"><img src="https://img.shields.io/pypi/pyversions/mirascope.svg" alt="Python Versions"/></a>
  <a href="https://github.com/Mirascope/mirascope/blob/main/python/LICENSE" target="_blank"><img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="License"/></a>
</div>

The `mirascope.llm` module is a Python library for building LLM-powered applications. It provides type-safe, provider-agnostic abstractions that make it easy to call LLMs, use tools, and get structured output.

## Core Concepts

| Concept | Description |
| --- | --- |
| [Messages](/docs/learn/llm/messages) | Understand message types and multimodal content |
| [Models](/docs/learn/llm/models) | Configure and call LLMs across providers |
| [Responses](/docs/learn/llm/responses) | Work with LLM responses and continue conversations |
| [Prompts](/docs/learn/llm/prompts) | Create reusable prompt functions with `@llm.prompt` |
| [Calls](/docs/learn/llm/calls) | Bundle models with prompts using `@llm.call` |

## Features

| Feature | Description |
| --- | --- |
| [Thinking](/docs/learn/llm/thinking) | Use extended reasoning capabilities |
| [Tools](/docs/learn/llm/tools) | Let LLMs call your functions |
| [Structured Output](/docs/learn/llm/structured-output) | Get typed responses with Pydantic models |
| [Streaming](/docs/learn/llm/streaming) | Stream responses in real time |
| [Async](/docs/learn/llm/async) | Async operations for concurrent calls |

## Advanced Topics

| Topic | Description |
| --- | --- |
| [Agents](/docs/learn/llm/agents) | Build autonomous agent systems |
| [Context](/docs/learn/llm/context) | Manage call context and model resolution |
| [Chaining](/docs/learn/llm/chaining) | Chain prompts for multi-step workflows |
| [Errors](/docs/learn/llm/errors) | Handle errors and edge cases |
| [Reliability](/docs/learn/llm/reliability) | Retries, validation, and resilience |
| [Providers](/docs/learn/llm/providers) | Provider-specific configuration |
| [Local Models](/docs/learn/llm/local-models) | Run models locally |
| [MCP](/docs/learn/llm/mcp) | Model Context Protocol integration |

## Quick Example

```python
from mirascope import llm


@llm.call("openai/gpt-4o-mini")
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


response = recommend_book("fantasy")
print(response.text())
```

For a complete introduction, see the [Quickstart](/docs/learn/llm/quickstart).
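The sketches below preview a few of the features listed above. They are illustrative approximations built on the Quick Example's pattern, not verbatim API documentation; the linked feature pages are authoritative. First, structured output: this sketch assumes a `response_model=` parameter on `@llm.call` (the spelling used in earlier Mirascope releases) that parses the response into a Pydantic model.

```python
from pydantic import BaseModel

from mirascope import llm


class Book(BaseModel):
    title: str
    author: str


# Assumption: `response_model=` validates the LLM's output against the
# Pydantic model and returns a `Book` instance. The current parameter name
# may differ; see the Structured Output page.
@llm.call("openai/gpt-4o-mini", response_model=Book)
def extract_book(text: str) -> str:
    return f"Extract the book from: {text}"


book = extract_book("The Name of the Wind by Patrick Rothfuss")
print(book.title, "-", book.author)  # typed attribute access, no parsing
```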
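Tools let the model request calls to your own functions. This sketch assumes `@llm.call` accepts a `tools=` list of plain Python functions, as in earlier Mirascope releases; `get_availability` is a hypothetical stub for illustration.

```python
from mirascope import llm


def get_availability(title: str) -> str:
    """Hypothetical stub: check whether a book is in stock."""
    return f"'{title}' is available in paperback"


# Assumption: `tools=` registers functions the model may ask to invoke,
# using each function's signature and docstring as its schema.
@llm.call("openai/gpt-4o-mini", tools=[get_availability])
def check_book(title: str) -> str:
    return f"Is {title} available?"


response = check_book("The Name of the Wind")
# Executing any tool calls the response requests, and resuming the
# conversation with their results, is covered on the Tools page.
print(response.text())
```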
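Streaming delivers the response incrementally instead of all at once. This sketch assumes a `stream=True` flag and chunk objects with a `.content` attribute, which is how earlier Mirascope releases exposed streams; the current chunk shape may differ, so consult the Streaming page.

```python
from mirascope import llm


# Assumption: `stream=True` makes the call return an iterable of chunks
# rather than a single response.
@llm.call("openai/gpt-4o-mini", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


for chunk in recommend_book("fantasy"):
    # Assumption: each chunk carries a text fragment in `.content`.
    print(chunk.content, end="", flush=True)
```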
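Finally, async: this sketch assumes `@llm.call` wraps `async def` functions the same way it wraps synchronous ones, so independent calls can run concurrently with standard `asyncio` tooling.

```python
import asyncio

from mirascope import llm


# Assumption: decorating an async function yields an awaitable call.
@llm.call("openai/gpt-4o-mini")
async def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


async def main() -> None:
    # Run two recommendations concurrently rather than back to back.
    fantasy, mystery = await asyncio.gather(
        recommend_book("fantasy"), recommend_book("mystery")
    )
    print(fantasy.text())
    print(mystery.text())


asyncio.run(main())
```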
