Mirascope
Mirascope is a powerful, flexible, and user-friendly library that simplifies the process of working with LLMs through a unified interface that works across various supported providers, including OpenAI, Anthropic, Mistral, Google (Gemini/Vertex), Groq, Cohere, LiteLLM, Azure AI, and Bedrock.
Whether you're generating text, extracting structured information, or developing complex AI-driven agent systems, Mirascope provides the tools you need to streamline your development process and create powerful, robust applications.
Getting Started
Install Mirascope, specifying the provider you intend to use, and set your API key:
pip install "mirascope[openai]"
export OPENAI_API_KEY=XXXX
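The same pattern applies to any other supported provider; for example, using Anthropic instead would look like this (the extra name and environment variable below follow the usual convention and are shown as an illustration):
pip install "mirascope[anthropic]"
export ANTHROPIC_API_KEY=XXXX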
Mirascope API
Mirascope provides a consistent, easy-to-use API across all providers:
Mirascope
from mirascope import llm
from pydantic import BaseModel


class Book(BaseModel):
    """An extracted book."""

    title: str
    author: str


@llm.call(
    provider="openai",
    model="gpt-4o-mini",
    response_model=Book,
)
def extract_book(text: str) -> str:
    return f"Extract {text}"


book: Book = extract_book("The Name of the Wind by Patrick Rothfuss")
print(book)
# Output: title='The Name of the Wind' author='Patrick Rothfuss'
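Because the call itself is provider-agnostic, switching providers is typically just a change to the decorator arguments; the prompt function and response model stay the same. A sketch (the Anthropic model name here is illustrative):

# Only the decorator arguments change when swapping providers.
@llm.call(
    provider="anthropic",
    model="claude-3-5-sonnet-latest",
    response_model=Book,
)
def extract_book(text: str) -> str:
    return f"Extract {text}"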
Provider SDK Equivalent
For comparison, here's how you would achieve the same result using the provider's native SDK:
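Below is a minimal sketch of the equivalent using the OpenAI Python SDK's structured-output parse helper (method names assume a recent openai-python release):

from openai import OpenAI
from pydantic import BaseModel

client = OpenAI()


class Book(BaseModel):
    """An extracted book."""

    title: str
    author: str


def extract_book(text: str) -> Book:
    # Ask the model to return output matching the Book schema.
    completion = client.beta.chat.completions.parse(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Extract {text}"}],
        response_format=Book,
    )
    return completion.choices[0].message.parsed


book = extract_book("The Name of the Wind by Patrick Rothfuss")
print(book)
# Output: title='The Name of the Wind' author='Patrick Rothfuss'

Note how the Mirascope version collapses the message construction, API call, and response parsing into a single decorated prompt function, while the native SDK version handles each step explicitly.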
We're excited to see what you'll build with Mirascope, and we're here to help! Don't hesitate to reach out :)