Mirascope
The AI Engineer's Developer Stack
LLM abstractions that aren't obstructions
Mirascope
from mirascope import llm
from pydantic import BaseModel

class Book(BaseModel):
    title: str
    author: str

@llm.call(
    provider="openai",
    model="gpt-4o-mini",
    response_model=Book,
)
def extract_book(text: str) -> str:
    return f"Extract the book: {text}"

text = "The Name of the Wind by Patrick Rothfuss"
# The response is parsed and validated into a Book instance.
book: Book = extract_book(text)
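Because response_model=Book tells the call to parse the completion into the Pydantic model, the result is a typed object rather than raw text. A minimal usage sketch of the snippet above; the printed values assume the model extracts the fields as prompted:

print(book.title)   # expected: "The Name of the Wind"
print(book.author)  # expected: "Patrick Rothfuss"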
Spin up your data flywheel with one line of code
Lilypad (Beta)
import lilypad
from mirascope import llm

# auto_llm=True instruments LLM calls so each one is traced automatically.
lilypad.configure(auto_llm=True)

@lilypad.trace(versioning="automatic")
@llm.call(provider="openai", model="gpt-4o-mini")
def answer_question(question: str) -> str:
    return f"Answer in one word: {question}"

answer_question("What is the capital of France?")
Traces

Version   Label   Time         Cost      Tokens
2                 1 min ago    $0.0012   24
2                 2 mins ago   $0.0011   22
1                 1 hr ago     $0.0018   36
1                 1 hr ago     $0.0019   38
Messages

user: Answer in one word: What is the capital of France?
assistant: Paris