# Langfuse
Mirascope provides out-of-the-box integration with Langfuse.

You can install the necessary packages directly or using the `langfuse` extras flag:

```bash
pip install "mirascope[langfuse]"
```
You can then use the `with_langfuse` decorator to automatically log calls:

```python
from mirascope import llm
from mirascope.integrations.langfuse import with_langfuse


@with_langfuse()
@llm.call(provider="openai", model="gpt-4o-mini")
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book."


print(recommend_book("fantasy"))
```
This will give you:

- A trace around the `recommend_book` function that captures items like the prompt template, input/output attributes, and more
- Human-readable display of the conversation with the agent
- Details of the response, including the number of tokens used
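Conceptually, the decorator wraps the decorated call and records its inputs and outputs as a trace. A minimal sketch of that wrap-and-log pattern in plain Python (this is only an illustration, not Langfuse's actual implementation, which sends this data to the Langfuse backend) might look like:

```python
import functools


def with_tracing(traces: list):
    """Illustrative decorator: record a function's inputs and output.

    A hypothetical stand-in for the wrap-and-log pattern; the real
    with_langfuse decorator reports this data to Langfuse instead of
    appending it to a local list.
    """
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            traces.append({
                "name": fn.__name__,
                "input": {"args": args, "kwargs": kwargs},
                "output": result,
            })
            return result
        return wrapper
    return decorator


traces: list = []


@with_tracing(traces)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book."


recommend_book("fantasy")
# traces now holds one entry describing the call's name, input, and output
```

Because the tracing layer is just a decorator, it composes cleanly with `@llm.call` without changing the decorated function's signature.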
*Example trace*
## Handling streams