OpenTelemetry Integration

OpenTelemetry

Mirascope provides out-of-the-box integration with OpenTelemetry.

You can install the necessary packages directly or via the otel extras flag:

pip install "mirascope[otel]"

You can then use the with_otel decorator to automatically create spans for LLM calls:

from mirascope import llm
from mirascope.integrations.otel import configure, with_otel

configure()  # set up OpenTelemetry tracing


@with_otel()  # record a span for each call to the decorated function
@llm.call(provider="openai", model="gpt-4o-mini")
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book."


print(recommend_book("fantasy"))
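Because these spans are emitted through OpenTelemetry, you can also nest them under spans you open yourself with the standard OpenTelemetry API. The sketch below assumes configure() registers the global tracer provider (which the snippet above doesn't state explicitly); the tracer and span names are arbitrary placeholders:

from opentelemetry import trace

from mirascope import llm
from mirascope.integrations.otel import configure, with_otel

configure()

# Hypothetical tracer name; any string works here
tracer = trace.get_tracer("book-recommender")


@with_otel()
@llm.call(provider="openai", model="gpt-4o-mini")
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book."


# Open a parent span so the LLM call's span shows up beneath it in your traces
with tracer.start_as_current_span("recommendation-workflow"):
    print(recommend_book("fantasy"))

Grouping calls under a parent span like this is handy when a single request fans out into several LLM calls and you want them to appear in one trace.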

Sending to an observability tool

You'll likely want to send your spans to an actual observability tool so that you can monitor your traces. There are many observability tools out there, and most of them can ingest OpenTelemetry data. You can pass processors as an argument to configure() to send the spans to the observability tool of your choice:

from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from mirascope.integrations.otel import configure

OBSERVABILITY_TOOL_ENDPOINT = "..."  # host provided by your observability tool
configure(
    processors=[
        # Batch spans in memory and export them over OTLP/HTTP
        BatchSpanProcessor(
            OTLPSpanExporter(
                endpoint=f"https://{OBSERVABILITY_TOOL_ENDPOINT}/v1/traces",
            )
        )
    ]
)
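Many hosted backends also require an authentication header alongside the endpoint. The OTLPSpanExporter accepts a headers mapping for this. In the sketch below, the header name and the OBSERVABILITY_TOOL_API_KEY value are placeholders; your tool's documentation will tell you exactly what it expects:

from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from mirascope.integrations.otel import configure

OBSERVABILITY_TOOL_ENDPOINT = "..."
OBSERVABILITY_TOOL_API_KEY = "..."  # placeholder: your tool's API key or token

configure(
    processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(
                endpoint=f"https://{OBSERVABILITY_TOOL_ENDPOINT}/v1/traces",
                # Header name varies by backend ("authorization" is just an example)
                headers={"authorization": f"Bearer {OBSERVABILITY_TOOL_API_KEY}"},
            )
        )
    ]
)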

Refer to your observability tool's documentation to find the exact endpoint and any required headers. If there is an observability backend you would like us to integrate out of the box, create a GitHub Issue or let us know in our Slack community.