mirascope.llm.call

A decorator for making provider-agnostic LLM API calls with a typed function.

Usage Documentation: Calls

This decorator enables writing provider-agnostic code by wrapping a typed function that can call any supported LLM provider's API. It parses the wrapped function's prompt template into messages and formats the input arguments into each message's template.

Example:

```python
from mirascope.llm import call


@call(provider="openai", model="gpt-4o-mini")
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


response = recommend_book("fantasy")
print(response.content)
```

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| provider | str | The LLM provider to use (e.g., "openai", "anthropic"). | required |
| model | str | The model to use for the specified provider (e.g., "gpt-4o-mini"). | required |
| stream | bool | Whether to stream the response from the API call. | required |
| tools | list[BaseTool \| Callable] | The tools available for the LLM to use. | required |
| response_model | BaseModel \| BaseType | The response model into which the response should be structured. | required |
| output_parser | Callable[[CallResponse \| ResponseModelT], Any] | A function for parsing the call response, whose return value is returned in place of the original call response. | required |
| json_mode | bool | Whether to use JSON mode. | required |
| client | object | An optional custom client to use in place of the default client. | required |
| call_params | CommonCallParams | Provider-specific parameters to use in the API call. | required |
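For instance, passing `response_model` structures the output into a Pydantic model instead of returning a raw call response. A minimal sketch, assuming an OpenAI API key is configured; `Book`, `extract_book`, and the prompt text are illustrative names for this example, not part of the API:

```python
from pydantic import BaseModel

from mirascope.llm import call


class Book(BaseModel):
    # Illustrative response model; field names are assumptions for this sketch.
    title: str
    author: str


@call(provider="openai", model="gpt-4o-mini", response_model=Book)
def extract_book(text: str) -> str:
    return f"Extract the book from this text: {text}"


# With response_model set, the decorated call returns a Book instance
# rather than the standard call response object.
book = extract_book("The Name of the Wind by Patrick Rothfuss")
print(book.title, book.author)
```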

Returns:

| Name | Type | Description |
| --- | --- | --- |
| decorator | Callable | A decorator that transforms a typed function into a provider-agnostic LLM API call that returns standardized response types regardless of the underlying provider used. |
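Because the response types are standardized, switching providers only requires changing the decorator arguments; the calling code stays the same. A minimal sketch, assuming the Anthropic integration is installed and an API key is configured; the model name here is illustrative:

```python
from mirascope.llm import call


# Same function body as the earlier example; only the decorator arguments change.
@call(provider="anthropic", model="claude-3-5-sonnet-latest")
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


response = recommend_book("fantasy")
print(response.content)  # same standardized response interface as the OpenAI call
```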