mirascope.core.litellm.call¶
See OpenAI for the full reference

We use the types found in the `openai` module for our LiteLLM integration, so please reference that module for information regarding anything beyond the LiteLLM `call` decorator.
A decorator for calling the LiteLLM API with a typed function.
Usage Documentation
This decorator wraps a typed function that calls the LiteLLM API. It parses the wrapped function's prompt template into the messages array, templating the function's input arguments into each message.
Example:

```python
from mirascope.core.litellm import litellm_call


@litellm_call("gpt-4o-mini")
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


response = recommend_book("fantasy")
print(response.content)
```
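Conceptually, the decorator treats the wrapped function's return string as the prompt and wraps it into the messages array sent to the API. A minimal illustrative sketch of that idea follows; this is not mirascope's actual implementation, and `to_messages` is a hypothetical helper:

```python
def to_messages(prompt: str) -> list[dict]:
    # Hypothetical helper: wrap a templated prompt as a single user message,
    # mirroring how a prompt-template decorator builds the messages array.
    return [{"role": "user", "content": prompt}]


def recommend_book_prompt(genre: str) -> str:
    # The function body templates the input argument into the prompt.
    return f"Recommend a {genre} book"


messages = to_messages(recommend_book_prompt("fantasy"))
print(messages)  # [{'role': 'user', 'content': 'Recommend a fantasy book'}]
```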
Parameters:

Name | Type | Description | Default
---|---|---|---
`model` | `str` | The model to use in the API call. | required
`stream` | `bool` | Whether to stream the response from the API call. | required
`tools` | `list[BaseTool \| Callable]` | The tools to use in the API call. | required
`response_model` | `BaseModel \| BaseType` | The response model into which the response should be structured. | required
`output_parser` | `Callable[[OpenAICallResponse \| ResponseModelT], Any]` | A function for parsing the call response whose value will be returned in place of the original call response. | required
`json_mode` | `bool` | Whether to use JSON Mode. | required
`client` | `None` | LiteLLM does not support a custom client. | required
`call_params` | `OpenAICallParams` | The call parameters to use in the API call. | required
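As the table notes, `output_parser` receives the call response (or the parsed response model) and its return value replaces the original call response. A minimal sketch of such a parser, assuming only that the response object exposes a `content` string; the stub class below stands in for the real response type:

```python
class StubResponse:
    # Stand-in for an OpenAICallResponse-like object; real responses
    # are produced by the decorated call, not constructed by hand.
    def __init__(self, content: str) -> None:
        self.content = content


def first_line_parser(response: StubResponse) -> str:
    # An output parser returns the value you want in place of the response;
    # here, just the first line of the generated content.
    return response.content.split("\n", 1)[0]


parsed = first_line_parser(StubResponse("The Name of the Wind\nby Patrick Rothfuss"))
print(parsed)  # The Name of the Wind
```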
Returns:

Name | Type | Description
---|---|---
`decorator` | `Callable` | The decorator for turning a typed function into a LiteLLM routed LLM API call.
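The returned decorator follows the standard decorator-factory pattern: calling `litellm_call(model, ...)` evaluates to a decorator, which in turn wraps the typed function. A toy sketch of that shape in pure Python (`toy_call` is hypothetical and does not perform any API call):

```python
from functools import wraps
from typing import Any, Callable


def toy_call(model: str) -> Callable:
    # Factory: returns the actual decorator, capturing the model name.
    def decorator(fn: Callable[..., str]) -> Callable[..., dict]:
        @wraps(fn)
        def wrapper(*args: Any, **kwargs: Any) -> dict:
            # A real implementation would dispatch the prompt to the model;
            # here we just return what would be sent.
            return {"model": model, "prompt": fn(*args, **kwargs)}

        return wrapper

    return decorator


@toy_call("gpt-4o-mini")
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


print(recommend_book("fantasy"))
# {'model': 'gpt-4o-mini', 'prompt': 'Recommend a fantasy book'}
```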