mirascope.core.litellm.stream
The `LiteLLMStream` class, a convenience wrapper around streaming LLM calls.
Usage
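A minimal sketch of how a `LiteLLMStream` is typically produced and consumed, assuming Mirascope's `litellm.call` decorator with `stream=True` (the model name and prompt here are illustrative, and a valid provider API key is required):

```python
from mirascope.core import litellm


@litellm.call("gpt-4o-mini", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


# Calling the decorated function returns a LiteLLMStream.
# Iterating yields (chunk, tool) tuples as content arrives.
stream = recommend_book("fantasy")
for chunk, _ in stream:
    print(chunk.content, end="", flush=True)

# After the stream is exhausted, cost is computed using
# LiteLLM's cost calculation method.
print(stream.cost)
```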
Class LiteLLMStream
A simple wrapper around `OpenAIStream`. Everything is the same except for updates to the `construct_call_response` method and the `cost` property, so that cost is properly calculated using LiteLLM's cost calculation method. This ensures cost calculation works for non-OpenAI models.
Bases:
OpenAIStream
Attributes
| Name | Type | Description |
| --- | --- | --- |
| cost_metadata | CostMetadata | Returns metadata needed for cost calculation. |
Function construct_call_response
Parameters
| Name | Type | Description |
| --- | --- | --- |
| self | Any | - |
Returns
| Type | Description |
| --- | --- |
| LiteLLMCallResponse | - |
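A hedged sketch of when `construct_call_response` is useful: after a stream has been fully consumed, it reconstructs a complete `LiteLLMCallResponse` from the accumulated chunks (this assumes `stream` is an already-exhausted `LiteLLMStream` from a decorated call like the one above):

```python
# `stream` must have been iterated to completion first; otherwise the
# reconstructed response will be missing content.
call_response = stream.construct_call_response()

# The reconstructed response behaves like a non-streaming call response,
# with cost computed via LiteLLM's cost calculation.
print(call_response.content)
print(call_response.cost)
```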