
mirascope.core.litellm.stream

The LiteLLMStream class, a convenience wrapper for streaming LLM calls.


Class LiteLLMStream

A simple wrapper around OpenAIStream.

It behaves identically to OpenAIStream except that the construct_call_response method and the cost property are overridden so that cost is computed with LiteLLM's cost calculation method, which ensures cost calculation works for non-OpenAI models.
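As a rough sketch of how such a stream is typically obtained and consumed, assuming the standard litellm.call decorator with stream=True (the model name, prompt function, and genre argument below are illustrative, not part of this API reference):

```python
from mirascope.core import litellm


@litellm.call("gpt-4o-mini", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


# With stream=True, calling the decorated function returns a LiteLLMStream.
stream = recommend_book("fantasy")
for chunk, _ in stream:
    print(chunk.content, end="", flush=True)

# After the stream is exhausted, cost is computed via LiteLLM's cost
# calculation method, so it also works for non-OpenAI models.
print(stream.cost)
```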

Bases: OpenAIStream

Attributes

| Name | Type | Description |
| --- | --- | --- |
| cost_metadata | CostMetadata | Returns metadata needed for cost calculation. |

Function construct_call_response

Parameters

| Name | Type | Description |
| --- | --- | --- |
| self | Any | - |

Returns

| Type | Description |
| --- | --- |
| LiteLLMCallResponse | - |
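Continuing the sketch above, construct_call_response can rebuild a full call response once streaming has finished (stream is assumed to be the exhausted LiteLLMStream from the previous example):

```python
# Reconstruct a LiteLLMCallResponse from the chunks collected during streaming.
call_response = stream.construct_call_response()
print(call_response.content)
```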