# mirascope.core.litellm.call_response_chunk

This module contains the `LiteLLMCallResponseChunk` class.

<Info title="Usage">
[Streams](/docs/v1/learn/streams#handling-streamed-responses)
</Info>

## <ApiType type="Class" path="core/litellm/call_response_chunk" symbolName="LiteLLMCallResponseChunk" /> LiteLLMCallResponseChunk

A simple wrapper around `OpenAICallResponseChunk`. Everything is the same except the `cost` property, which has been updated to use LiteLLM's cost calculations so that cost tracking works for non-OpenAI models.

**Bases:** <TypeLink type={{"type_str": "OpenAICallResponseChunk", "description": null, "kind": "simple", "doc_url": "/docs/v1/api/core/openai/call_response_chunk#openaicallresponsechunk"}} />
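As a sketch of where these chunks appear in practice: streaming a `litellm` call yields `LiteLLMCallResponseChunk` instances, whose inherited fields (like `content`) behave exactly as in the OpenAI chunk class. The model name and prompt below are illustrative placeholders, and running this requires the relevant provider API key to be configured.

```python
from mirascope.core import litellm


@litellm.call("gpt-4o-mini", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


# Each `chunk` here is a LiteLLMCallResponseChunk.
for chunk, _ in recommend_book("fantasy"):
    print(chunk.content, end="", flush=True)
```

The final constructed stream exposes the LiteLLM-based `cost`, so cost tracking works even when the underlying model is not an OpenAI one.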

© 2026 Mirascope. All rights reserved.

Mirascope® is a registered trademark of Mirascope, Inc. in the U.S.
