# mirascope.core.litellm.call_response_chunk
This module contains the `LiteLLMCallResponseChunk` class.
<Info title="Usage">
[Streams](/docs/v1/learn/streams#handling-streamed-responses)
</Info>
## <ApiType type="Class" path="core/litellm/call_response_chunk" symbolName="LiteLLMCallResponseChunk" /> LiteLLMCallResponseChunk
A simpler wrapper around `OpenAICallResponseChunk`.
Everything is the same except the `cost` property, which has been updated to use
LiteLLM's cost calculations so that cost tracking works for non-OpenAI models.
**Bases:**
<TypeLink type={{"type_str": "OpenAICallResponseChunk", "description": null, "kind": "simple", "doc_url": "/docs/v1/api/core/openai/call_response_chunk#openaicallresponsechunk"}} />
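Since `LiteLLMCallResponseChunk` behaves like `OpenAICallResponseChunk`, you interact with it the same way when iterating a stream. A minimal sketch, assuming a standard `@litellm.call` streaming setup (the model name and prompt are illustrative, and running this requires provider credentials):

```python
from mirascope.core import litellm


# Illustrative example: stream=True makes the call yield
# (chunk, tool) tuples, where chunk is a LiteLLMCallResponseChunk.
@litellm.call("gpt-4o-mini", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


if __name__ == "__main__":
    stream = recommend_book("fantasy")
    for chunk, _ in stream:
        # Each chunk exposes the same properties as OpenAICallResponseChunk.
        print(chunk.content, end="", flush=True)
```

Because the wrapper only changes cost calculation, any code written against `OpenAICallResponseChunk` properties works unchanged here.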