mirascope.core.mistral.call_response_chunk
This module contains the MistralCallResponseChunk class.
Class MistralCallResponseChunk
A convenience wrapper around the Mistral `CompletionChunk` streamed chunks.
When calling the Mistral API using a function decorated with `mistral_call` and
`stream` set to `True`, the stream will contain `MistralCallResponseChunk` instances with
properties that allow for more convenient access to commonly used attributes.
Example:

```python
from mirascope.core import prompt_template
from mirascope.core.mistral import mistral_call


@mistral_call("mistral-large-latest", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


stream = recommend_book("fantasy")  # returns a `MistralStream` instance
for chunk, _ in stream:
    print(chunk.content, end="", flush=True)
```

Bases: BaseCallResponseChunk[CompletionChunk, FinishReason]

Attributes
| Name | Type | Description |
|---|---|---|
| content | str | Returns the content of the delta. |
| finish_reasons | list[FinishReason] | Returns the finish reasons of the response. |
| model | str | Returns the name of the response model. |
| id | str | Returns the id of the response. |
| usage | UsageInfo \| None | Returns the usage of the chat completion. |
| input_tokens | int \| None | Returns the number of input tokens. |
| cached_tokens | int | Returns the number of cached tokens. |
| output_tokens | int \| None | Returns the number of output tokens. |
| cost_metadata | CostMetadata | Returns the cost metadata. |
| common_finish_reasons | list[types.FinishReason] \| None | Returns the provider-agnostic finish reasons. |