mirascope.core.openai.call_response_chunk | Mirascope


Module call_response_chunk

This module contains the OpenAICallResponseChunk class.


Attribute FinishReason

Type: Choice.__annotations__['finish_reason']

Class OpenAICallResponseChunk

A convenience wrapper around streamed OpenAI ChatCompletionChunk chunks.

When calling the OpenAI API with a function decorated with openai_call and stream set to True, the stream will contain OpenAICallResponseChunk instances with properties that allow for more convenient access to commonly used attributes.

Example:

from mirascope.core.openai import openai_call


@openai_call("gpt-4o-mini", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


stream = recommend_book("fantasy")  # returns an `OpenAIStream`
for chunk, _ in stream:
    print(chunk.content, end="", flush=True)

Bases:

BaseCallResponseChunk[ChatCompletionChunk, FinishReason]

Attributes

| Name | Type | Description |
| --- | --- | --- |
| chunk | SkipValidation[ChatCompletionChunk] | - |
| content | str | Returns the content for the 0th choice delta. |
| finish_reasons | list[FinishReason] | Returns the finish reasons of the response. |
| model | str | Returns the name of the response model. |
| id | str | Returns the id of the response. |
| usage | CompletionUsage \| None | Returns the usage of the chat completion. |
| cached_tokens | int \| None | Returns the number of cached tokens. |
| input_tokens | int \| None | Returns the number of input tokens. |
| output_tokens | int \| None | Returns the number of output tokens. |
| audio | bytes \| None | Returns the audio data of the response. |
| audio_transcript | str \| None | Returns the transcript of the audio content. |
| cost_metadata | CostMetadata | Returns the cost metadata. |
| common_finish_reasons | list[FinishReason] \| None | Provider-agnostic finish reasons. |
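To illustrate how convenience properties like these map onto the raw chunk payload, here is a minimal, hypothetical sketch of the wrapper pattern. This is not Mirascope's actual implementation: the `ChunkWrapper` class and the dict-shaped chunk are stand-ins for `OpenAICallResponseChunk` and `ChatCompletionChunk`, chosen so the example runs without the OpenAI SDK or an API key.

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class ChunkWrapper:
    """Hypothetical stand-in for OpenAICallResponseChunk: wraps a raw
    ChatCompletionChunk-shaped dict and exposes convenience properties."""

    chunk: dict[str, Any]

    @property
    def content(self) -> str:
        # Content of the 0th choice delta ("" when the delta is empty).
        choices = self.chunk.get("choices", [])
        if not choices:
            return ""
        return choices[0].get("delta", {}).get("content") or ""

    @property
    def finish_reasons(self) -> list[str]:
        # Only chunks at the end of a stream carry a finish reason,
        # so this list is empty for most chunks.
        return [
            c["finish_reason"]
            for c in self.chunk.get("choices", [])
            if c.get("finish_reason") is not None
        ]

    @property
    def model(self) -> str:
        return self.chunk.get("model", "")


# A mid-stream chunk: has delta content, no finish reason yet.
raw = {
    "model": "gpt-4o-mini",
    "choices": [{"delta": {"content": "Hello"}, "finish_reason": None}],
}
wrapped = ChunkWrapper(raw)
print(wrapped.content)         # "Hello"
print(wrapped.finish_reasons)  # []
```

The point of the pattern is that callers read `chunk.content` instead of digging through `chunk.choices[0].delta.content` and handling the empty-delta case at every call site.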