mirascope.core.xai.stream¶
The XAIStream class, a convenience wrapper around streaming xAI LLM calls.
Usage Documentation
XAIStream¶
XAIStream(
    *,
    stream: (
        Generator[
            tuple[_BaseCallResponseChunkT, _BaseToolT | None],
            None,
            None,
        ]
        | AsyncGenerator[
            tuple[_BaseCallResponseChunkT, _BaseToolT | None],
            None,
        ]
    ),
    metadata: Metadata,
    tool_types: list[type[_BaseToolT]] | None,
    call_response_type: type[_BaseCallResponseT],
    model: str,
    prompt_template: str | None,
    fn_args: dict[str, Any],
    dynamic_config: _BaseDynamicConfigT,
    messages: list[_MessageParamT],
    call_params: _BaseCallParamsT,
    call_kwargs: BaseCallKwargs[_ToolSchemaT],
)
Bases: OpenAIStream
A simple wrapper around OpenAIStream. Everything is the same except for updates to the construct_call_response method and the cost property, so that cost is calculated correctly using xAI's pricing rather than OpenAI's. This ensures cost calculation works for non-OpenAI models.