prompts
Class AsyncContextMessageTemplate
Protocol for an async context-aware prompt function that returns UserContent or Sequence[Message].
An async MessageTemplate with a first parameter named 'ctx' of type Context[DepsT].
Can be converted by the llm.prompt decorator into an AsyncContextPrompt, or by
the llm.call decorator into an AsyncContextCall.
Bases:
Protocol[P, DepsT]
Class AsyncContextPrompt
An async context-aware prompt that can be called with a model to generate a response.
Created by decorating an async ContextMessageTemplate with llm.prompt. The decorated
async function (with first parameter 'ctx' of type Context[DepsT]) becomes callable
with a Model to generate LLM responses asynchronously with context dependencies.
An AsyncContextPrompt is essentially: async ContextMessageTemplate + tools + format.
It can be invoked with a model: await prompt(model, ctx, *args, **kwargs).
Bases:
Generic[P, DepsT, FormattableT]
Attributes
| Name | Type | Description |
|---|---|---|
| fn | AsyncContextMessageTemplate[P, DepsT] | The underlying async context-aware prompt function that generates message content. |
| toolkit | AsyncContextToolkit[DepsT] | The toolkit containing this prompt's async context-aware tools. |
| format | type[FormattableT] | Format[FormattableT] | None | The response format for the generated response. |
Function call
Generates a response using the provided model asynchronously.
Parameters
| Name | Type | Description |
|---|---|---|
| self | Any | - |
| model | Model | - |
| ctx | Context[DepsT] | - |
| args= () | P.args | - |
| kwargs= {} | P.kwargs | - |
Returns
| Type | Description |
|---|---|
| AsyncContextResponse[DepsT, None] | AsyncContextResponse[DepsT, FormattableT] | - |
Function stream
Generates a streaming response using the provided model asynchronously.
Parameters
| Name | Type | Description |
|---|---|---|
| self | Any | - |
| model | Model | - |
| ctx | Context[DepsT] | - |
| args= () | P.args | - |
| kwargs= {} | P.kwargs | - |
Returns
| Type | Description |
|---|---|
| AsyncContextStreamResponse[DepsT, None] | AsyncContextStreamResponse[DepsT, FormattableT] | - |
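The shape of an async context-aware call can be sketched with plain stand-ins. The `Context` dataclass and the echo "model" below are illustrative assumptions, not the library's real types; the point is the documented calling form `await prompt(model, ctx, *args, **kwargs)`:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Context:
    """Illustrative stand-in for llm.Context[DepsT]."""
    deps: dict

async def recommend_book(ctx: Context, topic: str) -> str:
    # An AsyncContextMessageTemplate: async, with 'ctx' as the first parameter.
    return f"Recommend a {topic} book for {ctx.deps['user']}"

async def main() -> str:
    ctx = Context(deps={"user": "Ada"})
    # Mirrors the documented form: await prompt(model, ctx, *args, **kwargs)
    content = await recommend_book(ctx, "history")
    return f"response to: {content}"  # a stand-in for the model's response

print(asyncio.run(main()))  # response to: Recommend a history book for Ada
```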
Class AsyncMessageTemplate
Protocol for an async prompt function that returns UserContent or Sequence[Message].
An async MessageTemplate that can be converted by the llm.prompt decorator
into an AsyncPrompt, or by the llm.call decorator into an AsyncCall.
Bases:
Protocol[P]
Class AsyncPrompt
An async prompt that can be called with a model to generate a response.
Created by decorating an async MessageTemplate with llm.prompt. The decorated
async function becomes callable with a Model to generate LLM responses asynchronously.
An AsyncPrompt is essentially: async MessageTemplate + tools + format.
It can be invoked with a model: await prompt(model, *args, **kwargs).
Bases:
Generic[P, FormattableT]
Attributes
| Name | Type | Description |
|---|---|---|
| fn | AsyncMessageTemplate[P] | The underlying async prompt function that generates message content. |
| toolkit | AsyncToolkit | The toolkit containing this prompt's async tools. |
| format | type[FormattableT] | Format[FormattableT] | None | The response format for the generated response. |
Function call
Generates a response using the provided model asynchronously.
Parameters
| Name | Type | Description |
|---|---|---|
| self | Any | - |
| model | Model | - |
| args= () | P.args | - |
| kwargs= {} | P.kwargs | - |
Returns
| Type | Description |
|---|---|
| AsyncResponse | AsyncResponse[FormattableT] | - |
Function stream
Generates a streaming response using the provided model asynchronously.
Parameters
| Name | Type | Description |
|---|---|---|
| self | Any | - |
| model | Model | - |
| args= () | P.args | - |
| kwargs= {} | P.kwargs | - |
Returns
| Type | Description |
|---|---|
| AsyncStreamResponse | AsyncStreamResponse[FormattableT] | - |
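A minimal sketch of the async flow, assuming only the documented form `await prompt(model, *args, **kwargs)`. The `echo_model` coroutine is a hypothetical stand-in for awaiting a real `Model` call:

```python
import asyncio

async def recommend_book(genre: str) -> str:
    # An async MessageTemplate: returns user message content.
    return f"Recommend a {genre} book"

async def echo_model(content: str) -> str:
    # Stand-in for an awaited Model call.
    await asyncio.sleep(0)
    return f"response to: {content}"

async def main() -> str:
    # Mirrors the documented form: await prompt(model, *args, **kwargs)
    content = await recommend_book("sci-fi")
    return await echo_model(content)

print(asyncio.run(main()))  # response to: Recommend a sci-fi book
```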
Class ContextMessageTemplate
Protocol for a context-aware prompt function that returns UserContent or Sequence[Message].
A MessageTemplate with a first parameter named 'ctx' of type Context[DepsT].
Can be converted by the llm.prompt decorator into a ContextPrompt, or by
the llm.call decorator into a ContextCall.
Bases:
Protocol[P, DepsT]
Class ContextPrompt
A context-aware prompt that can be called with a model to generate a response.
Created by decorating a ContextMessageTemplate with llm.prompt. The decorated
function (with first parameter 'ctx' of type Context[DepsT]) becomes callable
with a Model to generate LLM responses with context dependencies.
A ContextPrompt is essentially: ContextMessageTemplate + tools + format.
It can be invoked with a model: prompt(model, ctx, *args, **kwargs).
Bases:
Generic[P, DepsT, FormattableT]
Attributes
| Name | Type | Description |
|---|---|---|
| fn | ContextMessageTemplate[P, DepsT] | The underlying context-aware prompt function that generates message content. |
| toolkit | ContextToolkit[DepsT] | The toolkit containing this prompt's context-aware tools. |
| format | type[FormattableT] | Format[FormattableT] | None | The response format for the generated response. |
Function call
Generates a response using the provided model.
Parameters
| Name | Type | Description |
|---|---|---|
| self | Any | - |
| model | Model | - |
| ctx | Context[DepsT] | - |
| args= () | P.args | - |
| kwargs= {} | P.kwargs | - |
Returns
| Type | Description |
|---|---|
| ContextResponse[DepsT, None] | ContextResponse[DepsT, FormattableT] | - |
Function stream
Generates a streaming response using the provided model.
Parameters
| Name | Type | Description |
|---|---|---|
| self | Any | - |
| model | Model | - |
| ctx | Context[DepsT] | - |
| args= () | P.args | - |
| kwargs= {} | P.kwargs | - |
Returns
| Type | Description |
|---|---|
| ContextStreamResponse[DepsT, None] | ContextStreamResponse[DepsT, FormattableT] | - |
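The dependency-injection shape can be sketched with stand-ins. `Context` and `Library` here are illustrative, not the library's real classes; only the documented rule matters: the first parameter is named `ctx` and carries typed dependencies:

```python
from dataclasses import dataclass
from typing import Generic, TypeVar

DepsT = TypeVar("DepsT")

@dataclass
class Context(Generic[DepsT]):
    """Illustrative stand-in for llm.Context[DepsT]."""
    deps: DepsT

@dataclass
class Library:
    favorite_genre: str

def recommend_book(ctx: Context[Library]) -> str:
    # A ContextMessageTemplate: 'ctx' comes first and exposes the deps.
    return f"Recommend a {ctx.deps.favorite_genre} book"

# Invocation mirrors the documented form: prompt(model, ctx, *args, **kwargs)
ctx = Context(deps=Library(favorite_genre="mystery"))
print(recommend_book(ctx))  # Recommend a mystery book
```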
Class MessageTemplate
Protocol for a prompt function that returns UserContent or Sequence[Message].
A MessageTemplate is a raw function that returns prompt content. It can be
converted by the llm.prompt decorator into a Prompt (callable with a Model),
or by the llm.call decorator into a Call (Prompt + Model).
Bases:
Protocol[P]
Class Prompt
A prompt that can be called with a model to generate a response.
Created by decorating a MessageTemplate with llm.prompt. The decorated
function becomes callable with a Model to generate LLM responses.
A Prompt is essentially: MessageTemplate + tools + format.
It can be invoked with a model: prompt(model, *args, **kwargs).
Bases:
Generic[P, FormattableT]
Attributes
| Name | Type | Description |
|---|---|---|
| fn | MessageTemplate[P] | The underlying prompt function that generates message content. |
| toolkit | Toolkit | The toolkit containing this prompt's tools. |
| format | type[FormattableT] | Format[FormattableT] | None | The response format for the generated response. |
Function call
Generates a response using the provided model.
Parameters
| Name | Type | Description |
|---|---|---|
| self | Any | - |
| model | Model | - |
| args= () | P.args | - |
| kwargs= {} | P.kwargs | - |
Returns
| Type | Description |
|---|---|
| Response | Response[FormattableT] | - |
Function stream
Generates a streaming response using the provided model.
Parameters
| Name | Type | Description |
|---|---|---|
| self | Any | - |
| model | Model | - |
| args= () | P.args | - |
| kwargs= {} | P.kwargs | - |
Returns
| Type | Description |
|---|---|
| StreamResponse | StreamResponse[FormattableT] | - |
Class PromptDecorator
Decorator for converting a MessageTemplate into a Prompt.
Takes a raw prompt function that returns message content and wraps it with
tools and format support, creating a Prompt that can be called with a model.
The decorator automatically detects whether the function is async or context-aware and creates the appropriate Prompt variant (Prompt, AsyncPrompt, ContextPrompt, or AsyncContextPrompt).
Bases:
Generic[ToolT, FormattableT]
Attributes
| Name | Type | Description |
|---|---|---|
| tools | Sequence[ToolT] | None | The tools that are included in the prompt, if any. |
| format | type[FormattableT] | Format[FormattableT] | None | The structured output format of the prompt, if any. |
Function prompt
Decorates a MessageTemplate to create a Prompt callable with a model.
This decorator transforms a raw prompt function (that returns message content)
into a Prompt object that can be invoked with a model to generate LLM responses.
The decorator automatically detects the function type:
- If the first parameter is named `ctx` with type `llm.Context[T]`, creates a `ContextPrompt`
- If the function is async, creates an `AsyncPrompt` or `AsyncContextPrompt`
- Otherwise, creates a regular `Prompt`
Parameters
| Name | Type | Description |
|---|---|---|
| __fn= None | AsyncContextMessageTemplate[P, DepsT] | ContextMessageTemplate[P, DepsT] | AsyncMessageTemplate[P] | MessageTemplate[P] | None | The prompt function to decorate (optional, for decorator syntax without parens) |
| tools= None | Sequence[ToolT] | None | Optional `Sequence` of tools to make available to the LLM |
| format= None | type[FormattableT] | Format[FormattableT] | None | Optional response format class (`BaseModel`) or Format instance |
Returns
| Type | Description |
|---|---|
| AsyncContextPrompt[P, DepsT, FormattableT] | ContextPrompt[P, DepsT, FormattableT] | AsyncPrompt[P, FormattableT] | Prompt[P, FormattableT] | PromptDecorator[ToolT, FormattableT] | A `Prompt` variant (Prompt, AsyncPrompt, ContextPrompt, or AsyncContextPrompt) |
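The detection rules above can be sketched with the standard `inspect` module. This is a hypothetical reconstruction of the documented behavior, not the decorator's actual code:

```python
import inspect

def detect_variant(fn) -> str:
    """Sketch of the documented detection rules for llm.prompt."""
    params = list(inspect.signature(fn).parameters)
    has_ctx = bool(params) and params[0] == "ctx"   # first parameter named 'ctx'?
    is_async = inspect.iscoroutinefunction(fn)       # async def?
    if has_ctx:
        return "AsyncContextPrompt" if is_async else "ContextPrompt"
    return "AsyncPrompt" if is_async else "Prompt"

def plain(question): ...
async def async_plain(question): ...
def with_ctx(ctx, question): ...
async def async_with_ctx(ctx, question): ...

for f in (plain, async_plain, with_ctx, async_with_ctx):
    print(f.__name__, "->", detect_variant(f))
# plain -> Prompt
# async_plain -> AsyncPrompt
# with_ctx -> ContextPrompt
# async_with_ctx -> AsyncContextPrompt
```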