mirascope.core.base.prompt
Module prompt
The BasePrompt class for better prompt engineering.
Attribute SUPPORTED_MESSAGE_ROLES
Type: ['system', 'user', 'assistant']
Class BasePrompt
The base class for engineering prompts.
This class serves as the base for all prompting needs and is intended to work across various providers by exposing a common prompt interface.
Example:
from mirascope.core import BasePrompt, metadata, prompt_template
@prompt_template("Recommend a {genre} book")
@metadata({"tags": {"version:0001", "books"}})
class BookRecommendationPrompt(BasePrompt):
    genre: str
prompt = BookRecommendationPrompt(genre="fantasy")
print(prompt)
# > Recommend a fantasy book
print(prompt.message_params())
# > [BaseMessageParam(role="user", content="Recommend a fantasy book")]
print(prompt.dump()["metadata"])
# > {"metadata": {"version:0001", "books"}}Bases:
BaseModelAttributes
| Name | Type | Description | 
|---|---|---|
| prompt_template | str | - | 
Function message_params
Returns the list of parsed message parameters.
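Example (a minimal sketch, not taken verbatim from the library docs; it assumes the template parser recognizes the SYSTEM:/USER: role markers implied by SUPPORTED_MESSAGE_ROLES above):
from mirascope.core import BasePrompt, prompt_template

@prompt_template(
    """
    SYSTEM: You are a librarian who knows {genre} literature well.
    USER: Recommend a {genre} book.
    """
)
class BookRecommendationPrompt(BasePrompt):
    genre: str

prompt = BookRecommendationPrompt(genre="fantasy")
print(prompt.message_params())
# > [BaseMessageParam(role="system", content="You are a librarian who knows fantasy literature well."),
#    BaseMessageParam(role="user", content="Recommend a fantasy book")]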
Parameters
| Name | Type | Description | 
|---|---|---|
| self | Any | - | 
Returns
| Type | Description | 
|---|---|
| list[BaseMessageParam] | - | 
Function dynamic_config
Returns the dynamic config of the prompt.
Parameters
| Name | Type | Description | 
|---|---|---|
| self | Any | - | 
Returns
| Type | Description | 
|---|---|
| BaseDynamicConfig | - | 
Function dump
Dumps the contents of the prompt into a dictionary.
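Example (a hedged sketch; only the "metadata" key is confirmed by the class-level example above, so the remaining keys are inspected generically rather than assumed):
from mirascope.core import BasePrompt, metadata, prompt_template

@prompt_template("Recommend a {genre} book")
@metadata({"tags": {"version:0001", "books"}})
class BookRecommendationPrompt(BasePrompt):
    genre: str

prompt = BookRecommendationPrompt(genre="fantasy")
dumped = prompt.dump()
print(dumped["metadata"])  # metadata attached via the @metadata decorator
# > {"tags": {"version:0001", "books"}}
print(list(dumped.keys()))  # inspect which other fields the dump includes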
Parameters
| Name | Type | Description | 
|---|---|---|
| self | Any | - | 
Function run
Returns the response from calling the provider API using the given call decorator.
Example:
from mirascope.core import BasePrompt, openai, prompt_template
@prompt_template("Recommend a {genre} book")
class BookRecommendationPrompt(BasePrompt):
    genre: str
prompt = BookRecommendationPrompt(genre="fantasy")
response = prompt.run(openai.call("gpt-4o-mini"))
print(response.content)
Parameters
| Name | Type | Description | 
|---|---|---|
| self | Any | - | 
| call_decorator | (() => BaseDynamicConfig) => () => _BaseCallResponseT | (() => BaseDynamicConfig) => () => _BaseStreamT | (() => BaseDynamicConfig) => () => _ResponseModelT | (() => BaseDynamicConfig) => () => Iterable[_ResponseModelT] | - | 
| additional_decorators= () | (_T) => _T | - | 
Returns
| Type | Description | 
|---|---|
| _BaseCallResponseT | _BaseStreamT | _ResponseModelT | Iterable[_ResponseModelT] | - | 
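Example using additional_decorators (a hedged sketch; log_call is a hypothetical decorator written here purely for illustration, and it assumes extra decorators are passed as additional positional arguments per the signature above):
from mirascope.core import BasePrompt, openai, prompt_template

@prompt_template("Recommend a {genre} book")
class BookRecommendationPrompt(BasePrompt):
    genre: str

def log_call(fn):
    # Hypothetical wrapper: logs before delegating to the wrapped call.
    def inner(*args, **kwargs):
        print(f"calling {fn.__name__}")
        return fn(*args, **kwargs)
    return inner

prompt = BookRecommendationPrompt(genre="fantasy")
response = prompt.run(openai.call("gpt-4o-mini"), log_call)
print(response.content)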
Function run_async
Returns the response from asynchronously calling the provider API using the given call decorator.
Example:
import asyncio
from mirascope.core import BasePrompt, openai, prompt_template
@prompt_template("Recommend a {genre} book")
class BookRecommendationPrompt(BasePrompt):
    genre: str
async def run():
    prompt = BookRecommendationPrompt(genre="fantasy")
    response = await prompt.run_async(openai.call("gpt-4o-mini"))
    print(response.content)
asyncio.run(run())
Parameters
| Name | Type | Description | 
|---|---|---|
| self | Any | - | 
| call_decorator | (() => Awaitable[BaseDynamicConfig]) => () => Awaitable[_BaseCallResponseT] | (() => Awaitable[BaseDynamicConfig]) => () => Awaitable[_BaseStreamT] | (() => Awaitable[BaseDynamicConfig]) => () => Awaitable[_ResponseModelT] | (() => Awaitable[BaseDynamicConfig]) => () => Awaitable[AsyncIterable[_ResponseModelT]] | - | 
| additional_decorators= () | (_T) => _T | - | 
Returns
| Type | Description | 
|---|---|
| Awaitable[_BaseCallResponseT] | Awaitable[_BaseStreamT] | Awaitable[_ResponseModelT] | Awaitable[AsyncIterable[_ResponseModelT]] | - | 
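The return type also covers streams. Example (a hedged sketch; it assumes stream=True on the provider call decorator and the (chunk, tool) iteration pattern used by mirascope streams):
import asyncio
from mirascope.core import BasePrompt, openai, prompt_template

@prompt_template("Recommend a {genre} book")
class BookRecommendationPrompt(BasePrompt):
    genre: str

async def run():
    prompt = BookRecommendationPrompt(genre="fantasy")
    stream = await prompt.run_async(openai.call("gpt-4o-mini", stream=True))
    async for chunk, _ in stream:  # assumed (chunk, tool) pairs
        print(chunk.content, end="", flush=True)

asyncio.run(run())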
Class PromptDecorator
Bases:
Protocol
Function prompt_template
A decorator for setting the prompt_template of a BasePrompt or call.
Example:
from mirascope.core import openai, prompt_template
@prompt_template()
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"
print(recommend_book("fantasy"))
# Output: [BaseMessageParam(role='user', content='Recommend a fantasy book')]
Returns
| Type | Description | 
|---|---|
| Callable | The decorator function that turns the decorated function into a prompt template. | 
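The same decorator also sets the template of a provider call, which is the pattern the metadata example below builds on:
from mirascope.core import openai, prompt_template

@openai.call("gpt-4o-mini")
@prompt_template("Recommend a {genre} book")
def recommend_book(genre: str): ...

response = recommend_book("fantasy")
print(response.content)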
Class MetadataDecorator
Bases:
Protocol
Function metadata
A decorator for adding metadata to a BasePrompt or call.
Adding this decorator to a BasePrompt or call updates its metadata annotation to the given value, which is useful for attaching metadata that can later be used for logging or filtering.
Example:
from mirascope.core import metadata, openai, prompt_template
@openai.call("gpt-4o-mini")
@prompt_template("Recommend a {genre} book")
@metadata({"tags": {"version:0001", "books"}})
def recommend_book(genre: str):
    ...
response = recommend_book("fantasy")
print(response.metadata)
Parameters
| Name | Type | Description | 
|---|---|---|
| metadata | Metadata | - | 
Returns
| Type | Description | 
|---|---|
| Callable | The decorator function that updates the `_metadata` attribute of the decorated input prompt or call. |