
mirascope.core.base.prompt

Module prompt

The BasePrompt class for better prompt engineering.

Attribute SUPPORTED_MESSAGE_ROLES

Type: ['system', 'user', 'assistant']

Class BasePrompt

The base class for engineering prompts.

This class is implemented as the base for all prompting needs and is intended to work across various providers by exposing a common prompt interface.

Example:

from mirascope.core import BasePrompt, metadata, prompt_template

@prompt_template("Recommend a {genre} book")
@metadata({"tags": {"version:0001", "books"}})
class BookRecommendationPrompt(BasePrompt):
    genre: str

prompt = BookRecommendationPrompt(genre="fantasy")

print(prompt)
# > Recommend a fantasy book

print(prompt.message_params())
# > [BaseMessageParam(role="user", content="Recommend a fantasy book")]

print(prompt.dump()["metadata"])
# > {"metadata": {"version:0001", "books"}}

Bases:

BaseModel

Attributes

prompt_template: str

Function message_params

Returns the list of parsed message parameters.
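
Templates are not limited to a single user message: any of the supported message roles can appear in a template. A minimal sketch, assuming the SYSTEM:/USER: role keywords supported by prompt templates, where each labeled section is parsed into its own BaseMessageParam:

from mirascope.core import BasePrompt, prompt_template


@prompt_template(
    """
    SYSTEM: You are a helpful librarian.
    USER: Recommend a {genre} book.
    """
)
class BookRecommendationPrompt(BasePrompt):
    genre: str


prompt = BookRecommendationPrompt(genre="fantasy")
print(prompt.message_params())
# > [BaseMessageParam(role='system', content='You are a helpful librarian.'),
#    BaseMessageParam(role='user', content='Recommend a fantasy book.')]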

Parameters

self: Any

Returns

list[BaseMessageParam]

Function dynamic_config

Returns the dynamic config of the prompt.

Parameters

self: Any

Returns

BaseDynamicConfig

Function dump

Dumps the contents of the prompt into a dictionary.
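
One common use for the dumped dictionary is structured logging. A minimal sketch (assuming only what the class example above shows about the dump contents); note that metadata tags are a set, so json.dumps needs a fallback serializer:

import json

from mirascope.core import BasePrompt, metadata, prompt_template


@prompt_template("Recommend a {genre} book")
@metadata({"tags": {"version:0001", "books"}})
class BookRecommendationPrompt(BasePrompt):
    genre: str


prompt = BookRecommendationPrompt(genre="fantasy")
# Sets (e.g. the metadata tags) are not JSON-serializable, so fall back to str().
print(json.dumps(prompt.dump(), default=str, indent=2))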

Parameters

self: Any

Returns

dict[str, Any]

Function run

Runs the prompt with the provided call decorator and returns the response.

Example:

from mirascope.core import BasePrompt, openai, prompt_template


@prompt_template("Recommend a {genre} book")
class BookRecommendationPrompt(BasePrompt):
    genre: str


prompt = BookRecommendationPrompt(genre="fantasy")
response = prompt.run(openai.call("gpt-4o-mini"))
print(response.content)

Parameters

self: Any
call_decorator: (() => BaseDynamicConfig) => () => _BaseCallResponseT | (() => BaseDynamicConfig) => () => _BaseStreamT | (() => BaseDynamicConfig) => () => _ResponseModelT | (() => BaseDynamicConfig) => () => Iterable[_ResponseModelT]
additional_decorators: (_T) => _T = ()

Returns

_BaseCallResponseT | _BaseStreamT | _ResponseModelT | Iterable[_ResponseModelT]
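
Because the return type follows the provided decorator, the same prompt can also be streamed or parsed into a response model. A minimal streaming sketch, assuming the provider call decorator accepts stream=True (as openai.call does):

from mirascope.core import BasePrompt, openai, prompt_template


@prompt_template("Recommend a {genre} book")
class BookRecommendationPrompt(BasePrompt):
    genre: str


prompt = BookRecommendationPrompt(genre="fantasy")
# With a streaming decorator, run returns a stream that yields (chunk, tool) pairs.
stream = prompt.run(openai.call("gpt-4o-mini", stream=True))
for chunk, _ in stream:
    print(chunk.content, end="", flush=True)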

Function run_async

Asynchronously runs the prompt with the provided call decorator and returns the response.

Example:

import asyncio

from mirascope.core import BasePrompt, openai, prompt_template


@prompt_template("Recommend a {genre} book")
class BookRecommendationPrompt(BasePrompt):
    genre: str


async def run():
    prompt = BookRecommendationPrompt(genre="fantasy")
    response = await prompt.run_async(openai.call("gpt-4o-mini"))
    print(response.content)


asyncio.run(run())

Parameters

self: Any
call_decorator: (() => Awaitable[BaseDynamicConfig]) => () => Awaitable[_BaseCallResponseT] | (() => Awaitable[BaseDynamicConfig]) => () => Awaitable[_BaseStreamT] | (() => Awaitable[BaseDynamicConfig]) => () => Awaitable[_ResponseModelT] | (() => Awaitable[BaseDynamicConfig]) => () => Awaitable[AsyncIterable[_ResponseModelT]]
additional_decorators: (_T) => _T = ()

Returns

Awaitable[_BaseCallResponseT] | Awaitable[_BaseStreamT] | Awaitable[_ResponseModelT] | Awaitable[AsyncIterable[_ResponseModelT]]

Class PromptDecorator

Bases:

Protocol

Function prompt_template

A decorator for setting the prompt_template of a BasePrompt or call.


Example:

from mirascope.core import prompt_template


@prompt_template()
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


print(recommend_book("fantasy"))
# Output: [BaseMessageParam(role='user', content='Recommend a fantasy book')]

Parameters

template: str | None = None

Returns

Callable: The decorator function that turns the decorated function into a prompt template.

Class MetadataDecorator

Bases:

Protocol

Function metadata

A decorator for adding metadata to a BasePrompt or call.

Applying this decorator to a BasePrompt or call sets its metadata to the given value. This is useful for attaching information (e.g. tags) that can later be used for logging or filtering.

Example:

from mirascope.core import metadata, openai, prompt_template


@openai.call("gpt-4o-mini")
@prompt_template("Recommend a {genre} book")
@metadata({"tags": {"version:0001", "books"}})
def recommend_book(genre: str):
    ...


response = recommend_book("fantasy")
print(response.metadata)
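
Because the metadata is a plain dictionary, it can also drive filtering. A small sketch, assuming the same tags as above and the hypothetical routing shown in the comment:

response = recommend_book("fantasy")
# `tags` is the set passed to @metadata; route or filter on it as needed.
tags = response.metadata.get("tags", set())
if "books" in tags:
    print("book recommendation:", response.content)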

Parameters

metadata: Metadata

Returns

Callable: The decorator function that updates the `_metadata` attribute of the decorated prompt or call.