# Async

Asynchronous programming is a crucial concept when building applications with LLMs (Large Language Models) using Mirascope. It allows for efficient handling of I/O-bound operations (e.g., API calls), improving application responsiveness and scalability. Mirascope utilizes the [asyncio](https://docs.python.org/3/library/asyncio.html) library to implement asynchronous processing.

<Info title="Best Practices">
- **Use asyncio for I/O-bound tasks**: Async is most beneficial for I/O-bound operations like API calls. It may not provide significant benefits for CPU-bound tasks.
- **Avoid blocking operations**: Ensure that you're not using blocking operations within async functions, as this can negate the benefits of asynchronous programming.
- **Consider using connection pools**: When making many async requests, consider using connection pools to manage and reuse connections efficiently.
- **Be mindful of rate limits**: While async allows for concurrent requests, be aware of API rate limits and implement appropriate throttling if necessary.
- **Use appropriate timeouts**: Implement timeouts for async operations to prevent hanging in case of network issues or unresponsive services.
- **Test thoroughly**: Async code can introduce subtle bugs. Ensure comprehensive testing of your async implementations.
- **Leverage async context managers**: Use async context managers (`async with`) for managing resources that require setup and cleanup in async contexts.
</Info>

<Info title="Diagram illustrating the flow of asynchronous processing" collapsible={true} defaultOpen={false}>
```mermaid
sequenceDiagram
    participant Main as Main Process
    participant API1 as API Call 1
    participant API2 as API Call 2
    participant API3 as API Call 3

    Main->>+API1: Send Request
    Main->>+API2: Send Request
    Main->>+API3: Send Request
    API1-->>-Main: Response
    API2-->>-Main: Response
    API3-->>-Main: Response
    Main->>Main: Process All Responses
```
</Info>

## Key Terms

- `async`: Keyword used to define a function as asynchronous
- `await`: Keyword used to wait for the completion of an asynchronous operation
- `asyncio`: Python library that supports asynchronous programming

## Basic Usage and Syntax

<Note>
If you haven't already, we recommend first reading the section on [Calls](/docs/v1/learn/calls).
</Note>

To use async in Mirascope, simply define the function as async and use the `await` keyword when calling it. Here's a basic example:

<TabbedSection>
<Tab value="Shorthand">
```python
import asyncio

from mirascope import llm


@llm.call(provider="$PROVIDER", model="$MODEL")
async def recommend_book(genre: str) -> str:  # [!code highlight]
    return f"Recommend a {genre} book"


async def main():
    response = await recommend_book("fantasy")  # [!code highlight]
    print(response.content)


asyncio.run(main())
```
</Tab>
<Tab value="Template">
```python
import asyncio

from mirascope import llm, prompt_template


@llm.call(provider="$PROVIDER", model="$MODEL")
@prompt_template("Recommend a {genre} book")
async def recommend_book(genre: str): ...  # [!code highlight]


async def main():
    response = await recommend_book("fantasy")  # [!code highlight]
    print(response.content)


asyncio.run(main())
```
</Tab>
</TabbedSection>

In this example we:

1. Define `recommend_book` as an asynchronous function.
2. Create a `main` function that calls `recommend_book` and awaits it.
3. Use `asyncio.run(main())` to start the asynchronous event loop and run the main function.
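As suggested in the best practices above, it can also help to guard awaited calls with a timeout so a slow or unresponsive service doesn't hang your application. The sketch below is one minimal way to do that using the standard library's `asyncio.wait_for`, reusing the `recommend_book` function from the example above; the 30-second limit is an arbitrary placeholder, and how you handle `asyncio.TimeoutError` (retry, fall back, surface an error) is up to your application.

```python
import asyncio

from mirascope import llm


@llm.call(provider="$PROVIDER", model="$MODEL")
async def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


async def main():
    try:
        # Cancel the call if no response arrives within 30 seconds (placeholder value).
        response = await asyncio.wait_for(recommend_book("fantasy"), timeout=30)
        print(response.content)
    except asyncio.TimeoutError:
        print("The request timed out; consider retrying or falling back.")


asyncio.run(main())
```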
## Parallel Async Calls

One of the main benefits of asynchronous programming is the ability to run multiple operations concurrently. Here's an example of making parallel async calls:

<TabbedSection>
<Tab value="Shorthand">
```python
import asyncio

from mirascope import llm


@llm.call(provider="$PROVIDER", model="$MODEL")
async def recommend_book(genre: str) -> str:  # [!code highlight]
    return f"Recommend a {genre} book"


async def main():
    genres = ["fantasy", "scifi", "mystery"]
    tasks = [recommend_book(genre) for genre in genres]  # [!code highlight]
    results = await asyncio.gather(*tasks)  # [!code highlight]
    for genre, response in zip(genres, results):
        print(f"({genre}):\n{response.content}\n")


asyncio.run(main())
```
</Tab>
<Tab value="Template">
```python
import asyncio

from mirascope import llm, prompt_template


@llm.call(provider="$PROVIDER", model="$MODEL")
@prompt_template("Recommend a {genre} book")
async def recommend_book(genre: str): ...  # [!code highlight]


async def main():
    genres = ["fantasy", "scifi", "mystery"]
    tasks = [recommend_book(genre) for genre in genres]  # [!code highlight]
    results = await asyncio.gather(*tasks)  # [!code highlight]
    for genre, response in zip(genres, results):
        print(f"({genre}):\n{response.content}\n")


asyncio.run(main())
```
</Tab>
</TabbedSection>

We are using `asyncio.gather` to run and await multiple asynchronous tasks concurrently, printing the results for each task once all are completed.

## Async Streaming

<Note>
If you haven't already, we recommend first reading the section on [Streams](/docs/v1/learn/streams).
</Note>

Streaming with async works similarly to synchronous streaming, but you use `async for` instead of a regular `for` loop:

<TabbedSection>
<Tab value="Shorthand">
```python
import asyncio

from mirascope import llm


@llm.call(provider="$PROVIDER", model="$MODEL", stream=True)  # [!code highlight]
async def recommend_book(genre: str) -> str:  # [!code highlight]
    return f"Recommend a {genre} book"


async def main():
    stream = await recommend_book("fantasy")  # [!code highlight]
    async for chunk, _ in stream:  # [!code highlight]
        print(chunk.content, end="", flush=True)


asyncio.run(main())
```
</Tab>
<Tab value="Template">
```python
import asyncio

from mirascope import llm, prompt_template


@llm.call(provider="$PROVIDER", model="$MODEL", stream=True)  # [!code highlight]
@prompt_template("Recommend a {genre} book")
async def recommend_book(genre: str): ...  # [!code highlight]


async def main():
    stream = await recommend_book("fantasy")  # [!code highlight]
    async for chunk, _ in stream:  # [!code highlight]
        print(chunk.content, end="", flush=True)


asyncio.run(main())
```
</Tab>
</TabbedSection>

## Async Tools

<Note>
If you haven't already, we recommend first reading the section on [Tools](/docs/v1/learn/tools).
</Note>

When using tools asynchronously, you can make the `call` method of a tool async:

<TabbedSection>
<Tab value="Shorthand">
```python
import asyncio

from mirascope import BaseTool, llm


class FormatBook(BaseTool):
    title: str
    author: str

    async def call(self) -> str:  # [!code highlight]
        # Simulating an async API call
        await asyncio.sleep(1)
        return f"{self.title} by {self.author}"


@llm.call(provider="$PROVIDER", model="$MODEL", tools=[FormatBook])  # [!code highlight]
async def recommend_book(genre: str) -> str:  # [!code highlight]
    return f"Recommend a {genre} book"


async def main():
    response = await recommend_book("fantasy")
    if tool := response.tool:
        if isinstance(tool, FormatBook):  # [!code highlight]
            output = await tool.call()  # [!code highlight]
            print(output)
    else:
        print(response.content)


asyncio.run(main())
```
</Tab>
<Tab value="Template">
```python
import asyncio

from mirascope import BaseTool, llm, prompt_template


class FormatBook(BaseTool):
    title: str
    author: str

    async def call(self) -> str:  # [!code highlight]
        # Simulating an async API call
        await asyncio.sleep(1)
        return f"{self.title} by {self.author}"


@llm.call(provider="$PROVIDER", model="$MODEL", tools=[FormatBook])  # [!code highlight]
@prompt_template("Recommend a {genre} book")
async def recommend_book(genre: str): ...


async def main():
    response = await recommend_book("fantasy")
    if tool := response.tool:
        if isinstance(tool, FormatBook):  # [!code highlight]
            output = await tool.call()  # [!code highlight]
            print(output)
    else:
        print(response.content)


asyncio.run(main())
```
</Tab>
</TabbedSection>

It's important to note that in this example we use `isinstance(tool, FormatBook)` to ensure the `call` method can be awaited safely. This also gives us proper type hints and editor support.

## Custom Client

When using custom clients with async calls, it's crucial to use the asynchronous version of the client. You can provide the async client either through the decorator or dynamic configuration:

### Decorator Parameter

<TabbedSection>
<Tab value="Shorthand">
<ProviderCodeBlock examplePath="v1/learn/async/custom_client/decorator"/>
</Tab>
<Tab value="Template">
<ProviderCodeBlock examplePath="v1/learn/async/custom_client/decorator_template"/>
</Tab>
</TabbedSection>

### Dynamic Configuration

<TabbedSection>
<Tab value="Shorthand">
<ProviderCodeBlock examplePath="v1/learn/async/custom_client/dynamic_config"/>
</Tab>
<Tab value="Template">
<ProviderCodeBlock examplePath="v1/learn/async/custom_client/dynamic_config_template"/>
</Tab>
</TabbedSection>

<Warning title="Synchronous vs Asynchronous Clients">
Make sure to use the appropriate asynchronous client class (e.g., `AsyncOpenAI` instead of `OpenAI`) when working with async functions. Using a synchronous client in an async context can lead to blocking operations that defeat the purpose of async programming.
</Warning>
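To make the warning concrete, here is a minimal sketch of what the decorator-parameter approach might look like with OpenAI. It assumes the provider-specific `openai.call` decorator from `mirascope.core` accepts a `client` argument and that `AsyncOpenAI` is the async client from the official `openai` package; the model name is a placeholder, so check the provider examples above for your provider's exact signature.

```python
import asyncio

from mirascope.core import openai
from openai import AsyncOpenAI


# NOTE: assumes the OpenAI provider module and its `client` parameter;
# consult the provider-specific examples above for your provider's equivalent.
@openai.call("gpt-4o-mini", client=AsyncOpenAI())
async def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


async def main():
    response = await recommend_book("fantasy")
    print(response.content)


asyncio.run(main())
```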
## Next Steps

By leveraging these async features in Mirascope, you can build more efficient and responsive applications, especially when working with multiple LLM calls or other I/O-bound operations.

This section concludes the core functionality Mirascope supports. If you haven't already, we recommend taking a look at any previous sections you've missed to learn about what you can do with Mirascope. You can also check out the section on [Provider-Specific Features](/docs/v1/learn/provider-specific/openai) to learn about how to use features that only certain providers support, such as OpenAI's structured outputs.
