
Mirascope - LLMs Text Viewer

Concatenated markdown docs, intended for use by LLMs.

The full concatenated text is available at /docs/mirascope/llms-full.txt.

Mirascope (62k tokens)

LLM abstractions that aren't obstructions.

Mirascope is a Python library that streamlines working with LLMs. (1k tokens)

Get started with Mirascope across various LLM providers. (3k tokens)

A comprehensive guide to Mirascope's core components and features. This overview provides a roadmap for learning how to build AI-powered applications with Mirascope. (2k tokens)

Master the art of creating effective prompts for LLMs using Mirascope. Learn about message roles, multi-modal inputs, and dynamic prompt configuration. (7k tokens)
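A minimal sketch of the prompt style described here, assuming Mirascope's `prompt_template` decorator with uppercase role markers; the librarian persona and the exact template syntax are illustrative and may vary by version.

```python
from mirascope.core import prompt_template


# A prompt template with explicit message roles; calling the function
# renders the template into a list of role-tagged message parameters.
@prompt_template(
    """
    SYSTEM: You are the world's greatest librarian.
    USER: Recommend a {genre} book for someone who liked {book_title}.
    """
)
def recommend_book_prompt(genre: str, book_title: str): ...


messages = recommend_book_prompt("fantasy", "The Hobbit")
print(messages)  # system + user messages with the fields filled in
```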

Learn how to make API calls to various LLM providers using Mirascope. This guide covers basic usage, handling responses, and configuring call parameters for different providers. (5k tokens)
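A minimal sketch of a basic call, assuming Mirascope's provider-module decorator (`@openai.call`) and the `gpt-4o-mini` model name; swapping in another provider module targets a different backend.

```python
from mirascope.core import openai


# The decorated function's return value becomes the user message;
# calling the function makes the API request and returns a response object.
@openai.call("gpt-4o-mini")
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


response = recommend_book("fantasy")
print(response.content)  # the generated text
```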

Learn how to process LLM responses in real time as they are generated using Mirascope's streaming capabilities for more interactive and responsive applications. (3k tokens)
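A hedged sketch of streaming, assuming the call decorator accepts `stream=True` and the returned stream yields (chunk, tool) pairs as in Mirascope's v1 API.

```python
from mirascope.core import openai


@openai.call("gpt-4o-mini", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


# Iterating the stream prints chunks as they arrive instead of
# waiting for the full completion.
stream = recommend_book("fantasy")
for chunk, _ in stream:
    print(chunk.content, end="", flush=True)
```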

Learn how to combine multiple LLM calls in sequence to solve complex tasks through functional chaining, nested chains, conditional execution, and parallel processing. (3k tokens)
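Because each decorated function is an ordinary Python callable, chaining is plain function composition. A minimal sketch, under the same `@openai.call` / `gpt-4o-mini` assumptions as above:

```python
from mirascope.core import openai


@openai.call("gpt-4o-mini")
def summarize(text: str) -> str:
    return f"Summarize this text: {text}"


@openai.call("gpt-4o-mini")
def translate(text: str, language: str) -> str:
    return f"Translate this text into {language}: {text}"


def summarize_and_translate(text: str, language: str) -> str:
    # Feed the first call's output into the second call.
    summary = summarize(text)
    return translate(summary.content, language).content
```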

Learn how to structure and validate LLM outputs using Pydantic models for type safety, automatic validation, and easier data manipulation across different providers. (6k tokens)
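A sketch of structured extraction, assuming the call decorator's `response_model` parameter accepts a Pydantic model; the `Book` schema here is illustrative.

```python
from pydantic import BaseModel

from mirascope.core import openai


class Book(BaseModel):
    """Illustrative schema for the extracted output."""

    title: str
    author: str


@openai.call("gpt-4o-mini", response_model=Book)
def extract_book(text: str) -> str:
    return f"Extract the book from this text: {text}"


book = extract_book("The Name of the Wind by Patrick Rothfuss")
print(book.title, "-", book.author)  # a validated Book instance, not raw text
```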

Learn how to request structured JSON outputs from LLMs with Mirascope's JSON Mode for easier parsing, validation, and integration with your applications. (1k tokens)
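A sketch of JSON Mode, assuming the decorator accepts a `json_mode=True` flag and that the raw JSON string comes back on `response.content`.

```python
import json

from mirascope.core import openai


@openai.call("gpt-4o-mini", json_mode=True)
def get_book_info(title: str) -> str:
    return f"Provide the author and genre of {title}"


response = get_book_info("The Name of the Wind")
info = json.loads(response.content)  # parse the JSON the model returned
print(info)
```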

Learn how to process and structure raw LLM outputs into usable formats using Mirascope's flexible output parsers for more reliable and application-ready results. (2k tokens)
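A sketch of an output parser, assuming the decorator's `output_parser` parameter takes a function from the provider response to any value; `parse_title` is a hypothetical parser.

```python
from mirascope.core import openai


def parse_title(response: openai.OpenAICallResponse) -> str:
    """Hypothetical parser: keep only the first line of the raw output."""
    return response.content.split("\n")[0].strip()


@openai.call("gpt-4o-mini", output_parser=parse_title)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book. Put the title alone on the first line."


title = recommend_book("fantasy")  # the call now returns the parsed string
print(title)
```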

Learn how to define, use, and chain together LLM-powered tools in Mirascope to extend model capabilities with external functions, data sources, and system interactions. (14k tokens)
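A sketch of tool use, assuming plain functions with docstrings can be passed via the decorator's `tools` parameter and that the response exposes a `tool` attribute when the model requests a call; `get_book_author` is an illustrative stand-in for a real data source.

```python
from mirascope.core import openai


def get_book_author(title: str) -> str:
    """Returns the author of the book with the given title."""
    # Hypothetical lookup; a real tool might query a database or API.
    return "Patrick Rothfuss" if title == "The Name of the Wind" else "Unknown"


@openai.call("gpt-4o-mini", tools=[get_book_author])
def identify_author(book: str) -> str:
    return f"Who wrote {book}?"


response = identify_author("The Name of the Wind")
if tool := response.tool:
    print(tool.call())  # execute the function call the model requested
else:
    print(response.content)
```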

Learn how to build autonomous and semi-autonomous LLM-powered agents with Mirascope that can use tools, maintain state, and execute multi-step reasoning processes. (5k tokens)
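A deliberately simplified agent sketch under the same assumptions: state is a plain transcript kept on the class, and each turn re-injects it into the prompt. Mirascope's own agent patterns use richer message history; this is only an illustration of the loop.

```python
from mirascope.core import openai


class Librarian:
    """Minimal stateful agent: keeps a transcript and feeds it back each turn."""

    def __init__(self) -> None:
        self.history: list[str] = []

    @openai.call("gpt-4o-mini")
    def _step(self, query: str) -> str:
        transcript = "\n".join(self.history)
        return (
            "You are a helpful librarian.\n"
            f"Conversation so far:\n{transcript}\n\n"
            f"User: {query}\nAssistant:"
        )

    def run(self) -> None:
        while True:
            query = input("You: ")
            if query.lower() in {"exit", "quit"}:
                break
            response = self._step(query)
            print(f"Assistant: {response.content}")
            self.history += [f"User: {query}", f"Assistant: {response.content}"]


# Librarian().run()
```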

Learn how to evaluate LLM outputs using multiple approaches including LLM-based evaluators, panels of judges, and hardcoded evaluation criteria. (3k tokens)
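A sketch of an LLM-based evaluator, reusing the `response_model` pattern above to force a structured score; the 0-5 toxicity scale and field names are illustrative assumptions.

```python
from pydantic import BaseModel, Field

from mirascope.core import openai


class Evaluation(BaseModel):
    """Illustrative judge output: a score plus a short justification."""

    score: float = Field(..., description="Toxicity score from 0 (none) to 5 (extreme)")
    reasoning: str = Field(..., description="Brief justification for the score")


@openai.call("gpt-4o-mini", response_model=Evaluation)
def evaluate_toxicity(text: str) -> str:
    return f"Rate the toxicity of the following text on a 0-5 scale:\n{text}"


result = evaluate_toxicity("Thanks so much for the thoughtful review!")
print(result.score, result.reasoning)
```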

Learn how to use asynchronous programming with Mirascope to efficiently handle I/O-bound operations, improve responsiveness, and run multiple LLM calls concurrently. (2k tokens)
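A sketch of async usage, assuming an `async def` function decorated the same way becomes awaitable, so several calls can run concurrently with `asyncio.gather`.

```python
import asyncio

from mirascope.core import openai


@openai.call("gpt-4o-mini")
async def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


async def main() -> None:
    # Run several I/O-bound LLM calls concurrently instead of sequentially.
    responses = await asyncio.gather(
        recommend_book("fantasy"),
        recommend_book("mystery"),
        recommend_book("science fiction"),
    )
    for response in responses:
        print(response.content)


asyncio.run(main())
```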

Learn how to implement robust retry mechanisms for LLM API calls using Mirascope and Tenacity to handle rate limits, validation errors, and other failures. (4k tokens)
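A sketch of retries with Tenacity, stacking its `@retry` decorator on top of the call; the attempt count and backoff settings are illustrative.

```python
from tenacity import retry, stop_after_attempt, wait_exponential

from mirascope.core import openai


# Retry up to 3 times with exponential backoff on any raised exception
# (rate limits, transient network errors, validation failures, ...).
@retry(stop=stop_after_attempt(3), wait=wait_exponential(multiplier=1, min=1, max=10))
@openai.call("gpt-4o-mini")
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


print(recommend_book("fantasy").content)
```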

Learn how to use Mirascope with locally hosted open-source models through Ollama, vLLM, and other OpenAI-compatible APIs. (1k tokens)
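A sketch of pointing the OpenAI-compatible decorator at a local server, assuming the decorator accepts a custom `client`; the Ollama URL, placeholder API key, and model name are assumptions about your local setup.

```python
from openai import OpenAI

from mirascope.core import openai


# Ollama (and vLLM's OpenAI-compatible server) expose an OpenAI-style API,
# so a custom client with a local base_url is the main thing that changes.
local_client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")


@openai.call("llama3.2", client=local_client)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


print(recommend_book("fantasy").content)
```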