llms-full.txt - LLMs Text Viewer

Concatenated markdown docs, intended for use by LLMs.

The full file is available at /llms-full.txt.

llms-full.txt (73k tokens)

Concatenated documentation for Mirascope and Lilypad, intended to get LLMs up to speed on both products.

Mirascope (62k tokens)

LLM abstractions that aren't obstructions.

- Mirascope is a Python library that streamlines working with LLMs. (1k tokens)
- Get started with Mirascope across various LLM providers. (3k tokens)
- A comprehensive guide to Mirascope's core components and features. This overview provides a roadmap for learning how to build AI-powered applications with Mirascope. (2k tokens)
- Master the art of creating effective prompts for LLMs using Mirascope. Learn about message roles, multi-modal inputs, and dynamic prompt configuration. (7k tokens)
- Learn how to make API calls to various LLM providers using Mirascope. This guide covers basic usage, handling responses, and configuring call parameters for different providers. (5k tokens)
- Learn how to process LLM responses in real time as they are generated, using Mirascope's streaming capabilities for more interactive and responsive applications. (3k tokens)
- Learn how to combine multiple LLM calls in sequence to solve complex tasks through functional chaining, nested chains, conditional execution, and parallel processing. (3k tokens)
- Learn how to structure and validate LLM outputs using Pydantic models for type safety, automatic validation, and easier data manipulation across different providers. (6k tokens)
- Learn how to request structured JSON outputs from LLMs with Mirascope's JSON Mode for easier parsing, validation, and integration with your applications. (1k tokens)
- Learn how to process and structure raw LLM outputs into usable formats using Mirascope's flexible output parsers for more reliable and application-ready results. (2k tokens)
- Learn how to define, use, and chain together LLM-powered tools in Mirascope to extend model capabilities with external functions, data sources, and system interactions. (14k tokens)
- Learn how to build autonomous and semi-autonomous LLM-powered agents with Mirascope that can use tools, maintain state, and execute multi-step reasoning processes. (5k tokens)
- Learn how to evaluate LLM outputs using multiple approaches, including LLM-based evaluators, panels of judges, and hardcoded evaluation criteria. (3k tokens)
- Learn how to use asynchronous programming with Mirascope to efficiently handle I/O-bound operations, improve responsiveness, and run multiple LLM calls concurrently. (2k tokens)
- Learn how to implement robust retry mechanisms for LLM API calls using Mirascope and Tenacity to handle rate limits, validation errors, and other failures. (4k tokens)
- Learn how to use Mirascope with locally hosted open-source models through Ollama, vLLM, and other OpenAI-compatible APIs. (1k tokens)

Lilypad (11k tokens)

Spin up your data flywheel with one line of code.

- An open-source prompt engineering framework (2k tokens)
- How Lilypad balances open-source and enterprise features (<1k tokens)
- Start using Lilypad in one line of code (2k tokens)
- Run Lilypad in your own infrastructure (1k tokens)
- No-code interface for experimenting with Lilypad (<1k tokens)
- Observability made easy (<1k tokens)
- Easily instrument arbitrary blocks of code with OpenTelemetry (<1k tokens)
- Structured collections of spans (1k tokens)
- Track versions of your LLM functions (<1k tokens)
- Monitor the performance and cost of your LLM functions (<1k tokens)
- Compare different LLM function implementations (<1k tokens)
- Add labels and feedback to your LLM outputs (<1k tokens)
- API documentation for Lilypad (<1k tokens)