# Local (Open-Source) Models

You can use the [`llm.call`](/docs/v1/api) decorator to interact with models running with [Ollama](https://github.com/ollama/ollama) or [vLLM](https://github.com/vllm-project/vllm):

<TabbedSection>
<Tab value="Ollama">

```python
from mirascope import llm
from pydantic import BaseModel


@llm.call("ollama", "llama3.2")  # [!code highlight]
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


recommendation = recommend_book("fantasy")
print(recommendation)
# Output: Here are some popular and highly-recommended fantasy books...


class Book(BaseModel):
    title: str
    author: str


@llm.call("ollama", "llama3.2", response_model=Book)  # [!code highlight]
def extract_book(text: str) -> str:
    return f"Extract {text}"


book = extract_book("The Name of the Wind by Patrick Rothfuss")
assert isinstance(book, Book)
print(book)
# Output: title='The Name of the Wind' author='Patrick Rothfuss'
```

</Tab>
<Tab value="vLLM">

```python
from mirascope import llm
from pydantic import BaseModel


@llm.call("vllm", "llama3.2")  # [!code highlight]
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


recommendation = recommend_book("fantasy")
print(recommendation)
# Output: Here are some popular and highly-recommended fantasy books...


class Book(BaseModel):
    title: str
    author: str


@llm.call("vllm", "llama3.2", response_model=Book)  # [!code highlight]
def extract_book(text: str) -> str:
    return f"Extract {text}"


book = extract_book("The Name of the Wind by Patrick Rothfuss")
assert isinstance(book, Book)
print(book)
# Output: title='The Name of the Wind' author='Patrick Rothfuss'
```

</Tab>
</TabbedSection>

<Info title="Double Check Support" collapsible={true} defaultOpen={false}>
The `llm.call` decorator uses OpenAI compatibility under the hood. Not all open-source models or providers support every feature of the OpenAI API, but most common use cases are covered. See the links below for more details:

- [Ollama OpenAI Compatibility](https://github.com/ollama/ollama/blob/main/docs/openai.md)
- [vLLM OpenAI Compatibility](https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html)
</Info>
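Mirascope's other call features generally compose with local models the same way. As a minimal streaming sketch (assuming the same `llama3.2` model served by Ollama), you can pass `stream=True` to the decorator and iterate over chunks as they arrive:

```python
from mirascope import llm


# A minimal streaming sketch against a local model (assumes llama3.2 via Ollama).
# stream=True makes the decorated call return a stream instead of a full response.
@llm.call("ollama", "llama3.2", stream=True)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


stream = recommend_book("fantasy")
for chunk, _ in stream:  # each item is a (chunk, tool) pair; no tools are used here
    print(chunk.content, end="", flush=True)
```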
## OpenAI Compatibility

When self-hosting (fine-tuned) open-source LLMs locally or in your own cloud with tools that offer OpenAI compatibility, you can use the [`openai.call`](/docs/v1/api) decorator with a [custom client](/docs/v1/learn/calls#custom-client) to interact with your model using all of Mirascope's features.

<TabbedSection>
<Tab value="Ollama">

```python
from mirascope.core import openai
from openai import OpenAI
from pydantic import BaseModel

custom_client = OpenAI(  # [!code highlight]
    base_url="http://localhost:11434/v1",  # your ollama endpoint # [!code highlight]
    api_key="ollama",  # required by openai, but unused # [!code highlight]
)  # [!code highlight]


@openai.call("llama3.2", client=custom_client)  # [!code highlight]
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


recommendation = recommend_book("fantasy")
print(recommendation)
# Output: Here are some popular and highly-recommended fantasy books...


class Book(BaseModel):
    title: str
    author: str


@openai.call("llama3.2", response_model=Book, client=custom_client)  # [!code highlight]
def extract_book(text: str) -> str:
    return f"Extract {text}"


book = extract_book("The Name of the Wind by Patrick Rothfuss")
assert isinstance(book, Book)
print(book)
# Output: title='The Name of the Wind' author='Patrick Rothfuss'
```

</Tab>
<Tab value="vLLM">

```python
from mirascope.core import openai
from openai import OpenAI
from pydantic import BaseModel

custom_client = OpenAI(  # [!code highlight]
    base_url="http://localhost:8000/v1",  # your vLLM endpoint # [!code highlight]
    api_key="vllm",  # required by openai, but unused # [!code highlight]
)  # [!code highlight]


@openai.call("llama3.2", client=custom_client)  # [!code highlight]
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


recommendation = recommend_book("fantasy")
print(recommendation)
# Output: Here are some popular and highly-recommended fantasy books...


class Book(BaseModel):
    title: str
    author: str


@openai.call("llama3.2", response_model=Book, client=custom_client)  # [!code highlight]
def extract_book(text: str) -> str:
    return f"Extract {text}"


book = extract_book("The Name of the Wind by Patrick Rothfuss")
assert isinstance(book, Book)
print(book)
# Output: title='The Name of the Wind' author='Patrick Rothfuss'
```

</Tab>
</TabbedSection>
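The same custom-client pattern composes with async calls. A minimal sketch, assuming an Ollama server on its default port, where `AsyncOpenAI` replaces `OpenAI` and the decorated function becomes `async`:

```python
import asyncio

from mirascope.core import openai
from openai import AsyncOpenAI

# Async variant of the custom-client pattern (assumes an Ollama server
# on its default port); AsyncOpenAI is the async client from the openai SDK.
custom_client = AsyncOpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # required by the openai SDK, but unused
)


@openai.call("llama3.2", client=custom_client)
async def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


async def main() -> None:
    recommendation = await recommend_book("fantasy")
    print(recommendation)


asyncio.run(main())
```

Only the client class and the `async`/`await` keywords change; the decorator arguments, response models, and streaming options shown above apply unchanged.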
