mirascope.core.openai.tool¶
The `OpenAITool` class for easy tool usage with OpenAI LLM calls.
OpenAIToolConfig¶
OpenAITool¶
Bases: BaseTool
A class for defining tools for OpenAI LLM calls.
Example:

```python
from mirascope.core.openai import openai_call


def format_book(title: str, author: str) -> str:
    return f"{title} by {author}"


@openai_call("gpt-4o-mini", tools=[format_book])
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


response = recommend_book("fantasy")
if tool := response.tool:  # returns an `OpenAITool` instance
    print(tool.call())
```
tool_schema classmethod¶

Constructs a JSON Schema tool schema from the `BaseModel` schema defined.
Example:

```python
from mirascope.core.openai import OpenAITool


def format_book(title: str, author: str) -> str:
    return f"{title} by {author}"


tool_type = OpenAITool.type_from_fn(format_book)
print(tool_type.tool_schema())  # prints the OpenAI-specific tool schema
```
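Because the returned schema is in OpenAI's tool format, it can also be handed straight to the OpenAI SDK's `tools` parameter. The sketch below assumes an `OPENAI_API_KEY` is configured; the model and prompt are arbitrary choices for illustration, not part of this API.

```python
from openai import OpenAI

from mirascope.core.openai import OpenAITool


def format_book(title: str, author: str) -> str:
    return f"{title} by {author}"


# Build the OpenAI-format tool schema and pass it to a raw chat completion call.
client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Recommend a fantasy book"}],
    tools=[OpenAITool.type_from_fn(format_book).tool_schema()],
)
print(completion.choices[0].message.tool_calls)
```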
from_tool_call classmethod¶

```python
from_tool_call(
    tool_call: ChatCompletionMessageToolCall,
    allow_partial: bool = False,
) -> OpenAITool
```

Constructs an `OpenAITool` instance from a `tool_call`.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `tool_call` | `ChatCompletionMessageToolCall` | The OpenAI tool call from which to construct this tool instance. | required |
| `allow_partial` | `bool` | Whether to allow partial JSON data. | `False` |
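Example (a minimal sketch: the `ChatCompletionMessageToolCall` below is hand-built, with a made-up ID and arguments, standing in for a tool call returned by the OpenAI API; it assumes an openai SDK version where `ChatCompletionMessageToolCall` is the pydantic model named in the signature above):

```python
from openai.types.chat import ChatCompletionMessageToolCall
from openai.types.chat.chat_completion_message_tool_call import Function

from mirascope.core.openai import OpenAITool


def format_book(title: str, author: str) -> str:
    return f"{title} by {author}"


tool_type = OpenAITool.type_from_fn(format_book)

# Hand-built stand-in for a tool call returned by the OpenAI API.
tool_call = ChatCompletionMessageToolCall(
    id="call_123",
    type="function",
    function=Function(
        name="format_book",
        arguments='{"title": "Mistborn", "author": "Brandon Sanderson"}',
    ),
)

tool = tool_type.from_tool_call(tool_call)
print(tool.call())  # Mistborn by Brandon Sanderson
```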