The un-framework for AI Agents — just the plumbing, no supply-chain baggage.
Status: Alpha. Python 3.10–3.13. Zero runtime dependencies. CI runs ruff, pytest, and package builds on every push and pull request to main.
Get Started · Why Baretools? · Concepts · Provider Integrations · API Reference · Changelog
`pip install baretools-ai` installs one package and zero runtime dependencies. No Pydantic required. No httpx. No hidden transitive packages that drift, get compromised, or bloat your Docker image.
Baretools handles the mechanical plumbing between your Python functions and LLM tool calls:

- generating provider-specific tool schemas from your functions
- parsing provider responses into normalized ToolCall dicts
- executing the called tools against a registry
- formatting results back into each provider's message format

Everything else — prompts, context, retries, state, orchestration — stays in your code.
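"Generating tool schemas" means turning a typed Python function into a JSON-Schema description an LLM API can consume. A minimal standard-library sketch of the idea (not baretools' actual implementation — the real library handles defaults, docstring parsing, and more types):

```python
import inspect

# Map a few Python annotations to JSON Schema types (illustrative only)
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def sketch_schema(fn):
    """Build a rough JSON-Schema parameter description from a function signature."""
    sig = inspect.signature(fn)
    props = {
        name: {"type": PY_TO_JSON.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": props,
            "required": list(props),
        },
    }

def get_weather(location: str) -> str:
    """Get current weather for a location."""
    return f"Sunny, 72°F in {location}"

schema = sketch_schema(get_weather)
```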
```python
from baretools import tool, ToolRegistry, parse_tool_calls, format_tool_results

@tool
def get_weather(location: str) -> str:
    """Get current weather for a location."""
    return f"Sunny, 72°F in {location}"

tools = ToolRegistry()
tools.register(get_weather)

# Same pattern for OpenAI, Anthropic, or Gemini — only provider= changes
schemas = tools.get_schemas("openai")
tool_calls = parse_tool_calls(llm_response, provider="openai")
results = tools.execute(tool_calls)
messages = format_tool_results(results, provider="openai")
```
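To make the parse → execute → format steps concrete, here is the same round trip done by hand against a mocked OpenAI-style assistant message. The response dict follows OpenAI's public chat-completions shape; the manual dispatch and `role: "tool"` reply are what `parse_tool_calls`, `execute`, and `format_tool_results` automate for you:

```python
import json

def get_weather(location: str) -> str:
    """Get current weather for a location."""
    return f"Sunny, 72°F in {location}"

# A mocked OpenAI-style assistant message containing one tool call.
llm_response = {
    "role": "assistant",
    "tool_calls": [
        {
            "id": "call_1",
            "type": "function",
            "function": {"name": "get_weather", "arguments": '{"location": "Paris"}'},
        }
    ],
}

registry = {"get_weather": get_weather}

# Parse: extract name and arguments (OpenAI sends arguments as a JSON string)
call = llm_response["tool_calls"][0]
args = json.loads(call["function"]["arguments"])

# Execute: dispatch to the registered function
result = registry[call["function"]["name"]](**args)

# Format: OpenAI expects a role="tool" message echoing the call id
tool_message = {"role": "tool", "tool_call_id": call["id"], "content": result}
```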
```sh
pip install baretools-ai
```
Optional Pydantic support (only needed if your tools accept BaseModel parameters):
```sh
pip install "baretools-ai[pydantic]"
```
| Provider | Schema format | `parse_tool_calls` | `format_tool_results` |
|---|---|---|---|
| OpenAI | `{"type":"function","function":{...}}` | ✓ | ✓ |
| Anthropic | `{"name":...,"input_schema":{...}}` | ✓ | ✓ |
| Gemini | `{"functionDeclarations":[...]}` | ✓ | ✓ |
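The normalization the table promises can be seen by comparing how each provider represents the same call. The shapes below follow the providers' public APIs; the tiny `normalize` helper is a hand-rolled illustration of what `parse_tool_calls` does per provider, not baretools' internal ToolCall format:

```python
import json

# Roughly how each provider represents the same tool call (per public provider APIs)
openai_call = {
    "id": "call_1",
    "type": "function",
    "function": {"name": "get_weather", "arguments": '{"location": "Paris"}'},
}
anthropic_call = {
    "type": "tool_use",
    "id": "toolu_1",
    "name": "get_weather",
    "input": {"location": "Paris"},
}
gemini_call = {"functionCall": {"name": "get_weather", "args": {"location": "Paris"}}}

def normalize(call, provider):
    """Reduce a provider-specific tool call to (name, args), the job parse_tool_calls automates."""
    if provider == "openai":
        # OpenAI uniquely serializes arguments as a JSON string
        return call["function"]["name"], json.loads(call["function"]["arguments"])
    if provider == "anthropic":
        return call["name"], call["input"]
    if provider == "gemini":
        fc = call["functionCall"]
        return fc["name"], fc["args"]
    raise ValueError(f"unknown provider: {provider}")
```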