🌿

Open tools for open models

TEA/Branch builds open-source infrastructure that lets you use any AI model with any tool — no vendor lock-in.

🍃 MIT Licensed

API Proxy

Open Responses Server

Use any AI backend with OpenAI's Responses API. Run Codex against Ollama, vLLM, or any OpenAI-compatible model.

  • Drop-in Responses API proxy
  • MCP tool integration
  • Streaming & SSE support
  • Docker & PyPI packages
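
As a sketch of the drop-in design (the port and model name below are placeholders, not documented defaults), the standard OpenAI Python SDK can point at the proxy unchanged:

```python
# A minimal sketch, assuming the proxy listens on localhost:8080 and an
# Ollama backend serves "llama3" -- both values are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="unused")

# A standard Responses API call; the proxy translates it for the backend.
response = client.responses.create(
    model="llama3",
    input="Say hello from an open model.",
)
print(response.output_text)
```
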
Unified API

Open Bedrock Server

A single chat completions endpoint that works with both OpenAI and AWS Bedrock models. Switch providers without changing code.

  • Provider-agnostic chat API
  • OpenAI + Bedrock support
  • Knowledge base integration
  • File API support
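
A rough illustration of provider switching; the endpoint and model ids here are assumptions, not the server's documented defaults:

```python
# Illustrative sketch: assumes the server runs at localhost:8000 and routes
# requests by model id; adjust both to match your deployment.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="local-key")

models = [
    "gpt-4o-mini",                                # an OpenAI model
    "anthropic.claude-3-5-sonnet-20240620-v1:0",  # a Bedrock model id
]

# Same endpoint, same request shape, two different providers.
for model in models:
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Which provider served this?"}],
    )
    print(model, "->", reply.choices[0].message.content)
```
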
Cost Management

Bedrock Budgeteer

Serverless budget monitoring and control for AWS Bedrock API usage. Real-time cost tracking with progressive access control to prevent overruns.

  • Real-time budget monitoring
  • Progressive access control
  • Multi-channel alerts (Email, Slack, SMS)
  • Fully serverless architecture
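
"Progressive" here means restrictions escalate as spend approaches the budget. A toy sketch of the idea follows; the thresholds and actions are illustrative, not the project's actual policy:

```python
# Illustrative only: a toy sketch of progressive access control, not
# Bedrock Budgeteer's actual policy or thresholds.
def access_action(spend: float, budget: float) -> str:
    """Map current spend against a budget to an escalating action."""
    ratio = spend / budget
    if ratio < 0.75:
        return "allow"      # normal operation
    if ratio < 0.90:
        return "warn"       # e.g. email alert
    if ratio < 1.00:
        return "throttle"   # e.g. Slack/SMS alerts plus reduced rate limits
    return "block"          # hard stop to prevent an overrun

assert access_action(50.0, 100.0) == "allow"
assert access_action(95.0, 100.0) == "throttle"
assert access_action(120.0, 100.0) == "block"
```
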
Agent Packaging

Abbyfile

Package AI agents defined in declarative YAML + Markdown into standalone, versioned CLI binaries. Build once, run on any MCP-compatible runtime.

  • Declarative YAML + Markdown definitions
  • Standalone distributable binaries
  • Multi-runtime support (Claude Code, Codex, Gemini)
  • MCP-over-stdio tool integration
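
For a feel of the declarative style, here is an entirely hypothetical agent definition parsed with PyYAML; Abbyfile's real schema may differ:

```python
# Entirely hypothetical: a guess at what a declarative agent definition
# could look like; Abbyfile's real schema may differ. Requires PyYAML.
import yaml

AGENT_DEF = """
name: release-notes-bot
version: 0.1.0
runtime: mcp-stdio
tools:
  - git
  - github
prompt: ./prompts/release_notes.md   # Markdown system prompt
"""

agent = yaml.safe_load(AGENT_DEF)
print(f"{agent['name']} v{agent['version']} on {agent['runtime']}")
```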