Build on Brain

The only industrial PLC with a real plugin SDK.

Drop a Python file into a folder. The AI agent uses it as a tool. No vendor lock-in, no licenses, no committees. Siemens, Rockwell, Beckhoff — none of them have this.

PYTHON SDK · MCP SERVER · REST API · BYOK · WEBSOCKETS

brain_sdk

YAML manifest + Python handler. That's it.

Machine builders create domain-specific tools in Python. The AI agent uses plugin tools exactly like built-in tools. Sandboxed subprocess execution. 30-second timeout. Tier-capped. Tag-whitelist-restricted.

Python — plugin handler
# plugins/brewery/mash_profile.py
from brain_sdk import tool, Tier

@tool(
    name="set_mash_profile",
    tier=Tier.CONTROL,
    tags_write=["MASH.*.SETPOINT"],
    description="Set mash temperature profile stages"
)
def set_mash_profile(stages: list[dict]) -> dict:
    """Configure multi-stage mash with temperature rests."""
    # Your logic here
    total_time = sum(s.get("duration_min", 0) for s in stages)
    return {"stages": len(stages), "duration_min": total_time}
YAML — plugin manifest
# plugins/brewery/manifest.yaml
name: Brewery Control
version: 1.0.0
author: YourCompany
tier_cap: CONTROL
tags_whitelist:
  - MASH.*
  - FERMENT.*
  - SPARGE.*
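The whitelist patterns above can be enforced with plain glob matching — a minimal sketch, assuming `fnmatch`-style wildcards (the SDK's actual matching rules may differ):

```python
from fnmatch import fnmatch

def tag_allowed(tag: str, whitelist: list[str]) -> bool:
    """Return True if the tag matches any whitelist pattern."""
    return any(fnmatch(tag, pattern) for pattern in whitelist)

# Patterns from the manifest above
whitelist = ["MASH.*", "FERMENT.*", "SPARGE.*"]

tag_allowed("MASH.HLT.SETPOINT", whitelist)  # allowed
tag_allowed("BOILER.PRESSURE", whitelist)    # rejected
```

Any write a plugin attempts outside its whitelist is rejected before it ever reaches a tag.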

Sandboxed

Subprocess isolation

30s timeout

No runaway tools

Tier-capped

READ / CONTROL / ADMIN

Tag whitelist

Surgical write scope
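The isolation model above can be approximated in a few lines — a hedged sketch of running a handler in a subprocess with a hard timeout (the real SDK runner is more involved; the stdin/stdout JSON contract here is an assumption for illustration):

```python
import json
import subprocess
import sys

def run_plugin(handler_path: str, payload: dict, timeout_s: int = 30) -> dict:
    """Run a plugin handler in an isolated subprocess with a hard timeout."""
    try:
        proc = subprocess.run(
            [sys.executable, handler_path],
            input=json.dumps(payload),
            capture_output=True,
            text=True,
            timeout=timeout_s,  # runaway tools are killed, not awaited
        )
    except subprocess.TimeoutExpired:
        return {"error": f"plugin exceeded {timeout_s}s timeout"}
    if proc.returncode != 0:
        return {"error": proc.stderr.strip()}
    return json.loads(proc.stdout)
```

A crash, a hang, or an infinite loop in a plugin surfaces as an error dict instead of taking the agent down with it.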

Shipped with the SDK

Three production-ready plugins.

🍺

Brewery

  • Mash profiles with multi-stage temperature rests
  • Fermentation control with auto-attenuation detection
  • Sparge timing and volume tracking
  • SRM color monitoring via Brain Vision
🏢

HVAC

  • Zone scheduling with occupancy awareness
  • Energy optimization (kWh minimization)
  • Setpoint ramping for comfort
  • Predictive pre-heat / pre-cool
📦

Packaging

  • Conveyor speed coordination
  • Label synchronization with product tracking
  • Throughput balancing across lines
  • Reject station coordination

MCP Server

Connect Claude Desktop, Cursor, or any AI client directly to your PLC.

JSON-RPC 2.0 on port 8766. API key authentication with tier-based access control. External AI clients use Brain's tools as if they were native.

Resources

  • brain://tags — All PLC tags (read/write)
  • brain://alarms/active — Current alarm state
  • brain://system/status — Controller health
  • brain://programs — PLC program library
  • brain://historian — Time-series data queries
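On the wire, reading one of these resources is a standard JSON-RPC 2.0 call — a sketch that only builds the request body (transport details like headers and auth depend on the server config, so they're left out):

```python
import json

def mcp_request(method: str, params: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request body for the MCP server on port 8766."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# Read the live tag table exposed as an MCP resource
body = mcp_request("resources/read", {"uri": "brain://tags"})
```

`resources/read` and the `uri` parameter come from the MCP specification; clients like Claude Desktop issue these calls for you via the proxy below.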

Claude Desktop Config

claude_desktop_config.json
{
  "mcpServers": {
    "brain": {
      "command": "brain-mcp-proxy",
      "env": {
        "BRAIN_URL": "https://brain.local:8766",
        "BRAIN_API_KEY": "bk_live_..."
      }
    }
  }
}

REST API

100+ endpoints. 33 route groups. Full OpenAPI spec.

Every Brain action is exposed as HTTP. JWT authentication. RBAC with 5 roles. Full audit trail. WebSocket for real-time events.

HTTP — common endpoints
GET    /api/v1/tags                 # List all tags
POST   /api/v1/tags                 # Create tag
GET    /api/v1/tags/{name}/value    # Read tag value
POST   /api/v1/tags/{name}/value    # Write tag value
GET    /api/v1/alarms/active        # Active alarms
POST   /api/v1/programs/{id}/start  # Start PLC program
GET    /api/v1/historian/query      # Time-series query
WS     /api/v1/events               # Real-time events
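Writing a tag from your own code is a single authenticated POST — a sketch using only the standard library (the base URL is an assumption; substitute your cabinet's address and REST port):

```python
import json
import urllib.request

BASE = "https://brain.local"  # assumed host; the REST port is not shown above

def write_tag(name: str, value: float, jwt: str) -> urllib.request.Request:
    """Build an authenticated tag write: POST /api/v1/tags/{name}/value."""
    return urllib.request.Request(
        f"{BASE}/api/v1/tags/{name}/value",
        data=json.dumps({"value": value}).encode(),
        headers={
            "Authorization": f"Bearer {jwt}",  # JWT auth, per the docs above
            "Content-Type": "application/json",
        },
        method="POST",
    )

# urllib.request.urlopen(write_tag("MASH.HLT.SETPOINT", 66.5, jwt))
```

Every such write lands in the audit trail with the caller's identity and RBAC role attached.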

100+

Endpoints

33

Route groups

5

RBAC roles

JWT

Authentication

BYOK

Your AI keys. Your costs. Your control.

Configure Anthropic, OpenAI, Azure OpenAI, or any OpenAI-compatible endpoint in the admin dashboard. Your keys stay on your cabinet. Brain never proxies your API calls through our servers.

Anthropic Claude

Sonnet, Opus

OpenAI

GPT-4o, GPT-5

Azure OpenAI

Enterprise-grade

Local LLM

OpenAI-compatible endpoint

Self-hosted

Ollama, vLLM, and more
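For a fully self-hosted setup, any OpenAI-compatible endpoint works — an illustrative config for a local Ollama instance (the field names here are assumptions; the actual dashboard fields may differ):

```json
{
  "provider": "openai-compatible",
  "base_url": "http://localhost:11434/v1",
  "api_key": "ollama",
  "model": "llama3.1:8b"
}
```

`http://localhost:11434/v1` is Ollama's standard OpenAI-compatible route; vLLM exposes the same API shape on its own port. With a local model, no tag data ever leaves the cabinet.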

Resources

Start building.

Ship faster with Brain's developer platform.