Build your first integrations.

Step-by-step walkthroughs for Reachy Mini, HLabs ACB v2.0, DIY robots with Pi 5 or Jetson, ESP32, and LEGO Mindstorms.

🤖 Reachy Mini Integration

Connect Pollen Robotics Reachy Mini to OpenCastor via gRPC. Control the 5-DOF head, speaker, and mic from a unified RCAN config. Includes mDNS auto-discovery, behavior scripting, and live embedding capture.

Read tutorial →

โš™๏ธ HLabs ACB v2.0 โ€” Custom BLDC Arm

Build a low-cost BLDC motor arm using the open-source HLabs ACB v2.0 controller. CAN Bus wiring, DFU firmware flash, 50Hz telemetry, and Prometheus monitoring out of the box.

Read tutorial →

🦾 DIY Robot: ACB + Pi 5 or Jetson

Full build guide โ€” combine HLabs ACB v2.0 motor controllers with a Raspberry Pi 5 or Nvidia Jetson as the robot brain. Covers hardware assembly, OpenCastor install, and running ML inference on-device.

Read tutorial →

⚡ ESP32 Integration

Learn how to wire an ESP32-based robot, expose a simple API over Wi-Fi, and connect it to OpenCastor using an RCAN preset.

Read tutorial →

🧱 LEGO Mindstorms Integration

Connect EV3 or SPIKE hardware, map motors and sensors, and validate that OpenCastor can safely command movement.

Read tutorial →

🤖 Reachy Mini + OpenCastor

Connect a Pollen Robotics Reachy Mini desktop humanoid to OpenCastor. Control the 5-DOF animated head, speaker, mic, and antennas via a unified RCAN config with mDNS auto-discovery.

🛒 What you need

  • Pollen Robotics Reachy Mini (assembled — from huggingface.co/reachy-mini)
  • A computer on the same Wi-Fi or Ethernet as the robot (Raspberry Pi 5, Jetson, or laptop)
  • Python 3.10+ with OpenCastor installed
  • USB-C cable for initial firmware check (optional)

Step 1 — Install OpenCastor with the Reachy extra

The [reachy] extra installs reachy2-sdk and zeroconf for mDNS discovery.

# In a virtual environment (required on Pi OS / Debian)
python3 -m venv ~/opencastor-venv --system-site-packages
source ~/opencastor-venv/bin/activate

pip install opencastor[reachy]

# Verify
castor scan --preset-only

Expected output: reachy_mini (high): Reachy Mini detected via mDNS

Step 2 — Power on Reachy Mini and verify mDNS

Reachy Mini advertises itself on your local network as reachy-mini.local using mDNS.

# Test connectivity from your computer
ping reachy-mini.local

# Or use the OpenCastor hardware scanner
castor scan
# Should show: reachy: true, host: reachy-mini.local

If mDNS doesn't resolve, check that both devices are on the same network segment. Fallback: use the robot's IP directly in your config.

Step 3 — Generate an RCAN config for Reachy Mini

OpenCastor's wizard auto-populates a config based on detected hardware.

castor wizard --preset reachy_mini --output ~/reachy-mini.rcan.yaml

The generated config will look like:

robot_uuid: "your-uuid-here"
robot_name: "reachy-mini"

hardware:
  - driver: reachy
    host: auto        # mDNS auto-discovery: reachy-mini.local
    joints:
      - name: head_roll
      - name: head_pitch
      - name: head_yaw
      - name: l_antenna
      - name: r_antenna

interpreter:
  backend: auto       # local CLIP → Gemini if GOOGLE_API_KEY is set
  mode: non_blocking

Step 4 — Start the gateway and dashboard

castor gateway --config ~/reachy-mini.rcan.yaml &
castor dashboard &

# Check the API
curl http://localhost:8000/api/status
# → {"version": "2026.3.12.0", "status": "ok", "hardware": ["reachy"]}

Step 5 — Control head movement

Send movement commands via the REST API or Python SDK:

import requests

# Nod the head
requests.post("http://localhost:8000/api/move", json={
    "joint": "head_pitch",
    "position": 0.3,    # radians
    "duration": 1.0
})

# Wave the antennas
for joint in ["l_antenna", "r_antenna"]:
    requests.post("http://localhost:8000/api/move", json={
        "joint": joint,
        "position": 0.5,
        "duration": 0.5
    })

Step 6 — Behavior scripting with RCAN behaviors

Add named behaviors to your config to make Reachy Mini react to events:

behaviors:
  - name: greet
    trigger: on_event("hello")
    actions:
      - move: {joint: head_yaw, position: 0.0, duration: 0.3}
      - move: {joint: l_antenna, position: 0.8, duration: 0.5}
      - move: {joint: r_antenna, position: 0.8, duration: 0.5}
      - wait: 0.5
      - move: {joint: l_antenna, position: 0.0, duration: 0.5}
      - move: {joint: r_antenna, position: 0.0, duration: 0.5}

  - name: look_around
    trigger: on_schedule("*/30 * * * * *")   # every 30s
    actions:
      - move: {joint: head_yaw, position: 0.4, duration: 1.0}
      - wait: 0.8
      - move: {joint: head_yaw, position: -0.4, duration: 1.5}
      - wait: 0.8
      - move: {joint: head_yaw, position: 0.0, duration: 1.0}
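
Behaviors with an `on_event` trigger can also be fired from your own code. The sketch below assumes the gateway accepts events on an `/api/event` endpoint with an `{"event": ...}` body — both the route and the payload shape are assumptions, so check your gateway's API reference for the exact schema:

```python
import requests

GATEWAY = "http://localhost:8000/api"

def event_payload(name: str) -> dict:
    # Body shape is an assumption -- adjust to your gateway's event schema.
    return {"event": name}

def emit_event(name: str) -> None:
    """POST an event so behaviors with a matching on_event() trigger run."""
    r = requests.post(f"{GATEWAY}/event", json=event_payload(name), timeout=5)
    r.raise_for_status()

# With the gateway from Step 4 running:
# emit_event("hello")   # fires the "greet" behavior defined above
```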

Step 7 — (Optional) Scene embeddings for reactive behaviors

If you attach a USB camera, OpenCastor can generate CLIP embeddings and trigger behaviors based on what the robot sees:

interpreter:
  backend: auto
  camera: /dev/video0     # USB camera attached to your compute node
  similarity_threshold: 0.78

behaviors:
  - name: react_to_person
    trigger: on_embedding_match("a person in the scene", threshold: 0.82)
    actions:
      - move: {joint: head_pitch, position: 0.0, duration: 0.5}  # look up
      - emit_event: "hello"

This runs entirely on-device with CLIP (Tier 0). No cloud API key required.

✅ You're done

Reachy Mini is now fully integrated with OpenCastor. Next steps: add a Raspberry Pi 5 compute node for local ML inference, or wire up an HLabs ACB motor controller to add a custom arm.

โš™๏ธ HLabs ACB v2.0 โ€” Custom BLDC Robot Arm

Build a low-cost custom robot arm using open-source HLabs ACB v2.0 brushless motor controllers. Covers wiring, DFU firmware flash, CAN Bus configuration, and live 50Hz telemetry with Prometheus.

🛒 Parts list

  • 1–6× HLabs ACB v2.0 — STM32G474, 12–30V, 40A, CAN Bus 1 Mbit/s. Available from h-laboratories.com
  • BLDC motors — e.g. T-Motor Antigravity 4004, GL Brushless 5010, or similar 12V–24V outrunners
  • 12V–24V power supply — 10A+ bench supply or 4S LiPo battery
  • CAN Bus cable — 120Ω termination resistors at each end of the bus
  • Raspberry Pi 5 or Nvidia Jetson (or laptop for dev) — with USB-C port for DFU flash
  • USB-C cable for flashing firmware via DFU
  • 3D printed or aluminum arm links — any design with motor mounts for your BLDC size

Step 1 — Install OpenCastor with HLabs support

pip install opencastor[hlabs]

# Verify detection (with ACB plugged in via USB-C)
castor scan
# → acb: true, firmware: v2.x.x, protocol: usb+can

Step 2 — Flash firmware via DFU

Hold the BOOT button on the ACB, plug in USB-C, then release. The board enumerates as DFU device 0483:df11.

# Flash via OpenCastor (fetches latest release from GitHub)
castor hlabs flash --device /dev/ttyACM0

# Or manually with dfu-util
dfu-util -a 0 -s 0x08000000 -D acb-firmware-v2.bin

The flash takes ~15 seconds. LED turns solid green when complete.

Step 3 — Wire the CAN Bus

Each ACB has a CAN-H and a CAN-L connector. Wire all boards in a daisy chain. Terminate both ends of the bus with 120Ω resistors.

Pi/Jetson USB-CAN adapter
        │
    [120Ω]
        │
   ┌────┴────┐
   │  ACB 0  │  Node ID: 0x01  (joint: shoulder_pan)
   └────┬────┘
        │
   ┌────┴────┐
   │  ACB 1  │  Node ID: 0x02  (joint: shoulder_lift)
   └────┬────┘
        │
   ┌────┴────┐
   │  ACB 2  │  Node ID: 0x03  (joint: elbow_flex)
   └────┬────┘
        │
    [120Ω]

Set each ACB's node ID via its DIP switches before powering up. Node IDs must be unique on the bus.
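
Before writing the config, you can confirm which node IDs are actually talking by sniffing the bus. This sketch uses Python's built-in SocketCAN support (Linux only, no extra packages); the assumption that the low bits of the arbitration ID carry the DIP-switch node ID matches the diagram above, but verify it against the ACB protocol documentation:

```python
import socket
import struct
import time

CAN_FRAME_FMT = "<IB3x8s"   # arbitration id, DLC, 3 pad bytes, 8 data bytes
CAN_EFF_MASK = 0x1FFFFFFF   # strips the EFF/RTR/ERR flag bits

def parse_frame(frame: bytes) -> tuple[int, bytes]:
    """Unpack a raw 16-byte SocketCAN frame into (arbitration_id, data)."""
    can_id, dlc, data = struct.unpack(CAN_FRAME_FMT, frame)
    return can_id & CAN_EFF_MASK, data[:dlc]

def scan_node_ids(channel: str = "can0", seconds: float = 2.0) -> set[int]:
    """Listen briefly and return every node ID seen on the bus."""
    s = socket.socket(socket.AF_CAN, socket.SOCK_RAW, socket.CAN_RAW)
    s.bind((channel,))
    s.settimeout(0.2)
    seen, deadline = set(), time.monotonic() + seconds
    while time.monotonic() < deadline:
        try:
            can_id, _ = parse_frame(s.recv(16))
            seen.add(can_id & 0x7F)  # assumed: node ID in the low 7 bits
        except TimeoutError:
            continue
    s.close()
    return seen

# print(scan_node_ids())   # expect {1, 2, 3} for the wiring above
```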

Step 4 — Create the RCAN config

robot_uuid: "arm-uuid-here"
robot_name: "hlabs-arm"

hardware:
  - driver: hlabs_acb
    interface: can0         # USB-CAN adapter on Linux
    bitrate: 1000000        # 1 Mbit/s
    joints:
      - name: shoulder_pan
        node_id: 1
        direction: 1
        gear_ratio: 6.0
      - name: shoulder_lift
        node_id: 2
        direction: -1
        gear_ratio: 6.0
      - name: elbow_flex
        node_id: 3
        direction: 1
        gear_ratio: 4.0

telemetry:
  enabled: true
  rate_hz: 50               # 50Hz position/velocity/current logging
  prometheus_port: 9090

Step 5 — Run calibration

Before moving the arm, calibrate encoder zero positions:

castor gateway --config ~/hlabs-arm.rcan.yaml &

# Run interactive calibration wizard
castor hlabs calibrate --joint shoulder_pan
castor hlabs calibrate --joint shoulder_lift
castor hlabs calibrate --joint elbow_flex

# Or calibrate all at once (moves each joint to limit switches)
castor hlabs calibrate --all

Step 6 — Send your first motion command

import requests

BASE = "http://localhost:8000/api"

# Move shoulder pan to 45 degrees over 2 seconds
requests.post(f"{BASE}/move", json={
    "joint": "shoulder_pan",
    "position": 0.785,   # radians (45°)
    "duration": 2.0,
    "mode": "position"
})

# Check live telemetry
r = requests.get(f"{BASE}/telemetry/shoulder_pan")
print(r.json())
# → {"position_rad": 0.783, "velocity_rad_s": 0.01, "current_a": 0.4}

Step 7 — Monitor with Prometheus + Grafana

OpenCastor exposes standard Prometheus metrics at :9090/metrics:

opencastor_acb_position_rad{joint="shoulder_pan"} 0.785
opencastor_acb_velocity_rad_s{joint="shoulder_pan"} 0.002
opencastor_acb_current_a{joint="shoulder_pan"} 0.41
opencastor_acb_error_flags{joint="shoulder_pan"} 0

# Scrape with Prometheus, visualize in Grafana
# Import our dashboard template: opencastor.com/hub/grafana-acb
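
If you just want to grab a value in a script without running a full Prometheus server, the exposition format is plain text and easy to parse. A minimal stdlib-only sketch (the metric names match the sample output above):

```python
import re
import urllib.request

# Matches lines like: opencastor_acb_position_rad{joint="shoulder_pan"} 0.785
METRIC_RE = re.compile(r'^(\w+)\{joint="([^"]+)"\}\s+([-+0-9.eE]+)$')

def parse_joint_metrics(text: str) -> dict[tuple[str, str], float]:
    """Map (metric_name, joint) -> value for joint-labelled gauges."""
    out = {}
    for line in text.splitlines():
        m = METRIC_RE.match(line.strip())
        if m:
            out[(m.group(1), m.group(2))] = float(m.group(3))
    return out

# With the gateway running:
# text = urllib.request.urlopen("http://localhost:9090/metrics").read().decode()
# metrics = parse_joint_metrics(text)
# print(metrics[("opencastor_acb_position_rad", "shoulder_pan")])
```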

✅ Arm is live

Your HLabs ACB arm is running. Next: mount it on a mobile base with Pi 5 or Jetson for a full autonomous robot, or combine with Reachy Mini for a humanoid setup.

🦾 DIY Robot: HLabs ACB + Pi 5 or Jetson

Full build guide for a custom robot using HLabs ACB v2.0 motor controllers and a Raspberry Pi 5 or Nvidia Jetson Orin as the compute brain. Covers hardware assembly, OS setup, OpenCastor install, and on-device ML inference.

๐Ÿซ Raspberry Pi 5

  • $80 — best value
  • ARM Cortex-A76, 4–8GB RAM
  • Great for servo control + vision
  • Pi OS (Debian) — easy setup
  • No GPU — use CLIP CPU inference
  • Best for: hobbyists, education, prototyping

🟢 Nvidia Jetson Orin Nano

  • $250–$500 — most capable
  • 1024-core Ampere GPU, 8GB
  • CUDA-accelerated ML inference
  • JetPack OS (Ubuntu 22.04 based)
  • Run ImageBind, real-time depth estimation
  • Best for: research, heavy ML workloads

🛒 Full parts list

  • Compute: Raspberry Pi 5 (8GB) + active cooler, OR Nvidia Jetson Orin Nano developer kit
  • Motor controllers: 2–6× HLabs ACB v2.0
  • Motors: BLDC outrunners (T-Motor, GL, etc.) sized for your robot
  • Power: 4S LiPo (14.8V) + power distribution board, or 24V bench supply
  • USB-CAN adapter: Canable, Waveshare USB-CAN-A, or Pi 5 CAN HAT
  • Camera: USB webcam or Pi Camera Module 3 (for visual embeddings)
  • OAK-D SR (optional): For stereo depth + RGB capture
  • Frame: 3D printed, aluminum extrusion, or MDF — your design
  • MicroSD (Pi 5) or NVMe SSD (Jetson): 64GB+ for OS + models
  • 120Ω termination resistors: 2× for CAN Bus ends

Step 1 — OS setup

Raspberry Pi 5:

# Flash Pi OS (64-bit) via Raspberry Pi Imager
# Enable SSH, set hostname, configure Wi-Fi during flash

# After first boot, update
sudo apt update && sudo apt upgrade -y
sudo apt install -y python3-venv python3-pip can-utils git

Nvidia Jetson Orin:

# Flash JetPack 6 via SDK Manager on a host Ubuntu machine
# Then on Jetson:
sudo apt update && sudo apt upgrade -y
sudo apt install -y python3-venv python3-pip can-utils git

# Verify CUDA
nvcc --version
# → Cuda compilation tools, release 12.x

Step 2 — Enable CAN Bus interface

With USB-CAN adapter (works on both Pi 5 and Jetson):

# Bring up CAN interface
sudo ip link set can0 up type can bitrate 1000000
sudo ip link set can0 txqueuelen 1000

# Make persistent via /etc/network/interfaces.d/can0
echo "auto can0
iface can0 inet manual
    pre-up /sbin/ip link set can0 up type can bitrate 1000000
    post-down /sbin/ip link set can0 down" | sudo tee /etc/network/interfaces.d/can0

# Test โ€” plug in ACB and check
candump can0

With Pi 5 CAN HAT (MCP2515-based):

# Add to /boot/firmware/config.txt (on Pi OS Bookworm; /boot/config.txt on older releases):
dtparam=spi=on
dtoverlay=mcp2515-can0,oscillator=12000000,interrupt=25
dtoverlay=spi-bcm2835-overlay

# Then reboot and bring up:
sudo ip link set can0 up type can bitrate 1000000

Step 3 — Install OpenCastor

# Create venv (required on Pi OS/Ubuntu — PEP 668)
python3 -m venv ~/opencastor-venv --system-site-packages
source ~/opencastor-venv/bin/activate

# Install with HLabs support
# (also add [reachy] if connecting Reachy Mini via this compute node)
pip install opencastor[hlabs]

# Scan detected hardware
castor scan
# Pi 5 output:
# platform: rpi · acb: true (2 nodes) · cameras: 1
# Jetson output:
# platform: jetson · acb: true (2 nodes) · cameras: 1 · hailo: false

Step 4 — Full RCAN config for your robot

robot_uuid: "diy-robot-uuid"
robot_name: "my-diy-robot"

hardware:
  - driver: hlabs_acb
    interface: can0
    bitrate: 1000000
    joints:
      - name: base_rotation
        node_id: 1
        gear_ratio: 10.0
      - name: arm_shoulder
        node_id: 2
        gear_ratio: 6.0
      - name: arm_elbow
        node_id: 3
        gear_ratio: 4.0

interpreter:
  backend: auto          # CLIP on Pi 5, ImageBind on Jetson (if GPU available)
  camera: /dev/video0
  mode: non_blocking

telemetry:
  enabled: true
  rate_hz: 50
  prometheus_port: 9090

Step 5 — Jetson: enable GPU-accelerated inference (optional)

On Jetson, the auto backend will detect CUDA and use the local_extended tier (ImageBind) for richer multimodal embeddings:

# Install CUDA-compatible torch (Jetson ships its own torch wheel)
pip install torch torchvision --index-url https://developer.download.nvidia.com/compute/redist/jp/v60/pytorch/

# Verify GPU is visible to OpenCastor
castor doctor
# → embedding_backend: local_extended (GPU) ✅
# → cuda_available: true ✅

# ImageBind runs on GPU — ~15ms per frame vs ~200ms CPU on Pi 5

Note: ImageBind is CC BY-NC 4.0. For commercial use, set backend: gemini with a GOOGLE_API_KEY.
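
To check the GPU from Python directly, a quick sanity test (it degrades gracefully when torch isn't installed, so you can run it on any machine):

```python
def cuda_summary() -> str:
    """Report whether PyTorch can see a CUDA device."""
    try:
        import torch
    except ImportError:
        return "torch not installed"
    if torch.cuda.is_available():
        # On a Jetson Orin this should name the integrated Ampere GPU
        return f"CUDA ok: {torch.cuda.get_device_name(0)}"
    return "torch installed, but CUDA is not visible"

print(cuda_summary())
```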

Step 6 — Run as a systemd service (auto-start on boot)

# Generate and install systemd service files
castor service install --config ~/my-diy-robot.rcan.yaml

# Enable on boot
systemctl --user enable opencastor.service
systemctl --user enable opencastor-dashboard.service

# Start now
systemctl --user start opencastor.service

# Check status
systemctl --user status opencastor.service
castor doctor   # verify everything is healthy

Step 7 — (Optional) Add OAK-D SR for stereo depth

Plug in a Luxonis OAK-D SR (USB 3) for stereo RGB+Depth capture:

pip install depthai

# OAK-D is auto-detected by castor scan
castor scan
# → oakd: true, model: Luxonis OAK-D SR, usb3: true

# Add to RCAN config:
hardware:
  - driver: oakd
    model: auto             # auto-detected
    streams: [rgb, stereo_depth]
    fps: 30
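
To confirm the OAK-D is visible from Python before editing the config, this sketch uses the depthai SDK's device enumeration (it returns an empty list when no camera — or no depthai install — is present; `getMxId` is the depthai v2 API):

```python
def list_oakd_devices() -> list[str]:
    """Return the IDs of all connected Luxonis devices (empty if none)."""
    try:
        import depthai as dai
    except ImportError:
        return []   # depthai not installed on this machine
    return [info.getMxId() for info in dai.Device.getAllAvailableDevices()]

print(list_oakd_devices())   # e.g. one ID string per connected OAK-D
```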

✅ Robot is live

Your DIY robot is running with HLabs ACB motor control, on-device ML inference, and full OpenCastor observability. The dashboard is at http://<your-robot-ip>:8501. Next, try the ESP32 and LEGO Mindstorms integrations below.

⚡ ESP32 + OpenCastor (Beginner Path)

Goal: drive a two-motor robot using OpenCastor commands routed through an ESP32 over Wi-Fi.

🛒 What you need
  • ESP32 dev board (WROOM or S3), USB cable, motor driver (L298N or TB6612), and a battery pack.
  • One machine running OpenCastor on the same network.
  • Optional: HC-SR04 or IMU sensor for obstacle/safety feedback.

Step 1 — Flash your ESP32 control firmware

Use your preferred firmware path (Arduino IDE, PlatformIO, or ESP-IDF). Ensure your robot exposes a health endpoint and a command endpoint.

GET  /status -> {"ok": true, "robot": "esp32-rover"}
POST /cmd    -> {"linear": 0.25, "angular": -0.1}

Tip for beginners: start by confirming the ESP32 can receive a command and blink an LED before controlling motors.
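
Once the endpoints respond, a short script can smoke-test the rover from your laptop. The IP address is a placeholder for your ESP32's address; the JSON shapes match the two endpoints sketched above:

```python
import requests

BASE = "http://192.168.1.50"   # placeholder -- your ESP32's IP

def cmd_payload(linear: float, angular: float) -> dict:
    """Velocity command body matching the POST /cmd schema above."""
    return {"linear": linear, "angular": angular}

def smoke_test(base: str) -> None:
    status = requests.get(f"{base}/status", timeout=2).json()
    assert status["ok"], "robot reported not-ok"
    requests.post(f"{base}/cmd", json=cmd_payload(0.25, -0.1), timeout=2)
    requests.post(f"{base}/cmd", json=cmd_payload(0.0, 0.0), timeout=2)  # always stop

# smoke_test(BASE)   # run with the robot on blocks (see Step 3)
```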

Step 2 — Create the OpenCastor config

Create a new project and point the driver to your ESP32 endpoint.

castor wizard    # Choose "ESP32 Generic Wi-Fi Bot"
castor run --config my_robot.rcan.yaml

In your my_robot.rcan.yaml, use the esp32_generic preset flow and set the robot IP address under connection.host.

Step 3 — Validate movement safely

  • Put the robot on blocks first so wheels can spin freely.
  • Run castor test-hardware and verify forward, reverse, and stop commands.
  • Set conservative speed limits before floor testing.

Step 4 — Add beginner-friendly behaviors

Once base movement works, add one behavior at a time: voice command routing, obstacle stop, and simple patrol routes. Keep logs enabled to track command latency and dropped packets.

🧱 LEGO Mindstorms + OpenCastor

Goal: connect EV3/SPIKE hardware and run your first autonomous command loop with OpenCastor.

Choose your platform
  • EV3: use the lego_mindstorms_ev3.rcan.yaml preset and verify ev3dev access.
  • SPIKE Prime: use lego_spike_prime.rcan.yaml and verify serial/BLE bridge availability.

Step 1 — Connect and verify hardware

For EV3, connect over USB or network and confirm motors/sensors are listed by ev3dev. For SPIKE, validate the hub is discoverable before launching OpenCastor.

castor scan
castor doctor

Step 2 — Map ports in RCAN

Use explicit port naming so beginners can troubleshoot quickly. For example: left motor on outB, right motor on outC, distance sensor on in1.

Keep drive speed low until turning behavior is tuned.
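
A hardware fragment for that mapping might look like the sketch below — the field names are illustrative (mirror whatever the lego_mindstorms_ev3.rcan.yaml preset actually uses), while the port names match the example above:

```yaml
hardware:
  - driver: lego_ev3
    motors:
      - name: left_drive
        port: outB
      - name: right_drive
        port: outC
    sensors:
      - name: front_distance
        port: in1

limits:
  max_speed: 0.3        # conservative until turning behavior is tuned
```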

Step 3 — Run your first mission

  • Start OpenCastor with your LEGO preset.
  • Issue a simple move-forward then stop task.
  • Confirm emergency stop behavior works before longer runs.

castor run --config config/presets/lego_mindstorms_ev3.rcan.yaml
# or
castor run --config config/presets/lego_spike_prime.rcan.yaml

Step 4 — Extend the project

Add line-following or classroom challenge scenarios, then layer in language tasks ("drive to marker", "scan the room"). This keeps the project educational while introducing real robot autonomy concepts.