What you need
- Pollen Robotics Reachy Mini (assembled; from huggingface.co/reachy-mini)
- A computer on the same Wi-Fi or Ethernet as the robot (Raspberry Pi 5, Jetson, or laptop)
- Python 3.10+ (OpenCastor itself is installed in Step 1)
- USB-C cable for initial firmware check (optional)
Step 1: Install OpenCastor with the Reachy extra
The [reachy] extra installs reachy2-sdk and zeroconf for mDNS discovery.
# In a virtual environment (required on Pi OS / Debian)
python3 -m venv ~/opencastor-venv --system-site-packages
source ~/opencastor-venv/bin/activate
pip install opencastor[reachy]
# Verify
castor scan --preset-only
Expected output: reachy_mini (high): Reachy Mini detected via mDNS
Step 2: Power on Reachy Mini and verify mDNS
Reachy Mini advertises itself on your local network as reachy-mini.local using mDNS.
# Test connectivity from your computer
ping reachy-mini.local
# Or use the OpenCastor hardware scanner
castor scan
# Should show: reachy: true, host: reachy-mini.local
If mDNS doesn't resolve, check that both devices are on the same network segment and that your access point isn't blocking multicast (some guest networks do). As a fallback, replace host: auto in your config with the robot's IP address.
Step 3: Generate an RCAN config for Reachy Mini
OpenCastor's wizard auto-populates a config based on detected hardware.
castor wizard --preset reachy_mini --output ~/reachy-mini.rcan.yaml
The generated config will look like:
robot_uuid: "your-uuid-here"
robot_name: "reachy-mini"
hardware:
  - driver: reachy
    host: auto            # mDNS auto-discovery: reachy-mini.local
    joints:
      - name: head_roll
      - name: head_pitch
      - name: head_yaw
      - name: l_antenna
      - name: r_antenna
interpreter:
  backend: auto           # local CLIP → Gemini if GOOGLE_API_KEY is set
  mode: non_blocking
Step 4: Start the gateway and dashboard
castor gateway --config ~/reachy-mini.rcan.yaml &
castor dashboard &
# Check the API
curl http://localhost:8000/api/status
# → {"version": "2026.3.12.0", "status": "ok", "hardware": ["reachy"]}
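In scripts you may want to wait for the gateway to come up before sending commands. The sketch below polls the same /api/status endpoint using only the standard library; the payload fields it checks ("status", "hardware") are taken from the curl response above.

```python
import json
import time
import urllib.request

STATUS_URL = "http://localhost:8000/api/status"


def is_ready(payload: dict) -> bool:
    # Ready when the gateway reports "ok" and the reachy driver is active.
    return payload.get("status") == "ok" and "reachy" in payload.get("hardware", [])


def wait_for_gateway(url: str = STATUS_URL, timeout: float = 30.0) -> dict:
    """Poll the status endpoint until the gateway reports ready."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                payload = json.load(resp)
            if is_ready(payload):
                return payload
        except OSError:
            pass  # gateway not up yet; keep polling
        time.sleep(1.0)
    raise TimeoutError(f"Gateway at {url} not ready after {timeout}s")


if __name__ == "__main__":
    print(wait_for_gateway())
```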
Step 5: Control head movement
Send movement commands via the REST API or Python SDK:
import requests

# Nod the head
requests.post("http://localhost:8000/api/move", json={
    "joint": "head_pitch",
    "position": 0.3,  # radians
    "duration": 1.0,
})

# Wave the antennas
for joint in ["l_antenna", "r_antenna"]:
    requests.post("http://localhost:8000/api/move", json={
        "joint": joint,
        "position": 0.5,
        "duration": 0.5,
    })
Step 6: Behavior scripting with RCAN behaviors
Add named behaviors to your config to make Reachy Mini react to events:
behaviors:
  - name: greet
    trigger: on_event("hello")
    actions:
      - move: {joint: head_yaw, position: 0.0, duration: 0.3}
      - move: {joint: l_antenna, position: 0.8, duration: 0.5}
      - move: {joint: r_antenna, position: 0.8, duration: 0.5}
      - wait: 0.5
      - move: {joint: l_antenna, position: 0.0, duration: 0.5}
      - move: {joint: r_antenna, position: 0.0, duration: 0.5}
  - name: look_around
    trigger: on_schedule("*/30 * * * * *")  # every 30 s (6-field cron with seconds)
    actions:
      - move: {joint: head_yaw, position: 0.4, duration: 1.0}
      - wait: 0.8
      - move: {joint: head_yaw, position: -0.4, duration: 1.5}
      - wait: 0.8
      - move: {joint: head_yaw, position: 0.0, duration: 1.0}
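An action list like look_around maps naturally onto a small interpreter loop. The sketch below is illustrative only, not OpenCastor's actual behavior engine: move dispatches to whatever callable you supply (e.g. a REST POST), wait sleeps, and emit_event fires a named event.

```python
import time


def run_actions(actions, move_fn, emit_fn=print):
    """Execute an RCAN-style action list in order (illustrative sketch only)."""
    for action in actions:
        if "move" in action:
            m = action["move"]
            move_fn(m["joint"], m["position"], m["duration"])
        elif "wait" in action:
            time.sleep(action["wait"])
        elif "emit_event" in action:
            emit_fn(action["emit_event"])


# look_around, expressed as Python data instead of YAML
LOOK_AROUND = [
    {"move": {"joint": "head_yaw", "position": 0.4, "duration": 1.0}},
    {"wait": 0.8},
    {"move": {"joint": "head_yaw", "position": -0.4, "duration": 1.5}},
    {"wait": 0.8},
    {"move": {"joint": "head_yaw", "position": 0.0, "duration": 1.0}},
]

if __name__ == "__main__":
    # Dry run: print each move instead of commanding hardware.
    run_actions(LOOK_AROUND, move_fn=lambda j, p, d: print(f"move {j} -> {p} over {d}s"))
```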
Step 7: (Optional) Scene embeddings for reactive behaviors
If you attach a USB camera, OpenCastor can generate CLIP embeddings and trigger behaviors based on what the robot sees:
interpreter:
  backend: auto
  camera: /dev/video0           # USB camera attached to your compute node
  similarity_threshold: 0.78
behaviors:
  - name: react_to_person
    trigger: on_embedding_match("a person in the scene", threshold: 0.82)
    actions:
      - move: {joint: head_pitch, position: 0.0, duration: 0.5}  # look up
      - emit_event: "hello"
This runs entirely on-device with CLIP (Tier 0). No cloud API key required.