SO-101 Setup Guide

From parts to first data collection. Estimated time: ~3–4 hours (not counting 3D print time).

1. Assembly (~60 min + print time)

The SO-101 is a fully open-source arm. All parts are either 3D-printed or available as off-the-shelf hardware listed in the LeRobot BOM on HuggingFace.

Parts you need

  • 6× Feetech STS3215 servo motors
  • 3D-printed structural parts (STL files in the SO-101 GitHub repo)
  • USB-to-serial adapter cable (CH340 or CP2102 chip)
  • 12V power supply (3A minimum)
  • Servo cables and connector hardware (per BOM)

Assembly checklist

  • Print all structural components (base, links, end-effector)
  • Install STS3215 servos into their respective link housings
  • Route servo cables through the printed cable channels
  • Daisy-chain servos in the correct order (IDs 1–6 from base to tip)
  • Secure the base to a stable surface before powering on
  • Read the safety page before applying power

Where to get the BOM and STL files: The full bill of materials and printable parts are maintained in the HuggingFace LeRobot repository. Search for "SO-101" on the LeRobot GitHub.
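The daisy-chain order in the checklist implies a fixed joint-to-servo-ID mapping. A minimal sketch of that mapping is below; the joint names are illustrative assumptions, so confirm them against the LeRobot SO-101 configuration before relying on them.

```python
# Hypothetical joint-to-bus-ID map for the SO-101 daisy chain.
# IDs run 1-6 from base to tip; joint names here are assumptions,
# not taken from the LeRobot source.
SO101_SERVO_IDS = {
    "shoulder_pan": 1,
    "shoulder_lift": 2,
    "elbow_flex": 3,
    "wrist_flex": 4,
    "wrist_roll": 5,
    "gripper": 6,
}

def id_for(joint: str) -> int:
    """Return the bus ID for a joint, raising on unknown names."""
    try:
        return SO101_SERVO_IDS[joint]
    except KeyError:
        raise ValueError(f"unknown joint: {joint!r}") from None
```

Keeping this mapping in one place makes it easy to verify that every servo on the bus answers at the ID you expect before assembly is finished.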
1b. 3D Printing the Parts (~8–16 hrs print time)

All structural components of the SO-101 are FDM-printable using standard desktop printers. STL files are organized into single-file prints for each arm, making slicing straightforward.

Recommended Slicer Settings

Setting           Value
Material          PLA+
Nozzle diameter   0.4 mm (or 0.6 mm)
Layer height      0.2 mm (0.4 mm with a 0.6 mm nozzle)
Infill density    15%
Supports          Everywhere; overhang threshold >45°
Bed adhesion      Standard glue stick on PEI or glass
Tested printers   Prusa MINI+, Creality Ender 3, Bambu Lab A/P/X-series

STL Files — Which to Print

Pre-arranged single-file prints are available for common bed sizes:

  • 220×220 mm bed (Ender 3):
    • Follower: STL/SO101/Follower/Ender_Follower_SO101.stl
    • Leader: STL/SO101/Leader/Ender_Leader_SO101.stl
  • 205×250 mm bed (Prusa / UP):
    • Follower: STL/SO101/Follower/Prusa_Follower_SO101.stl
    • Leader: STL/SO101/Leader/Prusa_Leader_SO101.stl

Verify dimensional accuracy first. Before printing the full arm, print the gauge STLs from STL/Gauges/ and test them against a Lego brick or an STS3215 servo. A correct fit on the gauge confirms your printer calibration is accurate. Adjust scaling if needed before committing to the full print.

Don't own a printer? See the 3DPRINT.md guide in the SO-ARM100 repo for print service options. Pre-printed kits are also available from PartaBot (US), Seeed Studio (international), and Autodiscovery (EU).
2. Software Install (~15 min)

The SO-101 is natively supported by HuggingFace LeRobot. No additional plugin is needed — just install LeRobot.

Install LeRobot

# Using pip
pip install lerobot

# Or with uv (recommended)
uv pip install lerobot

Linux serial port permissions

On Linux, serial devices such as /dev/ttyACM* (or /dev/ttyUSB* for CH340 adapters) are only accessible to members of the dialout group. Run this once, then log out and back in:

sudo usermod -aG dialout $USER
# Then log out and back in, or run:
newgrp dialout

Prerequisites

  • Python 3.10+
  • Linux (Ubuntu 22.04 recommended) or macOS
  • USB-to-serial driver installed (CH340 driver on macOS; usually pre-installed on Linux)
3. Port Detection & Calibration (~20 min)

Find the correct USB serial port for the arm, then run the LeRobot calibration script to set servo zero positions.

Find the serial port

python lerobot/scripts/find_motors_bus_port.py

Plug and unplug the USB cable when prompted. The script identifies which port the arm is connected to. Typical values:

# Linux:  /dev/ttyACM0  (or ttyUSB0 for CH340 adapters)
# macOS:  /dev/tty.usbmodem*  or  /dev/tty.usbserial-*
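If the detection script is unavailable, you can enumerate the usual candidate device paths yourself with the standard library; this sketch just globs the patterns listed above and does not talk to the hardware:

```python
import glob

def candidate_ports() -> list[str]:
    """List serial device paths that commonly correspond to USB adapters."""
    patterns = [
        "/dev/ttyACM*",          # Linux, CDC-ACM devices
        "/dev/ttyUSB*",          # Linux, CH340/CP2102 adapters
        "/dev/tty.usbmodem*",    # macOS, CDC-ACM devices
        "/dev/tty.usbserial-*",  # macOS, CH340/CP2102 adapters
    ]
    return sorted(p for pat in patterns for p in glob.glob(pat))
```

Run it once with the arm unplugged and once plugged in; the path that appears in the second run is your port.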

Run calibration

Move the arm through its full range of motion when prompted:

python lerobot/scripts/calibrate.py \
  --robot.type=so101 \
  --robot.port=/dev/ttyACM0

Re-calibrate after any reassembly. Calibration data is stored locally. If you disassemble and reassemble joints, re-run calibration to restore accurate zero positions.
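Conceptually, calibration stores a per-joint zero offset in raw servo ticks. The STS3215 reports positions as 12-bit values; the 4096-ticks-per-revolution figure below is an assumption to verify against the servo datasheet. The tick-to-degree conversion then looks like:

```python
TICKS_PER_REV = 4096  # assumed 12-bit resolution; check the STS3215 datasheet

def ticks_to_degrees(raw: int, zero_offset: int) -> float:
    """Convert a raw servo reading to degrees relative to the calibrated zero."""
    return (raw - zero_offset) * 360.0 / TICKS_PER_REV
```

A reading equal to the stored offset maps to 0°, and a quarter-revolution above it maps to +90°, which is why an accurate offset matters for every downstream motion command.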
4. First Motion Test (~15 min)

Run the teleoperate script in single-arm mode to verify all joints respond correctly before connecting a leader arm.

python lerobot/scripts/teleoperate.py \
  --robot.type=so101 \
  --robot.port=/dev/ttyACM0

What to verify

  • All 6 joints respond to commands without skipping
  • No servo stall or overload warnings in the terminal
  • Gripper opens and closes through full range
  • No cable snagging at any joint position

Emergency stop: Disconnect the USB cable to immediately cut communication to the arm. Keep hands clear of the workspace during powered operation.
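The "no skipping" check can be automated by comparing commanded and reported joint angles after each move. This is a hedged sketch, not LeRobot API: the joint names and the 2° tolerance are assumptions you would tune for your own setup.

```python
def joints_off_target(commanded: dict[str, float],
                      reported: dict[str, float],
                      tol_deg: float = 2.0) -> list[str]:
    """Return joints whose reported angle deviates from the commanded angle
    by more than `tol_deg` degrees -- a quick skip/stall indicator.
    Joints missing from `reported` are always flagged."""
    return [j for j, target in commanded.items()
            if abs(reported.get(j, float("inf")) - target) > tol_deg]
```

An empty result after sweeping each joint through its range is a reasonable pass criterion for this step.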
5. Teleoperation (~30 min)

The SO-101 works as a standalone arm or as a follower arm with a leader arm for teleoperation. Using a second arm as leader produces higher-quality demonstrations for imitation learning.

Standalone mode (keyboard / programmatic)

python lerobot/scripts/teleoperate.py \
  --robot.type=so101 \
  --robot.port=/dev/ttyACM0

With a leader arm (e.g. DK1 leader)

python lerobot/scripts/teleoperate.py \
  --robot.type=so101 \
  --robot.port=/dev/ttyACM0 \
  --teleop.type=so101 \
  --teleop.port=/dev/ttyACM1

Bimanual setup: The SO-101 can be used as a follower arm with the DK1 leader arm. Both are LeRobot-native and communicate over USB serial. See the DK1 page for full bimanual setup details.
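Under the hood, leader-follower teleoperation is a read-clamp-write loop: read the leader's joint angles, clamp them into the follower's limits, and command the follower. A minimal sketch of the clamping step (joint names and limit values are illustrative, not taken from the LeRobot config):

```python
def mirror(leader_deg: dict[str, float],
           limits: dict[str, tuple[float, float]]) -> dict[str, float]:
    """Clamp leader joint angles into the follower's limits before sending.
    Joints without a known limit entry are dropped rather than forwarded."""
    return {
        joint: min(max(angle, limits[joint][0]), limits[joint][1])
        for joint, angle in leader_deg.items()
        if joint in limits
    }
```

Clamping on every cycle keeps an out-of-range leader pose from driving the follower into its mechanical stops.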
6. Data Collection (ongoing)

Record demonstrations using record.py. Data is saved in LeRobot format and can be pushed directly to HuggingFace Hub for training.

Basic recording

python lerobot/scripts/record.py \
  --robot.type=so101 \
  --robot.port=/dev/ttyACM0 \
  --dataset.repo_id=your-org/so101-dataset \
  --dataset.task="pick cube"

With a USB camera

python lerobot/scripts/record.py \
  --robot.type=so101 \
  --robot.port=/dev/ttyACM0 \
  --robot.cameras.top.type=opencv \
  --robot.cameras.top.index=0 \
  --dataset.repo_id=your-org/so101-dataset \
  --dataset.task="pick cube"

Recording best practices

  • Record at least 50 demonstrations per task before training
  • Vary object positions and orientations across episodes
  • Use descriptive --dataset.task names for later filtering
  • OAK-D or Intel RealSense cameras work well for depth-enabled data collection
  • Verify the dataset uploads to HuggingFace Hub after each session
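Descriptive task names pay off when you later slice a dataset by task. A toy sketch of that filtering (the episode metadata shape here is illustrative, not the exact LeRobot dataset schema):

```python
# Illustrative per-episode metadata; LeRobot stores a task string
# per episode, which is what makes exact-match filtering possible.
episodes = [
    {"index": 0, "task": "pick cube"},
    {"index": 1, "task": "pick cube"},
    {"index": 2, "task": "stack cube"},
]

def by_task(eps: list[dict], task: str) -> list[dict]:
    """Select episodes whose task string matches exactly."""
    return [e for e in eps if e["task"] == task]
```

Vague names like "demo" or "test1" make this kind of selection impossible after the fact, which is why the naming guideline above matters.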

Next steps

Once you have collected data, train an ACT or Diffusion Policy model using LeRobot's training scripts. Read the full SO-101 learning path for a structured progression from setup to model deployment.

Setup Complete?

Join the community to share results and get help with advanced configurations.