An open-source leader arm designed for teleoperation and Vision-Language-Action (VLA) data collection.
## Features

- Improved ergonomics: 50 mm joint offset on J2, J3, and J4 for better operator clearance
- Coaxial trigger alignment: natural grip, reduced wrist strain
- SO-101 compatible: same kinematics, works with existing electronics
- Open hardware: MIT License, OSHWA certification pending
## Specifications

- DOF: 5+1 (same as SO-101)
- Actuators: Feetech STS3215 series (SO-101 compatible)
- Materials: 3D-printed components
- Kinematics: SO-101 compatible
## Compatibility

- Electronics: standard SO-101 controller boards
- Software: works with existing SO-101 control stacks (see the joint-state sketch below)
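
Because the ML-101 shares the SO-101 servo bus, its joint state can be read directly with Feetech's `scservo_sdk`. The following is a minimal sketch, not code shipped with this repository: the port name, baud rate, servo IDs, and the STS-series `Present_Position` register address (56) are assumptions based on common SO-101 setups, so verify them against your own build.

```python
# Minimal sketch: poll leader-arm joint positions from Feetech STS-series
# bus servos (as used on SO-101-style arms). Assumptions, not verified
# against this repo: servo IDs 1-6, port /dev/ttyACM0, 1 Mbaud, and the
# STS3215 Present_Position register at address 56 (2 bytes).
from scservo_sdk import PortHandler, PacketHandler, COMM_SUCCESS

PORT = "/dev/ttyACM0"       # adjust to your serial adapter
BAUDRATE = 1_000_000        # STS3215 factory default
SERVO_IDS = range(1, 7)     # assumed: 5 joints + gripper trigger
ADDR_PRESENT_POSITION = 56  # STS-series control table

port = PortHandler(PORT)
packet = PacketHandler(0)   # protocol_end=0 for STS/SMS servos

if not (port.openPort() and port.setBaudRate(BAUDRATE)):
    raise RuntimeError(f"failed to open {PORT}")

positions = {}
for servo_id in SERVO_IDS:
    pos, result, error = packet.read2ByteTxRx(port, servo_id, ADDR_PRESENT_POSITION)
    if result == COMM_SUCCESS and error == 0:
        positions[servo_id] = pos  # raw ticks, 0-4095 over one revolution
print(positions)
port.closePort()
```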
## Improvements over SO-101

- 50 mm joint offset for operator clearance
- Coaxial trigger-finger alignment
- Optimized cable routing
## Use Cases

- VLA model data collection
- Robot learning from demonstration
- Teleoperation research
- Imitation learning datasets (see the recording sketch after this list)
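
For dataset collection, the simplest starting point is logging timestamped joint readings at a fixed rate. The sketch below is a generic illustration, not a pipeline this project ships: `read_positions`, the 30 Hz rate, and the JSONL layout are hypothetical placeholders (the reader could be the polling loop from the earlier sketch).

```python
# Minimal sketch: log timestamped leader-arm joint readings to a JSONL file
# that downstream imitation-learning / VLA pipelines can ingest. The
# read_positions callable is assumed to return {servo_id: raw_ticks};
# the rate and file name are arbitrary choices, not project conventions.
import json
import time
from typing import Callable, Dict

def record_episode(read_positions: Callable[[], Dict[int, int]],
                   path: str = "episode_000.jsonl",
                   rate_hz: float = 30.0,
                   duration_s: float = 10.0) -> None:
    period = 1.0 / rate_hz
    t_end = time.monotonic() + duration_s
    with open(path, "w") as f:
        while time.monotonic() < t_end:
            t0 = time.monotonic()
            frame = {"t": t0, "q": read_positions()}
            f.write(json.dumps(frame) + "\n")
            # Sleep off the remainder of the control period to hold the rate.
            time.sleep(max(0.0, period - (time.monotonic() - t0)))

# Example: record a 10 s demonstration with a dummy reader.
if __name__ == "__main__":
    record_episode(lambda: {i: 2048 for i in range(1, 7)})
```

A real VLA pipeline would additionally record synchronized camera frames and convert episodes into the format expected by the training stack.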
## Community

- Issues: report bugs or request features via GitHub Issues
- Discussions: share your builds in GitHub Discussions
- Discord: join the project Discord server
## Citation

If you use ML-101 in your research, please cite:

```bibtex
@misc{ml101,
  author    = {Chatzikonstantinou, Ioannis},
  title     = {ML-101: Ergonomic Leader Arm for VLA Research},
  year      = {2026},
  publisher = {GitHub},
  url       = {https://github.com/motionlayer/ML-101}
}
```

## License

This project is licensed under the MIT License. See the LICENSE file for details.
Open Source Hardware Certification: OSHWA UID [PENDING/XXXXXX]
## About

ML-101 is developed by MotionLayer, building motion infrastructure for embodied AI.
Questions? Open an issue or email info@motionlayer.com.
