
Resources

Find all the essential resources to install, control, and animate your humanoid robot, either on your own Reachy 2 or entirely in simulation. Whether you’re a developer, researcher, or robotics enthusiast, these step-by-step guides and tools will help you unlock Reachy 2’s full potential.
Don’t have a Reachy yet? Click here to get one!

Reachy 2 online documentation

🛠️ The Reachy 2 Documentation provides everything you need, from the moment you unpack your robot to mastering its advanced features. You’ll find step-by-step guides for setup and basic controls, plus in-depth tutorials covering the full range of Reachy 2’s capabilities.

Set up the Reachy 2 software stack on your computer to control your humanoid robot from a Jupyter notebook or your own code. Use RViz for visualization and Gazebo or MuJoCo for simulation before executing on the real robot.

Prerequisite: install Docker Desktop – no login required; the image is open access.
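Once the Docker stack is running, the SDK talks to it over the network. As a quick sanity check before opening your notebooks, you can verify that something is listening (a minimal sketch; the gRPC port number 50051 is an assumption, adjust it to your setup):

```python
import socket

def stack_is_up(host: str = "localhost", port: int = 50051, timeout: float = 1.0) -> bool:
    """Return True if a TCP server is listening at host:port."""
    try:
        # create_connection performs the full TCP handshake, so a True
        # result means a server really accepted the connection.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("stack reachable:", stack_is_up())
```

If this prints `False`, double-check that the Docker container is running and that its ports are published to the host.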


Now that you’re all set, let’s start playing!

Step-by-step tutorials

Step-by-step notebook tutorials to help you get started, from basic behaviors to advanced control, using Reachy 2’s SDK for robot operation and Pollen-Vision for object detection.
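Arm targets in the tutorials are typically expressed as 4×4 homogeneous pose matrices. As a hedged sketch of that idea (the `goto` call in the comment assumes the SDK’s matrix-based interface; check the notebooks for the exact signature), such a target can be built with plain NumPy:

```python
import numpy as np

def make_pose(x: float, y: float, z: float) -> np.ndarray:
    """Build a 4x4 homogeneous pose: identity rotation,
    translation (x, y, z) in metres."""
    pose = np.eye(4)
    pose[:3, 3] = [x, y, z]
    return pose

# A reachable point in front of the robot (illustrative values).
target = make_pose(0.35, -0.2, 0.0)

# With a connected robot or simulator, this pose would then be sent
# to the arm, e.g. something like: reachy.r_arm.goto(target)
```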

Display emotions on Reachy 2

Image: Reachy 2’s library of 85 emotions

A module that lets Reachy 2 express emotions through expressions, motions, and sounds, enriching human-robot interaction. Help this project grow by contributing your own.
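To illustrate what such a module bundles together (all names below are hypothetical, not the library’s real API), an emotion can be thought of as a named pairing of a recorded motion and a sound:

```python
from dataclasses import dataclass

@dataclass
class Emotion:
    """Hypothetical bundle of the assets one emotion combines."""
    name: str
    motion_file: str   # recorded trajectory to replay on the robot
    sound_file: str    # audio clip played alongside the motion

# Illustrative entries only; the real library ships its own recordings.
LIBRARY = {
    "happy": Emotion("happy", "happy.json", "happy.wav"),
    "curious": Emotion("curious", "curious.json", "curious.wav"),
}

def play(name: str) -> Emotion:
    """Look up an emotion by name.

    A real implementation would replay the trajectory on the robot
    and stream the sound; here we just return the selected assets."""
    return LIBRARY[name]
```

Contributing a new emotion then amounts to recording a motion, pairing it with a sound, and adding the entry to the shared library.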

Use AI-powered perception

Image: Pollen-Vision running on Reachy 1

Pollen-Vision is a perception library designed for robots, providing real-time object detection, pose estimation, and scene understanding to enhance robotic interactions with their environment.
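A common post-processing step with any object detector, Pollen-Vision included, is thresholding detections by confidence before acting on them. A minimal sketch, assuming detections arrive as label/score/box dictionaries (the library’s exact output format may differ):

```python
def filter_detections(detections, threshold=0.5):
    """Keep only detections whose confidence score clears the threshold.

    Each detection is assumed to be a dict of the shape typical
    detectors return: {"label": str, "score": float, "box": [x0, y0, x1, y1]}.
    """
    return [d for d in detections if d["score"] >= threshold]

# Illustrative detections, not real model output.
raw = [
    {"label": "mug", "score": 0.92, "box": [10, 20, 80, 120]},
    {"label": "mug", "score": 0.31, "box": [200, 40, 260, 150]},
]
confident = filter_detections(raw)  # keeps only the 0.92 detection
```

Tuning the threshold trades missed objects against false positives; grasping pipelines usually prefer a stricter threshold than simple scene description.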

Tell us about your project

If you need any help or want to share your project, don’t hesitate to reach out — we’d love to hear from you! You can join the conversation on our Discord, browse and contribute on GitHub, or explore our spaces on 🤗Hugging Face.

Looking to get in touch directly?