Robot Embodiment via Virtual Reality
Moving robots around human spaces requires advanced sensing, planning, and control capabilities. Humans are already experts at this, but transferring that expertise to robots is not easy. First, robots often have physical bodies very different from ours: they move differently (wheels, joints in different places, etc.), so it is often unclear how to replicate a human’s body motion on a robot with such a different morphology. Second, robots come with very different sensing modalities. They “see” the world through, e.g., pixels, range measurements, and joint efforts, so the information they use to estimate the state of the world and decide what to do next can be quite different from what we are used to.
This project takes a step towards solving these challenging problems by enabling a human pilot to embody a robot through a virtual reality (VR) interface. The goals are 1) to understand how best to present the robot’s sensor data (camera, depth, lidar, proprioception) to the user so that they can fluently pilot the robot, and 2) to map the user’s hand and arm movements to motions of the robot’s end-effector. The project will make use of the Stretch Open Source Mobile Manipulator robot and the Meta Quest 2 VR headset.
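To make goal 2 concrete, the sketch below maps a 6-DoF VR controller pose into an end-effector goal expressed in the robot’s base frame. This is a minimal sketch, not the project’s actual pipeline: the calibration transform, the workspace limits, and the send_ee_goal stub are all assumptions, and a real implementation on Stretch would forward the goal through the robot’s own driver or ROS interface.

```python
import numpy as np

# Assumed fixed transform from the VR tracking frame to the robot base
# frame (identity rotation, 1 m forward offset here); in practice this
# would be calibrated at startup.
T_BASE_FROM_VR = np.eye(4)
T_BASE_FROM_VR[:3, 3] = [1.0, 0.0, 0.0]

# Hypothetical workspace limits (metres) used to clamp goals the arm
# cannot reach; real limits depend on the robot's kinematics.
WORKSPACE_MIN = np.array([0.0, -0.5, 0.2])
WORKSPACE_MAX = np.array([0.9, 0.5, 1.1])


def controller_to_ee_goal(T_vr_from_hand: np.ndarray) -> np.ndarray:
    """Map a 4x4 controller pose (VR frame) to an end-effector goal
    pose (robot base frame), clamped to the assumed workspace."""
    T_base_from_hand = T_BASE_FROM_VR @ T_vr_from_hand
    T_base_from_hand[:3, 3] = np.clip(
        T_base_from_hand[:3, 3], WORKSPACE_MIN, WORKSPACE_MAX
    )
    return T_base_from_hand


def send_ee_goal(T_goal: np.ndarray) -> None:
    # Stub: a real system would send this pose to the robot's motion
    # controller rather than printing it.
    print("EE goal position:", np.round(T_goal[:3, 3], 3))


if __name__ == "__main__":
    # Example: controller held 0.3 m in front of the VR origin, 1 m up.
    T_hand = np.eye(4)
    T_hand[:3, 3] = [0.3, 0.0, 1.0]
    send_ee_goal(controller_to_ee_goal(T_hand))
```

In practice, teleoperation interfaces often use a “clutched” variant of this mapping, applying only the relative hand motion accumulated while a trigger is held; this avoids large jumps in the commanded pose when the operator repositions their hand.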
This project is in collaboration with the CSIRO Robotic Perception and Autonomy Group.
