The Institute for Human and Machine Cognition (IHMC) has implemented new software that enables Boston Dynamics’ Atlas and NASA’s Valkyrie robots to walk with only high-level instructions from a human operator. The software automatically analyzes the environment using the robot’s onboard sensors and segments it into planar regions, each represented as a polygon, to build a model of the surroundings. The robot then plans out each of its footsteps from start to goal. Read more for a video demonstration and additional information.
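The pipeline described above — extract planar regions from sensor data, then search for a sequence of valid footsteps from start to goal — can be sketched roughly as follows. This is a simplified illustration, not IHMC’s actual implementation: the planar regions here are hypothetical axis-aligned rectangles rather than arbitrary polygons from sensor data, and the planner is a basic A* search over a grid of candidate foot placements.

```python
import heapq

# Hypothetical planar regions as axis-aligned rectangles (xmin, ymin, xmax, ymax);
# a real planner would extract arbitrary polygons from the robot's sensor data.
REGIONS = [(0.0, 0.0, 1.0, 1.0),   # starting platform
           (1.2, 0.0, 2.2, 1.0)]   # platform across a small gap

STEP = 0.2        # spacing of candidate footstep positions (m), assumed
MAX_REACH = 0.45  # maximum step length (m), assumed

def on_region(p):
    """A footstep is valid only if it lands inside some planar region."""
    x, y = p
    return any(x0 <= x <= x1 and y0 <= y <= y1 for x0, y0, x1, y1 in REGIONS)

def neighbors(p):
    """Candidate next footsteps within reach, snapped to the grid."""
    x, y = p
    for dx in (-2, -1, 0, 1, 2):
        for dy in (-2, -1, 0, 1, 2):
            q = (round(x + dx * STEP, 2), round(y + dy * STEP, 2))
            dist = ((q[0] - x) ** 2 + (q[1] - y) ** 2) ** 0.5
            if 0 < dist <= MAX_REACH and on_region(q):
                yield q, dist

def plan(start, goal):
    """A* search over footstep placements from start to goal."""
    frontier = [(0.0, start)]
    came, cost = {start: None}, {start: 0.0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        for nxt, d in neighbors(cur):
            new = cost[cur] + d
            if nxt not in cost or new < cost[nxt]:
                cost[nxt] = new
                # straight-line distance to goal as the admissible heuristic
                h = ((goal[0] - nxt[0]) ** 2 + (goal[1] - nxt[1]) ** 2) ** 0.5
                heapq.heappush(frontier, (new + h, nxt))
                came[nxt] = cur
    return None  # no footstep sequence found

path = plan((0.2, 0.4), (2.0, 0.4))
```

Because the gap between the two regions (0.2 m) is smaller than the assumed maximum step length, the search finds a stepping sequence that crosses it; widening the gap beyond `MAX_REACH` would make `plan` return `None`.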
Boston Dynamics’ Atlas line of robots features a control system that coordinates motions of the arms, torso, and legs to achieve whole-body mobile manipulation, greatly expanding its reach and workspace. Its ability to balance while performing tasks enables it to work in a large volume while occupying only a small footprint. NASA’s Valkyrie robot boasts a multitude of cameras and sensors, including a MultiSense SL camera on her head, which combines laser, 3D stereo, and video to get a sense of the environment around her.