Google's New Robot Understanding Human Language
Google’s new robot is getting closer to understanding the natural language humans actually use, thanks to the reasoning capabilities of large language models (LLMs), which can now be applied to planning and interaction in robotics. This means that soon robots will be able to interpret casual phrases such as ‘when you have a minute, please grab me a drink’, instead of only basic, explicit commands.
A current robot might acknowledge the first part of that phrase with a ‘yes’ and the second with an ‘ok’, but because the instruction is not explicit enough, the result wouldn’t be the desired one. By leveraging environmental feedback, LLMs can form an inner monologue that enables them to process and plan more richly in robotic control scenarios.
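To make the idea concrete, here is a minimal sketch of such a closed-loop "inner monologue": each action and the environment's feedback on it are appended to a running textual log, which conditions the next decision. This is a hypothetical illustration, not Google's actual system; the scripted planner, the three-step plan, and the `succeeded`/`failed` feedback strings are all assumptions standing in for a real LLM and real perception.

```python
def scripted_planner(instruction, monologue):
    """Stand-in for an LLM planner (hypothetical). A real system would prompt
    an LLM with the instruction plus the monologue of actions and outcomes."""
    plan = ["find a drink", "pick up the drink", "bring it to the user"]
    if monologue and monologue[-1].endswith("failed"):
        # Environment feedback says the last action failed: retry it.
        return monologue[-1].split(": ")[0]
    done = {line.split(": ")[0] for line in monologue if line.endswith("succeeded")}
    for step in plan:
        if step not in done:
            return step
    return "done"


def control_loop(instruction, outcomes, planner=scripted_planner, max_steps=10):
    """Closed-loop control: append each action and its environment feedback
    to a textual 'inner monologue' that conditions the next decision."""
    monologue = []
    for _ in range(max_steps):
        action = planner(instruction, monologue)
        if action == "done":
            break
        # In a real robot, this feedback would come from success detectors
        # or scene descriptions; here it is a scripted list of outcomes.
        result = outcomes.pop(0) if outcomes else "succeeded"
        monologue.append(f"{action}: {result}")
    return monologue
```

Running `control_loop("please grab me a drink", ["succeeded", "failed", "succeeded"])` shows the key behavior: after the grasp fails, the feedback line in the monologue leads the planner to retry that step before moving on, which is the gap that open-loop planning cannot close.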


“We find that closed-loop language feedback significantly improves high-level instruction completion on three domains, including simulated and real tabletop rearrangement tasks and long-horizon mobile manipulation tasks in a real kitchen environment,” said the researchers.

Author

A technology, gadget and video game enthusiast who loves covering the latest industry news. Favorite trade show? Mobile World Congress in Barcelona.