Google's New Robot Is Getting Closer to Understanding Human Language
Google’s new robot is getting closer to understanding the natural language humans actually use, thanks to the reasoning capabilities of Large Language Models (LLMs), which can now be applied to planning and interaction in robotics. This means that soon, robots may be able to interpret casual phrases such as ‘when you have a minute, please grab me a drink’, instead of responding only to basic, explicit commands.

A current robot might acknowledge each part of that phrase, responding ‘yes’ to the first and ‘ok’ to the second, but because the instruction is not explicit enough, the outcome wouldn’t be the desired one. By leveraging environment feedback, LLMs can form an inner monologue that enables them to more richly process and plan in robotic control scenarios.
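Google has not released this system as a public library, but the closed-loop idea can be sketched in a few lines of Python. Everything below (`llm_plan`, `execute_skill`, `describe_scene`) is a hypothetical stand-in, stubbed so the example runs on its own; in a real system those calls would hit a language model, a robot skill controller, and a perception module.

```python
# Minimal sketch of a closed-loop "inner monologue": a language model
# plans one step at a time, and textual feedback from the robot and its
# perception system is folded back into the prompt before the next step.
# All three helpers are hypothetical stand-ins, not a released Google API.

def llm_plan(monologue: str) -> str:
    """Stand-in for an LLM call that picks the next skill to run."""
    # A real system would prompt a language model with `monologue`.
    return "done" if "gripper" in monologue else "pick up the drink"

def execute_skill(skill: str) -> bool:
    """Stand-in for a low-level robot skill; reports success or failure."""
    print(f"robot: executing '{skill}'")
    return True

def describe_scene() -> str:
    """Stand-in for perception feedback, e.g. an object detector."""
    return "Scene: the drink is in the gripper."

def inner_monologue_loop(instruction: str, max_steps: int = 5) -> None:
    monologue = f"Human: {instruction}\n"
    for _ in range(max_steps):
        skill = llm_plan(monologue)
        if skill == "done":
            break
        success = execute_skill(skill)
        # Close the loop: success detection and the scene description
        # become new lines of text the planner sees on its next call.
        monologue += f"Robot: {skill}\nSuccess: {success}\n{describe_scene()}\n"

inner_monologue_loop("when you have a minute, please grab me a drink")
```

The notable design point, assuming this reading of the approach, is that feedback arrives as plain text: the monologue itself is the interface between planner, robot, and perception, so no retraining of the language model is required.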

“We find that closed-loop language feedback significantly improves high-level instruction completion on three domains, including simulated and real tabletop rearrangement tasks and long-horizon mobile manipulation tasks in a real kitchen environment,” said the researchers.
