Google Research has teamed with Everyday Robots to create a new robotics algorithm called PaLM-SayCan, which is helping robots better understand humans through language, whether it's through voice or text.
The joint effort pairs Google's Pathways Language Model (PaLM) with a helper robot from Everyday Robots on the physical side, with Google saying the effort is the first implementation that uses a large-scale language model to plan for a real robot.
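At a high level, the SayCan approach picks the robot's next action by weighing what the language model thinks should happen next against what the robot can actually pull off in its current situation. The sketch below is a minimal illustration of that idea; the skill names and score values are made-up assumptions, not Google's actual implementation.

```python
# Hedged sketch of SayCan-style planning. The skill names and numeric
# scores are illustrative assumptions, not Google's real system.
# Each candidate skill gets two scores:
#   "say" - how likely the language model rates the skill as the next step
#   "can" - how likely the robot is to succeed at that skill right now
# The plan greedily picks the skill with the highest combined score.

def choose_next_skill(llm_scores, affordance_scores):
    """Return the skill with the highest combined say * can score."""
    best_skill, best_score = None, float("-inf")
    for skill, say in llm_scores.items():
        can = affordance_scores.get(skill, 0.0)  # unseen skills score 0
        score = say * can
        if score > best_score:
            best_skill, best_score = skill, score
    return best_skill

# Example step in "cleaning up a spilled drink" (values invented):
llm_scores = {"find a sponge": 0.6, "pick up the can": 0.3, "go to the table": 0.1}
affordance_scores = {"find a sponge": 0.9, "pick up the can": 0.2, "go to the table": 0.8}
print(choose_next_skill(llm_scores, affordance_scores))  # prints "find a sponge"
```

The combination matters: the language model alone might suggest a sensible-sounding step the robot physically cannot do from where it stands, while the affordance score alone has no idea what the overall task is.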
The new robotics program will let people communicate with robots through voice or text more easily than before, while the robots themselves will perform complex tasks thanks to the improved understanding of our written and spoken languages.
Google even has a robot that can play table tennis, sliding from side to side on a track as it rallies with a human player. The data the company gathers from its robots playing games with humans helps it improve robotic reflexes as the robots operate in our world.
A bunch of the Google PaLM-SayCan robots actually practice regular everyday actions that humans do: opening drawers, grabbing, and moving objects around in the kitchen. Hell, the robots can even drop cans of Pepsi into the recycle bin, all without you doing a thing… now, this is the future.
Google wants to see robots improve their interaction with humans, and even our simplest interactions are very complex: facial movements like smiling, communication, and grabbing things with grace without squeezing them too hard. The companies are hoping that, with the PaLM language model, the robots can better comprehend and respond to open-ended prompts like humans do.
The research suggests that PaLM helped the robot plan a reasonable approach to a task 14% more often than other models, improved the success rate of carrying out tasks by 13%, and delivered a much larger 26% improvement on lengthy jobs: those requiring eight or more steps for the robot to complete.
Google explains in the description for its YouTube video: “Today, robots by and large exist in industrial environments, and are painstakingly coded for specific tasks. What if there was a better way to communicate with learning robots so they can help us? Researchers and engineers at Google Research and Everyday Robots are working together to combine the best of machine learning language models with helper robots that can complete complex and abstract tasks like ‘cleaning up a spilled drink.'”
Vincent Vanhoucke, senior director for Google's robotics research, said: "It's going to take a while before we can really have a firm grasp on the direct commercial impact".