Why do living beings move with such sharpness and efficiency, while robots are so slow whenever they must make a movement that has not been pre-programmed? A biological brain can analyze a complex situation in real time and adjust its gesture to catch a fly ball. It is this kind of interaction that researchers are trying to teach robots, not through programming, but through learning.
14 robots fed an AI to teach it to pick up objects
Google's researchers are not the first to put a robot under the control of a neural network. Carnegie Mellon University had already published the results of similar work: with its silicon brain, a Baxter robot needed 50,000 trials to learn to grasp an arbitrary object. The researchers noted that a baby needs about a year to learn to grasp an object and four to manipulate one accurately. Google's team tackled the same problem with means on Google's scale: not one but 14 robots were all connected to the same artificial intelligence, or more precisely to a deep convolutional neural network (CNN). By trial and error, these robots attempted to grasp objects of various colors and shapes, guided by the AI through a camera, a learning loop similar to the hand-eye coordination of a human being, the researchers say. Each trial enriched the classifier algorithm and so gave this 14-armed intelligence a little more dexterity. In all, the researchers let the AI carry out 800,000 trials, or 3,000 hours of training per robot.
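The trial-and-error loop described above can be sketched roughly as follows. This is a toy illustration, not Google's actual system: `grasp_success_prob` stands in for the trained CNN that scores a candidate motor command against a camera image, the feature sizes and weights are arbitrary assumptions, and the step that retrains the network on the collected outcomes between rounds is omitted.

```python
import random

def grasp_success_prob(image, motion, weights):
    # Toy linear scorer standing in for the deep CNN: maps
    # (image features, candidate motion) to P(grasp succeeds).
    score = sum(w * f for w, f in zip(weights, image + motion))
    return 1.0 / (1.0 + 2.718281828 ** (-score))

def choose_motion(image, candidates, weights):
    # Sample candidate motor commands and keep the one the
    # scorer predicts is most likely to yield a successful grasp.
    return max(candidates, key=lambda m: grasp_success_prob(image, m, weights))

def collect_experience(n_trials, weights, rng):
    # Each trial: observe, act, record (image, motion, outcome).
    # The recorded outcomes are what "enrich the classifier"
    # when the network is retrained between collection rounds.
    dataset = []
    for _ in range(n_trials):
        image = [rng.random() for _ in range(4)]          # fake camera features
        candidates = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(8)]
        motion = choose_motion(image, candidates, weights)
        success = rng.random() < grasp_success_prob(image, motion, weights)
        dataset.append((image, motion, success))
    return dataset

rng = random.Random(0)
weights = [0.5] * 7  # 4 image features + 3 motion components, arbitrary
data = collect_experience(100, weights, rng)
print(len(data))
```

With 14 arms feeding one shared dataset, every robot benefits from every other robot's successes and failures, which is what makes 800,000 trials practical.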
When a single image is used to select the action to take, the algorithm fails in 34% of its attempts to grasp a random object. With real-time feedback from the camera, the algorithm can refine its movement and the failure rate drops to 18%. These results are still imperfect but encouraging, and suggest that, coupled with large neural networks, a new era is opening for robotics.
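The gap between the two failure rates comes from closing the visual feedback loop. Here is a minimal toy model of that difference, with assumed names and numbers rather than the paper's actual controller: an open-loop grasp commits to a single noisy camera reading, while a closed-loop grasp re-observes and corrects at every step, so its average final error is smaller.

```python
import random

def open_loop(target, rng, noise=0.3):
    # Plan once from a single noisy observation, then commit.
    reading = target + rng.gauss(0, noise)
    return abs(reading - target)  # final gripper error

def closed_loop(target, rng, noise=0.3, steps=10, gain=0.5):
    # Re-observe at every step and correct partway toward the
    # fresh reading; noise averages out as the gripper approaches.
    pos = 0.0
    for _ in range(steps):
        reading = target + rng.gauss(0, noise)
        pos += gain * (reading - pos)
    return abs(pos - target)

rng = random.Random(1)
trials = 500
ol = sum(open_loop(1.0, rng) for _ in range(trials)) / trials
cl = sum(closed_loop(1.0, rng) for _ in range(trials)) / trials
print(ol, cl)
```

Averaged over many trials, the closed-loop error comes out well below the open-loop one, the same qualitative effect as the drop from 34% to 18% failures.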
“Deep Learning for Robots: Learning from Large-Scale Interaction”, Google Research Blog, March 8, 2016
“Learning Hand-Eye Coordination for Robotic Grasping with Deep Learning and Large-Scale Data Collection”, Cornell University Library, March 7, 2016