Although robotics is advancing quickly, the lack of “nimble-fingered” robots remains a challenge, as do the long times needed to program robots to pick up irregular or odd-shaped items. That may not be the case for much longer, however.


Berkeley News reports that UC Berkeley professor Ken Goldberg, postdoctoral researcher Jeff Mahler and the Laboratory for Automation Science and Engineering (AUTOLAB) have created a robot called DexNet 2.0 that grasps objects with a high success rate. According to the researchers, the technology could soon be applied in industry, with the potential to “revolutionize” manufacturing and the supply chain.


DexNet 2.0 gained its highly accurate dexterity through a process called deep learning. The researchers built a vast database of 3D shapes and the physics of grasping—6.7 million data points in total—that a neural network uses to learn grasps that will pick up and move objects with irregular shapes. The neural network was then connected to a 3D sensor and a robotic arm. When an object is placed in front of DexNet 2.0, it quickly “studies” the shape and selects a grasp that will successfully pick up and move the object.
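
To make the idea in the paragraph above more concrete, here is a minimal, hypothetical sketch of the general approach: train a neural network on precomputed (depth image, grasp, success label) examples, then use it to rank candidate grasps for a new object seen by the 3D sensor. The dataset, network size, pose encoding and all names below are illustrative stand-ins, not the actual DexNet 2.0 architecture or data.

```python
# Illustrative sketch only: a tiny grasp-quality network trained on synthetic
# stand-in data, then used to rank candidate grasps for a new object.
import torch
import torch.nn as nn

class GraspQualityNet(nn.Module):
    """Predicts the probability that a candidate grasp will succeed."""
    def __init__(self, patch_pixels=32 * 32, pose_dims=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(patch_pixels + pose_dims, 128),
            nn.ReLU(),
            nn.Linear(128, 1),
            nn.Sigmoid(),  # output in [0, 1]: estimated grasp success probability
        )

    def forward(self, depth_patch, grasp_pose):
        x = torch.cat([depth_patch.flatten(1), grasp_pose], dim=1)
        return self.net(x).squeeze(1)

# Synthetic stand-in for the precomputed training set (millions of examples
# in the real system; a few random tensors here just so the sketch runs).
depth_patches = torch.rand(256, 32, 32)       # cropped depth images around each grasp
grasp_poses = torch.rand(256, 4)              # e.g. grasp center (x, y), angle, depth
labels = torch.randint(0, 2, (256,)).float()  # 1 = grasp labeled as successful

model = GraspQualityNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for epoch in range(5):                        # tiny training loop for illustration
    optimizer.zero_grad()
    loss = loss_fn(model(depth_patches, grasp_poses), labels)
    loss.backward()
    optimizer.step()

# At run time: sample candidate grasps on a new object seen by the 3D sensor,
# score them all, and execute the highest-confidence one.
new_patches = torch.rand(50, 32, 32)
new_poses = torch.rand(50, 4)
with torch.no_grad():
    scores = model(new_patches, new_poses)
best = int(scores.argmax())
print(f"best candidate grasp: #{best}, confidence {scores[best].item():.2f}")
```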


The robot is significantly better than previous robots at grasping items. In tests, when DexNet 2.0 was more than 50 percent confident it could grasp an object, it succeeded 98 percent of the time in lifting the item and shaking it without dropping it, an MIT Technology Review article reports. When the robot was unsure, it would first poke the object to identify a better grasp; after doing that, DexNet 2.0 lifted the item successfully 99 percent of the time.
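
The confidence-gated behavior described above can be sketched as a simple decision rule: grasp directly when the model is confident, otherwise poke the object, rescan, and re-plan. The threshold value and every function name in this hypothetical sketch are illustrative assumptions, not part of the published system.

```python
# Hypothetical, self-contained sketch of the confidence-gated pick behavior.
import random

CONFIDENCE_THRESHOLD = 0.5           # "more than 50 percent confident"

def score_best_grasp(object_view):
    """Stand-in for the learned model: returns (grasp plan, confidence)."""
    return {"angle": random.uniform(0, 3.14)}, random.random()

def poke_and_rescan(object_view):
    """Stand-in for nudging the object and taking a fresh 3D scan."""
    return object_view + "_repositioned"

def plan_pick(object_view):
    grasp, confidence = score_best_grasp(object_view)
    if confidence <= CONFIDENCE_THRESHOLD:
        object_view = poke_and_rescan(object_view)         # low confidence: poke, look again
        grasp, confidence = score_best_grasp(object_view)  # re-plan on the new view
    return grasp, confidence

print(plan_pick("mug_scan"))
```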


Many researchers are working on ways for robots to learn to grasp and manipulate items by practicing grasping, but that process is time-consuming. DexNet 2.0, however, learns without needing to practice. The researchers say this shows how new approaches to robot learning, combined with robots’ ability to access information through the cloud, could advance the capabilities of robots in factories and warehouses, and may even enable robots to do useful work in settings such as hospitals and homes.


“We’re producing better results but without that kind of experimentation,” Professor Goldberg says in the article. “We’re very excited about this.”


Jeff Mahler, the postdoctoral researcher who worked on the project, adds: “We can generate sufficient training data for deep neural networks in a day or so instead of running months of physical trials on a real robot.”


The UC Berkeley researchers collaborated with Juan L. Aparicio, who heads a Siemens research team in advanced manufacturing automation, also located in Berkeley. Siemens is interested in commercializing cloud robotics, as well as other connected manufacturing technologies. Aparicio says this nimble-fingered robot is exciting because the reliability of its grasping capability offers a clear path toward commercialization.


“Today, robots just have to pick and place the same object over and over, but in the near future, as we move toward lot-size-one, it will be unworkable to reprogram all of a robot’s movements for each new object,” Aparicio says in an article on the Siemens website, titled “A Cloud for Robotics.” “To figure out how to grasp new objects on their own, robots will need the cloud.”


What do you think of DexNet 2.0’s ability to learn grasps for irregularly shaped objects? How do you think this “deep learning” for robots can have an impact on the supply chain?