Big NSF grant funds research into training robots to work with humans

By Carol Ness

What if robots and humans, working together, were able to perform tasks in surgery and manufacturing that neither can do alone?

That’s the question driving new research by UC Berkeley robotics experts Ken Goldberg and Pieter Abbeel and colleagues from four other universities, who were awarded a $3.5 million grant from the National Science Foundation.

Berkeley professors Ken Goldberg, center, and Pieter Abbeel, right, work on the RAVEN surgical robot in a Soda Hall lab, with an assist from graduate student Animesh Garg, left. (Cheryl Martinez photos)

Their work is part of the first $50 million in funding for the National Robotics Initiative, announced in 2011 with the goal of exploring how robots can enhance the work of humans rather than replace them.

“The emerging generation of robots are more aware than oblivious, more social than solitary, and more like companions than tools,” says Goldberg, a professor in the departments of Industrial Engineering and Operations Research and Electrical Engineering and Computer Sciences.

The four-year project, a collaboration among experts at Berkeley, Stanford, Johns Hopkins, UC Santa Cruz and the University of Washington, will focus on ways that humans can train robots to perform “multilateral manipulation.” As the researchers describe it in their grant application, one or more humans provide perception and adaptability while robots provide speed, precision, accuracy and dexterity.

In surgery, for example, a human-robot system could give a doctor an extra set of “hands” for retraction or suturing while the doctor focuses on more complex parts of a procedure. The concept is quite different from tele-surgery, in which a remote surgeon directly controls robotic equipment; the next generation of robots would function autonomously, with training and supervision by humans and reliance on algorithms and data libraries that humans compile.
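
As a rough sketch of that supervised-autonomy idea, the hypothetical Python below has a robot carry out routine stitches on its own and defer to a human whenever its confidence is low. Every function, object and threshold here is invented for illustration; none of it is the project’s actual software.

```python
import random

# Hypothetical sketch of supervised autonomy, as contrasted with
# tele-surgery: the robot acts on its own for routine steps and the
# human supervises rather than steering every motion directly.

CONFIDENCE_THRESHOLD = 0.9  # below this, the robot defers to a human

def plan_from_demonstrations(site):
    """Stand-in for planning from a human-compiled library of
    demonstrated motions; returns a plan and a confidence score."""
    return f"suture plan for {site}", random.random()

def human_review(plan):
    """Stand-in for the human approving or correcting a proposed plan."""
    print(f"human reviews: {plan}")
    return plan

def execute(plan):
    print(f"robot executes: {plan}")  # robot supplies speed and precision

for site in ["site A", "site B", "site C"]:
    plan, confidence = plan_from_demonstrations(site)
    if confidence < CONFIDENCE_THRESHOLD:
        # Human supplies perception and adaptability only when needed.
        plan = human_review(plan)
    execute(plan)
```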

In manufacturing, such human-robot systems could learn to handle tasks such as threading wires or cables, or aligning gaskets. In households, they could wrap packages or fold laundry.

Close-up of one of the RAVEN robot’s two pincers, which can hold a needle for simulated suturing.

All of these examples revolve around the idea of teaching a robot to properly handle objects whose size and shape can vary because they are made of malleable materials. Examples include strings, wires, sheets, cushions and organs.

Abbeel, an assistant professor in EECS, has done groundbreaking work in this area; he programmed a robot to pick up, recognize and fold towels crumpled in a random pile. A video shows the robot in action.

“Robotics is at an inflection point,” says Goldberg, pointing to some 5 million household robots, 10,000 military robots and 2,000 surgical robots adopted over the past decade. An emerging paradigm is “cloud robotics,” in which robots no longer have to be self-contained but instead are designed to tap vast online stores of information, parallel computing power, shared data and open-source software.
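
The cloud-robotics idea can be sketched in a few lines: the robot keeps no object library of its own and instead reads from, and writes back to, a shared store that many robots use. In the hypothetical Python below, a plain dictionary stands in for that networked store, and the objects and grasp parameters are invented for illustration.

```python
# Hypothetical sketch of cloud robotics: knowledge accumulates in a
# shared remote store, not inside any single robot. The "cloud" here
# is just a dictionary; a real system would use a networked service.

CLOUD_GRASP_DATABASE = {
    "coffee mug": {"approach": "from above", "grip_width_cm": 8.0},
    "towel": {"approach": "corner first", "grip_width_cm": 2.5},
}

def lookup_grasp(object_name):
    """Fetch grasp parameters contributed by other robots or people."""
    return CLOUD_GRASP_DATABASE.get(object_name)

def contribute_grasp(object_name, grasp):
    """Robots share what they learn, so the store grows over time."""
    CLOUD_GRASP_DATABASE[object_name] = grasp

grasp = lookup_grasp("coffee mug")
if grasp is not None:
    print(f"grasp: {grasp['approach']}, width {grasp['grip_width_cm']} cm")
else:
    print("unknown object; ask for help, then contribute what is learned")
```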

Berkeley is a leading center for robot learning and cloud robotics, notes Goldberg. With researchers at Google, he recently submitted a paper on a robot trained to recognize and grasp objects using Google Goggles, a “crowd-sourced” online database of object images.
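
The recognize-then-grasp pipeline that paper describes might be roughed out as follows. The recognizer below is a placeholder, not the real Google Goggles interface, and the grasp table is invented for illustration.

```python
# Hypothetical recognize-then-grasp pipeline: identify the object via
# a crowd-sourced recognition service, then look up a suitable grasp.

KNOWN_GRASPS = {
    "stapler": "pinch grasp across the narrow body",
    "mug": "hook grasp through the handle",
}

def recognize(image_pixels):
    """Stand-in for sending a camera image to a crowd-sourced
    recognition service and getting back a text label."""
    return "mug"  # pretend the service identified the object

def choose_grasp(label):
    """Pick a grasp for the recognized object, if one is known."""
    return KNOWN_GRASPS.get(label, "no known grasp; request human help")

label = recognize(image_pixels=[])
print(f"recognized: {label}; grasp: {choose_grasp(label)}")
```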

More information is available on the project website.

Also available online is information about the RAVEN surgical robotic system that Goldberg, Abbeel and others at Berkeley are working on, and on Goldberg’s cloud robotics site.