Augmented reality research at UC Berkeley got a boost with the announcement of a $100,000 grant from Microsoft, along with two of the company's HoloLens augmented reality headsets, awarded to a team working to simplify human-robot interaction.
Led by three faculty members from the UC Berkeley Robotics and Intelligent Machines Lab (Dr. Allen Y. Yang, Professor Claire Tomlin and Professor Shankar Sastry, the engineering dean), the team was one of five nationwide to win Microsoft's challenge to design projects that push the boundaries of holographic computing. Rohit Swamy, an EECS undergraduate from the Virtual Reality @ Berkeley Club, will also lead more than 10 undergraduate students in developing augmented reality solutions for drones alongside the principal investigators' research groups.
Microsoft described the team’s project as developing “better ways for pilots to ‘drive’ autonomous aerial vehicles. They imagine that one of the first applications of this project would be used to search for survivors after some sort of calamity.” Using the HoloLens, a single pilot could control multiple vehicles at once.
“One of the goals of this project is to provide people who don’t have much drone-flying experience the opportunity to pilot the fleet via a human-robot interface,” Microsoft wrote.
Sastry launched the research initiative to improve human-robot interfaces three years ago, partnering with researchers and students from the electrical engineering, computer science, chemical engineering and psychology departments at UC Berkeley, Stanford University and UCLA. The Microsoft grant adds to funding the team already receives from NASA and the Office of Naval Research.
One of the researchers’ goals is to introduce a new, simple user interface that would allow people without an advanced degree to control robots and drones, according to Yang.