After she was invited to present her work in Human-Robot Interaction at the Monterey Bay Aquarium Research Institute (MBARI), UC Santa Cruz Associate Professor of Computational Media Leila Takayama, along with Kevin Weatherwax, a Ph.D. student in computational media, has been discussing ways to streamline the garbled three-way conversation between oceanographers, engineers, and the robots and autonomous vehicles prowling beneath Monterey Bay (and beyond).
“Right now we’re starting a collaboration with the Monterey Bay Aquarium Research Institute,” Takayama said. “The scientists there, for the most part, can’t use robots by themselves. So what we’re doing is shadowing scientists and trying to see how their mental models of how the system should work could map onto the way the system is designed.”
It’s the first step in the process of designing a more intuitive control system, one that won’t require a team of engineers to run.
“We’ve got these vehicles, which were pretty much invented a decade ago,” MBARI engineer Brian Kieft said. “They were made by engineers, and the interface we have to control them when they drift through the ocean was made by the same engineers who designed the control algorithms. There’s not so much of an eye toward the end user. So we started this collaboration [with Takayama] to create something a scientist, or maybe a grad student who’s never had experience with a command line, can use.”
The control systems running these autonomous underwater vehicles (AUVs) have been a major hurdle for small NGOs or laboratories that can’t afford a team of engineers to steer the bots, or that have trouble extracting information from them. Even in labs that can afford engineers, operating the robots diverts resources away from other important projects (such as designing the next generation of AUVs).
Professor Takayama was on the team that created the Robot Operating System (ROS) software used by many research, industrial, and consumer robotics teams. During a conversation with Kieft about adapting ROS for underwater vehicles, they realized there were also serious user interface issues to address.
“What we’re hoping to achieve through this collaboration with Professor Takayama’s lab is to create a user interface that allows a small science lab to operate these things confidently without asking us for help,” Kieft said. “If my phone rings less often, then I’d say we’re doing our job.”
“Robots should be human-readable,” Takayama said. “There shouldn’t have to be robot whisperers. You shouldn’t have to have ten thousand hours of training to use one. I would like for anyone to be able to interact with robots, know what they’re doing, and control them.”
For more information about Professor Leila Takayama, please see: http://www.leilatakayama.org
For information about supporting Professor Takayama’s research please contact Roger Trippel at email@example.com.