Students demo their high-level robotics projects


Lindsay France/University Photography
Chuck Yang, M.Eng. '12, models a pair of glasses that mirror a computer screen, providing the user with a heads-up display as he communicates with a robot on a search-and-rescue mission.

Lindsay France/University Photography
Annie Dai '12 demonstrates her project, a universal robotic gripper integrated with a platform that patrols an "apartment" to find trash.

Pop into Cornell's Autonomous Systems Lab in Rhodes Hall on any given day, and a mechanical arthropod might be negotiating a steep ramp, or a Roomba-like rover could be cleaning up a cluttered room.

Students led by Mark Campbell, professor of mechanical and aerospace engineering, and Hadas Kress-Gazit, assistant professor in the same department, are helping to move robotics beyond the rigid, hard-wired programming of yore toward more sophisticated, integrated and automated behavior on a variety of robot platforms. Several student researchers showed off their latest contributions at an end-of-semester project demo.

"What we want to do is create machines that can do things autonomously," explained Kress-Gazit, whose research interests include a high-level software toolkit called Linear Temporal Mission Planning (LTLMop). "That means from the simplest things -- 'Don't collide with something,' to the more complex."

For instance, Robert Villalba '15 used LTLMoP to create high-level commands for a spiderlike robot that's smart enough to traverse different terrains without being programmed with every move; instead, it reacts to its environment based on broad specifications.

"The idea is to use English and tell the robot what you want it to do," Villalba said. The robot, for example, walks with a relaxed gait on a flat surface; when it encounters an incline, it adjusts its gait with more exaggerated movements to aid its climb.

Another group might someday put campus tour guides out of a job. Ahmed Elsamadisi '14 and his teammates demonstrated a robotic system that gives a prerecorded campus tour -- and knows where it's going with minimal human supervision.

They used a rolling Segway platform, with QR code-like tags along the walls of Rhodes Hall serving as landmarks for a "vision" system so the robot could orient itself. Preloaded with specifications, the robot can negotiate hallways and corners on its own, playing audio recordings along the way. It can adjust its behavior depending on what it encounters -- such as a cluster of people walking by.
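Tag-based orientation of this sort is simple to sketch: each tag's identifier maps to a surveyed position on the floor plan, and a sighting lets the robot correct the drift in its wheel-odometry estimate. The following is a minimal illustration, not the students' code; the tag IDs, coordinates and function names are all invented:

    # Hypothetical sketch of tag-based localization along a hallway.

    TAG_POSITIONS = {            # surveyed (x, y) in meters, invented values
        "tag_01": (0.0, 0.0),
        "tag_02": (5.0, 0.0),
        "tag_03": (5.0, 4.0),
    }

    def update_pose(odometry_pose, sighted_tag, offset_from_tag):
        """Snap a drifting odometry estimate to a sighted wall tag."""
        if sighted_tag in TAG_POSITIONS:
            tx, ty = TAG_POSITIONS[sighted_tag]
            dx, dy = offset_from_tag
            return (tx + dx, ty + dy)   # trust the tag over dead reckoning
        return odometry_pose            # no tag in view: keep the estimate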

"The robot has to constantly inform itself and update its knowledge based on what it sees," Elsamadisi said. "The full goal would be [for it] to learn habits … and respond with verbal communication."

Annie Dai '12 demonstrated a universal robotic "gripper" -- a balloon filled with coffee grounds that hardens around an object to pick it up -- integrated with a mobile platform that patrols the "bedrooms" of a mock apartment, finds trash, picks it up and drops it off in receptacles. Like the arthropod and the Segway, this robot operates with a high-level understanding of its environment, reacting to situations as they arise, guided by simple English commands.
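The patrol-and-collect behavior decomposes naturally into a short sense-act cycle. The sketch below is a guess at that structure, not Dai's code; the helper functions are placeholders for perception, navigation and the gripper's harden/release actions:

    # Hypothetical sketch of the patrol cycle: visit rooms, spot trash,
    # jam the gripper around it, carry it to a bin. All names invented.

    def trash_patrol(rooms, trash_bin, see_trash, go_to, grip, release):
        for room in rooms:
            go_to(room)                  # patrol the next "bedroom"
            item = see_trash()           # perception: None if room is clean
            while item is not None:
                go_to(item)
                grip()                   # balloon hardens around the object
                go_to(trash_bin)
                release()                # drop it in the receptacle
                go_to(room)
                item = see_trash()       # check the room for more trash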

Yet another group, aiming to improve search-and-rescue missions, designed a software interface with an interactive map of an area (in this case, the engineering quad) into which a person can enter information using a touchpad -- such as "go here, not there" or "there may be something interesting to look at here, but not there." In turn, the robot can relay information back to the human.
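One natural way to encode such input is as weighted regions on the shared map: positive weights for places worth a look, hard exclusions for places the robot must not go. The sketch below is purely illustrative; the region names, weights and data layout are invented, not the Husion interface's actual representation:

    # Hypothetical sketch of operator map annotations as weighted regions.

    annotations = [
        {"region": "quad_north",   "kind": "go",       "weight": 1.0},
        {"region": "quad_south",   "kind": "avoid",    "weight": 0.0},
        {"region": "loading_dock", "kind": "interest", "weight": 0.5},
    ]

    def allowed(region):
        """A region is off-limits if any 'avoid' annotation names it."""
        return all(a["kind"] != "avoid"
                   for a in annotations if a["region"] == region)

    def next_target(candidate_regions):
        """Send the robot to the most interesting permitted region."""
        scores = {r: sum(a["weight"] for a in annotations if a["region"] == r)
                  for r in candidate_regions if allowed(r)}
        return max(scores, key=scores.get) if scores else None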

And to make things truly futuristic, the students loaded their interface into a pair of monitor goggles that mirror the computer screen. This gives the wearer a heads-up display -- more convenient than looking down at a screen, and better suited to a search-and-rescue environment.

"The idea is to leverage things robots are good at, like doing long, monotonous or dangerous missions, with the things humans are good at, which is interpreting scenes and doing high-level decision making and pattern recognition," said Nisar Ahmed, a postdoctoral associate who works on the project, which is called Husion.

These projects, and others like them, all demonstrate the lab's unifying theme: increasing the autonomy of robots, Kress-Gazit said.

"Robotics has different flavors, but here we focus on machines that can do things autonomously with minimum human intervention, and still be safe while doing something interesting," she said.

