Humans meet their robot overlords at SIGGRAPH
Los Angeles (CA) – Balancing on two wheels, the shiny UBot-5 scurries around the Los Angeles Convention Center like it owns the place. Built by students at the University of Massachusetts Amherst, the robot can hit 12 km/h and is nearly impossible to knock over. Nearby, MIT students demoed “Nexi”, a robot that responds to people with facial expressions and speech. Put your hand directly in “her” face and the bot acts upset. Say something and she will track your face around the room. And don’t bother challenging Nexi to a staring contest: she can hold your gaze until her own programming forces her to blink.
Steered with a Microsoft Xbox controller, the aluminum-framed UBot speeds around on two wheels. Two articulated arms let the bot touch objects and even do a pushup of sorts.
Bryan Thibodeau, a grad student at the University of Massachusetts Amherst, told TG Daily that a gyroscope in the base of the chassis helps the bot stay upright. Position, balance, and speed data are sampled at 400 Hz and sent to a board mounted in the chest. A second board carries a Core 2 Duo processor and runs Windows XP with Microsoft Robotics Studio. Because balance is handled on its own board, the bot stays upright even if Windows crashes. Thibodeau said machining the aluminum frame was probably the most expensive “part” of the bot, and he estimates the several A123 lithium batteries in the chest cost approximately $600.
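The team hasn’t published its firmware, but the architecture described above is the classic two-wheeled balancer: a fast inner loop reads the gyro, estimates tilt, and commands the wheels to catch the fall. Here is a minimal sketch of that kind of loop, assuming a complementary filter plus a PD controller running at the 400 Hz rate Thibodeau mentioned. All function names, gains, and the toy pendulum dynamics are illustrative, not the UBot’s actual code.

```python
import math

DT = 1.0 / 400.0  # the UBot samples its balance sensors at 400 Hz

def complementary_filter(prev_angle, gyro_rate, accel_angle, alpha=0.98):
    """Fuse a fast-but-drifting gyro with a noisy-but-stable accelerometer."""
    return alpha * (prev_angle + gyro_rate * DT) + (1.0 - alpha) * accel_angle

def pd_torque(angle, rate, kp=40.0, kd=2.5):
    """PD controller: push the estimated tilt (radians) back toward vertical."""
    return -(kp * angle + kd * rate)

# Toy inverted-pendulum simulation: start tilted ~3 degrees, run 5 seconds.
angle, rate = 0.05, 0.0
for _ in range(2000):
    torque = pd_torque(angle, rate)
    accel = 9.81 * math.sin(angle) + torque  # unit mass/length dynamics
    rate += accel * DT
    angle += rate * DT
    # On real hardware, `angle` would instead come from complementary_filter()
    # fusing gyro and accelerometer readings on each 400 Hz cycle.

print(abs(angle) < 0.01)  # True: the controller has damped the tilt out
```

Running the balance loop on its own board, as the UMass team does, means this inner loop keeps executing even when the Windows machine above it locks up.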
The UBot itself weighs approximately 35 pounds, but the extra gear brings the total traveling weight to 100 pounds. Even so, Thibodeau said getting the robot to Los Angeles wasn’t difficult: the bot gets a first-class seat on the airplane, and the students accompanying it sit in first class too. How awesome is that? “TSA doesn’t have a problem with our bot because it fits in the X-ray machine, but flight attendants are a different story,” said Thibodeau. He added that it’s cheaper to fly the robot first class than to insure and ship it.
Thibodeau says the UBot is a test platform that can be used to help elderly people in the future. A small webcam is mounted on the shoulders and can monitor whether someone has fallen down. The bot can also be used as a mobile web conferencing platform – the camera comes to you and not the other way around.
Nearby, students from MIT showed off their Nexi MDS (mobile, dexterous, social) robot, which shows emotion with eyebrow, mouth, and eye movements. It even gets upset if you walk too close or put your hand up to her face. An array of infrared LEDs on the head builds a depth map of nearby objects, which is compared with the data coming from the regular stereo cameras mounted in the eyes.
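MIT hasn’t detailed how Nexi reconciles the two sensors, but comparing an active IR depth map against stereo depth is, at its simplest, a per-pixel consistency check. The sketch below shows that idea with synthetic data: the noise levels, the 10 cm agreement threshold, and the fusion rule are all assumptions for illustration, not Nexi’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
true_depth = np.full((4, 4), 1.5)                        # object 1.5 m away
ir_depth = true_depth + rng.normal(0, 0.01, (4, 4))      # IR map: low noise
stereo_depth = true_depth + rng.normal(0, 0.05, (4, 4))  # stereo: noisier

# Where the two sensors agree within 10 cm, average their estimates;
# elsewhere fall back on the IR reading and flag the pixel for attention.
agree = np.abs(ir_depth - stereo_depth) < 0.10
fused = np.where(agree, (ir_depth + stereo_depth) / 2.0, ir_depth)

print(fused.shape)  # (4, 4): one fused depth estimate per pixel
```

Cross-checking an active sensor against passive stereo this way helps reject artifacts either sensor produces alone, which matters for a robot that has to judge whether a face or a hand is uncomfortably close.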
Nexi is quite lifelike, able to follow both your speech and your face. One attendee, fearing the inevitable fall of the human race, challenged the bot to a staring contest. For several seconds he watched the bot while Nexi unblinkingly stared back. Surprisingly, Nexi lost when her programming forced her to blink, though MIT student Mikey Siegel joked, “Never have a staring contest against a robot.”