Engineers at the Coordinated Robotics Lab at the University of California, San Diego, have developed new image processing techniques for rapid exploration and characterization of structural fires by small Segway-like robotic vehicles.
A sophisticated on-board software system takes the thermal data recorded by the robot’s small infrared camera and maps it onto a 3D scene constructed from the images taken by a pair of stereo RGB cameras.
This allows the small mobile robots to build a virtual-reality picture, a 3D map overlaid with temperature data, that first responders can use immediately as the robot drives through a burning building.
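The article does not spell out how the thermal overlay is computed. A common approach to this kind of sensor fusion is to back-project the stereo depth map into 3D points, transform those points into the infrared camera's frame, and sample the thermal image at each point's reprojection. The sketch below illustrates that idea in NumPy; the function name, calibration matrices, and pipeline details are illustrative assumptions, not the lab's actual implementation.

```python
import numpy as np

def thermal_to_pointcloud(depth, K_rgb, K_ir, T_rgb_to_ir, thermal):
    """Attach a temperature to each 3D point recovered from stereo.

    depth        : (H, W) depth map from the stereo RGB pair, in meters
    K_rgb, K_ir  : 3x3 intrinsic matrices of the RGB and IR cameras
    T_rgb_to_ir  : 4x4 rigid transform from the RGB to the IR frame
    thermal      : (h, w) temperature image from the IR camera

    All parameters here are hypothetical stand-ins for a real
    calibration; returns an (N, 4) array of x, y, z, temperature.
    """
    H, W = depth.shape
    # Pixel grid of the RGB camera, as homogeneous coordinates.
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T
    # Back-project each pixel: X = z * K_rgb^-1 [u, v, 1]^T.
    pts_rgb = np.linalg.inv(K_rgb) @ pix * depth.reshape(1, -1)
    # Move the points into the IR camera frame.
    pts_h = np.vstack([pts_rgb, np.ones((1, pts_rgb.shape[1]))])
    pts_ir = (T_rgb_to_ir @ pts_h)[:3]
    # Project into the IR image and sample the nearest temperature pixel.
    proj = K_ir @ pts_ir
    ui = np.clip((proj[0] / proj[2]).round().astype(int), 0, thermal.shape[1] - 1)
    vi = np.clip((proj[1] / proj[2]).round().astype(int), 0, thermal.shape[0] - 1)
    temps = thermal[vi, ui]
    return np.hstack([pts_rgb.T, temps.reshape(-1, 1)])
```

In a real system the two cameras would be calibrated against each other and occluded or zero-depth pixels masked out, but the core operation, back-project, transform, reproject, sample, is the same.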
The research is part of a plan to develop novel robotic scouts to assist firefighters at residential and commercial blazes. The robots will map and photograph the interior of burning buildings using stereo vision, and will use data gathered from various sensors to characterize the state of a fire, including temperatures, volatile gases, and structural integrity, while looking for survivors.
Operating both collaboratively and autonomously, a number of such vehicles would quickly develop an accurate augmented virtual reality picture of the building interior and provide it in near real time to rescuers, who could then better assess the structure and plan their firefighting and rescue activities. Bewley’s dynamics and control team has already built the first vehicle prototype, essentially a self-righting Segway-like vehicle that can climb stairs.
“These robot scouts will be small, inexpensive, agile, and autonomous,” said Thomas Bewley, a professor of mechanical engineering at the Jacobs School of Engineering at UC San Diego. “Firefighters arriving at the scene of a fire have 1000 things to do. To be useful, the robotic scouts need to work like well-trained hunting dogs, dispatching quickly and working together to achieve complex goals while making all necessary low-level decisions themselves along the way to get the job done.”