Victorville (CA) – Taking first place in a car race is great, but imagine finishing in 1st, 2nd and 4th place. No, it's not impossible, because at last Saturday's DARPA Urban Challenge, Velodyne did just that. The company's spinning laser sensor was used by many of the top teams in the robotic car race; in fact, seven of the 11 finalists had the sensor on top of their vehicles. So while Tartan Racing enjoys its $2 million first-place prize, Velodyne President Bruce Hall is also laughing all the way to the bank.
The Velodyne HDL-64E LIDAR unit is a very new sensor; Hall told us his company only started making the units in January. But despite the short lead time before the October race qualifications, several top teams snapped up the first units.
CMU and Stanford immediately showed the greatest interest and bought two each, one for the primary vehicle and one for the backup… and at $75,000 apiece, these laser sensors aren't cheap, folks. But for many teams the cost was well worth it. "We showed off the first units to teams and their jaws dropped because they could see outlines and faces of people with it. It was almost like looking at a camera image," Hall told us.
The sensor became so common at the Urban Challenge that teams simply started calling it THE Velodyne. With its 64 lasers mounted in a spinning head, the unit can distinguish curbs, cars, people and even faces as far as 120 meters away, and we saw this in numerous demonstrations while talking to teams at the Urban Challenge qualifications and race.
Hall tells us that he's sold "tens of devices" to Urban Challenge teams and to automotive companies, but he believes the biggest market may be mobile surveying: the crews you often see on freeways and streets wearing orange vests and peering through laser viewfinders. By mounting the laser on a vehicle, crews could perform measurements in a single pass, something that takes several hours today.
"You could measure things like if a bridge is sagging. No more taking a laser on a tripod and walking back and forth for hours. Just drive on the bridge and finish it in one pass," Hall told us. He added that he'd actually like to get rid of the guys on the tripod because it's a risky job and the crews slow down traffic. "Guys get killed all the time," Hall said.
Hall did say that the sensor will need to become much more accurate if it's to be used in surveying. "We'll have to get accuracy of a quarter of an inch; right now it's plus or minus one inch," he said.
So what's the magic behind the Velodyne laser? The unit spins at 10 rotations per second, firing lasers through a full 360 degrees horizontally and across a 26-degree vertical arc. The beams hit objects and are reflected back into laser detectors. The resulting data is impressive: roughly one million data points per second, each including both a distance figure and an intensity figure. Since the distance is measured straight along the beam to the reflecting object, Hall told us that some simple trigonometry converts it into the object's actual position relative to the car.
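The trigonometry Hall mentions can be sketched in a few lines. Each return comes back as a range along the beam, plus the head's rotation angle and the fixed vertical angle of the laser that fired; projecting that onto horizontal and vertical axes gives a point in space relative to the car. This is a simplified illustration, not Velodyne's actual firmware; the function and parameter names are ours.

```python
import math

def lidar_point_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return to Cartesian coordinates relative to the sensor.

    range_m:       straight-line distance along the beam (what the detector measures)
    azimuth_deg:   horizontal angle of the spinning head (0-360 degrees)
    elevation_deg: fixed vertical angle of the particular laser that fired
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horiz = range_m * math.cos(el)   # range projected onto the ground plane
    x = horiz * math.cos(az)         # forward of the sensor
    y = horiz * math.sin(az)         # to the side
    z = range_m * math.sin(el)       # above/below the sensor
    return x, y, z

# A 100 m return, straight ahead, from a laser aimed 5 degrees downward
# lands slightly below sensor height and a touch under 100 m out:
print(lidar_point_to_xyz(100.0, 0.0, -5.0))
```

The same conversion, run 64 lasers at a time as the head spins, is what turns raw ranges into the point clouds the teams fed to their obstacle detectors.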
Some teams figured out that the intensity figure could be used to detect lane markers because they tend to light up like a "Christmas tree," according to Sebastian Thrun, team leader of Stanford Racing. Thrun told reporters that lane markings are amazingly reflective in the IR spectrum, so the intensity value of the reflected points was very high. You can watch Stanford's Sebastian Thrun and Michael Montemerlo talking about the sensors in the following video.
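In its simplest form, the trick Thrun describes is just a threshold on intensity: retroreflective paint bounces back far more IR light than bare asphalt, so bright returns near the road surface are good lane-marker candidates. The sketch below is illustrative only; the threshold value and data layout are assumptions, and real systems calibrate per sensor and typically normalize intensity by range first.

```python
def find_lane_marker_points(points, intensity_threshold=200):
    """Keep only returns bright enough to be retroreflective paint.

    points: list of (x, y, intensity) road-surface returns.
    The threshold is illustrative, not a Velodyne spec.
    """
    return [(x, y) for (x, y, intensity) in points
            if intensity >= intensity_threshold]

# Dim asphalt returns mixed with two bright hits on painted lines:
road = [(5.0, 0.1, 40), (5.2, 1.8, 230), (6.0, -0.3, 35), (6.1, 1.9, 245)]
print(find_lane_marker_points(road))  # -> [(5.2, 1.8), (6.1, 1.9)]
```

Fitting a line or curve through the surviving bright points then gives an estimate of where the lane boundary runs.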
With so many cars using the spinning Velodyne laser, some people thought conflicting laser beams could send contestants careening out of control. Nothing could be further from the truth, said Hall, who chalked it up as another "urban legend." He told us that the chances of two of his laser systems interfering with each other are incredibly slim because the Velodyne only fires four of its 64 lasers at a time. When the four lasers fire, four corresponding laser detectors become active for the reflected light. So for a conflict to occur, everything would have to be in phase: "The windows would have to be facing each other, the lasers would have to fire at the same time and the lasers would have to be aligned to the sensor that was active," said Hall.
Even if all the conditions were met, you would get just one stray pixel, and Hall points out that all of the teams run filters that detect cars, trees and curbs, which usually comprise thousands or millions of pixels. "So you just got one bad pixel between you and that big blob," he added.
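The filtering Hall alludes to can be as simple as discarding returns that have no neighbors: a real obstacle shows up as a dense blob of points, while a stray pixel from interference or noise stands alone. This is a toy sketch of that idea (the teams' actual filters were far more sophisticated); the radius and neighbor count are made-up values.

```python
def drop_isolated_points(points, radius=0.5, min_neighbors=1):
    """Discard returns with no nearby neighbors.

    points: list of (x, y) positions in meters.
    A lone return with no other point within `radius` is treated as noise;
    anything belonging to a real object keeps at least one neighbor.
    """
    kept = []
    for i, (x, y) in enumerate(points):
        neighbors = sum(
            1 for j, (px, py) in enumerate(points)
            if j != i and (px - x) ** 2 + (py - y) ** 2 <= radius ** 2
        )
        if neighbors >= min_neighbors:
            kept.append((x, y))
    return kept

cloud = [(10.0, 2.0), (10.1, 2.1), (10.2, 2.0),  # a real object: a tight cluster
         (45.0, -7.0)]                            # one stray pixel, far from everything
print(drop_isolated_points(cloud))  # -> the three clustered points survive
```

One phantom return per rotation, in a cloud of a million points, simply never makes it past a filter like this.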
Mr. Hall attended the Urban Challenge with some mixed feelings, because he had fielded teams of his own in the first and second Grand Challenge races. "In the other races I just wanted the car to make it out of TV camera range. I didn't want it to fail on television," he joked. Things were much different this time, because now he just wanted all of his sensors to work, adding, "When the race was finished, they were all still working and spinning."