How the ‘freeze-the-field’ rule led to NASCAR’s Mobile Technology Center

See the NASCAR Mobile Technology Center slide show…

Indianapolis (IN) – The white flag flew over a crowded New Hampshire International Speedway, and Dale Jarrett was among the last to see it. This was not his day: his #88 was running at the back of the pack. The car became “loose,” as NASCAR drivers say of cars that respond too easily to the steering wheel. For a moment, he lost control, and the car spun; but the man driving the brown and gold UPS vehicle had long since lost count of how many times he’d traveled in a spinning vehicle. He brought the car to a stop, pointing in the right direction, but in the middle of the backstretch. At the start/finish line, the white flag was replaced with a yellow.

Since the days of the first nationally recognized stock car organizations, there had been a “gentlemen’s agreement”: when the yellow comes out, the caution “begins” at the start/finish line, but you can race for position until you get there. At this moment, most of the cars still in the race had just passed over the line and seen the white flag. Sure, a yellow would be waiting for them next time around, but so would the checkered, so from where they were sitting, there was no point in slowing down.

Dale Jarrett could have died that day, and his #88 could have joined a certain #3 in the annals of history, where Dale Earnhardt had been memorialized just one year before. But like a lone cactus amid a stampede of bulls, every car in the skirmish that ensued avoided the UPS car. Jarrett escaped a horrific fate, and NASCAR escaped the public relations disaster that would have erupted if two giants of the sport had been lost to avoidable accidents. NASCAR knew full well how close it had come not only to losing Jarrett, but to losing its momentum. This would not happen again.

NASCAR would mess with tradition – at least, that’s how many in the sports press saw it. Under the “freeze-the-field” rule NASCAR devised, when the caution flag is shown and the yellow lights go on, cars on the track line up behind the pace car in their relative running order at the moment the flag is shown. But up until 2002, running order had been determined by when cars crossed the start/finish line. For the new scheme to work, spotters had to keep their eyes on every car, all over the track, at all times – and then agree on what they all saw.

The new rule almost didn’t work. In October 2003, during the Mr. Goodcents 300 – a Busch Series race in Kansas City – a last-lap restart resulted in a crash between Bobby Hamilton, Jr. and Greg Biffle. The caution came out, but some drivers raced to the yellow and checkered flags anyway. In the scramble for position, Jason Keller’s car was left in the dust. But when the race was over, NASCAR officials insisted that because Keller was fourth at the time the yellow was shown, he finished fourth. Fans protested – as fans are wont to do whenever tradition is changed. And drivers protested that they couldn’t see the yellow lights all the way around the track.

NASCAR needed a technological solution, and it needed it yesterday. Its technology partner was AMD; its national IT manager, Steve Worling, was already a veteran server administrator, though his background was with Intel-based systems, and his brand of choice was HP. The mission before him was this: design a scoring system and methodology that eventually could be used for every NASCAR race – Nextel Cup, Busch Series, and Craftsman Truck Series – capable of detecting the precise running order at any frozen point in time. It had to communicate with sensors installed all around every speedway and road course on the NASCAR circuits. And the servers doing that communicating had to be the same servers at every venue – first, so that results between venues could be trusted, and second, because NASCAR didn’t want to install fixed, identical server installations at every speedway.

One laptop running Windows was not an option. What NASCAR needed was a network command center that was not only portable but mobile. And if anyone understands the fine art of mobility, it’s NASCAR.

We want to race the truck

Inside NASCAR’s mobile timing and scoring Technology Center vehicle.

You almost can’t enter the NASCAR Mobile Technology Center (MTC) vehicle without your mind replaying the annoying little buzzy theme from “Knight Rider.” It’s a 52-foot Freightliner that serves as a complete office complex for the timing and scoring team. After you step through the main entrance of the vehicle – which is at the back end, through the kitchen – you find yourself in Mission Control. There are banks of LCD monitors along the walls, and rows of notebook computers along both sides, mostly toward the rear and left side of the vehicle.

But you don’t see the servers – at least, not right away. They’re literally tucked away in a closet…a specially designed closet resting atop a two-ton air conditioning unit: a pair of 32U racks, one for the main system, one for the backup. The main system houses an HP BL25p blade server configuration with five blades. Three of these blades are single-core, 2.8 GHz AMD Opterons with 2 GB of RAM apiece – they’re used for the communications network with the transceiver devices. The two other operational units are dual-core, 2.4 GHz AMD Opterons with 4 GB of RAM apiece, and are used for managing the SQL Server database of car positions, which is updated for every split second of the race.

These things shouldn’t be here in a closet – at least, that’s what everyone told NASCAR’s Steve Worling in September 2005, when he was first charged with this mission. “I sat down with a specialist on HP and blades,” he told us, “and the first thing he said is, ‘How are you gonna cool ’em?’ I said, ‘Well, here’s my parameters. I’m building a 52’ hauler, I’ve got x by x by x space, which is very small…[and] basically we have two 32U racks, and I’m going to stick a 2,000-pound air conditioner under them.’ And they were like, ‘No, you can’t do that.’ I said, ‘Trust me, we can make it work.’”

NASCAR national IT manager Steve Worling explains the MTC, with AMD marketing official Travis Bullard looking on.

With the help of Featherlite, the mobile systems construction firm that had built dozens of crew trailers for NASCAR in the past, Worling and his team constructed a 3D model of his dream truck. It took ten iterations for Featherlite to meet Worling’s exacting specifications. “I did the calculations on the BTUs, on what the servers could do if they were fully populated with all the equipment,” he said, “[and] two tons of air was going to be the maximum we could do.”
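
Worling's calculation maps a heat load in watts to tons of air conditioning, a standard conversion. The sketch below illustrates the arithmetic; the wattage figure is our own assumption for illustration, not NASCAR's actual number.

```python
# Sketch of the cooling-capacity arithmetic Worling describes.
# The server load in watts is an illustrative assumption.

BTU_PER_WATT_HOUR = 3.412   # 1 watt of heat = 3.412 BTU/hr
BTU_PER_TON = 12_000        # 1 ton of cooling = 12,000 BTU/hr

def tons_of_cooling_required(total_watts: float) -> float:
    """Convert a steady-state server heat load in watts to tons of AC."""
    return total_watts * BTU_PER_WATT_HOUR / BTU_PER_TON

# Hypothetical fully populated load for two racks of blades and storage.
fully_populated_watts = 6500
tons = tons_of_cooling_required(fully_populated_watts)
print(f"{tons:.2f} tons")   # just under two tons, so a two-ton unit is the ceiling
```

At anything much above this hypothetical load, a two-ton unit would no longer keep up, which is why Worling treated two tons as the maximum the design could support.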

By the time HP was coming up with specifications for the system, Worling and the Featherlite team had already constructed the truck. “We ordered two 32U racks with smoked glass doors front and back, and the first thing [HP] said was, ‘Why are you doing smoked glass doors?’ Well, when you stash these into this room, seal them and then close the doors, now that is your server room environment. It’s completely closed off, [and] right underneath this room is a two-ton air unit. The air is fed only to it, nothing else.”

The base of the server closet is about eight inches higher than the floor of the truck. That space is empty – on purpose. Fans from Delphi Automotive, mounted on the bottom of the racks, draw air from the unit into the pocket of open space. When the smoked glass doors are closed, the fans gather cool air from the pocket and expel it out the back of the unit, through two vent holes that Featherlite cut in the top of the closet. Another space, between the wall and the work area, acts as a plenum that vents the exhausted air back into the room. The air conditioning unit then recirculates this air, because it’s easier to cool cleaner air from the inside than dirtier air from the outside. Besides, we all know how dirty the air can get at a racecourse.

NASCAR’s Steve Worling shows off the two 32U HP ProLiant server racks to analyst Rob Enderle.

Worling’s goal throughout the design process was to eliminate every possible single point of failure. There are redundant server racks, and among the timing and scoring server blades, the third is a backup. The HP StorageWorks component provides redundant storage arrays to all workstations in the MTC. And for every connection between the truck and the fiberoptic network rigged all around the racecourse, there are two switches. If one connection is cut or one switch fails, system software can detect the fault and fail over to the bypass switch.

Even the power system is redundant. Mounted on the front of the truck is a 40 kW generator capable of producing 150 amps, even though the servers, workstations, microwave oven, and coffee maker together draw only about 90 amps at maximum. But that’s not even the truck’s main power source – it’s the backup. Main power comes from a separate Caterpillar truck, which supplies 998 kW for both the MTC and the mobile crews of the US television networks.

So suppose the air conditioner unit goes out, as could conceivably happen during an electrical event such as a lightning strike. Would the race be red-flagged? NASCAR engineers have a “never-say-die” attitude. In the event of a cooling system failure, the team is ready to actually cannibalize a part of the truck itself, to serve as a standby cooling device.

“In the front, we have this 8 x 12 office,” Worling told us. “I knew this office was [going to be there, so I said], ‘You know what? Drop me in as much air as you can. I don’t care if it snows in here while we’re working; I want as much air as we can get in here.'” The wall on one side of the eight-inch air pocket butts up to the front office. “So I said, ‘Cut me as big a hole as you can that fits inside of our plenum, out of this office.’ So [if] this AC breaks, I go pull this little piece of wood out of the way, it becomes an inlet for cold air. We close the door to the office, it becomes a meat locker.”

NASCAR’s time machine

We took our tour of the MTC truck at the Indianapolis Motor Speedway, on race day morning for the Allstate 400 at the Brickyard. According to NASCAR senior officials we spoke with, the race that fans still call the “Brickyard 400” has grown to become the #2 most attended single-day sporting event in the country, with close to 300,000 fans in the stands – eclipsing the Kentucky Derby, and now overshadowing even the mighty Daytona 500.

The Indianapolis Motor Speedway, on race day morning, 6 August 2006.

The #1 event, as anyone who lives here in Indianapolis knows, is affectionately referred to as “The Greatest Spectacle in Racing.” But in the Indy 500, the wheels are exposed, and the cars are designed to fly apart when they strike the new SAFER barrier wall. Although Indy Racing League events differ in many respects, a “freeze-the-field” rule has been in effect for several years, and fans have come to accept it. To implement this rule, a network of sensors is installed at 22 opportune locations all around the Speedway, though not at even intervals – there are certainly more sensors in the front straightaway and the pit area than in the backstretch. When the caution comes out for an IRL race here, the timing and scoring officials seated in the famous pagoda near the grandstands use their software to judge how vehicles should line up behind the pace car.

Every major racecourse now has some type of sensor network installed, servicing either IRL or NASCAR, or both. The IMS had its network installed after an end-of-race incident in 2002, in which Paul Tracy passed race leader Helio Castroneves after the caution was waved. The freeze-the-field rule was already in effect, but Tracy claimed he didn’t see the yellow light on the track – even though a video replay showed a yellow light come up on Tracy’s steering wheel before the pass even began. Castroneves was declared the winner, but many fans were left bitter.

NASCAR wants to avoid public feuds between its drivers, in an effort to clean up its image and be more presentable to families. But with 42 races in the Nextel Cup circuit alone, NASCAR could no longer afford to lug its servers up and down the elevators of every scoring tower and pagoda on its schedule. Hence the need for the MTC truck, which can pull up alongside the Caterpillar power station and the NBC/TNT mobile production vehicle, and supply the exact same hardware and software for – eventually – every race. (The duplicate vehicle Steve Worling and his team are producing for 2007 will be equipped with the same hardware and software.)

Steve Worling shows an AMB TranX Pro RFID transponder device, which is mounted behind each race car.

The MTC connects to the IMS’ existing fiberoptic network of 22 sensors, called decoders. Each of these decoders serves to project its own crossing line across the racecourse, as it reads the constant pulse of signals sent from RFID transponders mounted on the rear of each vehicle. The transponder itself is two inches long. That’s a very important fact, because every NASCAR vehicle is exactly 14 feet long. So the line each decoder represents at its designated position on the racecourse is exactly 14′ 2″ from its physical location.

Each decoder is connected to the fiberoptic network by way of a loop, and sometimes officials will call the entire decoder station the “loop.” “What it does is record all of those pulses,” explained NASCAR scoring official James Garr, “and throws out all of them except the strongest one. That’s how it calculates the accuracy. It’s very accurate, down to about a thousandth of a second.”
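
The selection logic Garr describes can be sketched in a few lines: out of the burst of pulses a decoder receives as a transponder passes, only the strongest read survives, and its timestamp becomes the car's crossing time. The field names and signal-strength units below are our own assumptions, not AMB's actual data format.

```python
# Sketch of the decoder's "keep only the strongest pulse" logic.
# Data structures and units are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Pulse:
    transponder_id: int   # which car's transponder sent the pulse
    timestamp: float      # seconds, roughly millisecond resolution
    strength: float       # received signal strength (arbitrary units)

def strongest_crossing(pulses: list[Pulse]) -> dict[int, Pulse]:
    """For each transponder, keep only the strongest pulse seen at this
    loop; its timestamp becomes that car's official crossing time."""
    best: dict[int, Pulse] = {}
    for p in pulses:
        cur = best.get(p.transponder_id)
        if cur is None or p.strength > cur.strength:
            best[p.transponder_id] = p
    return best

pulses = [
    Pulse(48, 101.231, 0.62), Pulse(48, 101.233, 0.91),  # car 48, several reads
    Pulse(48, 101.235, 0.55), Pulse(29, 101.407, 0.88),  # car 29, one read
]
crossings = strongest_crossing(pulses)
print(crossings[48].timestamp)  # 101.233 -- the strongest read wins
```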

NASCAR’s time machine, Continued

The specific hardware the IMS uses to contact each decoder – like virtually every other major racecourse in North America – is called the TranX Pro Multiloop system, manufactured by AMB (no relation to AMD). This system comprises both the transponders and the decoders. The software NASCAR uses to manage it is called TimeGear. It projects the locations of decoders along the racecourse, and monitors the physical condition of each one. No single loop is critical to the operation of the system except the one representing the start/finish line, although the more loops operational, the better. Green boxes on the map indicate loops in nominal working order.

“When the car comes across [a loop],” Worling explained to us, “the transponder in the car gets read by the decoder, and it sends [a pulse] back to the server. As a car comes through, [the software] says, ‘Now I know relatively where you’re at, and I’m going to put you in order as everybody comes around until you get to the next line, and if you change a little bit, I’ll adjust that.’ So when we throw a caution, what happens is, everybody gets lined up in their relative positions from the previous loop they were scored at.”
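
The ordering rule Worling describes boils down to a sort: a car that has completed more laps, or crossed a later loop on the current lap, is ahead; among cars scored at the same loop, the one that crossed it sooner is ahead. A minimal sketch, with a data model that is our own assumption:

```python
# Minimal sketch of the freeze-the-field ordering Worling describes.
# Each car's score is (laps completed, last loop crossed, crossing time).

def frozen_running_order(cars: dict[str, tuple[int, int, float]]) -> list[str]:
    """cars maps car number -> (laps_completed, last_loop_index, crossing_time).
    Returns car numbers in frozen running order, leader first."""
    return sorted(
        cars,
        key=lambda num: (-cars[num][0],   # more laps completed = ahead
                         -cars[num][1],   # later loop on this lap = ahead
                         cars[num][2]),   # crossed that loop sooner = ahead
    )

# Hypothetical moment of caution: #48 and #29 scored at the same loop,
# #9 a loop behind them.
snapshot = {
    "48": (159, 14, 4801.212),
    "29": (159, 14, 4801.580),
    "9":  (159, 13, 4799.904),
}
print(frozen_running_order(snapshot))  # ['48', '29', '9']
```

The "if you change a little bit, I'll adjust that" in Worling's quote corresponds to re-running this sort every time any car crosses its next loop, right up until the caution freezes the result.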

NASCAR scoring official James Garr demonstrates TimeGear timing and scoring software, showing the locations of decoders stationed along IMS.

As NASCAR scoring official James Garr demonstrated to us, two screens along the left-side wall are devoted to the cars’ running order. When a caution comes out, the bottom display will show the frozen running order, where the cars should be. The top display then continues to show the cars’ actual running order, along with notation letting officials know when to notify drivers to move forward or back as they line up behind the pace car. “Then we can see, if they move out a little bit, we can say, ‘Hey, 48, you need to back up two spots!'” explained Worling, in a soon-to-be-prophetic reference to Jimmie Johnson.
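
The two-screen comparison lends itself to a simple diff: given the frozen order on the bottom screen and the actual running order on the top, compute how many spots each out-of-place car must move. This is purely illustrative of the idea, not NASCAR's software:

```python
# Sketch of the frozen-vs-actual comparison behind "back up two spots."
# Car numbers here are illustrative.

def lineup_corrections(frozen: list[str], actual: list[str]) -> dict[str, int]:
    """Positive = move back that many spots; negative = move forward."""
    frozen_pos = {car: i for i, car in enumerate(frozen)}
    return {
        car: frozen_pos[car] - i
        for i, car in enumerate(actual)
        if frozen_pos[car] != i
    }

frozen = ["29", "9", "48", "16"]
actual = ["48", "29", "9", "16"]    # 48 has jumped two spots ahead
print(lineup_corrections(frozen, actual))  # {'48': 2, '29': -1, '9': -1}
```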

With the strict enforcement of the frozen field, Garr stated, “[drivers] have no incentive to keep on the accelerator, because by the time the caution comes out, it’s already been decided where [they’re supposed to go].”

The loop sensors are all embedded at various points in the race track. As the track ages – as asphalt tends to do very quickly – the track surface itself can actually weep. This might move the sensors an inch or two over the years. Doesn’t that affect the scoring? “No, because it’s all relative,” said Garr. “All we’re trying to do is figure out which car is in front of the other car. So I don’t really care how many inches this car is in front of that car. When a caution comes out, I’m not concerned about the start/finish line. I’m concerned about the last line that they passed.”

The network, Worling stated, runs at the usual Ethernet bandwidth of 100 Mbps. That’s more than enough, he said, because the data each decoder sends through the loop – complete with car identifier and timestamp – is only about 60 K per packet. At full streaming, he said, bandwidth consumption runs only about 2 Mbps, and that’s counting the separate feed to officials who continue to occupy their traditional locations inside the pagoda.
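
A quick back-of-the-envelope check on those figures (reading the "60 K" as kilobits, which is our assumption) shows how little of the link the feed actually occupies:

```python
# Sanity check on Worling's bandwidth figures.
# We assume "60 K per packet" means kilobits.

LINK_MBPS = 100          # standard Fast Ethernet
PACKET_KBITS = 60        # per decoder report (assumed unit)
STREAM_MBPS = 2          # observed peak consumption

reports_per_second = STREAM_MBPS * 1000 / PACKET_KBITS
utilization = STREAM_MBPS / LINK_MBPS

print(f"{reports_per_second:.1f} reports/s, {utilization:.0%} of the link")
```

At roughly 33 decoder reports per second, the scoring feed uses about 2% of the pipe, leaving plenty of headroom for the pagoda feed and everything else.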

At the end of the race, the entire database collected from every point in time is transmitted to NASCAR’s headquarters in Daytona Beach, Florida. It’s a proprietary, flat-file database that can consume about 200 MB of hard drive space – as much as 1 GB for a race with more decoder data or a longer road course – but when sent in compressed form consumes only about 15 MB. The file isn’t sent over the open Internet; rather, it travels over a corporate VPN, the tunneling data for which is never shared with the outside world. Having the race data hacked is something NASCAR never wants to experience.

As it stands now, NASCAR doesn’t share its timing and scoring feed with any of its teams. Part of the reason is that it doesn’t want to start granting access to outside parties, for reasons explained in the paragraph above. The other reason is more technological: the data receiving equipment used by most speedways (none of which are bigger than the IMS), and by all the teams and their crews, is a 9600 bps serial connection, sent over a 600 MHz radio frequency. All that’s sent over this connection is the current scoring data – and for “freeze-the-field,” that means the assessed running order at the time of caution. But it’s literally broadcast in little regular pulses through transmitters, and picked up by fairly expensive radio sets – about $2,000 per unit.
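
To see why 9600 bps is so constraining, consider the time it takes to broadcast one full-field scoring update. The frame size per car below is entirely our assumption; only the line speed and the 43-car field come from the sport itself:

```python
# Rough sketch of the legacy serial feed's constraint: how long one
# full-field scoring update takes at 9600 bps. Frame size is assumed.

SERIAL_BPS = 9600
BITS_PER_BYTE = 10        # 8 data bits plus start/stop bits on a serial line
CARS = 43                 # a full NASCAR field
BYTES_PER_CAR = 8         # assumed: car number, position, status

def broadcast_seconds(cars: int = CARS, bytes_per_car: int = BYTES_PER_CAR) -> float:
    """Seconds needed to push one complete field update down the serial link."""
    return cars * bytes_per_car * BITS_PER_BYTE / SERIAL_BPS

print(f"{broadcast_seconds():.3f} s per full-field update")
```

Even with this modest assumed frame, a full update occupies a sizable fraction of a second, which is why the feed is limited to the bare scoring data and broadcast in little regular pulses.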

James Garr shows where the ‘freeze-the-field’ results show up.

“The reason it’s like that,” explained Garr, “is because we have to feed scoreboards, we have to feed stats [statistics], a lot of different people…and they’ve got software written that’s based on serial connections to us. We can’t just upgrade and leave them in the dirt.”

Over time, Worling said, it would actually save teams money if the data could be sent faster over network cable. So that’s part of NASCAR’s next upgrade: for next season, it’s considering partnering with the crews who lay down fiberoptic cable for the digital TV camera connections, so it can use about 60 K or so of that bandwidth to send scoring data to teams through everyday network cards. This way, teams would have an interactive client/server connection with the scoring system, which could feed data asynchronously through the loop.

One group that would immediately see the benefit of this upgrade is the TV statistics providers, whose job is to make the commentators they serve seem smarter than they sometimes are. “The next world we’re moving into is where we bring the stats group in house,” said Worling. “We talk about stats – it’s really a group of guys who are providing information to the TV guys in the broadcast booth, and the media center in the press box. Right now, that’s all done kinda serially. We’re going to switch that.” Rather than outsource its statistics service, NASCAR will compile its statistics in-house, and provide them to TV networks along with the scoring data they already receive.

As the MTC truck is currently staffed, it needs only two scoring officials to monitor data and to interact with officials in the scoring tower – at IMS, in the pagoda. At first glance, it may look like these officials are running software on a bunch of laptops; in truth, they’re interacting with two workstations that display data being crunched on the HP servers. If you’ve ever run application software on Windows over a long period of time, you know that long uptimes seem to throw off its ability to keep real time – especially if a screen saver happens to kick in. The way the workstations interact with the TimeGear software, however, the servers crunch all the data, unimpeded by the activity going on within the workstations’ processors. Those processors, meanwhile, manage the SQL Server and proprietary databases, which are not time-intensive processes.

The two banks of wall screens are managed by Nvidia mezzanine cards plugged into the workstations. Each graphics card manages the view for three displays, and could handle more if necessary. Meanwhile, monitoring and other software actually does run on the laptops, unimpeded by the process sequences on the workstations and servers. So even the division of software labor has been designed for maximum efficiency.

“The workstation environment has really been awesome, that these guys [from HP and AMD] have been able to deliver,” said Worling. He was most impressed with the fact that the Quadro Graphics software used on the Nvidia cards utilizes the same screen rendering drivers used by NASA for the Mars Rover mission. There’s something else NASCAR has in common now with the space program.

The big freeze at the Brickyard

I have never been at the Indianapolis Motor Speedway for the final lap of a race – and I’ve been there for 16 – where the crowd wasn’t on its feet. For all of them, the final lap of a close race is a moment frozen in time.

The Joe Gibbs Racing crew scrambles to service J.J. Yeley during his first pit stop.

It is lap 159 of the 2006 Allstate 400 at the Brickyard. Jimmie Johnson (Lowe’s Chevrolet #48) has seized the lead for a third time, and this time isn’t letting it go. Kasey Kahne (Dodge #9), in the lead at a few points near the end, has seen pole-sitter and long-time leader Jeff Burton (Cingular Chevrolet #31) drop way back, and has his eyes on Kevin Harvick (Reese’s Chevrolet #29), who has been trading first and second position with Johnson throughout the last 30 laps.

The leader sees the white flag. It is lap 160. Kahne, eager to race for that last line, gives it all he’s got. But his car gets “loose,” as these cars can do right at the end, and strikes the SAFER barrier head-first. Trying to avoid an incident, Robby Gordon (Johns Manville Chevrolet #7) taps Greg Biffle (National Guard Ford #16). No one is hurt.

The crowd knew what this meant. Over the public address system, the voice of Dave Calabro verified the news everyone had expected. The field was frozen. The order of finish was known, and the race was still going. Jimmie Johnson had an early victory lap.

There were no controversies. No complaining drivers. No protests filed. Nobody whining over lost traditions. The race was made safer, and lives were saved, never mind the sacrifice of a silly showdown.

For the NASCAR Mobile Technology Center team, it was mission accomplished.

Jimmie Johnson surges forward one more time during the final laps of the 2006 Allstate 400 at the Brickyard.
