IBM to build massive 20 Petaflop supercomputer for NNSA
Chicago (IL) – What happens in the supercomputer world may seem a distant topic for most of us, yet it affects all of us through the new opportunities it opens up for scientists and, hopefully, accelerated research results. Recent advances in supercomputing are simply breathtaking, and the computing power the next generation of systems will deliver is awe-inspiring.
The most recent TOP500 list of supercomputers, published in November 2008, included the first Petaflop systems - machines capable of performing more than one quadrillion floating point operations per second. Now we hear that the Department of Energy’s National Nuclear Security Administration (NNSA) has ordered a supercomputer that will be able to deliver 20 times that performance by 2012.
The new supercomputer, called Sequoia, will be installed at Lawrence Livermore National Laboratory (LLNL) in stages and is scheduled to reach its full processing power in three years. The first portion, called Dawn, is currently being installed and is expected to deliver about 500 TFlops during Q1 2009. While Dawn is based on currently available processors, Sequoia will be built on future IBM BlueGene technology.
When completed, the supercomputer will have 1.6 Petabytes of memory, 96 racks, 98,304 compute nodes, and 1.6 million cores. IBM promises that Sequoia will be 160 times more power efficient than ASC Purple and 17 times more power efficient than BlueGene/L - both earlier supercomputers installed at LLNL.
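Those headline figures are internally consistent, as a quick back-of-the-envelope check shows. The totals below come from the article; the exact core count of 1,572,864 (which rounds to the quoted "1.6 million") and the per-node figures are our own assumptions and derivations, not IBM-published numbers:

```python
# Rough arithmetic on the quoted Sequoia specs (article figures).
nodes = 98_304
cores = 1_572_864        # assumption: the exact count behind "1.6 million cores"
memory_pb = 1.6          # total memory in petabytes

cores_per_node = cores // nodes                  # derived, not an IBM figure
mem_per_node_gb = memory_pb * 1e6 / nodes        # 1 PB = 1e6 GB (decimal units)

print(cores_per_node)              # 16 cores per compute node
print(round(mem_per_node_gb, 1))   # 16.3 GB of memory per node
```

The division comes out even at 16 cores per node, which is what makes the assumed exact core count plausible.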
BlueGene/L is currently listed as the world’s fourth fastest supercomputer, with a peak performance of 596 TFlops. The system was upgraded in 2007 and 2008 to 106,496 PowerPC 440 (700 MHz) processors with 212,992 cores.
IBM compares Sequoia’s performance to the world’s population (6.7 billion people) working on a calculation with hand calculators 24 hours a day, 365 days a year: it would take all of us 320 years to achieve what Sequoia can do in just one hour.
In theory, the projected 20 PFlops will offer a 50x improvement in earthquake prediction and a 40x improvement in weather prediction - allowing forecasters to model local weather events at a resolution of 100 meters to one kilometer, a significant improvement over the current ten-kilometer resolution.
As far as earthquakes are concerned, the possibilities such a system offers are fascinating: We recently ran a story on the work being done by the Mid-America Earthquake Center (MAE) at the National Center for Supercomputing Applications (NCSA) and the impact such research has on people living in earthquake-threatened areas - down to predicting which individual houses may be destroyed and which may be spared.
Imagine the possibilities of continued, affordable supercomputer growth. What a fascinating time we live in.