Biggest Problem for Exascale Computing: Power
Computers that can deliver an exaflop of performance (a billion billion calculations per second) aren’t the stuff of science fiction. The supercomputer industry wants to hit that mark by 2017 (see Supercomputers and the Search for the Exascale Grail, GigaOM Pro, subscription required). For comparison, one of the world’s fastest supercomputers today, Jaguar, runs at 2.3 petaflops, or 2.3 million billion calculations per second, while your laptop likely processes around 2 billion calculations a second.
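To put those figures side by side, here is a quick back-of-the-envelope comparison; the laptop number is the article's rough estimate, not a benchmark:

```python
# Performance figures from the article, in calculations per second.
exaflop = 1e18   # exascale target: a billion billion (10^18) calc/s
jaguar = 2.3e15  # Jaguar: 2.3 petaflops
laptop = 2e9     # typical laptop: ~2 billion calc/s (rough estimate)

print(f"Exascale vs. Jaguar: {exaflop / jaguar:,.0f}x faster")
print(f"Exascale vs. laptop: {exaflop / laptop:,.0f}x faster")
```

In other words, an exascale machine would be roughly 400 times faster than Jaguar and hundreds of millions of times faster than a laptop.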
The biggest hurdle for exascale computing continues to be power. A project manager at the Belgian research institute IMEC is quoted as saying, “Energy is number one. Right now we need 7,000 MW for exascale performance.” That’s 7 GW of power for a single exascale-capable computer; other experts put the figure closer to 1 GW.
As Stacey points out in her report, supercomputers are designed to run at optimal speeds: as with a race car, anything superfluous is stripped out for the sake of performance. That means the energy-use concerns that have driven the construction of greener data centers have been mostly absent from the supercomputing race. The IMEC project manager says the goal is to cut that 7 GW down to 50 MW, clearly a massive drop.
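The scale of that goal, and of the energy bill it avoids, is easy to work out. The electricity price below is an illustrative assumption, not a figure from the article:

```python
# Rough arithmetic on the power figures above.
current_w = 7e9   # projected power draw for exascale today: 7 GW
target_w = 50e6   # IMEC's stated goal: 50 MW

print(f"Required efficiency gain: {current_w / target_w:.0f}x")  # 140x

# Annual energy at 7 GW, and a rough cost at an assumed $0.05/kWh
# (the price is a hypothetical placeholder for illustration).
hours_per_year = 24 * 365
kwh_per_year = current_w / 1000 * hours_per_year
print(f"Energy at 7 GW: {kwh_per_year:.2e} kWh/year")
print(f"Cost at $0.05/kWh: ${kwh_per_year * 0.05:,.0f}/year")
```

At that assumed rate, a 7 GW machine would cost on the order of billions of dollars a year in electricity alone, which is why the 140-fold reduction is the field's central engineering target.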
IBM VP of Deep Computing David Turek told Stacey that supercomputers built at the exascale level will need a rethinking of their design, because simply making them bigger multiplies the inefficiencies. One way IBM is looking to solve this problem is by matching more efficient chips to different workloads, including graphics processors from Nvidia, IBM’s own Cell processor, or even the ARM-based processors found inside cell phones.
More powerful supercomputers could be key to helping solve the world’s pressing problems like climate change. James Hack, who heads up climate research at ORNL and directs the National Center for Computational Sciences, told me once that he thinks more powerful supercomputers for climate research will “improve the fidelity of global climate modeling simulations.”
There’s also a recent effort underway to share computing power for climate research. Both the Department of Energy and Google recently announced compute-sharing projects focused on climate change data. The DOE is donating time on two of the world’s supercomputers for dozens of projects focused on energy innovation. At the same time, Google is donating parallel processing power to help groups in developing countries build environmental maps based on its new Google Earth Engine tool.
For more research on supercomputing check out GigaOM Pro (subscription required):
- Supercomputers and the Search for the Exascale Grail
- The Real Reason Google Is Buying Wind Power
- Green Data Center Design Strategies
Image courtesy of Cray.