Top Supercomputer in the World: Should You Care?

June 19, 2012

Admittedly, I’m a bit of a contrarian when it comes to what I perceive as overexuberance about potential technologies and various scientific or technological achievements whose value is questionable at best. So, when a news headline pops up about which nation-state has the fastest supercomputer, I become immediately suspicious. The latest incarnation is the U.S.’s BlueGene/Q-based Sequoia supercomputer, which now sits at the apex of the Top500 list. But let’s skip the technical details (numerous articles cover all the critical details) and cut to the chase.

Dispense With the Nationalism

The most grating part of the fanfare about the top supercomputer is the focus on which nation holds the top spot. This isn’t an irrelevant fact, but it is invariably seized on by politicians as a point of collective pride. That pride is tantamount to a highly indebted neighbor, forever competing with the Joneses, coming home with a Ferrari and declaring himself winner of the neighborhood automobile competition. Modern nation-states—particularly those in the West, but followed closely by many others—are vast economic failures in their inability to maintain sustainability (i.e., they create ever-expanding debt loads). Not surprisingly, the top supercomputer was built under the auspices of the U.S. Department of Energy—an agency of the most profligate government ever.

What about the next few supercomputers on the list? At the number two spot in the latest rankings is the Japanese K supercomputer, funded by the RIKEN Advanced Institute for Computational Science—an “Independent Administrative Institution,” which in common parlance is a branch of the government of Japan (another nation stuck in a mire of debt and economic difficulties). In third place is another U.S. Department of Energy computer, Mira. Fourth is the German SuperMUC; with a little searching, one can find that it, too, is a government-funded machine (in a political union on the verge of explosion owing to the fiscal problems of a number of member states). Fifth is the Chinese Tianhe-1A, which once held the top spot and is funded by the National Supercomputing Center in Tianjin. Sixth is yet another U.S. Department of Energy computer, Jaguar.

The ultimate point is that although the top supercomputers may represent, in some sense, a technological frontier, they don’t represent sustainable or even necessarily practical computing scale. If you had a credit card with no limit, you could build a supercomputer that beats the competition in the Top500 list—but that’s unrealistic. By the same token, governments that pursue these types of pet projects (always using someone else’s money) haven’t really done anything all that impressive.

A Real Accomplishment in Supercomputing

If you buy a Ferrari in cash, you’re rich and driving in style; if you buy one on credit (or otherwise lack the means to afford it), you’re probably just an idiot. In the realm of supercomputing, the more commendable accomplishment is not maximum petaflops per se but maximum petaflops within the budget of a private (i.e., sustainable) organization—whether that organization relies on commerce or donations. No, this may not be the absolute technological limit of supercomputing, but neither are the machines that governments build. Add a few more dollars, and you can probably build something a little faster. Thus, economics always places a limit on development.

In fact, using the Top500 sublist generator, you must move well into the twenties before you find a supercomputer that might be considered to have been built by a private organization (and even then, you can’t necessarily dismiss the possibility of the supercomputer receiving at least partial government funding—particularly if the organization is an educational institution). In other words, the vast majority of supercomputers are built by governments. What are the espoused purposes of all this computing power? In the case of Sequoia, the new top machine, CBC News (“U.S. supercomputer tops list of world’s fastest machines”) reports, “The newly assembled Sequoia will be used to conduct simulations intended to extend the life of America’s aging nuclear weapons arsenal, in lieu of underground nuclear testing.” This, presumably, is to ensure that the U.S. nuclear arsenal is able to destroy the entire world a minimum of two or three times, lest other nations lose respect for the superpower.

On the bright side, NetworkWorld (“U.S. regains supercomputing crown, bests China, Japan”) cites University of Tennessee computer science professor Jack Dongarra as claiming that “more than half of the machines on this list aren’t deployed in research, academic settings or by government. ‘More than half are used by industry,’ he said.” According to CBC News, industries using supercomputers include “credit card companies, the gaming industry, internet services, financial services, telecommunications, large internet retail companies and aerospace manufacturers like Lockheed Martin.” Even in this list, you can quickly spy organizations on the government dole: Lockheed Martin, obviously, but many financial companies as well (does TARP ring any bells?).

The Supercomputer: Is It Really a Grand Feat?

So, maybe the list of top supercomputers in the world is less than meaningful, owing to the flood of government money. From a technical perspective, however, what is the real import of a supercomputer as opposed to, say, a desktop computer? Supercomputers don’t employ processors that are all that different from what most other computers use. In other words, Moore’s Law is not the driving force behind supercomputers relative to other kinds of computing. The driving forces are the networking and the programming model. NetworkWorld notes, “This system, named Sequoia, has more than 1.57 million compute cores and relies on architecture and parallelism, and not Moore’s Law, to achieve its speeds.” Thus, the focus of supercomputing is not on revolutionary processors (à la the Terminator movies)—instead, it’s on how “normal” processors are interconnected and how they are used to accomplish a programming task. Employing parallelism is more complicated than pursuing the traditional serial approach to computing.

So, supercomputers may not be all that impressive—if you just look at the processors that constitute them. Their interest lies in the way these processors are connected to one another and how they are used to accomplish tasks.
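To make the contrast concrete, here is a toy sketch in Python—not anything Sequoia actually runs, and the function names are my own invention—of the same summation done serially and then split across worker processes. Even at this tiny scale, the parallel version shows the extra bookkeeping (partitioning the work, distributing it, combining the results) that the serial version never needs:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    # One worker's share of the job: sum the integers in [start, stop).
    start, stop = bounds
    return sum(range(start, stop))

def parallel_sum(n, workers=4):
    # Split 0..n-1 into contiguous chunks, farm the chunks out to
    # worker processes, then combine the partial results.
    chunk = n // workers
    bounds = [(i * chunk, (i + 1) * chunk if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, bounds))

if __name__ == "__main__":
    n = 1000000
    # The serial and parallel versions agree; the parallel one simply
    # divides the same work across processes.
    assert parallel_sum(n) == sum(range(n))
```

A real supercomputer does the same kind of decomposition with message passing (e.g., MPI) across thousands of nodes over a high-speed interconnect, where the cost of moving data between processors—not the processors themselves—becomes the central engineering problem.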


News of who has the top supercomputer—and how fast that supercomputer is—is really quite boring. It’s just another government spending money it doesn’t have to create a neat toy for its employees to play with. (Yes, you’ll hear phrases like “climate change research,” “medical innovation” and so forth to give it all a noble ring.) What is really of interest is the fastest supercomputer built by a voluntarily funded, sustainable organization, whether a company or another institution. That is the real frontier of supercomputing, as it obeys not just the basic rules of science but the basic rules of economics as well. Computers like Sequoia, while interesting from a purely technical standpoint, are the result of unsustainable fantasy economic policies that have all but ruined Western nations. On the technical side, supercomputing is less about the characteristics of processors (more of a focus for small-scale computing) and more about interconnections and parallelism.

Photo courtesy of NASA Goddard Space Flight Center

About Jeff Clark

Jeff Clark is editor for the Data Center Journal. He holds a bachelor’s degree in physics from the University of Richmond, as well as master’s and doctorate degrees in electrical engineering from Virginia Tech. An author and aspiring renaissance man, his interests range from quantum mechanics and processor technology to drawing and philosophy.

