Agencies in race to smash supercomputing records
Efforts sponsored by NASA and the Energy Department have set new supercomputing records in the past two weeks.
The race is on again to build the world's fastest supercomputer, and in the past two weeks, two efforts led by federal agencies have beaten a record held for two years by Japan. The developments bode well for U.S. international competitiveness, experts said, but they cautioned against complacency.
NASA and Silicon Graphics Inc. late last month announced that the space agency's new Columbia supercomputer is the most powerful in the world. Built on Intel Itanium 2 processors, it reached a sustained performance of 42.7 teraflops, or 42.7 trillion calculations per second.
Then last week, Energy Secretary Spencer Abraham announced that the IBM-made BlueGene/L supercomputer, which was developed for nuclear-weapons stockpile stewardship, attained a performance of 70.72 teraflops.
Both computers eclipsed the two-year record of Japan's Earth Simulator, which was rated at 35.86 teraflops, as well as an earlier BlueGene/L result of 36.01 teraflops.
Cray, the historic supercomputing leader, also has returned in force with shipments of its XT3 supercomputer, which sells for about $2 million. The XT3 was designed for Sandia National Laboratories as part of its "Red Storm" system, which will perform at more than 40 peak teraflops. The Forest Service will use a Cray computer to predict wildfires.
The apparent resurgence of U.S. supercomputing, and its significance for the nation's scientific capabilities, has not been lost on scientists and government officials.
"High-performance computing is the backbone of the nation's science and technology enterprise, which is why the department has made supercomputing a top priority investment," Abraham said in a Thursday statement. "Breakthroughs in applied scientific research are possible with the tremendous processing capabilities" of computers such as BlueGene/L.
Bob Bishop, the CEO of SGI, said at the unveiling of the NASA supercomputer that "supercomputing stimulates innovation and helps companies compete in today's global marketplace." He added, "Beyond the results that we will see in science and engineering, I believe there will be a trickle-down effect throughout our economy of these innovations and discoveries." He predicted a boost to the U.S. economy and security, new jobs at home and inspiration for a new generation of scientists and engineers.
Bishop said his company developed its Columbia computer in 120 days, compared with the several years it took to develop BlueGene/L. He added that SGI has 2,600 employees and one factory, in Chippewa Falls, Wis., compared with IBM's 319,000 employees worldwide.
After IBM broke the record Thursday, he said, "The bottom line is [that] this is a two-horse race, and to quote IBM itself, 'No one is number one forever.'"
The National Academies' National Research Council, meanwhile, will release a new report on Monday outlining the federal government's needs for supercomputers.
After industry pressure, the White House Office of Science and Technology Policy has become involved in the effort, as it has increasingly recognized the need for better hardware and software. OSTP Director John Marburger said in July that "something is happening in the world of computing that is about to alter this."
"The hardware we take for granted is not capable of doing all that we should like to do, and we know today that much more power is potentially available to us," Marburger said at a Council on Competitiveness conference for high-performance computing users. "If that potential is realized, it will once again transform ways of doing business. In today's globally competitive economy, we cannot afford to leave this opportunity to others."
The council is sponsoring a panel discussion on the importance of advanced computing to U.S. competitiveness at the Supercomputing 2004 conference that began over the weekend and runs through this week in Pittsburgh. The event features top government, research and industry experts addressing the latest issues.
In July, the council produced a study on the use and impact of high-performance computing resources in industry. The study, compiled by the research firm IDC and sponsored by the Defense Advanced Research Projects Agency, found that high-performance computing is increasingly critical to companies' competitive survival. Some of the 33 executives surveyed said more powerful and easier-to-use computers could save them billions of dollars.
In June, Marburger charged the President's Information Technology Advisory Committee with assessing federal research in computational science, the application of high-end computing to problems such as weather and climate modeling. PITAC will hold a town-hall event at the supercomputing conference this week.
David Nelson, director of the National Coordination Office for Information Technology Research and Development, said after a PITAC meeting on Thursday that "supercomputing has tremendous promise, but it's up to us to realize that promise." He said he expects the initiative to go forward in the second Bush administration.
Daniel Reed, chair of the PITAC subcommittee on computational science, gave a progress report at the subcommittee's Thursday meeting. He said a key application for supercomputing is weather and climate modeling, which affects 40 percent of the $10 trillion U.S. economy. Currently, emergency authorities "overwarn" by a factor of three, or more than 200 miles, when severe natural events are predicted, Reed said, at an average extra cost of $200 million per month.
Reed said the subcommittee also has heard that there is a "disconnect" between commercial practice and the computing infrastructure needs of government and academia. In addition, he said, investment has been too short-term, and a decade-long "roadmap" of priorities is needed.
Other problems Reed cited include a limited number of senior leaders, inadequate interdisciplinary education, little interagency coordination, the need for better software, and the limited availability of government and academic computing resources to agencies and industry.
Some experts see the role for government in supercomputing as providing greater resources and coordination. In August, Marburger circulated a memorandum for agency heads that highlighted the need for more research and development. In it, he described a recent report of the High-End Computing Revitalization Task Force that calls for a coordinated R&D plan.
The White House's fiscal 2005 budget called for $2 billion for the Networking and Information Technology Research and Development program, an increase of 14 percent from 2001.
On Capitol Hill, House Science Committee Chairman Sherwood Boehlert, R-N.Y., supports supercomputing, which he said on Friday "has become an essential resource for U.S. industry and academia."
Boehlert predicted enactment of a high-end computing bill, H.R. 4516, before the end of the year. The bill, an amended version of which the Senate passed Oct. 10, would require the Energy Department to establish and operate one of the world's leading high-end computing facilities to conduct advanced scientific and engineering research and development.
The department also would be called on to develop advancements in high-end computing hardware and software. The bill would authorize $50 million for fiscal 2005, $55 million for fiscal 2006 and $60 million for fiscal 2007.
The House passed the bill July 7, and it languished in the Senate Commerce, Science and Transportation Committee until last month.