UT has received a $10 million grant from the National Science Foundation to develop a computer capable of processing the vast amounts of data generated by the foundation’s current network of computers around the U.S.
The new supercomputer, to be called Nautilus, will be housed in UT’s Joint Institute for Computational Sciences, which is located on the Oak Ridge National Laboratory campus.
Nautilus will have 1,024 processor cores and four terabytes of shared memory, along with file storage for one petabyte of data. This configuration will rank Nautilus among the largest shared-memory computers worldwide.
The new supercomputer will share its home with Kraken, the world’s fastest academic supercomputer. However, while Kraken is devoted to running simulation applications, Nautilus will analyze massive amounts of scientific data.
Working for both UT-Knoxville’s College of Engineering and ORNL, computer scientist Sean Ahern has been charged with the task of crafting and directing the Center for Remote Data Analysis and Visualization. This center will be able to amass and analyze enormous quantities of data gathered through use of scientific sensors or other intricate computer simulations.
The center will achieve its aims through the use of Nautilus, which will be valuable for its ability to organize and interpret the information that other supercomputers generate. Instead of remaining an incomprehensible mass of complex data, the output of those machines can be expressed as more accessible, applicable results.
Working along with Ahern, Jian Huang, associate professor in UT’s Department of Electrical Engineering and Computer Science, will serve as a co-investigator on the development project.
The NSF Cyberinfrastructure Program awarded the university the commission for the Nautilus project.
“We submitted a proposal which was the highest ranked among all proposals submitted in the entire U.S.,” Huang said.
NSF was able to provide the funding for this program through the American Recovery and Reinvestment Act of 2009, also known as the economic stimulus package.
The new supercomputer will function as “a hub of expertise and computing resources for the research arm of UT, as well as a world-class platform for use by the undergraduate education arm of UT,” Huang said.
The Nautilus hardware, along with basic software features, is expected to be operational in the summer of 2010. However, the software will continue to be developed throughout the four-year duration of the NSF’s funding.
UT scientists conducting research that will benefit from the capabilities of Nautilus are looking forward to the supercomputer’s launch.
“In my opinion, Nautilus holds a tremendous potential to impact the UT community, in which many researchers rely heavily on cutting-edge digital resources,” Xueping Li, director of the Intelligent Information Engineering and Systems Laboratory, said. “This unprecedented visualization and data analysis facility will lower the barriers for many academic members across a broad array of research areas to exploit cyber-infrastructure. As for myself, I cannot wait to apply Nautilus to my research in large-scale systems modeling, simulation, visualization and optimization.”