

Dealing with a universe of data

U. CHICAGO—Modeling the evolution of the universe is no mean feat, not only because of the complex mathematics involved, but also because of the sheer amount of data that is generated from a working model of—well, the universe.

A team of scientists at the Department of Energy’s Argonne National Laboratory is working to develop software to manage the mountains of data and allow for real-time interactions.

“Finding the resources and software capable of rendering volumes of data at such large scales can be a challenge,” says Mark Hereld, visualization and analysis lead for Argonne’s Leadership Computing Facility (ALCF).

The facility is home to Eureka, one of the world’s largest graphics supercomputers, which features 200 high-end graphics processing units. Eureka enables software such as vl3—a volume rendering toolkit developed at Argonne and the University of Chicago—which leverages graphics hardware to visualize very large data sets in real time.

Networking advances make it feasible to move large amounts of data from the location where it was computed to specialized visualization resources where it can be rendered into images. However, the scientists who need to analyze this data often live and work far from both supercomputing and rendering clusters. It is vital that the renderings be brought to the scientist.

One such simulation follows the growth of density perturbations in both gas and dark matter in a volume 1 billion light-years on a side, beginning shortly after the Big Bang and evolving to half the present age of the universe. It models the gravitational clumping of intergalactic gas and dark matter using a computational grid of 64 billion cells and 64 billion dark matter particles. The simulation took more than 4 million CPU hours to complete.
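A back-of-envelope calculation gives a feel for the data volumes a run at this scale produces. The byte counts per cell and per particle below are illustrative assumptions (single-precision floats, one field per cell, six phase-space coordinates per particle), not figures from the actual simulation code:

```python
# Rough, illustrative estimate of per-snapshot data volume for a run
# with 64 billion grid cells and 64 billion dark matter particles.
# Byte counts per cell/particle are assumptions, not actual figures.

CELLS = 64e9
PARTICLES = 64e9
BYTES_PER_CELL = 4        # one single-precision field per cell (assumed)
BYTES_PER_PARTICLE = 24   # 3 positions + 3 velocities, 4 bytes each (assumed)

grid_tb = CELLS * BYTES_PER_CELL / 1e12
particle_tb = PARTICLES * BYTES_PER_PARTICLE / 1e12

print(f"grid data per snapshot:     ~{grid_tb:.2f} TB")      # ~0.26 TB
print(f"particle data per snapshot: ~{particle_tb:.2f} TB")  # ~1.54 TB
```

Even under these minimal assumptions, a single snapshot runs to terabytes—and a simulation writes many snapshots—which is why moving, rendering, and delivering the data become problems in their own right.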

To see the subtle details in the data and make full use of the visualizations, high-quality images are also required. New vl3 enhancements allow researchers to stream high-resolution images created on graphics clusters to a remote cluster driving a tiled display.

To illustrate the importance—and feasibility—of visualization at a distance, scientists from several organizations, including institutions from both the National Science Foundation’s TeraGrid project and the Department of Energy, teamed up to create a live demonstration using vl3.

Images were streamed live from Eureka at 10 gigabits per second, over a high-speed network with bandwidth dedicated to moving large datasets worldwide, to an OptiPortal tiled display in the San Diego Supercomputer Center’s booth at the November 2009 Supercomputing Conference.
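A rough calculation shows why that much bandwidth is needed. The display resolution and frame rate below are assumptions chosen for illustration, not the demo's actual parameters:

```python
# Back-of-envelope bandwidth for streaming uncompressed frames to a
# tiled display. Resolution and frame rate are illustrative assumptions.

WIDTH, HEIGHT = 4096, 2304   # assumed tiled-display resolution (pixels)
BITS_PER_PIXEL = 24          # uncompressed 8-bit RGB
FPS = 30                     # assumed interactive frame rate

gbps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1e9
print(f"uncompressed stream: ~{gbps:.1f} Gb/s")  # ~6.8 Gb/s
```

At these assumed settings an uncompressed stream already approaches a 10 Gb/s link, which is why dedicated bandwidth (and, in practice, image compression) matters for interactive remote visualization.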

“As a team, we were able to link institutions across the country and leverage high-performance computing, visualization resources, high-speed networks and advanced displays in real time,” says Joe Insley, principal software developer at Argonne. “But what was really wonderful was seeing the scientists get excited about the possibilities that this will enable.”

The next step will be to add controls already present in the local version of the software to the wide-area version, giving scientists even more power to investigate their data.
