A team at the University of Colorado has performed one of the largest cosmological supercomputer simulations ever, modeling about 2.5% of the visible universe. It took a decade to prepare the code for the simulation.
The UC team ran the computer code for a total of about 500,000 processor hours at two supercomputing centers -- the San Diego Supercomputer Center and the National Center for Supercomputing Applications at the University of Illinois. The team generated about 60 terabytes of data during the calculations.
The model constructed in the computer corresponds to a region roughly 1.5 billion light-years across (one light-year is about six trillion miles).
The simulation itself takes into account virtually all of the known physical conditions of the universe, reaching back in time almost to the Big Bang. It models the motion of matter as it collapsed under gravity and became dense enough to form cosmic filaments, which contain much of the gaseous mass of the universe.
These filaments stretch for hundreds of millions of light-years.
I wonder how many CD-ROMs it would take to accommodate all of the data?
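Out of curiosity, here is a back-of-the-envelope sketch of that calculation, assuming a standard 700 MB CD-ROM and decimal (powers-of-ten) units for the 60 terabytes:

```python
import math

# Assumptions: 60 TB in decimal units, standard 700 MB CD-ROM capacity.
DATA_BYTES = 60 * 10**12
CD_BYTES = 700 * 10**6

discs = math.ceil(DATA_BYTES / CD_BYTES)
print(discs)  # roughly 85,715 discs
```

So on the order of 85,000 CD-ROMs -- quite a stack.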
Pictured is a portion of a supercomputer simulation of the universe. The bright object in the center is a galaxy cluster about a million billion (10^15) times the mass of the sun. In between the filaments, which hold most of the universe's mass, are giant, roughly spherical voids nearly empty of matter.