Computing the Universe

Jeremiah P. Ostriker

Professor, Astrophysics, Princeton University

The study of cosmology, that is, the origin, nature, and future evolution of structure in the universe, has been totally transformed in the last decade, and computers have played a major role in that change.  New theories have arisen which turn the subject, formerly almost a branch of philosophy, into a quantitative science.  A standard, precisely specifiable model has emerged, labeled "Concordance Cosmology."  Initial tests of this model, using either data on galaxy distributions in the local universe or the cosmic background radiation fluctuations reaching us from the distant universe, indicate rough agreement with the simplest predictions of the theories.  Now that fully three-dimensional, time-dependent numerical simulations can be run on modern, parallel-architecture computers, we can examine, with good physical modelling, the detailed quantitative predictions of this theory and its variants to see which, if any, produce output consistent with the real world being revealed to us by the latest ground- and space-based survey instruments.

A decade ago, simulations could address 32^3 ≈ 10^4.5 independent volume elements.  Now 1024^3 ≈ 10^9 is the state of the art for hydro computations.  Increasingly, unstructured, adaptive, or moving mesh techniques are being used to improve the resolution in the highest-density regions.  In purely dark matter (gravitation only) calculations, the ratio of simulation volume to resolution element has reached (10^5)^3 = 10^15.  This has enabled detailed computations of observable phenomena, from gravitational lensing to X-ray clusters, to be made and compared with observations.
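To make the quoted scaling explicit, the short Python sketch below (not part of the original abstract; the function name and printed format are illustrative only) reproduces the element counts for the grid sizes mentioned above and the dynamic range cited for gravity-only calculations.

    import math

    def volume_elements(n_per_side: int) -> int:
        """Number of independent volume elements in a cubic grid of n_per_side^3 cells."""
        return n_per_side ** 3

    # Hydro grids: a decade ago (32^3) vs. the current state of the art (1024^3).
    for n in (32, 1024):
        cells = volume_elements(n)
        print(f"{n}^3 grid -> {cells:.3e} elements (about 10^{math.log10(cells):.1f})")

    # Gravity-only (dark matter) calculations: a dynamic range of ~10^5 per
    # dimension gives (10^5)^3 = 10^15 volume elements per resolution element in 3D.
    dynamic_range_per_dim = 1e5
    print(f"(10^5)^3 = {dynamic_range_per_dim ** 3:.0e}")

Running it confirms the figures in the text: 32^3 = 32,768 (roughly 10^4.5), 1024^3 is just over 10^9, and a per-dimension dynamic range of 10^5 yields 10^15 elements in three dimensions.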
