Durham University: Simulating the Universe

The Durham University EAGLE project is an amazing computer simulation that models the formation of galaxies and black holes.

If you could ride a spaceship to watch the universe grow after the Big Bang, it might look something like Durham University’s EAGLE simulation. You’d zoom through veins of cosmic dust strung across the cold, dark expanse of space and watch splashes of bright pink gas cool to become spinning spiral galaxies. The further you travel, the more galaxies you’d see: thousands of them in a seemingly endless virtual universe.

The EAGLE (Evolution and Assembly of GaLaxies and their Environments) project is extraordinary in its scope. It’s led by Richard Bower, Professor of Cosmology at Durham University’s Institute for Computational Cosmology.

The Problem
Why is the universe the way it is? What is it made of? You can’t test theories about the cosmos in a laboratory, so you need a virtual approach to tackle the biggest cosmological questions.

What’s Needed
Durham University’s EAGLE project can model galaxies in a cosmological volume measuring 100 megaparsecs on a side (over 300 million light years across). But even that is not big enough.

The Solution
Using newer, high-performance hardware, including Intel® Xeon® Scalable processors, Durham has upgraded its supercomputing cluster to simulate the universe in greater detail.

“My role...” says Bower with a smile. “I like to think of it as making universes. But the aim of our EAGLE project is to create calculations that generate artificial models that can be compared to what astronomers see in telescopes. It is helping us to understand how the real universe and galaxies like our Milky Way formed.”

It’s a bold undertaking, and modelling the universe is obviously easier said than done. As Hitchhiker’s Guide to the Galaxy author Douglas Adams famously put it: “Space is big. Really big. You just won’t believe how vastly, hugely, mind-bogglingly big it is.” That scale poses a challenge for any attempt to simulate the universe in real detail.

“How big is our universe? We don't know,” admits Professor Bower. “But this is because we can only see so far. Even with the biggest and most technologically advanced telescopes, we can only see back about 14 billion years and that's simply because the universe has a finite age. It had a beginning, the Big Bang, and so light that was emitted in the Big Bang can only travel as far as it can travel in 14 billion years. As far as we know, the universe may be much bigger.”

Bower and his team at Durham University would like to be able to run a single simulation that includes this entire observable universe. But in technology terms it’s just not practical. The original EAGLE simulation (first run in 2015) is one of the largest cosmological hydrodynamical simulations ever created, using nearly 7 billion particles to model its intergalactic physics. It took 50 days of computer time on 4,000 compute cores to model a cosmological volume measuring 100 megaparsecs on a side (over 300 million light years across).
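
As a rough sanity check on those figures, the short sketch below (a back-of-the-envelope calculation in Python, using only the numbers quoted in this article plus the standard parsec-to-light-year conversion) shows how a 100-megaparsec box comes out at just over 300 million light years, and what 50 days on 4,000 cores amounts to as a compute budget.

```python
# Back-of-the-envelope check on the 2015 EAGLE run figures quoted above.
# Only inputs: numbers from the article and the standard conversion
# 1 parsec ~= 3.2616 light years.

LY_PER_PARSEC = 3.2616       # light years per parsec
box_side_mpc = 100           # simulation box side in megaparsecs (from the article)
cores = 4_000                # compute cores used (from the article)
days = 50                    # days of computer time (from the article)

box_side_mly = box_side_mpc * LY_PER_PARSEC   # megaparsecs -> millions of light years
core_hours = cores * days * 24                # total compute budget in core-hours

print(f"Box side: ~{box_side_mly:.0f} million light years")          # ~326, i.e. "over 300 million"
print(f"Compute budget: ~{core_hours / 1e6:.1f} million core-hours")  # ~4.8 million
```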

“When we started the EAGLE project, people said it would be technically impossible. They've had to eat their hats.”—Richard Bower, Professor of Cosmology at Durham University’s Institute for Computational Cosmology

Even that is a tiny fraction of a real universe whose light has been travelling towards us for nearly 14 billion years, but it’s still a volume large enough to contain 10,000 galaxies. Using observed properties of real galaxies (e.g., size, mass, color) and applying the laws of physics, EAGLE can produce an artificial slice of universe that’s statistically representative of the larger whole. This can then be used to explore some of the biggest cosmological questions, such as: “Why do galaxies look the way they do?” and “What is the universe made of?”
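
To picture what “statistically representative” means in practice, here is a minimal, purely illustrative sketch: it compares the distribution of a single galaxy property between a mock simulated sample and a mock observed sample using a two-sample Kolmogorov–Smirnov statistic. The data, the property and the choice of test are assumptions made for illustration, not the EAGLE team’s actual calibration pipeline.

```python
# Purely illustrative: compare the distribution of one galaxy property (say,
# log10 stellar mass) in a mock "simulated" sample against a mock "observed"
# sample. The numbers are made up; only the idea of the comparison matters.
import bisect
import random

random.seed(1)
simulated = sorted(random.gauss(10.0, 0.5) for _ in range(10_000))  # mock simulated galaxies
observed = sorted(random.gauss(10.0, 0.5) for _ in range(1_000))    # mock observed galaxies

def ks_statistic(a_sorted, b_sorted):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between
    the two empirical cumulative distribution functions."""
    gap = 0.0
    for x in a_sorted + b_sorted:
        f_a = bisect.bisect_right(a_sorted, x) / len(a_sorted)
        f_b = bisect.bisect_right(b_sorted, x) / len(b_sorted)
        gap = max(gap, abs(f_a - f_b))
    return gap

# A small statistic means the simulated slice reproduces the observed
# distribution of this property; a large one would flag a missing ingredient.
print(f"KS statistic: {ks_statistic(simulated, observed):.3f}")
```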

The EAGLE project is already changing how we understand the structure of the universe and is delivering fascinating new insights. “We set up the simulation just after the Big Bang,” explains Bower, “and we ran it forward in time to the present day to see whether our universe looks like the one we can see with telescopes. Initially we struggled to produce something that was very realistic. Eventually we realized that we were missing a key ingredient—black holes.”

Black holes seem to play a crucial role in shaping the universe we observe today. Galaxies come in all sorts of different types, as identified by American astronomer Edwin Hubble in 1926. His classification scheme (aka the “Hubble Tuning Fork”) features spiral galaxies at one end and elliptical galaxies at the other. Spiral galaxies have beautiful disks and feature an abundance of new, young stars. Elliptical galaxies, by contrast, are typically rugby-ball-shaped and their stars are much older.

“What we needed to make our simulated universe look like Hubble’s,” says Bower, “was to include black holes in the calculation. When galaxies get to about the size of the Milky Way, their black holes tend to be very small. But then the black holes begin growing very fast and that transforms the appearance of the galaxy and it stops forming new stars. The existing stars subsequently rearrange themselves into an elliptical shape. It’s one of the most remarkable things that we have discovered in the simulation and provides such a good description of what we see in the real universe.”

Durham University has recently upgraded its supercomputing system to run the next iteration of the EAGLE simulation. With rewritten code and a new hardware solution based on the latest Intel® Xeon® Scalable technology, Bower and his team will be able to simulate a patch of universe that’s 30 times bigger than the 2015 simulation and, crucially, features more data points. More data points mean more detail and a greater chance of encountering rare celestial objects, such as quasars.

“It’s essential that the hardware is powerful and reliable,” says Bower. “Because we need to bundle lots of physics together to model the way we believe the real universe works—how galaxies form, how they move, the pull of gravity, the flow of time and the evolution of stars... It’s a fantastic resource for understanding things you see through telescopes, testing theory with observation. But it stresses our engineering in a very critical way.”

It’s why Durham needed a significant hardware upgrade. As Bower points out, compared to the 300 million light-year volume used in 2015, “in the next simulation, we will be able to model a volume of almost a billion light years, something like a tenth of the visible universe.”
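
Those two figures, the factor of 30 quoted above and the “almost a billion light years” here, hang together arithmetically. The sketch below (values rounded from the article, and using the article’s own 14-billion-light-year light-travel framing) shows that roughly tripling the box side gives about 30 times the volume, and that a billion light years lands at roughly 7% of that distance, in the ballpark of the “tenth” Bower describes.

```python
# Rough consistency check of the upgrade figures, using the article's own
# light-travel-distance framing (~14 billion light years).
old_side_bly = 0.326    # 2015 box side: 100 megaparsecs ~= 0.326 billion light years
new_side_bly = 1.0      # "almost a billion light years" (assumed exactly 1 here)
lookback_bly = 14.0     # light-travel distance quoted earlier in the article

volume_ratio = (new_side_bly / old_side_bly) ** 3
print(f"Volume vs. the 2015 run: ~{volume_ratio:.0f}x bigger")                # ~29x ("30 times bigger")
print(f"Side vs. light-travel distance: ~{new_side_bly / lookback_bly:.0%}")  # ~7% ("something like a tenth")
```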

In a corner of the Ogden Centre for Fundamental Physics, where Bower and his team are based, there’s a Universe Creator. It looks like a 1980s arcade cabinet, with a glass pyramid in the center that displays a floating view of a virtual universe. Turn a dial and you can increase or decrease the amount of dark matter. Push a lever and you can adjust the “black hole power.” It’s a simplified demonstration of the EAGLE output, but no less impressive for it.

“When we started this project,” remembers Professor Bower, “people said it would be technically impossible. They said that trying to do these calculations on such a big scale wouldn’t work and the galaxies won’t look like the galaxies in the real universe. But we’ve worked hard to make sure that’s not true and to get a good balance between the amount of processing power versus bandwidth. We found that Intel® hardware delivers that.

“It’s amazing how it’s worked [so far]. What I've enjoyed most is making up a movie sequence where you start at the beginning of the universe and you fly in your spaceship, the universe forming around you, the gas clouds condensing and forming into stars. And you fly along and then, right there at the end of the movie, is our own Milky Way galaxy.”

Explore Related Products and Solutions

Intel® Xeon® Scalable Processors

Drive actionable insight, count on hardware-based security, and deploy dynamic service delivery with Intel® Xeon® Scalable processors.

Learn more

Intel® Optane™ Persistent Memory

Extract more actionable insights from data, from cloud and databases to in-memory analytics and content delivery networks.

Learn more

Notices and Disclaimers

Intel® technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Performance varies depending on system configuration. No computer system can be absolutely secure. Check with your system manufacturer or retailer or learn more at https://www.intel.com.

Software and workloads used in performance tests may have been optimized for performance only on Intel® microprocessors. Performance tests, such as SYSmark* and MobileMark*, are measured using specific computer systems, components, software, operations and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products. For more complete information visit https://www.intel.com/benchmarks.

Performance results are based on testing as of the date set forth in the configurations and may not reflect all publicly available security updates. See configuration disclosure for details. No product or component can be absolutely secure.

Cost reduction scenarios described are intended as examples of how a given Intel®-based product, in the specified circumstances and configurations, may affect future costs and provide cost savings. Circumstances will vary. Intel does not guarantee any costs or cost reduction.

Intel does not control or audit third-party benchmark data or the web sites referenced in this document. You should visit the referenced web site and confirm whether referenced data are accurate.

In some test cases, results have been estimated or simulated using internal Intel analysis or architecture simulation or modeling, and provided to you for informational purposes. Any differences in your system hardware, software or configuration may affect your actual performance.