Is everything we thought we knew about the Universe wrong? A provocative new study is shaking the foundations of cosmology by challenging the very idea that the Universe's expansion is accelerating. It calls into question dark energy, cosmic distance measurements, and even the Nobel Prize-winning research that 'proved' accelerating expansion. What if that result was based on a flawed assumption? Let's dive in.
In the world of cosmology, the 'standard model,' also known as ΛCDM, reigns supreme. ΛCDM stands for Lambda Cold Dark Matter. The CDM part represents Cold Dark Matter, which scientists believe makes up the majority of matter in the Universe. (You can learn more about Cold Dark Matter here: https://briankoberlein.com/blog/cold-and-dark/). The Λ, or Lambda, symbolizes the cosmological constant. Einstein originally introduced this constant to keep the Universe static, but in the modern model it is what drives the ever-accelerating expansion of space-time. The cosmological constant has been part of cosmology for over a century. (See a historical perspective: https://briankoberlein.com/post/how-far-weve-come/) And with good reason! Nobel Prizes have been awarded for research supporting its validity. So, despite its strangeness (explore the weirdness here: https://briankoberlein.com/post/ever-expanding-into-nothing/), the cosmological constant is considered a very real phenomenon.
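To see how Λ plays that role mathematically, here's a minimal sketch of the flat-ΛCDM Friedmann equation in Python. The parameter values are illustrative assumptions (roughly Planck-like), not fits from any particular survey:

```python
import numpy as np

# A minimal sketch of the flat-LCDM Friedmann equation. The parameter
# values below are illustrative assumptions, not survey fits.
H0 = 70.0           # Hubble constant today, in km/s/Mpc (assumed)
OMEGA_M = 0.3       # fraction of energy density in matter (assumed)
OMEGA_LAMBDA = 0.7  # fraction in the cosmological constant (assumed)

def hubble(z):
    """Expansion rate H(z) at redshift z for a flat LCDM universe."""
    return H0 * np.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_LAMBDA)

# Matter dilutes as space expands, but Lambda does not, so at late
# times (low redshift) the constant dominates and drives acceleration.
for z in [2.0, 1.0, 0.5, 0.0]:
    print(f"z = {z}: H = {hubble(z):.1f} km/s/Mpc")
```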
But a recent paper published in Monthly Notices of the Royal Astronomical Society is throwing a wrench in the works. [^1] The paper boldly claims that the Nobel-winning study that 'proved' cosmic expansion is accelerating is, in fact, wrong. The very concept of dark energy and cosmic expansion as an inherent property of space-time? Wrong, according to this study. And the cosmic distance ladder, the tool we use to measure distances to galaxies? You guessed it – also wrong!
"Wait, what?" you might be asking. It's a fair question! And this is the part most people miss... this challenge to the standard model didn't appear out of nowhere. As early as 2015, some evidence suggested potential biases in our supernovae distance measurements. (Read about the early hints: https://briankoberlein.com/blog/saying-ive-got-a-chance/). Furthermore, issues with the standard model, like the Hubble tension, (explore the Hubble tension: https://briankoberlein.com/blog/tension-and-hope/) have led astronomers to at least consider alternative explanations. But this new study isn't just a tentative suggestion; its implications are far-reaching, making it worth examining in detail.
The unique brightness curve of a Type Ia supernova. Credit: Wikipedia
The study focuses on Type Ia supernovae. (For an explanation of different supernova types, see: https://briankoberlein.com/blog/just-my-type/) These supernovae occur when a white dwarf in a binary system detonates, and they can be identified by silicon absorption lines in their spectra. Their light curves – the patterns describing how their brightness rises and fades over time – are primarily determined by the radioactive decay of Nickel-56 into Cobalt-56, and then into Iron-56. Since radioactive decay rates are constant for a given isotope, the light curves of Type Ia supernovae should act as 'standard candles.' This means that wherever we observe them in the Universe, we can compare their observed brightness to their intrinsic brightness to determine their distance. It's like knowing how bright a lightbulb should be: if it appears dimmer, you know it's farther away.
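To make the idea concrete, here's a minimal sketch of that distance calculation via the distance modulus, m − M = 5 log₁₀(d / 10 pc). The intrinsic peak magnitude of about −19.3 is a commonly quoted fiducial value, assumed here rather than taken from the paper:

```python
# A minimal sketch of the standard-candle distance calculation, using
# the distance modulus m - M = 5 * log10(d / 10 pc). The intrinsic
# peak magnitude M ~ -19.3 is a commonly quoted fiducial value,
# assumed here rather than taken from the paper.
M_PEAK = -19.3

def distance_mpc(m_observed):
    """Distance in megaparsecs implied by an observed peak magnitude."""
    d_parsec = 10 ** ((m_observed - M_PEAK + 5) / 5)
    return d_parsec / 1e6

# A Type Ia supernova peaking at apparent magnitude 19.0 would lie at:
print(f"{distance_mpc(19.0):.0f} Mpc")  # roughly 457 Mpc
```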
A correlation between galactic age and supernova brightness. Credit: Son, et al
However, we've long known that this 'standard candle' method isn't perfectly precise. Even with its foundation in radioactive decay, there are statistical variations in the relationship between peak brightness and the width of the light curve. But this new study reveals a significant correlation between the maximum brightness of a Type Ia supernova and the age of its host galaxy. In essence, the younger the galaxy, the fainter the supernova is likely to be. Why would such a correlation exist? There's no immediately obvious physical reason for it. Yet, observationally, the correlation is remarkably clear.
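For a sense of how such a correlation would be quantified in practice, here's an illustrative sketch using entirely synthetic data; the slope and scatter are made up for the example, not the values reported by Son et al.:

```python
import numpy as np
from scipy import stats

# Entirely synthetic data illustrating how an age-brightness
# correlation would be quantified. The slope and scatter are made up
# for this example; they are not the values from Son et al.
rng = np.random.default_rng(42)
age_gyr = rng.uniform(1.0, 12.0, 200)  # hypothetical host-galaxy ages
# Older hosts -> slightly brighter (more negative magnitude) supernovae:
peak_mag = -19.3 - 0.03 * age_gyr + rng.normal(0.0, 0.1, 200)

slope, intercept, r, p_value, stderr = stats.linregress(age_gyr, peak_mag)
print(f"slope = {slope:.3f} mag/Gyr, p-value = {p_value:.2e}")
```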
We can estimate a galaxy's age by analyzing its overall spectrum of light. Hot, bright blue stars have short lifespans, while small, red dwarf stars live much longer. Newly formed stars also contain a higher abundance of heavier elements. Therefore, the presence or absence of specific spectral lines within a galaxy provides valuable information about its age. (Learn more about galactic spectra: https://briankoberlein.com/post/cosmic-rainbow/). This method works for both nearby and distant galaxies. And when the peak brightness of Type Ia supernovae is plotted against the ages of their host galaxies, the correlation becomes strikingly apparent. Studies by other research teams indicate that this correlation has a statistical significance of around 5σ, which is considered quite strong. (Understand statistical significance: https://briankoberlein.com/post/five-is-a-magic-number/)
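For context on what those sigma levels mean, here's a quick sketch converting a significance in sigma into the probability of such a result arising by chance (using the two-sided convention):

```python
from scipy.stats import norm

# What a given sigma level means: the probability of a fluctuation at
# least that large arising by chance under the null hypothesis
# (two-sided convention used here).
for sigma in [3, 5, 9]:
    p = 2 * norm.sf(sigma)
    print(f"{sigma} sigma -> p = {p:.1e}")
```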
This new result disagrees with ΛCDM but agrees with BAO and Planck. Credit: Son, et al
Bearing this correlation in mind, the research team re-examined studies of cosmic acceleration. Instead of treating variations in supernova brightness as random statistical fluctuations, they incorporated the age-brightness correlation into their analysis. The early Universe contained a greater proportion of young galaxies than the middle-aged Universe does, so the farther away we look, the more the sample is skewed toward young hosts and their fainter supernovae. Distant supernovae therefore appear dimmer than a true standard candle would, mimicking the signature of accelerating expansion. When they factored this bias into their calculations, the results were astounding. The data contradict the ΛCDM model at a significance exceeding 9σ! Based on their findings, cosmic expansion isn't accelerating. Instead, the Universe continues to expand, but the rate of expansion is slowing down, and has been decelerating for approximately a billion years.
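To see what 'accelerating versus decelerating' means in these terms, here's a minimal sketch of the deceleration parameter for a flat universe containing only matter and a cosmological constant; the density values are standard illustrative assumptions, not the paper's fitted results:

```python
# Deceleration parameter for a flat universe with matter and a
# cosmological constant: q0 = Omega_m / 2 - Omega_Lambda.
# Negative q0 means accelerating expansion; positive q0 means the
# expansion is slowing down.
def q0(omega_m, omega_lambda):
    return omega_m / 2 - omega_lambda

# Standard LCDM values (assumed, Planck-like): acceleration.
print(q0(0.3, 0.7))  # -0.55 -> accelerating
# A matter-dominated universe with no dark energy: deceleration,
# qualitatively like the non-accelerating picture the study favors.
print(q0(1.0, 0.0))  # 0.5 -> decelerating
```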
If these results are confirmed by further research, what does this mean? It suggests that cosmic expansion cannot be attributed solely to the inherent fabric of space-time. According to general relativity, the cosmological constant is supposed to be truly constant: it shouldn't vary over time or space, and it can't produce an expansion that decelerates. So, could it be that Einstein, despite his genius, was a little bit wrong after all? (Explore the nuances of Einstein's theories: https://briankoberlein.com/blog/why-einstein-will-never-be-wrong/)
But there is a silver lining! This finding could potentially resolve the biggest mystery in modern cosmology. I'll delve into that topic next time. What do you think? Could this be the beginning of a revolution in our understanding of the Universe, or is it just a statistical anomaly? Share your thoughts and opinions in the comments below!
[^1]: Son, Junhyuk, et al. "Strong progenitor age bias in supernova cosmology – II. Alignment with DESI BAO and signs of a non-accelerating universe." Monthly Notices of the Royal Astronomical Society 544.1 (2025): 975–987. https://academic.oup.com/mnras/article/544/1/975/8281988