dark energy – One Universe at a Time
https://briankoberlein.com
Brian Koberlein

Who Needs Dark Energy?
https://briankoberlein.com/2017/04/11/needs-dark-energy/
Tue, 11 Apr 2017

Do we really need dark energy to explain cosmic expansion?

Our universe is expanding. We’ve known this for nearly a century, and modern observations continue to confirm it. Not only is our universe expanding, it is doing so at an ever increasing rate. But the question remains: what drives this cosmic expansion? The most popular answer is what we call dark energy. But do we need dark energy to account for an expanding universe? Perhaps not.

The idea of dark energy comes from a property of general relativity known as the cosmological constant. The basic idea of general relativity is that the presence of matter warps spacetime. As a result, light and matter are deflected from simple straight paths in a way that resembles a gravitational force. The simplest mathematical model in relativity just describes this connection between matter and curvature, but it turns out that the equations also allow for an extra parameter, the cosmological constant, that can give space an overall rate of expansion. The cosmological constant perfectly describes the observed properties of dark energy, and it arises naturally in general relativity, so it’s a reasonable model to adopt.

In classical relativity, the presence of a cosmological constant simply means that cosmic expansion is just a property of spacetime. But our universe is also governed by quantum theory, and the quantum world doesn’t play well with the cosmological constant. One proposed solution is that quantum vacuum energy might be driving cosmic expansion, but naive estimates of the vacuum energy from quantum fluctuations come out vastly larger (by dozens of orders of magnitude) than the cosmological constant we observe, so it isn’t a very satisfactory answer.

Despite the unexplainable weirdness of dark energy, it matches observations so well that it has become part of the concordance model for cosmology, also known as the ΛCDM model. Here Λ is the symbol for the cosmological constant that represents dark energy, and CDM stands for Cold Dark Matter. In this model there is a simple way to describe the overall shape of the cosmos, known as the Friedmann–Lemaître–Robertson–Walker (FLRW) metric. The only catch is that it assumes matter is distributed evenly throughout the universe. In the real universe matter is clumped together into clusters of galaxies, so the FLRW metric is only an approximation to the real shape of the universe. Since dark energy makes up about 70% of the mass/energy of the universe, the FLRW metric is generally thought to be a good approximation. But what if it isn’t?

A new paper argues just that. Since matter clumps together, space would be more strongly curved in those regions, while in the large voids between galaxy clusters there would be less curvature. Relative to the clustered regions, the voids would appear to expand at an accelerating rate, mimicking the effect of dark energy. The team ran computer simulations of a universe with this clustering effect but no dark energy, and found that its overall structure evolved much as it does in dark energy models. That would seem to support the idea that dark energy might simply be an effect of clustered galaxies.

It’s an interesting idea, but there are reasons to be skeptical. While such clustering can have some effect on cosmic expansion, it wouldn’t be nearly as strong as what we observe. And while this particular model seems to explain the scale at which galaxies cluster, it doesn’t explain other effects, such as the observations of distant supernovae that strongly support dark energy. Personally, I don’t find this new model very convincing, but ideas like this are certainly worth exploring. If the model can be further refined, it could be worth another look.

Paper: Gabor Rácz, et al. Concordance cosmology without dark energy. Monthly Notices of the Royal Astronomical Society: Letters DOI: 10.1093/mnrasl/slx026 (2017)

Bigger, Stronger, Faster
https://briankoberlein.com/2017/02/01/bigger-stronger-faster/
Wed, 01 Feb 2017

New observations of lensed quasars show the Universe is expanding faster than expected. But these results raise questions about the assumptions of our cosmological models.

We’ve known for nearly a century that the Universe is expanding. The fact that galaxies are receding from us was first demonstrated by Edwin Hubble in 1929, building upon the work of Henrietta Leavitt and others. Since then we’ve developed a variety of ways to measure the rate of cosmic expansion, and while they are broadly in agreement, there are small discrepancies between them. As a result we still don’t know exactly how fast the Universe is expanding, as astrophysicist Ethan Siegel has clearly explained. Now a new method of measuring cosmic expansion may help settle the issue, but it also raises more questions.

It all comes down to a physical parameter known as the Hubble constant. The bigger the Hubble constant, the greater the rate of cosmic expansion. The value of the constant also tells us the age of the Universe: if you trace the expansion backwards through time, you reach the point where the Universe was extremely hot and dense, commonly known as the big bang.

Hubble’s original measurement of the constant compared the distances of galaxies with the redshift of their light. He calculated galactic distances by measuring the brightness of variable stars known as Cepheid variables, and combined them with measurements of galactic redshifts made by Vesto Slipher. He found that more distant galaxies had greater redshifts, and presumably were receding from us at a greater rate. Hubble’s original value for the constant was about 500 km/s/Mpc, which caused a bit of a cosmological crisis. If that value were correct, the Universe would be only about 2 billion years old, contradicting geological evidence that showed the Earth to be more than 4 billion years old.
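
To see how sensitive the age estimate is to the measured constant, you can run the conversion yourself. A minimal sketch, where the only physics is t = 1/H0 (constant expansion assumed):

```python
# Age of the Universe assuming constant expansion: t = 1/H0.
KM_PER_MPC = 3.086e19   # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16  # seconds in a billion years

def age_gyr(h0_km_s_mpc):
    """Convert a Hubble constant in km/s/Mpc to an age in billions of years."""
    h0_per_sec = h0_km_s_mpc / KM_PER_MPC
    return 1.0 / (h0_per_sec * SEC_PER_GYR)

print(age_gyr(500.0))  # Hubble's 1929 value: ~2 billion years
print(age_gyr(70.0))   # a modern value: ~14 billion years
```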

Over time our measurements of the Hubble constant got better, and the results settled around a much smaller value of about 70 km/s/Mpc, putting the age of the Universe at about 14 billion years. We also developed different ways to calculate the Hubble constant using different types of data, and they each produced similar results. This means we know for sure that the Universe is expanding, and we have a pretty good handle on just how fast. But while these different methods broadly agreed, they didn’t exactly agree. It was generally thought that as our measurements improved this discrepancy would go away, but it didn’t. Something was clearly wrong with our understanding of cosmic expansion.

Modern measurement tensions from the distance ladder (red) compared with CMB (green) and BAO (blue) data. Image credit: Éric Aubourg et al., “Cosmological implications of baryon acoustic oscillation measurements”, Phys. Rev. D 92, 123516 (2015).

Hubble’s method of comparing distance with redshift has been extended by shifting from Cepheid variables to supernovae. A particular type of supernova known as Type Ia allows us to determine galactic distances across billions of light years. In 2016, observations from the Hubble telescope using this approach gave a value of 73.24±1.74 km/s/Mpc, which is on the high side of modern values.

A different approach looks at fluctuations in the cosmic microwave background (CMB). The CMB is the thermal remnant of the big bang, and while it is quite uniform in temperature, there are small-scale fluctuations. As the Universe has expanded, these fluctuations have been stretched, so the scale at which the fluctuations peak gives a value for the rate of cosmic expansion, and thus the Hubble constant. The most precise CMB measurement of the Hubble constant was made by the Planck satellite, which gave a result of 66.93±0.62 km/s/Mpc, on the low side. The Planck result agrees with another method known as baryon acoustic oscillation (BAO), which looks at how galaxies cluster across billions of light years and gives a value of 67.6±0.7 km/s/Mpc.

Temperature fluctuations of the CMB vary at different scales. Credit: NASA/WMAP
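
To put a number on how badly the supernova and Planck results above disagree, you can treat them as independent Gaussian measurements and express the gap in combined standard deviations. A back-of-the-envelope sketch using the values quoted above:

```python
from math import hypot

# Hubble constant measurements quoted above, as (value, uncertainty) in km/s/Mpc.
distance_ladder = (73.24, 1.74)  # Hubble telescope supernova result
cmb = (66.93, 0.62)              # Planck CMB result

gap = distance_ladder[0] - cmb[0]
combined_sigma = hypot(distance_ladder[1], cmb[1])  # assumes independent errors
print(f"{gap:.2f} +/- {combined_sigma:.2f} km/s/Mpc -> {gap / combined_sigma:.1f} sigma")
# ~3.4 sigma: too large to dismiss, too small to be decisive
```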

These disagreements are problematic because they point to problems in our cosmological model. Although each result is quite sophisticated, they all depend upon certain assumptions about the Universe. Our current model, known as the ΛCDM model, includes regular matter, dark matter and dark energy, as well as things such as how flat the Universe is on large scales. Each of these can be measured by independent experiments, but the results all have a bit of uncertainty. Tweak the values of these parameters within the model, and the measured value of the Hubble constant shifts. So we could tweak the model to make the Hubble constant results fit, but tweaking models to fit your expectations is bad science.

Now there’s a new method for determining the Hubble constant, and its result is very interesting.

Diagram showing how distant light can be gravitationally lensed. Credit: ALMA (ESO/NRAO/NAOJ), L. Calçada (ESO), Y. Hezaveh et al.

Rather than looking at the CMB or measuring galactic distances, the new approach looks at an effect known as gravitational lensing. As light passes near a large mass such as a star or galaxy, it is gravitationally deflected. As a result, light from a distant object such as a quasar can be deflected around a less distant galaxy. Instead of seeing one image of the distant quasar, we see multiple images. But if we look at these lensed images things get very interesting. Each image of the quasar has taken a different path, and those paths can have different lengths. So some images reach us sooner than others. We’ve seen this effect with distant supernovae, for example, allowing us to see multiple “instant replays” of a supernova over the course of a few decades. Quasars can fluctuate in brightness, which allows us to measure the timing between lensed images of a particular quasar.

In this new approach, the team looked at several lensed quasars, and measured the timing differences. These timing differences are affected by the Hubble constant, so by measuring different lensed quasars the team could get a value for the Hubble constant. The key here is that while the results depend strongly on the value of the Hubble constant, they aren’t affected very much by other model parameters such as the amount of regular matter and dark matter. It’s a more direct measurement, and therefore less dependent on model assumptions. The result they got was 71.9±2.7 km/s/Mpc.
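
The core of the method is a simple inverse scaling: for a fixed lens model, the predicted time delay is proportional to 1/H0 through the lensing distances, so a longer measured delay means a smaller Hubble constant. A toy illustration (the delay and reference values below are invented, not taken from the paper):

```python
# Toy version of time-delay cosmography. For a fixed lens model and fixed
# redshifts, the predicted delay scales as 1/H0 through the lensing distances.
H0_REFERENCE = 70.0         # km/s/Mpc, arbitrary reference value (assumption)
DELAY_AT_REFERENCE = 112.0  # days the lens model predicts at H0_REFERENCE (made up)

def h0_from_delay(observed_delay_days):
    """A longer observed delay implies larger distances, hence a smaller H0."""
    return H0_REFERENCE * DELAY_AT_REFERENCE / observed_delay_days

print(h0_from_delay(109.0))  # ~71.9 km/s/Mpc, in line with the quoted result
```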

This agrees pretty well with the Hubble results, but not with the CMB results. Since the result is less model dependent, it raises questions about our cosmological model. Why are the CMB and BAO results so much lower than the others? It isn’t clear at this point, and while this new result is a real advance, it doesn’t solve the mystery of the Hubble constant.

Paper: V. Bonvin, et al. H0LiCOW V. New COSMOGRAIL time delays of HE0435-1223: H0 to 3.8% precision from strong lensing in a flat ΛCDM model. arXiv:1607.01790 [astro-ph.CO] (2017)

The Constant Of Time
https://briankoberlein.com/2016/09/13/the-constant-of-time/
Tue, 13 Sep 2016

The rate of cosmic expansion has changed over time. So why does it look like a constantly expanding universe?

When Edwin Hubble first demonstrated the Universe was expanding in 1929, you could do a simple calculation to determine the age of the Universe. Take the rate at which galaxies recede from each other (known as the Hubble constant H) and set it equal to the inverse age of the cosmos (1/t). This simple model assumes that the Universe expands at a constant rate, thus Ht = 1. When this was first proposed within the context of the big bang model, it actually raised a few questions. Early measurements of the Hubble constant were much higher than the currently accepted value, which gave a cosmic age younger than some stars.

We now know the Universe hasn’t expanded at a constant rate. The rate of cosmic expansion is determined both by dark energy driving galaxies apart and by the overall density of matter in the Universe, which works to slow the expansion. In the early universe matter dominated, so the rate of expansion was actually decreasing. About 6.5 billion years ago the average density of the Universe dropped to the point where dark energy began to dominate, and the Universe began expanding at an ever increasing rate. An accurate determination of the age of the Universe has to account for this changing expansion rate: first deceleration, then acceleration. If you do that you get an age of about 13.8 billion years, which is the currently accepted value.

Because of this variation in cosmic expansion, the Hubble constant has changed over cosmic time, which is why you can’t simply set Ht = 1. And yet, if you take the current Hubble constant and multiply it by the currently accepted age of the Universe, you get almost exactly 1 (to within known uncertainties). In other words, if the Universe had always expanded at a constant rate, it would be exactly the same size and age as the Universe currently is. This is known as the synchronicity problem. It’s not a problem, per se, but rather an interesting coincidence, and it hasn’t been true for any other epoch of the cosmos. It’s also not the only odd coincidence. The vacuum energy density and the matter energy density are also currently about equal, which is known as the coincidence problem.
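
The near-perfect cancellation is easy to check numerically. The sketch below integrates the flat ΛCDM expansion history to get the age, then forms the dimensionless product H0·t0; the parameter values are round textbook numbers, not a fit:

```python
from scipy.integrate import quad

H0 = 68.0                    # km/s/Mpc (round number, assumption)
OMEGA_M, OMEGA_L = 0.3, 0.7  # matter and dark energy fractions, flat universe
H0_PER_GYR = H0 / 978.0      # 1 km/s/Mpc is about 1/978 per billion years

# Age of a flat LCDM universe: t0 = integral of da / (a * H(a)),
# where H(a) = H0 * sqrt(OMEGA_M / a^3 + OMEGA_L).
integrand = lambda a: 1.0 / (a * (OMEGA_M / a**3 + OMEGA_L) ** 0.5)
t0 = quad(integrand, 0.0, 1.0)[0] / H0_PER_GYR

print(f"t0 = {t0:.1f} Gyr, H0 * t0 = {t0 * H0_PER_GYR:.2f}")  # ~13.9 Gyr, ~0.96
```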

As the Universe expands the matter density drops while the vacuum density doesn’t, so it’s tempting to think that the synchronicity problem and the coincidence problem are two sides of the same coin. But recent work shows this isn’t the case. By varying the parameters of a hypothetical universe, one can create a model where one is true but the other is not; the two unusual coincidences are independent of each other. This raises the question of whether they are actually related by some unknown physical process. We always have to be a bit careful with these kinds of questions. It is perfectly possible that the two “problems” are just random flukes. But when you start seeing coincidences in your data it is sometimes worth exploring.

If there is a connection, it will only be a matter of time before we find it.

Paper: Arturo Avelino and Robert P. Kirshner. The dimensionless age of the Universe: a riddle for our time. The Astrophysical Journal, Volume 828, Number 1 (2016) arXiv:1607.00002 [astro-ph.CO]

A Million Points Of Light
https://briankoberlein.com/2016/07/16/million-points-light/
Sat, 16 Jul 2016

A new survey of more than a million galaxies supports the standard model of cosmology.

The Baryon Oscillation Spectroscopic Survey (BOSS) is part of the Sloan Digital Sky Survey III (SDSS-III). It looks at the positions and redshifts of more than a million galaxies. Redshift can be used as a measure of distance, though due to cosmic expansion you have to be a little careful about what that distance means. Using such a large sample, we can look at how the redshifts of galaxies (and therefore their distances) cluster on average, and this allows us to test whether the standard model of cosmology is correct.

The key is to measure what is known as baryon acoustic oscillation (BAO). In the Universe there are two main forces driving the way in which galaxies cluster on large scales. One is dark matter, which causes galaxies to clump together. The other is dark energy, which causes these clumps to spread apart. The scale at which clumping occurs allows us to compare the ratio of dark matter to dark energy. It also allows us to study whether that ratio has changed over time. If, for example, dark energy were stronger in the early Universe, the gaps between clumps would be larger at greater distances.

The standard model of cosmology is known as the ΛCDM model, and assumes that dark energy is a constant, rather than varying over time and space. This new result found that the level of dark energy was constant over time to the limits of observation, so once again the standard model holds up. The survey also calculated the Hubble parameter, getting a value of 67.3 km/s per megaparsec, which is much lower than the value of 73 recently found by Hubble observations, and even lower than the “official” value of 69.3 km/s per megaparsec. This variation of results between different observations, known as tension in the model, seems to be common these days. While the results aren’t completely contradictory when you take into account their uncertainties, they could hint at aspects of the model we don’t fully understand.

So overall the standard cosmological model holds up once again, but there are hints that a new chapter of the story may be unfolding as well.

Paper: Shadab Alam et al. The clustering of galaxies in the completed SDSS-III Baryon Oscillation Spectroscopic Survey: cosmological analysis of the DR12 galaxy sample. Monthly Notices of the Royal Astronomical Society (2016) arXiv:1607.03155 [astro-ph.CO]

Bell, Book, and Candle
https://briankoberlein.com/2016/07/01/bell-book-candle/
Fri, 01 Jul 2016

Dark energy isn't some magical idea we've dreamt up to fit our models.

Cosmology has upped its game in the past couple of decades. There was a time when the age of the Universe was known only as “probably several billion years,” because measurements of cosmic distances and the Hubble parameter were so loosely constrained. Today we can say the Universe is 13.8 billion years old, give or take a bit. But that level of precision has been hard won, and it hasn’t come without controversy.

Some of the methods used to measure cosmic distances. Credit: Tabitha Dillinger

Cosmic distances are measured through a range of methods that overlap in their useful distances, known as the cosmic distance ladder. The three big methods are parallax for nearby stars, Cepheid variable stars for globular clusters and nearby galaxies, and type Ia supernovae for distant galaxies. The latter two are known as standard candles, because we can indirectly determine their actual brightness and compare it to their observed brightness to determine their distance. The Achilles heel of standard candles is that if their presumed actual brightness is wrong, then our distance measurements are wrong.

Recently there’s been a bit of a furor over the discovery that type Ia supernovae aren’t entirely consistent in their brightness, but come in two varieties: one a bit brighter at reddish wavelengths, the other a bit brighter at bluish wavelengths. We’ve generally used a single method to determine absolute brightness, and these two varieties would give different answers. This led to speculation that perhaps dark energy and the ever increasing expansion of the universe were actually wrong. If distant supernovae are actually dimmer than we assumed, perhaps dark energy isn’t real.

That made for great headlines, but it isn’t true. While we do use supernovae to determine the existence of dark energy, we also have other methods, such as the large-scale clustering of galaxies and fluctuations in the cosmic microwave background. This new discovery might mean there is less dark energy than we thought, but dark energy isn’t going away, because we have a confluence of evidence to support it. If anything, the discovery is an indication of just how far we’ve come: our measurements are now so sophisticated that we’re adding nuances to our results to make them better.

This has happened before. In the early days of modern cosmology the big focus was on Cepheid variables. After being used to demonstrate our galaxy wasn’t alone in the Universe, they were used to determine galactic distances millions of light years away. It was Cepheids that allowed Hubble to first demonstrate the Universe was expanding. But as we observed more Cepheids it became clear that they, too, come in two varieties. Once this was understood, the estimated age of the Universe shifted from a few billion to around 10-20 billion years. What started as a problem led to a better measure of cosmic age.

Which just goes to show that dark energy isn’t some magical idea we’ve dreamt up to fit our models. Our model is so good now that we can find subtle errors and correct them.

New Evidence Challenges Rate Of Cosmic Expansion
https://briankoberlein.com/2016/06/09/new-evidence-challenges-rate-cosmic-expansion/
Thu, 09 Jun 2016

A new measurement of the Hubble parameter raises interesting questions, but whether it leads to a new understanding of cosmic expansion and dark energy remains to be seen.

The Universe is expanding. In the standard model of cosmology the rate of that expansion is given by the Hubble parameter, which depends in part on the dark energy driving cosmic expansion. New observations of distant galaxies yield a higher than expected Hubble value. That may mean the Universe is expanding faster than we thought, but there’s no need to start rewriting textbooks just yet.

Since the Hubble parameter measures the rate of cosmic expansion, one way to determine it is to compare the redshift of light from distant galaxies with their distance. The cosmological redshift of a galaxy is easy to measure, and is due to the fact that cosmic expansion stretches the wavelength of light as it travels across millions or billions of light years, making it appear more red. By comparing the redshifts for galaxies of different distances we can determine just how fast the Universe is expanding.

Unfortunately distance is difficult to determine. It relies upon a range of methods that vary depending on distance, known as the cosmic distance ladder. For close stars we can use parallax, which is an apparent shift of stars relative to more distant objects due to the Earth’s motion around the Sun. The greater a star’s distance, the smaller its parallax, so the method is only good to about 1,600 light years. For larger distances we can look at variable stars such as Cepheid variables. We know the distance to some Cepheid variables from their parallax, so we can determine their actual brightness (absolute magnitude). From this we’ve found that the rate at which a Cepheid variable changes in brightness correlates with its overall brightness. This relation means we can determine the absolute brightness of Cepheid variables more than 1,600 light years away, and comparing that to their apparent brightness gives us their distance. By observing Cepheids in various galaxies we can determine galactic distances out to about 50 million light years, at which point they’re simply too faint to observe.
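
The first two rungs of the ladder are simple enough to sketch in a few lines. The period-luminosity coefficients below are illustrative round numbers rather than a carefully calibrated fit:

```python
from math import log10

LY_PER_PARSEC = 3.26

def parallax_distance_ly(parallax_arcsec):
    """First rung: distance in parsecs is 1 / parallax in arcseconds."""
    return (1.0 / parallax_arcsec) * LY_PER_PARSEC

def cepheid_distance_ly(period_days, apparent_mag):
    """Second rung: the period-luminosity relation gives the absolute magnitude M,
    then the distance modulus m - M = 5 * log10(d / 10 pc) gives the distance.
    The coefficients here are illustrative, not a published calibration."""
    absolute_mag = -2.43 * (log10(period_days) - 1.0) - 4.05
    d_parsecs = 10 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)
    return d_parsecs * LY_PER_PARSEC

print(parallax_distance_ly(0.1))        # ~33 light years for a 0.1" parallax
print(cepheid_distance_ly(10.0, 20.0))  # ~2 million ly for a 10-day Cepheid at m = 20
```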

The bright spot in the lower left is a supernova in the NGC 4526 galaxy. Credit: NASA, ESA, The Hubble Key Project Team, and The High-Z Supernova Search Team

Enter the supernova. In a single burst of light a supernova can outshine an entire galaxy, so they can be detected across billions of light years. While there are several types of supernovae, one type (Type Ia) has a fairly consistent maximum brightness. We know this by observing several in galaxies where Cepheids have been used to determine their distance. Just like Cepheids, we can compare the actual brightness of a Type Ia supernova with its apparent brightness and determine the distance of a galaxy.

The Achilles heel of the cosmological distance ladder is that it relies upon a chain of data. The distances for supernovae depend upon the calculated distances of Cepheid variables, which in turn depend upon parallax distance measurements. With ever increasing distance comes greater uncertainty in the results, so you want the uncertainties at each step to be as small as possible, which is where this new work comes in. Using data from the Hubble Space Telescope’s Wide Field Camera 3, a team measured about 2,400 Cepheid variables in 11 galaxies where a Type Ia supernova had also occurred. This allowed them to reduce the uncertainty of supernova distance measurements. They then compared the distances and redshifts for 300 supernovae to get a measure of the Hubble parameter accurate to within 2.4%.

That by itself is good work, but the result was surprising. The value for the Hubble parameter they got was about 73 km/s per megaparsec, which is higher than the “accepted” value of 69.3. The difference is large enough that it falls outside the uncertainty range of the accepted value. If the result is right, then the Universe is expanding at a faster rate than we thought. It could also point to an additional dark energy component in the early Universe, meaning that dark energy is very different from what we’ve supposed.

But we shouldn’t consider this result definitive just yet. The use of supernovae to measure the Hubble parameter isn’t the only method we have. We can also look at the way galaxies cluster on large scales, and fluctuations in the cosmic microwave background. Each of these gives a slightly different value for the Hubble parameter, and the “accepted” value is a kind of weighted average of all measurements. The variation of values from different methods is known as tension in the cosmological model, and any new claim about dark energy and cosmic expansion will need to address this tension. If the supernova method is right and the Universe really is expanding faster than we thought, why do other methods yield a value significantly smaller than the true value? It could be that there is some bias in one or both of the methods that we haven’t accounted for. Planck, for example, has to account for gas and dust between us and the cosmic background, and that may be skewing the results. It could be that the supernovae we use as standard candles to measure galactic distance aren’t as standard as we think. It could also be that our cosmological model isn’t quite right. The current model presumes that the universe is flat, and that cosmic expansion is driven by a cosmological constant. We have measurements to support those assumptions, but if they are slightly wrong that could account for the differences as well.

This new result does raise interesting questions, and it confirms that the discrepancy between different methods is very real. Whether that leads to a new understanding of cosmic expansion and dark energy is yet to be seen.

Paper: Adam G. Riess, et al. A 2.4% Determination of the Local Value of the Hubble Constant. arXiv:1604.01424 [astro-ph.CO] (2016)

The Incredible Shrinking Universe
https://briankoberlein.com/2016/04/29/incredible-shrinking-universe/
Fri, 29 Apr 2016

Most of the universe we can observe is forever beyond our reach.

The Universe is getting smaller. Not the observable universe, which is currently a sphere about 93 billion light years across and increasing all the time, but the much smaller portion that we could ever hope to reach. Since the Universe is expanding, our cosmic playground is shrinking all the time. 

If the Universe weren’t expanding, then the size of the observable universe would simply depend on its age. As the years go by, ever more distant light would be able to reach us. Likewise, we would be able to travel anywhere in the Universe given enough time. Even at speeds approaching that of light it might take billions of years, but the only limiting factor is time. But the Universe is expanding. It’s not that galaxies are racing away from some point in space, but rather that space itself is expanding, and that makes a big difference.

Since space itself is expanding, the more distant an object, the faster it seems to be moving away from us. We measure cosmic expansion in terms of the Hubble parameter, which is about 20 km/s per million light years. This means that two points in space a million light years apart are moving away from each other at 20 kilometers each second. Two points 10 million light years apart are moving away at 200 km/s, and so on. Because of this, if you consider two points far enough apart, they will be moving away from each other faster than the speed of light. The speed of light is about 300,000 km/s, which, given our current Hubble constant, is the separation speed for two points 15 billion light years apart. This distance is known as the Hubble radius. Anything outside that radius is impossible for us to reach, even if we could travel toward it at the speed of light.
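
The Hubble radius is one line of arithmetic: the speed of light divided by the Hubble parameter.

```python
C_KM_S = 300_000.0  # speed of light, km/s (rounded)
H0 = 20.0           # km/s per million light years, as quoted above

hubble_radius_gly = (C_KM_S / H0) / 1000.0  # million ly -> billion ly
print(hubble_radius_gly)                    # 15.0, i.e. about 15 billion light years
```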

Some of you might protest, since you’ve been told numerous times that nothing can travel faster than light. The catch is that a galaxy 16 billion light years away isn’t actually traveling faster than light. What’s happening is that the expansion of space between us and the distant galaxy is increasing the distance between us faster than the speed of light. That subtle difference is also why we can see things that are farther away than 15 billion light years.

Because of cosmic expansion, the whole idea of galactic distance depends on your definition. As light leaves a galaxy to travel in our direction, space is expanding all along its journey. This not only causes the light to redden (known as the cosmological redshift), it also makes the journey longer. All the while, the galaxy is moving ever farther away. For the most distant galaxies, the light we observe has traveled for more than 13 billion years. When that light began its journey, its galaxy was only 3.4 billion light years away. Now the galaxy is 29 billion light years away. We can see such distant galaxies even though we’ll never reach them.
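
Those three numbers (travel time, distance then, distance now) all come from integrating the expansion history. A sketch with round, Planck-like parameters in a flat ΛCDM universe; the galaxy here is represented only by its redshift, taken to be about 7.5:

```python
from scipy.integrate import quad

OMEGA_M, OMEGA_L = 0.3, 0.7  # flat LCDM densities (round numbers, assumption)
HUBBLE_GLY = 14.4            # c/H0 in billions of light years for H0 ~ 68 km/s/Mpc
HUBBLE_GYR = 14.4            # 1/H0 in billions of years (same number in these units)

E = lambda z: (OMEGA_M * (1.0 + z) ** 3 + OMEGA_L) ** 0.5  # H(z) / H0

def history(z):
    # Comoving distance today: (c/H0) * integral of dz' / E(z')
    d_now = HUBBLE_GLY * quad(lambda x: 1.0 / E(x), 0.0, z)[0]
    # Light-travel (lookback) time: (1/H0) * integral of dz' / ((1+z') * E(z'))
    t_travel = HUBBLE_GYR * quad(lambda x: 1.0 / ((1.0 + x) * E(x)), 0.0, z)[0]
    return t_travel, d_now / (1.0 + z), d_now

# ~13.2 Gyr of travel; ~3.5 Gly away at emission; ~29.5 Gly away now
print(history(7.5))
```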

Since the observable universe is about 46 billion light years in radius, and the Hubble radius is about 15 billion light years, about 97% of the observable universe is already beyond our reach. Furthermore, since space continues to expand, galaxies that are currently within reach will eventually move beyond the Hubble radius. Our galaxy is part of a cluster of galaxies known as the Local Group. It is about 10 million light years across and contains about 50 galaxies. Together they are close enough that their gravity will cause them to collapse toward each other despite cosmic expansion. But more distant clusters are so far away that cosmic expansion will win in the end. In perhaps a hundred billion years our Local Group will have collapsed into a single large galaxy, and the rest of the Universe will have moved forever out of reach.
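
The “about 97%” is just a ratio of sphere volumes, which you can check in two lines:

```python
# Fraction of the observable universe that is forever out of reach,
# treating both regions as spheres (radius ratio cubed).
HUBBLE_RADIUS_GLY = 15.0      # reachable in principle
OBSERVABLE_RADIUS_GLY = 46.5  # comoving radius of the observable universe

reachable = (HUBBLE_RADIUS_GLY / OBSERVABLE_RADIUS_GLY) ** 3
print(f"beyond reach: {1.0 - reachable:.0%}")  # ~97%
```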

Standardizing the Candle
https://briankoberlein.com/2015/07/22/standardizing-the-candle/
Wed, 22 Jul 2015

The evidence for dark energy lies in our ability to relate the redshift of a galaxy to its distance. To prove dark energy is real we have to measure redshift and distance independently, and that takes a bit of doing.

The evidence for dark energy lies in our ability to relate the redshift of a galaxy to its distance. While we often talk about how the observed redshift of a galaxy allows us to determine its distance, that assumes our understanding of dark energy is correct. To prove dark energy is real we have to measure redshift and distance independently, and that takes a bit of doing.

Measuring redshift is fairly straightforward. By comparing the spectrum of a distant galaxy with the known spectra of atoms and molecules here on Earth, we can determine the amount of redshift expressed in a quantity known as z. To measure distance, however, we need to use observations of a kind of supernova known as type Ia. These are often described as “standard candles” that always explode with the same brightness, but that isn’t actually the case. Some type Ia supernovae are brighter than others, so you can’t simply use their observed brightness as a measure.
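
Once a spectral line has been matched, the calculation is one line; a minimal sketch:

```python
def redshift(observed_nm, rest_nm):
    """Redshift z = (observed wavelength - rest wavelength) / rest wavelength."""
    return (observed_nm - rest_nm) / rest_nm

# Hydrogen-alpha is emitted at 656.3 nm; if a galaxy's spectrum shows it at 721.9 nm:
print(redshift(721.9, 656.3))  # z of about 0.1
```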

Raw light curves (top) vs. calibrated light curves (bottom) for type Ia supernovae.

Type Ia supernovae are identified by their spectrum: it lacks hydrogen lines and shows a distinct silicon absorption line near maximum brightness. From this we can clearly distinguish type Ia supernovae from other types. What we know from observing type Ia supernovae in nearby galaxies is that there is a specific relation between their peak brightness and the time it takes their light to decay: bright supernovae shine longer than dim ones. From the ratio of the peak to the width of the light curve, we can calibrate these supernovae to determine their absolute magnitude. Comparing that with their observed magnitude gives us their distance.
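
A stripped-down version of that calibration looks something like the sketch below. Real analyses fit full light curves; the correction coefficient here is made up for illustration, and only the fiducial peak magnitude (about -19.3) is a commonly quoted figure:

```python
from math import log10

M_FIDUCIAL = -19.3  # fiducial peak absolute magnitude of a type Ia supernova
WIDTH_SLOPE = 0.7   # made-up brightness correction per day of light-curve width

def supernova_distance_mly(peak_apparent_mag, width_days):
    """Standardize the peak brightness using the light-curve width (broader
    means brighter), then apply the distance modulus m - M = 5*log10(d/10pc)."""
    absolute_mag = M_FIDUCIAL + WIDTH_SLOPE * (15.0 - width_days)  # toy correction
    d_parsecs = 10 ** ((peak_apparent_mag - absolute_mag + 5.0) / 5.0)
    return d_parsecs * 3.26e-6  # parsecs -> millions of light years

print(supernova_distance_mly(19.5, 15.0))  # ~1900 Mly for these toy numbers
```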

Calculating distance is based upon two assumptions. The first is that our view of the supernovae is relatively unobscured. We calculate distance using the inverse-square relation for light, but that only works if there isn’t gas or dust absorbing some of the light. While there can be gas and dust between us and a supernova, it wouldn’t absorb all frequencies of light by the same amount. Blue wavelengths are absorbed much more than red wavelengths (creating an effect known as reddening), and infrared wavelengths are hardly absorbed at all. So any significant obscuration would show up as strong reddening, which can be measured and corrected for. That’s why we know our first assumption is valid.

The second assumption is that nearby type Ia supernovae are the same as distant ones. Interestingly, in recent years there’s been some evidence that might not be the case. Recent observations of a large number of supernovae seem to show two classes of type Ia supernovae with slightly different calibration ratios. If this is true, it could change the amount of dark energy we infer the universe to have. However this would be a minor adjustment to our understanding of cosmology, not a revolutionary change. While supernovae are a great way to observe the effects of dark energy, they aren’t the only way. We can also look at things such as the clustering of galaxies on large scales, and the fluctuations within the cosmic microwave background, to determine the amount of dark energy in the universe. What we find is that they all agree reasonably well.

So while type Ia supernovae aren’t standard candles, they are standardizable candles, and they tell us a great deal about the cosmos.

Paper: Peter A. Milne et al. The Changing Fractions of Type Ia Supernova NUV–Optical Subclasses with Redshift. ApJ 803 20. (2015)

So You’re Saying I’ve Got a Chance
https://briankoberlein.com/2015/06/10/got-a-chance/
Wed, 10 Jun 2015

New research questions the claim that the universe is accelerating. But this new work isn't as strong as some claim.

Recently I’ve been asked about reports of new research showing the universe isn’t accelerating. If true, it would mean that dark energy doesn’t exist, which would be a good way to solve the mystery. While there is the occasional headline making such a claim, there isn’t a great deal of evidence to support the idea. There is, however, plenty of evidence that dark energy exists.

The most recent paper claiming to eliminate (or at least weaken) dark energy showed up recently on the arXiv. It focuses on one keystone of the dark energy evidence: observations of distant supernovae. One particular type of supernova known as Type Ia has the useful property of exploding with a fairly uniform brightness. This means they can be used as “standard candles” to determine distance: you compare a supernova’s apparent brightness to its actual brightness to get a distance. Observations of some of the most distant supernovae known at the time led to the Nobel-winning discovery of dark energy.

The no-acceleration model compared to observational data.

But recently there’s been evidence that there is more variation within Type Ia supernovae than originally thought, including a dimmer variation known as Type Iax. This means the uncertainty in the actual brightness of Type Ia supernovae might be greater than we’ve been using, which is where this new paper comes in. Basically what the authors do is analyze the observations we have of distant supernovae using larger uncertainties. They then compare this data to both the accelerating and non-accelerating cosmological models. What they find is that the confidence level of the accelerating model is lowered, which is exactly what you would expect if you make your uncertainties larger. They also find that support for no acceleration increases, which is also what you’d expect with larger uncertainties.

Their conclusion is that the non-accelerating model is “still in the game” as it were, since larger uncertainties make the distinction between the two models less clear. But the evidence doesn’t support that conclusion. The strongest candidate by far is still an accelerating universe based upon this data, and dark energy is supported by other evidence such as galactic clustering and the cosmic microwave background.

In light of new supernova observations, it’s good to keep testing our cosmological models, but so far the standard ΛCDM model of an accelerating universe is the best model we have.

Paper: J T Nielsen, et al. Marginal evidence for cosmic acceleration from Type Ia supernovae. arXiv:1506.01354 [astro-ph.CO] (2015)

Paper: Peter A. Milne et al. The Changing Fractions of Type Ia Supernova NUV–Optical Subclasses with Redshift. ApJ 803 20 (2015)

Karma Chameleon
https://briankoberlein.com/2015/02/21/karma-chameleon/
Sat, 21 Feb 2015

One proposed model for dark energy known as the chameleon field has been put to the test, and failed.

While the search for dark matter particles often hits the news, there are also efforts underway to detect dark energy particles. As with dark matter, the experiments thus far have largely determined what dark energy isn’t rather than what it is.

There are two basic ways to account for dark energy. One is known as the cosmological constant. In this model, dark energy is an inherent aspect of the structure of space and time. Thus, throughout the universe there is a constant, uniform expansion of spacetime that gives the effect of dark energy. This is the simplest way to account for dark energy, since it’s just a matter of adding a term to the usual general relativity equations, and it agrees with observations so far. But simply adding a term to your equations seems like a bit of a tweak. General relativity doesn’t require a cosmological constant, it merely allows for one. There’s no reason why there should be such a constant other than the fact that it fits observation. So lots of alternatives have been proposed.

The most popular type of alternative is to propose some type of scalar field. The idea is that the universe would be filled with a scalar field that results in dark energy. That may seem even more crazy than a cosmological constant, but the Higgs boson is a result of a scalar Higgs field introduced to account for particle mass, and we’ve actually detected it. There are several variations of the scalar field idea, but most of them can’t be tested using current data. But one version known as chameleon fields has just been tested, and failed the test.

The basic experiment. Credit: Paul Hamilton, et al.

The chameleon field is a “fifth force” field that interacts with itself to produce the effects of dark energy in deep space, but is inhibited by the presence of mass. In this way you get cosmic expansion between galaxies, but you don’t see its effect within galaxies (or in our solar system). Since the presence of mass makes it “hidden,” it acts as a kind of cosmic chameleon, hence the name. Normally I wouldn’t put much credence in a “just so” model like this, but a few months ago it was demonstrated that the model could actually be tested. Because of its chameleon effect, the field should be “trapped” within a vacuum cavity. By putting as little matter in the chamber as possible, the chameleon field would strengthen within it, and as a result the effective gravitational force inside the vacuum would be altered. Using an atom interferometer (basically a double-slit experiment using atoms instead of electrons), the change in gravity could be measured. What the team found was that there was no measurable effect, down to the limits of their experiment.

This basically rules out the chameleon field and similar models. There are still a few ways the model could be tweaked to survive within the limits of this experiment, but it doesn’t look good for chameleon fields. That’s not particularly surprising, since most proposed models will be wrong. What makes this interesting is that we’re now actually testing dark energy models in the lab.

Paper: Paul Hamilton, et al. Atom-interferometry constraints on dark energy. arXiv:1502.03888 [physics.atom-ph] (2015)

It’s What’s For Dinner
https://briankoberlein.com/2014/11/28/whats-dinner/
Fri, 28 Nov 2014

A new paper in Physical Review Letters has led to popular science headlines wondering "Is dark energy eating dark matter?" but that's not what the paper claims at all.

Two of the most mysterious phenomena in astrophysics are dark matter and dark energy. Dark matter is what holds galaxies together, and causes them to clump into clusters and superclusters. It is known as dark matter because it doesn’t interact strongly with light. Dark energy, on the other hand, causes clusters of galaxies to expand away from each other. The resulting cosmic expansion is why distant galaxies appear to be racing away from us. Because both of these phenomena are central to the structure of the cosmos, there have been several attempts to connect them into a single phenomenon. Now a new paper in Physical Review Letters proposes that decaying dark matter may produce dark energy.

This new paper has led to popular science headlines wondering “Is dark energy eating dark matter?” but that’s not what the paper claims at all. In the paper the authors look at three sets of data: fluctuations in the cosmic microwave background, observations of distant supernovae, and redshift data from distant galaxies. In the standard model of cosmology, known as the ΛCDM model, these three data sets should be related, and if you plug the data into the model they should give the same results. What we find is that they almost agree, but not quite. This is not to say that they contradict each other, simply that they don’t agree as well as the ΛCDM model says they should.

This slight disagreement between datasets is known as tension in the model. There have been various proposed solutions for easing this tension, such as factoring in neutrino mass, but in this new work the authors propose resolving it by connecting dark matter and dark energy. To do this they tweaked the ΛCDM model slightly. The Λ or “lambda” in ΛCDM represents the cosmological constant that drives dark energy. The CDM stands for “cold dark matter,” which is the leading model for dark matter. In the standard cosmological model these two are separate and constant, but the authors propose that dark matter could decay into dark energy (by some unknown process), so that over the history of the universe Λ would increase while the amount of CDM decreased. The decay of one into the other would be slow, but the authors found that such a decay would ease the tension in the observational data.
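
You can get a feel for this kind of tweak with a toy model: let the dark matter density dilute with expansion while slowly leaking into the dark energy density. Everything below is schematic, with a made-up decay rate; the paper’s actual interaction is parameterized quite differently:

```python
# Toy interacting dark sector: dark matter dilutes as the universe expands
# and also decays into dark energy at a small, made-up rate.
rho_m, rho_l = 0.27, 0.68  # today's densities in units of the critical density
EPS = 0.02                 # decay rate per e-fold of expansion (invented)
STEPS = 1000
D_LN_A = 5.0 / STEPS       # integrate five e-folds of expansion forward

for _ in range(STEPS):
    decay = EPS * rho_m * D_LN_A
    rho_m += -3.0 * rho_m * D_LN_A - decay  # normal a^-3 dilution plus decay
    rho_l += decay                          # the cosmological "constant" creeps up

print(f"dark matter: {rho_m:.2e}, dark energy: {rho_l:.4f}")
```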

It should be pointed out that this doesn’t prove such a decay occurs, only that tweaking the standard model in such a way seems to fit the data better. Tweak theories are weak theories, as I’ve said before. But what this new work does show is that modifications of the standard model are worth exploring. As we strive to solve the mysteries of dark matter and dark energy, they may turn out to be two sides of the same coin.

Paper: Valentina Salvatelli, et al. Indications of a Late-Time Interaction in the Dark Sector. Phys. Rev. Lett. 113, 181301 (2014)

The Phantom Menace
https://briankoberlein.com/2014/10/26/phantom-menace/
Sun, 26 Oct 2014

We know the universe is expanding, and we know it is doing so at an ever increasing rate. This cosmic acceleration is part of the evidence for dark energy, which current observations put at about 68% of the mass/energy of the observable universe. But beyond its existence as some kind of energy, we’re still trying to determine just what dark energy is.

The most widely accepted model for the universe, sometimes called the standard model of cosmology, is known as the ΛCDM model. The Λ or “lambda” refers to the dark energy parameter known as the cosmological constant (whose value is often represented by lambda). The CDM stands for Cold Dark Matter, which is the type of dark matter currently best supported by observation. In the ΛCDM model, the universe consists of regular matter (about 5%), cold dark matter (about 27%), and dark energy (about 68%), where dark energy arises from a cosmological constant that is a property of space and time itself.

The ΛCDM model makes a couple of very important predictions about the universe that can be tested against observation. The first is that the universe is, on average, flat. This means that local masses can curve space around them (with the effect of gravitational attraction), but since the cosmological constant curves space in the opposite way (cosmic expansion), the total average should be zero. Current observations agree with this prediction, though there is a hint in the cosmic microwave background data that the curvature might not be exactly zero, as I’ve written about before.

Predictions of different cosmological models. Credit: NASA, CXC, M. Weiss

A second prediction is about the equation of state for dark energy, essentially the ratio of its pressure to its energy density. This boils down to a parameter in cosmology known as w. According to the ΛCDM model, w should be exactly equal to -1. If w = -1, then dark energy is constant, and can be described in general relativity by the cosmological constant. Again, observations agree pretty well with this prediction, but a new paper summarizing observational data from Pan-STARRS seems to show that w might not be -1 after all.

Pan-STARRS (the Panoramic Survey Telescope & Rapid Response System) has been making a medium-deep sky survey that has observed 112 type Ia supernovae. These observations were combined with others for a total of 313 type Ia supernovae. Type Ia supernovae are useful because they have a consistent maximum brightness, which means you can observe their apparent brightness to determine how far away they are. You can also measure the redshift of their light to determine cosmic expansion. So observations of these supernovae let us determine the value of w.

What the team found was that if you just take the supernova data, the ΛCDM model gives an experimental value for w of -1.015, give or take about 0.1 either way, which agrees with the model. But if you combine the supernova data with other observational data, including the Planck observations of the cosmic microwave background and observations of baryon acoustic oscillations, then you get a different value, w = -1.186, give or take 0.07. If this value is correct then the ΛCDM model isn’t an accurate description of the universe.

If the w parameter is truly less than -1, then cosmic acceleration cannot be due to a simple cosmological constant. One of the more popular models giving a w value less than -1 is known as phantom energy. This would be a particularly strong form of dark energy, and if it’s real it could mean the universe will continue to expand at an ever accelerating rate until it ends in a Big Rip.

Before you run out and tell your friends that the standard cosmological model is wrong, keep in mind that these results are tentative, and they aren’t overly strong. In any observational result there is a chance of observational bias, which can skew the results a bit. There is also the fact that there can be random variations in the data. The reliability of an observation is often given in a statistical term known as sigma. In this case the result is at 2-sigma, which means there is roughly a 1 in 20 chance that the deviation from w = -1 is just random variation. In science we generally set the bar at 5-sigma, which is about a 1 in 1.7 million chance of being random variation. The authors themselves note that the data isn’t strong enough to make a clear determination.
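
Those “one in N” figures follow directly from the Gaussian tail probability; a quick check, assuming two-tailed probabilities:

```python
from math import erf, sqrt

def fluke_odds(n_sigma):
    """Two-tailed Gaussian probability that a deviation of n_sigma is pure chance."""
    p = 1.0 - erf(n_sigma / sqrt(2.0))
    return p, round(1.0 / p)

print(fluke_odds(2.0))  # ~0.046 -> about 1 in 22
print(fluke_odds(5.0))  # ~5.7e-7 -> about 1 in 1.7 million
```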

Future observations by Pan-STARRS will likely triple the number of supernovae we can use to determine w. So in time the result, whether it agrees or disagrees with ΛCDM, will reach a strong confidence level. By then we’ll know if our standard cosmological model is correct, or if it faces a phantom menace.
