cosmology – One Universe at a Time
https://briankoberlein.com
Brian Koberlein

The Black Hole At The Edge Of The Universe
https://briankoberlein.com/2017/12/12/black-hole-edge-universe/
Tue, 12 Dec 2017

The most distant quasar ever observed challenges our understanding of how black holes formed.

The post The Black Hole At The Edge Of The Universe appeared first on One Universe at a Time.

]]>

Within most galaxies there lurks a supermassive black hole. Our own galaxy, for example, contains a black hole 4 million times more massive than our Sun. One of the big mysteries of these black holes is just how they formed, and how long it took them to reach such a massive size. Now a massive black hole at the edge of the observable universe challenges our understanding of them.

We discovered this distant black hole because it is a quasar. When a black hole captures nearby material, the material becomes superheated and radiates powerful radio waves and X-rays. These beacons of light are so bright they can only be powered by supermassive black holes. By observing the brightness of a distant quasar, we can calculate the mass of the black hole that powers it.

Recently astronomers discovered the most distant quasar yet. Known as J1342+0928, it is so distant that we see it as it was when the universe was only 690 million years old. At that time galaxies were just starting to form. But this quasar gives off so much light that its black hole must be 800 million times the mass of our Sun. So how did this particular black hole get so massive so soon? It's possible that it sits in a rather dense region of space. Having lots of matter around would make it easier to grow quickly. But that alone isn't enough to solve the mystery, because the faster a black hole consumes matter, the more light that matter emits, and the pressure of that light and heat tends to push surrounding matter away from the black hole. This is known as the Eddington limit, and it puts an upper bound on how fast a black hole can grow. To reach 800 million solar masses in such a short time, the black hole would have to consume matter fairly close to this limit.
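That tension can be made concrete with a rough back-of-the-envelope calculation. At the Eddington limit, a black hole's mass grows exponentially, with an e-folding time set by the Salpeter timescale. Here is a minimal Python sketch; the 100-solar-mass seed and the 10% radiative efficiency are assumptions for illustration, not values from the study:

```python
import math

# Physical constants (SI units)
SIGMA_T = 6.652e-29   # Thomson scattering cross section, m^2
C = 2.998e8           # speed of light, m/s
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
M_P = 1.673e-27       # proton mass, kg
YEAR = 3.156e7        # seconds per year

def efolding_time_myr(efficiency=0.1):
    """e-folding time (Myr) for Eddington-limited growth.

    The Salpeter timescale sigma_T * c / (4 pi G m_p) is about 450 Myr;
    scaling by efficiency / (1 - efficiency) gives the mass e-folding time.
    """
    t_salpeter = SIGMA_T * C / (4 * math.pi * G * M_P)  # seconds
    return (efficiency / (1 - efficiency)) * t_salpeter / YEAR / 1e6

def growth_time_myr(seed_mass, final_mass, efficiency=0.1):
    """Time (Myr) to grow from seed_mass to final_mass at the Eddington limit."""
    return efolding_time_myr(efficiency) * math.log(final_mass / seed_mass)

# An assumed 100-solar-mass seed growing to 800 million solar masses
# takes roughly 800 Myr, longer than the 690 Myr available to J1342+0928.
t = growth_time_myr(100, 8e8)
```

Even with generous assumptions, the growth time lands close to or beyond the age of the universe at the time we observe the quasar, which is exactly why such early supermassive black holes are puzzling.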

The big question about this black hole is whether it is simply an unusually early bloomer, or if it is rather typical of black holes in the early universe. To answer that question we will need to find more examples of large quasars on the edge of the cosmos. So the race is on to find these most distant of beacons. If we can find them, they will help us understand how supermassive black holes and their galaxies formed.

A Universe Of Antimatter
https://briankoberlein.com/2017/11/24/a-universe-of-antimatter/
Fri, 24 Nov 2017

If our universe were made of antimatter, what would it look like?

The post A Universe Of Antimatter appeared first on One Universe at a Time.


Our universe is dominated by matter. Sure, there is dark matter and dark energy, but things like stars, planets and people are made of matter: protons, electrons, neutrons and the like. But matter seems to come in pairs. For every electron created, an antimatter positron is created. For every proton that appears, so does an antiproton. Since our universe is dominated by matter, what if there is another universe dominated by antimatter? What would an antimatter universe look like?

The basic difference between matter and antimatter is that they have opposite charges. A proton has a positive charge, while an antiproton has a negative one. Positively charged positrons are the antimatter version of negatively charged electrons. What's interesting is that the signs of electric charge are a fluke of history. We could just as well have assigned a positive charge to electrons and a negative one to protons. There's nothing special about choosing one or the other. So you might think an antimatter universe would look exactly like our regular one. But matter and antimatter have subtle differences.

One of the main differences has to do with neutrinos. Neutrinos don't have any charge, so if the sign of charge were the only difference between matter and antimatter, "antimatter" neutrinos would be identical to "matter" neutrinos. But it turns out they are slightly different. Neutrinos have a property called helicity, which describes whether they spin to the left or the right as they travel through space. Matter neutrinos have left-handed helicity, while antimatter ones have right-handed helicity. That might not seem like a big deal, but in 1956 Chien-Shiung Wu studied the radioactive decay of cobalt-60 atoms. She found that the decay products were emitted preferentially in one direction relative to the nuclear spin, showing that nature distinguishes left from right. Since handedness differs between matter and antimatter, the two might decay at different rates. This might be the reason why we don't see lots of antimatter in the universe.

But suppose there was an antimatter universe that had lots of anti-hydrogen and anti-helium after its big bang, just as our early universe had lots of hydrogen and helium. It would seem reasonable that these could fuse to heavier antimatter elements in the cores of antimatter stars, and this could produce antimatter planets and perhaps even antimatter life. What would these creatures see when they look up into their night sky?

In this case we know it would look much like our own night sky. Recently we've been able to produce anti-hydrogen, and we have looked at the type of light it produces. We found that anti-hydrogen produces the same kind of light as regular hydrogen. So an antimatter Sun would emit the same light as our Sun. Light would reflect off an antimatter moon just as it does off our Moon, and our antimatter cousins would see a sky filled with stars, nebulae and planets, just like we do.

Of course all of this is based upon the assumption that antimatter would collapse under gravity to form stars in the first place. We think that should be the case, but what if antimatter also had anti-mass? What if anti-atoms gravitationally repelled each other? In that case, an antimatter universe would never form stars or galaxies. Our antimatter universe would simply be filled with traces of anti-hydrogen and anti-helium, and nothing would ever look up at the cosmic sky.  While we think antimatter has regular mass, we haven’t created enough of it in the lab to test the idea. For now we can’t be sure.

So it is quite possible that an antimatter universe would look nearly identical to our own. But it could be that an antimatter universe would be nothing but cold gas. It's even possible that the radioactive decay of antimatter is so different from that of matter that an antimatter universe can't exist at all.

Measuring The Universe With Gravitational Waves
https://briankoberlein.com/2017/11/22/measuring-universe-gravitational-waves/
Wed, 22 Nov 2017

The recent merger of two neutron stars produced both visible light and gravitational waves, and that lets us determine just how fast the universe is expanding.

The post Measuring The Universe With Gravitational Waves appeared first on One Universe at a Time.


Last year astronomers made the first detection of gravitational waves from the merging of two black holes. It gave us an entirely new way to view the cosmos. Now we aren’t limited by the emission and absorption of light by matter. We can explore the universe through ripples in the fabric of spacetime itself. Through recent observations we can study the most mysterious aspect of spacetime, known as dark energy. 

Dark energy is what causes the expansion of the universe to accelerate. It makes up about 70% of the universe, but we don't really know what it is. One reason for this is that we don't know exactly how fast the universe is expanding. Cosmic expansion is typically described by the Hubble parameter H0. Because the universe expands, more distant galaxies appear to be moving away from us faster than closer galaxies. The velocity of a distant galaxy is related to its distance by v = H0 d. We can measure the speed of a galaxy through the redshift of its light: the greater the galaxy's speed, the more its light is shifted toward longer (red) wavelengths.
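Hubble's law itself is a simple proportionality, which a few lines of Python can illustrate (the H0 value here is one of the measured values discussed in this article, taken as an assumption):

```python
H0 = 67.6  # Hubble parameter, (km/s)/Mpc; one measured value, assumed here

def recession_velocity(distance_mpc):
    """Apparent recession velocity (km/s) from Hubble's law: v = H0 * d."""
    return H0 * distance_mpc

def hubble_distance(velocity_km_s):
    """Distance (Mpc) inferred from a recession velocity: d = v / H0."""
    return velocity_km_s / H0

v = recession_velocity(100)   # a galaxy 100 Mpc away recedes at ~6,760 km/s
```

The entire difficulty discussed below is in pinning down the single number H0, since the velocities come almost for free from redshifts while the distances do not.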

Various methods used in the cosmic distance ladder. Credit: Wikipedia

Knowing the distance of a galaxy and its observed redshift, we can determine the Hubble parameter. When we do this for lots of galaxies, we get a value of about H0 = 67.6 (km/s)/Mpc. But there is a catch. We can't measure the distances to the farthest galaxies directly. We use what's known as the cosmic distance ladder: one type of measurement determines the distances of nearby stars, that result combines with other observations to give distances to close galaxies, and those in turn are used to measure more distant galaxies. Each step in the ladder has its own advantages and disadvantages, and if one rung is off, it throws off all the rungs above it.

Fortunately, we have other ways to measure the Hubble parameter. One of these is through the cosmic microwave background. This remnant echo of the big bang has small fluctuations in temperature. The size of these fluctuations tells us the rate of cosmic expansion (among other things). Observations by the Planck satellite give a Hubble parameter value of about H0 = 67.7 (km/s)/Mpc.

But other methods of measuring the Hubble parameter give slightly different results. For example, one method looks at how light is gravitationally lensed by distant galaxies. Gravitational lensing can create multiple images of a distant supernova, and since each image takes a different path around the galaxy, the images arrive at different times. The timing of these images can be used to determine the Hubble parameter, and the result is about H0 = 71.9 (km/s)/Mpc. A different method using distant supernovae gives a result as high as H0 = 73 (km/s)/Mpc. So what is the real value of the Hubble parameter?

This is where gravitational waves come in. All of the measurements of the Hubble parameter so far rely upon observations of light. Gravitational waves provide us an entirely new method to measure cosmic distances. As two black holes or neutron stars begin to merge, they spiral ever closer to each other, creating gravitational waves we can detect. The frequency of these waves depends upon their masses, and their masses determine how much energy they produce when they merge. By comparing the energy they produce with the strength of the gravitational waves we observe, we know their distance. This is similar to the way standard candles are used in optical astronomy, where we know the actual brightness of a star or galaxy, and compare it to the observed brightness to determine distance. In fact, this new method has been termed a standard siren.
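The contrast between standard candles and standard sirens comes down to how the observed signal falls off with distance: light flux drops as 1/d², while gravitational-wave strain drops as 1/d. A minimal sketch, with made-up reference values for illustration:

```python
import math

def candle_distance(luminosity, flux):
    """Standard candle: flux = L / (4 pi d^2), so d = sqrt(L / (4 pi * flux))."""
    return math.sqrt(luminosity / (4 * math.pi * flux))

def siren_distance(strain_at_ref, ref_distance, observed_strain):
    """Standard siren: strain falls off as 1/d, so d = d_ref * h_ref / h_obs."""
    return ref_distance * strain_at_ref / observed_strain

# Suppose waveform modeling says a merger would produce a strain of 1e-21
# at a reference distance of 1 Mpc, and we actually observe 2.5e-23:
d = siren_distance(1e-21, 1.0, 2.5e-23)   # 40 Mpc
```

The key point is that the siren's "intrinsic brightness" comes from physics (the masses inferred from the waveform) rather than from a calibration ladder, which is what makes the method independent of the other rungs.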

But distance isn't enough to determine the Hubble parameter. We also need the speed at which the source is moving away from us. We aren't able to measure the redshift of gravitational waves, so we can't use them to measure speed. But when two neutron stars merge they produce both gravitational waves and light. For one such merger, we observed not only the light produced, but also its redshift. From that we can find the Hubble parameter. Since the distance is found directly from gravitational waves, it doesn't rely upon the cosmic distance ladder or an assumed model of cosmic expansion. From this event astronomers found H0 = 70 (km/s)/Mpc.

While that result points to a larger Hubble parameter, the uncertainty is quite large. Based on the data, the value could be as large as 82 or as small as 62. But this is only one measurement. As more mergers are observed, we will get more precise results. So gravitational waves will help us pin down the Hubble parameter. It's only a matter of time, and space.

Paper: The LIGO Scientific Collaboration, et al. A gravitational-wave standard siren measurement of the Hubble constant. Nature 551, 85–88. doi:10.1038/nature24471 (2017)

Dark Matter Isn’t Warm And Fuzzy
https://briankoberlein.com/2017/08/03/dark-matter-isnt-warm-fuzzy/
Thu, 03 Aug 2017

A new survey of distant quasars shows that dark matter isn't warm and fuzzy.

The post Dark Matter Isn’t Warm And Fuzzy appeared first on One Universe at a Time.


Dark matter is one of the big mysteries of cosmology. Theoretically it explains cosmic phenomena such as the scale at which galaxies cluster, and observationally we see its effects through things like gravitational lensing, but it hasn't been observed directly. This means we have a limited understanding of its exact nature. As a result, there have been lots of theoretical ideas about what dark matter could be. But we now know that whatever dark matter is, it isn't warm and fuzzy.

There are two broad aspects of dark matter that no one disagrees about (assuming it exists). The first is that it must be dark, meaning it doesn't interact much with light. If it did, we would see its effects through the absorption or scattering of light from stars and distant galaxies. The second is that it must have mass, since the models require that it interact with regular matter gravitationally. Beyond that, almost anything goes.

Most models assume that dark matter is cold. In this context, cold vs. warm refers to the speed at which dark matter particles typically move. In cold dark matter models, the particles are relatively heavy, with a mass similar to that of protons or more. Because of their high mass, these particles move relatively slowly, at much the same speed as the gas and dust in our galaxy. Neutrinos, on the other hand, don't interact strongly with light and they do have mass, so they meet the basic requirements of dark matter. But neutrino mass is minuscule, and they typically move at speeds approaching the speed of light. Thus, neutrinos are an example of warm or hot dark matter.

One of the things we observe about galaxies is that they have far more mass than their visible matter would suggest, so they must contain a lot of dark matter. This means dark matter clumps together just as gas and dust clump to form galaxies. Warm dark matter such as neutrinos moves much too quickly to clump together in this way, so it would seem that dark matter must be cold.

While cold dark matter is a central part of the standard "concordance model" of cosmology, it isn't without problems. One of the biggest is that cold dark matter predicts that large spiral galaxies like our Milky Way should have hundreds of small satellite galaxies surrounding them. We've only found about a dozen. Even the distribution of stars within these dwarf galaxies doesn't fit the dark matter model very well.

While warm dark matter like neutrinos doesn’t fit the data well, there are other warm models that might. They solve some of the issues with warm dark matter by suggesting dark matter is also “fuzzy.” This refers to its quantum nature. All matter has a quantum aspect to it. For example, an electron doesn’t orbit the nucleus of an atom like a planet around the Sun. Instead, the electron is in a “fuzzy” quantum state within the atom. Normally the fuzzy nature of quantum particles only acts at short distances, on the scale of a few atoms, but under the right conditions this kind of fuzzy quantum behavior can occur over large distances. In the fuzzy dark matter model, the dark matter particles can interact quantum mechanically over great distances, thus allowing them to behave in ways similar to cold dark matter.

Several computer simulations of the universe agree with the cold dark matter model on large scales, but a new study specifically looked at how the warm fuzzy model compares. To do this the team used observations from more than 100 quasars. Quasars are distant objects powered by the supermassive black holes in the centers of galaxies. They give off tremendous amounts of light and energy, and so we can see them across billions of light years. As the light from these quasars travels across the cosmos to reach us, it is distorted by diffuse filaments of hydrogen gas between galaxies, known as the intergalactic medium. The distribution of hydrogen in the intergalactic medium allows us to study how clusters of galaxies formed. The team compared this data to both cold and warm-fuzzy dark matter models. They found the warm-fuzzy model didn’t agree with observation. That doesn’t mean that warm-fuzzy dark matter doesn’t exist, but if it does exist it must be so diffuse and have such an extraordinarily tiny mass that it couldn’t have caused the clustering of galaxies we observe.

Cold dark matter still has its own problems, and the nature of dark matter still holds many mysteries. But we now know that for the most part dark matter isn’t warm and fuzzy.

Paper: Vid Iršič, et al. First Constraints on Fuzzy Dark Matter from Lyman-α Forest Data and Hydrodynamical Simulations. Phys. Rev. Lett. 119, 031302 (2017)

How To Define Distance In An Expanding Universe
https://briankoberlein.com/2017/05/28/define-distance-expanding-universe/
Sun, 28 May 2017

On a cosmic scale the notion of distance is more subtle than you might think.

The post How To Define Distance In An Expanding Universe appeared first on One Universe at a Time.


Recently the Sloan Digital Sky Survey (SDSS) completed the largest map of the universe thus far. The map focuses on the positions of quasars. These objects are powered by supermassive black holes in the centers of galaxies, and are so bright they can be seen from the farthest regions of the cosmos. Most quasars are so far away that we have to rethink what "distance" even means. In an expanding universe, distance can be defined in a variety of ways.

For the stars we see in the night sky, their distance is just what you’d expect: the physical distance from the Sun to the star. The bright star Sirius, for example, is 2.6 parsecs away. A parsec is defined by the method used to measure stellar distances, known as parallax. As the Earth orbits the Sun, its view of the stars shifts very slightly. Nearby stars shift more than distant ones, and this is known as a parallax shift. The bigger the parallax, the closer the star. If a star were one parsec away, its parallax would be 1 arcsecond. There are 360 degrees in a circle. If you took a single degree and divided it into 3600 parts, each part would be an arcsecond.
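The parallax rule itself is simple: a star's distance in parsecs is one over its parallax in arcseconds. A small sketch (the 0.379 arcsecond parallax used for Sirius is a commonly quoted value, taken here as an assumption):

```python
LY_PER_PARSEC = 3.2616   # light years per parsec

def parallax_distance_pc(parallax_arcsec):
    """Distance in parsecs from annual parallax: d = 1 / p, with p in arcseconds."""
    return 1.0 / parallax_arcsec

d_pc = parallax_distance_pc(0.379)   # ~2.64 parsecs
d_ly = d_pc * LY_PER_PARSEC          # ~8.6 light years, matching Sirius
```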

The parallax of even nearby stars is small because they are so very far away. While astronomers often use parsecs for distance, a more common measure is the time it takes light from the star to reach us. For Sirius, that is about 8.6 light years, meaning the starlight we observe left Sirius about 8.6 years ago. Of course that distance changes a bit over time, since Sirius is moving relative to the Sun. Even if we could travel to Sirius at the speed of light, we would have to account for its changing location. But this change in distance is small compared to the distance itself.

Because stellar parallax is so small, it can only be used for stars out to about 10,000 light years or so. Beyond that the parallax is simply too small to measure. For more distant objects such as galaxies we have to use other methods. One popular method uses variable stars known as Cepheid variables. Cepheid variables have a particular relation between their overall brightness and how quickly they vary from bright to dim. By timing their variation we can determine their intrinsic brightness, and comparing that with their observed brightness gives their distance. Observations of Cepheid variables in the Andromeda galaxy, for example, show that it is about 766,000 parsecs away, or 2.5 million light years. Just as with stars, the distance of a galaxy changes over time. Over the course of a 2.5 million year journey to Andromeda, the galaxy would move by about 1,500 light years. That's still a small fraction of its overall distance, but it's not insignificant.
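The Cepheid procedure can be sketched numerically: the period gives an absolute magnitude through a period-luminosity relation, and the distance modulus then converts apparent and absolute magnitude into a distance. The coefficients below are roughly one published calibration, and the example star is invented, so treat this as an illustration only:

```python
import math

def cepheid_absolute_magnitude(period_days):
    """V-band absolute magnitude from a period-luminosity relation.

    The coefficients are approximately one published calibration,
    used here as an assumption for illustration.
    """
    return -2.43 * (math.log10(period_days) - 1.0) - 4.05

def distance_pc(apparent_mag, absolute_mag):
    """Invert the distance modulus m - M = 5 log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A hypothetical Cepheid with a 10-day period observed at magnitude 20.4:
M = cepheid_absolute_magnitude(10.0)   # -4.05
d = distance_pc(20.4, M)               # ~780,000 pc, about Andromeda's distance
```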

With more distant galaxies, distance becomes much more complicated. If we measure the motion of various galaxies, usually through the redshift of their light, we find that the more distant the galaxy, the greater its redshift. This is due to the overall expansion of the cosmos. As space itself expands, the overall distance between galaxies increases, and this cosmic expansion throws a serious wrench into the meaning of cosmic distance.

The most direct quantity we can measure for a distant galaxy is its redshift. Usually this is expressed as z, the fractional amount by which a particular wavelength has changed from its unshifted value. The highest redshift we have observed is about z = 12, so let's consider a galaxy with about half that amount, z = 6. Just how far away is such a galaxy?
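The definition of z is simple enough to state directly in code. A small sketch using hydrogen's Lyman-alpha line (rest wavelength about 121.6 nm) as the example:

```python
def redshift(observed_wavelength, emitted_wavelength):
    """Fractional wavelength shift: z = (observed - emitted) / emitted."""
    return (observed_wavelength - emitted_wavelength) / emitted_wavelength

# Lyman-alpha emitted at 121.6 nm but observed at 851.2 nm gives z = 6:
z = redshift(851.2, 121.6)
```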

Redshift can be caused by two things: the motion of a galaxy through space (often called Doppler redshift) and the expansion of space itself (often called cosmological redshift). We can't distinguish between them observationally, but we know from observations of the motions of local galaxies that the Doppler shift tends to be rather small. So it's safe to assume that for distant galaxies redshift is almost entirely due to cosmic expansion. To calculate distances, we then have to look at how the universe expands over time, and this depends on which cosmological model we use. Typically this is the concordance model, or ΛCDM model, which is the standard dark matter and dark energy dominated universe model. Assuming this model is accurate (and we have lots of reasons to think it is), we can calculate galactic distances. But we have to be careful about how we define them.

Suppose we use the parsec definition above. That is, based upon the light we currently see, how far away is a quasar with redshift z = 6? Another way to ask this would be "How far away was the quasar when the light left it?" This turns out to be about 1.2 billion parsecs. It's tempting to convert this to light years and say it was about 3.9 billion light years away, but this is misleading. Because the cosmos was expanding as the light traveled to us, it actually took the light about 12.8 billion years to reach us. So its light travel time distance is 12.8 billion light years. This is the most commonly quoted "distance," since it's easy to compare with the age of the light. When we observe a quasar with a redshift of z = 6, we see the universe as it was 12.8 billion years ago.

Unfortunately this can also make things more confusing. Given that the travel time distance is 12.8 billion years, one might assume the quasar is now about 12.8 billion light years away, rather than the 3.9 billion light years it was when the light left it. But the light we observe was traveling toward us as the universe expanded, while the quasar it left behind moved away from us with the expansion of space. This comoving distance is about 8.4 billion parsecs, equivalent to 27 billion light years.
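All three of these distances can be reproduced with a short numerical integration of the ΛCDM expansion history. A sketch assuming a flat universe with illustrative parameters (roughly 31% matter and 69% dark energy, H0 = 67.7 (km/s)/Mpc):

```python
import math

# Flat LambdaCDM parameters; the exact values are assumptions for illustration
H0 = 67.7            # Hubble parameter, (km/s)/Mpc
OMEGA_M = 0.31       # matter fraction
OMEGA_L = 0.69       # dark energy fraction
C_KM_S = 299792.458  # speed of light, km/s

def E(z):
    """Dimensionless expansion rate H(z) / H0 for a flat universe."""
    return math.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)

def comoving_distance_mpc(z, steps=20000):
    """Comoving distance: integrate c dz' / H(z') from 0 to z (trapezoid rule)."""
    dz = z / steps
    total = 0.5 * (1 / E(0) + 1 / E(z))
    for i in range(1, steps):
        total += 1 / E(i * dz)
    return (C_KM_S / H0) * total * dz

def lookback_time_gyr(z, steps=20000):
    """Light travel time: integrate dz' / ((1 + z') H(z')) from 0 to z."""
    hubble_time_gyr = 977.8 / H0   # 1/H0 converted to Gyr
    dz = z / steps
    total = 0.5 * (1 / E(0) + 1 / ((1 + z) * E(z)))
    for i in range(1, steps):
        zi = i * dz
        total += 1 / ((1 + zi) * E(zi))
    return hubble_time_gyr * total * dz

z = 6
d_comoving = comoving_distance_mpc(z)   # ~8,400 Mpc (~27 billion light years)
d_emitted = d_comoving / (1 + z)        # ~1,200 Mpc: distance when the light left
t_travel = lookback_time_gyr(z)         # ~12.8 billion years of light travel time
```

The comoving distance divided by (1 + z) recovers the distance at emission, which is why the three numbers quoted in the text all describe the same quasar.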

Each of these distances is valid in its own way, even though they are all quite different. That’s why astronomers often stick to redshift.

The Dark Web
https://briankoberlein.com/2017/04/23/the-dark-web/
Sun, 23 Apr 2017

A study of thousands of galaxy pairs shows dark matter filaments exist between galaxies.

The post The Dark Web appeared first on One Universe at a Time.


Dark matter is difficult to study. Since it doesn’t interact with light, it is basically invisible. But it does have mass, and that means it deflects light ever so slightly, an effect known as weak gravitational lensing. By observing the way light from distant galaxies is distorted, we can map the distribution of mass between us and the galaxies. Comparing this to the visible matter of galaxies allows us to map the presence of dark matter. This technique works well when measuring large regions of dark matter, such as the halos around galaxies, but gravitational lensing is such a weak effect it’s difficult to study the detailed structure of dark matter. That’s unfortunate, because the details are what we need to understand the nature of dark matter. 

A computer simulation showing filaments of dark matter between clusters of galaxies. Credit: Michael L. Umbricht

The dominant model for dark matter makes several predictions we can test. For example, it predicts that dark matter will clump together gravitationally, and that means galaxies will also cluster together at a particular scale. This is exactly the clumping scale we observe across the cosmos. But there are also predictions we can’t easily test, such as dark matter filaments. As dark matter clumps together, some of the dark matter should be left behind, forming filaments of dark matter that connect galaxies and clusters of galaxies. These filaments have long been thought to exist, but detecting them is extremely difficult. Their gravitational influence is so small any weak lensing they produce is almost indistinguishable from random noise.

There has been some evidence of dark matter filaments. Comparisons of faint lensing between galaxies agree with models of dark matter filaments, but with weak data you have to be careful not to presume too much about your model. A new paper in MNRAS tries to overcome this issue by taking a different approach. Rather than trying to observe filaments within a single cluster of galaxies, the team looked at data from thousands of filaments.

Taking data from the Baryon Oscillation Spectroscopic Survey, they focused on about 23,000 pairs of Luminous Red Galaxies (LRGs). These galaxies are particularly bright and easy to distinguish from other galaxies. They also have very similar structures, which makes them useful to study statistically. The team then measured the weak lensing between these pairs of galaxies. Individually the lensing between any one pair would be hard to distinguish from random distortions, so the team combined the data from all the pairs into an overall average. In this way random distortions tend to wash out, leaving only the effects of dark matter. The result is a statistical image of the dark matter filaments between galaxy pairs.
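The statistical trick here, averaging many noisy measurements so that random distortions cancel while a common signal survives, can be sketched with simulated data. The signal and noise levels below are invented; only the 1/sqrt(N) behavior matters:

```python
import random

random.seed(42)   # fixed seed so the sketch is reproducible

TRUE_SIGNAL = 0.02   # invented per-pair lensing signal
NOISE_SIGMA = 1.0    # invented per-pair random distortion, far larger than the signal

def one_pair_measurement():
    """A single galaxy pair: the common signal buried in much larger noise."""
    return TRUE_SIGNAL + random.gauss(0.0, NOISE_SIGMA)

n_pairs = 23000
stacked = sum(one_pair_measurement() for _ in range(n_pairs)) / n_pairs
# The noise in the average shrinks to about NOISE_SIGMA / sqrt(n_pairs) ~ 0.0066,
# so the 0.02 signal now stands out even though no single pair reveals it.
```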

While the result is statistical, it doesn’t rely upon a dark matter model to infer its presence. It also agrees with the statistical predictions of dark matter filaments. It’s yet another success for the dark matter model.

Paper: Seth D. Epps et al. The weak-lensing masses of filaments between luminous red galaxies. Monthly Notices of the Royal Astronomical Society. DOI: 10.1093/mnras/stx517 (2017)

Who Needs Dark Energy?
https://briankoberlein.com/2017/04/11/needs-dark-energy/
Tue, 11 Apr 2017

Do we really need dark energy to explain cosmic expansion?

The post Who Needs Dark Energy? appeared first on One Universe at a Time.


Our universe is expanding. We’ve known this for nearly a century, and modern observations continue to support this. Not only is our universe expanding, it is doing so at an ever increasing rate. But the question remains as to what drives this cosmic expansion. The most popular answer is what we call dark energy. But do we need dark energy to account for an expanding universe? Perhaps not. 

The idea of dark energy comes from a property of general relativity known as the cosmological constant. The basic idea of general relativity is that the presence of matter warps spacetime. As a result, light and matter are deflected from simple straight paths in a way that resembles a gravitational force. The simplest mathematical model in relativity just describes this connection between matter and curvature, but it turns out that the equations also allow for an extra parameter, the cosmological constant, that can give space an overall rate of expansion. The cosmological constant perfectly describes the observed properties of dark energy, and it arises naturally in general relativity, so it’s a reasonable model to adopt.

In classical relativity, the presence of a cosmological constant simply means that cosmic expansion is a property of spacetime. But our universe is also governed by quantum theory, and the quantum world doesn't play well with the cosmological constant. One solution is that quantum vacuum energy might be driving cosmic expansion, but in quantum theory vacuum fluctuations would probably make the cosmological constant vastly larger than what we observe, so it isn't a very satisfactory answer.

Despite the unexplainable weirdness of dark energy, it matches observations so well that it has become part of the concordance model for cosmology, also known as the ΛCDM model. Here the Λ is the symbol for dark energy, and CDM stands for Cold Dark Matter. In this model there is a simple way to describe the overall shape of the cosmos, known as the Friedmann–Lemaître–Robertson–Walker (FLRW) metric. The only catch is that this assumes matter is distributed evenly throughout the universe. In the real universe matter is clumped together into clusters of galaxies, so the FLRW metric is only an approximation to the real shape of the universe. Since dark energy makes up about 70% of the mass/energy of the universe, the FLRW metric is generally thought to be a good approximation. But what if it isn’t?

A new paper argues just that. Since matter clumps together, space would be more highly curved in those regions. In the large voids between clusters of galaxies, there would be less curvature. Relative to the clustered regions, the voids would appear to expand more quickly, mimicking the effect of dark energy. Using this idea, the team ran computer simulations of a universe with this clustering effect but without dark energy. They found that the overall structure evolved similarly to that in dark energy models. That would seem to support the idea that dark energy might simply be an effect of clustered galaxies.

It's an interesting idea, but there are reasons to be skeptical. While such clustering can have some effect on cosmic expansion, it wouldn't be nearly as strong as what we observe. And while this particular model seems to explain the scale at which the clustering of galaxies occurs, it doesn't explain other observations, such as those of distant supernovae, which strongly support dark energy. Personally, I don't find this new model very convincing, but I think ideas like this are certainly worth exploring. If the model can be further refined, it could be worth another look.

Paper: Gabor Rácz, et al. Concordance cosmology without dark energy. Monthly Notices of the Royal Astronomical Society: Letters DOI: 10.1093/mnrasl/slx026 (2017)

The post Who Needs Dark Energy? appeared first on One Universe at a Time.

]]>
https://briankoberlein.com/2017/04/11/needs-dark-energy/feed/ 6
Doomsday Scenario https://briankoberlein.com/2017/03/14/doomsday-scenario/ https://briankoberlein.com/2017/03/14/doomsday-scenario/#comments Tue, 14 Mar 2017 15:42:51 +0000 https://briankoberlein.com/?p=6545

Could the Universe collapse and destroy everything? Probably not.

The post Doomsday Scenario appeared first on One Universe at a Time.

]]>

Humans are mortal. Not just as individuals, but also as a species. We can defend against many of the existential dangers to humanity. Threats such as global warming and pollution are well understood, and we can take steps to address them if we have the will. Even cosmic threats such as a civilization-ending impact can be mitigated given time. But what about a deeper cosmic threat? What if the Universe could destroy not only our planet, but the entire galaxy, and what if we could never see it coming?

Recently there’s been buzz about an idea known as the false vacuum scenario, and it’s terrifying to think of.

Usually a physical system will try to get to the lowest energy state it can, releasing that energy in some form. In classical physics, if a system reaches a state of low energy it will remain there even if a lower energy state is possible. Imagine a ball rolling into a small valley on the side of a mountain. If the ball could get out of the valley it would roll even farther down the mountain. But the ball has no way to get out of the valley, so it will remain there indefinitely.

However, in quantum mechanics this isn’t the case. If a quantum system reaches a state of low energy, it might remain there for a time, but it won’t remain there forever. Because of an effect known as quantum tunneling, a quantum system can break out of its little valley and head toward an even lower energy state. Given enough time, a quantum system will eventually reach the lowest energy state possible.

The observed mass of the Higgs boson supports the idea that the Universe is in a metastable state. Credit: Wikipedia

Our Universe is a quantum system, so one of the big questions is whether it happens to be stable and in the lowest energy state, or in a higher energy state and only metastable. In the standard model of particle physics, this question can be answered from the masses of the Higgs boson and the top quark. These two masses can be used to determine whether the vacuum state of the electroweak force is stable or metastable. Current observations point to it being metastable, which means the current state of the Universe might be temporary. If so, the Universe could collapse into a lower energy state at any time. If it does, then everything in the Universe would be destroyed, and there would be no way to see it coming. We would just exist one moment, and dissolve into quantum chaos the next.

But how likely is such a scenario? It’s tempting to argue that since the Universe has existed just fine for nearly 14 billion years, it will probably exist for billions more. But that’s not how probability works. If you toss a fair coin ten times and it comes up heads each time, that doesn’t mean it will likely come up heads the next ten times. The odds on each toss are 50/50, and just because you got lucky the first ten times doesn’t mean you will on toss eleven. However, there is also the possibility that your coin isn’t fair, in which case you would expect to keep seeing heads. So if you get heads ten times in a row, what are the odds that the coin is fair?

The more likely the doomsday scenario, the less likely it is that Earth would have formed so late.

We can use this idea to estimate the likelihood of the false vacuum scenario. We live in a Universe that is about 14 billion years old, and Earth formed when the Universe was about 9 billion years old. If the false vacuum scenario were highly likely, the odds of our planet forming so late in the game would be tiny. The more stable the Universe is likely to be, the more probable a late-forming Earth becomes. As with the coin toss, the fact that we live on a planet that formed only 5 billion years ago means the odds of cosmic destruction must be quite small. Doing the math, the risk works out to less than about one chance in 1.1 billion per year.
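
The flavor of this argument can be sketched with a toy calculation (a much-simplified stand-in for the paper's full analysis, which also models the planet formation rate): treat vacuum decay as a Poisson process with an unknown rate, and ask how large that rate could be before Earth's late formation becomes an unlikely fluke. The confidence threshold below is an illustrative choice, and the result only agrees with the paper's 1-in-1.1-billion-years figure at the order-of-magnitude level.

```python
import math

# Toy survival argument: model vacuum decay as a Poisson process with
# unknown rate lam (events per year). If Earth formed t_earth years after
# the big bang, the probability the Universe survived that long is
# exp(-lam * t_earth).

t_earth = 9.1e9  # years after the big bang when Earth formed (approx.)

def max_decay_rate(confidence=0.95):
    """Largest decay rate (per year) consistent with Earth's late
    formation not being a fluke at the given confidence level."""
    # Require exp(-lam * t_earth) >= 1 - confidence, then solve for lam.
    return -math.log(1 - confidence) / t_earth

lam = max_decay_rate()
print(f"decay rate < {lam:.2e} per year (~1 per {1/lam/1e9:.1f} billion years)")
```

With these assumptions the bound comes out to roughly one chance in a few billion per year, the same ballpark as the published figure.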

So even if the Universe is metastable (and we still don’t know for sure) it is at least very, very stable. There are lots of other existential threats that are more likely, and we would do well to focus on them. If we rise to the challenge there is still plenty of time to explore the stars.

Paper: Max Tegmark and Nick Bostrom. Is a doomsday catastrophe likely? Nature 438, 754 (2005)


]]>
https://briankoberlein.com/2017/03/14/doomsday-scenario/feed/ 8
Bigger, Stronger, Faster https://briankoberlein.com/2017/02/01/bigger-stronger-faster/ https://briankoberlein.com/2017/02/01/bigger-stronger-faster/#comments Wed, 01 Feb 2017 12:00:23 +0000 https://briankoberlein.com/?p=6450

New observations of lensed quasars show the Universe is expanding faster than expected. But these results raise questions about the assumptions of our cosmological models.

The post Bigger, Stronger, Faster appeared first on One Universe at a Time.

]]>

We’ve known for nearly a century that the Universe is expanding. The fact that galaxies are receding away from us was first demonstrated by Edwin Hubble in 1929, building upon the work of Henrietta Leavitt and others. Since then we’ve developed a variety of ways to measure the rate of cosmic expansion, and while they are broadly in agreement, there are small discrepancies between them. As a result we still don’t know exactly how fast the Universe is expanding, as astrophysicist Ethan Siegel has so clearly explained. Now a new method of measuring cosmic expansion may settle the issue, but it also raises more questions.

It all comes down to a physical parameter known as the Hubble constant. The bigger the Hubble constant, the greater the rate of cosmic expansion. The value of the constant also tells us the age of the Universe. If you trace the expansion backwards through time, you reach the point where the Universe was extremely hot and dense, commonly known as the big bang.

Hubble’s original measurement of the constant compared the distances of galaxies with the redshift of their light. He calculated galactic distances by measuring the brightness of variable stars known as Cepheid variables, and combined them with measurements of galactic redshifts made by Vesto Slipher. He found that more distant galaxies had greater redshifts, and presumably were receding from us at a greater rate. Hubble’s original value for the constant was about 500 km/s/Mpc, which caused a bit of a cosmological crisis. If the value was correct, the Universe was only about 2 billion years old, which contradicted geological evidence that showed the Earth was more than 4 billion years old.
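
The connection between the Hubble constant and cosmic age can be checked with a quick unit conversion, using the naive constant-expansion estimate that the age is roughly 1/H0 (the real age depends on the expansion history, but the rough numbers work out):

```python
# Rough age estimate: for constant expansion, age ~ 1/H0.
MPC_KM = 3.0857e19  # kilometres in one megaparsec
YEAR_S = 3.156e7    # seconds in one year

def hubble_time_gyr(h0_km_s_mpc):
    """Naive 1/H0 age of the Universe in billions of years (Gyr)."""
    h0_per_s = h0_km_s_mpc / MPC_KM       # convert km/s/Mpc to 1/s
    return 1.0 / h0_per_s / YEAR_S / 1e9  # seconds -> Gyr

print(hubble_time_gyr(500))  # Hubble's 1929 value -> about 2 Gyr
print(hubble_time_gyr(70))   # a modern value -> about 14 Gyr
```

This is exactly the crisis described above: 500 km/s/Mpc implies a Universe younger than the Earth, while 70 km/s/Mpc gives the familiar 14 billion years.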

Credits: NASA, ESA, A. Feild (STScI), and A. Riess (STScI/JHU)

Over time our measurements of the Hubble constant got better, and the results settled around a much smaller value of around 70 km/s/Mpc, putting the age of the Universe at about 14 billion years. We also developed different ways to calculate the Hubble constant using different types of data, and they each produced similar results. This means we know for sure that the Universe is expanding, and we have a pretty good handle on just how fast it’s expanding. But while these different methods broadly agreed, they didn’t exactly agree. It was generally thought that as our measurements got better this discrepancy would go away, but it didn’t. Something was clearly wrong with our understanding of cosmic expansion.

 

Modern measurement tensions from the distance ladder (red) with CMB (green) and BAO (blue) data. Image credit: “Cosmological implications of baryon acoustic oscillation measurements”, Aubourg, Éric et al., Phys. Rev. D92 (2015) no. 12, 123516.

Hubble’s method of comparing distance with redshift has been extended by shifting from Cepheid variables to supernovae. A particular type of supernova known as Type Ia allows us to determine galactic distances across billions of light years. In 2016, observations from the Hubble telescope using this approach gave a value of 73.24±1.74 km/s/Mpc, which is on the high side of modern values.

A different approach looks at fluctuations in the cosmic microwave background (CMB). The CMB is the thermal remnant of the big bang, and while it is quite uniform in temperature, there are small-scale fluctuations. As the Universe has expanded, these fluctuations are stretched, so the scale at which fluctuations peak gives a value of the rate of cosmic expansion, and thus the Hubble constant. The most precise CMB measurement of the Hubble constant was made by the Planck satellite, and gave a result of 66.93±0.62 km/s/Mpc, which is on the low side. The Planck result agrees with another method known as baryon acoustic oscillation (BAO), which looks at how galaxies cluster together across billions of light years, which gives a value of 67.6±0.7 km/s/Mpc.
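
The size of the disagreement between the supernova and CMB values can be quantified by treating the two measurements as independent Gaussians, a standard back-of-the-envelope estimate (not part of either team's analysis):

```python
import math

def tension_sigma(v1, e1, v2, e2):
    """Discrepancy between two independent measurements,
    in units of the combined standard deviation."""
    return abs(v1 - v2) / math.hypot(e1, e2)

# Distance-ladder (HST supernovae) vs CMB (Planck) values of H0 in km/s/Mpc
print(tension_sigma(73.24, 1.74, 66.93, 0.62))  # ~3.4 sigma
```

A 3.4-sigma discrepancy is far too large to dismiss as a statistical fluke, which is why the mismatch points to a real problem somewhere in the measurements or the model.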

Temperature fluctuations of the CMB vary at different scales. Credit: NASA/WMAP

These disagreements are problematic because they point to problems in our cosmological model. Although each result is quite sophisticated, they depend upon certain assumptions about the Universe. Our current model, known as the ΛCDM model, includes regular matter, dark matter and dark energy, as well as things such as how flat the Universe is on large scales. Each of these can be measured by independent experiments, but the results all have a bit of uncertainty. Tweak the values of these parameters within the model, and the value of the measured Hubble constant shifts. So we could tweak the model to make the Hubble constant results fit, but tweaking models to fit your expectations is bad science.

Now there’s a new method for determining the Hubble constant, and its result is very interesting.

Diagram showing how distant light can be gravitationally lensed. ALMA (ESO/NRAO/NAOJ), L. Calçada (ESO), Y. Hezaveh et al.

Rather than looking at the CMB or measuring galactic distances, the new approach looks at an effect known as gravitational lensing. As light passes near a large mass such as a star or galaxy, it is gravitationally deflected. As a result, light from a distant object such as a quasar can be deflected around a less distant galaxy. Instead of seeing one image of the distant quasar, we see multiple images. But if we look at these lensed images things get very interesting. Each image of the quasar has taken a different path, and those paths can have different lengths. So some images reach us sooner than others. We’ve seen this effect with distant supernovae, for example, allowing us to see multiple “instant replays” of a supernova over the course of a few decades. Quasars can fluctuate in brightness, which allows us to measure the timing between lensed images of a particular quasar.

In this new approach, the team looked at several lensed quasars, and measured the timing differences. These timing differences are affected by the Hubble constant, so by measuring different lensed quasars the team could get a value for the Hubble constant. The key here is that while the results depend strongly on the value of the Hubble constant, they aren’t affected very much by other model parameters such as the amount of regular matter and dark matter. It’s a more direct measurement, and therefore less dependent on model assumptions. The result they got was 71.9±2.7 km/s/Mpc.

This agrees pretty well with the Hubble results, but not with the CMB results. Since the result is less model dependent, it raises questions about our cosmological model. Why are the CMB and BAO results so much lower than the others? It isn’t clear at this point, and while this new result is great, it doesn’t solve the mystery of Hubble’s constant.

Paper: V. Bonvin, et al. H0LiCOW V. New COSMOGRAIL time delays of HE0435-1223: H0 to 3.8% precision from strong lensing in a flat ΛCDM model. arXiv:1607.01790 [astro-ph.CO] (2017)

 


]]>
https://briankoberlein.com/2017/02/01/bigger-stronger-faster/feed/ 8
Deep Field Black Holes https://briankoberlein.com/2017/01/07/deep-field-black-holes/ https://briankoberlein.com/2017/01/07/deep-field-black-holes/#comments Sat, 07 Jan 2017 17:20:34 +0000 https://briankoberlein.com/?p=6426

A new x-ray deep field image supports the idea that supermassive black holes formed before galaxies did.

The post Deep Field Black Holes appeared first on One Universe at a Time.

]]>

At the heart of most galaxies lies a supermassive black hole. How such black holes came to be is a matter of some debate. Did black holes form first, with galaxies later forming around them (the bottom-up model), or did galaxies form first, with their cores only later collapsing into black holes (the top-down model)? To answer this question we need a good understanding of when these black holes started to form. A new ultra-deep x-ray image is helping to answer these questions.

A deep field image is one that has a clear view of very distant objects. The most famous deep field is the Hubble Ultra Deep Field (HUDF), which showed us just how many galaxies there are in the cosmos. The HUDF was taken in the visible and infrared, but there are others, such as the ALMA Deep Field, taken at millimeter wavelengths, which gave us a view of distant gas and dust. Now the Chandra X-ray Observatory has taken an x-ray deep field, which gives us a view of distant black holes.

Black holes don’t emit light themselves, but the gas and dust near a black hole can become superheated by the gravitational squeezing of the black hole. Such “active” black holes can emit huge jets of plasma that give off intense x-rays. By studying these x-ray emissions, we can determine things such as the size and rate of growth of the black hole. This new deep field image gathered light from supermassive black holes when the Universe was about 2 billion years old. Since the region observed was the same as the Hubble Deep Field, the team could match x-ray black holes to galaxies in the Hubble deep field, and get an idea of the size and evolution of the black holes and their galaxies. What they found was that the “seeds” for these supermassive black holes were likely on the order of 10,000 to 100,000 times more massive than our Sun. This would tend to support the bottom-up model, where black holes formed first. If the top-down model were correct, we would expect the seeds to be smaller, on the order of 100 to 1,000 solar masses.

This new data doesn’t completely rule out the top down model, but it is consistent with other evidence that supports the bottom up model. Right now it looks like black holes formed early in the Universe, and this triggered the formation of galaxies around them.

Paper: Fabio Vito, et al. The deepest X-ray view of high-redshift galaxies: constraints on low-rate black-hole accretion. MNRAS 463 (1): 348-374. doi: 10.1093/mnras/stw1998 (2016)


]]>
https://briankoberlein.com/2017/01/07/deep-field-black-holes/feed/ 5
Antimatter Astronomy https://briankoberlein.com/2017/01/02/antimatter-astronomy/ https://briankoberlein.com/2017/01/02/antimatter-astronomy/#comments Mon, 02 Jan 2017 12:00:38 +0000 https://briankoberlein.com/?p=6416

Matter and antimatter emit the same spectra of light. So how do we know that distant galaxies aren't made of antimatter?

The post Antimatter Astronomy appeared first on One Universe at a Time.

]]>

In astronomy we study distant galaxies by the light they emit. Just as the stars of a galaxy glow bright from the heat of their fusing cores, so too does much of the gas and dust at different wavelengths. The pattern of wavelengths we observe tells us much about a galaxy, because atoms and molecules emit specific patterns of light. Their optical fingerprint tells us the chemical composition of stars and galaxies, among other things. It’s generally thought that distant galaxies are made of matter, just like our own solar system, but recently it’s been demonstrated that anti-hydrogen emits the same type of light as regular hydrogen. In principle, a galaxy of antimatter would emit the same type of light as a similar galaxy of matter, so how do we know that a distant galaxy really is made of matter? 

The basic difference between matter and antimatter is charge. Atoms of matter are made of positively charged nuclei surrounded by negatively charged electrons, while antimatter consists of negatively charged nuclei surrounded by positively charged positrons (anti-electrons). In all of our interactions, both in the lab and when we’ve sent probes to other planets, things are made of matter. So we can assume that most of the things we see in the Universe are also made of matter.

However, when we create matter from energy in the lab, it is always produced in pairs. We can, for example, create protons in a particle accelerator, but we also create an equal amount of anti-protons. This is due to a symmetry between matter and antimatter, and it leads to a problem in cosmology. In the early Universe, when the intense energy of the big bang produced matter, did it also produce an equal amount of antimatter? If so, why do we see a Universe that’s dominated by matter? The most common explanation is that there is a subtle difference between matter and antimatter. This difference wouldn’t normally be noticed, but on a cosmic scale it means the big bang produced more matter than antimatter.

But suppose the Universe does have an equal amount of matter and antimatter, but early on the two were clumped into different regions. While our corner of the Universe is dominated by matter, perhaps there are distant galaxies or clusters of galaxies that are dominated by antimatter. Since the spectrum of light from matter and antimatter is the same, a distant antimatter galaxy would look the same to us as if it were made of matter. Since we can’t travel to distant galaxies directly to prove they’re made of matter, how can we be sure antimatter galaxies don’t exist?

One clue comes from the way matter and antimatter interact. Although both behave much the same on their own, when matter and antimatter collide they can annihilate each other to produce intense gamma rays. Although the vast regions between galaxies are mostly empty, they aren’t complete vacuums. Small amounts of gas and dust drift between galaxies, creating an intergalactic wind. If a galaxy were made of antimatter, any small amounts of matter from the intergalactic wind would annihilate with antimatter on the outer edges of the galaxy and produce gamma rays. If some galaxies were matter and some antimatter, we would expect to see gamma ray emissions in the regions between them. We don’t see that. Not between our Milky Way and other nearby galaxies, and not between more distant galaxies. Since our region of space is dominated by matter, we can reasonably assume that other galaxies are matter as well.
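
As a sanity check on why annihilation means gamma rays: even the lightest such event, an electron meeting a positron, releases each particle's full rest energy E = m·c², producing photons vastly more energetic than visible light. This quick calculation is illustrative and not from the article:

```python
# Characteristic energy of an electron-positron annihilation photon: E = m_e * c^2.
m_e = 9.109e-31  # electron mass, kg
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electronvolt

E_keV = m_e * c**2 / eV / 1e3
print(f"{E_keV:.0f} keV per photon")  # ~511 keV, firmly in the gamma-ray band
```

Visible-light photons carry only a few eV, so annihilation photons are hundreds of thousands of times more energetic, and nucleon-antinucleon annihilation is more energetic still. That distinctive glow is what we would expect, and fail to see, at the boundaries between matter and antimatter regions.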

It’s still possible that our visible universe just happens to be matter dominated. There may be other regions beyond the visible universe that are dominated by antimatter, and they’re simply too far away for us to see. That’s one possible solution to the matter-antimatter cosmology problem. But it would be an odd coincidence given the scale of the visible universe.

So there might be distant antimatter galaxies in the Universe, but we can be confident that the galaxies we do see are made of matter just like us.


]]>
https://briankoberlein.com/2017/01/02/antimatter-astronomy/feed/ 18
A Light Change https://briankoberlein.com/2016/12/07/a-light-change/ https://briankoberlein.com/2016/12/07/a-light-change/#comments Wed, 07 Dec 2016 12:00:53 +0000 https://briankoberlein.com/?p=6366

Was the speed of light much faster in the early universe?

The post A Light Change appeared first on One Universe at a Time.

]]>

One of the big mysteries of modern cosmology is the fact that the Universe is so uniform on large scales. Observations tell us our Universe is spatially flat, and the cosmic microwave background we see in all directions has only the smallest temperature fluctuations. But if the cosmos began with a hot and dense big bang, then we wouldn’t expect such high uniformity. As the Universe expanded, distant parts of it would have moved out of reach of each other before there was time for their temperatures to even out. One would expect the cosmic background to have large hot and cold regions. The most common idea to explain this uniformity is early cosmic inflation. That is, soon after the big bang, the Universe expanded at an immense rate. The Universe we can currently observe originated from an extremely small region, and early inflation made everything even out. The inflation model has a lot going for it, but proving inflation is difficult, so some theorists have looked for alternative models that might be easier to test. One recent idea looks at a speed of light that changes over time.

The idea that light may have had a different speed in the past isn’t new. Despite the assertions of some young Earth creationists, we know the speed of light has remained constant for at least 7 billion years. The well-tested theories of special and general relativity also confirm a constant speed of light. But perhaps things were very different in the earliest moments of the cosmos. This new work looks at an alternative approach to gravity where the speed of gravity and the speed of light don’t have to be the same. In general relativity, if the speed of light changed significantly, so would the speed of gravity, and this would lead to effects we don’t observe. In this new model, the speed of light could have been much faster than gravity early on, and this would allow the cosmic microwave background to even out. As the Universe expanded and cooled, a phase transition would shift the speed of light to that of gravity, just as we observe now.

Normally this kind of thing can be discarded as just another handwaving idea, but the model makes two key predictions. The first is that there shouldn’t be any primordial gravitational waves. Inflation models predict primordial gravitational fluctuations, so if they are observed this new model is ruled out. But it might be the case that primordial gravitational waves are simply too faint to be observed, which would leave inflation in theoretical limbo. But this new model also predicts that the cosmic background should have temperature fluctuations of a particular scale (known as the scalar spectral index ns). According to the model, ns should be about 0.96478. Current observations find ns = 0.9667 ± 0.0040. So the predictions of this model actually agree with observation.
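
Using the numbers quoted above, a one-line check shows just how close the model's prediction sits to the measured value, in units of the measurement uncertainty:

```python
# Compare the varying-speed-of-light prediction for the scalar spectral
# index n_s against the observed value quoted in the text.
ns_pred = 0.96478               # model prediction
ns_obs, ns_err = 0.9667, 0.0040  # measured value and 1-sigma uncertainty

z = abs(ns_obs - ns_pred) / ns_err
print(f"prediction sits {z:.1f} sigma from the measured value")  # ~0.5 sigma
```

A deviation of about half a standard deviation is well within the error bars, which is why the prediction counts as agreeing with observation.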

That seems promising, but inflation can’t be ruled out yet. This current model only explains the uniformity of the cosmic background. Inflation also explains things like spatial flatness and a few other subtle cosmological issues this new model doesn’t address. The key is that this new model is testable, and that makes it a worthy challenger to inflation.

Paper: Niayesh Afshordi and Joao Magueijo. The critical geometry of a thermal big bang arXiv:1603.03312 [gr-qc]


]]>
https://briankoberlein.com/2016/12/07/a-light-change/feed/ 6