Cosmology – One Universe at a Time https://briankoberlein.com Brian Koberlein Thu, 21 Feb 2019 22:09:36 +0000 en-US hourly 1 https://wordpress.org/?v=5.1 A Universe Of Antimatter https://briankoberlein.com/2017/11/24/a-universe-of-antimatter/ https://briankoberlein.com/2017/11/24/a-universe-of-antimatter/#comments Fri, 24 Nov 2017 12:00:29 +0000 https://briankoberlein.com/?p=6794

If our universe were made of antimatter, what would it look like?


Our universe is dominated by matter. Sure, there is dark matter and dark energy, but things like stars, planets and people are made of matter. Protons, electrons, neutrons and such. But matter seems to come in pairs. For every electron created, an antimatter positron is created. For every proton that appears, so does an anti-proton. Since our universe is dominated by matter, what if there is another universe dominated by antimatter? What would an antimatter universe look like? 

The basic difference between matter and antimatter is that they have opposite charges. A proton has a positive charge, while an antiproton has a negative one. Positively charged positrons are the antimatter version of negatively charged electrons. What’s interesting is that the signs of electric charge are a fluke of history. We could have assigned a positive charge to electrons and a negative one to protons. There’s nothing special about choosing one or the other. So you might think that an antimatter universe would look exactly like our regular one. But matter and antimatter have subtle differences.

One of the main differences has to do with neutrinos. Neutrinos don’t have any charge, so if the sign of charge were the only difference between matter and antimatter, “antimatter” neutrinos would be identical to “matter” neutrinos. But it turns out they are slightly different. Neutrinos have a property called helicity, which describes whether they spin to the left or the right as they travel through space. Matter neutrinos have left-handed helicity, while antimatter ones have right-handed helicity. That might not seem like a big deal, but in 1956 Chien-Shiung Wu looked at the radioactive decay of cobalt-60 atoms. She found that left-oriented and right-oriented atoms decay at different rates. Since handedness is different between matter and antimatter, the two might decay at different rates. This might be the reason why we don’t see lots of antimatter in the universe.

But suppose there was an antimatter universe that had lots of anti-hydrogen and anti-helium after its big bang, just as our early universe had lots of hydrogen and helium. It would seem reasonable that these could fuse to heavier antimatter elements in the cores of antimatter stars, and this could produce antimatter planets and perhaps even antimatter life. What would these creatures see when they look up into their night sky?

In this case we know it would look much like our own night sky. Recently we’ve been able to produce anti-hydrogen, and we have looked at the type of light it produces. We found that anti-hydrogen produces the same kind of light as regular hydrogen. So an antimatter Sun would emit the same light as our Sun. Light would reflect off an antimatter moon just as it does our Moon, and our antimatter cousins would see a sky filled with stars, nebulae and planets, just like we do.

Of course all of this is based upon the assumption that antimatter would collapse under gravity to form stars in the first place. We think that should be the case, but what if antimatter also had anti-mass? What if anti-atoms gravitationally repelled each other? In that case, an antimatter universe would never form stars or galaxies. Our antimatter universe would simply be filled with traces of anti-hydrogen and anti-helium, and nothing would ever look up at the cosmic sky.  While we think antimatter has regular mass, we haven’t created enough of it in the lab to test the idea. For now we can’t be sure.

So it is quite possible that an antimatter universe would look nearly identical to our own. But it could be that an antimatter universe would be nothing but cold gas. It’s even possible that the radioactive decay of antimatter is so different from that of matter, that an antimatter universe can’t even exist.

Dark Matter Isn’t Warm And Fuzzy https://briankoberlein.com/2017/08/03/dark-matter-isnt-warm-fuzzy/ https://briankoberlein.com/2017/08/03/dark-matter-isnt-warm-fuzzy/#comments Thu, 03 Aug 2017 11:00:56 +0000 https://briankoberlein.com/?p=6722

A new survey of distant quasars shows that dark matter isn't warm and fuzzy.


Dark matter is one of the big mysteries of cosmology. Theoretically it explains cosmic phenomena such as the scale at which galaxies cluster, and observationally we see its effect through things like gravitational lensing, but it hasn’t been observed directly. This means we have a limited understanding of its exact nature. As a result, there have been lots of theoretical ideas about what dark matter could be. But we now know that whatever dark matter is, it isn’t warm and fuzzy.

There are two broad aspects about dark matter that no one disagrees about (assuming it exists). The first is that it must be dark, meaning that it doesn’t interact much with light. If it did interact with light, we would see its effects through the absorption or scattering of light from stars and distant galaxies. The second is that it must have mass, since the models require that it interacts with regular matter gravitationally. Beyond that, almost anything goes.

Most models assume that dark matter is cold. In this case, cold vs. warm refers to the speed at which dark matter particles typically move. In cold dark matter models, the particles are relatively heavy, with a mass similar to that of protons or more. Because of their high mass, these dark matter particles would move relatively slowly, at much the same speed as the gas and dust in our galaxy. Neutrinos, on the other hand, are warm dark matter. Neutrinos don’t interact strongly with light, and they do have mass, so they meet the basic requirements of dark matter. But neutrino mass is minuscule, and they typically move at speeds approaching the speed of light. Thus, neutrinos are an example of warm or hot dark matter.

One of the things we observe about galaxies is that they have far more mass than their visible matter would suggest, so they must contain a lot of dark matter. This means dark matter clumps together just as gas and dust clump to form galaxies. Warm dark matter such as neutrinos moves much too quickly to clump together in this way, so it would seem that dark matter must be cold.

While cold dark matter is a central part of the standard “concordance model” of cosmology, it isn’t without problems. One of the biggest is that cold dark matter predicts that large spiral galaxies like our Milky Way should have hundreds of small satellite galaxies surrounding them. We’ve only found about a dozen satellite galaxies. Even the distribution of stars within these dwarf galaxies doesn’t fit the dark matter model very well.

While warm dark matter like neutrinos doesn’t fit the data well, there are other warm models that might. They solve some of the issues with warm dark matter by suggesting dark matter is also “fuzzy.” This refers to its quantum nature. All matter has a quantum aspect to it. For example, an electron doesn’t orbit the nucleus of an atom like a planet around the Sun. Instead, the electron is in a “fuzzy” quantum state within the atom. Normally the fuzzy nature of quantum particles only acts at short distances, on the scale of a few atoms, but under the right conditions this kind of fuzzy quantum behavior can occur over large distances. In the fuzzy dark matter model, the dark matter particles can interact quantum mechanically over great distances, thus allowing them to behave in ways similar to cold dark matter.

Several computer simulations of the universe agree with the cold dark matter model on large scales, but a new study specifically looked at how the warm fuzzy model compares. To do this the team used observations from more than 100 quasars. Quasars are distant objects powered by the supermassive black holes in the centers of galaxies. They give off tremendous amounts of light and energy, and so we can see them across billions of light years. As the light from these quasars travels across the cosmos to reach us, it is distorted by diffuse filaments of hydrogen gas between galaxies, known as the intergalactic medium. The distribution of hydrogen in the intergalactic medium allows us to study how clusters of galaxies formed. The team compared this data to both cold and warm-fuzzy dark matter models. They found the warm-fuzzy model didn’t agree with observation. That doesn’t mean that warm-fuzzy dark matter doesn’t exist, but if it does exist it must be so diffuse and have such an extraordinarily tiny mass that it couldn’t have caused the clustering of galaxies we observe.

Cold dark matter still has its own problems, and the nature of dark matter still holds many mysteries. But we now know that for the most part dark matter isn’t warm and fuzzy.

Paper: Vid Iršič, et al. First Constraints on Fuzzy Dark Matter from Lyman-α Forest Data and Hydrodynamical Simulations. Phys. Rev. Lett. 119, 031302 (2017)

How To Define Distance In An Expanding Universe https://briankoberlein.com/2017/05/28/define-distance-expanding-universe/ https://briankoberlein.com/2017/05/28/define-distance-expanding-universe/#comments Sun, 28 May 2017 11:00:20 +0000 https://briankoberlein.com/?p=6657

On a cosmic scale the notion of distance is more subtle than you might think.


Recently the Sloan Digital Sky Survey (SDSS) has completed the largest map of the universe thus far. The map focuses on the positions of quasars. These objects are powered by supermassive black holes in the centers of galaxies, and are so bright they can be seen from the farthest regions of the cosmos. Most quasars are so far away that we have to redefine what “distance” means. In an expanding universe, distance can be defined in a variety of ways. 

For the stars we see in the night sky, their distance is just what you’d expect: the physical distance from the Sun to the star. The bright star Sirius, for example, is 2.6 parsecs away. A parsec is defined by the method used to measure stellar distances, known as parallax. As the Earth orbits the Sun, its view of the stars shifts very slightly. Nearby stars shift more than distant ones, and this is known as a parallax shift. The bigger the parallax, the closer the star. If a star were one parsec away, its parallax would be 1 arcsecond. There are 360 degrees in a circle. If you took a single degree and divided it into 3600 parts, each part would be an arcsecond.
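The parallax conversion is simple enough to sketch in a few lines of Python. The distance in parsecs is just the reciprocal of the parallax in arcseconds; the Sirius parallax of roughly 0.379 arcseconds is taken from published measurements, so treat the figures as illustrative:

```python
def parallax_distance_pc(parallax_arcsec):
    """Distance in parsecs from a parallax angle in arcseconds (d = 1/p)."""
    return 1.0 / parallax_arcsec

LY_PER_PC = 3.2616  # light years per parsec

# Sirius has a measured parallax of about 0.379 arcseconds.
d_pc = parallax_distance_pc(0.379)
print(f"{d_pc:.2f} pc = {d_pc * LY_PER_PC:.1f} ly")  # ≈ 2.64 pc ≈ 8.6 ly
```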

The parallax of even nearby stars is small because they are so very far away. While astronomers often use parsecs for distance, a more familiar measure is the time it takes light from the star to reach us. For Sirius, that is about 8.6 light years, meaning the starlight we observe left Sirius about 8.6 years ago. Of course that distance changes a bit over time, since Sirius is moving relative to the Sun. Even if we could travel to Sirius at the speed of light, we would have to account for its changing position. But this change in distance is small compared to its current distance.

Because stellar parallax is so small, it can only be used for stars out to about 10,000 light years or so. Beyond that the parallax is simply too small to measure. For more distant objects such as galaxies we have to use other methods. One popular method is to use variable stars known as Cepheid variables. Cepheid variables have a particular relation between their overall brightness and how quickly they vary from bright to dim. By watching them vary over time we can calculate their distance. Observations of Cepheid variables in the Andromeda galaxy, for example, show that it is about 766,000 parsecs away, or 2.5 million light years. Just as with stars, the distance of a galaxy changes over time. Over the course of a 2.5-million-year journey to Andromeda, the galaxy would have moved by about 1,500 light years. That’s still a small fraction of its overall distance, but it’s not insignificant.
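As a rough sketch of how the Cepheid method works, here is a toy Python version. The coefficients −2.43 and −4.05 come from one published period-luminosity calibration and vary between fits, real measurements also correct for dust extinction, and the 10-day, magnitude-20.36 Cepheid below is hypothetical (chosen so the answer lands near Andromeda’s distance):

```python
import math

def cepheid_distance_pc(period_days, apparent_mag):
    """Estimate distance from a Cepheid's pulsation period and apparent magnitude.

    Uses one approximate period-luminosity calibration; treat as illustrative.
    """
    # Absolute magnitude from the period-luminosity (Leavitt law) relation
    abs_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    # Distance modulus: m - M = 5 log10(d) - 5, so d = 10**((m - M + 5) / 5) parsecs
    return 10 ** ((apparent_mag - abs_mag + 5.0) / 5.0)

# A hypothetical 10-day Cepheid seen at apparent magnitude 20.36
print(f"{cepheid_distance_pc(10.0, 20.36):.3e} pc")  # ≈ 7.6e5 pc
```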

With more distant galaxies, distance becomes much more complicated. If we measure the motion of various galaxies, usually through the redshift of their light, we find that the more distant the galaxy, the greater its redshift. This is due to the overall expansion of the cosmos. Driven in part by dark energy, the overall distance between galaxies is increasing, and this cosmic expansion puts a serious wrench in the meaning of cosmic distance.

The most direct quantity we can measure for a distant galaxy is its redshift. Usually this is expressed as z, the fractional amount a particular wavelength has changed from its unshifted wavelength. The upper range of z we have observed is about 12, so let’s consider a galaxy with about half that amount, or z = 6. Just how far away is such a galaxy?
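In code, the definition of z is a one-liner. As an illustration, hydrogen’s Lyman-alpha line is emitted at 121.6 nm; at z = 6 it arrives stretched by a factor of 1 + z = 7, deep in the infrared:

```python
def redshift(observed_nm, emitted_nm):
    """Fractional wavelength shift: z = (observed - emitted) / emitted."""
    return (observed_nm - emitted_nm) / emitted_nm

# Lyman-alpha emitted at 121.6 nm, observed at 7 times that wavelength
z = redshift(7 * 121.6, 121.6)
print(f"{z:.1f}")  # 6.0
```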

Redshift can be caused by two things: the motion of a galaxy through space (often called Doppler redshift) and the expansion of space itself (often called cosmological redshift). We can’t distinguish between them observationally, but we know from observations of the motion of local galaxies that the Doppler shift tends to be rather small. So it’s safe to assume that for distant galaxies the redshift is almost entirely due to cosmic expansion. To calculate distances, we then have to look at how the universe expands over time, and this relies on which particular cosmological model we use. Typically this is the concordance model, or ΛCDM model, which is your standard dark matter, dark energy dominated universe model. Assuming this model is accurate (and we have lots of reasons to think it is), we can calculate galactic distances. But we have to be careful about how we define distances.

Suppose we use the parsec definition above. That is, based upon the light we currently see, how far away is a quasar with redshift z = 6? Another way to say this would be “How far away was the quasar when the light left it?” This turns out to be about 1.2 billion parsecs. It’s tempting to convert this to light years, and thus say it was about 3.9 billion light years away, but this is misleading. Because the cosmos was expanding as the light traveled to us, it actually took the light about 12.8 billion years to reach us. So its light travel time distance is actually 12.8 billion light years. This is the most common “distance” used, since it’s easy to compare with the age of the light. When we observe a quasar with a redshift of z = 6, we see the universe as it was 12.8 billion years ago.

Unfortunately this can also make things more confusing. Given that the travel time distance is 12.8 billion years, one might assume that the quasar is about 12.8 billion light years away, rather than 3.9 billion light years when the light left it. But the light we observe was traveling toward us as the universe expanded, while the quasar it left behind moved away from us with the expansion of space. This comoving distance is about 8.4 billion parsecs, which is equivalent to 27 billion light years.
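All three distances can be reproduced by numerically integrating the expansion history of a flat ΛCDM universe. Here is a minimal Python sketch, assuming illustrative parameter values (H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7), which is why its numbers land near, but not exactly on, the figures quoted above:

```python
import math

H0 = 70.0                      # Hubble constant, km/s/Mpc (assumed)
OMEGA_M, OMEGA_L = 0.3, 0.7    # matter and dark-energy densities (assumed)
C = 299792.458                 # speed of light, km/s
MPC_TO_GLY = 3.2616e-3         # gigalight-years per megaparsec
HUBBLE_TIME_GYR = 978.0 / H0   # 1/H0 in Gyr (978 ≈ Mpc per km/s, in Gyr)

def E(z):
    """Dimensionless expansion rate H(z)/H0 for a flat ΛCDM universe."""
    return math.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)

def integrate(f, z_max, steps=10000):
    """Simple trapezoidal integration of f from 0 to z_max."""
    h = z_max / steps
    total = 0.5 * (f(0) + f(z_max))
    for i in range(1, steps):
        total += f(i * h)
    return total * h

z = 6.0
comoving_mpc = (C / H0) * integrate(lambda x: 1 / E(x), z)
lookback_gyr = HUBBLE_TIME_GYR * integrate(lambda x: 1 / ((1 + x) * E(x)), z)

print(f"comoving distance:     {comoving_mpc * MPC_TO_GLY:.1f} Gly")        # ≈ 27 Gly
print(f"distance when emitted: {comoving_mpc * MPC_TO_GLY / (1 + z):.1f} Gly")  # ≈ 3.8 Gly
print(f"light travel time:     {lookback_gyr:.1f} Gyr")                     # ≈ 12.6 Gyr
```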

Each of these distances is valid in its own way, even though they are all quite different. That’s why astronomers often stick to redshift.

Bigger, Stronger, Faster https://briankoberlein.com/2017/02/01/bigger-stronger-faster/ https://briankoberlein.com/2017/02/01/bigger-stronger-faster/#comments Wed, 01 Feb 2017 12:00:23 +0000 https://briankoberlein.com/?p=6450

New observations of lensed quasars show the Universe is expanding faster than expected. But these results raise questions about the assumptions of our cosmological models.


We’ve known for nearly a century that the Universe is expanding. The fact that galaxies are receding away from us was first demonstrated by Edwin Hubble in 1929, building upon the work of Henrietta Leavitt and others. Since then we’ve developed a variety of ways to measure the rate of cosmic expansion, and while they are broadly in agreement, there are small discrepancies between them. As a result we still don’t know exactly how fast the Universe is expanding, as astrophysicist Ethan Siegel has so clearly explained. Now a new method of measuring cosmic expansion may settle the issue, but it also raises more questions.

It all comes down to a physical parameter known as the Hubble constant. The bigger the Hubble constant, the greater the rate of cosmic expansion. The value of the constant also tells us the age of the Universe. If you trace the expansion backwards through time, you reach the point where the Universe was extremely hot and dense, commonly known as the big bang.

Hubble’s original measurement of the constant compared the distances of galaxies with the redshift of their light. He calculated galactic distances by measuring the brightness of variable stars known as Cepheid variables, and combined them with measurements of galactic redshifts made by Vesto Slipher. He found that more distant galaxies had greater redshifts, and presumably were receding from us at a greater rate. Hubble’s original value for the constant was about 500 km/s/Mpc, which caused a bit of a cosmological crisis. If the value was correct, the Universe was only about 2 billion years old, which contradicted geological evidence that showed the Earth was more than 4 billion years old.
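You can check the arithmetic behind that crisis directly: the inverse of the Hubble constant sets the rough age scale of the Universe. A quick sketch in Python (unit conversions only, no cosmological model):

```python
SECONDS_PER_GYR = 3.156e16  # seconds in a billion years
KM_PER_MPC = 3.0857e19      # kilometers in a megaparsec

def hubble_time_gyr(h0_km_s_mpc):
    """Rough age scale of the Universe: 1/H0, converted to billions of years."""
    seconds = KM_PER_MPC / h0_km_s_mpc
    return seconds / SECONDS_PER_GYR

print(f"{hubble_time_gyr(500):.1f} Gyr")  # Hubble's 1929 value: ≈ 2.0 Gyr
print(f"{hubble_time_gyr(70):.1f} Gyr")   # a modern value: ≈ 14.0 Gyr
```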

Credits: NASA, ESA, A. Feild (STScI), and A. Riess (STScI/JHU)

Over time our measurements of the Hubble constant got better, and the results settled around a much smaller value of around 70 km/s/Mpc, putting the age of the Universe at about 14 billion years. We also developed different ways to calculate the Hubble constant using different types of data, and they each produced similar results. This means we know for sure that the Universe is expanding, and we have a pretty good handle on just how fast it’s expanding. But while these different methods broadly agreed, they didn’t exactly agree. It was generally thought that as our measurements got better this discrepancy would go away, but it didn’t. Something was clearly wrong with our understanding of cosmic expansion.

 

Modern measurement tensions from the distance ladder (red) with CMB (green) and BAO (blue) data. Image credit: Aubourg, Éric, et al. “Cosmological implications of baryon acoustic oscillation measurements.” Phys. Rev. D 92, 123516 (2015).

Hubble’s method of comparing distance with redshift has been extended by shifting from Cepheid variables to supernovae. A particular type of supernova known as Type Ia allows us to determine galactic distances across billions of light years. In 2016, observations from the Hubble telescope using this approach gave a value of 73.24±1.74 km/s/Mpc, which is on the high side of modern values.

A different approach looks at fluctuations in the cosmic microwave background (CMB). The CMB is the thermal remnant of the big bang, and while it is quite uniform in temperature, there are small-scale fluctuations. As the Universe has expanded, these fluctuations are stretched, so the scale at which fluctuations peak gives a value of the rate of cosmic expansion, and thus the Hubble constant. The most precise CMB measurement of the Hubble constant was made by the Planck satellite, and gave a result of 66.93±0.62 km/s/Mpc, which is on the low side. The Planck result agrees with another method known as baryon acoustic oscillation (BAO), which looks at how galaxies cluster together across billions of light years, which gives a value of 67.6±0.7 km/s/Mpc.
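A common way to quantify such a disagreement is to divide the difference between two measurements by their combined uncertainty. Sketching that for the supernova and Planck values above:

```python
import math

def tension_sigma(v1, e1, v2, e2):
    """Difference between two measurements in units of their combined uncertainty."""
    return abs(v1 - v2) / math.sqrt(e1 ** 2 + e2 ** 2)

# Hubble supernova value vs. Planck CMB value (km/s/Mpc)
print(f"{tension_sigma(73.24, 1.74, 66.93, 0.62):.1f} sigma")  # ≈ 3.4 sigma
```

A discrepancy of more than three sigma is unlikely to be a statistical fluke, which is why these results are taken seriously.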

Temperature fluctuations of the CMB vary at different scales. Credit: NASA/WMAP

These disagreements are problematic because they point to problems in our cosmological model. Although each result is quite sophisticated, they depend upon certain assumptions about the Universe. Our current model, known as the ΛCDM model, includes regular matter, dark matter and dark energy, as well as things such as how flat the Universe is on large scales. Each of these can be measured by independent experiments, but the results all have a bit of uncertainty. Tweak the values of these parameters within the model, and the value of the measured Hubble constant shifts. So we could tweak the model to make the Hubble constant results fit, but tweaking models to fit your expectations is bad science.

Now there’s a new method for determining the Hubble constant, and its result is very interesting.

Diagram showing how distant light can be gravitationally lensed. ALMA (ESO/NRAO/NAOJ), L. Calçada (ESO), Y. Hezaveh et al.

Rather than looking at the CMB or measuring galactic distances, the new approach looks at an effect known as gravitational lensing. As light passes near a large mass such as a star or galaxy, it is gravitationally deflected. As a result, light from a distant object such as a quasar can be deflected around a less distant galaxy. Instead of seeing one image of the distant quasar, we see multiple images. But if we look at these lensed images things get very interesting. Each image of the quasar has taken a different path, and those paths can have different lengths. So some images reach us sooner than others. We’ve seen this effect with distant supernovae, for example, allowing us to see multiple “instant replays” of a supernova over the course of a few decades. Quasars can fluctuate in brightness, which allows us to measure the timing between lensed images of a particular quasar.

In this new approach, the team looked at several lensed quasars, and measured the timing differences. These timing differences are affected by the Hubble constant, so by measuring different lensed quasars the team could get a value for the Hubble constant. The key here is that while the results depend strongly on the value of the Hubble constant, they aren’t affected very much by other model parameters such as the amount of regular matter and dark matter. It’s a more direct measurement, and therefore less dependent on model assumptions. The result they got was 71.9±2.7 km/s/Mpc.

This agrees pretty well with the Hubble results, but not with the CMB results. Since the result is less model dependent, it raises questions about our cosmological model. Why are the CMB and BAO results so much lower than the others? It isn’t clear at this point, and while this new result is great, it doesn’t solve the mystery of Hubble’s constant.

Paper: V. Bonvin, et al. H0LiCOW V. New COSMOGRAIL time delays of HE0435-1223: H0 to 3.8% precision from strong lensing in a flat ΛCDM model. arXiv:1607.01790 [astro-ph.CO] (2017)

 

Antimatter Astronomy https://briankoberlein.com/2017/01/02/antimatter-astronomy/ https://briankoberlein.com/2017/01/02/antimatter-astronomy/#comments Mon, 02 Jan 2017 12:00:38 +0000 https://briankoberlein.com/?p=6416

Matter and antimatter emit the same spectra of light. So how do we know that distant galaxies aren't made of antimatter?


In astronomy we study distant galaxies by the light they emit. Just as the stars of a galaxy glow bright from the heat of their fusing cores, so too do its gas and dust glow at various wavelengths. The pattern of wavelengths we observe tells us much about a galaxy, because atoms and molecules emit specific patterns of light. This optical fingerprint tells us the chemical composition of stars and galaxies, among other things. It’s generally thought that distant galaxies are made of matter, just like our own solar system, but recently it’s been demonstrated that anti-hydrogen emits the same type of light as regular hydrogen. In principle, a galaxy of antimatter would emit the same type of light as a similar galaxy of matter, so how do we know that a distant galaxy really is made of matter?

The basic difference between matter and antimatter is charge. Atoms of matter are made of positively charged nuclei surrounded by negatively charged electrons, while antimatter consists of negatively charged nuclei surrounded by positively charged positrons (anti-electrons). In all of our interactions, both in the lab and when we’ve sent probes to other planets, things are made of matter. So we can assume that most of the things we see in the Universe are also made of matter.

However, when we create matter from energy in the lab, it is always produced in pairs. We can, for example, create protons in a particle accelerator, but we also create an equal amount of anti-protons. This is due to a symmetry between matter and antimatter, and it leads to a problem in cosmology. In the early Universe, when the intense energy of the big bang produced matter, did it also produce an equal amount of antimatter? If so, why do we see a Universe that’s dominated by matter? The most common explanation is that there is a subtle difference between matter and antimatter. This difference wouldn’t normally be noticed, but on a cosmic scale it means the big bang produced more matter than antimatter.

But suppose the Universe does have an equal amount of matter and antimatter, but early on the two were clumped into different regions. While our corner of the Universe is dominated by matter, perhaps there are distant galaxies or clusters of galaxies that are dominated by antimatter. Since the spectrum of light from matter and antimatter is the same, a distant antimatter galaxy would look the same to us as if it were made of matter. Since we can’t travel to distant galaxies directly to prove they’re made of matter, how can we be sure antimatter galaxies don’t exist?

One clue comes from the way matter and antimatter interact. Although both behave much the same on their own, when matter and antimatter collide they can annihilate each other to produce intense gamma rays. Although the vast regions between galaxies are mostly empty, they aren’t complete vacuums. Small amounts of gas and dust drift between galaxies, creating an intergalactic wind. If a galaxy were made of antimatter, any small amounts of matter from the intergalactic wind would annihilate with antimatter on the outer edges of the galaxy and produce gamma rays. If some galaxies were matter and some antimatter, we would expect to see gamma ray emissions in the regions between them. We don’t see that. Not between our Milky Way and other nearby galaxies, and not between more distant galaxies. Since our region of space is dominated by matter, we can reasonably assume that other galaxies are matter as well.
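The gamma rays from annihilation have a characteristic energy set by E = mc². For the simplest case, an electron meeting a positron, each of the two resulting photons carries the electron’s rest-mass energy (proton-antiproton annihilation is messier, proceeding through short-lived pions, but also ends in gamma rays):

```python
M_ELECTRON_KG = 9.109e-31        # electron rest mass
C_M_S = 2.998e8                  # speed of light, m/s
KEV_PER_JOULE = 1.0 / 1.602e-16  # 1 keV = 1.602e-16 joules

# Electron-positron annihilation yields two photons, each carrying
# one particle's rest-mass energy: E = m c^2.
photon_kev = M_ELECTRON_KG * C_M_S ** 2 * KEV_PER_JOULE
print(f"{photon_kev:.0f} keV")  # ≈ 511 keV
```

This 511 keV line is exactly the signature astronomers look for when hunting annihilation at matter-antimatter boundaries.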

It’s still possible that our visible universe just happens to be matter dominated. There may be other regions beyond the visible universe that are dominated by antimatter, and they are simply too far away for us to see. That’s one possible solution to the matter-antimatter cosmology problem. But that would be an odd coincidence given the scale of the visible universe.

So there might be distant antimatter galaxies in the Universe, but we can be confident that the galaxies we do see are made of matter just like us.

A Light Change https://briankoberlein.com/2016/12/07/a-light-change/ https://briankoberlein.com/2016/12/07/a-light-change/#comments Wed, 07 Dec 2016 12:00:53 +0000 https://briankoberlein.com/?p=6366

Was the speed of light much faster in the early universe?


One of the big mysteries of modern cosmology is the fact that the Universe is so uniform on large scales. Observations tell us our Universe is spatially flat, and the cosmic microwave background we see in all directions has only the smallest temperature fluctuations. But if the cosmos began with a hot and dense big bang, then we wouldn’t expect such high uniformity. As the Universe expanded, distant parts of it would have moved out of reach from each other before there was time for their temperatures to even out. One would expect the cosmic background to have large hot and cold regions. The most common idea to explain this uniformity is early cosmic inflation. That is, soon after the big bang, the Universe expanded at an immense rate. The Universe we can currently observe originated from an extremely small region, and early inflation made everything even out. The inflation model has a lot going for it, but proving inflation is difficult, so some theorists have looked for alternative models that might be easier to prove. One recent idea looks at a speed of light that changes over time.

The idea that light may have had a different speed in the past isn’t new. Despite the assertions of some young Earth creationists, we know the speed of light has remained constant for at least 7 billion years. The well-tested theories of special and general relativity also confirm a constant speed of light. But perhaps things were very different in the earliest moments of the cosmos. This new work looks at an alternative approach to gravity where the speed of gravity and the speed of light don’t have to be the same. In general relativity, if the speed of light changed significantly, so would the speed of gravity, and this would lead to effects we don’t observe. In this new model, the speed of light could have been much faster than gravity early on, and this would allow the cosmic microwave background to even out. As the Universe expanded and cooled, a phase transition would shift the speed of light to that of gravity, just as we observe now.

Normally this kind of thing could be dismissed as just another hand-waving idea, but the model makes two key predictions. The first is that there shouldn’t be any primordial gravitational waves. Inflation models predict primordial gravitational fluctuations, so if they are observed this new model is ruled out. But it might be the case that primordial gravitational waves are simply too faint to be observed, which would leave inflation in theoretical limbo. But this new model also predicts that the cosmic background should have temperature fluctuations of a particular scale (known as the scalar spectral index ns). According to the model, ns should be about 0.96478. Current observations find ns = 0.9667 ± 0.0040. So the predictions of this model actually agree with observation.

That seems promising, but inflation can’t be ruled out yet. This current model only explains the uniformity of the cosmic background. Inflation also explains things like spatial flatness and a few other subtle cosmological issues this new model doesn’t address. The key is that this new model is testable, and that makes it a worthy challenger to inflation.

Paper: Niayesh Afshordi and Joao Magueijo. The critical geometry of a thermal big bang. arXiv:1603.03312 [gr-qc]

The Stars Uncounted https://briankoberlein.com/2016/11/28/the-stars-uncounted/ https://briankoberlein.com/2016/11/28/the-stars-uncounted/#comments Mon, 28 Nov 2016 12:00:31 +0000 https://briankoberlein.com/?p=6355

The relativistic effect of gravitational lensing allows astronomers to see the faintest galaxies.

The post The Stars Uncounted appeared first on One Universe at a Time.

]]>

As we’ve recently seen, the cosmos is much larger than we thought, with more than 2 trillion galaxies in the observable universe. Actually observing many of the most distant and faint galaxies is a real challenge, but more of them are being detected thanks to a trick that relies on relativity. 

The more distant a galaxy, the dimmer it appears. This is due in part to the fact that the apparent brightness of an object decreases with the square of its distance (the inverse square law). For galaxies the effect is even more dramatic, because cosmic expansion further dims objects billions of light years away. Because of this dimming, small dwarf galaxies can be difficult to observe. That's a problem, because dwarf galaxies are the most numerous type in the nearby universe, so we could be missing a great many galaxies when we look across large distances.
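
The inverse square law is simple enough to put into a few lines of code. A quick sketch (the solar numbers are standard textbook values, used purely for illustration):

```python
import math

def flux(luminosity_watts, distance_m):
    """Apparent brightness (W/m^2) of a source spread over a sphere of radius d."""
    return luminosity_watts / (4.0 * math.pi * distance_m**2)

L_sun = 3.828e26   # solar luminosity in watts
d = 1.496e11       # 1 AU in metres

# At Earth's distance this gives ~1361 W/m^2, the familiar solar constant.
print(flux(L_sun, d))

# Doubling the distance cuts the received flux to a quarter.
ratio = flux(L_sun, d) / flux(L_sun, 2 * d)
print(ratio)  # 4.0
```

For cosmological distances the dimming is even stronger than this, since expansion redshifts the photons and spreads out their arrival times, but the inverse square law is the starting point.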

But it turns out that relativity can help, thanks to an effect known as gravitational lensing. The path of starlight can be deflected by the gravity of a nearby mass, as Arthur Eddington first demonstrated in 1919. This means that light from a distant galaxy can be deflected and focused if a closer galaxy is between us and it. Through gravitational lensing, the distant galaxy can appear brighter than it would otherwise, just as a glass lens can magnify and brighten a distant star.
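
The deflection Eddington measured follows from the standard general-relativistic formula θ = 4GM/(c²b), where b is how closely the light grazes the mass. A quick back-of-envelope estimate for light skimming the Sun's edge:

```python
import math

# GR deflection of light passing a mass M at impact parameter b:
#   theta = 4 * G * M / (c^2 * b)   (in radians)
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
M_sun = 1.989e30    # solar mass, kg
R_sun = 6.957e8     # solar radius, m (light grazing the limb)

theta_rad = 4 * G * M_sun / (c**2 * R_sun)
theta_arcsec = math.degrees(theta_rad) * 3600
print(round(theta_arcsec, 2))  # ~1.75 arcseconds, the value Eddington confirmed
```

A galaxy-scale lens works the same way, just with a much larger mass, which is what lets it brighten galaxies behind it.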

Recently, a team used this method to observe faint dwarf galaxies at redshifts between z = 1 and z = 3. We see these galaxies as they were when the Universe was 2 to 6 billion years old, which is the period of peak star formation. They found that dwarf galaxies were most abundant at the greatest redshifts, and thus in the earliest period. Since most of the stars in these early dwarf galaxies were hot and bright, they flooded the Universe with ultraviolet light, driving the reionization period of the early Universe.
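
The conversion from redshift to cosmic age comes from integrating the Friedmann equation. Here is a rough numerical sketch, using generic Planck-like parameters rather than anything from the paper itself:

```python
import math

# Flat LCDM cosmology: H(a) = H0 * sqrt(Om/a^3 + OL).
# Age at redshift z:  t(z) = integral from a=0 to a=1/(1+z) of da / (a * H(a)).
H0 = 67.7 / 3.0857e19 * 3.156e16   # 67.7 km/s/Mpc converted to 1/Gyr
OM, OL = 0.31, 0.69                # matter and dark-energy density fractions

def age_gyr(z, steps=100_000):
    """Age of the Universe at redshift z, via midpoint-rule integration."""
    a_end = 1.0 / (1.0 + z)
    da = a_end / steps
    total = 0.0
    for i in range(steps):
        a = (i + 0.5) * da
        H = H0 * math.sqrt(OM / a**3 + OL)
        total += da / (a * H)
    return total

print(age_gyr(0.0))  # ~13.8 Gyr today
print(age_gyr(1.0))  # ~5.9 Gyr at z = 1
print(age_gyr(3.0))  # ~2.1 Gyr at z = 3
```

This reproduces the 2-to-6-billion-year window quoted above for the z = 1 to z = 3 range.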

When the James Webb telescope launches in 2018, we should have an even better view of these dim and distant galaxies. Until then, gravitational lensing will help us explore this critical period of galaxy formation.

Paper: Anahita Alavi, et al. The Evolution Of The Faint End Of The UV Luminosity Function During The Peak Epoch Of Star Formation (1<z<3). The Astrophysical Journal, Volume 832, Number 1 (2016)  arXiv:1606.00469

The post The Stars Uncounted appeared first on One Universe at a Time.

]]>
https://briankoberlein.com/2016/11/28/the-stars-uncounted/feed/ 2
Sense Of Direction https://briankoberlein.com/2016/10/04/sense-of-direction/ https://briankoberlein.com/2016/10/04/sense-of-direction/#comments Tue, 04 Oct 2016 11:00:07 +0000 https://briankoberlein.com/?p=6277

The Universe has no preferred direction, and that's a good thing.

The post Sense Of Direction appeared first on One Universe at a Time.

]]>

One of the basic assumptions of cosmology is that the Universe is essentially the same everywhere. That is, our location in the cosmos isn't special, and if we happened to be in another corner of the galaxy we'd see much the same thing. This is sometimes known as the Copernican principle, since it was Copernicus who famously proposed that the Earth was not the center of the Universe. But given that we haven't traveled very far into space, how can we know this assumption is valid? 

The usual condition for the Copernican principle is that the Universe is homogeneous (meaning that on large scales matter is evenly distributed) and isotropic (meaning that there is no special direction or orientation in the cosmos). The first can be checked by observing the distribution of galaxies in the visible universe. On small scales galaxies clump into clusters and superclusters, but at increasingly large scales things even out. We also know that the laws of physics seem to be the same throughout the cosmos, which supports the idea of homogeneity.

But what about the directionality of the Universe? To be isotropic, all directions have to be equivalent. This means, for example, that the Universe as a whole can't be rotating. If there were a cosmic rotation, the axis of rotation would be a preferred direction. It also means that the Universe can't be expanding more quickly in one direction than another, since that would give the Universe a specific orientation. These aren't simply hypothetical constraints. As Kurt Gödel demonstrated in 1949, general relativity does allow for a rotating universe, and several models of cosmic expansion have proposed that it might vary between different regions of space. If it turns out the Universe isn't isotropic, our current model of the Universe would be overturned.

One of the best ways to test isotropy is to look at the cosmic microwave background (CMB). Since the CMB is the most distant source of light we can observe, any rotation or preferred expansion should show up as deviations from isotropy within it. Previous studies, such as those using Planck spacecraft data, found no evidence of any deviations, but a new study takes things a step further. It looked at everything from the distribution of hot and cold regions within the cosmic background to the polarization of CMB light, searching for signatures of both anisotropic expansion and cosmic rotation (vorticity). The study found the CMB to be isotropic to the limits of the data. Specifically, they found the odds of a preferred direction in cosmic expansion to be 121,000 to 1 against. This is good news for supporters of the standard model, as well as for still-theoretical models such as early cosmic inflation.
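
Odds like 121,000 to 1 can be hard to picture. One rough way to interpret them, not part of the paper's Bayesian analysis, is to ask what Gaussian "sigma" significance would correspond to the same probability:

```python
import math

def gaussian_sigma(p, lo=0.0, hi=10.0):
    """Find sigma such that the two-sided Gaussian tail probability
    erfc(sigma / sqrt(2)) equals p, by simple bisection."""
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if math.erfc(mid / math.sqrt(2.0)) > p:
            lo = mid    # tail still too fat; need a larger sigma
        else:
            hi = mid
    return (lo + hi) / 2.0

p = 1.0 / 121_000.0   # odds of 121,000 to 1 against
print(round(gaussian_sigma(p), 1))  # ~4.5 sigma
```

So the exclusion of a preferred direction is comparable to a roughly 4.5-sigma result in particle-physics language, though the analogy is only approximate since the paper's odds come from Bayesian model comparison.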

There might still be some small anisotropy within the Universe, but as far as we can tell the Copernican principle holds. For now, cosmology continues to lack a sense of direction.

Paper: Daniela Saadeh, et al. How Isotropic is the Universe? Physical Review Letters, 2016; 117 (13) DOI: 10.1103/PhysRevLett.117.131302

The post Sense Of Direction appeared first on One Universe at a Time.

]]>
https://briankoberlein.com/2016/10/04/sense-of-direction/feed/ 3
The Constant Of Time https://briankoberlein.com/2016/09/13/the-constant-of-time/ https://briankoberlein.com/2016/09/13/the-constant-of-time/#comments Tue, 13 Sep 2016 15:00:06 +0000 https://briankoberlein.com/?p=6253

The rate of cosmic expansion has changed over time. So why does it look like a constantly expanding universe?

The post The Constant Of Time appeared first on One Universe at a Time.

]]>

When Edwin Hubble first demonstrated in 1929 that the Universe was expanding, you could do a simple calculation to estimate the age of the Universe: take the rate at which galaxies recede from each other (known as the Hubble constant H) and set it equal to the inverse of the cosmic age (1/t). This simple model assumes that the Universe expands at a constant rate, so that Ht = 1. When this was first proposed within the context of the big bang model, it actually raised a few questions. Early measurements of the Hubble constant were much higher than the currently accepted value, which gave a cosmic age younger than some stars.
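
The "Hubble time" 1/H is a one-line calculation, and it shows why those early high values were such a problem. A sketch (Hubble's original estimate was around 500 km/s/Mpc, far above today's value):

```python
# Naive constant-expansion age of the Universe: t = 1/H, so H * t = 1.
KM_PER_MPC = 3.0857e19     # kilometres in a megaparsec
SECONDS_PER_GYR = 3.156e16 # seconds in a billion years

def hubble_time_gyr(H0_km_s_mpc):
    """Cosmic age in Gyr assuming the Universe always expanded at rate H0."""
    H0_per_s = H0_km_s_mpc / KM_PER_MPC   # convert to 1/s
    return 1.0 / H0_per_s / SECONDS_PER_GYR

print(hubble_time_gyr(500.0))  # Hubble's original value: ~2 Gyr, younger than many stars
print(hubble_time_gyr(67.8))   # a modern value: ~14.4 Gyr
```

With H near 500 km/s/Mpc the Universe would be only about 2 billion years old, which is why the early big bang model looked awkward next to stellar ages.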

We now know the Universe hasn’t expanded at a constant rate. The rate of cosmic expansion is determined both by dark energy driving galaxies apart, and the overall density of matter in the Universe, which tries to slow the rate of expansion. In the early universe, matter dominated, so the rate of expansion was actually decreasing. About 6.5 billion years ago the average density of the Universe dropped to the point that dark energy began to dominate, and the Universe began expanding at an ever increasing rate. An accurate determination of the age of the Universe has to account for the initial inflationary period, then deceleration, then acceleration. If you do that you get an age of about 13.8 billion years, which is the currently accepted age.

Because of this variation in cosmic expansion, the Hubble constant has changed over cosmic time, which is why you can't simply set Ht = 1. And yet, if you take the current Hubble constant and multiply it by the currently accepted age of the Universe, you get a value remarkably close to 1 (consistent with 1 to within the known uncertainties). In other words, if the Universe had always expanded at a constant rate, it would be almost exactly the same size and age as the Universe currently is. This is known as the synchronicity problem. It's not a problem, per se, but rather an interesting coincidence, and it hasn't held true for any other epoch of the cosmos. It's also not the only odd coincidence. The vacuum energy density (as determined by the Hubble constant) and the matter energy density are also currently about equal, which is known as the coincidence problem.
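
You can check the dimensionless product H₀t₀ directly. A quick sketch using representative present-day values (the exact numbers depend on which measurements you adopt):

```python
# The synchronicity coincidence: for today's Universe, H0 * t0 is very nearly 1.
KM_PER_MPC = 3.0857e19     # kilometres in a megaparsec
SECONDS_PER_GYR = 3.156e16 # seconds in a billion years

H0 = 67.8 / KM_PER_MPC        # Hubble constant in 1/s (assumed value)
t0 = 13.8 * SECONDS_PER_GYR   # accepted cosmic age in seconds

print(round(H0 * t0, 2))  # ~0.96, strikingly close to 1
```

For comparison, a purely matter-dominated universe would give Ht = 2/3 at every epoch, so a value this close to 1 today really is a special feature of our particular moment.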

As the Universe expands the matter density drops, while the vacuum density doesn’t,  so it’s tempting to think that the synchronicity problem and the coincidence problem are two sides of the same coin. But a recent work shows this isn’t the case. By varying the parameters of a hypothetical universe, one could create a model where one is true but the other is not. These two unusual correlations are independent of each other. This raises the question of whether the two actually are related by some unknown physical process. We always have to be a bit careful with these kinds of questions. It is perfectly possible that the two “problems” are just due to random flukes. But when you start seeing coincidences in your data it is sometimes worth exploring.

If there is a connection, it will only be a matter of time before we find it.

Paper: Arturo Avelino and Robert P. Kirshner. The dimensionless age of the Universe: a riddle for our time. The Astrophysical Journal, Volume 828, Number 1 (2016) arXiv:1607.00002 [astro-ph.CO]

The post The Constant Of Time appeared first on One Universe at a Time.

]]>
https://briankoberlein.com/2016/09/13/the-constant-of-time/feed/ 2
Bouncing Back https://briankoberlein.com/2016/07/19/bouncing-back/ https://briankoberlein.com/2016/07/19/bouncing-back/#comments Tue, 19 Jul 2016 11:00:05 +0000 https://briankoberlein.com/?p=6093

Perhaps the Universe began with a big bounce rather than a big bang.

The post Bouncing Back appeared first on One Universe at a Time.

]]>

The Universe began with a big bang. Not an explosion from a single point, but rather an expansion from an early hot, dense state. Of course, an obvious question this raises is: what came before the big bang? While it's possible that the answer is "nothing," that hasn't stopped some theorists from postulating an earlier cause for the Universe. One of these ideas is known as the big bounce. 

The basic idea of the big bounce is that the Universe goes through a series of expansions and contractions. Right now we live in an expanding Universe, but at some point, the model argues, the Universe will start to contract. Eventually it will contract to a dense fireball again, and this will trigger a new big bang. This solves the "what came before" problem of the big bang by postulating an infinite series of big bangs, but it has problems of its own. For one, as we currently understand dark energy, the Universe will likely continue to expand forever. For another, if the Universe did re-collapse into a dense state, we have no idea how it would trigger a new big bang.

A new work in Physical Review Letters proposes a solution to this second problem. The key to the idea is to introduce quantum theory into the mix. In a purely classical model, a shrinking universe will eventually collapse into a singularity. It's long been thought that quantum theory could provide a way around this, but the devil is in the details. To prevent the formation of a singularity, the work introduces a symmetry known as conformal invariance. As long as the Universe retains this symmetry during its dense phase, it can pass through the dense state at the end of one "universe" and re-expand to form a new one. The authors call this a perfect bounce.
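
Schematically, conformal invariance is symmetry under a local rescaling of the metric. A rough sketch in generic notation, not the paper's full construction:

```latex
% A conformal (Weyl) rescaling of the spacetime metric:
g_{\mu\nu}(x) \;\longrightarrow\; \tilde{g}_{\mu\nu}(x) = \Omega^{2}(x)\,g_{\mu\nu}(x)
```

A conformally invariant theory is insensitive to the local scale factor Ω(x). Roughly speaking, radiation respects this symmetry, so near a radiation-dominated bounce the overall "size" of the universe carries no physical information, and what looks like a singularity in ordinary variables can become a regular point that the Universe passes through.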

So with the right conditions it’s possible that our Universe could simply be the period between bounces.

Paper: Steffen Gielen et al. Perfect Quantum Cosmological Bounce. Phys. Rev. Lett. 117, 021301 (2016). DOI: 10.1103/PhysRevLett.117.021301

The post Bouncing Back appeared first on One Universe at a Time.

]]>
https://briankoberlein.com/2016/07/19/bouncing-back/feed/ 20
A Million Points Of Light https://briankoberlein.com/2016/07/16/million-points-light/ https://briankoberlein.com/2016/07/16/million-points-light/#comments Sat, 16 Jul 2016 16:16:38 +0000 https://briankoberlein.com/?p=6090

A new survey of more than a million galaxies supports the standard model of cosmology.

The post A Million Points Of Light appeared first on One Universe at a Time.

]]>

The Baryon Oscillation Spectroscopic Survey (BOSS) is part of the Sloan Digital Sky Survey-III. It looks at the positions and redshifts of more than a million galaxies. Redshift can be used as a measure of distance, though due to cosmic expansion you have to be a little careful about what that distance means. Using such a large sample, we can look at how the redshifts of galaxies (and therefore their distances) cluster on average, and this allows us to test whether the standard model of cosmology is correct. 

The key is to measure what is known as baryon acoustic oscillation (BAO). In the Universe there are two main effects driving the way in which galaxies cluster on large scales. One is dark matter, which causes galaxies to clump together. The other is dark energy, which causes these clumps to spread apart. The scale at which clumping occurs allows us to estimate the ratio of dark matter to dark energy. It also allows us to study whether that ratio has changed over time. If, for example, dark energy were stronger in the early Universe, the gaps between clumps would be larger at greater distances.

The standard model of cosmology is known as the LCDM model, and it assumes that dark energy is constant rather than varying over time and space. This new result found the level of dark energy to be constant over time to the limits of observation, so once again the standard model holds up. The survey also calculated the Hubble parameter, obtaining a value of 67.3 km/s per megaparsec, which is much lower than the value of 73 recently found by Hubble Space Telescope observations, and lower even than the "official" value of 69.3 km/s per megaparsec. This kind of variation among different observations, known as tension in the model, seems common these days. While the results aren't completely contradictory once you take their uncertainties into account, they could hint at aspects of the model we don't fully understand.
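
One common way to quantify such a tension is to count how many combined standard deviations separate two measurements. The sketch below uses the central values from the text, but the error bars are illustrative guesses, not the published uncertainties:

```python
import math

def tension_sigma(v1, err1, v2, err2):
    """Number of combined standard deviations separating two measurements,
    assuming independent Gaussian errors."""
    return abs(v1 - v2) / math.sqrt(err1**2 + err2**2)

# BOSS-like value vs. a local distance-ladder value (uncertainties assumed):
sigma = tension_sigma(67.3, 1.0, 73.0, 1.8)
print(round(sigma, 1))  # ~2.8 sigma
```

A separation of two to three sigma is exactly the uncomfortable middle ground described above: not a clear contradiction, but not easily dismissed either.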

So overall the standard cosmological model holds up once again, but there are hints that a new chapter of the story may be unfolding as well.

Paper: Shadab Alam et al. The clustering of galaxies in the completed SDSS-III Baryon Oscillation Spectroscopic Survey: cosmological analysis of the DR12 galaxy sample. Monthly Notices of the Royal Astronomical Society (2016) arXiv:1607.03155 [astro-ph.CO]

The post A Million Points Of Light appeared first on One Universe at a Time.

]]>
https://briankoberlein.com/2016/07/16/million-points-light/feed/ 4
Bell, Book, and Candle https://briankoberlein.com/2016/07/01/bell-book-candle/ https://briankoberlein.com/2016/07/01/bell-book-candle/#respond Fri, 01 Jul 2016 11:00:43 +0000 https://briankoberlein.com/?p=6066

Dark energy isn't some magical idea we've dreamt up to fit our models.

The post Bell, Book, and Candle appeared first on One Universe at a Time.

]]>

Cosmology has upped its game in the past couple of decades. There was a time when the age of the Universe was known only as "probably several billion years," because measurements of cosmic distances and the Hubble parameter were only loosely constrained. Today we can say the Universe is 13.8 billion years old, give or take a bit. But that level of precision has been hard won, and it hasn't come without controversy. 

Some of the methods used to measure cosmic distances. Credit: Tabitha Dillinger

Cosmic distances are measured through a range of methods that overlap in their useful distances, known as the cosmic distance ladder. The three big methods are parallax for nearby stars, Cepheid variable stars for globular clusters and nearby galaxies, and type Ia supernovae for distant galaxies. The latter two are known as standard candles, because we can indirectly determine their actual brightness and compare it with their observed brightness to infer their distance. The Achilles heel of standard candles is that if their presumed actual brightness is wrong, then our distance measurements are wrong.
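
The standard-candle calculation itself is the classic distance-modulus relation, m − M = 5 log₁₀(d / 10 pc). A minimal sketch, with a typical (illustrative) type Ia calibration:

```python
# Distance from a standard candle via the distance modulus:
#   m - M = 5 * log10(d / 10 pc)   =>   d = 10 ** ((m - M + 5) / 5)  parsecs
def candle_distance_pc(m_apparent, M_absolute):
    return 10.0 ** ((m_apparent - M_absolute + 5.0) / 5.0)

# A type Ia supernova (absolute magnitude roughly -19.3) observed at m = 24:
d_pc = candle_distance_pc(24.0, -19.3)
print(f"{d_pc / 1e9:.1f} Gpc")  # ~4.6 Gpc

# The Achilles heel: if the true absolute magnitude is off by 0.2 mag,
# the inferred distance shifts by roughly 10 percent.
wrong = candle_distance_pc(24.0, -19.1)
print(round(wrong / d_pc, 2))
```

Because the magnitude enters an exponent, even small errors in the presumed brightness translate directly into distance errors, which is why the two-variety discovery below matters.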

Recently there’s been a bit of furor over the discovery that type Ia supernovae aren’t entirely consistent in their brightness, but come in two varieties. One is a bit brighter at reddish wavelengths, the other a bit brighter at bluish wavelengths. We’ve generally used a single method to determine absolute brightness, and these two varieties would give different answers. This led to speculation that perhaps dark energy and the ever-increasing expansion of the universe were actually wrong. If distant supernovae are actually dimmer than we assumed, perhaps dark energy isn’t real.

That made for great headlines, but it isn’t true. While we do use supernovae to demonstrate the existence of dark energy, we have other methods as well, such as the large-scale clustering of galaxies and fluctuations in the cosmic background. This new discovery might mean there is less dark energy than we thought, but dark energy isn’t going away, because we have a confluence of evidence to support it. If anything, the discovery is an indication of just how far we’ve come: our measurements are now so sophisticated that we can add nuances to our results to make them better.

This has happened before. In the early days of modern cosmology the big focus was on Cepheid variables. After being used to demonstrate our galaxy wasn’t alone in the Universe, they were used to determine distances to galaxies millions of light years away. It was Cepheids that allowed Hubble to first demonstrate the Universe was expanding. But as we observed more Cepheids it became clear that they, too, came in two varieties. Once this was understood, the estimated age of the Universe shifted from a few billion years to around 10 – 20 billion. What started as a problem led to a better measure of cosmic age.

Which just goes to show that dark energy isn’t some magical idea we’ve dreamt up to fit our models. Our model is so good now that we can find subtle errors and correct them.

The post Bell, Book, and Candle appeared first on One Universe at a Time.

]]>
https://briankoberlein.com/2016/07/01/bell-book-candle/feed/ 0