photography – One Universe at a Time
https://briankoberlein.com
Brian Koberlein, Thu, 21 Feb 2019 22:09:36 +0000

Astronomy On The Side
https://briankoberlein.com/2015/11/15/astronomy-on-the-side/
Sun, 15 Nov 2015 16:19:26 +0000

One of the interesting things about astronomy is how much of it has been driven by amateurs. While today many people are employed as astronomers full time, that hasn’t always been the case. Often astronomy is a hobby, and sometimes these “amateurs” make significant advances in the field. Take, for example, the story of Andrew Common.

Common’s photograph of the Orion Nebula was the first of its kind.

Common worked as a sanitary engineer in the 1800s, but had an interest in photography and astronomy. Since he didn’t have an official position or equipment, he built his own observatory in a garden shed in his backyard. In the 1870s he photographed the Moon and planets, and his work earned him a fellowship in the Royal Astronomical Society. But he really wanted to photograph stars and nebulae. Using a 36-inch reflecting telescope, he was able to photograph the Orion Nebula in 1883. For the first time, a photograph showed more detail than could be seen with the naked eye. His image won the Gold Medal of the Royal Astronomical Society.

Toward the end of his life, Common worked to build a 60-inch mirror, which was plagued with difficulties. Along the way he pioneered methods of grinding such large mirrors. But his photograph of the Orion Nebula remains his most famous work.

Not bad for a bit of astronomy on the side.

The Illusion of Reality
https://briankoberlein.com/2015/05/22/the-illusion-of-reality/
Fri, 22 May 2015 13:34:10 +0000

One of the most impressive aspects of astronomy is the stunning visuals. These amazing color images inspire our love of the cosmos, and are a perennial hit on social media. They also aren’t real, at least in the sense of being an accurate representation of how celestial objects actually appear to the human eye. They are more art than science, providing an illusion of reality.

The reason for this is rooted in the way astronomers observe the heavens. At a basic level, astronomers image the sky in much the same way you might take a selfie with your phone. Both are captured with a digital camera, and both are manipulated to produce the desired result. But in astronomy we’re primarily interested in accurate data, which means creating an image often comes second.

Compression levels. Credit: Mountain Heights Academy OpenCourseWare

If you take a photograph with your phone, for example, it’s typically stored as a JPEG (.jpg) file. In this format, images are compressed to reduce their size, and the compression is “lossy.” This means part of the image is approximated, which loses some of the information in the image. For selfies the approximation isn’t noticeable, so this isn’t typically a big deal. For scientific imagery, however, you don’t want approximations; you want to preserve 100% of the information you worked so hard to collect. So astronomers typically use a different image format known as the Flexible Image Transport System (FITS).

The FITS format is uncompressed, so none of the original data is lost. Its header is plain ASCII text describing the observation, which makes the files easy to inspect, analyze, or convert to other formats, while the image itself is stored as a raw array of pixel values. The header can also hold metadata, or information about how and where the image was obtained, which is particularly useful when you need to combine data from multiple sources.
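FITS headers are built from 80-character ASCII “cards” of the form `KEYWORD = value / comment`. Here is a minimal sketch of reading such cards (a toy parser for illustration only; real FITS files interleave these headers with binary data blocks, and for actual work you’d use a library such as astropy.io.fits):

```python
# Minimal sketch: walk a FITS-style header in 80-character cards and
# collect the keyword/value pairs, stopping at the END card.
def parse_cards(header_text):
    cards = {}
    for i in range(0, len(header_text), 80):
        card = header_text[i:i + 80]
        key = card[:8].strip()          # keyword occupies the first 8 columns
        if key == "END":
            break
        if card[8:10] == "= ":          # value indicator in columns 9-10
            value = card[10:].split("/")[0].strip()  # drop inline comment
            cards[key] = value
    return cards

# A hand-built example header (each card padded to 80 characters):
header = (
    "SIMPLE  =                    T / conforms to FITS standard".ljust(80)
    + "BITPIX  =                   16 / bits per pixel".ljust(80)
    + "NAXIS   =                    2".ljust(80)
    + "END".ljust(80)
)
print(parse_cards(header))  # {'SIMPLE': 'T', 'BITPIX': '16', 'NAXIS': '2'}
```

Keywords like SIMPLE, BITPIX and NAXIS are mandatory in real FITS files, which is part of what makes combining data from different observatories practical.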

Left: A raw FITS image. Right: The same image with brightness and contrast enhanced.

One disadvantage of the FITS format is that raw images typically need to be manipulated to show anything. For example, a file might give the amount of light gathered for each pixel on a linear scale. When displayed on a screen the raw image often looks black because our eyes perceive brightness on a logarithmic scale. To actually see the image of a faint galaxy, above, we have to severely adjust the brightness and contrast.
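The stretch itself is simple arithmetic. Here is a sketch (the photon counts and display scale are made up for illustration; real viewers offer log, square-root and asinh stretches for the same reason):

```python
import math

# Sketch: map raw linear photon counts to 0-255 display values.
# On a linear scale the faint pixels all land near zero, so the frame
# looks black; a logarithmic stretch spreads them across the visible
# range, revealing faint detail next to a bright star.
def linear_stretch(counts, max_count):
    return [round(255 * c / max_count) for c in counts]

def log_stretch(counts, max_count):
    return [round(255 * math.log1p(c) / math.log1p(max_count)) for c in counts]

counts = [0, 5, 50, 500, 50_000]       # faint nebulosity up to a bright star
print(linear_stretch(counts, 50_000))  # all but the star display near black
print(log_stretch(counts, 50_000))     # the faint pixels become visible
```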

Another difference between your typical selfie and an astronomical image is the way in which color images are produced. Digital cameras detect light through an array of sensors that measure the amount of light reaching them (typically CMOS or CCD detectors). These sensors are sensitive to light within a particular range of wavelengths. Most commercial digital cameras also add an array of filters so that some pixels capture only red light, and others only green or blue. The three “color” images are then combined to produce the color photograph. This is similar to the way our eyes perceive light, with three types of cones in our retina, each most sensitive to one of these three primary colors.

Hubble images of M57 taken at wavelengths (in nanometers) of 658 (red), 502 (green) and 469 (blue). I’ve given them color and combined them to produce the color image (bottom right).

While this is an easy way to produce a color image, its big downside is that each type of sensor is only capturing a fraction of the light. It also means that the amount of light gathered at each wavelength is determined by the ratio of red, green and blue sensors, and can’t be changed. Since astronomers want to gather as much light as possible, their cameras are sensitive to a wide range of wavelengths. Different filters can then be placed in front of the sensors if we want to focus on a particular color range. As a result, raw photographs in astronomy are almost always black and white.

To create a color image, black and white frames taken through different filters are colorized and combined. With the right care it’s possible to create an image that closely approximates “true color.” But often the resulting image doesn’t accurately represent the real colors of the night sky, and often this is intentional. It’s sometimes referred to as the National Geographic effect.

In the late 1970s, the Voyager missions made their flybys of Jupiter. It was the first time truly detailed images were gathered of the planet. Magazines such as National Geographic ran full-page spreads of these images, which were absolutely stunning. Then, as now, the raw data were black and white images captured through different color filters, which were combined to create color photographs. But rather than using true-color images, the photos had boosted colors and depth. It made for great imagery, but wasn’t a true representation of how Jupiter looks.

Left: The Great Red Spot as seen in National Geographic. Right: A more accurate color image produced with the same data. Credit: NASA/JPL

There are some who would argue that these enhanced images misrepresent reality in a way that runs counter to scientific accuracy. Shouldn’t we be honest and strive for accurate images rather than color-hyped photographs that are more art than science?

While there’s a case to be made for accuracy, in some ways a color-hyped image is more accurate to what we perceive, even if it isn’t accurate to reality. By changing the contrast on these images, we can visually perceive details that would be washed out if we insisted on “true color” all the time. If you asked people the color of the Moon, for example, most would say it is white or pale gray. They would say this based upon their own observation of the Moon. But in reality, the Moon is a much darker shade that borders on black, more the color of gunpowder. A similar effect occurs with Mars, which we see in the sky as pale red, but is more the color of butterscotch or cocoa powder. The reason for this discrepancy is that our perception of colors depends on other factors such as the brightness of an object, or the colors of objects next to it.

Top Row (left to right): The Crab Nebula at radio, infrared and visible wavelengths. Bottom Row (left to right): ultraviolet, x-ray, and a false-color composition of the full range.

Then there is the vast range of wavelengths that our eyes can’t observe at all. We’ve developed telescopes that can see radio, infrared, ultraviolet, x-rays and gamma rays. Strict accuracy would ask that we remain blind to these images. But instead we produce false-color images, where colors are assigned to various wavelengths. This allows us to perceive structures that wouldn’t be apparent otherwise, structures that in many ways describe what’s physically present better than our eyes could alone.

Images credit: Chris Spratt (L); D. López (IAC) (R)

Compare what the human eye sees of the Ring Nebula through an eyepiece (left) with what the Isaac Newton Telescope’s Wide Field Camera produces (right), after advanced processing, multiwavelength views, spectroscopic data and the contrast turned way up.

Pictures tell a story, and sometimes the power of these images lies not in being true to life, but rather in extending our view of the universe beyond the limits of the human eye.

Take the Shot
https://briankoberlein.com/2015/02/11/take-shot/
Wed, 11 Feb 2015 12:00:53 +0000

This image might look like something created in Photoshop, but it is very real. It was produced from a series of long exposures by Petr Horálek, and it captures a range of astronomical objects in a single image.

In the center is Comet Lovejoy, which you might notice has a bit of a greenish tint. Just to the left of Lovejoy is a meteor, far too small to reach the Earth, but bright enough to announce its arrival in our atmosphere. Above and to the right of Lovejoy is the open cluster known as the Pleiades, one of the most recognizable star clusters in the night sky. The red arc on the right is the California Nebula, an emission nebula whose red coloring comes from atomic hydrogen excited by ultraviolet light. Throughout the upper right region you can see the diffuse light of the Milky Way, with darker dusty regions within it. The greenish glow near the horizon is an effect known as airglow. It isn’t an aurora, but a very faint glow due to excited particles in the upper atmosphere. Airglow is fairly uniform across the sky, but it is most noticeable near the horizon, since in that direction we look through the most atmosphere.

It’s a striking image that brings home the fact that the night sky is filled with light and color. With the limited light sensitivity of our eyes, and our often light-polluted vantage points, it’s easy to forget that.

National Geographic Effect
https://briankoberlein.com/2014/10/20/national-geographic-effect/
Mon, 20 Oct 2014 11:00:02 +0000

In the late 1970s the Voyager missions made their flybys of Jupiter. It was the first time truly detailed images were gathered of the planet (even more detailed than the Pioneer images) and they created quite a stir. These detailed images appeared all over the media, but were perhaps most widely seen as full-page color photographs in National Geographic. But rather than using true-color images, the photos in National Geographic had boosted colors and depth. It made for great imagery, but wasn’t a true representation of how Jupiter looks. When this color enhancement was pointed out, it was sometimes referred to as the National Geographic effect.

You can see this effect in the image above. On the left is a Voyager 2 image of Jupiter’s Great Red Spot as it appeared in NatGeo and elsewhere. On the right is the same image in its more true-color form. You can see why the colors were boosted: the true-color image lacks much of the depth and richness we like to see in images.

There are some who would argue that enhanced color images misrepresent reality in a way that runs counter to scientific accuracy. We should be honest and strive for accurate images rather than color-hyped images that are more art than science. That view has gained significant traction since the 1970s, and today you can easily find true-color images of many celestial objects.

A more true-color image of the Moon. Credit: NASA/GSFC/Arizona State University/J. Major

But on the other hand, in some ways a color-hyped image is more accurate to what we perceive, even if it is less accurate to reality. Take, for example, the color of the Moon. If you asked people, most would say it is white or pale gray, based upon their own observation: it appears pale gray when you look at it in the night sky. In reality, the Moon is a much darker shade, with the “white” regions more the color of gunpowder and the darker regions closer to the color of asphalt. A similar effect occurs with Mars. We see it in the sky as pale red, but it is actually more the color of butterscotch or cocoa powder.

Images of Enceladus, the Earth, the Moon, and Comet 67P/C-G, with their relative albedos scaled approximately correctly. Credit: ESA’s Rosetta Blog

Part of the reason for this discrepancy is that we don’t perceive colors as absolute, but rather relative to surrounding colors. Against the background of black sky, colors appear brighter and more pale.  The actual brightness of an object is related to its albedo, or the fraction of light striking an object that reflects off its surface. We normally adjust for albedo in photographs, so we don’t notice the variation. For example, Saturn’s moon Enceladus is a brilliant white body that reflects nearly 100% of the light striking it. The comet 67P/Churyumov–Gerasimenko is also photographed as a bright white object, but has an albedo of only 5%. Compared to Enceladus, 67P is as dark as a lump of coal. But even a lump of coal would be brighter than the dark of space, so showing it as a brighter gray object is more realistic.
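A quick sketch of that exposure adjustment, using rough albedo figures from the text (Enceladus near 1.0, comet 67P about 0.05; the display function is illustrative, not any real pipeline):

```python
# Sketch of why auto-scaled exposures hide albedo differences: each
# image is normally stretched to its own brightest value, so a dark
# comet and a brilliant icy moon both come out looking white.
def display_value(reflectance, scale_max):
    """Map reflected light to an 8-bit gray value for a given exposure scale."""
    return round(255 * reflectance / scale_max)

enceladus, comet_67p = 0.99, 0.05  # approximate albedos

# Each body photographed alone, exposure scaled to the body itself:
print(display_value(enceladus, enceladus), display_value(comet_67p, comet_67p))
# -> 255 255 : both appear bright white

# Both shown on a common scale, as in the composite above:
print(display_value(enceladus, enceladus), display_value(comet_67p, enceladus))
# -> 255 13 : 67P is nearly black next to Enceladus
```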

Although we can strive for accurate “true-color” representations of celestial objects, they will never be entirely like the way they would appear to us in real life. Even our spacecraft don’t take color images. Instead they take grayscale images at various wavelengths, which can be combined to create color images. Whether we generate more brilliant or more true images is a matter of preference.

Star of Bethlehem
https://briankoberlein.com/2014/09/21/star-bethlehem/
Sun, 21 Sep 2014 11:00:04 +0000

When stars are portrayed in media, they are often shown with long spikes emanating from them. Perhaps the most common example is that of the “star of Bethlehem” which, according to the story, led the wise men to baby Jesus. Of course when we look at stars in the night sky, we don’t see any such spikes. Stars twinkle due to atmospheric disturbances, but that’s about it. In photographs, however, bright stars often have such long spikes. So what causes them? It all has to do with an interesting bit of optics.

The Craig lens-based telescope was 85 ft long. Credit: Wakefield Collection

In astronomy, they are known as diffraction spikes, and they appear with certain types of telescopes. Optical telescopes broadly fall into two types: lens-based and mirror-based. Lens-based telescopes were the first to be developed, and are basically a long tube with two or more lenses. Starlight is refracted by the lenses to produce a magnified image. Since the light goes straight through the telescope unimpeded, you don’t get any spikes on stars. The big disadvantage of lens telescopes is that they tend to be fairly long for large magnifications, and large lenses are difficult to make well.

The mirror supports on my telescope.

Mirror-based telescopes are easier to make, and because they reflect light they can be made shorter. Light is gathered by a large back mirror and focused onto a smaller secondary mirror, which then directs the light to an eyepiece or camera for viewing. One downside of this design is that starlight has to pass the smaller mirror and its supports before reaching the large back mirror. As it does, the supports cause the light to diffract. It’s the resulting diffraction pattern that makes stars appear spiky, hence the term diffraction spikes.

Normally diffraction spikes aren’t noticeable when you view objects directly, but with long photographic exposures they generally show up. Even professional telescopes have them in many of their images. They generally aren’t a problem for research, and the advantages of mirror telescopes vastly outweigh the minor inconvenience of diffraction spikes.
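As a rough sketch of the scale involved: light diffracting around a support vane of width w spreads over an angle of order λ/w, perpendicular to the vane, which is why thin supports throw long, narrow spikes. The numbers below are illustrative, not taken from the post:

```python
import math

# Rough, slit-like small-angle estimate: the characteristic angular
# spread of light diffracted by a vane of width w is about lambda / w.
def spike_angle_arcsec(wavelength_m, vane_width_m):
    theta_rad = wavelength_m / vane_width_m
    return math.degrees(theta_rad) * 3600

# Green light (550 nm) around a 1 mm support vane:
print(round(spike_angle_arcsec(550e-9, 1e-3)))  # ~113 arcseconds
```

Note the inverse relation: doubling the vane thickness halves the angular spread, so a chunkier support gives a shorter, stubbier spike.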

But the main reason we see diffraction spikes so often is that astrophotographers often use them to artistic effect. They transform a bright point of light into a wondrous stellar image.

Shades of Gray
https://briankoberlein.com/2014/06/01/shades-gray/
Sun, 01 Jun 2014 19:33:21 +0000

Despite all the wonderful color images we have from the Hubble Space Telescope, there is no color camera on the Hubble. The main reason for this is scientific. When observing astronomical objects, you’d like to gather as much light as you can from the object, across as wide a range of wavelengths as you can.

The light detectors used in space telescopes are typically charge-coupled devices, or CCDs. When light strikes a pixel in a CCD it induces an electric charge; the more photons that strike a pixel, the more charge accumulates. In this way the brightness can be measured by the amount of charge collected. A similar type of detector, the CMOS sensor, is typically used in cell phones and the like since it is inexpensive, but it tends to be less sensitive.

To make a color digital camera, a layer of filters is placed over the sensor so that each pixel sees only red, green or blue light; in the common Bayer arrangement, half the pixels are filtered green and a quarter each red and blue. This is similar to the way the cones in our eyes work. They are most sensitive to red, green or blue light, and our brain puts this information together to produce our color vision. In your cell phone, the red, green and blue pixels are put together to make a color image.
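That filter mosaic can be sketched in a few lines, assuming the common RGGB Bayer layout (half the pixels under green filters, a quarter each under red and blue, roughly matching the eye's greater sensitivity to green):

```python
# Sketch of an RGGB Bayer filter mosaic: which color filter covers
# each sensor pixel, repeating in 2x2 tiles across the chip.
def bayer_filter(row, col):
    """Return the filter color over the pixel at (row, col) in RGGB."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

mosaic = [[bayer_filter(r, c) for c in range(4)] for r in range(4)]
for row in mosaic:
    print(" ".join(row))
# R G R G
# G B G B
# R G R G
# G B G B
```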

This is fine if all you want to do is put photos of your cat on Instagram, but all those filters block a portion of the light, which produces a darker image. And since the filter array is built into the camera, you can only take color pictures, with no flexibility to image things at other wavelengths.

So for most telescopes the CCD simply measures brightness within the range of its sensitivity. For Hubble’s Wide Field and Planetary Camera (WFPC) this ranges from the infrared through the visible to the ultraviolet. Hubble then has filters that can be moved in front of the camera, so if you just want to look at infrared light, there is a filter that lets you do that.

So if telescope cameras typically only see in shades of gray, how do we get all these wonderful color images? They are composite images made from grayscale images taken through red, green and blue filters. These three images are each given the appropriate color, then layered to produce a color image. You can see this in the image above, where I’ve colorized the original filtered grayscales from the Hubble to produce a color image of the planetary nebula M57.
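The layering step itself is straightforward; here is a minimal sketch using tiny made-up 2x2 frames in place of real Hubble data:

```python
# Sketch of the compositing step: three grayscale frames, taken through
# red, green and blue filters, become the R, G and B channels of one
# color image. Frames here are tiny stand-ins for real data.
def combine_rgb(red_frame, green_frame, blue_frame):
    """Zip three grayscale frames into one frame of (R, G, B) pixels."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(red_frame, green_frame, blue_frame)
    ]

red   = [[200,  10], [ 30, 120]]   # e.g. the red-filter exposure
green = [[180,  20], [ 25, 110]]   # e.g. the green-filter exposure
blue  = [[ 60,  15], [ 90, 100]]   # e.g. the blue-filter exposure

color = combine_rgb(red, green, blue)
print(color[0][0])  # (200, 180, 60) -- a yellowish pixel
```

The same machinery works for false color: nothing requires the three input frames to come from visible wavelengths at all.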

My result is pretty basic. To produce the truly magnificent color images you see takes a tremendous amount of skill and talent. Of course, since these images are composites, you can also create false-color images of objects observed in infrared, ultraviolet, radio and x-rays, none of which our eyes can see at all.

Color images are not only beautiful, they serve to inspire us to understand and appreciate the universe around us. But astronomers find shades of gray much more useful.

Having FITS
https://briankoberlein.com/2013/10/30/having-fits/
Wed, 30 Oct 2013 16:25:21 +0000

Images on the web are typically in a portable format such as .gif, .jpeg or .png. They have the advantage of being relatively small (and therefore easy to download) as well as being viewable in any common web browser. For scientific purposes, however, these file formats are poorly suited. One major disadvantage is that they are compressed to reduce their size, and in the case of JPEG the compression is “lossy.” This means that part of the image is approximated, losing some of the information when it’s compressed. Normally the approximation is not noticeable, so for everyday digital photography these formats are just fine. This is why your camera phone, for example, converts each image to a .jpeg before saving it. Most people aren’t worried about compression or small image distortions; they just want a picture they can easily post on Facebook.
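The “lossy” idea can be sketched in a few lines. This toy example just quantizes pixel values to fewer gray levels (not how JPEG actually works, which quantizes frequency coefficients, but the consequence is the same): once the detail is discarded, no decompression can bring it back.

```python
# Toy sketch of lossy compression: quantizing pixel values to fewer
# gray levels shrinks the data, but the discarded detail can never be
# recovered from the compressed result.
def quantize(pixels, step):
    """Round each pixel value down to the nearest multiple of step."""
    return [p - (p % step) for p in pixels]

original = [17, 18, 19, 130, 131, 200]
compressed = quantize(original, step=16)  # 16 gray levels instead of 256

print(compressed)               # [16, 16, 16, 128, 128, 192]
print(compressed != original)   # True: three distinct values became one
```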

For scientific imagery, however, you want the data your camera gathers to be “raw.” In other words, you don’t want the image to be compressed or manipulated in any way. For this reason a different image format is used, known as the Flexible Image Transport System, or FITS. The FITS format is raw and uncompressed. Its header is stored as plain ASCII text, so you can even read through a file’s description by hand if you want to look for a particular feature. The other advantage is that it can also contain metadata, or information about how and where the image was obtained. This is particularly useful when you need to combine data from multiple sources, as is often the case in astronomy.

There are two disadvantages to the format. The first is that FITS files tend to be much larger than your typical image file. The images below are .png versions of an original .fits file, and they are about 50 kB each in size; the original file is over 600 kB. The second disadvantage is that raw images typically need to be manipulated to show anything. In the images of the Whirlpool Galaxy below, the first is the type of image you are likely to see on a web page.

The second image is what the raw file looks like before you enhance the brightness and contrast. Neither of these is a big disadvantage, and they are greatly outweighed by the benefit of having raw, unmanipulated data.

While the images you see online are typically heavily manipulated, many of them are freely available in their original .fits format. You just have to look for them a bit. There is also a free application known as ImageJ, which can view and manipulate FITS files, so you can have your own fun with raw astronomical data.

With a little practice, you too can be having FITS.
