October 20, 2010. I have decided to try my hand at narrowband imaging, i.e., color photos made by combining grayscale images that have been made through hydrogen alpha, oxygen III, and sulfur II filters (Hα, OIII, and SII) rather than the standard red, green, and blue filters that comprise a "normal" color photograph. The filters are called narrowband because each one admits only a very narrow slice of the visible spectrum, just 3 nanometers (3 billionths of a meter) wide. There are numerous grayscale images on this site that I made with Hα filters — for example, this recent photo of NGC 7000, which served as the Hα channel for all of the narrowband NGC 7000 photos on this page.
There are several ways of assembling a narrowband color image. There is the Hubble, or HST, palette, named for a system frequently used by the Space Telescope Science Institute, where RGB = SII, Hα, OIII. The photo above is in the HST palette. There is the reverse Hubble palette, where RGB = OIII, Hα, SII. And there is the Canada-France-Hawaii Telescope, or CFHT, palette, where RGB = Hα, OIII, SII. I do not compare my photos with those taken by the CFHT, and you should not, either. That 3.6-meter (142-inch) telescope sits at 4,200 meters (13,780 feet) at the summit of the extinct volcano Mauna Kea. My 4- and 6-inch telescopes are at 18 meters (60 feet) under the East-Coast Light Dome (which, as I have noted below, represents a vast misuse and/or waste of energy resources).
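The palette mappings above amount to nothing more than assigning each filtered grayscale frame to a color channel. A minimal sketch of the idea, using hypothetical NumPy arrays standing in for calibrated, stacked Hα, OIII, and SII frames (scaled 0–1):

```python
import numpy as np

# Hypothetical stand-ins for stacked grayscale frames (values 0..1).
ha   = np.full((2, 2), 0.8)   # hydrogen-alpha frame
oiii = np.full((2, 2), 0.5)   # oxygen III frame
sii  = np.full((2, 2), 0.2)   # sulfur II frame

def hst_palette(ha, oiii, sii):
    """Hubble (HST) palette: R = SII, G = Ha, B = OIII."""
    return np.dstack([sii, ha, oiii])

def cfht_palette(ha, oiii, sii):
    """CFHT palette: R = Ha, G = OIII, B = SII."""
    return np.dstack([ha, oiii, sii])

hst = hst_palette(ha, oiii, sii)    # shape (2, 2, 3), an RGB image
cfht = cfht_palette(ha, oiii, sii)
```

In practice the channel assignment happens inside image-processing software rather than in code like this, but the mapping is the same.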
Narrowband images are sometimes called false-color images. I think that is a misnomer. The colors in these images may be different from what you would see with your eye if you could see these objects in color, but they are not false; they simply emphasize the light of different atoms and ions. The reason we can't see dim astronomical (and earth-bound) objects in color with our eyes is that of the two types of photoreceptor cells in our eyes, rods and cones, only the cones are sensitive to color, and the cones require bright light. The rods can detect dim light, but they are not sensitive to color. Furthermore, the camera's CCD sensor can be thought of as accumulating and storing light (photons) up to a certain capacity (its saturation point, also known as its full-well capacity) for later reading by the computer. (In fact, photons striking the sensor knock electrons loose from the silicon that comprises the camera's light sensor, in numbers proportional to the number of photons that strike it. It is the electrons that are stored, not the photons.) Unfortunately, our eyes cannot do that; they detect only what is in our field of vision at any given instant. The memory of what we have seen may (or may not) be stored in our brains, but our eyes do not accumulate and store photons for later processing.
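The photon-to-electron bookkeeping can be sketched as a toy model of a single pixel: incoming photons free electrons in proportion to the sensor's quantum efficiency, and the pixel stores them until it saturates at its full-well capacity. The numbers here are purely illustrative, not the STL-11000M's actual specifications:

```python
def electrons_collected(photons, quantum_efficiency=0.5, full_well=50_000):
    """Toy model: electrons stored by one pixel for a given photon count."""
    generated = int(photons * quantum_efficiency)  # proportional response
    return min(generated, full_well)               # clipped at saturation

print(electrons_collected(10_000))    # well below saturation
print(electrons_collected(200_000))   # saturated at the full-well capacity
```

Once a pixel hits its full well, additional photons add nothing — which is why bright stars "burn out" to flat white in long exposures.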
Why narrowband? In large part because the artificial light sources that make up the great majority of the light pollution that often interferes with my RGB photography (and that represent billions of dollars in wasted energy in the U.S. each year) do not radiate in the narrow bands that comprise these photos. They are a way of beating light pollution and even moonlight. Narrowband images are also a way of literally seeing deep-sky objects in a different light. The downside: longer exposures are required, because narrowband filters by their very nature pass less light than wideband RGB filters. Think, for example, of the number of people who can walk through a narrow doorway in a given period of time compared to the number who can pass through the broad entrance to a sports arena. And high-quality narrowband filters such as the 50mm Astrodon 3 nm filters that I am using cost $900 each.
The three narrowband color photos of NGC 7000 on this page were made with my SBIG STL-11000M camera and my Takahashi FSQ-106ED astrograph with a 0.73x reducer. The effective focal length was 387mm @ ƒ3.65. All-Mac images.
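The stated effective focal length is easy to check. Assuming the FSQ-106ED's published specifications (106 mm aperture, 530 mm native focal length at f/5), the 0.73x reducer gives:

```python
# Effective focal length and focal ratio with a focal reducer.
native_focal_length = 530.0   # mm, FSQ-106ED native (f/5)
aperture = 106.0              # mm
reducer = 0.73                # reduction factor

effective_fl = native_focal_length * reducer   # ~386.9 mm, rounds to 387
focal_ratio = effective_fl / aperture          # ~f/3.65

print(round(effective_fl), round(focal_ratio, 2))
```

That matches the 387mm @ ƒ3.65 quoted above.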