Narrowband Color

Posted on May 20, 2020

In April 1990 NASA launched the Hubble Space Telescope. Above the distortion of the atmosphere, Hubble has an unobstructed view of the universe. In 1995 Hubble captured an image of three pillars of gas inside the Eagle Nebula, and it has become one of the most iconic images in astronomy. It is also a false color image; if you could see the nebula with your own eyes, it would not look like this.

Pillars of Creation in Hubble Palette, image courtesy of NASA

To create the Pillars of Creation, Hubble captured the wavelengths emitted by sulfur, hydrogen, and oxygen. This is accomplished with a monochrome sensor and filters that only allow those specific wavelengths through. The result is three monochrome images: one representing sulfur, one representing hydrogen, and one representing oxygen. As the image below from NASA shows, both hydrogen and sulfur emit in the red region of the spectrum, while oxygen is closer to the green wavelengths. If the images were combined using their true colors, it would be difficult to differentiate between the sulfur and hydrogen. How do we get around that issue? The answer is false color.

Eagle Nebula Pillars of Creation – enhanced color detail, image courtesy of NASA

The color palette used for the Hubble image is sulfur = red, hydrogen = green, and oxygen = blue. This combination has become known as the Hubble palette. It is characterized by strong blue and yellow hues in the image. The Hubble palette is not the only option; the wavelengths can be mixed in any combination to represent the image as desired. It is also possible to blend the narrowband images to come closer to what would be seen if the images were taken with red, green, and blue filters instead of sulfur, hydrogen, and oxygen.


How does all this relate to the images that I am taking? I am using a monochrome camera, so to capture different color channels I use filters that only let certain wavelengths through to the sensor. When taking the photos I use an electronic filter wheel, which lets the imaging software control which filter is in front of the sensor. The filter wheel I use is shown below, with the cover removed so the filters are visible. There are eight filter slots available; I currently have seven installed. The filter descriptions are below the image, and a rough sketch of how a night’s imaging plan maps to these slots follows the list.

Filter wheel with LRGB and SHO filters installed
  1. Luminance (L) – This is a clear filter that allows the entire visible band through. I typically use this to help boost contrast in the final image.
  2. Red (R) – The coating looks blue, but this is actually the red filter, allowing the red wavelengths through.
  3. Green (G) – The coating looks pink, but it is the green filter, allowing the green wavelengths through.
  4. Blue (B) – The yellow coating is the blue filter, allowing the blue wavelengths through.
  5. Sulfur II (SII) – This is the sulfur filter, only allowing a very narrow band through corresponding to sulfur emission wavelengths.
  6. Hydrogen-alpha (Ha) – The Hydrogen alpha filter, another narrowband filter, allows a narrow wavelength band through corresponding to hydrogen-alpha emissions.
  7. Oxygen III (OIII) – The last filter is the third narrowband filter, allowing the oxygen emission wavelengths through.
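
To make the sequencing idea concrete, here is a rough Python sketch of a night running multiple filters. The slot numbers match the list above, but the exposure times and frame counts are placeholders rather than my actual settings.

```python
# Hypothetical night's plan driven by the filter wheel. Slot numbers match the
# filter list above; exposure times and frame counts are placeholders only.
FILTERS = {1: "L", 2: "R", 3: "G", 4: "B", 5: "SII", 6: "Ha", 7: "OIII"}

# (slot, exposure in seconds, number of frames)
plan = [(6, 300, 20), (5, 300, 20), (7, 300, 20)]

total_s = 0
for slot, exposure_s, count in plan:
    total_s += exposure_s * count
    print(f"Slot {slot} ({FILTERS[slot]}): {count} x {exposure_s}s "
          f"= {exposure_s * count / 60:.0f} min")

print(f"Total integration: {total_s / 3600:.1f} hours")
```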

The term narrowband is used for the SII, Ha, and OIII filters because they pass a much narrower band of wavelengths. The L filter passes the most, roughly 300nm covering the red, green, and blue wavelengths. The R, G, and B filters each pass roughly 100nm covering their individual red, green, or blue range. The SII, Ha, and OIII filters I am using each pass only about 7nm.

So how does this all come together? I’ll use my Eastern Veil image to demonstrate. In a single evening I may image using only one filter, or I may set up the software to run multiple filters through the night. For the Eastern Veil I used the SII, Ha, and OIII filters over multiple nights spanning several weeks: one night using Ha, another using SII, and another using OIII. When done, I started the cycle over until I had as much data as I wanted to assemble the image.

To assemble the final image, all of the SII photos were calibrated, aligned, and stacked together. This is the resulting SII image.

Eastern Veil Nebula, SII channel
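
The same calibrate, align, and stack process is used for each channel. For anyone curious what the stacking step looks like, here is a minimal Python sketch; it assumes the frames have already been calibrated and aligned and are loaded as 2-D arrays (for example, read from FITS files).

```python
import numpy as np

# Minimal stacking sketch: assumes each frame is already calibrated
# (dark subtracted, flat corrected) and aligned, loaded as a 2-D float array.
def stack_frames(frames, sigma=3.0):
    """Sigma-clipped mean: reject outlier pixels (satellite trails, cosmic
    rays), then average the pixels that remain."""
    cube = np.stack(frames).astype(np.float32)   # shape (n_frames, height, width)
    mean = cube.mean(axis=0)
    std = cube.std(axis=0)
    keep = np.abs(cube - mean) <= sigma * std    # True for pixels near the mean
    total = np.where(keep, cube, 0.0).sum(axis=0)
    counts = np.maximum(keep.sum(axis=0), 1)     # avoid divide-by-zero
    return total / counts
```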

The Ha images were calibrated, aligned, and stacked the same way. In many deep sky objects the Ha emission is the strongest of the three, and that shows in the resulting image.

Eastern Veil Nebula, Ha channel

The OIII images were put through the same calibration, alignment, and stacking process. On this particular subject the OIII image below contains much of the fine detail.

Eastern Veil Nebula, OIII channel

Once the individual images are complete, it is time to assign colors and combine them. As mentioned above, there are many different ways to do this. First I will show the image assembled using the Hubble palette: the SII image is assigned to red, the Ha image to green, and the OIII image to blue. The initial result is often predominantly green due to the strong Ha presence. The image then goes through editing, where the green is reduced, leading to the blue and orange image seen below.

Caldwell 33 Eastern Veil Nebula in Hubble palette
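
For those who like to see the mechanics, here is a minimal Python sketch of the Hubble palette combination described above. It assumes the three stacked channel images are loaded as arrays scaled 0 to 1, and the green reduction is a crude stand-in for what really happens during editing.

```python
import numpy as np

# Minimal Hubble palette sketch: sii, ha, and oiii are the stacked, aligned
# channel images as 2-D float arrays scaled 0-1.
def hubble_palette(sii, ha, oiii, reduce_green=True):
    """Map SII to red, Ha to green, and OIII to blue."""
    rgb = np.dstack([sii, ha, oiii]).astype(np.float32)
    if reduce_green:
        # Crude green reduction: never let green exceed the mean of red and
        # blue. Real editing uses more careful tools, but the idea is similar.
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        rgb[..., 1] = np.minimum(g, (r + b) / 2.0)
    return np.clip(rgb, 0.0, 1.0)
```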

What happens if I choose a different combination? Often these narrowband images are assigned colors that try to come closer to what would be seen using RGB. Why not just use RGB then? There are several reasons. Because the SII, Ha, and OIII filters let in such a small wavelength band, they are not as susceptible to light pollution or moonlight, resulting in cleaner source data to work with. Another reason is that many deep sky objects have significant Ha emission that does not stand out in RGB images. Many times I have added Ha to RGB images to bring out additional detail.
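
As a rough sketch of what adding Ha to an RGB image can look like, here is a simple weighted blend into the red channel. The blend weight is only an example; I adjust it for each image.

```python
import numpy as np

# Blend an Ha stack into the red channel of an existing RGB image.
# rgb is an (H, W, 3) float array scaled 0-1; ha is the matching 2-D Ha stack.
def add_ha_to_red(rgb, ha, ha_weight=0.4):
    out = rgb.astype(np.float32).copy()
    out[..., 0] = np.clip((1.0 - ha_weight) * out[..., 0] + ha_weight * ha,
                          0.0, 1.0)
    return out
```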

I again assembled the image using a combination that would more closely resemble RGB. For red I used mostly Ha with a little SII. For green I used a mix of SII and OIII. For blue I used mostly OIII, with a little SII and Ha added in. The resulting image was the subject of the previous post Eastern Veil.


Caldwell 33 Eastern Veil Nebula in Narrowband
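
Here is the same kind of Python sketch for this blended palette, using the same stacked channel images. The weights are illustrative guesses only; the real mix was adjusted by eye while editing.

```python
import numpy as np

# Blended palette sketch: weights are illustrative, not the exact mix used.
def blended_palette(sii, ha, oiii):
    r = 0.80 * ha + 0.20 * sii                  # red: mostly Ha with a little SII
    g = 0.50 * sii + 0.50 * oiii                # green: a mix of SII and OIII
    b = 0.85 * oiii + 0.10 * sii + 0.05 * ha    # blue: mostly OIII, a little SII and Ha
    return np.clip(np.dstack([r, g, b]).astype(np.float32), 0.0, 1.0)
```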

The above examples are simplified; in reality the editing can take many hours, and I typically go through multiple versions before I am satisfied.


I was asked once why the images can be different colors. The answer I gave was “because I made it that way”, and there is some truth in that statement. The nice images we see often include wavelengths we cannot see with our eyes, and they may be assembled using different colors to better show the different emissions. Many images, particularly narrowband ones, are false color. I hope this helps explain why and how.

Eastern Veil in multiple palettes
