Wednesday, March 5, 2014

RGB and Astrophotography

Ever wonder why satellite images are colored the way they are? And how it's done?...

A regular camera has detectors inside that record three different wavelength ranges, which are also the primary stimuli for the perception of color in humans: RGB (red, green, blue). Our cameras then produce an image much as our eyes see the scene in real life. Satellites do the same thing as our cameras, but with two major differences. A satellite also collects the RGB wavelengths to produce a true-color image, but it additionally collects a much broader range of wavelengths, such as infrared and thermal wavelengths, that are not visible to the human eye. The second difference is that rather than combining all the information from each wavelength, it keeps them separated and saves them as separate black-and-white images called bands. From there it is up to the people collecting the images from the satellite or telescope to put them together into one readable image, as in the sketch below.
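As a concrete illustration, here is a minimal sketch in Python of stacking three separate black-and-white bands into one color image. It assumes the numpy and Pillow libraries, and the band filenames are hypothetical placeholders for whatever images you actually downloaded:

```python
import numpy as np
from PIL import Image

# Load three single-band (grayscale) images. These filenames are
# placeholders -- substitute the bands you actually have.
red_band   = np.asarray(Image.open("band_red.tif").convert("L"))
green_band = np.asarray(Image.open("band_green.tif").convert("L"))
blue_band  = np.asarray(Image.open("band_blue.tif").convert("L"))

# Stack the three grayscale bands into one 3-channel RGB array,
# then save the result as a single true-color image.
rgb = np.dstack([red_band, green_band, blue_band]).astype(np.uint8)
Image.fromarray(rgb, mode="RGB").save("true_color.png")
```

Each grayscale band simply becomes one channel of the final image; this is the same thing Photoshop does when you paste bands into the R, G, and B channels.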



The top image shows three black-and-white bands that were assigned to the red, green, and blue channels in Photoshop to create the second image. Which bands go into which channel is chosen depending on what you want to see.

This image differs only in color because different bands were assigned to the RGB channels, revealing different information about the same scene.
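The same stacking trick produces one of these "false-color" images simply by feeding different bands into the channels. Here is another minimal sketch, again assuming numpy, Pillow, and hypothetical band filenames, this time putting a near-infrared band into the red channel:

```python
import numpy as np
from PIL import Image

# Hypothetical band files: near-infrared, red, and green.
nir   = np.asarray(Image.open("band_nir.tif").convert("L"))
red   = np.asarray(Image.open("band_red.tif").convert("L"))
green = np.asarray(Image.open("band_green.tif").convert("L"))

# False-color composite: the near-infrared band drives the red channel,
# so vegetation (which reflects infrared strongly) shows up bright red.
false_color = np.dstack([nir, red, green]).astype(np.uint8)
Image.fromarray(false_color, mode="RGB").save("false_color.png")
```

Nothing about the pixels changed except which band was routed to which channel, which is why the two composites of the same scene can look so different.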

Capturing images this way helps scientists see things undetectable by our naked eyes or by regular cameras, which only collect three wavelengths. With the ability to see what we normally can't, we can learn much more about our world and the deep space that Earth sits in.





