Technology has not yet made it possible to visit the far corners of our universe, but you can see them.
Space images deepen our understanding of the cosmos in new ways, but the breathtaking pictures captured by the James Webb telescope are not photographs in the usual sense of the word.
Shining stars, glowing planets, colorful nebulae: outer space is often depicted in photographs as a mysterious, colorful place. But what about the breathtaking space images the James Webb Telescope has recently sent us?
The James Webb Space Telescope carries several instruments that detect various infrared wavelength ranges, most of which are invisible to the human eye.
“This data is stored digitally as ones and zeros,” says Joseph DePasquale on the NASA podcast. Basically, he says, it is a black-and-white picture. DePasquale, senior data imaging developer at the Space Telescope Science Institute in Baltimore, Maryland, USA, is responsible, together with his team, for publishing images from the Webb Telescope, among other observatories.
Talk about out of this world! This is Webb’s first direct image of a planet outside our solar system, and it hints at Webb’s future possibilities for studying distant worlds: https://t.co/ITcl6RItLa
Not what you expected? Let’s go through the details👇 pic.twitter.com/bCgzW0dcUE
— NASA Webb Telescope (@NASAWebb) September 1, 2022
Telescope data must first be cleaned of instrumental effects, explains astrophysicist Kai Noeske of the European Space Agency (ESA). Then comes color: roughly speaking, this means assigning the primary colors red, green and blue to images taken in different wavelength ranges.
Still, that does not mean scientists simply paint the image in colors of their own choosing. DePasquale explains: “We respect the data from start to finish. We don’t force color onto the data.”
The path to the final product
Objects in space, such as stars or gas clouds, are visible at different wavelengths. To register them, the Webb telescope carries several filters. Astrophysicist Noeske describes how color enters the image processing: “Typically, the image from the shortest-wavelength filter becomes the blue channel, the image from the middle filter the green channel, and the image from the longest-wavelength filter the red channel.”
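The channel assignment Noeske describes can be sketched in a few lines of Python. This is a minimal illustration, not the actual STScI pipeline: the small random arrays stand in for calibrated single-filter exposures (real pipelines read FITS files), and the function name `compose_rgb` is a hypothetical helper.

```python
import numpy as np

# Synthetic stand-ins for calibrated, single-filter exposures:
# each is a 2-D array of brightness values, i.e. a monochrome image.
rng = np.random.default_rng(0)
short_wl = rng.random((4, 4))   # shortest-wavelength filter -> blue channel
mid_wl = rng.random((4, 4))     # middle filter               -> green channel
long_wl = rng.random((4, 4))    # longest-wavelength filter   -> red channel

def compose_rgb(red, green, blue):
    """Stack three monochrome filter images into one RGB image,
    normalizing each channel to the 0..1 range."""
    def norm(img):
        img = img.astype(float)
        return (img - img.min()) / (img.max() - img.min())
    return np.dstack([norm(red), norm(green), norm(blue)])

rgb = compose_rgb(long_wl, mid_wl, short_wl)
print(rgb.shape)  # (4, 4, 3)
```

The key point mirrors the article: each input is grayscale, and color appears only when the wavelength ordering is mapped onto the red, green and blue channels.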
Enjoy the beautiful spiral structure of the Phantom Galaxy, M74, as seen by Webb in mid-infrared. Fine filaments of dust and gas wind outward from the center of the galaxy, which has a ring of star formation around its core. https://t.co/pPVvxsC6KA pic.twitter.com/JQ2C9Wf19f
— NASA Webb Telescope (@NASAWebb) August 30, 2022
In this respect, the approach is not so different from that of conventional digital or smartphone cameras, which also rely on a trick to depict color. There, sensors measure how much red, green and blue light falls on the corresponding areas of the image, and the monochrome image information is combined into a color image directly in the camera or phone.
For Webb images, this step is performed only after the image data has been acquired. In addition, some of the published images are composed of hundreds of individual exposures.
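Combining many individual exposures can be sketched as follows. This is a simplified toy, assuming aligned exposures of the same field; real Webb mosaics also involve registration and reprojection, which are omitted here. The arrays are synthetic stand-ins.

```python
import numpy as np

# A "true" sky signal plus 100 noisy exposures of it.
rng = np.random.default_rng(1)
truth = rng.random((8, 8))
exposures = [truth + rng.normal(scale=0.1, size=truth.shape)
             for _ in range(100)]

# Averaging the aligned exposures suppresses per-exposure noise.
stacked = np.mean(exposures, axis=0)

noise_single = np.abs(exposures[0] - truth).mean()
noise_stacked = np.abs(stacked - truth).mean()
print(noise_stacked < noise_single)  # True
```

Averaging N exposures reduces random noise by roughly a factor of the square root of N, which is one reason deep-space composites draw on so many individual frames.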
The James Webb Telescope was launched into space on December 25, 2021 aboard an Ariane 5 rocket. The space agencies of the USA, Canada and Europe cooperate on the project, which is the product of some 30 years of development and around ten billion dollars in costs.
The Webb telescope succeeds the Hubble telescope, which has been in operation for over 30 years. While Hubble works in the optical and ultraviolet ranges, James Webb observes in the near- and mid-infrared.
Among other things, the James Webb Telescope delivers new images of the early universe with the help of a mirror surface of about 25 square meters. Scientists hope the images will help us understand the time after the Big Bang and perhaps find clues to the existence of a second Earth.