Are the colors we see real?

https://i.redd.it/dw653ua0gscg1.jpeg

22 comments

  1. Many of these photos are colorized to show the different elements and gases.

  2. Bearded_Apple

    Yeah, photos like these are edited to look more colorful. It’s really just as colorless as what you see when you look up.

  3. AlmightyTestichilles

    The colors you see in space pics come from our digital conversion of the light from the gases floating around. Without that digital conversion you wouldn’t see so many vibrant colors.

  4. WilburHiggins

    Depends on how the photo is taken and what wavelengths it is taken at. Basically every picture from Webb is completely different from what you would see, given it is infrared light. Same with Chandra and X-rays. Lots of photos are also taken with specific filters (hydrogen, oxygen, and sulfur). Each filter is assigned a specific color when the photos are combined.

    So in reality most space photos are not true color, because they are not collecting data the way our eyes do. Some photos are true color, and some editing is done to create true color images. The best idea you can get about true color is looking at the Orion Nebula with a telescope or binoculars, but even then your eyes are dark adapted and the color receptors are muted in the dark.

    TLDR: Most photos aren’t true color, but we can see colors. Most objects are just too dim to see the colors with your eyes very well.
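
    As a rough sketch of that per-filter color assignment (in Python, with made-up file names; the sulfur/hydrogen/oxygen-to-RGB choice below is just the common “Hubble palette” convention, not anything specific to the posted image):

    ```python
    import numpy as np
    from astropy.io import fits  # assumes the filtered exposures are FITS files

    # Made-up file names for three narrowband exposures of the same field.
    sii  = fits.getdata("target_SII_672nm.fits").astype(float)   # sulfur II
    ha   = fits.getdata("target_Ha_656nm.fits").astype(float)    # hydrogen alpha
    oiii = fits.getdata("target_OIII_501nm.fits").astype(float)  # oxygen III

    def norm(img):
        """Scale a frame to 0..1 so the channels are comparable."""
        return (img - img.min()) / (img.max() - img.min())

    # One display color per filter: S II -> red, H-alpha -> green, O III -> blue.
    # Both S II and H-alpha are actually deep red to the eye, so this palette is
    # chosen for contrast, not realism.
    rgb = np.dstack([norm(sii), norm(ha), norm(oiii)])
    ```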

  5. AnyAlps3363

    Colour from space is just the frequency of an object’s emitted light as it reaches us.

    Depending on whether the object is travelling away from or towards us (redshift or blueshift), or whether its light has been bent or shifted in frequency by some external force (I don’t know much about this yet; I’m just supposing that when gravity interacts with light it can shift it to a different frequency to some degree), the colour we see may differ from the ‘real’ colour it originally emitted.

  6. Turian_Agent

    You’ve provided an enhanced photo of some kind, of a star-forming region (similar to the Orion Nebula). In true visible-light color (what our eyes would perceive if sensitive enough), nebulae are much more subtle — often pale, washed-out pastels or even mostly grayish with faint hints of red/green.

    Space is mostly black, and almost a perfect vacuum. Still, many nebulae do emit real visible colors (especially red from hydrogen, green from oxygen) that fall within the spectrum visible to humans (and perhaps other beings!). They would, however, be fainter than in your image.

  7. WardenEdgewise

    I just watched a few videos on YouTube about how the colours are decided for these photos from the various telescopes. I can’t do them justice by paraphrasing here, but yes, there are reasons why the colours are chosen to look like this.

  8. Real color space images look very boring. The colors are exaggerated or straight up “faked”.

    Like there might be 600nm and 610nm photons in that image, which would look almost the same to a human eye. But you adjust the image so that one of those is red and the other blue, to make a nicer picture.

    Or, maybe you take a visible light picture and an infrared picture, and then color the first one red and the second blue and layer them.
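
    A tiny sketch of that second idea, assuming you already have a visible-light frame and an infrared frame of the same field aligned (the arrays below are just placeholders):

    ```python
    import numpy as np

    # Placeholder data; in practice these would be two aligned exposures of the
    # same field, one through a visible-light filter and one in infrared.
    visible  = np.random.rand(512, 512)
    infrared = np.random.rand(512, 512)

    # Layer them as color channels: visible -> red, infrared -> blue.
    # The infrared frame has no "real" color at all, so this is purely a choice.
    composite = np.zeros((512, 512, 3))
    composite[..., 0] = visible
    composite[..., 2] = infrared
    ```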

  9. There is nothing “real” about the way the human visual system perceives colour. In general, digital image sensors don’t capture light in the same way, and the images need to be post-processed to imitate what a human eye would have seen – for example, this is how the camera in your phone works.

    With astrophotography this requirement is often relaxed. Images can be produced by capturing only specific wavelengths of light, or with detectors sensitive to light completely outside the visible spectrum. We have to make some arbitrary mapping from that data to an image that we can see, to make a so-called false colour* image. The colours do not (and are not intended to) match human vision, but they still faithfully represent the spectral information that the detector captured.

    *I don’t like this term, because the corollary that human vision is “true colour” is, again, not correct!
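
    As a concrete (made-up) example of such an arbitrary mapping, a single-band detector image can be pushed through any colormap you like:

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Placeholder for a single-band detector readout (e.g. X-ray or far-infrared
    # counts); a real pipeline would load calibrated data here instead.
    counts = np.random.rand(256, 256)

    # Normalize, then map brightness to colour through an arbitrary colormap.
    # Nothing about "inferno" corresponds to what an eye would see at the
    # detector's wavelengths; it is one of many possible false-colour mappings.
    normalized = (counts - counts.min()) / (counts.max() - counts.min())
    false_colour = plt.cm.inferno(normalized)  # (256, 256, 4) RGBA image

    plt.imsave("false_colour.png", false_colour)
    ```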

  10. Laurie Anderson asked the NASA photo editors why they color their images to look like Walt Disney, and they said because they need funding, and nobody would fund a black and white image of the heavens. 

  11. It would be cool if someone did a series of side-by-sides showing visible-light true-colour images next to the more commonly displayed images of the same objects, which use colour filters to represent different materials or wavelengths.

  12. I think a lot of the answers in this thread are misleading. You asked how we see any colors in space. The light coming from space has a variety of pretty colors. The stuff out there is very far away, so it’s dim. The light from it is captured by a telescope and then amplified so it looks like an image. Digital cameras do this as well, by the way: a raw image from a digital camera is typically very dark, grey, and washed out before a light curve and color correction are applied.

    It’s true that oftentimes the colors are also remapped, so what would look to your eye like a red color ends up appearing as green, for example. That allows us to take the strongest signals coming off these objects and give them contrast against each other. Too much is made of this, IMO, to suggest that space photos are “fake”.

    Objects in space have color, and all of the colors in this image represent different colors (wavelengths) of light. They just might be remapped because some of them overlap and you can see more of the nuance with them separated.

    For instance, [here](https://clarkvision.com/galleries/gallery.astrophoto-1/) are a bunch of photos (from u/rnclark), all shot with a normal color digital camera and only boosted with light curves and color corrected like any digital camera image. You can see there are plenty of pretty colors, and a lot of emission nebulae are actually a vibrant pink.
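
    If it helps, here is roughly what that “boosted with light curves” step looks like on a linear frame; the asinh-style stretch and its parameter below are just one common choice, not anything specific to those photos:

    ```python
    import numpy as np

    # Placeholder for a dark, linear raw frame; real data would come from a camera.
    linear = np.random.rand(1024, 1024) * 0.02   # faint signal, most pixels near black

    def stretch(img, a=0.02):
        """Asinh-style curve: brightens faint pixels a lot, bright pixels a little."""
        return np.arcsinh(img / a) / np.arcsinh(img.max() / a)

    display = stretch(linear)   # the dim detail is now visible on screen
    ```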

  13. The colors are real, but they’re extremely faint. The human eye is better at discerning brightness than color, so if you look at these nebulae with the naked eye, they just look like fuzzy grey blotches, but if you take a long exposure photograph you get images like the one you posted.

    The color comes from the gases in the nebulae. Atoms energized by radiation from nearby stars emit light at specific frequencies based on the composition of the gas. Hydrogen and helium are very abundant, which is why a lot of nebulae have these lovely mixes of blues, purples and pinks, with areas of yellow. The color of the gas is based on its composition.

    There ARE lots of astronomy photos that use false color to represent frequencies that the human eye can’t pick up, but most photographs like this are true color.
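
    For reference, a short (non-exhaustive) lookup of the visible emission lines people usually mean here; the wavelengths are standard values, and the rest of the snippet is just illustrative:

    ```python
    # Common visible-light emission lines seen in nebulae (wavelengths in nanometres).
    EMISSION_LINES_NM = {
        "H-alpha (hydrogen)": 656.3,   # deep red; dominant in many emission nebulae
        "H-beta (hydrogen)":  486.1,   # blue-green
        "O III (oxygen)":     500.7,   # green (often rendered teal or blue in images)
        "S II (sulfur)":      671.7,   # red, very close to H-alpha
        "He I (helium)":      587.6,   # yellow
    }

    for line, wavelength in sorted(EMISSION_LINES_NM.items(), key=lambda kv: kv[1]):
        print(f"{line:22s} {wavelength:6.1f} nm")
    ```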

  14. uncleawesome

    All pictures from space-based telescopes are in black and white. They take them through filters that only pass specific wavelengths of light. They color each wavelength’s picture and put them all together to make these pretty pictures. If we could even see that nebula, it would most likely be a faint grey mist.

  15. So basically what we know of what space looks like isn’t what space looks like.

  16. General_Addition_913

    R/G/B. We have telescopes that take photos in red, green, and blue, and then we combine the photos to get the correct color. Our eyes do the exact same thing in red, green, and blue and automatically process everything for us. Same thing in your computers, etc. I’m sure someone else could elaborate more.

  17. JanRagnarsson

    I have seen a video about how NASA produces these pictures. They have three black-and-white images at different wavelengths captured by the infrared telescopes; each wavelength would correspond to a different color (red, green, or blue) if it were visible to the naked eye. They overlay each image with its corresponding RGB color and combine all three. These amazing colorful images are the result.
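
    A rough sketch of that overlay-and-combine step (file names invented; mapping the longest wavelength to red and the shortest to blue is the usual convention for such composites):

    ```python
    import numpy as np
    from astropy.io import fits  # assumes the three monochrome frames are FITS files

    # Invented file names for three monochrome frames, shortest wavelength first.
    frames = [fits.getdata(name).astype(float) for name in
              ("target_band_short.fits", "target_band_mid.fits", "target_band_long.fits")]

    def norm(img):
        return (img - img.min()) / (img.max() - img.min())

    short_wl, mid_wl, long_wl = (norm(f) for f in frames)

    # Chromatic ordering: longest wavelength -> red, middle -> green, shortest -> blue.
    rgb = np.dstack([long_wl, mid_wl, short_wl])
    ```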
