originally posted by: Unwokenone
Yes, how does one obtain the EXIF file?
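For what it's worth, once you have the original file copied off the phone, the metadata is readable with standard tools. Here's a minimal Python sketch using Pillow; the file name is just a placeholder:

```python
from PIL import Image
from PIL.ExifTags import TAGS

# Placeholder path: use the original file copied off the phone,
# not a version re-compressed by a messaging app or website
img = Image.open("IMG_0001.jpg")

exif = img.getexif()
for tag_id, value in exif.items():
    tag = TAGS.get(tag_id, tag_id)  # map numeric tag ID to its name
    print(f"{tag}: {value}")
```

The standalone exiftool command-line utility will dump far more maker-specific fields than Pillow exposes, so it's worth running both.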
originally posted by: Encia22
The problem is the night mode that was used. If I'm correct, all the EXIF info we'll get pertains to the physical properties of the camera, not to the processing that was actually applied to the photos in question.
Unfortunately, night mode would have taken a series of bracketed shots at various exposures, which top-of-the-line phones then hand to AI to interpret and render what it determines was in the image.
If we're going to get anything from the images, then we need the raw files from the phone, without any further compression or manipulation; the phone's software has already done enough damage...
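As an aside, if the phone can save raw (DNG) captures, a library like rawpy can decode them with minimal interference from the camera software. A rough sketch; the file name is hypothetical and assumes a DNG pulled straight off the device:

```python
import rawpy
import imageio.v3 as iio

# Hypothetical DNG saved by the phone's raw/pro capture mode
with rawpy.imread("moon_raw.dng") as raw:
    # Demosaic with minimal interference: camera white balance,
    # no auto-brightening, 16-bit output
    rgb = raw.postprocess(
        use_camera_wb=True,
        no_auto_bright=True,
        output_bps=16,
    )

iio.imwrite("moon_raw.png", rgb)
```

A 16-bit image decoded this way still isn't the sensor data itself, but it skips the night-mode merging and JPEG compression that the quote above is worried about.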
Essentially, Android Night Mode (or whatever your manufacturer may call it) uses artificial intelligence to analyze the scene you are trying to photograph. The phone will consider multiple factors, such as light, the phone’s movement, and the movement of objects being captured.
The device will then shoot a series of images at different exposure levels, use bracketing to merge them, and bring out as much detail as it can into a single picture. Of course, there is a lot more going on behind the scenes. The phone must also measure white balance, colors, and other elements, which is usually done with fancy algorithms most don’t fully understand.
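To make the bracket-and-merge step concrete, here's a rough desktop approximation using OpenCV's exposure fusion (the Mertens method). It illustrates the general idea, not any particular phone's pipeline; the file names are placeholders:

```python
import cv2

# Hypothetical bracket: the same scene shot at three exposures
paths = ["under.jpg", "normal.jpg", "over.jpg"]
images = [cv2.imread(p) for p in paths]

# Align the frames first; handheld brackets shift slightly
cv2.createAlignMTB().process(images, images)

# Mertens exposure fusion: weights each pixel by contrast,
# saturation, and well-exposedness, then blends the stack
fused = cv2.createMergeMertens().process(images)  # float32 in [0, 1]

cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```

Phone night modes layer denoising and learned rendering on top of a merge like this, which is exactly where the "AI interpretation" described above comes in.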
originally posted by: Encia22
Here's more about how AI is used by Google, and therefore Android. I know Samsung also does this via its own software, and most probably Apple, too.
For years, Samsung “Space Zoom”-capable phones have been known for their ability to take incredibly detailed photos of the Moon. But a recent Reddit post showed in stark terms just how much computational processing the company is doing, and — given the evidence supplied — it feels like we should go ahead and say it: Samsung’s pictures of the Moon are fake.
[Image: a relatively detailed photo of the Moon. A Samsung smartphone identified a blurry photo of the Moon and added detail to create it. Image: u/ibreakphotos]
But what exactly does “fake” mean in this scenario? It’s a tricky question to answer, and one that’s going to become increasingly important and complex as computational techniques are integrated further into the photographic process. We can say for certain that our understanding of what makes a photo fake will soon change, just as it has in the past to accommodate digital cameras, Photoshop, Instagram filters, and more. But for now, let’s stick with the case of Samsung and the Moon.
The test of Samsung’s phones conducted by Reddit user u/ibreakphotos was ingenious in its simplicity. They created an intentionally blurry photo of the Moon, displayed it on a computer screen, and then photographed this image using a Samsung S23 Ultra. As you can see below, the first image on the screen showed no detail at all, but the resulting picture showed a crisp and clear “photograph” of the Moon. The S23 Ultra added details that simply weren’t present before. There was no upscaling of blurry pixels and no retrieval of seemingly lost data. There was just a new Moon — a fake one.
This is not a new controversy. People have been asking questions about Samsung’s Moon photography ever since the company unveiled a 100x “Space Zoom” feature in its S20 Ultra in 2020. Some have accused the company of simply copying and pasting prestored textures onto images of the Moon to produce its photographs, but Samsung says the process is more involved than that.
In 2021, Input Mag published a lengthy feature on the “fake detailed moon photos” taken by the Galaxy S21 Ultra. Samsung told the publication that “no image overlaying or texture effects are applied when taking a photo” but that the company uses AI to detect the Moon’s presence and “then offers a detail enhancing function by reducing blurs and noises.”
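As a toy illustration of the two-stage flow Samsung describes (detect the Moon, then enhance detail), here's a sketch with deliberately simple stand-ins: a Hough circle search for a bright disc in place of the AI detector, and an unsharp mask in place of the proprietary enhancement. Nothing here is Samsung's actual code:

```python
import cv2
import numpy as np

def looks_like_moon(gray: np.ndarray) -> bool:
    """Crude stand-in for an AI scene detector: look for a single
    bright circular disc against a mostly dark frame."""
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=gray.shape[0],
        param1=100, param2=40, minRadius=20, maxRadius=0,
    )
    return circles is not None and len(circles[0]) == 1

def enhance_detail(img: np.ndarray) -> np.ndarray:
    """Crude stand-in for a "detail improvement engine": an unsharp
    mask. Unlike the behavior in the Reddit test, this can only
    exaggerate edges that are already in the pixels."""
    blur = cv2.GaussianBlur(img, (0, 0), sigmaX=3)
    return cv2.addWeighted(img, 1.6, blur, -0.6, 0)

frame = cv2.imread("zoomed_shot.jpg")  # placeholder input
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
out = enhance_detail(frame) if looks_like_moon(gray) else frame
cv2.imwrite("processed.jpg", out)
```

The stand-ins make the key contrast visible: a local filter like this can only amplify edges already present in the capture, whereas the Reddit test shows Samsung's pipeline producing detail that was never captured at all.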
The company later offered a bit more information in this blog post (translated from Korean by Google). But the core of the explanation — the description of the vital step that takes us from a photograph of a blurry Moon to a sharp Moon — is dealt with in obfuscatory terms. Samsung simply says it uses a “detail improvement engine function” to “effectively remove noise and maximize the details of the moon to complete a bright and clear picture of the moon” (emphasis added). What does that mean? We simply don’t know.