Engineer argues Samsung's moon photos are not Photoshopped fakes, but AI enhancement

by time news

Some time ago, photos of the moon taken with the Samsung Galaxy S23 Ultra were suspected of being fake, sparking heated discussion among netizens. Although Samsung officially denied that the photos were faked, some netizens remained unconvinced.

Recently, a YouTuber named Eric drew on his 12 years of hardware engineering experience, along with mathematical and logical analysis, to explain how to tell real pictures from fake ones. He concluded that Samsung may have been unfairly criticized.

Before starting his explanation, Eric stated that he was neutral and simply wanted netizens to better understand why these pictures look the way they do.

First, Eric showed a photo of the moon taken with an iPhone 14 Pro Max, in which only a white spot is visible. The iPhone 14 Pro Max photo is obviously unmanipulated, but it does not intuitively or accurately reflect the moon as it appeared to the photographer. That result runs counter to the original purpose of a camera: "to preserve and relive the memory of what you saw with your own eyes."

Eric noted that Samsung has explained the technology behind the camera in an article. According to that article, to enhance moon photos the phone captures at least 10 frames, uses image-processing techniques to remove blemishes and noise, and combines the clearest parts of those frames into a single image. The result is then augmented by an AI model trained to recognize the various phases of the moon, producing the final rendering. In short, a moon photo starts with multiple exposures, goes through multi-frame image processing, and finally receives AI enhancement.
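Samsung has not published the code for this pipeline, but the multi-frame stage can be illustrated with a generic sketch. The Python example below (the file names and the use of OpenCV's phase correlation are assumptions for illustration, not Samsung's actual implementation) aligns a burst of frames and median-stacks them to suppress noise:

```python
import numpy as np
import cv2  # opencv-python

def stack_frames(paths, ref_index=0):
    """Align a burst of frames to a reference and median-stack them.
    A generic illustration of multi-frame noise reduction, not
    Samsung's actual pipeline."""
    frames = [cv2.imread(p).astype(np.float32) for p in paths]
    ref_gray = cv2.cvtColor(frames[ref_index].astype(np.uint8),
                            cv2.COLOR_BGR2GRAY)
    aligned = []
    for f in frames:
        gray = cv2.cvtColor(f.astype(np.uint8), cv2.COLOR_BGR2GRAY)
        # Estimate the translational offset of this frame relative to
        # the reference (handheld shake) via phase correlation.
        shift, _ = cv2.phaseCorrelate(np.float32(ref_gray),
                                      np.float32(gray))
        m = np.float32([[1, 0, -shift[0]], [0, 1, -shift[1]]])
        aligned.append(cv2.warpAffine(f, m, (f.shape[1], f.shape[0])))
    # The median across the burst rejects sensor noise and transient
    # artifacts while keeping detail that is stable in every frame.
    stacked = np.median(np.stack(aligned), axis=0)
    return stacked.astype(np.uint8)

# Hypothetical usage: ten burst frames of the moon.
# result = stack_frames([f"frame_{i}.png" for i in range(10)])
# cv2.imwrite("stacked.png", result)
```

A median is used here because it discards pixel values that appear in only a few frames, which is why stacking removes noise without blurring detail that is consistent across the burst, such as the lunar surface.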

Turning to u/ibreakphotos' debunking video, Eric argued that its use of Gaussian blur to imitate a moon photo is essentially the inverse of Samsung's imaging process. Samsung uses a convolutional neural network (CNN), and when the model is fed a downscaled, Gaussian-blurred image it has to speculate about the missing detail, which means the end result will not exactly match the original downscaled image. This explains why the captured picture differs from the source picture.
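The test itself is straightforward to reproduce. As a rough sketch (the file names, target resolution, and blur radius here are illustrative assumptions in the spirit of u/ibreakphotos' setup, not its exact parameters), one could prepare the blurred target like this:

```python
import cv2

# Take a sharp moon image, downscale it, and blur it away. If the
# blurred target is then displayed on a monitor and photographed,
# any fine detail the phone "recovers" must come from the AI model,
# not from the optics.
sharp = cv2.imread("moon_sharp.png")

small = cv2.resize(sharp, (170, 170), interpolation=cv2.INTER_AREA)
blurred = cv2.GaussianBlur(small, (0, 0), sigmaX=3)

cv2.imwrite("moon_test_target.png", blurred)
```

The key point in Eric's argument is that a CNN cannot invert this blur exactly; it can only hallucinate plausible detail, so its output will systematically deviate from the pre-blur source.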

Samsung's convolutional neural network gives the image a brownish tint, with some loss of image quality around the edges. After adjusting the image to remove the brown tint, you can see that it is very close to the picture before the netizen applied the Gaussian blur.
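Eric's tint adjustment can be approximated with a simple gray-world white balance. The sketch below (the function name and file names are hypothetical, and this is a generic correction rather than the exact adjustment Eric made) scales each channel so a uniform color cast drops out, after which the corrected output can be compared pixel-by-pixel with the pre-blur image:

```python
import cv2
import numpy as np

def gray_world_balance(img):
    """Remove a uniform color cast (e.g. a brownish tint) by scaling
    each channel so its mean matches the overall mean."""
    img = img.astype(np.float32)
    means = img.reshape(-1, 3).mean(axis=0)  # per-channel means
    img *= means.mean() / means
    return np.clip(img, 0, 255).astype(np.uint8)

# Hypothetical comparison: tint-corrected AI output vs. the pre-blur image.
# output = gray_world_balance(cv2.imread("phone_output.png"))
# original = cv2.imread("moon_sharp_downscaled.png")
# print("mean abs difference:", cv2.absdiff(output, original).mean())
```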

In fact, Samsung's use of image processing and AI models is not a scam. Rather, Samsung's marketing has misled consumers into believing that the moon photos are produced by the camera hardware rather than by software processing, which leads some consumers to feel they have been "deceived."
