And unlike, say, the Eiffel Tower, its appearance won’t change drastically based on lighting. Moon photography usually happens at night, and Samsung’s processing falls apart when the moon is partially obscured by clouds.
One of the clearest ways Samsung’s processing tinkers with the moon is by manipulating mid-tone contrast, making its topography more pronounced. However, it is clearly also capable of introducing the appearance of texture and detail that isn’t present in the raw photo.
Samsung does this because the 100x zoom images from the Galaxy S21, S22, and S23 Ultra are poor. Of course they are: they rely on a massive crop into a small 10MP sensor. Periscope zooms in phones are great, but they’re not magic.
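Samsung hasn’t published its tone pipeline, so treat this as a rough illustration only: a mid-tone contrast boost is, in principle, an S-curve that steepens response around the middle of the tonal range while pinning the black and white points. The function name and parameters below are ours, not Samsung’s.

```python
import numpy as np

def boost_midtones(img, strength=0.6):
    """Hypothetical mid-tone contrast boost via a logistic S-curve.

    img: float array scaled to [0, 1]. Larger strength pushes mid-tones
    apart (more pronounced topography) while compressing the extremes.
    """
    k = (1.0 + 4.0 * strength) / 0.25           # curve steepness
    curved = 1.0 / (1.0 + np.exp(-k * (img - 0.5)))
    lo = 1.0 / (1.0 + np.exp(k * 0.5))          # curve value at input 0
    hi = 1.0 / (1.0 + np.exp(-k * 0.5))         # curve value at input 1
    return (curved - lo) / (hi - lo)            # renormalise to [0, 1]
```

Run on a flat grey moon crop, a curve like this darkens the maria and brightens the highlands: exactly the “more pronounced topography” effect described above.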
Credible theories
Huawei is the other major company accused of faking its lunar photos, on the otherwise brilliant Huawei P30 Pro from 2019. It was the last flagship Huawei released before the company was blacklisted in the US, effectively destroying the appeal of its phones in the West.
Android Authority claimed the phone pasted a stock photo of the moon into your photos. Here’s how the company responded: “Moon Mode works on the same principle as other master AI modes, in that it recognizes and optimizes details in an image to help individuals take better photos. It doesn’t replace the image in any way – that would require an unrealistic amount of storage as the AI mode recognizes over 1,300 scenarios. Based on machine learning principles, the camera recognizes a scenario and helps optimize focus and exposure to enhance details such as shapes, colors and highlights/lowlights.”
Sounds familiar, right?
You won’t see these techniques from many other brands, but not for principled reasons. If a phone doesn’t have a long-range zoom of at least 5x, a moon mode is largely pointless.
Trying to photograph the moon with an iPhone is difficult. Even the iPhone 14 Pro Max doesn’t have the zoom range for it, and the phone’s auto exposure turns the moon into a scorching blob of white. From a photographer’s point of view, the S23’s exposure control alone is excellent. But just how “fake” are the S23’s moon images?
The most generous interpretation is that Samsung uses the real image data from the camera and only applies its machine learning knowledge to guide the processing. That could help it, for example, trace the outlines of the Sea of Serenity and the Sea of Tranquility when trying to pull a greater sense of detail out of a blurry source.
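Samsung’s actual model is proprietary, but for a sense of what “pulling detail out of a blurry source” means without any ML at all, the classic technique is unsharp masking: subtract a blurred copy of the image from the original and amplify the difference. The sketch below is illustrative, not Samsung’s pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, radius=3.0, amount=1.5):
    """Classic unsharp masking (illustrative, not Samsung's method).

    Amplifies detail the source already contains: the difference between
    the image and a blurred copy of itself is scaled and added back.
    img: float array in [0, 1].
    """
    blurred = gaussian_filter(img, sigma=radius)
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)
```

The crucial limit is visible in the code: you can only amplify structure the sensor actually captured. A crater that is imperceptible in the source stays imperceptible, which is why the next observation matters.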
However, that generous interpretation is stretched by the way the final image renders the positions of the Kepler, Aristarchus, and Copernicus craters with seemingly uncanny accuracy, when these tiny features are imperceptible in the source. You can plausibly infer where major lunar features sit from a hazy original; resolving craters this small is something else entirely.
Still, it’s easy to overstate how much of an edge the Samsung Galaxy S23 gains here. The moon photos may look good at first glance, but they are still poor. A recent comparison video pitting the S23 Ultra against the Nikon P1000 shows what a decent consumer superzoom camera, a tier below a DSLR, is capable of.
A matter of trust
The furor over this lunar issue is understandable. Samsung uses moon imagery to hype its 100x camera mode, and the images are synthesized to some extent. But in truth it has only stuck a toe outside the ever-expanding AI Overton window here; computational tricks like these have spearheaded phone photography innovation for the past decade.
Each of these tech tricks, whether you call them AI or not, is designed to do what would have been impossible with the raw basics of a phone camera. One of the first, and perhaps the most enduring, was HDR (High Dynamic Range). Apple built HDR into its camera app with iOS 4.1, released in 2010, the year of the iPhone 4.
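In its simplest form, HDR merges an exposure bracket so that shadow and highlight detail from different frames survive in one image. A minimal sketch using OpenCV’s Mertens exposure fusion (the file names are placeholders):

```python
import cv2
import numpy as np

# Load an aligned exposure bracket (placeholder file names).
bracket = [cv2.imread(p) for p in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens exposure fusion blends the frames by per-pixel quality
# (contrast, saturation, well-exposedness); no tone mapping needed.
fused = cv2.createMergeMertens().process(bracket)  # float32, roughly [0, 1]

cv2.imwrite("hdr_fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))
```

Phone HDR pipelines are far more elaborate (burst alignment, ghost rejection, learned tone mapping), but the principle is the same: the output is built from real scene data captured across every frame.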