Are iPhone cameras getting too smart?

In late 2020, Kimberly McCabe, an executive at a Washington, D.C.-area consulting firm, upgraded from the iPhone 10 to the iPhone 12 Pro. Quarantine had prompted McCabe, a mother of two, to put more effort into documenting family life, and she figured that the new smartphone, released the previous month with an improved camera, would raise the quality of her amateur shots. But the 12 Pro has been a disappointment, she told me recently, adding, "I feel a bit cheated." Every image comes out far too bright, with warm colors desaturated into grays and yellows. Some of McCabe's photos of her daughter practicing gymnastics look strangely blurry; in one image she showed me, the girl's legs, caught in midair, are smeared together like a muddled watercolor. McCabe said that when she uses her DSLR, "what I see in real life is what I see on the camera and in the picture." The new iPhone promises "next-level" photography at the press of a button, but the results look odd and uncanny. "Make it less smart. I'm serious," she said. Lately she has taken to carrying a Pixel, from Google's line of smartphones, for the sole purpose of taking pictures.

Apple has reportedly sold more than a hundred million units of the iPhone 12 Pro, and more than forty million of the iPhone 13 Pro since its launch last September. Both models are among the most popular consumer cameras ever made, and also among the most powerful. The lenses on our smartphones are tiny apertures, no bigger than a shirt button, and until recently they had little hope of imitating the functionality of full-size professional camera lenses. Phone cameras achieved a baseline standard of digital photography; not many of us expected anything more. With its latest models, Apple is trying to make its small phone cameras perform as much like traditional cameras as possible, and to make every photo you take look like the work of a seasoned professional. (Hence the names 12 and 13 "Pro," which are distinguished from the regular iPhone 12 and 13 models primarily by their superior cameras.) The iPhone 13 Pro takes twelve-megapixel images, includes three separate lenses, and uses machine learning to automatically adjust lighting and focus. Yet, for some users, all of this optimization has had an unwelcome effect. Halide, the developer of a camera app, recently published a close inspection of the 13 Pro that noted visual glitches caused by the device's smart photography, including the erasure of a bridge's cables in a landscape shot. "Its complex, interlocking set of 'smart' software components doesn't fit together quite right," the report stated.

In January, I traded in my iPhone 7 for an iPhone 12 Pro, and I've been dismayed by the camera's performance. On the 7, the slight graininess of the photos I took seemed like a logical product of the camera's limited capabilities. I didn't mind flaws like the "digital noise" that appeared when a subject was underlit or too far away, and I liked that any editing of photos was up to me. On the 12 Pro, by contrast, the digital manipulation is aggressive and unsolicited. One expects a person's face in front of a sunlit window to appear in shadow, for instance, because a traditional camera lens, like the human eye, can let in only so much light through a single aperture at a given moment. But on my iPhone 12 Pro even a backlit face appears strangely illuminated. The adjustment might, in theory, produce a better image (it is nice to see faces), but the effect is unsettling. When I press the shutter button, the image in the frame often appears for a moment as it did to my naked eye. Then it resolves into something unrecognizable, and there is no way to reverse the process. David Fitt, a professional photographer based in Paris, also moved from the iPhone 7 to the 12 Pro, in 2020, and he still prefers the less powerful camera of the 7. Of the 12 Pro, he said, "It's picking things up, and it looks like too much. It brings back detail in the highlights and in the shadows that is often more than what you see in real life. It looks more than real."

For a large swath of the population, "smartphone" has become synonymous with "camera," but the truth is that iPhones are no longer cameras in the traditional sense. Instead, they are devices at the vanguard of computational photography, a term for imagery formed as much from digital data and processing as from optical information. Each image recorded by the lens is adjusted to bring it closer to a pre-programmed ideal. Gregory Gentert, a friend who is a fine-art photographer in Brooklyn, told me, "I've tried to photograph on the iPhone when the light gets bluish at the end of the day, but the iPhone will try to correct that kind of thing." The dusky purple gets edited, and in the process erased, because the hue is evaluated as undesirable, as a flaw rather than a feature. The device, he added, "sees the things I'm trying to photograph as a problem to solve." The processing also eliminates digital noise by smoothing it into a slight blur, which may be the source of the smudging McCabe sees in photos of her daughter's gymnastics. The "fix" ends up creating a distortion more noticeable than whatever perceived mistake was in the original.

Earlier this month, Apple's iPhone team agreed to brief me, on background, about the latest camera upgrades. A staff member explained that when a user takes a photograph with the newest iPhones, the camera creates as many as nine frames at different levels of exposure. Then a feature called Deep Fusion, which has existed in some form since 2019, merges the clearest parts of all those frames together, pixel by pixel, forming a single composite image. This process is an extreme version of high dynamic range, or HDR, a technique that previously required some software know-how. (As a college student, I laboriously replicated HDR on my traditional-camera photos by using Photoshop to overlay different frames and then cut out their desirable parts.) The iPhone camera also analyzes each image semantically, with the help of a graphics-processing unit, which picks out specific elements of a frame: faces, landscapes, skies. Each is treated differently. On both the 12 Pro and the 13 Pro, I've found that the image processing makes clouds and contrails stand out with more clarity than the human eye can perceive, creating skies that resemble the supersaturated horizons of an anime film or a video game. Andy Adams, a longtime photo blogger, told me, "HDR is a technique that, like salt, should be applied very judiciously." Now every photo we take on our iPhones has had the salt applied generously, whether it is needed or not.
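For readers who want the mechanics spelled out, here is a minimal sketch of the general idea of merging several exposures pixel by pixel. It is not Apple's Deep Fusion, whose details are proprietary; the weighting scheme, the simulated frames, and the function name merge_exposures are illustrative assumptions only.

```python
# Toy illustration of multi-frame exposure merging (not Apple's Deep Fusion):
# frames shot at different exposures are blended pixel by pixel, with each
# pixel weighted by how "well exposed" it is (how close it sits to mid-gray).
import numpy as np

def merge_exposures(frames: list[np.ndarray]) -> np.ndarray:
    """Blend a stack of same-sized RGB images with float values in [0, 1]."""
    stack = np.stack(frames, axis=0)                  # shape: (n, H, W, 3)
    # Gaussian weight centered at 0.5: mid-tone pixels dominate, while
    # blown-out or crushed pixels contribute very little.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights = weights / (weights.sum(axis=0) + 1e-8)  # normalize across frames
    return (weights * stack).sum(axis=0)

# Example: simulate under-, mid-, and over-exposed versions of one tiny scene.
rng = np.random.default_rng(0)
scene = rng.random((4, 4, 3))
frames = [np.clip(scene * gain, 0.0, 1.0) for gain in (0.5, 1.0, 2.0)]
composite = merge_exposures(frames)
print(composite.shape)  # (4, 4, 3)
```

The principle is the one Adams gestures at: every pixel of the composite leans on whichever frame exposed it best, salt applied to every grain of the image.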

In the twentieth century, photography enabled the mass reproduction of artworks, broadening their accessibility while diluting their individual impact. Just as individual artworks carry a physical "aura," in Walter Benjamin's phrase, traditional cameras produce images with distinct qualities. Think of an original Leica photograph taken with a fixed-focal-length lens, or an instant Polaroid with its uneven exposure. The images made on these devices are inseparable from the mechanisms of the devices themselves. In a way, the iPhone has made the camera itself infinitely reproducible: its digital tools can simulate any camera, lens, or film at any given moment, without the manual skill that was once necessary, not unlike the way early photographs mimicked painters' brushstrokes. The resulting iPhone photos have a destabilizing effect on the status of the camera and of the photographer, creating a shallow copy of photographic technique that undermines the impact of the original. The average iPhone photo strains toward the look of professionalism and mimics artistry without ever getting there. We're all pro photographers now, at the tap of a finger, but that doesn't mean our photos are good.

After my conversations with the iPhone team, Apple loaned me a 13 Pro, which includes a new feature called Photographic Styles, meant to give users a hand in the computational-photography process. Whereas filters and other familiar editing tools work on a whole image at once, after it has been taken, Styles factors adjustments into the stages of semantic analysis and selection between frames. The process is a bit like dialling in the settings on a manual camera: it changes how a photo is taken when the shutter button is pressed. The Tone setting combines brightness, contrast, saturation, and other factors, while the Warmth setting adjusts the color temperature of images. The effects of these adjustments are subtler than those of the iPhone's older post-processing filters, but the essential qualities of the new generation of iPhone photos remain. Coldly crisp and vaguely inhuman, they land in an uncanny valley where creative expression meets machine learning.
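As a rough idea of what a warmth or tone adjustment amounts to in arithmetic, here is a sketch under stated assumptions: the red/blue shift and the simple gamma curve below are generic image-processing moves, not Apple's actual Styles pipeline, and apply_style is an invented name for the example.

```python
# Illustrative "warmth" and "tone" adjustments on an RGB image array.
# This is a generic approximation, not Apple's implementation.
import numpy as np

def apply_style(image: np.ndarray, warmth: float = 0.0, tone: float = 0.0) -> np.ndarray:
    """image: float RGB array in [0, 1]; warmth and tone in [-1, 1]."""
    out = image.copy()
    # Warmth: nudge the red/blue balance, a crude color-temperature change.
    out[..., 0] = np.clip(out[..., 0] + 0.1 * warmth, 0.0, 1.0)  # red channel
    out[..., 2] = np.clip(out[..., 2] - 0.1 * warmth, 0.0, 1.0)  # blue channel
    # Tone: lift or lower mid-tones with a simple gamma curve.
    gamma = 1.0 - 0.5 * tone          # tone > 0 brightens mid-tones
    return np.clip(out, 0.0, 1.0) ** gamma

# Example: warm up and slightly brighten a small random image.
photo = np.random.default_rng(1).random((4, 4, 3))
warmer = apply_style(photo, warmth=0.5, tone=0.2)
```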

One of Apple's most dramatic features of computational photography is Portrait Mode, which mimics the way a wide-aperture lens captures a subject in the foreground in sharp focus while blurring what lies behind. Available on iPhone models since 2016, the effect is achieved not by the lens itself but algorithmically, by software that locates the subject and applies an artificial blur to the background. Bokeh, as this hazy quality is known, was once the province of glossy magazines and fashion shoots. Now it is simply another aesthetic option open to any user, and the digital simulation is often unconvincing. Take a picture in Portrait Mode and you'll see where the algorithm falls short: perhaps the outline of a person's hair will look smudged, because the system cannot quite gauge its boundaries, or a smaller figure will be registered as part of the background and blurred out entirely. This mechanical approximation of bokeh reads as amateurish rather than craftsmanlike. Enthusiasts who dislike such technological sleight of hand may seek out older digital cameras, or retreat to shooting film. But the new iPhone cameras, to a degree most users may not realize, are setting a paradigm that is reshaping picture-making along with our expectations of what a photo should be. "It sets the standard for what a normal photo looks like," said Fitt, the Paris-based photographer. "I hope, in the future, I won't have clients asking for this kind of look."
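To make that mechanism concrete, here is a minimal sketch of the mask-and-blur idea behind such a feature, written under explicit assumptions: the subject mask is drawn by hand rather than estimated from depth or segmentation data, a cheap box blur stands in for a real lens-blur model, and the names fake_portrait and box_blur are invented for the example. It is not Apple's algorithm.

```python
# Toy version of the Portrait Mode idea: keep a masked subject sharp and
# blur everything else. On a phone the mask would come from depth
# estimation or segmentation; here it is hard-coded.
import numpy as np

def box_blur(image: np.ndarray, radius: int = 2) -> np.ndarray:
    """Cheap mean blur: average each pixel with its shifted neighbors."""
    acc = np.zeros_like(image, dtype=float)
    count = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(image, dy, axis=0), dx, axis=1)
            count += 1
    return acc / count

def fake_portrait(image: np.ndarray, subject_mask: np.ndarray) -> np.ndarray:
    """subject_mask: 1.0 where the subject is, 0.0 for the background."""
    blurred = box_blur(image)
    mask = subject_mask[..., None]        # broadcast mask over color channels
    return mask * image + (1.0 - mask) * blurred

# Example: a random "scene" with a pretend subject in the middle.
rng = np.random.default_rng(2)
img = rng.random((8, 8, 3))
mask = np.zeros((8, 8))
mask[2:6, 2:6] = 1.0                      # the subject occupies this block
result = fake_portrait(img, mask)
```

The glitches Fitt and others complain about live entirely in the mask: wherever the estimated boundary is wrong, sharpness and blur end up on the wrong side of the line.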