Have iPhone Cameras Become Too Smart?

In late 2020, Kimberly McCabe, an executive at a consulting firm in the Washington, D.C. area, upgraded from an iPhone 10 to an iPhone 12 Pro. Quarantine had prompted McCabe, a mother of two, to invest more effort into documenting family life. She figured that the new smartphone, which had been released the month before and featured an enhanced camera, would improve the quality of her amateur snapshots. But the 12 Pro has been a disappointment, she told me recently, adding, “I feel a little duped.” Every image seems to come out far too bright, with warm colors desaturated into grays and yellows. Some of the photos that McCabe takes of her daughter at gymnastics practice turn out strangely blurry. In one image that she showed me, the girl’s upraised feet smear together like a messy watercolor. McCabe said that, when she uses her older digital single-lens-reflex camera (D.S.L.R.), “what I see in real life is what I see on the camera and in the picture.” The new iPhone promises “next level” photography with push-button ease. But the results look odd and uncanny. “Make it less smart—I’m serious,” she said. Lately she’s taken to carrying a Pixel, from Google’s line of smartphones, for the sole purpose of taking pictures.

Apple has reportedly sold more than a hundred million units of the iPhone 12 Pro, and more than forty million of the iPhone 13 Pro since it débuted, in September of last year. Both models are among the most popular consumer cameras ever made, and also among the most powerful. The lenses on our smartphones are tiny apertures, no bigger than a shirt button. Until recently, they had little chance of imitating the function of full-size professional camera lenses. Phone cameras achieved the standards of a basic digital point-and-shoot; many of us didn’t expect anything more. With the latest iPhone models, though, Apple is attempting to make its minuscule phone cameras perform as much like traditional cameras as possible, and to make every photo they take look like the work of a seasoned professional. (Hence the names 12 and 13 “Pro,” which are distinguished from the earlier iPhone 12 and 13 models mainly by their fancier cameras.) The iPhone 13 Pro takes twelve-megapixel images, includes three separate lenses, and uses machine learning to automatically adjust lighting and focus. Yet, for some users, all of those optimizing features have had an unwanted effect. Halide, a developer of camera apps, recently published a careful examination of the 13 Pro that noted visual glitches caused by the device’s intelligent photography, including the erasure of bridge cables in a landscape shot. “Its complex, interwoven set of ‘smart’ software components don’t fit together quite right,” the report stated.

In January, I traded my iPhone 7 for an iPhone 12 Pro, and I’ve been dismayed by the camera’s performance. On the 7, the slight roughness of the images I took seemed like a logical product of the camera’s limited capabilities. I didn’t mind imperfections like the “digital noise” that occurred when a subject was underlit or too far away, and I liked that any editing of photos was up to me. On the 12 Pro, by contrast, the digital manipulations are aggressive and unsolicited. One expects a person’s face in front of a sunlit window to appear darkened, for instance, since a traditional camera lens, like the human eye, can only let light in through a single aperture size in a given instant. But on my iPhone 12 Pro even a backlit face appears strangely illuminated. The editing might make for a theoretically improved photo—it’s nice to see faces—yet the effect is creepy. When I press the shutter button to take a picture, the image in the frame often appears for an instant as it did to my naked eye. Then it clarifies and brightens into something unrecognizable, and there’s no way of reversing the process. David Fitt, a professional photographer based in Paris, also went from an iPhone 7 to a 12 Pro, in 2020, and he still prefers the 7’s less powerful camera. On the 12 Pro, “I shoot it and it looks overprocessed,” he said. “They bring details back in the highlights and in the shadows that often are more than what you see in real life. It looks over-real.”

For a large portion of the population, “smartphone” has become synonymous with “camera,” but the truth is that iPhones are no longer cameras in the traditional sense. Instead, they are devices at the vanguard of “computational photography,” a term that describes imagery formed from digital data and processing as much as from optical information. Each picture registered by the lens is altered to bring it closer to a pre-programmed ideal. Gregory Gentert, a friend who is a fine-art photographer in Brooklyn, told me, “I’ve tried to photograph on the iPhone when light gets bluish around the end of the day, but the iPhone will try to correct that sort of thing.” A dusky purple gets edited, and in the process erased, because the hue is evaluated as undesirable, as a flaw instead of a feature. The device “sees the things I’m trying to photograph as a problem to solve,” he added. The image processing also eliminates digital noise, smoothing it into a soft blur, which might be the reason behind the smudginess that McCabe sees in photos of her daughter’s gymnastics. The “fix” ends up creating a distortion more noticeable than whatever perceived mistake was in the original.
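The trade-off Gentert and McCabe are describing, noise suppression at the expense of fine detail, can be illustrated with a toy example in Python. The Gaussian blur and the numbers below are stand-ins chosen for clarity, not the iPhone's actual noise-reduction pipeline; the point is only that the harder an algorithm scrubs away grain, the more it smears whatever sharp detail sits nearby:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# A synthetic frame: a flat, noisy background plus one sharp bright stripe
# standing in for fine detail (a gymnast's upraised foot, say).
rng = np.random.default_rng(0)
frame = np.zeros((64, 64))
frame[:, 30:33] = 1.0
noisy = frame + rng.normal(0.0, 0.2, size=frame.shape)

# Heavier smoothing suppresses more noise but also smears the detail.
for sigma in (1.0, 3.0):
    smoothed = gaussian_filter(noisy, sigma)
    residual_noise = smoothed[:, :20].std()          # variation left in a flat area
    stripe_contrast = smoothed[:, 31].mean() - smoothed[:, 10].mean()
    print(f"sigma={sigma}: residual noise {residual_noise:.3f}, "
          f"stripe contrast {stripe_contrast:.3f}")
```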

Earlier this month, Apple’s iPhone team agreed to provide me information, on background, about the camera’s latest upgrades. A staff member explained that, when a user takes a photograph with the newest iPhones, the camera creates as many as nine frames with different levels of exposure. Then a “Deep Fusion” feature, which has existed in some form since 2019, merges the clearest parts of all those frames together, pixel by pixel, forming a single composite image. This process is an extreme version of high-dynamic range, or H.D.R., a technique that previously required some software savvy. (As a college student, I’d struggle to replicate H.D.R. on my traditional camera’s photos by using Photoshop to overlay various frames and then cut out their desirable parts.) The iPhone camera also analyzes each image semantically, with the help of a graphics-processing unit, which picks out specific elements of a frame—faces, landscapes, skies—and exposes each one differently. On both the 12 Pro and 13 Pro, I’ve found that the image processing makes clouds and contrails stand out with more clarity than the human eye can perceive, creating skies that resemble the supersaturated horizons of an anime film or a video game. Andy Adams, a longtime photo blogger, told me, “H.D.R. is a technique that, like salt, should be applied very judiciously.” Now every photo we take on our iPhones has had the salt applied generously, whether it is needed or not.
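For readers who want a concrete sense of what "merging the clearest parts" of a bracket of frames involves, here is a deliberately simplified sketch in Python. It is not Apple's Deep Fusion algorithm, which is proprietary and far more elaborate; it only shows the basic idea of exposure merging, in which each pixel of the composite is a weighted average of the stack, with well-exposed values counting for more than blown-out or crushed ones. The function name and the weighting scheme are illustrative assumptions:

```python
import numpy as np

def merge_exposure_bracket(frames):
    """Merge a stack of differently exposed frames into one composite.

    frames: list of float arrays in [0, 1], all the same shape.
    Weights peak at mid-gray (well exposed) and fall toward 0 and 1,
    so each output pixel leans on whichever frame exposed it best.
    """
    stack = np.stack(frames, axis=0)
    weights = 1.0 - np.abs(stack - 0.5) * 2.0
    weights = np.clip(weights, 1e-3, None)   # avoid dividing by zero
    return (stack * weights).sum(axis=0) / weights.sum(axis=0)

# Simulate a bracket: the same scene under-, normally, and over-exposed.
scene = np.random.rand(48, 64)
bracket = [np.clip(scene * gain, 0.0, 1.0) for gain in (0.5, 1.0, 2.0)]
composite = merge_exposure_bracket(bracket)
```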

In the twentieth century, photography enabled the mass reproduction of art works, broadening their accessibility while degrading their individual impact. Just as art works have physical “auras,” as Walter Benjamin described it, traditional cameras produce images with distinctive qualities. Think of the pristine Leica camera photo shot with a fixed-length lens, or the Polaroid instant snapshot with its spotty exposure. The images made on those devices are inextricable from the mechanics of the devices themselves. In a way, the iPhone has made the camera itself infinitely reproducible. The device’s digital tools can mimic any camera, lens, or film at any moment, without the manual skill that was necessary in the past—not unlike the way in which early photographs replicated painters’ brushstrokes. The resulting iPhone images have a destabilizing effect on the status of the camera and the photographer, creating a shallow copy of photographic technique that undermines the impact of the original. The average iPhone photo strains toward the appearance of professionalism and mimics artistry without ever getting there. We are all pro photographers now, at the tap of a finger, but that doesn’t mean our photos are good.

After my conversations with the iPhone-team member, Apple loaned me a 13 Pro, which includes a new Photographic Styles feature that is meant to let users in on the computational-photography process. Whereas filters and other familiar editing tools work on a whole image at once, after it is taken, Styles factors the adjustments into the stages of semantic analysis and selection between frames. The process is a bit like adjusting the settings on a manual camera; it changes how the photo will be taken when the shutter button is pushed. A Tone dial combines brightness, contrast, saturation, and other factors, and a Warmth dial changes the color temperature of the photos. The effects of these adjustments are more subtle than the iPhone’s older post-processing filters, but the fundamental qualities of new-generation iPhone photographs remain. They are coldly crisp and vaguely inhuman, caught in the uncanny valley where creative expression meets machine learning.
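As an illustration of what a tone dial and a warmth dial adjust, the sketch below applies both to an image in Python. Photographic Styles, as described above, factors its adjustments into semantic analysis and frame selection at capture time; this toy version applies a single global contrast curve and color shift after the fact, an assumption made purely for simplicity, and the specific formulas are guesses rather than Apple's:

```python
import numpy as np

def apply_style(rgb, tone=0.0, warmth=0.0):
    """Apply a crude 'Tone' and 'Warmth' adjustment to an RGB image.

    rgb: float array of shape (H, W, 3) in [0, 1].
    tone: -1..1; negative flattens contrast, positive deepens it.
    warmth: -1..1; negative shifts toward blue, positive toward amber.
    """
    out = rgb.copy()
    # Tone: a contrast curve pivoting around mid-gray.
    out = 0.5 + (out - 0.5) * (1.0 + 0.5 * tone)
    # Warmth: nudge the red and blue channels in opposite directions.
    out[..., 0] *= 1.0 + 0.1 * warmth   # red
    out[..., 2] *= 1.0 - 0.1 * warmth   # blue
    return np.clip(out, 0.0, 1.0)

photo = np.random.rand(48, 64, 3)
warmer = apply_style(photo, tone=0.3, warmth=0.5)
```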

One of the most dramatic features of Apple’s computational photography is Portrait Mode, which imitates the way a lens with a wide aperture captures a subject in the foreground in sharp focus while fuzzing out what’s behind. Available on iPhone models since 2016, this effect is achieved not by the lens itself but by algorithmic filters that determine where the subject is and apply an artificial blur to the background. Bokeh, as that gauzy quality is known, was once the domain of glossy magazines and fashion photo shoots. Now it is simply another aesthetic choice open to any user, and the digital simulation is often unconvincing. Take a picture in Portrait Mode and you’ll see where the algorithm is imperfect. Perhaps the outline of a subject’s hair will come out fuzzy, because the system can’t quite gauge its borders, or a secondary figure will be registered as part of the background and blurred out altogether. This machine-approximated version of bokeh signifies amateurism rather than craft. Hobbyists who dislike such technological tricks might seek out older digital cameras, or they might flee back to film. But the new iPhone cameras, more than most users realize, are forging a template that is reshaping the nature of image-making along with our expectations of what a photograph should be. David Fitt, the Paris-based photographer, said, “It sets a standard of what the normal picture looks like. I hope, in the future, that I won’t have clients asking for this type of look.”
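The artificial-blur idea itself can be sketched in a few lines of Python. In a real phone, the subject mask is estimated from depth data and segmentation models, and that estimation is precisely where Portrait Mode succeeds or fails, producing the fuzzy hair and vanished bystanders described above; in this sketch the mask is simply hand-drawn, and the function name is illustrative. The sketch shows only how a background blur is composited once a mask exists:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_bokeh(rgb, subject_mask, blur_sigma=6.0):
    """Simulate Portrait-Mode-style background blur with a given subject mask.

    rgb: float array (H, W, 3); subject_mask: float array (H, W) in [0, 1],
    where 1 marks the subject to keep sharp.
    """
    blurred = np.stack(
        [gaussian_filter(rgb[..., c], blur_sigma) for c in range(3)], axis=-1
    )
    # Soften the mask edge slightly; an inaccurate edge is what produces the
    # telltale halo around a subject's hair.
    soft = gaussian_filter(subject_mask, 2.0)[..., None]
    return soft * rgb + (1.0 - soft) * blurred

photo = np.random.rand(96, 128, 3)
mask = np.zeros((96, 128))
mask[20:80, 40:90] = 1.0          # pretend the subject occupies this box
portrait = fake_bokeh(photo, mask)
```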
