Apple's iPhone Cameras Accused of Being 'Too Smart'

The New Yorker argues that photos on newer iPhones are "coldly crisp and vaguely inhuman, caught in the uncanny valley where creative expression meets machine learning...." "[T]he truth is that iPhones are no longer cameras in the traditional sense. Instead, they are devices at the vanguard of 'computational photography,' a term that describes imagery formed from digital data and processing as much as from optical information. Each picture registered by the lens is altered to bring it closer to a pre-programmed ideal."

In late 2020, Kimberly McCabe, an executive at a consulting firm in the Washington, D.C. area, upgraded from an iPhone 10 to an iPhone 12 Pro... But the 12 Pro has been a disappointment, she told me recently, adding, "I feel a little duped." Every image seems to come out far too bright, with warm colors desaturated into grays and yellows. Some of the photos that McCabe takes of her daughter at gymnastics practice turn out strangely blurry. In one image that she showed me, the girl's upraised feet smear together like a messy watercolor. McCabe said that, when she uses her older digital single-lens-reflex camera (D.S.L.R.), "what I see in real life is what I see on the camera and in the picture." The new iPhone promises "next level" photography with push-button ease. But the results look odd and uncanny. "Make it less smart — I'm serious," she said. Lately she's taken to carrying a Pixel, from Google's line of smartphones, for the sole purpose of taking pictures....

Gregory Gentert, a friend who is a fine-art photographer in Brooklyn, told me, "I've tried to photograph on the iPhone when light gets bluish around the end of the day, but the iPhone will try to correct that sort of thing." A dusky purple gets edited, and in the process erased, because the hue is evaluated as undesirable, as a flaw instead of a feature. The device "sees the things I'm trying to photograph as a problem to solve," he added.
The image processing also eliminates digital noise, smoothing it into a soft blur, which might be the reason behind the smudginess that McCabe sees in photos of her daughter's gymnastics. The "fix" ends up creating a distortion more noticeable than whatever perceived mistake was in the original.

Earlier this month, Apple's iPhone team agreed to provide me information, on background, about the camera's latest upgrades. A staff member explained that, when a user takes a photograph with the newest iPhones, the camera creates as many as nine frames with different levels of exposure. Then a "Deep Fusion" feature, which has existed in some form since 2019, merges the clearest parts of all those frames together, pixel by pixel, forming a single composite image. This process is an extreme version of high-dynamic range, or H.D.R., a technique that previously required some software savvy....

The iPhone camera also analyzes each image semantically, with the help of a graphics-processing unit, which picks out specific elements of a frame — faces, landscapes, skies — and exposes each one differently. On both the 12 Pro and 13 Pro, I've found that the image processing makes clouds and contrails stand out with more clarity than the human eye can perceive, creating skies that resemble the supersaturated horizons of an anime film or a video game.

Andy Adams, a longtime photo blogger, told me, "H.D.R. is a technique that, like salt, should be applied very judiciously." Now every photo we take on our iPhones has had the salt applied generously, whether it is needed or not.... The average iPhone photo strains toward the appearance of professionalism and mimics artistry without ever getting there. We are all pro photographers now, at the tap of a finger, but that doesn't mean our photos are good.
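Apple's actual Deep Fusion pipeline is proprietary, but the general idea described above — blending several differently exposed frames so that each pixel comes mostly from whichever frame exposed it best — is classic exposure fusion. The following is a minimal NumPy sketch of that idea, not Apple's algorithm: the function name, the mid-gray "well-exposedness" weight, and the `sigma` parameter are all illustrative assumptions.

```python
import numpy as np

def exposure_fusion(frames, sigma=0.2):
    """Blend differently exposed frames of the same scene.

    Each pixel is weighted by how close its value is to mid-gray (0.5),
    a crude 'well-exposedness' score, so well-exposed regions dominate
    the composite. frames: list of float arrays in [0, 1], same shape.
    """
    stack = np.stack(frames)                       # (n, H, W) or (n, H, W, 3)
    # Gaussian weight peaking at 0.5: near-black and blown-out pixels
    # get little influence on the final image.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize across frames
    return (weights * stack).sum(axis=0)           # per-pixel weighted blend

# Toy example: one underexposed and one overexposed "frame".
dark = np.full((4, 4), 0.1)
bright = np.full((4, 4), 0.9)
fused = exposure_fusion([dark, bright])            # values pulled toward mid-gray
```

Real implementations (e.g. the Mertens–Kautz–Van Reeth method) add contrast and saturation terms to the weights and blend across multiscale pyramids to avoid seams, but the per-pixel weighted merge above is the core of what "merges the clearest parts of all those frames together" means.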

Read more of this story at Slashdot.



