Owners of the new iPhone XS noticed that their selfies were looking “too good,” like they’d been air-brushed, their skin make-up-smooth. Users assumed that a software filter was automatically being applied when they used the front-facing camera on their phones, and the phenomenon was quickly dubbed #beautygate. It turns out that it was a bug, not a feature. For each snap, the camera was shooting several frames, and the software was basing the final image on a frame with less fine detail.
At least iPhone users are getting their photos: lately my phone has been deciding it doesn’t want to save my pictures at all. It turns out that I was just going too fast; I need to wait patiently for the software to finish processing the picture before putting my phone away.
These unrelated bugs have a common cause: the high dynamic range (HDR) processing of digital images. As Devin Coldewey writes in the first article linked below, there’s a limit to what the microscopic sensors on our phones can take in, so the real battleground in photography is the software. To get that sweet bokeh (pleasing background blur) without a big lens and delicate aperture controls, we’re going to have to rely on innovations in the code.
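To get a feel for what that software is doing, here is a toy sketch of multi-frame exposure fusion, the family of techniques behind HDR modes: shoot several frames at different exposures, then blend each pixel, favoring the frames where that pixel is well exposed (close to mid-gray). This is a deliberate simplification for illustration, not Apple’s or Google’s actual pipeline; the `fuse_exposures` function and its weighting scheme are hypothetical.

```python
def fuse_exposures(frames):
    """Blend several same-scene frames (pixel values 0.0-1.0) into one.

    Each pixel is a weighted average across frames, where a frame's
    weight is how close its pixel is to mid-gray (0.5) -- a crude
    "well-exposedness" measure borrowed from exposure-fusion papers.
    """
    fused = []
    for pixels in zip(*frames):  # walk the frames pixel by pixel
        # Weight: 1.0 at mid-gray, falling to ~0 at pure black/white.
        weights = [max(1e-6, 1.0 - abs(p - 0.5) * 2.0) for p in pixels]
        total = sum(weights)
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

# A dark frame and a bright frame of the same two-pixel "scene":
dark = [0.10, 0.45]
bright = [0.60, 0.95]
print(fuse_exposures([dark, bright]))
```

In the first pixel, the bright frame’s 0.60 is closer to mid-gray than the dark frame’s 0.10, so it dominates the blend. Real pipelines align the frames first, weight by contrast and saturation too, and increasingly lean on machine learning, which is exactly why these bugs live in software rather than in the sensor.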
- The future of photography is code [TechCrunch] “Cameras can’t get too much better than they are right now, or at least not without some rather extreme shifts in how they work.”
- Google is rolling out a software update soon to fix the Pixel 3 photo-saving issue [The Verge] “Until Google rolls out the fix, users can work around the issue by leaving the camera app open until HDR processing completes or turning off the HDR function completely.”
- Apple says glossy iPhone XS selfies were a bug, promises a fix in iOS 12.1 [Ars Technica] “Using powerful image processors and machine learning is a trend in today’s phones for a reason—it’s an extremely promising way to make up for the limitations inherent in phone cameras’ optics. But it does have the downside of taking some control away from the user.”
- At long last, pet portraits with background blur are possible on the iPhone XR [TechCrunch] “The problem was that Apple’s machine learning systems are only trained to recognize and create high-quality depth maps of people. Not dogs, cats, plants or toy robots. People would be frustrated if the artificial background blur inexplicably got way worse when it was pointed at something that wasn’t a person, so the effect just doesn’t trigger unless someone’s in the shot.”
From the Ohio Web Library:
- Long, Ben. “HDR Photography: Shooting and Processing.” Lynda.com, July 2011.
- Scoblete, Greg. “Get Smart: 9 Tools to Boost Your Mobile Photography.” Photo District News, vol. 35, no. 1, Jan. 2015, p. 78.
- Bansal, Agam, et al. “Selfies: A Boon or Bane?” Journal of Family Medicine & Primary Care, vol. 7, no. 4, July 2018, pp. 828–831.