Apple’s Quest to Transform Photography

John Paczkowski of BuzzFeed conducted a fascinating interview about the iPhone’s camera with Phil Schiller, Apple’s Senior Vice President of Worldwide Marketing, and Johnnie Manzari of Apple’s Human Interface Team. Much of the discussion focuses on the new Portrait Lighting feature available on the iPhone 8 Plus and X. As Paczkowski explains:

The camera’s effects don’t rely on filters. They’re the result of Apple’s new dual camera system working in concert with machine learning to sense a scene, map it for depth, and then change lighting contours over the subject. It’s all done in real time, and you can even preview the results thanks to the company’s enormously powerful new A11 Bionic chip. The result, when applied at Apple scale, has the power to be transformative for modern photography, with millions of amateur shots suddenly professionalized.
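
Apple’s relighting pipeline itself isn’t public, but the depth data it builds on is exposed to third-party developers through AVFoundation. As a rough illustration of the capture side only, here’s a minimal Swift sketch, assuming a depth-capable device; the DepthCaptureController class and its method names are hypothetical, and Apple’s actual Portrait Lighting rendering is not shown:

```swift
import AVFoundation

// Minimal sketch: capture a photo plus the per-pixel depth map that
// effects like Portrait Lighting operate on. Uses Apple's public
// AVFoundation depth APIs (iOS 11+); the class name is hypothetical.
final class DepthCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        session.beginConfiguration()
        session.sessionPreset = .photo

        // Depth needs a depth-capable camera, e.g. the dual camera on
        // the iPhone 8 Plus or X.
        guard let camera = AVCaptureDevice.default(.builtInDualCamera,
                                                   for: .video,
                                                   position: .back) else {
            fatalError("No dual camera on this device")
        }
        session.addInput(try AVCaptureDeviceInput(device: camera))
        session.addOutput(photoOutput)

        // Opt in to depth delivery (only valid once the output is
        // attached to the session).
        photoOutput.isDepthDataDeliveryEnabled =
            photoOutput.isDepthDataDeliverySupported
        session.commitConfiguration()
        session.startRunning()
    }

    func capture() {
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled =
            photoOutput.isDepthDataDeliveryEnabled
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // depthDataMap is a CVPixelBuffer of distances; a relighting
        // effect would use it to shade the subject's facial contours.
        if let depth = photo.depthData {
            let map = depth.depthDataMap
            print("Depth map:",
                  CVPixelBufferGetWidth(map), "x",
                  CVPixelBufferGetHeight(map))
        }
    }
}
```

A streaming variant of the same data is available through AVCaptureDepthDataOutput, which is what makes the real-time previews Paczkowski describes possible.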

Manzari described the extensive process that went into creating Portrait Lighting:

“We spent a lot of time shining light on people and moving them around — a lot of time,” Manzari says. “We had some engineers trying to understand the contours of a face and how we could apply lighting to them through software, and we had other silicon engineers just working to make the process super-fast. We really did a lot of work.”

BuzzFeed’s article is worth a close read because it covers more than just the camera in Apple’s new and upcoming iPhones. The behind-the-scenes peek at how the iPhone camera’s many functions are developed is a prime illustration of one of Apple’s biggest competitive advantages: the fusion of hardware and software.