Posts tagged with "camera"

Halide 1.7 Brings New Depth Photography and ARKit Features, Darkroom Integration

We first reviewed Halide, the third-party camera app by Ben Sandofsky and Sebastiaan de With, when it debuted in the summer of 2017 as a powerful and elegant alternative to Apple’s Camera app that fully embraced RAW photography and advanced controls in an intuitive interface. We later showcased Halide’s iPhone X update as one of the most thoughtful approaches to adapting for the device’s Super Retina Display; to this day, Halide is a shining example of how the iPhone X’s novel form factor can aid, rather than hinder, complex app UIs.

While Halide was already regarded as an appealing alternative to Apple’s stock app for professional photographers and RAW-curious iPhone users (something that designer de With covered in depth in his excellent guide), it lacked a handful of key features of the modern iPhone photography experience. Sandofsky and de With aim to close some of those gaps with today’s 1.7 update, which focuses on bringing the power of Portrait mode to Halide, supporting the iPhone X’s TrueDepth camera system, and extending the app with a special ARKit mode, new export options, and native integration with the popular Darkroom photo editing tool.

Read more


How to Design for iPhone X (Without an iPhone X)

Great analysis by Sebastiaan de With on how the Halide team redesigned the app for the iPhone X (the update indeed turned out to be one of the best iPhone X app updates we’ve seen so far):

Design for ergonomics. On regular iPhones, you have to do much less as a designer to optimize ergonomics. The iPhone X requires you to think about comfortable button placement and usability. Ergonomics is more than just tapping, but also swiping and other gestures. Lay out your UI so all actions are accessible and as comfortably usable as possible.

It’s a whole new device: Design for it. Everyone can stretch an app out to a larger screen, but just like the iPad, a fresh approach is not only welcomed but helps you stand out in the App Store. This is a great time to validate your current design. Are your approaches still valid? Is there a better solution possible? You might come to some valuable insights that you can apply to all your designs, not just the ones for the shiny new device.

If you’re a developer working on iPhone X UI updates, don’t miss Sebastiaan’s map visualization of the device’s display.

Permalink

Dual Lens Switching on iPhone X

Dan Provost of Studio Neat (makers of the excellent Glif) ran some tests to analyze the low-light performance of the iPhone X’s telephoto lens:

Last year, when the iPhone 7 Plus was released, Glenn Fleishman wrote a terrific piece for Macworld about how the dual lens camera system works. In short, when you zoom in to 2X, the camera does not always switch to the telephoto lens. In some cases (typically in low light scenarios), you will be presented with a cropped image from the wide angle lens instead. This was sacrilege to camera nerds, but Apple would argue that if the cropped image looks better in those low light situations, then that is the correct approach.

The results are impressive:

As you can see, the iPhone X required very little light before it decided to use the telephoto lens. The iPhone 7 Plus required quite a bit more. I used the app Light Meter to measure the light at each interval, which I denote in the video. The app measures the lux, which is a measure of illuminance equal to one lumen per square meter. (I measured from both devices and averaged the results, as the readings were slightly different. I wouldn’t expect an app to function as well as a true light meter, but this probably gets us in the ball park).

Make sure to check out the video to see the lens switching in action. The difference between the iPhone 7 Plus and the X is substantial when it comes to the amount of light required for the system to pick the telephoto lens.

Permalink

Focos: Powerful Depth Image Controls in a Fun Package

The iPhone’s camera has long been one of its most important features. Every year when new models are introduced, it’s a sure bet that camera improvements are part of the package. Last year that remained true, but it also proved an even more special year for the iPhone’s camera setup. The introduction of dual rear-facing cameras with Portrait mode was something different – pictures no longer just looked a little better than on older iPhone models; they looked almost professional-quality.

This year, whether you picked up a new iPhone or not, Portrait mode is a better feature than before. Part of this is due to software improvements in iOS 11, but another key benefit is that third-party developers now have access to the depth information in Portrait photos. For the first time, Portrait images taken with the iPhone can be edited and enhanced in unique ways, and Focos is a new app that takes full advantage of that opportunity.
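
To get a sense of what that developer access looks like in practice, here is a minimal sketch (the function name and file-loading setup are illustrative assumptions, not Focos’ actual code) that uses iOS 11’s Image I/O and AVFoundation APIs to read the disparity map embedded in a Portrait photo:

```swift
import AVFoundation
import ImageIO

// Hypothetical helper: extract the disparity map that iOS 11 embeds in
// Portrait photos as auxiliary image data. `photoURL` is assumed to point to
// a Portrait HEIC/JPEG file that actually contains depth information.
func loadDisparityMap(from photoURL: URL) -> CVPixelBuffer? {
    guard
        let source = CGImageSourceCreateWithURL(photoURL as CFURL, nil),
        let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
            source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
        let depthData = try? AVDepthData(fromDictionaryRepresentation: auxInfo)
    else {
        return nil
    }

    // Normalize to 32-bit disparity so downstream code deals with one format.
    return depthData
        .converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
        .depthDataMap
}
```

With that pixel buffer in hand, an editor can weight its effects by each pixel’s disparity value, which is the general approach behind depth-aware features like adjustable background blur.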

Read more


Can Clips Be a Modern Day Photo Booth?

Karan Varindani considers the potential of Apple’s Clips to be a spiritual successor to Photo Booth:

With the iPad 2, back in early 2011, Apple brought Photo Booth to the iPad. I distinctly remember thinking that this was a no-brainer at the time. Growing up in Ghana, there weren’t that many Macs in my high school, but everybody that had one used Photo Booth. It was very regular to walk into the sixth form (senior year) common room and see groups of friends, myself included, behind a MacBook playing with the filters. Talking to several of my American friends, it sounds like it was the same deal here. I always thought that it was only a matter of time before Apple brought Photo Booth to the iPhone, but six years later it still just ships with Macs and iPads (and I don’t think that it’s been updated in that time).

Playing with the Selfie Scenes in Clips last week, I had the same feeling that I did playing with Photo Booth on my Mac many years ago. It was a little surreal, as someone with incredible front-camera shyness, to find myself having so much fun with it. The whole experience had me thinking: In a few years, once the Face ID technology has spread to the rest of the iOS line (and maybe even the Mac), could Clips be the successor to Photo Booth? Between Selfie Scenes, stickers, Live Titles, and fast sharing to social media, it seems the perfect fit.

I think the best modern equivalent of that Photo Booth social experience is Snapchat’s lenses, which I’ve observed can consistently deliver laughter and interest among a group of friends or family members. Clips’ Selfie Scenes offer a similarly neat technical effect, but if Apple is serious about the app’s success, a couple of big changes need to take place: the square orientation limit has to go, and Clips needs better hooks into apps like Instagram and Snapchat than the share sheet provides.

Photo Booth’s prime came in a very different era from today’s, and without the aid of a true social network it will be hard for Apple to replicate its success. So far, Animoji seem much closer to meeting that goal than Clips.

Permalink


Apple’s Quest to Transform Photography

John Paczkowski of BuzzFeed conducted a fascinating interview with Apple’s Senior Vice President of Worldwide Marketing Phil Schiller and Johnnie Manzari of Apple’s Human Interface Team about the iPhone’s camera. Much of the discussion is focused on the new Portrait Lighting feature available in the 8 Plus and X. As Paczkowski explains,

The camera’s effects don’t rely on filters. They’re the result of Apple’s new dual camera system working in concert with machine learning to sense a scene, map it for depth, and then change lighting contours over the subject. It’s all done in real time, and you can even preview the results thanks to the company’s enormously powerful new A11 Bionic chip. The result, when applied to Apple scale, has the power to be transformative for modern photography, with millions of amateur shots suddenly professionalized.

Manzari described the extensive process that went into creating Portrait Lighting:

“We spent a lot of time shining light on people and moving them around — a lot of time,” Manzari says. “We had some engineers trying to understand the contours of a face and how we could apply lighting to them through software, and we had other silicon engineers just working to make the process super-fast. We really did a lot of work.”

BuzzFeed’s article is worth a close read because it’s about more than just the camera in Apple’s new and upcoming iPhones. The behind-the-scenes peek at how the many functions of the iPhone’s camera were developed is the best example of one of Apple’s biggest competitive advantages: the fusion of hardware and software.

Permalink

Apple’s Clips App Receives Update Adding Disney/Pixar Content and More

Today Apple released the first major update for its short-form video creation app, Clips. Version 1.1 includes, most notably, a variety of animated graphics featuring beloved characters from Disney and Pixar films.

In the Disney department you can add Mickey, Minnie, Donald, and Daisy to your videos, each with their own unique animations. And from Pixar, characters from Toy Story, Inside Out, and Cars are available. There is also a variety of new posters that can be used as title cards, some designed by Disney and others by Apple. Some of these posters feature vibrant animations when you use them, such as water rippling in a pool.

In addition to the new content available for creating videos, Apple has also refined some design aspects in the app to make it easier to use. For example, Live Titles could always be edited by tapping on the text, but that wasn’t a very discoverable interface. Now there’s a new button to accomplish the task.

Apple’s press release announcing the update mentions that Clips “is included on all new iOS devices,” which should help bolster adoption of the app. That press release also features a video seemingly created in Clips that’s worth checking out.


Microsoft Launches iPhone App for Low Vision Community: Seeing AI

Today Microsoft introduced a new app exclusively for iPhone, Seeing AI. This app is designed as a tool for the low vision community; using the iPhone’s camera and its AI smarts, Seeing AI converts the visual experience of the world into an audible one. As you point the camera at things in the world around you, the app will describe that world in a quick, informative manner.

From a user’s perspective, the app is tremendously simple to use; there’s very little that needs to be done before Seeing AI can begin describing the space around you. If you want to identify people, you can first set them up as recognizable from the sidebar menu’s ‘Face Recognition’ option. Otherwise, all you have to do to start identifying things is select from one of five different categories (the app calls them ‘channels’) to help the app understand what type of object it needs to identify. The five current categories are:

  • Short Text
  • Document
  • Product
  • Person
  • Scene (currently tagged as ‘Beta’)

Microsoft says a category for currency will be coming soon, allowing the app to intelligently identify different denominations of cash.

In my testing, the app is far from perfect at identifying things, but it has done a solid job all around. Though the tech driving it may still be experimental and have a long way to go, the app is far from barebones in what it can do now. When identifying a document, Seeing AI will audibly guide you through the capture process to help you get the full document in view. After scanning a product’s barcode, in some cases you’ll receive additional information about the product beyond just its name. And if the app is scanning a person, it can even describe a best guess at their visible emotional state. It’s an impressive, deep experience that nevertheless remains dead simple to operate.
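
For a rough idea of the building blocks iOS gives developers for a feature like the ‘Product’ channel, here is a small, hypothetical sketch (not Microsoft’s actual implementation) that uses Apple’s Vision framework to detect a barcode in a still image and then speaks the decoded payload with AVSpeechSynthesizer:

```swift
import AVFoundation
import UIKit
import Vision

// Illustrative sketch of a detect-then-describe pipeline: find a barcode in a
// still image with Vision, then read the decoded payload aloud. Class and
// method names are assumptions for this example only.
final class ProductDescriber {
    private let synthesizer = AVSpeechSynthesizer()

    func describeBarcode(in image: UIImage) {
        guard let cgImage = image.cgImage else { return }

        let request = VNDetectBarcodesRequest { [weak self] request, _ in
            // Take the first decoded payload (e.g. a UPC/EAN string), if any.
            let payload = (request.results as? [VNBarcodeObservation])?
                .compactMap { $0.payloadStringValue }
                .first
            self?.speak(payload ?? "No barcode found")
        }

        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        DispatchQueue.global(qos: .userInitiated).async {
            try? handler.perform([request])
        }
    }

    private func speak(_ text: String) {
        // A real app would look the payload up in a product database before
        // reading it out; here we simply speak the raw value.
        synthesizer.speak(AVSpeechUtterance(string: text))
    }
}
```

A shipping app would presumably run detection continuously on the live camera feed rather than on a single still, but the detect-then-describe loop is the same basic idea.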

Even if you aren’t in the market for Seeing AI yourself, it’s a fascinating product worth checking out, and it’s entirely free. You can download it on the App Store.

Microsoft has a short introductory video, embedded below, that gives a great taste of all the app can do.