Posts tagged with "camera"

Apple’s Deep Fusion Camera Feature Launching as Part of the iOS Developer Beta Program

According to TechCrunch’s Matthew Panzarino, Apple will roll out Deep Fusion, the camera feature announced at the company’s fall iPhone event, today as part of the iOS developer beta program.

Deep Fusion is Apple’s new method of combining several image exposures at the pixel level for enhanced definition and color range beyond what is possible with traditional HDR techniques. Panzarino explains how Deep Fusion works:

The camera shoots a ‘short’ frame, at a negative EV value. Basically a slightly darker image than you’d like, and pulls sharpness from this frame. It then shoots 3 regular EV0 photos and a ‘long’ EV+ frame, registers alignment and blends those together.

This produces two 12MP photos – 24MP worth of data – which are combined into one 12MP result photo. The combination of the two is done using 4 separate neural networks which take into account the noise characteristics of Apple’s camera sensors as well as the subject matter in the image.
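The fusion step happens inside Apple’s private pipeline, but the inputs Panzarino describes map onto AVFoundation’s public bracketed-capture API. Here’s a minimal sketch, assuming an already-configured capture session and photo output, of how a third-party app could gather a similar underexposed/normal/overexposed set (simplified to a three-frame bracket, since hardware caps bracket sizes; the neural-network blending is not something third parties can replicate):

```swift
import AVFoundation

// A minimal sketch of capturing a Deep Fusion-style exposure bracket:
// one underexposed (EV-), one normal (EV0), and one overexposed (EV+)
// frame. This only gathers similar inputs; the fusion is Apple's own.
func captureExposureBracket(with photoOutput: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    let biases: [Float] = [-1.0, 0.0, 1.0]  // EV bias per frame
    let bracketedSettings = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,  // 0 = processed frames only, no RAW
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracketedSettings)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```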

Apple told Panzarino that the technique “results in better skin transitions, better clothing detail and better crispness at the edges of moving subjects.”

There is no button or switch to turn Deep Fusion on. Like the over-crop feature that uses the ultra wide lens to allow photo reframing after the fact, Deep Fusion is engaged automatically depending on the camera lens used and light characteristics of the shot being taken. Panzarino also notes that Deep Fusion, which is only available for iPhones that use the A13 processor, does not work when the over-crop feature is turned on.

I’ve been curious about Deep Fusion since it was announced. It’s remarkable that photography has become as much about machine learning as it is about the physics of light and lenses. Deep Fusion is also the sort of feature that can’t be demonstrated well onstage, so I’m eager to get my hands on the beta and try it myself.


Halide 1.14 Adds New Lens Switching Interface and Guides

Halide 1.14 is out with a new lens switching UI to accommodate the three-camera system of the iPhone 11 Pro and Pro Max. As soon as the update was out, I went for a walk to give it a try.

Halide has introduced a new lens switching button featuring haptic feedback and a dial-like system for moving among the iPhone’s lenses. When you press down on the lens button, you get a tap of haptic feedback to let you know without looking that the lens picker has been engaged.

From there, you can slide your finger among the ultra wide, wide, and telephoto options that radiate out from the button. As you swipe your finger across each option, it enlarges, and you’re met with another little bit of haptic feedback as you swipe over the lenses other than the one already selected. Once you have the lens you want, you simply let go and your iPhone switches to it.
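For developers curious how an interaction like this might be built, here’s a minimal UIKit sketch of a control that produces that kind of feedback. The class name, layout math, and lens labels are hypothetical, not Halide’s code:

```swift
import UIKit

// A hypothetical dial-style lens picker: an engagement tap on press,
// a selection tick each time the finger crosses a different lens, and
// a commit on release.
final class LensPickerButton: UIControl {
    private let lenses = ["0.5x", "1x", "2x"]  // ultra wide, wide, telephoto
    private(set) var highlightedIndex = 1
    private let selectionHaptics = UISelectionFeedbackGenerator()
    private let impactHaptics = UIImpactFeedbackGenerator(style: .light)

    override func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        impactHaptics.impactOccurred()   // tap confirms the picker engaged
        selectionHaptics.prepare()       // reduce latency for later ticks
        return true
    }

    override func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        // Map horizontal finger position to a lens slot (layout simplified).
        let slotWidth = bounds.width / CGFloat(lenses.count)
        let index = max(0, min(lenses.count - 1,
                               Int(touch.location(in: self).x / slotWidth)))
        if index != highlightedIndex {
            highlightedIndex = index
            selectionHaptics.selectionChanged()  // tick on crossing a new lens
        }
        return true
    }

    override func endTracking(_ touch: UITouch?, with event: UIEvent?) {
        super.endTracking(touch, with: event)
        sendActions(for: .valueChanged)  // commit the highlighted lens on release
    }
}
```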

You can also cycle through the lenses in order by tapping the button repeatedly, or swipe left for the ultra wide lens or up for the telephoto one. In my brief tests, swiping left or up is the best option when you already know which lens you want, but the dial-like lens switcher is perfect for considering your options first, because Halide has also added lens preview guides.

With the lens button engaged, Halide shows guides for each of your zoom options. That means if you’re using the ultra wide lens, you’ll see the light gray guidelines for the wide and telephoto lenses. As you swipe over those lenses, the guides change to yellow to highlight the composition you’ll get if you switch to that lens.

If you’re already using the telephoto lens, though, Halide highlights the outer frame of the image to suggest that you’ll get a wider shot, although it doesn’t zoom the viewfinder out to show that composition until you lift your finger. You can see how the lens guides work in the screenshots I took at a local high school football field above and in this video:

Switching lenses in Halide.

When you switch to the ultra wide lens, you’ll notice that not all the usual Halide features are available: manual focus is missing, and so is shooting in RAW. That’s because the new iPhone hardware and iOS and iPadOS 13 don’t support those features on the ultra wide camera. Although ultra wide shots don’t support RAW, Halide has included a ‘MAX’ option in place of the ‘RAW’ option, so you can get the most image data possible from your ultra wide shots, as you can see in the screenshots below.

Ultra wide images are limited to MAX quality (left) instead of RAW, which is supported by the wide and telephoto lenses (right).
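Halide hasn’t said exactly what MAX does under the hood, but the general shape of a RAW-with-fallback capture path is straightforward in AVFoundation. A hedged sketch, not Halide’s actual code:

```swift
import AVFoundation

// Request RAW when the current camera offers it; otherwise ask for the
// highest-quality processed image, as on the ultra wide camera.
func makePhotoSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    if let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first {
        // RAW is supported (the wide and telephoto cameras).
        return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    }
    // No RAW available (the ultra wide camera): fall back to best processed
    // quality. Requires photoOutput.maxPhotoQualityPrioritization = .quality
    // at session-configuration time (iOS 13 and later).
    let settings = AVCapturePhotoSettings()
    settings.photoQualityPrioritization = .quality
    return settings
}
```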

The Halide team says that the latest update also includes noise-reduction adjustments to the RAW images produced by the iPhone 11, but that they are continuing to fine-tune how the app handles RAW photos from the new phones as part of a more significant update that’s coming next.

The latest update is relatively small, but I especially like the use of haptic feedback and lens guides, which make it easy to switch lenses when you’re focused on the viewfinder of the camera instead of Halide’s buttons.

Halide is available on the App Store for $5.99.


Austin Mann on the iPhone 11 and 11 Pro Cameras

Source: austinmann.com

Every year I look forward to Austin Mann putting the latest iPhones through their paces somewhere in the world. This year, Mann is on tour with cellist Yo-Yo Ma in China, where he went out into the countryside to capture some stunning portraits and landscapes.

Mann’s review covers the new Ultra Wide lens, Night Mode, Smart HDR improvements, and the ability to capture outside the frame, along with wishes for additional improvements. Mann’s take on Night Mode:

As long as I can remember, the top question I’ve received from iPhone photographers, beginners and pros alike, is How can I shoot better pictures in low light? This year’s addition of Night mode is the answer to the question. It’s easy to use, crazy powerful, and because it’s automatic it will completely change how everyone shoots on their iPhone.

Mann confirms what seemed to be the case from the photos that Apple showed off last week at its event in Cupertino – Apple has implemented Night Mode in a way that doesn’t try to turn night into day:

One thing I love about Apple’s approach to Night mode is the strategic balance of solving a technical problem while also caring deeply about artistic expression. When you look at the image above, it’s clear their team didn’t take the let’s-make-night-look-like-day approach, as some of their competitors have. Instead, it feels more like an embrace of what it actually is (night) while asking, “How do we capture the feel of this scene in a beautiful way?”

How Apple accomplishes Night Mode is interesting. As Mann explains:

From what I understand, the way Night mode actually works is the camera captures a bunch of short exposures and slightly longer exposures, checks them for sharpness, throws out the bad ones and blends the good ones. On a traditional dSLR/mirrorless camera, a 5 second exposure is one single, continuous recording of the light throughout the duration of the shutter so any movement (of subject or camera) is recorded.

But with iPhone 11 Pro the rules are different… it’s not capturing one single continuous frame but blending a whole bunch of shots with variable lengths (some shorter exposures to freeze motion and longer shots to expose the shadows.) This means the subject can actually move during your exposure but still remain sharp.
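The select-and-blend strategy Mann describes is easy to sketch. Below is a toy version in plain Swift, operating on aligned grayscale pixel arrays, that scores frames with a common sharpness proxy (the variance of a Laplacian edge response), discards soft ones, and averages the rest. It’s an illustration of the idea only, not Apple’s pipeline, which also weighs exposure length, alignment, and sensor noise:

```swift
// Frames are aligned grayscale images of equal size, pixel values 0...1.
typealias Frame = [[Double]]

// Sharpness proxy: variance of a simple Laplacian (edge) response.
func sharpness(of frame: Frame) -> Double {
    guard frame.count > 2, frame[0].count > 2 else { return 0 }
    var responses: [Double] = []
    for y in 1..<frame.count - 1 {
        for x in 1..<frame[y].count - 1 {
            let lap = 4 * frame[y][x]
                - frame[y - 1][x] - frame[y + 1][x]
                - frame[y][x - 1] - frame[y][x + 1]
            responses.append(lap)
        }
    }
    let mean = responses.reduce(0, +) / Double(responses.count)
    return responses.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Double(responses.count)
}

// Throw out the soft frames, then average the sharp ones into one result.
func fuseNightFrames(_ frames: [Frame], threshold: Double) -> Frame? {
    let sharp = frames.filter { sharpness(of: $0) >= threshold }
    guard var result = sharp.first else { return nil }
    for frame in sharp.dropFirst() {
        for y in result.indices {
            for x in result[y].indices { result[y][x] += frame[y][x] }
        }
    }
    for y in result.indices {
        for x in result[y].indices { result[y][x] /= Double(sharp.count) }
    }
    return result
}
```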

If you’ve been wondering about the new Ultra Wide camera on the new iPhones or the other new features of the Camera app, be sure to check out Austin Mann’s full review; it offers great technical and artistic insights into what Apple has accomplished with its new cameras, along with some absolutely fantastic examples of what they can do.


Spectre: A Computational Approach to Long-Exposure iPhone Photography

Spectre is a new specialized camera app from the team that created Halide, one of our favorite camera apps on iOS. The Halide team describes Spectre as a computational shutter for the iPhone, which allows the app to do things like remove people from a crowded scene, create artistic images of rushing water, and produce light trails at night. The same sorts of images can be created with traditional cameras, but getting the exposure right, holding the camera absolutely still, and accounting for other factors make them difficult to pull off. With Spectre, artificial intelligence is used to simplify the process and make long-exposure photography accessible to anyone with an iPhone.
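Spectre’s processing is proprietary, but the core idea behind one of those tricks, removing people from a crowded scene, can be sketched in a few lines: stack many short frames and take the per-pixel median, so transient subjects that occupy a given pixel in only a few frames drop out while the static scene survives. A toy illustration in plain Swift, assuming aligned grayscale frames of equal size:

```swift
// Aligned grayscale frames of equal size, pixel values 0...1.
typealias Frame = [[Double]]

// Per-pixel median over a frame stack: keeps the static scene,
// drops anything that passes through only briefly.
func medianStack(_ frames: [Frame]) -> Frame? {
    guard var result = frames.first else { return nil }
    for y in result.indices {
        for x in result[y].indices {
            let samples = frames.map { $0[y][x] }.sorted()
            result[y][x] = samples[samples.count / 2]
        }
    }
    return result
}
```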



Apple’s Clips Introduces New Selfie Scenes, Filters, and More

On launch day for the new iPad Pros, which feature the iPhone’s TrueDepth camera system for the first time, Apple has upgraded its Clips video app with new features that take advantage of TrueDepth’s power. Today’s update also brings new camera filters, posters, stickers, and soundtrack options.

If you have an iPhone X, XS, or XR, or one of the new iPad Pros, the highlight of this release is a batch of six new Selfie Scenes. Last year when the iPhone X launched, Clips debuted Selfie Scenes as a fun and impressive way to utilize the new device’s TrueDepth camera system. Selfie Scenes isolate you from your environment and replace your surroundings with interesting digital backgrounds, such as the Millennium Falcon from Star Wars, or an animated world. The new scenes added in today’s update are called Clouds, High Noon, Big Backyard, Monster Lab, Animal Forest, and Municiberg Mayhem, a scene from The Incredibles 2. They’re all a lot of fun, providing different moods through sound effects and music. Apple says that Selfie Scenes perform better than ever on recent devices thanks to the A12 Bionic’s Neural Engine, and in my tests I did notice that the scenes were smoother and more responsive than before.

Filters and soundtracks are Clips’ next most substantial upgrades. There are three new filters: Comic Mono, Watercolor Mono, and Aged Film. Of these, Aged Film is easily my favorite, as the first two are only monochrome versions of existing filters. On the soundtrack side, there are a whopping 17 new tunes to choose from for your videos.

Every major Clips update adds a host of new posters, stickers, and text labels, and today’s is no exception. Continuing Apple’s partnership with Disney, there are poster options from Coco and The Incredibles 2, as well as designs related to sports, science, and more.

Though the app’s development cycle has slowed, Apple continues to plug away at making Clips a great tool for short video creation. My biggest wish for the app – non-square video – has still gone unfulfilled, and that restriction continues to hold Clips back, but hopefully one day we’ll get that change. Until then, the Selfie Scenes are a fun demo of the newest iPhones and iPad Pros, and Clips remains the most enjoyable video creation tool I’ve ever used. I think Apple’s on to something here.



Halide Developer Ben Sandofsky Breaks Down How the iPhone XR Captures Depth Data

Ben Sandofsky from the team that makes the Halide iOS camera app has a detailed post on the iPhone XR’s camera and how Apple creates Portrait Mode photos with a single lens. Sandofsky walks through how Apple uses Focus Pixels to develop a rough Disparity Map and combines that with a Portrait Effects Matte to create Portrait Mode images.
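Both of those artifacts, the disparity map and the Portrait Effects Matte, are embedded in Portrait Mode photos as auxiliary image data that any app can read back. A minimal sketch using ImageIO and AVFoundation (error handling elided):

```swift
import AVFoundation
import ImageIO

// Read the disparity map and Portrait Effects Matte that Portrait Mode
// photos embed as auxiliary image data.
func readPortraitAuxiliaryData(from url: URL) -> (AVDepthData?, AVPortraitEffectsMatte?) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
        return (nil, nil)
    }
    var depth: AVDepthData?
    if let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
        source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any] {
        depth = try? AVDepthData(fromDictionaryRepresentation: info)
    }
    var matte: AVPortraitEffectsMatte?
    if let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
        source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte) as? [AnyHashable: Any] {
        matte = try? AVPortraitEffectsMatte(fromDictionaryRepresentation: info)
    }
    return (depth, matte)
}
```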

The results have some advantages, but also distinct disadvantages compared to the iPhone XS’s camera. As Sandofsky explains:

It seems the iPhone XR has two advantages over the iPhone XS: it can capture wider angle depth photos, and because the wide-angle lens collects more light, the photos will come out better in low light and have less noise.

However:

…most of the time, the XS will probably produce a better result. The higher fidelity depth map, combined with a focal length that’s better suited for portraiture means people will just look better, even if the image is sometimes a bit darker. And it can apply Portrait effects on just about anything, not just people.

Although Apple’s Camera app can only take Portrait Mode photos of people on the iPhone XR, the upcoming Halide 1.11 update will combine the XR’s Disparity Map and Halide’s own blur effect to apply a similar effect beyond human subjects. Sandofsky admits that the feature isn’t perfect due to the low quality of the Disparity Map created by the XR, but the photos included in his post show that it can take excellent pictures under some conditions.
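Halide hasn’t detailed its blur implementation, but Core Image ships a filter that makes the general approach easy to sketch: CIMaskedVariableBlur blurs more where a mask image is brighter, so an inverted disparity map (far pixels bright) yields a rough background blur. A hedged sketch, assuming the disparity image has already been upscaled to match the photo’s extent:

```swift
import CoreImage

// Drive CIMaskedVariableBlur with an inverted disparity map, so distant
// (low-disparity) pixels receive the most blur while near subjects stay sharp.
func portraitStyleBlur(photo: CIImage, disparity: CIImage, radius: Double) -> CIImage? {
    let mask = disparity.applyingFilter("CIColorInvert")  // far = bright = blurry
    let filter = CIFilter(name: "CIMaskedVariableBlur", parameters: [
        kCIInputImageKey: photo,
        "inputMask": mask,
        kCIInputRadiusKey: radius,
    ])
    return filter?.outputImage?.cropped(to: photo.extent)
}
```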

It’s remarkable what is being done to squeeze depth information out of the XR’s single lens and instructive to understand how the underlying technology works. It’s also apparent that Apple has made significant advancements since the introduction of the first dual-lens cameras.


Halide Introduces Smart RAW for iPhone XS, Joining a Host of Other Improvements

Since its debut, Halide has been one of the best manual camera apps available on the iPhone. The month of September brought a number of challenges to Halide’s team, though, thanks to all the photography work Apple put into iOS 12 and the iPhone XS. In response, Halide has received two major updates within the span of a few weeks: version 1.9 on iOS 12’s release date, and today, version 1.10, which features Smart RAW.



An In-Depth Explanation of Computational Photography on the iPhone XS

Outside of Apple employees, one of the people most knowledgeable about the iPhone’s camera is Sebastiaan de With, designer of the manual camera app Halide. It is fitting, then, that Sebastiaan would publish what I believe is the best explanation of the iPhone XS camera system to date. Following up on a piece he wrote about the new camera’s hardware changes, the subject of today’s article is software – specifically, all the work of computational photography on the iPhone XS and XS Max.

The piece starts with an explanation of the iPhone’s new Smart HDR feature, then details the exact reasons why selfies on the new iPhones appear to employ skin smoothing (a theory he soundly debunks). Finally, Sebastiaan details the problem that the XS camera poses for RAW camera apps like Halide and shares the forthcoming solution his team came up with: something they call Smart RAW.

There are too many excellent, informative tidbits to quote here, so I highly recommend you check out the article in full. This year’s iPhones are so full of interesting changes to the way the camera works, most of which are undocumented by Apple – as Sebastiaan says, it is “a whole new camera” in many ways.
