Posts tagged with "photography"

Apple Announces Night Mode Photography Contest

About this time last year, Apple announced its first-ever ‘Shot on iPhone’ photography challenge, judged by a panel of professional photographers and Apple employees. Apple is back with a new contest this year, asking users to submit their Night mode photos.

Through January 29th, Apple is taking submissions on Instagram, Twitter, and Weibo. To qualify, post your photos on Instagram or Twitter with the hashtags #ShotoniPhone and #NightmodeChallenge, or on Weibo using #ShotoniPhone# and #NightmodeChallenge#.

Five winners will be picked by a panel of judges that includes professional photographers, plus the following Apple executives and employees:

  • Phil Schiller
  • Kaiann Drance
  • Brooks Kraft
  • Jon McCormack
  • Arem Duplessis

The five winning photos will be announced on March 4th on the Apple Newsroom. Apple says the images may also be used in digital campaigns, at stores, on billboards, and in photo exhibitions.

Night mode photography was a big part of Federico’s story on iPhone 11 Pro photography called Eternal City, Modern Photography: The iPhone 11 Pro in Rome. Here’s an outtake from that story that Federico submitted for the challenge:

For more on the contest and tips on shooting Night mode photos, check out Apple’s press release.


Lightroom 5.1 Adds Direct SD Card Importing on iPad and iPhone, Plus New Export Options

As promised this fall, Adobe has updated Lightroom for iPad and Lightroom Photo Editor for the iPhone with the ability to import image files from SD cards directly inside the app. The company has added new options when exporting your photos too. I’ve been using the beta of Lightroom 5.1 for the past couple of weeks, and the update has worked exceptionally well, reducing the friction of getting images into the app and adding flexibility to getting them back out again.

Read more


Loupedeck+ Review: Faster, More Natural Image and Video Editing with a Dedicated Control Panel

Source: Loupedeck

The Loupedeck+ is a hardware control panel for editing photos and video that transforms the software tools you’re accustomed to using with a mouse or trackpad into physical buttons, knobs, and dials. By eliminating the need to dive into menus and hunt for onscreen controls, the Loupedeck+ changes the image editing process into something much closer to the feeling of editing on an iPad with the Apple Pencil. The seemingly endless series of swipes, drags, and clicks is replaced by something far more tactile and natural.

The result is a clear example of the benefit of using a dedicated tool for a particular task. Photo and video editing is often a high-volume, high-precision activity with lots of repetition and, depending on your job, tight deadlines. That makes any tool that can shave a little time off of editing each photo a win for professionals who often edit thousands of images in a week.

What I didn’t expect, though, is that the Loupedeck+ also makes editing more accessible for beginners like myself. As I’ll explain in more detail below, when Loupedeck sent me their device to test, I spent most of my time using it in Adobe Lightroom Classic, which I hadn’t used before. However, after a short time familiarizing myself with the Loupedeck+ layout, I found myself deep in the editing process with my eyes fixed on the images I was working on instead of darting back and forth hunting for the tools I wanted to use.

I may never enjoy the sort of time savings that a professional photographer could squeeze out of the Loupedeck+. However, simply knowing that I can dip in and out of Lightroom Classic for my editing needs with virtually no learning curve eliminates a significant hurdle that has slowed me down in the past. Although there are aspects of the Loupedeck+ that could be improved, it’s an incredibly powerful tool that fits into more workflows than I anticipated, which makes it an accessory worth considering for a wide range of users.

Read more


iOS Photo Metadata Utility Metapho Adds Deep Fusion and Night Mode Photo Detection

Metapho has been one of my favorite photo utilities on iOS for years. The marquee feature has always been its ability to strip metadata from images, which is handy when sharing photos online, for instance. Over time though, Metapho has grown to incorporate other functionality for inspecting and editing photo metadata that has made the app a must-have iOS utility. With its latest update, Metapho has added Deep Fusion and Night Mode photo detection, an intriguing addition that I haven’t seen any other app offer.
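Metapho’s own implementation isn’t public, but the general technique behind stripping metadata is simple at the file level: Exif (and XMP) data lives in a JPEG’s APP1 segments, so removing those segments removes the metadata. Here’s a minimal pure-Python sketch of that idea (a simplified parser that assumes a well-formed JPEG, not how Metapho actually works):

```python
def strip_app1(data: bytes) -> bytes:
    """Return JPEG bytes with APP1 (Exif/XMP) segments removed.

    Simplified: assumes a well-formed JPEG with no fill bytes
    between marker segments.
    """
    assert data[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(data) - 1:
        assert data[i] == 0xFF, "expected a segment marker"
        marker = data[i + 1]
        if marker == 0xDA:          # SOS: entropy-coded image data follows,
            out += data[i:]         # so copy the rest of the file as-is
            break
        # segment length is big-endian and includes the two length bytes
        length = int.from_bytes(data[i + 2:i + 4], "big")
        if marker != 0xE1:          # keep every segment except APP1
            out += data[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

Real JPEGs complicate this slightly (padding bytes, separate APP1 segments for Exif and XMP), and selectively removing just location data, which Metapho also supports, requires rewriting the Exif structure rather than dropping it wholesale.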

Read more


Sebastiaan de With Explains Why the iPhone 11 Camera Is Such a Big Leap Forward

Sebastiaan de With, part of the team behind the camera app Halide, has published part 1 of a multi-part breakdown of the iPhone 11 camera. It’s a fantastic analysis of what makes the new camera different from past versions and goes into great depth while remaining accessible, even if you have only a passing familiarity with photography.

To put this year’s camera into perspective, de With recaps what Apple did with last year’s iPhone cameras, explaining how Smart HDR works and its shortcomings. The iPhone 11 features Smart HDR too, but as de With explains, Apple has significantly improved how it handles the dynamic range of an image.

Another aspect of the improvement is in the camera sensor hardware. Despite its diminutive size, the iPhone 11’s image sensor can resolve more detail than any iPhone camera before it.

However, many of the iPhone 11’s camera improvements come down to better software. The new camera post-processes each component of an image differently, applying different noise reduction to the sky, a face, hair, and clothing, for example. Apple calls the feature Photo Segmentation, and it’s aided by machine learning.
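Apple hasn’t published how Photo Segmentation works beyond saying it’s aided by machine learning, but the core idea of region-dependent processing is easy to illustrate. In this toy Python sketch (my own illustration, not Apple’s algorithm), a per-pixel label mask controls how strongly each region of a grayscale image is smoothed:

```python
def denoise_by_region(img, labels, strength):
    """Smooth each pixel by an amount chosen per region label.

    img, labels: equal-sized 2D lists (grayscale values, label strings);
    strength: maps each label to a blend factor in [0, 1].
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # a 3x3 box average stands in for real noise reduction
            nbrs = [img[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))]
            s = strength[labels[y][x]]
            out[y][x] = (1 - s) * img[y][x] + s * (sum(nbrs) / len(nbrs))
    return out
```

With a mask labeling sky and hair, `strength={'sky': 0.9, 'hair': 0.1}` would scrub noise from the flat sky while leaving fine hair detail nearly untouched, which is exactly the trade-off a per-region approach is designed to make.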

One of my favorite features of the new camera is Night Mode. As de With notes:

In the iPhone 11 Night Mode, you can also see detail vanish in some areas. Except that it really seems to only affect parts of the image that you don’t really care that much about. Night Mode has a remarkable if not uncanny ability to extract an image that is sometimes even sharper than the regular mode, with strong sharpening and detail retention occurring in areas that are selected by the camera during processing.

The iPhone 11’s camera is also the first one de With thinks rivals standalone cameras:

In the past, iPhones made great photos for sharing on social media, but blown up on a big screen, the shots didn’t hold up. It’s why I frequently still pack a ‘big’ camera with me on trips.

With these huge improvements in processing, the iPhone 11 is the first iPhone that legitimately challenges a dedicated camera.

There are many more details in de With’s article, including a close look at the iPhone 11’s ultra wide lens. Every section of the post has photos and side-by-side comparisons that illustrate the analysis too, which makes the full post a must-read.

Permalink

Eternal City, Modern Photography: The iPhone 11 Pro in Rome

The Colosseum at night. Shot on iPhone 11 Pro using the wide lens, with night mode enabled. Unedited. Zoom in for details.

In many ways, the iPhone 11 Pro’s camera system feels like the culmination of over a decade’s worth of judicious, relentless improvements. Not only is the device’s camera the best and smartest Apple has ever shipped, but it also affords the most photographic freedom, allowing non-professional photographers like me to produce amazing shots with minimal effort.

Read more



Apple’s Deep Fusion Camera Feature Launching as Part of the iOS Developer Beta Program

According to TechCrunch’s Matthew Panzarino, Apple will roll out the Deep Fusion camera feature announced at the company’s fall iPhone event today as part of the iOS developer beta program.

Deep Fusion is Apple’s new method of combining several image exposures at the pixel level for enhanced definition and color range beyond what is possible with traditional HDR techniques. Panzarino explains how Deep Fusion works:

The camera shoots a ‘short’ frame, at a negative EV value. Basically a slightly darker image than you’d like, and pulls sharpness from this frame. It then shoots 3 regular EV0 photos and a ‘long’ EV+ frame, registers alignment and blends those together.

This produces two 12MP photos – 24MP worth of data – which are combined into one 12MP result photo. The combination of the two is done using 4 separate neural networks which take into account the noise characteristics of Apple’s camera sensors as well as the subject matter in the image.

Apple told Panzarino that the technique “results in better skin transitions, better clothing detail and better crispness at the edges of moving subjects.”
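Apple’s actual pipeline runs four neural networks and isn’t public, but the broad strokes Panzarino describes, pulling sharpness from the short frame while blending aligned exposures, amount to a sharpness-weighted per-pixel average. Purely as a toy illustration of that idea (grayscale frames as 2D lists, with a gradient magnitude standing in for learned sharpness):

```python
def sharpness(img, x, y):
    # gradient magnitude as a crude proxy for local sharpness
    h, w = len(img), len(img[0])
    gx = img[y][min(x + 1, w - 1)] - img[y][max(x - 1, 0)]
    gy = img[min(y + 1, h - 1)][x] - img[max(y - 1, 0)][x]
    return abs(gx) + abs(gy)

def fuse(frames):
    """Blend pre-aligned frames, weighting sharper pixels more heavily."""
    h, w = len(frames[0]), len(frames[0][0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            weights = [1 + sharpness(f, x, y) for f in frames]
            blended = sum(wt * f[y][x] for wt, f in zip(weights, frames))
            out[y][x] = round(blended / sum(weights))
    return out
```

Two flat frames simply average, while a frame with strong local detail dominates the result at those pixels; the real system layers registration, noise modeling, and learned weighting on top of this basic shape.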

There is no button or switch to turn Deep Fusion on. Like the over-crop feature that uses the ultra wide lens to allow photo reframing after the fact, Deep Fusion is engaged automatically depending on the camera lens used and light characteristics of the shot being taken. Panzarino also notes that Deep Fusion, which is only available for iPhones that use the A13 processor, does not work when the over-crop feature is turned on.

I’ve been curious about Deep Fusion since it was announced. It’s remarkable that photography has become as much about machine learning as it is about the physics of light and lenses. Deep Fusion is also the sort of feature that can’t be demonstrated well onstage, so I’m eager to get my hands on the beta and try it myself.

Permalink

Pixelmator Photo Adds Direct iCloud Photo Library Access, Batch Editing, and New Export Features

Pixelmator Photo for the iPad has been updated with a trio of new features that greatly increase the power of the app. With the update, you can now edit images in your iCloud Photo Library non-destructively without creating duplicates. There are also new batch-processing workflows and better options for exporting images. It’s an interesting mix of updates that I expect will appeal to a wide audience even though there remain iPadOS features I’d like to see adopted in the future.

Read more


Halide 1.14 Adds New Lens Switching Interface and Guides

Halide 1.14 is out with a new lens switching UI to accommodate the three-camera system of the iPhone 11 Pro and Pro Max. As soon as the update was out, I went for a walk to give it a try.

Halide has introduced a new lens switching button featuring haptic feedback and a dial-like system for moving among the iPhone’s lenses. When you press down on the lens button, you get a tap of haptic feedback to let you know without looking that the lens picker has been engaged.

From there, you can slide your finger among the ultra wide, wide, and telephoto options that radiate out from the button. As you swipe your finger across each option, it enlarges, and you’re met with another little bit of haptic feedback as you swipe over the lenses other than the one already selected. Once you have the lens you want, you simply let go and your iPhone switches to it.

You can also cycle through the lenses in order by tapping the button repeatedly, or swipe left for the ultra wide lens or up for the telephoto one. In my brief tests, swiping left or up is the best option if you already know the lens you want, but using the dial-like lens switcher is perfect for considering your options first because Halide has also added lens preview guides.

With the lens button engaged, Halide shows guides for each of your zoom options. That means if you’re using the ultra wide lens, you’ll see the light gray guidelines for the wide and telephoto lenses. As you swipe over those lenses, the guides change to yellow to highlight the composition you’ll get if you switch to that lens.

If you’re already using the telephoto lens though, Halide will highlight the outer frame of the image to suggest you’ll get a wider shot, though it does not zoom the viewfinder out to show that composition until you lift your finger. You can see how the lens guides work from the screenshots I took at a local high school football field above and in this video:

Switching lenses in Halide.

When you switch to the ultra wide lens, you’ll notice that not all the usual Halide features are available. Manual focus is missing and so is shooting in RAW. That’s because the new iPhone hardware and iOS and iPadOS 13 don’t support those features on the ultra wide camera. Although ultra wide shots don’t support RAW, Halide has included a ‘MAX’ option in place of the ‘RAW’ option, so you can get the most image data possible from your ultra wide shots, which you can see in the screenshots below.

Ultra wide images are limited to MAX quality (left) instead of RAW, which is supported by the wide and telephoto lenses (right).

The Halide team says that the latest update also includes noise-reduction adjustments to the RAW images produced by the iPhone 11, but that they are continuing to fine-tune how the app handles RAW photos from the new phones as part of a more significant update that is coming next.

The latest update is relatively small, but I especially like the use of haptic feedback and lens guides, which make it easy to switch lenses when you’re focused on the viewfinder of the camera instead of Halide’s buttons.

Halide is available on the App Store for $5.99.