Posts tagged with "camera"

Epic Games Releases iPhone App That Captures Facial Expressions to Unreal Engine

Source: Epic Games.


Epic Games has released a new iPhone app for videogame developers that captures facial expressions, piping them into the company’s Unreal Engine in real-time. As explained on the Unreal Engine blog:

Live Link Face streams high-quality facial animation in real-time from your iPhone directly onto characters in Unreal Engine. The app’s tracking leverages Apple’s ARKit and the iPhone’s TrueDepth front-facing camera to interactively track a performer’s face, transmitting this data directly to Unreal Engine via Live Link over a network.
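The streaming model described above — per-frame facial animation values captured on the phone and sent over the network to the engine — can be sketched in miniature. This is an illustrative sketch only: the blendshape names are modeled on ARKit's face-tracking output, and the real Live Link protocol is binary and far more elaborate than this JSON-over-UDP stand-in.

```python
import json
import socket

def encode_frame(blendshapes: dict, frame: int) -> bytes:
    """Pack one frame of facial animation curves for transmission."""
    return json.dumps({"frame": frame, "curves": blendshapes}).encode("utf-8")

def decode_frame(payload: bytes) -> dict:
    """Unpack a received frame back into animation curves."""
    return json.loads(payload.decode("utf-8"))

# Send one frame over UDP loopback -- the kind of low-latency, connectionless
# transport a real-time animation stream favors over TCP.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
port = receiver.getsockname()[1]

frame = encode_frame({"jawOpen": 0.42, "eyeBlinkLeft": 0.9}, frame=1)
sender.sendto(frame, ("127.0.0.1", port))
data, _ = receiver.recvfrom(4096)
decoded = decode_frame(data)
print(decoded["curves"]["jawOpen"])  # 0.42
```

The appeal of this shape of pipeline is that the engine only ever sees named float curves, so the same characters can be driven by an iPhone at home or a full capture stage.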

What I find most interesting about Live Link Face is that Epic says it scales from solo developers working at home to sophisticated stage productions involving actors in motion capture suits and multiple iPhones. If so, that will make the app a terrific example of the sort of democratization of complex tools that technologies like ARKit and hardware like the iPhone’s TrueDepth camera make possible when integrated into existing workflows.

Permalink

Halide Team Experiments with iPad Pro’s LiDAR Scanner

Source: Halide Blog


Sebastiaan de With, on the Halide blog, goes deep on the 2020 iPad Pro’s camera module. His examination reveals that the device’s wide camera is virtually identical to that of the 2018 model. And the ultra-wide camera, unfortunately, isn’t quite up to the quality level of what’s found in the iPhone 11 and 11 Pro.

The most exciting and impressive aspect of the camera system is the LiDAR Scanner. The Halide team actually went to the trouble of building an entire proof-of-concept app that utilizes the LiDAR Scanner to capture your surroundings.

With Halide, we’d love to use the depth data in interesting ways, even if it’s low resolution. There was only one problem: there are no APIs for us as developers to use to get access to the underlying depth data. They only expose the processed 3D surface.

What if we re-thought photographic capture, though? We built a proof-of-concept we’re calling Esper.

Esper experiments with realtime 3D capture using the cameras and LIDAR sensor at room scale. It’s a fun and useful way to capture a space.
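The core of room-scale 3D capture like this is unprojecting a depth map into a cloud of 3D points using the camera's geometry. Here's a minimal pinhole-camera sketch of that step; the resolution, focal lengths, and depth values are invented for illustration and bear no relation to the LiDAR Scanner's actual output.

```python
# Unproject a row-major depth map (in meters) into (x, y, z) points using a
# pinhole camera model with focal lengths (fx, fy) and principal point (cx, cy).
def unproject(depth, width, height, fx, fy, cx, cy):
    points = []
    for v in range(height):
        for u in range(width):
            z = depth[v * width + u]
            if z <= 0:
                continue  # no depth reading for this pixel
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# A toy 2x2 "depth map": every pixel reads two meters from the camera.
cloud = unproject([2.0, 2.0, 2.0, 2.0], width=2, height=2,
                  fx=1.0, fy=1.0, cx=1.0, cy=1.0)
print(len(cloud))  # 4
```

A real app would do this per frame and fuse the clouds as the device moves, which is essentially what ARKit's processed mesh gives developers in place of the raw depth data Halide wanted.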

I always love reading de With’s in-depth explanations and comparisons of new iPhone or iPad cameras, and this was an especially fun one.

Permalink

Apple Announces Winners of Its ‘Shot on iPhone’ Night Mode Challenge

Photographer: Konstantin Chalabov (Moscow, Russia), iPhone 11 Pro


Apple has announced the six winners of its Shot on iPhone challenge. The contest, which was announced at the beginning of the year, asked photographers to submit their best Night mode shots taken with the iPhone 11 Pro and iPhone 11 Pro Max.

The winning photos, which were taken by photographers from China, India, Russia, and Spain, were judged by a panel of professional photographers and Apple executives and employees. The photos are currently featured on apple.com and Apple’s Instagram account, and will appear on billboards worldwide in the future.

The images chosen by Apple’s panel of judges are fantastic. It’s remarkable what can be accomplished with Night mode, especially when you look back at what nighttime photography was like on the iPhone just a few years ago.

Don’t miss all six winning Night mode shots in Apple’s press release.

Permalink

First Look: RTRO by Moment Vintage Video Camera App

Source: Moment.


RTRO by Moment is a brand new vintage video camera app for iOS from the makers of my favorite add-on camera lenses for the iPhone and the excellent Moment Pro Camera app.

The app is a new direction for Moment. The company’s Pro Camera app, combined with its add-on lenses for the iPhone, push the boundaries of what’s possible with the iPhone’s camera. Packed with settings and customizations, the Pro Camera app can create stunning photos and video in the hands of a skilled photographer.

In contrast, RTRO is a video-only camera app focused first and foremost on making fun, short videos for sharing that use filters crafted by photographers to create unique retro looks. It’s those filters, which Moment calls ‘looks,’ paired with a simple, approachable interface that make the app work. It’s easy to get started, fun to use, and the videos the app creates have a unique vibe that makes even the most mundane video more interesting for viewers.

Read more


CES: A Tour of the Most Interesting (and Strange) Tech Announcements

CES has been going strong all week with announcements of new gadgets: home automation gear, TVs, computers, and lots more. Many mobile phone makers and some big industry players sit out CES, but there is still plenty of news from companies big and small with new products and technologies to show off.

A lot of what gets hyped at CES is prototypes and concept devices that will never ship or will get delayed. Still, every year I find that CES is fascinating to study for the industry trends it reveals and the handful of gadgets I discover that I’d like to try.

After combing through hundreds of headlines and press releases, I’ve compiled a roundup of some of this week’s most compelling announcements. Feel free to skip around to the categories that you find most interesting using the table of contents below.

Read more


Apple Announces Night Mode Photography Contest

About this time last year, Apple announced its first-ever ‘Shot on iPhone’ photography challenge judged by a panel of professional photographers and Apple employees. Apple is back with a new contest this year, asking users to submit their Night mode photos.

Through January 29th, Apple is taking submissions on Instagram, Twitter, and Weibo. To qualify, post your photos on Instagram or Twitter with the hashtags #ShotoniPhone and #NightmodeChallenge, or on Weibo using #ShotoniPhone# and #NightmodeChallenge#.

Five winners will be picked by a panel of judges that include:

plus the following Apple executives and employees:

  • Phil Schiller
  • Kaiann Drance
  • Brooks Kraft
  • Jon McCormack
  • Arem Duplessis

The five winning photos will be announced on March 4th on the Apple Newsroom. Apple says the images may also be used in digital campaigns, at stores, on billboards, and in photo exhibitions.

Night mode photography was a big part of Federico’s story on iPhone 11 Pro photography called Eternal City, Modern Photography: The iPhone 11 Pro in Rome. Here’s an outtake from that story that Federico submitted for the challenge:

For more on the contest and tips on shooting Night mode photos, check out Apple’s press release.


Hands-On with Clips 2.1: Memoji and Animoji Support, Plus New Sticker Face Tracking and More

Apple has released the first big update in over a year for its Clips video creation tool. Following the trend begun in iOS 12, which added Animoji support to FaceTime, now all Animoji and Memoji characters can also be used inside Clips. Though I would have expected such an update a year ago, it’s nevertheless good to see. Besides Animoji and Memoji, Clips 2.1 only adds a couple other small new features, like a fresh batch of Mickey and Minnie stickers, a ‘Let It Snow’ winter poster, and support for right-to-left languages. After spending some time with the update, there are a couple nice implementation details related to Animoji that deserve highlighting.

Read more


Sebastiaan de With Explains Why the iPhone 11 Camera Is Such a Big Leap Forward

Sebastiaan de With, part of the team behind the camera app Halide, has published part 1 of a multi-part breakdown of the iPhone 11 camera. It’s a fantastic analysis of what makes the new camera different from past versions and goes into great depth while remaining accessible, even if you have only a passing familiarity with photography.

To put this year’s camera into perspective, de With recaps what Apple did with last year’s iPhone cameras, explaining how Smart HDR works and its shortcomings. The iPhone 11 features Smart HDR too, but as de With explains, Apple has significantly improved how it handles the dynamic range of an image.

Another aspect of the improvement is in the camera sensor hardware. Despite its diminutive size, the iPhone 11’s image sensor can resolve more detail than any iPhone camera before it.

However, many of the iPhone 11’s camera improvements come down to better software. The new camera post-processes each component of an image differently, applying different noise reduction to the sky, a face, hair, and clothing, for example. Apple calls the feature Photo Segmentation, and it’s aided by machine learning.
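The idea of segmentation-driven processing — run the same image through differently tuned pipelines, then pick each pixel's result according to its label — can be illustrated with a toy 1D example. The labels, smoothing function, and strengths below are entirely made up; Apple's Photo Segmentation is ML-driven and vastly more sophisticated.

```python
def smooth(values, strength):
    """Blend each interior sample toward its local neighborhood mean by `strength`."""
    out = list(values)
    for i in range(1, len(values) - 1):
        neighborhood = (values[i - 1] + values[i] + values[i + 1]) / 3
        out[i] = values[i] * (1 - strength) + neighborhood * strength
    return out

def process(pixels, labels, strength_per_label):
    """Smooth the whole image once per label, then keep each pixel's labeled result."""
    results = {lab: smooth(pixels, s) for lab, s in strength_per_label.items()}
    return [results[lab][i] for i, lab in enumerate(labels)]

# A noisy 1D "image": the first half is sky, the second half is hair.
pixels = [10.0, 50.0, 10.0, 50.0, 10.0, 50.0]
labels = ["sky", "sky", "sky", "hair", "hair", "hair"]
# Denoise the sky aggressively; barely touch the hair to preserve detail.
out = process(pixels, labels, {"sky": 0.9, "hair": 0.1})
```

After processing, the sky pixels move sharply toward their neighborhood average while the hair pixels stay close to their original values — the per-region trade-off the paragraph above describes.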

One of my favorite features of the new camera is Night Mode. As de With notes:

In the iPhone 11 Night Mode, you can also see detail vanish in some areas. Except that it really seems to only affect parts of the image that you don’t really care that much about. Night Mode has a remarkable if not uncanny ability to extract an image that is sometimes even sharper than the regular mode, with strong sharpening and detail retention occurring in areas that are selected by the camera during processing.

The iPhone 11’s camera is also the first one de With thinks rivals standalone cameras:

In the past, iPhones made great photos for sharing on social media, but blown up on a big screen, the shots didn’t hold up. It’s why I frequently still pack a ‘big’ camera with me on trips.

With these huge improvements in processing, the iPhone 11 is the first iPhone that legitimately challenges a dedicated camera.

There are many more details in de With’s article, including a close look at the iPhone 11’s ultra wide lens. Every section of the post has photos and side-by-side comparisons that illustrate the analysis too, which makes the full post a must-read.

Permalink

Apple’s Deep Fusion Camera Feature Launching as Part of the iOS Developer Beta Program

According to TechCrunch’s Matthew Panzarino, Apple will roll out the Deep Fusion camera feature announced at the company’s fall iPhone event today as part of the iOS developer beta program.

Deep Fusion is Apple’s new method of combining several image exposures at the pixel level for enhanced definition and color range beyond what is possible with traditional HDR techniques. Panzarino explains how Deep Fusion works:

The camera shoots a ‘short’ frame, at a negative EV value. Basically a slightly darker image than you’d like, and pulls sharpness from this frame. It then shoots 3 regular EV0 photos and a ‘long’ EV+ frame, registers alignment and blends those together.

This produces two 12MP photos – 24MP worth of data – which are combined into one 12MP result photo. The combination of the two is done using 4 separate neural networks which take into account the noise characteristics of Apple’s camera sensors as well as the subject matter in the image.
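The bracketing scheme Panzarino describes can be reduced to a numeric sketch: one underexposed ‘short’ frame for sharpness, several EV0 frames and a ‘long’ frame for tonal range, merged per pixel. The fixed weights here are invented purely for illustration — the real pipeline uses neural networks tuned to the sensor's noise characteristics, not a static average.

```python
def merge_brackets(short, ev0_frames, long_, detail_weight=0.5):
    """Average the EV0 frames for noise, blend with the long frame for range,
    then pull per-pixel detail from the short (EV-) frame."""
    merged = []
    for i in range(len(short)):
        ev0_avg = sum(f[i] for f in ev0_frames) / len(ev0_frames)
        base = (ev0_avg + long_[i]) / 2
        merged.append(base * (1 - detail_weight) + short[i] * detail_weight)
    return merged

short = [90.0, 110.0]          # darker, sharper frame (EV-)
ev0 = [[100.0, 120.0]] * 3     # three normally exposed frames (EV0)
long_ = [110.0, 130.0]         # brighter frame (EV+)
print(merge_brackets(short, ev0, long_))  # [97.5, 117.5]
```

Even this crude version shows the shape of the trade: each output pixel is informed by every exposure, which is why the process costs a beat of processing time after the shutter fires.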

Apple told Panzarino that the technique “results in better skin transitions, better clothing detail and better crispness at the edges of moving subjects.”

There is no button or switch to turn Deep Fusion on. Like the over-crop feature that uses the ultra wide lens to allow photo reframing after the fact, Deep Fusion is engaged automatically depending on the camera lens used and light characteristics of the shot being taken. Panzarino also notes that Deep Fusion, which is only available for iPhones that use the A13 processor, does not work when the over-crop feature is turned on.

I’ve been curious about Deep Fusion since it was announced. It’s remarkable that photography has become as much about machine learning as it is about the physics of light and lenses. Deep Fusion is also the sort of feature that can’t be demonstrated well onstage, so I’m eager to get my hands on the beta and try it myself.

Permalink