The equipment supporting the iPhone 15 Pro Max used to film Apple’s Scary Fast event is extensive and clearly made for a final product that you couldn’t shoot on your own with just an iPhone. However, it’s still impressive to see such a small device at the center of such an elaborate and well-produced event. The behind-the-scenes video originally leaked in a tweet; here’s the official version:
Every year, some of the most anticipated iPhone hardware announcements are the changes to its cameras. This year, the iPhone 15 Pro Max’s new telephoto lens was the center of attention. However, there were other notable tweaks to the camera hardware and software across the iPhone lineup, too. Plus, we got a hardware bonus in the form of the iPhone 15 Pro and Pro Max’s Action button, which can perform some interesting tricks. Now, with the new iPhones in the hands of people around the world, we’re starting to see what that hardware can do in practice, and I’ve got three examples I’d like to share.
Source: Lux.
The first is an update to the camera app Halide that does something incredibly clever. Built into version 2.13 of the app is a shortcut that can be tied to the Action button to open Halide with a single press. That’s something you can do with any app using an Open App action, but Halide goes a step further by offering in-app functionality if you tie the button to its app. In the app’s settings, you can choose to tie the Action button to any of the following options:
Do nothing
Lenses
Exposure Mode
Focus Mode
RAW
Resolution
Capture
After using the Action button to open the app, pressing the button again will perform whichever action you picked in its settings. For example, if you chose Lenses, the first press of the Action button from outside the app will open Halide, and subsequent presses will cycle through each of the available camera lenses. I love this use of the Action button and hope other developers do the same, adding contextual actions to more apps.
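To give a sense of how that kind of contextual behavior could be modeled, here’s a minimal Swift sketch of the Lenses option: each press after the first advances to the next lens and wraps around. The `Lens` and `LensCycler` names are hypothetical illustrations, not Halide’s actual API.

```swift
import Foundation

// Hypothetical model of Halide's "Lenses" Action button option:
// the first press opens the app; each subsequent press advances
// to the next lens, wrapping around at the end of the list.
enum Lens: CaseIterable {
    case ultraWide, wide, telephoto
}

struct LensCycler {
    private(set) var current: Lens = .wide

    // Advance to the next available lens, wrapping around.
    mutating func press() {
        let all = Lens.allCases
        let index = all.firstIndex(of: current)!
        current = all[(index + 1) % all.count]
    }
}
```

In a real camera app, each press would also switch the capture session over to the corresponding physical camera; this sketch only captures the cycling logic.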
With the iPhone 15 Pro Max’s default 24-megapixel resolution, added ‘lenses’ under the main camera lens, automatic depth capture for portraits, and that 5× lens, this release might not blow you away on a spec sheet, but it’s massive for everyone who uses an iPhone to take photos.
There’s a lot of ground to cover between the hardware and processing changes happening behind the scenes. Plus, de With is an excellent photographer whose shots do a fantastic job illustrating what is possible with the iPhone 15 Pro Max. So be sure to check out the full review.
Finally, the iPhone’s camera takes amazing video, too. This year saw the introduction of Log encoding for 4K ProRes footage. That opens up a wider range of editing control using apps like Final Cut Pro, which Joey Helms used to create this amazing video of Chicago:
I’ve had my iPhone 15 Pro Max for just four days, and already, I’m enjoying taking photos as I walk around my neighborhood and playing with features like adding Portrait mode after the fact to images like the one below.
Before (left) and after (right) applying Portrait mode.
The result is a lot more creative freedom that’s more accessible than ever, not only because your iPhone is usually in your pocket but because the tools Apple has created for taking great photos and videos are so easy to use.
It’s easy to forget how powerful the computers we carry with us everywhere are. While most of us are firing off text messages to our friends, companies like 3D Pets are using the iPhone’s LiDAR and TrueDepth camera in innovative ways to help dogs and other animals with missing or deformed limbs.
Yesterday, both Apple and Marques Brownlee published videos spotlighting the work 3D Pets is doing to create custom prostheses for pets. The process includes taking a 3D scan of the animal using the iPhone’s TrueDepth camera and then modeling and 3D printing a one-of-a-kind prosthesis.
The tech is cool, and the stories are heartwarming and worth taking a break from your day to watch.
It’s harder than ever to push Apple devices to their limits. Sure, some apps and workflows will do it, but for everyday tasks, Apple silicon has opened a gap between hardware and software that we haven’t seen in a while.
The transformation was gradual with the iPhone and iPad compared to the sudden leap the Mac took with the M1, but the result is the same. There are fewer and fewer apps that push Apple’s chips to the max.
That’s beginning to change with the focus on machine learning and Apple silicon’s Neural Engine. While pundits fret over Apple’s lack of an AI chatbot, developers are building a new class of apps that use local, on-device machine learning to accomplish some pretty amazing feats on all of Apple’s devices.
Detail Duo.
Great examples of this are the apps by Detail, an Amsterdam-based startup. Detail has two apps: Detail Duo, an iPhone and iPad video production app, and Detail for Mac, which does something similar but with a focus on multi-camera setups more suitable to a desktop environment.
As I explained in my Final Cut Pro for iPad first impressions story last week, I don’t work with much video. However, I’ve been dabbling in video more, and I’ve discovered a story as old as personal computers themselves.
Every hardware advance that creates a huge amount of performance headroom is eventually consumed by the ever-growing demands of apps. That’s just as true with Apple silicon as it was for other chip advances. What seemed like more power than average consumers would ever need quickly becomes a necessity as apps like Detail Duo and Detail push that hardware to its limits.
It’s these sorts of advances that I find incredibly exciting because when they’re coupled with intuitive, well-designed apps, they open up entirely new opportunities for users. For Detail, that means simplifying and democratizing video production that would have been out of reach of most users not that long ago, expanding access to video as a creative outlet.
Before digging into these apps further, though, you should know that my son Finn is on the team building Detail and Detail Duo. That’s one of the reasons I’ve known about and followed these apps for a long time now. I figured that’s context readers should know.
In his latest video, MKBHD eloquently summarized and explained something that I’ve personally felt for the past few years: pictures taken on modern iPhones often look sort of washed out and samey, as if much of the contrast and highlights from real life were lost somewhere along the way during HDR processing, Deep Fusion, or whatever Apple is calling its photography engine these days. From the video (which I’m embedding below), here’s the part where Marques notes how the iPhone completely ignored a light source that was pointing at one side of his face:
Look at how they completely removed the shadow from half of my face. I am clearly being lit from a source that’s to the side of me, and that’s part of reality. But in the iPhone’s reality you cannot tell, at least from my face, where the light is coming from. Every once in a while you get weird stuff like this, and it all comes back to the fact that it’s software making choices.
That’s precisely the issue here. The iPhone’s camera hardware is outstanding, but how iOS interprets and remixes the data it gets fed from the camera often leads to results that I find…boring and uninspired unless I manually touch them up with edits and effects. I like how Brendon Bigley put it:
Over time though, it’s become more and more evident that the software side of iOS has been mangling what should be great images taken with a great sensor and superbly crafted lenses. To be clear: The RAW files produced by this system in apps like Halide are stunning. But there’s something lost in translation when it comes to the stock Camera app and the ways in which it handles images from every day use.
Don’t miss the comparison shots between the Pixel 7 Pro and iPhone 14 Pro in MKBHD’s video. As an experiment for the next few weeks, I’m going to try what Brendon suggested and use the Rich Contrast photographic style on my iPhone 14 Pro Max.
It’s the end of the year, and before I take a few days off to relax for the holidays, I have a few cool things to share that have been sitting on my desk and Mac for a little bit.
The Belkin Mount with MagSafe for Mac Desktops and Displays
One of macOS Ventura’s flagship features is Continuity Camera, which lets you use an iPhone’s camera as a webcam. I covered Continuity Camera in my Ventura review, and it works really well, especially with Center Stage turned off, so you get the full uncropped image from the iPhone’s camera.
A side view.
Alongside Continuity Camera, Belkin introduced an excellent, compact MagSafe mount for Apple laptops but left desktop and external display users hanging. Last week, desktop users got their wish for a similar solution, with a double-hinged MagSafe mount that I expect will work with a wide range of displays.
Ready for hooking to a screen.
Belkin sent me its new mount last week, and I immediately gave it a try. The hardware has a nice, solid feel. The hinges are stiff, so your iPhone’s weight won’t affect your setup, and every surface that touches your display, front and back, as well as your iPhone, has a soft-touch finish that shouldn’t scratch your display or phone.
Austin Mann’s review of the iPhone 14 Pro’s cameras is out, and as usual, he’s back with beautiful photos from an interesting location. This time, it’s the Scottish Highlands, where Mann put the iPhone 14 Pro’s new cameras to the test.
One of the advantages of the new 48MP camera is more latitude to crop images without reducing their resolution too far. There’s a great example in Mann’s review of a tight crop on a rooster that illustrates how far an image can be cropped and still retain lots of detail. Still, Mann concludes that he’s more likely to shoot at 12MP than 48MP in most situations because it’s still the fastest way to shoot and performs so well in low light. Mann was also impressed with shooting video in Action mode, although he notes that it requires good lighting and crops the resulting video substantially.
With high-resolution imaging capability, Action mode stabilization, and a Cinematic mode that now supports 4K at 24 fps, the iPhone 14 Pro is a powerful imaging tool in the pocket of a creative pro. Beyond the cameras, new safety features like Emergency SOS via satellite and crash detection are exciting to have with me (and with my loved ones).
Now I’m just hoping we see some monster steps forward in the digital workflow so we can quickly get these beautiful files off our cameras and into our projects to share with the world!
For examples of the kind of shots that are possible when the iPhone 14 Pro is in the hands of a professional and more details on the camera’s performance, be sure to visit Mann’s site.
Halide 2.5 is out, and it includes a brand new Macro Mode. Macro photography is an exclusive feature of the iPhone 13 Pro and 13 Pro Max. Still, Halide has managed to make its Macro Mode available on the iPhone 8 and newer models thanks to some cool machine learning tricks.
Switching to Macro Mode and dialing in precise focus is simple with Halide 2.5.
Macro Mode is easy to use. When you open the app, auto-focus (AF) is selected by default. Tap it, and the focus controls slide into place, with auto-focus at one end of the app’s focus dial and Macro Mode (the button with the flower) at the other end. Select Macro Mode, and you’ll see a new focus dial with smaller increments appear. The Halide team says this enables sub-millimeter adjustments for extra-precise close-up focusing.
Halide takes its close-ups by first switching to the camera on your iPhone that can take the closest shots. Focusing is handled by its precision focus dial, and the final step is to enhance the image’s details using an AI-based enhancement process. That last super-resolution step is what allows Halide’s Macro Mode to be used on cameras on older models of iPhones and to enhance Apple’s own macro system too.
In my testing over the past day, the results have been impressive. I’m especially fond of the precise focus dial that allows for minute adjustments that make a difference at such close range.
If you’re a Club MacStories+ or Club Premier member, head over to the new Photography channel in our Club Discord to see even more of my experiments with Halide’s Macro Mode and share your own macro shots.
Halide is available on the App Store as a subscription for $2.99/month or $11.99/year, or for a one-time payment of $49.99. The app also offers a 7-day free trial.
Matthew Panzarino, TechCrunch’s Editor-in-Chief, put the iPhone 13 Pro camera’s new Cinematic mode through its paces at Disneyland in an excellent real-world test of the new feature. Panzarino also spoke to Kaiann Drance, Apple’s vice president of Worldwide iPhone Product Marketing, and Johnnie Manzari, a designer on Apple’s Human Interface Team, about how Cinematic mode works.
“In cinema, the role of gaze and body movement to direct that story is so fundamental. And as humans we naturally do this, if you look at something, I look at it too.”
So they knew they would need to build in gaze detection to help lead their focusing target around the frame, which in turn leads the viewer through the story. Being on set, Manzari says, allowed Apple to observe these highly skilled technicians and then build in that feel.
“We’re on set and we have all these amazing people and they’re really the best of the best. And one of the engineers noticed that the focus puller has this focus control wheel, and, and he’s just studying the way that this person does this. Just like when you look at like someone who’s really good at playing the piano, and it looks so easy, and yet you know it’s impossible. There’s no way you’re going to be able to do this,” says Manzari.
“This person is an artist, this person is so good at what they do and the craft they put into it. And so we spent a lot of time trying to model the analog feel of a focus wheel turning.”
Some of the individual components that make up Cinematic mode include:
Subject recognition and tracking
Focus locking
Rack focusing (moving focus from one subject to another in an organic-looking way)
Image overscan and in-camera stabilization
Synthetic Bokeh (lens blur)
A post-shot editing mode that lets you alter your focus points even after shooting
And all of those things are happening in real time.
Despite everything that goes into Cinematic mode, Panzarino notes that the battery impact of using it throughout the day was surprisingly slight.
Cinematic mode isn’t without its flaws, which are covered in the story, but it’s worth watching the entire video that Panzarino shot during a Disneyland visit with his family to get a sense of it yourself. If you study the video closely, you’ll pick up on the places where Cinematic mode struggles. However, if you sit back and casually watch the video the way you would after a vacation or if a friend sent it to you, the flaws largely fade into the background. I’m eager to test Cinematic mode for myself, and I don’t mean to suggest that it’s necessarily fine as it is, but I also expect that it will be a net positive in a lot of circumstances.