Unlike most reviewers who were given an iPhone to test, Austin Mann didn’t head home; he headed to Kenya. Mann, whose iPhone camera reviews we’ve covered many times in the past, took the new iPhone 16 Pro on safari (not the browser), photographing a variety of wildlife, landscapes, and people. The video Mann shot of the trip is stunning:
The video and Mann’s accompanying article and photographs do more than just showcase the kind of shots that are possible with the iPhone 16 Pro. They also go in-depth on the new features and what each means to photographers:
The iPhone 15 Pro had the same 13mm (.5x) ultra-wide lens, but the sensor was only 12 megapixels—just 25% of the resolution of the 24mm (1x) lens. Over the past year, especially while working with the iPhone 15 Pro, I often found myself torn. Sometimes I wanted the wider perspective, but I didn’t want to sacrifice resolution. I was thrilled when the new Ultra Wide was announced with 48 megapixels, and it certainly doesn’t disappoint.
An added bonus is that the iPhone’s Macro mode also uses the Ultra Wide camera, meaning Macro shots are now 48 megapixels as well. The detail is remarkable, and the iPhone 16 Pro might just be my new favorite camera for macro photography.
Mann likes the Camera Control button for quick access to the Camera app, too, but found that in circumstances like shooting from a helicopter, it could be hard to operate.
The video and post are both worth spending time with. You’ll learn what the new cameras in the Pro iPhones can do and perhaps be inspired to go out and try the new features yourself.
Today, Lux released an update to Halide, its manual control camera app. The marquee feature is Process Zero, a mode that allows photographers to take images with no algorithmic or AI processing. As Lux’s Ben Sandofsky explains it:
Process Zero is a new mode in Halide that skips over the standard iPhone image processing system. It produces photos with more detail and allows the photographer greater control over lighting and exposure. This is not a photo filter — it really develops photos at the raw, sensor-data level.
The result is that, under some conditions, it’s possible to capture finer detail than in a processed photo. The resulting image is a RAW file that’s 12 MB, significantly smaller than a ProRAW photo. In addition to Process Zero, the Halide team introduced Image Lab, a feature accessed from your Halide photo library that offers a single dial for adjusting your RAW photos.
Process Zero comes with some tradeoffs, as explained in depth in Sandofsky’s post. The images it produces are “less saturated, softer, grainier, and quite different than what you see from most phones.”
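Sandofsky’s phrase “develops photos at the raw, sensor-data level” refers to the fact that a camera sensor records only one brightness value per photosite behind a color filter, so any developer, whether Apple’s pipeline or Process Zero, has to reconstruct full RGB pixels from that mosaic. This hypothetical Python sketch (not Lux’s actual code) illustrates the idea with a naive demosaic of an RGGB Bayer pattern:

```python
def demosaic_rggb(mosaic):
    """Turn a raw RGGB Bayer mosaic (2D list, even dimensions) into RGB pixels.

    Each 2x2 block holds one red, two green, and one blue sample; here we
    collapse each block into a single RGB pixel, averaging the two greens.
    Real raw developers interpolate far more carefully and also handle
    white balance, noise reduction, and tone mapping.
    """
    h, w = len(mosaic), len(mosaic[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            r = mosaic[y][x]                                  # top-left: red
            g = (mosaic[y][x + 1] + mosaic[y + 1][x]) / 2     # two greens
            b = mosaic[y + 1][x + 1]                          # bottom-right: blue
            row.append((r, g, b))
        out.append(row)
    return out
```

How aggressively those later steps (noise reduction, tone mapping, fusion of multiple exposures) are applied is exactly what Process Zero opts out of.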
I’ve had limited time to try Process Zero, but it was immediately apparent that taking photos with it is a different, harder process than relying on the iPhone’s image processing. The feature requires a more deliberate, attentive approach to Halide’s manual camera settings to get a good shot. That’s not necessarily a bad thing, but it is clearly different, and I imagine it’s also the best way to really learn how the app’s manual controls work.
I also appreciate that the Halide team is taking a human-focused approach to photography at a time when so many developers and AI companies seem all too willing to cast aside photographers in favor of algorithms and generative AI. Process Zero’s approach to photography isn’t for everyone, and I expect most of the time, it won’t be for me either. However, I’m glad it’s an option because, in the hands of a skilled photographer, it’s a great tool.
If you’re interested in checking out Halide’s new Process Zero and Image Lab features, which are the foundation of what will become Halide Mark III, the app is currently on sale. For the rest of this week, Lux is offering Halide membership subscriptions for $11.99 per year, which is a 40% discount. The app is also available as a one-time $60 purchase.
Insta360 has had a busy week. Earlier this spring, the company released the X4, an advanced 360º action camera, which became available at Apple.com for the first time last week. Then this week, the company released the Insta360 Flow Pro, an AI-powered gimbal for smartphones, including the iPhone. I haven’t had a chance to try either gadget yet, but both caught my eye for different reasons.
First, the Insta360 X4 reminds me of the company’s One X2 that I reviewed a few years ago. As action cameras go, that was a great little device that took excellent video and photos for something so compact.
Source: Insta360.
Three years later, the Insta360 X4 goes much further. The candy bar-shaped camera is capable of 8K video at 30 fps and has a 2290mAh battery that Insta360 says lasts for 135 minutes, which is impressive if it bears out in real-world use. The bundle sold on Apple.com includes the camera, a lens cover, a carrying case, a 256GB microSD card, a USB-C cable, and the company’s Invisible Selfie Stick accessory. That last item is key because, through the magic of software, it can be removed from any scene you shoot, creating a third-person perspective without needing someone else to operate the camera.
The X4 also features a 2.5” screen that is protected by Corning Gorilla Glass. Plus, it can take 360º photos, capture different cinematic styles of video like slow motion, and be controlled with voice and hand gestures, making it an intriguing choice for solo creators. The X4 is also available on Amazon.
The Insta360 Flow Pro. Source: Insta360.
Smartphone gimbals have come a long way, too. I tried a DJI Osmo 2 gimbal years ago, but it was bulky and difficult to calibrate accurately. Insta360’s new Flow Pro looks like it solves a lot of those friction points. The gimbal, which uses AI to stabilize video taken with an iPhone or other smartphone, folds up, saving room in your bag. It also doubles as a tripod with a flip-out base, so you can set it up to film yourself or others nearby while it tracks you to keep you framed in the scene. For iPhone users, the Flow Pro uses DockKit, the API Apple introduced in 2023 that coordinates shots with the gimbal, stabilizing them and keeping you in the frame. The only other iPhone accessory I’m aware of that does this is the Belkin Auto-Tracking Stand Pro with DockKit, which is a tabletop or desktop device, not a gimbal. The Flow Pro, which is available on Insta360’s website, also offers a fast, integrated pairing process for iPhone users.
It’s summer, which means I, like a lot of others, will be traveling, and both of these devices strike me as compelling travel companions. The X4 offers high-resolution video, 360º images, and a plethora of cool software tricks for creating unique videos. Meanwhile, the Flow Pro is the kind of accessory that allows you to take the camera you always have with you and use it in new and creative ways, extending its utility. I’m hoping to get a chance to test one or both devices later this summer and will report back.
I’ve been playing around with Kino, a video camera app by Lux, on and off for the past day. That’s not long enough to do a full review, so instead, I got up this morning and headed out for a walk with Kino in tow to see what the default experience is like. The short answer is it’s excellent. Kino is designed to work well out of the box for a novice like me but offers manual controls for someone who needs less hand-holding. It’s similar to Lux’s approach to Halide, the company’s pro camera app, and my early experience with Kino has been just as good as it’s been with Halide.
Kino and Halide share a similar design aesthetic, so if you’ve ever tried Halide, you’ll have no trouble finding your way around Kino’s UI. There’s a record button at the bottom of the screen flanked by a button to access the video you’ve taken, which can be stored in your photo library or in the Files app, and a button for the app’s Instant Grade feature. At the top of the screen are controls for resolution, frame rate, and format presets, as well as a ‘Custom’ option; that’s also where you’ll see your audio levels and a button for switching between automatic and manual exposure. Just beneath the viewfinder are controls for toggling auto and manual focus, picking your camera lens, and a button for accessing additional controls and the app’s settings.
Like Halide, Kino also comes with a set of guides to get you started, which I haven’t tried yet because they weren’t available in the beta version of the app. However, if they’re anything like Halide’s guides, I expect they’ll be worth checking out if you’re new to shooting video and want to get the most out of Kino.
Some of Kino’s built-in color presets.
The app shoots beautiful video by default. Here’s an example of a short walk through Davidson College’s campus using all default settings, the iPhone 15 Pro Max’s Ultra Wide lens, and no post-processing.
The marquee feature of Kino is Instant Grade. The app comes with a collection of built-in color presets that you can preview in the viewfinder, making it easy to find one that fits your needs. The collection was created by video experts, including Stu Maschwitz, Sandwich Video, Evan Schneider, Tyler Stalman, and Kevin Ong. But you’re not limited to the presets that come with Kino; you can also import any LUT using the app’s integration with the Files app.
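A LUT (lookup table) is conceptually simple: a table of precomputed output colors indexed by input color, which is what makes grades cheap enough to preview live in a viewfinder. As a rough illustration of the data structure (not Kino’s implementation), here’s a minimal 3D LUT with nearest-neighbor lookup in Python:

```python
def identity_lut(size):
    """Build a size x size x size cube that maps every color to itself."""
    s = size - 1
    return [[[(r / s, g / s, b / s) for b in range(size)]
             for g in range(size)]
            for r in range(size)]

def apply_lut(pixel, lut):
    """Grade one RGB pixel (components in 0.0-1.0) by indexing into the cube.

    Nearest-neighbor lookup for simplicity; production apps interpolate
    trilinearly between the eight surrounding table entries so grades
    stay smooth.
    """
    size = len(lut)
    i = [min(round(c * (size - 1)), size - 1) for c in pixel]
    return lut[i[0]][i[1]][i[2]]
```

Formats like .cube essentially serialize a table like this, which is why an app can apply any third-party grade just by loading the file.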
I visited a nearby lake and shot some video with Kino’s default settings enabled, and then tried each of its color presets:
The app also implements something Lux calls AutoMotion, which applies exposure logic that gives video a cinematic feel. It’s another feature that just works out of the box for novices who don’t want to dig deeper. However, you always have the option to deviate from the defaults, adjusting settings manually.
My first-run experience with Kino was great. I didn’t explore the app before heading out the door this morning, yet I had no trouble figuring out the basics and shooting video that looks good with no processing whatsoever. With more practice and some post-processing, I’m sure the results would look even better, but I love how well my video turned out with minimal effort. I’m planning to spend more time with Kino over the summer and look forward to checking out Lux’s guides to improve my video skills.
Continuity Camera is amazing. Since it was introduced in macOS Ventura, I’ve been using the feature almost daily. Continuity Camera is a native macOS feature that lets you use an iPhone as your webcam. For it to work, you can either connect the iPhone to your Mac using a cable or use it wirelessly if both devices are signed in with the same Apple ID. It’s quite impressive that, despite having to rely so often on video calls for work, I still don’t own a webcam today. Instead, the camera I use at my desk is an old iPhone SE (2nd generation), which was my partner’s main iPhone until they upgraded last year.
Over the past few months, however, video calls have become a critical part of my daily work. As an activist, part of my work now also involves conducting online training sessions, sometimes with up to a hundred participants at a time. I just couldn’t afford to join one of those sessions without a working camera, so Continuity Camera became a feature that I need to work reliably. Sadly, it doesn’t. Half of the time, apps like Zoom and Discord on macOS wouldn’t see the iPhone SE in the list of available cameras. This meant I had to fetch a Lightning cable to manually connect the iPhone. If I was unlucky that day and that didn’t work, I would have to completely reboot the Mac. If I was really unlucky and even that didn’t work, I would end up joining the call without a camera. Despite my meeting all the requirements listed by Apple Support, this problem just kept happening on random occasions.
I had to find a fix for this bug, or at least a way to work around it.
The equipment supporting the iPhone 15 Pro Max used to film Apple’s Scary Fast event is extensive and clearly made for a final product that you couldn’t shoot on your own with just an iPhone. However, it’s still impressive to see such a small device at the center of such an elaborate and well-produced event. Originally leaked in a tweet, here’s the official version of the video:
Every year, some of the most anticipated iPhone hardware announcements are the changes to its cameras. This year, the iPhone 15 Pro Max’s new telephoto lens was the center of attention. However, there were other notable tweaks to the camera hardware and software across the iPhone lineup, too. Plus, we got a hardware bonus in the form of the iPhone 15 Pro and Pro Max’s Action button, which can perform some interesting tricks. Now, with the new iPhones in the hands of people around the world, we’re starting to see what that hardware can do in practice, and I’ve got three examples I’d like to share.
Source: Lux.
The first is an update to the camera app Halide that does something incredibly clever. Built into version 2.13 of the app is a shortcut that can be tied to the Action button to open Halide with a single press. That’s something you can do with any app using an Open App action, but Halide goes a step further, offering in-app functionality when the button is tied to the app. In the app’s settings, you can choose to tie the Action button to any of the following options:
Do nothing
Lenses
Exposure Mode
Focus Mode
RAW
Resolution
Capture
After using the Action button to open the app, pressing the button again will perform whichever action you picked in its settings. For example, if you chose Lenses, the first press of the Action button from outside the app will open Halide, and subsequent presses will cycle through each of the available camera lenses. I love this use of the Action button and hope other developers do the same, adding contextual actions to more apps.
With the iPhone 15 Pro Max’s default 24-megapixel resolution, added ‘lenses’ under the main camera lens, automatic depth capture for portraits, and that 5× lens, this release might not blow anyone away on a spec sheet, but it is massive for everyone who uses an iPhone to take photos.
There’s a lot of ground to cover between the hardware and processing changes happening behind the scenes. Plus, de With is an excellent photographer whose shots do a fantastic job illustrating what is possible with the iPhone 15 Pro Max. So be sure to check out the full review.
Finally, the iPhone’s camera takes amazing video, too. This year saw the introduction of Log encoding for ProRes 4K footage. That opens up a wider range of editing control in apps like Final Cut Pro, which Joey Helms used to create this amazing video of Chicago:
I’ve had my iPhone 15 Pro Max for just four days, and already, I’m enjoying taking photos as I walk around my neighborhood and playing with features like adding Portrait mode after the fact to images like the one below.
Before (left) and after (right) applying Portrait mode.
The result is a lot more creative freedom that’s more accessible than ever, not only because your iPhone is usually in your pocket but because the tools Apple has created for taking great photos and videos are so easy to use.
It’s easy to forget how powerful the computers we carry with us everywhere are. While most of us are firing off text messages to our friends, companies like 3D Pets are using the iPhone’s LiDAR and TrueDepth camera in innovative ways to help dogs and other animals with missing or deformed limbs.
Yesterday, both Apple and Marques Brownlee published videos spotlighting the work 3D Pets is doing to create custom prostheses for pets. The process includes taking a 3D scan of the animal using the iPhone’s TrueDepth camera and then modeling and 3D printing a one-of-a-kind prosthesis.
The tech is cool, and the stories are heartwarming and worth taking a break during your day to watch.
It’s harder than ever to push Apple devices to their limits. Sure, some apps and workflows will do it, but for everyday tasks, Apple silicon has opened a gap between hardware and software that we haven’t seen in a while.
The transformation was gradual with the iPhone and iPad compared to the sudden leap the Mac took with the M1, but the result is the same. There are fewer and fewer apps that push Apple’s chips to the max.
That’s beginning to change with the focus on machine learning and Apple silicon’s Neural Engine. While pundits fret over Apple’s lack of an AI chatbot, developers are building a new class of apps that use local, on-device machine learning to accomplish some pretty amazing feats on all of Apple’s devices.
Detail Duo.
Great examples of this are the apps by Detail, an Amsterdam-based startup. Detail has two apps: Detail Duo, an iPhone and iPad video production app, and Detail for Mac, which does something similar but with a focus on multi-camera setups more suitable to a desktop environment.
As I explained in my Final Cut Pro for iPad first impressions story last week, I don’t work with much video. However, I’ve been dabbling in video more, and I’ve discovered a story as old as personal computers themselves.
Every hardware advance that creates a huge amount of performance headroom is eventually consumed by the ever-growing demands of apps. That’s just as true with Apple silicon as it was for other chip advances. What seemed like more power than average consumers would ever need quickly becomes a necessity as apps like Detail Duo and Detail push that hardware to its limits.
It’s these sorts of advances that I find incredibly exciting because when they’re coupled with intuitive, well-designed apps, they open up entirely new opportunities for users. For Detail, that means simplifying and democratizing video production that would have been out of reach of most users not that long ago, expanding access to video as a creative outlet.
Before digging into these apps further, though, you should know that my son Finn is on the team building Detail and Detail Duo. That’s one of the reasons I’ve known about and followed these apps for a long time now. I figured that’s context readers should know.