Posts tagged with "iPhone X"

Apple Opens iPhone X App Submissions to Developers

Apple has notified third-party iOS developers via its News and Updates website that they can download Xcode 9.0.1 and submit iPhone X apps for review in advance of the new iPhone’s release on November 3, 2017.

Download Xcode 9.0.1, test your apps in the iPhone X simulator, and capture screenshots. Then submit your updated apps and metadata in iTunes Connect today.
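
For developers automating that screenshot step, the XCTest additions in Xcode 9 can capture the simulator's screen from a UI test and attach it to the test report. A minimal sketch (the class and test names are illustrative):

```swift
import XCTest

// Illustrative UI test: launches the app and attaches a full-screen
// screenshot, which works in the iPhone X simulator as on any device.
class IPhoneXScreenshotTests: XCTestCase {
    func testCaptureScreenshot() {
        let app = XCUIApplication()
        app.launch()

        // XCUIScreen and XCTAttachment were introduced with Xcode 9.
        let screenshot = XCUIScreen.main.screenshot()
        let attachment = XCTAttachment(screenshot: screenshot)
        attachment.lifetime = .keepAlways
        add(attachment)
    }
}
```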

Earlier in the day, Apple told Reuters that customer demand for the iPhone X has been "off the charts" since pre-orders began at 12:01 AM Pacific on October 27th.

Permalink

Design Like the Notch Isn’t There

After revealing the iPhone X to the world on September 12th, Apple updated its Human Interface Guidelines and introduced a series of developer videos to address, among other topics, designing iOS apps with the iPhone X’s notch in mind. Designer Max Rudberg provides a comprehensive overview of Apple’s treatment of the notch. As Rudberg explains:

Apple is choosing to highlight the fact that the screen reaches the top left and right corner of the device. So the recommendation is clear. As a good platform citizen, one should follow their lead. By doing so, you likely have better chances to be highlighted by Apple in the App Store, or even win an Apple Design Award.

Eventually, they will get rid of the notch. It could be 2, 5, or even 10 years, but it’s a stop gap, not a permanent design solution. In the meantime, treat it like the elephant in the room. We all know it’s there, but for the most part, you should design as if it’s not.

Rudberg illustrates his article with screenshots of each point he covers and the dimensions of each screen element adjacent to the notch. It’s not a substitute for reading the Human Interface Guidelines and watching Apple’s videos, but Rudberg’s article is a great place for developers to start when considering how to design for the iPhone X.
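
In practice, much of this guidance boils down to laying out content against iOS 11's new safe area rather than hard-coding margins. A minimal UIKit sketch, assuming a plain view controller:

```swift
import UIKit

class ContentViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let label = UILabel()
        label.text = "Hello, iPhone X"
        label.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(label)

        // safeAreaLayoutGuide (new in iOS 11) excludes the notch and the
        // home-indicator region on iPhone X; on earlier devices it falls
        // back to the status bar and screen edges.
        let guide = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            label.topAnchor.constraint(equalTo: guide.topAnchor, constant: 8),
            label.leadingAnchor.constraint(equalTo: guide.leadingAnchor, constant: 16),
            label.trailingAnchor.constraint(lessThanOrEqualTo: guide.trailingAnchor, constant: -16)
        ])
    }
}
```

Because the same constraints resolve to ordinary status bar margins on older hardware, designing against the safe area lets one layout serve every device.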

Permalink

Apple’s Quest to Transform Photography

John Paczkowski of BuzzFeed conducted a fascinating interview with Apple’s Senior Vice President of Worldwide Marketing Phil Schiller and Johnnie Manzari of Apple’s Human Interface Team about the iPhone’s camera. Much of the discussion is focused on the new Portrait Lighting feature available on the iPhone 8 Plus and X. As Paczkowski explains:

The camera’s effects don’t rely on filters. They’re the result of Apple’s new dual camera system working in concert with machine learning to sense a scene, map it for depth, and then change lighting contours over the subject. It’s all done in real time, and you can even preview the results thanks to the company’s enormously powerful new A11 Bionic chip. The result, when applied to Apple scale, has the power to be transformative for modern photography, with millions of amateur shots suddenly professionalized.
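
The depth mapping Paczkowski describes is also exposed to third-party developers through the depth data APIs Apple added to AVFoundation in iOS 11. A rough sketch of reading a depth map from a dual-camera capture (session setup omitted; the delegate class name is illustrative):

```swift
import AVFoundation

// Illustrative delegate that pulls the depth map out of a captured photo.
class DepthCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let depthData = photo.depthData else { return }
        // depthDataMap is a CVPixelBuffer of per-pixel depth (or disparity) values.
        let map = depthData.depthDataMap
        print("Depth map: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    }
}

// Opt in to depth delivery before capturing; only dual-camera
// (and TrueDepth) hardware supports it.
func captureWithDepth(using output: AVCapturePhotoOutput,
                      delegate: DepthCaptureDelegate) {
    if output.isDepthDataDeliverySupported {
        output.isDepthDataDeliveryEnabled = true
    }
    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliveryEnabled
    output.capturePhoto(with: settings, delegate: delegate)
}
```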

Manzari described the extensive process that went into creating Portrait Lighting:

“We spent a lot of time shining light on people and moving them around — a lot of time,” Manzari says. “We had some engineers trying to understand the contours of a face and how we could apply lighting to them through software, and we had other silicon engineers just working to make the process super-fast. We really did a lot of work.”

BuzzFeed’s article is worth a close read because it’s about more than just the camera in Apple’s new and upcoming iPhones. The behind-the-scenes peek at how the many functions of the iPhone’s camera were developed is the best example of one of Apple’s biggest competitive advantages: the fusion of hardware and software.

Permalink

Craig Federighi Answers Face ID Questions

In a telephone interview with Matthew Panzarino of TechCrunch, Apple’s Senior Vice President of Software Engineering, Craig Federighi, answered many of the questions that have arisen about Face ID since the September 12th keynote event. Federighi went into depth on how Apple trained Face ID and how it works in practice. Regarding the training:

“Phil [Schiller] mentioned that we’d gathered a billion images and that we’d done data gathering around the globe to make sure that we had broad geographic and ethnic data sets. Both for testing and validation for great recognition rates,” says Federighi. “That wasn’t just something you could go pull off the internet.”

That data was collected worldwide from subjects who consented to having their faces scanned.

Federighi explained that Apple retains a copy of the depth map data from those scans but does not collect user data to further train its model. Instead, Face ID works on-device only to recognize users. The computational power necessary for that process is supplied by the new A11 Bionic chip, and the data is crunched and stored in the redesigned Secure Enclave.

The process of disabling Face ID differs from the five presses of the power button required on older iPhones. Federighi said:

“On older phones the sequence was to click 5 times [on the power button] but on newer phones like iPhone 8 and iPhone X, if you grip the side buttons on either side and hold them a little while – we’ll take you to the power down [screen]. But that also has the effect of disabling Face ID,” says Federighi. “So, if you were in a case where the thief was asking to hand over your phone – you can just reach into your pocket, squeeze it, and it will disable Face ID. It will do the same thing on iPhone 8 to disable Touch ID.”

In many respects, the approach Apple has taken with Face ID is very close to that taken with Touch ID. User data is stored in the Secure Enclave, and biometric processing happens on your iOS device, not in the cloud. If you have concerns about Face ID’s security, Panzarino’s article is an excellent place to start. Federighi says that closer to the introduction of the iPhone X, Apple will release an in-depth white paper on Face ID security with even more details.
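
For third-party apps, that continuity shows in the API: the same LocalAuthentication calls that drive Touch ID trigger Face ID on the iPhone X. A minimal sketch (the helper function name and prompt strings are illustrative):

```swift
import Foundation
import LocalAuthentication

// Illustrative helper: the same biometric policy covers Touch ID and
// Face ID; processing stays on-device, backed by the Secure Enclave.
func authenticateUser(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }

    // biometryType (new in iOS 11) reports which sensor is present.
    let reason = context.biometryType == .faceID
        ? "Unlock with Face ID"
        : "Unlock with Touch ID"

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: reason) { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```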

Permalink

Apple’s Bionic Advantage

Mashable interviewed Apple’s Senior Vice President of Worldwide Marketing Phil Schiller and Senior Vice President of Hardware Technologies Johny Srouji about Apple’s new A11 Bionic chip, which powers the iPhone 8, 8 Plus, and X.

“This is something we started 10 years ago, designing our own silicon because that’s the best way to truly customize something that’s uniquely optimized for Apple hardware and software,” said Srouji.

For Apple, silicon development is an intrinsic part of the iPhone creation process. “It’s not just something you drop in or build around,” said Schiller.

It’s a strategy that has paid off for Apple by giving it more control over the full hardware/software stack and enabling the company to squeeze more performance and energy efficiency out of the tiny chips that power iOS devices. At the same time, though, it’s an approach that requires Apple to make big bets far into the future:

Srouji told me that when Apple architects silicon, they start by looking three years out, which means the A11 Bionic was under development when Apple was shipping the iPhone 6 and its A8 chip. Back then we weren’t even talking about AI and machine learning at a mobile level and, yet, Srouji said, “The neural engine embed, it’s a bet we made three years ahead.”

Apple’s tight control over hardware and the software that runs on it isn’t new. It’s one of the cornerstones of the company’s success. What’s remarkable, though, is the microscopic level to which Apple has taken the approach. As author Lance Ulanoff points out, the images of chips that Phil Schiller displayed onscreen during the September 12th keynote to illustrate new and improved iPhone technologies weren’t different chips. They were different areas on the same chip – one with leaked Geekbench scores that put it on par with the silicon inside the 2017 13-inch MacBook Pro. That’s extraordinary and likely to be a key advantage that Apple will have over competitors for years to come.
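
Developers can tap some of that on-device machine learning through iOS 11's Core ML and Vision frameworks, which decide where a model runs on the chip. A brief sketch, assuming a hypothetical compiled Core ML model class named SceneClassifier (Xcode generates one like it for any .mlmodel added to a project):

```swift
import CoreML
import Vision

// Runs an image classifier entirely on-device; no data leaves the phone.
func classifyScene(in image: CGImage) throws {
    // SceneClassifier is a placeholder for an Xcode-generated model class.
    let model = try VNCoreMLModel(for: SceneClassifier().model)

    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first
        else { return }
        print("\(top.identifier): \(top.confidence)")
    }

    // Core ML schedules the work across the chip's compute units under the hood.
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```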

Permalink