Yesterday, the world shook with collective anticipation for the newest iPhone releases. Apple fanatics the world over held their breaths as the tenth-anniversary iPhone was unveiled, aptly named the iPhone X. As with all Apple keynotes, a slew of devices and new features were announced.* Most notable to me (Animojis were only a close second) was the smartphone's new Portrait Lighting mode, or more specifically, the supposed tech behind this new iPhone feature.
I firmly believe that good lighting is essential to great portraits, and that's what Apple promises to fix here. But it's the promise of the iPhone's mathematical face-mapping technology that really got me excited.
Introducing Portrait Lighting
At the Apple event, Portrait Lighting mode was announced as a way to shoot smartphone portraits with lighting comparable to studio illumination. It was heavily stressed that this is not a filter. To explain: a filter simply adds another layer on top of a photograph to get the desired result. Most beauty modes are achieved by adding a blur or a rosy hue over what the smartphone camera perceives to be the subject's face.
The Portrait Lighting mode, as explained, is the result of dual-camera technology and a new phone chip. The camera analyzes your face in real time as the photograph is being taken, building a depth map that separates the subject from the background. Facial landmarks are then identified, and machine learning manipulates the lighting based on them.
Yeah, I hardly caught that at the Apple event, so I'm sure it's even harder to follow in writing. Simply put: it's not an added layer masking out photo imperfections. It's an actual adjustment to the portrait, based on information the camera took in as it captured that photograph.
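To make that filter-versus-depth-map distinction concrete, here's a toy sketch in Python. The tiny 4x4 "image", the depth values, and the near/far threshold are all invented for illustration; Apple's real pipeline is of course far more sophisticated than brightening masked pixels.

```python
import numpy as np

# Toy 4x4 grayscale "photo" (values 0.0-1.0) and a matching depth map
# (smaller depth = closer to the camera). All values are invented.
image = np.array([
    [0.2, 0.2, 0.2, 0.2],
    [0.2, 0.5, 0.5, 0.2],
    [0.2, 0.5, 0.5, 0.2],
    [0.2, 0.2, 0.2, 0.2],
])
depth = np.array([
    [9.0, 9.0, 9.0, 9.0],
    [9.0, 1.0, 1.0, 9.0],
    [9.0, 1.0, 1.0, 9.0],
    [9.0, 9.0, 9.0, 9.0],
])

def apply_filter(img, gain=1.5):
    """A 'filter': one global adjustment slapped over the whole frame."""
    return np.clip(img * gain, 0.0, 1.0)

def relight_subject(img, depth_map, gain=1.5, near=2.0):
    """Depth-aware relighting: boost only the pixels the depth map says
    belong to the near subject, leaving the background untouched."""
    subject = depth_map < near      # crude subject/background split
    out = img.copy()
    out[subject] = np.clip(out[subject] * gain, 0.0, 1.0)
    return out

filtered = apply_filter(image)      # everything brightens
relit = relight_subject(image, depth)  # only the subject brightens
```

The difference shows in the corners: the filter brightens background pixels too, while the depth-aware version leaves them exactly as shot.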
This mode sounds great in itself, but aside from better-lit photographs and a slew of great studio lighting options, this technology should, in theory, improve the smartphone's selfie capabilities. The facial 3D-mapping technology used in Portrait Lighting mode could vastly improve background blur, a function I've yet to see any smartphone perfect.
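As a rough sketch of how a depth map could drive that background blur, here's another toy Python example. The naive 3x3 box blur and the depth threshold are stand-ins I've invented; a real camera pipeline would simulate proper lens bokeh instead.

```python
import numpy as np

def box_blur(img):
    """Naive 3x3 mean blur (edge pixels keep their value), a crude
    stand-in for the bokeh blur a real camera pipeline would apply."""
    out = img.copy()
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            out[y, x] = img[y - 1:y + 2, x - 1:x + 2].mean()
    return out

def portrait_blur(img, depth_map, near=2.0):
    """Blur only the background: pixels the depth map marks as near
    (the subject) stay sharp; the rest get blurred values."""
    subject = depth_map < near
    return np.where(subject, img, box_blur(img))

image = np.full((5, 5), 0.2)
image[2, 2] = 1.0               # one bright subject pixel
depth = np.full((5, 5), 9.0)    # everything far away...
depth[2, 2] = 1.0               # ...except the subject
result = portrait_blur(image, depth)
```

The subject pixel survives untouched, while the background pixels next to it get averaged with their neighbors and soften.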
Taking it a step further, this tech should be able to significantly improve existing beauty modes, applying skin smoothing and virtual make-up more precisely and, more importantly, more naturally. That is, if Apple finally decides to integrate a beauty mode into its camera app, a selfie function sorely lacking in all its devices.
The iPhone X as Selfie King
With the rise of selfie smartphones and an obvious refocus on smartphone cameras in the tech world, it seems Apple is also moving in this direction, albeit in its own way.
Again, and as usual, Apple fills our heads with grandiose promises of tech innovation. Admittedly, tech that uses 3D facial mapping is not new to the game. Whether Apple's particular execution will revolutionize portrait and selfie taking forever remains to be seen. As with all great technological leaps, it starts with incremental improvements such as this.
Truth be told, the iPhone X is going to need more than great lighting to compete in the selfie arena.** This shouldn't be hard, however, as Apple knows very well that the tech is already there; it's just a matter of proper application.
For now, at the very least, this face mapping technology is put to good use with Animojis.
*Apple non-believers would argue that these features aren't, in fact, new, since countless other Android phones have had similar tech.
**A built-in beauty mode would be a good start!