Apple wastes no time showing off what its hardware can do with its newest smartphone features. One function in particular caught my eye — and for good reason. The newly introduced Portrait Lighting mode promises to let casual smartphone users like yours truly capture photographs with studio-style lighting effects on the new iPhones.
How does it work, exactly? According to the announcement, the iPhone cameras sense the scene and create a depth map that separates the subject from the background. Machine learning then detects facial landmarks and adjusts the lighting on the contours of your face. Apple stresses that the effect isn’t a filter; it’s the result of real-time analysis as the photo is being taken.
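To make the depth-map idea concrete, here’s a toy sketch — emphatically not Apple’s actual pipeline, and every name and number in it is illustrative. Pixels closer than some depth threshold are treated as the subject, and a stage-light-style effect keeps the subject lit while blacking out the background:

```python
# Toy illustration of depth-based subject separation (assumed logic,
# not Apple's implementation). Depth values are in metres; brightness
# values are simple 0-255 greyscale numbers.

DEPTH_THRESHOLD = 1.5  # assumed cutoff between "subject" and "background"

def segment_subject(depth_map, threshold=DEPTH_THRESHOLD):
    """Return a boolean mask: True where a pixel belongs to the subject."""
    return [[d < threshold for d in row] for row in depth_map]

def stage_light(brightness, mask):
    """Keep subject pixels as-is; darken background pixels to black (0)."""
    return [
        [px if keep else 0 for px, keep in zip(b_row, m_row)]
        for b_row, m_row in zip(brightness, mask)
    ]

# 3x3 example: the centre pixel is near the camera (the subject),
# everything else is far away (the background).
depth = [
    [3.0, 3.0, 3.0],
    [3.0, 0.8, 3.0],
    [3.0, 3.0, 3.0],
]
image = [[200, 200, 200] for _ in range(3)]

mask = segment_subject(depth)
lit = stage_light(image, mask)
# Only the centre pixel keeps its brightness; the background goes black.
```

The real feature layers facial-landmark analysis on top of this kind of segmentation, but the basic idea — use depth to decide what counts as the subject, then relight accordingly — is the same.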
What does that all mean? Simply that iPhone users are about to get portraits with really great lighting — if we can afford the new iPhones.
Portrait Lighting mode choices include Natural Light, Studio Light, Contour Light, Stage Light, and a monochrome Stage Light Mono setting.