Apple wants to make it as easy as possible to use Portrait Mode — and that includes removing the effect from a photo if you don't like how it looks.
If you take a Portrait Mode photo but change your mind, you can remove the background blur after the fact. Just choose the photo, tap the edit button, and then tap "Portrait," which will be at the top of your screen. The photo will then return to looking like a standard iPhone shot.
Portrait Mode is available on iPhone 7 Plus, iPhone 8 Plus, and iPhone X. The latter two phones also have Portrait Lighting, which artificially adjusts the lighting around your subject to add different effects.
The iPhone 8 and 8 Plus are on sale now, and iPhone X arrives November 3.
Portrait Mode works best on people and still objects, and the feature has two main limitations: the amount of light present and your distance from the subject.
Portrait Mode doesn't work well, or at all, in low light. If it's too dark for the feature to work, a message will appear within your camera app letting you know.
It also won't work if you're too close to the subject you're trying to capture, and your phone will alert you to that as well. To take a Portrait Mode photo, you can't be any closer than 19 inches from your subject.
It's also worth noting that Portrait Mode works best when there's a lot of contrast between the subject and the background. If you're shooting a white coffee cup on a white counter, for instance, the sensors may have trouble deciding what should be in focus and what should be blurred.
The depth map created by the wide-angle lens is crucial to the end result, because it helps Apple's image signal processor figure out what should be sharp and what should be blurred.
The image above demonstrates what a standard iPhone photo looks like (left) and what a Portrait Mode photo looks like (right). At a quick glance, the image on the right seems like it just has a totally blurry background, but this is where the depth map comes into play.
In order to make the photo look natural and as close to a real DSLR photo as possible, Apple's image processor goes through the layers one by one and blurs them in varying amounts, an effect known as "bokeh."
The layers closer to the subject will be slightly sharper than the layers farthest away, and if you look closely at the above photo of my colleague Melia Robinson, you can tell: The stuff that's close to her in the photo — like the long grass and the slab of wood on the ground — is a lot easier to make out than the cliff in the distance, which is just a dark, blurry form.
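To make the layer-by-layer idea concrete, here's a minimal numpy sketch of that kind of depth-graded blur. This is not Apple's actual pipeline — the function names, the nine-band quantization, and the simple box blur are my own simplifications of the lens-accurate bokeh Apple's image processor computes — but it shows the core move: split the scene into depth layers and blur each layer more the farther away it is.

```python
import numpy as np

def box_blur(img, radius):
    """Simple separable box blur on a 2D grayscale image; radius 0 is a no-op."""
    if radius == 0:
        return img.copy()
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    out = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, img)
    out = np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="same"), 0, out)
    return out

def layered_bokeh(img, depth, layers=9, max_radius=8):
    """Quantize a 0..1 depth map into `layers` bands, then blur each band
    in proportion to its distance: band 0 (nearest) stays sharp, the
    farthest band gets the strongest blur."""
    bands = np.clip((depth * layers).astype(int), 0, layers - 1)
    out = np.zeros_like(img, dtype=float)
    for band in range(layers):
        radius = round(max_radius * band / (layers - 1))
        out[bands == band] = box_blur(img, radius)[bands == band]
    return out
```

A real implementation would blend the layer boundaries and shape the blur like a camera aperture, but even this crude version reproduces the visible effect in the photo above: near layers stay legible while far layers dissolve into soft shapes.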
Apple's Portrait Mode requires two lenses, and they're not identical: One is a 12-megapixel wide-angle lens, while the other is a 12-megapixel telephoto lens.
When taking a Portrait Mode photo, the two lenses serve different purposes.
The telephoto lens is what actually captures the image. While it's doing that, the wide-angle lens is busy capturing data about how far away everything in the scene is, which the phone then uses to create a nine-layer depth map.
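With two lenses a short distance apart, nearby objects shift more between the two views than distant ones (stereo disparity), and that shift can be binned into depth layers. As a hedged illustration — the function name and normalization are my own, not Apple's — here's how per-pixel disparity might be mapped to a nine-layer depth map:

```python
import numpy as np

def disparity_to_layers(disparity, layers=9):
    """Map per-pixel stereo disparity (larger = closer to the camera) to
    depth-layer indices: 0 = nearest layer, layers - 1 = farthest."""
    d = np.asarray(disparity, dtype=float)
    # Normalize so the closest pixel maps to 0.0 and the farthest to 1.0.
    norm = (d.max() - d) / max(d.max() - d.min(), 1e-9)
    return np.clip((norm * layers).astype(int), 0, layers - 1)
```

Apple's real depth estimation is far more involved (it has to match pixels between the two images and handle occlusions), but the output is the same shape of thing: a coarse map assigning every pixel to one of a handful of distance bands.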
Portrait Mode is only available on recent "Plus" models of Apple's iPhones — iPhone 7 Plus, iPhone 8 Plus, and the upcoming iPhone X — for a simple reason: Apple's version of Portrait Mode requires dual cameras.
Soon after Apple introduced Portrait Mode in 2016, similar features started popping up on other flagship phones, like Samsung's Galaxy Note 8 (where it's called Live Focus) and the Google Pixel (Lens Blur).
In the case of the Pixel phones, which only have one lens, Google relies on software to achieve that Portrait Mode quality. Apple's iPhones require two lenses to make it happen — at least for now. So if you buy the new iPhone 8, for instance, it will not have the ability to take Portrait Mode photos.
Source: Business Insider India