
iPhone 11’s Portrait Mode sets a high bar for the Pixel 4

12 Oct
Taken with iPhone 11 | ISO 500 | 1/30 sec | F1.8

The bokeh-imitation effect that’s all over your Instagram feed is a few generations old, but it’s still a relatively young technology. Portrait Mode, as Apple calls it, is a computational feature that mimics the shallow depth of field closely associated with professional portrait photography. The latest iteration in the iPhone 11 is a great leap forward and, when compared with Google’s Pixel 3, shows that the search-engine giant is going to have to do something pretty special with the forthcoming Pixel 4.

Just to make sure you’re caught up – phone sensors and the lenses they are coupled with are quite small, and inherently limited in their ability to create a blurry background behind a subject. Hence, portrait mode was born (Portrait Mode is Apple’s proprietary name, but for the sake of simplicity I’ll use it throughout this article to refer to all such modes).
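At its core, every portrait mode does the same thing: keep the subject sharp and composite it over a blurred copy of the scene. A minimal numpy sketch of that compositing step, assuming a subject mask is already available (real phones derive the mask from depth data and use lens-modelled blur rather than the naive box blur here):

```python
import numpy as np

def fake_bokeh(image, subject_mask, blur_radius=4):
    """Composite a sharp subject over a blurred background.

    image:        (H, W) grayscale array, float in [0, 1]
    subject_mask: (H, W) boolean array, True where the subject is
    blur_radius:  half-width of a simple box blur, standing in for
                  the lens-modelled blur real phones compute
    """
    H, W = image.shape
    blurred = np.empty_like(image)
    k = blur_radius
    # Naive box blur: average each pixel's neighborhood.
    for y in range(H):
        for x in range(W):
            y0, y1 = max(0, y - k), min(H, y + k + 1)
            x0, x1 = max(0, x - k), min(W, x + k + 1)
            blurred[y, x] = image[y0:y1, x0:x1].mean()
    # Keep the subject sharp, blur everything else.
    return np.where(subject_mask, image, blurred)
```

The hard part on a phone isn't the blur — it's building the mask, which is where the depth-mapping approaches discussed below come in.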


Like so many first-generation technologies, portrait mode was a bit dodgy at first – subjects were poorly separated from their backgrounds, and results looked passable but not quite convincing. But no matter where you stand on its current state, from “It’s so terrible it’s an insult to photographers” to “Eh, it’s passable,” there’s no denying that it has steadily improved with each generation.

Apple, like most manufacturers, introduced Portrait Mode when it brought dual cameras to its devices. However, Google chose to offer it with a single camera, relying on dual pixel depth data, machine learning, and up-sampling to create that fake bokeh look. The results looked fine until, well, about now.

Compared side-by-side with results from the iPhone 11, the Pixel 3 has been surpassed in many respects. Here are the areas in which the iPhone 11 pulls clearly ahead of the Pixel – and where Google needs to do some catching up in the Pixel 4.

The nitty gritty details

Google achieves its portrait mode by digitally zooming in to mimic a longer focal length, then creating a depth map from dual pixel data and a learning-based algorithm that judges the distance to a subject and separates it from its background, up-sampling the final result to a full 12MP resolution. Apple (along with Samsung, Huawei and others) instead uses its telephoto camera, calculating depth from the perspective offset between the telephoto and wide cameras – no cropping or up-sampling needed.
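Whether the offset comes from two cameras or from the two halves of a dual pixel, the underlying geometry is classic stereo triangulation: depth is the focal length times the baseline, divided by the disparity. A toy sketch of that relationship, with illustrative numbers rather than real hardware parameters:

```python
# Pinhole-camera stereo: depth is inversely proportional to disparity.
# The focal length and baseline below are made up for illustration,
# not real iPhone or Pixel parameters.

def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Depth (mm) of a feature from its pixel offset between two views."""
    if disparity_px <= 0:
        raise ValueError("feature must shift between the two views")
    return focal_px * baseline_mm / disparity_px

# Two cameras 10 mm apart with a 2800 px focal length:
# a subject whose features shift 28 px between views is ~1 m away,
# while background features shifting only 7 px are ~4 m away.
near = depth_from_disparity(28, focal_px=2800, baseline_mm=10)  # 1000.0 mm
far = depth_from_disparity(7, focal_px=2800, baseline_mm=10)    # 4000.0 mm
```

This is also why the dual-camera approach has an edge: a baseline of millimeters between two cameras produces far larger, easier-to-measure disparities than the sub-millimeter baseline between the halves of a dual pixel.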

The images below demonstrate the difference – note that to match the subject’s size in the frame using the two different focal lengths, the 3a image was taken from about a meter farther back than the iPhone 11.

Of course the vast majority of portrait mode images will be viewed on a phone or computer screen, where the difference in detail is much harder to spot. Still, looking at the images above at even a 50% crop shows a vast difference in the level of detail captured, and all things being equal we’d much rather have more detail than less.

Backlit subjects

We’ve previously noted the Pixel 3’s fantastic ability to render high-contrast scenes, but one place this falls flat is with backlit portrait subjects. The camera’s tendency to preserve highlight detail and push up shadows is normally what we’d prefer, but it doesn’t work well when the shadows are your main subject.

Pixel 3a | iPhone 11

The resulting image shows that the Pixel does a poor job of rendering the cat’s orange fur, giving him an overall ‘crunchy’ look in comparison to the more pleasing rendering from the iPhone. In our testing, the Pixel 3 has consistently shown this tendency to expose for highlights, even when it would do better to choose an exposure suited to your human or feline subject at the cost of some highlight detail. Even tapping the subject’s face doesn’t adjust the exposure as much as we’d like.

Skin tones

The most sophisticated depth mapping in the world won’t save an image from bad-looking skin tones, and this is one area where Google really needs to catch up. The subject below is lit by window light that’s much cooler than the yellow lights of the kitchen behind him. It’s a tricky situation for sure, but the iPhone has clearly made the right call to warm up the subject’s skin tone rather than preserve the cool cast of the window light.

Pixel 3a | iPhone 11

To be fair, both of these phones are susceptible to producing noticeably different colors based on slight shifts in framing, or using a different camera mode like Night Sight. But over the course of much use, we’ve seen that the Pixel 3’s standard camera mode renders skin tones particularly poorly by comparison.

Apple’s face smoothing and skin tone rendering have a tendency to go too far in some situations, and there are times when we prefer the more faithful color rendering of the Pixel. It’s also pretty easy to correct the Pixel’s skin tones in the phone’s own Photos app, but we’re betting that most people don’t want to (and won’t) take the time to color correct every portrait they take.

Focal length flexibility for portrait mode

Taken with iPhone 11 | ISO 200 | 1/60 sec | F1.8

Apple’s iPhone XR introduced wide-angle Portrait Mode, but the 11 and 11 Pro improve on it with more accurate depth maps, thanks to the availability of the ultra-wide lens. Thus, the 11 offers a very good wide Portrait Mode via its standard lens, and the 11 Pro offers both telephoto and wide portrait options.

Whether you like the look of a telephoto or wide portrait is of course a matter of taste, and I tend to prefer the wide Portrait Mode on the iPhone 11. I like an across-the-table environmental portrait, which usually requires backing up if I’m stuck with the telephoto lens or the crop imposed by the Pixel 3.

Whether or not you like the crop, having it forced on you is less flexible, and in my book I’d rather have the wide-angle option – I’m sure I’m not alone.

Your move, Google

To be fair, there are things we prefer about the Pixel 3’s portrait mode. We find it’s much less prone to obvious errors in cutting around human subjects than the iPhone 11. I also far prefer using Google Photos to Apple’s iCloud, so the seamless integration with my photo archive is a big plus.


We can also say with some certainty based on leaks and rumors that the Pixel 4 will address some of these shortcomings. We know that the device will offer more cameras, which will likely improve portrait mode. Whether we’ll see improvements to skin tones or better handling of backlit subjects is less certain, though encouragingly, leaked photos do show better rendering of skin tones. All will be revealed soon, but one thing is for sure – healthy competition between two big tech companies keeps pushing phone camera technology forward at a rapid pace, and that’s nothing but good news for the photo-taking public.

Articles: Digital Photography Review (dpreview.com)

 

iPhone 11’s coolest photo feature is the hardest one to find

05 Oct
Cinerama leaning back – a natural result of pointing my camera upwards to capture the whole building.

Anyone who has stood at ground level and taken a photo of a building across the street has likely seen the effects of perspective distortion – you tilt your camera back to bring the whole building into frame, causing the straight lines of the building to appear to be ‘leaning back.’ Tilt-shift lenses are designed for exactly this problem, but they’re expensive, specialist optics.

More often, this effect will be corrected in software, but doing so usually requires the user to stretch the top of the image and crop to avoid the blank spaces this creates at the bottom of the frame. Apple is tackling this problem with a unique approach in the iPhone 11: by capturing more data outside of the frame.
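The stretch-and-crop correction described above is, at heart, a homography: a projective transform that maps the leaning facade’s four corners back onto a rectangle. A minimal numpy sketch with made-up corner coordinates (real editors solve the same system, then resample the image through it):

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 projective transform mapping 4 src points to 4 dst points."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Apply the homography to one point (divide out the projective scale)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# A facade that 'leans back': its top edge appears narrower than its base.
# Corner order: bottom-left, bottom-right, top-left, top-right.
leaning = [(0, 300), (400, 300), (120, 0), (280, 0)]
upright = [(0, 300), (400, 300), (0, 0), (400, 0)]
H = homography(leaning, upright)
```

Note that the corrected rectangle is wider at the top than the pixels the camera actually captured – which is exactly the hole Apple fills with the extra data from the wider lens.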

I don’t know, I just like boring photos I guess?

For whatever reason, I’m drawn to the types of photos where perspective distortion is painfully obvious – signs, sides of buildings, etc. – but I’m horrible at lining them up correctly. Usually I find out later, while going through my images, that I wasn’t squared up to my subject even though I thought I was. Horizons are slightly askew, or I was leaning back slightly. Apple, it seems, has heard my cries.

When you’re shooting with the standard camera (with a focal length equivalent to about 26mm), the iPhone 11 will also capture image data from the ultra-wide (13mm equiv.) camera – a feature that is referred to in the settings menu as “Photos Capture Outside the Frame.” If you’re shooting on the telephoto camera of the 11 Pro, it’ll capture additional information from the standard camera.

That extra information is saved alongside your photo. When you edit that image in the native camera app, you’ll be able to use the extra data as you rotate and manipulate your image – a big help when you’re trying to fix crooked lines in a photo.

As you make image adjustments, you’ll see the extra data captured by the ultra-wide lens. This additional image information is available for 30 days.

The phone can use that information to automatically re-crop photos too. In the camera settings menu there’s an option to “Auto Apply Adjustments.” You’ll know that auto adjustments have been applied to an image when it shows a blue “Auto” icon above your captured photo. We’ve noticed this feature being employed when the phone detects a human subject cut off at the edge of the frame.

And even for many photos that aren’t automatically adjusted, the stock camera app will suggest tweaks when brought into edit. For example, take that image of the building that’s leaning back – if you edit it in the iPhone’s camera app and engage the crop tool, it will automatically correct for perspective distortion and use the extra image data it saved to fill in the areas at the edges of the frame that would otherwise need to be cropped out.

Bringing the image into the iPhone’s native editing app, then pressing the ‘crop’ option will take you to this view. The yellow ‘auto’ icon appears at the top of the image if there’s a suggested crop, as there is in this example.

The same adjustments can be applied in Photoshop, but without that extra image information at the sides of the frame you’ll need to crop in to avoid including blank space in your final image.

The iPhone goes beyond these limitations with that extra image data. In addition to correcting perspective, you can creatively re-crop your image to preserve details at the edge of the frame – and even include objects that were well outside of the frame in your initial standard image.

I don’t think many people will discover this feature, and that’s a shame. It’s not just helpful for correcting distortion and fixing crooked horizons – it’s useful any time you want to re-crop an image after the fact. However, it will only be discovered by those who enable the ‘capture outside the frame’ setting and then attempt to crop an image, which I imagine is a fraction of the many people who use the camera day in and day out.

Regardless of how widely used this feature ends up being, what Apple is doing is clever. Photoshop’s Content-Aware Fill does something similar – it fills in missing data when rotating or stretching an image – but instead of using data from a wider lens, it fills those empty spaces with educated guesses. Apple’s approach is just one more way in which smartphone manufacturers are using data to their advantage – and to the advantage of boring photo fans everywhere.

Articles: Digital Photography Review (dpreview.com)

 