Apple announces iPhone 11 (Pro)
Major smartphone manufacturers introduce new models on a yearly cadence, and camera upgrades tend to be a major focus, with seemingly little else to differentiate new models from old ones. Often, what seem like small, incremental upgrades can have a significant impact on photo and video quality. The iPhone XS, for example, dramatically improved image quality in high contrast scenes thanks to the sensor’s ability to capture ‘inter-frames’ – short exposures in between longer ones – to improve dynamic range and reduce noise. Similarly, 4K video was improved with multi-frame HDR capture.
Last week, Apple announced numerous updates to the cameras in the iPhone 11, some of which will inevitably be seen as attempts to catch up with contemporary Android offerings. But taken together, we think they add up to meaningful upgrades that could make an already very capable camera one of the most compelling on the market.
See beyond your frame
The iPhone 11 offers a whopping 13mm equivalent (in 35mm terms) field of view with its ultra-wide, ‘0.5x’ lens. The iPhone 11 is the first Apple phone to feature an ultra-wide angle lens, a feature that’s been present on numerous Android phones. Wide angle lenses often add drama and a sense of depth to everything from landscapes and architecture to portraits and still life. They also allow for creative framing options, juxtaposing interesting foreground objects against distant ones, and they’re useful when you simply can’t step any further back from your subject.
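For reference, here's a quick back-of-the-envelope sketch of how a 35mm-equivalent focal length maps to a diagonal angle of view (using the ~43.3mm diagonal of a full frame sensor). These are our own figures, not Apple's specifications:

```python
import math

FULL_FRAME_DIAGONAL_MM = 43.27  # diagonal of a 36 x 24 mm frame

def diagonal_fov_degrees(equiv_focal_mm: float) -> float:
    """Diagonal angle of view for a given 35mm-equivalent focal length."""
    return math.degrees(2 * math.atan(FULL_FRAME_DIAGONAL_MM / (2 * equiv_focal_mm)))

print(f"13mm ultra-wide: {diagonal_fov_degrees(13):.0f} deg")  # ~118 deg
print(f"26mm main:       {diagonal_fov_degrees(26):.0f} deg")  # ~80 deg
```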
The iPhone 11 models alert you to the potential presence of objects of interest beyond your current framing by showing you the wider field-of-view within the camera user interface. Simply tap the ‘1x’ focal length multiplier button to zoom out (or zoom in, on the Pro models).
Refined image processing pipeline
Newer, faster processors often mean increased photo and video capability, and the iPhone 11 is no exception. Its image processing pipeline, which handles everything from auto white balance to auto exposure, autofocus, and image ‘development’, gets some new features: a 10-bit rendering pipeline upgraded from the previous 8-bit one, and the generation of a segmentation mask that isolates human subjects and faces, allowing for ‘semantic rendering’.
10-bit rendering should help render high dynamic range images without banding, which could otherwise result from the extreme tone-mapping adjustments required. Semantic rendering allows faces to be processed differently from other portions of the scene, allowing for more intelligent tone mapping and local contrast operations in images with human subjects (for example, faces can look ‘crunchy’ in high contrast scenes if local contrast is uniformly preserved across the entire image). The end result? More pleasing photos of people.
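To make the idea concrete, here is a minimal sketch of what segmentation-driven ('semantic') tone mapping might look like, assuming a face mask is already available. This is purely our own illustration of the concept, not Apple's pipeline:

```python
import numpy as np

def semantic_tone_map(luma: np.ndarray, face_mask: np.ndarray) -> np.ndarray:
    """Toy 'semantic rendering': apply a gentler tone curve to masked faces.

    luma:      float image in [0, 1]
    face_mask: float mask in [0, 1], 1.0 where a face was detected
    """
    # Aggressive contrast for the scene at large (can look 'crunchy' on skin)
    scene = np.clip((luma - 0.5) * 1.6 + 0.5, 0.0, 1.0)
    # Gentler curve for faces: lift shadows, compress contrast
    face = np.clip(luma ** 0.8, 0.0, 1.0)
    # Blend per pixel using the segmentation mask
    return face_mask * face + (1.0 - face_mask) * scene
```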
Night mode
The general principle of night modes on smartphones is to use burst photography to capture multiple frames. Averaging pixels from multiple exposures reduces noise, allowing the camera software to brighten the image with less noise penalty.
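The core idea fits in a few lines. This synthetic example assumes perfectly aligned frames and Gaussian noise, and simply shows that averaging N frames cuts noise by roughly the square root of N:

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((256, 256), 0.05)  # a dim, flat test patch

def noisy_frame():
    return scene + rng.normal(0, 0.02, scene.shape)  # per-frame noise

single = noisy_frame()
stacked = np.mean([noisy_frame() for _ in range(8)], axis=0)

print(f"noise (1 frame):  {single.std():.4f}")   # ~0.020
print(f"noise (8 frames): {stacked.std():.4f}")  # ~0.007, roughly 1/sqrt(8)
```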
Google set the bar for low light photography with its Night Sight mode. Other Android phones soon added their own similar modes, making the iPhone’s lack of such a mode particularly conspicuous (third party solutions like Hydra haven’t offered quite the level of detail as the best Android implementations, and of course require you to launch a separate app).
Apple has developed its own Night mode on the iPhone 11 phones, which turns on automatically under dim conditions. Apple’s approach is slightly different from Google’s, using ‘adaptive bracketing’ to capture and fuse multiple exposures with potentially differing shutter speeds (the Pixel takes a burst of images at the same shutter speed).
Varying shutter speeds to capture both short and long exposures can help reduce blur with moving subjects. Information from shorter exposures is used for moving subjects, while longer exposures – which are inherently brighter and contain less noise – can be used for static scene elements. Each frame is broken up into many small blocks before alignment and merging. Blocks that have too much motion blur are discarded, with a noise penalty resulting from fewer averaged frames for that scene element.
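Here's a heavily simplified sketch of that tile-based merge. It's our own illustration: real pipelines align tiles with sub-pixel precision and weight contributions rather than hard-rejecting them, but the structure is similar:

```python
import numpy as np

def merge_tiles(base: np.ndarray, frames: list, tile: int = 16, thresh: float = 0.05):
    """Average each tile across frames, skipping frames whose tile moved too much.

    base:   reference exposure (H x W float image)
    frames: other exposures, already globally aligned to the base frame
    """
    out = base.copy()
    h, w = base.shape
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            ref = base[y:y+tile, x:x+tile]
            stack = [ref]
            for f in frames:
                cand = f[y:y+tile, x:x+tile]
                # Reject tiles with too much motion (mean absolute difference)
                if np.abs(cand - ref).mean() < thresh:
                    stack.append(cand)
            # Fewer surviving tiles -> less averaging -> more noise in this block
            out[y:y+tile, x:x+tile] = np.mean(stack, axis=0)
    return out
```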
Deep Fusion
Google’s Night Sight mode isn’t just about better photos in low light. Night Sight uses burst photography and super resolution techniques to generate images with more detail, less noise, and less moiré thanks to the lack of demosaicing (slight shifts from frame to frame allow the camera to sample red, green and blue information at each pixel location). ‘Deep Fusion’, arriving in a software update later this year, appears to be Apple’s response to Google’s approach.
Deep Fusion captures up to nine frames and fuses them into a higher resolution 24MP image. Four short and four secondary frames are constantly buffered in memory, with older frames discarded to make room for newer ones. This buffering guarantees that the ‘base frame’ – the most important frame, to which all other frames are aligned – is taken as close to your shutter press as possible, ensuring very short or zero shutter lag so the camera captures the moment you intended.
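The zero-shutter-lag idea can be sketched with a plain ring buffer. Again, this is our own illustration of the general technique, not Apple's implementation:

```python
from collections import deque

class FrameBuffer:
    """Keep the most recent N frames so the 'base frame' predates the shutter press."""

    def __init__(self, capacity: int = 8):
        self.frames = deque(maxlen=capacity)  # old frames fall off automatically

    def on_new_frame(self, frame):
        self.frames.append(frame)  # called continuously by the preview stream

    def on_shutter_press(self):
        # The newest buffered frame was captured at (or just before) the press,
        # so it can serve as the base frame with effectively zero shutter lag.
        return list(self.frames)
```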
After you press the shutter, one long exposure is taken (presumably to reduce noise), and then all nine frames are combined – ‘fused’ – likely using a super resolution technique with tile-based alignment (as described above) to produce a high resolution image free of blur and ghosting. Apple’s SVP of Worldwide Marketing Phil Schiller also stated that it’s the ‘first time a neural engine is responsible for generating the output image’. We look forward to assessing the final results.
Portrait mode
Apple is famous for using its technologies to perfect ideas that other companies introduced, and that could be said of the iPhone’s Portrait mode. Fake blurred-background modes have been around for years, even on some compact cameras, but they were never convincing. By applying depth mapping technology to the problem, Apple made a portrait mode that worked, and soon the feature was everywhere. We’re slowly seeing a similar trend with portrait re-lighting: researchers and companies have been quick to develop relighting techniques, some of which don’t even require a depth map.
The iPhone 11 updates Portrait mode in a few significant ways. First, it offers the mode on the main 26mm equivalent camera, allowing for shallow depth-of-field wide-angle portraiture. The main camera module generally has the best autofocus, and its sensor has been updated with ‘100% focus pixels’ (hinting at a dual pixel design), so wide-angle portrait mode will benefit from this as well.
Second, on the Pro models, the telephoto lens used for more traditional portraits has been updated: its F2.0 aperture lets in 44% more light than the F2.4 aperture on previous telephoto modules. That’s a little over half a stop of extra light-gathering ability, which should help both image quality and autofocus in low light. Telephoto modules on most smartphone cameras have struggled with autofocus in dim conditions, hunting and producing misfocused shots, so this is a welcome change.
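The arithmetic behind those figures is simple: light gathered scales with the area of the aperture, which scales with the inverse square of the F-number:

```python
import math

old_f, new_f = 2.4, 2.0
light_ratio = (old_f / new_f) ** 2  # aperture area ratio
stops = math.log2(light_ratio)

print(f"{(light_ratio - 1) * 100:.0f}% more light")  # 44% more light
print(f"{stops:.2f} stops")                          # ~0.53 stops
```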
And thirdly…
Portrait relighting
The iPhone 11 offers a new portrait relighting option: ‘High-Key Light Mono’. This mode uses the depth map generated from the stereo pair of lenses to separate the subject from the background, blow out the background to white, and ‘re-light’ the subject while making the entire photo B&W. Presumably, the depth map can aid in identifying the distance of various facial landmarks, so that the relighting effect can emulate the results of a real studio light source (nearer parts of the face receive more light than farther ones). The result is a portrait intended to look as if it were shot under studio lighting.
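As a toy version of depth-aware relighting, the sketch below places a virtual light near the camera and brightens nearer parts of the subject more than farther ones. It is our own simplification; Apple's pipeline is far more sophisticated and likely relies on facial landmarks and learned models:

```python
import numpy as np

def high_key_mono(luma: np.ndarray, depth: np.ndarray, subject_mask: np.ndarray,
                  light_depth: float = 0.0) -> np.ndarray:
    """Toy 'High-Key Light Mono': white background, depth-weighted light on subject.

    luma:         grayscale image in [0, 1]
    depth:        per-pixel depth (smaller = closer to the camera/light)
    subject_mask: 1.0 on the subject, 0.0 on the background
    """
    # Inverse-square-style falloff from a virtual light placed near the camera:
    # nearer parts of the face receive more light than farther ones.
    falloff = 1.0 / (1.0 + (depth - light_depth) ** 2)
    relit = np.clip(luma * (0.6 + 0.8 * falloff), 0.0, 1.0)
    # Blow the background out to white
    return subject_mask * relit + (1.0 - subject_mask) * 1.0
```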
We’ve now talked a bit about the new features the iPhone 11 brings to the table, but let’s turn our attention to the ways in which iPhone cameras already lead the industry, in some cases setting standards along the way.
Sublime rendering
Straight out of camera, iPhone photos are, simply put, sublime. Look at the iPhone XS shot above: with a single button press, I’ve captured the bright background and it looks as if my daughter has some fill light on her. If you want to get technical: white balance is well judged (not too cool), skintones are great (not too green), and wide dynamic ranges are preserved without leading to crunchy results, a problem that can arise from tone-mapping a large global contrast range while retaining local contrast.
Much of this is thanks to segmented processing techniques that treat human subjects differently from the rest of the scene. Digging deeper and looking at images at the pixel level, Apple’s JPEG engine could do a better job of balancing noise reduction and sharpening: images often appear overly smoothed in some areas, with aggressive sharpening and overshoot in others. This may be partly because results have been optimized for display on high-DPI Retina devices, and a Raw option – one that still utilizes all the computational multi-frame ‘smarts’ – would go a long way toward remedying this for enthusiasts and pros.
But it’s hard to argue that the iPhone’s default color and rendering aren’t pleasing. In our opinion, Apple’s white balance, color rendering, and tone-mapping are second to none. The improvements to image detail, particularly from Apple’s as-yet unreleased ‘Deep Fusion’ mode, should (we hope) address many of our remaining reservations about pixel-level image quality.
HDR Photos
No, not the HDR you’re thinking of, which creates flat images from large dynamic range scenes. We’re talking about HDR display of HDR captures – think HDR10 and Dolby Vision presentations of 4K UHD video. Traditionally, when capturing a high contrast scene, we had two processing options for print or for display on old, dim 100-nit monitors: (1) preserve global contrast, often at the cost of local contrast, leading to flat results; or (2) preserve local contrast, often requiring clipping of shadows and highlights to keep the image from looking too unnatural. The latter is what most traditional camera JPEG engines do in the absence of dynamic range compensation modes.
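Those two traditional options can be sketched in a few lines. This is our own illustration of the trade-off, using a crude unsharp-mask style detail layer to stand in for local contrast:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def preserve_global(luma: np.ndarray) -> np.ndarray:
    """Option 1: gently compress the whole range -> nothing clips, but looks flat."""
    return luma ** (1 / 2.2)

def preserve_local(luma: np.ndarray) -> np.ndarray:
    """Option 2: keep local contrast via a detail layer -> punchy, but clips."""
    base = gaussian_filter(luma, sigma=25)   # low-frequency 'base' layer
    detail = luma - base                     # high-frequency local contrast
    return np.clip(base ** (1 / 2.2) + 2.0 * detail, 0.0, 1.0)
```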
With the advent of new display technology like OLED, capable of 10x or higher brightness compared to old displays and print, as well as nearly infinite contrast, that trade-off need no longer exist. The iPhone X was the first device to support the HDR display of HDR photos. Since then, iPhones have been able to capture a wide dynamic range and color gamut and then display it using the full range of their class-leading OLED displays. This means HDR photos need not look flat: they retain large global contrast, from deep shadows to bright highlights, while still looking contrasty, with pop, and without clipping tones and colors – getting closer to reproducing the range of tones and colors we see in the real world.
It’s hard to show the effect, and much easier to experience it in person, but in the photo above we’ve used a camera to shoot an iPhone XS displaying HDR (left) vs. non-HDR (right) versions of the same photo. Note how the HDR photo has brighter highlights and darker midtones, creating the impression that the sky is much brighter than the subject (which it is!). The bright displays on modern phones mean that the subject doesn’t look too dark compared to the non-HDR version; she simply looks more appropriately balanced against the sky, rather than appearing almost as bright as it.
Wide color (P3)
Apple is also leading the field in wide gamut photography. Ditching the age-old sRGB color space, iPhones can now fully utilize the P3 color gamut, which means images can contain far more saturated colors – in particular, more vivid reds, oranges, yellows and greens. You won’t see them in the image above because of the way our content management system operates, but if you have a P3 display and a color managed workflow, you can download and view the original image here. Or take a look at this P3 vs. sRGB rollover here on your iPhone or any recent Mac.
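One quick way to quantify the difference is to compare the area of the two gamut triangles in CIE 1931 xy chromaticity, using the published sRGB and Display P3 primaries. Note the exact percentage depends on the chromaticity space used; figures quoted elsewhere are often computed in CIE 1976 u'v' instead:

```python
import numpy as np

def xy_area(primaries: np.ndarray) -> float:
    """Area of a gamut triangle in CIE 1931 xy chromaticity (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return 0.5 * abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))

srgb = np.array([[0.640, 0.330], [0.300, 0.600], [0.150, 0.060]])        # R, G, B
display_p3 = np.array([[0.680, 0.320], [0.265, 0.690], [0.150, 0.060]])  # R, G, B

ratio = xy_area(display_p3) / xy_area(srgb)
print(f"Display P3 triangle is {100 * (ratio - 1):.0f}% larger than sRGB's")  # ~36%
```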
Apple is not only taking advantage of the extra colors of the P3 color space, it’s also encoding its images in the High Efficiency Image Format (HEIF): a more efficient format intended to replace JPEG that also allows for 10-bit color encoding (more colors without banding) and HDR encoding, so a larger range of tones can be shown on HDR displays.
Video
The video quality from the iPhone XS was already class-leading, thanks to the use of a high quality video codec and advanced compression techniques that suppressed common artifacts like macro-blocking and mosquito noise. 4K video up to 30 fps also had high dynamic range capture: fusing both bright and dark frames together to capture a wider range of contrast. Check out this frame grab of a very high contrast scene, with complex motion due to constant camera rotation. Note the lack of any obvious artifacts.
The iPhone 11 takes things further by offering extended dynamic range (EDR) for 4K 60p capture. Android devices are still limited to standard dynamic range capture. While a few Android devices offer a high dynamic range (HDR) video output format – such as HDR10 or HLG – to take advantage of the larger contrast of recent displays, without HDR capture techniques (fusion of multiple frames) this benefit is limited.
To sum up: we expect that the iPhone 11 will offer the highest quality video available in a smartphone, with natural looking footage in even high contrast scenes, now at the highest resolution and frame rates. We do hope Apple pairs its HDR video capture with an HDR video output format, like HDR10 or HLG. This would allow for a better viewing experience of the extended dynamic range (EDR) of video capture on the latest iPhones, with more pop and a wider color gamut.
Video experience
Apple is also looking to change the experience of shooting video in the iPhone 11 models. First, it’s easier to record video than it was in previous iterations: just hold down the shutter button in stills mode to start shooting a video. Instant video recording helps you capture the moment, rather than miss it as you switch camera modes.
Perhaps more exciting are the new video editing tools built right into the camera app. These allow for easy adjustment of image parameters like exposure, highlights, shadows, contrast and color, and the interface appears to be as intuitive as ever.
Multiple angles and ‘Pro’ capture
Advanced video shooters are familiar with the FiLMiC Pro app, which allows for complete creative control over movie shooting. FiLMiC’s CEO was invited on stage to talk about some of the new features, and one of the coolest was the ability to record multiple streams from multiple cameras. The app shows the streams from all four cameras (on the Pro), allowing you to choose which ones to record. You can even record both participants of a face-to-face conversation using the front and rear cameras. This opens up new possibilities for creative framing, and in some cases obviates the need for A/B cameras.
Currently it’s unclear how many total streams can be recorded simultaneously, but even two simultaneous streams opens up creative possibilities. Some of this capability will come retroactively to 2018 flagship devices, as we describe here.
Conclusion
Much of the sentiment after the launch of the iPhone 11 has centered on how Apple is playing catch-up with Android devices. And that’s somewhat true: ultra-wide angle lenses, night modes, and super-resolution burst photography have all appeared on multiple Android devices, with Google and Huawei leading the pack. No one is standing still, and the next iterations from these companies – and others – will likely leapfrog each other’s capabilities even further.
Even if Apple is playing catch up in some regards though, it’s leading in others, and we suspect that when they ship, the combination of old features and new – like Deep Fusion and Night mode – will make the iPhone 11 models among the most compelling smartphone cameras on the market.
As the newest iPhone, the iPhone 11 camera is inevitably the best Apple has made. But is the iPhone 11 Pro the best smartphone camera around right now? We’ll have to wait until we have one in our hands. And, of course, the Google Pixel 4 is a wildcard, and just around the corner…