
Posts Tagged ‘Pixel’

The Pixel 4 can’t beat a compact camera, but that doesn’t matter

13 Nov

With its newly improved Super Resolution Zoom, the Pixel 4 makes a case for itself as a replacement for a compact camera with a 4-6x zoom range. The kind you might bring on vacation – something with a sensor that’s a little bigger and a modest zoom that won’t be too cumbersome while you explore your destination.

I took the Pixel 4 as my primary camera on a recent trip, but just to satisfy my curiosity, packed the Canon PowerShot G5 X Mark II alongside it. The Pixel 4 fell short in a couple of ways, but overall it did the job well enough that I wouldn’t have regretted taking it as my only camera. Here’s what it did well, what the dedicated camera still does best, and why I think those differences don’t matter much to most people who take pictures.

A military fort-turned-prison is kind of a weird place to take someone for their birthday, but my fiancé is into that kind of thing. Plus, it was a great excuse to quote Sean Connery saying ‘Welcome to the Rock’ for several weeks leading up to the trip. I’d been to Alcatraz before, so I was happy to spend a little more effort and concentration on taking photos.

As you might imagine, a jailhouse provides lots of low-light photography opportunities – a task the Pixel 4 is well equipped for. Night Sight does a little bit of computational magic to create surprisingly detailed images in low light (and good light, for that matter). But even the default camera mode does a very nice job in dim conditions, thanks to its ability to capture multiple frames, analyze them and assemble the best bits into one final image on the fly. In fact, it outperformed the Canon G5 X II in the situations where I tested both.

The moderately low light images below show the Pixel 4 producing a slightly more detailed, less noise-smudged image in its standard camera mode versus the Canon G5 X II’s out-of-camera JPEG.

Zoom is another story. Google has improved the Super Resolution digital zoom in the Pixel 4, boosting image quality thanks to a combination of clever algorithms and the new telephoto lens. The company claims that the camera will produce decent results up to 6x zoom, but admits that zoom is a difficult problem to solve with current technology.

Absolutely nobody is claiming that the phone’s 4-6x zoomed images can take on a traditional camera’s zoom pixel-for-pixel, but because I’m curious I checked it out anyway. Both cameras are at 5x zoom in the example below (about 135mm equiv. for the Pixel 4 and 122mm equiv. on the G5 X II).

The difference is obvious in the 100% crops above, and can easily be seen even at 50% – but then again, how often will those photos be viewed on anything bigger than a computer screen? If I planned on making prints of these images, I’d still want a traditional optical zoom. But I rarely print images, and I suspect I’m in the majority of the picture-taking public.

There was one more Pixel 4 camera feature that I found myself relying on that the G5 X II doesn’t offer: Dual Exposure Controls, which doesn’t mean what you think it means.

Dual Exposure Controls offer a higher level of control over shadows and brightness, with the ability to adjust them independently of each other, all before image capture

An advanced compact such as the G5 X II provides plenty of manual control over exposure settings. What it doesn’t provide is the ability to finely tune shadows and brightness before you press the shutter; instead, you can only select low, medium or high levels of its Auto Lighting Optimizer.

The Pixel 4’s Dual Exposure Controls give you direct control over shadows and brightness, along with the ability to adjust them independently of each other, all before image capture. This phone and previous Google devices already expose automatically for backlit subjects and high-contrast scenes, but the dual controls let you increase or minimize the effect, depending on what you want.

The Pixel 4’s Dual Exposure Controls allowed me to slightly boost shadows in this image before pressing the shutter.

Of course the G5 X II offers plenty of editing flexibility with in-camera Raw processing, but control over settings is limited. For anything more advanced than some basic tweaks, you’ll need to take your Raw images into Lightroom or the like. On the Pixel 4, it all happens in-camera.

This potentially changes how you approach a high-contrast scene. Normally I’d expose for the highlights and bring up the shadows later, which works well but leaves me without an image to share now. This is annoying because social media has robbed me of any patience I once had. The Pixel 4 lets me make those adjustments before I take the photo – rather than having to wait until I can process the image later.

If I were keeping score, I could award a lot more points in favor of either device. Color science backed by decades of fine tuning, better picture-taking ergonomics, a flip-out touchscreen for low-angle shots: all points for the traditional camera. Integrated photo storage, seamless image sharing, always in your pocket: point, point, point for the Pixel 4.

What speaks louder than any arbitrary score-keeping though is the fact that I saw few, if any, compact cameras among my fellow tourists at Alcatraz. I saw mirrorless cameras, DSLRs, a few superzoom cameras and of course, lots of phones. To most of the photo-taking population though, the compact camera – even a really nice compact camera – is already history.

Articles: Digital Photography Review (dpreview.com)

 

 

Google’s Pixel 4 Astrophotography mode is now available on Pixel 2, 3 and 3a devices

07 Nov

The Google Pixel 4 offers a range of new and innovative camera features. Some of them are, mainly due to hardware requirements, exclusive to the latest Pixel device, but Google has promised to make others available for older Pixel models.

This has now happened in the case of the Pixel 4 Astrophotography mode. The function had previously been made available for older Pixels via a community-driven development effort, but it’s now officially supported on older devices in version 7.2 of the Google Camera app. Below are a few sample photos captured with the Astrophotography mode on the Pixel 4:

[Embedded sample gallery]

Users of the Pixel 2 and Pixel 3 series, including the Pixel 3a, can now use the feature after updating to the latest version of the app. The Astrophotography option builds on Google’s Night Sight technology, capturing and combining several frames to achieve a clean, detailed exposure with low noise when photographing the night sky.

Articles: Digital Photography Review (dpreview.com)

 

 

Pixel peeping the everyday: Explore our Google Pixel 4 sample gallery

29 Oct

Google’s Pixel 4 promises many clever benefits thanks to computational photography, but what does that mean when you go out and shoot with it?

We’ve been using a Pixel 4 as our always-with-us camera for the past week, trying out many of the modes and unique features available to it, including ‘Night Sight’, ‘Dual Exposure Controls’ and multiple zoom ratios. We’ve captioned each image with the mode it was shot in. Have a look at the results.

We’ve also included Raw files for download, which are the result of Google’s burst photography modes with its robust align and merge algorithms.

[Embedded sample gallery]

Articles: Digital Photography Review (dpreview.com)

 

 

Google’s Dual Exposure Controls and Live HDR+ features remain exclusive to Pixel 4

22 Oct

Google’s brand-new flagship smartphone, the Pixel 4, comes with a variety of new and innovative camera features. Google has already said that some of those features, for example the astrophotography mode, will be made available on older Pixel devices via software updates.

However, two of the new functions will be reserved for the latest generation of Pixel devices: Dual Exposure Controls, which let you adjust overall brightness and shadows via sliders in the camera app, and Live HDR+, which gives you a WYSIWYG live preview of Google’s HDR+ processing. Google confirmed this in a tweet:

According to the tweet, the reason these two features won’t be available on the Pixel 3 and older devices is down to hardware limitations rather than a marketing decision. It appears the older phones simply don’t have enough processing oomph to display the HDR+ treatment and manual adjustments in real time.

Articles: Digital Photography Review (dpreview.com)

 

 

iPhone users get free unlimited Google Photos storage for original photo files, for now, while Pixel 4 owners have to pay up

22 Oct

All Android phones running Google apps come with free Google Photos cloud storage for images and videos captured with the device. However, image and video files are not stored at original quality. Instead, they are compressed to what Google calls ‘high quality’. Users who prefer to store their original out-of-camera files in the cloud have to shell out for one of Google’s storage plans.

On its own Pixel devices, Google has in the past made an exception. Users of the Pixel 3 and previous Pixel devices could store unlimited original files for life, but this perk has ended with the brand new Pixel 4. Users of the latest Google flagship will simply be treated like users of any other Android phone.

Now it seems the only ones benefiting from Google’s free unlimited storage for original photos are users of Apple’s recent iPhones. Reddit user u/stephenvsawyer discovered that images in the HEIC/HEIF format, which recent iPhones use by default, are stored without any compression.

A screenshot from Google’s Pixel 2 promotional page, captured at the time of release, showing the now-discontinued perk of free unlimited storage of original files.

The reason for this is pretty simple: if Google tried to compress the images, the file size would actually increase. So the decision to save original HEIC/HEIF files to its cloud platform saves Google both storage space on its servers and computing power. It’s worth noting that this only applies to photos. iPhone videos are saved at 1080p resolution, even if they were recorded at 4K settings.

The latest version of the Android OS, Android 10, technically supports the HEIC/HEIF format, but Pixel 4 devices don’t currently offer this option. So, at least for now, iPhone users are actually getting more out of Google Photos than users of Google’s own flagship phone.

In a statement made to Android Police, Google said ‘We are aware of this bug and are working to fix it.’ What exactly this ‘fix’ entails remains to be seen, but there’s a good chance this iPhone loophole could get closed down in the near future.

Articles: Digital Photography Review (dpreview.com)

 

 

Google ends free ‘original quality’ image backups for the Pixel 4, Pixel 4 XL

20 Oct

The newly unveiled Google Pixel 4 and Pixel 4 XL smartphones will not include three years of free ‘original quality’ Google Photos storage, the company has confirmed. Details about the change were quietly listed on the Google Store’s Pixel 4 product page following the company’s press event on Tuesday, revealing an elimination of the perk Google has offered since the launch of its original Pixel model.

All Android mobile devices come with free Google Photos storage for images and videos captured with the handset, but there’s a catch: the content is compressed from its original quality down to ‘high quality.’


The Pixel smartphone line has remained notable among its peers by offering atypically excellent camera quality, particularly in low-light environments. Before the Pixel 4, Google relied on computational photography, not extra lenses, to give its phones an edge. This time around, however, Google has taken steps to remain competitive with Apple by packing more than one camera into its newly unveiled Pixel 4 devices.

Many consumers, particularly photographers who prefer Android over iOS, have anticipated the launch of this phone specifically for its mobile camera capabilities. That makes Google’s decision to end its free ‘original quality’ photo storage particularly baffling. Buyers must either sign up for a paid storage plan or settle for compressed backups.

As recently noted by XDA, the Google Store’s Pixel 4 page reads, ‘Never worry about storing, finding, or sharing your memories thanks to unlimited storage in high quality on Google Photos.’ That feature comes with a small disclaimer that states:

Google Photos offers free unlimited online storage for all photos and videos uploaded in high quality. Photos and videos uploaded in high quality may be compressed or resized. Requires Google Account. Data rates may apply.

Google offers multiple cloud storage plans under its Google One subscription, which starts at $1.99/month for 100GB of storage if you pay annually. The Pixel 4 smartphone is available to preorder from the Google Store now for $799.


Update (October 16, 2019): Corrected pricing of the entry-level Google One subscription plan.

Articles: Digital Photography Review (dpreview.com)

 

 

The Google Pixel 4 Will Feature Two Cameras Plus Enhanced Night Sight

19 Oct


Earlier this week Google announced the long-awaited Pixel 4, which promises to take smartphone photography to a whole new level.

This comes in the wake of Apple’s iPhone 11 Pro announcement last month, which saw the debut of a triple-camera setup and features such as Night Mode.

In other words, the Pixel 4 is a competitor in an intense fight to create the best cameras, the best lenses, and the best camera software.

So what does the Google Pixel 4 offer?

Let’s take a closer look:

First, the Google Pixel 4 features a dual-camera setup, offering the usual wide-angle lens alongside a new 2X telephoto option. This isn’t unique (Apple has regularly included “telephoto” lenses going all the way back to the iPhone 7 Plus), but it is a nice addition for those who need a bit more reach. You can use the 2X lens for tighter portraits, and it’s also useful for street photography, where you often need to photograph subjects from a distance.

Interestingly, Google has decided to keep the wide-angle camera at 12 megapixels, but has packed in a 16-megapixel sensor for the telephoto camera. While plenty of photographers will be excited by this jump in resolution, it remains to be seen whether such tiny pixels will result in significant noise.

The dual-camera setup should also improve Google’s Portrait Mode, and Google has promised more natural background blur and very precise edges (e.g., when dealing with hair). Truthfully, I’m skeptical. I’ve yet to see a Portrait mode photo that looks perfect on any smartphone camera. But I’ll wait until I see the results from the Pixel 4 before judging.

One cool new feature that will debut in the Pixel 4 is Live HDR. When you go to capture an HDR photo, you’ll be able to see a live HDR preview on your smartphone screen; this should give you a sense of what you can expect from the HDR+ effect.

Finally, if you enjoy doing astrophotography, you’re in luck: The Pixel 4 offers an improved Night Sight mode, in which you can take stunning photos of the night sky. It works by taking a series of long exposures, before blending them together to create a beautiful final photo. Note that you’ll need a tripod or other method of stabilization to get sharp astrophotography shots.

Overall, the Google Pixel 4 offers some impressive new features, even if none of them feel totally groundbreaking. Up until now, the Pixel lineup has dominated low-light shooting, and the enhanced Night Sight suggests that Google plans to keep running with this success.

The Google Pixel 4 is currently available for preorder starting at $799 USD and will hit the shelves on October 24.

You can check out this first look video from CNET to get more of an idea of the Google Pixel 4.


Are you interested in the Google Pixel 4? Let us know in the comments!

The post The Google Pixel 4 Will Feature Two Cameras Plus Enhanced Night Sight appeared first on Digital Photography School. It was authored by Jaymes Dempsey.


Digital Photography School

 

 

These are the most important Google Pixel 4 camera updates

19 Oct

Google yesterday announced the Pixel 4 and Pixel 4 XL, updates to the popular line of Pixel smartphones.

We had the opportunity recently to sit down with Marc Levoy, Distinguished Engineer and Computational Photography Lead at Google, and Isaac Reynolds, Product Manager for Camera on Pixel, to dive deep into the imaging improvements brought to the lineup by the Pixel 4.

Table of contents:

  • More zoom
  • Dual exposure controls / Live HDR+
  • Improved Night Sight
  • DSLR-like bokeh
  • Portrait mode improvements
  • Further improvements
  • Conclusion

Note that we do not yet have access to a production-quality Pixel 4. As such, many of the sample images in this article were provided by Google.

More zoom

The Pixel 4 features a main camera module with a 27mm equivalent F1.7 lens, employing a 12MP 1/2.55″ type CMOS sensor. New is a second ‘zoomed-in’ camera module with a 48mm equivalent, F2.4 lens paired with a slightly smaller 16MP sensor. Both modules are optically stabilized. Google tells us the net result is 1x-3x zoom that is on par with a true 1x-3x optical zoom, and pleasing results all the way out to 4x-6x magnification factors. No doubt the extra resolution of the zoomed-in unit helps with those higher zoom ratios.

Have a look at what the combination of two lenses and super-res zoom gets you with these 1x to 8x full-resolution samples from Google.

[Embedded sample gallery]

Marc emphasized that pinching and zooming to pre-compose your zoomed-in shot is far better than cropping after the fact. I’m speculating here, but I imagine much of this has to do with the ability of super-resolution techniques to generate imagery of higher resolution than any one frame. A 1x super-res zoom image (which you get by shooting 1x Night Sight) still only generates a 12MP image; cropping and upscaling from there is unlikely to get you as good results as feeding crops to the super-res pipeline for it to align and assemble on a higher resolution grid before it outputs a 12MP final image.
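
To make that distinction concrete, here is a deliberately simplified sketch of the difference between upscaling a single cropped frame and merging several sub-pixel-shifted crops onto a finer grid. This is my own toy illustration of the general super-resolution idea, not Google's pipeline; the integer shifts stand in for alignment offsets that a real system would estimate from hand shake.

```python
import numpy as np

def naive_zoom(frame, factor=2):
    """Upscale one cropped frame by pixel repetition: no new detail appears."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

def superres_merge(frames, shifts, factor=2):
    """Toy super-resolution merge: drop each frame's samples onto a finer grid
    at its (already estimated) offset, then average the accumulated samples.
    shifts: per-frame (dy, dx) offsets in output-grid pixels, 0 <= offset < factor."""
    h, w = frames[0].shape
    acc = np.zeros((h * factor, w * factor))
    count = np.zeros_like(acc)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, shifts):
        acc[ys * factor + dy, xs * factor + dx] += frame
        count[ys * factor + dy, xs * factor + dx] += 1
    return acc / np.maximum(count, 1)

# Four hand-shake-shifted crops could, in principle, populate every cell of a 2x grid:
# merged = superres_merge([f0, f1, f2, f3], [(0, 0), (0, 1), (1, 0), (1, 1)])
```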

We’re told that Google is not using the ‘field-of-view fusion’ technique Huawei uses on its latest phones where, for example, a 3x photo gets its central region from the 5x unit and its peripheries from upscaling (using super-resolution) the 1x capture. But given Google’s choice of lenses, its decision makes sense: from our own testing with the Pixel 3, super-res zoom is more than capable of handling zoom factors between 1x and 1.8x, the latter being the magnification factor of Google’s zoomed-in lens.

Dual exposure controls with ‘Live HDR+’

The results of HDR+, the burst mode multi-frame averaging and tonemapping behind every photograph on Pixel devices, are compelling, retaining details in brights and darks in, usually, a pleasing, believable manner. But it’s computationally intensive to show the end result in the ‘viewfinder’ in real-time as you’re composing. This year, Google has opted to use machine learning to approximate HDR+ results in real-time, leading to a much better viewfinder experience.1 Google calls this ‘Live HDR+’. It’s essentially a WYSIWYG implementation that should give photographers more confidence in the end result, and possibly feel less of a need to adjust the overall exposure manually.

“If we have an intrinsically HDR camera, we should have HDR controls for it” – Marc Levoy

On the other hand, if you do have an approximate live view of the HDR+ result, wouldn’t it be nice if you could adjust it in real-time? That’s exactly what the new ‘dual exposure controls’ allow for. Tap on the screen to bring up two separate exposure sliders. The brightness slider, indicated by a white circle with a sun icon, adjusts the overall exposure, and therefore brightness, of the image. The shadows slider essentially adjusts the tonemap, so you can adjust shadow and midtone visibility and detail to suit your taste.

Image sequence: default HDR+ result; brightness slider (top left) lowered to darken the overall exposure; shadows slider (top center) lowered to create silhouettes; final result.

Dual exposure controls are a clever way to operate an ‘HDR’ camera, as it allows the user to adjust both the overall exposure and the final tonemap in one or two swift steps. Sometimes HDR and tonemapping algorithms can go a bit far (as in this iPhone XS example here), and in such situations photographers will appreciate having some control placed back in their hands.

And while you might think this may be easy to do after-the-fact, we’ve often found it quite difficult to use the simple editing tools on smartphones to push down the shadows we want darkened after tonemapping has already brightened them. There’s a simple reason for that: the ‘shadows’ or ‘blacks’ sliders in photo editing tools may or may not target the same range of tones the tonemapping algorithms did when initially processing the photo.
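
As a rough mental model only (the curve and the function below are my own assumptions, not Google's implementation), the two controls can be thought of as a global gain applied to the linear image plus a separate shadow-shaping tone adjustment:

```python
import numpy as np

def dual_exposure_preview(linear_rgb, brightness=0.0, shadows=0.0):
    """Hypothetical sketch of dual exposure-style controls on a linear image in [0, 1].
    brightness: overall exposure adjustment in stops, applied as a plain gain.
    shadows:    lifts (>0) or deepens (<0) dark tones with a simple gamma-style
                curve, standing in for a real tonemap adjustment."""
    img = np.clip(linear_rgb * (2.0 ** brightness), 0.0, 1.0)  # exposure / brightness
    gamma = 2.0 ** (-shadows)                                  # <1 lifts shadows, >1 crushes them
    return img ** gamma

# Example: a darker overall exposure plus crushed shadows gives the silhouette look shown above:
# silhouette = dual_exposure_preview(preview, brightness=-1.0, shadows=-1.5)
```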

Improved Night Sight

Google’s Night Sight is widely regarded as an industry benchmark. We consistently talk about its use not just for low light photography, but for all types of photography because of its use of a super-resolution pipeline to yield higher resolution results with less aliasing and moire artifacts. Night Sight is what allowed the Pixel 3 to catch up to 1″-type and four-thirds image quality, both in terms of detail and noise performance in low light, as you can see here (all cameras shot with equivalent focal plane exposure). So how could Google improve on that?

Well, let’s start with the observation that some reviewers of the new iPhone 11 remarked that its night mode had surpassed the Pixel 3’s. While that’s not entirely true, as I covered in my in-depth look at the respective night modes, we have found that at very low light levels the Pixel 3 does fall behind. And it mostly has to do with the limits: handheld exposures per-frame in our shooting with the Pixel 3 were limited to ~1/3s to minimize blur caused by handshake. Meanwhile, the tripod-based mode only allowed shutter speeds up to 1s. Handheld and tripod-based shots were limited to 15 and 6 total frames, respectively, to avoid user fatigue. That meant the longest exposures you could ever take were limited to 5-6s.

The Pixel 4 extends the per-frame exposure, when no motion is detected, to at least 16 seconds, and captures up to 15 frames. That’s a total of up to 4 minutes of exposure, which is what allows the Pixel 4 to capture the Milky Way:

[Embedded sample gallery]

Remarkable is the lack of user input: just set the phone up against a rock to stabilize it, and press one button. That’s it. It’s important to note you couldn’t get this result with one long exposure, either with the Pixel phone or a dedicated camera, because it would result in star trails. So how does the Pixel 4 get around this limitation?

The same technique that enables high quality imagery from a small sensor: burst photography. First, the camera picks a shutter speed short enough to ensure no star trails. Next, it takes many frames at this shutter speed and aligns them. Since alignment is tile-based, it can handle the moving stars due to the rotation of the sky just as the standard HDR+ algorithm handles motion in scenes. Normally, such alignment is very tricky for photographers shooting night skies with non-celestial, static objects in the frame, since aligning the stars would cause misalignment in the foreground static objects, and vice versa.

Improved Night Sight will not only benefit starry skyscapes, but all types of photography requiring long exposures

But Google’s robust tile-based merge can handle displacement of objects from frame to frame of up to ~8% in the frame2. Think of it as tile-based alignment where each frame is broken up into roughly 12,000 tiles, with each tile individually aligned to the base frame. That’s why the Pixel 4 has no trouble treating stars in the sky differently from static foreground objects.
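
A heavily simplified sketch of tile-based align-and-merge, written only to illustrate the general idea (a real pipeline adds refinements such as sub-pixel alignment and robust rejection of badly matched tiles), might look like this:

```python
import numpy as np

def align_and_merge(base, frames, tile=16, search=4):
    """Toy tile-based align-and-merge. Each `tile` x `tile` block of every
    frame is shifted within +/- `search` pixels to best match the base frame,
    then the matched blocks are averaged into the result."""
    base = base.astype(np.float64)
    h, w = base.shape
    acc, n = base.copy(), np.ones((h, w))
    for frame in frames:
        f = frame.astype(np.float64)
        for y in range(0, h - tile + 1, tile):
            for x in range(0, w - tile + 1, tile):
                ref = base[y:y + tile, x:x + tile]
                best, best_err = None, np.inf
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy <= h - tile and 0 <= xx <= w - tile:
                            cand = f[yy:yy + tile, xx:xx + tile]
                            err = np.sum((cand - ref) ** 2)
                            if err < best_err:
                                best, best_err = cand, err
                acc[y:y + tile, x:x + tile] += best
                n[y:y + tile, x:x + tile] += 1
    return acc / n
```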

Another issue with such long total exposures is hot pixels. These pixels can become ‘stuck’ at high luminance values as exposure times increase. The new Night Sight uses clever algorithms to emulate hot pixel suppression, to ensure you don’t have bright pixels scattered throughout your dark sky shot.
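
Google hasn't detailed its hot-pixel handling, but a common approach, sketched here under my own assumptions, is to flag pixels that are implausibly bright compared with their immediate neighborhood and replace them with a local median. A burst pipeline has an even stronger cue available: hot pixels sit at the same sensor location in every frame, while real stars drift with the sky.

```python
import numpy as np
from scipy.ndimage import median_filter

def suppress_hot_pixels(channel, ratio=4.0):
    """Replace isolated, implausibly bright pixels with the median of their
    3x3 neighborhood. `ratio` is an arbitrary threshold for this sketch:
    a pixel more than `ratio` times its local median is treated as stuck."""
    med = median_filter(channel.astype(np.float64), size=3)
    hot = channel > ratio * (med + 1e-6)
    out = channel.astype(np.float64).copy()
    out[hot] = med[hot]
    return out
```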

DSLR-like bokeh

This is potentially a big deal, and perhaps underplayed, but the Google Pixel 4 will render bokeh, particularly out-of-focus highlights, closer to what we’d expect from traditional cameras and optics. Until now, while Pixel phones did render proper disc-shaped blur for out of focus areas as real lenses do (as opposed to a simple Gaussian blur), blurred backgrounds simply didn’t have the impact they tend to have with traditional cameras, where out-of-focus highlights pop out of the image in gorgeous, bright, disc-shaped circles as they do in these comparative iPhone 11 examples here and also here.

The new bokeh rendition on the Pixel 4 takes things a step closer to traditional optics, while avoiding the ‘cheap’ technique some of its competitors use where bright circular discs are simply ‘stamped’ in to the image (compare the inconsistently ‘stamped’ bokeh balls in this Samsung S10+ image here next to the un-stamped, more accurate Pixel 3 image here). Have a look below at the improvements over the Pixel 3; internal comparisons graciously provided to me via Google.

Daytime bokeh comparison (Pixel 3 vs. Pixel 4)

Nighttime bokeh comparison (Pixel 3 vs. Pixel 4)

The impactful, bright, disc-shaped bokeh of out-of-focus highlights is due to the processing of the blur at the Raw level, where linearity ensures that Google’s algorithms know just how bright those out-of-focus highlights are relative to their surroundings.

Previously, applying the blur to 8-bit tonemapped images resulted in less pronounced out-of-focus highlights, since HDR tonemapping usually compresses the difference in luminosity between these bright highlights and other tones in the scene. That meant that out-of-focus ‘bokeh balls’ weren’t as bright or separated from the rest of the scene as they would be with traditional cameras. But Google’s new approach of applying the blur at the Raw stage allows it to more realistically approximate what happens optically with conventional optics.
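
A tiny numerical example (with made-up values) shows why the order of operations matters: blurring in linear space keeps a specular highlight's true brightness relative to its surroundings, while blurring the already-tonemapped values washes the disc out.

```python
def tonemap(x):
    """Crude stand-in for HDR tone compression."""
    return x / (1.0 + x)

highlight, surround = 50.0, 1.0            # linear scene: a small light 50x brighter than its background
spread = lambda a, b: 0.1 * a + 0.9 * b    # toy 'bokeh': highlight spread over 10% of a blur disc

print(tonemap(spread(highlight, surround)))           # blur at the Raw (linear) stage: ~0.86, the disc stays bright
print(spread(tonemap(highlight), tonemap(surround)))  # blur after tonemapping:         ~0.55, the disc washes out
```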

One thing I wonder about: if the blur is applied at the Raw stage, will we get Raw portrait mode images in a software update down-the-line?

Portrait mode improvements

Portrait mode has been improved in other ways apart from simply better bokeh, as outlined above. But before we begin I want to clarify something up front: the term ‘fake bokeh’ as our readers and many reviewers like to call blur modes on recent phones is not accurate. The best computational imaging devices, from smartphones to Lytro cameras (remember them?), can actually simulate blur true to what you’d expect from traditional optical devices. Just look at the gradual blur in this Pixel 2 shot here. The Pixel phones (and iPhones as well as other phones) generate actual depth maps, gradually blurring objects from near to far. This isn’t a simple case of ‘if area detected as background, add blurriness’.

The Google Pixel 3 generated a depth map from its split photodiodes with a ~1mm stereo disparity, and augmented it using machine learning. Google trained a neural network using depth maps generated by its dual pixel array (stereo disparity only) as input, and ‘ground truth’ results generated by a ‘franken-rig’ that used 5 Pixel cameras to create more accurate depth maps than simple split pixels, or even two cameras, could. That allowed Google’s Portrait mode to understand depth cues from things like defocus cues (out-of-focus objects are probably further away than in-focus ones) and semantic cues (smaller objects are probably further away than larger ones).

Deriving stereo disparity from two perpendicular baselines affords the Pixel 4 much more accurate depth maps

The Pixel 4’s additional zoomed-in lens now gives Google more stereo data to work with, and Google has been clever in its arrangement: if you’re holding the phone upright, the two lenses give you horizontal (left-right) stereo disparity, while the split pixels on the main camera sensor give you vertical (up-down) stereo disparity. Having stereo data along two perpendicular axes avoids artifacts related to the ‘aperture problem’, where detail along the axis of stereo disparity essentially has no measured disparity.

Try this: look at a horizontal object in front of you and blink to switch between your left and right eye. The object doesn’t look very different as you switch eyes, does it? Now hold out your index finger, pointing up, in front of you, and do the same experiment. You’ll see your finger moving dramatically left and right as you switch eyes.

Deriving stereo disparity from two perpendicular baselines affords the Pixel 4 much more accurate depth maps, with the dual cameras providing disparity information that the split pixels might miss, and vice versa. In the example below, provided by Google, the Pixel 4 result is far more believable than the Pixel 3 result, which has parts of the upper and lower green stem, and the horizontally-oriented green leaf near bottom right, accidentally blurred despite falling within the plane of focus.

Pixel 4 result (dual baseline) vs. Pixel 3 result (single baseline)

The combination of two baselines, one short (split pixels) and one significantly longer (the two lenses), also has other benefits. The longer stereo baselines of dual camera setups can run into the problem of occlusion: since the two perspectives are considerably different, one lens may see a background object that, to the other lens, is hidden behind a foreground object. The shorter 1mm disparity of the dual pixel sensor means it’s less prone to errors due to occlusion.

On the other hand, the short disparity of the split pixels means that further away objects that are not quite at infinity appear the same to ‘left-looking’ and ‘right-looking’ (or up/down) photodiodes. The longer baseline of the dual cameras means that stereo disparity can be calculated for these further away objects, which allows the Pixel 4’s portrait mode to better deal with distant subjects, or groups of people shot from further back, as you can see below.

There’s yet another benefit of the two separate methods for calculating stereo disparity: macro photography. If you’ve shot portrait mode on telephoto units of other smartphones, you’ve probably run into error messages like ‘Move farther away’. That’s because these telephoto lenses tend to have a minimum focus distance of ~20cm. Meanwhile, the minimum focus distance of the main camera on the Pixel 4 is only 10cm. That means that for close-up photography, the Pixel 4 can simply use its split pixels and learning-based approach to blur backgrounds.3
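
For intuition, depth from a single stereo cue follows Z ≈ f × B / d (focal length in pixels times baseline, divided by disparity), and detail running parallel to a baseline yields essentially no measurable disparity along it. A hypothetical fusion of the two cues, with all names and the confidence weighting below being my own assumptions rather than Google's method, could look like this:

```python
import numpy as np

def fuse_depth(disp_h, conf_h, disp_v, conf_v, f_px, baseline_h, baseline_v):
    """Hypothetical fusion of two stereo cues measured along perpendicular axes.
    disp_h: disparity (pixels) from the two rear cameras (horizontal baseline).
    disp_v: disparity (pixels) from the split photodiodes (vertical, ~1mm baseline).
    conf_h/conf_v: per-pixel confidences, e.g. low where edges run parallel to
    that baseline and the 'aperture problem' hides the disparity.
    Depth from a single cue: Z = f_px * baseline / disparity."""
    z_h = f_px * baseline_h / np.maximum(disp_h, 1e-6)
    z_v = f_px * baseline_v / np.maximum(disp_v, 1e-6)
    weight = conf_h + conf_v + 1e-6
    return (conf_h * z_h + conf_v * z_v) / weight   # confidence-weighted blend
```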

One thing we’ll be curious to test is if the additional burden of taking two images with the dual camera setup will lead to any latency. The iPhone 11, for example, has considerable shutter lag in portrait mode.

Google continues to keep a range of planes in perfect focus, which can sometimes lead to odd results where multiple people in a scene remain focused despite being at different depths. However, this approach avoids prematurely blurring parts of people that shouldn’t be blurred, a common problem with iPhones.

Oddly, portrait mode is unavailable with the zoomed-in lens; instead, it uses the same 1.5x crop from the main camera that the Pixel 3 used. This means images will have less detail compared to some competitors, especially since the super-res zoom pipeline is still not used in portrait mode. It also means you don’t get the versatility of both wide-angle and telephoto portrait shots. And if there’s one thing you probably know about me, it’s that I love my wide angle portraits!

Pixel 4’s portrait mode continues to use a 1.5x crop from the main camera. This means that, like the Pixel 3, it will have considerably less detail than portrait modes from competitors like the iPhone 11 Pro that use the full-resolution image from wide or tele modules. Click to view at 100%

Further improvements

There are a few more updates to note.

Learning-based AWB

The learning-based white balance that debuted in Night Sight is now the default auto white balance (AWB) algorithm in all camera modes on the Pixel 4. What is learning-based white balance? Google trained its traditional AWB algorithm to discriminate between poorly, and properly, white balanced images. The company did this by hand-correcting images captured using the traditional AWB algorithm, and then using these corrected images to train the algorithm to suggest appropriate color shifts to achieve a more neutral output.
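
Conceptually, and only conceptually, since Google hasn't published the model it ships, such a stage can be pictured as a small regressor that maps simple image statistics to the red and blue gains that would neutralize the illuminant. The feature choice and the gray-world baseline below are purely illustrative assumptions:

```python
import numpy as np

def learned_awb(raw_rgb, predict_gains):
    """Sketch of a learning-based AWB step on a linear RGB image in [0, 1].
    `predict_gains` stands in for a trained model mapping image statistics
    to (r_gain, b_gain); here the features are channel means and log-chroma."""
    means = raw_rgb.reshape(-1, 3).mean(axis=0)                      # [R, G, B]
    feats = np.concatenate([means, np.log(means[[0, 2]] / means[1])])
    r_gain, b_gain = predict_gains(feats)                            # e.g. a tiny MLP or a lookup table
    return np.clip(raw_rgb * np.array([r_gain, 1.0, b_gain]), 0.0, 1.0)

# A classic 'gray world' heuristic makes a handy baseline to compare a trained model against:
gray_world = lambda feats: (feats[1] / feats[0], feats[1] / feats[2])
```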

Google tells us that the latest iteration of the algorithm is improved in a number of ways. A larger training data set has been used to yield better results in low light and adversarial lighting conditions. The new AWB algorithm is better at recognizing specific, common illuminants and adjusting for them, and also yields better results under artificial lights of one dominant color. We’ve been impressed with white balance results in Night Sight on the Pixel 3, and are glad to see it ported over to all camera modes. See below how Google’s learning-based AWB (top left) preserves both blue and red/orange tones in the sky compared to its traditional AWB (top right), and how much better it is at separating complex sunset colors (bottom left) compared to the iPhone XS (bottom right).

Top row: learning-based AWB (Pixel 3 Night Sight) vs. traditional AWB (Pixel 3). Bottom row: learning-based AWB (Pixel 3 Night Sight) vs. iPhone XS HDR result.

New face detector

A new face detection algorithm based solely on machine learning is now used to detect, focus, and expose for faces in the scene. The new face detector is more robust at identifying faces in challenging lighting conditions. This should help the Pixel 4 better focus on and expose for, for example, strongly backlit faces. The Pixel 3 would often prioritize exposure for highlights and underexpose faces in backlit conditions.

Though tonemapping would brighten the face properly in post-processing, the shorter exposure would mean more noise in shadows and midtones, which after noise reduction could lead to smeared, blurry results. In the example below the Pixel 3 used an exposure time of 1/300s while the iPhone 11 yielded more detailed results due to its use of an exposure more appropriate for the subject (1/60s).

Along with the new face detector, the Pixel 4 will (finally) indicate the face it’s focusing on in the ‘viewfinder’ as you compose. In the past, Pixel phones would simply show a circle in the center of the screen every time they refocused, a confusing experience that left users wondering whether the camera was in fact focusing on a face in the scene or simply on the center. Indicating the face it’s focusing on should allow Pixel 4 users to worry less, and feel less of a need to tap on a face in the scene if the camera is already indicating that it’s focusing on it.

On previous Pixel phones, a circle focus indicator would pop up in the center when the camera refocused, leading to confusion. Is the camera focusing on the face, or the outstretched hand? On the Huawei P20, the camera indicates when it’s tracking a face. The Pixel 4 will have a similar visual indicator.

Semantic segmentation

This isn’t new, but in his keynote Marc mentioned ‘semantic segmentation’ which, like the iPhone, allows image processing to treat different portions of the scene differently. It’s been around for years in fact, allowing Pixel phones to brighten faces (‘synthetic fill flash’), or to better separate foregrounds and backgrounds in Portrait mode shots. I’d personally point out that Google takes a more conservative approach in its implementation: faces aren’t brightened or treated differently as much as they tend to be with the iPhone 11. The end result is a matter of personal taste.

Conclusion

The questions on the minds of many of our readers will undoubtedly be: (1) what is the best smartphone for photography I can buy, and (2) when should I consider using such a device as opposed to my dedicated camera?

We have much testing to do and many side-by-sides to come. But from our tests thus far and our recent iPhone 11 vs. Pixel 3 Night Sight article, one thing is clear: in most situations the Pixel cameras are capable of a level of image quality unsurpassed by any other smartphone when you compare images at the pixel (no pun intended) level.

But other devices are catching up, or exceeding Pixel phone capabilities. Huawei’s field-of-view fusion offers compelling image quality across multiple zoom ratios thanks to its fusion of image data from multiple lenses. iPhones offer a wide-angle portrait mode far more suited for the types of photography casual users engage in, with better image quality to boot than Pixel’s (cropped) Portrait mode.

The Pixel 4 takes an already great camera and refines it to achieve results closer to, and in some cases surpassing, traditional cameras and optics

Overall though, Google Pixel phones deliver some of the best image quality we’ve seen from a mobile device. No other phone can compete with its Raw results, since those Raws are the product of a burst of images stacked using Google’s robust align-and-merge algorithm. Night Sight is now improved to allow for superior results with static scenes demanding long exposures. Portrait mode is vastly improved thanks to dual baselines and machine learning, with fewer depth map errors, better ability to ‘cut around’ complex objects like pet fur or loose hair strands, and pleasing out-of-focus highlights thanks to ‘DSLR-like bokeh’. AWB is improved, and a new learning-based face detector should improve focus and exposure of faces under challenging lighting.

It’s not going to replace your dedicated camera in all situations, but in many it might. The Pixel 4 takes an already great camera in the Pixel 3, and refines it further to achieve results closer to, and in some cases surpassing, traditional cameras and optics. Stay tuned for more thorough tests once we get a unit in our hands.

Finally, have a watch of Marc Levoy's keynote presentation from yesterday below. And if you haven’t already, watch his lectures on digital photography or visit the course website from the digital photography class he taught while at Stanford. There’s a wealth of information on digital imaging in those talks, and Marc has a knack for distilling complex topics into elegantly simple terms.


Footnotes:

1 The Pixel 3’s dim display combined with the dark shadows of a non-HDR preview often made the experience of shooting high contrast scenes outdoors lackluster, sometimes even making it difficult to compose. Live HDR+ should dramatically improve the experience, though the display remains relatively dim compared to the iPhone 11 Pro.

2 The original paper on HDR+ by Hasinoff and Levoy claims HDR+ can handle displacements of up to 169 pixels within a single raw color channel image. For a 12MP 4:3 Bayer sensor, that’s 169 pixels of a 2000 pixel wide (3MP) image, which amounts to ~8.5%. Furthermore, tile-based alignment is performed using as small as 16×16 pixel blocks of that single raw channel image. That amounts to ~12,000 effective tiles that can be individually aligned.

3 The iPhone 11’s wide angle portrait mode also allows you to get closer to subjects, since its ultra-wide and wide cameras can focus on nearer subjects than its telephoto lens.

Articles: Digital Photography Review (dpreview.com)

 

 

Google Pixel 4 adds telephoto lens, improved portrait mode and HDR in live view

15 Oct


Google officially unveiled the Pixel 4 today, with the addition of a telephoto camera headlining the camera updates. Other improvements include an enhanced live view experience showing the approximated effects of HDR in real time, added controls for adjusting exposure and tone mapping prior to image capture, and an updated portrait mode with better depth mapping thanks to the additional rear camera.

The Pixel 4 and Pixel 4 XL offer 5.7″ and 6.3″ OLED displays respectively, each with a 90Hz variable refresh rate that Google calls ‘Smooth Display.’ Gone is the fingerprint sensor on the rear of the device, replaced by face unlock. Also new is a technology called Soli, comprising a radar chip that detects hand motions. Called Motion Sense, this feature makes it possible to skip songs and silence calls with a wave of your hand.

As is the case with most high-profile phone launches, the camera updates were a center of attention alongside the main specifications (in fact, Annie Leibovitz made an appearance). In addition to the new F2.4, optically stabilized telephoto camera (about 48mm equiv.), Google has introduced improved Super Resolution Zoom for up to 8x digital zoom. In fact, the telephoto camera uses a hybrid of optical and digital zoom at its default setting to achieve approximately 2x zoom.

The process of taking photos has been improved on the Pixel 4 as well. On previous models, the results of Google’s impressive HDR rendering could only be seen after capture – now, machine learning is used to approximate the effect in real-time for a much more ‘what you see is what you get’ experience.

Google Pixel 4 official sample images

[Embedded sample gallery]

Additional exposure controls are also available during image capture. Two new sliders give users direct control of overall scene brightness and rendering of shadows, as compared to the single exposure slider offered by the Pixel 3. Google also says the Pixel 4’s camera is more responsive and stable compared to the Pixel 3, thanks to 6GB of RAM at its disposal.

Portrait mode should see significant improvements as well. The mode now uses information from the telephoto camera as well as split pixels to judge subject distance, creating a better depth map than was previously possible only using split pixels. Portrait mode’s range has also been extended, making it possible to capture large objects as well as human subjects from farther back than was possible on the Pixel 3.

While the telephoto camera lends depth information, the standard camera with a 1.5x digital zoom is used for the image itself. Background blur is now applied to the Raw image before tone mapping, with the aim of creating more SLR-like bokeh. The updated Portrait mode should also handle human hair and dog fur better, and Google says that its face detection has been improved and should handle backlit subjects better.

All camera modes will benefit from improved, learning-based white balance – previously used only in Night Sight

An astrophotography mode is added to Night Sight, using longer shutter speeds to capture night skies. Additionally, all camera modes will benefit from improved, learning-based white balance – previously used only in Night Sight. Google has also done some white balance tuning for certain light sources.

Google has reduced the number of front-facing cameras from two back down to one. Citing the popularity of the ultra-wide selfie camera, the Pixel 4’s single front-facing camera offers a focal length that’s a happy medium between the standard and ultra-wide options on the Pixel 3.

Google Pixel 4 pre-orders start today; the Pixel 4 starts at $799 and the Pixel 4 XL at $899. Both will ship on October 24th, and for the first time the Pixel will be available on all major US carriers, including AT&T.

Articles: Digital Photography Review (dpreview.com)

 

 

How does iPhone 11 Night Mode compare to Google Pixel 3 Night Sight?

15 Oct

Many smartphones today take great images in broad daylight. That’s no surprise – when there’s a lot of light, it doesn’t matter so much that the small smartphone sensor doesn’t collect as many photons as a larger sensor: there’s an abundance of photons to begin with. But smartphone image quality can take a nosedive as light levels drop and there just aren’t many photons to collect (especially for a small sensor). That’s where computational techniques and burst photography come in.

Low light performance is a huge differentiator that separates the best smartphones from the worst

Low light performance is a huge differentiator that separates the best smartphones from the worst. And Google’s Night Sight has been the low-light king of late1, thanks to its averaging of many (up to 15) frames, its clever tile-based alignment to deal with hand movement and motion in the scene, and its use of a super-resolution pipeline that yields far better resolution, particularly color resolution, and lower noise than simple frame-stacking techniques.
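
The payoff of that frame averaging is easy to quantify: for shot noise, averaging N aligned frames improves the signal-to-noise ratio by roughly the square root of N, so a 15-frame stack behaves almost like a 4x cleaner single exposure. A quick simulation with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 20.0                                        # mean photons per pixel per frame (illustrative)
frames = rng.poisson(signal, size=(15, 100_000)).astype(float)

single, stack = frames[0], frames.mean(axis=0)
print(signal / single.std())   # SNR of one frame:        ~sqrt(20)      ~ 4.5
print(signal / stack.std())    # SNR of a 15-frame stack: ~sqrt(15 * 20) ~ 17.3
```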

With the iPhone 11, Apple launched its own Night Mode to compete with offerings from Android phones. It uses ‘adaptive bracketing’ to combine both long and short exposures (to freeze any movement) to build a high quality image in low light conditions. Let’s see how it stacks up compared to Google’s Night Sight and Apple’s own previous generation iPhone XS.

The set-up

‘Low light performance’ is difficult to sum up in one number or picture when it comes to computational imaging. Different devices take different approaches, which ultimately means that comparative performance across devices can vary significantly with light level. Hence we’ve chosen to look at how the iPhone 11 performs as light levels decrease from evening light before sunset to very low light conditions well after sunset. The images span an hour-long time frame, from approximately 500 lux to 5 lux. All shots are handheld, since this is how we expect users to operate their smartphones. The iPhone 11 images spanning this time period are shown below.

7:00 pm, evening light: 1/60 sec | ISO 100 | 485 lux | 7.6 EV
7:25 pm, late evening light: 1/8 sec | ISO 250 | 25 lux | 3.4 EV
7:50 pm, low light: 1/4 sec | ISO 640 | 5 lux | 1 EV
8:05 pm, very low light: 1/8 sec | ISO 1250 | <5 lux | <1 EV
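
For reference, the lux and EV figures in these captions line up with the common incident-light approximation at ISO 100 (roughly lux ≈ 2.5 × 2^EV), which is a rule of thumb on my part rather than anything stated elsewhere in the article:

```python
import math

# Rough rule of thumb, not from the article: EV ~= log2(lux / 2.5) at ISO 100.
for lux in (485, 25, 5):
    print(lux, round(math.log2(lux / 2.5), 1))   # -> 7.6, 3.3 and 1.0, close to the captions above
```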

Note that Night mode is only available with the main camera unit, not the 2x or 0.5x cameras. And before we proceed to our comparisons, please see this footnote about the rollovers and crops that follow: on ‘HiDPI’ screens like smartphones and higher-end laptops/displays, the following crops are 100%, but on ‘standard’ displays you’ll only see 50% crops.2

Now, on to the comparisons. In the headings, we’ve labeled the winner.

Evening light (485 lux) | Winner: Google Pixel 3

Before sunset, there’s still a good amount of available light. At this light level (485 lux, as measured by the iPhone 11 camera), the option for Night mode on iPhone 11 is not available. Yet Night Sight on the Google Pixel 3 is available, as it is in all situations. And thanks to its averaging of up to 15 frames and its super-resolution pipeline, it provides far more detail than the iPhone 11.

It’s not even close.

Take a look at the detail in the foreground trees and foliage, particularly right behind the fence at the bottom. Or the buildings and their windows up top, which appear far crisper on the Pixel 3.

Late evening light (25 lux) | Winner: Google Pixel 3

As the sun sets, light levels drop, and at 25 lux we finally have the option to turn on Night Mode on the iPhone, though it’s clearly not suggested by Apple since it’s not turned on by default. You’ll see the Night Mode option as a moon-like icon appearing on the bottom left of the screen in landscape orientation. Below we have a comparison of the iPhone with Night Mode manually turned on next to the Google Pixel 3 Night Sight (also manually enabled).

There’s more detail and far less noise – particularly in the skies – in the Google Pixel 3 shot. It’s hard to tell what shutter speeds and total exposure time either camera used, since these stacking techniques use differing shutter speeds and discard frames or tiles at will based on their quality or usability. But it appears that, at best, the Pixel 3 utilized 15 frames at a 1/5s shutter speed, or 3s total, while the iPhone 11 indicated it would use a total of 1s in the user interface (the EXIF indicates 1/8s, so it is likely unrepresentative). In other words, here it appears the Pixel 3 used a longer total exposure time.

Apart from that, though, the fact that the iPhone result looks noisier than the same shot with Night Mode manually turned off (not shown) leads us to believe that the noisy results are at least in part due to Apple’s decision to use less noise reduction in Night Mode. This mode appears to assume that the longer overall exposures will lead to lower noise and, therefore, less of a need for noise reduction.

However, in the end, it appears that under these light levels Apple is not using a long enough total exposure (the cumulative result of short and long frames) to yield low enough noise results that the lower noise reduction levels are appropriate. So, in these conditions when it appears light levels are not low enough for Apple to turn on Night Mode by default, the Google Pixel 3 outperforms, again.

Low light (5 lux) | Winner: Tie

As light levels drop further to around 5 lux, the iPhone 11 Night mode appears to catch up to Google’s Night Sight. Take a look above, and it’s hard to choose a winner. The EXIF data indicates the Pixel used 1/8s shutter speeds per frame, while the iPhone used at least 1/4s shutter speed for one or more frames, so it’s possible that the iPhone’s use of longer exposure times per frame allows it to catch up to Google’s result, despite presumably using fewer total frames. Keynotes from Apple and personal conversations with Google indicate that Apple only uses up to 8-9 frames of both short and long exposures, while the Pixel uses up to 15 frames of consistent exposure, for each phone’s respective burst photography frame-stacking methods.

Very low light (< 5 lux) | Winner: iPhone 11

As light levels drop even further, the iPhone 11 catches up to and surpasses Google’s Night Sight results. Note the lower noise in the dark blue sky above the cityscape. And while overall detail levels appear similar, buildings and windows look crisper thanks to lower noise and a higher signal-to-noise ratio. We presume this is due to the use of longer exposure times per frame.

It’s worth noting the iPhone, in this case, delivers a slightly darker result, which arguably ends up being more pleasing, to me anyway. Google’s Night Sight also does a good job of ensuring that nighttime shots don’t end up looking like daytime, but Apple appears to take a slightly more conservative approach.

We shot an even darker scene to see if the iPhone’s advantage persisted. Indeed, the iPhone 11’s advantage became even greater as light levels dropped further. Have a look below.

iPhone 11 (Night Mode off) vs. Pixel 3 (Night Sight off)

As you can see, the iPhone 11 delivers a more pleasing result, with more detail and considerably less noise, particularly in peripheral areas of the image where lens vignetting considerably lowers image quality as evidenced by the drastically increased noise in the Pixel 3 results.

Ultimately it appears that the lower the light levels, the better the iPhone 11 performs comparatively.

A consideration: (slightly) moving subjects

Neither camera’s night mode is meant for photographing moving subjects, but that doesn’t mean they can’t deal with motion. Because these devices use tile-based alignment to merge frames to the base frame, static and moving subjects in a scene can be treated differently. For example, on the iPhone, shorter and longer exposures can be used for moving and static subjects, respectively. Frames with too much motion blur for the moving subjects may be discarded, or perhaps only have their static portions used if the algorithms are clever enough.

Below we take a look at a slightly moving subject in two lighting conditions: the first dark enough for Night mode to be available as an option on the iPhone (though it isn’t automatically triggered until darker conditions), and the second in very dim indoor lighting where Night mode automatically triggers.

Although I asked my subject to stay still, she moved around a bit as children are wont to do. The iPhone handles this modest motion well. You’ll recall that Apple’s Night mode uses adaptive bracketing, meaning it can combine both short and long exposures for the final result. It appears that the exposure times used for the face weren’t long enough to avoid a considerable degree of noise, which is exacerbated by more conservative application of noise reduction to Night mode shots. Here, we prefer the results without Night mode enabled, despite the slight watercolor painting-like result when viewed at 100%.

We tested the iPhone 11 vs. the Google Pixel 3 with very slightly moving subjects under even darker conditions below.

Here you can see that Apple’s Night mode yields lower noise than with the mode (manually) turned off. With the mode turned off, it appears Deep Fusion is active3, which yields slightly more detail at the cost of more noise (the lack of a smeary, watercolor painting-like texture is a giveaway that Deep Fusion kicked in). Neither iPhone result is as noise-free and crisply detailed as the Pixel 3 Night Sight shot, though. We can speculate that the better result is due to either the use of more total frames, or perhaps more effective use of frames where the subject has slightly moved, or some combination thereof. Google’s tile-based alignment can deal with inter-frame subject movement of up to 8% of the frame, instead of simply discarding tiles and frames where the subject has moved. It is unclear how robust Apple’s align-and-merge algorithm is comparatively.

Vs. iPhone XS

We tested the iPhone 11 Night Mode vs. the iPhone XS, which has no Night Mode to begin with. As you can see below, the XS image is far darker, with more noise and less detail than the iPhone 11. This is no surprise, but it’s informative to see the difference between the two cameras.

Conclusion

iPhone 11’s Night Mode is formidable and a very welcome tool in Apple’s arsenal. It not only provides pleasing images for its users, but sometimes even surpasses what is easily achievable with dedicated cameras. In the very lowest light conditions, Apple has even managed to surpass the results of Google’s Night Sight, highly regarded – and rightfully so – as the industry standard for low-light smartphone photography.

But there are some caveats. First, in less low light conditions – situations you’re actually more likely to be shooting in – Google’s use of more frames and its super-resolution pipeline mean that its Pixel 3 renders considerably better results, both in terms of noise and resolution. In fact, the Pixel 3 can out-resolve even the full-frame Sony a7S II, with more color resolution and less color aliasing.

Second, as soon as you throw people as subjects into the mix, things get a bit muddled. Both cameras perform pretty well, but we found Google’s Night Sight to more consistently yield sharper images with modest subject motion in the scene. Its use of up to 15 frames ensures lower noise, and its align-and-stack method can actually make use of many of those frames even if your subject has slightly moved, since the algorithm can tolerate inter-frame subject movement of up to ~8% of the frame.

If you’re photographing perfectly still scenes in very low light, Apple’s iPhone 11 is your best bet

That shouldn’t undermine Apple’s effort here, which, overall, is actually class-leading under very, very low light conditions where the iPhone can use and fuse multiple frames of very long exposure. We’re told the iPhone 11 can use total exposure times of 10s handheld, and 28s on a tripod. Google’s Night Sight, on the other hand, tends to use an upper limit of 1/3s per frame handheld, or up to 1s on a tripod. Rumors, however, suggest the Pixel 4 will be capable of even longer total exposures, so it remains to be seen who will be the ultimate low-light king.

Currently though, if you’re photographing perfectly still scenes in very low light, Apple’s iPhone 11 is your best bet. For most users, factoring in moving subjects and less low light (yet still dark) conditions, Google’s Night Sight remains the technology to beat.


Footnotes:

1 Huawei phones have their own formidable night modes; while we haven’t gotten our hands on the latest P30 Pro, The Verge has its own results that show a very compelling offering from the Chinese company.

2 A note about our presentation: these are rollovers, so on desktop you can hover your mouse over the states below the image to switch the crop. On mobile, simply tap the states at the bottom of each rollover to switch the crop. Tap (or click) on the crop itself to launch a separate window with the full-resolution image. Finally, on ‘Retina’ laptops and nearly all modern higher-end smartphones, these are 100% crops (each pixel maps 1 display pixel); however, on ‘standard’ (not HiDPI) displays these are 50% crops. In other words, on standard displays the differences you see are actually under-represented. [return to text]

3 We had updated the iPhone 11 to the latest iOS 13.2 public beta by the time this set of shots was taken; hence the (sudden) availability of Deep Fusion.

Articles: Digital Photography Review (dpreview.com)

 