
Posts Tagged ‘mode’

MacOS beta reveals ‘Pro Mode’ code, teasing a high-performance mode for MacBook Pros

16 Jan

Apple may be preparing to release a new macOS feature called ‘Pro Mode,’ according to a recent report from 9to5Mac. Evidence of the feature was found nestled within macOS Catalina 10.15.3 beta code alongside strings of text. Based on the text, it seems ‘Pro Mode’ will be a manual feature that enables users to temporarily boost a Mac’s performance.

Apple releases beta versions of its macOS operating system for developers to test before the updates are made available to casual users. Teardowns of these updates may reveal the presence of unannounced features that are hidden in the code, the latest example being this newly detailed ‘Pro Mode.’

Strings of text listed as descriptions of the feature state that enabling Pro Mode may make apps ‘run faster, but battery life may decrease and fan noise may increase.’ Another string reveals that the ‘fan speed limit [is] overridden’ when Pro Mode is active.

The report indicates that users may be able to turn Pro Mode on manually and that the system will automatically disable it by the next day, in a way similar to the existing Do Not Disturb feature. The feature is expected to be made available on MacBook laptops, making it possible for users to temporarily boost performance while editing images, processing videos or performing other demanding tasks.

Articles: Digital Photography Review (dpreview.com)

 

 

Filmmaker Mode TV setting will be available on select Philips, Samsung and other TV models

14 Jan

Motion smoothing, the controversial TV setting that uses interpolation to reduce motion blur, will be addressed with a previously announced television feature called Filmmaker Mode. During CES 2020, the UHD Alliance offered an update about this setting, revealing that Samsung, Kaleidescape and Philips/TP Vision will offer Filmmaker Mode on some of their 2020 television models.

Filmmaker Mode was first announced by the UHD Alliance in August 2019 with support from directors like Martin Scorsese, Christopher Nolan and Rian Johnson. The goal of this setting is to present movies in the way the filmmaker intended, including with the original frame rates, aspect ratios and colors.

Vulture has produced a brief video explainer of motion smoothing.

Motion smoothing has proven controversial among consumers and filmmakers alike. Though the technology is effective at reducing motion blur, many viewers complain that it adds an unwanted visual effect that makes content less enjoyable to watch. Most modern TVs, including many budget models, now offer motion smoothing as a standard feature, though some manufacturers make it possible to disable the setting.

In addition to the manufacturers announced at CES, Vizio, Panasonic and LG were previously revealed as companies that will also offer Filmmaker Mode on select television models starting in 2020.


 

 

Apple’s 2020 iPhone photography contest seeks best Night mode shots

09 Jan
Shot on iPhone 11 Pro Max in Night mode by Eric Zhang.

Apple has announced another iPhone photography contest, this one soliciting photos shot using the company’s Night mode feature found on the iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max. This new contest follows the first ‘Shot on iPhone’ competition announced by the company in January 2019. The new contest is open to submissions through January 29.

The 2020 iPhone Night mode photography contest opened to submissions on Wednesday, January 8; the five winning images will be announced by Apple on March 4.

Interested iPhone 11 owners can submit their favorite shots on Twitter and Instagram using the hashtags #NightmodeChallenge and #ShotoniPhone, as well as on China’s Weibo service using the tags #NightmodeChallenge# and #ShotoniPhone#.

Shot on iPhone 11 Pro in Night mode by Austin Mann.

Apple is also giving entrants the option of emailing a high-resolution version of their images to shotoniphone@apple.com; in this case, photographers must use the following file naming convention: ‘firstname_lastname_nightmode_iPhonemodel.’

Social media submissions should include a note in the caption about which iPhone model was used to capture the image. Apple says contestants can use third-party and Photos app editing tools to edit their images. Entries must be submitted by 11:59 PM PST on January 29 to be eligible, and contestants must be at least 18 years old.

The company has offered multiple tips on using the Night mode feature, including paying attention to the capture time displayed in the Night mode icon and using a tripod to keep the shots steady. Winning images will be showcased in a gallery on the Apple website, Apple Newsroom and Apple Instagram account; they may also appear in Apple’s digital campaigns, among other promotions.

The full list of judges and other details can be found in Apple’s announcement.

Articles: Digital Photography Review (dpreview.com)

 

 

Google details how it’s improved Portrait mode on the Pixel 4

17 Dec

Portrait modes that simulate the shallow depth of field of a large-sensor camera and fast lens have been around on smartphones for a long time. The Pixel 2 was the first Google phone to offer the feature. Because the Pixel 2 had a single rear camera, it used the dual-pixel autofocus system to estimate parallax, and thus depth. The Pixel 3 still relied on dual-pixels, but the system was improved using machine learning.

The Pixel 4 is the first Google phone to use dual-pixel AF and dual-cameras combined for depth estimation.

The latest Google flagship, the Pixel 4, is the first phone in the Pixel line to feature dual rear cameras. This allows for even better depth estimation by leveraging both the dual cameras and the dual-pixel autofocus system. Google has also improved the appearance of the bokeh, making it more closely match that of a large-sensor DSLR or mirrorless camera.

With dual-pixel autofocus, the distance between the two sets of focus pixels is very small, which makes it difficult to estimate depth for subjects farther from the camera. The Pixel 4’s dual cameras are 13 mm apart, allowing for a larger parallax and making it easier to estimate the depth of objects at a distance.
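The baseline-versus-parallax trade-off described above follows the standard stereo depth relation. A minimal sketch (the focal length and disparity values below are hypothetical illustrations, not Google's calibration):

```python
# Standard stereo relation: depth = focal_length * baseline / disparity.
# Focal length (in pixels) and disparity values are hypothetical.

def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Depth in mm estimated from the pixel disparity between two views."""
    return focal_px * baseline_mm / disparity_px

# Rearranged, disparity = focal * baseline / depth: at a fixed depth, a
# 13 mm baseline (the Pixel 4's dual cameras) produces roughly 13x the
# disparity of a ~1 mm dual-pixel baseline, so distant depths are far
# easier to measure above the noise floor.
print(depth_from_disparity(focal_px=1000, baseline_mm=13, disparity_px=6.5))
# 2000.0 (mm): an object 2 m away shifts by 6.5 px between the two cameras
```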

…dual-pixels provide better depth information in the occluded regions between the arm and torso, while the large baseline dual cameras provide better depth information in the background and on the ground.

Google is also still using the information collected by the dual-pixels, though, as it helps refine depth estimation around the foreground subject. In addition, machine learning is used to estimate depth from both the dual-pixels and the dual cameras: a neural network first processes the two kinds of data separately into an intermediate representation, and a final depth map is then computed in a second step.

This image shows depth maps generated from dual-pixel AF, dual-camera and both combined. Dual-pixels provide better depth in the areas visible to only one camera, dual-cameras provide better depth in the background and ground. (Photo: Mike Milne/Google)

In addition to the improved depth estimation, spotlights in the background are now rendered with more contrast, making for more natural-looking results. This is achieved by blurring the merged raw image produced by the HDR+ processing and then applying tone mapping.

Additional depth map samples are available online; head over to the Google Blog for the full article.


 

 

Google explains its Night Sight astrophotography mode in detail

27 Nov

Ever since Google launched Night Sight on the Pixel 3 series, the low-light photography feature has been very popular with users. On the new Pixel 4, Google has updated Night Sight with a dedicated mode for astrophotography. The team behind it has now authored a blog post to explain the function in more detail.

In order to capture as much light as possible without using shutter speeds that would require a tripod and/or lead to blur on any moving subject, Night Sight splits the exposure across multiple frames that are aligned to compensate for camera shake and in-scene motion. In a second step, the frames are averaged to reduce noise and increase image detail.
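The averaging step can be illustrated with a toy simulation (my own sketch, not Google's pipeline): averaging N aligned frames of the same scene reduces random noise by roughly the square root of N.

```python
# Toy simulation: 15 already-aligned noisy frames of a flat gray scene,
# averaged per-pixel. All values are illustrative.
import random
import statistics

random.seed(0)
N_FRAMES, N_PIXELS, SIGNAL, NOISE = 15, 1000, 100.0, 10.0

frames = [[SIGNAL + random.gauss(0, NOISE) for _ in range(N_PIXELS)]
          for _ in range(N_FRAMES)]

# The "averaging" step: per-pixel mean across the burst.
merged = [statistics.mean(f[i] for f in frames) for i in range(N_PIXELS)]

ratio = statistics.stdev(frames[0]) / statistics.stdev(merged)
print(ratio)  # close to sqrt(15), i.e. roughly 3.9
```

In the real pipeline the frames must first be aligned tile by tile to compensate for hand shake; here they are simulated as already aligned.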

The astrophotography feature uses the same approach in principle but uses longer exposure times for individual frames and therefore relies on tripod use or some other kind of support.

Image with hot pixels (left) and the corrected version (right)

The team decided exposure times of individual frames should not be longer than 16 seconds to make the stars look like points of light rather than streaks. The team also found that most users were not patient enough to wait longer than four minutes for a full exposure. So the feature uses a maximum of 15 frames with up to 16 seconds exposure time per frame.
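The two limits quoted above are consistent with each other, as a quick check shows:

```python
# The exposure budget described above: at most 15 frames, each up to 16 s.
frames, max_exposure_s = 15, 16
total_s = frames * max_exposure_s
print(total_s)       # 240 seconds
print(total_s / 60)  # 4.0 minutes, matching the stated patience limit
```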

At such long exposure times, hot pixels can become a problem. The system identifies them by comparing neighboring pixels within the same frame, as well as across the sequence of frames recorded for a Night Sight image. If an outlier is detected, its value is replaced by an average.
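A minimal sketch of this kind of correction, using only the within-frame comparison (the threshold and neighborhood are my own illustrative choices, not Google's):

```python
# Flag a pixel that is a strong outlier versus its four neighbors and
# replace it with their average. Threshold is illustrative.

def correct_hot_pixels(image, threshold=50):
    """image: 2D list of numbers. Returns a corrected copy."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbors = [image[y-1][x], image[y+1][x],
                         image[y][x-1], image[y][x+1]]
            avg = sum(neighbors) / len(neighbors)
            if abs(image[y][x] - avg) > threshold:  # outlier -> hot pixel
                out[y][x] = avg
    return out

img = [[10, 10, 10],
       [10, 255, 10],   # stuck-bright center pixel
       [10, 10, 10]]
print(correct_hot_pixels(img)[1][1])  # 10.0
```

The real system also compares each pixel across the burst of frames, which helps distinguish a hot pixel (bright in every frame) from a genuine point of light such as a star.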

In addition the feature uses AI to identify the sky in night images and selectively darken it for image results that are closer to the real scene than what you would achieve with a conventional long exposure.

This image was captured under the lighting of a full moon. The left half shows the version without any sky processing applied. On the right the sky has been slightly darkened for a more realistic result, without affecting the landscape elements in the frame.

Night Sight is not only about capture, though; it also includes a special viewfinder that is optimized for shooting in ultra-low light. When the shutter is pressed, each individual long-exposure frame is displayed as it is captured, showing much more detail than the standard preview image. The composition can then be corrected and a new Night Sight shot triggered.

Some of the results we have seen have been impressive. For more technical detail, head over to the original post on the Google blog. An album of full-size sample images is also available, and the team has put together a helpful guide for using the feature in PDF format.


 

 

3 Things Aperture Mode is Perfect For in Photography

25 Nov

The post 3 Things Aperture Mode is Perfect For in Photography appeared first on Digital Photography School. It was authored by Mat Coker.


Many new photographers are overwhelmed by all the settings on their camera. But what if you could ignore most of the settings on your camera and just choose one to experiment with? Where would you begin?

I suggest you begin by experimenting with the aperture because this setting has a huge effect on your photos.

Once you know the things Aperture Mode (or Aperture Priority) is perfect for, you’ll have increased your creative possibilities and simplified the camera setting problem.

Here are three things you can do with Aperture Mode.

 

But first, how to put your camera on Aperture Mode

For most cameras, you put the camera in Aperture Mode by turning the mode dial to A (Av on Canon) to take control of your aperture.


This is a Nikon camera; on Canon, you’re looking for Av.

 

When you look at the screen on your camera, you’ll notice a number with an F beside it. This is your aperture value. Use the scroller on your camera to change that number. Experiment and see how high and how low you can make that number go.


When that number is smaller (1.8, 3.5, 5.6) the aperture is wider or more open.

 


When the number is larger (11, 16, 22) the aperture is narrower or more closed.

As we move through the tips, you’ll see how opening or closing your aperture affects your photo. When you’re intentional about setting your aperture, it will drastically change your photo.
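The "smaller number = wider opening" rule follows directly from how the f-number is defined. A quick illustration (mine, not from the post, assuming a hypothetical 50mm lens):

```python
# The f-number N is focal length divided by aperture diameter, so
# diameter = focal_length / N: smaller N means a physically wider opening.
FOCAL_MM = 50  # hypothetical 50mm lens
for n in (1.8, 5.6, 11, 22):
    diameter_mm = FOCAL_MM / n
    print(f"f/{n}: {round(diameter_mm, 1)} mm opening")
# f/1.8 -> 27.8 mm, f/5.6 -> 8.9 mm, f/11 -> 4.5 mm, f/22 -> 2.3 mm
```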

 

1. How to create background blur (or keep the background in focus if you prefer)

Think in terms of opposites for a moment.

Normally, when we take a portrait, we only want the person to be in focus. But when we photograph a landscape, we want the whole photo to be in focus.

I’ll show you how you can use aperture to create background blur for portraits. I’ll also show you the opposite; how to keep the whole scene in focus for landscapes.

The principle is as simple as this: open your aperture for portraits, close it for landscapes.

PS – the technical term for background blur is bokeh (pronounced like the start of ‘bouquet’).


Remember to open your aperture to create background blur in your portraits. Opening your aperture means setting it to the smallest number possible (probably 1.8 or 3.5 or 5.6). I set the aperture to F/2.5 for this portrait.

 


F/1.2 using the 56mm Fuji prime lens

 


Remember to close your aperture to keep the whole scene in focus for landscape photos. Closing your aperture means setting it to a larger number such as 11, 16, or 22. I set the aperture to F/11 for this landscape photo.

 


The aperture is set to F/11 for this landscape photo.

 

How to achieve better bokeh (background blur)

The first thing I told you about bokeh is that you need to open your aperture all the way. That means that you need to set it to the smallest number possible. That number might be 5.6, 3.5, or even 1.8, depending on your lens.

However, opening your aperture all the way isn’t always enough. So I’ll show you a formula for getting an even better bokeh.

My goal for the following portrait of Batman is to have him in focus with a nice blurry background.

There are four simple steps involved; let’s look at them one at a time.

1. Open the aperture


The aperture is set to 3.5

Now, I opened the aperture all the way, but the building isn’t really out of focus. The back part of the building is out of focus, but the part directly behind Batman is still pretty crisp.

The biggest problem is that he is too close to the background, so the second step will make a huge difference.

2. Bring Batman away from the background


Batman has been moved away from the background.

Now the building is out of focus, but let’s make it even more out of focus.

3. Zoom in

So far, I’ve had my lens at its widest angle of 18mm. When I zoom all the way in to 55mm, the background will go more out of focus.


The aperture has closed a little bit to f/5.6 because I zoomed in. This will happen with most lenses.

As well as blurring the background, zooming in also gave the photo a more compressed look.

Would you like the background to be even more blurry? Is it even possible?

4. Get closer

Yes, it is!

The closer you get to Batman, the more out of focus the background becomes.


I used my Olympus Tough TG-6 for this photo. The microscope mode allows me to get very close. The aperture is set to f/6.3 because I zoomed in.

For great bokeh just remember:

  • Open your aperture
  • Step away from the background
  • Zoom in
  • Get closer
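Three of these tips (aperture, zoom, and distance) all act on depth of field, and the standard thin-lens approximation shows each one at work. A hedged sketch with illustrative numbers (the circle-of-confusion value is a typical APS-C assumption, not something from the article):

```python
# Approximate depth of field (the near-to-far in-focus zone) with the
# standard thin-lens formulas. c = 0.02 mm is a typical APS-C circle of
# confusion; all other values are illustrative.

def depth_of_field(f_mm, n, subject_mm, c_mm=0.02):
    h = f_mm ** 2 / (n * c_mm) + f_mm              # hyperfocal distance
    if subject_mm >= h:
        return float('inf')                         # far limit at infinity
    near = subject_mm * (h - f_mm) / (h + subject_mm - 2 * f_mm)
    far = subject_mm * (h - f_mm) / (h - subject_mm)
    return far - near                               # in-focus zone, mm

base = depth_of_field(f_mm=18, n=3.5, subject_mm=2000)    # aperture open
zoomed = depth_of_field(f_mm=55, n=5.6, subject_mm=2000)  # zoomed in
closer = depth_of_field(f_mm=55, n=5.6, subject_mm=1000)  # moved closer
print(base > zoomed > closer)  # True: each step thins the in-focus zone
```

Step 2 (moving the subject away from the background) works differently: it doesn’t change the depth of field itself, it just pushes the background further outside of it.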

Controlling your background blur is just one of the things Aperture Mode is perfect for. Now let’s see what else it can do.

 

2. Starburst effect

The starburst effect adds interest to your photos because we don’t normally see this effect with our eyes.

To achieve the starburst effect, it’s as easy as closing your aperture.


For this landscape photo, I closed the aperture to F/8.

 


For this photo, I set the aperture to F/8. I thought that it would be interesting to capture this bridge using the starburst effect. But I’m disappointed with the angle or perspective. When the river freezes over, I’m going to come back and photograph the bridge from a different perspective. I consider this to be a “sketch shot.” I tried it out, and I know that it’s worth pursuing another photo later on.

The starburst effect is one of the more creative things Aperture Mode is perfect for. Now let’s see one of the biggest problems that Aperture Mode will help solve.

3. Low light photography

One of the biggest problems with dim light is that your photos become blurry from motion.


A typical blurry photo caused by dim light and a slow shutter speed.

Photos become blurry because there is not enough light and the camera takes more time to capture the photo. Technically, it’s a slow shutter speed issue.

The important thing to know is that you need to get more light into the camera. You can get more light in by opening your aperture all the way.

You should also raise your ISO higher (1600, 3200, or 6400).

Your shutter speed may still be a little bit slow, which could lead to motion blur in your photos. But if you hold still while taking the photo, and wait for your subject to hold still, you’ll get a pretty crisp photo.
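To put numbers on how much opening the aperture helps, here is some illustrative arithmetic (mine, not from the article):

```python
# Each stop doubles the light reaching the sensor. The f-number scales
# with the square root of the aperture area, hence the factor of 2 below.
import math

def stops_gained(n_from, n_to):
    """Stops of extra light gained by opening from f/n_from to f/n_to."""
    return 2 * math.log2(n_from / n_to)

gain = stops_gained(5.6, 2.0)   # opening up from f/5.6 to f/2.0
print(round(gain, 1))           # 3.0 stops
print(round(2 ** gain))         # 8x shorter shutter speed at the same ISO
```

That factor of roughly 8 is the difference between a blurry 1/8 sec handheld shot and a crisp 1/60 sec one, which is why opening the aperture is the first move in dim light.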


I captured this candlelight portrait at F/2.0, ISO 4000, shutter speed 1/60 sec

 


F/2.0, ISO 2500, shutter speed 1/60 sec

 


An extreme low light photo captured at f/2.0, ISO 5000, shutter speed 1/15 sec

Sometimes you have no choice but to have a slow shutter speed. Why not get creative and make the most of it?

You’ve increased your skill as a photographer!

You’ve learned four things aperture mode is perfect for. These creative effects are achieved by simply opening or closing your aperture:

  • Blur your background by opening the aperture
  • Keep a landscape in focus by closing your aperture
  • Create a starburst effect by closing your aperture
  • Improve dim light photos by opening your aperture

Focusing on this one camera setting will help improve your photography and cut through camera-setting confusion.

Try these out, and let me know how you go in the comments!



Digital Photography School

 

 

Google’s Pixel 4 Astrophotography mode is now available on Pixel 2, 3 and 3a devices

07 Nov

The Google Pixel 4 offers a range of new and innovative camera features. Some of them are exclusive to the latest Pixel device, mainly due to hardware requirements, but Google has promised to make others available on older Pixel models.

This has now happened in the case of the Pixel 4 Astrophotography mode. The feature had previously been made available for older Pixels via a community-driven development effort, but it’s now officially supported on older devices in the latest version, 7.2, of the Google Camera app. Below are a few sample photos captured with the Astrophotography mode on the Pixel 4:

[Embedded sample gallery: Astrophotography mode photos captured on the Pixel 4]

Users of the Pixel 2 and Pixel 3 series, including the Pixel 3a, are now able to use the feature after updating to the latest version of the app. The Astrophotography option builds on Google’s Night Sight technology and captures and combines several frames to achieve a clean exposure and great detail as well as limited noise levels when photographing the night sky.


 

 

Ricoh adds new ‘Handheld HDR’ still capture mode to its Theta V, Z1 360-degree cameras

28 Oct

Ricoh has released updated versions of its Ricoh Theta app that adds new ‘Handheld HDR’ functionality for its Theta V and Theta Z1 360-degree cameras.

The Ricoh Theta app update (version 1.26.0 on Android and version 2.8.0 on iOS) adds Ricoh’s new ‘Handheld HDR’ capture setting for still images and includes a number of unspecified bug fixes. For the new HDR setting to work, the Theta V and Theta Z1 cameras need to be updated to the latest firmware, version 3.10.1 and version 1.20.1, respectively.

App Store screenshots from the iOS version of the Ricoh Theta app.

All of the apps and firmware updates are free to download. You can find instructions on how to update the Theta V and Theta Z1 firmware on Ricoh’s support pages.


 

 

Google Pixel 4 adds telephoto lens, improved portrait mode and HDR in live view

15 Oct


Google officially unveiled the Pixel 4 today, with the addition of a telephoto camera headlining the camera updates. Other improvements include an enhanced live view experience showing the approximated effects of HDR in real time, added controls for adjusting exposure and tone mapping prior to image capture, and an updated portrait mode with better depth mapping thanks to the additional rear camera.

The Pixel 4 and Pixel 4 XL offer 5.7″ and 6.3″ OLED displays respectively, each with a 90Hz variable refresh rate that Google calls ‘Smooth Display.’ Gone is the fingerprint sensor on the rear of the device, replaced by face unlock. Also new is a technology called Soli, comprising a radar chip that detects hand motions. Called Motion Sense, this feature makes it possible to skip songs and silence calls with a wave of your hand.

As is the case with high-profile phone launches, the camera updates were a center of attention alongside the main specifications (in fact, Annie Leibovitz made an appearance). In addition to the new F2.4, optically stabilized telephoto camera (about 48mm equiv.), Google has introduced improved Super Resolution Zoom for up to 8x digital zoom. In fact, the telephoto camera uses a hybrid of optical and digital zoom at its default setting to achieve approximately 2x zoom.

The process of taking photos has been improved on the Pixel 4 as well. On previous models, the results of Google’s impressive HDR rendering could only be seen after capture – now, machine learning is used to approximate the effect in real-time for a much more ‘what you see is what you get’ experience.

Google Pixel 4 official sample images


Additional exposure controls are also available during image capture. Two new sliders give users direct control of overall scene brightness and rendering of shadows, as compared to the single exposure slider offered by the Pixel 3. Google also says the Pixel 4’s camera is more responsive and stable compared to the Pixel 3, thanks to 6GB of RAM at its disposal.

Portrait mode should see significant improvements as well. The mode now uses information from the telephoto camera as well as split pixels to judge subject distance, creating a better depth map than was previously possible only using split pixels. Portrait mode’s range has also been extended, making it possible to capture large objects as well as human subjects from farther back than was possible on the Pixel 3.

While the telephoto camera lends depth information, the standard camera with a 1.5x digital zoom is used for the image itself. Background blur is now applied to the Raw image before tone mapping, with the aim of creating more SLR-like bokeh. The updated Portrait mode should also handle human hair and dog fur better, and Google says that its face detection has been improved and should handle backlit subjects better.

An astrophotography mode is added to Night Sight, using longer shutter speeds to capture night skies. Additionally, all camera modes will benefit from improved, learning-based white balance – previously used only in Night Sight. Google has also done some white balance tuning for certain light sources.

Google has reduced the number of front-facing cameras from two back down to one. Citing the popularity of the ultra-wide selfie camera, the Pixel 4’s single front-facing camera offers a focal length that’s a happy medium between the standard and ultra-wide options on the Pixel 3.

Google Pixel 4 pre-orders start today; the Pixel 4 starts at $799 and the Pixel 4 XL at $899. Both will ship on October 24th and, for the first time, will be available on all major US carriers, including AT&T.


 

 

How does iPhone 11 Night Mode compare to Google Pixel 3 Night Sight?

15 Oct

Many smartphones today take great images in broad daylight. That’s no surprise – when there’s a lot of light, it doesn’t matter so much that the small smartphone sensor doesn’t collect as many photons as a larger sensor: there’s an abundance of photons to begin with. But smartphone image quality can take a nosedive as light levels drop and there just aren’t many photons to collect (especially for a small sensor). That’s where computational techniques and burst photography come in.

Low light performance is a huge differentiator that separates the best smartphones from the worst. And Google’s Night Sight has been the low-light king of recent times,1 thanks to its averaging of many (up to 15) frames, its clever tile-based alignment to deal with hand movement and motion in the scene, and its use of a super-resolution pipeline that yields far better resolution, particularly color resolution, and lower noise than simple frame-stacking techniques.

With the iPhone 11, Apple launched its own Night Mode to compete with offerings from Android phones. It uses ‘adaptive bracketing’ to combine both long and short exposures (to freeze any movement) to build a high quality image in low light conditions. Let’s see how it stacks up compared to Google’s Night Sight and Apple’s own previous generation iPhone XS.

The set-up

‘Low light performance’ is difficult to sum up in one number or picture when it comes to computational imaging. Different devices take different approaches, which ultimately means that comparative performance across devices can vary significantly with light level. Hence we’ve chosen to look at how the iPhone 11 performs as light levels decrease from evening light before sunset to very low light conditions well after sunset. The images span an hour-long time frame, from approximately 500 lux to 5 lux. All shots are handheld, since this is how we expect users to operate their smartphones. The iPhone 11 images spanning this time period are shown below.

7:00 pm, evening light: 1/60 | ISO 100 | 485 lux | 7.6 EV

7:25 pm, late evening light: 1/8 | ISO 250 | 25 lux | 3.4 EV

7:50 pm, low light: 1/4 | ISO 640 | 5 lux | 1 EV

8:05 pm, very low light: 1/8 | ISO 1250 | <5 lux | <1 EV
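As a side note, the lux and EV pairs quoted in these captions are consistent with the common incident-light approximation EV100 = log2(lux / 2.5); the 2.5 calibration constant is a typical meter value, not something stated in the article.

```python
# Check the quoted lux -> EV pairs against EV100 = log2(lux / 2.5).
# The 2.5 constant is a common incident-meter calibration assumption.
import math

def lux_to_ev(lux, calibration=2.5):
    return math.log2(lux / calibration)

print(round(lux_to_ev(485), 1))  # 7.6, matching the 485 lux reading
print(round(lux_to_ev(5), 1))    # 1.0, matching the 5 lux reading
```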

Note that Night mode is only available with the main camera unit, not the 2x or 0.5x cameras. And before we proceed to our comparisons, please see this footnote about the rollovers and crops that follow: on ‘HiDPI’ screens like smartphones and higher-end laptops/displays, the following crops are 100%, but on ‘standard’ displays you’ll only see 50% crops.2

Now, on to the comparisons. In the headings, we’ve labeled the winner.

Evening light (485 lux) | Winner: Google Pixel 3

Before sunset, there’s still a good amount of available light. At this light level (485 lux, as measured by the iPhone 11 camera), the option for Night mode on iPhone 11 is not available. Yet Night Sight on the Google Pixel 3 is available, as it is in all situations. And thanks to its averaging of up to 15 frames and its super-resolution pipeline, it provides far more detail than the iPhone 11.

It’s not even close.

Take a look at the detail in the foreground trees and foliage, particularly right behind the fence at the bottom. Or the buildings and their windows up top, which appear far crisper on the Pixel 3.

Late evening light (25 lux) | Winner: Google Pixel 3

As the sun sets, light levels drop, and at 25 lux we finally have the option to turn on Night Mode on the iPhone, though it’s clearly not suggested by Apple since it’s not turned on by default. You’ll see the Night Mode option as a moon-like icon appearing on the bottom left of the screen in landscape orientation. Below we have a comparison of the iPhone with Night Mode manually turned on next to the Google Pixel 3 Night Sight (also manually enabled).

There’s more detail and far less noise – particularly in the skies – in the Google Pixel 3 shot. It’s hard to tell what shutter speeds and total exposure time either camera used, because these stacking techniques use differing shutter speeds and discard frames or tiles at will based on their quality or usability. But it appears that, at best, the Pixel 3 utilized 15 frames at 1/5s, or 3s total, while the iPhone 11 indicated in its user interface that it would use a total of 1s (the EXIF reads 1/8s, which is likely unrepresentative). In other words, it appears the Pixel 3 used a longer total exposure time here.

Apart from that, though, the fact that the iPhone result looks noisier than the same shot with Night Mode manually turned off (not shown) leads us to believe that the noisy results are at least in part due to Apple’s decision to use less noise reduction in Night Mode. This mode appears to assume that the longer overall exposures will lead to lower noise and, therefore, less of a need for noise reduction.

However, in the end, it appears that under these light levels Apple is not using a long enough total exposure (the cumulative result of short and long frames) to yield low enough noise results that the lower noise reduction levels are appropriate. So, in these conditions when it appears light levels are not low enough for Apple to turn on Night Mode by default, the Google Pixel 3 outperforms, again.

Low light (5 lux) | Winner: Tie

As light levels drop further to around 5 lux, the iPhone 11 Night mode appears to catch up to Google’s Night Sight. Take a look above, and it’s hard to choose a winner. The EXIF data indicates the Pixel used 1/8s shutter speeds per frame, while the iPhone used at least 1/4s shutter speed for one or more frames, so it’s possible that the iPhone’s use of longer exposure times per frame allows it to catch up to Google’s result, despite presumably using fewer total frames. Keynotes from Apple and personal conversations with Google indicate that Apple only uses up to 8-9 frames of both short and long exposures, while the Pixel uses up to 15 frames of consistent exposure, for each phone’s respective burst photography frame-stacking methods.

Very low light (< 5 lux) | Winner: iPhone 11

As light levels drop even further, the iPhone 11 catches up to and surpasses Google’s Night Sight results. Note the lower noise in the dark blue sky above the cityscape. And while overall detail levels appear similar, buildings and windows look crisper thanks to the higher signal-to-noise ratio. We presume this is due to the use of longer exposure times per frame.

It’s worth noting the iPhone, in this case, delivers a slightly darker result, which arguably ends up being more pleasing, to me anyway. Google’s Night Sight also does a good job of ensuring that nighttime shots don’t end up looking like daytime, but Apple appears to take a slightly more conservative approach.

We shot an even darker scene to see if the iPhone’s advantage persisted. Indeed, the iPhone 11’s advantage became even greater as light levels dropped further. Have a look below.

(Night Mode Off)

(Night Sight Off)

As you can see, the iPhone 11 delivers a more pleasing result, with more detail and considerably less noise, particularly in peripheral areas where lens vignetting lowers image quality, as evidenced by the drastically increased noise in the Pixel 3 result.

Ultimately it appears that the lower the light levels, the better the iPhone 11 performs comparatively.

A consideration: (slightly) moving subjects

Neither camera’s night mode is meant for photographing moving subjects, but that doesn’t mean they can’t deal with motion. Because these devices use tile-based alignment to merge frames onto a base frame, static and moving subjects in a scene can be treated differently. For example, on the iPhone, shorter exposures can be used for moving subjects and longer exposures for static ones. Frames with too much motion blur around the moving subjects may be discarded, or perhaps only have their static portions used if the algorithms are clever enough.
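A toy sketch of this kind of tile-based merge, in which tiles that deviate too much from the base frame (a crude proxy for subject motion or misalignment) are dropped rather than averaged in. This illustrates the general technique only, not Apple’s or Google’s actual pipeline; the tile size and error threshold are made up:

```python
import numpy as np

def merge_tiles(base, frames, tile=16, max_err=10.0):
    """Toy tile-based merge: average each tile across frames, but drop
    tiles that differ too much from the base frame (a crude proxy for
    subject motion or misalignment).

    Illustration only; not Apple's or Google's actual algorithm.
    """
    out = base.astype(np.float32).copy()
    h, w = base.shape
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            ref = base[y:y + tile, x:x + tile].astype(np.float32)
            stack = [ref]
            for frame in frames:
                cand = frame[y:y + tile, x:x + tile].astype(np.float32)
                # Mean absolute difference as a crude alignment error.
                if np.abs(cand - ref).mean() < max_err:
                    stack.append(cand)   # static enough: merge it in
                # else: discard this tile (too much motion/blur)
            out[y:y + tile, x:x + tile] = np.mean(stack, axis=0)
    return out
```

The payoff of working per tile rather than per frame is that one moving subject only costs you the tiles it occupies, while the rest of the frame still contributes to noise reduction.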

Below we take a look at a slightly moving subject in two lighting conditions: the first dark enough for Night mode to be available as an option on the iPhone (though it isn’t automatically triggered until darker conditions), and the second in very dim indoor lighting where Night mode automatically triggers.

Although I asked my subject to stay still, she moved around a bit as children are wont to do. The iPhone handles this modest motion well. You’ll recall that Apple’s Night mode uses adaptive bracketing, meaning it can combine both short and long exposures for the final result. It appears that the exposure times used for the face weren’t long enough to avoid a considerable degree of noise, which is exacerbated by more conservative application of noise reduction to Night mode shots. Here, we prefer the results without Night mode enabled, despite the slight watercolor painting-like result when viewed at 100%.

We tested the iPhone 11 vs. the Google Pixel 3 with very slightly moving subjects under even darker conditions below.

Here you can see that Apple’s Night mode yields lower noise than with the mode (manually) turned off. With the mode turned off, it appears Deep Fusion is active3, which yields slightly more detail at the cost of more noise (the lack of a smeary, watercolor painting-like texture is a giveaway that Deep Fusion kicked in). Neither iPhone result is as noise-free and crisply detailed as the Pixel 3 Night Sight shot, though. We can speculate that the Pixel’s better result is due to the use of more total frames, more effective use of frames where the subject has slightly moved, or some combination thereof. Google’s tile-based alignment can deal with inter-frame subject movement of up to 8% of the frame, instead of simply discarding tiles and frames where the subject has moved. It is unclear how robust Apple’s align-and-merge algorithm is by comparison.
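Google’s stated tolerance – inter-frame subject movement of up to ~8% of the frame – can be thought of as a simple gate on measured displacement before a tile or frame is discarded. A hypothetical sketch (the function name and the per-tile gate are our illustration, not Google’s published implementation):

```python
def within_motion_tolerance(dx_px, dy_px, frame_w, frame_h, frac=0.08):
    """True if measured inter-frame displacement fits within the stated
    ~8%-of-frame budget, i.e. the content can still be aligned and used
    rather than discarded. Hypothetical helper, for illustration only."""
    return abs(dx_px) <= frac * frame_w and abs(dy_px) <= frac * frame_h

# A 12MP (4032 x 3024) frame tolerates roughly 322px of horizontal motion:
print(within_motion_tolerance(300, 100, 4032, 3024))  # True
print(within_motion_tolerance(500, 100, 4032, 3024))  # False
```

At 12MP, an 8% budget is a surprisingly generous few hundred pixels, which is why Night Sight can keep contributing frames even when a child fidgets.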

Vs. iPhone XS

We tested the iPhone 11 Night Mode vs. the iPhone XS, which has no Night Mode to begin with. As you can see below, the XS image is far darker, with more noise and less detail than the iPhone 11. This is no surprise, but it’s informative to see the difference between the two cameras.

Conclusion

iPhone 11’s Night Mode is formidable and a very welcome tool in Apple’s arsenal. It not only provides pleasing images for its users, but sometimes even surpasses what is easily achievable with dedicated cameras. In the very lowest of light conditions, Apple has even managed to surpass the results of Google’s Night Sight, highly regarded – and rightfully so – as the industry standard for low light smartphone photography.

But there are some caveats. First, in less low light conditions – situations you’re actually more likely to be shooting in – Google’s use of more frames and its super-resolution pipeline mean that its Pixel 3 renders considerably better results, both in terms of noise and resolution. In fact, the Pixel 3 can out-resolve even the full-frame Sony a7S II, with more color resolution and less color aliasing.

Second, as soon as you throw people into the mix as subjects, things get a bit muddled. Both cameras perform pretty well, but we found Google’s Night Sight to more consistently yield sharper images with modest subject motion in the scene. Its use of up to 15 frames ensures lower noise, and its align-and-stack method can make use of many of those frames even if your subject has slightly moved, since the algorithm can tolerate inter-frame subject movement of up to ~8% of the frame.

If you’re photographing perfectly still scenes in very low light, Apple’s iPhone 11 is your best bet

None of that should detract from Apple’s effort here, which is currently class-leading under very, very low light conditions, where the iPhone can fuse multiple frames of very long exposure. We’re told the iPhone 11 can use total exposure times of 10s handheld, and 28s on a tripod. Google’s Night Sight, on the other hand, tends to cap exposures at 1/3s per frame handheld, or up to 1s per frame on a tripod. However, rumors suggest the Pixel 4 will be capable of even longer total exposures, so it remains to be seen who will be the ultimate low light king.
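Combining those per-frame caps with Night Sight’s 15-frame ceiling gives rough upper bounds on total exposure for each phone. All figures below are the estimates quoted in this article, not official specifications:

```python
# Rough total-exposure ceilings from the figures quoted in the article.
NIGHT_SIGHT_MAX_FRAMES = 15  # Pixel 3, per our conversations with Google

limits_s = {
    "Pixel 3 handheld":   NIGHT_SIGHT_MAX_FRAMES * (1 / 3),  # ~5s
    "Pixel 3 tripod":     NIGHT_SIGHT_MAX_FRAMES * 1.0,      # 15s
    "iPhone 11 handheld": 10.0,  # reported total
    "iPhone 11 tripod":   28.0,  # reported total
}

for name, seconds in limits_s.items():
    print(f"{name}: ~{seconds:.0f}s")
```

By this back-of-the-envelope math, the iPhone’s tripod ceiling is nearly twice the Pixel 3’s, which is consistent with its growing advantage as light levels fall.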

Currently though, if you’re photographing perfectly still scenes in very low light, Apple’s iPhone 11 is your best bet. For most users, factoring in moving subjects and less low light (yet still dark) conditions, Google’s Night Sight remains the technology to beat.


Footnotes:

1 Huawei phones have their own formidable night modes; while we haven’t gotten our hands on the latest P30 Pro, The Verge has its own results that show a very compelling offering from the Chinese company.

2 A note about our presentation: these are rollovers, so on desktop you can hover your mouse over the states below the image to switch the crop. On mobile, simply tap the states at the bottom of each rollover to switch the crop. Tap (or click) on the crop itself to launch a separate window with the full-resolution image. Finally, on ‘Retina’ laptops and nearly all modern higher-end smartphones, these are 100% crops (each pixel maps 1 display pixel); however, on ‘standard’ (not HiDPI) displays these are 50% crops. In other words, on standard displays the differences you see are actually under-represented. [return to text]

3 We had updated the iPhone 11 to the latest iOS 13.2 public beta by the time this set of shots was taken; hence the (sudden) availability of Deep Fusion.

Comments Off on How does iPhone 11 Night Mode compare to Google Pixel 3 Night Sight?
