
Posts Tagged ‘HDR+’

Google shares a deep dive into its new HDR+ with Bracketing technology found in its latest Pixel devices

26 Apr

Google has shared an article on its AI Blog that dives into the intricacies of the HDR capabilities of its most recent Pixel devices. In it, Google explains how its HDR+ with Bracketing technology works to capture the best image quality possible through clever capture and computational editing techniques.

To kick off the article, Google explains how its new ‘under the hood’ HDR+ with Bracketing technology — first launched on the Pixel 4a 5G and Pixel 5 back in October — ‘works by merging images taken with different exposure times to improve image quality (especially in shadows), resulting in more natural colors, improved details and texture, and reduced noise.’

Using bursts to improve image quality. HDR+ starts from a burst of full-resolution raw images (left). Depending on conditions, between 2 and 15 images are aligned and merged into a computational raw image (middle). The merged image has reduced noise and increased dynamic range, leading to a higher quality final result (right). Caption and image via Google.

Before diving into how the behind-the-scenes work is done to capture the HDR+ with Bracketing images, Google explains why high dynamic range (HDR) scenes are difficult to capture, particularly on mobile devices. ‘Because of the physical constraints of image sensors combined with limited signal in the shadows […] We can correctly expose either the shadows or the highlights, but not both at the same time.’

Left: The result of merging 12 short-exposure frames in Night Sight mode. Right: A single frame whose exposure time is 12 times longer than an individual short exposure. The longer exposure has significantly less noise in the shadows but sacrifices the highlights. Caption and image via Google.

Google says one way to combat this is to capture two different exposures and combine them — something ‘Photographers sometimes [do to] work around these limitations.’ While this works fairly well on cameras with larger sensors, and on tablets and laptops whose more capable processors can handle merging the images, Google says it’s a challenge on mobile devices because it requires ‘Capturing additional long exposure frames while maintaining the fast, predictable capture experience of the Pixel camera’ and ‘Taking advantage of long exposure frames while avoiding ghosting artifacts caused by motion between frames.’

Google was able to mitigate these issues with its original HDR+ technology by prioritizing the highlights in an image and using burst photography to reduce noise in the shadows. Google explains the HDR+ method ‘works well for scenes with moderate dynamic range, but breaks down for HDR scenes.’ As for why, Google breaks down the two types of noise that creep into an image when capturing bursts of photos: shot noise and read noise.

Google explains the differences in detail:

‘One important type of noise is called shot noise, which depends only on the total amount of light captured — the sum of N frames, each with E seconds of exposure time, has the same amount of shot noise as a single frame exposed for N × E seconds. If this were the only type of noise present in captured images, burst photography would be as efficient as taking longer exposures. Unfortunately, a second type of noise, read noise, is introduced by the sensor every time a frame is captured. Read noise doesn’t depend on the amount of light captured but instead depends on the number of frames taken — that is, with each frame taken, an additional fixed amount of read noise is added.’


As visible in the above image, Google highlights ‘why using burst photography to reduce total noise isn’t as efficient as simply taking longer exposures: taking multiple frames can reduce the effect of shot noise, but will also increase read noise.’
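Google’s point can be made concrete with a toy noise model. The function and numbers below are illustrative assumptions, not taken from Google’s pipeline: shot noise depends only on the total light gathered, while read noise adds a fixed contribution per captured frame.

```python
import math

def total_noise(signal_per_frame, n_frames, read_noise=3.0):
    """Toy noise model: shot noise grows with total captured signal,
    while read noise adds a fixed contribution per captured frame."""
    total_signal = signal_per_frame * n_frames
    shot = math.sqrt(total_signal)           # photon (shot) noise
    read = math.sqrt(n_frames) * read_noise  # read noise accumulates per frame
    return math.sqrt(shot**2 + read**2)

# One long exposure gathering 1200 electrons vs. 12 short frames of 100 each:
long_exposure = total_noise(1200, 1)
burst = total_noise(100, 12)
print(long_exposure, burst)  # the burst is noisier, purely due to read noise
```

Both captures collect the same total light, so their shot noise is identical; the burst loses only because it pays the per-frame read-noise cost 12 times instead of once.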

To address this shortcoming, Google explains how a ‘concentrated effort’ to build on recent ‘incremental improvements’ in exposure bracketing let it combine the burst photography component of HDR+ with the more traditional HDR method of exposure bracketing, producing the best possible result in extreme high dynamic range scenes:

‘To start, adding bracketing to HDR+ required redesigning the capture strategy. Capturing is complicated by zero shutter lag (ZSL), which underpins the fast capture experience on Pixel. With ZSL, the frames displayed in the viewfinder before the shutter press are the frames we use for HDR+ burst merging. For bracketing, we capture an additional long exposure frame after the shutter press, which is not shown in the viewfinder. Note that holding the camera still for half a second after the shutter press to accommodate the long exposure can help improve image quality, even with a typical amount of handshake.’

Google explains how its Night Sight technology has also been improved through the use of its advanced bracketing technology. As visible in the illustration below, the original Night Sight mode captured 15 short exposure frames, which it merged to create the final image. Now, Night Sight with bracketing will capture 12 short and 3 long exposures before merging them, resulting in greater detail in the shadows.

Capture strategy for Night Sight. Top: The original Night Sight captured 15 short exposure frames. Bottom: Night Sight with bracketing captures 12 short and 3 long exposures. Caption and image via Google.

As for the merging process, Google says its technology chooses ‘one of the short frames as the reference frame to avoid potentially clipped highlights and motion blur.’ The remaining frames are then aligned with the reference frame before being merged.
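As a rough illustration of that selection step, one could rank the candidate short frames by how many of their pixels are clipped. The criterion and clip level below are assumptions for illustration only; Google also weighs motion blur, which this sketch ignores.

```python
import numpy as np

def pick_reference(short_frames, clip_level=1023):
    """Sketch: choose the short frame with the fewest clipped pixels
    (least risk of blown highlights) as the merge reference.
    The selection criterion here is illustrative, not Google's."""
    clipped_counts = [int(np.count_nonzero(f >= clip_level)) for f in short_frames]
    return int(np.argmin(clipped_counts))

# Frame 0 has two saturated pixels; frame 1 has none, so it wins.
frames = [np.array([[1023, 500], [1023, 400]]),
          np.array([[900, 500], [1000, 400]])]
print(pick_reference(frames))  # 1
```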

To reduce ghosting artifacts caused by motion, Google says it’s designed a new spatial merge algorithm, similar to that used in its Super Res Zoom technology, ‘that decides per pixel whether image content should be merged or not.’ Unlike Super Res Zoom though, this new algorithm faces additional challenges due to the long exposure shots, which are more difficult to align with the reference frame due to blown out highlights, motion blur and different noise characteristics.
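A minimal sketch of a per-pixel merge decision in that spirit follows. This is not Google’s algorithm: the noise threshold rule and simple averaging are assumptions chosen to show the idea of rejecting pixels that disagree with the reference.

```python
import numpy as np

def merge_with_deghosting(reference, long_frame, noise_sigma=2.0, k=3.0):
    """Sketch of a per-pixel merge decision: blend the long exposure
    into the reference only where the two frames agree to within the
    expected noise level; elsewhere keep the reference to avoid ghosts."""
    ref = reference.astype(np.float64)
    lng = long_frame.astype(np.float64)
    consistent = np.abs(lng - ref) < k * noise_sigma  # per-pixel merge mask
    merged = ref.copy()
    # Simple average where frames agree; reference pixel elsewhere.
    merged[consistent] = 0.5 * (ref[consistent] + lng[consistent])
    return merged

ref = np.full((4, 4), 100.0)
long_f = ref.copy()
long_f[0, 0] = 160.0   # simulated moving object in the long exposure
out = merge_with_deghosting(ref, long_f)
print(out[0, 0], out[1, 1])  # 100.0 (ghost rejected), 100.0 (merged)
```

A real pipeline would align the frames first and weight the blend by each frame’s noise, but the per-pixel accept/reject mask is the core of the deghosting idea Google describes.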

Left: Ghosting artifacts are visible around the silhouette of a moving person, when deghosting is disabled. Right: Robust merging produces a clean image. Caption and image via Google.

Google is confident it’s been able to overcome those challenges though, all while merging images even faster than before:

‘Despite those challenges, our algorithm is as robust to these issues as the original HDR+ and Super Res Zoom and doesn’t produce ghosting artifacts. At the same time, it merges images 40% faster than its predecessors. Because it merges RAW images early in the photographic pipeline, we were able to achieve all of those benefits while keeping the rest of processing and the signature HDR+ look unchanged. Furthermore, users who prefer to use computational RAW images can take advantage of those image quality and performance improvements.’

All of this is done behind the scenes without any need for the user to change settings. Google notes ‘depending on the dynamic range of the scene, and the presence of motion, HDR+ with bracketing chooses the best exposures to maximize image quality.’

Google’s HDR+ with Bracketing technology is found on its Pixel 4a 5G and Pixel 5 devices with the default camera app, Night Sight and Portrait modes. Pixel 4 and 4a devices also have it, but it’s limited to Night Sight mode. It’s also safe to assume this and further improvements will be available on Pixel devices going forward.

You can read Google’s entire blog post in detail on its AI blog at the link below:

HDR+ with Bracketing on Pixel Phones

Articles: Digital Photography Review (dpreview.com)

 

 

Apple removes claim that its Pro Display XDR goes ‘far beyond HDR’ in the UK

12 Apr

Following a complaint and subsequent review by the Advertising Standards Authority (ASA) in the UK, Apple has changed the marketing for its $5,000 Pro Display XDR. Per 9to5Mac, the ASA asked Apple to remove the term ‘Far beyond HDR’ from its marketing materials for its flagship display, a request with which Apple complied, at least in the UK. In the US, the phrase ‘Far beyond HDR’ remains live.

The phrase ‘Far beyond HDR’ has become a sticking point because some customers believe it’s misleading. The Pro Display XDR displays 99% of the P3 wide color gamut, and complaints have alleged that the term ‘Far beyond HDR’ suggests that the display shows 100% of the P3 color gamut.

On Apple’s US store, the term ‘Far beyond HDR’ remains present.

In response to the complaints, Apple has taken two steps. It has removed ‘Far beyond HDR’ from its UK website, as mentioned. It has also added a footnote following the sentence, ‘A P3 wide color gamut provides a color palette capable of creating the most vibrant imagery.’ This footnote corresponds to small text at the bottom of the product page, which states, ‘Pro Display XDR supports 99% of the P3 wide color gamut.’ No such footnote currently exists on the product page in the US.

On the other hand, in the UK, the term ‘Far beyond HDR’ has been removed.

The ASA has also taken issue with Apple’s claim that its XDR display has a 1,000,000:1 contrast ratio. As of now, that claim remains on Apple’s website. 9to5Mac reports that Apple is having independent tests completed, which Apple hopes will corroborate its contrast ratio claim.

As you can see in this screenshot from Apple’s US store, there’s no footnote about color space performance on the Pro Display XDR’s product page.

On the ASA’s website, the complaint against Apple is listed as informally resolved. Since the complaints were in the UK, they have no impact on Apple’s obligations in other markets.

When Apple first announced the Pro Display XDR in 2019, the California-based company made many lofty claims. Some of them can be verified, such as claims about color space and contrast ratio, while others are more difficult to confirm.

In the UK, however, the text in the ‘Show your truest colors’ section now includes a footnote that corresponds to the text, ‘Pro Display XDR supports 99% of the P3 wide color gamut.’ Click to enlarge.

For example, Apple says the Pro Display XDR is the ‘world’s best pro display.’ What does that even mean? It likely means something different to different users. For what it’s worth, reviews for the display have been generally very positive, with many claiming that the display features incredible build quality and fantastic performance.

Apple’s popularity and position mean that the company attracts a lot of attention, not all of it positive. The company is no stranger to complaints, investigations and general government oversight across the many markets in which it operates. It’s merely part of doing business, big business in Apple’s case. Does Apple’s Pro Display XDR go ‘far beyond HDR?’ Well, I guess that depends on who, or rather, where you ask.


 

Ricoh adds new ‘Handheld HDR’ still capture mode to its Theta V, Z1 360-degree cameras

28 Oct

Ricoh has released updated versions of its Ricoh Theta app that adds new ‘Handheld HDR’ functionality for its Theta V and Theta Z1 360-degree cameras.

The Ricoh Theta app update (version 1.26.0 on Android and version 2.8.0 on iOS) adds Ricoh’s new ‘Handheld HDR’ capture setting for still images and addresses a number of unspecified bug fixes. For the new HDR setting to work, the Theta V and Theta Z1 cameras need to be updated to the latest firmware, version 3.10.1 and version 1.20.1, respectively.

App Store screenshots from the iOS version of the Ricoh Theta app.

All of the apps and firmware updates are free to download. You can find instructions on how to update the Theta V and Theta Z1 firmware on Ricoh’s support pages.


 

Google’s Dual Exposure Controls and Live HDR+ features remain exclusive to Pixel 4

22 Oct

Google’s brand new flagship smartphone Pixel 4 comes with a variety of new and innovative camera features. Google has already said that some of those features, for example the astrophotography mode, will be made available on older Pixel devices via software updates.

However, two of the new functions will be reserved for the latest generation Pixel devices: Dual exposure controls, which let you adjust highlights and shadows via sliders in the camera app, and Live HDR+, which gives you a WYSIWYG live preview of Google’s HDR+ processing. Google confirmed this in a tweet:

According to the tweet, the reason these two features won’t be available on the Pixel 3 and older devices is down to hardware rather than a marketing decision. It appears the older phones simply don’t have enough processing oomph to display the HDR+ treatment and manual adjustments in real time.


 

Moment Pro adds HDR+ with support for Pixel 2’s Visual Core hardware

03 Aug

Moment has released an update for its Moment Pro camera app that gives Pixel 2 smartphone owners access to the same HDR+ shooting mode available in the Google Camera app. The third-party shooting mode utilizes the Pixel Visual Core co-processor hardware Google built into the Pixel 2 smartphones, which Google activated for third-party apps via a software update issued in February.


As with Instagram, Snapchat, and WhatsApp, Moment Pro’s new HDR+ feature takes advantage of the Pixel 2’s image processing hardware to rapidly capture high dynamic range images, which are processed in the background. As Google previously explained, the Pixel Visual Core efficiently uses power to reduce battery drain while processing the HDR+ content.

In addition to the Pixel 2 HDR+ support, Moment Pro’s update adds a map with location data in detailed view, assuming location info is available, as well as the option to make Moment Pro the device’s default camera, performance improvements, and bug fixes. The camera app is available from the Google Play Store for $1.99 USD.

Via: Engadget


 

Vivo’s AI-powered ‘Super-HDR’ tech takes on Google’s HDR+

15 Mar

Google’s HDR+ mode is widely regarded as the current benchmark for computational imaging on smartphones, but Chinese manufacturer Vivo wants to unseat the champion. Earlier today, Vivo announced its AI-powered Super HDR feature—a direct competitor to the Google system found in Pixel devices.

Super HDR is designed to improve HDR performance while keeping a natural and “unprocessed” look. To achieve this, the system captures 12 exposures (Google uses 9) and merges them into a composite image, allowing for fine control over image processing.

Additionally, AI-powered scene detection algorithms identify different elements of a scene—for example: people, the sky, the clouds, rocks, trees, etc.—and adjust exposure for each of them individually. According to Vivo, the end result looks more natural than most images that use the simpler tone-mapping technique.

Looking at the provided sample images, the system appears to be doing an impressive job. That said, these kinds of marketing images have to be taken with a pinch of salt; we’ll see what the system is really capable of when it’s available in a production device we can test.

Speaking of which, as of now we don’t know which device Super HDR will ship on first, but there is a chance it might be implemented in the upcoming Vivo V9, which is expected to be announced on March 22nd. The V9 is currently rumored to feature a Snapdragon 660 chipset and a 12MP+8MP dual camera.


 

Google enables HDR+ for Instagram and other apps on Google Pixel 2

07 Feb

Google’s latest generation Pixel 2 smartphones come with the built-in Visual Core dedicated imaging processor that powers the HDR+ mode’s sophisticated multi-frame-stacking computational imaging functions and other camera features. However, Visual Core wasn’t activated when the Pixel 2 devices first launched, and was only enabled for developers in November of last year.

The latest Android update now brings the power of Visual Core to all Pixel 2 users, an update smartphone photographers should be very excited about.

This update mainly means that Google’s excellent HDR+ mode is now available in all apps that call the camera and target API level 26, not just Google’s own camera app. According to Google, this includes popular examples such as Instagram, WhatsApp and Snapchat, but we hope it also covers some of the powerful third-party camera apps available on Google Play.

Previously, those apps relied on a much more basic camera API that could not produce the same image quality as HDR+.

The Android update for the Google Pixel 2 will be rolling out over the next few days, along with other software improvements, so make sure you install the newest version as soon as it becomes available to take full advantage of the phone’s camera capabilities.


 

Gear of the Year 2017 – Allison’s choice: Google’s HDR+ mode

16 Nov

I was told. And I believed. But I didn’t quite understand how good Google’s Auto HDR+ mode is. After shooting with the Pixel 2 in some very challenging lighting conditions, I’m a believer.

Google’s HDR+ mode is really, really good. And I’m prepared to defend it as my Gear of the Year.

Like I said, I was told. Our own Lars Rehm was impressed with Auto HDR+ in his Google Pixel XL review of last year. In his words: “the Pixel XL is capable of capturing decent smartphone image quality in its standard mode but the device really comes into its own when HDR+ is activated… The Pixel camera is capable of capturing usable images in light conditions that not too long ago some DSLRs would have struggled with.”

So heading out with the Pixel 2 in hand, I knew that was a strong suit of the camera. I was looking forward to testing it on some challenging scenes. Things didn’t look too promising though as the day started off pretty miserably.

The afternoon forecast looked better, but any Seattleite can tell you there are no guarantees in October. I figured I had a day of dull, flat lighting ahead of me that I’d have to get creative with. I was happily proved wrong.

The clouds started to thin out mid-afternoon. On a long walk from the bus toward Gas Works Park, I came across this row of colorful townhouses. The sun was behind them, and I snapped a photo that looked like a total loss as I composed it on the screen – the houses too dark and lost in the shadows. I didn’t want to blow out the sky to get those details in the houses, so I just took what I figured was a dud of a photo and moved on. So what I saw on my computer screen later was a total surprise to me: a balanced, if somewhat dark exposure, capturing the houses and the sky behind them.

Am I going to print this one, frame it and put it on the wall? No. But I’m impressed that it’s a usable photo, and it took no knowledge of exposure or post-processing to get it.

Gas Works used to be a ‘gasification’ plant owned by the Seattle Gas Light Company and was converted into a park in the mid-’70s. Some of the industrial structures remain, monuments to a distant past surrounded now by green parkland and frequented by young families with dogs and weed-vaping tech bros alike. On a sunny afternoon in October it was, both literally and figuratively, lit.

I was convinced my photos were not turning out, but I kept taking them anyway. It’ll just be a deep shadows, blue sky kind of look, I thought. Little did I know that the Pixel 2 was outsmarting me every step of the way.

Back at my desk with the final photos in front of me, I was genuinely impressed by the Pixel 2. Did it do anything that I couldn’t with a Raw file and about 30 seconds of post processing? Heck no. But the point is that this is the new normal for a lot of people who take pictures and have no interest in pulling shadows in Photoshop. They will point their cameras at high contrast scenes like these and come away with the photos they saw in their heads. If you ask me, it’s just one more reason why smartphones will topple the mighty entry-level DSLR.

Apple’s catching on too. HDR Auto is enabled by default in new iPhones and veteran photographer/iPhone user Jeff Carlson is also impressed by how the 8 Plus handles high contrast scenes.

While smartphone manufacturers have been increasingly implementing HDR as an always-on-by-default feature, they’ve also been making these modes smarter and the effect more aggressive. What previously took technical know-how, dedicated software, and multiple exposures is now happening with one click of a virtual shutter button, and it’s going to keep getting better.
