
Posts Tagged ‘Pixel’

iPhone 11’s Portrait Mode sets a high bar for the Pixel 4

12 Oct
Taken with iPhone 11 | ISO 500 | 1/30 sec | F1.8

The bokeh-imitation effect that’s all over your Instagram feed is a few generations old, but it’s still a relatively young technology. Portrait Mode, as Apple calls it, is a computational feature that mimics the shallow depth of field closely associated with professional portrait photography. The latest iteration in the iPhone 11 is a great leap forward and, when compared with Google’s Pixel 3, shows that the search-engine giant is going to have to do something pretty special with the forthcoming Pixel 4.

Just to make sure you’re caught up – phone sensors and the lenses they are coupled with are quite small, and inherently limited in their ability to create a blurry background behind a subject. Hence, portrait mode was born (Portrait Mode is Apple’s proprietary name, but for the sake of simplicity I’ll use it throughout this article to refer to all such modes).

Compared side-by-side with results from the iPhone 11, the Pixel 3 has been surpassed in many respects

Like so many first-generation technologies, portrait mode was a bit dodgy at first, with subjects poorly separated from their backgrounds and results that were decent but not quite convincing. But no matter where you stand on its current state, from “It’s so terrible it’s an insult to photographers” to “Eh, it’s passable,” there’s no denying that it has steadily improved with each generation.

Apple, like most manufacturers, introduced Portrait Mode when it brought dual cameras to its devices. However, Google chose to offer it with a single camera, relying on dual pixel depth data, machine learning, and up-sampling to create that fake bokeh look. The results looked fine until, well, about now.

Compared side-by-side with results from the iPhone 11, the Pixel 3 has been surpassed in many respects. Here are the areas in which the iPhone 11 pulls clearly ahead of the Pixel – and where Google needs to do some catching up in the Pixel 4.

The nitty gritty details

Google achieves its portrait mode by digitally zooming in to mimic a longer focal length and creating a depth map, using dual pixels along with a learning-based algorithm to judge distance to a subject and separate it from its background, then up-sampling the final result to a full 12MP resolution. Apple (like Samsung, Huawei and others) instead uses its telephoto camera, calculating depth with the help of the perspective offset between the telephoto and wide cameras – no cropping or up-sampling needed.
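The two-camera approach boils down to classic stereo geometry. Here's a minimal sketch (not Apple's actual pipeline) of the principle: a subject's apparent shift between two lenses set a known distance apart gives its distance via depth = focal × baseline / disparity. All numbers below are illustrative assumptions, not real iPhone 11 specs.

```python
# Sketch of the stereo principle behind dual-camera depth maps.
# Illustrative values only -- not actual iPhone hardware figures.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to subject from stereo disparity: depth = focal * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A feature that shifts 20 px between the two views, assuming a 1 cm
# baseline between the lenses and a focal length of 3000 px:
print(depth_from_disparity(focal_px=3000, baseline_m=0.01, disparity_px=20))  # 1.5 (meters)
```

The nearer a subject, the larger its disparity, which is why separating a close subject from a distant background is the easy case for these systems.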

The images below demonstrate the difference – note that to match the subject’s size in the frame using the two different focal lengths, the 3a image was taken from about a meter farther back than the iPhone 11.

Of course the vast majority of portrait mode images will be viewed on a phone or computer screen, where the difference in detail is much harder to spot. Still, looking at the images above at even a 50% crop shows a vast difference in the level of detail captured, and all things being equal we’d much rather have more detail than less.

Backlit subjects

We’ve previously noted the Pixel 3’s fantastic ability to render high-contrast scenes, but one place this falls flat is with backlit portrait subjects. The camera’s tendency to preserve highlight detail and push up shadows is normally what we’d prefer, but it doesn’t work well when the shadows are your main subject.

Pixel 3a iPhone 11

The resulting image shows that the Pixel does a poor job of rendering the cat’s orange fur, giving him an overall ‘crunchy’ look compared to the more pleasing rendering from the iPhone. In our testing, the Pixel 3 has consistently shown this tendency to expose for highlights, even when it would do better to choose an exposure suited to your human or feline subject at the cost of highlight detail. Even tapping the subject’s face doesn’t adjust the exposure as much as we’d like.

Skin tones

The most sophisticated depth mapping in the world won’t save an image from bad-looking skin tones, and this is one area where Google really needs to catch up. The subject below is lit by window light that’s much cooler than the yellow lights of the kitchen behind him. It’s a tricky situation for sure, but the iPhone has clearly made the right call to warm up the subject’s skin tone rather than preserve the cool cast of the window light.

Pixel 3a iPhone 11

To be fair, both of these phones are susceptible to producing noticeably different colors based on slight shifts in framing, or using a different camera mode like Night Sight. But over the course of much use, we’ve seen that the Pixel 3’s standard camera mode renders skin tones particularly poorly by comparison.

Apple’s face smoothing and skin tone rendering have a tendency to go too far in some situations, and there are times when we prefer the more faithful color rendering of the Pixel. It’s also pretty easy to correct the Pixel’s skin tone rendering in the phone’s own Photos app, but we’re betting that most people don’t want to (and won’t) take the time to color correct every portrait they take.

Focal length flexibility for portrait mode

Taken with iPhone 11 | ISO 200 | 1/60 sec | F1.8

Apple’s iPhone XR introduced wide-angle portrait mode to the iPhone, but the 11 and 11 Pro improve on it with more accurate depth maps, thanks to the availability of the ultra-wide lens. Thus, the 11 offers a very good wide Portrait mode via its standard lens, and the 11 Pro offers both telephoto and wide portrait options.

Whether you prefer the look of a telephoto or wide portrait is of course a personal preference, and I tend to prefer the wide portrait mode on the iPhone 11. I like an across-the-table environmental portrait, which usually requires backing up if I’m using the telephoto lens or the crop imposed by the Pixel 3.

Whether or not you like the crop, having it forced on you is less flexible; in my book, I’d rather have the wide-angle option – and I’m sure I’m not alone.

Your move, Google

To be fair, there are things we prefer about the Pixel 3’s portrait mode. We find it’s much less prone to obvious errors in cutting around human subjects than the iPhone 11. I also far prefer using Google Photos to Apple’s iCloud, so the seamless integration with my photo archive is a big plus.

Healthy competition between two big tech companies keeps pushing phone camera technology forward at a rapid pace

We can also say with some certainty based on leaks and rumors that the Pixel 4 will address some of these shortcomings. We know that the device will offer more cameras, which will likely improve portrait mode. Whether we’ll see improvements to skin tones or better handling of backlit subjects is less certain, though encouragingly, leaked photos do show better rendering of skin tones. All will be revealed soon, but one thing is for sure – healthy competition between two big tech companies keeps pushing phone camera technology forward at a rapid pace, and that’s nothing but good news for the photo-taking public.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on iPhone 11’s Portrait Mode sets a high bar for the Pixel 4

Posted in Uncategorized

 

Researchers have developed a reset-counting pixel that promises near-limitless highlight capture

12 Oct
Figure 4 (from the paper linked below): Realized CMOS test chip: (a) photograph of the packaged chip, (b) screenshot of the layout.

German researchers have developed a pixel design with the potential for massively increased dynamic range. Their design, reported in the ‘Advances in Radio Science’ journal, isn’t limited by the point at which it saturates, meaning it can continue to capture highlight data where other sensors would become overwhelmed.

Unlike conventional CMOS chips, their ‘self-resetting pixel’ doesn’t simply ‘clip’ when it becomes saturated. Instead, it resets, and a circuit counts how many times it has had to reset during the exposure. It also contains a conventional analog-to-digital conversion circuit, so it can measure the remaining charge at the end of the exposure.
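The readout principle can be sketched in a few lines: each time the pixel fills its well it resets and a counter increments, while the ADC reads whatever charge remains at the end of the exposure, so the total signal is count × full-well + residual. The full-well value below is an illustrative assumption, not a figure from the paper.

```python
# Toy model of the self-reset pixel's readout, as described above.
# FULL_WELL is an assumed capacity, not a value from the paper.

FULL_WELL = 10_000  # electrons a pixel can hold before it would saturate

def simulate_exposure(photo_electrons: int) -> tuple:
    """What the pixel circuitry records for a given total light dose."""
    resets = photo_electrons // FULL_WELL            # counted by the digital part
    residual = photo_electrons - resets * FULL_WELL  # read out by the ADC
    return resets, residual

def reconstruct_signal(resets: int, residual: int) -> int:
    """Recombine counter and ADC readings into the true, unclipped signal."""
    return resets * FULL_WELL + residual

resets, residual = simulate_exposure(37_500)  # 3.75x beyond a single well
print(resets, residual)                       # 3 7500
print(reconstruct_signal(resets, residual))   # 37500 -- no clipped highlights
```

A conventional pixel would have clipped this exposure at 10,000 electrons; the reset counter preserves the other 27,500.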

Figure 2 (from the linked paper above): The working principle of the self-reset pixel.

This would mean that you don’t need to limit your exposure to protect highlight data and can instead set an optimal exposure for capturing your subject, safe in the knowledge that this won’t result in blown-out highlights. In their paper, the researchers from Institut für Mikroelektronik Stuttgart created a series of test pixels with different designs, and will now focus on the one that gave the most linear response to different light levels, both in terms of its reset characteristics and its conventional ADC mode.

Figure 1 (from the linked paper): Schematics of the analog and digital parts of one pixel cell and a global control for all pixel cells.

Before you get too excited, though, this work is still at a fairly early stage and is primarily focused on video for industrial applications, though lead researcher Stefan Hirsch tells us: ‘basically it should also be possible to use for still images.’

At present, the additional counting circuitry means the light-sensitive photodiode in each pixel is very small, making up just 13% of the surface area of huge 53µm pixels. A move to a stacked CMOS design, with the circuitry built as separate layers, would increase this, with potential for 20µm pixels with more of the area being light-sensitive. A three-layer design could allow still smaller pixels. For perspective, the pixels in the 12MP full-frame Sony a7S II are around 8.5µm, so a lot of work remains before this design could produce a sensor useful in a consumer video or stills camera.


Leaked Pixel 4 photos show new and improved astrophotography, portrait and Night Sight modes

03 Oct

The Google Pixel 4 is just around the corner, expected to be announced at the Made by Google Event on October 15. We’ve already seen what the Pixel 4 will look like, thanks to both Google and third-party leakers, but today we’re getting more than a hardware leak. 9to5Google has obtained exclusive images that it claims Google will use to promote the new camera capabilities of its impending device.

9to5Google has kindly given us permission to share the full-resolution images directly from its source, re-saved only once to add a watermark. The images, as you’ll see below, are a combination of images captured with the front-facing selfie camera and the rear-facing cameras (rumors point to there being a 12-megapixel main camera and a 16-megapixel telephoto camera). The images appear to include photos shot in multiple camera modes, including the improved Night Sight mode and a new star-shooting mode that’s been rumored for some time now.

First up are a few photos that appear to show off the portrait mode of the front-facing camera onboard the Pixel 4. Interestingly, these photos measure in at 4.5 megapixels, nearly half the resolution of the 8-megapixel front camera onboard the Pixel 3, so we’re not sure whether these are simply resized or come from a larger sensor that’s been supersampled. Whatever the case, they look impressive. The faked bokeh looks both realistic and smooth, while the outline, even around hair, seems precise, with only a few notable exceptions (specifically the arm of the white jacket).

Next up are more portrait mode shots with what we presume to be the rear-facing camera on the Pixel 4. These shots measure in at 7-megapixels and were taken with the main camera (the Pixel 4 will feature multiple camera modules). Like the previous shots, the fake bokeh appears to be incredibly accurate, even on difficult subjects, such as a long-haired pet and flyaway hairs.

Moving along, we have three photos (two at 9.2 megapixels and one at 5.2 megapixels) that appear to be taken with Google’s Night Sight mode. Based on the EXIF data embedded in some of the images, the photos were taken with the main 27mm (35mm equivalent) F1.7 camera onboard the Pixel 4. The actual lighting in the scene isn’t known, but the images appear both bright and vibrant with nice dynamic range, even in the images that have multiple light sources at different color temperatures.

Along the lines of Night Sight, there is a pair of photos showing off the much-rumored night sky camera mode expected to be onboard the Pixel 4. Based on the EXIF data, these images (the header image of this article and the image below) were also captured with the main camera unit, and the GPS data reveals the shots were captured at Pinnacles National Park in Central California along State Route 146. For photos captured with a smartphone, the amount of detail in the night sky is absolutely incredible. Some stars get lost around the silhouettes of the trees, but the rest of the sky showcases countless stars in the Milky Way.

The remainder of the photos showcase a number of scenes, but it’s not clear what specific camera modes are being used to capture these images. As noted by 9to5Google, it’s been rumored there will be a ‘Motion Mode’ with the Pixel 4, but that’s not yet confirmed, even though a few action-style shots are seen in the following images.


Plenty still remains to be seen, but with the Made by Google Event less than two weeks away, it won’t be long before we know just what the Pixel 4 is capable of. 9to5Google has also detailed a new ‘Dual Exposure’ mode that’s believed to be available on the Pixel 4.


High resolution Sony a7R IV pixel shift images added to studio scene, sample gallery updated

24 Sep


One of the eye-catching features of the Sony a7R IV is its 16-image pixel-shift mode. This shoots four images centered around one position, then shifts the sensor half a pixel sideways and takes another four, then another half pixel… until it’s taken 16 images. These 16 images can then be combined into a single 240MP image.
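The resolution gain comes from the half-pixel offsets themselves. Here's a toy illustration of the idea: exposures offset by (0, 0), (0, ½), (½, 0) and (½, ½) pixels interleave into a grid sampled twice as densely in each axis. (On the a7R IV each of the four offsets is itself shot four times to gather full color information; this sketch uses one frame per offset.)

```python
# Toy illustration of half-pixel-shift resolution gain, per the lead-in above.
import numpy as np

def interleave_half_pixel(shots: dict) -> np.ndarray:
    """shots maps (row_offset, col_offset) in {0, 1} to equal-sized 2D frames."""
    h, w = shots[(0, 0)].shape
    out = np.zeros((2 * h, 2 * w), dtype=shots[(0, 0)].dtype)
    for (r, c), frame in shots.items():
        out[r::2, c::2] = frame  # slot each frame into its half-pixel position
    return out

base = np.arange(4).reshape(2, 2)                       # one tiny 2x2 "exposure"
shots = {(r, c): base for r in (0, 1) for c in (0, 1)}  # identical test frames
merged = interleave_half_pixel(shots)
print(merged.shape)  # (4, 4): four times the pixel count of a single frame
```

In the real camera the scene must hold perfectly still between exposures, which is why pixel-shift modes are best suited to tripod-mounted static subjects.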

We’ve added pixel-shift images to our studio scene at several different ISO settings, along with a couple of real-world examples in our sample gallery showing both the 4-image demosaicing mode and the high-res 16-image mode. Just for good measure, we’ve added more standard images to the gallery as well.

Studio Scene


Image Processing

We’ve processed the images in the studio scene using PixelShift2DNG, because it allows us to use our standard Adobe Camera Raw processing to maximize comparability with other cameras in the scene.

It should be noted that Imaging Edge has a setting called ‘Px Shift Multi Shoot. Correction,’ adjustable in eleven steps between 0 and 1, that smooths some of the stair-stepping and chequerboard errors that can appear in the image. The shots in our test scene effectively have this set to 0.

Before making this decision, we compared this output with the results from Sony’s own Imaging Edge software. We’ve created a rollover that compares the PixelShift2DNG result to the Imaging Edge output with sharpening, noise reduction and Px Shift Correction minimized, and to the default Imaging Edge result.

Rollover: DNG → ACR | Imaging Edge (modified) | Imaging Edge (defaults)

We’ve uploaded the Imaging Edge-combined ‘ARQ’ files to the studio scene, but you can download the combined DNGs here:

16-image files merged using PixelShift2DNG
  • ISO 100
  • ISO 6400
  • ISO 51200
  • ISO 102400


Google Camera app 7.0 leak reveals new Pixel 4 camera features

16 Sep

A leaked version of the Google Camera app 7.0, which will likely be installed on the upcoming Google Pixel 4 device, has made its way into the hands of the people at XDA Developers who have analyzed the code and found a bunch of new camera features to look forward to.

The new camera will likely come with a motion blur mode that lets you capture moving subjects in the foreground and blur the background to emphasize the impression of motion and speed. The feature, which should come in handy at racing or sports events, will likely be called Motion Mode.

A section of code inside the Google Camera 7.0 app that hints at the upcoming Motion Mode.

The app’s source code also suggests that the computational photography feature Night Sight will be improved on the Google Pixel 4, likely with the previously leaked astrophotography mode. Night Sight will also be sped up using zero-shutter-lag technology, and for astrophotography Google will use the chipset’s integrated GPU to accelerate segmentation of the sky as well as identifying and brightening stars.

References to Live HDR and HDRNet in the code hint at HDR rendering in the preview image, and it also looks like the Pixel 4 will come with an audio zoom feature, similar to what Apple has implemented on the iPhone 11 and what LG and HTC have offered for some time. The feature allows the phone to focus its microphones on a main audio source as the camera zooms.

Code from within the Google Camera 7.0 app that references Live HDR settings, as well as mesh warp settings, presumably used in conjunction with depth data.

Other sections in the source code indicate that the Pixel 4 and other compatible Pixel devices will support saving depth data as a Dynamic Depth Format (DDF) file which should allow for re-focusing and other depth modifications in any app that supports the format.
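DDF itself is just a container for per-pixel depth; here's a hypothetical sketch of the kind of edit that data enables. Assuming an already-decoded depth map, re-focusing amounts to computing a per-pixel blur radius that is zero at a chosen focus plane and grows with distance from it. The function name and values are our own illustration, not anything from the leaked app.

```python
# Hypothetical sketch: per-pixel blur weights from a depth map, the core of
# depth-based re-focusing. Not actual Google Camera or DDF API code.
import numpy as np

def refocus_weight(depth_map: np.ndarray, focus_depth: float, strength: float = 1.0) -> np.ndarray:
    """Blur radius per pixel: sharp at the focus plane, blurrier farther away."""
    return strength * np.abs(depth_map - focus_depth)

depth = np.array([[1.0, 1.0], [3.0, 5.0]])     # meters, toy 2x2 depth map
print(refocus_weight(depth, focus_depth=1.0))  # the subject at 1 m stays sharp
```

A real implementation would feed these radii into a spatially varying blur; the point is that once depth is saved alongside the image, the focus plane becomes editable after the fact.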

Further improvements could include an updated version of the Photobooth feature which was introduced with the Pixel 3 and automatically takes photos when it detects smiles or funny faces in the frame, integration of an augmented reality measurement app into the Camera app, and a ‘rewind’ feature, the exact function of which is as yet unknown.

Google’s Pixel phones have traditionally been at the forefront of mobile imaging, and it looks like the Pixel 4 will be no different. We’ll know more in October, when the new device is expected to launch.


Image credits: Screenshots used with permission from XDA Developers


Leaked promo video hints at Google Pixel 4 astrophotography mode

10 Sep

Google’s Pixel devices are usually cutting edge in terms of computational photography and the current Pixel 3 device comes with an entire range of computational imaging features, including the multi-frame-stacking Night Sight low light mode.

Now it looks like Google is planning to take things one step further with the upcoming Pixel 4 generation and offer some kind of astrophotography feature.

A fuzzy screenshot of the leaked promotional video showing off a dedicated camera mode for capturing stars.

Pro Android has managed to get hold of what appears to be an (as yet) unreleased Google Pixel 4 promotional video. The clip highlights several software features of the still-unreleased device, including a Night Sight-like astrophotography mode. Unfortunately, no technical detail is provided, but it is fair to assume the mode will use some combination of frame-layering techniques and artificial intelligence to create well-exposed, noise-free images of the night sky.

Huawei’s current flagship P30 Pro already features a multi-frame star trail mode which is capable of achieving pretty attractive results in the right circumstances. We’ll have to wait until October, when the Pixel 4 is expected to be launched, to find out if Google’s solution is capable of improving on the Huawei feature.


Google reportedly developing ‘DSLR-like’ attachment for Pixel 4 smartphone

08 Aug

The upcoming Pixel 4 smartphone, teased by Google in June, will reportedly be offered with a ‘DSLR-like attachment,’ according to 9to5Google, which recently leaked details about the model and its larger variant, the Pixel 4 XL.

The report claims Google is developing this attachment for the new Pixel 4 and that it will possibly be offered as an accessory for the handset. No other details about the attachment were provided, however. Google is expected to launch the Pixel 4 in October and will likely introduce this alleged accessory at the same time.

The Pixel 4 is expected to feature two rear cameras, a 12MP offering with phase-detect autofocus and a 16MP camera with a telephoto lens. Google confirmed in its June tweet that the Pixel 4 will be the first Pixel model to feature two rear cameras.

The Pixel 4 is expected to feature two rear cameras, a 12MP offering with phase-detect autofocus and a 16MP camera with a telephoto lens.

Until now, the Pixel line has featured only a single rear camera (though the Pixel 3 and 3a have dual front cameras), with Google electing to focus on computational photography utilizing artificial intelligence to produce the Pixel’s notable image quality. The inclusion of two front cameras on the Pixel 3 was largely viewed as writing on the wall for the eventual inclusion of two rear cameras on a Pixel phone.

Despite the company’s substantial AI capabilities, Google’s computational photography faced issues that only hardware could overcome. The inclusion of dual front cameras on the Pixel 3 enables the phone to capture wide-angle selfies. Google will overcome AI limitations surrounding focal length by adding a second rear camera to the Pixel 4.


Google Pixel 3 camera defect causes loud clicking, OIS issue while shooting

07 Aug

A number of Google’s $499 Pixel 3 smartphone units are experiencing an issue that causes the camera to ‘shake’ while recording video, even when the device is placed on a stable surface. A large number of Pixel 3 owners have published complaints about the problem on Reddit, Twitter, the Google Support forums, and other online destinations.

The issue appears to primarily impact the Pixel 3 model, though there are some reports of it related to the larger Pixel 3 XL. Sample videos from users show the camera’s focus constantly adjusting itself or, in other examples, producing a prominent wobble effect similar to what one would get by shaking the phone.

Though Google hasn’t provided an official statement about the matter at this time, a loud clicking sound produced from the camera while recording indicates the problem may stem from the Pixel 3’s optical image stabilization system. Pixel 3 owner ‘anaymakan’ shared a video demonstrating this problem on the Pixel 3 subreddit in late May.

Because this appears to be a hardware defect, Pixel 3 owners have been unable to find a workaround. Owners of faulty devices report that the issue is only solved by getting a replacement phone that doesn’t suffer from the same problem.


Google confirms that Pixel 4 will offer multiple rear cameras

13 Jun

Leaked renderings of the upcoming Pixel 4 seem to have prompted Google to get ahead of its own story by tweeting an image of the device well ahead of its expected fall launch date. Prominently featured is a square “bump” that appears to house two rear-facing cameras, a flash and two smaller sensors, one of which is visible if you raise the image’s shadows.

We imagine that the small sensor to the right of the flash is a spectral + flicker sensor, something that helps avoid banding with flickering light sources and also helps the auto white balance algorithm estimate the dominant illuminant for better color reproduction. The Pixel 3 has one of these sensors next to its flash, too.

The fourth sensor, positioned above and centered between the two main cameras, appears too small to be another camera unit. We’ve often wondered about the future inclusion of a Time-of-Flight (ToF) camera in the Pixel lineup. Competitors like Huawei and Samsung have been adding these sensors to their devices to help with depth mapping. The Pixel 3’s portrait mode images were already among the best from current mobile devices, so we’re interested to see what the addition of a secondary camera – and whatever that additional sensor is – might add up to.


Google’s Photobooth brings automated selfie-shooting to the Pixel 3

19 Apr

Capturing a group selfie can be a daunting task. Someone is always looking the wrong way or unhappy with their facial expression in the shot, usually resulting in a large number of unusable shots in your camera roll. Google has now developed a clever piece of AI software for its Pixel phones that should make things much easier and reduce the image waste on your device.

Photobooth is a new shutter-free mode in the Pixel 3 Camera app. With the mode activated, you hit the shutter once and the camera will automatically capture a shot when it is stable, all subjects have good facial expressions, and their eyes are open.


Unlike the face, smile and blink detection features of the past, Photobooth does not simply rely on the shape and specific features of the human face. Modern smartphone processing power allows the device itself to control the capture process. Photobooth is capable of identifying five expressions: smiles, sticking your tongue out, kissy/duck face, puffed-out cheeks, and a look of surprise.

Google’s engineers trained a neural network to identify these expressions in real time. After you press the shutter button, every preview frame is analyzed for one of the expressions mentioned above and checked for camera shake.

In the camera app, a white bar expands and shrinks to indicate how photogenic the algorithm deems the preview scene, so users have some idea of when the camera is likely to trigger a capture.
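The capture logic just described can be sketched in a few lines: each preview frame gets an expression score and a shake check, and the shot fires once the combined score clears a threshold. This is our own illustration, not Google's code; the thresholds and score ranges are assumptions for the sake of the example.

```python
# Illustrative sketch (not Google's actual code) of Photobooth-style triggering.

EXPRESSIONS = ("smile", "tongue out", "kissy/duck face", "puffed cheeks", "surprise")

def frame_score(expression_scores: dict, shake: float, max_shake: float = 0.2) -> float:
    """0..1 'photogenic' score, like the white bar; zero if the frame is shaky."""
    if shake > max_shake:
        return 0.0
    return max(expression_scores.get(e, 0.0) for e in EXPRESSIONS)

def should_capture(expression_scores: dict, shake: float, threshold: float = 0.8) -> bool:
    """Fire the shutter once a steady frame scores above the threshold."""
    return frame_score(expression_scores, shake) >= threshold

print(should_capture({"smile": 0.9}, shake=0.05))  # True: steady, strong smile
print(should_capture({"smile": 0.9}, shake=0.5))   # False: too much camera shake
```

Gating on shake before scoring expressions mirrors the behavior described above, where the camera waits until it is stable before it will fire.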

Some of the technology has been ported from one of Google’s now-terminated hardware projects, the Clips lifelogging camera.
