Posts Tagged ‘Google’

Google teamed up with pro photographers to train its Clips lifelogging camera

26 Jan

Google debuted its $250 ‘Clips’ lifelogging camera to the public during the October 2017 launch event for the Google Pixel 2. Now, with the camera becoming officially available in only a few weeks’ time, the company has published a blog post that explains how the Clips camera’s underlying algorithms were trained to identify and keep the best shots, and discard the leftovers.

It turns out Google relied on the expertise of a documentary filmmaker, a photojournalist, and a fine arts photographer to train the AI and feed some high-quality photography into its machine learning model. The group collected and analyzed recorded footage from members of the development team to try and answer the question: “What makes a memorable moment?”

“We needed to train models on what bad looked like,” said Josh Lovejoy, Senior Interaction Designer at Google. “By ruling out the stuff the camera wouldn’t need to waste energy processing (because no one would find value in it), the overall baseline quality of captured clips rose significantly.”

The learning process covers basic elements of photography, such as an understanding of focus, depth-of-field and the rule of thirds, but also some things that are obvious to most humans yet less so to an algorithm: don’t cover the lens with your finger, and avoid abrupt movements while recording.

Google admits that there is still a ways to go before perfection. It says the AI has been trained to look at “stability, sharpness, and framing,” but without careful calibration, a face at the edge of the frame will be appreciated just as much as one at the center, even if the focus of interest is really somewhere else in the image.

“Success with Clips isn’t just about keeps, deletes, clicks, and edits (though those are important),” Lovejoy says. “It’s about authorship, co-learning, and adaptation over time. We really hope users go out and play with it.” More detail on the development and training process is available on the Google blog.

Articles: Digital Photography Review (dpreview.com)

 

Google Pixel 2 gallery updated

24 Jan


The Google Pixel 2 represents truly impressive computational photography output. And while we’re currently hashing out a full review from a photographer’s perspective – to be published in the near future – we wanted to share this enormous update to our original Pixel 2 gallery.

Note: All images in this gallery have been shot using the stock camera app with auto HDR+, except where noted.


 

Google Clips smart camera will launch soon, appears in FCC documents

20 Jan

During its October 2017 event, Google surprised the camera world by introducing a small AI-powered lifelogging camera named Google Clips. And now, thanks to some uncovered FCC documents, it looks like we’re getting close to an official release date.

Google Clips is an interesting concept. Unlike other cameras that require a bit of input from the user, Google said Clips could analyze situations and automatically capture memorable moments, growing smarter over time—just place it on a shelf and it would ‘learn’ to capture your most important moments as they unfolded. Several months later, however, we still haven’t heard anything from Google about a release date. We know it’ll cost $250 USD when it launches, and the Google Clips product page offers prospective buyers the option to join a waitlist, but Google hasn’t revealed anything more.

That’s where the eagle-eyed folks at Variety come in. Earlier this week, they noticed that the camera recently passed through the FCC, indicating that a launch is imminent. In other words: if you’re holding out for the Google Clips, your wait is almost over.


 

The Google Arts & Culture app can find your fine art doppelganger

16 Jan

Google’s Arts & Culture app was first launched in 2016, offering “virtual access” to some of the most famous art collections in the world, and many stories about arts and culture from around the world. The latest update of the app, however, makes use of Google’s extensive knowledge of machine-learning-based facial recognition, and the front camera of your smartphone, to find your fine art doppelganger… just ’cause.

The new feature lets you take a selfie and receive a list of portrait artworks your self-portrait resembles. While the user interface is extremely simple, Google is using highly sophisticated facial recognition algorithms to compare your facial characteristics to the portraits among the 70,000+ works of art in its Google Art Project database.
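For a rough feel of how such a match might work under the hood, here’s a toy sketch. It is not Google’s actual pipeline: assume a facial recognition model has already reduced each face to a numeric embedding vector, and the best match is simply the portrait whose embedding has the highest cosine similarity to yours. The titles, dimensions and numbers below are made up for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(selfie_embedding, gallery):
    """Return the gallery portrait whose embedding is closest to the selfie's."""
    return max(gallery,
               key=lambda art: cosine_similarity(selfie_embedding, art["embedding"]))

# Hypothetical 4-dimensional embeddings; a real system would use hundreds of
# dimensions and tens of thousands of artworks.
gallery = [
    {"title": "Portrait A", "embedding": [0.9, 0.1, 0.3, 0.2]},
    {"title": "Portrait B", "embedding": [0.1, 0.8, 0.7, 0.1]},
]
selfie = [0.85, 0.15, 0.25, 0.2]
print(best_match(selfie, gallery)["title"])  # → Portrait A
```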

To try it out, download and install the app and scroll down to the “is your portrait in a museum?” icon on the front screen. From there, you simply capture an image of your face, and the system will analyze which work of art you most resemble.

Looking at some of the results on Twitter and other social media, it is fair to say that the feature generally does a pretty decent job in matching selfie subject and piece of art; however, among the examples we have posted below, you’ll also see the occasional slip-up.

Unfortunately, it appears that if you are, like myself, living outside the US, you are currently out of luck as the new feature has not been rolled out globally yet. Hopefully this will happen soon though. If you are based in the US, you can find Google Arts & Culture on Google Play and the Apple App Store.


 

Google Camera mod brings Pixel 2 portrait mode to older devices

03 Jan
Portrait Mode on the Google Pixel 2

Google’s Pixel 2 comes with one of the best-rated smartphone cameras in the world, and is one of very few single-lens devices to offer a background-blurring, fake bokeh portrait mode. Unlike dual-lens setups, the camera uses machine learning and neural networks to generate a foreground-background segmentation on both front and rear cameras. On the rear, the Pixel 2 also uses depth data from the image sensor’s dual-pixel technology for this task.

Thanks to Charles Chow, developer of the Camera NX Google camera mod, the feature is now also available to users of the original Google Pixel as well as the Nexus 5X and 6P smartphones. Portrait mode was included in version 7.3 of the Camera NX app but, due to a lack of dual-pixel technology on older Google Android smartphones, uses the exclusively software-based approach of the Pixel 2’s front camera.

The developer says the functionality has so far only been tested on the Nexus 5X, although it should work on Nexus 6P and first generation Pixel phones as well. If you want to try Camera NX and the new Portrait Mode you can find all technical details and download links in Charles’ article on Chromloop.


 

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

14 Dec

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Like it or not, 2017 is the year that background-blurring Portrait Modes gained major traction in smartphone photography. Apple and Google both offer improved versions of the mode in their latest devices, making for better-looking results all around. But the two manufacturers take somewhat different approaches to the process, each with different limitations and strengths. Take a look at some side-by-side shots to see how they square up, and learn about some of the underlying technologies in the accompanying text.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/60sec 4.459mm ISO 382

Because the Pixel 2 back cameras use both a depth map (stereo) generated from the split pixels as well as ‘segmentation’ (which uses machine learning to identify people / faces vs. background), both subjects in this photo are largely in focus. This is a result one wouldn’t expect from real optics, since the person behind should also be blurred. This doesn’t always happen with the Pixel 2, but sometimes it does if the subjects are close to one another and both identified as people / faces. Sometimes it’s actually desirable, but at other times it can feel unnatural.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/40sec 4.459mm ISO 400

Because of the F1.8 lens and HDR+ noise averaging (with alignment of images), the Pixel 2 can take photos of even slightly moving subjects in low light. Again, note the progressive blur here: the back of the baby seat is only slightly blurred, as are the switches in the background, but the trees against the sky very far away are far more blurred.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/120sec 6.6mm ISO 80

Here the iPhone’s longer – albeit slower (F2.8 vs. F1.8) – lens renders the background blurrier than the similar Pixel 2 shot. Note the odd dark/light patterns in the out-of-focus highlights though. This is commonly seen in out-of-focus highlights on iPhone shots, but not on the Pixel’s shots.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/209sec 4.459mm ISO 50

The background is a bit less blurred vs. the iPhone shot, probably largely because of the shorter focal length. Note the algorithm has mistaken the bike’s steerer tube as part of the background (or foreground). Note also the slightly darker centers in the out-of-focus highlights. More on this in the next photo…

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/209sec 4.459mm ISO 50

Lenses in smartphones have complex aspherical elements in them, which can lead to somewhat unpleasant disc-shaped blur that lends itself to things like donut-hole and generally ‘busy’ bokeh. Portrait mode helps mitigate this effect by blurring background and foreground pixels enough that these odd effects are essentially ‘evened out’. But not perfectly: the pixels in the dark rings in the center of each OOF highlight are still replaced by translucent (larger) discs of the same color, meaning there will still be some dark translucent circles in those areas. It’s subtle, since most of the pixels in those OOF highlights are light, not dark, but it’s still there if you look for it (in the previous photo).
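The disc-replacement behavior described above can be sketched in a few lines of Python. This is only a naive ‘scatter’ bokeh renderer for a grayscale grid, not the Pixel’s actual algorithm: every pixel is splatted as a translucent disc whose radius comes from the depth map, and the accumulated discs are normalized, which is why dark pixels inside a bright highlight still leave faint dark traces behind.

```python
def splat_bokeh(img, radii):
    """Blur a 2D grayscale grid by replacing each pixel with a translucent
    disc of its own value; disc radius (in pixels) comes from the depth map."""
    h, w = len(img), len(img[0])
    acc = [[0.0] * w for _ in range(h)]
    weight = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            r = radii[y][x]
            for ny in range(max(0, y - r), min(h, y + r + 1)):
                for nx in range(max(0, x - r), min(w, x + r + 1)):
                    if (ny - y) ** 2 + (nx - x) ** 2 <= r * r:
                        acc[ny][nx] += img[y][x]
                        weight[ny][nx] += 1.0
    return [[acc[y][x] / weight[y][x] for x in range(w)] for y in range(h)]

# A radius of 0 everywhere (everything on the focus plane) leaves the image unchanged.
flat = [[5.0] * 4 for _ in range(4)]
assert splat_bokeh(flat, [[0] * 4 for _ in range(4)]) == flat
```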

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/120sec 6.6mm ISO 320

Two things of note in this iPhone shot: (1) the patterning within the out-of-focus highlights (it’s not a uniform disc) and (2) the blown highlights on the wood, since HDR is shy to activate in Portrait Mode. Often, tapping on the bright overexposed portion in your preview will darken the image enough to force the iPhone to turn on its HDR mode, but results can be inconsistent. The Pixel 2 cameras, in comparison, are always operating in HDR+ mode, even in Portrait mode, and are less prone to this.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 89

Note the far better exposure vs. the iPhone: HDR+ ensured the wood in the Portrait mode shot did not blow out.

Also, note the brightest out-of-focus highlight, just to the left of the plant. It does *not* have a darker middle as we saw in the bike shot. This is because in the original shot (next photo), this highlight is completely blown, so the algorithm isn’t starting with the donut-hole disc we saw in the out-of-focus yellow lights in the bike shot. Completely blown out-of-focus highlights will look smooth and uniform – more so than with the iPhone 8.*

*It’s important to keep in mind that since the blurs are largely algorithms, some aspects of the bokeh may be updated simply by software updates. The comments we’re making throughout here are only really applicable for the software versions we shot the images with.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 89

Note the three major out-of-focus highlights just to the left of the plant. The darker ones show donut-hole bokeh but are dim enough that they get completely blurred into surrounding pixels in the Portrait mode shot (previous photo). The blown out-of-focus highlight to the left of them gets blurred to a pleasing uniform disc, without a dark center (which was not the case in the yellow out-of-focus highlights in the bike shot, which had slightly darker centers).

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 218

Sometimes, with very close-up objects, we’ve noticed the Pixel 2 cameras do not blur the background much, if at all. Compare this Portrait image to the original image (next photo).

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 218

Non-blurred version of the previous image. It’s not much different. We haven’t noticed this issue with the iPhone.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/120sec 6.6mm ISO 200

The sprouts in the back cause artifacts in this image (see next image for comparison). This can happen with dual-camera setups, since the two cameras often see very shifted stereo pairs for close objects. If the two cameras see two different things at what the algorithm thinks is the same location in the shot, artifacts can result that are less likely with less separated stereo pairs (although lower separation comes with its share of issues as well).

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 127

The Pixel 2 cameras’ stereo pair viewpoints are less than 1mm apart (roughly the diameter of the lens), and appear to have fewer issues with artifacts when shooting close-up objects against farther backgrounds. Since overall stereo disparity in the pair isn’t drastic, there’s less of a chance that the two perspectives see different things at the same image location. Note the sprouts here don’t get blurred oddly as in the iPhone image.

Also note the progressive blur in the bread, with the closer parts of the bread less blurred than the further parts. This is because Google uses the stereo pair of images to generate an actual depth map. The subject in focus shows no stereo disparity, objects progressively behind show more and more disparity while objects in front show more disparity but in the *opposite* direction. This is how the algorithms can generate essentially a ‘heat map’ of further and further behind the subject (or in front) from which it decides how much blur to apply to each pixel.
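As a sketch of that ‘heat map’ idea (our guess at the general shape, not Google’s actual code): blur strength grows with the magnitude of an object’s stereo disparity relative to the focus plane, whether it sits in front of or behind the subject. The scale and cap values here are invented for illustration.

```python
def blur_radius(disparity, scale=4.0, max_radius=12.0):
    """Map stereo disparity (in pixels, relative to the focus plane) to a blur
    radius. Zero disparity means in focus; the sign says in front vs. behind
    the subject, but the amount of blur depends only on the magnitude."""
    return min(abs(disparity) * scale, max_radius)

# Hypothetical per-region disparities for a scene like the bread shot:
for name, d in [("subject", 0.0), ("near bread", 0.5),
                ("far bread", 1.5), ("background", 4.0)]:
    print(f"{name}: blur radius {blur_radius(d):.1f}px")
```

Foreground objects simply get negative disparity, which is why the near side of the bread is blurred too.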

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/60sec 6.6mm ISO 320

The iPhone version of this shot has more blown highlights than the Pixel 2 version, presumably because HDR did not kick on automatically.

Also, there are more depth map errors around the subject’s hair, again possibly because of how close to the camera she is (where the two cameras are likely to see different things at the same image location).

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 147

The Pixel 2 version of this shot has far fewer depth map errors around our subject, particularly her hair.

Also, since HDR+ is always active on Pixel 2 cameras, the captured dynamic range is far higher.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/120sec 6.6mm ISO 32

We found the iPhone to struggle a little more with autofocus in backlight and low light, but it did nail focus here for the most part.

Interestingly, the iPhone appears to preserve more of the out-of-focus highlights in the background than the Pixel 2 (next photo).

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/867sec 4.459mm ISO 50

The Pixel 2 appeared to struggle less with autofocus than the iPhone 8 Plus, nailing it here.

Of note though is that the Pixel 2 appears to have preserved fewer of the out-of-focus highlights (‘bokeh balls’ as we call them here around the office), or at least dimmed them compared to the more obvious ones in the iPhone shot. We wonder if this has something to do with the HDR+ algorithm, but are purely speculating.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/60sec 6.6mm ISO 250

Often, the iPhone 8 Plus in Portrait Mode would overexpose high contrast scenes, instead of activating HDR mode. HDR seemed reticent to activate in Portrait mode, leading to the blown highlights on faces here.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/120sec 6.6mm ISO 160

Tapping on the blown highlights dims the exposure and often forces HDR mode to activate. The Pixel 2 phones don’t have this issue, as they’re always operating in HDR+ mode.

Once exposure is adjusted though, the result is a very well-lit image with nice colors and convincing background blur.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 52

Since the Pixel 2 cameras are always operating in HDR+ mode, blown highlights are well-controlled here resulting in a well-exposed image. Sometimes with very high contrast scenes, though, HDR+ images can start looking a bit ‘crunchy’ (the same thing happens in HDR merging software depending on the ‘radius’ setting).

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/120sec 6.6mm ISO 25

Here the iPhone 8 Plus produces a more pleasing result, with fewer depth map artifacts. It also preserves the warm tone of the sunset scene. Auto White Balance was generally stable and produced desirable results across many different shooting scenarios on the iPhone 8 Plus.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/1560sec 4.459mm ISO 86

The Pixel 2 cameras often show rather extreme variation in White Balance from shot to shot. Quite often, it neutralizes color casts too much: for example, here, it should have chosen a white balance closer to Daylight instead of neutralizing the warm sunset tones.

Also, when tones in the background and foreground are very similar, depth map errors can result. Note the errors around the hair of our subject, which might have been hard to distinguish from the dark trees in the background.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/1560sec 4.459mm ISO 61

Another example of depth map errors due to objects possibly appearing too similar to one another. Look at the artifacts around the hair on the right side of our subject and around her sunglasses. Next, look at how these regions might appear similar to one another in a lower resolution depth map by comparing to the un-blurred image (next photo).

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/1560sec 4.459mm ISO 61

You can see the areas of the blurred photo (previous) that contained artifacts are regions where the foreground and background (the hair vs. tree branches; the sunglasses vs. the dark background) might appear indistinguishable as you try and build a lower resolution depth map.

Another possibility is errors in segmentation, the process of identifying the entire foreground subject using machine learning.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 281

For such a complex scene, the Pixel 2 did remarkably well, choosing to blur more than the iPhone in this case (next photo).

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/60sec 6.6mm ISO 800

The iPhone also does well, but here keeps more foreground leaves in focus before extremely defocusing the farther background.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 281

Note the progressive blur: objects further in the background are blurred more than objects closer. This is because the depth map is generated from actual stereo measurements of how far an object is from the focus plane.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/60sec 6.6mm ISO 500

Apple stated with the iPhone 7 that it calculates 9 different layers when making its depth map. It presumably does so by a process of precalibration, where certain stereo disparities from the focus plane correlate with certain distances from it. We wonder if this might be why the subject sometimes looks somewhat cut-out from a far away background, if there aren’t enough objects behind the subject that fall within those 8 layers (or however many Apple is now using) before that 9th (hyperfocal or infinity) one.

Either that or the masking in this photo makes the subject look somewhat cut-out (see around the hair).

It’s impressive though that the arm rest in front of our subject is properly blurred.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 207

The blur in this image looks more natural and progressive to us. The colors leave a bit to be desired though, with somewhat desaturated, greenish skintones.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 159

This looks more natural to us than the ‘cut-out’ look of the iPhone image, interestingly. However, what’s odd is the color tuning, which is different from the front-facing camera (next photo).

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/60sec 6.6mm ISO 400

We can’t help but feel our subject appears more ‘cut out’ against the background here. We wonder if this has something to do with the number of layers of depth mapping, or a suboptimal masking process (around the hair particularly).

Skintones are more pleasing than with the Pixel 2 image, though.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F2.4 1/60sec 3.38mm ISO 149

The front-facing camera oddly has a different color tuning than the back camera, and an arguably more pleasing one. Skintones are more magenta as opposed to the cool, sometimes greenish skintones with the rear camera.

It’s worth noting the iPhone 8’s front camera cannot do Portrait mode. The Pixel 2’s front camera does not have a dual-pixel sensor, so it performs this blur purely through a process of segmentation. That’s where machine learning comes in. Google trained a ‘convolutional neural network’ with nearly a million images of people (‘and their hats, sunglasses, and ice cream cones’ according to Principal Engineer Marc Levoy) to learn which pixels belong to people vs. not.
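A minimal sketch of segmentation-only portrait blur, assuming the network has already produced a per-pixel person mask (everything here, including the stand-in box blur, is invented for illustration):

```python
def box_blur(img, radius):
    """Naive box blur on a 2D grid of grayscale values."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - radius), min(h, y + radius + 1))
                    for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

def portrait_blur(img, person_mask, radius=1):
    """Keep pixels the segmentation marked as 'person'; blur everything else.
    With no depth map, all background pixels get the same uniform blur."""
    blurred = box_blur(img, radius)
    return [[img[y][x] if person_mask[y][x] else blurred[y][x]
             for x in range(len(img[0]))] for y in range(len(img))]

# Tiny example: a bright 'person' column against a dark background.
img = [[10, 200, 10] for _ in range(3)]
mask = [[False, True, False] for _ in range(3)]
result = portrait_blur(img, mask)
```

Note the key limitation mentioned above: without depth data there is no progressive blur, since every background pixel is treated the same.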

An impressive result, given the lack of a depth map. You won’t get the progressive, gradual blur you get with the rear camera, but for selfies this is probably ‘good enough’.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/60sec 4.459mm ISO 382

I’ve included this here because I just wouldn’t have expected a smartphone to generate an image like this if you were to ask me just a year or two ago. In low light, dual-pixel AF got focus (it’s a little soft because Portrait mode uses a digital crop, then upscales), and foreground and background blur are both well controlled. Look at the progressive foreground blur on the right side of the plastic food table.

The image remains clean thanks to multi-image averaging, while using 1/60s indoors to ensure at least some sharp shots of even a toddler.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/60sec 6.6mm ISO 1250

The iPhone’s F2.8 aperture in Portrait mode (and smaller sensor), and likely the lack of the 9-frame image averaging HDR+ uses on the Pixel 2, result in many unusable Portrait mode images in low light. Compare this shot to the Pixel 2 one next…

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/60sec 4.459mm ISO 258

The use of a faster aperture (and likely larger sensor even after the digital crop) and 9-frame image averaging of HDR+ generally yields far more pleasing low light portraits on the Pixel 2 than on the iPhone 8 Plus.

HDR+ uses intelligent tile-based image alignment that can keep even moving subjects sharp by selecting appropriate ’tiles’ from the sharper images of the subject within the 9-frame buffer used for a single shot. That’s right, the camera is constantly shooting 9 full-resolution images at a minimum of 60 times a second – which also ensures zero shutter lag.
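To make the tile idea concrete, here’s a toy one-dimensional align-and-merge (real HDR+ operates on 2-D tiles of raw frames and is far more sophisticated; the tile size, search range, and sum-of-absolute-differences matching below are our simplifications): each reference tile searches nearby offsets in every other frame for its best match before averaging, so a subject that shifted between frames can still merge without ghosting.

```python
def align_and_merge(frames, ref, tile=4, search=2):
    """Toy 1-D tile-based align-and-merge: for each tile of the reference
    frame, find the best-matching shifted tile in every other frame (by sum
    of absolute differences), then average to cut noise without ghosting."""
    merged = list(ref)
    for start in range(0, len(ref) - tile + 1, tile):
        ref_tile = ref[start:start + tile]
        acc = list(ref_tile)
        for frame in frames:
            best = None
            for shift in range(-search, search + 1):
                s = start + shift
                if s < 0 or s + tile > len(frame):
                    continue
                cand = frame[s:s + tile]
                sad = sum(abs(a - b) for a, b in zip(ref_tile, cand))
                if best is None or sad < best[0]:
                    best = (sad, cand)
            acc = [a + c for a, c in zip(acc, best[1])]
        n = len(frames) + 1
        merged[start:start + tile] = [a / n for a in acc]
    return merged

# A bright edge that shifted one position between frames:
ref = [0, 0, 0, 0, 10, 10, 10, 10]    # reference frame
moved = [0, 0, 0, 10, 10, 10, 10, 10] # same edge, shifted by one
merged = align_and_merge([moved], ref)  # the edge tile lines up after alignment
```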

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2
F1.8 1/60sec 4.442mm ISO 213

We’ve found some depth map errors can occur around high contrast edges: note the dark rails surrounded by light backgrounds can cause problems. Still, this is a heck of a pleasing image of a constantly moving toddler… taken indoors on a smartphone.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/3344sec 4.459mm ISO 51

Running toddler. Focused (well enough). Isolated from the background. Taken on a smartphone.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/294sec 6.6mm ISO 20

This is a good example of progressive blur with the iPhone 8 Plus. Note how the grass only a bit behind the subject is less blurred than the grass far behind the subject.

Furthermore, in this scenario, HDR did kick in in Portrait mode quite often, resulting in even exposures.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/5848sec 4.459mm ISO 61

This is another good example of the progressive blur thanks to the depth map on the Pixel 2: while all the grass and the background looked pretty much in focus in the original, the grass nearer to the subject is blurred less than the grass further away.

There are some artifacts around the subject’s hair, but that’s not surprising considering she was running toward me while I was running backward. The Pixel 2’s superior Dual Pixel AF allowed me to get the right moment more easily – it’s often as fast and responsive as a high-end ILC – while the iPhone 8 Plus would often experience a re-focusing lag after pressing the shutter button.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/585sec 6.6mm ISO 20

The extra telephoto reach of the iPhone is useful for further compressing foreground and background (and magnifying the background). The iPhone 8 Plus also tended to render more pleasing blue sky tones, and saturation generally.

And remember, since you’re shooting HEIF, you get extra storage space savings, and the advantages of 10-bit files with support for more colors thanks to the wide gamut P3 capture. Encoding in P3 gives the cameras a wider color palette to work with after Raw capture.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/2342sec 4.459mm ISO 51

Naturally there’s less compression with the Pixel phones due to their wider angle camera used in Portrait mode, but I quite like wide-angle portraiture.

Note the overall lower saturation, and somewhat bland skies. This is up to personal preference, but one thing to note is the Pixel cameras only output sRGB images. This means the color palette with which the camera can ‘draw’ is limited compared to recent iPhones. Google probably chose this method for now because sRGB is a good standard for most people, and Google doesn’t have a key advantage Apple has: a proper ecosystem. Apple is implementing P3 displays in all its devices, from its iPads to its MacBook Pros to its iMacs. That means you’ll actually be able to enjoy those extra colors in those P3 images – if they’re there – across all Apple devices.

The movie industry has already accepted P3 as the new standard (think of it like Adobe RGB but with more saturated reds, yellows and greens, but a little less cyan-green and cyan saturation). The video industry is eventually aiming for an even larger gamut: Rec.2020, which is only a bit smaller than ProPhoto RGB, and it’s great to see Apple pushing the stills industry to adopt it as well.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Sony a7R II
F1.8 1/4000sec 55mm ISO 100

Just for fun, we’ve included this full frame 55/1.8 shot. On a high resolution screen, or viewed at 1:1, the quality is obviously far above what either smartphone can produce. But flip to the next image and view it at an image level. For many people, the Pixel 2’s result is good enough. Especially for a device you have on you at all times that requires just one button press to take a well exposed, focused photo.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/4673sec 4.459mm ISO 62

Compared to the full-frame F1.8 previous shot, for many people this result will be good enough. Especially for a one button-press device you always have on you. Just be careful: don’t pixel peep.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/168sec 6.6mm ISO 20

The iPhone’s result is smudgier with more artifacts around the hair, but the blur and colors are quite pleasing.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/252sec 4.459mm ISO 51

Compared to the iPhone 8 Plus shot of this same scene, the Pixel 2 retains far more detail. This is likely due to its HDR+ mode, which always uses multi-image averaging and therefore requires less noise reduction. The iPhone shot (next), in comparison, looks like it’s had a lot of noise reduction applied to it, at the cost of detail.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/120sec 6.6mm ISO 160

The smaller aperture on the iPhone, combined with less (if any) multi-frame image averaging in Portrait mode compared to the Pixel 2’s 9 shots, means the iPhone 8 Plus uses more noise reduction than the Pixel 2. The result: a far smudgier image under the same (yet bright) conditions, with far less detail.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/17sec 4.459mm ISO 413

In low light, HDR+ on the Pixel 2 ensures decent noise levels by aligning and averaging multiple images.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 8 Plus
F2.8 1/60sec 6.6mm ISO 1250

The combination of F2.8, the 1/60 sec requirement to avoid camera shake (no OIS on the telephoto lens), and possibly less advanced multi-frame noise averaging than the Pixel 2’s leaves a lot to be desired in low-light portraits on the iPhone.

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

iPhone 5c
F2.4 1/20sec 4.12mm ISO 50

This is in here to remind us of how far smartphone cameras have come. Compare this iPhone 5c image to the Pixel 2 image (next)…

Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2

Pixel 2 XL
F1.8 1/120sec 4.459mm ISO 215

Indoors, but only what should be blurred is blurred! The subject is sharp and in focus, with a blurred background, thanks to a fast shutter speed, HDR+ multi-image averaging with alignment so not much noise reduction is required, and a proper depth map to gradually blur subjects further from the focus plane.

And having this sort of a camera in your pocket at all times means you can capture fleeting moments like when your daughter doesn’t want you to leave for work.

Imagine what’s to come…

iPhone X
F2.4 1/60s ISO 320

We’ll leave you with one final comparison to whet your appetite for our next shootout: the iPhone X vs. the Pixel 2. This is an iPhone X shot, and it’s immediately obvious that the X’s camera does a better job of ‘cutting around’ hair, people and objects than the 8 Plus. Our best guess is that it generates a higher resolution depth map, but that’s pure speculation. It is consistently better, though, at making heads look less cut out from the background.

Compared to the 8 Plus, the OIS and faster F2.4 aperture (versus F2.8) on the telephoto lens both help Portrait mode on the X. Compared to the Pixel 2 shot of the same scene (next slide), the out-of-focus highlights are rendered more specular and the colors are more pleasing.

Pixel 2 XL
F1.8 1/60sec ISO 233

The Google Pixel 2 XL shot of the same scene is a far more candid portrait. Not only is the image sharper and more detailed than the similar iPhone X shot, it’s closer to the shot I wanted: fast autofocus let me capture this fleeting hug instantly. The previous iPhone X shot looks more posed and less candid because the lighting inside the Apple Store was dim enough that the iPhone X was often slower to acquire focus.

To our knowledge, Apple’s ‘Dual PDAF’ technology dedicates only around 4% of the sensor’s pixels to AF. The Pixel 2’s Dual Pixel AF uses most of the sensor for AF, binning pixels to read out a low-resolution, but also low-noise, pair of ‘left-looking’ and ‘right-looking’ images. The 9-frame HDR+ buffer further reduces noise in these image pairs, making autofocus in challenging situations vastly superior to any other smartphone we’ve tested.

The colors, on the other hand, leave a lot to be desired, with greenish skin tones. The out-of-focus highlights are also not as specular as the iPhone’s.
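The left/right comparison at the heart of the dual-pixel AF described above can be sketched as a simple disparity search. This is our own toy illustration, not Google's implementation: in-focus scenes make the two views coincide, and the size and sign of the shift between them tells the lens how far, and in which direction, to move.

```python
import numpy as np

def pdaf_disparity(left, right, max_shift=4):
    """Estimate defocus from a dual-pixel left/right image pair.

    Searches for the integer horizontal shift that best aligns the
    'right-looking' view with the 'left-looking' one, using the sum
    of absolute differences as the matching cost.
    """
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(right, s, axis=1)
        # Compare only the interior columns, away from wrap-around edges.
        cost = np.abs(left[:, max_shift:-max_shift]
                      - shifted[:, max_shift:-max_shift]).sum()
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

# Synthetic pair: the right view is the left view shifted by 2 pixels,
# as if the subject were out of focus.
rng = np.random.default_rng(1)
left = rng.random((8, 32))
right = np.roll(left, -2, axis=1)
print(pdaf_disparity(left, right))  # → 2 (the shift the lens must cancel)
```

A real system does this sub-pixel, per region, and maps the disparity through the lens calibration to a focus motor position.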

Stay tuned for an in-depth shootout of the Pixel 2 vs. the iPhone X…

Articles: Digital Photography Review (dpreview.com)

 

 

Google launches three mobile photography ‘Appsperiments’

13 Dec

Google today launched the first three installments in a series of experimental photography apps that the company calls ‘appsperiments’. The apps build on a range of technologies currently under development at Google, including object recognition, person segmentation, stylization algorithms, and efficient image encoding and decoding.

Storyboard – Android only

Storyboard converts videos into single-page comic layouts. After you shoot a video, the app automatically selects interesting frames, arranges them into a layout, and applies one of six visual styles.

And since privacy is a big concern with anything ‘intelligent’ like this, all processing happens on-device, without sending any data to the cloud.

Selfissimo! – iOS and Android

Selfissimo! is available for both iOS and Android. Once activated, the app encourages you to pose and automatically captures a black-and-white selfie whenever you stop moving.

You end the ‘photoshoot’ by tapping on the screen, and can then review a ‘contact sheet’ to save your favorite images.

Scrubbies – iOS only

Scrubbies is an iOS app that lets you adjust the playback speed and direction of videos and create looping clips. The idea is to shoot a video in the app and then ‘remix’ it by ‘scratching’ on the screen, like a DJ scratching vinyl.

Scrubbing with one finger plays the video. Scrubbing with two fingers captures the playback so you can save or share it.


If you try one or more of these so-called appsperiments, you can send your feedback to Google via in-app feedback links. More information about each of these creations is available on the Google Research Blog.


 

Google finally activates ‘Visual Core’ imaging chip inside Pixel 2 smartphone

29 Nov

The finalized version of Google’s Android 8.1 operating system is expected to be released in December, but today the company has announced the availability of the last Developer Preview which, among other things, activates the formerly dormant Visual Core chipset in the Pixel 2 and Pixel 2 XL smartphones.

The custom-built system-on-a-chip (SoC) is designed to power and accelerate the Pixel 2 phones’ HDR+ function, which achieves better dynamic range and reduced noise levels through computational imaging. The feature is already incredibly powerful, so we can’t wait to see how it gets even better with this additional hardware boost.

HDR+ photo captured with the Pixel 2 for our Sample Gallery. Credit: Allison Johnson

The latest Pixel smartphone generation comes with the chip built in, but it appears Google ran out of time before the Pixel 2 launch to fully optimize the Visual Core implementation, and therefore decided not to activate it. With the new software version, Visual Core can now be turned on through an option in the Developer menu.

In addition to souping up the Pixel 2’s native camera app, this update also allows third-party apps using the Android Camera API to capture HDR+ shots. Previously, this function had been exclusive to the Google Camera app.

There is a wide selection of third-party apps for all types of mobile photographers in the Google Play Store, and making HDR+ available to all of them is no doubt a positive move by Google. To install the Android Developer Preview, your Pixel 2 device needs to be registered in the Android Beta Program. Or you could just wait for the official Android 8.1 launch in December.


 

AI-powered ‘Google Lens’ is being integrated into Assistant on Pixel phones

25 Nov

With the Pixel 2 smartphone, Google introduced an exciting new software feature called Google Lens. Google Lens uses Artificial Intelligence to power its visual recognition algorithms and provides information about whatever your smartphone’s camera is pointed at—for example, what type of flower you are looking at or reviews and other information about a restaurant. You can also identify landmarks, look up movies, books or works of art and scan barcodes/QR codes and business cards.

Unfortunately, in its first implementation the feature wasn’t terribly easy or straightforward to use: you had to take a picture, open Google Photos, and tap the Lens icon to trigger the Google Lens scan. That’s too many steps to make the feature as useful as it could potentially be.

Thankfully, Lens will soon be integrated into Google Assistant. When you open Assistant, there’ll be a Lens icon near the bottom right of the display; tapping it opens a Google Lens camera. Tap any object of interest in the preview window and the app will show any available information.

As usual, the new feature will be rolled out gradually. English-language Pixel phones using Assistant in the United States, United Kingdom, Australia, Canada, India, and Singapore will be served first over the coming weeks, but we’d expect the feature to make it to other regions soon after.


 

Google explains the Pixel 2’s impressive hybrid video stabilization

15 Nov

Most smartphone cameras, even those with optical image stabilization systems, rely on electronic stabilization alone for video footage. Google’s new Pixel 2 devices, however, combine optical and electronic stabilization for ultra-smooth handheld footage and panning, and a new post on the Google Research Blog explains in some detail how the system works.

As you would expect from a software company like Google, advanced algorithms making use of the company’s expertise in the area of machine learning are the key to the solution.

Motion information is collected from the optical stabilizer and the device’s built-in gyroscope. The Pixel 2 then runs a filtering algorithm that pushes video frames into a deferred queue, analyzes them, and uses machine learning to predict where and how the camera is going to move next.

The system can correct for more types of motion than conventional stabilization systems, including wobble, rolling shutter artifacts and even focus hunting. Virtual motion is introduced to mask strong variations in sharpness when the device is moved very quickly.
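The deferred-queue idea above can be sketched with a toy filter. This is our own stand-in, not Google's algorithm: where the Pixel 2 uses machine-learned motion prediction, a simple forward-looking average plays that role, smoothing each gyro sample using frames that arrive after it.

```python
def stabilize(angles, lookahead=3):
    """Smooth a camera-angle trace using deferred-frame lookahead.

    Because output frames are delayed by `lookahead` samples, the
    filter can see motion *after* each frame before choosing a
    corrected (virtual) camera angle for it.
    """
    smoothed = []
    for i in range(len(angles)):
        # Current sample plus a few future ones -- only possible
        # because the output is deferred.
        window = angles[i:i + 1 + lookahead]
        smoothed.append(sum(window) / len(window))
    return smoothed

# A shaky pan: steady 1-degree-per-frame drift plus alternating jitter.
shaky = [i * 1.0 + (0.5 if i % 2 else -0.5) for i in range(12)]
smooth = stabilize(shaky)
```

On this trace, the smoothed path advances at a nearly constant rate while the hand jitter cancels out, which is exactly what makes stabilized pans look deliberate rather than shaky.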

The system might still have scope for improvement, but with a video score of 96, including a very high sub-score of 93 for stabilization, the Pixel 2 is already performing very well in the DxOMark Mobile ranking, and already has us looking forward to future generations of Google’s AI-powered hybrid stabilization system.

For more detail read the original article on the Google Research Blog.
