
Posts Tagged ‘Pixel’

Google rolls out ‘Saturated’ mode to address Pixel 2 XL display issues

10 Nov

The Google Pixel 2 might sport one of the best smartphone cameras around, but when it comes to the display—particularly on the larger XL model—Google has had nothing but trouble. Reports of everything from burn-in, to blue tint off-axis, to ‘dull’ colors have left the tech giant playing catch-up, and today it finally … well… caught up. Or at least it tried.

A promised software update released on Tuesday (and rolling out to all users by the end of the week) addresses the burn-in issue with some minor tweaks, and adds three color saturation modes under the phone’s Display settings to hopefully quiet down the complaints about ‘dull’ colors.

Here’s a quick summary of the update in Google’s own words:

This update includes some of the enhancements we posted about on October 26, such as the new Saturated color mode for Pixel 2 and Pixel 2 XL, a fix for the faint clicking noise heard in some Pixel 2s, and other bug and security updates. As we mentioned in our deeper dive, this update also brings planned UI changes which extend the life of the OLED display, including a fade out of the navigation buttons at the bottom of the screen and an update to maximum brightness.

According to Android Central, the updated saturation settings come in three flavors: Natural, Boosted, and Saturated. Natural should provide the most accurate color reproduction; Boosted takes the place of the “Vivid Colors” setting previously available, which boosted saturation by 10%; and, finally, Saturated will put the display in an “unmanaged configuration” that will make colors “more saturated and vibrant, but less accurate,” according to Google’s deep dive on the topic.

Unfortunately, this mode throws away one of the most important things about Android Oreo: color management. In ‘Saturated’ mode, all apps, images and video will first render to sRGB (for now) and then be stretched to the display’s wider color gamut.

This will make for inaccurate colors across the two devices, but there is hope for us color nerds. As Seang Chau, VP of Engineering at Google, says in his blog post: due to color management under the hood in the new OS, “an Android app developer can now make use of the wider Display P3 color gamut precisely for a wider range of colors. Google apps will take advantage of wide colors in the future.” We’re hoping this means that future apps will render either to P3 or straight to a display profile provided by Google, which would allow for saturated colors when appropriate, but not at the cost of accuracy.
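The difference between managed and unmanaged rendering can be sketched numerically. This is an illustrative NumPy sketch, not Google's pipeline: it computes the Display P3 values a color-managed pipeline would use for a linear sRGB red, versus the ‘unmanaged’ approach of driving the P3 panel with the sRGB numbers directly.

```python
import numpy as np

# Linear RGB -> XYZ (D65) matrices for sRGB and Display P3 primaries
M_SRGB = np.array([[0.4124, 0.3576, 0.1805],
                   [0.2126, 0.7152, 0.0722],
                   [0.0193, 0.1192, 0.9505]])
M_P3 = np.array([[0.4866, 0.2657, 0.1982],
                 [0.2290, 0.6917, 0.0793],
                 [0.0000, 0.0451, 1.0439]])

srgb_red = np.array([1.0, 0.0, 0.0])   # linear sRGB pure red

# Color-managed: map through XYZ into the P3 gamut.
# sRGB red sits inside P3, so it needs less than the full P3 red primary.
managed = np.linalg.solve(M_P3, M_SRGB @ srgb_red)

# 'Saturated'/unmanaged: reuse the sRGB numbers as P3 drive values,
# which lands on the wider P3 red primary -> oversaturated
unmanaged = srgb_red

print(managed)    # roughly [0.82, 0.03, 0.02]
```

The managed result needs only about 82% of the P3 red primary to reproduce sRGB red accurately; driving the panel with the raw (1, 0, 0) overshoots it.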

Finally, no comment was made on the poor viewing angle of the XL model that introduces a strong blue-tint off-axis (see picture above of the Pixel 2 XL vs the original XL). This can make photos with warmer tones look even more desaturated by shifting toward blue. But while Google was able to address some of its display complaints this week, this seems like a hardware problem that will be difficult to fix via software.

Articles: Digital Photography Review (dpreview.com)

 

 

Google may address ‘dull’ colors of Pixel 2 XL display, is investigating burn-in reports

24 Oct

Google’s newly launched Pixel 2 XL smartphone has received some criticism from buyers who claim the POLED display appears ‘dull’ when compared to the vivid OLED displays used by some of Pixel 2 XL’s competition. Google recently commented on the criticism, telling 9to5Google that it will consider releasing a software update that adds more display color options.

“One of our design intents was to achieve a more natural and accurate rendition of colors,” Google said in its statement. For users who want more vivid colors, Google says it has provided an optional setting to increase saturation by 10%. However, should that prove inadequate, Google says it will “consider adding more display color options through software if that makes the product better.”

But a ‘dull’ screen isn’t the worst of Google’s Pixel 2 XL display troubles—the news just keeps getting worse for the handset this week. Some early adopters claim they are already experiencing burn-in, others report ‘blue tint’ issues, and some reviewers have noticed a ‘graininess’ issue.

Regarding the burn-in issue, Google told The Verge it is “actively investigating” the reports. After all, while muted colors aren’t a concern for some users, rapid burn-in—a problem that causes a potentially permanent ‘ghosting’ image to appear on the screen—could be enough to deter consumers from buying the phone altogether.

Early Pixel 2 XL users are also reporting a distinct blue tint that is visible when looking at the display from an angle. This blue tint issue is said to be most visible when the display’s background is light; a similar problem has been observed with the previously launched LG V30, a handset that features the same panel used in the Pixel 2 XL.

We can indeed confirm the viewing angle issues of the Pixel 2 XL. Here it is pictured next to the original XL (bottom) in our offices. There is a cyanish-green shift accompanied by progressive desaturation as you tilt the phone in any direction in your hand, noticeable at even modest viewing angles. This can make using the Pixel 2’s otherwise phenomenal camera a bit uninspiring: the preview looks noticeably desaturated and greenish if you shoot at any angle that doesn’t put the phone directly in front of you.

Finally, reviewers have noted that the Pixel 2 XL’s POLED display has an underlying graininess not shared by the Samsung OLED panel used in the smaller Pixel 2 phone. Ars Technica posted a side-by-side comparison photo of the two phones that highlights the XL’s graininess issue.

It’s safe to say it’s been a rough weekend for Google. We’ll keep you updated as Google addresses each of these issues in turn, and keep an eye out for our own Pixel 2 XL review coming soon!


 

 

How to do Pixel Stretching in Photoshop

21 Oct

As photographers, we’re all too aware of the abundance of ways to edit a photograph in post-production. And as technology progresses, so will the potential for image making. Pixel stretching is one way to investigate the construction of a digital image through creative means.


Compared to other glitch-style editing techniques, pixel stretching is pretty straight-forward. The process involves selecting a single row or column of pixels and stretching them out over an image to create a warped, surrealistic visual effect. The results highlight the nuances of a digital image and explore the action of altering photographs through non-traditional means.

Getting started

First, open an image in Photoshop. It doesn’t have to be anything special, just an image with a few varying tones or colors. I’m using this photograph of blossoms because it’s colorful and I’m excited that it’s finally spring here in Australia.


Duplicate your original image, which will be labeled as Background in the Layers panel. Right-click on the Background layer and select Duplicate Layer. It’s important that you don’t apply the pixel stretching technique directly to the original image, in case you need to revert to earlier stages of the project.

To preserve layers, photographers use Adjustment Layers to apply adjustments to an image without altering it directly. This process is called non-destructive editing. Pixel stretching, however, is by nature a destructive technique: the effect is applied directly to the layer you have selected. This means that if your History no longer reaches back to a certain spot during editing, there’s no going back.


The process

On the Photoshop Tools palette, select the Single Marquee Tool. You may have to click and hold the mouse over the Rectangular or Elliptical Marquee Tool until it reveals a small menu.

The Marquee Tool panel will reveal a choice between the Single Row Marquee Tool and the Single Column Marquee Tool. I’m going to use the Single Row Marquee Tool, but you can easily come back and experiment further once you get the hang of the technique.


With the Single Row Marquee Tool selected, click on an area in your image that you think is interesting. A dotted line stretching across your image will appear. This outlines the row of selected pixels that line up with the point you clicked on.


Once you have your pixels selected, click on Edit in the menu bar and select Free Transform. You can also select Free Transform by right-clicking on the dotted line of the Marquee Tool.


After you click on the Free Transform option, the cursor will appear as two opposing arrows when you hover over the Marquee Tool line. Click on the line where the opposing arrows appear and slowly drag the cursor down over the image.

You’ll see that the whole row of pixels stretches as far as you drag the mouse. When you’ve finished stretching the selection, press Enter and there you go. Looks kind of neat, right?
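If you’d rather script the effect than click through Photoshop, the core operation is tiny. Here’s a minimal NumPy equivalent (the array and variable names are mine): copy one row of pixels and broadcast it over everything below it.

```python
import numpy as np

# A small stand-in image: a vertical gradient, shape (height, width, channels)
img = np.tile(np.linspace(0, 255, 100)[:, None, None], (1, 80, 3)).astype(np.uint8)

row = 40                    # the single row of pixels you "selected"
stretched = img.copy()
stretched[row:] = img[row]  # broadcast that row over everything below it

# Every row from `row` down is now an identical copy of the selected row
```

Dragging upward instead would be `stretched[:row] = img[row]`.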


Shaping the area of pixels to stretch

The next step is to fit our stretched pixels into the landscape of the image. This time, open a photograph featuring straight, hard lines. Bridges and streets are good subjects to start with.

Select the Single Column or Single Row Marquee Tool and align the Single Marquee Tool with a hard line in your image. Again, I’m using the Single Row Marquee Tool but feel free to experiment with the Single Column Marquee Tool instead.


Once you have your Single Marquee Tool lined up, select the Rectangular Marquee Tool from the Photoshop toolbar. You may need to click and hold on the Marquee Tool icon to reveal the Rectangular Marquee Tool.


With the Rectangular Marquee Tool selected, click on the Subtract option just below the menu bar (big red arrow below). In Subtract mode, any portion of the selected line of pixels within the perimeter of the rectangle will be removed from the selection. Drag the Rectangular Marquee Tool over an area of the Single Marquee Tool line and release the mouse.


Rectangular marquee in subtract mode on the left of the image.

You’ll notice that a section of the Single Marquee Tool line will be deleted. This means that only the remaining Single Marquee Tool line will be available for stretching pixels later. For the image below, I deleted the lines that intruded outside the perimeter of the staircase. It’s hard to see, but the remaining dotted line is still aligned with the top of the green staircase.


Now that you have a smaller portion of pixels selected, right click on the remaining dotted line and select Free Transform. This time when you drag the selected line of pixels up or down the image, only the remaining pixels selected by the Single Marquee Tool line will be stretched.
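In script form, the subtract step is just a column mask: stretch only the part of the row that survived the subtraction. Again a NumPy sketch, with made-up coordinates standing in for the span you kept:

```python
import numpy as np

img = np.tile(np.linspace(0, 255, 100)[:, None, None], (1, 80, 3)).astype(np.uint8)

row = 40          # row aligned with a hard line in the image
x0, x1 = 20, 60   # the span left over after subtracting with the rectangular marquee

masked = img.copy()
masked[row:, x0:x1] = img[row, x0:x1]   # stretch only inside the kept span

# Columns outside [x0, x1) are untouched, just like the subtracted selection
```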


Pixels stretched up but only within the staircase area.

Conclusion

Now that you know the basics of pixel stretching, it’s time to experiment. This simple process has some distinctive painterly characteristics that alter the perspective of an image. The nature of digital photography often yields predictable, formulaic results. But be careful: you never know exactly how a pixel-stretched image will turn out, which makes it quite addictive!

I would love to see your creations in the comments below. Happy pixel stretching!


The post How to do Pixel Stretching in Photoshop by Megan Kennedy appeared first on Digital Photography School.



 

 

Surprise! Google hid a custom-built image processor inside the Pixel 2

18 Oct

Google’s Pixel 2 launch event on October 4th put a lot of emphasis on the new smartphone’s camera capabilities. However, the presenters at the event left out one very interesting detail: Google Visual Core.

Visual Core is a custom-built system-on-a-chip (SoC) designed to power and accelerate the Pixel 2 phones’ much-lauded HDR+ function, which achieves better dynamic range and reduced noise levels through computational imaging. The new Pixel 2 phones already come with the chip built in, but it has not been activated yet. It appears Google ran out of time before the Pixel 2 launch to fully optimize the Visual Core implementation in the device.

The good news is it will be activated at some point “over the coming months”, which should make HDR+ processing on the new devices even quicker and smoother than it already is (and it’s already far faster than on the original Pixels). According to Google, it will then be “5x faster and use less than 1/10th the energy,” a real advantage over the current general-purpose processing. In the future, the chip could also take over additional image processing tasks.

The company will also enable Pixel Visual Core as a developer option in its Oreo 8.1 preview, allowing access to HDR+ for the developers of third-party camera apps. All of this is currently limited to Google’s Pixel 2 devices, but there’s hope other manufacturers will pick up the Visual Core technology and associated software in the future.


 

 

New Samsung image sensors use dual pixel for fast AF and fake bokeh

13 Oct

Samsung Electronics has launched a pair of new image sensors—both intended for use in mobile devices—under its ISOCELL sensor brand: the ISOCELL Fast 2L9 and the ISOCELL Slim 2X7.

As the model name suggests, the ISOCELL Fast 2L9 is part of the Fast line-up, providing fast autofocus speeds even in low light conditions. To achieve this, the chip uses dual-pixel technology with two photodiodes at each pixel location. This not only speeds up autofocus but, according to Samsung, also allows for the creation of a software-based bokeh effect without the need for a dual camera, which is more or less what Google’s new Pixel 2 devices do.

The 12MP sensor comes with a 1.28µm pixel size, which is slightly smaller than the 1.4µm currently used in Samsung flagship phones.

At 0.9µm, the pixel size is even smaller on the second new sensor, the Slim 2X7. Like the Fast 2L9, it is designed to fit into even very thin devices without the need for a camera bump, but this one comes with a higher 24MP pixel count. In low light, the sensor combines the image information captured by four neighboring pixels to increase sensitivity and reduce image noise, a process which Samsung calls Tetracell.
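Samsung hasn’t published the exact algorithm, but the basic idea of Tetracell, combining each 2×2 block of neighboring pixels into one output pixel, can be sketched with a NumPy reshape:

```python
import numpy as np

# Toy 4x4 single-channel sensor readout
raw = np.arange(16, dtype=np.float64).reshape(4, 4)

# Group pixels into 2x2 blocks and average each block:
# (H, W) -> (H/2, 2, W/2, 2), then mean over the two in-block axes
binned = raw.reshape(2, 2, 2, 2).mean(axis=(1, 3))

# Output is quarter resolution (24MP -> 6MP in the Slim 2X7's case);
# averaging four samples improves the signal-to-noise ratio
```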

Like in other ISOCELL sensors, Deep Trench Isolation technology is applied to improve dynamic range and reduce color crosstalk on both sensors.

Looking at the technologies used in these new sensors, it is evident that as a maker of both hardware and software, Samsung is in an excellent position to design its sensors with computational imaging applications already in mind. Unfortunately, there is no information yet on when we’ll see the new sensors integrated into actual devices.


 

 

Google’s unlimited full-res photo storage for Pixel 2 owners ends in 2020

10 Oct

Google is offering Pixel 2 buyers a special perk that allows them to store an unlimited number of full-resolution photos and videos through Google Photos, but it comes with a catch. Fine print listed at the bottom of Google’s Pixel 2 product page notes that the free unlimited full-res storage is only available until 2020; at that point, the handsets will revert to Google Photos’ typical ‘high-quality’ unlimited storage option.

‘High-quality’ is the term Google uses to denote a 1080p video resolution and 16MP image resolution.

Google Photos allows any user to upload an unlimited number of photos and videos at up to this high-quality threshold; anything that exceeds it is compressed when uploaded and that compressed version is stored. The Pixel 2 will sidestep this restriction, but only for a couple of years.
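As a rough illustration of the policy described above (the thresholds come from the article; the function names are my own, not Google's API), ‘high quality’ caps photos at 16MP and video at 1080p unless the Pixel 2 perk applies:

```python
# Illustrative sketch of the storage tiers described above; not Google's code.
HIGH_QUALITY_PHOTO_MP = 16.0
HIGH_QUALITY_VIDEO_HEIGHT = 1080

def stored_photo_mp(megapixels: float, full_res_perk: bool = False) -> float:
    """Resolution (in MP) actually kept by Google Photos' free tier."""
    if full_res_perk:                 # Pixel 2 owners, until 2020
        return megapixels
    return min(megapixels, HIGH_QUALITY_PHOTO_MP)

def stored_video_height(height: int, full_res_perk: bool = False) -> int:
    """Vertical resolution actually kept for an uploaded video."""
    if full_res_perk:
        return height
    return min(height, HIGH_QUALITY_VIDEO_HEIGHT)

# A 12MP Pixel 2 photo fits under the cap either way; a 24MP file from
# another camera would be recompressed on the free tier
```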

Non-Pixel phone users can upload full-resolution videos and images for free up to a 15GB threshold. Once that threshold is reached—or, for Pixel 2 owners, once 2020 arrives—additional storage space can be purchased starting at $2/month (depending on location).


 

 

Google shares high-resolution Pixel 2 sample photos

10 Oct


As with the original Pixel handsets, Google is stressing the quality of its Pixel 2 phone’s 12MP camera, calling it the “world’s highest rated smartphone camera” thanks to its DxOMark score of 98. The smartphone doesn’t start shipping to buyers until November 15th, but ahead of that the company has shared a gallery of unedited images and videos taken with the handset.

The gallery was shared on Saturday by Google employee Isaac Reynolds, who explained that the content includes two videos that were edited to demonstrate the camera’s video stabilization through a side-by-side comparison. In addition, Reynolds says that the only equipment used in these shots, the Pixel 2 aside, was hand grips and, in certain instances, handheld reflectors.

Anyone can view and download the gallery’s content for analysis using third-party software. As recently reported, Google will allow Pixel 2 owners to upload unlimited full-resolution videos and photos to Google Photos for free until 2020. Once that date arrives, and assuming the regular 15GB threshold is reached, Pixel owners will need to pay for additional storage or continue using free backups at a lower quality compressed resolution.


 

 

Nine things you should know about the Google Pixel 2

07 Oct

Nine things you should know about the Google Pixel 2

With all the hype surrounding the release of the Google Pixel 2 and Pixel 2 XL and their “world’s highest rated smartphone camera,” it’s easy to lose the forest for the trees. What’s important about this new phone? Where did Google leave us wanting more? How is this phone’s camera better than its predecessor? And why should photographers care about the technology baked into Google’s new flagship?

After covering the launch in detail and spending some time with the Pixel 2 in San Francisco, we’re setting out to answer those questions (and a few others) for you.

Dual Pixel AF

The new Pixel phones sport a very clever feature found on higher-end Canon cameras: split left- and right-looking pixels behind each microlens on the camera sensor. This allows the camera to sample left and right perspectives behind the lens, which can then be used to focus the camera faster on the subject (it’s essentially a form of phase-detect AF).

It’s officially called dual pixel autofocus, and it has the potential to offer a number of advantages over the ‘focus pixels’ Apple phones use: every pixel can be dedicated to focus without any impact to image quality (see this illustration). We’ve been impressed with its implementation on the Samsung Galaxy S7 and on Canon cameras. So we’re expecting fast autofocus for stills, even in low light, as well as very smooth autofocus in video with little to no hunting. Given how good the Pixel 2’s stabilized 4K video is, you might even make some professional-looking clips from these new phones.
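To make the phase-detect idea concrete, here is a toy 1D sketch (my own construction, not Google's implementation): a defocused feature lands at slightly different positions in the left- and right-pixel images, and the shift between them can be estimated by correlation. The sign and size of that shift tell the lens which way, and roughly how far, to move.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 200)
scene = np.exp(-((x - 0.5) / 0.05) ** 2)   # a single bright feature

true_shift = 3                              # defocus-induced offset, in pixels
left = scene
right = np.roll(scene, true_shift)

# Estimate the shift by finding the offset that maximizes correlation
offsets = list(range(-10, 11))
scores = [np.dot(left, np.roll(right, -k)) for k in offsets]
estimated = offsets[int(np.argmax(scores))]

# `estimated` recovers true_shift; an in-focus image would give zero shift
```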

Dual pixel + machine learning driven portraits

The split pixels have another function: the left-looking and right-looking pixels underneath each microlens essentially sample two perspectives that are slightly shifted from one another. Google then builds a rudimentary depth map using this set of separated images and some help from its machine learning algorithms.

Clever. However, the stereo disparity between the two images is likely to be very small compared to a dual-camera setup, which is likely to make it difficult for the Pixel 2 cameras to distinguish background from subject for more distant subjects. This might explain the poor results in DxO’s comparison, but better results in the image above, where Allison is much closer to the camera.
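A back-of-the-envelope calculation shows why distant subjects are hard: disparity in pixels is roughly focal length (in pixels) × baseline / depth. The numbers below are my own illustrative guesses, not published Pixel 2 specs; a dual-pixel baseline is on the order of half the aperture diameter, around a millimeter, versus roughly a centimeter for two separate cameras.

```python
# All numbers are illustrative assumptions, not published Pixel 2 specs
focal_length_m = 4.0e-3        # ~4mm smartphone lens
pixel_pitch_m = 1.4e-6         # ~1.4 micron pixels
focal_px = focal_length_m / pixel_pitch_m    # ~2857 px

dual_pixel_baseline_m = 1.0e-3   # ~half the aperture of a fast 4mm lens
dual_camera_baseline_m = 1.0e-2  # ~1cm between two separate camera modules

def disparity_px(baseline_m: float, depth_m: float) -> float:
    """Approximate stereo disparity for a subject at the given depth."""
    return focal_px * baseline_m / depth_m

near = disparity_px(dual_pixel_baseline_m, 0.5)   # portrait distance: a few px
far = disparity_px(dual_pixel_baseline_m, 5.0)    # distant subject: sub-pixel
```

With these assumptions a subject at 5m produces well under a pixel of disparity from the split pixels alone, while a dual camera at the same depth still gets several pixels to work with.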

On the plus side, Portrait mode now renders full resolution 12MP files (you only got 5MP files on the original Pixels), and the ‘lens blur’ Google uses is generally more pleasing than Apple’s more Gaussian blur. Out of focus highlights are rendered as more defined circles compared to Apple’s results. This comes at a cost though: the blurring algorithm is computationally intensive so you’ll generally wait a few seconds before seeing the result (and you can’t see it in real time as you can with Apple).

Hardier hardware

Unsurprisingly, if you’ve been following the rumor mill, the hardware specs on the new Pixel 2 phones don’t impress any more than what we’ve seen from other phones. They’re nice devices, and both are far more durable, with IP67 ratings (a huge step up from the poor IP53 ratings of the previous Pixel phones, which were prone to quick wear and tear), but hardware-wise there’s not too much to be excited about.

We’ve lost the headphone jack but gained stereo speakers in the front. The XL has less of a bezel, but it’s still not as bezel-less as Samsung phones. No dual-cameras. RAM and processor are what you get in other Android phones. You can invoke the Assistant with a squeeze, but… well…

Nothing really stands out. But wait, there’s more to the story.

AI First

If there’s one point Google CEO Sundar Pichai continuously makes in his presentations, it’s that we’re moving from a ‘Mobile First’ to an ‘AI First’ world. He’s referring to the move away from thinking of mobile devices simply as pocketable computation devices but, instead, intelligent devices that can adapt to our needs and make our lives easier. And Google is a leader here, thanks to the intelligence it acquires from its search services and apps like Maps and Photos.

AI is increasingly being used in many services to make them better, but often transparently. CEO Pichai recently cited an example from the Fitness app: every time he opens it, he navigates to a different page. But rather than have the app team change the default page, or add an option to change it, he figures AI should just learn your preference transparently.

What’s that mean for photography and videography? We’re purely speculating here, but, imagine a camera learning your taste in photography by the way you edit photos. Or the photos you take. Or the filters you apply. Or the photos you ‘like’. How about learning your taste in music so when Google Assistant auto-builds videos from your library of photos and videos, they’re cut to music you like?

The possibilities are endless, and we’re likely to see lots of cool things make their way into the new Pixel phones, like…

Google Lens

Sundar Pichai first talked about Google Lens at the I/O Developer Conference earlier this year. It marries machine vision and AI, and is now available for the first time in the Photos app and within Google Assistant on the new Pixel phones. Google’s machine vision algorithms can analyze what the camera sees, and use AI to do cool things like identify what type of flower you’re pointing your camera at.

This sort of intelligence is applicable to photography as well: Pichai talked about how AutoML has improved Google’s ability to automatically identify objects in a scene. Anything from a fence to a motorbike to types of food to your face: Google is getting increasingly better at identifying these objects and understanding what they are – automatically using reinforcement learning.

And once you understand what an object is, you can do all sorts of cool things. Remove it. Re-light it. Identify it so you can easily search for it without ever keywording your photos. The Photos app can already pull up pictures of planes, birthdays, food, wine, you name it. We look forward to seeing how the inclusion of Google Lens in the new phones makes Photos and Assistant better.

Maybe intelligent object recognition could even fix flare issues by understanding what flare is… though this may not be necessary for the new phone…

Goodbye ugly lens flare

Thankfully, the nasty flare issues that plagued the first-gen Pixel phones appear to be remedied by lifting the camera module above the glass backing, which has also been reduced and streamlined to fit flush with the rest of the phone.

The camera unit is raised ever so slightly from the back, but that’s a compromise we’re willing to accept if it means the camera isn’t behind a piece of uncoated glass – a recipe for flare disaster. The only flare we’ve seen so far with our limited hands-on time is what DxO witnessed in their report: the lens element reflections in corners you sometimes see even in professional lenses. That’s something we’ll gladly put up with (and that some of us even like).

If flare bugged you on the previous Pixel phones (it certainly bugged me), consider it a non-issue on the new phones.

Incredibly smooth video

When the original Pixel launched, Google claimed its camera beat other cameras with optical image stabilization (OIS) despite lacking OIS. It claimed its software-based stabilization approach allowed it to get better with time as algorithms got better. Omitting OIS was also crucial to keeping the camera small such that it fit within the slim body.

Google is singing a different tune this year, including both OIS and electronic image stabilization (EIS) in its larger camera unit that extends ever-so-slightly above the back glass. And the results appear to be quite impressive. The original Pixels already had very good stabilization in video (even 4K), but combining OIS + EIS appears to have made the video results even smoother. Check out the video from Google above.

For low light photography, OIS should help steady the camera for longer shutter speeds. You should also get better macro results and better document scanning. Hey, that’s worth something.

Equally as important as what the new phones offer is what the new phones don’t offer…

Color management? HEIF?

Notably absent was any talk about proper color management on the new phones. The previous Pixels had beautiful OLED displays, but colors were wildly inaccurate and often too saturated due to lack of any color management or proper calibrated display modes.

iPhones have some of the most color accurate screens out there. Their wide gamut screens now cover most of DCI-P3 but, more importantly, iOS can automatically switch the screen’s gamut between properly calibrated DCI-P3 and standard gamut (sRGB) modes on-the-fly based on content.

This means you view photos and movies as they were intended. It also means when you send an image from your iPhone to be printed (using a service that at least understands color management, like Apple’s print services), the print comes back looking similar, though perhaps a bit dimmer.*

The Samsung Galaxy S8 also has calibrated DCI-P3 and sRGB modes, though you have to manually switch between them. The new Pixel phones made no mention of calibrated display modes or proper color management, though Android Oreo does at least support color management (though, like Windows, leaves it up to apps). But without a proper display profile, we’re not sure how one will get accurate colors on the Pixel 2 phones.


*That’s only because prints aren’t generally illuminated as much as bright backlit LCDs that these days reach anywhere from 6 to 10 times the brightness prints are generally viewed at.

HDR display?

Sadly there was no mention of 10-bit images or HDR display of photos or videos (using the HDR10 or Dolby Vision standards) at Google’s press event. This leaves much to be desired.

The iPhone X will play back HDR video content using multiple streaming services, but more importantly for photographers it will display photos in HDR mode as well. Remember, this has little to do with HDR capture but, instead, the proper display of photos on displays—like OLED—that can reproduce a wider range of tones.

To put it bluntly: photos taken on an iPhone X and viewed on an iPhone X will look more brilliant and have more pop than anything else you’re likely to have seen before thanks to the support for HDR display and accurate color. It’s a big deal, and Google seems to have missed the boat entirely here.

HDR displays require less of the tonemapping traditional HDR capture algorithms employ (though HDR capture is still usually beneficial, since it preserves highlights and decreases noise in shadows). Instead of brightening shadows and darkening bright skies after capture, as HDR algorithms like the Pixel 2’s are known to do post-capture (above, left), leaving many of these tones alone is the way to go with high dynamic range displays like OLED.

In other words, the image above and to the right, with its brighter highlights and darker shadows, may in fact be better suited for HDR displays like that of the Pixel 2, as long as there’s still color information present in the shadows and highlights of the (ideally 10-bit) image. Unfortunately, Google made no mention of a proper camera-to-display workflow for HDR capture and display.
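The tone-handling difference can be illustrated with a toy sketch. This uses a Reinhard-style curve as a stand-in for an SDR tonemap, which is my own choice of example, not necessarily what any phone actually ships:

```python
import numpy as np

def reinhard(x):
    """Simple global tonemap: compresses highlights for an SDR display."""
    return x / (1.0 + x)

# Scene-referred linear values: deep shadow, mid grey, diffuse white, bright sky
scene = np.array([0.01, 0.18, 1.0, 4.0])

sdr = reinhard(scene)   # highlights squeezed toward the midtones
hdr = scene             # an HDR display can pass more of the range through

# The SDR curve shrinks the highlight-to-midtone ratio; HDR preserves it
sdr_ratio = sdr[3] / sdr[1]
hdr_ratio = hdr[3] / hdr[1]
```

The bright-sky value ends up only a few times brighter than mid grey after the SDR curve, versus the 22× ratio the scene actually had, which is exactly the contrast an HDR-capable OLED could have reproduced.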


 

 

Live coverage of the Google Pixel 2 launch on DPReview

05 Oct

10:45am PT

That’s all folks! You can learn more about these products on the Google Store right now. As for us, we’ll be running over to get our hands on the new Pixel 2 and Pixel 2 XL in person, and see if they really do have the world’s best smartphone camera. Stay tuned for our hands-on take later today!

10:43am PT

The Google Clips camera looks for clear moments to capture. You can clip it anywhere. It features an F2.4 lens, 130° field of view, and captures short ‘clips’ that can be saved as motion photos, videos, or high res stills. You can choose which high-res still to save by navigating through a clip. When reviewing clips on your phone, just swipe right to save any one.

It will cost $250 and is “coming soon.”

10:40am PT

“A camera that takes photos for you, so you can enjoy the moment and get shots you could never get before.”

Starts with an AI engine at the core of the camera. Google Clips looks for “moments” by analyzing the scene and capturing photos automatically, so you can be part of the moment you’re capturing.

10:38am PT

One more photography update, having to do with candid photography that lets you be part of the moment as the photographer.

Meet Google Clips: a new lifelogging-style camera designed with parents and pet owners in mind.

10:37am PT

Here’s something they did NOT mention when talking about the new screen: the new Pixel 2 wide gamut display claims to offer “100% DCI-P3 coverage.” While OLEDs often offer close to full DCI-P3 coverage, our Technology Editor Rishi Sanyal is a bit skeptical of the 100% figure and wants to see an actual CIELAB diagram. Some estimates ‘cheat’ by counting extended gamut outside of the P3 space in one color to make up for the lack of gamut coverage in another color. We’ll have to wait and see, but most OLED coverage estimates max out at 99% DCI-P3 coverage.
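For the curious, the honest way to compute gamut coverage is the area of the overlap between the display gamut and P3, divided by the area of P3, not the ratio of total areas (which rewards out-of-gamut overshoot). A sketch in CIE xy chromaticity coordinates; the P3 primaries are real, but the display triangle is entirely hypothetical:

```python
# Chromaticity coordinates (CIE xy). DCI-P3 primaries are the standard values;
# the DISPLAY triangle is an invented example that overshoots P3 toward red
# while undershooting toward green. Both polygons are counter-clockwise.
P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
DISPLAY = [(0.708, 0.292), (0.280, 0.620), (0.150, 0.060)]  # hypothetical

def polygon_area(poly):
    """Shoelace formula."""
    n = len(poly)
    return 0.5 * abs(sum(poly[i][0] * poly[(i + 1) % n][1] -
                         poly[(i + 1) % n][0] * poly[i][1] for i in range(n)))

def clip_convex(subject, clipper):
    """Sutherland-Hodgman clipping of one convex polygon by another
    (both counter-clockwise). Returns their intersection polygon."""
    def inside(p, a, b):
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0
    def intersect(p, q, a, b):
        x1, y1, x2, y2 = *p, *q
        x3, y3, x4, y4 = *a, *b
        denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
        t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    output = subject
    for i in range(len(clipper)):
        a, b = clipper[i], clipper[(i + 1) % len(clipper)]
        input_list, output = output, []
        for j in range(len(input_list)):
            p, q = input_list[j], input_list[(j + 1) % len(input_list)]
            if inside(q, a, b):
                if not inside(p, a, b):
                    output.append(intersect(p, q, a, b))
                output.append(q)
            elif inside(p, a, b):
                output.append(intersect(p, q, a, b))
    return output

# Honest coverage: area(display ∩ P3) / area(P3). The naive area ratio
# counts display area outside P3, which is exactly the 'cheat' described above.
coverage = polygon_area(clip_convex(DISPLAY, P3)) / polygon_area(P3)
naive = polygon_area(DISPLAY) / polygon_area(P3)
print(f"honest coverage: {coverage:.1%}, naive area ratio: {naive:.1%}")
```

For this made-up display the naive ratio comes out several points higher than the honest coverage figure, which is why a full CIE diagram (rather than a single percentage) is worth asking for.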

Plus, we’re still waiting to find out if the Pixel 2 phones offer proper color management to provide accurate color on these wide gamut displays. Even the original Pixel phones offered wide gamut displays, but displayed wildly inaccurate colors because of the lack of proper color management and display profiles.

10:30am PT

Worth noting about that DxOMark score of 98: that’s an aggregate of Photo and Video scores.

The Samsung Galaxy Note 8 still beats the Google Pixel 2 in the Photo category, scoring 100 to the Pixel 2’s 99. It’s the Pixel 2’s insane Video score of 96 that earns it the higher overall score: in Video, the iPhone 8 Plus scored an 89 and the Note 8 only an 84. We’re guessing that high score is largely due to the smooth video enabled by the combination of optical and electronic stabilization.

Here’s DxOMark’s full review.

10:27am PT

Feature breakdown:

  • Ultra Vivid OLED Display
  • Super Fast Charging
  • Water Resistant
  • The Fastest Fingerprint Sensor
  • Smartest Assistant
  • First Phone with Google Lens
  • Exclusive AR Stickers
  • World’s Highest-Rated Camera

Pre-orders start today.

10:23am PT

12MP F1.8 rear camera with OIS. Smaller 1/2.55″ sensor though (1/2.3″ on last year’s models). HDR+ still takes a burst of shorter exposure shots to preserve highlights, then combines (averages) them to reduce noise. The latter essentially simulates the effect of a larger sensor. While this works very well for static scenes, it can be problematic for moving objects like running kids.
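The noise benefit of that averaging is easy to simulate: with independent per-frame noise, averaging N frames cuts the noise standard deviation by a factor of √N. A toy illustration with synthetic numbers (not HDR+ code):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

true_scene = 100.0   # "true" brightness of a static patch, arbitrary units
noise_sigma = 10.0   # per-frame noise standard deviation
n_frames = 9         # burst length

# Nine short exposures of the same static patch, each with independent noise
burst = true_scene + rng.normal(0.0, noise_sigma, size=(n_frames, 10000))

single_frame_noise = burst[0].std()
averaged_noise = burst.mean(axis=0).std()

# Averaging 9 frames should cut noise by about sqrt(9) = 3x
print(single_frame_noise / averaged_noise)
```

This is also why motion is the weak spot: if the patch moves between frames, the "same scene" assumption breaks and naive averaging smears the subject instead of cleaning it up.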

Portrait mode in the Pixel 2 uses Google’s computational photography tech. No second camera required. Just split pixels on the sensor combined with machine learning. This allows both the front and back camera to use Portrait Mode.

It’s actually quite clever: the phone creates a rudimentary depth map using Dual Pixel technology and machine learning. Or, as our Tech Editor explains it, “The pixels are split just like on Canon Dual Pixel sensors. And the Samsung Galaxy. It’s used for phase-detect AF (fast focus) as well as to create a rudimentary depth map using the left and right perspectives viewed from behind one lens. Smart.”
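As a toy illustration of the principle (not Google's algorithm), here's 1-D block matching between two views of the same signal, the way a depth pipeline might estimate disparity from the left/right half-pixel images:

```python
import numpy as np

def estimate_disparity(left, right, max_shift=3):
    """Toy 1-D block matching: find the integer shift that best aligns
    the right view with the left view (minimum sum of absolute
    differences). Real dual-pixel pipelines work with tiny sub-pixel
    baselines and 2-D patches, but the principle is the same."""
    best_shift, best_cost = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        # Score only a central window to limit edge-wraparound artifacts
        cost = np.abs(left[max_shift:-max_shift] -
                      np.roll(right, shift)[max_shift:-max_shift]).sum()
        if cost < best_cost:
            best_shift, best_cost = shift, cost
    return best_shift

left_view = np.arange(16.0) ** 2      # a synthetic, textured 1-D signal
right_view = np.roll(left_view, -2)   # the same 'scene' shifted by 2 samples

print(estimate_disparity(left_view, right_view))  # recovers the 2-sample shift
```

Per-patch shifts like this, computed across the frame, yield the rudimentary depth map that decides what gets blurred.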

And you no longer have to move the camera upward while taking a photo in Portrait mode. You can just snap a shot. This would make it work better with slightly moving subjects compared to the original Pixel phones. Sadly, Portrait mode is not simulated in real-time as it is on recent iPhones.

Oh, and the Video mode uses OIS and EIS at the same time. We’ve seen this on 1″-type compact cameras and some ILCs like Canon M-series and the Olympus E-M1 Mark II, but it’s a first for smartphones. This should lead to incredibly smooth video!

10:20am PT

Pixel camera now!

“With Pixel 2, we have reimagined smartphone photography. DxOMark has issued Pixel 2 an unprecedented score of 98.”

That trounces the iPhone 8 Plus and Samsung Galaxy Note 8 which both scored 94.

10:18am PT

Augmented reality updates now. Very similar to the AR updates we saw with Apple and the new iPhones—inserting furniture or games into the real world through augmented reality.

Something ‘exclusive’ to Pixel 2 are AR stickers that interact with the world and with each other… because Google needed something to compete with Apple’s Animojis.

10:14am PT

Talking about Google Lens now. Using pictures, machine learning technology, and Google Assistant to pull information out of images and tell you all about them. Like pulling phone numbers off a flyer, or… telling the difference between muffins and chihuahuas (their example, not ours).

10:10am PT

Squeezing the phone triggers Google Assistant, so you can ask it to take a selfie. And it uses Machine Learning to tell if that squeeze was “intentional.”

Still waiting on more comprehensive updates about the camera. Hopefully it’s not all software and AI-based improvements. We’re really hoping for some hardware updates like OIS and maybe a bigger sensor or better processor.

*fingers crossed*

10:05am PT

Pixel 2: Full HD OLED display on the smaller 5-inch model. 100,000:1 contrast ratio. More than twice the contrast ratio of phones in its class (save for the iPhone X). Comes in three colors: Kinda Blue, Just Black, and Clearly White.

Pixel 2 XL: Less bezel, ‘gently curved’ screen, wide color gamut display, integrated circular polarizer so you can view the screen through sunglasses, 538 ppi (up from 534 ppi in the first Pixel XL). Comes in two colors: Just Black and ‘stylishly simple’ Black and White. We’re told the screen is optimized for VR, which may mean a pixel arrangement more amenable to high magnification.

“We don’t set aside better features for the larger device.” OOOO sick burn on Apple.

And yes, they are both IP67 dust and water resistant! On par with the iPhone, but a bit short of Samsung’s IP68. That’s a big upgrade from the IP53 rating of last year’s phones (what do IP ratings mean?).

10:00am PT

Google VP Mario Queiroz on stage, getting ready to talk about a ‘smarter’ and ‘simpler’ smartphone.

The Google Pixel 2, designed “with the best of Google built in.” Comes in 2 sizes, 5-inch and 6-inch XL. More Google Assistant capabilities and will “continue to offer the best photography.”

9:58am PT

One hour later, we’re FINALLY about to hear about Pixel 2 and Pixel 2 XL!!!

9:55am PT

The 12.3-inch Quad HD touchscreen is nice, it’s the first laptop with Google Assistant built in, and the laptop comes with the new Google Pen that can be used in concert with Google Assistant. 2,000 levels of pressure sensitivity… wonder how well photo editing in Lightroom on the Pixelbook works with the pen?

9:49am PT

*Sigh*…

Still waiting on the Google Pixel 2 launch. Moving on to Pixelbook from Google Home. It’s like they’re TRYING to torture the photo nerds. Let’s see if there’s any photo-centric reasons to be excited about the Pixelbook…

9:35am PT

We’re getting a bunch of Google Home updates/announcements. There’s a small one now… something about fabric… they needed 100+ tries to find an appropriately grey grey… cool stuff… clearly we’re very interested in this part.

*insert Jeopardy waiting music here*

9:21am PT

Next generation of Google devices are “fast” and “easy to use” and “anticipate your needs.” Products that get faster and more helpful over time thanks to machine learning.

9:19am PT

Rick Osterloh: “Pixel had the best and top rated smartphone camera. We’re really proud with how well the Pixel did as our first generation smartphone.”

He’s not wrong. But there’s a lot of room to improve…

Rick is talking about the challenges facing hardware development. So Google is going to take a “different approach” to smartphone [photography] advances by living at “the intersection of AI, software and hardware.”

9:12am PT

Pichai is confident that Google is at the forefront of driving the shift to this AI-first future.

One of the major leaps forward Google has made is in object detection, which he says is now at 45% accuracy! The company is using this tech in Google Lens and, says Pichai, in the Google Pixel smartphones.

9:05am PT

Google CEO Sundar Pichai on stage. Started with a somber note about the horrifying tragedy in Las Vegas, and the natural disasters around the world.

Now talking about how Google is using machine learning technology to improve everything from Google Maps, to parking difficulty prediction, to Google Translate. Pichai is “excited about a shift from a Mobile-first to an AI-first world.”

This shift will no doubt have a major impact on the future of mobile photography.

8:59am PT

Are you ready? The Google DJs are winding down the music.

8:45am PT

We’re officially inside the SF Jazz Center waiting for the presentation to start! A few things we’re hoping for: optical image stabilization, better depth of field simulation with live preview, and a much more durable Pixel 2/XL on par with the iPhones (IP67 rating) or even Samsung’s Galaxy Note 8 (IP68 rating).

8:30am PT

Hot on the heels of Apple’s own smartphone announcement, Google is taking on the iPhone 8 Plus and iPhone X with its own release. In T-minus 30 minutes, Google is set to unveil the Pixel 2 and Pixel 2 XL (among a few other things) and we’ll be covering the launch live from San Francisco on Twitter and on this page.

Watch the livestream with us, and keep refreshing this page for up-to-the-minute takes on all things photography related from the Google event.



Google unveils Pixel 2 phones: Adds OIS, Dual Pixel powered Portrait Mode and more

05 Oct

Ever since the iPhone 8 Plus and iPhone X were announced, we’ve been waiting for Google’s response. When the original Google Pixel came out, it quickly became one of the most raved about smartphone cameras in the world… would the Pixel 2 follow suit? The short answer, at least according to Google, is yes.

Just this morning, we sat down in the SF Jazz Center and, after an hour of other updates, Google finally unveiled the 5-inch Pixel 2 and 6-inch Pixel 2 XL.

The new phones house a 12.2MP sensor with 1.4um pixels, Dual Pixel phase detect autofocus and an F1.8 lens on the back, and an 8MP camera with 1.4um pixels, fixed focus and an F2.7 lens on the front. The newer 1/2.55″ sensor is smaller than the previous-gen’s 1/2.3″ sensor, but the brighter aperture nearly perfectly compensates.* Video specs for the rear camera max out at 4K 30fps (sorry, no 4K/60p like the new iPhones) while the front camera can do up to 1080p at 30fps. The camera units are now raised above the back glass surface, which remedies the nasty flare issues the previous Pixels had.

As we hoped, the whole phone is encased in an IP67 water and dust resistant aluminum unibody, and is powered by the latest and greatest Qualcomm Snapdragon 835 processor.

More impressive than the base specs is how Google uses its hardware in concert with software and machine learning technology to deliver a better photography and video experience.

Instead of opting for a dual camera on the back of the phone, the Google Pixel 2 and Pixel 2 XL use just one camera, combining it with Dual Pixel technology (split left/right pixels) and computational photography to create the now-ubiquitous fake-bokeh Portrait Mode effect. And since stabilization is incredibly important, Google has worked out how to use both optical and electronic image stabilization at the same time when you’re shooting video, which should deliver incredibly smooth footage. (More on that from San Francisco shortly…)

Unfortunately, in our brief time with the cameras so far, we discovered that Portrait mode is still not rendered live on either camera. It seems there are downsides either to using a single camera instead of a dual-camera setup, or to Google’s (we think correct) choice of a more computationally intensive ‘lens’ blur over the smoother Gaussian blur that Apple opts for.

Finally, no modern smartphone is complete until you look at the display your photos and videos will be viewed on.

Unfortunately, Google made no mention of color management or proper display profiles, the lack of which caused issues with the previous Pixel smartphones. Still, the new AMOLED (5-inch model) and pOLED (6-inch model) displays are wide gamut: the Pixel 2 claims 93% DCI-P3 coverage, while the Pixel 2 XL claims full 100% coverage of the same standard.

We bring this up because last year’s Pixel phones also offered a wide color gamut and high contrast ratio, thanks to their OLED display technology, but often displayed wildly inaccurate colors due to the lack of color management. It’s still possible the displays will come calibrated properly for the P3 or sRGB color spaces, but without any explicit mention of calibrated display modes that the OS automatically switches between based on the color space of the content (as Apple claims to do), we remain skeptical.

The lack of any talk of HDR display of video or photos was also a disappointment after the announcement of iPhone X’s support for HDR10 and Dolby Vision video, and HDR display of photos. The latter should make HDR photos pop on the bright contrasty OLED display of the iPhone X, rather than give them the flat tonemapped look we’re often used to. It seems Google has chosen to go the traditional method of compressing a high contrast scene into a flatter image, rather than take advantage of the HDR display capabilities of its OLED display.

We’re currently spending some time with the Google Pixel 2 and Pixel 2 XL in person today at the Jazz Center, so stay tuned for our hands-on impressions as the designated photography nerds at this event.

In the meantime, you can find out more about either of these phones on the Google Store, check out our Live Blog to see what we were thinking as the announcements were going up, or argue about your Apple vs Google allegiance in the comments.


* At least for low light performance, but perhaps not dynamic range. The discussion is complicated by the use of computational photography, of course, so it’s difficult to speculate on the overall impact of the smaller sensor / brighter aperture.
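A quick back-of-the-envelope check of that footnote, using approximate sensor diagonals and the original Pixel's F2.0 aperture (our figures, not Google's):

```python
# Approximate sensor diagonals in mm (optical-format 'inches' are nominal,
# so these are estimates): 1/2.3" vs 1/2.55"
old_diag, new_diag = 7.66, 7.0
old_f, new_f = 2.0, 1.8   # original Pixel vs Pixel 2 apertures

# Total light gathered scales with sensor area / f-number^2
sensor_area_ratio = (new_diag / old_diag) ** 2   # ~0.84: smaller sensor
aperture_ratio = (old_f / new_f) ** 2            # ~1.23: brighter lens

print(sensor_area_ratio * aperture_ratio)  # close to 1: roughly a wash
```

The brighter lens buys back almost exactly what the smaller sensor gives up in total light, which supports the "nearly perfectly compensates" claim for low light, even if per-pixel dynamic range is a separate question.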
