
Posts Tagged ‘Google’s’

Google’s long-awaited Clips Camera hits stores, will cost you $250

30 Jan

If you’ve been desperately waiting for Google’s artificial intelligence-driven Clips camera to go on sale, now is your moment. The company has added the clip-shooting device to its store for US customers at the expected $250 price tag, with delivery estimated between the end of February and the beginning of March.

The lifelogging camera was first revealed at the Pixel 2 event in October. It’s designed to recognize the best moments and composition, and to shoot automatically when it ‘thinks’ the occasion is right. The aptly named Clips camera shoots short ‘clips’ of video which can be reviewed in a Google Clips app. In the app, clips can be saved or deleted, and still images can be extracted from the clips as well.

The 12MP camera also has a shutter button for manual activation, but the main idea is that it gets placed somewhere it can see what’s going on and does all the work for you. Used in its automatic ‘intelligent’ mode, Clips allows the user to be in the pictures instead of having to be behind the camera.

Below is a sample clip posted to the Google blog, with the video captured by the camera on the left and the still extracted from the video on the right. Stills are extracted using the Google Clips app.

The camera can record at 15fps, and uses a lens with a 130° angle of view. Images are stored in the 16GB internal memory, and the camera can run for three hours on a single charge. Connection is via USB-C, Wi-Fi or Bluetooth.

As reported before, professional photographers were consulted to help the company understand what makes a good or a bad picture, so after analyzing what’s happening and where the elements are in the frame, the device’s brain decides whether or not to record. The camera also learns about the people you mix with, and will take more clips of people it sees often, as it assumes they are closer to you. Thankfully, it will also get to know your cat, to save you the bother of photographing it yourself.

For more information, visit the Google webstore.

Articles: Digital Photography Review (dpreview.com)

 

Google’s new AI ranks photos on their technical and aesthetic quality

27 Dec
Image: Google

We have seen several attempts at automated image assessment from both technical and aesthetic points of view in the past. For example, Google researchers have previously used convolutional neural networks (CNNs) to assess image quality of specific image categories, such as landscapes.

However, these previous approaches could typically only differentiate between low and high image quality in a binary way. Now, a Google research team has developed a methodology that can provide a more granular assessment of the quality of a photograph that is applicable to all types of images.

The NIMA (Neural Image Assessment) model uses a deep CNN trained to predict which images a typical user would rate as looking technically good or aesthetically pleasing, and it uses that information to rate an image on a scale of 1 to 10.

To achieve this, it relies on state-of-the-art deep object recognition networks, using them to develop an understanding of general categories of objects. As a result, NIMA can score images reliably and with high correlation to human perception, which makes it a potentially very useful tool for labor-intensive and subjective tasks such as automated image editing or image optimization for user engagement.
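For readers curious what this looks like in practice, below is a minimal sketch of a NIMA-style scorer in PyTorch. The 10-bucket rating distribution and mean-score ranking follow the description above; the choice of backbone and all training details are assumptions for illustration only, so the untrained model here produces meaningless scores.

```python
# A minimal sketch of a NIMA-style image scorer (illustrative, not Google's code).
import torch
import torch.nn as nn
from torchvision import models

class NIMAStyleScorer(nn.Module):
    def __init__(self):
        super().__init__()
        # Reuse an off-the-shelf object-recognition backbone (an assumption;
        # the published work builds on standard image classifiers).
        self.backbone = models.mobilenet_v2(weights=None)
        in_features = self.backbone.classifier[1].in_features
        # Replace the 1000-class head with a 10-bucket rating distribution.
        self.backbone.classifier[1] = nn.Linear(in_features, 10)

    def forward(self, images):
        # Probability of each rating 1..10 for every image in the batch.
        dist = torch.softmax(self.backbone(images), dim=1)
        ratings = torch.arange(1, 11, dtype=dist.dtype)
        # The mean of the predicted distribution is the image's score.
        return (dist * ratings).sum(dim=1)

scorer = NIMAStyleScorer()
scores = scorer(torch.rand(4, 3, 224, 224))  # four dummy RGB images
print(scores)  # one 1-10 score per image (untrained, so values are meaningless)
```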

The NIMA team says that, in testing, the model’s aesthetic ranking of images closely matches the mean scores that were assigned by human judges. What’s more, the technology is still in its infancy; further retraining and testing should improve the model even further. Once systems get better, future applications could include image capture with real-time feedback to the photographer, auto-culling, or providing guidance to image editors to achieve optimized post-processing results.

More details on this fascinating new system are available on the Google Research Blog.

Articles: Digital Photography Review (dpreview.com)

 

Gear of the Year 2017 – Allison’s choice: Google’s HDR+ mode

16 Nov

I was told. And I believed. But I didn’t quite understand how good Google’s Auto HDR+ mode is. After shooting with the Pixel 2 in some very challenging lighting conditions, I’m a believer.

Google’s HDR+ mode is really, really good. And I’m prepared to defend it as my Gear of the Year.

Like I said, I was told. Our own Lars Rehm was impressed with Auto HDR+ in his Google Pixel XL review of last year. In his words: “the Pixel XL is capable of capturing decent smartphone image quality in its standard mode but the device really comes into its own when HDR+ is activated… The Pixel camera is capable of capturing usable images in light conditions that not too long ago some DSLRs would have struggled with.”

So heading out with the Pixel 2 in hand, I knew that was a strong suit of the camera. I was looking forward to testing it on some challenging scenes. Things didn’t look too promising though as the day started off pretty miserably.

The afternoon forecast looked better, but any Seattleite can tell you there are no guarantees in October. I figured I had a day of dull, flat lighting ahead of me that I’d have to get creative with. I was happily proved wrong.

The clouds started to thin out mid-afternoon. On a long walk from the bus toward Gas Works Park, I came across this row of colorful townhouses. The sun was behind them, and I snapped a photo that looked like a total loss as I composed it on the screen – the houses too dark and lost in the shadows. I didn’t want to blow out the sky to get those details in the houses, so I just took what I figured was a dud of a photo and moved on. So what I saw on my computer screen later was a total surprise to me: a balanced, if somewhat dark exposure, capturing the houses and the sky behind them.

Am I going to print this one, frame it and put it on the wall? No. But I’m impressed that it’s a usable photo, and it took no knowledge of exposure or post-processing to get it.

Gas Works used to be a ‘gasification’ plant owned by the Seattle Gas Light Company and was converted into a park in the mid-’70s. Some of the industrial structures remain, monuments to a distant past surrounded now by green parkland and frequented by young families with dogs and weed-vaping tech bros alike. On a sunny afternoon in October it was, both literally and figuratively, lit.

I was convinced my photos were not turning out, but I kept taking them anyway. It’ll just be a deep shadows, blue sky kind of look, I thought. Little did I know that the Pixel 2 was outsmarting me every step of the way.

Back at my desk with the final photos in front of me, I was genuinely impressed by the Pixel 2. Did it do anything that I couldn’t with a Raw file and about 30 seconds of post-processing? Heck no. But the point is that this is the new normal for a lot of people who take pictures and have no interest in pulling shadows in Photoshop. They will point their cameras at high contrast scenes like these and come away with the photos they saw in their heads. If you ask me, it’s just one more reason why smartphones will topple the mighty entry-level DSLR.

Apple’s catching on too. HDR Auto is enabled by default in new iPhones and veteran photographer/iPhone user Jeff Carlson is also impressed by how the 8 Plus handles high contrast scenes.

While smartphone manufacturers have been increasingly implementing HDR as an always-on-by-default feature, they’ve also been making these modes smarter and the effect more aggressive. What previously took technical know-how, dedicated software, and multiple exposures is now happening with one click of a virtual shutter button, and it’s going to keep getting better.
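To be clear, Google hasn’t published its HDR+ pipeline as a few lines of code, and HDR+ actually merges a burst of identically exposed frames rather than a classic bracket. But as a rough illustration of the ‘multiple exposures, one button press’ idea, here’s a conventional exposure-fusion sketch using OpenCV’s Mertens merge; the file names are placeholders.

```python
# Not Google's HDR+ pipeline; a conventional exposure-fusion sketch for illustration.
import cv2

# Three captures of the same high-contrast scene (placeholder file names).
frames = [cv2.imread(p) for p in ("under.jpg", "normal.jpg", "over.jpg")]
fused = cv2.createMergeMertens().process(frames)      # float32 image in [0, 1]
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```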

Articles: Digital Photography Review (dpreview.com)

 

Google’s unlimited full-res photo storage for Pixel 2 owners ends in 2020

10 Oct

Google is offering Pixel 2 buyers a special perk that allows them to store an unlimited number of full-resolution photos and videos through Google Photos, but it comes with a catch. Fine print listed at the bottom of Google’s Pixel 2 product page notes that the free unlimited full-res storage is only available until 2020; at that point, the handsets will revert to Google Photos’ typical ‘high-quality’ unlimited storage option.

‘High-quality’ is the term Google uses to denote a 1080p video resolution and 16MP image resolution.

Google Photos allows any user to upload an unlimited number of photos and videos at up to this high-quality threshold; anything that exceeds it is compressed when uploaded and that compressed version is stored. The Pixel 2 will sidestep this restriction, but only for a couple years.

Non-Pixel phone users can upload full-resolution videos and images for free up to a 15GB threshold. Once that threshold is reached—or, for Pixel 2 owners, once 2020 arrives—additional storage space can be purchased starting at $2/month (depending on location).

Articles: Digital Photography Review (dpreview.com)

 

Here’s Google’s impressive OIS + EIS video stabilization demonstrated

05 Oct

Optical image stabilization is a welcome update in the Google Pixel 2, but what’s really impressive is that it can be used in tandem with electronic stabilization in video mode. If Google’s demo at its launch event today is any indication, it’s pretty darn effective and makes for super smooth clips that look like they were shot with a Steadicam. While we’ve seen this in the traditional camera space in 1"-type compacts from Sony and Canon, as well as ILCs like the Canon M5 and Olympus E-M1 II, it’s a first for smartphones.

We got a chance to see this same video in person; it was certainly impressive. We’re eager to give it a try ourselves when we get our hands on a review unit.

Articles: Digital Photography Review (dpreview.com)

 

Shutterstock’s new watermarking system foils Google’s AI

27 Aug

Following a study Google released last week showing how easily an AI could remove watermarks from stock images, stock photography website Shutterstock has introduced stronger watermarks that are more difficult to eliminate using automated software. Google notified Shutterstock about the vulnerability before publishing its research paper, and it seems the company took the warning to heart.

Google’s technology works via a computer algorithm that learns to identify the common elements in many stock images—that is, the watermark—and then remove just those elements without altering the underlying photo at all.

Google ultimately identified the consistency of watermarks as their main point of vulnerability. To counteract it, watermarks need inconsistencies in the form of ‘random geometric perturbations.’ In other words: the marks should be warped differently across the photo library rather than applied identically to every image. Doing this confuses the artificial intelligence and results in watermark artifacts being left on the image after attempted removal.

Shutterstock has already implemented an improved watermark system based on Google’s advice. With the new system, stock photos are protected by watermarks that are applied with slight geometry changes, making it far more difficult for an AI like Google’s to figure out precisely which pixels belong to the watermark and which belong to the image itself.

“The result was a watermark randomizer that our engineering team developed so that no two watermarks are the same,” says Shutterstock CTO Martin Brodbeck. “By creating a completely different watermark for each image, it makes it hard to truly identify the shape.”
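Shutterstock hasn’t published its randomizer, but the general idea of per-image geometric perturbation is easy to sketch. Below is a hypothetical example using Pillow, where a small random rotation, scale and position jitter stand in for the ‘random geometric perturbations’ described above.

```python
# Hypothetical per-image watermark randomization sketch (not Shutterstock's system).
import random
from PIL import Image

def apply_randomized_watermark(photo_path, watermark_path, out_path):
    # Every call warps the watermark a little differently.
    photo = Image.open(photo_path).convert("RGBA")
    mark = Image.open(watermark_path).convert("RGBA")

    # Slightly different geometry for every image, so no two watermarks match.
    scale = random.uniform(0.9, 1.1)
    angle = random.uniform(-5, 5)
    mark = mark.resize((int(mark.width * scale), int(mark.height * scale)))
    mark = mark.rotate(angle, expand=True)

    # Random placement jitter around the center of the photo.
    x = (photo.width - mark.width) // 2 + random.randint(-20, 20)
    y = (photo.height - mark.height) // 2 + random.randint(-20, 20)
    photo.paste(mark, (x, y), mark)   # the mark's alpha channel does the blending

    photo.convert("RGB").save(out_path)

# apply_randomized_watermark("photo.jpg", "watermark.png", "protected.jpg")
```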

Articles: Digital Photography Review (dpreview.com)

 

Google’s Camera app has been unofficially ported to other Android phones

15 Aug

A developer going by the handle B-S-G has created an unofficial port of Google’s Camera app, allowing a larger number of Android users to utilize the software with much-loved features like HDR+. Though the app is only officially available on the Pixel smartphones, this port makes it available to any Android smartphone running a Qualcomm Snapdragon 820, 821 or 835 processor.

Phones that can now run the Google Camera app include the Galaxy S8, LG G6, and OnePlus 5.

Google’s Camera app (in conjunction with the Pixel camera hardware) has been praised for both the quality of the photos it takes and its wide range of features, including HDR+. However, the app’s limitation to the Pixel smartphones meant most Android users couldn’t use it. B-S-G has changed that, and though the ported app can’t be downloaded from the Play Store (given that it is an unofficial port), the APK is available online.

The folks at XDA Developers tested and analyzed the app and concluded that it doesn’t contain any malicious code and is safe to install. Still, there is always an inherent risk when sideloading a non-official APK onto a device, so proceed with caution.

Articles: Digital Photography Review (dpreview.com)

 

Google’s Motion Stills app is now available for Android

21 Jul

Last year Google launched Motion Stills for iOS, an app that stabilizes the iPhone’s Live Photos and makes them shareable as looping GIFs and videos. Now the software giant has made the app available for Android devices running version 5.1 and later of its own mobile OS.

The app works a little differently on Android than it does on iOS. Instead of using an existing Live Photo, the Android version has you record video inside the app. Stabilization is then applied by a video processing pipeline that, unlike the iOS version’s, processes each frame as it is being recorded. As a consequence, the results are instant and there is no waiting before sharing the created GIFs.
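Google hasn’t detailed the pipeline beyond that description, but the general recipe for stabilizing video frame by frame is well established: track features between consecutive frames, estimate the camera motion, smooth the accumulated trajectory and warp each frame toward it. The OpenCV sketch below illustrates that idea, using only the translation component and placeholder values for the input file and smoothing factor; it is not Google’s implementation.

```python
# Conventional per-frame video stabilization sketch (illustrative, not Motion Stills).
import cv2
import numpy as np

cap = cv2.VideoCapture("clip.mp4")        # placeholder input
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
dx = dy = sx = sy = 0.0                   # raw and smoothed camera translation
alpha = 0.9                               # higher = smoother (placeholder value)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Track corner features from the previous frame into the current one.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=20)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good_old = pts[status.flatten() == 1]
    good_new = nxt[status.flatten() == 1]
    # Estimate frame-to-frame motion; only the translation component is used here.
    m, _ = cv2.estimateAffinePartial2D(good_old, good_new)
    if m is not None:
        dx += m[0, 2]
        dy += m[1, 2]
    # Smooth the accumulated trajectory and warp the frame toward it.
    sx = alpha * sx + (1 - alpha) * dx
    sy = alpha * sy + (1 - alpha) * dy
    warp = np.float32([[1, 0, sx - dx], [0, 1, sy - dy]])
    stabilized = cv2.warpAffine(frame, warp, (frame.shape[1], frame.shape[0]))
    cv2.imshow("stabilized", stabilized)
    if cv2.waitKey(1) == 27:              # Esc to quit
        break
    prev_gray = gray
```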

Fast Forward is a new feature and builds on the stabilization algorithm to capture longer clips and create stabilized time-lapses or hyperlapses. Playback speed is adjustable from 1x to 8x and GIF output can be created in three sizes.

Motion Stills for Android is now available on Google Play.

Articles: Digital Photography Review (dpreview.com)

 

Dream Deep: Trippy Maps Reenvisioned by Google’s Artificial Neural Network

29 Jun

[ By WebUrbanist in Art & Drawing & Digital. ]

FaceApp and similar reality-warping applications are especially fun to use in ways their designers never intended. Along similar lines, Google’s DeepDream (designed for photo manipulation) creates fascinating results using photographs but is even more stunning when applied to representations of cityscapes.

While training DeepDream (a neural network that adapts like a brain to new inputs) to identify, differentiate and understand images, Google researchers discovered it could “over-interpret” results as well. In short: it could start to “read into” images from previous experience, resulting in an array of beautiful (if disturbing) hybrids.
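The ‘reading into’ behavior comes from running the network in reverse, so to speak: instead of updating the network to fit the image, the image itself is nudged by gradient ascent so that it excites a chosen layer more strongly. The PyTorch sketch below shows that core loop; the backbone, layer cut-off, step size, iteration count and input file are placeholder choices for illustration, not DeepDream’s exact settings.

```python
# Minimal DeepDream-style gradient-ascent loop (illustrative settings throughout).
import torch
from torchvision import models, transforms
from PIL import Image

# Pretrained classifier as the 'dreaming' network; the layer cut-off is arbitrary.
model = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:20].eval()
for p in model.parameters():
    p.requires_grad_(False)               # only the image is updated

img = transforms.ToTensor()(Image.open("map.png").convert("RGB")).unsqueeze(0)
img.requires_grad_(True)

for _ in range(20):                       # more iterations = stronger hallucinations
    activations = model(img)
    activations.norm().backward()         # push this layer's activations higher
    with torch.no_grad():
        img += 0.01 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()
        img.clamp_(0, 1)

transforms.ToPILImage()(img.detach().squeeze(0)).save("dreamed.png")
```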

Once it went public, mapmakers were among those intrigued by the possibilities of geo-visualization, turning flat maps into seemingly living landscapes. Tim Waters, a geospatial developer, began taking OpenStreetMap data and running it through the system, generating these strangely psychedelic urban environments.

He discovered that a short run could create fractal and quilting effects, while longer and reiterated processing started to introduce faces and creatures to the mix.

Above: monkeys and frogs seem to emerge from the grid, while a coastal region forms the head of a bear, making the landscape look like a giant bearskin rug. Overall, the effects are quite beautiful, creating a sense of depth and adding character to what would otherwise be fairly generic representations.


Google’s new PhotoScan app makes digitizing prints super easy

16 Nov

There are plenty of existing methods for digitizing printed photos, and most of them fall on a spectrum between ‘arduous with good results’ and ‘quick with terrible results.’ Google’s new PhotoScan app aims to bridge the gap with a method that’s easy and produces good results by employing computational photography.

The free app, available now for Android and iOS, requires the user to place their photo on a flat surface. After snapping a reference frame, the app directs the user to move their phone around the image to capture more data and, critically, to shift the glare that the print is almost certainly reflecting.

After you’ve made a successful pass, the app will work its magic and spit out a digitized, glare-free rendition of your photo. Images can be saved to your phone’s camera roll and to the cloud. In less than a minute, you’ve got a shareable digital photo that’s way better than the quick-and-dirty version.

Decent scans of instant photos with minimal effort? Sign me up. I scanned these Instax prints with Google’s PhotoScan app and they are gloriously glare-free.

The app analyzes your photo and identifies reference points so it can merge multiple versions of the same image, and compares pixel-level details to judge which image is free of glare. It’s based on technology Google and MIT have been developing to help remove unwanted reflections and obstructions from photos.
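Google hasn’t released PhotoScan’s algorithm, but a rough approximation of the idea, aligning several shots of the same print and keeping the pixels that aren’t washed out by glare, can be sketched with OpenCV. The file names, feature settings and the simple per-pixel minimum below are illustrative assumptions, not Google’s method.

```python
# Rough glare-removal sketch: align a few shots of the same print, keep darkest pixels.
import cv2
import numpy as np

paths = ["shot1.jpg", "shot2.jpg", "shot3.jpg"]   # placeholder file names
orb = cv2.ORB_create(2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def features(img):
    # ORB keypoints and descriptors on the grayscale version of the shot.
    return orb.detectAndCompute(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), None)

ref = cv2.imread(paths[0])
ref_kp, ref_des = features(ref)
aligned = [ref]

for p in paths[1:]:
    img = cv2.imread(p)
    kp, des = features(img)
    matches = sorted(matcher.match(des, ref_des), key=lambda m: m.distance)[:200]
    src = np.float32([kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([ref_kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC)
    # Fill areas outside the warped shot with white so the minimum ignores them.
    aligned.append(cv2.warpPerspective(img, H, (ref.shape[1], ref.shape[0]),
                                       borderValue=(255, 255, 255)))

# The darkest value at each pixel survives, which tends to discard bright glare spots.
glare_free = np.min(np.stack(aligned), axis=0)
cv2.imwrite("scan.jpg", glare_free)
```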

The app automatically crops, straightens and rotates your photo, but you can rotate and adjust the corners after capture if needed. My first few tries show surprisingly good results, with glare nearly totally removed in each image. The app uses your phone’s flash to provide illumination, but even so, using better available light produced the nicest results. The results look good enough for social sharing, but if it’s high resolution, high quality digital conversions you’re after, you’ll probably still need to go about it the hard way.

For more information, you can watch Google’s Nat and Lo interview the researchers about how it all works.

Articles: Digital Photography Review (dpreview.com)

 