
Posts Tagged ‘iphone’

Olloclip releases new lens and clip system with iPhone 11 and Samsung Galaxy S10 support

20 Nov

Mobile photography company Olloclip has released a new lens and clip system that includes support for the iPhone 11, 11 Pro and 11 Pro Max, as well as the Samsung Galaxy S10 and S10e smartphone models.

Olloclip is offering its new system with the Olloclip StartPack, MacroProPack and ElitePack, a trio of kits offering various lens bundles with the customer’s choice of clip. As well, the company has launched a new Pocket Telephoto 2X Essential lens, relaunched the Macro 10X Essential lens, and updated its Starter Kit to include the Pivot Grip stabilizer, an ultra-light tripod and the BSR Bluetooth Shutter Release.

With this new lens system, according to Olloclip, any of the lenses can be used with any clip, including a new one that’ll be released in December for the iPhone 7 through the iPhone 8 Plus models.

The Olloclip ElitePack retails for $129.99; it includes the applicable clip for the Samsung Galaxy S10 or iPhone 11 model, as well as the Pocket Telephoto 2X Essential lens and the two-in-one Fisheye / 15X Macro Essential lens. The Olloclip StartPack, meanwhile, retails for $79.99 and the MacroProPack starts at $199.99, with the StartPack offering starter lenses and the MacroProPack various macro lenses.

Articles: Digital Photography Review (dpreview.com)

 

 

Facebook fixes iOS bug that triggered the camera app on iPhone

15 Nov

Facebook has released an update that fixes a bug in its iOS app related to the iPhone camera app, something that had raised privacy concerns among users.

On Tuesday, a report from CNET highlighted a complaint from some Facebook users on iPhone who shared videos showing a bizarre bug involving the phone’s camera app. At least two different scenarios were found in which the Facebook app’s interface would shift off-center on the phone’s display.

The iPhone’s camera app with its live view would be visible next to the offset Facebook app, spurring conspiracy theories that the social network may have been deliberately using the device’s camera to collect data on the user. In a tweet, however, Facebook VP of Integrity Guy Rosen explained that a recently published bug fix for a different issue caused the Facebook app to ‘partially’ navigate to the iPhone’s camera.

Soon after, on Wednesday, November 13, Facebook released a fix for the problem through the App Store. Users will need to download and install the latest update to resolve the bug. Some users have reported being unable to trigger the camera bug after updating, suggesting the fix is effective.


 

 

iPhone users get free unlimited Google Photos storage for original photo files, for now, while Pixel 4 owners have to pay up

22 Oct

All Android phones running Google apps come with free Google Photos cloud storage for images and videos captured with the device. However, image and video files are not stored at original quality. Instead, they are compressed to what Google calls ‘high quality’. Users who prefer to store their original out-of-camera files in the cloud have to shell out for one of Google’s storage plans.

For its own Pixel devices, Google has made an exception in the past. Users of the Pixel 3 and earlier Pixel devices could store unlimited original files for life, but this perk has ended with the brand-new Pixel 4. Users of the latest Google flagship are simply treated like users of any other Android phone.

Now it seems the only ones benefiting from Google’s free unlimited storage for original photos are actually the users of one of Apple’s recent iPhones. Reddit user u/stephenvsawyer discovered that images in the HEIC/HEIF format, which recent iPhones use by default, will be stored without any compression.
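As an aside, HEIC/HEIF files are easy to identify programmatically: they are ISO base media files whose leading `ftyp` box declares a HEIF brand. The sketch below is a minimal sniffer; the example byte string is illustrative of a typical iPhone header, not captured from any particular device.

```python
def looks_like_heif(data: bytes) -> bool:
    """Sniff the ISO-BMFF 'ftyp' box that HEIC/HEIF files start with.

    Minimal check: bytes 4-8 spell b'ftyp' and the major brand
    (bytes 8-12) is one of the common HEIF brands.
    """
    heif_brands = {b"heic", b"heix", b"hevc", b"mif1", b"msf1"}
    return len(data) >= 12 and data[4:8] == b"ftyp" and data[8:12] in heif_brands

# Illustrative 24-byte header resembling an iPhone HEIC file:
# box size (0x18 = 24), 'ftyp', major brand 'heic', minor version,
# compatible brands 'mif1' and 'heic'
heic_header = b"\x00\x00\x00\x18ftypheic\x00\x00\x00\x00mif1heic"

print(looks_like_heif(heic_header))          # True
print(looks_like_heif(b"\xff\xd8\xff\xe0"))  # False (JPEG SOI marker)
```

A real implementation would also handle extended box sizes and additional compatible brands, but this is the shape of the check.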

A screenshot from Google’s Pixel 2 promotional page, captured at the time of release, showing the now-discontinued benefit of free unlimited storage of original files.

The reason for this is pretty simple: if Google tried to compress the images, the file size would actually increase. So the decision to save original HEIC/HEIF files to its cloud platform saves Google both storage space on its servers and computing power. It’s worth noting that this only applies to photos. iPhone videos are saved at 1080p resolution, even if they were recorded at 4K settings.

The latest version of the Android OS, Android 10, technically supports the HEIC/HEIF format, but Pixel 4 devices don’t currently offer this option. So, at least for now, iPhone users are actually getting more out of Google Photos than users of Google’s own flagship phone.

In a statement made to Android Police, Google said ‘We are aware of this bug and are working to fix it.’ What exactly this ‘fix’ entails remains to be seen, but there’s a good chance this iPhone loophole could get closed down in the near future.


 

 

How does iPhone 11 Night Mode compare to Google Pixel 3 Night Sight?

15 Oct

Many smartphones today take great images in broad daylight. That’s no surprise – when there’s a lot of light, it doesn’t matter so much that the small smartphone sensor doesn’t collect as many photons as a larger sensor: there’s an abundance of photons to begin with. But smartphone image quality can take a nosedive as light levels drop and there just aren’t many photons to collect (especially for a small sensor). That’s where computational techniques and burst photography come in.


Low light performance is a huge differentiator that separates the best smartphones from the worst. And Google’s Night Sight has been the low-light king in recent years1, thanks to its averaging of many (up to 15) frames, its clever tile-based alignment to deal with hand movement and motion in the scene, and its use of a super-resolution pipeline that yields far better resolution, particularly color resolution, and lower noise than simple frame stacking techniques.
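The noise benefit of plain frame stacking is easy to demonstrate. The sketch below is a toy model, not Google's pipeline (it ignores alignment, frame rejection and super-resolution entirely): averaging N aligned noisy frames reduces random noise by roughly the square root of N.

```python
import math
import random

random.seed(0)

TRUE_SIGNAL = 100.0   # "ground truth" brightness of a flat grey patch
NOISE_SIGMA = 10.0    # per-frame noise, arbitrary units
N_FRAMES = 15         # Night Sight reportedly stacks up to 15 frames

def capture_frame(n_pixels=1000):
    # Simulate one noisy exposure of the patch
    return [TRUE_SIGNAL + random.gauss(0, NOISE_SIGMA) for _ in range(n_pixels)]

frames = [capture_frame() for _ in range(N_FRAMES)]

# Naive frame stacking: average the (already aligned) frames pixel by pixel
stacked = [sum(px) / N_FRAMES for px in zip(*frames)]

def rms_error(pixels):
    return math.sqrt(sum((p - TRUE_SIGNAL) ** 2 for p in pixels) / len(pixels))

single_noise = rms_error(frames[0])
stacked_noise = rms_error(stacked)
print(single_noise / stacked_noise)  # ≈ sqrt(15) ≈ 3.9
```

The super-resolution and tile-alignment steps the article describes are what separate a real pipeline from this naive average, but the sqrt(N) noise win is the foundation.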

With the iPhone 11, Apple launched its own Night Mode to compete with offerings from Android phones. It uses ‘adaptive bracketing’ to combine both long and short exposures (to freeze any movement) to build a high quality image in low light conditions. Let’s see how it stacks up compared to Google’s Night Sight and Apple’s own previous generation iPhone XS.

The set-up

‘Low light performance’ is difficult to sum up in one number or picture when it comes to computational imaging. Different devices take different approaches, which ultimately means that comparative performance across devices can vary significantly with light level. Hence we’ve chosen to look at how the iPhone 11 performs as light levels decrease from evening light before sunset to very low light conditions well after sunset. The images span an hour-long time frame, from approximately 500 lux to 5 lux. All shots are handheld, since this is how we expect users to operate their smartphones. The iPhone 11 images spanning this time period are shown below.

7:00 pm, evening light
1/60 | ISO 100
485 lux | 7.6 EV

7:25 pm, late evening light
1/8 | ISO 250
25 lux | 3.4 EV

7:50 pm, low light
1/4 | ISO 640
5 lux | 1 EV

8:05 pm, very low light
1/8 | ISO 1250
<5 lux | <1 EV
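For reference, the lux and EV figures listed above are related (at ISO 100) by the standard incident-light relation EV ≈ log2(lux / 2.5); the 2.5 lux constant corresponds to a common incident-meter calibration and is our assumption about how these values were derived, not something the camera reports.

```python
import math

def lux_to_ev(lux, iso=100):
    """Approximate EV at the given ISO from an incident-light reading,
    assuming the common calibration E(lux) = 2.5 * 2^EV at ISO 100."""
    return math.log2(lux * (iso / 100) / 2.5)

for lux in (485, 25, 5):
    print(f"{lux} lux -> EV {lux_to_ev(lux):.1f}")
# 485 lux and 5 lux come out at EV 7.6 and 1.0, matching the figures
# above; 25 lux comes out at 3.3, close to the listed 3.4
```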

Note that Night mode is only available with the main camera unit, not the 2x or 0.5x cameras. And before we proceed to our comparisons, please see this footnote about the rollovers and crops that follow: on ‘HiDPI’ screens like smartphones and higher-end laptops/displays, the following crops are 100%, but on ‘standard’ displays you’ll only see 50% crops.2

Now, on to the comparisons. In the headings, we’ve labeled the winner.

Evening light (485 lux) | Winner: Google Pixel 3

Before sunset, there’s still a good amount of available light. At this light level (485 lux, as measured by the iPhone 11 camera), the option for Night mode on iPhone 11 is not available. Yet Night Sight on the Google Pixel 3 is available, as it is in all situations. And thanks to its averaging of up to 15 frames and its super-resolution pipeline, it provides far more detail than the iPhone 11.

It’s not even close.

Take a look at the detail in the foreground trees and foliage, particularly right behind the fence at the bottom. Or the buildings and their windows up top, which appear far crisper on the Pixel 3.

Late evening light (25 lux) | Winner: Google Pixel 3

As the sun sets, light levels drop, and at 25 lux we finally have the option to turn on Night Mode on the iPhone, though it’s clearly not suggested by Apple since it’s not turned on by default. You’ll see the Night Mode option as a moon-like icon appearing on the bottom left of the screen in landscape orientation. Below we have a comparison of the iPhone with Night Mode manually turned on next to the Google Pixel 3 Night Sight (also manually enabled).

There’s more detail and far less noise – particularly in the skies – in the Google Pixel 3 shot. It’s hard to tell what shutter speeds and total exposure time either camera used, since these stacking techniques use differing shutter speeds and discard frames or tiles at will based on their quality or usability. But it appears that, at best, the Pixel 3 used 15 frames at 1/5s shutter speed, or 3s total, while the iPhone 11 indicated in its user interface that it would use a total of 1s (the EXIF reports 1/8s, which is likely unrepresentative). In other words, here it appears the Pixel 3 used a longer total exposure time.
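The back-of-envelope arithmetic behind that comparison, using the figures quoted above:

```python
# Estimated total exposure budgets (our reading of the numbers above,
# not values either camera reports directly)
pixel3_total = 15 * (1 / 5)   # up to 15 frames at 1/5s each
iphone11_total = 1.0          # 1s total, as shown in the iPhone UI

print(pixel3_total, iphone11_total)  # 3.0 1.0 (seconds)
```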

Apart from that, though, the fact that the iPhone result looks noisier than the same shot with Night Mode manually turned off (not shown) leads us to believe that the noisy results are at least in part due to Apple’s decision to use less noise reduction in Night Mode. This mode appears to assume that the longer overall exposures will lead to lower noise and, therefore, less of a need for noise reduction.

However, in the end, it appears that under these light levels Apple is not using a long enough total exposure (the cumulative result of short and long frames) to yield low enough noise results that the lower noise reduction levels are appropriate. So, in these conditions when it appears light levels are not low enough for Apple to turn on Night Mode by default, the Google Pixel 3 outperforms, again.

Low light (5 lux) | Winner: Tie

As light levels drop further to around 5 lux, the iPhone 11 Night mode appears to catch up to Google’s Night Sight. Take a look above, and it’s hard to choose a winner. The EXIF data indicates the Pixel used 1/8s shutter speeds per frame, while the iPhone used at least 1/4s shutter speed for one or more frames, so it’s possible that the iPhone’s use of longer exposure times per frame allows it to catch up to Google’s result, despite presumably using fewer total frames. Keynotes from Apple and personal conversations with Google indicate that Apple only uses up to 8-9 frames of both short and long exposures, while the Pixel uses up to 15 frames of consistent exposure, for each phone’s respective burst photography frame-stacking methods.

Very low light (< 5 lux) | Winner: iPhone 11

As light levels drop even further, the iPhone 11 catches up to and surpasses Google’s Night Sight results. Note the lower noise in the dark blue sky above the cityscape. And while overall detail levels appear similar, buildings and windows look crisper thanks to lower noise and a higher signal-to-noise ratio. We presume this is due to the use of longer exposure times per frame.

It’s worth noting the iPhone, in this case, delivers a slightly darker result, which arguably ends up being more pleasing, to me anyway. Google’s Night Sight also does a good job of ensuring that nighttime shots don’t end up looking like daytime, but Apple appears to take a slightly more conservative approach.

We shot an even darker scene to see if the iPhone’s advantage persisted. Indeed, the iPhone 11’s advantage became even greater as light levels dropped further. Have a look below.

(Night Mode Off)

(Night Sight Off)

As you can see, the iPhone 11 delivers a more pleasing result, with more detail and considerably less noise, particularly in peripheral areas of the image where lens vignetting considerably lowers image quality, as evidenced by the drastically increased noise in the Pixel 3 results.

Ultimately it appears that the lower the light levels, the better the iPhone 11 performs comparatively.

A consideration: (slightly) moving subjects

Neither camera’s night mode is meant for photographing moving subjects, but that doesn’t mean they can’t deal with motion. Because these devices use tile-based alignment to merge frames to the base frame, static and moving subjects in a scene can be treated differently. For example, on the iPhone, shorter and longer exposures can be used for moving and static subjects, respectively. Frames with too much motion blur for the moving subjects may be discarded, or perhaps only have their static portions used if the algorithms are clever enough.
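A stripped-down illustration of that tile-based accept/reject logic follows. It uses 1-D "tiles" and a fixed threshold; this is a sketch of the general idea, not either vendor's algorithm, and all numbers are invented.

```python
TILE = 8  # pixels per tile (1-D for simplicity)

def merge_tiles(base, frames, reject_thresh=15.0):
    """Average each tile across frames, but drop a frame's tile when it
    differs too much from the base frame there (e.g. the subject moved),
    falling back to the base frame's data for that region."""
    out = []
    for start in range(0, len(base), TILE):
        base_tile = base[start:start + TILE]
        accepted = [base_tile]
        for frame in frames:
            tile = frame[start:start + TILE]
            mean_abs_diff = sum(abs(a - b) for a, b in zip(tile, base_tile)) / TILE
            if mean_abs_diff < reject_thresh:
                accepted.append(tile)  # static here: safe to average in
        out.extend(sum(px) / len(accepted) for px in zip(*accepted))
    return out

base = [50.0] * 16                 # base frame: flat grey scene
still = [50.0] * 16                # a frame where nothing moved
moved = [50.0] * 8 + [200.0] * 8   # a frame where the subject moved into tile 2

merged = merge_tiles(base, [still, moved])
print(merged[0], merged[8])  # 50.0 50.0 -- the moved tile was rejected
```

Real pipelines align tiles before comparing them and weight rather than hard-reject, but this captures why motion in one region doesn't ghost the whole frame.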

Below we take a look at a slightly moving subject in two lighting conditions: the first dark enough for Night mode to be available as an option on the iPhone (though it isn’t automatically triggered until darker conditions), and the second in very dim indoor lighting where Night mode automatically triggers.

Although I asked my subject to stay still, she moved around a bit as children are wont to do. The iPhone handles this modest motion well. You’ll recall that Apple’s Night mode uses adaptive bracketing, meaning it can combine both short and long exposures for the final result. It appears that the exposure times used for the face weren’t long enough to avoid a considerable degree of noise, which is exacerbated by more conservative application of noise reduction to Night mode shots. Here, we prefer the results without Night mode enabled, despite the slight watercolor painting-like result when viewed at 100%.

We tested the iPhone 11 vs. the Google Pixel 3 with very slightly moving subjects under even darker conditions below.

Here you can see that Apple’s Night mode yields lower noise than with the mode (manually) turned off. With the mode turned off, it appears Deep Fusion is active3, which yields slightly more detail at the cost of more noise (the lack of a smeary, watercolor painting-like texture is a giveaway that Deep Fusion kicked in). Neither iPhone result is as noise-free and crisply detailed as the Pixel 3 Night Sight shot, though. We can speculate that the Pixel’s better result is due to either the use of more total frames, or perhaps more effective use of frames where the subject has slightly moved, or some combination thereof. Google’s tile-based alignment can deal with inter-frame subject movement of up to 8% of the frame, instead of simply discarding tiles and frames where the subject has moved. It is unclear how robust Apple’s align-and-merge algorithm is by comparison.

Vs. iPhone XS

We tested the iPhone 11 Night Mode vs. the iPhone XS, which has no Night Mode to begin with. As you can see below, the XS image is far darker, with more noise and less detail than the iPhone 11. This is no surprise, but it’s informative to see the difference between the two cameras.

Conclusion

iPhone 11’s Night Mode is formidable and a very welcome tool in Apple’s arsenal. It not only provides pleasing images for its users, but sometimes even surpasses what is easily achievable with dedicated cameras. In the very lowest light conditions, Apple has even managed to surpass the results of Google’s Night Sight, highly regarded – and rightfully so – as the industry standard for low light smartphone photography.

But there are some caveats. First, in less low light conditions – situations you’re actually more likely to be shooting in – Google’s use of more frames and its super-resolution pipeline mean that its Pixel 3 renders considerably better results, both in terms of noise and resolution. In fact, the Pixel 3 can out-resolve even the full-frame Sony a7S II, with more color resolution and less color aliasing.

Second, as soon as you throw people as subjects into the mix, things get a bit muddled. Both cameras perform pretty well, but we found Google’s Night Sight to more consistently yield sharper images with modest subject motion in the scene. Its use of up to 15 frames ensures lower noise, and its align-and-stack method can actually make use of many of those frames even if your subject has slightly moved, since the algorithm can tolerate inter-frame subject movement of up to ~8% of the frame.


That shouldn’t undermine Apple’s effort here, which, overall, is actually currently class-leading under very, very low light conditions where the iPhone can use and fuse multiple frames of very long exposure. We’re told the iPhone 11 can use total exposure times of 10s handheld, and 28s on a tripod. Google’s Night Sight, on the other hand, tends to use an upper limit of 1/3s per frame handheld, or up to 1s on a tripod. Rumors, however, suggest the Pixel 4 may be capable of even longer total exposures, so it remains to be seen who will be the ultimate low light king.

Currently though, if you’re photographing perfectly still scenes in very low light, Apple’s iPhone 11 is your best bet. For most users, factoring in moving subjects and less low light (yet still dark) conditions, Google’s Night Sight remains the technology to beat.


Footnotes:

1 Huawei phones have their own formidable night modes; while we haven’t gotten our hands on the latest P30 Pro, The Verge has its own results that show a very compelling offering from the Chinese company.

2 A note about our presentation: these are rollovers, so on desktop you can hover your mouse over the states below the image to switch the crop. On mobile, simply tap the states at the bottom of each rollover to switch the crop. Tap (or click) on the crop itself to launch a separate window with the full-resolution image. Finally, on ‘Retina’ laptops and nearly all modern higher-end smartphones, these are 100% crops (each pixel maps to 1 display pixel); however, on ‘standard’ (not HiDPI) displays these are 50% crops. In other words, on standard displays the differences you see are actually under-represented. [return to text]

3 We had updated the iPhone 11 to the latest iOS 13.2 public beta by the time this set of shots was taken; hence the (sudden) availability of Deep Fusion.


 

 

iPhone 11’s Portrait Mode sets a high bar for the Pixel 4

12 Oct
Taken with iPhone 11 | ISO 500 | 1/30 sec | F1.8

The bokeh-imitation effect that’s all over your Instagram feed is a few generations old, but it’s still a relatively young technology. Portrait Mode, as Apple calls it, is a computational feature that mimics the shallow depth of field closely associated with professional portrait photography. The latest iteration in the iPhone 11 is a great leap forward and, when compared with Google’s Pixel 3, shows that the search-engine giant is going to have to do something pretty special with the forthcoming Pixel 4.

Just to make sure you’re caught up – phone sensors and the lenses they are coupled with are quite small, and inherently limited in their ability to create a blurry background behind a subject. Hence, portrait mode was born (Portrait Mode is Apple’s proprietary name, but for the sake of simplicity I’ll use it throughout this article to refer to all such modes).


Like so many first-generation technologies, portrait mode was a bit dodgy at first – subjects were poorly separated from their backgrounds, and results were decent but not quite convincing. But no matter where you stand on its current state, from “It’s so terrible it’s an insult to photographers” to “Eh, it’s passable,” there’s no denying that it has steadily improved with each generation.

Apple, like most manufacturers, introduced Portrait Mode when it brought dual cameras to its devices. However, Google chose to offer it with a single camera, relying on dual pixel depth data, machine learning, and up-sampling to create that fake bokeh look. The results looked fine until, well, about now.

Compared side-by-side with results from the iPhone 11, the Pixel 3 has been surpassed in many respects. Here are the areas in which the iPhone 11 pulls clearly ahead of the Pixel – and where Google needs to do some catching up in the Pixel 4.

The nitty gritty details

Google achieves its portrait mode by digitally zooming in to mimic a longer focal length and creating a depth map using dual pixels along with a learning-based algorithm to judge distance to a subject and separate it from its background, up-sampling the final result to a full 12MP resolution. Apple (like Samsung, Huawei and others) instead uses its telephoto camera, calculating depth with the help of the perspective offset between the telephoto and wide cameras – no cropping or up-sampling needed.
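That perspective-offset approach reduces to the standard pinhole stereo relation: depth = focal length (in pixels) × baseline / disparity. The numbers below are invented for illustration; Apple's actual camera geometry and calibration are not public.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole-model stereo depth: two cameras a fixed baseline apart see
    the same point shifted by 'disparity' pixels; nearer points shift more."""
    if disparity_px <= 0:
        raise ValueError("zero/negative disparity: no valid match or at infinity")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 1000 px focal length, 12 mm between the two
# cameras, a 6 px offset between the two views -> subject at 2 m
print(depth_from_disparity(1000, 0.012, 6))  # 2.0
```

The hard part in practice is finding reliable per-pixel disparities, which is where the machine-learning matting both companies use comes in.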

The images below demonstrate the difference – note that to match the subject’s size in the frame using the two different focal lengths, the 3a image was taken from about a meter farther back than the iPhone 11.

Of course the vast majority of portrait mode images will be viewed on a phone or computer screen, where the difference in detail is much harder to spot. Still, looking at the images above at even a 50% crop shows a vast difference in the level of detail captured, and all things being equal we’d much rather have more detail than less.

Backlit subjects

We’ve previously noted the Pixel 3’s fantastic ability to render high-contrast scenes, but one place this falls flat is with backlit portrait subjects. The camera’s tendency to preserve highlight detail and push up shadows is normally what we’d prefer, but it doesn’t work well when the shadows are your main subject.

Pixel 3a | iPhone 11

The resulting image shows that the Pixel does a poor job of rendering the cat’s orange fur, giving him an overall ‘crunchy’ look in comparison to the more pleasing rendering by the iPhone. In our testing, the Pixel 3 has consistently shown this tendency to expose for highlights, even when it might do better to choose an exposure better suited to your human or feline subject, at the cost of highlight detail. Even tapping the subject’s face doesn’t adjust the exposure as much as we’d like.

Skin tones

The most sophisticated depth mapping in the world won’t save an image from bad-looking skin tones, and this is one area where Google really needs to catch up. The subject below is lit by window light that’s much cooler than the yellow lights of the kitchen behind him. It’s a tricky situation for sure, but the iPhone has clearly made the right call to warm up the subject’s skin tone rather than preserve the cool cast of the window light.

Pixel 3a | iPhone 11

To be fair, both of these phones are susceptible to producing noticeably different colors based on slight shifts in framing, or using a different camera mode like Night Sight. But over the course of much use, we’ve seen that the Pixel 3’s standard camera mode renders skin tones particularly poorly by comparison.

Apple’s face smoothing and skin tone rendering has a tendency to go too far in some situations, and there are times when we prefer the more faithful color rendering of the Pixel. It’s also pretty easy to correct the Pixel’s skin tone rendering in the phone’s own Photos app, but we’re betting that most people don’t want to (and won’t) take the time to color correct every portrait that they take.

Focal length flexibility for portrait mode

Taken with iPhone 11 | ISO 200 | 1/60 sec | F1.8

Apple’s iPhone XR introduced wide-angle portrait mode to the iPhone line, but the 11 and 11 Pro improve on it with more accurate depth maps thanks to the availability of the ultra-wide lens. Thus, the 11 offers a very good wide Portrait mode via its standard lens, and the 11 Pro offers both telephoto and wide portrait options.

Whether you prefer the look of a telephoto or a wide portrait is of course a personal preference; I tend to prefer the wide portrait mode on the iPhone 11. I like an across-the-table environmental portrait, which usually requires backing up if I’m using the telephoto lens or working within the crop imposed by the Pixel 3.

Whether or not you like the crop, having it forced on you makes the mode less flexible and, in my book, I’d rather have that wide-angle – and I’m sure I’m not alone.

Your move, Google

To be fair, there are things we prefer about the Pixel 3’s portrait mode. We find it’s much less prone to obvious errors in cutting around human subjects than the iPhone 11. I also far prefer using Google Photos to Apple’s iCloud, so the seamless integration with my photo archive is a big plus.


We can also say with some certainty based on leaks and rumors that the Pixel 4 will address some of these shortcomings. We know that the device will offer more cameras, which will likely improve portrait mode. Whether we’ll see improvements to skin tones or better handling of backlit subjects is less certain, though encouragingly, leaked photos do show better rendering of skin tones. All will be revealed soon, but one thing is for sure – healthy competition between two big tech companies keeps pushing phone camera technology forward at a rapid pace, and that’s nothing but good news for the photo-taking public.


 

 

Report: The camera components inside the iPhone 11 Pro Max cost $73.50

11 Oct

Technology analyst firm TechInsights has started its teardown of Apple’s latest flagship smartphone, the iPhone 11 Pro Max, which features a triple-camera setup including an ultra-wide-angle and a 2x telephoto lens.

The analysis of the camera components is still in progress (check back to the TechInsights website for updates) but the team has already been able to compile a cost estimate for all component groups used in the device. At $73.50, the camera and imaging components add the biggest chunk to the overall bill, highlighting how important the camera is these days on smartphones.

Apple iPhone 11 Pro Max (512GB) cost analysis (source: TechInsights)

Only the applications processor ($64), display ($66.50) and non-electronic components, such as the housing ($61), come close. The overall bill is $490.50 which, at a $1449 retail price for the torn-down 512GB model, should leave a sizeable profit margin for Apple, even after adding overhead expenses into the calculation.
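For the arithmetic behind that margin claim (gross margin on component cost only; overhead, R&D, distribution and so on are deliberately excluded, as the article notes):

```python
# Back-of-envelope gross margin from the TechInsights estimates above
bom_total = 490.50   # estimated bill of materials
retail = 1449.00     # 512GB retail price

gross = retail - bom_total
gross_margin_pct = round(gross / retail * 100, 1)
print(gross, gross_margin_pct)  # 958.5 66.1
```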

Incidentally, the cost of the iPhone 11 Pro Max camera components is only $0.50 more than what UBS found for the Samsung Galaxy S10 5G. For the full iPhone 11 Pro Max teardown, head over to the TechInsights website.


 

 

DPReview TV: iPhone 11 Pro – what photographers may have missed

09 Oct

In this week’s episode, Chris is joined by our Science Editor Rishi Sanyal to take a closer look at some of the new photography features on the iPhone 11 and iPhone 11 Pro camera. Find out how some of the headline features like Night Mode perform, and learn about less-publicized photo features like the handy ‘Capture Outside of the Frame.’ With the Pixel 4 on the horizon, it’s just the beginning of more head-to-head comparisons of these powerful little cameras.

Be sure to subscribe to our YouTube channel to get new episodes of DPReview TV every week.

  • Introduction
  • Autofocus
  • Lenses
  • Depth Maps
  • Wide Angle Portraits
  • 2X Lens Portraits
  • Dual Lens Reframing
  • In-Camera Editing
  • HDR OLED Display
  • Night Mode
  • Night Mode Disadvantages
  • iPhone 11 Pro vs XS
  • Night Mode vs. Pixel 3 Night Sight
  • Lens Limitations
  • Conclusions

Sample images from this episode


 

 

iPhone 11 Pro sample gallery (DPReview TV)

09 Oct


Have a closer look at sample images from iPhone 11 Pro’s ultra-wide, standard and telephoto cameras featured in this week’s DPReview TV episode.


 

 

iPhone 11’s coolest photo feature is the hardest one to find

05 Oct
Cinerama leaning back – a natural result of pointing my camera upwards to capture the whole building.

Anyone who has stood at ground level and taken a photo of a building across the street has likely seen the effects of perspective distortion – you tilt your camera back to bring the whole building into frame, causing the straight lines of the building to appear to be ‘leaning back.’ Tilt-shift lenses are designed for exactly this problem, but they’re expensive, specialist optics.

More often, this effect will be corrected in software, but doing so usually requires the user to stretch the top of the image and crop to avoid the blank spaces this creates at the bottom of the frame. Apple is tackling this problem with a unique approach in the iPhone 11: by capturing more data outside of the frame.

I don’t know, I just like boring photos I guess?

For whatever reason, I’m drawn to the types of photos where perspective distortion is painfully obvious – signs, sides of buildings, etc. – but I’m horrible at lining them up correctly. Usually, I find out going through my images later that I wasn’t squared up to my subject even though I thought I was. Horizons are slightly askew, or I was leaning back slightly. Apple, it seems, has heard my cries.

When you’re shooting with the standard camera (with a focal length equivalent to about 26mm), the iPhone 11 will also capture image data from the ultra-wide (13mm equiv.) camera – a feature that is referred to in the settings menu as “Photos Capture Outside the Frame.” If you’re shooting on the telephoto camera of the 11 Pro, it’ll capture additional information from the standard camera.

That extra information is saved alongside your photo. When you edit that image in the native camera app, you’ll be able to use the extra data as you rotate and manipulate your image – a big help when you’re trying to fix crooked lines in a photo.

As you make image adjustments, you’ll see the extra data captured by the ultra-wide lens. This additional image information is available for 30 days.

The phone can use that information to automatically re-crop photos too. In the camera settings menu there’s an option to “Auto Apply Adjustments.” You’ll know that auto adjustments have been applied to an image when it shows a blue “Auto” icon above your captured photo. We’ve noticed this feature being employed when the phone detects a human subject cut off at the edge of the frame.

And even for many photos that aren’t automatically adjusted, the stock camera app will suggest tweaks when brought into edit. For example, take that image of the building that’s leaning back – if you edit it in the iPhone’s camera app and engage the crop tool, it will automatically correct for perspective distortion and use the extra image data it saved to fill in the areas at the edges of the frame that would otherwise need to be cropped out.

Bringing the image into the iPhone’s native editing app, then pressing the ‘crop’ option will take you to this view. The yellow ‘auto’ icon appears at the top of the image if there’s a suggested crop, as there is in this example.

The same adjustments can be applied in Photoshop, but without that extra image information at the sides of the frame you’ll need to crop in to avoid including blank space in your final image.

The iPhone goes beyond these limitations with that extra image data. In addition to correcting perspective, you can creatively re-crop your image to preserve details at the edge of the frame – and even include objects that were well outside of the frame in your initial standard image.
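The 'leaning back' correction itself is a projective warp (a homography). Below is a self-contained sketch that solves the 3x3 transform from four point correspondences: this is the core of any keystone correction, independent of Apple's implementation. The extra captured data is what lets the iPhone fill the frame after such a warp instead of cropping. All coordinates are invented toy values.

```python
def solve_homography(src, dst):
    """Solve the 3x3 projective transform mapping four src points to four
    dst points (8 unknowns, with h22 fixed to 1), via Gauss-Jordan
    elimination; the kind of warp a perspective correction applies."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        # partial pivoting, then eliminate this column from all other rows
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    h = [M[i][n] / M[i][i] for i in range(n)] + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_h(H, pt):
    """Apply homography H to a 2-D point (homogeneous divide included)."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# A 'leaning back' building: the top edge is pinched inwards.
# Mapping the trapezoid back to a rectangle straightens the verticals.
src = [(10, 0), (90, 0), (100, 100), (0, 100)]
dst = [(0, 0), (100, 0), (100, 100), (0, 100)]
H = solve_homography(src, dst)
print(apply_h(H, (10, 0)))  # ≈ (0.0, 0.0)
```

Note what the warp does at the corners: pixels near the top are pushed outwards past the original frame edge, which is exactly the region the ultra-wide capture can supply instead of leaving blank.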

I don’t think many people will discover this feature, and that’s a shame. It’s not just helpful for correcting distortion and fixing crooked horizons – it’s a useful feature if you just want to re-crop an image after-the-fact. However, it will only be discovered by those who enable the ‘capture outside the frame’ feature and attempt to crop an image, which I imagine is a fraction of the many people who will use the camera day in and day out.

Regardless of how widely used this feature will be, what Apple is doing is clever. Photoshop’s Content-Aware Fill feature does something similar – it will fill in missing data when rotating or stretching an image – but instead of using data from a wider lens, it fills in those empty spaces based on educated guesses. Apple’s approach is just one more way in which smartphone manufacturers are using data to their advantage – to the advantage of boring photo fans everywhere.

Articles: Digital Photography Review (dpreview.com)

 

 

Video: iPhone records its dramatic fall from a plane over Iceland, is recovered a year later

05 Oct

Iceland Photo Tours pilot and photographer Haukur Snorrason has shared a video showing the descent of his iPhone 6S Plus as it fell from a small plane flying about 60m (200ft) above Iceland. The incident happened more than a year ago; given the height and frozen tundra beneath, Snorrason had assumed at the time that his iPhone hadn’t survived the fall.

Around 13 months after the phone was dropped, a group of hikers discovered the device in a patch of moss, which had cushioned the blow and enabled the phone to survive the drop. The device powered on when tested, revealing Snorrason’s name and making it possible to reunite him with his lost device.

In addition to being nearly entirely functional (only the microphone was damaged), Snorrason discovered that the iPhone had recorded and saved a video of its rapid descent from the plane. The device landed face down on the moss, protecting the display from the elements while leaving the camera exposed to record the bright blue sky and Sun until its battery died.


 