
Posts Tagged ‘iphone’

Alleged 2019 iPhone renders show triple-camera arrangement

08 Jan

There’s still a long way to go until Apple’s usual iPhone launch month of September, but it looks like iPhone photography fans will have a triple-camera setup to look forward to. A series of renders shared by @Onleaks in cooperation with digit.in shows an alleged 2019 iPhone model with a Huawei Mate 20 Pro-style square camera arrangement featuring three lenses.

Two of the lenses are vertically aligned, with the third lens offset to the right under the flash LED. It also looks like the phone’s rear panel is made from glass.

Onleaks notes the device is still in its EVT phase (Engineering Validation Test), so some design details are subject to change as we near the launch. We also don’t know what the function of the third camera will be, but with Huawei, LG and Samsung now all offering triple-camera smartphones with super-wide-angle and tele options, it’s likely Apple will want to go down the same route.

It’s still early in the year for new iPhone renders, but @Onleaks has an excellent track record and is widely regarded as a credible source, so there is a good chance one of the 2019 iPhones will look very similar to the renders. Given they show a triple-camera setup, we are likely looking at a top-end model here.

Articles: Digital Photography Review (dpreview.com)

 

Hipstamatic harnesses iPhone X TrueDepth data to improve TinType app

06 Dec

Camera app developer Hipstamatic says it has found a way to use the depth data generated by the iPhone X to improve the way its TinType app works out which areas of a picture to render out of focus. The depth information the new camera phone creates has allowed Hipstamatic’s developers to identify a genuine plane of focus instead of having to guess and simulate the effect just with software blurring.

A portrait taken with the new app, alongside a map demonstrating the area of the image the camera takes to be the subject and where the plane of sharp focus should be

Hipstamatic founder Ryan Dorshorst says that the TrueDepth feature of the new iPhone X provides information at every pixel about how far away the subject is, so with a subject identified it is a much easier job to determine what is background, as well as what is in front of and behind the subject – and to blur only those areas. This allows the developers to improve not so much the impression of a tintype’s characteristics as the extremely shallow depth of field that we associate most with large format cameras.

In previous versions of the app a ring of blur was placed around the subject based on where the camera was focused, but it was only really effective when the subject was a person and they were in the right part of the frame. The new v2.1 version bases the decision about where the blur should go on real depth data, so the effect can be applied in a convincing way in a much wider range of situations.
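The depth-based approach described above can be sketched in a few lines of Python. This is only an illustration of the general technique – blur strength derived from each pixel's distance to the subject's plane of focus – not Hipstamatic's actual code; the function name and the `focus_band` parameter are invented for the example.

```python
def blur_weights(depth_map, subject_depth, focus_band=0.1):
    """Per-pixel blur strength from a depth map (values in meters).

    Pixels within `focus_band` of the subject's plane stay sharp (0.0);
    blur ramps up with distance from that plane, capped at 1.0.
    """
    weights = []
    for row in depth_map:
        weights.append([
            min(1.0, max(0.0, abs(d - subject_depth) - focus_band))
            for d in row
        ])
    return weights

# Subject at 1.0 m: the face and near shoulder stay sharp,
# while a background at 3.0 m gets the full blur.
w = blur_weights([[1.0, 1.05, 3.0]], subject_depth=1.0)
```

Compared with guessing a "ring of blur" around a detected subject, a depth map lets the app apply this per pixel in any scene geometry, which is exactly the improvement the v2.1 release claims.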

The app is only available for iPhone users, and can be downloaded from the Apple App Store. For more information on the TinType app see Hipstamatic’s TinType page.


Officials say Apple’s claim of ‘studio quality’ portraits on iPhone X, Xs isn’t misleading

05 Dec

Two challenges to Apple’s claim that its iPhone X can shoot studio quality portraits have been turned down by the UK’s Advertising Standards Authority (ASA). The complainants took issue with Apple’s advertising line that the phone could deliver ‘Studio-quality portraits […] Without the studio’ and believed consumers would be misled, but after an investigation the ASA found that the statement was fair.

The basis of the findings is that there isn’t a clear definition of what ‘studio quality’ means, and that the wide range of talent in the studio photography industry means the term doesn’t necessarily indicate that a ‘studio quality’ portrait is a good one. Rather, the ASA agreed with Apple that the Portrait Lighting effects, the depth-of-field-mimicking software and the inclusion of a standard, instead of a wide, focal length meant that the characteristics of a ‘studio’ portrait could be achieved. The investigation also found that the effects shown in the Apple adverts could indeed be produced with the phone, whether at the time of shooting or post-capture.

The ruling might seem a smack in the face to the portrait business, and to undermine respect for the profession, but photographers are perhaps becoming victims of our own well-worn stock phrases, such as ‘the best camera is the one you have with you’. While there is no clear measure of what ‘studio quality’ means, skill and vision are required to create a good portrait, and as we all know, ‘it’s the archer, not the arrow’ – though Apple forgot to mention that bit.

The fact is that smartphones are genuinely becoming better and better at taking pictures, and their developers are devising features and functions well ahead of those that traditional camera makers offer. These features often exist to compensate for the physical limitations of the tiny camera units, but they also put incredible flexibility into the hands of the user. At every turn in history the advances of smaller formats have been opposed by ‘proper’ photographers, but that has done nothing to prevent the inevitable progress of the convenience and popularisation of photography. You would be mad to buy an iPhone X to start your portrait business, however, as a decent interchangeable-lens camera can be had for less than the same price – with change left over to use a pay phone.

For more information on the complaint, the investigation and the ruling see the Advertising Standards Authority website.


iPhone XS: How does the variable bokeh effect compare to a real lens?

04 Dec

One of the key new features of Apple’s latest iPhones is the ability to adjust the ‘bokeh effect’ on portrait images, after they’ve been taken. But, as well as letting you adjust the intensity of the effect, the function has been enhanced to more accurately represent the bokeh characteristics of a real lens, rather than just trying to blur the background.

Every time you shoot an image using the 56mm-equivalent F2.4 portrait camera on the iPhone XS, you have the option of editing the bokeh effect. This brings up a scale marked in F-numbers. That may sound like Apple simply borrowing an interface from the real world (a process called skeuomorphism), but it goes beyond that: the company says it has modeled the bokeh characteristics to mimic the behavior of a Zeiss lens.

We thought we’d put this to the test: how convincingly does the iPhone XS mimic a real-world lens? Is the F-number scale anything more than a pastiche? To find out, we shot the XS alongside the Nikkor 58mm F1.4 mounted on a full-frame camera.

iPhone XS vs Nikon 58mm at F1.4

Left: iPhone XS image processed as ‘F1.4’. Right: Nikkor 58mm at F1.4

Scaling the Nikon image down to the same width, you can see the bokeh is around the right size:

Then, when you look at the bokeh off-center, you’ll see it develops an elongated ‘cat-eye’ effect.

iPhone XS vs Nikon 58mm at F8

Left: iPhone XS image processed as ‘F8’. Right: Nikkor 58mm at F8

Just as with the real lens, the cat-eye effect diminishes as you ‘stop down.’ And Apple has given its bokeh a smooth, fairly gaussian look, rather than the slightly bright-edged bokeh that Nikon has produced, being constrained by the limitations of things such as glass and physics.

Unlike the ‘real’ camera, the iPhone’s sharpness doesn’t always drop off smoothly: for instance, it has blurred both shoulders and the subject’s scarf, despite the nearer shoulder being in a similar plane to the face.

However, while this doesn’t always look natural, the phone is intentionally ensuring that the subject’s face remains entirely in focus, which is usually a good thing. And, unlike the $1600 Nikkor lens, it doesn’t become a little soft and dreamy when set to ‘F1.4.’

Equally, because the iPhone isn’t actually changing its aperture, you don’t find yourself with less light if you want more depth of field (the iPhone portrait camera’s actual depth of field is F15 equivalent, so there’s plenty that’s in focus in the underlying ‘native’ image), so you don’t have to worry so much about camera shake or subject movement.

The end result isn’t going to convince anyone if they look too closely (the processing has cut-off some of the fine hairs, for instance), but for social media use, it’s hard to deny that the effect is impressive. And we have to assume this technology will only get smarter and more powerful in future generations.


iPhone X bug lets hackers steal deleted photos

16 Nov

If you have any particularly embarrassing or otherwise compromising photos on your iPhone you might want to think twice about how to keep them from being discovered by someone else. Simply deleting them might not be enough.

A vulnerability allowing hackers to access deleted photos and other files on the iPhone X was discovered by two researchers this week at the Pwn2Own hacking contest for finding iOS and Android bugs.

Richard Zhu and Amat Cama demoed the issue by connecting the iPhone X with iOS 12.1 to a malicious Wi-Fi network and exploiting a vulnerability in a so-called just-in-time (JIT) compiler which is designed to help iPhones to perform certain tasks faster.

The pair were then able to retrieve a photo from the Photos app’s Recently Deleted album, where images are stored for 30 days after you delete them from the camera roll. This feature allows users to recover deleted photos should they change their mind. Other files processed by the JIT compiler could be accessed through the same method as well.
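The Recently Deleted album is, in effect, a soft delete with a 30-day retention window: deletion moves the file rather than destroying it, and only a later purge makes it unrecoverable. A minimal sketch of such a scheme in Python – the class and method names are illustrative, not Apple's implementation:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)

class PhotoLibrary:
    """Toy model of a camera roll with a 'Recently Deleted' album."""

    def __init__(self):
        self.camera_roll = {}       # photo_id -> image data
        self.recently_deleted = {}  # photo_id -> (image data, deleted_at)

    def delete(self, photo_id, now):
        # Deleting only moves the photo; the bytes stay recoverable.
        self.recently_deleted[photo_id] = (self.camera_roll.pop(photo_id), now)

    def recover(self, photo_id):
        data, _ = self.recently_deleted.pop(photo_id)
        self.camera_roll[photo_id] = data

    def purge_expired(self, now):
        # Permanently drop anything deleted more than RETENTION ago.
        for pid in [p for p, (_, t) in self.recently_deleted.items()
                    if now - t >= RETENTION]:
            del self.recently_deleted[pid]

lib = PhotoLibrary()
lib.camera_roll["img1"] = b"raw bytes"
t0 = datetime(2018, 11, 16)
lib.delete("img1", now=t0)
lib.purge_expired(now=t0 + timedelta(days=29))   # still recoverable
lib.purge_expired(now=t0 + timedelta(days=31))   # now gone for good
```

The exploit's significance follows directly from this design: until the purge runs, a "deleted" photo is ordinary data that any code with sufficient access can read.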

Zhu and Cama received a $50,000 reward for their findings, and Apple has been informed of the bug. According to Forbes, though, the issue has yet to be fixed.


iPhone XR Portrait mode for pets, inanimate objects enabled by Halide developers

30 Oct

The developers behind the camera app Halide may have discovered a way to enable Portrait mode for pets and objects on the iPhone XR. The revelation was made by one of the developers on Reddit over the weekend, where it was explained that the team found depth data from the iPhone XR’s camera and used it to successfully apply Portrait mode to pets and inanimate objects.

Unlike the other new dual-camera iPhone models, the iPhone XR’s single rear camera only supports taking Portrait images of humans, a limitation that may be addressed by third-party apps like Halide. According to the Reddit post, using the mode on non-human subjects with the iPhone XR is a bit finicky at times and only works if there’s “enough variance in relative distance of objects.”

“Note that the depth map is way lower resolution than the dual camera setup, but it seems usable,” the post states. Halide developer Ben Sandofsky shared a Twitter post showing the resolution difference between iPhone XS and iPhone XR depth data. The feature needs “some more tooling,” the Reddit post states, but it’s likely Halide will offer it to iPhone XR users in a future update.
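The "enough variance in relative distance" requirement amounts to a statistical gate on the depth map: if every pixel sits at roughly the same distance, there is nothing to separate into foreground and background. A hedged sketch of such a gate, assuming normalized depth values and an invented threshold (Halide's actual heuristic is not public):

```python
def has_enough_depth_variance(depth_map, threshold=0.05):
    """Only attempt a portrait effect when depth values actually spread out.

    `depth_map` holds normalized depths in [0, 1]; `threshold` is a
    hypothetical variance cutoff chosen for illustration.
    """
    values = [d for row in depth_map for d in row]
    mean = sum(values) / len(values)
    variance = sum((d - mean) ** 2 for d in values) / len(values)
    return variance >= threshold

flat_wall = [[0.5, 0.5], [0.5, 0.51]]   # near-uniform depth: no effect possible
layered = [[0.2, 0.2], [0.9, 0.9]]      # subject against distant background: OK
```

A check like this would explain the "finicky" behavior: scenes without clear depth separation simply don't provide enough signal for the single-camera depth map to work with.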


iPhone XS / XS Max sample gallery updated

26 Oct


In the time since we posted our first look at the iPhone XS’ image quality, we’ve continued shooting with its larger sibling, the XS Max. The two devices use identical 12MP dual-camera systems, boasting better HDR and Portrait Mode effects thanks to processing and computational improvements. Take a look at some additional sample images from Apple’s latest flagship phone.


The iPhone XS is a leap forward in computational photography

05 Oct

Aside from folks who still shoot film, almost nobody uses the term ‘digital photography’ anymore – it’s simply ‘photography,’ just as we don’t keep our food in an ‘electric refrigerator.’ Given the changes in the camera system in Apple’s latest iPhone models, we’re headed down a path where the term ‘computational photography’ will also just be referred to as ‘photography,’ at least by the majority of photographers.

The iPhone XS and iPhone XS Max feature the same dual-camera and processing hardware; the upcoming iPhone XR also sports the same processing power, but with only a single camera: the same wide-angle F1.8 camera found on the other models. The image sensor captures 12 megapixels of data, the same resolution as every previous model dating back to the iPhone 6s, but the pixels themselves are larger at 1.4 µm, compared to 1.22 µm for the iPhone X, meaning a slightly larger sensor. (For more on the camera’s specs, see “iPhone XS, XS Max, and XR cameras: what you need to know.”)


More important this year is upgraded computational power and the software it enables: the A12 Bionic processor, the eight-core ‘Neural Engine,’ and the image signal processor (ISP) dedicated to the camera functions. The results include a new Smart HDR feature that rapidly combines multiple exposures for every capture, and improved depth-of-field simulation using Portrait mode. (All the examples throughout are straight out of the device.)

Smart HDR

This feature intrigued me the most, because last year’s iPhone 8, iPhone 8 Plus and iPhone X introduced HDR as an always-on feature. (See “HDR is enabled by default on the iPhone 8 Plus, and that’s a really good thing.”) HDR typically blends two or more images of varying exposures to end up with a shot with increased dynamic range, but doing so introduces time as a factor; if objects are in motion, the delay between captures makes those objects blurry. Smart HDR captures many interframes to gather additional highlight information, and may help avoid motion blur when all the slices are merged into the final product.
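The multi-frame idea can be pictured as a per-pixel weighted blend that favors well-exposed readings over clipped highlights or crushed shadows. The toy Python model below illustrates exposure fusion in general – the weighting function and names are invented for the example, and this is not Apple's Smart HDR pipeline:

```python
def merge_exposures(frames):
    """Blend bracketed frames per pixel, weighting values near mid-gray.

    Each frame is a list of pixel luminances in [0, 1]. Well-exposed
    readings (near 0.5) dominate; clipped (1.0) or crushed (0.0)
    readings contribute almost nothing.
    """
    def weight(v):
        return max(1e-3, 1.0 - abs(v - 0.5) * 2)  # peaks at mid-gray

    merged = []
    for pixel_values in zip(*frames):
        ws = [weight(v) for v in pixel_values]
        merged.append(sum(w * v for w, v in zip(ws, pixel_values)) / sum(ws))
    return merged

# A blown-out pixel (1.0) in the bright frame is overruled by the
# better-exposed reading (0.6) from the darker frame.
out = merge_exposures([[1.0], [0.6]])
```

Capturing the frames as rapid interframes, rather than as slow sequential brackets, is what lets a scheme like this also limit the motion blur that plagued earlier HDR modes.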

The iPhone XS image almost looks as if it was shot using an off-camera flash

Testing Smart HDR proved to be a challenge, because unlike with the HDR feature in earlier models, the Photos app doesn’t label Smart HDR images as such. After shooting in conditions that would be ripe for HDR – bright backgrounds and dark foreground, low-light conditions at dusk – nothing had that HDR indicator. I wasn’t initially sure if perhaps the image quality was due to Smart HDR or the larger sensor pixels; no doubt some credit is due to the latter, but it couldn’t be that much.

Comparing shots with those taken with an iPhone X reveals Smart HDR at work, though. In the following photo at dusk, I wanted to see how well the cameras performed in the fading light and also with motion in the scene (the flying sand). The iPhone X image is dark, but you still get a fair bit of detail in the girl’s face and legs, which are away from the sun. The iPhone XS image almost looks as if it was shot using an off-camera flash, likely because the interframes allow highlight retention and motion freezing even as ‘shutter speeds’ become longer.

Shot with iPhone X
Shot with iPhone XS

As another example, you can see the Smart HDR on the iPhone XS working in even darker light compared to the iPhone X shot. At this point there’s more noise in both images, but it’s far more pronounced in the iPhone X photo.

Shot with iPhone X
Shot with iPhone XS

Smart HDR doesn’t seem to kick in when shooting in burst mode, or the effect isn’t as pronounced. Considering the following photo is captured at 1/1000 sec, and the foreground isn’t a silhouette, the result isn’t bad.

iPhone XS image shot in burst mode. It’s dark, but picks up the detail in the sand.
iPhone XS non-burst image captured less than a minute after the photo above.

Portrait Mode

The iPhone’s Portrait mode is a clever cheat involving a lot of processing power. On the iPhone X and iPhone 8 Plus, Apple used the dual backside cameras to create a depth map to isolate a foreground subject – usually a person, but not limited to people-shaped objects – and then blur the background based on depth. It was a hit-or-miss feature that sometimes created a nice shallow depth-of-field effect, and sometimes resulted in laughable, blurry misfires.

On the iPhone XS and iPhone XS Max, Apple augments the dual cameras with Neural Engine processing to generate better depth maps, including a segmentation mask that improves detail around the edge of the subject. It’s still not perfect, and one pro photographer I know immediately called out what he thought was a terrible appearance, but it is improved, and in some cases most people may not recognize that it’s all done in software.

The notable addition to Portrait mode in the iPhone XS and iPhone XS Max is the ability to edit the simulated depth of field within the Photos app. A depth control slider appears for Portrait mode photos, with f-stop values from F1.4 to F16. The algorithm that creates the blur also seems improved, creating a more natural effect than a simple Gaussian blur.
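The physics behind that slider is straightforward: for a fixed focal length and focus distance, the diameter of an out-of-focus disc scales with the aperture diameter, which means it scales inversely with the f-number. A sketch of that mapping (the constants are illustrative, not values Apple publishes):

```python
def blur_radius_px(f_number, max_radius=40.0, reference_f=1.4):
    """Blur-kernel radius for a simulated f-stop.

    Radius scales as 1 / f-number, so 'F1.4' gets the full radius
    and 'F16' only a sliver, mirroring a real lens stopping down.
    `max_radius` and `reference_f` are hypothetical tuning constants.
    """
    return max_radius * reference_f / f_number

# Stopping down two stops (F1.4 -> F2.8) halves the blur disc.
```

Because the blur is synthesized after capture, dragging the slider to F16 costs nothing in light, unlike physically stopping down a real lens.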

Apple also says it has analyzed the optical characteristics of some “high-end lenses” and tried to mimic their bokeh. For instance, the simulated blur should produce circular discs at the center of the image but develop a ‘cat-eye’ look as you approach the edge of the frame. The company says that a future update will include that control in the Camera app for real-time preview of the effect.

Portrait mode is still no substitute for optics and good glass. Sometimes objects appear in the foreground mask – note the coffee cup over the shoulder at left in the following image – and occasionally the processor just gets confused, blurring the horizontal lines of the girl’s shirt in the next example. But overall, you can see progress being made toward better computational results.

Flare and a Raw Footnote

One thing I noticed with my iPhone XS is that it produced more noticeable lens flare when catching direct light from the sun or bright sources such as playing-field lights, as in the following examples; notice the blue dot pattern in the foreground of the night image.

Since I wanted to focus on the Smart HDR and Portrait mode features for this look, I haven’t shot many Raw photos using third-party apps such as Halide or Manual (the built-in Photos app does not include a Raw capture mode). Sebastiaan de With, the developer of Halide, determined that in order to make faster captures, the camera is shooting at higher ISOs, and then de-noising the results via software. With Raw photos, however, that results in originals that aren’t as good as those created by the iPhone X, because they’re noisier and exposed brighter. You can read more at the Halide blog: iPhone XS: Why It’s a Whole New Camera.

Overall, though, the camera system in the iPhone XS and iPhone XS Max turns out to be a larger improvement than it initially seemed, especially for the majority of iPhone owners who want to take good photos without fuss. Apple’s computational photography advancements in these models deliver great results most of the time, and point toward more improvements in the future.

iPhone XS sample gallery


Camera app developer says there’s no ‘beauty filter’ being applied on the iPhone XS, XS Max

03 Oct

Yesterday we learned that at least a handful of iPhone XS and XS Max users are unhappy with their new devices’ front camera image quality, with some early adopters reporting excessive skin smoothing and beautification effects when taking self-portraits.

Software developer Sebastiaan de With, the man behind the Halide camera app, has taken a closer look at the new iPhone models’ camera processing and says there isn’t any beautification applied to front camera images. Instead, he says, it’s Apple’s new approach to image processing that can result in soft textures and smoothing.

Both the front and rear cameras in the iPhone XS and XS Max are applying computational photography methods, merging multiple frames into one to optimize image quality across the image. Frames are captured at different exposures, with the image processor picking the best elements of each frame and combining them into the final image output.

In his blog post, de With says that this method results in a “whole new look” that’s quite different from previous iPhone cameras. The frame merging reduces the brightness of the bright areas and lifts the dark shadow areas, resulting in textures with lower levels of contrast. All the detail is still there, but the viewer perceives those areas as softer and less sharp. This is also why the skin in selfie images looks softer.
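That point can be demonstrated numerically: pulling shadows up and highlights down preserves the detail but lowers the measured contrast between neighboring pixels, which the eye reads as softness. A toy model in Python, where the compression function is a simple stand-in, not Apple's actual tone handling:

```python
def local_contrast(pixels):
    """Mean absolute difference between neighboring pixel luminances."""
    return sum(abs(a - b) for a, b in zip(pixels, pixels[1:])) / (len(pixels) - 1)

def tone_compress(pixels, amount=0.5):
    """Pull every luminance toward mid-gray, as multi-frame merging
    effectively does to deep shadows and bright highlights."""
    return [0.5 + (p - 0.5) * (1 - amount) for p in pixels]

texture = [0.2, 0.8, 0.1, 0.9]   # high-contrast detail, e.g. skin texture
softened = tone_compress(texture)
# Every edge survives, but each one measures lower in contrast,
# so the same detail reads as 'softer'.
```

Nothing here discards information – which is exactly de With's argument that no beauty filter is involved, only a flatter rendering of the same data.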

Additionally, the new iPhone models apply more aggressive noise reduction — something Apple was already known for going heavy on in the past. This is necessary because the iPhone XS tends to use faster shutter speeds and higher ISO values than previous versions, presumably to keep motion blur to a minimum. Getting rid of the noise inevitably also eliminates some fine detail.

The reduction in detail is particularly noticeable on the front camera, where a smaller image sensor comes with higher noise levels to start with. On the plus side, dynamic range is increased, which is particularly useful for high-contrast scenes: highlight clipping is reduced and more shadow detail is visible.

De With also says all these software parameters can be tweaked by Apple. So, if it turns out the “new look” isn’t too popular with consumers the Apple engineers could pretty easily revert to a more “traditional” look via a software update.

De With’s Halide app will soon receive a new Smart RAW feature that “deactivates” Apple’s Smart HDR algorithm to reduce noise reduction and reveal more image detail and fine textures. For more information head over to Sebastiaan’s complete article on the Halide blog.


The cameras inside the iPhone Xs and Xs Max are estimated to cost $51.10

02 Oct

The cost benchmarking team at analyst firm IHS Markit has dissected an iPhone Xs Max in order to estimate the cost of its components. For the 64GB version, the team has calculated an estimated total bill of materials (BOM) of $390, a $20 increase over the smaller, previous-generation iPhone X.

With a starting price of $1,099 at the retail end of the supply chain, however, this $20 increase in component cost translates into a $100 increase in retail pricing. As a comparison, Samsung’s Galaxy S9+ with 64GB of storage has a BOM of $375.80 and retails at around $840.

The total cost of the camera components inside the new iPhone models (the iPhone Xs and Xs Max use the exact same camera components) amounts to $51.10. This estimated total includes the two cameras on the rear of the device, the front-facing camera, and the TrueDepth sensor used for Face ID, Apple’s face recognition feature. Without the TrueDepth module, the two 12-megapixel rear cameras and the 7-megapixel front-facing camera come out to an estimated $37.60.

Combined, all of the camera components inside the iPhone Xs and Xs Max represent approximately 13 percent of the total BOM, showing that camera technology is becoming ever more important in newer smartphones. On the single-camera iPhone 7 from 2016, the total cost of all camera components was only 9.6 percent of the total BOM.
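The headline percentage can be checked directly from the figures in the report:

```python
camera_bom = 51.10   # all camera components, iPhone Xs / Xs Max, per IHS Markit
total_bom = 390.00   # estimated 64GB iPhone Xs Max bill of materials

camera_share = camera_bom / total_bom * 100
# 51.10 / 390 works out to roughly 13 percent, matching the figure quoted.
```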
