
Posts Tagged ‘Apple’

Leak claims Apple will use sensor-shift stabilization tech in some 2020 iPhone 5G models

21 Dec

Taiwanese tech publication Digitimes has published a new report claiming that 2020 iPhone models with 5G network support will use ‘sensor-shift stabilization technology.’ According to the report, the alleged new feature will only be available on select 5G models, potentially offering better image quality than models that only feature optical image stabilization.

Optical image stabilization (OIS) works by shifting the lens, whereas sensor-shift stabilization works by shifting the sensor. Though OIS is now a common feature on flagship smartphones, sensor-shift stabilization has been largely confined to dedicated digital cameras, something Digitimes claims Apple will change starting next year.

Past leaks allege that Apple plans to release four new iPhones in 2020, including cheaper base-tier models and more expensive high-end models. At the top of that range, the 2020 iPhone is expected to feature a new 3D camera system for augmented reality applications.

It’s unclear whether sensor-shift stabilization would be limited to these higher-end models, or whether the tech will play a role in Apple’s alleged AR ambitions. Digitimes has a mixed track record with its consumer gadget leaks, though it has accurately published unreleased iPhone details in the past.

Articles: Digital Photography Review (dpreview.com)

 

 

Apple acquires Spectral Edge to boost iPhone camera performance

13 Dec

It looks like Apple might have just made another move to improve image quality on its iPhone cameras. Bloomberg reports that, according to filings made public in the U.K. on Thursday, Apple has acquired the Cambridge, U.K.-based startup Spectral Edge Ltd.

Spectral Edge has developed a technology that can improve detail and color on digital cameras by taking an infrared picture and merging it with a conventional color photo using machine learning. Apple is already using AI in some of its camera algorithms, for example to improve low light images, so using AI to improve other aspects of image quality would fit nicely into the concept.

The Spectral Edge technology can be applied via software or baked into hardware, which would allow Apple to integrate it into a custom image signal processor.

No purchase price has been reported, but last year Spectral Edge said it had raised more than $5 million in funding.

We should not expect the technology to be integrated into the next iPhone generation, but in the medium to long term it will be interesting to see whether Spectral Edge can help improve smartphone imaging.


 

Apple confirms its new Mac Pro, Pro Display XDR monitor will be available to order tomorrow

09 Dec

Apple has announced it’s opening up pre-orders for its new Mac Pro and Pro Display XDR monitor tomorrow, December 10.

It’s been six months since Apple first showed off the redesigned Mac Pro and accompanying Pro Display XDR monitor at WWDC. At the time, no definitive release date was given, aside from vague hints the hardware would arrive in autumn 2019.

New emails sent out to customers confirm the devices will be ‘Available to order December 10’ and include a ‘Save the date’ calendar reminder.

The Mac Pro starts at $5,999 and the Pro Display XDR monitor starts at $4,999 (and requires either the $999 stand or a $199 VESA mount). As Apple noted back in September, the new Mac Pro will be built in the United States, like its cylindrical predecessor. You can order the devices tomorrow at Apple.com.


 

Apple loses patent lawsuit, will have to pay RED royalties for ProRes RAW format

13 Nov

Apple has failed in an attempt to overturn patents held by RED that govern methods for compressing Raw video, leaving the company open to paying royalties on its ProRes RAW file format. Apple had tried to show that the technology RED patented around its REDCODE codec was obvious and shouldn’t have been granted protection, but the court rejected the claim, leaving RED free to continue licensing the compression technique to camera, software and accessory manufacturers.

It seems that Apple had wanted to avoid paying royalties on the ProRes RAW format it introduced via Final Cut Pro last year, and which is used in some DJI drones, some Atomos recorders and a few other products. The problem is that RED claims ProRes RAW uses technology RED owns for compressing Raw files to make them manageable to work with. REDCODE allows Raw video to be captured and compressed in-camera in much the same way that stills cameras compress Raw stills, allowing data directly from the sensor to be recorded and made available for very flexible post-production manipulation.

RED’s technology allows files to be compressed by ratios of up to 22:1, though it says 3:1 is mathematically lossless and 8:1 is visually lossless. The value of this is not only that it allows video files to be reduced in size, but also that for the same size file videographers can record in higher resolutions to provide the means for heavy cropping and frame splitting in post-production.
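To get a feel for why such ratios matter, here’s a back-of-the-envelope sketch (hypothetical 8K frame size and a 16-bit-per-pixel assumption for illustration only; actual REDCODE bit depths and resolutions vary):

```python
def raw_video_rate_gbps(width, height, fps, bits_per_pixel=16):
    """Uncompressed sensor data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

uncompressed = raw_video_rate_gbps(8192, 4320, 24)  # hypothetical 8K 24p capture
visually_lossless = uncompressed / 8                # at an 8:1 'visually lossless' ratio
print(uncompressed)       # ~13.6 Gbps uncompressed
print(visually_lossless)  # ~1.7 Gbps compressed
```

The arithmetic shows why in-camera compression is what makes Raw video recording practical at all: a data rate in the tens of gigabits per second shrinks to something a memory card can sustain.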

RED President Jarred Land posted on the RED User forum that he was glad the company’s technology remained protected, but characterized the dispute as part of a process for establishing where each company stood technology-wise so the two could continue to work together. He wrote:

‘We are pleased to see our REDCODE patents withstand another challenge.

To be clear, as I mentioned before, this never really was Apple vs. RED. It has always been APPLE + RED, and this was all part of the process defining how we work together in the future.

RED integration with Apple’s METAL framework for realtime R3D playback is coming along well and the work that the two teams are doing together is exceeding expectations. We are very excited for the new Mac Pro and the new XDR pro display and the power they bring to the entire RED workflow.’

The ‘another challenge’ refers to a similar attempt made by Sony in 2013.


 

Apple paid $400 in an attempt to trademark the word ‘Slofies’

19 Sep

Last week Apple showed off the slow-mo video capabilities of the front-facing camera on its new iPhone 11 models through the use of ‘Slofies,’ a portmanteau of ‘slow-mo’ and ‘selfie.’

At the time, the concept was presented as a humorous take on selfies — itself a shortened form of the phrase self-portrait — but not much more. Turns out, that might not be the case, as Apple has applied for a U.S. trademark for ‘Slofie,’ which would give it the ability to limit how the word is used.

The ‘drawing’ included in the trademark filing to show the phrase Apple is attempting to trademark.

All of Apple’s iPhone 11 models feature a front-facing camera that can record up to 120 frames per second (fps). As detailed in its demonstration video, the result, when slowed down, is a humorous slow-motion clip that puts a — sometimes literal — spin on selfies.

According to the filing, Apple hopes to trademark ‘Slofie’ as the word pertains to ‘downloadable computer software for use in capturing and recording video.’ Apple says the intent of the filing is to ensure it ‘has a bona fide intention, and is entitled, to use the mark in commerce on or in connection with the identified goods/services.’

As pointed out by The Verge, this likely ‘means this trademark seems to be more about preventing other companies from making slofie-branded camera apps than it is about limiting popular usage of this totally made-up word.’

According to the filing, Apple paid a $400 fee to file the trademark application.


 

The iPhone 11 is more than just Apple catching up to Android

18 Sep

Apple announces iPhone 11 (Pro)

Major smartphone manufacturers introduce new models on a yearly cadence. Camera upgrades tend to be a major focus, with little else apparently to differentiate new models from old ones. Often, what seem like small, incremental upgrades can have significant impact on photo and video quality. The iPhone XS, for example, dramatically improved image quality in high contrast scenes thanks to the sensor’s ability to capture ‘inter-frames’ – short exposures in between longer ones – to improve dynamic range and noise. Similarly, 4K video was improved with multi-frame HDR capture.

Last week, Apple announced numerous updates to the cameras in the iPhone 11, some of which will inevitably be seen as attempts to catch up to capabilities of contemporary Android offerings. But, taken together, we think they stack up to meaningful upgrades that potentially make an already very capable camera one of the most compelling ones on the market.

See beyond your frame

The iPhone 11 is the first Apple phone to feature an ultra-wide angle lens, a feature already present on numerous Android phones: its ‘0.5x’ module offers a whopping 13mm equivalent field of view. Wide angle lenses often add drama and a sense of depth to everything from landscapes and portraits to architecture and still life. They also allow for creative framing options, juxtaposing interesting foreground objects against distant ones, and they’re useful when you simply can’t step any further back from your subject.

The iPhone 11 models alert you to the potential presence of objects of interest beyond your current framing by showing you the wider field-of-view within the camera user interface. Simply tap the ‘1x’ focal length multiplier button to zoom out (or zoom in, on the Pro models).

Refined image processing pipeline

Newer, faster processors often mean increased photo and video capability, and the iPhone 11 is no exception. Its image processing pipeline, which handles everything from auto white balance to auto exposure, autofocus, and image ‘development’, gets some new features: a 10-bit rendering pipeline upgraded from the previous 8-bit one, and the generation of a segmentation mask that isolates human subjects and faces, allowing for ‘semantic rendering’.

10-bit rendering should help render high dynamic range images without banding, which could otherwise result from the extreme tone-mapping adjustments required. Semantic rendering allows faces to be processed differently from other portions of the scene, allowing for more intelligent tone mapping and local contrast operations in images with human subjects (for example, faces can look ‘crunchy’ in high contrast scenes if local contrast is uniformly preserved across the entire image). The end result? More pleasing photos of people.
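A toy illustration of mask-aware processing (the function name, values and contrast factors here are invented for the sketch; Apple’s actual semantic rendering is far more sophisticated): local contrast is boosted everywhere, but more gently inside the face mask.

```python
import numpy as np

def semantic_tone_map(img, face_mask, strong=1.6, gentle=1.1):
    """Toy 'semantic rendering': raise contrast around the mean everywhere,
    but more gently where the segmentation mask marks a face, so skin
    doesn't turn 'crunchy' in high contrast scenes."""
    mid = img.mean()
    strength = np.where(face_mask, gentle, strong)
    return mid + (img - mid) * strength

img = np.array([[0.4, 0.6],
                [0.4, 0.6]])
mask = np.array([[True, True],
                 [False, False]])   # top row is the 'face'
out = semantic_tone_map(img, mask)
```

After processing, the non-face region ends up with more contrast (0.34 vs. 0.66) than the face region (0.39 vs. 0.61), which is the qualitative behavior the text describes.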

Night mode

The general principle of night modes on smartphones is to use burst photography to capture multiple frames. Averaging pixels from multiple exposures reduces noise, allowing the camera software to brighten the image with less noise penalty.
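The averaging principle can be sketched in a few lines (a toy simulation, not any vendor’s actual pipeline; the scene value and noise level are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_frame(scene, read_noise=0.05):
    """One noisy capture of a static scene (additive Gaussian noise)."""
    return scene + rng.normal(0.0, read_noise, scene.shape)

def merge_burst(frames):
    """Average a stack of aligned frames; noise falls roughly as 1/sqrt(N)."""
    return np.mean(frames, axis=0)

scene = np.full((64, 64), 0.2)                      # dim, flat test scene
frames = [simulate_frame(scene) for _ in range(16)]

single_noise = np.std(frames[0] - scene)
merged_noise = np.std(merge_burst(frames) - scene)
# With 16 frames the noise ratio should be close to sqrt(16) = 4
```

Averaging 16 frames cuts the noise standard deviation by roughly a factor of four, and that reduction is exactly the headroom a night mode spends on brightening the image.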

Google set the bar for low light photography with its Night Sight mode. Other Android phones soon added similar modes, making the iPhone’s lack of one particularly conspicuous (third-party solutions like Hydra haven’t offered quite the level of detail of the best Android implementations, and of course require you to launch a separate app).

Apple has developed its own Night mode on the iPhone 11 phones, which turns on automatically under dim conditions. Apple’s approach is slightly different from Google’s, using ‘adaptive bracketing’ to capture and fuse multiple exposures with potentially differing shutter speeds (the Pixel takes a burst of images at the same shutter speed).

Varying shutter speeds to capture both short and long exposures can help reduce blur with moving subjects. Information from shorter exposures is used for moving subjects, while longer exposures – which are inherently brighter and contain less noise – can be used for static scene elements. Each frame is broken up into many small blocks before alignment and merging. Blocks that have too much motion blur are discarded, with a noise penalty resulting from fewer averaged frames for that scene element.
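The block-wise merging with motion rejection can be illustrated crudely as follows (hypothetical tile size and rejection threshold; real pipelines also align each tile before comparing it). Tiles that reject frames are averaged from fewer samples, which is the noise penalty described above.

```python
import numpy as np

def merge_tiles(frames, base, tile=16, max_err=0.1):
    """Merge aligned frames into the base image tile by tile, skipping
    tiles whose content differs too much from the base (a crude motion
    test). Rejected frames leave fewer samples for that tile."""
    out = np.zeros_like(base)
    for y in range(0, base.shape[0], tile):
        for x in range(0, base.shape[1], tile):
            ref = base[y:y+tile, x:x+tile]
            stack = [ref]
            for f in frames:
                cand = f[y:y+tile, x:x+tile]
                if np.mean(np.abs(cand - ref)) < max_err:  # static enough?
                    stack.append(cand)
            out[y:y+tile, x:x+tile] = np.mean(stack, axis=0)
    return out

base = np.zeros((32, 32))
still = base + 0.01                # frame with only slight noise: accepted
moving = base.copy()
moving[:16, :16] = 1.0             # subject moved through the top-left tile
merged = merge_tiles([still, moving], base)
```

In the demo, the top-left tile rejects the frame containing motion and averages only two frames, while the other tiles average all three.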

Deep Fusion

Google’s Night Sight mode isn’t just about better photos in low light. Night Sight uses burst photography and super resolution techniques to generate images with more detail, less noise, and less moiré thanks to the lack of demosaicing (slight shifts from frame to frame allow the camera to sample red, green and blue information at each pixel location). ‘Deep Fusion’, which will arrive in a software update later this year, appears to be Apple’s response to Night Sight.

Deep Fusion captures up to nine frames and fuses them into a higher resolution 24MP image. Four short and four standard exposures are constantly buffered in memory, with older frames discarded to make room for newer ones. This buffer guarantees that the ‘base frame’ – the most important frame, to which all other frames are aligned – is taken as close to your shutter press as possible, ensuring very short (or zero) shutter lag so the camera captures your desired moment.
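The rolling buffer idea can be sketched with a fixed-capacity queue (a hypothetical `FrameBuffer` class for illustration; Apple hasn’t published implementation details):

```python
from collections import deque

class FrameBuffer:
    """Keep only the most recent frames; the newest frame at the moment of
    the shutter press becomes the base frame (zero shutter lag)."""
    def __init__(self, capacity=8):
        self.frames = deque(maxlen=capacity)  # old frames drop off automatically

    def on_new_frame(self, frame):
        self.frames.append(frame)

    def on_shutter_press(self):
        # Base frame = most recent capture; older frames are fusion candidates.
        base = self.frames[-1]
        candidates = list(self.frames)[:-1]
        return base, candidates

buf = FrameBuffer(capacity=8)
for t in range(20):                 # simulate a 20-frame preview stream
    buf.on_new_frame(t)
base, candidates = buf.on_shutter_press()
print(base, candidates)             # 19 [12, 13, 14, 15, 16, 17, 18]
```

Because the buffer is always full of the freshest frames, the moment you press the shutter is (approximately) the moment the base frame was captured.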

After you press the shutter, one long exposure is taken (ostensibly to reduce noise), and subsequently all 9 frames are combined – ‘fused’ – presumably using a super resolution technique with tile-based alignment (described in the previous slide) to produce a blur and ghosting-free high resolution image. Apple’s SVP of Worldwide Marketing Phil Schiller also stated that it’s the ‘first time a neural engine is responsible for generating the output image’. We look forward to assessing the final results.

Portrait mode

Apple is famous for using its technologies to perfect ideas that other companies introduced, and that could be said of the iPhone’s Portrait mode. Fake blurred background modes have been around for years, even on some compact cameras, but they were never convincing. By applying depth mapping technology to the problem, Apple made a portrait mode that worked, and soon the feature was everywhere. We’re slowly seeing a similar trend with portrait relighting: researchers and companies have been quick to develop relighting techniques, some of which don’t even require a depth map.

The iPhone 11 updates Portrait mode in a few significant ways. First, it offers the mode on the main 26mm equivalent camera, allowing for shallow depth-of-field wide-angle portraiture. The main camera module has always had the best autofocus, and its sensor has been updated with ‘100% focus pixels’ (hinting at a dual pixel design), so wide-angle Portrait mode will benefit from this as well.

Second, on the Pro models, the telephoto lens used for more traditional portraits has been updated: its F2.0 aperture lets in 44% more light than the F2.4 aperture on previous telephoto modules. That’s a little over a half-stop improvement in light gathering ability, which should help both image quality and autofocus. Telephoto modules on most smartphone cameras have struggled to focus in low light, hunting and producing misfocused shots, so this is a welcome change.

And thirdly…

Portrait relighting

The iPhone 11 offers a new portrait relighting option: ‘High-Key Light Mono’. This mode uses the depth map generated from the stereo pair of lenses to separate the subject from the background, blow out the background to white, and ‘re-light’ the subject while making the entire photo B&W. Presumably, the depth map can aid in identifying the distance of various facial landmarks, so that the relighting effect can emulate the results of a real studio light source (nearer parts of the face receive more light than farther ones). The result is a portrait intended to look as if it were shot under studio lighting.

We’ve now talked a bit about the new features iPhone 11 brings to the table, but let’s turn our attention backwards and take a look at the ways in which iPhone cameras are already leading the industry, if not setting standards along the way.

Sublime rendering

Straight out of camera, iPhone photos are, simply put, sublime. Look at the iPhone XS shot above: with a single button press, I’ve captured the bright background and it looks as if my daughter has some fill light on her. If you want to get technical: white balance is well judged (not too cool), skintones are great (not too green), and wide dynamic ranges are preserved without crunchy results, a problem that can arise from tone-mapping a large global contrast range while retaining local contrast.

Much of this is thanks to segmented processing techniques that treat human subjects differently from the rest of the scene. Digging deeper and looking at images at the pixel level, Apple’s JPEG engine could do a better job of balancing noise reduction and sharpening: images can often appear overly smoothed in some areas, with aggressive sharpening and overshoot in others. This may be partly because results have been optimized for display on high-DPI Retina devices, and a Raw option – one that still utilizes all the computational multi-frame ‘smarts’ – would go a long way toward remedying this for enthusiasts and pros.

But it’s hard to argue that the iPhone’s default color and rendition aren’t pleasing. In our opinion, Apple’s white balance, color rendering and tone-mapping are second to none. The improvements to image detail, particularly from Apple’s as-yet unreleased ‘Deep Fusion’ mode, should (we hope) address many of our remaining reservations about pixel-level image quality.

HDR Photos

No, not the HDR you’re thinking of, which creates flat images from large dynamic range scenes. We’re talking about the HDR display of HDR captures. Think HDR10 and Dolby Vision presentations of 4K UHD video. Traditionally, when capturing a high contrast scene, we had two processing options for print or for display on old, dim 100-nit monitors: (1) preserve global contrast, often at the cost of local contrast, leading to flat results; or (2) preserve local contrast, often requiring clipping of shadows and highlights to keep the image from looking too unnatural. The latter is what most traditional camera JPEG engines do in the absence of dynamic range compensation modes.

With the advent of new display technology like OLED, capable of 10x or higher brightness compared to old displays and print, as well as nearly infinite contrast, this trade-off need no longer exist. The iPhone X was the first device to support the HDR display of HDR photos. Since then, iPhones have been able to capture a wide dynamic range and color gamut and then display them using the full range of their class-leading OLED displays. This means HDR photos need not look flat: they retain large global contrast, from deep shadows to bright highlights, while still looking contrasty, with pop, and without clipping tones and colors – all in an effort to get closer to reproducing the range of tones and colors we see in the real world.

It’s hard to show the effect, and much easier to experience it in person, but in the photo above we’ve used a camera to shoot an iPhone XS displaying HDR (left) vs. non-HDR (right) versions of the same photo. Note how the HDR photo has brighter highlights, and darker midtones, creating the impression that the sky is much brighter than the subject (which it is!). The bright displays on modern phones mean that the subject doesn’t look too dark compared to the non-HDR version, she just looks more appropriately balanced against the sky, rather than appearing almost the same brightness as the sky.

Wide color (P3)

Apple is also leading the field in wide gamut photography. Ditching the age-old sRGB color space, iPhone images can now fully utilize the P3 color gamut, which means images can contain far more saturated colors. In particular, more vivid reds, oranges, yellows and greens. You won’t see them in the image above because of the way our content management system operates, but if you do have a P3 display and a color managed workflow, you can download and view the original image here. Or take a look at this P3 vs. sRGB rollover here on your iPhone or any recent Mac.

Apple is not only taking advantage of the extra colors of the P3 color space, it’s also encoding its images in the High Efficiency Image Format (HEIF), an advanced format intended to replace JPEG. HEIF is more efficient, allows 10-bit color encoding (more colors without banding), and supports HDR encoding to allow the display of a larger range of tones on HDR displays.

Video

The video quality from the iPhone XS was already class-leading, thanks to the use of a high quality video codec and advanced compression techniques that suppressed common artifacts like macro-blocking and mosquito noise. 4K video up to 30 fps also had high dynamic range capture: fusing both bright and dark frames together to capture a wider range of contrast. Check out this frame grab of a very high contrast scene, with complex motion due to constant camera rotation. Note the lack of any obvious artifacts.

The iPhone 11 takes things further by offering extended dynamic range (EDR) for 4K 60p capture. Android devices are still limited to standard dynamic range capture. While a few Android devices offer a high dynamic range (HDR) video output format – such as HDR10 or HLG – to take advantage of the larger contrast of recent displays, without HDR capture techniques (fusion of multiple frames) this benefit is limited.

To sum up: we expect that the iPhone 11 will offer the highest quality video available in a smartphone, with natural looking footage in even high contrast scenes, now at the highest resolution and frame rates. We do hope Apple pairs its HDR video capture with an HDR video output format, like HDR10 or HLG. This would allow for a better viewing experience of the extended dynamic range (EDR) of video capture on the latest iPhones, with more pop and a wider color gamut.

Video experience

Apple is also looking to change the experience of shooting video in the iPhone 11 models. First, it’s easier to record video than it was in previous iterations: just hold down the shutter button in stills mode to start shooting a video. Instant video recording helps you capture the moment, rather than miss it as you switch camera modes.

Perhaps more exciting are the new video editing tools built right into the camera app. These allow for easy adjustment of parameters like exposure, highlights, shadows, contrast and color. And the interface appears to be as intuitive as ever.

Multiple angles and ‘Pro’ capture

Advanced video shooters are familiar with the FiLMiC Pro app, which allows for creative and total control over movie shooting. The CEO of FiLMiC was invited on stage to talk about some of the new features, and one of the coolest was the ability to record multiple streams from multiple cameras. The app shows all four streams from all four cameras (on the Pro), allowing you to choose which ones to record from. You can even record both participants of a face-to-face conversation using the front and rear cameras. This opens up new possibilities for creative framing, in some cases obviating the need for A/B cameras.

Currently it’s unclear how many total streams can be recorded simultaneously, but even two simultaneous streams opens up creative possibilities. Some of this capability will come retroactively to 2018 flagship devices, as we describe here.

Conclusion

Much of the sentiment following the launch of the iPhone 11 has centered on how Apple is playing catch-up with Android devices. And this is somewhat true: ultra-wide angle lenses, night modes and super-resolution burst photography have all appeared on multiple Android devices, with Google and Huawei leading the pack. No one is standing still, and the next iterations from these companies – and others – will likely leapfrog each other’s capabilities even further.

Even if Apple is playing catch up in some regards though, it’s leading in others, and we suspect that when they ship, the combination of old features and new – like Deep Fusion and Night mode – will make the iPhone 11 models among the most compelling smartphone cameras on the market.

As the newest iPhone, the iPhone 11 inevitably has the best camera Apple has made. But is the iPhone 11 Pro the best smartphone camera around right now? We’ll have to wait until we have one in our hands. And, of course, the Google Pixel 4 is a wildcard, just around the corner…


 

Apple debuts iPhone 11 and iPhone 11 Pro with ultra-wide camera

11 Sep

Apple has unveiled the iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max. All three devices offer a standard 12MP camera plus, for the first time on an iPhone, an ultra-wide 13mm equivalent camera module. In addition to those two cameras, the Pro models also provide a 12MP telephoto camera. The iPhone 11 uses a 6.1″ ‘Liquid Retina’ LCD, while the 11 Pro and Pro Max offer ‘Super Retina XDR’ OLED displays measuring 5.8″ and 6.5″ respectively. All three displays support the P3 wide color gamut and the display of HDR video content.

All three phones offer a main 12MP ‘wide’ camera with a 26mm equivalent F1.8 6-element lens and optical image stabilization. It’s a new sensor, and Apple claims it offers ‘100% focus pixels’, which suggests a dual pixel sensor with split photodiodes.

The iPhone 11, 11 Pro and 11 Pro Max all offer a second 12MP ‘ultra wide’ camera with a 13mm equivalent F2.4 5-element lens, providing a dramatic 120 degree field of view. A new feature uses the ultra-wide camera to show you what’s beyond the frame when using the main camera, helping you decide whether to switch to the wider field of view.

The 11 Pro and 11 Pro Max continue to offer the telephoto camera of previous generations. This is also a new 12MP sensor paired with a faster F2.0 lens with optical image stabilization.

All three iPhones have upgraded video capability, with the rear cameras capable of 4K 60p footage with what Apple is calling ‘extended dynamic range’ (EDR). Optical and digital stabilization work in combination for smooth footage. The front facing camera can record 4K 60p video, and can also record EDR video at 4K 30p. It’s now easier to switch from shooting photos to shooting videos: in stills mode, simply hold down the shutter to start recording a video.

Like previous models, slow motion and time-lapse modes are available. Apple claims the Pro models have all three rear cameras calibrated at the factory, so you can quickly switch between lenses when recording video and retain the same look across all cameras.

The iPhone 11, 11 Pro and 11 Pro Max will be available for pre-order this Friday and will ship September 20th. The iPhone 11 is the most affordable of the bunch, starting at $699; the 11 Pro and Pro Max are priced at $999 and $1099, respectively. Apple says it will keep the XR in the lineup for $599.


 

2019 Guide: Apple iPad Pro for Photographers

25 Aug

Introduction

With the release of every new tablet, photographers peer past their laptops and wonder: could this be the one that lightens my gear load without sacrificing performance? The appeal of a fast and light tablet is seductive, even if you’re not looking to completely replace a desktop or laptop, but tradeoffs have so far made it a difficult choice.

2019 iPad Pro key specs:

  • Resolution of 2388×1668 pixels (11″ model), 2732×2048 pixels (12.9″ model)
  • 64GB, 256GB, 512GB, 1TB storage capacity options
  • 64-bit A12X chip
  • USB-C connection
  • 468g (11″ model), 632g (12.9″ model)

Apple’s latest iPad Pro models boast impressive hardware that’s making them competitive alternatives. Depending on what you need to do, though, the software still isn’t quite there yet. Partly that’s due to limitations imposed by Apple and iOS, but it’s also because third-party developers have only recently had the power to build the types of full-blown apps photographers expect.

That said, based on what’s been announced about the next versions of iOS (called iPadOS 13 for the tablets), the iPad Pro will become even more capable when it’s released in the fall. I haven’t run the iPadOS beta on the current iPad Pro because the software is still in development, so I won’t be evaluating any of those features in this article. However, I’ll reference them as needed to talk about some of the current limitations and what to expect later this year.

Performance

During the early days of the iPad, Apple didn’t share all the hardware specifications, preferring to convey the message that the iPad was perfect for anyone’s needs. Processor speed, memory and graphics specs were details for nerds and pros, and the computer industry had become fixated on them. Honestly, Apple was likely deflecting from the fact that the original iPad shipped with just 256 MB of RAM—not terrible for most uses on that first model, but it hampered the machine when working with large image files. It wasn’t until the third-generation iPad that the line crossed into 1 GB of RAM.

Affinity Photo on this latest 12.9-inch iPad Pro isn’t fazed by significant demands

Apple has since returned from that marketing sojourn with a lot to trumpet. The iPad Pro is powered by an Apple-designed, 64-bit A12X Bionic processor with eight cores that balance power and battery life: four high-performance cores and four high-efficiency cores. When needed, all eight cores can be put to use. It includes 4 GB of LPDDR4 RAM, though the model equipped with 1 TB of storage (the model I reviewed) includes 6 GB of RAM. The A12X also includes a 7-core graphics processor that Apple says delivers the same graphics performance as an Xbox One S, and a Neural Engine that processes machine-learning tasks (such as identifying faces in the Photos app).

What does all that mean for photographers? It never feels like I’m waiting for the device to catch up. For example, in my review of Affinity Photo for iPad using a 2016 iPad Pro, I noted, “The tradeoff is that adding several Live Filters will slow down the live rendering performance. I added five Live Filters to a layer to test this, and making subsequent edits did lag significantly.” Affinity Photo on this latest 12.9-inch iPad Pro isn’t fazed by the same demands.

We’ll call it abstract art: a photo with multiple Live Effects applied in Affinity Photo for iPad.

Similarly, making adjustments in Lightroom for iPad is responsive, even on large Raw files created by the Nikon Z7 and Sony a7 III cameras. I threw images at other photo editing apps, such as RAW Power, Pixelmator Photo and Snapseed, and I swear the iPad Pro yawned and asked, “Is that all you’ve got?” (I may have been overcaffeinated at the time.)

Editing in Lightroom for iPad, RAW Power, Pixelmator Photo and Snapseed.

Storage for two of the configurations is roomy enough for photographers generating gigabytes of image and video files. While the 64 GB base model is pretty sparse and the 256 GB level is what I would consider tight, jumping to 512 GB or 1 TB is a lot more workable. Of course, you’ll be paying for the privilege: the 11-inch iPad Pro with 1 TB costs $1,549. Depending on the speed of your Internet connection and your comfort level with cloud services, even the 256 GB configuration is workable if you’re using Apple’s Photos or Lightroom for iPad, thanks to their ability to temporarily delete originals to conserve space and re-download them as needed.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on 2019 Guide: Apple iPad Pro for Photographers

Posted in Uncategorized

 

Chief Design Officer Jony Ive is leaving Apple to start new design firm

03 Jul
Jony Ive and Tim Cook at the September 2018 launch of iPhone XR | Photo provided by Apple

Apple has announced that Jonathan ‘Jony’ Ive, Apple’s chief design officer, is leaving the company ‘later this year to form an independent design company which will count Apple among its primary clients.’

Ive’s departure is a monumental one considering the impact he’s had on Apple products since he was hired in September 1992. From the original, iconic iMac to Apple’s new Apple Park headquarters in Cupertino, California, which is estimated to have cost upwards of five billion dollars to construct, Ive has played a critical role in the look and—arguably more importantly—feel of Apple products for more than two decades.

No one person will assume the role of chief design officer, a title more or less created for Ive. Instead, as Apple explains below, two other design team leaders within Apple are expected to fill the void Ive is leaving behind¹:

Design team leaders Evans Hankey, vice president of Industrial Design, and Alan Dye, vice president of Human Interface Design, will report to Jeff Williams, Apple’s chief operating officer.

It’s unknown how much interaction will occur between Apple and Ive’s new design firm, but John Gruber of Daring Fireball says:

This angle that he’s still going to work with Apple as an independent design firm seems like pure spin. You’re either at Apple or you’re not. Ive is out.

Gruber also points out that despite Ive leaving Apple, there’s little doubt his impact on Apple will remain for years to come, regardless of the role his design firm will have with the design of Apple products:

Apple’s hardware and industrial design teams work so far out that, even if I’m right and Ive is now effectively out of Apple, we’ll still be seeing Ive-designed hardware 5 years from now. It is going to take a long time to evaluate his absence.

Time will tell the impact Ive’s departure will have on Apple and its products.


¹ AppleInsider has a great run-down of who Evans Hankey and Alan Dye are.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Chief Design Officer Jony Ive is leaving Apple to start new design firm

Posted in Uncategorized

 

Apple smartwatch patent hints at future Apple Watch models with built-in cameras

27 Jun

The United States Patent and Trademark Office (USPTO) has published an Apple patent detailing a method for adding a camera to future Apple Watch models. Rather than packing the camera module into the smartwatch body like some competing models, Apple’s design embeds the camera in a flexible, adjustable strap at the end of the watch band.

Based on images included with the patent, Apple envisions a smartwatch camera that is hidden out of sight against the wrist band when not in use. To capture images, the user extends the flexible strap in which the camera is embedded, making it possible to capture selfies without contorting one’s wrist at an uncomfortable angle.

Apple explains in its patent:

‘Such functionality can replace or at least meaningfully augment a user’s existing camera or camera-enabled device (e.g., smartphone, tablet). Such a wearable device that captures images and video may do so via an optical lens integrated into a distal end portion of a watch band that retains the device on a user’s wrist.’

Apple’s design involves a ‘core’ in the camera band that enables it to hold its position at whatever angle the user chooses. The patent indicates that some Apple Watch models may feature two cameras on the flexible band, making it possible to capture scenes both facing toward and away from the user.

In its latest iteration, the Apple Watch enables users to leave their iPhone behind by offering built-in cellular capabilities. The newly published patent indicates Apple views the camera as another possible avenue for expanding the wearable’s independent functionality: users wouldn’t have to choose between snapping images and leaving their iPhone at home.

The patent explains:

‘A smartwatch that has the capability of capturing images and video may provide an opportunity for users to be more reliant on their smartwatch and less reliant on other devices (e.g., smartphones, tablets, digital cameras) to capture images or videos. Thus, a smartwatch with the capability of capturing images or videos may enable a user to forego carrying a smartphone when doing some activities, especially activities or environments where it would be difficult to take a smartphone (e.g., hiking, running, swimming, surfing, snowboarding, and any number of other situations).’

The patent raises the question of whether camera functionality is something consumers truly want from a smartwatch. Though it would be convenient for taking photos discreetly (that is, without pulling out a phone), the image quality would likely be considerably lower than what an iPhone can capture. A camera positioned at the end of a thin extended band on one’s wrist would also likely suffer from blur caused by slight tremors and other movements.

As with any patent, it’s possible Apple will never bring an Apple Watch with built-in cameras to the market.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Apple smartwatch patent hints at future Apple Watch models with built-in cameras

Posted in Uncategorized