A newly published Apple patent, filed in September 2019, details a light field panorama camera system seemingly intended for future iPhone and iPad devices. The technology would let the average consumer capture large light field panoramas of a scene by moving their device through a series of gestures. The resulting content could be rendered and viewed on the device itself or through a head-mounted display (HMD), including VR headsets.
According to Patently Apple, which first spotted the patent, Apple details technology that would build on its current AR efforts by enabling its consumer devices to capture complex 3D scenes. To do this, the user would move their light field-equipped iPhone or iPad through a gesture, such as a swooping infinity symbol, capturing light field images of the environment from multiple angles.
A flow chart provided in the patent filing shows the process of capturing, processing and viewing the resulting imagery.
A rendering engine would process the individual images into a 3D panorama with six degrees of freedom (6DOF), made possible by the light field data. As a result, the viewer would be able to look above and behind objects, zoom in on areas of the scene and view the environment from different angles. The patent follows Google’s reported acquisition of light field camera company Lytro in 2018.
Unlike a conventional camera, a light field camera system captures both the intensity of the light in a scene and the direction each light ray is traveling through space. That additional data enables new types of experiences, including the one detailed by Apple.
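To make that distinction concrete, a light field sample can be thought of as a ray rather than a pixel. The Swift sketch below is purely illustrative (the type name is ours, not Apple’s or the patent’s): each sample records where a ray passes, which direction it travels and the radiance carried along it.

```swift
import simd

// Illustrative only: one sample of a light field, i.e. a ray rather than a pixel.
struct LightFieldSample {
    var origin: simd_float3     // point the ray passes through, in scene coordinates
    var direction: simd_float3  // unit vector giving the direction the light travels
    var radiance: simd_float3   // RGB intensity measured along that ray
}

// A conventional camera effectively discards direction, keeping only per-pixel intensity;
// a light field capture keeps both, which is what makes later re-rendering possible.
```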
The patent indicates that Apple’s system may use the sensors in the iPhone and iPad to capture position, motion and other metadata alongside the images, all of which would feed into the final light field panorama. The captured images and metadata could then be used to render different views of the same 3D scene, according to the patent, ultimately giving the user six degrees of freedom to explore the panorama with an HMD such as a VR headset.
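As a rough illustration of what the patent describes, each captured image could be bundled with the pose and timing metadata recorded at the same moment, with the panorama then built from the whole set. The names below are hypothetical and not taken from the patent.

```swift
import Foundation
import CoreVideo
import simd

// Hypothetical structure pairing one captured image with the metadata the patent mentions.
struct CapturedLightFieldFrame {
    var image: CVPixelBuffer        // raw camera frame
    var cameraPose: simd_float4x4   // device position and orientation at capture time
    var timestamp: TimeInterval     // lets motion data be aligned with the image
}

// The capture as a whole is just the ordered set of frames gathered during the gesture;
// a rendering engine would interpolate between them to synthesize new viewpoints.
typealias LightFieldCapture = [CapturedLightFieldFrame]
```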
This would differ substantially from a traditional 360-degree panorama, which is captured from a single point and only allows the viewer to rotate their head within the rendered scene. A light field panorama should appear more realistic: objects stay in their correct relative positions and are rendered from the appropriate angles as the user moves around within the scene.
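The 3DOF limitation of a standard 360-degree panorama shows up directly in how it is sampled: only the view direction matters, so any movement of the viewer’s position is ignored. The helper below (hypothetical, assuming an equirectangular image) makes that explicit; a light field renderer would additionally take the viewer’s position into account.

```swift
import Foundation
import simd

// Hypothetical: sampling a standard equirectangular 360° panorama.
// Only the view direction is used; the viewer's position never appears,
// which is why such panoramas offer 3DOF (rotation) rather than 6DOF.
func panoramaUV(for viewDirection: simd_float3) -> simd_float2 {
    let d = simd_normalize(viewDirection)
    let longitude = atan2f(d.z, d.x)            // -π ... π around the horizon
    let latitude = asinf(d.y)                   // -π/2 ... π/2 up and down
    let u = longitude / (2 * Float.pi) + 0.5    // map to 0 ... 1 texture coordinates
    let v = latitude / Float.pi + 0.5
    return simd_float2(u, v)
}
```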
It’s no secret that Apple has been focusing heavily on augmented reality technologies; its most recent iPad Pro underscores this effort with the inclusion of a LiDAR sensor.
Just a few of the possible movements you could use to capture the scenery using your Apple mobile device.
In its announcement of the 2020 iPad Pro last month, Apple said the new LiDAR sensor ‘delivers cutting-edge depth-sensing capabilities, opening up more pro workflows and supporting pro photo and video apps,’ specifically with augmented reality in mind. The sensor works by measuring the distance to objects up to 5m (16ft) away.
Apple went on to explain:
‘New depth frameworks in iPadOS combine depth points measured by the LiDAR Scanner, data from both cameras and motion sensors, and is enhanced by computer vision algorithms on the A12Z Bionic for a more detailed understanding of a scene. The tight integration of these elements enables a whole new class of AR experiences on iPad Pro.’
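For developers, this fused depth data is surfaced through ARKit. The Swift sketch below is a minimal, illustrative example of reading LiDAR-derived depth alongside the camera pose; it uses the sceneDepth frame semantic, which shipped in a later ARKit release, so treat it as an assumption about how such data is accessed rather than the specific frameworks Apple refers to above.

```swift
import ARKit

// Minimal sketch: read LiDAR-derived depth plus the camera pose on a supported device.
final class DepthCaptureSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires a LiDAR-equipped device such as the 2020 iPad Pro.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }

        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    // Each ARFrame bundles the color image, the fused depth map and the device pose.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap            // per-pixel distance in meters
        let cameraPose: simd_float4x4 = frame.camera.transform  // device position and orientation
        _ = (depthMap, cameraPose)                              // hand off to further processing
    }
}
```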
The future expansion of these capabilities using light field technology wouldn’t be surprising, particularly in light of ongoing rumors that Apple is working on AR/VR gear. That said, as with any patent, it’s possible we’ll never see this technology make its way into a consumer product. As usual, Apple has not commented on the patent.