
Posts Tagged ‘enables’

New Sony release cable enables dual-shooting with the RX0

23 Feb

In addition to the new HVL-F60RM wireless flash, Sony also debuted a new release cable that might be of interest to owners of the company’s ultra-compact DSC-RX0 sort-of action cam. The VMC-MM2 cable is Sony’s “convenient dual-camera shooting solution” for users who want to shoot with their Sony ILC and RX0 at the same time.

The cable syncs your Sony Alpha (or Cyber-shot) camera with an RX0, enabling simultaneous photo/video capture using only the main camera’s release button. To quote Sony:

This form of dual-camera shooting is especially useful for wedding, event and press conference photographers and journalists. It offers the opportunity to capture multiple perspectives using different angles of view that can be edited and packaged into an impactful series of work.

The new VMC-MM2 release cable will be available starting in April for $50 USD (or €55).

Press Release

Sony Introduces Dual-camera Shooting Solution for RX0 with Launch of new Release Cable

SAN DIEGO, Feb. 22, 2018 – Sony Electronics, a worldwide leader in digital imaging and the world’s largest image sensor manufacturer, has today announced the latest addition to its family of RX0 solutions with the launch of a new Release Cable, model VMC-MM2.

Helping to break down barriers to shooting style and image expression, the VMC-MM2 is a new solution for convenient dual-camera shooting, freeing the user to capture two different perspectives simultaneously.

The ultra-compact dimensions and superb image quality offered by the RX0 make it the ideal accompaniment to other cameras for dual-camera capture. By mounting an RX0 to the Multi Interface Shoe™ 1 or a bracket/rig, users can use the RX0 to shoot high quality images concurrently with their main Sony α™ or Cyber-shot® camera body2. The VMC-MM2 cable realizes simultaneous photo/movie shooting3 with just a single press of the main camera’s release button. This enables the user to capture one moment in two different ways, with a variation of angle of view, depth of field or frame rate. The cable also has a coiled design with a right-angle connector to minimize clutter and keep it clear of the EVF during shooting.

This form of dual-camera shooting is especially useful for wedding, event and press conference photographers and journalists. It offers the opportunity to capture multiple perspectives using different angles of view that can be edited and packaged into an impactful series of work.

Pricing and Availability

The new VMC-MM2 will be available in North America in April 2018, priced at approximately $50 US or $60 CA.


1 Shoe Mount not included

2 Refer to the Sony support page for camera compatibility information http://www.sony.net/acc/mm2/

3 To synchronise movie REC/STOP, the main camera must assign “Movie w/ shutter” to its release button

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on New Sony release cable enables dual-shooting with the RX0

Posted in Uncategorized

 

Google enables HDR+ for Instagram and other apps on Google Pixel 2

07 Feb

Google’s latest-generation Pixel 2 smartphones come with the built-in Visual Core dedicated imaging processor that powers the HDR+ mode’s sophisticated multi-frame-stacking computational imaging functions and other camera features. However, Visual Core wasn’t activated when the Pixel 2 devices first launched, and was only enabled for developers in November of last year.

The latest Android update now brings the power of Visual Core to all Pixel 2 users – an update smartphone photographers should be very excited about.

This update mainly means that Google’s excellent HDR+ mode is now available in all apps that call the camera and target API level 26, not just Google’s own camera app. According to Google, this includes popular examples such as Instagram, WhatsApp and Snapchat, but we hope it also covers some of the powerful third-party camera apps available on Google Play.

Previously, those apps relied on a much more basic camera API that could not produce the same image quality as HDR+.

The Android update for the Google Pixel 2 will be rolling out over the next few days, along with other software improvements, so make sure you install the newest version as soon as it becomes available to take full advantage of the phone’s camera capabilities.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Google enables HDR+ for Instagram and other apps on Google Pixel 2

Posted in Uncategorized

 

Metabones enables 10 fps shooting with AF for Canon glass on Sony a9

28 Jun

If you were disappointed by reports that the Sony a9 struggles with long adapted Canon lenses, you might be able to take some comfort from Metabones’ latest firmware update. The update for EF-E Smart Adapter Mark IV/V and EF-E Speed Booster Ultra adds autofocus support for medium and high burst modes on the Sony a9. However, since adapted lens support maxes out at 10 fps with AF, high burst mode simply runs at medium speeds (10 fps electronic, 5 fps mechanical).

We’ve had a chance to give this update a go with a number of Canon-mount lenses (including Sigma lenses), and are impressed with the results: with wider lenses (85mm and wider), you get phase-detect AF over most of the frame at 10 fps in Wide and Flexible Spot modes. With longer lenses (70-200/2.8, 100-400/4.5-5.6), focus starts to falter outside of the central region – something that doesn’t happen with native E-mount lenses. In L drive mode (3 fps), the camera opens up the aperture in between shots – both for adapted and E-mount lenses – allowing the camera to continue focusing beyond F11 (at frame rates higher than 3 fps, the camera reverts to manual focus at apertures smaller than F11, with both adapted and native lenses).

In manual focus mode, you can shoot up to 20 fps with adapted lenses. This is quite an impressive update for the Metabones adapter, and we’ve confirmed it to function significantly better with the a9 than the Sigma adapter (which has yet to issue a firmware update for the a9).

The firmware is available for download now from Metabones.

Firmware upgrade for EF-E Smart Adapter™ MARK IV/V and EF-E Speed Booster™ ULTRA

RELEVANT PRODUCTS

This information is for the following models:

  • EF-E Smart Adapter™ MARK IV/V (model number MB_EF-E-BM4 / MB_EF-E-BT4 / MB_EF-E-BT5)
  • EF-E Speed Booster™ ULTRA (model number MB_SPEF-E-BM2 / MB_SPEF-E-BT2 / MB_SPEF-E-BT3)

ABOUT THIS DOWNLOAD

  • Name: Firmware update V0.57 for EF-E Smart Adapter™ MARK IV/V and EF-E Speed Booster™ ULTRA
  • Release date: 26 Jun 2017
  • Benefits and improvements:
    – Added autofocus support during high speed and medium speed continuous drive (up to 10fps) on Sony A9 (“Green” mode only). Experiment with the “Priority Set in AF-C” setting for the best compromise between hit rate and frame rate for your shooting style. Overall performance depends on the lens used. The camera does not hunt while tracking is in operation. If subject movement exceeds the measurement range of the OSPDAF sensor, autofocus pauses. This is by design. The measurement range of the OSPDAF sensor decreases as the focal length increases. Except for the original Mark I Smart Adapter, this feature is available for all subsequent Speed Boosters and Smart Adapters.
    – Enlarged PDAF area on supported cameras when adapter is in Advanced mode, with the advisory that AF performance may be unsatisfactory outside of the central portion of the frame.
    – Enabled AF illuminator (Advanced mode only).
    – There is an AF accuracy issue when using AF-S or DMF on Sony A9 and telephoto lenses with Metabones in “Advanced” mode, which affects this and all previous firmware versions. Green mode, which is set by default on Sony A9, is not affected (except for the original Smart Adapter Mark I, which does not support “Green” mode). A9 users are advised to not use “Advanced” mode but stick with the default “Green” mode. In addition, some telephoto lenses rarely exhibit this issue, such as EF 200/2.8L II USM, EF 400/5.6L USM and Tamron 150-600/5-6.3 VC USD A011. Investigation of this issue is still in progress.
    – Fixed AF issue with EF-S 10-18mm f/4.5-5.6 IS STM and EF-S 18-135/3.5-5.6 IS Nano USM lenses.
    – Fixed smooth iris support for 40/2.8 STM, 50/1.8 STM and Sigma 50-100/1.8 DC HSM Art 016.
    – Fixed CN-E 18-80 T4.4 L IS KAS S servo zoom used by the camera’s zoom rocker and the lens’ rocker in alternation.
    – Fixed CN-E 18-80 T4.4 L IS KAS S auto iris when adapter is in Green mode, where extremely bright conditions no longer cause the iris to close completely.
    – Fixed aperture display with Canon EF 300mm f/4L IS USM lens and Kenko Pro 300 teleconverter.
    – Corrected W-T zoom scale display in “Advanced” mode for Speed Booster and Kenko Pro 300 teleconverter (except Mark I/II/III and original Speed Booster).
    – Faster aperture diaphragm for still photography in Advanced mode when Live View mode is set to Setting Effect OFF.
    – LED (if available) now shows solid magenta when adapter is connected to USB waiting for Metabones App to run.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Metabones enables 10 fps shooting with AF for Canon glass on Sony a9

Posted in Uncategorized

 

Daredevil Santa: Human Flying Drone Enables Sky-High Snowboarding Tricks

24 Dec

[ By SA Rogers in Gadgets & Geekery & Technology. ]


“This isn’t fake, I promise,” said filmmaker Casey Neistat as he announced the impending debut of his ‘Human Flying Drone Holiday Movie’ on Twitter with a dubious-looking graphic. Anyone who saw that tweet could be forgiven for their skepticism, especially since Neistat was teaming up with fellow YouTube star Jesse Wellens of the channel PrankvsPrank to pull off the stunt. But by all accounts, this footage of the ‘world’s largest homemade drone’ is real, and a Santa-suited Neistat is actually flying 25 feet in the air.


No one in the world sells a drone that can lift a human being, so Neistat and his team set out to create one. The octocopter drone, which is augmented with a Samsung Galaxy Gear 360 action camera, reportedly took over a year to build, and the video clip was shot at a ski resort in Finland over the course of four days. In it, the daredevil YouTuber zooms down a slope on a snowboard and then takes off into the sky, going higher and higher before the final jump takes him 100 feet into the air, as smoke bombs fastened to his feet emit vivid pink plumes.


One thing that’s not quite what it seems is Neistat’s single-handed grip on the handle: he’s actually securely fastened to the drone, dubbed ‘Janet,’ by a body harness. The rest of it, as far as anyone can tell, is legit. Looks like fun! Check out how it’s done in the behind-the-scenes video above.


 
Comments Off on Daredevil Santa: Human Flying Drone Enables Sky-High Snowboarding Tricks

Posted in Creativity

 

Eye-Plug camera dongle enables Android phones to record 3D video

08 Jun

At Computex in Taipei recently, Chinese company Weeview Inc. showcased a USB-C dongle that adds another camera to an Android phone, enabling it to record stereoscopic 3D videos. Called Eye-Plug, this camera dongle records content simultaneously with either the rear or front-facing camera integrated in the handset; because it uses USB-C, the dongle can be inserted in either direction. 

A prototype version of Eye-Plug was demonstrated at Computex and, according to Engadget, the company will begin production on a commercial version later this year. Weeview plans to sell Eye-Plug for $35 and to eventually produce an iPhone version; it isn’t clear whether a mini USB model will also be produced or if it will remain limited to USB-C devices (which is a pretty small group of phones at this point).

No information on the dongle camera’s resolution has been provided, though the product video below shows notable differences in quality and white balance between Eye-Plug’s footage and footage recorded with the handset’s own camera. It isn’t clear whether the company’s app will adjust the footage in post-processing to correct this issue. The video below suggests the app offers still image editing tools for selective adjustments made possible by layering two images taken simultaneously.

Availability for Eye-Plug was not provided by the company.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Eye-Plug camera dongle enables Android phones to record 3D video

Posted in Uncategorized

 

Sony enables XAVC S recording to SDHC card with a7R II and a7S II firmware update

28 Apr

Newly released firmware updates for the Sony a7R II and a7S II enable XAVC S format video recording to SDHC memory cards. Previously, XAVC S format video could only be recorded to an SDXC card. Sony makes a couple of notes on the use of SDHC cards for XAVC S video – any recording larger than 4GB will be split into multiple files to comply with the 4GB maximum file size of the FAT32 file system used by SDHC cards. Cards must also be at least SD Speed Class 10 and UHS Speed Class U1 or faster, and video recorded at 100Mbps or more requires a UHS Speed Class U3 card.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Sony enables XAVC S recording to SDHC card with a7R II and a7S II firmware update

Posted in Uncategorized

 

Change of focus: 755 MP Lytro Cinema camera enables 300 fps light field video

12 Apr

Lytro is bringing its Light Field technology to the world of cinema and visual effects, shortly after its CEO announced in a blog post Lytro’s intention of abandoning the consumer stills camera space. Lytro Cinema turns every frame of live action into a 3D model, capturing intensity, color, and angular information of light rays. Coupling light field with a 755 MP sensor capable of capturing images at 300 fps, Lytro Cinema promises extensive post-production freedom, including adjustment of focus, depth-of-field, shutter speed and frame rate, as well as the elimination of green screens.

Although Lytro experienced some difficulty in driving adoption of light field technology in stills, the technology had, and continues to have, immense potential for imaging. Saving creative decisions for post-processing allows for more creative freedom, and allows a photographer or DP to focus on other elements during capture. Nowhere will this be more appreciated than in cinema, where the realities of production mean that any technology aimed at saving certain creative decisions, like focus, for post-capture is welcome.

Focus and aperture sliders in post-production. In video. No joke. I wish my Raw converter had this (Lytro’s Raw converter already does). Photo credit: Lytro

And that’s exactly what Lytro Cinema aims to do. By capturing directional information about light rays and essentially sampling multiple perspectives behind the aperture, Lytro Cinema allows for adjustment of focus placement, depth-of-field (via aperture adjustment), perspective, and more in post-processing. And since a depth map is rendered for every frame of video, Lytro claims Cinema will make it easier to combine CGI with live footage, no longer requiring green screens to extract elements or subjects from a scene. You’ll be able to just extract a subject based on its depth, which Lytro shows in a convincing example below:

Since light field cameras effectively sample multiple perspectives from behind the lens, you can even simulate camera movement as if the camera had been moved on set. The degree of motion is of course limited, but the technique can be very effective, as demonstrated in this haunting music video shot entirely on the stills-focused Lytro Illum. As Lead Engineer for Light Field Video Brendan Bevensee explains: “You have a virtual camera that can be controlled in post-production.” That means there’s also nothing stopping one from simulating short dolly motion or perspective shifts in post, with nothing but a static camera at the time of capture. “You can shift the camera to the left… [or] to the right, as if you had made that exact decision on set. It can even move your camera in and out,” says Head of Light Field Video Jon Karafin.

Imagine small, smooth, meditative camera movements that don’t even require a complicated motion rig to set up.

Furthermore, by precisely recording X, Y, Z, pitch, roll, and yaw, Lytro Cinema even offers automated camera tracking, which makes it easier to composite and matte CGI elements. And just as the Illum paired with Lytro Desktop software allowed one to select various objects and depths to throw them in and out of focus for selective depth-of-field and background blur, one can do the same in video with the Cinema, choosing, for example, to marry live footage from minimum focus to, say, 10m with different footage, or CGI, for everything beyond those distances. In other words, control over not just single planes, but ranges of planes.
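To make the depth-keying idea concrete, here is a rough sketch of our own (the function name and parameters are invented for illustration; this is not Lytro’s actual tooling): given a per-pixel depth map, a matte can be built from a range of distances rather than from a background color.

    import numpy as np

    def depth_key(rgb, depth, near_m=0.3, far_m=10.0, feather_m=0.25):
        """Build a soft matte selecting everything between near_m and far_m meters.

        rgb:   (H, W, 3) float image with values in [0, 1]
        depth: (H, W) per-pixel depth in meters (light field video yields one map per frame)
        Returns the matte and the pre-multiplied foreground, ready to composite over CGI.
        """
        # Ramp up at the near limit and ramp down at the far limit to get soft matte edges.
        inside = (np.clip((depth - near_m) / feather_m, 0.0, 1.0)
                  * np.clip((far_m - depth) / feather_m, 0.0, 1.0))
        matte = inside[..., None]  # shape (H, W, 1) so it broadcasts over the RGB channels
        return matte, rgb * matte

    # Compositing over a CGI background would then be:
    #   output = foreground_premultiplied + (1.0 - matte) * cgi_background

Lytro’s real pipeline is certainly far more sophisticated, but this is essentially what keying on depth instead of on a green screen means.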

Beyond just light field benefits, Lytro is also addressing another common headache: the selection of shutter angle (or shutter speed). This is normally a decision made at the time of capture, dictating the level of blur or stuttering (a la the action scenes in ‘Saving Private Ryan’ or ‘Gladiator’) in your footage. At high capture frame rates, though, exposure times are necessarily short – at 300 fps a frame cannot be exposed for longer than 1/300 sec, which inevitably freezes action – removing much of the flexibility over how much motion blur you can or can’t have. By decoupling the shutter angle at capture from the shutter angle required for artistic effect, a DP can creatively use motion blur, or lack thereof, to suit the story. The technology, which undoubtedly uses some form of interpolation and averaging in conjunction with the temporal oversampling, also means that you can extract stills with a desired level of motion blur.

Lytro claims that by capturing at 300 fps, they can computationally allow for any of a number of shutter angles in post-production, allowing a cinematographer to decouple shutter angle required for capture from that required for artistic intent. Photo credit: Lytro
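As a back-of-the-envelope illustration of how this might work (our own sketch of the general idea, not Lytro’s actual processing), averaging runs of consecutive 300 fps frames approximates a longer exposure at a lower delivery frame rate – a 180° shutter at 24 fps is a 1/48 sec exposure, or roughly 300/48 ≈ 6 consecutive capture frames:

    import numpy as np

    def synthesize_shutter_angle(frames, capture_fps=300.0, target_fps=24.0,
                                 shutter_angle_deg=180.0):
        """Approximate a chosen shutter angle by box-averaging temporally oversampled frames.

        frames: ndarray of shape (n, H, W, 3) captured at capture_fps.
        Exposure per output frame = (shutter_angle_deg / 360) / target_fps seconds.
        """
        exposure_s = (shutter_angle_deg / 360.0) / target_fps
        frames_per_exposure = max(1, int(round(exposure_s * capture_fps)))
        step = capture_fps / target_fps  # capture frames between successive output frames

        output, start = [], 0.0
        while int(start) + frames_per_exposure <= len(frames):
            window = frames[int(start):int(start) + frames_per_exposure]
            output.append(window.mean(axis=0))  # a simple box average stands in for motion blur
            start += step
        return np.stack(output)

A real implementation would interpolate between samples rather than simply box-average them, but the arithmetic shows why a 300 fps capture leaves room to choose almost any shutter angle afterwards at common delivery frame rates.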

With every development over at Lytro, we’ve been excited by the implications for both stills and video. The implications for the latter, in particular, have always been compelling. Along with the announcement of the Lytro Immerge 360º virtual reality light field rig, we’re extremely excited to see light field video becoming a reality, and look forward to what creatives can produce with what is poised to be an unimaginably powerful filmmaking platform. Filmmakers can sign up for a demonstration and a personalized production package on Lytro’s site. For now, Lytro Cinema will be available on a subscription basis, understandable given the complexities involved (the immense data capture rates require servers on-set).

Head over to the Lytro Cinema page for more in-depth information. Lytro will be demoing “Life”, a short film shot using Lytro Cinema, at NAB 2016.

Lytro Brings Revolutionary Light Field Technology to Film and TV Production with Lytro Cinema

  • World’s First Light Field Solution for Cinema Allows Breakthrough Creative Capabilities and Unparalleled Flexibility on Set and in Post-Production
  • First Short Produced with Academy Award Winners Robert Stromberg, DGA and David Stump, ASC in Association with The Virtual Reality Company (VRC) Will Premiere at NAB on April 19

Lytro unlocks a new level of creative freedom and flexibility for filmmakers with the introduction of Lytro Cinema, the world’s first Light Field solution for film and television. The breakthrough capture system enables the complete virtualization of the live action camera — transforming creative camera controls from fixed on set decisions to computational post-production processes — and allows for historically impossible shots.

“We are in the early innings of a generational shift from a legacy 2D video world to a 3D volumetric Light Field world,” said Jason Rosenthal, CEO of Lytro. “Lytro Cinema represents an important step in that evolution. We are excited to help usher in a new era of cinema technology that allows for a broader creative palette than has ever existed before.”

Designed for cutting edge visual effects (VFX), Lytro Cinema represents a complete paradigm shift in the integration of live action footage and computer generated (CG) visual effects. The rich dataset captured by the system produces a Light Field master that can be rendered in any format in post-production and enables a whole range of creative possibilities that have never before existed.

“Lytro Cinema defies traditional physics of on-set capture allowing filmmakers to capture shots that have been impossible up until now,” said Jon Karafin, Head of Light Field Video at Lytro. “Because of the rich data set and depth information, we’re able to virtualize creative camera controls, meaning that decisions that have traditionally been made on set, like focus position and depth of field, can now be made computationally. We’re on the cutting edge of what’s possible in film production.”

With Lytro Cinema, every frame of a live action scene becomes a 3D model: every pixel has color and directional and depth properties bringing the control and creative flexibility of computer generated VFX to real world capture. The system opens up new creative avenues for the integration of live action footage and visual effects with capabilities like Light Field Camera Tracking and Lytro Depth Screen — the ability to accurately key green screens for every object and space in the scene without the need for a green screen.

“Lytro has always been a company thinking about what the future of imaging will be,” said Ted Schilowitz, Futurist at FOX Studios. “There are a lot of companies that have been applying new technologies and finding better ways to create cinematic content, and they are all looking for better ways and better tools to achieve live action highly immersive content. Lytro is focusing on getting a much bigger, better and more sophisticated cinematography-level dataset that can then flow through the VFX pipeline and modernize that world.”

Lytro Cinema represents a step function increase in terms of raw data capture and optical performance:

  • The highest resolution video sensor ever designed, 755 RAW megapixels at up to 300 FPS
  • Up to 16 stops of dynamic range and wide color gamut
  • Integrated high resolution active scanning

By capturing the entire high resolution Light Field, Lytro Cinema is the first system able to produce a Light Field Master. The richest dataset in the history of the medium, the Light Field Master enables creators to render content in multiple formats — including IMAX®, RealD® and traditional cinema and broadcast at variable frame rates and shutter angles.

Lytro Cinema comprises a camera, a server array for storage and processing (which can also be done in the cloud), and software to edit Light Field data. The entire system integrates into existing production and post-production workflows, working in tandem with popular industry standard tools. Watch a video about Lytro Cinema at www.lytro.com/cinema#video.

“Life” the first short produced with Lytro Cinema in association with The Virtual Reality Company (VRC) will premiere at the National Association of Broadcasters (NAB) conference on Tuesday, April 19 at 4 p.m. PT at the Las Vegas Convention Center in Room S222. “Life” was directed by Academy Award winner Robert Stromberg, Chief Creative Officer at VRC and shot by David Stump, Chief Imaging Scientist at VRC.

Learn more about Lytro Cinema activities during the 2016 NAB Show and get a behind-the-scenes look on the set of “Life” at www.lytro.com/nab2016.

Lytro Cinema will be available for production in Q3 2016 to exclusive partners on a subscription basis. For more information on Lytro Cinema, visit www.lytro.com/cinema.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Change of focus: 755 MP Lytro Cinema camera enables 300 fps light field video

Posted in Uncategorized

 

DxO ONE update enables framing assist via the camera’s OLED monitor

11 Mar

A recent update for the DxO ONE has introduced framing assistance via the camera’s built-in OLED when the device is used in standalone shooting mode. A monochrome live image preview is displayed on the camera’s small rear screen to improve the experience of using the camera without connection to an iPhone. The camera is also offered at a lower $499 price point, without software bundled.

Firmware 1.3 also introduces a motion blur alert feature, as well as a modified interface for selecting white balance, metering and focus mode. When sharing photos, you’ll now see a visual confirmation of a successful upload, and JPEG compression level can be specified. 

The app update is available now for free through the App Store, and camera firmware can be updated through the camera itself. The DxO ONE is available now for $499.


Press release:

DxO ONE now features a dramatically enhanced stand-alone experience

DxO unbundles desktop software to make the camera available at a new low price of just $499

PARIS, March 2, 2016 /PRNewswire/ — DxO announced today the immediate availability of yet another ground-breaking update to the award-winning DxO ONE professional quality connected camera for iPhone® and iPad®. The version 1.3 update, available for free via the iTunes App Store, introduces several new features that further extend the use of the DxO ONE, including the ability to use the OLED display as a novel framing assistant to help quickly compose while operating the camera with one hand. Additionally, DxO has unbundled their desktop software from the package (DxO FilmPack and DxO OpticsPro now sold separately), enabling even more photographers to get their hands on the revolutionary DxO ONE camera at a new low price of just $499.

“That is one trippy amazing viewfinder — love it!” said award-winning photographer, John Stanmeyer. “Even more wonderful, in very low light, the ONE handled all the complexities of ISO, focus, etc., instantly. Amazing. Perfectly fine for those rapid moments when you want to make an image, a RAW high res file, in any lighting conditions we’re placed in.”

Version 1.3, the second major upgrade to date, enables the DxO ONE to be used as a miniaturized pro-quality camera that is smaller, easier, and faster to shoot than any other camera on the market. To quickly capture life’s fleeting moments, simply pull the DxO ONE out of your pocket or purse, and in one movement, slide the lens cover open, compose the scene using the OLED display as a framing assistant, then depress the two-stage physical shutter button to lock focus and grab the shot. In stand-alone mode, the DxO ONE provides a fun, retro-style of photographing without “chimping,” and makes browsing newly captured images a surprising and delightful experience.

Best of all, when using the DxO ONE in stand-alone mode, all of your preferred camera settings for aperture, shutter speed, ISO, metering, white balance, etc. are preserved, exactly as you set them in the iOS app. For example, if you prefer to capture portraits at f/1.8, the camera will always be ready at f/1.8 when you pull it out of your pocket. And because the DxO ONE has a physical shutter button, it works even if you’re wearing gloves. So when you’re on the slopes, set the camera to 1/4000s (or higher), then when you pull the camera out of your ski jacket the DxO ONE is immediately ready to freeze fast action.

“During an assignment for Rolls-Royce Motor Cars I had the misfortune of seriously injuring myself during a biking accident,” said Robert Leslie, professional photographer and amateur cyclist. “Much to my client’s surprise I was able to complete the studio session and capture some incredible images while using the DxO ONE in the new stand-alone mode. Now what other camera in the world lets you do a professional shoot whilst your arm is in a sling with a broken collar bone?”

Version 1.3 also introduces a host of other features including motion blur alert, and an elegant new way to dial in white balance, metering and focus modes, which can also be viewed as overlays in the viewfinder along with your iPhone battery level. Browsing photos is faster than ever, with the gallery now sorted in the same order as in iOS Photos. You can be sure your images were successfully shared thanks to a new visual confirmation message, and you can set a preferred JPEG compression level for photos, and bitrate for videos. Of note, an innovative Message Center now provides a direct connection to DxO, with in-app access to current information designed to help you get the most out of your DxO ONE.

DxO ONE owners are invited to download and install version 1.3, which is available as a free update via the iTunes App Store. New firmware, also immediately available, can be downloaded to the iPhone and installed on the DxO ONE with a simple tap.

Pricing & Availability

The DxO ONE Miniaturized Pro Quality Camera™ for iPhone® and iPad® is available for purchase at dxo.com, Amazon, Apple online and select Apple stores in the US, B&H and other respected photo retailers for the new low price of $499.

The DxO ONE iOS app and companion Apple Watch app are both available for free via the iTunes App Store. Every purchase of a DxO ONE camera also includes free access to simple, but powerful desktop processing software — DxO Connect for Mac and PC, and the new DxO OpticsPro for OS X Photos. DxO FilmPack and DxO OpticsPro are available for purchase separately.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on DxO ONE update enables framing assist via the camera’s OLED monitor

Posted in Uncategorized

 

Laser Precision: 3D Site Scan Enables Architectural Intervention

09 Mar

[ By WebUrbanist in Architecture & Houses & Residential. ]


Incredibly accurate laser-scanning technology, precise down to a hundredth of a millimeter, has helped British architects not only plan a new structure but also secure permission from a local planning commission. Their proposed Rock House, now approved, preserves both natural and architectural features currently on a government-protected site.


Cornish firm Poynton Bradbury Wynter Cole (PBWC) Architects enlisted CESurveys to lidar-scan the existing property, located in a conservation area. Their scanner system fires tens of thousands of laser pulses per second to get precise distance readings on complex terrain. Compared to traditional surveying and site-mapping strategies, this approach is much faster, cheaper and more effective.


The results are translated into a three-dimensional model that can be manipulated, showing the effects of site changes or interventions. Scans from around sites are stitched together to form a complete picture.


The resulting models have an array of benefits, including the ability to show approving parties what the impacts of additions and remodels might be on a given property. They also helped the architects, in this case, maintain key lines of sight, such as views out to the sea, and limit the cost of revisiting the site frequently to document additional features. Slices of the scans also make it easy to generate sections and elevations directly from the models.


Applications of lidar scanning go well beyond architecture, too, including the ability to document historic infrastructure and preserve 3D models of fresh crime scenes.


 
Comments Off on Laser Precision: 3D Site Scan Enables Architectural Intervention

Posted in Creativity

 

GoPro and Periscope partnership enables live broadcasting from Hero4 action cams

27 Jan

Live-streamed content is about to get a little more extreme as action cam maker GoPro and live-broadcasting video app Periscope have announced a partnership. Starting today, Periscope users can broadcast live from GoPro’s Hero4 Black and Silver models. With Periscope’s iOS app, users can switch between the camera on their mobile device and a GoPro, enabling a two-camera setup. An iPhone 5s, 6 or 6+ running iOS 8.2 is required. If you’ve got all the necessary equipment, download the latest version of the iOS app to start broadcasting from your GoPro.


Press release:

GOPRO GOES LIVE WITH PERISCOPE

Live storytelling just got more immersive! Beginning today, Periscope users can broadcast live directly from their GoPro HERO4 Black or Silver camera. This innovative integration allows the 10+ million Periscope users to toggle between broadcasting from their iPhone’s camera and their GoPro, directly from the phone screen, with the simple touch of a button.

Much like a production switchboard, you can use your Periscope interface to flip between the two different camera angles, so even if your broadcast is lacking that heart-pounding action only GoPro can capture, you can still set up a two-camera shot for more dynamic storytelling in real time. And yes, your GoPro will still record locally on the microSD card even while broadcasting through Periscope.

Periscope lets you see what’s happening in the world right now, unedited and unfiltered. Integrating GoPro offers Periscope broadcasters a new tool to help tell their stories more creatively, while GoPro content creators now have a new platform and audience with Periscope and Twitter to share their experiences, live!

Live broadcasts from GoPro can now be shared directly to Twitter’s home timeline, enabling GoPro users and Periscope broadcasters to expand their reach to their Twitter fan base. This helps broadcasters cultivate new fans and interact with their audience right from their broadcasts in their home timeline, even after the live broadcast is done.

Pablo Jablonski, Periscope iOS Engineer, said, “As a skier myself, I’ve always loved extreme sports, and I love how GoPro can show us all of the crazy things these athletes can do. As an iOS engineer on Periscope, bringing these two technologies together has been the fulfillment of a longtime personal wish. Starting with X Games and moving forward, Periscope and GoPro together will bring these LIVE moments to all the fans.”

*Broadcast functionality is currently only available on iOS, but users will be able to view GoPro broadcasts from any platform.

*Compatible with iPhone 5s, 6 and 6+ with iOS 8.2

*Integration for use with GoPro HERO4 Black and Silver models

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on GoPro and Periscope partnership enables live broadcasting from Hero4 action cams

Posted in Uncategorized