Posts Tagged ‘Motion’

Sony VENICE motion picture camera firmware v4.0 update brings 4K/120fps

02 Feb

Sony has detailed the upcoming version 4.0 firmware for its VENICE motion picture camera system, adding an optional High Frame Rate (HFR) license for shooting at up to 120fps at 4K 2.39:1 and 60fps at 6K 3:2. When using anamorphic lenses, the VENICE cameras can also shoot at 75fps at 4K 4:3 and 110fps at 4K 17:9. According to Sony, these new frame rate options are ideal for commercial 4K/6K productions and movies, as well as VR work.

Joining the HFR recording is upgraded remote control functionality with Sony’s 700 Protocol, ‘giving filmmakers greater flexibility,’ according to the company. Firmware v4.0 also brings support for ZEISS eXtended Data and Cooke’s most recent /i3 metadata system, progressive HD-SDI output in 25p and 29p, an extended Mask+Line setting in the system’s Frame line set-up, and selectable functions for the DVF-EL200 viewfinder’s assignable buttons.

The version 4.0 firmware update will be free; both it and the optional High Frame Rate license will arrive in June.

January 31, 2019: Sony will be upgrading the capabilities of its next-generation motion picture camera system, VENICE, by introducing High Frame Rate (HFR) shooting, advanced remote-control functionalities and Cooke/i3 and Zeiss extended metadata support, as part of its latest firmware update. Following the recent release of VENICE’s firmware Version 3.0 and the upcoming launch of its Extension System (CBK-3610XS), which was developed in collaboration with James Cameron’s Lightstorm Entertainment and is currently being used to shoot the AVATAR sequels, the latest upgrade will offer filmmakers even greater creative freedom, flexibility and choice.

The new optional High Frame Rate license allows VENICE to shoot at speeds of up to 120fps at 4K 2.39:1 and 60fps at 6K 3:2, as well as up to 110fps at 4K 17:9 and 75fps at 4K 4:3 with anamorphic lenses. The additional frame rates are particularly well-suited to drama, movie and commercial productions in 4K and 6K, as well as 50/60p productions in 6K and VR productions using the large viewing angle of 6K 3:2 at 60p.

All High Frame Rates support X-OCN recording, including the X-OCN XT* format introduced in Version 3.0, and High Frame Rates up to 60fps also support XAVC 4K and ProRes recording.

“At Sony, we pride ourselves on working closely with our customers and partners to create solutions that enable modern filmmakers to bring their vision to reality just the way they intend to. In fact, High Frame Rate shooting was a feature that was frequently requested by our customers. We listened to their feedback and are excited to now offer this feature to all new and existing VENICE users,” explained Theresa Alesso, Vice President and Head of CineAlta for Sony Electronics. “Last year at Cine Gear Expo, we announced that Version 4.0 will include 120fps in 2K. However, we are excited to announce today that, as a result of the hard work of our engineering team, Version 4.0 will now include 120fps in 4K. With firmware Version 4.0, our state-of-the-art VENICE will become even more powerful, fortifying its position as the go-to solution for cinematographers who want to create stunning imagery and capture emotion in every frame.”

Additionally, Version 4.0 of the VENICE firmware will introduce:

  • 700 Protocol – A control protocol developed by Sony to connect VENICE to a remote-control unit (RM-B750 or RM-B170) and an RCP-1500 series remote control panel, giving filmmakers greater flexibility in bringing their visions to life. Further expanding on the camera’s existing remote-control capabilities, the VENICE now offers paint control, iris control, recording start/stop, clip control, and more. The upgraded remote-control function also adds new workflows to extend VENICE’s use in multi-camera and live production settings, such as live concerts and fashion shows.
  • Support for Cooke’s /i third generation metadata Technology, /i3 and ZEISS eXtended Data technology (based on Cooke /i Technology) – Extended lens metadata can now be embedded straight into a RAW/X-OCN/XAVC file and HD-SDI output without the need for additional metadata equipment. The new function allows distortion and shading caused by supported lenses to be easily rectified, significantly reducing post-production costs.

Further features include an extended Mask+Line setting in the Frame line set-up, selectable functions for the assignable buttons of the DVF-EL200 viewfinder and pure Progressive HD-SDI output in 25p and 29p.

Both the free upgrade to firmware Version 4.0 and the optional HFR license will be available in June 2019.

To learn more about VENICE, please join Sony at BSC Expo 2019 in Battersea Evolution, Battersea, London at stand 545 or visit pro.sony/products/digital-cinema-cameras/venice.

*Excluding 6K 3:2 50p/60p

Via: PixelShift

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Sony VENICE motion picture camera firmware v4.0 update brings 4K/120fps

Posted in Uncategorized

 

Go With The Flow – Using Slow Shutter Speed to Create Motion Blur

22 Jan

The post Go With The Flow – Using Slow Shutter Speed to Create Motion Blur appeared first on Digital Photography School. It was authored by Kevin Landwer-Johan.

Starting out as a photography assistant at a daily newspaper, I had one thing drummed into me: make sure it’s sharp. That was the cardinal rule, and it was appropriate for the situation.

Any kind of unintentional fuzziness, especially when it renders the subject indistinct, looks awful when printed on newsprint.

Adding motion blur, or any other form of blur, in a photograph can work extremely well when circumstances are right.

Merlion Park, Singapore Using a Slow Shutter Speed to Create a Sense of Motion

© Kevin Landwer-Johan. Shutter speed: 8 seconds

Two main techniques for creating motion blur in a photo are subject movement and camera movement.

Times when adding motion blur is the right choice

Deciding to add motion blur is best when:

  • Some parts of the composition remain sharp
  • The light is favorable
  • You find the shutter speed sweet spot
  • You have a means of stabilizing your camera
Poi Sang Long Festival Using a Slow Shutter Speed to Create a Sense of Motion

© Kevin Landwer-Johan. Shutter speed: 1/20th of a second.

Adding some flash can at times truly enhance a photo made with a slow shutter speed. I find this works best when you sync your flash with the rear shutter closing.

Keeping some of it sharp

When using a slow shutter speed to create motion blur, I find it’s best to ensure that some parts of your composition remain sharp. Whether you are moving your camera or your subject is in motion, your results will be stronger when not all of the composition is blurred.

Using a slow shutter speed and moving your camera in relation to a moving subject is known as panning. This keeps your subject sharp while the background blurs. Getting a perfectly sharp subject while panning is challenging because it requires the camera to move in sync with the subject’s speed.

Tuktuk Panning Using a Slow Shutter Speed to Create a Sense of Motion

© Kevin Landwer-Johan. Shutter speed: 1/25th of a second.

With your camera locked down while your subject or the background moves, you will have a better chance of rendering your subject sharp.

Getting the exposure when the light is right

Bright sunny days make it challenging to capture motion blur in a photograph. You need to use a slow shutter speed for the effect to happen. Setting your aperture to the smallest opening and your ISO as low as it can go will not always allow you to use a slow enough shutter speed.

Using a neutral density filter in bright sunshine will make a slower shutter speed possible. At times I have coupled a neutral density filter with a polarizing filter to cut the light entering the lens even more.

Market Scene Using a Slow Shutter Speed to Create a Sense of Motion

© Kevin Landwer-Johan. Shutter speed: 1/4th of a second.

In this photo I had my friend stand very still to achieve motion blur in the people walking behind her. It was such a bright, sunny day that even with my aperture set to its minimum opening of f/11 and my ISO at 100, I still could not use a slow enough shutter speed. I attached a four-stop neutral density filter and a polarizing filter so I could set my shutter speed to 1/4th of a second and capture the motion blur.
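If you want to work out the required filtration before you shoot, the arithmetic is simple: each stop of neutral density (and roughly one to two stops for a polarizer) doubles the exposure time. Here is a minimal sketch of that calculation; the 1/60th of a second metered base exposure and the 1.5-stop polarizer figure are assumptions for illustration, not values from this shoot.

```python
# Rough shutter-speed calculator for stacked filters (illustrative only).
# Each stop of filtration halves the light, so the exposure time doubles per stop.

def shutter_with_filters(base_shutter_s: float, nd_stops: float, polarizer_stops: float = 0.0) -> float:
    """Return the new exposure time after adding ND and polarizing filters."""
    return base_shutter_s * 2 ** (nd_stops + polarizer_stops)

# Hypothetical metered base exposure of 1/60s at f/11, ISO 100:
print(shutter_with_filters(1 / 60, nd_stops=4))                       # ~0.27s, about 1/4th of a second
print(shutter_with_filters(1 / 60, nd_stops=4, polarizer_stops=1.5))  # ~0.75s with a typical polarizer
```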

At night and in other low light situations achieving a slow enough shutter speed is simple.

Finding the sweet spot for optimal blur

Choosing a shutter speed setting appropriate to the pace of movement in your composition is important. Having too much or not enough motion blur will give you a poor result. This varies greatly depending on your subject and the style of photograph you are creating.

Photographing waterfalls, people walking, or traffic at night all require different shutter speeds for the best results. Generally, slower-moving elements in your composition need slower shutter speeds. Things moving more quickly need a faster shutter speed, or there will be too much motion blur. It also depends on how much definition you want to retain in whatever is moving.
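As a rough way to think about that sweet spot, you can estimate how long a streak a moving subject will leave on the sensor. The sketch below uses a simple thin-lens approximation; the walking speed, distance, focal length and pixel pitch are assumed values chosen for illustration, not figures from the article.

```python
# Back-of-the-envelope motion-blur estimate (thin-lens approximation, illustrative only).

def blur_pixels(speed_mps: float, exposure_s: float, focal_mm: float,
                distance_m: float, pixel_pitch_um: float) -> float:
    """Approximate streak length, in pixels, left by a subject moving
    perpendicular to the lens axis during the exposure."""
    image_shift_mm = speed_mps * exposure_s * focal_mm / distance_m  # displacement on the sensor
    return image_shift_mm * 1000 / pixel_pitch_um

# Hypothetical pedestrian: ~1.4 m/s, 5 m away, 35mm lens, 6-micron pixels
print(blur_pixels(1.4, 1 / 10, 35, 5, 6))   # ~163 px of blur at 1/10th of a second
print(blur_pixels(1.4, 1 / 100, 35, 5, 6))  # ~16 px at 1/100th, which can just read as a mistake
```

Doubling the exposure time doubles the streak length, so even small changes in shutter speed make a visible difference.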

Twenty Second Waterfall Using a Slow Shutter Speed to Create a Sense of Motion

© Kevin Landwer-Johan. Shutter speed: 20 seconds.

Flowing water, like in this waterfall, can be completely blurred. In fact, waterfall photos usually look best when a shutter speed of more than two seconds is used. I used a twenty-second exposure for this photo and there’s absolutely no definition in the water. It is still obvious what it is though.

On The Sidewalk Using a Slow Shutter Speed to Create a Sense of Motion

© Kevin Landwer-Johan. Shutter speed: 1/10th of a second.

Keeping the motion blur balanced is more important with some subjects. For this photo of people on a sidewalk in Bangkok, I chose a shutter speed of 1/10th of a second. A slower shutter speed would mean more blur and less definition. A faster speed would show less blur and might just look like a mistake. I was happy to capture an image where the people walking are blurred yet their feet are reasonably sharp. The young woman modeling for me was very patient, as it took quite a while to make a composition with the right number of pedestrians in my frame.

Experimentation is key to finding the sweet spot with your shutter speed. You need to decide how clear or how blurred you want your subject and other elements in your composition.

Camera stability is important

You can use a slow shutter speed even if you do not have a tripod. Learn to hold your camera well and be in control of it. I do not often carry a tripod so am forced to use alternative means of preventing unwanted camera movement. Unintentional camera movement creates ghosting which introduces extra fuzziness to photos.

Flames Using a Slow Shutter Speed to Create a Sense of Motion

© Kevin Landwer-Johan. Shutter speed: 1/20th of a second.

Some photographers prefer hand holding the camera while panning rather than using a tripod. Keeping a steady movement along with your subject is what’s most important. If you are panning with a passing vehicle, you do not want to be jiggling your camera up and down as you track your subject.

Finding a firm surface to place your camera can be a good substitute when you don’t have a tripod. You may need to place something under the lens so your angle of view is level. I find my mobile phone or wallet often come in handy for this.

Using a tripod does make things more straightforward when using a slow shutter speed. With a tripod, you have more stability and often more control of your angle of view.

Coffee Roasting Using a Slow Shutter Speed to Create a Sense of Motion

© Kevin Landwer-Johan. Shutter speed: 1/4th of a second.

Introducing rear curtain synchronized flash

Many cameras give you an option to synchronize the flash so it fires just before the shutter closes. Doing this combined with a slow shutter speed and movement produces interesting effects.

As the flash is triggered near the end of the exposure, the movement appears partially frozen. If you use a very slow shutter speed when there is fast movement, your subject may appear semi-transparent.
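A tiny sketch makes the difference between rear- and front-curtain sync easier to picture: ambient light records the whole blur trail either way, but the flash decides where along that trail the sharp image lands. The exposure time and subject speed below are assumed numbers for illustration, not tied to any particular camera.

```python
# Illustrative comparison of rear- vs front-curtain flash sync (assumed example values).

def blur_trail_and_flash(exposure_s: float, speed_mps: float, rear_sync: bool) -> tuple:
    """Return (blur trail length in metres, position of the flash-frozen image
    along that trail: 0.0 = start of the movement, 1.0 = end)."""
    trail_m = speed_mps * exposure_s          # ambient exposure records the whole path
    frozen_at = 1.0 if rear_sync else 0.0     # rear sync fires just before the shutter closes
    return trail_m, frozen_at

# Hypothetical 4-second exposure of a subject moving at 1 m/s:
print(blur_trail_and_flash(4, 1.0, rear_sync=True))   # (4.0, 1.0) sharp image at the end of the streak
print(blur_trail_and_flash(4, 1.0, rear_sync=False))  # (4.0, 0.0) streak appears to extend ahead of the subject
```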

Tricycle Taxi Using a Slow Shutter Speed to Create a Sense of Motion

© Kevin Landwer-Johan. Shutter speed: 4 seconds.

I used a four-second exposure for this photo of a tricycle taxi in Chiang Mai. You can see the ghosted image of two people just above the handlebars of the cycle. They were riding past on a motorbike just at the end of the exposure as my flash fired.

Conclusion

When photographing movement using long exposures, it pays to give yourself plenty of time to experiment and take lots of photos. Vary your shutter speed, choosing a faster or slower speed with the same subject; this can create vastly different-looking photos.

Iron Bridge Using a Slow Shutter Speed to Create a Sense of Motion

© Kevin Landwer-Johan. Shutter speed: 1.6 seconds.

If you’ve not often used a slow shutter speed, begin to explore the possibilities. If you’ve had some experience, try a new angle or subject. Please share your photos and comments below.

The post Go With The Flow – Using Slow Shutter Speed to Create Motion Blur appeared first on Digital Photography School. It was authored by Kevin Landwer-Johan.


Digital Photography School

 
Comments Off on Go With The Flow – Using Slow Shutter Speed to Create Motion Blur

Posted in Photography

 

MIOPS Capsule360 is a modular, pocket-sized motion control box

14 Jun

Meet the newest member of the MIOPS family, the Capsule360. Deemed the ‘most versatile motion box ever created’ by its creators, the Capsule360 is a pocket-sized motion device, currently being funded on Kickstarter, that works with any camera or smartphone.

What sets the Capsule360 apart from the rest of MIOPS’ lineup is that it can be as simple or robust as your filming needs demand.

Out of the box, a single Capsule360 only captures one range of motion—panning. But when paired with MIOPS’ optional L-bracket, that single unit can also be used for tilting. Combine two units and you get a rig that can both pan and tilt, all without the need for wires—the Capsule360 devices will automatically determine which one is for panning and which one is for tilting.

MIOPS is also offering a Capsule Slider accessory that will let you add yet another range of motion to your shots.

Like MIOPS’ other systems, the speed and direction of the motion are controlled using the MIOPS smartphone app for iOS and Android. The app, which connects via Bluetooth, is used to program the motion path of the device.

In addition to manual configuration, the app can also work to keep track of a subject with both object tracking and face tracking. The app is also capable of controlling the camera settings to account for bulb ramping, interval ramping, long exposures, and high dynamic range (HDR).

If you purchase an optional turntable accessory, you can also program the app to capture 360-degree product photos.

The Capsule360 features a built-in rechargeable battery that lasts for eight hours of continuous use, or up to one week in time-lapse mode. There’s also an option to attach a USB battery pack to extend the battery life.

As with all of MIOPS’ previous devices, the Capsule360 is currently being crowdfunded on Kickstarter, with an expected ship date of December 2018. As of publishing this article, it has already surpassed its $75,000 goal with 28 days to go. A single standalone Capsule360 starts at $200 and, depending on the accessories and add-ons you want, prices go up from there.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on MIOPS Capsule360 is a modular, pocket-sized motion control box

Posted in Uncategorized

 

Panning and Tips for Adding Motion to Your Street Photography

20 Apr

One of the things I teach people on my photography workshops and tours is how to do panning. It’s a great technique to add to your skillset for shooting great street photography. Panning helps to isolate a moving subject and freeze it while at the same time blurring a potentially boring or ugly background.

panning street photography

I happened upon this bike race in Trinidad, Cuba. The street was full of people and the scene was very busy. So I chose to pan the riders as they went past to add a sense of motion and speed.

See the difference in this shot where I did not pan and everything is sharp. Notice how busy the scene is and the bikers are almost lost. Doesn’t it look like they are going a lot slower or frozen in place here as compared to the image above? 

Tips for doing panning

Here is a video from Gavin Hoey and Adorama TV where he demonstrates how to do panning. He also walks through the camera settings to use to get started and how to adjust them as needed. Have a watch.


Street photography with slow shutter speeds

Here is a different approach to adding motion blur to your street photography, by photographer Doug McKinlay. In this video, he talks about the need for a neutral density filter if there is too much light, and using a tripod to blur moving subjects or part of your scene using long exposures.


Panning demonstration

Finally, here’s one more video that has a really good demonstration of how to execute panning, and what not to do as well.

I hope that gives you some ideas and starting points for adding panning and motion to your street photography.

The post Panning and Tips for Adding Motion to Your Street Photography appeared first on Digital Photography School.


Digital Photography School

 
Comments Off on Panning and Tips for Adding Motion to Your Street Photography

Posted in Photography

 

Rylo update adds 180° mode, bluetooth capture and motion blur timelapse effect

19 Apr
Credit: Rylo

The popular Rylo 360° camera—a camera we called the “360 degree camera done right” in our review—is receiving a major update today. The update adds two new features for both iOS and Android users of the Rylo camera and app, with a third feature available only to iOS users for now.

Let’s take them one by one.

180° Mode

The new 180° video mode shrinks the field of view, allowing you to capture 180° video at higher resolution and better image quality than 360° mode allows. According to Rylo, 180° mode “is especially useful for chest-mounted shots or activities/scenarios in which one lens is blocked.”

Bluetooth Remote Capture

The name kind of gives this one away. Remote capture lets you sync your phone to the Rylo camera via bluetooth, which allows you to: switch between recording modes, start or stop a video, and snap a photo, all from the app on your phone.

Obviously, this feature will help if you’ve got the camera mounted somewhere hard to reach.

Motion Blur

A new feature for timelapse shooting, Motion Blur adds a ‘cinematic’ motion blur effect that is actually synced up to the speed of your timelapse shots (more speed = more blur). The effect doesn’t show up while shooting, but will be viewable upon export.

All three features ship today, although Bluetooth Remote Capture is currently only available for iOS, with Android support “coming soon.”

If you own a Rylo 360-degree camera and want to try these features out, all you need to do is update your Rylo app via the App Store or Google Play, then update your camera’s software through the app. And if you haven’t heard about the Rylo and want to know what this camera is all about, check out our full review at the link below.

Review: Rylo is a 360° camera done right

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Rylo update adds 180° mode, bluetooth capture and motion blur timelapse effect

Posted in Uncategorized

 

Blackmagic releases DaVinci Resolve 15 with all-new VFX and motion graphics module

10 Apr

In addition to the new Pocket Cinema Camera 4K announced earlier today, Blackmagic Design also released a major update to its video production software DaVinci Resolve. According to Blackmagic, DaVinci Resolve 15 comes with “hundreds of new features and improvements” but the major addition is a new Fusion module that fully integrates visual effects and motion graphics tools into the DaVinci Resolve workflow.

If you were wondering if Blackmagic is serious about making this a class-leading application, this update should help answer that. Already an industry favorite for color-correction, DaVinci Resolve 15 now includes four high-end video production applications in one: there’s a module for editing, a module for color correction, a module for audio production, and now, the new Fusion module for VFX and motion graphics as well.

Previously available as a stand-alone application, Fusion—which Blackmagic Design calls the “world’s most advanced visual effects and motion graphics software”—is now built right into DaVinci Resolve 15.

“DaVinci Resolve 15 is a huge and exciting leap forward for post production because it’s the world’s first solution to combine editing, color, audio and now visual effects into a single software application,” said Grant Petty, CEO, Blackmagic Design. “We’ve listened to the incredible feedback we get from customers and have worked really hard to innovate as quickly as possible. DaVinci Resolve 15 gives customers unlimited creative power to do things they’ve never been able to do before. It’s finally possible to bring teams of editors, colorists, sound engineers and VFX artists together so they can collaborate on the same project at the same time, all in the same software application!”


According to Blackmagic, the Fusion module gives visual effects and motion graphics artists a true 3D workspace with over 250 tools for compositing, vector paint, particles, keying, rotoscoping, text animation, tracking, stabilization and more. You’ll still be able to purchase Fusion on its own, but Blackmagic plans to fully integrate the entire application into DaVinci Resolve 15 “within the next 12-18 months.”

Today’s release isn’t the final version, but rather a public beta that is available as a free download to all current DaVinci Resolve and DaVinci Resolve Studio customers. The free version of DaVinci Resolve 15 will remain free, while the Studio version—which adds multi-user collaboration, support for frame rates over 60p, more filters and effects, and more—is available for $300 with “no annual subscription fees or ongoing licensing costs.” Take that Adobe…

To learn more, check out the full press release below or visit the Blackmagic Design website.

Press Release

Blackmagic Design Announces DaVinci Resolve 15

New upgrade fully integrates visual effects and motion graphics, adds even more audio tools plus hundreds of new features and improvements that editors and colorists have asked for!

NAB 2018, Las Vegas, USA – April 9, 2018 – Blackmagic Design today announced DaVinci Resolve 15, a massive update that fully integrates visual effects and motion graphics, making it the world’s first solution to combine professional offline and online editing, color correction, audio post production, multi user collaboration and now visual effects together in one software tool. DaVinci Resolve 15 adds an entirely new Fusion page with over 250 tools for compositing, paint, particles, animated titles and more. In addition, DaVinci Resolve 15 includes a major update to Fairlight audio, along with over 100 new features and improvements that professional editors and colorists have asked for.

A public beta of DaVinci Resolve 15 will be available today and for immediate download from the Blackmagic Design website. DaVinci Resolve 15 will also be demonstrated on the Blackmagic Design NAB 2018 booth at #SL216.

DaVinci Resolve 15 continues to revolutionize post production by combining 4 extremely high end applications as different pages in one single piece of software. The edit page has all of the tools professional editors need for both offline and online editing, the color page features the world’s most advanced color correction tools, the Fairlight audio page is designed specifically for audio post production, and the new Fusion page gives visual effects and motion graphics artists everything they need to create feature film quality effects and animations. All it takes is a single click to instantly move between editing, color, effects and audio.

This gives individual users unlimited creative flexibility because they can learn and explore different toolsets. It also enables collaboration so people with different talents can work together on the same project at the same time. The DaVinci Resolve 15 collaborative workflow dramatically speeds up post production because customers no longer need to import, export or translate projects between different software applications, and work no longer needs to be conformed when changes are made. Everything is in the same software application.

The free version of DaVinci Resolve 15 can be used for professional work and has more features than virtually every other paid application for post production. DaVinci Resolve 15 Studio, which adds multi user collaboration, 3D, VR, dozens of additional filters and effects, unlimited network rendering and other advanced features such as temporal and spatial noise reduction, is available to own for only US$299. There are no annual subscription fees or ongoing licensing costs. DaVinci Resolve 15 Studio costs less than all other cloud based software subscriptions and it does not require an internet connection once the software has been activated. That means customers don’t have to worry about losing work in the middle of a job if there is no internet connection.

“DaVinci Resolve 15 is a huge and exciting leap forward for post production because it’s the world’s first solution to combine editing, color, audio and now visual effects into a single software application,” said Grant Petty, CEO, Blackmagic Design. “We’ve listened to the incredible feedback we get from customers and have worked really hard to innovate as quickly as possible. DaVinci Resolve 15 gives customers unlimited creative power to do things they’ve never been able to do before. It’s finally possible to bring teams of editors, colorists, sound engineers and VFX artists together so they can collaborate on the same project at the same time, all in the same software application!”

DaVinci Resolve 15 Detailed Overview

DaVinci Resolve 15 features an entirely new Fusion page for feature film quality visual effects and motion graphics animation. Fusion was previously only available as a stand alone application and is the world’s most advanced visual effects and motion graphics software. It is now built into DaVinci Resolve 15. The new Fusion page gives customers a true 3D workspace with over 250 tools for compositing, vector paint, particles, keying, rotoscoping, text animation, tracking, stabilization and more. Adding Fusion to DaVinci Resolve has been a massive project that will be completed over the next 12-18 months. Customers can get started using Fusion today to complete nearly all of their visual effects and motion graphics work. The standalone version of Fusion will continue to be available for customers who need it.

In addition to bringing Fusion into DaVinci Resolve 15, Blackmagic Design has also added support for Apple Metal, multiple GPUs and CUDA acceleration, making Fusion in DaVinci Resolve faster than ever. To add visual effects or motion graphics, customers simply select a clip in the timeline on the Edit page and then click on the Fusion page where they can use Fusion’s dedicated node based interface, which is optimized for visual effects and motion graphics. Compositions created in the standalone version of Fusion can also be copied and pasted into DaVinci Resolve 15 projects.

DaVinci Resolve 15 also features a huge update to the Fairlight audio page. The Fairlight page now has a complete ADR toolset, static and variable audio retiming with pitch correction, audio normalization, 3D panners, audio and video scrollers, a fixed playhead with scrolling timeline, shared sound libraries, support for legacy Fairlight projects, and built in cross platform plugins such as reverb, hum removal, vocal channel and de-esser. With DaVinci Resolve 15, customers no longer have to worry about audio plugins when moving between Mac, Windows and Linux because the FairlightFX plugins run natively on all three platforms.

DaVinci Resolve is the fastest growing nonlinear video editor in the industry. It’s also Hollywood’s favorite color corrector. Blackmagic Design has listened carefully to feedback from professional colorists and editors. DaVinci Resolve 15 includes over a hundred new features and improvements that editors and colorists have asked for.

Colorists get an entirely new LUT browser for quickly previewing and applying LUTs, along with new shared nodes that are linked so when one is changed they all change, multiple playheads for quickly referencing different shots in a program, over 5x performance improvement for stabilization, improved noise reduction, and new Super Scale HD to 8K up-rezzing. DaVinci Resolve 15 also expands HDR support with GPU accelerated Dolby Vision metadata analysis and native HDR 10+ grading controls. In addition, new ResolveFX let customers quickly patch blemishes or remove unwanted elements in a shot using smart fill technology. There are also new ResolveFX for dust and scratch removal, lens and aperture diffraction effects, and more.

Professional editors will find new features in DaVinci Resolve 15 specifically designed to make cutting, trimming, organizing and working with large projects even better. DaVinci Resolve 15 has dramatically improved load times so that large projects with hundreds of timelines and thousands of clips now open instantly. New stacked timelines and timeline tabs let editors see multiple timelines at once so they can quickly cut, paste, copy and compare scenes between timelines. There are also new markers with on-screen annotations, subtitle and closed captioning tools, auto save with versioning, greatly improved keyboard customization tools, new 2D and 3D Fusion title templates, image stabilization on the Edit page, a floating timecode window, improved organization and metadata tools, Netflix render presets with IMF support and much more.

For the ultimate high speed workflow, customers can add a DaVinci Resolve Micro Panel, DaVinci Resolve Mini Panel or a DaVinci Resolve Advanced Panel. All controls are logically placed near natural hand positions and are made out of the highest quality materials. Smooth, high resolution weighted trackballs and precision engineered knobs and dials feature the perfect amount of resistance for accurately adjusting any setting. The DaVinci Resolve control panels give colorists and editors fluid, hands on control over multiple parameters at the same time, allowing them to create looks that are simply impossible with a standard mouse.

In addition, Blackmagic Design also introduced new Fairlight audio consoles for audio post production that will be available later this year. The new Fairlight consoles are available in 2, 3 and 5 bay configurations. Prices for the new Fairlight control panels are approximately 80% less than the previously available panels with prices ranging from US$21,995 to US$48,995.

Availability and Price

The public beta of DaVinci Resolve 15 is available today as a free download from the Blackmagic Design website for all current DaVinci Resolve and DaVinci Resolve Studio customers. DaVinci Resolve Studio is available for US$299 from Blackmagic Design resellers worldwide.

The Fairlight consoles will be available later this year and will be priced from US$21,995 for the Fairlight 2 Bay console. The Fairlight consoles will be available from Blackmagic Design Resellers worldwide.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Blackmagic releases DaVinci Resolve 15 with all-new VFX and motion graphics module

Posted in Uncategorized

 

Japan’s NHK will demo an 8K camera that can shoot 240fps slow motion at NAB 2018

04 Apr
NHK Fukuoka Broadcasting Bureau. Credit: Soramimi

Japan’s national public broadcasting organization NHK is developing an 8K slow-motion camera capable of recording ultra-high-definition content at 240fps. The technology was announced in a press release (partially translated here), and will be showcased at NAB 2018 in Las Vegas next week. Though 8K monitors and televisions are still in their infancy, the broadcaster is pioneering 8K technologies in anticipation of future demand.

To that end, NHK also plans to showcase a new 8K VR display during NAB 2018. The display is designed to eliminate the pixelated look common to current VR headsets.

NHK’s 8K 240fps camera

Finally, future 8K broadcasts may benefit from the NHK’s new transmitter technology, which reduces an 8K broadcast from a huge 40Gbps to a more manageable (but still huge) 8Gbps. The transmitter then converts the content into an IP-based signal for live broadcasting, a process that allegedly happens in “tens of microseconds.”

According to AV Watch, NHK anticipates using its new 8K technology for sports broadcasts (think Tokyo 2020 Olympics) and other content featuring fast-moving objects starting later this year. Unlike existing solutions, the NHK system is said to offer better compression and transmission for a very low delay while maintaining 8K quality for live shows.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Japan’s NHK will demo an 8K camera that can shoot 240fps slow motion at NAB 2018

Posted in Uncategorized

 

Google explains the tech behind the Pixel 2’s Motion Photos feature

15 Mar

Apple was the first mobile manufacturer to popularize still/video hybrid files with its Live Photos that were introduced on the iPhone 6s. Google then launched the Motion Stills app to improve and stabilize Apple’s Live Photos, and ported the system to the Android world soon after.

For the new Motion Photos feature on its latest Pixel 2 devices, Google built on Motion Stills, improving the technology with advanced stabilization that combines the devices’ software and hardware capabilities. As before, Motion Photos captures a full-res JPEG with an embedded 3-second video clip every time you hit the shutter.

However, on the Pixel 2, the video clip also contains motion metadata that is derived from the gyroscope and optical image stabilization sensors.

This data is used to optimize trimming and stabilization of the motion photo and, combined with software-based visual tracking, the new approach aligns the background more precisely than the previous Motion Stills system (which was purely software-based). As before, the final results can be shared with friends or on the web as video files or GIFs.
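Google doesn’t spell out the math in this post, but the general idea of rotation-only, gyro-driven alignment can be sketched with a pinhole-camera model: each frame is warped by the homography H = K·Rᵀ·K⁻¹ built from the camera intrinsics K and the gyro-integrated rotation R. The snippet below is an illustrative sketch under those assumptions; the intrinsics, the rotation and the use of OpenCV are my own choices, not Google’s actual Motion Photos pipeline.

```python
# Minimal rotation-only frame alignment from gyro data (illustrative sketch, assumed values).
import cv2
import numpy as np

def gyro_homography(K: np.ndarray, R: np.ndarray) -> np.ndarray:
    """Homography mapping pixels of a frame captured at orientation R back to the
    reference orientation, assuming a pure camera rotation: H = K @ R.T @ K^-1."""
    return K @ R.T @ np.linalg.inv(K)

def stabilize_frame(frame: np.ndarray, K: np.ndarray, R: np.ndarray) -> np.ndarray:
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, gyro_homography(K, R), (w, h))

# Hypothetical intrinsics and a 1-degree roll integrated from gyroscope samples:
K = np.array([[1500.0, 0.0, 960.0], [0.0, 1500.0, 540.0], [0.0, 0.0, 1.0]])
R, _ = cv2.Rodrigues(np.array([0.0, 0.0, np.deg2rad(1.0)]))
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in for a decoded video frame
stabilized = stabilize_frame(frame, K, R)
```

In practice, a pipeline like the one Google describes would refine such a gyro-based estimate with visual tracking, since rolling shutter, translation and OIS lens shifts all break the pure-rotation assumption.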

If you are interested in more technical details of the Motion Photos feature, head over to the Google Research Blog. A gallery of Motion Photo files is available here.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Google explains the tech behind the Pixel 2’s Motion Photos feature

Posted in Uncategorized

 

Behind the scenes: Shooting a motion time-lapse in the Canadian wilderness

17 Feb

Back in summer 2017, I went on a six week adventure to British Columbia and Alberta in order to capture Canada’s beautiful landscapes in the most impressive way possible. I wanted to make a time-lapse film that raises more awareness of our planet and our environment.

Planning

When it came to logistics, I tried to be as prepared as possible for this 6-week trip, and did tons of research ahead of departure. I knew I probably wouldn’t have a whole lot of internet (or time to waste on the internet) in the Canadian wilderness, and wanted to be prepared to just take things as they came. So I collected locations that I discovered on Google, on Instagram, or on other photographers’ portfolios, and created a long list of spots that were worth checking out.

I didn’t have a specific shot list. I just tried to capture the most beautiful scenery and moments that I could find along my adventure. However, I paid a lot of attention to interesting details around me instead of going for spectacular vantage points only. That’s how the whole moody intro sequence was conceived. By stepping closer to the subject, I tried to approach time-lapse in a slightly different way than you see in your typical, ‘epic’ time-lapse films online.

Challenges

I guess my biggest challenge with shooting this project was my own safety—doing it all alone, in an area packed with grizzly bears, was pretty scary.

Hiking alone comes with a risk that I always had to bear in mind. I carried bear spray at all times, and tried to let the bears know that I was there by creating a lot of noise on the hiking trail (they can get really dangerous when they’re startled). When I set up a time-lapse shot, I always had to have an eye on my surroundings and make a lot of noise by singing or talking to myself. Over the course of my trip, that risk was something I got used to.

Being all alone also didn’t make it possible for me to camp out on location. Obviously because it is very risky, but most of all because I simply couldn’t carry camping gear along with my camera gear and slider all by myself.

As a result of this, I had trouble getting to the best possible location at the best times of day. In order to shoot at sunset or sunrise, I either had to find a location that was fairly close to the parking lot, or take the risk of hiking up or down in the darkness with my head lamp as the only light source.

This is all of the gear I brought with me into the Canadian wilderness

Gear

Since I was all alone on this mission and wanted to hike out to locations a lot, I had to keep my gear package as compact and efficient as possible. I packed my Sony a7S II and Nikon D5100 (as backup camera) together with a newly purchased Canon 16-35mm F4 and Rokinon 14mm F2.8, as well as two cheap vintage lenses: an SMC Pentax-M 50mm F1.7 and an SMC Pentax-M 100mm F2.8.

As you might have noticed, my camera package was pretty humble. That was all I had and all I could afford, and honestly—that was all I needed.

A great Sony camera with only 12MP, paired with a sharp Canon wide-angle lens, could do almost anything on location. That lens is my absolute favorite due to its great flexibility for wide, three-dimensional time-lapse movements. The Rokinon lens is well known for its night and astrophotography abilities, as it has very little coma and is wide enough to capture almost the whole night sky. The Pentax lenses are actually the first lenses I ever bought. Obviously, they’re old, cheap, and not the sharpest; they’re also small, light, and capable of projecting an image onto my sensor that I was totally happy with.

In contrast, my time-lapse gear was downright extravagant thanks to a sponsorship from the innovative company eMotimo. They loaned me a great package for the duration of my whole trip: the spectrum ST4 is a newly designed 4-axis motion control system that, combined with the iFootage Shark Slider S1, can simply do it all—slide, pan, tilt and even pull focus in video or time-lapse shooting mode.

Using a Playstation Controller, I could easily set up literally any shot I wanted, and the ability to set keyframes in between the start and end point of my programmed shot gave me ultimate control over my composition. Additionally, an innovation that I’ve never seen before is a mountable extra motor to automatically pull the focus as the time-lapse is running. There are many ways to mess around with that: one I used was to shift focus from an interesting foreground to the revealing scenery in the background.

Another great feature worth mentioning is the ability to repeat a programmed movement at different speeds. This allowed me to record shots at different frame rates, but still have the exact same camera movement in each.

Since I couldn’t get the idea out of my head of combining a moving time-lapse shot with a real-time video, with both shots having the very same camera movement, I did exactly that in the final shot of ALIVE. I recorded the whole scenery in dynamic time-lapse, except for the person (me), who was recorded in real-time video at 25fps. Even though it took a lot of time to perfectly mask out the person in post-production, this shot probably wouldn’t have been possible without the spectrum ST4.
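The timing behind matching a time-lapse pass to a real-time pass is mostly simple arithmetic: the rendered clip needs a fixed number of frames, and the shooting interval stretches the same programmed move over a much longer real-world duration. The sketch below is illustrative only; the clip length, playback frame rate and interval are assumed numbers, not the settings used for this shot.

```python
# Back-of-the-envelope timing for repeating one programmed camera move as both a
# real-time pass and a time-lapse pass (assumed example values, illustrative only).

def timelapse_pass_duration(clip_seconds: float, playback_fps: float, interval_s: float) -> float:
    """Real-world seconds the motion-controlled move must take so the rendered
    time-lapse matches a real-time clip of the same length and camera path."""
    frames_needed = clip_seconds * playback_fps   # frames in the final clip
    return frames_needed * interval_s             # seconds the rig spreads the move over

clip_seconds, playback_fps, interval_s = 10, 25, 5
print(timelapse_pass_duration(clip_seconds, playback_fps, interval_s))  # 1250 s, roughly 21 minutes
# The real-time pass covers the same keyframed path in just 10 seconds at 25fps.
```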

With all the flexibility of the motion control device, I felt an enormous freedom as a time-lapse photographer and could explore further ways to creatively make use of its features. However, this turned out to be a weakness as well, as I often found myself tempted to design overly sophisticated time-lapse shots. In those cases, the inorganic, mechanical camera motion drew far too much attention when watching the processed time-lapse sequence, making the scenery appear surreal.

So over the course of my trip I learned that, in this case, less is more. It is far more important to be at the right spot at the right time—which is ultimately what I was seeking throughout my whole adventure.

Editing

In the editing process of ALIVE, it was certainly very hard to settle on a cut, as I had to choose the best among the 149 time-lapse shots I recorded on my trip. Since I wanted to keep my film under 4 minutes, I forced myself to make some hard decisions. That turned out to be a pretty emotional process, because every shot has a background story that connects to me personally.

It just often felt super frustrating to kick out a shot for which I woke up at 4 a.m. and hiked up a mountain for two hours in order to be there for sunrise.

A Story

Let me conclude my little behind the scenes tale with a story from Lake O’Hara, one of the most captivating landscapes I have ever experienced in person, and the final shot of the film. I decided to hike up to this incredible viewpoint for sunset and kept shooting until there was no light left to work with. That was the moment when I found myself alone in the darkness on the edge of a cliff.

The final shot of the film, captured above Lake O’Hara at sunset.

Should I take the risk of hiking down the steep, forested trail in the darkness, potentially ending up as grizzly bear food? Or should I rather just stay up there all night and do another time-lapse of the Milky Way? Without camping gear and food, I stayed and spent another 5 hours in the cold darkness until the sun came back up.

Even though this shot of the Milky Way didn’t even make it into the final film, there is nothing I regret about staying on that mountain all night, all alone, without a tent, in bear territory… simply because it was one of those adventures that made my trip so unique.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Behind the scenes: Shooting a motion time-lapse in the Canadian wilderness

Posted in Uncategorized

 

DxOMark Mobile testing protocol now considers bokeh simulation, low-light, motion and zoom

12 Sep

Camera and lens testing company DxOMark has announced an updated smartphone camera evaluation protocol that evaluates additional elements encompassing some of the newer mobile camera technologies. This new protocol builds upon the previous version, adding an updated low-light test that evaluates performance down to 1 lux, new bokeh and zoom tests, and a motion-based test.

DxOMark detailed the new mobile protocol on Monday, explaining that it is better capable of evaluating phones packing the newest mobile camera capabilities, particularly ones made possible by dual-camera hardware. The company has re-tested some top-tier phone models under the new protocol, finding that in some cases scores increased when looking at features like low-light performance, bokeh, and zoom.

A detailed analysis of the new protocol versus the old protocol sheds some light on what DxOMark is looking for in these new categories, as well as charting the score changes some phones experienced under the new protocol. The company also offers a more in-depth look at the new protocol in a blog post.

Via: Digital Trends

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on DxOMark Mobile testing protocol now considers bokeh simulation, low-light, motion and zoom

Posted in Uncategorized