
Hipstamatic returns with new free app and updated analog camera styles

08 Oct

Hipstamatic, the once-popular image filter app that was largely eclipsed by Instagram, has returned as the Hipstamatic X analog camera app. The new app is free to download for iOS devices, offering users access to image filters that imitate the look of many retro analog cameras.

Unlike Instagram, which requires users to manually apply and edit filters, Hipstamatic functions more like a camera, which means the filter and adjustments are automatically applied when the user snaps an image. ‘This camera brings all the joy, quirk, and randomness of analog film photography to your pocket,’ the Hipstamatic team explains on the App Store.

Hipstamatic offers a number of camera options, including Fisheye, Art House, Classic Toy, Tintype, Disposable, K-PRO X, Instant, and Pinhole. All films and lenses previously purchased for the Hipstamatic Classic app are supported by Hipstamatic X.

Though the app is now free, users can sign up for the Hipstamatic Makers Club at $2/month for access to the app’s full preset library, all of its cameras, and more than 100 lenses and films released by the company over the past decade. Using these, mobile photographers can ‘build’ their own analog camera styles to get the effects they want.

The company plans to release additional camera styles in the future; Hipstamatic Makers Club members will get early access to these offerings ‘several times per year.’

Articles: Digital Photography Review (dpreview.com)

 

 

When fast-ish is fast enough: in praise of F1.8 lenses

07 Oct

Going back decades, well-heeled amateur and hobbyist photographers have lusted after fast prime lenses. Partly this is just human nature. In the days when most cameras shipped with standard 50mm F1.8 or F2 lenses, it was inevitable that such photographers would long for something a little more exotic. A little faster, more expensive, and more ‘professional’. For photography obsessives who grew up idolizing the famous LIFE magazine shooters of the late 20th century, it was natural to aspire to own those kinds of lenses, despite their price.

There is still a demand for F1.4 and faster lenses, but that’s not the same thing as saying that there is a need for them

Partly, though, the appeal of fast lenses is practical – regardless of your ability level or income. They let in more light, and more light, even today, is always good. In the film days, though, you really needed every stop. For a long time, anything above ISO 400 was considered ‘fast’, and shooting so-called ‘high speed’ film involved compromises in color rendition, grain and contrast. For photographers who needed to work in changing conditions, an F1.4 or even F1.2 lens was valuable insurance against missed opportunities created by a lack of light. Never mind that many of the F1.4 and F1.2 lenses of the film era were pretty soft wide open – a slightly hazy photo is better than no photo at all.

But today, two decades into the ‘Digital Century’, is there still a need for ultra-fast lenses?

One of the ultimate drool-worthy lenses, the Leica Summilux 35mm F1.4 (this example is from the 1970s) is the most compact F1.4 lens that Leica ever made for its M-series rangefinders. Its small size, light weight, and the premium attached to F1.4 means that it has long been a favorite of professionals and wealthy amateurs.

Never mind the fact that it can’t focus closer than 1 meter, can’t accept normal filters and doesn’t really get sharp until F2.

Fast lenses continue to sell, and technically of course, the F1.4 and F1.2 (and faster) primes of today are far superior to the designs that came before. Standout examples of the current state-of-the-art include Canon’s superb RF 50mm F1.2 and EF 35mm F1.4L II, Sony’s GM 24mm F1.4, and Sigma’s 35mm F1.2 ‘Art’ among many others. Tamron’s new 35mm F1.4 is another stunning lens, and don’t let a Pentax fan catch you suggesting that the FA* 50mm F1.4 SDM AW is anything less than perfect. Technically speaking, all of the lenses I just mentioned are among the best of their type that you can buy.

Canon’s EF 35mm F1.4L II USM is a stunning lens – in fact arguably the best 35mm prime on the market. If you’re a Canon shooter, and you’re one of those people that really needs F1.4, this is the lens to get. But for most of us, it might be overkill.

Clearly, then, and partly for that reason, there is still a demand for F1.4 and faster lenses, but that’s not quite the same thing as saying that most photographers still have a need for them. I suggest that these days, with the modern BSI-CMOS sensors inside most full-frame interchangeable lens cameras, the average full-frame photographer will be fine with F1.8. And might actually be better off.

To explain why I think that, I’ll break down the three traditional arguments in favor of fast lenses:

1: Faster lenses let in more light, and more light is always good.

This is a fact. More light is never a bad thing, and the 2/3 of a stop that separates an F1.4 lens from an F1.8 lens is not insignificant.*

Consider the practical implications of shooting at F1.4 versus F1.8: first, you’ll be able to shoot at faster (shorter duration) shutter speeds. Assuming a constant ISO sensitivity, an increase of 2/3 of a stop in aperture means the difference between shooting at a shutter speed of 1/25th of a second and shooting at 1/15th.

That’s potentially quite handy if, for example, you’re shooting with a 28mm lens. Without any form of stabilization, you’ll probably be able to hand-hold your shot at 1/25th, but you might struggle at 1/15th. So in marginal light, shooting at F1.4 will give you a little bit more peace of mind.

This portrait of everyone’s favorite dog was shot wide open, on the Nikon Z 50mm F1.8 S. Belvedere is sharp, there’s no CA anywhere, and foreground and background are pleasantly blurred. The high performance of the Nikon Z6’s BSI-CMOS sensor means that even at ISO 1,400, noise is barely an issue (and could be reduced even further with a little more NR in Adobe Camera Raw).
ISO 1400 | 1/250 sec | F1.8

The second practical implication is that more light coming in through the lens means that, assuming a fixed shutter speed, you can shoot at lower ISO sensitivity settings. Two thirds of a stop is the difference between ISO 640 and ISO 400.

But do you care these days about the difference between shooting at ISO 640 and ISO 400? Or ISO 1,600 and ISO 1,000? Or even 160 and 100? The increased performance of modern sensors at high ISO sensitivity settings means that the days when you really needed to keep your ISO ultra-low for acceptable results are (fortunately) over. As such, when it comes to light gathering, the advantage of an F1.4 lens is less important now than ever before – assuming, of course, that you’re shooting with one of the new generation of BSI-CMOS sensors with dual-gain architectures.
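Those ISO pairs come straight from the doubling-per-stop rule; a quick sketch (illustrative only – the printed values are raw, which the standard 1/3-stop ISO scale rounds to the familiar labels):

```python
# The ISO scale doubles per stop, so a 2/3-stop aperture advantage lets you
# drop the ISO by the same 2/3 stop at a fixed shutter speed.
def iso_for_stops(iso, stops):
    return iso * (2 ** stops)

print(round(iso_for_stops(400, 2 / 3)))   # 635 – the 1/3-stop ISO scale calls this 640
print(round(iso_for_stops(1000, 2 / 3)))  # 1587 – i.e. the standard ISO 1600 step
```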

2: Faster lenses make for more attractive images

But of course you know all about F-stops, and the reason you’re interested in an F1.4 lens is not for its technical advantages when it comes to pushing your exposure envelope, but for its aesthetic advantages. Specifically, shallower depth of field and blurrier backgrounds at maximum aperture.

This is fair enough – if you consider two lenses of the same focal length, one an F1.4 and one an F1.8, the F1.4 lens will deliver blurrier backgrounds, assuming a constant camera-to-subject distance. Physics again.

However, the difference in the appearance of background blur at F1.4 vs. F1.8 isn’t as great as you might think. It’s highly dependent on camera-to-subject distance of course, but in general, I’ll bet that most people, shown a photograph taken at either F-stop in isolation, would be unable to identify the aperture setting you used.

Look at the example above. The image on the left was shot at F1.8, the image on the right was shot at F1.4. The crop is from an area just to the left (her left) of our model’s head.

The two images look different, certainly. But are they that different? Meanwhile, the marginal increase in depth of field at F1.8 over F1.4 may actually be advantageous for some photographic situations – especially portraits like this, where even a slight sharpness difference between your subject’s eyes can be distracting.
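There’s simple physics behind why the two crops look so similar. For a distant background, blur-disc size is proportional to the entrance-pupil diameter (focal length divided by f-number), so the gap between F1.4 and F1.8 is modest – a rough sketch, with an arbitrary 85mm focal length chosen purely for illustration:

```python
# Entrance-pupil diameter = focal length / f-number; for a distant background,
# blur-disc size scales in proportion to this diameter.
focal_mm = 85  # hypothetical portrait focal length

pupil_f14 = focal_mm / 1.4
pupil_f18 = focal_mm / 1.8
print(round(pupil_f18 / pupil_f14, 2))  # 0.78 – F1.8 blur discs are only ~22% smaller
```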

3: A faster lens stopped down is sharper than a slower one is wide open

Traditionally, this is true. No lens is technically at its best when shot at its maximum aperture. Stopping down a touch is good practice if you want to achieve better overall sharpness, cut down vignetting, minimize some common aberrations, and you don’t mind losing a tiny bit of background blur in return.

This portrait was shot straight into the sun, on Nikon’s Z7 with a new Z 85mm F1.8 S attached. Wide open, this image is sharp across the frame, contrasty, and while there is some flare in evidence, you really have to go looking for it. This is not the kind of performance that we would traditionally associate with an 85mm F1.8.
ISO 64 | 1/2000 sec | F1.8

Again though, these days, you may find that the difference between an F1.4 lens stopped down to F1.8 and a good F1.8 lens wide open is minimal. Looking at the best of today’s crop of F1.8 primes, their performance wide open is extraordinary. When examining images from the Nikon Z 85mm or 50mm F1.8 S or the Sony Sonnar T* FE 55mm F1.8 ZA, it’s obvious that compared to the ‘kit’ primes of the old days, they’re in a different league. Some of this is down to the increased design flexibility that mirrorless technology brings in terms of automatic software corrections, but not all.

At the end of the day, an F1.8 prime that is sharp and contrasty across the frame, which offers pleasant bokeh and lacks significant fringing when shot wide open is – I would argue – a much better value proposition than a more expensive F1.4 or F1.2 lens which needs to be shot at F1.8 or F2 for optimal results.

Disadvantages of ultra-fast lenses

Hopefully I’ve challenged some of the conventionally accepted advantages of faster lenses, but to further bolster my case I want to look at their outright disadvantages.

There are three: size, weight, and cost.

Lenses with a maximum aperture of F1.4 or faster are typically larger, heavier and as I’ve hinted at above, more costly than F1.8 or slower equivalents. The image below, showing Canon’s EF 50mm F1.8 STM next to the RF 50mm F1.2L USM is an extreme example, but nevertheless, if you see a 50mm F1.2 (or F1.4) and a 50mm F1.8 in a particular company’s lineup, you can bet that the F1.8 will be the lighter, smaller and cheaper of the pair.

I don’t want to pick out (or pick on) particular brands here, but Nikon’s Z-mount prime lens range is worth looking at in the context of this discussion because it currently only consists of F1.8 options (pending the arrival of the manual focus 58mm F0.95 Noct, which is a bit of a special case).

Two lenses, both made by Canon, one for DSLRs on the left, and one for mirrorless, on the right. The biggest reason for the size difference between these two is their maximum aperture. The lens on the left is the EF 50mm F1.8 STM, while the lens on the right is the RF 50mm F1.2L USM. The RF lens is one stop brighter than the EF lens. One stop brighter, and a whole lot heftier.

Of Nikon’s three currently available Z-mount lenses, the Z 50mm F1.8 S and Z 85mm F1.8 S are, in my opinion, optically outstanding in almost every way that a photographer should care about. The Z 35mm F1.8 S isn’t quite in the same league when it comes to CA suppression, but it’s still excellent. The combined cost of all three of these lenses is $ 2,250 (not inclusive of tax). That’s only $ 150 more than the MSRP of Canon’s admittedly stunning, but undeniably massive RF 50mm F1.2L, shown above. Meanwhile the combined weight of the three Nikon lenses comes in at only 300g more than the Canon 50mm on its own. And around 800g (about 1.7lb) less than the expected weight of one Nikon Noct, (pictured at the top of this article) if you’re playing that game. We don’t know how much the Noct will cost yet, but let’s assume it will be significantly more than $ 2,250…

If you want a really fast, flagship prime lens, be prepared to pay for it, in more ways than one

Clearly this is an imperfect comparison, drawn only to make a point. But hopefully you do get my point: If you want a really fast, flagship prime lens, be prepared to pay for it, in more ways than one. And ask yourself first – how much do you really need that extra stop or two of light?

Just one more thing…

Speaking of price brings me to a flaw in my argument – or at least to a caveat: the fact that, all other things being equal, an F1.8 lens is likely to be cheaper and smaller than an F1.4 or F1.2 equivalent is unsurprising, and in itself proves nothing. What has proved surprising to some of our readers is the fact that the best of today’s crop of F1.8 primes for mirrorless systems are more costly than their D/SLR-era F1.8 equivalents. In fact, in some cases they’re more costly than their F1.4 D/SLR-era equivalents.

Nikon’s Z 85mm f1.8 S, for example, costs almost exactly twice as much as the still-current AF-S 85mm f1.8G. Meanwhile, the AF-S 50mm F1.4G is a fine lens, and still available new for around $ 400 – that’s 2/3 of the cost of the Z 50mm F1.8 S. Sony’s new FE 35mm F1.8 costs $ 750 – that’s more than Sigma’s 35mm F1.4 ‘Art’ – still one of our favorite fast prime lenses, even seven years after its introduction.

Sharp and free of distracting flare even when shot almost wide open, Sony’s new FE 35mm F1.8 is one of the most useful lenses for Sony’s mirrorless interchangeable lens system.
ISO 100 | 1/400 sec | F2.2
Photo by Rishi Sanyal

Why is this so? The reasons are various: the overall loss of value in the digital photography industry, which has seen volume at the low end of the market disappear, driving the prices of high-end products up; the need to recoup some of the R&D costs of developing entirely new mirrorless mounts; the fluctuation in the value of the Japanese Yen over the past decade or so; and other factors.

$800 spent now on one of the current crop of state-of-the-art mirrorless lenses buys you more than $800 ever has

But let’s not lose sight of a really important fact, independent of all that: the newer lenses mentioned above tend to be superior to the equivalents that came before. While $800 is clearly a lot more cash than $400, $800 spent now on one of the current crop of state-of-the-art mirrorless lenses buys you more than $800 ever has. As such – especially if you’re a Nikon Z or Sony FE mirrorless shooter – I would argue that it’s time to leave behind the old idea that faster always equals better, and take this opportunity to downsize.

Look out for part 2 to this article, if I ever get time to write it – ‘Hey Canon and Sigma, how about some more compact, high-performance F1.8 primes?’

Interested in reading some lens reviews? Click here


* In fact, 2/3EV is the difference between APS-C and full-frame.


 

 

Landscape photography with a drone: the advantages – part 3

06 Oct

In the previous articles in this series, I elaborated on the advantages of the drone: specifically, that it offers more compositional opportunities, is cheap to run, portable and available anywhere.

In this article I’d like to conclude the discussion of the drone’s advantages by covering its ability to hover in place and its most fun facet: its fearlessness in the face of danger.

Ability to hover

The ability to effortlessly hover in place is unique to the drone. True, a good helicopter pilot can hover efficiently, but not with the drone’s GPS-controlled accuracy, nor as close to the subject. In terms of stability, a drone is best compared to a tripod in the sky, which in turn means that it allows three things: relatively long exposures, parking abilities and immaculate precision.

Long exposures can be useful when the photographer wants to convey a sense of motion in an image. For example, an exposure of half a second or more can smear moving water, creating pleasing lines and a clear feel in an image. In sufficiently still weather, a modern drone can shoot sharp images at half a second, a second or even more. Multiple attempts can result in a sharp shot even when shooting a several-second-long exposure – an unprecedented achievement for any aerial shooting (that doesn’t use a heavy, expensive gyro-stabilizer).

A long exposure of Fossa waterfall, Faroe Islands. If I had an ND filter handy, I could’ve extended the exposure even more.
DJI Mavic II Pro, 1/2 sec, F11, ISO 100.

I’ll explain and demonstrate what I mean by ‘parking abilities’ with an image I took earlier this year. I was shooting the total solar eclipse over Lake Cuesta Del Viento, in the San Juan province of Argentina. Totality lasted a mere 2 minutes (which seemed more like 45 seconds), during which I tried to shoot a wide-angle focus stack, a telephoto closeup of the corona, and an aerial of the eclipse reflecting in the lake above the badlands. Naturally, I had set up my wide-angle and telephoto compositions beforehand, but the point here is that the drone allowed me to set up my aerial composition as well.

A wide-angle focus stack of the eclipse above the badlands
A telephoto closeup of the corona

I composed the shot about 5 or 10 minutes before totality, and left the drone hovering in place. Once I was done with the two DSLR shots, I picked up the remote to find the aerial composition exactly as I had left it. This saved me precious time and allowed me to take all three shots in a very narrow time frame. The drone reflection shot, more than anything, is a true once-in-a-lifetime shot.

The drone aerial I took after the two DSLR shots.
DJI Mavic II Pro, 1/10 sec, F2.8, ISO 100. Lago Cuesta Del Viento, San Juan Province, Argentina

Finally, the controls of a modern drone allow for unprecedented precision. The drone can move very delicately (some drones offer a ‘tripod mode’ for extra-delicate movement), enabling the photographer to create and capture a balanced image. This is especially important when shooting at close distances to certain subjects.

The window for showing the boat in the middle of the arch was very small. Delicate movements of the drone allowed me to get the shot with ease.
DJI Phantom 4 Pro, 1/40 sec, F5.6, ISO 200. Disko Bay, Greenland

Fearlessness in the face of danger

A major advantage of the drone is the fact that you can endanger it with little consequence. As a nature photographer who lives and breathes extreme environments, I can’t stress enough how liberating this is.

A drone doesn’t care about breathing toxic gases. A drone doesn’t care about being uncomfortable, hot, cold, breathless or tired. A drone is a robot, a slave to your will, and it will go wherever you tell it to go. It will scream if the battery is about to run out, it will quietly protest if you try to fly in windy weather, its sensors will avoid contact with nearby objects, and it won’t let you fly near airports (thank goodness). But other than that, it will obey the commands of its master, however stupid or dangerous… which gives the photographer a perfect opportunity to be as daring as they wish.

This shot is hazy because it was taken from within a caldera filled with toxic gases.
DJI Mavic II Pro, 1/25 sec, F6.3, ISO 100. Kawah Ijen, East Java, Indonesia

Please note that I’m only legitimizing risking the drone, not people’s health. I will cover drone etiquette in a future article, but for now, let me stress that I’m talking about flying both legally and (even more importantly) morally, where there is no chance of people or the environment being harmed by the drone. Luckily, as a nature photographer, I find it easy to stay on the right side of legality and morality, simply because I do most of my shooting alone in the wild, without people or buildings around me. The worst thing that can happen to me is losing the drone (that has happened, of course – a tale which will be told in the future).

No person, and no manned aircraft for that matter, would dream of flying meters above an active volcano. Only uninformed people would go near an ice-arch, which can collapse at any moment with tragic consequences. But a drone can, and will do so happily. This fact opens a myriad of options which simply aren’t there without a drone. Let’s see some examples and explore the dangerous side of landscape photography.

Lava flows in the shape of a double-headed dragon. During this shoot I flew my drone so close to the lava that the camera melted (!). Needless to say, I wouldn’t get this close myself.
DJI Phantom 4 Pro, 1/8 sec, F6.3, ISO 400. Taken outside of Volcanoes NP, Island of Hawaii.

I wrote extensively about my Hawaii volcano photography in a previous article, but I’ll mention here that it was an amazing shoot, during which I flew my drone very close to the lava – closer than I’d ever venture myself. The lava was so hot that it melted my drone’s camera: the perfect example of the drone going where no man would, and coming back in one piece (if damaged).

The shoot was more than worth losing the drone, both financially (the images and videos sold for many times what I paid to fix the drone) and in terms of the images I got from it. It was a once-in-a-lifetime event, and I risked the drone knowing very well I could lose it at any moment. Actually, it was the very fact that I melted the drone camera, rather than the unique images I got, that made this series go viral and got me a front-page National Geographic website feature and interview.

The point where the lava burst out of the mountain side was extremely hot.
DJI Phantom 4 Pro, 1/100 sec, F6.3, ISO 400. Taken outside of Volcanoes NP, Island of Hawaii.

From lava to ice. It is well known that large icebergs can be extremely dangerous. Not only can they collapse catastrophically, they can also flip over, and both scenarios involve the displacement of a huge amount of ice and water, creating high waves and endangering everyone sailing within a substantial radius. But again, a drone doesn’t care. It will fly under close-to-collapsing arches, hover meters away from gigantic icebergs and go where no man would dare.

To get the composition I wanted with the faraway iceberg and lenticular clouds framed inside the hole in the closer iceberg, I had to get very close to the ice. Needless to say, this would have been impossible in any other way, as I wouldn’t step on this iceberg, and no manned aircraft would fly this close to it.
DJI Mavic II Pro, 1/30 sec, F8, ISO 200, vertical stitch. Uummannaq, Greenland

There are even more advantages to using a drone. The more you use it, the easier it is to use and the more freedom it gives you. Other points I won’t elaborate on are:

  • The drone, unlike a manned aircraft, doesn’t pose any obstacle to shooting. Manned aircraft have rotors (in helicopters), wings or beams blocking your view. The windows in light planes can also limit your range of motion.
  • Your carbon footprint is significantly lower with a drone compared to manned aircraft.
  • It’s a good conversation starter.
  • It’s so much fun to fly.

In the next article in the series, I’ll discuss the other side of things: the disadvantages and limitations of the drone.


Erez Marom is a professional nature photographer, photography guide and traveler based in Israel. You can follow Erez’s work on Instagram and Facebook, and subscribe to his mailing list for updates.

If you’d like to experience and shoot some of the world’s most fascinating landscapes with Erez as your guide, take a look at his unique photography workshops in The Lofoten Islands, Greenland, Namibia, the Argentinean Puna, the Faroe Islands and Ethiopia.

Erez offers video tutorials discussing his images and explaining how he achieved them.

More in This Series:

  • Landscape Photography with a Drone – Part 1: Forward / What is a Drone?
  • Landscape Photography with a Drone – Part 2: Advantages of the Drone (i)
  • Landscape Photography with a Drone – Part 3: Advantages of the Drone (ii)

Selected Articles by Erez Marom:

  • Parallelism in Landscape Photography
  • Winds of Change: Shooting changing landscapes
  • Behind the Shot: Dark Matter
  • On the Importance of Naming Images
  • On Causality in Landscape Photography
  • Shooting Kīlauea Volcano, Part 1: How to melt a drone
  • The Art of the Unforeground
  • Whatever it Doesn’t Take
  • Almost human: photographing critically endangered mountain gorillas


 

 

CHASING DORY is a portable, affordable underwater drone with 1080p video

06 Oct

If there’s one omnipresent trend in the drone industry, it’s this: manufacturers are thinking smaller. Companies are aiming to make unmanned aerial vehicles more compact while inserting as many premium features, found in their larger counterparts, as possible. The CHASING DORY underwater drone, which is currently in the midst of a successful crowdfunding campaign on Kickstarter, is no exception.

Presumably named after the daffy but lovable fish from the movie Finding Nemo, the drone is, at 18.8cm (7.4in) wide, smaller than a standard sheet of paper. It is the follow-up to Shenzhen-based company CHASING’s GLADIUS MINI drone, whose 2017 Indiegogo campaign ended with 1629% funding and a 100% delivery rate to backers. DORY is 56% lighter and 65% smaller than its predecessor, which weighed 2.5kg (5.5 pounds).

An algorithm ensures photos remain vivid in all conditions.

DORY’s camera has an F1.6 lens with a 1/2.9″ CMOS sensor capable of recording at 1080p, a 100° field of view, and a ±45° Tilt-Lock mode that allows you to scan the floor of a body of water or view the surface above. Two 250-lumen lights placed on the front of the drone illuminate the area where it’s operating without overwhelming underwater inhabitants. CHASING’s color-restore algorithm keeps photos vivid in all conditions. 8GB of internal memory means you don’t have to worry about losing any footage.

The lowest possible pledge available on Kickstarter will get you a DORY underwater drone, Wi-Fi buoy, tether, and charger. The Wi-Fi buoy, connected by the wired tether, lets your mobile device stay connected up to 15 meters (49 feet) away. An Anti-lost Warning is there to prevent any chance of getting disconnected, and Depth Lock helps the drone remain stable so it doesn’t get tossed around. The charger takes two hours to fill the 4800 mAh battery, which gives you one hour of operating time. Unlike other underwater drones that require multiple components to be charged, only the drone itself needs charging with the DORY.

DORY doesn’t come with a remote. Instead, a smartphone is all that is needed, along with the CHASING DORY app (available for iOS and Android), to control the drone. You can customize imagery with 19 different filters, share footage on social networks including Facebook and Instagram, and even stream live. The app also features Co-Play which enables one person to maneuver the drone while the other controls the camera.

As of this writing, there are 17 days left to back DORY on Kickstarter. The campaign has already raised over $120,000 against its initial $30,000 goal. If underwater exploration is your interest, this portable, affordable drone is an option worth considering.


Disclaimer: Remember to do your research with any crowdfunding project. DPReview does its best to share only the projects that look legitimate and come from reliable creators, but as with any crowdfunded campaign, there’s always the risk of the product or service never coming to fruition.


 

 

The a9 II is the camera Sony had to make – but they didn’t make it for you

06 Oct

A lot of things are set to happen in 2020. It’s a presidential election year here in the US (actually it’s a presidential election year in lots of countries), Japan is hoping to establish a moon base*, and the UK will definitely, very likely, maybe have left the EU by the time Jan 1 rolls around. My money is on the moon base being ready long before the current British government gets its act together, but we’ll see.

In addition to the aforementioned lunar exploits, Japan is also gearing up for the 2020 Olympic Games, to be held next year in late summer, here on Earth. We’ve yet to find out which countries will go home with the most gold medals (although knowing how hot Japan gets in late July I don’t fancy Team GB’s chances) but we do know that every jump that is jumped, every leap that is leapt, every shot that is put (putted?) and every hamstring that is torn will be captured by banks of television and stills cameras.

For this reason, Olympic years are big years for the camera industry. Traditionally, Canon and Nikon maintain a huge presence at these kinds of events, complete with large support staff, professional service centers, and stockrooms chock-full of cameras and lenses ready to be put into action by professional photographers from all over the world. Typically, we also see both companies announcing major new professional cameras either early in an Olympic year, or late the year before. Beijing 2008 saw photographers shooting with the Nikon D3 and Canon EOS-1D III, at London 2012 it was the then-new D4 and the EOS-1D X, and so on.

When the a9 was released about two and a half years ago, it was clear that Sony had its sights set on professional users

Sony is still learning how to be a ‘pro’ stills camera brand, but the company is moving extremely quickly. Sony has invested a lot in recent years in professional support, and these days has a large Pro Service presence at many major sporting events. When the a9 was released about two and a half years ago, it was clear that Sony had its sights set on professional users, and the expansion of professional support since then (as well as the release of some seriously impressive telephoto lenses) is further evidence that its leadership is very serious indeed about joining Canon and Nikon on the sidelines.

The new a9 II is, in effect, Sony’s 2020 Olympic camera. Announced fairly quietly today, without the usual Sony fanfare, the a9 II is a camera that the average DPReview reader will probably neither need nor buy. And Sony knows it. The upgrades compared to the a9 (which will continue in the lineup) are, for the most part, targeted at a small segment of the professional user base – and even more specifically, at photographers who shoot major sporting events.

A tenfold increase in data transfer speed over LAN, the addition of 5GHz wireless connectivity, and the option to wirelessly send files from the camera when it’s turned off are valuable features for those times when you’re running around trying to send huge numbers of files to a remote editing station, but very few people ever actually need to do that. Likewise the ability to save up to ten sets of FTP settings to an SD card, or to add 60-second voice memos to photographs, which can then be converted to text and appended automatically to EXIF using an app. Very cool, but not essential for most use-cases.

As an everyday machine for taking photographs, the a9 II is almost – but not quite – identical to the a9. Inside you’ll find the same 24MP full-frame sensor, the same autofocus system, albeit improved, the same 3.7 million-dot OLED viewfinder and broadly the same core feature set.

There are a few useful refinements though, some of which are courtesy of the new Bionz X processor: autofocus speed and precision have been improved, as have face detection and EVF responsiveness. A new mechanical shutter with a rated lifespan of 500,000 cycles brings faster mechanical shutter shooting (now up to 10fps), and the a9 II benefits from the ergonomic tweaks and improved weather-sealing introduced in the a7R IV. Image stabilization performance has also increased slightly, from 5EV in the a9 to 5.5EV, and battery life has improved by around 6% (CIPA).

The a9 II’s video feature set is virtually unchanged over the a9, and shares its limitations (for some reason there’s still no Log option), but Sony has added real-time tracking.

Thanks to a series of firmware updates, the a9 is as competitive now as it ever was

You know what I think, but what’s your opinion? Should you buy one? After all, even if you’re just an amateur sports photographer, the increase in continuous shooting rate in mechanical shutter mode might make a big difference (specifically if the venues you shoot in use LED lighting or advertising panels) and the beefed-up weather-sealing could be essential in some situations.

For most people reading this article though, I suspect that the additions in the a9 II will prove to be of little or no interest compared to the original a9 which has been on the market for more than two years. Thanks to a series of firmware updates, the a9 is as competitive now as it ever was, and with the a9 II now at the top of the lineup, the older model is likely to get more affordable over the next few months.

Meanwhile, Sony can get the a9 II into the hands of the people that really need it – the pro sports shooters gearing up for next summer’s major sporting events. On the moon, or wherever.


* Yes, I know the cited article is from almost a decade ago, and since then the target date for Japan’s lunar base has been pushed back by at least a few years, but I’ll level with you – I was looking for a quick way to set up a cheap Brexit gag.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on The a9 II is the camera Sony had to make – but they didn’t make it for you

Posted in Uncategorized

 

Video: 37 different camera shutter sounds in 3 minutes

06 Oct

Much like fingerprints, no two camera shutter sounds are exactly alike. As a fun little project, photographer and YouTuber Scott Graham has captured the shutter sound of 37 different camera models to show off the diversity of shutter sounds and to memorialize a number of cameras he’s selling.

In the video, which comes in just shy of four minutes, Graham succinctly captures the unique shutter sounds of all 37 cameras, ranging from analog SLR cameras to digital Fujifilm cameras. Each shutter sound was captured as close to 1/60th of a second as possible for consistency’s sake.

Graham didn’t elaborate on whether or not he will continue to do this with future cameras he acquires, but we think it’d be incredible to build an archive of shutter sounds from various cameras. What camera has the most pleasing sound to your ears, both from Graham’s collection and your own?

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Video: 37 different camera shutter sounds in 3 minutes

Posted in Uncategorized

 

Review: Live Planet VR live-streaming system

05 Oct

Live Planet VR camera and live-streaming platform
$3,495 | liveplanet.net

Live Planet VR bills itself as “the only end-to-end VR system,” and technically, since it includes a camera system as well as a cloud publishing suite that’s capable of delivering to just about every major VR headset and outlet currently available (including live-streaming high-resolution stereoscopic 360 video over mobile networks) they may be right.

Live Planet has a lot of things going for it, especially when it comes to the algorithm and software solutions side of things. Their live-streaming and real-time stitching execution is impressive, and I can also see many cases where their cloud publishing platform could be a godsend, which we’ll get to below.

In fact, whether or not Live Planet VR is right for you is highly dependent on how you plan to use it, as Live Planet is targeting a very specific user – mostly those looking to live-stream.

But we’ll start with the key features and the design.

Key Features

  • 16-lens stereoscopic VR 6K camera
  • DCI 4K/30p (4096×2160) resolution for live streaming
  • 6K/24p (6144×3328) for post-production stitching
  • In-camera real-time stitching
  • Live-streaming capability
  • Records to a single microSD card
  • VR headset live preview
  • Robust cloud publishing solution to all major VR platforms
  • Delivers high quality VR over LTE networks

Design

The camera itself is quite nice. It’s a hefty, well-crafted chunk of heavy polymer and machined metal about the size of an extra large coffee mug. It has sixteen Sunex DSL218 F2.0 lenses and 1/2.8” Sony IMX 326 sensors, and is flanked on top and bottom by generous ventilation grills.

The bottom of the unit has ports for USB, stereo audio-in, Ethernet, 12V DC 5A power, a microSD slot, TOSLINK (optical audio) and HDMI out, as well as a standard 1/4”-20 thread for mounting to any standard tripod plate or system.

The Live Planet camera may look like something out of a science fiction movie, but it’s a robust camera with sixteen F2.0 lenses.

The camera records to a single microSD card in a compressed .mp4 format. It also offers HDMI out for YUV 4:2:0 capture, so you can transmit the signal in either stereo or mono to a switcher or into a traditional broadcasting workflow. For standard recording it captures at 50 Mbps, and for live-streaming it can capture at 15, 30, 45 or 60Mbps.
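As a rough sense of scale (simple arithmetic based on the bitrates above, not a figure from Live Planet's spec sheet), you can estimate how quickly those recording rates fill a card:

```python
def minutes_per_card(bitrate_mbps, card_gb):
    """Rough recording time for a given card size, ignoring filesystem overhead."""
    card_megabits = card_gb * 8 * 1000      # GB -> megabits (decimal units)
    seconds = card_megabits / bitrate_mbps
    return seconds / 60

# At the camera's standard 50 Mbps recording rate, a 128GB microSD card
# holds roughly five and a half hours of footage:
print(round(minutes_per_card(50, 128)))   # ≈ 341 minutes
```

At the 60Mbps live-streaming ceiling the same card would fill in around four and three-quarter hours, so card capacity is unlikely to be the limiting factor for most shoots.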

This all weighs in at around 700g (1.5 lb) and comes packaged in a Pelican case with custom foam cutouts.

I had no qualms about the aesthetic and physical design of the camera, but there are a few key points to take into account as you consider whether this system is right for you, which brings me to…

In use

As with most VR cameras, much of the magic happens on the software side. In the case of the Live Planet system, most of that magic is related to live streaming. If you’re looking for a rig to showcase and/or live-stream produced events, such as sports, concerts or conferences, and expect to do little to no post-processing, the Live Planet VR system is certainly one to take a look at, for this is where Live Planet VR truly shines.

Live Planet is targeting a very specific user, mostly those looking to live-stream.

However, if you’re looking for a fairly portable system that you can quickly grab and go to capture high-resolution, high-quality 360 footage, you should probably be aware of a few things:

There are no internal microphones. If you plan to capture audio you need to connect a third party microphone, such as the Zoom H2N or Zoom H3-VR, along with a clamp or second mount to keep it out of view below the camera. Live Planet does support direct-to-soundboard input, so audio is attached to your video files.

There’s no internal battery pack. In order to use this unit away from a power source, you need a third party battery pack. The review unit came equipped with a Watson Pro VM-95-ME 97Wh external Lithium-Ion battery pack and Core SWX GPSCPM V-Mount plate and monopod clamp, also to be attached to the tripod. These are not included when purchasing the Live Planet VR camera.

The camera’s ethernet plug provides a reliable connection for live streaming content.

You can’t preview recorded clips in the field. The app provides screenshots, but you can’t review recorded video files until you offload the footage onto a laptop or computer. Live Planet says that this is a feature on their roadmap.

The camera only records in compressed .mp4 format. While most cameras at this price point offer several recording options, those interested in color grading and refining stitches may want to look elsewhere as the Live Planet VR camera does not currently record Log or Raw footage, nor does it allow you access to full resolution un-stitched camera files for fine-tuning using tools like Mistika VR or Nuke. Live Planet tells me they have features in beta for Raw capture and a Premiere Pro plugin, however it doesn’t seem that access to individual camera files is yet on the roadmap.

The Live Planet VR camera requires some accessories, like an off camera battery and audio recorder, so it’s not the best camera for quick projects. But for live-streaming events it’s a very powerful solution.

There’s no internal gyroscope or stabilization solution. While this wouldn’t be as much of an issue if individual camera files could be accessed to post-process using third party software, moving shots are virtually impossible with the Live Planet VR system without a gimbal or rover with stabilization.

These may seem like some serious limitations, and for certain uses they are. However, when you consider that the system is optimized to live-stream VR content, features like previewing recorded clips or the specific recording format used are probably less critical.

Software and apps

First, the basics.

Both the mobile app and the web app (accessed from your computer or laptop) are very simple to use. For image control they offer the essentials: exposure (including auto-exposure), shadows, saturation, temperature (including auto-white balance), tint, and curves.

Additionally, you can choose between monoscopic and stereoscopic, quality of live-stream, choice of audio stream (with optional microphone attached), and field of view (currently 360 or 180, with plans on future updates to choose anywhere between 0 and 360 degrees). Finally, you get the option of recording or live-streaming, as well as a button to turn on and off the camera unit.

The Live Planet mobile app is easy to use and offers a lot of control.

Now, the magic.

First, by taking full advantage of an NVIDIA Jetson TX2 module (with AI computing), the unit stitches 4K footage in real-time, and the algorithm does a fantastic job of it. I was as close as a half meter (1.5 ft) away, and stitch-lines are hardly noticeable, as you can see in the video below.

The optical flow ‘Jello’ effect often seen with other software stitching systems at such close distances is impressively absent. It really wasn’t until I was about 30cm (1 ft) away that I even noticed any stitch lines. To get these kinds of results from a package that fits in the palm of your hand, when just a few years ago you needed an ultra-powerful desktop-sized stitchbox sitting underneath the camera, is a more than impressive feat.

Editors note: for best results, we recommend watching this sample video on a mobile or head-mounted device.

The Live Planet system is able to live-stitch 4K footage in real-time, a very impressive feat considering how well it works. Stitch lines are hardly noticeable until something gets within about a half meter of the camera. (Please excuse the lack of audio – I failed to consider the lack of internal microphones until I offloaded later in the day).

Second, the ability to monitor and preview live using a Samsung Gear VR headset is priceless. It’s very easy to setup and use, and is a wonderful way for a producer, director or client to experience, on set, the 360 sphere the way the end user will experience it. It’s also capable of simultaneously streaming an equirectangular preview to a laptop or computer, from where you can also control the camera and settings.

Third, and this is perhaps the main selling point for the Live Planet system, is that it gives the user the opportunity to simultaneously live-stream in 4K stereoscopic video, at as low as 2Mbps, to various platforms, including Oculus Go, Samsung Gear VR, and Google Daydream headsets as well as YouTube and any platform that supports Real-Time Messaging Protocol (RTMP).

To get these kinds of results from a package that fits in the palm of your hand, when just a few years ago you needed a powerful desktop-sized stitchbox sitting underneath the camera, is a more than impressive feat.

This makes it an elegant solution for publishers of live events to easily distribute to multiple channels. All you need to do is connect the camera to a router via Ethernet cable, plug it in, and hit the Livestream button on the mobile app or computer. You then share a simple event code generated by the Live Planet Cloud to whomever you like and users can login to experience both live, as well as pre-recorded, video. Facebook and Vimeo support are on the way.

Now here’s the kicker – users of most major 360 cameras such as Insta360 Pro, Vuze+, Samsung Gear VR and Rylo can now take advantage of Live Planet VR Studio, Live Planet’s cloud publishing platform. Since the software is where Live Planet does some serious algorithmic voodoo, this is an incredibly welcome feature. Using the Live Planet publishing platform gives you a consistent, easy way to push your 360 content out to the world, no matter what platform a user chooses to experience it on. I cannot think of an easier turnkey way to simultaneously publish to all major VR and social media outlets.

Finally – and this is one of the least talked about and least understood, but perhaps one of the most exciting, parts of the technical design – it all runs on a blockchain-enabled infrastructure called VideoCoin.

I cannot think of an easier turnkey way to simultaneously publish to all major VR and social media outlets.

Those that know me know that anything involving blockchain and distributed ledger technology gets my full attention. While this isn’t the place to get into the nitty-gritty of blockchain tech – essentially the basis of most cryptocurrencies – what’s important in the Live Planet VR system’s case is that it provides a peer-to-peer, decentralized, encrypted platform for safeguarding, controlling, and distributing your own data, in this case video.

Live Planet also employs a proprietary algorithm in what they call ViewCast technology, which predicts head movements in order to maximize resolution in the direction the eye is facing, enabling high-resolution viewing in headsets even on mobile networks.

The Live Planet system arrives in a very sturdy Pelican case with custom foam cutouts.

The Live Planet team also indicated that they “plan to release an update every 3 weeks.” Some of the things they specifically pointed out to us include still photo capture, HDR capture, flexible field of view capture (0 to 360º), support for streaming from multiple cameras, RAW capture, an Adobe Premiere Pro plugin to aid in post-production color grading, and spatial audio support on the Live Planet publishing platform.

Be aware that storage and publishing through the cloud system will cost you, based on how much streaming time and storage you need. Packages range from $50/month for 90 minutes of streaming and 50GB of storage, up to $270/month for 10 hours of streaming and 250GB of storage. To take advantage of the ViewCast technology, it’ll cost you $9.99 per streamed hour.
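To compare the tiers on equal footing, it helps to work out the effective cost per streamed hour (simple arithmetic from the prices above; the tier names are my own labels, not Live Planet's):

```python
# Hypothetical labels for the two published pricing tiers
plans = {
    "entry": {"monthly_usd": 50,  "stream_hours": 1.5},   # 90 minutes of streaming
    "top":   {"monthly_usd": 270, "stream_hours": 10.0},
}

for name, p in plans.items():
    per_hour = p["monthly_usd"] / p["stream_hours"]
    print(f"{name}: ${per_hour:.2f} per streamed hour")
```

The larger plan works out to $27 per streamed hour versus roughly $33 for the smaller one, before adding the $9.99/hour ViewCast surcharge, so heavy streamers get a modest per-hour discount rather than a dramatic one.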

Image quality

Image quality is certainly acceptable, especially for the turn-key live event use cases described above. While a recent update allows for 6K capture (6144×3328) at 24fps, that is currently reserved for post-production stitching; maximum resolution for live streaming is 4096 x 2160 at 30fps. The camera does a very good job of rendering nearby detail, but at further distances in high-contrast situations – say, under a tree canopy or between several buildings in daylight – there tends to be some noticeable edge-fringing.
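Those resolution figures are less generous than they sound, because an equirectangular frame spreads its horizontal pixels across the full 360° sphere. A quick calculation (simple arithmetic, not a measured figure) shows the angular pixel density:

```python
def pixels_per_degree(horizontal_px, fov_degrees=360):
    """Angular pixel density of an equirectangular frame along the horizon."""
    return horizontal_px / fov_degrees

print(round(pixels_per_degree(4096), 1))  # 4K live stream: ~11.4 px/degree
print(round(pixels_per_degree(6144), 1))  # 6K post-stitch: ~17.1 px/degree
```

For comparison, a conventional camera filling a 4096-pixel-wide frame with a 60° field of view delivers around 68 px/degree, which is why distant detail in 360 footage always looks softer than the headline resolution suggests.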

Conclusions

When I first started using the Live Planet I ran into some of the same frustrations I had when I started shooting 360 several years back, piecing together third party accessories and solutions. With no internal battery, no on-board audio recording and no controls on the camera unit itself, it’s not the best camera for travel or on-location shooting. It has a lot of moving pieces, all of which have to work perfectly together.

Bottom line, this is a perfect turn-key solution for the quickly growing market of live-streaming events in 360 video.

However, when I began to truly consider what the Live Planet system was designed to do – effortlessly stream live events – my perspective changed. Live Planet’s software engineering and solutions are top notch. For live-streaming, especially over wireless networks, 4K is more than enough resolution for that bandwidth to handle. Furthermore, since Live Planet has begun to open up its software solutions to users of other cameras, it’s absolutely worth keeping an eye on Live Planet’s evolution as I’ve yet to see anything that rivals Live Planet VR Studio on the software and distribution front.

Bottom line, this is a perfect turn-key solution for the quickly growing market of live-streaming events in 360 video, and it gets my enthusiastic recommendation as a system to use for live streaming purposes. Additionally, Live Planet VR Studio certainly gets my nod as a publishing platform for users of any camera.

What we like

  • Build quality
  • Instantaneous stitching
  • Live Planet live-streaming publishing platform
  • Good image quality
  • End-to-end system
  • Only requires one microSD card

What we’d like to see improved

  • Price
  • No internal or swappable battery
  • No audio recording
  • No still photo capture (coming soon)
  • No raw recording (coming soon)
  • No access to individual camera files
  • No stabilization
(Rating based primarily on use as a live-streaming system)

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Review: Live Planet VR live-streaming system

Posted in Uncategorized

 

Adobe Photoshop and Premiere Elements 2020 arrive with new AI-powered tools

05 Oct
A sample from Adobe showing off the new One-click subject selection tool.

Adobe has released Photoshop Elements 2020 and Premiere Elements 2020, adding a number of new features and capabilities powered by the company’s Adobe Sensei AI, including automatic selection, skin smoothing, colorization, new Auto Creations and more, as detailed in an announcement blog post.

Photoshop Elements and Premiere Elements are entry-level versions of Adobe’s flagship editing applications, offering general consumers access to some of the same tools and capabilities, but with less complexity and a lower price: $99.99 each for a full software license.

A sample comparison from Adobe showing off the new Sensei-powered auto-colorization tool.

The Photoshop Elements 2020 update brings a number of new features, including new Pattern Brush, B&W Selection, Depth of Field, and Painterly effects for Auto Creations, support for automatically colorizing black-and-white images using AI, automatic skin smoothing, and one-click subject selection.

A comparison from Adobe showing off the skin-smoothing tool.

The updated software also enables users to remove unwanted objects from images, add heart and star patterns to photos, and search for content via Smart Tags. Beyond that, the software has received general performance enhancements, as well as support for HEIF images and HEVC videos. For customers located in the United States, Photoshop Elements now also supports directly ordering prints and other items through the Fujifilm Prints and Gifts service.

A comparison photo set showing off the new noise reduction technology.

Joining the big Photoshop Elements 2020 update is the new Premiere Elements 2020, an update that adds simplified noise reduction, a sky replacement tool, support for turning images and videos into dynamic time-lapses, and a tool that replaces the black bars in vertical videos with a fill that matches the video, as seen below.

As with the new version of Photoshop Elements, this Premiere Elements update also adds Smart Tag search and support for HEIF/HEVC formats. The software also supports searching videos for specific people using Sensei’s face-matching capabilities. Finally, Premiere Elements now includes five guided edits that help users modify their videos.

In addition to the individual $99.99 license price, Adobe offers Photoshop Elements 2020 and Premiere Elements 2020 bundled for $149.99. Existing customers can upgrade either of the new products for $79.99, or both for a total of $119.99.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Adobe Photoshop and Premiere Elements 2020 arrive with new AI-powered tools

Posted in Uncategorized

 

iPhone 11’s coolest photo feature is the hardest one to find

05 Oct
Cinerama leaning back – a natural result of pointing my camera upwards to capture the whole building.

Anyone who has stood at ground level and taken a photo of a building across the street has likely seen the effects of perspective distortion – you tilt your camera back to bring the whole building into frame, causing the straight lines of the building to appear to be ‘leaning back.’ Tilt-shift lenses are designed for exactly this problem, but they’re expensive, specialist optics.

More often, this effect will be corrected in software, but doing so usually requires the user to stretch the top of the image and crop to avoid the blank spaces this creates at the bottom of the frame. Apple is tackling this problem with a unique approach in the iPhone 11: by capturing more data outside of the frame.
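The stretch-and-crop problem described above can be sketched with a simple keystone homography. This is an illustration of the general technique, not Apple's actual processing, and the matrix and pixel values are made up: correcting 'leaning back' verticals maps the frame to a trapezoid, leaving blank wedges that force a crop unless extra image data exists to fill them.

```python
import numpy as np

def apply_homography(H, pts):
    """Apply a 3x3 homography to Nx2 points, with perspective divide."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

W, Hpx = 4000, 3000              # hypothetical 12MP source frame
g = 1e-4                         # illustrative keystone strength
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0,   g, 1.0]])  # shrinks lower rows, straightening the verticals

corners = np.array([[0, 0], [W, 0], [0, Hpx], [W, Hpx]], dtype=float)
mapped = apply_homography(H, corners)

# Top corners stay put; bottom corners pull inward and upward,
# leaving blank wedges at the bottom edges of the frame:
print(mapped)
```

Without data from beyond the original frame, the only options are cropping into the trapezoid (losing pixels) or synthesizing the missing wedges; the iPhone 11's extra wide-angle capture fills them with real image data instead.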

I don’t know, I just like boring photos I guess?

For whatever reason, I’m drawn to the types of photos where perspective distortion is painfully obvious – signs, sides of buildings, etc. – but I’m horrible at lining them up correctly. Usually, I find out going through my images later that I wasn’t squared up to my subject even though I thought I was. Horizons are slightly askew, or I was leaning back slightly. Apple, it seems, has heard my cries.

When you’re shooting with the standard camera (with a focal length equivalent to about 26mm), the iPhone 11 will also capture image data from the ultra-wide (13mm equiv.) camera – a feature that is referred to in the settings menu as “Photos Capture Outside the Frame.” If you’re shooting on the telephoto camera of the 11 Pro, it’ll capture additional information from the standard camera.

That extra information is saved alongside your photo. When you edit that image in the native camera app, you’ll be able to use the extra data as you rotate and manipulate your image – a big help when you’re trying to fix crooked lines in a photo.

As you make image adjustments, you’ll see the extra data captured by the ultra-wide lens. This additional image information is available for 30 days.

The phone can use that information to automatically re-crop photos too. In the camera settings menu there’s an option to “Auto Apply Adjustments.” You’ll know that auto adjustments have been applied to an image when it shows a blue “Auto” icon above your captured photo. We’ve noticed this feature being employed when the phone detects a human subject cut off at the edge of the frame.

And even for many photos that aren’t automatically adjusted, the stock camera app will suggest tweaks when brought into edit. For example, take that image of the building that’s leaning back – if you edit it in the iPhone’s camera app and engage the crop tool, it will automatically correct for perspective distortion and use the extra image data it saved to fill in the areas at the edges of the frame that would otherwise need to be cropped out.

Bringing the image into the iPhone’s native editing app, then pressing the ‘crop’ option will take you to this view. The yellow ‘auto’ icon appears at the top of the image if there’s a suggested crop, as there is in this example.
The same adjustments can be applied in Photoshop, but without that extra image information at the sides of the frame you’ll need to crop in to avoid including blank space in your final image.
The iPhone goes beyond these limitations with that extra image data. In addition to correcting perspective, you can creatively re-crop your image to preserve details at the edge of the frame – and even include objects that were well outside of the frame in your initial standard image.

I don’t think many people will discover this feature, and that’s a shame. It’s not just helpful for correcting distortion and fixing crooked horizons – it’s a useful feature if you just want to re-crop an image after-the-fact. However, it will only be discovered by those who enable the ‘capture outside the frame’ feature and attempt to crop an image, which I imagine is a fraction of the many people who will use the camera day in and day out.

Regardless of how widely used this feature will be, what Apple is doing is clever. Photoshop’s Content-Aware Fill feature does something similar – it will fill in missing data when rotating or stretching an image – but instead of using data from a wider lens, it’s filling in those empty spaces based on educated guesses. Apple’s approach is just one more way in which smartphone manufacturers are using data to their advantage – to the advantage of boring photo fans everywhere.

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on iPhone 11’s coolest photo feature is the hardest one to find

Posted in Uncategorized