
Video: 37 different camera shutter sounds in 3 minutes

06 Oct

Similar to how no two fingerprints are identical, no two camera shutter sounds are exactly the same. As a fun little project, photographer and YouTuber Scott Graham has captured the shutter sound of 37 different camera models to show off the diversity of shutter sounds and to memorialize a number of cameras he’s selling.

In the video, which comes in just shy of four minutes, Graham succinctly captures the unique shutter sounds of all 37 cameras, ranging from analog SLR cameras to digital Fujifilm cameras. Each shutter sound was captured as close to 1/60th of a second as possible for consistency’s sake.

Graham didn’t elaborate on whether or not he will continue to do this with future cameras he acquires, but we think it’d be incredible to build an archive of shutter sounds from various cameras. What camera has the most pleasing sound to your ears, both from Graham’s collection and your own?

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Video: 37 different camera shutter sounds in 3 minutes

Posted in Uncategorized

 

Review: Live Planet VR live-streaming system

05 Oct

Live Planet VR camera and live-streaming platform
$3,495 | liveplanet.net

Live Planet VR bills itself as “the only end-to-end VR system,” and technically, since it includes a camera system as well as a cloud publishing suite capable of delivering to just about every major VR headset and outlet currently available (including live-streaming high-resolution stereoscopic 360 video over mobile networks), it may be right.

Live Planet has a lot of things going for it, especially when it comes to the algorithm and software solutions side of things. Their live-streaming and real-time stitching execution is impressive, and I can also see many cases where their cloud publishing platform could be a godsend, which we’ll get to below.

In fact, whether or not Live Planet VR is right for you is highly dependent on how you plan to use it, as Live Planet is targeting a very specific user – mostly those looking to live-stream.

But we’ll start with the key features and the design.

Key Features

  • 16-lens stereoscopic VR 6K camera
  • DCI 4K/30p (4096×2160) resolution for live streaming
  • 6K/24p (6144×3328) for post-production stitching
  • In-camera real-time stitching
  • Live-streaming capability
  • Records to a single microSD card
  • VR headset live preview
  • Robust cloud publishing solution to all major VR platforms
  • Delivers high quality VR over LTE networks

Design

The camera itself is quite nice. It’s a hefty, well-crafted chunk of heavy polymer and machined metal about the size of an extra-large coffee mug. It has sixteen Sunex DSL218 F2.0 lenses paired with 1/2.8” Sony IMX326 sensors, and is flanked on top and bottom by generous ventilation grilles.

The bottom of the unit has inputs for USB stereo, audio-in, ethernet, 12V DC 5A power, microSD slot, TOSLINK (optical audio), and HDMI out, as well as a standard 1/4”-20 thread for mounting to any standard tripod plate or system.

The Live Planet camera may look like something out of a science fiction movie, but it’s a robust camera with sixteen F2.0 lenses.

The camera records to a single microSD card in a compressed .mp4 format. It also offers HDMI out for YUV 4:2:0 capture, so you can send the signal, in either stereo or mono, to a switcher or into a traditional broadcasting workflow. For standard recording it captures at 50 Mbps, and for live-streaming it can encode at 15, 30, 45, or 60 Mbps.
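Those bitrates map directly onto card capacity. As a rough back-of-the-envelope sketch (assuming a constant bitrate and the decimal gigabytes that card makers use; the 256GB card size is just an example, not a stated requirement):

```python
def gb_per_hour(mbps):
    """Approximate storage consumed per hour of recording at a given bitrate.

    Assumes constant bitrate and decimal gigabytes (1 GB = 10^9 bytes),
    which is how memory card capacities are marketed.
    """
    bits = mbps * 1_000_000 * 3600   # bits recorded in one hour
    return bits / 8 / 1_000_000_000  # convert to bytes, then to gigabytes

# At the camera's 50 Mbps recording rate:
print(gb_per_hour(50))  # 22.5
```

At 22.5 GB per hour, a 256GB microSD card would hold roughly 11 hours of footage, which helps explain why a single card is a workable design choice.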

This all weighs in at around 700g (1.5 lb) and comes packaged in a Pelican case with custom foam cutouts.

I had no qualms about the aesthetic and physical design of the camera, but there are a few key points to take into account as you consider whether this system is right for you, which brings me to…

In use

As with most VR cameras, much of the magic happens on the software side. In the case of the Live Planet system, most of that magic is related to live streaming. If you’re looking for a rig to showcase and/or live-stream produced events, such as sports, concerts or conferences, and expect to do little to no post-processing, the Live Planet VR system is certainly one to take a look at, for this is where Live Planet VR truly shines.


However, if you’re looking for a fairly portable system that you can quickly grab and go to capture high-resolution, high-quality 360 footage, you should probably be aware of a few things:

There are no internal microphones. If you plan to capture audio, you need to connect a third-party microphone, such as the Zoom H2n or Zoom H3-VR, plus a second tripod or a clamp to hold it out of view below the camera. Live Planet does support direct-to-soundboard input, so audio is attached to your video files.

There’s no internal battery pack. To use this unit away from a power source, you need a third-party battery pack. The review unit came equipped with a Watson Pro VM-95-ME 97Wh external lithium-ion battery pack and a Core SWX GPSCPM V-Mount plate and monopod clamp, also to be attached to the tripod. These are not included when purchasing the Live Planet VR camera.

The camera’s ethernet plug provides a reliable connection for live streaming content.

You can’t preview recorded clips in the field. The app provides screenshots of what you’ve recorded, but you can’t view the recorded video files until you offload the footage onto a computer. Live Planet says in-field preview is on their roadmap.

The camera only records in compressed .mp4 format. While most cameras at this price point offer several recording options, those interested in color grading and refining stitches may want to look elsewhere: the Live Planet VR camera does not currently record Log or Raw footage, nor does it give you access to full-resolution un-stitched camera files for fine-tuning in tools like Mistika VR or Nuke. Live Planet tells me Raw capture and a Premiere Pro plugin are in beta, but access to individual camera files doesn’t yet appear to be on the roadmap.

The Live Planet VR camera requires some accessories, like an off camera battery and audio recorder, so it’s not the best camera for quick projects. But for live-streaming events it’s a very powerful solution.

There’s no internal gyroscope or stabilization solution. This would matter less if individual camera files could be accessed for post-processing in third-party software, but as it stands, moving shots are virtually impossible with the Live Planet VR system without a gimbal or a rover with stabilization.

These may seem like some serious limitations, and for certain uses they are. However, when you consider that the system is optimized to live-stream VR content, features like previewing recorded clips or the specific recording format used are probably less critical.

Software and apps

First, the basics.

Both the mobile app and the browser-based web app are very simple to use. For image control they offer the essentials: exposure (including auto-exposure), shadows, saturation, temperature (including auto white balance), tint, and curves.

Additionally, you can choose between monoscopic and stereoscopic capture, the quality of the live-stream, the audio stream (with an optional microphone attached), and the field of view (currently 360 or 180 degrees, with future updates planned to allow anywhere between 0 and 360 degrees). Finally, you get the option of recording or live-streaming, as well as a button to turn the camera unit on and off.

The Live Planet mobile app is easy to use and offers a lot of control.

Now, the magic.

First, by taking full advantage of an NVIDIA Jetson TX2 module (with on-board AI computing), the unit live-stitches 4K footage in real time, and the algorithm does a fantastic job of it. I was as close as a half meter (1.6 ft) away, and stitch lines are hardly noticeable, as you can see in the video below.

There is a notable absence of the optical-flow ‘Jello’ effect often seen with other software stitching systems at such close distances. It really wasn’t until I was about 30cm (1 ft) away that I noticed any stitch lines at all. To get this kind of result from a package that fits in the palm of your hand, when just a few years ago you needed an ultra-powerful desktop-sized stitchbox sitting underneath the camera, is a more than impressive feat.

Editor’s note: for best results, we recommend watching this sample video on a mobile or head-mounted device.

The Live Planet system is able to live-stitch 4K footage in real-time, a very impressive feat considering how well it works. Stitch lines are hardly noticeable until something gets within about a half meter of the camera. (Please excuse the lack of audio – I failed to consider the lack of internal microphones until I offloaded later in the day).

Second, the ability to monitor and preview live using a Samsung Gear VR headset is priceless. It’s very easy to set up and use, and it’s a wonderful way for a producer, director or client to experience, on set, the 360 sphere the way the end user will. The camera can also simultaneously stream an equirectangular preview to a laptop or computer, from which you can control the camera and its settings.

Third, and this is perhaps the main selling point for the Live Planet system, is that it lets the user simultaneously live-stream 4K stereoscopic video, at bitrates as low as 2 Mbps, to various platforms, including Oculus Go, Samsung Gear VR, and Google Daydream headsets, as well as YouTube and any platform that supports Real Time Messaging Protocol (RTMP).


This makes it an elegant solution for publishers of live events to distribute to multiple channels. All you need to do is connect the camera to a router via Ethernet cable, plug it in, and hit the Livestream button in the mobile app or on your computer. You then share a simple event code generated by the Live Planet Cloud with whomever you like, and users can log in to experience both live and pre-recorded video. Facebook and Vimeo support are on the way.

Now here’s the kicker: users of most major 360 cameras, such as the Insta360 Pro, Vuze+, Samsung Gear 360 and Rylo, can now take advantage of Live Planet VR Studio, Live Planet’s cloud publishing platform. Since the software is where Live Planet does its serious algorithmic voodoo, this is an incredibly welcome feature. The Live Planet publishing platform gives you a consistent, easy way to push your 360 content out to the world, no matter what platform a viewer chooses to experience it on. I cannot think of an easier turnkey way to simultaneously publish to all major VR and social media outlets.

Finally, one of the least talked about and least understood, but perhaps most exciting, parts of the technical design: it all runs on a blockchain-enabled infrastructure called VideoCoin.


Those who know me know that anything involving blockchain and distributed ledger technology gets my full attention. This isn’t the place to get into the nitty-gritty of blockchain tech, which underpins most cryptocurrency; what matters in the Live Planet VR system’s case is that it provides a peer-to-peer, decentralized, encrypted platform for data distribution, a remarkably secure way to safeguard, control, and distribute your own data, in this case video.

Live Planet also employs a proprietary algorithm in what they call ViewCast technology, which predicts head movements in order to maximize resolution in the direction the eye is facing, enabling high-resolution viewing in headsets even on mobile networks.

The Live Planet system arrives in a very sturdy Pelican case with custom foam cutouts.

The Live Planet team also indicated that they “plan to release an update every 3 weeks.” Some of the things they specifically pointed out to us include still photo capture, HDR capture, flexible field of view capture (0 to 360º), support for streaming from multiple cameras, RAW capture, an Adobe Premiere Pro plugin to aid in post-production color grading, and spatial audio support on the Live Planet publishing platform.

Be aware that storage and publishing through the cloud system will cost you, based on how much streaming time and storage you need. Packages range from $50/month for 90 minutes of streaming and 50GB of storage, up to $270/month for 10 hours of streaming and 250GB of storage. Taking advantage of the ViewCast technology costs an additional $9.99 per streamed hour.
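Putting those numbers together, here is a hedged sketch of what a month might cost. Only the two price points above were published; any tiers in between are unknown, so this hypothetical helper simply picks the cheapest published tier that covers the requested usage:

```python
def monthly_cost(streaming_hours, storage_gb, viewcast_hours=0):
    """Rough monthly cost estimate from the two published package points.

    Only the $50 (90 min / 50GB) and $270 (10 hr / 250GB) tiers were
    announced; plans in between are not listed, so this sketch picks the
    cheapest published tier that covers the requested usage.
    """
    tiers = [(1.5, 50, 50.00), (10, 250, 270.00)]  # (max hours, max GB, $/month)
    for max_hours, max_gb, price in tiers:
        if streaming_hours <= max_hours and storage_gb <= max_gb:
            base = price
            break
    else:
        raise ValueError("usage exceeds the published tiers")
    return base + 9.99 * viewcast_hours  # ViewCast is billed per streamed hour

# Two hours of streaming, 100GB of storage, ViewCast on throughout:
cost = monthly_cost(2, 100, viewcast_hours=2)
```

That example lands on the $270 tier plus two hours of ViewCast fees, which illustrates how quickly the per-hour ViewCast charge can add up for heavy streamers.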

Image quality

Image quality is certainly adequate, especially for the turn-key live event use cases described above. A recent update allows 6K (6144×3328) capture at 24fps, but that is currently reserved for post-production stitching; maximum resolution for live streaming is 4096 x 2160 at 30fps. The camera does a very good job of rendering nearby details. At greater distances in high-contrast situations, say under a tree canopy or between several buildings in daylight, there tends to be some noticeable edge-fringing.

Conclusions

When I first started using the Live Planet, I ran into some of the same frustrations I had when I started shooting 360 several years back, piecing together third-party accessories and solutions. With no internal battery, no on-board audio recording, and no controls on the camera unit itself, it’s not the best camera for travel or on-location shooting. It has a lot of moving pieces, all of which have to work together flawlessly.


However, when I began to truly consider what the Live Planet system was designed to do – effortlessly stream live events – my perspective changed. Live Planet’s software engineering and solutions are top notch. For live-streaming, especially over wireless networks, 4K is plenty of resolution given the available bandwidth. Furthermore, since Live Planet has begun to open up its software solutions to users of other cameras, it’s absolutely worth keeping an eye on Live Planet’s evolution, as I’ve yet to see anything that rivals Live Planet VR Studio on the software and distribution front.

Bottom line, this is a perfect turn-key solution for the quickly growing market of live-streaming events in 360 video, and it gets my enthusiastic recommendation as a system to use for live streaming purposes. Additionally, Live Planet VR Studio certainly gets my nod as a publishing platform for users of any camera.

What we like

  • Build quality
  • Instantaneous stitching
  • Live Planet live-streaming publishing platform
  • Good image quality
  • End-to-end system
  • Only requires one microSD card

What we’d like to see improved

  • Price
  • No internal or swappable battery
  • No audio recording
  • No still photo capture (coming soon)
  • No raw recording (coming soon)
  • No access to individual camera files
  • No stabilization
(Rating based primarily on use as a live-streaming system)


 

How to Make Your Photos Awesome in Lightroom or Photoshop Camera RAW

05 Oct

The post How to Make Your Photos Awesome in Lightroom or Photoshop Camera RAW appeared first on Digital Photography School. It was authored by Caz Nowaczyk.

In this video tutorial, Nemanja Sekulic will show you how to make some dramatic editing changes to your RAW photos using Lightroom or Photoshop Camera RAW.


During the process, you will learn the following in Lightroom (which you can also translate to Photoshop Camera RAW):

  • How to use the Basic Panel, including the Exposure, Highlight, Shadow, and Color Temperature sliders.
  • The shortcut for viewing before/after (\)
  • How to use the Radial Filter tool – how to make multiple radial filter selections, reposition, and make adjustments to the selection.
  • How to use the Adjustment Brush Tool – including changing your brush size, flow, and feather amounts.
  • How to use it to make selective adjustments in your image, including color, temperature, exposure, highlights, shadows, clarity, etc. to fine-tune your image.
  • How to use selective color with your Adjustment Brush.
  • How to make new Adjustment Brushes to fine-tune the details in the eyes.
  • How to use Hue and Saturation Panels as well as the Split Toning Panel.
  • How to add a vignette.
  • How to go back and readjust any of your Radial Filter and Adjustment Brush settings.

You can apply these techniques across any image you choose, or you can download Nemanja’s image file here.

You may also find the following helpful:

  • Photoshop vs Lightroom – the Power of Photoshop
  • Four Lightroom Tips to Enhance Your Landscape Photos
  • Lightroom Texture Slider vs. Skin Smoothing
  • Lightroom Shortcuts Every Photographer Needs to Know
  • 10 Tips to Make Lightroom Classic CC Run Faster

 



Digital Photography School

 

 

Adobe Photoshop and Premiere Elements 2020 arrive with new AI-powered tools

05 Oct
A sample from Adobe showing off the new One-click subject selection tool.

Adobe has released Photoshop Elements 2020 and Premiere Elements 2020, adding a number of new features and capabilities powered by the company’s Adobe Sensei AI, including automatic selection, skin smoothing, colorization, new Auto Creations and more, as detailed in an announcement blog post.

Photoshop Elements and Premiere Elements are the entry-level versions of Adobe’s products, offering general consumers some of the tools and capabilities of the full versions, but with less complexity and a lower price of $99.99 each for a full software license.

A sample comparison from Adobe showing off the new Sensei-powered auto-colorization tool.

The Photoshop Elements 2020 update brings a number of new features, including new Pattern Brush, B&W Selection, Depth of Field, and Painterly effects for Auto Creations, support for automatically colorizing black-and-white images using AI, automatic skin smoothing, and one-click subject selection.

A comparison from Adobe showing off the skin-smoothing tool.

The updated software also enables users to remove unwanted objects from images, add heart and star patterns to photos, and search for content via Smart Tags. Beyond that, the software has received general performance enhancements, as well as support for HEIF images and HEVC videos. For customers located in the United States, Photoshop Elements now also supports directly ordering prints and other items through the Fujifilm Prints and Gifts service.

A comparison photo set showing its noise reduction technology.

Joining the big Photoshop Elements 2020 update is the new Premiere Elements 2020, an update that adds simplified noise reduction, a sky replacement tool, support for turning images and videos into dynamic time-lapses, and a tool that replaces the black bars in vertical videos with a fill that matches the video, as seen below.

As with the new version of Photoshop Elements, this Premiere Elements update also adds Smart Tag search and support for HEIF/HEVC formats. The software also supports searching videos for specific people using Sensei’s face-matching capabilities. Finally, Premiere Elements now includes five guided edits that help users modify their videos.

In addition to the individual $99.99 license price, Adobe offers Photoshop Elements 2020 and Premiere Elements 2020 bundled for $149.99. Existing customers can upgrade either product for $79.99, or both for a total of $119.99.


 

iPhone 11’s coolest photo feature is the hardest one to find

05 Oct
Cinerama leaning back – a natural result of pointing my camera upwards to capture the whole building.

Anyone who has stood at ground level and taken a photo of a building across the street has likely seen the effects of perspective distortion – you tilt your camera back to bring the whole building into frame, causing the straight lines of the building to appear to be ‘leaning back.’ Tilt-shift lenses are designed for exactly this problem, but they’re expensive, specialist optics.

More often, this effect will be corrected in software, but doing so usually requires the user to stretch the top of the image and crop to avoid the blank spaces this creates at the bottom of the frame. Apple is tackling this problem with a unique approach in the iPhone 11: by capturing more data outside of the frame.

I don’t know, I just like boring photos I guess?

For whatever reason, I’m drawn to the types of photos where perspective distortion is painfully obvious – signs, sides of buildings, etc. – but I’m horrible at lining them up correctly. Usually, I find out going through my images later that I wasn’t squared up to my subject even though I thought I was. Horizons are slightly askew, or I was leaning back slightly. Apple, it seems, has heard my cries.

When you’re shooting with the standard camera (with a focal length equivalent to about 26mm), the iPhone 11 will also capture image data from the ultra-wide (13mm equiv.) camera – a feature that is referred to in the settings menu as “Photos Capture Outside the Frame.” If you’re shooting on the telephoto camera of the 11 Pro, it’ll capture additional information from the standard camera.

That extra information is saved alongside your photo. When you edit that image in the native camera app, you’ll be able to use the extra data as you rotate and manipulate your image – a big help when you’re trying to fix crooked lines in a photo.

As you make image adjustments, you’ll see the extra data captured by the ultra-wide lens. This additional image information is available for 30 days.

The phone can use that information to automatically re-crop photos too. In the camera settings menu there’s an option to “Auto Apply Adjustments.” You’ll know that auto adjustments have been applied to an image when it shows a blue “Auto” icon above your captured photo. We’ve noticed this feature being employed when the phone detects a human subject cut off at the edge of the frame.

And even for many photos that aren’t automatically adjusted, the stock camera app will suggest tweaks when brought into edit. For example, take that image of the building that’s leaning back – if you edit it in the iPhone’s camera app and engage the crop tool, it will automatically correct for perspective distortion and use the extra image data it saved to fill in the areas at the edges of the frame that would otherwise need to be cropped out.

Bringing the image into the iPhone’s native editing app, then pressing the ‘crop’ option will take you to this view. The yellow ‘auto’ icon appears at the top of the image if there’s a suggested crop, as there is in this example.
The same adjustments can be applied in Photoshop, but without that extra image information at the sides of the frame you’ll need to crop in to avoid including blank space in your final image.
The iPhone goes beyond these limitations with that extra image data. In addition to correcting perspective, you can creatively re-crop your image to preserve details at the edge of the frame – and even include objects that were well outside of the frame in your initial standard image.
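The crop penalty this extra data avoids is easy to quantify for the simplest case, a horizon-straightening rotation: rotating a rectangular frame swings its corners outside the original bounds, and everything beyond them must either be cropped away or filled from data captured outside the frame. A quick sketch (the 4032×3024 frame size is an assumption, a typical 12MP phone image):

```python
import math

def rotated_bounding_box(w, h, degrees):
    """Size of the axis-aligned box that contains a w x h frame after
    rotating it by the given angle. The area beyond the original w x h
    is what must normally be cropped away, or filled from extra data."""
    t = math.radians(abs(degrees))
    return (w * math.cos(t) + h * math.sin(t),
            w * math.sin(t) + h * math.cos(t))

# Straightening a 3-degree crooked horizon on a 4032 x 3024 frame:
w2, h2 = rotated_bounding_box(4032, 3024, 3)
extra_w, extra_h = w2 - 4032, h2 - 3024  # pixels needed beyond the frame edges
```

Even a small 3-degree correction needs well over a hundred extra pixels on each axis, which is exactly the margin the ultra-wide capture can supply.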

I don’t think many people will discover this feature, and that’s a shame. It’s not just helpful for correcting distortion and fixing crooked horizons – it’s also useful if you simply want to re-crop an image after the fact. However, it will only be discovered by those who enable the ‘capture outside the frame’ feature and then attempt to crop an image, which I imagine is a fraction of the many people who use the camera day in and day out.

Regardless of how widely used this feature will be, what Apple is doing is clever. Photoshop’s Content-Aware Fill feature does something similar – it will fill in missing data when rotating or stretching an image – but instead of using data from a wider lens, it fills in those empty spaces based on educated guesses. Apple’s approach is just one more way in which smartphone manufacturers are using data to their advantage – and to the advantage of boring photo fans everywhere.


 

Instagram releases its ‘Restrict’ shadowbanning feature for all users

05 Oct

Instagram has fully released the ‘Restrict’ shadowban feature it first introduced as a test in July. The tool enables an Instagram user to restrict other accounts from posting content on and sending messages to their own account. As Instagram first explained this summer, Restrict is intended to limit the reach of bullies without fully blocking them, an action that may make the bullying worse.

The philosophy behind shadowbans on Instagram is simple: many users, particularly teens, face bullying from peers they know in real life, such as classmates. Blocking a bully on Instagram may cause that bully to increase their torment of the user in real life, which is why many users avoid blocking them.

In addition, and more broadly speaking, blocking an account that is posting abusive content may simply drive the bully to create a new account after the first one is blocked. For these reasons, blocking is not always the ideal way to prevent problematic comments and messages from being directed at an account.

Restrict is a solid alternative, enabling Instagram users to instead limit an unwanted account in a way that doesn’t alert the bully. Comments published by a restricted account are hidden by default and any private messages sent from the restricted account will be automatically sent to the recipient’s Message Request inbox. These restricted DMs can be read, but the sender won’t be alerted to the fact that their message was viewed.

Restrict is now available to all Instagram users.


 

The latest iOS 13 developer beta gives us a sneak peek at Apple’s new Deep Fusion mode

05 Oct

Earlier this week, Apple released the first developer beta version of iOS 13 with support for its Deep Fusion technology built-in. Although there’s still plenty to learn about the feature, multiple developers have already taken the camera tech for a spin and shared their thoughts (and results) around the web.

To refresh, below is a brief explainer on what Deep Fusion is from our initial rundown on the feature:

‘Deep Fusion captures up to 9 frames and fuses them into a higher resolution 24MP image. Four short and four secondary frames are constantly buffered in memory, throwing away older frames to make room for newer ones […] After you press the shutter, one long exposure is taken (ostensibly to reduce noise), and subsequently all 9 frames are combined – ‘fused’ – presumably using a super resolution technique with tile-based alignment (described in the previous slide) to produce a blur and ghosting-free high resolution image.’
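The buffering scheme that quote describes is essentially a fixed-size ring buffer. Here is a toy sketch of the idea (not Apple's implementation: frames are just numbers and the 'fusion' step is a plain average, standing in for the real super-resolution merge):

```python
from collections import deque

BUFFER_SIZE = 8  # four short + four secondary frames, per the description above

# deque(maxlen=...) silently drops the oldest entry when full -- exactly the
# "throwing away older frames to make room for newer ones" behavior.
frame_buffer = deque(maxlen=BUFFER_SIZE)

def on_preview_frame(frame):
    """Called continuously while the camera is live."""
    frame_buffer.append(frame)

def on_shutter_press(long_exposure):
    """Take the long exposure, then fuse it with the buffered frames."""
    frames = list(frame_buffer) + [long_exposure]
    return sum(frames) / len(frames)  # stand-in for the real fusion step

for i in range(20):       # 20 preview frames arrive...
    on_preview_frame(i)
# ...but only the newest 8 (frames 12-19) remain buffered at shutter press
result = on_shutter_press(long_exposure=100)
```

The key design point is that buffering happens continuously before the shutter press, so the "capture" is mostly retrospective: only the long exposure is taken after the user acts.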

Although the tests are far from conclusive, we’ve rounded up a few sample images and comparisons shared by Twitter users from around the world. From the commentary shared by those who have tested the feature and from a brief analysis with our own eyes, Deep Fusion appears to work as advertised, bringing out more detail and clarity in images.

In addition to the above comparison, photographer Tyler Stalman also looked at how Deep Fusion stacks up against the Smart HDR feature.

As noted by Halide co-founder Sebastiaan de With, it seems as though the image files captured with Deep Fusion are roughly twice the size of a standard photo.

Much remains to be seen about what Deep Fusion is actually capable of and how third-party developers can make the most of the technology, but it looks promising. There also seems to be some confusion regarding whether Deep Fusion will work with Night Mode, but according to Apple guru John Gruber, the two are mutually exclusive: Deep Fusion is applied to scenes between 10 and 600 lux, while Night Mode kicks in at 10 lux or fewer.

We’ll know more for sure when we have a chance to test the new feature ourselves.


 

Weekly Photography Challenge – Triangles

05 Oct

The post Weekly Photography Challenge – Triangles appeared first on Digital Photography School. It was authored by Caz Nowaczyk.

This week’s photography challenge topic is TRIANGLES!

Image: Chris Lawton

Chris Lawton

Following along on our shapes theme for the past fortnight, this week is Triangles. You can see triangles in flags, sails, architecture, windows, rooftops, in patterns and shadows, etc.

So go out and capture anything that has triangles. They can be color, black and white, moody or bright. Just so long as they have triangles in them! You get the picture! Have fun, and I look forward to seeing what you come up with!

Image: Bambi Corro

Bambi Corro

Image: ?????? ????

?????? ????

Image: freddie marriage

freddie marriage

 

Check out some of the articles below that give you tips on this week’s challenge.

Tips for Shooting TRIANGLES

Embracing Shadows in Photography – A Lesson for Light and Life

How to Tell Stories with Architecture Photography

9 Creative Architecture Photography Techniques for Amazing Photos!

Tips for Different Approaches to Architecture Photography

How to Understand Light and Color to Improve your Photography

Top Tips for Photographing the Best a City has to Offer in 48-hours

 

Weekly Photography Challenge – TRIANGLES

Simply upload your shot into the comment field (look for the little camera icon in the Disqus comments section) and it’ll get embedded for us all to see; or, if you’d prefer, upload your images to your favorite photo-sharing site and leave a link to them. Show me your best images in this week’s challenge.

Share in the dPS Facebook Group

You can also share your images in the dPS Facebook group as the challenge is posted there each week as well.

If you tag your photos on Flickr, Instagram, Twitter or other sites – tag them as #DPStriangles to help others find them. Linking back to this page might also help others know what you’re doing so that they can share in the fun.


 

Video: iPhone records its dramatic fall from a plane over Iceland, is recovered a year later

05 Oct

Iceland Photo Tours pilot and photographer Haukur Snorrason has shared a video showing the descent of his iPhone 6S Plus as it fell from a small plane flying about 60m (200ft) above Iceland. The incident happened more than a year ago; given the height and the frozen tundra below, Snorrason assumed at the time that his iPhone hadn’t survived the fall.

Around 13 months after the phone was dropped, a group of hikers discovered the device in a patch of moss, which had cushioned the blow and enabled the phone to survive the drop. The device powered on when tested, revealing Snorrason’s name and making it possible to reunite him with his lost device.

In addition to being nearly entirely functional (only the microphone was damaged), Snorrason discovered that the iPhone had recorded and saved a video of its rapid descent from the plane. The device landed face down on the moss, protecting the display from the elements while leaving the camera exposed to record the bright blue sky and Sun until its battery died.
