
Image Size and Resolution Explained for Print and Onscreen

27 Oct

The post Image Size and Resolution Explained for Print and Onscreen appeared first on Digital Photography School. It was authored by Helen Bradley.

One of the most confusing things for a new photographer is understanding image size, resolution, and printing.

So in this article, I’ll explain what these terms mean.

And I’ll show you how to resize your images depending on what you want to do with them.

Let’s get started.

What is resolution in digital cameras?

When talking about digital cameras, resolution refers to the number of megapixels produced by an image sensor.

This, in turn, generally corresponds to the amount of detail a camera can capture.

So if your camera packs 20 megapixels (often written as 20 MP), it captures less detail than a camera with 30 megapixels, which in turn captures less detail than a camera with 40 megapixels.

But what is a megapixel, really? And how does it affect your ability to print and display photos?

Megapixels and photo size

Find information about a photo (including resolution) using File > File Info

Technically, a megapixel is equal to 1,048,576 pixels; in reality, camera manufacturers round this number to 1,000,000 when stating how large an image the camera will capture.

So my camera, for example, captures 14.6-megapixel images, which is around 14,600,000 pixels per image (14.6 x 1,000,000). This information tells you nothing about the actual pixel dimensions of the image – it only tells you the total number of pixels that make up the image.

My camera, like most DSLRs, captures images with an aspect ratio of 1.5 – that is, the ratio of the number of pixels along the long edge of the image to the number along the short edge is 3:2.

Each of my full-sized RAW images is 4672 x 3104 pixels in dimension. So by multiplying the number of pixels along the image width by those along the image height, we get the actual number of pixels in the image (4672 x 3104 = 14,501,888). You and I might call this 14.5 MP, but camera manufacturers round this up and call it a 14.6 MP camera.
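The arithmetic is easy to check yourself – here’s a quick sketch in Python, using the dimensions of the camera above:

```python
# Megapixel count and aspect ratio from pixel dimensions.
width, height = 4672, 3104  # full-sized RAW dimensions from the example above

total_pixels = width * height           # 14,501,888
megapixels = total_pixels / 1_000_000   # marketing "MP" uses 1,000,000, not 2**20
aspect_ratio = width / height           # ~1.5, i.e. 3:2

print(f"{total_pixels:,} pixels = {megapixels:.1f} MP, aspect ratio {aspect_ratio:.2f}")
```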

You can check the width and height of an image using your photo editing software. In Photoshop, you can open your image, then choose File > File Info > Camera Data. The image above shows the resulting information dialog box.

Now, a pixel itself is a single picture element – and for our purposes, it’s the smallest element that your photo can be divided up into. A pixel can only be one color, and a photograph is made up of a grid of thousands of pixels, each of the different colors that together make up your image.

You can see these pixels if you open a photo and zoom in until you see single blocks of color (as shown below). Each of these blocks is a pixel:

An image of pixels in a photo

Why size is important when printing

When you’re printing an image, you may encounter the term PPI or pixels per inch. This literally refers to the number of pixels in an inch-long line of an image.

Most printing services, and indeed your own printer, will require a certain density of pixels in the image (PPI) to be able to render a print that looks good (i.e., with smooth color transitions so you can’t see each individual pixel).

Typical printing PPI values range from 150 to 300 PPI, although some high-end magazines may require images that are 1200 PPI.

So for example, if you want to print a 4 x 6 inch image at 300 PPI, then you need a file that has at least 4 x 300 (1200) pixels along its short side and 6 x 300 (1800) pixels on the long side. In other words, it needs to be at least 1200 x 1800 pixels in size.

To print an 8 x 10 inch image at 300 PPI, use the same math:

Multiply the printed image’s width and height in inches by 300 pixels. The result is 2,400 x 3,000 pixels, which is the image resolution you need to print an 8 x 10 image at 300 PPI.
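The same multiplication works for any print size; here’s a small helper function (hypothetical, purely to illustrate the math):

```python
# Minimum pixel dimensions needed to print at a given physical size and PPI.
def min_pixels_for_print(width_in: float, height_in: float, ppi: int = 300) -> tuple:
    """Return (width_px, height_px) required for a width_in x height_in print."""
    return round(width_in * ppi), round(height_in * ppi)

print(min_pixels_for_print(4, 6))   # (1200, 1800) - a 4 x 6 at 300 PPI
print(min_pixels_for_print(8, 10))  # (2400, 3000) - an 8 x 10 at 300 PPI
```

Swap in your print service’s required PPI (for example, 250 for the service shown below) to get its minimum dimensions instead.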

Therefore, when cropping and sizing an image for printing, you’ll need to know what PPI the image should be. Your printer manual or printing service should be able to tell you this.

Below is a screenshot from the MpixPro.com website, showing their optimal and minimum image sizes for standard print sizes. Their printer outputs at 250 PPI (but can handle 100 PPI images), though other services may differ, so always check before preparing your images.

Print size required for MpixPro printing

Use the crop or resize feature in your software to size your image to the desired width and height and the desired PPI resolution.

Here, an image cropped to a size of 3000 x 2400 pixels is being adjusted from 72 PPI to 300 PPI in preparation for printing at 300 PPI. There is no resampling required, as the image is already the correct dimensions and only the resolution requires adjusting.

Adjusting resolution in a photo without resampling it

Photoshop, like other applications, will also crop an image to a fixed size and resolution if you type your desired values into the options bar when you have the crop tool selected (see below). If your image is smaller than the typed dimensions, it will be enlarged using the default resampling method. While it isn’t generally advisable to enlarge images, provided the image is already close to the desired size, a small enlargement generally won’t cause a noticeable loss of quality.

When cropping in Photoshop, you can specify image size and resolution

Sizing for the screen

When it comes to displaying images on the screen, you need far fewer pixels than you do for printing.

This is because the density of pixels on the screen is far less than what is required for printing. For example, a typical monitor is 1920 x 1080 pixels in size; to fill the monitor, you only need an image that is 1920 x 1080 pixels in size. That’s about the same size image you need for a 4 x 6 print at 300 PPI – yet the 1920 x 1080 pixel image displays perfectly on a 23-inch monitor.
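You can verify that with a little arithmetic – the pixel density of a 23-inch, 1920 x 1080 monitor (assuming a standard 16:9 panel) works out to far less than print density:

```python
import math

# Pixel density (PPI) of a 23-inch, 1920 x 1080 monitor.
width_px, height_px = 1920, 1080
diagonal_in = 23.0

diagonal_px = math.hypot(width_px, height_px)  # ~2203 pixels corner to corner
ppi = diagonal_px / diagonal_in

print(f"{ppi:.0f} PPI")  # ~96 PPI, versus the 150-300 PPI needed for a good print
```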


SLC-1L-12: A Garden of Ideas

27 Oct

At first glance: a simple, one-light portrait of activist gardener Janssen Evelyn.

Dig deeper: a look at tonal mapping via specular highlights, stretching the range of your modestly powered flash, and how to discover your next project.

Strobist

BCN Retail shows Canon catching Sony in the Japanese full-frame MILC market, Nikon stagnant — for now

27 Oct

BCN Retail, a Japanese analyst firm that collects daily sales data of mirrorless interchangeable lens cameras from online and in-person points of sale in Japan, has published (translated) its latest numbers, showing the breakdown of Japanese domestic market share in the full-frame mirrorless interchangeable lens camera (MILC) market.

BCN Retail starts its report with partially encouraging news, noting the camera market, at least in Japan, has almost entirely recovered from the pandemic drop, with unit sales in September down just 2% and revenue from those sales down just 10% year-over-year (YoY). Lower YoY numbers are never a good thing, but considering the state of the camera market even pre-pandemic, these drops aren’t terrible.

According to BCN Retail’s latest numbers, Canon and Panasonic have seen a rise in market share over the past few months, while Nikon has more or less stayed even. Meanwhile, both Sony and Sigma have seen their market shares drop over the past few months.

Full-frame mirrorless market share numbers: Brown (Sony), Red (Canon), Yellow (Nikon), Blue (Panasonic), Grey (Sigma). The dark blue and red bars at the bottom show unit sales and revenue (as a percentage of overall interchangeable lens camera (ILC) camera sales), respectively.

BCN Retail says Canon’s rise in market share — now 34.7% — can be attributed to the release of its R5 and R6 mirrorless cameras, while Panasonic’s rise — now 5.8% — is attributed to the launch of its S5. Nikon’s market share saw a small increase in July, which could likely be attributed to the release of its entry-level Z5, but since August its market share has more or less stayed stagnant, sitting at roughly 13%. It’s possible its forthcoming Z6 II and Z7 II mirrorless cameras could give the company a boost, though.

Meanwhile, Sony has seen its market share drop from roughly 60% back in May to just 43.9%, only around nine percentage points ahead of Canon, which had just 15% of the market at the start of the year. Sigma, too, has seen its share fall to just 2.6% after briefly sitting ahead of both Nikon and Panasonic back in May, when sales of its fp were hot.

The Canon EOS R5 was the most popular full-frame mirrorless interchangeable lens camera (MILC) of September, according to BCN Retail.

It’s worth noting these market share numbers are specific to the Japanese market and greatly impacted by new cameras launched within a given month or quarter.

Back in the summer of 2018, Sony effectively had 100% of the full-frame MILC market share, as there were no other competitors. Within six months of both Canon and Nikon introducing their respective full-frame mirrorless cameras, Sony’s market share was effectively halved and since then, it’s been further chipped away at by Canon.

This doesn’t necessarily mean Canon or Nikon were eating into Sony’s sales when the two first entered the market – unit volume also rose as Canon and Nikon introduced their mirrorless cameras. But now that sales have more or less returned to pre-pandemic volume while Sony’s market share continues to drop, it’s possible we’re starting to see Canon pull away some of Sony’s customers.

Canon EOS RP (left), Nikon Z5 (right).

It will be interesting to see whether Nikon’s new Z6 II and Z7 II take market share from Canon or Sony, or simply convert more DSLR users, adding to overall sales volume rather than taking from elsewhere in the full-frame MILC market. In the past, Canon’s numbers seem more affected by the rise and fall of Nikon’s market share, whereas Sony’s are more affected by the rise and fall of Canon’s. But even with the charts, it’s difficult to get the full picture without knowing the precise number of units sold and the prices at which they’re selling – two numbers that prove challenging to extrapolate from BCN Retail’s figures, or even CIPA’s.

BCN Retail also notes that full-frame sales have hit 10.7% of the overall interchangeable lens camera (ILC) market, marking the first time the figure has been in double digits. Revenue from full-frame MILCs, as a percentage of the overall ILC market, also saw a dramatic jump to 25%. These are both the highest-ever numbers for the full-frame market, though BCN Retail notes this is partly because the average cost of a full-frame MILC tends to be 2.3x that of a crop-sensor ILC – roughly ¥230,000 (~$2,200) versus ¥100,000 (~$955).

Articles: Digital Photography Review (dpreview.com)

NASA uses infrared imaging to discover water on sunlit surface of the Moon

27 Oct

NASA’s Stratospheric Observatory for Infrared Astronomy (SOFIA) has used its onboard Faint Object infraRed CAmera for the SOFIA Telescope (FORCAST) to discover water molecules on the sunlit surface of the Moon. For the first time, there are indications that water may be distributed across the Moon’s surface, and not limited to just the cold, dark areas of the lunar surface.

SOFIA’s infrared camera, used in conjunction with a 106-inch diameter telescope, picked up ‘the specific wavelength unique to water molecules, at 6.1 microns, and discovered a relatively surprising concentration in sunny Clavius Crater.’ This crater is one of the largest craters visible from Earth and is in the Moon’s southern hemisphere.

Casey Honniball is the lead author who published the results as part of her graduate thesis work at the University of Hawaii at Mānoa. She is now a postdoctoral fellow at NASA’s Goddard Space Flight Center in Maryland. Of the discovery, Honniball says, ‘Prior to the SOFIA observations, we knew there was some kind of hydration. But we didn’t know how much, if any, was actually water molecules – like we drink every day – or something more like drain cleaner. Without a thick atmosphere, water on the sunlit lunar surface should just be lost to space. Yet, somehow we’re seeing it. Something is generating the water, and something must be trapping it there.’ If you’d like to read the full paper, it has been published in Nature Astronomy.

Data gathered using SOFIA’s onboard camera shows water in Clavius Crater in concentrations of 100 to 412 parts per million, ‘roughly equivalent to a 12-ounce bottle of water trapped in a cubic meter of soil spread across the lunar surface.’ Paul Hertz, director of the Astrophysics Division in the Science Mission Directorate at NASA Headquarters says, ‘We had indications that H2O – the familiar water we know – might be present on the sunlit side of the Moon. Now we know it is there. This discovery challenges our understanding of the lunar surface and raises intriguing questions about resources relevant for deep space exploration.’

It’s not a lot of water, about 1% of the water found in the Sahara desert, but it’s a significant discovery. The work of the SOFIA team has uncovered new questions about how water is created and how it persists on the airless Moon. Further, water is a critical resource in deep space exploration. NASA’s Artemis program is keen to learn more about the presence of water on the Moon, and ideally, discover a way to access water in its pursuit of establishing a sustainable human presence on the Moon by 2030.

‘Water is a valuable resource, for both scientific purposes and for use by our explorers,’ said Jacob Bleacher, chief exploration scientist for NASA’s Human Exploration and Operations Mission Directorate.’ Bleacher continues, ‘If we can use the resources at the Moon, then we can carry less water and more equipment to help enable new scientific discoveries.’

How the water molecules ended up on the surface remains an unanswered question. One theory is that ‘Micrometeorites raining down on the lunar surface, carrying small amounts of water, could deposit the water on the lunar surface upon impact.’ Another theory involves a two-step process ‘whereby the Sun’s solar wind delivers hydrogen to the lunar surface and causes a chemical reaction with oxygen-bearing minerals in the soil to create hydroxyl’, which is then transformed into water by radiation from micrometeorites.

‘This illustration highlights the Moon’s Clavius Crater with an illustration depicting water trapped in the lunar soil there, along with an image of NASA’s Stratospheric Observatory for Infrared Astronomy (SOFIA) that found sunlit lunar water.’ Image and caption credits: NASA/Daniel Rutter

SOFIA, which is a modified Boeing 747SP jetliner, typically focuses on very distant objects, such as black holes, galaxies and star clusters. In fact, the newly-published results are from SOFIA’s very first mission looking at the Moon. The team was essentially testing the tracking capabilities of its equipment, and this test produced a significant discovery. Additional flights will take a further look at the lunar surface.

SOFIA’s standard observations take place during 10-hour overnight flights and capture images at mid- and far-infrared wavelengths. You can view some of the images it has captured on NASA’s website.

This is far from the first time NASA’s camera technology has produced meaningful, significant scientific discovery. Looking to the future, NASA’s Perseverance is currently about halfway to Mars, carrying a rover outfitted with a record-breaking 19 cameras. These cameras will capture incredibly detailed images of the Martian landscape.

Articles: Digital Photography Review (dpreview.com)

Nikon Z 14-24mm F2.8 S sample gallery

26 Oct


The Nikon Z 14-24mm F2.8 S completes the ‘holy trinity’ of traditional F2.8 zooms for Z-mount. Offering substantial weight and size savings over the previous AF-S 14-24mm F2.8 (and capable of accepting both screw-in and cut gel filters), the new zoom is more practical than its predecessor – but is the higher price reflected in its performance? Take a look at our sample gallery to find out.

See our Nikon Z 14-24mm F2.8 sample gallery

Articles: Digital Photography Review (dpreview.com)

Report: Zeiss’ full-frame Android-powered ZX1 camera to be released on October 29, cost $6K

26 Oct

Three weeks ago, the long-awaited Zeiss ZX1 camera reappeared for pre-order on B&H Photo after months of silence regarding the availability of the Android-powered mirrorless camera. While B&H has since pulled its listing, a new report from Nokishita claims the camera will be available on October 29 with an MSRP of $6,000/€6,000 (the same price B&H had it listed for).

When B&H listed the Zeiss ZX1 for pre-order earlier this month, we contacted both B&H and Zeiss, but both passed on the opportunity to comment. Sometime between then and now, the pre-order option was removed from B&H with no further information on when we might see more. That is, until Nokishita published its tweet earlier this morning.

We have contacted Zeiss for confirmation and will update this article accordingly if we receive a response. While we wait to hear more about the camera, you can check out our hands-on with it back at CP+ 2019.

Articles: Digital Photography Review (dpreview.com)

The World’s First Completely AI-Powered Photo Editor Will Debut Before the Year Is Out

26 Oct

The post The World’s First Completely AI-Powered Photo Editor Will Debut Before the Year Is Out appeared first on Digital Photography School. It was authored by Jaymes Dempsey.

Skylum’s Luminar AI, billed as “the first image editor fully powered by artificial intelligence,” will become available before the year is out.

Luminar AI is the world's first completely AI-powered photo editor

Already, the image editor has created controversy among photographers thanks to its automated, AI-based approach to image editing, with some claiming that Luminar AI’s easy, no-experience-necessary approach is problematic – or even cheating.

And while we don’t know exactly how the program works, Skylum has released several videos showing off some of Luminar AI’s standout features.

world's first AI-powered photo editor

Highlights include:

  • Composition AI, which automatically straightens your images and suggests cropping based on compositional guidelines and feedback from professional photographers
  • Face AI and Skin AI, which automatically retouch your subjects’ faces for improved teeth, lips, skin, and much more
  • Sky AI, which allows you to instantly swap skies while automatically adjusting for changes in lighting and color
  • Atmosphere AI, which lets you enhance your images with weather effects (such as haze, steam, drizzle, fog, and mist)

To see some of these AI effects in action, check out Skylum’s latest video:

While Luminar AI will be offered as a standalone editing program, Skylum’s most up-to-date software, Luminar 4, already packs some AI-based features (including a popular sky-replacement option). But Luminar AI promises to take AI editing to the next level, opening up advanced post-processing effects to a much larger audience.

Will Luminar AI do everything for you?

It doesn’t seem like it. As Skylum explains, you have to make creative choices; Luminar AI will do a lot of editing work for you, but you’ll remain at the helm.

world's first AI-powered photo editor

As for the Luminar AI release:

The date is currently unknown, but Skylum promises a “holiday season” release. I’d expect an early December debut, though mid-December or late November certainly isn’t out of the question.

In terms of price, you can currently preorder Luminar AI for a discounted rate:

$74 USD for a standard copy of Luminar AI, or $139 USD to download both Luminar AI and Luminar 4.

So if Luminar AI’s simple approach to editing appeals to you, make sure you take a look while the discount still lasts! You can view the software here.

Now over to you:

What do you think about Luminar AI? Do you like the idea of AI-based editing? Or does it feel like cheating? And will you purchase the program? Share your thoughts in the comments!


Video: Weird lens guru turns $20 Carl Zeiss projector lens into a swirly-bokeh camera lens

26 Oct

Weird lens guru Mathieu Stern is back at it with a new video that shows images captured with two $20 Carl Zeiss projector lenses he converted into camera lenses.

As with many of Stern’s DIY projector lens projects, both of these lenses – a 120mm F1.9 and a 105mm F1.9 – lack any way to focus and have no adjustable aperture. While the aperture isn’t so easy to address, the video briefly shows how he uses an M65 helicoid ring adapter to give the lens manual focus. Although not shown in the video, Stern then uses an M65-to-Sony-E-mount adapter to mount the custom lens on his Sony camera.

The resulting images show pronounced ‘swirly’ bokeh and very sharp separation between the subject and the background. It’s not going to win any resolution or edge-to-edge sharpness contests, but considering you can pick up similar projector lenses for around $20 online and a set of adapters for your camera for roughly $50, it’s a cheap way to get some unique shots.

Stern has a full list of the components he used in the video’s description on YouTube. You can find more of his work on his YouTube channel and website, which also features his always-growing ‘Weird Lens Museum.’

Articles: Digital Photography Review (dpreview.com)

Roger Cicala: why I don’t use an MTF bench to test my own lenses

25 Oct
Roughly clockwise from left: 300mm collimator, laser transmission testing, lens test projector, Trioptics Imagemaster HR optical bench, spectrometry measurement. It might not look like much, but the total cost is similar to a really nice house in a small city (or a decent house in a big city).

I have a complete testing lab at my disposal: MTF benches, lens test projectors, spectrometers, lasers, an Imatest setup gathering dust in a back room; everything all the cool kids have. A lot of people assume I test the hell out of my own shiny new personal lenses after I buy them. (Yes, I buy my own stuff). I do test them, but not in the lab. I go out and take pictures with them.

It’s not because I’m such a great photographer that my practiced eye can tell more about the lens through photographs than any lab test could. I’m a mediocre photographer. Years ago I tried making a living as a photographer. I sold some prints once, made enough to pay for maybe half a lens, and after another six months without a sale I decided to explore other methods of supporting my extravagant lifestyle.

The lab is faster, gives tons of information, and makes cool graphs. But I still don’t use it to test my personal lenses

It’s not because the lab stuff doesn’t give useful information. The lab gives a LOT of useful information. Most people don’t have time to learn how to interpret it, or to learn its value and limitations, but it’s useful information nonetheless. And the lab is fast; I can test a lens about 32 different ways in a couple of hours. My ‘test a lens with photography’ time is half a day or more. So the lab is faster, gives tons of information, and makes cool graphs. But I still don’t use it to test my personal lenses.

Lab tests give a ton of precise information. Understanding and interpreting it is, I’ll admit, not completely intuitive.

That’s because all lab tests have some major limitations. The biggest one is this: real images are 3-dimensional, they are focused at a variety of distances, and almost always contain foregrounds and backgrounds. Optical tests are two-dimensional slices taken at a fixed focusing distance with no background or foreground. The focusing distance is infinity for an optical bench. It’s a single, close distance for Imatest / DxO / and other computer image analysis methods.

So, the lab tests tell me everything I want to know about the plane of exact best focus at one focusing distance. That’s really useful information, especially if you want to find out if a lens is optically maladjusted, want to know what kind of aberrations it has, or are interested in its maximum resolution. And it gives people numbers – the ammunition of choice in many a Forum War.

Even a three-dimensional standard comparison image, such as the kind that DPReview and other sites use, is basically limited to one focusing distance. That distance is different for different focal lengths but it’s always fairly close up. And, if it’s an indoor target, the depth of those targets is usually only a few feet at most; it’s not going to show you what the out of focus area 30 feet behind the image plane looks like.

What I actually do to test a new lens

Photographs give me far more information than the lab, even if it’s less exact. I don’t recommend brick wall or side-of-building photographs. Those are just 2-dimensional slices like the lab gives, but with more variables and less information. I want photographs of 3-dimensional subjects.

With the right background (I prefer a field or yard of grass) you can quickly compare resolution at a half-dozen focusing distances. Sure, some lenses are about the same at all distances, but many are not. No zoom lens is equally sharp at all focal lengths. My favorite grass field is a hill behind my office that slopes up away from me. I focus on the mower tracks and quickly get images at several focusing distances.

Simple grass slope image taken with a Canon 50mm F1.2 lens at F1.4.

Grass (or pebbles, or concrete, or all manner of things that make fairly uniform photographs filled with fine detail) is great for figuring out the zone of acceptable sharpness (for you) of a lens.

Repeating this set of images at several apertures lets me see at what aperture maximum center, middle, and edge sharpness occur (those are almost always different). It’s good to know things like there’s maximum center sharpness at F4 and the edges are at maximal sharpness at F6.3 or F8 or that they never get very sharp.

Grass is also great because it gives you a nice sharpness comparison as you leave the area of best focus. I also recommend looking at what you consider the depth of field at each aperture and focusing distance. Depth of field is not an area of maximal sharpness. It is an area of acceptable sharpness; there is greater and lesser sharpness within the depth of field. Your definitions of ‘acceptable sharpness’ in your images may be greater, or less, than the calculated depth of field.

You rarely see dramatic changes in a prime lens’ field curvature at different focusing distances, but you will usually see a dramatic change in a zoom’s field curvature at different focal lengths

More importantly, some lenses fall off of the sharpness cliff as they exit their area of maximal sharpness, others drift so slowly down the gentle sharpness slope that it really does seem as if the entire depth of field is maximally sharp. Also, that sharpness slope often changes at different apertures. Those are all good things to know.

The other thing I do is to take some of my grass images and run them through a Photoshop ‘Find Edges’ filter or equivalent. This will let you visualize the field curvature of your lens and see how it varies at different focal lengths or focusing distances. (Pro tip: you rarely see dramatic change in a prime lens’ field curvature at different focusing distances. You will, however, usually see a dramatic change in a zoom’s field curvature at different focal lengths.) That’s really useful information that few people know about their lenses. The find edges type filters are also a good way to look at depth of field at various apertures or with different lenses.

Same image as above (Canon 50mm F1.2) run through a find edges filter – the field curvature is obvious.
Field curvature of Canon 50mm F1.2 as measured on an optical bench. You get about the same information from the grass photo and find edges filter as you would from the $250,000 optical bench.
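If you’re curious what a find-edges pass actually computes, the core idea is just local brightness differences. Here’s a toy pure-Python version (real tools like Photoshop’s Find Edges, or Pillow’s FIND_EDGES filter, use a similar convolution) operating on a grid of brightness values:

```python
# Toy "find edges": each output pixel is the largest brightness difference
# between that pixel and its right/lower neighbors. Sharp detail -> large
# differences (bright); soft, defocused areas -> small differences (dark).
def find_edges(gray):
    h, w = len(gray), len(gray[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            right = abs(gray[y][x] - gray[y][x + 1]) if x + 1 < w else 0
            down = abs(gray[y][x] - gray[y + 1][x]) if y + 1 < h else 0
            out[y][x] = max(right, down)
    return out

# A hard edge lights up; a smooth (defocused) ramp stays dim.
sharp = [[0, 0, 255, 255]] * 2
smooth = [[0, 85, 170, 255]] * 2
print(find_edges(sharp)[0])   # [0, 255, 0, 0]
print(find_edges(smooth)[0])  # [85, 85, 85, 0]
```

That’s why the filtered grass shot maps the zone of sharpness: in-focus texture produces large neighbor differences, while soft areas stay dark.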

Grass shots also give you a superb way to see if your lens is softer in one area or if the field is tilted. The grass image above is very slightly tilted, an amount that’s about normal for a good prime lens. A more dramatic field curvature might look as though you’d rotated the dark area 15 or 20 degrees in Photoshop.

About half the people who take building or brick wall images and think their lens is ‘decentered’ actually have a lens with a field tilt; the lens is equally sharp on both sides, but not at the same distance as center focus. It’s actually very hard to detect a field tilt by shooting a chart and evaluating a two-dimensional image.

A large field tilt in a prime lens is unusual while a field tilt at some focal lengths of a zoom is pretty common. (I’ve seen 45 degree field tilts in zooms, but 10 degrees or so is routine.) If you return your zoom lens to the store for exchange, the replacement will probably have a different field tilt at another focal length.

People like to talk about a lens’ bokeh like it’s one thing, but bokeh often varies

If the lens is one for which I consider bokeh important, I use a Bokelizer. Basically, this is a couple of strings of tiny Christmas lights hung in a three-dimensional pattern. I take some images at various focusing distances and evaluate the out-of-focus highlights in the foreground and background, as well as the in-focus lights. People like to talk about a lens’ bokeh like it’s one thing, but for many lenses, bokeh varies between the foreground and the background, at different focusing distances, and depending on how far off-center the object is.

Why do I look at in-focus lights, since they have nothing to do with out-of-focus highlights? Because comparing pinpoint light sources is a superb way to see if the lens is optically maladjusted. ‘Optically maladjusted’ means a lens that has a decentered, tilted or poorly spaced element. On the forums, people often refer to all of these issues as ‘decentering’ but that’s less than correct.

Illustrations of the various types of optical maladjustments. In reality, a given lens usually has several small errors, rather than one single large one.

Each of those optical maladjustments causes different optical problems and often they’re apparent when looking at pinpoint light sources. Looking at pinpoint light sources also gives you an idea of the coma and other aberrations that the lens displays by design.

This image was created from equipment in the repair department that basically just projects pinhole lights. You can easily see the difference between a good lens (upper half) and one that is slightly decentered (bottom half).

Once I’m done with the stuff above, I go out and take the kinds of pictures that I bought the lens for. But the hour or two needed for the checks above gives me a lot of information about how to best use the lens’ strengths and weaknesses before I set off to shoot. It also shows me if the lens is optically maladjusted, and there’s no sense taking a bunch of photographs if I already know I’m going to return the lens.

Will taking pictures tell me if I got a copy that’s every bit as sharp as the copy Reviewer Guy got? Absolutely not. Does it let me spout numbers in ‘my lens is better than your lens’ Forum Wars? Again, no. But it certainly does tell me if the lens meets my expectations and will do the job I want it to do. Lab tests give me all manner of information, but they can’t tell me whether I’m going to like the images from the lens.

It doesn’t matter to me at all if I have the sharpest copy of a lens or not. I just want to know if it’s acceptable for the purposes I want to use it for

To be completely honest, if I think the lens isn’t as sharp as I expect, then I may actually take it to the lab and measure it on the bench. I’ve done that maybe twice in the last ten years out of a few dozen lenses I’ve purchased, and both times it turned out that the lens wasn’t up to spec. So, really, I knew the answer without using the bench.

Photographic testing won’t tell you if your lens is among the sharpest copies of that lens, or if it’s in the top half of the variation range or things like that. If you want to know that, then really you need to pay someone to test the lens on a test bench. Why don’t I do that? Because it doesn’t matter to me at all if I have the sharpest copy or not. I just want to know if it’s acceptable to me for my purposes.


Roger Cicala is the founder of Lensrentals.com. He started by writing about the history of photography a decade ago, but now mostly writes about the testing, construction and repair of lenses and cameras. He follows Josh Billings’ philosophy: “It’s better to know nothing than to know what ain’t so.”

Articles: Digital Photography Review (dpreview.com)

 
Comments Off on Roger Cicala: why I don’t use an MTF bench to test my own lenses

Posted in Uncategorized

 

Photographing Stars Using a Kit Lens

25 Oct

The post Photographing Stars Using a Kit Lens appeared first on Digital Photography School. It was authored by Adeel Gondal.

photographing stars the night sky
Nikon D5100 | 18mm | 20 sec | f/3.5 | ISO 1600

Have you seen those amazing starry sky and Milky Way photographs from professional photographers and wondered how to create similar results? Have you never tried because you thought you didn’t have the proper equipment?

Let me tell you, “You are wrong!”

If you own a normal DSLR or mirrorless camera, you can create stunning star photos with only a kit lens.

In this article, I will explain the whole process of photographing stars with your kit lens. I’ll give you step-by-step instructions in the easiest possible way – so that, even if you don’t have much technical knowledge, you can start capturing star photos like a pro.

Let’s dive right in.

The star photography basics

To get started, you need to keep the following points in mind:

  • You must shoot in a place away from the city lights. The less light pollution you have, the more chance you’ll have of getting clear stars.
  • You’ll want a moonless night. Stars can still be shot under a full moon, but the brighter the moon, the more it washes out the sky, and the stars will not be as prominent.
  • You’ll need a normal DSLR or mirrorless camera with a standard 18-55mm kit lens (such as this Canon lens or this Nikon lens).
  • You’ll also want a tripod.

You can Google your surroundings to find locations that are far away from the city (check out Dark Sky for a helpful interactive map).

You should know beforehand in what direction, and at what time, the moon is going to rise. That will help you a lot with the composition of your images.

However, a moonless night is always best to shoot stars – so I recommend checking the current moon phase before heading out.

Additionally, you can use a compass app on your smartphone to locate the North Star for star trails. To get an idea of the stars above your location, you can download an app called Star Chart (for iOS or for Android) or Google Sky. Both of these apps also show you the direction of the Milky Way, so you can shoot it directly and get amazing results.

I also recommend checking out PhotoPills; this app offers an incredible suite of features for the beginner (or more advanced!) night sky photographer.

Anyway, these apps are pretty accurate. With their help, you can identify Mars with your naked eye (you’ve likely seen Mars before but couldn’t differentiate it from the surrounding stars).

If you want to plan a future shoot or look for an appropriate time to shoot the Milky Way in your location, you can download the desktop app Stellarium. Just pop in your coordinates, and it will show you the direction of the Milky Way at a specific date and time.

Thanks to Stellarium, you can know the exact time of year the brightest part of Milky Way will be above your location (so you can do some amazing star photography!).

stars over a mountain
Nikon D5100 | 18mm | 20 sec | f/3.5 | ISO 1600

Key camera settings for photographing stars

Now let’s get to the important part of photographing stars with a kit lens:

Camera settings.

You will need to take control of your camera, so you’ll want to keep it in Manual mode.

Change the shooting mode to Manual and adjust your setup to the following settings:

Focal length

Set your focal length to the widest possible option. In the case of a kit lens, this will often be 18mm.

While you can technically choose any focal length you want, the more you zoom in, the fewer stars you will be able to capture.

Plus, your optimum exposure time before star trails start to develop will decrease as you increase your focal length (see the 500 Rule, discussed in the shutter speed section below).

Aperture

Setting your aperture to the widest option is key here; for my kit lens, this is f/3.5.

By using the widest aperture your lens allows, more light will enter through your lens.

And this will result in brighter stars and a brighter Milky Way!

Shutter speed

If you are only shooting stars and the Milky Way, I recommend a shutter speed of 20 seconds.

Why 20 seconds?

Here’s the answer:

An exposure shorter than 20 seconds lets less light reach the camera sensor.

And an exposure longer than 20 seconds will start to create star trails. In other words, the stars will visibly streak across the sky.

In fact, there’s a handy equation for calculating your shutter speed, called the 500 Rule:

The optimum exposure before you start getting star trails is calculated by dividing 500 by your focal length (you’ll need to divide the answer once more, by around 1.5, if you are using a crop-sensor camera).

So in the example of an 18mm lens on a cropped sensor, divide 500 by 18, for an answer of 27.78. Then divide this again, by 1.5, to get 18.52, which is roughly 20 seconds.

Make sense?
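If you’d rather not do the arithmetic in the field, the 500 Rule above is a one-liner. A minimal sketch (the 1.5 crop factor is an assumption for APS-C sensors like the Nikon DX bodies in the captions; full-frame shooters can pass 1.0):

```python
# The 500 Rule: longest exposure before stars begin to visibly trail.
# crop_factor=1.5 assumes an APS-C sensor; use 1.0 for full frame.

def max_exposure_seconds(focal_length_mm, crop_factor=1.5):
    return 500 / (focal_length_mm * crop_factor)

print(round(max_exposure_seconds(18), 2))  # 18.52 -> roughly 20 seconds
print(round(max_exposure_seconds(55), 2))  # 6.06 -> zooming in cuts your time
```

Notice how quickly the safe exposure time shrinks as you zoom in, which is another reason to stay at the wide end of the kit lens.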

star photography with a car in the foreground
Nikon D5100 | 18mm | 20 sec | f/3.5 | ISO 1600

ISO

Start by keeping your ISO at 1600. You can then increase it later, depending on your results.

But keep in mind:

The higher the ISO, the more noise there will be in your image.

Now, this does depend on the signal-to-noise ratio of the camera body you are using; high-end cameras tend to offer the best high-ISO noise performance, and even modern consumer cameras feature decent results at high ISOs. But try to boost the ISO on an older camera, and you’ll end up with all sorts of unwanted noise.

Remote shutter release

You’ll want to have a remote shutter release to avoid camera shake from hitting the shutter button.

If you don’t have a shutter release, just use your camera’s 2-second or 10-second timer.

That will minimize any blur in the picture due to camera shake.

You should also switch off any “vibration reduction” or “image stabilization” technology included in your camera or lens, because this technology can actually introduce blur during long exposures on a tripod.

Focusing your lens to infinity

After dialing in all these settings, here’s the only important thing left to do:

Focus your lens to infinity.

Now, a kit lens doesn’t have an infinity marker on it, so you’ll need to rely on the hyperfocal distance to focus your lens.

Here’s what you do:

Mount your camera and lens on a tripod, and focus on any bright object at a distance of 20 feet or more.

(If you are in the dark and struggling to focus, you can point a flashlight toward your camera and use this as a point of focus.)

Once the lens is focused at 20 feet or beyond, you’re past its hyperfocal distance, so everything out to infinity, stars included, will be acceptably sharp. This will also help keep the foreground sharp.
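If you’re curious where the 20-foot figure comes from, you can sanity-check it with the standard hyperfocal-distance formula, H = f²/(N × c) + f. The circle-of-confusion value of 0.02mm below is an assumed figure for an APS-C sensor, not something from the article:

```python
# Hyperfocal distance: H = f^2 / (N * c) + f, computed in millimeters.
# coc_mm (circle of confusion) of 0.02mm is an assumption for APS-C sensors.

def hyperfocal_m(focal_length_mm, f_number, coc_mm=0.02):
    h_mm = focal_length_mm**2 / (f_number * coc_mm) + focal_length_mm
    return h_mm / 1000  # convert mm to meters

print(round(hyperfocal_m(18, 3.5), 2))  # 4.65 meters, roughly 15 feet
```

So focusing at 20 feet (about 6 meters) puts you comfortably past the hyperfocal point at 18mm and f/3.5, which is why the distant stars come out sharp.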

Don’t forget to switch your lens to manual focus; otherwise, it will start to hunt for focus when the shutter is pressed.

You may be wondering:

Why is focusing to infinity so important?

If your lens is not focused to infinity, you’ll still capture a Milky Way image, but the stars won’t be as sharp and they’ll appear bigger, like this:

example star photograph where lens was not focused to infinity
My lens was not focused to infinity.

And this:

Another example star photo where lens wasn't focused to infinity
My lens was not focused to infinity.

And this:

example photo where lens was not focused to infinity
My lens was not focused to infinity.

You’ll get a similar result if you use a too-long shutter speed and produce star trails.

So make sure you pay careful attention to both your settings and your point of focus.

A quick tip for photographing stars

Once you reach your location, it’s better to first sit in the dark for at least 15 minutes to let your eyes adjust to the surroundings. This will help you see a lot of stars (and even the Milky Way) with your naked eye. It will also help you compose your images better – because it’s easier to create beautiful compositions when you can see!

Plus, enjoying your surroundings for a while is better than simply shooting as soon as you reach the site.

Post-processing your star photos

When it comes to post-processing your star images, there are two things you should know:

First, always shoot in RAW. This will give you a lot of room for post-processing without degrading image quality.

Second, some post-processing is always needed to get optimum results. You can find many tutorials on how to post-process Milky Way images, including some here on dPS!

Capture stunning star trails

If you are satisfied with your shots, the next step is to capture star trails.

Simply locate the North Star using the Star Chart app discussed above.

Then keep the North Star in your composition, because this is the star that all other stars rotate around.

For star trails, all camera settings will remain the same, except that you can lengthen the exposure to 30 seconds.

Alternatively, you can go with shorter exposures (i.e., 20 seconds or less), especially if there are lights in the area and 30 seconds results in overexposed images.

Keep your camera in its continuous shooting mode, and let it capture as many exposures as possible. The more pictures you have, the longer and smoother your star trails will be. Of course, each exposure will only contain a short trail segment; later, you can join all the exposures in Photoshop or use dedicated software such as StarStaX to create full trails.
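Under the hood, star-trail stacking (whether in StarStaX or with Photoshop’s Lighten blend mode) just keeps the brightest value at each pixel across all the frames. A minimal sketch of the idea, assuming the exposures have already been loaded as equally sized numpy arrays:

```python
import numpy as np

def lighten_stack(frames):
    """Keep the brightest value at each pixel position across all frames."""
    result = frames[0].copy()
    for frame in frames[1:]:
        np.maximum(result, frame, out=result)
    return result

# Two tiny 2x2 "frames": the bright star pixel moves between exposures,
# so the stacked result keeps both positions -- a two-pixel "trail".
f1 = np.array([[200, 10], [10, 10]], dtype=np.uint8)
f2 = np.array([[10, 200], [10, 10]], dtype=np.uint8)
print(lighten_stack([f1, f2]))  # brightest pixel kept at every position
```

This is why more exposures give longer trails: every frame contributes the star’s position at that moment, and the maximum operation preserves all of them.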

For instance, this shot is a combination of 18 separate exposures:

Star trails, a merge of 18 shots each at 30 sec

Alternatively, you can capture one shot of the stars and make star trails with it using the HM Technique:

Star trails via HM technique
Star trail created in Photoshop using the HM Technique

And you’re also free to have fun with Photoshop:

Zooming star trails
Fun in Photoshop!

Once you’ve nailed photographing the Milky Way, try including foreground objects for better compositions:

trees and the night sky
Nikon D7000 | 18mm | 20 sec | f/3.5 | ISO 1600

Photographing stars using a kit lens: conclusion

You’re all set to shoot your own stars!

With the help of a kit lens, you can create some beautiful star photography – the kind that’ll make you happy and impress your friends, too.

You could even try creating panoramas to get more of the Milky Way in your composition, like this:

Panorama Stitch of 4 shots of the night sky

So happy shooting, and keep me updated with your results! And if you need any help, let me know down in the comments.



Digital Photography School

 
Comments Off on Photographing Stars Using a Kit Lens

Posted in Photography