NASA’s Stratospheric Observatory for Infrared Astronomy (SOFIA) has used its onboard Faint Object infraRed CAmera for the SOFIA Telescope (FORCAST) to discover water molecules on the sunlit surface of the Moon. For the first time, there are indications that water may be distributed across the Moon’s surface, not limited to cold, shadowed regions.
SOFIA’s infrared camera, used in conjunction with a 106-inch diameter telescope, picked up ‘the specific wavelength unique to water molecules, at 6.1 microns, and discovered a relatively surprising concentration in sunny Clavius Crater.’ This crater is one of the largest craters visible from Earth and is in the Moon’s southern hemisphere.
Casey Honniball is the lead author who published the results as part of her graduate thesis work at the University of Hawaii at Mānoa. She is now a postdoctoral fellow at NASA’s Goddard Space Flight Center in Maryland. Of the discovery, Honniball says, ‘Prior to the SOFIA observations, we knew there was some kind of hydration. But we didn’t know how much, if any, was actually water molecules – like we drink every day – or something more like drain cleaner. Without a thick atmosphere, water on the sunlit lunar surface should just be lost to space. Yet, somehow we’re seeing it. Something is generating the water, and something must be trapping it there.’ If you’d like to read the full paper, it has been published in Nature Astronomy.
Data gathered using SOFIA’s onboard camera shows water in Clavius Crater in concentrations of 100 to 412 parts per million, ‘roughly equivalent to a 12-ounce bottle of water trapped in a cubic meter of soil spread across the lunar surface.’ Paul Hertz, director of the Astrophysics Division in the Science Mission Directorate at NASA Headquarters says, ‘We had indications that H2O – the familiar water we know – might be present on the sunlit side of the Moon. Now we know it is there. This discovery challenges our understanding of the lunar surface and raises intriguing questions about resources relevant for deep space exploration.’
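That ‘bottle in a cubic meter’ comparison is easy to sanity-check. The figures below are my own assumptions, not values from the paper: a 12-ounce bottle holds about 355 g of water, and lunar regolith has a bulk density of roughly 1.5 g/cm³.

```python
# Sanity check of "a 12-ounce bottle of water in a cubic meter of soil".
# Assumed figures (mine, not from the paper): 12 fl oz of water is about
# 355 g, and lunar regolith has a bulk density of roughly 1.5 g/cm^3.
water_g = 355.0
soil_g = 1.5 * 100**3      # grams of regolith in one cubic meter
ppm = water_g / soil_g * 1e6
print(round(ppm))          # ~237 ppm, inside the reported 100-412 ppm range
```

Under those assumptions, the comparison lands right in the middle of the reported concentration range.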
It’s not a lot of water, about 1% of the water found in the Sahara desert, but it’s a significant discovery. The work of the SOFIA team has uncovered new questions about how water is created and how it persists on the airless Moon. Further, water is a critical resource in deep space exploration. NASA’s Artemis program is keen to learn more about the presence of water on the Moon, and ideally, discover a way to access water in its pursuit of establishing a sustainable human presence on the Moon by 2030.
‘Water is a valuable resource, for both scientific purposes and for use by our explorers,’ said Jacob Bleacher, chief exploration scientist for NASA’s Human Exploration and Operations Mission Directorate. Bleacher continues, ‘If we can use the resources at the Moon, then we can carry less water and more equipment to help enable new scientific discoveries.’
How the water molecules ended up on the surface remains an unanswered question. One theory is that ‘Micrometeorites raining down on the lunar surface, carrying small amounts of water, could deposit the water on the lunar surface upon impact.’ Another theory involves a two-step process ‘whereby the Sun’s solar wind delivers hydrogen to the lunar surface and causes a chemical reaction with oxygen-bearing minerals in the soil to create hydroxyl,’ which is then transformed into water by radiation from micrometeorite bombardment.
‘This illustration highlights the Moon’s Clavius Crater with an illustration depicting water trapped in the lunar soil there, along with an image of NASA’s Stratospheric Observatory for Infrared Astronomy (SOFIA) that found sunlit lunar water.’ Image and caption credits: NASA/Daniel Rutter
SOFIA, which is a modified Boeing 747SP jetliner, typically focuses on very distant objects, such as black holes, galaxies and star clusters. In fact, the newly-published results are from SOFIA’s very first mission looking at the Moon. The team was essentially testing the tracking capabilities of its equipment, and this test produced a significant discovery. Additional flights will take a further look at the lunar surface.
SOFIA’s standard observations take place during 10-hour overnight flights, capturing images at mid- and far-infrared wavelengths. Some of the images it has captured are available on NASA’s SOFIA site.
This is far from the first time NASA’s camera technology has produced significant scientific discoveries. Looking to the future, NASA’s Mars 2020 mission is currently about halfway to Mars, carrying the Perseverance rover outfitted with a record-breaking 19 cameras. These cameras will capture incredibly detailed images of the Martian landscape.
The Nikon Z 14-24mm F2.8 S completes the ‘holy trinity’ of traditional F2.8 zooms for Z-mount. Offering substantial weight and size savings over the previous AF-S 14-24mm F2.8 (and capable of accepting both screw-in and cut gel filters), the new zoom is more practical than its predecessor, but is the higher price reflected in its performance? Take a look at our sample gallery to find out.
Three weeks ago, the long-awaited Zeiss ZX1 camera reappeared for pre-order on B&H Photo after months and months of silence regarding the availability of the Android-powered mirrorless camera. While B&H has since pulled its listing, a new report from Nokishita claims the camera will be available on October 29 with an MSRP of $6,000/€6,000 (the same price B&H had it listed for).
When B&H listed the Zeiss ZX1 for pre-order earlier this month, we contacted both B&H and Zeiss, but both passed on the opportunity to comment. Sometime between then and now, the pre-order option on B&H was removed with no further information on when we might see more. That is, until Nokishita published the above tweet earlier this morning.
We have contacted Zeiss for confirmation and will update this article accordingly if we receive a response. While we wait to hear more about the camera, you can check out our hands-on with it back at CP+ 2019.
The post The World’s First Completely AI-Powered Photo Editor Will Debut Before the Year Is Out appeared first on Digital Photography School. It was authored by Jaymes Dempsey.
Skylum’s Luminar AI, billed as “the first image editor fully powered by artificial intelligence,” will become available before the year is out.
Already, the image editor has created controversy among photographers thanks to its automated, AI-based approach to image editing, with some claiming that Luminar AI’s easy, no-experience-necessary approach is problematic – or even cheating.
And while we don’t know exactly how the program works, Skylum has released several videos showing off some of Luminar AI’s standout features.
Highlights include:
Composition AI, which automatically straightens your images and suggests cropping based on compositional guidelines and feedback from professional photographers
Face AI and Skin AI, which automatically retouch your subjects’ faces for improved teeth, lips, skin, and much more
Sky AI, which allows you to instantly swap skies while automatically adjusting for changes in lighting and color
Atmosphere AI, which lets you enhance your images with weather effects (such as haze, steam, drizzle, fog, and mist)
To see some of these AI effects in action, check out Skylum’s latest video:
While Luminar AI will be offered as a standalone editing program, Skylum’s most up-to-date software, Luminar 4, already packs some AI-based features (including a popular sky-replacement option). But Luminar AI promises to take AI editing to the next level, opening up advanced post-processing effects to a much larger audience.
Will Luminar AI do everything for you?
It doesn’t seem like it. As Skylum explains, you have to make creative choices; Luminar AI will do a lot of editing work for you, but you’ll remain at the helm.
As for the Luminar AI release:
The date is currently unknown, but Skylum promises a “holiday season” release. I’d expect an early December debut, though mid-December or late November certainly isn’t out of the question.
In terms of price, you can currently preorder Luminar AI for a discounted rate:
$74 USD for a standard copy of Luminar AI, or $139 USD to download both Luminar AI and Luminar 4.
So if Luminar AI’s simple approach to editing appeals to you, make sure you take a look while the discount lasts! You can view the software on Skylum’s website.
Now over to you:
What do you think about Luminar AI? Do you like the idea of AI-based editing? Or does it feel like cheating? And will you purchase the program? Share your thoughts in the comments!
Weird lens guru Mathieu Stern is back at it with a new video that shows images captured with two $20 Carl Zeiss projector lenses he converted into camera lenses.
As with many of Stern’s DIY projector lens projects, both of these lenses — a 120mm F1.9 and a 105mm F1.9 — have no focusing mechanism and no adjustable aperture. While the fixed aperture isn’t so easy to address, the video briefly shows how he uses an M65 helicoid ring adapter to give the lenses manual focus. Although not shown in the video, Stern then uses an M65-to-Sony-E-mount adapter to mount the custom lens on his Sony camera.
The lenses produce pronounced ‘swirly’ bokeh and a very sharp separation between the subject and the background. They’re not going to win any resolution or edge-to-edge sharpness contests, but considering you can pick up similar projector lenses for around $20 online and a set of adapters for your camera for roughly $50, it’s a cheap way to get some unique shots.
Stern has a full list of the components he used in the video’s description on YouTube. You can find more of his work on his YouTube channel and website, which also features his always-growing ‘Weird Lens Museum.’
Roughly clockwise from left: 300mm collimator, laser transmission testing, lens test projector, Trioptics Imagemaster HR optical bench, spectrometry measurement. It might not look like much, but the total cost is similar to a really nice house in a small city (or a decent house in a big city).
I have a complete testing lab at my disposal: MTF benches, lens test projectors, spectrometers, lasers, an Imatest setup gathering dust in a back room; everything all the cool kids have. A lot of people assume I test the hell out of my own shiny new personal lenses after I buy them. (Yes, I buy my own stuff). I do test them, but not in the lab. I go out and take pictures with them.
It’s not because I’m such a great photographer that my practiced eye can tell more about the lens through photographs than any lab test could. I’m a mediocre photographer. Years ago I tried making a living as a photographer. I sold some prints once, made enough to pay for maybe half a lens, and after another six months without a sale I decided to explore other methods of supporting my extravagant lifestyle.
The lab is faster, gives tons of information, and makes cool graphs. But I still don’t use it to test my personal lenses
It’s not because the lab stuff doesn’t give useful information. The lab gives a LOT of useful information. Most people don’t have time to learn how to interpret it, or learn its value and limitations, but it’s useful information nonetheless. And the lab is fast; I can test a lens about 32 different ways in a couple of hours. My ‘test a lens with photography’ time is a half a day or more. So the lab is faster, gives tons of information, and makes cool graphs. But I still don’t use it to test my personal lenses.
Lab tests give a ton of precise information. Understanding and interpreting it is, I’ll admit, not completely intuitive.
That’s because all lab tests have some major limitations. The biggest one is this: real images are 3-dimensional, they are focused at a variety of distances, and almost always contain foregrounds and backgrounds. Optical tests are two-dimensional slices taken at a fixed focusing distance with no background or foreground. The focusing distance is infinity for an optical bench; it’s a single, close distance for Imatest, DxO and other computer image-analysis methods.
So, the lab tests tell me everything I want to know about the plane of exact best focus at one focusing distance. That’s really useful information, especially if you want to find out if a lens is optically maladjusted, want to know what kind of aberrations it has, or are interested in its maximum resolution. And it gives people numbers – the ammunition of choice in many a Forum War.
Even a three-dimensional standard comparison image, such as the kind that DPReview and other sites use, is basically limited to one focusing distance. That distance is different for different focal lengths but it’s always fairly close up. And, if it’s an indoor target, the depth of those targets is usually only a few feet at most; it’s not going to show you what the out of focus area 30 feet behind the image plane looks like.
What I actually do to test a new lens
Photographs give me far more information than the lab, even if it’s less exact. I don’t recommend brick wall or side-of-building photographs. Those are just 2-dimensional slices like the lab gives, but with more variables and less information. I want photographs of 3-dimensional subjects.
With the right background (I prefer a field or yard of grass) you can quickly compare resolution at a half-dozen focusing distances. Sure, some lenses are about the same at all distances, but many are not. No zoom lens is equally sharp at all focal lengths. My favorite grass field is a hill behind my office that slopes up away from me. I focus on the mower tracks and quickly get images at several focusing distances.
Simple grass slope image taken with a Canon 50mm F1.2 lens at F1.4.
Grass (or pebbles, or concrete, or all manner of things that make fairly uniform photographs filled with fine detail) is great for figuring out a lens’ zone of acceptable sharpness – acceptable to you, that is.
Repeating this set of images at several apertures lets me see at what aperture maximum center, middle, and edge sharpness occur (those are almost always different). It’s good to know, for example, that center sharpness peaks at F4 while the edges don’t reach maximum sharpness until F6.3 or F8 – or that they never get very sharp.
Grass is also great because it gives you a nice sharpness comparison as you leave the area of best focus. I also recommend looking at what you consider the depth of field at each aperture and focusing distance. Depth of field is not an area of maximal sharpness. It is an area of acceptable sharpness; there is greater and lesser sharpness within the depth of field. Your definitions of ‘acceptable sharpness’ in your images may be greater, or less, than the calculated depth of field.
You rarely see dramatic changes in a prime lens’ field curvature at different focusing distances, but you will usually see a dramatic change in a zoom’s field curvature at different focal lengths
More importantly, some lenses fall off of the sharpness cliff as they exit their area of maximal sharpness, others drift so slowly down the gentle sharpness slope that it really does seem as if the entire depth of field is maximally sharp. Also, that sharpness slope often changes at different apertures. Those are all good things to know.
The other thing I do is to take some of my grass images and run them through a Photoshop ‘Find Edges’ filter or equivalent. This will let you visualize the field curvature of your lens and see how it varies at different focal lengths or focusing distances. (Pro tip: you rarely see dramatic change in a prime lens’ field curvature at different focusing distances. You will, however, usually see a dramatic change in a zoom’s field curvature at different focal lengths.) That’s really useful information that few people know about their lenses. The find edges type filters are also a good way to look at depth of field at various apertures or with different lenses.
Same image as above (Canon 50mm F1.2) run through a find edges filter – the field curvature is obvious.
Field curvature of Canon 50mm F1.2 as measured on an optical bench. You get about the same information from the grass photo and find edges filter as you would from the $ 250,000 optical bench.
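If you don’t have Photoshop handy, a find-edges pass like the one described above can be approximated with Pillow’s built-in FIND_EDGES kernel. This is a minimal sketch, assuming Pillow is installed; the function name and filenames are my own placeholders:

```python
from PIL import Image, ImageFilter

def find_edges(img):
    """Approximate Photoshop's 'Find Edges': grayscale the frame,
    then run a Laplacian-style edge kernel. In-focus grass renders
    as bright edge detail while out-of-focus zones go dark, so the
    shape (and tilt) of the field of focus is visible at a glance."""
    return img.convert("L").filter(ImageFilter.FIND_EDGES)

# Typical use on a grass-slope frame (filenames are placeholders):
# find_edges(Image.open("grass_f1p4.jpg")).save("grass_edges.jpg")
```

Any edge-detection filter (Sobel, Laplacian, etc.) will give a similar visualization; the exact kernel matters much less than shooting a uniformly detailed subject.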
Grass shots also give you a superb way to see if your lens is softer in one area or if the field is tilted. The grass image above is very slightly tilted, an amount that’s about normal for a good prime lens. A more dramatic field curvature might look as though you’d rotated the dark area 15 or 20 degrees in Photoshop.
About half the people who take building or brick wall images and think their lens is ‘decentered’ actually have a lens with a field tilt; the lens is equally sharp on both sides, but not at the same distance as center focus. It’s actually very hard to detect a field tilt by shooting a chart and evaluating a two-dimensional image.
A large field tilt in a prime lens is unusual while a field tilt at some focal lengths of a zoom is pretty common. (I’ve seen 45 degree field tilts in zooms, but 10 degrees or so is routine.) If you return your zoom lens to the store for exchange, the replacement will probably have a different field tilt at another focal length.
People like to talk about a lens’ bokeh like it’s one thing, but bokeh often varies
If the lens is one for which I consider bokeh important, I use a ‘Bokelizer.’ Basically, this is a couple of strings of tiny Christmas lights hung in a three-dimensional pattern. I take some images at various focusing distances and evaluate the foreground and background out-of-focus highlights, as well as the in-focus lights. People like to talk about a lens’ bokeh like it’s one thing, but for many lenses bokeh varies in the foreground vs. the background, at different focusing distances, and depending on how far off-center the object is.
Why do I look at in-focus lights, since they have nothing to do with out-of-focus highlights? Because comparing pinpoint light sources is a superb way to see if the lens is optically maladjusted. ‘Optically maladjusted’ means a lens that has a decentered, tilted or poorly spaced element. On the forums, people often refer to all of these issues as ‘decentering’ but that’s less than correct.
Illustrations of the various types of optical maladjustments. In reality, a given lens usually has several small errors, rather than one single large one.
Each of those optical maladjustments causes different optical problems and often they’re apparent when looking at pinpoint light sources. Looking at pinpoint light sources also gives you an idea of the coma and other aberrations that the lens displays by design.
This image was created from equipment in the repair department that basically just projects pinhole lights. You can easily see the difference between a good lens (upper half) and one that is slightly decentered (bottom half).
Once I’m done with the stuff above, I go out and take the kinds of pictures that I bought the lens for. But the hour or two needed for the checks above gives me a lot of information about how best to play to the lens’ strengths and work around its weaknesses before I set off to shoot. It also shows me if the lens is optically maladjusted; there’s no sense taking a bunch of photographs if I already know I’m going to return the lens.
Will taking pictures tell me if I got a copy that’s every bit as sharp as the copy Reviewer Guy got? Absolutely not. Does it let me spout numbers in ‘my lens is better than your lens’ Forum Wars? Again, no. But it certainly does tell me if the lens meets my expectations and will do the job I want it to do. Lab tests give me all manner of information, but they can’t tell me whether I’m going to like the images from the lens.
It doesn’t matter to me at all if I have the sharpest copy of a lens or not. I just want to know if it’s acceptable for the purposes I want to use it for
To be completely honest, if I think the lens isn’t as sharp as I expect, then I may actually take it to the lab and measure it on the bench. I’ve done that maybe twice in the last ten years out of a few dozen lenses I’ve purchased, and both times it turned out that the lens wasn’t up to spec. So, really, I knew the answer without using the bench.
Photographic testing won’t tell you if your lens is among the sharpest copies of that lens, or if it’s in the top half of the variation range or things like that. If you want to know that, then really you need to pay someone to test the lens on a test bench. Why don’t I do that? Because it doesn’t matter to me at all if I have the sharpest copy or not. I just want to know if it’s acceptable to me for my purposes.
Roger Cicala is the founder of Lensrentals.com. He started by writing about the history of photography a decade ago, but now mostly writes about the testing, construction and repair of lenses and cameras. He follows Josh Billings’ philosophy: “It’s better to know nothing than to know what ain’t so.”
The post Photographing Stars Using a Kit Lens appeared first on Digital Photography School. It was authored by Adeel Gondal.
Nikon D5100 | 18mm | 20 sec | f/3.5 | ISO 1600
Have you seen those amazing starry sky and Milky Way photographs from professional photographers and wondered how to create similar results? Have you never tried because you thought you didn’t have the proper equipment?
Let me tell you, “You are wrong!”
If you own a normal DSLR or mirrorless camera, you can create stunning star photos with only a kit lens.
In this article, I will explain the whole process of photographing stars with your kit lens. I’ll give you step-by-step instructions in the easiest possible way – so that, even if you don’t have much technical knowledge, you can start capturing star photos like a pro.
Let’s dive right in.
The star photography basics
To get started, keep the following points in mind:
You must shoot in a place away from city lights. The less light pollution, the better your chances of capturing clear stars.
You’ll want a moonless night. Stars can be shot on a full-moon night, too, but the brighter the moon, the more it washes out the sky, and the stars will not be as prominent.
You’ll need a normal DSLR or mirrorless camera with a standard 18-55mm kit lens (such as this Canon lens or this Nikon lens).
You’ll also want a tripod.
You can Google your surroundings to find locations that are far away from the city (check out Dark Sky for a helpful interactive map).
You should know beforehand in what direction, and at what time, the moon is going to rise. That will help you a lot with the composition of your images.
However, a moonless night is always best to shoot stars – so I recommend checking the current moon phase before heading out.
Additionally, you can use a compass app on your smartphone to locate the North Star for star trails. To get an idea of the stars above your location, you can download an app called Star Chart (for iOS or for Android) or Google Sky. Both of these apps also show you the direction of the Milky Way, so you can shoot it directly and get amazing results.
I also recommend checking out PhotoPills; this app offers an incredible suite of features for the beginner (or more advanced!) night sky photographer.
Anyway, these apps are pretty accurate. With their help, you can pick out Mars with your naked eye (you’ve almost certainly seen Mars before, but were likely unable to differentiate it from the surrounding stars).
If you want to plan a future shoot or look for an appropriate time to shoot the Milky Way in your location, you can download the desktop app Stellarium. Just pop in your coordinates, and it will show you the direction of the Milky Way at a specific date and time.
Thanks to Stellarium, you can know the exact time of year the brightest part of Milky Way will be above your location (so you can do some amazing star photography!).
Nikon D5100 | 18mm | 20 sec | f/3.5 | ISO 1600
Key camera settings for photographing stars
Now let’s get to the important part of photographing stars with a kit lens:
Camera settings.
You will need to take control of your camera, so you’ll want to keep it in Manual mode.
Change the shooting mode to Manual and adjust your setup to the following settings:
Focal length
Set your focal length to the widest possible option. In the case of a kit lens, this will often be 18mm.
While you can technically choose any focal length you want, the more you zoom in, the fewer stars you will be able to capture.
Plus, your optimum exposure time before star trails start to develop will decrease as you increase your focal length (see the 500 Rule, discussed in the shutter speed section below).
Aperture
Setting your aperture to the widest option is key here; for my kit lens, this is f/3.5.
Using the widest aperture your lens allows lets more light reach the sensor.
And this will result in brighter stars and a brighter Milky Way!
Shutter speed
If you are only shooting stars and the Milky Way, I recommend a shutter speed of 20 seconds.
Why 20 seconds?
Here’s the answer:
A shutter speed of under 20 seconds will result in less light reaching the camera sensor.
And a shutter speed of over 20 seconds will start to create star trails. In other words, the stars will move visibly across the sky.
In fact, there’s a handy equation for calculating your shutter speed, called the 500 Rule:
The optimum exposure before you start getting star trails is calculated by dividing 500 by your focal length (you’ll need to divide the answer once more, by around 1.5, if you are using a crop-sensor camera).
So in the example of an 18mm lens on a cropped sensor, divide 500 by 18, for an answer of 27.78. Then divide this again, by 1.5, to get 18.52, which is roughly 20 seconds.
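If you’d rather not do the division in your head, the rule is trivial to script. Here’s a minimal sketch; the function name and the 1.5x crop factor for APS-C bodies are my own choices:

```python
def max_exposure_seconds(focal_length_mm, crop_factor=1.0):
    """500 Rule: longest exposure (in seconds) before stars start
    to visibly trail, for a given focal length and sensor crop."""
    return 500 / (focal_length_mm * crop_factor)

# 18mm kit lens on a 1.5x APS-C body:
print(round(max_exposure_seconds(18, 1.5), 2))  # 18.52 -> use ~20 seconds
```

The same helper works for any lens: a 35mm prime on the same body would give you about 9.5 seconds before trailing begins.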
Make sense?
Nikon D5100 | 18mm | 20 sec | f/3.5 | ISO 1600
ISO
Start by keeping your ISO at 1600. You can then increase it later, depending on your results.
But keep in mind:
The greater the ISO, the more noise there will be in your image.
Now, this does depend on the signal-to-noise ratio of the camera body you are using; high-end cameras tend to offer the best high-ISO noise performance, and even modern consumer cameras feature decent results at high ISOs. But try to boost the ISO on an older camera, and you’ll end up with all sorts of unwanted noise.
Remote shutter release
You’ll want to have a remote shutter release to avoid camera shake from hitting the shutter button.
If you don’t have a shutter release, just use your camera’s 2-second or 10-second timer.
That will minimize any blur in the picture due to camera shake.
You should also switch off any “vibration reduction” or “image stabilization” technology included in your camera or lens, because this technology can actually increase blur during long-exposure images.
Focusing your lens to infinity
After dialing in all these settings, here’s the only important thing left to do:
Focus your lens to infinity.
Now, a kit lens doesn’t have an infinity marker on it, so you’ll need to use hyperfocal distance values to focus your lens.
Here’s what you do:
Mount your camera and lens on a tripod, and focus on any bright object at a distance of 20 feet or more.
(If you are in the dark and struggling to focus, you can point a flashlight toward your camera and use this as a point of focus.)
Once the lens is focused beyond roughly 20 feet – past its hyperfocal distance – everything out to infinity will be acceptably sharp, so your stars will be sharp. This will also keep much of the foreground sharp, too.
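That 20-foot rule of thumb can be sanity-checked against the standard hyperfocal-distance formula, H = f²/(N·c) + f. A small sketch; the 0.019mm circle of confusion for APS-C sensors is a common convention I’m assuming, not a figure from this article:

```python
def hyperfocal_m(focal_mm, aperture, coc_mm=0.019):
    """Hyperfocal distance in meters: H = f^2 / (N * c) + f.
    Focus at or beyond H and everything out to infinity falls
    within acceptable sharpness."""
    return (focal_mm ** 2 / (aperture * coc_mm) + focal_mm) / 1000

# 18mm kit lens wide open at f/3.5 on an APS-C body:
h = hyperfocal_m(18, 3.5)
print(round(h, 2), "m")   # 4.89 m, or about 16 ft
```

So focusing at 20 feet (about 6.1 m) comfortably clears the hyperfocal distance for an 18mm kit lens wide open, which is why the stars come out sharp.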
Don’t forget to switch your lens to manual focus; otherwise, it will start to hunt for focus when the shutter is pressed.
You may be wondering:
Why is focusing to infinity so important?
If your lens is not focused to infinity, you will still capture a Milky Way image. However, the stars will not be as sharp, and they’ll appear bigger, like this:
My lens was not focused to infinity.
And this:
My lens was not focused to infinity.
And this:
My lens was not focused to infinity.
You’ll get a similar result if you use a too-long shutter speed and produce star trails.
So make sure you pay careful attention to both your settings and your point of focus.
A quick tip for photographing stars
Once you reach your location, it’s better to first sit in the dark for at least 15 minutes to let your eyes adjust to the surroundings. This will help you see a lot of stars (and even the Milky Way) with your naked eye. It will also help you compose your images better – because it’s easier to create beautiful compositions when you can see!
Plus, enjoying your surroundings for a while is better than simply shooting as soon as you reach the site.
Post-processing your star photos
When it comes to post-processing your star images, there are two things you should know:
First, always shoot in RAW. This will give you a lot of room for post-processing (that won’t affect the image quality).
Second, some post-processing is always needed to get optimum results. You can find many tutorials on how to post-process Milky Way images, including some here on dPS!
Capture stunning star trails
If you are satisfied with your shots, the next step is to capture star trails.
Simply locate the North Star using the Star Chart app discussed above.
Then keep the North Star in your composition, because this is the star that all other stars rotate around.
For star trails, all camera settings will remain the same, except that you can lengthen the exposure to 30 seconds.
Alternatively, you can go with faster shutter speeds (i.e., 20 seconds or faster, especially if there are lights in the area and 30 seconds results in overexposed images).
Keep your camera in its continuous shooting mode, and let it capture as many exposures as possible. The more pictures you have, the longer and cleaner your star trails will be. Of course, each exposure only records a short segment of trail; later, you can join all the exposures in Photoshop or use dedicated stacking software such as StarStaX to create full trails.
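The joining step is essentially a lighten blend: for each pixel, keep the brightest value seen across all frames, so the moving stars paint continuous streaks. A sketch of that logic with numpy (StarStaX and Photoshop’s Lighten blend mode do essentially this, plus extras like gap filling); the function name and filenames are my own:

```python
import numpy as np

def stack_lighten(frames):
    """Lighten-blend a list of exposures (H x W x 3 uint8 arrays):
    each output pixel is the brightest value that pixel reached in
    any frame, so moving stars leave continuous trails."""
    return np.maximum.reduce([np.asarray(f) for f in frames])

# Hypothetical use with Pillow-loaded frames:
# import glob
# from PIL import Image
# frames = [np.array(Image.open(p)) for p in sorted(glob.glob("trail_*.jpg"))]
# Image.fromarray(stack_lighten(frames)).save("star_trails.jpg")
```

Lighten blending (rather than averaging) is what keeps the trails bright: an average would dim each star by the number of frames it is absent from.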
For instance, this shot is a combination of 18 separate exposures:
Alternatively, you can capture one shot of the stars and make star trails with it using the HM Technique:
Star trail created in Photoshop using the HM Technique
And you’re also free to have fun with Photoshop:
Fun in Photoshop!
Once you’ve nailed photographing the Milky Way, try including foreground objects for better compositions:
Nikon D7000 | 18mm | 20 sec | f/3.5 | ISO 1600
Photographing stars using a kit lens: conclusion
You’re all set to shoot your own stars!
With the help of a kit lens, you can create some beautiful star photography – the kind that’ll make you happy and impress your friends, too.
You could even try creating panoramas to get more of the Milky Way in your composition, like this:
So happy shooting, and keep me updated with your results! And if you need any help, let me know down in the comments.
The post Food Photography Techniques and Tips appeared first on Digital Photography School. It was authored by Jonathan Pollack.
Editor’s note: This article was updated in October 2020.
I originally wrote this article in 2009 when I was beginning to take pictures for my wife’s baking blog and various local magazines. Since then, I’ve had a lot of practice, and I’ve honed my food photography techniques.
I’m thrilled to update everyone with what I’ve learned, and I hope that you find the information here helpful when taking pictures of your food.
Food styling
When you’re taking pictures of food, it’s critical that your subject looks as good as it hopefully tastes. If you don’t have the budget to hire a food stylist, it’s important to know some basic styling techniques that can make food look its best.
Make more food than you think you’ll need, and always photograph the examples that look the best. You cannot make a burnt waffle or soggy asparagus look appetizing.
Props can help you set the mood for your food photos. I have a basement full of plates, platters, chargers, cups, glassware, utensils, napkins, tablecloths, surfaces, and cutting boards at my disposal. With those, I can pull together what I need to set a table (or a portion of a table) for a picnic setting, a fine-dining scene, or something in between.
While you don’t need to have a ton of food photography props, I recommend that you at least have a few place settings and utensils, as well as a nice surface to photograph on.
Plating your dish properly can elevate it from mundane to extraordinary. If you aren’t working with a professional chef or food stylist, I recommend you read articles and watch video tutorials about plating as a starting point.
Think about ingredient placement, creating height or depth, adding color, or increasing contrast.
Here are a few plating tips to improve your food photography:
Plate an odd number of the item you’re photographing.
Add garnish where appropriate, such as with soups.
Lean longer flat items against those with some height.
Use edible flowers or fresh herbs to add some contrasting color to your plate.
Food photography composition
When taking pictures of food, you have the advantage of a subject that is stationary. This means you have complete control over the camera’s position and angle, and how close or far away the camera is from the scene.
To get the best composition, I recommend you start with a few food photography techniques:
Use the traditional rule of thirds to yield strong compositions if you don’t have a lot of experience. Mentally divide your frame into a 3×3 grid and place key elements at the gridlines or intersections. Once you’re familiar with the rule of thirds and you can see how it can lend power to your scene, treat it as a suggestion and experiment.
Draw the viewer into your composition. Show part of the plate in your photo rather than the entire dish. Use utensils or napkins to help guide the viewer’s eyes to what you want them to focus on. And remember that the same contrasting elements that help you style a plate will also work in your favor when it comes to composition.
Use negative space to make a powerful photo. Clutter in your scene causes visual confusion, so remove it. The fewer elements competing for your viewer's attention, the better.
Think of how the dish would best be viewed. Head-on views, overhead shots, and views looking slightly down into the dish are always preferable to looking down at a plate from a 45-degree angle.
Keep your camera level. Early in my food photography career, I felt that angles conveyed a sense of excitement to the viewer. They did – because they made it look like the food was falling off the plate! I stopped rotating my camera, and my photos became much stronger overnight.
Zoom with your lens and your feet. There are times when macro photos of food work well, and there are times when a wider shot conveys a better sense of place and atmosphere. See if background compression helps remove distractions from your dish. Try a number of options with your food and see what works best.
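For the rule of thirds mentioned above, the grid is simple arithmetic: divide the frame's width and height into thirds, and the four gridline intersections are the classic placement points for key elements. A quick sketch (the frame dimensions in pixels are hypothetical inputs):

```python
def thirds_intersections(width, height):
    """Return the four rule-of-thirds intersection points (x, y)
    for a frame of the given pixel dimensions."""
    xs = (width / 3, 2 * width / 3)    # the two vertical gridlines
    ys = (height / 3, 2 * height / 3)  # the two horizontal gridlines
    return [(x, y) for y in ys for x in xs]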
Food photography lighting and exposure
While you don’t need the absolute best equipment to photograph food – I’m using camera equipment that’s a decade old! – you do need to think about how your choice of lighting, exposure, and even camera equipment affect your photo.
Placing your dish with natural light to the side and behind generally yields great results, but it’s also not consistent or reproducible; a passing cloud or thunderstorm can suddenly destroy all of your styling and composition efforts.
If you’re working on taking photos over the span of a few hours and you want them all to look similar, I highly recommend that you use artificial light of some sort. Most food photos I’ve taken indoors are photographed using flashes with diffusion from behind or off to the side of the food. I use reflectors to bounce some light back to the front of the dish, if needed.
If you’re working within a budget, consider getting inexpensive work lights and putting white linens between the lights and your food.
Learn how to use the exposure triangle whether you’re using natural or artificial lighting, and start setting everything on your camera manually. Aim to blur out your background while keeping the foreground nice and sharply focused. And, of course, make sure everything is lit just right.
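One way to internalize the exposure triangle is to note that aperture and shutter speed combine into a single exposure value, EV = log2(N²/t): any two settings with the same EV (at the same ISO) admit the same amount of light, so you can trade a wider aperture (for that blurred background) against a faster shutter. A minimal sketch of the standard formula:

```python
import math

def exposure_value(f_number, shutter_seconds):
    """Exposure value: EV = log2(N^2 / t).

    Opening the aperture one stop or doubling the shutter time
    lowers EV by 1 (twice the light reaches the sensor); doubling
    ISO lets you shoot at an EV that is 1 higher.
    """
    return math.log2(f_number ** 2 / shutter_seconds)
```

For example, f/2 at 1/100 s and f/4 at 1/25 s give the same EV: the second setting gathers the same total light through a smaller aperture by exposing four times as long.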
Equipment for food photography
The best camera equipment in the world won’t help if you haven’t learned what to do with it. I always advise people to start by renting equipment or buying cheap, used gear.
To start off, I recommend a tripod, a camera body that supports detachable lenses, and a medium zoom lens. I encourage you to learn to shoot and edit RAW photos, as you can adjust far more in post-production with them than you can with JPEG images.
If you’re ready to move on from your basic camera setup, I recommend you buy color calibration equipment and lighting. And when you feel confident that you have a good understanding of all of the gear you have and you feel that your current gear is holding you back, finally commit to a full-frame camera system and professional lenses.
Food photography techniques: conclusion
What food photography techniques have worked for you? What techniques have been problematic? Please let me know in the comments!
The post Food Photography Techniques and Tips appeared first on Digital Photography School. It was authored by Jonathan Pollack.
Autel Robotics announced its long-awaited EVO II series drone at CES 2020 in January, promising vast improvements over the original EVO model launched back in 2018. Its most notable feature is a modular camera system, offering three models that cover a range of features that meet different users’ needs, from consumers to professionals.
The camera on the standard EVO II uses a 1/2″ 48MP Quad Bayer sensor and is the first consumer drone to offer 8K video. The EVO II Pro uses a larger 1″-type 20MP sensor that gives 6K recording, and the EVO II Dual features both an optical and a thermal camera in a single unit and also maxes out at 6K recording. The modular system allows users to switch cameras if needed on a single drone.
12 computer vision sensors for omnidirectional obstacle avoidance
40-minute maximum flight time
Controller with built-in color screen
9km video transmission
No geofencing
Online login not required to fly
These shared specifications suggest a capable drone. The lack of geofencing will certainly appeal to some, and the 40-minute flight time is impressive. However, there are key differences between models depending on which camera you intend to use.
| | EVO II | EVO II Pro | EVO II Dual |
|---|---|---|---|
| MSRP | $1,495 | $1,795 | $9,998 |
| Sensor size | 1/2″ CMOS | 1″-type CMOS | 1/2″ CMOS (optical), FLIR BOSON sensor (thermal)* |
| Sensor resolution | 48MP Quad Bayer | 20MP | 20MP (optical), 640 x 512 (thermal) |
| Max photo resolution | 48MP | 20MP | 20MP |
| ISO range | 100-6400 (video), 100-3200 (photo) | 100-6400 (video), 100-12,800 (photo) | 100-6400 (video), 100-12,800 (photo) |
| Max video resolution | 8K/25p, 6K/30p, 4K/60p | 6K/30p, 4K/60p, HD/120p | 6K/30p, 4K/60p, HD/120p |
| Lens | 26mm equiv. (F1.8 fixed) | 29mm equiv. (F2.8-11) | 29mm equiv. (F2.8-11) |
| Zoom | 8x (up to 4x lossless) | 8x (up to 3x lossless) | 8x (up to 3x lossless) |
| Takeoff weight | 1150g (2.5 lbs.) | 1191g (2.6 lbs.) | 1150g (2.5 lbs.) |

*FLIR sensor size not specified
When buying an EVO II, you can choose the model with the camera that best fits your needs. If you want to switch cameras at some point, you can do it without buying a whole new drone.
The EVO II was released in June following several delays, caused first by a software bug and then by supply chain shortages. Has the company ironed out the glitches that delayed its launch for a few months? And how does it compare to similar models from DJI? We'll explore both questions in this review.
We tested the standard EVO II, thanks to our friends at Drone-Works. Chicago-based professional Antoine Tissier lent us his EVO II Pro model for some additional tests. We did not test the EVO II Dual.
Aircraft and controller
The EVO II bears a strong resemblance to DJI’s folding Mavic series of drones, though its body is substantially larger, and it doesn’t quite fit in your palm. One thing that’s a bit perplexing is that the bottom propellers don’t fold neatly under. They jut out slightly, making it more difficult to carry the drone in-hand.
Aircraft
The EVO II features a total of 12 computer vision sensors located on the front, rear, top, bottom, left, and right side of the aircraft for omnidirectional obstacle avoidance. There are also two ultrasonic sensors located on the bottom of the drone for precision hovering.
The Owner’s Manual points out that there are blind spots at all four corners of the drone. When I flew the EVO II in diagonal directions, I noticed that obstacle avoidance didn’t activate at times. Regardless, you should always fly your drone within visual line of sight.
The bottom of the Autel EVO II aircraft is equipped with 2 Ultrasonic sensors (closest to the camera) followed by the Downward Vision System (in the middle and back) and the Downward Vision Lighting LED (middle-right).
Autel claims a 40-minute battery life while flying and 35 minutes when hovering without wind. I found this figure extremely accurate. For comparison, the Mavic Air 2 clocks in at 34 minutes while the Mavic 2 Pro tops out around 30 minutes. That extra 6–10 minutes of battery life will matter if you’re performing an inspection or mapping a site.
The battery is huge at 7,100 mAh and slides in and out easily. According to Autel, a ‘patented Battlock system’ prevents the battery from ejecting during fast flights or crashes.
8GB of onboard storage is available if you’re without a memory card, or as a backup if you run out of space while capturing imagery. Media stored on the drone can be accessed through a USB-C port located on the right-hand side. On the opposite side is a microSD slot that can house a card up to 256GB.
Controls and flight modes
The EVO II is powered by the same type of remote as the original EVO, which is disappointing for several reasons. Because you’re using it to maneuver your drone, the remote should be ergonomically friendly. Unfortunately, that’s not the case with this particular design. Two rather awkward handles fold out from the bottom that are made of slick plastic. While I didn’t fly in hot weather, I couldn’t help but wonder how challenging it might be to hold on to the remote should my palms sweat.
Your mobile device clamps in on top of the remote, and you don’t need to remove your smartphone case. As with the original EVO and competing Mavic models, however, tablets will not fit. The main part of the controller features a built-in 3.3-inch OLED display.
The controller’s 3.3-inch built-in OLED display gives you critical flight information.
It’s possible to operate the EVO II using the remote controller on its own. This works for taking photographs or video clips on the fly. However, Autel recommends using its Explorer app on a smartphone to access all of the drone’s features.
Unlike recent Mavic controllers, there isn’t a simple routing solution for connecting your mobile device if you’re using Apple’s iPhone. Instead, a USB Type-A port can be found at the bottom of the remote. This means you need to supply your own connecting cable, much like the DJI Phantom 4 models of 2016. For all other smartphones, a USB Type-C connector is included.
Another issue stems from two buttons labeled ‘A’ and ‘B’ on the remote’s backside. They’re far too easy to press accidentally while flying, activating, for example, the Voice Assistant or an Intelligent Flight mode. It’s possible to program the buttons to perform different functions, but you’re likely to trigger a feature unintentionally at least once per flight, and it’s distracting at best.
Poorly placed buttons on the backside of the remote make it much too easy to activate features like Autel’s Voice Assistant accidentally.
I can’t help but wonder why Autel didn’t take a cue from DJI, which made it incredibly simple to switch flight modes by featuring them front and center on the Mavic Air 2 remote. On the EVO II, to activate ‘Ludicrous’ mode, the equivalent of DJI’s Sport mode, which allows the drone to travel at its top speed of almost 45 mph, you need to go into the app’s settings menu.
The sticks on the remote are easy to maneuver, with just the right amount of resistance. To power the drone up or down, you have to hold the battery button for three or more seconds, a slight adjustment for DJI users accustomed to a quick tap followed by a two-second hold.
Odds and ends
Drone-Works sent me the EVO II ‘Rugged Bundle,’ which includes a hard case designed specifically for this product by GPC. It also has two extra sets of propellers and an additional flight battery. The case is rather large for what is fundamentally a compact drone and will be a hassle, especially with airport security, once air travel becomes commonplace again.
On the right is a Mavic 2 case I purchased for myself. Though the drone isn’t too much smaller than the EVO II, the case that comes with the ‘Rugged Bundle’ is overwhelmingly large for a foldable drone.