We know more about the surfaces of the Moon and Mars combined than we do about our own ocean floor, according to NASA Ames scientist Ved Chirayath. That's why he is developing a camera system that can effectively remove the water from our seas, revealing 3D images of what lies below the waves. With a grant from NASA's Earth Science Technology Office, Chirayath is working on a project that combines hardware and software to see and map the floors of large bodies of water as though the water weren't there at all.
In a video accompanying the announcement, Chirayath explains that surface waves make it difficult to see the ocean floor, but his Fluid Cam uses software called Fluid Lensing to image objects through up to 10 meters of water.
While he doesn't explain exactly how the technique works, he does say it requires a camera with a great deal of processing power, as the software runs on-board. The camera shown in the video uses a Leica Elmarit-M 28mm F2.8 lens mounted on the front of what is described as a 'high performance' camera. We are told it uses a 16-core processor, has 1TB of RAM, and outputs data at a rate of 550MB per second.
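Taking those quoted figures at face value, a quick back-of-the-envelope calculation (ours, not NASA's) gives a sense of the data volumes involved: at 550MB per second, a full terabyte fills in roughly half an hour of continuous capture.

```python
# Back-of-the-envelope arithmetic based only on the figures quoted
# above (1TB and 550MB/s); this is our illustration, not a published
# NASA spec sheet.
capacity_mb = 1_000_000    # 1TB expressed in megabytes (decimal units)
rate_mb_per_s = 550        # quoted output rate in MB per second

seconds_to_fill = capacity_mb / rate_mb_per_s
print(f"1TB fills in about {seconds_to_fill / 60:.0f} minutes")  # ~30 minutes
```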
At the moment the camera is in the testing stage and has flown attached to a drone, but NASA hopes that in the near future the technology will be carried aboard aircraft and satellites, so that wider areas can be mapped and explored.
The project was unveiled on the NASA website as part of the agency's program to mark Earth Day, where more information is available.
Press Release
New Camera Tech Reveals Underwater Ecosystems from Above
Scuba divers and snorkelers spend vacations visiting exotic coastal locations to see vibrant coral ecosystems. Researchers also don their gear to dive beneath the surface, not for the stunning views, but to study the health of the reefs that are so critical to fisheries, tourism and thriving ocean ecosystems.
But one person can only see so much coral in a dive. What if you wanted to assess coral over an entire region or see how reefs are faring on a global scale?
Enter Ved Chirayath of NASA Ames Research Center in Silicon Valley, California. He has developed a new hardware and software technique called fluid lensing that can see clearly through the moving water to image reefs. Imagine you’re looking at something sitting at the bottom of a swimming pool. If no swimmers are around and the water is still, you can easily see it. But if someone dives in the water and makes waves, that object becomes distorted. You can’t easily distinguish its size or shape.
Ocean waves do the same thing, even in the clearest of tropical waters. Fluid lensing software strips away that distortion so that researchers can easily see corals at centimeter resolution. These image data can be used to discern branching from mounding coral types and healthy coral from those that are sick or dying. They can also be used to identify sandy or rocky material.
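Neither the article nor the press release spells out the fluid lensing algorithm itself. As a rough illustration of the general problem it solves, here is a much cruder classical trick for seeing through surface waves: because waves displace each pixel back and forth around its true position, the per-pixel median over a stack of video frames of a static scene tends to land near the undistorted value. This is a simplified stand-in, not NASA's method; the function name and data layout are our own.

```python
import numpy as np

def temporal_median_destill(frames: np.ndarray) -> np.ndarray:
    """Collapse a stack of video frames of a static underwater scene
    into a single, less-distorted image.

    A deliberately simple stand-in for what fluid lensing does far more
    cleverly: wave-induced refraction jitters each pixel around its true
    position, so the per-pixel temporal median approximates the
    undistorted scene.

    frames: array of shape (T, H, W) or (T, H, W, 3)
    """
    return np.median(frames, axis=0).astype(frames.dtype)

# Hypothetical usage: `video_stack` would come from a camera pointed
# at the seafloor through a wavy surface, e.g. 120 frames of video.
# video_stack = np.stack([read_frame(i) for i in range(120)])
# still_image = temporal_median_destill(video_stack)
```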
So far Fluid Cam, the imaging instrument that carries the fluid lensing software, has flown only on a drone. Someday, this technique could be flown on an orbiting spacecraft to gather image data on the world’s reefs.
That amount of data would be painstaking to sort through by hand for specific coral attributes. So Chirayath's team is cataloging the data it has collected and adding it to a database used to train a supercomputer to rapidly sort imagery into known types – a process called machine learning. Thanks to advances both in the tools that collect the data and in the machine learning techniques that assess it, coral researchers are a step closer to having the Earth observations they need to understand our planet's reefs.
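The release doesn't say which machine learning approach the team uses, so the following is only a sketch of the kind of pipeline described: labeled image patches go into a database, a model is trained on them, and new imagery is then sorted automatically. The class names mirror the categories mentioned above; the features, model choice, and all identifiers are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical labels mirroring the categories named in the release.
CLASSES = ["branching_coral", "mounding_coral", "sand", "rock"]

def patch_features(patch: np.ndarray) -> np.ndarray:
    """Crude hand-rolled features for an (H, W, 3) image patch:
    per-channel means and standard deviations. A real pipeline would
    use far richer features or a convolutional network."""
    return np.concatenate([patch.mean(axis=(0, 1)), patch.std(axis=(0, 1))])

# `train_patches` and `train_labels` would come from a labeled image
# database like the one the team is building; placeholders here.
# X = np.stack([patch_features(p) for p in train_patches])
# clf = RandomForestClassifier(n_estimators=100).fit(X, train_labels)
# predictions = clf.predict(np.stack([patch_features(p) for p in new_patches]))
```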