Mapping the Closest Frontier: Our Oceans
Whenever you look through a substance, whether it’s the water in a pool or a pane of old, rippled glass, the objects you see look distorted. For centuries, astronomers have mapped the sky through the distortions caused by our atmosphere; in recent years, however, they’ve developed techniques to counter these effects, clearing our view of the stars. If we turn from the skies to the Earth, distorted visuals are a challenge there too: Earth scientists who want to map the oceans or study underwater features struggle to see through the distortions caused by waves at the surface.
Researchers at NASA’s Ames Research Center, in California’s Silicon Valley, are focused on solving this problem with fluid lensing, a technique for imaging through the ocean’s surface. While we’ve mapped the surfaces of the Moon and Mars in great detail, only 4% of the ocean floor is currently mapped. Getting accurate depth measurements and clear images is difficult in part because of how light is absorbed and intensified by the water and distorted by its surface. By running complex calculations, the algorithm at the heart of fluid lensing technology is largely able to correct for these troublesome effects.
You’ve probably noticed these interactions between light and water before. When you look down at your body in a swimming pool, it appears at odd angles and sizes because you’re viewing it through the water’s surface. Light passing through that surface also creates bright, almost web-like bands of light on the bottom of the pool, called caustics. Combined with the other distortions caused by water, caustics make imaging the ocean floor difficult. In fact, caustics on the ocean floor can sometimes be even brighter than sunlight at the surface!
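The bending behind these pool-bottom illusions follows Snell’s law of refraction. As a simple illustration of the physics (not part of NASA’s fluid lensing software), here is the classic apparent-depth effect for a flat water surface, using an approximate refractive index for water:

```python
import math

N_AIR = 1.0     # refractive index of air
N_WATER = 1.33  # approximate refractive index of water

def refraction_angle(incidence_deg, n1=N_AIR, n2=N_WATER):
    """Angle of the refracted ray via Snell's law: n1*sin(t1) = n2*sin(t2)."""
    t1 = math.radians(incidence_deg)
    t2 = math.asin(n1 * math.sin(t1) / n2)
    return math.degrees(t2)

def apparent_depth(true_depth, n1=N_AIR, n2=N_WATER):
    """For near-vertical viewing, an object appears at depth d * (n1 / n2)."""
    return true_depth * (n1 / n2)

# A ray hitting the surface at 45 degrees bends toward the vertical:
print(round(refraction_angle(45.0), 1))   # 32.1 (degrees)
# A pool floor 2 m down appears only about 1.5 m away:
print(round(apparent_depth(2.0), 2))      # 1.5 (meters)
```

Caustics arise when a rippled surface bends many such rays so that they converge on the same patch of seafloor, concentrating the light there.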
Researchers at the Laboratory for Advanced Sensing at NASA Ames are developing two technologies to image through the ocean surface using fluid lensing: FluidCam and MiDAR, the Multispectral Imaging, Detection, and Active Reflectance instrument.
A Lens to the Sea
The FluidCam instrument is essentially a high-performance digital camera. It’s small and sturdy enough to collect images while mounted on a drone flying above a body of water. Eventually, this technology will be mounted on a small satellite, or CubeSat, and sent into orbit around the Earth. Once images of the sea floor are captured, the fluid lensing software takes that imagery and undoes the distortion created by the ocean surface. This includes accounting for the way an object can look magnified or appear smaller than usual, depending on the shape of the wave passing over it, and for the increased brightness caused by caustics.
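A heavily simplified sketch of the geometric idea behind that correction (this is not NASA’s actual fluid lensing algorithm, and the function names are invented for illustration): if you know the local slope of the wave above a seafloor point, Snell’s law tells you how far the point’s image is shifted sideways, and subtracting that shift recovers its true position.

```python
import math

N_AIR, N_WATER = 1.0, 1.33  # approximate refractive indices

def seafloor_offset(surface_slope_deg, depth):
    """Lateral offset of where a vertically viewed seafloor point appears,
    given a locally tilted water surface. The tilt changes the surface
    normal, so a ray arriving vertically refracts off-vertical underwater."""
    tilt = math.radians(surface_slope_deg)
    # For a vertical ray, the incidence angle from the tilted normal is the tilt.
    refr = math.asin(N_AIR * math.sin(tilt) / N_WATER)
    # The underwater ray travels at (tilt - refr) from vertical.
    return depth * math.tan(tilt - refr)

def correct_position(apparent_x, surface_slope_deg, depth):
    """Undo the wave-induced sideways shift to estimate the true position."""
    return apparent_x - seafloor_offset(surface_slope_deg, depth)

# A 10-degree wave slope over 3 m of water shifts a point by roughly 13 cm:
print(round(seafloor_offset(10.0, 3.0), 2))  # 0.13 (meters)
```

The real software must do this for every pixel, with the wave field itself estimated from video, which is what makes the computation so demanding.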
While FluidCam is passive, meaning it takes in light like a traditional camera and then processes those images, MiDAR will be active, collecting data by transmitting light that gets bounced back to the instrument, similar to how radar functions. It also operates across a wider spectrum of light, meaning it can detect features invisible to the human eye and even collect data in darkness. And it can see deeper into the ocean, using the magnification caused by the water’s surface to its advantage, producing higher-resolution images. MiDAR could even make it possible for a satellite in orbit to explore a coral reef on the centimeter scale.
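The core of active reflectance sensing is that the instrument knows exactly how much light it transmitted in each spectral band, so the fraction that returns is a property of the target rather than of the sunlight. A minimal sketch of that ratio, with made-up band names and power values (not MiDAR’s actual data processing):

```python
def active_reflectance(transmitted, received):
    """Per-band reflectance: the fraction of transmitted light that returns.
    Because the instrument controls the illumination, this works even in
    darkness, unlike a passive camera that depends on sunlight."""
    return {band: received[band] / transmitted[band] for band in transmitted}

# Hypothetical transmitted and returned power per band (arbitrary units):
tx = {"blue": 100.0, "green": 100.0, "red": 100.0}
rx = {"blue": 42.0, "green": 35.0, "red": 8.0}
print(active_reflectance(tx, rx))
# {'blue': 0.42, 'green': 0.35, 'red': 0.08}
```

A spectrum of such reflectance values is a fingerprint that can distinguish, say, living coral from sand or algae.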
Both technologies bring us closer to mapping the ocean floor with a level of detail previously only possible when teams of divers were sent under water to take photographs. By using fluid lensing on satellites in orbit, the oceans can be observed at the same level of detail across the globe.
Citizen Science to Help Save Coral
But why does mapping the ocean matter? Besides being the Earth’s largest ecosystem, it’s also home to one of the planet’s most remarkable organisms: coral. Coral is one of the oldest life forms on the planet, and one of the few that is visible from space. This irreplaceable member of the ocean world is dying at an unprecedented rate and, without proper tracking, it’s unclear exactly how fast or how best to stop its deterioration. With fluid lensing technology, the ability to track changes to coral reefs around the world is within reach.
A program called NeMO-Net aims to do just this, with some help from machine learning technologies and the general public. A citizen science game by the same name, soon to be released to the public, allows users to interact with real NASA data of the ocean floor, and highlight coral found in these images. This will train an algorithm to look through the rest of the data for more coral, creating a system that can accurately identify coral in any imagery that it processes.
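The pattern here, human-labeled examples training a classifier that then labels the rest of the data, can be sketched in miniature. NeMO-Net itself uses neural networks on real ocean imagery; this toy nearest-centroid example, with invented two-number "features" per image patch, only illustrates the idea:

```python
def train_centroids(examples):
    """examples: list of (feature_vector, label) pairs from human labelers.
    Returns the mean feature vector for each label."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in vec] for lbl, vec in sums.items()}

def classify(centroids, features):
    """Assign the label of the nearest centroid (squared Euclidean distance)."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(centroids, key=lambda lbl: dist(centroids[lbl]))

# Hypothetical player labels: (brightness, texture) per image patch.
labeled = [
    ([0.8, 0.6], "coral"), ([0.7, 0.7], "coral"),
    ([0.2, 0.1], "sand"),  ([0.3, 0.2], "sand"),
]
model = train_centroids(labeled)
print(classify(model, [0.75, 0.65]))  # coral
print(classify(model, [0.25, 0.15]))  # sand
```

Scale the handful of labeled patches up to thousands of player annotations, and swap the centroid rule for a neural network, and you have the shape of the citizen-science pipeline the article describes.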
Tracking coral allows scientists to better pinpoint the causes of its deterioration and come up with solutions to limit damaging human impact on this life form that hosts more biodiversity than the Amazon rainforest.
By using techniques originally designed to study the stars, fluid lensing will allow us to learn more about one of the greatest mysteries right here on our own planet: the ocean and all the multitudes of life within it. That alien world holds just as many mysteries as the cosmos, and with technologies like fluid lensing, discovering those enigmas is within our grasp.
- March 2019: In collaboration with the University of Puerto Rico, a research crew from NASA Ames will be deploying FluidCam and MiDAR to study the shallow reefs of Puerto Rico. Field sites include the La Gata and Caracoles Reefs, Enrique Reef, San Cristobal Reef, and Media Luna Reef.
- May 2019: Another deployment of the MiDAR instrument will take place in Guam, with the goal of testing while diving and in the air.
- Fall 2019: Fluid lensing instruments will be deployed to the Great Barrier Reef.
The Laboratory for Advanced Sensing is supported by the NASA Biological Diversity Program, the Advanced Information Systems Technology Program, and the Earth Science Technology Office.