What is Fluid Lensing?

By admin

Apr 22, 2020 #climate

Mapping the Closest Frontier, Our Oceans

A shallow marine ecosystem, with bright blue water, colorful fish and coral across the ocean floor.
A coral reef in American Samoa, one of the locations where researchers from the Laboratory for Advanced Sensing went on deployment to collect data using fluid-lensing instruments.
Credits: NASA/Ames Research Center/Ved Chirayath

Whenever you look through a substance, whether it’s the water in a pool or a pane of old, rippled glass, the objects you see look distorted. For centuries, astronomers have been mapping the sky through the distortions caused by our atmosphere; in recent years, however, they’ve developed techniques to counter these effects, clearing our view of the stars. If we turn to look at the Earth instead of the skies, distorted visuals are a challenge too: Earth scientists who want to map the oceans or study underwater features struggle to see through the distortions caused by waves at the surface.

A large swath of ocean, with three coral reefs shown at their intersection.
An astronaut on board the International Space Station used a powerful lens to photograph corals in Australia’s Great Barrier Reef. The photo area spans about 10 miles of the 1,700-mile reef system.
Credits: NASA

Researchers at NASA’s Ames Research Center, in California’s Silicon Valley, are focused on solving this problem with fluid lensing, a technique for imaging through the ocean’s surface. While we’ve mapped the surfaces of the Moon and Mars in great detail, only 4% of the ocean floor is currently mapped. Getting accurate depth measurements and clear images is difficult, in part due to how light is absorbed and intensified by the water and distorted by its surface. By running complex calculations, the algorithm at the heart of fluid lensing technology is largely able to correct for these troublesome effects.
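
To get a feel for just the absorption part of that problem, here is a minimal sketch in Python using the Beer-Lambert law, under which light dims exponentially with depth. The attenuation coefficient is an assumed, illustrative value for clear water, not a parameter from the fluid lensing software.

    import numpy as np

    # Minimal sketch of light absorption with depth (Beer-Lambert law):
    # intensity falls off exponentially as light travels through water.
    # The attenuation coefficient is an assumed, illustrative value.
    attenuation = 0.05                      # per meter (assumed)
    depths = np.array([1, 5, 10, 20, 50])   # meters
    remaining = np.exp(-attenuation * depths)
    for d, f in zip(depths.tolist(), remaining.tolist()):
        print(f"{d:>3d} m deep: {f * 100:5.1f}% of surface light remains")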

You’ve probably noticed these interactions between light and water before. When you look down at your body in a swimming pool, it appears at odd angles and different sizes because you’re looking at it through the water’s surface. When light passes through that surface, it also creates bright bands of light in an almost web-like structure at the bottom of the pool, called caustics. When caustics are combined with the other distortions caused by water, they make imaging the ocean floor a difficult process. Caustics on the ocean floor can sometimes be even brighter than the sunlight at the surface!
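
For a rough sense of how a wavy surface focuses sunlight into those bright bands, the sketch below traces vertical sun rays through a single sine wave using Snell’s law. The wave height, depth and grid are assumed, illustrative values, and the model is far simpler than anything the fluid lensing instruments actually do.

    import numpy as np

    # Toy model of caustics: vertical sun rays strike a sinusoidal water
    # surface, refract according to Snell's law, and bunch together into
    # bright bands on a flat bottom. All parameters are illustrative.
    n_air, n_water = 1.0, 1.33
    amplitude, wavelength, depth = 0.05, 1.0, 2.0            # meters

    x = np.linspace(0.0, 10.0, 100_000)                       # ray entry points
    surface = amplitude * np.sin(2 * np.pi * x / wavelength)
    slope = amplitude * (2 * np.pi / wavelength) * np.cos(2 * np.pi * x / wavelength)

    # Angle of a vertical ray relative to the tilted surface normal,
    # then the refracted angle inside the water.
    theta_i = np.arctan(slope)
    theta_t = np.arcsin(np.clip(np.sin(theta_i) * n_air / n_water, -1.0, 1.0))

    # Where each refracted ray lands on the bottom; peaks in the histogram
    # are the bright, web-like caustic bands.
    x_bottom = x + (depth + surface) * np.tan(theta_i - theta_t)
    hits, _ = np.histogram(x_bottom, bins=500)
    print("average rays per bin:", hits.mean())
    print("brightest caustic bin:", hits.max())               # several times the average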

Researchers at the Laboratory for Advanced Sensing at NASA Ames are developing two technologies to image through the ocean surface using fluid lensing: FluidCam and MiDAR, the Multispectral Imaging, Detection, and Active Reflectance instrument.

A researcher swimming in shallow waters with the FluidCam instrument.
A researcher validating the airborne imagery from the FluidCam instrument while on deployment in Puerto Rico.
Credits: NASA

A Lens to the Sea

The FluidCam instrument is essentially a high-performance digital camera. It’s small and sturdy enough to collect images while mounted on a drone flying above a body of water. Eventually, this technology will be mounted on a small satellite, or CubeSat, and sent into orbit around the Earth. Once images of the sea floor are captured, the fluid lensing software takes that imagery and undoes the distortion created by the ocean surface. This includes accounting for the way an object can look magnified or appear smaller than usual, depending on the shape of the wave passing over it, and for the increased brightness caused by caustics.
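
To make those two corrections concrete, here is a deliberately simplified sketch on synthetic data. It is not the fluid lensing algorithm itself; it only shows the idea of dividing out an estimated caustic brightness pattern and resampling away an estimated wave magnification, both of which are assumed to be known here.

    import numpy as np

    # Simplified stand-in for the two corrections described above, applied to
    # synthetic data. This is NOT the actual fluid lensing algorithm; the
    # caustic pattern and magnification factor are assumed to be known,
    # whereas estimating them from the imagery is the hard part.
    rng = np.random.default_rng(0)
    seafloor = rng.uniform(0.2, 0.8, size=(256, 256))          # stand-in seafloor image
    caustic_gain = 1.0 + 0.5 * np.abs(np.sin(np.linspace(0, 20, 256)))[None, :]
    observed = seafloor * caustic_gain                          # brightened by caustics

    # Step 1: divide out the caustic brightness pattern (flat-fielding).
    deflickered = observed / caustic_gain

    # Step 2: resample a patch back to its true scale after a wave acted as a
    # magnifying lens over it (nearest-neighbour, just to show the idea).
    magnification = 1.4
    rows = np.clip((np.arange(256) * magnification).astype(int), 0, 255)
    cols = np.clip((np.arange(256) * magnification).astype(int), 0, 255)
    corrected_patch = deflickered[np.ix_(rows, cols)]

    print("caustics removed exactly:", np.allclose(deflickered, seafloor))
    print("corrected patch shape:   ", corrected_patch.shape)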

While FluidCam is passive, meaning it takes in light like a traditional camera and then processes those images, MiDAR will be active, collecting data by transmitting light that gets bounced back to the instrument, similar to how radar functions. It also operates in a wider spectrum of light, meaning it can detect features invisible to the human eye, and even collect data in darkness. It’s also able to see deeper into the ocean, using the magnification caused by the water’s surface to its advantage, leading to higher resolution images. MiDAR could even make it possible for a satellite in orbit to explore a coral reef on the centimeter scale.
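
The advantage of knowing exactly how much light was transmitted can be sketched in a few lines. The band names and numbers below are invented for illustration and are not MiDAR specifications.

    import numpy as np

    # Sketch of the active-sensing idea: because the instrument controls the
    # transmitted light in each band, reflectance can be estimated as the
    # ratio of the returned signal (minus ambient background) to what was
    # sent out. All values here are illustrative, not MiDAR measurements.
    bands       = ["blue", "green", "red", "near-infrared"]
    transmitted = np.array([1.00, 1.00, 1.00, 1.00])   # normalized output per band
    ambient     = np.array([0.30, 0.25, 0.10, 0.02])   # sunlight background at the sensor
    received    = np.array([0.45, 0.55, 0.18, 0.05])   # total signal measured

    reflectance = (received - ambient) / transmitted
    for band, r in zip(bands, reflectance):
        print(f"{band:>13s}: estimated reflectance {r:.2f}")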

Both technologies bring us closer to mapping the ocean floor with a level of detail previously only possible when teams of divers were sent under water to take photographs. By using fluid lensing on satellites in orbit, the oceans can be observed at the same level of detail across the globe.

Citizen Science to Help Save Coral

A player interacting with coral imagery on a tablet device on top of a light brown, wooden table.
A player classifying real NASA imagery of coral in the NeMO-Net game on a tablet from the comfort of home.
Credits: NASA/Ames Research Center/Ved Chirayath

But why does mapping the ocean matter? Besides being the Earth’s largest ecosystem, it’s also home to one of the planet’s most remarkable organisms: coral. Coral is one of the oldest life forms on the planet, and one of the few that are visible from space. This irreplaceable member of the ocean world is dying at an unprecedented rate and, without proper tracking, it’s unclear exactly how fast or how best to stop its deterioration. With fluid lensing technology, the ability to track changes to coral reefs around the world is within reach.

A sunrise over the ocean, viewed from the deck of a virtual research vessel, with the NASA logo on the floor.
A sunrise on the Nautilus, the virtual research vessel in the NeMO-Net game. From the ship’s deck, players can go on dives, check for messages with new missions, track progress and more.
Credits: NASA/Ames Research Center/Jarret Van Den Bergh

A program called NeMO-Net aims to do just that, with some help from machine learning and the general public. A citizen science game by the same name, released to the public, allows users to interact with real NASA imagery of the ocean floor and highlight the coral found in it. Those labels will train an algorithm to look through the rest of the data for more coral, creating a system that can accurately identify coral in any imagery it processes.
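
As a toy illustration of that training loop, the sketch below fits an off-the-shelf classifier to synthetic, “player-labeled” image patches. NeMO-Net itself uses neural networks on real 3D imagery, so treat this purely as a sketch of the citizen-science idea.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Toy version of the loop described above: human-labeled patches
    # (coral vs. not coral) become training data for a model that can then
    # label the rest of the imagery automatically. Patches are synthetic.
    rng = np.random.default_rng(42)
    n_patches, patch_pixels = 2000, 16 * 16
    labels = rng.integers(0, 2, size=n_patches)          # 1 = player marked coral
    # Make "coral" patches slightly brighter so there is something to learn.
    patches = rng.normal(loc=labels[:, None] * 0.5, scale=1.0,
                         size=(n_patches, patch_pixels))

    X_train, X_test, y_train, y_test = train_test_split(
        patches, labels, test_size=0.25, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))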

Tracking coral allows scientists to better pinpoint the causes of its deterioration and come up with solutions to limit damaging human impact on this life form that hosts more biodiversity than the Amazon rainforest.

By using techniques originally designed to study the stars, fluid lensing will allow us to learn more about one of the greatest mysteries right here on our own planet: the ocean and all the multitudes of life within it. That alien world holds just as many mysteries as the cosmos, and with technologies like fluid lensing, discovering those enigmas is within our grasp.

Fluid Lensing team flying instrument from a dock with a boat.
Researchers flying the FluidCam instrument during a field deployment in Puerto Rico.
Credits: NASA

Milestones:

  • March 2019: In collaboration with the University of Puerto Rico, a research crew from NASA Ames deployed FluidCam and MiDAR to study the shallow reefs of Puerto Rico. Field sites include the La Gata and Caracoles Reefs, Enrique Reef, San Cristobal Reef and Media Luna Reef.
  • May 2019: Another deployment of the MiDAR instrument took place in Guam, with the goal of testing while diving and in the air.
  • July 2019: MiDAR was honored as the runner-up for NASA’s 2019 Government Invention of the Year Award.
  • April 2020: The video game NeMO-Net was released to the public. Download the game now from the Apple App Store.

Partners:

The Laboratory for Advanced Sensing is supported by the NASA Biological Diversity Program, Advanced Information Systems Technology Program and Earth Science Technology Office.

Learn more:


For news media:

Members of the news media interested in covering this topic can access the NeMO-Net Media Resource Reel and should reach out to the NASA Ames newsroom.

Last Updated: April 10, 2020
Editor: Frank Tavares
