Stanford University engineers have developed an in-air method for imaging underwater objects by combining light and sound to break through the seemingly impossible barrier at the interface of air and water.
The researchers anticipate that their hybrid optical-acoustic system will one day be used to conduct drone-based biological marine surveys, carry out large-scale aerial searches for sunken ships and aircraft, and map the ocean depths with a speed and level of detail similar to that of maps of Earth’s landscapes. Their “Photoacoustic Airborne Sonar System” is detailed in a recent study published in the journal IEEE Access.
“Airborne and spaceborne radar and laser-based systems, or LIDAR, have been able to map Earth’s landscapes for decades. Radar signals can even penetrate cloud and canopy cover. However, seawater is much too absorbent for imaging into the water,” said study leader Amin Arbabian, an associate professor of electrical engineering in Stanford’s School of Engineering. “Our goal is to develop a more robust system that can image even through murky water.”
Loss of energy
Oceans cover about 70 percent of the Earth’s surface, yet only a small fraction of their depths has been imaged and mapped at high resolution.
The main obstacle is physics: Sound waves, for example, cannot pass from air into water, or vice versa, without losing most – more than 99.9 percent – of their energy through reflection at the boundary between the two media. A system that tries to see underwater using sound waves traveling from air into water and back into air is subjected to this energy loss twice – resulting in a 99.9999 percent reduction in energy.
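The quoted figures follow from simple compounding: if each boundary crossing reflects away 99.9 percent of the energy, only 0.1 percent survives each crossing, and a round trip squares that fraction. A minimal sketch of the arithmetic (illustrative numbers, not taken from the study’s measurements):

```python
# Fraction of acoustic energy that survives an air-water boundary
# crossing, assuming ~99.9% is reflected at each crossing.
transmitted_per_crossing = 0.001  # ~0.1% of the energy gets through

one_way = transmitted_per_crossing          # air -> water
round_trip = transmitted_per_crossing ** 2  # air -> water -> air

print(f"one-way loss:    {(1 - one_way) * 100:.4f}%")     # 99.9000%
print(f"round-trip loss: {(1 - round_trip) * 100:.4f}%")  # 99.9999%
```

Squaring the transmitted fraction, rather than doubling the lost fraction, is what turns a 99.9 percent loss into a 99.9999 percent loss.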
Similarly, electromagnetic radiation – an umbrella term that includes light, microwave and radar signals – also loses energy when passing from one physical medium to another, although the mechanism differs from that for sound. “Light also loses some energy from reflection, but the bulk of the energy loss is due to absorption by the water,” explained the study’s first author, Aidan Fitzpatrick, a Stanford graduate student in electrical engineering. Incidentally, this absorption is also the reason sunlight can’t penetrate to the depths of the ocean and why your smartphone – which relies on cellular signals, a form of electromagnetic radiation – can’t receive calls underwater.
The upshot of all this is that the oceans can’t be mapped from the air or from space in the same way that land can. To date, most underwater mapping has been done by attaching sonar systems to ships that trawl a given region of interest. But this technique is slow and costly, and inefficient for covering large areas.
An invisible jigsaw puzzle
Enter the Photoacoustic Airborne Sonar System (PASS), which combines light and sound to break through the air-water interface. The idea for it stemmed from another project that used microwaves to perform contactless imaging and characterization of underground plant roots. Some of PASS’s instruments were initially designed for that purpose in collaboration with the lab of Butrus Khuri-Yakub, a Stanford professor of electrical engineering.
At its core, PASS plays to the individual strengths of light and sound. “If we can use light in the air, where light travels well, and sound in the water, where sound travels well, we can get the best of both worlds,” Fitzpatrick said.
To do this, the system first fires a laser from the air that is absorbed at the water’s surface. As the laser is absorbed, it generates ultrasound waves that propagate down through the water column and reflect off submerged objects before racing back toward the surface.
The returning sound waves still lose most of their energy when they break through the water’s surface, but by generating the sound waves underwater with lasers, the researchers keep the energy loss from happening twice.
“We have developed a system that is sensitive enough to compensate for a loss of this magnitude and still allow for signal detection and imaging,” Arbabian said.
The reflected ultrasound waves are recorded by devices called transducers. Software is then used to piece the acoustic signals back together like an invisible jigsaw puzzle and reconstruct a three-dimensional image of the submerged feature or object.
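The basic quantity behind any such reconstruction is time of flight: an echo’s round-trip travel time, together with the speed of sound in water, gives the range to the reflector. The following toy sketch illustrates only that principle – it is not the study’s reconstruction algorithm, and the sound speed is a standard textbook value, not a parameter from the paper:

```python
# Toy time-of-flight ranging: convert a round-trip echo time recorded
# at the surface into the depth of an underwater reflector.
V_WATER = 1480.0  # m/s, typical speed of sound in seawater (textbook value)

def reflector_depth_m(round_trip_time_s):
    # The ultrasound pulse travels down to the object and back up,
    # so the one-way distance is half the total path.
    return V_WATER * round_trip_time_s / 2.0

# A 10-millisecond round trip corresponds to a reflector ~7.4 m down.
print(round(reflector_depth_m(0.010), 2))
```

A full 3D image comes from combining many such range measurements across an array of transducer positions, which is the jigsaw-assembly step the software performs.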
“Similar to how light refracts or ‘bends’ when it passes through water or any medium denser than air, ultrasound also refracts,” Arbabian explained. “Our image reconstruction algorithms correct for this bending that occurs when the ultrasound waves pass from the water into the air.”
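The bending Arbabian describes follows Snell’s law, which relates the ray angles on either side of an interface to the wave speeds in the two media. A hedged illustration of the effect (the sound speeds are standard textbook values; the angles and the function are hypothetical, not from the paper’s algorithm):

```python
import math

# Snell's law for ultrasound crossing from water into air:
#   sin(theta_air) / v_air = sin(theta_water) / v_water
V_WATER = 1480.0  # m/s, speed of sound in water (textbook value)
V_AIR = 343.0     # m/s, speed of sound in air (textbook value)

def refracted_angle_deg(theta_water_deg):
    """Angle from the vertical of an ultrasound ray after it passes
    from water into air, per Snell's law."""
    s = math.sin(math.radians(theta_water_deg)) * V_AIR / V_WATER
    return math.degrees(math.asin(s))

# A ray rising at 20 degrees from the vertical bends toward the vertical
# in air, because sound travels more slowly in air than in water.
print(round(refracted_angle_deg(20.0), 2))  # ~4.55 degrees
```

Without correcting for this bending, echoes recorded in the air would appear to come from the wrong underwater positions, which is why the reconstruction software must account for it.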
Drone ocean surveys
Conventional sonar systems can penetrate depths of hundreds to thousands of meters, and the researchers expect that their system will eventually be able to reach similar depths.
To date, PASS has only been tested in the lab, in a container about the size of a large fish tank. “Current experiments use static water, but we are working toward dealing with water waves,” Fitzpatrick said. “This is a challenging problem, but we think it’s feasible.”
The next step, the researchers say, will be to conduct tests in a larger setting and, eventually, in an open-water environment.
“Our vision for this technology is aboard a helicopter or drone,” Fitzpatrick said. “We expect the system to be able to fly at tens of meters above the water.”
Aidan Fitzpatrick et al, Airborne Sonar System for Remote Sensing and Imaging, IEEE Access (2020). DOI: 10.1109/ACCESS.2020.3031808
Provided by Stanford University
Quote: Engineers combine light and sound to see underwater (2020, December 1) retrieved December 1, 2020 from https://techxplore.com/news/2020-12-combine-underwater.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.