DFKI leverages AI to improve environmental perception of robotic underwater vehicles


08 Nov 2021 (Last Updated November 8th, 2021 10:08)


Concept: The German Research Center for Artificial Intelligence (DFKI) has developed a project called DeeperSense that combines visual and acoustic sensors with AI to improve the environmental perception of robotic underwater vehicles. The project aims to improve the perception of unmanned underwater vehicles (UUVs) in three use cases: diver monitoring in turbid waters, seabed mapping, and exploration of coral reefs.

Nature of Disruption: The DeeperSense project is based on the concept of intersensory learning, in which one sensor modality learns from another. In this way, one sensor's output approaches that of the other in accuracy as well as in the type of output and the interpretation of the data. On a UUV, a camera and a sonar observe the same scene simultaneously. The low-resolution sonar data serves as input to an artificial neural network, while the high-resolution camera data serves as the target output. The network gradually adapts to deliver the desired output, learning the relationships between the input and output data. The result is an algorithm that, once trained, generates a camera-like image from the low-resolution sonar data alone.
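To illustrate the idea (this is a toy sketch, not DFKI's actual DeeperSense code): paired recordings of the same scene from both sensors can train a model that maps the low-resolution modality to the high-resolution one. The dimensions, synthetic data, and single linear layer below are all illustrative assumptions; the real project uses deep neural networks on real sonar and camera imagery.

```python
# Toy sketch of intersensory learning: a model learns to map
# low-resolution "sonar" observations to higher-resolution
# "camera" observations of the same scenes.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired training data: each row is one scene observed
# by both sensors simultaneously (16-dim sonar, 64-dim camera).
SONAR_DIM, CAMERA_DIM, N_SCENES = 16, 64, 500
true_map = rng.normal(size=(SONAR_DIM, CAMERA_DIM))   # unknown sonar->image relation
sonar = rng.normal(size=(N_SCENES, SONAR_DIM))        # low-res input modality
camera = sonar @ true_map + 0.01 * rng.normal(size=(N_SCENES, CAMERA_DIM))

# A single linear layer trained by gradient descent on mean squared
# error: the sonar modality learns to imitate the camera modality.
W = np.zeros((SONAR_DIM, CAMERA_DIM))
lr = 0.05
for _ in range(500):
    pred = sonar @ W
    grad = sonar.T @ (pred - camera) / N_SCENES
    W -= lr * grad

# After training, a camera-like output is generated from sonar alone.
new_sonar = rng.normal(size=(1, SONAR_DIM))
camera_like = new_sonar @ W

mse = float(np.mean((sonar @ W - camera) ** 2))
print(f"training MSE: {mse:.4f}")
```

In the actual project, the linear layer would be replaced by a deep convolutional network, but the training principle is the same: one modality supplies the inputs, the other supplies the targets.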

Outlook: The maritime use cases DeeperSense addresses share a common problem: poor environmental perception by UUVs in murky water, cramped spaces, or low-light conditions. For diver monitoring in turbid waters, traditional monitoring is limited by the short range of optical sensors underwater; the DFKI project trains the sensors on a UUV to deliver camera-like images that personnel at the control station can interpret easily. In the coral reef exploration use case, the challenge lies in reliable obstacle detection, which the combination of visual and acoustic sensors overcomes: the AI algorithm transfers what one sensor recognizes in an object to the data of the other sensor, allowing the vehicle to navigate through a coral reef rather than over it. In the seabed mapping use case, the project's UUV reliably maps the seafloor, a task previously done at great expense by ship; mapping under the project is less expensive and more reliable, and delivers more detailed output. The application can be extended to other exploration activities. The EU has granted the DFKI project $3.5M under the Horizon 2020 research framework programme.

This article was originally published on Verdict.co.uk.