Marine image analysis

We use deep learning to detect and classify objects in large volumes of observational data collected in the marine sector. This can be sonar acoustics, images and videos captured from trawls or the seabed, drone images of marine mammals, or different types of microscopic images.

This is an underwater image of a seabed. Important elements, such as fish, are highlighted with green, red and yellow dots. The water is green, the seabed is brown and the image is somewhat murky.
Detecting and tracking fish in underwater videos. Image: Kim Halvorsen, IMR.

Monitoring marine stocks and ecosystems

Vast amounts of complex observation data are being collected in the marine sector. These data contain valuable information critical for monitoring marine stocks and ecosystems and for ensuring sustainable fisheries and harvesting. The sources are diverse, ranging from optical imagery and videos to acoustic surveys. Next-generation marine services, like real-time analysis of trawl content, will further increase the amount of data.

As the volume continues to grow, manual analysis is increasingly inefficient. Advancements in deep learning provide solutions to address these issues. At NR, we use deep learning to develop methods for automatic analysis and extraction of various types of marine image data, such as underwater videos and images, sonar acoustics, microscopic images of otoliths, and drone images of marine mammals.

Automated tracking of fish populations

Underwater cameras are increasingly used to closely monitor fish populations. They may be mounted in trawls, fish pens or near underwater habitats. Analysing the data from these cameras manually is costly, time-consuming, and error-prone. However, by utilising deep learning algorithms, fish in these videos can be automatically located, classified and tracked over time. This can enhance the efficiency and effectiveness of marine monitoring and management.
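The locate-classify-track idea above can be sketched as tracking-by-detection: a detector proposes bounding boxes in each frame, and boxes are linked across frames into tracks. The sketch below uses a hypothetical greedy Intersection-over-Union (IoU) matcher with toy boxes; it is illustrative only, and a real pipeline would use a trained detector and a dedicated tracker (e.g. a SORT-style method).

```python
# Minimal sketch of tracking-by-detection: bounding boxes detected in
# each video frame are greedily matched to existing tracks by IoU.
# All names and boxes here are illustrative, not NR's actual pipeline.

def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def update_tracks(tracks, detections, threshold=0.3):
    """Greedily extend tracks with this frame's detections.

    tracks: dict track_id -> last box; detections: list of boxes.
    Unmatched detections start new tracks. Returns the updated dict.
    """
    next_id = max(tracks, default=-1) + 1
    unmatched = list(detections)
    for tid, box in list(tracks.items()):
        if not unmatched:
            break
        best = max(unmatched, key=lambda d: iou(box, d))
        if iou(box, best) >= threshold:
            tracks[tid] = best
            unmatched.remove(best)
    for det in unmatched:
        tracks[next_id] = det
        next_id += 1
    return tracks

# Two frames: one fish moves slightly, a second fish appears.
tracks = update_tracks({}, [(10, 10, 50, 50)])
tracks = update_tracks(tracks, [(12, 11, 52, 51), (100, 100, 140, 140)])
print(tracks)  # track 0 follows the moving fish; track 1 is new
```

Greedy IoU matching breaks down when fish cross paths or leave the frame, which is why practical trackers add motion models and re-identification features on top of this basic association step.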

Detection and classification of data

The complexity of some types of marine image data and the scarcity of training data present significant challenges. Data from acoustic trawl surveys, for example, are typically not labelled at a high level of detail (i.e. with strong labels). We developed a method for detecting and classifying schools of sandeels based on strongly labelled acoustic data and are currently adapting this method for weakly labelled acoustic data.
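The strong- versus weak-label distinction can be illustrated with a multiple-instance-style sketch: with strong labels every patch of an echogram is annotated, while with weak labels only a region-level label ("contains sandeel") exists, so a region score must be pooled from patch scores. The `patch_score` function below is a toy stand-in for a trained classifier, not NR's actual method.

```python
# Illustrative sketch of strong vs weak supervision on acoustic data.
# patch_score is a hypothetical stand-in for a trained patch classifier.

def patch_score(patch):
    """Toy patch classifier: a 'sandeel probability' derived from mean
    backscatter intensity (a real model would be a trained network)."""
    return min(1.0, sum(patch) / len(patch))

def strong_predict(patches):
    """With strong labels, each patch has its own annotation, so
    per-patch predictions can be supervised and reported directly."""
    return [patch_score(p) for p in patches]

def weak_predict(patches):
    """With weak labels, only a region-level label is available;
    max-pooling credits the region if any patch looks like sandeel."""
    return max(patch_score(p) for p in patches)

# Toy echogram region: three patches, one with high backscatter.
region = [[0.1, 0.2], [0.05, 0.1], [0.8, 0.9]]
print(strong_predict(region))  # one score per patch
print(weak_predict(region))    # one score for the whole region
```

The choice of pooling (max, mean, attention) controls how patch evidence is aggregated, and is a central design decision when adapting a strongly supervised model to weak labels.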

In the marine sector, it is common to combine data from different sources when making final predictions. We have therefore extended our sandeel model to take auxiliary data of a different modality as additional input, which has improved the model's performance.

Trustworthy and transparent predictions

When predictions are intended for downstream use, such as model input for abundance estimation, trustworthiness is vital, and transparency and reliability are key.

Among our research areas, we have examined how explainable artificial intelligence (XAI) can be used to understand which parts of otoliths our models focus on when making fish age predictions.
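One simple family of XAI methods is occlusion-based sensitivity analysis: mask parts of the input and record how much the model's prediction drops, so regions whose removal changes the output most are deemed important. The sketch below uses a toy one-dimensional "image" and a toy predictor, not NR's otolith age model.

```python
# Sketch of occlusion-based saliency, one simple XAI technique.
# The model here is a toy stand-in that responds to bright pixels
# (a crude proxy for the growth rings a real otolith model would use).

def model(pixels):
    """Toy 'age' predictor: sum of pixel intensities."""
    return sum(pixels)

def occlusion_saliency(pixels, baseline=0.0):
    """Importance of each pixel = drop in the prediction when that
    pixel is replaced by a baseline value."""
    full = model(pixels)
    saliency = []
    for i in range(len(pixels)):
        occluded = pixels[:i] + [baseline] + pixels[i + 1:]
        saliency.append(full - model(occluded))
    return saliency

image = [0.1, 0.9, 0.2, 0.8]
print(occlusion_saliency(image))  # brighter pixels score higher
```

With a real convolutional model, the same loop would slide a masking patch over the image; gradient-based methods such as Grad-CAM answer the same question more cheaply by inspecting the network's internal activations.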

In the interest of reliability, we have also researched how the models perform on new datasets with slightly different distributions. For otoliths, this may happen when images come from different labs. We are therefore investigating domain adaptation methods that help a model trained on data from one lab perform equally well on data from another lab, without retraining on labelled samples.
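A minimal way to see the idea behind unsupervised domain adaptation is statistics matching: shift and rescale unlabelled target-lab features so their distribution matches the source lab the model was trained on, with no target labels needed. The sketch below aligns only per-feature mean and spread; real methods (e.g. CORAL or adversarial feature alignment) are considerably more involved.

```python
# Sketch of unsupervised feature alignment between two "labs".
# Illustrative only: real domain adaptation aligns richer statistics
# and usually operates on learned feature spaces.

import statistics

def align(target_feats, source_feats):
    """Standardize the target features, then re-colour them with the
    source mean and standard deviation, so a model trained on the
    source distribution sees familiar inputs."""
    mu_t = statistics.mean(target_feats)
    sd_t = statistics.pstdev(target_feats)
    mu_s = statistics.mean(source_feats)
    sd_s = statistics.pstdev(source_feats)
    return [(x - mu_t) / sd_t * sd_s + mu_s for x in target_feats]

source = [0.0, 1.0, 2.0]     # feature values the model was trained on
target = [10.0, 12.0, 14.0]  # same structure, shifted scale (new lab)
print(align(target, source))  # now matches the source mean and spread
```

The appeal of this family of methods is exactly what the otolith case needs: the existing model stays fixed, and only the incoming features from the new lab are transformed, so no new labelled samples are required.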


Selected projects

To learn more about marine image analysis at NR, please contact:

Further resources:

CRIMAC

Visual Intelligence

The image shows how our method detects and characterises sandeel. The image is split in two; the lower section highlights detected sandeel in red against a black background.
In the COGMAR project, we developed a method to detect and classify sandeel in acoustic data. Image: NR.

The figure shows six images of otoliths on the left, demonstrating which areas of the object deep neural networks find notable when assessing the image. On the right, for comparison, an image of what humans see.
What does a neural network look for when estimating the age of fish using images of otoliths? Photo: NR.