{"id":21437,"date":"2023-06-26T09:24:47","date_gmt":"2023-06-26T07:24:47","guid":{"rendered":"https:\/\/nr.no\/en\/?post_type=bc_area&#038;p=21437"},"modified":"2024-11-26T13:44:42","modified_gmt":"2024-11-26T12:44:42","slug":"marine-image-analysis","status":"publish","type":"bc_area","link":"https:\/\/nr.no\/en\/areas\/image-analysis-and-earth-observation\/image-analysis\/marine-image-analysis\/","title":{"rendered":"Marine image analysis"},"content":{"rendered":"\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<p><strong>Every day, vast amounts of observational data are generated for monitoring and resource management in the marine sector. These data contain valuable information critical for sustainable fisheries and aquaculture, but manual methods are insufficient to handle the volume and complexity.  At NR, we develop deep learning-based methods that automate the collection, detection and classification of data, enabling efficient processes that are precise and scalable.<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-image size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"541\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marineimageanalysis-1024x541.png\" alt=\"This is an underwater image of a seabed. Important elements, such as fish, are highlighted with green, red and yellow dots. 
The water is green, the seabed is brown and the image is somewhat murky.\" class=\"wp-image-25328\" style=\"width:900px\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marineimageanalysis-1024x541.png 1024w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marineimageanalysis-300x159.png 300w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marineimageanalysis-768x406.png 768w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marineimageanalysis.png 1379w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\"><em>Detecting and tracking fish in underwater videos. <\/em>Image: Kim Halvorsen, IMR.<\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Efficient analysis of complex marine data<\/strong><\/h2>\n\n\n\n<p>Vast amounts of complex observation data are being collected in the marine sector. These data contain valuable information that is critical for monitoring marine stocks and ecosystems and for ensuring sustainable fisheries and harvesting. The sources are diverse, ranging from optical imagery and video to acoustic surveys. Next-generation marine services, such as real-time analysis of trawl content, will further increase the amount of data.<\/p>\n\n\n\n<p>As the volume continues to grow, manual analysis becomes increasingly inefficient. Advances in deep learning provide solutions to address these issues. At NR, we use deep learning to develop methods for automatic analysis and information extraction across various types of marine image data, such as underwater videos and images, sonar acoustics, microscopic images of otoliths, and drone images of marine mammals.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Automated tracking of fish populations<\/strong><\/h2>\n\n\n\n<p>Underwater cameras are increasingly used to closely monitor fish populations. They may be mounted in trawls, fish pens or near underwater habitats. 
Analysing the data from these cameras manually is costly, time-consuming, and inefficient. By utilising deep learning algorithms, however, fish in these videos can be automatically located, classified and tracked over time, enhancing the efficiency and effectiveness of marine monitoring and management.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Detection and classification of data<\/strong><\/h2>\n\n\n\n<p>The complexity of some types of marine image data and the scarcity of training data present significant challenges. Acoustic trawl surveys, for example, are not typically labelled with high levels of detail (i.e. strong labels). We developed a method for detecting and classifying schools of sandeels based on strongly labelled acoustic data and are currently adapting this method to weakly labelled acoustic data.<\/p>\n\n\n\n<p>In the marine sector, it is common to leverage data from different sources when making final predictions. We have therefore expanded our sandeel model to accept auxiliary data of a different modality as input, which has enhanced the model&#8217;s performance.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Transparent predictions with explainable artificial intelligence<\/strong><\/h2>\n\n\n\n<p>When predictions are used downstream, for example as model input for abundance estimation, trustworthiness is vital: transparency and reliability are key.<\/p>\n\n\n\n<p>Among other things, we have examined how explainable artificial intelligence (XAI) can be used to understand which parts of otoliths our models focus on when making fish age predictions.<\/p>\n\n\n\n<p>In the interest of reliability, we have also researched how the models perform on new datasets with slightly different distributions. For otoliths, this may happen when images come from different labs. 
Here, we are investigating domain adaptation methods that, in the otolith case, help a model trained on images from one lab perform equally well on images from another lab, without retraining on labelled samples.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dots\" \/>\n\n\n\n<h3 class=\"wp-block-heading has-text-align-center\">Selected projects<\/h3>\n\n\n\t\t<div id=\"post-type-multi-block_0b33638db9463d7fa95f3c7a7c615876\" class=\"wp-block-post-type-multi type-manual style-card-bc_project-sm t2-grid\">\n\t\t\t\t\t\t\t<div class=\"t2-grid-item-col-6\">\n\t\t\t\t\t\t<a href=\"https:\/\/nr.no\/en\/projects\/cogmar\/\" class=\"card-post card-project\">\n\t\t\t\t\t<figure>\n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2024\/02\/lance-anderson-G2SDLsJp3rg-unsplash-scaled.jpg\" alt=\"The image shows a school of fish in dark waters.\">\n\t\t\t<\/figure>\n\t\t\t\t<div class=\"card-post__content\">\n\t\t\t\t\t\t\t<ul class=\"card-post__categories\">\n\t\t\t\t\t\t\t\t\t\t\t<li>Image analysis<\/li>\n\t\t\t\t\t\t\t\t\t\t\t<li>Marine image analysis<\/li>\n\t\t\t\t\t\t\t\t\t<\/ul>\n\t\t\t\t\t\t<h3 class=\"card-post__title\">Advancing marine services with computer vision (COGMAR)<\/h3>\n\t\t<\/div>\n\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\n\n\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<h3 class=\"wp-block-heading\">To learn more about marine image analysis at NR, please contact:<\/h3>\n\n\n\t\t<div id=\"post-type-multi-block_2c79f27e9ad84114ec596a1066c8351f\" class=\"wp-block-post-type-multi type-manual style-card-bc_employee t2-grid\">\n\t\t\t\t\t\t\t<div class=\"t2-grid-item-col-12\">\n\t\t\t\t\t\t<a href=\"https:\/\/nr.no\/en\/employees\/line-eikvil\/\" class='card-employee'>\n\t\t\t\t\t<figure>\n\t\t\t\t<img decoding=\"async\" 
src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/line-eikvil-4.jpg\" alt=\"\">\n\t\t\t<\/figure>\n\t\t\t\t<div class=\"card-employee__content\">\n\t\t\t<p class=\"card-employee__name\">Line Eikvil<\/p>\n\t\t\t\t\t\t\t<p class=\"card-employee__position\">Research Director<\/p>\n\t\t\t\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 24 24\" height=\"24\" width=\"24\" class=\"t2-icon t2-icon-arrowforward\" aria-hidden=\"true\" focusable=\"false\"><path d=\"M15.9 4.259a1.438 1.438 0 0 1-.147.037c-.139.031-.339.201-.421.359-.084.161-.084.529-.001.685.035.066 1.361 1.416 2.947 3l2.882 2.88-10.19.02c-8.543.017-10.206.029-10.29.075-.282.155-.413.372-.413.685 0 .313.131.53.413.685.084.046 1.747.058 10.29.075l10.19.02-2.882 2.88c-1.586 1.584-2.912 2.934-2.947 3-.077.145-.085.521-.013.66a.849.849 0 0 0 .342.35c.156.082.526.081.68-.001.066-.035 1.735-1.681 3.709-3.656 2.526-2.53 3.606-3.637 3.65-3.742A.892.892 0 0 0 23.76 12a.892.892 0 0 0-.061-.271c-.044-.105-1.124-1.212-3.65-3.742-1.974-1.975-3.634-3.616-3.689-3.645-.105-.055-.392-.107-.46-.083\"\/><\/svg>\n\t\t<\/div>\n\t<\/a>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\n\n\n<div class=\"wp-block-group has-primary-200-background-color has-background\">\n<p>Partner: <a href=\"https:\/\/www.hi.no\/hi\/en\" target=\"_blank\" rel=\"noreferrer noopener\">The Institute of Marine Research (IMR)<\/a><\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-group has-background\" style=\"background-color:#cdf1f1\">\n<p><strong>Further resources<\/strong>:<\/p>\n\n\n\n<p><a rel=\"noreferrer noopener\" href=\"https:\/\/crimac.no\/en\/projects\/crimac\" data-type=\"URL\" data-id=\"https:\/\/crimac.no\/en\/projects\/crimac\" target=\"_blank\">CRIMAC<\/a><\/p>\n\n\n\n<p><a rel=\"noreferrer noopener\" href=\"https:\/\/www.visual-intelligence.no\/\" target=\"_blank\">Visual Intelligence<\/a><\/p>\n<\/div>\n\n\n\n<figure class=\"wp-block-image aligncenter 
size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"247\" height=\"316\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marin-bildeanalyse1-1.png\" alt=\"The image shows how our method detects and characterises sandeel. The image is split in two; the lower section highlights detected sandeel in red against a black background\" class=\"wp-image-33590\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marin-bildeanalyse1-1.png 247w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marin-bildeanalyse1-1-234x300.png 234w\" sizes=\"auto, (max-width: 247px) 100vw, 247px\" \/><figcaption class=\"wp-element-caption\"><em>In the COGMAR project, we developed a method to detect and classify sandeel in acoustic data. <\/em>Image: NR.<\/figcaption><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"533\" height=\"258\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marin-bildeanalyse2-1.png\" alt=\"The figure shows six images of otoliths to the left that demonstrate which areas of the object deep neural networks find notable when assessing the image. To the right, for comparison, an image of what humans see. 
\" class=\"wp-image-33591\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marin-bildeanalyse2-1.png 533w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/06\/marin-bildeanalyse2-1-300x145.png 300w\" sizes=\"auto, (max-width: 533px) 100vw, 533px\" \/><figcaption class=\"wp-element-caption\"><em>What does a neural network look for when estimating the age of fish using images of otoliths?<\/em> Photo: NR.<\/figcaption><\/figure>\n<\/div>\n<\/div>\n","protected":false},"featured_media":25328,"parent":175,"menu_order":17,"template":"","meta":{"_acf_changed":false,"_trash_the_other_posts":false,"editor_notices":[],"footnotes":""},"class_list":["post-21437","bc_area","type-bc_area","status-publish","has-post-thumbnail"],"acf":[],"_links":{"self":[{"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/bc_area\/21437","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/bc_area"}],"about":[{"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/types\/bc_area"}],"version-history":[{"count":4,"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/bc_area\/21437\/revisions"}],"predecessor-version":[{"id":33617,"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/bc_area\/21437\/revisions\/33617"}],"up":[{"embeddable":true,"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/bc_area\/175"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/media\/25328"}],"wp:attachment":[{"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/media?parent=21437"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}