{"id":40300,"date":"2026-02-19T13:00:37","date_gmt":"2026-02-19T12:00:37","guid":{"rendered":"https:\/\/nr.no\/en\/?page_id=40300"},"modified":"2026-03-27T08:53:45","modified_gmt":"2026-03-27T07:53:45","slug":"visual-intelligence","status":"publish","type":"page","link":"https:\/\/nr.no\/en\/about\/research-centres\/visual-intelligence\/","title":{"rendered":"Visual Intelligence"},"content":{"rendered":"\n\t<div style=\"--t2-hero-dim:0.5;\" class=\"t2-hero t2-hero-content-position-center-left t2-hero-has-dim-50 t2-hero-has-fullsize-image wp-block-t2-hero\">\n\t\t<img decoding=\"async\" width=\"1024\" height=\"724\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/visual-intelligence-graphic-1024x724.png\" class=\"t2-hero__image\" alt=\"\" style=\"object-position: 88% 30%;\" loading=\"eager\" fetchpriority=\"high\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/visual-intelligence-graphic-1024x724.png 1024w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/visual-intelligence-graphic-300x212.png 300w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/visual-intelligence-graphic-768x543.png 768w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/visual-intelligence-graphic-1536x1086.png 1536w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/visual-intelligence-graphic-2048x1448.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/>\n\t\t<div class=\"t2-hero__header\"><div class=\"t2-hero__header__content\">\n\n<h1 class=\"wp-block-heading alignwide has-text-align-left\">Visual Intelligence<\/h1>\n\n\n\n<p><strong>Deep learning and artificial intelligence for complex image data<\/strong><\/p>\n\n<\/div><\/div>\n\t<\/div>\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<div class=\"wp-block-columns is-layout-flex 
wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\"><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"960\" height=\"291\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/11\/vi_blaa.png\" alt=\"VI logo\" class=\"wp-image-39289\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/11\/vi_blaa.png 960w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/11\/vi_blaa-300x91.png 300w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/11\/vi_blaa-768x233.png 768w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\"><\/div>\n<\/div>\n\n\n\t<div class=\"nr-spacer nr-spacer-small wp-block-nr-spacer\">\n\t<\/div>\n\t\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"133\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/VI-band-added-1024x133.png\" alt=\"\" class=\"wp-image-40303\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/VI-band-added-1024x133.png 1024w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/VI-band-added-300x39.png 300w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/VI-band-added-768x100.png 768w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/VI-band-added-1536x200.png 1536w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/VI-band-added.png 
1747w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">About Visual Intelligence<\/h2>\n\n\n\n<p><strong>NR is part of Visual Intelligence, a Centre for Research-Based Innovation (SFI) that specialises in the use of artificial intelligence for complex image data.<\/strong><\/p>\n\n\n\n<p><strong>In Visual Intelligence, we explore the next generation of deep learning for visual data, and develop practical solutions in close collaboration with our partners in the consortium. Applications include medicine and healthcare, marine science, energy and Earth observation.<\/strong><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">The next generation of methods for deep learning of image data<\/h2>\n\n\n\n<p>We develop solutions based on deep learning for visual data, working in close collaboration with leading industry partners across domains.<\/p>\n\n\n\n<p>Our research particularly targets four key challenges in the field:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>How to address problems with limited training data<\/li>\n\n\n\n<li>How to incorporate information about context and dependencies<\/li>\n\n\n\n<li>How to develop models and methods that can estimate confidence and uncertainty<\/li>\n\n\n\n<li>How to design methods that provide explainable and trustworthy predictions<\/li>\n<\/ul>\n\n\n\t<div class=\"nr-spacer nr-spacer-small wp-block-nr-spacer\">\n\t<\/div>\n\t\n\n\n<h2 class=\"wp-block-heading\">Interdisciplinary method development with broad applications<\/h2>\n\n\n\n<p>Interdisciplinary method development makes our research relevant in many different areas. By working methodically across domains, we create solutions that can be applied in diverse contexts. 
Our methods foster synergies between disciplines and generate value for both academia and industry.<\/p>\n\n\n\n<p>Collaboration with research groups and industry partners is central to our work, and our research activities reflect this commitment.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Marine image analysis<\/h2>\n\n\n\n<p>We work with the Institute of Marine Research (IMR) to analyse different types of image data, including marine acoustics, aerial and microscopic images. Our objective is to support monitoring, stock assessment and the general understanding of marine ecosystems.<\/p>\n\n\n\n<p>Our work includes:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>developing models that detect and classify fish using acoustic data \u2013 a crucial part of estimating fish stocks<\/li>\n\n\n\n<li>developing models that count seal pups in aerial images of ice-covered areas in Greenland<\/li>\n\n\n\n<li>developing hierarchical models for classification of various types of plankton in microscopic images<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"975\" height=\"373\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/marinakustikk.png\" alt=\"Neural network detecting sandeel in 4-channel acoustic data. Points shown in color.\" class=\"wp-image-35466\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/marinakustikk.png 975w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/marinakustikk-300x115.png 300w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/marinakustikk-768x294.png 768w\" sizes=\"auto, (max-width: 975px) 100vw, 975px\" \/><figcaption class=\"wp-element-caption\"><em>A neural network has been trained to detect sandeel in four-channel acoustic data. 
<\/em>Figure: NR.<\/figcaption><\/figure>\n\n\n\t<div class=\"nr-spacer nr-spacer-small wp-block-nr-spacer\">\n\t<\/div>\n\t\n\n\n<h2 class=\"wp-block-heading\">Medical image analysis<\/h2>\n\n\n\n<p>We develop methods for medical image analysis that support diagnostics and clinical decision-making.<\/p>\n\n\n\n<p>Together with the Cancer Registry of Norway, we are creating analytical methods for mammograms. By using explainable artificial intelligence, we can highlight the areas of an image where a cancerous tumour is located, providing both insight and support for further medical assessment.<\/p>\n\n\n\n<p>Another important collaboration is with GE Healthcare, where we analyse image sequences from cardiac ultrasounds. Here, we apply graph convolutional networks to identify measuring points and model how they relate to each other.<\/p>\n\n\n\n<p>We are also exploring the potential of foundation models in this field, particularly their ability to reduce the need for time-consuming manual annotation.<\/p>\n\n\n\n<div class=\"wp-block-group\">\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"1024\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/mamogram-VI-1024x1024.jpg\" alt=\"Mammogram where a deep neural network has identified cancer, with explainable AI highlighting the detected area.\" class=\"wp-image-35468\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/mamogram-VI-1024x1024.jpg 1024w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/mamogram-VI-300x300.jpg 300w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/mamogram-VI-150x150.jpg 150w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/mamogram-VI-768x768.jpg 768w, 
https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/mamogram-VI-1536x1536.jpg 1536w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/mamogram-VI-400x400.jpg 400w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/mamogram-VI.jpg 2048w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\"><em>A deep neural network has been developed to detect cancer in mammography images, and explainable AI has been used to highlight the area where the cancer was identified.<\/em> Image: NR.<\/figcaption><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"1024\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/ultralyd-VI-1024x1024.jpg\" alt=\"Graph convolutional network mapping measurement points in a cardiac ultrasound image, with points marked in blue, green, and red, and the rest in grayscale\" class=\"wp-image-35467\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/ultralyd-VI-1024x1024.jpg 1024w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/ultralyd-VI-300x300.jpg 300w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/ultralyd-VI-150x150.jpg 150w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/ultralyd-VI-768x768.jpg 768w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/ultralyd-VI-1536x1536.jpg 1536w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/ultralyd-VI-400x400.jpg 400w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/ultralyd-VI.jpg 2048w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\"><em>A graph convolutional network has been developed to identify measurement points in a cardiac ultrasound image.<\/em> Image: NR.<\/figcaption><\/figure>\n<\/div>\n<\/div>\n<\/div>\n\n\n\t<div class=\"nr-spacer nr-spacer-small 
wp-block-nr-spacer\">\n\t<\/div>\n\t\n\n\n<div class=\"wp-block-group\">\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<h2 class=\"wp-block-heading\">A foundation model for seismic interpretation<\/h2>\n\n\n\n<p>We are developing a foundation model for seismic data in collaboration with Equinor and Aker BP.<\/p>\n\n\n\n<p>The model, named NCS, has been trained on seismic data from Diskos, Norway\u2019s national data repository for exploration and production-related information, using the Norwegian supercomputer Olivia.<\/p>\n\n\n\n<p>Collecting large volumes of complete training data for seismic interpretation is challenging. A model that reduces the need for extensive labelled data is therefore particularly valuable. <\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"755\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/seismikk-VI-1024x755.jpg\" alt=\"Visualisation of a seismic foundation model with color codes, used for interactive mapping of oil\u2013gas contact at the Troll field.\" class=\"wp-image-35470\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/seismikk-VI-1024x755.jpg 1024w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/seismikk-VI-300x221.jpg 300w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/seismikk-VI-768x566.jpg 768w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/seismikk-VI-1536x1133.jpg 1536w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/seismikk-VI.jpg 2048w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\"><em>Seismic foundation model used for interactive mapping, showing oil\u2013gas contact in the Troll field with 
color-coded layers.<\/em>&nbsp;Image: NR.<\/figcaption><\/figure>\n<\/div>\n<\/div>\n<\/div>\n\n\n\t<div class=\"nr-spacer nr-spacer-small wp-block-nr-spacer\">\n\t<\/div>\n\t\n\n\n<div class=\"wp-block-group\">\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<h2 class=\"wp-block-heading\">Earth observation<\/h2>\n\n\n\n<p>Together with Kongsberg Satellite Services (KSAT), we are exploring the use of foundation models for Earth observation to detect and map marine oil spills, based on data from radar satellites.<\/p>\n\n\n\n<p>An important part of this work is integrating contextual information, such as wind speed and direction, since these data improve model performance and make the analyses more precise and reliable.<\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"1015\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/jordobservasjon-VI-1024x1015.jpg\" alt=\"Black-and-white radar image showing oil spills as dark patches on the sea surface.\" class=\"wp-image-35471\" srcset=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/jordobservasjon-VI-1024x1015.jpg 1024w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/jordobservasjon-VI-300x297.jpg 300w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/jordobservasjon-VI-150x150.jpg 150w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/jordobservasjon-VI-768x761.jpg 768w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/jordobservasjon-VI-1536x1522.jpg 1536w, https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/08\/jordobservasjon-VI.jpg 2048w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\"><em>A satellite image showing 
oil spills as dark patches, as the oil dampens ripples and waves on the water surface.<\/em> Image: NR.<\/figcaption><\/figure>\n<\/div>\n<\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p><\/p>\n\n\n\n<p><\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\t<div class=\"nr-spacer nr-spacer-large wp-block-nr-spacer\">\n\t<\/div>\n\t\n\n\n<p><strong>To learn more about Visual Intelligence and our role in the centre, get in touch.<\/strong><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\t\t<div id=\"post-type-multi-block_c994fd87552afea0bb4ff93122906fe2\" class=\"wp-block-post-type-multi type-manual style-card-bc_employee t2-grid\">\n\t\t\t\t\t\t\t<div class=\"t2-grid-item-col-12\">\n\t\t\t\t\t\t<a href=\"https:\/\/nr.no\/en\/employees\/line-eikvil\/\" class='card-employee'>\n\t\t\t\t\t<figure>\n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/line-eikvil-4.jpg\" alt=\"\">\n\t\t\t<\/figure>\n\t\t\t\t<div class=\"card-employee__content\">\n\t\t\t<p class=\"card-employee__name\">Line Eikvil<\/p>\n\t\t\t\t\t\t\t<p class=\"card-employee__position\">Research Director<\/p>\n\t\t\t\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 24 24\" height=\"24\" width=\"24\" class=\"t2-icon t2-icon-arrowforward\" aria-hidden=\"true\" focusable=\"false\"><path d=\"M15.9 4.259a1.438 1.438 0 0 1-.147.037c-.139.031-.339.201-.421.359-.084.161-.084.529-.001.685.035.066 1.361 1.416 2.947 3l2.882 2.88-10.19.02c-8.543.017-10.206.029-10.29.075-.282.155-.413.372-.413.685 0 .313.131.53.413.685.084.046 1.747.058 10.29.075l10.19.02-2.882 2.88c-1.586 1.584-2.912 2.934-2.947 3-.077.145-.085.521-.013.66a.849.849 0 0 0 .342.35c.156.082.526.081.68-.001.066-.035 
1.735-1.681 3.709-3.656 2.526-2.53 3.606-3.637 3.65-3.742A.892.892 0 0 0 23.76 12a.892.892 0 0 0-.061-.271c-.044-.105-1.124-1.212-3.65-3.742-1.974-1.975-3.634-3.616-3.689-3.645-.105-.055-.392-.107-.46-.083\"\/><\/svg>\n\t\t<\/div>\n\t<\/a>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><\/div>\n<\/div>\n\n\n\t<div class=\"nr-spacer nr-spacer-small wp-block-nr-spacer\">\n\t<\/div>\n\t\n\n\n<div class=\"wp-block-group\">\n<div class=\"wp-block-group has-primary-200-background-color has-background\">\n<p><strong>Visual Intelligence at a glance<\/strong><\/p>\n\n\n\n<p>Centre: Visual Intelligence<\/p>\n\n\n\n<p>Partners: UiT The Arctic University of Norway, The University of Oslo (UiO), Kongsberg Satellite Services (KSAT), The Cancer Registry of Norway, GE Healthcare, The University Hospital of North Norway (UNN), Northern Norway Regional Health Authority, The Institute of Marine Research (IMR), Equinor, Aker BP<\/p>\n\n\n\n<p>Period: 2020 &#8211; 2028<\/p>\n\n\n\n<p>Funding: Visual Intelligence is a Norwegian Centre for Research-based Innovation and is supported by the Research Council of Norway<\/p>\n\n\n\n<figure class=\"wp-block-image alignfull size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"66\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2026\/02\/Forskningsradet_Senter_SFI_Monokrom-Weller_Engelsk_RGB_11-300x66-1.png\" alt=\"\" class=\"wp-image-40311\" \/><\/figure>\n<\/div>\n<\/div>\n\n\n\t<div class=\"nr-spacer nr-spacer-medium wp-block-nr-spacer\">\n\t<\/div>\n\t\n\n\n<div class=\"wp-block-group has-background\" style=\"background-color:#beb3c4\">\n<p><strong>Additional resources<\/strong><\/p>\n\n\n\n<p><a href=\"https:\/\/nva.sikt.no\/projects\/2522411\" target=\"_blank\" rel=\"noreferrer noopener\">Project page in the Norwegian Research Information Repository (NVA)<\/a><\/p>\n\n\n\n<p><a 
href=\"http:\/\/visual-intelligence.no\/\" target=\"_blank\" rel=\"noreferrer noopener\">Visual Intelligence<\/a> &#8211; external project page<\/p>\n\n\n\n<p><a href=\"https:\/\/www.linkedin.com\/company\/sfi-visual-intelligence\/posts\/?feedView=all\" target=\"_blank\" rel=\"noreferrer noopener\">Visual Intelligence on LinkedIn<\/a><\/p>\n<\/div>\n<\/div>\n<\/div>\n\n\n\n<h3 class=\"wp-block-heading has-text-align-center\">Selected projects<\/h3>\n\n\n\n<p>Our goal is to develop methods and tools that remain applicable and can be further advanced beyond the centre\u2019s conclusion in 2028. Explore our projects below.<\/p>\n\n\n\t\t<div id=\"post-type-multi-block_90b207c32176a5d5a5cbd2e6686c47c8\" class=\"wp-block-post-type-multi type-manual style-card-bc_project-sm t2-grid\">\n\t\t\t\t\t\t\t<div class=\"t2-grid-item-col-4\">\n\t\t\t\t\t\t<a href=\"https:\/\/nr.no\/en\/projects\/cogmar\/\" class=\"card-post card-project\">\n\t\t\t\t\t<figure>\n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2024\/02\/lance-anderson-G2SDLsJp3rg-unsplash-scaled.jpg\" alt=\"The image shows a school of fish in dark waters.\">\n\t\t\t<\/figure>\n\t\t\t\t<div class=\"card-post__content\">\n\t\t\t\t\t\t\t<ul class=\"card-post__categories\">\n\t\t\t\t\t\t\t\t\t\t\t<li>Image analysis<\/li>\n\t\t\t\t\t\t\t\t\t\t\t<li>Marine image analysis<\/li>\n\t\t\t\t\t\t\t\t\t<\/ul>\n\t\t\t\t\t\t<h3 class=\"card-post__title\">Advancing marine services with computer vision (COGMAR)<\/h3>\n\t\t<\/div>\n\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"t2-grid-item-col-4\">\n\t\t\t\t\t\t<a href=\"https:\/\/nr.no\/en\/projects\/breast-cancer-detection-with-machine-learning\/\" class=\"card-post card-project\">\n\t\t\t\t\t<figure>\n\t\t\t\t<img decoding=\"async\" 
src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/09\/national-cancer-institute-W2OVh2w2Kpo-unsplash-scaled-1-scaled.jpg\" alt=\"The image shows stress fibres and microtubules in human breast cancer. Image by: Christina Stuelten, Carole Parent, 2011\">\n\t\t\t<\/figure>\n\t\t\t\t<div class=\"card-post__content\">\n\t\t\t\t\t\t\t<ul class=\"card-post__categories\">\n\t\t\t\t\t\t\t\t\t\t\t<li>Image analysis<\/li>\n\t\t\t\t\t\t\t\t\t\t\t<li>Machine learning<\/li>\n\t\t\t\t\t\t\t\t\t<\/ul>\n\t\t\t\t\t\t<h3 class=\"card-post__title\">Breast cancer detection with machine learning (MIM)<\/h3>\n\t\t<\/div>\n\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"t2-grid-item-col-4\">\n\t\t\t\t\t\t<a href=\"https:\/\/nr.no\/en\/projects\/deli\/\" class=\"card-post card-project\">\n\t\t\t\t\t<figure>\n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2024\/02\/arno-senoner-sf4YyPxoCvI-unsplash-scaled.jpg\" alt=\"deep learning-based methods for interpreting seismic data\">\n\t\t\t<\/figure>\n\t\t\t\t<div class=\"card-post__content\">\n\t\t\t\t\t\t\t<ul class=\"card-post__categories\">\n\t\t\t\t\t\t\t\t\t\t\t<li>Image analysis<\/li>\n\t\t\t\t\t\t\t\t\t\t\t<li>Machine learning<\/li>\n\t\t\t\t\t\t\t\t\t<\/ul>\n\t\t\t\t\t\t<h3 class=\"card-post__title\">Deep learning for seismic data (DELI)<\/h3>\n\t\t<\/div>\n\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"t2-grid-item-col-4\">\n\t\t\t\t\t\t<a href=\"https:\/\/nr.no\/en\/projects\/incus\/\" class=\"card-post card-project\">\n\t\t\t\t\t<figure>\n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2024\/02\/kenny-eliason-MEbT27ZrtdE-unsplash-scaled.jpg\" alt=\"\">\n\t\t\t<\/figure>\n\t\t\t\t<div class=\"card-post__content\">\n\t\t\t\t\t\t\t<ul class=\"card-post__categories\">\n\t\t\t\t\t\t\t\t\t\t\t<li>Image analysis<\/li>\n\t\t\t\t\t\t\t\t\t\t\t<li>Machine learning<\/li>\n\t\t\t\t\t\t\t\t\t<\/ul>\n\t\t\t\t\t\t<h3 class=\"card-post__title\">Intelligent 
cardiac ultrasounds (INCUS)<\/h3>\n\t\t<\/div>\n\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"t2-grid-item-col-4\">\n\t\t\t\t\t\t<a href=\"https:\/\/nr.no\/en\/projects\/a-foundation-model-for-smarter-climate-action-fm4cs\/\" class=\"card-post card-project\">\n\t\t\t\t\t<figure>\n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2025\/05\/Sentinel-5.jpg\" alt=\"This is a satellite image of central Norway and Sweden showing snow cover. The image colours are black, various shades of blue and purple, and of course white.\">\n\t\t\t<\/figure>\n\t\t\t\t<div class=\"card-post__content\">\n\t\t\t\t\t\t\t<ul class=\"card-post__categories\">\n\t\t\t\t\t\t\t\t\t\t\t<li>Earth observation<\/li>\n\t\t\t\t\t\t\t\t\t<\/ul>\n\t\t\t\t\t\t<h3 class=\"card-post__title\">THOR: A foundation model for smarter climate action (FM4CS)<\/h3>\n\t\t<\/div>\n\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"t2-grid-item-col-4\">\n\t\t\t\t\t\t<a href=\"https:\/\/nr.no\/en\/projects\/aiforscreening\/\" class=\"card-post card-project\">\n\t\t\t\t\t<figure>\n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/nr.no\/content\/uploads\/sites\/2\/2023\/12\/angiola-harry-SJCalEw-1LM-unsplash-1-scaled.jpg\" alt=\"The images shows the body of a woman in a pink shirt holding a bright pink ribbon which symbolises breast cancer awareness.\">\n\t\t\t<\/figure>\n\t\t\t\t<div class=\"card-post__content\">\n\t\t\t\t\t\t\t<ul class=\"card-post__categories\">\n\t\t\t\t\t\t\t\t\t\t\t<li>Image analysis<\/li>\n\t\t\t\t\t\t\t\t\t\t\t<li>Machine learning<\/li>\n\t\t\t\t\t\t\t\t\t<\/ul>\n\t\t\t\t\t\t<h3 class=\"card-post__title\">Trustworthy AI for breast cancer screenings (AIforScreening)<\/h3>\n\t\t<\/div>\n\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>About Visual Intelligence NR is part of Visual Intelligence, a Centre for Research-Based Innovation (SFI) that specialises in the use of artificial intelligence for complex 
image data. In Visual Intelligence, we explore the next generation of deep learning for visual data, and develop practical solutions in close collaboration with our partners in the consortium. Applications [&hellip;]<\/p>\n","protected":false},"author":12,"featured_media":0,"parent":22565,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_acf_changed":false,"_trash_the_other_posts":false,"editor_notices":[],"footnotes":""},"class_list":["post-40300","page","type-page","status-publish"],"acf":[],"_links":{"self":[{"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/pages\/40300","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/users\/12"}],"replies":[{"embeddable":true,"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/comments?post=40300"}],"version-history":[{"count":5,"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/pages\/40300\/revisions"}],"predecessor-version":[{"id":41453,"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/pages\/40300\/revisions\/41453"}],"up":[{"embeddable":true,"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/pages\/22565"}],"wp:attachment":[{"href":"https:\/\/nr.no\/en\/wp-json\/wp\/v2\/media?parent=40300"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}