Using deep neural networks to map wetlands (LAVDAS)

NR is developing deep learning methods for automatic mapping of wetlands in Norway as part of the LAVDAS project.

Our goal is to detect, classify, and delineate wetlands based on satellite imagery and elevation data from airborne laser scanning. This can help reduce impacts on peatlands from land use changes, supporting both climate and biodiversity efforts.

Vassmyra in Sørkedalen, Oslo, between Skansebakken and Lysedammene. Peatlands below the tree line often consist of different zones: at Vassmyra you can see scattered trees on the far part of the peatland, denser forest in the surrounding area, and an open water surface in the middle. Photo: NR.

Why is peatland conservation important?

Peatlands are a type of wetland and among nature’s most significant carbon sinks.

Disturbing peatlands, for example through construction, causes the peat to dry out and release carbon dioxide (CO₂) into the atmosphere. This contributes to global warming and threatens the distinct biodiversity these ecosystems support.

To prevent further damage, peatlands must be mapped so that developers can avoid them when planning land use changes.

Today, however, mapping is incomplete, especially in areas above the tree line.

Peatlands below and above the tree line

Peatlands below the tree line often have distinct zones: an open centre, scattered trees in the transition area, and dense forest at the edges.

Above the tree line, vegetation patterns are less pronounced, but the species composition changes. This can make automatic mapping more challenging.

Deep neural networks and satellite data

We are testing different types of neural networks to automatically identify wetland areas. The results from our work with U-Net and foundation models are promising.

The models are trained on data from the Sentinel-2 satellite (10-metre spatial resolution), combined with elevation data from airborne laser scanning.
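A common way to feed such multi-source data to a segmentation network is to resample everything to the same grid and stack it channel-wise. The following minimal Python sketch (assuming NumPy) illustrates this; the band selection, tile size, and random values are illustrative, not the project's actual configuration.

```python
import numpy as np

# Hypothetical example: stack Sentinel-2 bands (10 m grid) with a
# lidar-derived elevation layer into one multi-channel input tile.
tile_size = (256, 256)
rng = np.random.default_rng(0)

# Four Sentinel-2 bands at 10 m: blue, green, red, near-infrared.
s2_bands = rng.random((4, *tile_size), dtype=np.float32)

# Elevation from airborne laser scanning, resampled to the same grid.
elevation = rng.random((1, *tile_size), dtype=np.float32)

# Channel-wise stack: the network sees one (C, H, W) array per tile.
model_input = np.concatenate([s2_bands, elevation], axis=0)
print(model_input.shape)  # (5, 256, 256)
```

The key point is that every layer must share the same resolution and extent before stacking, so that each pixel carries aligned spectral and terrain information.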

To learn more about this project, please contact:

Project: National Wetlands Geospatial Database (LAVDAS)

Partners: The Norwegian Mapping Authority (project leader), the Norwegian Environment Agency, the Norwegian Institute for Nature Research (NINA), the Norwegian Institute of Bioeconomy Research (NIBIO)

Funding: The Research Council of Norway

Period: 2024 – 2027

Further resources:

Project Bank (forskningsradet.no, external site)

What can satellite data tell us about wetlands?

Sentinel-2 captures visible light as well as infrared light, which humans cannot see. Both provide valuable information.

By analysing these channels, we can, among other things:

  • Differentiate between wet peatland areas and drier ground
  • Identify differences in vegetation types and activity
  • Map moisture levels and detect changes over time
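Two standard spectral indices illustrate how such channel combinations separate wet surfaces from vegetation. The sketch below (assuming NumPy) uses made-up reflectance values; NDVI and NDWI are well-known indices, but the source does not state which indices the project actually uses.

```python
import numpy as np

# Illustrative reflectance values for three pixels.
green = np.array([0.08, 0.10, 0.12])
red = np.array([0.06, 0.05, 0.10])
nir = np.array([0.30, 0.45, 0.20])

# NDVI: high over active vegetation, low over water and bare ground.
ndvi = (nir - red) / (nir + red)

# NDWI (McFeeters): high over open water and wet surfaces.
ndwi = (green - nir) / (green + nir)

print(np.round(ndvi, 2))  # [0.67 0.8  0.33]
print(np.round(ndwi, 2))  # [-0.58 -0.64 -0.25]
```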

When using satellite imagery, cloud cover is common. To address this, we can either create cloud-free mosaics based on multiple images or combine results from the cloud-free parts of individual images (see figures 2 and 3).
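The mosaic idea can be sketched as a per-pixel composite: for each pixel, keep only the cloud-free observations across acquisitions and take their median. The toy values below are made up, and the median rule is one common choice, not necessarily the method used in the project.

```python
import numpy as np

# Three acquisitions of a 2x2 tile; two pixels are cloudy (value 0.9).
images = np.array([
    [[0.2, 0.9], [0.3, 0.4]],   # acquisition 1 (0.9 is a cloud)
    [[0.3, 0.2], [0.3, 0.5]],   # acquisition 2
    [[0.1, 0.3], [0.9, 0.6]],   # acquisition 3 (0.9 is a cloud)
])
clouds = np.array([
    [[False, True], [False, False]],
    [[False, False], [False, False]],
    [[False, False], [True, False]],
])

stack = np.where(clouds, np.nan, images)   # drop cloudy observations
mosaic = np.nanmedian(stack, axis=0)       # per-pixel median over time
print(mosaic)
```

Each output pixel is then built only from valid observations, which is why combining images across several acquisitions (or years) gives full, cloud-free coverage.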

Figure 2: Sentinel-2 satellite image over Roancejávri (6 August 2020). Clouds and shadows make peatland mapping difficult. The image displays shortwave infrared as red, near-infrared as green, and visible green as blue. Figure: NR.
Figure 3: Composite, cloud-free mosaic from Sentinel-2 (2016–2024), Roancejávri. Images from multiple years are combined into a seamless, unified image. Vegetation is clearly visible across the entire area. Figure: NR.

What elevation data can tell us?

Airborne laser scanning provides detailed terrain data. From this, we can calculate, among other things:

  • Slope steepness and direction
  • Vegetation height
  • Topographic wetness index (an estimate of flow patterns and where water is likely to accumulate after rainfall)
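The slope derivative can be sketched directly from a gridded elevation model using finite differences. The DEM values and 1 m cell size below are illustrative; a full topographic wetness index, TWI = ln(a / tan(slope)) for upslope contributing area a, additionally requires flow accumulation, which is omitted here.

```python
import numpy as np

# Tiny illustrative digital elevation model (metres).
dem = np.array([
    [10.0, 10.5, 11.0],
    [10.0, 10.4, 11.2],
    [ 9.8, 10.2, 11.0],
])
cell = 1.0  # grid cell size in metres

# Elevation gradients along rows (y) and columns (x).
dz_dy, dz_dx = np.gradient(dem, cell)

# Slope from the gradient magnitude.
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
print(np.round(slope_deg, 1))
```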

Automatic mapping with terrain data and satellite images

The detailed terrain model for Norway is based on laser scanning below the tree line and image matching from overlapping aerial photos in the mountains.

Combined with cloud-free satellite images and terrain indices (slope information, vegetation height, and wetness index), these data form the basis for our automatic mapping method using deep neural networks (see figures 4 and 5).

Figure 4: Terrain information from the Roancejávri area. Slope steepness (red), vegetation height (green), and topographic wetness index (blue) are shown as colour-coded layers. Figure: NR.
Figure 5: Preliminary results from automatic peatland mapping. Light green: correctly identified peatland; dark green: correctly identified non-peatland; red: false positives; yellow: false negatives. Blue shows water bodies from existing maps. White and black indicate areas without reference data. Figure: NR.

What do the preliminary results show?

The method currently yields promising but incomplete results.

  • 59% of existing peatlands are correctly identified (true positives)
  • 41% are missed (false negatives)
  • 9% are incorrectly classified as peatlands (false positives)
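Rates like these can be computed by comparing a predicted mask against a reference mask. The sketch below (assuming NumPy) uses tiny made-up masks, and the choice of denominators, misses relative to reference peatland and false alarms relative to predicted peatland, is an assumption about how the percentages are defined.

```python
import numpy as np

# 1 = peatland, 0 = not peatland (illustrative masks, flattened).
reference = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
predicted = np.array([1, 1, 1, 0, 1, 0, 0, 0, 0, 0])

tp = np.sum((predicted == 1) & (reference == 1))  # correctly found
fn = np.sum((predicted == 0) & (reference == 1))  # missed peatland
fp = np.sum((predicted == 1) & (reference == 0))  # false alarms

tp_rate = tp / (tp + fn)  # share of reference peatland identified
fn_rate = fn / (tp + fn)  # share of reference peatland missed
fp_rate = fp / (tp + fp)  # share of predicted peatland that is wrong
print(tp_rate, fn_rate, fp_rate)  # 0.75 0.25 0.25
```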

The results vary depending on the area and data sources. On Finnmarksvidda, in the area around Roancejávri (figure 5), we obtained:

  • 55% true positives
  • 18% false positives

Using imagery from a single year (2020) leads to reduced performance (figures 7 and 8):

  • 25% true positives
  • 50% false positives

Figure 7: An attempt to create a cloud- and snow-free mosaic for 2020 resulted in incomplete coverage and poor image quality. Figure: NR.
Figure 8: Mapping results from a snow- and cloud-affected mosaic (2020). Snow-covered peatland areas are omitted. Figure: NR.

The results highlight the importance of using satellite imagery from multiple years to ensure robust and accurate mapping.

What are our next steps?

We are continuing to improve the neural networks. The goal is to increase the proportion of correctly classified peatland areas (true positives), while reducing misclassifications (false positives and false negatives).

These improvements are essential for developing reliable tools to support better land use decisions and protect vulnerable peatland ecosystems.