KGG213/543 Remote Sensing: Image Analysis and Advanced Earth Observation


Remote Sensing Project Descriptions 2023

You will work in teams of two on your project. If you need to maintain social distancing, I encourage you to use Zoom to collaborate if either partner requests it. Resources on how to set up a Zoom meeting can be found here.

You may present your project via Zoom as well; I will set aside time in week 13 for presentations to the group (see the Guidelines document).

See MyLO Project folder or the following UTAS network drive for project data (note you will need an active VPN connection if you are outside the UTAS network):

\\studentdata\units\Geography\KGG213\project (Note that this location may change prior to the official release of these guidelines… I am awaiting confirmation of your access to a shared unit data repository, so for now I will leave these links here.)

Project 1: Forest structure characterisation using Airborne LiDAR

Tasmanian forests are traditionally mapped using manual air photo interpretation. This is an expensive and time-consuming process that can no longer keep up with management demands for information regarding real-time forest condition. Airborne LiDAR sensors collect point samples on a semi-regular grid pattern. Each point records the position of the target and the intensity (or brightness) of the laser reflection. Your task is to process a LiDAR data file to derive vegetation height, canopy cover, gap fraction and other forest structural parameters. These forest metrics, represented as grid layers, can then be used to classify the forest structural types or dominant species present in the scene.

Tasks:

  • Review the scientific literature on LiDAR metrics that can be derived to characterise forest structure, e.g. height percentiles and bincentiles, gap fraction, canopy cover
  • Explore the LAS files in lasview
  • Derive a digital surface model (DSM) and a digital terrain model (DTM) using LAStools
  • Produce a canopy height model (CHM)
  • Identify key forest variables that can be derived with the ‘lascanopy’ tool.
  • Use LAStools to calculate a grid of canopy height. The ‘lascanopy’ tool can quantify the vertical structure of the vegetation and generate grids of forest structural parameters. These grid layers can be imported into ENVI as images and stacked into a single image (a scripted example of this workflow is sketched after this list).
  • Identify the most promising LiDAR metrics that can be used to distinguish vegetation communities. The TASVEG data layer on The List can be used for comparison.
  • You could attempt to stack several LiDAR derivative grid layers in ENVI and perform a classification to map forest structural types.
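
As an optional starting point, the sketch below shows how these LAStools steps could be scripted from Python rather than run one by one. The executable names (las2dem, lasheight, lascanopy) are from the LAStools suite, but the exact flags and output naming should be checked against each tool's -h help for your version; all file paths are placeholders.

    # Sketch: scripted LAStools workflow for a DTM, DSM, CHM and canopy metric grids.
    # Assumes the LAStools executables are on the system PATH; flag names are taken
    # from the LAStools documentation and should be verified with e.g. "las2dem -h".
    import subprocess

    def run(cmd):
        print(" ".join(cmd))
        subprocess.run(cmd, check=True)

    tiles = "tiles/*.laz"   # placeholder: the provided LiDAR tiles
    step = 1.0              # output grid resolution in metres

    # DTM from ground returns only (class 2), DSM from first returns
    run(["las2dem", "-i", tiles, "-merged", "-keep_class", "2", "-step", str(step), "-o", "dtm.tif"])
    run(["las2dem", "-i", tiles, "-merged", "-first_only", "-step", str(step), "-o", "dsm.tif"])

    # Normalise heights above ground, then grid a CHM and canopy metrics
    run(["lasheight", "-i", tiles, "-replace_z", "-odir", "norm", "-olaz"])
    run(["las2dem", "-i", "norm/*.laz", "-merged", "-first_only", "-step", str(step), "-o", "chm.tif"])
    run(["lascanopy", "-i", "norm/*.laz", "-merged", "-step", "10",
         "-p", "50", "95",     # height percentiles
         "-cov", "-dns",       # canopy cover and point density
         "-o", "metrics.bil"]) # lascanopy typically writes one grid per requested metric from this base name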

Data (KGG213 projects folder on teaching network drive):

  • See the following UTAS network drive for project data (note you will need an active VPN connection if you are outside the UTAS network):

\\studentdata\units\Geography\KGG213\project

  • Cathedral Rock LiDAR data (Wellington Park)
  • LiDAR data collected over Kingston, Hobart and Mount Wellington in 2011
  • See project folder for full metadata
  • Nine LiDAR tiles near Cathedral Rock have been provided for this project. Other tiles/areas can be provided on request.

Project 2: Using Unmanned Aerial Vehicle (UAV) photography to generate 3D point clouds

Recent developments in photogrammetry and computer vision have made it possible to reconstruct 3D point clouds from overlapping photographs. This technique, called structure-from-motion (SfM), is mainly used for 3D object reconstruction from photographs of buildings and small objects. The approach also works with overlapping UAV photography.

The output is similar to a very dense LiDAR dataset. The problem is that, unlike LiDAR, mostly the surface features are retained in the point cloud. The key task in this project is to test different point cloud filtering/classification algorithms in order to separate vegetation from bare ground.

Research questions:

  • Compared to Prac 2 of this unit, what are the differences between LiDAR and photogrammetry-based point clouds?
  • Which ground classification algorithms performed best and what settings are they sensitive to?

Tasks:

  • Review the literature on point cloud filtering and the use of SfM for vegetation studies
  • Review the LAStools website and blog and find suitable examples of SfM point cloud processing workflows: https://rapidlasso.com/blog/
  • Explore the LAS file in lasview
  • Split the point cloud into tiles
  • Remove low noise from the dataset
  • Test and compare different settings in lasground and assess the sensitivity of the ground point identification to these parameters (a scripted parameter sweep is sketched after this list)
  • Alternatively, test the cloth simulation filter (CSF) in CloudCompare (or as a standalone application) for ground point identification and assess different parameter settings
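
If you would rather script the lasground sensitivity test than run each setting by hand, a minimal sketch is given below. It assumes lasground is on the system PATH and that the flag names match your LAStools version (check with "lasground -h"); the input path is a placeholder.

    # Sketch: sensitivity test of lasground's "-step" parameter on an SfM point cloud.
    import subprocess

    input_tiles = "tiles/*.laz"        # placeholder: tiled, noise-filtered point cloud

    for step in [1, 2, 5, 10, 25]:     # coarser steps remove more vegetation, but also more terrain detail
        outdir = f"ground_step{step}"
        subprocess.run(["lasground", "-i", input_tiles,
                        "-step", str(step),
                        "-compute_height",       # store height above ground for later comparison
                        "-odir", outdir, "-olaz"],
                       check=True)

    # Compare the resulting ground classifications (class 2) in lasview or CloudCompare,
    # or grid each run with las2dem and difference the DTMs to quantify the sensitivity.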

Data (KGG213 projects folder on teaching network drive):

  • See the following UTAS network drive for project data (note you will need an active VPN connection if you are outside the UTAS network):

\\studentdata\units\Geography\KGG213\project

  • Point cloud from UAV photogrammetry – forest in Ridgeway, Hobart (near Waterworks reserve)
  • Orthophoto from UAV photogrammetry

Project 3: Change detection of land cover in the Hobart, Kingston, or Sorell area

The aim of this project is to look at land cover change of an area in Tasmania based on Landsat imagery. Different change detection techniques will be tested.

Research Questions:

  • Is co-registration of the two images sufficiently accurate for change detection? If not, what type of geometric rectification needs to be applied?
  • What change detection technique is most suitable to quantify and map the changes in the area?
  • Are these changes due to differences in the image sensor or are these changes related to true land cover changes?

Tasks:

  • Perform a literature search on change detection with Landsat
  • Select an area that has changed a lot in recent years, e.g. Kingston or Sorell
  • Take a subset of the images in order to focus on a smaller area rather than the whole image and to reduce computation time.
  • Apply and test change detection methods:
    • Apply the band difference function in ENVI’s change detection menu
    • Calculate NDVI images for both dates and subtract them using ENVI’s band math function or the band difference function in the change detection menu (a scripted NDVI-difference example is sketched after this list).
    • Try to classify both images with ROIs and look at the class changes (see ENVI help on post-classification change detection)
  • Assess whether change has occurred and if so interpret these changes by looking at the change in spectral signatures.
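
For those who want to cross-check ENVI's output, the sketch below computes an NDVI difference with Python (numpy and rasterio are assumed to be installed). The file names, band indices and change threshold are placeholders; for Landsat 8, red is band 4 and NIR is band 5 of the original product, but the indices below refer to whatever band order you export in your subset.

    # Sketch: NDVI-difference change detection as a cross-check on ENVI's band math.
    # Assumes two co-registered subsets with red and NIR bands saved as GeoTIFFs.
    import numpy as np
    import rasterio

    def ndvi(path, red_band=1, nir_band=2):
        with rasterio.open(path) as src:
            red = src.read(red_band).astype("float32")
            nir = src.read(nir_band).astype("float32")
            profile = src.profile
        return (nir - red) / (nir + red + 1e-6), profile

    ndvi_t1, profile = ndvi("subset_date1.tif")     # placeholder file names
    ndvi_t2, _ = ndvi("subset_date2.tif")

    diff = ndvi_t2 - ndvi_t1
    change = np.abs(diff) > 0.2                     # placeholder threshold; justify it from your data

    profile.update(count=1, dtype="float32")
    with rasterio.open("ndvi_difference.tif", "w", **profile) as dst:
        dst.write(diff, 1)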

Data (online):

  • Landsat data (e.g. Landsat 5 and 8) available from the USGS Earth Explorer website (see last page for relevant links), or
  • Sentinel-2 imagery (also see online archive on the last page of this handout)

Project 4: Changes in the polar regions

Pick an event from the last 10 years in which a large part of an ice shelf was lost in the polar regions. Use remote sensing techniques to estimate the amount of ice lost during one event.

An example is the Larsen B Ice Shelf, where approximately 3,275 km² of ice shelf collapsed in 2002 (you could look at more recent changes in this region).

Tasks:

  • Using MODIS imagery acquired before and after the breakup event, determine if the estimated area lost during the collapse is accurate
    • Several studies on the Larsen B breakup have been published in the literature, be sure to consult these studies
    • You will need to classify the scenes into two classes: ice and water (a sketch of the area calculation from such a two-class map follows this list)
  • Has there been evidence of further ice loss since the event? Compare the ice shelf extent both five and ten years post-collapse.
    • If there is evidence of further collapse, how much ice has been lost? Is the rate of loss increasing over time? If so, give reasons for why this may have occurred.
    • If there is no evidence of further collapse, is there evidence of regrowth? If not, explain why you think that the ice shelf may have reached stasis, and what events or conditions may have contributed to the initial collapse.
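
The area estimate itself is simple arithmetic once each scene is classified into ice and water, as the sketch below shows. The 250 m pixel size is an assumption (MODIS bands 1 and 2); the classified arrays are placeholders with 1 = ice and 0 = water.

    # Sketch: estimating ice-shelf area lost from two classified (ice/water) MODIS scenes.
    import numpy as np

    pixel_size_m = 250.0                              # assumed MODIS pixel size (bands 1-2)
    pixel_area_km2 = (pixel_size_m / 1000.0) ** 2

    ice_before = np.load("classified_before.npy")     # placeholder 0/1 arrays from your classification
    ice_after = np.load("classified_after.npy")

    area_before = ice_before.sum() * pixel_area_km2
    area_after = ice_after.sum() * pixel_area_km2
    print(f"Ice area before: {area_before:.0f} km2, after: {area_after:.0f} km2")
    print(f"Estimated loss:  {area_before - area_after:.0f} km2")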

Data (online):

  • MODIS time series

Project 5: Classification of sub-Antarctic vegetation on Macquarie Island using Quickbird imagery

This project looks at classification of sub-Antarctic vegetation communities. More specifically, you will distinguish healthy tussock vegetation from other vegetation and land cover types. The main aim is to compare different classification algorithms in ENVI and to assess whether the inclusion of texture measures can improve the classification.

Tasks:

  • Create a 3D view of the Quickbird image draped over the DEM to explore the area
  • Explore the spectral properties of tussock (provided as ENVI ROIs) compared to other features/classes.
  • Explore the use of occurrence and co-occurrence texture measures.
  • Add several texture measures to the multispectral bands as a layer stack.
  • Explore and apply different classification algorithms on the Quickbird image and test their accuracy. Include at least maximum likelihood, Spectral Angle Mapper (SAM), and Support Vector Machines (SVM).
  • Assess the classification accuracy of each result with the validation ROIs and ENVI’s confusion matrix. Study the overall accuracy, kappa coefficient, and user’s and producer’s accuracies, and explain the differences between these measures (their definitions are sketched after this list).
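
ENVI reports all of these measures directly, but the short numpy sketch below makes the definitions explicit, which helps when you explain the differences between them. The confusion matrix values are made up for illustration.

    # Sketch: accuracy measures from a confusion matrix
    # (rows = reference classes, columns = classified classes).
    import numpy as np

    cm = np.array([[50,  5,  2],    # reference class 1 (e.g. tussock)
                   [ 4, 40,  6],    # reference class 2
                   [ 1,  3, 45]])   # reference class 3

    n = cm.sum()
    overall_accuracy = np.trace(cm) / n
    producers_accuracy = np.diag(cm) / cm.sum(axis=1)   # per reference class (omission errors)
    users_accuracy = np.diag(cm) / cm.sum(axis=0)       # per mapped class (commission errors)

    # Cohen's kappa: agreement beyond what chance agreement of the class totals would give
    expected = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2
    kappa = (overall_accuracy - expected) / (1 - expected)

    print(overall_accuracy, kappa, producers_accuracy, users_accuracy)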

Data (KGG213 projects folder on teaching network drive):

  • See the following UTAS network drive for project data (note you will need an active VPN connection if you are outside the UTAS network):

\\studentdata\units\Geography\KGG213\project

  • Imagery in the ENVI format will be provided. The data sets will consist of the following files:
  • qb_finch: a subset of a Quickbird image of Macquarie Island
    • Acquisition date: 15 March 2005
    • Area: Finch Creek, Brothers Point, Bauer Bay
    • 4 multi-spectral bands
    • pixel size: 2.4m
    • projected coordinate system: WGS84 UTM Zone 57S
    • Processing: orthorectified with survey control points and the DEM
  • dem_finch: a subset of the Digital Elevation Model of Macquarie Island
    • Area: Finch Creek, Brothers Point, Bauer Bay
    • Acquired by NASA’s AIRSAR in 2000
    • Pixel size: 5m
    • projected coordinate system: WGS84 UTM Zone 57S
  • tussock.roi: an ROI file that can be used in combination with the Quickbird image for training and validation (accuracy assessment) of classification algorithms. These ROIs are based on field observations and GPS locations.

Project 6: Urban growth in the Pearl River Delta, China or other areas of fast urban growth

The Pearl River Delta, located on the south-eastern edge of China, is one of the fastest growing urban areas in the world. There are many other areas across the world where urban areas are expanding rapidly, such as Beihai, Surat or Kabul.

Tasks:

  • Determine the rate of urban expansion of the city of Guangzhou and its neighbours. Analyse the change in 10 year increments from 1980 to 2010.
  • Determine the total area of urban land created between 1980 and 2010. Describe which primary land cover classes have been lost as a result of urban expansion. Describe which classes expanded and deduce which type of infrastructure was built (e.g. housing, industrial, agricultural). Describe the change over time (a small area and growth-rate calculation is sketched after this list).
  • Examine if there is evidence to suggest significant degradation of the river over time in relation to the urban expansion
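
Converting classified urban pixel counts into areas and decadal growth rates is straightforward, as sketched below. The pixel counts are placeholders for whatever your classifications produce, and a 30 m pixel is assumed (Landsat MSS pixels are coarser).

    # Sketch: urban area and growth rate per decade from classified Landsat scenes.
    pixel_area_km2 = (30.0 / 1000.0) ** 2      # assumed 30 m pixels

    urban_pixels = {1980: 450_000, 1990: 900_000,          # placeholder counts of "urban" pixels
                    2000: 1_800_000, 2010: 3_100_000}

    years = sorted(urban_pixels)
    for y0, y1 in zip(years, years[1:]):
        a0 = urban_pixels[y0] * pixel_area_km2
        a1 = urban_pixels[y1] * pixel_area_km2
        print(f"{y0}-{y1}: {a0:.0f} -> {a1:.0f} km2 "
              f"(+{a1 - a0:.0f} km2, {100 * (a1 / a0 - 1):.0f}% growth)")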

Data (online):

  • Landsat MSS, TM, ETM+, OLI data available from the USGS Earth Explorer website (or Sentinel-2 in the online Sentinel archive)

Urban population cartogram by Worldmapper.org: the size of each territory is drawn in proportion to the variable mapped.

Project 7: Mineral Mapping using Imaging Spectrometry and Spectral Mixture Analysis

The technology of imaging spectrometry was developed in the late eighties in the U.S. The concept of imaging spectrometry is that reflected sunlight is measured in many narrow, contiguous spectral bands for each pixel. Today, airborne (AVIRIS, CASI, DAIS & HyMAP) and spaceborne imaging sensors are operational. Rocks, minerals and soils have diagnostic absorption features in the solar spectrum and can be detected by imaging spectrometers. In this exercise, you will use airborne hyperspectral images acquired by AVIRIS to investigate their usefulness for mapping open mines.

Tasks:

  • Describe the system properties of the AVIRIS instrument
  • Describe (find in literature) the concept of Spectral Mixture Analysis
  • Determine the suitability of Spectral Mixture Analysis (SMA) of hyperspectral data for surface mineralogy mapping
  • Find two spectra of Kaolinite and Alunite and compare the spectral differences
  • Use reference spectra of Kaolinite and Alunite and the ENVI linear spectral unmixing tool to produce maps of mineral abundance around the open mines in the Cuprite area
  • Compare the results from the linear spectral unmixing technique with other tools available in ENVI’s Spectral toolbox (a minimal unmixing sketch in numpy follows this list)
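
For context on what ENVI's Linear Spectral Unmixing tool is doing, the sketch below solves the linear mixing model with unconstrained least squares in numpy: each pixel spectrum is modelled as a weighted sum of the endmember spectra, and the weights (abundances) are estimated per pixel. The endmember file, image cube and their common band sampling are placeholders.

    # Sketch: linear spectral unmixing by least squares, as a cross-check on ENVI.
    import numpy as np

    cube = np.load("aviris_subset.npy")       # placeholder, shape (rows, cols, bands)
    E = np.load("endmembers.npy")             # placeholder, shape (bands, n_endmembers), e.g. kaolinite & alunite

    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).T        # (bands, n_pixels)

    # Solve E @ abundances = pixel for all pixels at once (no constraints applied)
    abundances, *_ = np.linalg.lstsq(E, pixels, rcond=None)
    abundance_maps = abundances.T.reshape(rows, cols, E.shape[1])

    # Without sum-to-one or non-negativity constraints, abundances well outside [0, 1]
    # flag pixels that the chosen endmember set does not explain well.
    print(abundance_maps.min(), abundance_maps.max())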

Data (KGG213 projects folder on teaching network drive):

  • See the following UTAS network drive for project data (note you will need an active VPN connection if you are outside the UTAS network):

\\studentdata\units\Geography\KGG213\project

  • 1995 and 2011 AVIRIS data
  • Alteration map
  • Field picture
  • ENVI tutorial: Preprocessing AVIRIS
  • Literature

Project 8: Lake Erie Algal Blooms

Lake Erie, one of the Great Lakes located between the U.S. and Canada, has undergone many toxic algal blooms in the past. Scientists are now concerned that these blooms could increase in frequency, extent and vigour, as climate conditions change. The current concentration of algae in the lake is again critical.

Tasks:

  • Search the scientific literature to find a study that describes detection of algal blooms using remote sensing.
  • The latest severe algal bloom in the region occurred in 2015. Download a Landsat 8 scene from one of the months in which the bloom occurred.
  • Identify the extent of the algal bloom, and determine its total area compared to the total size of the lake.
  • Map the area covered by algae into healthy and unhealthy algae through the use of band ratios (indices) and/or image classification.
  • Examine the distribution of the bloom throughout the lake, e.g. in relation to the lake’s depth and the ratio and distribution of healthy vs. toxic algae (a band-ratio and area-fraction sketch follows this list)
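
A very rough sketch of the ratio-and-area step is given below. The specific index, bands and thresholds are placeholders and should come from the algal-bloom literature you review; rasterio and numpy are assumed to be installed, and the lake mask is assumed to exist already (e.g. from a water index threshold).

    # Sketch: bloom mask from a simple band ratio, and its area relative to the lake.
    import numpy as np
    import rasterio

    with rasterio.open("landsat8_subset.tif") as src:   # placeholder stack, bands ordered B2,B3,B4,B5
        green = src.read(2).astype("float32")
        red = src.read(3).astype("float32")

    ratio = green / (red + 1e-6)                 # placeholder ratio; replace with an index from the literature
    lake = np.load("lake_mask.npy")              # placeholder 0/1 lake mask
    bloom = (ratio > 1.3) & (lake == 1)          # placeholder threshold

    pixel_area_km2 = (30.0 / 1000.0) ** 2
    print("Bloom area:", bloom.sum() * pixel_area_km2, "km2")
    print("Fraction of lake:", bloom.sum() / lake.sum())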

Data (online):

  • Landsat 8 data available from the USGS Earth Explorer website
  • Journal papers

Project 9: Estimating morphological traits for different forest layers from high point density UAV LiDAR data

During a field campaign in September 2019, high point density LiDAR data was collected over a Terrestrial Ecosystem Research Network (TERN) SuperSite at Tumbarumba in the Bago State Forest in NSW. The site has been recognised for its significant contribution to the calibration of satellite-derived Earth Observation products. The forest is an open wet sclerophyll Eucalyptus forest with a maximum vegetation height of 47 m. The data show that more than one forest layer is present underneath the canopy. Morphological traits such as tree height and structure can be measured through morphological variables such as canopy height, density, coverage, etc. However, the different forest layers have their own traits that need to be measured separately. This requires the point cloud to be segmented and classified so that the morphological variables can be derived for every layer individually.

Research questions:

  • What are the morphological variables for the Tumbarumba site?
    • What are the morphological traits for each of the forest layers?
    • What is the influence of the layer thickness on the morphological variables?
    • How do the morphological traits differ between the layers and overall?
  • What are your conclusions regarding the layering of the forest?

Tasks:

  • Segment the different forest layers in CloudCompare
    • Create slices of 1 m thickness from the point cloud
    • First segment out the canopy layer and save
    • Then segment out the other forest layers and save them
    • Merge all canopy layer segments back together using LAStools; do the same for the other layers (a scripted slicing-and-merging alternative is sketched after this list).
  • Calculate morphological variables with LAStools for the whole forest and for each layer
    • Use different step sizes (e.g. different grid resolutions)
  • Compare the results
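
If you prefer a scripted route for the slicing and merging steps, the sketch below uses las2las height filters on the height-normalised cloud and lasmerge to recombine selected slices. The flag names are assumed from the LAStools documentation (check with "las2las -h"), the file names are placeholders, and which slices belong to which forest layer remains your decision.

    # Sketch: cut a height-normalised point cloud into 1 m layers, then merge chosen slices.
    import subprocess

    infile = "tumbarumba_normalised.laz"    # placeholder file name
    max_height = 47                          # approximate maximum vegetation height at the site

    for low in range(0, max_height):
        high = low + 1
        subprocess.run(["las2las", "-i", infile,
                        "-keep_z", str(low), str(high),   # z = height above ground on a normalised cloud
                        "-o", f"slice_{low:02d}_{high:02d}.laz"],
                       check=True)

    # Example: merge the slices you assign to the canopy layer (layer boundaries are your call)
    subprocess.run(["lasmerge", "-i", "slice_25_26.laz", "slice_26_27.laz",
                    "-o", "canopy_layer.laz"], check=True)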

Data:

  • A ground-classified, normalised and noise-filtered point cloud of the Tumbarumba site can be found on MyLO.

Project 10: Geographic Object-Based Image Analysis (GEOBIA) using eCognition for object-based classification of urban areas

In this project you will apply building extraction techniques based on high-resolution digital aerial photography and LiDAR data acquired over Hobart in 2011. The aim is to automatically identify buildings based on their shape and other object characteristics and to analyse the accuracy of this approach. Ultimately, different land-use classes for buildings could also be extracted automatically.

Tasks:

  • Use the aerial photography and LiDAR data provided for the Hobart, Sandy Bay area
  • Experiment with different segmentation algorithms and segmentation parameters to identify the most suitable segmentation algorithm to define buildings. Show the impact of segmentation parameters on the output
  • Combining the image data and the LiDAR data, build a ruleset for automatic identification of buildings
  • Test the accuracy of the building extraction by manually digitising several buildings and comparing the automatically identified building polygons to your digitised reference polygons (a simple polygon-overlap measure is sketched after this list).
  • Challenge: define rules that allow you to separate different land uses (rather than land cover): residential versus industrial
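
One simple way to put a number on the agreement between an extracted building outline and your digitised reference is intersection-over-union, sketched below with shapely. In practice you would load both sets of polygons from shapefiles (for example with geopandas); the coordinates here are placeholders.

    # Sketch: intersection-over-union (IoU) between an extracted and a reference building polygon.
    from shapely.geometry import Polygon

    extracted = Polygon([(0, 0), (10, 0), (10, 8), (0, 8)])   # placeholder coordinates
    reference = Polygon([(1, 0), (11, 0), (11, 9), (1, 9)])

    intersection = extracted.intersection(reference).area
    union = extracted.union(reference).area
    print(f"IoU = {intersection / union:.2f}")   # 1.0 would be perfect agreement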

Data:

  • See the following UTAS network drive for project data (note you will need an active VPN connection if you are outside the UTAS network):

\\studentdata\units\Geography\KGG213\project

  • Sandy Bay orthophoto mosaic with RGB and NIR bands, collected in 2011
  • Sandy Bay LiDAR data, collected in 2011
  • Metadata doc
  • eCognition Deconstructed: NDSM Layer Calculation

Project 11: Remote sensing from an Unmanned Aerial Vehicle (UAV) – a saltmarsh classification case study

In this project, you will work with UAV aerial photography collected over a saltmarsh area in Ralphs Bay in southeast Tasmania. The key task is to classify the imagery into different vegetation communities using classification techniques.

Tasks:

  • Extract a range of texture measures based on the GLCM approach and combine the texture bands with the RGB bands in an image stack (a moving-window GLCM sketch follows this list).
    • Vishnu Prahalad mapped the vegetation communities of several saltmarshes in southeast Tasmania in great detail (Prahalad 2009). Vishnu’s shapefile for Ralphs Bay can be used for training and testing purposes. Derive two separate region of interest (ROI) files: one for training and one for testing. It’s best to generate many (20+) small polygons for each class.

  • Based on pixel-based classification techniques or GEOBIA, derive saltmarsh vegetation communities from the UAV imagery.
    • Test the accuracy of the results and compare the results with Vishnu’s map.
    • Assess the effect of texture measures on the classification accuracy.
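
ENVI's co-occurrence texture filters will do this for you, but the sketch below shows what a single GLCM texture band (here homogeneity) computed over a moving window looks like in code, using scikit-image. Window size, number of grey levels, offsets and the texture property are all parameters worth experimenting with; the function names assume scikit-image 0.19 or later (graycomatrix/graycoprops).

    # Sketch: one GLCM texture band computed over a moving window, to stack with the RGB bands.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_texture(band, window=7, levels=32, prop="homogeneity"):
        # Quantise to a small number of grey levels to keep each window's GLCM manageable
        q = np.floor(band / band.max() * (levels - 1)).astype(np.uint8)
        half = window // 2
        out = np.zeros(band.shape, dtype="float32")
        for r in range(half, band.shape[0] - half):
            for c in range(half, band.shape[1] - half):
                win = q[r - half:r + half + 1, c - half:c + half + 1]
                glcm = graycomatrix(win, distances=[1], angles=[0],
                                    levels=levels, symmetric=True, normed=True)
                out[r, c] = graycoprops(glcm, prop)[0, 0]
        return out   # plain Python loop: slow, but fine for a small image subset

    # texture = glcm_texture(red_band)   # then stack with the RGB bands before classification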

Data (KGG213 project folder):

  • See the following UTAS network drive for project data (note you will need an active VPN connection if you are outside the UTAS network):

\\studentdata\units\Geography\KGG213\project

  • Orthophoto from UAV photogrammetry
    • DSM point cloud from UAV photogrammetry
    • Saltmarsh vegetation communities shapefile

Project 12: Drone based weed mapping at Tolmans Hill

In this project, you will work with UAV aerial photography and multispectral data (MicaSense RedEdge-MX) collected over a weed-infested area at Tolmans Hill (near Mt Nelson, Hobart) on 30/8/2019. The focus of your project is up to you, but the core goal is to look at how well the weeds can be mapped using the drone data.

Tasks (choose one or more of these optional tasks depending on your chosen focus):

  • Extract weed classes based on supervised classification. Choose one or more techniques and compare the results: for example, choose one standard ENVI classifier such as SAM and another that you are comfortable with, and look at how well they perform. You may want to include PCA or GLCM band(s).
    • eCognition might provide good results, you could compare an ENVI approach to an eCognition classification.
    • Reflectance panel data is available in case you want to try an empirical line correction using the grayscale panels near GCP#1 (a minimal sketch of the correction is given after this list).
    • DSM data (2.5D tif and 3D obj) is provided, you could look at how to include these datasets in your classifications (or compare the classifications with/without DSM data derivatives).
    • RGB data is provided; you could assess how this could be used for classification and/or classification improvement by comparing two scenarios (e.g. RGB classification vs multispectral data classification).
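
The empirical line method fits a per-band linear relationship between the image values sampled over the grayscale panels and the panels' known reflectance, then applies that gain and offset to the whole band. A minimal numpy sketch is below; the panel reflectances and sampled image values are placeholders, and the fit should be repeated for each band.

    # Sketch: empirical line correction for a single band.
    import numpy as np

    panel_reflectance = np.array([0.05, 0.22, 0.44])        # placeholder calibrated panel reflectances
    panel_image_value = np.array([310.0, 1480.0, 2950.0])   # placeholder mean image values over each panel

    gain, offset = np.polyfit(panel_image_value, panel_reflectance, 1)

    band = np.load("rededge_band.npy").astype("float32")    # placeholder band array
    reflectance = gain * band + offset
    print(f"reflectance = {gain:.6f} * value + {offset:.4f}")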

Data:

  • See the following UTAS network drive for project data (note you will need an active VPN connection if you are outside the UTAS network):

\\studentdata\units\Geography\KGG213\project

  • Flights: visible (RGB) @ 30m, 60m, 80m and 120m and RedEdge MX @ 60m, 80m and 120m.
    • Ground Truth data captured: 24 ASD FieldSpec spectra and RTK DGNSS coordinates of key species/ground cover.
    • Ground Control: 10 ground control markers (Propeller AeroPoints, admittedly not perfectly distributed due to gorse barriers and injury!).
    • Key weed species: gorse and Spanish heath; both were flowering when the data were collected, as was wattle (pictures provided).
    • Weather: It was a perfect sunny day!
    • See notes for datasets (in DatasetNotes.txt in project folder).

Project 13: Drone based vegetation mapping and change detection using RGB imagery of the Prosser River Mouth at Orford

In this project, you will work with UAV aerial photography collected over a bird habitat at the Prosser River mouth at Orford (on the east coast of Tasmania, about 70 km from Hobart). Two datasets were captured, in June 2019 and April 2021. The focus of your project is to see how well you can detect vegetation change between the two datasets (and coastal change if you want). Both have been controlled with GNSS ground control targets (AeroPoints), but they are not perfectly coregistered (important for change detection).

The site is a breeding habitat for shorebirds, and some of these species nest on the sand. The key reason for assessing the vegetation change is that weeds like marram grass can spread and overtake open sandy areas. The birds may choose not to nest amongst the marram grass, so the change detection should focus on the spread of marram grass in the area beside the river mouth (western bank). Boobialla is also spreading in the sanctuary and may also impact nesting sites. The two datasets are less than two years apart, so the change will be small and may be difficult to quantify accurately; that is okay, as the project focus is on the techniques used and your explanation of your results.

Tasks (suggestion):

  • Coregister the two datasets.
    • Classify the two datasets (optionally using DSM and/or photogrammetric point cloud if you want to incorporate a CHM or other derivatives in your classification).
    • Perform change detection (you may find that you need to classify the change as likely change and likely error; a from-to change matrix sketch follows this list).
    • Discuss your process and how it impacted the results (you may want to try various approaches and compare the results).
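
For the post-classification route, a "from-to" change matrix cross-tabulates the two classified maps, as sketched below with numpy. The class maps are placeholders; note that small off-diagonal counts may reflect misregistration or classification error rather than real change, which is exactly why separating "likely change" from "likely error" matters.

    # Sketch: from-to change matrix between two classified maps of the same extent.
    import numpy as np

    class_2019 = np.load("classified_2019.npy")   # placeholder integer class maps, same shape
    class_2021 = np.load("classified_2021.npy")
    classes = np.union1d(class_2019, class_2021)

    change_matrix = np.zeros((classes.size, classes.size), dtype=np.int64)
    for i, c_from in enumerate(classes):
        for j, c_to in enumerate(classes):
            change_matrix[i, j] = np.sum((class_2019 == c_from) & (class_2021 == c_to))

    print(classes)
    print(change_matrix)   # rows: 2019 class, columns: 2021 class; off-diagonal cells = change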

Data:

  • See the following UTAS network drive for project data (note you will need an active VPN connection if you are outside the UTAS network):

\\studentdata\units\Geography\KGG213\project

  • Two RGB orthomosaic datasets captured in June 2019 and April 2021 are provided along with a LAZ and DSM for each date that you may want to use to help in your classification.

Project 14: Define your own project!

You are allowed to propose your own topic; however, you have to be realistic about the objectives and tasks that you can achieve within the given time. Make sure that the datasets for your own project are readily available. If you want to work on your own topic, contact us in the first week of the project period ([email protected]).

Some examples of alternative remote sensing topics include:

  • Sentinel: Data for the Sentinel missions is available free of charge after creating an account with Copernicus open access hub https://scihub.copernicus.eu/dhus/#/home
    • MODIS time series: Look at time series of MODIS products such as EVI to assess changes in vegetation patterns over the seasons, or to assess the impacts of El Niño and La Niña on the vegetation patterns in Australia. Use time series of vegetation indices to separate crops from natural vegetation areas.

  • Advanced techniques/tools in ENVI:

  • ENVI’s Feature extraction module for object-based image analysis
    • Rigorous orthorectification
    • Atmospheric correction
    • Pansharpening
    • Change detection
    • Classification
    • Spectral unmixing
  • Planetary remote sensing: NASA and ESA have made satellite imagery of Mars freely available for download. There’s loads of exciting stuff to explore!

Sources of free datasets:

Inspiring ideas for applications:

More topics:

  • Assess Australian forest fires
    • Assess Amazon forest fires
    • Assess regeneration after forest fires in Tasmania
    • Assess forest change (logging) in Tasmania
    • Detect changes during the global COVID-19 lock down, think of traffic etc.
    • Explore new Earth Observation data from CubeSats, e.g. Planet CubeSat data
    • Explore data from the GEDI global laser scanning mission on the International Space Station
    • Work with the TerraLuma group at UTAS on drone-based remote sensing applications (ask Arko Lucieer and/or Darren Turner)
    • Work on an experiment in the lab or outside with spectroscopy
    • Work on better understanding drone-based multispectral/hyperspectral/thermal/lidar sensors
    • Etc, etc, lots of interesting applications and datasets to explore and pursue!
