Natural Environment Research Council
Grants on the Web

Details of Award

NERC Reference : NE/R007888/1

Blue eyes: New tools for monitoring coastal environments using remotely piloted aircraft and machine learning.

Training Grant Award

Lead Supervisor:
Dr M Mackiewicz, University of East Anglia, Computing Sciences
Science Area:
Atmospheric
Earth
Freshwater
Marine
Terrestrial
Overall Classification:
Terrestrial
ENRIs:
Biodiversity
Environmental Risks and Hazards
Global Change
Natural Resource Management
Pollution and Waste
Science Topics:
Appl. of Expert Systems (AI)
Intelligent & Expert Systems
Data Fusion (Measurement)
Image Reconstruction
Image Registration
Sensor Data Fusion
Intelligent Measurement Sys.
Land - Ocean Interactions
Remote Sensing & Earth Obs.
Optical remote sensing
Vegetation monitoring
Survey & Monitoring
Biodiversity monitoring
Abstract:
Remote sensing of coastal environments is a rapidly evolving field, with remotely piloted aircraft (RPA) showing growing potential for mapping and tracking the distribution, health and dynamics of coastal features such as vegetation, shellfish, birds, mammals, geomorphology and even litter (e.g. Anderson & Gaston, 2013). Their advantages over traditional environmental monitoring techniques are their speed of deployment and their ability to deliver ultra-high-resolution 2-D and 3-D images of entire landscapes and the environmental features within them, which can be used to map accurately and assess objectively even subtle or early changes in the physical environment and in biodiversity. The information generated from RPA monitoring allows managers to gain a detailed and accurate understanding of how entire features of coastal ecosystems change in response to climate, hydrographic conditions and anthropogenic activities, and so represents a transformation in how these ecosystems and their natural and anthropogenic hazards can be managed. The methods employed to detect and classify the environmental features in the images are pivotal to the success of RPA use in marine ecosystems. Traditional image processing techniques use pixel-based supervised classification, but the super-high resolution of RPA images can render such methods cumbersome and ineffective, increasing costs and causing delays in data production. Hence, new techniques must be developed specifically to deal with the significant amounts of information contained in RPA images. This studentship aims to develop bespoke tools for RPA image analysis using 'deep learning', a machine learning technique that has recently delivered a step change in a number of computer vision applications (e.g. LeCun et al., 2015).
In particular, we envisage that fusion of the now-available 3-D information with traditional RGB imaging will be one of the main focal points of the project, as it promises significant improvements in performance (Gupta et al., 2014). The algorithm development will require a large dataset of annotated imagery for training and expert knowledge of image appearance, both of which are available at Cefas. Cefas has an extensive library of RPA images covering beaches, mudflats and rocky shores, generated from projects monitoring coastal geomorphology and erosion, seagrass, saltmarsh, algae and marine invertebrates. The student will apply various computational techniques, starting with the ideas mentioned above, with a view to developing algorithms for identifying the key coastal features, with the benefit that tool development can begin immediately on commencement of the PhD. The student will also contribute to the design and deployment of Cefas RPA flights as the research progresses, to provide targeted information for any environmental features that prove difficult to identify. The developed tools will allow a faster and more effective interpretation of images, vastly improving the utility of the coastal change tracking technology.
References:
Anderson, K. & Gaston, K.J. (2013) Lightweight unmanned aerial vehicles will revolutionise spatial ecology. Frontiers in Ecology and the Environment, 11(3), 138-146.
LeCun, Y., Bengio, Y. & Hinton, G. (2015) Deep learning. Nature, 521, 436-444.
Gupta, S., Girshick, R., Arbelaez, P. & Malik, J. (2014) Learning Rich Features from RGB-D Images for Object Detection and Segmentation. In: ECCV 2014, Lecture Notes in Computer Science, 8695, 345-360.
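The RGB-D fusion idea described above can be illustrated with a minimal sketch: the depth channel is stacked onto the RGB channels ("early fusion") and the resulting 4-channel image is passed through a convolutional layer to yield per-pixel class scores. This is an illustrative sketch only, not the project's actual method; the single-layer "network", the random kernels and the two-class setup are assumptions made purely for demonstration.

```python
import numpy as np

def fuse_rgbd(rgb, depth):
    """Early fusion: stack an RGB image (H, W, 3) and a depth map (H, W)
    into a single 4-channel image (H, W, 4)."""
    assert rgb.shape[:2] == depth.shape
    return np.concatenate([rgb, depth[..., None]], axis=-1)

def conv2d(x, kernels):
    """Naive 'same'-padded 2-D convolution over a multi-channel image.
    x: (H, W, C_in); kernels: (k, k, C_in, C_out) -> (H, W, C_out)."""
    k = kernels.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (pad, pad), (0, 0)))
    H, W, _ = x.shape
    out = np.empty((H, W, kernels.shape[-1]))
    for i in range(H):
        for j in range(W):
            # Contract the (k, k, C_in) patch against every output kernel.
            out[i, j] = np.tensordot(xp[i:i + k, j:j + k, :], kernels, axes=3)
    return out

def segment(rgbd, kernels):
    """Per-pixel class labels: argmax over the convolutional class scores."""
    return conv2d(rgbd, kernels).argmax(axis=-1)

# Hypothetical usage on a random 8x8 'image' with two classes.
rng = np.random.default_rng(0)
rgbd = fuse_rgbd(rng.random((8, 8, 3)), rng.random((8, 8)))
labels = segment(rgbd, rng.standard_normal((3, 3, 4, 2)))  # (8, 8) label map
```

In practice a trained deep network would replace the random kernels, but the shape of the pipeline (fuse channels, convolve, classify per pixel) is the point of the sketch.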
Period of Award:
1 Oct 2018 - 30 Sep 2022
Value:
£89,114
Authorised funds only
NERC Reference:
NE/R007888/1
Grant Stage:
Completed
Scheme:
DTG - directed
Grant Status:
Closed
Programme:
Industrial CASE

This training grant award has a total value of £89,114  



FDAB - Financial Details (Award breakdown by headings)

Total - Fees: £17,480
Total - RTSG: £11,000
Total - Student Stipend: £60,637
