Melbourne Water UAV Machine Learning for Vegetation

Using artificial intelligence to automatically detect vegetation from UAV imagery in Melbourne Water’s retarding basins.   

The Challenge

Melbourne Water manages more than 200 retarding basins (a number that increases annually) across Greater Melbourne. These basins are designed to reduce flooding by holding heavy rainfall in reserved low-lying areas of land. Retarding basins have minimum design requirements, so it is critical that newly acquired basins meet those standards and that the health of existing basins is monitored.

Currently, Melbourne Water uses a mix of random sampling and field work to determine whether retarding basins meet design standards. The recent development of internal Melbourne Water Unmanned Aerial Vehicle (UAV) acquisition capabilities provides a unique opportunity to develop a more comprehensive and cost-effective means of monitoring the vegetation in retarding basins. The aim of this project was to develop a prototype algorithm that Melbourne Water could use to automatically detect vegetation presence, species, and health from UAV imagery over their retarding basins.


Melbourne Water was the primary partner in this project, and FrontierSI was supported by Player Piano Data Analytics.

The Solution

FrontierSI had previously prototyped a deep learning tree detection and counting algorithm within the Victorian Department of Environment, Land, Water and Planning's (DELWP) Coordinated Imaging Program. This algorithm, originally developed for wide-area mapping, could be naturally extended to species identification and vegetation detection using UAV imagery over Melbourne Water's retarding basins. A cross-functional, collaborative team with a breadth of expertise and capability was essential to the project's success.

Following a UAV inspection of a wetland, images are combined into an orthomosaic providing a high-resolution representation of the entire wetland. Assessing vegetation in a wetland presents many challenges: variability is influenced by plant maturity, water levels, lighting and species, and vegetation can be sparse or densely planted, requiring the ability to assess both individual plants and clusters of vegetation. To address these challenges, image filters are applied to match features such as colour and texture within the UAV-acquired imagery.
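The case study does not name the specific filters used, but a common colour-based filter for separating vegetation from soil and water in RGB imagery is the Excess Green (ExG) index. The sketch below is illustrative only; the function name and threshold are assumptions, not Melbourne Water's implementation.

```python
import numpy as np

def excess_green_mask(rgb, threshold=0.1):
    """Illustrative colour filter: Excess Green (ExG) index.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    Returns a boolean mask flagging likely-vegetation pixels.
    """
    # Normalise each pixel so chromaticity, not brightness, drives the index.
    total = rgb.sum(axis=2, keepdims=True) + 1e-6
    r, g, b = np.moveaxis(rgb / total, 2, 0)
    exg = 2 * g - r - b  # high where green dominates red and blue
    return exg > threshold

# Tiny 2x2 "image": green, grey, red, and dull-green pixels.
img = np.array([[[0.1, 0.8, 0.1], [0.5, 0.5, 0.5]],
                [[0.9, 0.1, 0.1], [0.2, 0.6, 0.2]]])
mask = excess_green_mask(img)  # flags the two green-dominant pixels
```

In practice a single colour index is rarely sufficient; texture measures and learned features would be combined with it, as the paragraph above suggests.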

In developing the tool, Melbourne Water and FrontierSI used open-source libraries to gain a greater level of configuration and customisation than common cloud-based capabilities currently offer. Free of any licence costs, the tool can be scaled quickly and inexpensively. Importantly, this makes it possible to apply the technology in-field, allowing analysis to potentially be performed in real time in the future. The tool outputs statistics and a heat map that identify areas where plant count or area does, or does not, meet design targets.
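One simple way to produce the kind of heat-map output described above is to grid the wetland's vegetation mask into cells and compare per-cell coverage against a design target. This is a minimal sketch under assumed names and a hypothetical target value, not the project's actual tool.

```python
import numpy as np

def coverage_heat_map(veg_mask, cell=4, target=0.5):
    """Sketch of a per-cell coverage heat map (names and target are assumed).

    veg_mask: boolean (H, W) vegetation-presence mask, with H and W
    divisible by `cell`. Returns (coverage, meets_target): the fraction
    of vegetated pixels in each grid cell, and a boolean grid marking
    cells that meet the hypothetical design target.
    """
    h, w = veg_mask.shape
    # Split the mask into cell x cell tiles and average within each tile.
    grid = veg_mask.reshape(h // cell, cell, w // cell, cell)
    coverage = grid.mean(axis=(1, 3))
    return coverage, coverage >= target

# Example: vegetation only in the top-left quadrant of an 8x8 mask.
mask = np.zeros((8, 8), dtype=bool)
mask[:4, :4] = True
cov, ok = coverage_heat_map(mask)  # only the top-left cell meets the target
```

The `coverage` array can be rendered directly as a heat map, and the `ok` grid summarised into the pass/fail statistics the tool reports.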


By using a machine learning approach to assess vegetation presence/absence and health from UAV imagery, Melbourne Water can ensure issues within the retarding basins are detected and managed early in a consistent and repeatable manner. This is a more comprehensive, cost-effective, and efficient means of monitoring the vegetation in wetlands than the previous manual counting methods.

Melbourne Water incorporated the tool into the handover/commissioning stage of their Developer Services Team for all newly constructed wetlands. Over the following 12 months, the tool ran alongside existing processes to establish confidence among vegetation specialists and provide the opportunity to enhance the training data (for instance, with new species). In time, the vision is to make the toolset available to those constructing the wetlands, providing immediate feedback prior to handover and essentially becoming a self-assessment tool.


To learn more, contact FrontierSI or connect directly with Chief Innovation and Delivery Officer, Phil Delaney.