UAVs and Machine Learning Revolutionising Invasive Grass and Vegetation Surveys in Remote Arid Lands
(c) 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
The monitoring of invasive grasses and vegetation in remote areas is challenging, costly, and sometimes dangerous on the ground. Satellite and manned-aircraft surveys can assist, but their use may be limited by ground sampling resolution or cloud cover. Straightforward and accurate surveillance methods are needed to quantify rates of grass invasion, produce appropriate vegetation-tracking reports, and apply optimal control methods. This paper presents a pipeline to detect and generate a pixel-wise segmentation of invasive grasses, using buffel grass (Cenchrus ciliaris) and spinifex (Triodia sp.) as examples. The pipeline integrates unmanned aerial vehicles (UAVs), also commonly known as drones, high-resolution red, green, blue (RGB) cameras, and a data processing approach based on machine learning algorithms. The methods are illustrated with data acquired in Cape Range National Park, Western Australia (WA), Australia, orthorectified in Agisoft Photoscan Pro, and processed in the Python programming language with the scikit-learn and eXtreme Gradient Boosting (XGBoost) libraries. In total, 342,626 samples were extracted from the data set and labelled into six classes. Segmentation results provided individual detection rates of 97% for buffel grass and 96% for spinifex, with a global multiclass pixel-wise detection rate of 97%. The results were robust to illumination changes, object rotation, occlusion, background clutter, and variation in floral density.
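The classification stage described above (labelled pixel samples fed to a gradient-boosted multiclass classifier) can be sketched roughly as follows. This is a minimal illustration on synthetic RGB pixel samples, not the authors' code: the class colours, sample counts, and model parameters are all hypothetical, and scikit-learn's GradientBoostingClassifier stands in for XGBoost so the sketch stays self-contained.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical mean RGB values for three of the six classes used in the paper.
class_means = {
    0: (120, 150, 60),   # e.g. buffel grass (illustrative colour)
    1: (90, 110, 50),    # e.g. spinifex (illustrative colour)
    2: (170, 140, 110),  # e.g. bare soil (illustrative colour)
}

# Generate synthetic labelled pixel samples around each class mean.
X_parts, y_parts = [], []
for label, mean in class_means.items():
    samples = rng.normal(mean, 10.0, size=(300, 3))
    X_parts.append(samples)
    y_parts.append(np.full(300, label))
X = np.clip(np.vstack(X_parts), 0, 255)
y = np.concatenate(y_parts)

# Hold out a test split, train, and report pixel-wise accuracy.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
clf = GradientBoostingClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"pixel-wise accuracy: {accuracy:.2f}")
```

In practice, per-pixel RGB features would come from the orthorectified mosaic, and a trained model of this kind would be applied to every pixel to produce the segmentation map.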
This work was funded by the Plant Biosecurity Cooperative Research Centre (PBCRC) 2164 project, the Agriculture Victoria Research and the Queensland University of Technology (QUT). The authors would like to acknowledge Derek Sandow and WA Parks and Wildlife Service for the logistic support and permits to access the survey areas at Cape Range National Park. The authors would also like to acknowledge Eduard Puig-Garcia for his contributions in co-planning the experimentation phase. The authors gratefully acknowledge the support of the QUT Research Engineering Facility (REF) Operations Team (Dirk Lessner, Dean Gilligan, Gavin Broadbent and Dmitry Bratanov), who operated the DJI S800 EVO UAV and image sensors, and performed ground referencing. We thank Gavin Broadbent for the design, manufacturing, and tuning of a two-axis gimbal for the camera. We also acknowledge the High-Performance Computing and Research Support Group at QUT, for the computational resources and services used in this work.
This is the final version of the article. Available from MDPI via the DOI in this record.
Vol. 18 (2), p. 605