Accurate acquisition of fruit tree canopy information is essential for precise variable-rate spraying in modern orchards. Acquiring canopy images with unmanned aerial vehicles (UAVs) is an effective nondestructive method, but lighting conditions in orchards are complex, which makes it difficult to quickly extract fruit tree canopies from UAV aerial images and use them to guide real-time, accurate pesticide application. We therefore propose a canopy segmentation method for fruit tree canopy images based on the SegNet network model. We verify the accuracy and real-time performance of the network on images acquired from modern orchards, and compare our model with the U-Net and FCN-8s models using four indicators: accuracy, precision, recall, and the F1 score (the harmonic mean of precision and recall). We then optimize the SegNet model in three respects: the input method, the network training parameters, and the neural network structure. The results show that SegNet segments the canopy satisfactorily: the optimized model achieves an average recognition accuracy of 95.30%, recognizes a single image in as little as 0.045 s, and is robust under both strong and weak illumination. This indicates that SegNet-based segmentation is a promising way to extract fruit tree canopy information, and it can serve as a reference for real-time, accurate UAV spraying.
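The four evaluation indicators named above can be computed pixel-wise from a binary canopy mask. The sketch below is a minimal illustration (not the authors' code); the function name and the toy 2x2 masks are assumptions for demonstration only.

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Pixel-wise accuracy, precision, recall, and F1 for a binary
    canopy mask. `pred` and `truth` are arrays where truthy values
    mark canopy pixels. (Illustrative helper, not from the paper.)"""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.sum(pred & truth)     # canopy pixels correctly detected
    tn = np.sum(~pred & ~truth)   # background correctly rejected
    fp = np.sum(pred & ~truth)    # background mislabeled as canopy
    fn = np.sum(~pred & truth)    # canopy pixels missed
    accuracy = (tp + tn) / pred.size
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # F1 is the harmonic mean of precision and recall
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

# Toy 2x2 masks: one false positive at position (1, 0)
pred = [[1, 0], [1, 1]]
truth = [[1, 0], [0, 1]]
print(segmentation_metrics(pred, truth))  # (0.75, 0.666..., 1.0, 0.8)
```

In this toy case the predicted mask covers all true canopy pixels (recall 1.0) but includes one spurious pixel, which lowers precision to 2/3 and F1 to 0.8.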