Perception plays a significant role in agricultural robots. If a robot fails to detect a target in the perception step, it will not act on that target even if its control and manipulation systems are highly effective. A robotic cotton harvester was tested in the field to evaluate the performance of its perception system. A ZED 2i stereo camera, in conjunction with YOLOv4-tiny, was used to detect and localize cotton bolls. To train the object detection network, image data were gathered in two steps. In the second step, a black background panel placed behind the target row excluded cotton bolls from other rows in the images, which also improved object detection performance. The robot detected 78% of the cotton bolls on the plant and localized 70% of the detected bolls. The mean absolute localization error on the X, Y, and Z axes of the camera's coordinate system was 5.8, 5.2, and 8.1 mm, respectively.
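The per-axis precision figures reported above are mean absolute errors between the localized and true boll positions. A minimal sketch of that metric, using made-up sample coordinates rather than the study's data, might look like:

```python
import numpy as np

# Hypothetical sample data: localized vs. ground-truth boll positions (mm)
# in the camera's coordinate system. Values are illustrative only.
predicted = np.array([
    [102.0, 210.3, 598.1],
    [ 95.4, 198.7, 612.0],
    [110.2, 205.5, 590.4],
])
ground_truth = np.array([
    [ 98.0, 204.0, 605.0],
    [100.0, 193.0, 600.0],
    [104.0, 211.0, 598.0],
])

# Mean absolute error per axis (X, Y, Z), averaged over localized bolls.
mae = np.mean(np.abs(predicted - ground_truth), axis=0)
print(mae)  # one MAE value per axis, in mm
```

In the study, the same computation would be applied to the bolls the system both detected and localized, yielding the reported 5.8, 5.2, and 8.1 mm errors.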
Unlike above-canopy imagery, under-canopy imagery is rarely used in agricultural research, yet it can capture plant information that above-canopy images may not, such as fruiting behavior and early nutrient deficiencies or diseases. In this research, under-canopy images of seven cotton varieties were collected with the goal of classifying them by variety. An RGB FLIR camera was mounted on a remotely controlled ground robot close to the ground, looking upward. A VGG16 deep learning network was used to extract features, and a softmax classifier in the network was trained to classify the images. The VGG16 network classified the seven cotton varieties with 86% accuracy. Judged by the area under the curve (AUC) of the receiver operating characteristic (ROC), images of varieties CVTI-108, 110, 114, and 120 were classified most accurately (AUC = 1.0), while images of variety CVT-115 were classified least accurately (AUC = 0.93).
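The classification head described above maps a VGG16 feature vector to a probability over the seven varieties via softmax. A minimal NumPy sketch of that step, with random placeholder features and weights standing in for the study's trained model, could be:

```python
import numpy as np

rng = np.random.default_rng(0)

n_classes = 7       # seven cotton varieties
n_features = 4096   # length of a VGG16 fully connected feature vector

# Placeholder feature vector, as VGG16's feature extractor might emit
# for one under-canopy image (illustrative values only).
features = rng.standard_normal(n_features)

# Placeholder classifier parameters; in the study these would be trained
# on the labeled under-canopy images.
W = rng.standard_normal((n_classes, n_features)) * 0.01
b = np.zeros(n_classes)

def softmax(z):
    """Numerically stable softmax over class logits."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Class probabilities and the predicted variety index.
probs = softmax(W @ features + b)
predicted_variety = int(np.argmax(probs))
print(probs, predicted_variety)
```

The ROC/AUC figures reported above would then be computed per variety from such probability outputs over the test images.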