Corn Counting Using Unmanned Aircraft Systems and Convolutional Neural Networks

Research poster
Sandeep Venkatesh
Sami Khanal
Department of Food, Agricultural and Biological Engineering

Understanding the spatial and temporal variability of crop emergence and plant density within a field is critical for improving crop productivity while minimizing the input costs and environmental impacts of agricultural practices. Uniformity of crop emergence can inform mid-season management decisions on variable-rate fertilizer, herbicide, and pesticide applications in precision agriculture. The current approach to crop counting relies on visual inspection by crop scouts, which is labor-intensive and time-consuming, particularly for large fields. Advances in technologies including unmanned aerial systems (UAS), image processing, and deep learning can help address this concern. The objective of this work is to demonstrate the automation and application of a convolutional neural network (CNN), a deep learning (DL) technique, to quantify corn population using high-resolution visible imagery collected with a low-cost UAS. The UAS data were collected at Snyder Farm in Wooster, Ohio, at a 30-meter altitude using a visible-band camera when the corn was at the V4 growth stage. The high-resolution visible images were cropped into 40×60-pixel boxes in a sliding-window manner, and each box was labeled as either a corn or a soil image. A CNN model was then trained on a dataset of 5,000 corn and 5,000 soil (non-corn) images using TensorFlow in Python. The accuracy of the developed CNN model for corn plant counting was 95%. The developed methodology provides a time- and cost-effective approach for monitoring crop emergence and can be extended to other crops.
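The sliding-window cropping step described above can be sketched as follows. This is a minimal illustration, not the authors' code: the 40×60-pixel window size comes from the abstract, but the stride values, the `toy_classifier` stand-in, and all function names are assumptions for demonstration (the trained CNN would take the classifier's place).

```python
import numpy as np

def sliding_window_crops(image, win_h=40, win_w=60, stride_h=40, stride_w=60):
    """Yield (row, col, crop) tiles from an H x W x C image array.

    The 40 x 60 px window follows the poster; the non-overlapping
    stride is an assumption, as the poster does not state the overlap.
    """
    h, w = image.shape[:2]
    for r in range(0, h - win_h + 1, stride_h):
        for c in range(0, w - win_w + 1, stride_w):
            yield r, c, image[r:r + win_h, c:c + win_w]

def count_corn(image, classify):
    """Count windows the classifier labels as corn (1 = corn, 0 = soil)."""
    return sum(classify(crop) for _, _, crop in sliding_window_crops(image))

# Toy stand-in for the trained CNN: call a crop "corn" if its mean green
# channel dominates the red and blue channels.
def toy_classifier(crop):
    r, g, b = crop[..., 0].mean(), crop[..., 1].mean(), crop[..., 2].mean()
    return int(g > r and g > b)

img = np.zeros((120, 180, 3), dtype=np.uint8)  # a 3 x 3 grid of 40x60 tiles
img[:40, :60, 1] = 255                         # make one tile "green"
print(count_corn(img, toy_classifier))         # -> 1
```

In practice the stride would likely be smaller than the window so adjacent plants near tile borders are not missed, at the cost of classifying overlapping crops more than once.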
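A binary corn/soil classifier of the kind described could be built in TensorFlow roughly as below. The abstract does not report the network architecture, so every layer size and hyperparameter here is an illustrative assumption rather than the authors' model; only the 40×60×3 input shape and the TensorFlow/Python tooling come from the poster.

```python
import tensorflow as tf

def build_corn_classifier(input_shape=(40, 60, 3)):
    """A minimal binary CNN for 40x60 RGB crops (corn vs. soil).

    Layer sizes are assumptions for illustration, not the poster's model.
    """
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Rescaling(1.0 / 255),            # normalize pixels
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # P(corn)
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_corn_classifier()
```

Training would call `model.fit` on the labeled 40×60 corn and soil crops; at inference, each sliding-window tile is scored and tiles above a probability threshold are counted as corn.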