Assistant professors So Ra Baek (urban planning) and Martha Bohm (architecture) join UB mathematics associate professor John Ringland in developing tools to characterize food cultivation practices along roadside transects as a potential complement to traditional remote sensing approaches.
They report on two software tools for crop identification using a deep convolutional neural network (CNN) applied to Google Street View imagery. The first, a multi-class classifier, distinguishes seven regionally common cultivated plant species, as well as uncultivated vegetation, built environment, and water along the roads. The second, a prototype specialist detector, recognizes the presence of a single plant species: in this case, banana. Both classification tools were tested along roadside transects in two areas of Thailand, a country with good Google Street View coverage.
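The multi-class classification step can be illustrated with a minimal sketch of how raw CNN outputs (logits) for one Street View image become a class label and a confidence score. This is not the study's code: the logits are made up, and only five of the seven species are named in this summary, so two class names below are placeholders.

```python
import math

# Class list for the ten-way classifier described above. Five species
# names come from the text; "species_6" and "species_7" are placeholders
# for the two species this summary does not name.
CLASSES = ["banana", "cassava", "maize", "rice", "sugarcane",
           "species_6", "species_7",
           "uncultivated vegetation", "built environment", "water"]

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable form)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return (predicted class, confidence) for one image's logits."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return CLASSES[best], probs[best]

# Made-up logits for a single image, for illustration only.
label, confidence = classify([4.1, 0.3, 0.2, 1.0, 0.5,
                              0.1, 0.0, 1.2, 0.4, 0.2])
```

The confidence score is what makes it possible to rank images by how sure the classifier is, which is how a "most confident 40%" subset (discussed below) can be selected.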
On the entire test set, the overall accuracy of the multi-class classifier was 83.3%. For several classes (banana, built, cassava, maize, rice, and sugarcane), the producer's accuracy was over 90%, meaning that the classifier rarely made omission errors for those classes. This performance on roadside transects is comparable with that of some remote-sensing classifiers, yet requires no additional site visits for ground-truthing. Moreover, on the 40% of images the classifier is most confident about, its overall accuracy is excellent: 99.0%. For the prototype specialist detector, the area under the ROC curve was 0.9905, indicating excellent performance in detecting the presence of banana plants.
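The metrics quoted here are standard and easy to compute. The sketch below shows how, using a small made-up three-class confusion matrix (not the study's actual data): producer's accuracy is per-class recall (the complement of the omission error rate), overall accuracy is the diagonal fraction of the confusion matrix, and ROC AUC can be computed via the Mann-Whitney formulation as the probability that a random positive scores above a random negative.

```python
# Illustrative confusion matrix (invented numbers, NOT from the study).
# Rows = reference (ground-truth) class, columns = predicted class.
CONFUSION = {
    "banana": {"banana": 95, "built": 3,  "rice": 2},
    "built":  {"banana": 4,  "built": 92, "rice": 4},
    "rice":   {"banana": 1,  "built": 5,  "rice": 94},
}

def producer_accuracy(cm, cls):
    """Fraction of reference samples of `cls` labelled as `cls`
    (i.e. 1 minus the omission error rate for that class)."""
    row = cm[cls]
    return row[cls] / sum(row.values())

def overall_accuracy(cm):
    """Fraction of all samples falling on the matrix diagonal."""
    correct = sum(cm[c][c] for c in cm)
    total = sum(sum(row.values()) for row in cm.values())
    return correct / total

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic:
    P(random positive outscores random negative), ties counting half."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))
```

With this toy matrix, `producer_accuracy(CONFUSION, "banana")` is 0.95 and `overall_accuracy(CONFUSION)` is 281/300; the reported 0.9905 AUC for the banana detector would come from `roc_auc` applied to the detector's scores on banana and non-banana images.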
While initially tested over the road network in a small area, this technique could readily be deployed on a regional or even national scale to supplement remote sensing data and yield a fine-grained analysis of food cultivation activities along roadside transects.