Overview
In this work, we present a new dataset to advance the state of the art in fruit detection, segmentation, and counting in orchard environments. While there has been significant recent interest in solving these problems, the lack of a unified dataset has made it difficult to compare results. We hope to enable direct comparisons by providing a large variety of high-resolution images acquired in orchards, together with human annotations of the fruit on trees. The fruits are labeled with a polygonal mask for each object instance to support precise object detection, localization, and segmentation. Additionally, we provide data for patch-based counting of clustered fruits. Our dataset contains over 41,000 annotated object instances in 1,000 images. We present a detailed overview of the dataset together with baseline performance analysis for bounding box detection, segmentation, and fruit counting, as well as representative results for yield estimation.
Data download
The data can be downloaded from the Data Repository for the University of Minnesota (DRUM).
Code examples
The code repository, with examples of how to load the data and extract bounding boxes and segmentation masks, is available on GitHub.
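As a quick illustration, here is a minimal sketch (not the repository's own loader) of reading an instance mask and deriving per-instance bounding boxes. It assumes the masks are PNG images in which background pixels are 0 and each fruit instance is encoded by a unique positive pixel value; the file path is a placeholder.

```python
import numpy as np
from PIL import Image

def extract_instances(mask_path):
    """Return per-instance binary masks and bounding boxes from a mask image.

    Assumes a PNG in which background pixels are 0 and each fruit
    instance is encoded by a unique positive pixel value.
    """
    mask = np.array(Image.open(mask_path))
    instance_ids = np.unique(mask)
    instance_ids = instance_ids[instance_ids != 0]  # drop the background

    boxes, masks = [], []
    for inst_id in instance_ids:
        binary = mask == inst_id
        ys, xs = np.nonzero(binary)
        # Bounding box in (x_min, y_min, x_max, y_max) pixel coordinates.
        boxes.append((xs.min(), ys.min(), xs.max(), ys.max()))
        masks.append(binary)
    return boxes, masks

# Hypothetical path; point this at any mask file from the dataset.
boxes, masks = extract_instances("path/to/mask.png")
print(f"{len(boxes)} annotated instances")
```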
Evaluate your results
We provide CodaLab competition servers where you can upload your results and compare them against other methods (see the example submission sketch after the list below).
- Access the competition server for fruit detection
- Access the competition server for fruit segmentation
- Access the competition server for fruit counting
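The expected submission format is specified on each competition page. Purely as an illustration, the sketch below serializes detections as a COCO-style results JSON, a common convention for detection benchmarks; the field names and values here are assumptions, not the server's confirmed schema.

```python
import json

# Hypothetical detection results: one dict per predicted box.
# Field names follow the common COCO results convention; check the
# competition page for the server's actual expected schema.
detections = [
    {
        "image_id": 1,
        "category_id": 1,                   # single class: apple
        "bbox": [120.0, 80.0, 32.0, 40.0],  # [x, y, width, height]
        "score": 0.91,
    },
]

with open("detections.json", "w") as f:
    json.dump(detections, f)
```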
References
If you find this dataset or the associated codebase useful for your research, consider citing:
[1] N. Häni, P. Roy, and V. Isler, “MinneApple: A Benchmark Dataset for Apple Detection and Segmentation,” arXiv:1909.06441 [cs], Sep. 2019.
[2] N. Häni, P. Roy, and V. Isler, “A comparative study of fruit detection and counting methods for yield mapping in apple orchards,” Journal of Field Robotics, Aug. 2019.
[3] N. Häni, P. Roy, and V. Isler, “Apple Counting using Convolutional Neural Networks,” in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 2018, pp. 2559–2565.