{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,18]],"date-time":"2026-03-18T02:05:46Z","timestamp":1773799546266,"version":"3.50.1"},"reference-count":34,"publisher":"MDPI AG","issue":"15","license":[{"start":{"date-parts":[[2023,7,27]],"date-time":"2023-07-27T00:00:00Z","timestamp":1690416000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"National Natural Science Foundation of China","award":["32201665"],"award-info":[{"award-number":["32201665"]}]},{"name":"Key R&amp;D Program of China","award":["2022YFD2001801-3"],"award-info":[{"award-number":["2022YFD2001801-3"]}]},{"name":"Natural Science Foundation of Anhui","award":["2108085MC96"],"award-info":[{"award-number":["2108085MC96"]}]},{"name":"Key R&amp;D Program of Anhui","award":["202004a06020016"],"award-info":[{"award-number":["202004a06020016"]}]},{"name":"Anhui Province New Energy Vehicle and Intelligent Connected Automobile Industry Technology Innovation Project"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>The identification of the growth and development period of rice is of great significance for achieving high-yield, high-quality rice. However, the acquisition of rice growth period information relies mainly on manual observation, which suffers from low efficiency and strong subjectivity. To solve these problems, a lightweight recognition method, Small-YOLOv5, based on an improved YOLOv5s, is proposed to automatically identify the growth period of rice. 
Firstly, the new backbone feature extraction network MobileNetV3 was used to replace the YOLOv5s backbone network to reduce the model size and the number of model parameters, thus improving the detection speed of the model. Secondly, in the feature fusion stage of YOLOv5s, we introduced a more lightweight convolution method, GSConv, to replace the standard convolution. The computational cost of GSConv is about 60\u201370% of that of the standard convolution, but its contribution to the model's learning ability is no less than that of the standard convolution. Based on GSConv, we built a lightweight neck network to reduce the complexity of the network model while maintaining accuracy. To verify the performance of Small-YOLOv5, we tested it on a self-built dataset of rice growth periods. The results show that, compared with YOLOv5s (5.0) on the self-built dataset, the number of model parameters was reduced by 82.4%, GFLOPs decreased by 85.9%, and the model volume was reduced by 86.0%. The mAP (0.5) value of the improved model was 98.7%, only 0.8% lower than that of the original YOLOv5s model. Compared with the mainstream lightweight model YOLOv5s-MobileNetV3-Small, the number of model parameters was decreased by 10.0%, the model volume was reduced by 9.6%, the mAP (0.5:0.95) improved by 5.0%, reaching 94.7%, and the recall improved by 1.5%, reaching 98.9%. 
Based on experimental comparisons, the effectiveness and superiority of the model have been verified.<\/jats:p>","DOI":"10.3390\/s23156738","type":"journal-article","created":{"date-parts":[[2023,7,28]],"date-time":"2023-07-28T02:10:45Z","timestamp":1690510245000},"page":"6738","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":14,"title":["A Lightweight Recognition Method for Rice Growth Period Based on Improved YOLOv5s"],"prefix":"10.3390","volume":"23","author":[{"given":"Kaixuan","family":"Liu","sequence":"first","affiliation":[{"name":"College of Engineering, Anhui Agricultural University, Hefei 230036, China"}]},{"given":"Jie","family":"Wang","sequence":"additional","affiliation":[{"name":"Anhui Provincial Rural Comprehensive Economic Information Center, Hefei 230031, China"}]},{"given":"Kai","family":"Zhang","sequence":"additional","affiliation":[{"name":"College of Engineering, Anhui Agricultural University, Hefei 230036, China"}]},{"given":"Minhui","family":"Chen","sequence":"additional","affiliation":[{"name":"College of Engineering, Anhui Agricultural University, Hefei 230036, China"}]},{"given":"Haonan","family":"Zhao","sequence":"additional","affiliation":[{"name":"College of Engineering, Anhui Agricultural University, Hefei 230036, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0696-9168","authenticated-orcid":false,"given":"Juan","family":"Liao","sequence":"additional","affiliation":[{"name":"College of Engineering, Anhui Agricultural University, Hefei 230036, China"},{"name":"Hefei Institute of Technology Innovation Engineering, Chinese Academy of Sciences, Hefei 230094, China"}]}],"member":"1968","published-online":{"date-parts":[[2023,7,27]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"50358","DOI":"10.1109\/ACCESS.2021.3069449","article-title":"Towards Paddy Rice Smart Farming: A Review on Big Data, Machine Learning and Rice Production 
Tasks","volume":"9","author":"Alfred","year":"2021","journal-title":"IEEE Access"},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Jiang, X., Fang, S., Huang, X., Liu, Y., and Guo, L. (2021). Rice Mapping and Growth Monitoring Based on Time Series GF-6 Images and Red-Edge Bands. Remote Sens., 13.","DOI":"10.3390\/rs13040579"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"65","DOI":"10.1016\/j.agrformet.2013.02.011","article-title":"Automatic image-based detection technology for two critical growth stages of maize: Emergence and three-leaf stage","volume":"174","author":"Yu","year":"2013","journal-title":"Agric. For. Meteorol."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"2509","DOI":"10.1109\/JSTARS.2016.2547843","article-title":"Paddy-rice phenology classification based on machine-learning methods using multitemporal co-polar X-band SAR images","volume":"9","author":"Erten","year":"2016","journal-title":"IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens."},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Zheng, J., Song, X., Yang, G., Du, X., Mei, X., and Yang, X. (2022). Remote sensing monitoring of rice and wheat canopy nitrogen: A review. Remote Sens., 14.","DOI":"10.3390\/rs14225712"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Liu, S., Peng, D., Zhang, B., Chen, Z., Yu, L., Chen, J., Pan, Y., Zheng, S., Hu, J., and Lou, Z. (2022). The Accuracy of Winter Wheat Identification at Different Growth Stages Using Remote Sensing. Remote Sens., 14.","DOI":"10.3390\/rs14040893"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Sapkota, B., Singh, V., Neely, C., Rajan, N., and Bagavathiannan, M. (2020). Detection of Italian ryegrass in wheat and prediction of competitive interactions using remote-sensing and machine-learning techniques. 
Remote Sens., 12.","DOI":"10.3390\/rs12182977"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"340","DOI":"10.1016\/j.compag.2018.07.026","article-title":"Developing an integrated indicator for monitoring maize growth condition using remotely sensed vegetation temperature condition index and leaf area index","volume":"152","author":"Wang","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Ji, Z., Pan, Y., Zhu, X., Wang, J., and Li, Q. (2021). Prediction of Crop Yield Using Phenological Information Extracted from Remote Sensing Vegetation Index. Sensors, 21.","DOI":"10.3390\/s21041406"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"16","DOI":"10.1177\/1063293X21988944","article-title":"Smart paddy field monitoring system using deep learning and IoT","volume":"29","author":"Sethy","year":"2021","journal-title":"Concurr. Eng."},{"key":"ref_11","first-page":"300","article-title":"A survey of high resolution image processing techniques for cereal crop growth monitoring","volume":"9","author":"Rasti","year":"2022","journal-title":"Inf. Process. Agric."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"260","DOI":"10.1016\/j.agrformet.2018.05.001","article-title":"Rice heading stage automatic observation by multi-classifier cascade based rice spike detection method","volume":"259","author":"Bai","year":"2018","journal-title":"Agric. For. Meteorol."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"86843","DOI":"10.1109\/ACCESS.2021.3089670","article-title":"Automatic identification algorithm of the rice tiller period based on PCA and SVM","volume":"9","author":"Zhang","year":"2021","journal-title":"IEEE Access"},{"key":"ref_14","unstructured":"Keller, K., Kirchgessner, N., Khanna, R., Siegwart, R., Walter, A., and Aasen, H. (2018, September 3\u20136). Soybean leaf coverage estimation with machine learning and thresholding algorithms for field phenotyping. 
Proceedings of the British Machine Vision Conference, Newcastle, UK."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"104978","DOI":"10.1016\/j.compag.2019.104978","article-title":"Deep learning-based automatic recognition network of agricultural machinery images","volume":"166","author":"Zhang","year":"2019","journal-title":"Comput. Electron. Agric."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"1733","DOI":"10.1007\/s00521-020-05064-6","article-title":"Crop growth stage estimation prior to canopy closure using deep learning algorithms","volume":"33","author":"Rasti","year":"2021","journal-title":"Neural Comput. Appl."},{"key":"ref_17","first-page":"219","article-title":"Recognition of cotton growth period for precise spraying based on convolution neural network","volume":"8","author":"Wang","year":"2021","journal-title":"Inf. Process. Agric."},{"key":"ref_18","first-page":"99","article-title":"Strawberry Growth Period Recognition Method Under Greenhouse Environment Based on Improved YOLOv4","volume":"3","author":"Jiehua","year":"2021","journal-title":"Smart Agric."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"417","DOI":"10.1016\/j.compag.2019.01.012","article-title":"Apple detection during different growth stages in orchards using the improved YOLO-V3 model","volume":"157","author":"Tian","year":"2019","journal-title":"Comput. Electron. Agric."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"3895","DOI":"10.1007\/s00521-021-06651-x","article-title":"A fast accurate fine-grain object detection model based on YOLOv4 deep neural network","volume":"34","author":"Roy","year":"2022","journal-title":"Neural Comput. Appl."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Ahmed, K.R. (2021). Smart Pothole Detection Using Deep Learning Based on Dilated Convolution. 
Sensors, 21.","DOI":"10.3390\/s21248406"},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"107757","DOI":"10.1016\/j.compag.2023.107757","article-title":"Detection of tomato plant phenotyping traits using YOLOv5-based single stage detectors","volume":"207","author":"Cardellicchio","year":"2023","journal-title":"Comput. Electron. Agric."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"7","DOI":"10.1186\/s13007-015-0047-9","article-title":"Automated characterization of flowering dynamics in rice using field-acquired time-series RGB images","volume":"11","author":"Guo","year":"2015","journal-title":"Plant Methods"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Hong, S., Jiang, Z., Liu, L., Wang, J., Zhou, L., and Xu, J. (2022). Improved Mask R-CNN Combined with Otsu Preprocessing for Rice Panicle Detection and Segmentation. Appl. Sci., 12.","DOI":"10.3390\/app122211701"},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"106217","DOI":"10.1016\/j.engappai.2023.106217","article-title":"Lightweight object detection algorithm for robots with improved YOLOv5","volume":"123","author":"Liu","year":"2023","journal-title":"Eng. Appl. Artif. Intell."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"15523","DOI":"10.1038\/s41598-022-19674-8","article-title":"Road damage detection algorithm for improved YOLOv5","volume":"12","author":"Guo","year":"2022","journal-title":"Sci. Rep."},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Howard, A., Sandler, M., Chu, G., Chen, L.-C., Chen, B., Tan, M., Wang, W., Zhu, Y., Pang, R., and Vasudevan, V. (2019, October 27\u2013November 2). Searching for MobileNetV3. Proceedings of the 2019 IEEE\/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea.","DOI":"10.1109\/ICCV.2019.00140"},{"key":"ref_28","unstructured":"Li, H. (2022). Slim-neck by GSConv: A better design paradigm of detector architectures for autonomous vehicles. 
arXiv."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Hu, J., Shen, L., Albanie, S., Sun, G., and Wu, E. (2018, January 18\u201323). Squeeze-and-Excitation Networks. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA.","DOI":"10.1109\/CVPR.2018.00745"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Chollet, F. (2017, January 21\u201326). Xception: Deep Learning with Depthwise Separable Convolutions. Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA.","DOI":"10.1109\/CVPR.2017.195"},{"key":"ref_31","unstructured":"Howard, A.G. (2017). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv."},{"key":"ref_32","unstructured":"Ramachandran, P. (2017). Searching for activation functions. arXiv."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., and Chen, L.-C. (2018, January 18\u201323). MobileNetV2: Inverted Residuals and Linear Bottlenecks. Proceedings of the 2018 IEEE\/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA.","DOI":"10.1109\/CVPR.2018.00474"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Huang, G., Liu, Z., Van Der Maaten, L., and Weinberger, K.Q. (2017, January 21\u201326). Densely Connected Convolutional Networks. 
Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA.","DOI":"10.1109\/CVPR.2017.243"}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/15\/6738\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T20:21:03Z","timestamp":1760127663000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/15\/6738"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,7,27]]},"references-count":34,"journal-issue":{"issue":"15","published-online":{"date-parts":[[2023,8]]}},"alternative-id":["s23156738"],"URL":"https:\/\/doi.org\/10.3390\/s23156738","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,7,27]]}}}