Real-Time Navigation Line Extraction for Cabbage Harvesting Robots in Complex Farmland Environments: A Multi-Mask Fusion and Dynamic Row-Tracking Approach

Authors

  • Jing Zhang
  • Yuan Ma
  • Yizhuo Fan
  • Nan Su
  • Xun He
  • Hongmei Zhang

DOI:

https://doi.org/10.6919/ICJE.202506_11(6).0033

Keywords:

Cabbage; Crop Row Detection; Navigation Line Extraction; Computer Vision; Agricultural Automation.

Abstract

As an economically important leafy vegetable crop in China, cabbage faces harvesting challenges including low manual operation efficiency, high labor intensity, and high labor cost. However, existing navigation technologies for harvesting robots fall short in accuracy and in adaptability to complex environments. This research presents an advanced navigation line extraction framework tailored for automated cabbage harvesting, targeting the challenges posed by mature heading-stage crops in diverse field conditions. By introducing a multi-mask fusion segmentation strategy grounded in the HSV color model, the study overcomes the limitations of conventional grayscale methods, enabling precise separation of cabbage vegetation and plastic mulch while effectively suppressing interference from weeds, soil reflections, and plant yellowing. Three key algorithmic innovations are developed. First, an adaptive row-tracking algorithm incorporating dynamic historical midpoint adjustment and perspective-constrained hierarchical scanning achieves an average angular accuracy of 0.24° for central navigation lines across high- and low-light environments, with a 94% reduction in false edge detections compared to traditional sliding-window techniques. Second, a RANSAC-based robust fitting method reduces boundary line extraction time to 0.192 seconds per frame and centerline fitting time to less than 0.001 seconds, demonstrating superior performance in complex scenarios such as missing plants, curved rows, and heading angle deviations of up to 15°. Third, a vector bisector-based centerline extraction approach, built on geometric intersection modeling and direction vector synthesis, keeps navigation lines within 2 pixels of manual annotations.
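As a rough illustration of the multi-mask fusion idea, the sketch below builds a vegetation mask and a mulch mask from an HSV frame and fuses them. The paper's exact thresholds are not reproduced here; all ranges (and the OpenCV-style H ∈ [0, 180], S, V ∈ [0, 255] convention) are assumptions for demonstration only.

```python
import numpy as np

def fuse_masks(hsv):
    """Fuse an assumed 'cabbage vegetation' mask with an assumed
    'plastic mulch' mask from an HSV image (H in [0, 180], S and V
    in [0, 255]). Threshold values are illustrative, not the paper's."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    # Assumed hue/saturation/value range for cabbage foliage,
    # wide enough to tolerate some leaf yellowing.
    veg = (h >= 25) & (h <= 95) & (s >= 40) & (v >= 40)
    # Assumed range for dark plastic mulch: very low brightness.
    mulch = v <= 35
    # Fusion: keep vegetation pixels, suppress mulch and background.
    return veg & ~mulch

# Tiny synthetic frame: a green pixel, a dark mulch pixel, a soil pixel.
frame = np.array([[[60, 200, 180], [90, 10, 20], [15, 120, 150]]],
                 dtype=np.uint8)
mask = fuse_masks(frame)  # only the first pixel survives fusion
```

In a full pipeline the boolean mask would feed the row-scanning stage; the point of the fusion step is that vegetation and mulch are separated in one pass rather than by a single grayscale threshold.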
Experimental validation in Xiangcheng County, Henan Province, using the ‘Zhonggan 56’ cabbage variety confirms the framework's effectiveness, achieving an average precision of 92% (defined as the ratio of correctly detected rows to total annotated rows) across varied field conditions, including weedy plots and camera tilts. These findings establish a reliable foundation for automated cabbage harvesting systems, with potential applications extending to other leafy vegetable crops. Future research will focus on field deployment via edge computing, followed by algorithm optimization through parallel architectures, deep learning integration, and GIS technologies for large-scale agricultural management.
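The vector-bisector centerline idea can be sketched as: intersect the two fitted row-boundary lines, normalize their direction vectors, and take their sum as the bisector direction. The helper below is an illustrative reconstruction under those assumptions, not the paper's implementation; point/direction parameterization and names are my own.

```python
import numpy as np

def bisector_line(p1, d1, p2, d2):
    """Given two boundary lines, each as (point p, direction d),
    return their intersection and the unit angle-bisector direction.
    Assumes the lines are not parallel."""
    d1 = np.asarray(d1, float) / np.linalg.norm(d1)
    d2 = np.asarray(d2, float) / np.linalg.norm(d2)
    # Solve p1 + t*d1 = p2 + u*d2 for (t, u) via a 2x2 linear system.
    A = np.column_stack([d1, -d2])
    t, _ = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    apex = np.asarray(p1, float) + t * d1
    # Sum of unit direction vectors bisects the angle between them.
    bis = d1 + d2
    return apex, bis / np.linalg.norm(bis)

# Two boundaries symmetric about the y-axis, meeting at the origin:
# the bisector should point straight down the middle.
apex, direction = bisector_line((-1, 1), (1, -1), (1, 1), (-1, -1))
```

Because the directions are normalized before summation, the bisector is the true angle bisector regardless of how far apart the boundary segments are, which is what keeps the centerline stable when one row edge is noisier than the other.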

Published

2025-05-28

Section

Articles

How to Cite

Zhang, J., Ma, Y., Fan, Y., Su, N., He, X., & Zhang, H. (2025). Real-Time Navigation Line Extraction for Cabbage Harvesting Robots in Complex Farmland Environments: A Multi-Mask Fusion and Dynamic Row-Tracking Approach. International Core Journal of Engineering, 11(6), 291-304. https://doi.org/10.6919/ICJE.202506_11(6).0033