Article

Design and Experiment of an Agricultural Field Management Robot and Its Navigation Control System

Nanjing Institute of Agricultural Mechanization, Ministry of Agriculture and Rural Affairs, Nanjing 210014, China
*
Author to whom correspondence should be addressed.
Agronomy 2024, 14(4), 654; https://doi.org/10.3390/agronomy14040654
Submission received: 23 February 2024 / Revised: 14 March 2024 / Accepted: 19 March 2024 / Published: 23 March 2024
(This article belongs to the Collection Advances of Agricultural Robotics in Sustainable Agriculture 4.0)

Abstract

The application of robotics has great implications for future food security, sustainable agricultural development, improving resource efficiency, reducing chemical pesticide use, reducing manual labor, and maximizing field output. To address the high labor intensity and labor shortages associated with pesticide application, weeding, and field information collection, a multifunctional electric field management robot platform is designed. It offers four switchable steering modes (Ackermann steering, four-wheel steering, crab steering, and zero-radius steering), and its wheel-track can be adjusted automatically. Commonly used spraying booms, weeders, crop information collectors, and other devices can be easily installed on the robot platform. A multi-sensor integrated navigation system comprising a satellite positioning system, an RGB camera, and a multi-line lidar is designed to realize unmanned driving of the robot platform in complex field environments. Field tests showed that the robot can follow a set route, and tests under simulated conditions indicated that it can also dynamically correct its path along crop rows using the vision system. Results from multiple trials showed that the trajectory tracking accuracy meets the requirements of various field management operations.

1. Introduction

In many countries, field management tasks such as pesticide application and weeding require a large amount of labor, while the number of individuals engaged in agricultural labor is declining [1,2]. The application of autonomous robots in agriculture is becoming increasingly popular, thanks to their potentially significant impact on food security, sustainability, resource use efficiency, reduced chemical handling, and reduced manpower [3,4,5]. The rapid development of agricultural robotics focuses on field management, such as spraying, fertilization, crop phenotypic information collection, and selective harvesting. Research on field management robots is of great significance for alleviating labor shortages and reducing the use of chemical pesticides [6].
In the past decade, advances in sensors, control chips, artificial intelligence, and other technologies have accelerated the development of agricultural robots [7]. International research institutions and robot companies have continued to carry out interdisciplinary cooperative research. Robot technology has been introduced into the agricultural field and successfully used in unmanned field management [8]. The integrated multi-purpose farming robot platform BoniRob was developed as a joint project between the University of Osnabrueck, the DeepField Robotics start-up, Robert Bosch, and Amazone; it can be used for automatic weeding, crop phenotype detection, soil detection, etc. Its navigation system uses 3D lidar, an IMU, a GPS receiver, and other sensors to sense the working environment, and the platform can carry a payload of 150 kg [9,10]. Robots developed by other commercial companies include the ANATIS robot of Carré [11], the Robotti robot of AGROINTELLI [12], the Asterix robot of Adigo [13,14], and the Ecorobotix ARA (Ecorobotix, Yverdon-les-Bains, Switzerland), the first commercially available spot-sprayer, which allowed automated single-plant detection and control of broad-leaved dock in meadows [8]. The Naïo Technologies corporation developed four robot models: Oz, Ted, Bob, and Dino [15,16]. Various promising technologies for field management robots have been introduced and implemented over the past 10 years. For example, the AgBot II robot was developed by the Queensland University of Technology for automated fertilization, weed detection and classification, and mechanical or chemical weed control. It uses an online weed detection system to determine weed locations, and its weed detection and classification success rate exceeds 90%. The modular agricultural robot Thorvald II was developed by the Norwegian University of Life Sciences to meet the physical characteristics (wheel width, power, ground clearance, etc.) required for operation in various farm environments, such as greenhouses and outdoor fields, through reconfiguration of the chassis [17]. Hortibot is a versatile field management robot developed by Aarhus University on which a variety of weed detection and weeding implements, such as spray rods and weeding shovels, can be installed [18]. The RIPPA field management robot developed by the University of Sydney uses solar power for intelligent perception of the farmland environment and precise spraying [19]. Two prototype orchard pesticide-spraying robots were developed with the support of the Rovitis 4.0 research project; they can be connected to a decision support system (DSS) and automatically spray vineyards [20].
The existing robots mentioned above are mainly used for tasks such as pesticide application, fertilization, soil detection, mechanical weeding, and crop phenotype information collection. Their key technologies include mobile chassis technology, positioning and navigation technology, and precision control technology. Robots face many challenges in field management scenarios. Cultivation modes vary between regions, crop row spacing differs between crop types, and the height and form of crops change over the growing season, all of which greatly complicate field management operations. Moreover, most existing agricultural robot chassis are not readily reconfigurable: they cannot adjust their wheel-track, their ground clearance is relatively low, and they can only complete a single task in a specific scene [21]. These factors lead to a low reuse rate of the whole machine and relatively high costs of use and maintenance. In this paper, a multifunctional field management robot platform is designed, which has four electric legs, each with three degrees of freedom. The chassis can automatically switch between four steering modes (Ackermann steering, four-wheel double Ackermann steering, crab steering, and zero-radius steering) and can also automatically adjust its wheelbase and wheel-track. We chose consumer-grade brushless DC motors, so the overall manufacturing cost of the robot is very low. In addition, the designed robot has a payload of 350 kg, which is larger than that of existing robot chassis.
To achieve full utilization of robots, multiple application modules need to be supported, much as a tractor is paired with different implements. Under this scheme, the agricultural robot needs to be a vehicle with basic driving functions that can support a variety of applications. Therefore, we designed a large installation space, an electrical interface, and a data interface for agricultural tools on the chassis, which can quickly integrate various application modules, such as spot-sprayers, mechanical weeders, and phenotyping sensors.
Among the basic capabilities, a robust and reliable navigation and perception system is required for pesticide application and weed control in such scenarios. Early agricultural robots typically used navigation systems based on global navigation satellite systems (GNSS), which used RTK-GNSS receivers to measure the robot's position and heading in real time. Navigation systems based on GNSS alone cannot perceive local dynamic scenes, avoid obstacles, or follow crop rows, so there is a risk of the wheels crushing crops [15,21]. The navigation system we designed integrates a low-cost RTK-GNSS receiver, an IMU sensor, and a 16-line lidar. The navigation software (Version: 1.20.0) is developed on the existing ROS open-source software framework (Version: 1.0) [22] and can control the robot to avoid obstacles and to move along a set path or crop row. The robot positioning program adopts an EKF package, which combines GPS odometry data with attitude data from the IMU (inertial measurement unit) to estimate the position and attitude of the robot. When operating in enclosed environments such as greenhouses, the robot adopts the LIO-SAM tightly coupled lidar-inertial odometry (smoothing and mapping) framework to achieve high-precision real-time position and attitude estimation as well as map construction.
In this paper, a multifunctional field management robot is designed, which can be used for tasks such as pesticide application, weeding, and crop information collection in the future. We summarize the design method and provide a detailed introduction to the hardware, software, and integration methods. The robot can drive along crop rows or a planned path, and the feasibility of the four-wheel drive, four-wheel steering chassis design and the navigation accuracy of the robot are verified through multiple sets of field experiments.
The organization of this paper is as follows. Section 2 presents the detailed design of the reconfigurable chassis and navigation system of the multi-purpose field management robot. Section 3 presents the results of the full-coverage path planning and trajectory tracking tests in the different steering modes (Ackermann steering, four-wheel steering, crab steering, and zero-radius steering) as well as the crop row following tests. Section 4 discusses the findings and limitations, and Section 5 presents the conclusions.

2. Materials and Methods

2.1. Design of Four-Wheel Drive and Four-Wheel Steering Electric Chassis

The use cases for field management robots are complex. Different crops differ in planting terrain, infrastructure, planting patterns, etc., which places high demands on the environmental adaptability of robots. In particular, during crop growth, the wheel-track needs to be adjusted according to the crop row spacing to reduce wheel damage to the crops as the vehicle passes, and the vehicle needs a high ground clearance. The design of the vehicle is primarily driven by the need to carry modular robotic systems. Commercial platforms already exist, such as the Warthog from ClearPath Robotics, but their maximum payload, battery life, and vehicle geometry make them unsuitable for field management of field crops. The mass of a robot module used for pesticide application or mechanical weeding can be as much as 300 kg, so the payload of the robot was designed to be 350 kg. Crop height varies between 0.1 and 0.8 m when applying pesticides or weeding typical crops such as wheat and corn, so the vehicle must have a higher ground clearance than a normal tractor; the design value of the chassis height is defined as 1.0 m. Different crops are cultivated with different row spacings, usually between 0.2 m and 0.7 m, so the vehicle needs automatic wheel-track adjustment, enabling the wheels to travel between two rows of crops and reduce unnecessary crop damage. In addition, the modules carried by the robot platform occupy a large installation space, and the four supporting legs of the chassis need to remain load-balanced to avoid overturning on slopes, so a large rectangular installation space is designed at the geometric center of the chassis for installing modules.
In this paper, a robot chassis with four-wheel drive, independent four-wheel steering, and an automatically adjustable wheel-track is designed. Each wheel leg has three degrees of freedom. The wheel-leg chassis structure is shown in Figure 1. It can be configured to suit different crops, such as corn or wheat, and different track widths, numbers of rows, and field structures. The three-dimensional design scheme of the robot is shown in Figure 2. The chassis is driven by 10 brushless DC motors: 4 for steering, 4 for driving the wheels, and 2 for adjusting the track. Each motor output shaft is directly connected to a planetary gear reducer to increase the output torque. The designed wheel-track adjustment mechanism is composed of one servo motor, one reducer, two sets of bevel gears, and two transmission shafts. Two sets of wheel-track adjustment mechanisms are installed at the head and tail of the frame, respectively. This design allows the track widths of the front and rear wheels to be adjusted independently over a range from 1156 mm to 1900 mm. The wheel-leg structure increases the flexibility of the robot's movement, allowing it to move freely in off-road environments. At the same time, the increased number of motors increases the complexity of the control system, so we use a CAN bus to control all motors.

2.1.1. Design of Four-Wheel Independent Steering System

The four-wheel independent steering chassis has the advantages of excellent passability and strong adaptability to complex terrain. Each wheel leg provides both driving and steering degrees of freedom, which gives the mobile platform a variety of steering modes. The steering system allows the operator to guide the vehicle along the desired path. In a four-wheel steering vehicle, both the front and rear wheels can be steered according to the speed and the space available for turning. The chassis designed in this paper has four different steering modes: Ackermann steering, four-wheel double Ackermann steering, crab steering, and zero-radius steering. A schematic of the steering modes is shown in Figure 3.
In the designed four-wheel independent steering system, each wheel is equipped with a set of independent steering units composed of a DC brushless motor, a worm gear reducer, a steering shaft, etc.; the output end of the reducer is a flange; the steering shaft is connected to the flange by screws; and the other end of the steering shaft is also fixed together by the flange and the wheel leg bracket.
The steering angle of the wheel legs is adjusted by controlling the motors. A steering-mode selection knob is provided on the remote control handle, and the user can select the desired mode according to the working conditions.
The steering motion of the robot is triggered by the remote control handle or by the navigation program on the industrial control computer. For example, when the heading of the robot needs to be adjusted, an inverse kinematics calculation is performed according to the steering mode selected by the user and the target value sent by the remote control handle or navigation program; motor angle command signals are then sent via the CAN bus to the drives of the four steering motors, rotating each steering motor to the desired angle. Each steering motor is equipped with an absolute encoder. The actual rotation angle of each motor is fed back to its drive, which operates in a position closed-loop mode using a feed-forward PID controller. The drive compares the feedback angle with the desired angle and adjusts the motor coil current in real time to reduce the deviation between the actual and commanded steering angles, meeting the steering accuracy requirements. At the same time, the drive sends the encoder value to the industrial computer via the CAN bus, where the forward kinematics calculation is completed and the chassis odometry data, including position, attitude, and speed, are obtained.
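As an illustration of this inverse kinematics step, the sketch below maps a steering command to per-wheel angles for the four modes; the function name, signature, and sign conventions are our assumptions rather than the authors' implementation (the zero-radius angle uses the deflection formula given later in Section 3.1).

```python
import math

def steering_angles(mode, wheelbase, track, turn_radius=None, crab_angle=0.0):
    """Per-wheel steering angles (rad), ordered [FL, FR, RL, RR].

    turn_radius: signed radius at the chassis center (Ackermann modes only);
    crab_angle: common wheel angle in crab mode.
    """
    L, W = wheelbase, track
    if mode == "ackermann":              # front axle steers, rear wheels fixed
        inner = math.atan(L / (turn_radius - W / 2))
        outer = math.atan(L / (turn_radius + W / 2))
        return [inner, outer, 0.0, 0.0]
    if mode == "double_ackermann":       # front and rear axles steer oppositely
        inner = math.atan((L / 2) / (turn_radius - W / 2))
        outer = math.atan((L / 2) / (turn_radius + W / 2))
        return [inner, outer, -inner, -outer]
    if mode == "crab":                   # all four wheels stay parallel
        return [crab_angle] * 4
    if mode == "zero_radius":            # wheels tangent to a circle about the center
        a = math.atan(W / L)             # deflection formula from Section 3.1
        return [a, -a, -a, a]
    raise ValueError(f"unknown steering mode: {mode}")
```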
The torque requirements of the steering motor are based on a static friction scenario in which the fully loaded vehicle sits on concrete. The formula used by Jones et al. [18] for calculating the friction torque of wheeled agricultural robot tires is as follows.
$$T = \mu W \sqrt{\frac{B^{2}}{8} + E^{2}} \quad (1)$$
where T is the torque required to break static friction, W is the weight carried by each wheel, μ is the coefficient of friction, B is the nominal width of the tire, and E is the offset between the tire's contact surface and its axis of rotation. The axis of rotation on the vehicle passes directly through the center of the tire, so E = 0. A value of 0.75 was used for the coefficient of friction as a best-guess representation of a tractor-grip tire on dry concrete. The design requirement is that the unloaded weight of the entire vehicle be less than 600 kg with a rated load of 350 kg, giving a total mass of 950 kg. Taking uneven weight distribution into account, the peak instantaneous load on a single wheel is taken as 500 kg, and the common tire width is 0.15 m. From Equation (1), the required steering torque is calculated to be 194.90 Nm. Based on this result and the motor selection manual, a brushless DC motor (D110M-06030C5-A23, Guangzhou Demark Motor Co., Ltd., Guangzhou, China) with a rated torque of 4.5 Nm, a rated speed of 3000 r/min, and a rated power of 1.4 kW was selected. The selected worm gear reducer (FADR110, Shanghai Severer Reducer Co., Ltd., Shanghai, China) has a 50:1 reduction ratio and amplifies the output torque to 225 Nm, which can overcome the static friction torque from the ground during steering.
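As a quick check, the sizing calculation in Equation (1) can be reproduced directly (a minimal sketch; g is not stated in the text, but g = 9.8 m/s² recovers the reported value):

```python
import math

mu = 0.75          # friction coefficient, tractor-grip tire on dry concrete
m_wheel = 500.0    # kg, peak instantaneous load on one wheel
B = 0.15           # m, nominal tire width
E = 0.0            # m, kingpin offset (axis passes through the tire center)
g = 9.8            # m/s^2 (assumed; not stated in the text)

W = m_wheel * g                                 # wheel load in newtons
T = mu * W * math.sqrt(B**2 / 8 + E**2)         # static-friction torque, Eq. (1)
print(f"required steering torque: {T:.2f} Nm")  # -> 194.90 Nm
```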

2.1.2. Design of Driving System

When the robot applies pesticides, weeds, or collects data in the field, the desired driving speed usually ranges from 1.0 to 6.0 km/h; if the speed is too high, the operation module has no time to respond, so the maximum speed was defined as 6.0 km/h (1.67 m/s). The performance requirements for the vehicle's traction system, while loaded and accelerating uphill, were calculated as follows.
$$F_{r} = C_{rr}\, m g \cos\alpha \quad (2)$$
$$F_{g} = m g \sin\alpha \quad (3)$$
$$F_{acc} = m \frac{\Delta v}{t} \quad (4)$$
$$F_{tol} = F_{r} + F_{g} + F_{acc} \quad (5)$$
where Fr is the force due to rolling resistance, Fg is the grade (incline) force, and Facc is the force required to accelerate the mass. A rolling resistance coefficient (Crr) of 0.04 was chosen as it represents a pneumatic tire on medium-hard soil [18]. The other variables are a fully loaded vehicle mass (m) of 950 kg, a slope angle (α) of 15°, and a speed change (Δv) of 1.67 m/s (6 km/h) reached within an acceleration time (t) of 6 s. Substituting these values into Equations (2)–(5) gives a total force requirement Ftol of 3.97 kN. On a per-wheel basis, this is 0.99 kN, or 242.05 Nm, when taking the wheel radius (r) of 0.3 m into account. The required traction power per wheel (P) is calculated as follows.
$$P = \frac{F_{tol}\, v}{4 \eta} \quad (6)$$
where η is the transmission efficiency of the reducer, P is the power requirement of each wheel, v is the vehicle velocity, and Ftol is the total traction force. At a velocity of 1.67 m/s, the calculations give a power requirement of 1.89 kW per wheel.
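The sizing in Equations (2)–(6) can be sketched as follows; g and the reducer efficiency η are not stated in the text and are assumed here, so the outputs are indicative rather than the paper's exact design figures:

```python
import math

# Parameter values follow the text; g and eta are assumptions.
Crr, m, g, alpha = 0.04, 950.0, 9.8, math.radians(15)
dv, t, r, eta, v = 1.67, 6.0, 0.3, 0.9, 1.67

F_r = Crr * m * g * math.cos(alpha)    # rolling resistance, Eq. (2)
F_g = m * g * math.sin(alpha)          # grade force, Eq. (3)
F_acc = m * dv / t                     # acceleration force, Eq. (4)
F_tol = F_r + F_g + F_acc              # total traction force, Eq. (5)
P = F_tol * v / (4 * eta)              # per-wheel power, Eq. (6)

print(f"F_tol = {F_tol / 1000:.2f} kN")
print(f"per-wheel torque at r = {r} m: {F_tol / 4 * r:.1f} Nm")
print(f"per-wheel power: {P / 1000:.2f} kW")
```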
The selected wheel-drive motors are brushless DC motors (S110BL170-430, Changzhou Shuangjie Electronics, Changzhou, China) with integrated 50:1 fixed-ratio planetary gearboxes (FADR110, Safelo-Drive, Shanghai, China). At 48 V, each motor has a power rating of 1.9 kW, a maximum speed of 3000 r/min, a rated torque of 6.0 Nm, and a maximum output of 5.7 kW. At the gearbox output, the torque rises to 300 Nm while the speed drops to 60 r/min, giving the robot a top speed of 1.88 m/s (6.78 km/h). Each motor contains a 1024-line incremental rotary encoder that measures the motor speed and feeds it back to the motor controller. The power cable of each motor is connected to its respective motor controller (FBL2360, ROBOTEQ, Scottsdale, AZ, USA); each controller has an encoder interface to read the encoder signal and calculate the motor speed, as well as multiple digital I/O ports and an emergency stop function for the motor.

2.1.3. Design of Power Supply System

Figure 4 shows the power distribution of the robot. A 48 V, 200 Ah LiFePO4 battery (SYL16S3248200, Hangzhou Remote Speed Technology Co., Ltd., Hangzhou, China) was selected to power the robot. The built-in LCD displays the voltage, current, temperature, and state of charge (SoC) in real time. A gasoline power generation unit, consisting of a gasoline engine and a generator, is installed at the rear of the robot. The generator's control board and the battery management system (BMS) are connected to the control computer via the CAN bus. When the battery level falls below 20%, the industrial control computer triggers the generator's starting relay through its digital output port, and the generator starts to work. The total power of the generator is 7.0 kW, of which at least 4.0 kW is converted into electrical energy. The electrical energy from the power generation unit is fed to the battery in a series hybrid configuration. A charging plug is installed to allow charging of the battery from an external power source.
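The auto-start rule can be sketched as follows; only the 20% SoC threshold comes from the text, while the stop threshold, names, and digital-output API are illustrative assumptions:

```python
START_SOC = 20.0   # % SoC below which the generator is started (from the text)
STOP_SOC = 90.0    # % SoC above which it may be stopped (assumed)

def update_generator(soc, running, dio):
    """Hysteresis rule evaluated on each BMS CAN frame; dio is a
    hypothetical wrapper around the industrial PC's digital outputs."""
    if soc < START_SOC and not running:
        dio.set("generator_start_relay", True)    # close the starting relay
        return True
    if soc > STOP_SOC and running:
        dio.set("generator_start_relay", False)
        return False
    return running
```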
The system’s electrical bus connects the battery and generator to the motor controller and power converter. A series of heavy-duty contactors are used to connect each device to the bus and the bus to the power supply. Two sets of 48.0 V to 12.0 V converters are also installed on the chassis, which can provide 2.4 kW of power output, and four sets of 48.0 V to 24.0 V converters can provide 4.8 kW of power output to power sensors, computers, and farm tools. Multiple waterproof sockets are installed on the connection board at the front of the farm tool module installation area to provide various voltage power supplies for farm tools.
The robot also has a knob switch on the housing for turning the main power on and off. Eight emergency stop buttons are set around the robot body to cut off the contactors on the motor power bus. The wireless controller is also equipped with an emergency stop switch. If the emergency stop button is pressed during operation or the robot is out of range of the remote control, the bus power supply will cut off the motor power, and the robot will stop moving.

2.1.4. Design of Communication System

The architecture of the robot’s communication system is shown in Figure 5. The platform is centrally controlled by an industrial control computer (Genius S10, Vensin, Shanghai, China) running Ubuntu 18.04 server edition, referred to as the “Execution PC”, which has 4-channel RS232 and 2-channel RS485 interfaces. The wireless controller (DCH-32, DCH Radio Ltd., Shanghai, China) allows the operator to issue drive commands via a joystick. The receiver module of the controller communicates with the Execution PC through the RS485 interface, and all inputs from the controller are also broadcast to the RS485 bus and read by the ROS nodes on the Execution PC. The wireless controller has two joystick inputs, eight paddle switches, eight rotary switches, four self-locking pushbutton switches, and an emergency stop switch. The emergency stop switch is connected to the 48 V bus contactor via the relay output of the receiver unit. If this switch is turned off during operation, or if the controller is out of range, the bus power is cut off within 500 milliseconds. This engages the motor brakes, removes all traction from the motor, and depowers the mounted modules.
The Execution PC communicates with the digital signal input and output control module (TCP-507E, Fuzhou Gekong, Fuzhou, China) through an RS485 serial link. The digital control module has 8-channel digital signal input and 8-channel digital signal output interfaces. The control software can switch the power of the motor drivers, voltage conversion modules, agricultural machinery power interface, warning light, buzzer, and other equipment on and off. The digital input interface collects the status of the travel switches and emergency stop switches; when the IO module detects a digital input, it feeds the status value back to the PC. The industrial computer also has a CAN card (EMUC-B202, Innodisk, Beijing, China) for communication with all motor drivers.
A second computer, called the "Navigation PC", is responsible for higher-level control of the robot. It is also a general-purpose PC running Ubuntu (NUC11PHKi7C, Intel, Santa Clara, CA, USA) and contains a standalone graphics card (GeForce RTX 2060, NVIDIA, Santa Clara, CA, USA) used to accelerate neural network algorithms and image-processing functions. The Navigation PC collects information from the perception sensors, including the high-speed industrial camera, stereo camera, GNSS receiver, and IMU sensor; sends motion commands to the Execution PC through Ethernet; and performs the other processing tasks related to autonomous navigation. The two computers and the two lidars are connected by cable to a gigabit network switch, through which data transmission is realized.

2.1.5. Modular Agricultural Machinery and Controller

The field management robot is designed to carry several types of agricultural machinery, such as spray booms, fertilizer spreaders, mechanical weeders, and crop phenotyping detection devices. The electrical components commonly used in such machinery are motors, electric cylinders, solenoid valves, relays, etc. Most of the control signals are digital, and the common communication interfaces are RS232, RS485, and CAN. Sensors commonly used in agricultural machinery include detection elements such as rotary encoders and travel switches, whose interfaces are all digital. Therefore, a special controller was designed for the automation of modular agricultural machinery, as shown in Figure 6. A 32-bit floating-point microcontroller unit (MCU) (TMS320F28377D, Texas Instruments, Dallas, TX, USA) is used for processing, sensing, and actuation to improve closed-loop performance in real-time control applications such as spraying and weeding. The dual real-time control subsystems are based on this 32-bit MCU, which provides 200 MHz of signal processing performance in each core. High-performance analog and control peripherals are also integrated to further enable system consolidation. Four 16-bit ADC blocks manage multichannel analog signals accurately and efficiently, which ultimately improves system throughput. Other analog and control peripherals include DACs and ePWM modules. Peripherals such as EMIFs and CAN modules (ISO 11898-1/CAN 2.0B-compliant) extend the connectivity of the controller. In summary, the controller provides digital input and output, analog input and output, communication with common external equipment, and closed-loop motion control.
A targeted spray system is designed to provide variable-rate application. The sprayer consists of 12 high-speed solenoid valves (DS115880-12, TeeJet Technologies, Springfield, IL, USA) installed on the boom hanger with an equal spacing of 0.3 m. Each high-speed solenoid valve is composed of a nozzle, a nozzle cover, and an electromagnet. The spray boom is divided into 11 sections, each equipped with a solenoid valve, which allows independent control of each section. The sprayer is equipped with a 100 L tank.
The controller receives signals from the Execution PC and the sensors and, after processing, sends control commands to the nozzle solenoid valves and the motor controller. The spray controller provides a variable control signal to the DC motor, which drives the piston pump at the appropriate speed to produce the required flow. A pressure sensor (16-05015, TeeJet Technologies, Springfield, IL, USA) measures the pipe pressure, and the spray pressure is stabilized by controlling a proportional relief valve. The control program is developed in an integrated development environment (Code Composer Studio 12.3, Texas Instruments, Dallas, TX, USA). The spray decision is determined from the RTK-GPS location and the prescription map, and the computer then sends commands to the spray controller; optionally, the control command can also be determined from recognized weed locations. The fully assembled robot is shown in Figure 7. Because the targeted application system is still under development, the spray performance tests will be carried out in future research; the design and testing of the robot platform and its navigation system are the focus of this paper.
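A minimal sketch of this prescription-map decision step is given below; the grid resolution, map format, and function names are assumptions, with only the 0.3 m section spacing taken from the text:

```python
import math

CELL = 0.5            # m, prescription-map grid resolution (assumed)
SECTION_WIDTH = 0.3   # m, nozzle spacing from the text
N_SECTIONS = 11       # independently controlled boom sections

def section_commands(x, y, heading, rx_map):
    """On/off states of the boom sections for robot pose (x, y, heading);
    rx_map maps grid cells (i, j) to target application rates."""
    states = []
    for i in range(N_SECTIONS):
        offset = (i - (N_SECTIONS - 1) / 2) * SECTION_WIDTH   # lateral offset
        sx = x + offset * math.cos(heading + math.pi / 2)
        sy = y + offset * math.sin(heading + math.pi / 2)
        cell = (int(sx // CELL), int(sy // CELL))
        states.append(rx_map.get(cell, 0.0) > 0.0)            # spray if rate > 0
    return states
```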

2.2. Design of Navigation System

2.2.1. Function Description

The navigation system is the key to the robot's motion performance. Its function is to achieve robot positioning, perception, path planning, and driving control. The navigation system comprises a hardware layer, a driver layer, a data processing layer, and a response control layer. The hardware layer includes the various sensors, motors, motor drivers, industrial control computers, and other hardware. The driver layer includes the interface and sensor drivers that enable software nodes to read data from the hardware in real time and issue control instructions to executing components such as the motor controllers. The data processing layer consists of programs that convert, filter, fuse, and otherwise process the input and output data. The response control layer performs the higher-level tasks, such as trajectory planning, crop row detection, global positioning and mapping, chassis motion control, remote control, and remote monitoring.

2.2.2. Hardware Setup

Positioning, i.e., obtaining the position and attitude of the robot relative to the global coordinates of the working environment, is the key to navigation. The conventional method calculates wheel speed and mileage from the pulse counts of the encoders inside the drive motors and obtains the relative position and attitude of the robot by combining them with the chassis kinematics. However, the robot's wheels are prone to slipping when driving in the field, making the wheel odometry obtained from the encoder data inaccurate [23]. Therefore, we chose a high-precision dual-antenna GNSS receiver (CGI210, Huace Navigation, Shanghai, China) to obtain the robot's position and heading in real time. It has a built-in 4G communication module and supports differential positioning data access. The rover receiver is set to output NMEA-0183 GPGGA and GPRMC strings containing the geographic coordinates (latitude and longitude) and the yaw angle in degrees at a 10 Hz rate via an RS232 serial connection. Figure 8 shows the installation locations of the navigation system equipment.
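For illustration, a GPGGA stream like the receiver's can be consumed as follows (a minimal sketch using pyserial; the port and baud rate are assumptions, and checksum validation is omitted):

```python
import serial  # pyserial

def gga_fixes(port="/dev/ttyS0", baud=115200):
    """Yield (lat, lon) from GPGGA sentences; keeps only RTK-fixed solutions."""
    with serial.Serial(port, baud, timeout=0.2) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            if not line.startswith("$GPGGA"):
                continue
            f = line.split(",")
            if len(f) > 6 and f[6] == "4":      # fix quality 4 = RTK fixed
                yield _dm_to_deg(f[2], f[3]), _dm_to_deg(f[4], f[5])

def _dm_to_deg(dm, hemi):
    """Convert NMEA (d)ddmm.mmmm plus hemisphere to signed decimal degrees."""
    dot = dm.index(".")
    deg = float(dm[:dot - 2])
    minutes = float(dm[dot - 2:])
    val = deg + minutes / 60.0
    return -val if hemi in ("S", "W") else val
```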
An extended Kalman filter is used to fuse the satellite positioning data with the attitude data of the IMU (LPMS-CU2, Open Zen, Guangzhou, China) and compute high-precision robot position, orientation, and speed information in real time, which improves the reliability, accuracy, and dynamic performance of robot positioning.
The 3D lidar obtains point cloud information of the surrounding environment and realizes simultaneous localization and mapping (SLAM) of the robot. Single-plane lidar units produce clouds of unstructured data among the structured features: the lidar's scan plane intersects the canopy above when driving over convex terrain and intersects the ground on concave terrain, which cannot meet the requirements of field navigation. Therefore, a multi-layer lidar was judged best suited for navigation, owing to its ability to see more distant features while driving on undulating ground. The multi-layer lidar (VLP-16, Velodyne Lidar, CA, USA) has 16 horizontal 360° scan layers spread over ±15° of vertical field of view. The measurement distance is more than 150 m, and the measurement accuracy is better than 2 cm; the 16 laser transmitters simultaneously emit high-frequency laser beams that continuously scan the external environment and provide a 3D point cloud and object reflectivity through high-speed digital signal processing. The other main purpose of the multi-line lidar in this paper is obstacle avoidance, for which the dynamic window approach is used.
A depth camera (D435i, Intel RealSense, Santa Clara, CA, USA) is mounted at the front of the robot to extract the crop rows, guide the robot along them, and avoid crushing seedlings. To collect crop information in real time, the robot is also equipped with a vertically downward-mounted RGB camera (GO-5000M-USB, JAI, Valby, Copenhagen, Denmark) with a CMOS resolution of 2560 × 2048, a frame rate of 62 fps, and a hardware trigger function; it captures images of sufficient quality for crop/weed detection in the field, and the industrial computer sends the classification results to the application controller through the serial bus.

2.2.3. Software Design of Navigation System

The chosen operating system is the Ubuntu 18.04 LTS distribution with the robot operating system, ROS, installed, which facilitates communication between the computers and the software nodes running on each of them [24]. The flowchart of the navigation system is shown in Figure 9. First, the hardware driver layer software is developed: ROS driver nodes for the lidar, cameras, inertial measurement unit, satellite positioning receiver, handheld remote controller, and other hardware on the robot platform read raw data directly from the hardware interfaces. Then, the data processing layer software parses the data read from the hardware and publishes it to the ROS node server using universal ROS message types such as geometry, sensor, and navigation messages. Nodes can be written in C++ or Python and can simply convert data or perform calculations on it while transferring data between other nodes. Finally, the response control layer software enables the robot to plan a reasonable working path for the work site, controls the robot to reach the designated endpoint along the path, and controls the agricultural tools to complete the work; it mainly includes positioning, surveying, path planning, motion control, and agricultural tool control.
The robot positioning program uses the robot_pose_ekf package under ROS, which fuses GPS odometry data with attitude data from the IMU to estimate the position and attitude of the robot. It uses an extended Kalman filter with a 6D model (3D position and 3D orientation) to combine the measurements of the wheel odometry, IMU sensor, and GPS odometry; the basic idea is to provide loosely coupled integration of the different sensors [25,26]. Mapping can be divided into real-time mapping and offline mapping. When operating in a confined environment, the robot adopts the LIO-SAM tightly coupled lidar-inertial odometry (smoothing and mapping) framework to achieve high-precision, real-time trajectory estimation and map construction [27]. The algorithm integrates the data of the IMU sensor, lidar, and satellite positioning system and can reconstruct the three-dimensional farmland scene in real time in orchards, greenhouses, and similar environments. Where the satellite positioning signal is poor or absent, the SLAM algorithm is used to complete the mapping, and path planning and navigation are then performed on the resulting map; this mapping-based navigation is suitable for orchards and greenhouses. For large and regular farmland, offline mapping is the best option. This paper adopts a manual mapping method, using a handheld GNSS receiver (T50, Huace Navigation, Shanghai, China) and ground station software to survey the plot, and then uses the software to perform path planning on the surveyed map.
As shown in Figure 9, the designed navigation software includes multiple functional packages such as path planning, path tracking control, and obstacle avoidance. The path control software package provides an interface to communicate with other ROS nodes and feedback on the robot status parameters in real time.
The path planning package is used to plan the driving path on a given map. In this paper, the field management robot adopts boustrophedon coverage planning. After the plot coordinates are entered, the user sets the starting point, adjacent route spacing, route direction, and turning radius of the operation; selects the turning mode (three modes are available: four-wheel double Ackermann steering, zero-radius steering, and crab steering); and runs the program to automatically generate the operation path.
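A minimal sketch of boustrophedon waypoint generation on a rectangular, row-aligned plot is shown below; real plots would first be rotated into this frame, and the names and waypoint format are ours:

```python
def boustrophedon(width, length, swath):
    """Waypoints (x, y) covering a width x length plot in back-and-forth passes."""
    path, x, reverse = [], swath / 2.0, False
    while x <= width - swath / 2.0 + 1e-9:
        y0, y1 = (length, 0.0) if reverse else (0.0, length)
        path += [(x, y0), (x, y1)]     # one full pass along the row direction
        x += swath                     # shift to the adjacent route
        reverse = not reverse          # alternate travel direction
    return path

# e.g., a 10 m x 50 m plot with the 1.45 m working width used in the tests
waypoints = boustrophedon(10.0, 50.0, 1.45)
```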
The path tracking control package is used to control the robot to travel along the planned path. The robot's position, attitude, and planned trajectory are the inputs to the control package. The linear and angular velocities of the robot are calculated using a pure pursuit algorithm [28,29] and published to the node server, where they are subscribed to by the chassis motion control package.
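The core of the pure pursuit law [28,29] can be stated compactly; the sketch below computes the commanded angular velocity from a look-ahead point expressed in the robot frame (parameter names are ours):

```python
import math

def pure_pursuit(goal_x, goal_y, v, lookahead):
    """Angular velocity toward a look-ahead point (goal_x, goal_y) given in
    the robot frame, at linear speed v and look-ahead distance lookahead."""
    alpha = math.atan2(goal_y, goal_x)             # bearing to the look-ahead point
    curvature = 2.0 * math.sin(alpha) / lookahead  # pure-pursuit arc curvature
    return v * curvature                           # omega = v * kappa
```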
The obstacle detection package publishes point clouds of obstacles in real time to avoid collisions. A scan-to-cloud package converts the data from the 3D laser sensor into a point cloud; obstacles are then identified, and local path planning is performed using the dynamic window approach (DWA) [30].
The chassis motion control package receives motion control command signals, completes the kinematic solutions, and issues control commands to the steering and drive motor drivers so that the robot moves at the desired linear and angular velocities. It is worth noting that a ROS twist multiplexer node accepts motion control inputs from both the remote control and the navigation program, arbitrates them according to their priority, message timeout, and lock topics, and publishes the winning topic to the server as the motion control command.
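The arbitration logic can be sketched as follows, mirroring a typical twist-multiplexer configuration; the source names, priorities, and timeouts are illustrative:

```python
import time

# (name, priority, timeout_s); a higher priority wins while its command is fresh
SOURCES = [("remote_control", 100, 0.5), ("navigation", 10, 1.0)]

def select_command(latest):
    """latest: dict name -> (stamp, twist). Returns the winning twist or None."""
    now = time.time()
    winner = None
    for name, prio, timeout in SOURCES:
        entry = latest.get(name)
        if entry is not None and now - entry[0] < timeout:   # not expired
            if winner is None or prio > winner[0]:
                winner = (prio, entry[1])
    return None if winner is None else winner[1]
```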
The fault detection package monitors the robot's operating status parameters, speed, position, power, and agricultural machinery status in real time. When a fault occurs, a fault code is sent and the roof-mounted fault light turns on, prompting the user to check the fault and intervene manually.
To reduce wheel crushing when crops are not arranged in a straight line, a crop row identification package is used. Controlling the robot to travel along the crop rows requires extracting the crop arrangement curve. For each image frame captured by the front-facing camera, a real-time crop/weed detection model based on YOLOv5s runs on the navigation computer and automatically marks the areas containing crop rows with bounding boxes; the crop centerline is then extracted and fitted within the recognized crop areas. The extraction method calculates the excess green vegetation index (ExG) of the crop area, using the normalized 2IG − IR − IB as the color feature to highlight the vegetation-covered areas in the image [31], where IG, IR, and IB correspond to the green, red, and blue channels of the image. Segmenting plants from the background requires first setting a threshold, then comparing the grayscale value of each pixel with the threshold and assigning the pixel to plant or background accordingly. For each green crop area in the vegetation-covered region, we calculate the center point of the crop. We then estimate the robot's target trajectory by least-squares fitting the curve that best matches all center points. A visual servo-based controller controls the robot to follow the path curve [32,33]. The controller uses local directional features extracted from the camera images, with heading information obtained from the crop row structure. It continuously captures the row being tracked through a sliding window and integrates a switching mechanism: when the robot reaches the end of a crop row, GPS signals guide it to switch from one row to the next.
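A minimal sketch of the segmentation and line-fitting steps, with an assumed fixed ExG threshold and a straight-line fit standing in for the general curve fit:

```python
import numpy as np

def crop_centerline(img, thr=0.05):
    """img: HxWx3 RGB array. Returns (k, b) of the fitted line col = k*row + b,
    or None if too few green pixels are found."""
    rgb = img.astype(np.float64)
    s = rgb.sum(axis=2) + 1e-9                   # normalize to chromaticity
    r, g, b = rgb[..., 0] / s, rgb[..., 1] / s, rgb[..., 2] / s
    exg = 2.0 * g - r - b                        # excess-green index
    mask = exg > thr                             # fixed threshold (assumed)

    centers = [(row, cols.mean())                # center of green pixels per row
               for row in range(mask.shape[0])
               if (cols := np.flatnonzero(mask[row])).size]
    if len(centers) < 2:
        return None
    ys, xs = np.asarray(centers, float).T
    k, b0 = np.polyfit(ys, xs, 1)                # least-squares line fit
    return k, b0
```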

3. Results

To evaluate the field performance of the multifunctional field management robot, tests were carried out at the Baima Scientific Research and Innovation Experimental Demonstration Base of the Nanjing Institute of Agricultural Mechanization, Ministry of Agriculture and Rural Affairs (Baima Avenue, Baima Town, Lishui District, Nanjing, China). A large number of experiments were conducted to verify the full-coverage path planning, trajectory tracking, automatic turning, and automatic crop row tracking algorithms.

3.1. Full-Coverage Path Planning and Trajectory Tracking Test

Full-coverage application requires the robot to track the planned path accurately over the specified plot. First, the latitude and longitude coordinates of each field corner are measured with an RTK mobile station with an accuracy of ±2.0 cm. The initial position, working width, route angle, and other information are entered in the user interface. The operation usually adopts straight back-and-forth passes along the guidance line: the reciprocating full-coverage path planning algorithm is invoked to automatically generate the target path, and the trajectory tracking package is started via a button on the remote control.
Three automatic turning algorithms, four-wheel steering, crab steering, and zero-radius steering, were designed for the robot to choose from when completing field tasks. The automatic navigation test site is shown in Figure 10. The navigation system has a data recording function that logs the robot's real-time motion status during testing, mainly including position, speed, and heading angle, for comparison with the target trajectory and analysis of tracking errors.
The trajectory measured during automatic navigation is compared with the planned trajectory in Figure 11, where xd is the planned trajectory and x1 is the trajectory measured at the geometric center of the robot. The robot's speed was 1.5 m/s on straight segments and 0.5 m/s when cornering. In the first set of experiments, the four-wheel steering turning method was used with a robot wheel-track of 1.6 m. Navigation control is based on the generated operation path; when the chassis reaches a turning point, the vehicle speed automatically decreases to the set turning speed. The average path tracking error on the straight sections (excluding turning sections) was 44.79 mm, and the maximum tracking error was 79.40 mm. These errors are within the allowable range for sowing and pesticide application for major food crops, demonstrating that the developed navigation system has good tracking accuracy.
The second set of tests used zero-radius steering. The deflection angle of each wheel can be calculated from the wheel-track and wheelbase as follows.
$$\alpha = \arctan\left(\frac{W}{L}\right) \quad (7)$$
where W and L represent the wheel-track and wheelbase, respectively. The deflection angle of the front left and rear right wheels is α, that of the front right and rear left wheels is −α, and the four wheels rotate at the same speed.
In the second set of tests, the robot turned using the zero-radius steering mode: when the robot travels to a turning point, its speed is reduced to zero, and it switches from the normal driving mode to the zero-radius steering mode, rotating 90° around the central axis of the chassis. It then drives a distance of one working width (1.45 m), switches to zero-radius steering mode again, and rotates another 90°. Finally, it returns to normal driving, completing the U-turn, and works through the entire field in turn. The planned and measured trajectories are shown in Figure 12; the average path tracking error on the straight sections was 52.61 mm, and the maximum tracking error was 107.12 mm. The largest errors occur when the robot turns and are caused by errors in heading angle control during zero-radius turns.
The third group of experiments used the crab steering mode for route switching: at the end of a route, all four wheels rotate 90° in the same direction, and the robot moves laterally to the next route without changing the heading of the chassis. The specific control process is as follows: when the robot travels in a straight line to the turning point at the headland, its speed drops to zero and it switches from normal driving mode to crab mode; it then travels laterally a distance of one working width (1.45 m), switches back to normal driving mode, and enters the next route, traveling backward along the planned path with the rear of the vehicle now leading. The planned and measured routes are shown in Figure 13. The average path tracking error on the straight sections was 20.75 mm, and the maximum tracking error was 43.35 mm, indicating that the robot's trajectory tracking accuracy is high in the crab steering mode.
The test results are summarized in Table 1, which shows that the navigation system has high tracking accuracy. Three performance indices are employed to measure the quality of each steering algorithm: the maximum Me, average μ, and standard deviation σ of the tracking errors; their definitions can be found in [34,35]. Tracking accuracy was highest in the crab steering mode, followed by the four-wheel steering mode. The crab steering mode can therefore be considered for sowing and weeding. During pesticide application, the long spray boom results in a large turning radius, so the four-wheel steering mode is preferred. The trajectory error in the zero-radius steering mode is relatively large, mainly because of the long adjustment time required to align the robot's heading with the next planned path point; the heading angle control algorithm will be further optimized in the future.
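For reference, the three indices can be computed from the logged and planned trajectories as the cross-track distance of each logged point to the nearest planned segment (a sketch; the authors' exact error definition is in [34,35]):

```python
import numpy as np

def tracking_errors(logged, path):
    """logged: Nx2 measured positions; path: Mx2 planned waypoints.
    Returns (Me, mu, sigma): max, mean, std of cross-track errors."""
    logged = np.asarray(logged, float)
    path = np.asarray(path, float)
    errs = []
    for p in logged:
        d_min = np.inf
        for a, b in zip(path[:-1], path[1:]):          # each planned segment
            ab = b - a
            t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
            d_min = min(d_min, np.linalg.norm(p - (a + t * ab)))
        errs.append(d_min)
    errs = np.asarray(errs)
    return errs.max(), errs.mean(), errs.std()
```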

3.2. Test of Crop Row Following Function

The machine vision-based navigation system uses the vision sensor at the front of the robot to collect images of the crop rows and extracts the navigation path through image processing; the vision controller then steers the robot along the recognized path. We define the robot pose as $\mathbf{q} = [x, y, \theta]^{T}$ and the control variables as the linear and angular velocities, $\mathbf{u} = [v, \omega]^{T}$. The robot moves at a fixed, constant translational speed v, so the controller mainly regulates the angular velocity ω. The inputs to the controller are the guidance line feature vector computed from the current camera image and the expected value of the corresponding feature; the controller is a PID feedback controller.
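A minimal sketch of such a visual-servo loop is given below; the gains, the mixing of line offset and angle into a single error, and the class interface are illustrative assumptions:

```python
class RowFollower:
    """PID on angular velocity from guidance-line features in the image."""

    def __init__(self, v=0.5, kp=1.2, ki=0.02, kd=0.1):
        self.v = v                     # fixed translational speed (m/s)
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, line_offset, line_angle, dt):
        """line_offset/line_angle: fitted guidance line relative to its
        expected image position; returns the command (v, omega)."""
        err = line_offset + 0.5 * line_angle     # combined feature error (assumed)
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        omega = -(self.kp * err + self.ki * self.integral + self.kd * deriv)
        return self.v, omega
```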
In this group of experiments, the crop row following algorithm was tested with green simulated plants arranged on the test site, as shown in Figure 14a. The rows of simulated crops were laid at equal spacing; the crop height was about 10 cm, the row spacing was 40 cm, and the overall arrangement was S-shaped. Three crop rows ran between the robot's left and right wheels, and the chassis adopted the four-wheel double Ackermann steering mode. The robot was driven to the starting position, and the machine vision-based navigation mode was started. The robot automatically followed the crop rows, with the control system adjusting its posture in real time so that the robot's center point remained on the centerline of the three rows. Before the test, a high-precision integrated navigation system (CGI-430, Huace Navigation, Shanghai, China) was used to survey the latitude and longitude coordinates of the crops laid on the ground, and the rosbag record tool was used to log the robot's odometry during driving. After the experiment, the crop row coordinates collected by a handheld RTK receiver were converted to the plane coordinate system; the target and measured trajectories are shown in Figure 14b. The average path tracking error in the x direction was 47.98 mm, with a maximum of 133.34 mm; in the y direction, the average error was 53.28 mm, with a maximum of 139.68 mm. These results indicate that the machine vision-based navigation system achieves good trajectory tracking accuracy without satellite positioning.

4. Discussion

In comparison to previous work, the designed chassis is reconfigurable, with a payload capacity of 350 kg, a wheel-track adjustment range from 1.1 m to 1.9 m, and three degrees of freedom for each supporting leg. Current field management robot chassis, like BoniRob, have a payload capacity of 150 kg. A gasoline generator was installed on the chassis; the engine automatically begins charging the lithium battery when the battery level falls below 20%, giving the robot a longer uninterrupted operating time. Furthermore, to facilitate the combined control of many farm tools and the robot, we built dedicated circuits and electrical interfaces for farm tools. At present, there is little literature on robot chassis for the field management of large-acreage grain crops; many studies focus on fruit-picking robots and orchard pesticide-spraying robots. In the crab steering mode, the tracking error of the navigation software developed in this paper is smaller than the nominal 25 mm tracking error of the automatic steering systems commonly found in tractors.
The following limitations of the designed robot were also found through the aforementioned field tests and simulated field trials:
1. The pesticide sprayer mounted on the robot has not yet been tested because the control software of the targeted spraying system is still being commissioned; future research will concentrate on testing its performance.
2. Design flaws, such as damage to the steering bearings, surfaced during chassis testing. The design of the steering mechanism will be optimized in the next-generation prototype.
3. The navigation system tests showed that visual navigation under irregular planting patterns was not sufficiently reliable. To improve the stability of the navigation system, the next step is to integrate GPS-based navigation algorithms with visual navigation.

5. Conclusions

In this paper, a general farmland management robot platform is designed. It adopts an electric wheel-leg chassis that can switch between multiple steering modes (four-wheel steering, zero-radius steering, crab steering, etc.) and automatically adjust its wheel-track. It can be used for pesticide application, mechanical weeding, phenotypic information acquisition, soil nutrient sampling, and other management tasks for field crops or vegetables.
The design process of the four-wheel drive, four-wheel steering electric wheel-leg chassis is described in detail. Taking a fully loaded vehicle mass of 950 kg, a climbing angle of 15°, and a driving speed of 6 km/h as the design indicators, the power and torque required by the four-wheel independent steering system and the travel drive system were calculated; the motors, reducers, and other components were selected according to the results; and the power distribution system, communication control system, and implement controller were designed.
An agricultural robot navigation system based on the ROS robot operating system was designed, which enables unmanned driving along planned paths and along crop rows; the hardware composition and software architecture of the navigation system were introduced.
The designed full-coverage path planning algorithm, satellite positioning trajectory tracking algorithm, turning algorithms, and crop row tracking algorithm were tested outdoors. In the satellite positioning trajectory tracking tests, three modes, four-wheel steering, crab steering, and zero-radius steering, were used to switch routes; the robot achieved the highest trajectory tracking accuracy with crab steering, with an average lateral error of 20.75 mm and a maximum tracking error of 43.35 mm. The machine vision-based trajectory tracking algorithm achieved an average lateral position deviation of 47.98 mm and a maximum tracking error of 133.34 mm, which meets the needs of farmland management operations. The visual navigation method uses the inherent row structure of the crop field to guide the robot along the crop rows without building an environmental map, which can effectively avoid the crop yield losses caused by wheel crushing.
In the future, we will design different types of application modules, study their integration methods with robot navigation systems, and study multi-sensor fusion navigation algorithms to improve the adaptability and operational efficiency of robots in various environments.

Author Contributions

Conceptualization, L.C. and X.X.; methodology, L.C., F.L. and T.S.; software, L.C. and T.S.; validation, F.L. and Y.J.; formal analysis, F.L. and T.S.; investigation, F.L. and Y.J.; resources, L.C. and T.S.; data curation, F.L. and X.X.; writing—original draft preparation, L.C.; writing—review and editing, F.L.; visualization, F.L. and T.S.; supervision, X.X.; project administration, L.C. and X.X.; funding acquisition, L.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was financially supported by the National Key R&D Program of China (No.2022YFD2000700); Innovation Program of Chinese Academy of Agricultural Sciences (No. CAAS SAE-202301); the China Modern Agricultural Industrial Technology System (grant number CARS-12) and Jiangsu Province and Education Ministry Cosponsored Synergistic Innovation Center of Modern Agricultural Equipment Project (No. XTCX1004).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available in the paper.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. Diagram of the wheel-leg chassis.
Figure 2. Design scheme of the multifunctional field management robot.
Figure 3. Steering modes of the robot: (a) Ackermann steering; (b) double Ackermann steering; (c) crab steering; and (d) zero-radius turning.
Figure 4. Diagram of the power distribution system (the gray dashed lines indicate control lines to contactors or electric relays).
Figure 5. Diagram of the communication system.
Figure 6. Special control board for agricultural tools.
Figure 7. Field management robot prototype.
Figure 8. Hardware installation locations of the navigation system.
Figure 9. Flowchart of the navigation system.
Figure 10. Field testing of the autonomous robot.
Figure 11. Testing the trajectory tracking performance of the robot using four-wheel double Ackermann steering. The red line represents the measured trajectory; the black line represents the planned trajectory.
Figure 12. Testing the trajectory tracking performance of the robot using the zero-radius turning mode. The red line represents the measured trajectory; the black line represents the planned trajectory.
Figure 13. Testing the trajectory tracking performance of the robot using crab steering. The red line represents the measured trajectory; the black line represents the planned trajectory.
Figure 14. Test results of the vision-based navigation method in a simulated environment: (a) S-shaped testing environment; (b) target trajectory and measured trajectory of the robot.
Table 1. Tracking performance indices for the robot driving in a straight line (Me: maximum lateral error; μ: mean lateral error; σ: standard deviation of the lateral error).

| Steering Mode        | Me/mm  | μ/mm  | σ/mm  |
|----------------------|--------|-------|-------|
| four-wheel steering  | 79.40  | 44.79 | 34.52 |
| zero-radius steering | 107.12 | 52.61 | 38.52 |
| crab steering        | 43.35  | 20.75 | 14.39 |
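These indices follow the usual definitions: given the signed cross-track deviations of the measured trajectory from the planned straight line, Me is the maximum absolute deviation, μ the mean absolute deviation, and σ their standard deviation. A minimal sketch of this computation (illustrative only, assuming the planned line is given by two waypoints):

```python
import numpy as np

def tracking_indices(path_a, path_b, measured):
    """Compute Me (max), mu (mean), sigma (std) of the absolute lateral
    error of `measured` points (N x 2, same units as the waypoints)
    relative to the planned straight line through path_a -> path_b.
    A sketch of the standard definitions; not the authors' code.
    """
    a, b = np.asarray(path_a, float), np.asarray(path_b, float)
    d = (b - a) / np.linalg.norm(b - a)        # unit direction of the planned line
    n = np.array([-d[1], d[0]])                # unit normal to the planned line
    e = (np.asarray(measured, float) - a) @ n  # signed cross-track errors
    abs_e = np.abs(e)
    return abs_e.max(), abs_e.mean(), abs_e.std()

# Example: three measured points (in mm) against a 20 m straight line.
Me, mu, sigma = tracking_indices((0, 0), (20000, 0),
                                 [(5000, 12.0), (10000, -25.5), (15000, 18.3)])
```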