Article

Enabling Autonomous Navigation on the Farm: A Mission Planner for Agricultural Tasks

by Ruth Cordova-Cardenas, Luis Emmi * and Pablo Gonzalez-de-Santos
Centre for Automation and Robotics (UPM-CSIC), 28500 Arganda del Rey, Madrid, Spain
*
Author to whom correspondence should be addressed.
Agriculture 2023, 13(12), 2181; https://doi.org/10.3390/agriculture13122181
Submission received: 20 September 2023 / Revised: 10 November 2023 / Accepted: 17 November 2023 / Published: 22 November 2023
(This article belongs to the Section Digital Agriculture)

Abstract
This study presents the development of a route planner, called Mission Planner, for an agricultural weeding robot that generates efficient and safe routes both in the field and on the farm using a graph-based approach. This planner optimizes the robot’s motion throughout the farm and performs weed management tasks tailored for high-power laser devices in narrow-row crops (wheat, barley, etc.) and wide-row crops (sugar beet, maize, etc.). Three main algorithms were integrated: Dijkstra’s algorithm to find the optimal route on the farm, the Visibility Road-Map Planner (VRMP) method to select the route within cultivated fields when roads are not visible, and an improved version of the Hamiltonian path to find the best route between the crop lines. The results support the effectiveness of the strategies implemented, demonstrating that a robot can safely and efficiently navigate the entire farm and perform an agricultural treatment, in this case study laser-based weed management. In addition, it was found that the route planner reduced the robot’s operation time, thus improving the overall efficiency of precision agriculture.

1. Introduction

In recent decades, the adoption of robots in agriculture has seen significant growth. Technological advances have drastically transformed agricultural production processes [1] to minimize operating costs, reduce environmental impact, and optimize the total production cycle. Inspired by these advances in the application of technology in the agricultural sector, the main objective of this study is to develop a mission planner for an agricultural robot that can navigate autonomously, not only in crop fields but across the entire farm. The Mission Planner presented will use georeferenced maps and the Global Navigation Satellite System (GNSS) Real Time Kinematic (RTK) [2,3], allowing the robot to move efficiently around the farm and perform specific tasks, such as weed management using laser-based technology, which is the treatment example used in this study. This process is automated by implementing specific route planning algorithms.
Map generation in precision agriculture is a constantly developing area that plays a critical role in agricultural decision making and the optimization of crop production and has benefited from various techniques and technologies, such as drones, manual sampling, online tools, and yield mapping, among others [4]. In recent years, techniques such as Simultaneous Localization and Mapping (SLAM) have generated interest in the domain of agriculture [5] thanks to their ability to integrate multiple sensors to position and locate a vehicle without prior knowledge of the operating environment; in the process, a metric map of the environment is built for future use. However, in the agricultural environment, SLAM techniques have encountered several challenges, mainly because of seasonal variation and the dimensionality of the maps, given that agricultural environments can span several hectares. Some SLAM solutions perform well in mapping and localization in urban environments but have struggled to reconstruct in 3D a field composed of dynamically growing crops. Another significant challenge is the alignment of multiple 3D submaps when generating a global 3D map [6]. As an alternative, Wang et al. [7] proposed the placement of stationary landmarks in the environment to mitigate seasonal changes. However, this solution requires modification of the environment, and the landmarks can be occluded by vegetation or worn over time.
Significant examples of automatic planning are found in Zoto et al. [8], where a system that generates coverage routes for uncrewed aerial vehicles (UAVs) in mountainous vineyards is described. This system uses georeferenced information from images captured by drones, which allows efficient planning of work routes in complex terrain. In addition to drone technology, another technique employed is manual sampling, as applied in [9]. In this approach, the user manually defines the perimeter of the terrain by selecting the appropriate latitude and longitude coordinates using a GNSS device. Then, topological navigation is carried out using landmarks on the ground and a graph-based algorithm to generate a detailed topological map to plan the optimal route of the agricultural robot specially designed for corn planting.
On the other hand, there are examples of Geographic Information Systems (GIS) for the generation of maps of the agricultural environment, as documented in [10,11]. In these cases, online tools based on free software are used to improve crop management. These tools leverage geographic data, such as aerial imagery and harvest records, to calculate productivity indicators and support agricultural decision making.
Regarding route planning, there is extensive literature on path planning, both in orchards and arable lands [12]. Linker and Blass [13] presented an approach based on the A* algorithm for determining the optimal paths for car-like vehicles in orchards, considering some vehicle and environment constraints, such as a limited steering angle, a limited range of pitch and roll angles, forward motion, and the undesirability of frequent turning. Santos et al. [14] present an off-line trajectory plan for a vineyard robot, taking into consideration the nearest recharging point. The authors use an A* algorithm that restricts the path to the maximum turning rate the robot allows and a 16-layer occupational grid map, where the vineyards and recharge station are located. Another relevant example of route planning in orchards is the work developed by Bochtis et al. [15], where routing plans for intra- and inter-row orchard operations are presented based on adapting an optimal area-coverage method developed for arable farming operations (B-patterns). B-patterns represent fieldwork tracks that cover an entire area; rather than adhering to a predefined pattern, they emerge from an optimization process based on one or more chosen criteria. In that work, the optimization criterion was the nonworking distance traveled to move from one track to the next.
On the other hand, route planning on arable lands diverges from planning in orchards since, in most cases, there is no a priori knowledge of the location of the crop, in addition to the fact that most of the work focuses on absolute coverage maps, especially on tillage and seeding tasks. Han et al. [11] developed a field path planning system capable of creating routes for complete coverage within polygonal fields, including the necessary turns at the field boundaries. The system was designed to automatically produce a map of coverage paths, following an X-shaped turning pattern, offering guidance for autonomous tillage tractors to perform efficient operations in polygonal fields. Hameed et al. [16] created a method that provides an enhanced planning strategy for producing a path map tailored for agricultural tasks, considering obstacles within the field.
Most of the works that can be found in the literature seek the optimization of the planned trajectory in terms of operation time and energy consumption, and some even consider the 3D shape of the work environment to define optimal routes. It should also be noted that the optimization problem focuses mainly on reducing the effect generated by navigation in areas outside the field, i.e., in areas where no work or treatment is carried out. This is where the vehicle turns to change crop lines or tracks.
Moreover, most mission planners focus exclusively on creating trajectories for the field without considering the structure of the farm, adjacent fields, and entry points, among others. There are some solutions for navigation in long-range open environments, as is the case of the work developed by Shah and Levine [17], where an algorithm that combines learning and planning approaches is introduced. The approach uses auxiliary data sources such as simplified roadmaps, satellite imagery, and GPS coordinates as planning aids without requiring absolute accuracy. This methodology is useful for navigation in open environments, such as a farm, but it presents specific challenges: (i) it is necessary to train the system so that it is capable of defining alternative routes given the image it perceives; (ii) it does not take into consideration the structure of the work environment beyond information from third parties, such as online maps; and (iii) it lacks knowledge of the type of environment it passes through, which is critical to defining the behavior that the robotic system must comply with.
This study introduces a Mission Planner designed for an agricultural weeding robot. The planner employs a graph-based strategy with the objective of (i) enhancing the robot’s movement in large-scale farms and (ii) facilitating in-field targeted weed management tasks. These weeding tasks are customized explicitly for high-power laser devices in two distinct crop scenarios: narrow-row crops, such as wheat and barley, and wide-row crops, such as sugar beet and maize. In addition, this study demonstrates the effectiveness of georeferenced maps and GNSS-RTK in mission planning for agricultural robots. The article’s structure is as follows: Section 2 describes the materials and methods used, including the tools and devices used to validate the approach. Section 3 presents the results that are analyzed and validated in Section 4. Finally, Section 5 summarizes the findings and discusses the implications of this research.

2. Materials and Methods

Before describing the mission planner, it is necessary to describe the types of maps used, the mobile platform, and the communication architecture used as a case study to validate it [3]. Moreover, it is also necessary to describe the methodology for building the connections between the different elements that constitute the digital description of the work environment. With these elements, presented in Section 2.1, Section 2.2, Section 2.3, Section 2.4 and Section 2.5, the mission planner shown in Section 2.6 can be described.

2.1. Types of Maps

The primary goal of a route planner is to determine the shortest route between two specific points within the operational workspace, which requires a detailed farm map. This map allows the route planner to identify and avoid obstacles and generate efficient and safe trajectories for the robot. In this study, the combination of three types of maps has been implemented, described as follows:
  • Metric maps: These maps combine information from obstacle-based maps with an occupational grid for global location and detailed navigation in local areas. Metric maps focus on representing the precise configuration and location of features in the working environment, describing the geometry of space, dimensions of obstacles, terrain features, and other relevant elements [18].
  • Topological maps: Topological maps model the working environment as a graph, capturing spatial relationships and connectivity between different locations. They use graph or network structures, where nodes represent key locations and links represent connections or paths between these nodes. Topological maps simplify the representation of the environment by capturing the overall structure and relationships between sites, facilitating route planning and navigation by autonomous robots [19,20].
  • Semantic maps: These maps focus on representing the meaning and contextual information of features in the working environment. They incorporate meta-information and attributes that provide additional knowledge about the characteristics and function of areas or objects in the environment. In the agricultural field, these maps are essential for specific crop areas, irrigated areas, areas with pests, or areas of particular interest [21].

2.2. Mobile Platform

To execute the planned routes, the Carob commercial platform, developed by Agreenculture, Toulouse, France (see Figure 1 and Table 1), was used, on which the different components have been integrated to configure a weeding control system with laser-based technology. A controller based on ROS was developed to control the platform, which converts speed commands into messages that the platform can understand [22]. Trajectory-tracking algorithms use onboard sensors to determine the robot’s position and orientation.
The selected mobile platform uses a tracked locomotion system with differential kinematics (skid steer). With this configuration, the robot movements must be optimized to ensure minimal impact on crops and soil. Therefore, specific restrictions and guidelines have been established for the robot’s rotation to ensure efficiency [23].

2.3. Communication Architecture

The communications architecture provides a complete set of tools and technologies to manage data, facilitate interoperability, and ensure connectivity in IoT (Internet of Things) environments. In the context of smart farming, FIWARE (Future Internet WARE) [24] stands out as the most appropriate option, since it offers all the functionalities required for operations such as monitoring or crop management. FIWARE has a Context Broker that plays a crucial role as a centralized repository of real-time contextual information. This functionality allows relevant data to be collected and made available for the Mission Planner’s task planning and decision making [3]. The information managed by the Context Broker includes the digital representation of the maps of the areas described below (see Section 2.4), as well as information from the mobile platform (position, state, status, etc.).

2.4. Mapping

For the data model, FIWARE uses the NGSI model, a FIWARE application programming interface, which represents context information using context elements and attributes [25]. The RESTful NGSI v2 API (application programming interface) enables the exchange of context information through defined operations, while the representation of an entity follows a specific JavaScript Object Notation (JSON) structure. This integration between the NGSI model and the FIWARE API Management enables effective communication and a coordinated exchange of information between the system components and the planner [3].
To represent the working environment, this study leverages data models created by the FIWARE community to represent previous working environments digitally. In a former work [26], an evolution of these data models was proposed, where data were conditioned to be adapted to robotic systems and oriented mainly to navigation.
This representation of the agricultural working environment includes a description of the farm, called AgriFarm, which comprises four main areas: fields (AgriParcels), roads (RoadSegment), buildings (Building), and restricted areas (RestrictedTrafficArea). Each area has its digital representation in the FIWARE data models called entities and contains a set of properties and characteristics that allow the farm to be composed of metric, topological, and semantic descriptions. Figure 2 presents the relationship between the entities that constitute a map of the farm, including some relevant properties.
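As an illustration of this representation, an AgriParcel entity in NGSI v2 style might look like the following sketch. The identifier, attribute names (including `gateLocation`), and coordinate values are hypothetical examples, not the exact data model proposed in [26]:

```python
import json

# Hypothetical AgriParcel entity in NGSI v2 style: each attribute carries a
# "type" and a "value"; geographic attributes use the geo:json attribute type.
agri_parcel = {
    "id": "urn:ngsi-ld:AgriParcel:field-001",  # illustrative identifier
    "type": "AgriParcel",
    "location": {
        "type": "geo:json",
        "value": {
            "type": "Polygon",
            "coordinates": [[[-3.48, 40.31], [-3.47, 40.31],
                             [-3.47, 40.32], [-3.48, 40.32],
                             [-3.48, 40.31]]],  # closed ring: first == last
        },
    },
    "category": {"type": "Text", "value": "arable"},
    # Field access point (Gate), as described later in Condition 3;
    # the attribute name is an illustrative assumption.
    "gateLocation": {
        "type": "geo:json",
        "value": {"type": "Point", "coordinates": [-3.48, 40.315]},
    },
}

# Serialize as it would travel to/from the Context Broker.
payload = json.dumps(agri_parcel, indent=2)
print(payload.splitlines()[1])
```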

2.5. Agricultural Mission

The next step is to establish the autonomous operation, following the vocabulary of ISO 8373, “Robots and robotic devices—Vocabulary” [27]. Planning a mission involves solving a specific task by creating a set of required steps. During this process, tasks are subdivided into smaller subtasks, and the necessary motions and actions are determined as follows (see Figure 3):
  • ACTION: A specific activity for which the robot is programmed. This involves dividing complex trajectories into simpler ones.
  • OPERATION: A set of specific actions for which the robot is programmed.
  • TASK PLANNING (Mission): A set of tasks that the robot has to perform in the working environment (farm) and must be completed in a specific order and time.
The agricultural mission involves carrying out specific operations in the working environment described above in a predetermined order and within a certain period. During the mission, the robot moves in a planned manner from a starting point, which can be any point within the farm, to an endpoint in the specific field to be treated.
The robot’s mission planning is based on coordinating various operations (see Figure 4). At the start of the mission, a set of internal actions is activated by the “Wakeup” operation. These actions are necessary to:
  • start the robot modules;
  • initialize the procedures and sensors;
  • establish communications between internal systems.
The robot then moves to specific positions in the field using the “GoTo” operation, which involves moving the robot from a point on the farm to the field where the treatment will occur.
During the movement, the “GoTo” operation is divided into two types of actions: “FollowPath” and “MoveTo”. The “FollowPath” action is intended to ensure an efficient and accurate path following a predefined route. This action is ideal when the robot needs to move along a specific, predetermined path to reach a known destination and avoid obstacles, such as a road. In contrast, the “MoveTo” action allows the robot to move between different field areas without being restricted to a fixed path, for example, when there is an open field. This approach is convenient in scenarios where it is not necessary to follow a defined route, but the robot is required to reach a specific position or destination directly and without preset limitations.
Once the robot arrives at the field where the treatment will be performed, several additional actions are carried out in the “Treatment” operation. These actions include
  • plan the movement of the robot between each crop line;
  • activate and deactivate the laser-based tool when the presence of weeds is detected between crops;
  • perform a robot’s U-turn to change the crop line in the headlands.
In addition, the design, implementation, and evaluation of the necessary actions to eliminate weeds by laser are carried out.
At the end of the mission, the “Shutdown” operation is performed to stop the system’s operation safely. This allows information and data to be saved for later analysis while ensuring the protection of the robot and helping to extend its service life.
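The mission decomposition described above (Wakeup, GoTo, Treatment, and Shutdown, each split into actions) can be sketched with simple data classes. The class and action names below are illustrative, not the planner’s actual implementation:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Action:
    name: str  # e.g., "FollowPath" or "MoveTo"

@dataclass
class Operation:
    name: str
    actions: List[Action] = field(default_factory=list)

@dataclass
class Mission:
    operations: List[Operation] = field(default_factory=list)

    def schedule(self) -> List[str]:
        # Flatten the mission into the ordered list of actions to execute.
        return [f"{op.name}/{a.name}"
                for op in self.operations for a in op.actions]

# Hypothetical mission mirroring the operations described in Section 2.5.
mission = Mission([
    Operation("Wakeup", [Action("StartModules"), Action("InitSensors"),
                         Action("EstablishComms")]),
    Operation("GoTo", [Action("FollowPath"), Action("MoveTo")]),
    Operation("Treatment", [Action("PlanRows"), Action("ToggleLaser"),
                            Action("UTurn")]),
    Operation("Shutdown", [Action("SaveData")]),
])
print(mission.schedule())
```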

2.6. Mission Planner

As introduced above, the Mission Planner relies on two main operations: “GoTo” path planning and the “Treatment” plan.

2.6.1. GoTo Path Planning

The “GoTo” operation allows the robot to autonomously and accurately move from a specific point A (start) to a point B (end) on the farm. The main objective is to ensure a safe and efficient movement of the robot from the departure area to the destination area, where it will carry out the “Treatment” task.
To ensure efficient performance in this operation, it is essential to precisely define the components and algorithms used to design an optimal route. This precision ensures the proper functioning of the “GoTo” operation and contributes to the overall efficiency of the robot.
(i) 
Input and output parameters in the “GoTo” operation
Input variables fall into two main categories: user-defined and field-related.
  • User-defined input and output variables
    • From place: Represents the starting area in the farm (Building, RoadSegment, or the same AgriParcel) from where the robot begins its navigation.
    • AgriParcel: Indicates the endpoint or destination of the “GoTo” operation.
    • Round trip: This variable determines whether the robot should perform a round trip (where the robot returns to the mission’s starting point) or a one-way operation (where the mission ends at the end of the treatment in the field itself).
  • Field-related input and output variables
    • Entities: Represents the different elements that make up the farm, such as Building, RestrictedTrafficArea, AgriParcel, and RoadSegment. These elements are extracted from the map and correctly classified according to their typology. RestrictedTrafficArea entities are forbidden areas through which the robot cannot pass.
    • Road: Represents the types of paths and their structure. In this study, considering the field where the tests were conducted, the roads have been categorized according to the material they are made of, including two main types: dirt roads and asphalt roads. In addition, it is essential to note that these roads are interconnected to facilitate the robot’s navigation.
    • Open field: Indicates if the robot is operating in an open field and must navigate safely in this type of terrain.
(ii) 
Conditions and restrictions
Within the framework of this study, conditions and constraints are established that are defined by a specific set of rules. These rules define the interaction and behavior between the various entities present on the farm, including:
Condition 1: Rules for roads and buildings
Based on the observations, it has been established that the building entity where the robot is stored is usually connected to an asphalt-type road entity. This condition has been observed on most farms where agricultural robots have been tested.
Condition 2: Rules for roads and roads (between roads)
A steady transition between asphalt roads and dirt roads has been identified in the test areas. This transition is crucial for the robot’s autonomous navigation and has been established as a condition in this analysis.
Condition 3: Rules for roads and gates
Upon observing several farms, a recurring feature was identified: most fields have some form of access, either a physical gate or a designated space for the entrance of agricultural machinery. For this reason, this property has been incorporated into the AgriParcel entity, using points that represent the coordinates of the access, which is called Gate. As mentioned in the previous condition about roads, it is essential to note that there will always be a connection between a dirt road and the field access. Therefore, it is necessary to consider this restriction in the analysis, since dirt roads prevail in the cultivation areas and ultimately connect with the entrance access to the field.
Condition 4: Rules for gates and fields
Finally, it is essential to consider the relationship between access and the field to be treated. This ensures that the robot enters through the correct access and treats the field previously defined by the AgriParcel input variable, thus optimizing the management of the cultivation areas and facilitating logistics on the farm.
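Conditions 1 to 4 can be read as a small set of allowed adjacencies between entity types. A minimal sketch of this rule set follows; the entity-type strings are illustrative shorthand, not identifiers from the FIWARE data models:

```python
# Allowed connections between farm entities, one pair per condition:
#   Condition 1: Building      <-> asphalt road
#   Condition 2: asphalt road  <-> dirt road
#   Condition 3: dirt road     <-> gate
#   Condition 4: gate          <-> field (AgriParcel)
ALLOWED = {
    frozenset({"Building", "RoadAsphalt"}),
    frozenset({"RoadAsphalt", "RoadDirt"}),
    frozenset({"RoadDirt", "Gate"}),
    frozenset({"Gate", "AgriParcel"}),
}

def can_connect(a: str, b: str) -> bool:
    """Return True if an edge between entity types a and b is permitted."""
    return frozenset({a, b}) in ALLOWED

print(can_connect("Building", "RoadAsphalt"))  # True
print(can_connect("Building", "Gate"))         # False
```

Using unordered `frozenset` pairs makes each rule symmetric, so the check holds regardless of which entity the graph builder visits first.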
(iii) 
Route planner approach to the “GoTo” operation
Once the inputs and outputs are established, as well as the conditions and restrictions for the various entities that make up the agricultural farm, it is critical to determine the algorithm that will be used for route planning in the “GoTo” operation. After a thorough analysis, the importance of selecting a route planner that considers topography, crop layout, and obstacle location has been identified [28].
In this context, a method based on graphs has been chosen for the route planner since it visually represents all the farm entities. Each area is represented as a node in the graph, and the nodes are interconnected by edges, as shown in Figure 5. The entities depicted include B = Building, R.A = asphalt road, R.D = dirt road, OF = open field, G = Gate, and F = AgriParcel.
To build the full topological map, the weights of the edges in the graph are assigned based on the distance between the interconnected nodes on the metric map, in strict accordance with the conditions and rules established above. The initial connection is established between the Building and the Road, and the weight of the edge is calculated based on the distance between their nearest points, fulfilling the observations made in Condition 1. To establish the connection between roads, Condition 2 must be complied with. In addition, when connecting the Road and the Gate, Condition 3 must be respected at all times to ensure the validity of the connection. It is important to note that this strategy is effective when good visibility exists in the working environment and the robot can identify possible collisions, given that the maneuver to transit between a Road and a Gate, as well as the maneuver of crossing the Gate, can be a hazardous situation for the robot. However, manual control will be required in cases of limited or no visibility. This condition can arise from various factors, including fog, rain, darkness, uneven terrain, and even challenges related to the robot’s perception of its surroundings.
The Dijkstra algorithm is used to obtain an optimal route for the robot within the farm, according to the graph obtained in Figure 5. This algorithm finds the shortest path between two points in a graph, considering the weight of the edges connecting the nodes. This algorithm was selected since it can find an optimal path, although it sacrifices computing speed [29]. It should be noted that the Mission Planner is designed to be an offline system located in the cloud, so it does not require real-time performance. Notably, Dijkstra’s algorithm considers the presence of obstacles in the way, assigning greater weight to nodes representing them compared to nodes representing accessible paths. This ensures that the robot can navigate safely and avoid collisions during its trajectory.
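A minimal sketch of this step follows, using a farm graph in the spirit of Figure 5. The edge weights are hypothetical distances in meters, not measured values from the test farm:

```python
import heapq

# Hypothetical farm graph: B = building, RA = asphalt road, RD = dirt road,
# OF = open field, G = gate, F = field (AgriParcel).
graph = {
    "B":  {"RA": 5},
    "RA": {"B": 5, "RD": 10, "OF": 20},
    "RD": {"RA": 10, "G": 8},
    "OF": {"RA": 20, "G": 15},
    "G":  {"RD": 8, "OF": 15, "F": 2},
    "F":  {"G": 2},
}

def dijkstra(graph, start, goal):
    """Shortest path by total edge weight, using a priority queue."""
    pq = [(0, start, [start])]
    seen = set()
    while pq:
        dist, node, path = heapq.heappop(pq)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node].items():
            if nxt not in seen:
                heapq.heappush(pq, (dist + w, nxt, path + [nxt]))
    return float("inf"), []

dist, path = dijkstra(graph, "B", "F")
print(dist, path)  # 25 ['B', 'RA', 'RD', 'G', 'F']
```

Here the route through the roads (25 m) beats the detour through the open field (42 m), which is how the planner prefers structured paths when they exist.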
If there is limited visibility and no defined route to establish the connection between the Gate and the Field, performing low-level planning using geometric space to represent the operating environment in 2D is necessary. In this scenario, AgriParcel entities are considered hazardous areas with no clear path for the robot. To address this situation, the Visibility Road-Map Planner (VRMP) trajectory planning algorithm is used [30] to construct a network of roads representing the visible areas of the space, as shown in Figure 5. This network is used to plan safe routes that allow the robot to move around, avoiding obstacles.
VRMP is based on the premise that trajectory planning is most efficient when considering information about visible areas of space rather than relying solely on geometric details [31]. To do this, visibility nodes are generated from the start and destination coordinates, along with the vertices of the obstacle polygons. A roadmap is then created using this information, and Dijkstra’s search algorithm is employed to determine the shortest route from the starting point to the destination point. In this way, the trajectory is efficiently planned, considering the visible areas and avoiding dangerous areas.
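A simplified sketch of this idea follows: the roadmap nodes are the start, the destination, and slightly inflated obstacle vertices; an edge is added whenever the segment between two nodes does not cross the obstacle, and Dijkstra then runs on the roadmap. The square obstacle and coordinates are hypothetical, and the intersection test deliberately ignores segments that merely touch the obstacle boundary:

```python
import heapq
import itertools
import math

# Hypothetical square obstacle (e.g., a hazardous AgriParcel) and its edges.
square = [(2, 2), (4, 2), (4, 4), (2, 4)]
edges = [(square[i], square[(i + 1) % 4]) for i in range(4)]

def crosses(p, q, a, b):
    """True if segment pq properly crosses segment ab (touching ignored)."""
    def orient(o, u, v):
        c = (u[0]-o[0])*(v[1]-o[1]) - (u[1]-o[1])*(v[0]-o[0])
        return (c > 1e-12) - (c < -1e-12)
    o1, o2 = orient(p, q, a), orient(p, q, b)
    o3, o4 = orient(a, b, p), orient(a, b, q)
    return o1 != o2 and o3 != o4 and 0 not in (o1, o2, o3, o4)

def visible(p, q):
    return not any(crosses(p, q, a, b) for a, b in edges)

start, goal = (0, 3), (6, 3)
# Inflate the obstacle corners outward so roadmap edges clear the obstacle.
nodes = [start, goal, (1.9, 1.9), (4.1, 1.9), (4.1, 4.1), (1.9, 4.1)]
roadmap = {n: {} for n in nodes}
for p, q in itertools.combinations(nodes, 2):
    if visible(p, q):
        roadmap[p][q] = roadmap[q][p] = math.dist(p, q)

def dijkstra(graph, s, t):
    pq, seen = [(0.0, s, [s])], set()
    while pq:
        dist, node, path = heapq.heappop(pq)
        if node == t:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node].items():
            if nxt not in seen:
                heapq.heappush(pq, (dist + w, nxt, path + [nxt]))
    return float("inf"), []

dist, path = dijkstra(roadmap, start, goal)
print(round(dist, 3), path)
```

The direct start-to-goal segment crosses the obstacle, so the planner routes around it through two inflated corners, exactly the behavior the visibility roadmap is meant to produce.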

2.6.2. “Treatment” Plan: Route Planning

The “Treatment” operation is a fundamental process within the Mission Planner since it is responsible for planning the robot’s route to perform the agricultural task, i.e., the treatment. To achieve this, an improved version of the Hamiltonian path is used, which allows finding an optimal solution in this complex environment.
During this process, the robot navigates the field on a permitted route without damaging the crop, detecting the presence of weeds. This treatment process helps improve crop quality and reduce production costs, making it a valuable tool for farmers.
(i) 
Input and output parameters in “Treatment”
Input variables fall into three main categories:
  • User-defined input variables
    • AgriParcel: Indicates the working area where the “Treatment” operation will occur.
    • interRow: The distance between crop lines. This information can also be defined by the type of crop belonging to the field.
    • Headland: The space available at the end of each crop row where the robot performs the U-turn maneuvers.
  • Robot-defined input variables
    • Robot and implement dimensions: The accuracy with which the robot can work within fields depends mainly on its size and the implement it uses. Therefore, it is crucial to review the characteristics of the robot carefully. In this study, those characteristics are detailed in Table 1.
  • Input variables defined by the field
    • Field area: Contains the shape and dimensions of the field; this can be a closed polygon with more than four sides.
    • gateLocation: Holds the coordinates of the access door to the field.
    • Bearing: Contains the direction of crops expressed in degrees measured clockwise from true or magnetic north.
(ii) 
Output variables
The result of the system is a graphical representation of the work area, showing the crop lines of the field and the lines through which the robot must pass without damaging the crop. Figure 6 presents an example of a field populated with crops, given its contour and the lanes (Robot path) through which the robot must pass based on the width of the implement.
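As a simplified sketch of how such lanes can be derived, the following computes lane centerlines for a rectangular field aligned with the crop bearing; the field dimensions and implement width are hypothetical, and real fields would additionally require projecting the irregular contour onto the bearing direction:

```python
# Hypothetical rectangular field, 10 m across the crop direction, worked by a
# 2 m wide implement: lanes are spaced one implement width apart, with the
# first centerline half an implement width in from the field edge.
def lane_centers(x_min: float, x_max: float, implement_width: float):
    """Return the across-row coordinates of the lane centerlines."""
    n_lanes = int((x_max - x_min) // implement_width)
    return [x_min + implement_width / 2 + i * implement_width
            for i in range(n_lanes)]

print(lane_centers(0.0, 10.0, 2.0))  # [1.0, 3.0, 5.0, 7.0, 9.0]
```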
(iii) 
Conditions and restrictions
The system also sets certain conditions and restrictions to ensure the efficiency of the process. For example, the vertex closest to the gate should be identified as the starting point for creating the first lane. In addition, crop lines are only created if their length is greater than or equal to 70% of the calculated maximum field length. This is because, in many situations, the contour of the field is irregular, which can generate small crop lines that do not correspond to reality.
(iv) 
Route planner approach
A Hamiltonian path visits every vertex of a graph exactly once without repeating any of them [32]. Determining whether a graph has a Hamiltonian path requires a thorough exploration of all possible routes. This process ensures no solution is lost and is helpful in route optimization problems. These techniques are widely recognized in route planning because they can find optimal solutions, even with incomplete or partial information available.
In this study, an improved Hamiltonian path algorithm is implemented. To achieve this, it is necessary to define the graph’s vertices corresponding to the start and end points of the Robot path, or lanes, through which the robot must travel. Figure 6 presents these trajectories as the lines in red, where the nodes on one side are represented with odd numbers and on the other side with even numbers (see Figure 6b). Among the improvements applied to the Hamiltonian trajectory, the incorporation of weights on the edges of the graph stands out. The weight between any node i and any node j is defined as W_(i,j), where i, j ∈ N and N is the set of indices that identify the nodes of the graph, with the size of N being even. Additionally, jumps are implemented, whose weights are determined according to the allowed jumps, considering the robot’s U-turn capability and the headland’s space.
To perform the computation of the weights that connect nodes at the headland of the crop field, a base weight W_base1 is calculated, which serves as a seed to estimate the weight of the transitions below the jumps, given by Equation (1):
W b a s e 1 = 2 · j u m p s · W r e f
where W r e f is a predefined weight (set at 30), and j u m p s is the desired number of lines for the robot to skip on the U-turns. Subsequently, the value of the weights corresponding to the possible transitions of lines smaller than the desired jump is defined, given by Equation (2):
W_i,i+k = W_base1 / k,   for i mod 2 = (i + k) mod 2 and 2 ≤ k ≤ 2 · jumps        (2)
In addition, a second base weight W b a s e 2 is calculated that serves as a seed to estimate the weight of the transitions above the jumps, given by Equation (3):
W_base2 = jumps / ((N_nodes − jumps) · (jumps − 1))        (3)
where N_nodes represents the total number of nodes in the graph. Moreover, Equation (4) presents the computation of the weights corresponding to the possible transitions of lines greater than the jumps:
W_i,i+k = W_ref + W_base2 · (k − (2 · jumps + 2))        (4)
where i mod 2 = (i + k) mod 2 and N_nodes > k > 2 · jumps + 1.
To force the planner to choose the crossing of crop lines as a predefined route, the weights between nodes on opposite field headlands are defined in Equation (5):
W_i,i+1 = W_lane,   for i mod 2 = 0        (5)
where W_lane has been defined as a weight several orders of magnitude smaller (0.01) than the reference weight W_ref.
This allows the planner to be valid for any robot with any morphology and traction system. Moreover, this configuration parameter is critical, both for users when defining the behavior they want the robot to follow and for the robot when defining the type of maneuver to execute for smoother navigation in the fields. Altogether, this weighting scheme seeks to ensure that the route planner is effective and optimal for the treatment. The current version of the Mission Planner defines the number of jumps as a user-defined parameter. However, this parameter could also be estimated from the aforementioned robot capabilities, although this is outside the scope of this work.
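As an illustration, the weighting scheme of Equations (1)–(5) and the minimum-cost path search over the resulting graph can be sketched as follows. This is a toy reconstruction under stated assumptions: nodes are 0-indexed here, each crop lane is assumed to join nodes 2m and 2m+1, the W_base2 expression is one possible reading of Equation (3), and a simple brute-force search stands in for the authors' improved Hamiltonian path algorithm.

```python
import itertools
import math

W_REF = 30.0    # predefined reference weight from the paper
W_LANE = 0.01   # near-zero weight that encourages lane crossings

def build_weights(n_nodes, jumps):
    """Edge-weight matrix for the lane-endpoint graph.

    Nodes are 0-indexed (the paper numbers the two headland sides with odd
    and even indices); a crop lane is assumed to join nodes 2m and 2m+1.
    The W_base2 formula is a reconstruction and assumes jumps >= 2.
    """
    w_base1 = 2 * jumps * W_REF                                # Eq. (1)
    w_base2 = jumps / ((n_nodes - jumps) * (jumps - 1))        # Eq. (3), reconstructed
    W = [[math.inf] * n_nodes for _ in range(n_nodes)]
    for i in range(n_nodes):
        for j in range(n_nodes):
            if i == j:
                continue
            k = abs(i - j)
            if k == 1 and min(i, j) % 2 == 0:
                W[i][j] = W_LANE                               # Eq. (5): cross the lane
            elif i % 2 == j % 2 and 2 <= k <= 2 * jumps:
                W[i][j] = w_base1 / k                          # Eq. (2): below the jump
            elif i % 2 == j % 2 and k > 2 * jumps + 1:
                W[i][j] = W_REF + w_base2 * (k - (2 * jumps + 2))  # Eq. (4): above it
    return W

def best_hamiltonian_path(W, start=0):
    """Minimum-cost Hamiltonian path from `start` by brute force.

    Fine for a demo-sized graph; the paper's improved search scales better.
    """
    n = len(W)
    best_cost, best_path = math.inf, None
    for perm in itertools.permutations([v for v in range(n) if v != start]):
        path = (start,) + perm
        cost = sum(W[a][b] for a, b in zip(path, path[1:]))
        if cost < best_cost:
            best_cost, best_path = cost, path
    return best_path, best_cost

# Four crop lanes (8 endpoint nodes), skipping two lines on each U-turn;
# the start node 0 plays the role of the vertex closest to the gate.
W = build_weights(8, jumps=2)
path, cost = best_hamiltonian_path(W)
print(path)
```

With jumps = 2, same-side transitions that skip up to two lines cost W_base1/k (cheapest at the desired jump), longer skips grow linearly from W_ref, and the near-zero W_lane entries steer the search through the crop lanes, so the returned order approximates the desired U-turn behavior.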

3. Results

The route planner presented in this work was validated in an experimental farm setup at the Centre for Automation and Robotics (CAR). The farm (see Figure 7) consists of an area made up of 6 fields, separated into two blocks (Fields 1 to 5; Fields 6 and 7), each with a specific gate. The map also includes a set of buildings, only two of which allow access to the robot (Buildings T and C). Moreover, the farm includes a small network of roads and restricted areas.
Using the geographical data obtained from the map and following the agreed approach for the mission planner, the user selects the starting point (From_place) and the endpoint (AgriParcel) where the laser weeding treatment will be performed. According to the task and its attributes, the operations are generated in a preset order, starting with the route planner [26].
In the “GoTo” operation, which includes the actions necessary for its execution, Dijkstra’s algorithm is used to find the optimal route from the start point to the endpoint, as illustrated in Figure 8. The algorithm selects the shortest and safest route, following a known route built mainly from the known roads. This provides safe navigation and prevents entry into unknown areas that are potentially dangerous for the robot.
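The “GoTo” search can be illustrated with a standard Dijkstra implementation over a topological farm graph. The node names and edge costs below are invented for the example; the planner derives them from the farm map entities.

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path on a weighted graph {node: [(neighbor, cost), ...]}.

    Returns (total_cost, path); (inf, []) if the goal is unreachable.
    """
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, c in graph.get(node, []):
            if nxt not in visited:
                heapq.heappush(queue, (cost + c, nxt, path + [nxt]))
    return float("inf"), []

# Toy farm graph: a building connected to the road network, two candidate
# roads toward the field gate, and the gate giving access to the field.
farm = {
    "Building_T": [("Road_A1", 10)],
    "Road_A1": [("Road_A2", 40), ("Road_D1", 25)],
    "Road_A2": [("Gate_1", 15)],
    "Road_D1": [("Gate_1", 45)],
    "Gate_1": [("Field_3", 5)],
}
cost, route = dijkstra(farm, "Building_T", "Field_3")
print(route)  # → ['Building_T', 'Road_A1', 'Road_A2', 'Gate_1', 'Field_3']
```

Here the asphalt branch (Road_A2) wins despite the shorter dirt-road hop, because the total edge cost to the gate is lower; in the planner, such costs can encode both distance and safety preferences.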
In cases where the routes between the fields are not visible, the VRMP method is used to select the route that the robot should follow within the fields without damaging the crops (see Figure 9). Several solutions are obtained as a result, and the algorithm selects the safest one with the lowest cost, always avoiding crossing any field.
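The core of a visibility road-map planner is the visibility test between candidate waypoints: an edge is added to the roadmap only if the straight segment between two waypoints crosses no field polygon. A minimal sketch of that test follows; the geometry helpers, waypoints, and square field are invented for the example.

```python
def _ccw(a, b, c):
    # Signed area test: >0 if a->b->c turns counter-clockwise.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, q1, q2):
    """True if segments p1-p2 and q1-q2 properly cross each other."""
    d1, d2 = _ccw(q1, q2, p1), _ccw(q1, q2, p2)
    d3, d4 = _ccw(p1, p2, q1), _ccw(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def visible(a, b, obstacles):
    """A waypoint pair is 'visible' if segment a-b crosses no obstacle edge."""
    for poly in obstacles:
        edges = zip(poly, poly[1:] + poly[:1])
        if any(segments_intersect(a, b, e1, e2) for e1, e2 in edges):
            return False
    return True

# A square field blocks the direct line between two road waypoints:
field = [(2, 2), (6, 2), (6, 6), (2, 6)]
print(visible((0, 4), (8, 4), [field]))   # → False: the line cuts the field
print(visible((0, 0), (8, 0), [field]))   # → True: it passes south of the field
```

In a VRMP, only the visible pairs become roadmap edges, after which a shortest-path search (such as the Dijkstra step used for roads) picks the lowest-cost route that never crosses a field.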
After the “GoTo” operation is completed, the “Treatment” operation is performed. This allows the user to select the number of jumps between crop lines and how many lines the robot should treat, ensuring that the robot can pass between the crops along a path generated for its transit (see Figure 10).
Finally, the data of all these processes are integrated and stored in the Mission task, which contains a set of geographical points representing both the route the robot must follow and the type of operation the robot must perform in autonomous mode. These points are stored in the order in which each operation must be completed.
On the map, the greenhouse (Building T) was selected as the mission’s starting point and a field (Field 3) as the endpoint. The number of jumps between crop lines was set to three, and the return was set to FALSE to illustrate the mission generated and the trajectory followed without overlapping. Figure 11 presents the geographic coordinates of the robot (green path) while executing the planned mission (blue path). It should be noted that the minimum turning radius of the robot was set to 1.5 m, which does not allow the robot to rotate on its own axis. Therefore, at intersections of the planned mission where changes in direction are significant (around 90°), the robot must perform turning maneuvers that respect the minimum turning radius, which forces it to leave the trajectory. This has been defined as a desired behavior of the robot: it accepts a temporary deviation from the trajectory to ensure that, upon reaching the intersection point, it has the desired orientation to continue to the next waypoint.
Table 2 presents the data on the duration and distance traveled by the robot in the different stages of the mission. During the Wakeup operation, the robot must align the GNSS antenna with the IMU sensor. This process is specific to this robot, so the actions executed during it are not considered in the mission planning. The “GoTo” operation combines approaching the roads and the field where the treatment is carried out and following the roads. During the “Treatment” operation, the robot performs two main actions: following the crop rows (“FollowPath”) and completing the U-turns on the headlands (“MoveTo”). It should be noted that during the Treatment, two events occurred that were not foreseen in the mission planning (denoted as Action None): (i) an initialization time of the systems for crop-row following, and (ii) a pause of the robot when it encountered an obstacle.
Figure 12 presents a temporal graph indicating the moments during the mission where the robot carried out the planned actions and operations.

4. Discussion

The first fundamental aspect to consider when validating the mission generator is the ability to perform U-turns, which is important due to the dimensions and particularities of the robot used in this study. In the test carried out in Field 6, it was shown that the U-turn time represents around 46% of the total treatment time in motion. It is relevant to note that performing a U-turn can entail a significant cost, depending on the robot’s morphology and traction system, as well as its maneuvering capability. For this reason, instead of opting for wide turns or backing up briefly before changing direction, the robot has been configured to perform U-turns without backing up. This strategy was designed to increase the efficiency of direction changes and, simultaneously, avoid the loss of usable area. The importance of adapting this maneuver to the robot is confirmed by numerous tests that have consistently demonstrated the necessity of this turn for efficient and safe maneuverability in various situations. The results obtained in these tests support the effectiveness of the implemented path planner and the algorithms used, as they generated satisfactory and practical paths for the robot.
A key strategy employed to simplify route planning was dividing problems into three types of maps. This division has proven beneficial, as it has facilitated the consideration of various variables and characteristics when planning robot routes. Moreover, the description of the working environment using FIWARE smart-data models (NGSIv2) has proven useful, allowing it to integrate metric and semantic properties.
The process begins with obtaining a graphical reference highlighting the farm’s relevant features, such as its geographic location and territorial boundaries. The three types of maps that play an essential role in this research are generated from this information. However, to carry out this task effectively, it is imperative to have a tool that allows adjustments to be made according to the configuration of conditions and constraints that affect the different elements of the farm. An example of these types of tools is Geojson.io [33], which is an open-access tool to create, change, and publish maps.
These adjustments, although small in scale, are of utmost importance. They are due both to the possibility that the information provided may not be up to date and to the need to assist the robot in overcoming challenges that arise due to the proximity or distance between critical farm elements, such as roads, buildings, and fields, as well as the presence of restricted areas for the robot. In addition, it has been observed that the treatment required in the fields varies according to the orientation of the crops, which is limited by the robot’s dimensions. This information has been considered to ensure a smooth passage of the robot between crops, especially when they are planted with different crop spaces (interRow) and the shape of the field is not entirely polygonal.
Updating maps plays a critical role in ensuring the feasibility and optimal performance of the Mission Planner, especially regarding the specific conditions and constraints mentioned in Section 2.6.1.
In an ever-changing agricultural environment, where elements such as buildings, fields, and roads may undergo modifications, it is essential to keep the maps used by the planner up to date. This ensures that the information depicted on the maps accurately reflects the reality of the terrain. When conditions, such as the junction between a building and a paved road, are not met or changed, adapting and adjusting the maps is crucial. For example, an adequate junction between the building and the asphalt road is required for smooth navigation. When any of these rules are not met, it is suggested that the nearest road be considered, adjusting its properties and attributes to simulate an asphalt road. This flexibility in route planning ensures system continuity and optimal performance.

5. Conclusions

The Mission Planner introduced in this paper addressed two key objectives: (1) to find an optimal path for agricultural weeding robots across large-scale farms and (2) to provide an integrated solution for weed management tasks. The extension of the planner to the whole farm marks a significant milestone in agricultural robotics and precision farming. Moreover, this study comprehensively explored mission planning for autonomous robots in agricultural environments, emphasizing an integrated approach combining multiple vital strategies adapted to the different navigation scenarios an autonomous agricultural robot can encounter.
Incorporating diverse map types, including metric, topological, and semantic maps, has significantly enriched the understanding of the agricultural environment by providing detailed information about the physical terrain configuration and capturing the spatial relationships and connectivity between locations. Moreover, the meticulous definition of rules to define the relationships between the different working areas within the farm has resulted in the generation of highly accurate topological representations of the environment, ensuring top-quality route planning. The inherent flexibility of these rules allows this solution to adapt to different agricultural scenarios, increasing its applicability and versatility.
Regarding scalability, the proposed solution made significant strides by incorporating generic data models such as FIWARE’s NGSI. This strategic choice has enabled this contribution to go beyond the limitations of applying only to a particular type of agricultural robot. This approach also fosters collaboration and innovation within the agricultural and robotics community.
In future work, machine learning techniques will be explored for integration into the existing system. These techniques could allow the robot to learn and adapt in real time to different scenarios, further improving its ability to overcome obstacles safely and efficiently.

Author Contributions

Conceptualization, L.E., R.C.-C. and P.G.-d.-S.; methodology, R.C.-C. and L.E. software, R.C.-C. and L.E.; validation, R.C.-C. and L.E.; investigation, L.E., R.C.-C. and P.G.-d.-S.; resources, P.G.-d.-S.; writing—original draft preparation, R.C.-C.; writing—review and editing, L.E. and P.G.-d.-S.; supervision, L.E. and P.G.-d.-S.; funding acquisition, P.G.-d.-S. All authors have read and agreed to the published version of the manuscript.

Funding

This article is part of a project that has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No 101000256.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethical Committee of CSIC.

Data Availability Statement

The dataset associated with this research paper contains data obtained from experimental tests conducted with an autonomous robot in an outdoor field environment. We are committed to promoting transparency, reproducibility, and the sharing of our research findings. The dataset is available for research and verification purposes in DIGITAL.CSIC at https://doi.org/10.20350/digitalCSIC/15665 named “Data on the WeLASER robot carrying out a mission in the experimental fields of the Center for Automation and Robotics (UPM-CSIC)”, accessed on 3 November 2023.

Conflicts of Interest

The authors were employed by the Spanish National Research Council (CSIC).

References

  1. Fountas, S.; Mylonas, N.; Malounas, I.; Rodias, E.; Hellmann Santos, C.; Pekkeriet, E. Agricultural robotics for field operations. Sensors 2020, 20, 2672. [Google Scholar] [CrossRef]
  2. Lytridis, C.; Kaburlasos, V.G.; Pachidis, T.; Manios, M.; Vrochidou, E.; Kalampokas, T.; Chatzistamatis, S. An overview of cooperative robotics in agriculture. Agronomy 2021, 11, 1818. [Google Scholar] [CrossRef]
  3. Emmi, L.; Fernandez, R.; Gonzalez-de-Santos, P.; Francia, M.; Golfarelli, M.; Vitali, G.; Sandmann, H.; Hustedt, M.; Wollweber, M. Exploiting the Internet Resources for Autonomous Robots in Agriculture. Agriculture 2023, 13, 1005. [Google Scholar] [CrossRef]
  4. Pedersen, S.M.; Lind, K.M. Precision Agriculture: Technology and Economic Perspectives, 2nd ed.; Springer International Publishing: Cham, Switzerland, 2017; pp. 1–21. [Google Scholar]
  5. Ding, H.; Zhang, B.; Zhou, J.; Yan, Y.; Tian, G.; Gu, B. Recent developments and applications of simultaneous localization and mapping in agriculture. J. Field Robot. 2022, 39, 956–983. [Google Scholar] [CrossRef]
  6. Se, S.; Lowe, D.; Little, J. Vision-based mapping with backward correction. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Kyoto, Japan, 23–27 October 2022. [Google Scholar]
  7. Wang, L.K.; Hsieh, S.C.; Hsueh, E.W.; Hsaio, F.B.; Huang, K.Y. Complete pose determination for low altitude unmanned aerial vehicle using stereo vision. In Proceedings of the IEEE/RSJ international conference on intelligent robots and systems, Edmonton, AB, Canada, 2–6 August 2005. [Google Scholar]
  8. Zoto, J.; Musci, M.S.; Khaliq, A.; Chiaberge, M.; Aicardi, I. Automatic path planning for unmanned ground vehicle using UAV imagery. In Advances in Service and Industrial Robotics: Proceedings of the 28th International Conference on Robotics in Alpe-Adria-Danube Region (RAAD 2019), Kaiserslautern, Germany, 19–21 June 2019; Springer: Berlin/Heidelberg, Germany; Cham, Switzerland, 2019. [Google Scholar]
  9. Nehme, H.; Aubry, C.; Solatges, T.; Savatier, X.; Rossi, R.; Boutteau, R. Lidar-based structure tracking for agricultural robots: Application to autonomous navigation in vineyards. J. Intell. Robot. Syst. 2021, 103, 61. [Google Scholar] [CrossRef]
  10. Zhang, N.; Taylor, R.K. Applications of a Field–Level Geographic Information System (FIS) in Precision Agriculture. Appl. Eng. Agric. 2001, 17, 885. [Google Scholar] [CrossRef]
  11. Han, X.; Kim, H.J.; Jeon, C.W.; Moon, H.C.; Kim, J.H.; Seo, I.H. Design and field testing of a polygonal paddy infield path planner for unmanned tillage operations. Comput. Electron. Agric. 2021, 191, 106567. [Google Scholar] [CrossRef]
  12. Chakraborty, S.; Elangovan, D.; Govindarajan, P.L.; ELnaggar, M.F.; Alrashed, M.M.; Kamel, S. A comprehensive review of path planning for agricultural ground robots. Sustainability 2022, 14, 9156. [Google Scholar] [CrossRef]
  13. Linker, R.; Blass, T. Path-planning algorithm for vehicles operating in orchards. Biosyst. Eng. 2008, 101, 152–160. [Google Scholar] [CrossRef]
  14. Santos, L.; dos Santos, F.N.; Mendes, J.; Ferraz, N.; Lima, J.; Morais, R.; Costa, P. Path planning for automatic recharging system for steep-slope vineyard robots. In Proceedings of the Third Iberian Robotics Conference (ROBOT 2017), Seville, Spain, 22–24 November 2017. [Google Scholar]
  15. Bochtis, D.; Griepentrog, H.W.; Vougioukas, S.; Busato, P.; Berruto, R.; Zhou, K. Route planning for orchard operations. Comput. Electron. Agric. 2015, 113, 51–60. [Google Scholar] [CrossRef]
  16. Hameed, I.A.; Bochtis, D.D.; Sørensen, C.G.; Vougioukas, S. An object-oriented model for simulating agricultural in-field machinery activities. Comput. Electron. Agric. 2012, 81, 24–32. [Google Scholar] [CrossRef]
  17. Shah, D.; Levine, S. Viking: Vision-based kilometer-scale navigation with geographic hints. In Proceedings of the Robotics: Science and Systems (RSS), New York, NY, USA, 27 June–1 July 2022; p. 3. [Google Scholar]
  18. Thrun, S. Learning metric-topological maps for indoor mobile robot navigation. Artif. Intell. 1998, 99, 21–71. [Google Scholar] [CrossRef]
  19. Fermin-Leon, L.; Neira, J.; Castellanos, J.A. Incremental contour-based topological segmentation for robot exploration. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2017), Singapore, 29 May–3 June 2017; pp. 2554–2561. [Google Scholar]
  20. Kuipers, B.; Byun, Y.T. A robot exploration and mapping strategy based on a semantic hierarchy of spatial representations. Robot. Auton. Syst. 1991, 8, 47–63. [Google Scholar] [CrossRef]
  21. Ruiz-Sarmiento, J.; Galindo, C.; Gonzalez-Jimenez, J. Building Multiversal Semantic Maps for Mobile Robot Operation. Knowl.-Based Syst. 2017, 119, 257–272. [Google Scholar] [CrossRef]
  22. Koubâa, A. (Ed.) Robot Operating System (ROS); Springer: Cham, Switzerland, 2017. [Google Scholar]
  23. Moysiadis, V.; Tsolakis, N.; Katikaridis, D.; Sørensen, C.G.; Pearson, S.; Bochtis, D. Mobile Robotics in Agricultural Operations: A Narrative Review on Planning Aspects. Appl. Sci. 2020, 10, 3453. [Google Scholar] [CrossRef]
  24. López-Riquelme, J.A.; Pavón-Pulido, N.; Navarro-Hellín, H.; Soto-Valles, F.; Torres-Sanchez, R. A software architecture based on FIWARE cloud for Precision Agriculture. Agric. Water Manag. 2017, 183, 123–135. [Google Scholar] [CrossRef]
  25. Smart Agrifood—FIWARE Foundation Open Source Platform. Available online: https://www.fiware.org/community/smart-agrifood/ (accessed on 18 September 2023).
  26. Emmi, L.; Parra, R.; González-de-Santos, P. Digital representation of smart agricultural environments for robot navigation. In Proceedings of the 10th International Conference on ICT in Agriculture, Food & Environment (HAICTA 2022), Athens, Greece, 22–25 September 2022. [Google Scholar]
  27. Robotics—Vocabulary (No. ISO 8373:2021). Available online: http://www.springer.com/lncs (accessed on 18 September 2023).
  28. Vahdanjoo, M.; Sorensen, C.G. Novel route planning method to improve the operational efficiency of capacitated operations. Case: Application of organic fertilizer. AgriEngineering 2021, 3, 458–477. [Google Scholar] [CrossRef]
  29. Candra, A.; Budiman, M.A.; Hartanto, K. Dijkstra’s and a-star in finding the shortest path: A tutorial. In Proceedings of the 2020 International Conference on Data Science, Artificial Intelligence, and Business Analytics (DATABIA), Medan, Indonesia, 16–17 July 2020. [Google Scholar]
  30. Siméon, T.; Laumond, J.P.; Nissoux, C. Visibility-based probabilistic roadmaps for motion planning. Adv. Robot. 2000, 14, 477–493. [Google Scholar] [CrossRef]
  31. Murrieta-Cid, R.; Tovar, B.; Hutchinson, S. A sampling-based motion planning approach to maintain visibility of unpredictable targets. Auton. Robot. 2005, 19, 285–300. [Google Scholar] [CrossRef]
  32. Rahman, M.S.; Kaykobad, M. On Hamiltonian cycles and Hamiltonian paths. Inf. Process. Lett. 2005, 94, 37–41. [Google Scholar] [CrossRef]
  33. About Edit GeoJSON. Available online: http://geojson.io/about.html (accessed on 26 October 2023).
Figure 1. Mobile platform carrying the laser-weeding implement in a maize field.
Figure 2. Relations and properties of the FIWARE entities that made up the farm map.
Figure 3. Mission structure.
Figure 4. Elements that define a task and their planning strategies.
Figure 5. Topological map used on the “GoTo” planner (B = Building, R.A = Road Asphalt, R.D = Road dirt, OF = Open Field, G = Gate, and F = AgriParcel).
Figure 6. Representation of a plane with all study elements: (a) plane with eight vertices and (b) plane with six vertices.
Figure 7. Map of the experimental farm at the Centre for Automation and Robotics (CAR).
Figure 8. The optimum path using Dijkstra’s algorithm (B = Building, R.A = Road Asphalt, R.D = Road dirt, OF = Open Field, G = Gate, and F = AgriParcel).
Figure 9. VRMP method applied to the field of study.
Figure 10. Treatment operation on Field 3.
Figure 11. Example of an entire mission planned from Building T to Field 3.
Figure 12. Temporal graph of the different operations and actions executed by the robot.
Table 1. Mobile platform features.
| Properties | Values |
| --- | --- |
| Type of robot | Autonomous tracked vehicle |
| Steering mechanism | Skid-steer |
| Propulsion system | Motor and batteries |
| Maximum speed | 6 km/h |
| Position accuracy | ±0.015 m |
| Positioning system | GNSS with real-time kinematics (RTK), inertial measurement units (IMU) |
| Dimensions | 1.76 m × 1.5 m × 1.647 m (width × length × height) |
| Distance between tracks | 0.80–2.20 m (adjustable) |
| Minimum distance for crops | 1.48 m |
Table 2. Duration and distance data in the execution of the different operations and actions.
| Operation | Action | Duration [s] | Distance Traveled [m] |
| --- | --- | --- | --- |
| Wakeup | None | 66.9 | 8.8 |
| GoTo | MoveTo | 309.5 | 72.2 |
| GoTo | FollowPath | 324.5 | 126.4 |
| Treatment | MoveTo | 638.1 | 85.7 |
| Treatment | FollowPath | 741.2 | 288.1 |
| Treatment | None | 146.8 | 0 |
| Total |  | 37.11 min | 581.2 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
