Agricultural robots will typically execute computed motions a very large number of times.

Dynamic, on-line route planning has recently received attention in the agricultural robotics literature for large-scale harvesting operations because of its economic importance and the availability of auto-guided harvesters and unloading trucks. Reported approaches compute a nominal routing plan for the harvesters assuming some initial yield map, and then route the support units based on the computed points where harvesters fill up their tanks. The plan is adjusted during operations based on updated predictions of when and where harvester tanks will be full. A recent application that falls in this category is robot-aided harvesting of manually harvested fruits, where a team of robotic carts transports the harvested crops from pickers to unloading stations, so that pickers spend less time walking. Overall, the increasing deployment of commercially available auto-guided harvesters and unloading trucks, and the emerging paradigm of replacing large, heavy machines with teams of smaller agricultural autonomous vehicles, drive the need for practical on-line route planning software.

Primary units and support autonomous vehicles form a ‘closed-loop’ system: the delays introduced by the support vehicles affect the primary units’ temporal and spatial distributions of future service requests. Reactive policies are not efficient enough, because support trucks/robots must traverse large distances to reach the primary units in the field, thus introducing large waiting times.
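To make the fill-point prediction described above concrete, the following is a minimal sketch; it is not taken from any of the cited systems, and the per-meter yield representation, function name and capacity figures are illustrative assumptions.

```python
# Hypothetical sketch: predict where along its planned track a harvester's tank will
# be full, from an (assumed) per-meter yield estimate. Names and numbers are illustrative.

def predict_fill_points(track_positions_m, yield_kg_per_m, tank_capacity_kg, current_load_kg=0.0):
    """Return track positions (meters from the row start) where the tank is predicted
    to reach capacity, assuming the harvester follows the track in order and each
    entry of yield_kg_per_m covers one meter of travel."""
    fill_points = []
    load = current_load_kg
    for pos, segment_yield in zip(track_positions_m, yield_kg_per_m):
        load += segment_yield                 # crop collected over this one-meter segment
        if load >= tank_capacity_kg:
            fill_points.append(pos)           # a support (unloading) request is expected here
            load -= tank_capacity_kg          # tank emptied into the support unit
    return fill_points

# Example: a 500 m track with a coarse initial yield estimate of 2.4 kg/m.
track = list(range(500))
print(predict_fill_points(track, [2.4] * 500, tank_capacity_kg=400.0))

# During operation, the yield estimates would be replaced by updated measurements and
# the fill points (and hence the support-unit routes) recomputed on-line.
```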

The agricultural vehicle routing problem lies under the broad category of Stochastic Dynamic VRP (SDVRP). The incorporation of predictions about future service requests has been shown to improve scheduling for SDVRP. However, most SDVRP applications are characterized by requests that are stochastic and dynamic in time, but fixed and known in terms of location. In contrast, service requests from primary units in agriculture are stochastic and dynamic both temporally and spatially. Also, the real-time and dynamic nature of agricultural operations means that very few established requests are available to the planner/scheduler, which has to rely much more on predicted requests. In addition, the optimization objective varies depending on the situation: for example, it can be minimizing waiting time, maximizing the number of served requests, and so on, whereas VRP mainly focuses on minimizing travel distance. Therefore, existing SDVRP predictive scheduling approaches are not well suited for agriculture, and more research is needed to incorporate uncertainty in on-line route planning for teams of cooperating autonomous agricultural machines.

Because computed motions are executed so many times, particular focus has been on computing paths and trajectories that are optimal in some economic or agronomic sense. Also, in most cases the vehicles are non-holonomic. The general problem of moving a vehicle from one point/pose to another lies in the area of general motion planning and is covered adequately in the robotics literature. The focus of this section is on motion planning inside field or orchard blocks. When several machines operate independently of each other in the field they do not share resources, other than the physical area they work in. Furthermore, independent robots will typically operate in different field or orchard rows, and their paths may only intersect in headland areas, which are used for maneuvering from one row to the next.

Therefore, motion planning is restricted to headland turning and involves: a) planning of independent geometrical paths for turning, and b) computing appropriate velocity profiles for these paths so that collision avoidance is achieved when two or more robot paths intersect. Problem b) is a coordinated trajectory planning problem and has been addressed in the robotics literature (a minimal velocity-coordination sketch is given below). In headlands, optimal motion planning is of particular interest, as turning maneuvers are non-productive and require time and fuel.

Auto-guided agricultural vehicles must be able to perform two basic navigation tasks: follow a row, and maneuver to enter another row. The latter requires detection of the end of the current row and the beginning of the next row. The route planning layer specifies the sequence of row traversal and the motion planning layer computes the nominal paths. During row following, precision crop cultivation requires precise and repeatable control of the vehicle’s pose with respect to the crop. Inside rows, agricultural vehicles travel at various ground speeds, depending on the task. For example, self-propelled orchard harvesting platforms move as slowly as 1-2 cm/s; tractors performing tillage operations with their implement attached and their power take-off engaged may travel at 1 km/h up to 5 km/h. Sprayers may travel at speeds ranging from 8 km/h up to 25 km/h. Vehicle working speeds in orchards are typically less than 10 km/h. The above speeds are for straight or slightly curved paths; during turning maneuvers much slower ground speeds are used. Wheel slippage is common during travel, especially in uneven or muddy terrain. Also, agricultural vehicles will often carry a trailer or pull an implement, which can introduce significant disturbance forces.
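Returning to problem b) above, in the simplest case collision avoidance at a shared headland conflict zone reduces to time-shifting one robot’s velocity profile so that the two robots never occupy the zone simultaneously. The following is a hedged sketch of that idea; the interval representation, function name and clearance value are illustrative assumptions rather than any published method.

```python
# Hypothetical sketch: delay the second robot's velocity profile so that two robots
# never occupy a shared headland conflict zone at the same time.
# Occupancy intervals are assumed to come from each robot's nominal trajectory.

def delay_to_avoid_conflict(interval_a, interval_b, clearance_s=1.0):
    """interval_a, interval_b: (t_enter, t_exit) of the conflict zone for robots A and B.
    Returns the delay (seconds) to add to robot B so that B enters the zone only after
    A has left it, plus a safety clearance. Robot A keeps priority."""
    a_enter, a_exit = interval_a
    b_enter, b_exit = interval_b
    if b_enter >= a_exit + clearance_s or a_enter >= b_exit + clearance_s:
        return 0.0  # no temporal overlap: the nominal profiles are already safe
    return (a_exit + clearance_s) - b_enter  # push B's entry past A's exit

# Example: A occupies the intersection during [12 s, 18 s], B during [15 s, 20 s].
print(delay_to_avoid_conflict((12.0, 18.0), (15.0, 20.0)))  # -> 4.0 s delay for B
```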

There are two basic auto-guidance modes: absolute and crop-relative. Absolute auto-guidance relies exclusively on absolute robot localization, i.e., real-time access to the geographical coordinates of the vehicle’s location, its absolute roll, pitch and yaw/heading, and their time derivatives. These components of the vehicle’s state are estimated based on GNSS and an Inertial Navigation System (INS). Tractor GPS-based absolute auto-guidance was first reported in 1996, after Carrier Phase Differential GPS technology became available. Since then, auto-guidance for farming using Global Navigation Satellite Systems has matured into commercial technology that can guide tractors – and their large drawn implements – with centimeter-level accuracy, on 3D terrain, when Real Time Kinematic (RTK) corrections are used.

Absolute guidance can be used for precision operations when there is an accurate georeferenced map of the field and crop rows that is valid during operations, and the vehicle knows its exact position and heading in this map in real time. Essentially, accurate vehicle positioning with respect to the crop is achieved through absolute machine positioning on the map. The first step towards this approach is to use RTK GPS-guided machines to establish the crop rows – and their map – during operations such as transplanting. After crop establishment, as long as crop growth does not interfere with driving, vehicles can use the established map to repeatedly drive on the furrows between rows using RTK GPS.

Accurate, robust and repeatable path tracking control is needed for precision guidance. The topic has received significant attention in the literature, with emphasis given to slip compensation and control of tractor-trailer systems. Approaches reported in the literature include pure pursuit (a minimal sketch is given below), side-slip estimation and compensation with model-based Lyapunov control, backstepping predictive control, fuzzy neural control, sliding mode control, and others. Model-based approaches have also been proposed, such as nonlinear model predictive control and robust nonlinear model predictive control.

Absolute auto-guidance is an established, commercially available technology that has acted as an enabler for many precision agriculture technologies for row crops, such as variable-rate application of seeds and chemicals. It has also led to recent advances in field automation, including the development of remotely supervised autonomous tractors without a cabin and master-slave operation of grain carts with combines for autonomous harvesting systems.
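As a reference point for the simplest of the path-tracking approaches mentioned above, the classical pure-pursuit steering law for a bicycle-model vehicle is sketched below. This is a generic textbook formulation, not the implementation of any specific system cited here; the wheelbase and look-ahead values are illustrative.

```python
import math

def pure_pursuit_steering(x, y, yaw, goal_x, goal_y, wheelbase_m):
    """Classical pure-pursuit steering law for a bicycle-model vehicle.
    (x, y, yaw): current rear-axle pose in the map frame (yaw in radians).
    (goal_x, goal_y): look-ahead point on the reference path.
    Returns the front-wheel steering angle in radians."""
    lookahead = math.hypot(goal_x - x, goal_y - y)     # distance to the look-ahead point
    alpha = math.atan2(goal_y - y, goal_x - x) - yaw   # bearing of that point w.r.t. heading
    # Steer so that the vehicle follows the circular arc through the look-ahead point.
    return math.atan2(2.0 * wheelbase_m * math.sin(alpha), lookahead)

# Example: vehicle at the origin heading along +x, look-ahead point 4 m ahead and
# 0.5 m to the left, wheelbase of 2.5 m (values are illustrative).
delta = pure_pursuit_steering(0.0, 0.0, 0.0, 4.0, 0.5, wheelbase_m=2.5)
print(math.degrees(delta))  # a mild left-steering correction (~9 degrees)
```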
Absolute auto-guidance is not practical in row crops or orchards where one or more of the following are true: a) no accurate crop-row map is available to be used for guidance, because crop establishment was performed with machines without RTK GPS; b) such a map exists, but changes in the environment or crop geometries may render pre-planned paths not collision-free; c) GNSS is inaccurate, unreliable or unavailable. In these operations plants grow in distinct rows and the wheels of the autonomous vehicles must drive only inside the space between rows. Examples include open-field row crops; orchards with trees/vines/shrubs and their support structures; and greenhouses and indoor farms. Crop-relative auto-guidance is necessary in the situations described above. Researchers have used various sensors, such as onboard cameras and laser scanners, to extract features from the crops themselves and use them to localize the robot relative to the crop lines or tree rows in order to auto-steer. Crop-relative guidance in open fields and orchards is still more of a research endeavor than a mature, commercial technology.

Most of the work so far has focused on row detection and following, and in particular on the estimation of the robot’s offset and heading relative to the middle line of the row between the crop lines. All approaches exploit the fact that multiple parallel crop lines are spaced at known and relatively fixed distances from each other. Although the problem of finding such crop rows in images may seem straightforward, real-world conditions introduce complications and challenges that are discussed below.
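Before turning to those challenges, the following purely geometric sketch illustrates the quantities such an estimator must produce; the line parameterization, frame conventions and numbers are illustrative assumptions, not a specific published method.

```python
import math

def row_relative_pose(left_line, right_line):
    """Each detected crop line is assumed to be expressed in the vehicle's ground-plane
    frame (x forward, y to the left) as (y_intercept_m, angle_rad): its signed lateral
    position next to the vehicle and its direction relative to the x-axis.
    Returns (lateral_offset_m, heading_error_rad) of the vehicle with respect to the
    row centerline; a positive offset means the vehicle sits left of the centerline."""
    y_left, phi_left = left_line
    y_right, phi_right = right_line
    centerline_y = 0.5 * (y_left + y_right)        # centerline position in the vehicle frame
    centerline_phi = 0.5 * (phi_left + phi_right)  # crop lines are assumed nearly parallel
    return -centerline_y, -centerline_phi          # vehicle pose relative to the centerline

# Example: left crop line 0.30 m to the left, right line 0.50 m to the right,
# both appearing rotated by +2 degrees in the vehicle frame.
off, head = row_relative_pose((0.30, math.radians(2.0)), (-0.50, math.radians(2.0)))
print(off, math.degrees(head))  # ~0.10 m left of the centerline, heading error ~ -2 degrees
```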

When the crop is visually or spectrally different from the material inside the furrows, discrimination between soil and crop is easy. However, it can be very challenging in the presence of intra-row weeds, or when there are cover crops or intercropping in the furrows, as the intra-row plants can have visual and spectral characteristics similar to those of the crops in the rows that need to be detected. Other challenges include row detection of different plant types at various crop growth stages, variability in illumination conditions during daytime or nighttime operation, and environmental conditions that affect sensing. Robustness and accuracy are very important features for such algorithms, as erroneous line calculations can cause the robot to drive over crops and cause economic damage.

Researchers have used monocular cameras in the visible or near-infrared spectrum, or multiple spectra, to segment crop rows from soil based on various color transformations and greenness indices that aim at increasing segmentation robustness against variations in luminance due to lighting conditions. Recently, U-Nets, a variant of Fully Convolutional Networks, were used to segment straw rows in images in real time. Other approaches do not rely on segmentation but rather exploit a priori knowledge of the row spacing, either in the spatial frequency domain – using band-pass filters to extract all rows at once – or in the image domain. An extension of this approach models the crop as a planar parallel texture; it does not identify crop rows per se, but computes the offset and heading of the robot with respect to the crop lines. Once candidate crop-row pixels have been identified, various methods have been used to fit lines through them. Linear regression has been used, with the participating pixels restricted to a window around the crop rows. A single-line Hough transform has also been used on each frame independently, or in combination with recursive filtering of successive frames. In an effort to increase robustness, a pattern Hough transform was introduced that utilizes data from the entire image and computes all lines at once. Researchers have also used stereo vision for navigation: in one approach, an elevation map was generated and the maximum of the cross-correlation of its profile with a cosine function was used to identify the target navigation point for the vehicle; in another, depth from stereo was used to project image optical flow to vehicle motion in ground coordinates and calculate offset and heading from the visual optical flow.

Most reported work has been based on monocular cameras, with limited use of stereo vision and 2D/3D lidars. One reason is that in early growth stages the crops can have a small surface area and short height; hence, height information is not always reliable. Given the increasing availability of real-time, low-cost 3D cameras, extensions of some of the above methods to combine visual and range data are conceivable and could improve robustness and performance in some situations. Also, given the diversity of crops, cropping systems and environments, it is possible that crop- or application-targeted algorithms can be tuned to perform better than ‘generic’ ones, with the appropriate algorithm selected based on user input about the current operation. The generation of publicly available datasets with accompanying ground truth for crop lines would also help evaluate and compare approaches.
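Several of the color-index and Hough-based steps described above can be combined into a very small pipeline. The following is a hedged sketch assuming an OpenCV/NumPy environment; the excess-green index, Otsu threshold and Hough parameters are illustrative choices rather than those of any cited work.

```python
import cv2
import numpy as np

def detect_crop_row_lines(bgr_image):
    """Segment green vegetation with the excess-green index (ExG = 2g - r - b on
    normalized channels) and fit candidate crop-row lines with a probabilistic
    Hough transform. Returns a list of line segments (x1, y1, x2, y2) in pixels."""
    img = bgr_image.astype(np.float32)
    b, g, r = cv2.split(img)
    total = b + g + r + 1e-6               # avoid division by zero on dark pixels
    exg = 2.0 * (g / total) - (r / total) - (b / total)
    exg_u8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    # Otsu thresholding separates vegetation from soil when they are spectrally distinct.
    _, mask = cv2.threshold(exg_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Probabilistic Hough transform over the vegetation mask; parameters are illustrative.
    segments = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180.0, threshold=120,
                               minLineLength=80, maxLineGap=25)
    return [] if segments is None else [tuple(s[0]) for s in segments]

# Usage (hypothetical image file):
# lines = detect_crop_row_lines(cv2.imread("row_crop_frame.png"))
```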
Orchard rows are made of trees, vines or shrubs. If these plants are short and the auto-guided robot is tall enough to straddle them, the view of the sensing system will include several rows and the guidance problem will be very similar to crop-row-relative guidance. When the plants are tall, or the robot is small and cannot straddle the row, the view of the sensing system is limited to two tree rows when the robot travels inside an alley, or to one row if it is traveling along an edge of the orchard.