Prosecution Insights
Last updated: April 19, 2026
Application No. 18/072,114

SYSTEM AND METHOD FOR AUTONOMOUS NAVIGATION OF A FIELD ROBOT

Status: Non-Final OA (§103)
Filed: Nov 30, 2022
Examiner: REDA, MATTHEW J
Art Unit: 3665
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Earthsense Inc.
OA Round: 7 (Non-Final)

Grant Probability: 54% (Moderate)
Expected OA Rounds: 7-8
Time to Grant: 3y 2m
Grant Probability With Interview: 83%

Examiner Intelligence

Career Allow Rate: 54% (126 granted / 231 resolved; +2.5% vs TC avg)
Interview Lift: +28.5% for resolved cases with interview (a strong lift)
Avg Prosecution: 3y 2m (typical timeline)
Currently Pending: 46
Total Applications: 277 (career history, across all art units)

Statute-Specific Performance

§101: 8.5% (-31.5% vs TC avg)
§103: 51.1% (+11.1% vs TC avg)
§102: 20.8% (-19.2% vs TC avg)
§112: 15.0% (-25.0% vs TC avg)

TC averages are estimates; figures are based on career data from 231 resolved cases.
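The headline figures above are simple ratios over the examiner's resolved cases. As a quick consistency check, a minimal sketch (values transcribed from this report; the implied TC average is derived, not reported, and the helper name is illustrative):

```python
# Sketch: reproduce the headline examiner statistics from the raw counts
# shown in this report.

def allow_rate(granted: int, resolved: int) -> float:
    """Allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

career = allow_rate(126, 231)      # 126 granted / 231 resolved -> about 54.5%
implied_tc_avg = career - 2.5      # derived from the reported "+2.5% vs TC avg"
with_interview = career + 28.5     # +28.5% interview lift -> about 83%

print(f"Career allow rate: {career:.1f}%")          # 54.5%
print(f"With interview:    {with_interview:.0f}%")  # 83%
```

Note that the 83% with-interview figure in the header is consistent with the 54% career rate plus the 28.5-point interview lift.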

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-2, 4-9, 11-13, 17, and 21-22 are pending and examined below. This action is in response to the claims filed 9/25/25.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 9/25/25 has been entered.

Response to Amendment

Applicant's arguments, see Applicant Remarks 35 U.S.C. § 103 filed on 9/25/25, regarding the 35 U.S.C. § 103 rejections are persuasive in view of the amendments filed 9/25/25. However, upon further consideration, new grounds of rejection are made in view of further citations to the art of record below.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 4-9, 11-13, 17, and 21-22 are rejected under 35 U.S.C.
103 as being unpatentable over Reid et al. (US 2020/0293057), in view of Madsen et al. (US 2019/0129435) and Lombrozo et al. (US 2020/0079381). Regarding claims 1 and 17, Reid discloses an autonomous vehicle trafficability system including a system for autonomous navigation of a field robot (FR), the system comprising: a memory; and a processor coupled to the memory, wherein the processor is configured to execute program instructions stored in the memory for (¶49): receiving a current location of the FR, a target location, and a set of checkpoints within a field (Figs. 3 and 11 and ¶104-106 - navigating between a start point and a destination with a series of segments in between corresponding to the recited current location, target location and checkpoints within a field), wherein a checkpoint from the set of checkpoints indicates an intermediate location between the current location and the target location on the field; determining a sequence of checkpoints based on the target location, the current location of the FR, and the set of checkpoints (Fig.
3, ¶64, ¶71, and ¶106 – mapped segments in different regions in between the start point and the destination corresponding to the recited checkpoints as intermediate locations between the current location and the target locations wherein the FR covers each checkpoint before reaching the target location where the disclosure of the general route between the start point and the destination as being a trajectory between the two points discloses a guidance path which implicitly is comprised of a sequence of checkpoints between the start and end point corresponding to the recited sequentially located intermediate locations); determining a direction of navigation for the FR based on the current location, the target location, and the sequence of checkpoints (¶89 and ¶106 – determining approach/exit angles corresponding to the recited determining a direction through the different segments for routing between start and destination corresponding to the recited based on the current location, the target location, and the sequence of checkpoints); obtaining, by the processor, a set of images of a plurality of parts of the field in real-time from one or more cameras installed on the FR during navigation of the FR in the determined direction (¶91 and ¶105 - images may be captured from a stereoscopic camera corresponding to the recited one or more cameras installed on the FR to update the hazard levels of the terramechanics model in real time corresponding to the recited set of images of a plurality of parts of the field in real-time); predicting, by the processor, a future motility of the FR using a Kinodynamic Motion Planning Model (KMPM) based on direction, angular velocity, acceleration, yaw, pitch, roll angles, traction, and velocity of the FR (¶66-69 and ¶87-89 – trafficability prediction functionality corresponding to the recited predicting a future motility of the FR using KMPM based on image and sensor data including an inertial measurement unit (IMU) 108 may provide 
information about the rover's orientation, rotational rates and linear accelerations corresponding to the recited direction, angular velocity, acceleration, yaw, pitch, roll angles and velocity of the FR, traction/slip corresponding to the recited traction), and wherein the KMPM is trained using a training dataset comprising a set of images, types of terrain, a plurality of velocities, a plurality of angular velocities, a plurality of acceleration values, a plurality of directions, yaw, pitch, roll angles, traction of a plurality of FRs, and corresponding predicted future motility of the FRs (¶66-71 and ¶87-89 – the trafficability prediction functionality is trained utilizing rover captured images and sensor data including an inertial measurement unit (IMU) 108 may provide information about the rover's orientation, rotational rates and linear accelerations corresponding to the recited direction, angular velocity, acceleration, yaw, pitch, roll angles and velocity of the FR, traction/slip corresponding to the recited traction), wherein the KMPM is trained using recursive learning based on historical data from a plurality of field robots, wherein the historical data comprises a plurality of sets of images and coefficients of traversal for each part of the field in the plurality of sets of images, and wherein the coefficients of traversal in the historical data are determined using the obstacle detection model and annotations received from human participants controlling the plurality of field robots (¶67-69 and ¶76-78 – terramechanics model corresponding to the recited KMPM utilizes a continuously trained model corresponding to the recited recursive learning based on data of terrain images and sensor data including hazard heuristics levels for the terrain corresponding to the recited coefficients of traversal corresponding to the recited historical data comprising a plurality of sets of images and coefficients of traversal where the hazard heuristics levels includes 
terrain class of the terramechanics model and sensor data included with obstacle detection for review and labeling by human operators corresponding to the recited coefficients of traversal in the historical data are determined using the obstacle detection model and annotations received from human participants); determining, by the processor, coefficients of traversal of the plurality of parts of the field in each image of the set of images based on the predicted future motility of the FR and one or more obstacles detected in an image of the set of images using an obstacle detection model and the Kinodynamic Motion Planning Model (KMPM), wherein the obstacle detection model is trained based on a data set comprising a set of images with labeled obstacles (¶68, ¶102-105 and ¶109-110 - trafficability assessment functionality comprises terrain classifiers that given images, or image signatures, can output a likely terrain class and terramechanics models for each terrain class … as well as setting the hazard heuristics levels for the terrain corresponding to the recited coefficients of traversal of the plurality of parts of the field using machine learning and deep learning techniques, for predicting slip of the vehicle at a future location based on current information corresponding to the recited more machine learning models and a kinodynamic motion planning model where the predicted slip for specific future locations corresponding to the recited predicted future motility using the terramechanics models corresponding to the recited KMPM where hazard levels include the identification of unknown or unique formations, patterns, objects, etc. as being of interest which are trained utilizing images and obstacle avoidance techniques where the object detection is used for hazard detection in the trafficability assessment as areas of interest to potentially be avoided.
While Reid does not explicitly disclose that the obstacle avoidance techniques are utilizing the same machine learning and deep learning techniques for the slip predictor models, one of ordinary skill in the art before the effective filing date would understand obstacle avoidance techniques utilized to identify unknown objects would include utilizing a trained object detection model trained utilizing a data set comprising a set of images with labeled objects. It would have been obvious to one of ordinary skill in the art before the filing date to have applied the same training techniques for all identification models utilized for autonomous navigation); wherein the identified traversable route passes sequentially through each checkpoint before reaching the target location (¶106 - the route may be determined by traversing only ‘safe’ segments between the start and destination. Once the route is determined, the data associated with the segments on the route, such as the images of the segments, may be stored (1122) and then the rover can be controlled to navigate the route (1124)). While Reid does disclose following a route segment sequence between a start and destination location utilizing obstacle avoidance techniques, it does not explicitly disclose the checkpoints as being associated with agricultural tasks or modifying the processing model to account for the geometrical and mechanical differences in the FRs. 
However, Madsen discloses a terrain traversability system including wherein the sequence of checkpoints indicate field locations to be worked on by the FR by performing one or more agricultural tasks, wherein the agricultural tasks comprise at least planting, harvesting, spraying, and soil cultivation (¶110-112 and ¶ 118 – the path is defined by specific points along the path where the different paths are associated with a task being performed by the vehicle and/or vehicle implement, the type of crop being planted, tended, or harvested, and/or other features corresponding to the recited the agricultural tasks comprise at least planting, harvesting, spraying, and soil cultivation), wherein the sequence of checkpoints is determined to minimize time required to cover the set of checkpoints and distance covered by the FR to cover all the checkpoints, and wherein the sequence is optimized for agricultural task efficiency (¶77 and ¶100-102 – the path is determined to have the shortest time to traverse corresponding to the recited the sequence of checkpoints is determined to minimize time required to cover the set of checkpoints and distance covered by the FR to cover all the checkpoints while considering optimized operational efficiency while following the particular path); identifying a traversable route for the FR based on the coefficients of traversal of the plurality of parts, the target location, and the sequence of checkpoints using a route identification model, wherein the identified traversable route passes sequentially through each checkpoint before reaching the target location, and wherein the traversable route is configured to maintain agricultural task performance requirements at each checkpoint (¶60-64, ¶99-102, ¶110-112, and Fig. 
4 – determining a path corresponding to the recited identifying a traversable route for the FR based on a terrain feature comprising a slope, identify the steepness of the slope, and determine whether the slope of the terrain feature is traversable by the vehicle corresponding to the recited coefficient of traversal as assessed along various points along the route corresponding to the recited sequentially passing each checkpoint before reaching the target location using a route identification model where the path is planned to optimize the usage of the vehicle’s operational efficiency corresponding to the recited maintain agricultural task performance requirements at each checkpoint) wherein the KMPM is modified for the FR based on geometrical and mechanical differences between the plurality of FRs and the FR, wherein geometrical difference is determined by comparing dimensions including height, width, length, and wheel diameter of the plurality of FRs and the FR, and wherein the mechanical difference is determined by comparing velocity, angular velocity, acceleration, yaw, pitch, and roll angles of the plurality of FRs and the FR (¶66-72, ¶83, ¶97, and ¶117-122 – slippage prediction model corresponding to the recited modified KMPM which is calculated utilizing vehicle specific information such as model/type, size, shape, weight, geometry, specific tire information including pressure, wear, etc. 
of the specific vehicle and implement corresponding to the recited geometrical differences for individual vehicles and additional characteristics of the vehicle corresponding to the recited mechanical differences including speed corresponding to the recited velocity, directional change corresponding to the recited angular velocity, change in speed corresponding to the recited acceleration, and heading measurements corresponding to the recited yaw, pitch, and roll angles of the vehicles); The combination of the following a route segment sequence between a start and destination location utilizing obstacle avoidance techniques and traversability of the different segments of Reid with the agricultural task points for identifying a traversable route utilizing vehicle specific characteristics of Madsen fully discloses the elements as claimed. It would have been obvious to one of ordinary skill in the art before the filing date to have combined the following a route segment sequence between a start and destination location utilizing obstacle avoidance techniques and traversability of the different segments of Reid with the agricultural task points for identifying a traversable route utilizing vehicle specific characteristics of Madsen in order to identify safe (e.g., dryer) paths for vehicles planning to drive over the terrain to avoid equipment sinking or damaging the field (Madsen - ¶48). 
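The limitation mapped above, a checkpoint sequence chosen to minimize travel time and distance, is a small routing problem; neither the claims nor the cited art fixes a particular algorithm. A purely illustrative sketch (all names and coordinates are hypothetical) using a greedy nearest-neighbor ordering over 2-D field coordinates:

```python
# Illustrative sketch only: order checkpoints so each step moves to the
# nearest unvisited checkpoint, a common heuristic for minimizing total
# travel distance. Not taken from the application or the cited references.
import math

def order_checkpoints(current, checkpoints):
    """Greedily visit the nearest unvisited checkpoint at each step."""
    remaining = list(checkpoints)
    sequence, pos = [], current
    while remaining:
        nxt = min(remaining, key=lambda c: math.dist(pos, c))
        remaining.remove(nxt)
        sequence.append(nxt)
        pos = nxt
    return sequence

seq = order_checkpoints((0, 0), [(5, 5), (1, 0), (2, 2)])
# visits (1, 0) first, then (2, 2), then (5, 5)
```

A greedy ordering like this is only a heuristic; an exact minimum-distance tour is a traveling-salesman problem, and the claims do not specify which approach is used.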
While Reid in view of Madsen does disclose utilizing traction in the calculations, it does not explicitly define how traction is calculated. However, Lombrozo discloses a friction and slippage assessment for autonomous vehicles including wherein the traction corresponds to traction between the wheels of the FR and the type of terrain, and wherein the traction is calculated based on weight of the FR, a frictional coefficient of the tires and a frictional coefficient of the type of terrain (¶38 and ¶62 – traction severity/slip is calculated utilizing location based road friction on different conditions such as ice/puddles corresponding to the recited traction between the wheels of the FR and the type of terrain as well as steering angle, normal force (weight) corresponding to the recited weight of the FR, the friction coefficient corresponding to the recited frictional coefficient of the type of terrain, vehicle acceleration, tire pressure, tread parameters corresponding to the recited frictional coefficient of the tires, road attitude, commanded braking, among other parameters). The combination of the terrain based traversability assessments/planning for agricultural tasks of Reid in view of Madsen with the traction severity calculations of Lombrozo fully discloses the elements as claimed. It would have been obvious to one of ordinary skill in the art before the filing date to have combined the terrain based traversability assessments/planning for agricultural tasks of Reid in view of Madsen with the traction severity calculations of Lombrozo in order to control the vehicle more safely and effectively in situations where the friction is very low (Lombrozo - ¶26). Regarding claim 2, Reid further discloses identifying a current location (¶104) but does not explicitly disclose utilizing GPS and GNSS.
However, Madsen further discloses the current location is received from at least a Global Positioning System (GPS), and a Global Navigation Satellite System (GNSS) (¶32 and ¶53 - system may map the environment (e.g., based on current GNSS/INS auto steering systems) where maps may be generated using data from a variety of sources, such as satellite imagery, surveying using a global navigation satellite system (GNSS) such as a global positioning system (GPS)). It would have been obvious to one of ordinary skill in the art before the filing date to have combined the autonomous vehicle trafficability system of Reid with the positioning sensors of Madsen in order to create and update terrain modeling locations in a system map (Madsen - ¶53). Regarding claim 4, Reid further discloses the current location of the FR continuously changes as the FR moves, and wherein the set of images are continuously obtained using the one or more cameras (¶99-102 – dynamic updating of the terrain characteristics to allow for the rover to continue to traverse the terrain based on the terrain images corresponding to the recited continuously obtained images as the FR moves its current location). Regarding claim 5, Reid further discloses tracking the current location of the FR to check when the FR passes through each checkpoint (¶78 – models are updated as the rover traverses through the regions and segments of the terrain corresponding to the recited tracking the current location of the FR to check when the FR passes through each checkpoint).
Regarding claim 6, Reid further discloses the coefficients of traversal are further determined by (¶104-105 - trafficability assessment functionality comprises terrain classifiers that given images, or image signatures, can output a likely terrain class and terramechanics models for each terrain class … as well as setting the hazard heuristics levels for the terrain class corresponding to the recited coefficients of traversal): identifying, by the processor, types of terrain of the plurality of parts of the field in the set of images using a terrain identification model trained (¶105-106 - trafficability assessment functionality comprises terrain classifiers that given images, or image signatures, can output a likely terrain class and terramechanics models for each terrain class … as well as setting the hazard heuristics levels for the terrain class); and determining, by the processor, the coefficients of traversal based on the types of terrain identified (¶68 and ¶104-106 – hazard heuristics levels for the terrain class for future locations of the rover utilizing obstacle avoidance and terramechanics models for the terrain class corresponding to the recited determination of coefficients of traversal). 
Reid does not explicitly disclose the terrain identification model is trained for agricultural field conditions, however Madsen further discloses wherein the coefficients of traversal are further determined by: identifying, by the processor, types of terrain of the plurality of parts of the field in the set of images using a terrain identification model trained specifically for agricultural field conditions; and determining, by the processor, the coefficients of traversal based on the types of terrain identified and their suitability for agricultural operations (¶116-118 – slippage prediction model corresponding to the recited terrain identification model used to predict the type of terrain/traversability is trained using geo-referenced features including agricultural vehicle specific information as well as a task being performed by the vehicle and/or vehicle implement, the type of crop being planted, tended, or harvested, and/or other features corresponding to the recited terrain identification model trained specifically for agricultural field conditions to determine the traversability/slope of the different locations corresponding to the recited coefficients of traversal based on the types of terrain identified and their suitability for agricultural operations). The combination of the following a route segment sequence between a start and destination location utilizing obstacle avoidance techniques and traversability of the different segments of Reid with the agricultural task points for identifying a traversable route utilizing agricultural vehicle specific characteristics of Madsen fully discloses the elements as claimed. 
It would have been obvious to one of ordinary skill in the art before the filing date to have combined the following a route segment sequence between a start and destination location utilizing obstacle avoidance techniques and traversability of the different segments of Reid with the agricultural task points for identifying a traversable route utilizing agricultural vehicle specific characteristics of Madsen in order to identify safe (e.g., dryer) paths for vehicles planning to drive over the terrain to avoid equipment sinking or damaging the field (Madsen - ¶48). Regarding claim 7, Reid further discloses the coefficient of traversal of a part of the field is a value between 1 and 0, and wherein the coefficient of traversal is 1 when the part of the field is most traversable and 0 when the part of the field is least traversable (¶71 and ¶85-88 - the hazard levels may be provided from a set number of levels or ranges, depicted as “safe” indicating that it is predicted that the rover will be able to safely traverse the terrain without getting stuck, “risky” indicating that it is predicted that there is a risk that the rover will get stuck while traversing the terrain and “unsafe” indicating that it is predicted that rover will likely get stuck when traversing the terrain where The hazard heuristics functionality 506 provides the predicted hazard level 508, which may be specified as <low, medium, high> or <safe, risky, unsafe> or <1,2,3> or any other level labels desired. Even though Reid does not explicitly disclose the use of 1 and 0 as the most and least traversable notations, it does disclose that other level labels are possible and it would have been obvious for one of ordinary skill in the art before the effective filing date to have utilized other notations such as 1 and 0 to denote the level of traversability in order to adequately map the ability for a vehicle to traverse different segments as identified. 
It has been held that where the general conditions of a claim are disclosed in the prior art, discovering the optimum or workable ranges involves only routine skill in the art, In re Aller, 105 USPQ 233.). Regarding claim 8, Reid further discloses highlighting one or more parts from the plurality of parts of the field having a coefficient of traversal greater than a defined threshold (¶71 – different regions are highlighted as “safe” or “risky” based on the hazard level corresponding to the recited parts of the field having a coefficient of traversal greater than a defined threshold). Regarding claim 9, While Reid does disclose utilizing colored images (¶72 – captured images in different operating environments and lighting utilizes different color and textures to identify the models), it does not explicitly disclose utilizing RGB images. However, Madsen further discloses the set of images comprises RGB (Red, Green, Blue) images and depth images of the plurality of parts of the field (¶44 - the camera may capture images in the human-visible spectrum (e.g., red-green-blue or “RGB” images)). It would have been obvious to one of ordinary skill in the art before the filing date to have combined the autonomous vehicle trafficability system of Reid with the RGB camera of Madsen in order to identify features in a field of view (Madsen - ¶14). 
Regarding claim 11, Reid further discloses identifying the traversable route further comprises (¶105-106 - trafficability assessment functionality comprises terrain classifiers that given images, or image signatures, can output a likely terrain class and terramechanics models for each terrain class … as well as setting the hazard heuristics levels for the terrain class): sampling a set of trajectories for the FR for each image based on the target location and the sequence of checkpoints (¶109 – identifying routes based on potential interest level of certain areas as well as the hazard levels corresponding to the recited sampling a set of trajectories); computing an average coefficient of traversal for each trajectory based on the coefficients of traversal of the one or more parts of the plurality of parts of the field in the trajectory, wherein each trajectory passes through the one or more parts of the plurality of parts of the field (¶92-96 – terramechanics models calculate an average slip of the terrain to identify the hazard levels for the different terrains corresponding to the recited average coefficient of traversal for each trajectory based on the coefficients of traversal of the one or more part of the plurality of parts of the field); and automatically selecting a trajectory with the highest average coefficient of traversal (¶106 - the route may be determined by traversing only ‘safe’ segments between the start and destination corresponding to the recited highest average coefficient of traversal). Reid does not explicitly disclose the terrain identification model is trained for agricultural field conditions, however Madsen further discloses wherein each trajectory is evaluated for agricultural task compatibility and coefficient of traversal that maintains sequential checkpoint navigation for agricultural task completion (¶60-64, ¶99-102, ¶110-112, and Fig. 
4 – determining a path corresponding to the recited identifying a traversable route for the FR based on a terrain feature comprising a slope, identify the steepness of the slope, and determine whether the slope of the terrain feature is traversable by the vehicle corresponding to the recited coefficient of traversal as assessed along various points along the route corresponding to the recited sequentially passing each checkpoint before reaching the target location using a route identification model where the path is planned to optimize the usage of the vehicle’s operational efficiency corresponding to the recited maintain agricultural task performance requirements at each checkpoint) The combination of the following a route segment sequence between a start and destination location utilizing obstacle avoidance techniques and traversability of the different segments of Reid with the agricultural task points for identifying a traversable route utilizing agricultural vehicle specific characteristics of Madsen fully discloses the elements as claimed. It would have been obvious to one of ordinary skill in the art before the filing date to have combined the following a route segment sequence between a start and destination location utilizing obstacle avoidance techniques and traversability of the different segments of Reid with the agricultural task points for identifying a traversable route utilizing agricultural vehicle specific characteristics of Madsen in order to identify safe (e.g., dryer) paths for vehicles planning to drive over the terrain to avoid equipment sinking or damaging the field (Madsen - ¶48). 
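Claim 11 as mapped above reduces to: sample candidate trajectories, average the coefficients of traversal of the field parts each one crosses, and automatically select the highest-scoring trajectory. A minimal sketch of that selection step (the data and names are illustrative, not from the application; per claim 7, coefficients lie in [0, 1] with 1 most traversable):

```python
# Sketch of the claim 11 selection logic: each candidate trajectory visits
# a list of field parts, each part carries a coefficient of traversal in
# [0, 1], and the trajectory with the highest average coefficient wins.

def select_trajectory(trajectories, coeff_of_traversal):
    """trajectories: list of lists of part IDs along each candidate path.
    coeff_of_traversal: dict mapping part ID -> value in [0, 1]."""
    def avg(traj):
        return sum(coeff_of_traversal[p] for p in traj) / len(traj)
    return max(trajectories, key=avg)

coeffs = {"a": 0.9, "b": 0.4, "c": 0.8, "d": 0.7}
best = select_trajectory([["a", "b"], ["a", "c", "d"]], coeffs)
# ["a", "c", "d"] averages 0.8, beating 0.65 for ["a", "b"]
```

This mirrors the examiner's mapping of Reid's "traverse only safe segments" routing onto the claimed highest-average-coefficient selection.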
Regarding claim 12, Reid further discloses the KMPM is trained using at least one of supervised learning, transfer learning, and recursive learning (¶73 – human supervisor of data training as well as the deep learning terrain classifier 904 may use convolutional neural networks (CNNs), recurrent neural networks (RNNs) including long short term memory (LSTM) as well as other techniques for deep learning; the claim element “at least one of” requires only one of the following to be disclosed to teach the entire element as claimed). Regarding claim 13, Reid further discloses updating the coefficient of traversal of the plurality of parts of the field by (¶107 - update the terramechanics model (1226) of the terrain class of the region of terrain): receiving inertial information of the FR from inertial sensors installed on the FR (¶66 - An inertial measurement unit (IMU) 108 may provide information about the rover's orientation, rotational rates and linear accelerations); determining a current motility of the FR in the part from the plurality of parts of the field based on the inertial information using the KMPM, wherein the inertial information comprises at least a current velocity, a current angular velocity, a current direction, and a current acceleration of the FR (¶66 and ¶91 - An inertial measurement unit (IMU) 108 may provide information about the rover's orientation, rotational rates and linear accelerations corresponding to the recited a current velocity, a current angular velocity, a current direction, and a current acceleration of the FR to determine the terrain type that the rover is currently driving on, the current slip, and the current slope); comparing the current motility with the predicted future motility of the FR for the part from the plurality of parts of the field (¶107 - predicted slip for the terrain is compared to the measured slip (1206)); and updating the coefficient of traversal of the part from the plurality of parts of the field based on the
current motility when the current motility of the FR does not match the predicted motility of the FR for the part from the plurality of parts of the field (¶107 - If the difference is within a warning threshold, the heuristic mapping for the terrain class that maps slip amounts to a hazard level is adjusted (1212)). Reid does not explicitly disclose applying the system to agricultural specific information however Madsen further discloses wherein the inertial information includes agricultural task-specific motion data and the updating accounts for agricultural field-specific conditions (¶23 and ¶46-51 – inertial navigation system utilized for determining real-time discrepancies in a pre-existing 3D terrain map, and update the 3D terrain map accordingly including the topography of vegetation and crops as well as the traversability of bodies of water in the field corresponding to the recited inertial information including agricultural task-specific motion data and updating the terrain maps with agricultural field-specific conditions) The combination of the following a route segment sequence between a start and destination location utilizing obstacle avoidance techniques and traversability of the different segments of Reid with the agricultural task points for identifying a traversable route utilizing agricultural vehicle specific characteristics of Madsen fully discloses the elements as claimed. It would have been obvious to one of ordinary skill in the art before the filing date to have combined the following a route segment sequence between a start and destination location utilizing obstacle avoidance techniques and traversability of the different segments of Reid with the agricultural task points for identifying a traversable route utilizing agricultural vehicle specific characteristics of Madsen in order to identify safe (e.g., dryer) paths for vehicles planning to drive over the terrain to avoid equipment sinking or damaging the field (Madsen - ¶48). 
Regarding claim 21, Reid further discloses that the future motility includes an ability of the FR to traverse through a part of the field (¶104-105 - the trafficability assessment functionality comprises terrain classifiers that, given images or image signatures, can output a likely terrain class, and terramechanics models for each terrain class … as well as setting the hazard heuristic levels for the terrain class, corresponding to the recited coefficients of traversal of the plurality of parts of the field; machine learning and deep learning techniques for predicting slip of the vehicle at a future location based on current information correspond to the recited machine learning models and kinodynamic motion planning model, where the predicted slip for specific future locations corresponds to the recited predicted future motility, the terramechanics models correspond to the recited KMPM, and the predicted slip for an area corresponds to the recited ability of the FR to traverse through a part of the field). Reid does not explicitly disclose applying the system to agriculture-specific information; however, Madsen further discloses while maintaining agricultural task performance capabilities (¶100 and ¶116-118 - determining the suitability of traversing the section of terrain based on other characteristics of the terrain and vehicle-specific characteristics to optimize the operating efficiency of the agricultural vehicle, corresponding to the recited maintaining agricultural task performance capabilities). The combination of Reid's following of a route segment sequence between a start and a destination location, utilizing obstacle avoidance techniques and the traversability of the different segments, with Madsen's agricultural task points for identifying a traversable route utilizing agricultural-vehicle-specific characteristics fully discloses the elements as claimed.
It would have been obvious to one of ordinary skill in the art before the filing date to have combined Reid's route segment sequence following, utilizing obstacle avoidance techniques and the traversability of the different segments, with Madsen's agricultural task points for identifying a traversable route utilizing agricultural-vehicle-specific characteristics in order to identify safe (e.g., drier) paths for vehicles planning to drive over the terrain and to avoid equipment sinking into or damaging the field (Madsen - ¶48).

Regarding claim 22, Reid does not disclose utilizing a 3D model of an area of the field; however, Madsen further discloses generating a three-dimensional (3D) model of an area of the field surrounding the FR based on the set of images and a 3D modeling machine learning algorithm trained for agricultural field environments, wherein a top-down view of the FR is generated based on the 3D model of the area, and wherein the 3D model includes agricultural field features for enhanced navigation planning (¶32-36, ¶44-47, ¶57, and ¶116-118 – a slippage prediction model, corresponding to the recited terrain identification model used to predict the terrain type/traversability, is trained using geo-referenced features including agricultural-vehicle-specific information as well as the task being performed by the vehicle and/or vehicle implement, the type of crop being planted, tended, or harvested, and/or other features, corresponding to the recited 3D modeling machine learning algorithm trained for agricultural field environments; the system generates three-dimensional (3D) terrain maps (also known as three-dimensional elevation models), corresponding to the recited generating a 3D model of an area of the field surrounding the FR, utilizing imagery and sensor data captured from ground-based vehicles, corresponding to the recited based on the set of images; the system utilizes machine learning to optimize maps using sensor input analysis algorithms and controllers to improve performance; and the system generates a two-dimensional representation of the three-dimensional terrain map that includes ground surface topography, corresponding to the recited top-down view generated based on the 3D model of the area, including determining a height of a portion of the vegetation (e.g., crops) on the terrain above the ground surface to help determine whether a crop is ready for harvesting, identify brush that may need to be cleared from a field before planting, assess the health of crops, and other uses, corresponding to the recited 3D model including agricultural field features for enhanced navigation planning). While Madsen does not explicitly disclose that the machine learning models used to optimize the maps are a 3D modeling machine learning algorithm, it would have been obvious to one of ordinary skill in the art before the filing date to utilize map-based machine learning models to generate the 3D model based on the sensor input analysis in order to optimize the maps and improve performance (Madsen - ¶57). It would have been obvious to one of ordinary skill in the art before the filing date to have combined the autonomous vehicle trafficability system of Reid with the positioning sensors of Madsen in order to create and update terrain modeling locations in a system map (Madsen - ¶53).

Additional References Cited

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Matsuda (US 2019/0257664) discloses a system for generating a path for an autonomous vehicle, including autonomous farming tractors, including optimizing the point sequence of the target path (¶33 and ¶70). Ellaboudy et al.
(US 2021/0000006) discloses that a dynamic programming or other optimization algorithm may be implemented to generate (530) the path as a sequence of waypoints to achieve the coverage objective subject to additional constraints, including constraints based on the data specifying the parameters of the vehicle (e.g., turning radius) and the implement (e.g., implement width), for automated control of vehicles in agricultural and industrial environments (Abstract and ¶103).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Matthew J. Reda, whose telephone number is (408) 918-7573. The examiner can normally be reached Monday - Friday, 7-4 ET. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Hunter Lonsberry, can be reached at (571) 272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MATTHEW J. REDA/
Primary Examiner, Art Unit 3665
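The claim 22 mapping above turns on collapsing a 3D terrain model into a top-down (2D) representation that preserves ground topography and crop height. A minimal sketch of that idea, assuming a simple max-height grid (the cell scheme, function names, and sample points are illustrative only and not taken from Madsen):

```python
# Illustrative sketch: collapse 3D points (x, y, z) into a 2D top-down grid
# keeping the highest surface point per cell; vegetation height is then the
# surface grid minus a bare-ground grid, e.g. for harvest-readiness checks.
from collections import defaultdict

CELL = 1.0  # grid resolution in metres (assumed)

def top_down_view(points):
    """Return a dict mapping (cell_x, cell_y) -> max height seen in that cell."""
    grid = defaultdict(lambda: float("-inf"))
    for x, y, z in points:
        key = (int(x // CELL), int(y // CELL))
        grid[key] = max(grid[key], z)
    return dict(grid)

def crop_height(surface, ground, key):
    """Vegetation height above bare ground for one cell."""
    return surface[key] - ground[key]

# Toy data: two canopy returns and one low return, plus matching ground returns.
surface = top_down_view([(0.2, 0.3, 1.9), (0.7, 0.1, 2.1), (1.4, 0.2, 0.4)])
ground = top_down_view([(0.2, 0.3, 0.3), (0.7, 0.1, 0.3), (1.4, 0.2, 0.3)])
tall_cell = crop_height(surface, ground, (0, 0))  # tall canopy in cell (0, 0)
```

A real pipeline would use a learned model over imagery and sensor data, as the references describe; this only shows why a 2D projection of the 3D model carries the navigation-relevant field features.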

Prosecution Timeline

Nov 30, 2022
Application Filed
Mar 16, 2023
Response after Non-Final Action
Jun 06, 2023
Non-Final Rejection — §103
Sep 06, 2023
Response Filed
Sep 15, 2023
Final Rejection — §103
Nov 08, 2023
Response after Non-Final Action
Nov 13, 2023
Response after Non-Final Action
Nov 28, 2023
Request for Continued Examination
Nov 30, 2023
Response after Non-Final Action
Jan 18, 2024
Non-Final Rejection — §103
Apr 24, 2024
Response Filed
Jun 17, 2024
Final Rejection — §103
Sep 20, 2024
Response after Non-Final Action
Sep 23, 2024
Response after Non-Final Action
Oct 29, 2024
Applicant Interview (Telephonic)
Oct 29, 2024
Examiner Interview Summary
Nov 20, 2024
Request for Continued Examination
Nov 21, 2024
Response after Non-Final Action
Feb 11, 2025
Non-Final Rejection — §103
May 14, 2025
Response Filed
Jun 26, 2025
Final Rejection — §103
Sep 25, 2025
Request for Continued Examination
Oct 03, 2025
Response after Non-Final Action
Nov 06, 2025
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12573248
AN ELECTRONIC CONTROL UNIT FOR A VEHICLE CAPABLE OF CONTROLLING MULTIPLE ELECTRICAL LOADS
2y 5m to grant Granted Mar 10, 2026
Patent 12570509
INDUSTRIAL TRUCK WITH DETECTION DEVICES ON THE FORKS
2y 5m to grant Granted Mar 10, 2026
Patent 12533065
METHOD AND APPARATUS FOR CLASSIFYING SUBJECT INDEPENDENT DRIVER STATE USING BIO-SIGNAL
2y 5m to grant Granted Jan 27, 2026
Patent 12530029
SYSTEM AND METHOD OF ADAPTIVE, REAL-TIME VEHICLE SYSTEM IDENTIFICATION FOR AUTONOMOUS DRIVING
2y 5m to grant Granted Jan 20, 2026
Patent 12525071
METHOD FOR ASSISTED OPERATING SUPPORT OF A GROUND COMPACTION MACHINE AND GROUND COMPACTION MACHINE
2y 5m to grant Granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

7-8
Expected OA Rounds
54%
Grant Probability
83%
With Interview (+28.5%)
3y 2m
Median Time to Grant
High
PTA Risk
Based on 231 resolved cases by this examiner. Grant probability derived from career allow rate.
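The projection figures above follow directly from the examiner statistics: the grant probability is the career allow rate (126 granted of 231 resolved), and the interview figure adds the observed +28.5-point lift. How the tool actually models these is not disclosed; the following only reproduces the displayed arithmetic:

```python
# Sketch of the displayed arithmetic (assumed, not the tool's actual model):
# allow rate = granted / resolved, interview figure = allow rate + lift.
GRANTED, RESOLVED = 126, 231
INTERVIEW_LIFT = 28.5  # percentage points

base = 100.0 * GRANTED / RESOLVED           # career allow rate, ~54.5%
with_interview = min(base + INTERVIEW_LIFT, 100.0)

print(f"Grant probability: {int(base)}%")          # truncated, as displayed
print(f"With interview:   {int(with_interview)}%")
```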
