Prosecution Insights
Last updated: April 19, 2026

Application No. 18/051,361
SYSTEMS AND METHODS FOR AUTOMATIC FIELD INFORMATION DETERMINATION

Status: Non-Final Office Action (§103)
Filed: Oct 31, 2022
Examiner: DEL VALLE, LUIS GERARDO
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Deere & Company
OA Round: 4 (Non-Final)

Forecast
Grant Probability: 72% (Favorable)
Expected OA Rounds: 4-5
Expected Time to Grant: 2y 11m
Grant Probability with Interview: 96%
Examiner Intelligence

Career Allow Rate: 72% (111 granted / 154 resolved), +20.1% vs Tech Center average — grants above average
Interview Lift: +23.8% allowance-rate lift on resolved cases with an interview — strong
Typical Timeline: 2y 11m average prosecution; 30 applications currently pending
Career History: 184 total applications across all art units
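The headline figures above can be cross-checked from the raw counts. A minimal sketch, assuming the dashboard's "interview lift" is simply the gap between the with-interview allowance rate (96%) and the career allowance rate:

```python
# Cross-check of the dashboard figures above. Assumption: "interview
# lift" = with-interview allowance rate minus career allowance rate.
granted, resolved = 111, 154          # "111 granted / 154 resolved"
career_rate = granted / resolved      # career allow rate
with_interview = 0.96                 # "Grant Probability with Interview: 96%"
lift = with_interview - career_rate

print(f"career allow rate: {career_rate:.1%}")   # 72.1%
print(f"interview lift:    {lift:+.1%}")         # +23.9%
```

The +23.9% computed here versus the displayed +23.8% suggests the dashboard's underlying with-interview rate is slightly below the rounded 96%; the arithmetic is otherwise consistent.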

Statute-Specific Performance

  Statute   Rate     vs TC Avg
  §101      13.1%    -26.9%
  §103      60.5%    +20.5%
  §102      11.2%    -28.8%
  §112      12.7%    -27.3%

Tech Center averages are estimates. Based on career data from 154 resolved cases.

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Examiner’s Response re: 103 Rejection

Examiner’s Response re: Claim 1: Applicant’s arguments, see pages 10-11, filed 04 Nov 2025, with respect to the rejection(s) of claim(s) 1 under 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Ellaboudy, O’Donnell, Davis, Brown, and Losch.

Examiner’s Response re: Claim 11: Applicant’s arguments, see pages 9-10, filed 04 Nov 2025, with respect to the rejection(s) of claim(s) 11 under 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Ellaboudy, O’Donnell, Davis, Brown, and Flajolet.

Examiner’s Response re: Claim 19: Applicant’s arguments, see pages 10-11, filed 04 Nov 2025, with respect to the rejection(s) of claim(s) 19 under 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Ellaboudy, O’Donnell, Davis, Brown, and Schoon.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 4-18, and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Ellaboudy et al., US 20210000006 A1 (herein, Ellaboudy), in view of Losch et al., US 20140343803 A1 (herein, Losch), O’Donnell, US 20200332479 A1 (herein, O’Donnell), Davis et al., US 20180243772 A1 (herein, Davis), and in further view of Brown et al., US 20230091677 A1 (herein, Brown).

Regarding Claim 1, Ellaboudy discloses a mobile machine (FIGS. 1 and 13, #110) comprising: ground-engaging traction elements (FIG. 13, ¶[0048] – “…may include wheels, tracks, and/or treads…”) configured to be driven to propel the mobile machine over a surface (¶[0048] – “…the vehicle 110 is configured to move across land…”) of a worksite (FIG. 13 illustrates a worksite); a position sensor system (FIG. 1, #148) configured to detect a geographic location (¶[0055] – “…position sensors, including but not limited to those employing lasers, hall effect, resistor, switches and photogates to obtain position…”) of the mobile machine and generate sensor data (¶[0055] – “vehicle state”) indicative of the detected geographic position of the mobile machine; and an observation sensor system (FIG. 1, #140, Claim 6) configured to detect a plurality of plants (¶[0053] – “…depicting one or more plants in a vicinity of the vehicle 110…” – i.e., a plurality of plants) at or around the worksite, and external of the mobile machine (¶[0053] – “…connected to the vehicle…” – i.e., it is external; and FIG. 13), and generate sensor data indicative of the plants (¶[0053] – “…difference vegetation index…”).

Ellaboudy discloses a processing system (FIG. 6, #600), where the processed sensor data is indicative of the plurality of plant locations of the plants (¶[0113] – “…detected crop row with a crop row…”) at or around the worksite, wherein the processed mobile machine sensor data is generated by the processing system of the mobile machine (FIG. 6, #640 and ¶[0113] – “…match detected crop row…”), but does not disclose: a boundary location of at least a portion of a plants of interest area boundary comprising the plurality of plants in the worksite; configured to generate processed sensor data based on the sensor data indicative of the detected geographic location of the mobile machine and the sensor data indicative of the plurality of plants at or around the worksite, wherein the processed sensor data is indicative of plant locations of the plants at or around the worksite and a boundary location of at least a portion of a plants of interest area boundary in the worksite.

However, Losch teaches a boundary location of at least a portion of a plants of interest area boundary comprising the plurality of plants in the worksite.
(¶[0004] – “The agricultural working machine often works through a field of crop and, once the boundary of the field of crop is reached, enters the headland…”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by Ellaboudy to include a boundary location of the plants of interest as taught by Losch. Doing so provides enhanced information due to knowing the location of the boundary of the plants of interest during the agricultural process.

Modified Ellaboudy does not teach: configured to generate processed sensor data based on the sensor data indicative of the detected geographic location of the mobile machine and the sensor data indicative of the plurality of plants at or around the worksite, wherein the processed sensor data is indicative of plant locations of the plants at or around the worksite and a boundary location of at least a portion of a plants of interest area boundary in the worksite.

However, O’Donnell teaches: configured to generate processed sensor data based on the sensor data indicative of the detected geographic location of the mobile machine and the sensor data indicative of the plants at or around the worksite (FIGS. 2 and 5, ¶[0031] – “…a control interface may be configured to display, for example, at least part of a map of the work surface 158 and/or of the worksite generally, a travel path associated with the slave machine 146,… a worksite map that identifies the location, size, and/or other parameters of objects disposed on and/or at least partly beneath the area and/or the work surface 158,…”), wherein the processed sensor data is indicative of plant locations of the plants at or around the worksite and a boundary location of at least a portion of a plants of interest area boundary in the worksite.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by modified Ellaboudy to include the mobile machine’s geographic detection as taught by O’Donnell. Doing so provides enhanced information due to knowing the location of the mobile machine during the agricultural process.

Modified Ellaboudy discloses a map (¶[0044] – “a map”), worksite, plants, location, boundary, and processed sensor data, but does not disclose: a map generator configured to generate a map of the worksite based on the processed mobile machine sensor data, that is based on the location data indicative of the geographic location of the mobile machine and the plant data indicative of the plants, wherein the map indicates the plurality of plants at the plant locations and the at least a portion of the plants of interest area boundary at the boundary location.

However, Brown teaches a map generator (FIG. 3, #300, ¶[0066] – “…assists in generating cropped area maps…”) but does not disclose: configured to generate a map of the worksite based on the processed mobile machine sensor data, that is based on the location data indicative of the geographic location of the mobile machine and the plant data indicative of the plants, wherein the map indicates the plurality of plants at the plant locations and the at least a portion of the plants of interest area boundary at the boundary location.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by Ellaboudy to include a map generator as taught by Brown. Doing so provides enhanced flexibility by providing a mobile agricultural machine with the capability to generate the map of the worksite area and its corresponding plants/crops.

However, Davis teaches a map generator configured to generate a map of the worksite based on the processed mobile machine sensor data, that is based on the location data indicative of the geographic location of the mobile machine and the plant data indicative of the plants, wherein the map indicates the plurality of plants at the plant locations and the at least a portion of the plants of interest area boundary at the boundary location (FIG. 1 and ¶[0027] – “…(1) the position data and associated attitude of the beam 10 or vehicle 11 from one or more location-determining receivers (22, 24) or the electronic data processor 903,…path planning module 910 can use a survey, field boundaries and keep-out zones, or prior maps to generate a path plan for the sprayer vehicle 11 to cover an entire area of a field with spray with minimal overlap of crop inputs.”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by Ellaboudy to include the generating of a map based on location data of the machine and plants as taught by Davis. Doing so provides enhanced flexibility by providing a mobile agricultural machine with the capability to generate the map of the worksite area and its corresponding plants/crops based on location data.

Ellaboudy discloses the mobile machine and map but does not disclose a control system configured to control the mobile machine based on the map. However, O’Donnell teaches a control system configured to control the mobile machine based on the map (¶[0031] – “…operator station 166 may include a console and/or other levers or controls for operating the slave machine 146…”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by Ellaboudy to include a control system as taught by O’Donnell. Doing so provides the capability to control the mobile machine, based on the map, via a control system.

Regarding Claim 4, modified Ellaboudy further discloses wherein the observation sensor system includes a near-infrared (NIR) sensor (FIG. 1, #144 – infrared spectrum sensors) configured to detect the plants at or around the worksite, and external of the mobile machine (¶[0053]), and to generate the plant data indicative of the plants (¶[0060] – “…detect the one or more plants based on the image data; responsive to detecting the one or more plants, adjust…”).

Regarding Claim 5, modified Ellaboudy further discloses wherein the observation sensor system (140) comprises a multi-spectral imager (¶[0143] – multispectral camera) that includes an imager (¶[0143] – imagers) configured to receive near-infrared (NIR) radiation reflected from plants, to detect the plants at or around the worksite, and external of the mobile machine, based on the received NIR radiation, and to generate the plant data indicative of the plants (¶[0130] – crop) and non-plant data indicative of non-plant objects (¶[0087] – “…the geographic area may include or be part of a farm, a mine, a warehouse, or a construction site…”).

Regarding Claim 6, modified Ellaboudy further discloses wherein the observation sensor system (140) is configured to receive near-infrared (NIR) radiation reflected from plants and non-plant objects (¶[0053] – “…configured to capture light in bands of the spectrum corresponding to plant vitality…”), external of the mobile machine, and is further configured to detect the plants and non-plant objects at or around the worksite based on the received NIR radiation (FIG. 13 illustrates, and ¶[0087] describes, plants/non-plant objects around the worksite), and generate the plant data indicative of the plants and non-plant data indicative of the non-plant objects (¶[0087]); wherein the processing system (600) is further configured to generate the processed mobile machine (110) sensor data further indicative of locations of the non-plant objects (FIG. 6, #630, ¶[0109] – “…a map representing locations of physical objects in a geographic area…” – i.e., physical objects are non-plants) at or around the worksite based on the location data indicative of the geographic location of the mobile machine and the non-plant data indicative of the non-plant objects (¶[0087]); wherein the map generator (Brown, 300) is further configured to generate the map of the worksite that further indicates the non-plant objects at locations, of the non-plant objects, at or around the worksite (Brown, ¶[0066]).

Regarding Claim 7, modified Ellaboudy further discloses wherein the map generator is configured to generate the map of the worksite to include a plant map layer (¶[0104] – map layer) indicative of the at least a portion of the plants of interest area boundary at the boundary location in the worksite and indicative of the plants at the plant locations at or around the worksite (¶[0104] – “…a particular region of crop might be affected by disease…”), based on the processed mobile machine (110) sensor data; a non-plant map layer (¶[0104] – “…a map layer may associate stationary features on the map to a set…For example, the trees in a map…”) indicative of the non-plant objects at the plant locations at or around the worksite, based on the processed sensor data; and a plant and non-plant map layer (¶[0087] – “…the map data structure may include point cloud data representing the positions of objects (e.g., trees or other plants, furrows, buildings, fences, and/or shelves)…”) indicative of the at least a portion of the plants of interest area boundary at the boundary location in the worksite, indicative of the plants at the plant locations at or around the worksite, and indicative of the non-plant objects at the locations, of the non-plant objects (¶[0104] – “…a map layer may associate stationary features on the map…”), at or around the worksite, based on the processed mobile machine (110) sensor data (¶[0104] – “…the vehicle may recognize features (e.g., plants) as it navigates and control a connected implement accordingly based on those features perceived.”).

Regarding Claim 8, modified Ellaboudy further discloses wherein the observation sensor system comprises: a first component (¶[0132] – “…captured using one or more image sensors…”) configured to capture near-infrared (NIR) (144) radiation reflected from plants and non-plant objects at or around the worksite, and external of the mobile machine (¶[0132] – “…connected to a vehicle…”); and a second component (FIG. 11, #1114) configured to capture electromagnetic radiation belonging to a region (¶[0143] – “…and/or ultraviolet bands…”) of the electromagnetic (EM) spectrum different than a NIR region (¶[0143] – “…narrow bands…”) of the EM spectrum; wherein the observation sensor system (140) is configured to generate the plant data indicative of the plants (¶[0053] – “…one or more plants…”) and non-plant data indicative of the non-plant objects (¶[0054] – “…reflecting the locations of objects in a vicinity of the vehicle…”); and wherein the processing system (600) generates the processed mobile machine (110) sensor data indicative of locations of the plants at or around the worksite (FIG. 6, #620), indicative of locations of the non-plant objects at or around the worksite (FIG. 6, #630), and indicative of the at least a portion of the plants of interest area boundary in the worksite (FIG. 6, #640 – crop row has boundary), based on the location data indicative of the geographic location of the mobile machine (FIG. 6, #650), the plant data indicative of the plants, and non-plant data indicative of the non-plant objects (FIG. 6, ¶[0114] – “…estimate based on a detected furrow…”).

Regarding Claim 9, modified Ellaboudy further discloses wherein the processing system is configured to utilize a non-machine learning algorithm to generate the processed mobile machine (110) sensor data indicative of the locations of the plants (¶[0205] – “…other line fitting algorithm (e.g., a Hough transform or a random sample consensus (RANSAC) algorithm) to the positions associated with the edge of the raised planting bed of the crop row…”) at or around the worksite and the locations of the non-plant objects at or around the worksite (¶[0082] – “…a sensing algorithm to plan an alternate route may include estimating three-dimensional size of the obstacle…”); and wherein the processing system is further configured to utilize a machine learning algorithm (¶[0107] – a neural network) to generate the processed mobile machine (110) sensor data further indicative of a type of non-plant object based on the non-plant data indicative of the non-plant objects (¶[0107] – “…a neural network based real-time object detection algorithm, the vehicle may be programmed to stop completely for the objects like rocks, human, trees, fences, or other vehicles, or go over the unharmed and non-dangerous objects like grass, weeds, dust, or hays…”).

Regarding Claim 10, modified Ellaboudy further discloses the plants of interest area boundary comprising a plurality of crop rows (FIG. 32 illustrates a plurality of crop rows that have a boundary area per #3240 – “bounding box”).

With respect to Claim 17, please see the rejections above with respect to Claims 6 and 9, drawn to a mobile machine, which are commensurate in scope with Claim 17 being drawn to a method.

Claims 11-16, 18, and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Ellaboudy et al., US 20210000006 A1 (herein, Ellaboudy), in view of Losch et al., US 20140343803 A1 (herein, Losch), O’Donnell, US 20200332479 A1 (herein, O’Donnell), Davis et al., US 20180243772 A1 (herein, Davis), and in further view of Brown et al., US 20230091677 A1 (herein, Brown).

Regarding Claim 11, Ellaboudy discloses a computer implemented method of generating a map of a worksite (FIG. 4, ¶[0071] – “…feature map can be generated using several different methods…”) comprising: detecting a geographic location (¶[0055] – “…position sensors, including but not limited to those employing lasers, hall effect, resistor, switches and photogates to obtain position…”) of a mobile machine (FIGS. 1 and 13, #110) in a non-plants of interest area and generating sensor data indicative of a geographic location of the mobile machine in the non-plants of interest area (¶[0082] – “…The system may then decide whether to stop, go over obstacles (e.g., based on 100% certainty), or plan an alternate route around obstacles. For example, a sensing algorithm to plan an alternate route may include estimating three-dimensional size of the obstacle, calculating width of the route to travel,…”).

Ellaboudy does not disclose: while the mobile machine is located in the non-plants of interest area, detecting, by one or more sensors on the mobile machine, at least a portion of a boundary of a plants of interest area, that includes a plurality of plants, at the worksite, wherein the boundary of the plants of interest area separates the plants of interest area from the non-plants of interest area; and generating sensor data indicative of the at least a portion of the boundary of the plants of interest area detected by the one or more sensors on the mobile machine.
However, Flajolet teaches: while the mobile machine is located in the non-plants of interest area, detecting, by one or more sensors on the mobile machine, at least a portion of a boundary of a plants of interest area, that includes a plurality of plants, at the worksite, wherein the boundary of the plants of interest area separates the plants of interest area from the non-plants of interest area; and generating sensor data indicative of the at least a portion of the boundary of the plants of interest area detected by the one or more sensors on the mobile machine (¶[0052] – “Once the autonomous machine 100 is dispatched to this agricultural field and once a weeding cycle by the autonomous machine 100 is subsequently initiated by an operator (e.g., locally or remotely), the autonomous machine 100 can, in Block S110: navigate to the specified start location (e.g., around rather than through the georeferenced boundary of the agricultural field); orient itself into alignment with the longitudinal direction of a first set of crop rows at the start location; and accelerate to the target ground speed parallel to the first set of crop rows…”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by Ellaboudy to include detecting, by the sensors, while the mobile machine is located in the non-plants of interest area, as taught by Flajolet. Doing so provides enhanced information due to knowing the location of the boundary of the plants of interest during the agricultural process.

Modified Ellaboudy discloses generating, with a processing system (FIG. 6, #600), the processed sensor data indicative of the plurality of plant locations of the plants (¶[0113] – “…detected crop row with a crop row…”) at or around the worksite, wherein the processed mobile machine sensor data is generated by the processing system of the mobile machine (FIG. 6, #640 and ¶[0113] – “…match detected crop row…”), but does not disclose: a boundary location of at least a portion of a plants of interest area boundary comprising the plurality of plants in the worksite; configured to generate processed sensor data based on the sensor data indicative of the detected geographic location of the mobile machine and the sensor data indicative of the plurality of plants at or around the worksite, wherein the processed sensor data is indicative of plant locations of the plants at or around the worksite and a boundary location of at least a portion of a plants of interest area boundary in the worksite.

However, Losch teaches a boundary location of at least a portion of a plants of interest area boundary comprising the plurality of plants in the worksite (¶[0004] – “The agricultural working machine often works through a field of crop and, once the boundary of the field of crop is reached, enters the headland…”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by modified Ellaboudy to include a boundary location of the plants of interest as taught by Losch. Doing so provides enhanced information due to knowing the location of the boundary of the plants of interest during the agricultural process.

Ellaboudy discloses a map (¶[0044] – “a map”), worksite, plants, location, boundary, and processed sensor data, but does not disclose: generating, with a map generator configured to generate a map of the worksite based on the processed mobile machine sensor data, that is based on the location data indicative of the geographic location of the mobile machine and the plant data indicative of the plants, wherein the map indicates the plurality of plants at the plant locations and the at least a portion of the plants of interest area boundary at the boundary location.

However, Brown teaches a map generator (FIG. 3, #300, ¶[0066] – “…assists in generating cropped area maps…”) but does not disclose: configured to generate a map of the worksite based on the processed mobile machine sensor data, that is based on the location data indicative of the geographic location of the mobile machine and the plant data indicative of the plants, wherein the map indicates the plurality of plants at the plant locations and the at least a portion of the plants of interest area boundary at the boundary location.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by Ellaboudy to include a map generator as taught by Brown. Doing so provides enhanced flexibility by providing a mobile agricultural machine with the capability to generate the map of the worksite area and its corresponding plants/crops.

However, Davis teaches generating, with a map generator configured to generate a map of the worksite based on the processed mobile machine sensor data, that is based on the location data indicative of the geographic location of the mobile machine and the plant data indicative of the plants, wherein the map indicates the plurality of plants at the plant locations and the at least a portion of the plants of interest area boundary at the boundary location (FIG. 1 and ¶[0027] – “…(1) the position data and associated attitude of the beam 10 or vehicle 11 from one or more location-determining receivers (22, 24) or the electronic data processor 903,…path planning module 910 can use a survey, field boundaries and keep-out zones, or prior maps to generate a path plan for the sprayer vehicle 11 to cover an entire area of a field with spray with minimal overlap of crop inputs.”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by Ellaboudy to include the generating of a map based on location data of the machine and plants as taught by Davis. Doing so provides enhanced flexibility by providing a mobile agricultural machine with the capability to generate the map of the worksite area and its corresponding plants/crops based on location data.

Ellaboudy discloses the mobile machine and map but does not disclose controlling the mobile machine based on the map. However, O’Donnell teaches controlling the mobile machine based on the map (¶[0031] – “…operator station 166 may include a console and/or other levers or controls for operating the slave machine 146…”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by Ellaboudy to include a control system as taught by O’Donnell. Doing so provides the capability to control the mobile machine, based on the map, via a control system.

Regarding Claim 12, modified Ellaboudy further discloses wherein controlling the mobile machine comprises: generating a route for the mobile machine (FIG. 1 and ¶[0044] – “…predetermined path data structure, which may specify a desired path for a vehicle as a sequence of waypoints in a map of a geographic area. For example, waypoints of the path may include implement control data that specify how a mounted implement is to be used at locations associated with the respective waypoints…”) based on the map of the worksite.
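For orientation, the path data structure Ellaboudy’s ¶[0044] describes in the Claim 12 analysis (a desired path as a sequence of waypoints, where waypoints may carry implement control data) can be sketched as follows. This is an editor’s illustrative model only; the field names and the reached-radius logic are assumptions, not anything disclosed in the cited references:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Waypoint:
    """One point on a planned path: a map position plus optional
    implement control data to apply at that location (per ¶[0044])."""
    x: float                             # easting, meters (illustrative frame)
    y: float                             # northing, meters
    implement_cmd: Optional[str] = None  # hypothetical, e.g. "spray_on"

@dataclass
class PlannedPath:
    waypoints: list = field(default_factory=list)

    def next_waypoint(self, pos, reached_radius=1.0):
        """Return the first waypoint farther than reached_radius from pos
        (i.e., the next one the vehicle has not yet reached)."""
        for wp in self.waypoints:
            if ((wp.x - pos[0]) ** 2 + (wp.y - pos[1]) ** 2) ** 0.5 > reached_radius:
                return wp
        return None

path = PlannedPath([Waypoint(0, 0, "spray_on"),
                    Waypoint(10, 0),
                    Waypoint(10, 10, "spray_off")])
wp = path.next_waypoint((0.2, 0.1))
print(wp.x, wp.y)  # 10 0
```

A real planner would also handle heading, coverage overlap, and keep-out zones, as the Davis quotation above suggests; this sketch only mirrors the quoted data layout.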
Regarding Claim 13, modified Ellaboudy further discloses wherein the non-plants of interest area comprises a non-crop area (¶[0061] – “…geographic area (e.g., a farm, a mine, a warehouse, a construction site, or another worksite) may be mapped and the resulting map may be used to control motion of a vehicle and/or operation of an implement connected to the vehicle to perform an operation at a subset of locations in the geographic area.”).

Regarding Claim 14, modified Ellaboudy further discloses further comprising: detecting an additional geographic location (¶[0055] – “…obtain position, including but not limited to absolute and relative positioning…” – i.e., conducted by a “feedback” sensor, there are additional geographic locations) of the mobile machine in the plants of interest area and generating additional sensor data indicative of the geographic location of the mobile machine in the plants of interest area (¶[0055] – since there is additional feedback due to the sensors, there is additional sensor data); detecting at least an additional portion (¶[0104] – “…include different zones…a particular region of crop might be affected by disease…” – i.e., the unaffected area is an additional portion) of the boundary of the plants of interest area at the worksite and generating additional sensor data indicative of the at least additional portion of the boundary of the plants of interest area; generating, with the processing system, additional processed sensor data (¶[0113] – “…detected crop row with a crop row…comparing the current point cloud data corresponding to the detected row to expected point cloud data for nearby crop row” – i.e., additional sensor data) indicative of a geographic location (see the Claim Objection above) of the at least additional portion of the boundary of the plants of interest area at the worksite based on the additional sensor data indicative of the geographic location of the mobile machine in the plants of interest area and the sensor data indicative of the at least additional portion of the boundary of the plants of interest area at the worksite (¶[0113] – “…georeferencing the detected crop row…”); and generating, with the map generator (Brown, 300), an updated map (¶[0075] – “…map-based localization technique…state estimate is then updated based on the proprioceptive data…”) of the worksite indicative of the at least a portion of the boundary of the plants of interest area and the at least additional portion of the boundary of the plants of interest area in the worksite (¶[0104] – “…the trees in a map of an orchard might be classified and tagged such that when the vehicle observes or is within a certain vicinity…” – i.e., the plants (trees) of interest have corresponding boundaries).

Regarding Claim 15, modified Ellaboudy further discloses wherein the non-plants of interest area comprises at least one of: a headland, or a road (¶[0176] – “…vehicle heading may be calculated from the detected tree lane in every frame (e.g., the vehicle stays in the middle of the lane and its heading is parallel to the lane).”).

With respect to Claim 16, please see the rejections above with respect to Claim 8, drawn to a mobile machine, which are commensurate in scope with Claim 16 being drawn to a method.

Regarding Claim 18, modified Ellaboudy further discloses wherein detecting the at least a portion of the boundary of the plants of interest area comprises: controlling the mobile machine to traverse in the non-plants of interest area at least partially around the plants of interest area (FIG. 32, #3200, ¶[0228] – “…an example of an agricultural lane following scenario 3200. In the scenario 3200, two crop rows, a left crop row 3202 and a right crop row 3204 bound a lane on either side. A vehicle 3210 (e.g., the vehicle 110) is configured to automatically detect move along the lane…”); and detecting, by the one or more sensors on the mobile machine, the at least a portion of the boundary of the plants of interest area (FIG. 32 and ¶[0231] – “…A line 3250 is then fit to the position data for the plants of left crop row 3202, including the bounding box 3240.”) while the mobile machine is traversing in the non-plants of interest area (FIG. 32 illustrates the machine traversing per 3260 over the ground that has no plants).

Regarding Claim 21, modified Ellaboudy further discloses wherein the boundary of the plants of interest area comprises a plurality of crop rows (FIG. 32 illustrates the boundary of the plants, such as 3250/3252, for a plurality of crop rows running horizontally or vertically in FIG. 32).

Claims 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Ellaboudy et al., US 20210000006 A1 (herein, Ellaboudy), in view of Losch et al., US 20140343803 A1 (herein, Losch), O’Donnell, US 20200332479 A1 (herein, O’Donnell), Davis et al., US 20180243772 A1 (herein, Davis), Brown et al., US 20230091677 A1 (herein, Brown), and in further view of Schoon et al., US 20210341944 A1 (herein, Schoon).

With respect to Claim 19, please see the rejections above with respect to Claims 1, 3-4, and 18, drawn to a mobile machine, which are commensurate in scope with Claim 19 being drawn to a machine operating system. However, modified Ellaboudy does not disclose: receive sensor data, from a sensor system on the mobile machine configured to receive near-infrared (NIR) radiation, indicative of plants and non-plant objects at or around the worksite.
However, Schoon teaches, receive sensor data, from a sensor system on the mobile machine configured to receive near-infrared (NIR) radiation, indicative of plants and non-plant objects at or around the worksite (¶[0111] – “The sensors sense one or more characteristics of an object and can include, for example, accelerometers, position sensors, pressure sensors (including weight sensors), or fluid level sensors among many others….For example, a rotational sensor can be used to detect speed(s) of object(s), a photodetector can be used to detect light or other electromagnetic radiation,…” – i.e. a NIR). Therefore, it would have been obvious to one of ordinary skill in the art the before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by modified Ellaboudy to include a machine to receive NIR from the sensor as taught by Schoon. Doing so provides the capability better determine the non-plant objects that the mobile machine may encounter and avoid the same so as to not damage the mobile machine. Regarding Claim 20, modified Ellaboudy further discloses, wherein the control of the mobile machine (110) comprises a commanded route (¶[0178] – preferred route) for the mobile machine at the worksite. Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Ellaboudy et al. US 20210000006 A1 (herein, Ellaboudy), in view of Losch et al., US 20140343803 A1 (herein, Losch), O’Donnell US 20200332479 A1 (herein, O’Donnell), Davis et al., US 20180243772 A1 (herein, Davis) and in further view of Brown et al. US 20230091677 A1 (herein, Brown), and further in view of Flood et la. US 20190102623 A1 (herein, Flood). 
Regarding Claim 2, modified Ellaboudy discloses the mobile machine and further discloses a mobile agricultural machine (¶[0044] – “...a vehicle (e.g., a tractor, a truck, or an all-terrain vehicle) and operation of an implement (e.g., a boom sprayer, a spreader, a harvester, a row crop cultivator, an auger, a plow, a tiller, a backhoe, a forklift, or a mower) that is connected to the vehicle…”) but does not disclose, wherein the mobile machine comprises a mobile agricultural machine, a mobile forestry machine, a mobile construction machine, or a mobile turf management machine. However, Flood teaches, wherein the mobile machine comprises a mobile agricultural machine, a mobile forestry machine, a mobile construction machine, or a mobile turf management machine (¶[0003] – “…types of equipment, such as construction equipment, turf care equipment, agricultural equipment, and forestry equipment...”). Therefore, it would have been obvious to one of ordinary skill in the art the before the effective filing date of the claimed invention to modify the mobile agricultural machine as disclosed by modified Ellaboudy to include the different types of equipment as taught by Flood. Doing so provides enhanced flexibility by providing a mobile agricultural machine that be utilized in multiple types of geographic locations. Modified Ellaboudy further discloses, further comprising: a control system (FIG. 12, #1200) that generates a control signal (¶[0150] – control signals) to control a controllable subsystem (¶[0150] – “…manipulating one or more implements…) of the mobile machine based on the map of the worksite. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. The references cited but not utilized in the Office Action pertain to a mobile machine and/or generating a map. A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. 
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to LUIS G DEL VALLE whose telephone number is (303)297-4313. The examiner can normally be reached Monday-Friday, 0730 - 1630 MST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Antonucci can be reached on (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). 
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /LUIS G DEL VALLE/Examiner, Art Unit 3666 /ANNE MARIE ANTONUCCI/Supervisory Patent Examiner, Art Unit 3666
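As a reader's aid: the Claim 18 rejection leans on Ellaboudy's lane-following step, quoted above from ¶[0231], in which "a line…is then fit to the position data for the plants" of a crop row and the vehicle heading is kept parallel to that line (¶[0176]). A generic ordinary-least-squares sketch of that kind of row fit might look like the following; the function name and the sample plant positions are invented for illustration and are not Ellaboudy's actual implementation.

```python
# Toy sketch of a crop-row line fit of the kind quoted from Ellaboudy:
# fit a line to detected plant positions, then derive a vehicle heading
# parallel to the fitted row. Data and names are illustrative assumptions.
import math

def fit_row_heading(points):
    """Ordinary least-squares fit of y = slope*x + intercept.

    Returns (slope, intercept, heading_rad), where heading_rad is the
    angle of a heading parallel to the fitted row line.
    """
    n = len(points)
    mx = sum(p[0] for p in points) / n          # mean x of plant detections
    my = sum(p[1] for p in points) / n          # mean y of plant detections
    sxx = sum((p[0] - mx) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept, math.atan(slope)   # heading parallel to the row

# Hypothetical plant detections along a roughly straight crop row
row = [(0.0, 0.1), (1.0, 1.1), (2.0, 1.9), (3.0, 3.05)]
slope, intercept, heading = fit_row_heading(row)
```

A controller in this style would steer so the vehicle's heading matches `heading` while holding a lateral offset from the fitted line, which matches the quoted behavior of the vehicle staying in the middle of the lane with its heading parallel to it.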

Prosecution Timeline

Oct 31, 2022
Application Filed
Sep 06, 2024
Non-Final Rejection — §103
Dec 07, 2024
Interview Requested
Dec 20, 2024
Examiner Interview Summary
Dec 20, 2024
Applicant Interview (Telephonic)
Jan 15, 2025
Response Filed
Mar 07, 2025
Final Rejection — §103
Apr 22, 2025
Interview Requested
Apr 28, 2025
Applicant Interview (Telephonic)
Apr 28, 2025
Examiner Interview Summary
May 02, 2025
Request for Continued Examination
May 05, 2025
Response after Non-Final Action
Jul 28, 2025
Non-Final Rejection — §103
Oct 13, 2025
Interview Requested
Nov 04, 2025
Response Filed
Feb 03, 2026
Non-Final Rejection — §103
Apr 16, 2026
Interview Requested

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597040
SHARED CHECKLISTS FOR ONBOARD ASSISTANT
2y 5m to grant Granted Apr 07, 2026
Patent 12596010
DISPLAY DEVICE, DISPLAY METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Apr 07, 2026
Patent 12592151
SYSTEM AND METHOD FOR MULTI-IMAGE-BASED VESSEL PROXIMITY SITUATION RECOGNITION SUPPORT
2y 5m to grant Granted Mar 31, 2026
Patent 12570325
VEHICLE MOVING METHOD AND VEHICLE
2y 5m to grant Granted Mar 10, 2026
Patent 12546615
SYSTEMS AND METHODS FOR PREDICTING FUEL CONSUMPTION EFFICIENCY
2y 5m to grant Granted Feb 10, 2026
Based on this examiner's 5 most recent grants.


Prosecution Projections

4-5
Expected OA Rounds
72%
Grant Probability
96%
With Interview (+23.8%)
2y 11m
Median Time to Grant
High
PTA Risk
Based on 154 resolved cases by this examiner. Grant probability derived from career allow rate.
