Prosecution Insights
Last updated: April 19, 2026
Application No. 18/217,211

AUTONOMOUS WEED TREATING DEVICE

Status: Non-Final OA (§102, §103, §112)
Filed: Jun 30, 2023
Examiner: SKRZYCKI, JONATHAN MICHAEL
Art Unit: 2116
Tech Center: 2100 — Computer Architecture & Software
Assignee: Dandy Technology LLC
OA Round: 1 (Non-Final)
Grant Probability: 66% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 0m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 66% (146 granted / 221 resolved; +11.1% vs TC avg, above average)
Interview Lift: +33.1% (strong), comparing allow rates across resolved cases with and without an interview
Typical Timeline: 3y 0m average prosecution; 18 applications currently pending
Career History: 239 total applications across all art units
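
The two headline figures above are simple ratios over the examiner's resolved cases: 146 grants out of 221 resolved cases gives the 66% career allow rate, and the interview lift is the allow-rate gap between resolved cases with and without an examiner interview. A minimal sketch of that arithmetic in Python, assuming hypothetical per-case fields (`granted`, `had_interview`), since the dashboard's underlying schema is not shown:

```python
# Sketch only: reproduces the Examiner Intelligence figures from per-case
# records. Field names are hypothetical, not the dashboard's actual schema.
from dataclasses import dataclass

@dataclass
class ResolvedCase:
    granted: bool
    had_interview: bool

def allow_rate(cases: list[ResolvedCase]) -> float:
    """Share of resolved cases that were granted."""
    return sum(c.granted for c in cases) / len(cases)

def interview_lift(cases: list[ResolvedCase]) -> float:
    """Allow-rate gap between cases with and without an examiner interview."""
    with_iv = [c for c in cases if c.had_interview]
    without_iv = [c for c in cases if not c.had_interview]
    return allow_rate(with_iv) - allow_rate(without_iv)

# 146 grants over 221 resolved cases gives the printed 66% career rate:
print(f"{146 / 221:.1%}")  # -> 66.1%
```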

Statute-Specific Performance

§101: 11.4% (-28.6% vs TC avg)
§102: 15.1% (-24.9% vs TC avg)
§103: 42.2% (+2.2% vs TC avg)
§112: 27.3% (-12.7% vs TC avg)

Tech Center averages are estimates. Based on career data from 221 resolved cases.
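
The "vs TC avg" deltas are plain differences between the examiner's statute-specific rate and the Tech Center average estimate; notably, all four printed deltas are consistent with the same 40.0% baseline (e.g., 11.4 - 40.0 = -28.6). A small Python sketch reproducing the table, where the 40.0% TC average is back-calculated from the printed deltas rather than independently sourced:

```python
# Statute-specific rates from the table above. The Tech Center average is
# back-calculated from the printed deltas (rate - delta = 40.0 in every row),
# not independently sourced -- treat it as an assumption.
examiner_rates = {"§101": 11.4, "§102": 15.1, "§103": 42.2, "§112": 27.3}
tc_average = 40.0  # implied baseline

for statute, rate in examiner_rates.items():
    delta = rate - tc_average
    print(f"{statute}: {rate:.1f}% ({delta:+.1f}% vs TC avg)")
```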

Office Action

Rejections under §102, §103, and §112.
DETAILED ACTION

Claims 1-13 and 13a (filed 06/30/2023) have been considered in this action. Claims 14-23 have not been considered due to the telephonic election in response to the restriction requirement recited below.

Election/Restrictions

Restriction to one of the following inventions is required under 35 U.S.C. 121:

I. Claims 1-13 and 13a, drawn to a weed treating device that uses a model trained with deep learning to identify a weed and the control actions taken by the device, classified in G05D2101/15.

II. Claims 14-23, drawn to a weed treating device that defines a chassis with different distances from the ground relative to the front and rear portions of the chassis and further defines the mechanical arrangement of the chassis, classified in A01C23/008.

During a telephone conversation with Steven Becker (42,308) on 12/03/2025, a provisional election was made without traverse to prosecute the invention of Group I, claims 1-13 and 13a. Affirmation of this election must be made by applicant in replying to this Office action. Claims 14-23 are withdrawn from further consideration by the examiner, 37 CFR 1.142(b), as being drawn to a non-elected invention.

Claim Objections

The numbering of the claims is not in accordance with 37 CFR 1.75(f), which requires that "[i]f there are several claims, they shall be numbered consecutively in Arabic numerals," because applicant has presented claim 13a. When claims are canceled, the remaining claims must not be renumbered. When new claims are presented, they must be numbered consecutively beginning with the number next following the highest numbered claim previously presented (whether entered or not). Applicant is required to renumber claim 13a in the response to this action.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 10 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. Claim 10 recites the limitations "the handheld computing device" and "the boundary coordinates" in its only limitation. There is insufficient antecedent basis for these limitations in the claim.

Claims 13 and 13a are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. The term "more complex features" in claim 13 is a relative term which renders the claim indefinite. The term is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. It is unclear what the "more complex features" are and what, precisely, they are more complex than.
This phrase is broad, and the specification does not provide reasonable definiteness of scope for distinguishing when this boundary is breached. Because of the indefinite scope of this phrase, claim 13 is found to be indefinite. Claim 13a depends from claim 13 and thus inherits the rejection of claim 13 under 35 U.S.C. 112(b).

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 2, 5, 6 and 11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Sibley et al. (US 20220377970, hereinafter Sibley).

In regards to Claim 1, Sibley teaches "An autonomous weed treating device for treating weeds on grassy terrain beneath the device, comprising: a body comprising a chassis" ([0048] Agricultural treatment delivery system 111a may be disposed in a vehicle, such as vehicle 110, to facilitate mobility to any number of targets 112a within a geographic boundary 120 to apply a corresponding treatment 112b. In some examples, vehicle 110 may include functionalities and/or structures of any motorized vehicle, including those powered by electric motors or internal combustion engines. For example, vehicle 110 may include functionalities and/or structures of a truck, such as a pick-up truck (or any other truck), an all-terrain vehicle ("ATV"), a utility task vehicle ("UTV"), or any multipurpose off-highway vehicle, including any agricultural vehicle, including tractors or the like; wherein the various vehicles are known to have a chassis); "a plurality of rotating members driven to move the chassis along the grassy terrain" ([0048] vehicle 110 may include...associated linkages to steerable wheels); "a camera coupled to the body and configured to acquire images of the grassy terrain" (Fig. 4 and [0051] Any of agricultural treatment delivery systems 111a or 111b may be configured to operate, for example, in a sensor mode during which a sensor platform 113 may be configured to receive, generate, and/or derive sensor data from any number of sensors as vehicle 110 traverses various path portions 119. For example, sensor platform 113 may include one or more image capture devices to identify and/or characterize an agricultural object, thereby generating. Examples of image capture devices include cameras (e.g., at any spectrum, including infrared), Lidar sensors, and the like. Image-based sensor data may include any data associated with an agricultural object, such as images and predicted images, that may describe, identify, or characterize physical attributes); "a dispenser configured to dispense a substance" ([0047] Agricultural treatment delivery system 111a may include one or more emitters, such as emitter 112c. Emitter 112c may be configured to emit a treatment 112b, for example, via a trajectory 112d in any direction to intercept a target ("T") 112a as vehicle 110 traverses path portions 119); "a processing circuit configured to: drive the rotating members to move the chassis along the grassy terrain" ([0049] vehicle 110 may include a mobility platform 114 that may provide logic (e.g., software or hardware, or both), and functionality and/or structure (e.g., electrical, mechanical, chemical, etc.) to enable vehicle 110 to navigate autonomously over one or more paths 119, based on, for example, one or more treatments to be applied to one or more agricultural objects. Any of agricultural treatment delivery systems 111a or 111b may be configured to detect, identify, and treat agricultural objects autonomously (e.g., without manual intervention); [0070] Mobility controller 214 may include hardware, software, or any combination thereof, to implement a planner (not shown) to facilitate generation and evaluation of a subset of vehicle trajectories based on at least a location of agricultural treatment delivery vehicle 210 against relative locations of external dynamic and static objects. The planner may select an optimal trajectory based on a variety of criteria over which to direct agricultural treatment delivery vehicle 210 in a way that provides for collision-free travel or to optimize delivery of an agricultural projectile to a target. In some examples, a planner may be configured to calculate the trajectories as probabilistically-determined trajectories. Mobility controller 214 may include hardware, software, or any combination thereof, to implement a motion controller (not shown) to facilitate conversion of any of the commands (e.g., generated by the planner), such as a steering command, a throttle or propulsion command, and a braking command, into control signals (e.g., for application to actuators, linkages, or other mechanical interfaces 217) to implement changes in steering or wheel angles and/or velocity autonomously); "process the images using a model trained with deep learning to identify a weed" ([0056] Precision agricultural management platform 101 may include hardware (e.g., processors, memory devices, etc.) or software (e.g., applications or other executable instructions to facilitate machine learning, deep learning, computer vision techniques, statistical computations, and other algorithms), or any combination thereof. [0057] Precision agricultural management platform 101 may be configured to index and assign a unique identifier to each agricultural object in transmitted data 197 (e.g., as a function of a type of agricultural object, such as a blossom, a location of the agricultural object, etc.). Precision agricultural management platform 101 also may operate to store and manage each agricultural object (in agricultural object data 197) as indexed agricultural object data 102a, whereby each data arrangement representing each indexed agricultural object may be accessed using an identifier. [0083] Biotic object data 272 may describe a living organism present in an ecosystem or a location in a geographic boundary. For example, biotic object data 272 may include status data 272a that may identify a type of bacteria, a type of fungi (e.g., apple scab fungus), a plant (e.g., a non-crop plant, such as a weed), and an animal (e.g., an insect, a rodent, a bird, etc.), and other biotic factors that may influence or affect growth and harvest of a crop. [0085] Further to FIG. 2B, precision agricultural management platform 201 may include analyzer logic 203 and a policy generator 205. Analyzer logic 203 may be configured to implement computer vision algorithms and machine learning algorithms); and "control the dispenser to dispense a substance on the weed" ([0092] At 310, an emitter from a subset of one or more emitters may be selected to perform an action. Further, a corresponding action to be performed in association with a particular agricultural object may be identified, the agricultural object being an actionable object (e.g., an agricultural object for which an action is performed, whether chemical or mechanical, such as robotic pruners or de-weeding devices). In some cases, an optical sight associated with an emitter may be identified, and a corresponding action may be associated with the optical sight to determine a point in time to activate emission of an agricultural projectile. [0167] consider that event detector 2386 may be configured to detect an event for weed 2324b, whereby associated event data may indicate that weed 2324b has sufficient foliage prior to germination to be optimally treated with an herbicide. Responsive to generation of data specifying that an event identifies a growth stage of a weed, action selector 2388 may be configured to determine an action (e.g., based on policy data 2372), such as applying a treatment that applies an herbicide to weed 2324b).

In regards to Claim 2, Sibley further discloses "The device of Claim 1, wherein the processing circuit is further configured to identify a boundary between the grassy terrain and a neighboring region." ([0068] Further, agricultural treatment delivery vehicle 210 may include a motion estimator/localizer 219 configured to perform one or more positioning and localization functions. In at least one example, motion estimator/localizer 219 may be configured to determine a location of one or more components of agricultural treatment delivery vehicle 210 relative to a reference coordinate system that may facilitate identifying a location at specific coordinates (i.e., within a geographic boundary, such as an orchard or farm). [0103] Planner 464 may be configured to generate a number of candidate vehicle trajectories for accomplishing a goal of traversing within a geographic boundary via a number of available paths or routes).

In regards to Claim 5, Sibley further discloses "The device of Claim 1, further comprising a location circuit configured to provide geographic location data for the device, wherein the processing circuit is configured to navigate the device using the geographic location data" ([0051] Sensor platform 113 may also include one or more location or position sensors, such as one or more global positioning system ("GPS") sensors and one or more inertial measurement units ("IMU"), as well as one or more radar devices, one or more sonar devices, one or more ultrasonic sensors, one or more gyroscopes, one or more accelerometers, one or more odometry sensors (e.g., wheel encoder or direction sensors, wheel speed sensors, etc.), and the like. Position-based sensors may provide any data configured to determine locations of an agricultural object relative to a reference coordinate system, to vehicle 110, to emitter 112c, or to any other object based on, for example, GPS data, inertial measurement data, and odometry data, among data generated by other position and/or location-related sensors.
[0071] Agricultural treatment delivery system 211 may be configured to implement one or more of a perception engine to detect and classify agricultural objects, a planner to determine actions (e.g., one or more trajectories over which to propel an agricultural projectile), and a motion controller to control, for example, position or orientation of emitter 212. In other examples, agricultural treatment delivery system 211, sensor platform 213 (including any sensor), and motion estimator/localizer 219 each may be integrated into a modular agricultural treatment delivery system 221, which, in turn, may be integrated into agricultural treatment delivery vehicle 210, along with mobility controller 214, to facilitate autonomous navigation of vehicle 210 and autonomous operation of agricultural treatment delivery system 211).

In regards to Claim 6, Sibley further discloses "The device of Claim 5, wherein the location circuit comprises a global positioning circuit, wheel motor encoders and at least one of an inertial measurement unit and a magnetometer to provide the geographic location" ([0051] Sensor platform 113 may also include one or more location or position sensors, such as one or more global positioning system ("GPS") sensors and one or more inertial measurement units ("IMU"), as well as one or more radar devices, one or more sonar devices, one or more ultrasonic sensors, one or more gyroscopes, one or more accelerometers, one or more odometry sensors (e.g., wheel encoder or direction sensors, wheel speed sensors, etc.), and the like. Position-based sensors may provide any data configured to determine locations of an agricultural object relative to a reference coordinate system, to vehicle 110, to emitter 112c, or to any other object based on, for example, GPS data, inertial measurement data, and odometry data, among data generated by other position and/or location-related sensors).

In regards to Claim 11, Sibley teaches "The device of Claim 1, wherein the substance is a liquid herbicide" ([0063] Agricultural projectile 152b may be configured as a liquid-based projectile propelled from emitter 152c for a programmable interval of time to form the projectile; [0140] In one example, agricultural projectile delivery system 1430 may include a storage for compressed gas ("compressed gas store") 1431, which may store any type of gas (e.g., air), a gas compressor 1432 to generate one or more propulsion levels (e.g., variable levels of pressure), and a payload source 1433, which may store any treatment or payload (e.g., a liquid-based payload), such as fertilizer, herbicide, insecticide, etc.).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 3-4 are rejected under 35 U.S.C. 103 as being unpatentable over Sibley as applied to claim 2 above, and further in view of Di Federico et al. (US 20220308588, hereinafter Di Federico).

In regards to Claim 3, Sibley teaches the device that dispenses a substance on identified weeds, as incorporated by claim 2 above. Sibley fails to teach "The device of Claim 2, wherein the processing circuit is further configured to drive rotating members in a reverse direction in response to identifying the boundary". Di Federico teaches "The device of Claim 2, wherein the processing circuit is further configured to drive rotating members in a reverse direction in response to identifying the boundary" ([0047] The control loop for direction of motion (reverse) is used to account for the geometric constraints (area allowed for operation). The idea of "mirror mapping" is used as follows. When the robot reaches the boundary of allowed area (dashed rectangle, FIGS. 1-4), its direction of motion changes to the opposite; wherein turning wheels in the opposite direction would be understood as reversing). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Sibley's system, which defines a boundary for navigation while treating weeds in a farm/orchard/grassy area, with navigation logic in which a device approaching a border is driven in reverse, as taught by Di Federico, because it would gain the obvious benefit of performing navigation while staying within the confines of the boundaries of Sibley. Furthermore, both Sibley and Di Federico are in the related field of autonomous navigation in an agricultural environment. Combining these elements can be considered taking the known method of causing a vehicle to go in reverse when it reaches a boundary and using it to improve the vehicle navigation system of Sibley in a known way that achieves predictable results.

In regards to Claim 4, the combination of Sibley and Di Federico teaches the device as incorporated by claim 3 above. Di Federico further teaches "The device of Claim 3, wherein the processing circuit is further configured to identify the grassy terrain after identifying the boundary and, in response to identifying the grassy terrain, to drive the rotating members to turn a direction of travel of the chassis" ([0048] Proposed algorithm allows the robot to move within a closed area from one boundary to another, until it reaches the attraction domain in the vicinity of the target path. FIGS. 2-4 illustrate trajectories of the maneuvers explaining the control loops. FIG. 2 shows the trajectory passed by the robot from the initial time moment to the moment when it first crosses the boundary of area allowed for maneuver. The trajectory shown by solid line indicates that the robot is moving forward. After reaching the boundary of the area allowed for maneuver, the turning angle of the steering wheels changes sign, and the direction of motion changes to the opposite (ν=−0.5 m/s) and reverse motion continues. The reverse trajectories are shown by dashed lines (FIGS. 3-4). This continues until the next boundary of the constrained area is reached and the direction of motion is changed to the opposite (see FIG. 3). The robot moves from one boundary to another until its state gets inside the attraction domain (FIG. 4). After entering the attraction domain, the starting maneuver (search mode) finishes, and the control system switches back to closed-loop (normal) operation).

Claims 7, 8 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Sibley as applied to claim 1 above, and further in view of Hand (US 20240000001, hereinafter Hand).

In regards to Claim 7, Sibley teaches the autonomous weed treating device as incorporated by claim 1 above. Sibley fails to teach "The device of Claim 1, wherein, upon detecting a boundary, the processing circuit is configured to change the direction of travel by an angle of less than 180 degrees". Hand teaches "The device of Claim 1, wherein, upon detecting a boundary, the processing circuit is configured to change the direction of travel by an angle of less than 180 degrees" ([0049] when the autonomous robot 15, following its trajectory 25 and under the scrutiny of the robot navigation module 110, comes into proximity to one of the magnetic stakes 20, the domain boundary detector 125, which in the preferred embodiment is a digital positioning multi-axis magnetometer, detects one or more magnetized stakes 20, which then signals the CPU 100. As shown in FIG. 8, the CPU 100 then commands S215 said robot 15 to stop and reflectively turn by a prescribed reflection angle 27, in the range 60°-120°). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Sibley's system, which defines a boundary for navigation while treating weeds in a farm/orchard/grassy area, with navigation logic in which a device approaching a border turns its direction of travel by an angle of less than 180 degrees, as taught by Hand, because it would gain the obvious benefit of performing navigation while staying within the confines of the boundaries of Sibley. Furthermore, both Sibley and Hand are in the related field of autonomous navigation in an agricultural environment. Combining these elements can be considered taking the known method of causing a vehicle to turn by an angle of less than 180 degrees when it reaches a boundary and using it to improve the vehicle navigation system of Sibley in a known way that achieves predictable results.

In regards to Claim 8, the combination of Sibley and Hand teaches the device as incorporated by claim 7 above. Hand further teaches "The device of Claim 7, wherein the angle is pseudorandomly selected" ([0027] upon being activated by the user, said robot will patrol the defined domain of the aesthetic mulch garden, following a random reflective trajectory, whereby the robot reflects at a prescribed reflection angle from boundaries and obstacles leading to a randomized trajectory. [0049] As seen in FIG. 3, the CPU thereby directs the movement of said robot 15 to patrol for weeds 3 with a trajectory 25 constrained inside said boundary 12 and continues to repeatedly cross the domain 10 following a series of random reflections until either a preprogrammed time limitation is met, a minimum frequency of weed detection criteria is met, or the robot 15 is deactivated by the user or by other criteria which may be programmed into CPU 100).
In regards to Claim 12, Sibley teaches the autonomous weed treating device as incorporated by claim 1 above. Sibley fails to teach "The device of Claim 1, wherein the device is configured to be lifted and moved to a new location by a human person". Hand teaches this limitation ([0100] Said robot 15 must be small enough so that it can maneuver between shrubs and obstacles but be large enough so that it can position itself over a weed and extract it. Furthermore, said robot 15 must carry an electrical power supply of significant size and weight so as to power said drive motors 80, 85, and 90, said servos 300 and 305, as well as lighting and electrical components, and so provide a sufficient period of operation to effect weed removal from said mulch garden 5. While by no means critical or a limitation to this invention, for typical mulch garden weeds in the size range of two inches, the preferred embodiment has a size for said robot 15 of a width of 9 to 18 inches, and a length of 9 to 18 inches, and a weight of 2.0 to 4.0 pounds; wherein a device which is 18x18 inches and 4 pounds is capable of being lifted and moved by a person). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the agricultural device that identifies and sprays weeds with the features of Hand, in which the size, dimensions and weight of an autonomous weed treatment device are such that the device can be lifted and moved by a person, because it would gain the obvious benefit of being more portable while still fitting into smaller spaces such as between shrubs, as noted by Hand. Combining these elements can be considered taking the known device that identifies and applies treatment to a weed and fitting it into a package that is capable of being lifted and moved by a person in a known way that achieves predictable results.

Claims 9 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Sibley as applied to claim 1 above, and further in view of Woo et al. (US 20230320262, hereinafter Woo).

In regards to Claim 9, Sibley teaches the device as incorporated by claim 1 above. Sibley further teaches "The device of Claim 1, further comprising a network interface circuit configured to communicate with a handheld computing device" ([0214] In some embodiments, modules 3850 and 3851 of FIG. 38, or one or more of their components, or any process or device described herein, can be in communication (e.g., wired or wirelessly) with a mobile device, such as a mobile phone or computing device, or can be disposed therein) and "to control the device to operate within a boundary" ([0090] At 306, a mobility platform may be activated to control autonomously motion and/or position of an agricultural treatment delivery vehicle. A mobility platform may be configured to implement a map, which may include data configured to position one or more emitters of an agricultural projectile delivery system adjacent to an agricultural object within a geographic boundary. Hence, a map may include data specifying a location of an indexed agricultural object, and can be used to navigate a vehicle autonomously to align an emitter with an agricultural object to deliver a treatment). Sibley fails to teach "wherein the processing circuit is configured to receive boundary coordinates from the handheld computing device and to control the device to operate within a boundary defined by the boundary coordinates".

Woo teaches "wherein the processing circuit is configured to receive boundary coordinates from the handheld computing device and to control the device to operate within a boundary defined by the boundary coordinates" ([0026] The computer may also be configured with wireless communications such as wifi, cellular connection, or other wireless communication protocol to connect to a network via the electronic components 165 to send the images obtained by the 2D and/or 3D cameras 170 and 175 to an application or website having a user interface 310, an embodiment of which is shown displayed on user device 300 in the embodiment illustrated in FIG. 3. The application or website may be operated using the user interface 310 on user device 300, which may be a smartphone or a separate computer. The application or website may cause the user device 300 to display images of a lawn including grass 320 and/or a non-grass material 330 adjacent to the lawn to a user through the user interface 310. The application or website may also be used to provide input via the user interface 310 regarding a position of the chassis 100, identification of zones in which the chassis performs designated edging, mowing, weeding, and/or other gardening operations, identification of plants, and selection of operations or prioritization related to identified plants. The user interface 310 may also indicate a current location of the chassis 100, which may be shown in relation to features of the surrounding environment including a boundary 340 between grass 320 and non-grass 330, as well as a corner 350 of the lawn zone including the grass 320; [0060] A SLAM algorithm will be used to perform localization and mapping, which are vital for the robot to perform its tasks. As noted above, on setup, the robot may start in its charging station 200. The charging station 200 may be considered the origin point of the world coordinate system (i.e. point (0, 0, 0) in 3D space) defined for the robot; wherein the locations correspond to coordinates in the world coordinate system). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Sibley's system, which operates a device within a boundary and allows communication with handheld computing devices, with Woo's teaching of defining those boundaries on a mobile phone, because it would offer the obvious benefit of allowing a person to define the boundary in a convenient manner. Furthermore, both Sibley and Woo are in the related field of agricultural robots that treat weeds. Combining these elements can be considered taking the known method of providing boundary coordinates using a handheld computer as taught by Woo and applying it to the known agricultural device with a handheld computer of Sibley in a known way that achieves predictable results.

In regards to Claim 10, Sibley teaches the device as incorporated by claim 1 above. Sibley further teaches "The device of Claim 1, further comprising the handheld computing device…" ([0214] In some embodiments, modules 3850 and 3851 of FIG. 38, or one or more of their components, or any process or device described herein, can be in communication (e.g., wired or wirelessly) with a mobile device, such as a mobile phone or computing device, or can be disposed therein). Sibley fails to teach "wherein the handheld computing device is programmed to display a map and define the boundary coordinates based on receiving user input tracing the boundary on the map". Woo teaches "wherein the handheld computing device is programmed to display a map and define the boundary coordinates based on receiving user input tracing the boundary on the map" ([0028] The computer may also be configured to process data received from a global positioning system (GPS) receiver provided with the electronic components 165. Data from the GPS receiver may be transmitted to the application or website in order to display a location of the chassis 100 using the user interface 310. Using a map such as a GPS map or other type of map representing an environment in which the chassis is to operate, the application or website can also be used to set a geofence to keep chassis 100 inside and set zones including areas with grass 320 for the chassis 100 to edge and/or cut). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the device with a handheld computer as taught by Sibley with the display of a map for defining the boundaries of the working area as taught by Woo, because it would gain the obvious benefit of improved boundary definition and interaction by a user. Combining these elements can be considered taking the known handheld device of Sibley and improving it with the known method of having a user define the boundary on the handheld computer as taught by Woo in a known way that achieves predictable results.

Claims 13 and 13a are rejected under 35 U.S.C. 103 as being unpatentable over Sibley et al. (US 20220377970, hereinafter Sibley) in view of Jordan ("Convolution Neural Networks", hereinafter Jordan) and Kounalakis et al. ("A Robotic System Employing Deep Learning for Visual Recognition and Detection of Weeds in Grasslands", hereinafter Kounalakis).

In regards to Claim 13, Sibley teaches "An autonomous weed treating device for treating weeds on grassy terrain beneath the device, comprising: a body comprising a chassis" ([0048] Agricultural treatment delivery system 111a may be disposed in a vehicle, such as vehicle 110, to facilitate mobility to any number of targets 112a within a geographic boundary 120 to apply a corresponding treatment 112b. In some examples, vehicle 110 may include functionalities and/or structures of any motorized vehicle, including those powered by electric motors or internal combustion engines. For example, vehicle 110 may include functionalities and/or structures of a truck, such as a pick-up truck (or any other truck), an all-terrain vehicle ("ATV"), a utility task vehicle ("UTV"), or any multipurpose off-highway vehicle, including any agricultural vehicle, including tractors or the like; wherein the various vehicles are known to have a chassis); "a plurality of rotating members driven to move the chassis along the grassy terrain" ([0048] vehicle 110 may include...associated linkages to steerable wheels); "a camera coupled to the body and configured to acquire images of the grassy terrain" (Fig. 4 and [0051] Any of agricultural treatment delivery systems 111a or 111b may be configured to operate, for example, in a sensor mode during which a sensor platform 113 may be configured to receive, generate, and/or derive sensor data from any number of sensors as vehicle 110 traverses various path portions 119.
For example, sensor platform 113 may include one or more image capture devices to identify and/or characterize an agricultural object, thereby generating. Examples of image capture devices include cameras (e.g., at any spectrum, including infrared), Lidar sensors, and the like. Image-based sensor data may include any data associated with an agricultural object, such as images and predicted images, that may describe, identify, or characterize physical attributes); "a dispenser configured to dispense a substance" ([0047] Agricultural treatment delivery system 111a may include one or more emitters, such as emitter 112c. Emitter 112c may be configured to emit a treatment 112b, for example, via a trajectory 112d in any direction to intercept a target ("T") 112a as vehicle 110 traverses path portions 119); "a processing circuit configured to: drive the rotating members to move the chassis along the grassy terrain" ([0049] vehicle 110 may include a mobility platform 114 that may provide logic (e.g., software or hardware, or both), and functionality and/or structure (e.g., electrical, mechanical, chemical, etc.) to enable vehicle 110 to navigate autonomously over one or more paths 119, based on, for example, one or more treatments to be applied to one or more agricultural objects. Any of agricultural treatment delivery systems 111a or 111b may be configured to detect, identify, and treat agricultural objects autonomously (e.g., without manual intervention); [0070] Mobility controller 214 may include hardware, software, or any combination thereof, to implement a planner (not shown) to facilitate generation and evaluation of a subset of vehicle trajectories based on at least a location of agricultural treatment delivery vehicle 210 against relative locations of external dynamic and static objects. The planner may select an optimal trajectory based on a variety of criteria over which to direct agricultural treatment delivery vehicle 210 in a way that provides for collision-free travel or to optimize delivery of an agricultural projectile to a target. In some examples, a planner may be configured to calculate the trajectories as probabilistically-determined trajectories. Mobility controller 214 may include hardware, software, or any combination thereof, to implement a motion controller (not shown) to facilitate conversion of any of the commands (e.g., generated by the planner), such as a steering command, a throttle or propulsion command, and a braking command, into control signals (e.g., for application to actuators, linkages, or other mechanical interfaces 217) to implement changes in steering or wheel angles and/or velocity autonomously); "process the images to identify a weed wherein the processing comprises using a model trained by a neural network algorithm…" ([0056] Precision agricultural management platform 101 may include hardware (e.g., processors, memory devices, etc.) or software (e.g., applications or other executable instructions to facilitate machine learning, deep learning, computer vision techniques, statistical computations, and other algorithms), or any combination thereof. [0057] Precision agricultural management platform 101 may be configured to index and assign a unique identifier to each agricultural object in transmitted data 197 (e.g., as a function of a type of agricultural object, such as a blossom, a location of the agricultural object, etc.). Precision agricultural management platform 101 also may operate to store and manage each agricultural object (in agricultural object data 197) as indexed agricultural object data 102a, whereby each data arrangement representing each indexed agricultural object may be accessed using an identifier. [0083] Biotic object data 272 may describe a living organism present in an ecosystem or a location in a geographic boundary. For example, biotic object data 272 may include status data 272a that may identify a type of bacteria, a type of fungi (e.g., apple scab fungus), a plant (e.g., a non-crop plant, such as a weed), and an animal (e.g., an insect, a rodent, a bird, etc.), and other biotic factors that may influence or affect growth and harvest of a crop. [0085] Further to FIG. 2B, precision agricultural management platform 201 may include analyzer logic 203 and a policy generator 205. Analyzer logic 203 may be configured to implement computer vision algorithms and machine learning algorithms); and "control the dispenser to dispense a substance on the weed" ([0092] At 310, an emitter from a subset of one or more emitters may be selected to perform an action. Further, a corresponding action to be performed in association with a particular agricultural object may be identified, the agricultural object being an actionable object (e.g., an agricultural object for which an action is performed, whether chemical or mechanical, such as robotic pruners or de-weeding devices). In some cases, an optical sight associated with an emitter may be identified, and a corresponding action may be associated with the optical sight to determine a point in time to activate emission of an agricultural projectile. [0167] consider that event detector 2386 may be configured to detect an event for weed 2324b, whereby associated event data may indicate that weed 2324b has sufficient foliage prior to germination to be optimally treated with an herbicide. Responsive to generation of data specifying that an event identifies a growth stage of a weed, action selector 2388 may be configured to determine an action (e.g., based on policy data 2372), such as applying a treatment that applies an herbicide to weed 2324b).

Sibley fails to teach "wherein the processing comprises using a model trained by a neural network algorithm having first layers learning gradients and lines, second deeper layers recognizing more complex features, and a final layer to distinguish the weed from grassy terrain". Jordan teaches "wherein the processing comprises using a model trained by a neural network algorithm having first layers learning gradients and lines, second deeper layers recognizing more complex features…" ([page 13] we've described the abstract concept of a square using lower-level concepts such as lines and angles. We could use these same lower-level concepts to describe a plethora of shapes. When we visualize convolutional networks, we find that the same phenomenon occurs. The network learns to use earlier layers to extract low-level features and then combine these features into more rich representations of the data. This means that later layers of the network are more likely to contain specialized feature maps....As you progress into deeper layers of a network, it's very common to see an increase in the channel depth (number of feature maps); as feature maps become more specialized, representing higher-level abstract concepts, we simply need more feature maps to represent the input. Consider the fact that a small set of lines can be combined into a large number of shapes, and these shapes can be combined into an even larger number of objects. If our goal is to build a model with general representational power, we'll likely need to grow our feature maps accordingly). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified Sibley's deep-learning, computer-vision neural network that detects a weed with Jordan's teaching of a deep neural network having multiple layers, in which the first layers detect gradients and lines and middle layers recognize more complex features, because this can be considered taking the deep-learned neural network that recognizes a weed and implementing it as a convolutional network with many layers. It can also be reasoned that Sibley utilizes the features of Jordan even without stating them: that Sibley does not detail how its neural network operates does not mean the network lacks these features, as Jordan recites background information about how computer-vision neural networks generally operate. Combining these elements can be considered taking the known neural network of Sibley and implementing the features of Jordan's neural network in a known way that achieves predictable results.

The combination of Sibley and Jordan fails to teach "…and a final layer to distinguish the weed from grassy terrain". Kounalakis teaches "…and a final layer to distinguish the weed from grassy terrain" ([page 2 col 1] The first CNN, termed sNet, is used for vegetation detection and is trained with images processed using Normalized Difference Vegetation Index (NDVI). Then the cNet performs the final recognition between three categories including crop, weed and soil. The sequential use of CNNs in combination with a blob-wise based voting achieves real-time operation with recognition results comparable to the shallow visual architecture). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system for distinguishing between crops and objects using a layered neural network as taught by Sibley and Jordan with a final output layer that classifies an object as crop/hay/grass, weed (undesired plant), or soil (boundary) as taught by Kounalakis, because it would gain the obvious benefit of distinguishing where weeds are so that herbicide is applied only to those plants which are not desired to grow (weeds) and not to the crop/grass, while also saving herbicide through more precise application via object recognition/classification. Combining these elements can be considered taking the known neural network that recognizes weeds and improving it to recognize weeds and crops/grass in a known way that achieves predictable results.

In regards to Claim 13a, the combination of Sibley, Jordan and Kounalakis teaches the weed treatment device as incorporated by claim 13 above. Kounalakis further teaches "The autonomous weed treating device of Claim 13, wherein the model was trained by the neural network algorithm having a final layer to distinguish a boundary from a grassy terrain" ([page 2 col 1] The first CNN, termed sNet, is used for vegetation detection and is trained with images processed using Normalized Difference Vegetation Index (NDVI). Then the cNet performs the final recognition between three categories including crop, weed and soil. The sequential use of CNNs in combination with a blob-wise based voting achieves real-time operation with recognition results comparable to the shallow visual architecture; wherein soil is a type of boundary).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:

Friell et al. (US 20250151673) teaches a ground-based autonomous robot that applies treatments to target areas that include weeds.
He et al. (US 20190278269) teaches a ground-based autonomous robot that has different modules for different lawn treatments.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN M SKRZYCKI whose telephone number is (571) 272-0933. The examiner can normally be reached M-Th 7:30-3:30. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, KAMINI SHAH, can be reached at 571-272-2279. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JONATHAN MICHAEL SKRZYCKI/
Examiner, Art Unit 2116
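
For readers less familiar with the CNN vocabulary driving the claim 13/13a rejections: the examiner's position leans on a standard property of convolutional networks, in which early layers learn low-level features such as gradients and lines, deeper layers compose them into more complex features, and a final classification layer separates the target classes (crop/grass, weed, and soil in Kounalakis). The sketch below illustrates that three-stage structure only; it is not the applicant's claimed model or any cited reference's implementation, and the layer sizes and class set are assumptions.

```python
# Illustrative only: a minimal CNN with the three-stage structure recited in
# claim 13. Not the applicant's model or any cited reference's architecture.
import torch
import torch.nn as nn

class WeedClassifier(nn.Module):
    def __init__(self, num_classes: int = 3):  # e.g., grass / weed / soil, per Kounalakis
        super().__init__()
        # First layers: tend to learn low-level features (gradients, lines).
        self.early = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Deeper layers: compose low-level features into more complex ones.
        self.deep = nn.Sequential(
            nn.Conv2d(16, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # collapse spatial dims to 1x1
        )
        # Final layer: distinguishes the weed from the grassy terrain (and soil).
        self.head = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.deep(self.early(x))
        return self.head(x.flatten(1))  # class logits

# Example: logits = WeedClassifier()(torch.randn(1, 3, 64, 64))  # shape (1, 3)
```

Under this reading, the claim 13 limitation describes how most image-classification CNNs behave, which is why the examiner argues Sibley's deep-learning weed detector likely exhibits these properties even though Sibley does not recite them.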

Prosecution Timeline

Jun 30, 2023
Application Filed
Dec 03, 2025
Examiner Interview (Telephonic)
Dec 09, 2025
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12595886: CONTROL OF A WATER SUPPLY SYSTEM USING PUMPING STATIONS WITH RESOURCE OPTIMIZED PRESSURE AND FLOW TARGET VALUES (granted Apr 07, 2026; 2y 5m to grant)
Patent 12570003: SYSTEMS, AND METHODS FOR REAL TIME CALIBRATION OF MULTIPLE RANGE SENSORS ON A ROBOT (granted Mar 10, 2026; 2y 5m to grant)
Patent 12562352: PREDICTION METHOD AND INFORMATION PROCESSING APPARATUS FOR PREDICTING THE PROCESS RESULT IN A PLASMA ETCHING PROCESS (granted Feb 24, 2026; 2y 5m to grant)
Patent 12560918: PRODUCTION SEQUENCING OPTIMIZATION FOR AUTOMOTIVE ACCESSORY INSTALLATION (granted Feb 24, 2026; 2y 5m to grant)
Patent 12530014: PROCESS MODEL AUTOMATIC GENERATION SYSTEM AND PROCESS MODEL AUTOMATIC GENERATION METHOD (granted Jan 20, 2026; 2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 66%
With Interview: 99% (+33.1%)
Median Time to Grant: 3y 0m
PTA Risk: Low

Based on 221 resolved cases by this examiner. Grant probability is derived from the career allow rate.
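
The projection figures combine arithmetically: the 66% base is the career allow rate (146/221), and adding the +33.1% interview lift yields the 99% with-interview figure. Whether the tool literally adds the lift is an assumption, but the printed numbers are consistent with it, as this small check shows:

```python
# Consistency check for the printed projections. Simple addition of the
# interview lift to the base rate is an assumption about the tool's model,
# not a documented formula.
base = 146 / 221   # career allow rate, about 0.661
lift = 0.331       # interview lift from the examiner stats

with_interview = min(base + lift, 1.0)  # cap at 100%
print(f"base {base:.0%}, with interview {with_interview:.0%}")
# -> base 66%, with interview 99%
```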
