Prosecution Insights
Last updated: April 19, 2026
Application No. 18/857,770

SPACING-AWARE PLANT DETECTION MODEL FOR AGRICULTURE TASK CONTROL

Current status: Non-Final OA (§103)
Filed: Oct 17, 2024
Examiner: SHAFI, MUHAMMAD
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Farmwise Labs Inc.
OA Round: 1 (Non-Final)

Grant Probability: 89% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 89% (978 granted / 1100 resolved; +36.9% vs TC avg, above average)
Interview Lift: +16.7% across resolved cases with interview (strong)
Avg Prosecution: 2y 6m (typical timeline)
Currently Pending: 35
Career History: 1135 total applications across all art units
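For concreteness, here is a minimal Python sketch of how the headline numbers above fit together. The granted/resolved counts and the 99% with-interview figure come from the panel; the without-interview rate is back-solved from the displayed +16.7% lift, so treat that intermediate value as an assumption rather than a dataset value.

```python
# Minimal sketch of the examiner-metric arithmetic shown above.
# Counts are taken from the panel; the interview split is inferred.

granted, resolved = 978, 1100
allow_rate = granted / resolved          # 0.889 -> the 89% career allow rate

with_interview = 0.99                    # "99% With Interview" from the panel
# Assumed: the lift is relative, lift = with/without - 1, so the implied
# without-interview rate is with / (1 + 0.167) ~= 0.848.
without_interview = with_interview / (1 + 0.167)

lift = with_interview / without_interview - 1
print(f"allow rate: {allow_rate:.1%}")   # 88.9%
print(f"interview lift: {lift:+.1%}")    # +16.7%
```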

Statute-Specific Performance

Statute   Examiner rate   vs TC avg
§101      18.8%           -21.2%
§103      48.3%           +8.3%
§102      7.2%            -32.8%
§112      20.7%           -19.3%

Tech Center averages are estimates; based on career data from 1100 resolved cases.
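A short sketch of how the per-statute comparison can be read, assuming each percentage is the examiner's allowance rate following that rejection type and recovering each Tech Center average as the examiner's rate minus the displayed delta (the TC figures below are therefore derived estimates, not source data):

```python
# Per-statute figures from the panel, with the TC average recovered
# as (examiner rate - displayed delta vs TC avg).
examiner = {"101": 0.188, "103": 0.483, "102": 0.072, "112": 0.207}
delta_vs_tc = {"101": -0.212, "103": 0.083, "102": -0.328, "112": -0.193}

for statute, rate in examiner.items():
    tc_avg = rate - delta_vs_tc[statute]   # e.g. §103: 48.3% - 8.3% = 40.0%
    print(f"§{statute}: examiner {rate:.1%} vs TC avg ~{tc_avg:.1%}")
```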

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application is being examined under the pre-AIA first to invent provisions.

2. This communication is a first office action, non-final rejection on the merits. Claims 1-20, as originally filed, are currently pending and have been considered below.

Claim Objections

3. Claims 18, 19 and 20 are objected to because of the following informalities: In Claim 18, line 1, "claim 16" should be corrected to --claim 17--. In Claim 19, line 1, "claim 16" should be corrected to --claim 17--. In Claim 20, line 1, "claim 16" should be corrected to --claim 17--. Appropriate correction is required.

Claim Rejections - 35 USC § 103

4. The following is a quotation of pre-AIA 35 U.S.C. 103(a) which forms the basis for all obviousness rejections set forth in this Office action:

(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under pre-AIA 35 U.S.C. 103(a) are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims under pre-AIA 35 U.S.C. 103(a), the examiner presumes that the subject matter of the various claims was commonly owned at the time any inventions covered therein were made absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and invention dates of each claim that was not commonly owned at the time a later invention was made in order for the examiner to consider the applicability of pre-AIA 35 U.S.C. 103(c) and potential pre-AIA 35 U.S.C. 102(e), (f) or (g) prior art under pre-AIA 35 U.S.C. 103(a).

5. Claims 1-2, 7-10, 16-18 and 20 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Fu et al. (USP 2021/0090274) in view of Koch et al. (USP 2017/0034986).

As per Claim 1, Fu et al. (Fu) teaches a method (via a method of treating a plant using a plant identification module, method 1200, Fig. 12, [0202]) for controlling a robotic action for an agricultural task, in which all steps are computer-implemented (the method 1200 may be performed from the perspective of the control system 130, Figs. 1C, 12, [0202]; also see [0199]-[0200]), comprising:

receiving, using an imager moving along a crop row, at least one image of at least a portion of the crop row ("A farming machine (e.g., farming machine 100) includes one or more sensors for capturing an image as the farming machine travels through a field. A control system (e.g., control system 130) accesses 1210 an image of the field captured by the sensors. The image includes pixels representing a plurality of objects of the field including at least one plant", [0203]);

using the at least one image, a plant detection model (control system 130 classifies 1230 one or more pixels as a plant [or crop] based on the depth information [depth identification model 605] for the pixels; the control system 130 can additionally classify other objects in the image [e.g., dirt, grass, etc.] based on their depth information, Fig. 6, [0205]), to generate an output from the plant detection model (control system 130 determines 1240 a treatment action based on the depth information and pixels identified as a plant, [0206]);

outputting a control signal for the robotic action based on the output from the plant detection model ("the control system 130 classifies crops 302a, 302b, 302c and the weed 350 based on the depth information in depth map 360. The control system 130 then determines which plant(s) to treat and with which treatment action(s), if any. In a first example, the control system 130 compares the depth information of the crops 302a, 302b, 302c and the weed 350. The control system 130 determines that the weed 350 is significantly shorter than the plants 302a-c. The control system 130 selects the weed 350 for treatment with an herbicide sprayed from a spray nozzle because the weed 350 is significantly shorter than the plants 302a-c.", Fig. 4B, [0206]); and

conducting the robotic action for the agricultural task in response to the control signal (via the control system 130 then actuating, step 1250 (Fig. 12), a treatment mechanism 120 (Fig. 3B) to treat one or more of the identified plants with the determined treatment action, as needed, [0206], Figs. 3B, 12).

However, Fu does not explicitly teach the method comprising using an average inter-crop spacing for the crop row. In a related field of art, Koch et al. (Koch) teaches crop stand optimization systems, methods and apparatus wherein the method comprises an average inter-crop spacing for the crop row (via "The relative plant location criteria may include one or more plant spacing criteria (e.g., a distance between the plant and the nearest adjacent plant, the average distance between the plant and each adjacent plant, the average distance between the plant and other plants within a threshold distance of the plant, the average distance between the plant and the nearest plants within the same planting row, the average distance between the plant and the nearest plants in the adjacent planting rows", [0044]).

It would have been obvious to one of ordinary skill in the art, having the teachings of Fu and Koch before him before the effective filing date of the claimed invention, to modify the systems of Fu to include the teachings of Koch (crop stand optimization method, inter-crop spacing) and configure them with the system of Fu in order to estimate an ear potential value for each plant and optimize the crop population in the field based on plant criteria. The motivation to combine the two teachings is identifying crop appearing in a certain zone of the crop row and distinguishing between crop and weed (i.e., weed identification to uproot the weed).

As per Claim 2, Fu as modified by Koch teaches the limitations of Claim 1. Fu in view of Koch further teaches receiving the average inter-crop spacing (Koch: an average inter-crop spacing, [0044]) via a user interface for the vehicle (Fu: "the computer system 1600 can include a graphics display 1610, an alphanumeric input device 1612 (e.g., a keyboard), a cursor control device 88", etc., [0242], Fig. 16). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the systems of Fu to include the average inter-crop spacing as taught by Koch, in order to receive the average inter-crop spacing on the display of Fu and to guide the robot in executing the robotic action.

As per Claim 7, Fu as modified by Koch teaches the limitations of Claim 1. Fu in view of Koch further teaches wherein the robotic action is one of: precision harvesting of the crop row; precision weeding of an intercrop area of the crop row; precision watering on the crop row; precision thermal treatment of the intercrop area of the crop row; and precision chemical application on the crop row (Fu: via "the control system 130 classifies crops 302a, 302b, 302c and the weed 350 based on the depth information in depth map 360. The control system 130 then determines which plant(s) to treat and with which treatment action(s), if any. In a first example, the control system 130 compares the depth information of the crops 302a, 302b, 302c and the weed 350. The control system 130 determines that the weed 350 is significantly shorter than the plants 302a-c. The control system 130 selects the weed 350 for treatment with an herbicide sprayed from a spray nozzle because the weed 350 is significantly shorter than the plants 302a-c.", Fig. 4B, [0206]).

As per Claim 8, Fu as modified by Koch teaches the limitations of Claim 1. Fu in view of Koch further teaches wherein the image includes at least one of: a three-dimensional image; a two-dimensional image; a stereoscopic image; a multispectral image; a color filtered visual spectrum image; a color filtered hyperspectral image; and a normalized difference vegetation index image (Fu: via "The control system 130 can then generate a labelled point cloud using labelled depth information and determine plant treatment actions based on the labelled point cloud. A labelled point cloud is a three-dimensional representation of objects in the scene in the field captured by the camera array. Because the objects are represented in three-dimensions, the farming machine 100 may more accurately determine the location of plants and/or perform appropriate farming actions.", [0156]).

As per Claim 9, Fu as modified by Koch teaches the limitations of Claim 1. Fu in view of Koch further teaches wherein the plant detection model includes at least one of: an artificial neural network; a machine learning model; and a color filter preprocessor (Fu: "The control system 130 employs the plant identification module 232 to classify groups of pixels as plants based on depth information extracted by the depth identification module 234", [0121]; "the depth identification model 605 is a convolutional neural network model", [0128]).

As per Claim 10, Fu as modified by Koch teaches the limitations of Claim 1. Fu in view of Koch further teaches wherein the output from the biased plant detection model includes: a location of a plant; and size information for the plant (Fu: method of identifying and treating a plant using a plant identification module, Fig. 14, [0216]; "The control system 130 can determine the feature value using the label and three-dimensional coordinates of the points in the point cluster. The feature value describes one or more characteristics of the plant that can inform various treatment actions. For example, the control system 130 can determine the height, size, position, proximity, canopy cover, physiology, etc. of the plant based on points in the point cluster representing the plant", [0222]; also see [0216]-[0218]).

As per Claim 16, Fu as modified by Koch teaches the limitations of Claim 1. Fu in view of Koch further teaches moving a vehicle along the crop row, wherein the imager is positioned to be moved along the crop row by the vehicle (Fu: via "A farming machine (e.g., farming machine 100) includes one or more sensors for capturing an image as the farming machine travels through a field. A control system (e.g., control system 130) accesses 1210 an image of the field captured by the sensors. The image includes pixels representing a plurality of objects of the field including at least one plant. To illustrate, referring to FIG. 4A, the control system 130 can access captured image 400 including crops 302a, 302b, and 302c, and weed 350.", [0203]).

As per Claim 17, Fu et al. (Fu) teaches a system for controlling a robotic action for an agricultural task (computer system 1600, Fig. 16, [0239]) comprising:

an imager (a farming machine [e.g., farming machine 100] includes one or more sensors for capturing an image as the farming machine travels through a field, [0203]);

one or more processors (the computer system 1600 includes one or more processing units 1602, [0241]);

one or more computer-readable media storing instructions that, when executed by the one or more processors (the storage unit 1616 includes a machine-readable medium 1622 on which is stored instructions 1624 [e.g., software] embodying any one or more of the methodologies or functions described, [0243]), cause the system to:

move the vehicle (farming machine 100, Figs. 1A-E) along the crop row ("A farming machine [e.g., farming machine 100] includes one or more sensors for capturing an image as the farming machine travels through a field", [0203], [0008]);

receive, using the imager moving along a crop row, at least one image of at least a portion of the crop row ("A control system (e.g., control system 130) accesses 1210 an image of the field captured by the sensors. The image includes pixels representing a plurality of objects of the field including at least one plant. To illustrate, referring to FIG. 4A, the control system 130 can access captured image 400 including crops 302a, 302b, and 302c, and weed 350.", [0203], Fig. 4A);

using the at least one image, a plant detection model ("Control system 130 classifies 1230 one or more pixels as a plant [or crop] based on the depth information [depth identification model 605] for the pixels. The control system 130 can additionally classify other objects in the image [e.g., dirt, grass, etc.] based on their depth information", Fig. 6, [0205]), to generate an output from the plant detection model (control system 130 determines 1240 a treatment action based on the depth information and pixels identified as a plant, [0206]);

outputting a control signal for the robotic action based on the output from the plant detection model ("the control system 130 classifies crops 302a, 302b, 302c and the weed 350 based on the depth information in depth map 360. The control system 130 then determines which plant[s] to treat and with which treatment action[s], if any. The control system 130 compares the depth information of the crops 302a, 302b, 302c and the weed 350. The control system 130 determines that the weed 350 is significantly shorter than the plants 302a-c. The control system 130 selects the weed 350 for treatment with an herbicide sprayed from a spray nozzle because the weed 350 is significantly shorter than the plants 302a-c", Fig. 4B, [0206]); and

conducting the robotic action for the agricultural task in response to the control signal (via the control system 130 then actuating, step 1250 (Fig. 12), a treatment mechanism 120 (Fig. 3B) to treat one or more of the identified plants with the determined treatment action, as needed, [0206], Figs. 3B, 12).

However, Fu does not explicitly teach the system comprising an average inter-crop spacing for the crop row. In a related field of art, Koch et al. (Koch) teaches crop stand optimization systems, methods and apparatus wherein the method comprises an average inter-crop spacing for the crop row (via "The relative plant location criteria may include one or more plant spacing criteria (e.g., a distance between the plant and the nearest adjacent plant, the average distance between the plant and each adjacent plant, the average distance between the plant and other plants within a threshold distance of the plant, the average distance between the plant and the nearest plants within the same planting row, the average distance between the plant and the nearest plants in the adjacent planting rows", [0044]). It would have been obvious to one of ordinary skill in the art, having the teachings of Fu and Koch before him before the effective filing date of the claimed invention, to modify the systems of Fu to include the teachings of Koch (crop stand optimization method, inter-crop spacing) and configure them with the system of Fu in order to estimate an ear potential value for each plant and optimize the crop population in the field based on plant criteria. The motivation to combine the two teachings is identifying crop appearing in a certain zone of the crop row and distinguishing between crop and weed (i.e., weed identification to uproot the weed).

As per Claim 18, Fu as modified by Koch teaches the limitations of Claim 16. Fu in view of Koch further teaches a vehicle (farming machine 100, Figs. 1A-E), wherein the imager is positioned to be moved along the crop row by the vehicle (a farming machine [e.g., farming machine 100] includes one or more sensors for capturing an image as the farming machine travels through a field, Figs. 1A-E, [0203]), and wherein the one or more computer-readable media further store instructions that, when executed by the one or more processors (the storage unit 1616 includes a machine-readable medium 1622 on which is stored instructions 1624 [e.g., software] embodying any one or more of the methodologies or functions described, [0243]), cause the system to move the vehicle along the crop row (the farming machine 100 includes a control system 130 for controlling operations of system components; the control system 130 may be configured to control operating parameters of the farming machine 100 [e.g., speed, direction], [0086]).

As per Claim 20, Fu as modified by Koch teaches the limitations of Claim 17. Fu in view of Koch further teaches a vehicle (farming machine 100, Figs. 1A-E); an agricultural implement (treatment mechanism 120, Figs. 1C-E, [0086]); wherein the imager is positioned to be moved along the crop row by the vehicle (a farming machine [e.g., farming machine 100] includes one or more sensors for capturing an image as the farming machine travels through a field, Figs. 1A-E, [0203]); wherein the agricultural implement conducts the robotic action ("The control system 130 then actuates 1250 a treatment mechanism 120 to treat one or more of the identified plants with the determined treatment action, as needed", [0206]); and wherein at least one of the agricultural implement and the imager are towed by the vehicle ("The farming machine 100 includes a detection mechanism 110, a treatment mechanism 120, and a control system 130. The farming machine 100 can additionally include a mounting mechanism 140", Figs. 1A-E, [0079]; "the detection mechanism 110 includes an array of image sensors configured to capture an image of a plant. The detection mechanism 110 is statically mounted to the mounting mechanism 140 proximal the treatment mechanism 120 relative to the direction of travel 115", [0083]).

Allowable Subject Matter

Claims 3-6, 11-15 and 19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. The statement of reasons for allowance will be provided in a subsequent office action, pending amendments to the claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MUHAMMAD SHAFI, whose telephone number is (571) 270-5741. The examiner can normally be reached M-F, 8:30 am - 5:00 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott Browne, can be reached at 571-270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MUHAMMAD SHAFI/
Primary Examiner, Art Unit 3666C
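The §103 mapping above turns on the one limitation Fu is said to lack: using an average inter-crop spacing for the crop row inside the detection step. A minimal Python sketch of what a spacing-aware filter of that shape could look like follows; everything here (the names, the 25% tolerance, the example numbers) is hypothetical illustration, not the applicant's claimed implementation or either reference's disclosure.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    position_m: float   # location of the candidate plant along the row, meters
    size_m: float       # size estimate (claim 10 recites location + size output)

def spacing_filter(candidates: list[Detection], avg_spacing_m: float,
                   tol: float = 0.25) -> list[Detection]:
    """Keep detections that fall near multiples of the expected average
    inter-crop spacing; off-grid candidates are treated as likely weeds."""
    kept = []
    for c in candidates:
        offset = c.position_m % avg_spacing_m
        # Distance to the nearest point on the assumed planting grid.
        if min(offset, avg_spacing_m - offset) <= tol * avg_spacing_m:
            kept.append(c)
    return kept

# Usage: with a 0.30 m average spacing, the candidates at 0.29 m and 0.61 m
# sit on the planting grid and are kept; the 0.45 m candidate is filtered out.
crops = spacing_filter(
    [Detection(0.29, 0.05), Detection(0.45, 0.03), Detection(0.61, 0.06)],
    avg_spacing_m=0.30,
)
print([c.position_m for c in crops])    # [0.29, 0.61]
```

Under claim 1's remaining steps, the kept detections would then drive the control signal and the robotic action (e.g., actuating a weeding or spraying implement at each kept position).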

Prosecution Timeline

Oct 17, 2024
Application Filed
Feb 07, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12587320
DISTANCE-BASED NACK PROCEDURES IN A VEHICULAR PLATOON
Granted Mar 24, 2026 • 2y 5m to grant
Patent 12583440
ACTIVE SAFETY SUSPENSION SYSTEM
Granted Mar 24, 2026 • 2y 5m to grant
Patent 12578721
SYSTEMS AND METHODS FOR REMOTE CONTROL OF VEHICLES
Granted Mar 17, 2026 • 2y 5m to grant
Patent 12573251
INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND CONTROL APPARATUS
Granted Mar 10, 2026 • 2y 5m to grant
Patent 12568871
SYSTEM AND METHOD FOR DETERMINING RESIDUE COVERAGE OF A FIELD
Granted Mar 10, 2026 • 2y 5m to grant
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 89%
With Interview: 99% (+16.7%)
Median Time to Grant: 2y 6m
PTA Risk: Low

Based on 1100 resolved cases by this examiner; grant probability is derived from the career allow rate.
