Prosecution Insights
Last updated: April 19, 2026
Application No. 18/826,893

VISION GUIDANCE SYSTEM USING DYNAMIC EDGE DETECTION

Status: Non-Final OA (§103)
Filed: Sep 06, 2024
Examiner: LEITE, PAULO ROBERTO GONZALEZ
Art Unit: 3663
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Deere & Company
OA Round: 1 (Non-Final)

Grant Probability: 52% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 8m
Grant Probability With Interview: 70%

Examiner Intelligence

Career Allow Rate: 52% (44 granted / 85 resolved; at TC average)
Interview Lift: +17.8% in resolved cases with an interview
Avg Prosecution: 3y 8m
Currently Pending: 35 applications
Career History: 120 total applications across all art units
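As a sanity check, the headline figures above are simple arithmetic on the examiner's resolved cases. A minimal sketch, assuming (as the displayed numbers imply) that the interview "lift" is additive in percentage points on top of the career allow rate:

```python
# Career allow rate from the examiner's resolved cases: 44 granted of 85.
granted, resolved = 44, 85
career_allow = granted / resolved        # ~0.518, displayed as 52%

# Assumed additive lift: 52% baseline + 17.8 points matches the 70%
# "with interview" figure shown on this page.
interview_lift = 0.178
with_interview = career_allow + interview_lift

print(round(career_allow * 100))         # baseline grant probability
print(round(with_interview * 100))       # interview-adjusted probability
```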

Statute-Specific Performance

§101: 11.3% (-28.7% vs TC avg)
§103: 67.0% (+27.0% vs TC avg)
§102: 9.6% (-30.4% vs TC avg)
§112: 8.8% (-31.2% vs TC avg)
Deltas are relative to the Tech Center average estimate • Based on career data from 85 resolved cases
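The per-statute deltas are internally consistent: back-solving each row (rate minus delta) yields the same implied Tech Center baseline. A quick check using only the numbers in the table above:

```python
# Examiner allow rate per rejection statute and delta vs the TC average.
stats = {
    "101": (11.3, -28.7),
    "103": (67.0, +27.0),
    "102": (9.6, -30.4),
    "112": (8.8, -31.2),
}

# Implied Tech Center baseline for each statute: rate - delta.
implied = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(implied)  # every statute backs out the same ~40.0% baseline
```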

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This Office Action is in response to the aforementioned Application filed September 6, 2024. Claims 1-20 are presently pending and presented for examination.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on December 18, 2024, is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Specification

The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Claim Objections

Claim 20 is objected to because of the following informalities: Claim 20 recites "the memory comprising insturctions that, when exeucted by the one or more processors, cause the one or more processors to:..." The bolded words are believed to contain typos and should be amended to recite --the memory comprising instructions that, when executed by the one or more processors, cause the one or more processors to:...-- Appropriate correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-8 and 10-20 are rejected under 35 U.S.C. 103 as being unpatentable over Ellaboudy et al. (US 20210000006; hereinafter Ellaboudy, of record in IDS), in view of Slichter et al. (US 20150253427; hereinafter Slichter).

Regarding Claim 1, Ellaboudy teaches A method (Ellaboudy: Abstract) comprising: ... identifying a set of candidate edges within an image portion corresponding to the location within the image, (Ellaboudy: Paragraph [0196]) each candidate edge corresponding to a candidate boundary between two different surface types; (Ellaboudy: Paragraph [0203]) ... modifying an operation of the vehicle based on the selected candidate edge. (Ellaboudy: Paragraph [0203]-[0205], [0215])

Ellaboudy does not explicitly teach a method for selecting an edge for a vehicle to follow by determining an edge from a set of candidate edges. However, in the same field of endeavor, Slichter teaches ... receiving, from an operator, an input representative of a location within an image of a ground surface in front of a vehicle; (Slichter: Paragraph [0028]; "The measurements of the distance sensors 112 are interpreted by the controller 114 to again identify the location of the edge. The controller 114 adjusts the location of the edge according to the offset of the distance sensor 112 from either of the first or second ends 116, 118. Accordingly, the controller (or operator) assesses which of the first or second ends 116, 118 is closest to the edge of the unharvested crop to ensure the header 104 harvests. The controller 114 uses the selected first or second end 116, 118 to accordingly identify the edge location and then index the identified location relative to the selected end 116, 118.") ... determining, for each of the set of candidate edges, a distance between the candidate edge and the location within the image (Slichter: Paragraph [0031], [0035]) represented by the input received from the operator; (Slichter: Paragraph [0028]; "The measurements of the distance sensors 112 are interpreted by the controller 114 to again identify the location of the edge. The controller 114 adjusts the location of the edge according to the offset of the distance sensor 112 from either of the first or second ends 116, 118. Accordingly, the controller (or operator) assesses which of the first or second ends 116, 118 is closest to the edge of the unharvested crop to ensure the header 104 harvests. The controller 114 uses the selected first or second end 116, 118 to accordingly identify the edge location and then index the identified location relative to the selected end 116, 118.") applying an edge selection model to the set of candidate edges, the edge selection model configured to select an edge of the set of candidate edges based at least in part on the determined distance for each candidate edge; (Slichter: Paragraph [0031]-[0032]) and ...

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method for operating an agricultural vehicle of Ellaboudy with the candidate edge identification system of Slichter for the benefit of providing an accurate and reliable automatic identification of the edge of a region, for instance a cut edge of an agricultural crop during harvesting. (Slichter: Paragraph [0006])

Regarding Claim 2, Ellaboudy, in view of Slichter, teaches The method of claim 1, wherein modifying the operation of the vehicle comprises aligning a tool or instrument of the vehicle with the selected candidate edge. (Ellaboudy: Paragraph [0005], [0196], [0203]-[0205]; The vehicle is able to detect distance from the crop row and move itself and any attached tools to align with the path that is bounded by the crop row.)

Regarding Claim 3, Ellaboudy, in view of Slichter, teaches The method of claim 1, wherein modifying the operation of the vehicle comprises modifying a route navigated by the vehicle. (Ellaboudy: Paragraph [0203]-[0205], [0215])

Regarding Claim 4, Ellaboudy, in view of Slichter, teaches The method of claim 3, wherein the route navigated by the vehicle is modified to align a tool or instrument pulled by the vehicle with the selected candidate edge. (Ellaboudy: Paragraph [0005], [0196], [0203]-[0205]; The vehicle is able to detect distance from the crop row and move itself and any attached tools to align with the path that is bounded by the crop row.)

Regarding Claim 5, Ellaboudy, in view of Slichter, teaches The method of claim 1, wherein modifying the operation of the vehicle comprises modifying a speed of operating the vehicle. (Ellaboudy: Paragraph [0139])

Regarding Claim 6, Ellaboudy, in view of Slichter, teaches The method of claim 1, wherein the image is one or more of a series of images automatically captured by the vehicle while navigating via autonomous steering through an area of different surface types. (Ellaboudy: Paragraph [0053])

Regarding Claim 7, Ellaboudy, in view of Slichter, teaches The method of claim 1, wherein the operator is remote to the vehicle. (Ellaboudy: Paragraph [0148]; "In some implementations, users may be enabled to send commands necessary to manually control the vehicle, either from a cockpit of the vehicle, near the vehicle, or remotely (e.g., teleoperation).")

Regarding Claim 8, Ellaboudy, in view of Slichter, teaches The method of claim 1, wherein the image portion within the image comprises a bounding box centered on the location represented by the received input. (Ellaboudy: Paragraph [0195]; The system uses bounding boxes to identify crops in the crop row, therefore identifying the location of the crop row and the distance from the vehicle.)

Regarding Claim 10, Ellaboudy, in view of Slichter, teaches The method of claim 1, further comprising: identifying, by the vehicle, each of the two different surface types; (Ellaboudy: Paragraph [0123]) and selecting, by the vehicle, the edge selection model from a set of edge selection models based on the identified surface types. (Ellaboudy: Paragraph [0196], [0203])

Regarding Claim 11, Ellaboudy, in view of Slichter, teaches The method of claim 1, wherein the set of candidate edges are identified by and the edge selection model is applied by a remote computing system communicatively coupled to the vehicle. (Ellaboudy: Paragraph [0118], [0162], [0165])

Regarding Claim 12, Ellaboudy, in view of Slichter, teaches The method of claim 1, further comprising: iteratively identifying additional sets of candidate edges within additional images of the ground surface in front of the vehicle; (Ellaboudy: Paragraph [0169], [0173]) iteratively selecting respective edges of the iteratively identified additional sets of candidate edges; (Ellaboudy: Paragraph [0169], [0173]) and autonomously modifying the operation of the vehicle based on the iteratively selected respective edges. (Ellaboudy: Paragraph [0203]-[0205], [0215])

Regarding Claim 13, the claim is analogous to Claim 1 limitations with the following additional limitations: A system comprising a hardware processor (Ellaboudy: Paragraph [0050]; Processing Apparatus 130) and a non-transitory computer-readable storage medium (Ellaboudy: Paragraph [0050]; "The processing apparatus 130 may include memory, such as random access memory device (RAM), flash memory, or any other suitable type of storage device such as a non-transitory computer readable memory.") storing executable instructions that, when executed by the processor, are configured to cause the system to: ... Therefore the claim is rejected under the same premise as Claim 1.

Regarding Claim 14, the claim is analogous to Claim 2 limitations and is therefore rejected under the same premise as Claim 2.

Regarding Claim 15, the claim is analogous to Claim 3 limitations and is therefore rejected under the same premise as Claim 3.

Regarding Claim 16, the claim is analogous to Claim 4 limitations and is therefore rejected under the same premise as Claim 4.

Regarding Claim 17, the claim is analogous to Claim 5 limitations and is therefore rejected under the same premise as Claim 5.

Regarding Claim 18, the claim is analogous to Claim 6 limitations and is therefore rejected under the same premise as Claim 6.

Regarding Claim 19, the claim is analogous to Claim 7 limitations and is therefore rejected under the same premise as Claim 7.

Regarding Claim 20, the claim is analogous to Claim 1 limitations with the following additional limitations: An autonomous farming vehicle comprising: a guidance system for determining steering instructions for autonomous steering of the vehicle while navigating through an area of different surface types; (Ellaboudy: Paragraph [0052], [0139]) an image sensor for capturing images of a ground surface in front of the vehicle; (Ellaboudy: Paragraph [0053]; Image Sensors 144) memory; (Ellaboudy: Paragraph [0050]; "The processing apparatus 130 may include memory, such as random access memory device (RAM), flash memory, or any other suitable type of storage device such as a non-transitory computer readable memory. The memory of the processing apparatus 130 may include executable instructions and data that can be accessed by one or more processors of the processing apparatus 130.") and one or more processors operatively coupled to the memory, (Ellaboudy: Paragraph [0050]; Processing Apparatus 130) the memory comprising insturctions that, when exeucted by the one or more processors, cause the one or more processors to: ... Therefore the claim is rejected under the same premise as Claim 1.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Ellaboudy, in view of Slichter, as applied to claims 1-8 and 10-20 above, and further in view of Guo et al. (US 20190228224; hereinafter Guo, of record in IDS).

Regarding Claim 9, Ellaboudy, in view of Slichter, teaches The method of claim 1, wherein the two different surface types comprise soil and one of a crop, grass, and pavement, (Ellaboudy: Paragraph [0071], [0232]; The two terrain types are the soil in the lane that the vehicle drives on and the raised plant beds where the crops are located.) Ellaboudy, in view of Slichter, does not teach that the edge selection model is a machine learned model that can distinguish between various surface and boundary types. However, in the same field of endeavor, Guo teaches ...and wherein the edge selection model comprises a machine-learned model that is trained on images of manually tagged boundaries between soil and the one of a crop, grass, and pavement. (Guo: Paragraph [0025], [0029], [0040])

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Ellaboudy, in view of Slichter, with the machine-learned model of Guo for the benefit of inexpensively, accurately, and frequently identifying agricultural land on a sufficiently granular level for one or more particular geographical regions and the crop(s) growing on the agricultural land. (Guo: Paragraph [0005])

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to PAULO ROBERTO GONZALEZ LEITE whose telephone number is (571)272-5877. The examiner can normally be reached Mon-Fri: 8:00 am - 4:30 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Abby Flynn, can be reached at 571-272-9855. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/P.R.L./ Examiner, Art Unit 3663
/ABBY J FLYNN/ Supervisory Patent Examiner, Art Unit 3663
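For orientation only, the method the examiner maps across Ellaboudy and Slichter in Claim 1 (operator input, candidate edges, per-edge distance, selection) can be sketched in code. The names and the pure nearest-distance rule below are illustrative assumptions, not the application's actual edge selection model, which may weigh other features as well:

```python
from dataclasses import dataclass
import math

@dataclass
class Edge:
    # A candidate boundary between two surface types, represented here
    # (hypothetically) by the midpoint of the detected edge in image coords.
    x: float
    y: float

def select_edge(candidates, tap):
    """Pick the candidate edge nearest the operator-indicated location.

    A minimal distance-based stand-in for the claimed "edge selection
    model": the determined distance to the operator's input is the sole
    selection criterion in this sketch.
    """
    tx, ty = tap
    return min(candidates, key=lambda e: math.hypot(e.x - tx, e.y - ty))

# Three hypothetical candidate edges and an operator tap near the middle one.
edges = [Edge(120, 300), Edge(400, 310), Edge(640, 295)]
print(select_edge(edges, (390, 305)))  # the edge closest to the tap
```

The selected edge would then drive the final claimed step (modifying vehicle operation, e.g. aligning an implement or adjusting the route).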

Prosecution Timeline

Sep 06, 2024
Application Filed
Jan 15, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12590808: METHOD FOR RECOMMENDING PARKING, ELECTRONIC DEVICE, AND STORAGE MEDIUM (Granted Mar 31, 2026; 2y 5m to grant)
Patent 12589754: MOTOR VEHICLE HAVING A FIRST DRIVE MACHINE AND A SECOND DRIVE MACHINE CONFIGURED AS AN ELECTRIC MACHINE AND METHOD FOR OPERATING A MOTOR VEHICLE (Granted Mar 31, 2026; 2y 5m to grant)
Patent 12570415: UAV WITH MANUAL FLIGHT MODE SELECTOR (Granted Mar 10, 2026; 2y 5m to grant)
Patent 12559916: WORK MACHINE CONTROL SYSTEM FOR INDICATING IMPLEMENT POSITION (Granted Feb 24, 2026; 2y 5m to grant)
Patent 12533986: APPARATUS AND APPLICATION FOR PREDICTING DISCHARGE OF BATTERY (Granted Jan 27, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 52%
With Interview: 70% (+17.8%)
Median Time to Grant: 3y 8m
PTA Risk: Low
Based on 85 resolved cases by this examiner. Grant probability derived from career allow rate.
