Prosecution Insights
Last updated: April 19, 2026
Application No. 18/419,704

Real-Time Correction of Agricultural-Field Images

Non-Final OA §103
Filed: Jan 23, 2024
Examiner: DUNPHY, DAVID F
Art Unit: 2673
Tech Center: 2600 (Communications)
Assignee: Centure Applications Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 85% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
With Interview: 95%

Examiner Intelligence

Career Allow Rate: 85% (646 granted / 761 resolved), +22.9% vs TC avg (above average)
Interview Lift: +10.3% among resolved cases with interview (moderate lift)
Avg Prosecution: 2y 4m typical timeline (21 applications currently pending)
Total Applications: 782 across all art units
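The headline figures follow directly from the career counts above. A quick sketch, assuming the +10.3% interview lift adds directly to the baseline allow rate (the page implies but does not state this):

```python
granted, resolved = 646, 761

# Career allow rate: granted cases over all resolved cases.
allow_rate = granted / resolved            # ~0.849, displayed as 85%

# Observed lift in allow rate for resolved cases that had an interview.
interview_lift = 0.103

# Assumed additive model for the "with interview" figure.
with_interview = allow_rate + interview_lift   # ~0.952, displayed as 95%
```

This reproduces the displayed 85% and 95% figures to the rounding shown on the page.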

Statute-Specific Performance

§101: 9.0% (-31.0% vs TC avg)
§103: 42.9% (+2.9% vs TC avg)
§102: 25.7% (-14.3% vs TC avg)
§112: 11.5% (-28.5% vs TC avg)
Deltas are measured against Tech Center average estimates. Based on career data from 761 resolved cases.
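The per-statute deltas imply the Tech Center baseline directly. A sketch, assuming each delta is a simple difference from the examiner's rate:

```python
# Examiner's per-statute rates and the displayed deltas vs the TC average.
examiner = {"101": 9.0, "103": 42.9, "102": 25.7, "112": 11.5}
delta    = {"101": -31.0, "103": 2.9, "102": -14.3, "112": -28.5}

# Recover the implied Tech Center baseline for each statute.
tc_avg = {s: round(examiner[s] - delta[s], 1) for s in examiner}
```

Under this assumption every statute resolves to the same 40.0% baseline, suggesting the page compares each statute against a single overall Tech Center average rather than per-statute averages.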

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1 and 4-5 are rejected under 35 U.S.C. 103 as being unpatentable over Rees et al. (US PG Pub. No. 2019/0095710) in view of Herman et al. (US PG Pub. No. 2023/0043536).

With regards to claim 1, the limitations of this claim are obvious over the teachings of the prior art, as evidenced by the following references:

The Rees reference

Rees discloses capturing an image of an agricultural field with a camera (e.g., “machine vision system 40”) mounted on a spray boom (e.g., “spray boom 36”) of an agricultural spray system, the camera operably coupled to a computer (e.g., “processor 52” and “memory 54”) in the agricultural spray system at ¶¶ [0048]-[0053] and FIGS. 1 and 5:

[Image: media_image1.png (greyscale)]

Rees discloses correcting, with the computer, a white balance of the image at ¶ [0082] (“[T]he technique provides for adjustment of the white balance of a resulting image based on detection of the white balance of the light source using an algorithm as described in the above identified White Balance application.”). However, Rees does not specify that the white balance of the image was corrected based on a spectral power distribution of the sun and a camera response function of the camera. However, this limitation was known in the art as evidenced by the Herman reference.

The Herman reference

Herman discloses estimating, with a computer, a spectral power distribution of the sun based on a date and a time that the image was captured at ¶¶ [0017]-[0019] (“[A] light source 202 (e.g., the sun) may emit light as an ambient light source... E(λ) may represent the spectral power density of external lighting”) and FIGS. 2-3; ¶¶ [0024]-[0025] (“the lighting (sun) position can be inferred from vehicle orientation and sun orientation (which can be obtained from a lookup table based on date, time, and location). For example, solar position may be determined based on a day of the year and the time of day”).

Herman discloses correcting, with the computer, a white balance of the image based on the spectral power distribution of the sun and a camera response function of the camera, the camera response function stored in computer memory operably coupled to the computer at ¶¶ [0013]-[0014] (“Given a camera, and assuming that its color space response function has been measured… [G]iven a response function for the camera, the unknowns mentioned above can be solved and used to derive accurate white balancing matrices”); ¶¶ [0018]-[0019] (“The formula 110 in FIG. 2 models camera sensor response as a function of pixel location (x), light-object angle (a), object-camera angle (n), illumination intensity (I), surface reflectivity (S), wavelength, and spectral power distribution (E).”); ¶¶ [0021]-[0022]; ¶¶ [0025]-[0026].

[Image: media_image2.png (greyscale)]

At the time of the filing of the present application, it would have been obvious to a person of ordinary skill in the art to correct the white balance of the image based on a spectral power distribution of the sun and a camera response function of the camera, as taught by Herman, when correcting a white balance of the image, as taught by Rees. The motivation for doing so comes from Herman, which discloses: “Traditional white balancing and color correction of images is based on assumptions about the lighting environment… However, depending on the camera sensor and the display and illumination of the environment, an object may have different apparent color and lighting. For images of an object, this variation can be displeasing to the eye and can reduce the accuracy of image processing algorithms such as computer vision algorithms. In short, there is a problem with uncertainty in the lighting environment within which an image is captured.” (¶ [0012]). Therefore, it would have been obvious to combine Herman with Rees to obtain the invention specified in this claim.

With regards to claim 4, Herman discloses estimating the spectral power distribution of the sun includes querying the computer memory at ¶ [0025] (“[T]he lighting (sun) position can be inferred from vehicle orientation and sun orientation (which can be obtained from a lookup table based on date, time, and location). For example, solar position may be determined based on a day of the year and the time of day…”); see also ¶¶ [0018]-[0019] and FIG. 2.
With regards to claim 5, Herman discloses querying a database or a look-up table stored in the computer memory at ¶ [0025] (“[T]he lighting (sun) position can be inferred from vehicle orientation and sun orientation (which can be obtained from a lookup table based on date, time, and location). For example, solar position may be determined based on a day of the year and the time of day…”); see also ¶¶ [0018]-[0019] and FIG. 2.

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Rees et al. (US PG Pub. No. 2019/0095710) in view of Herman et al. (US PG Pub. No. 2023/0043536) in further view of Collins et al. (US PG Pub. No. 2025/0000079).

With regards to claim 2, Rees discloses capturing an image of an agricultural field with a camera (e.g., “machine vision system 40”) mounted on a spray boom (e.g., “spray boom 36”) of an agricultural spray system, the camera operably coupled to a computer (e.g., “processor 52” and “memory 54”) in the agricultural spray system at ¶¶ [0048]-[0053] and FIGS. 1 and 5. Rees does not specify producing light with a light source mounted on the spray boom, the light produced while capturing the image, the light source operably coupled to the computer. However, this limitation was known in the art as evidenced by the Collins reference.

Collins discloses producing light with a light source (e.g., “lights 60”) mounted on the spray boom (e.g., “boom arm 22”), the light produced while capturing an image, the light source operably coupled to the computer at ¶ [0043]; ¶ [0046]; ¶¶ [0047]-[0051] (“disposed on boom arm 22 to illuminate a subset of nozzle sprays from nozzles 50”); ¶¶ [0068]-[0069] (“camera (70) can capture images of a spray pattern of a nozzle (50, 51)”) and FIG. 2.
At the time of the filing of the present application, it would have been obvious to a person of ordinary skill in the art to produce light with a light source (e.g., “lights 60”) mounted on the spray boom (e.g., “boom arm 22”), as taught by Collins, while capturing an image of an agricultural field with a camera (e.g., “machine vision system 40”) mounted on a spray boom (e.g., “spray boom 36”), as taught by Rees. The motivation for doing so comes from the prior art, wherein one of ordinary skill in the art would have understood, as a matter of common sense, that illuminating a subject improves images of that subject. Therefore, it would have been obvious to combine Collins with Rees to obtain the invention specified in this claim.

Herman discloses correcting, with a computer, the white balance of the image based on the spectral power distribution of the sun, a spectral power distribution of a light source, and the camera response function of the camera, the spectral power distribution of the light source stored in the computer memory at ¶¶ [0013]-[0014] (“Given a camera, and assuming that its color space response function has been measured… [G]iven a response function for the camera, the unknowns mentioned above can be solved and used to derive accurate white balancing matrices… [K]nowns might include …, lighting controlled in the environment… Another benefit of the present disclosure is that a device may capture ambient lighting inside a vehicle from the vehicle (e.g., sunlight outside of the vehicle captured from inside the vehicle), or from light emitted by a device within the vehicle (e.g., from a cell phone).”); ¶¶ [0016]-[0019] (“[A] vehicle cabin is exposed to highly dynamic lighting conditions as well as variation in the color of objects in the field of view of the camera. Weather, time of day, streetlights, headlights of other vehicles, low-horizon sunlight, user computing devices, and other conditions introduce potentially significant unknown lighting and color affects that may affect images of the vehicle cabin… The formula 110 in FIG. 2 models camera sensor response as a function of pixel location (x), light-object angle (a), object-camera angle (n), illumination intensity (I), surface reflectivity (S), wavelength, and spectral power distribution (E).”); ¶¶ [0021]-[0022]; ¶¶ [0025]-[0026]. The motivation for the combination is the same as previously presented.

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Rees et al. (US PG Pub. No. 2019/0095710) in view of Herman et al. (US PG Pub. No. 2023/0043536) and Khait et al. (US PG Pub. No. 2022/0092705).

With regards to claim 3, Rees discloses, after correcting the white balance of the image, automatically analyzing the white-balance-corrected image for a presence of at least one weed at ¶¶ [0060]-[0062] (“[T]he system utilizes the information from the plant image identification process to control a selective application system configured to selectively apply pesticide to those plants identified as weed plants”); see also ¶¶ [0054]-[0055] (“[C]orrection of image shadows is performed in Step S62…, a plant identification process is performed in Step S64”) and ¶ [0082] (“[A] correction technique mentioned above that employs a camera aimed at the light source can be employed at this point to compensate for any inconsistent lighting. That is, the technique provides for adjustment of the white balance…”). However, Rees does not specify using a machine-learning (ML) model when analyzing for the presence of a weed. However, this limitation was known in the art as evidenced by the Khait reference discussed below.
Rees further discloses automatically selectively spraying one or more of the respective regions of the agricultural field using one or more selective-spray nozzles associated with the white-balance-corrected image where the at least one weed is detected, the one or more selective-spray nozzles fluidly coupled to a container holding one or more herbicides at ¶¶ [0060]-[0062] (“[T]he system utilizes the information from the plant image identification process to control a selective application system configured to selectively apply pesticide to those plants identified as weed plants”); see also ¶¶ [0054]-[0055] and ¶ [0082].

Khait discloses automatically analyzing, with a trained machine-learning (ML) model running on a computer, an image for a presence of at least one weed and automatically detecting, with the computer, the at least one weed in the image at ¶¶ [0139]-[0153]; see also ¶¶ [0038]-[0039].

Khait discloses the trained ML model having been trained with first (“correlated”) and second (“not necessarily correlated”) images of agricultural fields, the first training images including one or more target weeds (“similar weed parameters”), the second training images not including the one or more target weeds (“different weed parameters”) at ¶ [0163] (“At 702, a training dataset is provided and/or selected. The training dataset may include sample images from sample fields that are correlated with one or more target fields … depicting similar weed parameters (e.g., similar weed species and/or similar stages of growth and/or similar size weeds)… [A]dditionally, the training dataset depicts sample images from sample fields which are not necessarily correlated with any specific target fields… Such training dataset may include sample images depicting different weed parameters from fields of different field parameters.”)

At the time of filing of the present application, it would have been obvious to a person of ordinary skill in the art to identify weeds in an image using machine learning, as taught by Khait, as a substitute for identifying weeds in an image using a plant image database, as taught by Rees. This combination is a simple substitution of one known element for another to obtain predictable results. The prior art contained a method, taught by Rees, which differed from the claimed method by the substitution of the image analysis method used to identify weeds in an image. Machine learning, and its ability to function as a weed identifier in an agricultural setting, were known in the art as evidenced by the Khait reference. One of ordinary skill in the art could have substituted the machine-learning weed identifier into the method taught by Rees, and the results would have been predictable; to wit, weeds would be identified in the agricultural images.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID F DUNPHY, whose telephone number is (571) 270-1230. The examiner can normally be reached 9 am - 5 pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chineyere Wills-Burns, can be reached at (571) 272-9752. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DAVID F DUNPHY/
Primary Examiner, Art Unit 2673
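The rejection turns on deriving white-balance corrections from the sun's spectral power distribution (SPD) and a measured camera response function. For illustration only (this is not the applicant's or Herman's actual algorithm), a minimal sketch using a blackbody approximation of daylight and hypothetical Gaussian channel sensitivities:

```python
import math

def blackbody_spd(wavelength_nm, temp_k):
    """Relative spectral power of a blackbody illuminant (Planck's law)."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    wl = wavelength_nm * 1e-9
    return (2 * h * c**2 / wl**5) / (math.exp(h * c / (wl * k * temp_k)) - 1)

def gaussian_response(wavelength_nm, peak_nm, width_nm):
    """Hypothetical camera channel sensitivity modeled as a Gaussian."""
    return math.exp(-((wavelength_nm - peak_nm) / width_nm) ** 2)

# Hypothetical per-channel sensitivity peaks and widths (nm) for an RGB sensor.
CHANNELS = {"r": (600, 50), "g": (540, 50), "b": (460, 50)}

def white_balance_gains(temp_k):
    """Integrate the illuminant SPD against each channel response and return
    per-channel gains normalized so the green channel's gain is 1.0."""
    responses = {}
    for name, (peak, width) in CHANNELS.items():
        total = 0.0
        for wl in range(380, 781, 5):  # visible range, 5 nm steps
            total += blackbody_spd(wl, temp_k) * gaussian_response(wl, peak, width)
        responses[name] = total
    return {name: responses["g"] / r for name, r in responses.items()}

# Approximate midday-sun color temperature; in the claimed system the SPD
# would instead come from a date/time-based solar-position lookup.
gains = white_balance_gains(5500)
```

Multiplying each channel of a raw image by its gain would neutralize the assumed illuminant, which is the general shape of the SPD-plus-response-function correction the claims describe.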

Prosecution Timeline

Jan 23, 2024
Application Filed
Mar 06, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596884: NATURAL LANGUAGE PROCESSING BASED DOMINANT ITEM DETECTION IN VIDEOS
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12586341: AUTOMATIC EXTRACTION OF SALIENT OBJECTS IN VIRTUAL ENVIRONMENTS FOR OBJECT MODIFICATION AND TRANSMISSION
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12586358: OBJECT RECOGNITION METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIUM
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12579216: TECHNIQUES FOR CLASSIFICATION WITH NEURAL NETWORKS
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12579831: GENERATING IMAGE DIFFERENCE CAPTIONS VIA AN IMAGE-TEXT CROSS-MODAL NEURAL NETWORK
Granted Mar 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 85%
With Interview: 95% (+10.3%)
Median Time to Grant: 2y 4m
PTA Risk: Low
Based on 761 resolved cases by this examiner. Grant probability derived from career allow rate.
