Prosecution Insights
Last updated: April 19, 2026
Application No. 17/586,479

IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, PROGRAM, AND STORAGE MEDIUM

Status: Final Rejection (§103)
Filed: Jan 27, 2022
Examiner: ISLAM, MEHRAZUL NMN
Art Unit: 2662
Tech Center: 2600 — Communications
Assignee: Canon Kabushiki Kaisha
OA Round: 4 (Final)
Grant Probability: 58% (Moderate)
Expected OA Rounds: 5-6
Time to Grant: 3y 4m
Grant Probability with Interview: 86%

Examiner Intelligence

Career Allow Rate: 58% (29 granted / 50 resolved; -4.0% vs TC avg)
Interview Lift: +28.3% for resolved cases with interview (strong)
Avg Prosecution: 3y 4m (typical timeline)
Total Applications: 96 across all art units (46 currently pending)

Statute-Specific Performance

§101: 9.2% (-30.8% vs TC avg)
§103: 68.6% (+28.6% vs TC avg)
§102: 4.1% (-35.9% vs TC avg)
§112: 15.2% (-24.8% vs TC avg)
Tech Center averages are estimates. Based on career data from 50 resolved cases.

Office Action (§103, Final)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Applicant’s response to the Non-final Office Action dated 06/04/2025, filed with the office on 10/01/2025, has been entered and made of record.

Priority

Certified copies of foreign documents JP2019-140818 and JP2020-124031 have not been provided to the Office. It is respectfully requested that Applicant provide certified copies of the cited foreign references to the Office.

Status of Claims

Claims 2, 4-6, 10-13, 15, 17 and 20 are pending. Claims 2, 4-6, 11, 12, 15, 17 and 20 are amended. Claims 1, 7-9, 14, 16 and 19 are cancelled. Claims 3 and 18 are non-elected.

Response to Arguments

Applicant’s amendment of independent Claims 2 and 17, which has altered the scope of the claims of the instant application, has necessitated the new ground(s) of rejection presented in this Office action. Accordingly, in response to Applicant’s arguments that are directed merely to the amended portion of the claims, new analyses have been presented below, which render Applicant’s arguments moot. Consequently, THIS ACTION IS MADE FINAL.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: “imaging unit” in claims 2 and 17, and “imaging element” in claims 2, 15 and 17.

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structures described in the applicant’s drawings (the algorithms/flow charts depicted in Fig. 22) and in the applicant’s specification, ¶0018: “The imaging unit 205 is, for example, an imaging element such as a CCD or CMOS sensor, performs photoelectric conversion of an optical image that is formed by the optical system 204 on the imaging element of the imaging unit 205, and outputs an obtained analog image signal to an A/D conversion unit 206,” and ¶0004: “a lens that is an optical system or an imaging element,” as performing the claimed functions, and equivalents thereof.
If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid interpretation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid interpretation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary.
Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 2, 5, 6, 10, 11, 15, 17 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Abe (US 2016/0373643 A1), in view of Masuda (US 2011/0298899 A1), in further view of Tang et al. (US 2015/0042812 A1), and still in further view of Skidmore et al. (US 2018/0350103 A1).

Regarding claim 2, Abe teaches: An image processing apparatus (Abe, ¶0018: “an image processing device”) comprising: at least one processor and/or circuit (Abe, ¶0064: “comprise one or more processors (e.g., central processing unit (CPU)”) and at least one memory (Abe, ¶0064: “a random-access memory (RAM), a read only memory (ROM)”) storing a computer program, which causes the at least one processor and/or circuit to function as the following units (Abe, ¶0064: “computer executable instructions may be provided to the computer, for example, from a network or the storage medium”): an input unit configured to input a distance information (Abe, ¶0002: “acquiring information regarding a distance of a subject”) distribution calculated from an image captured by using an optical system that forms a field image (Abe, ¶0005: “calculating a distribution of a subject distance in a photographing scene”) on an imaging element of an imaging unit (Abe, ¶0023: “The image sensor… (CMOS) element”; see the 112(f) discussion above for the examiner’s interpretation of “imaging element”); a decision unit configured to decide, based on a comparison (Abe, ¶0026: “image processing unit 30 detects a relative image deviation amount… through correlation calculation”) between the distance information distribution and the estimated depth direction (Abe, ¶0026: “The image processing unit 30 calculates defocus information corresponding to a subject distance”), an evaluation value indicating a deviation degree (Abe, ¶0026: “image processing unit 30 detects a relative image deviation amount of the images”) in a depth direction of a subject in the image (Abe, ¶0028: “calculates a defocus amount (notated as DEF) of a subject with respect to the imaging surface using an image deviation amount”).

However, Abe does not explicitly teach: an estimation unit configured to estimate a depth direction in the image from an imaging condition of the imaging unit, and a generation unit configured to generate an information for controlling at least one of a pan, tilt, and an imaging position of an imaging apparatus including the imaging unit based on the evaluation value in order that the imaging apparatus faces the subject, wherein the imaging condition is at least one of orientation information of the apparatus when the image is captured, a vanishing point in the image, a change in a density of texture in the image and a determination result as to whether a structure with a known shape is included in the image, and wherein the decision unit decides the evaluation value based on a comparison between the estimated depth direction and a depth direction derived from the distance information distribution.

In an analogous field of endeavor, Masuda teaches: an estimation unit configured to estimate (Masuda, ¶0060: “it is calculated that”) a depth direction in the image (Masuda, ¶0060: “a direction parallel to an optical axis of the taking lens (a distance in a depth direction)”) from an imaging condition of the imaging unit (Masuda, ¶0060: “based on… a focal length”; applicant’s specification defines imaging condition in ¶0075: “imaging conditions (focal length, focus lens position, aperture) of the optical system”).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Abe using the teachings of Masuda to introduce computing a depth direction from an imaging condition. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of estimating the direction of focus of the camera. Therefore, it would have been obvious to combine the analogous arts Abe and Masuda to obtain the above-described limitations of claim 2.

However, the combination of Abe and Masuda does not explicitly teach: a generation unit configured to generate an information for controlling at least one of a pan, tilt, and an imaging position of an imaging apparatus including the imaging unit based on the evaluation value in order that the imaging apparatus faces the subject, wherein the imaging condition is at least one of orientation information of the apparatus when the image is captured, a vanishing point in the image, a change in a density of texture in the image and a determination result as to whether a structure with a known shape is included in the image, and wherein the decision unit decides the evaluation value based on a comparison between the estimated depth direction and a depth direction derived from the distance information distribution.
In another analogous field of endeavor, Tang teaches: a generation unit configured to generate an information (Tang, ¶0126: “determine reference and desired aim-point motion for digital pan-tilt camera and camera track system”) for controlling at least one of a pan, tilt, and an imaging position of an imaging apparatus including the imaging unit based on the evaluation value in order that the imaging apparatus faces the subject (Tang, ¶0126: “the automatic target following camera view control is to adjust the camera pan and tilt angles… By achieving this, the target object is guaranteed to be displayed in the camera image frames with the best relative frame position”), wherein the imaging condition is at least one of orientation information of the apparatus when the image is captured (Tang, ¶0011: “video frames are captured from a camera system whose orientation is determined from the camera's position and motion”), a vanishing point in the image, a change in a density of texture in the image and a determination result as to whether a structure with a known shape is included in the image (alternative limitation is considered).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Abe in view of Masuda using the teachings of Tang to introduce controlling camera pan-tilt angles. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of adjusting a camera angle to keep the subject in a desired location in the frame. Therefore, it would have been obvious to combine the analogous arts Abe, Masuda and Tang to obtain the invention of claim 2.

However, the combination of Abe, Masuda and Tang does not explicitly teach: wherein the decision unit decides the evaluation value based on a comparison between the estimated depth direction and a depth direction derived from the distance information distribution.
In a still further analogous field of endeavor, Skidmore teaches: wherein the decision unit decides the evaluation value based on a comparison (Skidmore, Claim 3: “correcting an initial value for the FOV based on the difference between the first angle estimate and the second angle estimate”) between the estimated depth direction (Skidmore, Claim 3: “a second angle estimate for the angle subtended by… a location along the viewing direction”) and a depth direction derived from the distance information distribution (Skidmore, Claim 3: “a first angle estimate for an angle subtended by… a viewing direction of the camera using distances measured directly from one or more frames”).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Abe in view of Masuda in further view of Tang using the teachings of Skidmore to introduce comparing computed camera angles. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of detecting a deviation degree of the camera. Therefore, it would have been obvious to combine the analogous arts Abe, Masuda, Tang and Skidmore to obtain the invention of claim 2.

Regarding claim 5, Abe in view of Masuda, in further view of Tang and still in further view of Skidmore teaches: The image processing apparatus according to claim 2, wherein the distance information distribution is (the strikethrough portion is not considered) information related to a distribution of an actual distance from an imaging position to a subject (Abe, ¶0004: “distributions of distances to subjects (hereinafter referred to as subject distances)”).
Regarding claim 6, Abe in view of Masuda, in further view of Tang and still in further view of Skidmore teaches: The image processing apparatus according to claim 2, wherein the information related to the distribution of a parallax amount of the subject is obtained from a pair of images with a parallax therebetween (Masuda, ¶0002: “a pair of camera with a certain distance in right and left between them, captures a measurement object as the subject, and gets a parallax image”).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Abe in view of Masuda, in further view of Tang and still in further view of Skidmore using the additional teachings of Masuda to introduce an image pair with a parallax. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of computing the coordinates of an object visible in the images. Therefore, it would have been obvious to combine the analogous arts Abe, Masuda, Tang and Skidmore to obtain the invention of claim 6.

Regarding claim 10, Abe in view of Masuda, in further view of Tang and still in further view of Skidmore teaches: The image processing apparatus according to claim 2, wherein the deviation degree in the depth direction of the subject in the image (Abe, ¶0026: “image processing unit 30 detects a relative image deviation amount of the images”) is a difference between defocus amounts (Abe, ¶0026: “The image processing unit 30 calculates defocus information corresponding to a subject distance”) at a plurality of positions in the distance information distribution (Abe, ¶0026: “a relative image deviation amount of the images A and B through correlation calculation (phase difference detection scheme)”).
Regarding claim 11, Abe in view of Masuda, in further view of Tang and still in further view of Skidmore teaches: The image processing apparatus according to claim 2, wherein the at least one memory causes the at least one processor and/or circuit to further function as a notification unit configured to notify the evaluation value (Abe, ¶0057: “control unit 20 communicates with the lens control unit 120 of the imaging lens 3 to acquire the imaging surface correction information”).

Regarding claim 15, Abe in view of Masuda, in further view of Tang and still in further view of Skidmore teaches: The image processing apparatus according to claim 2, wherein, by associating the evaluation value indicating the deviation degree with information on the imaging element (Abe, ¶0004: “aberration of curvature of field or the like caused due to a manufacturing error or optical characteristics of design of an imaging lens”) and the optical system that have acquired the image for which the evaluation value has been calculated (Abe, ¶0005: “optical characteristics of design of an imaging lens and imaging surface flatness of an image sensor are calculated.”), the information is output to an external apparatus (Abe, ¶0050: “An external apparatus, a computer, or the like connected to the imaging apparatus acquires the image data and the metadata”).

Regarding claim 17, it recites a method with steps corresponding to the elements in the apparatus recited in claim 2. Therefore, the recited steps of the method claim 17 are mapped to the proposed combination in the same manner as the corresponding elements in the apparatus claim 2. Additionally, the rationale and motivation to combine Abe, Masuda, Tang and Skidmore presented in the rejection of claim 2 apply to this claim. Additionally, Abe teaches: An image processing method (Abe, ¶0064: “a method performed by the computer of the system or apparatus”).
Regarding claim 20, Abe in view of Masuda, in further view of Tang and still in further view of Skidmore teaches: A non-transitory computer-readable storage medium having stored the computer program stored in the at least one memory included in the image processing apparatus according to claim 2 (Abe, ¶0064: “computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiments and/or that includes one or more circuits”).

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Abe (US 2016/0373643 A1), in view of Masuda (US 2011/0298899 A1), in further view of Tang et al. (US 2015/0042812 A1), still in further view of Skidmore et al. (US 2019/0281214 A1) and yet in further view of Miyazawa et al. (US 2017/0257556 A1).

Regarding claim 4, Abe in view of Masuda, in further view of Tang and still in further view of Skidmore teaches: The image processing apparatus according to claim 2, wherein the distance information distribution is information related to a distribution obtained by. However, the combination of Abe, Masuda, Tang and Skidmore does not explicitly teach: normalizing a distribution of a defocus amount of a subject by an F-number and an acceptable circle of confusion.

In an analogous field of endeavor, Miyazawa teaches: normalizing a distribution of a defocus amount of a subject by an F-number and an acceptable circle of confusion (Miyazawa, ¶0155: “the defocus amount in the case where the focus state is ‘normal’ can be set to values ranging from 1Fδ to 4Fδ (F: F number, δ: the diameter of a permissible circle of confusion)”).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Abe in view of Masuda, in further view of Tang and still in further view of Skidmore using the teachings of Miyazawa to introduce F-number and circle of confusion. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of improving the tracking performance of the autofocus. Therefore, it would have been obvious to combine the analogous arts Abe, Masuda, Tang, Skidmore and Miyazawa to obtain the invention of claim 4.

Claims 12 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Abe (US 2016/0373643 A1), in view of Masuda (US 2011/0298899 A1), in further view of Tang et al. (US 2015/0042812 A1), still in further view of Skidmore et al. (US 2018/0350103 A1) and yet in further view of Gerlach (US 2014/0168434 A1).

Regarding claim 12, Abe in view of Masuda, in further view of Tang and still in further view of Skidmore teaches: The image processing apparatus according to claim 2, wherein, when an input image is determined to. However, the combination of Abe, Masuda, Tang and Skidmore does not explicitly teach: include a typical object including a ground, a water surface, and a structure constructed in a perpendicular direction of the ground or water surface.

In an analogous field of endeavor, Gerlach teaches: include a typical object including a ground, a water surface, and a structure constructed in a perpendicular direction of the ground or water surface (Gerlach, ¶0052: “the image 501 includes a house 506, a road 508, a small pond”).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Abe in view of Masuda, in further view of Tang and still in further view of Skidmore using the teachings of Gerlach to introduce a predetermined scenario in an image. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of finding a deviation degree using a change in the given image. Therefore, it would have been obvious to combine the analogous arts Abe, Masuda, Tang, Skidmore and Gerlach to obtain the invention of claim 12.

Regarding claim 13, Abe in view of Masuda, in further view of Tang, still in further view of Skidmore and yet in further view of Gerlach teaches: The image processing apparatus according to claim 12, wherein a statistic of the distance information distribution is a histogram of the distance information distribution (Masuda, ¶0080: “a histogram of the pixel value of each image Pt, Pc is calculated”).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Abe in view of Masuda, in further view of Tang, still in further view of Skidmore and yet in further view of Gerlach using the additional teachings of Masuda to introduce a histogram. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of visually presenting information through a histogram. Therefore, it would have been obvious to combine the analogous arts Abe, Masuda, Tang, Skidmore and Gerlach to obtain the invention of claim 13.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MEHRAZUL ISLAM, whose telephone number is (571) 270-0489. The examiner can normally be reached Monday-Friday, 8am-5pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amandeep Saini, can be reached at (571) 272-3382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MEHRAZUL ISLAM/
Examiner, Art Unit 2662

/AMANDEEP SAINI/
Supervisory Patent Examiner, Art Unit 2662
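The claim 4 dispute above turns on normalizing a defocus-amount distribution by F-number and permissible circle of confusion, with Miyazawa's quoted 1Fδ-4Fδ range. The arithmetic behind that limitation can be sketched as follows; this is an illustrative reading only, not the applicant's or Miyazawa's actual implementation, and the function name and sample values are hypothetical.

```python
# Hypothetical sketch: express a defocus-amount map in units of F*delta
# (one "depth of focus"), the normalization discussed for claim 4.
# A normalized value of n means the blur spans n times F * delta.

def normalize_defocus(defocus_mm, f_number, coc_mm):
    """Divide each defocus amount (mm) by F * delta (mm)."""
    depth_of_focus = f_number * coc_mm  # one F*delta unit
    return [d / depth_of_focus for d in defocus_mm]

# Example: delta ~ 0.03 mm (a common full-frame assumption), lens at f/2.8,
# so one F*delta unit is 2.8 * 0.03 = 0.084 mm.
defocus_map = [0.0, 0.042, 0.084, 0.336]  # defocus amounts in mm
normalized = normalize_defocus(defocus_map, 2.8, 0.03)
print([round(n, 1) for n in normalized])  # [0.0, 0.5, 1.0, 4.0]
```

Values between 1 and 4 would fall inside the 1Fδ-4Fδ band Miyazawa describes for a "normal" focus state; the normalization makes defocus maps comparable across aperture settings.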

Prosecution Timeline

Jan 27, 2022
Application Filed
Sep 07, 2024
Non-Final Rejection — §103
Jan 10, 2025
Response Filed
Feb 18, 2025
Final Rejection — §103
May 01, 2025
Response after Non-Final Action
May 22, 2025
Request for Continued Examination
May 23, 2025
Response after Non-Final Action
May 29, 2025
Non-Final Rejection — §103
Oct 01, 2025
Response Filed
Nov 28, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602808
METHOD FOR INSPECTING AN OBJECT
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12592075
REMOTE SENSING FOR INTELLIGENT VEGETATION TRIM PREDICTION
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12579695
Method of Generating Target Image Data, Electrical Device and Non-Transitory Computer Readable Medium
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12524900
METHOD FOR IMPROVING ESTIMATION OF LEAF AREA INDEX IN EARLY GROWTH STAGE OF WHEAT BASED ON RED-EDGE BAND OF SENTINEL-2 SATELLITE IMAGE
Granted Jan 13, 2026 (2y 5m to grant)
Patent 12489964
PATH PLANNING
Granted Dec 02, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 58%
With Interview: 86% (+28.3%)
Median Time to Grant: 3y 4m
PTA Risk: High
Based on 50 resolved cases by this examiner. Grant probability derived from career allow rate.
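The projection arithmetic is easy to reproduce from the figures shown. The sketch below assumes, since the page does not state its methodology, that the grant probability is simply the career allow rate (29/50) and that the interview lift is an additive percentage-point adjustment; the function name is hypothetical.

```python
# Hypothetical reconstruction of the dashboard's projection arithmetic.
# Assumption (not stated on the page): interview lift is added to the
# career allow rate in percentage points, capped at 100%.

def grant_probability(granted, resolved, interview_lift_pp=0.0):
    """Career allow rate in percent, optionally shifted by an interview lift."""
    base = 100.0 * granted / resolved
    return min(base + interview_lift_pp, 100.0)

base = grant_probability(29, 50)           # 58.0 -> "58% Grant Probability"
with_iv = grant_probability(29, 50, 28.3)  # 86.3 -> displayed rounded as "86%"
print(base, round(with_iv))  # 58.0 86
```

This reproduces the 58% and 86% figures shown above; if the tool instead models the lift multiplicatively or per-case, the displayed numbers would be derived differently.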
