Prosecution Insights
Last updated: April 19, 2026
Application No. 17/942,305

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM

Final Rejection §103
Filed: Sep 12, 2022
Examiner: ISLAM, MEHRAZUL NMN
Art Unit: 2662
Tech Center: 2600 — Communications
Assignee: Denso Ten Limited
OA Round: 4 (Final)
Grant Probability: 58% (Moderate)
Expected OA Rounds: 5-6
Time to Grant: 3y 4m
Grant Probability with Interview: 86%

Examiner Intelligence

Career Allow Rate: 58% (29 granted / 50 resolved; -4.0% vs TC avg)
Interview Lift: +28.3% (strong; measured on resolved cases with interview)
Avg Prosecution: 3y 4m (typical timeline)
Total Applications: 96 across all art units (46 currently pending)

Statute-Specific Performance

§101: 9.2% (-30.8% vs TC avg)
§102: 4.1% (-35.9% vs TC avg)
§103: 68.6% (+28.6% vs TC avg)
§112: 15.2% (-24.8% vs TC avg)
Based on career data from 50 resolved cases; Tech Center averages are estimates.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Applicant's response to the Non-final Office Action dated 09/18/2025, filed with the office on 11/24/2025, has been entered and made of record.

Status of Claims

Claims 1 and 3-10 are pending. Claims 1, 3, 4, 9 and 10 are amended. Claims 2 and 11-13 are cancelled.

Response to Arguments

Applicant's arguments filed on November 24, 2025 with respect to the rejection of claims under 35 U.S.C. 103 have been fully considered, but they are not found persuasive. Specifically, in page 7 of its reply, Applicant argues in the third paragraph that the Office Action relies upon impermissible hindsight to reach its conclusion of obviousness. In response to Applicant's argument, it must be recognized that any judgment on obviousness is in a sense necessarily a reconstruction based upon hindsight reasoning. But so long as it takes into account only knowledge which was within the level of ordinary skill at the time the claimed invention was made, and does not include knowledge gleaned only from the applicant's disclosure, such a reconstruction is proper. See In re McLaughlin, 443 F.2d 1392, 170 USPQ 209 (CCPA 1971).

Applicant further argues in page 8, second paragraph, that none of the prior art references discloses or renders obvious changing the shape of the ROI for the first and second calibration value calculations, let alone changing the shape from rectangular to trapezoidal. Examiner respectfully disagrees. The claim language does not recite changing a shape of the ROI. It instead recites two different regions of interest: a first ROI (rectangular) and a second ROI (trapezoidal). Additionally, a first calibration value is computed using optical flow of feature points in the first ROI, and a second calibration value is computed using optical flow of feature points in the second ROI.
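Editorially, the claim construction at issue above (two distinct ROIs, each feeding an optical-flow-based calibration value) can be pictured in code. This is a minimal numpy sketch, not the applicant's or any cited reference's implementation: the function names, the ROI geometry, and the use of a median flow angle as the "calibration value" are all hypothetical illustrations.

```python
import numpy as np

def rect_roi_mask(pts, x0, y0, x1, y1):
    """Boolean mask: which feature points fall inside a rectangular ROI."""
    x, y = pts[:, 0], pts[:, 1]
    return (x >= x0) & (x <= x1) & (y >= y0) & (y <= y1)

def trapezoid_roi_mask(pts, top_y, bot_y, top_x0, top_x1, bot_x0, bot_x1):
    """Boolean mask for a trapezoidal ROI (narrow at the top, wide at the
    bottom), roughly matching a road surface receding toward the horizon."""
    x, y = pts[:, 0], pts[:, 1]
    # Interpolate the left/right edges of the trapezoid at each point's row.
    t = np.clip((y - top_y) / (bot_y - top_y), 0.0, 1.0)
    left = top_x0 + t * (bot_x0 - top_x0)
    right = top_x1 + t * (bot_x1 - top_x1)
    return (y >= top_y) & (y <= bot_y) & (x >= left) & (x <= right)

def calibration_angle(flows, mask):
    """Toy 'calibration value': the median flow direction (radians) of the
    feature-point optical flows selected by the ROI mask."""
    v = flows[mask]
    return float(np.median(np.arctan2(v[:, 1], v[:, 0])))
```

Under this reading, the first calibration value would come from `calibration_angle(flows, rect_roi_mask(...))` and the second from the trapezoidal mask: two values from two regions, with no reshaping of a single ROI.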
The cited prior art reference Tanaka teaches a rectangular ROI (Tanaka, ¶0056: "marker recognizing unit 9 sets a portion of the image by the front camera 17 as a target detection area 43"; see also Fig. 5, where area 43 is rectangular). In an analogous field of endeavor, Oba discloses a trapezoidal region of interest in Figure 16. [Oba, Fig. 16 is reproduced in greyscale in the original action.] Additionally, Oba teaches an optical flow filter for computing the optical flow of feature points (Oba, ¶0147: "optical flow filter 126 detects the velocity at which a specific number of points on the object move between the time series image frames"). Therefore, it would have been obvious to a person skilled in the art to combine the known elements as described above and achieve the predictable result of tracking the flow of feature points on the road in both the rectangular ROI and the trapezoidal ROI to perform a calibration. Therefore, Applicant's arguments are not found persuasive. Consequently, THIS ACTION IS MADE FINAL.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term "means" or "step" or a term used as a substitute for "means" that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term "means" or "step" or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word "for" (e.g., "means for") or another linking word or phrase, such as "configured to" or "so that"; and
(C) the term "means" or "step" or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word "means" (or "step") in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word "means" (or "step") in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

This application includes one or more claim limitations that do not use the word "means," but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: "controller configured to estimate" and "controller is further configured to: execute" in claim 1; "controller is further configured to: extract" in claim 5; "controller determines" in claim 6; "controller determines" in claim 7; and "controller stops at least one frame detection function" and "controller has determined" in claim 8.

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structures described in the applicant's drawings (the algorithms (flow charts) depicted in Fig. 8) and the applicant's specification (¶0036: "The control unit 15 is a 'controller' and is implemented by, for example, a central processing unit (CPU) or a micro processing unit (MPU)") as performing the claimed functions, and equivalents thereof. If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid interpretation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid interpretation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 3-5, 9 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Tanaka (US 2019/0259180 A1), in view of Lin et al. (US 2019/0206078 A1), and in further view of Oba et al. (US 2018/0365859 A1).

Regarding claim 1, Tanaka teaches: An information processing apparatus comprising: a controller configured to (Tanaka, ¶0038: "calibration apparatus 1 is mainly configured by a known microcomputer that includes a central processing unit") estimate an attitude of a single onboard camera based on captured images that have been captured by the single onboard camera (Tanaka, ¶0003: "an attitude or the like of the onboard camera is calibrated through use of the acquired image"), wherein the controller is further configured to: in a case where the single onboard camera is mounted in a first state without a known calibration value (Tanaka, ¶0083: "the determined roll angle θroll and depression angle θpitch being applied to expression 2, the position of the front camera 17 relative to the imaged markers 39 and 41 is determined"; applicant's specification ¶0017: "calibration values (mounting position as well as pan, tilt, and roll) of the camera 11"; the position of the first camera is therefore interpreted to teach "first state without a known calibration value"), execute first attitude estimation processing (Tanaka, ¶0100: "the attitude, that is, the depression angle θpitch, the roll angle θroll, and the direction θyaw of each camera relative to the own vehicle 27 can be determined by the processes") that includes: setting a rectangular-shaped first region of interest in the captured images (Tanaka, ¶0056: "marker recognizing unit 9 sets a portion of the image by the front camera 17 as a target detection area 43"; see also Fig. 5, where area 43 is rectangular); and estimating the attitude of the single onboard camera (Tanaka, ¶0003: "attitude or the like of the onboard camera is calibrated through use of the acquired image").

However, Tanaka does not explicitly teach: calculating a first calibration value based on optical flows of feature points across frames of the captured images in the first region of interest; and in a case where the single onboard camera is mounted in a second state with the known calibration value that has been previously determined in the first attitude estimation processing, execute second attitude estimation processing that includes: setting, in the captured images, a trapezoidal-shaped second region of interest corresponding to a shape of a road surface appearing in the captured images by using a known calibration value; and estimating the attitude of the single onboard camera, and calculating a second calibration value, based on optical flows of feature points across frames of the captured images in the second region of interest.
In an analogous field of endeavor, Lin teaches: calculating a first calibration value (Lin, ¶0025: "in the process of tracking a pose of a camera… characteristic point method so that the pose of the camera can be calculated more accurately") based on optical flows of feature points across frames of the captured images in the first region of interest (Lin, ¶0006: "tracking based on a fixed marker is mostly triggered by an indirect method (characteristic point+descriptor), and is conducted by an algorithm such as an optical flow method"); and in a case where the single onboard camera is mounted in a second state (Lin, ¶0071: "tracking by the direct method, to output pose 2, which is recorded as a second pose") with the known calibration value that has been previously determined in the first attitude estimation processing, execute second attitude estimation processing (Lin, ¶0049: "the pose of the camera may be determined based on the first pose and the second pose… Kalman filtering is applied to the spatial coordinates (Tx, Ty, Tz) and the angles (roll, pitch, yaw) estimated from the previous and current frames") that includes: setting, in the captured images, a second region of interest (Lin, ¶0081: "unit 303 is configured to determine a region of interest based on the second pose") corresponding to a shape of a road surface appearing in the captured images by using a known calibration value (Lin, ¶0039: "marker placed in a real scene, including a planar object such as a poster, a display board, a book, a floor sticker, etc. Typically, a planar marker can be used as a positioning reference, such as a rectangular parallelepiped"); and estimating the attitude of the single onboard camera (Lin, ¶0039: "By recognizing the marker in the image and processing the image containing the marker, the pose of the camera is estimated"), and calculating a second calibration value (Lin, ¶0042: "the pose of the camera may be defined, for example, by six degrees of freedom, including coordinates (tx, ty, tz) in a spatial xyz coordinate system and angles (row, pitch, yaw) of the camera relative to respective axes of xyz"), based on optical flows of feature points across frames of the captured images in the second region of interest (Lin, ¶0006: "tracking based on a fixed marker is mostly triggered by an indirect method (characteristic point+descriptor), and is conducted by an algorithm such as an optical flow method").

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Tanaka using the teachings of Lin to introduce estimating camera pose using an optical flow method. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of computing different camera poses. Therefore, it would have been obvious to combine the analogous arts Tanaka and Lin to obtain the above-described limitations in claim 1.

However, the combination of Tanaka and Lin does not explicitly teach a trapezoidal-shaped region of interest. In another analogous field of endeavor, Oba teaches a trapezoidal-shaped region of interest (Oba, ¶0194: "the relevant process may be limited to a predictable road surface range such as a finite neighbor trapezoidal range"; see also Fig. 16: trapezoidal region of interest).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Tanaka in view of Lin using the teachings of Oba to introduce a trapezoidal-shaped region of interest. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of tracking feature points on the road in a specific region of interest to perform the calibration computation. Therefore, it would have been obvious to combine the analogous arts Tanaka, Lin and Oba to obtain the invention in claim 1.

Regarding claim 3, Tanaka, in view of Lin, and in further view of Oba teaches: The information processing apparatus according to claim 1, wherein the second region of interest has a shape according to a shape of the road surface that appears to converge toward a vanishing point in the captured images (Oba, ¶0181: "parallel lines on the right and left sides of the vehicle are captured as the right and left sides of a trapezoid. In a case where the infinite far point is included in the screen, the intersection point is the so-called infinite point"; and ¶0199: "infinite point (vanishing point)"). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Tanaka in view of Lin, and in further view of Oba, using the additional teachings of Oba to introduce a vanishing point. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of accounting for a vanishing point in the calibration calculation for higher accuracy. Therefore, it would have been obvious to combine the analogous arts Tanaka, Lin and Oba to obtain the invention in claim 3.
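As an editorial aside, the "converge toward a vanishing point" geometry cited from Oba reduces to simple line intersection: two parallel road edges in the world meet at one image point. The sketch below is a hedged illustration with hypothetical pinhole-camera parameters, not any reference's actual method.

```python
import numpy as np

def vanishing_point(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4, i.e. where two parallel
    world lines (such as the left and right road edges) meet in the image."""
    p1, p2, p3, p4 = (np.asarray(p, dtype=float) for p in (p1, p2, p3, p4))
    d1, d2 = p2 - p1, p4 - p3
    # Solve p1 + a*d1 == p3 + b*d2 for (a, b).
    a, _ = np.linalg.solve(np.column_stack([d1, -d2]), p3 - p1)
    return p1 + a * d1

def pan_tilt_from_vp(vp, cx, cy, fx, fy):
    """Toy pan/tilt estimate (radians) from the vanishing point's offset
    relative to the principal point (cx, cy) of an ideal pinhole camera
    with focal lengths (fx, fy)."""
    return float(np.arctan2(vp[0] - cx, fx)), float(np.arctan2(vp[1] - cy, fy))
```

For a forward camera with zero pan and tilt, the road's vanishing point sits at the principal point; any offset maps to small rotation angles, which is the intuition behind using the road trapezoid for calibration.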
Regarding claim 4, Tanaka, in view of Lin, and in further view of Oba teaches: The information processing apparatus according to claim 1, wherein the second attitude estimation processing further includes: estimating the attitude of the single onboard camera, and calculating the second calibration value based on optical flows of feature points across frames of the captured images in a superimposed portion (Oba, ¶0191: "a central projection image including the rectangular shape of the road surface, for example, the parallel lines on the right and left sides of the vehicle are captured as the right and left sides of a trapezoid") where the first region of interest and the second region of interest overlap (Oba, Fig. 6 shows the regions of interest overlapping).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Tanaka in view of Lin, and in further view of Oba, using the additional teachings of Oba to introduce overlapping regions of interest. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of tracking the overlapping region to detect expected markings for camera calibration. Therefore, it would have been obvious to combine the analogous arts Tanaka, Lin and Oba to obtain the invention in claim 4.
Regarding claim 5, Tanaka, in view of Lin, and in further view of Oba teaches: The information processing apparatus according to claim 1, wherein the controller is further configured to: extract, from each of the first region of interest and the second region of interest, a group of line segment pairs based on the optical flows (Tanaka, ¶0046: "A pair of markers 39 and 41 is provided on a floor surface"; and ¶0007: "a first feature portion in the marker moves to a position overlapping a second feature portion in the marker in the image"); and estimate, as the first calibration value and the second calibration value, rotation angles of pan, tilt, and roll axes of the single onboard camera based on each of the line segment pairs (Tanaka, ¶0100: "the depression angle θpitch, the roll angle θroll, and the direction θyaw of each camera relative to the own vehicle 27 can be determined by the processes"; see also Figs. 9-12).

Regarding claim 9, it recites a method with steps corresponding to the elements of the apparatus recited in claim 1. Therefore, the recited steps in method claim 9 are mapped to the proposed combination in the same manner as the corresponding elements in apparatus claim 1. Additionally, the rationale and motivation to combine Tanaka, Lin and Oba presented in the rejection of claim 1 apply to this claim. Additionally, Tanaka teaches: An information processing method performed by an information processing apparatus having a controller (Tanaka, ¶0038: "the calibration apparatus 1 is mainly configured by a known microcomputer that includes a central processing unit (CPU)").

Regarding claim 10, it recites a computer-readable recording medium storing a program that executes a process corresponding to the elements of the apparatus recited in claim 1. Therefore, the recited steps of the program process in the computer-readable medium of claim 10 are mapped to the proposed combination in the same manner as the corresponding elements of apparatus claim 1.
Additionally, the rationale and motivation to combine Tanaka, Lin and Oba presented in the rejection of claim 1 apply to this claim. Additionally, Tanaka teaches: A non-transitory computer-readable recording medium having stored therein a program that causes a computer to execute a process (Tanaka, ¶0038: "Various functions of the calibration apparatus 1 are actualized by the CPU 3 running a program that is stored in a non-transitory computer-readable storage medium").

Claims 6 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Tanaka (US 2019/0259180 A1), in view of Lin et al. (US 2019/0206078 A1), in further view of Oba et al. (US 2018/0365859 A1), and still in further view of Hosono et al. (US 2022/0277592 A1).

Regarding claim 6, Tanaka, in view of Lin, and in further view of Oba teaches: The information processing apparatus according to claim 5, wherein the controller determines angle estimates for the pan, tilt, and roll axes. However, the combination of Tanaka, Lin and Oba does not explicitly teach: based on a median value after making a histogram of each of the rotation angles that have been estimated. In an analogous field of endeavor, Hosono teaches: based on a median value (Hosono, ¶0053: "the median thereof is defined as the action direction of this frame image") after making a histogram of each of the rotation angles that have been estimated (Hosono, ¶0053: "a movement direction histogram is generated based on the angles of the motion vectors of the optical flow that are included in the object region of each frame image"). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Tanaka in view of Lin and in further view of Oba using the teachings of Hosono to introduce a histogram of angles.
A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of estimating an angle with better accuracy. Therefore, it would have been obvious to combine the analogous arts Tanaka, Lin, Oba and Hosono to obtain the invention in claim 6.

Regarding claim 7, Tanaka, in view of Lin, in further view of Oba, and still in further view of Hosono teaches: The information processing apparatus according to claim 6, wherein the controller determines axis misalignment of the single onboard camera based on the angle estimates that have been determined (Oba, ¶0011: "a field angle mismatch amount that is the correction amount of the first camera as a shift angle between a camera optical axis and a vehicle running direction"). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Tanaka in view of Lin, in further view of Oba, and still in further view of Hosono using the additional teachings of Oba to introduce an angle calculation. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of estimating an axis misalignment based on the angle measurement. Therefore, it would have been obvious to combine the analogous arts Tanaka, Lin, Oba and Hosono to obtain the invention in claim 7.

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Tanaka (US 2019/0259180 A1), in view of Lin et al. (US 2019/0206078 A1), in further view of Oba et al. (US 2018/0365859 A1), still in further view of Hosono et al. (US 2022/0277592 A1), and yet in further view of Nakagawa (US 2021/0284137 A1).

Regarding claim 8, Tanaka, in view of Lin, in further view of Oba, and still in further view of Hosono teaches: The information processing apparatus according to claim 7.
However, the combination of Tanaka, Lin, Oba and Hosono does not explicitly teach: wherein the controller stops at least one of a parking frame detection function and an automatic parking function when the controller has determined that the axis misalignment exists. In an analogous field of endeavor, Nakagawa teaches: wherein the controller stops at least one of a parking frame detection function and an automatic parking function when the controller has determined that the axis misalignment exists (Nakagawa, ¶0023: "a signal for stopping the automatic parking operation to the automatic drive controller 330 when determining that the error detector 340 detects an error or malfunction of the automatic parking operation"; the error/malfunction in the system is interpreted to include axis misalignment). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Tanaka in view of Lin, in further view of Oba, and still in further view of Hosono using the teachings of Nakagawa to introduce termination of an operation. A person skilled in the art would be motivated to combine the known elements as described above and achieve the predictable result of ending a parking operation when the sensor view is misaligned to avoid a potential collision. Therefore, it would have been obvious to combine the analogous arts Tanaka, Lin, Oba, Hosono and Nakagawa to obtain the invention in claim 8.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MEHRAZUL ISLAM, whose telephone number is (571) 270-0489. The examiner can normally be reached Monday-Friday, 8am-5pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Amandeep Saini, can be reached at (571) 272-3382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MEHRAZUL ISLAM/
Examiner, Art Unit 2662

/AMANDEEP SAINI/
Supervisory Patent Examiner, Art Unit 2662
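For readers tracing the claim 6 dispute (determining angle estimates "based on a median value after making a histogram" of the estimated rotation angles), here is one plausible reading of that kind of robust estimator, sketched editorially. The function name, bin width, and sample numbers are hypothetical; this is not Hosono's actual algorithm.

```python
import numpy as np

def robust_angle(estimates_deg, bin_width=1.0):
    """Histogram the per-frame rotation-angle estimates, pick the most
    populated bin, and return the median of the samples in that bin.
    Outlier frames fall into sparse bins and are therefore ignored."""
    a = np.asarray(estimates_deg, dtype=float)
    # Pad the top edge so the maximum sample always lands inside a bin.
    edges = np.arange(a.min(), a.max() + 2 * bin_width, bin_width)
    counts, edges = np.histogram(a, bins=edges)
    k = int(np.argmax(counts))
    in_bin = (a >= edges[k]) & (a < edges[k + 1])
    return float(np.median(a[in_bin]))
```

Applied per axis (pan, tilt, roll), such an estimator yields one stable angle per axis from many noisy per-frame estimates, which is the accuracy rationale the rejection attributes to the combination.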

Prosecution Timeline

Sep 12, 2022: Application Filed
Aug 03, 2023: Response after Non-Final Action
Jan 31, 2025: Non-Final Rejection — §103
May 05, 2025: Response Filed
May 17, 2025: Final Rejection — §103
Jul 23, 2025: Interview Requested
Jul 31, 2025: Examiner Interview Summary
Jul 31, 2025: Applicant Interview (Telephonic)
Aug 20, 2025: Request for Continued Examination
Aug 25, 2025: Response after Non-Final Action
Sep 15, 2025: Non-Final Rejection — §103
Nov 24, 2025: Response Filed
Feb 06, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602808: METHOD FOR INSPECTING AN OBJECT (granted Apr 14, 2026; 2y 5m to grant)
Patent 12592075: REMOTE SENSING FOR INTELLIGENT VEGETATION TRIM PREDICTION (granted Mar 31, 2026; 2y 5m to grant)
Patent 12579695: Method of Generating Target Image Data, Electrical Device and Non-Transitory Computer Readable Medium (granted Mar 17, 2026; 2y 5m to grant)
Patent 12524900: METHOD FOR IMPROVING ESTIMATION OF LEAF AREA INDEX IN EARLY GROWTH STAGE OF WHEAT BASED ON RED-EDGE BAND OF SENTINEL-2 SATELLITE IMAGE (granted Jan 13, 2026; 2y 5m to grant)
Patent 12489964: PATH PLANNING (granted Dec 02, 2025; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 58% (86% with interview, +28.3%)
Median Time to Grant: 3y 4m
PTA Risk: High

Based on 50 resolved cases by this examiner. Grant probability is derived from the career allow rate.
