Prosecution Insights
Last updated: April 19, 2026
Application No. 18/555,884

IMAGE PROCESSING DEVICE

Final Rejection (§103, §112)
Filed: Oct 18, 2023
Examiner: LU, ZHIYU
Art Unit: 2665
Tech Center: 2600 — Communications
Assignee: Fanuc Corporation
OA Round: 2 (Final)
Grant Probability: 49% (Moderate)
OA Rounds: 3-4
To Grant: 3y 8m
With Interview: 63%

Examiner Intelligence

Career Allow Rate: 49% (374 granted / 759 resolved; -12.7% vs TC avg)
Interview Lift: +13.9% for resolved cases with interview (moderate lift)
Avg Prosecution: 3y 8m typical timeline (57 currently pending)
Total Applications: 816 across all art units

Statute-Specific Performance

§101: 2.9% (-37.1% vs TC avg)
§103: 66.6% (+26.6% vs TC avg)
§102: 11.8% (-28.2% vs TC avg)
§112: 17.0% (-23.0% vs TC avg)
Tech Center averages are estimates • Based on career data from 759 resolved cases

Office Action

Rejections: §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Election/Restrictions

Applicant's election without traverse of claims 1-3 and 5 in the reply filed on 10/30/2025 is acknowledged.

Claim Interpretation

This application includes one or more claim limitations that do not use the word "means," but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitations use a generic placeholder coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: model storage unit, characteristic point extraction unit, original matching degree calculation unit, target object detection unit, parameter setting unit, detection information storage unit, and simple matching degree calculation unit in claim 1.

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to avoid interpretation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function so as to avoid interpretation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Response to Arguments

Applicant's arguments filed 02/04/2026 have been fully considered, but they are not persuasive.

Regarding the 112 rejection, applicant argued that claim 1 should be understood to mean that the claimed image processing device handles the detection threshold as the only changeable detection parameter, without defining any other detection parameter, and without contradicting the description in the specification. However, the examiner respectfully disagrees. In the argued claim, detection is based on a comparison between a matching degree and a detection threshold, where the detection threshold is a detection parameter. But when the detection parameter changes, the matching degree also changes. This means the matching degree changes as the detection threshold changes, which makes the comparison indefinite. At least on the current claim language, the claimed "comparison" is circular because the "matching degree" is dependent on the "detection threshold." Thus, the rejection is proper and is maintained.

Regarding the 103 rejection of claim 1, applicant argued that Mai appears to describe only storing programs and settings (tuples), not storing detection results (pixel positions and feature vectors). Applicant further argued that there is no evidence to believe that "recalculating the matching degree based on the stored detection information after a parameter change is a predictable and routine step." However, the examiner respectfully disagrees. Despite applicant's argument, nothing in the claim limits the storing of detection results to being permanent or temporary.
In the argued claim, "detection information including at least a position of the characteristic point with respect to the characteristic point of the target object detected" is recited. Fig. 8 of Mai shows both the distinctiveness of attributes of the object of interest (410) and the relative orientation of the candidate object (425), where the distinctiveness of attributes refers to pixel location (Fig. 11, paragraph 0057) and the relative orientation refers to bearing angle (paragraphs 0135, 0141). Under the broadest reasonable interpretation, either one would be considered a position of the characteristic point with respect to the characteristic point of the target object detected. Thus, the rejections are proper and are maintained.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-3 and 5 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

In claim 1, applicant claims "… a target object detection unit that detects the target object in the captured image based on a comparison between the matching degree and a detection threshold; a parameter setting unit that sets a detection parameter including at least the detection threshold … and a simple matching degree calculation unit that calculates, when the detection parameter has been changed, the matching degree based on the detection parameter changed and the detection information stored in the detection information storage unit." The claim is indefinite because it makes the "matching degree" depend on the "detection threshold" (i.e., a detection parameter), while detection is based on a comparison between the "matching degree" and the "detection threshold." In the filed specification, "the detection threshold" is only one of a plurality of "detection parameters." It would make more sense for the "matching degree" to be based on a "detection parameter" other than the "detection threshold." For examination purposes, the broadest reasonable interpretation is taken.
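To make the disputed circularity concrete, here is a minimal Python sketch of the two competing readings. Every name, score, and the toy matching-degree formula below is invented for this illustration; nothing is taken from the application, the specification, or Mai.

def matching_degree(stored_scores, params):
    # Toy matching degree: the fraction of stored characteristic-point
    # scores that survive the current detection parameters.
    cutoff = params.get("threshold", 0.0)
    kept = [s for s in stored_scores if s >= cutoff]
    return len(kept) / len(stored_scores)

stored = [0.2, 0.6, 0.7, 0.9]   # scores kept by the detection information storage unit
threshold = 0.5

# Examiner's reading: the recalculated degree is itself a function of the
# threshold, so the comparison below shifts whenever the threshold moves.
degree_circular = matching_degree(stored, {"threshold": threshold})
print(degree_circular >= threshold)   # True here, but the test is circular

# Applicant's asserted reading: the degree is recomputed from stored
# detection information only; the threshold merely gates the comparison.
degree_fixed = matching_degree(stored, {})
print(degree_fixed >= threshold)

On the first reading, raising the threshold changes both sides of the comparison at once, which is the asserted source of indefiniteness.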
Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1 and 5 are rejected under 35 U.S.C. 103 as being unpatentable over Mai et al. (US2018/0075300).

To claim 1, Mai teaches an image processing device that detects a target object in a captured image of an imaging device (Fig. 3), the image processing device comprising:

a model storage unit that stores a model pattern (paragraph 0113, one or more template images of each attribute class);

a characteristic point extraction unit that extracts a characteristic point from the captured image (paragraphs 0057, 0059, 0113, 0130, features extracted from interest points on the candidate objects);

an original matching degree calculation unit that calculates a matching degree between the model pattern and an arrangement of the characteristic point (paragraph 0060, "detectability" of an attribute describes the degree of certainty with which the attribute can be detected in an image of a candidate object; paragraph 0113, an attribute is classified by computing a matching score between features of the detected candidate and one or more template images of each attribute class);

a target object detection unit that detects the target object in the captured image based on a comparison between the matching degree and a detection threshold (450 of Fig. 4; paragraph 0103, the object of interest is described by a predetermined plurality of attributes; paragraphs 0116-0118);

a parameter setting unit that sets a detection parameter including at least the detection threshold (Fig. 9; paragraph 0150, manually pre-defined probability thresholds for testing the identity of the candidate);

a detection information storage unit that stores detection information including at least a position of the characteristic point with respect to the characteristic point of the target object detected by the target object detection unit (Fig. 5; paragraphs 0057, 0111, pixel location; paragraphs 0119-0123, candidate object location; paragraph 0129, vector comprises position); and

a simple matching degree calculation unit that calculates, when the detection parameter has been changed, the matching degree based on the detection parameter changed and the detection information stored in the detection information storage unit (paragraphs 0060, 0113, "detectability" of an attribute describes the degree of certainty with which the attribute can be detected in an image of a candidate object; paragraph 0115, using the new camera settings to update the confidence that the candidate object is the object of interest; paragraph 0130, the attribute classifier is updated online while executing the method, for example based on feedback from a user about whether the object of interest has been correctly identified; paragraph 0135, the detectability of each attribute is updated online during execution of the method, in one example based on feedback from a user about whether the object of interest has been correctly identified; the matching degree would obviously change based on changed attributes, e.g., lighting conditions, and stored detection information).

Mai explicitly teaches updating detectability/confidence and online updating of attribute classifiers in response to new camera settings or user feedback (paragraphs 0115, 0130, 0135). A person of ordinary skill would have been motivated to change detection parameters (for example, detection thresholds or camera settings) in view of Mai's disclosure that detectability/confidence is updated to reflect changed imaging conditions, because adjusting such parameters is a routine and predictable optimization to maintain detection accuracy when imaging conditions (e.g., lighting or camera settings) change.

Recalculating the matching degree based on stored detection information after a parameter change is a predictable and routine step: Mai teaches both (a) storage of detection information (e.g., pixel locations and candidate vectors) and (b) updating of detectability/confidence based on new parameters or feedback, thus providing the necessary teachings to perform a recalculation (Fig. 8; paragraphs 0094-0095). Therefore, it would have been obvious to a person of ordinary skill to recalculate the matching degree using the stored detection information upon changing detection parameters to maintain or improve detection performance.
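As a concrete picture of the claim 1 flow that the rejection characterizes as routine, the following minimal Python sketch detects once, persists per-point detection information, and then recomputes the matching degree from storage alone after a parameter change. All class and function names are hypothetical and are not drawn from the application or from Mai.

from dataclasses import dataclass, field

@dataclass
class DetectionRecord:
    position: tuple          # characteristic point location (pixels)
    score: float             # per-point match quality

@dataclass
class DetectionStore:
    records: list = field(default_factory=list)

def full_detection(image_points, extraction_threshold):
    # "Original" path: extract characteristic points from the captured
    # image and store their positions and scores.
    store = DetectionStore()
    for pos, score in image_points:
        if score >= extraction_threshold:
            store.records.append(DetectionRecord(pos, score))
    return store

def simple_matching_degree(store, extraction_threshold):
    # "Simple" path: recompute from stored records only; the captured
    # image is never re-processed after the parameter change.
    if not store.records:
        return 0.0
    kept = sum(1 for r in store.records if r.score >= extraction_threshold)
    return kept / len(store.records)

points = [((10, 12), 0.4), ((40, 33), 0.8), ((55, 70), 0.9)]
store = full_detection(points, extraction_threshold=0.3)
print(simple_matching_degree(store, extraction_threshold=0.3))    # 1.0
print(simple_matching_degree(store, extraction_threshold=0.85))   # 0.33...

The design point in dispute is exactly this separation: once positions and scores are stored, a changed parameter requires only the cheap second pass, not a new extraction from the image.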
To claim 5, Mai teaches claim 1. Though Mai does not expressly disclose further comprising a user interface unit that displays the matching degree calculated by the simple matching degree calculation unit, displaying a matching result is a well-known practice in the art, and it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate such a display for preferential display; hence Official Notice is taken.

Claims 2-3 are rejected under 35 U.S.C. 103 as being unpatentable over Mai et al. (US2018/0075300) in view of Odashima et al. (US2019/0340456).

To claim 2, Mai teaches claim 1. Mai teaches wherein the characteristic point extraction unit extracts, as the characteristic point, a point in which an attribute value is equal to or more than an extraction threshold, the detection parameter to be set by the parameter setting unit includes the extraction threshold, and the detection information stored in the detection information storage unit includes the attribute value (paragraph 0113, an attribute is classified by applying a predetermined threshold to features extracted from a region of the detected candidate, wherein "equal to or more than" would be an obvious condition for the comparison). Odashima teaches extracting, as a characteristic point, a point in which an attribute value is equal to or more than an extraction threshold, where the detection parameter to be set by the parameter setting unit includes the extraction threshold (abstract; paragraphs 0005, 0046-0047, 0064-0068, 0097). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate this teaching into the apparatus of Mai in order to implement the comparison.

To claim 3, Mai and Odashima teach claim 2. Mai teaches wherein the attribute value includes at least any one selected from color, luminance, magnitude of a luminance gradient, and direction of the luminance gradient of the characteristic point (paragraph 0057, intensity gradient; paragraph 0059, body colour; paragraphs 0113, 0134-0135, lighting conditions).
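Claims 2-3 turn on thresholding an attribute value such as the magnitude of the luminance gradient. The sketch below shows one conventional way such an extraction can be done; the NumPy implementation is illustrative only and is not taken from Mai or Odashima.

import numpy as np

def extract_characteristic_points(luminance, extraction_threshold):
    # Attribute values per claim 3: magnitude and direction of the
    # luminance gradient at each pixel.
    gy, gx = np.gradient(luminance.astype(float))
    magnitude = np.hypot(gx, gy)
    direction = np.arctan2(gy, gx)
    # Keep a pixel as a characteristic point only when the attribute
    # value is equal to or more than the extraction threshold (claim 2),
    # and store the attribute values alongside the point.
    ys, xs = np.nonzero(magnitude >= extraction_threshold)
    return [((int(y), int(x)), float(magnitude[y, x]), float(direction[y, x]))
            for y, x in zip(ys, xs)]

image = np.zeros((8, 8))
image[:, 4:] = 255.0   # a vertical luminance edge
points = extract_characteristic_points(image, extraction_threshold=50.0)
print(len(points))      # only pixels on the strong edge survive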
Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZHIYU LU, whose telephone number is (571) 272-2837. The examiner can normally be reached weekdays, 8:30 AM - 5:00 PM.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Stephen R Koziol, can be reached at (408) 918-7630. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ZHIYU LU/
Primary Examiner, Art Unit 2665
February 15, 2026

Prosecution Timeline

Oct 18, 2023
Application Filed
Nov 11, 2025
Non-Final Rejection — §103, §112
Feb 04, 2026
Response Filed
Feb 15, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601695
METHOD FOR MEASURING THE DETECTION SENSITIVITY OF AN X-RAY DEVICE
2y 5m to grant • Granted Apr 14, 2026
Patent 12597268
METHOD AND DEVICE FOR DETERMINING LANE OF TRAVELING VEHICLE BY USING ARTIFICIAL NEURAL NETWORK, AND NAVIGATION DEVICE INCLUDING SAME
2y 5m to grant • Granted Apr 07, 2026
Patent 12596187
METHOD, APPARATUS, AND SYSTEM FOR WIRELESS SENSING MEASUREMENT AND REPORTING
2y 5m to grant • Granted Apr 07, 2026
Patent 12592052
INFORMATION PROCESSING DEVICE, AND INFORMATION PROCESSING METHOD
2y 5m to grant • Granted Mar 31, 2026
Patent 12581142
APPROACHES FOR COMPRESSING AND DISTRIBUTING IMAGE DATA
2y 5m to grant • Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

3-4
Expected OA Rounds
49%
Grant Probability
63%
With Interview (+13.9%)
3y 8m
Median Time to Grant
Moderate
PTA Risk
Based on 759 resolved cases by this examiner. Grant probability derived from career allow rate.
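For readers checking the arithmetic, the figures above are mutually consistent under a simple additive model. The snippet below is an assumption about how the dashboard derives them, not a documented method.

granted, resolved = 374, 759
allow_rate = granted / resolved          # 0.4927... -> reported as 49%
interview_lift = 0.139                   # +13.9% lift with an interview

# Assumed additive adjustment (hypothetical; the tool's actual model
# is not documented on this page).
with_interview = allow_rate + interview_lift

print(f"{allow_rate:.0%}")               # 49%
print(f"{with_interview:.0%}")           # 63%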
