Prosecution Insights
Last updated: April 19, 2026
Application No. 18/579,388

MONITORING DEFINED OPTICAL PATTERNS BY MEANS OF OBJECT DETECTION AND MACHINE LEARNING

Status: Non-Final OA (§103)
Filed: Jan 15, 2024
Examiner: LU, ZHIYU
Art Unit: 2665
Tech Center: 2600 — Communications
Assignee: Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V.
OA Round: 1 (Non-Final)
Grant Probability: 49% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 8m
With Interview: 63%

Examiner Intelligence

Career Allow Rate: 49% (374 granted of 759 resolved cases; -12.7% vs TC avg)
Interview Lift: +13.9% for resolved cases with interview (moderate, roughly +14%)
Avg Prosecution: 3y 8m typical timeline; 57 applications currently pending
Career History: 816 total applications across all art units
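The headline figures above follow directly from the raw counts shown. Here is a minimal sketch (Python), assuming "Grant Probability" is simply the examiner's career allow rate and the "With Interview" figure adds the stated lift on top; both assumptions are inferred from the dashboard labels, not stated by the tool:

```python
# Reproduce the dashboard's headline figures from the raw counts shown above.
# Assumption: grant probability = career allow rate (granted / resolved),
# and the with-interview figure adds the reported +13.9 point lift.

granted = 374
resolved = 759
interview_lift_pts = 13.9  # percentage points, as reported above

allow_rate_pct = 100 * granted / resolved          # career allow rate
with_interview_pct = allow_rate_pct + interview_lift_pts

print(round(allow_rate_pct))       # 49
print(round(with_interview_pct))   # 63
```

Both rounded values match the dashboard (49% and 63%), which supports reading "Grant Probability" as the unadjusted career allow rate.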

Statute-Specific Performance

§101: 2.9% (-37.1% vs TC avg)
§103: 66.6% (+26.6% vs TC avg)
§102: 11.8% (-28.2% vs TC avg)
§112: 17.0% (-23.0% vs TC avg)

Tech Center averages are estimates. Based on career data from 759 resolved cases.
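The per-statute deltas are internally consistent with a single Tech Center baseline. A quick sketch (Python); interpreting each delta as the examiner's rate minus the TC average is an assumption based on the "vs TC avg" labels:

```python
# Recover the implied Tech Center average behind each statute's delta.
# Assumption: delta = examiner's statute rate - TC average, in percentage points.

stats = {
    "101": (2.9, -37.1),
    "103": (66.6, +26.6),
    "102": (11.8, -28.2),
    "112": (17.0, -23.0),
}

implied_tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(implied_tc_avg)  # every statute resolves to the same 40.0 baseline
```

All four statutes imply the same 40.0% baseline, suggesting the tool plots one shared Tech Center average estimate rather than a per-statute average.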

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1-15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Bartschat et al. (US 2020/0191122) in view of Glaser et al. (US 2018/0014382).
To claim 1, Bartschat teaches a system for monitoring technical installations (paragraphs 0009, 0014), comprising: symbols (1) that are provided on parts of technical installations to be monitored in a surrounding area (paragraphs 0018-0019, 0022-0023, position markers); at least one camera (2) that acquires image data of the surrounding area and applies spatial coordinates and a recording point in time thereto (paragraphs 0026, 0028: the times of the acquisition of optical images can thus be coordinated with specific operating parameters of the installation and/or with the occurrence of specific framework conditions; as a result of such a coordination in the capturing of images, images and parameters that are currently captured can be compared with reference data recorded under similar framework conditions); an image database (4) in which the image data are archived (paragraph 0016, images and parameters detected in previous measurements are stored in a storage device and are retained for comparison); a symbol library (5) in which a plurality of symbols (1) and rules assigned thereto are stored (paragraphs 0011-0013); and an object recognition unit (3), which is designed to recognize symbols (1) in the image data and compare these to the symbols (1) stored in the symbol library (4), wherein a spatial coordinate is assigned to a symbol (1) when the symbol (1) is recognized in the image data (Figs. 3-6; object recognition would be an obvious implementation in identifying markers and respective positions for comparison with stored reference), a comparison to earlier image data of the surrounding area is carried out (paragraph 0028), and an alarm is triggered when a rule that is assigned to the recognized symbol (1) is not adhered to (paragraph 0009, generating an error signal relating to the connection point reproduced in an image as soon as the deviations of the image from a reference image or of a parameter from a reference parameter exceed a specified threshold during imaging).

In further support of said obviousness, Glaser teaches a system using an imaging device to monitor positions/states/stages of devices in an environment (Figs. 1-8; paragraphs 0059-0106), wherein a device captured in an image is identified, features are extracted to recognize the operating state of said device, and rules are applied to the recognized operating state (Figs. 33-35; paragraphs 0195-0203), wherein the feature extraction for comparison includes metadata such as location, position, time, etc. (paragraphs 0152, 0158, 0160, 0203). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Glaser into the system of Bartschat, in order to implement object recognition and analysis.

To claim 10, Bartschat and Glaser teach a method for monitoring technical installations (as explained in response to claim 1 above, wherein both Bartschat and Glaser teach saving captured characters/symbols in earlier captured images for referencing).

To claim 2, Bartschat and Glaser teach claim 1. Bartschat and Glaser teach characterized in that the at least one camera (2) is fixedly installed (paragraph 0022, imaging device can be fixedly mounted).

To claim 3, Bartschat and Glaser teach claim 1.
Bartschat and Glaser teach characterized in that the at least one camera (2) is mobile (a mobile monitoring camera is well-known in the art and would have been obvious; hence Official Notice is taken).

To claims 4 and 11, Bartschat and Glaser teach claims 1 and 10. Bartschat and Glaser teach characterized in that the at least one camera (2) comprises a device for position determination (6), a device for determining the recording angle (7), and a device for distance measurement (8) so as to determine the spatial coordinates of the image data (Glaser, paragraphs 0072, 0087, 0113, 0115, 0139).

To claim 5, Bartschat and Glaser teach claim 4. Bartschat and Glaser teach characterized in that the distance measurement (8) is carried out by a laser range finder or by setting a focus of the at least one camera (Glaser, paragraph 0139, distance estimation; measuring distance by a laser range finder or by setting a focus of the camera is a well-known technique in the art, which would have been obvious to one of ordinary skill in the art to incorporate for distance estimation; hence Official Notice is taken).

To claims 6 and 13, Bartschat and Glaser teach claims 1 and 10. Bartschat and Glaser teach characterized in that the symbols (1) can be distinguished well from the surrounding area as a result of the coloring and reflective properties thereof (Bartschat, paragraphs 0019, 0049, shape markers, colour markers, fluorescent).

To claims 7 and 14, Bartschat and Glaser teach claims 1 and 10. Bartschat and Glaser teach characterized in that the symbol library assigns rules to symbols (1) which relate a plurality of symbols (1) to one another (Glaser, paragraphs 0089-0093, 0196, 0203, association).

To claims 8 and 15, Bartschat and Glaser teach claims 1 and 10. Bartschat and Glaser teach characterized in that the object recognition unit (3) utilizes a machine learning-based model for recognizing the symbols (1) in the image data (Glaser, paragraph 0158).

To claim 9, Bartschat and Glaser teach claim 1. Bartschat and Glaser teach characterized in that the image database (4), the symbol library (5) and the object recognition unit (3) are parts of a processor (9) that is connected via a network to the at least one camera (2) (Glaser, paragraphs 0037-0039, 0120, 0186).

To claim 12, Bartschat and Glaser teach claim 10. Bartschat and Glaser teach characterized in that the step of preparing the image data (S2) additionally comprises a preprocessing and a filtering of the image data (Glaser, paragraph 0190, filters or other image transformations may additionally be performed; preprocessing raw image data into a usable, consistent format to enhance computer vision model performance would have been obvious; hence Official Notice is taken).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZHIYU LU whose telephone number is (571) 272-2837. The examiner can normally be reached weekdays, 8:30 AM - 5:00 PM.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Stephen R Koziol, can be reached at (408) 918-7630. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ZHIYU LU/
Primary Examiner, Art Unit 2665

February 7, 2026

Prosecution Timeline

Jan 15, 2024
Application Filed
Feb 07, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601695: METHOD FOR MEASURING THE DETECTION SENSITIVITY OF AN X-RAY DEVICE
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12597268: METHOD AND DEVICE FOR DETERMINING LANE OF TRAVELING VEHICLE BY USING ARTIFICIAL NEURAL NETWORK, AND NAVIGATION DEVICE INCLUDING SAME
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12596187: METHOD, APPARATUS, AND SYSTEM FOR WIRELESS SENSING MEASUREMENT AND REPORTING
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12592052: INFORMATION PROCESSING DEVICE, AND INFORMATION PROCESSING METHOD
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12581142: APPROACHES FOR COMPRESSING AND DISTRIBUTING IMAGE DATA
Granted Mar 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 49%
With Interview: 63% (+13.9%)
Median Time to Grant: 3y 8m
PTA Risk: Low

Based on 759 resolved cases by this examiner. Grant probability derived from career allow rate.
