Prosecution Insights
Last updated: April 19, 2026
Application No. 18/266,778

METHOD AND APPARATUS FOR MANAGING CAMERA SYSTEM

Final Rejection §103
Filed: Jun 12, 2023
Examiner: JAMES, DOMINIQUE NICOLE
Art Unit: 2666
Tech Center: 2600 — Communications
Assignee: ABB Schweiz AG
OA Round: 2 (Final)
Grant Probability: 76% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 4m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 76% (16 granted / 21 resolved), +14.2% vs TC avg (above average)
Interview Lift: +38.5% across resolved cases with an interview (strong)
Typical Timeline: 3y 4m average prosecution
Career History: 48 total applications across all art units, 27 currently pending

Statute-Specific Performance

§101: 19.5% (-20.5% vs TC avg)
§103: 51.5% (+11.5% vs TC avg)
§102: 14.6% (-25.4% vs TC avg)
§112: 14.3% (-25.7% vs TC avg)
Tech Center averages are estimates. Based on career data from 21 resolved cases.
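The per-statute deltas above are all consistent with a single Tech Center baseline, which a quick check recovers (the dictionary layout is just for this sketch):

```python
# Examiner's per-statute rate and reported delta vs the Tech Center average,
# taken from the table above; implied TC baseline = rate - delta.
rates = {"101": (19.5, -20.5), "103": (51.5, +11.5),
         "102": (14.6, -25.4), "112": (14.3, -25.7)}
for statute, (rate, delta) in rates.items():
    print(f"§{statute}: implied TC avg = {rate - delta:.1f}%")  # 40.0% for each
```

Every statute implies the same 40.0% baseline, so the "vs TC avg" figures appear to be computed against one common Tech Center estimate rather than per-statute averages.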

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Status

This action is in response to the application filed on December 11, 2025. Claims 10-12, 15-18, and 20 are amended. Therefore, claims 1-20 are pending for examination in this application.

Priority

Receipt is acknowledged that this application is a National Stage application of PCT/CN2020/140856. Priority to PCT/CN2020/140856, with a priority date of December 29, 2020, is acknowledged under 35 USC 119(e) and 37 CFR 1.78.

Response to Amendment

Applicant's remarks and amendments filed December 11, 2025 have been entered.

Applicant's amendments regarding the 35 U.S.C. 112(f) interpretations previously set forth in the Non-Final Office Action mailed October 01, 2025 are persuasive. Accordingly, the 35 U.S.C. 112(f) interpretations are withdrawn.

Applicant's amendments regarding the 35 U.S.C. 112(a) and 112(b) rejections previously set forth in the Non-Final Office Action mailed October 01, 2025 are persuasive. Accordingly, the 35 U.S.C. 112(a) and 112(b) rejections are withdrawn.

Applicant's amendments regarding the 35 U.S.C. 101 rejections previously set forth in the Non-Final Office Action mailed October 01, 2025 are persuasive. Accordingly, the 35 U.S.C. 101 rejections are withdrawn.

Response to Arguments

Argument: On page 8, the applicant alleges that the "claim is not taught by any of the asserted combinations."

Response: The examiner respectfully disagrees. Wang was relied on to teach a third position after a movement of the first and second objects (see Wang, Fig. 10, Col 16, Lines 15-33, "The calibration object physical feature positions F.sub.frusta,feature are characterized by 3D points (x,y,z) for each feature. … For example, the first vertex position is (0,0,0). The second vertex position is (x1,0,0) and the third vertex position is (x2,y2,0)," and Col 6, Lines 45-50, "FIGS. 29-31 are diagrams of the encoderless motion conveyance and associated sensor of FIG. 28, shown at a start position, moving between a start position and a stop position, and at a stop position, respectively"). The third vertex position is the calibration object physical feature position, and the calibration object (composed of at least three sub-objects) moves with respect to the cameras from a start position, through an intermediate position, to a stop position; this is considered to be a third position after a movement of the first and second objects.

Ye was relied on to teach a fourth position for the first and second objects from the first and second cameras, respectively (see Ye, Col 3, Lines 13-19, "The system also retains and stores feature correspondences with respect to the calibration object imaged in a plurality of positions. During validation, the same or a substantially similar calibration object is imaged at another position"). A plurality of positions with respect to the calibration object is considered to contain a fourth position.

Thus it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Ye into Wang: utilizing Ye's storing of feature correspondences of the calibration object in a plurality of positions, in the system of Wang for calibrating one or more 3D sensors on a moving manipulator, keeps track of the calibration object and its sub-objects after movement so that their physical feature positions can be determined.
Thus it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Ye into Wang, because determining a position of a calibration object based on a movement of the calibration object provides a more straightforward diagnosis of vision system failures related to deteriorating camera calibration.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 5, 7, 10-11, 14, 16, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. (US 10812778) in view of Ye et al. (US 11699247) in further view of Liu et al. (US 10664994).

Regarding claim 1, Wang teaches a method for managing a camera system, the camera system comprising at least a first camera (see Wang, Fig. 1, 110) and a second camera (see Wang, Fig. 1, 112), the method comprising: obtaining a first position (see Wang, Fig. 10, Col 16, Lines 29-34, "the first vertex position is (0,0,0)") and a second position (see Wang, Fig. 10, Col 16, Lines 29-34, "second vertex position is (x1,0,0)") for a first object (see Wang, Fig. 1, 150) and a second object (see Wang, Fig. 1, 152) from the first and second cameras, respectively (see Wang, Col 7, Lines 43-47, "exemplary laser displacement sensors 110, 112, 114 and 116 of the arrangement 100 consist of an image sensor (or imager) S"; displacement is considered to be obtaining a first and second position), the first and second objects being used for calibrating the camera system (see Wang, Fig. 1 and Col 8, Lines 13-19, "The object 120 shown in FIG. 1 is a stable object (also generally termed herein as a “calibration object”) consisting of a plurality of individual, spaced apart frustum assemblies (also termed calibration “subobjects”) 150, 152, 154 and 156 that each define a discrete “feature set”"); and obtaining, after a movement of the first and second objects, a third position (see Wang, Fig. 10, Col 16, Lines 29-34, "the third vertex position is (x2,y2,0)"; see also Wang, Col 20, Lines 43-50, "when multiple subobjects are measured by the same sensor within one scan, each pair of relative positions are computed (up to a linear transform due to the sensing modality), and one set of pairs of relative positions are boot-strapped to induce an estimate of all of the subobject feature positions").

Wang does not expressly teach a fourth position for the first and second objects from the first and second cameras, respectively. However, Ye, in a similar invention in the same field of endeavor, teaches a fourth position for the first and second objects from the first and second cameras, respectively (see Ye, Col 3, Lines 13-19, "The system also retains and stores feature correspondences with respect to the calibration object imaged in a plurality of positions.
During validation, the same or a substantially similar calibration object is imaged at another position"), a relative object position between the first and second objects remaining unchanged during the movement.

The combination of Wang and Ye is analogous art because both are in the same field of endeavor of calibrating a camera system. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to image the calibration object at another position during validation, per the method for runtime determination of camera miscalibration of Ye, in the method of concurrently calibrating a plurality of 3D sensors of Wang, for faster, less expensive, and more straightforward diagnosis of vision system failures related to deteriorating camera calibration (see Ye, Abstract).

Wang in view of Ye does not expressly teach determining a relative camera position between the first and second cameras based on the first, second, third, and fourth positions. However, Liu, in a similar invention in the same field of endeavor, teaches determining a relative camera position between the first and second cameras based on the first, second, third, and fourth positions (see Liu, Col 2, Lines 38-43, "image of the plate, which is in a particular position within the field of view of all cameras. The machine vision calibration software deduces the relative position of each camera from the image of the plate acquired by each camera"). The combination of Wang, Ye, and Liu is analogous art because all are in the same field of endeavor of calibrating a camera system.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to deduce the relative position of each camera from the image of the plate acquired by each camera, to employ two known equations for interpolation, and to generate a version of accurate calibration for each camera by executing the extrapolation/interpolation described in the method for generating camera calibrations for a vision system camera of Liu, in the method of concurrently calibrating a plurality of 3D sensors of Wang in view of Ye, to desirably allow a measurement of a plurality of heights to be accurately calibrated for, with a minimum of calibration plate manipulation and setup (see Liu, Col 3, Lines 17-20).

Regarding claim 2, Wang in view of Ye in further view of Liu teaches the method of claim 1, wherein determining the relative camera position comprises: generating an equation associated with the relative camera position and the first, second, third, and fourth positions based on a geometry relationship between the first and second objects and the first and second cameras (see Liu, Col 8, Lines 16-23, "When non-parallel, the interpolation can employ known equations describing each of the two calibrated planes and the third to-be-calibrated plane. These equations are used to a ray's intersection with each of the specified planes in a manner clear to those of skill"); and determining the relative camera position by solving the equation (see Liu, Col 10, Lines 16-24, "A version of its own accurate calibration data is generated for each camera by executing the extrapolation/interpolation process described above. In addition, it is expressly contemplated that the vision system camera (including a plurality of cameras) can be either stationary or in relative motion (double arrow M in FIG. 1) with respect to the motion coordinate system"). The rationale of claim 1 has been applied herein.
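Claims 1-2 turn on recovering a relative camera pose from object positions observed before and after a shared movement. As a toy illustration of that geometry only (not the claimed method nor any cited reference's algorithm), here is a minimal 2D sketch; it assumes purely translational movement and cameras that report metric positions in their own frames, and the function name and coordinates are invented for the example:

```python
import math

def relative_rotation_2d(p_before, p_after, q_before, q_after):
    """Illustrative 2D sketch: camera 1 sees its object at p_before/p_after
    (camera-1 frame); camera 2 sees its object at q_before/q_after
    (camera-2 frame). If both objects translate rigidly together, the same
    physical displacement is observed in each camera's frame, so the angle
    rotating one measured displacement onto the other is the relative
    camera orientation."""
    dp = (p_after[0] - p_before[0], p_after[1] - p_before[1])
    dq = (q_after[0] - q_before[0], q_after[1] - q_before[1])
    # Signed angle rotating dq onto dp (cross and dot of the 2D vectors).
    cross = dq[0] * dp[1] - dq[1] * dp[0]
    dot = dq[0] * dp[0] + dq[1] * dp[1]
    return math.atan2(cross, dot)

# Camera 2 is rotated 90 degrees relative to camera 1: the same physical
# move appears as (1, 0) in camera 1 and as (0, -1) in camera 2.
theta = relative_rotation_2d((0, 0), (1, 0), (5, 5), (5, 4))
print(round(math.degrees(theta)))  # 90
```

In 3D, the analogous step aligns several such displacement vectors to solve for a full rotation matrix, after which the translation requires additional constraints (for example, a known offset between the two objects).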
Regarding claim 5, Wang in view of Ye in further view of Liu teaches the method of claim 1, wherein the first object is placed within a first field of view of the first camera and the second object is placed within a second field of view of the second camera (see Wang, "each frustum 150, 152, 154 and 156 resides within the local FOV of one of the respective 3D sensors 110, 112, 114 and 116"; 150 is in the field of view of 110 and 152 is in the field of view of 112, which are considered to be the first object within a field of view of the first camera and the second object within a field of view of the second camera). The rationale of claim 1 has been applied herein.

Regarding claim 7, Wang in view of Ye in further view of Liu teaches the method of claim 1, wherein the camera system further comprises a third camera (see Wang, Fig. 1, 114), and the method further comprises: obtaining a fifth position (see Ye, Col 3, Lines 13-19, "The system also retains and stores feature correspondences with respect to the calibration object imaged in a plurality of positions. During validation, the same or a substantially similar calibration object is imaged at another position") for a third object (see Wang, Fig. 1, 154) from the third camera, the third object being used for calibrating the camera system (see Wang, Fig. 1 and Col 8, Lines 13-19, "The object 120 shown in FIG. 1 is a stable object (also generally termed herein as a “calibration object”) consisting of a plurality of individual, spaced apart frustum assemblies (also termed calibration “subobjects”) 150, 152, 154 and 156 that each define a discrete “feature set”"); obtaining, after a movement of the third object together with the first and second objects, a sixth position for the third object from the third camera (see Ye, Col 3, Lines 13-19, quoted above), a relative object position between the third object and any of the first and second objects remaining unchanged during the movement (see Wang, Col 20, Lines 43-50, "when multiple subobjects are measured by the same sensor within one scan, each pair of relative positions are computed (up to a linear transform due to the sensing modality), and one set of pairs of relative positions are boot-strapped to induce an estimate of all of the subobject feature positions"); and determining a relative camera position between the first and third cameras based on the first, fifth, third, and sixth positions (see Liu, Col 2, Lines 38-43, "image of the plate, which is in a particular position within the field of view of all cameras. The machine vision calibration software deduces the relative position of each camera from the image of the plate acquired by each camera"). The rationale of claim 1 has been applied herein.

As per claim 10: Claim 10 claims an apparatus for managing a camera system comprising the same limitations as claim 1; therefore, the rejection and rationale are analogous to those made for claim 1. Wang further teaches a computer processor coupled to a computer-readable memory unit comprising instructions that when executed by the computer processor configure it to perform the method (see Wang, Col 8, Line 65 - Col 9, Line 2, "The computing device 180 can comprise a server, PC, laptop, tablet, smartphone or purpose-built processing device, among other types of processors with associated memory, networking arrangements, data storage, etc., that should be clear to those of skill").

As per claim 11: Claim 11 claims the same limitations as claim 2 and depends from a similarly rejected independent claim. Therefore, the rejection and rationale are analogous to those made for claim 2.
As per claim 14: Claim 14 claims the same limitations as claim 5 and depends from a similarly rejected independent claim. Therefore, the rejection and rationale are analogous to those made for claim 5.

As per claim 16: Claim 16 claims the same limitations as claim 7 and depends from a similarly rejected independent claim. Therefore, the rejection and rationale are analogous to those made for claim 7.

Regarding claim 19, Wang in view of Ye in view of Liu further teaches a system for managing a camera system, comprising: a computer processor coupled to a computer-readable memory unit, the memory unit comprising instructions that when executed by the computer processor implement the method according to claim 1 (see Wang, Col 39, Lines 50-61, "Moreover, a depicted process or processor can be combined with other processes and/or processors or divided into various sub-processes or processors. Such sub-processes and/or sub-processors can be variously combined according to embodiments herein. Likewise, it is expressly contemplated that any function, process and/or processor herein can be implemented using electronic hardware, software consisting of a non-transitory computer-readable medium of program instructions, or a combination of hardware and software"). The rationale of claim 1 has been applied herein.

Regarding claim 20, Wang in view of Ye in view of Liu further teaches a non-transitory computer readable medium having instructions stored thereon, the instructions, when executed on at least one processor, causing the at least one processor to perform the method according to claim 1 (see Wang, Col 39, Lines 50-61, quoted above for claim 19). The rationale of claim 1 has been applied herein.

Claims 3 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. (US 10812778) in view of Ye et al. (US 11699247) in view of Liu et al. (US 10664994) in view of Devitt et al. (US 20210225035) in further view of Held et al. (US 20180096485).

Regarding claim 3, Wang in view of Ye in view of Liu does not expressly teach the method of claim 2, wherein solving the equation comprises representing the relative camera position by a transformation matrix including a plurality of unknown parameters. However, Devitt, in a similar invention in the same field of endeavor, teaches wherein solving the equation comprises representing the relative camera position by a transformation matrix including a plurality of unknown parameters (see Devitt, Paragraph [0030], "The position of the image acquisition system relative to the external system is preferably known (e.g., during calibration, during setup, dynamically determined during operation, etc.) and represented by one or more transformation matrices, but can alternatively be unknown"). The combination of Wang, Ye, Liu, and Devitt is analogous art because all are in the same field of endeavor of calibrating a camera system.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention for the position of the image acquisition system relative to the external system to be known and represented by one or more transformation matrices, per the method for imaging system calibration of Devitt, in the method of concurrently calibrating a plurality of 3D sensors of Wang in view of Ye in further view of Liu, to create a new and useful system and method for calibration (see Devitt, Paragraph [0003]).

Wang in view of Ye in view of Liu in further view of Devitt does not expressly teach generating a group of equations including the plurality of unknown parameters based on the equation, and determining the plurality of unknown parameters by solving the group of equations. However, Held, in a similar invention in the same field of endeavor, teaches generating a group of equations including the plurality of unknown parameters based on the equation, and determining the plurality of unknown parameters by solving the group of equations (see Held, Paragraph [0071], "The unknown t can therefore be solved on the basis of the above-mentioned equation with the aid of a plurality of test measurements during calibration. Since the position vector p of the probe body 28, which does not change during calibration, is also initially unknown, this must also be determined"). The combination of Wang, Ye, Liu, Devitt, and Held is analogous art because all are in the same field of endeavor of calibrating a camera system.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to solve for unknown variables using an equation during calibration, per the measuring system of Held, in the method of concurrently calibrating a plurality of 3D sensors of Wang in view of Ye in view of Liu in further view of Devitt, to determine, on the basis of the speed data and/or acceleration data of the probe body, whether or not probing is present in which the probe body makes contact with a measurement object for the purpose of capturing a measuring point (see Held, Abstract).

As per claim 12: Claim 12 claims the same limitations as claim 3 and depends from a similarly rejected dependent claim. Therefore, the rejection and rationale are analogous to those made for claim 3.

Claims 4, 6, 13, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. (US 10812778) in view of Ye et al. (US 11699247) in view of Liu et al. (US 10664994) in further view of Hain et al. (US 11270465).

Regarding claim 4, Wang in view of Ye in further view of Liu does not expressly teach the method of claim 1, wherein the first and second objects are connected with a fixed connection such that the relative object position remains unchanged during the movement. However, Hain, in a similar invention in the same field of endeavor, teaches wherein the first and second objects are connected with a fixed connection such that the relative object position remains unchanged during the movement (see Hain, Col 14, Lines 52-58, "the position of a sensor device remains fixed relative to a target while the target is registered or two targets are moved relative to each other. A fixed position can for example be achieved by rigidly attaching one object to another."). The combination of Wang, Ye, Liu, and Hain is analogous art because all are in the same field of endeavor of calibrating a camera system.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention for the position of a sensor device to remain fixed relative to the target, or for two targets to be moved relative to each other, per the method for generating pose transformation data of Hain, in the method of concurrently calibrating a plurality of 3D sensors of Wang in view of Ye in further view of Liu, to improve accuracy, for example the accuracy of the angle of a bone cut (see Hain, Col 1, Lines 35-37).

Regarding claim 6, Wang in view of Ye in further view of Liu does not expressly teach the method of claim 1, wherein obtaining the first position comprises: obtaining a first image for the first object from the first camera; and determining the first position from the first image, the first position representing a relative position between the first object and the first camera. However, Hain, in a similar invention in the same field of endeavor, teaches wherein obtaining the first position comprises obtaining a first image for the first object from the first camera (see Hain, Col 2, Lines 40-47, "using the first digital camera, a first plurality of images of a first calibration object in a frame of reference of a first calibration object indexed to a corresponding first plurality of measured linear displacement data of the distance between the first digital camera and the first calibration object;"), and determining the first position from the first image, the first position representing a relative position between the first object and the first camera (see Hain, Col 2, Lines 40-47, "first plurality of measured linear displacement data of the distance between the first digital camera and the first calibration object"). The combination of Wang, Ye, Liu, and Hain is analogous art because all are in the same field of endeavor of calibrating a camera system.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention for a first plurality of images of a first calibration object to correspond to a first plurality of measured linear displacement data of the distance between the first camera and the first calibration object, per the method for generating pose transformation data of Hain, in the method of concurrently calibrating a plurality of 3D sensors of Wang in view of Ye in further view of Liu, to improve accuracy, for example the accuracy of the angle of a bone cut (see Hain, Col 1, Lines 35-37).

As per claim 13: Claim 13 claims the same limitations as claim 4 and depends from a similarly rejected independent claim. Therefore, the rejection and rationale are analogous to those made for claim 4.

As per claim 15: Claim 15 claims the same limitations as claim 6 and depends from a similarly rejected independent claim. Therefore, the rejection and rationale are analogous to those made for claim 6.

Claims 8-9 and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. (US 10812778) in view of Ye et al. (US 11699247) in view of Liu et al. (US 10664994) in further view of Arora et al. (US 10839557).

Regarding claim 8, Wang in view of Ye in further view of Liu does not expressly teach the method of claim 1, further comprising calibrating the camera system based on the relative camera position. However, Arora, in a similar invention in the same field of endeavor, teaches calibrating the camera system based on the relative camera position (see Arora, Col 15, Lines 36-40, "the present calibration uses the relative position and forced changes in at least the skew and principal points for the cameras").
The combination of Wang, Ye, Liu, and Arora is analogous art because all are in the same field of endeavor of calibrating a camera system. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the relative position of the cameras for calibration, per the system of Arora, in the method of concurrently calibrating a plurality of 3D sensors of Wang in view of Ye in further view of Liu, to provide a calibration output that may be applied to generate views of an item in an augmented reality application (see Arora, Abstract).

Regarding claim 9, Wang in view of Ye in view of Liu in further view of Arora teaches the method of claim 8, wherein the camera system is deployed in a robot system and the method further comprises: monitoring an operation of the robot system based on the calibrated camera system (see Ye, Col 4, Lines 48-55, "one or more runtime objects to be inspected, aligned, acted upon by a robot manipulator, or any other operation that is controlled or assisted by machine vision processes. The system can be calibrated (and calibration can be later self-diagnosed) according to an illustrative embodiment of this invention."). The rationale of claim 8 has been applied herein.

As per claim 17: Claim 17 claims the same limitations as claim 8 and depends from a similarly rejected independent claim. Therefore, the rejection and rationale are analogous to those made for claim 8.

As per claim 18: Claim 18 claims the same limitations as claim 9 and depends from a similarly rejected dependent claim. Therefore, the rejection and rationale are analogous to those made for claim 9.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DOMINIQUE JAMES, whose telephone number is (703) 756-1655. The examiner can normally be reached 9:00 am - 6:00 pm EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Emily Terrell, can be reached at (571) 270-3717. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DOMINIQUE JAMES/
Examiner, Art Unit 2666

/MING Y HON/
Primary Examiner, Art Unit 2666

Prosecution Timeline

Jun 12, 2023
Application Filed
Sep 24, 2025
Non-Final Rejection — §103
Dec 11, 2025
Response Filed
Mar 11, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591976
CELL SEGMENTATION IMAGE PROCESSING METHODS
2y 5m to grant Granted Mar 31, 2026
Patent 12567138
REGISTRATION METROLOGY TOOL USING DARKFIELD AND PHASE CONTRAST IMAGING
2y 5m to grant Granted Mar 03, 2026
Patent 12548159
SCENE PERCEPTION SYSTEMS AND METHODS
2y 5m to grant Granted Feb 10, 2026
Patent 12462681
Detection of Malfunctions of the Switching State Detection of Light Signal Systems
2y 5m to grant Granted Nov 04, 2025
Patent 12462346
MACHINE LEARNING BASED NOISE REDUCTION CIRCUIT
2y 5m to grant Granted Nov 04, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 76%
With Interview (+38.5%): 99%
Median Time to Grant: 3y 4m
PTA Risk: Moderate
Based on 21 resolved cases by this examiner. Grant probability derived from career allow rate.
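The 76% figure follows directly from the examiner's record (16 granted of 21 resolved). How the 99% "with interview" figure is derived is not stated on the page; the sketch below reproduces both displayed numbers under one assumed formula (a relative lift capped at 99%), which is a reconstruction, not the tool's documented method:

```python
# Career allow rate from the stated record: 16 granted / 21 resolved.
granted, resolved = 16, 21
allow_rate = granted / resolved
print(f"career allow rate: {allow_rate:.0%}")   # 76%

# The page reports a +38.5% interview lift and 99% "with interview".
# ASSUMPTION: one formula that matches the displayed figures is applying
# the lift as a relative multiplier and capping the result at 99%.
with_interview = min(allow_rate * (1 + 0.385), 0.99)
print(f"with interview: {with_interview:.0%}")  # 99%
```

Note that the uncapped product exceeds 100%, which suggests the displayed 99% is a ceiling rather than a literal probability estimate.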
