Prosecution Insights
Last updated: April 19, 2026
Application No. 18/943,227

INTELLIGENT INSPECTION DEVICE AND ITS OPERATING METHOD

Non-Final OA (§102, §103)
Filed
Nov 11, 2024
Examiner
CODUROGLU, JALAL C
Art Unit
3665
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Inventec Corporation
OA Round
1 (Non-Final)
Grant Probability: 86% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
With Interview: 92%

Examiner Intelligence

Career Allow Rate: 86%, above average (262 granted / 305 resolved; +33.9% vs TC avg)
Interview Lift: +6.3% (moderate) among resolved cases with interview
Typical Timeline: 2y 6m average prosecution; 21 applications currently pending
Career History: 326 total applications across all art units
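The headline figures above are simple ratios over the examiner's docket counts. A minimal sketch of that arithmetic, using the counts reported on this page (262 granted of 305 resolved, +6.3% interview lift); the helper name is illustrative, not from any analytics API:

```python
# Derive the dashboard's headline percentages from raw docket counts.
# Counts come from this page; the function name is illustrative only.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage."""
    return 100.0 * granted / resolved

base = allow_rate(262, 305)     # ~85.9%, displayed rounded to 86%
with_interview = base + 6.3     # interview lift reported on the page

print(f"career allow rate: {base:.1f}%")            # 85.9%
print(f"with interview:    {with_interview:.1f}%")  # 92.2%, displayed as 92%
```

This matches the page's rounded 86% and 92% figures.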

Statute-Specific Performance

§101: 4.2% (-35.8% vs TC avg)
§103: 58.1% (+18.1% vs TC avg)
§102: 20.1% (-19.9% vs TC avg)
§112: 5.7% (-34.3% vs TC avg)
Comparisons are against a Tech Center average estimate. Based on career data from 305 resolved cases.

Office Action

Grounds of rejection: §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-4 & 9-10 are rejected under 35 U.S.C. 102(a)(1) and 102(a)(2) as being anticipated by Hill et al., Pub. No.: US 20150130664 A1.

Regarding claims 1 & 9, Hill et al. discloses an intelligent inspection device & operating method applicable to an intelligent inspection device comprising a signal positioning module, an inertial positioning module, and a computing element ([0006] “track mobile devices in three dimensional space measure the time that a signal arrives from the mobile device to a system's connected (either wired or wireless) antennae.” & [0030] FIG.
1 … “RF-based positioning system measurements and an inertial devices subsystem measurement” & [0031] FIG. 2 “RF signal measurements are used to determine the position of a set of transmitter antennae with respect to a set of receiving antennae.”) wherein the method comprises: obtaining, by the signal positioning module, a field signal through establishing a communication to a signal collection point ([0045] “the signal is received by the receiver antennae 21 and receiver reference antenna 101... The receiver antennae receive the transmitted signal and forward these signals to the receiver circuitry 110 for demodulation ... The differential phases are used by the position and orientation algorithm in the tracking processor 24 to determine position and orientation 121 of a tracked object.), wherein the signal collection point is configured to collect the field signal emitted by a signal source ([0023] The present invention relates to RF position tracking system that tracks, in two or three dimensions, one or more wireless mobile device(s). The disclosure features utilizing an inertial/magnetic subsystem (IMDS) integrated in the mobile device to better perform tracking by adding stability to the system's RF signals received at the system's receiver(s).” & [0024] “a system for wirelessly tracking the physical position of an object. The system has at least one radio frequency (RF) device having an antenna and at least one inertial sensor. The RF device is configured to emit a radio signal. The system has at least three receiver antennae that are each configured to receive a radio signal emitted by the device and transmit that signal to a receiver. The system also has a receiver in communication with the three or more receiver antennae. 
The receiver is configured to receive the radio signal from each receiver antenna and is further configured to communicate data to a data processor.” & [0038] “The positioning and navigation system 1 includes inertial/magnetic devices subsystem 10 (IMDS), an RF tracking system 20, a fusion algorithm processor 30 and a corrected position and orientation output interface 40.”); collecting, by the inertial positioning module, inertial positioning information ([0023] “inertial information is also received that helps the system screen interference and multipath by weighting the RF data to best match the inertial data provided by the IMDS. The combined system allows a user to obtain a more stabilized/accurate position solution.” & [0048] “The inertial/magnetic devices subsystem 10 (IMDS) provides inertial and magnetic field measurements including body angular rates, specific forces, and information on the Earth's magnetic field direction which are sent to the fusion algorithm processor 30 for minimizing RF tracking system errors during loss or corruption of RF signal.”); obtaining, by the computing element, a device location of the intelligent inspection device from the field signal, a collection point location of the signal collection point and a signal source location of the signal source from a database, and calculating a relative position of the intelligent inspection device with respect to the signal source ([0046] “Tracking a single transmitter device or transmitter antenna ... The receiver antennae 21 provide the reference frame in which the transmitter antennae are tracked. ... The receiver antennae 21 must be distinct and their respective locations known in space.
More transmitter antennae 21 attached to or embedded in a tracked object allow the object's orientation to be calculated based on geometric principles.”); and calculating, by the computing element, positioning information of the intelligent inspection device according to the relative position and the inertial positioning information, wherein the positioning information comprises a three-dimensional coordinate and an orientation angle ([0047] “Three transmitter antennae 22 provide enough information to calculate a three-dimensional orientation.” & [0048] “The inertial/magnetic devices subsystem 10 (IMDS) provides inertial and magnetic field measurements including body angular rates, specific forces, and information on the Earth's magnetic field direction which are sent to the fusion algorithm processor 30 for minimizing RF tracking system errors during loss or corruption of RF signal. In one embodiment, the position and orientation of the transmitter antennae 22 are calculated in RF algorithm block 24.” & [0050] “By measuring the transmitter signal's phase differences recorded at two receiver antennae the distance is calculated.”).

Regarding claims 2 & 10, Hill et al. discloses the intelligent inspection device of claim 9 & operating method of claim 1, wherein the field signal comprises at least one of a signal strength, a signal reception frequency and a signal reception time ([0031] “FIG. 2 … a positioning and navigation system in which RF signal measurements are used to determine the position of a set of transmitter antennae with respect to a set of receiving antennae.” & [0049] “the phase is used to measure range. The operating wavelengths of the RF tracking system provide ambiguous phase measurements because phase measurements are modulo 2π numbers.” & [0050] “measuring the transmitter signal's phase differences recorded at two receiver antennae the distance is calculated.” & [0063] “the accelerometers 12 is as a power-saving device”).
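Hill's mapping for claims 2 & 10 leans on phase-difference ranging: a path-length difference between two receiver antennae appears as a carrier phase difference that is ambiguous modulo 2π. A toy sketch of that idea; the 2.4 GHz carrier is an assumed figure for illustration, not taken from Hill:

```python
import math

# Toy illustration of phase-difference ranging: the difference in path
# length to two receiver antennae shows up as a carrier phase difference,
# ambiguous modulo one wavelength. Carrier frequency is assumed.

C = 3.0e8               # speed of light, m/s
FREQ = 2.4e9            # assumed carrier frequency, Hz
WAVELENGTH = C / FREQ   # 0.125 m

def phase_difference(d1: float, d2: float) -> float:
    """Phase difference (radians, wrapped to [0, 2*pi)) for path lengths d1, d2."""
    return (2 * math.pi * (d2 - d1) / WAVELENGTH) % (2 * math.pi)

def range_difference(dphi: float) -> float:
    """Recover the path-length difference, modulo one wavelength."""
    return dphi / (2 * math.pi) * WAVELENGTH

dphi = phase_difference(5.00, 5.03)   # 3 cm path-length difference
print(range_difference(dphi))          # ~0.03 m; unambiguous only because 3 cm < one wavelength
```

The modulo wrap is exactly the "ambiguous phase measurements" Hill mentions: differences larger than one wavelength alias back into [0, λ).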
Regarding claim 3, Hill et al. discloses the intelligent inspection device operating method of claim 1, wherein the inertial positioning comprises at least one of an acceleration value, an angle and a displacement value ([0038] “accelerometers 12 and/or magnetic sensors 13, with their accompanying signal conditioning methods and algorithms processor 14.” & [0040]-[0041] “The accelerometer 12 may be … types of technologies used for measuring acceleration.” & [0062] “the inertial sensing processor 14 to modify the IMDS 10 output while interface 37 sends acceleration and velocity data to the RF algorithm 24. … provides the RF algorithm 24 with acceleration and velocity data from the inertial hardware 11, 12, and/or 13. Interface 37 allows RF algorithm 24, which would preferably be a Kalman filter, to incorporate acceleration and velocity data into its model.”).

Regarding claim 4, Hill et al. discloses the intelligent inspection device operating method of claim 1, further comprising: performing, by the computing element, a filtering operation on the field signal to delete a portion of the field signal, wherein the portion is outside a specified communication frequency band and a signal strength of the portion is below a threshold ([0043] “RF system hardware 23 may consist of amplifiers, limiters, filters, signal sources, demodulators, modulators, and other devices.” & [0055] “The Kalman filter 30 is a recursive filter that estimates the state of a dynamic system. ... The Kalman filter 30 is used to combine, in an optimal manner, ... If the filter 30 detects short term divergence of the RF and IMDS subsystem, it weights the final solution towards the IMDS information and supplies a corrected position and orientation output 40.” & [0056] “Both filters have pros and cons, such as implementation simplicity and speed of processing”).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C.
102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 5 is rejected under 35 U.S.C. 103 as being obvious over Hill et al., Pub. No.: US 20150130664 A1 in view of Sinyavskiy et al., Pub. No.: US 20180281191 A1.

Regarding claim 5, Hill et al. discloses the intelligent inspection device operating method of claim 1. Hill et al. is not explicit on “calculating a shortest path” & “generating avoidance points” & “performing a segmented path connection”, however, Sinyavskiy et al., US 20180281191 A1, teaches SYSTEMS AND METHODS FOR ROBOTIC PATH PLANNING and discloses, further comprising: calculating, by the computing element, a shortest path according to the positioning information and a target location ([0133] “the shortest path field can be computed. For example, an end point can be determined for robot 200. For every path, the shortest path to the end point can be determined given the present orientation of the robot.
In this way, the shortest path can be determined for every point in a map.”); obtaining, by the computing element, a virtual scene from the database to identify obstacle space information located between the positioning information and the target location according to the virtual scene ([0100] FIG. 5A is an overhead view graphical representation of a cost map in accordance to some implementations of this disclosure. Cost map 502 includes a map that correlates with an environment of robot 200. For example, robot indicator 500 indicates the position of robot 200. In some cases, robot indicator 500 may not be actually present on cost map 502. Indicators 506A-506C can indicate at least in part the position of obstacles, such as walls. Indicator 504 can be indicative at least in part of a desirable travel path portion, wherein it is desirable for robot 200 to travel within indicator 504. A person having ordinary skill in the art would appreciate that a cost map may not exist as an image, but rather a data structure with values. However, in some cases, the cost map can exist as an image, such as cost map 502, and/or be more readily visualized as an image.” & See also para. [0106], [0110], [0126]-[0128]); generating, by the computing element, a plurality of avoidance points according to the shortest path and the obstacle space information; and performing, by the computing element, a segmented path connection according to the positioning information, the plurality of avoidance points, and the target location to generate a planned path ([0064] “the operator can also desire for the robot to avoid collisions in the space and/or navigate to a particular destination. 
… to perform a task at certain places (and not other places) … a robot to navigate a space (and not navigate others), avoid obstacles, perform a desired task, and/or go to a destination.” & [0095] “it is desirable for robot 200 to travel to certain locations … avoid traffic, and/or any other desirable characteristics of transportation. … robot 200 not travelling to certain locations… a robot to travel to certain areas, locations, and/or positions in the course of performing a robotic task.” & [0157] “the interface can allow a user to edit a map, adjust a path, move a starting position (e.g., home marker), delete path/path segment, add operations … can allow a user to delete a path segment. For example, after selecting option 934A, a user can then select a portion of a path displayed in map 932 and/or path portions 940. Robot 200 can then delete such selected path portion. … a user to adjust a path. For example, a user can select a path portion included in map 932 and/or path portions 940 and move/manipulate that path portion.”).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to use these above mentioned features disclosed by Sinyavskiy et al. with the system disclosed by Hill et al. in order to generate sensor data about an environment of the robot to propel travel of the robot; to generate a map of the environment; generate a cost map associated with at least a portion of the generated map of the environment, corresponds to a respective location in the environment; generate a plurality of masks, each mask of the plurality having projected path portions for the travel of the robot within the environment, wherein each mask pixel of the plurality corresponds to a respective location in the environment (see Abstract and para.[0020]).

Claim 6 is rejected under 35 U.S.C. 103 as being obvious over Hill et al., Pub. No.: US 20150130664 A1 in view of Sinyavskiy et al., Pub.
No.: US 20180281191 A1, in view of Crabtree et al., Pub. No.: US 20250352907 A1.

Regarding claim 6, Hill et al. discloses the intelligent inspection device operating method of claim 5. Hill et al. is not explicit on “A* pathfinding algorithm”, however, Crabtree et al., US 20250352907 A1, teaches SYSTEM AND METHOD FOR AI-DRIVEN MULTI-MODAL CONTENT GENERATION AND IMMERSIVE INTERACTION EXPERIENCES and discloses, wherein the computing element performs an A* pathfinding algorithm to generate the planned path ([0235] “Local AI agent system 1600 also incorporates advanced pathfinding and spatial awareness capabilities 1604. AI agents can navigate complex, dynamic environments, avoiding obstacles and other characters in a natural manner. In an embodiment, this system uses a combination of traditional A* pathfinding algorithms and machine learning models trained on human movement patterns to create more realistic and varied navigation behaviors. In a crowded city scene, for example, NPCs would exhibit diverse walking speeds, maintain personal space, form natural-looking groups, and react appropriately to unexpected obstacles or events.”).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to use these above mentioned features disclosed by Crabtree et al. with the system disclosed by Hill et al. in view of Sinyavskiy et al. in order to provide a system and method for creating complex, immersive, and interactive digital content, integrates advanced artificial intelligence, multi-modal input processing, cloud-based shared environments, and immersive hardware to generate, optimize, and deliver rich interactive experiences. The platform supports content mashups, custom scenario generation, and adaptive AI behaviors, enabling the creation of unique and engaging digital environments across various media formats (see Abstract and para.[0004]).

Claim 7 is rejected under 35 U.S.C.
103 as being obvious over Hill et al., Pub. No.: US 20150130664 A1 in view of Sinyavskiy et al., Pub. No.: US 20180281191 A1, in view of Mai et al., Pub. No.: US 20250244136 A1.

Regarding claim 7, Hill et al. discloses the intelligent inspection device operating method of claim 5. Hill et al. is not explicit on “generating guide icons”, however, Mai et al., US 20250244136 A1, teaches AUGMENTED REALITY WAYFINDING and discloses, wherein the intelligent inspection device further comprises a display element ([0076] “a map view may be displayable on a client device app, which may run on the rider's mobile phone, wearable (e.g., a smart watch or smart glasses), tablet computer, etc. The app may have a map-focused display with the option to launch an AR live view.”), and the method further comprises: generating, by the computing element, a plurality of guide icons according to the planned path; and controlling, by the computing element, the display element to display the plurality of guide icons ([0077] “Additional information can be presented in the map portion 408. For instance, one or more user-selectable icons 418 may be provided.” & [0084] “the system may only rely on live imagery obtained in real time from the client device in order to present the AR elements in the UI. … the user may be able to toggle between an AR experience showing their imagery and the vehicle's imagery (or imagery from other sources), via an icon or other control in the app.” & [0092] “Contextual signals about landmarks, points of interest or “anchors” can help guide the user.”).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to use these above mentioned features disclosed by Mai et al. with the system disclosed by Hill et al. in view of Sinyavskiy et al. in order to provide an AR indicator, generated for presentation in the second UI region.
Upon selection of the indicator, the system modifies the second region into a first section to display at least a portion of the map information and a second section to display an augmented reality view, or replace the map information with the AR view, an icon (see Abstract and para.[0003]).

Claim 8 is rejected under 35 U.S.C. 103 as being obvious over Hill et al., Pub. No.: US 20150130664 A1 in view of Schwartz '131, Pub. No.: US 20210174131 A1.

Regarding claim 8, Hill et al. discloses the intelligent inspection device operating method of claim 1. Hill et al. is not explicit on “capturing, by the camera element, an image of a target object, an optical character recognition when digital/an image processing procedure when analog”, however, Schwartz '131, US 20210174131 A1, teaches Computer Vision System for Industrial Equipment Gauge Digitization and Alarms and discloses, wherein the intelligent inspection device further comprises a camera element ([0037] “FIG. 2, a gauge monitoring computer vision system 20 … digital cameras 10”), and the method further comprises: capturing, by the camera element, an image of a target object, wherein the target object is configured to display data information ([0035] “the gauge monitoring computer vision system is applied to binarize an analog gauge by providing output values indicative of the readings of the gauge, as captured by images of the gauge in operation.” & [0074] “The processor 108 may execute software instructions by performing various input/output, logical, and/or mathematical operations.
… the processor 108 may be capable of generating and providing electronic display signals to a display device and other functions.), determining, by the computing element, a type of the data information according to a portion of the image associated with the target object ([0043] “The gauge monitoring computer vision system …enables digital monitoring of analog gauges that requires minimal per-gauge customization and can be adapted to monitor various types of gauges with minimal effort.); performing, by the computing element, an optical character recognition process to output a result associated with the data information when the type is digital ([0031] “a digital twin refers to a digital replica of a physical object. ... The digital replica can be a virtual or cyber replica. ... the gauge monitoring computer vision system is used to implement a digital twin for industrial analog gauges to monitor gauge readings” & [0032] “FIG. 1 illustrates example analog gauges to which digital monitoring may be desired. … The analog gauge also includes an indicator”); and performing, by the computing element, an image processing procedure to output the result associated with the data information when the type is analog ([0029] “a computer vision system for analog gauge monitoring uses a machine learning model that is trained using synthetic training data generated based on one or a few images of the gauge being monitored and a geometric model describing the scale and the indicator of the gauge.” & See also para. [0031]-[0037]). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to use these above mentioned features disclosed by Schwartz`131 with the system disclosed by Hill et al. in order to provide a system and a method for a gauge monitoring computer vision system, for generating a training data set to enable analog gauge digitization and alarms (see Abstract and para.[0001], [0006]). 
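The claim 8 mapping turns on a two-way branch: determine whether the imaged readout is digital or analog, then run OCR for digital readouts and an image-processing routine for analog gauges. A minimal sketch of that dispatch logic; every helper below is an illustrative stub, not an API from the cited Schwartz reference:

```python
# Sketch of the claim-8 branching: classify the gauge region as digital
# or analog, then dispatch to OCR or an image-processing routine.
# All helpers are illustrative stubs, not APIs from the Schwartz reference.

from typing import Callable, Dict

def read_digital(region: dict) -> str:
    # Stand-in for an OCR pass over a seven-segment / LCD readout.
    return region.get("ocr_text", "")

def read_analog(region: dict) -> str:
    # Stand-in for needle-angle estimation against a known gauge scale.
    angle, span, full_scale = region["angle"], region["span"], region["full_scale"]
    return f"{angle / span * full_scale:.1f}"

READERS: Dict[str, Callable[[dict], str]] = {
    "digital": read_digital,
    "analog": read_analog,
}

def read_gauge(region: dict) -> str:
    """Dispatch on the detected readout type."""
    return READERS[region["type"]](region)

print(read_gauge({"type": "digital", "ocr_text": "42.7"}))  # 42.7
print(read_gauge({"type": "analog", "angle": 135.0,
                  "span": 270.0, "full_scale": 10.0}))      # 5.0
```

A dispatch table keeps the two recognition paths independent, which mirrors how the claim separates the OCR process from the image-processing procedure.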
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See Notice of References Cited.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jalal C CODUROGLU whose telephone number is (408)918-7527. The examiner can normally be reached Monday-Friday 8-6 PT. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Hunter Lonsberry, can be reached at 571-272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Jalal C CODUROGLU/
Examiner, Art Unit 3665
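The path-planning rejections of claims 5 and 6 turn on shortest-path search over a cost map, with A* named explicitly for claim 6. A compact A* sketch on a 4-connected occupancy grid; the grid, start/goal, and unit step costs are invented for illustration, not drawn from the cited references:

```python
import heapq

# Compact A* on a 4-connected occupancy grid ('#' = obstacle), with a
# Manhattan-distance heuristic. Grid and costs are invented for the example.

def astar(grid, start, goal):
    """Return a shortest path from start to goal as a list of (row, col), or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # admissible heuristic
    open_set = [(h(start), 0, start, [start])]  # (f = g + h, g, position, path)
    seen = set()
    while open_set:
        _, g, pos, path = heapq.heappop(open_set)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != '#' and (nr, nc) not in seen:
                heapq.heappush(open_set,
                               (g + 1 + h((nr, nc)), g + 1, (nr, nc), path + [(nr, nc)]))
    return None

grid = ["....",
        ".##.",
        "....",
        "...."]
path = astar(grid, (0, 0), (2, 3))
print(len(path) - 1)   # 5 steps, routing around the '#' obstacles
```

Replacing the unit step cost with per-cell weights turns the same search into planning over a cost map of the kind Sinyavskiy describes.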

Prosecution Timeline

Nov 11, 2024
Application Filed
Jan 24, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600501
MOBILE ROBOTIC ARM FOR MOMENTUM UNLOADING AND ORBIT CONTROL
2y 5m to grant Granted Apr 14, 2026
Patent 12600489
ELECTRICAL MONITORING SYSTEM AND METHOD FOR VTOL AIRCRAFT
2y 5m to grant Granted Apr 14, 2026
Patent 12600466
LANDING GEAR ASSEMBLIES, ROTORCRAFT AND ROTORCRAFT METHODS
2y 5m to grant Granted Apr 14, 2026
Patent 12595045
SYSTEM AND METHOD TO MINIMIZE AN AIRCRAFT GROUND TURN RADIUS
2y 5m to grant Granted Apr 07, 2026
Patent 12589884
AIRCRAFT CONTROL SYSTEM FAILURE EVENT SEARCH
2y 5m to grant Granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 86% (92% with interview, +6.3%)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 305 resolved cases by this examiner. Grant probability derived from career allow rate.
