DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
The amendment filed on 11/26/2025 has been entered. Claims 1-3, 11-13, and 20 have been amended. Claims 1-20 are pending.
Response to Arguments
Applicant's arguments filed 11/26/2025 have been fully considered but they are not persuasive.
Regarding claims 1, 11, and 20, applicant argues on pages 1-3 of the Remarks that Morris fails to teach “any processor-executed step that automatically identifies the area of interest within the asset” and “determining user interface configuration (e.g. configuring the user interface) having the graphical interface configuration with the graphical prompt corresponding to the inspection task to be performed for the identified area of interest, for example, prompts to perform measurements of the anomaly.”
In response to applicant’s argument, the examiner respectfully disagrees. Morris describes a configured graphical interface in figs. 5-6 and paragraph 58: “The screen 154 may display the report screen by electronically displaying a visualization of a graphical user interface indicative of a summary of data (e.g., images, sensor data, and/or videos) captured during the inspection, in addition to flags, tags, notes, and the like, related to data captured during the inspection. Actuation of the “Flag” button 256 may cause a current position of the inspection or a current image/data of the inspection to be flagged for future reference (e.g., reference via the report, the report screen, or at the end of an inspection to cause the operator to determine if to repeat inspection of a portion of the asset).” Morris also describes identifying an area of interest in the inspection procedure via the user interface in figs. 5-6 and paragraph 60: “the processor 16 provides the CAD model 265 as an overlaid image of the live feed 268 during the inspection. In this way, an operator may use the comparison capture mode to determine if a portion of the asset under inspection is close to the ideal representation presented by the CAD model 265. For example, the operator may find cracks in an asset from the comparison of the live feed 268 to the ideal representation shown with the CAD model 265. In some embodiments, the CAD model 265 may follow the positioning of the sensor 148 throughout the inspection, such that that CAD model 265 may generally display the same inspection point on the asset as the sensor 148. For example, during an inspection an operator may navigate the sensor 148 to a turbine tip and the CAD model 265 may track, mirror, or follow the sensor 148 navigation to the turbine tip.
In some embodiments, the tracking is possible through software interpretation of positional feedback from positioning sensors in the head end section 138.” Paragraph 71 describes: “Once the adjustment is determined, the processor 16 transmits one or more commands associated with the adjustment to the positioning elements of the borescope 14, for example, to control positioning of the borescope 14. In this way, the processor 16 autonomously determines where the sensor 148 is in the recorded inspection, determines where the sensor 148 is in the real inspection, and determines what motion to move the borescope 14 to bring the sensor 148 into the position indicated by the recorded inspection. In some embodiments, the processor 16 works with the borescope 14 in a feedback loop to make small adjustments to the positioning of the sensor 148, checking each time if the difference exceeds the predefined threshold and adjusting appropriately until the difference is within the predefined threshold, before continuing on to a next step of the recorded inspection.” The examiner asserts that Morris describes the above limitations. The rejection is maintained.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Morris (US Pub. No. 2019/0331613).
Regarding claim 1, Morris discloses:
A method, (at least refer to fig. 6 and paragraph 64. Describes the method 276 as being performed by the processor 16) comprising:
Receiving, by a data processor of a non-destructive testing (NDT) device, data characterizing an inspection point identifying an asset to be inspected using the NDT device, (at least refer to fig. 1, 6 and paragraphs 15, 65. Describes some embodiments of the disclosed subject matter may utilize a probe driver to control, for example, a retraction and/or an insertion of an imaging device of an NDT device into an asset based on an operation of the probe driver. Para. 65, describes: the processor 16 determines a current step of a recorded inspection. The recorded inspection is accessed from memory 18 by the processor 16 and may include indications of steps of the recorded inspection);
Determining, by the data processor, an inspection procedure associated with the asset based on the inspection point included in the received data, wherein the inspection procedure includes automatically identifying, by the data processor, an area of interest in the asset at the inspection point, (at least refer to fig. 5-6 and paragraphs 60, 66, 71. Describes for example, during an inspection an operator may navigate the sensor 148 to a turbine tip and the CAD model 265 may track, mirror, or follow the sensor 148 navigation to the turbine tip. In some embodiments, the tracking is possible through software interpretation of positional feedback from positioning sensors in the head end section 138. Para. 66, describes: once the next movement associated with performing the recorded inspection is determined, at block 280, the processor 16 determines a goal position associated with the current step of the recorded inspection. The goal position of the current step may be saved along with the recorded inspection, or along with the indication of the step of the recorded inspection. Para. 71, describes: Once the adjustment is determined, the processor 16 transmits one or more commands associated with the adjustment to the positioning elements of the borescope 14, for example, to control positioning of the borescope 14. In this way, the processor 16 autonomously determines where the sensor 148 is in the recorded inspection, determines where the sensor 148 is in the real inspection, and determines what motion to move the borescope 14 to bring the sensor 148 into the position indicated by the recorded inspection. In some embodiments, the processor 16 works with the borescope 14 in a feedback loop to make small adjustments to the positioning of the sensor 148, checking each time if the difference exceeds the predefined threshold and adjusting appropriately until the difference is within the predefined threshold, before continuing on to a next step of the recorded inspection);
In response to identifying the area of interest during the inspection procedure, determining, by the data processor, a user interface configuration of the NDT device corresponding to the identified area of interest, (at least refer to fig. 5-6 and paragraphs 58, 71. Describes the screen 154 may display the report screen by electronically displaying a visualization of a graphical user interface indicative of a summary of data (e.g., images, sensor data, and/or videos) captured during the inspection, in addition to flags, tags, notes, and the like, related to data captured during the inspection. Para. 71, describes: Once the adjustment is determined, the processor 16 transmits one or more commands associated with the adjustment to the positioning elements of the borescope 14, for example, to control positioning of the borescope 14. In this way, the processor 16 autonomously determines where the sensor 148 is in the recorded inspection, determines where the sensor 148 is in the real inspection, and determines what motion to move the borescope 14 to bring the sensor 148 into the position indicated by the recorded inspection. In some embodiments, the processor 16 works with the borescope 14 in a feedback loop to make small adjustments to the positioning of the sensor 148, checking each time if the difference exceeds the predefined threshold and adjusting appropriately until the difference is within the predefined threshold, before continuing on to a next step of the recorded inspection) wherein the user interface configuration includes:
A graphical interface configuration provided via a graphical user interface displayed on a display of the NDT device, the graphical interface configuration including a graphical prompt corresponding to an inspection task to be performed for the area of interest, (at least refer to fig. 5-6 and paragraphs 58, 62, 71. Describes the screen 154 may display the report screen by electronically displaying a visualization of a graphical user interface indicative of a summary of data (e.g., images, sensor data, and/or videos) captured during the inspection, in addition to flags, tags, notes, and the like, related to data captured during the inspection. Actuation of the “Flag” button 256 may cause a current position of the inspection or a current image/data of the inspection to be flagged for future reference (e.g., reference via the report, the report screen, or at the end of an inspection to cause the operator to determine if to repeat inspection of a portion of the asset). Actuation of the “Capture” button 258 may cause the sensor 148 to capture an image, video, and/or other sensor data associated with the point of inspection. Para. 62, describes: the processor 16 may operate to perform the macro to provide commands to the probe driver 162 while in an autonomous operational mode. When the inspection is performed while recording a macro, each rotation, insertion, removal, and data capture and/or inspection action is recorded and stored in the memory 18, or other suitable memory, as associated with a movement such that the same inspection may be repeated by executing the macro recorded. A probe driver 162 operating in the autonomous operational mode may execute commands recorded (e.g., performed previously and recorded) with the macro while the probe driver 162 is used to perform an inspection in conjunction with the borescope 14); and
A manual interface configuration corresponding to an actuated interface device of the NDT device, (at least refer to fig. 2, 5 and paragraphs 30, 43. Describes the operator may operate the borescope 14 and/or mobile device 30 to acquire an image or video of the asset. To do so, the operator may actuate a button and/or a user input element on the borescope 14 and/or mobile device 30. In response to the actuation, the button and/or the user input element may generate a signal indicative of a request for acquisition of an image or video. Para. 43, describes: The probe driver 162 may operate in a fully-manual operational mode. In the fully-manual operational mode, the probe driver 162 may receive a positioning instruction via user inputs 166); and
Configuring, by the data processor, the NDT device to perform the inspection procedure by applying the determined user interface configuration, (at least refer to fig. 1, 5 and paragraph 24. Describes the operator may control a position of a sensor of the borescope 14 using relative control gestures (e.g., touch gestures). The relative control gestures may be used on their own or may be combined with inputs derived from other control devices (e.g., physical manipulation device such as a physical joystick, one or more buttons, a physical control pad, and so on) to position a sensor. Additionally, the relative control gestures may be combined with control inputs from other external systems, such as a second NDT system, a laptop, cell phone, tablet).
Regarding claim 11, Morris discloses:
A non-destructive testing (NDT) device, (at least refer to fig. 1 and paragraph 15. Describes non-destructive testing (NDT) systems and devices) comprising:
A sensor, (at least refer to fig. 1 and paragraph 17. Describes NDT devices and systems sometimes include measurement devices (e.g., sensors));
An actuated interface device, (at least refer to fig. 1 and paragraph 17. Describes the operator may actuate a button and/or a user input element on the borescope 14 and/or mobile device 30);
A data processor communicatively coupled to the sensor and the actuated interface device, (at least refer to fig. 1 and paragraph 30. Describes the operator may actuate a button and/or a user input element on the borescope 14 and/or mobile device 30. Once the camera 144 acquires the image or video, the borescope 14 and/or mobile device 30 may receive data corresponding to the acquired image or video and may store the data in the memory 18 or 34, may process the image via the processor 16 or 32, the processor 16 or 32 may receive the data and display the data as an image or video via the screen 154, the screen 156);
A display coupled to the data processor, (at least refer to fig. 1 and paragraph 30. Describes the processor 16 or 32 may receive the data and display the data as an image or video via the screen 154, the screen 156);
A memory coupled to the data processor and storing computer-readable executable instructions, which when executed by the data processor perform operations comprising receive data characterizing an inspection point identifying an asset to be inspected using the NDT device, (at least refer to fig. 1, 6 and paragraphs 30, 65, 87. Describes once the camera 144 acquires the image or video, the borescope 14 and/or mobile device 30 may receive data corresponding to the acquired image or video and may store the data in the memory 18 or 34, may process the image via the processor 16 or 32, the processor 16 or 32 may receive the data and display the data as an image or video via the screen 154, the screen 156. Para. 65, describes: the processor 16 determines a current step of a recorded inspection. The recorded inspection is accessed from memory 18 by the processor 16 and may include indications of steps of the recorded inspection. Para. 87, describes: The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data);
Determine an inspection procedure associated with the asset based on the inspection point included in the received data, wherein the inspection procedure includes automatically identifying, by the data processor, an area of interest in the asset at the inspection point, (at least refer to fig. 5-6 and paragraphs 60, 66, 71. Describes for example, during an inspection an operator may navigate the sensor 148 to a turbine tip and the CAD model 265 may track, mirror, or follow the sensor 148 navigation to the turbine tip. In some embodiments, the tracking is possible through software interpretation of positional feedback from positioning sensors in the head end section 138. Para. 66, describes: once the next movement associated with performing the recorded inspection is determined, at block 280, the processor 16 determines a goal position associated with the current step of the recorded inspection. The goal position of the current step may be saved along with the recorded inspection, or along with the indication of the step of the recorded inspection. Para. 71, describes: Once the adjustment is determined, the processor 16 transmits one or more commands associated with the adjustment to the positioning elements of the borescope 14, for example, to control positioning of the borescope 14. In this way, the processor 16 autonomously determines where the sensor 148 is in the recorded inspection, determines where the sensor 148 is in the real inspection, and determines what motion to move the borescope 14 to bring the sensor 148 into the position indicated by the recorded inspection. In some embodiments, the processor 16 works with the borescope 14 in a feedback loop to make small adjustments to the positioning of the sensor 148, checking each time if the difference exceeds the predefined threshold and adjusting appropriately until the difference is within the predefined threshold, before continuing on to a next step of the recorded inspection);
In response to identifying the area of interest during the inspection procedure, determining a user interface configuration of the NDT device corresponding to the identified area of interest, (at least refer to fig. 5-6 and paragraphs 58, 71. Describes the screen 154 may display the report screen by electronically displaying a visualization of a graphical user interface indicative of a summary of data (e.g., images, sensor data, and/or videos) captured during the inspection, in addition to flags, tags, notes, and the like, related to data captured during the inspection. Para. 71, describes: Once the adjustment is determined, the processor 16 transmits one or more commands associated with the adjustment to the positioning elements of the borescope 14, for example, to control positioning of the borescope 14. In this way, the processor 16 autonomously determines where the sensor 148 is in the recorded inspection, determines where the sensor 148 is in the real inspection, and determines what motion to move the borescope 14 to bring the sensor 148 into the position indicated by the recorded inspection. In some embodiments, the processor 16 works with the borescope 14 in a feedback loop to make small adjustments to the positioning of the sensor 148, checking each time if the difference exceeds the predefined threshold and adjusting appropriately until the difference is within the predefined threshold, before continuing on to a next step of the recorded inspection) wherein the user interface configuration includes:
A graphical interface configuration provided via a graphical user interface displayed on the display, the graphical interface configuration including a graphical prompt corresponding to an inspection task to be performed for the area of interest, (at least refer to fig. 5-6 and paragraphs 58, 62, 71. Describes the screen 154 may display the report screen by electronically displaying a visualization of a graphical user interface indicative of a summary of data (e.g., images, sensor data, and/or videos) captured during the inspection, in addition to flags, tags, notes, and the like, related to data captured during the inspection. Actuation of the “Flag” button 256 may cause a current position of the inspection or a current image/data of the inspection to be flagged for future reference (e.g., reference via the report, the report screen, or at the end of an inspection to cause the operator to determine if to repeat inspection of a portion of the asset). Actuation of the “Capture” button 258 may cause the sensor 148 to capture an image, video, and/or other sensor data associated with the point of inspection. Para. 62, describes: the processor 16 may operate to perform the macro to provide commands to the probe driver 162 while in an autonomous operational mode. When the inspection is performed while recording a macro, each rotation, insertion, removal, and data capture and/or inspection action is recorded and stored in the memory 18, or other suitable memory, as associated with a movement such that the same inspection may be repeated by executing the macro recorded. A probe driver 162 operating in the autonomous operational mode may execute commands recorded (e.g., performed previously and recorded) with the macro while the probe driver 162 is used to perform an inspection in conjunction with the borescope 14); and
A manual interface configuration corresponding to the actuated interface device, (at least refer to fig. 2, 5 and paragraphs 30, 43. Describes the operator may operate the borescope 14 and/or mobile device 30 to acquire an image or video of the asset. To do so, the operator may actuate a button and/or a user input element on the borescope 14 and/or mobile device 30. In response to the actuation, the button and/or the user input element may generate a signal indicative of a request for acquisition of an image or video. Para. 43, describes: The probe driver 162 may operate in a fully-manual operational mode. In the fully-manual operational mode, the probe driver 162 may receive a positioning instruction via user inputs 166); and
Configure the NDT device to perform the inspection procedure by applying the determined user interface configuration, (at least refer to fig. 1, 5 and paragraph 24. Describes the operator may control a position of a sensor of the borescope 14 using relative control gestures (e.g., touch gestures). The relative control gestures may be used on their own or may be combined with inputs derived from other control devices (e.g., physical manipulation device such as a physical joystick, one or more buttons, a physical control pad, and so on) to position a sensor. Additionally, the relative control gestures may be combined with control inputs from other external systems, such as a second NDT system, a laptop, cell phone, tablet).
Regarding claim 20, Morris discloses:
A non-transitory computer readable medium having instructions stored therein that, when executed by a microprocessor, causes the microprocessor to perform a method, (at least refer to fig. 1 and paragraph 24. Describes a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks) the method comprising:
Receiving, by a data processor of a non-destructive testing (NDT) device, data characterizing an inspection point identifying an asset to be inspected using the NDT device, (at least refer to fig. 1, 6 and paragraphs 15, 65. Describes some embodiments of the disclosed subject matter may utilize a probe driver to control, for example, a retraction and/or an insertion of an imaging device of an NDT device into an asset based on an operation of the probe driver. Para. 65, describes: the processor 16 determines a current step of a recorded inspection. The recorded inspection is accessed from memory 18 by the processor 16 and may include indications of steps of the recorded inspection);
Determining, by the data processor, an inspection procedure associated with the asset based on the inspection point included in the received data, wherein the inspection procedure includes automatically identifying, by the data processor, an area of interest in the asset at the inspection point, (at least refer to fig. 5-6 and paragraphs 60, 66, 71. Describes for example, during an inspection an operator may navigate the sensor 148 to a turbine tip and the CAD model 265 may track, mirror, or follow the sensor 148 navigation to the turbine tip. In some embodiments, the tracking is possible through software interpretation of positional feedback from positioning sensors in the head end section 138. Para. 66, describes: once the next movement associated with performing the recorded inspection is determined, at block 280, the processor 16 determines a goal position associated with the current step of the recorded inspection. The goal position of the current step may be saved along with the recorded inspection, or along with the indication of the step of the recorded inspection. Para. 71, describes: Once the adjustment is determined, the processor 16 transmits one or more commands associated with the adjustment to the positioning elements of the borescope 14, for example, to control positioning of the borescope 14. In this way, the processor 16 autonomously determines where the sensor 148 is in the recorded inspection, determines where the sensor 148 is in the real inspection, and determines what motion to move the borescope 14 to bring the sensor 148 into the position indicated by the recorded inspection. In some embodiments, the processor 16 works with the borescope 14 in a feedback loop to make small adjustments to the positioning of the sensor 148, checking each time if the difference exceeds the predefined threshold and adjusting appropriately until the difference is within the predefined threshold, before continuing on to a next step of the recorded inspection);
In response to identifying the area of interest during the inspection procedure, determining, by the data processor, a user interface configuration of the NDT device corresponding to the identified area of interest, (at least refer to fig. 5-6 and paragraphs 58, 71. Describes the screen 154 may display the report screen by electronically displaying a visualization of a graphical user interface indicative of a summary of data (e.g., images, sensor data, and/or videos) captured during the inspection, in addition to flags, tags, notes, and the like, related to data captured during the inspection. Para. 71, describes: Once the adjustment is determined, the processor 16 transmits one or more commands associated with the adjustment to the positioning elements of the borescope 14, for example, to control positioning of the borescope 14. In this way, the processor 16 autonomously determines where the sensor 148 is in the recorded inspection, determines where the sensor 148 is in the real inspection, and determines what motion to move the borescope 14 to bring the sensor 148 into the position indicated by the recorded inspection. In some embodiments, the processor 16 works with the borescope 14 in a feedback loop to make small adjustments to the positioning of the sensor 148, checking each time if the difference exceeds the predefined threshold and adjusting appropriately until the difference is within the predefined threshold, before continuing on to a next step of the recorded inspection) wherein the user interface configuration includes:
A graphical interface configuration provided via a graphical user interface displayed on a display of the NDT device the graphical interface configuration including a graphical prompt corresponding to an inspection task to be performed for the area of interest, (at least refer to fig. 5-6 and paragraphs 58, 62, 71. Describes the screen 154 may display the report screen by electronically displaying a visualization of a graphical user interface indicative of a summary of data (e.g., images, sensor data, and/or videos) captured during the inspection, in addition to flags, tags, notes, and the like, related to data captured during the inspection. Actuation of the “Flag” button 256 may cause a current position of the inspection or a current image/data of the inspection to be flagged for future reference (e.g., reference via the report, the report screen, or at the end of an inspection to cause the operator to determine if to repeat inspection of a portion of the asset). Actuation of the “Capture” button 258 may cause the sensor 148 to capture an image, video, and/or other sensor data associated with the point of inspection. Para. 62, describes: the processor 16 may operate to perform the macro to provide commands to the probe driver 162 while in an autonomous operational mode. When the inspection is performed while recording a macro, each rotation, insertion, removal, and data capture and/or inspection action is recorded and stored in the memory 18, or other suitable memory, as associated with a movement such that the same inspection may be repeated by executing the macro recorded. A probe driver 162 operating in the autonomous operational mode may execute commands recorded (e.g., performed previously and recorded) with the macro while the probe driver 162 is used to perform an inspection in conjunction with the borescope 14); and
A manual interface configuration corresponding to an actuated interface device of the NDT device, (at least refer to fig. 2, 5 and paragraphs 30, 43. Describes the operator may operate the borescope 14 and/or mobile device 30 to acquire an image or video of the asset. To do so, the operator may actuate a button and/or a user input element on the borescope 14 and/or mobile device 30. In response to the actuation, the button and/or the user input element may generate a signal indicative of a request for acquisition of an image or video. Para. 43, describes: The probe driver 162 may operate in a fully-manual operational mode. In the fully-manual operational mode, the probe driver 162 may receive a positioning instruction via user inputs 166); and
Configuring, by the data processor, the NDT device to perform the inspection procedure by applying the determined user interface configuration, (at least refer to fig. 1, 5 and paragraph 24. Describes the operator may control a position of a sensor of the borescope 14 using relative control gestures (e.g., touch gestures). The relative control gestures may be used on their own or may be combined with inputs derived from other control devices (e.g., physical manipulation device such as a physical joystick, one or more buttons, a physical control pad, and so on) to position a sensor. Additionally, the relative control gestures may be combined with control inputs from other external systems, such as a second NDT system, a laptop, cell phone, tablet).
Regarding claim 2, Morris discloses:
Wherein the inspection task includes performing measurements of the area of interest, (at least refer to fig. 5-6 and paragraph 77. Describes upon the borescope 14 and/or the processor 16 positioning of the sensor 148 into the goal position, the processor 16 may interpret a next goal action to perform when continuing to the next portion of the recorded inspection. A goal action may include capturing video, a photo, and/or taking a sensor measurement. The goal action may also relate to storing data in the report, to storing an indication of an observation from the inspection, and/or flagging a particular portion, image, video, and/or sensor reading for additional and/or alternative analysis).
Regarding claim 3, Morris discloses:
Wherein the manual interface configuration includes an actuation input pattern that, when received via the actuated interface device, cause the NDT device to execute the inspection task, (at least refer to fig. 2, 5 and paragraphs 30, 78. Describes For example, a borescope operator may physically manipulate the borescope 14 at one location. Para. 30, describes: the operator may actuate a button and/or a user input element on the borescope 14 and/or mobile device 30. In response to the actuation, the button and/or the user input element may generate a signal indicative of a request for acquisition of an image or video. The borescope 14 and/or mobile device 30 may receive the signal indicative of the request for acquisition of the image or video and may transmit a control signal to operate the camera 144. Furthermore, the acquired image or video may be saved in a report detailing results of the inspection. Para. 78, describes: The processor 16, via the MDI 250, may indicate to the operator to place the probe driver 162 into the disengaged operational mode, or the fully-manual operational mode, such that the operator take over the control of the inspection from the processor 16).
Regarding claim 4, Morris discloses:
Wherein responsive to receiving the actuation input pattern, controlling, by the NDT device, an actuator associated with an image sensor used to perform the inspection procedure, (at least refer to fig. 1, 5 and paragraphs 23, 30. Describes for example, a borescope operator may physically manipulate the borescope 14 at one location. Para. 30, describes: the operator may actuate a button and/or a user input element on the borescope 14 and/or mobile device 30. In response to the actuation, the button and/or the user input element may generate a signal indicative of a request for acquisition of an image or video. The borescope 14 and/or mobile device 30 may receive the signal indicative of the request for acquisition of the image or video and may transmit a control signal to operate the camera 144. Furthermore, the acquired image or video may be saved in a report detailing results of the inspection).
Regarding claim 5, Morris discloses:
Wherein the received data is received via an inspection point selection provided via the graphical user interface, the inspection point selection including previously performed inspection procedures associated with the inspection point, (at least refer to fig. 1, 5 and paragraph 58. Describes: actuation of the “Flag” button 256 may cause a current position of the inspection or a current image/data of the inspection to be flagged for future reference (e.g., reference via the report, the report screen, or at the end of an inspection to cause the operator to determine if to repeat inspection of a portion of the asset). Actuation of the “Capture” button 258 may cause the sensor 148 to capture an image, video, and/or other sensor data associated with the point of inspection).
Regarding claim 6, Morris discloses:
Wherein the received data is received via the image sensor of the NDT device and includes an image of the inspection point, (at least refer to fig. 1, 5 and paragraph 60. Describes, for example: the operator may find cracks in an asset from the comparison of the live feed 268 to the ideal representation shown with the CAD model 265. In some embodiments, the CAD model 265 may follow the positioning of the sensor 148 throughout the inspection, such that the CAD model 265 may generally display the same inspection point on the asset as the sensor 148).
Regarding claim 7, Morris discloses:
Wherein the received data includes image data of the inspection point captured by the image sensor, (at least refer to fig. 1, 5 and paragraph 60. Describes: the head end section 138 may include one or more sensors that collect data about the surrounding environment (e.g., a camera 144, a sensor 148, etc.). Para. 60 describes: the CAD model 265 may follow the positioning of the sensor 148 throughout the inspection, such that the CAD model 265 may generally display the same inspection point on the asset as the sensor 148).
Regarding claim 8, Morris discloses:
Wherein the actuated interface device includes at least one of a button, slider, joystick, knob, pointing stick, and touchpad, (at least refer to fig. 1-2 and paragraph 34. Describes: the probe driver 162 may receive positioning instructions to retract or extend the conduit section 142 via actuation of the user inputs 166. The user inputs 166 may be a variety of input structures including a joystick, a digital pad, a control pad, a 3D spatial mouse (e.g., a computer mouse that facilitates navigation and selection within a 3D plane instead of the 2D plane of a display or screen) and/or one or more buttons).
Regarding claim 9, Morris discloses:
Wherein the NDT device includes a borescope or a video probe, (at least refer to fig. 1-2 and paragraph 19. Describes: an operator may operate a probe driver in coordination with a menu driven inspection (MDI) of a video borescope).
Regarding claim 10, Morris discloses:
Wherein the asset includes at least one of a compressor, a turbine, an engine, or a combustor, (at least refer to fig. 1-2 and paragraph 21. Describes: the borescope 14 may have one or more of a processor 16 and one or more of a memory 18, and may be used to inspect, for example, turbo machinery, containers, vessels, compressors, pumps, turbo expanders, wind turbines, hydroturbines, industrial equipment, residential equipment, and the like).
Regarding claim 12, Morris discloses:
Wherein the inspection task includes performing measurements of the area of interest, (at least refer to fig. 5-6 and paragraph 77. Describes: upon the borescope 14 and/or the processor 16 positioning the sensor 148 into the goal position, the processor 16 may interpret a next goal action to perform when continuing to the next portion of the recorded inspection. A goal action may include capturing video, a photo, and/or taking a sensor measurement. The goal action may also relate to storing data in the report, to storing an indication of an observation from the inspection, and/or flagging a particular portion, image, video, and/or sensor reading for additional and/or alternative analysis).
Regarding claim 13, Morris discloses:
Wherein the manual interface configuration includes an actuation input pattern that, when received via the actuated interface device, cause the NDT device to execute the inspection task, (at least refer to fig. 2, 5 and paragraphs 30, 78. Describes, for example: a borescope operator may physically manipulate the borescope 14 at one location. Para. 30 describes: the operator may actuate a button and/or a user input element on the borescope 14 and/or mobile device 30. In response to the actuation, the button and/or the user input element may generate a signal indicative of a request for acquisition of an image or video. The borescope 14 and/or mobile device 30 may receive the signal indicative of the request for acquisition of the image or video and may transmit a control signal to operate the camera 144. Furthermore, the acquired image or video may be saved in a report detailing results of the inspection. Para. 78 describes: the processor 16, via the MDI 250, may indicate to the operator to place the probe driver 162 into the disengaged operational mode, or the fully-manual operational mode, such that the operator takes over control of the inspection from the processor 16).
Regarding claim 14, Morris discloses:
Wherein the NDT device controls an actuator associated with the sensor used to perform the inspection procedure, responsive to receiving the actuation input pattern, (at least refer to fig. 1, 5 and paragraphs 23, 30. Describes, for example: a borescope operator may physically manipulate the borescope 14 at one location. Para. 30 describes: the operator may actuate a button and/or a user input element on the borescope 14 and/or mobile device 30. In response to the actuation, the button and/or the user input element may generate a signal indicative of a request for acquisition of an image or video. The borescope 14 and/or mobile device 30 may receive the signal indicative of the request for acquisition of the image or video and may transmit a control signal to operate the camera 144. Furthermore, the acquired image or video may be saved in a report detailing results of the inspection).
Regarding claim 15, Morris discloses:
Wherein the received data is received via an inspection point selection provided via the graphical user interface, the inspection point selection including previously performed inspection procedures associated with the inspection point, (at least refer to fig. 1, 5 and paragraph 58. Describes: actuation of the “Flag” button 256 may cause a current position of the inspection or a current image/data of the inspection to be flagged for future reference (e.g., reference via the report, the report screen, or at the end of an inspection to cause the operator to determine if to repeat inspection of a portion of the asset). Actuation of the “Capture” button 258 may cause the sensor 148 to capture an image, video, and/or other sensor data associated with the point of inspection).
Regarding claim 16, Morris discloses:
Wherein the received data is received via an image sensor of the NDT device and includes an image of the inspection point, (at least refer to fig. 1, 5 and paragraph 60. Describes, for example: the operator may find cracks in an asset from the comparison of the live feed 268 to the ideal representation shown with the CAD model 265. In some embodiments, the CAD model 265 may follow the positioning of the sensor 148 throughout the inspection, such that the CAD model 265 may generally display the same inspection point on the asset as the sensor 148).
Regarding claim 17, Morris discloses:
Wherein the received data includes image data of the inspection point captured by the image sensor, (at least refer to fig. 1, 5 and paragraph 60. Describes: the head end section 138 may include one or more sensors that collect data about the surrounding environment (e.g., a camera 144, a sensor 148, etc.). Para. 60 describes: the CAD model 265 may follow the positioning of the sensor 148 throughout the inspection, such that the CAD model 265 may generally display the same inspection point on the asset as the sensor 148).
Regarding claim 18, Morris discloses:
Wherein the actuated interface device includes at least one of a button, slider, joystick, knob, pointing stick, and touchpad, (at least refer to fig. 1-2 and paragraph 34. Describes: the probe driver 162 may receive positioning instructions to retract or extend the conduit section 142 via actuation of the user inputs 166. The user inputs 166 may be a variety of input structures including a joystick, a digital pad, a control pad, a 3D spatial mouse (e.g., a computer mouse that facilitates navigation and selection within a 3D plane instead of the 2D plane of a display or screen) and/or one or more buttons).
Regarding claim 19, Morris discloses:
Wherein the NDT device includes a borescope or a video probe, (at least refer to fig. 1-2 and paragraph 19. Describes: an operator may operate a probe driver in coordination with a menu driven inspection (MDI) of a video borescope).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. US 2014/0207417 relates to an inspection management system including an inspection data provider that provides inspection data relating to an inspector, one or more devices used to complete an inspection, one or more assets associated with an inspection, or any combination thereof. The inspection management system also includes a display that presents one or more graphical user interfaces and a processor that obtains the inspection data from the inspection data provider and presents an inspection management graphical user interface via the display, based upon the inspection data.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IFEDAYO B ILUYOMADE whose telephone number is (571)270-7118. The examiner can normally be reached Monday-Friday.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Eason, can be reached at (571) 270-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/IFEDAYO B ILUYOMADE/Primary Examiner, Art Unit 2624 12/10/2025