Prosecution Insights
Last updated: April 19, 2026
Application No. 18/071,082

HELMET MOUNTED PROCESSING SYSTEM

Non-Final OA (§102, §103)
Filed: Nov 29, 2022
Examiner: ANYA, CHARLES E
Art Unit: 2194
Tech Center: 2100 — Computer Architecture & Software
Assignee: Galvion Ltd.
OA Round: 3 (Non-Final)
Grant Probability: 82% — Favorable
OA Rounds: 3-4
To Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 82% — above average (727 granted / 891 resolved; +26.6% vs TC avg)
Interview Lift: +33.5% — strong (resolved cases with interview vs. without)
Typical Timeline: 3y 2m avg prosecution; 41 currently pending
Career History: 932 total applications across all art units
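The headline figures in this panel are straight ratios over the examiner's career counts. A minimal sketch of the arithmetic (the rounding and the implied Tech Center baseline are our reading of the display, not a disclosed methodology):

```python
# Career allow rate from the counts shown above: 727 granted of 891 resolved.
granted, resolved = 727, 891

allow_rate = 100 * granted / resolved
print(f"Career allow rate: {allow_rate:.1f}%")  # 81.6%, displayed rounded as 82%

# The "+26.6% vs TC avg" delta implies a Tech Center baseline of roughly:
tc_avg = allow_rate - 26.6
print(f"Implied TC average: {tc_avg:.1f}%")  # 55.0%
```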

Statute-Specific Performance

§101: 11.2% (-28.8% vs TC avg)
§103: 61.1% (+21.1% vs TC avg)
§102: 6.8% (-33.2% vs TC avg)
§112: 10.4% (-29.6% vs TC avg)
Deltas are relative to the estimated Tech Center average • Based on career data from 891 resolved cases

Office Action

§102 §103
DETAILED ACTION

Claims 1-20 are pending in this application.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 4-11 and 14-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. Pub. No. 2016/0342840 A1 to Mullins et al. As to claim 1, Mullins teaches a helmet system comprising: a helmet shell (HMD 101) (“…FIG. 1 is a network diagram illustrating a network environment 100 suitable for operating an AR application of a HMD with display lenses, according to some example embodiments. The network environment 100 includes a HMD 101 and a server 110, communicatively coupled to each other via a network 108. The HMD 101 and the server 110 may each be implemented in a computer system, in whole or in part, as described below with respect to FIG. 13…The server 110 may be part of a network-based system.
For example, the network-based system may be or include a cloud-based server system that provides AR content (e.g., audio or visual instructions on how to operate a tool, information about an imminent or potential threat, instructions on how to remedy the threat or minimize exposure to the threat, visualization of the threat, augmented information including 3D models of virtual objects related to physical objects in images captured by the HMD 101) to the HMD 101…The HMD 101 may include a helmet or other head mounted device that a user 102 may wear to view the AR content related to captured images of several physical objects (e.g., object A 116, object B 118) in a real world physical environment 114. In one example embodiment, the HMD 101 includes a computing device with a camera and a display (e.g., smart glasses, smart helmet, smart visor, smart face shield). The computing device may be removably mounted to the head of the user 102. In one example, the display may be a screen that displays what is captured with a camera of the HMD 101. In another example, the display of the HMD 101 may include a transparent display or see-through display, such as in the visor or face shield of a helmet, or a display lens distinct from the visor or face shield of the helmet…” paragraph 0035-0037); a first subsystem (Sensors 202), attached to the helmet shell, for receiving field data from at least one in-situ sensor (sensor data) and processing the field data to detect a first event trigger and generate first derived data associated with the first event trigger (determine and identify a potential threat) (“…The HMD 101 may determine and identify a potential threat to the user 102 based on the combination of HMD-based sensor data, user-based sensor data, physical object-based sensor data, and ambient-based sensor data. 
In one example embodiment, the HMD 101 receives preconfigured parameters (e.g., safe ranges, and safe thresholds for corresponding sensors) associated with a threat and performs the analysis locally on the HMD 101 by comparing the sensor-based data with the preconfigured parameters. If the HMD 101 determines that one or more of the sensor data matches one or more of the preconfigured parameters, the HMD 101 notifies the user 102 by generating an audio or visual alert in the HMD 101. The HMD 101 may further provide the user 102 with instructions on how to remedy or correct an operation on the physical objects 116, 118 to dissipate the threat. If HMD 101 determines that no action from the user 102 can dissipate the threat, the HMD 101 may cause a display of a virtual evacuation route or path in the transparent display of the HMD 101…” paragraph 0040); and a controller (Server 110), attached to the helmet shell, for receiving the first derived data (determine and identify a potential threat) and receiving second derived data associated with a second event trigger detected by a second subsystem (operation 602, the server 110 receives or accesses preconfigured parameters for sensor data) (“…In another example embodiment, the HMD 101 streams or provides sensor-data to the server 110 so that the server 110 performs a threat analysis. For example, the server 110 may already be configured with preconfigured parameters associated with threats and user tasks. For example, the temperature of a gauge may not exceed a threshold after the user 102 turns a valve in step 3 of a maintenance operation of a machine…Furthermore, external sensors 112 may be associated with, coupled to, or related to the objects 116 and 118 in the physical environment 114 to measure a location, information, or captured readings from the objects 116 and 118.
Examples of captured readings may include but are not limited to weight, pressure, temperature, velocity, direction, position, intrinsic and extrinsic properties, acceleration, and dimensions. For example, sensors 112 may be disposed throughout a factory floor to measure movement, pressure, orientation, and temperature. The server 110 can compute readings from data generated by the sensors 112…FIG. 6 is a flowchart illustrating a method 600 for generating a threat pattern based on preconfigured parameters at a server 110, according to an example embodiment. The method 600 may be deployed on the server 110 or on the HMD 101 and, accordingly, is described merely by way of example with reference thereto. At operation 602, the server 110 receives or accesses preconfigured parameters for sensor data. The sensor data may include sensor data from HMD 101 or aggregate HMDs. In one example embodiment, a user 102 configures and enters a range for one of the attribute of the sensor data (e.g., safe temperature range between t1 and t2 for temperature sensor of an engine during steps 1 through 5 of a maintenance operation). In another example embodiment, the user 102 configures a range for one or more sensors 112 external to the HMD 101. Operation 602 may be implemented with the server AR application 504 of server 110 or the threat application 216 of HMD 101…At operation 604, a threat pattern is generated based on the preconfigured parameters for the sensor data from operation 602. In one example embodiment, operation 604 may be implemented with the server AR application 504 of server 110 or the threat learning module 402 of HMD 101. 
The threat pattern is stored in the threat pattern dataset 512 of the server 110 or in the storage device 208 of HMD 101 at operation 606…” paragraphs 0041/0042/0095/0096), and in response to receiving the first derived data and the second derived data, requesting third derived data from a third subsystem that did not detect a third event trigger (“…The threat application identifies a threat based on a threat pattern and the sensor data. The threat pattern includes, for example, preconfigured thresholds for the sensor data or a series of user activities and corresponding sensor data resulting from the user activities. The threat application generates a warning notification in response to detecting the threat. The threat application compares the sensor data with the threat pattern to determine a threat. The AR application causes a display of the AR content comprising the warning notification in the transparent display. The warning notification may include a visual notification in the transparent display to bring the attention of the user to the imminent threat. For example, the transparent display may display a layer of virtual flashing lights on the physical objects causing the threat (e.g., a physical motor may be flashing red to indicate that the physical motor is overheating) or a virtual arrow showing a direction of the imminent threat…the HMD 101 notifies the user 102 by generating an audio or visual alert in the HMD 101… At operation 908, the HMD 101 identifies the threat based on the comparison. At operation 910, the HMD 101 generates a suggested course of action for the user 102 to take to address and dissipate the threat. At operation 912, the HMD 101 generates AR information corresponding to the suggested course of action and displays the AR information in the display 204 of the HMD 101…” paragraphs 0023/0040/0108). 
As to claim 4, Mullins teaches the helmet system of claim 1 wherein the subsystem provides an output to a user in response to detection of the first event trigger (a display of the AR content comprising the warning notification in the transparent display) (“…The threat application identifies a threat based on a threat pattern and the sensor data. The threat pattern includes, for example, preconfigured thresholds for the sensor data or a series of user activities and corresponding sensor data resulting from the user activities. The threat application generates a warning notification in response to detecting the threat. The threat application compares the sensor data with the threat pattern to determine a threat. The AR application causes a display of the AR content comprising the warning notification in the transparent display. The warning notification may include a visual notification in the transparent display to bring the attention of the user to the imminent threat. For example, the transparent display may display a layer of virtual flashing lights on the physical objects causing the threat (e.g., a physical motor may be flashing red to indicate that the physical motor is overheating) or a virtual arrow showing a direction of the imminent threat…The HMD 101 may determine and identify a potential threat to the user 102 based on the combination of HMD-based sensor data, user-based sensor data, physical object-based sensor data, and ambient-based sensor data. In one example embodiment, the HMD 101 receives preconfigured parameters (e.g., safe ranges, and safe thresholds for corresponding sensors) associated with a threat and performs the analysis locally on the HMD 101 by comparing the sensor-based data with the preconfigured parameters. If the HMD 101 determines that one or more of the sensor data matches one or more of the preconfigured parameters, the HMD 101 notifies the user 102 by generating an audio or visual alert in the HMD 101. 
The HMD 101 may further provide the user 102 with instructions on how to remedy or correct an operation on the physical objects 116, 118 to dissipate the threat. If HMD 101 determines that no action from the user 102 can dissipate the threat, the HMD 101 may cause a display of a virtual evacuation route or path in the transparent display of the HMD 101…” paragraphs 0023/0040). As to claim 5, Mullins teaches the helmet system of claim 1, wherein each of the first, second and third subsystems processes a different type of field data and provides the first, second and third derived data to the controller in response to detection of the associated first, second and third event triggers, respectively (HMD 101 may determine and identify a potential threat to the user 102 based on the combination of HMD-based sensor data, user-based sensor data, physical object-based sensor data, and ambient-based sensor data) (“…The HMD 101 may determine and identify a potential threat to the user 102 based on the combination of HMD-based sensor data, user-based sensor data, physical object-based sensor data, and ambient-based sensor data. In one example embodiment, the HMD 101 receives preconfigured parameters (e.g., safe ranges, and safe thresholds for corresponding sensors) associated with a threat and performs the analysis locally on the HMD 101 by comparing the sensor-based data with the preconfigured parameters. If the HMD 101 determines that one or more of the sensor data matches one or more of the preconfigured parameters, the HMD 101 notifies the user 102 by generating an audio or visual alert in the HMD 101. The HMD 101 may further provide the user 102 with instructions on how to remedy or correct an operation on the physical objects 116, 118 to dissipate the threat.
If HMD 101 determines that no action from the user 102 can dissipate the threat, the HMD 101 may cause a display of a virtual evacuation route or path in the transparent display of the HMD 101…” paragraph 0040). As to claim 6, Mullins teaches the helmet system of claim 5 wherein the controller provides commands (User 102) to modify an operating mode of one or more of the first, second and third subsystems (The HMD 101 may determine and identify a potential threat to the user 102 based on the combination of HMD-based sensor data, user-based sensor data, physical object-based sensor data, and ambient-based sensor data) in response to the first, second and third derived data, respectively (The HMD 101 may further provide the user 102 with instructions on how to remedy or correct an operation on the physical objects 116, 118 to dissipate the threat) (“…The threat application identifies a threat based on a threat pattern and the sensor data. The threat pattern includes, for example, preconfigured thresholds for the sensor data or a series of user activities and corresponding sensor data resulting from the user activities. The threat application generates a warning notification in response to detecting the threat. The threat application compares the sensor data with the threat pattern to determine a threat. The AR application causes a display of the AR content comprising the warning notification in the transparent display. The warning notification may include a visual notification in the transparent display to bring the attention of the user to the imminent threat. 
For example, the transparent display may display a layer of virtual flashing lights on the physical objects causing the threat (e.g., a physical motor may be flashing red to indicate that the physical motor is overheating) or a virtual arrow showing a direction of the imminent threat…The HMD 101 may determine and identify a potential threat to the user 102 based on the combination of HMD-based sensor data, user-based sensor data, physical object-based sensor data, and ambient-based sensor data. In one example embodiment, the HMD 101 receives preconfigured parameters (e.g., safe ranges, and safe thresholds for corresponding sensors) associated with a threat and performs the analysis locally on the HMD 101 by comparing the sensor-based data with the preconfigured parameters. If the HMD 101 determines that one or more of the sensor data matches one or more of the preconfigured parameters, the HMD 101 notifies the user 102 by generating an audio or visual alert in the HMD 101. The HMD 101 may further provide the user 102 with instructions on how to remedy or correct an operation on the physical objects 116, 118 to dissipate the threat. If HMD 101 determines that no action from the user 102 can dissipate the threat, the HMD 101 may cause a display of a virtual evacuation route or path in the transparent display of the HMD 101…” paragraphs 0023/0040). As to claim 7, Mullins teaches the helmet system of claim 5 wherein the controller provides augmented data to one or more of the first, second and third subsystems in response to detection of the first, second and third event triggers, respectively (AR application/warning notification) (“…The threat application identifies a threat based on a threat pattern and the sensor data. The threat pattern includes, for example, preconfigured thresholds for the sensor data or a series of user activities and corresponding sensor data resulting from the user activities. 
The threat application generates a warning notification in response to detecting the threat. The threat application compares the sensor data with the threat pattern to determine a threat. The AR application causes a display of the AR content comprising the warning notification in the transparent display. The warning notification may include a visual notification in the transparent display to bring the attention of the user to the imminent threat. For example, the transparent display may display a layer of virtual flashing lights on the physical objects causing the threat (e.g., a physical motor may be flashing red to indicate that the physical motor is overheating) or a virtual arrow showing a direction of the imminent threat…The HMD 101 may determine and identify a potential threat to the user 102 based on the combination of HMD-based sensor data, user-based sensor data, physical object-based sensor data, and ambient-based sensor data. In one example embodiment, the HMD 101 receives preconfigured parameters (e.g., safe ranges, and safe thresholds for corresponding sensors) associated with a threat and performs the analysis locally on the HMD 101 by comparing the sensor-based data with the preconfigured parameters. If the HMD 101 determines that one or more of the sensor data matches one or more of the preconfigured parameters, the HMD 101 notifies the user 102 by generating an audio or visual alert in the HMD 101. The HMD 101 may further provide the user 102 with instructions on how to remedy or correct an operation on the physical objects 116, 118 to dissipate the threat. If HMD 101 determines that no action from the user 102 can dissipate the threat, the HMD 101 may cause a display of a virtual evacuation route or path in the transparent display of the HMD 101…” paragraphs 0023/0040). 
As to claim 8, Mullins teaches the helmet system of claim 7 wherein the augmented data (AR application/warning notification) are related to potential threats and presented to a user via one or more of the first, second and third subsystems (“…The threat application identifies a threat based on a threat pattern and the sensor data. The threat pattern includes, for example, preconfigured thresholds for the sensor data or a series of user activities and corresponding sensor data resulting from the user activities. The threat application generates a warning notification in response to detecting the threat. The threat application compares the sensor data with the threat pattern to determine a threat. The AR application causes a display of the AR content comprising the warning notification in the transparent display. The warning notification may include a visual notification in the transparent display to bring the attention of the user to the imminent threat. For example, the transparent display may display a layer of virtual flashing lights on the physical objects causing the threat (e.g., a physical motor may be flashing red to indicate that the physical motor is overheating) or a virtual arrow showing a direction of the imminent threat…The HMD 101 may determine and identify a potential threat to the user 102 based on the combination of HMD-based sensor data, user-based sensor data, physical object-based sensor data, and ambient-based sensor data. In one example embodiment, the HMD 101 receives preconfigured parameters (e.g., safe ranges, and safe thresholds for corresponding sensors) associated with a threat and performs the analysis locally on the HMD 101 by comparing the sensor-based data with the preconfigured parameters. If the HMD 101 determines that one or more of the sensor data matches one or more of the preconfigured parameters, the HMD 101 notifies the user 102 by generating an audio or visual alert in the HMD 101. 
The HMD 101 may further provide the user 102 with instructions on how to remedy or correct an operation on the physical objects 116, 118 to dissipate the threat. If HMD 101 determines that no action from the user 102 can dissipate the threat, the HMD 101 may cause a display of a virtual evacuation route or path in the transparent display of the HMD 101…” paragraphs 0023/0040). As to claim 9, Mullins teaches the helmet system of claim 5 wherein each of the first, second and third subsystems provide an output to a user in response to detection of the first, second and third event triggers, respectively (AR application/warning notification) (“…The threat application identifies a threat based on a threat pattern and the sensor data. The threat pattern includes, for example, preconfigured thresholds for the sensor data or a series of user activities and corresponding sensor data resulting from the user activities. The threat application generates a warning notification in response to detecting the threat. The threat application compares the sensor data with the threat pattern to determine a threat. The AR application causes a display of the AR content comprising the warning notification in the transparent display. The warning notification may include a visual notification in the transparent display to bring the attention of the user to the imminent threat. For example, the transparent display may display a layer of virtual flashing lights on the physical objects causing the threat (e.g., a physical motor may be flashing red to indicate that the physical motor is overheating) or a virtual arrow showing a direction of the imminent threat…The HMD 101 may determine and identify a potential threat to the user 102 based on the combination of HMD-based sensor data, user-based sensor data, physical object-based sensor data, and ambient-based sensor data. 
In one example embodiment, the HMD 101 receives preconfigured parameters (e.g., safe ranges, and safe thresholds for corresponding sensors) associated with a threat and performs the analysis locally on the HMD 101 by comparing the sensor-based data with the preconfigured parameters. If the HMD 101 determines that one or more of the sensor data matches one or more of the preconfigured parameters, the HMD 101 notifies the user 102 by generating an audio or visual alert in the HMD 101. The HMD 101 may further provide the user 102 with instructions on how to remedy or correct an operation on the physical objects 116, 118 to dissipate the threat. If HMD 101 determines that no action from the user 102 can dissipate the threat, the HMD 101 may cause a display of a virtual evacuation route or path in the transparent display of the HMD 101…” paragraphs 0023/0040). As to claim 10, Mullins teaches the helmet system of claim 5 wherein the subsystems include three of an audio subsystem (voice speech recognition subsystem), a vision subsystem (other sensors, including infrared sensors and lighting and microphone(s), and various cameras and displays), a laser threat warning subsystem, an inertia measurement subsystem (an integrated inertial measuring unit (IMU)) and an ambient light subsystem (“…The HMD may contain a battery and receipt charging DC subsystem or, additionally or alternatively, an AC input and converter to connect directly to an AC source. The HMD may additionally or alternately contain a wired and/or wireless subsystems to connect or pair the device to other systems, such as sound, alert systems, fall monitoring systems, heart monitoring, other vital sign monitoring, and various APPs programs, cloud computing, and data storage. 
Other subsystems in the HMD may include a microphone/speaker and amplifier system, an integrated inertial measuring unit (IMU) containing a three axis accelerometer, a three axis gyroscope, a three axis magnetometer, an auxiliary port for custom sensors such as range finder, thermal camera, etc., GPS, SLAM sensor, gesturing sensor(s), infrared lights or cameras, brightness and color adjustment subsystem and control, network connectivity subsystem and controls, wire or wireless connectivity subsystem and controls, eye-tracking subsystem, gesture recognition subsystem, voice speech recognition subsystem, gyroscope, accelerometer, gagnetometer, obstacle avoidance subsystem, GPS, RFID subsystem and control, SLAM Sensors, other sensors, including infrared sensors and lighting and microphone(s), and various cameras and displays. In one embodiment of the invention, the hand gesturing subsystem may use RGB cameras or IR cameras with time-of-flight information to recognize 3D hand, finger, and arm gestures. This may also be accomplished by a combined gesture recognition subsystem like the Intel Realsense® chipset and may include coarse or fine tuning...” paragraph 0129). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Freeman and Bryan with the teaching of Mullins because the teaching of Mullins would improve the system of Freeman and Bryan by providing a technique to generate a threat pattern based on sensor data and notify the user of the threat. As to claim 11, see the rejection of claims 1 and 7 above. As to claim 14, see the rejection of claim 5 above. As to claims 15 and 20, see the rejection of claim 10 above. As to claim 16, see the rejection of claims 1 and 10 above. As to claim 17, see the rejection of claim 7 above. As to claim 18, see the rejection of claim 8 above. As to claim 19, see the rejection of claim 9 above.
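The Mullins mechanism this §102 section leans on (paragraph 0040: sensor readings compared against preconfigured safe ranges, with an audio or visual alert on a mismatch) amounts to a range check per sensor. A hypothetical sketch, with the sensor names and thresholds invented for illustration:

```python
# Hypothetical illustration only: sensor names and safe ranges are invented.
safe_ranges = {
    "engine_temp_c": (10.0, 95.0),
    "pressure_kpa": (80.0, 120.0),
}

def detect_threats(readings):
    """Return names of sensors whose readings fall outside their safe range."""
    return [
        name
        for name, value in readings.items()
        if not (safe_ranges[name][0] <= value <= safe_ranges[name][1])
    ]

# An out-of-range temperature triggers the alert path; pressure stays quiet.
print(detect_threats({"engine_temp_c": 104.2, "pressure_kpa": 101.3}))
# → ['engine_temp_c']
```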
Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 2, 3, 12 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Pub. No. 2016/0342840 A1 to Mullins et al. and further in view of U.S. Pub. No. 2018/0096261 A1 to Chu et al. As to claim 2, Mullins teaches the helmet system of claim 1; however, it is silent with reference to wherein the controller receives the third derived data and processes the first derived data, the second derived data and the third derived data in a machine learning model of the helmet system. Chu teaches wherein the controller receives the third derived data and processes the first derived data, the second derived data and the third derived data in a machine learning model (machine learning algorithms) of the helmet system (a collection of sensor data generated by multiple sensors/Steps 605-635) (“…Turning to the simplified flow diagram 600 of FIG. 6, an example technique for generating an anomaly detection model using an ensemble of unsupervised machine learning algorithms is illustrated. For instance, a collection of sensor data generated by multiple sensors may be accessed 605. For instance, the sensor data may be passed (e.g., as it is generated) to an anomaly detection model generator.
A set of feature vectors may be determined 610 from the sensor data and used in the execution 615 of an ensemble of unsupervised anomaly detection machine learning algorithms. Executing the ensemble of the unsupervised anomaly detection machine learning algorithms produces a collection of predictions for each of the set of feature vectors. These predictions may be used to determine 620 weightings (e.g., entropy-based weightings) for each of the unsupervised anomaly detection machine learning algorithms, which may be used, together with the predictions to generate pseudo labels from the predictions. These pseudo labels, unlike supervised labels, may represent a predicted ground truth and may stand in in the absence of actual supervised labels. A supervised machine learning algorithm may be provided with the set of pseudo labels as training data, and the supervised machine learning algorithm may be executed 630 to determine and generate 635 an anomaly detection model that may be used to detect anomalies in subsequent sensor data generated by the multiple sensors (or even other sensors similar to the multiple sensors (e.g., another deployment of a similar grouping of sensors)), among other examples…” paragraph 0067). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Mullins with the teaching of Chu because the teaching of Chu would improve the system of Mullins by providing a specific approach to artificial intelligence that learns from a dataset and recognizes patterns, improving performance over time through the learned and recognized patterns.
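Chu's cited technique (paragraph 0067: an ensemble of unsupervised anomaly detectors whose weighted predictions become pseudo labels for training a supervised model) can be sketched in miniature. Everything below is invented for illustration, and a plain union vote stands in for Chu's entropy-based weightings:

```python
import statistics

# Invented sensor stream: mostly steady readings plus two obvious outliers.
data = [10.1, 9.8, 10.3, 10.0, 9.9, 42.0, 10.2, 9.7, -15.0, 10.1]

# Two toy unsupervised detectors standing in for Chu's ensemble.
def zscore_flags(xs, k=2.0):
    mu, sd = statistics.mean(xs), statistics.stdev(xs)
    return [abs(x - mu) > k * sd for x in xs]

def mad_flags(xs, k=5.0):
    med = statistics.median(xs)
    mad = statistics.median(abs(x - med) for x in xs)
    return [abs(x - med) > k * mad for x in xs]

# Combine the ensemble's predictions into pseudo labels (a predicted ground
# truth); a simple union vote here, where Chu describes entropy-based weightings.
pseudo_labels = [any(v) for v in zip(zscore_flags(data), mad_flags(data))]

# A trivial "supervised" step: learn a safe band from the pseudo-labeled data,
# then use it to classify subsequent readings (the anomaly detection model).
normal = [x for x, y in zip(data, pseudo_labels) if not y]
model = (min(normal), max(normal))

def predict(x, band=model):
    return not (band[0] <= x <= band[1])

print([x for x, y in zip(data, pseudo_labels) if y])  # → [42.0, -15.0]
```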
As to claim 3, Mullins teaches the helmet system of claim 2 wherein the third derived data is related to a potential threat and presented to a user via the subsystem for a same time period as the first derived data and the second derived data (a display of the AR content comprising the warning notification in the transparent display) (“…The threat application identifies a threat based on a threat pattern and the sensor data. The threat pattern includes, for example, preconfigured thresholds for the sensor data or a series of user activities and corresponding sensor data resulting from the user activities. The threat application generates a warning notification in response to detecting the threat. The threat application compares the sensor data with the threat pattern to determine a threat. The AR application causes a display of the AR content comprising the warning notification in the transparent display. The warning notification may include a visual notification in the transparent display to bring the attention of the user to the imminent threat. For example, the transparent display may display a layer of virtual flashing lights on the physical objects causing the threat (e.g., a physical motor may be flashing red to indicate that the physical motor is overheating) or a virtual arrow showing a direction of the imminent threat…The HMD 101 may determine and identify a potential threat to the user 102 based on the combination of HMD-based sensor data, user-based sensor data, physical object-based sensor data, and ambient-based sensor data. In one example embodiment, the HMD 101 receives preconfigured parameters (e.g., safe ranges, and safe thresholds for corresponding sensors) associated with a threat and performs the analysis locally on the HMD 101 by comparing the sensor-based data with the preconfigured parameters.
If the HMD 101 determines that one or more of the sensor data matches one or more of the preconfigured parameters, the HMD 101 notifies the user 102 by generating an audio or visual alert in the HMD 101. The HMD 101 may further provide the user 102 with instructions on how to remedy or correct an operation on the physical objects 116, 118 to dissipate the threat. If HMD 101 determines that no action from the user 102 can dissipate the threat, the HMD 101 may cause a display of a virtual evacuation route or path in the transparent display of the HMD 101…” paragraphs 0023/0040). As to claim 12, see the rejection of claim 3 above. As to claim 13, see the rejection of claims 2 and 6 above.

Response to Arguments

Applicant’s arguments with respect to claims 1-20 have been considered but are moot because the new ground of rejection relies on different parts of the Mullins prior art because of the current amendment and additional reference not applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHARLES E ANYA whose telephone number is (571)272-3757. The examiner can normally be reached Mon-Fri, 9-6pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KEVIN YOUNG, can be reached at 571-270-3180. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /CHARLES E ANYA/Primary Examiner, Art Unit 2194

Prosecution Timeline

Nov 29, 2022
Application Filed
Jan 30, 2023
Response after Non-Final Action
May 09, 2025
Non-Final Rejection — §102, §103
Jul 28, 2025
Response Filed
Aug 23, 2025
Final Rejection — §102, §103
Nov 17, 2025
Request for Continued Examination
Nov 24, 2025
Response after Non-Final Action
Jan 21, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591471
KNOWLEDGE GRAPH REPRESENTATION OF CHANGES BETWEEN DIFFERENT VERSIONS OF APPLICATION PROGRAMMING INTERFACES
2y 5m to grant • Granted Mar 31, 2026
Patent 12591455
PARAMETER-BASED ADAPTIVE SCHEDULING OF JOBS
2y 5m to grant • Granted Mar 31, 2026
Patent 12585510
METHOD AND SYSTEM FOR AUTOMATED EVENT MANAGEMENT
2y 5m to grant • Granted Mar 24, 2026
Patent 12579014
METHOD AND A SYSTEM FOR PROCESSING USER EVENTS
2y 5m to grant • Granted Mar 17, 2026
Patent 12572393
CONTAINER CROSS-CLUSTER CAPACITY SCALING
2y 5m to grant • Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 82%
With Interview: 99% (+33.5%)
Median Time to Grant: 3y 2m
PTA Risk: High
Based on 891 resolved cases by this examiner. Grant probability derived from career allow rate.
