Prosecution Insights
Last updated: April 17, 2026
Application No. 17/770,835

Devices and Methods for Reducing Transmission of Pathogens

Final Rejection: §102, §103, §112

Filed: Apr 21, 2022
Examiner: COOPER, JONATHAN EPHRAIM
Art Unit: 3791
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: unknown
OA Round: 2 (Final)

Grant Probability: 46% (Moderate)
Estimated OA Rounds: 3-4
Estimated Time to Grant: 3y 5m
Grant Probability with Interview: 79%

Examiner Intelligence

Career Allow Rate: 46% (62 granted / 134 resolved; -23.7% vs TC avg)
Interview Lift: +32.5% for resolved cases with interview (a strong lift)
Avg Prosecution: 3y 5m (typical timeline)
Career History: 184 total applications across all art units; 50 currently pending

Statute-Specific Performance

§101: 17.7% (-22.3% vs TC avg)
§103: 41.6% (+1.6% vs TC avg)
§102: 14.2% (-25.8% vs TC avg)
§112: 23.9% (-16.1% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 134 resolved cases
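The headline figures above are related by simple arithmetic, and it can help to see them reconciled explicitly. The sketch below is only an illustration of how the numbers are assumed to fit together (the allow rate as granted over resolved cases, and the interview lift and TC-average deltas as percentage-point offsets from that rate); it is not the analytics provider's actual methodology.

```python
# Reconcile the dashboard's headline metrics. All inputs are taken from the
# cards above; the percentage-point interpretation of "lift" and "vs TC avg"
# is an assumption.

granted, resolved = 62, 134                 # career totals from the dashboard
career_allow_rate = granted / resolved      # ≈ 0.463, shown as 46%

interview_lift_pp = 32.5                    # "+32.5% Interview Lift"
with_interview = career_allow_rate * 100 + interview_lift_pp

tc_delta_pp = 23.7                          # "-23.7% vs TC avg"
implied_tc_average = career_allow_rate * 100 + tc_delta_pp

print(f"Career allow rate:       {career_allow_rate:.1%}")
print(f"Allow rate w/ interview: {with_interview:.1f}%")   # ≈ 78.8%, shown as 79%
print(f"Implied TC average:      {implied_tc_average:.1f}%")
```

Under this reading, the "79% With Interview" card is simply the 46.3% career rate plus the 32.5-point lift, rounded.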

Office Action

Rejection bases: §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant’s arguments, see page 8, filed 10/03/2025, with respect to the objection to Claims 6 and 11 have been fully considered and are persuasive. The objection to Claims 6 and 11 has been withdrawn. Applicant’s arguments, see page 9, filed 10/03/2025, with respect to the rejection of Claims 4 and 8-9 under 35 U.S.C. § 112(b) have been fully considered and are persuasive. The rejection of Claims 4 and 8-9 under 35 U.S.C. § 112(b) has been withdrawn. Applicant’s arguments, see page 9, filed 10/03/2025, with respect to the rejection of the claims under 35 U.S.C. § 101 have been fully considered and are persuasive. Claim 1 recites “a feedback mechanism comprising a haptic actuator controlled by the microcontroller and configured to generate a tactile response when the first hand approaches or crosses any of the one or more programmed spatial boundaries, thereby training the user to reduce touching of the user’s face and head”. The Examiner agrees with the applicant’s argument on page 9 of the filed response asserting this claim limitation modifies user behavior in a tangible way, thereby amounting to significantly more than merely an abstract idea. Claim 6 recites “activating the first feedback mechanism to generate the first tactile response to the user upon determining that the first hand has approached or crossed any of the one or more programmed spatial boundaries”. The Examiner agrees with the applicant’s argument on page 10 of the filed response asserting this claim limitation modifies user behavior in a tangible way, thereby amounting to significantly more than merely an abstract idea. Therefore, the rejection under 35 U.S.C. § 101 has been withdrawn. Applicant's arguments filed 10/03/2025 have been fully considered but they are not persuasive.
Regarding the rejection of the claims under 35 U.S.C. § 102, the applicant has argued “There is no disclosure in Hathorn of a feedback mechanism that operates in the manner, or for the purpose, as recited in claims 1 and 6”. Specifically, in the interview on 10/1/2025, the Office stated that Hathorn does not appear to disclose “wherein the wearable motion monitoring device operates on stored motion data and event-based triggering”. However, upon further consideration, the Office believes Hathorn does teach the broadest reasonable interpretation of this claimed limitation. See rejection below.

Claim Objections

Claims 1-14 are objected to because of the following informalities: The amended claims (Claims 1-14) in the claim set filed 10/03/2025 are low resolution and fuzzy, making them difficult to read/transcribe. In Claim 6, “a first transmitter operably connected to the first microcontroller, configured to transmit a first position information signal e upon a triggering condition” should read “a first transmitter operably connected to the first microcontroller, configured to transmit a first position information signal [[e]] upon a triggering condition”. In Claim 6, “e activating the first feedback mechanism to generate the first tactile response to the user upon determining that the first hand has approached or crossed any of the one or more programmed spatial boundaries” should read “[[e]] activating the first feedback mechanism to generate the first tactile response to the user upon determining that the first hand has approached or crossed any of the one or more programmed spatial boundaries”. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b): (b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention. The following is a quotation of 35 U.S.C.
112 (pre-AIA), second paragraph: The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 1 recites “wherein the wearable motion monitoring device operates on stored motion data and event-based triggering thereby reducing power consumption and enhancing feedback responsiveness”. It is unclear from what baseline the applicant is claiming power consumption is reduced, and from what baseline feedback responsiveness is enhanced. For the purposes of substantive examination, the claim limitation “thereby reducing power consumption and enhancing feedback responsiveness” will be considered an intended use limitation. Claims 2-5 are rejected by virtue of dependence on Claim 1.

Claim 6 recites “wherein the wearable motion monitoring device operates on stored motion data and event-based triggering thereby reducing power consumption and enhancing feedback responsiveness”. It is unclear from what baseline the applicant is claiming power consumption is reduced, and from what baseline feedback responsiveness is enhanced. For the purposes of substantive examination, the claim limitation “thereby reducing power consumption and enhancing feedback responsiveness” will be considered an intended use limitation. Claims 7-18 are rejected by virtue of dependence on Claim 6.

Claim 16 recites “thereby reducing latency, improving energy efficiency, and enhancing system independence”. It is unclear from what baseline(s) the applicant is claiming latency is reduced, energy efficiency is improved, and system independence is enhanced.
For the purposes of substantive examination, the claim limitation “thereby reducing latency, improving energy efficiency, and enhancing system independence” will be considered an intended use limitation.

Claim 18 recites “thereby reducing latency, improving energy efficiency, and enhancing system independence”. It is unclear from what baseline(s) the applicant is claiming latency is reduced, energy efficiency is improved, and system independence is enhanced. For the purposes of substantive examination, the claim limitation “thereby reducing latency, improving energy efficiency, and enhancing system independence” will be considered an intended use limitation.

The following is a quotation of 35 U.S.C. 112(d): (d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

The following is a quotation of pre-AIA 35 U.S.C. 112, fourth paragraph: Subject to the following paragraph [i.e., the fifth paragraph of pre-AIA 35 U.S.C. 112], a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

Claims 5 and 10 are rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Claim 5 recites “wherein the feedback mechanism is at least one of visual, audible, and haptic”.
This fails to further limit the subject matter of parent Claim 1, which recites “a feedback mechanism comprising a haptic actuator…”. Instead, Claim 5 broadens the claimed subject matter. Claim 10 recites “wherein the feedback mechanism is at least one of visual, audible, and haptic”. This fails to further limit the subject matter of parent Claim 6, which recites “a feedback mechanism comprising a first haptic actuator…”. Instead, Claim 10 broadens the claimed subject matter. Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-8, 10-13, 16, and 18 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Hathorn (US 20160140830 A1, cited in applicant’s IDS, hereinafter Hathorn). Regarding Claim 1, Hathorn discloses a wearable motion monitoring device (Elements 101 and 103, Figs. 1, and 5-9; Element 1500, Fig. 15), configured to be worn on or near a first hand of a user (See Figs.
5-9; Element 101 is worn on the left hand), the wearable motion device comprising: at least one multi-axis accelerometer mounted in a fixed position relative to the hand of the user, configured to generate motion data corresponding to movement of the first hand (“Hand modules 101 that are active may incorporate…accelerometers and gyroscopes to measure hand motion and orientation”, [0025]; to detect orientation in 3D space, the accelerometer must record motion in at least two axes); a microcontroller (Element 1504, Fig. 15) operably connected to the at least one multi-axis accelerometer (“the processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520”, [0052]), the microcontroller programmed with one or more spatial boundaries (“a processor configured to generate an alert when the hand module comes within a predefined range of the face module”, Abstract; “Proximity determining module 1518 may assess the proximity of a hand module to a face module. The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]) and configured to: receive and store the motion data from the at least one multi-axis accelerometer (“the processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520”, [0052]; “The processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520. The modules may be software modules running in the processor 1504, resident/stored in the computer readable medium/memory 1506, one or more hardware modules coupled to the processor 1504, or some combination thereof”, [0053]); compare the motion data to the one or more programmed spatial boundaries (“Proximity determining module 1518 may assess the proximity of a hand module to a face module.
The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]); and determine whether the motion data indicates approaching or crossing of the one or more programmed spatial boundaries (“Proximity determining module 1518 may assess the proximity of a hand module to a face module. The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]); a transmitter (Elements 1510-1511, Fig. 15) operably connected to the microcontroller (“The processing system may be coupled to a transceiver, a transmitter and/or a receiver 1510”, [0052]), configured to transmit a first position information signal ("GPS module 1512 may determine a geographic location of the hand module 1500. The determined location may be stored locally, e.g., in memory 1506 and/or may be transmitted to a user's computer via antenna 1511", [0053]) upon a triggering condition (“Clock module 1514 may maintain an accurate clock so that a time may be associated with the location determined by the GPS module. This location and time may be used in order to determine whether the user may have come into proximity of an infectious person. Alert Generating Module 1516 may generate an alert when a certain condition is met or when an assessed risk is above a threshold”, [0053]); and a feedback mechanism comprising a haptic actuator controlled by the microcontroller and configured to generate a tactile response (“As discussed supra, the alert may be visual, auditory, vibration, etc.”, [0053]) when the first hand approaches or crosses any of the one or more programmed spatial boundaries (“Proximity determining module 1518 may assess the proximity of a hand module to a face module. 
The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]), thereby training the user to reduce touching of the user’s face and head (“Aspects as presented herein incorporate wireless, wearable technologies designed to minimize inadvertent face-hand contact”, [0005]), wherein the wearable motion monitoring device operates on stored motion data and event-based triggering (“The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]; to trigger an alert upon the event of the hand module coming within a predefined distance, the module, which is a software module running in the processor 1504, and resident/stored in the computer readable medium/memory 1506 as taught in [0053], must have a stored predefined distance and at least short term motion tracking/memory) thereby reducing power consumption and enhancing feedback responsiveness (The software module 1518 of Hathorn would achieve this intended use functional language; see the rejection of this limitation under 35 U.S.C. § 112(b) above). Regarding Claim 2, Hathorn discloses the wearable motion monitoring device according to claim 1, wherein the monitoring device is configured to be worn on a wrist (See Fig. 5) or on one or more fingers (See Fig. 6). Regarding Claim 3, Hathorn discloses a wearable motion monitoring device according to claim 1, further comprising an interface for programming the microcontroller with the one or more programmed spatial boundaries (“The processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520.
The modules may be software modules running in the processor 1504, resident/stored in the computer readable medium/memory 1506, one or more hardware modules coupled to the processor 1504, or some combination thereof”, [0054]; element 1518 comprises the programmed boundaries, [0054]; therefore the boundaries must be programmed on the computer readable memory (see [0041]) element 1518 is stored on; therefore, the connections within device 1500 that connect to the computer readable medium (which can be “a random-access memory (RAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), compact disk ROM (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices”, [0041]) are interfaces for programming the microcontroller with the programmed boundaries). Regarding Claim 4, Hathorn discloses the wearable motion monitoring device according to claim 1, further comprising a receiver for receiving a second position information signal (“GPS module 1512 may determine a geographic location of the hand module 1500. The determined location may be stored locally, e.g., in memory 1506 and/or may be transmitted to a user's computer via antenna 1511”, [0053]; therefore, a receiver for receiving a position information signal must exist within the user’s computer) from a second monitoring device (Fig. 1; a second element 101 is worn on the right hand), wherein the second monitoring device is configured to be worn on or in proximity to a second hand (See Fig. 1). Regarding Claim 5, Hathorn discloses the wearable motion monitoring device according to claim 1, wherein the feedback mechanism is at least one of visual, audible, and haptic (“As discussed supra, the alert may be visual, auditory, vibration, etc.”, [0053]). 
Regarding Claim 6, Hathorn discloses a method for detecting hand motion and providing feedback (“A system, apparatus, method, and computer program product for encouraging prevention of hand-to-face contact are provided”, Abstract), the method comprising: a) affixing a first monitoring device (Element 101, Figs. 1, and 5-9; Element 1500, Fig 15) on or near a first hand of a user (See Figs. 5-9; Element 101 is worn on the left hand), the first wearable motion monitoring device comprising: at least one first multi-axis accelerometer mounted in a fixed position relative to the first hand of a user, configured to generate motion data corresponding to movement of the first hand (“Hand modules 101 that are active may incorporate…accelerometers and gyroscopes to measure hand motion and orientation”, [0025]; to detect orientation in 3D space, the accelerometer must record motion in at least two axes); a first microcontroller (Element 1504, Fig. 15) operably connected to the at least one first multi-axis accelerometer (“the processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520”, [0052]), the microcontroller programmed with one or more spatial boundaries (“a processor configured to generate an alert when the hand module comes within a predefined range of the face module”, Abstract; “Proximity determining module 1518 may assess the proximity of a hand module to a face module. The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]) and configured to: receive and store the motion data from the at least one first multi-axis accelerometer (“the processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520”, [0052]; “The processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520. 
The modules may be software modules running in the processor 1504, resident/stored in the computer readable medium/memory 1506, one or more hardware modules coupled to the processor 1504, or some combination thereof”, [0053]); compare the motion data to the one or more programmed spatial boundaries (“Proximity determining module 1518 may assess the proximity of a hand module to a face module. The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]); and determine whether the motion data indicates approaching or crossing the one or more programmed spatial boundaries (“Proximity determining module 1518 may assess the proximity of a hand module to a face module. The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]); a first transmitter (Elements 1510-1511, Fig. 15) operably connected to the first microcontroller (“The processing system may be coupled to a transceiver, a transmitter and/or a receiver 1510”, [0052]), configured to transmit a first position information signal (“GPS module 1512 may determine a geographic location of the hand module 1500. The determined location may be stored locally, e.g., in memory 1506 and/or may be transmitted to a user's computer via antenna 1511”, [0053]) upon a triggering condition (“Clock module 1514 may maintain an accurate clock so that a time may be associated with the location determined by the GPS module. This location and time may be used in order to determine whether the user may have come into proximity of an infectious person.
Alert Generating Module 1516 may generate an alert when a certain condition is met or when an assessed risk is above a threshold”, [0053]); a first feedback mechanism comprising a first haptic actuator, controlled by the microcontroller and configured to generate a tactile response (“As discussed supra, the alert may be visual, auditory, vibration, etc.”, [0053]) when the first hand approaches or crosses any of the one or more programmed spatial boundaries (“Proximity determining module 1518 may assess the proximity of a hand module to a face module. The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]), thereby training the user to reduce touching of the user’s face and head (“Aspects as presented herein incorporate wireless, wearable technologies designed to minimize inadvertent face-hand contact”, [0005]); b) collecting the motion data from the at least one first multi-axis accelerometer (“Hand modules 101 that are active may incorporate a power source, as well as additional sensors, such as accelerometers and gyroscopes to measure hand motion and orientation. 
Active hand modules may also be designed, for example, to transmit and/or receive information to and from a computer capable of storing and analyzing data”, [0025]); c) storing and processing the motion data with the first microcontroller to determine whether the first hand approaches or crosses any of the one or more programmed spatial boundaries (“Proximity determining module 1518 may assess the proximity of a hand module to a face module”, [0054]); and activating the first feedback mechanism to generate the first tactile response to the user upon determining that the first hand has approached or crossed any of the one or more programmed spatial boundaries (“The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]), wherein the first wearable motion monitoring device operates based on stored motion data and event-based triggering (“The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]; to trigger an alert upon the event of the hand module coming within a predefined distance, the module, which is a software module running in the processor 1504, and resident/stored in the computer readable medium/memory 1506 as taught in [0053], must have a stored predefined distance and at least short term motion tracking/memory), thereby reducing power consumption and enhancing feedback responsiveness (The software module 1518 of Hathorn would achieve this intended use functional language; see the rejection of this limitation under 35 U.S.C. § 112(b) above).
Regarding Claim 7, Hathorn discloses the method according to claim 6, wherein the first wearable motion monitoring device further comprises an interface, and wherein the method further comprises utilizing the interface to program the first microcontroller (“The processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520. The modules may be software modules running in the processor 1504, resident/stored in the computer readable medium/memory 1506, one or more hardware modules coupled to the processor 1504, or some combination thereof”, [0054]; element 1518 comprises the programmed boundaries, [0054]; therefore the boundaries must be programmed on the computer readable memory (see [0041]) element 1518 is stored on; therefore, the connections within device 1500 that connect to the computer readable medium (which can be a random-access memory (RAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), compact disk ROM (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, [0041]) are interfaces utilized to program the microcontroller with the programmed boundaries). Regarding Claim 8, Hathorn discloses the method according to claim 6, further comprising: a) affixing a second wearable motion monitoring device on or in proximity to a second hand of the user (See Figs. 5-9; a second element 101 is worn on the right hand), wherein the second wearable motion monitoring device comprises: at least one second multi-axis accelerometer that senses motion of the second hand (See Fig. 1; a second element 101 is worn on the right hand; “Hand modules 101 that are active may incorporate…accelerometers and gyroscopes to measure hand motion and orientation”, [0025]; to detect orientation in 3D space, the accelerometer must record motion in at least two axes); a second microcontroller (See Fig. 1; a second element 101 is worn on the right hand; Element 1504, Fig. 
15), in operable communication with the at least one second multi-axis accelerometer (“the processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520”, [0052]), programmed with the one or more spatial boundaries (“a processor configured to generate an alert when the hand module comes within a predefined range of the face module”, Abstract; “Proximity determining module 1518 may assess the proximity of a hand module to a face module. The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]) and that receives information from the at least one second multi-axis accelerometer pertaining to the motion of the second hand (“the processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520”, [0052]; “The processing system further includes at least one of the modules 1512, 1514, 1516, 1518, and 1520. The modules may be software modules running in the processor 1504, resident/stored in the computer readable medium/memory 1506, one or more hardware modules coupled to the processor 1504, or some combination thereof”, [0053]); a second transmitter (See Fig. 1; a second element 101 is worn on the right hand; Elements 1510-1511, Fig. 15; “FIG. 15 illustrates a number of example system components that might be included in a hardware implementation of a hand module 1500 in accordance with aspects presented herein”, [0051]), in operable communication with the second microcontroller (See Fig. 1; a second element 101 is worn on the right hand; “The processing system may be coupled to a transceiver, a transmitter and/or a receiver 1510”, [0052]), for transmitting a second position information signal pertaining to the location of the second wearable motion monitoring device ("GPS module 1512 may determine a geographic location of the hand module 1500. 
The determined location may be stored locally, e.g., in memory 1506 and/or may be transmitted to a user's computer via antenna 1511", [0053]); and a second feedback mechanism comprising a second haptic actuator, controlled by the second microcontroller and configured to generate a second tactile response (“As discussed supra, the alert may be visual, auditory, vibration, etc.”, [0053]) when the second hand approaches or passes any of the programmed spatial boundaries (“Proximity determining module 1518 may assess the proximity of a hand module to a face module. The assessed proximity may be used by the alert generating module 1516 in order to alert the user when the hand module comes within a predefined distance from the face module”, [0054]), and b) programming the second microcontroller of the second wearable motion monitoring device with the one or more programmed boundaries (“The modules may be software modules running in the processor 1504, resident/stored in the computer readable medium/memory 1506, one or more hardware modules coupled to the processor 1504, or some combination thereof”, [0054]; element 1518 comprises the programmed boundaries, [0054]; therefore the boundaries must be programmed on the computer readable memory (see [0041]) element 1518 is stored on), and wherein the user moves the second hand away from a programmed spatial boundary of the one or more programmed spatial boundaries in response to the second tactile response (“Such an alert may be made in order to help the user avoid hand to face contact”, [0054]). Regarding Claim 10, Hathorn discloses the method according to claim 6, wherein the first feedback mechanism provides a signal that is at least one of visual, audible, and haptic (“As discussed supra, the alert may be visual, auditory, vibration, etc.”, [0053]). 
Regarding Claim 11, Hathorn discloses the method according to claim 6, wherein the one or more programmed spatial boundaries excludes a space around the user’s face, such that motion of the first hand towards the user’s face causes the user’s feedback mechanism to generate a tactile response (“The system includes a hand module configured to be worn near a hand of at least one user, a face module configured to be worn in an area at chest level or above for at least one user, and a processor configured to generate an alert when the hand module comes within a predefined range of the face module”, Abstract; “Such an alert may be made in order to help the user avoid hand to face contact”, [0054]). Regarding Claim 12, Hathorn discloses the method according to claim 6, wherein the first wearable motion monitoring device is turned off during eating and/or drinking (“one or more of the modules may be equipped with an override mechanism that, once activated by the user, permits the hand 101 and face 103 modules to be within close proximity of one another (without triggering an alert)…The override function may be useful for example, in circumstances in which hand-face contact is i) acknowledged by the device user and ii) desired and/or necessary. Examples of such situations include eating meals and performing personal hygiene activities”, [0029]). Regarding Claim 13, Hathorn discloses the method according to claim 6, wherein the first wearable motion monitoring device is programmed not to generate the first tactile response when the first hand approaches a mouth of the user (“However, to allow for desired hand-face contact, one or more of the modules may be equipped with an override mechanism that, once activated by the user, permits the hand 101 and face 103 modules to be within close proximity of one another (without triggering an alert) for a set period of time”, [0029]).
Regarding Claim 16, Hathorn discloses the wearable motion monitoring device according to claim 1, wherein the wearable motion monitoring device is operable in both a standalone mode (“Aspects as presented herein incorporate wireless, wearable technologies designed to minimize inadvertent face-hand contact and encourage effective hand washing. Furthermore, additional aspects of the invention track each occurrence of hand-washing and inadvertent (and hopefully averted) face-hand contact. The resulting data may be transmitted via software on a computer or mobile device, for example, which identifies the time and location from which the signal originated. This information may then stored in memory, either on the local device…”, [0005]) and a paired mode with a second device, the paired mode enabling direct communication between the wearable motion monitoring device and the second device (“The transceiver 1510 provides a means for the hand module to communicate communicating with various other apparatus, such as the face module”, [0052]), thereby reducing latency, improving energy efficiency, and enhancing system independence (The system of Hathorn would achieve this intended use; see the rejection of this limitation under 35 U.S.C. § 112(b) above).

Regarding Claim 18, Hathorn discloses the method according to claim 6, wherein the first wearable motion monitoring device is operable in both a standalone mode (“Aspects as presented herein incorporate wireless, wearable technologies designed to minimize inadvertent face-hand contact and encourage effective hand washing. Furthermore, additional aspects of the invention track each occurrence of hand-washing and inadvertent (and hopefully averted) face-hand contact. The resulting data may be transmitted via software on a computer or mobile device, for example, which identifies the time and location from which the signal originated.
This information may then stored in memory, either on the local device…”, [0005]) and a paired mode with a second wearable motion monitoring device, the paired mode enabling direct communication between the first wearable motion monitoring device and the second wearable motion monitoring device (“The transceiver 1510 provides a means for the hand module to communicate communicating with various other apparatus, such as the face module”, [0052]), thereby reducing latency, improving energy efficiency, and enhancing system independence (The software module 1518 of Hathorn would achieve this intended use; see the rejection of this functional language under 35 U.S.C. § 112(b) above).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Hathorn in view of Davidson et al. (US 20170000389 A1, hereinafter Davidson).
Regarding Claim 9, Hathorn discloses the method according to claim 8. Hathorn discloses the claimed invention except for expressly disclosing wherein the second microcontroller in the second wearable motion monitoring device is further programmed to compare the information from the at least one second multi-axis accelerometer with the second position information signal received from the second monitoring device and determine when the first and second hands of the user are arranged in one or more pre-defined orientations, to prevent or inhibit the first tactile response from being provided by the first feedback mechanism and the second tactile response from being provided by the second feedback mechanism. However, Davidson teaches wherein the second microcontroller (Element 122, Fig. 1) in the second wearable motion monitoring device (Element 120, Fig. 1; “the computing device 120 may be implemented as a small mobile computing device that can be held, worn, or otherwise disposed about the subject such that the subject may participate in a series of motions without being inhibited”, [0066]) is further programmed to ([0064]) compare the information from the at least one second multi-axis accelerometer with the second position information signal received from the second monitoring device (“In some embodiments, the location data may be associated with timestamps that may be compared with when the IMUs 110 are attached to a subject to differentiate between times when the IMUs 110 may be attached to the subject and not attached to the subject”, [0041]) and determine when the first and second hands of the user are arranged in one or more pre-defined orientations (“absolute and relative location determinations of the IMUs 110 may be used to determine absolute and relative positions of the respective segments”, [0041]; these IMUs can also be in proximity to a second hand of the subject, see elements 210 of Fig. 2A).
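The suppression logic this combination is mapped to — compare each device's accelerometer reading against the orientation reported for the other hand, and mute both tactile responses when the pair matches a pre-defined orientation — might look like the sketch below. All names, the orientation set, and the tolerance are hypothetical assumptions for illustration; none are taken from Hathorn or Davidson.

```python
# Hypothetical sketch of orientation-based feedback suppression: both
# tactile responses are inhibited when the two hands' accelerometer
# (gravity) vectors match one of the pre-defined orientation pairs.
# Orientation values and the tolerance are illustrative assumptions.

PREDEFINED_ORIENTATIONS = [
    # (first-hand gravity vector, second-hand gravity vector), in g
    ((0.0, 0.0, 1.0), (0.0, 0.0, -1.0)),  # e.g., palms facing each other
]
TOLERANCE = 0.2  # allowed per-axis deviation (assumed)

def matches(vec, ref, tol=TOLERANCE):
    """True when every axis of vec is within tol of the reference."""
    return all(abs(v - r) <= tol for v, r in zip(vec, ref))

def inhibit_feedback(first_accel, second_accel):
    """True when the hands are arranged in a pre-defined orientation,
    so neither feedback mechanism should provide a tactile response."""
    return any(matches(first_accel, a) and matches(second_accel, b)
               for a, b in PREDEFINED_ORIENTATIONS)

# Readings close to the palms-together pair suppress both alerts.
print(inhibit_feedback((0.0, 0.05, 0.98), (0.0, -0.1, -1.02)))  # True
```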
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Hathorn, with the programming of Davidson (to prevent or inhibit the first tactile response from being provided by the first feedback mechanism and the second tactile response from being provided by the second feedback mechanism), because Davidson teaches this modification allows differentiation between times when the IMUs may be attached to the subject and not attached to the subject ([0041]), which provides the advantage of not providing unnecessary feedback when the device is not being worn.

Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Hathorn.

Regarding Claim 14, Hathorn discloses the method according to claim 12, wherein the first wearable motion monitoring device is turned off (“one or more of the modules may be equipped with an override mechanism”, [0029]) during eating and/or drinking (“The override function may be useful for example, in circumstances in which hand-face contact is i) acknowledged by the device user and ii) desired and/or necessary. Examples of such situations include eating meals”, [0029]). Hathorn discloses the claimed invention except for expressly disclosing wherein the first wearable motion monitoring device is on a dominant hand of the user when it is turned off during eating and/or drinking. However, it has been held that optimization within prior art conditions or through routine experimentation is obvious, and Hathorn clearly teaches a situation wherein only one module on one hand is turned off (“one or more of the modules may be equipped with an override mechanism”, [0029]). There are only two possible situations where this statement is true: 1) wherein only a device on the subject's dominant hand is turned off during eating and/or drinking, and 2) wherein only a device on the subject's non-dominant hand is turned off during eating and/or drinking.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Hathorn such that the first wearable motion monitoring device is on a dominant hand of the user when it is turned off during eating and/or drinking, as a matter of optimization within prior art conditions or through routine experimentation.

Claims 15 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Hathorn in view of Maples (US 20090125083 A1).

Regarding Claim 15, Hathorn discloses the wearable motion monitoring device according to claim 1. Hathorn discloses the claimed invention except for expressly disclosing wherein the wearable motion monitoring device is configured to increase output intensity as the hand nears the user's face. However, Maples teaches wherein the wearable motion monitoring device is configured to increase output intensity as the hand nears the user's face (“It should be understood that the intensity of the sensory stimulus triggered by the above-described devices can be made proportional to the degree of elbow flexion or the proximity of the hand to the face”, [0037]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Hathorn in view of Maples such that the wearable motion monitoring device is configured to increase output intensity as the hand nears the user's face, because this combats the user’s tendency to become sensitized to lower intensity stimuli (See Maples, [0037]).

Regarding Claim 17, Hathorn discloses the method according to claim 6. Hathorn discloses the claimed invention except for expressly disclosing wherein the first wearable motion monitoring device is configured to increase output intensity as the first hand nears the user's face.
However, Maples teaches wherein the first wearable motion monitoring device is configured to increase output intensity as the first hand nears the user's face (“It should be understood that the intensity of the sensory stimulus triggered by the above-described devices can be made proportional to the degree of elbow flexion or the proximity of the hand to the face”, [0037]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Hathorn in view of Maples such that the first wearable motion monitoring device is configured to increase output intensity as the first hand nears the user's face, because this combats the user’s tendency to become sensitized to lower intensity stimuli (See Maples, [0037]).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See Greenly (US 20180027908 A1). See Buser et al. (US 20190043384 A1).

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN EPHRAIM COOPER whose telephone number is (571) 272-2860. The examiner can normally be reached Monday-Friday 7:30AM-5:30PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jacqueline Cheng, can be reached at (571) 272-5596. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JONATHAN E. COOPER/
Examiner, Art Unit 3791

/JACQUELINE CHENG/
Supervisory Patent Examiner, Art Unit 3791

Prosecution Timeline

Apr 21, 2022
Application Filed
Jun 03, 2025
Non-Final Rejection — §102, §103, §112
Oct 01, 2025
Examiner Interview Summary
Oct 01, 2025
Applicant Interview (Telephonic)
Oct 03, 2025
Response Filed
Jan 10, 2026
Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12558001
MUSCLE FATIGUE DETERMINATION METHOD
2y 5m to grant Granted Feb 24, 2026
Patent 12543963
APPARATUS AND METHOD FOR ESTIMATING BIO-INFORMATION
2y 5m to grant Granted Feb 10, 2026
Patent 12538956
Footwear Having Sensor System
2y 5m to grant Granted Feb 03, 2026
Patent 12507905
DEVICE AND METHOD FOR REAL TIME ASSESSMENT AND MONITORING OF THORACIC FLUID, AIR TRAPPING AND VENTILATION
2y 5m to grant Granted Dec 30, 2025
Patent 12465246
SYSTEMS FOR PHYSIOLOGICAL CHARACTERISTIC MONITORING
2y 5m to grant Granted Nov 11, 2025
Based on this examiner's 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
46%
Grant Probability
79%
With Interview (+32.5%)
3y 5m
Median Time to Grant
Moderate
PTA Risk
Based on 134 resolved cases by this examiner. Grant probability derived from career allow rate.
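The projection figures above appear to follow directly from the examiner statistics shown earlier (62 granted of 134 resolved, plus the stated +32.5-point interview lift). A quick check of that arithmetic, assuming the page derives its numbers by simple division and addition (an assumption about its methodology, not documented by the page itself):

```python
# Arithmetic check of the displayed projections, assuming:
# - Grant Probability = career allow rate = granted / resolved
# - "With Interview" = allow rate + stated interview lift
# These formulas are assumptions about how the page computes its figures.

granted, resolved = 62, 134
allow_rate = granted / resolved * 100   # career allow rate, in percent
with_interview = allow_rate + 32.5      # add the +32.5-point interview lift

print(round(allow_rate))       # 46, matching the displayed 46%
print(round(with_interview))   # 79, matching the displayed 79%
```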
