Prosecution Insights
Last updated: April 19, 2026
Application No. 17/574,436

PERFORMANCE OF DIFFERENT ACTIONS AT DEVICE BASED ON SUCCESS OR FAILURE OF SUBSEQUENT AUTHENTICATION WITHIN THRESHOLD TIME AFTER REPEATED AUTHENTICATION FAILS

Final Rejection: §103, §112
Filed: Jan 12, 2022
Examiner: DHAKAD, RUPALI
Art Unit: 2437
Tech Center: 2400 (Computer Networks)
Assignee: LENOVO (SINGAPORE) PTE. LTD.
OA Round: 6 (Final)
Grant Probability: 39% (At Risk)
OA Rounds: 7-8
To Grant: 3y 6m
With Interview: 71%

Examiner Intelligence

Career Allow Rate: 39% (13 granted / 33 resolved; -18.6% vs TC avg)
Interview Lift: +31.2% for resolved cases with interview
Avg Prosecution: 3y 6m (typical timeline); 40 currently pending
Total Applications: 73 (career history, across all art units)

Statute-Specific Performance

§101: 13.0% (-27.0% vs TC avg)
§103: 56.1% (+16.1% vs TC avg)
§102: 9.1% (-30.9% vs TC avg)
§112: 20.0% (-20.0% vs TC avg)
TC avg = Tech Center average estimate • Based on career data from 33 resolved cases

Office Action

§103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claims 1, 12, and 18 are currently amended. Claims 2-6, 11, 13-17, and 19-38 have been cancelled. Claims 39-51 are newly added.

Response to Arguments

Applicant’s arguments, see pages 9-10, filed 11/13/2025, with respect to the rejection(s) of claim(s) 1, 12, and 18 under 35 U.S.C. § 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of newly found prior art. Applicant’s arguments, see pages 9-10, filed 11/13/2025, with respect to claims 12, 28, and 30 have been fully considered and are persuasive. The rejection under 35 U.S.C. § 112(a) of 10/01/2025 has been withdrawn.
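For orientation, the device logic recited in amended independent claim 1 (as characterized in this action) can be sketched as follows. This is a hypothetical, simplified illustration, not code from the application as filed; the activity names, authentication types, threshold values, and the 60-second fallback are all assumptions:

```python
# Hypothetical sketch of the claimed behavior: a dynamic threshold amount of
# time is selected based on the current activity and the type of
# authentication (standing in for the claimed relational database), and the
# device takes one of three different actions depending on whether
# authentication resumes within that threshold after an interruption.
# All names and values below are illustrative assumptions.

# (activity, authentication type) -> threshold in seconds
THRESHOLDS = {
    ("managing network infrastructure", "facial recognition"): 30,
    ("document editing", "fingerprint"): 120,
}

def select_threshold(activity: str, auth_type: str) -> int:
    # Select the dynamic threshold for the current activity/auth-type pair;
    # the 60-second default is an assumption, not from the claims.
    return THRESHOLDS.get((activity, auth_type), 60)

def handle_interruption(activity: str, auth_type: str,
                        reauthenticated: bool, elapsed_seconds: float) -> str:
    # Three-way split recited in the claim: a first action on interruption
    # without re-authentication, a second action if authentication resumes
    # within the dynamic threshold, a third action once it is exceeded.
    threshold = select_threshold(activity, auth_type)
    if not reauthenticated:
        return "first action"    # e.g., accept but do not persist changes
    if elapsed_seconds <= threshold:
        return "second action"   # e.g., save changes to persistent storage
    return "third action"        # e.g., require a different auth type / lock

print(handle_interruption("managing network infrastructure",
                          "facial recognition",
                          reauthenticated=True, elapsed_seconds=10.0))
```

The §103 analysis below maps the threshold selection to Lindemann, the within/beyond-threshold branching to Weidner, and the "managing network infrastructure" activity to Fallon.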
Claim Objections

Claims 1, 7-10, and 39-48 are objected to because of the following informalities: in the limitation of claim 1 reciting “select a dynamic threshold amount of time related to authentication failure, the selection of the dynamic threshold amount of time being based on one or more both of: an activity for which the device is currently being used as indicated in the relational database, at least one type of authentication to be used for authenticating a user while the user performs the activity as indicated in the relational database; wherein the activity is managing network infrastructure;” the “wherein” clause is improperly separated from the preceding clause by a semicolon instead of a comma. Appropriate correction is required, for example, by replacing the semicolon before “wherein the activity is managing network infrastructure” with a comma so that the “wherein” clause properly modifies the preceding limitation. Claims 7-10 and 39-48 depend on objected-to claim 1 and do not overcome the deficiencies noted in the objection to claim 1. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C.
112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1, 45, and 49 are rejected under 35 U.S.C. § 112(a) (pre-AIA 35 U.S.C. 112, first paragraph) for failing to comply with the written description requirement. The claims contain subject matter that is not described in the specification in a manner sufficient to reasonably convey to one skilled in the relevant art that the inventor(s), at the time the application was filed, had possession of the claimed invention. Independent claims 1, 45, and 49 recite, in relevant part, that “the activity is managing network infrastructure.” The phrase “managing network infrastructure,” when read broadly and in light of its ordinary meaning, encompasses a wide range of network-management activities (e.g., configuration, monitoring, update/patch management, security policy enforcement, routing, and other infrastructure-level operations) and therefore constitutes a broad functional genus.
However, the specification, while describing activities such as user authentication, repeated/continuous authentication, and user interactions with documents and graphical user interfaces on client devices, does not describe any network-infrastructure management activities or otherwise identify a set of species that are in fact “managing network infrastructure.”

The written description requirement for a broad functional genus is satisfied when the specification either (i) discloses a representative number of species within the claimed genus, or (ii) describes structural or other identifying characteristics common to members of the genus such that a person of ordinary skill can recognize that the inventor was in possession of the claimed genus as of the filing date. See MPEP § 2163 and § 2163.05; see also Ariad Pharm., Inc. v. Eli Lilly & Co., 598 F.3d 1336 (Fed. Cir. 2010) (en banc). Here, the specification does not disclose any concrete examples of “managing network infrastructure” as that term would be understood by a person of ordinary skill in the art, nor does it describe any common structural or functional characteristics that define such a genus in the context of the invention.

Accordingly, the specification fails to reasonably convey to those skilled in the art that the inventor had possession, as of the filing date, of the full breadth of the claimed activity of “managing network infrastructure.” The limitation “the activity is managing network infrastructure” therefore lacks adequate written description support.
Applicant is required to amend the claims to limit this limitation to subject matter that is supported by the specification (e.g., to specifically disclosed activities such as user document editing and associated authentication-gap handling), or to otherwise point out with particularity where and how the originally filed disclosure provides written description support for the full scope of “managing network infrastructure” as currently claimed. Note: prior-art references such as “Fallon” cannot supply missing written description; written description support must be found within the application as filed. Claims 7-10 and 39-48 depend on rejected claim 1 and do not overcome the deficiencies noted in the rejection of claim 1.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over Lindemann (U.S. PGPub. No. 2018/0041503 A1) (hereinafter “Lindemann”) in view of Weidner (U.S. PGPub. No. 2013/0104187 A1) (hereinafter “Weidner”), and further in view of Fallon et al. (U.S. Pat. No.
8,713,559 B2) (hereinafter “Fallon”).

Regarding claim 1, Lindemann teaches: at least one processor (Lindemann: [0621], The steps may be embodied in machine-executable instructions which cause a general-purpose or special-purpose processor to perform certain steps); and storage accessible to the at least one processor (Lindemann: [0622], Elements of the present invention may also be provided as a machine-readable medium for storing the machine-executable program code. The machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, CD-ROMs, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic program code) and comprising instructions executable by the at least one processor to (Lindemann: [0624], Embodiments of the invention may include various steps as set forth above. The steps may be embodied in machine-executable instructions which cause a general-purpose or special-purpose processor to perform certain steps): select a dynamic threshold amount of time related to authentication failure, the selection of the threshold amount of time (Lindemann: the examiner interprets implementing the authentication policy (i.e., the set of rules used to authenticate the devices) as the process of selecting the threshold and authentication techniques. See Fig. 3, “Timeout,” which changes dynamically if the user does not perform the authentication method assigned in the policy; this is also explained in the table cited after paragraph [0020]. [0048] FIG. 23 illustrates one embodiment of a system for adaptively applying an authentication policy; [0049] FIG. 24 illustrates one embodiment of a method for adaptively applying an authentication policy.
[0097], Assuming that the amount of time which has passed is within a particular threshold (e.g., 5 seconds, 5 minutes, 1 hour, etc.), the device may be considered to be in a “legitimate user state” and the assurance level set to a maximum value (e.g., 100 on a normalized scale of −100 to 100)) being based on both of: an activity for which the device is currently being used (Lindemann: [0096] One embodiment of the invention uses “normal” authentication techniques (e.g., swiping a finger, entering a code, etc.) in order to train the authentication system to recognize non-intrusive authentication situations (=non-authentication activity). [0184] Once the authentication policy engine 1710 selects a set of authentication techniques 1712, the authentication policy engine 1710 may implement the techniques using one or more explicit user authentication devices 1720-1721 and/or non-intrusive authentication techniques 1742-1743 to authenticate the user with a relying party 1750), at least one type of authentication to be used for authenticating a user (Lindemann: [0184] Once the authentication policy engine 1710 selects a set of authentication techniques 1712 (=type of authentication), the authentication policy engine 1710 may implement the techniques using one or more explicit user authentication devices 1720-1721 and/or non-intrusive authentication techniques 1742-1743 to authenticate the user with a relying party 1750. By way of example, and not limitation, the explicit user authentication 1720-1721 may include requiring the user to enter a secret code such as a PIN, fingerprint authentication, voice or facial recognition, and retinal scanning, to name a few) while the user performs the activity (Lindemann: [0096] One embodiment of the invention uses “normal” authentication techniques (e.g., swiping a finger, entering a code, etc.)
in order to train the authentication system to recognize non-intrusive authentication situations (=non-authentication activity)). Lindemann does not explicitly disclose: based on successful authentication resuming subsequent to the interruption but within the dynamic threshold amount of time, take at least a second action different from the first action; and based on the interruption exceeding the dynamic threshold amount of time, take at least a third action different from the first and second actions. However, in an analogous art, Weidner teaches: (Weidner: [0026] (1) active and unlocked 301: no authentication needed (user is currently considered authenticated)); based on successful authentication resuming subsequent to the interruption but within the dynamic threshold amount of time, take at least a second action different from the first action (Weidner: [0027] (2) screen off 302: no authentication needed (user still considered authenticated); a keyguard action such as a "drag to unlock" (=second action) gesture may be required to prevent accidental activation but is not considered an authentication method); and also reset an authentication failure timer (Weidner: [0052], The security application may monitor sounds using the device's microphone or other audio input. Based on the sounds, the device may increase or decrease the timeout periods (=reset timer). If the device recognizes a familiar sound pattern (e.g., the sound of a car or other environment, or the user's voice), it may apply a longer timeout period. On the other hand, if the device recognizes a known adverse sound (such as someone yelling the phrase "stop thief"), or if it detects sounds that it cannot recognize (such as multiple unrecognized voices, which may indicate that the phone is in a public place), it may apply a shorter timeout period); and based on the interruption exceeding the dynamic threshold amount of time, take at least a third action different from the first and second actions (Weidner: please see the table provided after para. [0020] and Fig. 3. [0024], Transitions between security states may occur automatically (without external input) upon the satisfaction of certain state transition conditions. For example, the passage of a threshold period of time 327-329 (represented by solid lines in FIG. 3) may automatically transition the device to the next higher security state. [0033] (2) State 302, elapsed time 5 minutes, transition to state 303). It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the invention, to modify Lindemann's method of selecting a time interval and one or more different types of authentication by applying an authentication policy with Weidner's method of performing actions if a response is not received within a certain threshold amount of time, as an improvement over leaving the device completely unprotected in an insecure security state during this interval (Weidner: [0021]). Lindemann in view of Weidner does not explicitly teach: based on identification of an interruption, wherein the activity is managing network infrastructure; the interruption identified as a sensor update being performed during which a sensor cannot be used for authentication. However, in an analogous art, Fallon teaches: wherein the activity is managing network infrastructure (Fallon: [Col 20, lines 20-24], (97) FIG. 10 and FIG. 11 show an example of the firmware update routine (=initiating firmware updates).
The process 1000 may include the acts of displaying a status screen to a user, transmitting the firmware file to the UPS, writing the firmware file to the UPS, and, if the update is successful, displaying a completion screen. [Col 20, lines 28-34], (98) In step 1001, in one example, while the firmware update routine continues, the firmware update utility 500 may display a status screen to the user. The status screen may have a graphic illustration of the progress. The status screen will show the process of the firmware update, and may visually indicate to the user the length of time elapsed since the start of the update and the length of the time to complete the update); based on identification of an interruption (Fallon: [Col 14, lines 54-65], the graphical illustration 601 may be a status bar representing the progress of a step in the process. The status bar may visually indicate to the user approximately the time the utility will take to perform a step of the firmware update process (=sensor updates as interruption). [Col 20, lines 28-34], while the firmware update routine continues, the firmware update utility 500 may display a status screen to the user. The status screen may have a graphic illustration of the progress. The status screen will show the process of the firmware update, and may visually indicate to the user the length of time elapsed since the start of the update and the length of the time to complete the update); and the interruption identified as a sensor update being performed during which a sensor cannot be used for authentication (Fallon: [Col 15, lines 55-63], (76) The firmware utility screen 600 may also contain one or more warning screens that may alert or advise the user 314 of a certain condition. FIG. 7 depicts an example of such a warning screen… Such warning screens may appear to the user 314 when the user 314 performs an action (=the system provides alerts when the user attempts an action that cannot be performed during firmware updates) that could potentially harm the UPS 10 or interrupt the firmware update process (=sensor updates) in a way that is harmful to the UPS). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Lindemann in view of Weidner by applying the well-known technique, as disclosed by Fallon, of initiating a firmware-update process on the computer system in order to install a new version of firmware. The motivation is to improve or add functionality to the device (Fallon: [Col 1, lines 21-22]).

Claims 39-44 and 46-48 are rejected under 35 U.S.C. 103 as being unpatentable over Lindemann (U.S. PGPub. No. 2018/0041503 A1) (hereinafter “Lindemann”) in view of Weidner (U.S. PGPub. No. 2013/0104187 A1) (hereinafter “Weidner”), further in view of Fallon et al. (U.S. Pat. No. 8,713,559 B2) (hereinafter “Fallon”), and further in view of Radulov et al. (U.S. Pat. No. 10,395,066 B1) (hereinafter “Radulov”).

Regarding claim 39, Lindemann in view of Weidner and Fallon teaches: The device of claim 1 (see rejection of claim 1 above). Lindemann in view of Weidner and Fallon does not explicitly disclose: wherein the sensor comprises a biometric sensor. However, in an analogous art, Radulov teaches: wherein the sensor comprises a biometric sensor (Radulov: [Col 6, lines 20-39], Means of inputting information include means for inputting visual information, such as a camera 104, and audio information, such as a microphone 105 (=sensors). The camera 104 or the microphone 105 may be built-in in the computing device 101, e.g. a camera and a microphone of a desktop or laptop computer.
The camera 104 and the microphone 105 may be external devices connected through a serial interface to the computing device 101). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Lindemann in view of Weidner and Fallon by applying the well-known technique, as disclosed by Radulov, of using a microphone and camera as biometric sensors. The motivation is to provide security and protection of computers or computer systems from unauthorized actions by controlling access to a microphone and/or a camera from software applications that request access to the function of a microphone and/or camera (Radulov: [Col 1, lines 17-21]).

Regarding claim 40, Lindemann in view of Weidner, Fallon, and Radulov teaches: The device of claim 39 (see rejection of claim 39 above), wherein the biometric sensor comprises a camera (Radulov: [Col 6, lines 31-36], Means of inputting information include means for inputting visual information, such as a camera 104, and audio information, such as a microphone 105. The camera 104 or the microphone 105 may be built-in in the computing device 101, e.g. a camera and a microphone of a desktop or laptop computer). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Lindemann in view of Weidner and Fallon by applying the well-known technique, as disclosed by Radulov, of using a microphone and camera as biometric sensors. The motivation is to provide security and protection of computers or computer systems from unauthorized actions by controlling access to a microphone and/or a camera from software applications that request access to the function of a microphone and/or camera (Radulov: [Col 1, lines 17-21]).

Regarding claim 41, Lindemann in view of Weidner, Fallon, and Radulov teaches: The device of claim 39 (see rejection of claim 39 above), wherein the biometric sensor comprises a microphone (Radulov: [Col 6, lines 31-36], Means of inputting information include means for inputting visual information, such as a camera 104, and audio information, such as a microphone 105. The camera 104 or the microphone 105 may be built-in in the computing device 101, e.g. a camera and a microphone of a desktop or laptop computer). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Lindemann in view of Weidner and Fallon by applying the well-known technique, as disclosed by Radulov, of using a microphone and camera as biometric sensors. The motivation is to provide security and protection of computers or computer systems from unauthorized actions by controlling access to a microphone and/or a camera from software applications that request access to the function of a microphone and/or camera (Radulov: [Col 1, lines 17-21]).

Regarding claim 42, this claim contains limitations identical to those of claim 39 above, albeit directed to a different statutory category (method). For this reason, the same grounds of rejection are applied to claim 42. Regarding claim 43, this claim contains limitations identical to those of claim 40 above, albeit directed to a different statutory category (method). For this reason, the same grounds of rejection are applied to claim 43. Regarding claim 44, this claim contains limitations identical to those of claim 41 above, albeit directed to a different statutory category (method). For this reason, the same grounds of rejection are applied to claim 44. Regarding claim 46, this claim contains limitations identical to those of claim 39 above, albeit directed to a different statutory category (CRSM).
For this reason, the same grounds of rejection are applied to claim 46. Regarding claim 47, this claim contains limitations identical to those of claim 40 above, albeit directed to a different statutory category (CRSM). For this reason, the same grounds of rejection are applied to claim 47. Regarding claim 48, this claim contains limitations identical to those of claim 41 above, albeit directed to a different statutory category (CRSM). For this reason, the same grounds of rejection are applied to claim 48.

Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Lindemann (U.S. PGPub. No. 2018/0041503 A1) (hereinafter “Lindemann”) in view of Weidner (U.S. PGPub. No. 2013/0104187 A1) (hereinafter “Weidner”) and Fallon et al. (U.S. Pat. No. 8,713,559 B2) (hereinafter “Fallon”), and further in view of Rehfeld (U.S. Pat. No. 8,054,496 B1) (hereinafter “Rehfeld”).

Regarding claim 7, Lindemann in view of Weidner and Fallon teaches: The method of claim 1 (see rejection of claim 1 above). The combination of Lindemann in view of Weidner and Fallon does not explicitly teach, but Rehfeld discloses: wherein the first action comprises one or more of: accepting changes to data but not saving the changes to persistent storage as part of a current version of the data (Rehfeld: [Col 2, lines 60-67], Compound document engine 108 generates a preview of compound document 106 that includes updates to placed document 104 (=current version of the document). This preview may be displayed to a user using a placed document application installed on device 100 or using another application installed on device 100.
In some embodiments, the preview generated by compound document engine 108 includes updates to placed document 104 even though the updates are not saved), accepting user actions taken in relation to the data but not saving the actions to persistent storage, indicating the changes in a first log separate from the data, indicating the actions in a second log separate from the data (Rehfeld: [Col 6, lines 8-15], (31) FIG. 4 is a flowchart illustrating an embodiment of generating a preview of an updated placed document in the context of a compound document. In the example shown, the placed document is updated in the placed document application at 400. For example, the update may be performed by a user who desires a preview of the compound document which includes the update to the placed document. In some embodiments, the updates to the placed document have not been saved). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Lindemann in view of Weidner and Fallon by applying the well-known technique, as disclosed by Rehfeld, of generating a preview of an unsaved document. The motivation is to improve methods of displaying previews of placed documents to users (Rehfeld: [Col 1, lines 14-15]).

Claims 8-10 are rejected under 35 U.S.C. 103 as being unpatentable over Lindemann (U.S. PGPub. No. 2018/0041503 A1) (hereinafter “Lindemann”) in view of Weidner (U.S. PGPub. No. 2013/0104187 A1) (hereinafter “Weidner”) and Fallon et al. (U.S. Pat. No. 8,713,559 B2) (hereinafter “Fallon”), further in view of Rehfeld (U.S. Pat. No. 8,054,496 B1) (hereinafter “Rehfeld”), and further in view of Chugunov (U.S. PGPub. No.
2022/0067177 A1) (hereinafter “Chugunov”).

Regarding claim 8, Lindemann in view of Weidner, Fallon, and Rehfeld teaches: The method of claim 7 (see rejection of claim 7 above), wherein the interruption is a first interruption, and wherein the second action comprises one or more of (Weidner: [0019], when a device is in an insecure state, after the passage of a threshold period of time or the occurrence of a threshold event the device may perform an automatic re-authentication process 109. The re-authentication process will automatically capture an authentication 115, and the device will be retained in the insecure state only if a result of the authentication process satisfies an insecure state re-authentication confirmation policy 127. In some such examples, the re-authentication can happen at any time, and does not require going through a secure state before re-authentication). It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the invention, to modify Lindemann's method of selecting a time interval and one or more different types of authentication by applying an authentication policy with Weidner's method of determining whether a response is received within a certain threshold amount of time, as an improvement over leaving the device completely unprotected in an insecure security state during this interval (Weidner: [0021]). Lindemann in view of Weidner, Fallon, and Rehfeld does not explicitly teach the remaining limitations. However, in an analogous art, Chugunov teaches: saving the changes to persistent storage as part of the current version of the data, saving the user actions to persistent storage (Chugunov: [0047], provides for saving the current version of the document 304, modifying the confidentiality mark, and the like. Each of these actions may be accompanied by a process of selecting a mark. If a mark has already been assigned to the document 304 and the functionality “Always select current mark” is activated in the server configurations, then the user 302 does not mark the document 304 during the actions listed above. Otherwise, the selection of a mark is an obligatory action for the user 302 before successful completion of the actions of printing, saving, or saving the document 304), indicating the first interruption in a third log separate from the data (Chugunov: [0030], provides that the file tracking module 118 may be configured to track, and automatically notify the server component 104 of, all actions performed by users with protected documents. The fact that a file has been opened may be communicated to the server component 104 with a mark. The fact that a document has been opened may be communicated by email with an embedded tracker. The fact and conditions of a leak of a document with encoded information may be determined when copies of this document are detected. In an aspect, the file tracking module 118 may separate into a special category any cases detected during tracking in which attempts are made to obtain access to prohibited documents, and may automatically notify the server component 104 of such events as a priority). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Lindemann in view of Weidner, Fallon, and Rehfeld by applying the well-known technique, as disclosed by Chugunov, of saving the current version of the document and tracking all actions performed by users via a file tracking module, in order to prevent data leakage by unauthorized users. The motivation is to prevent leaks of confidential information, prevent the distribution of stolen information, limit access to files, and identify a user who has permitted the unsanctioned dissemination of confidential information (Chugunov: [0006]).
Regarding claim 9, the Lindemann in view of Weidner, Fallon, Rehfeld and Chugunov teaches: The method of claim 8 (see rejection of claim 8 above), wherein the at least one type of authentication is a first type of authentication (Lindemann: [0258], FIG. 25 illustrates an exemplary client 2520 with a biometric device 2500 for performing facial recognition. When operated normally, a biometric sensor 2502 (e.g., a camera) reads raw biometric data from the user (e.g., snaps a photo of the user) and a feature extraction module 2503 extracts specified characteristics of the raw biometric data (e.g., focusing on certain facial features, etc.). An application 2505 may then use the score or yes/no result to determine whether the authentication was successful), and wherein the third action comprises one or more of prompting the user to authenticate themselves using second type of authentication different from the first type of authentication to save the changes (Lindemann: [0306], In one embodiment, the user's voice is captured via a microphone 2680 and the analog voice signal is converted to a digital signal via an analog to digital (A/D) converter 2681. The voice recognition module 2660 compares the digital voice signal to a voice print of the user stored within a voice database 2665. In one embodiment, the voice print is generated during a training/enrollment process in which the user is prompted to speak certain words or phrases). prompting the user to authenticate themselves using a second type of authentication different from the first type of the authentication to save the actions (Lindemann: [0307] In one embodiment, the user is prompted to speak a particular sequence of words and/or phrases displayed on the display 2601 of the client device. 
These may be the same words/phrases or similar words/phrases as those used during the enrollment process so that the voice recognition module 2660 can compare similar voice characteristics to those captured in the voice print) and prompting the user to authenticate themselves using a second type of authentication different from the first type of authentication to allow the user access to the device again (Lindemann: [0307] In one embodiment, the user is prompted to speak a particular sequence of words and/or phrases displayed on the display 2601 of the client device. These may be the same words/phrases or similar words/phrases as those used during the enrollment process so that the voice recognition module 2660 can compare similar voice characteristics to those captured in the voice print. [0383] If the authentication at 3504 is unsuccessful (e.g., because an acceptable assurance level was not reached), then at 3505, the transaction is denied and/or one or more additional authentication techniques may be required. For example, the user may be required to provide additional authentication using one or more additional techniques (e.g., entering a secret code if the initial authentication was a fingerprint, etc.). If the additional techniques are sufficient, determined at 3506, then the transaction is permitted at 3507. If not, then the transaction is again denied and/or additional authentication techniques are attempted), and providing the notification to the person regarding the first interruption, the person being different than the user (Lindemann: [0120], Turning to FIG. 5, if the assurance level transmitted to the relying party is acceptable for the current transaction with the user, determined at 501, then the relying party may send a response to the client device indicating a successful authentication. 
If not, then at 503, the relying party may send a response to the client indicating that additional authentication is needed (e.g., potentially explicit user authentication if non-intrusive authentication is insufficient)). The above-cited Lindemann does not explicitly disclose the remaining limitations. However, Weidner teaches (Weidner: [0024], Transitions between security states may occur automatically (without external input) upon the satisfaction of certain state transition conditions. For example the passage of a threshold period of time 327-329 (represented by solid lines in FIG. 3) may automatically transition the device to the next higher security state), locking the user out of the device (Weidner: [0020] As a more detailed example, consider an authentication policy set for a smartphone, where a user presses the smartphone's power button momentarily to turn off the screen and place the smartphone in a locked and secured security state). It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the invention, to modify Lindemann's method of selecting a time interval and one or more different types of authentication under an authentication policy by applying Weidner's method of performing actions when a response is not received within a certain threshold amount of time, in order to provide an improvement over leaving the device completely unprotected in an insecure security state during this interval (Weidner: [0021]). Lindemann in view of Weidner and Fallon does not explicitly disclose indicating in a fourth log. However, in an analogous art, Chugunov teaches indicating in a fourth log (Chugunov: [0037], provides that the recording module 140 may represent data about the use of documents and actions performed with them. Such data may be represented in different cross section. [0047], provides that each of these actions may be accompanied by a process of selecting a mark. 
If a mark has already been assigned to the document 304 and the functionality “Always select current mark” is activated in the server configurations, then the user 302 does not mark the document 304 during the actions listed above. Otherwise, the selection of a mark is an obligatory action for the user 302 before successful completion of the actions of printing, saving, or saving the document 304). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Lindemann in view of Weidner, Fallon, and Rehfeld by applying the well-known technique as disclosed by Chugunov of recording data about the use of documents and the actions performed on them, in order to keep track of the changes made in the documents. The motivation is to prevent leaks of confidential information, prevent the distribution of stolen information, limit access to files, and identify a user who has permitted the unsanctioned dissemination of confidential information (Chugunov: [0006]). Regarding claim 10, Lindemann in view of Weidner, Fallon, Rehfeld, and Chugunov teaches: The method of claim 9 (see rejection of claim 9 above), wherein the instructions are executable to one or more of: responsive to user authentication subsequent to performance of the third action (Lindemann: [0185] The non-intrusive authentication techniques 1742-1743 may include user behavior sensors 1742 which collect data related to user behavior (user’s actions) for authenticating the user. 
For example, the biometric gait of the user may be measured using an accelerometer or other type of sensor 1742 in combination with software and/or hardware designed to generate a gait “fingerprint” of the user's normal walking pattern) and using a type of authentication different from the first type of authentication, (Lindemann: [0383] If the authentication at 3504 is unsuccessful (e.g., because an acceptable assurance level was not reached), then at 3505, the transaction is denied and/or one or more additional authentication techniques may be required. For example, the user may be required to provide additional authentication using one or more additional techniques (e.g., entering a secret code if the initial authentication was a fingerprint, etc.). If the additional techniques are sufficient, determined at 3506, then the transaction is permitted at 3507. If not, then the transaction is again denied and/or additional authentication techniques are attempted). Lindemann in view of Weidner, Fallon, and Rehfeld does not explicitly disclose the remaining limitations. However, in an analogous art, Chugunov teaches (Chugunov: [0047], provides for editing the document 304 and saving amendments, saving the current version of the document 304, modifying the confidentiality mark, and the like), and save the actions to persistent storage (Chugunov: [0030], provides that the file tracking module 118 may be configured to track, and automatically notify the server component 104 of, all actions performed by users with protected documents. [0037], provides that the recording module 140 may be used to construct charts, create a physical document tree, and output data from logs into a customer's template. In other words, the recording module 140 may represent data about the use of documents and actions performed with them). 
A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Lindemann in view of Weidner, Fallon, and Rehfeld by applying the well-known technique as disclosed by Chugunov of recording data about the use of documents and the actions performed on them, in order to keep track of the changes made in the documents. The motivation is to prevent leaks of confidential information, prevent the distribution of stolen information, limit access to files, and identify a user who has permitted the unsanctioned dissemination of confidential information (Chugunov: [0006]). Claims 12 and 45 are rejected under 35 U.S.C. 103 as being unpatentable over McLachlan et al. (U.S. PGPub. No. 2024/0028766 A1) (hereinafter “McLachlan”) in view of Grigg et al. (U.S. PGPub. No. 2015/0227730 A1) (hereinafter “Grigg”) and Weidner (U.S. PGPub. No. 2013/0104187 A1) (hereinafter “Weidner”), and further in view of Fallon et al. (U.S. Pat. No. 8,713,559 B2) (hereinafter “Fallon”). Regarding claim 12, McLachlan teaches: selecting a dynamic threshold amount of time, the selecting of the dynamic threshold amount of time being based on one or more of: (McLachlan: [0089] FIG. 3 illustrates an exemplary user interface 310 on a UE for facilitating data sharing time limits. Using user interface 310, an end user can select how long data uploaded into the edge cloud is preserved for different use cases within the same entity (e.g., commercial requestor). In this case, the end user has granted the entity named “Mobile Phone Service Provider” with access to advertising data for 12 hours (indicated by selector 320), but to billing data for 30 days (indicated by selector 322). Additionally, the entity has been granted access to service optimization data for 5 days (indicated by selector 324), and media data for 0 days (e.g., immediate deletion) (indicated by selector 326). 
Exemplary user interface 310 includes user identifier 112, previous screen icon 314, previous screen title 316 (“Data Sharing Time Limits”), current screen title 318 (“Mobile Phone Service Provider Data Sharing Time Limits”), and time limit selector 328. In this example, time limit selector 328 can be used to adjust the time limit (=selecting dynamic threshold amount of time) by vertically scrolling values in respective boxes for increments of days, hours, and/or minutes and accepting the selected time limit using the appropriate icon (labeled “Accept” in this example). Any other appropriate time selection interfaces or entry fields can be used). McLachlan does not explicitly disclose: an activity for which a device is currently being used, at least one type of authentication to be used for authenticating a user while the user performs the activity. However, Grigg teaches: an activity for which a device is currently being used, at least one type of authentication to be used for authenticating a user while the user performs the activity (Grigg: [0061], the authentication types may include a username, a password, a personal identification number, biometric data, or the like. [0068], In another aspect, the user may choose to not have access to application functions (=type of activity to perform for selected time period) even with valid authentication credentials associated with the determined authentication level during a particular time period defined by a first time stamp and a second time stamp. For example, the user may be an online shopping enthusiast who is usually tempted to shop for products and services immediately after work and before beginning household chores. In such situations, the user may choose a particular time period for limited access to application functions (=type of activity to perform for selected time period) in an attempt to curtail expenses and budget better. 
In one aspect, a user selection of one or more application functions may be selecting one or more check boxes (=first selector) corresponding to the one or more application functions presented to the user. In another aspect, a user selection of one or more application functions may include a selection of one or more application functions from a drop down box (=second selector). In yet another aspect, a user selection of one or more application functions may include an option to save the selection. In some embodiments, the saved selection may be saved in memory to enable the user to choose the saved option directly instead of having to go through the process of selecting the one or more application functions again); It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the invention, to modify McLachlan’s method of selecting a time limit using a time selector by applying Grigg’s method of performing multi-step verification, in which the user selects a type of authentication and the activities to perform. The motivation is to prevent fraud and theft of services (Grigg: [0001]). McLachlan in view of Grigg does not explicitly disclose: based on an interruption that prevents repeated authentication not exceeding the dynamic threshold amount of time, taking at least a first action; based on the interruption exceeding the dynamic threshold amount of time, taking at least a second action different from the first action. However, in an analogous art, Weidner teaches: based on an interruption that prevents repeated authentication not exceeding the dynamic threshold amount of time, taking at least a first action (Weidner: [0019] In some further examples, when a device is in an insecure state, after the passage of a threshold period of time or the occurrence of a threshold event the device may perform an automatic re-authentication process 109. 
The re-authentication process will automatically capture an authentication 115, and the device will be retained in the insecure state only if a result of the authentication process satisfies an insecure state re-authentication confirmation policy 127. In some such examples, the re-authentication can happen at any time, and does not require going through a secure state before re-authentication); based on the interruption exceeding the dynamic threshold amount of time, taking at least a second action different from the first action (Weidner: see the table provided after para. [0020] and Fig. 3. [0024], Transitions between security states may occur automatically (without external input) upon the satisfaction of certain state transition conditions. For example the passage of a threshold period of time 327-329 (represented by solid lines in FIG. 3) may automatically transition the device to the next higher security state. [0033] (2) State 302, elapsed time 5 minutes, transition to state 303). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify McLachlan in view of Grigg by applying the well-known technique, as disclosed by Weidner, of determining whether a response is received within a certain threshold amount of time, in order to provide an improvement over leaving the device completely unprotected in an insecure security state during this interval (Weidner: [0021]). the interruption identified as a sensor update being performed (Fallon: [Col 15, lines 55-63], (76) The firmware utility screen 600 may also contain one or more warning screens that may alert or advise the user 314 of a certain condition. FIG. 
7 depicts an example of such a warning screen…Such warning screens may appear to the user 314 when the user 314 performs an action (=system provides alerts when user is trying to perform action which cannot be performed during firmware updates) that could potentially harm the UPS 10 or interrupt the firmware update process (=sensor updates) in a way that is harmful to the UPS). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify McLachlan in view of Grigg and Weidner by applying the well-known technique as disclosed by Fallon of initiating a firmware update process on the computer system in order to install a new version of firmware. The motivation is to improve or add functionality to the device (Fallon: [Col 1, lines 21-22]). Regarding claim 45, McLachlan in view of Grigg, Weidner, and Fallon teaches: The method of claim 12 (see rejection of claim 12 above), wherein the dynamic threshold amount of time is selected (McLachlan: [0089] FIG. 3 illustrates an exemplary user interface 310 on a UE for facilitating data sharing time limits. Using user interface 310, an end user can select how long data uploaded into the edge cloud is preserved for different use cases within the same entity (e.g., commercial requestor). In this case, the end user has granted the entity named “Mobile Phone Service Provider” with access to advertising data for 12 hours (indicated by selector 320), but to billing data for 30 days (indicated by selector 322). Additionally, the entity has been granted access to service optimization data for 5 days (indicated by selector 324), and media data for 0 days (e.g., immediate deletion) (indicated by selector 326). Exemplary user interface 310 includes user identifier 112, previous screen icon 314, previous screen title 316 (“Data Sharing Time Limits”), current screen title 318 (“Mobile Phone Service Provider Data Sharing Time Limits”), and time limit selector 328. 
In this example, time limit selector 328 can be used to adjust the time limit by vertically scrolling values in respective boxes for increments of days, hours, and/or minutes and accepting the selected time limit using the appropriate icon (labeled “Accept” in this example). Any other appropriate time selection interfaces or entry fields can be used), based at least on the activity for which the device is currently being used (Grigg: [0061], the authentication types may include a username, a password, a personal identification number, biometric data, or the like. [0068], In another aspect, the user may choose to not have access to application functions (=type of activity to perform for selected time period) even with valid authentication credentials associated with the determined authentication level during a particular time period defined by a first time stamp and a second time stamp. For example, the user may be an online shopping enthusiast who is usually tempted to shop for products and services immediately after work and before beginning household chores. In such situations, the user may choose a particular time period for limited access to application functions (=type of activity to perform for selected time period) in an attempt to curtail expenses and budget better. In one aspect, a user selection of one or more application functions may be selecting one or more check boxes (=first selector) corresponding to the one or more application functions presented to the user. In another aspect, a user selection of one or more application functions may include a selection of one or more application functions from a drop down box (=second selector). In yet another aspect, a user selection of one or more application functions may include an option to save the selection. 
In some embodiments, the saved selection may be saved in memory to enable the user to choose the saved option directly instead of having to go through the process of selecting the one or more application functions again); the activity being management of network infrastructure (Fallon: [Col 20, lines 20-24], (97) FIG. 10 and FIG. 11 show an example of the firmware update routine (=initiating firmware updates). The process 1000 may include the acts of displaying a status screen to a user, transmitting firmware file to UPS, writing the firmware file to UPS, if the update is successful displaying a completion screen, [Col 20, lines 28-34], (98) In step 1001, in one example, while the firmware update routine continues, the firmware update utility 500 may display a status screen to the user. The status screen may have a graphic illustration of the progress. The status screen will show the process of the firmware update, and may visually indicate to the user the length of time elapsed since the start of the update and the length of the time to complete the update); Claims 18 and 49-50 are rejected under 35 U.S.C. 103 as being unpatentable over Grigg et al. (U.S. PGPub. No. 2015/0227730 A1) (hereinafter “Grigg”) in view of McLachlan et al. (U.S. PGPub. No. 2024/0028766 A1) (hereinafter “McLachlan”), Weidner (U.S. PGPub. No. 2013/0104187 A1) (hereinafter “Weidner”), and further in view of Fallon et al. (U.S. Pat. No. 
8,713,559 B2) (hereinafter “Fallon”). Regarding claim 18, Grigg teaches: At least one computer readable storage medium (CRSM) (Grigg: [0073], Furthermore, embodiments of the present invention may take the form of a computer program product that includes a computer-readable storage medium having one or more computer-executable program code portions stored therein) that is not a transitory signal, the computer readable storage medium comprising instructions executable by at least one processor to (Grigg: [0074], The computer-readable medium may include, but is not limited to, a non-transitory computer-readable medium, such as a tangible electronic, magnetic, optical, electromagnetic, infrared, and/or semiconductor system, device, and/or other apparatus. For example, in some embodiments, the non-transitory computer-readable medium includes a tangible medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a compact disc read-only memory (CD-ROM), and/or some other tangible optical and/or magnetic storage device) (Grigg: [0065] In one aspect, the user selected preference may include a time duration. The time duration is defined by a user selection of a first time stamp and a second time stamp. In one aspect, the first time stamp and the second time stamp may define the boundaries of application of the level of authentication); Grigg does not explicitly disclose: select a particular threshold amount of time; select dynamic threshold amounts of time. However, in an analogous art, McLachlan teaches: select a particular threshold amount of time (McLachlan: [0089] FIG. 3 illustrates an exemplary user interface 310 on a UE for facilitating data sharing time limits. Using user interface 310, an end user can select how long data uploaded into the edge cloud is preserved for different use cases within the same entity (e.g., commercial requestor). 
In this case, the end user has granted the entity named “Mobile Phone Service Provider” with access to advertising data for 12 hours (indicated by selector 320), but to billing data for 30 days (indicated by selector 322). Additionally, the entity has been granted access to service optimization data for 5 days (indicated by selector 324), and media data for 0 days (e.g., immediate deletion) (indicated by selector 326). Exemplary user interface 310 includes user identifier 112, previous screen icon 314, previous screen title 316 (“Data Sharing Time Limits”), current screen title 318 (“Mobile Phone Service Provider Data Sharing Time Limits”), and time limit selector 328. In this example, time limit selector 328 can be used to adjust the time limit by vertically scrolling values in respective boxes for increments of days, hours, and/or minutes and accepting the selected time limit using the appropriate icon (labeled “Accept” in this example). Any other appropriate time selection interfaces or entry fields can be used), select dynamic threshold amounts of time (McLachlan: [0089] FIG. 3 illustrates an exemplary user interface 310 on a UE for facilitating data sharing time limits. Using user interface 310, an end user can select how long data uploaded into the edge cloud is preserved for different use cases within the same entity (e.g., commercial requestor). In this case, the end user has granted the entity named “Mobile Phone Service Provider” with access to advertising data for 12 hours (indicated by selector 320), but to billing data for 30 days (indicated by selector 322). Additionally, the entity has been granted access to service optimization data for 5 days (indicated by selector 324), and media data for 0 days (e.g., immediate deletion) (indicated by selector 326). 
Exemplary user interface 310 includes user identifier 112, previous screen icon 314, previous screen title 316 (“Data Sharing Time Limits”), current screen title 318 (“Mobile Phone Service Provider Data Sharing Time Limits”), and time limit selector 328. In this example, time limit selector 328 can be used to adjust the time limit by vertically scrolling values in respective boxes for increments of days, hours, and/or minutes and accepting the selected time limit using the appropriate icon (labeled “Accept” in this example). Any other appropriate time selection interfaces or entry fields can be used). It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the invention, to modify Grigg’s method of performing multi-step verification, in which the user selects a type of authentication and the activities to perform, by applying McLachlan’s method of selecting a time using a time limit selector, in order to facilitate time-controlled data for preserving the privacy of data sources (McLachlan: [Abstract]). Grigg in view of McLachlan does not explicitly disclose: in the first instance, based on an interruption that prevents repeated authentication not exceeding the particular threshold amount of time, take at least a first action; in the first instance, based on the interruption exceeding the dynamic threshold amount of time, take at least a second action different from the first action. However, in an analogous art, Weidner teaches (Weidner: [0019] In some further examples, when a device is in an insecure state, after the passage of a threshold period of time or the occurrence of a threshold event the device may perform an automatic re-authentication process 109. 
The re-authentication process will automatically capture an authentication 115, and the device will be retained in the insecure state only if a result of the authentication process satisfies an insecure state re-authentication confirmation policy 127. In some such examples, the re-authentication can happen at any time, and does not require going through a secure state before re-authentication); based on the interruption exceeding the dynamic threshold amount of time, take at least a second action different from the first action (Weidner: see the table provided after para. [0020] and Fig. 3. [0024], Transitions between security states may occur automatically (without external input) upon the satisfaction of certain state transition conditions. For example the passage of a threshold period of time 327-329 (represented by solid lines in FIG. 3) may automatically transition the device to the next higher security state. [0033] (2) State 302, elapsed time 5 minutes, transition to state 303). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Grigg in view of McLachlan by applying the well-known technique as disclosed by Weidner of performing the re-authentication process after the passage of a threshold period of time. The motivation is to provide an improvement over leaving the device completely unprotected in an insecure security state during this interval (Weidner: [0021]). Grigg in view of McLachlan and Weidner does not explicitly disclose: based on identification of an interruption, the interruption identified as a sensor update being performed. However, in an analogous art, Fallon teaches: based on identification of an interruption (Fallon: [Col 14, lines 54-65], the graphical illustration 601 may be a status bar representing the progress of a step in the process. 
The status bar may visually indicate to the user approximately the time the utility will take to perform a step of the firmware update process (=sensor updates as interruption). [Col 20, lines 28-34], while the firmware update routine continues, the firmware update utility 500 may display a status screen to the user. The status screen may have a graphic illustration of the progress. The status screen will show the process of the firmware update, and may visually indicate to the user the length of time elapsed since the start of the update and the length of the time to complete the update) the interruption identified as a sensor update being performed (Fallon: [Col 15, lines 55-63], (76) The firmware utility screen 600 may also contain one or more warning screens that may alert or advise the user 314 of a certain condition. FIG. 7 depicts an example of such a warning screen…Such warning screens may appear to the user 314 when the user 314 performs an action (=system provides alerts when user is trying to perform action which cannot be performed during firmware updates) that could potentially harm the UPS 10 or interrupt the firmware update process (=sensor updates) in a way that is harmful to the UPS). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Grigg in view of McLachlan and Weidner by applying the well-known technique as disclosed by Fallon of initiating a firmware update process on the computer system in order to install a new version of firmware. The motivation is to improve or add functionality to the device (Fallon: [Col 1, lines 21-22]). Regarding claim 49, Grigg in view of McLachlan, Weidner, and Fallon teaches: The CRSM of claim 18 (see rejection of claim 18 above), wherein the particular threshold amount of time is selected based on (Grigg: [0065] In one aspect, the user selected preference may include a time duration. 
The time duration is defined by a user selection of a first time stamp and a second time stamp. In one aspect, the first time stamp and the second time stamp may define the boundaries of application of the level of authentication), the activity being management of network infrastructure (Fallon: [Col 20, lines 20-24], (97) FIG. 10 and FIG. 11 show an example of the firmware update routine (=initiating firmware updates). The process 1000 may include the acts of displaying a status screen to a user, transmitting firmware file to UPS, writing the firmware file to UPS, if the update is successful displaying a completion screen, [Col 20, lines 28-34], (98) In step 1001, in one example, while the firmware update routine continues, the firmware update utility 500 may display a status screen to the user. The status screen may have a graphic illustration of the progress. The status screen will show the process of the firmware update, and may visually indicate to the user the length of time elapsed since the start of the update and the length of the time to complete the update); Regarding claim 50, Grigg in view of McLachlan, Weidner, and Fallon teaches: The CRSM of claim 18 (see rejection of claim 18 above), wherein the activity is changing network configuration settings (Fallon: [Col 20, lines 20-24], (97) FIG. 10 and FIG. 11 show an example of the firmware update routine (=initiating firmware updates). The process 1000 may include the acts of displaying a status screen to a user, transmitting firmware file to UPS, writing the firmware file to UPS, if the update is successful displaying a completion screen, [Col 20, lines 28-34], (98) In step 1001, in one example, while the firmware update routine continues, the firmware update utility 500 may display a status screen to the user. The status screen may have a graphic illustration of the progress. 
The status screen will show the process of the firmware update, and may visually indicate to the user the length of time elapsed since the start of the update and the length of the time to complete the update.) Claim 51 is rejected under 35 U.S.C. 103 as being unpatentable over Grigg et al. (U.S. PGPub. No. 2015/0227730 A1) (hereinafter “Grigg”) in view of McLachlan et al. (U.S. PGPub. No. 2024/0028766 A1) (hereinafter “McLachlan”), and further in view of Fallon et al. (U.S. Pat. No. 8,713,559 B2) (hereinafter “Fallon”) and Radulov. Regarding claim 51, Grigg in view of McLachlan, Weidner, and Fallon teaches: The CRSM of claim 18 (see rejection of claim 18 above), wherein the activity is accessing sensitive device hardware (Radulov: [Col 5, lines 24-25], any process (=activity) requesting access to the camera (=sensitive device hardware) through the operating system). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Grigg in view of McLachlan, Weidner, and Fallon by applying the well-known technique as disclosed by Radulov of detecting any process requesting access to the camera. The motivation is to provide security and protection of computers or computer systems from unauthorized actions by controlling access to a microphone and/or a camera from software applications that request access to the function of a microphone and/or camera (Radulov: [Col 1, lines 17-21]). Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Refer to PTO-892, Notice of References Cited, for a listing of analogous art. Le et al. (U.S. Pat. No. 11,275,576 B2): Techniques are provided for updating firmware of an accessory device. An accessory development kit of the accessory device can communicate with an accessory update daemon using a home management daemon of a controller device. 
Based on a firmware update policy of the accessory device, the accessory update daemon will check for firmware updates. When firmware updates are available, the accessory update daemon can instruct the home management daemon to stage the update. The home management daemon will notify the accessory development kit to be in a stage mode. The accessory update daemon will download the firmware update and send the firmware update to the accessory development kit of the accessory device using an interface for the secure channel provided by the home management daemon. The accessory device can be a third-party accessory device that does not have its own firmware updating application.

Boettcher et al. (U.S. PGPub. No.): A wearable device can establish a verified session with a host device. For example, if a user wearing the wearable device unlocks the host device (e.g., by entering a credential), the devices can create a verified session, which can persist across lock and unlock events at the host device. For the duration of the verified session, a host device can request session confirmation from the wearable device at any time to confirm that the verified session is still in progress. While the session is in progress, the host can make features available such as bypassing re-entry of a credential during unlock operations.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RUPALI DHAKAD, whose telephone number is (571) 270-3743. The examiner can normally be reached M-F, 8:30-5:30.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Alexander Lagor, can be reached at (571) 270-5143. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/R.D./
Examiner, Art Unit 2437

/ALEXANDER LAGOR/
Supervisory Patent Examiner, Art Unit 2437

Prosecution Timeline

Jan 12, 2022
Application Filed
Jan 25, 2024
Non-Final Rejection — §103, §112
Apr 16, 2024
Response Filed
Jun 24, 2024
Final Rejection — §103, §112
Sep 16, 2024
Request for Continued Examination
Oct 01, 2024
Response after Non-Final Action
Nov 13, 2024
Non-Final Rejection — §103, §112
Feb 07, 2025
Response Filed
Apr 16, 2025
Final Rejection — §103, §112
Jun 28, 2025
Request for Continued Examination
Jul 02, 2025
Response after Non-Final Action
Sep 26, 2025
Non-Final Rejection — §103, §112
Nov 13, 2025
Response Filed
Feb 04, 2026
Final Rejection — §103, §112 (current)
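The "OA Round 6 (Final)" figure in the header follows directly from this timeline: each rejection event, whether non-final or final, counts as one office-action round. A minimal sketch of that tally (the event list is transcribed from the timeline above; the representation is illustrative, not any real API):

```python
from datetime import date

# (date, event) pairs transcribed from the prosecution timeline
events = [
    (date(2022, 1, 12), "Application Filed"),
    (date(2024, 1, 25), "Non-Final Rejection"),
    (date(2024, 6, 24), "Final Rejection"),
    (date(2024, 11, 13), "Non-Final Rejection"),
    (date(2025, 4, 16), "Final Rejection"),
    (date(2025, 9, 26), "Non-Final Rejection"),
    (date(2026, 2, 4), "Final Rejection"),
]

# Count every rejection event as one OA round
oa_rounds = sum(1 for _, event in events if "Rejection" in event)
print(oa_rounds)  # prints 6, matching "OA Round 6 (Final)" in the header
```

Responses and RCEs are deliberately excluded from the count; only examiner rejections open a new round.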

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12592937
Method For Protection From Cyber Attacks To A Vehicle, And Corresponding Device
2y 5m to grant Granted Mar 31, 2026
Patent 12587544
METHOD AND SYSTEM TO REMEDIATE A SECURITY ISSUE
2y 5m to grant Granted Mar 24, 2026
Patent 12513154
BLOCKCHAIN-BASED DATA DETECTION METHOD, APPARATUS, AND COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant Granted Dec 30, 2025
Patent 12495039
INTEGRATED AUTHENTICATION SYSTEM AND METHOD
2y 5m to grant Granted Dec 09, 2025
Patent 12468826
METHOD FOR OPERATING A PRINTING SYSTEM
2y 5m to grant Granted Nov 11, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

7-8
Expected OA Rounds
39%
Grant Probability
71%
With Interview (+31.2%)
3y 6m
Median Time to Grant
High
PTA Risk
Based on 33 resolved cases by this examiner. Grant probability derived from career allow rate.
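The projections above are simple arithmetic over the examiner's resolved cases: 13 allowances out of 33 resolved cases gives the 39% career allow rate, and adding the +31.2-point interview lift yields roughly the 71% with-interview figure. A minimal sketch of that calculation (function names are illustrative, not from any real library):

```python
def grant_probability(granted: int, resolved: int) -> float:
    """Career allow rate: share of resolved cases that were granted."""
    return granted / resolved

def with_interview(base_rate: float, interview_lift: float) -> float:
    """Apply an additive interview lift, capped at 100%."""
    return min(base_rate + interview_lift, 1.0)

base = grant_probability(13, 33)      # 0.3939..., displayed as 39%
lifted = with_interview(base, 0.312)  # 0.7057..., displayed as 71%
print(f"{base:.0%} base, {lifted:.0%} with interview")  # prints "39% base, 71% with interview"
```

Note the lift is additive (percentage points), not multiplicative, which is why 39% plus 31.2 points lands at about 71%.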
