Prosecution Insights
Last updated: April 19, 2026
Application No. 18/602,608

SYSTEMS AND METHODS FOR ENHANCED SECURITY IN 3D SPACES

Status: Non-Final Office Action under §103
Filed: Mar 12, 2024
Examiner: DUFFIELD, JEREMY S
Art Unit: 2498
Tech Center: 2400 — Computer Networks
Assignee: Adeia Guides Inc.
OA Round: 1 (Non-Final)

Grant Probability: 49% (Moderate)
Estimated OA Rounds: 1-2
Estimated Time to Grant: 3y 11m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 49% (213 granted / 438 resolved; -9.4% vs TC avg)
Interview Lift: +53.1% for resolved cases with an interview
Typical Timeline: 3y 11m average prosecution; 27 applications currently pending
Career History: 465 total applications across all art units

Statute-Specific Performance

§101: 7.4% (-32.6% vs TC avg)
§103: 59.9% (+19.9% vs TC avg)
§102: 10.9% (-29.1% vs TC avg)
§112: 15.3% (-24.7% vs TC avg)

Tech Center averages are estimates. Based on career data from 438 resolved cases.
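The headline examiner metrics above reduce to simple ratios of the raw counts shown. A minimal sketch of that arithmetic (the variable names are illustrative, not from any dashboard API):

```python
# Sketch: reproducing the headline examiner metrics from the raw counts above.
granted = 213             # granted cases (from the dashboard)
resolved = 438            # resolved cases
total_applications = 465  # total applications across all art units

allow_rate = granted / resolved          # career allow rate
pending = total_applications - resolved  # applications still open

print(f"Career allow rate: {allow_rate:.0%}")  # 49%
print(f"Currently pending: {pending}")         # 27
```

This also confirms the internal consistency of the dashboard: 213/438 rounds to 49%, and 465 total minus 438 resolved leaves the 27 pending applications reported above.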

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

No priority claim has been filed for the instant application. Therefore, the effective filing date for the claims is 12 March 2024.

Information Disclosure Statement

The Information Disclosure Statements filed on 13 June 2024, 23 May 2025, and 02 September 2025 comply with all applicable rules and regulations. Therefore, the information referred to therein has been considered.

Drawings

No issues have been found with the drawings filed 12 March 2024.

Specification

No issues have been found with the specification filed 12 March 2024.

Claim Objections

Claims 3, 9, 14, and 20 are objected to because of the following informalities:

Regarding claim 3, line 3—“at least one sensor”, it is unclear whether “at least one sensor” is referring to “at least one sensor” of claim 1. For examination purposes, “at least one sensor” of line 3 and claim 1 will be interpreted to be the same. In order to overcome this objection, line 3 may be amended to state --the at least one sensor--, for example. Claim 14 includes similar language and is similarly analyzed.

Regarding claim 9, line 16—“the at least two signal gesture recognition models” should be amended to state --the at least two single gesture recognition models-- in order to accurately correspond to the previous recitation. Claim 20 includes similar language and is similarly analyzed.

Appropriate correction is required.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination.
– An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

The following claim limitations in this application use the word “means” and are being interpreted under 35 U.S.C. 112(f): Claim 34: “means for receiving”, “means for determining”, “means for delivering”, “means for receiving”, “means for determining”, and “means for causing”.

Regarding the “means for receiving”, “means for delivering”, and “means for causing” limitations, the specification discloses utilizing an I/O path which may be I/O circuitry. Note the portions of the specification below:

“Server 204 may also include an I/O path 212. I/O path 212 may provide video conferencing data, device information, or other data, over a local area network (LAN) or wide area network (WAN), and/or other content and data to control circuitry 211, which may include processing circuitry, and storage 213. Control circuitry 211 may be used to send and receive commands, requests, and other suitable data using I/O path 212, which may comprise I/O circuitry. I/O path 212 may connect control circuitry 211 (and specifically control circuitry) to one or more communications paths” (Para. 86).

“At 1505, I/O circuitry of the verification server, based on instructions from control circuitry of the verification server, causes the application to deliver the one or more encrypted verification challenges to a secured application (e.g., device secured app 108) executing on the user device” (Para. 164).

“At 1506, I/O circuitry of the verification server receives, from the application, the processed data from the at least one sensor” (Para. 165).

“If the processed data does verify the user, then I/O circuitry of the verification server, based on instructions from control circuitry of the verification server, causes the application to provide the user access to at least one resource of the application at 1508” (Para. 165).
Regarding the “means for determining” limitations, the specification discloses utilizing control circuitry. Note the portions of the specification below:

“At 1503, if it is determined that the request does include data indicative of a plurality of sensors of the user device the verification server, then control circuitry of the verification server determines, based on the data indicative of the plurality of sensors, one or more verification challenges” (Para. 162).

“At 1507, control circuitry of the verification server determines whether the processed data verifies the user” (Para. 165).

Therefore, all of the limitations that invoke 112(f) have sufficient structural support in the specification.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C.
102(a)(2) prior art against the later invention.

Claims 1, 10, 12, 21, and 34 are rejected under 35 U.S.C. 103 as being unpatentable over Guillory et al. (US 2018/0183777 A1) in view of Nowak et al. (US 2020/0045046 A1) and further in view of Wang et al. (WO 2017/197554 A1).

Regarding claim 1, Guillory teaches a method for verifying a user at a verification server, e.g., server 106 (Fig. 1, el. 106), the method comprising: receiving, from an application, e.g., application 108 (Fig. 1, el. 108), wherein application 108 may be operative to provide a front end of a website for an online service (Para. 15), executing on a user device, e.g., user device 102A/102B (Fig. 1, el. 102A, 102B), a verification challenge request for verifying the user, wherein the verification challenge request comprises data…, e.g., method 200 proceeds to operation 215 where a request to generate a challenge is originated, wherein after receiving the sign-in request, application 108 may generate the request for challenge, and the generated challenge request is sent to the webpage backend by application 108, wherein the challenge request is received by the webpage backend/server authentication application 116 (Fig. 1, el. 116; Fig. 2, el. 215; Para. 33); determining, based on the data…, one or more verification challenges, e.g., after receiving the challenge request at operation 215, method 200 proceeds to operation 220 where a challenge is created (Fig. 2, el. 220; Para.
34); delivering the one or more verification challenges to the application executing on the user device, wherein the one or more verification challenges are encrypted to be inaccessible to the application, e.g., after creating challenge at operation 220, method 200 proceeds to operation 225 where the created challenge is sent to webpage frontend, wherein the webpage backend is operative to send the encrypted challenge to the webpage frontend operating on computing device 102 over network 104, wherein the webpage frontend or application 108 is operative to receive the challenge from webpage backend (Fig. 2, el. 225; Para. 35); server authentication application 116 is operative to communicate with authentication application 109 installed on computing device 102, wherein server authentication application 116 is operative to receive a corresponding public key to encrypt information that can be decrypted with a private key of the public/private key pair (Fig. 1, el. 116; Para. 27); authentication application 109 is operative to create separate key databases for storing public/private key pairs associated with each user 110 (Para. 25), wherein the delivering of the one or more verification challenges causes: the application to deliver the one or more encrypted verification challenges to a secured application executing on the user device, e.g., method 200 proceeds to operation 230 where the challenge is sent to authentication application 109—secured application--, wherein the webpage frontend is operative to forward the challenge it received from the webpage backend to authentication application 109—secured application-- (Fig. 1, el. 109; Fig. 2, el. 230; Para. 
36); server authentication application 116 is operative to communicate with authentication application 109 installed on computing device 102, wherein server authentication application 116 is operative to receive a corresponding public key to encrypt information that can be decrypted with a private key of the public/private key pair (Fig. 1, el. 116; Para. 27); authentication application 109 is operative to create separate key databases for storing public/private key pairs associated with each user 110 (Para. 25); the master password is required to access authentication application 109—secured application-- and to automatically login into all online services for which user 110 is registered (Para. 24); …; and the secured application to process data…for performing the one or more verification challenges, and provide the processed data to the application, e.g., method 200 proceeds to operation 235 where a confirmation for the sign-in request is received, wherein after receiving the challenge, authentication application 109 may prompt user 110 to confirm the sign-in request, wherein user 110 is prompted to confirm the sign-in request through a user interface, wherein authentication application 109 is operative to prompt user 110 to provide a confirmation for the sign-in request with the web service, wherein the prompt may be a simple confirmation from the user 110 to confirm the sign-in request or another challenge requiring a particular response (Fig. 2, el. 235; Para. 37); method 200 proceeds to operation 240 where the challenge is resolved, wherein authentication application 109 is then operative to resolve the challenge using the private key (Fig. 2, el. 240; Para. 38); after resolving the challenge response at operation 240, method 200 proceeds to operation 245 where the challenge response is sent to the webpage frontend, wherein authentication application 109 is configured to send the challenge response to the webpage frontend (Fig. 2, el. 245; Para. 
40); receiving, from the application, the processed data…, e.g., method 200 proceeds to operation 250 where the challenge response is received by the webpage front end and sent to the webpage backend, wherein webpage frontend is operative to receive the challenge response from authentication application 109 (or application 108) and send the received challenge response to the webpage backend, wherein the challenge response is sent to server authentication application 116 (Fig. 2, el. 250; Para. 40); determining that the user is verified based on the processed data…, e.g., method 200 proceeds to operation 255 where the challenge response is verified, wherein the webpage backend is operative to receive the challenge response from the webpage frontend, and the webpage backend is then operative to verify the challenge response, wherein server authentication application 116 may use the public key to verify the signature in the challenge response (Fig. 2, el. 255; Para. 41); and causing the application to provide the user access to at least one resource of the application, based on the determining that the user is verified, e.g., method 200 proceeds to operation 260 where user 110 is authenticated, wherein in response to receiving a positive verification of the challenge response, the webpage backend may positively authenticate user 110 and allow user 110 (through, e.g., device 102) to access the requested online services (Fig. 2, el. 260; Para. 43).
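For orientation, the challenge-response flow that Guillory is cited for (operations 215-260: create a challenge, resolve it on the device with a stored key, verify the response at the server) can be sketched in a few lines. This is an illustrative simulation only: a shared-secret HMAC stands in for Guillory's public/private-key signature, and all names are hypothetical rather than taken from the reference.

```python
import hashlib
import hmac
import os
import secrets

# A shared HMAC key stands in for the registered public/private key pair
# of Guillory (Paras. 25, 27). This is a simulation, not the reference.
shared_key = secrets.token_bytes(32)

def create_challenge() -> bytes:
    """Server side (operation 220): create a random challenge."""
    return os.urandom(16)

def resolve_challenge(challenge: bytes) -> bytes:
    """Device side (operation 240): resolve the challenge with the stored key."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def verify_response(challenge: bytes, response: bytes) -> bool:
    """Server side (operation 255): verify the signature in the response."""
    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = create_challenge()               # 220: challenge created
response = resolve_challenge(challenge)      # 240: challenge resolved on device
assert verify_response(challenge, response)  # 255/260: verified, user authenticated
```

A tampered or replayed response fails the final check, which mirrors why operation 255 gates the authentication at operation 260.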
Guillory does not clearly teach wherein the verification challenge request comprises data indicative of a plurality of sensors of the user device; determining, based on the data indicative of the plurality of sensors, one or more verification challenges; the secured application to lock at least one sensor of the plurality of sensors, wherein the at least one sensor is used for the one or more verification challenges, such that the application cannot access the at least one sensor while the one or more verification challenges are performed; and the secured application to process data from the at least one sensor for performing the one or more verification challenges, and provide the processed data to the application; receiving, from the application, the processed data from the at least one sensor; and determining that the user is verified based on the processed data from the at least one sensor. Nowak teaches receiving, from…a user device, e.g., user device 110A, 110B…110N (Fig. 1, el. 110A, 110B, 110N), a verification challenge request for verifying the user, wherein the verification challenge request comprises data indicative of a plurality of sensors of the user device, e.g., the user device 110 then transmits 302 an authentication request to the IS core 104 that includes user data and user device data, including the data that identifies the types of FIDO authenticators available (Fig. 1, el. 104; Fig. 3, el. 302; Para. 29); the FIDO authenticators—sensors-- may include one or more of a fingerprint reader, a microphone, and/or a digital camera (Para. 
24); determining, based on the data indicative of the plurality of sensors, one or more verification challenges, e.g., the IS core 104 transmits 306 the authentication request to the routing engine 108, and the routing engine then retrieves 308 a list of authorized authenticators, selects 310 one or more business rules, selects 312 a FIDO-certified server from a plurality of such servers (in this example, the “ACME” FIDO server 114 is selected), which selected FIDO server can handle the authorized authenticators and satisfies the one or more business rules (Fig. 1, el. 108, 114; Fig. 3, el. 306, 308, 310, 312; Para. 29); the routing engine also transmits 320 the authentication request and the application identifier, the FIDO facet, and the correlation identifier to the selected ACME FIDO-certified server 114, and next the ACME FIDO server 114 generates 322 a challenge message which is based on the application identifier and FIDO facet, and transmits 324 the FIDO challenge message to the routing engine 108 (Fig. 3, el. 320, 322, 324; Para. 30); delivering the one or more verification challenges to…the user device, wherein the one or more verification challenges are encrypted to be inaccessible to the application, e.g., the routing engine 108 then forwards 326 the FIDO challenge message to the IS Core 104, which performs 328 a secure process. As explained above, the secure process may entail encrypting the entire payload (Fig. 3, el. 326, 328; Para. 30); the FIDO services authentication process next includes the IS core 104 transmitting 330 the FIDO challenge message along with an authentication response to the user device 110 (Fig. 3, el. 330; Para. 
31), …to lock at least one sensor of the plurality of sensors, wherein the at least one sensor is used for the one or more verification challenges, such that the application cannot access the at least one sensor while the one or more verification challenges are performed, e.g., a FIDO server may be configured to remotely lock or unlock one or more authenticators, and/or to request additional data associated with one or more authenticators (for example, a FIDO server may be able to request a face matching score associated with a facial authenticator) (Para. 23), and …to process data from the at least one sensor for performing the one or more verification challenges, and provide the processed data…, e.g., the user device 110 then captures 336 biometric data and provides an authentication response to the user, and the user then interacts with the SDK of the user device 110 and provides FIDO authentication data (by interacting with one or more FIDO authenticators associated with the user's smartphone, for example) to satisfy the native authentication application (for example, a biometric application requiring fingerprint data from a FIDO fingerprint reader component), and the user then utilizes the user device 110 to transmit 338 the authentication response to the IS core 104 (Fig. 3, el. 336, 338; Para. 31); receiving…the processed data from the at least one sensor, e.g., the user then utilizes the user device 110 to transmit 338 the authentication response to the IS core 104 (Fig. 3, el. 338; Para. 31); and determining that the user is verified based on the processed data from the at least one sensor, e.g., the ACME FIDO-certified server 114 then retrieves 352 the FIDO facet and the authentication identifier, conducts 354 a verification process (as explained above), and transmits 356 the authentication result to the routing engine 108, which forwards 358 the authentication result to IS core 104 (Fig. 3, el. 352, 354, 356; Para. 33).
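The routing step that Nowak is cited for (Fig. 3, steps 308-312) amounts to selecting a server that can handle the authenticators the device reports. A minimal sketch of that selection, with hypothetical server names and authenticator labels, and with Nowak's business-rule checks omitted for brevity:

```python
# Illustrative sketch of Nowak's server-selection step (Fig. 3, steps 308-312):
# route the request to a FIDO-certified server that supports every
# authenticator reported by the device. All names here are hypothetical.
SERVERS = {
    "ACME": {"fingerprint", "face", "voice"},
    "OTHER": {"fingerprint"},
}

def select_server(device_authenticators):
    """Return the first server supporting every reported authenticator."""
    for name, supported in SERVERS.items():
        if set(device_authenticators) <= supported:
            return name
    return None  # no server can satisfy the request

print(select_server({"fingerprint", "face"}))  # ACME
```

In Nowak, the selected server then generates the challenge (step 322) from the application identifier and FIDO facet; the sketch covers only the capability match.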
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory to include wherein the verification challenge request comprises data indicative of a plurality of sensors of the user device; determining, based on the data indicative of the plurality of sensors, one or more verification challenges; to lock at least one sensor of the plurality of sensors, wherein the at least one sensor is used for the one or more verification challenges, such that the application cannot access the at least one sensor while the one or more verification challenges are performed; and the secured application to process data from the at least one sensor for performing the one or more verification challenges, and provide the processed data to the application; receiving, from the application, the processed data from the at least one sensor; and determining that the user is verified based on the processed data from the at least one sensor, using the known method of sending an authentication request from the user device to the server, wherein the request includes a list of the authenticators included on the user device, generating a challenge based on the list, processing the challenge at the user device using the authenticators, and providing a response to the challenge, as taught by Nowak, in combination with the challenge authentication system of Guillory, for the purpose of providing a real-time list of the sensor capabilities of the user device, thereby ensuring that the user device will have the capability to respond to the challenge.

Guillory in view of Nowak does not clearly teach the secured application to lock at least one sensor of the plurality of sensors, wherein the at least one sensor is used for the one or more verification challenges, such that the application cannot access the at least one sensor while the one or more verification challenges are performed.

Wang teaches the secured application, e.g., sensor virtualization module 208 (Fig. 2, el. 208), to lock at least one sensor of the plurality of sensors, e.g., sensors 204 (Fig. 2, el. 204), wherein the at least one sensor is used for the one or more user interactions, such that the application cannot access the at least one sensor while the one or more user interactions are performed, e.g., the background application 230 may attempt to access the sensor 204 while the foreground application 220 is accessing the sensor 204, wherein the sensor virtualization module 208 may then determine whether the background application 230 is a foreground application, and upon determining that the background application 230 is not a foreground application and is not at the top of the application list, the sensor virtualization module 208 may then deny access to the sensor 204 to the background application 230 (Fig. 2, el. 220, 230; Para. 22).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory in view of Nowak to include the secured application to lock at least one sensor of the plurality of sensors, wherein the at least one sensor is used for the one or more verification challenges, such that the application cannot access the at least one sensor while the one or more verification challenges are performed, using the known method of denying access to the sensor to a background application, as taught by Wang, in combination with the challenge authentication system of Guillory in view of Nowak, for the purpose of mitigating user-behavior or sensor-based covert channels (Wang-Para. 11).

Regarding claim 10, Guillory in view of Nowak in view of Wang teaches the method of claim 1.

Guillory further teaches wherein the delivering of the one or more verification challenges further causes the secured application to…share the data…on the application, such that the data…is made available to the user via the application, e.g., after resolving the challenge response at operation 240, method 200 proceeds to operation 245 where the challenge response is sent to the webpage frontend, wherein authentication application 109 is configured to send the challenge response to the webpage frontend (Fig. 2, el. 245; Para. 40). Guillory does not clearly teach wherein the delivering of the one or more verification challenges further causes the secured application to, while the application cannot access the at least one sensor, share the data from the at least one sensor on the application, such that the data from the at least one sensor is made available to the user via the application. Nowak further teaches wherein the delivering of the one or more verification challenges further causes…to…share the data from the at least one sensor on the application, such that the data from the at least one sensor is made available to the user via the application, e.g., the user device 110 then captures 336 biometric data and provides an authentication response to the user, and the user then interacts with the SDK of the user device 110 and provides FIDO authentication data (by interacting with one or more FIDO authenticators associated with the user's smartphone, for example) to satisfy the native authentication application (for example, a biometric application requiring fingerprint data from a FIDO fingerprint reader component), and the user then utilizes the user device 110 to transmit 338 the authentication response to the IS core 104 (Fig. 3, el. 336, 338; Para. 31). 

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory to include wherein the delivering of the one or more verification challenges further causes the secured application to share the data from the at least one sensor on the application, such that the data from the at least one sensor is made available to the user via the application, using the known method of processing the challenge at the user device using the authenticators, and providing a response to the challenge, as taught by Nowak, in combination with the challenge authentication system of Guillory, using the same motivation as in claim 1. Guillory in view of Nowak does not clearly teach wherein the delivering of the one or more verification challenges further causes the secured application to, while the application cannot access the at least one sensor, share the data from the at least one sensor on the application, such that the data from the at least one sensor is made available to the user via the application. 

Wang further teaches wherein the delivering of the one or more verification challenges further causes the secured application to, while the application cannot access the at least one sensor, share the data from the at least one sensor on the application, such that the data from the at least one sensor is made available to the user via the application, e.g., the background application 230 may attempt to access the sensor 204 while the foreground application 220 is accessing the sensor 204, wherein the sensor virtualization module 208 may then determine whether the background application 230 is a foreground application, and upon determining that the background application 230 is not a foreground application and is not at the top of the application list, the sensor virtualization module 208 may then deny access to the sensor 204 and/or refrain from providing the original sensor data to the background application 230 (Fig. 2, el. 220, 230; Para. 22); the sensor virtualization module 208 may be configured to provide degraded sensor data 234 to the background application 230 instead of the original sensor data 206 (Para. 23). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory in view of Nowak to include wherein the delivering of the one or more verification challenges further causes the secured application to, while the application cannot access the at least one sensor, share the data from the at least one sensor on the application, such that the data from the at least one sensor is made available to the user via the application, using the known method of denying access to the sensor to a background application while providing degraded sensor data to the background application, as taught by Wang, in combination with the challenge authentication system of Guillory in view of Nowak, using the same motivation as in claim 1. 
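The sensor gating that Wang is cited for (Paras. 22-23) can be sketched briefly: the foreground (secured) application receives original sensor data, while a background application is either denied access or handed degraded data instead. This is an illustrative sketch under those assumptions; the class and method names are hypothetical, not Wang's.

```python
# Illustrative sketch of Wang's sensor virtualization gating (Paras. 22-23):
# only the foreground application gets original sensor data; a background
# application is denied (None) or given degraded data. Names are hypothetical.
class SensorVirtualizer:
    def __init__(self, foreground_app):
        self.foreground_app = foreground_app  # app currently allowed the sensor

    def read_sensor(self, app, original_data, degraded_data=None):
        if app == self.foreground_app:
            return original_data  # foreground: full access to original data
        return degraded_data      # background: denied, or degraded data if set

virt = SensorVirtualizer(foreground_app="secured_app")
print(virt.read_sensor("secured_app", "raw-fingerprint"))     # raw-fingerprint
print(virt.read_sensor("background_app", "raw-fingerprint"))  # None
```

The `degraded_data` default models Wang's Para. 23 option of providing degraded sensor data 234 to the background application instead of an outright denial.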
Regarding claim 12, Guillory teaches a system for verifying a user at a verification server, e.g., server 106 (Fig. 1, el. 106), the system comprising: input/output circuitry, e.g., one or more communication connections, 512, such as LAN, WAN, point to point (Fig. 5, el. 512; Para. 71), configured to receive, from an application, e.g., application 108 (Fig. 1, el. 108), wherein application 108 may be operative to provide a front end of a website for an online service (Para. 15), executing on a user device, e.g., user device 102A/102B (Fig. 1, el. 102A, 102B), a verification challenge request for verifying the user, wherein the verification challenge request comprises data…, e.g., method 200 proceeds to operation 215 where a request to generate a challenge is originated, wherein after receiving the sign-in request, application 108 may generate the request for challenge, and the generated challenge request is sent to the webpage backend by application 108, wherein the challenge request is received by the webpage backend/server authentication application 116 (Fig. 1, el. 116; Fig. 2, el. 215; Para. 33); control circuitry, e.g., processing unit 502 (Fig. 5, el. 502), configured to determine, based on the data…, one or more verification challenges, e.g., after receiving the challenge request at operation 215, method 200 proceeds to operation 220 where a challenge is created (Fig. 2, el. 220; Para. 
34); wherein the input/output circuitry is further configured to deliver the one or more verification challenges to the application executing on the user device, wherein the one or more verification challenges are encrypted to be inaccessible to the application, e.g., after creating challenge at operation 220, method 200 proceeds to operation 225 where the created challenge is sent to webpage frontend, wherein the webpage backend is operative to send the encrypted challenge to the webpage frontend operating on computing device 102 over network 104, wherein the webpage frontend or application 108 is operative to receive the challenge from webpage backend (Fig. 2, el. 225; Para. 35); server authentication application 116 is operative to communicate with authentication application 109 installed on computing device 102, wherein server authentication application 116 is operative to receive a corresponding public key to encrypt information that can be decrypted with a private key of the public/private key pair (Fig. 1, el. 116; Para. 27); authentication application 109 is operative to create separate key databases for storing public/private key pairs associated with each user 110 (Para. 25), wherein the delivering of the one or more verification challenges causes: the application to deliver the one or more encrypted verification challenges to a secured application executing on the user device, e.g., method 200 proceeds to operation 230 where the challenge is sent to authentication application 109—secured application--, wherein the webpage frontend is operative to forward the challenge it received from the webpage backend to authentication application 109—secured application-- (Fig. 1, el. 109; Fig. 2, el. 230; Para. 
36); server authentication application 116 is operative to communicate with authentication application 109 installed on computing device 102, wherein server authentication application 116 is operative to receive a corresponding public key to encrypt information that can be decrypted with a private key of the public/private key pair (Fig. 1, el. 116; Para. 27); authentication application 109 is operative to create separate key databases for storing public/private key pairs associated with each user 110 (Para. 25); the master password is required to access authentication application 109 --secured application-- and to automatically login into all online services for which user 110 is registered (Para. 24); …; and the secured application to process data…for performing the one or more verification challenges, and provide the processed data to the application, e.g., method 200 proceeds to operation 235 where a confirmation for the sign-in request is received, wherein after receiving the challenge, authentication application 109 may prompt user 110 to confirm the sign-in request, wherein user 110 is prompted to confirm the sign-in request through a user interface, wherein authentication application 109 is operative to prompt user 110 to provide a confirmation for the sign-in request with the web service, wherein the prompt may be a simple confirmation from the user 110 to confirm the sign-in request or another challenge requiring a particular response (Fig. 2, el. 235; Para. 37); method 200 proceeds to operation 240 where the challenge is resolved, wherein authentication application 109 is then operative to resolve the challenge using the private key (Fig. 2, el. 240; Para. 38); after resolving the challenge response at operation 240, method 200 proceeds to operation 245 where the challenge response is sent to the webpage frontend, wherein authentication application 109 is configured to send the challenge response to the webpage frontend (Fig. 2, el. 245; Para. 
40); wherein the input/output circuitry is further configured to receive, from the application, the processed data…, e.g., method 200 proceeds to operation 250 where the challenge response is received by the webpage front end and sent to the webpage backend, wherein webpage frontend is operative to receive the challenge response from authentication application 109 (or application 108) and send the received challenge response to the webpage backend, wherein the challenge response is sent to server authentication application 116 (Fig. 2, el. 250; Para. 40); wherein the control circuitry is further configured to determine that the user is verified based on the processed data…, e.g., method 200 proceeds to operation 255 where the challenge response is verified, wherein the webpage backend is operative to receive the challenge response from the webpage frontend, and the webpage backend is then operative to verify the challenge response, wherein server authentication application 116 may use the public key to verify the signature in the challenge response (Fig. 2, el. 255; Para. 41); and wherein the input/output circuitry is further configured to cause the application to provide the user access to at least one resource of the application, based on the determining that the user is verified, e.g., method 200 proceeds to operation 260 where user 110 is authenticated, wherein in response to receiving a positive verification of the challenge response, the webpage backend may positively authenticate user 110 and allow user 110 (through, e.g., device 102) to access the requested online services (Fig. 2, el. 260; Para. 43). 
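For illustration, the Guillory flow mapped above (Fig. 2, operations 215-260) can be sketched as a minimal Python model. All class and variable names here are hypothetical, and an HMAC over a shared secret stands in for Guillory's private-key signing (Para. 38) so the sketch stays dependency-free; this models the message flow only, not Guillory's implementation.

```python
import hashlib
import hmac
import secrets

class VerificationServer:
    """Stands in for server authentication application 116."""
    def __init__(self):
        self.keys = {}  # user -> verification key (a public key in Guillory)

    def register(self, user, key):
        self.keys[user] = key

    def create_challenge(self):
        # Op. 220: create a fresh challenge.
        return secrets.token_hex(16)

    def verify(self, user, challenge, response):
        # Op. 255: verify the challenge response; op. 260 authenticates on match.
        expected = hmac.new(self.keys[user], challenge.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, response)

class SecuredApp:
    """Stands in for authentication application 109; holds the key material."""
    def __init__(self, key):
        self.key = key  # a private key in Guillory, never exposed to the front end

    def resolve(self, challenge):
        # Op. 240: resolve the challenge by signing it (HMAC stand-in here).
        return hmac.new(self.key, challenge.encode(), hashlib.sha256).hexdigest()

key = secrets.token_bytes(32)
server, app109 = VerificationServer(), SecuredApp(key)
server.register("user110", key)
challenge = server.create_challenge()   # ops. 215-225: request and deliver challenge
response = app109.resolve(challenge)    # ops. 230-245: forward, resolve, return
assert server.verify("user110", challenge, response)
```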
Guillory does not clearly teach wherein the verification challenge request comprises data indicative of a plurality of sensors of the user device; control circuitry configured to determine, based on the data indicative of the plurality of sensors, one or more verification challenges; the secured application to lock at least one sensor of the plurality of sensors, wherein the at least one sensor is used for the one or more verification challenges, such that the application cannot access the at least one sensor while the one or more verification challenges are performed; and the secured application to process data from the at least one sensor for performing the one or more verification challenges, and provide the processed data to the application; wherein the input/output circuitry is further configured to receive, from the application, the processed data from the at least one sensor; and wherein the control circuitry is further configured to determine that the user is verified based on the processed data from the at least one sensor. Nowak teaches input/output circuitry configured to receive, from…on a user device, e.g., user device 110A, 110B…110N (Fig. 1, el. 110A, 110B, 110N), a verification challenge request for verifying the user, wherein the verification challenge request comprises data indicative of a plurality of sensors of the user device, e.g., the user device 110 then transmits 302 an authentication request to the IS core 104 that includes user data and user device data, including the data that identifies the types of FIDO authenticators available (Fig. 1, el. 104; Fig. 3, el. 302; Para. 29); the FIDO authenticators --sensors-- may include one or more of a fingerprint reader, a microphone, and/or a digital camera (Para. 
24); control circuitry configured to determine, based on the data indicative of the plurality of sensors, one or more verification challenges, e.g., the IS core 104 transmits 306 the authentication request to the routing engine 108, and the routing engine then retrieves 308 a list of authorized authenticators, selects 310 one or more business rules, selects 312 a FIDO-certified server from a plurality of such servers (in this example, the “ACME” FIDO server 114 is selected), which selected FIDO server can handle the authorized authenticators and satisfies the one or more business rules (Fig. 1, el. 108, 114; Fig. 3, el. 306, 308, 310, 312; Para. 29); the routing engine also transmits 320 the authentication request and the application identifier, the FIDO facet, and the correlation identifier to the selected ACME FIDO-certified server 114, and next the ACME FIDO server 114 generates 322 a challenge message which is based on the application identifier and FIDO facet, and transmits 324 the FIDO challenge message to the routing engine 108 (Fig. 3, el. 320, 322, 324; Para. 30); wherein the input/output circuitry is further configured to deliver the one or more verification challenges to…the user device, wherein the one or more verification challenges are encrypted to be inaccessible to the application, e.g., the routing engine 108 then forwards 326 the FIDO challenge message to the IS Core 104, which performs 328 a secure process. As explained above, the secure process may entail encrypting the entire payload (Fig. 3, el. 326, 328; Para. 30); the FIDO services authentication process next includes the IS core 104 transmitting 330 the FIDO challenge message along with an authentication response to the user device 110 (Fig. 3, el. 330; Para. 
31), …to lock at least one sensor of the plurality of sensors, wherein the at least one sensor is used for the one or more verification challenges, such that the application cannot access the at least one sensor while the one or more verification challenges are performed, e.g., a FIDO server may be configured to remotely lock or unlock one or more authenticators, and/or to request additional data associated with one or more authenticators (for example, a FIDO server may be able to request a face matching score associated with a facial authenticator) (Para. 23), and … data from the at least one sensor for performing the one or more verification challenges, and provide the processed data…, e.g., the user device 110 then captures 336 biometric data and provides an authentication response to the user, and the user then interacts with the SDK of the user device 110 and provides FIDO authentication data (by interacting with one or more FIDO authenticators associated with the user's smartphone, for example) to satisfy the native authentication application (for example, a biometric application requiring fingerprint data from a FIDO fingerprint reader component), and the user then utilizes the user device 110 to transmit 338 the authentication response to the IS core 104 (Fig. 3, el. 336, 338; Para. 31); wherein the input/output circuitry is further configured to receive…the processed data from the at least one sensor, e.g., the user then utilizes the user device 110 to transmit 338 the authentication response to the IS core 104 (Fig. 3, el. 338; Para. 
31); and wherein the control circuitry is further configured to determine that the user is verified based on the processed data from the at least one sensor, e.g., the ACME FIDO-certified server 114 then retrieves 352 the FIDO facet and the authentication identifier, conducts 354 a verification process (as explained above), and transmits 356 the authentication result to the routing engine 108, which forwards 358 the authentication result to IS core 104 (Fig. 3, el. 352, 354, 356; Para. 33). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory to include wherein the verification challenge request comprises data indicative of a plurality of sensors of the user device; control circuitry configured to determine, based on the data indicative of the plurality of sensors, one or more verification challenges; to lock at least one sensor of the plurality of sensors, wherein the at least one sensor is used for the one or more verification challenges, such that the application cannot access the at least one sensor while the one or more verification challenges are performed; and the secured application to process data from the at least one sensor for performing the one or more verification challenges, and provide the processed data to the application; wherein the input/output circuitry is further configured to receive, from the application, the processed data from the at least one sensor; and wherein the control circuitry is further configured to determine that the user is verified based on the processed data from the at least one sensor, using the known method of sending an authentication request from the user device to the server, wherein the request includes a list of the authenticators included on the user device, generating a challenge based on the list, processing the challenge at the user device using the authenticators, and providing a response to the challenge, as taught 
by Nowak, in combination with the challenge authentication system of Guillory, for the purpose of providing a real-time list of the sensor capabilities of the user device, thereby ensuring that the user device will have the capability to respond to the challenge. Guillory in view of Nowak does not clearly teach the secured application to lock at least one sensor of the plurality of sensors, wherein the at least one sensor is used for the one or more verification challenges, such that the application cannot access the at least one sensor while the one or more verification challenges are performed. Wang teaches the secured application, e.g., sensor virtualization module 208 (Fig. 2, el. 208), to lock at least one sensor of the plurality of sensors, e.g., sensors 204 (Fig. 2, el. 204), wherein the at least one sensor is used for the one or more user interactions, such that the application cannot access the at least one sensor while the one or more user interactions are performed, e.g., the background application 230 may attempt to access the sensor 204 while the foreground application 220 is accessing the sensor 204, wherein the sensor virtualization module 208 may then determine whether the background application 230 is a foreground application, and upon determining that the background application 230 is not a foreground application and is not at the top of the application list, the sensor virtualization module 208 may then deny access to the sensor 204 to the background application 230 (Fig. 2, el. 220, 230; Para. 22). 
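The Wang access check quoted above (Para. 22) reduces to a foreground test plus a lock: a minimal sketch, with hypothetical class and method names, reducing Wang's foreground/application-list determination to a single comparison.

```python
class SensorVirtualizationModule:
    """Stands in for Wang's sensor virtualization module 208 gating sensor 204."""
    def __init__(self, foreground_app):
        self.foreground_app = foreground_app
        self.locked_by = None  # app currently holding the sensor, if any

    def request_access(self, app):
        # Deny any other app while the sensor is held (the "lock" behavior).
        if self.locked_by is not None and app != self.locked_by:
            return False
        # Deny any app that is not the foreground application.
        if app != self.foreground_app:
            return False
        self.locked_by = app
        return True

module = SensorVirtualizationModule(foreground_app="app_220")
assert module.request_access("app_220") is True   # foreground application 220 gets sensor 204
assert module.request_access("app_230") is False  # background application 230 is denied
```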
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory in view of Nowak to include the secured application to lock at least one sensor of the plurality of sensors, wherein the at least one sensor is used for the one or more verification challenges, such that the application cannot access the at least one sensor while the one or more verification challenges are performed, using the known method of denying access to the sensor to a background application, as taught by Wang, in combination with the challenge authentication system of Guillory in view of Nowak, for the purpose of mitigating user-behavior or sensor-based covert channels (Wang-Para. 11). Regarding claim 21, the claim is analyzed with respect to claim 10. Regarding claim 34, the claim is analyzed with respect to claim 12. Claims 2 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Guillory in view of Nowak in view of Wang and further in view of Thompson (US 2017/0078319 A1). Regarding claim 2, Guillory in view of Nowak in view of Wang teaches the method of claim 1. Guillory in view of Nowak in view of Wang does not explicitly teach wherein the verification challenge comprises at least one of a Completely Automated Public Turing Test to Tell Computers and Humans Apart (CAPTCHA) challenge or a multi-factor authentication (MFA) challenge. 
Thompson teaches wherein the verification challenge comprises at least one of a Completely Automated Public Turing Test to Tell Computers and Humans Apart (CAPTCHA) challenge or a multi-factor authentication (MFA) challenge, e.g., the service gateway may select captcha data and difficulty of the captcha from a captcha database, wherein the captcha may be selected from the captcha database based upon client information determined from the service request, wherein the captcha may, in addition or alternatively, be selected from the captcha database based on a client profile, wherein the captcha may, in addition or alternatively, be selected from the captcha database based on a service policy (Para. 30). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory in view of Nowak in view of Wang to include wherein the verification challenge comprises at least one of a Completely Automated Public Turing Test to Tell Computers and Humans Apart (CAPTCHA) challenge or a multi-factor authentication (MFA) challenge, using the known method of selecting captcha data based on a service request, as taught by Thompson, in combination with the challenge authentication system of Guillory in view of Nowak in view of Wang, for the purpose of providing an improved technique for distinguishing between computing devices operating under control of human users and automated access by computing devices (Thompson-Para. 4). Regarding claim 13, the claim is analyzed with respect to claim 2. Claims 7 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Guillory in view of Nowak in view of Wang and further in view of Ponsini (US 2020/0184089 A1). Regarding claim 7, Guillory in view of Nowak in view of Wang teaches the method of claim 1. 
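Thompson's selection step discussed above (Para. 30) amounts to a keyed lookup: a minimal sketch, assuming a hypothetical captcha database keyed on device type and a risk level derived from the service policy (both keys are illustrative, not from Thompson).

```python
# Hypothetical captcha database; keys and difficulty values are illustrative only.
CAPTCHA_DB = {
    ("mobile", "low_risk"):   {"type": "image", "difficulty": 1},
    ("mobile", "high_risk"):  {"type": "image", "difficulty": 3},
    ("desktop", "low_risk"):  {"type": "text",  "difficulty": 1},
    ("desktop", "high_risk"): {"type": "text",  "difficulty": 4},
}

def select_captcha(client_info, service_policy):
    # Select captcha data and difficulty based on client information and
    # the service policy, as in Thompson's gateway selection step.
    risk = "high_risk" if service_policy.get("strict") else "low_risk"
    return CAPTCHA_DB[(client_info["device"], risk)]

captcha = select_captcha({"device": "mobile"}, {"strict": True})
assert captcha == {"type": "image", "difficulty": 3}
```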
Guillory in view of Nowak in view of Wang does not clearly teach further comprising: providing a public key of the verification server to the application, to cause the application to deliver the public key to the secured application; causing the secured application to: generate a symmetric key; encrypt the symmetric key using the public key of the verification server; and deliver the encrypted symmetric key to the application; receiving, from the application, the encrypted symmetric key; and decrypting the encrypted symmetric key using a private key of the verification server. Ponsini teaches providing a public key of the verification server, e.g., security entity 110 that includes secure server 112 (Fig. 2, el. 110, 112), to the application, e.g., administrative agent 109 (Fig. 1, el. 109), to cause the application to deliver the public key to the secured application, e.g., to establish the secure channel 115, the secure server 112 can provide a public key of a public/private key pair to the SEE 102 (Fig. 1, el. 102, 115; Para. 31); the administrative agent 109 can provide an interface between the REE 106 (an untrusted computing environment) and the SEE 102 (a trusted computing environment), wherein the administrative agent 109 can operate as a conduit (pass through) for a secure channel 115 (with encrypted data) between the SEE 102 and the secure server 112 (Fig. 1, el. 109; Para. 30); one or more of the trusted applications of the SEE 102 --secured application-- can be programmed to authenticate the public key certificate and to generate and encrypt a symmetric key with the public key included in the public key certificate (Para. 33); causing the secured application to: generate a symmetric key, e.g., in response, the SEE 102 can be programmed to generate and encrypt a symmetric key with the public key of the secure server (Para. 
31); one or more of the trusted applications of the SEE 102 can be programmed to authenticate the public key certificate and to generate and encrypt a symmetric key with the public key included in the public key certificate (Para. 33); encrypt the symmetric key using the public key of the verification server, e.g., in response, the SEE 102 can be programmed to generate and encrypt a symmetric key with the public key of the secure server (Para. 31); one or more of the trusted applications of the SEE 102 can be programmed to authenticate the public key certificate and to generate and encrypt a symmetric key with the public key included in the public key certificate (Para. 33); and deliver the encrypted symmetric key to the application, e.g., the SEE 102 can be programmed to generate and encrypt a symmetric key with the public key of the secure server and transmit the encrypted symmetric key to the secure server 112 via the administrative agent 109 (Para. 31); one or more of the trusted applications of the SEE 102 can be programmed to transmit the encrypted symmetric key to the secure server 112 (Para. 33); receiving, from the application, the encrypted symmetric key, e.g., the secure server 112 can decrypt the symmetric key using the private key of the public/private key pair (Para. 31); the secure server 112 can decrypt the symmetric key using the private key of the public/private key pair (Para. 33); and decrypting the encrypted symmetric key using a private key of the verification server, e.g., the secure server 112 can decrypt the symmetric key using the private key of the public/private key pair (Para. 31); the secure server 112 can decrypt the symmetric key using the private key of the public/private key pair (Para. 33). 
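The Ponsini exchange mapped above (symmetric key wrapped with the server's public key, unwrapped with its private key) can be sketched with textbook RSA on deliberately tiny primes; every value below is illustrative only, nothing here is secure, and none of it is drawn from Ponsini's implementation.

```python
# Toy RSA key pair for the secure server (illustration only, not secure).
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+ modular inverse)

# SEE 102 side: generate a symmetric key and encrypt it with the public key.
symmetric_key = 1234                # toy "symmetric key", an integer < n
wrapped = pow(symmetric_key, e, n)  # encrypt with public key (e, n)

# Secure server 112 side: decrypt with the private key of the pair.
unwrapped = pow(wrapped, d, n)
assert unwrapped == symmetric_key
```

The administrative agent plays no cryptographic role in this exchange: it merely relays `wrapped`, which it cannot open without the private exponent.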
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory in view of Nowak in view of Wang to include providing a public key of the verification server to the application, to cause the application to deliver the public key to the secured application; causing the secured application to: generate a symmetric key; encrypt the symmetric key using the public key of the verification server; and deliver the encrypted symmetric key to the application; receiving, from the application, the encrypted symmetric key; and decrypting the encrypted symmetric key using a private key of the verification server, using the known method of encrypting, by a trusted application, a symmetric key using the server’s public key, sending the encrypted symmetric key to the server via the administrative agent, and decrypting the symmetric key at the server, as taught by Ponsini, in combination with the challenge authentication system of Guillory in view of Nowak in view of Wang, for the purpose of providing enhanced security for the communication channel between the server and the applications, thereby aiding in the prevention of attackers gaining sensitive data. Regarding claim 18, Guillory in view of Nowak in view of Wang teaches the system of claim 12. 
Guillory in view of Nowak in view of Wang does not clearly teach wherein the input/output circuitry is further configured to: provide a public key of the verification server to the application, to cause the application to deliver the public key to the secured application; cause the secured application to: generate a symmetric key; encrypt the symmetric key using the public key of the verification server; and deliver the encrypted symmetric key to the application; and receive, from the application, the encrypted symmetric key; and wherein the control circuitry is further configured to decrypt the encrypted symmetric key using a private key of the verification server. Ponsini teaches wherein the input/output circuitry is further configured to: provide a public key of the verification server, e.g., security entity 110 that includes secure server 112 (Fig. 2, el. 110, 112), to the application, e.g., administrative agent 109 (Fig. 1, el. 109), to cause the application to deliver the public key to the secured application, e.g., to establish the secure channel 115, the secure server 112 can provide a public key of a public/private key pair to the SEE 102 (Fig. 1, el. 102, 115; Para. 31); the administrative agent 109 can provide an interface between the REE 106 (an untrusted computing environment) and the SEE 102 (a trusted computing environment), wherein the administrative agent 109 can operate as a conduit (pass through) for a secure channel 115 (with encrypted data) between the SEE 102 and the secure server 112 (Fig. 1, el. 109; Para. 30); one or more of the trusted applications of the SEE 102 --secured application-- can be programmed to authenticate the public key certificate and to generate and encrypt a symmetric key with the public key included in the public key certificate (Para. 
33); causing the secured application to: generate a symmetric key, e.g., in response, the SEE 102 can be programmed to generate and encrypt a symmetric key with the public key of the secure server (Para. 31); one or more of the trusted applications of the SEE 102 can be programmed to authenticate the public key certificate and to generate and encrypt a symmetric key with the public key included in the public key certificate (Para. 33); encrypt the symmetric key using the public key of the verification server, e.g., in response, the SEE 102 can be programmed to generate and encrypt a symmetric key with the public key of the secure server (Para. 31); one or more of the trusted applications of the SEE 102 can be programmed to authenticate the public key certificate and to generate and encrypt a symmetric key with the public key included in the public key certificate (Para. 33); and deliver the encrypted symmetric key to the application, e.g., the SEE 102 can be programmed to generate and encrypt a symmetric key with the public key of the secure server and transmit the encrypted symmetric key to the secure server 112 via the administrative agent 109 (Para. 31); one or more of the trusted applications of the SEE 102 can be programmed to transmit the encrypted symmetric key to the secure server 112 (Para. 33); and receive, from the application, the encrypted symmetric key, e.g., the secure server 112 can decrypt the symmetric key using the private key of the public/private key pair (Para. 31); the secure server 112 can decrypt the symmetric key using the private key of the public/private key pair (Para. 33); and wherein the control circuitry is further configured to decrypt the encrypted symmetric key using a private key of the verification server, e.g., the secure server 112 can decrypt the symmetric key using the private key of the public/private key pair (Para. 
31); the secure server 112 can decrypt the symmetric key using the private key of the public/private key pair (Para. 33). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory in view of Nowak in view of Wang to include wherein the input/output circuitry is further configured to: provide a public key of the verification server to the application, to cause the application to deliver the public key to the secured application; cause the secured application to: generate a symmetric key; encrypt the symmetric key using the public key of the verification server; and deliver the encrypted symmetric key to the application; and receive, from the application, the encrypted symmetric key; and wherein the control circuitry is further configured to decrypt the encrypted symmetric key using a private key of the verification server, using the known method of encrypting, by a trusted application, a symmetric key using the server’s public key, sending the encrypted symmetric key to the server via the administrative agent, and decrypting the symmetric key at the server, as taught by Ponsini, in combination with the challenge authentication system of Guillory in view of Nowak in view of Wang, for the purpose of providing enhanced security for the communication channel between the server and the applications, thereby aiding in the prevention of attackers gaining sensitive data. Claims 8 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Guillory in view of Nowak in view of Wang in view of Ponsini and further in view of Kahol et al. (US 2022/0353246 A1). Regarding claim 8, Guillory in view of Nowak in view of Wang in view of Ponsini teaches the method of claim 7. 
Guillory further teaches further comprising: encrypting…the one or more verification challenges, e.g., after creating challenge at operation 220, method 200 proceeds to operation 225 where the created challenge is sent to webpage frontend, wherein the webpage backend is operative to send the encrypted challenge to the webpage frontend operating on computing device 102 over network 104, wherein the webpage frontend or application 108 is operative to receive the challenge from webpage backend (Fig. 2, el. 225; Para. 35); causing the secured application to decrypt… the one or more verification challenges, e.g., after creating challenge at operation 220, method 200 proceeds to operation 225 where the created challenge is sent to webpage frontend, wherein the webpage backend is operative to send the encrypted challenge to the webpage frontend operating on computing device 102 over network 104, wherein the webpage frontend or application 108 is operative to receive the challenge from webpage backend (Fig. 2, el. 225; Para. 35); server authentication application 116 is operative to communicate with authentication application 109 installed on computing device 102, wherein server authentication application 116 is operative to receive a corresponding public key to encrypt information that can be decrypted with a private key of the public/private key pair (Fig. 1, el. 116; Para. 27); authentication application 109 is operative to create separate key databases for storing public/private key pairs associated with each user 110 (Para. 25); causing the secured application to sign…the processed data before the processed data is provided to the application, e.g., the challenge is solved by signing the challenge data with the private key of the public/private pair, wherein authentication application 109 can sign the challenge using one or more algorithms such as an elliptic curve digital signature algorithm (Para. 
38); and verifying the processed data…, e.g., server authentication application 116 may use the public key to verify the signature in the challenge response, wherein server authentication application 116 can verify the signature using the one or more algorithms used to sign the challenge response (Para. 41). Guillory in view of Nowak in view of Wang in view of Ponsini does not clearly teach encrypting, using the symmetric key, the one or more verification challenges; causing the secured application to decrypt, using the symmetric key, the one or more verification challenges; causing the secured application to encrypt, using the symmetric key, the processed data before the processed data is provided to the application; and decrypting the processed data using the symmetric key. Kahol teaches encrypting, using the symmetric key, data, e.g., thereafter the client and server can exchange data encrypted with the session key 312 (Fig. 3, el. 312; Para. 27); the application server 401 encrypts data sent to the client 402 and decrypts data received from the client 402 with a session key 402, and the client 203 encrypts/decrypts data with the session key (Fig. 4, el. 401, 402; Para. 28); causing the client to decrypt, using the symmetric key, the data, e.g., thereafter the client and server can exchange data encrypted with the session key 312 (Fig. 3, el. 312; Para. 27); the application server 401 encrypts data sent to the client 402 and decrypts data received from the client 402 with a session key 402, and the client 203 encrypts/decrypts data with the session key (Fig. 4, el. 401, 402; Para. 28); causing the client to encrypt, using the symmetric key, data, e.g., thereafter the client and server can exchange data encrypted with the session key 312 (Fig. 3, el. 312; Para. 
27); the application server 401 encrypts data sent to the client 402 and decrypts data received from the client 402 with a session key 402, and the client 203 encrypts/decrypts data with the session key (Fig. 4, el. 401, 402; Para. 28); and decrypting the…data using the symmetric key, e.g., thereafter the client and server can exchange data encrypted with the session key 312 (Fig. 3, el. 312; Para. 27); the application server 401 encrypts data sent to the client 402 and decrypts data received from the client 402 with a session key 402, and the client 203 encrypts/decrypts data with the session key (Fig. 4, el. 401, 402; Para. 28). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory in view of Nowak in view of Wang in view of Ponsini to include encrypting, using the symmetric key, the one or more verification challenges; causing the secured application to decrypt, using the symmetric key, the one or more verification challenges; causing the secured application to encrypt, using the symmetric key, the processed data before the processed data is provided to the application; and decrypting the processed data using the symmetric key, using the known method of encrypting and decrypting data sent and received between the server and the client using a session key, as taught by Kahol, in combination with the challenge authentication system of Guillory in view of Nowak in view of Wang in view of Ponsini, for the purpose of utilizing a fast encryption method that requires less computational power. Regarding claim 19, Guillory in view of Nowak in view of Wang in view of Ponsini teaches the system of claim 18. 
Guillory further teaches wherein the control circuitry is further configured to: encrypt…the one or more verification challenges, e.g., after creating challenge at operation 220, method 200 proceeds to operation 225 where the created challenge is sent to webpage frontend, wherein the webpage backend is operative to send the encrypted challenge to the webpage frontend operating on computing device 102 over network 104, wherein the webpage frontend or application 108 is operative to receive the challenge from webpage backend (Fig. 2, el. 225; Para. 35); wherein the input/output circuitry is further configured to: cause the secured application to decrypt… the one or more verification challenges, e.g., after creating challenge at operation 220, method 200 proceeds to operation 225 where the created challenge is sent to webpage frontend, wherein the webpage backend is operative to send the encrypted challenge to the webpage frontend operating on computing device 102 over network 104, wherein the webpage frontend or application 108 is operative to receive the challenge from webpage backend (Fig. 2, el. 225; Para. 35); server authentication application 116 is operative to communicate with authentication application 109 installed on computing device 102, wherein server authentication application 116 is operative to receive a corresponding public key to encrypt information that can be decrypted with a private key of the public/private key pair (Fig. 1, el. 116; Para. 27); authentication application 109 is operative to create separate key databases for storing public/private key pairs associated with each user 110 (Para. 
25); cause the secured application to sign…the processed data before the processed data is provided to the application, e.g., the challenge is solved by signing the challenge data with the private key of the public/private pair, wherein authentication application 109 can sign the challenge using one or more algorithms such as an elliptic curve digital signature algorithm (Para. 38); and wherein the control circuitry is further configured to verify the processed data…, e.g., server authentication application 116 may use the public key to verify the signature in the challenge response, wherein server authentication application 116 can verify the signature using the one or more algorithms used to sign the challenge response (Para. 41).

Guillory in view of Nowak in view of Wang in view of Ponsini does not clearly teach wherein the control circuitry is further configured to: encrypt, using the symmetric key, the one or more verification challenges; wherein the input/output circuitry is further configured to: cause the secured application to decrypt, using the symmetric key, the one or more verification challenges; and cause the secured application to encrypt, using the symmetric key, the processed data before the processed data is provided to the application; and wherein the control circuitry is further configured to decrypt the processed data using the symmetric key.

Kahol teaches to: encrypt, using the symmetric key, data, e.g., thereafter the client and server can exchange data encrypted with the session key 312 (Fig. 3, el. 312; Para. 27); the application server 401 encrypts data sent to the client 402 and decrypts data received from the client 402 with a session key 402, and the client 203 encrypts/decrypts data with the session key (Fig. 4, el. 401, 402; Para. 28); … cause the client to decrypt, using the symmetric key, the data, e.g., thereafter the client and server can exchange data encrypted with the session key 312 (Fig. 3, el. 312; Para.
27); the application server 401 encrypts data sent to the client 402 and decrypts data received from the client 402 with a session key 402, and the client 203 encrypts/decrypts data with the session key (Fig. 4, el. 401, 402; Para. 28); cause the client to encrypt, using the symmetric key, data, e.g., thereafter the client and server can exchange data encrypted with the session key 312 (Fig. 3, el. 312; Para. 27); the application server 401 encrypts data sent to the client 402 and decrypts data received from the client 402 with a session key 402, and the client 203 encrypts/decrypts data with the session key (Fig. 4, el. 401, 402; Para. 28); and …decrypt the…data using the symmetric key, e.g., thereafter the client and server can exchange data encrypted with the session key 312 (Fig. 3, el. 312; Para. 27); the application server 401 encrypts data sent to the client 402 and decrypts data received from the client 402 with a session key 402, and the client 203 encrypts/decrypts data with the session key (Fig. 4, el. 401, 402; Para. 28). 
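The combined flow the rejection proposes (Guillory's signed verification challenge wrapped in Kahol's symmetric session-key encryption) can be sketched in a few lines. Everything below is illustrative only: the XOR keystream stands in for a real session-key cipher such as AES, the HMAC stands in for Guillory's ECDSA public/private-key signature, and none of the names or values come from the cited references.

```python
import hashlib
import hmac
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data against a SHA-256 keystream.
    Applying it twice with the same key recovers the plaintext.
    Illustrative stand-in for session-key encryption; not secure."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(i.to_bytes(8, "big") + key).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

session_key = secrets.token_bytes(32)  # shared symmetric session key
signing_key = secrets.token_bytes(32)  # stand-in for the ECDSA key pair

# Server side: create a verification challenge and encrypt it.
challenge = secrets.token_bytes(16)
encrypted_challenge = xor_cipher(session_key, challenge)

# Client side: decrypt the challenge, sign it, encrypt the signed response.
recovered = xor_cipher(session_key, encrypted_challenge)
signature = hmac.new(signing_key, recovered, hashlib.sha256).digest()
encrypted_response = xor_cipher(session_key, signature)

# Server side: decrypt the response and verify the signature.
response = xor_cipher(session_key, encrypted_response)
expected = hmac.new(signing_key, challenge, hashlib.sha256).digest()
assert recovered == challenge
assert hmac.compare_digest(response, expected)
print("challenge verified")
```

The design point the rejection leans on is visible here: the signature establishes who answered the challenge, while the symmetric session key protects the challenge and response in transit at low computational cost.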
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory in view of Nowak in view of Wang in view of Ponsini to include wherein the control circuitry is further configured to: encrypt, using the symmetric key, the one or more verification challenges; wherein the input/output circuitry is further configured to: cause the secured application to decrypt, using the symmetric key, the one or more verification challenges; and cause the secured application to encrypt, using the symmetric key, the processed data before the processed data is provided to the application; and wherein the control circuitry is further configured to decrypt the processed data using the symmetric key, using the known method of encrypting and decrypting data sent and received between the server and the client using a session key, as taught by Kahol, in combination with the challenge authentication system of Guillory in view of Nowak in view of Wang in view of Ponsini, for the purpose of utilizing a fast encryption method that requires less computational power.

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Guillory in view of Nowak in view of Wang and further in view of Ward et al. (US 2025/0200154 A1).

Regarding claim 11, Guillory in view of Nowak in view of Wang teaches the method of claim 1. Guillory in view of Nowak in view of Wang does not clearly teach wherein the determining that the user is verified based on the processed data from the at least one sensor comprises using a trained model based on the at least one sensor, wherein the trained model is trained based on a first set of data representing authorized attempts to complete the verification challenge and a second set of data representing unauthorized attempts to complete the verification challenge.
Ward teaches wherein the determining that the user is verified based on the processed data from the at least one sensor, e.g., first biometric capture component 112, a second biometric capture component 114, biometric capture device 116 (Fig. 1, el. 112, 114, 116), comprises using a trained model based on the at least one sensor, wherein the trained model is trained based on a first set of data representing authorized attempts to complete the verification challenge and a second set of data representing unauthorized attempts to complete the verification challenge, e.g., once trained, the model 420 may output a prediction associated with incoming biometric data, wherein the prediction may include whether the biometric data matches (e.g., within a threshold) stored biometric data, wherein the prediction may indicate whether an authentication attempt is authorized, incorrect, or malicious, wherein the model 420 may be retrained over time (Fig. 4, el. 420; Para. 35); training engine 402 uses input data 406, for example after undergoing preprocessing component 408, to determine one or more features 410, wherein the one or more features 410 may be used to generate an initial model 412, which may be updated iteratively or with future labeled or unlabeled data (Fig. 4, el. 402, 406, 408, 410, 412; Para. 28); labels for the input data 406 may include one or more categories and attributes associated with biometric data, wherein a label may specify that a particular facial recognition data point corresponds to a successful user authentication, wherein the label may include historical user authentication data, including past authentication attempts, successful or unsuccessful, for the corresponding biometric data, wherein the label may include a recurring or repeating pattern associated with successful or unsuccessful user authentication attempts over a period of time (Para. 32). 
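The trained-model limitation Ward is cited for (a model trained on authorized attempts versus unauthorized attempts, then predicting which class a new attempt resembles) can be illustrated with a minimal nearest-centroid sketch. The two-dimensional feature vectors and class names below are invented for illustration and are not drawn from Ward.

```python
import math

def centroid(rows):
    """Mean feature vector of a set of training samples."""
    return [sum(col) / len(rows) for col in zip(*rows)]

# "First set of data": features from authorized attempts.
# "Second set of data": features from unauthorized attempts.
# Values are synthetic, purely for illustration.
authorized = [[0.9, 0.8], [1.0, 0.7], [0.85, 0.9]]
unauthorized = [[0.2, 0.3], [0.1, 0.4], [0.3, 0.2]]

auth_c = centroid(authorized)
unauth_c = centroid(unauthorized)

def predict(sample):
    """Classify a new attempt by its nearer training centroid."""
    if math.dist(sample, auth_c) < math.dist(sample, unauth_c):
        return "authorized"
    return "unauthorized"

print(predict([0.95, 0.75]))  # near the authorized centroid
print(predict([0.15, 0.35]))  # near the unauthorized centroid
```

A production system of the kind Ward describes would use a retrainable model over real biometric features rather than a fixed centroid rule, but the training structure is the same: two labeled sets, one decision boundary between them.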
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Guillory in view of Nowak in view of Wang to include wherein the determining that the user is verified based on the processed data from the at least one sensor comprises using a trained model based on the at least one sensor, wherein the trained model is trained based on a first set of data representing authorized attempts to complete the verification challenge and a second set of data representing unauthorized attempts to complete the verification challenge, using the known method of using a model to determine whether a user is authenticated, wherein the model is trained using past successful and unsuccessful attempts, as taught by Ward, in combination with the challenge authentication system of Guillory in view of Nowak in view of Wang, for the purpose of enhancing security and accuracy in user authentication (Ward-Para. 1).

Allowable Subject Matter

As allowable subject matter has been indicated, applicant's reply must either comply with all formal requirements or specifically traverse each requirement not complied with. See 37 CFR 1.111(b) and MPEP § 707.07(a). Claims 3-6, 9, 14, 15, and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims and the associated claim objections were satisfied. The prior art of record fails to disclose the combination of features as claimed and arranged by applicant when read in light of the specification. In this case, the allowance is based on the combination of the limitations in each indicated claim and not on any single limitation.
Regarding claim 3, the cited references do not alone or in an obvious combination teach “calculating, based on the hardware specifications and a plurality of possible verification challenges, a plurality of verification success rates, wherein each one of the plurality of verification success rates corresponds to a respective one of the plurality of possible verification challenges” and “selecting the one or more verification challenges based on at least one of the plurality of verification success rates exceeding a threshold success rate” of claim 3 in combination with the remaining limitations of claim 3. Claim 14 includes similar limitations and is similarly analyzed.

Regarding claim 9, the cited references do not alone or in an obvious combination teach “requesting input of at least two sequential gestures,” “processing data indicative of the at least two sequential gestures, wherein the data indicative of the at least two sequential gestures is provided by the at least one sensor of the plurality of sensors of the user device,” “training at least two single gesture recognition models, wherein each of the at least two single gesture recognition models corresponds to a respective one of the at least two sequential gestures,” “based on the at least two trained single gesture recognition models, generating a sequence recognition model, wherein: the sequence recognition model is configured to, in sequence, recognize the at least two sequential gestures,” “the sequence recognition model comprises at least one adjustable weight, wherein when recognizing the at least two sequential gestures, the at least one adjustable weight is adjusted for each one of the at least two sequential gestures, wherein each weight adjustment is based on a respective one of the at least two signal gesture recognition models,” and “providing the sequence recognition model to the secured application, such that the secured application is configured to determine that the user is verified based on
recognizing the at least two sequential gestures” of claim 9 in combination with the remaining limitations of claim 9. Claim 20 includes similar limitations and is similarly analyzed. Claims 4-6 are dependent on claim 3 and claim 15 is dependent on claim 14. Therefore, these claims also include allowable subject matter.

Additional relevant prior art:

Buck (US 2014/0196158 A1)—Buck discloses intercepting a request to access the sensor information from a requesting application of the plurality of applications, and controlling access to the sensor information associated with the at least one user input action based on the requesting application (Abstract).

Ford (US 2024/0256686 A1)—Ford discloses the application server 1604 transmits website code or application code to the client device 1602. At least some of the code may include a call or link to the security proxy server 1608 that causes a message to be transmitted when the code is executed. The message may include a request for a Turing test. In some embodiments, the message may identify a model of the client device 1602, an operating system, a screen size, a browser type, etc., which may be used to structure the Turing test for the particular client device 1602. The information in the request message may also be used to determine or modify an answer file based on the model of the client device 1602, an operating system, a screen size, a browser type, etc. (Para. 38).

The additional relevant prior art references also do not alone or in an obvious combination teach the aforementioned limitations in combination with the remaining limitations of each respective indicated claim.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEREMY DUFFIELD whose telephone number is (571)270-1643. The examiner can normally be reached Monday - Friday, 7:00 AM - 3:00 PM (ET).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Yin-Chen Shaw, can be reached at (571) 272-8878. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

05 January 2026
/Jeremy S Duffield/
Primary Examiner, Art Unit 2498

Prosecution Timeline

Mar 12, 2024
Application Filed
Jan 05, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12598067
Method, Device, and System for Updating Anchor Key in a Communication Network for Encrypted Communication with Service Applications
2y 5m to grant Granted Apr 07, 2026
Patent 12591642
SYSTEM FOR STEGANALYSIS DETECTION OF METADATA IN A VIDEO STREAM FOR PROVIDING REAL-TIME DATA
2y 5m to grant Granted Mar 31, 2026
Patent 12579320
SPLIT COUNTERS WITH DYNAMIC EPOCH TRACKING FOR CRYPTOGRAPHIC PROTECTION OF SECURE DATA
2y 5m to grant Granted Mar 17, 2026
Patent 12572685
CONTEXT-BASED PATTERN MATCHING FOR SENSITIVE DATA DETECTION
2y 5m to grant Granted Mar 10, 2026
Patent 12554872
SYSTEM AND METHOD FOR NOTIFYING USERS ABOUT PUBLICLY AVAILABLE DATA
2y 5m to grant Granted Feb 17, 2026
Based on the 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
49%
Grant Probability
99%
With Interview (+53.1%)
3y 11m
Median Time to Grant
Low
PTA Risk
Based on 438 resolved cases by this examiner. Grant probability derived from career allow rate.
