Prosecution Insights
Last updated: April 19, 2026
Application No. 18/772,902

System for Calculating Trust of Client Session(s)

Final Rejection §103
Filed: Jul 15, 2024
Examiner: DUFFIELD, JEREMY S
Art Unit: 2498
Tech Center: 2400 — Computer Networks
Assignee: Microsoft Technology Licensing, LLC
OA Round: 2 (Final)
Grant Probability: 49% (Moderate)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 3y 11m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 49% (grants 213 of 438 resolved cases; -9.4% vs TC avg)
Interview Lift: +53.1% higher allowance rate for resolved cases with an interview
Avg Prosecution: 3y 11m (typical timeline)
Currently Pending: 27
Total Applications: 465 (across all art units)

Statute-Specific Performance

§101: 7.4% allowance after rejection (-32.6% vs TC avg)
§103: 59.9% allowance after rejection (+19.9% vs TC avg)
§102: 10.9% allowance after rejection (-29.1% vs TC avg)
§112: 15.3% allowance after rejection (-24.7% vs TC avg)
Tech Center averages are estimates; based on career data from 438 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

This application is a continuation of US S.N. 16/702,395 with a filing date of 03 December 2019. Therefore, the effective filing date of the claims is 03 December 2019.

Response to Arguments

Applicant’s arguments, see page 8, filed 20 November 2025, with respect to the rejection(s) of claim(s) 21, 28, and 35 under 35 U.S.C. 103 have been fully considered and are persuasive in light of the new claim amendments. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of Mattson et al. (US 2017/0237766 A1), Tarkkala et al. (US 2007/0011453 A1), and Wardman et al. (US 2020/0201981 A1). See the 35 U.S.C. 103 section below for a detailed analysis.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 21-24, 28-31, and 35-38 are rejected under 35 U.S.C. 103 as being unpatentable over Mattson et al. (US 2017/0237766 A1) in view of Tarkkala et al. (US 2007/0011453 A1) and further in view of Wardman et al. (US 2020/0201981 A1).

Regarding claim 21, Mattson teaches a method comprising: receiving, at a client computer, e.g., client device 102 (Fig. 1, el.
102), a work function, e.g., In step 11, security server computer 104 serves code to client device 102, wherein the served code may include code for implementing the selected countermeasures, such as proof of work code, code to monitor the execution of the origin server computer system code to ensure it is not being interfered with by malicious code, or other countermeasures (Fig. 1, el. 11, 104; Para. 112); in step 426, the intermediary computer serves the content and one or more countermeasures to the client computer, and logs the results or signals of the observations of the client computer (Fig. 4, el. 426; Para. 160); in step 3, security server computer 104 serves the tests to client device 102 (Fig. 1, el. 3; Para. 101); in step 408, the security intermediary server computer serves the selected tests and the requested content to the requesting client computer (Fig. 4, el. 408; Para. 151); executing a particular session on the client computer, wherein the particular session calculates different proof of work values repeatedly by processing different inputs using the work function, the different inputs being received from a service connected to the particular session…, e.g., executing countermeasures at the client computer and sending the results to the security server, wherein the process of selecting and sending countermeasures may be repeated each time the client device makes a request (Para. 113, 128, 129); one or more additional countermeasures can be sent with subsequently requested content (Para. 114); wherein the countermeasures are Proof of Work code and may be associated with a network configuration, device, browser, user, malware, attack, website, content, or one or more characteristics of the client computer (Para. 66, 88, 89, 112); a hash generating function (Para. 79); sending the different proof of work values to a trust server, e.g., security server computer 104 (Fig. 1, el.
104); executing countermeasures at the client computer and sending the results to the security server, wherein the process of selecting and sending countermeasures may be repeated each time the client device makes a request (Para. 113, 128, 129); one or more additional countermeasures can be sent with subsequently requested content (Para. 114); …. Mattson does not clearly teach executing different sessions concurrently on the client computer, wherein the different sessions calculate different proof of work values repeatedly by processing different inputs using the work function, the different inputs being received from a service connected to the different sessions, the different inputs being specifically associated with the different sessions and distinguishing individual sessions from other sessions executing concurrently on the client computer; receiving, from the trust server, feedback for the different sessions; updating the calculations of the different proof of work values by the different sessions based at least on the feedback; and wherein the feedback varies resource utilization for respective sessions by increasing resource utilization by relatively less trustworthy sessions and decreasing resource utilization by relatively more trustworthy sessions. Tarkkala teaches …wherein the particular session calculate…proof of work value…by processing different inputs using the work function, the different inputs being received from a service connected to the particular session, the different inputs being specifically associated with the particular session and distinguishing individual sessions from other sessions…on the client computer, e.g., receiving, by a verifier V, the proof-of-work P(w)-proof of work value- and the input sets M and A from the prover (Para. 
120, 121); wherein the prover computes a bitstring s as a result of a pseudo-random function h-work function- being applied on the created data set A-particular inputs-, applies a mapping function m-work function- for mapping an arbitrary bitstring to a problem instance, solves the problem instance, and generates the proof-of-work for the solving of the problem instance (Para. 116, 117, 119); wherein the created data set A may include an identifier for the prover P, an identifier for the verifier V, a salt S_1, and auxiliary data such as a timestamp and/or a message to be sent to verifier V (Para. 113); prover P derives a computationally hard problem from some set identifying a session q between prover P and verifier V (Para. 134). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Mattson to include the different inputs being received from a service connected to the particular session, the different inputs being specifically associated with the particular session and distinguishing individual sessions from other sessions on the client computer, using the known method of computing, by the prover, a bitstring s as a result of a pseudo-random function h being applied on the created data set A, applying a mapping function m for mapping an arbitrary bitstring to a problem instance, solving the problem instance, and generating the proof-of-work for the solving of the problem instance, as taught by Tarkkala, in combination with the Proof of Work system of Mattson, for the purpose of establishing a trusted relationship between unknown communication parties in whatever environment with a respective architecture, (Tarkkala-Para. 109), while also ensuring that the proof-of-work is sufficiently fresh (Tarkkala-Para. 132). 
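The Tarkkala construction cited above (a pseudo-random function h applied to a session-identifying data set A, a mapping m from the resulting bitstring to a problem instance, and a freshness check at the verifier) can be sketched roughly as follows, using a hash-preimage puzzle as the problem instance. All identifiers, parameter choices, and the specific puzzle are illustrative assumptions for this sketch, not code or values taken from the reference.

```python
import hashlib
import os
import time

def derive_problem_instance(prover_id: str, verifier_id: str,
                            salt: bytes, timestamp: float) -> bytes:
    """Apply a pseudo-random function h to the data set A = (P, V, salt,
    timestamp), yielding a bitstring s that is unique to this session."""
    data_set_a = f"{prover_id}|{verifier_id}|{timestamp}".encode() + salt
    return hashlib.sha256(data_set_a).digest()  # bitstring s

def solve_problem(s: bytes, difficulty_bits: int = 12) -> int:
    """Map s to a problem instance and solve it: find a nonce such that
    H(s || nonce) has at least `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(s + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # the proof-of-work
        nonce += 1

def verify(prover_id: str, verifier_id: str, salt: bytes, timestamp: float,
           nonce: int, difficulty_bits: int = 12,
           max_age: float = 60.0) -> bool:
    """Verifier V recomputes s from the disclosed inputs, checks the proof,
    and rejects proofs that are not sufficiently fresh."""
    if time.time() - timestamp > max_age:
        return False  # proof is stale
    s = derive_problem_instance(prover_id, verifier_id, salt, timestamp)
    digest = hashlib.sha256(s + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

# Demo: a prover bound to one session solves, and the verifier accepts.
salt, ts = os.urandom(16), time.time()
s = derive_problem_instance("prover-P", "verifier-V", salt, ts)
proof = solve_problem(s)
```

Because s is derived from identifiers for both parties plus a per-session salt and timestamp, the resulting proof distinguishes one session from any other session on the same client and cannot be replayed later, which is the role the office action assigns to Tarkkala's data set A.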
Mattson in view of Tarkkala does not clearly teach executing different sessions concurrently on the client computer, wherein the different sessions calculate different proof of work values repeatedly by processing different inputs using the work function, the different inputs being received from a service connected to the different sessions, the different inputs being specifically associated with the different sessions and distinguishing individual sessions from other sessions executing concurrently on the client computer; receiving, from the trust server, feedback for the different sessions; and updating the calculations of the different proof of work values by the different sessions based at least on the feedback; and wherein the feedback varies resource utilization for respective sessions by increasing resource utilization by relatively less trustworthy sessions and decreasing resource utilization by relatively more trustworthy sessions. Wardman teaches receiving, at a client computer, e.g., client system 102 (Fig. 1, el. 102), a work function, e.g., once it receives the challenge problem 120, client system 102 may generate a proposed solution 124 by executing program code to solve the challenge problem 120, wherein this program code may be provided to client system 102 (Fig. 1, el. 120, 124; Para. 30); executing different sessions concurrently on the client computer, wherein the different sessions calculate different proof of work values repeatedly by processing different inputs using the work function, the different inputs being received from a service, e.g., service 112 (Fig. 1, el. 
112), connected to the different sessions, the different inputs being specifically associated with the different sessions and distinguishing individual sessions from other sessions executing concurrently on the client computer, e.g., computer system 352 is running multiple VMs 358 simultaneously, wherein this may allow a malicious third-party to scale their attacks, attempting to access the service 112 using the multiple emulated computer systems at the same time, wherein each of the VMs 358 may be configured to attempt to access the same account (e.g., user account 315) with the same service (e.g., service 112) (Fig. 3B, el. 352, 358; Para. 53); server computer system 110 may select a challenge problem 120 and send it to the VM 358A, and once VM 358A generates a proposed solution 124 and provides it to the server computer system 110, system 110 may compare the performance of the challenge problem 120 by VM 358A to the computational performance data 400 (Para. 58); server computer system 110 may select a challenge problem 120 that has a level of difficulty 122 that is commensurate with the technical capabilities of client system 102, as indicated by the reported technical features, wherein server computer system 110 may select the problem 120 such that the level of difficulty 122 increases as the reported technical capabilities of the client system 102 increase (Para. 28); given their frequent use for malicious activity, however, the detection of an emulated computer system may trigger server computer system 110 to implement additional authentication or security challenges (Fig. 1, el. 110; Para. 35); server computer system 110 may instead (or additionally) select multiple challenge problems for client system 102 to solve based on its technical feature (Para. 41); sending the different proof of work values to a trust server, e.g., server computer system 110 (Figs. 1, 2, el. 
110); Client system 102 may then, at 136, send a challenge response 126, including the proposed solution 124, to the server computer system 110 (Fig. 1, el. 124, 126, 136; Para. 30); receiving, from the trust server, feedback for the different sessions, e.g., given their frequent use for malicious activity, however, the detection of an emulated computer system may trigger server computer system 110 to implement additional authentication or security challenges (Fig. 1, el. 110; Para. 35); server computer system 110 may instead (or additionally) select multiple challenge problems for client system 102 to solve based on its technical feature (Para. 41); server computer system 110 may send the client system 102 one or more additional challenge problems once the server computer system determines that the client system 102 is an emulated computer system (Para. 46); and updating the calculations of the different proof of work values by the different sessions based at least on the feedback, e.g., server computer system 110 may select a challenge problem 120 and send it to the VM 358A, and once VM 358A generates a proposed solution 124 and provides it to the server computer system 110, system 110 may compare the performance of the challenge problem 120 by VM 358A to the computational performance data 400 (Para. 58); server computer system 110 may instead (or additionally) select multiple challenge problems for client system 102 to solve based on its technical feature (Para. 41); server computer system 110 may send the client system 102 one or more additional challenge problems once the server computer system determines that the client system 102 is an emulated computer system (Para. 
46), wherein the feedback varies resource utilization for respective sessions by increasing resource utilization by relatively less trustworthy sessions and decreasing resource utilization by relatively more trustworthy sessions, e.g., the determination that a client system 102 is an emulated system may not, by itself, result in the requesting user being denied access to service 112, and instead, detection that the client system is an emulated computer system may instead trigger additional authentication operations to be performed (Para. 25); given their frequent use for malicious activity, however, the detection of an emulated computer system may trigger server computer system 110 to implement additional authentication or security challenges (Fig. 1, el. 110; Para. 35); after determining that the client system 102 is a malicious user attempting to access the service 112, server computer system 110 may select additional, computationally intensive challenge problems for the client system 102 to solve, wherein these additional challenge problems may be computationally intensive or unsolvable such that, when it attempts to generate a proposed solution, the client system 102 may be required to perform vast amounts of computational work (Para. 46). 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Mattson in view of Tarkkala to include executing different sessions concurrently on the client computer, wherein the different sessions calculate different proof of work values repeatedly by processing different inputs using the work function, the different inputs being received from a service connected to the different sessions, the different inputs being specifically associated with the different sessions and distinguishing individual sessions from other sessions executing concurrently on the client computer; receiving, from the trust server, feedback for the different sessions; and updating the calculations of the different proof of work values by the different sessions based at least on the feedback; and wherein the feedback varies resource utilization for respective sessions by increasing resource utilization by relatively less trustworthy sessions and decreasing resource utilization by relatively more trustworthy sessions, using the known method of having a client system with multiple VMs concurrently executing sessions for a service, having each VM solve a challenge problem, and having each VM solve an additional problem based on a determination by the server of whether the VM is an emulated system and whether the emulated system is operated by a malicious user, as taught by Wardman, in combination with the Proof of Work system of Mattson in view of Tarkkala, for the purpose of improving data security for both the web service and its users, thereby improving the functioning of the service as a whole (Wardman-Para. 22). 
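The Wardman-based rationale above, in which a server varies per-session challenge difficulty so that less trustworthy sessions burn more resources, can be sketched as follows. The trust scores, function name, and the linear score-to-difficulty mapping are illustrative assumptions, not taken from the cited references.

```python
def feedback_difficulty(trust_score: float,
                        base_bits: int = 12,
                        max_extra_bits: int = 8) -> int:
    """Trust-server feedback: map a session's trust score in [0, 1] to a
    proof-of-work difficulty (leading zero bits). Less trustworthy sessions
    get harder problems, increasing their resource utilization; more
    trustworthy sessions get easier ones."""
    extra = round((1.0 - trust_score) * max_extra_bits)
    return base_bits + extra

# Concurrent sessions on one client, each with its own trust score:
sessions = {"session-a": 0.9, "session-b": 0.5, "session-c": 0.1}
difficulties = {sid: feedback_difficulty(score)
                for sid, score in sessions.items()}
# Each added bit of difficulty doubles the expected hashing work, so a
# low-trust session pays exponentially more per proof-of-work round.
```

On each feedback round the sessions would re-run their work function at the newly assigned difficulty, which matches the claimed "updating the calculations ... based at least on the feedback."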
Regarding claim 22, Mattson in view of Tarkkala in view of Wardman teaches the method of claim 21, wherein the feedback causes the relatively less trustworthy sessions to perform relatively more complex proof of work calculations than the relatively more trustworthy sessions, e.g., the determination that a client system 102 is an emulated system may not, by itself, result in the requesting user being denied access to service 112, and instead, detection that the client system is an emulated computer system may instead trigger additional authentication operations to be performed (Wardman-Para. 25); given their frequent use for malicious activity, however, the detection of an emulated computer system may trigger server computer system 110 to implement additional authentication or security challenges (Wardman-Fig. 1, el. 110; Para. 35); after determining that the client system 102 is a malicious user attempting to access the service 112, server computer system 110 may select additional, computationally intensive challenge problems for the client system 102 to solve, wherein these additional challenge problems may be computationally intensive or unsolvable such that, when it attempts to generate a proposed solution, the client system 102 may be required to perform vast amounts of computational work (Wardman-Para. 46). Regarding claim 23, Mattson in view of Tarkkala in view of Wardman teaches the method of claim 21, wherein the feedback causes the relatively less trustworthy sessions to compute proof of work values more frequently than the relatively more trustworthy sessions, e.g., server computer system 110 may instead (or additionally) select multiple challenge problems for client system 102 to solve based on its technical feature, and server computer system 110 may select the challenge problem 120 based on other factors, such as those indicative of a level of risk associated with the access request (Wardman-Para. 
41); after determining that the client system 102 is a malicious user attempting to access the service 112, server computer system 110 may select additional, computationally intensive challenge problems for the client system 102 to solve, wherein these additional challenge problems may be computationally intensive or unsolvable such that, when it attempts to generate a proposed solution, the client system 102 may be required to perform vast amounts of computational work (Wardman-Para. 46).

Regarding claim 24, Mattson in view of Tarkkala in view of Wardman teaches the method of claim 21, further comprising: limiting at least one user interaction with the client computer based at least on a particular calculation associated with a particular session, e.g., determining that the solution is invalid and rejecting, terminating, or not accepting the request (Mattson-Para. 130-132).

Regarding claim 28, Mattson teaches a client computer, e.g., client device 102 (Fig. 1, el. 102), comprising: a processor, e.g., processor 710 (Fig. 7, el. 710); and a memory, e.g., memory 720/storage device 730 (Fig. 7, el. 720, 730), having computer-executable instructions stored thereupon which, when executed by the processor, cause the client computer to: receive a work function, e.g., In step 11, security server computer 104 serves code to client device 102, wherein the served code may include code for implementing the selected countermeasures, such as proof of work code, code to monitor the execution of the origin server computer system code to ensure it is not being interfered with by malicious code, or other countermeasures (Fig. 1, el. 11, 104; Para. 112); in step 426, the intermediary computer serves the content and one or more countermeasures to the client computer, and logs the results or signals of the observations of the client computer (Fig. 4, el. 426; Para. 160); in step 3, security server computer 104 serves the tests to client device 102 (Fig. 1, el. 3; Para.
101); in step 408, the security intermediary server computer serves the selected tests and the requested content to the requesting client computer (Fig. 4, el. 408; Para. 151); execute a particular session on the client computer, wherein the particular session calculates different proof of work values repeatedly by processing different inputs using the work function, the different inputs being received from a service connected to the particular session…, e.g., executing countermeasures at the client computer and sending the results to the security server, wherein the process of selecting and sending countermeasures may be repeated each time the client device makes a request (Para. 113, 128, 129); one or more additional countermeasures can be sent with subsequently requested content (Para. 114); wherein the countermeasures are Proof of Work code and may be associated with a network configuration, device, browser, user, malware, attack, website, content, or one or more characteristics of the client computer (Para. 66, 88, 89, 112); a hash generating function (Para. 79); send the different proof of work values to a trust server, e.g., security server computer 104 (Fig. 1, el. 104); executing countermeasures at the client computer and sending the results to the security server, wherein the process of selecting and sending countermeasures may be repeated each time the client device makes a request (Para. 113, 128, 129); one or more additional countermeasures can be sent with subsequently requested content (Para. 114); ….
Mattson does not clearly teach to: execute different sessions concurrently on the client computer, wherein the different sessions calculate different proof of work values repeatedly by processing different inputs using the work function, the different inputs being received from a service connected to the different sessions, the different inputs being specifically associated with the different sessions and distinguishing individual sessions from other sessions executing concurrently on the client computer; receive, from the trust server, feedback for the different sessions; and update the calculations of the different proof of work values by the different sessions based at least on the feedback, wherein the feedback varies resource utilization for respective sessions by increasing resource utilization by relatively less trustworthy sessions and decreasing resource utilization by relatively more trustworthy sessions. Tarkkala teaches to: …wherein the particular session calculate…proof of work value…by processing different inputs using the work function, the different inputs being received from a service connected to the particular session, the different inputs being specifically associated with the particular session and distinguishing individual sessions from other sessions…on the client computer, e.g., receiving, by a verifier V, the proof-of-work P(w)-proof of work value- and the input sets M and A from the prover (Para. 120, 121); wherein the prover computes a bitstring s as a result of a pseudo-random function h-work function- being applied on the created data set A-particular inputs-, applies a mapping function m-work function- for mapping an arbitrary bitstring to a problem instance, solves the problem instance, and generates the proof-of-work for the solving of the problem instance (Para. 
116, 117, 119); wherein the created data set A may include an identifier for the prover P, an identifier for the verifier V, a salt S_1, and auxiliary data such as a timestamp and/or a message to be sent to verifier V (Para. 113); prover P derives a computationally hard problem from some set identifying a session q between prover P and verifier V (Para. 134). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Mattson to include the different inputs being received from a service connected to the particular session, the different inputs being specifically associated with the particular session and distinguishing individual sessions from other sessions on the client computer, using the known method of computing, by the prover, a bitstring s as a result of a pseudo-random function h being applied on the created data set A, applying a mapping function m for mapping an arbitrary bitstring to a problem instance, solving the problem instance, and generating the proof-of-work for the solving of the problem instance, as taught by Tarkkala, in combination with the Proof of Work system of Mattson, for the purpose of establishing a trusted relationship between unknown communication parties in whatever environment with a respective architecture, (Tarkkala-Para. 109), while also ensuring that the proof-of-work is sufficiently fresh (Tarkkala-Para. 132). 
Mattson in view of Tarkkala does not clearly teach to: execute different sessions concurrently on the client computer, wherein the different sessions calculate different proof of work values repeatedly by processing different inputs using the work function, the different inputs being received from a service connected to the different sessions, the different inputs being specifically associated with the different sessions and distinguishing individual sessions from other sessions executing concurrently on the client computer; receive, from the trust server, feedback for the different sessions; and update the calculations of the different proof of work values by the different sessions based at least on the feedback, wherein the feedback varies resource utilization for respective sessions by increasing resource utilization by relatively less trustworthy sessions and decreasing resource utilization by relatively more trustworthy sessions. Wardman teaches to: receive a work function, e.g., once it receives the challenge problem 120, client system 102 may generate a proposed solution 124 by executing program code to solve the challenge problem 120, wherein this program code may be provided to client system 102 (Fig. 1, el. 120, 124; Para. 30); execute different sessions concurrently on the client computer, wherein the different sessions calculate different proof of work values repeatedly by processing different inputs using the work function, the different inputs being received from a service, e.g., service 112 (Fig. 1, el. 
112), connected to the different sessions, the different inputs being specifically associated with the different sessions and distinguishing individual sessions from other sessions executing concurrently on the client computer, e.g., computer system 352 is running multiple VMs 358 simultaneously, wherein this may allow a malicious third-party to scale their attacks, attempting to access the service 112 using the multiple emulated computer systems at the same time, wherein each of the VMs 358 may be configured to attempt to access the same account (e.g., user account 315) with the same service (e.g., service 112) (Fig. 3B, el. 352, 358; Para. 53); server computer system 110 may select a challenge problem 120 and send it to the VM 358A, and once VM 358A generates a proposed solution 124 and provides it to the server computer system 110, system 110 may compare the performance of the challenge problem 120 by VM 358A to the computational performance data 400 (Para. 58); server computer system 110 may select a challenge problem 120 that has a level of difficulty 122 that is commensurate with the technical capabilities of client system 102, as indicated by the reported technical features, wherein server computer system 110 may select the problem 120 such that the level of difficulty 122 increases as the reported technical capabilities of the client system 102 increase (Para. 28); given their frequent use for malicious activity, however, the detection of an emulated computer system may trigger server computer system 110 to implement additional authentication or security challenges (Fig. 1, el. 110; Para. 35); server computer system 110 may instead (or additionally) select multiple challenge problems for client system 102 to solve based on its technical feature (Para. 41); send the different proof of work values to a trust server, e.g., server computer system 110 (Figs. 1, 2, el. 
110); Client system 102 may then, at 136, send a challenge response 126, including the proposed solution 124, to the server computer system 110 (Fig. 1, el. 124, 126, 136; Para. 30); receive, from the trust server, feedback for the different sessions, e.g., given their frequent use for malicious activity, however, the detection of an emulated computer system may trigger server computer system 110 to implement additional authentication or security challenges (Fig. 1, el. 110; Para. 35); server computer system 110 may instead (or additionally) select multiple challenge problems for client system 102 to solve based on its technical feature (Para. 41); server computer system 110 may send the client system 102 one or more additional challenge problems once the server computer system determines that the client system 102 is an emulated computer system (Para. 46); and update the calculations of the different proof of work values by the different sessions based at least on the feedback, e.g., server computer system 110 may select a challenge problem 120 and send it to the VM 358A, and once VM 358A generates a proposed solution 124 and provides it to the server computer system 110, system 110 may compare the performance of the challenge problem 120 by VM 358A to the computational performance data 400 (Para. 58); server computer system 110 may instead (or additionally) select multiple challenge problems for client system 102 to solve based on its technical feature (Para. 41); server computer system 110 may send the client system 102 one or more additional challenge problems once the server computer system determines that the client system 102 is an emulated computer system (Para. 
46), wherein the feedback varies resource utilization for respective sessions by increasing resource utilization by relatively less trustworthy sessions and decreasing resource utilization by relatively more trustworthy sessions, e.g., the determination that a client system 102 is an emulated system may not, by itself, result in the requesting user being denied access to service 112, and instead, detection that the client system is an emulated computer system may instead trigger additional authentication operations to be performed (Para. 25); given their frequent use for malicious activity, however, the detection of an emulated computer system may trigger server computer system 110 to implement additional authentication or security challenges (Fig. 1, el. 110; Para. 35); after determining that the client system 102 is a malicious user attempting to access the service 112, server computer system 110 may select additional, computationally intensive challenge problems for the client system 102 to solve, wherein these additional challenge problems may be computationally intensive or unsolvable such that, when it attempts to generate a proposed solution, the client system 102 may be required to perform vast amounts of computational work (Para. 46). 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Mattson in view of Tarkkala to include to: execute different sessions concurrently on the client computer, wherein the different sessions calculate different proof of work values repeatedly by processing different inputs using the work function, the different inputs being received from a service connected to the different sessions, the different inputs being specifically associated with the different sessions and distinguishing individual sessions from other sessions executing concurrently on the client computer; receive, from the trust server, feedback for the different sessions; and update the calculations of the different proof of work values by the different sessions based at least on the feedback, wherein the feedback varies resource utilization for respective sessions by increasing resource utilization by relatively less trustworthy sessions and decreasing resource utilization by relatively more trustworthy sessions, using the known method of having a client system with multiple VMs concurrently executing sessions for a service, having each VM solve a challenge problem, and having each VM solve an additional problem based on a determination by the server of whether the VM is an emulated system and whether the emulated system is operated by a malicious user, as taught by Wardman, in combination with the Proof of Work system of Mattson in view of Tarkkala, for the purpose of improving data security for both the web service and its users, thereby improving the functioning of the service as a whole (Wardman-Para. 22). Regarding claim 29, the claim is analyzed with respect to claim 22. Regarding claim 30, the claim is analyzed with respect to claim 23. Regarding claim 31, the claim is analyzed with respect to claim 24. Regarding claim 35, the claim is analyzed with respect to claims 21 and 28. 
Regarding claim 36, the claim is analyzed with respect to claim 22.

Regarding claim 37, the claim is analyzed with respect to claim 23.

Regarding claim 38, the claim is analyzed with respect to claim 24.

Claims 25, 26, 32, 33, 39, and 40 are rejected under 35 U.S.C. 103 as being unpatentable over Mattson in view of Tarkkala in view of Wardman and further in view of Bartolucci et al. (US 2020/0389292 A1).

Regarding claim 25, Mattson in view of Tarkkala in view of Wardman teaches the method of claim 21. Mattson in view of Tarkkala in view of Wardman does not clearly teach wherein the work function is a cryptocurrency mining work function. Bartolucci teaches wherein the work function is a cryptocurrency mining work function, e.g., the proof-of-work blockchain may require miners to solve a cryptographic problem, wherein in Bitcoin, the miners 104 find a nonce such that a block header hashes, with SHA-256, to a number that is less than a value defined by the current difficulty, wherein the hashing power required for the proof-of-work algorithm means that a transaction is considered practically irreversible after a certain number of blocks have been mined on top of it (Fig. 1, el. 104; Para. 65). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Mattson in view of Tarkkala in view of Wardman to include wherein the work function is a cryptocurrency mining work function, using the known method of utilizing miners to compute hashes for a proof-of-work blockchain, as taught by Bartolucci, in combination with the Proof of Work system of Mattson in view of Tarkkala in view of Wardman, for the purpose of controlling the locking/unlocking of resources on a blockchain, which provides enhanced security for the user (Bartolucci-Para. 17).

Regarding claim 26, Mattson in view of Tarkkala in view of Wardman teaches the method of claim 21.
Mattson further teaches wherein the work function computes hashes…, the different proof of work values comprising the hashes, e.g., the countermeasure may be a proof of work challenge, such as a hash generating function (Para. 79). Mattson in view of Tarkkala in view of Wardman does not clearly teach wherein the work function computes hashes according to a blockchain. Bartolucci teaches wherein the work function computes hashes according to a blockchain, the different proof of work values comprising the hashes, e.g., the proof-of-work blockchain may require miners to solve a cryptographic problem, wherein in Bitcoin, the miners 104 find a nonce such that a block header hashes, with SHA-256, to a number that is less than a value defined by the current difficulty, wherein the hashing power required for the proof-of-work algorithm means that a transaction is considered practically irreversible after a certain number of blocks have been mined on top of it (Fig. 1, el. 104; Para. 65). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Mattson in view of Tarkkala in view of Wardman to include wherein the work function computes hashes according to a blockchain, the different proof of work values comprising the hashes, using the known method of utilizing miners to compute hashes for a proof-of-work blockchain, as taught by Bartolucci, in combination with the Proof of Work system of Mattson in view of Tarkkala in view of Wardman, for the purpose of controlling the locking/unlocking of resources on a blockchain, which provides enhanced security for the user (Bartolucci-Para. 17).

Regarding claim 32, the claim is analyzed with respect to claim 25.

Regarding claim 33, the claim is analyzed with respect to claim 26.

Regarding claim 39, the claim is analyzed with respect to claim 25.

Regarding claim 40, the claim is analyzed with respect to claim 26.
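The Bitcoin-style work function Bartolucci describes (find a nonce such that the block-header hash falls below a difficulty-defined value) reduces to a brute-force search. As an illustration only, here is a generic toy sketch of that search, not Bartolucci's implementation: the header bytes, the permissive target, and the use of Bitcoin's double SHA-256 are assumptions made for the example.

```python
import hashlib

def mine(header: bytes, target: int, max_nonce: int = 1 << 22):
    """Brute-force a nonce such that the double-SHA-256 of
    (header || nonce) is numerically below `target`.
    Returns (nonce, digest), or None if no nonce in range works."""
    for nonce in range(max_nonce):
        data = header + nonce.to_bytes(4, "little")
        digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
    return None  # a real miner would then vary other header fields

# A permissive target (~16 required leading zero bits) keeps this toy
# search fast; Bitcoin's actual difficulty makes the same search
# computationally infeasible for a single machine.
target = 1 << 240
result = mine(b"toy block header", target)
```

The "hashing power required" point in the citation follows directly from the loop above: lowering `target` raises the expected number of hash evaluations exponentially, which is exactly what makes mined blocks expensive to rewrite.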
Claims 27 and 34 are rejected under 35 U.S.C. 103 as being unpatentable over Mattson in view of Tarkkala in view of Wardman and further in view of Colangelo (US 2019/0303448).

Regarding claim 27, Mattson in view of Tarkkala in view of Wardman teaches the method of claim 21. Mattson in view of Tarkkala in view of Wardman does not clearly teach wherein the trust server selectively increases a view count for a video based at least on the calculations. Colangelo teaches wherein the trust server, e.g., media embedding system 130 (Fig. 4, el. 130), selectively increases a view count for a video based at least on the calculations, e.g., the media embedding system 130 may report 422 a content item view to the content provider 120, wherein the media embedding system 130 analyzes 422 the event data to determine whether the user input was provided by a person or a computer-implemented bot, and thereby determine if the content item was viewed by a person, wherein if the media embedding system 130 determines the user to be a person based on the behavioral information, the media embedding system 130 may report 422 a view of a media content item to the content provider 120, and wherein if the media embedding system 130 determines the user is likely a bot, the view may not be reported to the content provider 120 (Fig. 4, el. 120, 130, 422; Para. 73).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Mattson in view of Tarkkala in view of Wardman to include wherein the trust server selectively increases a view count for a video based at least on the calculations, using the known method of tracking the number of views of media content items, determining whether user input was provided by a person or a bot, and reporting/not reporting the view based on the determination, as taught by Colangelo, in combination with the method of frustrating ratings or results manipulation using Proof of Work of Mattson in view of Tarkkala in view of Wardman, for the purpose of obtaining more accurate viewing statistics.

Regarding claim 34, the claim is analyzed with respect to claim 27.

Relevant Prior Art

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Meriac (US 2017/0237770 A1)—Meriac discloses mitigating a power-denial of service attack on a first device by a second device (Abstract).

Caragea (US 2019/0068561 A1)—Caragea discloses multiple sessions may be carried out concurrently within a single VM, for example by multiple instances of a browser (as in tabbed browsing), or by distinct applications running at the same time (Para. 70).

Feng et al. (US 2011/0231913 A1)—Feng discloses proof-of-work puzzles prevent an adversary from using a single computer to participate in concurrent ticket purchasing campaigns since solving simultaneous proof-of-work puzzles simply slows down the solution of each rather than providing an advantage (Para. 90).

Juels et al. (US 7,197,639 B1)—Juels discloses a server imposes a task, such as a puzzle, to a client. The input data for the puzzle includes an encrypted version of seed data, wherein the seed data includes information identifying the server, the client, the session, and the date and time the puzzle expires (Col. 17, lines 12-42).
Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEREMY DUFFIELD whose telephone number is (571) 270-1643. The examiner can normally be reached Monday - Friday, 7:00 AM - 3:00 PM (ET). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Yin-Chen Shaw, can be reached at (571) 272-8878. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users.
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

17 February 2026

/Jeremy S Duffield/
Primary Examiner, Art Unit 2498

Prosecution Timeline

Jul 15, 2024: Application Filed
Sep 16, 2025: Non-Final Rejection — §103
Oct 30, 2025: Examiner Interview Summary
Oct 30, 2025: Applicant Interview (Telephonic)
Nov 20, 2025: Response Filed
Feb 17, 2026: Final Rejection — §103
Apr 14, 2026: Applicant Interview (Telephonic)
Apr 14, 2026: Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12598067: Method, Device, and System for Updating Anchor Key in a Communication Network for Encrypted Communication with Service Applications. Granted Apr 07, 2026 (2y 5m to grant).
Patent 12591642: System for Steganalysis Detection of Metadata in a Video Stream for Providing Real-Time Data. Granted Mar 31, 2026 (2y 5m to grant).
Patent 12579320: Split Counters with Dynamic Epoch Tracking for Cryptographic Protection of Secure Data. Granted Mar 17, 2026 (2y 5m to grant).
Patent 12572685: Context-Based Pattern Matching for Sensitive Data Detection. Granted Mar 10, 2026 (2y 5m to grant).
Patent 12554872: System and Method for Notifying Users About Publicly Available Data. Granted Feb 17, 2026 (2y 5m to grant).
Based on this examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 49%
Grant Probability With Interview: 99% (+53.1%)
Median Time to Grant: 3y 11m
PTA Risk: Moderate
Based on 438 resolved cases by this examiner. Grant probability derived from career allow rate.
