Prosecution Insights
Last updated: April 19, 2026
Application No. 18/888,273

METHOD AND SYSTEM FOR INDEPENDENT PROOF-OF-CORRECT-SAMPLING OF STREAMING DATA

Final Rejection — §103
Filed: Sep 18, 2024
Examiner: TOLENTINO, RODERICK
Art Unit: 2439
Tech Center: 2400 — Computer Networks
Assignee: Nokia Solutions and Networks Oy
OA Round: 2 (Final)

Outlook: Favorable
Grant Probability: 77% (99% with interview)
Expected OA Rounds: 3-4
Estimated Time to Grant: 3y 4m

Examiner Intelligence

Career Allow Rate: 77% — above average (545 granted / 705 resolved; +19.3% vs TC avg)
Interview Lift: +35.4% for resolved cases with an interview
Typical Timeline: 3y 4m average prosecution; 25 applications currently pending
Career History: 730 total applications across all art units

Statute-Specific Performance

§101: 15.7% (-24.3% vs TC avg)
§103: 56.2% (+16.2% vs TC avg)
§102: 11.9% (-28.1% vs TC avg)
§112: 8.3% (-31.7% vs TC avg)

Tech Center averages are estimates. Based on career data from 705 resolved cases.

Office Action (§103)
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Detailed Action

This Office Action is in response to the reply filed by Applicant on 3/2/2026. Claims 1-17 are pending. This Office Action is Final.

Response to Arguments

A) Applicant's arguments and amendments regarding the rejection under 35 USC 101 (directed to non-statutory subject matter as a transitory medium) have been considered and deemed persuasive. As a result, these rejections have been withdrawn.

B) Applicant's arguments with respect to claims 1, 16 and 17 have been considered but are moot because the new ground of rejection does not rely on the exact combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 2, 5, 6, 11-13 and 15-17 are rejected under 35 U.S.C. 103 as being unpatentable over Krassovsky et al. (US 2024/0039720) in view of Zamani et al. (US 2025/0190984).

As per claim 1, Krassovsky teaches an apparatus to provide a first network element that is a data source of multiple data sources able to provide data for a data stream, wherein the apparatus comprises: at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the apparatus at least to (Krassovsky, Paragraph 0028 recites "Client device 110 may include a memory 220-1 and a processor 212-1. Memory 220-1 may include a group chat application 222, configured to run in client device 110 and couple with input device 214 and output device 216. Application 222 may be downloaded by the user from server 130 and may be hosted by server 130."): employ a first function with a secret key (SK), specific to the first network element, to determine whether the first network element is selected to be included in a sample of the data stream (Krassovsky, Paragraph 0032 recites "Encryption tool 244 provides encrypted pairs including a public key and a private key for each identification value of registered users of chat application 222. An identification value for a user may include an identification number for client device 110, an identification number or a name tag associated with the user of client device 110, or any combination thereof.").

Krassovsky fails to teach wherein the sample is from a subset of the multiple data sources. However, in an analogous art, Kumar teaches wherein the sample is from a subset of the multiple data sources (Kumar, Paragraph 0037 recites "Referring to FIG. 6, a method 600 is one suitable implementation for step 420 in FIG. 4. A sample size is selected (step 610). The selected sample size in step 610 could be a sample size for a single data source, wherein method 600 is repeated for each data source, or could be a sample size for multiple data sources. The sample size selected in step 610 thus specifies a subset of data in one or more data sources." And Claim 1 recites "wherein the data harvester characterizes a plurality of data sources in a target system, samples a subset of data in the plurality of data sources, and estimates a time of completion based on the samples."). It would have been obvious to a person of ordinary skill in the art, at the earliest effective filing date, to use Kumar's Data Harvester with Krassovsky's device verification using key transparency because it offers the advantage of a more advanced way of sampling data for better analysis.

Krassovsky also fails to teach: determine that the first network element is selected to be included in the sample; and provide, based on the first network element being selected and to one or more second network elements, data as part of the sample and provide an inclusion proof indicating that a selection of data to be included as part of the sample was computed correctly, and is supposed to be part of the sample.

However, in an analogous art, Zamani teaches these limitations (Zamani, Paragraph 0145 recites "As an example, to generate an inclusion proof, nodes in X participate in a one-time setup protocol for threshold signing-verification key. One such example is a threshold signature where the verification key (vk) is a public parameter and the signing key (sk) is secret that is shared among all nodes in X. Then after the setup phase, each node in X will receive their share of the secret key. The secure element on the user device can be initialized with the verification key vk. Once the key is established, the inclusion proof (e.g., the deposit proof) for each interaction can be a threshold signature on the block that includes the interaction. Hence, for these blockchains, the secure element on the user device can validate the inclusion of the deposit message (e.g., the deposit proof) by validating a single signature. Also, the size of such inclusion proof is at most the size of the block that includes the transaction, which can further be reduced by using techniques such as Merkle trees."). It would have been obvious to a person of ordinary skill in the art, at the earliest effective filing date, to use Zamani's offline interaction blockchain system and method with Krassovsky's device verification using key transparency because it offers the advantage of securely including a trusted element into a network.
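The Zamani passage notes that inclusion-proof size can be reduced "by using techniques such as Merkle trees." As a minimal sketch only (this is not Zamani's or the application's actual construction; all function names are hypothetical), a Merkle inclusion proof consists of the sibling hashes along the path from a leaf to the root:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_levels(leaves):
    """Build every level of a Merkle tree bottom-up, duplicating the
    last node whenever a level has odd length."""
    levels = [[h(leaf) for leaf in leaves]]
    while len(levels[-1]) > 1:
        cur = levels[-1]
        if len(cur) % 2:
            cur = cur + [cur[-1]]
        levels.append([h(cur[i] + cur[i + 1]) for i in range(0, len(cur), 2)])
    return levels

def inclusion_proof(leaves, index):
    """Return (sibling path, root) needed to recompute the root from leaf `index`."""
    levels = build_levels(leaves)
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        sibling = index ^ 1
        path.append((level[sibling], index % 2))  # (sibling hash, node-is-right-child?)
        index //= 2
    return path, levels[-1][0]

def verify_inclusion(leaf, path, root):
    """Hash the leaf up the path and compare against the published root."""
    node = h(leaf)
    for sibling, node_is_right in path:
        node = h(sibling + node) if node_is_right else h(node + sibling)
    return node == root
```

The proof is logarithmic in the number of leaves, which is the size reduction the quoted passage alludes to.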
As per claim 2, Krassovsky in combination with Zamani teaches the apparatus according to claim 1, Krassovsky further teaches wherein the first function is a verifiable random function, VRF, employed at the first network element, to independently evaluate the VRF with the secret key (SK) specific to the first network element (Krassovsky, Paragraphs 0051-0052 recites “Step 806 includes requesting, from a verifiable directory, an identity proof of the second participant associated with the identification for the second participant, wherein the verifiable directory includes a list of encryption public-keys for client devices associated with each of multiple users in the chat server. In some embodiments, step 806 includes requesting an updated identity proof for the second participant when the identity proof is not decoded by the private key. Step 808 includes verifying the identity proof of the second participant with a public key associated with the second client device. In some embodiments, step 808 includes identifying a source of an identity attack from at least one of the second participant and the chat server. In some embodiments, step 808 includes requesting, to the verifiable directory, to update an identity for the first participant or the second participant. In some embodiments, step 808 includes matching, in the first client device, an output of a verifiable random function with the public key associated with the second client device as an input, with the identity proof. In some embodiments, the identity proof is a graphic code and step 808 includes scanning the graphic code with the first client device.”). 
As per claim 5, Krassovsky in combination with Zamani teaches the apparatus according to claim 1. Krassovsky further teaches wherein the at least one memory stores instructions that, when executed by the at least one processor, further cause the apparatus to: evaluate the first function with the secret key (SK) specific to the first network element and a public key (PK) (Krassovsky, Paragraphs 0051-0052, reproduced in the rejection of claim 2 above).

As per claim 6, Krassovsky in combination with Zamani teaches the apparatus according to claim 1. Krassovsky further teaches wherein the at least one memory stores instructions that, when executed by the at least one processor, further cause the apparatus to sign its produced data with its private key that can be checked with a corresponding public key (PK) and/or a verification key (Krassovsky, Paragraphs 0051-0052, reproduced in the rejection of claim 2 above).

As per claim 11, Krassovsky in combination with Zamani teaches the apparatus according to claim 6. Krassovsky further teaches wherein the at least one memory stores instructions that, when executed by the at least one processor, further cause the apparatus at least to, at each of the slots, use the public key (PK) and the secret key (SK) for evaluating the first function and obtain a pi, which is a VRF proof, calculated based on the public key (PK) as proof of properties of the data stream (Krassovsky, Paragraphs 0051-0052, reproduced in the rejection of claim 2 above).
As per claim 12, Krassovsky in combination with Zamani teaches the apparatus according to claim 11. Krassovsky further teaches wherein the at least one memory stores instructions that, when executed by the at least one processor, cause the apparatus to obtain a value beta, which is a VRF hash output, and use the beta to evaluate whether it has been selected to be included in the sample in this slot according to an algorithm that uses a sampling probability; wherein the beta is calculated based on pi; wherein the sampling probability is used to compute an expected number of first network elements in the multiple data sources in the sample by multiplying a total number of the first network elements and the sampling probability (Krassovsky, Paragraphs 0051-0052, reproduced in the rejection of claim 2 above).

As per claim 13, Krassovsky in combination with Zamani teaches the apparatus according to claim 1. Krassovsky further teaches wherein the at least one memory stores instructions that, when executed by the at least one processor, cause the apparatus at least to: release the secret key (SK) at a predetermined time period, and create a new secret key (SK) for evaluations of the first function (Krassovsky, Paragraph 0059 recites "Step 910 includes transmitting the private key to the user of the chat group service. In some embodiments, step 910 includes receiving, from a participant in a group chat supported by the chat group service, a request for an identity proof of the user of the chat group service, and transmitting the private key to the participant in the group chat supported by the chat group service. In some embodiments, step 910 includes receiving, from the user of the chat group service, a request to modify the identification value into a new identification value, requesting, from the selected witness authority, a cross-validation of the new identification value, generating a new encrypted pair upon receipt of the cross-validation of the new identification value, the new encrypted pair including a new public key and a new private key, and updating the database to the new identification value associated with the new public key. In some embodiments, step 910 includes generating a new encrypted pair after a pre-selected period of time, the new encrypted pair including a new public key and a new private key, and updating the database to include the identification value associated with the new public key. In some embodiments, step 910 includes marking the identification value and the public key for deletion in the database; and deleting the identification value and the public key after a pre-selected period of time.").
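Claim 12 describes comparing a VRF hash output (beta) against a sampling probability, with the expected sample size equal to the total number of elements multiplied by that probability. The sketch below is illustrative only: it substitutes a keyed HMAC for a true VRF (a real VRF such as ECVRF would additionally yield a publicly verifiable proof pi), and all names are hypothetical:

```python
import hmac
import hashlib

HASH_BITS = 256  # output width of SHA-256

def sample_decision(secret_key: bytes, slot_value: bytes, p: float) -> bool:
    """Stand-in for the claimed VRF evaluation: derive a pseudorandom
    value beta from the element's secret key and the slot representation
    value, and select the element exactly when beta / 2**256 < p."""
    beta = hmac.new(secret_key, slot_value, hashlib.sha256).digest()
    return int.from_bytes(beta, "big") / 2 ** HASH_BITS < p

def expected_sample_size(n_elements: int, p: float) -> float:
    """Expected sample size per the claim: total elements x sampling probability."""
    return n_elements * p
```

Because each element evaluates the function locally with its own secret key, no coordinator learns the sample in advance, yet (with a real VRF) anyone holding the public key could later check that the selection was computed correctly.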
As per claim 15, Krassovsky in combination with Zamani teaches the apparatus according to claim 1. Krassovsky further teaches wherein the at least one memory stores instructions that, when executed by the at least one processor, cause the apparatus at least to perform periodical key release, and the periodical key release depends on at least one of the following: a number of first network elements in the multiple data sources, a size of the inclusion proof, a size of the data, a size of the public key (PK), a size of the secret key (SK), a sampling probability, and a key release period (Krassovsky, Paragraph 0059, reproduced in the rejection of claim 13 above).

Regarding claims 16 and 17, these claims are directed to a method and a computer-readable medium associated with the device/element of claim 1. Claims 16 and 17 are of similar scope to claim 1 and are therefore rejected under similar rationale.

Claims 3 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Krassovsky et al. (US 2024/0039720) and Zamani et al. (US 2025/0190984) in further view of Weigand (US 2023/0060241).

As per claim 3, Krassovsky in combination with Zamani teaches the apparatus according to claim 1, but fails to teach wherein the at least one memory stores instructions that, when executed by the at least one processor, cause the apparatus at least to determine that the first network element is not selected to be included in the sample, and provide, based on the first network element not being selected and to the one or more second network elements, an exclusion proof that said first network element is not selected in the sample, without providing data to be part of the sample. However, in an analogous art, Weigand teaches these limitations (Weigand, Paragraph 0027 recites "Correspondingly, the sparse hash tree includes 2.sup.256 leaf nodes and the generated first hash value is set to the leaf node of the sparse hash tree corresponding to the value of the first hash value. In addition to hash trees, the sparse hash tree provides the possibility of a proof of exclusion, namely by determination that the leaf node corresponding to a particular hash value is null."). It would have been obvious to a person of ordinary skill in the art, at the earliest effective filing date, to use Weigand's document integrity protection with Krassovsky's device verification using key transparency because it offers the advantage of knowing the active and inactive nodes in a network.

As per claim 14, Krassovsky in combination with Zamani and Weigand teaches the apparatus according to claim 3. Weigand further teaches wherein the exclusion proof is based on at least one of the following: a number of first network elements in the multiple data sources, a size of the proof, a size of the data, a size of the public key (PK), a size of the secret key (SK), a sampling probability, and a key release period (Weigand, Paragraph 0027, reproduced in the rejection of claim 3 above). It would have been obvious to a person of ordinary skill in the art, at the earliest effective filing date, to use Weigand's document integrity protection with Krassovsky's device verification using key transparency because it offers the advantage of knowing the active and inactive nodes in a network.
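Weigand's exclusion proof rests on a sparse hash tree whose absent leaves are null: non-membership is shown by hashing a null leaf up to the published root. A toy sketch under stated assumptions (depth 8 rather than Weigand's 256, precomputed empty-subtree hashes, all names hypothetical):

```python
import hashlib

DEPTH = 8  # toy depth; Weigand's tree has 2**256 leaves (depth 256)

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

# Hash of an all-empty subtree at each height, so empty regions cost O(1).
EMPTY = [b"\x00" * 32]
for _ in range(DEPTH):
    EMPTY.append(h(EMPTY[-1] + EMPTY[-1]))

def root_and_proof(occupied: dict, index: int):
    """Compute the sparse-tree root and the sibling path for `index`.
    `occupied` maps leaf index -> leaf hash; absent leaves are null."""
    def node(height, pos):
        if height == 0:
            return occupied.get(pos, EMPTY[0])
        lo, hi = pos << height, (pos + 1) << height
        # Entirely empty subtrees collapse to a precomputed constant.
        if not any(lo <= i < hi for i in occupied):
            return EMPTY[height]
        return h(node(height - 1, 2 * pos) + node(height - 1, 2 * pos + 1))
    path, pos = [], index
    for height in range(DEPTH):
        path.append(node(height, pos ^ 1))  # sibling at this height
        pos >>= 1
    return node(DEPTH, 0), path

def verify_exclusion(index: int, path, root) -> bool:
    """Exclusion holds when the null leaf hashes up to the published root."""
    acc, pos = EMPTY[0], index
    for sibling in path:
        acc = h(sibling + acc) if pos & 1 else h(acc + sibling)
        pos >>= 1
    return acc == root
```

If the leaf at `index` is actually occupied, starting the recomputation from the null leaf will not reproduce the root, so the proof fails, which is exactly the "leaf node is null" test the quoted paragraph describes.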
Claims 4 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Krassovsky et al. (US 2024/0039720) and Zamani et al. (US 2025/0190984) in further view of Reinsberg et al. (US 2019/0179806).

As per claim 4, Krassovsky in combination with Zamani teaches the apparatus according to claim 1, but fails to teach wherein the first network element is a randomly selected data source of multiple data sources; the at least one memory stores instructions that, when executed by the at least one processor, cause the apparatus at least to produce data forming part of the data stream to be provided to the second network element; the sample is a subset of dataset items of the data stream determined via at least a statistical sampling method; the sample includes data items of a plurality of first network elements; and/or the inclusion proof is for verifying that the selection of data in the sample was computed correctly. However, in an analogous art, Reinsberg teaches these limitations (Reinsberg, Paragraph 0035 recites "For example, validator nodes may be selected randomly from validator nodes that wrote some previous number of blocks of the blockchains, such as, for example, the last 1000 validator nodes to write a block. Multiple verification requests may be sent by a user in parallel to multiple validator nodes, which may all respond with signed secret messages and verify received user-signed secret messages. The last validator node to verify a user-signed secret message may write the communications address/public key pair to the verified database."). It would have been obvious to a person of ordinary skill in the art, at the earliest effective filing date, to use Reinsberg's decentralized database associating public keys and communications addresses with Krassovsky's device verification using key transparency because it offers the advantage of verifying communications in a network.

As per claim 7, Krassovsky in combination with Zamani and Reinsberg teaches the apparatus according to claim 4. Krassovsky further teaches wherein the at least one memory stores instructions that, when executed by the at least one processor, further cause the apparatus to: receive, from the second network element, a control signal; generate a public key (PK) and/or the secret key (SK); and provide, to the one or more second network elements, the public key (PK) (Krassovsky, Paragraphs 0051-0052, reproduced in the rejection of claim 2 above).

Claims 8-10 are rejected under 35 U.S.C. 103 as being unpatentable over Krassovsky et al. (US 2024/0039720), Zamani et al. (US 2025/0190984) and Reinsberg et al. (US 2019/0179806) in further view of Moon et al. (US 2024/0007292).

As per claim 8, Krassovsky in combination with Zamani and Reinsberg teaches the apparatus according to claim 7, but fails to teach wherein: the data stream is divided into slots; the first network element at each of the slots produces a part of the data of the data stream; each of the slots of the data stream is configured with a slot representation value; and wherein the slot representation value is used as part of the input for proof generation and verification.
However, in an analogous art, Moon teaches these limitations (Moon, Paragraph 0013 recites "According to an aspect of an example embodiment of the present disclosure, there is provided a calculating method using a zero-knowledge proof-friendly one-way function, performed by a computing device, the calculating method including: calculating a first intermediate bit stream by inputting an input bit stream of a one-way function to an augmented matrix, calculating a second intermediate bit stream by dividing the first intermediate bit stream into a predetermined number of bit streams and inputting each of the predetermined number of divided bit streams to a substitution-box (S-box), and outputting an output bit stream of the one-way function by inputting the second intermediate bit stream to a reduced matrix."). It would have been obvious to a person of ordinary skill in the art, at the earliest effective filing date, to use Moon's calculating method using a zero-knowledge proof-friendly one-way function, and apparatus for implementing the same, with Krassovsky's device verification using key transparency because it offers the advantage of verifying communications in a network.
As per claim 9, Krassovsky in combination with Zamani, Reinsburg and Moon teaches the first network apparatus according to claim 8, Moon further teaches wherein, the slot representation value is obtained by, updating the slot representation value at predetermined time points, wherein the slot representation value changes incremental at the predetermined time points; or reaching a consensus for a next value of the slot representation value, if the first network element is part of a decentralized blockchain; or receiving, from another network element configured to announce the next value of the slot representation value, the next value of the slot representation value; or obtaining, by requesting from another network element, the next value of the slot representation value. However, in an analogous art Moon teaches wherein, the slot representation value is obtained by, updating the slot representation value at predetermined time points, wherein the slot representation value changes incremental at the predetermined time points; or reaching a consensus for a next value of the slot representation value, if the first network element is part of a decentralized blockchain; or receiving, from another network element configured to announce the next value of the slot representation value, the next value of the slot representation value; or obtaining, by requesting from another network element, the next value of the slot representation value (Moon, Paragraph 0013 recites “According to an aspect of an example embodiment of the present disclosure, there is provided a calculating method using a zero-knowledge proof-friendly one-way function, performed by a computing device, the calculating method including: calculating a first intermediate bit stream by inputting an input bit stream of a one-way function to an augmented matrix, calculating a second intermediate bit stream by dividing the first intermediate bit stream into a predetermined number of bit streams and inputting each 
of the predetermined number of divided bit streams to a substitution-box (S-box), and outputting an output bit stream of the one-way function by inputting the second intermediate bit stream to a reduced matrix."). It would have been obvious to a person of ordinary skill in the art, before the earliest effective filing date, to use Moon's calculating method using a zero-knowledge proof-friendly one-way function, and apparatus for implementing the same, with Krassovsky's device verification using key transparency because it offers the advantage of verifying communications in a network.

As per claim 10, Krassovsky in combination with Zamani, Reinsburg and Moon teaches the first network apparatus according to claim 8. Krassovsky further teaches wherein the next value of the slot representation value corresponds to a hash of a previous sample and/or to another arbitrary value determined by another network element (Krassovsky, Paragraph 0063 recites "FIG. 6 is a tree diagram 600 illustrating the architecture of a verifiable directory 654 storing identities values 601-1, 601-2, 601-3, and 601-4 (hereinafter, collectively referred to as "identity values 601") for each of the users of a group chat application, according to some embodiments. Identity values 601 may be identity values by the users themselves or an identity provider (e.g., a government institution or some other authority). Diagram 600 may be a Merkle tree, configured in such a way that all the information may be compacted into a single hash 631 after one or two layers of hashing. A first layer includes hashes 611-1, 611-2, 611-3, and 611-4 (hereinafter, collectively referred to as "first layer hashes 611"), resulting from hashing identity values 601 with the public keys for each of the users. A second layer of hashes 621-1 and 621-2 (hereinafter, collectively referred to as "second layer hashes 621") may include a pairwise hashing of first layer hashes 611 according to rules defined by a variable random function (cf. 
verifiable random tool 246). Finally, a third layer of hashing 631 is achieved again by pairwise hashing second layer hashes 621 based on rules established by the variable random function. Notice how the complexity and size of the data structure is reduced by a factor of 2 for higher layers.").

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RODERICK TOLENTINO whose telephone number is (571)272-2661. The examiner can normally be reached Mon-Fri 8am-4pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Luu Pham, can be reached at 571-270-5002. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. 
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

RODERICK TOLENTINO
Examiner, Art Unit 2439

/RODERICK TOLENTINO/
Primary Examiner, Art Unit 2439
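The three-layer hash tree quoted from Krassovsky (Paragraph 0063) — leaf hashes over identity value plus public key, then pairwise hashing up to a single root — can be sketched as follows. The choice of SHA-256 and simple left-to-right pairing are illustrative assumptions; the reference leaves the pairing rules to a verifiable random function.

```python
# Minimal sketch of Krassovsky's (¶0063) verifiable directory tree:
# first layer = hash(identity value, public key) per user; higher
# layers = pairwise hashing, halving the layer each time, until a
# single root (hash 631 in the figure) remains.
import hashlib

def h(*parts: bytes) -> bytes:
    """Hash the concatenation of byte strings (SHA-256 assumed here)."""
    return hashlib.sha256(b"".join(parts)).digest()

def directory_root(identities, public_keys):
    # First layer: each identity value hashed with its public key.
    layer = [h(i, k) for i, k in zip(identities, public_keys)]
    # Pairwise hashing reduces the layer by a factor of 2 each round,
    # matching the size reduction the quoted passage points out.
    while len(layer) > 1:
        layer = [h(layer[i], layer[i + 1]) for i in range(0, len(layer), 2)]
    return layer[0]
```

With four users, as in the quoted figure, this performs exactly two pairwise rounds; any change to one identity value or key propagates up and changes the root, which is what makes the single root hash sufficient for verification.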

Prosecution Timeline

Sep 18, 2024 · Application Filed
Nov 25, 2025 · Non-Final Rejection — §103
Mar 02, 2026 · Response Filed
Mar 25, 2026 · Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603907
SERVER AND METHOD FOR PROVIDING ONLINE THREAT DATA BASED ON USER-CUSTOMIZED KEYWORDS FOR PRIVATE CHANNEL
2y 5m to grant · Granted Apr 14, 2026
Patent 12592915
INFERENCE-BASED SELECTIVE FLOW INSPECTION
2y 5m to grant · Granted Mar 31, 2026
Patent 12580946
SYSTEMS AND METHODS FOR TRIGGERING TOKEN ALERTS
2y 5m to grant · Granted Mar 17, 2026
Patent 12580948
CYBERSECURITY OPERATIONS MITIGATION MANAGEMENT
2y 5m to grant · Granted Mar 17, 2026
Patent 12572632
SYSTEMS AND METHODS FOR DATA SECURITY MODEL MODIFICATION AND ANOMALY DETECTION
2y 5m to grant · Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 77%
With Interview (+35.4%): 99%
Median Time to Grant: 3y 4m
PTA Risk: Moderate
Based on 705 resolved cases by this examiner. Grant probability derived from career allow rate.
