Prosecution Insights
Last updated: April 19, 2026
Application No. 18/670,276

UTILITY DATASET ANONYMIZATION

Non-Final OA: §101, §112
Filed: May 21, 2024
Examiner: SUH, ANDREW
Art Unit: 2493
Tech Center: 2400 — Computer Networks
Assignee: Itron, Inc.
OA Round: 1 (Non-Final)
Grant Probability: 80% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 12m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 80% (above average): 135 granted / 169 resolved, +21.9% vs TC avg
Interview Lift: +39.8% among resolved cases with interview (strong)
Typical Timeline: 2y 12m average prosecution; 20 applications currently pending
Career History: 189 total applications across all art units

Statute-Specific Performance

§101: 8.7% (-31.3% vs TC avg)
§103: 51.7% (+11.7% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 21.4% (-18.6% vs TC avg)
Tech Center averages are estimates. Based on career data from 169 resolved cases.
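The per-statute deltas above can be sanity-checked with a few lines of arithmetic. Assuming each delta is measured against the Tech Center average estimate for that statute, the implied TC average works out the same (about 40%) for every statute listed; the figures below are taken directly from this section.

```python
# Examiner's per-statute overcome rates and deltas vs the Tech Center
# average, as listed in the Statute-Specific Performance section.
stats = {
    "101": (8.7, -31.3),
    "103": (51.7, +11.7),
    "102": (11.1, -28.9),
    "112": (21.4, -18.6),
}

for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta  # implied Tech Center average estimate
    print(f"§{statute}: examiner {rate}% vs TC avg ~{tc_avg:.1f}%")
```

The uniform implied average suggests the dashboard uses a single TC-wide baseline rather than per-statute baselines; that is an inference from the numbers, not something the page states.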

Office Action

DETAILED ACTION

In response to the communication filed on 05/21/2024, the following action is set forth. In this Office Action, claims 1-20 are pending, including independent claims 1, 12, and 18. Claim 5 is rejected under 35 U.S.C. 112(b). Claims 12-17 and 19 are rejected under 35 USC § 101. Claims 1-4, 6-11, 18, and 20 are allowed.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) was submitted on 07/16/2025. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Drawings

The drawings were received on 05/21/2024. These drawings are accepted.

Claim Objections

Claim 9 is objected to because of the following informalities: Claim 9 recites "topology associated with the first data set." It should be "a topology." Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claim 5 is rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention. Claim 5 recites the limitation "the collection of endpoints." There is insufficient antecedent basis for this limitation in the claim.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 12 and 15-17:

STEP 1: Statutory Category. Claim 12 (and dependent claims 15-17) recites a method (process) and therefore falls within a statutory category under 35 USC § 101.

STEP 2A, Prong One: Judicial Exception. Claims 12 and 15-17 recite the steps of "assigning anonymous identifiers" (claim 12), "processing the first dataset to swap data associated with the endpoint with data associated with other endpoint" (claim 12), "combining" datasets and "swapping" data between endpoints of different datasets (claims 15-16), and various forms of swapping identifiers and service information among endpoints and a parent node (claim 17). These limitations are directed to data anonymization, rearrangement, and re-association of information, which fall within the "Mental Processes" grouping of the 2019 PEG as concepts that can be performed in the human mind, such as assigning substitute identifiers, comparing characteristics, and reorganizing records based on rules. Accordingly, the claimed limitations fall within a judicial exception.

STEP 2A, Prong Two: Practical Application. Claims 12 and 15-17 do not recite additional elements that integrate the abstract idea into a practical application. The claims merely recite generic data-processing steps applied to datasets, such as "assigning," "processing," "combining," "swapping," and "outputting." These limitations amount to instructions to apply the judicial exception using a computer, or to use a computer as a tool to perform the abstract idea. The claims do not recite any improvement to computer functionality, a particular machine, or a specific technological solution. Thus, the claims do not integrate the abstract idea into a practical application - see MPEP 2106.05(f).

STEP 2B: Significantly More. Claims 12 and 15-17 do not recite additional elements that amount to significantly more than the judicial exception.
The additional limitations merely further describe how the abstract data-manipulation rules are applied and constitute insignificant extra-solution activity. Accordingly, the additional elements, individually or in combination, fail to transform the abstract idea into patent-eligible subject matter - see MPEP 2106.05(g). Dependent claims 13-14 inherit the deficiencies of independent claim 12 upon which they depend and are rejected as well. Claim 19 falls within a different statutory category (manufacture, i.e., one or more non-transitory computer-readable media storing computer-executable instructions); however, for the same reasons discussed above, the claim merely recites abstract data-manipulation rules implemented on a computer-readable medium and therefore remains directed to an abstract idea under 35 USC § 101. Therefore, claims 12-17 and 19 are not patent-eligible under 35 USC § 101.

Allowable Subject Matter

Claim 5 would be allowable if rewritten to overcome the rejection under 35 U.S.C. 112(b) set forth in this Office action. Claims 12-17 and 19 would be allowable if rewritten to overcome the rejection under 35 U.S.C. 101 set forth in this Office action. The following is a statement of reasons for the indication of allowable subject matter. Claims 1-4, 6-11, 18, and 20 are allowed.

Independent claim 1 (System):
- Receives a utility dataset associated with a collection of nodes, where a portion contains original identifiers including personally identifiable information (PII).
- Replaces the original identifiers with anonymous identifiers for the nodes.
- Produces a transformed dataset that preserves the utility of the node collections while protecting personal information.

None of the references considered below discloses receiving a utility dataset associated with a collection of nodes and anonymizing node identifiers while selectively preserving node-level utility in a transformed dataset.
Independent claim 12 (Method):
- Assigns anonymous identifiers to endpoints in a utility dataset, each endpoint being associated with at least one parent node.
- Processes the dataset, based on a feature of the utility data, to swap data among endpoints.
- Outputs a processed dataset with obscured endpoint associations.

None of the references considered below discloses processing a utility dataset associated with endpoints by assigning anonymous identifiers to the endpoints associated with parent nodes and, based on a feature of the utility data, swapping data among the endpoints to output a processed utility dataset.

Independent claim 18 (Non-transitory Computer-Readable Medium):
- Uses a deterministic one-way hashing algorithm to assign anonymous identifiers to endpoints associated with parent nodes.
- Processes the dataset to swap data among endpoints.
- Outputs the processed dataset.

None of the references considered below discloses processing a utility dataset associated with endpoints by using a deterministic one-way hashing algorithm to assign anonymous identifiers to the endpoints associated with at least one parent node, swapping data among the endpoints, and outputting the processed utility dataset.

Troitsky et al. (US 20210266297 A1): [0054] In one aspect, the data sources 110 may be clients, different user devices, Internet of Things (IoT) devices, or data management systems, such as a user database which aggregates and stores all data about a user and which the user manages through an interface; [0131] In step 210, a source 110 receives a request from a recipient 120 to send data; [0134] method 200 determines whether or not critical data is present in the data intended to be sent from the source 110 to the recipient 120.
If critical data is discovered, then method 200 proceeds to step 220a; [0136] In step 220, method 200 sends the identifier of the user (either the identifier generated in step 212 or the existing one found in step 211) from the source to the token generator 130; [0137] In step 221, the token generator 130 determines whether an existing anonymous identifier is linked to the obtained identifier of the user; [0140-0143] In step 223, method 200 generates random tokens for the first and second pairs, e.g., for the pair user identifier/anonymous identifier and the pair critical data/anonymous data. In step 230a, the random token for the pair critical data/anonymous data is returned. In step 242, all the tokens of the source 110 and, in a particular instance, the noncritical data are sent to the recipient.

Villax et al. (US 20220222373 A1): [0073] the data subject's original anonymous identifier may be hashed by the data subject's cryptographic software module in the personal computing device, so that it may be transmitted to data providers in a way which hides the original anonymous identifier; [0094] In one embodiment of the invention a parent may store a child's information of interest in the parent's personal computing device and then allow the child's information to be removed from the parent's device and transferred to the child's device, for example on reaching a certain legal age.

Smith et al. (US 9876823 B2): (56) In Example 1, a method includes: receiving, in the system of an external verifier of a first network including a plurality of nodes, a plurality of attestation reports and a plurality of attestation values from a plurality of reporting nodes of the first network, each of the plurality of attestation values randomly generated in the corresponding reporting node based on a common random seed value; determining whether at least a threshold number of the plurality of attestation values match; responsive to at least the threshold number of the plurality of attestation values matching, decrypting the plurality of attestation reports, processing the decrypted plurality of attestation reports to obtain aggregated telemetry data of the plurality of nodes, where identity of the plurality of nodes remains anonymous to the external verifier; and enforcing a security policy based at least in part on the aggregated telemetry data.

Wang et al. (US 20120239799 A1): [0034] In an example of a hierarchical tree topology, a parent node can analyze its child node's data and send the results to a next parent at a higher level. A child node is a node that can only directly communicate with its parent node (and with no other node), and a parent node can only directly communicate with its children nodes and with its own parent node; [0036] each node can compute an intermediate result by repeatedly swapping data with other nodes. A binomial swap forest topology is an example of a decentralized topology.

Fleck (US 11387998 B2): FIG. 2 is a flow chart of certain processes that may be used to effectuate an anonymization engine. The method begins at a flow label 210. At a flow step 212 an original data source prepares data for entry into the method. This data may include PII and a source ID unique to the original data source. At a step 214 the data from the original data source is transmitted to an anonymization engine. At a step 215 the anonymization engine analyzes the data to see if there is a history for this PII and this data source. If yes, flow proceeds to a step 216, else flow proceeds to a step 218. At the step 216 the anonymization engine updates a repository with any changes to PII and obtains an external anonymous key at a step 217. While at the step 218 the anonymizer engine creates a new record for the new individual and flow proceeds to a step 220.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREW SUH, whose telephone number is (571) 270-5524. The examiner can normally be reached 9:00 AM - 5:00 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Carl Colin, can be reached at (571) 272-3862. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ANDREW SUH/
Examiner, Art Unit 2493
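The allowed independent claims share a two-step mechanism: deterministically hash endpoint identifiers into anonymous ones, then swap data among endpoints under a parent node before output. A minimal sketch of that pattern, assuming illustrative names, a SHA-256 hash, and a simple rotation as the swap rule (the application's actual swap feature is not disclosed here):

```python
import hashlib

def anonymize_id(original_id: str, salt: str) -> str:
    """Deterministic one-way mapping: the same input always yields the
    same anonymous identifier, but the original cannot be recovered."""
    return hashlib.sha256((salt + original_id).encode()).hexdigest()[:12]

def process_dataset(records: list[dict], salt: str) -> list[dict]:
    # Step 1: replace endpoint and parent IDs with anonymous identifiers.
    out = [
        {"endpoint": anonymize_id(r["endpoint"], salt),
         "parent": anonymize_id(r["parent"], salt),
         "usage": r["usage"]}
        for r in records
    ]
    # Step 2: swap usage data among endpoints that share a parent node
    # (here a rotation within each parent group, as a stand-in rule).
    by_parent: dict[str, list[int]] = {}
    for i, rec in enumerate(out):
        by_parent.setdefault(rec["parent"], []).append(i)
    for idxs in by_parent.values():
        usages = [out[i]["usage"] for i in idxs]
        rotated = usages[1:] + usages[:1]
        for i, u in zip(idxs, rotated):
            out[i]["usage"] = u
    return out

records = [
    {"endpoint": "meter-001", "parent": "collector-A", "usage": 12.5},
    {"endpoint": "meter-002", "parent": "collector-A", "usage": 9.1},
]
processed = process_dataset(records, salt="per-dataset-secret")
```

Because the hash is deterministic, repeated runs over the same dataset produce consistent anonymous identifiers (preserving topology and longitudinal utility), while the swap obscures which endpoint produced which reading, matching the "obscured endpoint associations" language in claim 12.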

Prosecution Timeline

May 21, 2024
Application Filed
Jan 21, 2026
Non-Final Rejection — §101, §112
Feb 18, 2026
Applicant Interview (Telephonic)
Feb 18, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12477012: SYSTEMS AND METHODS FOR BLOCKCHAIN-BASED CONTROL OF NOTIFICATION PERMISSIONS
Granted Nov 18, 2025 (2y 5m to grant)
Patent 12468841: SYSTEM FOR PROVIDING SELECTIVE ACCESS TO USER INFORMATION
Granted Nov 11, 2025 (2y 5m to grant)
Patent 12430099: SYSTEMS AND METHODS FOR PRIVATE AUTHENTICATION WITH HELPER NETWORKS
Granted Sep 30, 2025 (2y 5m to grant)
Patent 12413565: SYSTEMS AND METHODS FOR GROUP MESSAGING USING BLOCKCHAIN-BASED SECURE KEY EXCHANGE
Granted Sep 09, 2025 (2y 5m to grant)
Patent 12413429: SYSTEMS AND METHODS FOR GROUP MESSAGING USING BLOCKCHAIN-BASED SECURE KEY EXCHANGE WITH KEY ESCROW FALLBACK
Granted Sep 09, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 80%
With Interview: 99% (+39.8%)
Median Time to Grant: 2y 12m
PTA Risk: Low
Based on 169 resolved cases by this examiner. Grant probability derived from career allow rate.
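The headline 80% figure follows directly from the career numbers above (135 granted of 169 resolved). A quick check, using only values stated on this page:

```python
# Grant probability derived from the examiner's career allow rate,
# as stated in the Examiner Intelligence section.
granted, resolved = 135, 169
allow_rate = granted / resolved
print(f"{allow_rate:.1%}")  # ~79.9%, displayed as 80%
```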
