Prosecution Insights
Last updated: April 19, 2026
Application No. 17/227,469

SYSTEM AND METHOD FOR SECURE VALUATION AND ACCESS OF DATA STREAMS

Non-Final OA — §101, §112
Filed
Apr 12, 2021
Examiner
NILFOROUSH, MOHAMMAD A
Art Unit
3697
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
International Business Machines Corporation
OA Round
5 (Non-Final)
29%
Grant Probability
At Risk
5-6
OA Rounds
4y 10m
To Grant
64%
With Interview

Examiner Intelligence

Grants only 29% of cases
29%
Career Allow Rate
116 granted / 397 resolved
-22.8% vs TC avg
Strong +35% interview lift
+34.8%
Interview Lift
resolved cases with interview
Typical timeline
4y 10m
Avg Prosecution
30 currently pending
Career history
427
Total Applications
across all art units

Statute-Specific Performance

§101
26.0%
-14.0% vs TC avg
§103
33.8%
-6.2% vs TC avg
§102
9.8%
-30.2% vs TC avg
§112
28.4%
-11.6% vs TC avg
Deltas are relative to an estimated Tech Center average • Based on career data from 397 resolved cases
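The per-statute deltas above are internally consistent: assuming each delta is simply the examiner's allow rate minus the Tech Center baseline (an assumption about how this dashboard computes the figures), all four statutes imply the same ~40% baseline. A quick consistency check, with the values copied from the table above:

```python
# Examiner allow rate (%) and delta vs. Tech Center average, per statute,
# as shown in the Statute-Specific Performance table
rates  = {"§101": 26.0, "§103": 33.8, "§102": 9.8, "§112": 28.4}
deltas = {"§101": -14.0, "§103": -6.2, "§102": -30.2, "§112": -11.6}

# Recover the implied Tech Center baseline: rate - delta
baseline = {s: round(rates[s] - deltas[s], 1) for s in rates}

# Every statute implies the same 40.0% Tech Center average
assert all(v == 40.0 for v in baseline.values())
```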

Office Action

§101 §112
DETAILED ACTION

Acknowledgements

The amendment filed 9/16/2025 is acknowledged. Claims 1-3, 8-9, 11-14, 16, 18-19, 21, and 23-24 are pending. Claims 1-3, 8-9, 11-14, 16, 18-19, 21, and 23-24 have been examined.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 9/16/2025 has been entered.

Response to Amendment/Arguments

Regarding the rejection of the claims under 35 USC 101, applicant states that the claims have been amended to no longer recite "value", so the claims are now directed to operations performed by a machine learning model and to determining the accuracy of data resulting from the trained machine learning model. Applicant cites Ex parte Hannun to state that obtaining predicted character probabilities from a trained neural network resulting in predicted character probability outputs did not recite organizing human activity. Examiner notes, however, that although the claims no longer explicitly describe the prediction as being a prediction of an "estimated value", the claim does involve determining a prediction of some data, and this data is described in the specification as being an estimated value of a data stream (see, e.g., Specification Paras. 2-4, 18-20). Although the limitations from the specification are not imported into the claims, the claims are read in view of the specification.
Thus, while the claims are broader than the specification, they are still broad enough to read on the embodiment described in the specification. Therefore, the claims still fall within the "certain methods of organizing human activity" grouping of abstract ideas. Regarding Ex parte Hannun, the claims in that case dealt with analyzing an input audio signal to generate spectrogram frames for an audio file, and using a trained neural network to obtain probabilities for determining the words or letters spoken in the audio file to generate a transcription of the audio file. The present claims, on the other hand, simply make a prediction regarding a characteristic of data (e.g., the value of the data), determine the accuracy of the prediction, and store and control access to it. Thus, the nature and facts of the present invention differ significantly from those of the invention of Ex parte Hannun, and because the patent eligibility analysis depends on these facts, the analysis in Ex parte Hannun is not applicable to the present claims. The claims in Hannun were found not to be directed to methods of organizing human activity because they recited a specific technical implementation that involved normalizing an input file, generating a jitter set of audio files, generating a set of spectrogram frames, obtaining predicted character probabilities from a trained neural network, and decoding a transcription of the input audio using the probabilities. The present claims, on the other hand, recite predicting a characteristic of data (e.g., the value of the data) and determining the accuracy of the prediction in a commercial context where the prediction is for determining the value of a product to be bought or sold. Thus, the present claims remain directed to certain methods of organizing human activity.
Applicant also states that the mathematical algorithm or formula for performing calculations that may be involved in training the machine learning prediction function model is not recited in the claim, and cites Ex parte Hannun to state that because the mathematical concepts are not recited in the claims, the claims do not recite a mathematical relationship, formula, or calculation. Applicant argues that the same analysis applies to the step of encrypting using homomorphic encryption, because this limitation does not recite a mathematical relationship, formula, or calculation for homomorphic encryption. Examiner notes, however, that the claims do continue to recite a mathematical concept, at least because the claims recite "encrypting . . . using homomorphic encryption," and although there are different ways of performing homomorphic encryption, all of these ways are mathematical calculations. The document titled "Homomorphic Encryption Use Cases," originally cited by applicant in the interview of 9/9/2025 and also attached to this Office action, cites a 2009 dissertation by Craig Gentry which provides the first algorithm for performing fully homomorphic encryption (see section entitled “Future Implementations of Homomorphic Encryption”). This dissertation details a manner of performing homomorphic encryption, and demonstrates that homomorphic encryption is a series of mathematical calculations (see Craig Gentry, “A Fully Homomorphic Encryption Scheme”, Stanford University, September 2009). Applicant further states that the claims recite a specific technique to solve a specific problem in the technical field of transmitting, storing, and controlling access to sensitive data in data streams, and that the claims recite specific limitations that achieve an improved technological result of providing a secure platform for determining the accuracy of a prediction model trained on the data stream and controlling access to the accuracy of the data stream.
Applicant states that the claims as a whole reflect the technological improvement. Examiner notes, however, that the functionality recited in the claims involves the steps of “transmitting a data stream . . .,” “encrypting access to the data stream . . . using homomorphic encryption,” “storing the encrypted data stream . . .,” “training a . . . prediction function model on the encrypted data stream resulting in a dataset of a plurality of data predictors,” “determining an accuracy of the data predictors,” “determining an accuracy profile of the encrypted data stream based on the accuracy of the data predictors,” “storing the accuracy profile and the accuracy of the data predictors with the data stream . . .,” and “controlling access . . . to the data stream, the accuracy profile, the data predictors, and the accuracy of the data predictors . . . by permitting the accuracy profile to be viewed and the data stream to be obtained based on receipt of tokens . . . .” Each of these steps is part of the abstract idea of transferring, storing, and protecting a set of information, predicting an estimated value of the information, determining an accuracy of the prediction, and storing and controlling access to the estimated value, accuracy, and the information. These steps do not provide a technological solution or a solution to a technological problem, because the steps describe a process for determining a prediction regarding a set of information, such as the value of the information, and an accuracy of the prediction, as well as storing and protecting the set of information and predictions from unauthorized access, which is a manner of managing data in a commercial interaction rather than a technological process.
The additional elements recited beyond the abstract idea, such as the use of a local computer, a secure platform on a central server computer, a secure communication link from the local computer to the secure platform, a machine learning model, a blockchain, a computer system comprising one or more computer processors and one or more non-transitory computer-readable storage media, and a computer program product comprising program instructions on a computer-readable storage medium, do not provide significantly more than the abstract idea because these additional elements only involve using a computer as a tool to automate and/or implement the abstract idea.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-3, 8-9, 11-14, 16, 18-19, 21, and 23-24 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. In the instant case, claims 1-3 and 8-9 are directed to a method, claims 11-14 and 21 are directed to a system comprising one or more processors and one or more non-transitory computer-readable storage media, and claims 16, 18-19 and 23-24 are directed to a computer program product comprising a computer-readable storage medium, which according to the specification, excludes transitory signals (See PGPub of specification ¶ 63). Therefore, these claims fall within the four statutory categories of invention. The claims recite transferring, storing, and protecting a set of information, predicting an estimated value of the information, determining an accuracy of the prediction, and storing and controlling access to the estimated value, accuracy, and the information, which is an abstract idea. Specifically, the claims recite “transmitting a data stream .
. .,” “encrypting access to the data stream . . . using homomorphic encryption,” “storing the encrypted data stream . . .,” “training a . . . prediction function model on the encrypted data stream resulting in a dataset of a plurality of data predictors,” “determining an accuracy of the data predictors,” “determining an accuracy profile of the encrypted data stream based on the accuracy of the data predictors,” “storing the accuracy profile and the accuracy of the data predictors with the data stream . . .,” and “controlling access . . . to the data stream, the accuracy profile, the data predictors, and the accuracy of the data predictors . . . by permitting the accuracy profile to be viewed and the data stream to be obtained based on receipt of tokens . . . ,” which is grouped within the “certain methods of organizing human activity” grouping of abstract ideas in prong one of step 2A of the Alice/Mayo test (See 2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50, 52, 54 (January 7, 2019)) because it describes a process for determining a prediction regarding a set of information, such as the value of the information, and an accuracy of the prediction, as well as storing and protecting the set of information and predictions from unauthorized access, which is a commercial or legal interaction. Additionally, the limitation of “encrypting access to a data stream . . . using homomorphic encryption” only involves performing a mathematical calculation or inputting values into mathematical functions, and thus falls within the “mathematical concepts” grouping of abstract ideas. Accordingly, the claims recite an abstract idea (See pages 7, 10, Alice Corporation Pty. Ltd. v. CLS Bank International, et al., US Supreme Court, No. 13-298, June 19, 2014; 2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50, 53-54 (January 7, 2019)). 
This judicial exception is not integrated into a practical application because, when analyzed under prong two of step 2A of the Alice/Mayo test (See 2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50, 54-55 (January 7, 2019)), the additional elements of the claims such as the use of a local computer, a secure platform on a central server computer, a secure communication link from the local computer to the secure platform, a machine learning prediction function model, a blockchain, a computer system comprising one or more computer processors and one or more non-transitory computer-readable storage media, and a computer program product comprising program instructions on a computer-readable storage medium, merely use a computer as a tool to perform an abstract idea. Specifically, these additional elements perform the steps or functions of “transmitting a data stream . . .,” “encrypting access to the data stream . . . using homomorphic encryption,” “storing the encrypted data stream . . .,” “training a . . . prediction function model on the encrypted data stream resulting in a dataset of a plurality of data predictors,” “determining an accuracy of the data predictors,” “determining an accuracy profile of the encrypted data stream based on the accuracy of the data predictors,” “storing the accuracy profile and the accuracy of the data predictors with the data stream . . .,” and “controlling access . . . to the data stream, the accuracy profile, the data predictors, and the accuracy of the data predictors . . . by permitting the accuracy profile to be viewed and the data stream to be obtained based on receipt of tokens . . . .” The use of a processor/computer as a tool to implement the abstract idea does not integrate the abstract idea into a practical application because it requires no more than a computer performing functions that correspond to acts required to carry out the abstract idea. 
The additional elements do not involve improvements to the functioning of a computer, or to any other technology or technical field (MPEP 2106.05(a)), the claims do not apply the abstract idea with, or by use of, a particular machine (MPEP 2106.05(b)), and the claims do not apply or use the abstract idea in some other meaningful way beyond generally linking the use of the abstract idea to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception (MPEP 2106.05(e) and Vanda Memo). Therefore, the claims do not, for example, purport to improve the functioning of a computer. Nor do they effect an improvement in any other technology or technical field. Accordingly, the additional elements do not impose any meaningful limits on practicing the abstract idea, and the claims are directed to an abstract idea. The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because, when analyzed under step 2B of the Alice/Mayo test (See 2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg.
50, 52, 56 (January 7, 2019)), the additional elements of using a local computer, a secure platform on a central server computer, a secure communication link from the local computer to the secure platform, a machine learning prediction function model, a blockchain, a computer system comprising one or more computer processors and one or more non-transitory computer-readable storage media, and a computer program product comprising program instructions on a computer-readable storage medium to perform the steps amounts to no more than using a computer or processor to automate and/or implement the abstract idea of transferring, storing, and protecting a set of information, predicting an estimated value of the information, determining an accuracy of the prediction, and storing and controlling access to the estimated value, accuracy, and the information. As discussed above, taking the claim elements separately, these additional elements perform the steps or functions of “transmitting a data stream . . .,” “encrypting access to the data stream . . . using homomorphic encryption,” “storing the encrypted data stream . . .,” “training a . . . prediction function model on the encrypted data stream resulting in a dataset of a plurality of data predictors,” “determining an accuracy of the data predictors,” “determining an accuracy profile of the encrypted data stream based on the accuracy of the data predictors,” “storing the accuracy profile and the accuracy of the data predictors with the data stream . . .,” and “controlling access . . . to the data stream, the accuracy profile, the data predictors, and the accuracy of the data predictors . . . by permitting the accuracy profile to be viewed and the data stream to be obtained based on receipt of tokens . . . .” These functions correspond to the actions required to perform the abstract idea.
Viewed as a whole, the combination of elements recited in the claims merely recites the concept of transferring, storing, and protecting a set of information, predicting an estimated value of the information, determining an accuracy of the prediction, and storing and controlling access to the estimated value, accuracy, and the information. Therefore, the use of these additional elements does no more than employ the computer as a tool to automate and/or implement the abstract idea. The use of a computer or processor to merely automate and/or implement the abstract idea cannot provide significantly more than the abstract idea itself (MPEP 2106.05(f) & (h)). Therefore, the claim is not patent eligible. Dependent claims 2-3, 8-9, 12-14, 18-19, 21, and 23-24 further describe the abstract idea of transferring, storing, and protecting a set of information, predicting an estimated value of the information, determining an accuracy of the prediction, and storing and controlling access to the estimated value, accuracy, and the information. Specifically, claims 2, 9, 12, 14, 19, and 24 further describe the prediction function and the estimated value, which are part of the abstract idea. Claims 3, 21, and 23 describe determining an intrinsic value of the data stream, which describes determining a value and is thus abstract. The broad use of artificial intelligence to determine the value does not provide a practical application or significantly more than the abstract idea because it only involves using a computer as a tool to automate and/or implement the abstract idea. Claims 8, 13, and 18 further describe the artificial intelligence, but do not require any steps or functions to be performed.
Additionally, the use of a local computer, a secure platform on a central server computer, a secure communication link from the local computer to the secure platform, a machine learning prediction function model, a blockchain, a computer system comprising one or more computer processors and one or more non-transitory computer-readable storage media, and a computer program product comprising program instructions on a computer-readable storage medium to perform the steps does not provide a practical application or significantly more than the abstract idea because it amounts to no more than using a computer or processor to automate and/or implement the abstract idea. The dependent claims do not include additional elements that integrate the abstract idea into a practical application or that provide significantly more than the abstract idea. Therefore, the dependent claims are also not patent eligible.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 2, 9, 11-12, 14, 16, and 18-19 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claims 2, 9, 12, 14, 16, and 19 recite the limitation "the prediction function" in each claim. There is insufficient antecedent basis for this limitation in the claims.
Claims 1, 11, and 16 previously recite “a machine learning prediction function model,” but do not recite a prediction function. Claims 11 and 18 recite the limitation "the artificial intelligence" in each claim. There is insufficient antecedent basis for this limitation in the claims.

Statement Regarding Prior Art

The closest prior art of Blaikie, III, et al. (US 2021/0365574) (“Blaikie”) discloses encrypting access to a data stream (Blaikie ¶¶ 19, 22, 38, 93), where encrypting access to the data stream is on a blockchain (Blaikie ¶¶ 7, 22-23, 26, 37-38, 46-48), storing the encrypted data stream on a secure platform (Blaikie ¶¶ 24, 66, 71-72, 75, 90, 96), determining an estimated value of the data stream based on a prediction function (Blaikie ¶¶ 29, 52-55, 57, 66, 92), storing the estimated value with the data stream on the secure platform (Blaikie ¶¶ 65-66), and controlling access on the secure platform to the data stream and the estimated value (Blaikie ¶¶ 65-66, 74, 77, 93, 97). Blaikie additionally discloses that encrypting access to the data stream is performed on a central server computer and the secure platform storing the encrypted data stream is on the central server computer (Blaikie ¶¶ 90-93), and that controlling access on the secure platform is controlled by a token-based system (Blaikie ¶¶ 30, 48, 65-66, 73-74, 78, 97). Blaikie further discloses bundling a plurality of data streams based on the value and number of tokens for accessing the plurality of data streams (Blaikie ¶ 28). Hummel, et al. (US 2013/0166354) (“Hummel”) discloses determining an accuracy of the estimated value, storing the accuracy of the estimated value, and controlling access to the accuracy of the estimated value (Hummel ¶¶ 26-28, 30-31, 34-37, 46, 49-50, 52, 55-56, 59-60).
Hummel also discloses that determining the accuracy of the estimated value comprises assessing the accuracy of the data predictors and valuing the data streams based on the prediction accuracy (Hummel ¶¶ 30-31, 34-37, 40-43, 48-50, 60). Finally, Cella, et al. (US 2018/0284758) (“Cella”) discloses that encrypting access to the data stream is performed on a local computer (Cella ¶¶ 1527, 1582, 1616). However, the prior art does not disclose, either singly or in combination, the specific claimed steps of transmitting a data stream on a secure communication link from a local computer to a secure platform on a central server computer, encrypting access to the data stream on a blockchain using homomorphic encryption, storing the encrypted data stream on the blockchain on the secure platform, training a machine learning prediction function model on the encrypted data stream resulting in a dataset of a plurality of data predictors, determining an accuracy of the data predictors, determining an accuracy profile of the encrypted data stream based on the accuracy of the data predictors, storing the accuracy profile and the accuracy of the data predictors with the data stream on the blockchain on the secure platform, and controlling access on the secure platform to the data stream, the accuracy profile, the data predictors, and the accuracy of the data predictors on the blockchain by permitting the accuracy profile to be viewed and the data stream to be obtained based on receipt of tokens implemented on the blockchain.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Mohammad A. Nilforoush whose telephone number is (571)270-5298. The examiner can normally be reached Monday-Friday 12pm-7pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, John W. Hayes, can be reached at 571-272-6708. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Mohammad A. Nilforoush/
Primary Examiner, Art Unit 3697
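The Office action's premise that homomorphic encryption is ultimately a series of mathematical calculations is easy to see in a toy additively homomorphic scheme. The sketch below is textbook Paillier with deliberately tiny primes: multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts. It is an illustration only, unrelated to the Gentry fully homomorphic scheme the examiner cites, and nowhere near production-safe.

```python
import math
import random

def keygen(p, q):
    """Paillier key generation from two primes (toy sizes here)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)     # Carmichael function of n
    g = n + 1                        # standard simplification for g
    mu = pow(lam, -1, n)             # modular inverse of lambda mod n
    return (n, g), (lam, mu)

def encrypt(pub, m):
    """E(m) = g^m * r^n mod n^2, with random r coprime to n."""
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    """m = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    n, _ = pub
    lam, mu = priv
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

pub, priv = keygen(61, 53)           # toy primes; never use in practice
c1, c2 = encrypt(pub, 17), encrypt(pub, 25)
c_sum = (c1 * c2) % (pub[0] ** 2)    # multiply ciphertexts ...
assert decrypt(pub, priv, c_sum) == 17 + 25   # ... to add plaintexts
```

The point the passage makes holds regardless of scheme: every step here (modular exponentiation, modular inversion, gcd checks) is arithmetic, which is why the examiner treats "encrypting . . . using homomorphic encryption" as falling within the mathematical-concepts grouping.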

Prosecution Timeline

Apr 12, 2021
Application Filed
Jan 13, 2024
Non-Final Rejection — §101, §112
Apr 12, 2024
Applicant Interview (Telephonic)
Apr 12, 2024
Examiner Interview Summary
Apr 22, 2024
Response Filed
Aug 16, 2024
Final Rejection — §101, §112
Oct 16, 2024
Examiner Interview Summary
Oct 16, 2024
Applicant Interview (Telephonic)
Oct 21, 2024
Response after Non-Final Action
Oct 23, 2024
Response after Non-Final Action
Nov 14, 2024
Request for Continued Examination
Nov 15, 2024
Response after Non-Final Action
Mar 08, 2025
Non-Final Rejection — §101, §112
Jun 12, 2025
Response Filed
Jun 26, 2025
Applicant Interview (Telephonic)
Jun 26, 2025
Examiner Interview Summary
Jul 13, 2025
Final Rejection — §101, §112
Sep 09, 2025
Applicant Interview (Telephonic)
Sep 12, 2025
Examiner Interview Summary
Sep 16, 2025
Response after Non-Final Action
Oct 14, 2025
Request for Continued Examination
Oct 20, 2025
Response after Non-Final Action
Jan 25, 2026
Non-Final Rejection — §101, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597028
SYSTEMS AND METHODS FOR GENERATING, PROVIDING, AND MANAGING CUSTOM NOTIFICATIONS
2y 5m to grant Granted Apr 07, 2026
Patent 12548013
ACCELERATED SECURITY FOR REAL-TIME DETECTION OF SUSPICIOUS TRANSACTIONS
2y 5m to grant Granted Feb 10, 2026
Patent 12536517
ARTIFICIAL INTELLIGENCE-POWERED MUSIC REGISTRY, COLLABORATION, AND WORKFLOW MANAGEMENT SYSTEM
2y 5m to grant Granted Jan 27, 2026
Patent 12518282
USING LOCATION-BASED MAPPING TO ENABLE AUTOMATED INFORMATION TRANSFER AT A USER LOCATION
2y 5m to grant Granted Jan 06, 2026
Patent 12499432
TECHNIQUES TO PERFORM OPERATIONS WITH A CONTACTLESS CARD WHEN IN THE PRESENCE OF A TRUSTED DEVICE
2y 5m to grant Granted Dec 16, 2025
Based on this examiner's 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
29%
Grant Probability
64%
With Interview (+34.8%)
4y 10m
Median Time to Grant
High
PTA Risk
Based on 397 resolved cases by this examiner. Grant probability derived from career allow rate.
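The projection figures above are arithmetically consistent with the examiner's career data. A minimal sketch of how they appear to be derived (treating the interview lift as additive percentage points is an assumption about this dashboard's methodology, not USPTO practice):

```python
granted, resolved = 116, 397               # examiner career totals shown above
allow_rate = 100 * granted / resolved      # career allow rate, in percent
interview_lift = 34.8                      # percentage points, per the dashboard

print(round(allow_rate))                   # 29 -> the 29% grant probability
print(round(allow_rate + interview_lift))  # 64 -> the 64% "with interview" figure
```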
