Prosecution Insights
Last updated: April 19, 2026
Application No. 17/971,082

AUTOMATED DATA ARCHIVAL FRAMEWORK USING ARTIFICIAL INTELLIGENCE TECHNIQUES

Non-Final OA: §101, §103
Filed: Oct 21, 2022
Examiner: VAUGHN, RYAN C
Art Unit: 2125
Tech Center: 2100 (Computer Architecture & Software)
Assignee: DELL PRODUCTS, L.P.
OA Round: 3 (Non-Final)
Grant Probability: 62% (Moderate)
Estimated OA Rounds: 3-4
Estimated Time to Grant: 3y 9m
Grant Probability with Interview: 81%

Examiner Intelligence

Career Allow Rate: 62% (145 granted / 235 resolved; +6.7% vs Tech Center average)
Interview Lift: +19.4% allowance lift on resolved cases with an interview
Typical Timeline: 3y 9m average prosecution; 45 applications currently pending
Career History: 280 total applications across all art units
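As a sanity check, the headline figures above follow from the raw counts; the lift arithmetic below assumes the displayed 62% baseline and 81% with-interview probabilities are the values being compared, which is an inference from the dashboard, not stated data.

```python
# Career allow rate from the examiner's history shown above.
granted, resolved = 145, 235
career_allow_rate = granted / resolved
print(f"{career_allow_rate:.1%}")       # 61.7%, shown rounded as 62%

# Interview lift as the gap between the displayed grant probabilities.
base, with_interview = 0.62, 0.81
print(f"{with_interview - base:+.0%}")  # +19%, close to the +19.4% shown
```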

Statute-Specific Performance

§101: 23.9% (-16.1% vs TC avg)
§103: 40.1% (+0.1% vs TC avg)
§102: 7.6% (-32.4% vs TC avg)
§112: 21.9% (-18.1% vs TC avg)

Comparisons are against Tech Center average estimates; based on career data from 235 resolved cases.

Office Action

§101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-6, 8-14, 16-19, and 21-23 are presented for examination.

Continued Examination under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on February 17, 2026 has been entered.

Claim Rejections - 35 USC § 101

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action. Claims 1-6, 8-14, 16-19, and 21-23 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The analysis of the claims will follow the 2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50 (“2019 PEG”).

Claim 1

Step 1: The claim recites a method; therefore, it is directed to the statutory category of processes.

Step 2A Prong 1: The claim recites, inter alia:

[D]etermining one or more storage-related features within the obtained data by processing at least a portion of the obtained data: This limitation could encompass mentally processing the data to make a mental determination of storage-related features.

[P]redicting two or more data archival classes, from a set of multiple predetermined data archival classes associated with multiple designations of data access frequency, for at least a portion of the obtained data by processing the one or more storage-related features: This limitation could encompass mentally predicting the classes by performing mental processing of the features. 
[P]erforming one or more … actions based at least in part on the two or more predicted data archival classes: This limitation could encompass performing mental actions based on the prediction. Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “obtaining data associated with one or more storage systems” and that “performing one or more automated actions comprises automatically storing two or more parts of the at least a portion of the obtained data in respective storage systems associated with the two or more predicted data archival classes, wherein automatically storing the two or more parts of the at least a portion of the obtained data comprises migrating at least a first part of the two or more parts in accordance with the two or more predicted data archival classes, and replicating at least a second part of the two or more parts in accordance with the two or more predicted data archival classes”. These limitations recite the insignificant extra-solution activity of mere data gathering and output. MPEP § 2106.05(g). The claim further recites that the prediction is performed “using one or more artificial intelligence techniques”; that the actions performed are “automated”; and that “the method is performed by at least one processing device comprising a processor coupled to a memory”. However, these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. MPEP § 2106.05(f). Step 2B: The claim does not contain significantly more than the judicial exception. The analysis at this step mirrors that of step 2A, prong 2, with the exception that the obtaining limitation, in addition to being insignificant extra-solution activity, also recites the well-understood, routine, and conventional activity of receiving or transmitting data over a network. MPEP § 2106.05(d)(II); OIP Techs., Inc., v. 
Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network), and the storing limitation is directed to the well-understood, routine, and conventional activity of storing or retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015). As an ordered whole, the claim is directed to a mentally performable process of predicting data archival classes. Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible. Claim 2 Step 1: A process, as above. Step 2A Prong 1: The claim recites, inter alia, that “processing the one or more storage-related features … comprises processing the one or more storage-related features using … multiple decision trees, wherein each of the multiple decision trees is constructed in connection with one or more different storage-related features.” This limitation could encompass the mental processing of the storage-related features by mentally processing decision trees, or traversing them using a pen and paper. Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “using one or more artificial intelligence techniques” and “using at least one random forest classifier”. However, these are mere instructions to apply the judicial exception using a generic computer programmed with generic classes of computer algorithm. MPEP § 2106.05(f). Step 2B: The claim does not contain significantly more than the judicial exception. The claim further recites “using one or more artificial intelligence techniques” and “using at least one random forest classifier”. However, these are mere instructions to apply the judicial exception using a generic computer programmed with generic classes of computer algorithm. MPEP § 2106.05(f). Claim 3 Step 1: A process, as above. 
Step 2A Prong 1: The claim recites that “predicting two or more data archival classes comprises generating multiple data archival class predictions corresponding to the multiple decision trees, and determining a final data archival class, from among the multiple data archival class predictions, by implementing a voting mechanism across the multiple data archival class predictions.” These limitations could encompass mentally generating multiple predictions using the multiple decision trees and mentally using a voting mechanism to determine a final class. Step 2A Prong 2: This judicial exception is not integrated into a practical application. See claim 2 analysis. Step 2B: The claim does not contain significantly more than the judicial exception. See claim 2 analysis. Claim 4 Step 1: A process, as above. Step 2A Prong 1: The claim recites the same judicial exceptions as in claim 2. Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites that “processing the one or more storage-related features using at least one random forest classifier comprises processing the one or more storage-related features using at least one random forest classifier in conjunction with one or more ensemble bagging techniques.” However, this is a mere instruction to apply the judicial exception using a generic computer programmed with generic classes of computer algorithm. MPEP § 2106.05(f). Step 2B: The claim does not contain significantly more than the judicial exception. The claim further recites that “processing the one or more storage-related features using at least one random forest classifier comprises processing the one or more storage-related features using at least one random forest classifier in conjunction with one or more ensemble bagging techniques.” However, this is a mere instruction to apply the judicial exception using a generic computer programmed with generic classes of computer algorithm. MPEP § 2106.05(f). 
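The ensemble recited in claims 2 through 4 (multiple decision trees each built on different storage-related features, combined by a voting mechanism, with ensemble bagging) can be sketched in miniature. The training rows, feature names, and archival-class labels below are hypothetical, and one-level stumps stand in for full decision trees.

```python
import random
from collections import Counter

# Hypothetical training rows: (access_frequency, latency_ms) -> archival class.
TRAIN = [
    ((0.90, 5.0), "hot"), ((0.80, 8.0), "hot"), ((0.70, 6.0), "hot"),
    ((0.10, 40.0), "cold"), ((0.20, 55.0), "cold"), ((0.05, 60.0), "cold"),
]

def make_stump(rows, feature_idx):
    """A one-level decision tree built on a single storage-related feature
    (claim 2): split at the feature's mean, label each side by majority class."""
    mean = sum(f[feature_idx] for f, _ in rows) / len(rows)
    left = Counter(c for f, c in rows if f[feature_idx] <= mean)
    right = Counter(c for f, c in rows if f[feature_idx] > mean)
    l_label = left.most_common(1)[0][0]
    r_label = right.most_common(1)[0][0] if right else l_label
    return lambda x: l_label if x[feature_idx] <= mean else r_label

def bagged_forest(rows, n_trees, seed=0):
    """Ensemble bagging (claim 4): each tree trains on a bootstrap resample
    of the rows and a randomly selected feature."""
    rng = random.Random(seed)
    return [make_stump([rng.choice(rows) for _ in rows], rng.randrange(2))
            for _ in range(n_trees)]

def vote(forest, x):
    """Per-tree predictions combined by a majority-vote mechanism (claim 3)."""
    return Counter(tree(x) for tree in forest).most_common(1)[0][0]

# Deterministic two-tree forest, one tree per feature:
forest = [make_stump(TRAIN, 0), make_stump(TRAIN, 1)]
print(vote(forest, (0.85, 7.0)))   # frequently accessed, low latency -> "hot"
print(vote(forest, (0.10, 50.0)))  # rarely accessed, high latency -> "cold"
```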
Claim 5 Step 1: A process, as above. Step 2A Prong 1: The claim recites the same judicial exceptions as in claim 1. Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites that “processing the one or more storage-related features using one or more artificial intelligence techniques comprises processing the one or more storage-related features using at least one dense artificial neural network-based classifier comprising an input layer, one or more hidden layers, and an output layer.” However, this is a mere instruction to apply the judicial exception using a generic computer programmed with generic classes of computer algorithm. MPEP § 2106.05(f). Step 2B: The claim does not contain significantly more than the judicial exception. The claim further recites that “processing the one or more storage-related features using one or more artificial intelligence techniques comprises processing the one or more storage-related features using at least one dense artificial neural network-based[] classifier comprising an input layer, one or more hidden layers, and an output layer.” However, this is a mere instruction to apply the judicial exception using a generic computer programmed with generic classes of computer algorithm. MPEP § 2106.05(f). Claim 6 Step 1: A process, as above. Step 2A Prong 1: The claim recites the same judicial exceptions as in claim 5. Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites that “the input layer comprises a number of neurons matching a number of storage-related features, and the output layer comprises a number of neurons matching a number of data archival classes.” However, this is a mere instruction to apply the judicial exception using a generic computer programmed with generic classes of computer algorithm. MPEP § 2106.05(f). Step 2B: The claim does not contain significantly more than the judicial exception. 
The claim further recites that “the input layer comprises a number of neurons matching a number of storage-related features, and the output layer comprises a number of neurons matching a number of data archival classes.” However, this is a mere instruction to apply the judicial exception using a generic computer programmed with generic classes of computer algorithm. MPEP § 2106.05(f). Claim 8 Step 1: A process, as above. Step 2A Prong 1: The claim recites, inter alia, “storing the at least a portion of the obtained data further comprises removing the at least a portion of the obtained data from the one or more storage systems in accordance with the two or more predicted data archival classes.” This limitation could encompass mentally migrating data from one category to another or mentally replicating the data, or replicating the data using a pen and paper. Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites that the archiving takes place “automatically”. However, this is a mere instruction to apply the judicial exception using a generic computer. MPEP § 2106.05(f). Step 2B: The claim does not contain significantly more than the judicial exception. The claim further recites that the archiving takes place “automatically”. However, this is a mere instruction to apply the judicial exception using a generic computer. MPEP § 2106.05(f). Claim 9 Step 1: A process, as above. Step 2A Prong 1: The claim recites the same judicial exceptions as in claim 1. Step 2A Prong 2: This judicial exception is not integrated into a practical application. 
The claim further recites that “performing one or more automated actions comprises automatically training the one or more artificial intelligence techniques using feedback related to the two or more predicted data archival classes.” However, this is a mere instruction to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. MPEP § 2106.05(f). Step 2B: The claim does not contain significantly more than the judicial exception. The claim further recites that “performing one or more automated actions comprises automatically training the one or more artificial intelligence techniques using feedback related to the two or more predicted data archival classes.” However, this is a mere instruction to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. MPEP § 2106.05(f). Claim 10 Step 1: A process, as above. Step 2A Prong 1: The claim recites that “determining one or more storage-related features comprises identifying, within the obtained data, one or more of information pertaining to database type, information pertaining to transaction type, latency information, information related to one or more full table scans, and information pertaining to one or more accessed rows.” This limitation could encompass mentally identifying the claimed information. Step 2A Prong 2: This judicial exception is not integrated into a practical application. See claim 1 analysis. Step 2B: The claim does not contain significantly more than the judicial exception. See claim 1 analysis. Claim 11 Step 1: A process, as above. Step 2A Prong 1: The claim recites the same judicial exceptions as in claim 1. Step 2A Prong 2: This judicial exception is not integrated into a practical application. 
The claim further recites that “the one or more artificial intelligence techniques are trained using one or more storage-related policies and historical data pertaining to data archiving.” This limitation amounts to a mere instruction to apply the judicial exceptions using a generic computer programmed with a generic class of computer algorithm. MPEP § 2106.05(f). Step 2B: The claim does not contain significantly more than the judicial exception. The claim further recites that “the one or more artificial intelligence techniques are trained using one or more storage-related policies and historical data pertaining to data archiving.” This limitation amounts to a mere instruction to apply the judicial exceptions using a generic computer programmed with a generic class of computer algorithm. MPEP § 2106.05(f). Claims 12-14, 16 Step 1: The claims recite a non-transitory processor-readable storage medium; therefore, they are directed to the statutory category of articles of manufacture. Step 2A Prong 1: The claims recite the same judicial exceptions as in claims 1-2, 5, and 9, respectively. Step 2A Prong 2: This judicial exception is not integrated into a practical application. The analysis at this step is identical to that of claims 1-2, 5, and 9, respectively, except insofar as these claims recite a “non-transitory processor-readable storage medium having stored therein program code of one or more software programs, wherein the program code when executed by at least one processing device causes the at least one processing device: to [perform the method]”. However, this amounts to a mere instruction to apply the judicial exception using a generic computer. MPEP § 2106.05(f). Step 2B: The claim does not contain significantly more than the judicial exception. 
The analysis at this step is identical to that of claims 1-2, 5, and 9, respectively, except insofar as these claims recite a “non-transitory processor-readable storage medium having stored therein program code of one or more software programs, wherein the program code when executed by at least one processing device causes the at least one processing device: to [perform the method]”. However, this amounts to a mere instruction to apply the judicial exception using a generic computer. MPEP § 2106.05(f). Claims 17-19 Step 1: The claims recite an apparatus comprising a processor and a memory; therefore, they are directed to the statutory category of machines. Step 2A Prong 1: The claims recite the same judicial exceptions as in claims 1-2 and 5, respectively. Step 2A Prong 2: This judicial exception is not integrated into a practical application. The analysis at this step is identical to that of claims 1-2 and 5, respectively, except insofar as these claims recite an “apparatus comprising: at least one processing device comprising a processor coupled to a memory; the at least one processing device being configured: to [perform the method]”. However, this amounts to a mere instruction to apply the judicial exception using a generic computer. MPEP § 2106.05(f). Step 2B: The claim does not contain significantly more than the judicial exception. The analysis at this step is identical to that of claims 1-2 and 5, respectively, except insofar as these claims recite an “apparatus comprising: at least one processing device comprising a processor coupled to a memory; the at least one processing device being configured: to [perform the method]”. However, this amounts to a mere instruction to apply the judicial exception using a generic computer. MPEP § 2106.05(f). Claim 21 Step 1: A machine, as above. Step 2A Prong 1: The claim recites the same judicial exceptions as in claim 17. Step 2A Prong 2: This judicial exception is not integrated into a practical application. 
The claim further recites that “performing one or more automated actions comprises automatically training the one or more artificial intelligence techniques using feedback related to the two or more predicted data archival classes.” This limitation does no more than restrict the judicial exception to the field of use of artificial intelligence training. MPEP § 2106.05(h). Step 2B: The claim does not contain significantly more than the judicial exception. The claim further recites that “performing one or more automated actions comprises automatically training the one or more artificial intelligence techniques using feedback related to the two or more predicted data archival classes.” This limitation does no more than restrict the judicial exception to the field of use of artificial intelligence training. MPEP § 2106.05(h). Claim 22 Step 1: A machine, as above. Step 2A Prong 1: The claim recites that “determining one or more storage-related features comprises identifying, within the obtained data, one or more of information pertaining to database type, information pertaining to transaction type, latency information, information related to one or more full table scans, and information pertaining to one or more accessed rows.” This limitation could encompass mentally identifying one or more of the claimed data types. Step 2A Prong 2: This judicial exception is not integrated into a practical application. See claim 17 analysis. Step 2B: The claim does not contain significantly more than the judicial exception. See claim 17 analysis. Claim 23 Step 1: A machine, as above. Step 2A Prong 1: The claim recites the same judicial exceptions as in claim 17. Step 2A Prong 2: This judicial exception is not integrated into a practical application. 
The claim further recites that “the one or more artificial intelligence techniques are trained using one or more storage-related policies and historical data pertaining to data archiving.” This limitation does no more than restrict the judicial exception to the field of use of artificial intelligence training. MPEP § 2106.05(h).

Step 2B: The claim does not contain significantly more than the judicial exception. The claim further recites that “the one or more artificial intelligence techniques are trained using one or more storage-related policies and historical data pertaining to data archiving.” This limitation does no more than restrict the judicial exception to the field of use of artificial intelligence training. MPEP § 2106.05(h).

Claim Rejections - 35 USC § 103

Claims 1, 8-12, 16-17, and 21-22 are rejected under 35 U.S.C. 103 as being unpatentable over Kambadahalli Puttasetty (US 20190362271) (“KP”) in view of Wang et al. (US 20210117395) (“Wang”) and further in view of Bourgeois et al. (US 20210141698) (“Bourgeois”). 
Regarding claim 1, KP discloses “[a] computer-implemented method comprising: obtaining data associated with one or more storage systems (data associated with an entity are received from a data source [i.e., the data are associated with storage systems] – KP, paragraph 5; see also paragraphs 24 (disclosing that the information may be stored in a memory), 50 (disclosing receipt of data associated with an entity from a data source)); determining one or more storage-related features within the obtained data by processing at least a portion of the obtained data (master data may be compared with source data to determine any duplicity in the data; points-of-similar scores and exact-match scores may be computed [processed] to generate one or more features with numerical values; the points-of-similar score may be a value between 0 and 1, with 0 indicating no similarity and 1 indicating an exact match – KP, paragraph 45); predicting two or more data archival classes, from a set of multiple predetermined data archival classes associated with multiple designations …, for at least a portion of the obtained data by processing the one or more storage-related features using one or more artificial intelligence techniques (data management system may receive data from a data source associated with an enterprise to determine duplicity; the data management system may predict the category for the current data to be one of duplicate data and non-duplicate data [archival class designations; note that if one set of current data is determined to be duplicate and another not, there are two archival data classes predicted] by using twenty supervised machine learning classifiers [AI technique] – KP, paragraph 46; see also paragraph 45 (disclosing that the numerical features are input to the SML classifiers)); and performing one or more automated actions based at least in part on the two or more predicted data archival classes (once the category of the data is determined [two categories/classes 
for two sets of data], instructions may be provided to the system [automatically] to manage the redundant data; the duplicate data may be deleted from the current data [action] – KP, paragraph 43), wherein performing one or more automated actions comprises automatically storing two or more parts of the at least a portion of the obtained data in respective storage systems associated with the two or more predicted data archival classes (once the category of the data is determined [i.e., in accordance with the archival class predictions], instructions may be provided to the system [automatically] to manage [store] the redundant data; the duplicate data may be deleted from the current data [i.e., the duplicate data are sent to a portion of storage designated for deletion and the non-duplicate data are sent elsewhere] – KP, paragraph 43; see also paragraph 24 (disclosing that the data received from the data sources may be stored in a memory)), …; wherein the method is performed by at least one processing device comprising a processor coupled to a memory (KP Figure 5 shows that the method is performed using a computer system 500 comprising a processor 502 coupled to a memory 505).” KP appears not to disclose explicitly the further limitations of the claim. However, Wang discloses “designations of data access frequency (statistical values of the data used can be recorded, such as the access frequency of the data – Wang, paragraph 99) ….” Wang and the instant application both relate to machine learning for data analysis and are analogous. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified KP to track the data access frequency, as disclosed by Wang, and an ordinary artisan could reasonably expect to have done so successfully. Doing so would allow the system to determine which data are most important to the users thereof. See Wang, paragraph 99. 
Neither KP nor Wang appears to disclose explicitly the further limitations of the claim. However, Bourgeois discloses that “automatically storing the two or more parts of at least a portion of the obtained data comprises migrating the at least a first part of the two or more parts in accordance with the two or more predicted data archival classes, and replicating at least a second part of the obtained data in accordance with the two or more predicted data archival classes (user defines rules for data classification [i.e., data archival classes]; for items that meet private/sensitive data rules, the associated tag for each regular expression matched is applied; for items with tags not eligible for archiving, those items are automatically excluded from evaluation in future data crawls [i.e., they are migrated to an area where they will be so excluded based on the class “not eligible for archiving”]; if the item is eligible for archiving [i.e., falls into the class “eligible for archiving”], standard archiving steps are performed so that the item is copied/moved [replicated] to cloud storage – Bourgeois, paragraphs 175-92 and Fig. 7) ….” Bourgeois and the instant application both relate to data archival and are analogous. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of KP and Wang to migrate some data based on one archival class and replicate them based on another, as disclosed by Bourgeois, and an ordinary artisan could reasonably expect to have done so successfully. Doing so would assist the organization hosting the data with compliance with legal requirements related to retention of data. See Bourgeois, paragraph 8. Claim 12 is a non-transitory processor-readable storage medium claim corresponding to method claim 1 and is rejected for the same reasons as given in the rejection of that claim. 
Similarly, claim 17 is an apparatus claim corresponding to method claim 1 and is rejected for the same reasons as given in the rejection of that claim. Regarding claim 8, KP/Wang/Bourgeois discloses that “automatically storing the at least a portion of the obtained data further comprises removing the at least a portion of the obtained data from the one or more storage systems in accordance with the at least one predicted data archival class (once the category of the data is determined [i.e., in accordance with the archival class prediction], instructions may be provided to the system [automatically] to manage [archive] the redundant data; the duplicate data may be deleted [removed] from the current data – KP, paragraph 43).” Regarding claim 9, KP/Wang/Bourgeois discloses “performing one or more automated actions comprises automatically training the one or more artificial intelligence techniques using feedback related to the at least one predicted data archival class (data management system may train a plurality of supervised machine learning classifiers based on a plurality of master datasets; once the classifiers are trained, the data management system may evaluate the classifiers based on one or more metrics comprising accuracy, precision, recall, and F1 score [feedback related to the predictions] – KP, paragraph 23).” Claim 16 is a non-transitory processor-readable storage medium claim corresponding to method claim 9 and is rejected for the same reasons as given in the rejection of that claim. 
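The evaluation feedback KP paragraph 23 describes for claim 9 (accuracy, precision, recall, and F1 score) reduces to counts of true/false positives and negatives; the duplicate/non-duplicate label names below mirror KP's example and the sample predictions are illustrative.

```python
def prf1(y_true, y_pred, positive="duplicate"):
    """Precision, recall, and F1 for one class, computed from prediction
    feedback; guards against empty denominators."""
    tp = sum(t == positive == p for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

truth = ["duplicate", "duplicate", "non-duplicate", "non-duplicate"]
pred  = ["duplicate", "non-duplicate", "non-duplicate", "duplicate"]
print(prf1(truth, pred))  # (0.5, 0.5, 0.5)
```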
Regarding claim 10, KP/Wang/Bourgeois discloses that “determining one or more storage-related features comprises identifying, within the obtained data, one or more of information pertaining to database type, information pertaining to transaction type, latency information, information related to one or more full table scans, and information pertaining to one or more accessed rows (data received from a data source is stored in a customer table; the customer data includes current data and reference data – KP, paragraph 46; data are analyzed to determine duplicity in the data, and a points-of-similar score between 0 and 1 may be used to determine similarity between two data entries [information pertaining to accessed rows] – id. at paragraph 45; see also Fig. 3, ref. char. 303 (showing that the data entries are organized as rows in a table)).” Regarding claim 11, KP/Wang/Bourgeois discloses that “the one or more artificial intelligence techniques are trained using one or more storage-related policies and historical data pertaining to data archiving (data management system may train a plurality of supervised machine learning classifiers based on a plurality of master datasets associated with the entity and analyzed by one or more data experts as duplicate and non-duplicate [master datasets = historical data pertaining to data archiving; determination of data points as duplicate or non-duplicate = storage-related policy] – KP, paragraph 23).” Regarding claim 21, KP/Wang/Bourgeois discloses that “performing one or more automated actions comprises automatically training the one or more artificial intelligence techniques using feedback related to the two or more predicted data archival classes (data management system may train a plurality of Supervised Machine Learning (SML) classifiers based on a plurality of master datasets associated with an entity and analyzed by one or more data experts as duplicate and non-duplicate [two or more predicted data archival classes; 
analysis by data experts = feedback] – KP, paragraph 23).”

Regarding claim 22, KP/Wang/Bourgeois discloses that “determining one or more storage-related features comprises identifying, within the obtained data, one or more of information pertaining to database type, information pertaining to transaction type, latency information, information related to one or more full table scans, and information pertaining to one or more accessed rows (master data signify business objects of the organization which may be agreed on and shared across the organization; the master data may include transactional data [information pertaining to transaction type]; data management tool may identify duplicates or different versions [including identifying duplicate transaction information] – KP, paragraph 2).”

Claims 2-4, 13, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over KP in view of Wang and Bourgeois and further in view of Breiman et al., “Random Forests,” in 45.1 Machine Learning 5-32 (2001) (“Breiman”).

Regarding claim 2, the rejection of claim 1 is incorporated. KP further discloses that “processing the one or more storage-related features using one or more artificial intelligence techniques comprises processing the one or more storage-related features using at least one random forest classifier comprising multiple decision trees (category prediction module may predict the category of current data received from receiving module to be one of the duplicate data and the non-duplicate data; the category may be predicted by using the plurality of trained SML classifiers, which may be a Random Forest classifier [note that a random forest comprises multiple decision trees] – KP, paragraph 41) ….” KP/Wang/Bourgeois appears not to disclose explicitly the further limitations of the claim. 
However, Breiman discloses that “each of the multiple decision trees is constructed in connection with one or more different … features (forests use the random selection of features at each node to determine a split; combining trees grown using random features can produce improved accuracy – Breiman, sec. 1.2, first five paragraphs).” Breiman and the instant application both relate to random decision forests and are analogous. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified KP/Wang/Bourgeois to construct each tree with different features, as disclosed by Breiman, and an ordinary artisan could reasonably expect to have done so successfully. Doing so would improve the accuracy of the ensemble relative to single decision trees that have accuracies only slightly better than chance. See Breiman, sec. 1.2, first five paragraphs. Claim 13 is a non-transitory processor-readable storage medium claim corresponding to method claim 2 and is rejected for the same reasons as given in the rejection of that claim. Similarly, claim 18 is an apparatus claim corresponding to method claim 2 and is rejected for the same reasons as given in the rejection of that claim. Regarding claim 3, the rejection of claim 2 is incorporated. 
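For orientation only, the random-forest mechanics at issue in claims 2-4 (bootstrap bagging, per-tree random feature selection, and majority voting across trees) can be sketched in plain Python. Everything here is the editor's illustration, not material from KP, Breiman, or the application: the data and labels are invented, and full decision trees are reduced to one-level "stumps" for brevity.

```python
import random
from collections import Counter

def train_stump(rows, labels, feature_idx, rng):
    """Train a one-level 'tree' on a bootstrap sample, using one feature."""
    # Bagging: draw a bootstrap sample (with replacement) of the training data.
    idx = [rng.randrange(len(rows)) for _ in rows]
    sample = [(rows[i][feature_idx], labels[i]) for i in idx]
    # Split at the sample mean of the chosen feature.
    threshold = sum(v for v, _ in sample) / len(sample)
    left = [lab for v, lab in sample if v <= threshold] or [labels[0]]
    right = [lab for v, lab in sample if v > threshold] or [labels[0]]
    left_maj = Counter(left).most_common(1)[0][0]
    right_maj = Counter(right).most_common(1)[0][0]
    return lambda x: left_maj if x[feature_idx] <= threshold else right_maj

def train_forest(rows, labels, n_trees, rng):
    n_features = len(rows[0])
    # Breiman-style randomness: each tree is built on a different
    # randomly selected feature (and a different bootstrap sample).
    return [train_stump(rows, labels, rng.randrange(n_features), rng)
            for _ in range(n_trees)]

def predict(forest, x):
    # Each tree casts one vote; the per-class vote totals play the role of
    # KP's "confidence factor", and the class with more votes wins.
    votes = Counter(tree(x) for tree in forest)
    return votes.most_common(1)[0][0]

rng = random.Random(7)
rows = [[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.2]]
labels = ["duplicate", "duplicate", "non-duplicate", "non-duplicate"]
forest = train_forest(rows, labels, n_trees=25, rng=rng)
print(predict(forest, [0.15, 0.85]))  # duplicate
```

Breiman's point, reflected in the rejection's rationale, is that trees grown on different random features make partly independent errors, so their majority vote is more accurate than any single tree.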
KP further discloses that “predicting at least one data archival class comprises generating multiple data archival class predictions …, and determining a final data archival class, from among the multiple data archival class predictions, by implementing a voting mechanism across the multiple data archival class predictions (data management system may generate a confidence factor for the duplicate data category and the non-duplicate data category based on the prediction of each of the plurality of classifiers [i.e., there are multiple predictions]; the confidence factor for each category may be the total number of the classifiers with the prediction of that data category; the current data may be determined to be duplicate data when the confidence factor of the duplicate data category is greater than the confidence factor of the non-duplicate data category [i.e., since the confidence factor represents the total number of classifiers that made the prediction, a higher confidence factor implies that more classifiers made that prediction; in other words, the final classification is determined by majority vote of the classifiers] – KP, paragraph 23).”

KP/Wang/Bourgeois appears not to disclose explicitly the further limitations of the claim. However, Breiman discloses “generating multiple … class predictions corresponding to the multiple decision trees (random forest is a classifier consisting of a collection of tree-structured classifiers in which each tree casts a unit vote for the most popular class at input x [i.e., in cases of non-unanimity, some trees may output a different prediction from other trees] – Breiman, definition 1.1) ….” It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified KP/Wang/Bourgeois to generate multiple class predictions corresponding to each tree, as disclosed by Breiman, and an ordinary artisan could reasonably expect to have done so successfully. Doing so would improve the accuracy of the ensemble relative to single decision trees that have accuracies only slightly better than chance. See Breiman, sec. 1.2, first five paragraphs.

Regarding claim 4, KP, as modified by Wang, Bourgeois, and Breiman, discloses that “processing the one or more storage-related features using at least one random forest classifier comprises processing the one or more storage-related features using at least one random forest classifier in conjunction with one or more ensemble bagging techniques (category prediction module may predict the category for the current data with respect to reference data by using the plurality of trained SML classifiers; the plurality of SML classifiers may be a combination [conjunction] of, inter alia, a Random Forest classifier and a Bagging Classifier Gradient Boosting classifier [ensemble bagging technique] – KP, paragraph 41).”

Claims 5, 14, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over KP in view of Wang and Bourgeois and further in view of Zhao et al. (US 20210049418) (“Zhao”).

Regarding claim 5, the rejection of claim 1 is incorporated. KP further discloses “processing the one or more storage-related features,” as shown in the rejection of claim 1 above. KP/Wang/Bourgeois appears not to disclose explicitly the further limitations of the claim.

However, Zhao discloses that “processing the one or more … features using one or more artificial intelligence techniques comprises processing the one or more … features using at least one dense artificial neural network-based[] classifier comprising an input layer, one or more hidden layers, and an output layer (first neural network may be a first fully connected neural network including an input layer with at least one input, an output layer with at least one output, and at least one hidden layer between the input layer and the output layer – Zhao, paragraph 120).” Zhao and the instant application both relate to neural networks and are analogous. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified KP/Wang/Bourgeois to perform the processing using a neural network comprising input, hidden, and output layers, as disclosed by Zhao, and an ordinary artisan could reasonably expect to have done so successfully. Doing so would allow the machine learning model to detect complex patterns given sufficient training data. See Zhao, paragraph 3.

Claim 14 is a non-transitory processor-readable storage medium claim corresponding to method claim 5 and is rejected for the same reasons as given in the rejection of that claim. Similarly, claim 19 is an apparatus claim corresponding to method claim 5 and is rejected for the same reasons as given in the rejection of that claim.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over KP in view of Zhao, Wang, and Bourgeois and further in view of Lei et al. (US 20220019872) (“Lei”).

Regarding claim 6, the rejection of claim 5 is incorporated.
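The fully connected architecture cited from Zhao (an input layer, one or more hidden layers, and an output layer), with the input layer sized to the feature count and the output layer sized to the class count as discussed for claim 6, can be sketched in plain Python. The layer sizes, weights, activations, and input vector below are the editor's hypothetical placeholders, not taken from the record:

```python
import math
import random

def dense_layer(n_in, n_out, rng):
    # Fully connected layer: every input neuron feeds every output neuron.
    # Each output neuron gets n_in weights plus one bias term.
    return [[rng.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
            for _ in range(n_out)]

def forward(layers, x):
    for i, layer in enumerate(layers):
        x = [sum(w * v for w, v in zip(weights[:-1], x)) + weights[-1]
             for weights in layer]
        if i < len(layers) - 1:
            # ReLU on hidden layers (a common choice; the cited
            # references do not specify an activation function).
            x = [max(0.0, v) for v in x]
    # Softmax over the output layer so the class scores sum to 1.
    exps = [math.exp(v) for v in x]
    total = sum(exps)
    return [e / total for e in exps]

# One input neuron per storage-related feature (claim 6 / Zhao),
# one output neuron per data archival class (claim 6 / Lei).
N_FEATURES, N_HIDDEN, N_CLASSES = 5, 8, 3
rng = random.Random(0)
net = [dense_layer(N_FEATURES, N_HIDDEN, rng),  # input -> hidden
       dense_layer(N_HIDDEN, N_HIDDEN, rng),    # hidden -> hidden
       dense_layer(N_HIDDEN, N_CLASSES, rng)]   # hidden -> output
probs = forward(net, [0.2, 0.4, 0.1, 0.9, 0.3])
print(len(probs), round(sum(probs), 6))  # 3 1.0
```

The matching of input-neuron count to feature count and output-neuron count to class count is exactly what the rejection maps to Zhao paragraph 13 and Lei paragraph 139, respectively.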
KP further discloses “a number of storage-related features” and “a number of data archival classes” (see the mapping of claim 1 supra, and note that the points-of-similar score and exact-match scores of KP paragraph 45 correspond to the claimed “number of storage-related features,” and that the duplicate data and non-duplicate data of paragraph 46 correspond to the claimed “number of data archival classes”). KP/Wang/Bourgeois appears not to disclose explicitly the further limitations of the claim.

However, Zhao discloses that “the input layer comprises a number of neurons matching a number of … features (generator may include an input for each feature of the second number of features – Zhao, paragraph 13) ….” It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified KP/Wang/Bourgeois to utilize an input layer with inputs matching a number of features, as disclosed by Zhao, and an ordinary artisan could reasonably expect to have done so successfully. Doing so would increase the interpretability of the network by ensuring that separate input nodes correspond to separate features. See Zhao, paragraph 22.

None of KP, Wang, Bourgeois, and Zhao appears to disclose explicitly the further limitations of the claim. However, Lei discloses that “the output layer comprises a number of neurons matching a number of … classes (ten output nodes correspond to ten possible classifications of the feature map input to the CNN; the number of possible classifications is equal to the number of output nodes of the classification layer – Lei, paragraph 139).” Lei and the instant application both relate to neural networks and are analogous. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of KP, Wang, Bourgeois, and Zhao to provide a number of output neurons equal to the number of possible classifications, as disclosed by Lei, and an ordinary artisan could reasonably expect to have done so successfully. Doing so would increase the interpretability of the network by ensuring that separate output nodes correspond to separate classifications. See Lei, paragraph 139.

Claim 23 is rejected under 35 U.S.C. 103 as being unpatentable over KP in view of Zhao, Bourgeois, and Wang and further in view of Kumar et al. (US 20220107974) (“Kumar”).

Regarding claim 23, none of KP, Bourgeois, and Wang appears to disclose explicitly the further limitations of the claim. However, Kumar discloses that “the one or more artificial intelligence techniques are trained using one or more storage-related policies and historical data pertaining to data archiving (label encoding may be used to encode the data when only two categories, e.g., active and archive, are present, including the dependent variable; 80% of the data may be used as a training set [i.e., the system is trained using storage-related policies “active” and “archive”] – Kumar, paragraph 54; training data may be retrieved from related historical data databases [historical data pertaining to data archiving] – id. at paragraph 50).” Kumar and the instant application both relate to machine learning for data archiving and are analogous. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified KP/Wang/Bourgeois to train the model using historical data pertaining to data archiving and storage-related policies, as disclosed by Kumar, and an ordinary artisan could reasonably expect to have done so successfully.
Doing so would optimize storage usage by allowing the system to determine automatically which files should be archived. See Kumar, paragraphs 2-4.

Response to Arguments

Applicant's arguments filed February 17, 2026 (“Remarks”) have been fully considered but, except insofar as they are rendered moot by the entry of a new ground of rejection, they are not persuasive.

Applicant first argues that the claims as amended are now eligible because migrating and replicating data are not practically performable in the mind. Remarks at 9-10. Without conceding that this is the case, the Examiner now analyzes the disputed language as insignificant extra-solution activity that is well-understood, routine, and conventional, thereby rendering the argument moot.

Applicant then argues that the decision of the Appeals Review Panel in Ex parte Desjardins compels the conclusion that the amended claims are eligible because, like the Desjardins claims, the instant claims allegedly recite an improvement to computer functionality. Remarks at 10-11. Desjardins, however, is distinguishable. As Applicant correctly points out, the Desjardins claims were directed to an improvement to a model training algorithm itself that was reflected in the disclosure. By contrast, the instant claims do not recite an improvement to model training but, at most, the use of a model to predict data archival classes and to move data based on those predictions. Applicant does not articulate what it believes the technological improvement to be; to the extent the argument is that the improvement lies in the technological field of data storage and archival, the additional elements of the claim do not reflect that improvement because, as noted above, they merely recite insignificant extra-solution activity that is well-understood, routine, and conventional.

Applicant's arguments with respect to the art rejection, Remarks at 12-14, are moot in light of the newly cited Bourgeois reference, which is relied upon to teach the disputed limitations.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RYAN C VAUGHN, whose telephone number is (571) 272-4849. The examiner can normally be reached M-R, 7:00 a.m.-5:00 p.m. ET.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kamran Afshar, can be reached at 571-272-7796. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RYAN C VAUGHN/
Primary Examiner, Art Unit 2125

Prosecution Timeline

Oct 21, 2022
Application Filed
Jul 17, 2025
Non-Final Rejection — §101, §103
Oct 03, 2025
Interview Requested
Oct 21, 2025
Response Filed
Nov 13, 2025
Final Rejection — §101, §103
Jan 01, 2026
Interview Requested
Jan 15, 2026
Examiner Interview Summary
Jan 15, 2026
Applicant Interview (Telephonic)
Jan 20, 2026
Response after Non-Final Action
Feb 17, 2026
Request for Continued Examination
Feb 25, 2026
Response after Non-Final Action
Mar 10, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602448
PROGRESSIVE NEURAL ORDINARY DIFFERENTIAL EQUATIONS
2y 5m to grant Granted Apr 14, 2026
Patent 12602610
CLASSIFICATION BASED ON IMBALANCED DATASET
2y 5m to grant Granted Apr 14, 2026
Patent 12561583
Systems and Methods for Machine Learning in Hyperbolic Space
2y 5m to grant Granted Feb 24, 2026
Patent 12541703
MULTITASKING SCHEME FOR QUANTUM COMPUTERS
2y 5m to grant Granted Feb 03, 2026
Patent 12511526
METHOD FOR PREDICTING A MOLECULAR STRUCTURE
2y 5m to grant Granted Dec 30, 2025
Based on this examiner's 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
62%
Grant Probability
81%
With Interview (+19.4%)
3y 9m
Median Time to Grant
High
PTA Risk
Based on 235 resolved cases by this examiner. Grant probability derived from career allow rate.
