Prosecution Insights
Last updated: April 19, 2026
Application No. 17/499,678

MACHINE-LEARNED MODELS FOR PREDICTING DATABASE SIZE

Non-Final OA (§101, §103, §112)

Filed: Oct 12, 2021
Examiner: ALLEN, NICHOLAS E
Art Unit: 2154
Tech Center: 2100 (Computer Architecture & Software)
Assignee: SAP SE
OA Round: 4 (Non-Final)

Grant Probability: 77% (Favorable)
Predicted OA Rounds: 4-5
Predicted Time to Grant: 3y 3m
Grant Probability with Interview: 93%

Examiner Intelligence

Career Allow Rate: 77% (above average); 585 granted / 760 resolved; +22.0% vs. Tech Center average
Interview Lift: +16.2% (strong), comparing resolved cases with and without an interview
Typical Timeline: 3y 3m average prosecution; 68 applications currently pending
Career History: 828 total applications across all art units

Statute-Specific Performance

§101: 22.7% (-17.3% vs. TC avg)
§103: 50.6% (+10.6% vs. TC avg)
§102: 16.1% (-23.9% vs. TC avg)
§112: 4.7% (-35.3% vs. TC avg)
Tech Center averages are estimates. Based on career data from 760 resolved cases.
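The headline figures above can be cross-checked against the raw counts reported in this section. This is only a sanity check; treating the interview lift as additive on top of the career allow rate is an assumption about how the analytics combine, not a documented formula.

```python
# Sanity-check the dashboard figures (counts taken from this report).
granted, resolved = 585, 760
allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.1f}%")  # ~77.0%, matching the 77% shown

# Assumed additive model: career rate + interview lift ~ rate with interview.
interview_lift = 16.2
print(f"With interview (assumed additive): {allow_rate + interview_lift:.0f}%")  # ~93%
```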

Office Action

Statutes at issue: §101, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on January 30, 2026 has been entered. In response to Applicant's claims filed on January 30, 2026, claims 1-20 are now pending for examination in the application.

Response to Arguments

This Office action is in response to the amendment filed 01/30/2026. In this action, claims 1, 8, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Murthy et al. (US Pub. No. 20210240539), DeLuca et al. (US Pub. No. 20190303356), Sarferez (US Pub. No. 20200380155), and Eberlein (US Pub. No. 20210182108), in further view of KHAWAS et al. (US Pub. No. 20210241131). The Murthy et al. reference has been added to address the amendment of receiving, from a software database, telemetry data comprising record counts and archiving logs for a plurality of application tables.
Applicant's arguments: Regarding claim 1 on page 8, Applicant argues: "Under Step 2A, Prong One, this is not a mental process: as a practical matter, a human cannot train and execute k-means clustering, logistic regression, ARIMA segmentation, and a second-stage learned-weight combiner over multi-year histograms and archiving mappings to output database capacity forecasts in their head."

Examiner's reply: A computer is being used as a tool with a received set of data; a human using a variety of mathematical concepts would have been able to predict a database size from a historic database size.

Applicant's arguments: Regarding claim 1 on page 9, Applicant argues: "These features collectively solve a computer-centric problem, specifically forecasting database growth to prevent performance degradation and storage exhaustion, by improving how such forecasts are generated using learned weights and dual-model architecture. The treatment of hardware as irrelevant ignores that the models can operate over large in-memory tables and multi-year telemetry and cannot be performed absent a computer. Under Step 2A, Prong Two, the claims integrate any alleged exception into a practical application that improves computer database capacity planning."

Examiner's reply: Although mathematical concepts may be eligible when additional elements integrate the exception into a practical application, the current claims do not include any additional elements that provide such an integration. Receiving telemetry data and capacity planning are conventional computer functions that merely link the abstract idea to a generic computing environment. These I/O steps do not integrate an abstract idea into a practical application. See MPEP 2106.05(g), Insignificant Extra-Solution Activity.
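For orientation, the two-stage architecture described in Applicant's argument can be sketched in schematic form. This is an illustration only: the claimed first-stage models (k-means plus logistic regression thresholding and ARIMA trend segmentation) are collapsed here into a naive linear extrapolation, and all histories and weights are invented, not taken from the specification.

```python
# Schematic sketch of the claimed two-stage forecast (illustrative only).

def first_stage_forecast(history, horizon):
    """Naive per-table stand-in for the first model: extend the average growth trend."""
    growth = (history[-1] - history[0]) / (len(history) - 1)
    return history[-1] + growth * horizon

def second_stage_total(per_table_forecasts, learned_weights):
    """Second model: total size = sum of (predicted table size x learned per-table weight)."""
    return sum(w * f for w, f in zip(learned_weights, per_table_forecasts))

# Example: three hypothetical "heavy" tables with monthly size histories (GB).
histories = [[100, 110, 120, 130], [50, 52, 54, 56], [200, 190, 185, 180]]
forecasts = [first_stage_forecast(h, horizon=6) for h in histories]
weights = [1.05, 0.98, 1.10]  # hypothetical learned weights
print(second_stage_total(forecasts, weights))  # weighted-sum total forecast
```

The second function is the step the Office Action characterizes as "a mathematical concept of summation": the total forecast is a weighted sum over the N heavy tables.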
Applicant's arguments: Regarding claim 1 on page 9, Applicant argues: "Finally, in Step 2B, the Final Office Action states, 'Mathematical tools such as ARIMA and linear/polynomial regression models that are used in the prediction of the database size are well-understood, routine, and conventional.' That statement does not address the claimed combination and ordered interaction. The eligibility inquiry at Step 2B requires evidence that the additional elements, in combination, were well-understood, routine, and conventional at the relevant time. There are no factual findings, citations, or evidence showing that the specific dual-model architecture (first, per-table forecasting from histograms and archiving logs; second, a separately trained combiner using learned weights per heavy table informed by historical database telemetry and mappings to archiving objects) and the computation of the total forecast by summing products across learned per-table weights were well-understood, routine, and conventional."

Examiner's reply: Making a prediction is well-understood, routine, and conventional. Mathematical tools such as ARIMA and linear/polynomial regression models that are used in the prediction of the database size are well-understood, routine, and conventional.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1, 8, and 15 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. Claims 1, 8, and 15 contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. There is no support for "receiving, from a software database, telemetry data comprising record counts and archiving logs for a plurality of application tables ...". Dependent claims 2-7, 9-14, and 16-20 are also rejected for inheriting the deficiencies of the independent claims from which they depend.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The judicial exception is not integrated into a practical application.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The eligibility analysis in support of these findings is provided below, in accordance with the "2019 Revised Patent Subject Matter Eligibility Guidance" (published 1/7/2019 in Fed. Register, Vol. 84, No. 4 at pgs. 50-57, hereinafter the "2019 PEG").

Step 1. In accordance with Step 1 of the eligibility inquiry (as explained in MPEP 2106), it is first noted that the claimed system (claims 1-7), method (claims 8-14), and medium (claims 15-20) are directed to one of the eligible categories of subject matter and therefore satisfy Step 1.

Step 2A, Prong One. In accordance with Step 2A, prong one of the 2019 PEG, it is noted that the independent claims recite an abstract idea falling within the Mental Processes and Mathematical Concepts enumerated groupings of abstract ideas set forth in the 2019 PEG. Examiner is of the position that independent claims 1, 8, and 15 are directed towards the Mental Processes grouping of abstract ideas.
Independent claims 1, 8, and 15 recite the following limitations directed towards Mental Processes and Mathematical Concepts: for each of N application tables, in a software database, that have a larger size than a remainder of the application tables in the database, wherein N is an integer, generating a prediction of a size for a corresponding application table at a future time period (the limitation recites a mental process of observation and/or evaluation capable of being performed by the human mind, using the computer as a tool, to generate a prediction of a table size); generating a prediction of a total size for the database at the future time period (the limitation recites a mental process of observation and/or evaluation capable of being performed by the human mind, using the computer as a tool, to generate a prediction of a database size); and multiplying the predicted size for each application table by its corresponding learned weight (the limitation recites a mathematical concept of summation).

Step 2A, Prong Two. In accordance with Step 2A, prong two of the 2019 PEG, the judicial exception is not integrated into a practical application because of the recitation in claims 1, 8, and 15 of: at least one hardware processor (i.e., a generic processor/component performing a generic computer function); a computer-readable medium (i.e., a generic component performing a generic computer function) storing instructions that, when executed by the at least one hardware processor, cause the at least one hardware processor to perform operations comprising: receiving, from a software database, telemetry data comprising record counts and archiving logs for a plurality of application tables (recites insignificant extra-solution activity that amounts to data gathering); and feeding information about operations performed on the application table into a first machine learned model, and the resulting predictions into a second machine learned model, the second machine learned model having been trained using a machine learning algorithm (recites insignificant extra-solution activity that amounts to modeling a database), the training including inputting historical database size information, historical table size information for the N application tables, and a mapping between the N application tables and archiving objects into the machine learning algorithm, and the machine learning algorithm learning a separate weight applied to each of the N application tables based on the inputs.

Step 2B. Similar to the analysis under Step 2A, Prong Two, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. Because the additional elements of the independent claims amount to insignificant extra-solution activity and/or mere instructions, the additional elements do not add significantly more to the judicial exception such that the independent claims as a whole would be patent eligible. Therefore, independent claims 1, 8, and 15 are rejected under 35 U.S.C. 101.

With respect to claims 2, 9, and 16:

Step 2A, Prong One: Examiner is of the position the dependent claim is directed toward additional elements.

Step 2A, Prong Two: wherein the information about the number of records in the application table over time is a histogram and the first machine learning model comprises a histogram threshold prediction machine learned model that predicts a time point in the histogram at which a trend of a record size for the corresponding application table changes (recites insignificant extra-solution activity that amounts to data gathering).

Step 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. The claim is not patent eligible.

With respect to claims 3, 10, and 17:

Step 2A, Prong One: Examiner is of the position the dependent claim is directed toward additional elements.
Step 2A, Prong Two: wherein the histogram threshold prediction machine learned model is a combination of a k-means and a logistic regression model (recites insignificant extra-solution activity that amounts to modeling a database size).

Step 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. The claim is not patent eligible.

With respect to claims 4, 11, and 18:

Step 2A, Prong One: Examiner is of the position the dependent claim is directed toward additional elements.

Step 2A, Prong Two: wherein the first machine learned model further comprises an archived period trend prediction machine learned model that takes a first portion of the histogram for time periods prior to the predicted threshold and outputs a first trend, and a non-archived period trend prediction machine learned model that takes a second portion of the histogram for time periods after the predicted threshold and outputs a second trend (recites insignificant extra-solution activity that amounts to modeling a database size).

Step 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. The claim is not patent eligible.

With respect to claims 5, 12, and 19:

Step 2A, Prong One: Examiner is of the position the dependent claim is directed toward additional elements.

Step 2A, Prong Two: wherein the first trend and the second trend are used along with a residence time and a frequency to predict a table size (recites insignificant extra-solution activity that amounts to modeling a size).

Step 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. The claim is not patent eligible.
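The threshold and residence-time limitations at issue in claims 2 through 6 can be illustrated with a toy changepoint finder: locate the time point where a table's per-period growth trend changes (reflecting when archiving began), then subtract that threshold from the period of the last archiving operation. The claims use k-means plus logistic regression; the simple two-segment split below is a stand-in for that, and all values are invented.

```python
# Toy stand-in for the claimed "histogram threshold prediction" model.

def trend_changepoint(counts):
    """Split the record-count series into two segments whose per-period
    deltas each have minimal summed squared deviation; return the split index."""
    deltas = [b - a for a, b in zip(counts, counts[1:])]
    def sse(xs):
        if not xs:
            return 0.0
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs)
    return min(range(1, len(deltas)), key=lambda k: sse(deltas[:k]) + sse(deltas[k:]))

# Growth slows from +100/period to +20/period once archiving starts at period 4.
counts = [0, 100, 200, 300, 400, 420, 440, 460, 480]
threshold = trend_changepoint(counts)          # -> 4
last_archiving_period = 8                      # hypothetical
residence_time = last_archiving_period - threshold  # per claims 6, 13, and 20
print(threshold, residence_time)               # prints "4 4"
```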
With respect to claims 6, 13, and 20:

Step 2A, Prong One: Examiner is of the position the dependent claim is directed toward additional elements.

Step 2A, Prong Two: wherein the residence time is calculated by subtracting the predicted threshold from a time period of a last archiving operation performed on the corresponding table (recites insignificant extra-solution activity that amounts to modeling a size).

Step 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. The claim is not patent eligible.

With respect to claim 7:

Step 2A, Prong One: Examiner is of the position the dependent claim is directed toward additional elements.

Step 2A, Prong Two: wherein the archived period trend prediction machine learned model utilizes an AutoRegressive Integrated Moving Average (ARIMA) model (recites insignificant extra-solution activity that amounts to modeling prediction data).

Step 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. The claim is not patent eligible.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 8, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Murthy et al. (US Pub. No. 20210240539), DeLuca et al. (US Pub. No. 20190303356), Sarferez (US Pub. No. 20200380155), and Eberlein (US Pub. No. 20210182108), in further view of KHAWAS et al. (US Pub. No. 20210241131).

With respect to claim 1, Murthy et al. discloses a system comprising: at least one hardware processor (Paragraph 83 discloses a processing resource, e.g., a microcontroller, a microprocessor, central processing unit core(s), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA), and the like); and a computer-readable medium (Paragraph 123 discloses a machine-readable storage medium) storing instructions that, when executed by the at least one hardware processor, cause the at least one hardware processor to perform operations comprising: receiving, from a software database, telemetry data comprising record counts and archiving logs for a plurality of application tables (Paragraph 39 discloses reference to files and credentials for accessing historical telemetry data (e.g., logs, alerts, and monitoring data), and Paragraph 49 discloses learning information relating to current and historical consumption trends, non-limiting examples of such information including the current and historical allocation of resources, the average consumption of allocated resources, the growth rate in consumption of allocated disk storage, whether the application data is to be backed up, and the size of the database instances).

Murthy et al. does not explicitly disclose for each of N application tables, in a software database, that have a larger size than a remainder of the application tables in the database, wherein N is an integer. However, DeLuca et al.
teaches for each of N application tables, in a software database, that have a larger size than a remainder of the application tables in the database, wherein N is an integer. Therefore, it would have been obvious at the time the invention was made to a person having ordinary skill in the art to modify Murthy et al. with DeLuca et al. to include, for each of N application tables in a software database, those that have a larger size than a remainder of the application tables in the database, wherein N is an integer.

Murthy et al. as modified by DeLuca et al. does not disclose feeding information about a number of records over time in the application table and information about historical archiving operations performed on the application table into a table size prediction component that utilizes a first machine learned model, the first machine learned model making a prediction of a size for the application table at a future time period. However, Sarferez teaches, for each of the N application tables, feeding information about a number of records over time in the application table and information about historical archiving operations performed on the application table into a table size prediction component that utilizes a first machine learned model, the first machine learned model making a prediction of a size for the application table at a future time period (Paragraph 29 discloses a machine learning model, and Paragraph 40 discloses predictive services 152 can include functionality for clustering, forecasting, making recommendations, detecting outliers, or conducting "what if" analyses). Therefore, it would have been obvious at the time the invention was made to a person having ordinary skill in the art to modify Murthy et al. and DeLuca et al. with Sarferez to include feeding information about a number of records over time in the application table and information about historical archiving operations performed on the application table into a table size prediction component that utilizes a first machine learned model, the first machine learned model making a prediction of a size for the application table at a future time period. This would have facilitated forecasting database sizes for future needs. See Sarferez Paragraphs 3-13.

Murthy et al. and DeLuca et al. as modified by Sarferez do not disclose feeding the N predictions from the table size prediction component into a second machine learned model, the second machine learned model having been trained using a machine learning algorithm to predict a total size for the database at the future time period, the training including inputting historical database size information, historical table size information for the N application tables, and a mapping between the N application tables and archiving objects into a machine learning algorithm, and the machine learning algorithm learning a separate weight applied to each of the N application tables based on the inputs, the predicted total database size computed by summing products calculated by multiplying the predicted application table size for each of the N application tables by its corresponding learned weight.
However, Eberlein teaches generating a prediction of a size for a corresponding application table at a future time period by feeding the predictions from the first machine learning model into a second machine learned model, the second machine learned model having been trained using a machine learning algorithm, the training including inputting historical database size information, historical table size information for the N application tables, and a mapping between the N application tables and archiving objects into a machine learning algorithm, and the machine learning algorithm learning a separate weight applied to each of the N application tables based on the inputs, the predicted total database size computed by summing products calculated by multiplying the predicted application table size for each application table by its corresponding learned weight (Paragraph 56 discloses Resource Consumption Prediction 308 returns a predicted load profile for the new software process, which includes parameter(s) for an in-memory database size or other criteria needed to execute the new software process (for example, a 100 GB in-memory database, 200 GB of RAM, and 3 CPU)). Therefore, it would have been obvious at the time the invention was made to a person having ordinary skill in the art to modify Murthy et al., DeLuca et al., and Sarferez with Eberlein. This would have facilitated forecasting database sizes for future needs. See Eberlein Paragraphs 2-6.

Murthy et al. and DeLuca et al. as modified by Sarferez and Eberlein do not disclose generating a prediction of a size for a corresponding application table at a future time. However, KHAWAS et al. teaches generating a prediction of a size for a corresponding application table at a future time (Paragraph 79 discloses predicting migration techniques based on migration information for a candidate database migration (e.g., the circumstances for the candidate database migration), such as source database information, target infrastructure, platform, database size, and the like). Therefore, it would have been obvious at the time the invention was made to a person having ordinary skill in the art to modify Murthy et al., DeLuca et al., Sarferez, and Eberlein with KHAWAS et al. This would have facilitated forecasting database sizes for future needs. See KHAWAS et al. Paragraphs 2-4.

With respect to claim 8, Murthy et al. discloses a method comprising: receiving, from a software database, telemetry data comprising record counts and archiving logs for a plurality of application tables (Paragraph 39 discloses reference to files and credentials for accessing historical telemetry data (e.g., logs, alerts, and monitoring data), and Paragraph 49 discloses learning information relating to current and historical consumption trends, non-limiting examples of such information including the current and historical allocation of resources, the average consumption of allocated resources, the growth rate in consumption of allocated disk storage, whether the application data is to be backed up, and the size of the database instances).

Murthy et al. does not explicitly disclose for each of N application tables, in a software database, that have a larger size than a remainder of the application tables in the database, wherein N is an integer. However, DeLuca et al.
teaches for each of N application tables, in a software database, that have a larger size than a remainder of the application tables in the database, wherein N is an integer. Therefore, it would have been obvious at the time the invention was made to a person having ordinary skill in the art to modify Murthy et al. with DeLuca et al. to include, for each of N application tables in a software database, those that have a larger size than a remainder of the application tables in the database, wherein N is an integer.

Murthy et al. as modified by DeLuca et al. does not disclose feeding information about a number of records over time in the application table and information about historical archiving operations performed on the application table into a table size prediction component that utilizes a first machine learned model, the first machine learned model making a prediction of a size for the application table at a future time period. However, Sarferez teaches, for each of the N application tables, feeding information about a number of records over time in the application table and information about historical archiving operations performed on the application table into a table size prediction component that utilizes a first machine learned model, the first machine learned model making a prediction of a size for the application table at a future time period (Paragraph 29 discloses a machine learning model, and Paragraph 40 discloses predictive services 152 can include functionality for clustering, forecasting, making recommendations, detecting outliers, or conducting "what if" analyses). Therefore, it would have been obvious at the time the invention was made to a person having ordinary skill in the art to modify Murthy et al. and DeLuca et al. with Sarferez to include feeding information about a number of records over time in the application table and information about historical archiving operations performed on the application table into a table size prediction component that utilizes a first machine learned model, the first machine learned model making a prediction of a size for the application table at a future time period. This would have facilitated forecasting database sizes for future needs. See Sarferez Paragraphs 3-13.

Murthy et al. and DeLuca et al. as modified by Sarferez do not disclose feeding the N predictions from the table size prediction component into a second machine learned model, the second machine learned model having been trained using a machine learning algorithm to predict a total size for the database at the future time period, the training including inputting historical database size information, historical table size information for the N application tables, and a mapping between the N application tables and archiving objects into a machine learning algorithm, and the machine learning algorithm learning a separate weight applied to each of the N application tables based on the inputs, the predicted total database size computed by summing products calculated by multiplying the predicted application table size for each of the N application tables by its corresponding learned weight.
However, Eberlein teaches generating a prediction of a size for a corresponding application table at a future time period by feeding the predictions from the first machine learning model into a second machine learned model, the second machine learned model having been trained using a machine learning algorithm, the training including inputting historical database size information, historical table size information for the N application tables, and a mapping between the N application tables and archiving objects into a machine learning algorithm, and the machine learning algorithm learning a separate weight applied to each of the N application tables based on the inputs, the predicted total database size computed by summing products calculated by multiplying the predicted application table size for each application table by its corresponding learned weight (Paragraph 56 discloses Resource Consumption Prediction 308 returns a predicted load profile for the new software process, which includes parameter(s) for an in-memory database size or other criteria needed to execute the new software process (for example, a 100 GB in-memory database, 200 GB of RAM, and 3 CPU)). Therefore, it would have been obvious at the time the invention was made to a person having ordinary skill in the art to modify Murthy et al., DeLuca et al., and Sarferez with Eberlein. This would have facilitated forecasting database sizes for future needs. See Eberlein Paragraphs 2-6.

Murthy et al. and DeLuca et al. as modified by Sarferez and Eberlein do not disclose generating a prediction of a size for a corresponding application table at a future time. However, KHAWAS et al. teaches generating a prediction of a size for a corresponding application table at a future time (Paragraph 79 discloses predicting migration techniques based on migration information for a candidate database migration (e.g., the circumstances for the candidate database migration), such as source database information, target infrastructure, platform, database size, and the like). Therefore, it would have been obvious at the time the invention was made to a person having ordinary skill in the art to modify Murthy et al., DeLuca et al., Sarferez, and Eberlein with KHAWAS et al. This would have facilitated forecasting database sizes for future needs. See KHAWAS et al. Paragraphs 2-4.

With respect to claim 15, Murthy et al. discloses a non-transitory machine-readable medium storing instructions which, when executed by one or more processors, cause the one or more processors to perform operations comprising: receiving, from a software database, telemetry data comprising record counts and archiving logs for a plurality of application tables (Paragraph 39 discloses reference to files and credentials for accessing historical telemetry data (e.g., logs, alerts, and monitoring data), and Paragraph 49 discloses learning information relating to current and historical consumption trends, non-limiting examples of such information including the current and historical allocation of resources, the average consumption of allocated resources, the growth rate in consumption of allocated disk storage, whether the application data is to be backed up, and the size of the database instances).

Murthy et al. does not explicitly disclose for each of N application tables, in a software database, that have a larger size than a remainder of the application tables in the database, wherein N is an integer. However, DeLuca et al.
teaches for each of N application tables, in a software database, that have a larger size than a remainder of the application tables in the database, wherein N is an integer, Therefore, it would have been obvious at the time the invention was made to a person having ordinary skill in the art to modify Murthy et al. with DeLuca et al. to include for each of N application tables, in a software database, that have a larger size than a remainder of the application tables in the database, wherein N is an integer, Murthy et al. as modified by DeLuca et al. does not disclose feeding information about a number of records over time in the application table and information about historical archiving operations performed on the application table into a table size prediction component that utilizes a first machine learned model, the first machine learned model making a prediction of a size for the application table at a future time period. However, Sarferez teaches for each of the N application tables: feeding information about a number of records over time in the application table and information about historical archiving operations performed on the application table into a table size prediction component that utilizes a first machine learned model, the first machine learned model making a prediction of a size for the application table at a future time period (Paragraph 29 discloses a machine learning model and 40 discloses predictive services 152 can include functionality for clustering, forecasting, making recommendations, detecting outliers, or conducting “what if” analyses); and Therefore, it would have been obvious at the time the invention was made to a person having ordinary skill in the art to modify Murthy et al. and DeLuca et al. 
with Sarferez to include feeding information about a number of records over time in the application table and information about historical archiving operations performed on the application table into a table size prediction component that utilizes a first machine learned model, the first machine learned model making a prediction of a size for the application table at a future time period. This would have facilitated forecasting database sizes for future needs. See Sarferez Paragraph(s) 3-13.

Murthy et al. and DeLuca et al. as modified by Sarferez do not disclose feeding the N predictions from the table size prediction component into a second machine learned model, the second machine learned model having been trained using a machine learning algorithm to predict a total size for the database at the future time period, the training including inputting historical database size information, historical table size information for the N application tables, and a mapping between the N application tables and archiving objects into a machine learning algorithm, and the machine learning algorithm learning a separate weight applied to each of the N application tables based on the inputs, the predicted total database size computed by summing products calculated by multiplying the predicted application table size for each of the N application tables by its corresponding learned weight.
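For readers parsing the claim language, the recited second-stage computation (a learned per-table weight applied to each first-stage table-size prediction, then summed into a total database size) reduces to a short sum of products. The sketch below is a minimal illustration of that arithmetic only; the table names, sizes, and weights are hypothetical, and this is not the applicant's actual implementation or training procedure:

```python
# Sketch of the recited computation: predicted total database size =
# sum over the N tables of (predicted table size at the future time
# period) x (that table's learned weight). All values are hypothetical.

table_predictions = {   # stage-1 outputs: predicted size per table, in GB
    "TABLE_A": 120.0,
    "TABLE_B": 310.0,
    "TABLE_C": 45.0,
}

learned_weights = {     # per-table weights learned by the second model
    "TABLE_A": 1.10,
    "TABLE_B": 0.95,
    "TABLE_C": 1.30,
}

def predict_total_size(predictions: dict, weights: dict) -> float:
    """Sum of products: predicted table size x learned weight, over all N tables."""
    return sum(size * weights[table] for table, size in predictions.items())

total = predict_total_size(table_predictions, learned_weights)  # 485.0 GB
```

In the claim as amended, the weights themselves come from training on historical database sizes, historical table sizes, and the table-to-archiving-object mapping; the dictionary above simply stands in for that trained output.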
However, Eberlein teaches generating a prediction of a size for a corresponding application table at a future time period by feeding the predictions from the first machine learned model into a second machine learned model, the second machine learned model having been trained using a machine learning algorithm, the training including inputting historical database size information, historical table size information for the N application tables, and a mapping between the N application tables and archiving objects into a machine learning algorithm, and the machine learning algorithm learning a separate weight applied to each of the N application tables based on the inputs, the predicted total database size computed by summing products calculated by multiplying the predicted application table size for each application table by its corresponding learned weight (Paragraph 56 discloses that Resource Consumption Prediction 308 returns a predicted load profile for the new software process, which includes parameter(s) for an in-memory database size or other criteria needed to execute the new software process (for example, a 100 GB in-memory database, 200 GB of RAM, and 3 CPUs)). Therefore, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to modify Murthy et al. and DeLuca et al. and Sarferez with Eberlein. This would have facilitated forecasting database sizes for future needs. See Eberlein Paragraph(s) 2-6.

Murthy et al. and DeLuca et al. as modified by Sarferez and Eberlein do not disclose generating a prediction of a size for a corresponding application table at a future time. However, KHAWAS et al.
teaches generating a prediction of a size for a corresponding application table at a future time (Paragraph 79 discloses predicting migration techniques based on migration information for a candidate database migration (e.g., the circumstances for the candidate database migration), such as source database information, target infrastructure, platform, database size, and the like). Therefore, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to modify Murthy et al. and DeLuca et al. and Sarferez and Eberlein with KHAWAS et al. This would have facilitated forecasting database sizes for future needs. See KHAWAS et al. Paragraph(s) 2-4.

Dependent claims are not rejected over the prior art.

Relevant Prior Art

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. US Pub. No. 20210116905 is directed to Machine-learned Models for Predicting Database Application Table Growth Factor: Column 2, Lines 22-41: The predicted size of an ERP software database can be used in a number of different ways. In a typical ERP software database, the size of the database increases as new records are generated. Users are also able to perform archiving activities to archive records out of the application tables to reduce the size of the database. The prediction of the size of the database over time can therefore be useful in recommending how urgently the user needs to begin archiving operations, or whether they need to change their current archiving and/or new-record-addition behavior in the near future to avoid having the database grow beyond a certain size, such as a size beyond which database performance will suffer, storage space becomes unreasonably expensive, or it is simply challenging technologically to add additional memory to the database.
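The kind of capacity estimate the cited passage describes (growth from new records offset by archiving, extrapolated to a capacity limit) can be sketched with simple arithmetic. The numbers and the linear-extrapolation assumption below are hypothetical illustration only, not the machine-learned models at issue in the claims:

```python
# Hypothetical "months until full" estimate: given the current database
# size, a steady monthly growth from new records, and a steady monthly
# archiving volume, linearly extrapolate to a capacity limit.

def months_until_full(current_gb: float, growth_gb_per_month: float,
                      archived_gb_per_month: float, capacity_gb: float):
    """Return months until capacity is reached, or None if net growth
    is zero or negative (the database never fills up)."""
    net_growth = growth_gb_per_month - archived_gb_per_month
    if net_growth <= 0:
        return None
    return (capacity_gb - current_gb) / net_growth

# e.g. 800 GB today, +50 GB/month of new records, 20 GB/month archived,
# 2000 GB capacity -> (2000 - 800) / 30 = 40 months
months = months_until_full(800, 50, 20, 2000)
```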
Thus, for example, it would be useful to predict after how many months the database will be "full" if the user keeps their ERP software database (and corresponding archiving or data reduction operations) performing as-is.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NICHOLAS E ALLEN, whose telephone number is (571) 270-3562. The examiner can normally be reached Monday through Thursday, 8:30-6:30. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Boris Gorney, can be reached at (571) 270-5626. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/N.E.A/ Examiner, Art Unit 2154
/BORIS GORNEY/ Supervisory Patent Examiner, Art Unit 2154

Prosecution Timeline

Oct 12, 2021
Application Filed
Oct 03, 2024
Non-Final Rejection — §101, §103, §112
Dec 20, 2024
Response Filed
May 05, 2025
Non-Final Rejection — §101, §103, §112
Jul 09, 2025
Examiner Interview Summary
Jul 09, 2025
Applicant Interview (Telephonic)
Aug 05, 2025
Response Filed
Nov 29, 2025
Final Rejection — §101, §103, §112
Jan 30, 2026
Request for Continued Examination
Feb 10, 2026
Response after Non-Final Action
Mar 20, 2026
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12380068
RECENT FILE SYNCHRONIZATION AND AGGREGATION METHODS AND SYSTEMS
2y 5m to grant Granted Aug 05, 2025
Patent 12339822
METHOD AND SYSTEM FOR MIGRATING CONTENT BETWEEN ENTERPRISE CONTENT MANAGEMENT SYSTEMS
2y 5m to grant Granted Jun 24, 2025
Patent 12321704
COMPOSITE EXTRACTION SYSTEMS AND METHODS FOR ARTIFICIAL INTELLIGENCE PLATFORM
2y 5m to grant Granted Jun 03, 2025
Patent 12271379
CROSS-DATABASE JOIN QUERY
2y 5m to grant Granted Apr 08, 2025
Patent 12259876
SYSTEM AND METHOD FOR A HYBRID CONTRACT EXECUTION ENVIRONMENT
2y 5m to grant Granted Mar 25, 2025
Based on the 5 most recent grants.


Prosecution Projections

4-5
Expected OA Rounds
77%
Grant Probability
93%
With Interview (+16.2%)
3y 3m
Median Time to Grant
High
PTA Risk
Based on 760 resolved cases by this examiner. Grant probability derived from career allow rate.
