DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment/Argument
This Office action is in response to the amendment filed on 01/02/2026.
Claims 1-20 are presented for further examination.
Response to Argument
Remark 1:
Applicant argues that the claims are directed to an “improvement in computer technology”.
Examiner respectfully submits that the asserted “improvement in computer technology” is reliance on a mathematical formula, calculation, and/or relationship.
Remark 2:
Applicant argues that the claims additionally describe “replacing one or more pages on memory media based on a cache policy”.
Examiner respectfully submits that the additional element “replacing one or more pages on memory media based on a cache policy” is described at a high level of genericity and is therefore insignificant extra-solution activity.
Remark 3:
Applicant argues that the invention claimed here is directed towards reserving program slots to applications to "ensure that a priority application has resources to execute when needed. In situations in which multiple entities load applications into the processing device 108 (e.g., in a multi-tenant system), one tenant may be prevented from consuming all of available program slots and blocking an important application of another tenant." See paragraph [0046] of the as-filed Specification.
Examiner respectfully submits that the element in paragraph [0046] is not claimed. Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).
Remark 4:
Applicant argues that the limitations recited in the claims are "not well-understood, routine, conventional activity".
Examiner respectfully submits that (1) claims 5, 8-10, and 12-13 explicitly recite only a mathematical formula, calculation, and/or relationship; and (2) claims 2 and 3 recite the additional elements that “replacing” the one or more pages on the memory media comprises “loading a portion of memory to” or “removing a portion of memory from” the memory media.
However, “loading a portion of memory to” and “removing a portion of memory from” the memory media are described at a high level of genericity. Therefore, these additional elements are insignificant extra-solution activities.
Remark 5:
Applicant argues that the newly added claim limitation “updating a cache policy on the memory media based on the one or more scores and replacing one or more pages on the memory media based on the cache policy” is not taught by the cited prior art.
Examiner respectfully submits that Applicant's argument has been fully considered but is moot in view of the new ground(s) of rejection set forth below. It is noted that Applicant's arguments are directed towards limitations newly added via amendment.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 2-3, 5, 8-10, and 12-13 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1: See MPEP 2106.03(II), Eligibility Step 1.
Claim(s) 2 recites determining that at least one of the one or more scores is above a threshold …
Claim(s) 3 recites determining that at least one of the one or more scores is below a threshold …
Claim(s) 5 and 13 recite performing the mixture model analysis comprises using an expectation-maximization algorithm on the memory access information, wherein the one or more scores correspond to a maximum a posteriori estimate based on the expectation-maximization algorithm.
Claim(s) 12 recites calculating the one or more scores comprises: training, in parallel, a first Gaussian and a second Gaussian.
Claim(s) 8 recites setting a threshold value; and comparing output of the mixture model with the threshold value to determine a distribution value.
Claim(s) 9 recites calculating a first score based on a frequency value and second score from the mixture model; and comparing the first score with the threshold value.
Claim(s) 10 recites the threshold value relates to a size of the memory media.
Thus, the claim is directed to a process, one of the four statutory categories (process, machine, manufacture, or composition of matter).
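For illustration only, and not as part of the prosecution record, the mathematical concept recited in claims 5, 12, and 13 (training a first Gaussian and a second Gaussian in parallel via an expectation-maximization algorithm, with scores corresponding to a maximum a posteriori estimate) can be sketched as follows; all names are hypothetical:

```python
import math

def em_two_gaussians(xs, iters=50):
    """Fit a two-component 1-D Gaussian mixture to access frequencies via
    expectation-maximization; both Gaussians are re-estimated from the same
    responsibilities in each M-step (i.e., trained in parallel)."""
    mu = [min(xs), max(xs)]   # initial means
    var = [1.0, 1.0]          # initial variances
    pi = [0.5, 0.5]           # mixing weights
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in xs:
            p = [pi[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances of both Gaussians
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-6)
    return mu, var, pi

def map_scores(xs, mu, var, pi):
    """Score each observation by the posterior probability of the
    high-frequency component (a maximum a posteriori style estimate)."""
    scores = []
    for x in xs:
        p = [pi[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
             / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
        scores.append(p[1] / sum(p))
    return scores
```

Comparing such a score against a threshold value, as recited in claims 2-3 and 8-10, is then a simple numeric comparison.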
Step 2A - Prong 1: Regarding Eligibility Step 2A, see MPEP 2106.04(II).
Next the claim is analyzed to determine whether it is directed to a judicial exception.
Claims 2-3, 5, 8-10, and 12-13 recite a mathematical concept because the claims explicitly recite a mathematical formula, calculation, and/or relationship. Thus, the claims recite abstract ideas (i.e., mental processes combined with mathematical concepts).
Step 2A - Prong 2:
This judicial exception is not integrated into a practical application.
Claims 2 and 3 recite the following additional elements: replacing memory pages, and loading a portion of memory to or removing a portion of memory from the memory media. These additional elements are generic components described at a high level of genericity and amount to insignificant extra-solution activities.
Thus, Applicant's solutions rely on mathematical concepts together with additional elements that are insignificant extra-solution activities (e.g., loading and removing a portion of memory) and generic components. Evaluated as a whole, the claims do not integrate the abstract idea into a practical application because the insignificant extra-solution activities and/or generic components do not impose any meaningful limits on practicing the abstract ideas.
Step 2B:
Claims 2-3, 5, 8-10, and 12-13 do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional elements are insignificant extra-solution activities.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim(s) 1-4, 6-7, 11, and 14-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (US 12,007,893; hereinafter Kim) in view of Cai et al. (US 2014/0089559; hereinafter Cai).
Regarding independent claims 1 and 6, taking claim 1 as exemplary, Kim teaches A device, comprising: memory media configured as cache media; and one or more circuits configured to perform operations (claim 11, A computing device, comprising: a cache memory; and a processor connected to the cache memory and configured to execute at least one computer readable program for controlling the cache memory; claim 11, a device; claim 1, method; col. 7, ll. 32-33, FIG. 2 is a configuration diagram of a computing device for an adaptive cache pool management method) comprising:
receiving memory access information (Fig. 11, step S1110 & col. 15, ll. 25-29, The processor may receive the monitoring information on the cache memories divided into a plurality of cache pools, at S1110. Here, the monitoring information {memory access information} may include at least one of an access log to an address associated with each of a plurality of cache pools, or a cache hit-rate);
performing a mixture model analysis (col. 10, ll. 15-40, The analysis module 332 may input the monitoring information to a machine learning model and acquire a memory access pattern for each workload output from the machine learning model... The analysis module 332 may apply the clustering algorithm to the monitoring information to estimate a memory access pattern for each of a plurality of workloads. For example, the analysis module 332 may use the K-means clustering algorithm, Mean-Shift Clustering algorithm, Gaussian Mixture Model (GMM), and Density-Based Spatial Clustering of Applications with Noise (BSCAN), etc. to estimate the memory access patterns for each of a plurality of workloads) based on the memory access information to produce one or more access probabilities or frequencies (col. 15, ll. 30-37, the memory access pattern for each of a plurality of workloads may be estimated by inputting at least some of the monitoring information to a machine learning model. As another example, the processor may estimate an access probability or frequency {scores} for each main memory address for each of a plurality of workloads based on at least some of the monitoring information);
Although Kim does not expressly teach scores, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to interpret the access probability or frequency as scores, for the motivation that probability, frequency, and scores are quantitative values that can be compared with reference values.
and updating the memory media based on the one or more scores (Fig. 11, step S1120 & col. 15, ll. 30-37, The processor may adjust a cache region associated with at least one of a plurality of cache pools based on the monitoring information, at S1120. The processor may estimate a memory access pattern for each of a plurality of workloads based on the monitoring information, and adjust the cache region associated with at least one of a plurality of cache pools based on the estimated memory access pattern for each of the plurality of workloads).
Kim does not expressly teach replacing one or more pages on the memory media based on the cache policy. In an analogous art of cache management, Cai teaches replacing one or more pages on the memory media based on the cache policy ([0029], Based on evaluation logic 210 identifying metric value 215, policy parameter logic 230 may determine a parameter value 235 of a cache replacement policy. Based on determined parameter value 235, policy parameter logic 230 may generate a signal 240 indicating the replacement policy to be applied for managing a cache of the computer system. In an embodiment, signal 240 specifies parameter value 235 to a cache replacement unit, where the cache replacement determines a cache replacement policy to implement based on specification of parameter value 235 with signal 240).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, with the teachings of Kim and Cai before them, to incorporate Cai’s updating of a cache policy based on an evaluated metric value and replacing one or more pages on the memory media based on the cache policy with Kim’s adjusting of a cache region based on the monitoring information, for the motivation of selecting an optimized cache replacement policy to enhance cache performance.
Thus, the combination of Kim and Cai teaches updating a cache policy on the memory media based on the one or more scores and replacing one or more pages on the memory media based on the cache policy.
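As a purely illustrative sketch of the combined teaching as mapped above (hypothetical names; this is not asserted to be either reference's actual implementation), per-page scores from a mixture model could drive a cache policy update followed by page replacement:

```python
from collections import OrderedDict

class ScoreDrivenCache:
    """Illustrative only: a cache whose policy (the set of "hot" pages) is
    updated from per-page scores, and which replaces pages accordingly."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()   # resident page -> score
        self.hot = set()

    def update_policy(self, scores, threshold=0.5):
        # Update the cache policy: pages whose score clears the threshold
        # are designated for residency.
        self.hot = {p for p, s in scores.items() if s >= threshold}

    def replace(self, scores):
        # Replace pages under the policy: evict pages no longer hot, then
        # load hot pages in descending score order while capacity remains.
        for p in list(self.pages):
            if p not in self.hot:
                del self.pages[p]
        for p in sorted(self.hot, key=lambda q: -scores[q]):
            if len(self.pages) >= self.capacity:
                break
            self.pages.setdefault(p, scores[p])
```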
Regarding independent claim 17, the claim recites substantially the same limitations as in claim 1, and is therefore rejected for the same reasons set forth in the analysis of claim 1.
Kim additionally teaches A system, comprising: a host device comprising one or more applications, and a storage device comprising memory media, wherein the storage device is configured to perform operations comprising: determining memory access information corresponding to the one or more applications (col. 8, line 64 – col. 9, line 4, the computer program 260 may include one or more instructions associated with an application. The first processor 220 may execute one or more instructions associated with the application so as to execute and control the application. The main memory address range to which the cache is allocated or the storage region (capacity) of the cache memory may be determined based on the type of application);
training a mixture model using the memory access information; determining one or more memory locations based on the mixture model; and updating the memory media using the one or more memory locations (col. 10, ll. 15-40, The analysis module 332 may input the monitoring information to a machine learning model and acquire a memory access pattern for each workload output from the machine learning model... The analysis module 332 may apply the clustering algorithm to the monitoring information to estimate a memory access pattern for each of a plurality of workloads. For example, the analysis module 332 may use the K-means clustering algorithm, Mean-Shift Clustering algorithm, Gaussian Mixture Model (GMM);
col. 7, ll. 3-5, The memory access pattern may include an access probability/frequency {scores} to the address range of main memory for each workload (see FIG. 10 ). As illustrated in FIG. 1 , cache regions (Region_#0 and Region_#1) allocated to the cache pools 110 and 120 may be adjusted based on the memory access pattern for each workload after a period t1).
Regarding claim(s) 2, the combination of Kim and Cai further teaches wherein updating the cache policy on the memory media comprises: determining that at least one of the one or more scores is above a threshold; and replacing the one or more pages on the memory media comprises: loading a portion of memory corresponding to the at least one of the one or more scores to the memory media (Kim, col. 10, ll. 53-62, the cache pool control module 334 may adjust a cache region associated with at least one of the plurality of cache pools based on the determined address range for each workload. In this case, the cache pool control module 334 may perform instructions for widening/narrowing the main memory range for the cache pool, and/or instructions for allocating more/less of a predetermined cache memory capacity to a specific cache pool, so as to adjust the cache region associated with the specific cache pool; col. 15, ll. 21-24, the cache pool adjustment based on the memory access patterns may be performed immediately if the cache hit-rate is less than or equal to a predetermined threshold;
Cai, [0037], generating a signal based on the parameter value determined at 320, the signal indicating a replacement policy. In an embodiment, based on the signal generated at 330, a line of cache memory is selected for an eviction according to the indicated replacement policy).
Regarding claim(s) 3 and 15, taking claim 3 as exemplary analysis, the combination of Kim and Cai further teaches wherein updating the cache policy of the memory media comprises: determining that at least one of the one or more scores is below a threshold; and replacing the one or more pages on the memory media comprises removing a portion of memory corresponding to the at least one of the one or more scores from the memory media (Kim, col. 10, ll. 53-62, the cache pool control module 334 may adjust a cache region associated with at least one of the plurality of cache pools based on the determined address range for each workload. In this case, the cache pool control module 334 may perform instructions for widening/narrowing the main memory range for the cache pool, and/or instructions for allocating more/less of a predetermined cache memory capacity to a specific cache pool, so as to adjust the cache region associated with the specific cache pool; col. 15, ll. 21-24, the cache pool adjustment based on the memory access patterns may be performed immediately if the cache hit-rate is less than or equal to a predetermined threshold;
Cai, [0037], generating a signal based on the parameter value determined at 320, the signal indicating a replacement policy. In an embodiment, based on the signal generated at 330, a line of cache memory is selected for an eviction according to the indicated replacement policy).
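For illustration of the threshold limitations of claims 2 and 3 as mapped above (hypothetical names; not drawn from either reference), the load/remove behavior reduces to a score-versus-threshold comparison:

```python
def apply_threshold(cache, page, score, threshold):
    """Load a portion of memory when its score is above the threshold
    (claim 2); remove it when the score is below the threshold (claim 3)."""
    if score > threshold:
        cache.add(page)        # load to the memory media
    elif score < threshold:
        cache.discard(page)    # remove from the memory media
    return cache
```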
Regarding claim(s) 4 and 16, taking claim 4 as exemplary analysis, Kim further teaches wherein the memory access information is first access information, and wherein the one or more circuits is further configured to perform operations comprising: receiving second access information; and updating the mixture model analysis based on the second access information to produce the one or more scores (col. 8, ll. 11-19, FIG. 8 illustrates a method for outputting a memory access pattern 820 output through a machine learning model 800. The machine learning model 800 illustrated in FIG. 8 may be a model repeatedly trained a predetermined number of times or more. As illustrated in FIG. 8 , monitoring information 810 (including first access information and second access information) may be input to the machine learning model 800, and the machine learning model 800 may output the memory access pattern 820 for each workload based on the monitoring information 810).
Regarding claim(s) 7 and 18, taking claim 7 as exemplary analysis, Kim further teaches wherein the memory access information comprises at least one of address information or order of access information; and the at least one of the address information or order of access information are input to the mixture model (col. 15, ll. 27-33 & 38-40, the monitoring information may include at least one of an access log {order} to an address associated with each of a plurality of cache pools, or a cache hit-rate. The processor may adjust a cache region associated with at least one of a plurality of cache pools based on the monitoring information … the memory access pattern for each of a plurality of workloads may be estimated by inputting at least some of the monitoring information to a machine learning model).
Regarding claim(s) 11, Kim further teaches wherein the mixture model is a Gaussian mixture model (GMM) (col. 10, ll. 15-40, the analysis module 332 may use the K-means clustering algorithm, Mean-Shift Clustering algorithm, Gaussian Mixture Model (GMM), and Density-Based Spatial Clustering of Applications with Noise (BSCAN), etc. to estimate the memory access patterns for each of a plurality of workloads).
Regarding claim(s) 14, Kim further teaches writing data to the memory media based on the one or more scores (col. 2, ll. 6-11, each of the plurality of cache pools may be allocated a different workload, each workload may be associated with a different data structure, and the adjusting the cache region may include estimating a memory access pattern for each workload associated with the different data structure based on the monitoring information; col. 2, ll. 28-31, The estimating the memory access pattern may include estimating an access probability or frequency {scores} for each main memory address for each of the plurality of workloads based on at least some of the monitoring information).
Regarding claim(s) 19, Kim further teaches wherein updating the memory media using the one or more memory locations comprises: writing data to the memory media based on the one or more memory locations (col. 10, ll. 50-61, the cache pool control module 334 may determine an address range for each workload based on the memory access pattern. In addition, the cache pool control module 334 may adjust a cache region associated with at least one of the plurality of cache pools based on the determined address range for each workload. In this case, the cache pool control module 334 may perform instructions for widening/narrowing the main memory range for the cache pool, and/or instructions for allocating more/less of a predetermined cache memory capacity to a specific cache pool, so as to adjust the cache region associated with the specific cache pool).
Regarding claim(s) 20, Kim further teaches wherein updating the memory media using the one or more memory locations comprises: removing data from the memory media based on the one or more memory locations (col. 10, ll. 50-61, the cache pool control module 334 may determine an address range for each workload based on the memory access pattern. In addition, the cache pool control module 334 may adjust a cache region associated with at least one of the plurality of cache pools based on the determined address range for each workload. In this case, the cache pool control module 334 may perform instructions for widening/narrowing the main memory range for the cache pool, and/or instructions for allocating more/less of a predetermined cache memory capacity to a specific cache pool, so as to adjust the cache region associated with the specific cache pool).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRACY C CHAN whose telephone number is (571)272-9992. The examiner can normally be reached on Monday - Friday 10 AM to 6 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tim Vo can be reached on (571)272-3642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TRACY C CHAN/ Primary Examiner, Art Unit 2138