Prosecution Insights
Last updated: April 20, 2026
Application No. 18/087,887

APPARATUS AND METHOD FOR PROBABILISTIC CACHE REPLACEMENT FOR ACCELERATING ADDRESS TRANSLATION

Status: Non-Final OA (§103)
Filed: Dec 23, 2022
Examiner: BLUST, JASON W
Art Unit: 2132
Tech Center: 2100 — Computer Architecture & Software
Assignee: Intel Corporation
OA Round: 1 (Non-Final)
Grant Probability: 79% (Favorable)
Expected OA Rounds: 1-2
Median Time to Grant: 2y 3m
Grant Probability With Interview: 96%

Examiner Intelligence

Career Allow Rate: 79% (220 granted / 277 resolved); above average, +24.4% vs TC avg
Interview Lift: +16.2% (strong; resolved cases with interview vs. without)
Avg Prosecution: 2y 3m typical timeline; 24 currently pending
Total Applications: 301 across all art units

Statute-Specific Performance

§101: 6.6% (-33.4% vs TC avg)
§103: 46.2% (+6.2% vs TC avg)
§102: 23.8% (-16.2% vs TC avg)
§112: 13.4% (-26.6% vs TC avg)
Tech Center averages are estimates. Based on career data from 277 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Interpretation

In regards to claims 19-27, the claims do not explicitly state that the machine-readable medium is "non-transitory". However, since the claims do state that the program code is "stored thereon", and the plain meaning of stored is "to keep or accumulate for future use" and/or "retain or enter for future electronic retrieval", the examiner has therefore interpreted (in addition to the specification's ¶161 differentiation between non-transitory machine-readable storage media and transitory machine-readable communication media) that the claimed "machine-readable medium" cannot consist of a transitory (temporary) medium such as a "signal per se", and is therefore patent eligible. While the examiner in this case has interpreted the claims as patent eligible, it may still be in the applicant's best interest to amend the claims to explicitly recite a "non-transitory machine-readable medium" to avoid any future confusion or argument over the matter.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-27 are rejected under 35 U.S.C. 103 as being unpatentable over Huberty (US 2023/0066236).
In regards to claims 1, 10, and 19, taking claim 1 as exemplary, Huberty teaches "a processor comprising: a plurality of cores, each core to execute instructions and process data" (fig. 1, ¶25-26, processors 10A-10N; each processor executes instructions from the instruction cache 20 and accesses data in the data cache 28); "a cache to be shared by a subset of the plurality of cores" (fig. 1, ¶28, last level cache (LLC) 14, comprising cache 32 accessed by processors 10A-10N) "for storing page table entry (PTE) cachelines and non-PTE cachelines" (¶26-27, the LLC caches data for the MMU 30 (i.e. PTE cachelines), and also for the ICache 20 and DCache 28 (i.e. non-PTE cachelines)).

Huberty may not explicitly teach "the cache comprising an N-way set associative cache" or "a cache manager to implement a PTE-aware eviction policy for evicting cachelines from the cache, the PTE-aware eviction policy to reduce a rate at which the PTE cachelines are evicted to service non-PTE cacheline fills".

In regards to "the cache comprising an N-way set associative cache", Huberty does teach in ¶28 that the cache 32 of the LLC 14 "may have any capacity and configuration". ¶4 teaches set associative caches, with a plurality (i.e. N) number of columns referred to as ways (i.e. an N-way set associative cache). ¶23 teaches that wide set associative caches (i.e. the larger the value of N, the wider) can be useful for implementing a replacement policy that uses criticality values of the cachelines as a factor. Therefore, it would have been obvious to one of ordinary skill in the art prior to the effective filing date of the invention to have modified the system of Huberty to make the cache 32 of the LLC 14 an N-way set associative cache.
The motivation for such is that Huberty suggests that a set associative cache may simplify a replacement policy that takes into account the criticality values of cache lines, and the substitution of an N-way set associative cache for a generic cache would yield predictable results to one of ordinary skill in the art.

In regards to "a cache manager to implement a PTE-aware eviction policy for evicting cachelines from the cache, the PTE-aware eviction policy to reduce a rate at which the PTE cachelines are evicted to service non-PTE cacheline fills", Huberty does teach in ¶48 that the criticality control circuit 34 is responsible for replacement (i.e. implementing an eviction policy) of cache lines in the cache 32 of the LLC 14 when a cache miss is detected (i.e. servicing a cacheline fill). ¶27-30 teaches that cache lines can be marked critical or non-critical, and that entries for the MMU (i.e. page table entries, PTEs) can be marked (categorized) as critical, while other requests (i.e. non-PTEs) can be considered non-critical. ¶50 and fig. 4 teach that cache lines marked as critical (i.e. PTEs) can be probabilistically biased to prevent critical cache lines from being replaced (i.e. the rate at which PTE cachelines are selected to be evicted is reduced compared to non-PTE cachelines). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the system of Huberty to implement a "PTE-aware" eviction policy by marking only MMU entries (i.e. PTEs) as critical and other entries (i.e. non-PTEs) as non-critical, so as to decrease the rate of replacement of critical cachelines compared to non-critical cachelines by using a biased pseudo-random selection to preferentially select non-critical (non-PTE) marked cachelines for replacement over critical (PTE) cachelines.
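The biased pseudo-random selection the examiner maps onto the claimed PTE-aware policy can be illustrated with a minimal Python sketch. This is an illustration only, not Huberty's disclosed circuit: the function name, the `is_pte` flag, and the 10% bias figure are all assumptions.

```python
import random

def select_victim(ways, pte_evict_prob=0.1, rng=random.random):
    """Pick a way to evict from one cache set, biased against PTE lines.

    ways: list of dicts like {'is_pte': bool}, one per way in the set.
    pte_evict_prob: illustrative probability of drawing the victim from
    the PTE lines when non-PTE lines are also available.
    """
    pte = [i for i, w in enumerate(ways) if w['is_pte']]
    non_pte = [i for i, w in enumerate(ways) if not w['is_pte']]
    if pte and non_pte:
        # The bias: only rarely is a PTE (critical) line even eligible.
        pool = pte if rng() < pte_evict_prob else non_pte
    else:
        pool = pte or non_pte  # degenerate set: evict whatever is there
    return random.choice(pool)
```

With this shape, the rate at which PTE cachelines are evicted to service fills drops roughly in proportion to `pte_evict_prob`, which is the effect the claim language recites.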
The motivation for such modifications is based on the teachings above (and Huberty's use of "may" when determining which factors to consider for marking cachelines as critical/non-critical), and the fact that Huberty explicitly states that various modifications and variations would be apparent to those skilled in the art (¶119, ¶20). These modifications would yield predictable results to one of ordinary skill in the art.

In regards to claims 2-4, 11-13, and 20-22, Huberty further makes obvious "wherein if a number of PTE cachelines in a set is less than a first threshold, the cache manager is to evict PTE cachelines from the set at a first rate or with a first probability"; "wherein if the number of PTE cachelines in a set is greater than the first threshold, the cache manager is to evict PTE cachelines from the set at a second rate greater than the first rate or with a second probability greater than the first probability"; "wherein if a number of PTE cachelines in the set is greater than a third threshold, the cache manager is to evict PTE cachelines from the set at a third rate greater than the second rate or with a third probability greater than the second probability"; and "wherein the first probability is in a range of 0% to 1%, the second probability is in a range of 5% to 20%, and the third probability is in a range of 95% to 100%". ¶50 and ¶64 teach that the criticality control circuit 34 (cache manager) may employ multiple biased pseudo-random selections based on different probabilities. The probabilities of selection would increase as the number of critical (PTE) cachelines increases, and probability functions could be chosen such that the probability ranges (i.e. first through third probability) are fulfilled when the number of cachelines marked as critical (PTE) in a set is above or below a certain count or percentage (the first and third thresholds).
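One way to picture the threshold-tiered probabilities recited in claims 2-4 is a simple step function. This is a sketch under assumed threshold values; the claims require only the three ranges (0-1%, 5-20%, 95-100%), not these exact numbers.

```python
def pte_eviction_probability(num_pte, first_threshold, third_threshold):
    """Map the count of PTE cachelines in a set to one of three claimed
    probability tiers. The thresholds are design choices, as the examiner
    notes, driven by cache size and number of ways."""
    if num_pte < first_threshold:
        return 0.01   # first probability (0-1%): PTEs almost never evicted
    if num_pte <= third_threshold:
        return 0.10   # second probability (5-20%): moderate pressure
    return 0.95       # third probability (95-100%): set saturated with PTEs
```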
This is a design issue that could be implemented by one of ordinary skill in the art while yielding predictable results, based on the size of the cache, the number of ways, and the desired probabilities.

In regards to claims 5, 14, and 23, Huberty further makes obvious "wherein each cacheline is to include a PTE bit to differentiate between the PTE cachelines and the non-PTE cachelines" (¶36 teaches the cache may have a field (i.e. a bit) for a criticality value, i.e. a 1 for critical (PTE) data and a 0 for non-critical (non-PTE) data), and "the cache manager to determine the number of PTE cachelines in the set based on a number of PTE bits set to 1 in the set" (¶35-36 teaches that the amount/capacity of a cache can be determined based off a field in the cache tag, i.e. the critical/non-critical, PTE/non-PTE field).

In regards to claims 6, 15, and 24, Huberty further makes obvious "wherein the cache manager is to generate a way mask with a bit value associated with each way to indicate if a PTE cacheline is stored in the way" (¶50, the criticality control circuit 34 (cache manager) may selectively mask critical (PTE) cache lines; per ¶4, as each entry is associated with a "way", the masking for each row can be seen as a "way" mask).

In regards to claims 7, 16, and 25, Huberty further makes obvious "wherein the cache manager is to determine an eviction candidate for a first set based on a least recently used (LRU) eviction policy, the cache manager to exclude certain ways from the LRU eviction policy based on the way mask" (see at least fig. 4 and ¶48-50, where an LRU entry is selected and cachelines marked as critical (i.e. containing PTEs) can be selectively masked, i.e. the ways containing those entries are excluded from selection).

In regards to claims 8, 17, and 26, Huberty further makes obvious "wherein the cache manager is to exclude ways from consideration which are indicated in the way mask to store PTE cachelines" (see at least fig. 4 and ¶48-50, where an LRU entry is selected and cachelines marked as critical (i.e. containing PTEs) can be selectively masked, i.e. the ways containing those entries are excluded from selection).

In regards to claims 9, 18, and 27, Huberty further makes obvious "wherein the cache manager is to increase the eviction rate or probability of PTE cacheline evictions for PTE cachelines without a cache hit within a pre-defined window". ¶66 teaches that the criticality control circuit 34 may accelerate (increase the rate/probability of) the eviction of cache lines identified as critical (PTE cachelines) based on one or more indications. ¶67-72 teaches that one of these indications is whether the critical cachelines are no longer being accessed (i.e. no cache hits within a pre-defined window), and/or whether the cache hit rates are below a set threshold (i.e. the number of cache hits within a certain number of requests (i.e. a window) is below a set amount).

EXAMINER'S NOTE

Examiner has cited particular paragraphs, figures, and/or columns and line numbers in the references applied to the claims above for the convenience of the Applicants. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested of the Applicants, in preparing responses, to fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner.
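The way-mask and masked-LRU mechanics of claims 6-8 can likewise be sketched in a few lines of Python. This is a conceptual illustration, not the claimed hardware; the `age` field standing in for LRU state and both function names are assumptions.

```python
def build_way_mask(ways):
    """One bit per way: 1 if the way holds a PTE cacheline (claims 6/15/24).
    ways: list of dicts like {'is_pte': bool, 'age': int}."""
    return [1 if w['is_pte'] else 0 for w in ways]

def lru_victim(ways, mask):
    """Pick the least recently used way, skipping ways the mask marks as
    holding PTE lines (claims 7-8/16-17/25-26); fall back to plain LRU if
    every way in the set holds a PTE line."""
    candidates = [i for i in range(len(ways)) if not mask[i]]
    if not candidates:
        candidates = list(range(len(ways)))
    return max(candidates, key=lambda i: ways[i]['age'])  # largest age = LRU
```

Under this sketch, a set whose oldest line is a PTE line still evicts the oldest non-PTE line, which is exactly the exclusion-by-mask behavior the claims recite.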
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure, as it could have been used to support a §103 rejection of the claimed invention in lieu of, in addition to, or in combination with Huberty:

Mattina (US 2006/0004963) teaches that the CPU and request type (such as MMU/PTE) of a cache line can be tracked with the cache lines and can be taken into account when selecting a cache block (line) for eviction. It also teaches bit way masking.

Luick (US 6,349,362) teaches an n-way associative cache where certain ways of the cache are assigned to hold TLB data (i.e. page table entries, PTEs), while other ways contain regular data.

Olszewski (US 2011/0153949) teaches that certain types of cache entries can be marked to be protected from replacement (until the number of protected entries meets a limit/threshold), and therefore their rate/probability of being selected for eviction is reduced compared to cachelines marked/designated as non-protected.

Yoshioka (US 2005/0268041) teaches that each entry in a way of a cache can contain a priority attribute that indicates a type of data to preferentially be stored in that way.

Hughest (US 2018/0203798) teaches that a cache can implement a quality of service (QoS) protocol for cache placement/eviction based on indications of the owner of the data and the type of transaction the data is associated with (i.e. data, instruction, page table).

Kumar (US 2023/0012880) teaches that cache entries can be marked as preferential or non-preferential based on the data stored in the cache entry, such that entries from page table walks (i.e. PTEs) can be preferential. This preferential indication can be used to determine which (and when) entries are promoted or demoted (i.e. replaced/evicted) through the cache hierarchy.

Jakkula (US 10,754,784) teaches that a cache manager can assign a metadata type to each cache entry based upon its determined level of importance.
Based upon different factors, such as the determined level of importance and frequency of accesses, the cache manager can selectively evict cache entries.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JASON W BLUST, whose telephone number is (571) 272-6302. The examiner can normally be reached 12-8:30 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Hosain Alam, can be reached at (571) 272-3978. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JASON W BLUST/
Primary Examiner, Art Unit 2132

Prosecution Timeline

Dec 23, 2022: Application Filed
Feb 09, 2023: Response after Non-Final Action
Jan 23, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596485
HOST DEVICE GENERATING BLOCK MAP INFORMATION, METHOD OF OPERATING THE SAME, AND METHOD OF OPERATING ELECTRONIC DEVICE INCLUDING THE SAME
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12554417
DISTRIBUTED DATA STORAGE CONTROL METHOD, READABLE MEDIUM, AND ELECTRONIC DEVICE
Granted Feb 17, 2026 (2y 5m to grant)
Patent 12535954
STORAGE DEVICE AND OPERATING METHOD THEREOF
Granted Jan 27, 2026 (2y 5m to grant)
Patent 12530120
Maximizing Data Migration Bandwidth
Granted Jan 20, 2026 (2y 5m to grant)
Patent 12530118
DATA PROCESSING METHOD AND RELATED DEVICE
Granted Jan 20, 2026 (2y 5m to grant)
Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 79%
With Interview: 96% (+16.2%)
Median Time to Grant: 2y 3m
PTA Risk: Low
Based on 277 resolved cases by this examiner. Grant probability derived from career allow rate.
