Prosecution Insights
Last updated: April 19, 2026
Application No. 18/948,174

SYSTEMS AND METHODS FOR IMPROVING CACHE EFFICIENCY AND UTILIZATION

Non-Final OA — §103, §DP
Filed: Nov 14, 2024
Examiner: CHAN, TRACY C
Art Unit: 2138
Tech Center: 2100 — Computer Architecture & Software
Assignee: Intel Corporation
OA Round: 1 (Non-Final)
Grant Probability: 79% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 5m
With Interview: 79%

Examiner Intelligence

Career Allow Rate: 79% (280 granted / 354 resolved; +24.1% vs TC avg) — above average
Interview Lift: +0.0% (minimal lift; based on resolved cases with interview)
Avg Prosecution: 2y 5m (typical timeline; 16 applications currently pending)
Total Applications: 370 across all art units

Statute-Specific Performance

§101: 4.5% (-35.5% vs TC avg)
§103: 56.3% (+16.3% vs TC avg)
§102: 10.7% (-29.3% vs TC avg)
§112: 15.7% (-24.3% vs TC avg)
Tech Center averages are estimates • Based on career data from 354 resolved cases

Office Action

§103 §DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Application

This office action is in response to the Application filed on 11/14/2024. Claims 1-2, 5, 8-10, and 26-37 are presented for examination.

Drawings

The drawings submitted on 11/14/2024 are accepted.

Specification

The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors.
In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-2, 5, 8-10, 26-28, and 32-34 are rejected under 35 U.S.C. 103 as being unpatentable over Moon (US 2017/0357600) in view of Zulauf (US 2008/0052466).

Regarding independent claims 1, 26, and 32, taking claim 1 as exemplary analysis, Moon teaches a graphics processor, comprising: processing resources to perform graphics operations (Fig. 1 & [0036], the IP 1220 may include, for example, a graphics processing unit (GPU); [0037], the IP 1220 may include cache memory; [0008], a method for caching graphic processing unit (GPU) data in a multimedia processing system; note that it is a known technique to implement a method, such as the method taught by Moon in view of Zulauf, as software stored in a computer readable medium for the benefit of increased flexibility by allowing the method to be changed easily). Moon teaches a cache policy setting circuit of a cache coupled to the processing resources ([0005], a memory device including a cell array storing a plurality of cache lines and a plurality of tags corresponding to the plurality of cache lines, a cache policy setting circuit selecting from a plurality of managing policies at least one managing policy and setting a cache policy based on the at least one selected managing policy, and cache logic managing the plurality of cache lines based on the cache policy). However, Moon does not expressly teach a cache controller.
In an analogous art of cache configuration, Zulauf teaches a cache controller of a cache ([0020], the cache controller 108 includes a cache line allocation controller 136 to determine the cache line allocation policies for the cache 110 and to configure the cache controller 108 to comply with the selected cache line allocation policies). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, with the teachings of Moon and Zulauf before them, to incorporate Zulauf's cache controller to implement a certain cache line allocation policy with Moon's cache policy setting circuit, for the motivation that a cache controller implementing cache line allocation policies is well known in the field.

The combination of Moon and Zulauf further teaches wherein the cache controller is configured to control cache priority by determining whether default cache settings or an instruction to control cache operations for the cache (Moon, [0082], the cache policy selector 132 may select at least one of the plurality of managing policies 131; when the cache policy setting command CMD_CP is received from the memory controller 200, the cache policy selector 132 may select a managing policy in response to the cache policy setting command CMD_CP; [0118]-[0119], referring to FIG. 12, the memory device 100 may set a default cache policy in operation S410 … The memory device 100 may transmit the set cache policy or the information about the cache policy to the memory controller 200 in operation S430 in response to the request of the memory controller 200. When the cache policy is requested to be changed, the memory controller 200 may transmit a cache policy setting command to the memory device 100 in operation S440; Zulauf, [0021], the cache line allocation controller 136 initially configures the cache controller 108 to implement default (or instruction independent) cache line allocation policies for corresponding memory addresses or memory regions (or to implement a default global cache line allocation policy). As the execution of instructions commences, the cache allocation controller 136 is notified of an instruction to be executed and, based on the instruction and other information, the cache allocation controller 136 configures the cache controller 108 to implement an alternate cache line allocation policy (other than the default cache line allocation policy) for the instruction if so warranted by the instruction and by a policy arbitration process).

Regarding claims 2, 27, and 33, the combination of Moon and Zulauf further teaches wherein the cache controller is configured to determine the default cache settings and further to determine whether the instruction has been received, wherein the cache controller is configured to apply the instruction if the instruction has been received, and wherein the cache controller is to apply the default cache settings if the instruction has not been received (Moon, Fig. 12, wherein the default cache setting is overridden in response to a cache policy request that has been received).
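The determination logic the examiner maps across Moon and Zulauf, and which claims 2, 27, and 33 recite, reduces to a simple precedence rule: apply the cache attribute carried by a received instruction, otherwise fall back to the default cache settings. A minimal Python sketch of that rule follows; the names `CacheRequest` and `resolve_cache_attribute`, and the "write-back" default, are hypothetical illustrations and appear in neither the claims nor the cited references.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical default; the claims leave the default cache settings open.
DEFAULT_ATTRIBUTE = "write-back"

@dataclass
class CacheRequest:
    """A memory request, optionally carrying a per-instruction cache attribute."""
    address: int
    attribute: Optional[str] = None  # None means no instruction was received

def resolve_cache_attribute(request: CacheRequest,
                            default: str = DEFAULT_ATTRIBUTE) -> str:
    """Apply the instruction's attribute if one was received, else the default."""
    if request.attribute is not None:
        return request.attribute
    return default
```

In this sketch, a streaming store carrying a write-streaming attribute overrides the default, while an ordinary store with no attribute falls through to the default setting.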
Regarding claims 5, 28, and 34, the combination of Moon and Zulauf further teaches wherein the cache comprises a first level cache of the processing resources (Moon, [0034], a cache (for example, an L1 cache)), wherein the default cache settings comprise one or more of caching, write-through, write-back, write-streaming for store or atomics operations, load caching, or load streaming for load or prefetch operations (Zulauf, [0016], examples of caching actions can include, for example, "no caching" … one cache line allocation policy can designate a certain write action as a write-through action).

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP §§ 706.02(l)(1) - 706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.

Claims 1-2, 5-10, and 26-37 are rejected on the ground of nonstatutory double patenting as being anticipated by claims 1-13 of U.S. Patent No. 12,210,477. Although the claims at issue are not identical, they are not patentably distinct from each other because the instant claims are anticipated as shown below.
Current application vs. U.S. Patent No. 12,210,477:

Instant claims 1, 26, 32: A graphics processor, comprising: processing resources to perform graphics operations; and a cache controller of a cache coupled to the processing resources, wherein the cache controller is configured to control cache priority by determining whether default cache settings or an instruction to control cache operations for the cache.
Patent claim 1: A graphics processor, comprising: processing resources to perform graphics operations; and a cache controller of a cache of the graphics processor coupled to the processing resources, wherein the cache controller is configured to control cache priority by determining a default setting having a default cache attribute to be applied if no received instruction, determining whether an instruction from an application with an associated cache attribute has been received, and applying the instruction and the associated cache attribute when the instruction is received by the cache controller, …

Instant claims 2, 27, 33: the cache controller is configured to determine the default cache settings and further to determine whether the instruction has been received, wherein the cache controller is configured to apply the instruction if the instruction has been received, and wherein the cache controller to apply the default cache settings if the instruction has not been received.
Patent claims 1, 3, 4: the cache controller is configured to … determining a default setting having a default cache attribute to be applied if no received instruction, determining whether an instruction from an application with an associated cache attribute has been received; the cache controller is configured to apply the instruction and associated cache attribute if the instruction has been received; the cache controller applies the default cache attribute if an instruction has not been received.

Instant claims 5, 28, 34: the cache comprises a first level cache of the processing resources, wherein the default cache settings comprise one or more of caching, write-through, write-back, write-streaming for store or atomics operations, load caching, or load streaming for load or prefetch operations.
Patent claims 5, 6, 7: the cache comprises a first level cache of the processing resources; the associated cache attribute comprises no caching, write-through, write-back, or write-streaming for store or atomics operations; the associated cache attribute comprises no caching, load caching, or load streaming for load or prefetch operations.

Instant claims 8, 29, 35: the cache controller is configured to receive a store message having a write-streaming attribute to stream data for a streaming store that is cached at low priority in the cache, wherein a least recently used (LRU) position of the cache is used to merge partial writes in the cache until a full cache line is generated based on a plurality of partial writes.
Patent claims 1, 8, 11: the cache controller is configured to receive the instruction including a store instruction having a write-streaming attribute to stream data for a streaming store that is cached with a low priority cache attribute in the cache; wherein the data being streamed for a streaming store that is cached at low priority in the cache is evicted using a least recently used (LRU) position of the cache; wherein a least recently used (LRU) position of the cache is used to merge partial writes in the cache until a full cache line is generated based on a plurality of partial writes.

Instant claims 9, 30, 36: the cache controller is configured to receive a load message having an invalid after read attribute to invalidate data for a cache hit in the cache after a read operation if the data is from a private memory.
Patent claim 9: the cache controller is configured to receive a load message having an invalid after read attribute to invalidate data for a cache hit in the cache after a read operation if the data is from a private memory.

Instant claims 10, 31, 37: the cache controller is configured to receive a prefetch message having a load streaming attribute with data being streamed that is prefetched into the cache and given low priority, wherein the data being streamed is then evicted using a LRU position in the cache.
Patent claim 10: the cache controller is configured to receive a prefetch message having a load streaming attribute with data being streamed that is prefetched into the cache and given low priority, wherein the data being streamed is then evicted using a LRU position in the cache.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRACY C CHAN, whose telephone number is (571) 272-9992. The examiner can normally be reached Monday - Friday, 10 AM to 6 PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, TIM VO, can be reached at (571) 272-3642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/TRACY C CHAN/
Primary Examiner, Art Unit 2138
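The partial-write merging recited in instant claims 8, 29, and 35 (a streaming store cached at low priority, with the LRU position used to merge partial writes until a full cache line is generated) can be illustrated with a minimal Python sketch. The class name, method name, and 64-byte line size are assumptions for illustration, not taken from the application or the cited art.

```python
LINE_SIZE = 64  # assumed cache line size in bytes

class StreamingLineBuffer:
    """A line held at low priority (the LRU position) while partial writes merge."""

    def __init__(self) -> None:
        self.data = bytearray(LINE_SIZE)
        self.valid = [False] * LINE_SIZE  # tracks which bytes have been written

    def merge_partial_write(self, offset: int, payload: bytes) -> bool:
        """Merge one partial write; return True once a full line is generated."""
        self.data[offset:offset + len(payload)] = payload
        for i in range(offset, offset + len(payload)):
            self.valid[i] = True
        return all(self.valid)
```

Once the method reports a full line, a controller in this scheme would write the assembled line out and evict it from the LRU position rather than promoting it.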

Prosecution Timeline

Nov 14, 2024: Application Filed
Dec 16, 2024: Response after Non-Final Action
Dec 12, 2025: Non-Final Rejection (§103, §DP)
Mar 27, 2026: Response Filed

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602102 — SELECTIVE BACKUP TO PERSISTENT MEMORY FOR VOLATILE MEMORY (2y 5m to grant; granted Apr 14, 2026)
Patent 12596656 — PREFETCH AWARE LRU CACHE REPLACEMENT POLICY (2y 5m to grant; granted Apr 07, 2026)
Patent 12591381 — METHOD FOR ADJUSTING OPERATION MODE MEMORY STORAGE DEVICE AND MEMORY CONTROL CIRCUIT UNIT (2y 5m to grant; granted Mar 31, 2026)
Patent 12579070 — Prune policies (2y 5m to grant; granted Mar 17, 2026)
Patent 12566700 — EXTERNAL MEMORY AS AN EXTENSION TO LOCAL PRIMARY MEMORY (2y 5m to grant; granted Mar 03, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 79%
With Interview: 79% (+0.0%)
Median Time to Grant: 2y 5m
PTA Risk: Low
Based on 354 resolved cases by this examiner. Grant probability derived from career allow rate.
