Prosecution Insights
Last updated: April 19, 2026
Application No. 18/308,397

METHODS AND APPARATUS FOR MANAGING THE COOLING OF A DISTRIBUTED COOLING SYSTEM

Status: Non-Final OA (§102)
Filed: Apr 27, 2023
Examiner: BROWN, MICHAEL J
Art Unit: 2115
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Intel Corporation
OA Round: 1 (Non-Final)

Grant Probability: 88% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 10m
With Interview: 97%

Examiner Intelligence

Career Allow Rate: 88% (905 granted / 1029 resolved), above average (+32.9% vs TC avg)
Interview Lift: +8.8% (moderate), measured across resolved cases with interview
Typical Timeline: 2y 10m average prosecution; 24 applications currently pending
Career History: 1053 total applications across all art units
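The headline figures above fit together arithmetically: the 88% career allow rate is 905 grants over 1029 resolved cases, and the 97% with-interview figure is that base rate plus the additive +8.8% interview lift. A minimal Python sketch of that arithmetic follows; the function names are illustrative only, not any vendor's actual API, and the additive treatment of the lift is an assumption based on how the numbers reconcile.

```python
# Sketch: reproducing the dashboard's headline figures from the raw
# counts shown above. Function names are illustrative, not a real API.

def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def with_interview_pct(base_pct: float, lift_pct: float) -> float:
    """Grant probability assuming the interview lift is additive,
    capped at 100%."""
    return min(base_pct + lift_pct, 100.0)

base = allow_rate_pct(905, 1029)               # counts from the examiner card
print(round(base))                             # 88
print(round(with_interview_pct(base, 8.8)))    # 97
```

Note that 905/1029 is 87.9%, which the dashboard rounds to 88%, and 87.9% + 8.8% is 96.7%, which rounds to the displayed 97%.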

Statute-Specific Performance

§101: 10.3% (-29.7% vs TC avg)
§103: 43.0% (+3.0% vs TC avg)
§102: 25.9% (-14.1% vs TC avg)
§112: 7.1% (-32.9% vs TC avg)

Black line in the chart = Tech Center average estimate. Based on career data from 1029 resolved cases.
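Each row above pairs the examiner's statute-specific rate with a delta against the Tech Center average, so the implied TC baseline can be recovered as rate minus delta. A quick sketch, with values copied from the table; reading the deltas as simple additive offsets is an assumption:

```python
# Sketch: recovering the implied Tech Center average per statute.
# If each delta is (examiner rate - TC avg), then TC avg = rate - delta.
# Values are copied from the table above.
examiner = {
    "101": (10.3, -29.7),
    "103": (43.0, +3.0),
    "102": (25.9, -14.1),
    "112": (7.1, -32.9),
}

tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in examiner.items()}
print(tc_avg)
# Every statute implies the same ~40.0% TC baseline, so the four
# deltas are internally consistent with a single Tech Center average.
```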

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 4/27/2023 was filed. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-6, 8-15, 17, 18, and 21-24 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ruso et al. [Ruso] (US PGPub 2024/0064941).

As to claim 1, Ruso discloses an apparatus (cooling control device 100; see Figs. 1 and 2) comprising: memory (memory 201, see Fig. 2); machine readable instructions (program instructions; see paragraph 0036, line 19); and programmable circuitry (processor 202, see Fig. 2) to at least one of instantiate or execute the machine readable instructions (see paragraph 0036, lines 17-19) to: input operational data (one or more server indicators; see paragraph 0048, line 2) into a machine-learning model (machine learning model; see paragraph 0049, line 9), the operational data including first information (server indicators relating to a task to be executed by at least one component of the server; see paragraph 0048, lines 2-3) relating to a workload (task) of a server (one or more servers 101, see Fig. 1) and second information (server indicators from the temperature sensors 103; see paragraph 0048, lines 5-6) relating to an ambient condition (temperature) of the server (Step 300, see Fig. 3 and paragraph 0048, lines 1-10); compare a predicted cooling power requirement (expected cooling demand; see paragraph 0052, line 2) for a time period with a predicted cooling power availability (percentage of total cooling available; see paragraph 0055, lines 7-8) for the time period, the predicted cooling power requirement based on an output of the machine-learning model (Steps 310 and 340, see Fig. 3; also see paragraph 0049, lines 1-12 and paragraph 0059, line 9); and generate a cooling plan (cooling amount provided by a cooling mechanism; see paragraph 0055, lines 2-3) based on the comparison, the cooling plan to define operation of at least one of the server or a cooling system (cooling mechanism device 102, see Fig. 1) used to cool the server during the time period (Steps 320 and 350, see Fig. 3 and paragraph 0055, lines 2-10).

As to claim 2, Ruso discloses the apparatus of claim 1, wherein the cooling plan defines temporally segmented cooling plans for different time segments of the time period, the operation of at least one of the server or the cooling system to change between different ones of the time segments (see paragraph 0061, lines 23-29).
As to claim 3, Ruso discloses the apparatus of claim 2, wherein a number of the temporally segmented cooling plans is greater than two (see paragraph 0061, lines 23-29).

As to claim 4, Ruso discloses the apparatus of claim 1, wherein the operation of the server is to be at least one of throttled or deployed to another server when the predicted cooling power requirement exceeds the predicted available cooling power availability (see paragraph 0028, lines 1-10 and paragraph 0029, lines 3-13).

As to claim 5, Ruso discloses the apparatus of claim 1, wherein the operation of the cooling system is to reduce a temperature of the server when the predicted cooling power requirement is less than the predicted available cooling power availability (see paragraph 0022, lines 7-15).

As to claim 6, Ruso discloses the apparatus of claim 1, wherein the operation of the server is to increase a temperature of the server when the predicted cooling power requirement exceeds the predicted available cooling power availability (see paragraph 0022, lines 7-15).

As to claim 8, Ruso discloses the apparatus of claim 1, wherein the second information includes: sensor data related to a current ambient condition of the server (see paragraph 0053, lines 5-6); historic records of past ambient conditions of the server (see paragraph 0058, lines 3-5); and forecasts of future ambient conditions on the server (see paragraph 0056, lines 9-16).

As to claim 9, Ruso discloses a non-transitory machine readable storage medium (memory 201, see Fig. 2) comprising instructions (program instructions; see paragraph 0036, line 19) to cause programmable circuitry (processor 202, see Fig. 2) to at least: input operational data (one or more server indicators; see paragraph 0048, line 2) into a machine-learning model (machine learning model; see paragraph 0049, line 9), the operational data including first information (server indicators relating to a task to be executed by at least one component of the server; see paragraph 0048, lines 2-3) relating to a workload (task) of a server (one or more servers 101, see Fig. 1) and second information (server indicators from the temperature sensors 103; see paragraph 0048, lines 5-6) relating to an ambient condition (temperature) of the server (Step 300, see Fig. 3 and paragraph 0048, lines 1-10); compare a predicted cooling power requirement (expected cooling demand; see paragraph 0052, line 2) for a time period with a predicted cooling power availability (percentage of total cooling available; see paragraph 0055, lines 7-8) for the time period, the predicted cooling power requirement based on an output of the machine-learning model (Steps 310 and 340, see Fig. 3; also see paragraph 0049, lines 1-12 and paragraph 0059, line 9); and generate a cooling plan (cooling amount provided by a cooling mechanism; see paragraph 0055, lines 2-3) based on the comparison, the cooling plan to define operation of at least one of the server or a cooling system (cooling mechanism device 102, see Fig. 1) used to cool the server during the time period (Steps 320 and 350, see Fig. 3 and paragraph 0055, lines 2-10).

As to claim 10, Ruso discloses the non-transitory machine readable medium of claim 9, wherein the cooling plan defines temporally segmented cooling plans for different time segments of the time period, the operation of at least one of the server or the cooling system to change between different ones of the time segments (see paragraph 0061, lines 23-29).
As to claim 11, Ruso discloses the non-transitory machine readable medium of claim 10, wherein a number of the temporally segmented cooling plans is greater than two (see paragraph 0061, lines 23-29).

As to claim 12, Ruso discloses the non-transitory machine readable medium of claim 9, wherein the operation of the server is to be at least one of throttled or deployed to another server when the predicted cooling power requirement exceeds the predicted available cooling power availability (see paragraph 0028, lines 1-10 and paragraph 0029, lines 3-13).

As to claim 13, Ruso discloses the non-transitory machine readable medium of claim 9, wherein the operation of the cooling system is to reduce a temperature of the server when the predicted cooling power requirement is less than the predicted available cooling power availability (see paragraph 0022, lines 7-15).

As to claim 14, Ruso discloses the non-transitory machine readable medium of claim 9, wherein the operation of the server is to increase a temperature of the server when the predicted cooling power requirement exceeds the predicted available cooling power availability (see paragraph 0022, lines 7-15).

As to claim 15, Ruso discloses the non-transitory machine readable medium of claim 9, wherein the first information includes at least one of an instruction set associated with the workload or a power requirement of an input/output device of the server, the input/output device to be used during the execution of the workload (see paragraph 0040, lines 7-11 and paragraph 0043, lines 11-17).
As to claim 17, Ruso discloses a method comprising: inputting operational data (one or more server indicators; see paragraph 0048, line 2) into a machine-learning model (machine learning model; see paragraph 0049, line 9), the operational data including first information (server indicators relating to a task to be executed by at least one component of the server; see paragraph 0048, lines 2-3) relating to a workload (task) of a compute device (one or more servers 101; see Fig. 1) and second information (server indicators from the temperature sensors 103; see paragraph 0048, lines 5-6) relating to an ambient condition (temperature) of the compute device (Step 300, see Fig. 3 and paragraph 0048, lines 1-10); comparing a predicted cooling power requirement (expected cooling demand; see paragraph 0052, line 2) for a time period with a predicted cooling power availability (percentage of total cooling available; see paragraph 0055, lines 7-8) for the time period, the predicted cooling power requirement based on an output of the machine-learning model (Steps 310 and 340, see Fig. 3; also see paragraph 0049, lines 1-12 and paragraph 0059, line 9); and generating a cooling plan (cooling amount provided by a cooling mechanism; see paragraph 0055, lines 2-3) based on the comparison, the cooling plan to define operation of at least one of the compute device or a cooling system (cooling mechanism device 102, see Fig. 1) used to cool the compute device during the time period (Steps 320 and 350, see Fig. 3 and paragraph 0055, lines 2-10).

As to claim 18, Ruso discloses the method of claim 17, wherein the cooling plan defines temporally segmented cooling plans for different time segments of the time period, the operation of at least one of the compute device or the cooling system to change between different ones of the time segments (see paragraph 0061, lines 23-29).
As to claim 21, Ruso discloses the method of claim 17, wherein the operation of the cooling system is to reduce a temperature of the compute device when the predicted cooling power requirement is less than the predicted available cooling power availability (see paragraph 0022, lines 7-15).

As to claim 22, Ruso discloses the method of claim 17, wherein the operation of the compute device is to increase a temperature of the compute device when the predicted cooling power requirement exceeds the predicted available cooling power availability (see paragraph 0022, lines 7-15).

As to claim 23, Ruso discloses the method of claim 17, wherein the first information includes at least one of an instruction set associated with the workload or a power requirement of an input/output device of the compute device, the input/output device to be used during the execution of the workload (see paragraph 0040, lines 7-11 and paragraph 0043, lines 11-17).

As to claim 24, Ruso discloses the method of claim 17, wherein the second information includes: sensor data related to a current ambient condition of the compute device (see paragraph 0053, lines 5-6); historic records of past ambient conditions of the compute device (see paragraph 0058, lines 3-5); and forecasts of future ambient conditions on the compute device (see paragraph 0056, lines 9-16).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Michael J. Brown, whose telephone number is (571)272-5932. The examiner can normally be reached Monday-Thursday from 5:30am-4:00pm.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kamini Shah, can be reached at (571)272-2279. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system.
Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Michael J Brown/
Primary Examiner, Art Unit 2115

Prosecution Timeline

Apr 27, 2023: Application Filed
Jun 07, 2023: Response after Non-Final Action
Feb 18, 2026: Non-Final Rejection, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600093: Information Processing Device (granted Apr 14, 2026; 2y 5m to grant)
Patent 12594602: Systems and Methods (granted Apr 07, 2026; 2y 5m to grant)
Patent 12594727: EXTRUSION-BASED ADDITIVE MANUFACTURING: METHOD AND 3D PRINTING SYSTEM (granted Apr 07, 2026; 2y 5m to grant)
Patent 12570051: CONVOLUTION MODELING AND LEARNING SYSTEM FOR PREDICTING GEOMETRIC SHAPE ACCURACY OF 3D PRINTED PRODUCTS (granted Mar 10, 2026; 2y 5m to grant)
Patent 12574794: METHOD FOR CONTROLLING AND/OR OPERATING AN AUTOMATION COMPONENT (granted Mar 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 88% (97% with interview, +8.8%)
Median Time to Grant: 2y 10m
PTA Risk: Low

Based on 1029 resolved cases by this examiner. Grant probability derived from career allow rate.
