DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Applicant’s Arguments
Applicant’s arguments have been fully considered but are moot in view of the new ground of rejection.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 30 January 2026 has been entered.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-17 are rejected under 35 U.S.C. 103 as being unpatentable over Tran (U.S. Patent No. 9,648,464) in view of Otsuka (US Pub. 2007/0022327).
Regarding claim 1, Tran discloses “A method, comprising:
associating, with each time period of a range of time periods, values of one or more computing system parameters”, as Tran teaches generating historical time series data over defined time intervals. Specifically, Tran discloses that “the server 109 can calculate the client counts in each Zone over time. The result of this step is a dataset of historical time series of client counts for each Zone” (col. 5, lines 9–15), and further discloses “generate down-sampled time series {y} of historical client counts by splitting {z} into T intervals and taking pth percentile of values in each interval” (col. 5, lines 16–24), thereby associating parameter values with multiple time periods.
Tran discloses “based on a predictive model, generating, for an additional time period and based on the values of the one or more computing system parameters associated with each of the range of time periods, values for the additional time period”, as Tran teaches retrieving predictive methods and forecasting future values. Tran discloses “retrieve a plurality [of] predictive methods” (col. 5, lines 36–39), and further discloses predicting client counts over future lookahead windows (col. 7, lines 10–20; col. 8, lines 1–8) using models such as ARIMA, ETS, and Vector AR.
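As a purely illustrative sketch of one model family Tran names (exponential smoothing), the following minimal simple-exponential-smoothing forecaster produces a flat h-step-ahead forecast from historical values; the smoothing parameter and example series are assumptions, not taken from the reference:

```python
def ses_forecast(y, alpha=0.5, horizon=1):
    """Simple exponential smoothing: the level is updated as
    level = alpha * y_t + (1 - alpha) * level, and the
    h-step-ahead forecast repeats the final level."""
    level = y[0]
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
    return [level] * horizon

# Forecast the next 3 periods of a (hypothetical) client-count series
history = [10, 12, 11, 13, 12]
print(ses_forecast(history, alpha=0.5, horizon=3))  # → [12.0, 12.0, 12.0]
```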
Tran discloses “initiating a mitigating action to an anomalous behavior in a network”, as Tran teaches using predicted values for operational planning and adjustments. Tran discloses that predicted values “can help the analytics customers to plan their operations based on the location analytics” (col. 1, lines 55–65), and that forecasting enables operational preparation and resource allocation adjustments (col. 7, lines 26–36), which correspond to corrective or mitigating responses when predicted conditions indicate degradation.
Tran does not specifically disclose “generating, for an additional time period and based on the values of the one or more computing system parameters associated with each of the range of time periods, a service level experience (SLE) vector including a predicted minimum value and a predicted maximum value of each computing system parameter of the one or more computing system parameters” and does not disclose determining that the value does not satisfy a range “based on the predicted minimum value and the predicted maximum value of the SLE vector.”
However, Otsuka discloses “creating a prediction model based on combinations of specified performance and status information” (para. 42). Otsuka further discloses “setting an allowable range for maintaining a service level based on the prediction model” (para. 46) and that “the performance predictor compares the status information on the current status … with the allowable range to predict whether an adequate service level can be maintained” (para. 72). An allowable range inherently includes a lower bound and an upper bound corresponding to minimum and maximum limits derived from the prediction model.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Tran to generate a service level experience (SLE) vector including predicted minimum and predicted maximum values for each computing system parameter, and to define a range based on the predicted minimum value and the predicted maximum value, as taught by Otsuka, in order to maintain service level performance using model-derived allowable bounds and to initiate a mitigating action when a value does not satisfy that range. Accordingly, claim 1 is unpatentable over Tran in view of Otsuka.
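The proposed combination, an SLE vector of predicted minimum and maximum values per parameter, with a mitigating action triggered when an observed value falls outside its range, can be sketched as follows. The parameter names and the mitigation callback are hypothetical and serve only to illustrate the claimed logic:

```python
def build_sle_vector(predictions):
    """For each parameter, record the predicted min and max over
    the forecast samples, forming the SLE vector."""
    return {name: (min(vals), max(vals)) for name, vals in predictions.items()}

def check_and_mitigate(sle_vector, observed, mitigate):
    """Invoke mitigate(name, value) for any observed parameter
    value outside its predicted [min, max] range; return the
    names of the parameters that triggered mitigation."""
    triggered = []
    for name, value in observed.items():
        lo, hi = sle_vector[name]
        if not (lo <= value <= hi):
            mitigate(name, value)
            triggered.append(name)
    return triggered

# Hypothetical parameters with forecast samples per parameter
preds = {"connect_failures": [2, 4, 3], "dhcp_failures": [0, 1, 1]}
sle = build_sle_vector(preds)  # {"connect_failures": (2, 4), "dhcp_failures": (0, 1)}
alerts = []
check_and_mitigate(sle, {"connect_failures": 9, "dhcp_failures": 1},
                   lambda n, v: alerts.append((n, v)))
print(alerts)  # connect_failures (9) is outside its predicted range (2, 4)
```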
Regarding claim 2, Tran discloses wherein associating values of the one or more computing system parameters comprises determining, within each of the range of time periods, one or more of:
a number of client devices failing to connect to a network,
a number of client devices successfully connecting to the network,
a number of client devices that failed authentication,
a number of clients that failed to associate with an access point,
a number of clients that failed to obtain an IP address from a DHCP server,
a number of clients that failed for an unknown reason, a number of unique client devices, or
a number of unique client devices that failed to connect to the network.
Tran discloses determining client counts within defined time periods. Tran teaches that “the server 109 can calculate the client counts in each Zone over time. The result of this step is a dataset of historical time series of client counts for each Zone” (col. 5, lines 9–15).
Tran further discloses that specific wireless access points “serve clients 110(1) and 110(2)” and “serve clients 110(3) and 110(4)” (col. 4, lines 45–60), thereby identifying distinct client devices within Zones.
Counting client devices in each Zone over time inherently requires counting distinct client devices present in that Zone during each time period. Thus, Tran discloses determining a number of unique client devices within each of the range of time periods.
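The counting of distinct client devices per time period described above can be illustrated with a short sketch; the event format (timestamp, client id) and the period length are assumptions for illustration:

```python
from collections import defaultdict

def unique_clients_per_period(events, period_len):
    """events: iterable of (timestamp, client_id) pairs. Returns
    the number of distinct client ids seen in each period of
    length period_len (periods indexed by timestamp // period_len)."""
    seen = defaultdict(set)
    for ts, client in events:
        seen[ts // period_len].add(client)
    return {period: len(clients) for period, clients in sorted(seen.items())}

events = [(0, "c1"), (5, "c2"), (7, "c1"), (12, "c3"), (15, "c3")]
print(unique_clients_per_period(events, period_len=10))  # → {0: 2, 1: 1}
```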
Regarding claim 3, Tran discloses wherein the time period of the range of time periods includes one or more of a day of a month, a day of a year, a day of a week or a time in a day.
Tran discloses selecting predictive methods based on prediction error. Specifically, Tran discloses that “the best method for each lookahead time is selected based on a prediction error metric, such as mean absolute prediction error or root mean square error” (col. 7, lines 10–20). Tran further describes MAE, MAPE, and MASE metrics for comparing predictive methods (col. 8, lines 1–25).
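The error-metric-based selection Tran describes can be sketched as follows; the metric implementations are standard formulas for MAE and RMSE, and the forecast names and values are hypothetical:

```python
def mae(actual, pred):
    """Mean absolute error between actual and predicted values."""
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    """Root mean square error between actual and predicted values."""
    return (sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)) ** 0.5

def best_method(actual, forecasts, metric=mae):
    """Return the name of the candidate forecast with the lowest
    error under the chosen metric."""
    return min(forecasts, key=lambda name: metric(actual, forecasts[name]))

actual = [10, 12, 11]
forecasts = {"naive": [10, 10, 10], "ses": [11, 11, 11]}
print(best_method(actual, forecasts))  # → ses (MAE 0.67 vs. 1.0)
```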
Regarding claim 4, Tran discloses wherein the one or more computing system parameters includes signal strength parameter measurements received from a plurality of client devices associated with the computing system.
Tran discloses that for a two-hour look-ahead window, “NaiveLast provided similar performance to the other methods with less computational time” and that “NaiveLast was also found to be the best approach for doing forecasting from 10 to 30 minutes” (col. 9, lines 35–45).
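A NaiveLast-style forecaster, which simply repeats the most recent observation across the lookahead window, costs almost nothing to compute, consistent with Tran's observation about computational time. This one-line sketch is illustrative only:

```python
def naive_last(y, horizon):
    """NaiveLast-style forecast: repeat the most recent
    observation for every step of the lookahead window."""
    return [y[-1]] * horizon

print(naive_last([7, 9, 8], horizon=4))  # → [8, 8, 8, 8]
```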
Regarding claim 5, Tran discloses wherein the mitigating action to the anomalous behavior in the network includes one or more of:
restarting a dynamic host control protocol (DHCP) server,
restarting an authentication server, generating an alert,
increasing a transmission power of a wireless transmitter,
changing a communication channel of a wireless device,
changing a communication bandwidth utilized by a device,
changing a power level of a wireless device,
shutting down or blocking a network port,
switching a broadcast transmission to a unicast transmission,
renewing a security certificate of one or more user devices,
electronically configuring a switch, restarting a switch,
performing a communications test over a cable,
disconnecting a particular client device,
rebooting an AP,
reinitiating a radio, or
stopping or limiting a guest portal access.
Tran expressly discloses predictive methods including “Exponential smoothing (ETS) model,” “Autoregressive integrated moving average (ARIMA) model,” and “Vector autoregressive (Vector AR or VAR) model” (col. 6, lines 10–45).
Regarding claim 6, Tran discloses that a mitigating action is taken when a computing system parameter falls below the predicted lower bound.
Tran discloses selecting different predictive methods for different lookahead times and combining them within a lookahead window. Specifically, Tran discloses selecting “the best method for each lookahead time” (col. 7, lines 10–20) and describes using one method for earlier lookahead times and another method for later lookahead times (col. 9, lines 5–25).
Regarding claim 7, Tran discloses wherein initiating the mitigating action comprises initiating the mitigating action based on determining that the value of the one or more computing system parameters is less than the predicted minimum value.
Tran expressly discloses time-series decomposition using STL and discloses decomposing a time series into “a season time series” and “a seasonally adjusted time series” (col. 6, lines 20–35).
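The season/seasonally-adjusted split Tran describes can be illustrated with a much simpler additive decomposition than STL (STL itself uses loess smoothing; this mean-based sketch only shows the split into the two series). The period and example values are assumptions:

```python
def seasonal_decompose(y, period):
    """Simplified additive decomposition: estimate the seasonal
    component as each phase's mean deviation from the overall
    mean, then subtract it to obtain the seasonally adjusted
    series. Returns (seasonal, adjusted)."""
    overall = sum(y) / len(y)
    phase_means = []
    for ph in range(period):
        vals = y[ph::period]
        phase_means.append(sum(vals) / len(vals) - overall)
    seasonal = [phase_means[i % period] for i in range(len(y))]
    adjusted = [v - s for v, s in zip(y, seasonal)]
    return seasonal, adjusted

# A series alternating low/high with period 2
seasonal, adjusted = seasonal_decompose([1, 3, 1, 3], period=2)
print(seasonal)  # → [-1.0, 1.0, -1.0, 1.0]
print(adjusted)  # → [2.0, 2.0, 2.0, 2.0]
```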
Regarding claim 8, Tran discloses wherein determining the value of the one or more computing system parameters does not satisfy the range based on the predicted minimum value and the predicted maximum value comprises determining the value of the one or more computing system parameters does not satisfy the range for a duration of time.
Tran discloses dividing the down-sampled dataset into “a training set and a validation set” (col. 7, lines 1–8), training predictive models using the training set, and predicting client counts using the validation set (col. 7, lines 1–15).
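The training/validation split described above is, for a time series, a chronological split: the earliest observations train the model and the remainder validate it. This sketch is illustrative, and the 80/20 fraction is an assumption, not from the reference:

```python
def train_validation_split(y, train_frac=0.8):
    """Chronological split: the earliest observations form the
    training set, the remainder the validation set."""
    cut = int(len(y) * train_frac)
    return y[:cut], y[cut:]

train, valid = train_validation_split(list(range(10)), train_frac=0.8)
print(len(train), len(valid))  # → 8 2
```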
Claim 9 recites a system corresponding to the method of claim 1, and is thus similarly rejected.
Claims 10-16 recite substantially identical subject matter as recited in claims 2-8, respectively, and are thus similarly rejected.
Claim 17 recites non-transitory computer readable storage media comprising instructions that when executed configure hardware processing circuitry to perform operations substantially as recited in claim 1, and is thus similarly rejected.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LUAT T PHUNG whose telephone number is (571)270-3126. The examiner can normally be reached on M-F 9 AM - 6 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Marcus Smith can be reached on (571) 272-3988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Luat Phung/
Primary Examiner, Art Unit 2468