DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 1, 12, and 20 recite “for each corresponding configuration item type...training a corresponding multi-variate machine learning model” and “in response to detecting...initiating for the specific configuration item type, an execution of a particular multi-variate machine learning model” (emphasis added). It is unclear whether the “particular” model that is executed is the same as the “corresponding” model or a different model.
For purposes of examination, both models will be interpreted as the same model. Dependent claims 2-11 and 13-19 fail to cure the deficiencies of the independent claims from which they depend and are therefore rejected for the same reason.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-6 and 9-20 are rejected under 35 U.S.C. 103 as being unpatentable over Mandal et al. (US Pub. 20230008225) in view of Doyle et al. (US Pub. 20220277397).
Referring to claim 1, Mandal discloses A method comprising:
for each corresponding configuration item type of a plurality of different configuration item types [pars. 46-48; a computing environment includes computing infrastructure and software applications; each node has a fixed number of resources such as memory, CPU cycles, persistent storage, etc. that are managed by a performance manager; managing the various resources entails processing time series of values of various data types (e.g., performance metrics for each resource)], training a corresponding multi-variate machine learning model of a plurality of multi-variate machine learning models [pars. 48-50 and 74; the performance manager is an AIOps (Artificial Intelligence for IT operations) system of a plurality of AIOps systems, each AIOps system employs a set of AI models (including multivariate models) to perform predictions on a corresponding input set] to perform anomaly detection [par. 74; the predictions may be for outlier detection (i.e., anomaly detection)] for a corresponding configuration item type of the plurality of different configuration item types [pars. 48-50; each AIOps system operates on its corresponding input set];
...via a univariate machine learning model, an anomaly associated with a specific configuration item type of the plurality of different configuration item types, initiating for the specific configuration item type, an execution of a particular multi-variate machine learning model of the plurality of multi-variate machine learning models [pars. 48-50 and 74; note the set of AI models employed by the performance manager to perform the predictions on the corresponding input set; for outlier detection, the set of AI models may include univariate statistical models (Tier-1) and multivariate models (Tier-2)]; and
evaluating an output of the execution of the particular multi-variate machine learning model to determine an anomaly detection result [pars. 48-50 and 74; note the outlier detection based on the predictions].
Though the tiered structure of the set of AI models disclosed in Mandal [par. 74] implies that the multivariate models are executed based on the output of the univariate models, Mandal does not appear to explicitly disclose initiating the execution of the particular multi-variate model in response to detecting the anomaly via the univariate machine learning model.
However, Doyle discloses initiating the execution of the particular multi-variate model in response to detecting the anomaly via the univariate machine learning model [pars. 70-75; a model generator may use univariate analysis to identify one or more covariates that are associated with a response; the best fitting multivariate model is selected and eventually evaluated].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the resource management taught by Mandal so that the multivariate models are executed in response to output from the univariate models as taught by Doyle, with a reasonable expectation of success. The motivation for doing so would have been to limit multivariate analysis to only significant variables [Doyle, par. 71], thereby saving computation resources.
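Purely for illustration (none of the following code appears in Mandal or Doyle, and all names are hypothetical), the tiered arrangement at issue — a lightweight univariate check per metric that gates execution of a costlier multi-variate model — might be sketched as:

```python
# Illustrative sketch of a two-tier (Tier-1 univariate / Tier-2
# multi-variate) anomaly detector, as described in the combined
# teachings above. Thresholds and statistics are simplified examples.
import math


def univariate_anomaly(values, threshold=3.0):
    """Tier-1: flag a metric whose latest value is a z-score outlier."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    std = math.sqrt(var) or 1.0  # avoid division by zero
    return abs(values[-1] - mean) / std > threshold


def multivariate_anomaly(rows, threshold=3.0):
    """Tier-2 (simplified): combined z-score across all metrics jointly."""
    score = 0.0
    for col in zip(*rows):  # iterate over metric columns
        mean = sum(col) / len(col)
        var = sum((v - mean) ** 2 for v in col) / len(col)
        std = math.sqrt(var) or 1.0
        score += ((col[-1] - mean) / std) ** 2
    return math.sqrt(score) > threshold


def detect(metrics_by_name, rows):
    """Run the Tier-2 model only if some Tier-1 univariate check fires."""
    if not any(univariate_anomaly(v) for v in metrics_by_name.values()):
        return False  # no Tier-1 anomaly: skip the costlier model
    return multivariate_anomaly(rows)
```

The gating in `detect` reflects the asserted motivation: the multi-variate analysis runs only when a univariate signal warrants it, conserving computation.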
Referring to claim 2, Mandal discloses The method of claim 1, wherein the univariate machine learning model is a stationary statistical model [par. 74; note the univariate statistical models (Tier-1)].
Referring to claim 3, Mandal discloses The method of claim 1, wherein the particular multi-variate machine learning model is a time-series based model [par. 48; note the processing of the time series of values of various data types].
Referring to claim 4, Mandal discloses The method of claim 1, wherein the anomaly detection result is associated with a remote device, and the remote device is assigned the specific configuration item type of the plurality of different configuration item types [pars. 48-50; the performance manager manages the resources for the computing environment, the other AIOps systems managing their respective computing environments].
Referring to claim 5, Mandal discloses The method of claim 4, wherein data used to detect the anomaly, via the univariate machine learning model, is collected by an agent at the remote device [fig. 1; pars. 48-50; note the performance manager managing the various resources associated with the nodes in the computing environment].
Referring to claim 6, Doyle discloses The method of claim 1, wherein the univariate machine learning model is trained to reconstruct an input provided to the univariate machine learning model, and the univariate machine learning model provides the reconstructed input as one or more inputs to the particular multi-variate machine learning model [pars. 70 and 71; input variables may be scaled or weighted based on the univariate analysis].
Referring to claim 9, Mandal discloses The method of claim 1, further comprising: providing an agent for monitoring metrics of a device assigned the specific configuration item type of the plurality of different configuration item types [fig. 1; pars. 48-50; note the performance manager managing the various resources associated with the nodes in the computing environment].
Referring to claim 10, Mandal discloses The method of claim 9, further comprising: activating a communication channel with a server in communication with the device, wherein the server is configured to apply the univariate machine learning model using collected metrics data to detect the anomaly [fig. 1; pars. 48-50; note the performance manager managing the various resources associated with the nodes in the computing environment over an intranet connection].
Referring to claim 11, Mandal discloses The method of claim 10, wherein the univariate machine learning model is configured to detect the anomaly based on one or more of the collected metrics data exceeding one or more configured threshold values [par. 74; note the outlier detection].
Referring to claim 12, see at least the rejection for claim 1. Mandal further discloses A system comprising: one or more processors; and a memory coupled to the one or more processors, wherein the memory is configured to provide the one or more processors with instructions which when executed cause the one or more processors to perform the claimed steps [pars. 48-50; note the managing of the various resources by the performance manager, which requires the performance manager to be implemented using some type of computing device].
Referring to claim 13, see the rejection for claim 2.
Referring to claim 14, see the rejection for claim 3.
Referring to claim 15, see the rejection for claim 4.
Referring to claim 16, see the rejection for claim 5.
Referring to claim 17, see the rejection for claim 9.
Referring to claim 18, see the rejection for claim 10.
Referring to claim 19, see the rejection for claim 11.
Referring to claim 20, see at least the rejection for claim 1. Mandal further discloses A computer program product, the computer program product being embodied in a non-transitory computer readable storage medium and comprising computer instructions for performing the claimed steps [pars. 48-50; note the managing of the various resources by the performance manager, which requires the performance manager to be implemented using some type of computing device].
Conclusion
The following prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Balasubramanian et al. (US Pub. 20220172100) discloses using user feedback to re-train a machine learning model for performing anomaly detection.
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GRACE PARK whose telephone number is (571)270-7727. The examiner can normally be reached M-F 8AM-5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, TAMARA KYLE can be reached at (571)272-4241. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Grace Park/Primary Examiner, Art Unit 2144