DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding claim 1,
Step 1: Is the claim to a process, machine, manufacture or composition of matter?
Yes, the claim is directed to a method.
Step 2A Prong One: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
The limitations of:
detecting incident data and resolution data in monitored data collected while monitoring an information technology (IT) environment; (abstract mental process; a human can look at the given data, determine that an incident has occurred, and look at corresponding resolution data, as an IT worker would do)
correlating the incident data with the resolution data according to a detected change in health metrics data from the monitored data; (abstract mental process; an IT worker can look at the incident data and determine how to resolve it, i.e., correlate it with resolution data)
Step 2A Prong Two: Does the claim recite additional elements that integrate the judicial exception into a practical application?
The limitations of:
storing the correlated incident data and resolution data as a training dataset stored in a database; (insignificant extra-solution activity, MPEP 2106.05(g))
training a machine learning model using the training dataset resulting in a trained machine learning model; (invoking generic computer as a tool, MPEP 2106.05(f))
deploying the trained machine learning model such that the trained machine learning model provides resolution recommendation in response to receiving new incident data (applying the abstract idea to a particular field of use, MPEP 2106.05(h))
Step 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
The limitations of:
storing the correlated incident data and resolution data as a training dataset stored in a database; (insignificant extra-solution activity, MPEP 2106.05(g), storing data is well-understood, routine, and conventional in the art, MPEP 2106.05(d)(II))
training a machine learning model using the training dataset resulting in a trained machine learning model; (invoking generic computer as a tool, MPEP 2106.05(f))
deploying the trained machine learning model such that the trained machine learning model provides resolution recommendation in response to receiving new incident data (applying the abstract idea to a particular field of use, MPEP 2106.05(h))
Dependent claim 2 recites the monitored data includes data from a micro-service, a specific technological environment, MPEP 2106.05(h).
Dependent claim 3 recites the data including health metrics, a specific technological environment, MPEP 2106.05(h).
Dependent claim 4 recites correlating with a runbook action, a specific technological environment, MPEP 2106.05(h).
Dependent claim 5 recites the data including user communications, a specific technological environment, MPEP 2106.05(h).
Dependent claim 6 recites correlating based on keyword analysis, an abstract mental process.
Dependent claim 7 recites correlating using user communications, an abstract mental process.
Dependent claim 8 recites training the model to understand relationships, invoking a generic computer as a tool, MPEP 2106.05(f).
Dependent claim 9 recites training to recommend a runbook, invoking a generic computer as a tool, MPEP 2106.05(f).
Dependent claim 10 recites collecting user feedback and updating the dataset, insignificant extra-solution activity; collecting and transmitting data is well-understood, routine, and conventional in the art, MPEP 2106.05(d)(II).
Note that independent claims 11 and 17 recite substantially the same subject matter as independent claim 1, differing only in embodiment. The differences in embodiment do not meaningfully change the above analysis, and the claims are therefore subject to the same rejection. Dependent claims 14-16 and 18-20 are mapped to their corresponding duplicate claims 2-10 as shown above.
Dependent claim 12 recites transferring data over a network, which is well-understood, routine, and conventional (WURC), MPEP 2106.05(d)(II).
Dependent claim 13 recites downloading data, metering a system, and generating an invoice, WURC and applying the abstract idea, MPEP 2106.05(d)(II) and MPEP 2106.05(f).
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1-2, 4-6, 8-12, 14, 16-18, and 20 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Rajaram et al. (US 2018/0253736).
Regarding claims 1, 11, and 17, Rajaram teaches “a computer-implemented method comprising: detecting incident data” ([0020] “The input module 201 receives one or more incident tickets 206 (e.g., service request, user query, etc.) from a computing system”) “and resolution data in monitored data collected while monitoring an information technology (IT) environment” ([0026] “The knowledge module 204 derives a knowledge graph model based on the knowledge graph received from the learning module 203. The knowledge graph model may then be leveraged to infer right resolution code 209 for an incoming incident ticket 206 that may be executed for resolving the incoming incident ticket 206”);
“correlating the incident data with the resolution data according to a detected change in health metrics data from the monitored data” ([0026] “The knowledge module 204 dynamically determines a signature for an incoming incident ticket 206 based on the problem type as obtained from the description of the incoming incident ticket 206. The knowledge module 204 then maps the signature with an existing unique signature using the knowledge graph model so as to determine the resolution code 209 to be applied for resolving the incident ticket 206”);
“storing the correlated incident data and resolution data as a training dataset stored in a database” ([0028] “The new incident ticket and the corresponding manual resolution may be provided to learning module 203 so as to update the repository and to update the knowledge graph”);
“training a machine learning model using the training dataset resulting in a trained machine learning model” ([0028] “the knowledge module 204 initiates a learning process based on intelligence gathered manual resolution of the incident ticket”); and
“deploying the trained machine learning model such that the trained machine learning model provides resolution recommendation in response to receiving new incident data” ([0028] “The new incident ticket and the corresponding manual resolution may be provided to learning module 203 so as to update the repository and to update the knowledge graph”).
Note that independent claims 11 and 17 recite substantially the same subject matter as independent claim 1, differing only in embodiment. The differing embodiments, a computer program product and a computer system, are taught by Rajaram in [0036] and Fig. 5, respectively.
Regarding claims 2, 14, and 18, Rajaram teaches “wherein the monitored data includes data collected from a micro-service in the IT environment” ([0018] “The system 100 may also interact with one or more external devices 105 over a communication network 106 for sending or receiving various data (e.g., training data, incident ticket, resolution code, etc.)”).
Regarding claims 4, 16, and 20, Rajaram teaches “wherein the correlating of incident data and resolution data includes correlating incident data with a runbook action” ([0026] “The knowledge module 204 dynamically determines a signature for an incoming incident ticket 206 based on the problem type as obtained from the description of the incoming incident ticket 206. The knowledge module 204 then maps the signature with an existing unique signature using the knowledge graph model so as to determine the resolution code 209 to be applied for resolving the incident ticket 206,” which is performed automatically, i.e., a runbook action).
Regarding claim 5, Rajaram teaches “wherein the monitored data includes data indicative of end user actions and end user communications” ([0020] “The input module 201 receives one or more incident tickets 206 (e.g., service request, user query, etc.) from a computing system”).
Regarding claim 6, Rajaram teaches “wherein the correlating of the incident data with the resolution data includes performing keyword-based analysis on runbooks” ([0023] “the learning module 203 determines the unique signature for each of the multiple sets by pre-processing the descriptions of the incident tickets in the set, determining n-grams and corresponding weightages from the pre-processed descriptions, and selecting one of the n-grams as the unique signature based on the assigned weightage […] the pre-processing may include, but is not limited to, removing URLs, removing numbers, removing generic stop words, removing custom stop words, removing Emails, removing special characters, removing date and time values, removing punctuations, and so forth as they have little or no contribution to content, context, and meaning of the ticket.”).
Regarding claim 8, Rajaram teaches “wherein the training of the machine learning model includes training the machine learning model to understand incident and resolution relationships” (abstract, “The knowledge graph model is derived, from a plurality of past incident tickets and a plurality of corresponding resolutions, by determining a set of unique signatures and a set of corresponding resolution codes”; knowledge graphs inherently track relationships).
Regarding claim 9, Rajaram teaches “wherein the training of the machine learning model includes training the machine learning model to recommend a runbook in response to an inputted incident” ([0028] “The new incident ticket and the corresponding manual resolution may be provided to learning module 203 so as to update the repository and to update the knowledge graph”).
Regarding claim 10, Rajaram teaches “further comprising collecting user feedback and updating the dataset based on the user feedback” ([0028] “In some embodiments, the knowledge module 204 may validate the resolution code 209 determined using the knowledge graph model with the help of a service team,” help from a service team being, i.e., manual user feedback).
Regarding claim 12, Rajaram teaches “wherein the stored program instructions are stored in a computer readable storage device in a data processing system, and wherein the stored program instructions are transferred over a network from a remote data processing system” ([0018] “The system 100 interacts with a user via a user interface 104 accessible via the display 103. The system 100 may also interact with one or more external devices 105 over a communication network 106 for sending or receiving various data (e.g., training data, incident ticket, resolution code, etc.). For example, the system 100 may receive incident ticket generated from an external device 105 and provide appropriate resolution for the incident ticket to the external device 105”).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 3, 15, and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Rajaram in view of Bogojeska et al. (US 2015/0178637).
Regarding claims 3, 15, and 19, the Rajaram reference has been addressed above. Rajaram does not explicitly teach these claim limitations. Bogojeska, however, teaches “wherein the monitored data includes health metrics data collected from a performance monitoring tool” (Bogojeska [0027] “This data may include server details, such as hardware specifications, current operating systems, user applications, enterprise applications, virtual machine usage and configuration, age, size, performance, utilization, environment, functions, service management system, location, and prior feature modifications”).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Rajaram with those of Bogojeska since “Generally, a human being will need to assess each ticket, and as a result, ticket management may slow down the operation of a server or set of servers, some of which may not be able to continue processing work as intended without their tickets having been addressed.” Bogojeska [0003]. Thus, by combining the references, one would have an automated system to address the tickets and therefore a more efficient resolution process. This may include services related to performance, as addressed above.
Claim(s) 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Rajaram in view of Al-Bahar et al. (US Patent 10,956,255).
Regarding claim 7, the Rajaram reference has been addressed above. Rajaram does not explicitly teach this limitation. Al-Bahar, however, teaches “wherein the correlating of the incident data with the resolution data includes performing data mining of end user communications” (Al-Bahar, abstract, “The L1 IT support issue may be determined based on monitoring indications of human-initiated activities maintained by a system of record, and may, prior to the automated agent's alert, be unknown to the user. In some instances, a natural language understanding (NLU) module may be used to identify an entity and intent from the indications of human-initiated activities,” i.e., monitoring communications).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Rajaram with those of Al-Bahar since a combination of known methods would yield predictable results. As shown in Al-Bahar, it is known to mine/collect user interactions in order to make determinations. This data would likewise be useful in the system of Rajaram for resolving IT issues and would function as any other data would.
Claim(s) 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Rajaram in view of O’Hare et al. (US 2016/0150047).
Regarding claim 13, Rajaram teaches “wherein the stored program instructions are stored in a computer readable storage device in a server data processing system, and wherein the stored program instructions are downloaded in response to a request over a network to a remote data processing system for use in a computer readable storage device associated with the remote data processing system, further comprising:” ([0018] “The system 100 interacts with a user via a user interface 104 accessible via the display 103. The system 100 may also interact with one or more external devices 105 over a communication network 106 for sending or receiving various data (e.g., training data, incident ticket, resolution code, etc.). For example, the system 100 may receive incident ticket generated from an external device 105 and provide appropriate resolution for the incident ticket to the external device 105”; transferring over a network is essentially downloading).
Rajaram does not explicitly teach the remaining limitations. O’Hare however teaches “program instructions to meter use of the program instructions associated with the request” ([0181] “Stored data may be metered, and billing to the enterprise user device 1702 may be added to monthly cloud invoice based on the amount of data stored.”); and
“program instructions to generate an invoice based on the metered use” (same citation, “Stored data may be metered, and billing to the enterprise user device 1702 may be added to monthly cloud invoice based on the amount of data stored.”).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Rajaram with those of O’Hare since a combination of known methods would yield predictable results. As shown above, metering and invoicing software is known in the art and would apply in a similar form in the combination above, for example to limit use of the software.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KEVIN W FIGUEROA whose telephone number is (571)272-4623. The examiner can normally be reached Monday-Friday, 10AM-6PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, MIRANDA HUANG can be reached at (571)270-7092. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
KEVIN W FIGUEROA
Primary Examiner
Art Unit 2124
/Kevin W Figueroa/ Primary Examiner, Art Unit 2124