Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-20 are currently pending and have been examined.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4, 8, 10-11, 14, 16-17, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Roddom (US 20230353464 A1) in view of Turi (US 20230155911 A1) in further view of Venugopal (US 20210109830 A1).
As per claim 1, Roddom discloses:
A computer-implemented method comprising: defining, by a site reliability engineering maturity engine, a plurality of tenets, a tenet in the plurality of tenets representing a principle of site reliability engineering (“Service level objectives (SLOs), service level indicators (SLIs), and service level agreements (SLAs) may be primary metrics of site reliability engineering (SRE).”, 0030; “In accordance with aspects, SLI engine 120 may define service level indicators for objectives such as latency, throughput, availability, capacity, etc.”, 0045; “A service level agreement may be defined to provide a user with a clear picture of an application, service, or platform's level of functionality, reliability, and/or performance.”; “Exemplary service level objectives include availability, latency, service desk response time, incident resolution time, etc.”, 0031; “Exemplary service level indicators include data latency, network traffic, error rate, saturation, etc.”, 0032; “The system of the invention or portions of the system of the invention may be in the form of a “processing machine” a “computing device,” an “electronic device,” a “mobile device,” etc. These may be a computer, a computer server, a host machine, etc.”, 0060; Examiner Note: SLIs, SLOs, and SLAs equate to tenets of SRE maturity.)
Roddom discloses the above limitation of claim 1, but does not explicitly disclose generating first, second, and third SRE maturity assessments.
However, Turi discloses:
generating, by the site reliability engineering maturity engine for the plurality of tenets, a first site reliability engineering maturity assessment for a first hybrid cloud component, a second site reliability engineering maturity assessment for a second hybrid cloud component, and a third site reliability engineering maturity assessment for an application (“This disclosure describes, at least in part, techniques for providing information associated with services. For instance, system(s) may determine health scores and one or more topologies associated with services”, 0029; “This disclosure further describes, at least in part, a method that includes determining a first health score associated with a first service and a second health score associated with a second service”, 0024; “In some examples, in addition to, or alternatively from, the user determining the priorities, the system(s) may also determine the priorities for the services using the health scores and the business values. For a first example, the system(s) may determine the priorities based on the health scores, where the system(s) determine that a first service with a lowest health score is prioritized first, followed by a second service with the second lowest health score, followed by a third service with the third lowest health score, and/or so forth.”, 0033; “The user device(s) 104 and the resource(s) 108 may be communicatively coupled among one another and/or to various other devices via the cloud computing network 102 and/or the resource network(s) 106.”, 0051; see Fig. 1; Examiner Note: a health score equates to a site reliability engineering maturity assessment, and a third service equates to a third application.)
It would have been obvious to one of ordinary skill in the art, before the effective filing date, to combine the teachings of Roddom and Turi, providing the SRE method of Roddom with the health score comparison method of Turi, in order to prioritize the services in such a way that minimizes the loss that may occur from problems with the services (see Turi, [0033]).
Roddom in view of Turi discloses the above SRE method, but does not disclose determining workload placement based on SRE assessments.
However, Venugopal discloses:
determining, by the site reliability engineering maturity engine, a workload placement for the application based on the first, second, and third site reliability engineering maturity assessments (“For example, the nodes may “self-diagnose” such that if the nodes determine that there is a relatively high probability that they will fail within a predetermined amount of time, or more generally, the calculated reliability score exceeds (or fails to exceed) a predetermined threshold, the nodes may cause a remedial action to be performed (and/or perform such an action). For example, tasks may be migrated to other nodes, tasks may be restarted on other nodes, the particular (unreliable) node may be restarted, tasks may be “checkpointed” (e.g., the current/most recent state may be saved in a centralized database), etc.”, 0018; “Method 600 ends (step 610) with, for example, the remedial action being completed, such as a migrated task being performed by a second computing node. The process may then be repeated (e.g., for the second computing node and/or other computing nodes in the system).”, 0090; Examiner Note: a reliability score equates to a site reliability engineering maturity assessment, and restarting a task on a different node equates to determining workload placement.)
initiating, by the site reliability engineering maturity engine, a workload migration for the application based on the workload placement (“The reliability score may be used to determine whether or not a remedial action associated with the operation of the node should be performed, such as migrating one or more task being performed by the particular node to another node or restarting the task(s) on another node. In such instances, the second node may be selected based on the similarly calculated reliability scores of other available nodes in the system (i.e., each node may operate in a similar manner such that each self-diagnoses by determining its own reliability score).”, 0027)
The system of Roddom in view of Turi in further view of Venugopal would be capable of determining workload placement based on a first, second, and third SRE assessment. It would have been obvious to one of ordinary skill in the art, before the effective filing date, to combine the teachings of Roddom in view of Turi with those of Venugopal in order to provide the SRE system with means for preventing node failure by taking remedial action based on calculated reliability scores (Venugopal, [0016-0017]).
As per claim 2, Roddom in view of Turi in further view of Venugopal fully discloses the limitations of claim 1.
Furthermore, Roddom discloses:
the plurality of tenets includes at least one of scaling operations with load, capping operational load, overflow handling, service level agreements, operational readiness reviews, error budgeting, observability, end-user alerts handling, and blameless post-mortems (“Service level objectives (SLOs), service level indicators (SLIs), and service level agreements (SLAs) may be primary metrics of site reliability engineering (SRE).”, 0030)
As per claim 3, Roddom in view of Turi in further view of Venugopal fully discloses the limitations of claim 1.
Furthermore, Roddom discloses:
defining a plurality of tenets further comprises: defining a plurality of tenet dimensions associated with the plurality of tenets (“Exemplary service level objectives include availability, latency, service desk response time, incident resolution time, etc.”, 0031; “Exemplary service level indicators include data latency, network traffic, error rate, saturation, etc.”, 0032)
As per claim 4, Roddom in view of Turi in further view of Venugopal fully discloses the limitations of claim 3.
Furthermore, Roddom discloses:
the plurality of tenet dimensions includes at least one of security, high availability, disaster recovery, storage, networking, incident response, and deployment (“Exemplary service level objectives include availability, latency, service desk response time, incident resolution time, etc.”, 0031; “Exemplary service level indicators include data latency, network traffic, error rate, saturation, etc.”, 0032)
As per claim 8, Roddom in view of Turi in further view of Venugopal fully discloses the limitations of claim 1.
Furthermore, Turi discloses:
generating a report for at least one of the first, second, and third site reliability engineering maturity assessments (“The electronic device is further configured to display the user interface using a display, the user interface including at least: a first health score associated with a first service; a first business value associated with the first service; a second health score associated with a second service; and a second business value associated with the second service”, 0022)
As per claim 10, it is a computer program product claim (see Roddom, [0060]) comprising substantially the same limitations as claim 1, and as such, it is rejected for substantially the same reasons.
As per claim 11, it is a computer program product claim comprising substantially the same limitations as claim 3, and as such, it is rejected for substantially the same reasons.
As per claim 14, it is a computer program product claim comprising substantially the same limitations as claim 8, and as such, it is rejected for substantially the same reasons.
As per claim 16, it is a computer system claim (see Roddom, [0060]) comprising substantially the same limitations as claim 1, and as such, it is rejected for substantially the same reasons.
As per claim 17, it is a computer system claim comprising substantially the same limitations as claim 3, and as such, it is rejected for substantially the same reasons.
As per claim 20, it is a computer system claim comprising substantially the same limitations as claim 8, and as such, it is rejected for substantially the same reasons.
Claims 5, 12, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Roddom (US 20230353464 A1) in view of Turi (US 20230155911 A1) in further view of Venugopal (US 20210109830 A1) in further view of Yadwadkar (US 9122739 B1).
As per claim 5, Roddom in view of Turi in further view of Venugopal fully discloses the limitations of claim 3, but does not disclose assigning a weight for a tenet dimension of a plurality of tenet dimensions.
However, Yadwadkar discloses:
generating a site reliability engineering maturity assessment further comprises: assigning a value for a tenet dimension in the plurality of tenet dimensions; and assigning a weight for the tenet dimension in the plurality of tenet dimensions (“The evaluation function 935 may also apply a set of cost weights (specified by the optimization goals 670) to the set of cost evaluation values and a set of SLO weights (specified by the SLO weights 930) to the set of SLO evaluation values to produce a final evaluation value for the proposed state/solution.”, 0120; Examiner Note: an SLO equates to a tenet, and SLO evaluation values equate to dimensions.)
It would have been obvious to one of ordinary skill in the art, before the effective filing date, to combine the teachings of Roddom in view of Turi in further view of Venugopal with those of Yadwadkar in order to execute workloads in environments which better achieve SLOs and optimization goals of the workloads through SLO weights (Yadwadkar, [0108]).
As per claim 12, it is a computer program product claim comprising substantially the same limitations as claim 5, and as such, it is rejected for substantially the same reasons.
As per claim 18, it is a computer system claim comprising substantially the same limitations as claim 5, and as such, it is rejected for substantially the same reasons.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Roddom (US 20230353464 A1) in view of Turi (US 20230155911 A1) in further view of Venugopal (US 20210109830 A1) in further view of Mahacek (US 11595306 B2).
As per claim 6, Roddom in view of Turi in further view of Venugopal fully discloses the limitations of claim 1, but does not disclose computing first and second differences between SRE assessments.
However, Mahacek discloses:
computing a first difference between the first site reliability engineering maturity assessment for the first hybrid cloud component and the third site reliability engineering maturity assessment for the application; computing a second difference between the second site reliability engineering maturity assessment for the second hybrid cloud component and the third site reliability engineering maturity assessment for the application; and determining the workload placement based on the first difference and the second difference (“Latency benchmarking module 210 may enumerate all CSP cloud regions and for each region-to-region pair, the latency benchmarking module 210 determines an evaluation result and stores the results in the benchmarking results database 260.”, 0029; see Fig. 3; “The latency threshold may also be provided through user input through applications 111. For example, a user may specify that 10 milliseconds is a reasonable latency threshold, and as a result, the multi-cloud region determination module 220 may determine to network together more CSP regions to create a multi-cloud region because 10 milliseconds is a larger tolerable latency threshold than 5 milliseconds.”, 0032; Examiner Note: a latency benchmark evaluation result equates to a site reliability engineering maturity assessment. The latency threshold of the application equates to a site reliability engineering maturity assessment for the application. Determination of workload placement is based upon the comparison of each region (first, second, etc.) to the threshold.)
It would have been obvious to one of ordinary skill in the art, before the effective filing date, to combine the teachings of Roddom in view of Turi in further view of Venugopal with those of Mahacek in order to improve workload placement and ensure reliable transmission of packets from pod to pod (Mahacek, [0003]).
Claims 7, 13, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Roddom (US 20230353464 A1) in view of Turi (US 20230155911 A1) in further view of Venugopal (US 20210109830 A1) in further view of Mahacek (US 11595306 B2) in further view of Yadwadkar (US 9122739 B1).
As per claim 7, Roddom in view of Turi in further view of Venugopal in further view of Mahacek fully discloses the limitations of claim 6.
Furthermore, Mahacek discloses:
the first difference and the second difference are computed as a difference between a value associated with a tenet in the plurality of tenets scaled by a weight associated with the tenet (“Latency benchmarking module 210 may enumerate all CSP cloud regions and for each region-to-region pair, the latency benchmarking module 210 determines an evaluation result and stores the results in the benchmarking results database 260.”, 0029; see Fig. 3)
The combination of Roddom in view of Turi in further view of Venugopal in further view of Mahacek in further view of Yadwadkar would be capable of computing the first and second differences of weighted SRE maturity scores (Yadwadkar: “The evaluation function 935 may also apply a set of cost weights (specified by the optimization goals 670) to the set of cost evaluation values and a set of SLO weights (specified by the SLO weights 930) to the set of SLO evaluation values to produce a final evaluation value for the proposed state/solution.”, 0120)
It would have been obvious to one of ordinary skill in the art, before the effective filing date, to combine the teachings of Roddom in view of Turi in further view of Venugopal in further view of Mahacek with those of Yadwadkar in order to execute workloads in environments which better achieve SLOs and optimization goals of the workloads through SLO weights (Yadwadkar, [0108]).
As per claim 13, it is a computer program product claim comprising substantially the same limitations as claims 6 and 7, and as such, it is rejected for substantially the same reasons.
As per claim 19, it is a computer system claim comprising substantially the same limitations as claims 6 and 7, and as such, it is rejected for substantially the same reasons.
Claims 9 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Roddom (US 20230353464 A1) in view of Turi (US 20230155911 A1) in further view of Venugopal (US 20210109830 A1) in further view of Liokumovich (US 20250293964 A1).
As per claim 9, Roddom in view of Turi in further view of Venugopal fully discloses the limitations of claim 1, but does not disclose generating SRE maturity assessments based on a trigger event.
However, Liokumovich discloses:
the first, second, and third site reliability engineering maturity assessments are generated based on a trigger event associated with at least one of the first hybrid cloud component, the second hybrid cloud component, and the application (“The scheduler 260 periodically triggers SLO evaluations defined in an SLA, for example monthly and forwards this to the SLA evaluator 265. The SLA evaluator 265 messages the SLA storage 255 to recover SLO models associated with the SLA. An example SLO model was illustrated in FIG. 3. Upon receiving the SLO models, the SLA evaluator 265 requests the associated KPI or performance metric values from the KPI storage, which may return an appropriate performance metric data structure such as a tensor which groups the KPI according to coordinates of one or more dimensions.”, 0058)
It would have been obvious to one of ordinary skill in the art, before the effective filing date, to combine the teachings of Roddom in view of Turi in further view of Venugopal with those of Liokumovich in order to provide a method for metric collection which reduces resource overhead (Liokumovich, [0030]).
As per claim 15, it is a computer program product claim comprising substantially the same limitations as claim 9, and as such, it is rejected for substantially the same reasons.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Padmanaban (US 20240264831 A1) – discloses a system and method for automatically determining whether an application is SRE ready for production deployment.
Dennis (US 20230252390 A1) – discloses a method for providing common objective performance metrics across teams, and guiding performance of targeted behaviors based on site reliability engineering principles for improved system reliability and performance.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROSS MICHAEL VINCENT whose telephone number is (703)756-1408. The examiner can normally be reached Mon-Fri 8:30AM-5:30PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, April Blair, can be reached at (571) 270-1014. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/R.M.V/
Examiner, Art Unit 2196
/APRIL Y BLAIR/Supervisory Patent Examiner, Art Unit 2196