Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This is in response to the Application filed 09/09/25. Claims 1–20 have been examined and are pending.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1–20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Pandurangarao et al. (US 2020/0272786).
Regarding claims 1, 12, and 20, Pandurangarao anticipates a computer-implemented method, comprising:
receiving a change request data object indicative of a software modification to a software program [0040, see 610 for the commit request and updating one or more documents, creating a change document];
generating a risk level of the software modification based on the change request data object [0043, see generating recommendations and calculating risks];
wherein the change request data object represents an association between the software program to be modified and a computing environment in which the software program is implemented; [0036, “…The results of the build process can be tested (436) in a test environment…”];
generating a test scheduling data object based on the change request data object and the risk level of the software modification, wherein the test scheduling data object is indicative of at least one testing operation for the software modification [0039, which shows developing time stamps for document creation, i.e., scheduling];
initiating performance of the at least one testing operation at a testing environment [0038, see unit tests and quality metrics for performance];
receiving, from the testing environment, test performance data [0036, see results of the build process and tests];
determining at least one test failure based on the test performance data [0036, see errors]; and
in response to determining the at least one test failure, generating a communication bridge between at least a subset of a plurality of computing devices based on the at least one test failure and a development flow data object [0034, see the test results tool and creating test documents; 0038 shows tests can include errors and failures].
Regarding claims 2 and 13, the computer-implemented method of claim 1, further comprising:
in response to determining the risk level of the software modification exceeds a risk threshold, flagging the change request data object for administrator approval; and providing an administrator approval request to one of the plurality of computing devices based on the development flow data object [0040, see the commit request, Quality Metrics, and threshold value, as well as document Validation].
Regarding claim 3, the computer-implemented method of claim 1, further comprising:
modifying a distributed scheduling data object based on the test scheduling data object modification [0039, which shows developing time stamps for document creation, i.e., scheduling; and 0040 for updating the test database].
Regarding claim 4, the computer-implemented method of claim 3, further comprising:
providing a graphical user interface (GUI) indicative of the distributed scheduling data object to the plurality of computing devices [0019 – 0020, shows scheduling and user interface]; and
rendering the test scheduling data object on the GUI in one of a plurality of colors based on the risk level of the software modification [0024, which shows utilizing developer ratings to identify issues. Note: while colors are not specifically mentioned, it is common knowledge to use colors to rate or rank the severity of occurrences in most industries, e.g., code blue, code red, code yellow, etc.].
Regarding claim 5, the computer-implemented method of claim 1, wherein: the at least one testing operation comprises one or more test case data objects; and the computer-implemented method further comprises:
generating, using a machine learning model, at least one test case correction based on the at least one test failure; and applying the at least one test case correction to the one or more test case data objects [0019–0020, which show testing suites, test coverage, unit testing, failures, and errors].
Regarding claim 6, the computer-implemented method of claim 1, further comprising:
modifying a status of the change request data object based on the at least one test failure, wherein the test scheduling data object is indicative of the status of the change request data object [0043, see modified document].
Regarding claim 7, the computer-implemented method of claim 1, further comprising:
providing a notification indicative of the test scheduling data object to the plurality of computing devices based on the development flow data object [0039, which shows developing time stamps for document creation, i.e., scheduling].
Regarding claim 8, the computer-implemented method of claim 1, wherein generating the communication bridge comprises: generating a teleconference communication session between at least the subset of the plurality of computing devices [0024–0025, see telecommunication and networking].
Regarding claim 9, the computer-implemented method of claim 1, wherein generating the communication bridge comprises:
generating a second test scheduling data object; and providing a notification indicative of the second test scheduling data object to at least the subset of the plurality of computing devices [0039, which shows developing time stamps for document creation, i.e., scheduling].
Regarding claim 10, the computer-implemented method of claim 1, further comprising:
modifying a distributed scheduling data object based on the second test scheduling data object [0039, which shows developing time stamps for document creation, i.e., scheduling].
Regarding claim 11, the computer-implemented method of claim 1, further comprising:
querying the development flow data object based on the at least one test failure to determine the subset of the plurality of computing devices, wherein the development flow data object defines associations between a plurality of user accounts and the plurality of computing devices [0042, see commit request].
Regarding claim 14, the computing apparatus of claim 13, wherein the computer-coded instructions, when executed by the at least one processor, further cause the computing apparatus to:
provide a notification indicative of the communication bridge and the at least one test failure to at least a subset of the plurality of computing devices based on the at least one test failure and the development flow data object [0034, see interaction tools].
Regarding claim 15, the computing apparatus of claim 12, wherein the computer-coded instructions, when executed by the at least one processor, further cause the computing apparatus to: generate, based on the at least one test failure, at least one correction data object indicative of one or more service-level agreement (SLA) impacts [0043, see rules].
Regarding claim 16, the computing apparatus of claim 15, wherein the computer-coded instructions, when executed by the at least one processor, further cause the computing apparatus to:
perform one or more root cause corrective actions (RCCAs) on the software modification based on the at least one correction data object [0043, see corrective suggestions].
Regarding claim 17, the computing apparatus of claim 12, wherein the computer-coded instructions, when executed by the at least one processor, further cause the computing apparatus to:
generate an incident report based on the at least one test failure; and
provide the incident report to the plurality of computing devices [0034, see test results].
Regarding claim 18, the computing apparatus of claim 12, wherein:
the change request data object indicates a downtime metric associated with the at least one testing operation; and the risk level is generated based on the downtime metric [0034–0043, which show metrics and testing].
Regarding claim 19, the computing apparatus of claim 12, wherein the at least one testing operation comprises a simulated graphical user interface test [0033–0034, see user interface].
Correspondence Information
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Chuck Kendall, whose telephone number is 571-272-3698. The examiner can normally be reached from 10:00 am to 6:30 pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Hyung Sough, can be reached at 571-272-6799. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only.
For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
/CHUCK O KENDALL/
Primary Examiner, Art Unit 2192