FINAL OFFICE ACTION
Status of the Claims
Claims 1-8 and 11-12 are rejected under 35 U.S.C. 103.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-7, and 11-12 are rejected under 35 U.S.C. 103 as being unpatentable over Fujine et al. (U.S. Publication No. 2021/0019252 A1), hereinafter referred to as Fujine, in view of Ryall et al. (U.S. Publication No. 2020/0004519 A1), hereinafter referred to as Ryall, in further view of Maddela et al. (U.S. Publication No. 2013/0042222 A1), hereinafter referred to as Maddela.
Regarding Claim 1, Fujine teaches:
at least one storage device configured to store instructions; ([0032])
and at least one processor configured to execute the instructions to: ([0091])
detect update of configuration information of an information system; ([0078]; regarding, “the reception unit 610 receives the registration information from the development terminal 710 or the operation terminal 720 (S21)… The registration information includes, for example, information for defining or updating a workflow, information for defining or updating a work pattern, information for defining or updating test data, information for defining or updating a test pattern, and the like.”);
deploy a load module related to the updated configuration information in the information system to release updated contents; ([0079]; regarding, “If it is determined in Step S22 that the registration information is a definition of the workflow, the registration unit 620 registers the workflow information 310 in the CMDB 300 based on the registration information (S23).”; [0080]; regarding, “If it is determined in Step S22 that the registration information is a definition of the work pattern or the like… the registration unit 620 registers or updates the work pattern table 320, the parameter sheet 330, and the work scenario 340.”; [0081]; regarding, “When it is determined in Step S22 that the registration information is a definition of the test data… the registration unit 620 issues a test data ID for identifying a combination of the test data included in the registration information and the automation method.”; [0082]; regarding, “If it is determined in Step S22 that the registration information is a definition of the test pattern or the like… the registration unit 620 registers or updates the test pattern table 410, the test scenario 420, and the association table 430.”);
refer to, based on the release completion notification being received, combined test definition information in which a test automation tool and an execution order are associated with each of a plurality of test scenarios constituting a combined test for the information system ([0033]; regarding, “The test pattern 21 is information defining a combination of a tool 211, a parameter ID 212, and a GUI application 213 of a system to be tested.”; [0066]; regarding, “the test scenario 420 can be expressed by defining an association between an operation to be executed in each processing order and a parameter ID of a parameter value to be used in the operation...”);
and then select the test scenario and the test automation tool in accordance with the execution order; (Fig. 7, [0086]; regarding, “When the selected pattern ID is a test pattern in Step S304, the acquisition unit 630 refers to the test scenario 420 corresponding to the selected test pattern ID”; [0088]; regarding, “the test execution unit 650 reads the GUI operation automation tool 501 or the like corresponding to the tool ID associated with the selected test pattern ID from the tool DB 500.”; [0082]; regarding, “The registration unit 620… updates the parameter ID in the test scenario 420 for each processing order with respect to the test pattern ID.”);
execute, for the information system, the selected test scenario by using the selected test automation tool; ([0088]; regarding, “the test execution unit 650 reads the GUI operation automation tool 501 or the like corresponding to the tool ID associated with the selected test pattern ID from the tool DB 500… and controls a test to be executed.”).
Fujine fails to explicitly disclose but Ryall teaches:
perform a status check by a communication test for the information system in which the updated contents have been released; ([0089]; regarding, “Upon commencing the execution of the deployment, the CI management system 106 is configured to update the status of the deployment based on the result.”; [0091]; regarding, “The CI management system 106 may retrieve some of the required information… such as the deployment ID is retrieved from the deployment request and the date and environment may be retrieved once the deployment is completed. Further, the status of the deployment from the CI management system 106.”);
output a release completion notification when there is no problem with regard to the status check; ([0089]; regarding, “Upon commencing the execution of the deployment, the CI management system 106 is configured to update the status of the deployment based on the result. It is configured to communicate information about the deployment to the tracking system 108 at step 210… The CI management system 106 then updates the status once the deployment is complete (to “success” of “failed” for instance). Status of the deployment includes, e.g., whether the deployment is queued/pending, in progress, completed, successful or unsuccessful. When the deployment request is first created, the status of the deployment is set as queued/pending. This status is updated during the deployment process. For example, when the deployment is initiated, the status is updated to ‘in progress’ and once it is completed the status may be updated to ‘completed’, ‘successful’ or ‘unsuccessful’ (depending on the outcome of the deployment).”).
Therefore, it would have been obvious before the effective filing date of the claimed invention to one of ordinary skill in the art to which said subject matter pertains to have modified Fujine with the teachings of Ryall above. Doing so could allow for greater flexibility and ease of operation for developers (Ryall, [0028]).
Further, Maddela teaches:
and determine whether or not the combined test has been successfully performed based on at least some of results of the execution of the plurality of the test scenarios executed in accordance with the respective execution orders. ([0046]; regarding, “Upon completion of the test case execution, the test case execution status for individual methods, test cases, or the test set as a whole may be provided 380...This may include presenting a detailed or summarized log of test case execution activities; a listing of pass/fail test case results; graphical indications; or alerts provided to a testing user.”)
Therefore, it would have been obvious before the effective filing date of the claimed invention to one of ordinary skill in the art to which said subject matter pertains to have modified Fujine and Ryall with the teachings of Maddela above. Doing so can overcome the challenges of high cost and effort for the development and maintenance of automated test cases and test suites (Maddela, [0018]).
Regarding Claim 2, Fujine in view of Ryall in further view of Maddela teach the combined test apparatus of claim 1. Fujine in view of Ryall in further view of Maddela further teaches:
wherein the at least one processor is further configured to execute the instructions to:
convert information about a test corresponding to the selected test scenario into a format corresponding to the selected test automation tool, and execute the selected test scenario by inputting the converted information into the selected test automation tool. (Fujine, [0088]; regarding, “test execution unit 650… controls a test to be executed… At this time, the test execution unit 650 appropriately converts, processes, or performs other processing on the acquired setting value and test data according to the interface of the GUI operation automation tool.”)
Regarding Claim 3, Fujine in view of Ryall in further view of Maddela teach the combined test apparatus of claim 1. Fujine in view of Ryall in further view of Maddela further teaches:
wherein the at least one processor is further configured to execute the instructions to:
select a second test scenario and a second test automation tool, (Maddela, [0025]; regarding, “the test case automation tool may be operated by a test user to define specific inputs and testing actions to establish an automated test case and test set collection of test cases. The test case automation tool is further able to execute an automation of test scenarios and use cases of various methods”);
an execution order of the second test scenario being later than that of the first test scenario, (Maddela, [0066]; regarding, “the test case automation tool may enable a provision for invoking already designed test case(s) within another test case of the same test set. Likewise, the execution of a method of a test case may be dependent on the results of the execution of another method or test case.”);
convert a result of the execution by the first test automation tool associated with the first test scenario into a format corresponding to the second test automation tool, (Maddela, [0060]);
execute the second test scenario by inputting the converted information into the selected second test automation tool. (Maddela, [0056]).
Regarding Claim 4, Fujine in view of Ryall in further view of Maddela teach the combined test apparatus of claim 1. Fujine in view of Ryall in further view of Maddela further teaches:
wherein the at least one processor is further configured to execute the instructions to:
acquire an expected value in the combined test from a configuration management database configured to manage configuration information of the information system, (Maddela, [0065]; regarding, “the test set data generator GUI 800 is to generate a large number of test set records in a database”; Fig. 6, [0062]);
determine whether or not the combined test has been successfully performed by using the expected value, and register a determined result in the configuration management database. (Maddela, [0046]; regarding, “Upon completion of the test case execution, the test case execution status for individual methods, test cases, or the test set as a whole may be provided 380 to a testing user, automated process, or other location… This may include… a detailed or summarized log of test case execution activities…”; [0045]; regarding, “Errors and unexpected results that occur during execution of the test case may be logged or captured for further analysis.”).
Regarding Claim 5, Fujine in view of Ryall in further view of Maddela teach the combined test apparatus of claim 4. Fujine in view of Ryall in further view of Maddela further teaches:
wherein the at least one processor is further configured to execute the instructions to:
generate a test result of the combined test based on some of the results of the execution of the respective test scenarios, (Maddela, [0056]);
and determine whether or not the combined test has been successfully performed by comparing the test result with the expected value. (Maddela, [0028]; regarding, “The GUI may accept commands to allow the specification of inputs to the application, initiate test actions on the application, define expected results of the test actions, and verify or report the results of the test action.”; [0061]; regarding, “The GUI may also provide a variety of graphical and textual indications of the execution results from the test cases, including providing a listing with PASS/FAIL status”).
Regarding Claim 6, Fujine in view of Ryall in further view of Maddela teach the combined test apparatus of claim 4. Fujine in view of Ryall in further view of Maddela further teaches:
wherein the at least one processor is further configured to execute the instructions to:
acquire a result of the past determination registered as the expected value from the configuration management database. (Maddela, [0056]; regarding, “The execution of the method and the values for the parameters of the method may be tied to the execution of other methods in the test case, or results of other test cases.”; [0060]; regarding, “Automated test cases for the test sets themselves may also be stored in an XML format, although such test case data may also be maintained in other formats in a database…”).
Regarding Claim 7, Fujine in view of Ryall in further view of Maddela teach the combined test apparatus of claim 1. Fujine in view of Ryall in further view of Maddela further teaches:
wherein the at least one processor is further configured to execute the instructions to:
perform a process subsequently to completion of release processing executed in response to an update of the configuration information of the information system, (Maddela, [0060]; regarding, “The GUI 400 of FIG. 4 may be configured to load this XML data from a selected test set or from various test cases, thereby providing easy maintenance and changes to the automated test cases”);
the process including selecting the test scenario and the test automation tool, (Fujine, Fig. 7, [0086]; regarding, “When the selected pattern ID is a test pattern in Step S304, the acquisition unit 630 refers to the test scenario 420 corresponding to the selected test pattern ID”; [0088]; regarding, “the test execution unit 650 reads the GUI operation automation tool 501 or the like corresponding to the tool ID associated with the selected test pattern ID from the tool DB 500.”; [0082]; regarding, “The registration unit 620… updates the parameter ID in the test scenario 420 for each processing order with respect to the test pattern ID.”);
executing the selected test scenario, (Fujine, [0088]; regarding, “the test execution unit 650 reads the GUI operation automation tool 501 or the like corresponding to the tool ID associated with the selected test pattern ID from the tool DB 500… and controls a test to be executed.”);
and determining whether or not the combined test has been successfully performed. (Maddela, [0046]; regarding, “Upon completion of the test case execution, the test case execution status for individual methods, test cases, or the test set as a whole may be provided 380...This may include presenting a detailed or summarized log of test case execution activities; a listing of pass/fail test case results; graphical indications; or alerts provided to a testing user.”).
Claims 11 and 12 are rejected under 35 U.S.C. 103 on the same grounds of rejection as claim 1.
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Fujine et al. (U.S. Publication No. 2021/0019252 A1), hereinafter referred to as Fujine, in view of Ryall et al. (U.S. Publication No. 2020/0004519 A1), hereinafter referred to as Ryall, in further view of Maddela et al. (U.S. Publication No. 2013/0042222 A1), hereinafter referred to as Maddela, in further view of Cohen et al. (U.S. Patent No. 9,201,775 B1), hereinafter referred to as Cohen.
Regarding Claim 8, Fujine in view of Ryall in further view of Maddela teach the combined test apparatus of claim 1. Fujine in view of Ryall in further view of Maddela further teaches:
wherein the combined test includes at least one of a functional test for verifying a function provided by the information system, (Maddela, [0035]; regarding, “the test case automation tool 120 may provide its own custom functionality, logic, and custom methods to design and execute the various testing actions upon the software application 110…”);
Fujine in view of Ryall in further view of Maddela fails to explicitly disclose but Cohen teaches:
a load test for verifying performance of the information system, (Col. 26, lines 12-16; regarding, “Additionally or alternatively, a run of a test scenario may include state information about systems involved in running the test scenario (e.g., the state of certain system resources, and/or performance data such as CPU load or network congestion)”);
and an abnormal system test for verifying an operation of the information system in an event of an abnormality. (Col. 31, lines 26-35; regarding, “there may be cases in which the automatic test scenario utilizes randomly generated or randomly selected values… which may introduce unexpected values into a run of the automatic test scenario may lead to unexpected behavior of the system (e.g., generation of certain errors or execution of different transactions than were expected).”).
Therefore, it would have been obvious before the effective filing date of the claimed invention to one of ordinary skill in the art to which said subject matter pertains to have modified Fujine, Ryall, and Maddela with the teachings of Cohen above. Doing so can increase the efficiency of test scenarios, possibly reducing the number of test scenarios that need to be run (Cohen, Col. 52, lines 43-45).
Response to Arguments
Applicant’s arguments filed 12/05/2025 have been fully considered.
Applicant argues that amended independent Claim 1, and similarly Claims 11 and 12, overcome the 35 U.S.C. 101 rejection. Examiner agrees.
Applicant’s arguments with respect to the previous rejection under 35 U.S.C. 103 of independent Claim 1, and similarly Claims 11 and 12, have been considered, and a new ground of rejection has been provided addressing the newly claimed subject matter. Please see the detailed rejection of the newly recited subject matter above.
Previously cited reference Fujine teaches “detect update of configuration information of an information system; and deploy a load module related to the updated configuration information in the information system to release updated contents.”
Newly cited reference Ryall teaches “perform a status check by a communication test for the information system in which the updated contents have been released; and output a release completion notification when there is no problem with regard to the status check.”
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATHEW GUSTAFSON whose telephone number is (571)272-5273. The examiner can normally be reached Monday-Friday 8:00-4:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Bryce Bonzo can be reached at (571) 272-3655. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/M.D.G./Examiner, Art Unit 2113 /BRYCE P BONZO/Supervisory Patent Examiner, Art Unit 2113