Detailed Action
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This is the initial Office action based on the application filed on March 21, 2024, in which claims 1-20 have been presented for examination.
Status of Claims
Claims 1-20 are pending in the application, of which claims 1, 11, and 20 are in independent form. These claims (1-20) are subject to the following rejection(s) and/or objection(s) set forth in the Office action below.
Claim Objections
Claims 1-20 are objected to because of the following informalities:
Claim 1, lines 12-13; claim 11, lines 10-11; and claim 20, line 10: “the corresponding initial test results” lacks proper antecedent basis.
Claims 2-10 and 12-19 depend on the objected claims and inherit the same issue.
Appropriate correction is required.
Claim Rejections – 35 USC §103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-11, and 13-20 are rejected under 35 U.S.C. 103 as being unpatentable over Heidasch et al. (US Patent Application Publication No. 2009/0164848 A1, hereinafter “Heidasch”) in view of Bussa et al. (US Patent Application Publication No. 2023/0007894 A1, hereinafter “Bussa”).
Per claim 1:
Heidasch discloses:
A computing platform (At least see ¶[0008] - a system for testing an application or component) comprising:
at least one processor (At least see ¶[0026] one or more/multiple processors);
a communication interface communicatively coupled to the at least one processor (At least see ¶[0027] - on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network); and
memory storing computer-readable instructions that, when executed by the at least one processor (At least see ¶[0025] - one or more modules of computer program instructions encoded on a computer readable medium, e.g., a machine-readable storage device, a machine readable storage medium, a memory device), cause the computing platform to:
receive test input information comprising input information to validate successful performance of code for a software application (At least see ¶[0006] -test input data for application for use by a test process);
test the code by inputting the test input information into the software application and executing the code (At least see ¶[0024] -instrument the test application or component to test the internal functionality without code modification), wherein testing the code produces initial test results (At least see ¶[0003] -an intelligent test framework that enables test data to be defined, and the test results and internal application or component states to be recorded);
store the test input information and the initial test results in a table (At least see ¶[0004] -test data and test results (test output as well as internal data) can be saved; ¶[0020] one or more tables for storing);
identify that subsequent testing should be initiated (At least see ¶[0004] next or subsequent tests are detected and initialized);
automatically identify the test input information and the corresponding initial test results using the stored table (At least see ¶[0017] -obtaining the test result (test output data) and recording test data (test input data), test results);
automatically test the software application by inputting the test input information into the software application and executing the software application, wherein testing the software application produces automated test results (At least see ¶[0007] -generating test input data for a test process to test the application or component, and generating a configuration of the test process based on a template. The method further includes executing the test process according to the configuration and based on the test input data to generate test output data; also see ¶[0020] and ¶[0023]);
compare the automated test results to the initial test results (At least see ¶[0023] -current test results will be compared with "reference" data, and the intelligent test framework then reports the test status and detail).
Heidasch sufficiently discloses the system as set forth above, but Heidasch does not explicitly disclose: based on identifying that the automated test results match the initial test results, cause deployment of the software application.
However, Bussa discloses:
based on identifying that the automated test results match the initial test results, cause deployment of the software application (At least see ¶[0009] - a result report interface may include a selectable option to push the test code for deployment based on an indication of success in the result report).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Bussa into Heidasch because Bussa provides efficient, scalable, and convenient technical solutions that address and overcome the technical problems associated with software testing by providing intelligent dynamic web service testing in a continuous integration and delivery environment: building a web service test data set based on historical data and test code deployment parameters, and conducting automated web service testing to validate requirement changes in a continuous integration and delivery test deployment environment (see ¶[0003]).
Per claim 3:
Heidasch discloses:
wherein the test input information comprises input information used to test an overall routine of the software application and one or more subroutines of the software application, wherein a first portion of the test input information is used to test the overall routine and a second portion of the test input information is used to test the one or more subroutines (At least see ¶[0019] - intelligent test framework 200 is configured to test an application or component in a running test process 204; [wherein application and/or component include routines and subroutines] -emphasis added).
Per claim 4:
Bussa also discloses:
wherein the second portion of the test input information is produced through execution of the overall routine (At least see ¶[0070] - a code coverage test may be performed, in which a user may select a threshold percentage of lines of codes to execute).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Bussa into Heidasch because Bussa provides efficient, scalable, and convenient technical solutions that address and overcome the technical problems associated with software testing by providing intelligent dynamic web service testing in a continuous integration and delivery environment: building a web service test data set based on historical data and test code deployment parameters, and conducting automated web service testing to validate requirement changes in a continuous integration and delivery test deployment environment (see ¶[0003]).
Per claim 5:
Bussa also discloses:
wherein the test input information is manually generated (At least see ¶[0037] - user must manually update the test data to cover the various requirement changes. Accordingly, the systems and methods described herein seek to automate the processes for creating test data).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Bussa into Heidasch because Bussa provides efficient, scalable, and convenient technical solutions that address and overcome the technical problems associated with software testing by providing intelligent dynamic web service testing in a continuous integration and delivery environment: building a web service test data set based on historical data and test code deployment parameters, and conducting automated web service testing to validate requirement changes in a continuous integration and delivery test deployment environment (see ¶[0003]).
Per claim 6:
Heidasch discloses:
initial test results further comprise one or more error codes corresponding to testing of the code (At least see ¶[0017] - obtaining the test result (test output data) and tracing data (from debugging agent 104), and recording test data (test input data), test results (test output data) and internal application or component data (tracing data) for the creation of "reference" test data and/or result).
Per claim 7:
Bussa also discloses:
generating, using generative artificial intelligence and based on the test input information, automated test input information (At least see ¶[0003] - delivery test deployment environment using an AI-generated test data set), wherein generating the automated test input information avoids generation of additional code (At least see ¶[0036] - a computing platform may employ artificial intelligence and/or machine learning models to automatically generate a test data set suitable for specific parameters associated with a test code); and
automatically testing the software application using the automated test input information (At least see ¶[0036] - based on an existing test data set and one or more parameters associated with executing the test code in a continuation integration and delivery environment).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Bussa into Heidasch because Bussa provides efficient, scalable, and convenient technical solutions that address and overcome the technical problems associated with software testing by providing intelligent dynamic web service testing in a continuous integration and delivery environment: building a web service test data set based on historical data and test code deployment parameters, and conducting automated web service testing to validate requirement changes in a continuous integration and delivery test deployment environment (see ¶[0003]).
Per claim 8:
Bussa also discloses:
based on identifying that the automated test results do not match the initial test results, identify whether a number of errors in the automated test results meets or exceeds a predetermined error threshold (At least see ¶[0070] - a code coverage test may be performed, in which a user may select a threshold percentage of lines of codes to execute. Accordingly, the prepared test data set may cover the threshold percentage of the test code to deem the test code a success).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Bussa into Heidasch because Bussa provides efficient, scalable, and convenient technical solutions that address and overcome the technical problems associated with software testing by providing intelligent dynamic web service testing in a continuous integration and delivery environment: building a web service test data set based on historical data and test code deployment parameters, and conducting automated web service testing to validate requirement changes in a continuous integration and delivery test deployment environment (see ¶[0003]).
Per claim 9:
Bussa also discloses:
based on identifying that the number of errors does not meet or exceed the predetermined error threshold, sending a notification to a user device indicating that the automated test failed (At least see ¶[0016] - Sending the test code output results may include at least one of: a success/failure status of the test code, a trace of any errors in the test code, an exception caught during execution of the test code, an error in the corrected test data set, or an explanation of a failure indication).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Bussa into Heidasch because Bussa provides efficient, scalable, and convenient technical solutions that address and overcome the technical problems associated with software testing by providing intelligent dynamic web service testing in a continuous integration and delivery environment: building a web service test data set based on historical data and test code deployment parameters, and conducting automated web service testing to validate requirement changes in a continuous integration and delivery test deployment environment (see ¶[0003]).
Per claim 10:
Bussa also discloses:
sending a notification to a user device indicating that the automated test failed and a request for updated test input information, receiving the updated test input information, and updating the table based on the updated test input information (At least see ¶[0074] - test data failures may be assessed and corrected. Where a test data failure has occurred, machine learning models may be trained using reinforcement learning algorithms and/or supervised learning algorithms to connect to external web services/databases to generate a new test data set and use the new test data set as input to a subsequent test code execution).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Bussa into Heidasch because Bussa provides efficient, scalable, and convenient technical solutions that address and overcome the technical problems associated with software testing by providing intelligent dynamic web service testing in a continuous integration and delivery environment: building a web service test data set based on historical data and test code deployment parameters, and conducting automated web service testing to validate requirement changes in a continuous integration and delivery test deployment environment (see ¶[0003]).
Per claim 11:
Heidasch discloses:
A method (At least see ¶[0007] - a method for testing an application or component) comprising:
at a computing platform comprising at least one processor (At least see ¶[0026] - a programmable processor, a computer, or multiple processors or computers), a communication interface (At least see ¶[0027] - on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network), and memory:
The remaining limitations depicted in this method claim are similar to those of claim 1 above; as such, the remaining limitations are rejected using the same rationale as claim 1 above.
Per claim 13:
wherein the test input information comprises input information used to test an overall routine of the software application and one or more subroutines of the software application, wherein a first portion of the test input information is used to test the overall routine and a second portion of the test input information is used to test the one or more subroutines.
The limitation depicted in this method claim is similar to that of claim 3 above; as such, the limitation is rejected using the same rationale as claim 3 above.
Per claim 14:
wherein the second portion of the test input information is produced through execution of the overall routine.
The limitation depicted in this method claim is similar to that of claim 4 above; as such, the limitation is rejected using the same rationale as claim 4 above.
Per claim 15:
wherein the test input information is manually generated.
The limitation depicted in this method claim is similar to that of claim 5 above; as such, the limitation is rejected using the same rationale as claim 5 above.
Per claim 16:
initial test results further comprise one or more error codes corresponding to testing of the code.
The limitation depicted in this method claim is similar to that of claim 6 above; as such, the limitation is rejected using the same rationale as claim 6 above.
Per claim 17:
generating, using generative artificial intelligence and based on the test input information automated test input information, wherein generating the automated test input information avoids generation of additional code; and
automatically testing the software application using the automated test input information.
The limitations depicted in this method claim are similar to those of claim 7 above; as such, the limitations are rejected using the same rationale as claim 7 above.
Per claim 18:
based on identifying that the automated test results do not match the initial test results, identify whether a number of errors in the automated test results meets or exceeds a predetermined error threshold.
The limitation depicted in this method claim is similar to that of claim 8 above; as such, the limitation is rejected using the same rationale as claim 8 above.
Per claim 19:
based on identifying that the number of errors does not meet or exceed the predetermined error threshold, sending a notification to a user device indicating that the automated test failed.
The limitation depicted in this method claim is similar to that of claim 9 above; as such, the limitation is rejected using the same rationale as claim 9 above.
Per claim 20:
One or more non-transitory computer-readable media storing instructions that (At least see ¶[0025] - one or more modules of computer program instructions encoded on a computer readable medium, e.g., a machine readable storage device, a machine readable storage medium, a memory device), when executed by a computing platform comprising at least one processor, a communication interface, and memory (At least see ¶[0028] - processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output), cause the computing platform to:
The remaining limitations depicted in this product claim are similar to those of claim 1 above; as such, the remaining limitations are rejected using the same rationale as claim 1 above.
Claims 2 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Heidasch et al. in view of Bussa et al., and further in view of Surace et al. (US Patent Application Publication No. 2009/0164848 A1, hereinafter “Surace”).
Per claim 2:
Heidasch as modified by Bussa sufficiently discloses the system as set forth above, but does not explicitly disclose: wherein the test input information comprises a username and a password.
However, Surace discloses:
wherein the test input information comprises a username and a password (At least see ¶[0041] - test script data includes user names and passwords as input).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Surace into Heidasch as modified by Bussa because Surace provides the capability to develop test scripts and test scenarios using user logs of user activities with the software applications while running on a production server, which user logs are recorded in various formats; as such, the user logs may be processed, analyzed, and/or combined with other information to derive one or more test scripts, which may then be combined into a test scenario and/or executed individually to conduct tests of the software application under test on the same or a different server (see ¶[0018]-¶[0019]).
Per claim 12:
wherein the test input information comprises a username and a password.
The limitation depicted in this method claim is similar to that of claim 2 above; as such, the limitation is rejected using the same rationale as claim 2 above.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZIAUL A. CHOWDHURY whose telephone number is (571) 270-7750. The examiner can normally be reached 9:30 AM - 6:30 PM, Monday - Friday.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hyung S. Sough can be reached on 571-272-6799. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Status information for published applications may be obtained from the Patent Public Search tool (for all users). A link to the Patent Public Search tool is available at www.uspto.gov/PatentPublicSearch. To find a U.S. patent or U.S. patent application publication, open the Patent Public Search tool by selecting “Start search”. Type the U.S. patent or U.S. patent application publication number in the “Search” panel without any punctuation, followed by “.pn.”.
Should you have questions on access to the system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ZIAUL A CHOWDHURY/ Primary Examiner, Art Unit 2192