DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Claim 1 recites, “establishing one or more logical links between the one or more applications…generating a logical map…” and “computing an impact score for each of the one or more applications”. The limitations of “establishing”, “generating” and “computing”, as drafted, are functions that, under their broadest reasonable interpretation, recite the abstract idea of a mental process. The limitations encompass a human mind carrying out the functions through observation, evaluation, judgment and/or opinion, or even with the aid of pen and paper. Thus, these limitations recite, and fall within, the “Mental Processes” grouping of abstract ideas under Prong 1.
Under Prong 2, this judicial exception is not integrated into a practical application. The claim recites the following additional elements: “continuously receiving”, “receiving” and “present a consolidated view… in a dashboard display...”. The additional elements of “continuously receiving” and “receiving” are insignificant pre- and post-solution activity. The additional element of “present a consolidated view… in a dashboard display…” is recited at a high level of generality and is thus insignificant extra-solution activity. See MPEP 2106.05(g). Further, the limitation “a processing device” is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer and/or mere computer components. See MPEP 2106.05(f). Accordingly, the additional elements do not integrate the recited judicial exception into a practical application, and the claim is therefore directed to the judicial exception.
Under Step 2B, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of “a processing device” amounts to no more than mere instructions, or generic computer/computer components, to carry out the exception; the limitation of “present a consolidated view… in a dashboard display…” is identified as well-understood, routine, conventional activity (MPEP 2106.05(d)); and as to the limitations of “continuously receiving” and “receiving”, the courts have identified mere data gathering as likewise well-understood, routine and conventional activity. See MPEP 2106.05(d) and MPEP 2106.05(f). The recitation of generic computer instructions and computer components to apply the judicial exception, and the well-understood, routine, conventional activities, do not amount to significantly more and thus cannot provide an inventive concept. Accordingly, claim 1 is not patent eligible under 35 U.S.C. 101.
Claims 2-3 further define “application data” and a backend server. The additional elements are neither a practical application under Prong 2 nor an inventive concept under Step 2B, as they amount to mere instructions to apply.
Claims 4-6 further define “impact score”. The additional elements are neither a practical application under Prong 2 nor an inventive concept under Step 2B, as they amount to mere instructions to apply.
Claim 7 further defines the presenting step. The additional elements are neither a practical application under Prong 2 nor an inventive concept under Step 2B, as they amount to insignificant extra-solution activity.
Claims 8-20 contain similar limitations to claims 1-7 and are therefore rejected for the same reasons.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Bendert et al. (US 2024/0338184 A1) in view of Goyal et al. (US 10,423,410 B1).
As per claim 1 (Amended), Bendert et al. teaches the invention as claimed including, “A system for automated computer software release validation and management, the system comprising:
a processing device;
a non-transitory storage device containing instructions when executed by the processing device, causes the processing device to perform the steps of:
continuously receiving application data associated with one or more applications in a network environment;
establishing one or more logical links between the one or more applications based on the application data, wherein the one or more logical links comprise dependencies of the one or more applications and wherein establishing one or more logical links further comprises generating a logical map illustrating dependent connections across the one or more applications;”
Bendert et al. teaches the use of dependency identification components to identify dependencies of a software component (0013). Software components may have one or more dependencies such as different APIs, other software components that the software component communicates with or utilizes, and the like. The dependency management service may identify and track dependencies of the one or more software components (0021). A GUI may provide a dependency tree (generated logical map) that shows not only direct dependencies but also indirect dependencies (0017).
“receiving one or more artifacts associated with a software release, wherein the artifact is associated with one or more checkpoints for validating the software release received from a user device service as proof of formatting standards and code review;
computing an impact score for each of the one or more applications based on the one or more artifacts and the one or more logical links between the one or more applications; and”
Bendert et al. teaches the dependency tracking system creates one or more health indicators for each dependency based upon the dependency status information. A health indicator may be determined based on one or more specified rules. An unhealthy or intermediate indicator may be assigned when one or both of the dependencies of the software components is not up-to-date or has vulnerabilities. Multiple health indicators may be combined to produce an overall dependency health indicator for the software component. The indicator may be displayed (0016). The system may provide a dependency score that determines how dependent one software component is on another (0018). The system may determine when a new version of an object is released or predicted to be released and may notify administrators of other software components that depend on the software component with the new version (0022). The dependency status and recommendation component may calculate one or more status indicators for one or more of the dependencies identified by the dependency determiner component based upon the status information retrieved by the dependency tracker component. Status indicators may be a score, based on a specified formula that considers various factors. The status may include a test status of a current version (code review) of the dependency. Where status indicators are a score, points may be assigned based upon whether the dependency is up-to-date, has no known vulnerabilities, or the like. For example, if the software component has incorporated and/or tested version 1.6 of a dependency, but version 2.1 is the latest version, fewer points may be given to the software component than if it had incorporated and/or tested version 1.9 of the dependency. The status indicator may be converted to a percentage of total points possible (0032). Status indicators of all dependencies of a given software component may be aggregated to form a score for the entire software component (0033).
A status score for an entire software component may be created by a weighted combination of each of the scores of each dependency. A user interface may show the scores for a plurality of software components (0042).
Bendert et al. does not explicitly appear to teach, “as proof of formatting standard…”.
Goyal et al. teaches source code can be tested against a model to reveal whether the source code conforms to formats of authoritative source code (column 3, lines 3-8). If source code complies with the formats exhibited by the authoritative source code, the source code repository server stores the source code to the source code repository. Figure 2 illustrates an example of source code that is compliant with a set of coding rules (column, lines 53-58).
It would have been obvious to one of ordinary skill in the art before the effective filing date to modify Bendert et al. with Goyal et al. because both teach testing a program/code. Bendert et al. teaches determining the health of a program based on a status indicator score derived from tests and rules. Goyal et al. teaches a check to see if source code conforms to a format. There are many factors one can check in order to determine the health/conformance of a program. Checking that the source code conforms to a format is nothing more than a design choice, and it would have been obvious to add it as one of the rules/tests to be performed by Bendert et al. to determine the health of a program.
“based on the impact score for each of the one or more applications, present a consolidated view of a status of the software release in a dashboard display on the user device, the status comprising a progress of each of the one or more checkpoints for validating the software release.”
Bendert et al. teaches the dependency tracking system may provide various advanced visualizations in one or more GUIs (dashboard), including a visualization of current dependencies of a particular software component, a visualization of past dependencies, a visualization of dependency histories, a visualization of the number of dependencies, a health of dependencies (status), and the like (0017). The GUI provides a visualization of dependencies and their statuses (progress), among other information (0012). Also see 0015, 0032-0033 and figure 3.
As per claim 2 (Amended), Bendert et al. further teaches, “The system of claim 1, wherein the application data comprises information regarding the dependencies of the one or more applications and a computing resource in the network environment, wherein the computing resource is a target of the software release.”
The system may calculate a dependency complexity resource utilization score. This score predicts server workload based upon the software component's dependencies and what other software components depend on that software component. For example, if a second software component depends on a first software component and is set to go live, the system may recommend increasing an amount of hardware resources dedicated to the software component (0041). Software components may have one or more dependencies such as different APIs, other software components that the software component communicates with or utilizes, and the like. The dependency management service may identify and track dependencies of the one or more software components (0020-0021).
As per claim 3, Bendert et al. further teaches, “The system of claim 2, wherein the computing resource is a backend server comprising data used by one or more functions of the one or more applications.”
The system may calculate a dependency complexity resource utilization score. This score predicts server workload based upon the software component's dependencies and what other software components depend on that software component. For example, if a second software component depends on a first software component and is set to go live, the system may recommend increasing an amount of hardware resources dedicated to the software component (0041). Software components may have one or more dependencies such as different APIs, other software components (backend server) that the software component communicates with or utilizes, and the like. The dependency management service may identify and track dependencies of the one or more software components (0021-0023).
As per claim 4, Bendert et al. further teaches, “The system of claim 3, wherein the impact score indicates a degree of dependency of an application on the computing resource.”
The system may provide a dependency score that determines how dependent one software component is to another (0018).
As per claim 5, Bendert et al. further teaches, “The system of claim 1, wherein the impact score is a composite score comprising one or more subscores, wherein each of the one or more subscores indicate a degree of impact along a specified dimension.”
The dependency tracking system creates one or more health indicators for each dependency based upon the dependency status. An unhealthy or intermediate indicator may be assigned when one or both of the dependencies of the software components is not up-to-date or has vulnerabilities. Multiple health indicators may be combined to produce an overall dependency health indicator for the software component. The indicator may be displayed (0016). The system may calculate a dependency complexity resource utilization score. This score predicts server workload based upon the software component's dependencies and what other software components depend on that software component. For example, if a second software component depends on a first software component and is set to go live, the system may recommend increasing an amount of hardware resources dedicated to the software component (0041). A status score for an entire software component may be created by a weighted combination of each of the scores of each dependency. A user interface may show the scores for a plurality of software components (0042).
As per claim 6, Bendert et al. further teaches, “The system of claim 5, wherein the specified dimension comprises one of application performance, application security, and application uptime.”
The system may calculate a dependency complexity resource utilization score. This score predicts server workload based upon the software component's dependencies and what other software components depend on that software component. For example, if a second software component depends on a first software component and is set to go live, the system may recommend increasing an amount of hardware resources dedicated to the software component (0041). The dependency tracking system creates one or more health indicators for each dependency based upon the dependency status. An unhealthy or intermediate indicator may be assigned when one or both of the dependencies of the software components is not up-to-date or has vulnerabilities. Multiple health indicators may be combined to produce an overall dependency health indicator for the software component. The indicator may be displayed (0016).
As per claim 7, Bendert et al. further teaches, “The system of claim 1, wherein presenting the impact score comprises displaying a user dashboard on a user computing device, wherein the user dashboard comprises one or more interface elements for accepting the one or more artifacts from a user.”
The dependency tracking system may provide various advanced visualizations in one or more GUIs, including a visualization of current dependencies of a particular software component, a visualization of past dependencies, a visualization of dependency histories, a visualization of the number of dependencies, a health of dependencies, and the like (0017). Also see figure 3 and paragraph 0012. A user can confirm or deny the dependency (0013).
Claims 8-20 contain similar limitations to claims 1-7 and are therefore rejected for the same reasons.
Response to Arguments
Applicant's arguments filed 11/6/2025 have been fully considered but are moot in view of the amendments. Please see the above rejection regarding the new limitations.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Olejarz et al. (US 2023/0029624 A1) teaches auditing release readiness for a software application. A release readiness score for the application is generated based on assessment criteria (abstract).
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MARK A GOORAY whose telephone number is (571)270-7805. The examiner can normally be reached Monday - Friday 10:00am - 6:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Lewis Bullock can be reached at 571-272-3759. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MARK A GOORAY/ Examiner, Art Unit 2199
/LEWIS A BULLOCK JR/ Supervisory Patent Examiner, Art Unit 2199