Prosecution Insights
Last updated: April 19, 2026
Application No. 18/360,576

EVALUATION OF SOFTWARE DEPLOYMENT PIPELINE STAGES BY APPLYING SOFTWARE TESTING CHECKERS

Non-Final OA — §103, §112
Filed
Jul 27, 2023
Examiner
LYONS, ANDREW M
Art Unit
2191
Tech Center
2100 — Computer Architecture & Software
Assignee
DELL PRODUCTS, L.P.
OA Round
3 (Non-Final)
Grant Probability: 74% (Favorable)
OA Rounds: 3-4
To Grant: 2y 6m
With Interview: 90%

Examiner Intelligence

Career Allow Rate: 74% — above average (338 granted / 459 resolved; +18.6% vs TC avg)
Interview Lift: +16.1% — strong (resolved cases with interview)
Avg Prosecution: 2y 6m (23 currently pending)
Total Applications: 482 (across all art units)

Statute-Specific Performance

§101: 14.2% (-25.8% vs TC avg)
§103: 57.3% (+17.3% vs TC avg)
§102: 14.5% (-25.5% vs TC avg)
§112: 6.0% (-34.0% vs TC avg)
Based on career data from 459 resolved cases.

Office Action

§103, §112
DETAILED ACTION

This Action is a response to the RCE filed 27 January 2026. Claims 1-2, 8-9 and 15 are amended; claim 22 is canceled; and claim 24 is newly added. Claims 1-3, 5-10, 12-16, 18-21 and 23-24 remain pending for examination.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 21 and 23 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention.

Claim 21 recites the limitation "the user" in line 1 and “the at least one listing of designated software testing checkers” in lines 2-3. There is insufficient antecedent basis for these limitations in the claim. Claim 23 includes the same limitations. The amendments to claims 1 and 15 remove reference to “a user” and “at least one listing of designated software testing checkers” and thus “the user” and “the at least one listing of designated software testing checkers” lack antecedent basis.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 3, 5-8, 10, 12-16 and 18-20 are rejected under 35 U.S.C. § 103 as being unpatentable over Ramachandran et al., U.S. 11,281,571 B2 (“Ramachandran”) in view of Sandhu et al., U.S. 2021/0141718 A1 (“Sandhu”).
Regarding claim 1, Ramachandran teaches: A method (Ramachandran, e.g., 11:50-52, “method 300 for performing validation on an application in a CI/CD pipeline …”), comprising: obtaining a set of processor-based software testing checkers for evaluating one or more software items in at least one stage of a software deployment pipeline (Ramachandran, e.g., 7:58-8:21, “Validation engine 270 may determine the validation suite to apply to application 280 and query repository 285 based on the determination … Validation suite 210 includes a pre-production validation suite 215 and a production validation suite … Validation engine 270 may be configured to apply policies, rules, and/or algorithms to determine whether application 280 adheres to pre-production validation suite 215 and production validation suite 225 … Validation engine 270 may execute one or more validation suites at different stages of deployment pipeline 240.” Examiner’s note: see also, e.g., 8:22-10:65, describing various validation factors, each of which comprises one or more tests or checkers to determine validation criteria); … automatically applying at least some of the processor-based software testing checkers in the set to the first stage of the software deployment pipeline to obtain respective software testing checker results … automatically applying the at least some of the processor-based software testing checkers in the set to the second stage of the software deployment pipeline to obtain respective software testing checker results (Ramachandran, e.g., 8:7-21, “Validation engine 270 may execute one or more validation suites at different stages of deployment pipeline 240”) …; and initiating one or more automated actions based at least in part on one or more of the software testing checker results (Ramachandran, e.g., 10:66-11:17, “After validating application 280, validation engine 270 may transmit data associated with the performed validation to validation reporter 275 … generate a score associated 
with application 280 based on the performed validation … determine corrective action based on the score and/or whether application 280 passed or failed the validation.” See also, e.g., 12:40-47, “method determines if the application in the pre-production environment passed the pre-production validation suite. If the application passed … ‘YES’ branch is taken …” See also, e.g., 12:48-49, “method deploys the application in the production environment …” and, e.g., 13:16-36, “method may determine and provide one or more suggestions to correct the issue … method may generate a report based on the result of the validation suites applied to the application …”); wherein the method is performed by at least one processing device comprising a processor coupled to a memory (Ramachandran, e.g., 2:25-52, “system 100 including processors 102 and 104, a chipset 110, a memory 120 … Processor 102 is connected to chipset 110 … Memory is connected to chipset 110 …” See also, e.g., 6:51-53, “software release management system 200 in an information handling system similar to information handling system 100 …”). Ramachandran does not more particularly teach that the automatic applying of the software testing checkers occurs in response to respective first and second requests to transition the software deployment pipeline from respective previous stages to respective following stages, wherein at least first, second and third stages are distinct pipeline stages. 
However, Sandhu does teach: in response to a first request to transition the software deployment pipeline from a first stage to a second stage … in response to a second request to transition the software deployment pipeline from the second stage to a third stage, [automatically applying at least some testing checkers], wherein the first stage, the second stage and the third stage comprise distinct stages of the software development pipeline (Sandhu, e.g., ¶32, “automated code deployment pipeline 104 includes multiple (n) stages for code deployment, illustrated as stages 114(1), 114(2), …, 114(n). For a code change 108, in at least one stage 114, one or more checks or evaluations of the code change 108 is made … For each such stage 114 the code change 108 does not progress to the next stage 114 until the evaluation of the code change is successful … At least stage 114, a stage result 122 is output indicating either approval … or denial …” See also, e.g., ¶35, “By way of example, the automated code deployment pipeline can include a peer engineer review stage, followed by a unit test stage, followed by a merge/build stage, followed by a system/integration test stage. 
In the peer engineer review stage, the code change 108 is communicated to one or more peer engineers … and feedback is received … (e.g., an indication that the code change 108 is approved … indication of a problem) …” See also, e.g., ¶43, “example stage 200 in the automated code deployment pipeline … an example of each stage 114(1), …, 114(n) in the automated code deployment pipeline … code change evaluation module 202 receives the code change 108 and can perform various analysis on the code change 108, such as running one or more tests on the code change …” Examiner’s note: by each respective stage receiving a code change from a prior stage (or an initial stage), this acts as a request to progress the code change from a previous stage to a subsequent stage (i.e., for a code change to progress from a peer review stage to a unit test stage conditional on passing the peer review, or to pass from the unit test stage to the merge/build stage conditional on passing the unit tests selected for validation. See at least claim 2, wherein the request to transition from a first stage to a second stage may comprise an event). At least the peer review, unit test, merge/build test, and system/integration test stages comprise one or more of the first, second and third distinct pipeline stages) for the purpose of executing a deployment pipeline having a plurality of stages wherein at each stage one or more identified or selected tests are performed against a code change in order to validate suitability for progression to a following stage, wherein identification of at least some of the tests is based on an assessment of risk for the code change (Sandhu, e.g., ¶¶15-25). 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system and method for pipeline-based software validation as taught by Ramachandran to provide that the automatic applying of the software testing checkers occurs in response to respective first and second requests to transition the software deployment pipeline from respective previous stages to respective following stages, wherein at least first, second and third stages are distinct pipeline stages because the disclosure of Sandhu shows that it was known to those of ordinary skill in the pertinent art to improve a system and method for pipeline-based continuous software artifact validation and deployment to provide that the automatic applying of the software testing checkers occurs in response to respective first and second requests to transition the software deployment pipeline from respective previous stages to respective following stages, wherein at least first, second and third stages are distinct pipeline stages for the purpose of executing a deployment pipeline having a plurality of stages wherein at each stage one or more identified or selected tests are performed against a code change in order to validate suitability for progression to a following stage, wherein identification of at least some of the tests is based on an assessment of risk for the code change (Sandhu, Id.). Claims 8 and 15 are rejected for the reasons given in the rejection of claim 1 above. 
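The combination mapped above pairs Ramachandran's stage-specific validation suites with Sandhu's transition-gated stages. As a rough illustrative sketch only (not the applicant's or either reference's actual implementation; the class, stage names, and checkers below are all hypothetical), the claimed arrangement of applying checkers in response to a transition request and taking an automated action on the results can be expressed as:

```python
from dataclasses import dataclass, field
from typing import Callable

# A "software testing checker": takes a software item, returns pass/fail.
Checker = Callable[[str], bool]

@dataclass
class DeploymentPipeline:
    stages: list[str]        # distinct pipeline stages
    checkers: list[Checker]  # set of processor-based software testing checkers
    results: dict[str, list[bool]] = field(default_factory=dict)

    def request_transition(self, item: str, from_stage: str) -> bool:
        """In response to a transition request, automatically apply the
        checkers to the current stage; the automated action is either
        promotion (all pass) or blocking promotion (any failure)."""
        outcome = [check(item) for check in self.checkers]
        self.results[from_stage] = outcome
        return all(outcome)

# Hypothetical checkers for illustration only.
def no_todo_markers(item: str) -> bool:
    return "TODO" not in item

def non_empty(item: str) -> bool:
    return len(item) > 0

pipe = DeploymentPipeline(["build", "staging", "production"],
                          [no_todo_markers, non_empty])
promoted = pipe.request_transition("release-1.2", "build")            # both checkers pass
blocked = pipe.request_transition("TODO: fix flaky test", "staging")  # a checker fails
```

The same checker set is reapplied at each transition request, which is the shape of the claim 1 limitation that Sandhu is cited to supply.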
Examiner notes that with respect to claim 8, Ramachandran further teaches: An apparatus comprising: at least one processing device comprising a processor coupled to a memory; the at least one processing device being configured to (Ramachandran, e.g., 2:25-52, “system 100 including processors 102 and 104, a chipset 110, a memory 120 … Processor 102 is connected to chipset 110 … Memory is connected to chipset 110 …” See also, e.g., 6:51-53, “software release management system 200 in an information handling system similar to information handling system 100 …”) implement the following steps: [[[the method of claim 1]]]; and with respect to claim 15, Ramachandran further teaches: A non-transitory processor-readable storage medium having stored therein program code of one or more software programs, wherein the program code when executed by at least one processing device causes the at least one processing device (Ramachandran, e.g., 2:25-52, “system 100 including processors 102 and 104, a chipset 110, a memory 120 … Processor 102 is connected to chipset 110 … Memory is connected to chipset 110 …” See also, e.g., 6:51-53, “software release management system 200 in an information handling system similar to information handling system 100 …” See also, e.g., 13:44-59, “methods described herein may be implemented by software programs executable by a computer system … a computer-readable medium that includes instructions …”) to perform the following steps: [[[the method of claim 1]]]. 
Regarding claim 3, the rejection of claim 1 is incorporated, and Ramachandran further teaches: wherein the one or more automated actions comprise one or more of failing a validation of one or more of the stages of the software deployment pipeline, initiating a designated review process, preventing a promotion of the software deployment pipeline to a next stage, promoting the software deployment pipeline to a next stage and generating one or more notifications (Ramachandran, e.g., 10:66-11:17, “After validating application 280, validation engine 270 may transmit data associated with the performed validation to validation reporter 275 … generate a score associated with application 280 based on the performed validation … determine corrective action based on the score and/or whether application 280 passed or failed the validation.” See also, e.g., 12:40-47, “method determines if the application in the pre-production environment passed the pre-production validation suite. If the application passed … ‘YES’ branch is taken …” See also, e.g., 12:48-49, “method deploys the application in the production environment …” and, e.g., 13:16-36, “method may determine and provide one or more suggestions to correct the issue … method may generate a report based on the result of the validation suites applied to the application …”).

Claims 10 and 16 are rejected for the additional reasons given in the rejection of claim 3 above.

Regarding claim 5, the rejection of claim 1 is incorporated, but Ramachandran does not more particularly teach that delivery of a product associated with the pipeline is permitted based on testing checker results from the software testing checkers associated with a final deployment pipeline stage. 
However, Sandhu does teach: wherein a delivery of a product associated with the software deployment pipeline is permitted based at least in part on one or more of the software testing checker results from the processor-based software testing checkers associated with a final stage of the software deployment pipeline (Sandhu, e.g., ¶25, “Once the code change has passed all the stages of the automated code deployment pipeline, the code change becomes an approved code change that can then be deployed to … the code repository … included in future distribution of the software …”) for the purpose of executing a deployment pipeline having a plurality of stages wherein at each stage one or more identified or selected tests are performed against a code change in order to validate suitability for progression to a following stage, wherein identification of at least some of the tests is based on an assessment of risk for the code change (Sandhu, e.g., ¶¶15-25). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system and method for pipeline-based software validation as taught by Ramachandran to provide that delivery of a product associated with the pipeline is permitted based on testing checker results from the software testing checkers associated with a final deployment pipeline stage because the disclosure of Sandhu shows that it was known to those of ordinary skill in the pertinent art to improve a system and method for pipeline-based continuous software artifact validation and deployment to provide that delivery of a product associated with the pipeline is permitted based on testing checker results from the software testing checkers associated with a final deployment pipeline stage for the purpose of executing a deployment pipeline having a plurality of stages wherein at each stage one or more identified or selected tests are performed against a code change in order to validate 
suitability for progression to a following stage, wherein identification of at least some of the tests is based on an assessment of risk for the code change (Sandhu, Id.).

Regarding claim 6, the rejection of claim 1 is incorporated, and Ramachandran further teaches: wherein a transition to a next stage of the software deployment pipeline is permitted based at least in part on one or more of the software testing checker results from the processor-based software testing checkers associated with a prior stage of the software deployment pipeline (Ramachandran, e.g., 12:40-60, “method determines if the application in the pre-production environment passed the pre-production validation suite … ‘YES’ branch is taken … method deploys the application in the production environment …”).

Regarding claim 7, the rejection of claim 1 is incorporated, and Ramachandran further teaches: wherein a software testing checker result from a given processor-based software testing checker is processed using one or more rules specified for the given processor-based software testing checker (Ramachandran, e.g., 8:7-21, “Validation engine 270 may be configured to apply policies, rules, and/or algorithms to determine whether application 280 adheres to pre-production validation suite 215 and production validation suite 225 …” See also, e.g., 10:66-11:17, “After validating application 280, validation engine 270 may transmit data associated with the performed validation to validation reporter 275 … generate a score associated with application 280 based on the performed validation … determine corrective action based on the score and/or whether application 280 passed or failed the validation … report may indicate whether application 280 passed or failed … may include a score associated with each of the validation factors … score may be aggregated, wherein an overall score is determined and assigned to application 280 …”). 
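The claim 7 mapping reads checker results processed by per-checker rules onto Ramachandran's per-factor scoring and aggregation. A minimal sketch of that idea follows; the rule names, weights, and thresholds are invented for illustration and do not come from Ramachandran or the application:

```python
# Hypothetical per-checker rules: each checker result is processed by a rule
# specified for that checker, then per-checker scores are aggregated into an
# overall score (mirroring the score-and-aggregate idea, not actual code).
RULES = {
    "unit_tests": {"weight": 0.5, "threshold": 0.90},
    "lint":       {"weight": 0.2, "threshold": 0.80},
    "security":   {"weight": 0.3, "threshold": 1.00},
}

def process_results(results: dict[str, float]) -> tuple[float, bool]:
    """Score each checker result against its own rule, then aggregate."""
    per_checker = {
        name: 1.0 if results[name] >= rule["threshold"] else 0.0
        for name, rule in RULES.items()
    }
    overall = sum(RULES[name]["weight"] * s for name, s in per_checker.items())
    passed = all(s == 1.0 for s in per_checker.values())
    return overall, passed

overall, passed = process_results({"unit_tests": 0.95, "lint": 0.75, "security": 1.0})
# "lint" misses its threshold, so validation fails despite a decent overall score
```

This separates the rule specified for each checker from the aggregation step, which is the distinction the claim 7 limitation turns on.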
Claims 12-14 and 18-20 are rejected for the additional reasons given in the rejections of claims 5-7 above.

Claims 2 and 9 are rejected under 35 U.S.C. § 103 as being unpatentable over Ramachandran in view of Sandhu, and in further view of Sreedharan et al., U.S. 2023/0168883 A1 (“Sreedharan”).

Regarding claim 2, the rejection of claim 1 is incorporated, and Sandhu further teaches: wherein the request to transition the software deployment pipeline from the first stage to the second stage is detected by monitoring one or more events (Sandhu, e.g., ¶35, “By way of example, the automated code deployment pipeline can include a peer engineer review stage, followed by a unit test stage, followed by a merge/build stage, followed by a system/integration test stage. In the peer engineer review stage, the code change 108 is communicated to one or more peer engineers … and feedback is received … (e.g., an indication that the code change 108 is approved … indication of a problem) …” See also, e.g., ¶43, “example stage 200 in the automated code deployment pipeline … an example of each stage 114(1), …, 114(n) in the automated code deployment pipeline … code change evaluation module 202 receives the code change 108 and can perform various analysis on the code change 108, such as running one or more tests on the code change …” Examiner’s note: communication of the code change to one or more peer reviewers and receipt of the code change by the code change evaluation module for a particular stage comprise events).

Ramachandran in view of Sandhu does not more particularly teach that the monitored events are published on a message queue. 
However, Sreedharan does teach: [monitoring one or more events] published on a message queue (Sreedharan, e.g., claim 3, “one or more events are obtained from a messaging layer of a sequential message queue.” See also, e.g., claim 4, “one or more events comprise one or more of … a pull request event … a continuous integration/continuous deployment event …”) for the purpose of monitoring for messages indicative of software pull and/or merge requests such that evaluation of software artifacts may be performed in response thereto as part of a CICD code pipeline (Sreedharan, e.g., ¶¶49-53).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system and method for pipeline-based software validation as taught by Ramachandran in view of Sandhu to provide that the monitored events are published on a message queue because the disclosure of Sreedharan shows that it was known to those of ordinary skill in the pertinent art to improve a system and method for CICD code pipeline-related software artifact analytics to provide that the monitored events are published on a message queue for the purpose of monitoring for messages indicative of software pull and/or merge requests such that evaluation of software artifacts may be performed in response thereto as part of a CICD code pipeline (Sreedharan, Id.).

Claim 9 is rejected for the additional reasons given in the rejection of claim 2 above.

Claims 21 and 23-24 are rejected under 35 U.S.C. § 103 as being unpatentable over Ramachandran in view of Sandhu, and in further view of Kumar et al., U.S. 2011/0289489 A1 (“Kumar”).

Regarding claim 21, the rejection of claim 1 is incorporated, but Ramachandran in view of Sandhu does not more particularly teach that the user selects the at least one software testing checker from the at least one listing of designated checkers using a graphical user interface. 
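The Sreedharan feature cited for claim 2, detecting transition requests by monitoring events published on a sequential message queue, can be sketched minimally with Python's standard-library queue. The event shapes below are hypothetical stand-ins for the pull-request and continuous-integration/continuous-deployment events Sreedharan's claim 4 enumerates:

```python
import queue

# A sequential message queue carrying pipeline events (event dictionaries are
# invented for illustration; cf. Sreedharan's pull request and CI/CD events).
events: "queue.Queue[dict]" = queue.Queue()
events.put({"type": "pull_request", "change": "feat-123"})
events.put({"type": "ci_cd", "change": "feat-123", "stage": "unit_test"})
events.put({"type": "comment", "change": "feat-123"})  # ignored: not a trigger

def monitor(q: "queue.Queue[dict]") -> list[str]:
    """Detect transition requests by monitoring events on the queue; each
    qualifying event triggers evaluation of the associated code change."""
    triggered = []
    while not q.empty():
        event = q.get()
        if event["type"] in ("pull_request", "ci_cd"):
            triggered.append(f"evaluate {event['change']}")
    return triggered

triggered = monitor(events)  # the two qualifying events each trigger evaluation
```

The point of the queue is decoupling: publishers of pipeline events need not know which checkers will run in response, which is why the examiner treats queue-published events as an obvious substitute for Sandhu's direct stage-to-stage hand-off.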
However, Kumar does teach: wherein the user selects the at least one processor-based software testing checker from the at least one listing of designated software testing checkers using a graphical user interface (Kumar, e.g., ¶68, “user interface 175 may include a test list 410 interface element providing for the selection of one of a list of test cases 110 for the module 130 selected from the modules list 405 interface element. Upon selection of a test case 110, a user action list 605 interface element may be populated with a list of the user actions 115 included in the selected test case …” See also, e.g., ¶69, “user interface … may also include an execute 535 interface element … providing for the execution of the selected user actions 115 rather than entire test cases 110 …”) for the purpose of allowing a user to select individual program modules for testing in a testing phase, and further allowing the user to select individual test cases (testing checkers) or sub-actions of the test cases to validate functionality of the selected modules (Kumar, e.g., ¶¶53-71). 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system and method for pipeline-based software validation as taught by Ramachandran in view of Sandhu to provide that the user selects the at least one software testing checker from the at least one listing of designated checkers using a graphical user interface because the disclosure of Kumar shows that it was known to those of ordinary skill in the pertinent art to improve a system and method for program module-specific validation test selection and execution to provide that the user selects the at least one software testing checker from the at least one listing of designated checkers using a graphical user interface for the purpose of allowing a user to select individual program modules for testing in a testing phase, and further allowing the user to select individual test cases (testing checkers) or sub-actions of the test cases to validate functionality of the selected modules (Kumar, Id.).

Claim 23 is rejected for the additional reasons given in the rejection of claim 21 above.

Regarding claim 24, the rejection of claim 1 is incorporated, but Ramachandran in view of Sandhu does not more particularly teach that the set of testing checkers for evaluating the software items in the stage of the pipeline is populated, at least in part, by a user selecting at least one testing checker from at least one listing of designated testing checkers. 
However, Kumar does teach: wherein the set of processor-based software testing checkers for evaluating the one or more software items in the at least one stage of the software deployment pipeline is populated, at least in part, by a user selecting at least one processor-based software testing checker from at least one listing of designated software testing checkers (Kumar, e.g., ¶68, “user interface 175 may include a test list 410 interface element providing for the selection of one of a list of test cases for the module 130 selected from the modules list 405 user interface element. Upon selection of a test cases 110, a user action list 605 interface element may be populated with a list of the user actions 115 included in the selected test case …” See also, e.g., ¶69, “user interface … may also include an execute 535 user interface element … providing for the execution of the selected user actions 115 rather than entire test cases 110 …”) for the purpose of allowing a user to select individual program modules for testing in a testing phase, and further allowing the user to select individual test cases (testing checkers) or sub-actions of the test cases to validate functionality of the selected modules (Kumar, e.g., ¶¶53-71). 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system and method for pipeline-based software validation as taught by Ramachandran in view of Sandhu to provide that the set of testing checkers for evaluating the software items in the stage of the pipeline is populated, at least in part, by a user selecting at least one testing checker from at least one listing of designated testing checkers because the disclosure of Kumar shows that it was known to those of ordinary skill in the pertinent art to improve a system and method for program module-specific validation test selection and execution to provide that the set of testing checkers for evaluating the software items in the stage of the pipeline is populated, at least in part, by a user selecting at least one testing checker from at least one listing of designated testing checkers for the purpose of allowing a user to select individual program modules for testing in a testing phase, and further allowing the user to select individual test cases (testing checkers) or sub-actions of the test cases to validate functionality of the selected modules (Kumar, Id.).

Response to Arguments

In the Remarks, Applicant Argues: The Office Action acknowledges that Ramachandran “does not more particularly teach that the obtaining of the software testing checkers occurs in response to a request to transition the software deployment pipeline from a first stage to a second stage” and cites Perdomo for this feature (Resp. at 9). Perdomo states that “FIG. 7 shows the overall process for generating pipelines for deployment of software artifacts …” (emphasis in original); that is, the passage refers to a deployment stage (id., emphasis in original). Applicant submits that Perdomo does not teach first and second transition requests as cited in amended independent claims 1, 8 and 15 (id. at 9-10). 
Thus, Ramachandran in view of Perdomo and Kumar, alone or in combination, do not render amended claims 1, 8 and 15 obvious, and the rejections of these claims and all claims depending therefrom should be withdrawn (id. at 10-11).

Examiner’s Response: In view of the amendments, Examiner newly cites to Sandhu, and maintains the rejections under the new grounds set forth in full above.

Conclusion

Examiner has identified particular references contained in the prior art of record within the body of this action for the convenience of Applicant. Although the citations made are representative of the teachings in the art and are applied to the specific limitations within the enumerated claims, the teaching of the cited art as a whole is not limited to the cited passages. Other passages and figures may apply. Applicant, in preparing the response, should consider fully the entire reference as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art and/or disclosed by Examiner.

Examiner respectfully requests that, in response to this Office Action, support be shown for language added to any original claims on amendment and any new claims. That is, indicate support for newly added claim language by specifically pointing to page(s) and line number(s) in the specification and/or drawing figure(s). This will assist Examiner in prosecuting the application.

When responding to this Office Action, Applicant is advised to clearly point out the patentable novelty which he or she thinks the claims present, in view of the state of the art disclosed by the references cited or the objections made. He or she must also show how the amendments avoid such references or objections. See 37 C.F.R. 1.111(c).

Examiner interviews are available via telephone and video conferencing using a USPTO-supplied web-based collaboration tool. 
Applicant is encouraged to submit an Automated Interview Request (AIR) which may be done via https://www.uspto.gov/patent/uspto-automated-interview-request-air-form, or may contact Examiner directly via the methods below.

Any inquiry concerning this communication or earlier communication from Examiner should be directed to Andrew M. Lyons, whose telephone number is (571) 270-3529, and whose fax number is (571) 270-4529. The examiner can normally be reached Monday to Friday from 10:00 AM to 6:00 PM ET. If attempts to reach Examiner by telephone are unsuccessful, Examiner’s supervisor, Wei Mui, can be reached at (571) 272-3708.

Information regarding the status of an application may be obtained from the Patent Center system. For more information about the Patent Center system, see https://www.uspto.gov/patents/apply/patent-center. If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call (800) 786-9199 (in USA or Canada) or (571) 272-1000.

/Andrew M. Lyons/
Primary Examiner, Art Unit 2191

Prosecution Timeline

Jul 27, 2023
Application Filed
May 31, 2025
Non-Final Rejection — §103, §112
Aug 14, 2025
Examiner Interview Summary
Aug 14, 2025
Applicant Interview (Telephonic)
Aug 28, 2025
Response Filed
Nov 29, 2025
Final Rejection — §103, §112
Jan 27, 2026
Request for Continued Examination
Feb 04, 2026
Response after Non-Final Action
Feb 07, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602311: METHOD, DEVICE, SYSTEM, AND COMPUTER PROGRAM FOR COVERAGE-GUIDED SOFTWARE FUZZING (2y 5m to grant; granted Apr 14, 2026)
Patent 12602203: INTEGRATION FLOW DESIGN GUIDELINES VALIDATOR (2y 5m to grant; granted Apr 14, 2026)
Patent 12596542: GENERATING AND DISTRIBUTING CUSTOMIZED EMBEDDED OPERATING SYSTEMS (2y 5m to grant; granted Apr 07, 2026)
Patent 12585465: DYNAMIC PROJECT PLANNING FOR SOFTWARE DEVELOPMENT PROJECTS (2y 5m to grant; granted Mar 24, 2026)
Patent 12585453: SYSTEMS AND METHODS FOR UPDATING WITNESS SLED FIRMWARE (2y 5m to grant; granted Mar 24, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 74%
With Interview (+16.1%): 90%
Median Time to Grant: 2y 6m
PTA Risk: High
Based on 459 resolved cases by this examiner. Grant probability derived from career allow rate.
