Prosecution Insights
Last updated: April 18, 2026
Application No. 17/936,793

MANAGED SOLVER EXECUTION USING DIFFERENT SOLVER TYPES

Status: Non-Final Office Action (§102, §103)
Filed: Sep 29, 2022
Examiner: VY, HUNG T
Art Unit: 2163
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Amazon Technologies, Inc.
OA Round: 3 (Non-Final)

Prediction Summary
Grant Probability: 86% (Favorable)
Predicted OA Rounds: 3-4
Time to Grant: 2y 9m
Grant Probability with Interview: 89%

Examiner Intelligence

Career Allow Rate: 86% (above average): 781 granted / 905 resolved, +31.3% vs Tech Center average
Interview Lift: +2.9% (minimal), based on resolved cases with interview
Typical Timeline: 2y 9m average prosecution; 30 applications currently pending
Career History: 935 total applications across all art units
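The career allow rate shown above follows directly from the raw counts (781 granted out of 905 resolved); a one-line sanity check in Python:

```python
# Dashboard figures: 781 applications granted out of 905 resolved.
granted = 781
resolved = 905

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # prints 86.3%, consistent with the 86% card above
```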

Statute-Specific Performance

Statute   Rate    vs TC avg
§101      18.1%   -21.9%
§103      31.1%   -8.9%
§102      29.2%   -10.8%
§112       6.7%   -33.3%

Tech Center averages are estimates. Based on career data from 905 resolved cases.

Office Action

Grounds of rejection: §102, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments regarding the §101 rejection, based on the amendment filed 10/03/2025, have been fully considered, and the previous §101 rejection is withdrawn. Applicant's arguments with respect to claims 1-20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. The new rejection is based on Bruckhaus et al. (U.S. Pat. 8,417,715 B1).

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless - (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-12 and 15-19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bruckhaus et al. (U.S. Pat. 8,417,715 B1).

With respect to claim 1, Bruckhaus et al. discloses a system, comprising: one or more computing devices that implement a solver execution service, configured to: provide access to a plurality of compute resources configured with different types of optimization solvers; in response to one or more client requests (i.e., the proper, broad interpretation of the term "business task" is to include “questions or tasks requested or desired to be solved not only by for-profit companies but also by individuals, non-profit organizations, government organizations, and other non-business entities about any aspect of their business, organization or operations”(col. 8, lines 45-52), or “the end user requests access details to connect to the QUICKBOOKS database”(col. 14, line 20), or “the end user interface 120 comprises a GUI that displays requests for inputs from the user and that displays results from operation of the program, including the results of operation of the predictive model developed based upon a given business task.”(col. 25, lines 1-5), or “The analytic results stored in the analytic results repository 140 are made available to the user through the end user interface 120 that in turn communicates with the delivery services component 110 to deliver the results in the user requested format.”(col. 36, lines 33-40));

translate an optimization problem resulting in a first model format for the optimization problem and a second model format for the optimization problem (i.e., “[t]he model manager 144 can request information about specific data elements that are part of a unit of prepared data, such as the data type of the data element, say an ORACLE "VARCHAR(2000)," a C "long," or JAVA "String" type. Depending on algorithm requirements, the model manager 144 can then request from data management services to perform certain data type conversions, such as converting ORACLE "CHAR(2000)" into ORACLE "VARCHAR2 (2000)," or into a JAVA "String." The model manager 144 can thus analyze the types and contents of the prepared data, convert the data as needed, and feed the prepared data to various algorithms in formats appropriate for each algorithm”(col. 49, lines 1-13), or “it is possible to convert an algorithm in a certain format into appropriate software source code that can be incorporated into the program. For example, if an algorithm is described as text, as pseudo code, in mathematical notation, or other similar formats, software programmers can translate that information into working software code that can be incorporated into the program.”(col. 42, lines 35-43));

execute, at the compute resources, a first optimization solver of a first type on a first compute resource configured with the first type of optimization solver to determine a first solution to the optimization problem formatted with the first model format (i.e., “FIG. 7 is a flowchart illustrating the business task translation process. The overall process of translating a business task basically includes, among other items, identifying the category or type of data mining algorithm that could be used to solve the business task, as well as the identification of which data from the user's data sources 105 will be used in creating the models within the data mining component and the data that will be used to evaluate the models”(col. 22, lines 33-40), and “The model manager 144 retrieves translated business tasks from the metadata repository 126 and creates, optimizes, deploys, and executes data mining models, and monitors their performance. The model manager 144 interacts with the data mining algorithm library 146, the automatic optimization module 148, the performance monitoring module 171, and the feature selection module 172 to select algorithms, build models, and optimize or select the deployed model with the overall purpose of providing the best possible solutions for a given business task”(col. 43, lines 41-50));

execute, at the compute resources, a second optimization solver of a second type on a second compute resource configured with the second type of optimization solver to determine a second solution to the optimization problem, formatted with the second model format (i.e., the same passages at col. 22, lines 33-40 and col. 43, lines 41-50 quoted above);

and generate output that indicates a comparison of performance data of the first optimization solver and the second optimization solver in solving the optimization problem (i.e., “[i]n a next step 712, the business task services module 116, again through the configuration wizard 114, requests the user to specify the schedule for scoring observations data, described further below in connection with the data mining component 106. This process then repeats for each business task.”(col. 23, lines 61-67), and “It should be appreciated that the predictive models are also automatically optimized by the program as described further below, and information about this optimization is also stored in the model repository 150.”(col. 35, lines 51-55), and “The data mining component controls the process of optimizing and selecting the best model using certain performance or evaluation criteria, referred to as an objective function.”(col. 35, lines 50-55), and the model manager passage at col. 43, lines 41-50 quoted above; “the best possible solutions” indicates that more than one solution is evaluated, reading on the comparison of performance of the first and second optimization solvers as claimed).

With respect to claims 2 and 15, Bruckhaus et al. discloses wherein: the solver execution service implements a graphical user interface (GUI) (i.e., “Various different methods of accessing results are provided by the invention, such as using a graphical user interface (GUI)”(col. 64, lines 55-60)); the one or more client requests are generated according to user input received via the GUI (i.e., the same passage at col. 64, lines 55-60); the GUI includes one or more user control elements to select one or more types of optimization solvers to solve the optimization problem; and the GUI is configured to display an indication of the output (i.e., “[a]t the data mining phase, typically a new set of experts who are skilled in mathematical analysis algorithms, such as neural networks, decision trees, nonlinear regressions, etc. perform data mining on the prepared data.
Data mining experts typically create or use custom applications written in programming languages such as Java, C++, Python, or R, or data mining workbenches or use graphical user interface-based (GUI-based) tools, such as those provided by a development workbench tool like WEKA, SAS, and SPSS CLEMENTINE, to read, analyze, transform, and derive data from one or more data tables and to develop models”(col. 2, lines 26-38)).

With respect to claim 3, Bruckhaus et al. discloses the system of claim 2, wherein the performance data includes: an amount of time used to determine the first solution and the second solution (i.e., “This automates model evaluation to measure the quality of the predictive model quantitatively, and the invention can select models with the highest quality for execution during run time.”(col. 5, lines 16-20)), objective function values of the first solution and the second solution (i.e., “The integration of a user's data via the configuration wizard 114 allows this data to be used by the data management and data mining components to ultimately develop a predictive model that uses the value of input variables to predict the value of output variables and to provide a solution to a given business task”(col. 17, lines 40-45)), or resource usage levels associated with determination of the first solution and the second solution (i.e., “These include data management processes such as extraction and aggregation of raw data from the user's data sources 105 and data preparation to transform aggregated data to a format usable by the data mining component and data mining process such as model building, evaluation, selection, evolution, execution and deployment”(col. 24, lines 44-50)).

With respect to claim 4, Bruckhaus et al. discloses wherein: the optimization problem is stored as a first model in a first format specific to the first type of optimization solver; and the solver execution service is configured to translate the first model to a second model in a second format specific to the second type of optimization solver (i.e., the model manager passage at col. 43, lines 41-50 quoted above).

With respect to claim 5, Bruckhaus et al. discloses the system of claim 4, wherein: the first model is specified in a problem modeling language readable by the first type of optimization solver; and the second model is specified in a programming language to make calls to a solver API of the second type of optimization solver (i.e., “Based upon that knowledge, in a next step 217, the developer-user would modify the business software platform's GUI code to make the necessary application program interface (API) calls to the interfaces of the modules in the program of the invention.”(col. 13, lines 10-15)).

With respect to claim 6, Bruckhaus et al. discloses wherein the solver execution service is configured to: translate at least one of the first solution and the second solution to a common solution format; and store the first and second solutions in the common solution format (i.e., “More specifically, an important aspect of the methods and systems of the present invention is the use of a common data model for data mining (CDMDM) that allows for the foregoing integration of a user's data sources.”(col. 14, lines 55-60)).

With respect to claims 7 and 16, Bruckhaus et al. discloses a method, comprising: performing, by a solver execution service implemented by one or more computing devices: providing access to a plurality of compute resources configured with different types of optimization solvers; in response to one or more client requests (i.e., the passages quoted above at col. 8, lines 45-52; col. 14, line 20; and col. 25, lines 1-5, or “The analytic results stored in the analytic results repository 140 are made available to the user through the end user interface 120 that in turn communicates with the delivery services component 110 to deliver the results in the user requested format.”(col.
36, lines 33-40)); performing, in response to one or more client requests to solve an optimization problem, wherein the optimization problem comprises a plurality of portions: executing, at the compute resources, a first optimization solver of a first type on a first compute resource configured with the first type of optimization solver to generate a first solution to a first portion of the optimization problem (i.e., the FIG. 7 business task translation passage at col. 22, lines 33-40 and the model manager passage at col. 43, lines 41-50 quoted above); executing, at the compute resources, a second optimization solver of a second type on a second compute resource configured with the second type of optimization solver to generate a second solution to a second portion of the optimization problem, wherein the second portion of the optimization problem is a different portion from the first portion of the optimization problem (i.e., the same passages at col. 22, lines 33-40 and col. 43, lines 41-50); and generating output indicating an overall solution to the optimization problem based at least in part on the first solution and the second solution (i.e., “[i]n a next step 712, the business task services module 116, again through the configuration wizard 114, requests the user to specify the schedule for scoring observations data, described further below in connection with the data mining component 106. This process then repeats for each business task.”(col. 23, lines 61-67), and “It should be appreciated that the predictive models are also automatically optimized by the program as described further below, and information about this optimization is also stored in the model repository 150.”(col. 35, lines 51-55), and “The data mining component controls the process of optimizing and selecting the best model using certain performance or evaluation criteria, referred to as an objective function.”(col. 35, lines 50-55), and the model manager passage at col. 43, lines 41-50 quoted above; “the best possible solutions” indicates that more than one solution is evaluated, reading on the comparison of the first and second solutions as claimed).

With respect to claim 8, Bruckhaus et al. discloses the method of claim 7, wherein the first optimization solver and the second optimization solver are executed at least partly in parallel (i.e., Fig. 15 shows adapter 1 … adapter n).

With respect to claim 9, Bruckhaus et al. discloses the method of claim 7, wherein: the one or more client requests specify configuration data for obtaining the overall solution in multiple execution stages, including a first stage to obtain the first solution followed by a second stage to obtain the second solution; and the first solution is translated into an input model for the second optimization solver (i.e., “During the process of preparing data the individual components in the data management services 104 create the tables if not already present in the common data repository for data mining (CDRDM) 136 to store the data during the various stages of data management such as acquisition, aggregation and preparation of data.”(col. 28, lines 25-30)).

With respect to claim 10, Bruckhaus et al.
discloses the method of claim 9, wherein the configuration data specifies: a plurality of solver jobs executable by the solver execution service to obtain the overall solution (i.e., “The model manager 144 interacts with the data mining algorithm library 146, the automatic optimization module 148, the performance monitoring module 171, and the feature selection module 172 to select algorithms, build models, and optimize or select the deployed model with the overall purpose of providing the best possible solutions for a given business task.”(col. 43, lines 45-50)), and for individual ones of the solver jobs: a solver type of an optimization solver to use for the solver job; and one or more solver parameters that are specific to the solver type (i.e., “The model manager 144 receives requests to construct data mining models of certain types using specific data sets from the observations table. It can obtain data mining algorithms of the appropriate type from the data mining algorithm library 146, and it is able to apply such algorithms to the data specified by business task translation. The model manager 146 executes these algorithms to build various models for the given business task”(col. 43, lines 55-62), and the model manager passage at col. 43, lines 41-50 quoted above).
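Claim 10 describes configuration data that lists solver jobs, each carrying a solver type and parameters specific to that type. A minimal illustrative sketch of what such configuration data could look like (Python; all names and parameter keys are hypothetical, not taken from the application or the cited references):

```python
from dataclasses import dataclass, field

@dataclass
class SolverJob:
    """One solver job from the configuration data: a solver type plus
    parameters specific to that type. Names here are illustrative only;
    the claim does not prescribe any concrete schema."""
    solver_type: str                              # e.g. "MILP" or "CP-SAT"
    solver_params: dict = field(default_factory=dict)

# Configuration data specifying a plurality of solver jobs used to
# obtain the overall solution.
config = {
    "jobs": [
        SolverJob("MILP",   {"mip_gap": 0.01, "time_limit_s": 300}),
        SolverJob("CP-SAT", {"num_workers": 8}),
    ]
}

# Each job's parameters are specific to its solver type.
assert all(isinstance(job.solver_params, dict) for job in config["jobs"])
```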
With respect to claim 11, Bruckhaus et al. discloses the method of claim 10, wherein the configuration data specifies: for a particular one of the solver jobs, one or more parameters of a virtual machine (i.e., the model manager passage at col. 43, lines 41-50 quoted above) or container instance to execute the particular solver job, including a particular program language runtime configured on the virtual machine or container (i.e., “The business task translator 118 also identifies to the data-mining component 106 which of the data elements are inputs, which are outputs, and which server as inputs and outputs simultaneously. For example, the monthly sales volume per item could be a column in the database that could server as a target to build a model to predict that value and the price of each item could be an input column. As described above, columns may also serve as inputs and outputs simultaneously, for example for use with association algorithms”(col. 45, lines 1-10)).

With respect to claim 12, Bruckhaus et al. discloses the method of claim 9, wherein the configuration data specifies: a model preparation stage executable by the solver execution service to populate a first model for the first optimization solver with values from a model file (i.e., “[i]n a next step 1108, the data aggregation services module 149 fetches rows of data from the next source table, which is the equivalent of a given staging table.”(col. 31, lines 65-67)).
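Claims 11 and 12 add per-job execution-environment parameters (a VM or container instance with a particular language runtime) and a model preparation stage that populates a solver model with values before solving. A hedged sketch of both ideas (Python; the image name, runtime label, and template syntax are invented for illustration and appear nowhere in the application or references):

```python
# Hypothetical per-job execution environment per claim 11: a container
# spec including the language runtime configured on it.
job_config = {
    "solver_type": "LP",
    "execution_env": {
        "kind": "container",
        "image": "example/solver-runtime:latest",  # hypothetical image name
        "runtime": "python3.11",                   # language runtime on the container
        "vcpus": 4,
        "memory_gib": 16,
    },
}

def prepare_model(template: str, values: dict) -> str:
    """Model preparation stage per claim 12: populate a solver model
    template with concrete values read from a model file before the
    solve stage runs. The {name} placeholder syntax is illustrative."""
    for name, value in values.items():
        template = template.replace("{" + name + "}", str(value))
    return template

model = prepare_model("maximize: {c1}*x + {c2}*y;", {"c1": 3, "c2": 5})
print(model)  # prints: maximize: 3*x + 5*y;
```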
With respect to claim 17, Bruckhaus et al. discloses the one or more non-transitory computer-readable storage media of claim 16, wherein the program instructions when executed on or across the one or more processors cause the solver execution service to: determine, from the one or more client requests (i.e., the passages quoted above at col. 8, lines 45-52; col. 14, line 20; col. 25, lines 1-5; and col. 36, lines 33-40), configuration data for obtaining the overall solution in multiple execution stages (i.e., the model manager passage at col. 43, lines 45-50 quoted above), including a first stage to obtain the first solution followed by a second stage to obtain the second solution; and translate the first solution generated by the first optimization solver into an input model for the second optimization solver (i.e., “Referring back to FIG. 11, in a next step 1106, the data aggregation services module 149 determines whether it has processed the last table listed in the applicable source and target table list. If not, in a next step 1108, the data aggregation services module 149 fetches rows of data from the next source table, which is the equivalent of a given staging table. It should be appreciated that the data aggregation metadata comprises a column (DATA_PROCESSED_COLUMN) that identifies whether a given row in the staging table has already been processed or moved to an aggregation table. Therefore, only rows of data in a given staging table that have not been processed or previously moved to an aggregation table will be moved to the aggregation table.”(col. 31, lines 61-67 and col. 32, lines 1-8)).

With respect to claim 18, Bruckhaus et al. discloses the one or more non-transitory computer-readable storage media of claim 17, wherein the program instructions when executed on or across the one or more processors cause the solver execution service to: determine, from the configuration data, a plurality of solver jobs executable by the solver execution service to obtain the overall solution (i.e., “it should be appreciated that the model repository 150 stores any information that can be captured that may affect the best choice and configuration of model for a given business task. That is, the model repository provides a complete trace about the details of each model, the evolution history that led to the creation of each model, and the context in which the model was built.”(col. 36, lines 15-22)), and for individual ones of the solver jobs: a solver type of an optimization solver to use for the solver job (i.e., “the data mining component is able to optimize automatically the selection of a single model that provides the best solution for the business task. Accordingly, the data mining components selects only one of the models, referred to as the deployed model, out of the many that it builds to provide the user with a response or solution to the business task”(col. 36, lines 15-22)); and one or more solver parameters that are specific to the solver type (i.e., the same passage at col. 36, lines 15-22).

With respect to claim 19, Bruckhaus et al. discloses the one or more non-transitory computer-readable storage media of claim 18, wherein the program instructions when executed on or across the one or more processors cause the solver execution service to: determine, from the configuration data and for a particular one of the solver jobs (i.e., “The data mining component controls the process of optimizing and selecting the best model using certain performance or evaluation criteria, referred to as an objective function. The data mining component may use various different alternative objective functions to evaluate the performance of a given model.
For example, when a financial institution desires to predict the risk of customers defaulting on loans, it may use different objective functions to accomplish different goals”(0011)), one or more parameters of a virtual machine or container instance to execute the particular solver job, including a particular program language runtime configured on the virtual machine or container (i.e., “it is possible to convert an algorithm in a certain format into appropriate software source code that can be incorporated into the program. For example, if an algorithm is described as text, as pseudo code, in mathematical notation, or other similar formats, software programmers can translate that information into working software code that can be incorporated into the program.”(col. 42, line 35-42 )); and provision a virtual machine or container to execute the particular solver job according to the one or more parameters (i.e., “the solver system 101 includes a constraint optimizer provisioning API 1008 that validates clients of the system”(0133)). Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claims 13-14 and 20 are rejected under 35 U.S.C 103(a) as being unpatentable over Bruckhaus et al. (U.S. Pat. 8,417,715 B1). in view of Cella et al. (U.S. Pub. 2022/0366494 A1). With respect to claim 13 and 20, Bruckhaus et al. 
discloses all limitations recited in claim 9 except for wherein the configuration data specifies a workflow executable by a workflow orchestration service, and the workflow orchestration service uses the solver execution service to execute the first and second solver jobs. However, Cella et al. discloses wherein the configuration data specifies a workflow executable by a workflow orchestration service (i.e., “the digital twin interaction manager 20824 may manage one or more workflows that are performed via a market orchestration digital twin” (2084)), and the workflow orchestration service uses the solver execution service to execute the first and second solver jobs (i.e., “the recommendation may be influenced by the type of problem to be solved and whether there are specialized algorithms or methods that are optimized for the type of problem (e.g. quantum annealing based traveling salesperson solver or even classic heuristic methods that provide for reasonable baseline results).” (1313)). It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Bruckhaus to use workflow orchestration in order to increase efficiency, speed, and reliability, a purpose well known in the art as evidenced by the teaching of Cella et al. (0025).

With respect to claim 14, Cella et al. discloses the method of claim 13, wherein the workflow causes the workflow orchestration service to execute at least one step via a machine learning service distinct from the solver execution service (i.e., “the trader digital twin 20842 may work in connection with the market orchestration system platform 20500 to provide simulations, predictions, statistical summaries, and decision-support based on analytics, machine learning, and/or other AI and learning-type processing of inputs (e.g., pricing data, counterparty data, asset data, order data, news, discussion boards, and the like)” (2093)).
Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HUNG T VY whose telephone number is (571)272-1954. The examiner can normally be reached M-F 8-5.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tony Mahmoudi, can be reached at (571)272-4078. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HUNG T VY/
Primary Examiner, Art Unit 2163
December 4, 2025

Prosecution Timeline

Sep 29, 2022
Application Filed
Jul 01, 2025
Non-Final Rejection — §102, §103
Oct 03, 2025
Response Filed
Oct 03, 2025
Applicant Interview (Telephonic)
Oct 03, 2025
Examiner Interview Summary
Dec 04, 2025
Final Rejection — §102, §103
Feb 05, 2026
Response after Non-Final Action
Feb 24, 2026
Request for Continued Examination
Mar 07, 2026
Response after Non-Final Action
Apr 08, 2026
Non-Final Rejection — §102, §103 (current)
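As a rough check on the pendency figures elsewhere in this report, elapsed prosecution time can be computed directly from the timeline dates above. The snippet below is a hypothetical sketch using only those dates (filing on Sep 29, 2022; the current non-final rejection on Apr 08, 2026); the `complete_months` helper is illustrative, not part of any USPTO or vendor tooling.

```python
from datetime import date

def complete_months(start: date, end: date) -> int:
    """Whole calendar months elapsed between two dates."""
    months = (end.year - start.year) * 12 + (end.month - start.month)
    if end.day < start.day:  # the final month is not yet complete
        months -= 1
    return months

filed = date(2022, 9, 29)       # Application Filed
current_oa = date(2026, 4, 8)   # Non-Final Rejection (current)

pending = complete_months(filed, current_oa)
print(f"{pending // 12}y {pending % 12}m pending")  # 3y 6m pending
```

At roughly 3.5 years, this case is already past the examiner's 2y 9m average, consistent with the third OA round and the RCE filed in Feb 2026.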

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12566949
TERNARY NEURAL NETWORK ACCELERATOR DEVICE AND METHOD OF OPERATING THE SAME
2y 5m to grant Granted Mar 03, 2026
Patent 12561345
GENERATING AN ARTIFICIAL DATA SET
2y 5m to grant Granted Feb 24, 2026
Patent 12524422
SYSTEMS AND METHODS FOR PROCESSING HIERARCHICAL, SEMI-STRUCTURED, SCHEMA-LESS, POLYMORPHIC DATA
2y 5m to grant Granted Jan 13, 2026
Patent 12517772
EVENT PROCESSING SYSTEMS AND METHODS
2y 5m to grant Granted Jan 06, 2026
Patent 12511259
SCALABLE, SECURE, EFFICIENT, AND ADAPTABLE DISTRIBUTED DIGITAL LEDGER TRANSACTION NETWORK
2y 5m to grant Granted Dec 30, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
86%
Grant Probability
89%
With Interview (+2.9%)
2y 9m
Median Time to Grant
High
PTA Risk
Based on 905 resolved cases by this examiner. Grant probability derived from career allow rate.
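The headline numbers in this panel follow directly from the examiner's career counts reported above (781 granted of 905 resolved) plus the stated +2.9 percentage-point interview lift. The snippet below is a sketch of that arithmetic only, not the vendor's actual projection model, which may weight additional factors.

```python
granted, resolved = 781, 905   # examiner career counts from the profile above
interview_lift = 2.9           # percentage points, as reported

allow_rate = granted / resolved * 100
with_interview = allow_rate + interview_lift

print(f"Grant probability: {allow_rate:.0f}%")      # 86%
print(f"With interview:    {with_interview:.0f}%")  # 89%
```

Note that the lift is small relative to the base rate, matching the "Minimal +3% lift" characterization in the examiner profile.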
