Prosecution Insights
Last updated: April 19, 2026
Application No. 17/749,678

SYSTEM AND METHOD FOR CONTINUOUS DYNAMICS MODEL FROM IRREGULAR TIME-SERIES DATA

Final Rejection — §101, §103
Filed: May 20, 2022
Examiner: MAC, GARY
Art Unit: 2127
Tech Center: 2100 — Computer Architecture & Software
Assignee: Royal Bank Of Canada
OA Round: 2 (Final)
Grant Probability: 36% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 5y 0m
With Interview: 61%

Examiner Intelligence

Career Allow Rate: 36% (5 granted / 14 resolved; -19.3% vs TC avg); grants only 36% of cases
Interview Lift: +25.0% (strong; based on resolved cases with interview)
Avg Prosecution: 5y 0m (typical timeline; 36 applications currently pending)
Career History: 50 total applications across all art units
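The headline figures above can be reproduced from the raw counts on this page (5 granted of 14 resolved, a +25.0-point interview lift). A minimal sketch; the function name is illustrative, not part of the dashboard:

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

base = allow_rate(5, 14)       # 5 granted / 14 resolved -> ~35.7, shown as 36%
with_interview = base + 25.0   # +25.0-point interview lift -> ~60.7, shown as 61%

print(round(base), round(with_interview))  # 36 61
```

This also explains why the "With Interview" figure is exactly 25 points above the base probability: the page appears to add the lift directly rather than rescale it.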

Statute-Specific Performance

§101: 38.4% (-1.6% vs TC avg)
§103: 41.9% (+1.9% vs TC avg)
§102: 8.0% (-32.0% vs TC avg)
§112: 10.1% (-29.9% vs TC avg)
Deltas are measured against a Tech Center average estimate. Based on career data from 14 resolved cases.
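Since each statute's rate is stated alongside its delta versus the Tech Center average, the implied baselines can be recovered by simple subtraction. A sketch using the values shown above (dictionary keys are illustrative):

```python
# Examiner's per-statute rates and deltas, as shown on this page.
examiner = {"101": 38.4, "103": 41.9, "102": 8.0, "112": 10.1}
delta_vs_tc = {"101": -1.6, "103": +1.9, "102": -32.0, "112": -29.9}

# Implied Tech Center baseline = examiner rate minus stated delta.
tc_average = {s: round(examiner[s] - delta_vs_tc[s], 1) for s in examiner}
print(tc_average)
```

Notably, all four implied baselines come out to 40.0, which suggests the page may use a single Tech Center baseline estimate across statutes.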

Office Action

§101, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant’s arguments filed 12/09/2025 have been fully considered but they are not persuasive.

Applicant’s Argument: On page 7 of Applicant’s response to rejections under 35 U.S.C. 101, applicant states “In particular, the claims provide technical improvements over conventional systems by addressing challenges in the ability to represent modern time series data which may be irregular in nature. This data represents challenges in existing machine learning techniques both in terms of their structure (e.g., irregular sampling in hospital records and spatiotemporal structure in climate data) and size. See, for example, present application, paragraph [0066].”

Examiner’s Response: Applicant’s argument is not persuasive. An important consideration in determining whether a claim improves technology is the extent to which the claim covers a particular solution to a problem or a particular way to achieve a desired outcome, as opposed to merely claiming the idea of a solution or outcome (see MPEP 2106.05(a)). The amended claims do not provide sufficient details to describe any technological improvement. If the specification explicitly sets forth an improvement but in a conclusory manner (see MPEP 2106.04(d)(1): a bare assertion of an improvement without the detail necessary to be apparent to a person of ordinary skill in the art), the examiner should not determine that the claim improves technology. During examination, the examiner should analyze the “improvements” consideration by evaluating the specification and the claims to ensure that a technical explanation of the asserted improvement is present in the specification, and that the claim reflects the asserted improvement (see MPEP §2106.05(a)).
The MPEP (§2106.05(a)(II)) also warns, “it is important to keep in mind that an improvement in the abstract idea itself (e.g., a recited fundamental economic concept) is not an improvement in technology.” Here, the alleged improvement in the form of “generate or update, from the observation data, a normalizing flow model based on a log likelihood of observations with a variational lower bound, wherein variational lower bound is based on a piece-wise temporal construction of trajectories for a posterior distribution of a latent continuous-time stochastic process” is an improvement to the abstract idea of a mathematical calculation.

Applicant’s Argument: On pages 7-8 of Applicant’s response to rejections under 35 U.S.C. 102 and 103, applicant states that the amended claims have overcome the 102 rejections. Applicant also argues that the references no longer teach the amended claims, namely “a piece-wise temporal construction of trajectories for a posterior distribution of a latent continuous-time stochastic process”.

Examiner’s Response: Applicant’s argument is not persuasive. Applicant’s arguments with respect to claims 1, 13, and 25 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-2, 4-14, and 16-25 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
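The “mathematical calculation” at the center of the §101 dispute is a variational lower bound on a log-likelihood. As background only, here is a toy Monte Carlo sketch of such a bound (a generic single-sample ELBO over Gaussian toy densities, not the claimed piece-wise construction; all function names and parameter values are illustrative):

```python
import math
import random

def log_normal(x: float, mean: float, std: float) -> float:
    """Log-density of a univariate Gaussian N(mean, std^2) at x."""
    return -0.5 * math.log(2 * math.pi * std**2) - (x - mean) ** 2 / (2 * std**2)

def elbo(x: float, q_mean: float, q_std: float, n_samples: int = 1000) -> float:
    """Monte Carlo estimate of E_q[log p(x|z) + log p(z) - log q(z)] <= log p(x).

    Toy model: p(z) = N(0, 1), p(x|z) = N(z, 1), variational q(z) = N(q_mean, q_std^2).
    """
    total = 0.0
    for _ in range(n_samples):
        z = random.gauss(q_mean, q_std)                      # z ~ q(z)
        log_joint = log_normal(x, z, 1.0) + log_normal(z, 0.0, 1.0)
        total += log_joint - log_normal(z, q_mean, q_std)    # importance-style term
    return total / n_samples
```

For this toy model the true evidence is p(x) = N(0, sqrt(2)), so the estimate stays at or below log p(x), with equality only when q matches the exact posterior; that gap is what makes maximizing such a bound a training objective.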
Regarding Claim 1:

Subject Matter Eligibility Analysis Step 1: Claim 1 recites “A system for machine learning architecture for time series data prediction comprising” and is thus a machine, one of the four statutory categories of patentable subject matter.

Subject Matter Eligibility Analysis Step 2A Prong 1:

“sample trajectories of the time series data to generate observation data including at least partial realizations of the time series data” (a mental process that can be performed in the human mind with the aid of pen and paper, i.e. judgement; see par. 87 in Specification; identifying specific points of time series data for a particular timestamp)

“generate or update, from the observation data, a normalizing flow model based on a log likelihood of observations with a variational lower bound, wherein variational lower bound is based on a piece-wise temporal construction of trajectories for a posterior distribution of a latent continuous-time stochastic process” (a mathematical calculation, par. 97 in the Specification)

“generate, ” (a mathematical calculation, par. 63-65 in the Specification)

Claim 1 therefore recites an abstract idea.
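The latent continuous-time stochastic process treated above as part of a mathematical calculation is identified later in the claims (claims 8 and 20) as an Ornstein-Uhlenbeck (OU) process. For orientation, a hedged simulation sketch using the OU process's well-known closed-form transition; parameter values and function names are illustrative, not taken from the application:

```python
import math
import random

# OU process dz = -theta * z dt + sigma dW, simulated with its exact transition:
#   z(t+dt) | z(t) ~ N( z(t) * exp(-theta*dt),
#                       sigma^2/(2*theta) * (1 - exp(-2*theta*dt)) )
# Its stationary marginal is N(0, sigma^2 / (2*theta)), so the variance is
# bounded -- the two properties the claim language attaches to the process.

def ou_step(z: float, dt: float, theta: float = 1.0, sigma: float = 1.0) -> float:
    decay = math.exp(-theta * dt)
    var = sigma**2 / (2 * theta) * (1 - decay**2)
    return z * decay + math.sqrt(var) * random.gauss(0.0, 1.0)

def sample_trajectory(times, z0: float = 0.0):
    """Sample the process at arbitrary (possibly irregular) timestamps."""
    z, out, t_prev = z0, [z0], times[0]
    for t in times[1:]:
        z = ou_step(z, t - t_prev)   # exact step, no discretization error
        out.append(z)
        t_prev = t
    return out
```

Because the transition density is available in closed form between any two time points (cf. claims 9 and 21), the simulator handles irregularly spaced timestamps without numerical integration.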
Subject Matter Eligibility Analysis Step 2A Prong 2:

“a processor; and a memory coupled to the processor and storing processor-executable instructions that, when executed, configure the processor to” (mere instructions to apply the exception using a generic computer component; see MPEP 2106.05(f))

“maintain a data set representing a neural network having a plurality of weights” (this step is directed to storing data in memory, which is understood to be insignificant extra-solution activity; see MPEP 2106.05(g))

“obtain time series data associated with a data query, the time series data gathered by one or more sensors or data sources over a period of time” (this step is directed to data gathering, which is understood to be insignificant extra-solution activity; see MPEP 2106.05(g))

“generate, using the neural network ” (mere instructions to apply the exception using a generic computer component; see MPEP 2106.05(f))

“generate a signal providing an indication of the predicted value associated with the data query” (this step is directed to data gathering, which is understood to be insignificant extra-solution activity; see MPEP 2106.05(g))

The additional elements disclosed above, alone or in combination, do not integrate the judicial exception into a practical application, as they are mere insignificant extra-solution activity in combination with generic computer functions implemented with generic computer elements at a high level of generality to perform the abstract idea disclosed above. Therefore, Claim 1 is directed to the abstract idea.
Subject Matter Eligibility Analysis Step 2B:

“a processor; and a memory coupled to the processor and storing processor-executable instructions that, when executed, configure the processor to” (mere instructions to apply the exception using a generic computer component; see MPEP 2106.05(f))

“maintain a data set representing a neural network having a plurality of weights” (this step is directed to storing data in memory, which is understood to be insignificant extra-solution activity and well-understood, routine, and conventional activity of storing and retrieving information in memory as identified by the court; see MPEP 2106.05(d))

“obtain time series data associated with a data query, the time series data gathered by one or more sensors or data sources over a period of time” (this step is directed to transmitting or receiving information, which is understood to be insignificant extra-solution activity and well-understood, routine, and conventional activity of transmitting and receiving data as identified by the court; see MPEP 2106.05(d))

“generate, using the neural network ” (mere instructions to apply the exception using a generic computer component; see MPEP 2106.05(f))

“generate a signal providing an indication of the predicted value associated with the data query” (this step is directed to transmitting or receiving information, which is understood to be insignificant extra-solution activity and well-understood, routine, and conventional activity of transmitting and receiving data as identified by the court; see MPEP 2106.05(d))

The additional elements disclosed above, alone or in combination, do not recite significantly more than the abstract idea itself, as they are mere insignificant extra-solution activity in combination with generic computer functions implemented with generic computer elements at a high level of generality to perform the abstract idea disclosed above. Therefore, Claim 1 is subject-matter ineligible.
Regarding Claim 13: The claim recites a process (“A computer-implemented method for machine learning architecture for time series data prediction comprising”) that performs the method as described in claim 1. Therefore, claim 13 is rejected for the same reasons as disclosed for claim 1.

Regarding Claims 2 and 14:

Subject Matter Eligibility Analysis Step 2A Prong 1: None

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: “wherein the time series data is gathered by HVAR control system sensors, traffic control system sensors or medical sensors” (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use, and thus does not integrate the abstract idea into a practical application nor can it provide significantly more than the abstract idea itself; see MPEP 2106.05(h))

Regarding Claims 4 and 16:

Subject Matter Eligibility Analysis Step 2A Prong 1: “” (a mathematical calculation, par. 89-90 in Specification)

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: “wherein the normalizing flow model (Fθ) is configured to ” (mere instructions to apply the exception using a generic computer component; see MPEP 2106.05(f))

Regarding Claims 5 and 17:

Subject Matter Eligibility Analysis Step 2A Prong 1: None

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: “wherein Fθ is a continuous mapping and one or more sampled trajectories of the latent continuous-time stochastic process are continuous with respect to time” (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use, and thus does not integrate the abstract idea into a practical application nor can it provide significantly more than the abstract idea itself; see MPEP 2106.05(h))

Regarding Claims 6 and 18:

Subject Matter Eligibility Analysis Step 2A Prong 1: None

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: “wherein the latent state has m+1 dimensions, and wherein m is derived from the latent continuous-time stochastic process” (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use, and thus does not integrate the abstract idea into a practical application nor can it provide significantly more than the abstract idea itself; see MPEP 2106.05(h))

Regarding Claims 7 and 19:

Subject Matter Eligibility Analysis Step 2A Prong 1: “wherein a variational posterior of the latent state is based on piece-wise solutions of latent differential equations” (a mathematical calculation)

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: None

Regarding Claims 8 and 20:

Subject Matter Eligibility Analysis Step 2A Prong 1: None

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: “wherein the latent continuous-time stochastic process comprises an Ornstein-Uhlenbeck (OU) process having the stationary marginal distribution and bounded variance” (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use, and thus does not integrate the abstract idea into a practical application nor can it provide significantly more than the abstract idea itself; see MPEP 2106.05(h))

Regarding Claims 9 and 21:

Subject Matter Eligibility Analysis Step 2A Prong 1: None

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: “wherein the latent continuous-time stochastic process is configured such that transition density between two arbitrary time points is determined in closed form” (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use, and thus does not integrate the abstract idea into a practical application nor can it provide significantly more than the abstract idea itself; see MPEP 2106.05(h))

Regarding Claims 10 and 22:

Subject Matter Eligibility Analysis Step 2A Prong 1: None

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: “wherein the time series data comprises sensor data obtained from one or more physical sensor devices” (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use, and thus does not integrate the abstract idea into a practical application nor can it provide significantly more than the abstract idea itself; see MPEP 2106.05(h))

Regarding Claims 11 and 23:

Subject Matter Eligibility Analysis Step 2A Prong 1: None

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: “wherein the time series data comprises irregularly spaced temporal data” (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use, and thus does not integrate the abstract idea into a practical application nor can it provide significantly more than the abstract idea itself; see MPEP 2106.05(h))

Regarding Claims 12 and 24:

Subject Matter Eligibility Analysis Step 2A Prong 1: “wherein the predicted value comprises an interpolation between two data points from the time series data” (a mathematical calculation)

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: None

Regarding Claim 25: The claim recites an article of manufacture that performs the method as described in claim 1. Therefore, claim 25 is rejected for the same reasons as disclosed for claim 1. The limitations for additional elements of claim 25 are analyzed below.

Subject Matter Eligibility Analysis Step 2A Prong 1: Please see the Step 2A Prong 1 analysis of claim 1.

Subject Matter Eligibility Analysis Step 2A Prong 2 & 2B: “A non-transitory computer-readable medium having stored thereon machine interpretable instructions which, when executed by a processor, cause the processor to perform a computer-implemented method for machine learning architecture for time series data prediction, the method comprising” (mere instructions to apply the exception using a generic computer component; see MPEP 2106.05(f))

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 4-14, and 16-25 are rejected under 35 U.S.C. 103 as being unpatentable over Deng, “Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows”, in view of White, “Piecewise Approximate Bayesian Computation: Fast Inference for Discretely Observed Markov Models Using a Factorised Posterior Distribution”.

Regarding claim 1, Deng teaches:

“A system for machine learning architecture for time series data prediction comprising” ([pg. 1, abstract], Experiments are conducted using the time series model to make predictions on synthetic and real-world data.)

“a processor; and a memory coupled to the processor and storing processor-executable instructions that, when executed, configure the processor to” ([pg. 6, section 4, par. 1-2], It is implied that the experiments using the models and datasets are performed on a computer consisting of a processor and memory storing instructions to execute the model.)

“maintain a data set representing a neural network having a plurality of weights” ([pg. 6, section 4, par. 1-2; pg. 12, section C.3, par. 1], The latent CTFP model uses ODE-RNN as the inference network. The neural network is stored in memory on the computer (maintain) and receives synthetic data generated from common continuous-time stochastic processes as input data. All models are trained using the IWAE bound. Models inherently have weights.)

“obtain time series data associated with a data query, the time series data gathered by one or more sensors or data sources over a period of time” ([pg. 7, section 4.2, par. 1-2], In order to test the models, real-world datasets of various dynamics and complexities are retrieved (data query) from different databases. For example, the Beijing Air-Quality Dataset consists of temperature, pressure, and wind speed that are recorded once per hour. It is inherent that temperature and pressure are gathered by one or more sensors.)
“sample trajectories of the time series data to generate observation data including at least partial realizations of the time series data” ([pg. 4, section 3.2, par. 1], The observed data is an irregularly spaced time series with a corresponding timestamp (sample trajectories of the time series data). The time series is assumed to be an incomplete realization of a continuous stochastic process (partial realizations of the time series data).)

“generate or update, from the observation data, a normalizing flow model based on a log likelihood of observations with a variational lower bound, wherein variational lower bound is based on a ” ([pg. 3, section 2.5, par. 1; pg. 4, section 3.2, par. 1-6; pg. 5, section 3.4, par. 1-4], The observed data is a time series and a model is used to predict continuous trajectories into the future, given past observations. The normalizing flow model is generated based on the distribution of the observed data. The normalizing flow model is updated based on an importance-weighted autoencoder lower bound of the log-likelihood. The time series data is a continuous stochastic process.)

“generate, using the neural network and based on the time series data, a predicted value based on the normalizing flow model, the normalizing flow model based on a latent continuous-time stochastic process having a stationary marginal distribution and bounded variance” ([pg. 5, section 3.3, par. 2-3; pg. 5, section 3.4, par. 1-5; pg. 6, section 4.1, par. 2-3], The observed data is a time series and a model is used to predict continuous trajectories into the future, given past observations. The normalizing flow model may be based on a continuous stochastic process, such as the Ornstein-Uhlenbeck process. The Ornstein-Uhlenbeck process has a bounded variance and a stationary probability distribution.)

“generate a signal providing an indication of the predicted value associated with the data query” ([pg. 5, section 3.3, par. 3; Figure 2c], Figure 2c shows the extrapolation of the model based on synthetic data generated from a geometric Brownian motion process.)

Deng does not explicitly disclose an implementation of “a piece-wise temporal construction of trajectories for a posterior distribution of a latent continuous-time stochastic process”. However, White discloses in the same field of endeavor:

“... a piece-wise temporal construction of trajectories for a posterior distribution of a ” ([pg. 2, col. 2, par. 1-2; pg. 2, section 2, par. 1-4; pg. 5, section 2.4, par. 1-4; pg. 9, section 4.4, par. 1], The algorithm describes factorization (piece-wise) of the posterior density. The algorithm is tested on synthetic data such as the stochastic Lotka-Volterra model, which is an example of a stochastic discrete state-space continuous-time Markov process.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of “a piece-wise temporal construction of trajectories for a posterior distribution of a latent continuous-time stochastic process” from White into the teaching of Deng. Doing so can improve complicated stochastic models by generating the posterior density as a product of factors (White, abstract).

Regarding claim 13: Claim 13 recites a method (“A computer-implemented method for machine learning architecture for time series data prediction comprising”) that performs the same process as described in claim 1. Therefore, claim 13 is rejected for the same reasons mentioned for claim 1.

Regarding claims 2 and 14, Deng teaches:

“wherein the time series data is gathered by HVAR control system sensors, traffic control system sensors or medical sensors” ([pg. 7, section 4.2, par. 1-2], The PTB Diagnostic Database consists of ambulatory electrocardiography recordings (medical sensors). It is inherent that the recordings are collected by sensors.)
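White's contribution, as characterized in this rejection, is a factorised (piece-wise) posterior over latent states at successive observation times. A minimal sketch of what such a factorised log-density looks like, using Gaussian toy factors; this illustrates only the shape of the factorisation, not the actual algorithm of either reference, and all names are illustrative:

```python
import math

# Piece-wise (factorised) log-density over latent states z_1..z_N:
#   log q(z_1..z_N) = log q(z_1) + sum_i log q(z_i | z_{i-1})
# Each per-interval factor here is a Gaussian on the increment z_i - z_{i-1}.

def log_normal(x: float, mean: float, std: float) -> float:
    return -0.5 * math.log(2 * math.pi * std**2) - (x - mean) ** 2 / (2 * std**2)

def piecewise_log_density(z, means, stds):
    """z, means, stds: equal-length lists, one entry per observation time."""
    total = log_normal(z[0], means[0], stds[0])        # initial-state factor
    for i in range(1, len(z)):
        # one factor per interval, conditioned on the previous latent state
        total += log_normal(z[i] - z[i - 1], means[i], stds[i])
    return total
```

The point of the factorisation is that each interval contributes an independent, cheap-to-evaluate factor, so the joint posterior density never has to be handled as one monolithic object.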
Regarding claims 4 and 16, Deng teaches:

“wherein the normalizing flow model (Fθ) is configured to decode a continuous time sample path of a latent state into a complex distribution of continuous trajectories” ([pg. 2, section 1, par. 4; pg. 4, section 3.2, par. 1-2], The normalizing flow model maps continuous samples to continuous trajectories.)

Regarding claims 5 and 17, Deng teaches:

“wherein Fθ is a continuous mapping and one or more sampled trajectories of the latent continuous-time stochastic process are continuous with respect to time” ([pg. 4, section 3.2, par. 2], The normalizing flow model is a continuous mapping and the continuous trajectories are computed with respect to time.)

Regarding claims 6 and 18, Deng teaches:

“wherein the latent state has m+1 dimensions, and wherein m is derived from the latent continuous-time stochastic process” ([pg. 2, section 2.2, par. 3-4; pg. 5, section 3.4, par. 1-3; pg. 11, section A, par. 1-3], The Wiener process is d-dimensional. The Kolmogorov extension theorem states a marginalization consistency condition to consistently characterize a continuous-time stochastic process, and the distribution contains a term that represents m+1 dimensions.)

Regarding claims 7 and 19, Deng teaches:

“wherein a variational posterior of the latent state is based on ” ([pg. 3, section 2.4, par. 1; pg. 4, section 3.1, par. 1; pg. 6, section 4.1, par. 2-3], The latent state is mapped to a continuous normalizing flow based on the mapping defined by the ordinary differential equation. Equation 5 defines multiple functions for the modified variant of ANODE.)

Deng does not explicitly disclose an implementation of “wherein a variational posterior of the latent state is based on piece-wise solutions of latent differential equations”. However, White discloses in the same field of endeavor:

“wherein a variational posterior of the latent state is based on piece-wise solutions of latent differential equations” ([pg. 2, col. 2, par. 1-2; pg. 6, section 4, par. 1; pg. 7, section 4.2, par. 1], The algorithm describes factorization (piece-wise) of the posterior density. The factorization is performed on a stochastic differential equation describing the evolution of an interest rate.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of “wherein a variational posterior of the latent state is based on piece-wise solutions of latent differential equations” from White into the teaching of Deng. Doing so can improve complicated stochastic models by generating the posterior density as a product of factors (White, abstract).

Regarding claims 8 and 20, Deng teaches:

“wherein the latent continuous-time stochastic process comprises an Ornstein-Uhlenbeck (OU) process having the stationary marginal distribution and bounded variance” ([pg. 6, section 4.1, par. 3], The continuous-time stochastic process may be an Ornstein-Uhlenbeck process. The Ornstein-Uhlenbeck process is defined as having a stationary marginal distribution and bounded variance.)

Regarding claims 9 and 21, Deng teaches:

“wherein the latent continuous-time stochastic process is configured such that transition density between two arbitrary time points is determined in closed form” ([pg. 6-7, section 4.1, par. 5; Table 1], The ground-truth density function of each stochastic process is computed and refers to the closed-form negative log-likelihood of the true underlying data generation process.)

Regarding claims 10 and 22, Deng teaches:

“wherein the time series data comprises sensor data obtained from one or more physical sensor devices” ([pg. 7, section 4.2, par. 2], The real-world datasets consist of the PTB Diagnostic Database, which consists of electrocardiography (ECG) recordings, and the Beijing Air-Quality Dataset, which consists of weather and air quality data.
It is implied that the temperature, pressure, and wind speed of the environment are recorded by physical sensor devices.)

Regarding claims 11 and 23, Deng teaches:

“wherein the time series data comprises irregularly spaced temporal data” ([pg. 6, section 4.1, par. 1], The synthetic datasets are irregularly sampled time series data.)

Regarding claims 12 and 24, Deng teaches:

“wherein the predicted value comprises an interpolation between two data points from the time series data” ([pg. 5, section 3.3, par. 1-2], Interpolation is performed on the time series data.)

Regarding claim 25: Claim 25 recites an article of manufacture that performs the same process as described in claim 1. Therefore, claim 25 is rejected for the same reasons mentioned for claim 1. The additional elements of claim 25 are addressed below:

“A non-transitory computer-readable medium having stored thereon machine interpretable instructions which, when executed by a processor, cause the processor to perform a computer-implemented method for machine learning architecture for time series data prediction, the method comprising” ([pg. 6, section 4, par. 1-2], It is implied that the experiments using the models and datasets are performed on a computer consisting of a processor and memory storing instructions to execute the model.)

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to GARY MAC, whose telephone number is (703) 756-1517. The examiner can normally be reached Monday - Friday, 8:00 AM - 5:00 PM.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abdullah Kawsar, can be reached at (571) 270-3169. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/GARY MAC/
Examiner, Art Unit 2127

/ABDULLAH AL KAWSAR/
Supervisory Patent Examiner, Art Unit 2127

Prosecution Timeline

May 20, 2022
Application Filed
Jun 05, 2025
Non-Final Rejection — §101, §103
Dec 09, 2025
Response Filed
Feb 09, 2026
Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596907
NEURAL NETWORK OPERATION APPARATUS AND METHOD
2y 5m to grant; granted Apr 07, 2026
Patent 12572842
METHODS AND SYSTEMS FOR DECENTRALIZED FEDERATED LEARNING
2y 5m to grant; granted Mar 10, 2026
Study what changed to get past this examiner. Based on 2 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 36%
With Interview: 61% (+25.0%)
Median Time to Grant: 5y 0m
PTA Risk: Moderate
Based on 14 resolved cases by this examiner. Grant probability derived from career allow rate.
