Prosecution Insights
Last updated: April 19, 2026
Application No. 19/046,083

APPARATUS AND METHOD FOR TIME SERIES DATA FORMAT CONVERSION AND ANALYSIS

Non-Final Office Action (§101, §103)
Filed: Feb 05, 2025
Examiner: ZECHER, CORDELIA P K
Art Unit: 2100
Tech Center: 2100 — Computer Architecture & Software
Assignee: Anumana, Inc.
OA Round: 1 (Non-Final)
Grant Probability: 50% (Moderate)
Expected OA Rounds: 1-2
Estimated Time to Grant: 3y 8m
Grant Probability With Interview: 76%

Examiner Intelligence

Career Allow Rate: 50% (253 granted / 509 resolved; -5.3% vs TC avg)
Interview Lift: +25.8 points higher allowance rate in resolved cases with an interview
Typical Timeline: 3y 8m average prosecution
Currently Pending: 287 applications
Career History: 796 total applications across all art units
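The headline figures above are simple ratios over the examiner's resolved cases. A minimal sketch in Python; the career counts come from this page, but the with/without-interview counts are hypothetical values chosen only so the lift matches the reported +25.8 points:

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

# Career rate from the counts reported above (253 granted / 509 resolved).
career = allow_rate(253, 509)
print(f"Career allow rate: {career:.1f}%")  # 49.7%, displayed as 50%

# Interview lift = rate with an interview minus rate without one,
# over resolved cases only. These counts are hypothetical.
with_interview = allow_rate(76, 100)
without_interview = allow_rate(502, 1000)
print(f"Interview lift: {with_interview - without_interview:+.1f} points")  # +25.8 points
```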

Statute-Specific Performance

§101: 19.0% (-21.0% vs TC avg)
§103: 46.8% (+6.8% vs TC avg)
§102: 13.1% (-26.9% vs TC avg)
§112: 16.0% (-24.0% vs TC avg)
Tech Center averages are estimates; based on career data from 509 resolved cases.

Office Action

Grounds of rejection: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA. This is a non-final Office Action in response to U.S. Application No. 19/046083, filed on 02/19/2024. This application is a continuation of U.S. Patent Application Serial No. 18/591499, filed 09/15/2021. Claims 1-20 are presented for examination, with claims 1 and 11 being independent.

Information Disclosure Statement

The information disclosure statements (IDSs) submitted on 02/05/2025 and 07/22/2025 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements have been considered by the Examiner.

Claim Rejections - 35 USC § 101

35 U.S.C. § 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Claims 1-20 are directed to the abstract idea of time series data format conversion and analysis using machine learning. The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception.

Independent claims 1 and 11

Step 1: Claim 1 recites "An apparatus…"; therefore, it is a manufacture. Claim 11 recites "A method…"; the claim recites a series of steps and therefore is a process.
Independent claims 1 and 11 recite limitations of: receive (insignificant extra-solution activity) a static image comprising at least a time series of measured values, wherein the at least a time series of measured values represents an electrocardiogram (ECG) of a subject; convert (insignificant extra-solution activity) the at least a time series of measured values from the static image to a target domain protocol, wherein the conversion comprises: parsing (a mental step performed using a generic tool, where the data to be parsed is provided by a data-gathering step) the at least a time series of measured values from the static image to ECG data comprising data points representing the ECG of the subject, wherein the data points represent lead signal and time; and predict (a mental step performed using a generic computer component), using an ejection-fraction prediction model, an estimated ejection-fraction characteristic as a function of the ECG data, wherein predicting the estimated ejection-fraction characteristic comprises: inputting (insignificant extra-solution activity) the ECG data representing the ECG of the subject into the ejection-fraction prediction model; predicting (a mental step performed using a generic computer component), using the ejection-fraction prediction model, an estimated ejection-fraction characteristic of the subject as a function of the ECG data; and outputting (insignificant extra-solution activity), using the ejection-fraction prediction model, the estimated ejection-fraction characteristic.

Step 2A, Prong One: The limitations of predict, predicting, and parsing are processes that, under their broadest reasonable interpretation, cover performance of the limitations in the mind but for the recitation of generic computer components. That is, other than reciting a processor and a memory, which are generic computer components, nothing in the claim elements precludes the steps from practically being performed in the human mind or with the aid of pen and paper.
Because the limitations are performed by generically recited computer components and, under the broadest reasonable interpretation, cover performance in the mind but for those components, they fall within the "Mental Processes" grouping of abstract ideas (concepts performed in the human mind, including an observation, evaluation, judgment, and opinion).

Step 2A, Prong Two: The judicial exception is not integrated into a practical application. In particular, the claims recite the additional limitations receive, convert, inputting, and outputting; these limitations amount to mere generic gathering/collecting, analyzing, and transmitting of data (see MPEP 2106.05(g)). Further, these additional limitations are recited as being performed by a processor and a memory, and provide nothing more than mere instructions to implement an abstract idea on a generic computer. See MPEP 2106.05(f). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words "apply it" (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception.

Step 2B: The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The limitations receive, convert, inputting, and outputting are recognized by the courts as well-understood, routine, and conventional activities when they are claimed in a merely generic manner (see MPEP 2106.05(d)(II)(iv), gathering/collecting, analyzing, and transmitting data; Versata Dev. Group Inc....
As explained with respect to Step 2A, Prong Two, the additional elements performed by a processor and a memory are at best mere instructions to "apply" the abstract ideas, which cannot provide an inventive concept. See MPEP 2106.05(f). Generally linking the use of the judicial exception to a particular technological environment or field of use likewise cannot provide an inventive concept: e.g., a claim describing how the abstract idea of hedging could be used in the commodities and energy markets, as discussed in Bilski v. Kappos, 561 U.S. 593, 595, 95 USPQ2d 1001, 1010 (2010), or a claim limiting the use of a mathematical formula to the petrochemical and oil-refining fields, as discussed in Parker v. Flook, 437 U.S. 584, 588-90, 198 USPQ 193, 197-98 (1978) (MPEP § 2106.05(h)). Because claims 1 and 11 are directed to abstract ideas, the claims are not patent eligible.

Claims 2-10 and 12-20

The limitations recited in claims 2-10 and 12-20 simply describe the concepts of time series data format conversion and analysis using machine learning. The claims do not include additional elements sufficient to amount to significantly more than the judicial exceptions, and they cannot provide an inventive concept. Therefore, claims 2-10 and 12-20 are directed to abstract ideas and are not patent eligible. Analysis of the dependent claims is shown below.

Claim 2 recites the limitations wherein: the static image comprises an ECG format comprising multiple leads; and parsing the at least a time series of measured values from the static image to the ECG data comprises parsing the at least a time series of measured values from the static image to the ECG data, wherein the ECG data comprises ECG data for multiple leads.
These limitations are processes that, under their broadest reasonable interpretation, cover performance of the limitations in the mind; they therefore fall within the "Mental Processes" grouping of abstract ideas (concepts performed in the human mind, including an observation, evaluation, judgment, and opinion).

Claim 3 recites the limitations wherein parsing the at least a time series of measured values from the static image to the ECG data comprises: scaling the data points along a time axis; and aligning the data points along a lead signal axis. These limitations are mathematical concepts. The courts have found that mathematical relationships fall within the "Mathematical Concepts" grouping of abstract ideas.

Claim 4 recites the limitations wherein parsing the at least a time series of measured values from the static image to the ECG data comprises: inputting the static image into a machine-learning model, wherein the machine-learning model is trained using a discriminator configured to differentiate between generated ECG data and real ECG data; generating, by the machine-learning model, a transformation of the static image into ECG data; and outputting the ECG data based on the generated ECG data. These limitations are insignificant extra-solution activity of mere generic transmission and presentation of collected and analyzed data (see MPEP 2106.05(g)), which is well-understood, routine, and conventional (see MPEP 2106.05(d)).

Claim 5 recites the limitations wherein parsing the at least a time series of measured values from the static image to ECG data comprises: inputting the static image into a machine-learning model trained using synthetic image data generated from digital ECG data; and outputting, by the machine-learning model, the ECG data. These limitations are insignificant extra-solution activity of mere generic presentation of collected and analyzed data (see MPEP 2106.05(g)), which is well-understood, routine, and conventional (see MPEP 2106.05(d)).
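The parsing at issue in claims 3-5 is, at bottom, geometric rescaling: pixel coordinates traced from a static ECG image are scaled along the time axis and offset-aligned along the lead-signal axis. A minimal sketch, assuming standard ECG paper calibration (25 mm/s, 10 mm/mV) and a hypothetical pixel resolution; none of these constants come from the application itself:

```python
import numpy as np

# Hypothetical calibration constants (not from the application).
PX_PER_MM = 8.0     # assumed scan resolution
MM_PER_SEC = 25.0   # standard ECG paper speed
MM_PER_MV = 10.0    # standard ECG gain

def parse_trace(pixel_xy: np.ndarray, baseline_px: float) -> np.ndarray:
    """Map (x, y) pixel points of one lead trace to (time_s, millivolts)."""
    x_px, y_px = pixel_xy[:, 0], pixel_xy[:, 1]
    t = x_px / (PX_PER_MM * MM_PER_SEC)                  # scale along time axis
    mv = (baseline_px - y_px) / (PX_PER_MM * MM_PER_MV)  # align along signal axis
    return np.column_stack([t, mv])

# Two toy pixel samples: 200 px rightward = 1 s; 80 px above baseline = 1 mV.
print(parse_trace(np.array([[0.0, 400.0], [200.0, 320.0]]), baseline_px=400.0))
```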
Claim 6 recites the limitations wherein the processor is further configured to: input the ECG data into a feature extractor (insignificant extra-solution activity of mere generic presentation of collected/gathered data (see MPEP 2106.05(g)), which is well-understood, routine, and conventional (see MPEP 2106.05(d))); analyze, using the feature extractor, the ECG data to determine morphological features of the ECG (insignificant extra-solution activity of mere generic presentation of analyzed data (see MPEP 2106.05(g)), which is well-understood, routine, and conventional (see MPEP 2106.05(d))); and predict, using the ejection-fraction prediction model, the ejection-fraction prediction as a function of the morphological features of the ECG (a mathematical concept; the courts have found that mathematical relationships fall within the grouping of abstract ideas).

Claim 7 recites the limitations wherein the ejection-fraction prediction model comprises a neural network trained, using gradient-descent machine-learning techniques, with a set of multiple training data pairs, wherein the set of multiple training data pairs comprises: exemplary time-series data; and a target ejection-fraction characteristic. These limitations are processes that, under their broadest reasonable interpretation, cover performance of the limitations in the mind; they therefore fall within the "Mental Processes" grouping of abstract ideas (concepts performed in the human mind, including an observation, evaluation, judgment, and opinion).
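The training scheme recited in claim 7 (a network fit by gradient descent on pairs of exemplary time-series data and a target ejection-fraction value) can be sketched with a toy linear model standing in for the neural network; the data below are synthetic and the setup is an illustration, not the applicant's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 8))   # exemplary time-series features (synthetic)
w_true = rng.normal(size=8)
y = X @ w_true                 # target ejection-fraction characteristic (toy)

w = np.zeros(8)                # model parameters
learning_rate = 0.05
for _ in range(500):           # plain gradient descent on the MSE loss
    grad = 2.0 * X.T @ (X @ w - y) / len(X)
    w -= learning_rate * grad

print("final training MSE:", np.mean((X @ w - y) ** 2))
```

A real ejection-fraction model would replace the linear map with a neural network and the synthetic pairs with labeled ECG data, but the fit-by-gradient-descent loop is structurally the same.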
Claim 8 recites the limitations wherein the at least a processor is further configured to select the ejection-fraction prediction model as a function of one or more characteristics of the subject, wherein selecting the ejection-fraction prediction model comprises: identifying a set of characteristics for the subject; selecting the ejection-fraction prediction model from one or more ejection-fraction prediction models as a function of the set of characteristics for the subject; and generating the estimated ejection-fraction characteristic using the selected ejection-fraction prediction model and the ECG data of the subject. These limitations are processes that, under their broadest reasonable interpretation, cover performance of the limitations in the mind; they therefore fall within the "Mental Processes" grouping of abstract ideas (concepts performed in the human mind, including an observation, evaluation, judgment, and opinion).

Claim 9 recites the limitation wherein the processor is further configured to classify the estimated ejection-fraction characteristic into a risk category comprising one or more thresholds. This limitation is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind; it therefore falls within the "Mental Processes" grouping of abstract ideas (concepts performed in the human mind, including an observation, evaluation, judgment, and opinion).

Claim 10 recites the limitations wherein the at least a processor is further configured to determine whether the estimated ejection-fraction characteristic meets one or more screening criteria comprising a threshold ejection-fraction, wherein determining whether the estimated ejection-fraction characteristic meets one or more screening criteria comprises: comparing the one or more screening criteria to the estimated ejection-fraction characteristic. These limitations are mathematical concepts.
The courts have found that mathematical relationships fall within the "Mathematical Concepts" grouping of abstract ideas.

Claims 12-20 recite a method for time series data format conversion and analysis using machine learning, comprising steps similar in subject matter to those of claims 2-10. Therefore, claims 12-20 are rejected for the same reasons as claims 2-10.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office Action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Bordaweker et al., US 2019/0294953 (hereinafter Bordaweker), and further in view of Attia et al., US 2020/0397313 (hereinafter Attia).

Regarding claim 1, Bordaweker discloses an apparatus for time series data format conversion and analysis using machine-learning, wherein the apparatus comprises: at least a processor, and a memory communicatively connected to the at least a processor (e.g.
Processor 110, Memory 115, Bordaweker: [0005], [0017] and Fig. 1), the memory containing instructions configuring the at least a processor to: receive a static image comprising at least a time series of measured values (e.g. a new time series is received and processed. The time series includes a series of data points or a sequence of data corresponding to an interval of time, Bordaweker: [0005], [0024], [0041]. The time series data may include data relating to geophysical measurements (e.g., temperature, sea level, wind speed, and the like), energy usage, computing resource usage, population, congestion, energy levels, healthcare data (e.g., blood pressure, heart rate, etc.) [as static image], and any other data that may be collected over a window of time, Bordaweker: [0025]); convert the at least a time series of measured values from the static image to a target domain protocol (e.g. the time series must be converted to a sequence of symbols or characters in order to facilitate generation of the feature vector. In order to convert the time series data to a sequence of symbols, the Pre-Processing Component 135 delineates the time series into segments of equal length. In one embodiment, once the segments are created, the segments are clustered [as a target domain protocol] based on the data values within each segment. The Pre-Processing Component 135 may then assign a symbol or character to each cluster, and thereby represent the original time series as a sequence of symbols, Bordaweker: [0018]-[0020]), wherein the conversion comprises: parsing the at least a time series of measured values from the static image to data (e.g. in order to convert the time series data to a sequence of symbols, the Pre-Processing Component 135 delineates the time series into segments of equal length [as parsing the received time series].
The Feature Extractor 140 generates one or more feature vectors for each defined segment of time in each Time Series 150 based on the associated context for each segment, Bordaweker: [0018]-[0020] and [0040]).

Bordaweker does not directly or explicitly disclose: wherein the at least a time series of measured values represents an electrocardiogram (ECG) of a subject; wherein the conversion comprises: ECG data comprising data points representing the ECG of the subject, wherein the data points represent lead signal and time; and predict, using an ejection-fraction prediction model, an estimated ejection-fraction characteristic as a function of the ECG data, wherein the predicting the estimated ejection-fraction characteristic comprises: inputting the ECG data representing the ECG of the subject into the ejection-fraction prediction model; predicting, using the ejection-fraction prediction model, an estimated ejection-fraction characteristic of the subject as a function of the ECG data; and outputting, using the ejection-fraction prediction model, the estimated ejection-fraction characteristic.

Attia teaches: wherein the at least a time series of measured values represents an electrocardiogram (ECG) of a subject (e.g. a computer system obtains ECG data that describes results of the ECG over a period of time, using the ECG data to estimate an ejection-fraction characteristic of a subject 102. For the purpose of this example, the subject 102 will be considered as a human, and more specifically as a patient of a healthcare provider.
A predictive input contains a time-series of values indicating the amplitude of the ECG for one or more leads at each point in time at a specified sampling frequency over a period of time that spans one or more cardiac cycles (e.g., 1, 2, 5, or 10 seconds), Attia: abstract, [0006], [0046], [0055]); wherein the conversion comprises: ECG data comprising data points representing the ECG of the subject, wherein the data points represent lead signal and time (e.g. the predictive input generator 116 may normalize and vectorize ECG data from one or more channels (corresponding to one or more leads) [as parsing]. Each lead provides a different view of the patient's cardiac electrical activity as a result of the different angles formed by the different pairs of electrodes for the different leads. The signal from each lead can be recorded simultaneously for a period of time (e.g., 5, 10, or 15 seconds) [as data points] to capture information about the timing and location of electrical activity along different radial directions, Attia: [0047], [0055]); and predict, using an ejection-fraction prediction model, an estimated ejection-fraction characteristic as a function of the ECG data (e.g. the system 110 further includes one or more ejection-fraction prediction models 118. These models 118 are generally configured to process one or more predictive inputs that characterize a patient's ECG data and, based on the predictive inputs, generate an estimated ejection-fraction characteristic for the patient 102, Attia: [0052]), wherein the predicting the estimated ejection-fraction characteristic comprises: inputting the ECG data representing the ECG of the subject into the ejection-fraction prediction model (e.g.
the system provides a predictive input that was derived from the ECG data to an ejection-fraction predictive model, Attia: [0009]); predicting, using the ejection-fraction prediction model, an estimated ejection-fraction characteristic of the subject as a function of the ECG data (e.g. the system 110 further includes one or more ejection-fraction prediction models 118. These models 118 are generally configured to process one or more predictive inputs that characterize a patient's ECG data and, based on the predictive inputs, generate an estimated ejection-fraction characteristic for the patient 102, Attia: [0052]); and outputting, using the ejection-fraction prediction model, the estimated ejection-fraction characteristic (e.g. the predictive input can be processed using the ejection-fraction predictive model to generate an estimated ejection-fraction characteristic. The system then provides, for output, the estimated ejection-fraction characteristic, Attia: [0009], [0021], [0058]). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the method of comparing time series data as disclosed by Bordaweker to include the systems, methods, devices, and techniques for estimating an ejection-fraction characteristic as taught by Attia, in order to provide an ejection-fraction prediction.

Regarding claim 2, Attia further teaches wherein: the static image comprises an ECG format comprising multiple leads (e.g. the ECG data can include multiple channels, each channel including a subset of the ECG data that describes a respective one of multiple leads of the ECG of the mammal over the period of time.
The predictive input can characterize the multiple leads of the ECG for each of the multiple channels of the ECG data, Attia: [0012]); and parsing the at least a time series of measured values from the static image to the ECG data comprises parsing the at least a time series of measured values from the static image to the ECG data, wherein the ECG data comprises ECG data for multiple leads (e.g. the predictive input generator 116 may normalize and vectorize ECG data from one or more channels (corresponding to one or more leads) [as parsing]. Each lead provides a different view of the patient's cardiac electrical activity as a result of the different angles formed by the different pairs of electrodes for the different leads. The signal from each lead can be recorded simultaneously for a period of time (e.g., 5, 10, or 15 seconds) [as data points] to capture information about the timing and location of electrical activity along different radial directions, Attia: [0047], [0055]). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the method of comparing time series data as disclosed by Bordaweker to include the systems, methods, devices, and techniques for estimating an ejection-fraction characteristic as taught by Attia, in order to provide an ejection-fraction prediction.

Regarding claim 3, Bordaweker further discloses wherein parsing the at least a time series of measured values from the static image to the ECG data comprises: scaling the data points along a time axis (e.g. time is graphed on the horizontal axis, Bordaweker: [0025] and Fig. 2); and aligning the data points along a lead signal axis (e.g. the data value in the Time Series 210a-c at the corresponding time is graphed on the vertical axis, Bordaweker: [0025] and Fig. 2).
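For context, the Bordaweker pre-processing cited in the mappings above (delineating a time series into equal-length segments, clustering the segments by their values, and assigning a symbol to each cluster) resembles classic time-series symbolization. A rough sketch, using simple mean-binning in place of real clustering; the binning rule is an assumption for brevity, not Bordaweker's actual algorithm:

```python
import string
import numpy as np

def symbolize(series: np.ndarray, seg_len: int, n_bins: int = 4) -> str:
    """Represent a time series as a symbol sequence: segment, bin, label."""
    usable = len(series) // seg_len * seg_len
    segments = series[:usable].reshape(-1, seg_len)  # equal-length segments
    means = segments.mean(axis=1)
    # Equal-width bins over the observed range stand in for the "clusters".
    edges = np.linspace(means.min(), means.max(), n_bins + 1)[1:-1]
    labels = np.digitize(means, edges)
    return "".join(string.ascii_lowercase[i] for i in labels)

ts = np.array([0, 0, 1, 1, 5, 5, 6, 6, 0, 0], dtype=float)
print(symbolize(ts, seg_len=2))  # prints "aadda"
```

Once symbolized this way, two series can be compared as strings, which is what makes the feature-vector generation Bordaweker describes tractable.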
Regarding claim 4, Bordaweker further discloses wherein parsing the at least a time series of measured values from the static image to the ECG data comprises: inputting the static image into a machine-learning model, wherein the machine-learning model is trained using a discriminator configured to differentiate between generated ECG data and real ECG data (e.g. the generated similarity measures are used to train one or more machine learning models to predict a future data series corresponding to an input data series, Bordaweker: [0045]); generating, by the machine-learning model, a transformation of the static image into ECG data (e.g. the numerical time series data must be converted to the sequence of symbols. In order to convert the time series data to a sequence of symbols, the Pre-Processing Component 135 delineates the time series into segments of equal length, Bordaweker: [0020]); and outputting the ECG data based on the generated ECG data (e.g. the Feature Extractor 140 may generate one or more feature vectors [as ECG data] for the Segment 211 based on its corresponding context (Segment 213 and/or the preceding time segment). In an embodiment, each of these feature vectors is stored along with the Time Series 150 so it can be compared with newly generated feature vectors. The subsequent time segment corresponding to a first time window may be provided as the target output of a machine learning model, Bordaweker: [0040] and [0045]).

Regarding claim 5, Bordaweker further discloses wherein parsing the at least a time series of measured values from the static image to ECG data comprises: inputting the static image into a machine-learning model trained (e.g. the generated similarity measures are used to train one or more machine learning models to predict a future data series corresponding to an input data series, Bordaweker: [0045]) using synthetic image data generated from digital ECG data; and outputting, by the machine-learning model (e.g.
the Neural Network 300 includes a plurality of Nodes 302-324 (often referred to as neurons) and is trained to generate an output Time Series 355 when provided with an input Time Series 350, Bordaweker: [0034]), the ECG data.

Regarding claim 6, Attia further teaches wherein the processor is further configured to: input the ECG data into a feature extractor (e.g. the system 110 includes a feature extractor 126 that analyzes ECG data and determines values of any applicable morphological features to include in the predictive input that will be processed by an ejection-fraction prediction model 118, Attia: [0056]); analyze, using the feature extractor, the ECG data to determine morphological features of the ECG (e.g. the system 110 includes a feature extractor 126 that analyzes ECG data and determines values of any applicable morphological features to include in the predictive input that will be processed by an ejection-fraction prediction model 118, Attia: [0056]); and predict, using the ejection-fraction prediction model, the ejection-fraction prediction as a function of the morphological features of the ECG (e.g. the system 110 includes a feature extractor 126 that analyzes ECG data and determines values of any applicable morphological features to include in the predictive input that will be processed by an ejection-fraction prediction model 118, Attia: [0056]). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the method of comparing time series data as disclosed by Bordaweker to include the systems, methods, devices, and techniques for estimating an ejection-fraction characteristic as taught by Attia, in order to provide an ejection-fraction prediction.

Regarding claim 7, Attia further teaches wherein the ejection-fraction prediction model comprises a neural network trained, using gradient-descent machine-learning techniques, with a set of multiple training data pairs (e.g.
the ejection-fraction prediction model(s) 118 may be regression models, machine-learning models, or both. In some implementations, the model(s) 118 are feedforward, recurrent, or convolutional neural networks, or a capsule network. Neural network models may have fully connected layers and may employ an auto-encoder network, Attia: [0053], [0064]), wherein the set of multiple training data pairs comprises: exemplary time-series data (e.g. the predictive inputs represent a time-series of values for the ECG waveform, Attia: [0058]); and a target ejection-fraction characteristic (e.g. the system obtains a set of multiple training data pairs; each pair includes an ECG predictive input that characterizes a particular patient's ECG and a target ejection-fraction characteristic for the patient, Attia: [0064]). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the method of comparing time series data as disclosed by Bordaweker to include the systems, methods, devices, and techniques for estimating an ejection-fraction characteristic as taught by Attia, in order to provide an ejection-fraction prediction.

Regarding claim 8, Attia further teaches wherein the at least a processor is further configured to select the ejection-fraction prediction model as a function of one or more characteristics of the subject (e.g. selecting and using an appropriate ejection-fraction prediction model that corresponds to the characteristics of a patient, Attia: [0061]), wherein selecting the ejection-fraction prediction model comprises: identifying a set of characteristics for the subject (e.g. the system identifies a set of characteristics for the patient for whom an estimated ejection-fraction characteristic is to be determined, Attia: [0061]); selecting the ejection-fraction prediction model from one or more ejection-fraction prediction models as a function of the set of characteristics for the subject (e.g.
the system selects one of the ejection-fraction prediction models that corresponds to the identified set of characteristics for the patient, Attia: [0061]); and generating the estimated ejection-fraction characteristic using the selected ejection-fraction prediction model and the ECG data of the subject (e.g. the system generates an ejection-fraction prediction using the selected ejection-fraction prediction model that corresponds to the patient's characteristics, Attia: [0061]). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the method of comparing time series data as disclosed by Bordaweker to include the systems, methods, devices, and techniques for estimating an ejection-fraction characteristic as taught by Attia, in order to provide an ejection-fraction prediction.

Regarding claim 9, Attia further teaches wherein the processor is further configured to classify the estimated ejection-fraction characteristic into a risk category comprising one or more thresholds (e.g. the ejection-fraction model may be trained to classify a patient's ejection fraction into one of two, three, or more possible ejection-fraction categories defined by specified threshold ejection-fraction values, Attia: [0058]). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify the method of comparing time series data as disclosed by Bordaweker to include the systems, methods, devices, and techniques for estimating an ejection-fraction characteristic as taught by Attia, in order to provide an ejection-fraction prediction.

Regarding claim 10, Attia further teaches wherein the at least a processor is further configured to determine whether the estimated ejection-fraction characteristic meets one or more screening criteria comprising a threshold ejection-fraction (e.g.
ECG-based estimates of a patient's ejection fraction can be useful screening procedure, further evaluation of a patient may be warranted based on the results of an ECG-based screening procedure. The estimated ejection-fraction characteristic may be an absolute value that indicates the predicted ejection fraction of a patient and the screening criteria may include a threshold ejection fraction, Attia: [0062]), wherein determining whether the estimated ejection-fraction characteristic meets one or more screening criteria comprises: comparing the one or more screening criteria to the estimated ejection-fraction characteristic (e.g. the system determines whether the estimated ejection-fraction characteristic, and optionally additional factors, meet one or more screening criteria that are to guide a decision whether to further evaluation of the patient's condition is warranted. For example, the estimated ejection-fraction characteristic may be an absolute value that indicates the predicted ejection fraction of a patient and the screening criteria may include a threshold ejection fraction (e.g., 35-percent or 50-percent). If the estimated ejection fraction of the patient is below the threshold, a follow-on procedure for further evaluation can be performed on the patient, Attia: [0062]). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to modify method of comparing time series data as disclosed by Bordaweker to include Systems, methods, devices, and techniques for estimating an ejection-fraction characteristic as taught by Attia to provide an ejection-fraction prediction. Claims 11-20 recite, A method for time series data format conversion and analysis using machine-learning, wherein the method comprises steps are similar to subject matters of claims 1-10. Therefore, claims 11-20 are rejected by the same reason as claims 1-10. 
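For orientation only, the threshold logic the Office Action maps onto claims 9 and 10 (categorizing an estimated ejection fraction against specified thresholds and deciding whether follow-up is warranted) can be sketched in Python. This is an illustrative assumption, not code from the application or from Attia; the function names, category labels, and default thresholds (the 35-percent and 50-percent values mentioned in Attia [0062]) are chosen for this sketch.

```python
def classify_ejection_fraction(ef_percent: float,
                               thresholds: tuple[float, float] = (35.0, 50.0)) -> str:
    """Bucket an estimated ejection fraction into one of several risk
    categories defined by threshold values (cf. claim 9 / Attia [0058])."""
    low, borderline = thresholds
    if ef_percent < low:
        return "reduced"
    if ef_percent < borderline:
        return "borderline"
    return "preserved"


def needs_follow_up(ef_percent: float, screening_threshold: float = 50.0) -> bool:
    """Apply the screening criterion described for claim 10: flag the
    patient for further evaluation when the estimated ejection fraction
    falls below the threshold (cf. Attia [0062])."""
    return ef_percent < screening_threshold
```

The sketch only illustrates why the examiner treats these limitations as simple threshold comparisons; the actual claimed apparatus pairs this logic with a trained ECG prediction model.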
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CECILE H VO, whose telephone number is (571) 270-3031. The examiner can normally be reached Mon-Fri (9AM-5PM).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kavita Stanley, can be reached at (571) 272-8352. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CECILE H VO/
Examiner, Art Unit 2153
9/25/2025

/KAVITA STANLEY/
Supervisory Patent Examiner, Art Unit 2153

Prosecution Timeline

Feb 05, 2025
Application Filed
Sep 25, 2025
Non-Final Rejection — §101, §103
Apr 01, 2026
Response Filed

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12583466
VEHICLE CONTROL MODULES INCLUDING CONTAINERIZED ORCHESTRATION AND RESOURCE MANAGEMENT FOR MIXED CRITICALITY SYSTEMS
2y 5m to grant Granted Mar 24, 2026
Patent 12578751
DATA PROCESSING CIRCUITRY AND METHOD, AND SEMICONDUCTOR MEMORY
2y 5m to grant Granted Mar 17, 2026
Patent 12561162
AUTOMATED INFORMATION TECHNOLOGY INFRASTRUCTURE MANAGEMENT
2y 5m to grant Granted Feb 24, 2026
Patent 12536291
PLATFORM BOOT PATH FAULT DETECTION ISOLATION AND REMEDIATION PROTOCOL
2y 5m to grant Granted Jan 27, 2026
Patent 12393641
METHODS FOR UTILIZING SOLVER HARDWARE FOR SOLVING PARTIAL DIFFERENTIAL EQUATIONS
2y 5m to grant Granted Aug 19, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
50%
Grant Probability
76%
With Interview (+25.8%)
3y 8m
Median Time to Grant
Low
PTA Risk
Based on 509 resolved cases by this examiner. Grant probability derived from career allow rate.
