Prosecution Insights
Last updated: April 19, 2026
Application No. 18/574,377

A METHOD AND A SYSTEM FOR PROVIDING DATA FOR DIAGNOSIS AND PREDICTION OF AN OUTCOME OF CONDITIONS AND THERAPIES

Non-Final OA: §101, §103, §112
Filed: Dec 27, 2023
Examiner: WEBB, JESSICA MARIE
Art Unit: 3683
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Inteneural Networks Inc.
OA Round: 1 (Non-Final)

Grant Probability: 33% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 3y 0m
With Interview: 86%

Examiner Intelligence

Career Allow Rate: 33% (33 granted / 99 resolved; -18.7% vs TC avg). Grants only 33% of cases.
Interview Lift: strong, +52.5% across resolved cases with interview.
Typical timeline: 3y 0m average prosecution; 21 currently pending.
Career history: 120 total applications across all art units.

Statute-Specific Performance

§101: 33.6% (-6.4% vs TC avg)
§103: 34.3% (-5.7% vs TC avg)
§102: 5.1% (-34.9% vs TC avg)
§112: 23.3% (-16.7% vs TC avg)

Tech Center averages are estimates. Based on career data from 99 resolved cases.

Office Action

Rejections: §101, §103, §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

Status of Claims

Claims 1-4 are pending and have been examined.

Priority

Acknowledgement is made of applicant's claim to priority under 35 U.S.C. 371 to PCT Application No. PCT/US2022/035062 filed 06/27/2022, which claims priority to U.S. Provisional Patent Application No. 63/215,491 filed 06/27/2021.

Specification

The abstract should be in narrative form and generally limited to a single paragraph within the range of 50 to 150 words in length. The Abstract is objected to as it needs to be submitted in narrative form. See MPEP § 608.01(b) for guidelines for the preparation of patent abstracts.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 1-4 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention. Appropriate correction is required.

Claims 1 and 4 recite the limitation "determining a patient from the other patient's history database" (claim 1 being representative). There is insufficient antecedent basis for this limitation in the claim. It is unclear whether the determining step is referring back to the other patients history database recited in the providing step.

Claims 1 and 4 recite the limitation including "the examined patients" (claim 1 being representative). There is insufficient antecedent basis for this limitation in the claim. It is unclear whether the determining step is referring back to the examined patient in the first claim step.
The claim as a whole is concerned with retrieving images of one or more similar patients that match the one examined patient image, rather than adding one or more examined patients to a cluster of similar patients.

The rejections applied to claim 1 also apply to dependent claims 2-3. Claim 3 recites "the patient history database", which lacks antecedent basis. The Examiner interprets this as the other patients history database.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-4 are rejected under 35 U.S.C. §101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Claims 1 and 4 are rejected under 35 U.S.C. §101 because the claimed invention is directed to an abstract idea without significantly more.

Eligibility Analysis Step 1 (YES): Claims 1 and 4 fall into at least one of the statutory categories (i.e., process or system).

Eligibility Analysis Step 2A1 (YES): The claims recite an abstract idea.
The identified abstract idea is as underlined (claim 1 being representative): receiving a medical image of an examined patient, the medical image covering an area or volume of the examined patient's anatomy; inputting the medical image to a classifying neural network to generate descriptors; receiving additional data of the examined patient; providing an other patients history database comprising other patients' records, the records including the descriptors, the additional data and a clinical outcome of individual patients; determining a patient from the other patient's history database being a closest match to the examined patients in terms of features of the descriptors to be a digital twin patient; and presenting the clinical outcome of the digital twin patient.

The identified claim elements, as drafted, recite a process that under the broadest reasonable interpretation (BRI) covers a method of organizing human activity but for the recitation of generic computer component language (in claim 4 only, discussed below in 2A2). That is, other than reciting the generic computer component language, the claimed invention amounts to a human following a series of rules or instructions, which is a method of managing personal behavior or relationships or interactions between people. For example, but for the generic computer component language, the claims encompass a person receiving a medical image of an examined patient, inputting the medical image to generate descriptors, receiving additional data of the examined patient, providing an other patients history database, determining a patient from the other patient's history database being a closest match to the examined patients, and presenting a clinical outcome of a digital twin patient in the manner described in the identified abstract idea, supra. The Examiner notes that certain "method[s] of organizing human activity" include a person's interaction with a computer. MPEP 2106.04(a)(2)(II).
If a claim limitation, under its broadest reasonable interpretation, covers managing personal behavior or interactions between people but for the recitation of the generic computer component language, then it falls within the "certain methods of organizing human activity" grouping of abstract ideas. See additionally MPEP 2106. Accordingly, the claims recite an abstract idea.

Eligibility Analysis Step 2A2 (NO): The judicial exception, the above-identified abstract idea, is not integrated into a practical application. In particular, the claims recite the additional elements of a non-transitory processor-readable storage medium and a processor (claim 4 only) that implement the identified abstract idea. The aforementioned additional elements are not described by the applicant and are recited at a high level of generality (i.e., a generic computer or computer component performing a generic computer or computer component function that facilitates the identified abstract idea) such that these amount to no more than mere instructions to apply the exception using a generic computer component (see Specification, e.g., at pg. 2, para. 3). See MPEP § 2106.04(d)(I). Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims are directed to an abstract idea.

The claims further recite the additional element of a classifying neural network that implements the identified abstract idea. The additional element is recited at a high level of generality and is merely invoked as a tool to perform an existing process (MPEP § 2106.05(f)(2), see case involving a commonplace business method or mathematical algorithm being applied on a general-purpose computer within the "Other examples"), such that this amounts to no more than mere instructions to apply the abstract idea using a general-purpose computer (see Specification at pg. 5, para. 1: "a convolutional neural network trained to perform typical classification task on a dataset such as ImageNet can be used"). See MPEP § 2106.04(d)(I); and Alice Corp. Pty. Ltd. v. CLS Bank Int'l, 134 S. Ct. 2347, 2357 (2014). Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims are directed to an abstract idea.

Eligibility Analysis Step 2B (NO): The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of a non-transitory processor-readable storage medium and a processor (claim 4 only) to perform the method (represented by claim 1) amount to no more than mere instructions to apply the exception using a generic computer or generic computer component. Mere instructions to apply an exception using generic computer(s) and/or generic computer component(s) cannot provide an inventive concept ("significantly more"). See MPEP § 2106.05(f).

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of a classifying neural network to perform the method amounts to no more than mere instructions to "apply it" with the exception by invoking an algorithm merely as a tool to perform an existing process (i.e., only recites the algorithm as a tool to apply data to an algorithm and report the results), in this case to receive input data and output data. The use of a trained machine learning algorithm in its ordinary capacity to perform tasks in the identified abstract idea does not provide an inventive concept ("significantly more").
See MPEP § 2106.05(f). Accordingly, alone or in combination, the additional element does not provide significantly more. Thus, the claims are not patent eligible.

Dependent claims 2-3, when analyzed as a whole, are similarly rejected under 35 U.S.C. §101 because the additional limitation(s) fail(s) to establish that the claim(s) is/are not directed to an abstract idea without significantly more. The claims, when considered alone or as an ordered combination, either (1) merely further define the abstract idea, (2) do not further limit the claim to a practical application, or (3) do not provide an inventive concept such that the claims are subject matter eligible.

Claim 2 merely further describes the additional element of the classifying neural network being an ImageNet that implements the identified abstract idea. The ImageNet is recited at a high level of generality and is merely invoked as a tool to perform an existing process (MPEP § 2106.05(f)(2), see case involving a commonplace business method or mathematical algorithm being applied on a general-purpose computer within the "Other examples"), such that this amounts to no more than mere instructions to apply the abstract idea using a general-purpose computer (see Applicant's disclosure at Fig. 4, which describes the ImageNet). See analysis, supra.

Claim 3 merely further describes the abstract idea (e.g., determining the digital twin patient, finding a set of most similar candidates from the patient history database using a first technique, finding the digital twin patient from the set of the most similar candidates using a second technique). See analysis, supra.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1 and 3-4 are rejected under 35 U.S.C. 103 as being unpatentable over US 2019/0371439 A1 to Lisowska et al. ("Lisowska" herein) in view of US 2020/0019617 A1 to Eswaran et al. ("Eswaran" herein).

Re. Claim 1, Lisowska teaches a method comprising:

receiving a medical image (40) of an examined patient, the medical image covering an area or volume of the examined patient's anatomy (Fig. 2, [0040]-[0041], [0048] teach forming a plurality of medical data sets, each comprising at least one medical image 30 obtained from patient imaging and medical record data 32 associated with the patients for whom imaging was performed… In use time, receiving a further medical data set comprising current patient imaging 40 comprising at least one further medical image 40 and records 42 comprising further information about the patient and/or image(s).);

inputting the medical image to a classifying neural network to generate descriptors (Fig. 3, [0059], [0165] teach the embedding model 66 is a trained classifier, e.g., a convolutional neural network (CNN)… The embedding model 66 outputs a plurality of features 192 and further outputs a prediction 194 for each of the classification tasks. Fig. 2, [0049]-[0050], [0087] teach applying the embedding model 34 to the plurality of stored medical data sets and to the further medical data set to obtain respective embeddings / compressed representations 48 (descriptors), each of which comprises a fixed-length one-dimensional vector… point 52a is representative of the further medical data set for the current patient under review.);

receiving additional data of the examined patient (see [0048], further medical records 42);

providing an other patients history database comprising other patients' records, the records including the descriptors, the additional data and […] of individual patients ([0032]-[0033] teach image data sets and/or medical data sets are stored in and supplied from a data store 20 or a remote data store (other patients history database comprising other patients' records including the additional data), which may form part of a Picture Archiving and Communication System (PACS). Fig. 2, [0036], [0091] teach the embedding model 34 outputs a compressed representation database 36… and the embedding vectors are stored in the data store 20 (including the descriptors).);

determining a patient from the other patient's history database being a closest match to the examined patients in terms of features of the descriptors to be a digital twin patient (Fig. 6, [0036], [0052] teach similarity between medical data sets is always determined based on the difference between the fixed-length 1D vectors. Fig. 6, [0094]-[0095], [0105], [0109], [0111] teach selecting importance weighting for attributes 94, weighting elements of embedding vectors (features of the descriptors) accordingly 96, determining similarity of at least some of the medical data sets 98, and selecting the medical data set that is most similar to the further medical data set (determining a digital twin patient). Also, [0098] teaches the selection of a high value of importance by the user for a given attribute indicates that the user wants to find medical data sets that are similar in terms of that attribute.); and

presenting the […] of the digital twin patient ([0006], [0113], [0015] teach finding another patient that has a similar pathology to that of the patient of interest… allowing the clinician to display and compare images or other information for the similar patient to corresponding images or other information for the patient of interest, e.g., comparing the treatment plan for the pathology or the progression of the pathology.)

Lisowska does not explicitly teach providing an other patients history database comprising other patients' records, the records including… a clinical outcome of individual patients; and presenting the clinical outcome.

Eswaran teaches providing a clinical outcome of an individual patient / presenting the clinical outcome of the digital twin patient (Fig. 9, [0035], [0127] teach displaying summary statistics on a timeline for similar patients… This timeline shows what medications/events (clinical outcomes) happened for similar patients before and after the most similar image, e.g., the medications received, the occurrence of certain medical events over a period of days… The trends in the timelines may highlight obvious interventions to make for the patient associated with the query image.)

Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date to have modified the apparatus for determining similarity between medical datasets for a plurality of patients of Lisowska to store additional data and supply data therefrom and to use this information as part of a similar image search for radiology as taught by Eswaran, with the motivation of improving machine learning technology, radiology image retrieval and clinical decision support (see Eswaran at para. 0001-0002 and 0039).

Re. Claim 3, Lisowska/Eswaran teaches the method according to claim 1, comprising determining the digital twin patient (see claim 1 prior art rejection) by finding a set of most similar candidates from the patient history database using a first technique (Lisowska Fig. 6, [0109] teaches determining similarity of at least some of the medical data sets (a set of most similar candidates) by calculating a distance from the modified embedding vector s' for the further medical data set to each of the modified embedding vectors s' for the medical data sets (a first technique).) and next finding the digital twin patient from the set of the most similar candidates using a second technique (Lisowska Fig. 6, [0111]-[0112] teaches selecting the medical data set that is most similar to the further medical data set as determined based on the modified embedding vectors s'… ranking the selected most similar medical data sets in order of similarity (finding the digital twin patient using a second technique).)

Re. Claim 4, Lisowska/Eswaran teaches a computer-implemented system, comprising at least one nontransitory processor-readable storage medium that stores at least one of processor-executable instructions or data; and at least one processor communicably coupled to at least one nontransitory processor-readable storage medium (Lisowska [0037]-[0038], claim 20 teaches circuitries 24, 26, 28 are each implemented in the CPU and/or GPU by means of a computer program product comprising computer-readable instructions that are executable to perform the method.), wherein at least one processor is configured to perform the steps of the method according to claim 1 (see analogous claim 1 prior art rejection).

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Lisowska, Eswaran and US 2019/0304092 A1 to Akselrod-Ballin et al. ("Akselrod-Ballin" herein).

Re. Claim 2, Lisowska/Eswaran teaches the method according to claim 1, wherein the classifying neural network is […] (see claim 1 prior art rejection). Lisowska/Eswaran does not teach the classifying neural network is an ImageNet. Akselrod-Ballin teaches the classifying neural network is an ImageNet (Abstract teaches a method for training a deep convolutional neural network (CNN). [0091], [0155], [0189] teach the deep CNN may be trained on the ImageNet dataset.)
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date to have modified the apparatus for determining similarity between medical datasets for a plurality of patients of Lisowska/Eswaran to train a CNN for classification tasks using specifically the ImageNet dataset and to use this information as part of systems and methods for automatic detection of an indication of abnormality in an anatomical image as taught by Akselrod-Ballin, with the motivation of improving machine learning technology for medical image processing (see Akselrod-Ballin at para. 0009-0010, 0012-0013, 0073, 0151).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:

Krizhevsky et al. (2012) ("ImageNet Classification with Deep Convolutional Neural Networks") for teaching that convolutional neural networks constitute one class of models that, due to their large learning capacity, can specify an object recognition task or problem by a dataset as large as ImageNet, which contains enough labeled examples to train such models without severe overfitting. See pg. 1, last para. - pg. 2, 1st para.

Bogoni et al. (US 2016/0321427 A1) for teaching patient-specific therapy planning support using patient matching.

Suehling (US 2019/0108917 A1) for teaching a method and system for supporting clinical decisions.

Schmidt et al. (US 2017/0193660 A1) for teaching identifying a successful therapy for a cancer patient using image analysis of tissue from similar patients.

Bhavani (US 2017/0053064 A1) for teaching a personalized content-based patient retrieval system.

Kraft et al. (US 2021/0082565 A1) for teaching a method and system for predicting neurological treatment.

Siemionow et al. (US 2020/0357119 A1) for teaching a system and method for determining brain age using a neural network.

Foley et al. (US 2020/0211678 A1) for teaching automated image quality control apparatus and methods.

Sorenson et al. (US 2018/0137244 A1) for teaching medical image identification and interpretation.

Wang et al. (US 11,583,239 B2) for teaching a hospital-scale chest x-ray database for entity extraction and weakly-supervised classification and localization of common thorax diseases.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jessica M Webb whose telephone number is (469) 295-9173. The examiner can normally be reached Mon-Fri 9:00am-1:00pm CST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Robert Morgan, can be reached at (571) 272-6773. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/J.M.W./ Examiner, Art Unit 3683
/CHRISTOPHER L GILLIGAN/ Primary Examiner, Art Unit 3683
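The similarity mechanism the §103 rejection maps from Lisowska, namely fixed-length embedding vectors weighted by user-selected attribute importance and then ranked by distance, can be sketched roughly as below. The function name, the toy data, and the use of Euclidean distance are illustrative assumptions, not the reference's actual implementation:

```python
import math

def find_digital_twin(query_vec, db_vecs, weights, k=5):
    """Two-stage retrieval sketch.
    Stage 1: weight each embedding element by its attribute importance
    (the 'modified embedding vectors') and keep the k nearest stored
    records as candidates.
    Stage 2: rank the candidates by distance; the top-ranked record is
    the "digital twin". Euclidean distance is an illustrative choice."""
    def weighted_dist(vec):
        # distance between modified embedding vectors
        return math.sqrt(sum((w * (a - b)) ** 2
                             for w, a, b in zip(weights, query_vec, vec)))
    dists = [(weighted_dist(v), i) for i, v in enumerate(db_vecs)]
    candidates = sorted(dists)[:k]          # stage 1: top-k most similar
    ranking = [i for _, i in candidates]    # stage 2: ordered by similarity
    return ranking[0], ranking

# Toy usage: three stored records with 4-element descriptor vectors.
db = [[0.9, 0.1, 0.5, 0.2],
      [0.2, 0.8, 0.4, 0.9],
      [0.85, 0.15, 0.55, 0.25]]
weights = [1.0, 2.0, 1.0, 1.0]  # emphasize the second attribute
twin, ranking = find_digital_twin([0.9, 0.1, 0.5, 0.2], db, weights, k=2)
print(twin)  # -> 0 (exact match to the first stored record)
```

Raising a weight makes records that differ in that attribute fall further down the ranking, which mirrors the cited teaching that a high user-selected importance finds data sets similar in that attribute.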

Prosecution Timeline

Dec 27, 2023: Application Filed
Oct 08, 2025: Non-Final Rejection under §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585721: SINGLE BARCODE SCAN CAST SYSTEM FOR PHARMACEUTICAL PRODUCTS
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12525336: INTELLIGENT MEDICAL ASSESSMENT AND COMMUNICATION SYSTEM WITH ARTIFICIAL INTELLIGENCE
Granted Jan 13, 2026 (2y 5m to grant)

Patent 12394505: ELECTRONIC HEALTH RECORD INTEROPERABILITY TOOL
Granted Aug 19, 2025 (2y 5m to grant)

Patent 12347541: CAREGIVER SYSTEM AND METHOD FOR INTERFACING WITH AND CONTROLLING A MEDICATION DISPENSING DEVICE
Granted Jul 01, 2025 (2y 5m to grant)

Patent 12293001: REFERENTIAL DATA GROUPING AND TOKENIZATION FOR LONGITUDINAL USE OF DE-IDENTIFIED DATA
Granted May 06, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 33%
With Interview: 86% (+52.5%)
Median Time to Grant: 3y 0m
PTA Risk: Low

Based on 99 resolved cases by this examiner. Grant probability derived from career allow rate.
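The projection figures appear to compose directly from the examiner's career statistics. A minimal sketch, assuming the interview lift is an additive percentage-point adjustment (which matches the displayed 33% and 86% figures), could look like:

```python
# Sketch of how the displayed projections appear to be derived.
# Assumption: the interview lift is additive in percentage points.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage."""
    return 100.0 * granted / resolved

def with_interview(base_rate: float, lift: float) -> float:
    """Interview-adjusted grant probability, capped at 100%."""
    return min(base_rate + lift, 100.0)

base = allow_rate(33, 99)              # 33 granted of 99 resolved
adjusted = with_interview(base, 52.5)  # +52.5 point interview lift

print(round(base))      # 33
print(round(adjusted))  # 86
```

33/99 rounds to 33%, and 33.3 + 52.5 = 85.8 rounds to the displayed 86%, so the additive-lift assumption is at least consistent with the dashboard's numbers.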
