Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 09 February 2026 has been entered.
Formal Matters
Applicant's response, filed 09 February 2026, has been fully considered. The following rejections and/or objections are either reiterated or newly applied. They constitute the complete set presently being applied to the instant application.
Status of Claims
Claims 1, 3-15, 17, 19, 20, and 23 are currently pending and have been examined.
Claims 1, 3, 4, 8, 17, and 19 have been amended.
Claims 1, 3-15, 17, 19, 20, and 23 have been rejected.
Priority
Acknowledgment is made of applicant's claim for foreign priority under 35 U.S.C. 119(a)-(d). Certified copies were filed on 21 November 2022 for parent Application Nos. KR10-2020-0060548 and KR10-2020-0171441, filed on 20 May 2020 and 09 December 2020, respectively.
The instant application therefore claims the benefit of priority under 35 U.S.C. 119(a)-(d). Accordingly, the effective filing date of the instant application is 20 May 2020, based on the benefit claim to KR10-2020-0060548.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 3-15, 17, 19, 20, and 23 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e. a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Step 1 – Statutory Categories of Invention:
Claims 1, 3-15, 17, 19, 20, and 23 are drawn to a method or system, which are statutory categories of invention.
Step 2A – Judicial Exception Analysis, Prong 1:
Independent claim 1 recites a method for predicting needs of a patient for hospital resources in part performing the steps of: generating numerical data per information type by encoding natural language data and structured data in patient data recorded in language and digits, wherein the natural language data comprises current disease information of the patient, and the structured data comprises at least one of demographic information and measurement information of the patient; performing natural language processing on the natural language data; calculating a first type of natural language embedding vector by embedding text data of the current disease information obtained through the natural language processing; calculating a second type of natural language embedding vector by embedding text data of remaining information of the natural language data obtained through the natural language processing; performing natural language processing on the structured data; calculating an embedding vector of the demographic information by embedding the text data of the demographic information obtained through the natural language processing, or performing conversion into the numerical data through the natural language processing; and predicting a task corresponding to the needs of the patient for hospital resources by applying the numerical data per information type to the artificial neural network to enable automated prediction of emergency patient needs, wherein the artificial neural network comprises an embedding model for calculating an embedding matrix of the patient data based on at least a part of the numerical data, the embedding model including a recurrent neural network structure comprising a unidirectional or bidirectional gated recurrent unit (GRU)-based hidden layer that extracts features of input data and calculates a hidden state vector, and an attention layer that receives an output matrix of the hidden layer and calculates the embedding matrix of the patient data
by forming an attention matrix based on an attention weight, wherein the attention layer enables the artificial neural network to focus on input words related to the task to be predicted in the patient data; and a decision model for determining the task to which the patient data belongs by receiving the embedding matrix of the patient data, or the embedding matrix of the patient data and the numerical data of the structured data, the decision model performing multi-label classification for multiple patient need categories by using a fully connected layer comprising a first network for determining a main task and a second network for determining an auxiliary task, wherein the first network and the second network share a hidden layer having the same input to enable efficient training, and wherein the artificial neural network performs an operation of checking patient data, reading current disease information in detail, and interpreting measurement values to predict the needs of the patient similar to a medical staff or emergency manager reading the patient data directly.
Independent claim 17 recites a system for predicting needs of a patient for hospital resources in part performing the steps of: receiving patient data comprising natural language data describing a condition of a patient and structured data, the patient data being recorded in language and digits, wherein the natural language data comprises current disease information of the patient, and the structured data comprises at least one of demographic information of the patient and measurement information of the patient; encoding the natural language data and the structured data in the patient data to generate numerical data per information type; performing natural language processing on the natural language data; calculating a first type of natural language embedding vector by embedding text data of the current disease information obtained through the natural language processing; calculating a second type of natural language embedding vector by embedding text data of remaining information of the natural language data obtained through the natural language processing; performing natural language processing on the structured data; calculating an embedding vector of the demographic information by embedding the text data of the demographic information obtained through the natural language processing, or performing conversion into the numerical data through the natural language processing; and predicting a task corresponding to needs of the patient for hospital resources by applying the numerical data to the artificial neural network to enable automated prediction of emergency patient needs, wherein the artificial neural network comprises: an embedding model for calculating an embedding matrix of the patient data based on at least a part of the numerical data, the embedding model including a recurrent neural network structure comprising a unidirectional or bidirectional gated recurrent unit (GRU)-based hidden layer that extracts features of input data and calculates a hidden state vector,
and an attention layer that receives an output matrix of the hidden layer and calculates the embedding matrix of the patient data by forming an attention matrix based on an attention weight, wherein the attention layer enables the artificial neural network to focus on input words related to the task to be predicted in the patient data; and a decision model for determining the task to which the patient data belongs by receiving the embedding matrix of the patient data, or the embedding matrix of the patient data and the numerical data of the structured data, the decision model performing multi-label classification for multiple patient need categories by using a fully connected layer comprising a first network for determining a main task and a second network for determining an auxiliary task, wherein the first network and the second network share a hidden layer having the same input to enable efficient training, and wherein the artificial neural network performs an operation of checking patient data, reading current disease information in detail, and interpreting measurement values to predict the needs of the patient similar to a medical staff or emergency manager reading the patient data directly.
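For orientation only (this sketch is not part of the record, and all layer sizes, parameter values, and the softmax attention parameterization are assumptions), the kind of architecture the independent claims recite — a GRU-based hidden layer producing hidden state vectors, an attention layer forming an embedding matrix, and two fully connected heads sharing a hidden layer with the same input — can be illustrated as a minimal numpy forward pass:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU cell update: returns the next hidden state vector."""
    z = sigmoid(Wz @ x + Uz @ h)              # update gate
    r = sigmoid(Wr @ x + Ur @ h)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
    return (1 - z) * h + z * h_tilde

d_in, d_h, seq_len = 8, 6, 5                  # assumed sizes
params = [rng.normal(size=(d_h, d_in)) if i % 2 == 0
          else rng.normal(size=(d_h, d_h)) for i in range(6)]

# GRU hidden layer: run over a sequence of (assumed) word embedding
# vectors, collecting a hidden state vector at each step.
xs = rng.normal(size=(seq_len, d_in))
h, states = np.zeros(d_h), []
for x in xs:
    h = gru_step(x, h, *params)
    states.append(h)
H = np.stack(states)                          # hidden matrix, (seq_len, d_h)

# Attention layer: attention weights over the steps form an attention
# matrix A; the embedding matrix of the input is then M = A @ H.
n_hops, d_a = 3, 4                            # assumed attention sizes
W1, W2 = rng.normal(size=(d_a, d_h)), rng.normal(size=(n_hops, d_a))
scores = W2 @ np.tanh(W1 @ H.T)               # (n_hops, seq_len)
A = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # row softmax
M = A @ H                                     # embedding matrix, (n_hops, d_h)

# Decision model: two fully connected heads (main task / auxiliary task)
# sharing one hidden layer that receives the same input.
shared = np.tanh(rng.normal(size=(10, n_hops * d_h)) @ M.ravel())
main_scores = sigmoid(rng.normal(size=(4, 10)) @ shared)  # multi-label scores
aux_scores = sigmoid(rng.normal(size=(3, 10)) @ shared)
```

Each component shown is a standard building block of the kind the rejection characterizes as mathematical in nature: gate equations, a softmax over scores, and matrix products feeding sigmoid classification heads.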
The steps of acquiring patient data and predicting tasks related to the patient's needs for hospital resources amount to methods of organizing human activity, which include functions relating to interpersonal and intrapersonal activities, such as managing relationships or transactions between people, social activities, and human behavior; satisfying or avoiding a legal obligation; and advertising, marketing, and sales activities (MPEP § 2106.04(a)(2)(II)(C), citing the abstract idea grouping of methods of organizing human activity for managing personal behavior or relationships or interactions between people, similar to iii. a mental process that a neurologist should follow when testing a patient for nervous system malfunctions, In re Meyer, 688 F.2d 789, 791-93, 215 USPQ 193, 194-96 (CCPA 1982); also note MPEP § 2106.04(a)(2)(II), stating that certain activity between a person and a computer may fall within the "certain methods of organizing human activity" grouping).
The steps of applying the patient data to an artificial neural network with a specific model structure and the decision model amount to a mathematical concept, which includes mathematical relationships, mathematical formulas or equations, and mathematical calculations. A mathematical concept need not be expressed in mathematical symbols, but limitations that are merely based on or involve a mathematical concept do not, without more, fall outside the grouping (MPEP § 2106.04(a)(2)(I)(A), citing the abstract idea grouping of mathematical concepts for mathematical relationships).
Examiner notes that, in light of the 2024 Guidance Update on Patent Subject Matter Eligibility, Including on Artificial Intelligence, and Recentive Analytics, Inc. v. Fox Corp., 134 F.4th 1205 (Fed. Cir. 2025), the claims recite both a method of organizing human activity and a mathematical concept and are not subject matter eligible. The use of a computer to apply a neural network, with corresponding limitations describing the mathematical structure of the network, amounts to applying data to an algorithm and reporting the results (MPEP § 2106.05(f)(2), see the case involving a commonplace business method or mathematical algorithm being applied on a general purpose computer within "Other examples . . . i."), amounting to an instruction to implement the abstract idea using a general purpose computer. Alice Corp. Pty. Ltd. v. CLS Bank Int'l, 134 S. Ct. 2347, 2357 (2014), consistent with Example 47, claim 2.
Dependent claim 3 recites, in part, wherein the predicting of the task corresponding to the needs of the patient for the hospital resources comprises calculating, by the embedding model, an embedding matrix of the patient data from the first type of natural language embedding vector and a contextual embedding vector wherein the contextual embedding vector is based on the second type of natural language embedding vector and the embedding vector of the demographic information.
Dependent claim 4 recites, in part, wherein an initial hidden state of the embedding model is specified as the contextual embedding vector, wherein the predicting of the task corresponding to the needs of the patient for the hospital resources comprises inputting the first type of natural language embedding vector into the initial hidden layer of the embedding model.
Dependent claim 5 recites, in part, when a plurality of first types of natural language embedding vectors are input to the embedding model, sequentially inputting the plurality of first types of natural language embedding vectors to the hidden layer.
Dependent claim 6 recites, in part, wherein the predicting of the task corresponding to the needs of the patient for the hospital resources comprises inputting a combined vector obtained by combining the first type of natural language embedding vector with the contextual embedding vector into the initial hidden layer of the embedding model.
Dependent claim 7 recites, in part, wherein the predicting of the task corresponding to the needs of the patient for the hospital resources comprises forming a hidden matrix H consisting of final hidden state vectors, and wherein the attention layer calculates the embedding matrix M of the patient data based on the hidden matrix H and the attention matrix A based on an attention weight.
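To fix ideas only (assumed dimensions and parameterization, not taken from the record), the calculation claim 7 recites — a hidden matrix H of final hidden state vectors, an attention matrix A formed from attention weights, and the embedding matrix M computed from the two — has the familiar form M = A·H. The softmax parameterization below is one common way to form such an A; the claim itself does not fix the parameterization:

```python
import numpy as np

rng = np.random.default_rng(1)
seq_len, d_h, n_hops, d_a = 5, 6, 3, 4    # all sizes assumed

H = rng.normal(size=(seq_len, d_h))       # rows: hidden state vectors
W1 = rng.normal(size=(d_a, d_h))          # attention parameters (assumed)
W2 = rng.normal(size=(n_hops, d_a))

scores = W2 @ np.tanh(W1 @ H.T)           # (n_hops, seq_len) raw weights
A = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # row softmax
M = A @ H                                 # embedding matrix, (n_hops, d_h)
```

Each row of A is a weight distribution over the hidden state vectors, so each row of M is a weighted sum of rows of H — the matrix-product structure underlying the examiner's characterization of this limitation as a mathematical concept.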
Dependent claim 8 recites, in part, wherein the predicting of the task corresponding to the needs of the patient for the hospital resources includes receiving, by the decision model, at least the embedding matrix of the patient data among the embedding matrix of the patient data, a final hidden state vector, and the numerical data of the measurement information, and wherein the decision model is a fully connected layer composed of two or more layers.
Dependent claim 9 recites, in part, wherein the artificial neural network is pre-trained such that the decision model determines at least one task among multiple tasks using a training data set for a plurality of training patients, and wherein the training data set consists of training samples for each training patient, and each training sample comprises at least an embedding matrix of patient data among the embedding matrix of the patient data, a final hidden state vector, and numerical data of measurement information for a corresponding training patient.
Dependent claim 10 recites, in part, wherein the decision model is trained to perform multiple binary classification for determining a task class to which the patient data belongs among a plurality of task classes included in a corresponding task to determine at least one task among multiple tasks.
Dependent claim 11 recites, in part, wherein the fully connected layer comprises one or more of a first network for determining a main task, a second network for determining a first auxiliary task, and a third network for determining a second auxiliary task, wherein the first network or the second network is configured to receive the embedding matrix of the patient data and the final hidden state vector, and wherein the third network is configured to receive the embedding matrix of the patient data, the final hidden state vector, and the numerical data of the measurement information.
Dependent claim 12 recites, wherein a loss function of the artificial neural network comprises: a term indicating a weighted sum of a cross entropy loss function between networks per task of the fully connected layer, and another term obtained by applying the Frobenius norm to an attention matrix, a transform matrix of the attention matrix, and an identity matrix.
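As an illustration only (the per-task weights, labels, and reduction below are assumptions, not taken from the record), a loss of the form claim 12 recites — a weighted sum of cross-entropy terms over the task networks plus a term applying the Frobenius norm to the attention matrix, a transform of the attention matrix, and the identity matrix — is commonly written as a penalty of the form ||A·Aᵀ − I||_F added to the classification losses:

```python
import numpy as np

def binary_cross_entropy(p, y):
    """Mean binary cross-entropy between predictions p and labels y."""
    p = np.clip(p, 1e-12, 1 - 1e-12)      # numerical safety
    return float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))

def total_loss(task_preds, task_labels, task_weights, A):
    """Weighted per-task cross-entropy plus attention penalty."""
    ce = sum(w * binary_cross_entropy(p, y)
             for w, p, y in zip(task_weights, task_preds, task_labels))
    AAt = A @ A.T
    penalty = np.linalg.norm(AAt - np.eye(AAt.shape[0]), ord="fro")
    return ce + penalty

A = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])           # toy attention matrix (assumed)
preds = [np.array([0.9, 0.2]), np.array([0.1, 0.7, 0.3])]
labels = [np.array([1.0, 0.0]), np.array([0.0, 1.0, 0.0])]
loss = total_loss(preds, labels, task_weights=[1.0, 0.5], A=A)
```

The Frobenius term vanishes when the rows of A are orthonormal, so it discourages the attention hops from concentrating on the same inputs; both terms are arithmetic over matrices, consistent with the mathematical-concept characterization applied above.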
Dependent claim 13 recites, in part, wherein the natural language data further includes one or more of main symptom-related information, injury-related information, and history-related information, and wherein the demographic information comprises one or more information among gender and age.
Dependent claim 14 recites, in part, wherein the measurement information includes measurement values for one or more measurement items among a pupil state, a systolic blood pressure (SBP), a diastolic blood pressure (DBP), a pulse, a respiration rate, a body temperature, a consciousness level, and an initial O2 saturation.
Dependent claim 15 recites, in part, wherein the main task comprises one or more of hospital admission, endotracheal intubation, mechanical ventilation, vasopressor infusion, cardiac catheterization, surgery, intensive care unit (ICU) admission, and cardiac arrest as a task class, wherein the first auxiliary task comprises an emergency room diagnosis disease name code as a task class, and wherein the second auxiliary task comprises one or more of discharge, ward admission, intensive care unit (ICU) admission, transfer, and death as a task class.
Dependent claim 19 recites, in part, calculate, by the embedding model, an embedding matrix of the patient data from the first type of natural language embedding vector and a contextual embedding vector, wherein the contextual embedding vector is based on the second type of natural language embedding vector and the embedding vector of the demographic information.
Dependent claim 20 recites, in part, input the first type of natural language embedding vector to the initial hidden layer of the embedding model when an initial hidden state of the embedding model is specified as the contextual embedding vector; or input a combined vector obtained by combining the first type of natural language embedding vector with the contextual embedding vector into the initial hidden layer of the embedding model; or input, to the decision model, at least the embedding matrix of the patient data among the embedding matrix of the patient data, a final hidden state vector, and the numerical data of the measurement information.
Dependent claim 23 recites, in part, train the artificial neural network such that the decision model determines at least one task among multiple tasks using an intermediate data set for training patients, wherein the training data set consists of training samples for each training patient, and each training sample comprises at least an embedding matrix of patient data.
Each of the steps of the preceding dependent claims serves only to further limit or specify the features of independent claim 1 or 17; the dependent claims are therefore directed to fundamentally the same abstract ideas as the independent claims and utilize the additional elements analyzed below in the expected manner.
Step 2A – Judicial Exception Analysis, Prong 2:
This judicial exception is not integrated into a practical application because the additional elements within the claims only amount to instructions to implement the judicial exception using a computer [MPEP 2106.05(f)].
Claim 1 recites a system comprising a processor and a memory storing instructions, an encoding module, and a prediction module comprising a neural network, wherein the instructions are executed by the processor. Claim 17 recites a processor, memory, an encoding module, and a prediction module comprising a neural network. The instant specification provides that the hardware requirements for the processor and memory are any processors capable of performing the claimed functions (see the instant specification at p. 4 line 24 to p. 5 line 3 and p. 24 line 28 to p. 25 line 10); that is, the neural network and decision model are both program code/algorithms executed by the computer hardware and require no specific hardware configuration.
The use of a processor and memory, in this case to perform the method for predicting needs of a patient for hospital resources, recites the processor and memory only as tools to apply data to an algorithm and report the results (MPEP § 2106.05(f)(2), see the case involving a commonplace business method or mathematical algorithm being applied on a general purpose computer within "Other examples . . . i."), amounting to an instruction to implement the abstract idea using a general purpose computer. Alice Corp. Pty. Ltd. v. CLS Bank Int'l, 134 S. Ct. 2347, 2357 (2014).
The above claims, as a whole, are therefore directed to an abstract idea.
Step 2B – Additional Elements that Amount to Significantly More:
The present claims do not include additional elements that are sufficient to amount to more than the abstract idea because the additional elements or combination of elements amount to no more than a recitation of instructions to implement the abstract idea on a computer.
Claim 1 recites a system comprising a processor and a memory for storage of instructions, an encoding module and a prediction module, the prediction module comprising a neural network, wherein when the instructions are executed by the processor. Claim 17 recites a processor, memory, an encoding module, and a prediction module comprising a neural network.
Each of these elements is recited only as a tool for performing steps of the abstract idea, such as the use of the memory to store data and instructions and the processor to apply the algorithm and report its results. These additional elements therefore amount to no more than mere instructions to perform the abstract idea using a computer and are not sufficient to amount to significantly more than the abstract idea (MPEP § 2106.05(f), see for additional guidance on "mere instructions to apply an exception").
Each additional element under Step 2A, Prong 2 is analyzed in light of the specification's explanation of the additional element's structure. The specification describes no structure for the claimed invention's additional elements beyond a well-understood, routine, and conventional use of generic computer components. Note that the specification can support the conventionality of generic computer components where "the additional elements are sufficiently well-known that the specification does not need to describe the particulars of such additional elements to satisfy 35 U.S.C. § 112(a)" (Berkheimer Memorandum, § III.A.1, p. 3).
Thus, taken alone, the additional elements do not amount to significantly more than the above-identified judicial exception. Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. Their collective functions merely provide conventional computer implementation.
Claims 1, 3-15, 17, 19, 20, and 23 are therefore rejected under 35 U.S.C. § 101 as being directed to non-statutory subject matter.
Response to Arguments
Applicant's arguments filed 09 February 2026 with respect to 35 USC § 101 have been fully considered but they are not persuasive. Applicant asserts that the instant claims solve the technical problem outlined in the specification specifically arguing:
The background art states that "since emergency medical services must be provided 24 hours a day and 365 days a year, it is difficult to accurately predict the needs of patients when the managers' workload is increased." Thus, the harder medical personnel work, the harder it is for them to keep up with the workload and still make accurate predictions of patient needs on round-the-clock operations. The specification then identifies the need for "a technology capable of predicting patient needs with performance comparable to that of a human manager for 24 hours a day and 365 days." This establishes that the invention addresses maintaining consistent prediction accuracy under varying workload conditions, which is a technical challenge rather than a mere automation of a mental process. The claimed invention, which includes an artificial neural network, provides consistent prediction performance comparable to that of a human manager but for 24 hours a day and 365 days, without being subject to increased workload that degrades prediction accuracy and with consistent performance throughout.
While the Examiner acknowledges that the quoted assertion addresses a problem, the problem is not directed to how a computer or technological system functions; it is instead rooted in the abstract idea of workload management and prediction accuracy. Even considering the claims as a whole, the application is not directed to a new structural design for a neural network or a new natural language processing model. Instead, similar to Recentive Analytics, Inc. v. Fox Corp., 134 F.4th 1205 (Fed. Cir. 2025), the claims utilize known models to solve a problem with the abstract idea; that is, the claims utilize the models in the ordinary manner in a "novel" abstract environment. This is not an improvement to computers under the analysis outlined in MPEP § 2106.05(a)(III), which states that "it is important to keep in mind that an improvement in the abstract idea itself (e.g. a recited fundamental economic concept) is not an improvement in technology."
Next, Applicant asserts that the technical mechanisms describing the neural network structure are not generic. Examiner is not persuaded. The specific underlying structure of the model is based on a mathematical relationship and is abstract. Moreover, there is no evidence that the inventor is claiming to have invented neural networks with bidirectional gated units and hidden layers. The same analysis applies to the decision model: the structure of the model, with attention weights used to generate an output matrix based on a hidden state, is a mathematical concept. A selective attention mechanism is a mathematical feature similar to the backpropagation algorithm of Example 47, claim 2; i.e., it is not considered a technical feature but instead a mathematical concept. Furthermore, there is no evidence in the specification that the model is implemented on anything other than a generic computer performing generic computer functions.
Next, Applicant asserts that the technical effect of "enabled efficient training" realized by the multi-task learning is another improvement. Without reaching the merits of the statement that multi-task learning is an "unconventional technical solution" (Examiner maintains that one of ordinary skill in the art would recognize it not as a realized improvement over the prior art but as a known modeling method), efficiency alone is not enough to amount to a practical application via an improvement to computers or technology under Step 2A, Prong 2 (see MPEP § 2106.05(a)(I) for examples the courts have indicated may not be sufficient to show an improvement in computer functionality: ii. accelerating a process of analyzing audit log data when the increased speed comes solely from the capabilities of a general-purpose computer, FairWarning IP, LLC v. Iatric Sys., 839 F.3d 1089, 1095, 120 USPQ2d 1293, 1296 (Fed. Cir. 2016); see also MPEP § 2106.05(f)(2), stating that "claiming the improved speed or efficiency inherent with applying the abstract idea on a computer" does not provide an inventive concept, Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367 (Fed. Cir. 2015)). Thus, the combination of the generic computer components does not provide a non-conventional and non-generic arrangement of known, conventional pieces; note that this analysis applies to Step 2B as well as Step 2A, Prong 2.
Next, Applicant asserts that it is not possible for a human to remember this much information. Examiner notes this is not a test under the subject matter eligibility analysis for the organizing human activity or mathematical relationship abstract idea groupings. Even so, there is no particular data volume that would render an activity no longer reasonable to perform mentally. While comparing patient data 24 hours a day, 365 days a year would take considerable time and effort, that is, of course, the singular purpose of computers and computer networks: to perform large numbers of calculations, via algorithms, rapidly and without error (assuming no error in user input). Although a general-purpose computer can perform calculations at a rate and accuracy that far outstrip the mental performance of a skilled artisan, the nature of the activity is essentially the same and constitutes an abstract idea. See Bancorp Servs., L.L.C. v. Sun Life Assur. Co. of Canada (U.S.), 687 F.3d 1266 (Fed. Cir. 2012) (holding that "the fact that the required calculations could be performed more efficiently via a computer does not materially alter the patent eligibility of the claimed subject matter"); see also SiRF Tech., Inc. v. Int'l Trade Comm'n, 601 F.3d 1319 (Fed. Cir. 2010) (holding that, in order for the addition of a machine to impose a meaningful limit on the scope of a claim, it must play a significant part in permitting the claimed method to be performed, rather than function solely as an obvious mechanism for permitting a solution to be achieved more quickly, i.e., through the utilization of a computer for performing calculations).
Finally, Applicant asserts that the neural network and attention mechanism help solve specific problems with the abstract idea. Examiner notes, as stated above, that the use of a known algorithm to solve a problem with the abstract idea is not an improvement to technology or a technological field.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JORDAN LYNN JACKSON whose telephone number is (571)272-5389. The examiner can normally be reached Monday-Friday 8:30AM-4:30PM ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Arleen M Vazquez can be reached at 571-272-2619. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JORDAN L JACKSON/
Primary Examiner, Art Unit 2857