Prosecution Insights
Last updated: April 19, 2026
Application No. 18/720,603

COMPUTER-BASED SYSTEMS FOR ACQUIRING AND ANALYZING OBSERVATIONAL SUBJECT DATA

Non-Final OA — §101, §103
Filed: Jun 14, 2024
Examiner: STOLTENBERG, DAVID J
Art Unit: 3685
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Pgi Drug Discovery LLC
OA Round: 1 (Non-Final)
Grant Probability: 57% (Moderate)
Predicted OA Rounds: 1-2
Predicted Time to Grant: 3y 7m
Grant Probability With Interview: 82%

Examiner Intelligence

Career Allow Rate: 57% (299 granted / 522 resolved; +5.3% vs TC avg)
Interview Lift: +24.9% in resolved cases with an interview (strong)
Typical Timeline: 3y 7m avg prosecution; 23 applications currently pending
Career History: 545 total applications across all art units
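The headline metrics above are simple ratios and differences; the sketch below recomputes them. Only the 299/522 career totals come from this report — the with/without-interview split used for the lift is a hypothetical example consistent with the reported +24.9%.

```python
# Illustrative recomputation of the dashboard's examiner metrics.
# The interview-split rates below are hypothetical; only the
# 299 granted / 522 resolved totals appear in the report itself.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Percentage-point gain in allowance when an interview was held."""
    return rate_with - rate_without

career = allow_rate(299, 522)
print(f"Career allow rate: {career:.1f}%")   # 299/522 -> 57.3%

# Hypothetical split consistent with the reported +24.9% lift:
lift = interview_lift(rate_with=75.0, rate_without=50.1)
print(f"Interview lift: +{lift:.1f} points")
```

The same arithmetic reproduces the statute-specific deltas below (examiner rate minus the estimated Tech Center average).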

Statute-Specific Performance

§101: 31.6% (-8.4% vs TC avg)
§103: 37.0% (-3.0% vs TC avg)
§102: 13.5% (-26.5% vs TC avg)
§112: 10.8% (-29.2% vs TC avg)

Tech Center averages are estimates; figures based on career data from 522 resolved cases.

Office Action

§101, §103
DETAILED CORRESPONDENCE

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This non-final first Office action on the merits is in response to the Patent Application filed on 14 June 2024.

Claims 1-78, 81, 83, 85-87, 90, 91, 93-97, 99-194, 197, 199-203, 206, 207, 209-213, 215-310, 313-320, 322, 323, 325-329, and 331-348 are cancelled. Claims 80, 82, 84, 88, 89, 92, 98, 196, 198, 204, 205, 208, 214, 311, 312, 321, 324 and 330 are amended and have been carefully considered. Claims 80, 82, 84, 88, 89, 92, 98, 195, 196, 198, 204, 205, 208, 214, 311, 312, 321, 324 and 330 are pending and considered below.

Priority

Applicant’s claim for the benefit of a prior-filed application under 35 U.S.C. 119(e) or under 35 U.S.C. 120, 121, or 365(c) is acknowledged. The instant invention claims the benefit of priority of U.S. Provisional Patent Application No. 63/290,208, filed December 16, 2021, and of U.S. Provisional Patent Application Nos. 63/295,085, 63/295,242, 63/295,298, 63/295,184, 63/295,057, 63/295,391, 63/295,124, 63/295,164, 63/295,232, 63/295,208, 63/295,105, and 63/295,421, each filed December 30, 2021, all of which are incorporated herein by reference in their entireties.
The Examiner affords the Applicants a priority date of 16 December 2021.

Claim Objections

Claims 79, 82, 88, 89, 98, 195, 198, 204, 205, 214, 311, 312, 321, and 330 are objected to because of the following informality: the abbreviation "EEG" must be spelled out in full at its first use. Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 80, 82, 84, 88, 89, 92, 98, 195, 196, 198, 204, 205, 208, 214, 311, 312, 321, 324 and 330 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.

The claimed limitations, as per method Claims 79 and 195, include the steps of: obtaining EEG data from a plurality of electrodes positioned on an animal subject to which the drug is administered at a first dose; obtaining acceleration data from one or more accelerometers positioned on the animal subject to which the drug is administered; predicting a class label for the drug by applying the EEG data and the acceleration data to a machine-learning classifier component trained to predict the class label using the EEG data and the acceleration data; and providing an indication of the class label.
The claimed limitations, as per storage medium claim 311, include the steps of: obtaining EEG data from a plurality of electrodes positioned on an animal subject to which the drug is administered at a first dose, wherein the EEG data comprises wake EEG and sleep EEG; obtaining acceleration data from one or more accelerometers positioned on the animal subject to which the drug is administered; automatically separating the wake EEG from the sleep EEG based on the EEG data and the acceleration data; predicting a class label for the drug by applying the EEG data and the acceleration data to a machine-learning classifier component trained to predict the class label using the EEG data and the acceleration data; and providing an indication of the class label.

Examiner Note: underlined elements indicate additional elements of the claimed invention identified as performing the steps of the claimed invention.

Under Step One of the analysis under the Mayo framework, claims 79, 80, 82, 84, 88, 89, 92, and 98 are drawn to methods (i.e., a process); claims 195, 196, 198, 204, 205, 208, and 214 are drawn to a system (i.e., a machine/manufacture); and claims 311, 312, 321, 324, and 330 are drawn to a storage medium (i.e., a machine/manufacture). As such, claims 80, 82, 84, 88, 89, 92, 98, 195, 196, 198, 204, 205, 208, 214, 311, 312, 321, 324 and 330 are drawn to one of the statutory categories of invention.
Under Step 2A Prong One of the analysis under the Mayo framework, claims 79 and 195 are determined to recite the judicial exception of obtaining EEG data from a plurality of electrodes positioned on an animal subject to which the drug is administered at a first dose; obtaining acceleration data from one or more accelerometers positioned on the animal subject to which the drug is administered; predicting a class label for the drug by applying the EEG data and the acceleration data to a machine-learning classifier component trained to predict the class label using the EEG data and the acceleration data; and providing an indication of the class label. This judicial exception is similar to abstract ideas related to mental processes, such as concepts performed in the human mind including observation, evaluation, judgment, and opinion, as well as to mathematical concepts such as relationships, formulas or equations, and mathematical calculations.

Under Step 2A Prong One of the analysis under the Mayo framework, claim 311 is determined to recite the judicial exception of obtaining EEG data from a plurality of electrodes positioned on an animal subject to which the drug is administered at a first dose, wherein the EEG data comprises wake EEG and sleep EEG; obtaining acceleration data from one or more accelerometers positioned on the animal subject to which the drug is administered; automatically separating the wake EEG from the sleep EEG based on the EEG data and the acceleration data; predicting a class label for the drug by applying the EEG data and the acceleration data to a machine-learning classifier component trained to predict the class label using the EEG data and the acceleration data; and providing an indication of the class label.
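For orientation, the data flow recited in claim 311 — wake/sleep separation from EEG plus acceleration, feature extraction, then a trained classifier producing a drug class label — can be sketched as below. This is purely illustrative: the movement threshold, sampling rate, band-power features, and all function names are assumptions, not the application's disclosed implementation.

```python
# Hedged sketch of the claim 311 pipeline. Thresholds, feature
# choices, and the downstream classifier are illustrative only.
import numpy as np

def separate_wake_sleep(eeg: np.ndarray, accel: np.ndarray,
                        accel_threshold: float = 0.05) -> tuple:
    """Label an epoch as wake when its mean |acceleration| exceeds
    a movement threshold; remaining epochs are treated as sleep."""
    moving = np.abs(accel).mean(axis=1) > accel_threshold
    return eeg[moving], eeg[~moving]

def band_power_features(eeg_epochs: np.ndarray, fs: int = 250) -> np.ndarray:
    """Crude per-epoch spectral features: total power in the delta
    (0.5-4 Hz) and theta (4-8 Hz) bands from an FFT periodogram."""
    freqs = np.fft.rfftfreq(eeg_epochs.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg_epochs, axis=1)) ** 2
    delta = psd[:, (freqs >= 0.5) & (freqs < 4)].sum(axis=1)
    theta = psd[:, (freqs >= 4) & (freqs < 8)].sum(axis=1)
    return np.column_stack([delta, theta])

rng = np.random.default_rng(0)
eeg = rng.normal(size=(20, 500))            # 20 epochs x 500 samples
accel = rng.normal(scale=0.1, size=(20, 500))

wake, sleep = separate_wake_sleep(eeg, accel)
features = band_power_features(wake)
# A real system would pass `features` to a trained classifier and
# report the predicted class label for the administered drug.
print(wake.shape[0] + sleep.shape[0])       # every epoch assigned
```

The sketch only frames the eligibility discussion that follows: each step is a generic data-gathering or data-processing operation of the kind the rejection characterizes.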
This judicial exception is similar to abstract ideas related to mental processes, such as concepts performed in the human mind including observation, evaluation, judgment, and opinion, as well as to mathematical concepts such as relationships, formulas or equations, and mathematical calculations.

Under Step 2A Prong Two of the analysis under the Mayo framework, the judicial exception expressed as the steps of the instant claims is not integrated into a practical application because the claims recite only one additional element: the use of a processor or computing system, including a local registry or memory, to perform the steps of the claimed abstract idea. The processor is recited at a high level of generality (i.e., as a generic processor performing generic computer functions to perform the claimed steps of the invention), and therefore the abstract idea amounts to no more than mere instructions to apply the exception using a generic computer component. Accordingly, this additional element of performing the inventive steps with a generic computer does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. Thus, the claimed invention is directed to an abstract idea without a practical application.

Under Step 2B of the Mayo analysis framework, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional limitations of performing the steps with a computer processor, a display module, and a memory storing machine-executable instructions represent insignificant data gathering and data processing steps requiring no more than a generic computer to perform generic computer functions that are well-understood, routine, and conventional activities previously known to the industry.
Applicant’s published written description paragraph [120] recites “observational data acquired (and/or observational features generated) by a behavioral platform can be analyzed by one or more other computing systems. Such computing systems could include a tablet, laptop, desktop, mainframe, cluster, or cloud computing platform. Such computing systems can be communicatively connected to the behavioral platform. Such computing systems may perform data acquisition and/or analysis automatically. The system could be remote from the behavioral platform (e.g., connected over a network link to the behavioral platform). Alternatively, a user of the behavioral platform could transfer observational data (and/or observational features) from the behavioral platform to the computing device manually (e.g., using a non-transitory computer readable medium, such as a USB drive, optical disk, magnetic disk, or by data entry through a user interface),”

Written description paragraph [162] recites “system 500 (or components thereof) can be configured to dynamically modify a database schema to create a mapping of at least one outlier data point to at least one label of a plurality of labels associated with a plurality of chemical compounds, as detailed herein. In some embodiments, the system 500 can be based on a scalable computer and/or network architecture that incorporates varies strategies for assessing the data, caching, searching, and/or database connection pooling. Such a scalable architecture can be capable of operating multiple servers and adding or removing servers in response to demand,”

Written description paragraph [163] recites “devices 502-504 (e.g., clients) of the system 500 can include suitable computing devices. Such computing devices can be capable of simultaneously launching a plurality of software applications via a network (e.g., cloud network), such as network 505, to and from another computing device, such as servers 506 and 507, each other, and the like.
In some embodiments, the devices 502-504 can be personal computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, and the like. In some embodiments, one or more member devices within devices 502-504 can be devices that are capable of connecting using a wired or wireless communication medium such as a laptop, tablet, desktop computer, a netbook, a smart phone, an ultra-mobile personal computer,”

Written description paragraph [645] recites “block diagram of an exemplary system 500 in accordance with one or more embodiments of the present disclosure. However, not all of these components may be required to practice one or more embodiments, and variations in the arrangement and type of the components may be made without departing from the spirit or scope of various embodiments of the present disclosure. In some embodiments, the exemplary inventive computing devices and/or the exemplary inventive computing components of the system 500 may be configured to dynamically query and screen a database schema to create a mapping of at least one Target behavioral profile to at least one Test behavior profile,”

Written description paragraph [647] recites “system 500 may be based on a scalable computer and/or network architecture that incorporates varies strategies for assessing the data, caching, searching, and/or database connection pooling. Such a scalable architecture can be capable of operating multiple servers and adding or removing servers in response to demand.
In some embodiments, the exemplary inventive computing devices and/or the exemplary inventive computing components of the system 500 may be configured to manage the exemplary dynamic mapping module,”

Written description paragraph [648] recites “…devices 502-504 of the system 500 may include virtually any computing device capable of simultaneously launching a plurality of software applications via a network (e.g., cloud network), such as network 505, to and from another computing device, such as servers 506 and 507, each other, and the like. In some embodiments, the devices 502-504 may be personal computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs.”

Thus, the claimed inventive steps are performed by generic or general-purpose computing systems executing well-known and understood instructions and processes, which do not comprise significantly more than a known computing system, nor comprise improvements to another technological field.

Further, as per MPEP 2106, TLI Communications LLC v. AV Automotive LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (“It is well-settled that mere recitation of concrete, tangible components is insufficient to confer patent eligibility to an otherwise abstract idea”), and Intellectual Ventures I LLC v. Capital One Bank (USA), N.A., 792 F.3d 1363, 1366, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015) (“An abstract idea does not become nonabstract by limiting the invention to a particular field of use or technological environment, such as the Internet [or] a computer”), simply performing the steps of an abstract idea on a computing apparatus does not make an inventive concept statutorily eligible.
Therefore, it is clear from Applicants’ specification that the elements and modules in the claims require no more than a generic computer (e.g., a general-purpose computing device) to perform generic computer functions (e.g., accessing, transmitting/receiving, sorting, and storing data) that are well-understood, routine, and conventional activities previously known in the industry. None of the limitations, considered as a whole and as an ordered combination, provide eligibility, because the steps of the claims simply instruct the practitioner to implement the abstract idea with routine, conventional activity. Viewing the additional limitations in combination also shows that they fail to ensure the claims amount to significantly more than the abstract idea. When considered as an ordered combination, the additional components of the claims add nothing that is not already present when considered separately, and thus simply append the abstract idea with words equivalent to “apply it” on a generic computer and/or mere instructions to implement the abstract idea on a generic computer, generally link the abstract idea to a particular technological environment or field of use, and append the abstract idea with insignificant extra-solution activity associated with the implementation of the judicial exception (e.g., mere data gathering).

Dependent claims 80, 82, 88, 89, 92, 196, 204, 205, 208, 311, 321, and 324 are directed to the judicial exception as explained above for Claims 79, 195, and 311, and are further directed to limitations covering a variety of data-processing methods, including the obtainment of EEG data, the collection of accelerometer data, and the collection of a wide variety of other sensor data (including behavioral data and specific EEG data), as well as the processing of the collected data by machine-learning modules.
These limitations or processes are considered to be executed by the general-purpose computing system as explained above, and therefore do not result in the claimed invention being directed to a practical application, nor do they comprise significantly more than the identified abstract idea. Dependent claims 80, 82, 88, 89, 92, 196, 204, 205, 208, 311, 321, and 324 do not add more to the abstract idea of independent Claims 79, 195, and 311 and are therefore rejected as ineligible subject matter under 35 U.S.C. 101 based on a rationale similar to the claims from which they depend.

Examiner Note - 101 Eligibility

The Examiner has extensively reviewed the written description and the currently submitted claims and has determined that, should elements from dependent claims 84, 98, 198, 214, 312 or 330 be integrated into the independent claims, or should Applicants incorporate elements derived from the written description, the currently in-place rejection of all pending claims under 35 USC 101 would be removed. The above-noted claims precisely claim procedures implemented by detailed processes with respect to the extraction of state/behavioral features from the data, as well as the processing of data with respect to low and high temporal resolution and related frequency. The integration of such specificity into the independent claims would result in the removal of the currently in-place rejection of the claims under 35 USC 101 and MPEP 2106.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C.
103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering the patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 79, 82, 92, 195, 208, and 324 are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Yeo et al. (20240252102) and Dey et al. (20200118040).
Claims 79 and 195: Yeo discloses a computer-implemented method and system of classifying a drug comprising:

obtaining EEG data from a plurality of electrodes positioned on an animal subject to which the drug is administered at a first dose ([56 “acquired time-series data of sensor 106 may also include electromyogram (EMG), electrocardiogram (ECG), electroencephalogram (EEG), phonocardiogram, voltage potential, impedance, and acoustics,” 84 “skin-wearable printed sensor may include at least two electrodes, a conductive flexible film, and an elastomeric substrate,” 86 “sensor system 104a includes multi-layer flexible circuits 302 (shown as 302a, 302b, 302c) that has an electrode array 110 (shown as 110b) fabricated on the underside of the bottom layer 302c. In some embodiments, the PI-Cu-PI-Cu-PI multilayers 302 may be stacked on a polydimethylsiloxane (PDMS)-coated 4-inch wafer (not shown—see FIG. 3). The fabricated circuit and electrodes may be retrieved from carrier substrates and transferred to a soft silicone elastomer,” 105]);

obtaining acceleration data from one or more accelerometers positioned on the animal subject to which the drug is administered ([37 “agent may be an organic molecule (e.g., a therapeutic agent, a drug), inorganic molecule, nucleic acid, protein, amino acid, peptide, polypeptide, polynucleotide, targeting agent, isotopically labeled organic or inorganic molecule, vaccine, immunological agent,” 43, 46, 47, 56 “sensors 106 can acquire time-series data or channels of them, such as inertia, acceleration, orientation, temperature, or sound. The acquired time-series data of sensor 106 may also include electromyogram (EMG), electrocardiogram (ECG), electroencephalogram (EEG),” 60 “skin-wearable sensor system 104 includes auxiliary sensors, such as inertia measurement sensors 112, that can provide inertia, acceleration, and orientation information relating to movement or activity,” 64]).
Yeo discloses the implementation of machine learning for the prediction and classification of raw data, including EEG and acceleration data, at paragraph [69]. Yeo does not explicitly disclose, however Dey discloses:

predicting a class label for the drug by applying the EEG data and the acceleration data to a machine-learning classifier component trained to predict the class label using the EEG data and the acceleration data ([20 “metadata information associated with the samples contain free-texts which are helpful to determine the categories of disease onset or drug responses. For disease conditions, the samples that have those disease states and the samples that are controls are identified via the mechanisms of the illustrative embodiments. For the drug states, the drug name, the dosage information, and IC50 score after a few hours (typically, with an interval of 6 or 12 hours) are recorded for each sample and identified via the mechanisms of the illustrative embodiments,” 21 “classification engine may be built and configured to extract particular features from a genomic database dataset that is input to the classification engine, and process these features in order to infer or predict a classification for the particular dataset. For example, the one or more classification engines, in accordance with one illustrative embodiment, may comprise a first classification engine that is configured and trained to classify inputs into classes of disease states, or a non-disease state, thereby generating a disease class label for the dataset,” 22 “second classification engine may be configured and trained to classify a study corresponding to the dataset to be a drug agent study or a non-drug agent study, to thereby generate a binary drug agent class label,” 23-26, 52 “one or more classification engines 140 are executed on each dataset in the training subset 122 of datasets to generate a corresponding classification prediction/inference output.
For example, the classification engine(s) 140 may perform various natural language processing operations on the natural language or free text portions of the selected training subset,”]);

Examiner Note: Under a broadest reasonable interpretation, the Examiner interprets the disclosures of Dey regarding the collection and processing of relevant data, which detail the incorporation of class labels and drug-related information, together with the disclosures of Yeo denoting the processing of collected EEG and acceleration data and the categorization of drug-related class labels, to disclose the above limitations.

providing an indication of the class label ([27 “vector slot in the vector output may comprise a numerical value indicative of the probability that the input is properly classified into the corresponding class. The classification output is compared to the corresponding ground truth classification to determine if the classification engine generated a correct classification output,” 30 “trained classification engine(s) identify and label the dataset as to whether the corresponding study was for a specific disease state and/or a specific drug agent. In addition, the labels indicate whether the sample had the disease state (disease sample) or did not have the disease state (control sample). Based on these labels, subsets of curated datasets that correspond to a particular disease and/or drug agent, as well as whether or not the corresponding sample was a disease sample or a control,”]).
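The classifier limitation mapped above — and the later claims reciting the classifier as a branch of an ensemble — can be illustrated with a toy sketch. Everything here is hypothetical: the branch models, the feature vector, and the "stimulant"/"sedative" labels are stand-ins, not anything disclosed by Yeo or Dey.

```python
# Minimal illustration of a class-label classifier component as
# one branch of an ensemble. The member models are toy stand-ins
# for trained classifiers; labels and thresholds are invented.
from collections import Counter

def majority_vote(predictions: list) -> str:
    """Combine member predictions by simple majority vote."""
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical branch classifiers, each mapping a feature
# vector (e.g., EEG/acceleration features) to a drug class label:
branches = [
    lambda x: "stimulant" if x[0] > 0.5 else "sedative",
    lambda x: "stimulant" if x[1] > 0.5 else "sedative",
    lambda x: "sedative",                 # a dissenting branch
]

features = [0.9, 0.7]
label = majority_vote([clf(features) for clf in branches])
print(label)  # -> stimulant
```

In a real system each branch would be a trained model (or a layer of a neural network), and "providing an indication of the class label" would amount to reporting `label` to the operator.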
Therefore, it would have been obvious to modify Yeo to predict a class label for the drug by applying the EEG data and the acceleration data to a machine-learning classifier component trained to predict the class label using the EEG data and the acceleration data, and to provide an indication of the class label to the operators of the medical processing system, as per the steps of Dey. The combination results in the provision of EEG and related acceleration data, the processing of that data to predict label classes, and the implementation of treatments using the class labels as interpreted by the system, thereby optimizing the provided treatments.

Claim 82: Yeo in view of Dey discloses the computer-implemented method and system of claim 80 above, and Yeo further discloses extracting features by applying observational data concerning the animal subject to a machine-learning feature-extraction component ([69 “enables a machine to acquire knowledge by extracting patterns from raw data.
Machine learning techniques include, but are not limited to, logistic regression, support vector machines (SVMs), decision trees, Naïve Bayes classifiers, and artificial neural networks,”]), the observational data comprising the EEG data and the acceleration data ([56 “acquired time-series data of sensor 106 may also include electromyogram (EMG), electrocardiogram (ECG), electroencephalogram (EEG), phonocardiogram, voltage potential, impedance, and acoustics,” 60 “skin-wearable sensor system 104 includes auxiliary sensors, such as inertia measurement sensors 112, that can provide inertia, acceleration, and orientation information relating to movement or activity,”]), the extracted features comprising at least one of instant behavioral features ([103 “thin-film electronics can be seamlessly mounted on the back of the mouse so as to allow its natural behavior while offering a long-range wireless recording of muscle activities,”]); or higher-order features derived from the instant behavioral features using a machine-learning higher-order feature-extraction component ([58 “seamless mounting on the skin of alive and moving animal models (e.g., mice or rats) without disrupting their natural behavior. In addition, the compact device integration on a soft elastomeric platform can provide comfortable wearability without motion artifacts caused by cumbersome wires and rigid system. The use of non-invasive and ergonomic monitoring system/device allows movement during measurement, thus allowing the system to monitor the physiological response in a natural ambulatory environment,” 61 “acquisition electronics 128 may include analog-to-digital convertors or capacitance-to-digital converters, trans-impedance amplifiers or other amplifier circuitries, appropriate filters (e.g., low pass and/or high pass filters),” 69 “learning techniques include but are not limited to autoencoders and embeddings. 
The term “deep learning” is defined herein to be a subset of machine learning that enables a machine to automatically discover representations needed for feature detection, prediction, classification, etc., using layers of processing. Deep learning techniques include but are not limited to artificial neural networks or multilayer perceptron (MLP),” 71 “disclosure contemplates that the machine learning model can be any supervised learning model, semi-supervised learning model, or unsupervised learning model. Optionally, the machine learning model is a deep learning model. Machine learning models are known in the art and are therefore not described in further detail herein,” 143 “algorithm based on machine learning could offer automated signal discrimination and behavioral classification for further study [51, 52]. Since the EMG signal could be slightly different depending on the area to which the electrode was attached, constant localization with increasing the number of samples was demanded for accurate results,”]).

Claims 92, 208, and 324: Yeo in view of Dey discloses the computer-implemented method and system of claims 79, 195, and 311 above, and Yeo further discloses wherein the machine-learning classifier component is a layer or branch of a machine-learning model, the machine-learning model being one of an ensemble of machine-learning or neural network models ([68 “statistical analysis may include machine learning-based analysis,” 69 “analysis system can be implemented using one or more artificial intelligence and machine learning operations. The term “artificial intelligence” can include any technique that enables one or more computing devices or comping systems (i.e., a machine) to mimic human intelligence. Artificial intelligence (AI) includes but is not limited to knowledge bases, machine learning, representation learning, and deep learning.
The term “machine learning” is defined herein to be a subset of AI that enables a machine to acquire knowledge by extracting patterns from raw data,”]).

Claims 80 and 196 are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Yeo et al. (20240252102) and Dey et al. (20200118040), and in further view of Weber et al. (20190151640).

Claims 80 and 196: Yeo in view of Dey discloses the elements of method and system claims 79 and 195 above. Yeo does not disclose, however Weber discloses, further comprising obtaining observational data concerning the animal subject, the observation data acquired using an enclosure for the animal subject ([50 “subject 104 may be a human, a dog, a pig, and/or any other animal having physiological parameters that can be recorded,”]), the enclosure instrumented with at least one sensing device, the at least one sensing device comprising at least one of an imaging sensor, a force sensor, a pressure sensor, a piezoelectric sensor, a pseudo piezoelectric sensor, a stimulus sensor associated with a stimulus actuator, or a thermal sensor ([64 “one or more of the sensors 112, 114, 116 may be configured to sense physiological information about the subject 104.
The physiological information may include at least one of: a respiration sensor, a sound sensor, a heart rate sensor, an oxygen sensor, a muscle use sensor, an activity sensor, a posture sensor, an inflammation sensor, a chemical sensor, an exhaled breath sensor, a thoracic composition sensor, an altered consciousness sensor, a central cyanosis sensor, and a sleep quality sensor,” 65 “Respiration sensors can be used to determine tidal volume (VT), respiration rate, peak expiratory flow rate (PEFR), forced expiratory volume (FEV), and a composite respiration index,” 66 “Sound sensors can include at least one of a lung sound sensor, a speech sensor, and a heart sound sensor,” 67 “heart rate sensor includes an ECG for measuring the heart rate, an oxygen sensor includes an optical oxygen saturation sensor, and a central cyanosis sensor includes an optical oxygen saturation sensor,” 73 “controller 208 may be any arrangement of electronic circuits, electronic components, processors, program components and/or the like configured to store and/or execute programming instructions, to direct the operation of the other functional components of the MD 202, to perform respiratory functionality detection, ECG detection, EEG detection, EMG detection, arrhythmia detection and/or classification algorithms, to store physiologic data obtained by the sensing component,”]). 
Therefore it would be obvious for Yeo to obtain observational data concerning the animal subject, the observation data acquired using an enclosure for the animal subject, the enclosure instrumented with at least one sensing device, the at least one sensing device comprising at least one of an imaging sensor, a force sensor, a pressure sensor, a piezoelectric sensor, a pseudo piezoelectric sensor, a stimulus sensor associated with a stimulus actuator, or a thermal sensor, as per the steps of Weber. Doing so results in the provision of EEG and related acceleration data, the processing of that data to predict label classes, and the implementation of treatments using the class labels as interpreted by the system, thereby optimizing the provided treatments.

Claim(s) 84 and 198 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Yeo et al. (20240252102) and Dey et al. (20200118040) and in further view of Lee et al. (20200285944).

Claims 84 and 198: Yeo in view of Dey disclose the elements of method and system claims 82 and 196 above and Yeo further discloses wherein the operations further comprise extracting features by applying observational data concerning the animal subject to a machine learning feature-extraction component, the observational data comprising the EEG data and the acceleration data ([56 “acquired time-series data of sensor 106 may also include electromyogram (EMG), electrocardiogram (ECG), electroencephalogram (EEG), phonocardiogram, voltage potential, impedance, and acoustics,” 60 “skin-wearable sensor system 104 includes auxiliary sensors, such as inertia measurement sensors 112, that can provide inertia, acceleration, and orientation information relating to movement or activity,” 69 “enables a machine to acquire knowledge by extracting patterns from raw data.
Machine learning techniques include, but are not limited to, logistic regression, support vector machines (SVMs), decision trees, Naïve Bayes classifiers, and artificial neural networks,”]), the extracted features comprising at least one of: instant behavioral features ([103 “thin-film electronics can be seamlessly mounted on the back of the mouse so as to allow its natural behavior while offering a long-range wireless recording of muscle activities,”]), higher-order features derived from the instant behavioral features using a machine-learning higher-order feature-extraction component, wherein the higher-order features comprise ([58 “seamless mounting on the skin of alive and moving animal models (e.g., mice or rats) without disrupting their natural behavior. In addition, the compact device integration on a soft elastomeric platform can provide comfortable wearability without motion artifacts caused by cumbersome wires and rigid system. The use of non-invasive and ergonomic monitoring system/device allows movement during measurement, thus allowing the system to monitor the physiological response in a natural ambulatory environment,” 61 “acquisition electronics 128 may include analog-to-digital convertors or capacitance-to-digital converters, trans-impedance amplifiers or other amplifier circuitries, appropriate filters (e.g., low pass and/or high pass filters),” 69 “learning techniques include but are not limited to autoencoders and embeddings. The term “deep learning” is defined herein to be a subset of machine learning that enables a machine to automatically discover representations needed for feature detection, prediction, classification, etc., using layers of processing. Deep learning techniques include but are not limited to artificial neural networks or multilayer perceptron (MLP),” 71 “disclosure contemplates that the machine learning model can be any supervised learning model, semi-supervised learning model, or unsupervised learning model.
Optionally, the machine learning model is a deep learning model. Machine learning models are known in the art and are therefore not described in further detail herein,” 143 “algorithm based on machine learning could offer automated signal discrimination and behavioral classification for further study [51, 52]. Since the EMG signal could be slightly different depending on the area to which the electrode was attached, constant localization with increasing the number of samples was demanded for accurate results,”]).

Yeo does not explicitly disclose; however, Lee discloses: one or more state features, and the method further comprises extracting the state features from the instant behavioral features using a machine-learning state-extraction component, wherein the machine-learning state-extraction component comprises a supervised machine-learning component, an unsupervised machine-learning component ([113, 114, 125, 133]), or both ([27 “attention mechanism uses a node state matrix and two trainable functions to produce a probability vector indicating the relevancy of different motifs and a probability vector indicating the relevancy of different step sizes for each respective target node,” 30 “entity includes a person, such as a user of a service, a member of a social network, a researcher in a citation network, or the like.
In another example, an entity includes an object or an item, such as a user session with a web-based application,” 100, 101 “the state matrix encoding node states at layer l is a concatenation of two matrices,” 102, 133 “determining a state matrix (see, e.g., equation (19)) that includes the motif count matrix and, for each respective node in the graph, a weighted sum of the attributes of neighboring nodes each connected to the respective node by a single edge (i.e., the one-hop edge-induced neighborhood); (3) applying a first trainable function (e.g., ƒ.sub.l) to the state matrix to determine, for each respective node, a probability value associated with each type of motif in the multiple types of motifs; and (4) selecting, for each respective node, a type of motif corresponding to a highest probability value among the multiple types of motifs as the selected type of motif t. To select the step size of the selected type of motif for each respective node in the graph, the computer system applies a second trainable function (e.g., ƒ′.sub.l) to the state matrix and the probability value associated with each type of motif in the multiple types of motifs for the respective node,”]); one or more motif features, and the method further comprises extracting the motif features from the state features using a machine-learning motif-extraction component, wherein the machine-learning motif-extraction component comprises a supervised machine-learning component, an unsupervised machine-learning component ([113, 114, 125, 133]), or both ([56 “trained attention mechanism is used to select the most relevant neighborhood for each respective node in the graph, which includes selecting an appropriate motif and/or an appropriate distance (e.g., step size or number of hops) that define the most relevant motif-induced neighborhood,” 57 “output includes, for each node in a set of nodes in the graph, a weighted sum of the attributes of the nodes in the respective motif-induced neighborhood 
determined by the attention mechanism. The weighted sum represents a node's new features or attributes extracted from its motif-induced neighborhood,” 57, 58 “one or more computing systems make an inference regarding the set of entities based on the output at the last graph convolutional layer that includes extracted attributes for the nodes in the graph. As described above, the one or more computing systems implement a fully-connected layer and/or a softmax engine to make prediction or classification,” 59]); and one or more domain features, and the method further comprises extracting the domain features from the motif features using a machine-learning higher-order-extraction component, wherein the machine-learning higher-order-extraction component comprises a supervised machine-learning component, an unsupervised machine-learning component ([113, 114, 125, 133]), or both ([27 “graph convolutional layer then combines (e.g., as a weighted sum of) features of the nodes in the defined neighborhood to extract new features for the target node. An activation engine applies a nonlinear function (e.g., ReLU) to the output from a graph convolutional layer and send the output to the next graph convolutional layer,” 56 “trained attention mechanism is used to select the most relevant neighborhood for each respective node in the graph, which includes selecting an appropriate motif and/or an appropriate distance (e.g., step size or number of hops) that define the most relevant motif-induced neighborhood,” 57 “weighted sum of the attributes of the nodes in the respective motif-induced neighborhood determined by the attention mechanism. The weighted sum represents a node's new features or attributes extracted from its motif-induced neighborhood. The output from a graph convolutional layer is processed by an activation engine that applies a nonlinear function as described above to the output, and is then used as the input to a subsequent graph convolutional layer. 
In each subsequent graph convolutional layer, attributes of nodes in a wider neighborhood of a target node are integrated (e.g., as a weighted sum) to extract new attributes for the target node,” 58 “inference regarding the set of entities based on the output at the last graph convolutional layer that includes extracted attributes for the nodes in the graph. As described above, the one or more computing systems implement a fully-connected layer and/or a softmax engine to make prediction or classification,” 59 “graph convolutional network 100 uses various motifs to select nodes within different neighborhoods for information integration at different target nodes in the graph, where the different neighborhoods include nodes within different distances from respective target nodes and/or nodes forming different structural relationships (i.e., motifs) with the respective target nodes,”]).

Therefore it would be obvious for Yeo to include one or more state features, and the method further comprises extracting the state features from the instant behavioral features using a machine-learning state-extraction component, wherein the machine-learning state-extraction component comprises a supervised machine-learning component, an unsupervised machine-learning component, one or more motif features, and the method further comprises extracting the motif features from the state features using a machine-learning motif-extraction component, wherein the machine-learning motif-extraction component comprises a supervised machine-learning component, an unsupervised machine-learning component, and one or more domain features, and the method further comprises extracting the domain features from the motif features using a machine-learning higher-order-extraction component, wherein the machine-learning higher-order-extraction component comprises a supervised machine-learning component, an unsupervised machine-learning component, or both, as per the steps of Lee, and results in the provision of EEG and
related acceleration data, the processing of that data to predict label classes, and the implementation of treatments using the class labels as interpreted by the system, thereby optimizing the provided treatments.

Claim(s) 89, 205, and 321 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Yeo et al. (20240252102) and Dey et al. (20200118040) and in further view of Cook (20020059159).

Claims 89, 205, 321: Yeo discloses the computer-implemented method and system of claims 79, 195, and 311 and Yeo further discloses: obtaining reference EEG data from the plurality of electrodes positioned on the animal subject to which a reference drug is administered at a second dose ([43 “treating” or “treatment” of a subject includes the administration of a drug to a subject with the purpose of preventing, curing, healing, alleviating, relieving, altering, remedying, ameliorating, improving, stabilizing or affecting a disease or disorder, or a symptom of a disease or disorder,” 46 “effective amount” of a drug necessary to achieve a therapeutic effect may vary according to factors such as the age, sex, and weight of the subject. Dosage regimens can be adjusted to provide the optimum therapeutic response,” 47, 56 “acquired time-series data of sensor 106 may also include electromyogram (EMG), electrocardiogram (ECG), electroencephalogram (EEG), phonocardiogram, voltage potential, impedance, and acoustics,” 59 “skin-wearable sensor system 104 is configured to communicate through a short-range communication channel 126 to a data acquisition system 128 configured with a network interface 130, data storage 132, and monitoring and control module 134 (also referred to as a controller 134).
The acquired data can be subsequently analyzed in an analysis system or operation,” 71, 108 “the motion sensor package can use a sensitive accelerometer and gyroscope, which identifies the mouse movements to distinguish mastication motions. A mobile device embeds a custom-designed app to offer real-time signal displaying and data storing to analyze muscle functions,”]); obtaining reference acceleration data from the one or more accelerometers positioned on the animal subject to which the reference drug is administered at the second dose ([43 “treating” or “treatment” of a subject includes the administration of a drug to a subject with the purpose of preventing, curing, healing, alleviating, relieving, altering, remedying, ameliorating, improving, stabilizing or affecting a disease or disorder, or a symptom of a disease or disorder,” 46 “effective amount” of a drug necessary to achieve a therapeutic effect may vary according to factors such as the age, sex, and weight of the subject. Dosage regimens can be adjusted to provide the optimum therapeutic response,” 47, 56 “acquired time-series data of sensor 106 may also include electromyogram (EMG), electrocardiogram (ECG), electroencephalogram (EEG), phonocardiogram, voltage potential, impedance, and acoustics,” 59 “skin-wearable sensor system 104 is configured to communicate through a short-range communication channel 126 to a data acquisition system 128 configured with a network interface 130, data storage 132, and monitoring and control module 134 (also referred to as a controller 134). The acquired data can be subsequently analyzed in an analysis system or operation,” 71, 108 “the motion sensor package can use a sensitive accelerometer and gyroscope, which identifies the mouse movements to distinguish mastication motions. 
A mobile device embeds a custom-designed app to offer real-time signal displaying and data storing to analyze muscle functions,”]);

Examiner Note: Examiner interprets the disclosures of Yeo under a broadest reasonable interpretation to disclose the optimization of drug dosages with respect to treatment methodologies.

Yeo does not explicitly disclose; however, Cook discloses: generating a similarity value for the reference drug using the EEG data, the acceleration data, the reference EEG data, and the reference acceleration data ([134 “similarity matrices 78 may be particularly useful. A similarity matrix 78 enables a user to objectively calculate how similar the effects of a particular state 40a are to the effects of another state 40b. This comparison may have a profound impact on the ability of a user to predict and quantify the effects of a particular drug (state A) with a drug (state B) having known effects,” 135-138, 139 “a similarity matrix may also be used to identify relationships between one dose of a given drug and a collection of other doses all of the same drug. In this way, dose response relationships can be determined according to the way a particular set of doses affect an observed entity. As can be seen, dose one and dose two are most likely to produce a similar result,” 145 “Motor activity was assessed by a piezoelectric activity transducer cemented to underside of the suspended floor of the chamber. Piezoelectric activity was filtered at 0.1-100 Hz (-3 dB) and amplified 10-100 times by an Axon Instruments CyberAMP 380 signal conditioner.
EEG and piezoelectric recordings were digitized at 1000 Hz with a National Instruments PCI-MIO-16XE-10 multifunction I/O board in a Pentium III computer and streamed to disk for later analysis,” 146 “displaying EEG and piezoelectric activity were superimposed on the video signal for off-line correlation of behavior with electrophysiological responses,” 157 “time course of the effects of each of the drugs on EEG activation was determined for each rat. The activation value plot contains one hour of recorded and processed EEG data. The EEG data was divided into epochs of two seconds in duration. Chlordiazepoxide (6.0 mg/kg) was administered intravenously thirty minutes into the sampling. Activation values represent the degree of presence (greater than zero) or absence (less than zero) of a pattern of features (e.g., a pattern of phase-weighted frequencies) inherent in the raw spontaneous EEG.” 161-166]).

Examiner Note: Examiner under a broadest reasonable interpretation interprets the disclosures of Cook with respect to the collection and processing of motor activity to disclose the collection of acceleration data as well as the collection […]
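The similarity-matrix analysis the examiner cites from Cook, comparing the recorded EEG/motor-activity "signature" of a candidate drug against reference doses with known effects, can be sketched roughly as follows. The signature names, feature choices, and numbers below are hypothetical placeholders for illustration only, not data or code from the cited references:

```python
import numpy as np

# Hypothetical per-state effect signatures, e.g.
# [EEG delta power, EEG gamma power, activation value, motor activity].
effect_signatures = {
    "candidate_drug":   np.array([0.80, 0.10, 0.40, 0.25]),
    "reference_dose_1": np.array([0.78, 0.12, 0.38, 0.30]),
    "reference_dose_2": np.array([0.20, 0.90, 0.05, 0.60]),
}

def similarity_matrix(signatures):
    """Pairwise cosine similarity between effect-signature vectors."""
    mat = np.stack(list(signatures.values()))
    unit = mat / np.linalg.norm(mat, axis=1, keepdims=True)
    return unit @ unit.T  # entry [i, j]: similarity of state i to state j

sim = similarity_matrix(effect_signatures)
# A high off-diagonal entry suggests the candidate behaves like that
# reference state; here the candidate tracks reference_dose_1 closely.
```

Under this sketch, comparing one row of `sim` across reference doses gives the kind of dose-response relationship Cook's similarity matrix is described as revealing.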

Prosecution Timeline

Jun 14, 2024
Application Filed
Nov 26, 2025
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594431
AED ACTIONS REMOTELY TRIGGERED BY AED MANAGEMENT PLATFORM
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12580054
COMPUTATIONALLY-EFFICIENT LOAD PLANNING SYSTEMS AND METHODS OF DIAGNOSTIC LABORATORIES
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12555679
HUMIDIFICATION DEVICE COMMUNICATIONS
Granted Feb 17, 2026 (2y 5m to grant)
Patent 12548681
METHOD AND DEVICE FOR ADAPTIVELY DISPLAYING AT LEAST ONE POTENTIAL SUBJECT AND A TARGET SUBJECT
Granted Feb 10, 2026 (2y 5m to grant)
Patent 12525346
VIRTUAL CARE SYSTEMS AND METHODS
Granted Jan 13, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 57%
With Interview: 82% (+24.9%)
Median Time to Grant: 3y 7m
PTA Risk: Low
Based on 522 resolved cases by this examiner. Grant probability derived from career allow rate.
