Prosecution Insights
Last updated: April 19, 2026
Application No. 18/694,271

A COMPUTER IMPLEMENTED METHOD FOR ASSESSING AND DETERMINING A COMPLEXITY LEVEL OF A CLINICAL TRIAL STUDY

Non-Final OA: §101, §102, §103, §112
Filed: Mar 21, 2024
Examiner: WEBB, JESSICA MARIE
Art Unit: 3683
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Novartis AG
OA Round: 1 (Non-Final)
Grant Probability: 33% (At Risk)
OA Rounds: 1-2
To Grant: 3y 0m
With Interview: 86%

Examiner Intelligence

Career Allow Rate: 33% (33 granted / 99 resolved; -18.7% vs TC avg)
Interview Lift: +52.5% (resolved cases with an interview vs. without)
Typical Timeline: 3y 0m avg prosecution; 21 applications currently pending
Career History: 120 total applications across all art units

Statute-Specific Performance

§101: 33.6% (-6.4% vs TC avg)
§103: 34.3% (-5.7% vs TC avg)
§102: 5.1% (-34.9% vs TC avg)
§112: 23.3% (-16.7% vs TC avg)
Tech Center averages are estimates. Based on career data from 99 resolved cases.

Office Action

Rejections under §101, §102, §103, and §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

DETAILED ACTION

Status of Claims

In the claims filed 1/21/2025, the following occurred: claims 3, 9, 13, and 15 were amended; claims 4-8, 10-12, and 17 were canceled. Claims 1-3, 9, 13-16, and 18-19 are pending and have been examined.

Priority

Acknowledgement is made of applicant's claim to priority under 35 U.S.C. 371 to PCT Application No. PCT/IB2022/058938, filed 09/21/2022, which claims priority to U.S. Provisional Patent Application No. 63/247,602, filed 09/23/2021.

Information Disclosure Statement

The Information Disclosure Statement (IDS) submitted on 7/8/2024 follows the provisions of 37 CFR 1.97 and has been fully considered by the Examiner.

Claim Objections

Claims 1, 9, and 18-19 are objected to because of the following informalities:

Claim 9 depends on claim 1, not on claim 2. As such, "the standardized score value" of claim 9 should be recited as the "scaled score value" of claim 1. That is, "the calculation of the standardized score value" of claim 9 is actually "the calculation of the scaled score value" unless the dependency is changed. Appropriate correction is required.

Claims 1 and 18-19 recite "one or more input and/or output elements" and subsequently recite "by the input element". The term "and/or" requires only one input element or one output element. The subsequent selecting (or defining) step requires implementation with the input element; it does not require an output element. There is no way in which this claim would be implemented with only one or more output elements. The Examiner suggests amending to recite "one or more input elements and one or more output elements".

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 14-16 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention.

Claim 14 recites the limitation "the graphical user interface". There is insufficient antecedent basis for this limitation in the claim: parent claim 1 does not recite a graphical user interface. The rejection of claim 14 also applies to dependent claims 15-16. Appropriate correction is required.

The following is a quotation of 35 U.S.C. 112(d):

(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

Claim 14 is rejected under 35 U.S.C. 112(d) as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Claim 14 depends improperly on canceled claim 5. Applicant may cancel the claim, amend the claim to place it in proper dependent form, rewrite it in independent form, or present a sufficient showing that the dependent claim complies with the statutory requirements. The rejection of claim 14 also applies to dependent claims 15-16. Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C.
101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-3, 9, 13-16, and 18-19 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Claims 1 and 18-19 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 (YES): Claims 1 and 18-19 fall into at least one of the statutory categories (i.e., process, machine, or non-transitory CRM).

Step 2A1 (YES): The limitations of selecting and/or defining a clinical trial protocol and, based on the selection and/or definition thereof, activating and/or generating data entries for a set of corresponding parameters correlated to a set-up of the portion of the clinical trial study; wherein the parameters refer to objectives and/or design and/or methods and/or patient assessment procedures and/or patient data collection schedules of the clinical trial study; automatically assigning a score value to each parameter in the set of parameters; applying statistical rules to the score value of each parameter to obtain a scaled score value; and calculating a complexity level for the portion of the clinical trial study, based on the scaled score values of each parameter in the set of parameters; wherein the score value of each parameter is adaptively based on benchmark score values of retrospective/previously assessed clinical trial studies for the same parameter, as drafted, is a process that, under the broadest reasonable interpretation (BRI), covers a method of organizing human activity (i.e., managing personal behavior or relationships or interactions between people, including following rules or instructions) but for the recitation of generic computer component language (discussed below in Step 2A2).

That is, other than reciting the generic computer component language, the claimed invention amounts to a person assessing and determining a complexity level of at least a portion of a clinical trial study, which is a method of managing personal behavior or relationships or interactions between people. For example, but for the input element, the claims encompass a person selecting a clinical trial protocol and subsequently activating data entries for a set of corresponding parameters. Likewise, but for the processor, the claims encompass a person assigning, applying, or calculating data in the manner described in the identified abstract idea, supra. The Examiner notes that certain "methods of organizing human activity" include a person's interaction with a computer (see MPEP § 2106.04(a)(2)(II)). If a claim limitation, under its broadest reasonable interpretation, covers managing personal behavior or relationships or interactions between people but for the recitation of generic computer component language, then it falls within the "certain methods of organizing human activity" grouping of abstract ideas. See additionally MPEP § 2106. Accordingly, the claims recite an abstract idea.

Eligibility Analysis Step 2A2 (NO): The judicial exception, the above-identified abstract idea, is not integrated into a practical application. In particular, the claims recite the additional elements of a computing device / electronic device including one or more processors; one or more input and/or output elements; and a memory / non-transitory computer-readable storage medium that implement the identified abstract idea.
The aforementioned additional elements are not described by the applicant and are recited at a high level of generality (i.e., a generic computer or computer component performing a generic function that facilitates the identified abstract idea), such that they amount to no more than mere instructions to apply the exception using a generic computer component (see Specification, e.g., at para. 0053, 0088). See MPEP § 2106.04(d)(I). Accordingly, alone or in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims are directed to an abstract idea.

Eligibility Analysis Step 2B (NO): The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of a computing device / electronic device including one or more processors; one or more input and/or output elements; and a memory / non-transitory computer-readable storage medium to perform the abstract idea amount to no more than mere instructions to apply the exception using a generic computer or generic computer component. Mere instructions to apply an exception using generic computers and/or generic computer components cannot provide an inventive concept ("significantly more"). See MPEP § 2106.05(f).

Dependent claims 2-3, 9, and 13-16, when analyzed as a whole, are similarly rejected under 35 U.S.C. § 101 because the additional limitations fail to establish that the claims are not directed to an abstract idea without significantly more. The claims, when considered alone or as an ordered combination, either (1) merely further define the abstract idea, (2) do not further limit the claim to a practical application, or (3) do not provide an inventive concept such that the claims are subject matter eligible.

Claim 2 further describes the abstract idea (e.g., calculating the scaled score value of each parameter). The limitation below is considered to be part of the CMOHA (certain methods of organizing human activity) because it falls under data manipulations that humans perform and thus is part of the rules or instructions. Alternately, and for completeness, the limitation wherein the scaled score value of each parameter is the standardized score value (SV) for each parameter, calculated according to SV = (x - mean) / standard deviation (and the descriptions of "x", "mean", and "standard deviation"), as drafted, is a process that under the broadest reasonable interpretation covers a mathematical concept, which includes mathematical relationships, mathematical formulas or equations, and mathematical calculations. Claim 2 describes mathematical expressions and manipulations. See also specification at para. 0016, 0020, 0087. That is, the claim recites a procedure for calculating the scaled score value of each parameter that encompasses a mathematical concept. For example, the claim encompasses mathematical calculations using the mathematical formula SV = (x - mean) / standard deviation. The Examiner notes that a mathematical concept need not be expressed in mathematical symbols. MPEP § 2106.04(a)(2)(I). If a claim limitation under its broadest reasonable interpretation encompasses a mathematical concept, then it falls within the "Mathematical Concepts" grouping of abstract ideas. The types of identified abstract ideas are considered together as a single abstract idea for analysis purposes. Accordingly, the claim recites an abstract idea.
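Editor's note: for readers following the claim 2 formula SV = (x - mean) / standard deviation, the standardized-score (z-score) calculation can be sketched as below. This is an illustrative sketch of the formula as quoted in the Office Action, not code from the application; the function name and sample values are hypothetical.

```python
import statistics

def standardized_scores(score_values):
    """Standardize each parameter's score value: SV = (x - mean) / standard deviation."""
    mean = statistics.mean(score_values)
    sd = statistics.stdev(score_values)  # sample standard deviation (square root of variance)
    return [(x - mean) / sd for x in score_values]

# Hypothetical raw score values for a set of four parameters (mean = 5.0)
raw = [4.0, 8.0, 6.0, 2.0]
print(standardized_scores(raw))  # e.g., 8.0 standardizes to about +1.16
```

The standardized values are dimensionless, which is what lets score values for very different parameters (e.g., number of objectives vs. number of assessments) be compared and combined into one complexity level.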
Claim 3 merely further describes the additional elements of the computing device (e.g., presenting, activating, adjusting data). See Applicant's disclosure at Fig. 1 and para. 0048-0049. See analysis, supra.

Claim 9 merely further describes the abstract idea (e.g., an adjustment factor to obtain an adjusted score value (intended result of use), applicable prior to the calculation of the standardized score value (SV) (optional language); configured to prevent that a score value of any parameter dominates over the score value of other parameters (intended result of use)).

Claims 13-16 merely further recite the additional element of the computing device comprising a graphical user interface to implement the abstract idea (e.g., visually conveying data, selecting data, changing data in real time), which amounts to no more than a generic computer component (see specification at Fig. 1 and para. 0048-49). See analysis, supra.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 9, 13-16, and 18-19 are rejected under 35 U.S.C. 102(a)(1) and 102(a)(2) as being anticipated by Campo et al. (US 2011/0153358 A1; "Campo" herein).

Re. Claim 1, Campo teaches a computer-implemented method for assessing and determining a complexity level of at least a portion, or pillar, of a clinical trial study (see Abstract), comprising the steps of:

at a computing device including at least a processor, a memory, and input and/or output elements (Abstract and Fig. 1 teach the service delivery device includes a processor, a memory for storing instructions, and a display. See also [0023].);

by the input element, selecting and/or defining a clinical trial protocol and, based on the selection and/or definition thereof, activating and/or generating data entries for a set of corresponding parameters correlated to a set-up of the portion of the clinical trial study ([0026] teaches FIGS. 3-13 show computer screen shots 70, 80, 90, 100, 110, 120, 130, 140, 150, 160, and 170 of exemplary displays 12, which may be used in combination with the present invention to plan and design a clinical trial. The present invention permits the protocol to be created using a text-based authoring environment as shown in FIG. 3 at 70 (selecting or defining the clinical trial protocol) and a wizard-based interface (by the input element) as shown in FIGS. 4-13 at 80, 90, 100, 110, 120, 130, 140, 150, 160, and 170 (activating or generating the data entries for the set).);

wherein the parameters refer to objectives and/or design and/or methods and/or patient assessment procedures and/or patient data collection schedules of the clinical trial study (Fig. 5, [0028]-[0029] teach a list of objectives 92 associated with the medical study, e.g., objectives entered in Fig. 5 at 90. Figs. 7, 9-10 and [0030]-[0032] teach selection of procedures/tasks and collecting associated design study information. See also Figs. 11-13 and [0033]. The Examiner notes only one of these is required for the claim to be met.);

by the processor, automatically assigning a score value to each parameter in the set of parameters (Fig. 2, [0010] teach the memory causes the processor to (1) select a set of procedures from the plurality of medical procedures and (2) assign a work effort unit (score value) for each procedure included in the set of procedures);

applying statistical rules to the score value of each parameter to obtain a scaled score value, and calculating a complexity level for the portion of the clinical trial study, based on the scaled score values of each parameter in the set of parameters ([0010], [0012] teach (3) calculating a complexity level for the set of procedures (portion) based on the work effort units... includes the following: all procedures included in the set of medical procedures are identified; the work effort unit is multiplied by a number of occurrences of each of the unique procedures in the set of medical procedures (applying statistical rules to the score value of each parameter); and finally, the multiplied work effort units (scaled score values) are summed.);

wherein the score value of each parameter is adaptively based on benchmark score values of retrospective/previously assessed clinical trial studies for the same parameter ([0011] teaches the work effort unit is based on at least one of procedure type, procedure cost, procedure time, and procedure phase. Fig. 2, [0019]-[0020] teach using relative value units (RVUs) (benchmark score values) to calculate the complexity of the procedures... an RVU value or an RVU-like value will be assigned to every clinical procedure in a clinical trial.)

Re.
Claim 9, Campo teaches the method of claim 1, comprising an adjustment factor, or weight, for the score value of each parameter ([0010], [0012] teach (3) calculating a complexity level for the set of procedures based on the work effort units... includes the following: all procedures included in the set of medical procedures are identified; the work effort unit (score value) is multiplied by a number of occurrences of each of the unique procedures in the set of medical procedures (adjustment factor); and finally, the multiplied work effort units are summed.); to obtain an adjusted score (the Examiner notes that "to obtain..." is an intended use of "an adjustment factor"); applicable prior to the calculation of the standardized score value (SV) (the Examiner notes that "applicable..." is optional language, which is not required for the claim to be met); the adjustment factor, or weight, being configured to prevent that a score value of any parameter dominates over the score value of other parameters (the Examiner notes that "configured to prevent..." is an intended result, which is not required for the claim to be met).

The Examiner interprets limitations that contain "applicable" (able to be applied) as optional language. As a matter of linguistic precision, optional claim elements do not narrow claim limitations since they can be omitted; "[c]laim scope is not limited by claim language that suggests or makes optional but does not require steps to be performed." MPEP 2111.04 & 2173.05(d) (see also In re Johnston, 435 F.3d 1381, 77 USPQ2d 1788 (Fed. Cir. 2006)).

Re. Claim 13, Campo teaches the method of claim 1, wherein the computing device comprises a graphical user interface configured to visually convey to a user, or operator, a measure of the impact on the complexity level of each portion, or pillar, and/or of each parameter comprised in the clinical trial study, as a result of a current selection and/or definition of a clinical trial protocol (Fig. 7, [0023], [0026] teach computer screen shots of exemplary displays 12, e.g., graphical user interfaces, for planning and designing a clinical trial permit the protocol to be created using a wizard-based interface as shown in Fig. 7 at 110. Figs. 7, 9, [0030]-[0031] teach display of important design study information, including a basic schedule of selected tasks/procedures and associated cost 132 (a measure of the impact on the complexity level).)

Re. Claim 14, Campo teaches the method of claim 5, wherein the graphical user interface is interactive in that it changes in real time, based on modified selection and/or definition of a clinical trial protocol by a user, or operator (Figs. 3-13, [0023], [0026] teach computer screen shots of exemplary displays 12, e.g., graphical user interfaces, for planning and designing a clinical trial permit the protocol to be created using a text-based authoring environment as shown in Fig. 3 at 70 and a wizard-based interface as shown in Figs. 4-13. Fig. 5, [0028] teach objectives may be selected by typing in a text box, selecting from a list or a pull-down menu... and may be edited (changed in real time based on modified selection or definition) by including additional objectives 98 or making changes to the listed objectives 94, 96... The outcome 102, 104 may also be edited by including additional objectives 107 or making changes to the included objectives 102, 104. See also Fig. 7, showing GUI functionality of creating, selecting, adding, and removing tasks.)

Re. Claim 15, Campo teaches the method of claim 14, wherein the graphical user interface allows to visualize the complexity level of the clinical trial study (as an overall standardized score value), or of a portion, or pillar, thereof, in relation to other clinical trial studies comprised in the benchmark of retrospective/previously assessed clinical trial studies (Figs. 3-13, [0023], [0026] teach computer screen shots of exemplary displays 12, e.g., graphical user interfaces, for planning and designing a clinical trial permit the protocol to be created using a text-based authoring environment as shown in Fig. 3 at 70 and a wizard-based interface as shown in Figs. 4-13. The Examiner notes that "allows..." is optional language, which is not required for the claim to be met. Regardless, Fig. 2 teaches assigning the CPT code's RVU value as the WEU (the benchmark of retrospectively/previously assessed clinical trial studies). Figs. 1-2, [0010], [0012] teach (3) calculating a complexity level for the set of procedures based on the work effort units (in relation to other clinical studies)... includes the following: all procedures included in the set of medical procedures are identified; the work effort unit (score value) is multiplied by a number of occurrences of each of the unique procedures in the set of medical procedures; and finally, the multiplied work effort units are summed (an overall standardized score value).)

Re. Claim 16, Campo teaches the method of claim 15, wherein the graphical user interface allows to visualize the complexity level of the clinical trial study, either within clinical trial studies belonging to a specific developmental unit, or across all clinical trial studies, or filtering by clinical trial study phase (Fig. 9, [0023], [0026] teach computer screen shots of exemplary displays 12, e.g., graphical user interfaces, for planning and designing a clinical trial permit the protocol to be created using a wizard-based interface as shown in Fig. 9 at 130. The Examiner notes that "allows..." is optional language and is not required. Regardless, Figs. 8-9 and [0030] teach the ability to search with selected search criteria / phase and TA/indication filters (filtering); and Fig. 9, [0031] teach based on the WEU values, the complexity factor for each procedure may be acquired from the database and shown (visualized) in a panel containing search results 137 and/or included in a selected tasks panel 138.)

Re. Claim 18, the subject matter of claim 18 is essentially defined in terms of a system, which technically corresponds to method claim 1. Since claim 18 is analogous to claim 1, it is similarly analyzed and rejected in a manner consistent with the rejection of claim 1. Further, Campo teaches one or more programs stored in the memory including instructions ([0010], [0024] teach a memory 22 for storing instructions that causes the processor 20 to follow the instructions stored in the memory.)

Re. Claim 19, the subject matter of claim 19 is essentially defined in terms of a manufacture, which technically corresponds to method claim 1. Since claim 19 is analogous to claim 1, it is similarly analyzed and rejected in a manner consistent with the rejection of claim 1. Further, Campo teaches a non-transitory computer-readable storage medium storing one or more programs... the one or more programs including instructions ([0010], [0024] teach a memory 22 for storing instructions that causes the processor 20 to follow the instructions stored in the memory.)

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
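Editor's note: Campo's complexity calculation, as characterized in the rejection of claim 1 above (a work effort unit per procedure, multiplied by that procedure's number of occurrences, then summed), can be sketched as below. This is an illustrative reading of the cited paragraphs, not code from either reference; the WEU values and procedure names are hypothetical.

```python
from collections import Counter

def complexity_level(procedures, weu_table):
    """Sum, over unique procedures, the work effort unit (WEU) times its occurrence count."""
    counts = Counter(procedures)  # occurrences of each unique procedure
    return sum(weu_table[proc] * n for proc, n in counts.items())

# Hypothetical RVU-derived WEU benchmark values per procedure
weu = {"blood_draw": 0.17, "mri": 2.3, "physical_exam": 0.5}
schedule = ["blood_draw", "blood_draw", "mri", "physical_exam"]
print(complexity_level(schedule, weu))  # 2*0.17 + 2.3 + 0.5
```

The claim-mapping dispute turns on whether this multiply-and-sum counts as "applying statistical rules to the score value of each parameter to obtain a scaled score value", which the applicant's claim 2 defines as z-score standardization rather than weighting by occurrence count.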
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 2-3 are rejected under 35 U.S.C. 103 as being unpatentable over Campo in view of John (US 6,067,467 A).

Re. Claim 2, Campo teaches the method of claim 1, wherein the scaled score value of each parameter is the standardized score value (SV) for each parameter, calculated according to the following: [...] wherein "x" is an adjusted score value for each parameter (assigned work effort units) (see Figs. 1-2), derived from a predefined scoring table reflecting the relative importance of the score value of the parameters (Fig. 2 teaches the method for determining the work effort unit includes assigning the work effort unit value using a time estimate table (predefined scoring table) 60.); [...], based on the benchmark score values of the retrospective/previously assessed clinical trial studies for the same parameter (the RVUs); and [...].

Campo does not teach SV = (x - mean) / standard deviation... wherein "mean" is the mean of each score value... and "standard deviation" is a value calculated as the square root of variance, by determining each parameter's score value deviation relative to the mean, measuring the amount of variation, or dispersion, of a distribution of the score values of a parameter in the set.

John teaches SV = (x - mean) / standard deviation... wherein "mean" is the mean of each score value... and "standard deviation" is a value calculated as the square root of variance, by determining each parameter's score value deviation relative to the mean, measuring the amount of variation, or dispersion, of a distribution of the score values of a parameter in the set (Col. 9, lines 19-34 teaches each measure may be z-transformed using the corresponding mean and standard deviation obtained from the baseline. Each z-score (SV) for a patient is calculated in the following manner: the reference pre-operative mean, X̄, for a particular measure is subtracted from the value X for that measure obtained from the patient during the operation. The difference, X − X̄, is divided by the standard deviation, s, of that measure for the baseline. Thus, z = (X − X̄)/s. The Examiner notes that John is only required to teach "z-score... is calculated", since a person having ordinary skill in the art would understand SV to be the definition of a z-score.)

Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date to have modified the protocol complexity analyzer of Campo to calculate the z-score for each of the work effort units of the procedures/tasks and to use this information as part of a method of patient monitoring as taught by John (see Fig. 6B), with the motivation of improving clinical data analysis (e.g., evaluation) and the quality of measured/observed clinical data (see John Fig. 6B; col. 9, lines 19-34; and col. 9, lines 35-47).

Re. Claim 3, Campo teaches the method of claim 1, wherein, when selecting and/or defining a clinical trial protocol, the computing device automatically presents an initial user interface to an operator ([0023], [0026] teach the display 12 may be a graphical user interface that presents data according to the exemplary computer screen shots in Figs. 3-13 (initial user interface(s), e.g., Fig. 9)), comprising a preliminary multiplicity of clinical trial protocol data entries activated and/or generated (Fig. 3 shows data entered (activated or generated) for Protocol ID, sponsor name and address, countries, single/multi-centers, and number of centers. See also Fig. 9 tasks) based on an initial choice by an operator of at least one of: a category of therapeutic area; a developmental unit (83) (Fig. 4, [0026]-[0027] teach data entry in the 'protocol design guide' window in fields 81-88 (based on initial choice(s)), including a full title 81, a protocol ID 82, a phase 83 (developmental unit), an indication 84, a description 85, a study type 86, a list of countries 87, and regulatory/governmental information 88. Figs. 8-9, [0030] teach selecting search criteria 'TherapeuticArea' (an initial choice) and tasks (preliminary multiplicity) with associated phase and TA/indication search criteria... since oftentimes the complexity of a procedure depends on the type of study (i.e., Phase 1, 2, or 3) and/or the therapeutic area being studied.); and subsequently the computing device presents to the operator clinical trial protocol data entries (Fig. 9 procedures/tasks in Basic Schedule) adaptively adjusted, based on [...] that the set of corresponding parameters drives the complexity level of the portion of the clinical trial study ([0010], [0012] teach (3) calculating a complexity level for the set of procedures based on the work effort units... includes the following: all procedures included in the set of medical procedures are identified; the work effort unit is multiplied by a number of occurrences of each of the unique procedures in the set of medical procedures (adaptively adjusted); and finally, the multiplied work effort units are summed.)

Campo may not teach adaptively adjusting based on a probability value. John teaches adaptively adjusting based on a probability value (Col. 9, lines 19-34 teaches each measure (Campo's work effort unit) may be z-transformed (adaptively adjusted) using the corresponding mean and standard deviation obtained from the baseline. Each z-score for a patient is calculated in the following manner... the z-score provides an estimate of the probability that an observed measure is improbable.)

Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date to have modified the protocol complexity analyzer of Campo to calculate the z-score for each of the work effort units of the procedures/tasks and to use this information as part of a method of patient monitoring as taught by John (see Fig. 6B), with the motivation of improving clinical data analysis (e.g., evaluation) and the quality of measured/observed clinical data (see John Fig. 6B; col. 9, lines 19-34; and col. 9, lines 35-47).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:

Bhattacharya et al. (US 2021/0241859 A1; see IDS document) for teaching the z-score (SV by definition) (see para. 0730: "a score component x may be normalized to a score component x' (SV) according to x' = (x - x_min) / (x_max - x_min). In embodiments, normalization may include normalization techniques that include and/or are based on... z-score"), with the motivation to combine being the improvement of clinical trial design recommendations (see Bhattacharya at Abstract and para. 0010, 0199, 0217, 0270). This reference could be applied to reject claim 2 as evidenced by John (or "Standard score" from Wikipedia).

"Standard score" from Wikipedia for evidencing the definition of a z-score / normal score / standard score, and using the z-score to calculate a prediction interval. Note: a person having ordinary skill in the art would understand SV to be the definition of a z-score.

Shields et al.
(US 2015/0178244 A1) for teaching method and apparatus for determining complexity of a clinical trial, e.g., calculating study design complexity with and without routinely covered procedures (Fig. 3). Bound et al. (US 2016/0203296 A1) for teaching system and method for determining a clinical trial patient burden, e.g., benchmarking and analysis (para. 0027), aggregating patient burden components to output procedure-level patient burden index (Fig. 2B, 3-4), aggregating all PBIs to calculate the overall PBI for the entire clinical trial (Fig. 7). Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jessica M Webb whose telephone number is (469)295-9173. The examiner can normally be reached Mon-Fri 9:00am-3:00pm CST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Robert Morgan can be reached on (571) 272-6773. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). 
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /J.M.W./Examiner, Art Unit 3683 /CHRISTOPHER L GILLIGAN/Primary Examiner, Art Unit 3683
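The three calculations the rejection relies on (Campo's summed work-effort complexity, John's z-transform, and Bhattacharya's min-max normalization) can be sketched in a few lines. This is only an illustrative reading of the cited passages, not code from any of the references; the function names and the example procedure values are hypothetical.

```python
from statistics import mean, stdev

def complexity_level(work_effort, occurrences):
    """Campo-style complexity (per [0010], [0012]): multiply each unique
    procedure's work effort unit by its number of occurrences, then sum."""
    return sum(work_effort[p] * occurrences[p] for p in work_effort)

def z_score(x, baseline):
    """John-style z-transform (col. 9, lines 19-34): adjust a measure
    using the mean and standard deviation of the baseline."""
    return (x - mean(baseline)) / stdev(baseline)

def min_max(x, x_min, x_max):
    """Bhattacharya-style normalization (para. 0730):
    x' = (x - x_min) / (x_max - x_min)."""
    return (x - x_min) / (x_max - x_min)

# Hypothetical example values, for illustration only.
effort = {"blood draw": 2.0, "ECG": 3.5, "MRI": 10.0}
counts = {"blood draw": 6, "ECG": 2, "MRI": 1}
print(complexity_level(effort, counts))        # 2.0*6 + 3.5*2 + 10.0*1 = 29.0
print(min_max(25.0, 0.0, 100.0))               # 0.25
```

The combination the examiner proposes amounts to feeding Campo's per-procedure work effort units through John's `z_score` so that unusually effort-heavy procedures stand out as improbable relative to a baseline.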

Prosecution Timeline

Mar 21, 2024
Application Filed
Nov 05, 2025
Non-Final Rejection — §101, §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585721
SINGLE BARCODE SCAN CAST SYSTEM FOR PHARMACEUTICAL PRODUCTS
2y 5m to grant · Granted Mar 24, 2026
Patent 12525336
INTELLIGENT MEDICAL ASSESSMENT AND COMMUNICATION SYSTEM WITH ARTIFICIAL INTELLIGENCE
2y 5m to grant · Granted Jan 13, 2026
Patent 12394505
ELECTRONIC HEALTH RECORD INTEROPERABILITY TOOL
2y 5m to grant · Granted Aug 19, 2025
Patent 12347541
CAREGIVER SYSTEM AND METHOD FOR INTERFACING WITH AND CONTROLLING A MEDICATION DISPENSING DEVICE
2y 5m to grant · Granted Jul 01, 2025
Patent 12293001
REFERENTIAL DATA GROUPING AND TOKENIZATION FOR LONGITUDINAL USE OF DE-IDENTIFIED DATA
2y 5m to grant · Granted May 06, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
33%
Grant Probability
86%
With Interview (+52.5%)
3y 0m
Median Time to Grant
Low
PTA Risk
Based on 99 resolved cases by this examiner. Grant probability derived from career allow rate.
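The projection figures above can be reproduced from the career data shown on this page. A minimal sketch, assuming the tool simply adds the interview lift to the base allow rate (that additive model is my assumption; the page does not state how the numbers are combined):

```python
granted, resolved = 33, 99   # career totals shown above: 33 granted / 99 resolved
interview_lift = 0.525       # +52.5% lift among resolved cases with an interview

base = granted / resolved               # 33/99 = 0.333... -> the "33%" figure
with_interview = base + interview_lift  # assumed additive model -> the "86%" figure

print(f"{base:.0%} base grant probability, {with_interview:.0%} with interview")
```

Note that the lift is applied to the exact base rate (33.3% + 52.5% = 85.8%), which rounds to the displayed 86%.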
