Prosecution Insights
Last updated: April 19, 2026
Application No. 18/560,644

SYSTEM AND METHOD FOR REMOTELY MONITORING MUSCLE AND JOINT FUNCTION

Non-Final OA: §102, §103
Filed: Nov 13, 2023
Examiner: FARDANESH, MARJAN
Art Unit: 3791
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: University of Vermont and State Agricultural College
OA Round: 1 (Non-Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 6m
With Interview: 91%

Examiner Intelligence

Career Allow Rate: 72% (613 granted / 846 resolved; +2.5% vs TC avg; above average)
Interview Lift: strong, +18.5% for resolved cases with an interview
Typical Timeline: 3y 6m average prosecution (28 applications currently pending)
Career History: 874 total applications across all art units
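A back-of-the-envelope reconstruction of these headline figures (the dashboard's actual model is not published; the additive interview adjustment below is an assumption, not the product's method):

```python
# Illustrative reconstruction of the dashboard's headline figures.
# The additive interview adjustment is an assumption; the product's
# actual methodology is not disclosed.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

base = allow_rate(613, 846)    # 613 granted / 846 resolved
with_interview = base + 18.5   # assumed additive interview lift

print(round(base))             # 72
print(round(with_interview))   # 91
```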

Statute-Specific Performance

§101: 8.8% (-31.2% vs TC avg)
§103: 29.6% (-10.4% vs TC avg)
§102: 28.7% (-11.3% vs TC avg)
§112: 21.8% (-18.2% vs TC avg)
Deltas are relative to the Tech Center average estimate • Based on career data from 846 resolved cases
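The per-statute deltas are self-consistent with a single Tech Center baseline; a quick check (assuming each delta is the examiner's rate minus the TC average):

```python
# Recover the implied Tech Center baseline from the published deltas,
# assuming delta = examiner_rate - tc_average (an assumption about
# how the dashboard defines its "vs TC avg" figures).
examiner = {"101": 8.8, "103": 29.6, "102": 28.7, "112": 21.8}
delta    = {"101": -31.2, "103": -10.4, "102": -11.3, "112": -18.2}

tc_avg = {s: round(examiner[s] - delta[s], 1) for s in examiner}
print(tc_avg)  # every statute implies the same ~40.0% TC-average estimate
```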

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-2, 6, 8-10, 13, and 16-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wang et al. (Prediction of joint moments using a neural network model of muscle activations from EMG signals; cited by the Applicant).

Regarding claim 1, Wang discloses a system (a system; page 36, column 2, para 4) for determining dynamics of a joint of an individual (an adjusted back-propagation algorithm was developed; joint torque at the elbow was calculated; generation of muscle forces and joint moments; Figs. 1-7, Abstract & page 30, column 2, para 5), comprising: a first muscle contraction sensor configured to measure an excitation of a first muscle adjacent to a joint (the excitation of muscle tissue acts through activation dynamics to generate an internal muscle tissue state, which is called muscle activation; through muscle contraction dynamics, muscle activation energizes the cross bridges and muscle force is developed; once muscle forces are found, they can be multiplied by the respective muscle moment arms and summed to yield the joint moment; page 30, column 2, para 2); a first movement sensor configured to measure movement on a first side of the joint (joint torque at the elbow was calculated from the EMG (sensor) signals of ten flexor (first side) and extensor muscles; Fig. 1 shows transformation from EMG (sensor) to joint movement; Fig. 4 shows predicted elbow flexion moments with the measured joint moments; this figure shows the predictions of joint moments based on use of the neural network estimations of muscle activations; Figs. 1, 4, Abstract & page 36, column 1, para 1); a second muscle contraction sensor configured to measure an excitation of a second muscle located adjacent to the joint (determine muscle activations (excitations) from EMG (sensor) signals; EMGs (second sensor) and joint moments were recorded during maximal voluntary contractions; Fig. 4 shows predicted elbow flexion moments with the measured joint moments; this figure shows the predictions of joint moments based on use of the neural network estimations of muscle activations (excitations); Figs. 1, 4, Abstract & page 33, column 2, para 5 & page 36, column 1, para 1); a second movement sensor configured to measure movement on a second side of the joint (joint torque at the elbow was calculated from the EMG signals of ten flexor (first side) and extensor (second side) muscles; Fig. 4 shows predicted elbow flexion moments with the measured joint moments; this figure shows the predictions of joint moments based on use of the neural network estimations of muscle activations (excitations); use the neural network to model the whole process from EMG to muscle force, EMG to joint moment or EMG to joint angles; Figs. 1, 4, Abstract & page 36, column 1, para 1); a machine learning processor trained to determine a set of excitation values based on an excitation value from the first muscle contraction sensor and an excitation value from the second muscle contraction sensor (joint torque at the elbow was calculated from the EMG signals of ten flexor and extensor muscles, using the neural network result of estimated activation of the muscles (excitations); the excitation of muscle tissue acts through activation dynamics to generate an internal muscle tissue state, which is called muscle activation; through muscle contraction dynamics, muscle activation energizes the cross bridges and muscle force is developed; a neural network model for muscle activation dynamics; EMGs (second sensor) and joint moments were recorded during maximal voluntary contractions; Abstract, page 30, column 2, para 2, page 31, column 1, para 1 & page 32, column 2, para 5), wherein the set of excitation values includes an excitation value for each muscle of the joint (joint torque at the elbow was calculated from the EMG signals of ten flexor and extensor muscles, using the neural network result of estimated activation of the muscles (excitations); EMGs and joint moments were recorded during maximal voluntary contractions, and these maximal values were used to normalize the EMG data; Abstract & page 33, column 2, para 5); and a processor configured to determine a joint moment based on values from the first and second movement sensors and the set of excitation values from the machine learning processor (joint torque at the elbow was calculated from the EMG signals of ten flexor and extensor muscles, using the neural network result of estimated activation of the muscles (excitations); a neural network is designed to perform two computations (processor); EMGs and joint moments were recorded during maximal voluntary contractions; Abstract & page 32, column 1, para 5 & page 33, column 2, para 5).
Regarding claim 2, Wang discloses that each of the first muscle contraction sensor and/or the second muscle contraction sensor is an electromyography (EMG) sensor (electromyographic/EMG (sensor) signals of ten flexor and extensor muscles, using the neural network result of estimated activation of the muscles; muscle activation from muscle EMG signals; Abstract, Fig. 1 & page 31, column 2, para 3).

Regarding claim 6, Wang discloses that the machine learning processor has been trained by the individual performing a set of training motions (the subject was asked to perform three cycles of 100% maximum isometric extension and flexion; the EMG measured in the 100% maximum isometric contraction was used as reference of normalization; EMG data were processed and passed to neural network model; page 33, column 2, para 6).

Regarding claim 8, Wang discloses wherein the processor (perform two computations (processor); page 32, column 1, para 4) is further configured to determine a set of muscle-tendon unit (MTU) lengths and moment arm values from the movement sensor values (predict muscle activation from EMG signals; combination of artificial neural networks with a Hill-type muscle-tendon model used to estimate muscle force; measure elbow (arm) joint moment; Fig. 1 shows muscle contraction dynamics transforms activation to muscle forces by using Hill-type muscle models that account for the length-tension and force-velocity relationships; transformation from EMG (sensor) to joint movement; EMGs (sensors) and joint moments were recorded during maximal voluntary contractions (movement)).

Regarding claim 9, Wang discloses wherein the processor (perform two computations (processor); page 32, column 1, para 4) is further configured to determine a set of muscle forces based on the MTU lengths and muscle activation dynamics derived from the set of excitation values determined by the machine learning processor (predict muscle activation from EMG signals; combination of artificial neural networks with a Hill-type muscle-tendon model used to estimate muscle force; measure elbow (arm) joint moment; Fig. 1 shows muscle contraction dynamics transforms activation to muscle forces by using Hill-type muscle models that account for the length-tension and force-velocity relationships; transformation from EMG (sensor) to joint movement; neural network is designed to perform two computations (processor); EMGs (sensors) and joint moments were recorded during maximal voluntary contractions (movement); Fig. 1, Abstract, page 31, column 2, para 2, page 32, column 1, para 4, page 32, column 2, para 5 & page 33, column 2, para 5).

Regarding claim 10, Wang discloses wherein the joint moment is determined based on the set of muscle forces and moment arm values (Fig. 1 shows muscle contraction dynamics transforms activation to muscle forces by using Hill-type muscle models that account for the length-tension and force-velocity relationships; predicted joint moment derived from the measured elbow joint moment and muscle contractions; Fig. 1 & page 32, column 1, para 1).

Regarding claim 13, Wang discloses the system of claim 1. In addition, Wang discloses wherein the machine learning processor is a neural network (the "learning" process, and the given data provided for the artificial neural network to begin learning; page 32, column 2, para 1).

Regarding claim 16, Wang discloses wherein the machine learning processor is a part of the processor (a neural network is designed to perform two computations (processor); page 32, column 1, para 4).
Regarding claim 17, Wang discloses a method for determining joint dynamics of a joint of an individual (an adjusted back-propagation algorithm was developed; joint torque at the elbow was calculated; a method to predict muscle activation from EMG signals; generation of muscle forces and joint moments; Figs. 1-7, Abstract, page 30, column 1, para 1 & page 30, column 2, para 5), comprising: receiving data from a first muscle contraction sensor configured to measure an excitation of a first muscle on a first side of a joint (joint torque at the elbow was calculated from the EMG (sensor) signals of ten flexor (first side) and extensor muscles; Fig. 1 shows transformation from EMG (sensor) to joint movement; Fig. 4 shows predicted elbow flexion moments with the measured joint moments; this figure shows the predictions of joint moments based on use of the neural network estimations of muscle activations; Figs. 1, 4, Abstract & page 36, column 1, para 1), a second muscle contraction sensor configured to measure an excitation of a second muscle located on a second side of a joint (determine muscle activations (excitations) from EMG (sensor) signals; EMGs (second sensor) and joint moments were recorded during maximal voluntary contractions; Fig. 4 shows predicted elbow flexion moments with the measured joint moments; this figure shows the predictions of joint moments based on use of the neural network estimations of muscle activations (excitations); Figs. 1, 4, Abstract & page 33, column 2, para 5 & page 36, column 1, para 1), a first movement sensor configured to measure movement on the first side of the joint (joint torque at the elbow was calculated from the EMG (sensor) signals of ten flexor (first side) and extensor muscles; Fig. 1 shows transformation from EMG (sensor) to joint movement; Fig. 4 shows predicted elbow flexion moments with the measured joint moments; this figure shows the predictions of joint moments based on use of the neural network estimations of muscle activations; Figs. 1, 4, Abstract & page 36, column 1, para 1), and a second movement sensor configured to measure movement on the second side of the joint (joint torque at the elbow was calculated from the EMG signals of ten flexor (first side) and extensor (second side) muscles; Fig. 4 shows predicted elbow flexion moments with the measured joint moments; this figure shows the predictions of joint moments based on use of the neural network estimations of muscle activations (excitations); use the neural network to model the whole process from EMG to muscle force, EMG to joint moment or EMG to joint angles; Figs. 1, 4, Abstract & page 36, column 1, para 1); and determining, using a machine learning processor, a set of excitation values based on an excitation value from the first muscle contraction sensor and an excitation value from the second muscle contraction sensor (joint torque at the elbow was calculated from the EMG signals of ten flexor and extensor muscles, using the neural network result of estimated activation of the muscles (excitations); the excitation of muscle tissue acts through activation dynamics to generate an internal muscle tissue state, which is called muscle activation; through muscle contraction dynamics, muscle activation energizes the cross bridges and muscle force is developed; a neural network model for muscle activation dynamics; EMGs (second sensor) and joint moments were recorded during maximal voluntary contractions; Abstract, page 30, column 2, para 2, page 31, column 1, para 1 & page 32, column 2, para 5), wherein the set of excitation values includes an excitation value for each muscle of the joint (joint torque at the elbow was calculated from the EMG signals of ten flexor and extensor muscles, using the neural network result of estimated activation of the muscles (excitations); EMGs and joint moments were recorded during maximal voluntary contractions, and these maximal values were used to normalize the EMG data; Abstract & page 33, column 2, para 5).

Regarding claim 18, Wang discloses further comprising determining, using a processor (perform two computations (processor); page 32, column 1, para 4), a set of muscle-tendon unit (MTU) lengths and moment arm values from the movement sensor values (Fig. 1 shows muscle contraction dynamics transforms activation to muscle forces by using Hill-type muscle models that account for the length-tension and force-velocity relationships; predicted joint moment derived from the measured elbow joint moment and muscle contractions; Fig. 1 & page 32, column 1, para 1).

Regarding claim 19, Wang discloses further comprising determining, using a processor (perform two computations (processor); page 32, column 1, para 4), a set of muscle forces based on the MTU lengths and muscle activation dynamics derived from the set of excitation values determined by the machine learning processor (predict muscle activation from EMG signals; combination of artificial neural networks with a Hill-type muscle-tendon model used to estimate muscle force; measure elbow (arm) joint moment; Fig. 1 shows muscle contraction dynamics transforms activation to muscle forces by using Hill-type muscle models that account for the length-tension and force-velocity relationships; transformation from EMG (sensor) to joint movement; neural network is designed to perform two computations (processor); EMGs (sensors) and joint moments were recorded during maximal voluntary contractions (movement); Fig. 1, Abstract, page 31, column 2, para 2, page 32, column 1, para 4, page 32, column 2, para 5 & page 33, column 2, para 5).
Regarding claim 20, Wang discloses wherein the machine learning unit has been trained by the individual performing a set of training motions (the subject was asked to perform three cycles of 100% maximum isometric extension and flexion; the EMG measured in the 100% maximum isometric contraction was used as reference of normalization; EMG data were processed and passed to neural network model; page 33, column 2, para 6).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. as applied to claim 1 above, and further in view of Dilon (USPN 9,442,564; cited by the Applicant).

Regarding claim 3, Wang discloses the system of claim 1. Wang fails to disclose wherein each of the first movement sensor and/or the second movement sensor is configured to measure movement in at least six degrees of freedom. Dilon discloses wherein each of the first movement sensor and/or the second movement sensor is configured to measure movement in at least six degrees of freedom (the sensor data captured by the accelerometer and the gyroscope can be used to derive motion according to six dimensions or six degrees of freedom (6DOF); col. 4, line 67 - col. 5, line 3).
It would have been obvious to one of ordinary skill in the art at the time the invention was made to include wherein each of the first movement sensor and/or the second movement sensor is configured to measure movement in at least six degrees of freedom, as taught by Dilon, into the system of Wang for the purpose of providing a device configured to appropriately respond to and measure various motions via a sensor in order to enable motion tracking of the user.

Claims 4-5 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. as applied to claim 1 above, and further in view of Longinotti-Buttoni et al. (US 2014/0070957; cited by the Applicant).

Regarding claim 4, Wang discloses the system of claim 1. Wang fails to disclose wherein each of the first movement sensor and/or the second movement sensor is an inertial measurement unit (IMU). Longinotti-Buttoni discloses wherein each of the first movement sensor and/or the second movement sensor is an inertial measurement unit (IMU) that measures muscle activity (sensors configured to measure muscle activity or motion; any type of accelerometer may be used, e.g. an inertial measurement unit (IMU), and may be configured to measure any parameter(s) to determine user motion or infer user motion; para [0099], [0223]). It would have been obvious to one of ordinary skill in the art at the time the invention was made to include wherein each of the first movement sensor and/or the second movement sensor is an inertial measurement unit (IMU), as taught by Longinotti-Buttoni, into the system of Wang for the purpose of providing wearable sensors to measure any parameter(s) related to the user in order to determine user motion or infer user motion.

Regarding claim 5, Wang in view of Longinotti-Buttoni discloses the system of claim 4. Wang fails to disclose wherein each IMU comprises at least one gyroscope and at least one accelerometer.
Longinotti-Buttoni discloses wherein each IMU comprises at least one gyroscope and at least one accelerometer (sensors configured to measure muscle activity or motion; any type of accelerometer may be used, e.g. an inertial measurement unit (IMU), and may be configured to measure any parameter(s) to determine user motion or infer user motion, e.g. an accelerometer, a gyroscope; para [0099], [0223]). It would have been obvious to one of ordinary skill in the art at the time the invention was made to include wherein each IMU comprises at least one gyroscope and at least one accelerometer, as taught by Longinotti-Buttoni, into the system of Wang for the purpose of providing wearable sensors to measure any parameter(s) related to the user in order to determine user motion or infer user motion.

Regarding claim 15, Wang discloses the system of claim 1. Wang fails to disclose wherein the processor is a remote processor and wherein the system further comprises a transceiver for sending muscle contraction sensor values and/or IMU values to the remote processor. Longinotti-Buttoni discloses wherein the processor is a remote processor (mobile devices such as smart phones have enabled mobile device users to communicate remotely; Fig. 1A & para [0004]) and wherein the system further comprises a transceiver for sending muscle contraction sensor values and/or IMU values to the remote processor (accelerometer, and/or Wi-Fi capability, and a transceiver; sensors configured to measure muscle activity or motion; transceiver 22, a transmitter and/or receiver or to enable transmission or receiver connections with another internal or external element for an item; measure any parameter(s) to determine user motion or infer user motion, e.g. an accelerometer, a gyroscope; para [0088], [0099], [0120], [0223]).
It would have been obvious to one of ordinary skill in the art at the time the invention was made to include wherein the processor is a remote processor and wherein the system further comprises a transceiver for sending muscle contraction sensor values and/or IMU values to the remote processor, as taught by Longinotti-Buttoni, into the system of Wang for the purpose of providing some ability to obtain, analyze, use, and control information and data from the remotely located sensors.

Claims 7, 14, and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. as applied to claims 1 and 17 above, and further in view of Stever et al. (US 2017/0281054; cited by the Applicant).

Regarding claim 7, Wang discloses the system of claim 1. Wang fails to disclose wherein the machine learning processor has been trained using movement data from a plurality of individuals. Stever et al. discloses wherein the machine learning processor has been trained using movement data from a plurality of individuals (train the motion recognition process using at least one of predetermined measured motions from multiple patients; processor may be configured to classify (machine learning) the patient movement at least in part by comparing the patient movement to templates derived from a pre-collected database of measured motions from multiple patients or other users; deep networks for supervised learning, which are intended to directly provide discriminative power for pattern classification purposes; para [0008], [0016], [0051]). It would have been obvious to one of ordinary skill in the art at the time the invention was made to include wherein the machine learning processor has been trained using movement data from a plurality of individuals, as taught by Stever et al., into the system of Wang for the purpose of providing a sensor to acquire sensor data descriptive of patient motion and to detect the patient motion from the sensor data, and to classify the patient motion.
Regarding claim 14, Wang discloses the system of claim 13. Wang fails to disclose wherein the neural network is a convolutional neural network, a deep neural network, etc. Stever et al. discloses wherein the neural network is a convolutional neural network, a deep neural network, etc. (machine learning techniques including an artificial neural network, deep learning network; [0123]). It would have been obvious to one of ordinary skill in the art at the time the invention was made to include wherein the neural network is a convolutional neural network, a deep neural network, etc., as taught by Stever et al., into the system of Wang for the purpose of providing motion classification or recognition performed on the sensor data using any of a variety of machine learning techniques.

Regarding claim 21, Wang discloses the method of claim 17. Stever et al. discloses wherein the machine learning processor has been trained using movement data from a plurality of individuals (train the motion recognition process using at least one of predetermined measured motions from multiple patients; processor may be configured to classify (machine learning) the patient movement at least in part by comparing the patient movement to templates derived from a pre-collected database of measured motions from multiple patients or other users; deep networks for supervised learning, which are intended to directly provide discriminative power for pattern classification purposes; para [0008]-[0016], [0051]). It would have been obvious to one of ordinary skill in the art at the time the invention was made to include wherein the machine learning processor has been trained using movement data from a plurality of individuals, as taught by Stever et al., into the system of Wang for the purpose of providing a sensor to acquire sensor data descriptive of patient motion and to detect the patient motion from the sensor data, and to classify the patient motion.

Claims 11-12 are rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. as applied to claim 1 above.

Regarding claim 11, Wang discloses the system of claim 1. In addition, Wang discloses further comprising a muscle contraction sensor configured to measure an excitation of a third muscle located adjacent to the joint (electromyographic/EMG (sensor) signals of ten flexor and extensor muscles, using the neural network result of estimated activation of the muscles; muscle activation from muscle EMG signals; EMGs and joint moments were recorded during maximal voluntary contractions; Abstract, Fig. 1, page 31, column 2, para 3); and wherein the machine learning processor uses an excitation value from the muscle contraction sensor in determining the set of excitation values (electromyographic/EMG (sensor) signals of ten flexor and extensor muscles, using the neural network result of estimated activation of the muscles; muscle activation from muscle EMG signals; EMGs (sensors) and joint moments were recorded during maximal voluntary contractions; Abstract, Fig. 1, page 31, column 2, para 3). Wang fails to disclose a third muscle contraction sensor. It would have been obvious to one of ordinary skill in the art at the time the invention was made to include a third muscle contraction sensor into the system of Wang for the purpose of collecting EMG signals from flexor and extensor muscles in order to provide training data set(s).

Regarding claim 12, Wang discloses the system of claim 11.
In addition, Wang discloses further comprising a muscle contraction sensor configured to measure an excitation of a fourth muscle located adjacent to the joint (electromyographic/EMG (sensor) signals of ten flexor and extensor muscles, using the neural network result of estimated activation of the muscles; estimating joint moments from EMGs (sensors); determine muscle activations (excitations) from EMG (sensor) signals; EMGs (second sensor) and joint moments were recorded during maximal voluntary contractions; Fig. 4 shows predicted elbow flexion moments with the measured joint moments; this figure shows the predictions of joint moments based on use of the neural network estimations of muscle activations (excitations); Figs. 1, 4, Abstract & page 33, column 2, para 5 & page 36, column 1, para 1); and wherein the machine learning processor uses an excitation value from the muscle contraction sensor in determining the set of excitation values (determine muscle activations (excitations) from EMG (sensor) signals; EMGs (second sensor) and joint moments were recorded during maximal voluntary contractions; Fig. 4 shows predicted elbow flexion moments with the measured joint moments; this figure shows the predictions of joint moments based on use of the neural network estimations of muscle activations (excitations); Figs. 1, 4, Abstract & page 33, column 2, para 5 & page 36, column 1, para 1). Wang fails to disclose a fourth muscle contraction sensor. It would have been obvious to one of ordinary skill in the art at the time the invention was made to include a fourth muscle contraction sensor into the system of Wang for the purpose of collecting EMG signals from flexor and extensor muscles in order to provide training data set(s).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MARJAN FARDANESH whose telephone number is (571) 270-5508. The examiner can normally be reached Monday-Friday, 9:00-17:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jacqueline Cheng, can be reached at (571) 272-5596. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MARJAN FARDANESH/
Primary Examiner, Art Unit 3791
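For context, the Wang pipeline the rejection leans on (EMG, a learned activation estimate, a Hill-type muscle model, then moment-arm-weighted summation) can be sketched roughly as below. This is an illustrative toy, not Wang's actual network or the claimed system; every numeric value is a hypothetical placeholder.

```python
import math

# Illustrative sketch of the EMG-to-joint-moment chain cited from Wang:
# EMG -> estimated muscle activation -> Hill-type muscle force ->
# moment-arm-weighted sum. All parameters are hypothetical placeholders.

def activation(norm_emg: float) -> float:
    """Stand-in for the neural-network activation estimate: a simple
    nonlinear squashing of MVC-normalized EMG (an assumption here)."""
    A = -2.0  # hypothetical nonlinearity shape factor
    return (math.exp(A * norm_emg) - 1.0) / (math.exp(A) - 1.0)

def hill_force(act: float, f_max: float, norm_len: float) -> float:
    """Hill-type active force: activation times max isometric force,
    scaled by a Gaussian force-length relationship (simplified)."""
    fl = math.exp(-((norm_len - 1.0) ** 2) / 0.45)
    return act * f_max * fl

def joint_moment(muscles: list[dict]) -> float:
    """Sum of muscle forces times signed moment arms (flexors
    positive, extensors negative)."""
    return sum(
        hill_force(activation(m["emg"]), m["f_max"], m["norm_len"]) * m["arm"]
        for m in muscles
    )

# Two hypothetical elbow muscles: one flexor, one extensor.
elbow = [
    {"emg": 0.6, "f_max": 600.0, "norm_len": 1.00, "arm": +0.04},  # flexor
    {"emg": 0.1, "f_max": 800.0, "norm_len": 0.95, "arm": -0.02},  # extensor
]
print(f"net elbow moment: {joint_moment(elbow):.2f} N·m")
```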

Prosecution Timeline

Nov 13, 2023
Application Filed
Jan 24, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594064
DUAL MOVABLE BLADE BIOPSY TOOL WITH STYLET
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12588841
CALIBRATION-FREE PULSE OXIMETRY
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12582313
PHYSIOLOGICAL MONITORING SYSTEM
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12582336
DETECTION DEVICE
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12551147
ANALYTE SENSOR
Granted Feb 17, 2026 (2y 5m to grant)
Study what changed to get these applications past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 72%
With Interview (+18.5%): 91%
Median Time to Grant: 3y 6m
PTA Risk: Low
Based on 846 resolved cases by this examiner. Grant probability derived from career allow rate.
