Prosecution Insights
Last updated: April 19, 2026
Application No. 18/786,359

SYSTEM AND METHOD OF REMOTE REHABILITATION THERAPY FOR MOVEMENT DISORDERS

Non-Final OA (§103)
Filed: Jul 26, 2024
Examiner: YIP, JACK
Art Unit: 3715
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: Encora Inc.
OA Round: 3 (Non-Final)
Grant Probability: 33% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 4y 1m
Grant Probability With Interview: 70%

Examiner Intelligence

Career Allow Rate: 33% (229 granted / 702 resolved; -37.4% vs TC avg)
Interview Lift: +37.6% in resolved cases with interview
Typical Timeline: 4y 1m average prosecution
Career History: 753 total applications across all art units (51 currently pending)
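The headline numbers above are simple ratios over the examiner's career counts. A minimal sketch (Python, using only figures shown on this page) recomputes them; the one assumption is that "interview lift" is measured in percentage points, so the without-interview rate is backed out from the stated 70% and +37.6:

```python
# Recompute the examiner metrics shown above from the raw counts.
# Assumption: "interview lift" = with-interview allow rate minus
# without-interview allow rate, in percentage points.

GRANTED = 229    # cases granted by this examiner
RESOLVED = 702   # total resolved cases

def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage."""
    return 100.0 * granted / resolved

career = allow_rate_pct(GRANTED, RESOLVED)   # 32.6%, displayed as 33%

# The page reports 70% with interview and a +37.6 point lift, which
# implies a without-interview rate of about 32.4%.
implied_without = 70.0 - 37.6

print(round(career, 1), round(implied_without, 1))
```

The implied without-interview rate (32.4%) sits just below the career average, which is consistent: interviewed cases pull the blended career rate slightly above the no-interview baseline.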

Statute-Specific Performance

§101: 22.8% (-17.2% vs TC avg)
§103: 42.4% (+2.4% vs TC avg)
§102: 15.0% (-25.0% vs TC avg)
§112: 12.4% (-27.6% vs TC avg)
Tech Center averages are estimates. Based on career data from 702 resolved cases.
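The per-statute deltas above are internally consistent: subtracting each delta from its allow rate recovers a single Tech Center average estimate. A quick check, using only the figures listed:

```python
# Back out the Tech Center average implied by each statute's
# allow rate and its "vs TC avg" delta (rate - delta = TC average).

rates  = {"101": 22.8, "103": 42.4, "102": 15.0, "112": 12.4}
deltas = {"101": -17.2, "103": 2.4, "102": -25.0, "112": -27.6}

implied_tc_avg = {s: rates[s] - deltas[s] for s in rates}

# Every statute implies the same ~40% Tech Center average estimate.
assert all(abs(v - 40.0) < 1e-9 for v in implied_tc_avg.values())
```

In other words, this examiner is only at or near the Tech Center norm on §103 rejections; every other statute sits well below the ~40% estimate.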

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/24/2025 has been entered. Claims 1 – 20 are pending.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 4-9, 11 and 13-19 are rejected under 35 U.S.C. 103 as being unpatentable over Choi et al. (US 2023/0123383 A1) in view of Gwin (US 9311789 B1).

Re claims 1, 11: 1. A system for remote rehabilitation of a movement disorder of a patient (Choi, figs.
1A – 1B; [0063], “movement disorders”), comprising: a wearable device (Choi, [0109], “wearable devices”) comprising: a sensor configured to generate movement data of a patient, the movement data indicative of one or more movement disorder symptoms of the patient (Choi, [0060], “diagnosis/or prognosis of individual patients”; [0073], “clinician evaluates the patient in view of the … hand movement/tremors, walking, gait, ambulatory status/stability, and other characteristics to arrive at appropriate medical assessment”; [0118], “patient may be instructed to conduct one or more tasks for determining characteristics of the patient's condition. Exemplary tasks may include flexion tasks, tension tasks, or other types of tasks that may be used to evaluate the rigidity of the patient”); a processor in communication with the sensor (Choi, figs. 1A – 1B; figs. 12 - 13); a stimulator in communication with the processor, wherein the processor is configured to cause the stimulator to stimulate a body part of the patient based on the movement data (Choi, [0054], “Clinician controller device 1208 may permit programming of IPG 170 to provide a number of different stimulation patterns or therapies to the patient as appropriate for a given patient and/or disorder”; [0073], “hand movement/tremors, walking, gait, ambulatory status/stability, and other characteristics to arrive at appropriate medical assessment. 
Depending on such telehealth consultation/evaluation, the clinician may remotely adjust stimulation therapy settings for secure transmission to the patient device”; Choi, [0173], “Gloves or arm sleeves with actuators embedded at the interface may be employed for haptic feedback in some embodiments” and [0175], “For example, if the patient is supposed to shift his weight to the left foot but failed to do so, a vibration can take place on the left foot to remind the patient”; [0050], “remote care therapy applications may involve implantable devices, additional and/or alternative embodiments may involve external personal devices and/or noninvasive/minimally invasive (NIMI) devices (e.g., wearable biomedical devices, transcutaneous/subcutaneous devices, etc.) that may be configured to provide therapy to the patients analogous to the implantable devices”; [0094], “therapy settings data may comprise electrode configuration data for delivery of electrical pulses”); and a wireless communication unit in communication with the processor (Choi, [0088]); and a virtual platform running on a server in communication with the wearable device via the wireless communication unit (Choi, figs. 1A – 1B; figs. 
12 – 13; [0107], “the patient controller app is adapted to log such events (“Device Use/Events Data”) and communicate the events to system 1200 to maintain a therapy history for the patient for review by the patient’s clinician(s) to evaluate and/or optimize the patient’s therapy”), the virtual platform programmed to receive the movement data from the wearable device (Choi, [0123], “, the one or more servers may evaluate the patient data to determine characteristics associated with the patient’s performance of the task(s), such as a minimum jerk trajectory, a smoothness of motion, a co-contraction profile, a gait profile, an arm-swing profile, a balance/sway, a range of motion, completeness of movement, or other characteristics, and the metrics may be determined based on the characteristics”) and host both: a provider application in communication with a provider device, wherein the provider application is programmed to generate a therapy regime based on the movement data (Choi, fig. 1A, “Clinician Programmer 180”, “Cloud/ Datacenter 155”; [0123], “one or more servers, such as a server of the virtual clinic/remote programming platform 1214 of FIG. 12. Each metric may be obtained by processing respective sets of patient data captured during performance of one or more of the various patient tasks”); and a patient application in communication with a patient device (Choi, fig. 1A, “Patient Controller 150”, “Cloud/ Datacenter 155”; [0123], “one or more servers, such as a server of the virtual clinic/remote programming platform 1214 of FIG. 12. Each metric may be obtained by processing respective sets of patient data captured during performance of one or more of the various patient tasks”; fig. 8, “Patient Controller Application 802”; [0170], “trainee can mimic the teacher's exact movements. 
Here, the trainee would match the ideal movement trajectory/posture outlined by the digital teacher”; [0053]; [0073], “remote monitoring, remote therapy, remote learning, data logging, etc. … the clinician evaluates the patient in view of the physiological/biological data, telemedicine/video consultation, audio/visual cues and signals regarding patient’s facial expressions, hand movement/tremors, walking, gait, ambulatory status/stability, and other characteristics to arrive at appropriate medical assessment. Depending on such telehealth consultation/evaluation, the clinician may remotely adjust stimulation therapy settings for secure transmission to the patient device”; fig. 12 shows a virtual clinic platform; fig. 13 shows adjustment to a patient’s neurostimulation program based on patient’s movement metric).

11. A method of remote rehabilitation of a movement disorder of a patient (Choi, figs. 1A – 1B; [0063], “movement disorders”), comprising: generating movement data by a sensor of a wearable device placed on a patient, the movement data indicative of one or more movement disorder symptoms of the patient (Choi, [0060], “diagnosis/or prognosis of individual patients”; [0073], “clinician evaluates the patient in view of the … hand movement/tremors, walking, gait, ambulatory status/stability, and other characteristics to arrive at appropriate medical assessment”; [0118], “patient may be instructed to conduct one or more tasks for determining characteristics of the patient's condition. Exemplary tasks may include flexion tasks, tension tasks, or other types of tasks that may be used to evaluate the rigidity of the patient”); communicating the movement data from a wireless communication unit of the wearable device to a virtual platform running on a server (Choi, figs. 1A – 1B; figs.
12 – 13; [0107], “the patient controller app is adapted to log such events (“Device Use/Events Data”) and communicate the events to system 1200 to maintain a therapy history for the patient for review by the patient’s clinician(s) to evaluate and/or optimize the patient’s therapy”; [0123], “the one or more servers may evaluate the patient data to determine characteristics associated with the patient’s performance of the task(s), such as a minimum jerk trajectory, a smoothness of motion, a co-contraction profile, a gait profile, an arm-swing profile, a balance/sway, a range of motion, completeness of movement, or other characteristics, and the metrics may be determined based on the characteristics”), generating, by a provider application hosted by the virtual platform, a therapy regime based on the movement data (Choi, fig. 1A, “Clinician Programmer 180”, “Cloud/ Datacenter 155”; [0123], “one or more servers, such as a server of the virtual clinic/remote programming platform 1214 of FIG. 12. Each metric may be obtained by processing respective sets of patient data captured during performance of one or more of the various patient tasks”; [0053]; [0073], “remote monitoring, remote therapy, remote learning, data logging, etc. … the clinician evaluates the patient in view of the physiological/biological data, telemedicine/video consultation, audio/visual cues and signals regarding patient’s facial expressions, hand movement/tremors, walking, gait, ambulatory status/stability, and other characteristics to arrive at appropriate medical assessment. Depending on such telehealth consultation/evaluation, the clinician may remotely adjust stimulation therapy settings for secure transmission to the patient device”; fig. 12 shows a virtual clinic platform; fig. 13 shows adjustment to a patient’s neurostimulation program based on patient’s movement metric); providing, by the patient application (Choi, fig.
1A, “Patient Controller 150”, “Cloud/ Datacenter 155”; [0123], “one or more servers, such as a server of the virtual clinic/remote programming platform 1214 of FIG. 12. Each metric may be obtained by processing respective sets of patient data captured during performance of one or more of the various patient tasks”; fig. 8, “Patient Controller Application 802”; [0170], “trainee can mimic the teacher's exact movements. Here, the trainee would match the ideal movement trajectory/posture outlined by the digital teacher”; [0053]; [0073], “remote monitoring, remote therapy, remote learning, data logging, etc. … the clinician evaluates the patient in view of the physiological/biological data, telemedicine/video consultation, audio/visual cues and signals regarding patient’s facial expressions, hand movement/tremors, walking, gait, ambulatory status/stability, and other characteristics to arrive at appropriate medical assessment. Depending on such telehealth consultation/evaluation, the clinician may remotely adjust stimulation therapy settings for secure transmission to the patient device”; fig. 12 shows a virtual clinic platform; fig. 13 shows adjustment to a patient’s neurostimulation program based on patient’s movement metric).

Choi teaches the trainee can mimic the teacher's exact movements. Here, the trainee would match the ideal movement trajectory/posture outlined by the digital teacher (Choi, [0170]). Choi further teaches a virtual clinic/remote programming platform that compares the conditions (physiological/biological data) of the patient to the appropriate therapy or treatment (Choi, [0053]; [0073], “remote monitoring, remote therapy, remote learning, data logging, etc.
… the clinician evaluates the patient in view of the physiological/biological data, telemedicine/video consultation, audio/visual cues and signals regarding patient’s facial expressions, hand movement/tremors, walking, gait, ambulatory status/stability, and other characteristics to arrive at appropriate medical assessment. Depending on such telehealth consultation/evaluation, the clinician may remotely adjust stimulation therapy settings for secure transmission to the patient device”; fig. 12 shows a virtual clinic platform; fig. 13 shows adjustment to a patient’s neurostimulation program based on patient’s movement metric).

Choi does not explicitly disclose wherein the patient application is programmed to compare the movement data to the therapy regime and provide real-time feedback to the patient through a patient device; nor disclose comparing, by a patient application hosted by the virtual platform, the movement data to the therapy regime; and providing, by the patient application, real-time feedback to the patient through a patient device in communication with the virtual platform based on the comparison.

Gwin (US 9311789 B1) teaches a sensorimotor rehabilitation system can analyze body movement kinematics of a person, and in particular analyze the use of limbs (e.g., upper limbs) to provide feedback for sensorimotor rehabilitation (Gwin, Abstract). Gwin teaches wherein the patient application is programmed to compare the movement data to the therapy regime and provide real-time feedback to the patient through a patient device and comparing, by a patient application hosted by the virtual platform, the movement data to the therapy regime; and providing, by the patient application, real-time feedback to the patient through a patient device in communication with the virtual platform based on the comparison (Gwin, col. 3, lines 25 – 46, “Generating feedback information can include generating the feedback information for real time presentation, for example.
The generating of the feedback information can further include generating information for use in generating audible and/or tactile (e.g., vibrotactile) feedback”; col. 7, lines 19 – 37, “feedback data is generated and outputted by the sensorimotor rehabilitation system. In some embodiments, the feedback data can include information sufficient to generate depiction of desired feedback for display via a user interface display. The feedback data can include a comparison of a determined quantity and type of limb movement or other characteristics with a desired quantity and type of limb movement or other characteristics. The desired quantity and type of limb movement may be determined based on sensor data collected when the user is in motion. The feedback data can be provided for real time display to a user”; col. 10, lines 27 – 44, “5) initiate visual and/or vibrotactile biofeedback according to desired and/or predetermined feedback”; fig. 8 shows a virtual platform (fig. 8, 310, 341, 302) in connection with the wearable device). Therefore, in view of Gwin, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the method/system described in Choi, by providing the real-time feedback on movement compliance as taught by Gwin, since Gwin suggests that the sensorimotor rehabilitation system can provide real time visual, audible, and/or vibrotactile feedback to encourage affected limb use during activities of daily living (Gwin, col. 10, lines 27 – 44). The real time feedback can be indicative of usage of the limb and/or a corrective action for achieving a desired and/or predetermined usage of the limb (Gwin, col. 4, lines 51 – 64). Choi does not explicitly disclose wherein the sensor and wireless communication unit are contained within a housing of the wearable device. 
Gwin teaches communicating the movement data from a wireless communication unit of the wearable device to a virtual platform running on a server, wherein the sensor and wireless communication unit are contained within a housing of the wearable device (Gwin, figs. 2 – 5; fig. 8, 366; col. 4, line 65 – col. 5, line 6; col. 8, line 20 – col. 9, line 18, “The band 122 can house and position the sensor 110”; fig. 7, 366, 367, 364).

Therefore, in view of Gwin, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the method/system described in Choi, by providing a wearable device with an integrated wireless module and sensors as taught by Gwin, since it was known in the art to integrate components (i.e., sensors, communication module, processor, vibration motor) into a single package to minimize the footprint of the device.

Re claims 2, 13: 2. The system of claim 1, wherein the provider application is further programmed to generate a visualization of the movement data and display the visualization on the provider device. 13. The method of claim 11, further comprising visualizing the movement data and providing the visualization to a provider device via the provider application in communication with the virtual platform (Choi, [0170], “trainee can mimic the teacher's exact movements. Here, the trainee would match the ideal movement trajectory/posture outlined by the digital teacher”; Gwin, col. 7, lines 1 – 5, “Therapists can remotely monitor the movement data and/or review the data with patients during clinic visits”; col. 13, lines 36 – 50, “Users of the sensorimotor rehabilitation system may include the person wearing one or more sensor 110 or other individuals assigned to view the wearer's data (e.g., a therapist, clinician, trainer, caregiver, or other medical personnel)”; col.
20, lines 27 – 43, “data can be transmitted and displayed to a third party (e.g., a therapist) and then the third party can provide feedback to the wearer”; col. 14, lines 12 – 27, “the sensorimotor rehabilitation system can provide feedback to direct, guide, and/or encourage use of the limbs at a desired”; col. 16, lines 50 – 64, “feedback information related to one or more corrective actions for improving sensorimotor rehabilitation or performing exercises”).

Re claims 4, 14: 4. The system of claim 1, wherein the patient application is further programmed to provide an indication of an effectiveness of the therapy regime based on the comparison. 14. The method of claim 11, further comprising indicating, by the patient application, an effectiveness of the therapy regime (Choi, [0101], “to evaluate the effectiveness of the therapy for the patient”; [0118], “may be employed to gather data during patient performance of one or more tasks”; [0178]).

Re claims 5, 15: 5. The system of claim 1, wherein the patient application is further programmed to guide the patient through one or more exercises based on the therapy regime. 15. The method of claim 11, further comprising guiding the patient through one or more exercises based on the therapy regime via the patient device (Choi, [0170], “trainee can mimic the teacher's exact movements. Here, the trainee would match the ideal movement trajectory/posture outlined by the digital teacher”; Gwin, col. 7, lines 1 – 5, “Therapists can remotely monitor the movement data and/or review the data with patients during clinic visits”; col. 13, lines 36 – 50, “Users of the sensorimotor rehabilitation system may include the person wearing one or more sensor 110 or other individuals assigned to view the wearer's data (e.g., a therapist, clinician, trainer, caregiver, or other medical personnel)”; col.
20, lines 27 – 43, “data can be transmitted and displayed to a third party (e.g., a therapist) and then the third party can provide feedback to the wearer”; col. 14, lines 12 – 27, “the sensorimotor rehabilitation system can provide feedback to direct, guide, and/or encourage use of the limbs at a desired”; col. 16, lines 50 – 64, “feedback information related to one or more corrective actions for improving sensorimotor rehabilitation or performing exercises”).

Re claims 6, 16: 6. The system of claim 5, wherein the patient application is further programmed to generate a completion score of the one or more exercises based on the movement data generated by the wearable device. 16. The method of claim 15, further comprising generating a completion score of the one or more exercises based on the movement data via the patient application (Choi, fig. 20, 2006, 2007; [0183], “the performance data and/or video data is provided to a clinician for review”; [0128], “The combined objective and subjective scores may represent the overall rigidity score for the patient and provide a holistic assessment of the patient's symptoms and progress”; Gwin, col. 20, lines 4 – 26, “performance score”).

Re claims 7, 17: 7. The system of claim 5, wherein the patient application is further programmed to present the patient with a visualization of the one or more exercises via the patient device. 17. The method of claim 16, further comprising presenting the patient with a visualization of the one or more exercises at the patient device via the patient application (Choi, [0170], “trainee can mimic the teacher's exact movements. Here, the trainee would match the ideal movement trajectory/posture outlined by the digital teacher”; Gwin, col. 7, lines 1 – 5, “Therapists can remotely monitor the movement data and/or review the data with patients during clinic visits”; col.
13, lines 36 – 50, “Users of the sensorimotor rehabilitation system may include the person wearing one or more sensor 110 or other individuals assigned to view the wearer's data (e.g., a therapist, clinician, trainer, caregiver, or other medical personnel)”; col. 20, lines 27 – 43, “data can be transmitted and displayed to a third party (e.g., a therapist) and then the third party can provide feedback to the wearer”; col. 14, lines 12 – 27, “the sensorimotor rehabilitation system can provide feedback to direct, guide, and/or encourage use of the limbs at a desired”; col. 16, lines 50 – 64, “feedback information related to one or more corrective actions for improving sensorimotor rehabilitation or performing exercises”).

Re claims 8, 18: 8. The system of claim 1, wherein the real-time feedback provided by the patient application includes visualized directional guidance of the body part of the patient provided through the patient device. 18. The method of claim 11, wherein the real-time feedback provided by the patient application includes visualized directional guidance of the body part of the patient provided through the patient device (Choi, [0170], “trainee can mimic the teacher's exact movements. Here, the trainee would match the ideal movement trajectory/posture outlined by the digital teacher”; Gwin, col. 7, lines 1 – 5, “Therapists can remotely monitor the movement data and/or review the data with patients during clinic visits”; col. 13, lines 36 – 50, “Users of the sensorimotor rehabilitation system may include the person wearing one or more sensor 110 or other individuals assigned to view the wearer's data (e.g., a therapist, clinician, trainer, caregiver, or other medical personnel)”; col. 20, lines 27 – 43, “data can be transmitted and displayed to a third party (e.g., a therapist) and then the third party can provide feedback to the wearer”; col.
14, lines 12 – 27, “the sensorimotor rehabilitation system can provide feedback to direct, guide, and/or encourage use of the limbs at a desired”; col. 16, lines 50 – 64, “feedback information related to one or more corrective actions for improving sensorimotor rehabilitation or performing exercises”).

Re claims 9, 19: 9. The system of claim 1, wherein the stimulator is configured to deliver one of electrical, vibratory, or ultrasonic stimulation to the patient's body part. 19. The method of claim 11, further comprising providing one or more of electrical, vibratory, or ultrasonic stimulation to the patient's body part (Choi, [0093]; [0177]).

Claims 3 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Choi and Gwin as applied to claims 1 and 11 above, and further in view of Ganguly et al. (US 2020/0061378 A1).

Re claims 3, 12: Choi teaches 3. The system of claim 1, wherein the provider application is further programmed to: receive parameter selections via the provider device; and communicate the parameter selections to the wearable device via the virtual platform. 12. The method of claim 11, further comprising: receiving parameter selections via a provider device in communication with the provider application hosted by the virtual platform (Choi, [0123]; [0088]; figs. 22 – 23; [0031], “FIGS. 22 and 23 depict processing of patient data from a wearable device to control neurostimulation”; [0184], “through a wearable device 2200 (e.g.
a wearable watch), biosignals such as … posture information using accelerometer and gyroscope are measured for use in AI/ML processing to control neurostimulation”; [0202], “A controller for neurostimulation may be implemented in a number of system locations for any of the embodiments discussed herein … the processor or suitable circuit may be included within an implantable pulse generator, a patient controller device, a clinician programmer device, a wearable electronic device, a remote health digital platform/server, or any other suitable computing system”).

Choi does not explicitly disclose communicate the parameter selections to the wearable device via the virtual platform, wherein the processor of the wearable device is configured to command the stimulator based on the received parameter selections.

Ganguly teaches a method and system for sensory electrical stimulation of the peripheral nervous system to improve human motor function and performance (Ganguly, Abstract). Ganguly teaches 3. The system of claim 1, wherein the provider application is further programmed to: receive parameter selections via the provider device; and communicate the parameter selections to the wearable device via the virtual platform, wherein the processor of the wearable device is configured to command the stimulator based on the received parameter selections. 12.
The method of claim 11, further comprising: receiving parameter selections via a provider device in communication with the provider application hosted by the virtual platform; communicating the parameter selections to the wearable device via the wireless communication unit; and commanding, by a processor of the wearable device, a stimulator of the wearable device to stimulate a body part of the patient based on the received parameter selections (Ganguly, [0037], “The stimulator may be a wearable stimulator that is configured to be worn on a subject's arm and/or wrist, and to apply stimulation to one or more of the subject's radial, ulnar and median nerves. The stimulator may be a smart sole that can be placed inside the subject's shoe and apply stimulation to one or more of the subject's leg or foot nerves”; [0158], “The server has revised the stimulation algorithm and pushed it back down to the app. She then starts a monitored stimulation session”; [0034], “a controller configured to deliver an electrical stimulation from the electrodes and a wireless communication circuit”).

Therefore, in view of Ganguly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the method and system described in Choi, by providing the wearable stimulation as taught by Ganguly, in order to stimulate the area of the movement disorder which is close to or at the location of the sensors of a wearable device (Ganguly, [0037], “The stimulator may be a smart sole that can be placed inside the subject's shoe and apply stimulation to one or more of the subject's leg or foot nerves”).

Claims 10 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Choi and Gwin as applied to claims 1 and 11 above, and further in view of Schepis et al. (US 2020/0179697 A1).

Re claims 10 and 20: Choi does not explicitly disclose 10.
The system of claim 1, wherein the stimulator provides stimulation output to one or more of Aδ fibers, C fibers, Ia afferents, Ib afferents, Aβ fibers, Aγ fibers, or dorsal root ganglion (DRG) neurons. 20. The method of claim 11, further comprising providing stimulation output via a stimulator of the wearable device to one or more of Aδ fibers, C fibers, Ia afferents, Ib afferents, Aβ fibers, Aγ fibers, or dorsal root ganglion (DRG) neurons.

Schepis teaches a system and method for selectively and reversibly modulating targeted neural and non-neural tissue of a nervous system for the treatment of pain. Schepis further teaches providing stimulation output via a stimulator of the wearable device to one or more of Aδ fibers, C fibers, Ia afferents, Ib afferents, Aβ fibers, Aγ fibers, or dorsal root ganglion (DRG) neurons (Schepis, [0159]; [0156]).

Therefore, in view of Schepis, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the method and system described in Choi, by stimulating fibers as taught by Schepis, since electrical stimulation can be adjusted to selectively inhibit downstream or secondary effects of pain originating from Aδ fibers and/or originating from the unmyelinated C fibers, while the function of central nervous system and peripheral nervous system neurons involved in detection, transmission, processing, and generation of non-painful touch, motor control, and proprioception is preserved (Schepis, [0159]).

Response to Arguments

Applicant's arguments filed 11/24/2025 have been fully considered but they are not persuasive.
Applicant argues: Choi provides no teaching, suggestion, or motivation to combine these separate, functionally distinct systems that serve different purposes and operate independently … Claim 1 has been amended to recite "the sensor, processor, and wireless communication unit are contained within a housing of the wearable device" to further distinguish this difference. The Office cites Gwin (US 9311789 B1) to teach the newly added limitations.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JACK YIP whose telephone number is (571) 270-5048. The examiner can normally be reached Monday through Friday, 9:00 AM - 5:00 PM EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, XUAN THAI, can be reached at (571) 272-7147. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JACK YIP/
Primary Examiner, Art Unit 3715

Prosecution Timeline

Jul 26, 2024: Application Filed
May 09, 2025: Non-Final Rejection (§103)
Aug 12, 2025: Applicant Interview (Telephonic)
Aug 12, 2025: Examiner Interview Summary
Aug 14, 2025: Response Filed
Aug 23, 2025: Final Rejection (§103)
Nov 24, 2025: Request for Continued Examination
Dec 04, 2025: Response after Non-Final Action
Jan 16, 2026: Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12588859: SYSTEM AND METHOD FOR INTERACTING WITH HUMAN BRAIN ACTIVITIES USING EEG-FNIRS NEUROFEEDBACK (2y 5m to grant; granted Mar 31, 2026)
Patent 12592160: System and Method for Virtual Learning Environment (2y 5m to grant; granted Mar 31, 2026)
Patent 12558290: BLOOD PRESSURE LOWERING TRAINING DEVICE (2y 5m to grant; granted Feb 24, 2026)
Patent 12525140: SYSTEMS AND METHODS FOR PROGRAM TRANSMISSION (2y 5m to grant; granted Jan 13, 2026)
Patent 12512012: SYSTEM FOR EVALUATING RADAR VECTORING APTITUDE (2y 5m to grant; granted Dec 30, 2025)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 33%
With Interview: 70% (+37.6%)
Median Time to Grant: 4y 1m
PTA Risk: High
Based on 702 resolved cases by this examiner. Grant probability derived from career allow rate.
