Prosecution Insights
Last updated: April 19, 2026
Application No. 18/304,722

SYSTEMS AND METHODS FOR NEUROLOGICAL REHABILITATION USING VIRTUAL REALITY

Non-Final OA §103
Filed: Apr 21, 2023
Examiner: WEI, XIAOMING
Art Unit: 2611
Tech Center: 2600 — Communications
Assignee: Somos Inc.
OA Round: 1 (Non-Final)

Grant Probability: 82% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 5m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 82% (28 granted / 34 resolved), above average at +20.4% vs TC avg
Interview Lift: +26.1% across resolved cases with interview
Avg Prosecution: 2y 5m typical timeline; 24 applications currently pending
Career History: 58 total applications across all art units

Statute-Specific Performance

§101: 7.1% (-32.9% vs TC avg)
§103: 83.6% (+43.6% vs TC avg)
§102: 4.4% (-35.6% vs TC avg)
§112: 2.2% (-37.8% vs TC avg)
Based on career data from 34 resolved cases; Tech Center averages are estimates.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Hargrove et al. (US 20210049928 A1), hereinafter Hargrove.

Regarding claim 1, Hargrove teaches A method (Hargrove paragraph [0031]: "A virtual reality (VR) interface for individuals is disclosed. The VR interface may provide individuals with physical disabilities self-administered and targeted rehabilitation within a home or clinical setting.") comprising: acquiring via one or more electromyography (EMG) sensors in electrical contact with a patient, one or more electrical signals (Hargrove paragraph [0038]: "FIG. 7 displays a representation of an interface 400. The interface 400 may comprise a sensor system 410 that comprises at least one EMG sensor 405... The sensor system 410 may also comprise one or more kinematic sensors 407 for determining the position, velocity, acceleration, or other kinematic characteristic of a portion of a user of the interface 400, such as the user's limb."; paragraph [0036]: "an arm band 300 may be used in lieu of the VR assistive device 100. The arm band 300 may comprise one or more sensors 302 provided in a housing 301."); mapping the one or more electrical signals to one or more intended movements via an EMG signal classifier (Hargrove paragraph [0038]: "The EMG control module 420 analyzes the EMG information from the sensor system 410 and determines the intended gesture based on the received EMG information."; paragraph [0047]: "In an embodiment, the EMG control module 420 may be defined as a pattern recognition classifier."); applying the one or more intended movements to a simulated body region (Hargrove paragraph [0040]: "The EMG control module 420 determines that the user intends to perform the wrist rotation gesture. The EMG control module 420 then sends a gesture instruction to the VR control module 430 that instructs the VR control module 430 to generate instructions to rotate a virtual reality wrist displayed on the VR display 440."); and rendering a movement of the simulated body region using a virtual reality (VR) display device (Hargrove paragraph [0038]: "The VR control module 430 receives the gesture instructions and causes the VR display 440 to display a virtual representation of the intended gesture.").

Hargrove and the current application are in the same field of endeavor, namely virtual reality based systems for therapeutic, training, diagnostic, and rehabilitation use. In various embodiments, Hargrove teaches a VR based approach to improve the mobility of users (paragraph [0078]: "The embodiments described herein provide several advantages for individuals who use myoelectric arm prostheses or other assistive devices. Such advantages, in addition to those already described, include: work toward more accurate programming of the assistive device specified for particular users to encourage more efficient and effective transition to an assistive device; facilitating motor practice and improvements in assistive device control independently of the user's assistive device brand or type"). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the various embodiments of Hargrove to improve the mobility of users.

Regarding claim 2, Hargrove teaches The method of claim 1, and further teaches wherein the one or more intended movements include a unique identifier (Hargrove teaches the name of a gesture as the unique identifier of the intended movement, paragraph [0054]: "The top of each column at 816 identifies the name of the gesture to be trained (wrist flexion, wrist pronation, hand open, no motion, hand closed, wrist supination, and wrist extension).") and a strength of an electrical signal associated with the intended movement (Hargrove paragraph [0060]: "the speed of the movement of the hand 840 may depend on a measurement of the EMG signal provided by the user. For instance, a strong EMG signal may result in a proportional control value that makes the hand move quickly and a weak EMG signal may result in a proportional control value that makes the hand move slowly.").

Regarding claim 3, Hargrove teaches The method of claim 1, and further teaches wherein the EMG signal classifier comprises a trained linear regression model (Hargrove paragraph [0049]: "the EMG signals may be represented using features, which are statistical descriptions of the signal... These features across all movements may then be learned using machine learning techniques. Examples of commonly used learning algorithms include expectation maximum, gradient descent with back-propagation, or linear/non-linear regression. After the model has been learned, EMG signals are classified in real-time using the mathematical model.").

Regarding claim 4, Hargrove teaches The method of claim 1, and further teaches wherein the one or more EMG sensors are in electrical contact with a body region of the patient (Hargrove teaches a sensor system with EMG sensors, paragraph [0055]: "The sensor system 410 detects the movement of the user's real-life limb"), wherein the body region is paralyzed and/or displays reduced mobility from neurological injury or disease (Hargrove paragraph [0031]: "In other embodiments, the VR interface may be used in connection with VR systems for persons with neurological disorders such as amputations, stroke, spinal cord injury, or cerebral palsy."), and wherein the body region corresponds to the simulated body region (Hargrove Figure 9 and paragraph [0055]: "The sensor system 410 detects the movement of the user's real-life limb and sends corresponding information to the VR system 450 to cause a corresponding gesture of the virtual limb 850.").

Regarding claim 5, Hargrove teaches The method of claim 1, and further teaches the method further comprising: updating a state of a virtual environment based on the movement of the simulated body region (Hargrove teaches a user's score as the state of a virtual environment; the score changes based on the user's movement in the virtual environment, paragraph [0070]: "when a user sees a green balloon, green bullets will appear if the person performs the correct gesture with her residual limb. The user may shoot a target balloon multiple times to make it expand, until it bursts, if she has performed the correct action. A user's score can increase in inverse proportion to the time taken to explode the balloon. The user's score (example shown by score 630 in FIG. 10) can increase or decrease depending on the size and position of the balloon. If a user performs the incorrect movement and a different color bullet (representing a different gesture) hits the balloon, their score may decrease.").

Regarding claim 6, Hargrove teaches The method of claim 1, and further teaches the method further comprising: applying virtual physics to the simulated body region (Hargrove teaches a virtual hand and applying the physics information collected from kinematic sensors to the virtual hand, paragraph [0055]: "FIG. 9 shows a representation of the VR space 800 which further includes a virtual limb 850 comprising a forearm 830 and a hand 840."; paragraph [0060]: "The speed of the movement of hand 840 may depend on information provided by kinematic sensors 407. For instance, the kinematic sensors 407 may provide a proportional control value to regulate the speed of movement of the hand 840."); and incorporating the virtual physics in rendering the movement of the simulated body region (paragraph [0060]: "the kinematic sensors 407 provide a value between 0 and 1 to the VR control module 430 which reflects the speed of intended hand movement. If the value is closer to 0, the hand movement is correspondingly slow. If the value is closer to 1, the hand movement is correspondingly fast... the VR display 440 will display the wrist flexing while the hand remains in a hand closed position.").

Regarding claim 7, Hargrove teaches A method (Hargrove paragraph [0031]: "A virtual reality (VR) interface for individuals is disclosed. The VR interface may provide individuals with physical disabilities self-administered and targeted rehabilitation within a home or clinical setting.") comprising: prompting a user to execute a pre-determined movement (Hargrove paragraph [0074]: "the virtual environment shown on VR display 440 may comprise an instruction mechanism, such as a virtual coach, that instructs the user in making appropriate movements of her residual limb."); recording electrical signals received from one or more electromyography (EMG) sensors in electrical contact with a body region of the user (Hargrove paragraph [0038]: "FIG. 7 displays a representation of an interface 400. The interface 400 may comprise a sensor system 410 that comprises at least one EMG sensor 405... The sensor system 410 may also comprise one or more kinematic sensors 407 for determining the position, velocity, acceleration, or other kinematic characteristic of a portion of a user of the interface 400, such as the user's limb... The sensor system 410 receives EMG information from a user and transmits it to an EMG control module 420."; paragraph [0036]: "an arm band 300 may be used in lieu of the VR assistive device 100. The arm band 300 may comprise one or more sensors 302 provided in a housing 301."); generating a map from the electrical signals to one or more intended movements of the body region of the user (Hargrove teaches a mapping between the EMG signal and the intended gesture, paragraph [0038]: "The EMG control module 420 analyzes the EMG information from the sensor system 410 and determines the intended gesture based on the received EMG information. The EMG control module 420 produces gesture instructions to the VR control module 430."); and storing the map in a non-transitory memory (Hargrove teaches storing the training data and gesture classes in the EMG control module, paragraphs [0053]-[0059]: "As shown in FIG. 12, the EMG control module 420 may be provided on computer-readable media 504, such as memory... Reset 825 resets the training data stored in the EMG control module 420 and the gesture classes 421").

Hargrove and the current application are in the same field of endeavor, namely virtual reality based systems for therapeutic, training, diagnostic, and rehabilitation use. In various embodiments, Hargrove teaches a VR based approach to improve the mobility of users (paragraph [0078]: "The embodiments described herein provide several advantages for individuals who use myoelectric arm prostheses or other assistive devices. Such advantages, in addition to those already described, include: work toward more accurate programming of the assistive device specified for particular users to encourage more efficient and effective transition to an assistive device; facilitating motor practice and improvements in assistive device control independently of the user's assistive device brand or type"). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the various embodiments of Hargrove to improve the mobility of users.

Regarding claim 8, Hargrove teaches The method of claim 7, and further teaches wherein the map comprises a linear regression model (Hargrove paragraph [0049]: "the EMG signals may be represented using features, which are statistical descriptions of the signal... These features across all movements may then be learned using machine learning techniques. Examples of commonly used learning algorithms include expectation maximum, gradient descent with back-propagation, or linear/non-linear regression. After the model has been learned, EMG signals are classified in real-time using the mathematical model.").

Regarding claim 9, Hargrove teaches The method of claim 7, and further teaches wherein the map comprises a machine learning model (Hargrove paragraph [0049]: "the EMG signals may be represented using features, which are statistical descriptions of the signal... These features across all movements may then be learned using machine learning techniques.").

Regarding claim 10, Hargrove teaches The method of claim 7, and further teaches the method further comprising: rendering in real time the one or more intended movements via a virtual reality display device (Hargrove teaches a VR display 440, and further teaches performing classification in real-time, paragraph [0058]: "the user may train each class in gesture classes 421 with EMG information that corresponds to the user's movements conducted during a training session. Feature extraction and classification may be performed in real-time, which can allow the user to test the classifier's performance following each collected set of training data."; paragraph [0064]: "FIG. 10 shows an image of a virtual target 610 in a virtual three-dimensional space 600 displayed on a VR display 440.").

Regarding claim 11, Hargrove teaches A system (paragraph [0032]: "a virtual reality (VR) training system is disclosed") comprising: an electromyography (EMG) sensor (paragraph [0033]: "The liner 101 comprises one or more sensors 102. The liner 101 is placed over a user's residual limb and the sensors 102 can receive electromyographic (EMG) information from the user's residual limb. The MCI 110 can process the EMG information from the sensors 102"); a virtual reality (VR) display device (Figure 7, VR display 440); a non-transitory memory, wherein the non-transitory memory includes an EMG signal classifier, and instructions (paragraphs [0052]-[0053]: "when the EMG control module 420 is defined as a pattern recognition classifier... As shown in FIG. 12, the EMG control module 420 may be provided on computer-readable media 504, such as memory. The computer-readable media 504 may also comprise an operating system 505 for operation of an EMG controller 500."); and a processor, wherein the processor is communicatively coupled to the EMG sensor, the VR display device, and the non-transitory memory, and wherein, when executing the instructions (Figure 7, paragraph [0081]: "All of the methods and processes described above may be embodied in, and fully automated via, software code modules executed by one or more general purpose computers or processors. The code modules may be stored in any type of computer-readable storage medium or other computer storage device."), the processor is configured to: initialize a VR environment (Hargrove paragraph [0073]: "After the EMG control module 420 has been trained, the VR system 450 can create a virtual environment in which the user attempts to make specified gestures of a virtual limb"); acquire one or more EMG signals via the EMG sensor, wherein the EMG sensor is in electrical contact with a body region of a user (Hargrove paragraph [0038]: "FIG. 7 displays a representation of an interface 400. The interface 400 may comprise a sensor system 410 that comprises at least one EMG sensor 405... The sensor system 410 may also comprise one or more kinematic sensors 407 for determining the position, velocity, acceleration, or other kinematic characteristic of a portion of a user of the interface 400, such as the user's limb... The sensor system 410 receives EMG information from a user and transmits it to an EMG control module 420."; paragraph [0036]: "an arm band 300 may be used in lieu of the VR assistive device 100. The arm band 300 may comprise one or more sensors 302 provided in a housing 301."); map the one or more EMG signals into one or more intended movements of the body region (Hargrove teaches a mapping between the EMG signal and the intended gesture, paragraph [0038]: "The EMG control module 420 analyzes the EMG information from the sensor system 410 and determines the intended gesture based on the received EMG information. The EMG control module 420 produces gesture instructions to the VR control module 430."); apply the one or more intended movements to a simulated body region, wherein the simulated body region corresponds to the body region (Hargrove Figure 9, paragraph [0040]: "The EMG control module 420 determines that the user intends to perform the wrist rotation gesture. The EMG control module 420 then sends a gesture instruction to the VR control module 430 that instructs the VR control module 430 to generate instructions to rotate a virtual reality wrist displayed on the VR display 440."; and paragraph [0055]: "The sensor system 410 detects the movement of the user's real-life limb and sends corresponding information to the VR system 450 to cause a corresponding gesture of the virtual limb 850."); apply virtual physics to the simulated body region (Hargrove teaches a virtual hand and applying the physics information collected from kinematic sensors to the virtual hand, paragraph [0055]: "FIG. 9 shows a representation of the VR space 800 which further includes a virtual limb 850 comprising a forearm 830 and a hand 840."; paragraph [0060]: "The speed of the movement of hand 840 may depend on information provided by kinematic sensors 407. For instance, the kinematic sensors 407 may provide a proportional control value to regulate the speed of movement of the hand 840."); render, in real time, a movement of the simulated body region based on the one or more intended movements and the virtual physics (Hargrove teaches performing classification in real-time, and further teaches applying kinematic physics to the virtual hand, paragraph [0060]: "the kinematic sensors 407 provide a value between 0 and 1 to the VR control module 430 which reflects the speed of intended hand movement. If the value is closer to 0, the hand movement is correspondingly slow. If the value is closer to 1, the hand movement is correspondingly fast."; paragraph [0058]: "the user may train each class in gesture classes 421 with EMG information that corresponds to the user's movements conducted during a training session. Feature extraction and classification may be performed in real-time, which can allow the user to test the classifier's performance following each collected set of training data."); and update a state of the virtual environment based on the movement of the simulated body region (Hargrove teaches a user's score as the state of a virtual environment; the score changes based on the user's movement in the virtual environment, paragraph [0070]: "when a user sees a green balloon, green bullets will appear if the person performs the correct gesture with her residual limb. The user may shoot a target balloon multiple times to make it expand, until it bursts, if she has performed the correct action. A user's score can increase in inverse proportion to the time taken to explode the balloon. The user's score (example shown by score 630 in FIG. 10) can increase or decrease depending on the size and position of the balloon. If a user performs the incorrect movement and a different color bullet (representing a different gesture) hits the balloon, their score may decrease.").

Hargrove and the current application are in the same field of endeavor, namely virtual reality based systems for therapeutic, training, diagnostic, and rehabilitation use. In various embodiments, Hargrove teaches a VR based approach to improve the mobility of users (paragraph [0078]: "The embodiments described herein provide several advantages for individuals who use myoelectric arm prostheses or other assistive devices. Such advantages, in addition to those already described, include: work toward more accurate programming of the assistive device specified for particular users to encourage more efficient and effective transition to an assistive device; facilitating motor practice and improvements in assistive device control independently of the user's assistive device brand or type"). Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the various embodiments of Hargrove to improve the mobility of users.

Regarding claim 12, claim 12 has similar limitations as claim 2; therefore it is rejected under the same rationale as claim 2. Regarding claim 13, claim 13 has similar limitations as claim 3; therefore it is rejected under the same rationale as claim 3. Regarding claim 14, claim 14 has similar limitations as claim 4; therefore it is rejected under the same rationale as claim 4. Regarding claim 15, claim 15 has similar limitations as claim 5; therefore it is rejected under the same rationale as claim 5. Regarding claim 16, claim 16 has similar limitations as claim 6; therefore it is rejected under the same rationale as claim 6.

Regarding claim 17, Hargrove teaches One or more tangible, non-transitory computer-readable media storing executable instructions that, when executed by a processor (Hargrove paragraph [0080]: "the operations represent computer-executable instructions stored on one or more computer-readable media that, when executed by one or more processors, enable the one or more processors to perform the recited operations."), cause the processor to perform the method of claim 1 (please refer to claim 1 for the detailed rejection rationale).

Regarding claim 18, Hargrove teaches One or more tangible, non-transitory computer-readable media storing executable instructions that, when executed by a processor (Hargrove paragraph [0080]: "the operations represent computer-executable instructions stored on one or more computer-readable media that, when executed by one or more processors, enable the one or more processors to perform the recited operations."), cause the processor to perform the method of claim 7 (please refer to claim 7 for the detailed rejection rationale).

Regarding claim 19, Hargrove teaches The method of claim 7, and further teaches wherein the body region is paralyzed and/or displays reduced mobility from neurological injury or disease (Hargrove paragraph [0031]: "In other embodiments, the VR interface may be used in connection with VR systems for persons with neurological disorders such as amputations, stroke, spinal cord injury, or cerebral palsy.").

Regarding claim 20, Hargrove teaches The system of claim 11, and further teaches wherein the non-transitory memory is configured to store the state of the virtual environment (Hargrove paragraph [0070]: "The score may be saved locally or uploaded to a central server.").

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Wetmore et al. (US 20250095302 A1) teaches an extended reality system in which EMG sensor information collected from users is used to model an interaction of the user with the extended reality environment. Liew et al. (US 20240057926 A1) teaches a VR system with a virtual avatar whose movement is calculated based on neural activity and muscle activity.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to XIAOMING WEI whose telephone number is (571) 272-3831. The examiner can normally be reached M-F 8:00-5:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kee Tung, can be reached at (571) 272-7794. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/XIAOMING WEI/
Examiner, Art Unit 2611

/KEE M TUNG/
Supervisory Patent Examiner, Art Unit 2611

Prosecution Timeline

Apr 21, 2023: Application Filed
Mar 22, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603064: CIRCUIT AND METHOD FOR VIDEO DATA CONVERSION AND DISPLAY DEVICE (granted Apr 14, 2026; 2y 5m to grant)
Patent 12597246: METHOD AND APPARATUS FOR GENERATING ADVERSARIAL PATCH (granted Apr 07, 2026; 2y 5m to grant)
Patent 12597175: Avatar Creation From Natural Language Description (granted Apr 07, 2026; 2y 5m to grant)
Patent 12586280: TECHNIQUES FOR GENERATING DUBBED MEDIA CONTENT ITEMS (granted Mar 24, 2026; 2y 5m to grant)
Patent 12586318: METHOD AND APPARATUS FOR LABELING ROAD ELEMENT, DEVICE, AND STORAGE MEDIUM (granted Mar 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 82%
With Interview: 99% (+26.1%)
Median Time to Grant: 2y 5m
PTA Risk: Low
Based on 34 resolved cases by this examiner. Grant probability derived from career allow rate.
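As a rough illustration of how these headline figures fit together, the sketch below reproduces the 82% figure from the 28-granted / 34-resolved career record and applies the +26.1% interview lift. The additive combination, the 99% cap, and the function name are assumptions made for illustration; the tool's actual model is not disclosed.

```python
# Illustrative sketch (assumed model): grant probability as the career
# allow rate, with the interview lift added and capped at 99%.

def grant_projection(granted: int, resolved: int, interview_lift: float,
                     cap: float = 0.99) -> tuple[float, float]:
    base = granted / resolved                 # career allow rate
    with_interview = min(base + interview_lift, cap)  # additive lift, capped
    return base, with_interview

base, with_interview = grant_projection(granted=28, resolved=34,
                                        interview_lift=0.261)
print(f"{base:.0%}")            # 82%
print(f"{with_interview:.0%}")  # 99%
```

Under these assumptions, 28/34 ≈ 82.4% rounds to the displayed 82%, and 82.4% + 26.1% exceeds the cap, yielding the displayed 99%.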
