Prosecution Insights
Last updated: April 19, 2026
Application No. 18/371,712

NEUROMAPPING SYSTEMS, METHODS, AND DEVICES

Non-Final OA: §103
Filed: Sep 22, 2023
Examiner: ZONG, HELEN
Art Unit: 2683
Tech Center: 2600 — Communications
Assignee: Medos International Sàrl
OA Round: 1 (Non-Final)
Grant Probability: 79% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 3m
With Interview: 87%

Examiner Intelligence

Career Allow Rate: 79%, above average (561 granted / 709 resolved; +17.1% vs TC avg)
Interview Lift: +8.2%, a moderate lift, based on resolved cases with an interview
Avg Prosecution: 2y 3m typical timeline (32 applications currently pending)
Total Applications: 741 career history, across all art units

Statute-Specific Performance

§101: 5.9% (-34.1% vs TC avg)
§103: 66.8% (+26.8% vs TC avg)
§102: 13.3% (-26.7% vs TC avg)
§112: 9.7% (-30.3% vs TC avg)
Tech Center averages are estimates. Based on career data from 709 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-7 and 10-18 are rejected under 35 U.S.C. 103 as being unpatentable over Ryan (US 20220160439) in view of Singh et al. (US 20230088370).

Regarding claim 1, Ryan teaches a computer-assisted surgical system, comprising: a neuromonitoring probe (p0358: probe (e.g., needle, injection, pin, screw, etc.)); a navigation array attached to the probe (p0267: sensor suite 210 can be used for navigation); a tracking system to detect and track elements of the navigation array (p0269: optical tracking); an augmented reality (AR) device (104 in fig. 1); a robotic arm having a cutting instrument (p0003: cutting tools; p0300: cutting tool); and a controller having at least one processor configured to: receive neuromonitoring data from the probe (p0290: patient's actual nerves have been imaged and reconstructed into 3D models; if the system detects that a particular nerve has been stimulated or is being approached by the stimulating probe); receive probe positional data from the tracking system to determine a three-dimensional position and orientation of the probe (p0358: the system 10 continues to track a position and an orientation of a probe); correlate the neuromonitoring data to the positional data to determine neuromapping data indicating a nerve position in the patient (p0290: detects that a particular nerve has been stimulated or is being approached); overlay a representation of the neuromapping data over the three-dimensional position and orientation of the patient to establish a boundary around the nerve (p0290: the hologram representing that nerve structure can be highlighted to the user 106 to make it easier); control the robotic arm to avoid the cutting instrument entering the boundary; and display the boundary on the AR device (p0290: nerve structure can be highlighted to the user 106 to make it easier to avoid contact with or injury to the nerve structure).

Ryan does not explicitly disclose displaying the representation as an overlay. Singh teaches this limitation (p0104: integrate that data with ultrasound data, and display the combined data on the ultrasound image (e.g., as an additional overlay or an adjacent image)). Ryan and Singh are combinable because they both deal with surgical procedures. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Ryan with the teaching of Singh for the purpose of determining safe passage through human tissue without damaging nearby neurovascular structures (p0002).

Regarding claim 2, Ryan teaches the system of claim 1, further comprising a second navigation array affixed to the patient or a surgical surface (600 in fig. 6).

Regarding claim 3, Ryan teaches the system of claim 1, wherein the image is a live video feed (p0230: video camera).

Regarding claim 4, Ryan teaches the system of claim 1, wherein the live video feed is from an endoscope (p0230: this would activate voice and video camera recording allowing the user 106).

Regarding claim 5, Ryan teaches the system of claim 1, wherein the image is a 3D image (p0230: video camera recording allowing the user 106 to capture and narrate the complaint in 3D while the issue is occurring).

Regarding claim 6, Ryan teaches the system of claim 1, wherein the image is a 2D image (p0285: the 2D ultrasound image relative to the marker's 1300 position).

Regarding claim 7, Ryan teaches the system of claim 1, wherein the controller is further configured to convey a warning if a predetermined threshold is exceeded (p0278: actual drill bit moves outside a safe target trajectory, the color of the virtual target 3400 changes to alert the user and an audible warning is emitted).

Regarding claim 10, claim 10 recites very similar limitations as claim 1 and is therefore rejected for the same reasons as claim 1. In addition, Ryan teaches control the robotic arm to avoid the cutting instrument entering the boundary (p0290: to avoid contact with or injury to the nerve structure); and display the boundary on the AR device (p0290: the hologram representing that nerve structure can be highlighted to the user 106 to make it easier to avoid contact with or injury to the nerve structure).
Regarding claim 11, Ryan in view of Singh teaches wherein the boundary is displayed as an overlay image based on the neuromapping data (Singh: p0096-0097: the B-mode overlay 450 of the psoas muscle, which the computer displays in color… instructs the computer to display location and proximity information of any nerves in the psoas muscle within the view of the probe 12). The rationale applied to the rejection of claim 10 has been incorporated herein.

Regarding claim 12, Ryan in view of Singh teaches the system of claim 11, wherein the overlay image comprises a plurality of zones around the nerve (Singh: 450 in fig. 44). The rationale applied to the rejection of claim 10 has been incorporated herein.

Regarding claim 13, Ryan in view of Singh teaches the system of claim 12, wherein the overlay image comprises a first portion having a first appearance and a second portion having a second appearance (Singh: fig. 45: 1, 2, and 3), wherein the second portion is more distal with respect to the nerve (Singh: fig. 45: 464). The rationale applied to the rejection of claim 10 has been incorporated herein.

Regarding claim 18, the limitations are the same as those of claim 13; claim 18 is therefore rejected for the same reasons as claim 13.

Regarding claim 14, Ryan in view of Singh teaches the system of claim 13, wherein the overlay image further comprises a third portion distal to the second portion to represent a margin (Singh: p0097: the displayed nerve may be up to 120% or more of identified size to build in a safety margin). The rationale applied to the rejection of claim 11 has been incorporated herein.

Regarding claim 15, Ryan teaches the system of claim 10, wherein the controller is further configured to provide haptic feedback to a user when the cutting instrument comes within a predefined proximity to the boundary (p0278).

Regarding claim 16, claim 16 recites similar limitations as claim 10 and is therefore rejected for the same reasons as claim 10, except for: when a user of the AR device is looking at the nerve position, display the boundary on the AR device as an overlay image over the real-world view based on the neuromapping data. Ryan further teaches this limitation (p0204: display device 104 for viewing by the user 106).

Regarding claim 17, Ryan teaches the system of claim 16, wherein the controller is further configured to perform at least one of: control the robotic arm to avoid the cutting instrument entering the boundary (p0290: nerve structure can be highlighted to the user 106 to make it easier to avoid contact with or injury to the nerve structure); or provide haptic feedback to the user of the AR device when the cutting instrument comes within a predefined proximity to the boundary.

Claims 8-9 are rejected under 35 U.S.C. 103 as being unpatentable over Ryan in view of Singh as applied to claim 1 above, and further in view of Polidoro (US 7001361).

Regarding claim 8, Ryan in view of Singh does not teach the system of claim 1, wherein the controller is further configured to receive temperature data from a temperature probe. Polidoro teaches this limitation (col. 7, lines 45-50: using a temperature or pressure probe to assess blood temperature or pressure in a vein or artery). Ryan in view of Singh and Polidoro are combinable because they both deal with surgical procedures. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Ryan in view of Singh with the teaching of Polidoro for the purpose of performing surgical procedures.
Regarding claim 9, Ryan in view of Singh and Polidoro teaches the system of claim 1, wherein the controller is further configured to receive strain data from a pressure sensor (Polidoro: col. 7, lines 45-50). The rationale applied to the rejection of claim 8 has been incorporated herein.

Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Ryan in view of Singh, further in view of Visser (US 8519999).

Regarding claim 19, Ryan in view of Singh does not teach the system of claim 18, wherein the first appearance and the second appearance differ in color. Visser teaches this limitation (col. 3, lines 50-51: these segmented regions are usually rendered in a distinctive way, often using a different colour). Ryan in view of Singh and Visser are combinable because they both deal with surgical procedures. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Ryan in view of Singh with the teaching of Visser for the purpose of making the segmented regions instantly recognizable within the image.

Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Ryan in view of Singh, further in view of Nawana (US 20250186145).

Regarding claim 20, Ryan in view of Singh does not teach the system of claim 16, wherein the controller is further configured to update the overlay image to account for a change in perspective of the user of the AR device. Nawana teaches this limitation (p0055: updates the output image such that the image is adjusted based on the user's head movement). Ryan in view of Singh and Nawana are combinable because they both deal with surgical procedures. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to combine the teachings of Ryan in view of Singh with the teaching of Nawana for the purpose of reducing the total number of operating rooms within a hospital, as multiple operating rooms may be combined during a renovation (p0005).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HELEN Q ZONG, whose telephone number is (571) 270-1600. The examiner can normally be reached Mon-Fri 9-6. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Sarpong Akwasi M., can be reached at (571) 270-3438. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HELEN ZONG/
Primary Examiner, Art Unit 2681

Prosecution Timeline

Sep 22, 2023: Application Filed
Nov 18, 2025: Non-Final Rejection, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602909: Multi-modal Model Training Method, Apparatus and Device, and Storage Medium
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12593984: System, Information Storage Medium, and Information Processing Method
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12591981: Effective Method to Estimate Pose, Velocity and Attitude with Uncertainty
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12591400: Print Processing System and Control Method
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12586420: Cascade Ensembles for Liveness Detection
Granted Mar 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 79% (87% with interview, a +8.2% lift)
Median Time to Grant: 2y 3m
PTA Risk: Low
Based on 709 resolved cases by this examiner. Grant probability derived from career allow rate.
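The headline figures follow directly from the career counts above. A minimal sketch of the arithmetic, assuming (as this page's 79% and 87% figures suggest) that the interview lift is applied as an additive percentage-point bump to the baseline allow rate:

```python
# Reproduce the dashboard's headline figures from the examiner's career counts.
# Assumption: the "+8.2% interview lift" is an additive percentage-point bump
# on the baseline allow rate; this matches the page's 79% -> 87% figures.

granted = 561
resolved = 709

allow_rate = granted / resolved               # career allow rate, ~0.791
interview_lift = 0.082                        # lift observed with an interview

grant_probability = allow_rate                # shown as 79%
with_interview = allow_rate + interview_lift  # ~0.873, shown as 87%

print(f"Career allow rate: {grant_probability:.1%}")
print(f"With interview:    {with_interview:.1%}")
```

Rounding to whole percentage points reproduces the displayed 79% and 87%.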
