Prosecution Insights
Last updated: April 19, 2026
Application No. 18/249,986

System for a Microscope System and Corresponding Method and Computer Program

Status: Non-Final OA, §103
Filed: Apr 21, 2023
Examiner: ORR, HENRY W
Art Unit: 2172
Tech Center: 2100 — Computer Architecture & Software
Assignee: LEICA INSTRUMENTS (SINGAPORE) PTE. LTD.
OA Round: 5 (Non-Final)
Grant Probability: 50% (Moderate); 88% with interview
Expected OA Rounds: 5-6
Time to Grant: 3y 10m

Examiner Intelligence

Career Allow Rate: 50% (230 granted / 456 resolved; -4.6% vs TC avg)
Interview Lift: +37.2% for resolved cases with interview (strong)
Typical Timeline: 3y 10m avg prosecution; 29 applications currently pending
Career History: 485 total applications across all art units

Statute-Specific Performance

§101: 6.8% (-33.2% vs TC avg)
§103: 53.4% (+13.4% vs TC avg)
§102: 19.4% (-20.6% vs TC avg)
§112: 15.1% (-24.9% vs TC avg)
Tech Center averages are estimates • Based on career data from 456 resolved cases

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 3/17/2026 has been entered.

DETAILED ACTION

1. This action is responsive to applicant's amendment dated 3/17/2026.
2. Claims 1-7 and 10-15 are pending in the case.
3. Claims 8 and 9 are cancelled.
4. Claims 1, 14 and 15 are independent claims.

Applicant's Response

5. In Applicant's response dated 3/17/2026, applicant has amended the following:
a) Claims 1, 14 and 15

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-7 and 10-15 are rejected under 35 U.S.C. 103 as being unpatentable over Hallen; Paul, U.S. Published Application No. 20190099226 A1, in view of Johnson et al. (hereinafter "Johnson"), U.S. Published Application No. 20210275264 A1, in further view of Kim et al. (hereinafter "Kim"), U.S. Published Application No. 20220265373 A1.

Claim 1: Hallen teaches a system for a microscope system, the system comprising one or more processors and one or more storage devices, wherein the system is configured to: (e.g., surgical console system comprising a processor and storage. Par. 4: Some embodiments involve an ophthalmic system that includes a surgical console and a heads-up display communicatively coupled with a surgical camera for capturing a three-dimensional image of an eye. Par. 116: The storage device 1430 can include software modules 1432, 1434, 1436 for controlling the processor 1410.)

obtain a trigger signal, the trigger signal indicating a desire of a user of the microscope system to display a visual representation of a control functionality currently associated with an input device of the microscope system; (e.g., obtain a trigger indicating a desire to turn on the touch screen and display control functionality such as graphical overlays or menu items as shown in Figure 4A, which are associated with at least a touch screen coupled to a microscope system. Par. 46: The heads-up display 304 can also include touchscreen capability, voice control capability, gaze-enabled control (as described below), etc. Par. 56: Also, a variety of graphical overlays 408, 410, 412, 414, 416 can be displayed along with the image of the eye. Par. 60: The GUI 400 can also include a variety of menu items 426, 428, 430, 432, 434, 436 for interacting with the surgical suite optimization engine or other components in the surgical suite. Par. 61: The GUI 400 can also include a variety of icons for controlling aspects of the display such as an Information Icon 438, a color adjustment icon 440, a camera adjustment icon 442, a monitor adjustment 444, a screen recording 446, an input source icon 448, a screen layout icon 450, a surgical step icon 452, etc. Par. 62: A wide variety of common approaches can be employed for interacting with the GUI 400 displayed on a heads-up display, e.g. a mouse, a touchpad, etc. Additionally, the heads-up display can include a touchscreen and the GUI 400 can be interacted with via the touchscreen. Par. 64: The digital pointer allows selection assistance via x or y-axis movements for navigating through menus and selecting preferred image or options.)

generate, based on the trigger signal, a visual overlay comprising the visual representation; (e.g., either a graphical overlay or a menu item is considered as the recited visual overlay; see par. 56, 60, and 64 as quoted above.)

and provide a display signal to a display device of the microscope system, the display signal comprising the visual overlay. (e.g., provide a signal to a display device as shown in Figure 4A with a graphical overlay or menu item (i.e., visual overlay); see par. 56, 60, and 62 as quoted above.)

and wherein the system is configured to generate the visual overlay such that the visual overlay comprises a visual representation of a control functionality currently associated with the input device and a visual representation of a control functionality available for association with the input device after toggling. (e.g., displaying a variety of menu items, wherein a selected menu item is a visual representation of a control functionality currently associated with the touchscreen input device and a non-selected menu item is a visual representation of a control functionality available for association with the touchscreen after toggling. Par. 60: The GUI 400 can also include a variety of menu items 426, 428, 430, 432, 434, 436 for interacting with the surgical suite optimization engine or other components in the surgical suite. For example, the GUI 400 can include a surgical plan menu item 426 for interacting with a surgical planning application, surgical plan preferences, etc. The GUI 400 can also include a surgical practice management application 428, a technical support contact menu 430, etc. Also, the GUI 400 can include control menus such as a supplemental optics/overlay menu 432, a display position menu 434, a diagnostic device control menu 436, etc.)
Hallen fails to expressly teach wherein the microscope system includes a plurality of input devices and a plurality of control functionalities are available to each of the plurality of input devices of the microscope system; wherein the input device is separate from the display device; wherein the system is configured to obtain a further trigger signal, the further trigger signal indicating a desire of a user of the microscope system to toggle a control functionality associated with the input device among the plurality of functionalities available to the input device; wherein the system is configured to toggle the control functionality associated with the input device based on the further trigger signal; and wherein the system is configured to adapt the visual representation after toggling the control functionality associated with the input device.

However, Johnson teaches wherein the microscope system includes a plurality of input devices and a plurality of control functionalities are available to each of the plurality of input devices of the microscope system; (e.g., foot-operated controls or a handheld device remotely controlling surgical instruments or other aspects of the user console. Par. 35: In the exemplary user console shown in FIG. 1C, a user located in the seat 110 and viewing the user display 130 may manipulate the foot-operated controls 120 and/or handheld user input devices 122 to remotely control the robotic arms 160 and/or surgical instruments mounted to the distal ends of the arm. The foot-operated controls 120 and/or handheld user input devices 122 may additionally or alternatively be used to control other aspects of the user console 100 or robotic system 150.)

wherein the input device is separate from the display device, (e.g., the foot-operated controls are separate from the console; see par. 35 as quoted above.)

wherein the system is configured to obtain a further trigger signal, the further trigger signal indicating a desire of a user of the microscope system to toggle a control functionality associated with the input device among the plurality of functionalities available to the input device, wherein the system is configured to toggle the control functionality associated with the input device based on the further trigger signal, (e.g., pedal controls toggle between control functionality with instruments and control functionalities with a graphical user interface, and after toggling to the control functionalities with the graphical user interface, adapting the visual display of the GUI with corresponding visual results as shown in Figures 20 and 21. Par. 166: In one embodiment, one of the foot controls acts as a "clutch pedal," which can toggle the functionality of the user input devices 122 between controlling the robotic arms and controlling the graphical user interface. Par. 67: After the surgeon picks up the handheld left and right user input devices 122, the message overlaid on the image of the surgical site in the first region 500 changes to "Match Grips," as shown in FIG. 7. "Match Grips" instructs the surgeon to open and close the controller's grip to match the current state of the instrument (e.g., the grip angle of the end effector). After the surgeon "matches grips," he fully engages the robotic arms and can proceed with surgery. Par. 18: The user input devices 122 can be used to toggle among the various displays and apps.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microscope system as taught by Hallen to include input devices for toggling associated control functionalities as taught by Johnson, with a reasonable expectation of success, to provide the benefit of improving the manipulation of a variety of tools during a surgical procedure.

Hallen/Johnson fails to expressly teach a system configured to generate a visual overlay based on a progress of a surgery being performed with help of the microscope system. (emphasis added) However, Kim teaches a system configured to generate a visual overlay based on a progress of a surgery being performed with help of the microscope system. (e.g., instructing the surgeon with visual overlays based on a smart guide (i.e., help from a microscope system). Par. 75: Referring to FIG. 12, in an exemplary embodiment, another step in the pedicle screw placement procedure is robotic arm movement in fine movement mode to reach the pedicle. This step can be subsequent to the step in FIG. 11. The instructive information in the first icon has been updated to the Smart Guide mode is active and may also instruct the surgeon to insert screwdriver into the robot tool guide. At this step, the user's input can be depressing and holding a button on the robotic arm to enter fine movement mode and/or actuate the robotic arm in fine movement mode. At this step, graphical representation of robot tool guide dynamically moves as user manipulates the control handle or motion clutch button. Par. 84: The GUI 100 may display the current status of the robotic arm and/or the prompt for instructing the user to connect a retractor to the robotic arm. The user may push a red button on the robotic arm to put the robotic arm into a hold position 2404, and connect a retractor to the distal end of the robotic arm. And the GUI may display "position hold," and/or "clutch to move retractor" after the retractor is connected 2404. When the user/surgeon provides an input at the clutch, foot pedal, or at the GUI, the robotic arm can enter the smart guide mode, and the GUI can be updated to display the status information as "Fine Movement Mode" Active, 2405. If a red button is pressed after position hold or smart guide state, the robotic arm may get ready to depart the surgical site 2406.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microscope system as taught by Hallen to include input devices for toggling associated control functionalities as taught by Johnson based on a smart guide as taught by Kim, with a reasonable expectation of success, to provide the benefit of improving the manipulation of tools during delicate and complex surgical procedures (see Kim; par. 3 and 4).

Claim 2 depends on claim 1: Hallen teaches wherein the system is configured to obtain the trigger signal from one of one or more input devices of the microscope system, and to generate the visual overlay with a visual representation of the control functionality associated with the input device the trigger signal is received from. (e.g., obtain a signal from the touchscreen to interact with GUI 400 to generate graphical overlays or menu items representing control functionality; see Hallen par. 56, 60, and 62 as quoted above.)

Claim 3 depends on claim 1: Hallen/Johnson teaches wherein the system is configured to obtain the trigger signal from a button of the input device. (e.g., buttons to turn on input devices or to make gesture selections with input devices; see Hallen par. 62 as quoted above.) (e.g., signal obtained from pressing on sensors (i.e., a button). Johnson: Furthermore, in some variations, the user input device 122 may include one or more sensors for detecting other manipulations of the user input device 122, such as squeezing of the housing 420 (e.g., via one or more pressure sensors, one or more capacitive sensors, etc.).)

Claim 4 depends on claim 1: Hallen teaches wherein the system is configured to process audio captured via a microphone of the microscope system, and to generate the trigger signal based on one or more keywords spoken by the user within the captured audio, (e.g., voice commands. Par. 62: ...therefore, contact-less interaction with the GUI 400 can be helpful. As explained above, in some cases, the surgical suite optimization engine includes a voice control module for recognizing voice commands for interacting with the GUI 400.) or wherein the system is configured to obtain the trigger signal via a touch-screen of the microscope system. (e.g., touchscreen gesture selections; see par. 62 as quoted above.)

Claim 5 depends on claim 1: Hallen/Johnson teaches wherein the visual representation represents a control functionality associated with a foot pedal, with a handle of the microscope system, with a mouth switch of the microscope system, or with an eye-tracking system of the microscope system. (e.g., control functionality associated with eye tracking or a foot pedal as input devices. Hallen par. 62: However, maintaining a sterile environment can be important in a surgical suite and a surgeon may already be using both hands for manipulating tools and his feet with foot pedals. Par. 63: Additionally, in some cases, the surgical suite optimization engine can employ gaze-tracking to interact with the GUI 400. For example, surgical glasses used to view a stereoscopic image can include an eye tracking mechanism (e.g. a camera, one or more accelerometers, etc.) that can detect eye position, movement of gaze, duration of focus, etc., and the GUI 400 can be interacted with using the detected eye/gaze position.) (e.g., control functionality associated with a foot pedal. Johnson par. 55: For example, a user may use one or more handheld user input devices 122 and/or one or more foot pedals 120 to selectively control an aspect of the robotic surgical system 150 and selectively interact with the GUI. Par. 60: The handheld user input device may include a clutch mechanism for switching between controlling a robotic arm or end effector and controlling a graphical user interface, etc., and/or between other control modes. Par. 69: These bars are color-coded to the pedals 120 of the robotic surgical system 150, and the length of the bars indicate the amount of energy applied to the tool by the pedals 120.)
Claim 6 depends on claim 1: Hallen/Johnson teaches wherein the input device has two or more sets of functionalities that are associated with two or more modes of the input device, with one of the two or more modes being active at the input device, wherein the system is configured to generate the visual representation based on the set of functionalities that is associated with the mode that is active at the input device. (e.g., the touchscreen has two or more sets of functionalities associated with two laser modes. Hallen par. 106: FIGS. 11B and 11C illustrate examples of graphical user interfaces showing optimized visualization for two laser states according to some embodiments of the present technology. As represented in FIGS. 11B and 11C, a graphical user interface (GUI) 1130 of a display dashboard includes a main surgical view window 1130 that displays a stereoscopic representation of a three-dimensional image of an eye received from the surgical camera. In FIG. 11B, the image of the eye is modulated toward red to maximize red contrast during a laser aiming step. In FIG. 11C, the image of the eye is modulated toward green to neutralize green flashback during a laser firing step.) (e.g., input device switching between controlling instruments and the GUI and control modes. Johnson par. 60: The handheld user input device may include a clutch mechanism for switching between controlling a robotic arm or end effector and controlling a graphical user interface, etc., and/or between other control modes.)

Claim 10 depends on claim 1: Hallen/Johnson teaches wherein the visual representation represents a plurality of control functionalities associated with a plurality of input modalities of the input device. (e.g., graphical overlay or selectable menu items representing a plurality of control functionalities associated with a plurality of means to trigger functionality with the input device; see Hallen par. 46, 56, 60, 61, 62, and 64 as quoted above.) (e.g., input device switching between controlling instruments and the GUI and control modes; see Johnson par. 60 as quoted above.)

Claim 11 depends on claim 1: Hallen teaches a microscope system comprising the system according to claim 1, an input device, and a display device. (e.g., the touchscreen is an input and display device; see Hallen par. 62 as quoted above.)

Claim 12 depends on claim 11: Hallen/Johnson teaches wherein the input device is one of a foot pedal, a handle, a mouth switch or an eye-tracking system of the microscope system. (e.g., eye tracking or a foot pedal as input devices; see Hallen par. 62 and 63 and Johnson par. 55, 60, and 69 as quoted above with respect to claim 5.)

Claim 13 depends on claim 11: Hallen/Johnson teaches wherein the input device comprises a plurality of input modalities, the plurality of input modalities comprising at least one of one or more buttons, one or more switches, one or more rotary controls, and one or more control sticks. (e.g., button means to trigger functionality; see Hallen par. 62 and the Johnson sensor passage as quoted above with respect to claim 3.)

Claim 14: Claim 14 is substantially encompassed in claim 1; therefore, Examiner relies on the same rationale set forth in claim 1 to reject claim 14.

Claim 15: Claim 15 is substantially encompassed in claim 1; therefore, Examiner relies on the same rationale set forth in claim 1 to reject claim 15. (e.g., surgical console system comprising a processor and storage; see Hallen par. 4 and 116 as quoted above.)

Claim Rejections - 35 USC § 103

Claim 7 is rejected under 35 U.S.C. 103, as quoted above together with the factual inquiries for obviousness, as being unpatentable over Hallen/Johnson/Kim as cited above, in view of Hassan; Alexander, U.S. Published Application No. 20210401527 A1, which claims priority to provisional application US 63046044 dated 6/30/2020.

Claim 7 depends on claim 1: Hallen/Johnson/Kim fails to expressly teach wherein the visual representation of the control functionality comprises a visual representation of the input device the control functionality is associated with. However, Hassan teaches wherein the visual representation of the control functionality comprises a visual representation of the input device the control functionality is associated with. (e.g., the displayed graphical representation 502 has the appearance of the foot pedal assembly when the user selects a medical tool associated with functionality that is operated by a foot pedal of the foot pedal assembly. Par. 155: In some embodiments, for example as illustrated in FIG. 25, the graphical representation 502 of the user input device can have the appearance of the user input device that it represents. In FIG. 25, the graphical representation 502 has the appearance of the foot pedal assembly, including representations of each of the foot pedals thereof. In some embodiments, the graphical representation 502 can be an image of the user input device. Par. 156: In some embodiments, the graphical representation 502 may be displayed to the user automatically based on a state or context of the robotic medical system or component(s) thereof. As one example, the graphical representation 502 can be displayed when the user selects a medical tool associated with functionality that is operated by a foot pedal of the foot pedal assembly. The graphical representation 502 can be displayed during certain portions of a robotic medical procedure.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the console for displaying control functionality based on an input device as taught by Hallen/Johnson/Kim to include a corresponding visual representation of the input device as taught by Hassan, with a reasonable expectation of success, to provide the benefit of a better indication to the user of how to interact with the user input device (see Hassan; par. 7).

Response to Arguments

Applicant's arguments filed 3/17/2026 have been fully considered but they are not persuasive.

Prior Art Rejections

1) Applicant argues that Hallen fails to teach obtaining a trigger signal indicating a desire of a user of the microscope system to display a visual representation of a control functionality currently associated with an input device of the microscope system (see Response; page 7).

Examiner respectfully disagrees. Examiner notes that the newly applied Kim reference is relied upon to teach obtaining a trigger signal indicating a desire of a user of the microscope system to display a visual representation of a control functionality currently associated with an input device of the microscope system. For example, Kim teaches a smart guide mode (e.g., par. 84: When the user/surgeon provides an input at the clutch, foot pedal, or at the GUI, the robotic arm can enter the smart guide mode, and the GUI can be updated to display the status information as "Fine Movement Mode" Active, 2405. In one or more steps disclosed herein in FIG. 24, the robotic surgical system 2101 may also provide visual and audio cues in addition to the user prompts at the GUI 100.). Examiner submits that input at the foot pedal or clutch indicates a desire of a user to display a visual representation of a control functionality related to "Fine Movement" currently associated with the foot pedal or clutch input device (see par. 83: In one or more steps during the procedure shown in FIG. 23, the robotic surgical system may provide visual or audio signal 2312 when the user's input, e.g., pressing a red button, pressing the clutch, etc., initiates no actuation of the robotic arm.)

Applicant's remaining arguments (see Response; pages 8 and 9), with respect to the previously cited prior art failing to disclose the new limitations, have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of the newly applied Kim reference (see Office action).

For at least the foregoing reasons, the claims are not in condition for allowance.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HENRY ORR whose telephone number is (571) 270-1308. The examiner can normally be reached 9AM-5PM EST M-F.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Queler, can be reached at (571) 272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HENRY ORR/
Primary Examiner, Art Unit 2172

Prosecution Timeline

Apr 21, 2023 — Application Filed
Dec 03, 2024 — Non-Final Rejection (§103)
Feb 28, 2025 — Response Filed
Apr 05, 2025 — Final Rejection (§103)
May 06, 2025 — Interview Requested
May 16, 2025 — Examiner Interview Summary
May 16, 2025 — Applicant Interview (Telephonic)
Jun 04, 2025 — Response after Non-Final Action
Jul 09, 2025 — Response after Non-Final Action
Jul 09, 2025 — Notice of Allowance
Sep 02, 2025 — Response after Non-Final Action
Sep 15, 2025 — Request for Continued Examination
Sep 22, 2025 — Response after Non-Final Action
Sep 23, 2025 — Non-Final Rejection (§103)
Dec 19, 2025 — Response Filed
Jan 22, 2026 — Final Rejection (§103)
Mar 17, 2026 — Request for Continued Examination
Mar 19, 2026 — Response after Non-Final Action
Mar 20, 2026 — Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology, based on the 5 most recent grants:

Patent 12578851 — SYSTEMS, METHODS, AND GRAPHICAL USER INTERFACES FOR GENERATING SHORT RUN CONTROL CHARTS (granted Mar 17, 2026; 2y 5m to grant)
Patent 12572268 — ACCELERATED SCROLLING AND SELECTION (granted Mar 10, 2026; 2y 5m to grant)
Patent 12561589 — SYSTEM AND METHOD FOR INDUSTRIAL AUTOMATION RULES ENGINE (granted Feb 24, 2026; 2y 5m to grant)
Patent 12547304 — INFORMATION PROCESSING SYSTEM, METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM FOR DISPLAYING ENLARGEED IMAGE CORRESPONDING TO A FILE IMAGE (granted Feb 10, 2026; 2y 5m to grant)
Patent 12530968 — MAP-BASED EMERGENCY CALL MANAGEMENT AND DISPATCH (granted Jan 20, 2026; 2y 5m to grant)


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 50%; 88% with interview (+37.2%)
Median Time to Grant: 3y 10m
PTA Risk: High

Based on 456 resolved cases by this examiner. Grant probability derived from career allow rate.
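The headline figures above are mutually consistent under one simple reading: the career allow rate is 230/456 ≈ 50%, and the "+37.2% interview lift" appears to be added as percentage points on top of that rate to give the 88% with-interview figure. The tool does not state its formula, so the additive combination below is an assumption, shown only to check the arithmetic:

```python
# Sanity check of the dashboard's figures.
# Assumption (not stated by the tool): the interview lift is additive
# in percentage points on top of the career allow rate.

granted, resolved = 230, 456      # examiner's career totals from this page
interview_lift = 0.372            # "+37.2% interview lift"

allow_rate = granted / resolved               # career allow rate
with_interview = allow_rate + interview_lift  # assumed additive lift

print(f"Career allow rate: {allow_rate:.1%}")      # ~50.4%, shown as 50%
print(f"With interview:    {with_interview:.1%}")  # ~87.6%, shown as 88%
```

Both values round to the 50% and 88% displayed above, which supports (but does not prove) the additive reading.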
