Prosecution Insights
Last updated: April 19, 2026
Application No. 18/718,863

SURGICAL ROBOT SYSTEM AND CONTROL METHOD

Non-Final OA: §103, §112
Filed
Jun 12, 2024
Examiner
BOICE, JAMES EDWARD
Art Unit
3795
Tech Center
3700 — Mechanical Engineering & Manufacturing
Assignee
B. Braun New Ventures GmbH
OA Round
1 (Non-Final)
Grant Probability: 79% (Favorable)
Expected OA Rounds: 1-2
Estimated Time to Grant: 2y 9m
Grant Probability with Interview: 89%

Examiner Intelligence

Career Allow Rate: 79%, above average (94 granted / 119 resolved; +9.0% vs TC avg)
Interview Lift: +10.0% across resolved cases with interview (moderate lift)
Typical Timeline: 2y 9m average prosecution; 56 applications currently pending
Career History: 175 total applications across all art units
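The headline figures in this panel are simple arithmetic on the career counts shown above. A minimal sketch that reproduces them; the 70% Tech Center baseline and the additive treatment of the interview lift are inferred from the displayed deltas, not taken from any real analytics API:

```python
# Reproduce the panel's headline figures from the raw career counts
# (94 granted out of 119 resolved cases).
# Assumptions: the TC baseline (70%) is back-solved from "+9.0% vs TC avg",
# and the "+10.0% interview lift" is applied additively.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage."""
    return 100.0 * granted / resolved

career_rate = allow_rate(94, 119)       # ~79.0% -> "Career Allow Rate"
vs_tc_avg = career_rate - 70.0          # ~+9.0% -> "+9.0% vs TC avg"
with_interview = career_rate + 10.0     # ~89.0% -> "With Interview"

assert round(career_rate) == 79
assert round(vs_tc_avg) == 9
assert round(with_interview) == 89
```

The 89% "with interview" probability on the summary card is consistent with this additive reading of the lift.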

Statute-Specific Performance

§101: 0.6% (-39.4% vs TC avg)
§103: 57.7% (+17.7% vs TC avg)
§102: 20.7% (-19.3% vs TC avg)
§112: 17.6% (-22.4% vs TC avg)
Tech Center averages are estimates; figures based on career data from 119 resolved cases.
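The per-statute deltas can be sanity-checked by back-solving the Tech Center baseline each one was measured against. A small sketch with the values transcribed from the panel above (the back-solving interpretation is mine, not part of the tool):

```python
# Back-solve the implied Tech Center average per statute:
# examiner_rate - delta = implied TC average.
# Rates are the per-statute rejection rates shown in the panel, in percent.

examiner_rate = {"101": 0.6, "103": 57.7, "102": 20.7, "112": 17.6}
delta_vs_tc   = {"101": -39.4, "103": +17.7, "102": -19.3, "112": -22.4}

implied_tc_avg = {s: round(examiner_rate[s] - delta_vs_tc[s], 1)
                  for s in examiner_rate}

# Every statute back-solves to the same 40.0% baseline estimate,
# consistent with the "Tech Center average estimate" note above.
assert all(v == 40.0 for v in implied_tc_avg.values())
```

That the four implied baselines coincide suggests the tool compares each statute against a single uniform TC-average estimate rather than per-statute averages.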

Office Action

Rejection grounds: §103, §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy of DE 10 2021 133 060.2 has been received.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. As such, the feature “manual gripping control means” for transmitting a manual input by a user in lines 4-5 of Claim 12 is interpreted as control unit 30 in FIG. 1 and described in paragraph [0030] of the present patent application. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. 
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: Claim 1, which claims in line 11 a surgical navigation system for navigating the end effector. A surgical navigation system is interpreted as navigation system 28 shown in FIG. 1 and described in paragraph [0052] of the present patent application. Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Drawings

The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore, the feature of “a digital surgical microscope that provides an optical image of the control unit” claimed in lines 3-4 of Claim 10 must be shown or the feature(s) canceled from the claim(s). Examiner notes that FIG. 1 of the present specification shows camera head 36 aimed at the surgical site, rather than the control unit 30. Examiner is unable to find support in the written description for the camera to be imaging the control unit. No new matter should be entered.

Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). 
If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 1 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Specifically, Claim 1 claims a “surgical navigation system for navigating the end effector” in line 11. Paragraphs [0015], [0018], [0027], [0052], and [0065] of the present specification describe a surgical navigation system as a tracking system. This feature is unclear since 1) the surgical navigation system is not claimed as a tracking system in Claim 1, and 2) if the surgical navigation system is intended to be a tracking system, then it is unclear what is being introduced in Claim 2 as “a tracking system”. Appropriate correction by Applicant is required. For purposes of examination, Examiner interprets “a surgical navigation system” as a tracking system.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA ) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. The present rejection(s) reference specific passages from cited prior art. However, Applicant is advised that the rejections are based on the entirety of each cited prior art. That is, each cited prior art reference “must be considered in its entirety”. (See MPEP 2141.02(VI)) Therefore, Applicant is advised to review all portions of the cited prior art if traversing a rejection based on the cited prior art. Claims 1-10 and 12-15 are rejected under 35 U.S.C. 103 as being unpatentable over Joskowicz et al. (US PGPUB 2009/0177081 – “Joskowicz”) in view of Crawford (US PGPUB 2019/0117313 – “Crawford”). Regarding Claim 1, Joskowicz discloses: A surgical robot system (Joskowicz FIG. 1A, miniature robot 10) for surgical intervention on a patient (Joskowicz paragraph [0071], “FIG. 1A, which illustrates schematically a miniature robot 10 in use in an image-guided robotic keyhole neurosurgery system”), comprising: at least one patient fixation unit (Joskowicz FIG. 
1A, mounting plate 24) which is adapted to be rigidly and directly attached to a head of the patient (Joskowicz FIG. 1C, showing plate 24 directly attached to a patient’s head/skull using pins 22), in order to rigidly fix at least a body portion of the patient with an intervention region relative to the at least one patient fixation unit (Joskowicz paragraph [0076], “the surgeon defines on the image set the desired entry point 18 or points and the desired target location or locations, and determines the robot mounting type (head clamp or skull, depending on clinical criteria) and the desired robot location”); at least one surgical robot (Joskowicz FIG. 1D, robot 10) that is controllable (Joskowicz FIG. 1A, cable 26; Joskowicz paragraph [0073], “The robot…may be operated via a single cable 26 from a controller”), the at least one surgical robot comprising a robot base (Joskowicz FIG. 1A, fixed platform 12), which is directly connected to the at least one patient fixation unit (Joskowicz paragraph [0072], “robot consists of a fixed platform 12 that attaches to a mounting plate 24”), as well as a robot arm that is moveable and connected to the robot base and an end effector (Joskowicz FIG. 1D, showing robot 10 connected to guide 16 for needle 20 by a control arm coming from robot 10; Joskowicz paragraph [0073], “The robot…positions and orients the targeting guide 16 to a predefined location”). Joskowicz does not explicitly disclose a surgical navigation system for navigating the end effector. Crawford teaches a surgical navigation system (Crawford FIG. 5, tracking subsystem 532) for navigating the end effector (Crawford FIG. 3, end-effector 310 on surgical robot system 300; Crawford paragraph [0063], “Tracking subsystem may track the location of certain markers that are located on the different components of system 300 and/or instruments used by a user during a surgical procedure”). 
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Crawford’s tracking subsystem with the surgical robot system disclosed by Joskowicz. A person having ordinary skill in the art would be motivated to combine these prior art elements according to known methods to yield the predictable result of a surgical robot system that is capable of identifying the location of its end effector for purposes of navigation thereof (see Crawford FIG. 6 and Crawford paragraph [0048]). Regarding Claim 2, Joskowicz in view of Crawford teaches the features of Claim 1, as described above. Crawford further teaches: at least one optical image unit (Crawford FIG. 2, camera 200), which is adapted to create an image of the intervention region and an end effector tip (Crawford FIG. 13B, tracking markers 118 on end effector 112; Crawford paragraph [0049], “robotic surgical system 100 can comprise one or more tracking markers 118 configured to track the movement of…end-effector 112, patient 210, and/or the surgical instrument 608; Crawford FIG. 
1, camera 200; Crawford paragraph [0044], “camera 200 may include any suitable camera or cameras, such as one or more infrared cameras (e.g., bifocal or stereophotogrammetric cameras), able to identify, for example, active and passive tracking markers 118”), and to provide the image in a computer-readable manner (Crawford paragraph [0044], “surgical robot system 100 may also utilize a camera 200…The camera 200 may include any suitable camera or cameras, such as one or more infrared cameras (e.g., bifocal or stereophotogrammetric cameras), able to identify, for example, active and passive tracking markers 118”); a data provision unit, which is adapted to provide digital 3D image data of the patient (Crawford paragraph [0044], “camera 200 may scan the given measurement volume and detect the light that comes from the markers 118 in order to identify and determine the position of the markers 118 in three-dimensions”; Crawford paragraph [0049], “robotic surgical system 100 can comprise one or more tracking markers 118 configured to track the movement of…patient 210”); a tracking system (Crawford FIG. 5, tracking system 300; Crawford paragraph [0109], “When using an external 3D tracking system…300…to track a full rigid body array of three or more markers attached to a robot's end effector 112 (for example, as depicted in FIGS. 13A and 13B), it is possible to directly track or to calculate the 3D position of every section of the robot 102 in the coordinate system of the cameras 200, 326 “), which is provided and adapted to detect and track in space at least the at least one optical image unit and at least the body portion of the patient, which is fixed relative to the at least one patient fixation unit (Crawford paragraph [0123], “each frame of data collected consists of the tracked position of the DRB 1404 on the patient 210, the tracked position of the single marker 1018 on the end effector 1014, and a snapshot of the positions of each robotic axis. 
From the positions of the robot's axes, the location of the single marker 1018 on the end effector 1012 is calculated. This calculated position is compared to the actual position of the marker 1018 as recorded from the tracking system”). Joskowicz further teaches: a control unit (Joskowicz paragraph [0040], “system hardware consists of: 1) the miniature robot and its controller”), which is provided and adapted to determine a position of the end effector tip and to generate an overlapping with the digital 3D image data and a positionally correct overlapped position of the end effector tip and to output the overlapping visually via a display unit as an overlapping representation and/or to control the end effector based on the overlapping (Joskowicz FIG. 5, augmented reality images 58; Joskowicz paragraph [0088], “The video monitor preferably shows real-time, augmented reality images 58 consisting of a video image of the actual patient skull and the positioning jig with mounting base in the hand of the surgeon 60, and, superimposed on it, a virtual image of the same jig indicating the robot base in its desired preplanned location”). Regarding Claim 3, Joskowicz in view of Crawford teaches the features of Claim 2, as described above. 
Joskowicz further discloses wherein the control unit is adapted to determine, via an optical image of the end effector tip, the position, of the end effector tip, relative to the at least one optical image unit by machine vision, and to determine, via a pose of the at least one optical image unit tracked by the tracking system, the position, of the end effector tip relative to the digital 3D image data (Joskowicz paragraph [0090], “the system also includes a surface scan processing module, which automatically extracts three sets of points from the intraoperative 3D surface scan generated by the 3-D surface scanner 36: 1) a forehead (frontal scan) or ear cloud of points (lateral scan); 2) preferably four eye or ear landmark points; and 3) the registration jig cloud of points (when the jig is present in the scan). The forehead/ear cloud of points is computed by first isolating the corresponding areas and removing outliers. The landmark points are extracted by fitting a triangular mesh and identifying the areas of maximum curvature as in the CT/MRI images. The jig cloud of points is computed by isolating the remaining points.”; Joskowicz paragraph [0093], “tracking an…end-effector 112…to be tracked in 3D”). Regarding Claim 4, Joskowicz in view of Crawford teaches the features of Claim 2, as described above. Crawford further teaches wherein the end effector has a predefined optical marking pattern (Crawford FIG. 13B, tracking markers 118 on end-effector 112), and the control unit is adapted to determine the position of the end effector tip based on the predefined optical marking pattern detected by the at least one optical image unit (Crawford paragraph [0052], “light emitted from and/or reflected by markers 118 can be detected by camera 200 and can be used to monitor the location and movement of the marked objects”). Regarding Claim 5, Joskowicz in view of Crawford teaches the features of Claim 2, as described above. 
Crawford further teaches wherein an optical marker is arranged on the at least one end effector (Crawford FIG. 13B, tracking markers 118 on end-effector 112) or on a terminal member of the robot arm, wherein the optical marker is detected and tracked by the tracking system in order to determine the position and orientation of the end effector tip (Crawford paragraph [0052], “light emitted from and/or reflected by markers 118 can be detected by camera 200 and can be used to monitor the location and movement of the marked objects”). Regarding Claim 6, Joskowicz in view of Crawford teaches the features of Claim 1, as described above. The first embodiment of Joskowicz cited in the rejection of Claim 1 does not explicitly disclose wherein the at least one patient fixation unit is a head holder used in neurosurgery. However, a second embodiment of Joskowicz discloses wherein the at least one patient fixation unit is a head holder used in neurosurgery (Joskowicz FIG. 1D, showing mounting plate 29 for robot 10 affixed to Mayfield clamp 28, which is screwed to the patient’s skull). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to substitute Joskowicz’s mounting plate 29 for Joskowicz’s mounting plate 24. A person having ordinary skill in the art would be motivated to make this simple substitution of one known element for another to obtain the predictable result of a surgical robot system that utilizes a Mayfield clamp 28 to immobilize the patient’s head while still promoting positional adjustment of the robot before and/or during surgery (see Joskowicz paragraph [0078]). Regarding Claim 7, Joskowicz in view of Crawford teaches the features of Claim 1, as described above. 
The first embodiment of Joskowicz cited in the rejection of Claim 1 does not explicitly disclose wherein the at least one patient fixation unit has a circumferential rail, to which the at least one surgical robot is fastened via the robot base or is moveable translationally along the circumferential rail via the robot base at least in sections. However, a second embodiment of Joskowicz discloses wherein the at least one patient fixation unit has a circumferential rail (Joskowicz FIG. 1D, Mayfield clamp 28), to which the at least one surgical robot is fastened via the robot base (Joskowicz FIG. 1D, showing robot 10 secured to Mayfield clamp 28 via mounting plate 29) or is moveable translationally along the circumferential rail via the robot base at least in sections. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to substitute Joskowicz’s mounting plate 29 for Joskowicz’s mounting plate 24. A person having ordinary skill in the art would be motivated to make this simple substitution of one known element for another to obtain the predictable result of a surgical robot system that utilizes a Mayfield clamp 28 to immobilize the patient’s head during surgery. Regarding Claim 8, Joskowicz in view of Crawford teaches the features of Claim 7, as described above. 
The second embodiment of Joskowicz further discloses wherein the robot base has a slide with a clamping and/or latching element which is adapted to be guided translationally along the circumferential rail and, when the clamping and/or latching element is activated, to fix a position of the robot base relative to the circumferential rail and, when the clamping and/or latching element is deactivated, to be freely movable again in order to adjust various positions of the at least one surgical robot relative to the at least one patient fixation unit and thus relative to the patient (Joskowicz paragraph [0078], “When the robot is mounted on a head frame such as a Mayfield clamp 28, as shown in FIG. 1D, the robot 10 is preferably attached thereto by means of an adjustable mechanical arm or mounting plate 29.”). Regarding Claim 9, Joskowicz in view of Crawford teaches the features of Claim 1, as described above. Crawford further teaches wherein the robot arm is configured to have at least five degrees of freedom to align the end effector with a surgical entry path (Crawford FIG. 6, robot arm 604; Crawford paragraph [0075], “end-effector 602 may engage with robot arm 604 through a locating coupling”; Crawford paragraph [0076], “locating coupling may be any style of kinematic mount that uniquely restrains six degrees of freedom”), and/or the end effector itself has at least one further degree of freedom, in order to allow further articulation within the patient. Regarding Claim 10, Joskowicz in view of Crawford teaches the features of Claim 2, as described above. Crawford further teaches wherein the at least one optical image unit (Crawford FIG. 2, camera 200) is a digital surgical microscope that provides an optical image of the control unit (Crawford FIG. 2, showing surgical robot 102 in the field of view of camera 200) in a computer-readable manner. Regarding Claim 12, Joskowicz in view of Crawford teaches the features of Claim 1, as described above. 
Crawford further teaches wherein the surgical robot system further comprises an external control console which is separate from the at least one patient fixation unit with the at least one surgical robot, and which comprises manual gripping control means for transmitting a manual input by a user to the end effector and controlling the end effector accordingly from a distance (Crawford FIG. 5, controller 538 within motion control subsystem 506; Crawford paragraph [0064], Motion control subsystem 506 may be configured to physically move vertical column 312, upper arm 306, lower arm 308, or rotate end-effector 310…These movements may be achieved by controller 538 which may control these movements through load cells disposed on end-effector 310 and activated by a user engaging these load cells to move system 300 in a desired manner. “). Regarding Claim 13, Joskowicz in view of Crawford teaches the features of Claim 2, as described above. Joskowicz further discloses wherein a preoperative intervention plan is furthermore stored in the data provision unit of the surgical robot system, and the control unit is adapted to control the end effector semi-autonomously or completely autonomously based on the preoperative intervention plan in order to perform the surgical intervention (Joskowicz paragraph [0086], “FIG. 4 which illustrates schematically the computation method for preoperative robot placement. The goal is to compute the transformation that aligns the planned trajectoryimageplanned 50, (defined by the entry point Pimageentry 18 and target point Pimagetarget 19) in image coordinates, to the targeting guide axis with the robot in its home position, guidehomerobot 52, in robot coordinates.“). Regarding Claim 14, Joskowicz discloses: A control method for a surgical robot system that includes a robot, the control method comprising the steps of: wherein the robot (Joskowicz FIG. 1D, robot 10) is directly connected to a patient fixation unit (Joskowicz Fig. 
1D, Mayfield clamp 28) rigidly connected to the patient (Joskowicz FIG. 1D, showing Mayfield clamp 28 screwed onto the skull of the patient), and providing the optical image (Joskowicz paragraph [0068], “tracking markers 702 may be activated such that the infrared markers 702 are visible to the camera 200, 326”); creating, via the control unit (Joskowicz paragraph [0040], “system hardware consists of: 1) the miniature robot and its controller”), an overlapping with the digital 3D image data and the position of the end-effector tip (Joskowicz FIG. 5, augmented reality images 58; Joskowicz paragraph [0088], “The video monitor preferably shows real-time, augmented reality images 58 consisting of a video image of the actual patient skull and the positioning jig with mounting base in the hand of the surgeon 60, and, superimposed on it, a virtual image of the same jig indicating the robot base in its desired preplanned location”); and outputting the overlapping as an overlapping representation by a display unit and/or controlling the robot-guided end effector based on the overlapping (Joskowicz FIG. 5, augmented reality images 58; Joskowicz paragraph [0088], “The video monitor preferably shows real-time, augmented reality images 58 consisting of a video image of the actual patient skull and the positioning jig with mounting base in the hand of the surgeon 60, and, superimposed on it, a virtual image of the same jig indicating the robot base in its desired preplanned location”). 
Joskowicz does not explicitly disclose: creating, via an optical image device, an optical image of an intervention region of a patient together with an end-effector tip of a robot-guided end effector, providing, via a data provision unit, digital 3D image data of the patient; tracking, via a tracking system, of at least the optical image device and a body portion of the patient fixed relative to the patient fixation unit; and determining, via a control unit, a position of the end-effector tip relative to the optical image device by machine vision on the basis of the optical image. Crawford teaches: creating, via an optical image device (Crawford FIG. 2, camera 200), an optical image of an intervention region of a patient together with an end-effector tip of a robot-guided end effector (Crawford FIG. 13B, tracking markers 118 on end effector 112; Crawford paragraph [0049], “robotic surgical system 100 can comprise one or more tracking markers 118 configured to track the movement of…end-effector 112, patient 210, and/or the surgical instrument 608”; Crawford FIG. 1, camera 200; Crawford paragraph [0044], “camera 200 may include any suitable camera or cameras, such as one or more infrared cameras (e.g., bifocal or stereophotogrammetric cameras), able to identify, for example, active and passive tracking markers 118”), providing, via a data provision unit, digital 3D image data of the patient (Crawford paragraph [0044], “camera 200 may scan the given measurement volume and detect the light that comes from the markers 118 in order to identify and determine the position of the markers 118 in three-dimensions”); tracking, via a tracking system, of at least the optical image device and a body portion of the patient fixed relative to the patient fixation unit (Crawford paragraph [0123], “each frame of data collected consists of the tracked position of the DRB 1404 on the patient 210, the tracked position of the single marker 1018 on the end effector 1014, and a snapshot of the positions of each robotic axis. From the positions of the robot's axes, the location of the single marker 1018 on the end effector 1012 is calculated. This calculated position is compared to the actual position of the marker 1018 as recorded from the tracking system”); and determining, via a control unit (Crawford FIG. 5, computer subsystem/computer 504; Crawford paragraph [0062], “Computer 504 includes an operating system and software to operate system 300. Computer 504 may receive and process information from other components (for example, tracking subsystem 532, platform subsystem 502, and/or motion control subsystem 506) in order to display information to the user.”), a position of the end-effector tip relative to the optical image device by machine vision on the basis of the optical image (Crawford FIG. 
7A, tracking markers 702 on end-effector 602; Crawford paragraph [0068], “tracking markers 702 may be activated such that the infrared markers 702 are visible to the camera 200, 326”); It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Crawford’s tracking subsystem with the surgical robot system disclosed by Joskowicz. A person having ordinary skill in the art would be motivated to combine these prior art elements according to known methods to yield the predictable result of a surgical robot system that is capable of identifying the location of its end effector for purposes of navigation thereof (see Crawford FIG. 6 and Crawford paragraph [0048]). Regarding Claim 15, Joskowicz in view of Crawford teaches the features of Claim 14, as described above. Crawford further teaches a computer-readable storage medium comprising instructions which, when executed by a computer, cause the computer to perform the control method according to claim 14 (Crawford FIG. 5, computer/computer subsystem 504; Crawford paragraph [0062], “Computer subsystem 504…includes an operating system and software to operate system 300.”). Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Joskowicz et al. (US PGPUB 2009/0177081 – “Joskowicz”) in view of Crawford (US PGPUB 2019/0117313 – “Crawford”) and Nagler et al. (US PGPUB 2004/0204646 – “Nagler”). Regarding Claim 11, Joskowicz in view of Crawford teaches the features of Claim 2, as described above. Joskowicz in view of Crawford does not explicitly teach wherein the at least one optical image unit is an endoscope with a distal image head, which is adapted to create an intracorporeal optical image of the patient. Nagler teaches wherein the at least one optical image unit is an endoscope (Nagler FIG. 7J, catheter 106) with a distal image head (Nagler FIG. 
7J, camera 113), which is adapted to create an intracorporeal optical image of the patient (Nagler FIG. 7J, showing unlabeled tumor 103 emitting radiation 12; Nagler paragraph [0145], “as seen in FIG. 7J, tumor 103 is first detected by camera 113, visually”).

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Nagler’s endoscope with the surgical robot system taught by Joskowicz in view of Crawford. A person having ordinary skill in the art would be motivated to combine these prior art elements according to known methods to yield the predictable result of a surgical robot system that is able to capture real-time images of internal organs/features of the patient, in order to confirm that the stereotactic guidance is accurate.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JIM BOICE whose telephone number is (571)272-6565. The examiner can normally be reached Monday-Friday 9:00am - 5:00pm Eastern.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anhtuan Nguyen, can be reached at (571)272-4963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. 
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

JIM BOICE
Examiner
Art Unit 3795

/JAMES EDWARD BOICE/
Examiner, Art Unit 3795

/ANH TUAN T NGUYEN/
Supervisory Patent Examiner, Art Unit 3795

1/27/26
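The single-marker verification loop that the rejection quotes from Crawford paragraph [0123] (compute the end-effector marker location from the robot's axis positions, then compare it against the camera-tracked position each frame) can be sketched as below. This is an illustrative reconstruction only, not Crawford's or the applicant's implementation: the 2-link planar kinematics, the function names, and the 2 mm tolerance are all assumptions.

```python
import math

def marker_from_axes(q, links):
    """Expected marker position computed from the robot's axis positions.
    Toy 2-link planar forward kinematics (assumed geometry): joint
    angles q in radians, link lengths in mm; returns an (x, y, z) point."""
    x = links[0] * math.cos(q[0]) + links[1] * math.cos(q[0] + q[1])
    y = links[0] * math.sin(q[0]) + links[1] * math.sin(q[0] + q[1])
    return (x, y, 0.0)

def verify_tracking(q, links, tracked_pos, tol_mm=2.0):
    """Per-frame check in the spirit of Crawford [0123]: compare the
    axis-derived marker position with the camera-tracked position and
    flag the frame if they disagree by more than the tolerance."""
    error = math.dist(marker_from_axes(q, links), tracked_pos)
    return error <= tol_mm, error

# Arm fully extended along x (marker expected at x = 200 mm);
# the camera reports the marker 1 mm away, so the frame passes.
ok, err = verify_tracking(q=(0.0, 0.0), links=(100.0, 100.0),
                          tracked_pos=(201.0, 0.0, 0.0))
```

A real system would run this check on every collected frame and use accumulated disagreement to detect a bumped reference array or a slipped registration.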

Prosecution Timeline

Jun 12, 2024
Application Filed
Jan 21, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599385
ENDOSCOPE SYSTEM AND ENDOSCOPIC LIGATOR ATTACHMENT METHOD
2y 5m to grant Granted Apr 14, 2026
Patent 12594126
INTRALUMINAL NAVIGATION USING VIRTUAL SATELLITE TARGETS
2y 5m to grant Granted Apr 07, 2026
Patent 12569117
ENDOSCOPE
2y 5m to grant Granted Mar 10, 2026
Patent 12533012
METHOD FOR FIXING CABLES FOR ACTUATING THE DISTAL HEAD OF A MEDICAL DEVICE
2y 5m to grant Granted Jan 27, 2026
Patent 12507875
ENDOSCOPE AND ENDOSCOPE SYSTEM
2y 5m to grant Granted Dec 30, 2025
Based on the 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
79%
Grant Probability
89%
With Interview (+10.0%)
2y 9m
Median Time to Grant
Low
PTA Risk
Based on 119 resolved cases by this examiner. Grant probability derived from career allow rate.
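The arithmetic behind the figures above (79% career allow rate, +10.0-point interview lift, 89% with interview) is consistent with a simple additive adjustment. The sketch below assumes that model; the tool's actual method is not disclosed, and the function name is hypothetical.

```python
def projected_grant_probability(career_allow_rate: float,
                                interview_lift: float = 0.0) -> float:
    """Additive-lift sketch (an assumption; the page's actual model is
    undisclosed). Rates are fractions in [0, 1]; the result is capped at 1.0."""
    return min(career_allow_rate + interview_lift, 1.0)

# 79% career allow rate with a +10.0-point interview lift -> 89%
with_interview = projected_grant_probability(0.79, 0.10)
```

The cap matters for high-allowance examiners, where an uncapped additive lift would exceed 100%.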
