Prosecution Insights
Last updated: April 19, 2026
Application No. 18/766,857

SURGICAL ROBOTIC SYSTEM AND METHOD FOR OPTICAL MEASUREMENT OF END EFFECTOR PITCH, YAW, AND JAW ANGLE

Final Rejection §103

Filed: Jul 09, 2024
Examiner: CULLEN, TANNER L
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Covidien LP
OA Round: 2 (Final)

Grant Probability: 71% (Favorable)
OA Rounds: 3-4
To Grant: 3y 0m
With Interview: 87%

Examiner Intelligence

Career Allow Rate: 71% (114 granted / 161 resolved; +18.8% vs TC avg; above average)
Interview Lift: strong, +16.6% across resolved cases with interview
Typical Timeline: 3y 0m avg prosecution; 35 currently pending
Career History: 196 total applications across all art units
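For readers checking the dashboard arithmetic, the headline allow rate follows directly from the raw career counts reported above, and the with-interview figure is consistent with adding the reported interview lift to that base rate. The lift-addition step is an assumption about how the dashboard derives its number, not a documented formula; a minimal sketch:

```python
# Career allow rate from the reported counts: 114 granted of 161 resolved.
granted, resolved = 114, 161
allow_rate = granted / resolved * 100
print(f"{allow_rate:.1f}%")  # 70.8%, displayed on the dashboard rounded to 71%

# Assumed derivation of the "with interview" figure: base allow rate plus the
# reported +16.6% interview lift (hypothetical; the dashboard's actual model
# is not documented here).
with_interview = allow_rate + 16.6
print(f"{with_interview:.1f}%")  # 87.4%, displayed rounded to 87%
```

Both computed values round to the displayed 71% and 87%, so the headline numbers are internally consistent with the raw counts.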

Statute-Specific Performance

§101: 8.5% (-31.5% vs TC avg)
§103: 57.2% (+17.2% vs TC avg)
§102: 19.3% (-20.7% vs TC avg)
§112: 11.7% (-28.3% vs TC avg)

Tech Center averages are estimates. Based on career data from 161 resolved cases.

Office Action

§103
DETAILED CORRESPONDENCE

This final office action is in response to the Amendments filed on 09 February 2026, regarding application number 18/766,857.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Response to Amendment

Claims 1-13 remain pending in the application. Claims 1-2 were amended in the Amendments to the Claims. Claims 3-13 are original. Applicant's amendment to claim 2 has overcome the objection previously set forth in the non-final office action mailed 10 November 2025. Therefore, the objection has been withdrawn.

Response to Arguments

Applicant's arguments, see Pages 5-7, filed 09 February 2026, with respect to the rejections of claims 1-13 under 35 U.S.C. § 103 have been fully considered and are persuasive. Therefore, the rejections have been withdrawn. However, upon further consideration, a new ground(s) of rejection is made further in view of newly cited reference Nemmers et al. (US 20110029132 A1). See full details below.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1 and 3-9 are rejected under 35 U.S.C. 103 as being unpatentable over Roy (US 20190381660 A1 and Roy hereinafter) in view of Rockrohr et al. (US 20220071726 A1 and Rockrohr hereinafter) and Nemmers et al. (US 20110029132 A1 and Nemmers hereinafter).
Regarding Claim 1

Roy teaches a surgical robotic system (see all Figs.; [0012]-[0017]) comprising: an instrument including a shaft defining a longitudinal axis (see Fig. 1, guide rail 7; [0031 "...a third degree of freedom long z-axis defined as pointing up is achieved by a guide rail 7 moving across a carriage guide 6 fixed to second carriage guide 5."]) and an end effector pivotable relative to the shaft at a yaw angle (see Fig. 1, end effector 9; Figs. 8-9, all; [0012], [0031 "The end effector 9 is able to translate along x, y and z axes and tilt along the axis of the rotational joint 8."] and [0037]), the end effector including a pair of jaws (see Fig. 1, end effector 9); a first imaging device configured to obtain a first image of the end effector along a first axis (see Fig. 1, any of the plural orthogonal direction cameras 16, 17, 18, 19; Figs. 3, 5, 7 and 9, all; [0032 "One or plural orthogonal direction cameras 15, 16, 17, 18 mounted on frame 1 viewing in a direction perpendicular to primary direction cameras along y-axis…"] and [0034]-[0037]); a second imaging device configured to obtain a second image of the end effector along a second axis (see Fig. 1, any of the plural third cross cameras 20, 21, 22; [0032 "...one of plural third cross cameras 20, 21, 22 mounted on the roof, mounting structure not shown for clarity, also fixed with reference to frame 1 viewing in a mutually perpendicular direction to viewing directions or primary and orthogonal direction cameras."] and [0037]); and a third imaging device configured to obtain a third image of the end effector along a third axis (see Fig. 1, any of the plural primary direction cameras 13, 14, 15; Figs. 10-11, all; [0032 "A robot vision supervisor comprised of one or plural primary direction cameras 13, 14, 15 fixed to frame 1, viewing along x-axis."] and [0037]), wherein each of the first, second, and third axes are transverse relative to each other (see Fig. 1, all; [0032 "A robot vision supervisor comprised of one or plural primary direction cameras 13, 14, 15 fixed to frame 1, viewing along x-axis. One or plural orthogonal direction cameras 15, 16, 17, 18 mounted on frame 1 viewing in a direction perpendicular to primary direction cameras along y-axis and one of plural third cross cameras 20, 21, 22 mounted on the roof, mounting structure not shown for clarity, also fixed with reference to frame 1 viewing in a mutually perpendicular direction to viewing directions or primary and orthogonal direction cameras."]), wherein each of the first imaging device, the second imaging device, and the third imaging device comprises an image sensor, such that each imaging device is configured to obtain a profile image of the end effector and the pair of jaws (see Fig. 1, cameras 13-22; [0012 "An object of the invention is to use cameras and image processing to extract the boundaries of a robot workspace and compare with the boundaries of the robot moving parts and end effectors to prevent collisions and guide the robot to its homing position when needed."], [0032] and [0034 "The robot vision supervisor can predict this collision as seen in FIG. 3 viewed by orthogonal camera 18 by extracting a silhouette of the known workspace object 10 and a silhouette of the end effector 9 and guide the end effector 9 such that the workspace object silhouettes and the robot silhouettes do not meet."]-[0038]).

Roy is silent regarding the pair of jaws pivotable at a pitch angle and openable to a jaw angle; wherein each of the imaging devices comprises a corresponding light source positioned opposite the image sensor to backlight the end effector and the pair of jaws, such that each imaging device is configured to obtain a backlit profile image.

Rockrohr teaches a surgical robotic system (see all Figs.; [0006]) comprising: an instrument including a shaft defining a longitudinal axis (see Fig. 2, instrument 20 and shaft 212; [0019 "The elongated shaft defines a longitudinal axis and has a proximal end and a distal end."], [0027] and [0055]) and an end effector pivotable relative to the shaft at a yaw angle (see Fig. 2, end effector 270; Fig. 7, all; [0019 "The end effector is supported adjacent the distal end of the elongate shaft and includes a first jaw and a second jaw movable in pitch, yaw, and jaw DOFs."], [0027], [0040 "FIG. 7 is a perspective view of the end effector of FIG. 6 with the end effector rotated in a positive yaw DOF;"], [0055] and [0066]), the end effector including a pair of jaws pivotable at a pitch angle (see Fig. 8, all; [0019 "The end effector is supported adjacent the distal end of the elongate shaft and includes a first jaw and a second jaw movable in pitch, yaw, and jaw DOFs."], [0041 "FIG. 8 is a perspective view of the end effector of FIG. 6 with the end effector rotated in a positive pitch DOF;"] and [0067]) and openable to a jaw angle (see Fig. 9, all; [0019 "The end effector is supported adjacent the distal end of the elongate shaft and includes a first jaw and a second jaw movable in pitch, yaw, and jaw DOFs."], [0042 "FIG. 9 is a perspective view of the end effector of FIG. 6 with the end effector rotated in a positive jaw DOF;"] and [0068]); and a first imaging device configured to obtain a first image of the end effector along a first axis (see Fig. 1, imaging device 16; [0049]-[0050]), wherein the first imaging device comprises an image sensor (see Fig. 1, imaging device 16; [0049]-[0050]).

Nemmers teaches a surgical robotic system (see all Figs.; [0017]) comprising: an instrument including a shaft defining a longitudinal axis and an end effector pivotable relative to the shaft at a yaw angle, the end effector including a pair of jaws pivotable at a pitch angle (see Figs. 10-12, robotic tool 200; [0017 "In one embodiment, a system for calibrating a robotic tool comprises: a housing including an aperture for receiving the robotic tool…"], [0044 "As a non-limiting example, a robotic tool is capable of rotating +/−30 degrees for Yaw and Pitch and +/−45 degrees for rotation. It is understood that any tool or robot arm attachment may be calibrated using the system 10."], [0061] and [0064]-[0066]); a first imaging device configured to obtain a first image of the end effector along a first axis (see Fig. 6, light source 24; Figs. 10-12, all; Abstract, all; [0017 "...an image generating device disposed in the housing and positioned to generate an image of the robotic tool received through the aperture of the housing, wherein the image generating device generates an image signal representing the image of the robotic tool..."], [0045 "The image generating device 22 is typically a camera. It is understood that various cameras can be used. It is further understood that although a single image generating device 22 is shown, any number of the devices or cameras can be used."]-[0046], [0058] and [0064]-[0066]); a second imaging device configured to obtain a second image of the end effector (see [0045 "The image generating device 22 is typically a camera. It is understood that various cameras can be used. It is further understood that although a single image generating device 22 is shown, any number of the devices or cameras can be used."]); and a third imaging device configured to obtain a third image of the end effector (see [0045 "The image generating device 22 is typically a camera. It is understood that various cameras can be used. It is further understood that although a single image generating device 22 is shown, any number of the devices or cameras can be used."]), wherein each of the first imaging device, the second imaging device, and the third imaging device comprises an image sensor and a corresponding light source positioned opposite the image sensor to backlight the end effector and the pair of jaws, such that each imaging device is configured to obtain a backlit profile image of the end effector and the pair of jaws (see Fig. 6, all, especially image generating device 22 and light source 24; Figs. 10-12, all, especially outline 205; Abstract, all; [0017 "...an image generating device disposed in the housing and positioned to generate an image of the robotic tool received through the aperture of the housing, wherein the image generating device generates an image signal representing the image of the robotic tool; a light source disposed in the housing to backlight the robotic tool received through the aperture of the housing; and a processor responsive to the image signal for calculating and monitoring a configuration of the robotic tool."], [0045 "The image generating device 22 is typically a camera. It is understood that various cameras can be used. It is further understood that although a single image generating device 22 is shown, any number of the devices or cameras can be used."]-[0046 "In certain embodiments a light diffuser 26 is disposed between the image generating device 22 and the light source 24 to provide a substantially even light distribution for silhouetting the robotic tool in the generated image."], [0058] and [0064 "As more clearly shown in FIG. 10, while the robotic tool is positioned within the field of view of the image generating device, a portion of the robotic tool (i.e. parent tool) is initially located, as identified by an outline 205. In order to teach the user tool for the tandem welding tool 200, four points (i.e. targets) 206, 208, 210, 212 need to be located in three dimension space, as shown in FIG. 11. Specifically, while the robotic tool is in the first position, the processor 14 analyzes the first image to locate the first point 206 and an XYZ coordinate representing a position where a pre-defined view line passes through a calibration plane defined during the calibration step 102."]-[0066]).

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the pair of jaws of the surgical robot system of Roy to be pivotable at a pitch angle and openable to a jaw angle, as taught by Rockrohr, to enable movement of the end effector in additional degrees of freedom to facilitate grasping tasks. It additionally would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the imaging devices of the surgical robotic system of Roy to further include corresponding light sources positioned opposite the image sensors to backlight the end effector and the pair of jaws to obtain a backlit profile image, as taught by Nemmers, in order to provide a substantially even light distribution for silhouetting the end effector in the generated images to facilitate determining a tool center point of the end effector.

Regarding Claim 3

Modified Roy teaches the surgical robotic system according to claim 1 (as discussed above in claim 1), and Roy further teaches wherein the end effector is pivotable relative to the shaft about a first pin defining a first pivot axis (see Fig. 1, rotational joint 8; Figs. 8-9, all; [0012], [0031 "A fourth degree of freedom along a tilt axis is achieved by fixing a rotational joint 8 on third guide rail 7 and mounting an end effector 9 to rotational joint 8. The end effector 9 is able to translate along x, y and z axes and tilt along the axis of the rotational joint 8. A known workspace object 10 sits at the bottom of the robot workspace."] and [0037]). Rockrohr additionally teaches wherein the end effector is pivotable relative to the shaft about a first pin defining a first pivot axis (see Figs. 6-10, first and second idlers 273 and 275; [0061 "The first and second idlers 273, 275 each define an idler axis I1, I2 that is perpendicular to the longitudinal axis A-A of the shaft 212 and parallel to one another."]).

Regarding Claim 4

Modified Roy teaches the surgical robotic system according to claim 3 (as discussed above in claim 3), and Roy further teaches wherein the first imaging device is disposed coaxially with the first pivot axis that is perpendicular to the longitudinal axis (see Fig. 1, any of the plural orthogonal direction cameras 16, 17, 18, 19; Figs. 3, 5, 7 and 9, all; [0031]-[0032 "One or plural orthogonal direction cameras 15, 16, 17, 18 mounted on frame 1 viewing in a direction perpendicular to primary direction cameras along y-axis…"] and [0034]-[0037]).

Regarding Claim 5

Modified Roy teaches the surgical robotic system according to claim 3 (as discussed above in claim 3), and Roy further teaches wherein the third imaging device is disposed perpendicular to the longitudinal axis and the first pivot axis (see Fig. 1, any of the plural primary direction cameras 13, 14, 15; Figs. 10-11, all; [0032 "A robot vision supervisor comprised of one or plural primary direction cameras 13, 14, 15 fixed to frame 1, viewing along x-axis."] and [0037]).

Regarding Claim 6

Modified Roy teaches the surgical robotic system according to claim 1 (as discussed above in claim 1), but Roy is silent regarding wherein the end effector includes a pair of jaws pivotable about a second pin defining a second pivot axis. Rockrohr teaches wherein the end effector includes a pair of jaws pivotable about a second pin defining a second pivot axis (see Figs. 6-10, jaws 276 and 278; [0025], [0062] and [0078 "As shown, the first and second jaws “a, b” are typically mirror images of each other with only minor differences and both of the first and second jaws “a, b” pivot about the same pin or axis, e.g., spindle axes S1, S2."]). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the pair of jaws of the surgical robot system of Roy to be pivotable about a second pin defining a second pivot axis, as taught by Rockrohr, to enable movement of the end effector in additional degrees of freedom to facilitate grasping tasks.

Regarding Claim 7

Modified Roy teaches the surgical robotic system according to claim 1 (as discussed above in claim 1), and Roy further teaches wherein the second imaging device is disposed coaxially with the longitudinal axis (see Fig. 1, any of the plural third cross cameras 20, 21, 22; [0032 "...one of plural third cross cameras 20, 21, 22 mounted on the roof, mounting structure not shown for clarity, also fixed with reference to frame 1 viewing in a mutually perpendicular direction to viewing directions or primary and orthogonal direction cameras."] and [0037]).

Regarding Claim 8

Modified Roy teaches the surgical robotic system according to claim 1 (as discussed above in claim 1), and Roy further teaches the system further comprising: an imaging assembly configured to couple to the instrument and to secure the first imaging device, the second imaging device, and the third imaging device relative to the end effector (see Fig. 1, frame 1 and cameras 13-22; [0032 "A robot vision supervisor comprised of one or plural primary direction cameras 13, 14, 15 fixed to frame 1, viewing along x-axis. One or plural orthogonal direction cameras 15, 16, 17, 18 mounted on frame 1 viewing in a direction perpendicular to primary direction cameras along y-axis and one of plural third cross cameras 20, 21, 22 mounted on the roof, mounting structure not shown for clarity, also fixed with reference to frame 1 viewing in a mutually perpendicular direction to viewing directions or primary and orthogonal direction cameras."]).

Regarding Claim 9

Modified Roy teaches the surgical robotic system according to claim 8 (as discussed above in claim 8), and Roy further teaches wherein the imaging assembly further includes a first mount defining a first plane that is aligned with the longitudinal axis (see Fig. 1, frame 1 wall mounted to cameras 16-19; [0032 "One or plural orthogonal direction cameras 15, 16, 17, 18 mounted on frame 1 viewing in a direction perpendicular to primary direction cameras along y-axis"]) and a second mount coupled to the first mount, the second mount defining a second plane that is transverse to the first plane (see Fig. 1, frame 1 wall mounted to cameras 13-15 or 20-22; [0032 "A robot vision supervisor comprised of one or plural primary direction cameras 13, 14, 15 fixed to frame 1, viewing along x-axis … one of plural third cross cameras 20, 21, 22 mounted on the roof, mounting structure not shown for clarity, also fixed with reference to frame 1 viewing in a mutually perpendicular direction to viewing directions or primary and orthogonal direction cameras."]).

Claims 2 and 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over Roy (as modified by Rockrohr and Nemmers) as applied to claim 1 above, and further in view of Huang et al. (US 20220097234 A1 and Huang hereinafter) and Wellman et al. (US 20190298398 A1 and Wellman hereinafter).
Regarding Claim 2

Modified Roy teaches the surgical robotic system according to claim 1 (as discussed above in claim 1), and Roy further teaches wherein the first imaging device is configured to obtain the first image of the end effector for determining the yaw angle (see Fig. 1, any of the plural orthogonal direction cameras 16, 17, 18, 19; Figs. 7-9, all; [0031 "A fourth degree of freedom along a tilt axis is achieved by fixing a rotational joint 8 on third guide rail 7 and mounting an end effector 9 to rotational joint 8. The end effector 9 is able to translate along x, y and z axes and tilt along the axis of the rotational joint 8."]-[0032] and [0034]-[0037 "As can be seen in camera viewed images FIG. 8 and FIG. 9 a single primary camera can successful compute all 3 positions and orientation."]), the second imaging device is configured to obtain the second image of the end effector (see Fig. 1, any of the plural third cross cameras 20, 21, 22; [0032 "...one of plural third cross cameras 20, 21, 22 mounted on the roof, mounting structure not shown for clarity, also fixed with reference to frame 1 viewing in a mutually perpendicular direction to viewing directions or primary and orthogonal direction cameras."] and [0037]), and the third imaging device is configured to obtain the third image of the end effector (see Fig. 1, any of the plural primary direction cameras 13, 14, 15; Figs. 10-11, all; [0032 "A robot vision supervisor comprised of one or plural primary direction cameras 13, 14, 15 fixed to frame 1, viewing along x-axis."] and [0037]).

Roy is silent regarding the second imaging device being configured to obtain the second image of the end effector for determining at least one of the pitch angle or the jaw angle, and the third imaging device being configured to obtain the third image of the end effector for determining at least one of the pitch angle or the jaw angle.

Huang teaches a surgical robotic system (see all Figs.; [0008]) comprising: an instrument including a shaft defining a longitudinal axis (see Fig. 14, robotic arm 11 or flange 110; [0044]) and an end effector (see Fig. 14, tool 12; [0044]), the end effector pivotable at a pitch angle (see Fig. 8, all; [0060] and [0086 "In this disclosure, the processor 10 may change the height of the tool 12, change the tilted direction of the tool 12, and rotate the tool 12 for multiple times, and obtain multiple horizontal displacement amounts through the step S522."]-[0087]); a second imaging device configured to obtain a second image of the end effector along a second axis (see Fig. 14, alignment device 13′; [0059]-[0060 "...the alignment device 13′ may directly capture a 3D image of the tool 12, and perform an image analysis to the 3D image..."] and [0091]), wherein the second imaging device comprises an image sensor, such that the imaging device is configured to obtain a profile image of the end effector (see Fig. 14, alignment device 13′; [0059]-[0060 "When the robotic arm 11 drives the tool 12 to move within the three-dimensional space 131, the alignment device 13′ may directly capture a 3D image of the tool 12, and perform an image analysis to the 3D image, therefore the processor may obtain information of the tool 12, such as shape, gesture, position, tilted angle, etc. through the analyzed data obtained from the image analysis."] and [0091]), wherein the second imaging device is configured to obtain the second image of the end effector for determining the pitch angle (see Figs. 8 and 14, all; [0060 "When the robotic arm 11 drives the tool 12 to move within the three-dimensional space 131, the alignment device 13′ may directly capture a 3D image of the tool 12, and perform an image analysis to the 3D image, therefore the processor may obtain information of the tool 12, such as shape, gesture, position, tilted angle, etc. through the analyzed data obtained from the image analysis."], [0087] and [0091]-[0094]).

Wellman teaches a surgical robotic system (see all Figs.; especially Fig. 4; [0009]) comprising: an instrument including a shaft defining a longitudinal axis (see Fig. 2, shaft 210; [0037 "As shown in FIG. 2, instrument 200 includes a long shaft 210 used to couple an end effector 220, located at a distal end of shaft 210..."]) and an end effector (see Figs. 4-5B, end effector 220; [0046 "FIG. 4 is a simplified perspective diagram that includes an imaging device 490 as well as a worksite 400 showing, at the distal end of an instrument, an end effector grasping a material 420, according to some embodiments."]-[0052]), the end effector including a pair of jaws (see Figs. 4-5B, jaws 410; [0046 "The end effector includes jaws 410, which may be consistent with the jaws of two-jawed gripper-style end effector 220 of FIG. 2 and/or the jaws 310 of FIG. 3."]-[0052]) openable to a jaw angle (see Figs. 4-5B, angle 470; [0049 "In some embodiments, an angle 470 between the jaws 410 may be determined."]-[0052]); and a third imaging device configured to obtain a third image of the end effector along a third axis (see Figs. 4-5B, imaging device 490; [0047 "Also shown in FIG. 4 is an imaging device 490 that includes one or more lenses 480 that may be used to capture imaging data of worksite 400."]-[0052]), wherein the third imaging device comprises an image sensor, such that the imaging device is configured to obtain a profile image of the end effector and the pair of jaws (see Figs. 4-5B, imaging device 490; [0047 "Also shown in FIG. 4 is an imaging device 490 that includes one or more lenses 480 that may be used to capture imaging data of worksite 400."]-[0050 "In some examples, FIGS. 5A and 5B comprise imaging data 500 and 570 captured via an imaging device such as the imaging device 490 of FIG. 4."]), wherein the third imaging device is configured to obtain the third image of the end effector for determining the jaw angle (see Figs. 4-5B, angle 470; [0049]-[0052], especially [0050 "In some examples, FIGS. 5A and 5B comprise imaging data 500 and 570 captured via an imaging device such as the imaging device 490 of FIG. 4."] and [0052 "As further shown in FIGS. 5A and 5B, fiducial indicia 530. e.g., markers, may exist on one or both jaws 510 to create a reference system or scale for determining the lengths L1, L2, and L3 and angle A."]; claim 7).

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the surgical robot system of modified Roy to determine at least one of the pitch angle or the jaw angle of the end effector by the second and third imaging devices, as taught by Huang and Wellman, in order to provide additional posture information of the end effector to facilitate control of the jaws.

Regarding Claim 10

Modified Roy teaches the surgical robotic system according to claim 1 (as discussed above in claim 1), and Roy further teaches the system further comprising: an image processing device configured to process the first image, the second image, and the third image (see Figs. 3, 5 and 7-11, all; [0012 "An object of the invention is to use cameras and image processing to extract the boundaries of a robot workspace and compare with the boundaries of the robot moving parts and end effectors to prevent collisions and guide the robot to its homing position when needed."]-[0013] and [0035]-[0038 "Even though a single primary camera can successfully compute all positions a plurality of primary cameras is used to cover large robot motion range due to limited field of views of a single camera and image distortions causing lack of accuracy. Image stitching algorithms are employed to improve positioning accuracy between the primary and secondary camera arrays by joining the viewed images to that of adjacent cameras creating panoramic views."]) and to calculate the yaw angle (see Figs. 8-9, all; [0031 "A fourth degree of freedom along a tilt axis is achieved by fixing a rotational joint 8 on third guide rail 7 and mounting an end effector 9 to rotational joint 8."]-[0032] and [0034]-[0037 "As can be seen in camera viewed images FIG. 8 and FIG. 9 a single primary camera can successful compute all 3 positions and orientation."]).

Roy is silent regarding the image processing device being configured to calculate the pitch angle and the jaw angle.

Huang teaches further comprising: an image processing device configured to process the second image (see Fig. 14, alignment device 13′; [0059]-[0060 "...the alignment device 13′ may directly capture a 3D image of the tool 12, and perform an image analysis to the 3D image..."] and [0091]) and to calculate the pitch angle (see Figs. 8 and 14, all; [0060 "When the robotic arm 11 drives the tool 12 to move within the three-dimensional space 131, the alignment device 13′ may directly capture a 3D image of the tool 12, and perform an image analysis to the 3D image, therefore the processor may obtain information of the tool 12, such as shape, gesture, position, tilted angle, etc. through the analyzed data obtained from the image analysis."], [0087] and [0091]-[0094]).

Wellman teaches further comprising: an image processing device configured to process the third image (see [0048 "The received imaging data may be transferred through the interface 140 to the control unit 150 for processing by one or more modules of the control unit 150, such as image processing module 190."]-[0052]) and to calculate the jaw angle (see Figs. 4-5B, angle 470; [0049]-[0052], especially [0050 "In some examples, FIGS. 5A and 5B comprise imaging data 500 and 570 captured via an imaging device such as the imaging device 490 of FIG. 4."] and [0052 "As further shown in FIGS. 5A and 5B, fiducial indicia 530. e.g., markers, may exist on one or both jaws 510 to create a reference system or scale for determining the lengths L1, L2, and L3 and angle A."]; claim 7).

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the image processing device of the surgical robot system of modified Roy to calculate the pitch angle and the jaw angle, as taught by Huang and Wellman, in order to provide additional posture information of the end effector to facilitate control of the jaws.

Regarding Claim 11

Modified Roy teaches the surgical robotic system according to claim 10 (as discussed above in claim 10), and Roy further teaches wherein the image processing device is further configured to determine position of the end effector relative to the shaft (see Figs. 8-9, all; [0015 "A further object of the invention is to use only a primary and a secondary orthogonal cameras to compute all positions along all axes."] and [0037 "As can be seen in camera viewed images FIG. 8 and FIG. 9 a single primary camera can successful compute all 3 positions and orientation. As seen in FIG. 8 two orthogonal positions axes to the view direction are computed by referencing the features 23, 24 and 26. The position along the viewing direction is computed by comparing the size of end effector 9 to that of a size at a prior known position such as home as seen in FIG. 9."]).
Huang additionally teaches wherein the image processing device is further configured to determine position of the end effector relative to the shaft (see [0090]-[0094 "...controls the alignment device 13′ to continually capture the images of the three-dimensional space 131, and performs an image analysis to the tool 12 in the captured images for obtaining the positions and the tilted angles of the tool 12, so as to compute the direction vector of the tool 12 (step S622)."]).

Wellman additionally teaches wherein the image processing device is further configured to determine position of the end effector or each jaw of the pair of jaws relative to the shaft (see [0049]-[0052], especially [0050 "In some examples, FIGS. 5A and 5B comprise imaging data 500 and 570 captured via an imaging device such as the imaging device 490 of FIG. 4."] and [0052 "As further shown in FIGS. 5A and 5B, fiducial indicia 530, e.g., markers, may exist on one or both jaws 510 to create a reference system or scale for determining the lengths L1, L2, and L3 and angle A."]; claim 7).

Claims 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over Roy (as modified by Rockrohr and Nemmers) as applied to claim 1 above, and further in view of Beauchemin (US 20100225666 A1; hereinafter Beauchemin).

Regarding Claim 12

Modified Roy teaches the surgical robotic system according to claim 1 (as discussed above in claim 1). Roy is silent regarding wherein each of the first imaging device, the second imaging device, and the third imaging device is an optical comparator. Beauchemin teaches a system (see all Figs.; [0005]-[0015]) comprising: a first imaging device configured to obtain a first image of a part (see Figs.
1-2, digital camera 13; [0009 "...a camera for capturing an image of the illuminated part and a lens placed in front of said camera…"] and [0040]); a second imaging device configured to obtain a second image of the part (see the duplication of parts rationale below); and a third imaging device configured to obtain a third image of the part (see the duplication of parts rationale below), wherein each of the first imaging device, the second imaging device, and the third imaging device comprises an image sensor and a corresponding light source positioned opposite the image sensor to backlight the part, such that each imaging device is configured to obtain a backlit profile image of the part (see Figs. 1-2, digital camera 13 and illumination means/light source 19; [0008 "...a light source for illuminating the part;..."], [0009 "...a camera for capturing an image of the illuminated part and a lens placed in front of said camera;..."], [0010 "...a data processing system for receiving the image, for obtaining a CAD drawing of said part, for displaying said image and said drawing on a computer screen, where said drawing is digitally overlaid on said image, said data processing system further including means for aligning said drawing with said image..."] and [0038]-[0046]), wherein each of the first imaging device, the second imaging device, and the third imaging device is an optical comparator (see [0001], [0005], [0017]-[0018] and [0023 "The present invention concerns a digital optical comparator system, which will, for ease of reference, be hereinafter referred to as “DOC”. The DOC of the present invention is a computerized optical instrument that allows the user to compare a digital live video image of a part with its CAD drawing, without any restriction regarding the part's size, and in real-time (i.e. at the camera's full frame rate)."]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the first, second and third imaging devices of the surgical robot system of modified Roy to be an optical comparator, as taught by Beauchemin, in order to measure and display the end effector in real-time, without having to physically touch it. Additionally, duplication of parts has no patentable significance unless a new and unexpected result is produced, and no unexpected result would be produced by applying the teachings of Beauchemin to Roy because Roy already discloses three transverse imaging devices to capture different perspectives of the end effector. See MPEP 2144.04(VI).

Regarding Claim 13

Modified Roy teaches the surgical robotic system according to claim 12 (as discussed above in claim 12). Roy is silent regarding wherein the optical comparator includes a laser backlight. Beauchemin teaches wherein the optical comparator includes a laser backlight (see Figs. 1-2, illumination means/light source 19; [0008 "...a light source for illuminating the part;..."] and [0043 "Illumination means 19 (either transmitted and/or reflected, for back- and/or front-illumination, respectively)."]). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the first, second and third imaging devices of the surgical robot system of modified Roy to be an optical comparator including a laser backlight, as taught by Beauchemin, in order to measure and display the end effector in real-time, without having to physically touch it.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TANNER LUKE CULLEN whose telephone number is (303) 297-4384. The examiner can normally be reached Monday-Friday 9:00-5:00 MT. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran, can be reached at (571) 272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/TANNER L CULLEN/
Examiner, Art Unit 3656

/KHOI H TRAN/
Supervisory Patent Examiner, Art Unit 3656

Prosecution Timeline

Jul 09, 2024
Application Filed
Nov 06, 2025
Non-Final Rejection — §103
Feb 09, 2026
Response Filed
Mar 17, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594966
REFERENCE TRAJECTORY VALIDATING AND COLLISION CHECKING MANAGEMENT
2y 5m to grant; granted Apr 07, 2026
Patent 12570002
TELEOPERATION ASSIST DEVICE, TELEOPERATION ASSIST METHOD, AND STORAGE MEDIUM
2y 5m to grant; granted Mar 10, 2026
Patent 12568883
METHOD AND SYSTEM FOR COMPUTER-ASSISTED HARVESTING
2y 5m to grant; granted Mar 10, 2026
Patent 12564969
EVENT-DRIVEN SELF-PROGRAMMABLE ROBOTS IN SMART HOMES AND SMART COMMUNITIES
2y 5m to grant; granted Mar 03, 2026
Patent 12539607
ROBOT PROGRAMMING
2y 5m to grant; granted Feb 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
71%
Grant Probability
87%
With Interview (+16.6%)
3y 0m
Median Time to Grant
Moderate
PTA Risk
Based on 161 resolved cases by this examiner. Grant probability derived from career allow rate.
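
The projection figures above can be reproduced from the career counts quoted earlier on this page (114 granted of 161 resolved, +16.6 points with an interview). A minimal sketch, assuming the dashboard simply rounds these ratios; the variable names are illustrative, not from any real API:

```python
# Reproduce the dashboard's headline percentages from the examiner's
# career counts shown on this page (illustrative arithmetic only).

granted = 114        # career grants by this examiner
resolved = 161       # career resolved cases (grants + abandonments)
interview_lift = 0.166  # quoted lift of +16.6 percentage points

grant_probability = granted / resolved  # ~0.708

print(f"Grant probability: {round(grant_probability * 100)}%")                      # 71%
print(f"With interview:    {round((grant_probability + interview_lift) * 100)}%")   # 87%
```

Note that this assumes the "with interview" figure is the career rate plus the flat lift; the tool's actual model may condition on interview-holding cases directly.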
