Prosecution Insights
Last updated: April 19, 2026
Application No. 17/911,290

SYSTEMS AND METHODS FOR FACILITATING AUTOMATED OPERATION OF A DEVICE IN A SURGICAL SPACE

Status: Final Rejection (§103)
Filed: Sep 13, 2022
Examiner: BARNES JR, CARL E
Art Unit: 2178
Tech Center: 2100 — Computer Architecture & Software
Assignee: Intuitive Surgical Operations, Inc.
OA Round: 4 (Final)

Grant Probability: 32% (At Risk)
OA Rounds: 5-6
To Grant: 4y 4m
With Interview: 57%

Examiner Intelligence

Career Allow Rate: 32% (grants only 32% of cases; 65 granted / 202 resolved; -22.8% vs TC avg)
Interview Lift: +25.2% (resolved cases with interview vs. without)
Avg Prosecution (typical timeline): 4y 4m; 32 applications currently pending
Total Applications (career history): 234, across all art units

Statute-Specific Performance

§101: 14.3% (-25.7% vs TC avg)
§103: 62.6% (+22.6% vs TC avg)
§102: 9.0% (-31.0% vs TC avg)
§112: 8.7% (-31.3% vs TC avg)

Black line = Tech Center average estimate • Based on career data from 202 resolved cases
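The dashboard figures above are internally consistent, and the arithmetic behind them can be reproduced in a few lines (a minimal sketch; the tool's exact rounding conventions are an assumption, not taken from the tool itself):

```python
# Reproduce the examiner-intelligence arithmetic shown above.
# Assumption: displayed rates are rounded from the raw fractions.

granted, resolved = 65, 202

# Career allow rate: 65 / 202 ~= 32.2%, displayed as 32%.
allow_rate = 100 * granted / resolved
print(f"Career allow rate: {allow_rate:.1f}%")

# The "-22.8% vs TC avg" delta implies a Tech Center average allow
# rate of about 32.2% + 22.8% ~= 55.0%.
tc_avg = allow_rate + 22.8
print(f"Implied TC average allow rate: {tc_avg:.1f}%")

# Interview lift: 57% with interview vs. 32% without is roughly the
# displayed +25.2% (the extra 0.2% presumably comes from unrounded inputs).
print(f"Interview lift (rounded inputs): {57 - 32}%")

# Each statute-specific delta backs out the TC average the chart's black
# line would sit at; notably, all four imply the same 40.0% baseline.
for statute, rate, delta in [("101", 14.3, -25.7), ("103", 62.6, 22.6),
                             ("102", 9.0, -31.0), ("112", 8.7, -31.3)]:
    print(f"§{statute}: TC avg ≈ {rate - delta:.1f}%")
```

That all four statute deltas back out to the same 40.0% baseline is a useful sanity check that the chart uses one shared Tech Center estimate.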

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 02/26/2026 was filed after the mailing date of the non-final Office action on 11/20/2025. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Response to Amendment

Claims 1-7 and 10-20 were previously pending and subject to the non-final action filed on 11/03/2025. In the response filed 02/10/2026, claims 1, 12, 16, and 17 were amended. Therefore, claims 1-7 and 10-20 are currently pending and subject to the final action below.

Response to Arguments

Applicant's arguments, see pages 12-15, filed 02/10/2026 with respect to the rejection of claims 1-7 and 10-20 under 35 U.S.C. 103, have been fully considered but are not persuasive.

Applicant's argument: Applicant respectfully traverses these rejections because Azizian, McDowall, and DiMaio fail to teach or suggest each and every element recited in claim 1.
For example, as discussed during the interview and for at least the reasons described below, the combination of Azizian, McDowall, and DiMaio fails to disclose a processor configured to execute instructions to "direct the computer-assisted surgical system to automatically perform, without requiring further input from a user of the computer-assisted surgical system and based on the one or more operating characteristics of the device and the one or more anatomical characteristics associated with the surgical space, an operation with the device located in the surgical space, wherein: the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to automatically control operation of the first robotic instrument in the surgical space," as recited in amended independent claim 1 (emphasis added). Office Action, pg. 7.

DiMaio indicates that a "trajectory may be associated with a voice command which upon its detection, the auxiliary controller 242 causes the slave arm 124 to move the LUS probe 150 back and forth along the stored trajectory of positions and orientations." See, e.g., DiMaio, para. [0080].
However, as discussed during the interview, DiMaio fails to disclose a processor configured to execute instructions to "direct the computer-assisted surgical system to automatically perform, without requiring further input from a user of the computer-assisted surgical system and based on the one or more operating characteristics of the device and the one or more anatomical characteristics associated with the surgical space, an operation with the device located in the surgical space, wherein: the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to automatically control operation of the first robotic instrument in the surgical space," as recited in amended independent claim 1 (emphasis added). Because Azizian, McDowall, and DiMaio fail to disclose, teach, or suggest each and every feature of independent claim 1, the combination of Azizian, McDowall, and DiMaio is not sufficient to establish a prima facie obviousness rejection of claim 1. Moreover, the additional cited art in the Office Action fails to cure the deficiencies of Azizian, McDowall, and DiMaio. Hence, the combination of the additional cited art with Azizian, McDowall, and DiMaio is also not sufficient to establish a prima facie obviousness rejection of claim 1. Although of different scope than claim 1, independent claims 16 and 17 are patentable over the combination of Azizian, McDowall, and DiMaio for at least the same reasons presented with respect to claim 1.

Examiner Response: After careful consideration and review of the prior art and applicant's arguments, the examiner respectfully disagrees.

Azizian teaches: obtain one or more operating characteristics of a device located in a surgical space; (Azizian − [0034] FIG. 4A shows a laparoscopic ultrasound probe 400, and FIG. 4B shows a drop-in ultrasound probe 450 that can be grasped by a laparoscopic tool 460. Examiner Notes: The laparoscopic tool 460 has a robotic arm and is a device. The ultrasound probe 450 is also a device. [0052] tracking of the robotic arm using joint encoders and/or kinematic calculation. Various techniques may be used for controlling the robotic arm.)

obtain one or more anatomical characteristics associated with the surgical space, (Azizian − [0034] For example, ultrasound images can provide intraoperative information about anatomical structures beneath the surface of organs that are visible via an endoscope. Specialized ultrasound images such as Doppler images may be helpful in visualizing temporal information such as blood flow or tissue perfusion)

and direct the computer-assisted surgical system to automatically perform, without requiring further input from a user of the computer-assisted surgical system and based on the one or more operating characteristics of the device and the one or more anatomical characteristics associated with the surgical space, an operation with the device located in the surgical space, (Azizian − [0052] the control mechanism for the arm may include a force sensor that can be operated under a motion constraint, for example, to maintain a constant contact force between an imaging transducer (e.g., an ultrasound probe) and an anatomical feature (e.g., soft tissue) at the surgical site. The system automatically controls the constant contact force between the ultrasound probe and the anatomical feature; constant force is an automatic operation of the device.) Azizian recites that maintaining a constant contact force is performed automatically, without user control.

McDowall teaches: obtain one or more anatomical characteristics associated with the surgical space, (McDowall − [0005] A tissue surface depth map is produced for a tissue surface within a field of view of a stereoscopic viewing system. [0041] FIG. 11 is an illustrative drawing representing a perspective view of a tissue structure image 1102A overlaid with a tissue surface depth map 1100.) the one or more anatomical characteristics including depth map data of the surgical space; (McDowall − [0005] [0019] [0041] FIG. 11, id.)

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Azizian and McDowall, as each invention is in the same field of visual computer-assisted systems for minimally invasive computer-assisted surgery. One of ordinary skill in the art would have been motivated to make these modifications to help improve tracking of robotic surgical equipment in a confined space and to avoid accidental collision of robotic surgical equipment during surgery, thereby reducing the risk of accidental damage to patient anatomy.

DiMaio teaches: wherein the device is engaged with a first robotic instrument attached to a computer-assisted surgical system and master controls of the computer-assisted surgical system are configured for bimanual control of no more than two robotic instruments at a time; (DiMaio − Fig. 2 [Col. 4 ll. 34-38] The master input devices 107 and 108 may include any one or more of a variety of input devices such as joysticks, gloves, trigger-guns, hand-operated controllers, or the like. [Col. 8 ll. 12-55] FIG. 2 illustrates, as an example, a block diagram of the LUS robotic surgical system 100. In this system, there are two master input devices 107 and 108. When control switch mechanism 231 is placed in the first mode, it causes master controller 202 to communicate with slave controller 203 so that manipulation of the master input 107 by the surgeon results in corresponding movement of tool 138 by slave arm 121, while the endoscope 140 is locked in position. Similarly, when control switch mechanism 231 is placed in the first mode, it causes master controller 222 to communicate with slave controller 223 so that manipulation of the master input 108 by the surgeon results in corresponding movement of tool 139 by slave arm 122.) In simple wording, in the first (normal) mode, tool 1 and tool 2 are operable for moving while the endoscope (another tool) is locked in position.

wherein: the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to automatically control operation of the first robotic instrument in the surgical space; (DiMaio − [Col. 12 ll. 16-22] Fig. 1 element 139; FIG. 6 illustrates, as an example, a flow diagram of a method for automatically moving the LUS probe 150 to a position and orientation associated with a clickable thumbnail upon command to do so by a surgeon while performing a minimally invasive surgical procedure using tool 139)

and the automatically controlling operation of the first robotic instrument is performed while a second robotic instrument and a third robotic instrument attached to the computer-assisted surgical system are bimanually teleoperated by a user of the computer-assisted surgical system. (DiMaio − [Col. 12 ll. 16-22] Fig. 1 element 139; FIG. 6, quoted above. [Col. 8 ll. 12-55] FIG. 2, quoted above.)

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Azizian, McDowall, and DiMaio, as each invention is in the same field of visual computer-assisted systems for minimally invasive computer-assisted surgery. One of ordinary skill in the art would have been motivated to make these modifications to help improve laparoscopic surgery in the confined space of the body and to avoid damaging tissue in small spaces. DiMaio recites that the auxiliary controller 242 causes the LUS probe 150 to move to a position automatically.

Examiner Note: A non-robotic device is being interpreted as an image processing device. Bimanual teleoperation refers to the design or use of two devices at once (i.e., a left robotic arm and a right robotic arm).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-7, 10-13, and 15-19 are rejected under 35 U.S.C. 103 as being unpatentable over Azizian (WO 2018152183 A1, Pub. Date Aug. 23, 2018) in view of McDowall (US 20190060013 A1, Pub. Date Feb. 28, 2019), further in view of DiMaio (US 20180042680 A1, Pub. Date Feb. 15, 2018).
Regarding independent claim 1, Azizian teaches:

A system comprising: a memory storing instructions; (Azizian − [0053, 0056] a storage device, and one or more memory devices)

and a processor communicatively coupled to the memory and configured to execute the instructions to: (Azizian − [0053] various modifications (hereinafter "the functions") can be implemented, at least in part, by one or more data processing apparatus, storage device; [0056] a processor will receive instructions and data from a read-only memory or a random access memory or both. Components of a computer include a processor for executing instructions and one or more memory devices for storing instructions and data.)

obtain one or more operating characteristics of a device located in a surgical space; (Azizian − [0034] FIG. 4A shows a laparoscopic ultrasound probe 400, and FIG. 4B shows a drop-in ultrasound probe 450 that can be grasped by a laparoscopic tool 460. Examiner Notes: The laparoscopic tool 460 has a robotic arm and is a device. The ultrasound probe 450 is also a device. [0052] tracking of the robotic arm using joint encoders and/or kinematic calculation. Various techniques may be used for controlling the robotic arm.)

obtain one or more anatomical characteristics associated with the surgical space, (Azizian − [0034] For example, ultrasound images can provide intraoperative information about anatomical structures beneath the surface of organs that are visible via an endoscope. Specialized ultrasound images such as Doppler images may be helpful in visualizing temporal information such as blood flow or tissue perfusion)

and direct the computer-assisted surgical system to automatically perform, without requiring further input from a user of the computer-assisted surgical system and based on the one or more operating characteristics of the device and the one or more anatomical characteristics associated with the surgical space, an operation with the device located in the surgical space, (Azizian − [0052] the control mechanism for the arm may include a force sensor that can be operated under a motion constraint, for example, to maintain a constant contact force between an imaging transducer (e.g., an ultrasound probe) and an anatomical feature (e.g., soft tissue) at the surgical site. The system automatically controls the constant contact force between the ultrasound probe and the anatomical feature; constant force is an automatic operation of the device.) Maintaining a constant contact force is performed automatically, without user control.

Azizian does not explicitly teach: depth map data of the surgical space.

However, McDowall teaches: obtain one or more anatomical characteristics associated with the surgical space, (McDowall − [0005] A tissue surface depth map is produced for a tissue surface within a field of view of a stereoscopic viewing system. [0041] FIG. 11 is an illustrative drawing representing a perspective view of a tissue structure image 1102A overlaid with a tissue surface depth map 1100.) the one or more anatomical characteristics including depth map data of the surgical space; (McDowall − [0005] [0019] [0041] FIG. 11, id.)
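As an illustration of the kind of depth-map data McDowall's stereoscopic viewing system produces, depth is conventionally recovered from stereo disparity via the pinhole-stereo relation Z = f·B/d. The sketch below uses invented camera parameters; nothing in it comes from McDowall itself:

```python
# Hypothetical depth-from-disparity computation, the usual way a tissue
# surface depth map is built from a stereoscopic viewer. The focal length
# f (pixels) and baseline B (mm) are invented values for illustration.

def depth_from_disparity(disparity_px: float, f_px: float = 800.0,
                         baseline_mm: float = 5.0) -> float:
    """Pinhole-stereo depth: Z = f * B / d. Returns depth in mm."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_mm / disparity_px

# A toy 3x3 disparity patch (pixels) -> depth map (mm); larger disparity
# means the tissue point is closer to the cameras.
disparities = [[40.0, 42.0, 44.0],
               [38.0, 40.0, 42.0],
               [36.0, 38.0, 40.0]]
depth_map = [[depth_from_disparity(d) for d in row] for row in disparities]
print(depth_map[1][1])  # 800 * 5 / 40 = 100.0 mm
```

Overlaying such a per-pixel depth grid on the tissue image is, in essence, the "tissue structure image overlaid with a tissue surface depth map" arrangement the examiner cites from McDowall FIG. 11.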
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Azizian and McDowall, as each invention is in the same field of visual computer-assisted systems for minimally invasive computer-assisted surgery. One of ordinary skill in the art would have been motivated to make these modifications to help improve tracking of robotic surgical equipment in a confined space and to avoid accidental collision of robotic surgical equipment during surgery, thereby reducing the risk of accidental damage to patient anatomy.

Azizian does not explicitly teach: wherein the device is engaged with a first robotic instrument attached to a computer-assisted surgical system and master controls of the computer-assisted surgical system are configured for bimanual control of no more than two robotic instruments at a time.

However, DiMaio teaches: wherein the device is engaged with a first robotic instrument attached to a computer-assisted surgical system and master controls of the computer-assisted surgical system are configured for bimanual control of no more than two robotic instruments at a time; (DiMaio − Fig. 2 [Col. 4 ll. 34-38] The master input devices 107 and 108 may include any one or more of a variety of input devices such as joysticks, gloves, trigger-guns, hand-operated controllers, or the like. [Col. 8 ll. 12-55] FIG. 2 illustrates, as an example, a block diagram of the LUS robotic surgical system 100. In this system, there are two master input devices 107 and 108. When control switch mechanism 231 is placed in the first mode, it causes master controller 202 to communicate with slave controller 203 so that manipulation of the master input 107 by the surgeon results in corresponding movement of tool 138 by slave arm 121, while the endoscope 140 is locked in position. Similarly, when control switch mechanism 231 is placed in the first mode, it causes master controller 222 to communicate with slave controller 223 so that manipulation of the master input 108 by the surgeon results in corresponding movement of tool 139 by slave arm 122.) In simple wording, in the first (normal) mode, tool 1 and tool 2 are operable for moving while the endoscope (another tool) is locked in position.

wherein: the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to automatically control operation of the first robotic instrument in the surgical space; (DiMaio − [Col. 12 ll. 16-22] Fig. 1 element 139; FIG. 6 illustrates, as an example, a flow diagram of a method for automatically moving the LUS probe 150 to a position and orientation associated with a clickable thumbnail upon command to do so by a surgeon while performing a minimally invasive surgical procedure using tool 139)

and the automatically controlling operation of the first robotic instrument is performed while a second robotic instrument and a third robotic instrument attached to the computer-assisted surgical system are bimanually teleoperated by a user of the computer-assisted surgical system. (DiMaio − [Col. 12 ll. 16-22] Fig. 1 element 139; FIG. 6, quoted above. [Col. 8 ll. 12-55] FIG. 2, quoted above.) The auxiliary controller 242 causes the LUS probe 150 to move to a position automatically.

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Azizian, McDowall, and DiMaio, as each invention is in the same field of visual computer-assisted systems for minimally invasive computer-assisted surgery. One of ordinary skill in the art would have been motivated to make these modifications to help improve laparoscopic surgery in the confined space of the body and to avoid damaging tissue in small spaces.

Regarding dependent claim 2, which depends on claim 1, Azizian teaches: wherein: the obtaining of the one or more anatomical characteristics includes deriving the one or more anatomical characteristics based on one or more data streams associated with the surgical space; (Azizian − [0034-0037] [0034] For example, ultrasound images can provide intraoperative information about anatomical structures beneath the surface of organs that are visible via an endoscope.
[0036] a 3D visualization using 2D images such as the 2D ultrasound slices, [0037] provide 2D image slices of the tissue they are in touch with)

and the one or more data streams are configured to provide at least one of imaging data, kinematics data, procedural context data, or user input data associated with the surgical space. (Azizian − [0034-0037] [0036] In such cases, the acquired images may be aligned with respect to an underlying image (e.g., an endoscope image) based on the location information associated with the individual images. The alignment can be calculated via an image registration process that includes transforming the sets of data corresponding to the acquired images into one coordinate system based on location information corresponding to the images. [0052] In robot-assisted MIS, tracking of ultrasound probes held by a robotic arm can be improved via tracking of the robotic arm using kinematic calculations.)

Regarding dependent claim 3, which depends on claim 2, Azizian teaches: wherein the one or more anatomical characteristics include at least one of depth map data, surface contour data, or three-dimensional (3D) tissue position data associated with the surgical space. (Azizian − [0034-0037] [0034] For example, ultrasound images can provide intraoperative information about anatomical structures beneath the surface of organs that are visible via an endoscope. [0036-0037] a 3D visualization using 2D images such as the 2D ultrasound slices;)

Regarding dependent claim 4, which depends on claim 1, Azizian teaches: wherein: the processor is further configured to execute the instructions to apply, based on the one or more operating characteristics of the device and the one or more anatomical characteristics associated with the surgical space, an operating constraint to the device located in the surgical space; (Azizian − [0052], the constant-contact-force motion constraint quoted above with respect to claim 1) the operating constraint is associated with constraining movement of the device in the surgical space; (Azizian − [0052], id.) and the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to change a pose of the device from a first pose to a second pose (Azizian − [0051] The user-input can also include instructions to change an angle of visualization and/or instructions to activate a virtual tool. For example, a surgeon can change a viewpoint to visualize the set of images from a different location.) and maintain the device in the second pose during at least part of a surgical procedure performed in the surgical space. (Azizian − [0051], id. The viewpoint is maintained after the change of the angle of visualization.)

Regarding dependent claim 5, which depends on claim 1, Azizian teaches: wherein: the processor is further configured to execute the instructions to apply, based on the one or more operating characteristics of the device and the one or more anatomical characteristics associated with the surgical space, an operating constraint to the device located in the surgical space; (Azizian − [0052], the constant-contact-force motion constraint quoted above with respect to claim 1) the operating constraint is associated with constraining movement of the device in the surgical space; (Azizian − [0052], id.) and the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to automatically move the device in the surgical space based on movement of a robotic instrument to which the device is not attached. (Azizian − [0052] virtual viscosity and motion damping may be applied to an arm holding an imaging transducer such as an ultrasound probe. This may result in slower or smoother motion, which in turn may provide better trail visualization.)
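The "constant contact force under a motion constraint" behavior that the rejection repeatedly draws from Azizian [0052] is, in control terms, a force-regulation loop. The sketch below is a hypothetical proportional controller against a toy linear-spring tissue model; the gain, stiffness, and force setpoint are invented for illustration and appear in neither reference:

```python
# Hypothetical proportional force-regulation loop: advance a probe along
# the contact normal until the measured contact force settles at a
# setpoint. Tissue is modeled as a linear spring (force = k * indentation).

TISSUE_STIFFNESS = 0.5   # N/mm, invented
TARGET_FORCE = 2.0       # N, invented setpoint (the "constant contact force")
GAIN = 0.8               # mm of motion per N of force error, invented

def measured_force(indentation_mm: float) -> float:
    """Toy force sensor: spring-like tissue, zero force before contact."""
    return max(0.0, TISSUE_STIFFNESS * indentation_mm)

indentation = 0.0  # probe starts just touching the surface
for step in range(50):
    error = TARGET_FORCE - measured_force(indentation)
    indentation += GAIN * error  # move probe along the contact normal

print(round(measured_force(indentation), 3))  # settles at 2.0 N
```

The loop converges because each step shrinks the force error by a constant factor (here 1 − GAIN·k = 0.6); in a real arm the same error signal would drive the joint controllers rather than a scalar indentation.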
Regarding dependent claim 6, which depends on claim 1, Azizian teaches: wherein: the processor is further configured to execute the instructions to apply, based on the one or more operating characteristics of the device and the one or more anatomical characteristics associated with the surgical space, an operating constraint to the device located in the surgical space; (Azizian − [0052], the constant-contact-force motion constraint quoted above with respect to claim 1) the operating constraint is associated with constraining movement of the device in the surgical space; (Azizian − [0052], id.)

Azizian does not explicitly teach: automatically maintaining the device at a rigid offset with respect to a robotic instrument.

However, DiMaio teaches: and the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to automatically maintain the device at a rigid offset with respect to a robotic instrument to which the device is not attached. (DiMaio − [0085] The processor 102 may generate a virtual fixture, such as a guidance virtual fixture or a forbidden region virtual fixture. To generate the virtual fixture, local kinematic constraints on the slave arm manipulating the tool may be specified by providing a table of constraints. Generally, a virtual fixture can limit movement of a surgical instrument or tool.)
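The forbidden-region virtual fixture DiMaio [0085] describes can be approximated as a geometric clamp on commanded positions. The sketch below keeps a commanded tool tip out of a spherical no-go region; the geometry and radius are invented for illustration and are not taken from DiMaio:

```python
# Hypothetical forbidden-region virtual fixture: project any commanded
# tool position that falls inside a protected sphere back onto its surface.
import math

def apply_forbidden_region(cmd, center, radius):
    """Clamp a commanded 3D point so it never enters the forbidden sphere."""
    d = [c - o for c, o in zip(cmd, center)]
    dist = math.sqrt(sum(v * v for v in d))
    if dist >= radius:
        return cmd                      # command is safe; pass it through
    if dist == 0.0:                     # degenerate: command at the center
        return (center[0] + radius, center[1], center[2])
    scale = radius / dist               # push out to the sphere surface
    return tuple(o + v * scale for o, v in zip(center, d))

center, radius = (0.0, 0.0, 0.0), 10.0  # protected region (mm), invented
print(apply_forbidden_region((3.0, 4.0, 0.0), center, radius))   # clamped out
print(apply_forbidden_region((30.0, 0.0, 0.0), center, radius))  # unchanged
```

Running this clamp on every servo command is one concrete way a "table of constraints" on the slave arm can limit instrument movement, which is the behavior the examiner maps onto the rigid-offset limitation.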
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Azizian, McDowall, and DiMaio, as each invention is in the same field of visual computer-assisted systems for minimally invasive computer-assisted surgery. One of ordinary skill in the art would have been motivated to make these modifications to help improve laparoscopic surgery in the confined space of the body and to avoid damaging tissue in small spaces.

Regarding dependent claim 7, which depends on claim 1, Azizian teaches: wherein: the processor is further configured to execute the instructions to apply, based on the one or more operating characteristics of the device and the one or more anatomical characteristics associated with the surgical space, an operating constraint to the device located in the surgical space; (Azizian − [0052], the constant-contact-force motion constraint quoted above with respect to claim 1) the operating constraint is associated with constraining movement of the device in the surgical space; (Azizian − [0052], id.) and the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to maintain at least one of a contact pressure or a contact angle of the device with respect to a surface of an object in the surgical space. (Azizian − [0052] the control mechanism for the arm may include a force sensor that can be operated under a motion constraint, for example, to maintain a constant contact force between an imaging transducer (e.g., an ultrasound probe) and an anatomical feature (e.g., soft tissue) at the surgical site.)

Regarding dependent claim 10, which depends on claim 1, Azizian teaches: wherein: the device is a non-robotic device that is engaged by a robotic instrument attached to the computer-assisted surgical system; and the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to automatically control, by way of the robotic instrument, operation of the non-robotic device in the surgical space. (Azizian − [0034] FIG. 4A shows a laparoscopic ultrasound probe 400, and FIG. 4B shows a drop-in ultrasound probe 450 that can be grasped by a laparoscopic tool 460. Examiner Notes: The laparoscopic tool 460 has a robotic arm and is a device. The ultrasound probe 450 is also a device. [0052] tracking of the robotic arm using joint encoders and/or kinematic calculation. Various techniques may be used for controlling the robotic arm. [0052-0053] the constant-contact-force motion constraint quoted above with respect to claim 1; constant force is an automatic operation of the device.)
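The "tracking of the robotic arm using joint encoders and/or kinematic calculation" cited from Azizian [0052] is ordinary forward kinematics. A minimal planar two-link sketch, with invented link lengths, shows how encoder angles map to a tool-tip position; this is an illustration of the general technique, not of Azizian's actual arm:

```python
# Hypothetical forward kinematics for a planar 2-link arm: joint-encoder
# angles -> tool-tip (x, y). Link lengths are invented for illustration.
import math

L1, L2 = 300.0, 250.0  # link lengths in mm, invented

def tool_tip(theta1: float, theta2: float) -> tuple:
    """Forward kinematics; theta2 is measured relative to link 1."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return (x, y)

# Fully extended along x: both encoders read zero.
print(tool_tip(0.0, 0.0))            # (550.0, 0.0)
# Elbow bent 90 degrees: tip is about 300 mm out and 250 mm up.
print(tool_tip(0.0, math.pi / 2))
```

Evaluating this map at every encoder sample is what lets the system know the probe's pose without external trackers, which is why the examiner reads encoder-based kinematic tracking as an "operating characteristic" of the device.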
Regarding dependent claim 11, which depends on claim 1, Azizian teaches: wherein: the device is a non-robotic imaging device that is engaged by a robotic instrument attached to the computer-assisted surgical system, the non-robotic imaging device configured to capture imagery of an object in the surgical space; (Azizian − [0034] FIG. 4A shows a laparoscopic ultrasound probe 400, and FIG. 4B shows a drop-in ultrasound probe 450 that can be grasped by a laparoscopic tool 460. Examiner Notes: The laparoscopic tool 460 has a robotic arm and is a device. The ultrasound probe 450 is also a device.) and the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to automatically control the non-robotic imaging device to capture the imagery of the object. (Azizian − [0021] [0034] capturing multiple 2D slices of a 3D structure using an imaging device such as an ultrasound probe.)

Regarding dependent claim 12, which depends on claim 11, Azizian teaches: wherein the non-robotic imaging device is configured to contact a surface of the object to capture the imagery of the object; (Azizian − [0021] capturing multiple 2D slices of a 3D structure using an imaging device such as an ultrasound probe [0034] For example, ultrasound images can provide intraoperative information about anatomical structures beneath the surface of organs that are visible via an endoscope. Specialized ultrasound images such as Doppler images may be helpful in visualizing temporal information such as blood flow or tissue perfusion) and the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to maintain a state of contact of the non-robotic imaging device with the surface of the object.
(Azizian − [0052] the control mechanism for the arm may include a force sensor that can be operated under a motion constraint, for example, to maintain a constant contact force between an imaging transducer (e.g., an ultrasound probe) and an anatomical feature (e.g., soft tissue) at the surgical site. Motion constraint.)

Regarding dependent claim 13, which depends on claim 1, Azizian teaches: wherein the maintaining of the state of contact includes: determining, based on the one or more anatomical characteristics, that there will be a change in at least one of a contact pressure or a contact angle of the non-robotic imaging device with respect to the object as the non-robotic imaging device moves along the surface of the object; (Azizian − [0052] the control mechanism for the arm may include a force sensor that can be operated under a motion constraint, for example, to maintain a constant contact force between an imaging transducer (e.g., an ultrasound probe) and an anatomical feature (e.g., soft tissue) at the surgical site. Automatically control the constant contact force between the ultrasound probe and the anatomical feature (e.g., soft tissue). Constant force is an automatic operation of the device.) and automatically moving, in response to the determining that there will be the change, the non-robotic imaging device to maintain at least one of an amount of contact pressure or the contact angle of the non-robotic imaging device with respect to the object (Azizian − [0052] the control mechanism for the arm may include a force sensor that can be operated under a motion constraint, for example, to maintain a constant contact force between an imaging transducer (e.g., an ultrasound probe) and an anatomical feature (e.g., soft tissue) at the surgical site. Automatically control the constant contact force between the ultrasound probe and the anatomical feature (e.g., soft tissue). Constant force is an automatic operation of the device.
) while the non-robotic imaging device moves along the surface of the object to capture the imagery of the object. (Azizian − [0052] virtual viscosity and motion damping may be applied to an arm holding an imaging transducer such as an ultrasound probe. This may result in slower or smoother motion, which in turn may provide better trail visualization.)

Regarding dependent claim 15, which depends on claim 1, Azizian teaches: wherein the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to: generate a motion path for the device to follow in the surgical space; and automatically move the device along the motion path during a surgical procedure. (Azizian − [0052] the control mechanism for the arm may include a force sensor that can be operated under a motion constraint, for example, to maintain a constant contact force between an imaging transducer (e.g., an ultrasound probe) and an anatomical feature (e.g., soft tissue) at the surgical site. Virtual viscosity and motion damping may be applied to an arm holding an imaging transducer such as an ultrasound probe. This may result in slower or smoother motion, which in turn may provide better trail visualization.)

Regarding independent claim 16, Azizian teaches: A system comprising: a memory storing instructions; (Azizian − [0053, 0056] a storage device, and one or more memory device) and a processor communicatively coupled to the memory and configured to execute the instructions to: (Azizian − [0053] various modifications (hereinafter "the functions") can be implemented, at least in part, one or more data processing apparatus, storage device; [0056] a processor will receive instructions and data from a read-only memory or a random access memory or both. Components of a computer include a processor for executing instructions and one or more memory devices for storing instructions and data.)
obtain one or more operating characteristics of a first robotic instrument in a surgical space, (Azizian − [0034] FIG. 4A shows a laparoscopic ultrasound probe 400, and FIG. 4B shows a laparoscopic tool 460. Examiner Notes: The laparoscopic tool 460 has a robotic arm and is a first robotic instrument.) wherein the first robotic instrument, a second robotic instrument, and a third robotic instrument are each attached to a computer-assisted surgical system, (Azizian − [0011] [0028] FIG. 1 is a perspective view of an example patient-side cart of a computer-assisted tele-operated surgery system. A plurality of robotic instruments is shown in Fig. 1. [0028] robotic manipulator arm assemblies 120, 130, 140, and 150 of the patient-side cart 100) and the second and third robotic instruments are configured to be bimanually teleoperated by a user of the computer-assisted surgical system; (Azizian − [0012] FIG. 2 is a front view of an example surgeon console of a computer-assisted tele-operated surgery system. [0028] The surgeon console 40 also includes left and right input devices 41, 42 that the user may grasp respectively with his/her left and right hands to manipulate devices (e.g., surgical instruments) being held by the robotic manipulator arm assemblies 120, 130, 140, and 150 of the patient-side cart 100 in preferably six or more degrees-of-freedom ("DOF"). Foot pedals 44 with toe and heel controls are provided on the surgeon console 40 so the user may control movement and/or actuation of devices associated with the foot pedals.) obtain one or more anatomical characteristics associated with the surgical space; (Azizian − [0034] For example, ultrasound images can provide intraoperative information about anatomical structures beneath the surface of organs that are visible via an endoscope.
Specialized ultrasound images such as Doppler images may be helpful in visualizing temporal information such as blood flow or tissue perfusion) and direct the computer-assisted surgical system to automatically perform, without requiring further input from the user of the computer-assisted surgical system and based on the one or more operating characteristics of the first robotic instrument and the one or more anatomical characteristics associated with the surgical space, an operation with the first robotic instrument while the user of the computer-assisted surgical system bimanually teleoperates the second and third robotic instruments. (Azizian − [0052] the control mechanism for the arm may include a force sensor that can be operated under a motion constraint, for example, to maintain a constant contact force between an imaging transducer (e.g., an ultrasound probe) and an anatomical feature (e.g., soft tissue) at the surgical site. Automatically control the constant contact force between the ultrasound probe and the anatomical feature (e.g., soft tissue). Constant force is an automatic operation of the device.)

Azizian does not explicitly teach: depth map data of the surgical space. However, McDowall teaches: obtain one or more anatomical characteristics associated with the surgical space, (McDowall − [0005] A tissue surface depth map is produced for a tissue surface within a field of view of a stereoscopic viewing system. [0041] FIG. 11 is an illustrative drawing representing a perspective view in which a tissue structure image 1102A is overlaid with a tissue surface depth map 1100.) the one or more anatomical characteristics including depth map data of the surgical space; (McDowall − [0005] [0019] [0041] FIG. 11 is an illustrative drawing representing a perspective view in which a tissue structure image 1102A is overlaid with a tissue surface depth map 1100.)
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Azizian and McDowall, as each invention is in the same field of visual computer-assisted systems for minimally invasive computer-assisted surgery. One of ordinary skill in the art would have been motivated to make these modifications to help improve tracking of robotic surgical equipment in a confined space to avoid accidental collision of robotic surgical equipment during surgery, thereby reducing the risk of accidental damage to patient anatomy.

Azizian does not explicitly teach: wherein master controls of the computer-assisted surgical system are configured for bimanual control of no more than two robotic instruments at a time. However, DiMaio teaches: wherein master controls of the computer-assisted surgical system are configured for bimanual control of no more than two robotic instruments at a time, (DiMaio − Fig. 2 [Col. 4 ll. 34-38] The master input devices 107 and 108 may include any one or more of a variety of input devices such as joysticks, gloves, trigger-guns, hand-operated controllers, or the like. [Col. 8 ll. 12-55] FIG. 2 illustrates, as an example, a block diagram of the LUS robotic surgical system 100. In this system, there are two master input devices 107 and 108. When control switch mechanism 231 is placed in the first mode, it causes master controller 202 to communicate with slave controller 203 so that manipulation of the master input 107 by the surgeon results in corresponding movement of tool 138 by slave arm 121, while the endoscope 140 is locked in position. Similarly, when control switch mechanism 231 is placed in the first mode, it causes master controller 222 to communicate with slave controller 223 so that manipulation of the master input 108 by the surgeon results in corresponding movement of tool 139 by slave arm 122.)
In simple terms, in the first (normal) mode, tool 1 and tool 2 are operable for movement while the endoscope (another tool) is locked in position.

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Azizian, McDowall, and DiMaio, as each invention is in the same field of visual computer-assisted systems for minimally invasive computer-assisted surgery. One of ordinary skill in the art would have been motivated to make these modifications to help improve laparoscopic surgery in a confined space of the body and to avoid damaging tissue in small spaces.

Regarding independent claim 17, claim 17 recites similar/same technical features/limitations and is rejected under the same rationale.

Regarding dependent claim 18, which depends on claim 17, Azizian teaches: wherein: the device is a non-robotic imaging device configured to contact a surface of an object in the surgical space to capture imagery of the object; (Azizian − [0021] capturing multiple 2D slices of a 3D structure using an imaging device such as an ultrasound probe [0034] For example, ultrasound images can provide intraoperative information about anatomical structures beneath the surface of organs that are visible via an endoscope. Specialized ultrasound images such as Doppler images may be helpful in visualizing temporal information such as blood flow or tissue perfusion) and the directing of the computer-assisted surgical system to automatically perform the operation includes directing the computer-assisted surgical system to maintain a state of contact of the non-robotic imaging device with the surface of the object.
(Azizian − [0052] the control mechanism for the arm may include a force sensor that can be operated under a motion constraint, for example, to maintain a constant contact force between an imaging transducer (e.g., an ultrasound probe) and an anatomical feature (e.g., soft tissue) at the surgical site. Motion constraint.)

Regarding dependent claim 19, which depends on claim 18, Azizian teaches: wherein the maintaining a state of contact includes the computer-assisted surgical system automatically adjusting at least one of contact pressure or a contact angle of the non-robotic imaging device with respect to the object while the non-robotic imaging device captures the imagery of the object. (Azizian − [0052] ultrasound probe 450; the control mechanism for the arm may include a force sensor that can be operated under a motion constraint, for example, to maintain a constant contact force between an imaging transducer (e.g., an ultrasound probe) and an anatomical feature (e.g., soft tissue) at the surgical site.)

Claim(s) 14 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Azizian and McDowall as applied to claim 13 above, and further in view of Richmond (US 2017/0143429 A1, published May 25, 2017).

Regarding dependent claim 14, which depends on claim 13, Azizian teaches: wherein the automatically moving of the non-robotic imaging device includes: analyzing an image captured by the non-robotic imaging device while the non-robotic imaging device is used to capture the imagery of the object; (Azizian − [0021] capturing multiple 2D slices of a 3D structure using an imaging device such as an ultrasound probe [0034] For example, ultrasound images can provide intraoperative information about anatomical structures beneath the surface of organs that are visible via an endoscope.
Specialized ultrasound images such as Doppler images may be helpful in visualizing temporal information such as blood flow or tissue perfusion) and automatically adjusting the at least one of the contact pressure or the contact angle based on the image capture deficiency to maintain the at least one of the amount of contact pressure or the contact angle. (Azizian − [0052] virtual viscosity and motion damping may be applied to an arm holding an imaging transducer such as an ultrasound probe. This may result in slower or smoother motion, which in turn may provide better trail visualization.)

Azizian does not explicitly teach: determining, based on the analyzing of the image captured by the non-robotic imaging device, that the image includes an image capture deficiency. However, Richmond teaches: determining, based on the analyzing of the image captured by the non-robotic imaging device, that the image includes an image capture deficiency; (Richmond − [0155] In a similar, closed-loop manner, color and white balance of the imaging device output can be determined through suitable imaging processing methods. The images can be analyzed to automatically optimize the color balance, white balance, dynamic range and illumination uniformity (spatial uniformity).)

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Azizian, McDowall, and Richmond, as each invention is in the same field of visual computer-assisted systems for minimally invasive computer-assisted surgery. One of ordinary skill in the art would have been motivated to make these modifications to help improve tracking of robotic surgical equipment in a confined space to avoid accidental collision of robotic surgical equipment during surgery, thereby reducing the risk of accidental damage to patient anatomy.
Regarding dependent claim 20, which depends on claim 19, Azizian teaches: wherein the automatically adjusting of the at least one of the contact pressure or the contact angle includes: analyzing an image captured by the non-robotic imaging device while the non-robotic imaging device is used to capture the imagery of the object; (Azizian − [0021] capturing multiple 2D slices of a 3D structure using an imaging device such as an ultrasound probe [0034] For example, ultrasound images can provide intraoperative information about anatomical structures beneath the surface of organs that are visible via an endoscope. Specialized ultrasound images such as Doppler images may be helpful in visualizing temporal information such as blood flow or tissue perfusion) and automatically adjusting the at least one of the contact pressure or the contact angle based on the image capture deficiency. (Azizian − [0052] virtual viscosity and motion damping may be applied to an arm holding an imaging transducer such as an ultrasound probe. This may result in slower or smoother motion, which in turn may provide better trail visualization.)

Azizian does not explicitly teach: determining, based on the analyzing of the image captured by the non-robotic imaging device, that the image includes an image capture deficiency. However, Richmond teaches: determining, based on the analyzing of the image captured by the non-robotic imaging device, that the image includes an image capture deficiency; (Richmond − [0155] In a similar, closed-loop manner, color and white balance of the imaging device output can be determined through suitable imaging processing methods. The images can be analyzed to automatically optimize the color balance, white balance, dynamic range and illumination uniformity (spatial uniformity).)
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined the teachings of Azizian, McDowall, and Richmond, as each invention is in the same field of visual computer-assisted systems for minimally invasive computer-assisted surgery. One of ordinary skill in the art would have been motivated to make these modifications to help improve tracking of robotic surgical equipment in a confined space to avoid accidental collision of robotic surgical equipment during surgery, thereby reducing the risk of accidental damage to patient anatomy.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CARL E BARNES JR, whose telephone number is (571) 270-3395. The examiner can normally be reached Monday-Friday, 9am-6pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Stephen Hong, can be reached at (571) 272-4124. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CARL E BARNES JR/
Examiner, Art Unit 2178

/STEPHEN S HONG/
Supervisory Patent Examiner, Art Unit 2178

Prosecution Timeline

Sep 13, 2022
Application Filed
Feb 08, 2025
Non-Final Rejection — §103
Apr 30, 2025
Examiner Interview Summary
Apr 30, 2025
Applicant Interview (Telephonic)
May 07, 2025
Response Filed
May 27, 2025
Final Rejection — §103
Jul 10, 2025
Applicant Interview (Telephonic)
Jul 10, 2025
Examiner Interview Summary
Jul 29, 2025
Response after Non-Final Action
Aug 27, 2025
Request for Continued Examination
Sep 06, 2025
Response after Non-Final Action
Nov 15, 2025
Non-Final Rejection — §103
Jan 27, 2026
Examiner Interview Summary
Jan 27, 2026
Applicant Interview (Telephonic)
Feb 10, 2026
Response Filed
Mar 04, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12584932
SLIDE IMAGING APPARATUS AND A METHOD FOR IMAGING A SLIDE
2y 5m to grant Granted Mar 24, 2026
Patent 12541640
COMPUTING DEVICE FOR MULTIPLE CELL LINKING
2y 5m to grant Granted Feb 03, 2026
Patent 12536464
SYSTEM FOR CONSTRUCTING EFFECTIVE MACHINE-LEARNING PIPELINES WITH OPTIMIZED OUTCOMES
2y 5m to grant Granted Jan 27, 2026
Patent 12530765
SYSTEMS AND METHODS FOR CALCIUM-FREE COMPUTED TOMOGRAPHY ANGIOGRAPHY
2y 5m to grant Granted Jan 20, 2026
Patent 12530523
METHOD, APPARATUS, SYSTEM, AND COMPUTER PROGRAM FOR CORRECTING TABLE COORDINATE INFORMATION
2y 5m to grant Granted Jan 20, 2026
Based on the 5 most recent grants by this examiner.

Prosecution Projections

5-6
Expected OA Rounds
32%
Grant Probability
57%
With Interview (+25.2%)
4y 4m
Median Time to Grant
High
PTA Risk
Based on 202 resolved cases by this examiner. Grant probability derived from career allow rate.
