Prosecution Insights
Last updated: April 19, 2026
Application No. 18/255,314

SYSTEMS AND METHODS FOR PLANNING A MEDICAL ENVIRONMENT

Non-Final OA (§101, §103)
Filed: May 31, 2023
Examiner: EDOUARD, PATRICIA KELLY
Art Unit: 3682
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Intuitive Surgical Operations, Inc.
OA Round: 3 (Non-Final)
Grant Probability: 13% (At Risk)
OA Rounds: 3-4
To Grant: 2y 11m
With Interview: 36%

Examiner Intelligence

Grants only 13% of cases
Career Allow Rate: 13% (6 granted / 45 resolved; -38.7% vs TC avg)

Strong +23% interview lift
Interview Lift: +23.2% (resolved cases with interview vs without)

Typical timeline
Avg Prosecution: 2y 11m (29 currently pending)

Career history
Total Applications: 74 (across all art units)
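The headline percentages above are simple ratios over the examiner's resolved cases. A minimal sketch (the function name and rounding choices are illustrative assumptions, not the dashboard's actual code) shows how the figures relate:

```python
# Illustrative reconstruction of the examiner-card arithmetic; names and
# rounding choices are assumptions, not the analytics vendor's code.

def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

career = allow_rate_pct(6, 45)   # 6 granted out of 45 resolved
print(round(career))             # displayed as 13%

# Interview lift: grant probability with an interview (36%) minus the
# unrounded career rate; close to the +23.2% shown on the card.
lift = 36.0 - career
print(round(lift, 1))
```

The small gap between this back-of-envelope lift and the displayed +23.2% suggests the dashboard computes the baseline from unrounded inputs.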

Statute-Specific Performance

§101: 39.9% (-0.1% vs TC avg)
§103: 42.9% (+2.9% vs TC avg)
§102: 7.6% (-32.4% vs TC avg)
§112: 8.8% (-31.2% vs TC avg)
Deltas are measured against the Tech Center average estimate • Based on career data from 45 resolved cases
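Because each delta is measured against the Tech Center average estimate, that average can be backed out of every row (avg = rate − delta) as a quick consistency check. The snippet is illustrative only:

```python
# Back out the Tech Center average estimate from each statute's
# examiner rate and its displayed delta: avg = rate - delta.
stats = {
    "§101": (39.9, -0.1),
    "§103": (42.9, +2.9),
    "§102": (7.6, -32.4),
    "§112": (8.8, -31.2),
}
tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(tc_avg)  # every statute backs out to the same 40.0% estimate
```

All four rows reconcile to a single ~40% Tech Center average, which is consistent with the deltas being drawn from one shared baseline.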

Office Action

§101 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/16/2025 has been entered.

Status of Amendments

Claims 1-6, 10-15, 17-24 are currently pending in this case and have been examined and addressed below. This communication is a Final Rejection in response to the Amendment to the Claims and Remarks filed on 10/16/2025. Claims 1, 6, and 23-24 are amended claims. Claims 2-5, 10-14, 17, 19-22 are original claims. Claims 15 and 18 are previously presented. Claims 7-9, 16, and 25 have been cancelled and will not be considered at this time.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-6, 10-15, and 17-24 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e. an abstract idea) without significantly more.

Step 1 – Statutory Categories of Invention: Claims 1-6, 10-15, and 17-24 are drawn to a system, which is a statutory category of invention.
Step 2A – Judicial Exception Analysis, Prong 1: Independent claim 1 recites a system comprising: receive spatial information for a medical environment; determine a component for use in the medical environment; receive an indicator of an operator-selected mode of operation from the plurality of modes of operation of the component; receive a set of operation constraints for the component associated with the operator-selected mode of operation of the component; generate an environment preparation plan including a suggested travel path of the component or personnel for implementing the environment preparation plan, wherein the suggested travel path is based on the set of operation constraints for the operator-selected mode of operation of the component and the spatial information; during an implementation of the environment preparation plan, receive real-time movement data of the component or the personnel from the one or more sensors; and dynamically update the suggested travel path based on the real-time movement data to minimize travel times and path intersections for the component or personnel within the medical environment during the implementation.

These steps amount to certain methods of organizing human activity, which include functions relating to managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions) (MPEP § 2106.04(a)(2)(II)(C), citing the abstract idea grouping for methods of organizing human activity for managing personal behavior or relationships or interactions between people; note also MPEP § 2106.04(a)(2)(II), stating that certain activity between a person and a computer may fall within the “certain methods of organizing human activity” grouping). The claims recite collecting environmental data to determine an optimal environment preparation plan including a suggested travel path of the component or personnel.
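The recited plan/replan loop (generate a constrained travel path, then update it from real-time movement data) can be sketched generically. The grid model, BFS planner, and every name below are illustrative assumptions for reading the claim, not the applicant's disclosed implementation:

```python
# Hypothetical sketch of the claimed loop: breadth-first search over a floor
# grid, with cells blocked by operation constraints (e.g., a component's
# spatial envelope) and by real-time movement data. Not the applicant's method.
from collections import deque

def plan_path(width, height, blocked, start, goal):
    """Shortest travel path on a grid, avoiding blocked cells (BFS)."""
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct the path back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < width and 0 <= nxt[1] < height
                    and nxt not in blocked and nxt not in prev):
                prev[nxt] = cell
                queue.append(nxt)
    return None                   # no feasible path under current constraints

# Environment preparation plan: route around a component's spatial envelope.
blocked = {(1, 1)}
path = plan_path(4, 4, blocked, (0, 0), (3, 3))

# Real-time movement data reports personnel at (2, 1): dynamically replan.
blocked.add((2, 1))
path = plan_path(4, 4, blocked, (0, 0), (3, 3))
```

A fuller planner would weight edges by travel time and jointly penalize intersections between simultaneous component and personnel routes, as the claim language contemplates.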
These steps organize the medical environment and healthcare providers to determine the optimal location and travel paths of healthcare providers. These steps organize patients and hospital staff by determining optimal scenarios to increase emergency department production. Because these limitations determine the optimal travel to implement following the analysis of medical environment data, they constitute the management of personal behavior on the part of healthcare providers and hospital administrative staff. Accordingly, the claims fall under the “Certain Methods of Organizing Human Activity” grouping of abstract ideas and, thus, recite an abstract idea.

Step 2A – Judicial Exception Analysis, Prong 2: This judicial exception is not integrated into a practical application because the additional elements within the claims only amount to instructions to implement the judicial exception using a computer [MPEP 2106.05(f)]. The claims recite the additional elements of a processor and a memory having computer readable instructions stored thereon. These elements are recited at a high level of generality such that they amount to mere instructions to apply the exception, because this is an example of applying the abstract idea by use of a general-purpose computer, which does not integrate the abstract idea into a practical application. Claim 1 recites one or more sensors. The specification and the instant claim do not provide any indication that the one or more sensors are being utilized beyond their ordinary capacity. Therefore, this step is directed to invoking a device merely as a tool to perform an existing process and does not integrate a judicial exception into a practical application or provide significantly more (MPEP 2106.05(f)(2)).
Claim 1 recites display, on the display device, a plurality of modes of operation of the component; display the environment preparation plan including the suggested travel path on the display device; and display the dynamically updated suggested travel path on the display device. The specification and the instant claim do not provide any indication that the display device is being utilized beyond its ordinary capacity. The specification recites: “As shown in FIG. 3A, a user display and input device 300, such as a touch screen display on a mobile phone, may provide a menu 302 of networked medical components such as a draped robot-assisted manipulator assembly 304 and an operator console 306 that a user may choose to deploy in the medical environment,” (Pg. 4, Lines 24-27) and “The environment preparation plan may be displayed on a display in the medical environment such as the display system 416. Alternatively or additionally, the environment preparation plan may be displayed on a mobile device in the medical environment or proximate to the medical environment. The mobile device may include an application that displays the environment preparation plan,” (Pg. 13, Lines 5-9). Therefore, this step is directed to invoking a device merely as a tool to perform an existing process and does not integrate a judicial exception into a practical application or provide significantly more (MPEP 2106.05(f)(2)). The above claims, as a whole, are therefore directed to an abstract idea.

Step 2B – Additional Elements that Amount to Significantly More: The present claims do not include additional elements that are sufficient to amount to more than the abstract idea because the additional elements or combination of elements amount to no more than a recitation of instructions to implement the abstract idea on a computer.
As discussed above with respect to integration of the abstract idea into a practical application, the claims recite the additional elements of a processor and a memory having computer readable instructions stored thereon. Thus, taken alone, the additional elements do not amount to significantly more than the above-identified judicial exception. Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. Their collective functions merely provide conventional computer implementation. Claim 1 recites one or more sensors. The specification recites “Kinematic information may also include dynamic kinematic information such as the range of motion of joints in the teleoperational assembly, velocity or acceleration information, and/or resistive forces. The structural or dynamic kinematic constraint information may be generated by sensors in the teleoperational assembly that measure, for example, manipulator arm configuration, medical instrument configuration, joint configuration, component displacement, component velocity, and/or component acceleration. Sensors may include position sensors such as electromagnetic (EM) sensors, shape sensors such as fiber optic sensors, and/or actuator position sensors such as resolvers, encoders, and potentiometers (pg. 15 Lines 1-8).” The specification and the instant claim do not provide any indication that the sensor(s) is being utilized beyond its ordinary capacity. Therefore, this step is directed to invoking a device merely as a tool to perform an existing process and does not integrate a judicial exception into a practical application or provide significantly more (MPEP 2106.05(f)(2)). Claim 1 recites display, on the display device, a plurality of modes of operation of the component and display the environment preparation plan on the display device. 
These limitations are only recited as a tool for performing steps of the abstract idea; therefore, the limitations amount to mere instructions to perform the abstract idea using a computer and are not sufficient to amount to significantly more than the abstract idea (see MPEP 2106.05(f) for additional guidance on “mere instructions to apply an exception”). For the reasons stated, these claims are consequently rejected under 35 U.S.C. § 101.

Analysis of Dependent Claims

Dependent claim 5 recites wherein the determined component is a robot-assisted manipulator assembly.
Dependent claim 6 recites wherein the operator-selected mode of operation is a preparation mode, a storage mode, or a servicing mode.
Dependent claim 10 recites wherein the set of operation constraints includes a spatial envelope for a range of motion of the component.
Dependent claim 11 recites wherein the set of operation constraints includes kinematic information for the component.
Dependent claim 12 recites wherein the set of operation constraints includes auxiliary component requirements.
Dependent claim 13 recites wherein the set of operation constraints includes utility access requirements for the component.
Dependent claim 14 recites wherein the set of operation constraints includes operation access requirements.
Dependent claim 15 recites wherein the set of operation constraints includes staffing or patient requirements for the operator-selected mode of operation.
Dependent claim 17 recites wherein generating the environment preparation plan includes comparing a distance between the component and a second component to a threshold distance.
Dependent claim 18 recites wherein generating the environment preparation plan includes comparing an access direction for the component to a predetermined access direction for the operator-selected mode of operation.
Dependent claim 19 recites wherein generating the environment preparation plan includes determining at least one auxiliary component for use in the medical environment.
Dependent claim 20 recites wherein generating the environment preparation plan includes providing a suggested configuration for the component in the medical environment.
Dependent claim 21 recites wherein generating the environment preparation plan includes receiving input from a remote advisory operator.
Dependent claim 23 recites wherein cause the system to evaluate the implementation of the environment preparation plan that was conducted in the medical environment by comparison of the physical implementation to the environment preparation plan.
Dependent claim 24 recites wherein cause the system to store the environment preparation plan for use or reference in a generation of a second environment preparation plan or to store the environment preparation plan with an evaluation indicator based on the implementation.

Each of these steps of the preceding dependent claims 5-6, 10-15, and 17-24 only serves to further limit or specify the features of independent claim 1, and hence they are nonetheless directed towards fundamentally the same abstract idea as the independent claim.

Dependent claim 2 recites wherein the spatial information is received from a camera in the medical environment. The specification recites: “The spatial information may be received from a mobile device such as a phone, tablet, camera, laptop or other portable measurement device that may measure, scan, image, or otherwise record spatial information about the medical environment from within or proximate to the medical environment” (Pg. 3, Lines 27-31) and “The spatial information source may include a camera, scanner, or other imaging device located in or capable of recording an image of the medical environment” (Pg. 8, Lines 17-18).
The specification and the instant claim do not provide any indication that the camera is being utilized beyond its ordinary capacity. Therefore, this step is directed to invoking a device merely as a tool to perform an existing process and does not integrate a judicial exception into a practical application or provide significantly more (MPEP 2106.05(f)(2)).

Dependent claim 3 recites wherein the spatial information is received from a three-dimensional depth mapping system. The specification recites: “The dimensions L1-L5 may be measured using a three-dimensional depth mapping system which may include a rangefinder, lidar system, camera, or other measurement tool that may a single purpose device or may be incorporated in to a mobile device such as a phone, tablet, or laptop” (Pg. 4, Lines 2-4) and “The spatial information source may include a lidar scanning system that may scan the environment to generate three-dimensional images using reflected laser light” (Pg. 8, Lines 22-24). The specification and the instant claim do not provide any indication that the three-dimensional depth mapping system is being utilized beyond its ordinary capacity. Therefore, this step is directed to invoking a device merely as a tool to perform an existing process and does not integrate a judicial exception into a practical application or provide significantly more (MPEP 2106.05(f)(2)).

Dependent claim 4 recites wherein the spatial information is received from a sensor system. The specification recites: “Other mapping systems, including sensor systems (e.g. electromagnetic position sensors, optical sensors) for tracking static or dynamic locations of components in the medical environment may be used” (Pg. 4, Lines 5-7). The specification and the instant claim do not provide any indication that the sensor system is being utilized beyond its ordinary capacity.
Therefore, this step is directed to invoking a device merely as a tool to perform an existing process and does not integrate a judicial exception into a practical application or provide significantly more (MPEP 2106.05(f)(2)).

Dependent claim 22 recites wherein cause the system to display the environment preparation plan on a display system. The display system is an additional element, which is mere instructions to apply the exception and does not provide a practical application or significantly more for the same reasons.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1-6, 10-15, and 17-24 is/are rejected under 35 U.S.C. 103 as being unpatentable over Fuerst (US 20210121233 A1) in view of Blondel (US 20200281667 A1) in view of Garcia Kilroy (US 20190005848 A1).

As per Claim 1, Fuerst teaches a system comprising: one or more sensors; ([Para. 0065] Sensor feedback can be generated by one or more sensors, including but not limited to: a camera, an infrared sensor, 3D scan cameras, robotic sensors, and human tracking devices. These sensors can be located on walls, in handheld devices, or integrated with surgical robotic equipment.) a display device; ([Para. 0048] The VR simulation 100 can be communicated or output through a display, for example, on a user console 120.) a processor; ([Para.
0022] processor of the console computer system) and a memory having computer readable instructions stored thereon, the computer readable instructions, when executed by the processor, cause the system to: ([Para. 0042] memory) receive spatial information for a medical environment; ([Para. 0065] Sensor feedback can be received at operation 284. The sensor feedback can define geometry of the physical operating room and/or positions of physical surgical equipment in the physical OR. The geometry of the physical operating room shows a shape and size of the room. The positions of the physical equipment can describe a location and orientation of equipment in the room (i.e. spatial information).) determine a component for use in the medical environment; ([Para. 0005] an arrangement of surgical robotic arms (i.e. component for use) in advance of a procedure. The arrangement (e.g., location and orientation) of the surgical robotic system (e.g., the surgical robotic arms, the surgical robotic platform, the user console, the control tower, and other surgical robotic equipment) can be planned and optimized for physical environments (e.g., operating rooms in a health facility).) receive a set of operation constraints for the component associated with the operator-selected mode of operation of the component; ([Para. 0020] The bedside operator 8 may also operate the system 1 in an “over the bed” mode, in which the beside operator 8 (user) is now at a side of the patient 6 and is simultaneously manipulating a robotically-driven tool (end effector as attached to the arm 4), e.g., with a handheld UID 14 held in one hand, and a manual laparoscopic tool. [Para. 0027] Workflow optimization can include determining workspace, tool placement, port locations, and setup of surgical robotic arms, to prepare for a surgical procedure (i.e. mode of operation). [Para. 0030] Arrange virtual surgical equipment (e.g., workflow and/or layout) based on one or more criteria and inputs (i.e.
operation constraints) such as, but not limited to, location of a workspace, reach and movements in a workspace, convenience, and risk of collisions. [Para. 0038] An optimization procedure 110 is performed to arrange the virtual objects (e.g., the patient, the robotic arm and tool, the platform, the user console, the control tower, and other surgical equipment) based on one or more inputs and changes to inputs. A user interface can allow a user to manually move a virtual object. [Para. 0050] The inputs can include changes to one or more of the following factors: the procedure type, the surgical workspace, the virtual patient, the tool position, and kinematics parameters.) Fuerst does not explicitly teach, however Blondel teaches generate an environment preparation plan including a suggested travel path of the component or personnel for implementing the environment preparation plan, wherein the suggested travel path is based on the set of operation constraints for the operator-selected mode of operation of the component and the spatial information; ([Para. 0115] The forces are measured and calculated by virtue of one or more sensors equipping the end of the robot arm 11 and/or each of its shafts. Geometric constraints (i.e. spatial information) can be integrated in the collaborative mode in order to restrict the movements of the robot arm 11 and thus facilitate the medical procedure (i.e. operator-selected mode of operation). [Para. 0118] Robotic device 10 has a man-machine interface device 19. The man-machine interface device 19 can be used when planning the medical procedure, in order to establish the trajectory of the medical instrument (i.e. suggested travel path of the component) 13, or to visualize the progress of the medical instrument 13 in the body of the patient 30, for example by displaying the real-time position or almost real-time position of the medical instrument 13 with respect to the position of the target point in the reference frame.) 
display the environment preparation plan including the suggested travel path on the display device; ([Para. 0118] Robotic device 10 has a man-machine interface device 19. The man-machine interface device 19 can be used when planning the medical procedure, in order to establish the trajectory of the medical instrument 13, or to visualize the progress of the medical instrument 13 in the body of the patient 30, for example by displaying the real-time position or almost real-time position of the medical instrument 13 with respect to the position of the target point in the reference frame.) during an implementation of the environment preparation plan, receive real-time movement data of the component or the personnel from the one or more sensors; ([Para. 0134] The operator can initiate the guiding phase by inserting the medical instrument 13 through the guide tool 12 until the end of the medical instrument reaches the planned target point. The guide tool 12 can have a sensor for indicating the depth of insertion of the medical instrument 13. The robotic device 10 can then display, in real time or almost in real time, the position of the medical instrument 13 in the images and can supply messages to the operator when the target point is near, is reached or has been passed.) dynamically update the suggested travel path based on the real-time movement data to minimize travel times and path intersections for the component or personnel within the medical environment during the implementation; ([Para. 0107] The processing circuit 17 can also determine the position of the point of entry and the position of the target point of the trajectory in said reference frame, taking account of the deformations of the anatomical structures (caused by gravity, respiration, mechanical contact with a medical instrument, etc.) of the patient with respect to the anatomical structures of the patient 30 under consideration in order to determine said trajectory.
The capture of the information concerning the position of the anatomy of the patient 30 and the calculation for coordinating the biomechanical model with said information concerning the position of the anatomy of the patient 30 are preferably carried out in real time or almost in real time, such that the position of the point of entry and the position of the target point in the reference frame can be updated in real time or almost in real time in order to monitor the movements and deformations of the anatomical structures of the patient 30. This updating can also be carried out during the insertion of the medical instrument 13 into the body of the patient 30, in order to take account of the deformations induced by the movement of said medical instrument 13. [Para. 0122] The medical intervention can be defined, for example, by the trajectory, the medical instrument 13 to be used, and the treatment parameters, which includes treatment time. [Para. 0135] During the insertion, the processing circuit 17 can use the biomechanical model of the patient 30 to estimate the local deformations of the organs or anatomical structures through which the medical instrument 13 passes and to take these deformations into account for updating the position of the target point.) and display the dynamically updated suggested travel path on the display device. ([Para. 0107] The capture of the information concerning the position of the anatomy of the patient 30 and the calculation for coordinating the biomechanical model with said information concerning the position of the anatomy of the patient 30 are preferably carried out in real time or almost in real time, such that the position of the point of entry and the position of the target point in the reference frame can be updated in real time or almost in real time in order to monitor the movements and deformations of the anatomical structures of the patient 30. 
This updating can also be carried out during the insertion of the medical instrument 13 into the body of the patient 30, in order to take account of the deformations induced by the movement of said medical instrument 13. [Para. 0118] The man-machine interface device 19 can be used when planning the medical procedure, in order to establish the trajectory of the medical instrument 13, or to visualize the progress of the medical instrument 13 in the body of the patient 30, for example by displaying the real-time position or almost real-time position of the medical instrument 13 with respect to the position of the target point in the reference frame. [Para. 0135] During the insertion, the processing circuit 17 can use the biomechanical model of the patient 30 to estimate the local deformations of the organs or anatomical structures through which the medical instrument 13 passes and to take these deformations into account for updating the position of the target point.) Therefore, it would be prima facie obvious to one of ordinary skill in the art, at the time of filing, to modify the optimization of an operation room as taught by Fuerst and incorporate a robotic device for performing a medical intervention on a patient using a medical instrument as taught by Blondel, with the motivation of assisting in the medical procedure for minimally invasive surgery (Blondel Para. 0017). Fuerst does not explicitly teach, however Garcia Kilroy teaches display, on the display device, a plurality of modes of operation of the component; ([Para. 0051] While wearing the head-mounted display 220, the user may view an immersive first-person perspective view of the virtual robotic surgical environment generated by the virtual reality processor 210 and displayed onto the immersive display 222. A similar first-person perspective view may be displayed onto an external display 240 (e.g., for assistants, mentors, or other suitable persons to view). [Para. 
0060] A user may directly select a particular control mode (i.e. operator-selected mode) through, for example, a menu displayed in the first-person perspective view of the virtual environment.) receive an indicator of an operator-selected mode of operation from the plurality of modes of operation of the component displayed on the display device; ([Para. 0060] A user may directly select a particular control mode (i.e. operator-selected mode) through, for example, a menu displayed in the first-person perspective view of the virtual environment. [Para. 0085] One or more sensors 750 may be configured to detect status of at least one robotic component (e.g., a component of a robotic surgical system, such as a robotic arm, a tool driver coupled to a robotic arm, a patient operating table to which a robotic arm is attached, a control tower, etc.) or other component of a robotic surgical operating room. Such status may indicate, for example, position, orientation, speed, velocity, operative state (e.g., on or off, power level, mode), or any other suitable status of the component.) Therefore, it would be prima facie obvious to one of ordinary skill in the art, at the time of filing, to modify the optimization of an operation room as taught by Fuerst, a robotic device for performing a medical intervention on a patient using a medical instrument as taught by Blondel, and incorporate detecting the status of a surgical robot as taught by Garcia Kilroy, with the motivation to effectively plan the specifics of conducting minimally invasive procedures with surgical robots (Garcia Kilroy Para. 0005).

As per Claim 2, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the spatial information is received from a camera in the medical environment. ([Para.
0065] Sensor feedback can be generated by one or more sensors, including but not limited to: a camera, an infrared sensor, 3D scan cameras, robotic sensors, and human tracking devices. These sensors can be located on walls, in handheld devices, or integrated with surgical robotic equipment.)

As per Claim 3, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the spatial information is received from a three-dimensional depth mapping system. ([Para. 0076] based on the sensor feedback (e.g., 3D scans of the physical OR), the system can add new virtual equipment within the virtual OR based on the received data. New models can be generated and stored in the databases. In this manner, the system can add new and unknown virtual objects into the virtual operating room based on feedback from 3D scanners and stitch them into the virtual OR.)

As per Claim 4, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the spatial information is received from a sensor system. ([Para. 0065] Sensor feedback can be received at operation 284. The sensor feedback can define geometry of the physical operating room and/or positions of physical surgical equipment in the physical OR. The geometry of the physical operating room shows a shape and size of the room. The positions of the physical equipment can describe a location and orientation of equipment in the room. Sensor feedback can be generated by one or more sensors, including but not limited to: a camera, an infrared sensor, 3D scan cameras, robotic sensors, and human tracking devices. These sensors can be located on walls, in handheld devices, or integrated with surgical robotic equipment.)

As per Claim 5, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the determined component is a robot-assisted manipulator assembly. ([Para. 0005] an arrangement of surgical robotic arms (i.e. component for use) in advance of a procedure.
The arrangement (e.g., location and orientation) of the surgical robotic system (e.g., the surgical robotic arms, the surgical robotic platform, the user console, the control tower, and other surgical robotic equipment) can be planned and optimized for physical environments (e.g., operating rooms in a health facility).)

As per Claim 6, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Garcia Kilroy further teaches wherein the operator-selected mode of operation is a preparation mode, a procedure mode, a storage mode, or a servicing mode. ([Para. 0085] one or more sensors 750 may be configured to detect status of at least one robotic component (e.g., a component of a robotic surgical system, such as a robotic arm, a tool driver coupled to a robotic arm, a patient operating table to which a robotic arm is attached, a control tower, etc.) or other component of a robotic surgical operating room. Such status may indicate, for example, position, orientation, speed, velocity, operative state (e.g., on or off, power level, mode), or any other suitable status of the component. [Para. 0090] the virtual reality processor 210 may be configured to receive the detected status of the robotic component, and then modify the virtual robotic component based at least in part on the detected status such that the virtual robotic component mimics the robotic component. For example, if a surgeon moves a robotic arm during a robotic surgical procedure to a particular pose, then a virtual robotic arm in the virtual environment may move correspondingly (i.e. procedure mode).)
Therefore, it would be prima facie obvious to one of ordinary skill in the art, at the time of filing, to modify the optimization of an operating room as taught by Fuerst, a robotic device for performing a medical intervention on a patient using a medical instrument as taught by Blondel, and incorporate detecting the status of a surgical robot as taught by Garcia Kilroy, with the motivation to effectively plan the specifics of conducting minimally invasive procedures with surgical robots (Garcia Kilroy Para. 0005).

As per Claim 10, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the set of operation constraints includes a spatial envelope for a range of motion of the component. ([Para. 0032] A surgical robotic kinematics processor 106 can generate kinematics parameters that define movement restraints and movement abilities of a physical surgical robotic arm. [Para. 0040] The kinematics parameters can be generated that define movement constraints of a surgical robotic arm to be used in the procedure type. For example, the kinematics parameters can define a direction that a member of the robot moves, a maximum or minimum distance or angle that each member can travel, the mass of each member, the stiffness of each member, and/or speed and force in which members move. Thus, the arranging of the virtual surgical robotic arm or the virtual tool can further be based on the kinematics parameters of the virtual robotic arm so that the virtual surgical robotic arm can perform the surgical procedures type within the surgical workspace, with reference to the kinematics parameters.)

As per Claim 11, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the set of operation constraints includes kinematic information for the component. ([Para.
0032] A surgical robotic kinematics processor 106 can generate kinematics parameters that define movement restraints and movement abilities of a physical surgical robotic arm. [Para. 0040] The kinematics parameters can be generated that define movement constraints of a surgical robotic arm to be used in the procedure type. For example, the kinematics parameters can define a direction that a member of the robot moves, a maximum or minimum distance or angle that each member can travel, the mass of each member, the stiffness of each member, and/or speed and force in which members move. Thus, the arranging of the virtual surgical robotic arm or the virtual tool can further be based on the kinematics parameters of the virtual robotic arm so that the virtual surgical robotic arm can perform the surgical procedures type within the surgical workspace, with reference to the kinematics parameters.)

As per Claim 12, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the set of operation constraints includes auxiliary component requirements. ([Para. 0038] an optimization procedure 110 is performed to arrange the virtual objects (e.g., the patient, the robotic arm and tool, the platform, the user console, the control tower, and other surgical equipment) based on one or more inputs and changes to inputs. Changes to inputs can include a modification to a virtual patient, a change to a procedure, a change to the operating room, a change to platform height, and/or any change that would modify a location or size of a workspace in a virtual patient. [Para. 0028] A position of a tool 304 can be determined based on the workspace and/or a reach 306 of a surgical tool within the workspace. A tool can include a trocar and/or additional surgical tools (for example, stapler, grasper, scissor, and camera). [Para.
0041] the VR simulation can include virtual equipment 115 such as a surgical robotic platform, a control tower, a user console, scanning equipment (e.g., mobile X-ray machine (C-arm) or ultrasound imaging machine), stands, stools, trays, and other equipment that would be arranged in a surgical operating room. [Para. 0050] The inputs can include changes to one or more of the following factors: the procedure type, the surgical workspace, the virtual patient, the tool position, and kinematics parameters.)

As per Claim 13, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the set of operation constraints includes utility access requirements for the component. ([Para. 0083] the processor can route virtual cables in the virtual operating room (e.g., between the equipment models and/or between models and electric sockets). Similarly, intravenous lines or other medical tubes or cables can be routed within the virtual operating room. The processor can identify sensitive routes where cabling is laid as ‘no-go’ zones to aid in the setup and planning of a surgical robotic procedure, and to help train staff.)

As per Claim 14, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the set of operation constraints includes operation access requirements. ([Para. 0057] A position (including location and/or orientation) of one or more virtual surgical robotic arms and/or a surgical robotic platform can be arranged in the virtual surgical operating room based on the tool position, workspace, and/or port position. The surgical robotic arm positions, on which the tools are attached to, can be determined to hold each of the virtual tools in a proper position so that reach and movements in the workspace are sufficient. Height and pitch angles of the platform can be adjusted.)
As per Claim 15, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the set of operation constraints includes staffing or patient requirements for the operator-selected mode of operation. ([Para. 0028] A workspace 310 in a virtual patient can be defined, for example, based on procedure type, patient anatomy, a patient size, etc. [Para. 0031] The virtual surgical robotic equipment and the virtual patient can be arranged in a virtual operating room, to help plan for a surgical robotic procedure (e.g., laparoscopic surgery).)

As per Claim 17, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein generating the environment preparation plan includes comparing a distance between the component and a second component to a threshold distance. ([Para. 0080] Compatibility determination can also be made based on whether a sufficient clearance (i.e. threshold distance) exists between the virtual surgical equipment (i.e. a distance between the component and a second component), between the virtual surgical equipment and walls of the physical OR, or between the virtual surgical equipment and virtual personnel.)

As per Claim 18, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein generating the environment preparation plan includes comparing an access direction for the component to a predetermined access direction for the operator-selected mode of operation. ([Para. 0040] The kinematics parameters can be generated that define movement constraints of a surgical robotic arm to be used in the procedure type. For example, the kinematics parameters can define a direction that a member of the robot moves, a maximum or minimum distance or angle that each member can travel, the mass of each member, the stiffness of each member, and/or speed and force in which members move. ([Para.
0080] Compatibility determination can also be made based on whether a sufficient clearance (i.e. access direction) exists between the virtual surgical equipment (i.e. predetermined access direction), between the virtual surgical equipment and walls of the physical OR, or between the virtual surgical equipment and virtual personnel.))

As per Claim 19, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein generating the environment preparation plan includes determining at least one auxiliary component for use in the medical environment. ([Para. 0017] The system 1 can incorporate any number of devices, tools, or accessories used to perform surgery on a patient 6. For example, the system 1 may include one or more surgical tools 7 used to perform surgery. A surgical tool 7 may be an end effector that is attached to a distal end of a surgical arm 4, for executing a surgical procedure.)

As per Claim 20, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Blondel further teaches wherein generating the environment preparation plan includes providing a suggested configuration for the component in the medical environment. ([Para. 0108] Having determined the parameters of the trajectory (positions of the point of entry and of the target point) in the reference framework associated with the robotic device 10, or simultaneously with this determination, the processing circuit 17 determines a position setpoint and an orientation setpoint of the guide tool 12 in order to comply with said trajectory. [Para. 0109] The control circuit 16 can then control the robot arm 11 to place, or to help the operator place, the guide tool 12 in said position setpoint and said orientation setpoint that are determined by the processing circuit 17 (i.e. suggested configuration).)
Therefore, it would be prima facie obvious to one of ordinary skill in the art, at the time of filing, to modify the optimization of an operating room as taught by Fuerst and incorporate a robotic device for performing a medical intervention on a patient using a medical instrument as taught by Blondel, with the motivation of assisting in the medical procedure for minimally invasive surgery (Blondel Para. 0017).

As per Claim 21, Fuerst/Blondel/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein generating the environment preparation plan includes receiving input from a remote advisory operator. ([Para. 0084] The virtual surgical environment is interactive such that the user can adjust the orientation and/or location of objects in the virtual surgical environment (e.g., the virtual surgical robotic arm, the control tower, an angle or height of the surgical robotic platform, an angle of a display, and more).)

As per Claim 22, Fuerst/Garcia Kilroy teach the system of claim 1, Blondel further teaches wherein the computer readable instructions, when executed by the processor, cause the system to display the environment preparation plan on a display system. ([Para. 0118] The man-machine interface device 19 is a display screen, preferably a touch screen. The man-machine interface device 19 can be used when planning the medical procedure, in order to establish the trajectory of the medical instrument 13, or to visualize the progress of the medical instrument 13 in the body of the patient 30, for example by displaying the real-time position or almost real-time position of the medical instrument 13 with respect to the position of the target point in the reference frame.)
Therefore, it would be prima facie obvious to one of ordinary skill in the art, at the time of filing, to modify the optimization of an operating room as taught by Fuerst and incorporate a robotic device for performing a medical intervention on a patient using a medical instrument as taught by Blondel, with the motivation of assisting in the medical procedure for minimally invasive surgery (Blondel Para. 0017).

As per Claim 23, Fuerst/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the computer readable instructions, when executed by the processor, cause the system to evaluate the implementation of the environment preparation plan that was conducted in the medical environment by comparison of the physical implementation to the environment preparation plan. ([Para. 0138] At any moment during or after the guiding phase, the operator can verify the correct execution of the insertion of the medical instrument 13 via control images. Depending on the equipment available to the hospital and in the operating theater, the anatomical zone of interest can be examined using a fixed or mobile imaging apparatus (CT scanner, MRI scanner, radiology C-arc, ultrasound probe, etc.). In preferred embodiments, the images are transferred directly to the robotic device 10, of which the processing circuit 17 comprises, for example, registration algorithms for automatically merging these intra-operative images with the pre-operative images. The robotic device 10 then displays the planning information superposed on the intra-operative images in order to evaluate the progress or the efficacy of the treatment and to determine the corrections that have to be made if necessary.)
Therefore, it would be prima facie obvious to one of ordinary skill in the art, at the time of filing, to modify the optimization of an operating room as taught by Fuerst and incorporate a robotic device for performing a medical intervention on a patient using a medical instrument as taught by Blondel, with the motivation of assisting in the medical procedure for minimally invasive surgery (Blondel Para. 0017).

As per Claim 24, Fuerst/Garcia Kilroy teach the system of claim 1, Fuerst further teaches wherein the computer readable instructions, when executed by the processor, cause the system to store the environment preparation plan for use or reference in a generation of a second environment preparation plan or to store the environment preparation plan with an evaluation indicator based on the implementation. ([Para. 0104] The trajectory corresponds to the position of a point of entry, for example on the outer surface of the anatomy of the patient 30, through which the medical instrument 13 has to penetrate into the body of the patient 30, and also the position of a target point inside the patient 30, at the area of the targeted anatomical structure to be reached by said medical instrument 13. The point of entry and the target point are, for example, stored in the form of coordinates in a frame associated with the anatomy of the patient 30. [Para. 0106] The trajectory can be predetermined by means other than the robotic device 10, in which case it is stored, for example, in the storage medium. [Para. 0138] The robotic device 10 then displays the planning information superposed on the intra-operative images in order to evaluate the progress or the efficacy of the treatment and to determine the corrections that have to be made if necessary. [Para.
0139] The processing circuit 17 can also comprise segmentation algorithms for automatically identifying a necrosis zone, comparing it with the planned zone, calculating and displaying the margins obtained in terms of diameter or volume, and indicating the diameter or volume still to be treated.)

Therefore, it would be prima facie obvious to one of ordinary skill in the art, at the time of filing, to modify the optimization of an operating room as taught by Fuerst and incorporate a robotic device for performing a medical intervention on a patient using a medical instrument as taught by Blondel, with the motivation of assisting in the medical procedure for minimally invasive surgery (Blondel Para. 0017).

Response to Arguments

Applicant's arguments, see pgs. 6-9 “Compliance with 101” filed 10/16/2025, have been fully considered but they are not persuasive. Applicant argues that the claims are not directed towards an abstract idea that falls within the certain methods of organizing human activity category. Examiner is not persuaded.
The independent claims recite: receive spatial information for a medical environment; determine a component for use in the medical environment; receive an indicator of an operator-selected mode of operation from the plurality of modes of operation of the component; receive a set of operation constraints for the component associated with the operator-selected mode of operation of the component; generate an environment preparation plan including a suggested travel path of the component or personnel for implementing the environment preparation plan, wherein the suggested travel path is based on the set of operation constraints for the operator-selected mode of operation of the component and the spatial information; during an implementation of the environment preparation plan, receive real-time movement data of the component or the personnel from the one or more sensors; and dynamically update the suggested travel path based on the real-time movement data to minimize travel times and path intersections for the component or personnel within the medical environment during the implementation.

The claims recite collecting environmental data to determine an optimal environment preparation plan including a suggested travel path of the component or personnel. These steps organize the medical environment and healthcare providers to determine the optimal locations and travel paths of healthcare providers, and organize patients and hospital staff by determining optimal scenarios. Because these limitations determine the optimal travel path to implement following the analysis of medical environment data, they constitute the management of personal behavior on the part of healthcare providers and hospital administrative staff. Accordingly, the claims fall under the “Certain Methods of Organizing Human Activity” grouping of abstract ideas and, thus, recite an abstract idea.
Applicant argues that the claims integrate the alleged judicial exception into a practical application by improving the functioning of a computer or improving another technology or technical field. Examiner is not persuaded. Applicant asserts that amended claim 1 integrates real-time movement data from sensors with operation constraints for an operator-selected mode of operation to dynamically update a suggested travel path to minimize travel times and path intersections within the medical environment, thereby enhancing workflow efficiency and reducing the risk of collisions; that these improvements are realized through the specific interaction of hardware and software; and that, accordingly, the additional elements show that amended claim 1 as a whole provides a technological improvement in the field of medical environment management.

However, an improvement to the abstract idea of enhancing workflow efficiency and reducing the risk of collisions does not amount to an improvement to technology or a technical field (see MPEP § 2106.05(a)(III), stating “it is important to keep in mind that an improvement in the abstract idea itself (e.g. a recited fundamental economic concept) is not an improvement in technology. For example, in Trading Technologies Int’l v. IBG, 921 F.3d 1084, 1093-94, 2019 USPQ2d 138290 (Fed. Cir. 2019), the court determined that the claimed user interface simply provided a trader with more information to facilitate market trades, which improved the business process of market trading but did not improve computers or technology.”). There is no indication in the instant disclosure that the involvement of a computer assists in improving the technology for the outlined problem statement. Here, the improvement is to medical environment management. The instant application and claim language fail to detail how a computer aids the method, the extent to which the computer aids the method, or the significance of a computer to the performance of the method.
Merely adding generic computer components to perform the method is not sufficient.

Applicant argues that the present claims recite additional elements that amount to significantly more than the alleged abstract idea. Examiner is not persuaded. The consideration under Step 2B is whether the additional elements, alone or in combination, are well-understood, routine, and conventional in the field; the novelty of the abstract idea is not considered relevant under the Step 2B analysis. Here, the additional elements of Claim 1 reciting a processor and a memory having computer readable instructions stored thereon, alone or in combination, amount to an instruction to implement the abstract idea using a general-purpose computer. Claim 1 recites one or more sensors. The specification and the instant claim do not provide any indication that the sensors are being utilized beyond their ordinary capacity. Therefore, this step is directed to invoking a device merely as a tool to perform an existing process and does not integrate a judicial exception into a practical application or provide significantly more (MPEP 2106.05(f)(2)). Claim 1 recites: display, on the display device, a plurality of modes of operation of the component; display the environment preparation plan including the suggested travel path on the display device; and display the dynamically updated suggested travel path on the display device. The specification and the instant claim do not provide any indication that the display device is being utilized beyond its ordinary capacity.

Applicant's arguments, see pgs. 9-10 “Compliance with 103” filed 10/16/2025, have been fully considered and are persuasive regarding the newly added limitations. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of Blondel, as per the rejection above.
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Patricia K Edouard whose telephone number is (571) 272-6084. The examiner can normally be reached Monday - Friday, 7:30 AM - 5:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Fonya M Long, can be reached at 571-270-5096. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/P.K.E./ Examiner, Art Unit 3682
/FONYA M LONG/ Supervisory Patent Examiner, Art Unit 3682

Prosecution Timeline

May 31, 2023
Application Filed
Mar 22, 2025
Non-Final Rejection — §101, §103
Jun 30, 2025
Response Filed
Jul 26, 2025
Final Rejection — §101, §103
Oct 02, 2025
Applicant Interview (Telephonic)
Oct 02, 2025
Examiner Interview Summary
Oct 16, 2025
Request for Continued Examination
Oct 29, 2025
Response after Non-Final Action
Jan 12, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12340908
REGIONALLY INTEGRATED EMERGENCY STROKE UNIT
2y 5m to grant Granted Jun 24, 2025
Patent 12272450
REVERSE RECALL NOTIFICATION SYSTEM
2y 5m to grant Granted Apr 08, 2025
Patent 12183469
METHOD AND SYSTEM FOR ACCURATELY TRACKING AND INFORMING OF HEALTH AND SAFETY FOR GROUP SETTINGS
2y 5m to grant Granted Dec 31, 2024
Patent 12040087
METHOD OF CONTROLLING USER EQUIPMENT FOR MEDICAL CHECK-UP AND APPARATUS FOR PERFORMING THE METHOD
2y 5m to grant Granted Jul 16, 2024
Patent 12014816
MULTI-SENSOR PLATFORM FOR HEALTH MONITORING
2y 5m to grant Granted Jun 18, 2024
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
13%
Grant Probability
36%
With Interview (+23.2%)
2y 11m
Median Time to Grant
High
PTA Risk
Based on 45 resolved cases by this examiner. Grant probability derived from career allow rate.
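The headline figures in this panel follow from the raw counts shown above. A minimal sketch of the arithmetic (treating the 23.2% interview lift as a simple additive adjustment is an assumption about how the page combines the numbers, not a documented model):

```python
# Reconstructing the projection figures from the examiner's career counts
# shown on this page (6 granted out of 45 resolved cases).
granted = 6
resolved = 45

career_allow_rate = granted / resolved               # ~13.3% -> "13% Grant Probability"
tc_gap = -0.387                                      # page shows -38.7% vs TC avg
tc_avg_estimate = career_allow_rate - tc_gap         # implied TC average allow rate

interview_lift = 0.232                               # +23.2% lift with interview
with_interview = career_allow_rate + interview_lift  # ~36-37% -> "With Interview"

print(f"Career allow rate: {career_allow_rate:.1%}")
print(f"Implied TC avg:    {tc_avg_estimate:.1%}")
print(f"With interview:    {with_interview:.1%}")
```

The exact sum is 36.5%; the page's 36% presumably rounds the 13% headline figure before adding the lift, so small discrepancies are expected.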
