DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 13 October 2023 was filed in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Drawings
The drawings were received on 13 October 2023. These drawings are objected to under 37 CFR 1.84 for the following reason: the words used to designate various parts in Figure 1 are not plain and legible. New corrected drawings in compliance with 37 CFR 1.121(d) are required in this application because of the reasons stated above. Applicant is advised to employ the services of a competent patent draftsperson outside the Office, as the U.S. Patent and Trademark Office no longer prepares new drawings. The corrected drawings are required in reply to the Office action to avoid abandonment of the application. The requirement for corrected drawings will not be held in abeyance.
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(4) because reference characters "118" and "124" have both been used to designate the recalibration command in Figures 14-15. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Specification
The disclosure is objected to because of the following informalities:
¶0073 of the specification states “display screen 18”. This should be “monitor 18” for consistent terminology.
¶0075 of the specification states “patient 9”. This should be “patient 8” for consistent terminology.
¶0079 of the specification states “instrument platform 24”. This should be “robot platform 24” for consistent terminology.
¶0085 of the specification states “robotic instrument 12”. This should be “passive controller 12” for consistent terminology.
¶0092 of the specification states “clutch operation 104”. This should be “clutch procedure 104” for consistent terminology.
Appropriate correction is required.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 20 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent eligible subject matter because it claims a computer program per se, which would encompass transitory forms of media. Claim 20 must be amended to recite a non-transitory computer readable medium.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-10, 13-16, and 18-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ignakov (US Publication No. 20200055195).
Regarding claim 1, Ignakov discloses an apparatus (Ignakov ¶0156 “robotic control system 200”) comprising: at least one processor (Ignakov ¶0096 “a controller processor 130”); and at least one memory (Ignakov ¶0115 “The controller storage component 128 can include one or more data storage systems (including volatile memory or non-volatile memory or other data storage elements, or a combination thereof).”) including computer program code (Ignakov ¶0260 “These embodiments may be implemented in computer programs executing on programmable computers, each computer including at least one processor, a data storage system (including volatile memory or non-volatile memory or other data storage elements or a combination thereof), and at least one communication interface.”), the at least one memory and computer program code configured to, with the at least one processor, cause the apparatus to: receive a recalibration command from a passive controller (Ignakov Figure 2a part 224 as described in ¶0156; ¶0197) configured to remotely control a robotic instrument (Ignakov Figure 2a showing the controller 224 is synced to end effector 216 wirelessly, as described in ¶0185), wherein the passive controller and robotic instrument have freedom of movement within respective control and instrument workspaces, and wherein the control workspace is mapped to the instrument workspace to allow the position of the robotic instrument to track the position of the passive controller as the passive controller moves within the control workspace (Ignakov ¶0083 “For example, the command signals can be determined with respect to respective synchronization reference frames that can be assigned to coincide with the end effector and the end effector controller during operation of the robotic control system. Such synchronization reference frames can optionally act as offset reference frames, that can be used to help determine the relative location of the end effector controller and/or end effector and to help determine the nature of the movement required from the robotic device for the end effector to track the inputs from the end effector controller in a desired manner.”; Figure 4 as described in ¶0192-¶0196); and recalibrate the mapping of the control workspace to the instrument workspace in response to the recalibration command such that the current position of the passive controller corresponds to the current position of the robotic instrument (Ignakov Figure 4 as described in ¶0197 “if the user 201 triggers the system to transition into its engage mode, the controller processor 230 then proceeds to step 410. For example, the controller processor 230 can determine that the end effector controller 224 just resumed from a disabled mode upon receipt of the engagement signal. The controller processor 230 can then assign the robot synchronization reference frame S.sub.R to the current pose of the end effector 216, and the user synchronization reference frame S.sub.U to the current pose of the end effector controller 224.”).
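For illustration only, and not as part of the grounds of rejection: the offset-reference-frame arrangement quoted from Ignakov ¶0083 and ¶0197 can be sketched as a minimal, position-only model. All class and variable names below are hypothetical; Ignakov discloses no source code.

```python
# Hypothetical, position-only sketch of Ignakov's offset reference frames
# (S_U and S_R in the reference's notation). Orientation is omitted here.
import numpy as np

class WorkspaceMapping:
    """Maps the passive controller's workspace onto the instrument's."""

    def __init__(self):
        # Offset between the user and robot synchronization frames.
        self.offset = np.zeros(3)

    def recalibrate(self, controller_pos, instrument_pos):
        # On an engagement signal (cf. Ignakov Figure 4, step 410), anchor
        # the frames so the controller's current position corresponds to
        # the instrument's current position.
        self.offset = np.asarray(instrument_pos) - np.asarray(controller_pos)

    def instrument_target(self, controller_pos):
        # While engaged, the instrument tracks controller motion through
        # the most recently recalibrated offset.
        return np.asarray(controller_pos) + self.offset
```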
Regarding claim 2, Ignakov further discloses wherein the passive controller comprises a clutch mechanism (Ignakov ¶0194 “the controller processor 230 determines whether the end effector controller 224 is in its disabled mode. For example, the operator 201 can trigger the disabled mode by causing a disengagement signal to be sent to the controller processor 230. The disengagement signal can be generated in response to an activation of a mode selection button at the end effector controller 224 or by alternative methods, such as verbal commands from the operator 201.”) configured to enable the position of the passive controller within the control workspace to be changed without causing a corresponding change in the position of the robotic instrument within the instrument workspace (Ignakov ¶0195 “By enabling the end effector controller 224 to enter a disabled mode, the operator 201 has the flexibility to rest and/or adjust his hand position without affecting the operation of the robotic device 210. The operator 201 can also take advantage of the disabled mode to reposition the end effector controller 224 to minimize strain”), and wherein the recalibration command is received from the passive controller when the clutch mechanism is engaged or subsequently disengaged (Ignakov Figure 4 as described in ¶0197 “For example, the controller processor 230 can determine that the end effector controller 224 just resumed from a disabled mode upon receipt of the engagement signal. The controller processor 230 can then assign the robot synchronization reference frame S.sub.R to the current pose of the end effector 216, and the user synchronization reference frame S.sub.U to the current pose of the end effector controller 224.”).
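Continuing the hypothetical sketch above (again purely for illustration), the claimed clutch behavior mapped to Ignakov's disabled and engage modes amounts to gating the tracking update and re-anchoring the mapping on re-engagement. This reuses the WorkspaceMapping class and the numpy import from the previous sketch.

```python
# Continues the sketch above; reuses WorkspaceMapping and numpy (np).
class ClutchedTeleoperation:
    """Hypothetical clutch: while disengaged, controller motion produces
    no instrument motion; re-engagement recalibrates the mapping."""

    def __init__(self):
        self.mapping = WorkspaceMapping()
        self.engaged = False
        self.instrument_pos = np.zeros(3)

    def set_clutch(self, engaged, controller_pos):
        if engaged and not self.engaged:
            # Engagement signal: resume tracking from the instrument's
            # current position rather than jumping to the controller's.
            self.mapping.recalibrate(controller_pos, self.instrument_pos)
        self.engaged = engaged

    def update(self, controller_pos):
        # Called each control cycle with the controller's measured position.
        if self.engaged:
            self.instrument_pos = self.mapping.instrument_target(controller_pos)
        return self.instrument_pos
```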
Regarding claim 3, Ignakov further discloses wherein the passive controller comprises an engagement mechanism configured to initiate the tracking of the robotic instrument position to the passive controller position (Ignakov ¶0196 “To exit the disabled mode, the end effector controller 224 transmits an engagement signal to the controller processor 230 to resume transmitting motion commands to the robot.”), and wherein the recalibration command is received from the passive controller on activation of the engagement mechanism (Ignakov ¶0197 “if the user 201 triggers the system to transition into its engage mode, the controller processor 230 then proceeds to step 410. For example, the controller processor 230 can determine that the end effector controller 224 just resumed from a disabled mode upon receipt of the engagement signal. The controller processor 230 can then assign the robot synchronization reference frame S.sub.R to the current pose of the end effector 216, and the user synchronization reference frame S.sub.U to the current pose of the end effector controller 224.”).
Regarding claim 4, Ignakov further discloses wherein the passive controller comprises an unlock mechanism configured to reinitiate the tracking of the robotic instrument position to the passive controller position following a tracking interruption, and wherein the recalibration command is received from the passive controller on activation of the unlock mechanism (Ignakov ¶0196 “To exit the disabled mode, the end effector controller 224 transmits an engagement signal to the controller processor 230 to resume transmitting motion commands to the robot.”, showing the process of Figure 4 where reengagement of the controller initiates a resynchronization, and leaving the disabled mode unlocks the user controls of the device so a user may manipulate the robotic system again).
Regarding claim 5, Ignakov further discloses wherein the orientation of the robotic instrument tracks the orientation of the passive controller (Ignakov ¶0174 “If desired for specific applications, the rotation component of the hand reference frame H can be set to coincide exactly with the rotation of the end effector controller:”; ¶0183 “Similar to how the movement of the end effector controller 224 is tracked relative to the user synchronization reference frame S.sub.U, the movement of the end effector 216 will be executed with respect to the robot synchronization reference frame S.sub.R.”), and wherein the apparatus is configured to automatically control the orientation of the robotic instrument such that it is aligned with the orientation of the passive controller on activation of the engagement or unlock mechanisms (Ignakov ¶0197 “If the end effector controller 224 has just resumed from a disabled mode, for example if the user 201 triggers the system to transition into its engage mode, the controller processor 230 then proceeds to step 410. For example, the controller processor 230 can determine that the end effector controller 224 just resumed from a disabled mode upon receipt of the engagement signal. The controller processor 230 can then assign the robot synchronization reference frame S.sub.R to the current pose of the end effector 216, and the user synchronization reference frame S.sub.U to the current pose of the end effector controller 224.”).
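As a further hypothetical illustration of the orientation alignment discussed for claim 5: Ignakov does not disclose a particular interpolation for bringing the instrument's orientation into alignment with the controller's, so the quaternion slerp below is an assumed smoothing approach, not the reference's own method, with hypothetical names throughout.

```python
import numpy as np

def slerp(q_instrument, q_controller, t):
    # Spherical linear interpolation between two unit quaternions: an
    # assumed way to drive the instrument's orientation smoothly into
    # alignment with the controller's on engagement (t runs 0 -> 1).
    q0 = np.asarray(q_instrument, dtype=float)
    q1 = np.asarray(q_controller, dtype=float)
    dot = np.dot(q0, q1)
    if dot < 0.0:            # take the shorter path on the quaternion sphere
        q1, dot = -q1, -dot
    if dot > 0.9995:         # nearly parallel: linear interpolation is stable
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1.0 - t) * theta) * q0
            + np.sin(t * theta) * q1) / np.sin(theta)
```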
Regarding claim 6, Ignakov further discloses wherein the apparatus is configured to determine a trajectory of movement for the robotic instrument within the instrument workspace based on the current orientation of the passive controller to enable said automatic control (Ignakov ¶0204 “Referring now to FIGS. 12e and 12f, a goal transformation T.sub.S.sub.R.sub.,G is then defined for the robotic device 210. The goal transformation T.sub.S.sub.R.sub.,G defines a goal for the end effector 216 based on the relative movement tracked by the end effector controller 224.”).
Regarding claim 7, Ignakov further discloses wherein the apparatus is configured to redetermine the trajectory of movement as the current orientation of the passive controller changes (Ignakov ¶0078 “An end effector controller may be manipulated by a system user (e.g. a human) and used to help control the end effector on the robotic device. A controller processor may track a position and/or pose of the end effector controller, and optionally may track the position continuously and/or in real time from a user's perspective (i.e. at a sampling rate that is sufficiently fast such that a system user does not perceive a lag between controller input and robotic device movement).”; ¶0081).
Regarding claim 8, Ignakov further discloses wherein the robotic instrument is configured to be rearranged between an initial pose and one or more further poses (Ignakov ¶0079 “Optionally, the robot device may be positionable in, and moveable between a variety of poses. A pose, as discussed herein, may include a position of the end effector and optionally an orientation of the end effector, and similarly the position and/or orientation of the associated end effector controller that is being manipulated by the user.”), and wherein the apparatus is configured to automatically control the arrangement of the robotic instrument such that it returns from the one or more further poses to the initial pose on activation of a re-homing mechanism (Ignakov ¶0095 “In the event that communication between the robotic device 110 and the user interface apparatus 120 is interrupted, the robotic device 110 may be configured to pursue one or more predetermined courses of action, including, for example, freezing and remaining in the position it was in when communication was lost, automatically returning to a pre-determined “home” position and the like.”).
Regarding claim 9, Ignakov further discloses wherein the apparatus is configured to limit the speed of movement of the robotic instrument to a predefined magnitude during alignment/arrangement of the robotic instrument. (Ignakov ¶0205 “The functions can be static integers to simply increase or decrease the speed of motion, or more complex relations allowing, for example, for the end effector to move more quickly than the user when the user moves above a certain threshold, and slowing the manipulator down below the speed of the user when the user moves slower than a different threshold, or any other function that may be desirable.”).
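For the speed limitation discussed for claim 9, Ignakov ¶0205 describes speed-scaling functions only in general terms; the per-cycle displacement clamp below is one assumed way to enforce a predefined magnitude limit, offered purely as a sketch with hypothetical names.

```python
import numpy as np

def limited_step(current_pos, target_pos, max_step):
    # Clamp the per-cycle displacement so the instrument never moves
    # farther than max_step per control cycle during automatic alignment,
    # bounding its speed to a predefined magnitude.
    delta = np.asarray(target_pos, float) - np.asarray(current_pos, float)
    dist = np.linalg.norm(delta)
    if dist > max_step:
        delta *= max_step / dist
    return np.asarray(current_pos, float) + delta
```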
Regarding claim 10, Ignakov further discloses wherein the apparatus is configured to stop automatically controlling the robotic instrument on receipt of an override command or once the alignment/arrangement is complete (Ignakov ¶0192 “In some embodiments, the end effector controller 224 can enter a disabled mode during which movement at the end effector controller 224 does not cause corresponding movement at the end effector 216. FIG. 4 shows an example method 400 of controlling the robotic device 210 in which the end effector controller 224 can enter the disabled mode.”).
Regarding claim 13, Ignakov further discloses wherein the passive controller comprises an electronic display screen configured to display a representation of the instrument and control workspaces (Ignakov ¶0087 “The information may be displayed to the user using any suitable user display device, including a display screen(s), a wearable display device such as a virtual reality display headset (VR display) and the like.”; display component 122), and wherein the apparatus is configured to control the electronic display screen such that the current position and/or orientation of the robotic instrument and passive controller are indicated within the representations of the respective instrument and control workspaces (Ignakov ¶0087 “The display for the user can be a two-dimensional display, but preferably is configured to be a three-dimensional (3D) display that can help provide the user with depth-perception and increase the user's feeling of immersion in the system. Preferably, the display can help a user feel as if he/she is located in the environment of the robotic device and can interact with that environment in a generally natural, intuitive manner.”).
Regarding claim 14, Ignakov further discloses wherein the apparatus is configured to control the electronic display screen such that the current position and/or orientation are indicated in two or three dimensions. (Ignakov ¶0087 “For example, the imaging component may capture distance-based information using a LIDAR sensor and the system may then generate a 3D computer model based on the data that can be visually presented to the user on a display screen or via a VR headset.”).
Regarding claim 15, Ignakov further discloses wherein the current position and/or orientation of the passive controller and robotic instrument are the last known position and/or orientation of the passive controller and robotic instrument to the apparatus, respectively. (Ignakov Figure 4 showing the synchronization; ¶0139 “These reference frames are used to help ensure that the end effector moves in the same manner as the end effector controller, and to allow the user to stop, move and reposition themselves while the system is in the disabled state without causing corresponding movements of the end effector, and then to re-enable the system and resume control of the end effector from where they left off.”).
Regarding claim 16, Ignakov further discloses wherein the robotic instrument comprises an end effector (Ignakov ¶0161 “The manipulator 217 extends from the proximate end to a distal end that includes an end effector 216.”), and wherein the current position and/or orientation of the robotic instrument is the current position and/or orientation of the end effector (Ignakov Figure 3; ¶0185).
Regarding claim 18, Ignakov further discloses wherein the apparatus comprises at least one of: the passive controller (Ignakov Figure 2a part 224 as described in ¶0156; ¶0197) and robotic instrument (Ignakov robotic device 210).
Regarding claim 19, Ignakov discloses a computer-implemented method (Ignakov Figure 4 method 400) comprising: receiving a recalibration command from a passive controller (Ignakov Figure 2a part 224 as described in ¶0156) configured to remotely control a robotic instrument (Ignakov Figure 2a showing the controller 224 is synced to end effector 216 wirelessly, as described in ¶0185), wherein the passive controller and robotic instrument have freedom of movement within respective control and instrument workspaces, and wherein the control workspace is mapped to the instrument workspace to allow the position of the robotic instrument to track the position of the passive controller as the passive controller moves within the control workspace (Ignakov ¶0083 “For example, the command signals can be determined with respect to respective synchronization reference frames that can be assigned to coincide with the end effector and the end effector controller during operation of the robotic control system. Such synchronization reference frames can optionally act as offset reference frames, that can be used to help determine the relative location of the end effector controller and/or end effector and to help determine the nature of the movement required from the robotic device for the end effector to track the inputs from the end effector controller in a desired manner.”; Figure 4 as described in ¶0192-¶0196); and recalibrating the mapping of the control workspace to the instrument workspace in response to the recalibration command such that the current position of the passive controller corresponds to the current position of the robotic instrument (Ignakov Figure 4 as described in ¶0197 “if the user 201 triggers the system to transition into its engage mode, the controller processor 230 then proceeds to step 410. For example, the controller processor 230 can determine that the end effector controller 224 just resumed from a disabled mode upon receipt of the engagement signal. The controller processor 230 can then assign the robot synchronization reference frame S.sub.R to the current pose of the end effector 216, and the user synchronization reference frame S.sub.U to the current pose of the end effector controller 224.”).
Regarding claim 20, Ignakov further discloses a computer program comprising computer code which, when executed by a computer, causes the computer to perform the method of claim 19 (Ignakov ¶0260 “These embodiments may be implemented in computer programs executing on programmable computers, each computer including at least one processor, a data storage system (including volatile memory or non-volatile memory or other data storage elements or a combination thereof), and at least one communication interface.”).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 11-12 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Ignakov (US Publication No. 20200055195) in view of Itkowitz (US Publication No. 20110118753).
Regarding claim 11, claim 3 is anticipated by Ignakov. Ignakov further discloses a proximity sensor (Ignakov ¶0088 “Optionally, the information relating to the environment surrounding the robotic device can be captured as visual information (e.g. photos and/or video) captured using optical sensors, but alternatively may be captured using distance/proximity sensors, thermal sensors, RADAR, LIDAR, SONAR and other suitable techniques. The display provided to the user may be in the same form as the captured information, e.g. video captured by the imaging component may be displayed as video to the user.”).
Ignakov does not disclose wherein an engagement mechanism is configured to detect the presence or absence of a user, and wherein the apparatus is configured to initiate the tracking of the robotic instrument position to the passive controller position only when the proximity sensor has detected the presence of a user. Itkowitz, in a similar field of endeavor of remotely controlled robotic systems, teaches wherein an engagement mechanism is configured to detect the presence or absence of a user, and wherein the apparatus is configured to initiate the tracking of the robotic instrument position to the passive controller position only when the system has detected the presence of a user (Itkowitz Figure 11 showing the process of presence detection initiating system tracking, or lack of adequate presence returning the system to redetect a presence). Before the effective filing date, one of ordinary skill in the art would have been motivated to combine the robotic system of Ignakov with the methods for operation of a robotic system wherein an engagement mechanism is configured to detect the presence or absence of a user, and wherein the apparatus is configured to initiate the tracking of the robotic instrument position to the passive controller position only when the proximity sensor has detected the presence of a user, as taught in Itkowitz, for the purpose of restricting unwanted movements of the robotic system (Itkowitz ¶0245).
Regarding claim 12, claims 3 and 11 are obvious over Ignakov combined with Itkowitz. Ignakov does not further disclose wherein the apparatus is configured to stop the robotic instrument from tracking the position of the passive controller when the proximity sensor detects the absence of the user. Itkowitz, in a similar field of endeavor of remotely controlled robotic systems, teaches wherein the apparatus is configured to stop the robotic instrument from tracking the position of the passive controller when the system detects the absence of the user (Itkowitz ¶0240 “System controller 140 determines whether the hand present event or the hand not present event requires any change to the system mode of operation and issues an appropriate command.”). Before the effective filing date, one of ordinary skill in the art would have been motivated to combine the robotic system of Ignakov with the methods wherein the apparatus is configured to stop the robotic instrument from tracking the position of the passive controller when the proximity sensor detects the absence of the user, as taught in Itkowitz, for the purpose of restricting unwanted movements of the robotic system (Itkowitz ¶0245).
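As a final hypothetical sketch tying the Itkowitz presence-detection teaching to the clutch sketch given earlier: tracking is initiated only while the sensor reports a user present and is stopped when presence is lost. The function and names are illustrative only and do not appear in either reference.

```python
# Continues the earlier sketches; reuses ClutchedTeleoperation.
def control_cycle(teleop, controller_pos, user_present):
    # Presence gates engagement: tracking runs only while a user is
    # detected (cf. Itkowitz Figure 11, ¶0240, paraphrased); losing
    # presence disengages and stops instrument tracking.
    teleop.set_clutch(user_present, controller_pos)
    return teleop.update(controller_pos)
```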
Regarding claim 17, claim 1 is anticipated by Ignakov. Ignakov does not disclose wherein the robotic instrument is a surgical robotic instrument. Itkowitz, in a similar field of endeavor of remotely controlled robotic systems, teaches wherein the robotic instrument is a surgical robotic instrument (Itkowitz Figure 1 showing a surgical robot). Before the effective filing date, one of ordinary skill in the art would have been motivated to combine the robotic system of Ignakov with the surgical robotic instrument, as taught in Itkowitz, for the purpose of having a remotely controlled surgical robot, such as is commonly known in the art with devices like the da Vinci system.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MEGAN FEDORKY whose telephone number is (571)272-2117. The examiner can normally be reached M-F 9:30-4:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer McDonald, can be reached M-F 9:30-4:30. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MEGAN T FEDORKY/Examiner, Art Unit 3796
/ALLEN PORTER/Primary Examiner, Art Unit 3796