DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on January 26, 2026 has been entered.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 59 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
With regards to claim 59, the limitation “wherein the first display is configured to accept a first set of user inputs for controlling the treatment probe” is recited in lines 1-2. However, claim 59 is dependent upon claim 1 which sets forth in the last 4 lines the limitation “wherein the primary user interface is configured to accept user input for controlling the treatment probe”. It is unclear as to whether there is a distinction between the “first set of user inputs” in claim 59 and the “user input” in claim 1 which are both set forth for controlling the treatment probe, thus rendering the claim indefinite. For examination purposes, Examiner assumes there is no distinction and suggests Applicant remove this limitation.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 10, 12, 15-20 and 59 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mantri et al. (US Pub No. 2020/0360100) in view of Mesaros et al. (US Pub No. 2021/0106309), Gregerson et al. (US Pub No. 2018/0185113) and Aljuri et al. (US Pub No. 2021/0121251).
With regards to claim 1, Mantri et al. disclose a system to perform surgery on a patient, the system comprising:
a console (420, 490) (paragraphs [0064]-[0066], [0085], referring to the console (420) which is coupled to the treatment probe (450) and/or the console (490) which is coupled to an imaging probe (460); paragraph [0164], referring to the treatment system (1200) which may comprise one or more components of the system 400, which is depicted in Figure 2 and includes the console, etc.; Figures 2, 12);
one or more processors (423, 492) configured to control movement of a treatment probe (450) and obtain images of tissue treated with the treatment probe from an imaging device (460) (paragraphs [0071], [0078]-[0079], [0087], referring to the imaging probe (460) and treatment probe being coupled to first and second robotic arms (442, 444) which may be operably coupled with the processor (423) and/or processor (492) and wherein the robotic arms provide precise control of the treatment probe and the imaging probe; paragraph [0164], referring to the treatment system (1200) which may comprise one or more components of the system 400, which is depicted in Figure 2 and includes the one or more processors, etc.; paragraph [0057], [0069], [0089], [0115], [0141], referring to the imaging probe obtaining images; Figures 2, 12);
a positioning arm (see Figure 12, referring to the arm that is holding the display screens (1220)) coupled to the console (paragraph [0164], referring to the system (1200) which comprise the components of system (400), which includes the console as depicted in Figure 2 and includes one or more display screens 1220, which is coupled to an arm (i.e. positioning arm) as depicted in Figure 12; Figures 2, 12); and
a primary user interface (“graphical user interface”) comprising a first display (i.e. one of the displays (1220)) coupled to the positioning arm and a secondary user interface comprising a second display (i.e. another of the displays (1220)) fixedly coupled to the console (paragraphs [0012], [0074], [0090], [0287], referring to the user input device comprising a user interface on a display screen; paragraphs [0221]-[0222], referring to the system comprising user input devices to enable a user to control movement of the robotic arms, the mobile base, the imaging probe (460), the treatment probe, etc., wherein the mobile device may comprise a touchscreen; paragraphs [0164]-[0165]; Figures 2, 12, in particular, see Figure 12, wherein the displays (1220; i.e. “first display”, “second display”) are coupled to the positioning arm and coupled to the console; further see Figure 2, wherein there are two displays (i.e. 425, 495; “first display”, “second display”)),
wherein the primary user interface is configured to accept a first set of user inputs for controlling the treatment probe (paragraphs [0081], [0109], referring to the user interface allowing the user to manipulate the robotic arm motion via inputs provided to the user interface, such as the user effecting rotation, translation and/or adjustment of pitch angle of the treatment probe).
However, Mantri et al. do not specifically disclose that the positioning arm is configured to support and enable movement of the first display from a first location to a second location, the first location above a midline of the patient with a first configuration of the positioning arm and the second location away from the midline of the patient with a second configuration of the positioning arm, the primary user interface configured to receive via the first display a plurality of user inputs with the positioning arm in the first configuration and the first display supported above the midline of the patient.
Further, Mantri et al. do not specifically disclose that the secondary user interface is configured to mirror output of the first display on the second display, wherein the secondary user interface is configured to receive annotations associated with treatment planning without controlling the treatment probe.
Mesaros et al. disclose an ultrasonic diagnostic imaging system (10) and method for conduct of an ultrasonic image-guided invasive procedure, wherein the ultrasound system has a touch-screen display (30) located on the distal end of an articulating display arm (32) positioned on one side of a patient table (Abstract; paragraphs [0011]-[0014]; Figures 3-6). The articulating arm (32) has a wide range of motion so that the touchscreen display (30) can be positioned in a convenient location for a procedure, wherein, in addition to being capable of being raised and lowered, the articulating arm (32) can also pivot a full 360 degrees around the center axis of articulation joint (36) as indicated by the circular range-of-motion arrow 60 as depicted in Figure 4 (paragraphs [0012]-[0014]). The articulating arm can be swung around and articulated to an extended position over the patient (100) and toward the position of the clinician (200), wherein the clinician (200) can then grab the touchscreen where desired above the table (110) (paragraph [0114]; Figures 5-6, note that the articulating/positioning arm can thus support the touchscreen display (30) at a first location above a midline, as depicted in Figures 5-6, with a first configuration and a second location away from the midline of the patient with a second configuration, such as the configuration assumed prior to the articulating arm being extended over the patient). The articulating display arm (32) can be in a stowed position wherein it is locked in this position when the system is moved for safety purposes (paragraph [0012]; Figure 2, note that the stowed position can alternatively correspond to the second location away from the midline with a second configuration).
With the touchscreen display (30) positioned in front of the clinician and above the patient (100), the clinician can use both hands to hold the ultrasound probe and any instruments needed for the procedure without further interaction with the display (paragraph [0014]; Figures 5-6).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the positioning arm of Mantri et al. to be configured to support and enable movement of the first display from a first location to a second location, the first location above a midline of the patient with a first configuration of the positioning arm and the second location away from the midline of the patient with a second configuration of the positioning arm, the primary user interface configured to receive via the first display a plurality of user inputs with the positioning arm in the first configuration and the first display supported above the midline of the patient, as taught by Mesaros et al., in order to provide a wide range of motion for the display so that the display can be positioned in a convenient location for a procedure, including a position in front of the clinician and above the patient so that the clinician can use both hands to hold the ultrasound probe and any instruments needed for the procedure without further interaction with the display (paragraphs [0012]-[0014]).
However, though Mantri et al. do disclose that the first display and the second display (i.e. “one or more screens 1220”) are capable of displaying information to a medical practitioner or patient, the above combined references do not specifically disclose that the user interface is configured to mirror output of the first display on the second display and that the secondary user interface is configured to receive annotations associated with treatment planning without controlling the treatment probe.
Gregerson et al. disclose an image-guided surgery system, wherein the system may be operatively coupled to a first display (121), which may include a monitor that is fixed to a cart (120) or other structure within the operating suite, and the system may also be operatively coupled to at least one additional display device (401), wherein the display (401) can be mounted to an adjustable support (850), such as a gooseneck, balanced-arm or pivoting-arm support stand (paragraph [0051], Figures 1, 4, 8F). The display device (401) may be a mirror of the display device (121) and may display all or a portion of the same information as shown on display device (121), or, alternately, the display device (401) may display different information than is shown on display device (121) (paragraphs [0058], [0078], [0098]). A user-interface component may control the display of system information and/or graphical user interface elements on the displays (121, 401), wherein the user interface device may include a touchscreen user interface which may be integrated with a display device (121, 401) (paragraph [0054]). The user commands received via one or more user input devices may enable a user to control various functions of the system (400), such as changing what is shown on the displays (121, 401) (paragraph [0054]).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the user interface of the above combined references be configured to mirror output of the first display on the second display, as taught by Gregerson et al., as the above combined references require displaying information using the first display and the second display and Gregerson et al. teach a known technique for displaying information using two displays wherein one display device may effectively display the same mirrored information on the other display device during a surgical procedure (paragraphs [0058], [0078], [0098]). That is, displaying information using two displays (i.e. first display, second display), as desired by the above combined references, using the known technique of having the user interface be configured to mirror output of the first display on the second display, as taught by Gregerson et al., would have been obvious to one of ordinary skill in the art.
However, the above combined references do not specifically disclose that the secondary user interface is configured to receive annotations associated with treatment planning without controlling the treatment probe.
Aljuri et al. disclose an apparatus for robotic surgery, wherein the system includes a user interface (1700) which may comprise a user input (1770) for the user to select data to be shown on the display (paragraph [0326]; Figure 17). An image of the treatment area may be shown within a control area (1704) and a user is able to specify a treatment area for this anatomical feature (paragraph [0328], note that a user specifying a treatment area on the display corresponds to the display/user interface being configured to receive annotations associated with treatment planning without controlling the treatment probe; Figure 17). The plurality of resection profiles and corresponding treatment areas established by the user can be fed into a computer which may save the treatment plans for execution by a surgeon, whether human or robotic (paragraph [0328]). As a treatment plan comprising a plurality of cut profiles is modified by the user, the processor may receive the modified treatment plan, cut profile and ultrasound images, and the trained classifier/neural network can be used to generate updated safety and efficacy parameters shown on display 1750, wherein the user interface includes controls to allow a user to adjust the treatment plan (paragraphs [0338]-[0339]). Further, as depicted in Figure 3B, the display (425) is fixedly coupled to the console (420) (paragraph [0334], referring to Figures 17A-C, showing a display coupled to a surgical system; paragraph [0081], Figure 3B). The classifier can be trained on a plurality of resection profiles, wherein the classifier can be trained with user input to identify anatomical landmarks, etc. with an input, such as a touchpad wherein the user can draw a line of the resection profile with the input device to train the classifier (paragraphs [0378]-[0379], note that the user drawing a line corresponds to receiving annotations).
The ability to receive user input in which surgical parameters are input into the system prior to treatment and allow the system to be set up by an operator prior to surgery provides an improved surgical robotic procedure and further accommodates individual variability among patients and surgical system parameters to provide improved treatment outcomes (Abstract; paragraph [0059]).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the secondary user interface of the above combined references be configured to receive annotations associated with treatment planning without controlling the treatment probe, as taught by Aljuri et al., in order to accommodate individual variability among patients and surgical system parameters to provide improved treatment outcomes (Abstract; paragraph [0059]).
With regards to claim 10, the above combined references disclose that the imaging device comprises an ultrasound imaging device configured to be inserted into the patient, and the ultrasound imaging device comprises a transrectal ultrasound (TRUS) imaging probe configured to view the treatment probe with the treatment probe located between the TRUS probe and the first display (see Mantri et al., paragraph [0062], referring to the imaging probe (460) and the treatment probe (450) being aligned so that the treatment probe is within the field of view of the imaging probe; paragraph [0063], referring to the imaging probe (460) comprising a transrectal ultrasound (TRUS) probe configured for insertion into the rectum of the patient to view the patient’s prostate and surrounding tissue; Figures 4B, 8A, 11A, 12, wherein, as depicted in Figs. 4B, 8A and 11A, the imaging probe (460) is above the treatment probe (450), and therefore the display modified in view of Mesaros et al. which can be at the first location with the first configuration would result in the treatment probe being located between the TRUS probe and the display).
With regards to claim 12, the above combined references disclose that the system further comprises a patient support (449) configured to support the patient with the patient support below the patient, the midline of the patient corresponding to a plane extending through the first display and the patient support (see Mantri et al., paragraph [0063], referring to the patient being positioned on a patient support (449) and wherein, as depicted in Figures 4-6 of Mesaros et al., the display can be positioned over the patient such that the midline of the patient corresponds to a plane extending through the display and the support; see Figures 2, 12 of Mantri et al.).
With regards to claim 15, Mantri et al. disclose that the positioning arm is coupled to a top of the console (paragraphs [0090], [0164]; see Figure 4A, wherein the arm supporting the display (425) is coupled to a top of the console (420); Figure 12).
With regards to claim 16, Mantri et al. disclose that the first display comprises a touchscreen display (paragraph [0221], referring to the touchscreen).
With regards to claim 17, Mesaros et al. disclose that the first display is movable with the positioning arm from the first location above the midline of the patient to the second location in which the second location comprises a stowed position above the console with the arm in the second configuration (paragraphs [0012]-[0014], referring to the articulating display arm (32) can be in a stowed position wherein it is locked in this position when the system is moved for safety purposes, wherein the stowed position can correspond to the second location away from the midline with the second configuration; Figures 4-6).
With regards to claim 18, Mantri et al. disclose that the console (i.e. console within 1213) has a length, a width, and a height, and wherein the first display (i.e. one of the displays (1220)) and the positioning arm (i.e. arm supporting the display (1220)) are configured to fit within the length and width of the console in a stowed position with the arm in the second configuration (paragraph [0164], referring to the system (1200) comprising one or more components of the system (400) and thus includes the console of system (400) as depicted in Figure 4; Figure 12).
With regards to claim 19, the limitation “wherein the first display is movable along the midline of the patient from between the legs of a patient to over the torso of the patient” is directed to an intended use and/or manner of operating the claimed system/touchscreen display. A recitation of the intended use of the claimed invention must result in a structural difference between the claimed invention and the prior art in order to patentably distinguish the claimed invention from the prior art. If the prior art structure is capable of performing the intended use, then it meets the claim. Since the display of Mesaros et al. is configured to move through a wide range of motion, including the capability of pivoting a full 360 degrees around the center axis of the articulation joint (36), and the base (12) further has wheels for moving the system (paragraphs [0011]-[0014]; Figures 4-6), the display of Mesaros et al. is capable of being moved along the midline of the patient from between the legs of a patient to over the torso of the patient and thus meets the above limitation.
With regards to claim 20, Mesaros et al. disclose that the first display is pivotable between a downward facing position and an upward facing position (paragraph [0012], referring to the display (30) being tilted upward and downward and being pivoted about a vertical axis with respect to a user; Figures 5-6).
With regards to claim 59, Mantri et al. disclose that the first display is configured to accept a first set of user inputs for controlling the treatment probe (paragraphs [0081], [0109], referring to the user interface allowing the user to manipulate the robotic arm motion via inputs provided to the user interface, such as the user effecting rotation, translation and/or adjustment of pitch angle of the treatment probe). Further, the above combined references require that the user interface is configured to mirror output of the first display on the second display, and therefore it would follow that, in the above combined references, the mirroring would result in the second display being configured to accept a second set of user inputs for controlling the treatment probe as well. Though the above combined references do not specifically disclose that the second set of user inputs is reduced when compared to the first set of user inputs, Gregerson et al. disclose that the user-interface component may control the display of system information and/or graphical user interface elements on the displays (121, 401), wherein the user commands received via one or more user input devices may enable a user to control various functions of the system, such as changing what is shown on the displays (121, 401) (e.g. displaying different menu options, etc.), and further that one display device (i.e. 401) may display a portion (versus all) of the same information as shown on the display device (121) (paragraphs [0054], [0058]). Therefore, it would have been obvious to one of ordinary skill in the art, through routine optimization, to have a user select a reduced second set of user inputs when compared to the first set of user inputs for the second display, in order to determine the optimal number of user inputs for presentation to the user of the second display that would provide a clear, uncluttered presentation.
Claim(s) 3-9 and 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mantri et al. in view of Mesaros et al., Gregerson et al. and Aljuri et al. as applied to claim 1 above, and further in view of Aljuri’646 (US Pub No. 2015/0057646).
With regards to claims 3 and 11, as discussed above, the above combined references meet the limitations of claim 1. Mantri et al. further disclose that the imaging device comprises an ultrasound imaging device configured to be inserted into the patient and the ultrasound imaging device comprises a transducer to generate a transverse image along a transverse image plane, the positioning arm configured to support the first display with the transverse image displayed on the first display above the patient (paragraph [0063], referring to the imaging probe (460) comprising a transrectal ultrasound (TRUS) probe configured for insertion into the rectum of the patient to view the patient’s prostate and surrounding tissue; paragraph [0115], referring to the imaging probe comprising a TRUS probe which may provide one or more images along the transverse plane; paragraph [0218], referring to the display being configured to display images of the treatment site from one or more views acquired by the imaging probe), and wherein the modified arm in view of Mesaros et al. is configured to support the first display above the patient, the transverse image shown on the display approximately parallel to the transverse image plane (paragraphs [0012]-[0014], referring to the articulating arm being able to pivot a full 360 degrees around the center axis of the articulation joint and the upper articulation joint (38) providing additional range of motion as it enables the upper arm section (52) to be pivoted about the joint, and thus the display can be positioned such that the transverse image of the above combined references can be shown on the display approximately parallel to the transverse image plane; Figures 3-6), and further the modified arm in view of Mesaros et al.
is configured to position the image of the above combined references shown on the display to within 30 degrees of parallel to a long axis of the longitudinal array and to within 30 degrees of perpendicular to a long axis of the transverse array (paragraphs [0012]-[0014], referring to the articulating arm being able to pivot a full 360 degrees around the center axis of the articulation joint and the upper articulation joint (38) providing additional range of motion as it enables the upper arm section (52) to be pivoted about the joint, and thus the display can be positioned such that the image of the above combined references can be shown on the display to within 30 degrees of parallel to a long axis of the longitudinal array and to within 30 degrees of perpendicular to a long axis of the transverse array; Figures 3-6).
However, the above combined references do not specifically disclose that the transducer is a transducer “array” and/or that the ultrasound imaging device comprises a transverse array and a longitudinal array [claim 11].
Aljuri’646 disclose a system for treating a patient and wherein live patient ultrasounds can be acquired using transrectal ultrasound (Abstract; paragraph [0438]). A probe can be operated so as to provide a real time determination of tissue removal profile, wherein the probe may comprise a transducer (392) that may comprise an ultrasound array to provide axial and transverse imaging (paragraphs [0398], [0403]; Figures 22A,B, note that such an array that provides axial/longitudinal and transverse imaging corresponds to a transverse array and a longitudinal array).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the transducer of the above combined references be a transducer array and/or that the ultrasound imaging device comprises a transverse array and a longitudinal array [claim 11], as taught by Aljuri’646, in order to successfully and effectively provide both axial and transverse imaging (paragraph [0403]).
With regards to claim 4, Mesaros et al. disclose that the transverse image plane is parallel to the transverse image shown on the first display to within 30 degrees (paragraphs [0012]-[0014], referring to the articulating arm being able to pivot a full 360 degrees around the center axis of the articulation joint and the upper articulation joint (38) providing additional range of motion as it enables the upper arm section (52) to be pivoted about the joint, and thus the display can be positioned such that the transverse image of the above combined references can be shown on the display approximately parallel to the transverse image plane to within 30 degrees; Figures 3-6).
With regards to claim 5, as discussed above, the above combined references meet the limitation of claim 1. Further, Mantri et al. disclose that the imaging device comprises an ultrasound imaging device configured to be inserted into the patient and the ultrasound imaging device comprises an ultrasound transducer configured to generate a longitudinal image (i.e. sagittal image) along a longitudinal plane, the arm configured to support the first display with the longitudinal image displayed on the first display above the patient (paragraph [0063], referring to the imaging probe (460) comprising a transrectal ultrasound (TRUS) probe configured for insertion into the rectum of the patient to view the patient’s prostate and surrounding tissue; paragraph [0115], referring to the imaging probe providing one or more images along the sagittal plane; paragraph [0218], referring to the display being configured to display images of the treatment site from one or more views acquired by the imaging probe).
However, Mantri et al. do not specifically disclose that the ultrasound transducer is an ultrasound “array”.
Aljuri’646 disclose a system for treating a patient and wherein live patient ultrasounds can be acquired using transrectal ultrasound (Abstract; paragraph [0438]). A probe can be operated so as to provide a real time determination of tissue removal profile, wherein the probe may comprise a transducer (392) that may comprise an ultrasound array to provide axial and transverse imaging (paragraphs [0398], [0403]; Figures 22A,B).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the transducer of the above combined references be an ultrasound array, as taught by Aljuri’646, in order to successfully and effectively provide both axial and transverse imaging (paragraph [0403]).
With regards to claim 6, Mantri et al. disclose that the longitudinal image plane (i.e. sagittal plane 950) corresponds to a plane extending from the ultrasound transducer array (paragraph [0139], referring to the imaging probe (460) comprising an elongate axis (461) which defines a sagittal image plane; Figures 9A-C, 11-12) and through a portion of the display, as modified by Mesaros et al. (paragraphs [0012]-[0014], referring to the articulating arm being able to pivot a full 360 degrees around the center axis of the articulation joint and the upper articulation joint (38) providing additional range of motion as it enables the upper arm section (52) to be pivoted about the joint, and thus the display of the above combined references can be positioned such that the longitudinal image plane of the above combined references corresponds to the plane extending from the ultrasound transducer array and through a portion of the display; Figures 3-6).
With regards to claim 7, Mantri et al. disclose that the ultrasound transducer array is rotatable about a longitudinal axis (i.e. “elongate axis of the ultrasound imaging probe”) in order to rotate an angle of the longitudinal image plane (i.e. “sagittal plane”) to view the treatment probe with the plane extending through the transducer array, the treatment probe and the first display (paragraph [0140], referring to the angle of the sagittal plane of the ultrasound imaging probe being rotated by rotating the ultrasound imaging probe about the elongate axis of the ultrasound imaging probe; Figures 9A-C).
With regards to claim 8, Mantri et al. disclose that the treatment probe (450) extends through at least a portion of the longitudinal image plane (950) (paragraphs [0139]-[0140], referring to the treatment probe (450) being substantially aligned with the sagittal image plane (950); Figures 9A-C).
With regards to claim 9, Mesaros et al. disclose that the longitudinal image shown on the first display is approximately perpendicular to the longitudinal image plane (paragraphs [0012]-[0014], referring to the articulating arm being able to pivot a full 360 degrees around the center axis of the articulation joint and the upper articulation joint (38) providing additional range of motion as it enables the upper arm section (52) to be pivoted about the joint, and thus the display of the above combined references can be positioned such that the longitudinal image of the above combined references can be shown on the display approximately perpendicular to the longitudinal image plane and optionally to within 30 degrees of perpendicular; Figures 3-6).
Claim(s) 13-14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mantri et al. in view of Mesaros et al., Gregerson et al. and Aljuri et al. as applied to claim 12 above, and further in view of Aljuri’743 (US Pub No. 2020/0170743).
With regards to claim 13, as discussed above, the above combined references meet the limitations of claim 12. Further, Mantri et al. disclose that the support comprises a plurality of stirrups to support legs of the patient splayed and feet of the patient above a torso of the patient (paragraphs [0067], [0228], referring to the stirrups on the end of the patient support; see Figures 1, 11C and 12).
However, the above combined references do not specifically disclose that the display is sized to fit between knees of the patient.
Aljuri’743 discloses a surgical drape configured for covering a patient and an ultrasonography probe during surgical treatment of the patient, wherein a graphical display (139) may be viewed through a drape, and, as depicted in Figures 7 and 11, the display (139) is sized to fit between the knees of the patient and allows a user/physician to manipulate one or more input/output devices that are connected to the graphical display (Abstract; paragraphs [0163]-[0164]; Figures 7, 11).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the display of the above combined references be sized to fit between knees of the patient, as taught by Aljuri’743, in order to allow the display to be positioned under the drape and thus protected while at the same time allowing the physician at the end of a support table to view the screen (paragraphs [0163]-[0164]; Figures 7, 11).
With regards to claim 14, Mesaros et al. disclose that the arm is configured to support the display above the patient and superiorly to a penile fenestration of a drape covering the patient (paragraphs [0012]-[0014], referring to the articulating arm being able to pivot a full 360 degrees around the center axis of the articulation joint and the upper articulation joint (38) providing additional range of motion as it enables the upper arm section (52) to be pivoted about the joint, and thus the display of the above combined references can be positioned above the patient and superiorly to a penile fenestration of a drape covering the patient; Figures 3-6).
Claim(s) 60 and 61 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mantri et al. in view of Mesaros et al., Gregerson et al. and Aljuri et al. as applied to claim 1 above, and further in view of Wu et al. (US Pub No. 2023/0114137).
With regards to claims 60 and 61, as discussed above, the above combined references meet the limitations of claim 1. Further, as set forth in the rejection of claim 1, the first display may be placed above the midline when in the first configuration, and further the first display may be touched by a user with an inherent first amount of force to input data (see rejection of claim 1, referring to the first configuration limitations and the user interface being configured to receive via the first display a plurality of user inputs in the first configuration limitations, wherein there exists an inherent force associated with the reception of the plurality of user inputs on the first display). However, they do not specifically disclose that the positioning arm is configured to resist movement of the first display when in the placed position above the midline and touched by the user with the first amount of force to input data; and allow movement of the first display when in the placed position above the midline and touched by the user with a second amount of force greater than the first amount of force; and wherein the positioning arm comprises one or more frictional pads configured to resist movement of the first display responsive to force that does not satisfy a threshold amount of force being applied and allow movement of the first display responsive to force satisfying the threshold amount of force being applied.
Wu et al. disclose a co-manipulation robotic system that is used for assisting with laparoscopic surgical procedures, wherein a surgical instrument coupling mechanism includes a friction pad (5132) of holder (5130) which may be sized and shaped to receive shaft (12a) of a surgical instrument (12) (Abstract; paragraphs [0007], [0055], [0397]-[0398], note that the surgical instrument coupling mechanism serves as part of a positioning arm; Figure 51). Longitudinal movement of the surgical instrument (12) is prevented unless the longitudinal force applied to surgical instrument (12) exceeds at least the friction forces applied to shaft (12a) by surgical instrument engagement portion (5126) and friction pad (5132) (Abstract; paragraphs [0007], [0055], [0397]-[0398]; Figure 51). The method includes switching from the passive mode to a co-manipulation mode in response to determining that a hold force exceeds a predetermined threshold, wherein the robot arm is freely movable in the co-manipulation mode responsive to movement at the handle of the surgical instrument while compensating for gravity of the surgical instrument (paragraph [0059]).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the positioning arm of the above combined references be configured to resist movement of the first display when in the placed position above the midline and touched by the user with the first amount of force to input data; and allow movement of the first display when in the placed position above the midline and touched by the user with a second amount of force greater than the first amount of force and wherein the positioning arm comprises one or more frictional pads configured to resist movement of the first display responsive to force that does not satisfy a threshold amount of force being applied and allow movement of the first display responsive to force satisfying the threshold amount of force being applied, as taught by Wu et al., in order to have the positioning arm be freely movable in response to excessive force by the user while compensating for gravity of the coupled display, thereby allowing the user to control through touch/force the position of the display (paragraph [0059]).
Claim(s) 62 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mantri et al. in view of Mesaros et al., Gregerson et al. and Aljuri et al., as applied to claim 1 above, and further in view of Fleck et al. (US Pub No. 2008/0237341).
With regards to claim 62, as discussed above, the above combined references meet the limitations of claim 1. Further, the above combined references disclose that when the first display is positioned at the first location, the first and second displays are facing in a same direction and the first display is configured to fit within dimensions of the console (see Mantri et al., Figure 12, wherein the two displays (1220) face the same direction and see Mesaros et al., Figure 1, wherein the display (30) fits within dimensions of the console and see Figures 3-6, wherein the display (30) is capable of being moved to multiple positions/orientations and thus is capable of facing the same direction as the second display of the above combined references).
However, the above combined references do not specifically disclose that, when the first display is positioned at the second location, the first and second displays are facing in an opposite direction and are configured to fit within dimensions of the console.
Fleck et al. disclose that visual displays (82, 84) of an apparatus for monitoring the entry of objects into a surgical field may face in an opposite direction and are configured to fit within dimensions of the console (60) in order to permit viewing from the other side (Abstract; paragraphs [0002], [0082]; Figures 3-4).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have, when the first display of the above combined references is positioned at the second location, the first and second displays are facing in an opposite direction and are configured to fit within dimensions of the console, as taught by Fleck et al., in order to permit viewing from the other side (paragraph [0082]).
Claim(s) 63 is/are rejected under 35 U.S.C. 103 as being unpatentable over Mantri et al. in view of Mesaros et al., Gregerson et al. and Aljuri et al. as applied to claim 1 above, and further in view of Masaki et al. (US Pub No. 2025/0143812).
With regards to claim 63, as discussed above, the above combined references meet the limitations of claim 1. Further, Mantri et al. further disclose that the system comprises a post extending upward from the console (1210) and a lateral segment coupled to the post, wherein the second display is mounted [indirectly] to the post (see Figure 12, note that the displays are at least indirectly, via the depicted lateral segment perpendicular to the post, mounted to the post).
However, the above combined references do not specifically disclose that the positioning arm is coupled to the post via the lateral segment.
Masaki et al. disclose a system for operating a robotic catheter system, wherein the system (1000) comprises a user interface unit which includes at least one of a main display (101-1) (a first user interface unit), a secondary display (101-2) (a second user interface unit), and a handheld controller (105) (a third user interface unit) (Abstract; paragraph [0040]; Figure 1). As depicted in Figure 1, the main display (101-1) is mounted to a post extending upward from the console and a lateral segment (i.e. first portion of the robotic arm (190) attached to the post) is coupled to the post, wherein the positioning arm (i.e. other portions of the robotic arm (190)) is coupled to the post via the lateral segment (paragraphs [0039]-[0040]; Figure 1).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the positioning arm of the above combined references be coupled to the post via the lateral segment, as taught by Masaki et al., as the above combined references require coupling the first display to the positioning arm and Masaki et al. teaches an effective, alternative known configuration for a positioning arm coupled to the first display that comprises having the positioning arm coupled to the post via the lateral segment. That is, using the known technique for coupling a display to a positioning arm, as desired by the above combined references, by coupling the positioning arm to the post via the lateral segment, as taught by Masaki et al., would have been obvious to one of ordinary skill in the art.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1, 3-20 and 59-63 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Aljuri has been introduced to teach the secondary user interface being configured to receive annotations associated with treatment planning without controlling the treatment probe.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KATHERINE L FERNANDEZ whose telephone number is (571)272-1957. The examiner can normally be reached Monday-Friday 9:00 AM - 5:30 PM (ET).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Pascal Bui-Pho can be reached at (571) 272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KATHERINE L FERNANDEZ/Primary Examiner, Art Unit 3798