DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/20/2026 has been entered.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-5, 7-22 and 24-27 is/are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Publication No. 2021/0307831 to Fuerst et al. ("Fuerst") in view of U.S. Publication No. 2020/0261159 to Johnson et al. ("Johnson") and U.S. Publication No. 2017/0108930 to Banerjee et al. ("Banerjee").
As for Claims 1, 5, 9, 11-12, 17-20, 22, and 24-27, Fuerst discloses a mobile virtual reality system and method for simulation, training or demonstration of a surgical robotic system (Abstract) comprising at least one input device configured to receive input from an operator (Paragraphs [0029]-[0030]) and a processor (VR processor 42 in Fig. 2, 62 in Fig. 4 and corresponding descriptions) configured to generate a virtual robotic system and environment depicted in Fig. 1 (Paragraphs [0026]-[0028] and [0034]-[0036]) so the user may interact with various robots and tools in the virtual environment. Examiner notes that the virtual robotic system and operating environment disclosed above would include one or more virtual robotic arms, each coupled with a virtual surgical instrument at the distal end of the robot, and a virtual operating table about which the virtual robotic arms are arranged, under its broadest reasonable interpretation. Fuerst discloses wherein the operator's profile may be stored in a database and retrieved based on log-in information (Paragraphs [0041]-[0042]). Next, the operator may select an "exercise" that simulates one or more surgical procedures in the virtual environment (Paragraph [0043]). Such disclosures are considered to read on the claimed limitations of extracting relevant data, based upon the received input, from a database stored on a server, wherein the database can include at least one of a diagnostic scan and patient details for one or more patients or a virtual tutorial for one or more surgical procedures, and rendering and manipulating the relevant data based upon the operator's input to perform the simulated exercise of the surgical procedure (Paragraph [0045]) in its broadest reasonable interpretation.
However, Fuerst does not specify that the display is a stereoscopic display as claimed. Furthermore, while Fuerst’s system and method would include diagnostic images to facilitate the surgical procedure, Fuerst does not expressly disclose patient specific diagnostic data used for the virtual patient data.
Johnson teaches from within a similar field of endeavor with respect to surgical training systems and methods (Abstract; Paragraphs [0007] and [0014]) wherein the data may be provided on a stereoscopic display (Paragraph [0047]). Johnson also explains that the virtual reality processor may be in communication with a patient records database in order to collect patient data (Paragraph [0058]) and to overlay (e.g. superimpose) a "portal" to enable views of actual organs (Paragraph [0119]).
Accordingly, one skilled in the art would have been motivated to have incorporated any conventional display means as described by Johnson (e.g. stereoscopic display) with a superimposed model of internal organs and tissues into Fuerst's system and method in order to provide an accurate and immersive representation of real tissue/organs to enhance the display of the simulated surgical exercise (Johnson; Paragraph [0119]). Such a modification merely involves combining prior art elements according to known techniques to yield predictable results (MPEP 2143). One skilled in the art would have also been motivated to provide access to a patient records database as described by Johnson in order to provide the aforementioned immersive and realistic simulation, as such a modification requires nothing more than combining prior art elements according to known techniques to yield predictable results (MPEP 2143).
While the modified system and method utilizes patient diagnostic data to enhance the representations of anatomy, Fuerst and Johnson do not expressly disclose the conventional technique of converting a 2D scan into a 3D model using segmentation logic as claimed.
Banerjee teaches from within a similar field of endeavor with respect to virtual reality simulation systems and methods (Abstract) where the system can perform pre-processing on patient specific medical imaging data, such that DICOM or other input data may be received originally in the form of two-dimensional slices of patient anatomy and, during pre-processing, the segment anatomy processing module can convert the 2D data into a 3D format (e.g. model; Paragraph [0049]).
Accordingly, one skilled in the art would have been motivated to have modified the virtual reality simulation system and method described by Fuerst and Johnson to incorporate Banerjee's pre-processing segmentation module, which converts 2D data into a 3D format, in order to enhance the quality of displayed simulation data and improve user experience. Examiner notes that in the modified system and method, the 3D model of patient anatomy from diagnostic images would allow the user to identify an exact position and orientation (e.g. virtual position and orientation) during the simulated surgery in its broadest reasonable interpretation.
Regarding Claims 2-3, Fuerst discloses wherein the system may include more than one handheld input unit (Paragraph [0029]) including an IMU sensor (Paragraph [0030]).
Regarding Claim 4, Fuerst explains that the system and method may utilize a database to store and retrieve data when needed (Paragraph [0042]) and servers (Paragraph [0050]). Such an arrangement is considered to read on at least one of a local or cloud database in its broadest reasonable interpretation.
Regarding Claim 7, Fuerst explains where the operator may select and/or change the patient body model in order to gain proficiency (Paragraph [0044]).
As for Claim 8, Johnson discloses wherein the patient specific diagnostic data may include x-ray, MRI, CT, ultrasound, etc. (Paragraph [0058]).
Regarding Claim 10, Examiner notes that Fuerst explains wherein the virtual exercises may be assigned to a user for training and simulation (Paragraph [0043]). Examiner notes that once the training is completed, the database may be modified to reflect the completed training and simulation in its broadest reasonable interpretation.
With respect to Claims 13-16, Fuerst discloses wherein the display may include a 2D screen, 3D screen, volumetric 3D display, digital hologram display, 3D wearable display on a user's head, or be housed in a device such as a laptop (Paragraph [0028]).
Regarding Claim 21, Fuerst explains that Fig. 1 depicts a surgical robotic system and environment which includes draped objects, robotic arms stowed or withdrawn, manipulation of robotic arms to perform tasks, tool exchange and, when the procedure is done, post-operative procedures such as cleaning and sterilization (Paragraph [0021]). Accordingly, one skilled in the art would have been motivated to have enabled the operator to practice any/all steps in a robotic procedure in order to gain proficiency and enhance patient safety.
Response to Arguments
Applicant's arguments with respect to claim(s) 1-5, 7-22 and 24-27 have been considered but are not persuasive. Applicant generally argues that the combination of references fails to disclose converting a 2D scan into a 3D model using segmentation logic and superimposing the 3D model corresponding to an anatomical organ onto the virtual patient model to enable identification of a position and orientation of the organ during the simulated procedure (REMARKS, Pages 11-12) because Fuerst does not disclose converting a 2D scan into a 3D model nor superimposing the 3D model onto the virtual patient to enable identification of the organ's position and orientation (REMARKS, Page 12), Johnson does not teach converting a 2D scan into a 3D model using segmentation, and Banerjee does not teach a virtual robotic surgery environment, virtual robotic arms or robotic surgical instruments, a virtual patient model positioned on a virtual operating table, or superimposing the 3D model onto a virtual patient model for enabling identification of organ position and orientation (REMARKS, Pages 13-14). In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Examiner respectfully notes the rejection concedes that Fuerst does not specify that the display is a stereoscopic display as claimed and, furthermore, that while Fuerst's system and method would include diagnostic images to facilitate the surgical procedure, Fuerst does not expressly disclose patient specific diagnostic data used for the virtual patient data. Johnson and Banerjee disclose the missing elements, and the rejection articulates why one skilled in the art would have been motivated to combine the teachings.
Applicant also argues the proposed changes would fundamentally change Fuerst's system architecture (REMARKS, Page 13). Examiner respectfully disagrees and notes the proposed modifications would enhance the realistic simulation, merely add capability to Fuerst's system, and still preserve the intended purpose of Fuerst (e.g. simulating a virtual procedure).
Finally, Applicant argues the argued features are more than a mere display because they require registration of patient specific anatomical data with the virtual patient avatar, require maintaining a shared coordinate framework between the reconstructed anatomy, the virtual patient model and the virtual robotic instruments, and solve a different technical problem (REMARKS, Pages 14-15). Examiner respectfully notes such arguments are not commensurate with the scope of the claims, as the claims fail to require such limitations. Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993) (MPEP 2145).
Thus, the rejections have been maintained.
Conclusion
All claims are identical to or patentably indistinct from, or have unity of invention with claims in the application prior to the entry of the submission under 37 CFR 1.114 (that is, restriction (including a lack of unity of invention) would not be proper) and all claims could have been finally rejected on the grounds and art of record in the next Office action if they had been entered in the application prior to entry under 37 CFR 1.114. Accordingly, THIS ACTION IS MADE FINAL even though it is a first action after the filing of a request for continued examination and the submission under 37 CFR 1.114. See MPEP § 706.07(b). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER L COOK whose telephone number is (571)270-7373. The examiner can normally be reached M-F approximately 8AM-5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Kozak can be reached on 571-270-0552. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTOPHER L COOK/Primary Examiner, Art Unit 3797