DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/13/2026 has been entered.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-3, 5, 7-10, 12, 14-17 and 19-21 is/are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Publication No. 2024/0249487 to Jaskola in view of WO 2022/249190 to Ben-Yishai and U.S. Publication No. 2020/0093611 to Forstein et al. (hereinafter "Forstein").
As for Claims 1, 8 and 15, Jaskola discloses a surgical navigation system and method for manipulating laparoscopic images (e.g. 2D) and overlaid 3D models (Abstract), where the 3D model may be generated prior to surgery and is of at least one anatomical structure (Paragraph [0016]). Jaskola discloses a step of detecting the tool attached to a surgical instrument in the laparoscopic images, the tool having a pose (e.g. location and orientation) (Paragraph [0021]). Based on the tool's detected pose, Jaskola adjusts the visual representation of the 3D model to avoid obscuring the surgical instrument in the 2D images and/or to display the model with highly contrasting colors or brightness (Paragraphs [0021] and [0033]-[0037]), which is considered to read on identifying a "2D focused region" within the 2D images where the region is directly faced by the tip of the surgical instrument. As for the "3D focused region," Jaskola discloses where the computer analyzes the laparoscopic images relative to the visibility of an object (e.g. 2D anatomic features), calculates a placement of a pre-generated 3D model of the object relative to the location of the object in the laparoscopic images, generates a visual representation of the 3D model, and generates a composite image using the laparoscopic images and the 3D model (Paragraphs [0025]-[0029]). Jaskola explains the focused information may include changing color or brightness information of the model to provide the surgeon with a clear picture of which part of the composite image is the laparoscopic image and which part is the 3D model (Paragraph [0053]). Examiner notes the modified model (e.g. more transparent or cut out for the instrument and/or bright contours) would read on the claimed limitations of generating focused information based on the type of tool, location of tool and pose of tool in its broadest reasonable interpretation.
Moreover, the modified model would comprise at least a portion of the 3D model based on a current surgical task of the instrument occurring within the 2D focused region and the 3D focused region in its broadest reasonable interpretation. Jaskola's computer system analyzes frames of laparoscopic images with a frame grabber to make the aforementioned changes to the focused information (Paragraphs [0024]-[0028] and [0038]), and Jaskola explains the laparoscopic images can be manipulated in real time (Paragraph [0015]), which is considered to read on dynamically adjusting the focused information when the tool moves in its broadest reasonable interpretation.
However, Jaskola does not expressly disclose wherein the 3D model comprises information on a pre-planned surgery trajectory with planned cut points on the surface of the organ as now claimed.
Ben-Yishai teaches from within a similar field of endeavor with respect to surgical navigation systems and methods (Abstract) where a computer may superimpose guidance information onto preoperative and/or intraoperative images to aid the user during surgery (Page 18, Line 14; Page 20, Lines 17-25). Ben-Yishai teaches the guidance information may be 3D guidance information displayed on live images and may include a trajectory of a medical tool (Page 33, Lines 19-35; Page 35, Lines 15-30).
Forstein teaches from within a similar field of endeavor with respect to computer-assisted surgical procedures (Paragraph [0002]) where a planning computer may plan a surgical procedure pre-operatively or intraoperatively using medical imaging data and constructed 3D virtual models (Paragraph [0053]). Forstein explains the planning computer is programmed to determine insertion trajectories and operational data, such as a set of instructions for modifying a volume of tissue defined relative to the anatomy, including a set of cutting parameters (e.g. cut paths, cut velocities) in a cut-file to modify the volume of bone (Paragraph [0053]).
Accordingly, one skilled in the art would have been motivated to have modified Jaskola's system and method, particularly the overlaid 3D model, to display a guidance trajectory and cut points as described by Ben-Yishai and Forstein in order to enhance the accuracy of the surgical procedure. Such a modification merely involves combining prior art elements according to known techniques to yield predictable results (MPEP 2143).
As for Claims 2-3, 9-10 and 16-17, Jaskola discloses a surgical navigation system and method where a computer may detect a surgical instrument in the laparoscopic image using object recognition and edge detection (Paragraph [0021]). Examiner notes the object recognition would provide a classification result based on extracted features from the 2D images relating to tools that can be deployed in surgery in its broadest reasonable interpretation. Furthermore, Jaskola discloses where a trained AI model may be used to identify the type, location and orientation of specific organs and, if applicable, the type and/or orientation of a surgical instrument in the laparoscopic images (Paragraphs [0062]-[0067]).
Regarding Claims 5, 12, and 19, Examiner notes the different display options for the focused information as explained above read on types of focused information and would depend on the type of tool identified during surgery in its broadest reasonable interpretation.
With respect to Claims 7, 14 and 21, Jaskola discloses, in one embodiment, where the contour of the 3D model is highlighted (e.g. highly contrasting colors or brightness; Paragraph [0023]).
As for Claim 20, Jaskola discloses wherein the computer system is configured to "match" 3D models of organs or organ structures to laparoscopic images, and once the location and orientation of the 3D model is established, the computer or frame grabber can form a composite image displayed on the screen (Paragraph [0055]). Examiner notes the matching step performed on the computer is considered to read on a dynamic registration unit for registering a 3D model of at least one anatomical structure captured with the 2D images in its broadest reasonable interpretation. In addition, as described above, Jaskola's system modifies the 3D model overlay in order to provide focus to an area in proximity to the surgical instrument, which is considered to read on the focused overlay renderer as claimed in its broadest reasonable interpretation. Examiner notes that the highlighted contours of the organ may serve as a visual guide depending on the type of procedure, and the modified model including the planned trajectory and cut points would read on the claimed visual guides.
Claim(s) 1-3, 5, 7-10, 12, 14-17 and 19-21 is/are alternatively rejected under 35 U.S.C. 103 as being unpatentable over Jaskola in view of U.S. Publication No. 2025/0000591 to Ikits et al. (hereinafter "Ikits").
As for Claims 1, 8 and 15, Jaskola discloses a surgical navigation system and method for manipulating laparoscopic images (e.g. 2D) and overlaid 3D models (Abstract), where the 3D model may be generated prior to surgery and is of at least one anatomical structure (Paragraph [0016]). Jaskola discloses a step of detecting the tool attached to a surgical instrument in the laparoscopic images, the tool having a pose (e.g. location and orientation) (Paragraph [0021]). Based on the tool's detected pose, Jaskola adjusts the visual representation of the 3D model to avoid obscuring the surgical instrument in the 2D images and/or to display the model with highly contrasting colors or brightness (Paragraphs [0021] and [0033]-[0037]), which is considered to read on identifying a "2D focused region" within the 2D images where the region is directly faced by the tip of the surgical instrument. As for the "3D focused region," Jaskola discloses where the computer analyzes the laparoscopic images relative to the visibility of an object (e.g. 2D anatomic features), calculates a placement of a pre-generated 3D model of the object relative to the location of the object in the laparoscopic images, generates a visual representation of the 3D model, and generates a composite image using the laparoscopic images and the 3D model (Paragraphs [0025]-[0029]). Jaskola explains the focused information may include changing color or brightness information of the model to provide the surgeon with a clear picture of which part of the composite image is the laparoscopic image and which part is the 3D model (Paragraph [0053]). Examiner notes the modified model (e.g. more transparent or cut out for the instrument and/or bright contours) would read on the claimed limitations of generating focused information based on the type of tool, location of tool and pose of tool in its broadest reasonable interpretation.
Moreover, the modified model would comprise at least a portion of the 3D model based on a current surgical task of the instrument occurring within the 2D focused region and the 3D focused region in its broadest reasonable interpretation. Jaskola's computer system analyzes frames of laparoscopic images with a frame grabber to make the aforementioned changes to the focused information (Paragraphs [0024]-[0028] and [0038]), and Jaskola explains the laparoscopic images can be manipulated in real time (Paragraph [0015]), which is considered to read on dynamically adjusting the focused information when the tool moves in its broadest reasonable interpretation.
However, Jaskola does not expressly disclose wherein the 3D model comprises information on a pre-planned surgery trajectory with planned cut points on the surface of the organ as now claimed.
Ikits teaches from within a similar field of endeavor with respect to computer-aided surgical systems and methods where a virtual bone model and surgical plan are provided that define a planned cut such that a visualization of execution of the planned cut is generated (Paragraphs [0006]-[0007]). Fig. 15 depicts a 3D bone model 1500 and a planned bone modification plane 1502 (Paragraphs [0072]-[0073]). Ikits explains a surgical plan is provided which indicates a planned cut, cut plan or bone modification relative to the bone model (Paragraph [0074]), and real-time visualization of the execution of the planned bone modification is provided that either removes "cut" voxels or shades them a different color (Paragraph [0079]). Such disclosures are considered to read on the claimed limitations of a 3D model comprising information on a pre-planned surgery trajectory with planned cut points on the surface of the anatomy in its broadest reasonable interpretation.
Accordingly, one skilled in the art would have been motivated to have modified Jaskola's system and method, particularly the overlaid 3D model, to display a guidance trajectory and cut points as described by Ikits in order to enhance the accuracy of the surgical procedure. Such a modification merely involves combining prior art elements according to known techniques to yield predictable results (MPEP 2143).
As for Claims 2-3, 9-10 and 16-17, Jaskola discloses a surgical navigation system and method where a computer may detect a surgical instrument in the laparoscopic image using object recognition and edge detection (Paragraph [0021]). Examiner notes the object recognition would provide a classification result based on extracted features from the 2D images relating to tools that can be deployed in surgery in its broadest reasonable interpretation. Furthermore, Jaskola discloses where a trained AI model may be used to identify the type, location and orientation of specific organs and, if applicable, the type and/or orientation of a surgical instrument in the laparoscopic images (Paragraphs [0062]-[0067]).
Regarding Claims 5, 12, and 19, Examiner notes the different display options for the focused information as explained above read on types of focused information and would depend on the type of tool identified during surgery in its broadest reasonable interpretation.
With respect to Claims 7, 14 and 21, Jaskola discloses, in one embodiment, where the contour of the 3D model is highlighted (e.g. highly contrasting colors or brightness; Paragraph [0023]).
As for Claim 20, Jaskola discloses wherein the computer system is configured to "match" 3D models of organs or organ structures to laparoscopic images, and once the location and orientation of the 3D model is established, the computer or frame grabber can form a composite image displayed on the screen (Paragraph [0055]). Examiner notes the matching step performed on the computer is considered to read on a dynamic registration unit for registering a 3D model of at least one anatomical structure captured with the 2D images in its broadest reasonable interpretation. In addition, as described above, Jaskola's system modifies the 3D model overlay in order to provide focus to an area in proximity to the surgical instrument, which is considered to read on the focused overlay renderer as claimed in its broadest reasonable interpretation. Examiner notes that the highlighted contours of the organ may serve as a visual guide depending on the type of procedure, and the modified model including the planned trajectory and cut points would read on the claimed visual guides.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-3, 5, 7-10, 12, 14-17 and 19-21 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-24 of copending Application No. 18/172428 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because both sets of claims are directed to a system and method for receiving 2D images of anatomical structures and a surgical instrument, detecting 2D locations of the tool within the images, determining the type of tool based on the detection and determining focused information from a 3D model to assist the user in performing a surgical task. While the presently pending claims provide a registration step, Examiner notes that the ‘428 application would include this registration in order to align the 3D model with the 2D data. Furthermore, while the presently pending claims recite cut points, the ‘428 application discloses a surgical knife and focused information relating to the surgical knife (Claim 2) and generating a visual guide (Claim 5). Thus, the claims are not considered to be patentably distinct.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Response to Arguments
Applicant's arguments with respect to claim(s) 1-3, 5, 7-10, 12, 14-17 and 19-21 have been considered but are moot in view of the updated grounds of rejection necessitated by amendment. However, Examiner will address Applicant's remarks which may still pertain to the rejection above. For example, Applicant argues the Office action interprets the claims and asserts Jaskola corresponds to those limitations and "…Applicant cannot reasonably determine which 'element' of the claim is believed to correspond with which section, figure or feature of Jaskola" (REMARKS, page 9). Examiner respectfully disagrees and notes the rejection above is clear as to which paragraphs in Jaskola read on each claim limitation. Specifically, Applicant asserts the rejection fails to address the limitation that recites focused information comprises the type of tool, the location of the tool and orientation of the tool (REMARKS, page 10). Examiner respectfully disagrees and directs Applicant's attention to the rejection, where Jaskola discloses a step of detecting the tool attached to a surgical instrument in the laparoscopic images, the tool having a pose (e.g. location and orientation) (Paragraph [0021]). Based on the tool's detected pose, Jaskola adjusts the visual representation of the 3D model to avoid obscuring the surgical instrument in the 2D images and/or to display the model with highly contrasting colors or brightness (Paragraphs [0021] and [0033]-[0037]), which is considered to read on identifying a "2D focused region" within the 2D images where the region is directly faced by the tip of the surgical instrument. Examiner notes the modified model (e.g. more transparent or cut out for the instrument and/or bright contours) would read on the claimed limitations of generating focused information based on the type of tool, location of tool and pose of tool in its broadest reasonable interpretation.
In other words, the unobscured view of the surgical tool depends on the type of tool, the location of the tool and the orientation of the tool in its broadest reasonable interpretation.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: U.S. Publication No. 2021/0298830 to Lennartz et al., which discloses overlaying virtual cut lines on a 3D model (Abstract; Paragraphs [0012] and [0018]).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER L COOK whose telephone number is (571)270-7373. The examiner can normally be reached M-F approximately 8AM-5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anne Kozak, can be reached at 571-270-0552. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTOPHER L COOK/Primary Examiner, Art Unit 3797