DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The replacement drawings filed 20 October 2025 are in good order and overcome the drawing objections.
The amended title accepts the previously suggested title and overcomes the title objection.
The amendments to claim 11 and the cancellation of claim 12 overcome the rejections under 35 U.S.C. 112(b).
Response to Arguments
Applicant's arguments filed 20 October 2025 have been fully considered but they are not persuasive.
The core of Applicant’s argument is that Mak’s region of interest and resulting focus location is a general area rather than a specific anatomical feature such as a vessel, tumor, etc. Applicant further argues that the region of interest in Mak is a smaller area within the image and admits that conventional autofocus techniques can be used to determine the focus thereof. Mak is further argued as having a preoperative treatment plan that identifies regions of interest that “may be interpreted as user input” and that focus is determined based on this “user input”.
In response, Mak’s disclosure is not so limited as Applicant’s arguments suggest and does indeed disclose determining an anatomical feature of interest within the area of interest, including during the surgical microscope’s abnormal/emergency phase, which detects bleeding, ballooning of a vessel, a blood clot, etc. [0085]-[0086]. Blood clots, ballooning of a vessel, brain swelling, bleeding, etc. are considered to be “anatomical features of interest,” and indeed these particular anatomical features are so important as to engage an emergency phase/mode that triggers an autofocus functionality of the microscope to focus on the position of the anatomical feature of interest. See [0080], [0084]-[0091], [0095]-[0096], [0102], [0107]-[0113], control of the system to maintain a certain region of interest in focus including, in emergency bleeding mode, higher magnification and focus on a region in which bleeding/blood clot/vessel has been detected.
Moreover, these anatomical features of interest (blood clots, ballooning of a vessel, brain swelling, bleeding, etc.) are not manual inputs but are instead detected by image analysis techniques as clearly disclosed in [0086] of Mak.
Curiously, Applicant’s arguments do not address these previously-mapped disclosures of Mak. Nor does Applicant substantively address Uyama which was and continues to be applied for the highly related and more detailed concept of locating and tracking the position of the anatomical feature of interest over a plurality of frames of the imaging sensor data.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 8, 13-15, and 17-19 are rejected under 35 U.S.C. 102(a)(1) and (a)(2) as being anticipated by Mak (US 2019/0324252 A1).
Claim 1
In regards to claim 1, Mak discloses a system for a microscope of a surgical microscope system {Fig. 3 copied below including surgical microscope system 500, abstract, title and cites below}, the system comprising one or more processors and one or more storage devices {Fig. 3, control and processing unit 300 including processors 302, memory 304, and processing engines 370 and including computer-readable medium embodiments as per [0045]-[0050]}, wherein the system is configured to:
obtain imaging sensor data from at least one optical imaging sensor of the microscope {surgical microscope 500 includes cameras/3D scanner 535, 545 operating in different spectra including visible, UV, fluorescence, [0051]-[0052], that is obtained by the controller 530 or an external processor such as a workstation of system 205, fig. 6, [0058]-[0065]};
determine information on an area of interest of a user of the surgical microscope system based on an input of the user {see [0095]-[0096], [0034]-[0039] including tool tracking mode that performs autofocus on a region indicated by a laser pointer; and/or a tracked pointer including a tool being used by the surgeon; and/or semi-manual control mode that determines the focus region based on potential features of interest which include regions (areas of interest) that are identified in the image based on a preoperative treatment plan input by the user; and/or selecting a region of interest to perform autofocusing based on surgeon-specific preferences such as from historical usage logs (based on historical user inputs); and/or the surgeon’s manual selection from among several focus regions determined/identified by the system};
determine an anatomical feature of interest within the area of interest
{see the abnormal/emergency phase that determines whether the patient has bleeding, ballooning of a vessel, a blood clot, etc. [0085]-[0086]};
detect a position of the anatomical feature of interest within the imaging sensor data {detect bleeding, vessel or blood clot position [0086] and/or detect location of fluorescence in image to perform autofocus, [0095]}; and
trigger an autofocus functionality of the microscope to focus on the position of the anatomical feature of interest {see [0080], [0084]-[0091], [0095]-[0096], [0102], [0107]-[0113], control of the system to maintain a certain region of interest in focus including, in emergency bleeding mode, higher magnification and focus on a region in which bleeding/blood clot/vessel has been detected}.
[Mak, Fig. 3 (media_image1.png) reproduced here in greyscale]
Claim 8
In regards to claim 8, Mak discloses wherein the system is configured to perform the object detection to identify at least one of a blood vessel, branching points of a blood vessel,
a bleeding {see above mapping for claim 1 including bleeding detection, [0086]}, and
a tumor within at least the portion of the imaging sensor data representing the area of interest {regarding tumor object detection see Yip (Yip, Michael C., et al. "Tissue tracking and registration for image-guided surgery." IEEE Transactions on Medical Imaging 31.11 (2012): 2169-2182), which is cited but not applied to this list of optional features in order to help guide Applicant and the public}.
Claim 13
In regards to claim 13, Mak discloses
wherein the system is configured to detect a pointer operated by the user within the imaging sensor data to determine the area of interest
{see [0095]-[0096], [0034]-[0039] including tool tracking mode that performs autofocus on a region indicated by a laser pointer; and/or a tracked pointer including a tool being used by the surgeon; and/or the surgeon’s manual selection from among several focus regions determined/identified by the system}, or
wherein the system is configured to determine a portion of a surgical site being operated on by the user within the imaging sensor data to determine the area of interest
{see [0095]-[0096], [0034]-[0039] including tool tracking mode that performs autofocus on a surgical site region indicated by a laser pointer; and/or a tracked pointer including a tool being used by the surgeon; and/or semi-manual control mode that determines the focus region based on potential features of interest which include regions (areas of interest) that are identified in the image based on a preoperative treatment plan input by the user; and/or selecting a region of interest to perform autofocusing based on surgeon-specific preferences such as from historical usage logs (based on historical user inputs); and/or the surgeon’s manual selection from among several focus regions determined/identified by the system}, or
wherein the system is configured to determine the area of interest using a gaze tracking mechanism {D1, Piron WO 2018/076094, [00111] is cited but not applied for gaze tracking, voice command, and user interface}, or
wherein the system is configured to determine the area of interest based on a voice description of an anatomical feature obtained via a voice command system of the surgical microscope system {see D1, [0111] above being cited but not applied}, or
wherein the system is configured to determine the area of interest based on a user input signal obtained via a user interface of the surgical microscope system {see D1, [0111] above being cited but not applied}, or
wherein the system is configured to determine the area of interest from a predetermined image area after the user finishes aligning the field of view {see Mak generally, which includes preparing the patient for surgery, which involves the user aligning the patient’s surgical site and the camera until “finished,” such that an area of interest is determined by the above-cited technique(s) after the user finishes aligning the field of view}.
Claim 14
In regards to claim 14, Mak discloses wherein the system is configured to perform image segmentation and/or object detection to determine a plurality of features within the imaging sensor data, to determine a visual representation of the plurality of features, provide a display signal comprising the visual representation to a display device of the surgical microscope system, and to obtain the input of the user in response to the visual representation of the plurality of features
{see [0095]-[0096], [0034]-[0039] including tool tracking mode that performs autofocus on a region indicated by a laser pointer; and/or a tracked pointer including a tool being used by the surgeon; and/or semi-manual control mode that determines the focus region based on potential features of interest which include regions (areas of interest) that are identified/segmented/detected in the image based on a preoperative treatment plan input by the user; and/or selecting a region of interest to perform autofocusing based on surgeon-specific preferences such as from historical usage logs (based on historical user inputs); and/or the surgeon’s manual selection from among several focus regions determined/identified by the system. As to determining a visual representation for a display see Fig. 3, displays 206, 211, Figs. 8A-B, [0031], and as to obtaining user input see [0032] in which the display is touch-sensitive}.
Claim 15
In regards to claim 15, Mak discloses wherein the imaging sensor data comprises a first component with color imaging data and a second component with at least one of hyperspectral imaging data, multispectral imaging data, fluorescence imaging data {surgical microscope 500 includes cameras/3D scanner 535, 545 operating in different spectra including visible, UV, fluorescence, [0051]-[0052], that is obtained by the controller 530 or an external processor such as a workstation of system 205, fig. 6, [0058]-[0065]},
wherein the system is configured to determine the anatomical feature of interest and/or to detect the position of the anatomical feature of interest at least based on the second component {detect location of fluorescence in image to perform autofocus, [0095]}.
Claim 17
In regards to claim 17, Mak discloses a surgical microscope system comprising a microscope with an optical imaging sensor and the system according to claim 1 {see above mapping for claim 1 including fig. 3}.
Claim 18
The rejection of device claim 1 above applies mutatis mutandis to the corresponding limitations of method claim 18 while noting that the rejection above cites to both device and method disclosures.
Claim 19
In regards to claim 19, Mak discloses a non-transitory, computer-readable medium comprising a program code that, when the program code is executed on a processor, a computer, or a programmable hardware component, causes the processor, computer, or programmable hardware component to perform the method of claim 18
{Fig. 3, control and processing unit 300 including processors 302, memory 304, and processing engines 370 and including computer-readable medium embodiments as per [0045]-[0050]}.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2, 3, 5-7, and 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over Mak and Uyama (US 20210345856 A1).
Claim 2
In regards to claim 2, Mak is relied upon as in the mapping for claim 1 above. Although Mak clearly tracks the position of medical instruments and the surgical field via trackable markers 212 on the patient, such disclosures may not be sufficient to encompass tracking the position of the anatomical feature of interest over time (a plurality of frames), as the markers 212 on the patient are not clearly movable but instead provide a frame of reference, and a medical instrument is not an anatomical feature of interest, even though such instruments are clearly tracked over time and used to adjust/trigger the autofocus on the moved/tracked instrument.
Uyama is from the same field of surgical microscope systems. See abstract, figures, and cites below. Uyama employs a camera system that detects a region of interest to continuously adjust AF (autofocus) even when there is camera-patient motion. See Fig. 5 copied below, Fig. 4, [0079]-[0088].
[Uyama, Fig. 5 (media_image2.png) reproduced here in greyscale]
Uyama also teaches wherein the system is configured to track the position of the anatomical feature of interest over a plurality of frames of the imaging sensor data {see [0214] in which the region-of-interest setting unit 20 may include a classifier that classifies surgical field images into anatomical regions of interest that may then be used for autofocusing as per Fig. 5, [0079]-[0088], [0100], [0126]-[0128], [0183]}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Mak, which already tracks the position of non-anatomical features (medical instruments) over time to adjust/trigger the autofocus on the moved/tracked instrument, such that the tracked autofocus functionality also applies to anatomical features (wherein the system is configured to track the position of the anatomical feature of interest over a plurality of frames of the imaging sensor data) as taught by Uyama because doing so dynamically adjusts the autofocus to compensate for motion of the anatomical features, thereby ensuring that the surgeon is provided a clear, in-focus image despite such motion as motivated by Uyama in [0123]-[0135]; because there is a reasonable expectation of success and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Claim 3
In regards to claim 3, Mak is not relied upon to disclose but Uyama teaches wherein the system is configured to trigger the autofocus functionality if the position of the anatomical feature of interest shifts relative to a field of view of the imaging sensor data for at least a pre-defined time interval {see above cites for claim 2, particularly the continuous adjustment of AF (autofocus) such that AF is triggered/continuously adjusted whenever the position of the anatomical feature shifts, including position shifts (motion) that occur for at least a pre-defined time interval, fig. 4, [0002]-[0005], [0081], [0084], [0088], even when there is camera-patient motion. See Fig. 5 copied above, Fig. 4, [0079]-[0088]}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Mak, which already tracks the position of non-anatomical features (medical instruments) over time to adjust/trigger the autofocus on the moved/tracked instrument, such that the tracked autofocus functionality also applies to anatomical features (wherein the system is configured to track the position of the anatomical feature of interest over a plurality of frames of the imaging sensor data) as taught by Uyama, and such that the system is configured to trigger the autofocus functionality if the position of the anatomical feature of interest shifts relative to a field of view of the imaging sensor data for at least a pre-defined time interval as also taught by Uyama, because doing so dynamically adjusts the autofocus to compensate for motion of the anatomical features, thereby ensuring that the surgeon is provided a clear, in-focus image despite such motion as motivated by Uyama in [0123]-[0135]; because there is a reasonable expectation of success and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Claim 5
In regards to claim 5, Mak discloses wherein the system is configured to locate the area of interest within the imaging sensor data {see above mapping for claim 1}, but is not relied upon to disclose determining the anatomical feature of interest within a portion of the imaging sensor data representing the area of interest.
Uyama teaches wherein the system is configured to locate the area of interest within the imaging sensor data, and to determine the anatomical feature of interest within a portion of the imaging sensor data representing the area of interest
{see [0214] in which system locates the area of interest within the sensor data and in which the region-of-interest setting unit 20 may include a classifier that classifies surgical field images into anatomical regions of interest that may then be used for autofocusing as per Fig. 5, [0079]-[0088], [0100], [0126]-[0128], [0183]}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Mak which already tracks and locates the position of non-anatomical features (medical instruments) over time to adjust/trigger the autofocus on the moved/tracked instrument such that the tracked autofocus functionality also applies to anatomical features (wherein the system is configured to locate the area of interest within the imaging sensor data, and to determine the anatomical feature of interest within a portion of the imaging sensor data representing the area of interest) as taught by Uyama because doing so dynamically adjusts the autofocus location to compensate for motion of the anatomical features thereby ensuring that the surgeon is provided a clear, in-focus image despite such motion as motivated by Uyama in [0123]-[0135]; because there is a reasonable expectation of success and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Claim 6
In regards to claim 6, Mak discloses wherein the system is configured to perform image segmentation on at least the portion of the imaging sensor data representing the area of interest to determine at least one feature present within the portion of the imaging sensor data representing the area of interest
{Fig. 3 including tracking 374, tracking system 321, [0046] including tracking medical instruments and tracked pointer 222 which may include trackable markers 212 to segment, identify, and track the location of selected points on the patient, [0034]-[0039], [0103], [0109], [0111]-[0113] including segmenting and tracking the bleeding region of interest}.
Uyama teaches wherein the system is configured to perform image segmentation on at least the portion of the imaging sensor data representing the area of interest to determine at least one feature present within the portion of the imaging sensor data representing the area of interest, and to determine the anatomical feature of interest based on the at least one feature present within the portion of the imaging sensor data representing the area of interest
{see [0214] in which system locates the area of interest within the sensor data and in which the region-of-interest setting unit 20 may include a classifier that classifies surgical field images into anatomical regions of interest that may then be used for autofocusing as per Fig. 5, [0079]-[0088], [0100], [0126]-[0128], [0183].
Note that image segmentation and determination of a region of interest are equivalent terms as per [0037] of Mak, discussing markers 212 that are used to segment the image and in which those same markers are also disclosed as being used to determine the region of interest as per the claim 1 mapping. Moreover, the instant specification uses the terms “segment” and “identifying a feature of interest to perform object detection” synonymously in [0055]-[0056], [0058], [0071], and [0075], thus establishing a BRI for “segment” that encompasses ROI determination such as that done by Uyama}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Mak which already tracks and segments the position of non-anatomical features (medical instruments) over time to adjust/trigger the autofocus on the moved/tracked instrument such that the tracked autofocus functionality also applies to anatomical features (to determine the anatomical feature of interest based on the at least one feature present within the portion of the imaging sensor data representing the area of interest) as taught by Uyama because doing so dynamically adjusts the autofocus location to compensate for motion of the anatomical features thereby ensuring that the surgeon is provided a clear, in-focus image despite such motion as motivated by Uyama in [0123]-[0135]; because there is a reasonable expectation of success and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Claim 7
In regards to claim 7, Mak discloses wherein the system is configured to perform object detection on at least the portion of the imaging sensor data representing the area of interest to identify at least one feature present within the portion of the imaging sensor data representing the area of interest, and to determine the anatomical feature of interest based on the at least one feature present within the portion of the imaging sensor data representing the area of interest {see [0095]-[0096], [0034]-[0039] including tool tracking mode that performs object detection to locate an autofocus region indicated by a laser pointer; and/or a tracked pointer including a tool being used by the surgeon; and/or semi-manual control mode that performs object detection to determine the focus region based on potential features of interest which include region (area of interest) locations that are identified/detected in the image based on a preoperative treatment plan input by the user; and/or selecting/locating/detecting a region of interest to perform autofocusing based on surgeon-specific preferences such as from historical usage logs (based on historical user inputs); and/or the surgeon’s manual selection from among several focus regions determined/identified by the system}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Mak, which already performs object detection to track and segment the position of non-anatomical features (medical instruments) over time to adjust/trigger the autofocus on the moved/tracked instrument, such that the tracked autofocus functionality also applies to anatomical features (and to determine the anatomical feature of interest based on the at least one feature present within the portion of the imaging sensor data representing the area of interest) as taught by Uyama; because there is a reasonable expectation of success and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Claim 10
In regards to claim 10, Mak is not relied upon to disclose but Uyama teaches wherein the system is configured to determine an extent of the anatomical feature of interest based on an extent of one or more features located adjacent to the feature the anatomical feature of interest is based on {see [0214] in which the system locates the area of interest within the sensor data and in which the region-of-interest setting unit 20 may include a classifier that classifies surgical field images into anatomical regions of interest that may then be used for autofocusing as per Fig. 5, [0079]-[0088], [0100], [0126]-[0128], [0183]. Further, as to “based on an extent…,” note that the region of interest may be set based on the object occupying a predetermined area or larger in the screen, which is a determination based on features located adjacent (those that are smaller than the object in question), or on conditions that the object is located in the center, which is also based on features located adjacent as broadly claimed}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Mak which already performs object detection to track and segment the position of objects such that the system is configured to determine an extent of the anatomical feature of interest based on an extent of one or more features located adjacent to the feature the anatomical feature of interest is based on as taught by Uyama because there is a reasonable expectation of success and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Claim 11
In regards to claim 11, Mak is not relied upon to disclose but Uyama teaches
wherein the system is configured to determine two or more features from the imaging sensor data and, if the area of interest indicated by the user relates to two or more features or to an area between the two or more features, determine the anatomical feature of interest based on the two or more features, and determine the position of the anatomical feature of interest based on the positions of the two or more features
{see [0214] in which the system locates the area of interest within the sensor data and in which the region-of-interest setting unit 20 may include a classifier that classifies surgical field images into anatomical regions of interest that may then be used for autofocusing as per Fig. 5, [0079]-[0088], [0100], [0126]-[0128], [0183]. Further, as to “based on the position of two or more features…,” note that the region of interest may be set based on conditions that the object is located in the center, which is also based on the position of two or more features as broadly claimed}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Mak which already performs object detection to track and segment the position of objects such that wherein the system is configured to determine two or more features from the imaging sensor data and, if the area of interest indicated by the user relates to two or more features or to an area between the two or more features, determine the anatomical feature of interest based on the two or more features, and determine the position of the anatomical feature of interest based on the positions of the two or more features as taught by Uyama because there is a reasonable expectation of success and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Mak and Uyama as applied to claim 2 above, and further in view of Maharana (US 20220351392 A1).
Claim 4
In regards to claim 4, Mak discloses wherein the system is configured to detect the position of the anatomical feature of interest {see above mapping for claim 1}, but is not relied upon to disclose doing so in at most every second frame of the imaging sensor data.
Maharana is analogous art because it is reasonably pertinent to the problem faced by the inventor, which is reducing processing load via efficient position detection {see abstract and cites below}. Maharana also teaches detecting the position of a feature of interest in at most every second frame of the imaging sensor data {see [0063], [0071], object detection performed on a subset of frames in a sequence, such as every second, third, or fourth frame, in order to provide a more light-weight (less processing resource intensive) method while retaining the accuracy of approaches that detect features for each individual frame}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Mak, which already detects the position of the anatomical feature of interest, such that the detection of the position of the feature of interest is performed in at most every second frame of the imaging sensor data as taught by Maharana because Maharana motivates doing so in [0064] to provide a more light-weight (less processing resource intensive) method while retaining the accuracy of approaches that detect features for each individual frame; because there is a reasonable expectation of success and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Mak and Uyama as applied to claim 7 above, and further in view of Polchin (US 20240185432 A1).
Claim 9
In regards to claim 9, Mak discloses wherein the system is configured to perform the object detection to identify
Polchin is analogous art from the same field of surgical microscope systems. See abstract, Fig. 10 and cites below.
Polchin also teaches wherein the system is configured to perform the object detection to identify at least one of a clip and a stitching as non-anatomical feature of interest, to detect a position of the non-anatomical feature of interest within the imaging sensor data, and to perform the autofocus functionality of the microscope further based on the position of the non-anatomical feature of interest {see [0011], [0094] in which a suture (stitching) is identified and tracked to refocus the image}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Mak which already performs the object detection to identify a non-anatomical feature of interest, to detect a position of the non-anatomical feature of interest within the imaging sensor data, and to perform the autofocus functionality of the microscope further based on the position of the non-anatomical feature of interest such that the object detection and autofocus applies to a stitching type of a non-anatomical feature (to perform the object detection to identify at least one of a clip and a stitching as non-anatomical feature of interest) as taught by Polchin because doing so expands the range of non-anatomical features that can be tracked and used for autofocus; because there is a reasonable expectation of success and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Mak as applied to claim 1 above, and further in view of Fernald (US 20190365252 A1).
Claim 16
In regards to claim 16, Mak discloses wherein the system is configured to generate a digital view based on the imaging sensor data and to provide a display signal comprising the digital view to a display device of the surgical microscope system, but does not explicitly disclose highlighting the area of interest and/or the feature of interest within the digital view.
Fernald is analogous art from the same field of surgical microscope systems. See abstract, Figs. 1, 2, 3, 4 and cites below.
Fernald also teaches to generate a digital view based on the imaging sensor data, to highlight the area of interest and/or the feature of interest within the digital view, and to provide a display signal comprising the digital view to a display device of the surgical microscope system {see [0080]-[0088], [0091], which highlight and display a detected anatomical structure such as a blood vessel or tumor}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Mak, which already generates a digital view based on the imaging sensor data and provides a display signal comprising the digital view to a display device of the surgical microscope system, such that the system also highlights the area of interest and/or the feature of interest within the digital view, as taught by Fernald, because doing so helps the surgeon visualize and more readily find features of interest in the display; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Yip, Michael C., et al. "Tissue tracking and registration for image-guided surgery." IEEE Transactions on Medical Imaging 31.11 (2012): 2169-2182, teaches identifying tumors as a region of anatomical interest.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Michael R Cammarata whose telephone number is (571)272-0113. The examiner can normally be reached M-Th 7am-5pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Bella can be reached at 571-272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MICHAEL ROBERT CAMMARATA/Primary Examiner, Art Unit 2667