Prosecution Insights
Last updated: April 19, 2026
Application No. 18/068,592

SYSTEM, METHOD, AND COMPUTER PROGRAM FOR A SURGICAL MICROSCOPE SYSTEM AND CORRESPONDING SURGICAL MICROSCOPE SYSTEM

Status: Final Rejection (§103)
Filed: Dec 20, 2022
Examiner: JONES, JENNIFER ANN
Art Unit: 2872
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: LEICA INSTRUMENTS (SINGAPORE) PTE. LTD.
OA Round: 2 (Final)

Grant Probability: 70% (Favorable)
OA Rounds: 3-4
To Grant: 3y 5m
With Interview: 88%

Examiner Intelligence

Career Allow Rate: 70% (46 granted / 66 resolved; +1.7% vs TC avg, above average)
Interview Lift: +18.6% (resolved cases with interview vs. without; a strong lift)
Typical Timeline: 3y 5m avg prosecution; 20 applications currently pending
Career History: 86 total applications across all art units

Statute-Specific Performance

§103: 60.4% (+20.4% vs TC avg)
§102: 26.1% (-13.9% vs TC avg)
§112: 12.7% (-27.3% vs TC avg)

Percentages are compared against a Tech Center average estimate; based on career data from 66 resolved cases.
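The headline percentages above follow from the raw counts in the examiner profile. A minimal sanity check is sketched below; it assumes the career allow rate is simply granted/resolved, and since the without-interview base rate is not stated, the lift computed here is a rough reconstruction (it lands near, but not exactly on, the dashboard's +18.6%):

```python
# Sanity-check the dashboard's headline figures from the raw counts above.
granted, resolved = 46, 66

career_allow_rate = granted / resolved            # fraction of resolved cases granted
print(f"Career allow rate: {career_allow_rate:.1%}")   # 69.7%, shown as 70%

# Dashboard reports the examiner is +1.7 points above the Tech Center average.
tc_average = career_allow_rate - 0.017
print(f"Implied Tech Center average: {tc_average:.1%}")

# Hedged reconstruction: lift = with-interview rate minus overall career rate.
# (The dashboard's +18.6% is presumably measured against the without-interview
# subset, which is not given here, hence the small discrepancy.)
with_interview = 0.88
print(f"Implied interview lift: {with_interview - career_allow_rate:+.1%}")
```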

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55. Should applicant desire to obtain the benefit of foreign priority under 35 U.S.C. 119(a)-(d) prior to declaration of an interference, a certified English translation of the foreign application must be submitted in reply to this action. 37 CFR 41.154(b) and 41.202(e). Failure to provide a certified translation may result in no benefit being accorded for the non-English application.

Response to Amendment

The amendments to the claims in the submission dated 12/19/2025, in response to the Office action mailed 09/22/2025, are acknowledged and accepted. Claims 1, 2, 5-7, 9, 12, 13, and 16 are amended. Claims 10 and 11 are cancelled. Claim 18 is new.

Response to Arguments

In paragraph 3 on page 6 of 11 through the first four lines on page 6 of 11 of Applicant’s Remarks, the applicant submits a summary of the amendments to the claims. No specific arguments are made in this section.

Applicant’s arguments, see paragraph 2 on page 6 of 11 through paragraph 2 on page 8 of 16 of Applicant’s Remarks, filed 12/19/2025, with respect to the rejections of claims 1 and 16 under 35 U.S.C. §102 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of Regensburger et al., US 2019/0339502 A1 (hereinafter referred to as Regensburger) and Ogino, US 2001/0009473 A1 (hereinafter referred to as Ogino).

Applicant’s arguments, see paragraph 3 on page 9 of 11 through paragraph 3 on page 10 of 11 of Applicant’s Remarks, filed 12/19/2025, with respect to claim 18 have been fully considered and are persuasive.
However, upon further consideration, a new ground of rejection is made in view of Regensburger et al., US 2019/0339502 A1 (hereinafter referred to as Regensburger) and Piron et al., US 2022/0079686 A1 (hereinafter referred to as Piron).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-9 and 12-17 are rejected under 35 U.S.C. 103 as being unpatentable over Regensburger et al., US 2019/0339502 A1 (hereinafter referred to as Regensburger), and further in view of Ogino, US 2001/0009473 A1 (hereinafter referred to as Ogino).

As to claim 1, Regensburger teaches a system for a microscope (Regensburger, Fig. 3, paragraph [0005], “a microscopy system,” paragraph [0055], “the microscope M”) of a surgical microscope system (Regensburger, Fig. 3, M, paragraph [0055], “the microscope M is a digital surgical stereo microscope”), the system comprising an optical imaging sensor (Regensburger, Fig. 3, 6L, 6R, 106L, 106R, paragraph [0040], “the image detectors 6L, 6R, 106L, and 106R include, for example, a CCD sensor and/or a CMOS sensor,” paragraph [0062], “The first left image detector 6L is embodied as a single first left image sensor 42L and the first right image detector 6R is embodied as a single first right image sensor 42R,” paragraph [0070], “The second image detectors 106L and 106R include four second image sensors 142L, 142La, 142R, and 142Ra… the image sensors 42L, 42R, 142L, and 142Ra can each be embodied as a CCD sensor and/or a CMOS sensor”), one or more processors (Regensburger, Fig. 3, 12, paragraph [0040], “the control device 12 can be for example a microprocessor,” paragraph [0055], the control device 12 is not illustrated in Fig. 3 but can be embodied in each case as in Fig. 1) and one or more storage devices (Regensburger, Fig. 3, 15, paragraph [0079], “the control device 15, for example provided in a storage module”), wherein the system is configured to: sweep a numerical aperture of the microscope (Regensburger, Fig. 3, 36, 136, paragraph [0023], “the stop can be part of a zoom optical unit… the stop is provided with a drive, which can be used to incrementally or continuously change the aperture size,” paragraph [0073], “the size of the pupil stops 36 and 136 is settable individually for each imaging beam path”); determine a depth characteristic of at least a portion of a surgical site being imaged using the microscope (Regensburger, Fig. 5, TB, paragraph [0079], “The object O has a depth extent. FIG. 5 shows by way of example the object O with a channel K, which extends along the optical axis of the imaging. Further shown in the figure is a depth-of-field region TB, which is produced at a given setting of the microscope M, in particular at a given aperture and a given focal length of the imaging. The depth-of-field region TB is located symmetrically with respect to a focal plane of the imaging.”); and adjust the numerical aperture of the microscope based on the determined depth characteristic (Regensburger, Fig. 3, 36, 136, paragraph [0023], “the object is imaged via a stop having a variable aperture size… by changing the aperture size with regard to the depth distribution, it is ensured that the depth-of-field region of the image is optimally set,” paragraphs [0064] and [0069], the first and second pupil stops 36 and 136 are adjustable continuously or incrementally, claim 4, “changing the adjustable aperture size to set the depth-of-field region”).

Regensburger does not teach the microscope system configured to: sweep a numerical aperture of the microscope for the generation of a plurality of frames of imaging sensor data from the optical imaging sensor, the frames being based on different numerical apertures; and determine a depth characteristic of at least a portion of a surgical site being imaged using the microscope based on the plurality of frames (Regensburger, Fig. 3, 36, 136, paragraph [0023] describes that it is typical for the control device to actuate the image detector such that it increases its exposure time by increasing the time interval between two image recording cycles, thus implying the generation of a plurality of image frames to determine the depth distribution; however, the generation of a plurality of frames is not explicitly taught).

However, in the same field of endeavor Ogino teaches a system for a microscope (Ogino, Fig. 7, paragraph [0064], “wide field microscope”), the system comprising an optical imaging sensor (Ogino, Fig. 7, 617, paragraph [0067], “the image on the CCD camera 617 is photoelectrically converted and then is converted by an image processing unit 618 into an image signal to be displayed on display 619”), one or more processors (Ogino, Fig. 7, 618, paragraph [0067], “image processing unit 618”) and one or more storage devices (Ogino, Fig. 7, 620, paragraph [0075], the focal position is stored in a memory), wherein the system is configured to: sweep a numerical aperture of the microscope (Ogino, Fig. 7, 609, paragraph [0018], “an aperture stop for adjusting the numerical aperture of the objective lens,” paragraphs [0066] and [0068], “an aperture stop 609,” the diameter of the aperture stop 609 is variable) for the generation of a plurality of frames of imaging sensor data from the optical imaging sensor, the frames being based on different numerical apertures (Ogino, Figs. 8-9, S33-S50, paragraphs [0076]-[0077], “the value for the aperture stop 609 of the microscope is changed to another aperture stop value… if all the values for the aperture stop have not been set (No), the flow returns to step S33,” the steps S33 through S50 consist of generating and displaying a plurality of image frames); and determine a depth characteristic of at least a portion of a site being imaged using the microscope based on the plurality of frames (Ogino, Figs. 8-9, S32, S33, paragraphs [0058]-[0060] establish the relationship between the focal depth and the aperture, paragraph [0073], the values for the aperture stop give the values for the focal depths, thus the depth characteristic is determined based on the plurality of frames collected at S33).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microscope system of Regensburger with the configuration to: sweep a numerical aperture of the microscope for the generation of a plurality of frames of imaging sensor data from the optical imaging sensor, the frames being based on different numerical apertures; and determine a depth characteristic of at least a portion of a surgical site being imaged using the microscope based on the plurality of frames of Ogino, because an image with optimal contrast can be obtained in a short period of time (Ogino, paragraph [0084]).

As to claim 2, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 1, and Regensburger further teaches the system according to claim 1, wherein the system is configured to determine a depth of field of at least the portion of the surgical site (Regensburger, paragraph [0020], “Determining the depth distribution makes the adaptation of the depth-of-field region easier, because it is known from the depth distribution or the desired depth region in which range of values the depth-of-field region should lie”), and to further adjust the numerical aperture of the microscope based on the depth of field (Regensburger, Fig. 3, 36, 136, paragraph [0023], “the object is imaged via a stop having a variable aperture size… by changing the aperture size with regard to the depth distribution, it is ensured that the depth-of-field region of the image is optimally set… the stop can be part of a zoom optical unit… the stop is provided with a drive, which can be used to incrementally or continuously change the aperture size”).
As to claim 3, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 2, and Regensburger further teaches the system according to claim 2, wherein the system is configured to adjust the numerical aperture such that the depth of field provided by the microscope matches the depth of field of at least the portion of the surgical site (Regensburger, Fig. 3, 12, paragraph [0055], the control device 12 is not illustrated in Fig. 3, but can be embodied in each case as in Fig. 1, paragraph [0081], “the control device 12 automatically ascertains a parameter which can be adjusted as part of the microscopy to be performed and influences the extent of the depth-of-field region TB… A known parameter is the use of a stop in the pupil plane of the imaging beam path… by adjusting the stop”).

As to claim 4, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 3, and Regensburger further teaches the system according to claim 3, wherein the system is configured to adjust the numerical aperture such that the depth of field provided by the microscope is further suitable for a personal preference with respect to depth of field of a surgeon using the surgical microscope system (Regensburger, Fig. 5, paragraph [0079], “the observer indicates the region of interest D3. The control device 12 in these exemplary embodiments includes the input device 12.1 for defining the region of interest D3”).

As to claim 5, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 1, and Regensburger further teaches the system according to claim 1, wherein the portion of the surgical site is a region of interest (Regensburger, Fig. 5, O, D3, paragraphs [0009]-[0010], “a surgical microscope, which magnifies and presents an object… the object is specifically a sample or a body that is to be observed using the stereo microscope, for example a human being or a body part of a human being or of an animal,” paragraph [0079], “the object O should be captured in a region of interest D3 and examined using the microscope,” thus the surgical site of the object O is a region of interest D3), and wherein the system is configured to determine the region of interest within the surgical site (Regensburger, Fig. 5, D3, paragraph [0079], “the object O should be captured in a region of interest D3,” paragraphs [0080]-[0081], “the control device 12 performs a depth scan on the microscope M to ascertain the depth distribution of the object O in the region of interest D3”).

As to claim 6, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 5, and Regensburger further teaches the system according to claim 5, wherein the system is configured to determine the region of interest based on at least one of the imaging sensor data, a user input signal from a user interface of the system, or data from a gaze-tracking sensor (Regensburger, Fig. 3, D3, paragraph [0079], “the object O should be captured in the region of interest D3 and examined using the microscope… the observer indicates the region of interest D3. The control device 12 in these exemplary embodiments includes the input device 12.1 for defining the region of interest D3”).

As to claim 7, Regensburger teaches all the limitations of the instant invention as detailed above with respect to claim 6, and Regensburger further teaches the system according to claim 6, wherein when the region of interest is determined based on the imaging sensor data (Regensburger, Fig. 3, D3, paragraphs [0075]-[0078] describe the image detectors which produce electronic image data used by the control device 12 to determine the region of interest), the system is configured to perform image processing to determine the region of interest based on the portion of the surgical site being operated on (Regensburger, Fig. 3, D3, paragraph [0079], “the control device 12 therefore performs a method which makes it possible for the object O to be continuously imaged sharply in the region of interest. To this end, the region of interest D3 is first defined,” paragraph [0080], “the control device 12 ascertains the depth distribution TV in the region of interest D3. This can be implemented by an image evaluation”).

As to claim 8, Regensburger teaches all the limitations of the instant invention as detailed above with respect to claim 5, and Regensburger further teaches the system according to claim 5, wherein the system is configured to determine the region of interest based on a user input signal obtained via a user interface of the surgical microscope system (Regensburger, Fig. 5, paragraph [0079], “the observer indicates the region of interest D3. The control device 12 in these exemplary embodiments includes the input device 12.1 for defining the region of interest D3”).

As to claim 9, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 1, and Regensburger further teaches the system according to claim 1, wherein the system is configured to obtain sensor data from a depth sensor of the surgical microscope system (Regensburger, Fig. 1, 1000, paragraph [0080], “the microscope M includes a depth sensor 1000”), and to determine the depth characteristic of at least the portion of the surgical site based on the sensor data of the depth sensor (Regensburger, Fig. 5, D3, paragraphs [0080]-[0081], “the control device 12 performs a depth scan on the microscope M to ascertain the depth distribution of the object O in the region of interest D3”).

As to claim 12, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 1, and Regensburger further teaches the system according to claim 1, wherein the system is configured to determine the depth characteristic of at least the portion of the surgical site based on a contrast and/or based on a presence of spatial frequencies above a pre-defined spatial frequency threshold of the respective frames of the plurality of frames (Regensburger, paragraph [0017], “the depth distribution is a previously known depth map of the object or statistic indicating the relative frequency of a specific height of the object within the region of interest.”).

As to claim 13, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 1, and Regensburger further teaches the system according to claim 1, wherein the system is configured to control the microscope or surgical microscope system to perform a sweep of a working distance and/or focal distance of the microscope for the generation of a further plurality of frames of imaging sensor data being based on different working distances or focal distances (Regensburger, paragraph [0025], “a focal length of the imaging is adjustable, and the depth-of-field region is centered to the desired depth region by changing the focal length. The focal length of the zoom optical unit defines the working distance between microscope and the object.
The focal length does not change the size of the depth-of-field region, but does move the depth-of-field region.”), and to determine the depth characteristic of at least the portion of the surgical site based on the further plurality of frames of imaging sensor data being based on the different working distances or focal distances (Regensburger, Fig. 5, paragraph [0079], “The object O has a depth extent. FIG. 5 shows by way of example the object O with a channel K, which extends along the optical axis of the imaging. Further shown in the figure is a depth-of-field region TB, which is produced at a given setting of the microscope M, in particular at a given aperture and a given focal length of the imaging”).

As to claim 14, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 13, and Regensburger further teaches the system according to claim 13, wherein the system is configured to select a working distance or focal distance based on frames (Regensburger, paragraph [0013], “variable imaging parameters which can be used to change the depth of field of the imaging can be, for example, a focal length, an aperture size of a stop provided in the imaging beam path, or a working distance… the selection can be made by the control device automatically according to prescribed rules or specifications and/or via the input device.”), and to sweep the numerical aperture of the microscope (Regensburger, Fig. 5, paragraph [0078], “when performing examinations using the microscope M, the control device 12 performs a method for setting a depth-of-field region TB,” paragraph [0079], “the control device 12 therefore performs a method which makes it possible for the object O to be continuously imaged sharply in the region of interest. To this end, the region of interest D3 is first defined. This can be effected for example by way of an automatic evaluation of the object O,” paragraph [0081], “the control device 12 automatically ascertains a parameter which can be adjusted as part of the microscopy to be performed and influences the extent of the depth-of-field region TB… A known parameter is the use of a stop in the pupil plane of the imaging beam path… by adjusting the stop”).

Regensburger does not teach the microscope system wherein the further plurality of frames of imaging sensor data are generated during the sweep of the working distance or focal distance; and the generation of the plurality of frames of imaging sensor data being based on the different numerical apertures (Regensburger, Fig. 3, 36, 136, paragraph [0023] describes that it is typical for the control device to actuate the image detector such that it increases its exposure time by increasing the time interval between two image recording cycles, thus implying the generation of a plurality of image frames to determine the depth distribution; however, the generation of a plurality of frames is not explicitly taught).

However, in the same field of endeavor Ogino teaches a system for a microscope (Ogino, Fig. 7, paragraph [0064], “wide field microscope”) wherein the further plurality of frames of imaging sensor data are generated during the sweep of the working distance or focal distance; and the generation of the plurality of frames of imaging sensor data being based on the different numerical apertures (Ogino, Figs. 8-9, S33-S50, paragraphs [0076]-[0077], “the value for the aperture stop 609 of the microscope is changed to another aperture stop value… if all the values for the aperture stop have not been set (No), the flow returns to step S33,” the steps S33 through S50 consist of generating and displaying a plurality of image frames).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microscope system of Regensburger with the further plurality of frames of imaging sensor data are generated during the sweep of the working distance or focal distance; and the generation of the plurality of frames of imaging sensor data being based on the different numerical apertures of Ogino, because an image with optimal contrast can be obtained in a short period of time (Ogino, paragraph [0084]).

As to claim 15, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 1, and Regensburger further teaches a surgical microscope system comprising a microscope and the system according to claim 1 (Regensburger, Fig. 3, M, paragraph [0055], “the microscope M is a digital surgical stereo microscope”).

As to claim 16, Regensburger teaches a method for a microscope of a surgical microscope system (Regensburger, Fig. 3, paragraph [0005], “a microscopy system,” paragraph [0055], “the microscope M,” paragraph [0055], “the microscope M is a digital surgical stereo microscope”), the method comprising: determining a depth characteristic of at least a portion of a surgical site being imaged using the microscope (Regensburger, Fig. 5, paragraph [0078], “When performing examinations using the microscope M, the control device 12 performs a method for setting a depth-of-field region TB.”); and adjusting the numerical aperture of the microscope based on the determined depth characteristic (Regensburger, Fig. 5, D3, paragraphs [0080]-[0081], “the control device 12 performs a depth scan on the microscope M to ascertain the depth distribution of the object O in the region of interest D3… the control device 12 automatically ascertains a parameter which can be adjusted as part of the microscopy to be performed and influences the extent of the depth-of-field region TB… A known parameter is the use of a stop in the pupil plane of the imaging beam path… by adjusting the stop”).

Regensburger does not teach the microscope system configured to: sweep a numerical aperture of the microscope for the generation of a plurality of frames of imaging sensor data from the optical imaging sensor, the frames being based on different numerical apertures; and determining a depth characteristic of at least a portion of a surgical site being imaged using the microscope based on the plurality of frames (Regensburger, Fig. 3, 36, 136, paragraph [0023] describes that it is typical for the control device to actuate the image detector such that it increases its exposure time by increasing the time interval between two image recording cycles, thus implying the generation of a plurality of image frames to determine the depth distribution; however, the generation of a plurality of frames is not explicitly taught).

However, in the same field of endeavor Ogino teaches a method for a microscope (Ogino, Fig. 7, paragraph [0064], “wide field microscope,” Figs. 8-9, paragraph [0071], flowcharts shown in Figs. 8 and 9 describe the method of operation for the microscope), the method comprising: sweeping a numerical aperture of the microscope (Ogino, Fig. 7, 609, paragraph [0018], “an aperture stop for adjusting the numerical aperture of the objective lens,” paragraphs [0066] and [0068], “an aperture stop 609,” the diameter of the aperture stop 609 is variable) for the generation of a plurality of frames of imaging sensor data from the optical imaging sensor, the frames being based on different numerical apertures (Ogino, Figs. 8-9, S33-S50, paragraphs [0076]-[0077], “the value for the aperture stop 609 of the microscope is changed to another aperture stop value… if all the values for the aperture stop have not been set (No), the flow returns to step S33,” the steps S33 through S50 consist of generating and displaying a plurality of image frames); and determining a depth characteristic of at least a portion of a site being imaged using the microscope based on the plurality of frames (Ogino, Figs. 8-9, S32, S33, paragraphs [0058]-[0060] establish the relationship between the focal depth and the aperture, paragraph [0073], the values for the aperture stop give the values for the focal depths, thus the depth characteristic is determined based on the plurality of frames collected at S33).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microscope system of Regensburger with the configuration to: sweep a numerical aperture of the microscope for the generation of a plurality of frames of imaging sensor data from the optical imaging sensor, the frames being based on different numerical apertures; and determine a depth characteristic of at least a portion of a surgical site being imaged using the microscope based on the plurality of frames of Ogino, because an image with optimal contrast can be obtained in a short period of time (Ogino, paragraph [0084]).
As to claim 17, Regensburger in view of Ogino teaches all the limitations of the instant invention as detailed above with respect to claim 16, and Regensburger further teaches a non-transitory, computer-readable medium comprising a program code that, when the program code is executed on a processor, a computer, or a programmable hardware component, causes the processor, computer, or programmable hardware component to perform the method of claim 16 (Regensburger, Fig. 1, 12, paragraph [0040], “The control device 12 can be for example a microprocessor, a computer having a correspondingly configured computer program, or another electric circuit”).

Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Regensburger et al., US 2019/0339502 A1 (hereinafter referred to as Regensburger), and further in view of Piron et al., US 2022/0079686 A1 (hereinafter referred to as Piron).

As to claim 18, Regensburger teaches a system for a microscope (Regensburger, Fig. 3, paragraph [0005], “a microscopy system,” paragraph [0055], “the microscope M”) of a surgical microscope system (Regensburger, Fig. 3, M, paragraph [0055], “the microscope M is a digital surgical stereo microscope”), the system comprising one or more processors (Regensburger, Fig. 3, 12, paragraph [0040], “the control device 12 can be for example a microprocessor,” paragraph [0055], the control device 12 is not illustrated in Fig. 3 but can be embodied in each case as in Fig. 1) and one or more storage devices (Regensburger, Fig. 3, 15, paragraph [0079], “the control device 15, for example provided in a storage module”), wherein the system is configured to: determine a depth characteristic of a surgical site being imaged using the microscope (Regensburger, Fig. 5, TB, paragraph [0079], “The object O has a depth extent. FIG. 5 shows by way of example the object O with a channel K, which extends along the optical axis of the imaging. Further shown in the figure is a depth-of-field region TB, which is produced at a given setting of the microscope M, in particular at a given aperture and a given focal length of the imaging. The depth-of-field region TB is located symmetrically with respect to a focal plane of the imaging.”); obtain imaging sensor data from an optical imaging sensor of the microscope (Regensburger, Fig. 3, 6L, 6R, 106L, 106R, paragraph [0040], “the image detectors 6L, 6R, 106L, and 106R include, for example, a CCD sensor and/or a CMOS sensor”); determine a region of interest within the surgical site based on the imaging sensor data (Regensburger, Fig. 5, D3, paragraph [0079], “the object O should be captured in a region of interest D3,” paragraphs [0080]-[0081], “the control device 12 performs a depth scan on the microscope M to ascertain the depth distribution of the object O in the region of interest D3”), wherein the system is configured to perform image processing on the imaging sensor data to determine a portion of the surgical site being operated on (Regensburger, Fig. 3, D3, paragraph [0079], “the control device 12 therefore performs a method which makes it possible for the object O to be continuously imaged sharply in the region of interest. To this end, the region of interest D3 is first defined,” paragraph [0080], “the control device 12 ascertains the depth distribution TV in the region of interest D3. This can be implemented by an image evaluation”), and to determine the region of interest based on the portion of the surgical site being operated on (Regensburger, Fig. 3, D3, paragraphs [0075]-[0078] describe the image detectors which produce electronic image data used by the control device 12 to determine the region of interest); and adjust a numerical aperture of the microscope for the region of interest within the surgical site based on the determined depth characteristic (Regensburger, Fig. 3, 36, 136, paragraph [0023], “the object is imaged via a stop having a variable aperture size… by changing the aperture size with regard to the depth distribution, it is ensured that the depth-of-field region of the image is optimally set,” paragraphs [0064] and [0069], the first and second pupil stops 36 and 136 are adjustable continuously or incrementally, claim 4, “changing the adjustable aperture size to set the depth-of-field region”).

Regensburger does not teach the system for a microscope wherein the system is configured to perform image processing on the imaging sensor data to determine a portion of the surgical site being operated on by determining a position of one or more surgical tools in the imaging sensor data.

However, in the same field of endeavor Piron teaches a system for a microscope (Piron, Fig. 2B, 205, paragraph [0064], “medical navigation system 205”) of a surgical microscope system (Piron, Fig. 5, 500, paragraph [0094], imaging system 500 is used in medical procedures) wherein the system is configured to perform image processing on the imaging sensor data to determine a portion of the surgical site being operated on by determining a position of one or more surgical tools in the imaging sensor data (Piron, Fig. 11, 500, 222, paragraph [0114], the imaging system 500 is configured to perform autofocusing relative to an instrument being used in the medical procedure… for example, the position and orientation of a medical instrument, such as a tracked pointer tool 222, is determined; and the controller 530 performs autofocusing to focus the captured image on a point defined relative to the medical instrument… as the tracked pointer tool 222 is moved, the WD (working distance) between the optical imaging system 500 and the defined focus point changes”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system for a microscope of Regensburger with the microscope system of Piron, wherein the system is configured to perform image processing on the imaging sensor data to determine a portion of the surgical site being operated on by determining a position of one or more surgical tools in the imaging sensor data, because doing so enables a surgeon to change the focus within a FoV without changing the FoV and without needing to manually adjust the focus of the imaging system 500 (Piron, paragraph [0114]).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JENNIFER A JONES, whose telephone number is (703) 756-4574. The examiner can normally be reached Monday through Friday, 8 AM - 5 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Thomas Pham, can be reached at 571-272-3689. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JENNIFER A JONES/
Examiner, Art Unit 2872

/THOMAS K PHAM/
Supervisory Patent Examiner, Art Unit 2872
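For readers tracing how the combined references map onto the claim language, the control loop the rejection describes (locate the tool in the image per Piron, define a region of interest around it, scan depth in that region per Regensburger, then set the aperture so the depth of field covers the span) can be sketched as below. This is purely illustrative: every function, threshold, and the aperture law are hypothetical stand-ins and come from neither reference nor the claims.

```python
from dataclasses import dataclass

@dataclass
class ROI:
    row: int
    col: int
    half: int  # half-width of a square region of interest

def locate_tool_tip(frame):
    """Hypothetical stand-in for Piron-style tool localization:
    treats the brightest pixel as the tracked tool tip."""
    _, pos = max((val, (r, c))
                 for r, row in enumerate(frame)
                 for c, val in enumerate(row))
    return pos

def depth_span_in_roi(depth_map, roi):
    """Depth range (max - min, in mm) inside the region of interest,
    echoing the depth scan of region D3 described for Regensburger."""
    rows = range(max(0, roi.row - roi.half),
                 min(len(depth_map), roi.row + roi.half + 1))
    cols = range(max(0, roi.col - roi.half),
                 min(len(depth_map[0]), roi.col + roi.half + 1))
    vals = [depth_map[r][c] for r in rows for c in cols]
    return max(vals) - min(vals)

def aperture_for_span(span_mm, k=0.5):
    """Hypothetical aperture law: depth of field grows as numerical
    aperture shrinks, so a larger depth span demands a smaller NA."""
    return max(0.05, min(0.9, k / (span_mm + 1e-6)))

# One pass of the loop on toy data: tool tip at (1, 1).
frame = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
depth = [[10.0, 10.5, 11.0], [10.2, 12.0, 10.8], [10.1, 10.4, 10.9]]
tip_r, tip_c = locate_tool_tip(frame)
roi = ROI(tip_r, tip_c, half=1)
na = aperture_for_span(depth_span_in_roi(depth, roi))
```

In practice a real system would use a trained detector or navigation markers rather than a brightest-pixel heuristic, but the data flow, tool position to region of interest to depth span to aperture, is the shape of the claim mapping.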

Prosecution Timeline

Dec 20, 2022
Application Filed
Sep 17, 2025
Non-Final Rejection — §103
Dec 19, 2025
Response Filed
Jan 22, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601954
SELECTIVELY CONFIGURABLE PHOTONIC LOGIC DEVICE
2y 5m to grant • Granted Apr 14, 2026
Patent 12596291
A LENS ASSEMBLY AND A CAMERA MODULE INCLUDING THE SAME
2y 5m to grant • Granted Apr 07, 2026
Patent 12578521
LENS PORTION AND DISPLAY DEVICE
2y 5m to grant • Granted Mar 17, 2026
Patent 12571992
OPTICAL IMAGING SYSTEM
2y 5m to grant • Granted Mar 10, 2026
Patent 12535724
LOW-PROFILE BEAM SPLITTER
2y 5m to grant • Granted Jan 27, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
70%
Grant Probability
88%
With Interview (+18.6%)
3y 5m
Median Time to Grant
Moderate
PTA Risk
Based on 66 resolved cases by this examiner. Grant probability derived from career allow rate.
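The displayed figures are mutually consistent under one simple reading, which is an assumption on our part: the career allow rate is 46 granted of 66 resolved, and the "+18.6%" interview lift is added in percentage points.

```python
granted, resolved = 46, 66
base = granted / resolved * 100   # career allow rate, in percent (~69.7)
with_interview = base + 18.6      # assuming an additive lift in points

print(round(base), round(with_interview))  # prints "70 88"
```

Rounding reproduces the 70% grant probability and the 88% with-interview figure shown above, though the tool may of course compute these differently.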
