Prosecution Insights
Last updated: April 19, 2026
Application No. 18/784,879

METHOD FOR CALIBRATING A STEREOSCOPIC MEDICAL MICROSCOPE, AND MEDICAL MICROSCOPE ARRANGEMENT

Status: Final Rejection (§102, §103, §112)
Filed: Jul 25, 2024
Examiner: HODGES, SUSAN E
Art Unit: 2425
Tech Center: 2400 (Computer Networks)
Assignee: Carl Zeiss Meditec AG
OA Round: 2 (Final)

Grant Probability: 67% (Favorable)
Projected OA Rounds: 3-4
Projected Time to Grant: 2y 4m
Grant Probability with Interview: 81%

Examiner Intelligence

Career Allow Rate: 67% (above average), 250 granted / 375 resolved, +8.7% vs TC avg
Interview Lift: +14.4% (moderate), comparing resolved cases with and without an interview
Avg Prosecution: 2y 4m (typical timeline)
Total Applications: 406 across all art units, 31 currently pending

Statute-Specific Performance

§101: 6.0% (-34.0% vs TC avg)
§103: 48.7% (+8.7% vs TC avg)
§102: 20.9% (-19.1% vs TC avg)
§112: 22.6% (-17.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 375 resolved cases.
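The headline rates above follow from simple arithmetic on the examiner's career counts. A minimal sketch of that arithmetic (the variable names are illustrative, not from any real analytics API):

```python
# Career counts reported for this examiner.
granted, resolved = 250, 375

# Career allow rate: granted applications as a share of resolved ones.
career_allow_rate = granted / resolved            # ~0.667, displayed as 67%

# The dashboard reports this rate as +8.7% over the Tech Center average,
# which implies an estimated TC 2400 average of roughly 58%.
implied_tc_average = career_allow_rate - 0.087

print(f"Career allow rate:  {career_allow_rate:.1%}")   # 66.7%
print(f"Implied TC average: {implied_tc_average:.1%}")  # 58.0%
```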

Office Action (§102, §103, §112)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Applicant's Response to Official Action

The response filed on January 30, 2026 has been entered and made of record. Claims 1 and 11 have been amended. Claims 1-11 are currently pending in the application.

Response to Arguments

Applicant's arguments (see pages 7-9) with respect to the rejection of Claims 1-9 and 11 under 35 U.S.C. 102(a)(1) as being anticipated by Ji et al. (US 2014/0362186 A1) have been fully considered and are not persuasive. Examiner's response to the presented arguments follows below:

Applicant argues on pages 7 and 8 that "Ji is silent about stored calibration data and further calibration data being applied to subsequently captured image representations, such that the subsequently captured image representations are corrected by the calibration data and the further calibration data, and wherein corrected image representations are provided". Examiner respectfully disagrees.

Ji clearly teaches in Par. [0032]-[0033]: These 3D surface profile extraction parameters are then saved 210 for later use (i.e. calibration data are stored) and a secondary calibration phantom is positioned 212 in view of the optical system, and a stereo image pair of the secondary calibration phantom as viewed in the reference setting is captured and saved as part of calibration information (i.e. further calibration data are stored). Ji further teaches in Par. [0035]: With the optical system set 302 to the arbitrary desired setting, the secondary calibration phantom is positioned in view of the optical system in a position approximating that where tissue 106 will be present during surgery, and a stereo image pair of the secondary calibration phantom is captured or taken 304 by cameras 118, 120 (i.e. subsequent capture).

In addition, Ji teaches in Par. [0058]: when surface profile extraction is desired at a runtime arbitrary optical setting set, such as setting 370, during surgery by a surgeon, the runtime optical settings are determined by reading information from tracker 142; in Par. [0066]: when a surgeon selects a runtime setting, a runtime distortion field parameter (DFP(run)) is then determined by interpolation between nearby secondary calibration points recorded in the library 372; and in Par. [0067]: A runtime stereo image (i.e. subsequently captured image representation) is then captured, and warped to correspond to images captured at the primary calibration point or reference setting of that quadrant, such as setting 352 for the lower left quadrant 374 or setting 355 for runtime settings in the top right quadrant 378. 3D extraction is then performed on the warped image, using 3D extraction parameters recorded in library 372 and associated with the primary calibration point or reference setting 352, 355, associated with that quadrant. Therefore, Ji discloses the amended limitation as claimed.

Claim Rejections - 35 USC § 112(b)

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-11 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor, or for pre-AIA the applicant, regards as the invention.
Regarding Claims 1 and 11: by using "and/or," the aforementioned claims recite broad limitations (designated with "or") together with narrower limitations (designated with "and"). The scope of the claim is unascertainable since the term raises a question or doubt as to whether the elements joined by "and/or" are required or optional. For examining purposes, the Examiner has broadly interpreted the claim limitations. Claims 2-10 are rejected for the reasons above by virtue of their respective dependencies.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-9 and 11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ji et al. (US 2014/0362186 A1), referred to as Ji hereinafter.

Regarding Claim 1: Ji discloses a method for calibrating (Par. [0013]: FIG. 2 is a flowchart of a precalibration sequence that determines parameters for extraction of a surface profile from stereo images taken with the optical system in base settings, and saves images of a phantom taken with the optical system in base settings) a stereoscopic medical microscope (Fig. 1, Par. [0017]: a stereo microscope 102 used to determine an intra-surgery location of a tumor or other anatomic feature of tissue (see abstract)), the method comprising:

(a) capturing image representations of at least one three-dimensional calibration object (Par. [0028]: A stereo pair of images (i.e. image representations) is taken 206 of the precalibration phantom (i.e. calibration object), assuming the precalibration phantom has known surface profile, providing a plurality of known points in three dimensions) with cameras of a stereo camera system (Fig. 1, Par. [0018]: stereo cameras 118, 120, where camera 118 is a left camera and camera 120 is a right camera) of the medical microscope (Par. [0019]: surgical microscope 102);

(b) generating calibration data based on the captured image representations, wherein the generated calibration data are stored for correction purposes (Par. [0032]: Since the 3D locations of the correspondence points are known on the precalibration phantoms, the parameters are fit 208 such that the extraction to a common 3D space gives results where extracted 3D points of an effective surface profile of the precalibration phantom match heights of the known points on the precalibration phantom. These 3D surface profile extraction parameters (i.e. calibration data) are then saved 210 for later use);

(c) capturing further image representations of at least one two-dimensional calibration object with the cameras of the stereo camera system (Par. [0033]: a secondary calibration phantom is positioned 212 in view of the optical system, and a stereo image pair of the secondary calibration phantom as viewed in the reference setting is captured (i.e. capturing) and saved as part of calibration information, where the secondary calibration phantom is a two-dimensional (i.e. two-dimensional calibration object), flat phantom having marks printed thereon); and

(d) generating further calibration data based on the captured further image representations, wherein the further calibration data are stored for correction purposes (Par. [0033]: a stereo image pair of the secondary calibration phantom as viewed in the reference setting is captured and saved (i.e. stored) as part of calibration information (i.e. further calibration data)),

wherein the calibration data generated in step (b) for the correction purposes (Par. [0025]: parameters for reconstruction (i.e. correction purposes) surface mapping routine 132 are derived that are sufficient for reconstructing a surface profile (i.e. calibration data) from a pair of stereo images (i.e. captured image representations) taken with the optical system set 202 to the reference setting at Step 206) are applied at least partially to the captured further image representations (Par. [0025]: then applied at least partially to captured further image representations at step 212), wherein step (d) is performed (Step 212 is performed, Par. [0033]: a stereo image pair of the secondary calibration phantom as viewed in the reference setting is captured and saved (i.e. stored) as part of calibration information (i.e. further calibration data)) and/or repeated for the corrected further image representations, and/or wherein the further calibration data generated in step (d) for correction purposes are applied at least partially to the captured image representations, wherein step (b) is performed and/or repeated for the corrected images,

and wherein the stored calibration data and further calibration data (Par. [0032]: These 3D surface profile extraction parameters (i.e. calibration data) are then saved 210 for later use; and Par. [0033]: a secondary calibration phantom is positioned 212 in view of the optical system, and a stereo image pair of the secondary calibration phantom as viewed in the reference setting is captured and saved as part of calibration information (i.e. further calibration data)) are applied to subsequently captured image representations (Par. [0035]: With the optical system set 302 to the arbitrary desired setting, the secondary calibration phantom is positioned in view of the optical system in a position approximating that where tissue 106 will be present during surgery, and a stereo image pair of the secondary calibration phantom is captured or taken 304 by cameras 118, 120 (i.e. subsequent capture)), such that the subsequently captured image representations are corrected by the calibration data and the further calibration data, and wherein corrected image representations are provided (Par. [0066]: when a surgeon selects a runtime setting, a runtime distortion field parameter (DFP(run)) is then determined by interpolation between nearby secondary calibration points recorded in the library 372. Par. [0067]: A runtime stereo image (i.e. subsequently captured) is then captured, and warped to correspond to images captured at the primary calibration point or reference setting of that quadrant, such as setting 352 for the lower left quadrant 374 or setting 355 for runtime settings in the top right quadrant 378. 3D extraction is then performed on the warped image, using 3D extraction parameters recorded in library 372 and associated with the primary calibration point or reference setting 352, 355, associated with that quadrant).

Regarding Claim 2: Ji discloses claim 1. Ji further discloses wherein at least one of step (a) and step (c) are/is carried out at different operating points of the medical microscope (Par. [0038]: To determine image deformation due to the change in image acquisition settings (i.e., m magnification and f focal length) (i.e. different operating points), a series of phantom images were acquired using a planar secondary calibration phantom (i.e. capturing images of step (c)) with randomly generated squares of random grayscale intensity by successively changing one parameter from its reference value while maintaining other optical system parameters at the corresponding reference value), and wherein the at least one of the calibration data in step (b) and the further calibration data in step (d) are generated for each of the different operating points (Par. [0038]: For all these phantom images (i.e. at different operating points), the secondary calibration phantom (i.e. calibration data) was perpendicular to the optical axis (i.e., the orientation was maintained at its reference value of θ0 = 0), see Fig. 4).

Regarding Claim 3: Ji discloses claim 1. Ji further discloses wherein at least one of the calibration data generated in step (b) and the further calibration data generated in step (d) for correction purposes are at least partly applied to at least one of the captured image representations and the captured further image representations (Par. [0025]: parameters for reconstruction (i.e. correction purposes) surface mapping routine 132 are derived that are sufficient for reconstructing a surface profile from a pair of stereo images (i.e. captured image representations) taken with the optical system set 202 to the reference setting), and wherein at least one of step (b) is carried out and repeated for the image representations, and step (d) is carried out and repeated for the corrected further image representations (Par. [0057]-[0060]: An encoder 143 is added to the microscope zoom and focus controls. Warping parameters corresponding to image deformations at each combination of magnification m, focal length f, and angle θ, or secondary calibration point, are stored 308 in a multidimensional table indexed by the zoom and focus control settings.
In this table-based embodiment, when surface profile extraction is desired (i.e. carried out and repeated) at a runtime arbitrary optical setting set, such as setting 370, during surgery by a surgeon, the runtime optical settings are determined by reading the magnification m, focal length f, and angle using the encoder. A runtime image pair of tissue is then captured. The runtime optical warping parameters are then used (i.e. carried out and repeated) to warp the runtime image pair to an image pair that corresponds to the reference setting S0, 352 as heretofore described, whereupon 3D reconstruction is performed as heretofore described.).

Regarding Claim 4: Ji discloses claim 3. Ji further discloses wherein the correction and steps (c) and (d) are repeated until at least one predefined optimization criterion is satisfied (Par. [0062]: in order to provide more accurate 3D reconstruction (i.e. criterion is satisfied) at higher magnification and longer focal length settings (i.e. predefined optimization criterion), additional reference image acquisition settings at the midrange of optical system settings are used (i.e. repeated) in addition to the extreme settings at the lowest magnification and shortest focal length. Additional reference settings 355, 357 are provided at a reproducible, but greater than minimum, set-point of focal length. 3D reconstruction parameters are determined by primary calibration, similarly to the process heretofore described for determination of 3D reconstruction parameters for the reference setting S0, for each of these additional reference settings 354, 355, 357).

Regarding Claim 5: Ji discloses claim 1. Ji further discloses wherein the generated calibration data are checked by at least one of at least partial application to the captured image representations and the captured further image representations (Par. [0062]: additional reference settings 355, 357 are provided at a reproducible, but greater than minimum, set-point of focal length (i.e. calibration data is checked). 3D reconstruction parameters are determined by primary calibration (i.e. by partial application), similarly to the process heretofore described for determination of 3D reconstruction parameters for the reference setting S0, for each of these additional reference settings 354, 355, 357) and assessment of a result of the application (Par. [0062]: in order to provide more accurate 3D reconstruction (i.e. assessment of result) at higher magnification and longer focal length settings, additional reference image acquisition settings at the midrange of optical system settings are used in addition to the extreme settings at the lowest magnification and shortest focal length), and wherein steps (a) to (d) are at least partly repeated (Par. [0063]: It is desirable that each reference setting S0, 352, 354, 355, 357 be a setting that the optical system can reproducibly be returned to (i.e. partly repeated)) when an assessment result does not satisfy at least one predefined criterion (Par. [0062]: in order to provide more accurate 3D reconstruction at higher magnification and longer focal length settings (i.e. predefined criterion not satisfied), additional reference image acquisition settings at the midrange of optical system settings are used in addition to the extreme settings at the lowest magnification and shortest focal length).

Regarding Claim 6: Ji discloses claim 1. Ji further discloses wherein at least one of step (b) and step (d) include(s) one or more of the following measures: determining extrinsic calibration data (Par. [0028]: the next step is to determine (i.e., calibrate) several unknown parameters among the equations presented above. In particular, the extrinsic camera parameters to be calibrated are the rotation and translation matrices (R; T)), determining intrinsic calibration data (Par. [0028]: the next step is to determine (i.e., calibrate) several unknown parameters among the equations presented above. In particular, the intrinsic parameters are the focal length (f), lens distortion coefficient κ, scale factor (Sx), and image center (Cx; Cy)), determining distortion correction field data (Par. [0026]: Techniques for stereo image calibration and reconstruction based on a pinhole camera model and radial lens distortion correction are outlined here for completeness. A 3D point in world space (X, Y, Z) is transformed into the camera image coordinates (x, y) using a perspective projection matrix), determining edge decrease correction data, and determining chromatic displacement field correction data.

Regarding Claim 7: Ji discloses claim 1. Ji further discloses wherein step (b) includes determining a respective focus position of the cameras of the stereo camera system and setting the respective focus position (Par. [0057]: Warping parameters corresponding to image deformations at each combination of magnification m, focal length f, and angle θ, or secondary calibration point, are stored 308 in a multidimensional table indexed by the zoom and focus control settings (i.e. respective focus position settings)).

Regarding Claim 8: Ji discloses claim 1. Ji further discloses wherein step (d) includes at least one of determining an offset between the captured further image representations of the cameras of the stereo camera system (Par. [0023]: The difference between these two images (i.e. an offset between captured images), or their "deformation," is measured by identifying corresponding reference marks of a phantom in images of the phantom taken at the arbitrary setting S and the reference setting S0, and then measuring displacement of the reference marks between the identified locations of the reference marks in the image of the phantom at the arbitrary setting S and the identified locations of the reference marks in the image of the phantom at the reference setting S0) and determining offset correction data (Par. [0023]: The difference between these two images, or their "deformation," is measured by identifying corresponding reference marks of a phantom in images of the phantom taken at the arbitrary setting S and the reference setting S0, and then measuring displacement (i.e. offset correction data) of the reference marks between the identified locations of the reference marks in the image of the phantom at the arbitrary setting S and the identified locations of the reference marks in the image of the phantom at the reference setting S0) and determining a rotation (Par. [0028]: the extrinsic camera parameters to be calibrated are the rotation (i.e. rotation) and translation matrices (R; T)) between the captured further image representations of the cameras of the stereo camera system (Par. [0028]: A stereo pair of images (i.e. between captured images) is taken 206 of the precalibration phantom, assuming the precalibration phantom has known surface profile, providing a plurality of known points in three dimensions) and determining rotation correction data (Par. [0028]: the extrinsic camera parameters to be calibrated are the rotation (i.e. rotation correction data) and translation matrices (R; T)).

Regarding Claim 9: Ji discloses claim 1. Ji further discloses wherein different operating points of a movable lens system of the stereoscopic medical microscope (Par. [0003]: mapping of a biological tissue surface is achieved through variable-magnification zoom lenses (i.e. movable lens system) during surgery. Par. [0063]: microscopes are provided with motorized focus and zoom controls (i.e. movable lens system), together with encoders 143) are selected when capturing at least some of the image representations in step (a) or (c) (Par. [0061]: a reference setting S0 at the extreme low magnification end of the optical system zoom range (i.e. different operating points), and at a nearest focus length of the optical system focus range (i.e. different operating points)), wherein an offset value is determined for each of the operating points of the movable lens system in step (b) or (d) (Par. [0023]: The difference between these two images (i.e. an offset), or their "deformation," is measured by identifying corresponding reference marks of a phantom in images of the phantom taken at the arbitrary setting S and the reference setting S0, and then measuring displacement of the reference marks between the identified locations of the reference marks in the image of the phantom at the arbitrary setting S and the identified locations of the reference marks in the image of the phantom at the reference setting S0), and wherein at least one of the calibration data and further calibration data are generated taking account of the offset values respectively determined (Par. [0037]: The method for 3D surface extraction herein described warps stereo images captured using a desired setting S, using the deformation field (i.e. taking account of the offset values) obtained from images of a phantom at desired setting S and reference setting S0, into warped images corresponding to images taken at the reference setting S0. Because the reference setting S0 has been calibrated for surface extraction, the warped stereo images can then be used for surface reconstructing following the same calibration).
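The warping scheme the examiner repeatedly cites from Ji (capture at an arbitrary setting S, warp toward the reference setting S0 using a measured deformation field, and interpolate distortion field parameters between nearby secondary calibration points for a runtime setting) can be sketched in outline. This is an illustrative approximation, not Ji's implementation: the array shapes, the nearest-neighbour sampling, and the linear interpolation are all assumptions made for brevity.

```python
import numpy as np

def warp_to_reference(image, dx, dy):
    """Warp an image taken at an arbitrary setting S toward the reference
    setting S0, given a per-pixel displacement field (dx, dy) measured from
    phantom images. Nearest-neighbour sampling keeps the sketch short."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.rint(xs + dx).astype(int), 0, w - 1)
    src_y = np.clip(np.rint(ys + dy).astype(int), 0, h - 1)
    return image[src_y, src_x]

def runtime_dfp(dfp_a, dfp_b, t):
    """Linearly interpolate a distortion field parameter between two nearby
    secondary calibration points for a runtime setting (0 <= t <= 1)."""
    return (1.0 - t) * np.asarray(dfp_a) + t * np.asarray(dfp_b)
```

A zero displacement field leaves the image unchanged, and a runtime setting midway between two calibration points (t = 0.5) averages their parameters before the warp is applied.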
Regarding Claim 11: Ji discloses a medical microscope arrangement (Fig. 1, Par. [0017]: A surface-mapping system 100 for use during surgical procedures is based upon a stereo microscope 102 used to determine an intra-surgery location of a tumor or other anatomic feature of tissue (see abstract)), comprising: a stereoscopic medical microscope (Par. [0019]: surgical microscope 102) having a stereo camera system including cameras (Fig. 1, Par. [0018]: stereo cameras 118, 120, where camera 118 is a left camera and camera 120 is a right camera), storable calibration data (Par. [0033]: a stereo image pair of the secondary calibration phantom as viewed in the reference setting is captured and saved (i.e. stored) as part of calibration information (i.e. calibration data)), and a data processing device (Fig. 1, image processor 124), wherein the data processing device is configured to:

(a) generate calibration data based on image representations (Par. [0032]: Since the 3D locations of the correspondence points are known on the precalibration phantoms, the parameters are fit 208 such that the extraction to a common 3D space gives results where extracted 3D points of an effective surface profile of the precalibration phantom match heights of the known points on the precalibration phantom. These 3D surface profile extraction parameters (i.e. calibration data) are then saved 210 for later use) of at least one three-dimensional calibration object (Par. [0028]: A stereo pair of images (i.e. image representations) is taken 206 of the precalibration phantom (i.e. calibration object), assuming the precalibration phantom has known surface profile, providing a plurality of known points in three dimensions) that are captured with the cameras of the stereo camera system (Fig. 1, Par. [0018]: stereo cameras 118, 120, where camera 118 is a left camera and camera 120 is a right camera), and store the calibration data for correction purposes (Par. [0032]: These 3D surface profile extraction parameters (i.e. calibration data) are then saved 210 for later use), and

(b) generate further calibration data based on further image representations (Par. [0033]: a stereo image pair of the secondary calibration phantom as viewed in the reference setting is captured and saved (i.e. stored) as part of calibration information (i.e. further calibration data)) of at least one two-dimensional calibration object that are captured with the cameras of the stereo camera system, and store said further calibration data for correction purposes (Par. [0033]: a secondary calibration phantom is positioned 212 in view of the optical system, and a stereo image pair of the secondary calibration phantom as viewed in the reference setting is captured and saved as part of calibration information, where the secondary calibration phantom is a two-dimensional (i.e. two-dimensional calibration object), flat phantom having marks printed thereon),

wherein the calibration data generated in step (a) for the correction purposes (Par. [0025]: parameters for reconstruction (i.e. correction purposes) surface mapping routine 132 are derived that are sufficient for reconstructing a surface profile (i.e. calibration data) from a pair of stereo images (i.e. captured image representations) taken with the optical system set 202 to the reference setting at Step 206) are applied at least partially to the captured further image representations (Par. [0025]: then applied at least partially to captured further image representations at step 212), wherein step (b) is performed (Step 212 is performed, Par. [0033]: a stereo image pair of the secondary calibration phantom as viewed in the reference setting is captured and saved (i.e. stored) as part of calibration information (i.e. further calibration data)) and/or repeated for the corrected further image representations, and/or wherein the further calibration data generated in step (b) for correction purposes are applied at least partially to the captured image representations, wherein step (a) is performed and/or repeated for the corrected images,

and wherein the stored calibration data and further calibration data (Par. [0032]: These 3D surface profile extraction parameters (i.e. calibration data) are then saved 210 for later use; and Par. [0033]: a secondary calibration phantom is positioned 212 in view of the optical system, and a stereo image pair of the secondary calibration phantom as viewed in the reference setting is captured and saved as part of calibration information (i.e. further calibration data)) are applied to subsequently captured image representations (Par. [0035]: With the optical system set 302 to the arbitrary desired setting, the secondary calibration phantom is positioned in view of the optical system in a position approximating that where tissue 106 will be present during surgery, and a stereo image pair of the secondary calibration phantom is captured or taken 304 by cameras 118, 120 (i.e. subsequent capture)), such that the subsequently captured image representations are corrected by the calibration data and the further calibration data, and wherein corrected image representations are provided (Par. [0066]: when a surgeon selects a runtime setting, a runtime distortion field parameter (DFP(run)) is then determined by interpolation between nearby secondary calibration points recorded in the library 372. Par. [0067]: A runtime stereo image (i.e. subsequently captured) is then captured, and warped to correspond to images captured at the primary calibration point or reference setting of that quadrant, such as setting 352 for the lower left quadrant 374 or setting 355 for runtime settings in the top right quadrant 378. 3D extraction is then performed on the warped image, using 3D extraction parameters recorded in library 372 and associated with the primary calibration point or reference setting 352, 355, associated with that quadrant).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Ji (US 2014/0362186 A1) in view of Usikov (US 2003/0012449 A1), referred to as Usikov hereinafter.

Regarding Claim 10: Ji discloses claim 1. Ji further discloses wherein illumination correction data of at least one light source of the medical microscope (Par. [0019]: An infrared-light-emitting tracker 142 is rigidly attached to the microscope, allowing its position and orientation to be determined with respect to a reference frame (i.e., "patient tracker") during image acquisition using a commercial tracking system) are generated and are taken into account when generating at least one of the calibration data and the further calibration data (Par. [0031]: an intensity-based correlation metric and a smoothness constraint (i.e. generated illumination correction data) are used to find the correspondence points in both images of the pair (i.e. calibration data)). Ji does not specifically teach one additional image representation of a reflective calibration object.
However, Usikov teaches wherein at least one additional image representation of at least one of a homogeneously reflecting and reflective calibration object is furthermore captured with the stereo camera system (Fig. 1, Par. [0020] An image or picture (i.e. one additional image representation) of the object is created by the imaging system by measuring and recording the intensity or magnitude of the illuminating radiation that either passes through or reflects from the object 14. In either case, the image typically consists of a large number of intensity measurements taken and recorded at a large number of points across the planar surface 12 (i.e. homogeneously reflecting and reflective object)), wherein illumination correction data (Par. [0027], the step of correcting 104 uses the measured brightness values .f.sub.n of the step of measuring 102 and yields corrected brightness values g.sub.n (i.e. illumination correction data)) for correcting an illumination geometry of at least one light source (Par. [0029], the corrected brightness values g.sub.n produced by the step of correcting 104 advantageously will be corrected for the illumination intensity variations of the point source 10, where in Par. [0019], The point source 10 (i.e. light source) is located near to and illuminates the planar surface 12 using some form of illumination, including but not limited to electromagnetic radiation such as microwaves, X-rays, visible light, or Ultraviolet light and acoustic energy such as ultrasound) are generated based on the at least one additional image representation (Par. [0025], measuring 102 one or more brightness value(s) f.sub.n (i.e. illumination geometry) for each of the N pixels in the image (i.e. additional image representation)). References Ji and Usikov are considered to be analogous art because they relate to imaging systems. 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to specify capturing an image for correcting the illumination geometry of the light source, as taught by Usikov, with the invention of Ji. This modification would allow minimizing or largely eliminating the effects of using a light source to illuminate an object (see Usikov, Par. [0007]).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the Examiner should be directed to SUSAN E HODGES, whose telephone number is (571) 270-0498. The Examiner can normally be reached M-F, 8:00 am - 4:00 pm. If attempts to reach the Examiner by telephone are unsuccessful, the Examiner's supervisor, Brian T. Pendleton, can be reached at (571) 272-7527. The fax number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system.
Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /Susan E. Hodges/Primary Examiner, Art Unit 2425

Prosecution Timeline

Jul 25, 2024
Application Filed
Oct 29, 2025
Non-Final Rejection — §102, §103, §112
Jan 30, 2026
Response Filed
Feb 27, 2026
Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603982
STEREOSCOPIC HIGH DYNAMIC RANGE VIDEO
2y 5m to grant Granted Apr 14, 2026
Patent 12604008
ADAPTIVE CLIPPING IN MODELS PARAMETERS DERIVATIONS METHODS FOR VIDEO COMPRESSION
2y 5m to grant Granted Apr 14, 2026
Patent 12574558
Method and Apparatus for Sign Coding of Transform Coefficients in Video Coding System
2y 5m to grant Granted Mar 10, 2026
Patent 12568212
ADAPTIVE LOOP FILTERING ON OUTPUT(S) FROM OFFLINE FIXED FILTERING
2y 5m to grant Granted Mar 03, 2026
Patent 12556671
THREE DIMENSIONAL STROBO-STEREOSCOPIC IMAGING SYSTEMS AND ASSOCIATED METHODS
2y 5m to grant Granted Feb 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
67%
Grant Probability
81%
With Interview (+14.4%)
2y 4m
Median Time to Grant
Moderate
PTA Risk
Based on 375 resolved cases by this examiner. Grant probability derived from career allow rate.
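The headline projections above are consistent with simple ratios over the examiner's career counts. The calculation below is a back-of-envelope reconstruction under that assumption, not the tool's actual model:

```python
# Examiner's career data, as reported above.
granted, resolved = 250, 375

# Career allow rate: 250 / 375 ≈ 0.667, shown as the 67% grant probability.
allow_rate = granted / resolved

# Adding the observed +14.4% interview lift yields the 81% figure.
interview_lift = 0.144
with_interview = allow_rate + interview_lift

print(round(allow_rate * 100), round(with_interview * 100))  # 67 81
```

Treat the additive lift as a rough heuristic; lifts measured on subsets of cases do not strictly compose this way.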
