Prosecution Insights
Last updated: April 19, 2026
Application No. 18/684,081

Volumetric Imaging

Non-Final OA (§102, §103)
Filed: Feb 15, 2024
Examiner: LEE, SHUN K
Art Unit: 2884
Tech Center: 2800 (Semiconductors & Electrical Systems)
Assignee: UNIVERSITETET I TROMSØ - NORGES ARKTISKE UNIVERSITET
OA Round: 1 (Non-Final)
Grant Probability: 42% (Moderate)
OA Rounds: 1-2
To Grant: 3y 9m
With Interview: 58%

Examiner Intelligence

Career Allow Rate: 42% (294 granted / 701 resolved; -26.1% vs TC avg)
Interview Lift: +15.7% for resolved cases with interview (strong)
Typical Timeline: 3y 9m avg prosecution; 61 currently pending
Career History: 762 total applications across all art units

Statute-Specific Performance

§101: 1.9% (-38.1% vs TC avg)
§103: 50.6% (+10.6% vs TC avg)
§102: 20.4% (-19.6% vs TC avg)
§112: 23.8% (-16.2% vs TC avg)

Deltas are relative to a Tech Center average estimate; based on career data from 701 resolved cases.
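The statistics above are simple ratios, and a quick sketch makes the relationships explicit (values are taken from the report; variable names and the round-to-percent convention are my own):

```python
# Arithmetic check of the examiner statistics reported above.
# Figures come from the report; names here are illustrative only.

granted, resolved = 294, 701
career_allow_rate = granted / resolved
print(f"Career allow rate: {career_allow_rate:.1%}")  # reported as 42%

# Each statute row gives the examiner's allow rate and its delta vs the
# Tech Center average, so the implied TC average is rate - delta.
statute_rows = {
    "101": (0.019, -0.381),
    "103": (0.506, +0.106),
    "102": (0.204, -0.196),
    "112": (0.238, -0.162),
}
for statute, (rate, delta) in statute_rows.items():
    print(f"§{statute}: examiner {rate:.1%}, implied TC average {rate - delta:.1%}")
```

Every statute row implies the same ~40% baseline, which suggests the deltas were computed against a single Tech Center average estimate rather than per-statute averages.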

Office Action

Rejections under §102 and §103
DETAILED ACTION

National Stage Application

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Acknowledgment is made of applicant's claim for foreign priority based on an application filed in GB on 17 August 2021. It is noted, however, that a copy of the certified copy of the GB 2111782.5 priority document has not been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).

Information Disclosure Statement

The information disclosure statement filed on 14 May 2024 does not fully comply with the requirements of 37 CFR 1.98 because each publication listed in an information disclosure statement must be identified by publisher, author (if any), title, relevant pages of the publication, date, and place of publication. The date of publication supplied (e.g., Tsang et al.) must include at least the month and year of publication, except that the year of publication (without the month) will be accepted if the applicant points out in the information disclosure statement that the year of publication is sufficiently earlier than the effective U.S. filing date and any foreign priority date so that the particular month of publication is not in issue. Since the submission appears to be bona fide, applicant is given ONE (1) MONTH from the date of this notice to supply the above-mentioned omissions or corrections in the information disclosure statement. NO EXTENSION OF THIS TIME LIMIT MAY BE GRANTED UNDER EITHER 37 CFR 1.136(a) OR (b). Failure to timely comply with this notice will result in the above-mentioned information disclosure statement being placed in the application file with the noncomplying information not being considered. See 37 CFR 1.97(i).

Specification

The disclosure is objected to because of the following informalities: on pg. 19, “sample region 204” in line 6 should probably be --sample region 4-- (37 CFR 1.437 and PCT Rule 11.13(m)). Appropriate correction is required. The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Claim Objections

Claims 1-4, 6, 14, 15, 18, and 20 are objected to because of the following informalities:
(a) in claim 1, “each plane” on line 3 should probably be --each of the planes-- (antecedent basis);
(b) in claim 1, “each section of pixels” on line 6 should probably be --each of the sections of pixels-- and “each section of pixels” on line 14 should probably be --each of the sections of pixels-- (antecedent basis);
(c) in claim 2, “each section of pixels” on lines 1-2 should probably be --each of the sections of pixels-- (antecedent basis);
(d) in claim 3, “them multi-plane optical assembly” on lines 1-2 should probably be --the multi-plane optical assembly-- (antecedent basis);
(e) in claim 3, “at least four depths” on lines 2-3 should probably be --at least four of the depths-- (antecedent basis);
(f) in claim 4, “them multi-plane optical assembly” on lines 1-2 should probably be --the multi-plane optical assembly-- (antecedent basis);
(g) in claim 4, “at least eight depths” on lines 2-3 should probably be --at least eight of the depths-- (antecedent basis);
(h) in claim 6, “each section” on line 3 should probably be --each of the sections-- (antecedent basis);
(i) in claim 14, “each section” on line 2 should probably be --each of the sections-- (antecedent basis);
(j) in claim 15, “each subsection” on line 2 should probably be --each of the subsections-- (antecedent basis);
(k) in claim 18, “each section of pixels” on line 3 should probably be --each of the sections of pixels-- and “each section of pixels” on line 5 should probably be --each of the sections of pixels-- (antecedent basis); and
(l) in claim 20, “an image sensor comprising a plurality of sections of pixels;” on line 4 should probably be deleted (antecedent basis; see also “wherein the image sensor comprises a plurality of sections of pixels” on lines 5-6 in claim 20).
Appropriate correction is required.

Claim Interpretation

The specification (e.g., see “… narrow light sheet extending in an axial direction parallel to an imaging axis of the objective lens assembly … sweep the axial light sheet smoothly across (i.e. through) the sample region 4 so as to illuminate a continuum of axially-extending planes through the sample region. Within this continuum, a plurality of distinct axially-extending planes in the sample region can be considered to be sequentially illuminated at an illumination rate that is equal to the sensing rate. In alternative embodiments, the light sheet could be moved in a series of discrete steps, at an illumination rate that is equal to the sensing rate … light sheet extending at an oblique angle to an imaging axis of the objective lens assembly 220 (an inclined light sheet) …” on pg. 13, lines 24+ and pg. 19, lines 1+) serves as a glossary (MPEP § 2111.01) for the claim term “illuminate a plurality of planes in a sample region sequentially at an illumination rate, each plane extending over a plurality of depths in the sample region”.

The specification (e.g., see “… "sections") of pixels sequentially at a sensing rate. Sensing here may refer to the period of time over which a pixel is configured to accumulate charge, before the pixel is read out … division into sub-sections is not a physical attribute of the sensor 6 itself, but relates instead to how the image data from the sensor 6 is processed …” on pg. 13, lines 12+) serves as a glossary (MPEP § 2111.01) for the claim term “an image sensor comprising a plurality of sections of pixels and arranged to sense each section of pixels sequentially at a sensing rate”.
The specification (e.g., see “… Using selective sensitivity of the image sensor to achieve optical sectioning in this manner may reduce constraints on other optical components of the apparatus such as the illumination assembly and/or the multi-plane optical assembly. For instance, it may not be essential for the multi-plane optical assembly to provide optical sectioning between the plurality of planes (i.e. it may allow light from all across the sample region to be received over the whole image sensor simultaneously, if the whole sample region were to be illuminated at once), allowing smaller and/or cheaper types of multi-plane optical assembly to be used compared to prior art approaches. Whilst such embodiments may result in light from near an illuminated plane spilling onto pixels of the image sensor adjacent the corresponding section, this is acceptable because these pixels are sensed at a different time … first multi-plane optical assembly 802 comprises a multi-focus (MF) grating, arranged for use with axial illumination. A second multi-plane optical assembly 804 also comprises a MF grating. However the second multi-plane optical assembly 804 is rotated relative to the first multi-plane optical assembly 802 so that it is arranged for use with inclined illumination. A third multi-plane optical assembly 806 comprises a beam splitter (BS) cascade arranged for use with axial illumination. A fourth multi-plane optical assembly 808 also comprises a BS cascade. The BS cascades 806, 808 each comprise several beamsplitter cubes and a prism mirror. Each of the components of the fourth multi-plane optical assembly 808 are rotated relative to those of the third multi-plane optical assembly 806 so that it is arranged for use with inclined illumination. A fifth multi-plane optical assembly 810 comprises a multi-focus (MF) prism …” on pg. 4, lines 1+ and pg. 22, lines 9+) serves as a glossary (MPEP § 2111.01) for the claim term “a multi-plane optical assembly arranged to receive light from the plurality of depths in the sample region and, for each section of said sections of pixels, to direct light simultaneously from each of the plurality of depths in the respective plane to a different respective subsection of said section”.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-3, 7-10, and 12-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Mertz et al. (US 2019/0179127).

In regard to claim 1, Mertz et al. disclose an apparatus for volumetric imaging, comprising: (a) an illumination assembly arranged to direct light to illuminate a plurality of planes in a sample region sequentially at an illumination rate, each plane extending over a plurality of depths in the sample region (e.g., see “… Multi-Z confocal microscopy system that can image multiple depths into a sample simultaneously … system shown in FIG. 2 is similar to the system shown in FIG. 1 except that illumination beam B1 is formed in a line in the X … dimension using for example, a cylindrical lens (or diffractive element) and the line is scanned in the Y … dimension … focusing optics f1, f2, f6, f7 and the microscope objective can be configured to focus the illumination beam B1 into a sheet … that extends in the X and Z … dimensions that can be scanned or swept across in the sample in the Y … dimension …” in Fig. 2 and paragraphs 37 and 42); (b) an image sensor comprising a plurality of sections of pixels and arranged to sense each section of pixels sequentially at a sensing rate (e.g., see “… predefined array of camera pixels in the frame camera …” in Fig. 2 and paragraph 44); and (c) a light-receiving assembly arranged to receive light from the sample region and to direct light received from each of said planes in the sample region to a different respective section of said sections of pixels (e.g., see “… detection subsystem … second scanning mechanism can be coupled to the first scanning mechanism such that the position of the … sheet) on the sample corresponds to a corresponding position of the detection beam on the frame camera. In this way, as the illumination line is scanned over the target in the sample, the resulting detection beam can be scanned over a predefined array of camera pixels in the frame camera …” in Fig. 2 and paragraph 44), wherein the light-receiving assembly comprises a multi-plane optical assembly arranged to receive light from the plurality of depths in the sample region and, for each section of said sections of pixels, to direct light simultaneously from each of the plurality of depths in the respective plane to a different respective subsection of said section (e.g., see “… detection beam B2 is elongated in the Y (or X) dimension and the pinholes from the embodiment of FIG. 1 are replaced with reflecting slits configured as described with respect to FIG. 1 and the point detectors can be replaced with line detectors (e.g., line cameras) aligned with the slits … magnification of the detection beam B2 can be used to define the distance between slits that correspond to the detected probe volumes in the illumination sheet …” in Fig. 2 and paragraph 43), and wherein the illumination rate is equal to the sensing rate, such that each section of pixels is arranged to sense light from the plurality of depths in the respective plane as the plane is illuminated by the illumination assembly (e.g., see “… second scanning mechanism can be coupled to the first scanning mechanism such that the position of the … sheet) on the sample corresponds to a corresponding position of the detection beam on the frame camera. In this way, as the illumination line is scanned over the target in the sample, the resulting detection beam can be scanned over a predefined array of camera pixels in the frame camera …” in Fig. 2 and paragraph 44).

In regard to claim 2, which is dependent on claim 1, Mertz et al. also disclose an apparatus arranged such that each section of pixels senses no light from any plane of the plurality of planes other than said respective plane (e.g., “… systems according to the invention provide for simultaneous imaging of multiple planes in the Z dimension such that the volume imaging rate is the same as a single-frame rate and each plane is optically sectioned, meaning it is devoid of out-of-focus background light …” in paragraph 14).

In regard to claim 3, which is dependent on claim 1, Mertz et al. also disclose that the multi-plane optical assembly is arranged to direct light simultaneously from each of at least four depths to different respective subsections of the image sensor (e.g., “… signals are acquired simultaneously and can be used to simultaneously produce four (or more) images from four (or more) different depths, thus enabling high speed Multi-Z confocal imaging …” in paragraph 52).
In regard to claims 7 and 8, which are dependent on claim 1, Mertz et al. also disclose an apparatus arranged to volumetrically image the sample region repeatedly over time to generate image data representing a time series of volumes of the sample region, and operable to volumetrically image the sample region at a rate of ten or more volumes per second (e.g., see “… scanning mechanism (e.g., a single galvanometric mirror or rotating polygonal mirror) … speed limitation for a Multi-Z confocal microscopy system according to the invention is defined by the speed of the scanning mechanism or of the image acquisition electronics of the detector subsystem. Resonant galvanometers or polygonal scanners can easily provide video-rate imaging. For example, if the detectors can detect at a 10 MHz rate, then each detector can produce a 512x512 pixel image at a rate of 38 Hz …” in Fig. 2 and paragraphs 42 and 57).

In regard to claims 9 and 10, which are dependent on claim 1, Mertz et al. also disclose that the light-receiving assembly comprises an objective lens assembly arranged to pass light emanating from the sample region to the multi-plane optical assembly (e.g., see “… detection beam B2 emanates from the sample through the microscope objective … to the detector subsystem …” in Fig. 2 and paragraph 43), and wherein the objective lens assembly also forms part of the illumination assembly and is arranged to pass light from the illumination assembly into the sample region (e.g., see “… illumination beam B1 can be focused to a beam width that is in a range from 65% to 0.5% of the back aperture of the microscope objective …” in Fig. 2 and paragraph 42).

In regard to claim 12, which is dependent on claim 1, Mertz et al. also disclose that the illumination assembly is arranged to generate a light sheet and to sweep or step the light sheet across the sample region to illuminate said plurality of planes (e.g., see “… Multi-Z confocal microscopy system that can image multiple depths into a sample simultaneously … system shown in FIG. 2 is similar to the system shown in FIG. 1 except that illumination beam B1 is formed in a line in the X … dimension using for example, a cylindrical lens (or diffractive element) and the line is scanned in the Y … dimension … focusing optics f1, f2, f6, f7 and the microscope objective can be configured to focus the illumination beam B1 into a sheet … that extends in the X and Z … dimensions that can be scanned or swept across in the sample in the Y … dimension …” in Fig. 2 and paragraphs 37 and 42).

In regard to claim 13, which is dependent on claim 1, Mertz et al. also disclose that the plurality of planes are parallel planes (e.g., see “… Multi-Z confocal microscopy system that can image multiple depths into a sample simultaneously … system shown in FIG. 2 is similar to the system shown in FIG. 1 except that illumination beam B1 is formed in a line in the X … dimension using for example, a cylindrical lens (or diffractive element) and the line is scanned in the Y … dimension … focusing optics f1, f2, f6, f7 and the microscope objective can be configured to focus the illumination beam B1 into a sheet … that extends in the X and Z … dimensions that can be scanned or swept across in the sample in the Y … dimension …” in Fig. 2 and paragraphs 37 and 42).

In regard to claim 14, which is dependent on claim 1, Mertz et al. also disclose that each section consists of a respective contiguous set of pixels (e.g., see “… detection subsystem … second scanning mechanism can be coupled to the first scanning mechanism such that the position of the … sheet) on the sample corresponds to a corresponding position of the detection beam on the frame camera.
In this way, as the illumination line is scanned over the target in the sample, the resulting detection beam can be scanned over a predefined array of camera pixels in the frame camera …” in Fig. 2 and paragraph 44).

In regard to claim 15, which is dependent on claim 1, Mertz et al. also disclose that each subsection consists of a respective contiguous set of pixels (e.g., see “… detection subsystem … second scanning mechanism can be coupled to the first scanning mechanism such that the position of the … sheet) on the sample corresponds to a corresponding position of the detection beam on the frame camera. In this way, as the illumination line is scanned over the target in the sample, the resulting detection beam can be scanned over a predefined array of camera pixels in the frame camera …” in Fig. 2 and paragraph 44).

In regard to claim 16, which is dependent on claim 1, Mertz et al. also disclose that each of the sections of pixels comprises a respective line of pixels, and wherein the image sensor is arranged to sense adjacent lines sequentially (e.g., see “… detection subsystem … second scanning mechanism can be coupled to the first scanning mechanism such that the position of the … sheet) on the sample corresponds to a corresponding position of the detection beam on the frame camera. In this way, as the illumination line is scanned over the target in the sample, the resulting detection beam can be scanned over a predefined array of camera pixels in the frame camera …” in Fig. 2 and paragraph 44).

In regard to claim 17, which is dependent on claim 1, Mertz et al. also disclose a processing system arranged to receive image data from the image sensor, and configured to process the image data to generate a three-dimensional image data set (e.g., see “… detectors can be connected to a computer system that reconstructs the 3D volume image from the individual detector signal data. The scanning computer or controller keeps track of the position of the X-Y scanning mechanism to inform the reconstruction computer system of the location of the illumination beam B1 on the sample to aid in reconstruction of the 3D image. Since each detector can be configured to image at a different depth (e.g., position along the Z dimension) in the sample, the system is able to image multiple depths simultaneously enabling a single scan to produce a full 3D volume image at higher speeds … system shown in FIG. 2 is similar to the system shown in FIG. 1 except that illumination beam B1 is formed in a line in the X …” in Fig. 2 and paragraphs 39 and 42).

In regard to claim 18, which is dependent on claim 1, Mertz et al. also disclose an apparatus comprising a second image sensor comprising a plurality of sections and arranged to sense sequentially each section of pixels at the sensing rate, wherein the multi-plane optical assembly is arranged to direct light simultaneously from each of a second plurality of depths in the sample region to a respective subsection of each section of pixels of the second image sensor (e.g., see “… detection subsystem … second scanning mechanism can be coupled to the first scanning mechanism such that the position of the … sheet) on the sample corresponds to a corresponding position of the detection beam on the frame camera. In this way, as the illumination line is scanned over the target in the sample, the resulting detection beam can be scanned over a predefined array of camera pixels in the frame camera … detectors can include individual detectors (as shown in FIGS. 1, 3 and 5), line cameras (as shown in FIGS. 6, 7, and 8), and frame cameras (as shown in FIGS. 2A, 2B, 2C, 8 and 9) or a combination thereof …” in Fig. 2 and paragraphs 44 and 45).

In regard to claim 19, which is dependent on claim 1, Mertz et al. also disclose an apparatus arranged to perform fluorescence volumetric microscopy of a sample in the sample region (e.g., see “… sample can be configured (e.g., using fluorescent labeling) to fluoresce in response to exposure to an excitation beam of a specific excitation frequency (or in a specific frequency range) …” in Fig. 2 and paragraph 42).

In regard to claim 20, Mertz et al. disclose a method of volumetric imaging, the method comprising: (a) directing light to illuminate a plurality of planes in a sample region sequentially at an illumination rate, each plane extending over a plurality of depths in the sample region (e.g., see “… Multi-Z confocal microscopy system that can image multiple depths into a sample simultaneously … system shown in FIG. 2 is similar to the system shown in FIG. 1 except that illumination beam B1 is formed in a line in the X … dimension using for example, a cylindrical lens (or diffractive element) and the line is scanned in the Y … dimension … focusing optics f1, f2, f6, f7 and the microscope objective can be configured to focus the illumination beam B1 into a sheet … that extends in the X and Z … dimensions that can be scanned or swept across in the sample in the Y … dimension …” in Fig. 2 and paragraphs 37 and 42); and (b) directing light emanating from the sample region to an image sensor, wherein the image sensor comprises a plurality of sections of pixels, wherein light received from each of said planes in the sample region is directed to a different respective section of said sections of pixels, wherein, for each section of said sections of pixels, light is directed simultaneously from each of the plurality of depths in the respective plane to a different respective subsection of said section (e.g., see “… detection beam B2 is elongated in the Y (or X) dimension and the pinholes from the embodiment of FIG. 1 are replaced with reflecting slits configured as described with respect to FIG. 1 and the point detectors can be replaced with line detectors (e.g., line cameras) aligned with the slits … magnification of the detection beam B2 can be used to define the distance between slits that correspond to the detected probe volumes in the illumination sheet … predefined array of camera pixels in the frame camera …” in Fig. 2 and paragraphs 43 and 44), and wherein the image sensor senses each of said sections sequentially at a sensing rate, wherein the sensing rate is equal to the illumination rate, such that each section of pixels senses light from the plurality of depths in the respective plane as the plane is illuminated (e.g., see “… second scanning mechanism can be coupled to the first scanning mechanism such that the position of the … sheet) on the sample corresponds to a corresponding position of the detection beam on the frame camera. In this way, as the illumination line is scanned over the target in the sample, the resulting detection beam can be scanned over a predefined array of camera pixels in the frame camera …” in Fig. 2 and paragraph 44).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Mertz et al. (US 2019/0179127).
In regard to claim 4, which is dependent on claim 3, insofar as it is understood, the cited prior art is applied as in claim 3 above and a prima facie case of obviousness exists (MPEP § 2144.05) since the claimed “at least eight depths” range lies inside the “four (or more) different depths” range disclosed by the cited prior art.

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Mertz et al. (US 2019/0179127) in view of Abrahamsson et al. (US 2013/0176622).

In regard to claim 5, which is dependent on claim 1, the apparatus of Mertz et al. lacks an explicit description of details of the “… Multi-Z confocal microscopy system that can image multiple depths into a sample simultaneously …” such as a multi-plane prism or a multi-plane diffraction grating. However, “… image multiple depths into a sample simultaneously …” details are known to one of ordinary skill in the art (e.g., see “… Fast acquisition of three-dimensional data is important in biological microscopy. Many imaging modalities, such as wide-field, laser-scanning confocal, spinning-disk confocal, and light-sheet microscopy record information one focal plane at a time. Three-dimensional images may be assembled from the information recorded from each focal plane using sequential mechanical refocusing … imaging system 102 uses a diffractive element (not shown in FIG. 1, but discussed in detail below) to spatially separate and focus the light from each of the depths 122, 124, and 126 into diffractive orders and a correction module (not shown) to remove chromatic aberrations and direct rays 142, 144, and 146 towards the camera sensor 150. The diffractive element 102 is a multi-focus diffractive grating (MFG), which is discussed in greater detail below. The imaging system 102 generates the two-dimensional images 152, 154, and 156 from the rays 142, 144, and 146 simultaneously to generate the three-dimensional representation of the volume 120 …” in paragraphs 41 and 46 of Abrahamsson et al.).
It should be noted that “when a patent claims a structure already known in the prior art that is altered by the mere substitution of one element for another known in the field, the combination must do more than yield a predictable result”. KSR International Co. v. Teleflex Inc., 550 U.S. 398, 416, 82 USPQ2d 1385, 1395 (2007) (citing United States v. Adams, 383 U.S. 39, 40 [148 USPQ 479] (1966)). See MPEP § 2143. In this case, one of ordinary skill in the art could have substituted a known conventional laser-scanning confocal system (e.g., comprising details such as a “multi-focus diffractive grating (MFG)”, in order to achieve “Fast acquisition of three-dimensional” data) for the unspecified laser-scanning confocal system of Mertz et al., and the results of the substitution would have been predictable. Therefore it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to provide a known conventional laser-scanning confocal system (e.g., comprising details such as a multi-plane prism or a multi-plane diffraction grating) as the unspecified laser-scanning confocal system of Mertz et al.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Mertz et al. (US 2019/0179127) in view of Holst (“Scientific CMOS camera technology: A breeding ground for new microscopy techniques”, Microscopy and Analysis, Vol. 28, No. 1 (January 2014), pp. S4-S6, S8-S9, and S12).

In regard to claim 6, which is dependent on claim 1, Mertz et al. also disclose that the image sensor is an electronic image sensor and comprises circuitry arranged to selectively sense from pixels in each section of the image sensor in sequential periods (e.g., “… as the illumination line is scanned over the target in the sample, the resulting detection beam can be scanned over a predefined array of camera pixels in the frame camera … four line signals are swept across the frame camera surface during a camera exposure. In this manner, the four line signals can produce four images, side by side, within a single camera frame. For example, a standard sCMOS frame camera can perform at frame rates of up to 400 Hz or more when reading frames 512x2048 pixels in size. Each camera frame can then contain four 512x512 images side by side …” in paragraphs 44 and 58). The apparatus of Mertz et al. lacks an explicit description of details of the “… standard sCMOS frame camera …” such as electronic shutter circuitry. However, “… standard sCMOS frame camera …” details are known to one of ordinary skill in the art (e.g., see “… two sCMOS camera were used simultaneously for the recordings, which enabled the researchers to generate sharper images with less photo bleaching compared to standard methods like spinning disk confocal microscopy and SPIM with one camera. For this purpose they operated the cameras at 200 frames per second and precisely synchronized the recording with the laser light using the global reset/rolling readout feature of the CIS2521 image sensor …” in Holst). It should be noted that “when a patent claims a structure already known in the prior art that is altered by the mere substitution of one element for another known in the field, the combination must do more than yield a predictable result”. KSR International Co. v. Teleflex Inc., 550 U.S. 398, 416, 82 USPQ2d 1385, 1395 (2007) (citing United States v. Adams, 383 U.S. 39, 40 [148 USPQ 479] (1966)). See MPEP § 2143. In this case, one of ordinary skill in the art could have substituted a known conventional camera (e.g., comprising details such as “using the global reset/rolling readout feature of the CIS2521 image sensor”, in order “to generate sharper images”) for the unspecified camera of Mertz et al., and the results of the substitution would have been predictable.
Therefore it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to provide a known conventional camera (e.g., comprising details such as electronic shutter circuitry arranged to selectively sense from pixels in each section of the image sensor in sequential electronic-shutter periods) as the unspecified camera of Mertz et al.

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Mertz et al. (US 2019/0179127) in view of Anhut et al. (US 2022/0043246).

In regard to claim 11, which is dependent on claim 9, the apparatus of Mertz et al. lacks an explicit description of details of the “… microscope objective can be configured to focus the illumination beam B1 into a sheet … that extends in the X and Z … dimensions that can be scanned or swept across in the sample in the Y … dimension …” such as at least one of the planes being inclined to the imaging axis of the objective lens assembly. However, “… scanned …” details are known to one of ordinary skill in the art (e.g., see “… in addition to the generation of an inclined light sheet, different illumination modes can optionally be realized … scanning apparatus for setting an angle of incidence of a focused laser light beam into a laser focus in the objective pupil, also referred to as entrance pupil, of an objective acting as the illumination objective. The laser focus is here directed into an entrance point that is radially offset with respect to the optical axis of the illumination objective. The result of this is an (inclined) light beam that is oblique with respect to the optical axis of the objective … moved quickly back and forth (scanned) within a defined angular range …” in paragraphs 13 and 15 of Anhut et al.).
It should be noted that “when a patent claims a structure already known in the prior art that is altered by the mere substitution of one element for another known in the field, the combination must do more than yield a predictable result.” KSR International Co. v. Teleflex Inc., 550 U.S. 398, 416, 82 USPQ2d 1385, 1395 (2007) (citing United States v. Adams, 383 U.S. 39, 40, 148 USPQ 479 (1966)). See MPEP § 2143.

In this case, one of ordinary skill in the art could have substituted a known conventional scanning configuration (e.g., comprising details such as a “scanning apparatus for setting an angle of incidence of a focused laser light beam into a laser focus in the objective pupil”, the “result of [which] is an (inclined) light beam that is oblique with respect to the optical axis of the objective” that can be “moved quickly back and forth (scanned) within a defined angular range”) for the unspecified scanning configuration of Mertz et al., and the results of the substitution would have been predictable.

Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to provide a known conventional scanning configuration (e.g., comprising details such as at least one of the plurality of planes being inclined to an imaging axis of the objective lens assembly) as the unspecified scanning configuration of Mertz et al.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Shun Lee, whose telephone number is (571) 272-2439. The examiner can normally be reached Monday-Friday. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, David Makiya, can be reached at (571) 272-2273. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/SL/
Examiner, Art Unit 2884

/DAVID J MAKIYA/
Supervisory Patent Examiner, Art Unit 2884
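The sCMOS frame-splitting arithmetic quoted from Mertz et al. earlier in this office action (a 512x2048 frame containing four 512x512 images side by side, read at up to 400 Hz) can be checked with a short sketch. The `tile_count` helper is illustrative only; it is not part of the cited references.

```python
# Sketch: how four 512x512 sub-images tile one 512x2048 sCMOS frame,
# per the passage quoted from Mertz et al. The helper is hypothetical.

def tile_count(frame_h: int, frame_w: int, tile_h: int, tile_w: int) -> int:
    """Number of non-overlapping tile_h x tile_w tiles in a frame."""
    if frame_h % tile_h or frame_w % tile_w:
        raise ValueError("tiles do not evenly divide the frame")
    return (frame_h // tile_h) * (frame_w // tile_w)

frames_per_second = 400                 # "up to 400 Hz or more" per the quote
tiles = tile_count(512, 2048, 512, 512)
print(tiles)                            # 4 sub-images per camera frame
print(tiles * frames_per_second)        # 1600 sub-images read out per second
```

Since the four line signals are captured within the same frame, each of the four image streams still updates at the full frame rate; the 1600/s figure simply counts total sub-images read out.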

Prosecution Timeline

Feb 15, 2024
Application Filed
Nov 13, 2025
Non-Final Rejection — §102, §103
Jan 22, 2026
Interview Requested
Jan 27, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12487336
CALIBRATION SYSTEM AND METHOD FOR INTEGRATED OPTICAL PHASED ARRAY CHIP
2y 5m to grant Granted Dec 02, 2025
Patent 12480865
GAS CELL
2y 5m to grant Granted Nov 25, 2025
Patent 12465297
MULTI-MODALITY DENTAL X-RAY IMAGING DEVICE AND METHODS
2y 5m to grant Granted Nov 11, 2025
Patent 12453521
MOBILE MEDICAL DEVICE, MOBILE DEVICE, AND METHOD
2y 5m to grant Granted Oct 28, 2025
Patent 12449554
Scintillator Detectors and Methods for Positron Emission Tomography
2y 5m to grant Granted Oct 21, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

1-2
Expected OA Rounds
42%
Grant Probability
58%
With Interview (+15.7%)
3y 9m
Median Time to Grant
Low
PTA Risk
Based on 701 resolved cases by this examiner. Grant probability derived from career allow rate.
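The projection figures above can be reproduced with a short sketch. This assumes a simple additive model (career allow rate of 294/701 plus the quoted +15.7% interview lift); how the tool actually derives its projections is not disclosed.

```python
# Sketch of the projection arithmetic shown above, assuming an additive
# interview lift. The model choice is an assumption, not the tool's method.

granted, resolved = 294, 701        # examiner's career grants / resolved cases
interview_lift = 0.157              # quoted +15.7% lift with an interview

base_grant_prob = granted / resolved
with_interview = base_grant_prob + interview_lift

print(f"{base_grant_prob:.0%}")     # 42%
print(f"{with_interview:.0%}")      # 58%
```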
