Prosecution Insights
Last updated: April 19, 2026
Application No. 17/311,331

OPTICAL DEVICE, PROGRAM, CONTROL DEVICE, AND IMAGING SYSTEM

Non-Final OA (§103, §112)
Filed: Jun 05, 2021
Examiner: SRIDHAR, SAMANVITHA
Art Unit: 2872
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: Nikon Corporation
OA Round: 5 (Non-Final)

Grant Probability: 65% (Moderate)
OA Rounds: 5-6
To Grant: 3y 8m
With Interview: 91%

Examiner Intelligence

Career Allow Rate: 65% (grants 65% of resolved cases; 50 granted / 77 resolved; -3.1% vs TC avg)
Interview Lift: +26.3% (strong lift among resolved cases with interview)
Avg Prosecution: 3y 8m (typical timeline); 35 applications currently pending
Total Applications: 112 (career history, across all art units)

Statute-Specific Performance

§101: 1.3% (-38.7% vs TC avg)
§103: 38.7% (-1.3% vs TC avg)
§102: 24.5% (-15.5% vs TC avg)
§112: 26.8% (-13.2% vs TC avg)

Deltas are relative to the Tech Center average estimate • Based on career data from 77 resolved cases
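As a quick sanity check, the headline examiner figures above can be reproduced from the raw counts reported in this section. Note that the implied Tech Center average is back-calculated from the stated -3.1% delta; it is an inference, not a figure from the report.

```python
# Reproduce the headline examiner statistics from the raw counts above:
# 50 granted out of 77 resolved cases.
granted = 50
resolved = 77

career_allow_rate = granted / resolved  # ~0.649, reported as 65%
print(f"Career allow rate: {career_allow_rate:.1%}")  # -> Career allow rate: 64.9%

# The "-3.1% vs TC avg" delta implies a Tech Center 2800 average of roughly:
implied_tc_avg = career_allow_rate + 0.031
print(f"Implied TC average: {implied_tc_avg:.1%}")  # -> Implied TC average: 68.0%
```

The same back-calculation applied to the statute-specific deltas recovers the per-statute Tech Center estimates (e.g. §103: 38.7% + 1.3% = 40.0%).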

Office Action

§103 §112
DETAILED ACTION

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/30/2025 has been entered.

Response to Remarks

Applicant's remarks regarding the prior art rejection of the claims have been fully considered but are moot upon further consideration because the new grounds of rejection in light of a change of statutory basis and/or in light of Singer et al.'s teachings are necessitated by the Applicant's amendments (on 12/30/2025), as detailed below.

Claim Objections

The claims are objected to because of the following informalities: 1. A typo (underlined) in claim 1: "…a central processor connected to the memory and that: acquires…". Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a): (a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

Claims 1-2, 4-5, 7-10, 12-13 and 15-18 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement.
The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. Claims 1, 8-9 and 17 contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventors, at the time the application was filed, had possession of the claimed invention. The replacement claims submitted 12/30/2025 were not filed with the original disclosure filed on 06/05/2021 and are therefore examined for new matter; see MPEP 608.04(b) and 714.01(e).

The limitation of Claims 1, 8-9 and 17 reciting "a memory which stores a learning result indicating a relationship between the wavefront in the pupil and imaging performance of the imaging optical system for each of a plurality of predetermined areas formed by dividing the image of the optical image generated by the imaging element; and a central processor connected to the memory and that: acquires information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront, wherein the specific area is a partial area in the image; and controls the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element on the basis of the learning result and the information that designates the specific area" (emphasis added) amounts to prohibited new matter. Specifically, the limitation lacks support in the original specification and claims submitted 06/05/2021 because all embodiments of the as-filed specification corresponding to FIGS. 1-15 of the as-filed Drawings fail to disclose and/or depict the limitation directed to a plurality of predetermined areas as claimed, and the as-filed specification fails to provide any support that: (i) the claimed relationship between the wavefront in the pupil and imaging performance of the imaging optical system corresponds to each of a plurality of predetermined areas; (ii) the predetermined areas are formed by dividing the image; (iii) the processor acquires information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront.

Applicant's statement in the Remarks that "Support for these amendments can be found in paragraphs [0031]-[0039] of the instant application" (pg. 11 of Remarks) appears to be insufficient and not germane to the issue at hand: the cited portions of the specification appear to be silent with respect to the limitation directed to the plurality of predetermined areas as recited in the independent claims. Claims 2, 4-5, 7, 10, 12-13, 15-16 and 18 fail to cure the deficiencies of the rejected base claims. Therefore, Claims 1-2, 4-5, 7-10, 12-13 and 15-18 fail to comply with the written description requirement and are rejected under 35 USC 112(a). The Examiner respectfully suggests that the claims be amended to recite limitations that are supported by the originally-filed specification.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b): (b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 1-2, 4-5, 7-10, 12-13 and 15-18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C.
112, the applicant), regards as the invention.

Claims 1, 8-9 and 17 recite: "a memory which stores a learning result indicating a relationship between the wavefront in the pupil and imaging performance of the imaging optical system for each of a plurality of predetermined areas formed by dividing the image of the optical image generated by the imaging element; and a central processor connected to the memory and that: acquires information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront, wherein the specific area is a partial area in the image; and controls the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element on the basis of the learning result and the information that designates the specific area". It is unclear what is meant by the limitation "each of a plurality of predetermined areas formed by dividing the image": 'predetermined' with respect to what? What structure (or material or acts) performs the function of 'dividing the image'? In what manner is the image divided? There appears to be no context/reference frame by which one can ascertain what this means, especially in light of the limitation being directed to a structural feature(s) of an apparatus/system claim(s).

A claim term is functional when it recites a feature "by what it does rather than by what it is". Further, without reciting the particular structure, materials or steps that accomplish the function or achieve the result, all means or methods of resolving the problem may be encompassed by the claim. See MPEP § 2173.05(g), citing In re Swinehart, 439 F.2d 210, 212, 169 USPQ 226, 229 (CCPA 1971) and Ariad Pharmaceuticals, Inc. v. Eli Lilly & Co., 598 F.3d 1336, 1353, 94 USPQ2d 1161, 1173 (Fed. Cir. 2010) (en banc).
The use of functional language in a claim may fail "to provide a clear-cut indication of the scope of the subject matter embraced by the claim" and thus be rendered indefinite. In re Swinehart, 439 F.2d 210, 213 (CCPA 1971). Thus, the limitation is unclear as it recites functional language without providing a discernable boundary on what element performs the functions as claimed. Furthermore, the instant specification appears to be silent and fails to elucidate the limitation directed to said predetermined areas (see corresponding rejection under 35 U.S.C. 112(a)).

For the purposes of examination, the limitation will be treated as: "a memory which stores a learning result indicating a relationship between the wavefront in the pupil and imaging performance of the imaging optical system for areas of the optical image generated by the imaging element; and a central processor connected to the memory and that: acquires information that designates a specific area in the image to correct the wavefront, wherein the specific area is a partial area in the image; and controls the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element on the basis of the learning result and the information that designates the specific area". Claims 2, 4-5, 7, 10, 12-13, 15-16 and 18 inherit the deficiencies of the base claims, and are thus rejected under 35 U.S.C. 112(b).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 4-5, 8-10, 12-13, and 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Yamaguchi et al. (US 2007/0291230 A1) in view of Singer et al. (US 2015/0077844 A1).

Regarding Claim 1, as best understood, Yamaguchi discloses: An optical device (FIGS. 1-2 & para [0007]: an ophthalmologic photographing apparatus) comprising: an imaging optical system comprising optical lenses to form an optical image of a subject (¶0046-50: The light receiving optical systems 12 and 31 comprise lenses; ¶0087, 0090: the processor 600 rotates the scanning mirrors to scan the retina, and the light accumulation amount at the photodetector 32 is stored in association with each scan position in the memory 800); an imaging element that generates an image of the optical image formed by the imaging optical system (¶0094, 0097, 0102, 0104, 0107-08, 0112: when light of the set image pixel number (S_X, S_Y) is detected, the processing of picking up the retinal image is finished (D); ¶0047-48, 0056-57, 0063-64, 0082: image sensor 41 obtains an eye-anterior-part image; ¶0070: The image constructing processor 14-2 constructs a retinal image and outputs the signal corresponding to the constructed retinal image or the like or other signals/data to the processor 600); a spatial modulation element that comprises a plurality of pixels and changes a phase of a wavefront in a pupil of the imaging optical system (paras [0052-53], [0059]: spatial light modulator of wavefront compensation device 71 modulates phase based on characteristics obtained by 14; ¶0059, 0098); a memory (para [0087]); and a central processor (para [0069]: The processor 14-1 may contain the processor 600) connected to the memory and that: acquires information that designates a specific area in the image (FIGS. 1, 4, 6; paras [0088-90]: processor 600 indicates a pickup position (C_X, C_Y) for a high-magnification retinal image, such as an area on an image may be selected according to the low-magnification retinal image displayed on the display section 700 through the input section; para [0087]: The processor 600 reads out data in the memory I (800) and displays the read-out data as an image on the display section [processor connected to memory]) to correct the wavefront, wherein the specific area is a partial area in the image (¶0118-19: partial area within the low-magnification image that is defined by a start point Q and an end point R; see FIGS. 8-9 showing specific area being a partial area within the whole image; ¶0084: "The aberration compensation processing [wavefront manipulation] may be executed prior to the processing subsequent to step S200 described later, or executed in parallel to or in the progress of the processing concerned"; see FIG.
4 flowchart showing that generation of the entire image is a separate function from the selection of partial area within said image and the execution of wavefront control S105 may occur at any point in time relative to these two steps); and controls the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element (paras [0041], [0059], [0067], [0085-86]: processor receives signal from calculation section 14-1 which reads an image and on the basis of the read image, then controls wavefront compensation element 71 through the wavefront-compensation-device control apparatus 15; para [0084]: The aberration compensation processing may be executed prior to magnification mode step, or executed in parallel to or in the progress of the processing concerned); wherein the memory stores a previously generated learning result indicating a relationship between the image generated by the imaging element and the phase of the wavefront in the pupil (para [0080], [0085-86]: calculation section 14-1 calculates the aberration amount R on the basis of the wavefront image signal from image sensor 13 according to an equation [relationship] using Zernike coefficients; para [0071]: memory 800 properly stores the measured aberration, etc.), and wherein the central processor controls the spatial modulation element on the basis of the learning result and the image generated by the imaging element (para [0084]: processor 600 executes aberration compensation processing (S105); para [0060], [0086]: based on the predetermined threshold [learning result], the calculation section 14-1 controls the wavefront compensation element 71 for compensation of wavefront). 
Yamaguchi does not appear to explicitly disclose: a memory which stores a learning result indicating a relationship between the wavefront in the pupil and imaging performance of the imaging optical system for each of a plurality of predetermined areas formed by dividing the image of the optical image generated by the imaging element; a central processor connected to the memory and that: acquires information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront, wherein the specific area is a partial area in the image; and controls the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element on the basis of the learning result and the information that designates the specific area. Singer is related to Yamaguchi with respect to an analogous optical device with a control unit, an imaging optical system, and a spatial modulation element (¶0032-34: illumination device for providing an illumination spot, a scanner for moving the illumination spot over a sample to be examined at sequential scanning positions, an adaptive optics unit for controlling a wavefront of the illumination spot with a control device, and a detector for capturing a spatially resolved imaging spot emitted by the sample; ¶0077-82; FIGS. 
1-4), and Singer teaches: a memory which stores a learning result indicating a relationship between the wavefront in the pupil and imaging performance of the imaging optical system for each of a plurality of predetermined areas formed by dividing the image of the optical image generated by the imaging element; a central processor connected to the memory and that: acquires information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront, wherein the specific area is a partial area in the image; and controls the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element on the basis of the learning result and the information that designates the specific area (¶0098: wavefront corrections associated with the scanning positions [dividing image into plurality of areas] can be stored in a table as correction coefficients so that, during other measurements on the same or similar samples, the wavefront distortions need not be determined again, and an obvious starting point for optimization is present; ¶0054: wavefront correction signal is transmitted in a forward loop to the adaptive optics unit, so as for example to influence the wavefront signal in an adjoining or in the current scanning position [dividing the image into predetermined specific partial areas]; ¶0055: the system learns [learning result] and the wavefront correction signals determined are stored [memory] in a look-up table, preferably for each scan position. 
With each scan, the correction data can then be refined and the look-up table updated; ¶0067: Thus the probable wavefront error at the next field point (n+1;m) or (n,m+1) [specific area] can be calculated in advance from measurements known in advance, and the adaptive mirror [spatial modulation element] controlled with this pre-calculated correction value; ¶0087: The evaluation unit 21 therefore provides pixel data to an image processing unit, and the data of the evaluation unit 21 are compared with an ideal PSF and the resulting correction wavefront delivered to the control of the adaptive optics unit 17 [SME], to be used at the next scan position; ¶0045: The PSF [imaging performance] is thereby improved, which in turn leads to an improved signal-to-noise ratio; ¶0047: the signal of the detector is used for defining the deviations of the measured PSF from an ideal PSF at each scanning position and making them available for correction. To this end, a control device [processor] for the adaptive optics unit is configured in such a way that a correction function is determined by comparison of the PSF of the imaging spot at a scanning position with an ideal PSF).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the optical device of Yamaguchi in view of Singer to satisfy the claimed condition, because such limitations directed to a memory and a processor are known and would be selected to allow for a wavefront correction signal determined from the PSF [imaging performance] and for the adaptive optics unit controlled accordingly, so as to compensate the detected deviations in each scanning position (on the fly), thus providing the beneficial results of improving the imaging performance, which in turn leads to an improved signal-to-noise ratio, as taught in ¶0053 and ¶0045 of Singer.
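For orientation only, the per-scan-position correction look-up table that Singer is cited for (¶0055, ¶0067, ¶0098) can be sketched as follows. The function name, the coefficient representation, and the blending weight `alpha` are hypothetical; this illustrates the cited teaching of storing and refining wavefront corrections per scan position, and is not code from either reference.

```python
# Hypothetical sketch of Singer's cited teaching: wavefront corrections are
# stored per scanning position in a look-up table, refined on each scan, and
# reused as the starting point for later measurements (Singer ¶0055, ¶0098).
correction_table: dict[tuple[int, int], list[float]] = {}

def correct_wavefront(scan_pos: tuple[int, int],
                      measured_coeffs: list[float],
                      alpha: float = 0.5) -> list[float]:
    """Return the correction applied at scan_pos and update the table.

    measured_coeffs stands in for the deviation of the measured PSF from the
    ideal PSF at this position (Singer ¶0047); alpha is a hypothetical
    blending weight for refining the stored correction with the new scan.
    """
    stored = correction_table.get(scan_pos, [0.0] * len(measured_coeffs))
    # Refine the stored correction toward the newly measured deviation.
    refined = [s + alpha * (m - s) for s, m in zip(stored, measured_coeffs)]
    correction_table[scan_pos] = refined  # look-up table updated each scan
    return refined  # would drive the adaptive mirror / spatial modulation element
```

On a repeat pass over the same position, the function starts from the stored coefficients rather than from zero, which corresponds to Singer's point that stored corrections give "an obvious starting point for optimization" on the same or similar samples.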
Regarding Claim 2, Yamaguchi discloses the optical device according to Claim 1, as above. Yamaguchi further discloses: wherein the spatial modulation element is disposed at a position conjugate with the pupil of the imaging optical system (paras [0037], [0045], [0055]: wavefront compensation element 71 is disposed at a position conjugated with the pupil at all times).

Regarding Claim 4, Yamaguchi discloses the optical device according to Claim 1, as above. Yamaguchi further discloses: wherein the central processor accepts an input received through an input device, and acquires the information on the basis of the accepted input (paras [0059], [0066], [0088-89]: processor (PC) 600 connected to an input section for selecting a specific area of an image by, e.g., clicking).

Regarding Claim 5, Yamaguchi discloses the optical device according to Claim 1, as above. Yamaguchi further discloses: wherein the central processor fixes a state in which the spatial modulation element is controlled on the basis of a predetermined condition (para [0069], [0085]: calculation section 14-1 calculates the aberration of the eye 60 under measurement, the optical characteristics such as aberration amount, etc., on the basis of the input signal and properly outputs the signals to the processor 600; paras [0071], [0086]: when calculated aberration is not less than predetermined value, the calculation section 14-1 controls the wavefront compensation element 71 to compensate the aberration).
Regarding Claim 8, as best understood, Yamaguchi discloses: A non-transitory computer storage medium storing a program for controlling a spatial modulation element of an optical device (para [0071]: memory 800 properly stores the measured aberration, etc.; paras [0071], [0084], [0090]: processor 600 executes aberration compensation processing (S105) [program] and properly reads out data from the memory 800 and writes data into the memory 800; paras [0071], [0086]: when calculated aberration is not less than predetermined value, the wavefront compensation element 71 [spatial modulation element] is controlled, thereby compensating the aberration so as to offset the measured aberration), that includes an imaging optical system comprising optical lenses to form an optical image of a subject (¶0046-50: The light receiving optical systems 12 and 31 comprise lenses; ¶0087, 0090: the processor 600 rotates the scanning mirrors to scan the retina, and the light accumulation amount at the photodetector 32 is stored in association with each scan position in the memory 800), an imaging element that generates an image of the optical image formed by the imaging optical system (¶0094, 0097, 0102, 0104, 0107-08, 0112: when light of the set image pixel number (S_X, S_Y) is detected, the processing of picking up the high-magnification retinal image is finished (D); ¶0047, 0056-57, 0063-64: image sensor 41 can obtain an eye-anterior-part image; ¶0070: The image constructing processor 14-2 constructs a retinal image and outputs the signal corresponding to the constructed retinal image or the like or other signals/data to the processor 600), and a spatial modulation element that comprises a plurality of pixels and changes a phase of a wavefront in a pupil of the imaging optical system (paras [0052-53], [0059]: spatial light modulator of wavefront compensation device 71 modulates phase based on characteristics obtained by 14; ¶0059, 0098), the program enabling the computer to execute:
A. acquiring information that designates a specific area in the image to correct the wavefront (FIGS. 1, 4, 6; paras [0088], [0090]: In steps S200-300, processor 600 indicates a pickup position (C_X, C_Y) for a high-magnification retinal image, such as an area on an image may be selected according to the low-magnification retinal image displayed on the display section 700 through the input section, or the processor 600 may automatically indicate the pickup position), wherein the specific area is a partial area in the image (¶0118-19: partial area within the low-magnification image that is defined by a start point Q and an end point R; see FIGS. 8-9 showing specific area being a partial area within the whole image; ¶0084: "The aberration compensation processing [wavefront manipulation] may be executed prior to the processing subsequent to step S200 described later, or executed in parallel to or in the progress of the processing concerned"; see FIG. 4 flowchart showing that generation of the entire image is a separate function from the selection of partial area within said image and the execution of wavefront control S105 may occur at any point in time relative to these two steps); and B. controlling the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element (paras [0041], [0059], [0067], [0085-86]: processor receives signal from calculation section 14-1 which reads an image and on the basis of the read image, then controls wavefront compensation element 71 through the wavefront-compensation-device control apparatus 15; para [0084]: The aberration compensation processing may be executed prior to magnification mode step, or executed in parallel to or in the progress of the processing concerned).
Yamaguchi does not appear to explicitly disclose: a learning result indicating a relationship between the wavefront in the pupil and imaging performance of the imaging optical system for each of a plurality of predetermined areas formed by dividing the image of the optical image generated by the imaging element, wherein the program enabling the computer to execute: acquiring information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront, wherein the specific area is a partial area in the image; and controlling the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element on the basis of the learning result and the information that designates the specific area. Singer is related to Yamaguchi with respect to an analogous optical device with a control unit, an imaging optical system, and a spatial modulation element (¶0032-34: illumination device for providing an illumination spot, a scanner for moving the illumination spot over a sample to be examined at sequential scanning positions, an adaptive optics unit for controlling a wavefront of the illumination spot with a control device, and a detector for capturing a spatially resolved imaging spot emitted by the sample; ¶0077-82; FIGS. 
1-4), and Singer teaches: a learning result indicating a relationship between the wavefront in the pupil and imaging performance of the imaging optical system for each of a plurality of predetermined areas formed by dividing the image of the optical image generated by the imaging element, wherein the program enabling the computer to execute: acquiring information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront, wherein the specific area is a partial area in the image; and controlling the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element on the basis of the learning result and the information that designates the specific area (¶0098: wavefront corrections associated with the scanning positions [dividing image into plurality of areas] can be stored in a table as correction coefficients so that, during other measurements on the same or similar samples, the wavefront distortions need not be determined again, and an obvious starting point for optimization is present; ¶0054: wavefront correction signal is transmitted in a forward loop to the adaptive optics unit, so as for example to influence the wavefront signal in an adjoining or in the current scanning position [dividing the image into predetermined specific partial areas]; ¶0055: the system learns [learning result] and the wavefront correction signals determined are stored [memory] in a look-up table, preferably for each scan position. 
With each scan, the correction data can then be refined and the look-up table updated; ¶0067: Thus the probable wavefront error at the next field point (n+1;m) or (n,m+1) [specific area] can be calculated in advance from measurements known in advance, and the adaptive mirror [spatial modulation element] controlled with this pre-calculated correction value; ¶0087: The evaluation unit 21 therefore provides pixel data to an image processing unit, and the data of the evaluation unit 21 are compared with an ideal PSF and the resulting correction wavefront delivered to the control of the adaptive optics unit 17 [SME], to be used at the next scan position; ¶0045: The PSF [imaging performance] is thereby improved, which in turn leads to an improved signal-to-noise ratio; ¶0047: the signal of the detector is used for defining the deviations of the measured PSF from an ideal PSF at each scanning position and making them available for correction. To this end, a control device [program executed by computer] for the adaptive optics unit is configured in such a way that a correction function is determined by comparison of the PSF of the imaging spot at a scanning position with an ideal PSF).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the optical device of Yamaguchi in view of Singer to satisfy the claimed condition, because such limitations directed to a learning result and program are known and would be selected to allow for a wavefront correction signal determined from the PSF [imaging performance] and for the adaptive optics unit controlled accordingly, so as to compensate the detected deviations in each scanning position (on the fly), thus providing the beneficial results of improving the imaging performance, which in turn leads to an improved signal-to-noise ratio, as taught in ¶0053 and ¶0045 of Singer.
Regarding Claim 9, as best understood, Yamaguchi discloses: A control device comprising: a memory (para [0087]); and a central processor connected to the memory and configured to: acquire information that designates a specific area in an image (FIGS. 1, 4, 6; para [0088]: processor 600 indicates a pickup position (C_X, C_Y) for a high-magnification retinal image, such as an area on an image may be selected according to the low-magnification retinal image displayed on the display section 700 through the input section, or the processor 600 may automatically indicate the pickup position; para [0087]: The processor 600 reads out data in the memory I (800) and displays the read-out data as an image on the display section [processor connected to memory]) to correct the wavefront, the specific area being a partial area in the image (¶0118-19: partial area within the low-magnification image that is defined by a start point Q and an end point R; see FIGS. 8-9 showing specific area being a partial area within the whole image; ¶0084: "The aberration compensation processing [wavefront manipulation] may be executed prior to the processing subsequent to step S200 described later, or executed in parallel to or in the progress of the processing concerned"; see FIG. 4 flowchart showing that generation of the entire image is a separate function from the selection of partial area within said image and the execution of wavefront control S105 may occur at any point in time relative to these two steps) (¶0094, 0097, 0102, 0104, 0107-08, 0112: when light of the set image pixel number (S_X, S_Y) is detected, the processing of picking up the retinal image is finished (D); ¶0047, 0056-57, 0063-64: image sensor 41 can obtain an eye-anterior-part image; ¶0070: The image constructing processor 14-2 constructs a retinal image and outputs the signal corresponding to the constructed retinal image or the like or other signals/data to the processor 600); and control a phase of the wavefront corresponding to the specific area in the image by controlling a spatial modulation element configured to change a phase of a wavefront in the pupil (paras [0041], [0059], [0067], [0085-86]: processor receives signal from calculation section 14-1 which reads an image and on the basis of the read image, then controls wavefront compensation element 71 through the wavefront-compensation-device control apparatus 15; ¶0046-50: The light receiving optical systems 12 and 31 comprise lenses); wherein the memory stores a previously generated learning result indicating a relationship between the image and the phase of the wavefront in the pupil (para [0080], [0085-86]: calculation section 14-1 calculates the aberration amount R on the basis of the wavefront image signal from image sensor 13 according to an equation [relationship] using Zernike coefficients; para [0071]: memory 800 properly stores the measured aberration, etc.), and wherein the central processor controls the spatial modulation element on the basis of the learning result and the image (para [0084]: processor 600 executes aberration compensation processing (S105); para [0060], [0086]: based on the predetermined threshold [learning result], the calculation section 14-1 controls the wavefront
compensation element 71 for compensation of wavefront). Yamaguchi does not appear to explicitly disclose: a memory which stores a learning result indicating a relationship between the wavefront in the pupil and imaging performance of the imaging optical system for each of a plurality of predetermined areas formed by dividing the image of the optical image generated by the imaging element; a central processor connected to the memory and that: acquires information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront, wherein the specific area is a partial area in the image; and controls the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element to change a phase of the wavefront in the pupil on the basis of the learning result and the information that designates the specific area. Singer is related to Yamaguchi with respect to an analogous optical device with a control unit, an imaging optical system, and a spatial modulation element (¶0032-34: illumination device for providing an illumination spot, a scanner for moving the illumination spot over a sample to be examined at sequential scanning positions, an adaptive optics unit for controlling a wavefront of the illumination spot with a control device, and a detector for capturing a spatially resolved imaging spot emitted by the sample; ¶0077-82; FIGS. 
1-4), and Singer teaches: a memory which stores a learning result indicating a relationship between the wavefront in the pupil and imaging performance of the imaging optical system for each of a plurality of predetermined areas formed by dividing the image of the optical image generated by the imaging element; a central processor connected to the memory and that: acquires information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront, wherein the specific area is a partial area in the image; and controls the phase of the wavefront corresponding to the specific area in the image by controlling the spatial modulation element to change a phase of the wavefront in the pupil on the basis of the learning result and the information that designates the specific area (¶0098: wavefront corrections associated with the scanning positions [dividing image into plurality of areas] can be stored in a table as correction coefficients so that, during other measurements on the same or similar samples, the wavefront distortions need not be determined again, and an obvious starting point for optimization is present; ¶0054: wavefront correction signal is transmitted in a forward loop to the adaptive optics unit, so as for example to influence the wavefront signal in an adjoining or in the current scanning position [dividing the image into predetermined specific partial areas]; ¶0055: the system learns [learning result] and the wavefront correction signals determined are stored [memory] in a look-up table, preferably for each scan position. 
With each scan, the correction data can then be refined and the look-up table updated; ¶0067: Thus the probable wavefront error at the next field point (n+1;m) or (n,m+1) [specific area] can be calculated in advance from measurements known in advance, and the adaptive mirror [spatial modulation element] controlled with this pre-calculated correction value; ¶0087: The evaluation unit 21 therefore provides pixel data to an image processing unit, and the data of the evaluation unit 21 are compared with an ideal PSF and the resulting correction wavefront delivered to the control of the adaptive optics unit 17 [SME], to be used at the next scan position; ¶0045: The PSF [imaging performance] is thereby improved, which in turn leads to an improved signal-to-noise ratio; ¶0047: the signal of the detector is used for defining the deviations of the measured PSF from an ideal PSF at each scanning position and making them available for correction. To this end, a control device [processor] for the adaptive optics unit is configured in such a way that a correction function is determined by comparison of the PSF of the imaging spot at a scanning position with an ideal PSF). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the optical device of Yamaguchi in view of Singer to satisfy the claimed condition, because such limitations directed to a memory and a processor are known and would be selected to allow for a wavefront correction signal determined from the PSF [imaging performance] and for the adaptive optics unit to be controlled accordingly, so as to compensate the detected deviations in each scanning position (on the fly), thus providing the beneficial results of improving the imaging performance, which in turn leads to an improved signal-to-noise ratio, as taught in ¶0053 and ¶0045 of Singer.
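The look-up-table behavior the rejection quotes from Singer ¶0055 and ¶0098 — store a correction per scan position, refine it on each pass, and use the stored value as the starting point for optimization — can be illustrated with a minimal sketch. The class name, coefficient count, and blending gain below are illustrative assumptions, not taken from either reference:

```python
# Hypothetical sketch of Singer's per-scan-position wavefront look-up table
# (¶0055, ¶0098): corrections are stored keyed by scan position and refined
# on each subsequent scan rather than re-measured from scratch.

class WavefrontLUT:
    def __init__(self, n_coeffs=5):
        self.n_coeffs = n_coeffs
        self.table = {}  # (x, y) scan position -> correction coefficients

    def starting_point(self, pos):
        # A stored correction is the "obvious starting point for optimization";
        # an unvisited position starts from a zero correction.
        return self.table.get(pos, [0.0] * self.n_coeffs)

    def refine(self, pos, measured_error, gain=0.5):
        # Blend the newly measured residual error into the stored correction,
        # so the table entry improves with each scan (assumed update rule).
        current = self.starting_point(pos)
        self.table[pos] = [c + gain * e for c, e in zip(current, measured_error)]

lut = WavefrontLUT()
lut.refine((0, 0), [0.2, -0.1, 0.0, 0.05, 0.0])
lut.refine((0, 0), [0.1, -0.05, 0.0, 0.02, 0.0])  # second pass refines the entry
print(lut.starting_point((0, 0)))
```

On a repeat scan of the same sample, `starting_point` would seed the optimization instead of a fresh wavefront measurement, which is the time saving Singer's ¶0098 describes.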
Regarding Claim 10, Yamaguchi discloses the control device according to Claim 9, as above. Yamaguchi further discloses: wherein the spatial modulation element is disposed at a position conjugate with the pupil of the imaging optical system (paras [0037], [0045], [0055]: wavefront compensation element 71 is disposed at a position conjugated with the pupil at all times). Regarding Claim 12, Yamaguchi discloses the control device according to Claim 9, as above. Yamaguchi further discloses: wherein the central processor accepts an input received through an input device, and acquires the information on the basis of the accepted input (paras [0059], [0066], [0088-89]: processor (PC) 600 connected to an input section for selecting a specific area of an image by, e.g., clicking). Regarding Claim 13, Yamaguchi discloses the control device according to Claim 9, as above. Yamaguchi further discloses: wherein the central processor fixes a state in which the spatial modulation element is controlled on the basis of a predetermined condition (para [0069], [0085]: calculation section 14-1 calculates the aberration of the eye 60 under measurement, the optical characteristics such as aberration amount, etc., on the basis of the input signal and properly outputs the signals to the processor 600; paras [0071], [0086]: when the calculated aberration is not less than a predetermined value, the calculation section 14-1 controls the wavefront compensation element 71 to compensate the aberration). Regarding Claim 16, Yamaguchi discloses the control device according to Claim 9, as above.
Yamaguchi further discloses: An imaging system (paras [0060], [0066-71]: the ophthalmologic photographing apparatus) comprising: the control device; the imaging optical system (¶0046-50: The light receiving optical systems 12 and 31 comprise lenses); an imaging element generating the image (¶0094, 0097, 0102, 0104, 0107-08, 0112: when light of the set image pixel number (S_X, S_Y) is detected, the processing of picking up the high-magnification retinal image is finished (D); ¶0047, 0056-57, 0063-64: image sensor 41 can obtain an eye-anterior-part image; ¶0070: The image constructing processor 14-2 constructs a retinal image and outputs the signal corresponding to the constructed retinal image or the like or other signals/data to the processor 600); and the spatial modulation element (paras [0052-53], [0059]: spatial light modulator of wavefront compensation device 71 modulates phase based on characteristics obtained by 14). Regarding Claim 17, as best understood, Yamaguchi discloses: A non-transitory computer storage medium storing a program, and a learning result indicating a relationship between the image generated by the imaging element and the phase of the wavefront in the pupil (para [0080], [0085-86]: calculation section 14-1 calculates the aberration amount R on the basis of the wavefront image signal from image sensor 13 according to an equation [relationship] using Zernike coefficients; para [0071]: memory 800 properly stores the measured aberration, etc.), the program causing a computer (paras [0071], [0084], [0090]: processor 600 executes aberration compensation processing (S105) [program] and properly reads out data from the memory 800 and writes data into the memory 800; paras [0060], [0066-71]: electrical system [computer] of the ophthalmologic photographing apparatus includes a processor 600 and memory 800, controller 610, a display, an input and drivers) to execute: acquiring information that
designates a specific area in an image generated based on an optical image of a subject, the specific area being a partial area (¶0094, 0097, 0102, 0104, 0107-08, 0112: when light of the set image pixel number (S_X, S_Y) is detected, the processing of picking up the high-magnification retinal image is finished (D); ¶0047, 0056-57, 0063-64: image sensor 41 can obtain an eye-anterior-part image; ¶0070: The image constructing processor 14-2 constructs a retinal image and outputs the signal corresponding to the constructed retinal image or the like or other signals/data to the processor 600; ¶0118-19: partial area within the low-magnification image that is defined by a start point Q and an end point R; see FIGS. 8-9 showing specific area being a partial area within the whole image; ¶0084: “The aberration compensation processing [wavefront manipulation] may be executed prior to the processing subsequent to step S200 described later, or executed in parallel to or in the progress of the processing concerned”; see FIG.
4 flowchart showing that generation of the entire image is a separate function from the selection of partial area within said image and the execution of wavefront control S105 may occur at any point in time relative to these two steps); and controlling a phase of a wavefront corresponding to the specific area in the image by controlling a spatial modulation element changing a phase of a wavefront in a pupil (paras [0041], [0059], [0067], [0085-86]: processor receives signal from calculation section 14-1 which reads an image and, on the basis of the read image, then controls wavefront compensation element 71 through the wavefront-compensation-device control apparatus 15) on the basis of the learning result and the image generated by the imaging element (para [0084]: processor 600 executes aberration compensation processing (S105); para [0060], [0086]: based on the predetermined threshold [learning result], the calculation section 14-1 controls the wavefront compensation element 71 for compensation of wavefront). Yamaguchi does not appear to explicitly disclose: a learning result indicating a relationship between a wavefront in a pupil of an imaging optical system that forms an optical image of a subject using optical lenses and imaging performance of the imaging optical system for each of a plurality of predetermined areas formed by dividing an image generated based on the optical image of the subject by an imaging element, wherein the program causing a computer to execute: acquiring information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront, the specific area being a partial area in the image; and controlling a phase of the wavefront corresponding to the specific area in the image by controlling, on the basis of the learning result and the information that designates the specific area, a spatial modulation element changing a phase of the wavefront in the pupil.
Singer is related to Yamaguchi with respect to an analogous optical device with a control unit, an imaging optical system, and a spatial modulation element (¶0032-34: illumination device for providing an illumination spot, a scanner for moving the illumination spot over a sample to be examined at sequential scanning positions, an adaptive optics unit for controlling a wavefront of the illumination spot with a control device, and a detector for capturing a spatially resolved imaging spot emitted by the sample; ¶0077-82; FIGS. 1-4), and Singer teaches: a learning result indicating a relationship between a wavefront in a pupil of an imaging optical system that forms an optical image of a subject using optical lenses and imaging performance of the imaging optical system for each of a plurality of predetermined areas formed by dividing an image generated based on the optical image of the subject by an imaging element, wherein the program causing a computer to execute: acquiring information that designates a specific area among the plurality of predetermined areas in the image to correct the wavefront, the specific area being a partial area in the image; and controlling a phase of the wavefront corresponding to the specific area in the image by controlling, on the basis of the learning result and the information that designates the specific area, a spatial modulation element changing a phase of the wavefront in the pupil (¶0098: wavefront corrections associated with the scanning positions [dividing image into plurality of areas] can be stored in a table as correction coefficients so that, during other measurements on the same or similar samples, the wavefront distortions need not be determined again, and an obvious starting point for optimization is present; ¶0054: wavefront correction signal is transmitted in a forward loop to the adaptive optics unit, so as for example to influence the wavefront signal in an adjoining or in the current scanning position
[dividing the image into predetermined specific partial areas]; ¶0055: the system learns [learning result] and the wavefront correction signals determined are stored [memory] in a look-up table, preferably for each scan position. With each scan, the correction data can then be refined and the look-up table updated; ¶0067: Thus the probable wavefront error at the next field point (n+1;m) or (n,m+1) [specific area] can be calculated in advance from measurements known in advance, and the adaptive mirror [spatial modulation element] controlled with this pre-calculated correction value; ¶0087: The evaluation unit 21 therefore provides pixel data to an image processing unit, and the data of the evaluation unit 21 are compared with an ideal PSF and the resulting correction wavefront delivered to the control of the adaptive optics unit 17 [SME], to be used at the next scan position; ¶0045: The PSF [imaging performance] is thereby improved, which in turn leads to an improved signal-to-noise ratio; ¶0047: the signal of the detector is used for defining the deviations of the measured PSF from an ideal PSF at each scanning position and making them available for correction. To this end, a control device [processor] for the adaptive optics unit is configured in such a way that a correction function is determined by comparison of the PSF of the imaging spot at a scanning position with an ideal PSF).
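The predictive step quoted from Singer ¶0067 — pre-calculating the probable wavefront error at the next field point from measurements already known, and driving the adaptive mirror with that value — can be sketched minimally. The linear extrapolation and all names below are our assumptions for illustration; Singer does not specify the prediction model:

```python
# Hypothetical sketch of Singer ¶0067: predict the wavefront correction at
# field point (n+1, m) from measurements already known along the scan row,
# and drive the adaptive mirror with the pre-calculated value.

def predict_next(corrections, n, m):
    """Linearly extrapolate the correction at field point (n+1, m) from the
    two previous points along the row (assumed model)."""
    prev2 = corrections.get((n - 1, m))
    prev1 = corrections.get((n, m))
    if prev1 is None:
        return 0.0                      # nothing known yet: no correction
    if prev2 is None:
        return prev1                    # only one point known: reuse it
    return prev1 + (prev1 - prev2)      # continue the linear trend

# Corrections measured at field points (0,0) and (1,0), in arbitrary units.
corrections = {(0, 0): 0.10, (1, 0): 0.14}
print(predict_next(corrections, 1, 0))  # predicted correction at (2, 0)
```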
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the optical device of Yamaguchi in view of Singer to satisfy the claimed condition, because such limitations directed to a learning result and a program are known and would be selected to allow for a wavefront correction signal determined from the PSF [imaging performance] and for the adaptive optics unit to be controlled accordingly, so as to compensate the detected deviations in each scanning position (on the fly), thus providing the beneficial results of improving the imaging performance, which in turn leads to an improved signal-to-noise ratio, as taught in ¶0053 and ¶0045 of Singer. Regarding Claim 18, Yamaguchi discloses the optical device according to Claim 1, as above. Yamaguchi further discloses: wherein the central processor controls, for each specific area in the image generated by the imaging element (paras [0088-90]: When the high-magnification mode is indicated (S113), the processor 600 picks up a high-magnification retinal image of the eye 60 under measurement (S300) by scanning the periphery of the pickup position (C_X, C_Y), and storing the light accumulation amount in association with each scan position into the memory 800), the spatial modulation element on the basis of a previously generated learning result indicating a relationship between the image generated by the imaging element and a phase of a wavefront in the pupil (para [0080], [0085-86]: calculation section 14-1 calculates the aberration amount R on the basis of the wavefront image signal from image sensor 13 according to an equation [relationship] using Zernike coefficients; para [0084]: processor 600 executes aberration compensation processing (S105); para [0060], [0086]: based on the predetermined threshold [learning result], the calculation section 14-1 controls the wavefront compensation element 71 for compensation of wavefront; para [0084]:
The aberration compensation processing may be executed prior to the magnification mode step, or executed in parallel to or in the progress of the processing concerned). Claims 7 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Yamaguchi et al. (US 2007/0291230 A1) in view of Singer et al. (US 2015/0077844 A1), and further in view of Abovitz et al. (US 2015/0235447 A1). Regarding Claims 7 and 15, Yamaguchi-Singer discloses the optical device according to Claims 1 and 9, respectively, as above. Yamaguchi-Singer does not appear to explicitly disclose: wherein the learning result is a result acquired through deep learning. Abovitz is related to Yamaguchi with respect to an analogous optical device with a control unit, an imaging optical system, and a spatial modulation element (para [0013]: spatial light modulator; para [0112]: control subsystem; para [0134]: optical system 100 including camera), and Abovitz teaches: wherein the learning result is a result acquired through deep learning (para [0466]: deep learning module is used for image analysis). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the optical device of Yamaguchi in view of Abovitz to satisfy the claimed condition, because such a learning method is known and selected for image analysis with the beneficial results of immediate and fast processing, as taught in paragraphs [0459], [0466] of Abovitz. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to SAMANVITHA SRIDHAR whose telephone number is (571)270-0082. The examiner can normally be reached M-F 9:30-18:00 (EST). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, BUMSUK WON can be reached at 571-272-2713. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /SAMANVITHA SRIDHAR/Examiner, Art Unit 2872 /BUMSUK WON/Supervisory Patent Examiner, Art Unit 2872

Prosecution Timeline

Jun 05, 2021
Application Filed
Jan 25, 2024
Non-Final Rejection — §103, §112
May 16, 2024
Response Filed
Jul 24, 2024
Final Rejection — §103, §112
Dec 26, 2024
Request for Continued Examination
Jan 06, 2025
Response after Non-Final Action
Mar 15, 2025
Non-Final Rejection — §103, §112
Aug 22, 2025
Response Filed
Sep 26, 2025
Final Rejection — §103, §112
Dec 30, 2025
Request for Continued Examination
Jan 12, 2026
Response after Non-Final Action
Feb 17, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596245
ULTRA-COMPACT LENS SYSTEM FOR FLUORESCENCE IMAGING
2y 5m to grant Granted Apr 07, 2026
Patent 12588807
VISION TEST APPARATUS, METHOD AND SYSTEM AND NON-TRANSIENT COMPUTER READABLE RECORDING MEDIUM
2y 5m to grant Granted Mar 31, 2026
Patent 12517352
VEHICLE DISPLAY DEVICE
2y 5m to grant Granted Jan 06, 2026
Patent 12493139
DISPLAY ASSEMBLY, DISPLAY APPARATUS AND VR/AR DISPLAY DEVICE
2y 5m to grant Granted Dec 09, 2025
Patent 12487497
ELECTROPHORETIC DISPLAY DEVICE AND METHOD FOR MANUFACTURING THE SAME
2y 5m to grant Granted Dec 02, 2025
Based on this examiner's 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
65%
Grant Probability
91%
With Interview (+26.3%)
3y 8m
Median Time to Grant
High
PTA Risk
Based on 77 resolved cases by this examiner. Grant probability derived from career allow rate.
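The headline projection figures above follow directly from the examiner's career data: the base grant probability is the career allow rate, and the with-interview figure adds the observed interview lift. A minimal sketch (variable names are ours; the lift is applied as a simple additive offset in percentage points, which reproduces the displayed numbers):

```python
# Sketch of the dashboard arithmetic: base grant probability is the
# examiner's career allow rate; the with-interview figure adds the
# observed interview lift on top of it.

granted, resolved = 50, 77               # 50 granted of 77 resolved cases
allow_rate = granted / resolved          # ~0.649, displayed as 65%
interview_lift = 0.263                   # +26.3 percentage points
with_interview = allow_rate + interview_lift

print(round(allow_rate * 100))           # 65
print(round(with_interview * 100))       # 91
```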
