Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Response to Amendment
The amendment and the Request for Continued Examination filed on 11/10/2025 have been entered. Claims 1, 4-13, 16, and 19-20 are now pending in the application. Claims 1, 4, 7-8, and 19-20 have been amended, and claims 2, 14-15, 17, and 18 have been canceled by the Applicant. The previous objection to claims 1-19 has been withdrawn in light of Applicant’s amendments to claims 1 and 19.
Examiner Notes
The examiner cites particular columns and line numbers in the references as applied to the claims below for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the examiner.
Priority
As required by MPEP §§ 210 and 214.03, acknowledgement is made of applicant’s claim for priority based on application JP 2022-074488, filed 04/28/2022 (Japan).
Receipt is acknowledged of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.
However, to overcome a prior art rejection, applicant(s) must submit a translation of the foreign priority papers in order to perfect the claimed foreign priority, because said papers have not been made of record in accordance with 37 CFR 1.55. See MPEP § 213.04.
Drawings
The drawings submitted by the applicant are acceptable for examination purposes.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 4-13, 16, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Nakata et al., US 20120215923 (hereafter Nakata; of record, see IDS of 04/18/2023), in view of Jannard et al., WO 2012154585 A1 (of record, see IDS dated 10/11/2023), and further in view of Yamane et al., WO 2021200002 A1 (hereafter Yamane; of record, see IDS dated 12/18/2025), where Yamane EP 4130842 A1 (of record, see IDS dated 12/18/2025) is referenced as the closest English-language equivalent.
In regard to independent claim 1, Nakata teaches (see Figs. 1-20) a microscope system (microscope system 1 (or 2, 3) with microscope 100, also 400, 600, 700, abstract, e.g. paragraphs [03,07, 28-29, 31-48, 50-61,65-75,79-88,91-98,106-108,120-124]) comprising:
an observation optical system that forms an optical image of a specimen on an object side of an ocular lens (including optics of objective 102, lens tube 103, eyepiece 104, forming image of sample on object side of 104, at IP1, also IP1a, paragraphs [31-48, 50-61,65-75,79-88,91-98,106-108], Figs. 1-2 and equivalents 16-20);
a superimposition device that superimposes information on an image plane on which the optical image is formed (projection apparatus 131, creating superimposition image e.g. V2-V8 obtained by superimposing a projection image e.g. P1(2,3..) onto an optical image O1, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]);
an imaging device that is provided on an imaging optical path branched from an optical path of the observation optical system (imaging apparatus 140 on imaging path branched from 104 path, imaging the sample at IP1a, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98], Figs. 1-3, 16-18); and
a hardware processor (i.e. as the control apparatus 10 including projection control 24, controlling 131, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]) configured to execute processes comprising:
selectively controlling the superimposition device to superimpose (i.e. as the control apparatus 10 including projection control 24, controlling 131, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]), on the image plane (as control apparatus 10 controls projection image e.g. P1,P2 superimposed on O1 in IP1(a), see paragraphs [31-48, 50-61,65-75,79-88,91-98], as depicted in e.g. Figs. 4-14), (i) focus information regarding a focus state of the imaging device, the focus information being generated based on a captured image of the specimen acquired (i.e. as projection image includes projection image data with microscope information MI including e.g. focus information as stage 101 position at which the stage 101 is to be located during focus-achieved state that is based on captured image light from sample from segment detector, e.g. paragraphs [59, 106-108], Figs. 1,4,8,13,18), and
(ii) analysis information regarding a result of image analysis performed on the captured image (as projection image includes projection image data from 23,22 sections, with object/image recognition, classification, product identification/info, cell/nucleus type, mapping results, etc., paragraphs [52-55,61, 79-89], Figs. 4-14), the image analysis being different from a focus analysis performed by the hardware processor (i.e. as focus information, stage information of focus-achieved state performed and given by 10, is different from image analysis result(s) e.g. object/image shape, structure, recognition, classification, product info, cell/nucleus type, mapping, etc., or luminance threshold obtained by 10, paragraphs [52-55, 59, 61, 79-89]),
the focus information is text information or an indicator that visualizes the focus level (i.e. as projection image with microscope information MI including e.g. focus information as stage 101 position at which 101 is during focus-achieved state that is based on captured image light from sample from segment detector, e.g. paragraphs [59, 106-108], Figs. 1,4,8,13,18), and
the controlling comprises controlling the superimposition device to superimpose an image including the analysis information and the focus information on the image plane when the focus state of the imaging device is equal to or greater than a predetermined threshold level (as 10 including projection control 24, controlling 131, controls projection image P1(etc.) superimposed on O1 in IP1(a) with projection image data from 23,22 including focus information as stage information of the focus-achieved state, i.e. in which the focusing level or threshold is achieved, and information from image analysis result(s) e.g. object/image shape, structure, recognition, classification, product info, cell/nucleus type, mapping, see paragraphs [31-48, 50-61,65-75,79-88,91-98,107], e.g. Figs. 4-14), and controlling the superimposition device to superimpose an image that does not include the analysis information on the image plane (i.e. given 10, 24, controlling 131, controls projection image P1(etc.) superimposed on O1 in IP1(a) with projection image data from 23,22 including analysis results and focus, see paragraphs [31-48, 50-61,65-75,79-88,91-98], e.g. Figs. 4-14).
But Nakata does not explicitly disclose that the focus information regarding the focus state of the imaging device 140 is generated based on the captured image of the sample acquired by the 140 imaging device, that the focus analysis is performed by the hardware processor (10) on the captured image, that the focus state is a focus level calculated by the hardware processor performing the focus analysis on the captured image, and that the controlling includes controlling the superimposition device to superimpose an image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (however, Nakata discloses that the projection image/data includes e.g. scale, magnification, zoom of 1, and/or instructions given image data analysis, paragraphs [59, 68-70,83-84], as well as analyzing digital image data for brightness detection, which is related to focus state, and applying it to microscope information MI and image acquisition setting(s), paragraph [98]).
However, Jannard teaches in the same field of invention a focus assist system and method (for cameras and instruments, i.e. microscopes, see Figs. 1-19, abstract, paragraphs [07-30, 53-55,100-111]) and further teaches superimposing focus information that is generated for the focus state of the imaging device based on the captured image of the sample, the focus analysis being performed by the hardware processor on the captured image, and the focus state being a focus level calculated by the hardware processor performing the focus analysis on the captured image (i.e. as the focus information, the degree of focus, is generated and superimposed on the displayed image based on image analysis using an algorithm applied to the acquired image, performed by processor (928), calculating focus level data, and the focus level visual indication (by color, traces, or graph) is displayed based on the degree (state, level) of focus being greater than average, or an upper threshold, or between upper and lower thresholds, e.g. see paragraphs [8, 17, 26-31,53-54, 64-66, 100-111], Figs. 1-6, 17-18), and controlling the superimposition device to superimpose an image that does include the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (i.e. as the degree of focus is superimposed on the displayed image as a focus level visual indication (by color, traces, or graph) displayed based on the degree (state, level) of focus being less than average, or an upper threshold, or between upper and lower thresholds, e.g. see paragraphs [8, 17, 26-31,53-54, 64-66, 100-111], Figs. 1-6, 17-18), therefore providing the display of focus level information and visual feedback to the user, who can confirm that an auto-focus system is satisfactorily focused, be aided in the process of manually focusing, and/or interpret the display to determine the relative focus levels of desired portions of the displayed image (paragraphs [8, 26-31,53-54, 100-111]).
Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the microscope system control apparatus of Nakata to include superimposing focus information/focus level data based on the captured image displayed, according to the teachings of Jannard, in order to provide a superimposed display with focus level information, thus providing visual feedback to the user that can confirm that an auto-focus system is satisfactorily focused or aid in the process of manually focusing, and enabling the user to interpret the display to determine the relative focus levels of desired portions of the displayed image (see Jannard, paragraphs [8, 26-31,53-54, 100-111]).
But the Nakata-Jannard combination does not explicitly disclose controlling the superimposition device to superimpose the image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (as noted above, Nakata displays MI including focus information and analysis information, and Jannard displays a calculated focus level indication depending on the focusing threshold, see above).
However, Yamane teaches in the same field of invention a microscope system, projection unit, and methods (see Figs. 1-3, 24-40, abstract, paragraphs [06-14, 16-28,81-83,88-92, 99-104]) and further teaches controlling the superimposition device to superimpose the image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (i.e. as in the case when the contrast level is low and the focusing is insufficient based on image analysis, resulting in focus information, e.g. auxiliary image A111, being displayed/superimposed on optical image O111 without other image analysis information, Fig. 38, as opposed to the case when a focused image produces analysis information in e.g. Figs. 31, 34, paragraphs [88-92, 99-104], thus providing an image analysis unit that can estimate/recommend operating the focusing unit to adjust the focus, using only focus information in the superimposed auxiliary image, paragraphs [99-104]).
Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt and modify the microscope system control apparatus of Nakata (with Jannard) to include a superimposed image that does not include the analysis information but includes the focus information when the focus state of the imaging device is less than the predetermined threshold level, according to the teachings of Yamane, in order to provide image analysis that can estimate and recommend a focusing operation to adjust the focus, using only focus information in the superimposed auxiliary image (see Yamane, paragraphs [99-104]).
Regarding claim 4, the Nakata-Jannard-Yamane combination teaches the invention as set forth above, and Nakata teaches (see Figs. 1-20) that the analysis information comprises information regarding a result of the image analysis performed for a region of the captured image in which the focus state of the imaging device satisfies a predetermined condition (i.e. as, due to the combination, the user interprets the display to determine the relative focus levels of desired, different portions/regions of the displayed image, see Jannard, paragraphs [8, 26-31,53-54, 100-111], Figs. 17-18, and Nakata, paragraphs [52-54,66,88]), and
the processor controls the superimposition device to superimpose an image including the analysis information on a region of the image plane that corresponds to the region of the captured image (as 10, 24, controlling 131 controls projection image P1(etc.) superimposed on O1 in IP1(a) with analysis results, and focus as visual feedback to the user including relative focus levels of desired, different portions/regions of the displayed image, see paragraphs [31-48, 50-61,65-75,79-88,91-98], e.g. Figs. 4-14, and Jannard, see paragraphs [8, 26-31,53-54, 100-111]).
Regarding claim 5, the Nakata-Jannard-Yamane combination teaches the invention as set forth above, and Nakata teaches (see Figs. 1-20) that the processor generates at least one of the focus information or the analysis information based on the captured image (as the projection image and projection image data are generated from the 23,22 sections, including object/image recognition, classification, product identification/info, cell/nucleus type, mapping results of the acquired image OI, etc., paragraphs [61, 79-89], Figs. 4-14, and as the focus info/degree of focus is generated based on an image analysis algorithm applied to the acquired image, Jannard, e.g. paragraphs [8, 26-31,53-54, 100-111], Figs. 1, 17-18).
Regarding claim 6, the Nakata-Jannard-Yamane combination teaches the invention as set forth above, and Nakata teaches (see Figs. 1-20) that the superimposition device comprises a projection device (131 is a projection apparatus, paragraphs [31-32,39-47,50], Fig. 1), and the microscope system further comprises an optical path combining element that joins light from the superimposition device to the optical path of the observation optical system (i.e. as light deflection element 132, e.g. beam splitter, etc., paragraphs [41-43], Fig. 1).
Regarding claim 7, the Nakata-Jannard-Yamane combination teaches the invention as set forth above, and Nakata teaches (see Figs. 1-20) that the focus information includes an indicator indicating the focus state (i.e. as per combination with Jannard including superimposing focus information/focus level data with visual indicator, see paragraphs [17, 100-111]).
Regarding claim 8, the Nakata-Jannard-Yamane combination teaches the invention as set forth above, and Nakata teaches (see Figs. 1-20) that the microscope information includes text information (i.e. see Fig. 6, as projection image P1(2) superimposed on O1 in IP1(a) includes microscope information MI as text information, see paragraphs [31-48, 50-61,65-75,79-88,91-98]), and as per the combination Jannard teaches indicating the focus state (see paragraphs [8, 17, 26-31,53-54, 100-111], Figs. 1, 17-18). Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the microscope system of Nakata to include the focus state with the presented MI, to provide the display of focus level information to the user, who can confirm that an auto-focus system is satisfactorily focused or be aided in the process of manually focusing (see Jannard, paragraphs [8, 26-31,53-54, 100-111]).
Regarding claims 9 and 16, the Nakata-Jannard-Yamane combination teaches the invention as set forth above, and Nakata teaches (see Figs. 1-20) that the focus information includes at least a part of the captured image (i.e. as per the combination, since Jannard presents the focus information, the degree of focus, superimposed on at least part of the image, a region of the displayed image, e.g. paragraphs [17, 100-111], Figs. 1, 17-18).
Regarding claim 10, the Nakata-Jannard-Yamane combination teaches the invention as set forth above, and Nakata teaches (see Figs. 1-20) wherein: the image analysis different from the focus analysis includes inference processing of detecting an object in an image and a category of the object by using a neural network model trained by machine learning (i.e. as 10 with the 22 image analysis section performs object recognition, identification, and classification, including a process using a trained neural network, paragraphs [50-58,61-64,83-85,123], different from the focus level algorithm of Jannard, see e.g. claim 1 above), the analysis information includes a figure that specifies a position of an object of a predetermined category detected in the inference processing (i.e. as 22 may classify one or more structures in the digital image represented by digital image data into one or more classes and output an analysis result including information specifying the position of a structure classified into at least one class of the one or more classes, e.g. paragraphs [50-54]), and the focus information includes a color of the figure that specifies the focus state for the object of the predetermined category detected by the inference processing (i.e. as per the combination, Jannard also presents color as a graphical indication of the degree of focus for at least a portion of the selected image region, see abstract, paragraphs [10,13,27, 97,100-105]).
Regarding claim 11, the Nakata-Jannard-Yamane combination teaches the invention as set forth above, and Nakata teaches (see Figs. 1-20) that the inference processing includes object detection (i.e. as the analysis for projection image data from the 23,22 sections includes object/image recognition by shape/size, classification, product identification/info, cell/nucleus type, mapping results, etc., paragraphs [50-54,61, 79-89], Figs. 4-14).
Regarding claim 12, the Nakata-Jannard-Yamane combination teaches the invention as set forth above, and Nakata teaches (see Figs. 1-20) that the inference processing includes segmentation (i.e. as the analysis for projection image data from the 23,22 sections includes object/image by shape, size, counting cells, product identification/info, cell/nucleus type, mapping results, etc., paragraphs [52-54,61, 79-89], Figs. 4-14).
Regarding claim 13, the Nakata-Jannard-Yamane combination teaches the invention as set forth above, and Nakata teaches (see Figs. 1-20) that the focus information includes focus peaking information for marking a region having a higher focus level than other regions (i.e. as per the combination, Jannard presents a visual indication of the degree of focus for an image region with a higher degree of focus, see paragraphs [17,23-24,100-105]).
In regard to independent claim 19, Nakata teaches (see Figs. 1-20) a projection unit for a microscope system (as microscope system 1 (or 2, 3) with microscope 100, also 400, 600, 700, abstract, e.g. paragraphs [03,07, 28-29, 31-48, 50-61,65-75,79-88,91-98,106-108,120-124] with projection apparatus 131, creating superimposition image e.g. V2-V8 obtained by superimposing a projection image e.g. P1(2,3..) onto an optical image O1, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]), the projection unit comprising:
a superimposition device that superimposes information on an image plane (IP1,IP1a) on which an optical image of a specimen is formed (projection apparatus 131, creating superimposition image e.g. V2-V8 obtained by superimposing a projection image e.g. P1(2,3..) onto an optical image O1, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]) by an observation optical system included in the microscope system (as 1 has optics of objective 102, lens tube 103, eyepiece 104, forming image of sample on object side of 104, at IP1, also IP1a, paragraphs [31-48, 50-61,65-75,79-88,91-98,106-108], Figs. 1-2 and equivalents 16-20), the image plane being positioned on an object side of an ocular lens included in the microscope system (as formed image of sample is on object side of 104, at IP1, also IP1a, paragraphs [31-48, 50-61,65-75,79-88,91-98,106-108], Figs. 1-2 and equivalents 16-20);
an imaging device that is provided on an imaging optical path branched from an optical path of the observation optical system (imaging apparatus 140 on imaging path branched from path with 104, imaging the sample at IP1a, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98], Figs. 1-3, 16-18); and
a hardware processor (i.e. as the control apparatus 10 including projection control 24, controlling 131, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]) configured to execute processes comprising:
selectively controlling the superimposition device to superimpose (i.e. as the control apparatus 10 including projection control 24, controlling 131, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]), on the image plane (as control apparatus 10 controls projection image e.g. P1,P2 superimposed on O1 in IP1(a), see paragraphs [31-48, 50-61,65-75,79-88,91-98], as depicted in e.g. Figs. 4-14), (i) focus information regarding a focus state of the imaging device, the focus information being generated based on a captured image of the specimen acquired (i.e. as projection image includes projection image data with microscope information MI including e.g. focus information as stage 101 position at which the stage 101 is to be located during focus-achieved state that is based on captured image light from sample from segment detector, e.g. paragraphs [59, 106-108], Figs. 1,4,8,13,18), and
(ii) analysis information regarding a result of image analysis performed on the captured image (as projection image includes projection image data from 23,22 sections, with object/image recognition, classification, product identification/info, cell/nucleus type, mapping results, etc., paragraphs [52-55,61, 79-89], Figs. 4-14), the image analysis being different from a focus analysis performed by the hardware processor (i.e. as focus information, stage information of focus-achieved state performed and given by 10, is different from image analysis result(s) e.g. object/image shape, structure, recognition, classification, product info, cell/nucleus type, mapping, etc., or luminance threshold obtained by 10, paragraphs [52-55, 59, 61, 79-89]),
the focus information is text information or an indicator that visualizes the focus level (i.e. as projection image with microscope information MI including e.g. focus information as stage 101 position at which 101 is during focus-achieved state that is based on captured image light from sample from segment detector, e.g. paragraphs [59, 106-108], Figs. 1,4,8,13,18), and
the controlling comprises controlling the superimposition device to superimpose an image including the analysis information and the focus information on the image plane when the focus state of the imaging device is equal to or greater than a predetermined threshold level (as 10 including projection control 24, controlling 131, controls projection image P1(etc.) superimposed on O1 in IP1(a) with projection image data from 23,22 including focus information as stage information of the focus-achieved state, i.e. in which the focusing level or threshold is achieved, and information from image analysis result(s) e.g. object/image shape, structure, recognition, classification, product info, cell/nucleus type, mapping, see paragraphs [31-48, 50-61,65-75,79-88,91-98,107], e.g. Figs. 4-14), and controlling the superimposition device to superimpose an image that does not include the analysis information on the image plane (i.e. given 10, 24, controlling 131, controls projection image P1(etc.) superimposed on O1 in IP1(a) with projection image data from 23,22 including analysis results and focus, see paragraphs [31-48, 50-61,65-75,79-88,91-98], e.g. Figs. 4-14).
But Nakata does not explicitly disclose that the focus information regarding the focus state of the imaging device 140 is generated based on the captured image of the sample acquired by the 140 imaging device, that the focus analysis is performed by the hardware processor (10) on the captured image, that the focus state is a focus level calculated by the hardware processor performing the focus analysis on the captured image, and that the controlling includes controlling the superimposition device to superimpose an image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (however, Nakata discloses that the projection image/data includes e.g. scale, magnification, zoom of 1, and/or instructions given image data analysis, paragraphs [59, 68-70,83-84], as well as analyzing digital image data for brightness detection, which is related to focus state, and applying it to microscope information MI and image acquisition setting(s), paragraph [98]).
However, Jannard teaches in the same field of invention a focus assist system and method (for cameras and instruments, i.e. microscopes, see Figs. 1-19, abstract, paragraphs [07-30, 53-55,100-111]) and further teaches superimposing focus information that is generated for the focus state of the imaging device based on the captured image of the sample, the focus analysis being performed by the hardware processor on the captured image, and the focus state being a focus level calculated by the hardware processor performing the focus analysis on the captured image (i.e. as the focus information, the degree of focus, is generated and superimposed on the displayed image based on image analysis using an algorithm applied to the acquired image, performed by processor (928), calculating focus level data, and the focus level visual indication (by color, traces, or graph) is displayed based on the degree (state, level) of focus being greater than average, or an upper threshold, or between upper and lower thresholds, e.g. see paragraphs [8, 17, 26-31,53-54, 64-66, 100-111], Figs. 1-6, 17-18), and controlling the superimposition device to superimpose an image that does include the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (i.e. as the degree of focus is superimposed on the displayed image as a focus level visual indication (by color, traces, or graph) displayed based on the degree (state, level) of focus being less than average, or an upper threshold, or between upper and lower thresholds, e.g. see paragraphs [8, 17, 26-31,53-54, 64-66, 100-111], Figs. 1-6, 17-18), therefore providing the display of focus level information and visual feedback to the user, who can confirm that an auto-focus system is satisfactorily focused, be aided in the process of manually focusing, and/or interpret the display to determine the relative focus levels of desired portions of the displayed image (paragraphs [8, 26-31,53-54, 100-111]).
Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the microscope system control apparatus of Nakata to include superimposing focus information/focus level data based on the captured image displayed, according to the teachings of Jannard, in order to provide a superimposed display with focus level information, thus providing visual feedback to the user that can confirm that an auto-focus system is satisfactorily focused or aid in the process of manually focusing, and enabling the user to interpret the display to determine the relative focus levels of desired portions of the displayed image (see Jannard, paragraphs [8, 26-31,53-54, 100-111]).
But the Nakata-Jannard combination does not explicitly disclose controlling the superimposition device to superimpose the image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (as noted above, Nakata displays MI including focus information and analysis information, and Jannard displays a calculated focus level indication depending on the focusing threshold, see above).
However, Yamane teaches in the same field of invention a microscope system, projection unit, and methods (see Figs. 1-3, 24-40, abstract, paragraphs [06-14, 16-28,81-83,88-92, 99-104]) and further teaches controlling the superimposition device to superimpose the image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (i.e. as in the case when the contrast level is low and the focusing is insufficient based on image analysis, resulting in focus information, e.g. auxiliary image A111, being displayed/superimposed on optical image O111 without other image analysis information, Fig. 38, as opposed to the case when a focused image produces analysis information in e.g. Figs. 31, 34, paragraphs [88-92, 99-104], thus providing an image analysis unit that can estimate/recommend operating the focusing unit to adjust the focus, using only focus information in the superimposed auxiliary image, paragraphs [99-104]).
Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt and modify the microscope system control apparatus of Nakata (with Jannard) to include a superimposed image that does not include the analysis information but includes the focus information when the focus state of the imaging device is less than the predetermined threshold level, according to the teachings of Yamane, in order to provide image analysis that can estimate and recommend a focusing operation to adjust the focus, using only focus information in the superimposed auxiliary image (see Yamane, paragraphs [99-104]).
In regard to independent claim 20, Nakata teaches (see Figs. 1-20) an image projection method executed by a microscope system (an image projection method implemented by the microscope system 1 (or 2, 3) with microscope 100, also 400, 600, 700, abstract, e.g. paragraphs [03,07, 28-29, 31-48, 50-61,65-75,79-88,91-98,106-108,120-124], having projection apparatus 131, creating superimposition image e.g. V2-V8 obtained by superimposing a projection image e.g. P1(2,3..) onto an optical image O1, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]) including an observation optical system (as 1 has optics of objective 102, lens tube 103, eyepiece 104, forming image of sample on object side of 104, at IP1, also IP1a, paragraphs [31-48, 50-61,65-75,79-88,91-98,106-108], Figs. 1-2 and equivalents 16-20), a superimposition device (projection apparatus 131, creating superimposition image e.g. V2-V8 obtained by superimposing a projection image e.g. P1(2,3..) onto an optical image O1, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]), and a hardware processor (i.e. as the control apparatus 10 including projection control 24, controlling 131, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]) , the image projection method (projection method of 1) comprising:
forming an optical image of a specimen on an object side of an ocular lens included in the observation optical system (forming optical image O1 of the sample , e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98], by 1 optics i.e. 102, lens tube 103, eyepiece 104, where image of sample is on object side of 104, at IP1, also IP1a, paragraphs [31-48, 50-61,65-75,79-88,91-98,106-108], Figs. 1-2 and equivalents 16-20); and
by the hardware processor (1), selectively controlling the superimposition device to superimpose, on an image plane (IP1, IP1a) on which the optical image is formed (with the projection apparatus 131, creating superimposition image e.g. V2-V8 obtained by superimposing a projection image e.g. P1(2,3..) onto an optical image O1 on image plane, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]), (i) focus information regarding a focus state of an imaging device provided on an imaging optical path branched from an optical path of the observation optical system (focus of the imaging apparatus 140 on optical path branched from optical path of 104, e.g. Figs. 1-2, 18, paragraphs [31-47]), the focus information being generated based on a captured image of the specimen acquired (i.e. as projection image includes projection image data with microscope information MI including e.g. focus information as stage 101 position at which the stage 101 is to be located during focus-achieved state that is based on captured image light from sample from segment detector, e.g. paragraphs [59, 106-108], Figs. 1,4,8,13,18), and
(ii) analysis information regarding a result of image analysis performed on the captured image (as projection image includes projection image data from 23,22 sections, with object/image recognition, classification, product identification/info, cell/nucleus type, mapping results, etc., paragraphs [52-55,61, 79-89], Figs. 4-14), the image analysis being different from a focus analysis performed by the hardware processor (i.e. as focus information, stage information of focus-achieved state performed and given by 10, is different from image analysis result(s) e.g. object/image shape, structure, recognition, classification, product info, cell/nucleus type, mapping, etc., or luminance threshold obtained by 10, paragraphs [52-55, 59, 61, 79-89]),
the focus information is text information or an indicator that visualizes the focus level (i.e. as projection image with microscope information MI including e.g. focus information as stage 101 position at which 101 is during focus-achieved state that is based on captured image light from sample from segment detector, e.g. paragraphs [59, 106-108], Figs. 1,4,8,13,18), and
the controlling comprises controlling the superimposition device to superimpose an image including the analysis information and the focus information on the image plane when the focus state of the imaging device is equal to or greater than a predetermined threshold level (as 10 including projection control 24, controlling 131, controls projection image P1(etc.) superimposed on O1 in IP1(a) with projection image data from 23,22 including focus information as stage information of the focus-achieved state, i.e. in which the focusing level or threshold is achieved, and information from image analysis result(s), e.g. object/image shape, structure, recognition, classification, product info, cell/nucleus type, mapping, see paragraphs [31-48, 50-61, 65-75, 79-88, 91-98, 107], e.g. Figs. 4-14), and controlling the superimposition device to superimpose an image that does not include the analysis information on the image plane (i.e. given 10, 24, controlling 131 controls projection image P1(etc.) superimposed on O1 in IP1(a) with projection image data from 23,22 including analysis results and focus, see paragraphs [31-48, 50-61, 65-75, 79-88, 91-98], e.g. Figs. 4-14).
But Nakata does not explicitly disclose that the focus information regarding the focus state of the imaging device 140 is generated based on the captured image of the sample acquired by the imaging device 140, that a focus analysis is performed by the hardware processor (10) on the captured image, that the focus state is a focus level calculated by the hardware processor performing the focus analysis on the captured image, and that the controlling includes controlling the superimposition device to superimpose an image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (however, Nakata discloses that the projection image/data includes, e.g., scale, magnification, and zoom of 1, and/or instructions given image data analysis, paragraphs [59, 68-70, 83-84], as well as analyzing digital image data for brightness detection, which is related to focus state, and applying it to microscope information MI and image acquisition setting(s), paragraph [98]).
However, Jannard teaches in the same field of invention, a focus assist system and method (for cameras and instruments, i.e. microscopes, see Figs. 1-19, abstract, paragraphs [07-30, 53-55, 100-111]), and further teaches superimposing focus information regarding the focus state of the imaging device, the focus information being generated based on the captured image of the sample by a focus analysis performed by the hardware processor on the captured image, and that the focus state is a focus level calculated by the hardware processor performing the focus analysis on the captured image (i.e., the focus information, a degree of focus, is generated and superimposed on the displayed image based on image analysis using an algorithm applied to the acquired image, performed by processor (928) calculating focus level data; also, the focus level visual indication (by color, traces, or graph) is displayed based on the degree (state, level) of focus being greater than the average, or the upper threshold, or between the upper and lower thresholds, e.g. see paragraphs [8, 17, 26-31, 53-54, 64-66, 100-111], Figs. 1-6, 17-10), and controlling the superimposition device to superimpose an image that does include the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (i.e., the degree of focus is superimposed on the displayed image as a focus level visual indication (by color, traces, or graph) displayed based on the degree (state, level) of focus being less than the average, or the upper threshold, or between the upper and lower thresholds, e.g. see paragraphs [8, 17, 26-31, 53-54, 64-66, 100-111], Figs. 1-6, 17-10), therefore providing the display of focus level information and visual feedback to the user, who can confirm that an auto-focus system is satisfactorily focused, be aided in the process of manually focusing, and/or interpret the display to determine the relative focus levels of desired portions of the displayed image (paragraphs [8, 26-31, 53-54, 100-111]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the microscope system control apparatus of Nakata to include superimposing displayed focus information/focus level data based on the captured image, according to the teachings of Jannard, in order to provide a superimposed display with focus level information, thus providing visual feedback to the user, who can confirm that an auto-focus system is satisfactorily focused or be aided in the process of manually focusing, and enabling the user to interpret the display to determine the relative focus levels of desired portions of the displayed image (see Jannard, paragraphs [8, 26-31, 53-54, 100-111]).
But the Nakata-Jannard combination does not explicitly disclose controlling the superimposition device to superimpose the image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (as noted above, Nakata displays MI including focus information and analysis information, and Jannard displays a calculated focus level indication depending on the focusing threshold, see above).
However, Yamane teaches in the same field of invention, a microscope system, projection unit, and methods (see Figs. 1-3, 24-40, abstract, paragraphs [06-14, 16-28, 81-83, 88-92, 99-104]), and further teaches controlling the superimposition device to superimpose the image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (i.e., when the contrast level is low, the focusing is insufficient based on image analysis, resulting in focus information, e.g. auxiliary image A111, being displayed/superimposed on optical image O111 without other image analysis information, Fig. 38, as opposed to the case in which a focused image produces analysis information, e.g. Figs. 31, 34, paragraphs [88-92, 99-104], thus providing an image analysis unit that can estimate/recommend operating the focusing unit to adjust the focus, using only focus information in the superimposed auxiliary image, paragraphs [99-104]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt and modify the microscope system control apparatus of Nakata (with Jannard) to include a superimposed image that does not include the analysis information but includes the focus information when the focus state of the imaging device is less than the predetermined threshold level according to the teachings of Yamane, in order to provide image analysis that can estimate and recommend a focusing operation to adjust the focus, using only focus information in the superimposed auxiliary image (see Yamane, paragraphs [99-104]).
Response to Arguments
Applicant's arguments filed in the Remarks dated 11/10/2025 with respect to claim(s) 1 as applied to claims 19 and 20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Applicant argues on pages 9-10 of the Remarks that the cited prior art of Nakata and Jannard does not disclose or render obvious the features recited in claim 1 with respect to the recited selective control of the superimposition device, and the defined terminology in which the focus state is a focus level calculated by the hardware processor performing the focus analysis on the captured image and the focus information is text information or an indicator that visualizes the focus level, as Nakata does not explicitly disclose such features and Jannard discloses displayed focus assist information (such as a waveform; see paragraph [0008]) cited by the Examiner. The Examiner respectfully disagrees. As noted in the rejection above, the cited prior art of Nakata teaches most limitations and, in combination with Jannard and Yamane, teaches and renders obvious all limitations of claim 1, as Nakata teaches (see Figs. 1-20) a microscope system (microscope system 1 (or 2, 3) with microscope 100, also 400, 600, 700, abstract, e.g. paragraphs [03, 07, 28-29, 31-48, 50-61, 65-75, 79-88, 91-98, 106-108, 120-124]) comprising:
an observation optical system that forms an optical image of a specimen on an object side of an ocular lens (including optics of objective 102, lens tube 103, eyepiece 104, forming image of sample on object side of 104, at IP1, also IP1a, paragraphs [31-48, 50-61,65-75,79-88,91-98,106-108], Figs. 1-2 and equivalents 16-20);
a superimposition device that superimposes information on an image plane on which the optical image is formed (projection apparatus 131, creating superimposition image e.g. V2-V8 obtained by superimposing a projection image e.g. P1(2,3..) onto an optical image O1, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]);
an imaging device that is provided on an imaging optical path branched from an optical path of the observation optical system (imaging apparatus 140 on imaging path branched from 104 path, imaging the sample at IP1a, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98], Figs. 1-3, 16-18); and
a hardware processor (i.e. as the control apparatus 10 including projection control 24, controlling 131, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98]) configured to execute processes comprising:
selectively controlling the superimposition device to superimpose (i.e. as the control apparatus 10 including projection control 24, controlling 131, e.g. paragraphs [31-48, 50-61,65-75,79-88,91-98), on the image plane (as control apparatus 10 controls projection image e.g. P1,P2 superimposed on O1 in IP1(a), see paragraphs [31-48, 50-61,65-75,79-88,91-98], as depicted in e.g. Figs. 4-14), (i) focus information regarding a focus state of the imaging device, the focus information being generated based on a captured image of the specimen acquired (i.e. as projection image includes projection image data with microscope information MI including e.g. focus information as stage 101 position at which the stage 101 is to be located during focus-achieved state that is based on captured image light from sample from segment detector, e.g. paragraphs [59, 106-108], Figs. 1,4,8,13,18), and
(ii) analysis information regarding a result of image analysis performed on the captured image (as projection image includes projection image data from 23,22 sections, with object/image recognition, classification, product identification/info, cell/nucleus type, mapping results, etc., paragraphs [52-55,61, 79-89], Figs. 4-14), the image analysis being different from a focus analysis performed by the hardware processor (i.e. as focus information, stage information of focus-achieved state performed and given by 10, is different from image analysis result(s) e.g. object/image shape, structure, recognition, classification, product info, cell/nucleus type, mapping, etc., or luminance threshold obtained by 10, paragraphs [52-55, 59, 61, 79-89]),
the focus information is text information or an indicator that visualizes the focus level (i.e. as projection image with microscope information MI including e.g. focus information as stage 101 position at which 101 is during focus-achieved state that is based on captured image light from sample from segment detector, e.g. paragraphs [59, 106-108], Figs. 1,4,8,13,18), and
the controlling comprises controlling the superimposition device to superimpose an image including the analysis information and the focus information on the image plane when the focus state of the imaging device is equal to or greater than a predetermined threshold level (as 10 including projection control 24, controlling 131, controls projection image P1(etc.) superimposed on O1 in IP1(a) with projection image data from 23,22 including focus information as stage information of the focus-achieved state, i.e. in which the focusing level or threshold is achieved, and information from image analysis result(s), e.g. object/image shape, structure, recognition, classification, product info, cell/nucleus type, mapping, see paragraphs [31-48, 50-61, 65-75, 79-88, 91-98, 107], e.g. Figs. 4-14), and controlling the superimposition device to superimpose an image that does not include the analysis information on the image plane (i.e. given 10, 24, controlling 131 controls projection image P1(etc.) superimposed on O1 in IP1(a) with projection image data from 23,22 including analysis results and focus, see paragraphs [31-48, 50-61, 65-75, 79-88, 91-98], e.g. Figs. 4-14).
But Nakata does not explicitly disclose that the focus information regarding the focus state of the imaging device 140 is generated based on the captured image of the sample acquired by the imaging device 140, that a focus analysis is performed by the hardware processor (10) on the captured image, that the focus state is a focus level calculated by the hardware processor performing the focus analysis on the captured image, and that the controlling includes controlling the superimposition device to superimpose an image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (however, Nakata discloses that the projection image/data includes, e.g., scale, magnification, and zoom of 1, and/or instructions given image data analysis, paragraphs [59, 68-70, 83-84], as well as analyzing digital image data for brightness detection, which is related to focus state, and applying it to microscope information MI and image acquisition setting(s), paragraph [98]).
However, Jannard teaches in the same field of invention, a focus assist system and method (for cameras and instruments, i.e. microscopes, see Figs. 1-19, abstract, paragraphs [07-30, 53-55, 100-111]), and further teaches superimposing focus information regarding the focus state of the imaging device, the focus information being generated based on the captured image of the sample by a focus analysis performed by the hardware processor on the captured image, and that the focus state is a focus level calculated by the hardware processor performing the focus analysis on the captured image (i.e., the focus information, a degree of focus, is generated and superimposed on the displayed image based on image analysis using an algorithm applied to the acquired image, performed by processor (928) calculating focus level data; also, the focus level visual indication (by color, traces, or graph) is displayed based on the degree (state, level) of focus being greater than the average, or the upper threshold, or between the upper and lower thresholds, e.g. see paragraphs [8, 17, 26-31, 53-54, 64-66, 100-111], Figs. 1-6, 17-10), and controlling the superimposition device to superimpose an image that does include the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (i.e., the degree of focus is superimposed on the displayed image as a focus level visual indication (by color, traces, or graph) displayed based on the degree (state, level) of focus being less than the average, or the upper threshold, or between the upper and lower thresholds, e.g. see paragraphs [8, 17, 26-31, 53-54, 64-66, 100-111], Figs. 1-6, 17-10), therefore providing the display of focus level information and visual feedback to the user, who can confirm that an auto-focus system is satisfactorily focused, be aided in the process of manually focusing, and/or interpret the display to determine the relative focus levels of desired portions of the displayed image (paragraphs [8, 26-31, 53-54, 100-111]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the microscope system control apparatus of Nakata to include superimposing displayed focus information/focus level data based on the captured image, according to the teachings of Jannard, in order to provide a superimposed display with focus level information, thus providing visual feedback to the user, who can confirm that an auto-focus system is satisfactorily focused or be aided in the process of manually focusing, and enabling the user to interpret the display to determine the relative focus levels of desired portions of the displayed image (see Jannard, paragraphs [8, 26-31, 53-54, 100-111]).
But the Nakata-Jannard combination does not explicitly disclose controlling the superimposition device to superimpose the image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (as noted above, Nakata displays MI including focus information and analysis information, and Jannard displays a calculated focus level indication depending on the focusing threshold, see above).
However, Yamane teaches in the same field of invention, a microscope system, projection unit, and methods (see Figs. 1-3, 24-40, abstract, paragraphs [06-14, 16-28, 81-83, 88-92, 99-104]), and further teaches controlling the superimposition device to superimpose the image that does not include the analysis information but includes the focus information on the image plane when the focus state of the imaging device is less than the predetermined threshold level (i.e., when the contrast level is low, the focusing is insufficient based on image analysis, resulting in focus information, e.g. auxiliary image A111, being displayed/superimposed on optical image O111 without other image analysis information, Fig. 38, as opposed to the case in which a focused image produces analysis information, e.g. Figs. 31, 34, paragraphs [88-92, 99-104], thus providing an image analysis unit that can estimate/recommend operating the focusing unit to adjust the focus, using only focus information in the superimposed auxiliary image, paragraphs [99-104]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt and modify the microscope system control apparatus of Nakata (with Jannard) to include a superimposed image that does not include the analysis information but includes the focus information when the focus state of the imaging device is less than the predetermined threshold level according to the teachings of Yamane, in order to provide image analysis that can estimate and recommend a focusing operation to adjust the focus, using only focus information in the superimposed auxiliary image (see Yamane, paragraphs [99-104]).
Further, regarding claim 1 and its dependent claims 4-13 and 16, and claim 19, it is noted that "[w]hile features of an apparatus may be recited either structurally or functionally, claims directed to an apparatus must be distinguished from the prior art in terms of structure rather than function." See MPEP § 2113; In re Schreiber, 128 F.3d 1473, 1477-78, 44 USPQ2d 1429, 1431-32 (Fed. Cir. 1997); In re Swinehart, 439 F.2d 210, 212-13, 169 USPQ 226, 228-29 (CCPA 1971); In re Danly, 263 F.2d 844, 847, 120 USPQ 528, 531 (CCPA 1959). "[A]pparatus claims cover what a device is, not what a device does." Hewlett-Packard Co. v. Bausch & Lomb Inc., 909 F.2d 1464, 1469, 15 USPQ2d 1525, 1528 (Fed. Cir. 1990) (emphasis in original); MPEP § 2114.
The same analysis and responses also apply to claims 19 and 20.
No additional substantive arguments were presented after page 10 of the Remarks dated 11/10/2025.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MARIN PICHLER whose telephone number is (571)272-4015. The examiner can normally be reached Monday-Friday 8:30am -5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas K Pham can be reached at (571)272-3689. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MARIN PICHLER/Primary Examiner, Art Unit 2872