Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This communication is responsive to the amendment filed on 02/03/2026.
Claims 1-11 and 13-14 are pending.
Response to Arguments
Applicant’s arguments filed on 02/03/2026, on pages 5-7 under REMARKS, with respect to the 35 U.S.C. 103 rejections of claims 1-11 and 13-14, have been fully considered and are persuasive. The rejections of the claims have been withdrawn. However, upon further consideration, a new ground of rejection is made in view of US 2020/0183140 A1 to MANIAN (hereinafter “MANIAN”).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or non-obviousness.
Claims 1-11 and 13-14 are rejected under 35 U.S.C. § 103 as being unpatentable over US 2005/0109959 A1 to Wasserman et al. (hereinafter “WASSERMAN”) in view of US 2020/0183140 A1 to MANIAN (hereinafter “MANIAN”).
As per claim 1, WASSERMAN discloses a method for configuring a sequence controller of an automated microscope (a computing system and corresponding method for controlling an imaging system adapted for machine vision, allowing the cameras to provide magnified/zoomed-in images, therefore acting as a microscope, and further adapted for automatic focusing and other automated functions related to the magnified images captured by the imager acting substantially as a microscope; abstract; figs 1-2; paragraphs [0067-0070]), the method comprising: in a learning operating mode, performing training settings automatically or manually in succession (see step S110 of fig 6, in which the image inspection system is switched to a learn mode and undergoes training steps in order to train the auto focus function via specific instructions; fig 6; paragraphs [0040-0041], [0137]), each training setting corresponding to a respective setting data set of microscope components of the microscope, and assigning each respective position (the training steps of fig 6 include a step S120 of selecting a lens, which is a microscope component; the lens is positioned in step S130 and repeatedly positioned throughout a region to test said region as a region of interest in step S190; fig 6; paragraphs [0137], [0145], [0163-0165]) to one or more examination steps to be performed by the microscope components (via the computing system a point may be assigned as an allowed or selected operating point; the corresponding auto focus scan motion speed that provides the required sampling density or maximum auto focus image sample spacing is easily determined and is used as a corresponding auto focus parameter; paragraphs [0194-0195]), assessing the training settings automatically or manually after the training settings are performed (the user, via the user interface, defines the region of interest (position) for the auto focus via direct user input on a graphical interface for editing, where the region of interest includes the operating point and defined characteristics to trigger the evaluation steps of the training system and method of fig 6 at step S400, and the images are acquired after evaluation is deemed successful as stated in steps S420-S480; fig 6; paragraphs [0146-0149], [0188-0191], [0214-0215], [0219], [0221], [0225]), and, based on the assessment, storing the setting data sets associated with the training settings for subsequent use in the sequence controller (the control system portion is adapted to receive the operating position value, which is used to determine when to perform autofocus in an image range based on a trigger point/characteristic of the image, and this is stored to the system to be used in operative mode after saving the point value in said training mode; fig 6; paragraphs [0141-0143], [0146-0149], [0218-0219], [0224-0225]), discarding the setting data sets, or modifying the setting data sets, wherein the sequence controller specifies a use of the stored setting data sets in the form of a setting of the microscope components according to the setting data sets in an examination operating mode following the learning operating mode (based on the various trigger methods of an operating point/characteristics, the determination of the corresponding governing auto focus variable or parameters is based on the settings to the variables defined and determined in steps S190-S210 of fig 6, and the system is further adapted to move the lens component (microscope component) to the desired location of the region of interest to perform the auto focus according to the trained procedure and setting; fig 6; paragraphs [0141-0143], [0146-0149], [0224-0226]).
WASSERMAN fails to disclose wherein a user brings at least a part of the microscope components into a plurality of positions in succession, wherein the part of the microscope components includes a cross table on which a sample is placed, wherein each respective position includes a respective coordinate of the cross table corresponding to a respective sample region and a respective focus position.
MANIAN discloses wherein a user brings at least a part of the microscope components into a plurality of positions in succession (a user, via a computing system providing a user interface, is able to move components of a microscope into a plurality of coordinate positions using a motor to move said components; abstract; fig 2; paragraphs [0021-0022], [0027-0028], [0038]), wherein the part of the microscope components includes a cross table on which a sample is placed (wherein one of the movable components includes a platform to hold a sample (the sample is placed on it) for being imaged via the microscope; abstract; fig 2; paragraphs [0021-0022], [0027-0028], [0038]; NOTE: the applicant defines “cross table” as something holding the sample (see the claim and specification paragraph [0080]), and this is the definition used in examination; it is noted that the common definition of a microscope cross table differs, but the definition claimed by the applicant is the definition used in examination), wherein each respective position includes a respective coordinate of the cross table corresponding to a respective sample region and a respective focus position (each position the platform is moved to via the motor is a coordinate position; the user may command the system to move the platform and other movable components to said position, which further includes a focus component denoted by the Z coordinate value in the r, theta, Z coordinate positions provided by the system; abstract; fig 2; paragraphs [0021-0022], [0027-0028], [0032-0033], [0036-0039]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify WASSERMAN such that each respective position includes a respective coordinate of the cross table corresponding to a respective sample region and a respective focus position, as taught by MANIAN. The suggestion/motivation for doing so would have been to provide the ability to quickly perform an optically-defined measurement for the location of any fluorescing targets within a predefined area, as suggested at paragraph [0033] of MANIAN. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine MANIAN with WASSERMAN to obtain the invention as specified in claim 1.
As per claim 2, WASSERMAN in view of MANIAN discloses the method as claimed in claim 1. Modified WASSERMAN further discloses wherein the examination operating mode comprises one or more acquisition steps, the one or more acquisition steps are performed in the respective sample regions and using the respective focus positions, wherein image data are obtained and stored by the one or more acquisition steps (the flow chart of fig 6 shows a training mode procedure to learn new settings, in which step S140 is an image acquisition step adapted to capture, via an on-board camera, a working view image acquired using a strobe lighting duration short enough to provide an acceptable auto focus image when used in combination with any practical Z-axis speed during an auto focus image acquisition sequence; fig 6; paragraphs [0075], [0094], [0102], [0139]).
As per claim 3, WASSERMAN in view of MANIAN discloses the method as claimed in claim 2. Modified WASSERMAN further discloses wherein the image data obtained and stored in the one or more acquisition steps are subjected to an image analysis in one or more evaluation steps (see fig 7 as an example of an automatic exposure evaluation tool that is used to help establish desirable settings and performs an evaluation on the acquired images; paragraphs [0148-0149]).
As per claim 4, WASSERMAN in view of MANIAN discloses the method as claimed in claim 3. Modified WASSERMAN further discloses wherein an implementation of the one or more evaluation steps is initiated as a reaction to a user input or automatically, wherein the user input is evaluated in the one or more evaluation steps (the user, over a graphical user interface and via a selection tool, selects the region of interest for the auto focus training sequence to be performed using the identified region input by said user; paragraphs [0075], [0094], [0148-0149]).
As per claim 5, WASSERMAN in view of MANIAN discloses the method as claimed in claim 1. Modified WASSERMAN further discloses wherein the use of the setting data sets in the sequence controller comprises stringing together the stored setting data sets (based on a stored signal value the controller is able to set an image acquisition pixel range for the camera, and initiate an image acquisition camera operation sequence based on the stored steps related to the identified stored data value; paragraphs [0078], [0084], [0139], [0191]).
As per claim 6, WASSERMAN in view of MANIAN discloses the method as claimed in claim 5. Modified WASSERMAN further discloses wherein a sequence in which the stored setting data sets are strung together is predeterminable automatically or by a user (the machine vision inspection system, via a workpiece part program generating circuit 155, automatically generates workpiece program instructions that capture operations and settings determined according to the training sequence; paragraph [0084]).
As per claim 7, WASSERMAN in view of MANIAN discloses the method as claimed in claim 1. Modified WASSERMAN further discloses wherein the microscope components further comprise at least one of a motorized lens barrel, a motorized adjustable incident light axis, a motorized objective revolver, a motorized z drive for a focus setting, at least one illumination device for incident light or transmitted light illumination, a condenser, one or more operating knobs, or a combination thereof (the computer vision machine for taking auto focus magnified images is adapted to include a motorized Z-axis motion control subsystem 145, receives position information from the X, Y and Z axis position encoders, and transmits position-altering control signals via data sent from application programming interfaces 195; paragraphs [0079-0080], [0084]).
As per claim 8, WASSERMAN in view of MANIAN discloses the method as claimed in claim 1. Modified WASSERMAN further discloses wherein the automatic or manual assessment is implemented by visual judgment of a microscope image obtained using the microscope or based on automatic image processing (via the graphical user interface the user may position a workpiece for evaluation at a desired location/portion in the field of view of the camera; paragraphs [0137], [0145], [0148], [0181], [0219]).
As per claim 9, WASSERMAN in view of MANIAN discloses the method as claimed in claim 8. Modified WASSERMAN further discloses wherein the visual judgment or the automatic image processing comprises determining at least one of sharpness, contrast, brightness, color values, or other image properties, in a form of an image comparison to previously or subsequently obtained images (the operation characteristics/parameters include contrast and lighting/illumination conditions, which include brightness values; paragraphs [0145], [0167], [0219], [0232]).
As per claim 10, WASSERMAN discloses a microscope system comprising: a microscope, and a control device configured to operate the microscope in a learning operating mode and an examination operating mode (a computing system and corresponding method for controlling an imaging system adapted for machine vision, allowing the cameras to provide magnified/zoomed-in images, therefore acting as a microscope, and further adapted for automatic focusing and other automated functions related to the magnified images captured by the imager acting substantially as a microscope, controlled via control system 100 comprising a plurality of data input methods, such as a keyboard, joystick, and mouse, and an output device, such as a monitor 111, to display results and provide a user interface; abstract; figs 1-2; paragraphs [0067-0070]), wherein the microscope system is configured to: in the learning operating mode, automatically or manually perform training settings in succession (see step S110 of fig 6, in which the image inspection system is switched to a learn mode and undergoes training steps in order to train the auto focus function via specific instructions; fig 6; paragraphs [0040-0041], [0137]), each training setting corresponding to a respective setting data set of microscope components of the microscope (the training steps of fig 6 include a step S120 of selecting a lens, which is a microscope component; the lens is positioned in step S130 and repeatedly positioned throughout a region to test said region as a region of interest in step S190; fig 6; paragraphs [0137], [0145], [0163-0165]), and assign each respective position to one or more examination steps to be performed by the microscope components (via the computing system a point may be assigned as an allowed or selected operating point; the corresponding auto focus scan motion speed that provides the required sampling density or maximum auto focus image sample spacing is easily determined and is used as a corresponding auto focus parameter; paragraphs [0194-0195]), automatically or manually assess the training settings after the training settings are performed (the user, via the user interface, defines the region of interest (position) for the auto focus via direct user input on a graphical interface for editing, where the region of interest includes the operating point and defined characteristics to trigger the evaluation steps of the training system and method of fig 6 at step S400, and the images are acquired after evaluation is deemed successful as stated in steps S420-S480; fig 6; paragraphs [0146-0149], [0188-0191], [0214-0215], [0219], [0221], [0225]), and, based on the assessment, store the setting data sets associated with the training settings for a subsequent use in a sequence controller, or discard the setting data sets, or modify the setting data sets (the control system portion is adapted to receive the operating position value, which is used to determine when to perform autofocus in an image range based on a trigger point/characteristic of the image, and this is stored to the system to be used in operative mode after saving the point value in said training mode; fig 6; paragraphs [0141-0143], [0146-0149], [0218-0219], [0224-0225]), wherein the sequence controller is configured to use the stored setting data sets in a form of a specification of a setting of the microscope components according to the setting data sets in an examination operating mode following the learning operating mode (based on the various trigger methods of an operating point/characteristics, the determination of the corresponding governing auto focus variable or parameters is based on the settings to the variables defined and determined in steps S190-S210 of fig 6, and the system is further adapted to move the lens component (microscope component) to the desired location of the region of interest to perform the auto focus according to the trained procedure and setting; fig 6; paragraphs [0141-0143], [0146-0149], [0224-0226]).
WASSERMAN fails to disclose wherein a user brings at least a part of the microscope components into a plurality of positions in succession, wherein the part of the microscope components includes a cross table on which a sample is placed, wherein each respective position includes a respective coordinate of the cross table corresponding to a respective sample region and a respective focus position.
MANIAN discloses wherein a user brings at least a part of the microscope components into a plurality of positions in succession (a user, via a computing system providing a user interface, is able to move components of a microscope into a plurality of coordinate positions using a motor to move said components; abstract; fig 2; paragraphs [0021-0022], [0027-0028], [0038]), wherein the part of the microscope components includes a cross table on which a sample is placed (wherein one of the movable components includes a platform to hold a sample (the sample is placed on it) for being imaged via the microscope; abstract; fig 2; paragraphs [0021-0022], [0027-0028], [0038]; NOTE: the applicant defines “cross table” as something holding the sample, and this is the definition used in examination; it is noted that the common definition of a microscope cross table differs, but the definition claimed by the applicant is the definition used in examination), wherein each respective position includes a respective coordinate of the cross table corresponding to a respective sample region and a respective focus position (each position the platform is moved to via the motor is a coordinate position; the user may command the system to move the platform and other movable components to said position, which further includes a focus component denoted by the Z coordinate value in the r, theta, Z coordinate positions provided by the system; abstract; fig 2; paragraphs [0021-0022], [0027-0028], [0032-0033], [0036-0039]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify WASSERMAN such that each respective position includes a respective coordinate of the cross table corresponding to a respective sample region and a respective focus position, as taught by MANIAN. The suggestion/motivation for doing so would have been to provide the ability to quickly perform an optically-defined measurement for the location of any fluorescing targets within a predefined area, as suggested at paragraph [0033] of MANIAN. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine MANIAN with WASSERMAN to obtain the invention as specified in claim 10.
As per claim 11, WASSERMAN in view of MANIAN discloses the microscope system as claimed in claim 10. Modified WASSERMAN further discloses wherein the microscope is configured as an incident light microscope, a transmitted light microscope, a light sheet microscope, a scanning microscope, or a fluorescence microscope (as described in paragraphs [0070-0071], machine vision inspection system 10 is used to inspect workpiece 20, which is imaged microscopically: light source 220, 230, or 240 emits source light 222, 232, or 242, respectively, that is usable to illuminate the workpiece 20; the light emitted by the light sources 220, 230, and/or 240 illuminates the workpiece 20 and is reflected or transmitted as workpiece light 255, which passes through the interchangeable objective lens 252 and one of a lens 286 or a lens 288 of the turret lens assembly 280 and is gathered by the camera system 260; therefore, based on similar functionality and methods, vision system 10 is substantially a transmitted light microscope; paragraphs [0067-0071]).
As per claim 13, WASSERMAN in view of MANIAN discloses the method as claimed in claim 1. Modified WASSERMAN further discloses a control unit for a microscope system, which is configured to carry out the method as claimed in claim 1 (the vision system includes controller system 100, which is a computer console with various input and output devices connected to provide a graphical user interface for a microscopic imaging device; paragraphs [0067-0071]).
As per claim 14, WASSERMAN in view of MANIAN discloses the method as claimed in claim 1. Modified WASSERMAN further discloses a non-transitory computer-readable medium having processor-executable instructions stored thereon which, when executed on a processing unit, facilitate performance of the method (the controller 100, comprising a computer console, would further comprise computing processors resident in said console to perform and execute the program instructions related to the vision inspection process; paragraphs [0042], [0067-0071]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DEVIN JACOB DHOOGE whose telephone number is (571) 270-0999. The examiner can normally be reached 7:30-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Andrew Bee, can be reached on (571) 270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Devin Dhooge/
USPTO Patent Examiner
Art Unit 2677
/ANDREW W BEE/Supervisory Patent Examiner, Art Unit 2677