DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 09/04/2025 has been entered.
Response to Amendment
The Amendment filed 08/14/2025 has been entered. No claims have been added. Claims 10, 11, and 20 have been cancelled. Claims 1, 9, and 18 have been amended. Claims 1, 4-9, and 15-19 remain pending in the application.
Response to Arguments
Applicant’s arguments, see pages 6-9, filed 08/14/2025, with respect to the 103 rejections have been fully considered but are not persuasive.
Applicant's arguments do not comply with 37 CFR 1.111(c) because they do not clearly point out the patentable novelty which he or she thinks the claims present in view of the state of the art disclosed by the references cited or the objections made. Further, they do not show how the amendments avoid such references or objections. The remarks provide a general statement with respect to Applicant's designators (A)(a) and (A)(b) for specific limitations of claim 1 on page 7 but do not provide a corresponding argument with respect to the amended claim language. Further, the remarks provide a comparison table between the presented reference Islam and Applicant's designator (A)(c) for a specific limitation of claim 1 on pages 7-8 but again do not provide a corresponding argument with respect to the amended claim language and instead provide a conclusory statement asserting that the combination of presented references fails to disclose the claim language. The rejections presented below have been updated to reflect the amended claim language with corresponding citations from the presented prior art. The 103 rejections are maintained.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 08/07/2025 was considered by the examiner.
Claim Objections
Claim 1 is objected to because of the following informalities: claim 1 recites “visible calibration target” in line 5 but then recites “calibration target” in line 8. Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 9 and 15-19 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 9 recites the limitation "recognizing one of the captured or detected calibration target or a feature within a scene or capture volume" in lines 4-5. There is insufficient antecedent basis for this limitation in the claim. Dependent claims 15-17 fall together accordingly.
Claim 18 recites the limitation "recognize the captured or detected calibration target" in line 6. There is insufficient antecedent basis for this limitation in the claim. Dependent claim 19 falls together accordingly.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 4, 8-9, 15, and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Islam et al. (US 2021/0012534 A1), hereafter Islam, in view of Liu et al. (US 2019/0308326 A1), hereafter Liu, further in view of Rowell et al. (US 2019/0158813 A1), hereafter Rowell.
Regarding claim 1, Islam discloses a machine vision and control system to capture images using cameras and to calibrate the cameras using the images (method and system for performing automatic camera calibration for a scanning system) [title], the machine vision and control system comprising:
a machine vision system including:
a plurality of imaging devices to capture or detect one of (1) visible calibration target or (2) feature within a scene or capture volume for volumetric data capture (3D calibration target; control circuit receives a first set of calibration images via the communication interface from the first camera, wherein the first set of calibration images capture the first set of faces of the polyhedron and capture the first set of 2D calibration patterns disposed respectively on the first set of faces 201; control circuit receives a second set of one or more calibration images via the communication interface from the second camera, wherein the second set of one or more calibration images capture the additional face of the polyhedron 203) [abstract; FIG. 2];
a processor (control circuit may include one or more processors) [0035] to determine a configuration of the plurality of imaging devices using the captured or detected calibration target or feature (control circuit determines, based on first set of coordinates and the second set of coordinates, a transformation function for describing a spatial relationship between the first camera and the second camera 209) [FIG. 2],
wherein the determined configuration includes arranging the plurality of imaging devices in an array of different types and groupings of imaging devices including at least one of groups of a certain number of imaging devices, groups of infra-red and color imaging devices, and arrays of imaging devices in a spherical structure to perform the volumetric capture (control circuit determines, based on first set of coordinates and the second set of coordinates, a transformation function for describing a spatial relationship between the first camera and the second camera 209) [FIG. 2],
the processor to automatically calibrate the plurality of imaging devices using the arrangement of the plurality of imaging devices in an array of different types and groupings of imaging devices (camera calibration according to steps 201-209) [0045]; and
a control system (robot operation system 101) including a motorized device (robot) [0023],
wherein the processor automatically adjusts, positions, aligns, and calibrates the motorized device of the control system using the arrangement of the plurality of imaging devices in an array of different types and groupings of imaging devices (after camera calibration has been performed and when an object other than the 3D pattern is disposed on a first surface of the platform, control circuit generates a 3D model for representing the object, wherein the 3D model is generated based on the transformation function, based on images of the object 211; 3D model of may facilitate an ability of a robot to interact with the object; picking up the object) [FIG. 2; 0023; 0024].
However, while Islam discloses using a pattern to determine a camera configuration to generate a 3D model for a robot to interact with the corresponding real-world object, Islam fails to explicitly disclose automatically adjusting, positioning, aligning, and calibrating lens parameters, including focus and aperture; motorized device mounts on which the plurality of imaging devices is placed; or the processor automatically adjusting, positioning, aligning, and calibrating the motorized device mounts of the control system using the arrangement of the plurality of imaging devices in an array of different types and groupings of imaging devices.
Liu, in an analogous environment, discloses motorized device mounts on which the plurality of imaging devices is placed (robot 105 with a camera 110 mounted thereon; in some embodiments, more than one camera can be used) [0038; 0084], wherein the processor automatically adjusts, positions, aligns, and calibrates the motorized device mounts of the control system using the arrangement of the plurality of imaging devices in an array of different types and grouping of imaging devices (the motion controller can be the machine vision master…the motion controller can control the pose and movement of the robot and to cause the machine vision processor to acquire images…cause the machine vision processor to perform the calibration described herein) [0039].
Islam and Liu are analogous because they are both related to calibration of cameras for machine vision. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the mounted cameras, as disclosed by Liu, with the invention disclosed by Islam, the motivation being improved accuracy [Liu, 0003].
Further, Rowell, in an analogous environment, discloses the processor automatically adjusts, positions, aligns, and calibrates lens parameters, including focus and aperture (real time camera setting 902…video sequences captured using camera configurations comprising one or more camera settings (e.g., baseline, zoom, focus, aperture); slight changes to the position of one or more lenses…an auto re-calibration process that modifies 3D calibration metadata in real time to correct for changes in the position of one or more camera module components) [0119; 0135].
Islam, Liu, and Rowell are analogous because they are all related to calibration of a plurality of cameras. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the lens auto adjustment for calibration, as disclosed by Rowell, with the invention disclosed by Islam and Liu, the motivation being compensating for intrinsic lens parameters [Rowell, 0004].
Regarding claim 4, Islam, Liu, and Rowell address all of the features with respect to claim 1 as outlined above.
Liu further discloses the control system manually activates or initiates a control process of the processor, wherein the control process includes adjusting, positioning, aligning, and calibrating (control system 115 can manipulate the pose of robot 105, e.g., based on analysis of image data from camera 110) [0039].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the control process, as disclosed by Liu, with the invention disclosed by Islam, the motivation being accuracy [0003].
Regarding claim 8, Islam, Liu, and Rowell address all of the features with respect to claim 1 as outlined above.
Islam further discloses lenses electronically connected to the plurality of imaging devices (lens) [0030].
Claim 9 is drawn to a method implemented by the control system of claim 1, and is therefore rejected in the same manner as above.
Regarding claim 15, Islam, Liu, and Rowell address all of the features with respect to claim 9 as outlined above.
Liu further discloses wherein adjusting, positioning, aligning, and calibrating the plurality of imaging devices comprises manually activating or initiating adjusting, positioning, aligning, and calibrating of the plurality of imaging devices (control system 115 can manipulate the pose of robot 105, e.g., based on analysis of image data from camera 110) [0039].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the control process, as disclosed by Liu, with the invention disclosed by Islam, the motivation being accuracy [0003].
Regarding claims 18-19, non-transitory computer readable storage medium claims 18-19 are drawn to the instructions corresponding to the methods of claims 9 and 16. Therefore, non-transitory computer readable storage medium claims 18-19 correspond to method claims 9 and 16 and are rejected for the same reasons of unpatentability as applied above.
Claims 5, 6, 16, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Islam, Liu, and Rowell, in view of Yi-Ping Hung, “A Simple Real-Time Method for Calibrating a Camera Mounted on a Robot for Three Dimensional Machine Vision,” hereafter Hung.
Regarding claim 5, Islam, Liu, and Rowell address all of the features with respect to claim 1 as outlined above.
However, the combination fails to disclose the processor continuously aligns and calibrates the control system according to defined conditions.
Hung, in an analogous environment, discloses the processor continuously aligns and calibrates the control system according to defined conditions (off-line stage…using observations of the calibration objects directly as described in section III.A…calibrate the robot as accurately as possible) [section IV].
Islam, Liu, and Hung are analogous because they are all related to calibrating a camera mounted on a robot. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use offline calibration, as disclosed by Hung, with the invention disclosed by Islam and Liu, the motivation being improving calibration accuracy [Hung, section I].
Regarding claim 6, Islam, Liu, Rowell, and Hung address all of the features with respect to claim 5 as outlined above.
Hung further discloses the defined conditions include when the system is not recording any images (off-line stage…using observations of the calibration objects directly as described in section III.A…calibrate the robot as accurately as possible) [section IV].
Claims 16 and 17 are drawn to methods implemented by the control system of claims 5 and 6, respectively, and are therefore rejected in the same manner as above.
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Islam, Liu, and Rowell, in view of De Villiers et al. (US 9,330,463 B2), hereafter De Villiers.
Regarding claim 7, Islam, Liu, and Rowell address all of the features with respect to claim 1 as outlined above.
However, the combination fails to explicitly disclose wherein the motorized device mounts include one of (1) Brushless DC Motor (BLDC) or (2) Synchronous Servo Motor (SSVM).
De Villiers, in an analogous environment, discloses wherein the motorized device mounts include one of (1) Brushless DC Motor (BLDC) or (2) Synchronous Servo Motor (SSVM) (the mechanical actuator 18 is movable by servo motors) [column 3, lines 36-38].
Islam, Liu, and De Villiers are analogous because they are all related to calibrating cameras on robot arms. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the servo motor, as disclosed by De Villiers, with the invention disclosed by Islam and Liu, to obtain the predictable result of using various types of motors in the robot arms.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEFAN GADOMSKI whose telephone number is (571)270-5701. The examiner can normally be reached Monday - Friday, 12-8PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jay Patel can be reached at 571-272-2988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
STEFAN GADOMSKI
Primary Examiner
Art Unit 2485
/STEFAN GADOMSKI/Primary Examiner, Art Unit 2485