DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-14 are rejected under 35 U.S.C. 103 as being unpatentable over Abari et al. (US Pat. No. 10,739,462, hereinafter referred to as Abari) in view of Jensen et al. (US Pat. No. 11,490,068, hereinafter referred to as Jensen).
Regarding claim 1, Abari discloses an optical measurement system comprising:
a first optical transceiver including a first light source that emits light modulated at frequencies exceeding 100 kHz (see figure 4A, light source; system with transmitter and receiver; col. 6 lines 25-42, each light source directs one or more light beams, the light beams at a wavelength between 840nm and 904nm), a first camera that acquires intensity or color images (see col. 1 lines 14-20 and col. 2 lines 44-51, camera capturing images), a second camera capable of sensing and demodulating the light emitted from the first light source and reflected by an object (figure 4A, col. 2 lines 44-67 and col. 4 lines 39-50, each camera and LiDAR sensor system may include an image sensor that is configured to capture photo images), and a first beam splitter that separates and directs the reflected light to the first camera and the second camera (see abstract, first beam splitter; figure 4A and col. 6 lines 43-65, first beam splitter separates received light beams and reflects them to different sensors; see also col. 8 lines 39-62 and col. 1 lines 58-63); and
a second optical transceiver including a second light source that emits light modulated at frequencies exceeding 100 kHz, a third camera that acquires intensity or color images, a fourth camera capable of sensing and demodulating the modulated light emitted from the second light source and reflected by the object, and a second beam splitter that separates and directs the reflected light to the third camera and the fourth camera (see figure 4A, light source; beam splitter, and col. 6 lines 43-65, one or more return light beams 410; second beam splitter separates a second portion of received light beams into a third portion that is directed to 426 and a fourth portion 428; see col. 6 lines 19-41, multiple transmitters for transmitting multiple wavelengths; each light source directs one or more light beams, the light beams at a wavelength between 840nm and 904nm; see col. 2 lines 49-51, multiple cameras, between four and six, can be used; see col. 2 lines 44-67 and col. 4 lines 39-50, each camera and LiDAR sensor system may include an image sensor that is configured to capture photo images),
wherein the optical measurement system operates in at least one operation mode including a first operation mode, and in the first operation mode, the first camera and the third camera are employed to determine a three-dimensional shape of an environment, and at least one of the second camera and the fourth camera is employed to determine the three-dimensional shape of the environment using a time-of-flight technique (see col. 6 line 62-col. 7 line 16 using TOF sensor and InGaAs sensor; see col. 8 line 63-col. 9 line 16, using any other suitable sensor or a combination thereof; emitting pulsed laser light; see col. 3 lines 17-36 and col. 5 lines 45-53, using TOF sensors for receiving and processing depth profile data; see col. 2 lines 36-67 creating three-dimensional image; see col. 9 lines 50-59 3D model of surrounding; sensors in any suitable locations in or on a vehicle; see col. 10 lines 17-23 sensor data represent a three-dimensional schema of vehicle’s external environment).
Claim 1 differs from Abari in that the claim further requires using a triangulation technique.
In the same field of endeavor, Jensen discloses a triangulation scanner having a projection unit and first, second, third, and fourth image acquisition units (see abstract). Jensen further discloses first and second acquisition states (see col. 5 lines 25-36 and line 61-col. 6 line 3); two beam splitters are provided and arranged together with the cameras (see figure 2 and col. 5 lines 20-24), and the first and third or the second and fourth image acquisition units are aligned in relation to the beam splitter (see col. 11 lines 9-24). See also col. 1 lines 13-14, col. 2 lines 33-38 and col. 10 lines 45-49.
Therefore, in light of the teaching in Jensen, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Abari by specifically adding a triangulation technique in order to measure distances and vibrations on various surfaces, provide rapid and precise scanning of a surface, and perform real-time inspection.
Regarding claim 2, Jensen discloses the at least one operation mode includes a second operation mode, and in the second operation mode, the first camera and the third camera are employed to determine the three-dimensional shape of the environment using the triangulation technique, and the second camera and the fourth camera are not employed to determine the three-dimensional shape of the environment (see abstract; col. 5 lines 13-19; col. 8 lines 23-39 and also col 2 lines 33-38). The motivation to combine the references is discussed in claim 1 above.
Regarding claim 3, Jensen discloses the at least one operation mode includes a third operation mode, and in the third operation mode, the first camera and the third camera are not employed to determine the three-dimensional shape of the environment, and at least one of the second camera and the fourth camera is employed to determine the three-dimensional shape of the environment using the time-of-flight technique (col. 5 lines 15-24; col. 8 lines 24-48 and also col. 2 lines 33-38 and col. 5 line 52-col. 6 line 3). The motivation to combine the references is discussed in claim 1 above.
Regarding claim 4, Abari discloses that in at least one of the at least one operation mode, both of the first light source and the second light source emit light towards the object (see col. 6 lines 16-39 and figure 4A).
Regarding claim 5, Abari discloses that, when using the time-of-flight technique, both of the second camera and the fourth camera are employed to determine the three-dimensional shape of the environment using the time-of-flight technique (see figure 4A, col. 5 lines 35-53 and col. 6 line 43-col. 7 line 16).
Regarding claim 6, Jensen discloses under a first condition, the optical measurement system operates in the first operation mode, and under a second condition in which background illuminance is higher than background illuminance in the first condition, the optical measurement system operates in the second operation mode (see col. 8 line 48-col. 9 line 50). The motivation to combine the references is discussed in claim 1 above.
Regarding claim 7, Jensen discloses under a first condition, the optical measurement system operates in the first operation mode and under a third condition in which background illuminance is lower than background illuminance in the first condition, the optical measurement system operates in the third operation mode (see col. 8 line 48-col. 9 line 50 and col. 11 line 31-col. 12 line 14). The motivation to combine the references is discussed in claim 1 above.
Regarding claim 8, Jensen discloses in the first operation mode, the first camera and the third camera are employed to determine the three-dimensional shape of a first part in the environment using the triangulation technique (see col. 5 lines 25-36 and line 61-col. 6 line 3; figure 2 and col. 5 lines 20-24), and Abari discloses the at least one of the second camera and the fourth camera is employed to determine the three-dimensional shape of a second part of the environment using the time-of-flight technique, the second part being more weakly lit than the first part (see col. 6 line 62-col. 7 line 16; see col. 8 line 63-col. 9 line 16; see col. 3 lines 17-36 and col. 5 lines 45-53; see also col. 2 lines 36-67 and col. 9 lines 50-59). The motivation to combine the references is discussed in claim 1 above.
Regarding claim 9, Abari discloses in the first operation mode, a DC part of the modulated light is employed for the triangulation technique, while an AC part of the modulated light is demodulated and employed for the time-of-flight technique (see col. 1 lines 7-20; col. 3 lines 17-36 and col. 8 lines 36-41).
Regarding claim 10, Abari discloses each of the first light source and the second light source includes lasers with wavelengths of red, green, and blue, respectively (see col. 5 lines 20-53).
Regarding claim 11, Abari discloses each of the first light source and the second light source includes a white LED (see col. 5 lines 20-53 and col. 6 lines 27-42).
Regarding claim 12, Abari discloses each of the first camera and the third camera includes a color filter, an imaging lens, and a black and white or color image sensor (see col. 1 lines 7-20; col. 3 lines 17-36 and col. 8 lines 36-62).
Regarding claim 13, Abari discloses each of the second camera and the fourth camera includes a color filter, an imaging lens, and a demodulation image sensor (see col. 5 lines 20-53; col. 6 line 43-col. 7 line 15 and col. 8 lines 36-62).
Regarding claim 14, the limitations of method claim 14 are found in claim 1. Therefore, claim 14 is analyzed and rejected for the same reasons as discussed in claim 1 above.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HELEN SHIBRU whose telephone number is (571)272-7329. The examiner can normally be reached M-TR 8:00AM-5:00PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, THAI TRAN, can be reached at (571) 272-7382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HELEN SHIBRU/Primary Examiner, Art Unit 2484 February 18, 2026