Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent Application No. TW112117055, filed on 5/08/2023.
Drawings
The drawings submitted on 8/04/2023 are in compliance with the provisions of 37 CFR 1.81. Accordingly, the drawings are being considered by the examiner.
Specification
The specification submitted on 8/04/2023 is in compliance with the provisions of 37 CFR 1.71. Accordingly, the specification is being considered by the examiner.
Claim Objections
Claim 2 is objected to because of the following informalities:
Claim 2, line 2: “in the environmental image” appears to be --the environmental image--.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 2 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding claim 2, the limitation “in the environmental image including a plurality of sampling areas” is unclear. Claim 1 previously recites “environmental images” (plural). It is unclear which of these should be considered “the environmental image”.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-5, 7, 12 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Price et al. (US 20180227566 A1, "Price") in view of Grandjean et al. (DE69801758T2, "Grandjean") and Donovan (US 20170307736 A1, "Donovan").
Regarding claim 1, Price teaches a LiDAR system, comprising:
a microcontroller unit (Price, Para [0042], Fig 1, where the imaging system 100 can be a time-of-flight measurement device that modulates over time; as a result, the system must have a controller unit in order to modulate);
a laser light source, coupled to the microcontroller unit (Price, Para [0043], Fig 1, where the illuminator 104 can output multiple wavelength light within a wavelength range and is a part of the imaging system 100 and therefore coupled to the controller modulating the laser);
a lens module (Price, Para [0047], Fig 1, where the illuminator 104 may include lenses that can change the field of illumination (FOI) of the illuminator. Para [0052] discloses how the imaging sensor 106 may also include lenses to change the field of view (FOV));
and a receiver, coupled to the microcontroller unit (Price, Para [0044], Fig 1, where the imaging sensor receives reflected light and is a part of the imaging system 100 and therefore coupled to the controller modulating the laser),
the lens module includes (Price, Para [0076], Fig 5-2);
the receiver lens module receives a reflective light signal of the (Price, Para [0076], Fig 5-2, where the multiple lenses shown direct the light back toward the photoreceptor 265);
the laser light source emits a pulse signal with a cycle time (Price, Para [0042], Fig 1, where the imaging sensor 106 has a coordinated shutter that operates with light modulation, which, when allowing for a time-of-flight depth measurement, implies a pulse emission and return time);
the microcontroller controls the receiver to turn on during a sensor shutter time and turn off during a reset time in each cycle time (Price, Para [0042], Fig 1, where the imaging sensor 106 has a coordinated shutter that operates with light modulation).
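As an illustrative aside (not part of the cited references, and all names and values below are hypothetical), the relationship between the claimed pulse cycle, sensor shutter time, and detection range in a pulsed time-of-flight system can be sketched as follows:

```python
# Hypothetical sketch of pulsed time-of-flight range gating: the shutter
# window passes only pulse returns whose round-trip time corresponds to
# a target inside the detection range. Not drawn from Price or the claims.
C = 299_792_458.0  # speed of light, m/s

def round_trip_time(distance_m: float) -> float:
    """Time for an emitted pulse to reach a target and return."""
    return 2.0 * distance_m / C

def shutter_window(range_min_m: float, range_max_m: float) -> tuple[float, float]:
    """Shutter open/close times that gate returns to the given detection range."""
    return round_trip_time(range_min_m), round_trip_time(range_max_m)

# A return arriving between 'start' and 'end' implies a target within 5-150 m.
start, end = shutter_window(5.0, 150.0)
```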
However, Price does not teach the laser light source emits a plurality of laser lights with different wavelengths and includes a light coupler and a fiber, the light coupler optically coupling the laser lights into a collimated light signal transmitted through the fiber;
the lens module includes a laser beam splitter module and diffracts the laser lights into a plurality of diffractive lights, the diffractive lights being emitted towards a target;
the laser beam splitter module includes a diffractive optical element and a collimation lens assembly;
in a sensor shutter time of a subframe in a frame, a plurality of pixels of the receiver receive at least one reflective light signal of the laser lights with different wavelengths, obtains environmental images of a plurality of subframes, and takes distance values representing the reflective light signals as the distance values of the pixels in the subframe;
and the microcontroller unit fuses the distance values of the pixels in the environmental images of the plurality of subframes as a final distance value of the frame.
On the other hand, a different embodiment of Price teaches a diffraction grating as a part of a lens assembly, to produce the diffractive lights, which can be adjusted axially for different FOIs (Price, Para [0087], Fig 8, where the diffraction grating 364 can be used in Fig 5-1 to produce structured light for higher resolution).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have modified the LiDAR system of the first embodiment of Price in view of the second embodiment of Price, by substituting the diffuser with a diffraction grating to produce structured light, which allows for improved angular resolution and thereby improved depth calculations. See MPEP 2141.III, KSR Rationale B.
However, Price still does not teach the laser light source emits a plurality of laser lights with different wavelengths and includes a light coupler and a fiber, the light coupler optically coupling the laser lights into a collimated light signal transmitted through the fiber;
the laser beam splitter module includes a collimation lens assembly;
in a sensor shutter time of a subframe in a frame, a plurality of pixels of the receiver receive at least one reflective light signal of the laser lights with different wavelengths, obtains environmental images of a plurality of subframes, and takes distance values representing the reflective light signals as the distance values of the pixels in the subframe;
and the microcontroller unit fuses the distance values of the pixels in the environmental images of the plurality of subframes as a final distance value of the frame.
On the other hand, Grandjean teaches the use of collimating lenses and optical fibers to collimate the emitted light (Grandjean, Para [0069], Fig 1A, where diffractive-type devices 12 collimate light transmitted through fiber 8).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have modified the LiDAR system of Price in view of Grandjean, by applying Grandjean’s optical fiber 8 to allow for the transport of radiation coming from the laser in the direction of the desired target. See MPEP 2141.III, KSR Rationale B.
Accordingly, it also would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have modified the LiDAR system of Price in view of Grandjean, by applying the diffractive-type devices 12 to correct any aberrations of the focused light while being transmitted through optical fiber 8. See MPEP 2141.III, KSR Rationale B.
However, Price in view of Grandjean still does not teach
in a sensor shutter time of a subframe in a frame, a plurality of pixels of the receiver receive at least one reflective light signal of the laser lights with different wavelengths, obtains environmental images of a plurality of subframes, and takes distance values representing the reflective light signals as the distance values of the pixels in the subframe;
and the microcontroller unit fuses the distance values of the pixels in the environmental images of the plurality of subframes as a final distance value of the frame.
On the other hand, Donovan teaches the use of multiple wavelengths (Donovan, Para [0131], Fig 21, where multiple wavelengths are used to create an image using multiple lasers), using multiple subframes based on the distances measured from the different wavelengths (Donovan, Para [0131], Fig 21, where two wavelengths are used for different distance ranges and are combined into a point cloud image made up of multiple wavelength-based subframes) to create a final map representing the distance of the frame (Donovan, Para [0131], Fig 21).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have further modified Price’s LiDAR system, in view of Donovan, by using multiple wavelengths to achieve a higher frame rate, which would allow for increased resolution relative to a fast-moving body. See MPEP 2141.III, KSR Rationale G.
Regarding claim 2, Price in view of Grandjean and Donovan teaches the LiDAR system according to claim 1, further comprising:
in the environmental image including a plurality of sampling areas, performing a batch comparison of average distance values of the plurality of sampling areas in the subframes (Donovan, Para [0131], Fig 21, where two wavelengths are used for different distance ranges such as the short and long ranges in Price, Fig 2 and Fig 3, which are compared to get a final combined point cloud image);
and according to the result of the batch comparison, the microcontroller unit eliminating abnormal subframes and fusing normal subframes as the final distance value of the frame (Donovan, Para [0131], Fig 21, where two wavelengths are used for different distance ranges such as the short and long ranges in Price, Fig 2 and Fig 3, which are compared to get a final combined point cloud image).
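Purely as an illustrative aside (not drawn from the cited references; the function, tolerance, and values below are hypothetical), the claimed batch comparison of average distance values across subframes, with elimination of abnormal subframes before fusion, could be sketched as:

```python
# Hedged sketch of a batch comparison over one sampling area: compare each
# subframe's average distance against the median across subframes, discard
# outlier ("abnormal") subframes, and average the rest. All names and the
# 5% tolerance are assumptions for illustration only.
from statistics import mean, median

def fuse_subframes(area_averages: list[float], tol: float = 0.05) -> float:
    """area_averages: one average distance (per sampling area) per subframe."""
    mid = median(area_averages)
    # Keep only subframes whose average lies within tol of the median.
    normal = [d for d in area_averages if abs(d - mid) <= tol * mid]
    return mean(normal)

# The 9.0 m subframe deviates sharply from the others and is eliminated.
fused = fuse_subframes([10.1, 9.9, 10.0, 9.0])
```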
Regarding claim 3, Price in view of Grandjean and Donovan teaches the LiDAR system according to claim 1, wherein the diffractive optical element has a function of rotation or oscillation (Price, Para [0121]-[0122], Fig 20, where the gimbal comprising the illuminator and therefore associated lenses can rotate vertically or horizontally).
Regarding claim 4, Price in view of Grandjean and Donovan teaches the LiDAR system according to claim 1, wherein the receiver lens module includes a lens module with an adjustable focal length including at least one concave lens and at least one convex lens, which modulates a size of field of view according to a detection range (Price, Para [0076], Fig 5-2, where the lens module consists of the multiple lenses shown and moveable lens 258, which is moveable for different FOVs).
Regarding claim 5, Price in view of Grandjean and Donovan teaches the LiDAR system according to claim 1, wherein the receiver lens module includes a plurality of lens modules with fixed focal lengths, each lens module including at least one concave lens and at least one convex lens, the lens modules being switched according to a detection range to modulate a size of field of view (Price, Para [0076], Fig 5-2, where the lens module consists of the combination of multiple lenses shown and moveable lens 258, which is moveable for different FOVs).
Regarding claim 7, Price in view of Grandjean and Donovan teaches the LiDAR system according to claim 1, wherein the laser beam splitter module includes the diffractive optical element and a collimation lens assembly with an adjustable focal length, the collimation lens assembly being switched according to a detection range to modulate a range of field of image (Grandjean, Para [0069], Fig 1A, where diffractive-type devices 12 collimate light transmitted through fiber 8, used with Price’s diffraction grating and lens assembly of Para [0087], Fig 8, where the lens module consists of the multiple lenses shown and moveable lens 354, which is moveable for different FOVs).
Regarding claim 12, Price in view of Grandjean and Donovan teaches the LiDAR system according to claim 1, wherein the sensor shutter time and the reset time are determined according to a detection range (Price, Para [0042], Fig 1, where the imaging sensor 106 has a coordinated shutter that operates with light modulation, which, when allowing for a time-of-flight depth measurement, implies the modulation of different depths, such as in Fig 2 and Fig 3).
Regarding claim 14, Price in view of Grandjean and Donovan teaches a resolution improvement method of the LiDAR system according to claim 1, the method comprising:
setting the diffractive optical element as a movable element with a function of rotation and/or reciprocating movement (Price, Para [0121]-[0122], Fig 20, where the gimbal comprising the illuminator and therefore associated lenses can rotate vertically or horizontally from 1 degree to 180 degrees);
under conditions of a plurality of rotation angles or reciprocating positions, obtaining a plurality of subframes of environmental images (Donovan, Para [0131], Fig 21, where two wavelengths are used for different distance ranges which can be used as a combined point cloud image made up of multiple wavelength based subframes);
each of the reflective light signals at each pixel of the environmental images representing a sub-distance value, a plurality of sub-distance values in each environmental image of a subframe constituting a three-dimensional image with depth information (Donovan, Para [0131], Fig 21, where two wavelengths are used for different distance ranges which can be used as a combined point cloud image made up of multiple wavelength based subframes);
and after eliminating abnormal subframes, fusing the environmental images of the remaining subframes, if a pixel has a plurality of sub-distance values, taking an average or selecting one, if the pixel has only one sub-distance value, selecting the sub-distance value, if the pixel has no sub-distance value, selecting a maximum value within a detection range, and calculating the final distance value of the three-dimensional image of the frame (Donovan, Para [0131], Fig 21, where multiple wavelengths are used to create an image using multiple lasers based on distance information categorized by wavelength. In combination with the FOI changing process including depth values of distorted images as disclosed in Price, Para [0107], an accurate image from multiple depths can be generated).
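As an illustrative aside (not part of the claims or cited references; the function name and values are hypothetical), the per-pixel fusion rule recited above can be sketched directly: average when a pixel has several sub-distance values, keep a single value as-is, and fall back to the detection range's maximum when the pixel has none.

```python
# Hedged sketch of the recited per-pixel fusion rule. The "select one"
# alternative in the claim is represented here only by the averaging branch.
def fuse_pixel(sub_distances: list[float], range_max: float) -> float:
    if len(sub_distances) > 1:
        # Plurality of sub-distance values: take the average.
        return sum(sub_distances) / len(sub_distances)
    if len(sub_distances) == 1:
        # Exactly one sub-distance value: select it.
        return sub_distances[0]
    # No sub-distance value: select the maximum within the detection range.
    return range_max
```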
Claims 6, 8-11 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Price in view of Grandjean, Donovan and Dong et al. (US 20210341610 A1, "Dong").
Regarding claim 6, Price in view of Grandjean and Donovan teaches the LiDAR system according to claim 1, wherein the laser beam splitter module includes the diffractive optical element (Grandjean, Para [0069], Fig 1A, where diffractive-type devices 12 collimate light transmitted through fiber 8, used with Price’s diffraction grating and lens assembly of Para [0087], Fig 8, where the lens module consists of the multiple lenses shown and moveable lens 354, which is moveable for different FOVs).
However, Price in view of Grandjean and Donovan does not teach a collimation lens assembly including a concave mirror and a collimating lens.
On the other hand, Dong teaches a collimation lens assembly including a concave mirror and a collimating lens to make incoming light from multiple angles parallel (Dong, Para [0050], Fig 6, where concave reflector 3 and lens 4, as disclosed in Para [0047], also collimate light).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have modified the LiDAR system of Price in view of Grandjean, Donovan and Dong, by applying the collimating lens assembly to collimate the light diffracted by the lenses in Price, Fig 5-1, for improving the strength of the optical signal and making the system more compact. See MPEP 2141.III, KSR Rationale D.
Regarding claim 8, Price in view of Grandjean, Donovan and Dong teaches the LiDAR system according to claim 6, wherein the diffractive optical element diffracts the laser light into the diffractive lights, the collimation lens assembly is placed at a front of the diffractive optical element, and a mirror surface of the collimation lens assembly is perpendicular to an incident direction of the laser light to converge the diffractive lights to be substantially parallel to each other (Dong, Para [0050], Fig 6, where concave reflector 3 and lens 4, as disclosed in Para [0047], in order to collimate diffracted light, can be placed in front of the lenses in Price, Fig 8).
Regarding claim 9, Price in view of Grandjean and Donovan teaches the LiDAR system according to claim 7, wherein the diffractive optical element diffracts the laser light into the diffractive lights (Price, Para [0087], Fig 8, where the diffraction grating 364 can be used in Fig 5-1 to produce structured light for higher resolution).
However, Price in view of Grandjean and Donovan does not teach the collimation lens assembly is placed at a front of the diffractive optical element, and a mirror surface of the collimation lens assembly is perpendicular to an incident direction of the laser light to converge the diffractive lights to be substantially parallel to each other.
On the other hand, Dong teaches a collimation lens assembly including a concave mirror and a collimating lens to make incoming light from multiple angles parallel, which can be placed in front of the mirror for collimation (Dong, Para [0050], Fig 6, where concave reflector 3 and lens 4, as disclosed in Para [0047], in order to collimate diffracted light, can be placed in front of the combination of lenses in Price, Fig 8).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have modified the LiDAR system of Price in view of Grandjean, Donovan and Dong, by applying the collimating lens assembly to collimate the light diffracted by the lenses in Price, Fig 8, for improving the strength of the optical signal and making the system more compact. See MPEP 2141.III, KSR Rationale D.
Regarding claim 10, Price in view of Grandjean, Donovan and Dong teaches the LiDAR system according to claim 6, further including a concave mirror, the diffractive optical element diffracts the laser light into the diffractive lights, the concave mirror collects the diffractive lights, and the collimation lens assembly is placed at a front of the concave mirror to converge the diffractive lights to be substantially parallel to each other (Dong, Para [0050], Fig 6, where concave reflector 3 and lens 4 as disclosed in Para [0047], in order to collimate diffracted light, can be placed in front of the combination of lenses in Price, Fig 8 ).
Regarding claim 11, Price in view of Grandjean and Donovan teaches the LiDAR system according to claim 7, wherein the diffractive optical element diffracts the laser light into the diffractive lights (Price, Para [0087], Fig 8, where the diffraction grating 364 can be used in Fig 5-1 to produce structured light for higher resolution).
However, Price in view of Grandjean and Donovan does not teach further including a concave mirror, the concave mirror collects the diffractive lights, and the collimation lens assembly is placed at a front of the concave mirror to converge the diffractive lights to be substantially parallel to each other.
On the other hand, Dong teaches a collimation lens assembly including a concave mirror and a collimating lens to make incoming light from multiple angles parallel, which can be placed in front of the mirror for collimation (Dong, Para [0050], Fig 6, where concave reflector 3 and lens 4 as disclosed in Para [0047], in order to collimate diffracted light, can be placed in front of the combination of lenses in Price, Fig 8).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the invention, to have modified the LiDAR system of Price in view of Grandjean, Donovan and Dong, by applying the collimating lens assembly to collimate the light diffracted by the lenses in Price, Fig 8, for improving the strength of the optical signal and making the system more compact. See MPEP 2141.III, KSR Rationale D.
Regarding claim 13, Price in view of Grandjean, Donovan and Dong teaches the LiDAR system according to claim 11, further including a start time and an end time, the microcontroller controls the receiver to turn on between the start time and the end time within each cycle time, and to turn off during the remaining time (Price, Para [0042], Fig 1, where the imaging sensor 106 has a coordinated shutter that operates with light modulation, which, when allowing for a time-of-flight depth measurement, implies the modulation of different depths, such as in Fig 2 and Fig 3);
the start time is determined according to a lower limit of the detection range (Price, Para [0064], Fig 2, where the lower limit is between 5 degrees and 90 degrees (for the longer range)); and
the end time is determined according to an upper limit of the detection range (Price, Para [0056], Fig 3, where the upper limit is between 60 degrees and 150 degrees (for the shorter range)).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZAKI HAWKINS whose telephone number is (571)272-6595. The examiner can normally be reached Monday-Friday 7:30am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, YUQING XIAO can be reached at (571) 270-3603. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ZAKI KEHINDE HAWKINS/Examiner, Art Unit 3645
/YUQING XIAO/Supervisory Patent Examiner, Art Unit 3645