Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-7, 9-12, 14-18, 21-24 are currently pending and examined below.
Response to Amendment
This is a final Office action in response to applicant's remarks/arguments filed on 09/22/2025.
Status of the claims:
Claims 1, 4-5, 9-10, 12, 17 have been amended.
Claims 8, 13, 19, 20 have been canceled.
Claims 21-24 have been added.
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d).
Applicant’s arguments, see Remarks pages 7-15, filed on 09/22/2025, with respect to the rejection(s) of claims 1, 8, 10, 12-13, 17, 20 under 35 U.S.C. 102 and claims 2, 11, 14, 18, 19 under 35 U.S.C. 103 have been fully considered but are not persuasive. Therefore, the rejections are maintained.
In the remarks on page 8, the Applicant argues that Hicks as a whole fails to disclose a position sensor, the position sensor configured to independently determine orientation information of the object based on the laser light reflected by the object; and obtaining laser point cloud data of the object based on the distance information and the orientation information (i.e., as obtained by the position sensor).
Firstly, Applicant respectfully submits that the function and purpose of the camera in Hicks are different from those of the claimed position sensor. This is because the camera in Hicks is unrelated to determining the orientation, and thus fails to unambiguously disclose the limitation "independently determine orientation information".
Secondly, Applicant respectfully submits that the operation of the camera is different from that of the claimed position sensor. Specifically, Hicks, as understood by the person of ordinary skill in the art, fails to unambiguously teach that the camera operates on the returned (reflected) laser light, and thus fails to unambiguously disclose the limitation "based on the laser light reflected by the object" performed by the position sensor.
Hicks discloses a laser radar system (see at least fig. 1, para 13, 35, 50-53) comprising a laser transceiver 100 (lidar) including a laser emitter and receiver, and an imaging sensor 101 (camera) configured to capture light reflected from a target illuminated by the laser emitter. Hicks further teaches that the imaging sensor determines position and orientation information of the objects or scene features based on the captured reflected light, and that a system processor fuses the range information (from the lidar) and the orientation information (from the camera) to construct a 3D point cloud representation of the object (para 50-53). Thus, the camera 101 of Hicks performs the claimed function of “a position sensor, the position sensor is configured to independently determine orientation information of the object based on the laser light reflected by the object.”
Applicant’s assertion that Hicks’s camera is not a “position sensor” is not persuasive because the claim language defines the element by its function, not by its name. The camera in Hicks performs the claimed function of determining orientation based on reflected laser illumination, regardless of its label. Further, Hicks explicitly discloses that the camera operates independently (at least in fig. 1, para 13, 35) from the lidar and communicates its results to a processor that generates a point cloud from both data sources, thereby meeting the claimed “obtaining laser point cloud data of the object based on the distance information and the orientation information.”
Accordingly, Hicks (para 13, 35, 50-53) discloses every limitation of amended claim 1. For these reasons, the rejection is maintained.
Applicant’s arguments with respect to claim 3 have been considered but are moot in view of the same ground of rejection, which was necessitated by amendment.
In the remarks on page 15, the Applicant argues that Hicks 447 fails to disclose the specific pixel-count threshold:
“Wherein a number of pixels output by the position sensor is less than half of a total number of pixels of the position sensor and greater than a number of pixels corresponding to the laser light reflected by the object in each measurement”
The Applicant argues that Hicks 447 uses a DVS (dynamic vision sensor) camera that outputs asynchronous pixel events, but does not quantify the claimed threshold.
Applicant’s arguments are not persuasive. Hicks 447 (col 2: lines 62-67, col 3: lines 16-25, col 4: lines 35-52) teaches event-driven pixel readout based on intensity-change thresholds, i.e., only pixels with changes beyond a threshold are transmitted. This inherently produces a subset of the total pixels (typically far less than 50%), and the “threshold” language implies the claimed lower bound (greater than the number of active pixels corresponding to reflected laser returns).
One of ordinary skill in the art would recognize that tuning the DVS trigger threshold and region of interest effectively defines a dynamic pixel fraction between full frame and sparse reflected points.
The claimed numerical boundaries (less than half of the total pixels and greater than the number of reflection points) are optimization ranges of a known DVS principle, and it would have been obvious to select them to balance bandwidth-efficiency and accuracy trade-offs.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 14, 17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
In claim 14, the limitation “the time of flight” lacks antecedent basis.
In claim 17, the limitation “the reflected laser light” lacks antecedent basis.
The following is a quotation of 35 U.S.C. 112(d):
(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.
The following is a quotation of pre-AIA 35 U.S.C. 112, fourth paragraph:
Subject to the following paragraph [i.e., the fifth paragraph of pre-AIA 35 U.S.C. 112], a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.
Claims 14 and 17 are rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.
Claims 14 and 17 depend on canceled claim 13.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 10, 12, 17, 21-24 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Hicks et al. (US 20200018854 A1).
Regarding claim 1, Hicks teaches a laser radar, the laser radar (Fig. 1, machine vision 10) comprising:
a laser transceiver (Fig. 1, para 35, Lidar 100), wherein the laser transceiver comprises a laser emitter (Fig. 1, para 36, light source 110) and a laser receiver (Fig. 1, para 40, receiver 140), and the laser receiver determines distance information of the laser transceiver away from an object based on laser light emitted by the laser emitter and reflected by the object (para 40);
a position sensor (Fig. 1, para 35, camera 101), the position sensor is configured to independently determine orientation information of the object based on the laser light reflected by the object (Para 53 “the camera 101 … can detect depth or relative distance of objects, identify types of objects, identify presence of atmospheric obscurants, etc.”, para 54-55. Based on the depth measurement, a 3D representation of the object can be made and, by comparing the relative motion of the objects, an orientation (direction) of the object relative to the camera/lidar can be determined, for example, whether the object is moving away from or toward the camera. See also, para 13 “The processor is configured to receive, from a camera with a field of regard (FOR) which includes the direction of the possible targets of the emitted pulse of light”); and
a processor (Fig. 1, controller 150), the processor communicating with the laser transceiver (para 49) and the position sensor (Para 53 “the camera 101 can be communicatively coupled to the controller 150”) respectively, and obtaining laser point cloud data of the object based on the distance information and the orientation information (Para 50, 51-53).
Regarding claim 10, Hicks teaches the laser radar according to claim 1, wherein the position sensor comprises a CMOS image sensor and/or a CCD image sensor (para 53), a clock counter (para 55 “the camera 101 perceives depth information using time-gated exposure”, so a timer measures the time at which each image is captured) and an APD array (para 54 “The camera 101 can be a stereo camera with two active regions”), the position sensor determines the orientation information of the object based on the laser light reflected by the object during an exposure duration, and the clock counter records a time at which the laser light reflected by the object reaches the transceiver in the exposure duration relative to an exposure start time (Para 54-55).
Claim 12 is a method claim corresponding to system claim 1. It is rejected for the same reasons.
Regarding claim 17, Hicks teaches the method according to claim 13, wherein the method further comprises: determining a material or a surface shape of the object based on light intensity information of the reflected laser light (Para 35, 37, 43, 52 teach a Lidar that includes a light source (laser) configured to produce pulses of light that are characterized by an intensity that varies in time, and the reflected light allows the system to identify or detect objects or to determine a shape or distance of objects within the FOR. Light intensity information of the reflected laser may refer to the return strength of a laser pulse, indicating how much reflected energy is received by the sensor. See also, figs. 11A-11C, para 95-96).
Regarding claim 21, Hicks teaches the laser radar according to claim 1,
wherein at least one of: the laser transceiver does not record the orientation information of the object (Fig.1, para 13, 35, 50-53. Hicks separates the function of lidar 100 and imaging sensor 101 (orientation detection). The lidar 100 measures distance based on time of flight; the orientation of the object is determined by the camera (imaging sensor 101)); and
scan positions of the laser transceiver are not used to determine the orientation information of the object (para 50-53. Orientation and pose are computed from the image captured by imaging sensor 101; the scan angle or mirror position of the lidar is used only for range mapping).
Regarding claim 22, Hicks teaches the laser radar according to claim 1, wherein only the position sensor is configured to determine the orientation information of the object (Fig. 1, para 13, 35, Hicks describes camera 101 (the imaging sensor) that “captures reflected light from the target illuminated by the emitted laser” to determine the position or orientation of scene features) based on only the laser light reflected by the object (Para 35 “camera 101 captures light reflected from the target illuminated by the laser emitter 110”).
Regarding claim 23, Hicks teaches the laser radar according to claim 1, wherein the position sensor is further configured to obtain an orientation of the laser transceiver (Para 53. Hicks describes that the system determines the relative orientation between the imaging sensor (camera 101) and the lidar 100), corresponding to the orientation information of the object (Para 51. Orientation and pose of the object are used to determine system alignment), based on the laser light reflected by the object which is received by the position sensor (Para 35 “camera 101 captures light reflected from the target illuminated by the laser emitter 110”).
Regarding claim 24, Hicks teaches the laser radar according to claim 1, wherein the position sensor is configured to independently collect, from laser light reflected by the object and which is received on the position sensor, laser reflection points from a surface of the object (Para 50-53. Hicks describes capturing a plurality of reflected laser returns (pixels) representing surface points of the object to build a 3-D model),
and to determine orientation information of those laser reflection points (Para 53, the processor determines surface normal vectors and orientation of features from the captured laser reflections),
and/or the position sensor is configured to identify which laser emitter of the laser transceiver each laser reflection point originates from (Para 53, Hicks’s timing/multiplex system associates each reflection with the originating emitter).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2, 11, 14, 18 are rejected under 35 U.S.C. 103 as being unpatentable over Hicks et al. (US 20200018854 A1).
Regarding claim 2, Hicks teaches the laser radar according to claim 1, wherein the laser transceiver comprises at least two sets of laser transceivers, and the at least two sets of laser transceivers scan independently of each other (Para 62, machine vision 10 can have 4-10 lidar systems 100), and
Hicks fails to explicitly teach the at least two sets of laser transceivers nonuniformly dividing a total field-of-view of the laser radar. However, Hicks in para 62 teaches that 4-10 lidar systems, each covering a 45-degree to 90-degree horizontal FOR, can be combined to cover a 360-degree horizontal FOR. One of ordinary skill in the art would know how to set the angles of at least 2 lidar systems to nonuniformly divide a total field-of-view of the laser radar; for example, in a turn or at a curb, one lidar can cover a 30-degree horizontal FOR and another a 60-degree horizontal FOR.
Regarding claim 11, Hicks teaches the laser radar according to claim 1, wherein the laser transceiver comprises at least two sets of laser transceivers, wherein at least one set of laser transceivers are Flash laser radars (Para 62 “machine vision system 10 can have 4-10 lidar systems 100,” Para 112, “flash Lidar can be used”), and
Hicks fails to explicitly teach a field-of-view of the Flash laser radars being less than 0.75 times a total field-of-view of a to-be-measured scenario measured by the laser radar. However, Hicks teaches in para 62 “machine vision system 10 can have 4-10 lidar systems 100, each system having a 45-degree to 90-degree horizontal FOR, may be combined together to form a sensing system that provides a point cloud covering a 360-degree horizontal FOR” and in para 112 “flash Lidar can be used”. For example, if machine vision 10 uses 4 lidar systems 100, each covering 90 degrees, and 2 of them are flash laser radars, they will cover 180 degrees, i.e., 0.5 times the total FOR (360 degrees).
Regarding claim 14, Hicks teaches the method according to claim 13, wherein the laser transceiver comprises at least two laser receivers that are spatially separated from each other (Fig.2, para 56, receivers 214A and 214B), and measuring the distance information further comprises:
Hicks fails to explicitly teach determining jointly the distance information based on positions of the at least two laser receivers that are separated from each other and the time of flight. However, Hicks in para 52, 56, 69-70 teaches a lidar system that operates in at least a two-eye or sensor-head configuration, where each eye or sensor head has a receiver, and the data of each eye or sensor head can be combined to determine the distance to one or more downrange targets.
Regarding claim 18, Hicks teaches the method according to claim 12, wherein measuring the orientation information of the object further comprises: recording the orientation information based on an intensity of a laser signal sensed within an exposure duration of the position sensor being greater than a predetermined threshold; or recording the orientation information, in response to a number of regions of a set of lasers having a strongest laser light intensity of a laser signal sensed within an exposure duration of the position sensor being greater than a number of emitted laser sources (Para 55, the camera 101 perceives depth information using time-gated exposures. In essence, time-gated cameras capture "slices" of the scene at different depths by selectively recording reflected light within specific time intervals, allowing for depth estimation based on the arrival time of photons.), and
Hicks fails to explicitly teach an intensity of any laser in the strongest set of lasers being greater than 1.5 times an intensity of any laser in a non-strongest set of lasers. However, Hicks in para 55 teaches that the camera perceives depth information using time-gated exposures … can detect relative positions of objects in space by segmenting the image and identifying which objects are in the foreground of other objects. Thus, to create a depth map, multiple images are captured with different gating delays and/or exposure times. By analyzing the intensity information in these gated images, a depth map of the scene can be reconstructed: objects that appear bright in a specific time-gated image are at a distance corresponding to that gating delay, and vice versa.
Claims 3, 6-7, 16 are rejected under 35 U.S.C. 103 as being unpatentable over Hicks et al. (US 20200018854 A1) in view of Steinberg et al. (US 20180113200 A1).
Regarding claim 3, Hicks fails to explicitly teach but Steinberg teaches the laser radar according to claim 1, wherein the laser transceiver has a nonuniform scanning step size (Para 650-651, dynamically adjusting a spatial/temporal resolution).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Hicks, in view of Steinberg, to include a program or algorithm that allows the controller to control the scanning rate of the scanner. Doing so would allow better resolution in an area (region of interest (ROI)) that is more important.
Regarding claim 6, Hicks teaches the laser radar according to claim 1, wherein the laser radar comprises a scan driver (Fig. 1, para 45, scanner 120) corresponding to the laser transceiver, and
the scan driver comprises: at least one of a reflection mirror (para 45 and para 56 “the polygon mirror 202 has six reflective surfaces 220A, 220B, . . . 220F”) (and a light transmission optics),
the at least one of the reflection mirror (and the light transmission optics) controls an emission direction of the laser corresponding to the laser transceiver (Fig. 2, para 56-57); and
a motor (para 56 “a polygon mirror 202 driven by a motor 204”),
Hicks fails to explicitly teach but Steinberg teaches the scan driver comprises: at least one of a reflection mirror and a light transmission optics (Para 256. See also, Figs. 1A, 2A-2B),
the at least one of the reflection mirror and the light transmission optics controls an emission direction of the laser corresponding to the laser transceiver (Para 256. See also, Figs. 1A, 2A-2B); and
the scan driver drives the laser transceiver to perform a random scanning operation without preset direction information of laser emission (Para 291 “deflector 114 may be continuously moved (e.g. in a sweeping pattern, in a raster pattern, randomly, pseudo-randomly) through a plurality of instantaneous positions”); and
wherein the motor drives at least one of the reflection mirror and the light transmission optics to move randomly within a predetermined angle range (Para 212, 363).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Hicks, in view of Steinberg, to have a scanner with light transmission optics and a driver that randomly drives the scanner. Doing so would allow the system to detect distances in various directions.
Regarding claim 7, Hicks, in view of Steinberg, teaches the laser radar according to claim 6, wherein the scan driver drives the laser transceiver to move randomly within a predetermined angle range through an optical path control device (Steinberg, Para 291 “deflector 114 may be continuously moved (e.g. in a sweeping pattern, in a raster pattern, randomly, pseudo-randomly) through a plurality of instantaneous positions”), or
drives the laser transceiver to have a spatial angle change greater than 1.5 times a spatial angle change from a previous scan during at least one scan; and
the optical path control device comprises at least one of: an optical phased array (Steinberg, Para 256), a microelectromechanical system (Steinberg, Para 256), a liquid crystal photoconductive device, a reflective liquid crystal light valve or a transmissive liquid crystal light valve.
Regarding claim 16, Hicks fails to explicitly teach but Steinberg teaches the method according to claim 12, wherein measuring the distance information further comprises: acquiring the distance information through scanning by the laser transceiver, wherein, the scanning is spatial random scanning (Para 291-292).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Hicks, in view of Steinberg, to include a program or algorithm that allows the controller to control the scanning rate of the scanner. Doing so would allow better resolution in an area (region of interest (ROI)) that is more important.
Claims 4-5, 15 are rejected under 35 U.S.C. 103 as being unpatentable over Hicks et al. (US 20200018854 A1) in view of Donovan et al. (US 12153163 B2).
Regarding claim 4, Hicks fails to explicitly teach but Donovan teaches the laser radar according to claim 2, wherein a wavelength of laser light corresponding to each set of laser transceivers of the at least two sets of laser transceivers is different from a wavelength of laser light corresponding to other laser transceivers (Fig. 14, col 14: line 66 to col 15: line 1, the wavelengths of the two transmit/receive modules can be different); or
a modulation of laser light corresponding to each set of laser transceivers of the at least two sets of laser transceivers is different from a modulation of laser light corresponding to other laser transceivers.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Hicks, in view of Donovan, to have at least two sets of laser transceivers with different wavelengths. Doing so would allow detection of objects at different ranges (improved range, eye safety).
Regarding claim 5, Hicks, in view of Donovan, teaches the laser radar according to claim 4, wherein laser receivers of the each set of laser transceivers comprise filters that filter the laser light corresponding to the other laser transceivers (Donovan, col 11: lines 23-34, if two wavelengths are used, then the receiver can have two separate 2D arrays of detectors, one for each wavelength).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Hicks, in view of Donovan, to have each detector detect a specific wavelength (having a matching filter). Doing so would allow the system to more accurately determine the distance or identity of an object (improved accuracy).
Regarding claim 15, Hicks teaches the method according to claim 12, wherein the laser transceiver comprises at least two sets of laser transceivers (Para 62, machine vision 10 can have 4-10 lidar systems 100), and
Hicks fails to explicitly teach but Donovan teaches the method comprises: configuring a different laser wavelength or modulation for each set of laser transceivers (Fig. 14, col 14: line 66 to col 15: line 1, the wavelengths of the two transmit/receive modules can be different).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Hicks, in view of Donovan, to have at least two sets of laser transceivers with different wavelengths. Doing so would allow detection of objects at different ranges (improved range, eye safety).
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Hicks et al. (US 20200018854 A1) in view of Richmond Hicks (US 10345447 B1).
Regarding claim 9, Hicks fails to explicitly teach the laser radar according to claim 1, wherein a number of pixels output by the position sensor is less than half of a total number of pixels of the position sensor and greater than a number of pixels corresponding to the laser light reflected by the object in each measurement. However, Hicks 447 in figs. 1-3, 5-6, col 2: lines 49-54, col 4: lines 35-50, col 7: lines 43-58, teaches a dynamic vision sensor (DVS) camera that detects temporal contrast and generates an event in response (i.e., the camera responds to local changes in brightness instead of capturing static frames like conventional cameras; DVS cameras operate asynchronously, meaning individual pixels respond independently to changes in brightness without a fixed frame rate).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Hicks, in view of Hicks 447, to have a DVS camera because it is faster than a frame-capture camera and requires less processing, since it does not generate image frames that must then be analyzed (Hicks 447, col 3: lines 3-5), which reduces the burden of data processing and transmission.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEMPSON NOEL whose telephone number is (571) 272-3376. The examiner can normally be reached on Monday-Friday 8:00-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Yuqing Xiao can be reached on (571) 270-3603. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JEMPSON NOEL/Examiner, Art Unit 3645
/YUQING XIAO/Supervisory Patent Examiner, Art Unit 3645