Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
The following is a non-final, first Office action in response to the communication filed 06/12/2023. Claims 1-20 are currently pending and have been examined.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 06/12/2023 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement has been considered by the examiner.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 14 and its dependent claims 15-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claims do not fall within at least one of the four categories of patent-eligible subject matter because claim 14 is directed to a non-volatile computer readable medium, which is not defined by applicant’s specification to exclude transitory embodiments. Appropriate correction is required. It is suggested that the claim be amended to recite a non-transitory computer readable medium.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-13 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by You et al. (Measurement, vol. 183, 2021, article 109817; doi: 10.1016/j.measurement.2021.109817; hereinafter You).
Regarding claim 1, You discloses:
An apparatus for testing a Light Detection and Ranging (LiDAR) unit configured to project an illumination beam (see at least Abstract and Section 1 describing a “LiDAR sensor alignment inspection system for automobile production”),
comprising:
a bottom frame configured to accept the LiDAR unit (see at least Fig. 9 showing the experimental test bench with the LiDAR sensor mounted on a motion stage for testing and calibration);
a screen movably coupled to the bottom frame, the screen having at least two positions (see at least Fig. 4 and Fig. 9 showing a target board positioned relative to the LiDAR sensor for alignment inspection, and Section 6 describing positional variation during testing); and
an imaging device coupled to the bottom frame and configured to observe the screen (see at least Fig. 3 and Section 4 describing photodetector arrays used to detect the LiDAR beam spot projected onto the target board).
Regarding claim 2, You discloses:
The screen and bottom frame arranged such that the illumination beam projected by the LiDAR unit creates an illumination area onto the screen (see at least Fig. 4 illustrating LiDAR beams projected onto the target board);
the screen movable along an expected pointing direction of the LiDAR unit (see at least Section 6 describing displacement experiments where relative positioning between the LiDAR sensor and target board is varied);
the at least two positions of the screen disposed along the expected pointing direction at different distances from the bottom frame (see Section 6.2.2 describing displacement tests varying the LiDAR position relative to the target board); and the imaging device arranged to observe the entire illumination area in each of the at least two positions of the screen (see Section 5 describing detection of beam positions across the target board using photodetectors).
Regarding claim 3, You discloses:
An image provided by the imaging device of the entire illumination area having a first resolution (see Section 5 describing acquisition of beam detection signals from photodetector arrays); and
the imaging device configured to selectively observe a portion of the illumination area at a second resolution that is higher than the first resolution (see Section 5.4 describing Gaussian fitting of photodetector signals to determine beam center with higher precision).
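For illustration only, and forming no part of the prior-art mapping: the Gaussian-fitting step cited above (locating a beam center from discrete detector samples with sub-sample precision) can be sketched with a standard three-point log-parabolic peak interpolation. All identifiers below are hypothetical and are not drawn from the You reference.

```python
import numpy as np

def gaussian_peak_center(x, intensity):
    """Estimate the center of a Gaussian-shaped signal sampled at discrete
    detector positions, with sub-sample precision, via three-point
    log-parabolic interpolation around the peak sample. Hypothetical
    illustration; not taken from the You reference."""
    i = int(np.argmax(intensity))
    if i == 0 or i == len(intensity) - 1:
        return x[i]  # peak at the array edge: no interpolation possible
    lm, l0, lp = np.log(intensity[i - 1 : i + 2])
    # Vertex of the parabola fitted to the three log-intensities; for a
    # true Gaussian the log-profile is exactly parabolic, so this is exact.
    delta = 0.5 * (lm - lp) / (lm - 2.0 * l0 + lp)
    return x[i] + delta * (x[i + 1] - x[i])

# Synthetic beam: Gaussian centered at 2.3 on detectors at positions 0..9
x = np.arange(10, dtype=float)
I = np.exp(-0.5 * ((x - 2.3) / 1.2) ** 2)
print(round(gaussian_peak_center(x, I), 3))  # → 2.3
```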
Regarding claim 4, You discloses:
The screen comprising a layer that receives the projected LiDAR beam (see Fig. 4 and Section 4 describing the planar target board receiving LiDAR beams);
the screen and bottom frame arranged such that the LiDAR unit projects an illumination beam that creates an illumination area on the screen (see Fig. 4 illustrating beam projection on the target board); and
the imaging device arranged to observe the screen (see Fig. 3 and Section 4 describing photodetector arrays detecting beam spots on the board).
Regarding claim 5, You discloses:
An actuator coupled between the bottom frame and the screen configured to move the screen between the at least two positions (see at least Fig. 9 showing a motion stage used to adjust the LiDAR sensor position relative to the target board during calibration experiments).
Regarding claim 6, You discloses:
A gripper or positioning mechanism configured to adjust at least one of a position and an orientation of the transmission lens relative to the base (see Section 6 describing the pose controller module capable of adjusting yaw orientation and position of the LiDAR sensor).
Regarding claim 7, You discloses:
A system for automatically aligning a LiDAR unit (see Abstract describing an “automatic LiDAR sensor alignment inspection system”);
A test apparatus comprising a bottom frame configured to accept the LiDAR unit (see Fig. 9 showing the LiDAR sensor mounted to the test bench);
A screen movable along an expected pointing direction of the LiDAR unit (see Fig. 4 showing the target board receiving LiDAR beams);
An actuator coupled between the bottom frame and the screen configured to move the screen between positions (see Fig. 9 motion stage);
An imaging device configured to observe the screen (see Fig. 3 photodetector arrays);
A processor communicatively coupled to the imaging device (see Section 5 describing algorithmic processing of photodetector signals); and
Memory containing instructions that cause the processor to receive a first image of the illumination area and a second image of the illumination area, determine an offset between the images, and calculate the pointing direction of the LiDAR unit (see Section 5.5 describing optimization algorithms used to estimate the LiDAR pose).
Regarding claim 8, You discloses:
Determining the offset between the first and second images by correlating measurements of beam positions (see Section 5 describing matching beam positions detected by photodetectors and LiDAR point cloud data);
Identifying a location of maximum correlation value (see Section 5.4 describing detection of the peak beam intensity corresponding to beam center); and
calculating the offset based on the identified location (see Section 5.5 describing pose estimation using detected beam positions).
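For illustration only, and forming no part of the prior-art mapping: the offset-determination steps recited in claim 8 (correlate two measurements, identify the location of maximum correlation value, calculate the offset from that location) follow a standard cross-correlation pattern, sketched below in one dimension with hypothetical names.

```python
import numpy as np

def shift_by_correlation(a, b):
    """Estimate the integer shift of signal b relative to signal a by
    locating the peak of their cross-correlation. Illustrative sketch of
    the claimed offset-determination step; names are hypothetical."""
    a = a - a.mean()
    b = b - b.mean()
    corr = np.correlate(b, a, mode="full")     # lags -(n-1) .. n-1
    lag = int(np.argmax(corr)) - (len(a) - 1)  # location of maximum correlation
    return lag

base = np.array([0.0, 0.0, 1.0, 3.0, 1.0, 0.0, 0.0, 0.0])
shifted = np.roll(base, 2)                     # same profile, moved 2 samples
print(shift_by_correlation(base, shifted))     # → 2
```

The same peak-of-correlation idea extends to two dimensions (e.g. via FFT-based phase correlation) when the measurements are images rather than line profiles.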
Regarding claim 9, You discloses:
a gripper or positioning mechanism configured to adjust the orientation of the LiDAR unit relative to the base (see Section 6 describing the pose controller module adjusting yaw orientation);
the processor determining whether the actual pointing direction is within a predetermined acceptable range (see Section 6 describing evaluation of alignment accuracy and error); and
causing the adjustment mechanism to correct misalignment if the pointing direction is outside the acceptable range (see Abstract describing alignment inspection and calibration in manufacturing environments).
Regarding claim 10, You discloses:
receiving an image of a first region-of-interest and a second region-of-interest displaced from the first region (see Section 5.1 describing segmentation of the target board region of interest from the LiDAR point cloud);
determining an intensity of each pixel or detection element in the ROI images (see Section 5.4 describing photodetector signal intensity measurements);
plotting cumulative distributions of intensity values (see Section 5 describing signal processing and beam intensity analysis);
calculating slopes of the cumulative curves (see signal processing steps described in Section 5);
transforming the slope curves into a frequency domain representation (see signal analysis steps used for beam characterization); and
determining a degree-of-freedom parameter from differences between ROI measurements (see Section 5.5 describing estimation of orientation parameters from beam measurements).
Regarding claim 11, You discloses:
the illumination area comprising X- and Y-axes (see Fig. 2 illustrating the coordinate system of the target board);
the second ROI displaced along one axis relative to the first ROI (see Section 5 describing beam detection positions along target coordinates); and
the calculated DOF parameter corresponding to rotation of the LiDAR sensor about an axis (see Section 5.5 describing estimation of orientation angles including roll, pitch, and yaw).
Regarding claim 12, You discloses:
receiving an image of a third ROI displaced along the X-axis from the first ROI (see Section 5 describing detection of beam positions across multiple photodetectors);
plotting cumulative intensity curves and slope curves (see Section 5 signal analysis);
transforming the slope curves to frequency domain curves (see signal processing steps); and
determining a second DOF parameter corresponding to rotation about another axis (see Section 5.5 solving multiple orientation angles).
Regarding claim 13, You discloses:
receiving an image of a fourth ROI displaced from the second ROI along the X-axis (see Section 5 describing detection of multiple beam positions across the board);
plotting cumulative intensity curves and slope curves for the fourth ROI (see Section 5 signal processing);
transforming the slope curve into a frequency domain curve (see signal analysis);
identifying a value of the frequency domain curve at a predetermined frequency (see signal processing steps); and
determining a third DOF parameter corresponding to rotation about a Z-axis perpendicular to the X- and Y-axes (see Section 5.5 solving roll, pitch, and yaw orientation parameters).
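For illustration only, and forming no part of the prior-art mapping: one possible reading of the processing pipeline recited in claims 10-13 (per-pixel intensities in a region of interest, cumulative distribution, slope of the cumulative curve, frequency-domain transform, value at a predetermined frequency) is sketched below. This is the examiner's hypothetical construction of the claim language, not a method disclosed by You; all identifiers are invented.

```python
import numpy as np

def roi_frequency_signature(roi, freq_bin=3):
    """Sketch of one reading of the claimed steps (hypothetical names):
    sum ROI pixel intensities into a profile, form the cumulative
    distribution, take its slope, transform to the frequency domain, and
    sample the magnitude at a predetermined frequency bin."""
    profile = roi.sum(axis=0)              # per-column intensity
    cumulative = np.cumsum(profile)        # cumulative intensity distribution
    slope = np.gradient(cumulative)        # slope of the cumulative curve
    spectrum = np.abs(np.fft.rfft(slope))  # frequency-domain representation
    return spectrum[freq_bin]              # value at the chosen frequency

# Two displaced regions of interest; a degree-of-freedom parameter would
# be derived from the difference between their signatures.
rng = np.random.default_rng(0)
roi_a = rng.random((16, 32))
roi_b = rng.random((16, 32))
dof_param = roi_frequency_signature(roi_a) - roi_frequency_signature(roi_b)

# Sanity check: a uniform ROI yields a flat slope curve, so its spectrum
# carries no energy outside the zero-frequency bin.
print(roi_frequency_signature(np.ones((8, 16))) < 1e-9)  # → True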
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Dominick J. Cabrera whose telephone number is (571) 317-1401. The examiner can normally be reached Monday - Friday, 8 AM - 5 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Vladimir Magloire can be reached at (571) 270-5144. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DOMINICK JACOB CABRERA/Examiner, Art Unit 3648
/VLADIMIR MAGLOIRE/Supervisory Patent Examiner, Art Unit 3648