DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Information Disclosure Statement
The information disclosure statement (IDS) submitted by the applicant on 01 March 2023 has been considered by the examiner and entered into the record.
Drawings
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference character(s) not mentioned in the description:
Reference character “51”, which appears in Figs. 1A, 11B, 11C, 12B and 12C
Reference character “62”, which appears in Figs. 11B, 11C, 12B and 12C
Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference character(s) in the description in compliance with 37 CFR 1.121(b) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Objections
Claims 1-3 and 6-11 are objected to because of the following informalities. The examiner notes that this is not an exhaustive list and requests the applicant's cooperation in verifying that all antecedent basis corrections are made.
Claim 1: "the orientation" (line 4), "the points of interception" (line 18), "comprising a distance" (line 22), "the time" (line 23), "the first iteration" (line 25), "determining a position" (line 28), "an iteration" (line 31).
Claim 2: "determining a position" (line 2), "an iteration" (line 5).
Claim 3: "determining a position" (line 2), "an iteration" (line 4).
Claim 6: "estimating a position" (line 2), "a distance" (line 4), "a perpendicular plane" (line 6), "a line" (line 7), "an estimated position" (line 16), "a distance" (line 18), ", , " (line 18).
Claim 7: "an identification pattern" (line 2).
Claim 8: "an estimated position" (line 8), "a distance" (line 10), "the estimated dimension" (line 10), "the perpendicular" (line 11).
Claim 9: "a LIDAR" (line 1), "a laser", "a probe laser" (line 3), "a movement system" (line 4), "a LIDAR" (line 6).
Claim 10: "a LIDAR" (line 1), "the group" (line 3).
Claim 11: "a LIDAR" (line 1), "an object" (line 3), "the necessary indications" (line 4).
Appropriate correction is required.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1 and 5-9 are rejected under 35 U.S.C. 102(a)(1) and (a)(2) as being anticipated by Blais (US 5216236 A).
Regarding claim 1, Blais anticipates a method of tracking objects based on the use of a LIDAR apparatus, the LIDAR apparatus comprising:
a laser source configured to emit a probe laser beam (Col. 4, lines 18-42; Fig. 5, light source (50) emits beam (51)),
and a system for moving the probe laser beam configured to modify the orientation of the probe laser beam (Col. 4, lines 18-42; Fig. 5, scanning mechanism (52) directs beam (53) into environment),
the method (Col. 1, lines 31-33) comprising the following steps:
A. identifying an object to track (Col. 2, line 49-Col. 3, line 30; Fig. 5 detects target (55)),
B. estimating a position of the object, the position of the object comprising a distance between the object and the LIDAR apparatus (Col. 1, line 64 -Col. 2, line 13),
C. tracking the object (Col. 1, line 64 -Col. 2, line 9),
Step C of tracking the object comprising the sub-steps of:
C1. determining a tracking pattern, of the parametric curve type (Col. 1, line 64 - Col. 2, line 9, where tracking pattern is scanned in a Lissajous pattern), to pass along by the probe laser beam, the tracking pattern corresponding to a parametric curve with at least one angular parameter of the parametric curve of the tracking pattern relative to the LIDAR apparatus, which is determined from the estimated position of the object, including in particular the distance between the object and the LIDAR apparatus (Col. 2, line 56 - Col. 3, line 30, where an estimated position of intersection of beam and object yields information on distance to target and angular orientation of beam),
C2. moving the probe laser beam by a movement system, so as to move the probe laser beam along the tracking pattern determined at step C1 and identifying the points of interception of the object by the probe laser beam during the movement of the probe laser beam (Col. 2, line 56 - Col. 3, line 30; Fig. 1, 1A where an estimated position of intersection of beam and object yields information on distance to target and angular orientation of beam as it is scanned through pattern (L)),
C3. determining a position of the object from the points of interception of the probe laser beam by the identified object, the determined position comprising a distance between the object and the LIDAR apparatus (Col. 4, lines 20-27), wherein at the time of the implementation of tracking step C, steps C1 to C3 are reproduced successively and iteratively, the estimated position of the object used at step C1 being either, for the first iteration, the estimated position of the object obtained at step B, or, for an iteration n, n being an integer greater than or equal to 2, the position of the object determined at step C3 of the iteration n-1, wherein in sub-step C3 of determining a position of the object a direction of movement of the object is furthermore determined based on the estimated position used at sub-step C1 and on the position determined at sub-step C3 (Col. 4, lines 44-64, Col. 5, lines 26-36; where system utilizes repeated scans to cover an entire search area, and information from prior search may be used to adjust tracking modes of the target),
and wherein, at the time of the implementation of step C, for an iteration n, n being an integer greater than or equal to 2, in the sub-step C1 of determining the tracking pattern, at least one other parameter of the parametric curve of the tracking pattern is furthermore determined based on the estimated direction of movement of the object determined at step C3 of iteration n-1 (Col. 3, lines 3-45, Col. 6 line 58 - Col. 7, line 2; where motion of object is tracked and scanning pattern center may be adjusted to accommodate for object's motion between scans which adjusts center of scanning pattern location).
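For illustration only (this sketch is not part of the claims or of Blais), the iterative loop of sub-steps C1-C3 recited above can be modeled in a few lines: estimate a position, scan a parametric (here Lissajous) pattern centered on that estimate, keep the points that intercept the object, and re-estimate position and direction of movement from those points. The pattern parameterization, function names, and centroid-based update are assumptions chosen for the sketch, not taken from the record.

```python
import math

def lissajous_pattern(center, size, n_points=200, a=3, b=2):
    # Parametric Lissajous curve centered on the estimated object position.
    # 'size' stands in for the claimed angular parameter, which the claim
    # determines from the estimated distance between object and apparatus.
    return [
        (center[0] + size * math.sin(a * t),
         center[1] + size * math.sin(b * t))
        for t in (2 * math.pi * i / n_points for i in range(n_points))
    ]

def track(true_pos, est_pos, radius=0.5, iterations=3):
    # Sub-steps C1-C3, iterated: the estimate used at C1 is the step B
    # estimate on the first pass, then the C3 result of iteration n-1.
    direction = (0.0, 0.0)
    for _ in range(iterations):
        pattern = lissajous_pattern(est_pos, size=1.0)        # C1
        hits = [p for p in pattern                            # C2: interception
                if math.dist(p, true_pos) <= radius]          #     points
        if hits:                                              # C3: new position
            new_est = (sum(x for x, _ in hits) / len(hits),   #     (centroid)
                       sum(y for _, y in hits) / len(hits))
            # direction of movement from prior estimate to new position
            direction = (new_est[0] - est_pos[0], new_est[1] - est_pos[1])
            est_pos = new_est
    return est_pos, direction
```

Under these assumptions, each iteration pulls the estimate toward the object, and the direction vector is available to re-parameterize the next pattern, mirroring the claim's use of the iteration n-1 result at iteration n.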
Regarding claim 5, Blais anticipates the method of tracking objects according to claim 1, wherein
at the time of one of step A of identifying the object to track and of step B of estimating the position of the object, there is furthermore determined at least one estimated dimension of the object in a perpendicular plane containing the estimated position of the object and perpendicular to a line passing via the estimated position of the object and the position of the LIDAR apparatus,
and wherein, at the sub-step C1 of determining the tracking pattern, the at least one angular parameter of the tracking pattern is furthermore determined from the estimated dimension (Col. 6, line 20-Col. 7, line 55; Figs. 10, 11 where a calculated profile and/or shape of the target may be determined during tracking and interception points may influence where the pattern is directed, the size of the pattern, and the shape of the pattern).
Regarding claim 6, Blais anticipates the method of tracking objects according to claim 5, wherein step B of estimating a position of the object comprises the following sub-steps:
B1. obtaining a preliminary position of the object, the estimated preliminary position comprising a distance between the object and the LIDAR apparatus (Col. 2, line 56-Col. 3 line 8),
B2. determining an identification pattern to pass along by the probe laser beam along a perpendicular plane containing the estimated preliminary position of the object, and perpendicular to a line passing via the estimated preliminary position of the object and the position of the LIDAR apparatus, at least one angular parameter of the identification pattern being determined from the estimated preliminary distance between the LIDAR apparatus and the object and the estimated preliminary position of the object (Col. 5, lines 10-57, Col. 6, line 20-Col. 7, line 13; where object identification pattern includes information on intersection points, object range, and object estimated location),
B3. moving the probe laser beam by the movement system so as to move the probe laser beam along the identification pattern determined at step B2 and identifying the points of intersection between the object and the probe laser beam during the movement of the probe laser beam,
B4. determining an estimated position of the object from the points of interception of the probe laser beam by the identified object, the determined position comprising a distance between the object and the LIDAR apparatus, the estimated dimension of the object in the perpendicular plane also being determined from the points of interception of the probe laser beam by the identified object (Col. 5, lines 26-43, Col. 6, line 20-Col. 7, line 55; Figs. 10, 11 where a calculated profile and/or shape of the target may be determined during tracking and interception points may influence where the pattern is directed, the size of the pattern, and the shape of the pattern).
Regarding claim 7, Blais anticipates the tracking method according to claim 6, wherein
at step B2 of determining an identification pattern, the identification pattern corresponds to a parametric curve of a type other than that of the tracking pattern determined at step C1 (Col. 5, lines 10-57, Col. 6, line 20-Col. 7, line 13; where patterns required for object identification may require different ratios, shapes, or number of points than object detection scanning patterns).
Regarding claim 8, Blais anticipates the method of tracking objects according to claim 1, wherein step B of estimating a position of the object comprises the following sub-steps:
B'1. moving the probe laser beam by the movement system so as to carry out scanning of a region of space in which the object to track is estimated to be and identifying the intersection points between the object and the probe laser beam during the movement of the probe laser beam,
B'2. determining an estimated position of the object from the points of interception of the laser beam probe by the identified object, the determined position comprising a distance between the object and the LIDAR apparatus, the estimated dimension of the object in the perpendicular plane also being determined from the points of interception of the laser beam by the identified object (Col. 5, lines 26-43, Col. 6, line 20-Col. 7, line 55; Figs. 10, 11 where a calculated profile and/or shape of the target may be determined during tracking and interception points may influence where the pattern is directed, the size of the pattern, and the shape of the pattern).
Regarding claim 9, Blais anticipates a system for tracking objects from a LIDAR apparatus, the system comprising:
a laser source configured to emit a probe laser beam (Col. 4, lines 18-42; Fig. 5, light source (50) emits beam (51)),
a movement system for moving the probe laser beam configured to modify the orientation of the probe laser beam, the laser source and the movement system participating in forming a LIDAR apparatus (Col. 4, lines 18-42; Fig. 5, scanning mechanism (52) directs beam (53) into environment),
a control unit configured to control the movement system for moving the probe laser beam (Col. 4, lines 18-42; Fig. 5, microprocessor (43) controls system, which includes laser (50) and detector (60)),
wherein the control unit is furthermore configured for the implementation of at least step C) of the method of tracking according to claim 1 (Col. 6, lines 24-46).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2-4 and 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over Blais (US 5216236 A) in view of O’Keeffe (US 20180059248 A1).
Regarding claim 2, Blais teaches the method of tracking objects according to claim 1.
Blais does not explicitly teach determining an estimated speed of an object, and adjusting scanning parameters based on the object speed.
O’Keeffe teaches a LIDAR system which dynamically steers the system based on sensor data, wherein in sub-step C3 of determining a position of the object an estimated speed of movement of the object is furthermore determined based on the estimated position used at sub-step C1 and on the position determined at sub-step C3,
and wherein, at the time of the implementation of step C, for an iteration n, n being an integer greater than or equal to 2, in the sub-step C1 of determining the tracking pattern, the at least one other parameter of the tracking pattern is furthermore determined based on the estimated speed of movement of the object determined at step C3 of iteration n-1 ([0232] - [0237]; Fig. 36B where a "keep out region", which informs steering instructions, may be determined by the speed of the detected object which is determined by at least two sets of data, as based on change in position).
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Blais to incorporate the teachings of O’Keeffe to determine an object speed which informs scanning parameters, with a reasonable expectation of success. As O’Keeffe notes, the accuracy of object detection, which includes speed information, is related to the ability of the system to determine the object boundaries during scans ([0021]). Utilizing the object’s speed/velocity to modify scanning parameters in the system of Blais would have the predictable result of increasing the accuracy of the object tracking, as the system is able to compensate its scanning based on the motion of the object.
Regarding claim 3, Blais as modified above teaches the method of tracking objects according to claim 2.
Blais does not explicitly teach determining acceleration of the target object during detection.
O’Keeffe teaches a LIDAR system which dynamically steers the system based on sensor data, wherein in the sub-step C3 of determining a position of the object, an estimated acceleration of the object is furthermore determined,
wherein, at the time of implementation of step C, for an iteration n, n being an integer greater than or equal to 2, in sub-step C1 of determining the tracking pattern, the at least one other parameter of the tracking pattern is furthermore determined from the estimated acceleration ([0281]; Figs. 45C-D, where changes in velocity of object during successive scans may be detected and inform scanning parameters).
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Blais to incorporate the teachings of O’Keeffe to determine an object acceleration which informs scanning parameters, with a reasonable expectation of success. As O’Keeffe notes, the accuracy of object detection, which includes speed and change in speed information, is related to the ability of the system to determine the object boundaries during scans ([0021], [0260]). Utilizing the object’s speed/velocity and the changes of the velocity to modify scanning parameters in the system of Blais would have the predictable result of increasing the accuracy of the object tracking, as the system is able to compensate its scanning based on the motion of the object.
Regarding claim 4, Blais as modified above teaches the method of tracking objects according to claim 2, wherein
the at least one other parameter of the pattern comprises a pattern type selected from a group of predefined patterns each corresponding to a respective type of parametric curve, the pattern type being selected from said group of predefined patterns according to the estimated direction of movement and/or estimated speed of movement if the latter is available (Col. 5, lines 26-43, Col. 7, lines 3-13; pattern size and shape, such as various Lissajous and circular patterns, may be determined based on delta X, delta Y and therefore the movement of the tracked object).
Regarding claim 10, Blais teaches the system for tracking objects from a LIDAR apparatus according to claim 9.
Blais does not explicitly teach the system comprising at least one additional imaging apparatus, such as a camera or RADAR.
O’Keeffe teaches a LIDAR system which dynamically steers the system based on sensor data, where the system furthermore comprises at least one imaging apparatus selected from the group comprising optical cameras and radar apparatuses, and wherein the imaging apparatus is configured to implement at least step A) and to provide the control unit with the indications necessary for the control unit to be able to implement step B), the control unit being configured to implement step B) of the tracking method ([0251]; Fig. 9, where camera (910a) and/or radar (910d) may be used for event driven laser steering and to determine steering parameters (601c)).
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine a laser range finding system, operating as taught by Blais, with a system such as a camera or RADAR to determine an object’s preliminary location and then operate the laser range finding system, with a reasonable expectation of success. Blais notes that integration of laser scanning methods and cameras enables the two methods to benefit from one another (Col. 6, lines 3-20), and such integration of the systems of Blais and O’Keeffe would have the predictable result of using less resource-intensive systems, such as a camera, to determine where to direct a LIDAR scanning pattern.
Regarding claim 11, Blais teaches the system for tracking objects from a LIDAR apparatus according to claim 9.
Blais does not explicitly teach a user interface.
O’Keeffe teaches a LIDAR system which dynamically steers the system, where the system comprises a device for entering into communication with the control unit in which an observer having identified an object to track in accordance with step A) is able to provide the necessary indications for the control unit to implement step B), the control unit being configured to implement step B) of the tracking method ([0082], [0311]; Fig. 5, where user interface may be integrated into processing subassembly to dynamically steer laser assembly (505) for object tracking and detection).
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Blais to incorporate the teachings of O’Keeffe to utilize user inputs to direct the system to scan for object detection, with a reasonable expectation of success. Implementation of user interfaces in LIDAR systems, either to determine objects to track or specifics of scanning parameters, is well known in the art of object detection and tracking in systems which utilize RADAR, cameras, and/or LIDAR.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Hicks (US 10345447 B1) teaches a dynamic vision sensor camera, which directs light along a scan pattern to determine a region of interest and then adjusts a scan parameter based on information of the one or more objects detected.
Livingston (US 5936229 A) teaches a laser tracking and engagement system which steers a laser to scan and track an object, which utilizes a Lissajous scanning pattern to determine the object’s position and velocity vector.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Kara Richter whose telephone number is (571) 272-2763. The examiner can normally be reached Monday through Thursday, 8 AM - 5 PM EST; Fridays are variable.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Robert Hodge can be reached at (571) 272-2097. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/K.M.R./Examiner, Art Unit 3645
/ROBERT W HODGE/Supervisory Patent Examiner, Art Unit 3645