DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Examiner’s Note
In the prior art relied upon in this office action, “LiDAR,” “LIDAR,” and “lidar” are all used to refer to light detection and ranging, and these spellings may be used interchangeably.
Response to Arguments
Applicant’s arguments filed 17 September 2025 have been fully considered. Claims 1-12 and 14 remain pending. Claim 13 has been canceled. Claims 1, 7, and 14 have been amended.
Applicant’s efforts to amend the claims to address rejections under 35 USC 112(b) have been considered and are satisfactory. However, new grounds for rejection have been raised in light of the amended claims. See the 112(b) rejections below.
Applicant’s efforts to amend the claims to address the rejections under 35 USC 101 have been considered and are satisfactory. In particular, adding language to independent claims 1 and 7 positively reciting a self-propelled vehicle with a LiDAR sensor is considered enough to integrate the judicial exceptions into the practical application of enabling a self-propelled vehicle to generate a map of its own surroundings.
Applicant’s arguments regarding the prior art rejections under 35 USC 103 have been considered. The applicant states that Yoon was relied on to suggest that it would have been obvious to remove dynamic points in a frame, then update its static points to a map. The applicant then argues that Yoon only teaches labelling points as dynamic, not removing them, and that Yoon does not update a static map. The examiner agrees that Yoon is concerned with detecting and labeling dynamic objects, and that Yoon does not teach removing them or updating a map. However, O’Brien is concerned with creating a spatial static map of an environment such as an indoor facility (¶61). Since environments often contain both static and dynamic objects, O’Brien would be interested in detecting dynamic objects as Yoon does, but for the purpose of excluding them from the static map. Once dynamic points are labeled, one of ordinary skill in the art practicing the invention of O’Brien in view of Yoon would be motivated and able to remove the dynamic points from the LiDAR scan and update a static map with the remaining LiDAR points. Additionally, O’Brien would be expected to update a map with new scans, since O’Brien is directed to generating an indoor map of an environment and it is reasonable to assume that such a task would take more than one scan.
The examiner agrees that Moras does not teach comparing a dynamic point in a current frame to points in each of N frames. Rather, Moras teaches comparing the current occupancy status of a cell to a map grid representing a function of observations over multiple previous scans. However, Moras was used because it teaches comparing data indicating a dynamic object with data from N previous scans. While this occurs at a cell level rather than point level, one of ordinary skill in the art would have been motivated by Moras to consider that a point labelled dynamic may in fact represent a static object, and that comparing that point to data from previous scans would be useful in order to make that determination.
However, new grounds of rejection have been raised in light of the amended claim language. See the 103 rejections below.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-12 and 14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 1 is rejected for the reasons given below.
Claim 1 recites “controlling the self-propelled vehicle to travel…” There is no antecedent basis for the self-propelled vehicle. The examiner assumes that the applicant intends to recite “controlling a self-propelled vehicle to travel…” and will use this interpretation for examination purposes.
Claim 1 also has been amended to recite “comparing each of the dynamic points labelled in the LiDAR frame sequentially with a plurality of points in each of N LiDAR frames generated before the time sequence to determine whether each of the N LiDAR frames comprises the dynamic point, wherein if it is determined that the N LiDAR frames comprise the dynamic point, the dynamic point is corrected as the static point.” This wording is confusing. One may interpret the above limitation as requiring that a given dynamic point be present in all N of the N LiDAR frames before the dynamic point is relabeled as a static point. However, the examiner believes the applicant’s intent is to relabel a dynamic point as a static point if it is observed in at least one of the N LiDAR frames. This interpretation is supported by ¶43 of the specification, which states: “If the labelled dynamic point once appeared in the previous N LiDAR frames, this means that it is likely that the labelled dynamic point is a static point. Therefore, in the embodiment, the dynamic point is corrected as a static point.” For examination purposes, then, it will be assumed that the above quote from claim 1 should be rewritten to read: “comparing each of the dynamic points labelled in the LiDAR frame sequentially with a plurality of points in each of N LiDAR frames generated before the time sequence to determine whether at least one of the N LiDAR frames comprises the dynamic point, wherein if it is determined that at least one of the N LiDAR frames comprises the dynamic point, the dynamic point is corrected as the static point” or something similar.
Claims 2-6 depend from claim 1, therefore they inherit the issues of claim 1 and are rejected for the same reasons.
Claim 7 recites the same language as claim 1 concerning comparing each of the dynamic points to N LiDAR frames, which is confusing for the same reasons and will be interpreted the same way for examination purposes.
Claims 8-12 and 14 depend from claim 7, therefore they inherit the issues of claim 7 and are rejected for the same reasons.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-12 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over O’Brien (US 20180094935 A1) in view of Yoon (“Mapless Online Detection of Dynamic Objects in 3D Lidar”).
Regarding claim 1, O’Brien teaches a method for spatial static map construction (¶61: “LIDAR systems and methods are employed by [a] drone to survey [an] indoor facility…The LIDAR system may be coupled to a drone tasked to survey and scan the indoor facility to generate a map of the interior portion of the facility indicating location of static objects”), adapted for an electronic device with a processor (¶61: the drone and a computational device which the drone may communicate with; ¶15: the drone may include a processing device; ¶21: computational device may be equated with device 420; ¶19: device 420 may comprise one or more processors), wherein the method comprises:
controlling a self-propelled vehicle to travel in a three-dimensional space (¶61: the drone travels an indoor space and generates a map) and scanning, during a traveling process of the self-propelled vehicle, the three-dimensional space with a LiDAR sensor configured on the self-propelled vehicle (¶61: “the LIDAR system may be coupled to a drone”) to generate a static map built according to the three-dimensional space (¶61: the drone may “scan the indoor facility to generate a map of the interior portion of the facility indicating location of static objects”).
O’Brien does not teach:
scanning a three-dimensional space with a LiDAR sensor to generate a LiDAR frame comprising a plurality of points in the three-dimensional space in a time sequence;
for each of the points in the LiDAR frame, finding a corresponding point closest to the point of the LiDAR frame from a static map built according to the three-dimensional space and calculating a distance between the point and the corresponding point, wherein the point is labelled as a dynamic point if the distance is greater than a predetermined threshold, and the point is labelled as a static point if the distance is not greater than the predetermined threshold;
comparing each of the dynamic points labelled in the LiDAR frame sequentially with a plurality of points in N LiDAR frames generated before the time sequence to determine whether at least one of the N LiDAR frames comprises the dynamic point, wherein if it is determined that at least one of the N LiDAR frames comprises the dynamic point, the dynamic point is corrected as the static point, where N is a positive integer; and
removing [each of the dynamic points] in the LiDAR frame and updating each of the static points in the LiDAR frame to the static map.
Yoon teaches:
scanning a three-dimensional space with a LiDAR sensor to generate a LiDAR frame (Fig. 2, query lidar scan) comprising a plurality of points in a three-dimensional space (p. 115, Section III “Methodology,” ¶1: a scan is a set of point measurements of a three-dimensional space around a lidar) in a time sequence (p. 115, Section III.A. “Odometry,” ¶1: measurements are associated with timestamps; also Fig. 2 query lidar scan occurs in a sequence of lidar scans); and
for each of the points in the LiDAR frame (p. 116, column 1, ¶1: query points q0), finding a corresponding point (p. 116, column 1, ¶1: nearest reference scan neighbor p0) closest to the point of another LiDAR frame (p. 115, column 2, step 2 “Pointcloud Comparison”; p. 116, column 1, ¶1: for q0 of the query scan, p0 in a reference scan is found) and calculating a distance between the point and the corresponding point (p. 115, col. 2, last ¶ – p. 116, column 1, ¶1: an error metric which is either a point-to-plane metric or a point-to-point metric is calculated for each q0 and p0; p. 116, column 1, ¶2: “the error metric [is computed] for all query points”), wherein the point is labelled as a dynamic point if the distance is greater than a predetermined threshold, and the point is labelled as a static point if the distance is not greater than the predetermined threshold (p. 116, column 1, ¶2: “We compute the error metric for all query points…Those greater than [an] error threshold are labelled dynamic, the rest are static.”).
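For illustration only, the nearest-neighbor labelling approach described above can be sketched as follows. This is a simplified sketch, not Yoon’s implementation: the point sets, the threshold value, and the brute-force nearest-neighbor search are assumptions chosen for illustration (in practice a spatial index such as a k-d tree would be used).

```python
import math

def label_points(frame, static_map, threshold):
    """Label each LiDAR-frame point 'dynamic' or 'static' by comparing its
    distance to the nearest static-map point against a threshold
    (illustrative sketch only)."""
    labels = []
    for q in frame:
        # Brute-force nearest neighbor in the static map.
        d = min(math.dist(q, p) for p in static_map)
        labels.append('dynamic' if d > threshold else 'static')
    return labels

# Toy example: one frame point near the map, one far from it.
static_map = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
frame = [(0.1, 0.0, 0.0), (5.0, 0.0, 0.0)]
print(label_points(frame, static_map, threshold=0.5))  # ['static', 'dynamic']
```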
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to incorporate the teachings of Yoon with the invention of O’Brien by:
scanning a three-dimensional space with the LiDAR sensor to generate a LiDAR frame comprising a plurality of points in the three-dimensional space in a time sequence;
for each of the points in the LiDAR frame, finding a corresponding point closest to the point of the LiDAR frame from the static map built according to the three-dimensional space and calculating a distance between the point and the corresponding point, wherein the point is labelled as a dynamic point if the distance is greater than a predetermined threshold, and the point is labelled as a static point if the distance is not greater than the predetermined threshold; and
removing [each of the dynamic points] in the LiDAR frame and updating each of the static points in the LiDAR frame to the static map.
Doing so would enable one to use the labelling method of Yoon to update the static map with subsequent LiDAR scans and filter out points considered not part of the static environment.
O’Brien in view of Yoon does not explicitly teach comparing each of the dynamic points labelled in the LiDAR frame sequentially with a plurality of points in N LiDAR frames generated before the time sequence to determine whether at least one of the N LiDAR frames comprises the dynamic point, wherein if it is determined that at least one of the N LiDAR frames comprises the dynamic point, the dynamic point is corrected as the static point, where N is a positive integer.
However, Yoon does teach that a dynamic point may be corrected as a static point (pg. 3, Section C. Freespace Check, ¶1: “We check all dynamic query points…against the freespace of another scan to correct mislabels from the pointcloud comparison”). Yoon also teaches that points can be erroneously labeled dynamic because they represent static objects that were previously occluded (pg. 3, Section C. Freespace Check, ¶1: “Recall that points are mislabeled dynamic because of viewpoint occlusions or they are new surface observations.” See also Fig. 3(a), where ground points on the upper right side were labelled dynamic because they were occluded by a moving car). Yoon corrects mislabeled dynamic points by comparing them to the freespace of a previous scan, where freespace represents the laser ray path between a LiDAR and the laser’s endpoint in the previous scan (pg. 3, Section C. Freespace Check, ¶2: “Given a query point and reference scan that defines the freespace of interest, we wish to determine if the query point is inside, on the border of, or outside freespace...The laser ray paths, from the sensor to their endpoints, define freespace.”). If a query point is inside a freespace, it is considered dynamic, while if it is outside or on the border of the freespace, it is changed to static (pg. 3, Section C. Freespace Check, ¶1; see also pg. 2, Section III. Methodology, “3) Freespace Check: Check dynamic points against freespace of another scan. Points not in freespace are not dynamic and changed to static.”).
If a query point were present in a previous scan, it would lie on the border of that scan’s freespace and would be considered static. One of ordinary skill in the art practicing the invention of O’Brien in view of Yoon would recognize that, if a previous LiDAR scan comprised a query point, the query point would lie on the border of the freespace of that previous scan and ought to be relabeled as a static point. This would be a simple way to correct an incorrectly labeled dynamic point. Furthermore, it would have been obvious to compare the query point to multiple points in order to conclude confidently whether the previous LiDAR frame comprises the query point.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to incorporate the teachings of Yoon with the invention of O’Brien in view of Yoon by comparing each of the dynamic points labelled in the LiDAR frame sequentially with a plurality of points in N LiDAR frames generated before the time sequence to determine whether at least one of the N LiDAR frames comprises the dynamic point, wherein if it is determined that at least one of the N LiDAR frames comprises the dynamic point, the dynamic point is corrected as the static point, where N is a positive integer. Doing so would enable one to simply determine whether a dynamic point should actually be considered static because it lies on the freespace of a previous scan.
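For illustration only, under the interpretation applied in the 112(b) discussion, the correction step amounts to relabeling a dynamic point as static if at least one of the N previous frames contains it. This is a simplified sketch, not taken from either reference: the tolerance value and the distance-based containment test are assumptions chosen for illustration.

```python
import math

def correct_dynamic_points(dynamic_points, previous_frames, tol):
    """Relabel a dynamic point as static if at least one of the N previous
    frames contains a point within `tol` of it (illustrative sketch of the
    interpreted claim limitation)."""
    corrected = {}
    for q in dynamic_points:
        # The point is "comprised" by a frame if some frame point is within tol.
        seen = any(
            any(math.dist(q, p) <= tol for p in frame)
            for frame in previous_frames
        )
        corrected[q] = 'static' if seen else 'dynamic'
    return corrected

# Toy example with N = 2 earlier frames; the query point matches frame 1.
prev = [[(0.0, 0.0, 0.0)], [(2.0, 0.0, 0.0)]]
print(correct_dynamic_points([(0.05, 0.0, 0.0)], prev, tol=0.1))
```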
Regarding claim 7, the limitations of claim 1 are included in claim 7 and rejected for the same reasons. Claim 7 also recites a processing device comprising a processor which is connected to the LiDAR sensor, and which is configured to control the LiDAR sensor and implement the method of claim 1. O’Brien discloses that the drone may include a processor (¶15: the drone may include a processing device). It would have been obvious for the drone of O’Brien to include the processing device comprising a processor in the manner described, in order to enable the drone to perform its functions autonomously.
Regarding claims 2 and 8, O’Brien in view of Yoon teaches the method of claim 1 and the system of claim 7. Yoon further teaches performing posture conversion on a generated LiDAR frame so that a coordinate axis of the converted LiDAR frame is consistent with a general coordinate frame (p. 116, column 1, ¶1: all points in the query scan are transformed to a world frame before calculating distances between query points and nearest reference scan neighbor points). While Yoon is silent on whether the world frame is also the coordinate frame of the reference scan, it would have been obvious to set the world frame as the coordinate frame of the reference scan so that the distances calculated between points in the query and reference scans represent distances between points in the coordinate frame where the three-dimensional space is at rest.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to incorporate the teachings of Yoon with the invention of O’Brien in view of Yoon by performing posture conversion on the generated LiDAR frame so that a coordinate axis of the converted LiDAR frame is consistent with a coordinate axis of the static map. Doing so would ensure distances calculated between points in the LiDAR frame and points in the static map represent distances between points in the coordinate frame where the three-dimensional space is at rest.
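For illustration only, the posture conversion discussed for claims 2 and 8 is a rigid-body transform of each frame point into the coordinate frame of the static map. This is a simplified sketch, not taken from either reference: the rotation and translation values are arbitrary assumptions chosen for illustration.

```python
import math

def transform_point(point, rotation, translation):
    """Apply a rigid-body transform (3x3 rotation matrix, then translation)
    to bring a LiDAR-frame point into the static map's coordinate frame
    (illustrative sketch of the posture conversion)."""
    x = sum(r * c for r, c in zip(rotation[0], point)) + translation[0]
    y = sum(r * c for r, c in zip(rotation[1], point)) + translation[1]
    z = sum(r * c for r, c in zip(rotation[2], point)) + translation[2]
    return (x, y, z)

# Arbitrary example pose: 90-degree yaw about z plus a 1 m shift along x.
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
R = [(c, -s, 0.0), (s, c, 0.0), (0.0, 0.0, 1.0)]
t = (1.0, 0.0, 0.0)
print(transform_point((1.0, 0.0, 0.0), R, t))  # approximately (1.0, 1.0, 0.0)
```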
Regarding claims 3 and 9, O’Brien in view of Yoon teaches the method of claim 2 and the system of claim 8. O’Brien further teaches the use of simultaneous localization and mapping (SLAM) technology (¶20: a navigation module 110 can implement SLAM “for location awareness and to assist in navigations”). Noting that SLAM is a method for performing posture conversion, among other things, it would have been obvious to one of ordinary skill in the art to perform the posture conversion on the LiDAR frame outlined in claims 2 and 8 through the SLAM technology taught in O’Brien.
Regarding claims 4 and 10, O’Brien in view of Yoon teaches the method of claim 1 and the system of claim 7. Claims 4 and 10 recite that, after the step of generating the LiDAR frame comprising the points in the three-dimensional space in the time sequence, the method further comprises/the processor: determining/determines whether the LiDAR frame is a first LiDAR frame generated by the LiDAR sensor; and directly updating/updates the LiDAR frame to the static map if the LiDAR frame is the first LiDAR frame and setting/sets a coordinate axis of the static map based on a coordinate axis of the LiDAR frame. Note that while claims 4 and 10 use different syntax (here the syntax of claim 4 is used), they recite the same limitations.
This language merely describes defining the initial static map and coordinate axis from the first LiDAR scan in a sequence of scans. Because O’Brien teaches generating a map of an indoor facility with a LiDAR sensor coupled to a drone (¶61), and because O’Brien would necessarily have to make a first scan to initiate the process, it would have been obvious for O’Brien to perform this step to define a static map and frame of reference where one is not previously present.
Regarding claims 5 and 11, O’Brien in view of Yoon teaches the method of claim 1 and the system of claim 7. O’Brien further teaches that a self-propelled vehicle travels in the three-dimensional space (¶15: the drone may include a processing device; ¶61: the drone surveys an indoor facility), and while traveling, continuously uses the LiDAR sensor to scan the three-dimensional space (¶61: the drone surveys the indoor facility with the LIDAR system “to generate a map of the interior portion of the facility indicating location of static objects”) to generate LiDAR frames in different time sequences (a LiDAR frame is not scanned instantaneously and different frames are scanned at different times, therefore the LiDAR frames are generated in different time sequences) and to update a static map (¶61: the survey is performed “to generate a map of the interior portion of the facility indicating location of static objects” which is a static map).
O’Brien does not explicitly teach stopping the LiDAR sensor when the three-dimensional space is completely scanned; however, it would have been obvious to do so because once the static map of the entire indoor facility is finished there is no longer a need to use the LiDAR sensor.
Regarding claims 6 and 12, O’Brien in view of Yoon teaches the method of claim 1 and the system of claim 7. Yoon further teaches calculating a Euclidean distance between the point of the LiDAR frame and the corresponding point of another frame (p. 116, column 1, ¶1: the point-to-point metric is the Euclidean norm). Yoon also teaches a method for calculating the Euclidean distance between the corresponding point of another frame and a plane comprising a point of the LiDAR frame when such a plane can be identified (p. 116, column 1, ¶1: the point-to-plane metric nq ⋅ (p0 − q0) gives the Euclidean distance between p0 and the surface containing q0, the surface being defined using points neighboring q0 in the LiDAR frame). This second calculation requires more operations and can only be performed for points of the LiDAR frame from which surface normals can be computed.
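For illustration only, the two metrics discussed above can be compared numerically. This is a simplified sketch, not Yoon’s implementation: the normal vector and the point coordinates are arbitrary assumptions chosen for illustration, and the normal is assumed to be a unit vector.

```python
import math

def point_to_point(p, q):
    """Euclidean norm ||p - q|| (the point-to-point metric)."""
    return math.dist(p, q)

def point_to_plane(n_q, p, q):
    """|n_q . (p - q)|: Euclidean distance from p to the plane through q
    with unit normal n_q (the point-to-plane metric)."""
    return abs(sum(a * (b - c) for a, b, c in zip(n_q, p, q)))

# Arbitrary illustrative values: a ground-plane normal and two points.
n_q = (0.0, 0.0, 1.0)
q0 = (0.0, 0.0, 0.0)
p0 = (3.0, 4.0, 1.0)
print(point_to_point(p0, q0))       # full 3D distance, sqrt(26)
print(point_to_plane(n_q, p0, q0))  # distance along the normal only, 1.0
```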
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to incorporate the teachings of Yoon with the invention of O’Brien in view of Yoon by setting the distance between the point of the LiDAR frame and the corresponding point of the static map to be the Euclidean distance. Doing so would be natural because the three-dimensional environment is Euclidean, and the Euclidean distance represents a natural way to define distance between points in Euclidean space.
Regarding claim 14, O’Brien in view of Yoon teaches the system of claim 7. O’Brien further teaches that the processor further controls the self-propelled vehicle to travel in the three-dimensional space (¶15: the drone may include a processing device; ¶61: the drone surveys an indoor facility), and during a traveling process of the self-propelled vehicle, continuously uses the LiDAR sensor to scan the three-dimensional space (¶61: the drone surveys the indoor facility with the LIDAR system “to generate a map of the interior portion of the facility indicating location of static objects”) to generate LiDAR frames in different time sequences (a LiDAR frame is not scanned instantaneously and different frames are scanned at different times, therefore the LiDAR frames are generated in different time sequences) and to update a static map (¶61: the survey is performed “to generate a map of the interior portion of the facility indicating location of static objects” which is a static map).
O’Brien does not explicitly teach that the LiDAR updates the static map until the self-propelled vehicle travels the entire three-dimensional space; however, since the vehicle is configured to generate a static map of the interior of a facility, it would have been obvious to do so in order to create a static map of the entire interior of the facility.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Arora (“Mapping the Static Parts of Dynamic Scenes from 3D LiDAR Point Clouds Exploiting Ground Segmentation”) teaches removing dynamic points from point clouds.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ETHAN WESLEY EDWARDS whose telephone number is (571)272-0266. The examiner can normally be reached Monday - Friday, 7:30am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Andrew Schechter can be reached at (571) 272-2302. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
ETHAN WESLEY EDWARDS
Examiner
Art Unit 2857
/E.W.E./Examiner, Art Unit 2857
/ANDREW SCHECHTER/Supervisory Patent Examiner, Art Unit 2857