DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Election/Restrictions
Applicant’s election without traverse of claims 1-11 and 18-20 in the reply filed on 29 October 2025 is acknowledged.
Claims 12-17 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected invention, there being no allowable generic or linking claim. Election was made without traverse in the reply filed on 29 October 2025.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in the instant Application No. 18/241,515, filed on 1 September 2023.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 1 September 2023 was received and has been considered by the examiner.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-6, 9-10, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa et al. (US 20220383749 A1; hereafter, Ishikawa) in view of Feng (CN 115410176 A) in further view of Buerkle et al. (DE 102013223803 A1).
Regarding claim 1, Ishikawa discloses:
A dynamic obstacle tracking method performed by at least one processor ([0038] and Fig. 1, the system includes a processor, item 21), the dynamic obstacle tracking method comprising: acquiring, from an environment recognition sensor ([0051]-[0052] and Fig. 1, the system includes an external recognition sensor, item 25, which is understood as an environment recognition sensor), environmental data regarding surroundings of an unmanned vehicle ([0152] and Fig. 8, acquire environmental data);
generating an occupancy map, that is grid-based, by processing the environmental data ([0158] and Fig. 8, an occupancy grid map is generated based on the environmental data);
and finding dynamic obstacles by searching for the dynamic obstacles in an entirety of the occupancy map ([0161] and Fig. 8, movement for each object is estimated, which is understood as finding dynamic obstacles. As this is performed for each object, it is understood as doing so for each obstacle in the entirety of the occupancy map).
Ishikawa does not expressly disclose filtering out areas from the occupancy map that are occupied by objects that have a size greater than a first threshold, or refraining from searching the filtered-out areas for dynamic obstacles.
Feng discloses:
filtering out areas from the occupancy map that are occupied by objects that have a size greater than a first threshold value (pg. 9 para. 4, S306: if the height, which is understood as the size, of the obstacle is greater than a threshold, then the point cloud data of the obstacle is removed from the point cloud. This is understood as a filtering out of obstacles), among all of the objects obtained based on the object segmentation process (pg. 8 para. 6, within S302 the position, size, and orientation of the obstacle are determined, which is understood as a segmentation process);
searching for the dynamic obstacles except for the areas that are filtered out (pg. 9 para. last, the system determines the area which the foreground obstacle passes through which is understood as determining motion or a dynamic obstacle. If the target obstacle exceeds the size threshold, as above, "then the obstacle does not affect the traffic of the vehicle, removing the point cloud data corresponding to the target obstacle at the position, or removing the point cloud data of the target obstacle." This shows that large obstacles are determined not to affect the traffic of the vehicle and they are removed from consideration. Therefore, further consideration of obstacles, see pg. 10 para. 2 S307, considers all areas except those areas that are filtered out).
Ishikawa and Feng are combinable because they are from the same field of endeavor of obstacle detection in the field of autonomous vehicle operations (Ishikawa, [0007]; Feng, pg. 2 para. 4-7).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention to combine the filtering of Feng with the invention of Ishikawa.
The motivation for doing so would have been "to improve the accuracy of detecting the target obstacle around the vehicle, effectively reducing the obstacle error detection of the vehicle travel route planning or the influence of the control of the vehicle" (Feng, pg. 4 para. last).
Therefore, it would have been obvious to combine Feng with Ishikawa.
Ishikawa in view of Feng does not expressly disclose performing object segmentation on the occupancy map.
Buerkle discloses:
obtaining objects by performing an object segmentation process on the occupancy map (pg. 2 para. 11, “By associating one or more objects with the grid cells of an occupancy grid, the grid may be segmented and grid cells associated with an object further processed as a unit.” Therefore, objects in the occupancy map are segmented);
Buerkle is combinable with Ishikawa in view of Feng because they are from the same field of endeavor of obstacle detection in the field of autonomous vehicle operations (Buerkle, pg. 2 para. 6).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention to combine the occupancy map segmentation of Buerkle with the invention of Ishikawa in view of Feng.
The motivation for doing so would have been that “[t]his can create a better environment model for a driver assistance system” (Buerkle, pg. 2 para. 11).
Therefore, it would have been obvious to combine Buerkle with Ishikawa in view of Feng to obtain the invention as specified in claim 1.
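For orientation only, the sequence of steps recited in claim 1 and mapped above (grid-based map generation, object segmentation, size-threshold filtering, and searching only the remaining areas) can be sketched as follows. The grid dimensions, the cell-count size measure, the threshold, and all function names are illustrative assumptions by the editor; they are not drawn from the claims or the cited references.

```python
from collections import deque

def build_occupancy_map(points, width, height):
    """Mark grid cells occupied where sensor points fall (grid-based occupancy map)."""
    grid = [[0] * width for _ in range(height)]
    for x, y in points:
        if 0 <= x < width and 0 <= y < height:
            grid[y][x] = 1
    return grid

def segment_objects(grid):
    """Group occupied cells into objects via 4-connected flood fill (segmentation)."""
    height, width = len(grid), len(grid[0])
    seen = [[False] * width for _ in range(height)]
    objects = []
    for y in range(height):
        for x in range(width):
            if grid[y][x] and not seen[y][x]:
                cells, queue = [], deque([(x, y)])
                seen[y][x] = True
                while queue:
                    cx, cy = queue.popleft()
                    cells.append((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                        if 0 <= nx < width and 0 <= ny < height \
                                and grid[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((nx, ny))
                objects.append(cells)
    return objects

def search_area(grid, size_threshold):
    """Filter out objects larger than the first threshold; return the cells left to search."""
    searchable = {(x, y) for y, row in enumerate(grid)
                  for x, v in enumerate(row) if v}
    for cells in segment_objects(grid):
        if len(cells) > size_threshold:  # object exceeds first threshold: exclude its area
            searchable -= set(cells)
    return searchable
```

Under this sketch, a one-cell object survives a threshold of 2 cells while a four-cell object is excluded from the subsequent search.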
Regarding claim 2, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 1.
Ishikawa further discloses:
The dynamic obstacle tracking method of claim 1, wherein the performing the object segmentation process comprises performing the object segmentation process only on areas of the occupancy map that have an occupancy rate greater than a second threshold value ([0111] the grids in the occupancy map are marked to have an object present when the presence probability, which is understood as an occupancy rate, is greater than a threshold. As objects are recognized during the object segmentation of [0156], it is understood that objects are only segmented in areas where objects are present which is understood as areas which have a presence probability, i.e. occupancy rate, greater than a threshold).
Regarding claim 3, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 1.
Ishikawa further discloses:
The dynamic obstacle tracking method of claim 1, wherein the finding the dynamic obstacles comprises repeatedly performing particle generation, prediction, and update processes on the entirety of the occupancy map ([0161] and Fig. 8, movement estimation is performed by the "particle filter" method, which is understood as the process of repeatedly performing particle generation, prediction, and update processes. This is performed for each object, which is understood as the entirety of the occupancy map).
Ishikawa does not expressly disclose excluding the filtered-out areas from consideration.
Feng discloses:
except for the areas that are filtered out (pg. 9 para. 4-5, as shown above, filtered out areas are not considered).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention to combine the filtering of Feng with the invention of Ishikawa.
The motivation for doing so would have been "to improve the accuracy of detecting the target obstacle around the vehicle, effectively reducing the obstacle error detection of the vehicle travel route planning or the influence of the control of the vehicle" (Feng, pg. 4 para. last).
Therefore, it would have been obvious to combine Feng with Ishikawa to obtain the invention as specified in claim 3.
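As context for the "particle generation, prediction, and update" language addressed above, a minimal one-dimensional particle-filter iteration might look like the following. The motion model, noise level, weighting function, and resampling scheme are illustrative assumptions by the editor, not a characterization of Ishikawa's implementation.

```python
import random

def particle_filter_step(particles, observation, motion=1.0, noise=0.5):
    """One iteration of predict / update / resample over (position, weight) pairs."""
    # Predict: propagate each particle by the motion model plus process noise.
    predicted = [p + motion + random.gauss(0.0, noise) for p, _ in particles]
    # Update: re-weight particles by closeness to the new observation.
    weights = [1.0 / (1.0 + abs(p - observation)) for p in predicted]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample (generation): draw a fresh particle set in proportion to the weights.
    resampled = random.choices(predicted, weights=weights, k=len(particles))
    return [(p, 1.0 / len(particles)) for p in resampled]

def track(observations, n_particles=200, seed=0):
    """Repeatedly apply the step over a series of observations; return the estimate."""
    random.seed(seed)
    particles = [(observations[0], 1.0 / n_particles) for _ in range(n_particles)]
    for z in observations[1:]:
        particles = particle_filter_step(particles, z)
    return sum(p for p, _ in particles) / n_particles
```

The repeated loop over observations is what corresponds to "repeatedly performing" the three processes; per-object tracking would simply run one such filter per segmented object.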
Regarding claim 4, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 3.
Ishikawa further discloses:
The dynamic obstacle tracking method of claim 3, further comprising: displaying marks that indicate the dynamic obstacles that are found on the occupancy map (Fig. 12 and [0210], the dynamic obstacles 401 and 402 are marked by arrows A101 and A102).
Regarding claim 5, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 4.
Ishikawa further discloses:
The dynamic obstacle tracking method of claim 4, further comprising: displaying movement information corresponding to the marks (Fig. 12 and [0210], arrows indicating movement information are displayed in the image), the movement information including at least one from among a position, a moving direction, and a moving speed of each of the dynamic obstacles that are found (as the claim states "at least one from among" the listed items, the examiner is interpreting the scope of the claim as encompassing any one of the items, though more than one item may be present. Fig. 12 and [0210], the position of the dynamic objects 401 and 402 are shown and the movement directions are indicated by the arrows A101 and A102).
Regarding claim 6, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 5.
Ishikawa further discloses:
The dynamic obstacle tracking method of claim 5, further comprising: causing the unmanned vehicle to perform an avoidance maneuver ([0077], a plan to direct the vehicle is generated. [0080], the vehicle operates by following the planned path. This is understood as causing an unmanned vehicle to perform a maneuver) based on at least one of the dynamic obstacles that is found ([0115] the path is set to avoid mobile objects, i.e. dynamic obstacles), and based on the movement information ([0115] the path is set to avoid a mobile object even if the object moves. This is understood as a maneuver based on movement information).
Regarding claim 9, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 1.
Ishikawa further discloses:
The dynamic obstacle tracking method of claim 1, wherein the environment recognition sensor includes at least one from among a light detection and ranging (lidar) sensor, a visible light camera, a thermal imaging camera, and a laser sensor ([0053] and Fig. 1, the environment sensor includes at least one from among a LiDAR sensor, a camera, and an ultrasonic sensor).
Regarding claim 10, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 1.
Ishikawa further discloses:
The dynamic obstacle tracking method of claim 1, wherein the performing the object segmentation process comprises performing a semantic segmentation process on the occupancy map ([0156], the segmentation recognizes the type of object which is understood as semantic segmentation. See also [0073] the system may perform semantic segmentation) and recognizing types of the objects obtained by the semantic segmentation process ([0156], the segmentation recognizes the type of object).
Regarding claim 18, claim 18 recites a system with elements corresponding to the steps recited in claim 1. Therefore, the recited elements of this claim are mapped to the proposed combination of Ishikawa in view of Feng in further view of Buerkle in the same manner as the corresponding steps in its corresponding method claim, claim 1. Additionally, the rationale and motivation to combine Ishikawa in view of Feng in further view of Buerkle, presented in the rejection of claim 1, apply to this claim. Finally, Ishikawa discloses:
a system comprising: at least one processor ([0038] and Fig. 1, the system includes a processor, item 21);
and memory storing computer instructions ([0038] and Fig. 1, the system includes a recording unit, [0061], which is understood as a memory storing programs, i.e. instructions).
Regarding claim 19, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 18.
Claim 19 recites a system with elements corresponding to the steps recited in claim 2. Therefore, the recited elements of this claim are mapped to the proposed combination of Ishikawa in view of Feng in further view of Buerkle in the same manner as the corresponding steps in its corresponding method claim, claim 2.
Regarding claim 20, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 18.
Claim 20 recites a system with elements corresponding to the steps recited in claim 3. Therefore, the recited elements of this claim are mapped to the proposed combination of Ishikawa in view of Feng in further view of Buerkle in the same manner as the corresponding steps in its corresponding method claim, claim 3.
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa et al. (US 20220383749 A1; hereafter, Ishikawa) in view of Feng (CN 115410176 A) in further view of Buerkle et al. (DE 102013223803 A1) and of Fan et al. (CN 114812435 A; hereafter, Fan).
Regarding claim 7, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 1.
Ishikawa in view of Feng in further view of Buerkle does not expressly disclose setting the first threshold value to a value that is greater than a size of a human, a size of an animal, and a size of a vehicle.
Fan discloses:
The dynamic obstacle tracking method of claim 1, further comprising: setting the first threshold value to a value that is greater than a size of a human, a size of an animal, and a size of a vehicle (since there is only one first threshold, when considering the limitation "greater than a size of a human, a size of an animal, and a size of a vehicle", the examiner considers a threshold greater than the size of a vehicle to teach a size greater than a human and an animal as a person of ordinary skill in the art understands a vehicle to be larger than a human and an animal. pg. 8 para. 2, the threshold is greater than the height of the vehicle).
Fan is combinable with Ishikawa in view of Feng in further view of Buerkle because it is from the related field of endeavor of processing a three-dimensional point cloud around a vehicle (Fan, pg. 2 para. 6).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention to combine the threshold filtering of Fan with the invention of Ishikawa in view of Feng in further view of Buerkle.
The motivation for doing so would have been "filtering the surrounding each kind of noise, at this time for measuring the outline (long, width, high) is more accurate, and convenient for subsequent analysis of the real vehicle point cloud" (Fan, pg. 8 para. 3).
Therefore, it would have been obvious to combine Fan with Ishikawa in view of Feng in further view of Buerkle to obtain the invention as specified in claim 7.
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa et al. (US 20220383749 A1; hereafter, Ishikawa) in view of Feng (CN 115410176 A) in further view of Buerkle et al. (DE 102013223803 A1) and of Aizawa et al. (US 20240077874 A1; hereafter, Aizawa).
Regarding claim 8, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 1.
Ishikawa in view of Feng in further view of Buerkle does not expressly disclose displaying the occupancy map such that areas within the occupancy map that have a higher occupancy rate are displayed darker than other areas of the occupancy map.
Aizawa discloses:
The dynamic obstacle tracking method of claim 1, further comprising: displaying the occupancy map ([0055] the grid map is displayed on an operation panel), wherein the occupancy map is displayed such that areas within the occupancy map that have a higher occupancy rate than occupancy rates of other areas of the occupancy map are displayed darker than the other areas of the occupancy map ([0048] and Fig. 4, regions with objects, which are understood to have a higher occupancy rate, are defined as "1" while other spaces are "0". As shown in Fig. 4, the occupied areas with a value of "1" are darker than other areas).
Aizawa is combinable with Ishikawa in view of Feng in further view of Buerkle because it is from the related field of endeavor of detecting obstacles around a moving object (Aizawa, [0008]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention to combine the occupancy map display method of Aizawa with the invention of Ishikawa in view of Feng in further view of Buerkle.
The motivation for doing so would have been that "the moving object 100 generates a path so as to avoid these obstacles" (Aizawa, [0048]). In other words, the display method allows the obstacles to be detected such that the path can avoid the obstacles.
Therefore, it would have been obvious to combine Aizawa with Ishikawa in view of Feng in further view of Buerkle to obtain the invention as specified in claim 8.
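The darker-for-higher-occupancy display addressed above reduces to a simple monotone mapping from occupancy rate to gray level. The following sketch is the editor's illustration of that mapping; the 8-bit gray scale and the function names are assumptions, not drawn from Aizawa.

```python
def occupancy_to_gray(rate):
    """Map an occupancy rate in [0, 1] to an 8-bit gray level, darker when higher."""
    rate = min(max(rate, 0.0), 1.0)  # clamp out-of-range rates
    return round(255 * (1.0 - rate))  # rate 1.0 -> 0 (black), rate 0.0 -> 255 (white)

def render(grid_rates):
    """Render a grid of per-cell occupancy rates as rows of gray levels for display."""
    return [[occupancy_to_gray(r) for r in row] for r, row in ((row, row) for row in grid_rates)][0:len(grid_rates)] if False else \
           [[occupancy_to_gray(r) for r in row] for row in grid_rates]
```

For example, a fully occupied cell (rate 1.0) renders as gray level 0, an empty cell (rate 0.0) as 255, and intermediate rates fall monotonically between them.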
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa et al. (US 20220383749 A1; hereafter, Ishikawa) in view of Feng (CN 115410176 A) in further view of Buerkle et al. (DE 102013223803 A1) and of Bälter (US 20220055660 A1; hereafter, Balter).
Regarding claim 11, Ishikawa in view of Feng in further view of Buerkle discloses the subject matter of claim 10.
Ishikawa does not expressly disclose that the searching for the dynamic obstacles does not include the areas filtered out based on the first threshold value.
Feng discloses:
wherein the searching for the dynamic obstacles comprises searching for the dynamic obstacles in the entirety of the occupancy map except for the areas that are filtered out based on the first threshold value (pg. 9 para. last, the system determines the area which the foreground obstacle passes through, which is understood as determining motion or a dynamic obstacle. If the target obstacle exceeds the size threshold, as above, "then the obstacle does not affect the traffic of the vehicle, removing the point cloud data corresponding to the target obstacle at the position, or removing the point cloud data of the target obstacle." This shows that large obstacles are determined not to affect the traffic of the vehicle and they are removed from consideration. Therefore, further consideration of obstacles, see pg. 10 para. 2 S307, considers all areas except those areas that are filtered out).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention to combine the filtering of Feng with the invention of Ishikawa.
The motivation for doing so would have been "to improve the accuracy of detecting the target obstacle around the vehicle, effectively reducing the obstacle error detection of the vehicle travel route planning or the influence of the control of the vehicle" (Feng, pg. 4 para. last).
Therefore, it would have been obvious to combine Feng with Ishikawa.
Ishikawa in view of Feng in further view of Buerkle does not expressly disclose classifying the objects as dynamic or static based on the types of objects recognized, filtering out areas of the occupancy map that are occupied by objects classified as static obstacles, and excluding those filtered-out areas when searching for dynamic obstacles.
Balter discloses:
The dynamic obstacle tracking method of claim 10, further comprising: classifying the objects as the dynamic obstacles and static obstacles based on the types of the objects that are recognized ([0046] "the object recognition algorithm used to implement the method of detecting objects of a predetermined type will be executed by the stationary object detection component/module/unit 26", therefore, types of stationary objects are detected. [0048], moving or dynamic objects are also detected);
and additionally filtering out areas from the occupancy map that are occupied by objects that are classified as the static obstacles ([0051] stationary objects are removed),
and the areas that are filtered out based on being occupied by the objects that are classified as the static obstacles ([0097] "stationary sensed object detections can then be tagged as such or discarded from the sensor stream to reduce the computational burden on the data processing component 28 (e.g., for tracking moving objects)", which is understood as showing that searching for dynamic objects occurs except in areas filtered out as stationary objects).
Balter is combinable with Ishikawa in view of Feng in further view of Buerkle because it is from the same field of endeavor of determining stationary and dynamic objects surrounding a vehicle (Balter, [0006]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention to combine the static object filtering of Balter with the invention of Ishikawa in view of Feng in further view of Buerkle.
The motivation for doing so would have been "to reduce the computational burden on the data processing component 28 (e.g., for tracking moving objects)" (Balter, [0097]).
Therefore, it would have been obvious to combine Balter with Ishikawa in view of Feng in further view of Buerkle to obtain the invention as specified in claim 11.
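The type-based classification and additional static-area filtering addressed in the claim 11 rejection can be sketched as below. The type labels, dictionary layout, and function names are hypothetical choices by the editor for illustration; a semantic-segmentation stage of the kind mapped to claim 10 would be assumed to supply the type labels.

```python
# Hypothetical type labels; a semantic-segmentation stage would supply these.
STATIC_TYPES = {"building", "wall", "tree"}
DYNAMIC_TYPES = {"pedestrian", "vehicle", "animal"}

def classify(objects):
    """Split recognized objects into dynamic and static by their type label."""
    dynamic = [o for o in objects if o["type"] in DYNAMIC_TYPES]
    static = [o for o in objects if o["type"] in STATIC_TYPES]
    return dynamic, static

def searchable_cells(all_cells, objects):
    """Additionally remove cells occupied by static obstacles before the dynamic search."""
    _, static = classify(objects)
    blocked = set()
    for obj in static:
        blocked |= set(obj["cells"])  # cells tagged static are discarded from the search
    return set(all_cells) - blocked
```

This mirrors the cited rationale: tagging stationary detections and discarding them narrows the area the moving-object tracker must process.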
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US20200409387 A1, Tsurumi et al., discloses a system for generating an occupancy grid map around a moving object to determine positions of objects around the moving object, i.e. a vehicle.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOSHUA B CROCKETT whose telephone number is (571)270-7989. The examiner can normally be reached Monday-Thursday 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, John M Villecco can be reached at (571) 272-7319. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOSHUA B. CROCKETT/Examiner, Art Unit 2661
/JOHN VILLECCO/Supervisory Patent Examiner, Art Unit 2661