DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This action is in response to the amendments filed on 09/18/2025, in which claims 1-3, 5-8, 10-12, 14-17, and 19-20 are pending.
Response to Arguments
Applicant’s arguments with respect to claims 1-3, 5-8, 10-12, 14-17, and 19-20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3, 5, 10-12, 14-15, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Das et al., U.S. Patent Application Publication No. 2022/0214444 A1 (hereinafter Das), in view of Derbisz, U.S. Patent Application Publication No. 2022/0171975 A1, and further in view of Vozar et al., U.S. Patent Application Publication No. 2020/0026286 A1 (hereinafter Vozar).
Regarding claim 1, Das discloses a method (see at least Das Fig. 8)
performed by a free-space verification system for one or both of supporting and providing confidence in that a perception system of an Automated Driving System, ADS, of a vehicle detects presence of objects, the method comprising (see at least Das [0117]-[0118]: “As illustrated in FIG. 5B, environment synthesis is used to create map, the map is memorized for dynamic object identification and free space detection. The module 214 performs fused grid based environment mapping for surround view tracking system (the invention) incorporating aspects of: a) Confidence level of sensed perception data”; [0147]: “Surround view tracking enabled by invention disclosed may be well used for autonomous operation of vehicles it is implemented in for aspects such as valet parking, traffic jam pilot or highway pilot operation.”):
obtaining sensor data of vehicle surroundings with support from vehicle-mounted surrounding detecting sensors (see at least Das [0143]: “The method comprises, at step 802, receiving lidar data from one or more lidar sensors and radar data from one or more radar sensors and mapping the received lidar data and the received radar data in a grid, wherein each of the one or more lidar sensors and one or more radar sensors sense the one or more objects in a corresponding region”);
generating perception data of vehicle surroundings based on fusing the sensor data with support from the perception system (see at least Das [0144]: “The method further comprises, at step 806, fusing the one or more grid maps of the one or more lidar sensors and the one or more radar sensors by converting said one or more grid maps from sensor frame to vehicle frame to generate a fused grid map, wherein the fused grid map is integrated with any or a combination of track management and scan matching to perform classification of the one or more static or dynamic objects and identification of free space in the fused grid map.”);
determining that the perception system based on the perception data perceives at least a first zone in the vehicle surroundings, free from objects (see at least Das [0079]: “At block 174, based on grid fusion 176 integrated with track management, availability of free space is determined.”);
evaluating for one or more of the surrounding detecting sensors their respective obtained sensor data separately, to encounter potential sensor-specific detections in an at least first extended zone at least partly encompassing the at least first zone (see at least Das [0114]: “As shown in FIG. 5A, fused grid 504 is assimilation of grid maps developed by individual sensors i.e. two grid maps of radar sensors 524 and two grid maps of lidar sensors 522.”; [0090]: “Referring to FIG. 3, grid based 360 degree surround view system is established multiple lidar and radar sensors. In an example, lidar sensors with 180 degree beam angle are mounted in front of vehicle and at the rear of vehicle, whereas two radar sensors with 45 degrees beam angle are mounted in the sideways of vehicle.”),
and determining when respective potential sensor-specific detections within the at least first extended zone for one or both of a predeterminable number of and combination of the surrounding detecting sensors, comply with at least a first free-space verifying criterion, that the at least first zone is verified as object-free (see at least Das [0078]: “According to an embodiment, the system 100 integrates sensed signals from radar and lidar pre-processed data. The technique uses initialization of track based on post-processing of sensed signals from radar and lidar sensors, and identification and classification of target feature from cluster signals from lidar and radar sensed signals. The system 100 includes multi target track management and sensor data fusion comprising synchronization, track initialization, centralized track management, fused grid map and target classification in grid. Furthermore, the system 100 determines availability of free space.”; under the broadest reasonable interpretation, a free-space verifying criterion under which a zone is verified as object-free encompasses determining the availability of free space; Das discloses determining respective potential sensor-specific detections for a combination of the surrounding detecting sensors through sensor data fusion from radar and lidar).
Das fails to expressly disclose the free-space verifying criterion including a minimum number of detections not within a maximum proximity to signify a single object, or a minimum number of detections within a maximum proximity to signify a single object with a current position outside the first zone. However, Derbisz teaches
the at least first free-space verifying criterion comprising one or both of: existence in the at least first extended zone of at least a predeterminable minimum number of detections but which detections are not within a predeterminable maximum proximity signifying a single object (see at least Derbisz [0068]: “For example, the bounding boxes 41 of the passenger cars 21 as shown in FIG. 9 each comprise distance points 33 which are quite close to the center of the respective bounding boxes 41. In contrast, the bounding boxes 41 of the trucks 23 do not include distance data points 33 which are close to the center of the respective bounding box 41.”; Derbisz teaches detections are not within a predeterminable maximum proximity signifying a single object because the large distance of data points indicates the presence of a truck that is distinct from the passenger car);
and existence in the at least first extended zone of at least the predeterminable minimum number of detections out of which a predeterminable number of the detections are within the maximum proximity signifying a single object but which detections imply that a current position of the single object lies not within the at least first zone (see at least Derbisz [0072]: “Finally, the contour 31 of the free space 25 in front of the vehicle 11 is equally divided by a predetermined azimuth angle with respect to the vehicle 11, and each segment 55 of the contour 31 is classified by assigning the respective segment 55 to the respective classification of the center 53 of the bounding box 51 (see FIG. 10) if this segment comprises the center 53 of the bounding box 51. Classified segments 55 are shown in FIG. 11 which includes the same representation of the free space 25 as FIG. 10. That is, the segments 55 of the contour 31 are classified as in “passenger car” 21 which means that the free space 25 in front of the vehicle 11 is limited by a passenger car 21 (see FIG. 2) at the classified segments 55.”; [0053]: “FIG. 4 additionally depicts a contour 30 which extends continuously along the limitation 29 between the free space 25 and the non-drivable area 27 as also shown in FIG. 3.”; Derbisz teaches detections within the maximum proximity signifying a single object because the data points contained in a bounding box signify a single object; Derbisz teaches a current position of the object is not within a first zone because the object is not within the free space area).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to modify the method disclosed by Das with the criterion taught by Derbisz with reasonable expectation of success. Derbisz is directed towards the related field of determining free space for a vehicle. Therefore, one of ordinary skill in the art would be motivated to combine Das with the criterion taught by Derbisz to improve accuracy in determining the distance to an object and the type of object present in an environment (see at least Derbisz [0006]: “Accordingly, there is a need to have a method and a system which are able to determine an accurate distance to an object and a type of the object in an environment of a vehicle.”).
Das in view of Derbisz fails to expressly disclose evaluating sensor data of a sensor/modality-specific buffer ranging back a time period or number of samples. However, Vozar teaches
the evaluating comprising evaluating respective obtained sensor data of a sensor/modality-specific buffer, the sensor/modality-specific buffer ranging back one or both of a respective predeterminable time period and a respective predeterminable number of samples (see at least Vozar [0077]: “That is, in such embodiments, S221 may insert sensor data from each distinct sensor of the autonomous agent and/or the infrastructures devices into a dedicated memory section and/or independent memory sections of the decisioning data buffer and/or similar data buffer. In this way, S221 may function to track agents of each distinct sensor independently.”; [0067]: “S215, which includes building a data buffer comprising decisioning data, may function to arrange and/or store streams of data from one or more data sources within a data buffer, which may sometimes be referred to herein as a historical and/or global data buffer. The data buffer preferably functions to store historical data as well as presently collected data (e.g., real-time data or near real-time).”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to modify the method disclosed by Das in view of Derbisz with the data buffer taught by Vozar with reasonable expectation of success. Vozar is directed towards the related field of sensor data processing for autonomous vehicles. Therefore, one of ordinary skill in the art would be motivated to combine Das in view of Derbisz with Vozar to optimize vehicle behavioral policy decision making (see at least Vozar [0005]: “Thus, there is a need in the vehicle automation field for enabling a multi-perspective view of an operating environment of an autonomous agent that enables an optimal selection of behavioral policy by an autonomous agent in real-time operating conditions. The embodiments of the present application described herein provide technical solutions that address, at least, the need described above.”).
Regarding claim 2, Das in combination with Derbisz and Vozar teaches all elements of the method according to claim 1 as explained above. Das further teaches wherein the at least first free-space verifying criterion comprises:
existence in the at least first extended zone of fewer than the predeterminable minimum number of detections (see at least Das [0123]: “In an aspect, the integration module 218 along with the map fusion module 216 integrates the fused grid map with any or a combination of track management and scan matching to perform classification of the one or more objects into static objects or dynamic objects and identification of free space in the fused grid map.”; under the broadest reasonable interpretation, a predeterminable minimum number of detections includes one detection; therefore, a free-space verifying criterion comprising fewer than the predeterminable minimum number of detections includes determining free space based on zero objects detected).
Regarding claim 3, Das in combination with Derbisz and Vozar teaches all elements of the method according to claim 2 as explained above. Das further teaches,
wherein the determining that the at least first zone is verified as object-free comprises that the at least first zone otherwise is not verified as object-free (see at least Das [0134]: “In an embodiment, the fused map and track management integration module 218 reconstructs and maps one or more cluster points, resembling lidar point cloud data, on one or more data points obtained from radar data for mapping of the one or more objects on the fused grid to form complete surroundings around the host vehicle.”; [0076]: “At step 112, the processing unit 104 integrates the fused grid map with any or a combination of track management and scan matching for dynamic target classification 168 for classification of the one or more objects into static objects or dynamic objects and identification of free space in the fused grid map. The output of block 168 may be used for pedestrian classification in pedestrian point model 172.”; Das Fig. 7A shows a zone is not verified as object-free because the zone contains a pedestrian point cloud).
Regarding claim 5, Das in combination with Derbisz and Vozar teaches all elements of the method according to claim 1 as explained above. Das further teaches
wherein the determining that the at least first zone is verified as object-free comprises that the at least first zone otherwise is not verified as object-free (see at least Das [0134]: “In an embodiment, the fused map and track management integration module 218 reconstructs and maps one or more cluster points, resembling lidar point cloud data, on one or more data points obtained from radar data for mapping of the one or more objects on the fused grid to form complete surroundings around the host vehicle.”; [0076]: “At step 112, the processing unit 104 integrates the fused grid map with any or a combination of track management and scan matching for dynamic target classification 168 for classification of the one or more objects into static objects or dynamic objects and identification of free space in the fused grid map. The output of block 168 may be used for pedestrian classification in pedestrian point model 172.”; Das Fig. 7A shows a zone is not verified as object-free because the zone contains a pedestrian point cloud).
Regarding claim 10, this claim recites a system that performs the method of claim 1. Das in combination with Derbisz and Vozar also teaches a system for performing the method of claim 1 as outlined in the rejection of claim 1 above. Specifically, Das discloses a sensor data obtaining unit, a perception data generating unit, a free-space determining unit, an evaluating unit, and a verification determining unit (Das [0044], where the units are being interpreted as any general computing component that performs the specified function). Therefore, claim 10 is rejected under the same rationale as claim 1.
Regarding claim 11, this claim recites a system that performs the method of claim 2 as explained above. Therefore, claim 11 is rejected for the same rationale as claim 2.
Regarding claim 12, this claim recites a system that performs the method of claim 3 as explained above. Therefore, claim 12 is rejected for the same rationale as claim 3.
Regarding claim 14, Das in combination with Derbisz and Vozar teaches all elements of the system according to claim 11 as explained above. Das further teaches
wherein the free-space verification system is comprised in a vehicle (see at least Das [0057]: “An aspect of the present disclosure provides a system implemented in a vehicle for tracking of one or more objects to identify free space”).
Regarding claim 15, this claim recites a system that performs the method of claim 3 as explained above. Therefore, claim 15 is rejected for the same rationale as claim 3.
Regarding claim 19, this claim recites a system similar to the system recited in claim 14 as explained above. Therefore, claim 19 is rejected for the same rationale as claim 14.
Regarding claim 20, this claim recites a storage medium embodying the method of claim 1. Das in combination with Derbisz and Vozar also teaches a non-volatile computer readable storage medium storing a computer program containing computer program code, and a processor (Das [0083]). Therefore, claim 20 is rejected under the same rationale as claim 1.
Claims 6-8 and 16-17 are rejected under 35 U.S.C. 103 as being unpatentable over Das in view of Derbisz and Vozar, and further in view of Philbin et al., U.S. Patent No. 11,726,492 B2 (hereinafter Philbin).
Regarding claim 6, Das in combination with Derbisz and Vozar teaches all elements of the method according to claim 5 as explained above. Das in view of Derbisz and Vozar fails to expressly disclose adapting path planning of the vehicle when the first zone is not verified as object-free. However, Philbin teaches
providing, when the at least first zone is determined not verified as object-free, instructions to adapt path planning of the vehicle as if one or more objects are present within the at least first zone (see at least Philbin Col. 19, line 60-Col. 20, line 5: “In other words, operation 416 may comprise determining whether the trajectory will put the autonomous vehicle within the threshold distance of a portion of the environment predicted as being occupied at any of the respective times by the respective occupancy maps associated therewith. If, at operation 416, the collision avoidance system determines the trajectory is not associated with a position that is less than the threshold distance from an occupied portion (i.e., the positions are at a distance from any occupied portions that meets or exceeds the threshold distance), example process 400 may continue to operation 418; otherwise, example process 400 may continue to operation 420.”; Col. 20, lines 31-42: “For example, operation 420 may comprise causing the vehicle to execute a contingent trajectory, which may comprise transmitting instructions to a system controller to cause the vehicle to slow down, execute a safe-stop maneuver, hard-brake, and/or the like. In some examples, the planning component may determine the contingent trajectory and transmit the contingent trajectory to the collision avoidance component with the trajectory and/or upon request by the collision avoidance component. In an additional or alternate example, operation 420 may comprise transmitting a request to the planning component to determine a new trajectory.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to modify the method disclosed by Das in view of Derbisz and Vozar with the adaptive path planning taught by Philbin with reasonable expectation of success. Philbin is directed towards the related field of a collision avoidance perception system. Therefore, one of ordinary skill in the art would be motivated to combine Das in view of Derbisz and Vozar with the adapting path planning taught by Philbin to improve safety (see at least Philbin Col. 1, lines 12-23: “Safety of passengers in a vehicle and other people or objects in proximity to the vehicle is of the upmost importance. Such safety is often predicated on an accurate detection of a potential collision and timely deployment of a safety measure. To safely operate, an autonomous vehicle may include multiple sensors and various systems for detecting and tracking events surrounding the autonomous vehicle and may take these events into account when controlling the autonomous vehicle. For example, the autonomous vehicle may detect and track every object within a 360-degree view of a set of cameras, LIDAR sensors, radar, and/or the like to control the autonomous vehicle safely.”).
Regarding claim 7, Das in combination with Derbisz, Vozar, and Philbin teaches all elements of the method according to claim 6 as explained above. Philbin further teaches
wherein the providing instructions to adapt path planning further comprises providing instructions to actuate the adapted path planning (see at least Philbin Col. 20, lines 31-42: “For example, operation 420 may comprise causing the vehicle to execute a contingent trajectory, which may comprise transmitting instructions to a system controller to cause the vehicle to slow down, execute a safe-stop maneuver, hard-brake, and/or the like. In some examples, the planning component may determine the contingent trajectory and transmit the contingent trajectory to the collision avoidance component with the trajectory and/or upon request by the collision avoidance component. In an additional or alternate example, operation 420 may comprise transmitting a request to the planning component to determine a new trajectory.”).
Regarding claim 8, this claim recites a method similar to the method of claim 6 as explained above. Therefore, claim 8 is rejected for the same rationale as claim 6.
Regarding claim 16, this claim recites a system that performs the method of claim 6 as explained above. Therefore, claim 16 is rejected for the same rationale as claim 6.
Regarding claim 17, this claim recites a system that performs the method of claim 7 as explained above. Therefore, claim 17 is rejected for the same rationale as claim 7.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Kim, U.S. Patent Application Publication No. 2022/0080998 A1, directed towards storing vehicle position point data from various sensors into a buffer memory.
Kim, U.S. Patent Application Publication No. 2021/0179136 A1, directed towards multiple data input buffer streams corresponding to distinct vehicle sensors that are processed using an output buffer.
Applicant's amendment necessitated the new grounds of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ELIZABETH J SLOWIK whose telephone number is (571)270-5608. The examiner can normally be reached MON - FRI: 0900-1700.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ANISS CHAD can be reached on (571)270-3832. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ELIZABETH J SLOWIK/Examiner, Art Unit 3662
/ANISS CHAD/Supervisory Patent Examiner, Art Unit 3662