DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This Office Action is in response to the application filed on July 31, 2024. Claims 1-6 are pending. Claim 1 is independent.
Priority
Receipt is acknowledged of certified copies of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.
Information Disclosure Statement
The information disclosure statements (IDSs) submitted on July 31, 2024 have been considered. The submission is in compliance with the provisions of 37 CFR 1.97. The Forms PTO-1449 are signed and attached hereto.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-5 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. Patent Publication 2018/0151077 to Lee.
Claims 1-5 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Lee.
With respect to independent claim 1, Lee discloses a recognition sensor that acquires information about a position and a velocity of the target (see paragraphs [0039] – [0044] and [0048]: The collision preventing apparatus of the first vehicle 100 receives sensor data generated by the sensor. The collision preventing apparatus tracks, in real-time, a location of the person 110 and a location of the second vehicle 120 based on the received sensor data. The collision preventing apparatus estimates a trajectory of each of neighboring objects to estimate a collision between the neighboring objects. The estimated trajectory indicates a path that the corresponding object may follow in the future. In an example, the collision preventing apparatus verifies whether sections or all of the estimated trajectories overlap or intersect by comparing the estimated trajectories. If the estimated trajectory 111 and the estimated trajectory 121 cross each other, the collision preventing apparatus may estimate that the person 110 and the second vehicle 120 are to collide with each other. In an example, the collision preventing apparatus estimates whether objects may collide based on a braking distance of each of the objects and a probability that each of the objects is to follow the estimated trajectory. The collision preventing apparatus is included in a vehicle 200 and estimates whether objects located in a vicinity of the vehicle 200 may collide.);
a first collision estimation unit and a second collision estimation unit that output an estimation result regarding a collision between the mobile object and the target using the information acquired by the recognition sensor as an input and have different characteristics, the first collision estimation unit having higher responsiveness than the second collision estimation unit, the second collision estimation unit having higher noise resistance than the first collision estimation unit (see paragraphs [0015], [0051], [0058] and [0060]: there is provided collision preventing apparatus including a processor configured to track objects located in a vicinity of a vehicle based on data collected from a sensor in the vehicle, determine a trajectory of each of the objects, verify a collision level of each of the objects based on the determined trajectory, and perform a collision prevention operation based on the collision level. The collision level is determined based on a degree to which estimated trajectories overlap. When the estimated trajectories overlap, in an example, the collision level is determined based on a time to collision (TTC) or a braking distance of each of the objects. The collision level may be determined based on a probability that the objects collide with each other. The collision preventing apparatus determines a probability that each of the objects is to follow the estimated trajectory. In an example, the collision preventing apparatus determines the estimated trajectory and the probability that each of the objects is to follow the estimated trajectory for each one of the objects. The probability that each of the objects is to follow the estimated trajectory may be determined based on the speed of each of the objects and the information on the road on which the objects are located. 
When the estimated trajectories overlap, in an example, the collision preventing apparatus determines a probability that the objects may collide with each other based on the probability that each of the objects will continue to follow the estimated trajectory. The probability that the objects may collide with each other may be determined based on the probability that all objects corresponding to the respective overlapping estimated trajectories are to follow the overlapping estimated trajectories.);
a first collision risk level calculation unit that calculates a first collision risk level based on a first estimation result output by the first collision estimation unit (see paragraph [0051]: The collision preventing apparatus determines an estimated trajectory of the vehicle 200. In an example, the collision preventing apparatus verifies a collision level of each of the objects based on the determined estimated trajectory of each of the objects. The collision level is determined based on a degree to which estimated trajectories overlap. When the estimated trajectories overlap, in an example, the collision level is determined based on a time to collision (TTC) or a braking distance of each of the objects. The collision level may be determined based on a probability that the objects collide with each other.);
a second collision risk level calculation unit that calculates a second collision risk level based on a second estimation result output by the second collision estimation unit (see paragraph [0058]: the collision preventing apparatus determines a probability that each of the objects is to follow the estimated trajectory. In an example, the collision preventing apparatus determines the estimated trajectory and the probability that each of the objects is to follow the estimated trajectory for each one of the objects. The probability that each of the objects is to follow the estimated trajectory may be determined based on the speed of each of the objects and the information on the road on which the objects are located.); and
a collision estimation arbitration unit that selects either the first estimation result or the second estimation result (see abstract, paragraphs [0047] and [0063]: differently perform an operation of informing the objects of the collision risk based on a collision level of each of the objects. The collision preventing apparatus changes a path of the first vehicle 100 based on the collision risk between the objects in the vicinity of the first vehicle 100. The collision preventing apparatus may control the path of the first vehicle 100 such that the first vehicle 100 avoids a point at which the person 110 and the second vehicle 120 may collide with each other. To control the path of the first vehicle 100, the collision preventing apparatus may be connected to another driver assistance system or an autonomous driving apparatus included in the first vehicle 100. The collision preventing apparatus may attempt to prevent a secondary accident caused by the first vehicle 100 by controlling the path of the first vehicle 100 to avoid the point at which the person 110 and the second vehicle 120 may collide with each other. When the probability that the objects may collide with each other is greater than or equal to 0.6 and less than or equal to 0.8, the collision preventing apparatus may warn each of the objects of the collision risk. When the probability that the objects may collide with each other is greater than or equal to 0.8, the collision preventing apparatus may warn each of the objects of the collision risk and change the path of the vehicle based on the point at which the objects are estimated to collide with each other.).
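Lee's cited threshold scheme (paragraph [0063]) maps an estimated collision probability to a graduated response. A minimal sketch of that mapping, assuming hypothetical names (`collision_response` and the action labels are illustrative, not from the reference):

```python
# Hedged sketch of Lee's probability-threshold response (paragraph [0063]).
# Function and return labels are illustrative assumptions, not names from
# the reference.

def collision_response(p_collision: float) -> str:
    """Map an estimated collision probability to a prevention action."""
    if p_collision >= 0.8:
        # Warn each object AND change the vehicle's path based on the
        # point at which the objects are estimated to collide.
        return "warn_and_change_path"
    if p_collision >= 0.6:
        # Probability in [0.6, 0.8): warn each object of the collision risk.
        return "warn_only"
    return "no_action"
```

The cited passage literally describes the 0.6–0.8 band as "greater than or equal to 0.6 and less than or equal to 0.8"; the sketch treats 0.8 as belonging to the stronger response, which is one reasonable reading of the overlapping boundary.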
With respect to dependent claim 2, Lee discloses wherein the first estimation result and the second estimation result include at least one of a time until a collision (see paragraph [0051]: The collision level is determined based on a degree to which estimated trajectories overlap. When the estimated trajectories overlap, in an example, the collision level is determined based on a time to collision (TTC) or a braking distance of each of the objects. The collision level may be determined based on a probability that the objects collide with each other.),
an overlap ratio that is a ratio at which a mobile object range and a target range overlap at a collision prediction point, and a relative velocity in a front-rear direction at the collision prediction point (see paragraphs [0051], [0057] and [0059]: The collision level is determined based on a degree to which estimated trajectories overlap. When the estimated trajectories overlap, in an example, the collision level is determined based on a time to collision (TTC) or a braking distance of each of the objects. The collision preventing apparatus determines a trajectory of each of the objects. In an example, the trajectory of each of the objects is estimated based on the collected temporal data. The collision level may be determined based on a probability that the objects collide with each other. The collision preventing apparatus estimates whether the objects may collide with each other. The collision preventing apparatus may verify a collision level of each of the objects based on the determined estimated trajectory. The collision preventing apparatus may verify the collision level based on whether the estimated trajectories overlap with each other.),
wherein the first collision risk level calculation unit and the second collision risk level calculation unit calculate the first collision risk level and the second collision risk level from at least one of a margin to a collision, accuracy of a collision prediction with the target, and magnitude of damage (see paragraphs [0051], [0053], [0058] and [0060]: the collision preventing apparatus verifies a collision level of each of the objects based on the determined estimated trajectory of each of the objects. The collision level is determined based on a degree to which estimated trajectories overlap. When the estimated trajectories overlap, in an example, the collision level is determined based on a time to collision (TTC) or a braking distance of each of the objects. The collision level may be determined based on a probability that the objects collide with each other. The collision preventing apparatus determines the estimated trajectory and the probability that each of the objects is to follow the estimated trajectory for each one of the objects. The probability that each of the objects is to follow the estimated trajectory may be determined based on the speed of each of the objects and the information on the road on which the objects are located. The collision preventing apparatus determines a probability that the objects may collide with each other based on the probability that each of the objects will continue to follow the estimated trajectory. The probability that the objects may collide with each other may be determined based on the probability that all objects corresponding to the respective overlapping estimated trajectories are to follow the overlapping estimated trajectories. 
The collision preventing apparatus estimates that the objects located in the vicinity of the vehicle 200 may collide and generates a path of the vehicle 200 considering the estimated collision between the objects located in the vicinity of the vehicle 200 when the probability of the objects colliding with each other is relatively great.), and
wherein the collision estimation arbitration unit selects either the first estimation result or the second estimation result based on the first estimation result, the second estimation result, the first collision risk level, and the second collision risk level (see abstract and paragraphs [0013], [0014], [0051] and [0059]: differently perform an operation of informing the objects of the collision risk based on a collision level of each of the objects. The performing of the collision prevention operation may include any one or any combination of changing a path of the vehicle based on the collision level or informing an object of the objects of a collision risk. The changing of the path of the vehicle may include changing the path of the vehicle based on a point at which trajectories of at least two objects of the objects intersect. The collision preventing apparatus verifies a collision level of each of the objects based on the determined estimated trajectory of each of the objects. The collision level is determined based on a degree to which estimated trajectories overlap. When the estimated trajectories overlap, in an example, the collision level is determined based on a time to collision (TTC) or a braking distance of each of the objects. The collision level may be determined based on a probability that the objects collide with each other.).
With respect to dependent claim 3, Lee discloses wherein the collision estimation arbitration unit determines that the mobile object and the target do not collide with each other when either the first estimation result or the second estimation result indicates no collision (see paragraphs [0046] and [0103]: the collision preventing apparatus verifies whether the person 110 and the second vehicle 120 detect each other based on the field of view 112 and the field of view 122. When it is determined that the person 110 and the second vehicle 120 are able to detect each other, the collision preventing apparatus may not inform the person 110 or the second vehicle 120 of the collision risk although the estimated trajectory 111 and the estimated trajectory 121 cross each other. When at least one of the objects changes its path or is not following the estimated trajectory, the estimated trajectories may no longer overlap. When the estimated trajectories no longer overlap, the collision preventing apparatus may stop the sounding of the horn. When each of the objects do not change their path and continue to follow the estimated trajectory), and
selects an estimation result corresponding to a low collision risk level by comparing the first collision risk level with the second collision risk level when both of the first estimation result and the second estimation result indicate a collision (see paragraphs [0051] and [0062]: the collision preventing apparatus verifies a collision level of each of the objects based on the determined estimated trajectory of each of the objects. The collision level is determined based on a degree to which estimated trajectories overlap. When the estimated trajectories overlap, in an example, the collision level is determined based on a time to collision (TTC) or a braking distance of each of the objects. The collision level may be determined based on a probability that the objects collide with each other. The collision preventing apparatus proposes a response or responds. The collision preventing apparatus may perform an operation based on the collision level. In an example, the collision preventing apparatus informs each of the objects of a collision risk or changes a path of the vehicle.).
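The arbitration rule recited in claim 3 can be summarized in a short sketch (this illustrates the claim language only, not code from the application or from Lee; the `Estimate` type and `arbitrate` name are assumptions): if either estimator reports no collision, the arbiter reports no collision; if both report a collision, it selects the result with the lower collision risk level.

```python
# Illustrative sketch of the arbitration rule of claim 3.
# The Estimate type and arbitrate() name are hypothetical.
from __future__ import annotations
from typing import NamedTuple, Optional

class Estimate(NamedTuple):
    collision: bool   # does this estimator predict a collision?
    risk: float       # the associated collision risk level

def arbitrate(first: Estimate, second: Estimate) -> Optional[Estimate]:
    # If either estimation result indicates no collision, determine
    # that the mobile object and the target do not collide.
    if not first.collision or not second.collision:
        return None
    # Both indicate a collision: select the estimation result
    # corresponding to the lower collision risk level.
    return first if first.risk <= second.risk else second
```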
With respect to dependent claim 4, Lee discloses wherein the collision estimation arbitration unit further selects the first estimation result or the second estimation result based on outputs by the recognition sensor and a mobile object state sensor that acquires a state of the mobile object (see paragraphs [0049] - [0051] and [0056]: The collision preventing apparatus may be connected to at least one sensor included in the vehicle 200. Referring to FIG. 2, in an example, the vehicle 200 includes a camera 210, a radar 220, and a global positioning system (GPS) 230. The collision preventing apparatus identifies a moving object located in a vicinity of the vehicle 200 based on sensor data collected from at least one sensor. The collision preventing apparatus may detect a location of the vehicle 200 in real-time using the GPS 230. The collision preventing apparatus may detect a path, such as a road, on which the vehicle 200 is located based on the location of the vehicle 200. The collision preventing apparatus may be connected to a map database or include the map database to detect the road on which the vehicle 200 is located. The collision preventing apparatus estimates a movement of the vehicle 200 and movements of the objects in the vicinity of the vehicle 200. In an example, the collision preventing apparatus determines an estimated trajectory of each of the objects located in the vicinity of the vehicle 200.).
With respect to dependent claim 5, Lee discloses wherein the collision estimation arbitration unit arbitrates the first estimation result and the second estimation result according to a weight of collision prediction accuracy calculated based on outputs by the recognition sensor and the mobile object state sensor (see paragraphs [0050], [0051], [0058] and [0060]: The collision preventing apparatus identifies a moving object located in a vicinity of the vehicle 200 based on sensor data collected from at least one sensor. The collision level is determined based on a degree to which estimated trajectories overlap. When the estimated trajectories overlap, in an example, the collision level is determined based on a time to collision (TTC) or a braking distance of each of the objects. The collision level may be determined based on a probability that the objects collide with each other.
The collision preventing apparatus determines the estimated trajectory and the probability that each of the objects is to follow the estimated trajectory for each one of the objects. The probability that each of the objects is to follow the estimated trajectory may be determined based on the speed of each of the objects and the information on the road on which the objects are located. When the estimated trajectories overlap, in an example, the collision preventing apparatus determines a probability that the objects may collide with each other based on the probability that each of the objects will continue to follow the estimated trajectory. The probability that the objects may collide with each other may be determined based on the probability that all objects corresponding to the respective overlapping estimated trajectories are to follow the overlapping estimated trajectories.).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Lee in view of U.S. Patent Publication 2021/0245743 to Jiang et al. (hereinafter “Jiang”).
With respect to dependent claim 6, Lee discloses the collision preventing apparatus verifies a collision level of each of the objects based on the determined estimated trajectory of each of the objects. The collision level is determined based on a degree to which estimated trajectories overlap. When the estimated trajectories overlap, in an example, the collision level is determined based on a time to collision (TTC) or a braking distance of each of the objects. The collision level may be determined based on a probability that the objects collide with each other. (See paragraph [0051]). Lee does not explicitly disclose confidence-based adjustment of collision determination using predicted object behavior under degraded sensing conditions.
Jiang discloses wherein the collision estimation arbitration unit increases a weight of collision prediction accuracy as a sensing environment obtained from an output by the recognition sensor or a travel environment obtained from an output by the mobile object state sensor improves (see paragraphs [0015] and [0030]: the processor identifies the one or more of the one or more information sources that provide the most relevant and reliable information about the specified area corresponding with the collision potential scenario according to real time conditions. Sensor fusion combines the information from multiple sensors. For example, a false alarm may be avoided by establishing that all sensors or a majority of sensors must detect an object or collision potential before such a detection is accepted.),
when either the first estimation result or the second estimation result indicates no collision, determines that the mobile object and the target do not collide with each other (see paragraph [0013]: Each collision potential scenario defines a risk of a collision between the vehicle and an object in a specified area.), and
when both of the first estimation result and the second estimation result indicate a collision, arbitrates the first estimation result and the second estimation result such that as the sensing environment or the travel environment improves (see paragraph [0015]: the processor identifies the one or more of the one or more information sources that provide the most relevant and reliable information about the specified area corresponding with the collision potential scenario according to real time conditions. The real time conditions include ambient light intensity and obstructed views of any of the information sources of the vehicle.),
an allocation of an estimation result corresponding to a high collision risk level is increased, and an allocation of an estimation result corresponding to a low collision risk level is decreased by comparing the first collision risk level with the second collision risk level, or as the sensing environment or the travel environment deteriorates, an allocation of an estimation result corresponding to a high collision risk level is decreased, and an allocation of an estimation result corresponding to a low collision risk level is increased by comparing the first collision risk level with the second collision risk level (see paragraphs [0013], [0031] and [0077] - [0079]: Each collision potential scenario defines a risk of a collision between the vehicle and an object in a specified area. The processor also adjusts a weight with which one or more of the information sources of the vehicle are considered for each collision potential scenario such that a highest weight is given to one or more of the one or more of the information sources that provide most relevant and reliable information about the specified area corresponding with the collision potential scenario. An intersection scene refers to a particular intersection and its real time conditions (e.g., ambient light, blocked views). Sensor information is weighted based on the relevance and reliability of the sensor for the particular intersection scene. A given intersection scene and a given path of the vehicle through the intersection may involve more than one collision potential scenario (e.g., path ahead, cross traffic). Each scenario may require a different weighting of the sensors that are part of the sensor fusion. As detailed, sensor fusion refers to the fusion of not only information from the sensors of the vehicle but also information from other sources. The stand-alone reliabilities of the sensors are acquired, and the driving environment reliability of each sensor is calculated (step 20). 
For a situation in which the detection environment when the sensor performs a detection operation is such that the detection error tends to increase, the control unit 50 sets the driving environment reliability as a low value. The driving environment reliability is quantified, for example, as a value in the range from 0 to 100. The stand-alone reliabilities for each sensor are described below. The driving environment reliability of the GPS receiver 11 is set in response to, for example, the conditions of structures in the area surrounding the vehicle (for example, the shapes and locations thereof). The control unit 50, based on map information in the memory unit 20, calculates the driving environment reliability as a low value if the vehicle exists in a location such as in a tunnel or in an area of tall buildings, in which it is difficult to receive a radio signal from a GPS satellite.).
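Jiang's cited reliability-weighted fusion (paragraphs [0031] and [0077]–[0079]) quantifies a driving-environment reliability per sensor (0 to 100, lowered e.g. for GPS in a tunnel) and weights each sensor's contribution accordingly. A minimal sketch under assumed names and an assumed weighted-average formula (the reference describes weighting but does not give this exact formula):

```python
# Hedged sketch of reliability-weighted sensor fusion as described in
# Jiang. fuse(), the score dictionaries, and the weighted-average
# formula are illustrative assumptions.
from __future__ import annotations

def fuse(detections: dict[str, float], reliability: dict[str, float]) -> float:
    """Combine per-sensor collision scores, weighting each sensor by its
    driving-environment reliability (a value in the range 0 to 100)."""
    total_weight = sum(reliability.get(s, 0.0) for s in detections)
    if total_weight == 0:
        return 0.0
    return sum(score * reliability.get(s, 0.0)
               for s, score in detections.items()) / total_weight

# Example: in a tunnel the GPS reliability is set low, so the fused
# score leans on the camera and radar instead.
scores = {"gps": 0.2, "camera": 0.9, "radar": 0.8}
in_tunnel = {"gps": 10.0, "camera": 80.0, "radar": 90.0}
```

With these illustrative numbers the low-reliability GPS score barely drags the fused estimate down, mirroring Jiang's point that unreliable sensors should receive little weight in the fusion.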
It would have been obvious to one skilled in the art before the effective filing date of the invention to combine the collision estimation arbitration unit that dynamically weights and selects collision estimation results based on the sensed environment of Lee with the collision determination system of Jiang, which modifies collision judgment based on the reliability of detected object information and adjusts collision risk when sensor accuracy is reduced due to environmental factors, in order to provide reliable arbitration and prevent false positives when sensor conditions deteriorate.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DEMETRA R SMITH-STEWART whose telephone number is (571)270-3965. The examiner can normally be reached between 10am and 6pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Peter Nolan can be reached at 571-270-7016. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DEMETRA R SMITH-STEWART/Examiner, Art Unit 3661
/PETER D NOLAN/Supervisory Patent Examiner, Art Unit 3661