DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Claim Rejections - 35 U.S.C. § 103:
Applicant’s arguments with respect to independent claims 1 and 11 have been considered but are moot. Applicant’s amendment to further clarify the second threshold distance of the navigation planning method (e.g., determining that the expanded bounding box of the close vehicle is within a second threshold distance shorter than the first threshold distance) changes the scope of the claimed invention, necessitating new grounds of rejection.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 1/16/2026 has been entered.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 4-7, 10-12, 14-17 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Lin et al. (US 12240497 B1; hereafter Lin) in view of Kumar et al. (US 20190103026 A1; hereafter Kumar).
Regarding claim 1, Lin teaches a method for navigation planning for an automated vehicle (see at least Col 2 lines 28-29, model may be used to control the autonomous vehicle), the method comprising: identifying, by an object tracking and classification module executed by one or more processors of an automated vehicle (see at least, Col 12 lines 25-33, The perception component 218 may detect object(s) in an environment surrounding the vehicle 202…classify the object(s)…determine characteristics associated with an object…e.g., a track identifying current, predicted, and/or previous position, heading, velocity, and/or acceleration associated with an object), via an artificial intelligence model (see at least, Col 14 lines 10-13, the perception component 218…may comprise one or more ML
models…an ML model may comprise a neural network), one or more traffic vehicles in image data obtained from one or more sensors of the automated vehicle (see at least, Col 6 lines 35-39, The
perception component 126 may be configured to track the objects it detects using one or more bounding boxes. The bounding boxes may be applied to LIDAR, image, or other data to enable the objects to be tracked over time), for each traffic vehicle of the one or more traffic vehicles, generating, by the object tracking and classification module, a bounding box associated with the traffic vehicle in the image data (see at least, Fig 1, Col 6 lines 35-43, The perception component 126 may be configured to track the objects it detects using one or more bounding boxes…The bounding boxes may be applied to image data based on other data…three bounding boxes, 132, 134, and 136, corresponding to bicycles 112, 114, and 116 respectively); tracking, by the object tracking and classification module, the one or more traffic vehicles with bounding boxes (see at least, Fig 1, Col 6 lines 35-37, The perception component 126 may be configured to track the objects it detects using one or more bounding boxes) by: matching bounding boxes of the one or more traffic vehicles predicted by the artificial intelligence model with one or more tracking identifiers of the one or more traffic vehicles (see at least, Col 6 lines 40-45, box 128, in which the scene 108 is overlaid with three bounding boxes, 132, 134, and 136,
corresponding to bicycles 112, 114, and 116 respectively); identifying, by one or more processors, a close vehicle of the one or more traffic vehicles, the close vehicle identified within a first threshold distance from the automated vehicle (see at least, Col 13 lines 34-40, The prediction component 228
may generate one or more probability maps representing prediction probabilities of possible locations of one or more objects in an environment. For example, the prediction component 228 may generate one or more probability maps for vehicles, pedestrians, animals, and the like within a threshold
distance from the vehicle 202); generating, by one or more processors, an expanded bounding box associated with the close vehicle in the image data, the expanded bounding box having a comparatively larger size than the bounding box (see at least, Col 17 lines 21-24, a first bounding box may be applied to the person-wide object in the image data and the first bounding box may be expanded to provide a second bounding box having a larger area than the first bounding box).
Lin does not explicitly teach determining, by one or more processors, that the expanded bounding box of the close vehicle is within a second threshold distance, shorter than the first threshold distance, from the automated vehicle, thereby detecting an encroaching vehicle as the close vehicle; generating, by the one or more processors, an avoidance instruction for operating the automated vehicle, in response to detecting the encroaching vehicle; and controlling, by a control module executed by the one or more processors, operation of the automated vehicle according to the avoidance instruction. However, Kumar teaches these limitations.
Kumar teaches determining, by one or more processors, that the expanded bounding box of the close vehicle is within a second threshold distance (see at least, [0038] The trigger may be generated when the dynamic separation distance at a current time instance is below a first distance threshold…may correspond to a minimum distance that must be maintained between the autonomous vehicle 104 and the first vehicle 106), shorter than the first threshold distance (see at least, [0037] the autonomous vehicle 104 may compare area of bounding boxes rendered at consecutive time intervals to determine whether distance between the first vehicle 106 and the autonomous vehicle 104 is increasing or decreasing), from the automated vehicle, thereby detecting an encroaching vehicle as the close vehicle (see at least, [0037] if the relative size of the consecutive bounding box increases, it implies that the distance between the first vehicle 106 and the autonomous vehicle 104 is decreasing…the autonomous vehicle 104 is nearing the first vehicle 106); generating, by the one or more processors, an avoidance instruction for operating the automated vehicle, in response to detecting the encroaching vehicle (see at least, [0038] The trigger may be generated when the dynamic separation distance at a current time instance is below a first distance threshold); and controlling, by a control module executed by the one or more processors, operation of the automated vehicle according to the avoidance instruction (see at least, [0038] the autonomous vehicle 104 may adjust its current velocity in order to maintain a pre-decided safe distance from the first vehicle 106).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Lin to include determining, by one or more processors, that the expanded bounding box of the close vehicle is within a second threshold distance, shorter than the first threshold distance, from the automated vehicle, thereby detecting an encroaching vehicle as the close vehicle; generating, by the one or more processors, an avoidance instruction for operating the automated vehicle, in response to detecting the encroaching vehicle; and controlling, by a control module executed by the one or more processors, operation of the automated vehicle according to the avoidance instruction as taught by Kumar in order to avoid collision when determining an overtaking trajectory for autonomous vehicles (Kumar, [0004]).
Regarding claim 2, the combination of Lin and Kumar teaches the method according to claim 1. Kumar further teaches wherein the first threshold distance includes at least one of: a threshold range-distance ahead of or behind the automated vehicle, or a threshold horizontal-distance alongside the automated vehicle (see at least, [0029] the autonomous vehicle 104 and the first vehicle 106 may be moving in a same lane…In order to monitor the dynamic separation distance).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Lin to include the first threshold distance includes at least one of: a threshold range-distance ahead of or behind the automated vehicle, or a threshold horizontal-distance alongside the automated vehicle as taught by Kumar in order to avoid collision when determining an overtaking trajectory for autonomous vehicles (Kumar, [0004]).
Regarding claim 4, the combination of Lin and Kumar teaches the method according to claim 1. Lin further teaches comprising: updating, by the one or more processors, a position history of the close vehicle using a plurality of historic positions of the close vehicle (see at least, Col 12 lines 32-34, determine characteristics associated with an object…a track identifying …previous position…
associated with an object); and updating, by the one or more processors, a trajectory track of the close vehicle based upon the position history of the close vehicle (see at least, Col 13 lines 40-56, the prediction component…may measure a track of an object and generate…a trajectory for the object based on observed and predicted behavior).
Regarding claim 5, the combination of Lin and Kumar teaches the method according to claim 4. Lin further teaches wherein the one or more processors continually updates the position history and the trajectory track for the close vehicle (see at least, Col 23 lines 47-49, The method of FIG. 4 may be repeated for a plurality of different person-wide vehicles and/or actions to develop the training data for the model).
Regarding claim 6, the combination of Lin and Kumar teaches the method according to claim 1. Lin further teaches comprising deriving, by the one or more processors, a velocity of the close vehicle based upon a position history of the close vehicle (see at least, Col 17 lines 37-47, The machine-learned model may be trained based on sensor data extracted from log data...Velocity…may be determined based on position of a person-wide vehicle at more than one points in time) stored in a non-transitory storage accessible to the one or more processors (Col 11, lines 56-57, memory…communicatively coupled with the one or more processors).
Regarding claim 7, the combination of Lin and Kumar teaches the method according to claim 1. Lin further teaches wherein detecting the encroaching vehicle as the close vehicle includes: generating, by the one or more processors, a predicted trajectory track of the close vehicle to a future time by modeling the close vehicle to the future time using a trajectory track and a velocity of the close vehicle (see at least, Col 5 lines 1-5, average velocity data, instantaneous velocity data, yaw data…may be combined with the future intention to determine a predicted trajectory or path that the bicycle is expected to take in future); wherein the one or more processors detects the encroaching vehicle as the close vehicle in response to determining that the expanded bounding box of the close vehicle is within the second threshold distance at a point along the predicted trajectory track of the close vehicle (see at least, Col 17 lines 29-32, the expanded second bounding box may allow the use of a road positioning of the person-wide vehicle to be used in determining the future intention by the machine-learned model).
Regarding claim 10, the combination of Lin and Kumar teaches the method according to claim 1. Lin further teaches wherein the one or more processors generates the expanded bounding box having the comparatively larger size in accordance with a preconfigured safety buffer size (see at least, Col 8 lines 7-11, identifying or extracting such data may include expanding the bounding box by a predetermined amount or percentage and using the image data in the expanded bounding box as the image data).
Regarding claim 11, Lin teaches a system for navigation planning for an automated vehicle (see at least Col 2 lines 28-29, model may be used to control the autonomous vehicle), the system comprising: a non-transitory computer-readable memory on board an automated vehicle (see at least, Col 32 lines 36-40, computer-executable instructions stored on one or more non-transitory computer-readable storage media that, when executed by one or more processors, cause a computer or autonomous vehicle to perform the recited operations) configured to store image data for a roadway associated with one or more traffic vehicles obtained from one or more sensors of the automated vehicle (see at least, Col 16 lines 35-39, a moving window approach may be implemented so that the image data is updated based on the current time and the predetermined period. The image data may be obtained from a cache or buffer configured to store recent image data); and one or more processors of the automated vehicle configured to: identify, by an object tracking and classification module executed by one or more processors (see at least, Col 12 lines 25-33, The perception component 218 may detect object(s) in an environment surrounding the vehicle 202…classify the object(s)…determine characteristics associated with an object…e.g., a track identifying current, predicted, and/or previous position, heading, velocity, and/or acceleration associated with an object), via an artificial intelligence model (see at least, Col 14 lines 10-13, the perception component 218…may comprise one or more ML
models…an ML model may comprise a neural network), the one or more traffic vehicles in the image data obtained from the one or more sensors of the automated vehicle (see at least, Col 6 lines 35-39, The perception component 126 may be configured to track the objects it detects using one or more bounding boxes. The bounding boxes may be applied to LIDAR, image, or other data to enable the objects to be tracked over time); for each traffic vehicle of the one or more traffic vehicles, generate, by the object tracking and classification module, a bounding box associated with the traffic vehicle in the image data (see at least, Fig 1, Col 6 lines 35-43, The perception component 126 may be configured to track the objects it detects using one or more bounding boxes…The bounding boxes may be applied to image data based on other data…three bounding boxes, 132, 134, and 136, corresponding to bicycles 112, 114, and 116 respectively); tracking, by the object tracking and classification module, the one or more traffic vehicles with bounding boxes (see at least, Fig 1, Col 6 lines 35-37, The perception component 126 may be configured to track the objects it detects using one or more bounding boxes) by: matching bounding boxes of the one or more traffic vehicles predicted by the artificial intelligence model with one or more tracking identifiers of the one or more traffic vehicles (see at least, Col 6 lines 40-45, box 128, in which the scene 108 is overlaid with three bounding boxes, 132, 134, and 136, corresponding to bicycles 112, 114, and 116 respectively); identify a close vehicle of the one or more traffic vehicles, the close vehicle identified within a first threshold distance from the automated vehicle (see at least, Col 13 lines 34-40, The prediction component 228 may generate one or more probability maps representing prediction probabilities of possible locations of one or more objects in an environment. 
For example, the prediction component 228 may generate one or more probability maps for vehicles, pedestrians, animals, and the like within a threshold distance from the vehicle 202); generate an expanded bounding box associated with the close vehicle in the image data, the expanded bounding box having a comparatively larger size than the bounding box (see at least, Col 17 lines 21-24, a first bounding box may be applied to the person-wide object in the image data and the first bounding box may be expanded to provide a second bounding box having a larger area than the first bounding box).
Lin does not explicitly teach determine that the expanded bounding box of the close vehicle is within a second threshold distance, shorter than the first threshold distance, from the automated vehicle, thereby detecting an encroaching vehicle as the close vehicle; generate an avoidance instruction for operating the automated vehicle, in response to detecting the encroaching vehicle; and controlling, by a control module executed by the one or more processors, operation of the automated vehicle according to the avoidance instruction. However, Kumar teaches these limitations.
Kumar teaches determine that the expanded bounding box of the close vehicle is within a second threshold distance (see at least, [0038] The trigger may be generated when the dynamic separation distance at a current time instance is below a first distance threshold…may correspond to a minimum distance that must be maintained between the autonomous vehicle 104 and the first vehicle 106), shorter than the first threshold distance (see at least, [0037] the autonomous vehicle 104 may compare area of bounding boxes rendered at consecutive time intervals to determine whether distance between the first vehicle 106 and the autonomous vehicle 104 is increasing or decreasing), from the automated vehicle, thereby detecting an encroaching vehicle as the close vehicle (see at least, [0037] if the relative size of the consecutive bounding box increases, it implies that the distance between the first vehicle 106 and the autonomous vehicle 104 is decreasing…the autonomous vehicle 104 is nearing the first vehicle 106); generate an avoidance instruction for operating the automated vehicle, in response to detecting the encroaching vehicle (see at least, [0038] The trigger may be generated when the dynamic separation distance at a current time instance is below a first distance threshold); and controlling, by a control module executed by the one or more processors, operation of the automated vehicle according to the avoidance instruction (see at least, [0038] the autonomous vehicle 104 may adjust its current velocity in order to maintain a pre-decided safe distance from the first vehicle 106).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Lin to include determine that the expanded bounding box of the close vehicle is within a second threshold distance, shorter than the first threshold distance, from the automated vehicle, thereby detecting an encroaching vehicle as the close vehicle; generate an avoidance instruction for operating the automated vehicle, in response to detecting the encroaching vehicle; and controlling, by a control module executed by the one or more processors, operation of the automated vehicle according to the avoidance instruction as taught by Kumar in order to avoid collision when determining an overtaking trajectory for autonomous vehicles (Kumar, [0004]).
Regarding claim 12, the combination of Lin and Kumar teaches the system according to claim 11. Kumar further teaches wherein the first threshold distance includes at least one of: a threshold range-distance ahead of or behind the automated vehicle, or a threshold horizontal-distance alongside the automated vehicle (see at least, [0029] the autonomous vehicle 104 and the first vehicle 106 may be moving in a same lane…In order to monitor the dynamic separation distance).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified Lin to include the first threshold distance includes at least one of: a threshold range-distance ahead of or behind the automated vehicle, or a threshold horizontal-distance alongside the automated vehicle as taught by Kumar in order to avoid collision when determining an overtaking trajectory for autonomous vehicles (Kumar, [0004]).
Regarding claim 14, the combination of Lin and Kumar teaches the system according to claim 11. Lin further teaches wherein the one or more processors is further configured to: update a position history of the close vehicle using a plurality of historic positions of the close vehicle (see at least, Col 12 lines 32-34, determine characteristics associated with an object…a track identifying …previous position…associated with an object); and update a trajectory track of the close vehicle based upon the position history of the close vehicle (see at least, Col 13 lines 40-56, the prediction component…may measure a track of an object and generate…a trajectory for the object based on observed and predicted behavior).
Regarding claim 15, the combination of Lin and Kumar teaches the system according to claim 14. Lin further teaches wherein the one or more processors continually updates the position history and the trajectory track for the close vehicle (see at least, Col 23 lines 47-49, The method of FIG. 4 may be repeated for a plurality of different person-wide vehicles and/or actions to develop the training data for the model).
Regarding claim 16, the combination of Lin and Kumar teaches the system according to claim 11. Lin further teaches wherein the one or more processors is further configured to derive a velocity of the close vehicle based upon a position history of the close vehicle (see at least, Col 17 lines 37-47, The machine-learned model may be trained based on sensor data extracted from log data...Velocity…may be determined based on position of a person-wide vehicle at more than one points in time) stored in a non-transitory storage accessible to the one or more processors (see at least, Col 11, lines 56-57, memory…communicatively coupled with the one or more processors).
Regarding claim 17, the combination of Lin and Kumar teaches the system according to claim 11. Lin further teaches wherein when detecting the encroaching vehicle as the close vehicle, the one or more processors is further configured to generate a predicted trajectory track of the close vehicle to a future time by modeling the close vehicle to the future time using a trajectory track and a velocity of the close vehicle (see at least, Col 5 lines 1-5, average velocity data, instantaneous velocity data, yaw data…
may be combined with the future intention to determine a predicted trajectory or path that the bicycle is expected to take in future) and wherein the one or more processors detects the encroaching vehicle as the close vehicle in response to determining that the expanded bounding box of the close vehicle is within the second threshold distance at a point along the predicted trajectory track of the close vehicle (see at least, Col 17 lines 29-32, the expanded second bounding box may allow the use of a road positioning of the person-wide vehicle to be used in determining the future intention by the machine-learned model).
Regarding claim 20, the combination of Lin and Kumar teaches the system according to claim 11. Lin further teaches wherein the one or more processors generates the expanded bounding box having the comparatively larger size in accordance with a preconfigured safety buffer size (see at least, Col 8 lines 7-11, identifying or extracting such data may include expanding the bounding box by a predetermined amount or percentage and using the image data in the expanded bounding box as the image data).
Claims 3, 8-9, 13 and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Lin et al. (US 12240497 B1; hereafter Lin) in view of Kumar et al. (US 20190103026 A1; hereafter Kumar), and further in view of Park et al. (US 20200216075 A1; hereafter Park).
Regarding claim 3, the combination of Lin and Kumar teaches the method according to claim 1. The combination does not explicitly teach wherein the second threshold distance relative to the expanded bounding box includes at least one of: a threshold horizontal-distance alongside the automated vehicle, or a contact between the expanded bounding box and a middle-lane line. However, Park teaches this limitation.
Park teaches wherein the second threshold distance relative to the expanded bounding box includes at least one of: a threshold horizontal-distance alongside the automated vehicle, or a contact between the expanded bounding box and a middle-lane line (see at least, [0064] the electronic apparatus…may determine whether to extend the bounding box in a horizontal direction).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the combination of Lin and Kumar to include the second threshold distance relative to the expanded bounding box includes at least one of: a threshold horizontal-distance alongside the automated vehicle, or a contact between the expanded bounding box and a middle-lane line as taught by Park in order to avoid collision by the autonomous vehicle refraining from attempts to change lanes and accelerate to pass the front vehicle or a sudden turn of the front vehicle (Park, [0041]).
Regarding claim 8, the combination of Lin and Kumar teaches the method according to claim 1. The combination does not explicitly teach wherein the one or more processors generates the expanded bounding box for the close vehicle having a size relative to a velocity of at least one of: the automated vehicle or the close vehicle. However, Park teaches this limitation.
Park further teaches wherein the one or more processors generates the expanded bounding box for the close vehicle having a size relative to a velocity of at least one of: the automated vehicle or the close vehicle (see at least, [0111] the electronic apparatus…may adjust the extension ratio differently for each extension direction of the bounding box based on…a driving speed…of the front vehicle).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the combination of Lin and Kumar to include the one or more processors generates the expanded bounding box for the close vehicle having a size relative to a velocity of at least one of: the automated vehicle or the close vehicle as taught by Park in order to avoid collision by the autonomous vehicle refraining from attempts to change lanes and accelerate to pass the front vehicle or a sudden turn of the front vehicle (Park, [0041]).
Regarding claim 9, the combination of Lin and Kumar teaches the method according to claim 1. The combination does not explicitly teach wherein the avoidance instruction includes at least one of: slowing down, biasing away from the encroaching vehicle, or sounding a horn. However, Park teaches this limitation.
Park teaches wherein the avoidance instruction (see at least, [0068] a processor 120…of the electronic apparatus 100 may transmit signals to a plurality of driving modules in such a manner that a driving operation of the vehicle 1 is controlled based on the adjusted bounding box) includes at least one of: slowing down, biasing away from the encroaching vehicle, or sounding a horn (see at least, [0043] control a driving operation, based on the extended bounding box…and thus be slowly driven and keep a safe distance from the front vehicle…may provide a safe autonomous driving environment by adjusting the size of the bounding box).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the combination of Lin and Kumar to include the avoidance instruction includes at least one of: slowing down, biasing away from the encroaching vehicle, or sounding a horn as taught by Park in order to avoid collision by the autonomous vehicle refraining from attempts to change lanes and accelerate to pass the front vehicle or a sudden turn of the front vehicle (Park, [0041]).
Regarding claim 13, the combination of Lin and Kumar teaches the system according to claim 11. The combination does not explicitly teach wherein the second threshold distance relative to the expanded bounding box includes at least one of: a threshold horizontal-distance alongside the automated vehicle, or a contact between the expanded bounding box and a middle-lane line. However, Park teaches this limitation.
Park teaches wherein the second threshold distance relative to the expanded bounding box includes at least one of: a threshold horizontal-distance alongside the automated vehicle, or a contact between the expanded bounding box and a middle-lane line (see at least, [0064] the electronic apparatus…may determine whether to extend the bounding box in a horizontal direction).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the combination of Lin and Kumar to include the second threshold distance relative to the expanded bounding box includes at least one of: a threshold horizontal-distance alongside the automated vehicle, or a contact between the expanded bounding box and a middle-lane line as taught by Park in order to avoid collision by the autonomous vehicle refraining from attempts to change lanes and accelerate to pass the front vehicle or a sudden turn of the front vehicle (Park, [0041]).
Regarding claim 18, the combination of Lin and Kumar teaches the system according to claim 11. The combination does not explicitly teach wherein the one or more processors generates the expanded bounding box for the close vehicle having a size relative to a velocity of at least one of: the automated vehicle or the close vehicle. However, Park teaches this limitation.
Park further teaches wherein the one or more processors generates the expanded bounding box for the close vehicle having a size relative to a velocity of at least one of: the automated vehicle or the close vehicle (see at least, [0111] the electronic apparatus…may adjust the extension ratio differently for each extension direction of the bounding box based on…a driving speed…of the front vehicle).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the combination of Lin and Kumar to include the one or more processors generates the expanded bounding box for the close vehicle having a size relative to a velocity of at least one of: the automated vehicle or the close vehicle as taught by Park in order to avoid collision by the autonomous vehicle refraining from attempts to change lanes and accelerate to pass the front vehicle or a sudden turn of the front vehicle (Park, [0041]).
Regarding claim 19, the combination of Lin and Kumar teaches the system according to claim 11. The combination does not explicitly teach wherein the avoidance instruction includes at least one of: slowing down, biasing away from the encroaching vehicle, or sounding a horn. However, Park teaches this limitation.
Park teaches wherein the avoidance instruction (see at least, [0068] a processor 120…of the electronic apparatus 100 may transmit signals to a plurality of driving modules in such a manner that a driving operation of the vehicle 1 is controlled based on the adjusted bounding box) includes at least one of: slowing down, biasing away from the encroaching vehicle, or sounding a horn (see at least, [0043] control a driving operation, based on the extended bounding box…and thus be slowly driven and keep a safe distance from the front vehicle…may provide a safe autonomous driving environment by adjusting the size of the bounding box).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the combination of Lin and Kumar to include the avoidance instruction includes at least one of: slowing down, biasing away from the encroaching vehicle, or sounding a horn as taught by Park in order to avoid collision by the autonomous vehicle refraining from attempts to change lanes and accelerate to pass the front vehicle or a sudden turn of the front vehicle (Park, [0041]).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Rajendra et al. (US 20230127465 A1) discloses generating, by the processor, an avoidance instruction for operating the automated vehicle, in response to detecting the encroaching vehicle (e.g. [0057] The notification device may similarly correspond to a horn, loudspeaker, or exterior notification device that communicates audible alerts in the operating environment 28...the system 10 may detect the trailing vehicle 14 and output a notification from the equipped vehicle 12 in a variety of ways).
Barrera et al. (US 20240416949 A1) discloses for each traffic vehicle of the one or more traffic vehicles, generating, by the processor, a bounding box associated with the traffic vehicle in the image data (e.g. [0252] computing one or more bounding boxes around one or more traffic related objects).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TOYA PETTIEGREW, whose telephone number is (313)446-6636. The examiner can normally be reached 8:30am - 5:00pm M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jelani Smith, can be reached at 571-270-3969. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TOYA PETTIEGREW/Primary Examiner, Art Unit 3662