Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Allowable Subject Matter
An amendment to claim 1 reciting that the yielding driving guide signal is transmitted from the server to at least five vehicles such that all five vehicles yield would be allowable over the prior art of record.
Response to Applicant's Arguments
The previous rejection is withdrawn. Applicant's amendments are entered, and Applicant's remarks are entered into the record. A new search, necessitated by Applicant's amendments, was conducted, and a new reference was found. A new ground of rejection is set forth herein. Applicant's arguments are therefore moot in view of the new rejection of the claims.
Applicant contends that no reference in the prior art provides the claimed yielding driving guide signal.
Support for this feature is found in the specification as originally filed at least at paragraphs [0041]-[0047], which recite: '...[0041] The cloud 200 is configured to receive an emergency request signal from the communication system 110 of a specific vehicle, determines whether the specific vehicle is an emergency vehicle, and then transmits a yielding driving guide signal to vehicles around the specific vehicle. [0042] The cloud 200 may be configured to determine the predetermined signal as an emergency signal and transmit a yielding driving guide signal to vehicles around the specific vehicle when receiving a signal proving that the specific vehicle is an emergency vehicle from an emergency center that has received an emergency report. [0043] The cloud 200 may also be configured to determine the specific vehicle as an emergency vehicle and transmit a yielding driving guide signal to vehicles around the specific vehicle when receiving an image signal (e.g., an image taken by a passenger) that shows an emergency patient in the specific vehicle from the communication system 110 in the specific vehicle.
[0044] The yielding driving guide signal which is transmitted to a surrounding vehicle of a specific vehicle by the cloud 200 may include the location information of the specific vehicle, the driving direction of the specific vehicle, the lane change direction of the specific vehicle, a forward safety distance and a safety road width for the specific vehicle, a direction and a movement width for yielding driving of surrounding vehicles, and whether to accelerate or decelerate the surrounding vehicles, etc. The yielding driving guide signal is transmitted to the autonomous driving controllers 100 through the communication systems 110 of the surrounding vehicles. [0045] When receiving an emergency request signal from a specific vehicle, the cloud 200 can determine a forward safety distance and a safety road width for the specific vehicle based on the specifications, current location, driving direction, etc. of the specific vehicle. [0046] Accordingly, the autonomous driving controller 100 is configured to control the surrounding vehicle to move at a speed and in a direction for securing a driving path of the specific vehicle based on the yielding driving guide signal transmitted from the cloud 200. [0047] To the present end, the autonomous driving controller 100 is configured to determine an available road width D of the surrounding vehicle based on a forward safety distance L and a safety road width d for the specific vehicle of the yielding driving guide signal transmitted from the cloud 200, and to determine a yielding driving width B for one surrounding vehicle based on the determined available road width D”.
[Image: media_image1.png (greyscale)]
Wang discloses a yielding driving guide signal that can be received by the autonomous vehicle controller and used to move the vehicle at a speed and in a direction that secures the driving path of the vehicle in an emergency state.
Wang has a yield planning block 159.
The output of the yield planning block is provided to the autonomous vehicle controller for the behavior planning block 140 and to element 180. See paragraphs 56-57, where contenders such as an ambulance have the right of way, and this information can be provided to the vehicle so that it yields, and see the motion planning in paragraph 58. This can include pulling over for an ambulance.
In paragraph 115, this functionality can be provided from a system on a chip in the vehicle or, alternatively, via a network interface 824 from the server 878.
Therefore, Wang discloses a yielding driving guide signal that can be received by the autonomous vehicle controller and used to move the vehicle at a speed and in a direction that secures the driving path of the vehicle in an emergency state. The signal can come from the yield planning block and from a server, and it provides a motion plan, e.g., that an ambulance is present and the vehicle should move over, stop, or take some other action.
Applicant states on page 10 of the remarks that the controller mounted in each of vehicles a-e controls all surrounding vehicles a to e based on the yielding driving guide signal to move at speeds in directions for securing a driving path of the vehicle in an emergency state.
This arrangement, however, is not actually claimed.
The claims recite a controller that can control the vehicle and at least one surrounding vehicle, and a cloud that transmits the guide signal to the controller of each surrounding vehicle. Therefore, under the broadest reasonable interpretation (BRI), the claims do not require vehicles a-e; two vehicles suffice.
Wang discloses in paragraph 161 that the cloud server can provide a connection with at least two vehicles.
Nister teaches “...based on yielding driving guide signal transmitted from the cloud (see paragraphs 158 and 195) to the autonomous driving controller mounted in each of the surrounding vehicles such that the autonomous driving controller mounted in each of the surrounding vehicles controls all the surrounding vehicles based on the yielding driving guide signal to move at speeds in directions for securing the driving path of the vehicle in the (see col. 14, lines 9-61 and paragraphs 56-63, where the yield planner 159 can provide a yield entry, a yield contention, and a yield plan, and where the yield planner will provide an analysis of who stopped first in the intersection and then a short-term prediction of the path)
emergency state, and wherein the cloud is configured to transmit
the yielding driving guide signal to each of the autonomous driving controller of the
surrounding vehicles and the yielding driving guide signal includes (see paragraph 158 where the vehicle can detect that an ambulance is approaching from the sound and the gps and lidar and that the vehicle must provide a control program to yield. Once an emergency vehicle is detected, a control program may be used to execute an emergency vehicle safety routine, slowing the vehicle, pulling over to the side of the road, parking the vehicle, and/or idling the vehicle, with the assistance of ultrasonic sensors 862, until the emergency vehicle(s) passes)
location information of the vehicle, (see paragraph 26 where the processor can determine the location of the vehicle in the lane and in a lane graph that includes multiple lanes with multiple scoring every 20 feet)
driving information of the vehicle, (see paragraph 26 where the processor can determine the trajectory of the vehicle in the lane and in a lane graph that includes multiple lanes with multiple scoring every 20 feet)
a lane change direction of the vehicle, (see paragraph 158 where the vehicle can detect that an ambulance is approaching from the sound and the gps and lidar and that the vehicle must provide a control program to yield and leave the lane and pull over to the side of the road. Once an emergency vehicle is detected, a control program may be used to execute an emergency vehicle safety routine, slowing the vehicle, pulling over to the side of the road, parking the vehicle, and/or idling the vehicle, with the assistance of ultrasonic sensors 862, until the emergency vehicle(s) passes)
a forward safety direction and (see paragraph 158 where the vehicle can detect that an ambulance is approaching from the sound and the gps and lidar and that the vehicle can determine it cannot move forward and instead will just remain in place and idle and until the emergency vehicle(s) passes)
[Image: media_image2.png (greyscale)]
a safety road width of the vehicle, (see paragraph 158 where the vehicle can detect an ambulance and pull over to the side edge of the road shown as path 222 defined by the road edge)
a direction and (see paragraph 158, where, using the Doppler effect, the ambulance can be detected as moving from the back toward the front of the vehicle, and the vehicle should pull over)
a movement width for yielding driving of the surrounding vehicles, and
whether to accelerate or decelerate the surrounding vehicles”. (See FIG. 2, where the lane is split into several paths 202-222 and where the lane change can be expressed as (1) following in the lane or (2) pulling over to the side and edge of the road to make way for an ambulance; see paragraphs 70-74.)
Further, transmitting the yield signal from one server to both a first vehicle and a second vehicle is a mere duplication of parts that involves only routine skill in the art. See In re Harza, 274 F.2d 669, 124 USPQ 378 (CCPA 1960) (Claims at issue were directed to a water-tight masonry structure wherein a water seal of flexible material fills the joints which form between adjacent pours of concrete. The claimed water seal has a “web” which lies in the joint, and a plurality of “ribs” projecting outwardly from each side of the web into one of the adjacent concrete slabs. The prior art disclosed a flexible water stop for preventing passage of water between masses of concrete in the shape of a plus sign (+). Although the reference did not disclose a plurality of ribs, the court held that mere duplication of parts has no patentable significance unless a new and unexpected result is produced.).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 2, and 4 are rejected under 35 U.S.C. § 103 as being unpatentable as obvious over United States Patent Pub. No. US 2023/0033315 A1 to Tariq et al., filed in 2019, which is prior to the effective filing date of 12-6-21 (hereinafter “Tariq”), in view of United States Patent Application Pub. No. US 2021/0295171 A1 to Kamenev et al., filed 3-19-2020 and assigned to NVIDIA™ (hereinafter “Kamenev”), in view of United States Patent Application Pub. No. US 2021/0253128 A1 to Nister et al., filed 2-18-21 (hereinafter “Nister”), and in view of United States Patent Application Pub. No. US 2021/0253128 A1 to Wang et al., filed 2-18-21 (hereinafter “Wang”).
[Image: media_image3.png (greyscale)]
Tariq discloses “...1. A system for inducing vehicles to yield right of way, the system comprising: (see vehicle 112, which can move from the first location 124 to the second location 128 based on the detected emergency vehicle 104 and the object trajectory 122)
[Image: media_image4.png (greyscale)]
a communication device mounted in each of a fleet of the vehicles; (see FIG. 4 where the vehicle 400 has a communication connection 410)
a cloud configured to receive an emergency state signal from the communication device of a vehicle of the fleet of vehicles, (see paragraphs 19 and 122 and FIG. 4, where the cloud computing device includes a network connection to the fleet of vehicles 400)
[Image: media_image5.png (greyscale)]
and determine an emergency state, and then to transmit a guide signal requesting surrounding vehicles to yield the right of way to the vehicle in the emergency state; and (see paragraph 19, where the yield can be provided from a remote operator or a remote computing device, and see vehicle 112, which can move from the first location 124 to the second location 128 based on the detected emergency vehicle 104 and the object trajectory 122) (see FIG. 5, where, when the probability indicates that the emergency vehicle may not be operating in an emergency state in block 506, a support request is sent to a server; the server can then provide an indication that there is an emergency, the vehicle trajectory is determined, and yielding is provided)
[Image: media_image6.png (greyscale)]
an autonomous driving controller configured, based on the guide signal, to control the surrounding vehicles at a speed and in a direction so that a driving path of the vehicle in the emergency state is secured. (see paragraph 19 and FIG. 5, where the AV is directed to a yielding location that is safe and is then controlled to the yielding location in blocks 602-612)”
[Image: media_image7.png (greyscale)]
The independent claims are amended to recite the following limitation, as to which the primary reference Tariq is silent, but Kamenev teaches: “...wherein the autonomous driving controller is mounted in each of the surrounding vehicles and the autonomous driving controller mounted in each of the surrounding vehicles is configured to control the surrounding vehicles such that all the surrounding vehicles are controlled to move at speeds in directions for securing the driving path of the vehicle in the emergency state”. (see paragraphs 27-37, where the neural network cloud server can track the present location of each vehicle as well as the past and future states of all vehicles in FIGS. 4A and 4B; see paragraph 86, where each of the vehicles has a redundant emergency controller; see paragraphs 145 and 165, where, using the cloud and the sensors in the autonomous vehicle, an emergency can be detected, and once it is detected each of the vehicles can execute an emergency safety routine that includes (1) slowing the vehicle, (2) pulling over to the side of the road, (3) parking, and (4) idling, until the emergency passes)
It would have been obvious for one of ordinary skill in the art to combine the disclosure of TARIQ with the teachings of KAMENEV with a reasonable expectation of success, since KAMENEV teaches that in an emergency, such as a fire truck passing, the cloud server can track (1) the past location of each vehicle, (2) the current location of each vehicle, and (3) the future location of each vehicle. Using this information, each of the vehicles can be controlled to detect the emergency and take a remedial action. For example, the SoC(s) 604 use the CNN for classifying environmental and urban sounds, as well as classifying visual data. In a preferred embodiment, the CNN running on the DLA is trained to identify the relative closing speed of the emergency vehicle (e.g., by using the Doppler effect). Once an emergency vehicle is detected, a control program may be used to execute an emergency vehicle safety routine, slowing the vehicle, pulling over to the side of the road, parking the vehicle, and/or idling the vehicle, with the assistance of ultrasonic sensors 662, until the emergency vehicle(s) passes. This provides collision avoidance and ensures that the autonomous vehicle does not add to the harm. See paragraphs 140-147, claims 1-20, and the abstract.
[Image: media_image8.png (greyscale)]
Wang teaches ‘...controls the vehicle and at least one of the [[all]] of the surrounding vehicles based on the yielding driving guide signal received by the each autonomous driving controller to move at speeds in directions for securing the driving path of the vehicle in the emergency state” (see Fig. 1 and paragraphs 53 and 56-60, where each of the autonomous vehicles has a yield planning block 158 that can provide a pre-limiting condition 150 to the behavior planning block 140; see paragraph 62, where, if the contender vehicle and the autonomous vehicle are both stopped, a future prediction of motion is used, and if the second contender vehicle does not move then the autonomous vehicle will move forward; see paragraph 63, where there can be a negotiation of forward motion by the two vehicles; and see paragraph 94, where the contender can be an ambulance, in which case the vehicle can stop and allow the second vehicle to move unless the ambulance is not moving, in which case the vehicle will move).
It would have been obvious for one of ordinary skill in the art to combine the disclosure of TARIQ with the teachings of WANG, assigned to NVIDIA, with a reasonable expectation of success, since WANG teaches that each controller can include a yield planning block 159. This can allow a fire truck or ambulance to pass by pulling over; alternatively, if the truck is stopped, the vehicle can advance or negotiate a yielding situation. See paragraphs 59-65 of Wang.
Nister teaches “...based on yielding driving guide signal transmitted from the cloud (see paragraphs 158 and 195) to the autonomous driving controller mounted in each of the surrounding vehicles such that the autonomous driving controller mounted in each of the surrounding vehicles controls all the surrounding vehicles based on the yielding driving guide signal to move at speeds in directions for securing the driving path of the vehicle in the (see col. 14, lines 9-61 and paragraphs 56-63, where the yield planner 159 can provide a yield entry, a yield contention, and a yield plan, and where the yield planner will provide an analysis of who stopped first in the intersection and then a short-term prediction of the path)
emergency state, and wherein the cloud is configured to transmit
the yielding driving guide signal to each of the autonomous driving controller of the
surrounding vehicles and the yielding driving guide signal includes (see paragraph 158 where the vehicle can detect that an ambulance is approaching from the sound and the gps and lidar and that the vehicle must provide a control program to yield. Once an emergency vehicle is detected, a control program may be used to execute an emergency vehicle safety routine, slowing the vehicle, pulling over to the side of the road, parking the vehicle, and/or idling the vehicle, with the assistance of ultrasonic sensors 862, until the emergency vehicle(s) passes)
location information of the vehicle, (see paragraph 26 where the processor can determine the location of the vehicle in the lane and in a lane graph that includes multiple lanes with multiple scoring every 20 feet)
driving information of the vehicle, (see paragraph 26 where the processor can determine the trajectory of the vehicle in the lane and in a lane graph that includes multiple lanes with multiple scoring every 20 feet)
a lane change direction of the vehicle, (see paragraph 158 where the vehicle can detect that an ambulance is approaching from the sound and the gps and lidar and that the vehicle must provide a control program to yield and leave the lane and pull over to the side of the road. Once an emergency vehicle is detected, a control program may be used to execute an emergency vehicle safety routine, slowing the vehicle, pulling over to the side of the road, parking the vehicle, and/or idling the vehicle, with the assistance of ultrasonic sensors 862, until the emergency vehicle(s) passes)
a forward safety direction and (see paragraph 158 where the vehicle can detect that an ambulance is approaching from the sound and the gps and lidar and that the vehicle can determine it cannot move forward and instead will just remain in place and idle and until the emergency vehicle(s) passes)
[Image: media_image2.png (greyscale)]
a safety road width of the vehicle, (see paragraph 158 where the vehicle can detect an ambulance and pull over to the side edge of the road shown as path 222 defined by the road edge)
a direction and (see paragraph 158, where, using the Doppler effect, the ambulance can be detected as moving from the back toward the front of the vehicle, and the vehicle should pull over)
a movement width for yielding driving of the surrounding vehicles, and
whether to accelerate or decelerate the surrounding vehicles”. (See FIG. 2, where the lane is split into several paths 202-222 and where the lane change can be expressed as (1) following in the lane or (2) pulling over to the side and edge of the road to make way for an ambulance; see paragraphs 70-74.)
It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the NISTER publication, assigned to NVIDIA™, with the disclosure of TARIQ with a reasonable expectation of success, since the NISTER publication teaches that a cloud server can provide a number of lanes in FIG. 2 and a score for each lane. The autonomous vehicle can also interface with the GPU cloud units to assist the vehicle. The AV can then detect, using an acoustic sensor, a lidar, and a GPS sensor, that an ambulance is behind the vehicle and approaching. The AV can then determine a predicted trajectory, which can include pulling over to the road edge to allow the ambulance to pass and waiting until it is safe to re-merge into the best-scoring lanes for improved safety.
Tariq discloses “...2. The system of claim 1, wherein the cloud is configured to conclude that the vehicle is in the emergency state when receiving a signal proving the emergency state from an emergency center” (see paragraph 19, where the remote operator can determine there is an emergency and/or provide a signal to move to the yield location).
Claim 3 is rejected under 35 U.S.C. § 103 as being unpatentable as obvious over United States Patent Pub. No. US 2023/0033315 A1 to Tariq et al., filed in 2019, which is prior to the effective filing date of 12-6-21 (hereinafter “Tariq”), in view of Korean Patent Pub. No. KR102414191B1, filed 12-1-2021 (hereinafter “the ’191 publication”), in view of Kamenev, in view of United States Patent Application Pub. No. US 2021/0253128 A1 to Nister et al., filed 2-18-21, and in view of Wang.
Tariq discloses “...3. The system of claim 1, wherein the cloud is configured to conclude that the vehicle is in the emergency state when receiving an image showing that there (see paragraphs 19-24, where the remote operator can determine there is an emergency and/or provide a signal to move to the yield location)
The ’191 publication teaches “...is an emergency patient aboard the vehicle” (see the abstract and paragraphs 1-8, where the doctor can review, via remote telemedicine treatment, a patient in the vehicle or ambulance).
It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the ‘191 publication with the disclosure of TARIQ with a reasonable expectation of success since the ‘191 publication teaches that the remote viewer can determine that there is an emergency and a patient in the ambulance based on the telemedicine approach to avoid exposure to an infectious disease via a remote monitoring of the ambulance. See paragraph 1-11 and claims 1-4 and the abstract.
[Image: media_image9.png (greyscale)]
Tariq discloses “...4. The system of claim 1, wherein the cloud is configured to transmit a yielding driving guide signal (see element 108 in FIG. 1, where the remote operator and the EV can detect and respond, via communication and sensors, for the vehicle 102 to yield in direction 140; see FIG. 1, where the vehicle 112 can move to the shoulder via object trajectory 122 from the first location to the second location 128, which can be a second lane or a shoulder of the road 118) to an autonomous driving controller of the surrounding vehicles”. (See paragraphs 19-24, 41-44, and 107-108, where, depending on its size, the vehicle can determine whether to pull over on the shoulder or merely change lanes.)
Claim 5 is rejected under 35 U.S.C. § 103 as being unpatentable as obvious over United States Patent Pub. No. US 2023/0033315 A1 to Tariq et al., filed in 2019, which is prior to the effective filing date of 12-6-21 (hereinafter “Tariq”), in view of U.S. Patent Pub. No. US 2020/0238975 A1 to Mizuno, filed in 2017 (hereinafter “Mizuno”), in view of Kamenev, in view of United States Patent Application Pub. No. US 2021/0253128 A1 to Nister et al., filed 2-18-21, and in view of Wang.
[Image: media_image10.png (greyscale)]
Tariq is silent, but Mizuno teaches “...5. The system of claim 4, wherein the autonomous driving controller is configured to determine an available road width (see FIG. 5, where the emergency vehicle is detected and the yielding vehicle then determines a target zone based on the road, the yielding vehicle, and the emergency vehicle) for the surrounding vehicles based on the forward safety direction and the safety road width of the vehicle (see FIG. 1, where the vehicle has a width and then determines whether it can fit into area PE or PO to yield), and to determine a yielding driving width for one surrounding vehicle based on the determined available road width for the one surrounding vehicle” (see paragraphs 39-47).
It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the MIZUNO publication with the disclosure of TARIQ with a reasonable expectation of success, since the MIZUNO publication teaches that the server can detect the widths of the vehicles and the width of the road. The server can pull over a first vehicle in a target zone where the width is sufficient for the emergency vehicle to pass. However, if there is a second oncoming vehicle and the second width plus the first width of the first vehicle is too large for the emergency vehicle to pass, the remote server can control the second vehicle to stop and prevent blockage of the road, to allow the ambulance to pass. See paragraphs 1-4 and 39-50.
Claim 6 is rejected under 35 U.S.C. § 103 as being unpatentable as obvious over United States Patent Pub. No. US 2023/0033315 A1 to Tariq et al., filed in 2019, which is prior to the effective filing date of 12-6-21 (hereinafter “Tariq”), in view of U.S. Patent Pub. No. US 2020/0238975 A1 to Mizuno, filed in 2017, in view of Kamenev, in view of United States Patent Application Pub. No. US 2021/0253128 A1 to Nister et al., filed 2-18-21, and in view of Wang.
Tariq is silent, but Mizuno teaches “...6. The system of claim 5, wherein the autonomous driving controller is configured to control the surrounding vehicles at a speed and in a direction for securing a driving path of a predetermined direction when the yielding driving width for the one surrounding vehicle is greater than a sum of a width of the one surrounding vehicle and a safety width” (see paragraphs 39-40).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the MIZUNO publication with the disclosure of TARIQ, with a reasonable expectation of success, for the same reasons set forth above with respect to claim 5. See paragraphs 1-4 and 39-50 of Mizuno.
Claim 7 is rejected under 35 U.S.C. § 103 as being unpatentable over United States Patent Pub. No. US 2023/0033315 A1 to Tariq et al. (filed in 2019, prior to the effective filing date of 12-6-21) (hereinafter “Tariq”) in view of U.S. Patent Pub. No. US 2020/0238975 A1 to Mizuno (filed in 2017), in view of Kamenev, in view of United States Patent App. Pub. No. US 2021/0253128 A1 to Nister et al. (filed on 2-18-21), and in view of Wang.
Tariq is silent, but Mizuno teaches “...7. The system of claim 6, wherein the autonomous driving controller is configured to control the vehicle to be decelerated when the yielding driving width for the one surrounding vehicle is equal to or smaller than the sum of the width of the one surrounding vehicle and the safety width”. (see claims 1-6 and paragraphs 39-40)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the MIZUNO publication with the disclosure of TARIQ, with a reasonable expectation of success, for the same reasons set forth above with respect to claim 5. See paragraphs 1-4 and 39-50 of Mizuno.
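For orientation only, the width-comparison rule recited in claims 5-7 (and described at paragraphs 41-47 of the specification) can be sketched as follows. This is the editor's illustration, not Tariq's or Mizuno's implementation; all names are hypothetical labels for the claimed quantities, and the yielding driving width is taken as equal to the available road width for a single surrounding vehicle as a simplifying assumption.

```python
# Illustrative sketch of the claimed decision rule (claims 5-7).
# Names (yield_action, available_road_width_D, etc.) are hypothetical.

def yield_action(available_road_width_D: float,
                 vehicle_width: float,
                 safety_width: float) -> str:
    """Return the surrounding vehicle's action under the claimed rule."""
    # Claim 5: a yielding driving width B is determined from the
    # available road width D (assumed equal here for one vehicle).
    yielding_width_B = available_road_width_D
    # Claim 6: move aside when B exceeds vehicle width plus safety width.
    if yielding_width_B > vehicle_width + safety_width:
        return "move_aside"
    # Claim 7: otherwise the vehicle is controlled to decelerate.
    return "decelerate"

print(yield_action(3.5, 2.0, 0.5))  # enough room -> move_aside
print(yield_action(2.2, 2.0, 0.5))  # too narrow  -> decelerate
```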
Claim 8 is rejected under 35 U.S.C. § 103 as being unpatentable over United States Patent Pub. No. US 2023/0033315 A1 to Tariq et al. (filed in 2019, prior to the effective filing date of 12-6-21) (hereinafter “Tariq”) in view of U.S. Patent Pub. No. US 2020/0238975 A1 to Mizuno (filed in 2017), in view of Kamenev, in view of United States Patent App. Pub. No. US 2021/0253128 A1 to Nister et al. (filed on 2-18-21), and in view of Wang.
Tariq is silent, but Mizuno teaches “8. The system of claim 4, further including a display mounted in the surrounding vehicles and configured to display information of the yielding driving guide signal”. (see FIG. 2, the indicator to the driver 6, and paragraphs 45-46)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the MIZUNO publication with the disclosure of TARIQ, with a reasonable expectation of success, for the same reasons set forth above with respect to claim 5. See paragraphs 1-4 and 39-50 of Mizuno.
Claims 9 and 10 are rejected under 35 U.S.C. § 103 as being unpatentable over United States Patent Pub. No. US 2023/0033315 A1 to Tariq et al. (filed in 2019, prior to the effective filing date of 12-6-21) (hereinafter “Tariq”) in view of Kamenev, in view of United States Patent App. Pub. No. US 2021/0253128 A1 to Nister et al. (filed on 2-18-21), and in view of Wang.
Tariq discloses “...9. A method for inducing vehicles to yield right of way, the method comprising: (see vehicle 112, which can move from the first location 124 to the second location 128 along the trajectory 122 when the emergency vehicle 104 is detected)
transmitting, by a communication system of a vehicle, an emergency request signal to a cloud; (see FIG. 4 where the vehicle 400 has a communication connection 410 and paragraphs 19-24)
determining, by the cloud, whether the vehicle is an emergency vehicle; (see paragraphs 19-24 and FIG. 4, where the cloud computing device includes a network connection to the fleet of vehicles 400)
transmitting, by the cloud, a yielding driving guide signal to surrounding vehicles of the vehicle when the vehicle is an emergency vehicle; and (see paragraph 19, where the yield can be directed by a remote operator or a remote computing device, and see vehicle 112, which can move from the first location 124 to the second location 128 along the trajectory 122 when the emergency vehicle 104 is detected) (see FIG. 5, where, when the probability that the emergency vehicle is not operating in an emergency state is determined in block 506, a support request is sent to a server; the server can then provide an indication that there is an emergency, after which the vehicle trajectory is determined and a yield is performed)
controlling, by an autonomous driving controller, the surrounding vehicles at a speed and in a direction for securing a driving path of the vehicle based on the yielding driving guide signal. (see paragraph 19 and FIG. 5, where the AV is directed to a yielding location that is safe and the vehicle is then controlled to the yielding location in blocks 602-612)”
Independent claims 1 and 9 are amended to recite the following limitation. The primary reference, Tariq, is silent, but Kamenev teaches “...wherein the autonomous driving controller is mounted in each of the surrounding vehicles and the autonomous driving controller mounted in each of the surrounding vehicles is configured to control the surrounding vehicles such that all the surrounding vehicles are controlled to move at speeds in directions for securing the driving path of the vehicle in the emergency state”. (see paragraphs 27-37, where the neural network cloud server can track the present location of each vehicle as well as the past and future states of all vehicles in FIGS. 4a and 4b; see paragraph 86, where each of the vehicles has a redundant emergency controller; and see paragraphs 145 and 165, where, using the cloud and the sensors in the autonomous vehicle, an emergency can be detected, and once it is detected each vehicle can execute an emergency safety routine that includes (1) slowing the vehicle, (2) pulling the vehicle over to the side of the road, (3) parking, and (4) idling until the emergency passes)
It would have been obvious to one of ordinary skill in the art to combine the disclosure of TARIQ with the teachings of KAMENEV, with a reasonable expectation of success, since KAMENEV teaches that in an emergency, such as a fire truck passing, the cloud server can track (1) the past location of each vehicle, (2) the current location of each vehicle, and (3) the future location of each vehicle. Using this information, each of the vehicles can detect the emergency and take a remedial action. For example, the SoC(s) 604 use the CNN for classifying environmental and urban sounds as well as classifying visual data. In a preferred embodiment, the CNN running on the DLA is trained to identify the relative closing speed of the emergency vehicle (e.g., by using the Doppler effect). Once an emergency vehicle is detected, a control program may be used to execute an emergency vehicle safety routine, slowing the vehicle, pulling over to the side of the road, parking the vehicle, and/or idling the vehicle, with the assistance of ultrasonic sensors 662, until the emergency vehicle(s) passes. This provides collision avoidance and ensures that the autonomous vehicle does not add to the harm. See paragraphs 140-147, claims 1-20, and the abstract.
The primary reference is silent, but Nister teaches “...based on the yielding driving guide signal transmitted from the cloud (see paragraphs 158 and 195) to the autonomous driving controller mounted in each of the surrounding vehicles such that the autonomous driving controller mounted in each of the surrounding vehicles controls all the surrounding vehicles based on the yielding driving guide signal to move at speeds in directions for securing the driving path of the vehicle in the (see col. 14, lines 9-61 and paragraphs 56-63, where the yield planner 159 can provide a yield entry, yield contention, and yield planning, and where the yield planner analyzes who stopped first in the intersection and then makes a short-term prediction of the path)
emergency state, and wherein the cloud is configured to transmit the yielding driving guide signal to each of the autonomous driving controllers of the surrounding vehicles and the yielding driving guide signal includes (see paragraph 158, where the vehicle can detect, from the sound, the GPS, and the lidar, that an ambulance is approaching and that the vehicle must execute a control program to yield; once an emergency vehicle is detected, a control program may be used to execute an emergency vehicle safety routine, slowing the vehicle, pulling over to the side of the road, parking the vehicle, and/or idling the vehicle, with the assistance of ultrasonic sensors 862, until the emergency vehicle(s) passes)
location information of the vehicle, (see paragraph 26 where the processor can determine the location of the vehicle in the lane and in a lane graph that includes multiple lanes with multiple scoring every 20 feet)
driving information of the vehicle, (see paragraph 26 where the processor can determine the trajectory of the vehicle in the lane and in a lane graph that includes multiple lanes with multiple scoring every 20 feet)
a lane change direction of the vehicle, (see paragraph 158 where the vehicle can detect that an ambulance is approaching from the sound and the gps and lidar and that the vehicle must provide a control program to yield and leave the lane and pull over to the side of the road. Once an emergency vehicle is detected, a control program may be used to execute an emergency vehicle safety routine, slowing the vehicle, pulling over to the side of the road, parking the vehicle, and/or idling the vehicle, with the assistance of ultrasonic sensors 862, until the emergency vehicle(s) passes)
a forward safety direction and (see paragraph 158 where the vehicle can detect that an ambulance is approaching from the sound and the gps and lidar and that the vehicle can determine it cannot move forward and instead will just remain in place and idle and until the emergency vehicle(s) passes)
a safety road width of the vehicle, (see paragraph 158 where the vehicle can detect an ambulance and pull over to the side edge of the road shown as path 222 defined by the road edge)
a direction and (see paragraph 158, where, according to the Doppler effect, the ambulance can be detected moving from the back toward the front of the vehicle, and the vehicle should pull over)
a movement width for yielding driving of the surrounding vehicles, and
whether to accelerate or decelerate the surrounding vehicles”. (see FIG. 2, where the lane is split into several paths 202-222 and where the lane change can be expressed as (1) following in the lane or (2) pulling over to the side and edge of the road to make way for an ambulance; see paragraphs 70-74)
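For orientation only, the fields the claim language above enumerates for the yielding driving guide signal (consistent with paragraph 44 of the specification) can be sketched as a simple record. This is an editor's illustration; the field names are hypothetical labels, not terms from the record.

```python
# Hypothetical sketch of the claimed yielding driving guide signal fields.
from dataclasses import dataclass

@dataclass
class YieldingDrivingGuideSignal:
    vehicle_location: tuple          # location information of the vehicle
    driving_information: str         # driving information of the vehicle
    lane_change_direction: str       # lane change direction of the vehicle
    forward_safety_distance: float   # forward safety distance L (spec par. 47)
    safety_road_width: float         # safety road width d (spec par. 47)
    yield_direction: str             # direction for yielding driving
    movement_width: float            # movement width for yielding driving
    speed_command: str               # whether to accelerate or decelerate

# Example instance with placeholder values.
sig = YieldingDrivingGuideSignal(
    (37.0, 127.0), "northbound", "right", 20.0, 3.0, "right", 1.5, "decelerate"
)
print(sig.speed_command)
```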
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the NISTER publication (assigned to NVIDIA) with the disclosure of TARIQ, with a reasonable expectation of success, since the NISTER publication teaches that a cloud server can provide a number of lanes in FIG. 2 together with a score for each lane. The autonomous vehicle can also interface with GPU cloud units for assistance. The AV can then detect, using an acoustic sensor, a lidar, and a GPS sensor, that an ambulance is behind the vehicle and approaching. The AV can then determine a predicted trajectory, which can include pulling over to the road edge to allow the ambulance to pass and waiting until it is safe to re-merge into the best-scoring lanes, for improved safety.
Claim 9 is amended to recite, and Wang teaches, “...all of the surrounding vehicles based on the yielding driving guide signal received by the each autonomous driving controller to move at speeds in directions for securing the driving path of the vehicle in the emergency state”. (see FIG. 1 and paragraphs 53 and 56-60, where each of the autonomous vehicles has a yield planning block 158 that can provide a pre-limiting condition 150 to the behavior planning block 140; see paragraph 62, where, if the contender vehicle and the autonomous vehicle are both stopped, a future prediction of motion is used, and if the second contender vehicle does not move, the autonomous vehicle will move forward; see paragraph 63, where the two vehicles can negotiate forward motion; and see paragraph 94, where the contender can be an ambulance, in which case the vehicle can stop and allow the second vehicle to move unless the ambulance is not moving, in which case the vehicle will move)
It would have been obvious to one of ordinary skill in the art to combine the disclosure of TARIQ with the teachings of WANG (of NVIDIA), with a reasonable expectation of success, since WANG teaches that each controller can include a yield planning block 159. This can allow a fire truck or ambulance to pass by pulling over; if the truck is stopped, the vehicle can advance, or it can negotiate a yielding situation. See paragraphs 59-65 of Wang.
Tariq discloses “...10. The method of claim 9, wherein, in the determining of whether the vehicle is the emergency vehicle, the cloud is configured to conclude that the vehicle is the emergency vehicle when receiving a signal, which proves that the vehicle is the emergency vehicle, from an emergency center”. (see paragraph 19, where the remote operator can determine that there is an emergency and/or provide a signal to move to the yield location)
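For orientation only, the cloud-side flow recited in method claims 9-10 (receive an emergency request, conclude the vehicle is an emergency vehicle upon a confirming signal from an emergency center, then transmit the yielding driving guide signal to the surrounding vehicles) can be sketched as follows. This is the editor's illustration, not Tariq's implementation; all names and the dictionary shape of the guide signal are hypothetical.

```python
# Minimal sketch of the cloud-side flow of method claims 9-10.
# Names and the guide-signal structure are hypothetical.

def handle_emergency_request(request: dict,
                             surrounding_vehicles: list,
                             emergency_center_confirms: bool) -> list:
    """Return (vehicle, guide_signal) pairs to transmit.

    Claim 10: the cloud concludes the requester is an emergency vehicle
    only when a confirming signal is received from an emergency center.
    """
    if not emergency_center_confirms:
        return []  # not verified: no guide signal is transmitted
    # Claim 9: transmit the yielding driving guide signal to the
    # autonomous driving controllers of the surrounding vehicles.
    guide_signal = {"source": request["vehicle_id"], "action": "yield"}
    return [(v, guide_signal) for v in surrounding_vehicles]

sent = handle_emergency_request({"vehicle_id": "EV-1"}, ["A", "B"], True)
print(len(sent))  # one transmission per surrounding vehicle
```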
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim 11 is rejected under 35 U.S.C. § 103 as being unpatentable over United States Patent Pub. No. US 2023/0033315 A1 to Tariq et al. (filed in 2019, prior to the effective filing date of 12-6-21) (hereinafter “Tariq”) in view of Korean Patent Pub. No. KR102414191B1 (filed on 12-1-2021), in view of Kamenev, in view of United States Patent App. Pub. No. US 2021/0253128 A1 to Nister et al. (filed on 2-18-21), and in view of Wang.
Tariq discloses “...11. The method of claim 9, wherein, in the determining of whether the vehicle is the emergency vehicle, the cloud is configured to conclude that the vehicle is the emergency vehicle when receiving an image signal (see paragraphs 19-24, where the remote operator can determine that there is an emergency and/or provide a signal to move to the yield location)
The ’191 publication teaches “...showing that there is an emergency patient in the vehicle”. (see the abstract and paragraphs 1-8, where the doctor can review, via remote-treatment telemedicine, a patient in the vehicle or ambulance)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the ’191 publication with the disclosure of TARIQ, with a reasonable expectation of success, since the ’191 publication teaches that a remote viewer can determine that there is an emergency and a patient in the ambulance based on the telemedicine approach, avoiding exposure to infectious disease via remote monitoring of the ambulance. See paragraphs 1-11, claims 1-4, and the abstract.
Claim 12 is rejected under 35 U.S.C. § 103 as being unpatentable over United States Patent Pub. No. US 2023/0033315 A1 to Tariq et al. (filed in 2019, prior to the effective filing date of 12-6-21) (hereinafter “Tariq”) in view of Kamenev, in view of United States Patent App. Pub. No. US 2021/0253128 A1 to Nister et al. (filed on 2-18-21), and in view of Wang.
Tariq discloses “...12. The method of claim 9, wherein when transmitting the yielding driving guide signal to the surrounding vehicles, the cloud is configured to transmit the yielding driving guide signal, which includes location information of the vehicle, a driving information of the vehicle, a lane change direction of the vehicle, a forward safety direction and a safety road width of the vehicle, a direction and a movement width for yielding driving of the surrounding vehicles, and whether to accelerate or decelerate the surrounding vehicles, to an autonomous driving controller of the surrounding vehicles”. (see element 108 in FIG. 1, where the remote operator and the emergency vehicle can detect and respond via communication and sensors so that the vehicle 102 yields in direction 140) (see FIG. 1, where the vehicle 112 can move from the first location to the second location 128 via object trajectory 122, which can be a second lane or the shoulder of the road 118) (see paragraphs 19-24, 41-44, and 107-108, where, depending on the size, the vehicle can determine to pull over onto the shoulder or merely change lanes)
Claims 13-16 are rejected under 35 U.S.C. § 103 as being unpatentable over United States Patent Pub. No. US 2023/0033315 A1 to Tariq et al. (filed in 2019, prior to the effective filing date of 12-6-21) (hereinafter “Tariq”) in view of U.S. Patent Pub. No. US 2020/0238975 A1 to Mizuno (filed in 2017), in view of Kamenev, in view of Nister, and in view of Wang.
Tariq is silent, but Mizuno teaches “...13. The method of claim 12, wherein the autonomous driving controller is further configured to perform:
determining an available road width for the surrounding vehicles based on the forward safety direction and the safety road width of the vehicle; and determining a yielding driving width for the surrounding vehicles based on the determined available road width for the surrounding vehicles”. (see FIG. 5, where the emergency vehicle is detected and the yielding vehicle then immediately consults the road vehicle, the emergency vehicle, and a target zone) (see FIG. 1, where the vehicle has a width and determines it can fit into the areas PE or PO to yield) (see paragraphs 39-47)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the MIZUNO publication with the disclosure of TARIQ, with a reasonable expectation of success, since the MIZUNO publication teaches that the server can detect the widths of the vehicles and the width of the road. The server can pull over a first vehicle into a target zone where the width is sufficient for the emergency vehicle to pass. However, if there is a second oncoming vehicle and the second width plus the first width of the first vehicle leaves too little room for the emergency vehicle to pass, the remote server can control the second vehicle to stop and prevent blockage of the road, allowing the ambulance to pass. See paragraphs 1-4 and 39-50.
Tariq is silent, but Mizuno teaches “...14. The method of claim 13, wherein the autonomous driving controller is configured to control the surrounding vehicles at a speed and in a direction for securing a driving path of a predetermined direction when the yielding driving width for one surrounding vehicle is greater than a sum of a width of the one surrounding vehicle and a safety width”. (see paragraphs 39-40)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the MIZUNO publication with the disclosure of TARIQ, with a reasonable expectation of success, for the same reasons set forth above with respect to claim 13. See paragraphs 1-4 and 39-50 of Mizuno.
Tariq is silent, but Mizuno teaches “...15. The method of claim 14, wherein the autonomous driving controller is configured to control the vehicle to be decelerated when the yielding driving width for the one surrounding vehicle is equal to or smaller than the sum of the width of the one surrounding vehicle and the safety width”. (see paragraphs 39-40 and claims 1-6)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the MIZUNO publication with the disclosure of TARIQ, with a reasonable expectation of success, for the same reasons set forth above with respect to claim 13. See paragraphs 1-4 and 39-50 of Mizuno.
Tariq is silent, but Mizuno teaches “16. The method of claim 9, further including displaying information of the yielding driving guide signal on a display mounted in the surrounding vehicles”. (see FIG. 2, the indicator to the driver 6, and paragraphs 45-46)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of the MIZUNO publication with the disclosure of TARIQ, with a reasonable expectation of success, for the same reasons set forth above with respect to claim 13. See paragraphs 1-4 and 39-50 of Mizuno.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEAN PAUL CASS whose telephone number is (571)270-1934. The examiner can normally be reached Monday to Friday 7 am to 7 pm; Saturday 10 am to 12 noon.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott A. Browne can be reached on 571-270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JEAN PAUL CASS/Primary Examiner, Art Unit 3668