DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This Office Action is in response to the application filed on 17 December 2025. Claims 1-20 are presently pending and are presented for examination.
Response to Amendments
In response to Applicant’s amendments dated 17 December 2025, Examiner withdraws the previous claim objections; withdraws the previous claim interpretations; withdraws the previous 35 U.S.C. 112(b) rejections; withdraws the previous 35 U.S.C. 112(d) rejections; withdraws the previous 35 U.S.C. 101 rejections; and maintains the previous prior art rejections.
Response to Arguments
Applicant's arguments, see Remarks, filed 17 December 2025, have been fully considered but they are not persuasive.
Applicant argues, see Remarks, pp. 8-9, that US-20210347371-A1 (“Lee”) does not disclose a safety system that is communicatively coupled to at least a CAN bus and a vehicle driving control system, where the safety system identifies an unsafe driving condition of the vehicle and overrides at least one driving input that originates from one of a driver and an autonomous driving system when the unsafe driving condition is identified. Examiner respectfully disagrees. Lee discloses a “second autonomous driving controller” (i.e., safety system) that operates in parallel with a “first autonomous driving controller” (i.e., vehicle driving control system), and discloses that the second autonomous driving controller may sense a “dangerous situation” (i.e., unsafe driving condition) and then override the driving control of the vehicle by implementing driving controls such as braking or lane changing to a road shoulder (see Lee, FIG. 2; para. 0073). For these reasons, Examiner is unpersuaded and maintains the corresponding rejections.
Applicant argues, see Remarks, p. 9, that Lee does not disclose a system that overrides a driver or a driver input. Examiner agrees; however, the claim limitation “override at least one driving input that originates from one of a driver and an autonomous driving system when the unsafe driving condition is identified” is nonetheless disclosed by Lee (see Lee, para. 0073). Per the aforementioned claim limitation, the original driving input may come from either a driver or an autonomous driving system; Lee discloses the second alternative of the “or” clause. For these reasons, Examiner is unpersuaded and maintains the corresponding rejections.
Applicant argues, see Remarks, pp. 9-10, that Lee does not disclose a safety system configured to override at least one driving input. Examiner respectfully disagrees. Lee discloses a “second autonomous driving controller” (i.e., safety system) that operates in parallel with a “first autonomous driving controller” (i.e., vehicle driving control system), and discloses that the second autonomous driving controller may sense a “dangerous situation” (i.e., unsafe driving condition) and then override the driving control of the vehicle by implementing driving controls to enter a Minimum Risk Management (MRM) mode, such as braking or lane changing to a road shoulder (see Lee, FIG. 2; para. 0030 and 0073). For these reasons, Examiner is unpersuaded and maintains the corresponding rejections.
Applicant argues, see Remarks, pp. 11-12, that Lee does not disclose a safety system that operates in parallel with an autonomous driving system. Examiner respectfully disagrees. Lee discloses a “second autonomous driving controller” (i.e., safety system) that operates in parallel with a “first autonomous driving controller” (i.e., vehicle driving control system) (see Lee, FIG. 2; para. 0030 and 0063). The second autonomous driving controller can sense a dangerous situation while the first autonomous driving controller is operating the vehicle and immediately take control (i.e., override) of a vehicle driving control system to place the vehicle into a Minimum Risk Management (MRM) mode, such as braking or lane changing to a road shoulder. For these reasons, Examiner is unpersuaded and maintains the corresponding rejections.
The remaining arguments are substantially the same as those addressed above and are unpersuasive for at least the same reasons. Therefore, Examiner is unpersuaded and maintains the corresponding rejections.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 5, 9-12, 14-17, and 19-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US-20210347371-A1, hereinafter “Lee”.
Regarding claim 1, Lee discloses A system comprising (Lee, para. 0014: “An aspect of the present disclosure provides a method and an apparatus for controlling autonomous driving.”):
a safety system communicatively coupled to at least a controller area network bus (CAN bus) and a vehicle driving control system (Lee, FIG. 2: second autonomous driving controller 240 [i.e., a safety system communicatively coupled to at least a controller area network bus (CAN bus)], second controller 260 [i.e., a vehicle driving control system], second communication line 280 [i.e., CAN bus]; para. 0083: “…the second autonomous driving controller 240 [i.e., a safety system] may be connected to [i.e., communicatively coupled] the steering controller 261 and the acceleration/deceleration controller 262 [i.e., and a vehicle driving control system] through a second communication line 280 [i.e., to at least a controller area network bus (CAN bus)].”; One of ordinary skill in the art, at the time of filing, would have recognized that CAN buses are commonly used, and represent an industry standard, for connecting controllers in a vehicle control system.),
the CAN bus allows a plurality of microcontrollers to communicate therebetween (Lee, FIG. 2: second autonomous driving controller 240, second controller 260, second communication line 280; Note: This limitation recites a definition of a CAN bus.) and
the vehicle driving control system controls at least direction and speed of a vehicle via the CAN bus (Lee, FIG. 2: second controller 260 [i.e., vehicle driving control system], second communication line 280 [i.e., CAN bus]; para. 0084: “To this end, each of the steering controller 261 [i.e., vehicle driving control system controls at least direction…of a vehicle] and the acceleration/deceleration controller 262 [i.e., vehicle driving control system controls at least…speed of a vehicle] may include a first communication port connected to the first communication line 270 and a second communication port connected to the second communication line 280 [i.e., via the CAN bus].”),
wherein the safety system is configured to: identify an unsafe driving condition of the vehicle (Lee, para. 0073: “The second autonomous driving controller 240 [i.e., safety system] may sense a dangerous situation based on the sensing information collected from the front radar 211 and the front camera 212.”);
override at least one driving input that originates from one of a driver and an autonomous driving system when the unsafe driving condition is identified, wherein the at least one driving input was intended to be provided to the vehicle driving control system to control the vehicle (Lee, para. 0073: “The second autonomous driving controller 240 may sense a dangerous situation based on the sensing information collected from the front radar 211 and the front camera 212. The second autonomous driving controller 240 may perform control operations [i.e., at least one driving input was intended to be provided to the vehicle driving control system to control the vehicle], such as emergency braking or a stop on the shoulder after changing a lane [i.e., override at least one driving input that originates from one of a driver and an autonomous driving system], when sensing the dangerous situation [i.e., when the unsafe driving condition is identified].”; para. 0071: “The second autonomous driving controller 240 may calculate a required steering value for lane keeping control and a deceleration/acceleration value, which is to prevent the collision with the front vehicle, regardless of activating the autonomous driving function.”); and
provide at least one new input to the vehicle driving control system to control the vehicle (Lee, para. 0073: “The second autonomous driving controller 240 may perform control operations, such as emergency braking [i.e., at least one new input] or a stop on the shoulder after changing a lane [i.e., at least one new input], when sensing the dangerous situation.”),
wherein the at least one new input causes the vehicle driving control system to change at least one of the direction and the speed of the vehicle (Lee, para. 0073: “The second autonomous driving controller 240 may perform control operations, such as emergency braking [i.e., causes the vehicle driving control system to change at least one of…the speed of the vehicle] or a stop on the shoulder after changing a lane [i.e., causes the vehicle driving control system to change at least one of the direction…of the vehicle], when sensing the dangerous situation.”).
Regarding claim 9, Lee discloses An apparatus comprising: at least one non-transitory computer readable storage medium couplable to a vehicle, the at least one non-transitory computer readable storage medium having instructions stored thereon to (Lee, para. 0114: “The operations of the methods or algorithms described in connection with the processor forms disclosed in the present disclosure may be directly implemented with a hardware module, a software module, or the combinations thereof, executed by the processor. The software module may reside on a storage medium (that a memory and/or a storage), such as a RAM, a flash memory, a ROM, an erasable and programmable ROM (EPROM), an electrically EPROM (EEPROM), a register, a hard disc, a removable disc, or a compact disc-ROM (CD-ROM).”; para. 0015: “…the present disclosure provides a method for controlling autonomous driving, capable of providing MRM in a dangerous situation during the autonomous driving, and an apparatus for the same.”):
gather sensor data from a plurality of sensors, wherein at least one sensor of the plurality of sensors is configured to receive input regarding an outside environment in which the vehicle is operating (Lee, FIG. 2: sensor 210, sensors 211-212, second autonomous driving controller 240; para. 0061: “The sensor 210 includes a front radar 211 [i.e., at least one sensor of the plurality of sensors is configured to receive input regarding an outside environment in which the vehicle is operating], a front camera 212 [i.e., at least one sensor of the plurality of sensors is configured to receive input regarding an outside environment in which the vehicle is operating], various vehicle sensors 213, a LiDAR 214, a side radar 215, a side camera 216, a rear camera 217, a global positioning system (GPS) 218 and a precision map providing device 219.”; para. 0067: “Meanwhile, the second autonomous driving controller 240 may be configured to include a second cognition device 241 to generate the information on the line and the information on the vehicle driving in front, based on the sensing information received from a front camera 212 and a front radar 211 [i.e., gather sensor data from a plurality of sensors]…”);
identify an unsafe driving condition of the vehicle, wherein identification of the unsafe driving condition is based at least in part on the gathered sensor data (Lee, para. 0069: “The second autonomous driving controller 240 may use the line information recognized through the front camera 212 [i.e., based at least in part on the gathered sensor data] to control a vehicle lateral behavior, and may use sensing information from the front radar 211 [i.e., based at least in part on the vehicle sensor data] to calculate acceleration/deceleration for preventing the collision with the front vehicle [i.e., identify an unsafe driving condition of the vehicle] in the vehicle deceleration.”; para. 0073: “The second autonomous driving controller 240 may sense a dangerous situation [i.e., identify an unsafe driving condition of the vehicle] based on the sensing information collected from the front radar 211 and the front camera 212 [i.e., identification of the unsafe driving condition is based at least in part on the gathered sensor data]. The second autonomous driving controller 240 may perform control operations, such as emergency braking or a stop on the shoulder after changing a lane, when sensing the dangerous situation.”);
override at least one driving input that originates from one of a driver and an autonomous vehicle (AV) driving system when the unsafe driving condition is identified, wherein the at least one driving input was intended to be provided to a driving control system to control the vehicle (Lee, para. 0063: “When a system fault occurs while autonomous driving is performed through the first autonomous driving controller 230 [i.e., at least one driving input that originates from one of a driver and an autonomous vehicle (AV) driving system], the control authority over the autonomous driving may be automatically switched over from the first autonomous driving controller 230 to the second autonomous driving controller 240 to maintain the autonomous driving.”; para. 0069: “The second autonomous driving controller 240 may use the line information recognized through the front camera 212 to control a vehicle lateral behavior, and may use sensing information from the front radar 211 to calculate acceleration/deceleration for preventing the collision with the front vehicle in the vehicle deceleration [i.e., the at least one driving input was intended to be provided to a driving control system to control the vehicle].”; para. 0071: “The second autonomous driving controller 240 may calculate a required steering value for lane keeping control and a deceleration/acceleration value, which is to prevent the collision with the front vehicle [i.e., override at least one driving input], regardless of activating the autonomous driving function.”; para. 0073: “The second autonomous driving controller 240 may sense a dangerous situation based on the sensing information collected from the front radar 211 and the front camera 212. The second autonomous driving controller 240 may perform control operations, such as emergency braking or a stop on the shoulder after changing a lane, when sensing the dangerous situation.”); and
provide at least one new input to the driving control system to control the vehicle after the unsafe driving condition is identified, wherein the at least one new input causes the driving control system to change at least one of the direction and the speed of the vehicle (Lee, para. 0063: “When a system fault occurs while autonomous driving is performed through the first autonomous driving controller 230, the control authority over the autonomous driving may be automatically switched over from the first autonomous driving controller 230 to the second autonomous driving controller 240 to maintain the autonomous driving.”; para. 0071: “The second autonomous driving controller 240 may calculate a required steering value for lane keeping control and a deceleration/acceleration value, which is to prevent the collision with the front vehicle, regardless of activating the autonomous driving function.”).
Regarding claim 2, and analogous claim 10, Lee discloses The system of claim 1
wherein the safety system comprises a localization module configured to receive sensor output from a plurality of sensors of the vehicle to determine a location of the vehicle via one or more processors (Lee, para. 0034: “The sensor may include a front camera and a front radar, and the second autonomous driving controller [i.e., via one or more processors] may include a second cognition device [i.e., safety system comprises a localization module] to generate the information on the line and the information on the vehicle driving in front, based on the sensing information received from the front camera and the front radar [i.e., receive sensor output from a plurality of sensors of the vehicle to determine a location of the vehicle]…”), and
wherein the at least one driving input originates from the autonomous driving system (Lee, para. 0063: “When a system fault occurs while autonomous driving is performed through the first autonomous driving controller 230 [i.e., at least one driving input originates from the autonomous driving system], the control authority over the autonomous driving may be automatically switched over from the first autonomous driving controller 230 to the second autonomous driving controller 240 to maintain the autonomous driving.”; para. 0064: “When an autonomous driving controller is automatically switched over, the autonomous driving vehicle enters an MRM mode to perform deceleration.”), and
wherein the safety system operates in parallel to the autonomous driving system (Lee, FIG. 2: second autonomous driving controller (i.e., the safety system), first autonomous driving controller (i.e., autonomous driving system); para. 0030: “…a first controller including a first autonomous driving controller [i.e., autonomous driving system] and an second autonomous driving controller [i.e., the safety system], which are provided in a dual structure [i.e., safety system operates in parallel to the autonomous driving system], to control autonomous driving based on the sensing information received from the sensor…”; para. 0063: “When a system fault occurs while autonomous driving is performed through the first autonomous driving controller 230, the control authority over the autonomous driving may be automatically switched over from the first autonomous driving controller 230 to the second autonomous driving controller 240 to maintain the autonomous driving [i.e., safety system operates in parallel to the autonomous driving system].”).
Regarding claim 3, Lee discloses The system of claim 2
wherein the localization module is distinct from an autonomous vehicle (AV) localization module (Lee, para. 0034: “…the first autonomous driving controller may include a precision positioning device [i.e., an autonomous vehicle (AV) localization module] to generate information on a present position of the autonomous driving vehicle, based on the sensing information, a first cognition device to generate information on a line and information on a vehicle driving in front…”).
Regarding claim 5, and analogous claim 11, Lee discloses The system of claim 2
wherein the safety system further comprises a world model module configured to: receive the location of the vehicle from the localization module (Lee, para. 0067: “…a second controller 242 [i.e., safety system further comprises a world model module] to calculate the required command value based on the information on the line and the information on the vehicle driving in front, which are generated from the second cognition device 241 [i.e., receive the location of the vehicle from the localization module]…”);
receive sensor data from at least some of the plurality of sensors (Lee, para. 0067: “Meanwhile, the second autonomous driving controller 240 may be configured to include a second cognition device 241 to generate the information on the line and the information on the vehicle driving in front, based on the sensing information received from a front camera 212 and a front radar 211 [i.e., from at least some of the plurality of sensors], and a second controller 242 to calculate the required command value based on the information on the line and the information on the vehicle driving in front [i.e., receive sensor data], which are generated from the second cognition device 241…”);
determine vehicle proximity to objects around the vehicle based on the location and the sensor data (Lee, para. 0067: “Meanwhile, the second autonomous driving controller 240 may be configured to include a second cognition device 241 to generate the information on the line and the information on the vehicle driving in front, based on the sensing information received from a front camera 212 and a front radar 211, and a second controller 242 to calculate the required command value based on the information on the line and the information on the vehicle driving in front [i.e., determine vehicle proximity to objects around the vehicle based on the location and the sensor data], which are generated from the second cognition device 241…”);
receive driving commands (Lee, para. 0067: “Meanwhile, the second autonomous driving controller 240 may be configured to include a second cognition device 241 to generate the information on the line and the information on the vehicle driving in front, based on the sensing information received from a front camera 212 and a front radar 211, and a second controller 242 to calculate the required command value [i.e., receive driving commands] based on the information on the line and the information on the vehicle driving in front, which are generated from the second cognition device 241…”); and
predict, via at least one of the one or more processors, future paths of the vehicle based at least in part on the driving commands, the sensor data, and the vehicle proximity to the objects around the vehicle (Lee, para. 0067: “Meanwhile, the second autonomous driving controller 240 [i.e., via at least one of the one or more processors] may be configured to include a second cognition device 241 to generate the information on the line and the information on the vehicle driving in front, based on the sensing information received from a front camera 212 and a front radar 211, and a second controller 242 to calculate the required command value based on the information on the line and the information on the vehicle driving in front [i.e., based at least in part on the driving commands, the sensor data, and the vehicle proximity to the objects around the vehicle], which are generated from the second cognition device 241 and to perform the controlling of the deceleration based on the calculated required command value. In this case, the second controller 242 may perform the controlling of the deceleration in the state that a lane is maintained (in a lane keeping state).”; para. 0069: “The second autonomous driving controller 240 may use the line information recognized through the front camera 212 to control a vehicle lateral behavior, and may use sensing information from the front radar 211 to calculate acceleration/deceleration for preventing the collision with the front vehicle in the vehicle deceleration [i.e., predict future paths of the vehicle based at least in part on the driving commands].”).
Regarding claim 12, Lee discloses The apparatus of claim 11,
the at least one non-transitory computer readable storage medium having further instructions stored thereon to identify a failure of at least one software module controlling the AV driving system, wherein the failure of the at least one software module controlling the AV driving system creates the unsafe driving condition (Lee, para. 0063: “When a system fault occurs while autonomous driving is performed [i.e., failure of the at least one software module controlling the AV driving system creates the unsafe driving condition] through the first autonomous driving controller 230 [i.e., identify a failure of at least one software module controlling the AV driving system], the control authority over the autonomous driving may be automatically switched over from the first autonomous driving controller 230 to the second autonomous driving controller 240 to maintain the autonomous driving.”).
Regarding claim 14, Lee discloses A method comprising (Lee, para. 0014: “An aspect of the present disclosure provides a method and an apparatus for controlling autonomous driving.”):
identifying, in part via at least one processor, an unsafe driving condition of a vehicle (Lee, para. 0078: “For example, the system fault may be a controller fault occurring in the first autonomous driving controller 230 [i.e., an unsafe driving condition of a vehicle]. The second controller 260 may perform the controlling of a dual (or switchover) control to switch over the control authority over the autonomous driving from the first autonomous driving controller 230 to the second autonomous driving controller 240 when the fault of the first autonomous driving controller 230 occurs [i.e., identifying, in part via at least one processor].”);
overriding, in part via the at least one processor, at least one driving input that originates from one of a driver and an autonomous vehicle (AV) driving system when the unsafe driving condition is identified, wherein the at least one driving input was intended to be provided to a driving control system to control the vehicle (Lee, para. 0073: “The second autonomous driving controller 240 may sense a dangerous situation based on the sensing information collected from the front radar 211 and the front camera 212. The second autonomous driving controller 240 may perform control operations, such as emergency braking or a stop on the shoulder after changing a lane, when sensing the dangerous situation.”; para. 0078: “For example, the system fault may be a controller fault occurring in the first autonomous driving controller 230 [i.e., the at least one driving input was intended to be provided to a driving control system to control the vehicle]. The second controller 260 [i.e., at least one processor] may perform the controlling of a dual (or switchover) control to switch over the control authority over the autonomous driving from the first autonomous driving controller 230 [i.e., at least one driving input that originates from one of a driver and an autonomous vehicle (AV) driving system] to the second autonomous driving controller 240 [i.e., overriding, in part via the at least one processor] when the fault of the first autonomous driving controller 230 occurs [i.e., when the unsafe driving condition is identified].”); and
providing, via a controller area network bus (CAN bus), at least one new input to the driving control system to control the vehicle, wherein the at least one new input causes the driving control system to change at least one of the direction and the speed of the vehicle (Lee, FIG. 2: second communication line 280 [i.e., CAN bus], second controller 260 [i.e., driving control system]; para. 0081: “The first controller 220 (in detail, the second autonomous driving controller 240) may enter an MRM mode to adaptively control [i.e., providing, via a controller area network bus (CAN bus), at least one new input to the driving control system to control the vehicle] a deceleration degree to prevent collision with a front vehicle in the lane keeping state [i.e., the at least one new input causes the driving control system to change at least one of the direction and the speed of the vehicle].”).
Regarding claim 15, Lee discloses The method of claim 14 further comprising
receiving sensor output from a plurality of sensors of the vehicle, wherein the sensor output represents a driving environment around the vehicle (Lee, para. 0073: “The second autonomous driving controller 240 may sense a dangerous situation based on the sensing information collected from the front radar 211 and the front camera 212. The second autonomous driving controller 240 may perform control operations, such as emergency braking or a stop on the shoulder after changing a lane, when sensing the dangerous situation.”),
wherein identifying the unsafe driving condition of the vehicle is carried out in parallel with operations of the AV driving system (Lee, FIG. 2: second autonomous driving controller 240, first autonomous driving controller 230; para. 0063: “When a system fault occurs while autonomous driving is performed through the first autonomous driving controller 230 [i.e., operations of the AV driving system], the control authority over the autonomous driving may be automatically switched over from the first autonomous driving controller 230 to the second autonomous driving controller 240 to maintain the autonomous driving [i.e., carried out in parallel with operations of the AV driving system].”; para. 0073: “The second autonomous driving controller 240 may sense a dangerous situation based on the sensing information collected from the front radar 211 and the front camera 212 [i.e., identifying the unsafe driving condition of the vehicle]. The second autonomous driving controller 240 may perform control operations, such as emergency braking or a stop on the shoulder after changing a lane [i.e., in parallel with operations of the AV driving system], when sensing the dangerous situation.”).
Regarding claim 16, Lee discloses The method of claim 15 further comprising
determining a localization of the vehicle based at least in part on the sensor output, wherein the localization includes proximity of the vehicle to one or more other objects in the driving environment (Lee, para. 0069: “The second autonomous driving controller 240 may use the line information recognized through the front camera 212 [i.e., determining a localization of the vehicle based at least in part on the sensor output] to control a vehicle lateral behavior, and may use sensing information from the front radar 211 to calculate acceleration/deceleration for preventing the collision with the front vehicle [i.e., localization includes proximity of the vehicle to one or more other objects in the driving environment] in the vehicle deceleration.”), and
wherein the determination of the localization is independent from a localization determination carried out by an autonomous vehicle drive system (Lee, para. 0066: “For example, the first autonomous driving controller 230 [i.e., an autonomous vehicle drive system] may be configured to include a precision positioning device 231 to generate information on a present position of the autonomous driving vehicle [i.e., a localization determination carried out by an autonomous vehicle drive system], based on the sensing information collected from the sensor 210, a first cognition device 232 to generate information on a line and information on a vehicle driving in front, based on the sensing information, a determination device 233 to determine whether a dangerous situation occurs, based on the information generated from the precision positioning device 231 and the first cognition device 232…”; para. 0067: “Meanwhile, the second autonomous driving controller 240 may be configured to include a second cognition device 241 to generate the information on the line and the information on the vehicle driving in front [i.e., the determination of the localization is independent from], based on the sensing information received from a front camera 212 and a front radar 211…”).
Regarding claim 17, Lee discloses The method of claim 16
wherein identifying the unsafe driving condition includes receiving the at least one driving input from the AV driving system, wherein identifying the unsafe driving condition is based in part on the at least one driving input (Lee, para. 0067: “Meanwhile, the second autonomous driving controller 240 may be configured to include a second cognition device 241 to generate the information on the line and the information on the vehicle driving in front, based on the sensing information received from a front camera 212 and a front radar 211, and a second controller 242 to calculate the required command value based on the information on the line and the information on the vehicle driving in front, which are generated from the second cognition device 241 and to perform the controlling of the deceleration based on the calculated required command value [i.e., wherein identifying the unsafe driving condition is based in part on the at least one driving input]. In this case, the second controller 242 may perform the controlling of the deceleration in the state that a lane is maintained (in a lane keeping state) [i.e., identifying the unsafe driving condition includes receiving the at least one driving input from the AV driving system].”).
Regarding claim 19, Lee discloses The method of claim 17,
further comprising: monitoring the AV driving system; and determining if the AV driving system is faulty (Lee, para. 0063: “When a system fault occurs while autonomous driving is performed through the first autonomous driving controller 230, the control authority over the autonomous driving may be automatically switched over from the first autonomous driving controller 230 to the second autonomous driving controller 240 to maintain the autonomous driving.”; para. 0073: “The second autonomous driving controller 240 may sense a dangerous situation based on the sensing information collected from the front radar 211 and the front camera 212. The second autonomous driving controller 240 may perform control operations, such as emergency braking or a stop on the shoulder after changing a lane, when sensing the dangerous situation.”).
Regarding claim 20, Lee discloses The method of claim 15,
further comprising predicting a potential future path of the vehicle based in part on the sensor output and the at least one driving input (Lee, para. 0067: “Meanwhile, the second autonomous driving controller 240 may be configured to include a second cognition device 241 to generate the information on the line and the information on the vehicle driving in front, based on the sensing information received from a front camera 212 and a front radar 211, and a second controller 242 to calculate the required command value based on the information on the line and the information on the vehicle driving in front, which are generated from the second cognition device 241 and to perform the controlling of the deceleration based on the calculated required command value. In this case, the second controller 242 may perform the controlling of the deceleration in the state that a lane is maintained (in a lane keeping state) [i.e., predicting a potential future path of the vehicle based in part on the sensor output and the at least one driving input].”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 4, 6-8, 13, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Lee, as applied to claims 1, 9, and 14 above, and further in view of US-20210163021-A1, hereinafter “Frazzoli”.
Regarding claim 4, Lee discloses The system of claim 3, but does not appear to explicitly disclose the following:
wherein the safety system further comprises a safety aggregator (SA) module configured to: carry out, via at least one of the one or more processors, the identification of the unsafe driving condition of the vehicle, wherein the unsafe driving condition is based on a failure of at least one sensor of the plurality of sensors; and cause the at least one driving input to be overridden based upon the identification of the unsafe driving condition.
However, in the same field of endeavor, Frazzoli teaches:
wherein the safety system further comprises a safety aggregator (SA) module configured to (Frazzoli, para. 0334: “FIG. 32 shows an example of a sensor-related architecture of an autonomous vehicle 3205 (e.g., the AV 100 shown in FIG. 1) for detecting and handling sensor failure. The autonomous vehicle 3205 includes first sensor 3210 a, first buffer 3215 a, first multiplexer 3225 a, second sensor 3210 b, second buffer 3215 b, second multiplexer 3225 b, first transformer 3220 a, second transformer 3220 b, anomaly detector 3240 [i.e., a safety aggregator (SA) module], sensor selector 3235, and autonomous vehicle processor 3250. Various examples of sensors 3210 a-b include LiDAR, RADAR, camera, radio frequency (RF), ultrasound, infrared, and ultraviolet. Other types of sensors are possible. While two sensors are shown, the autonomous vehicle 3205 can use any number of sensors.”):
carry out, via at least one of the one or more processors, the identification of the unsafe driving condition of the vehicle, wherein the unsafe driving condition is based on a failure of at least one sensor of the plurality of sensors (Frazzoli, para. 0334: “FIG. 32 shows an example of a sensor-related architecture of an autonomous vehicle 3205 (e.g., the AV 100 shown in FIG. 1) for detecting and handling sensor failure [i.e., unsafe driving condition is based on a failure of at least one sensor].”; para. 0335: “In an embodiment, the sensors 3210 a-b [i.e., the plurality of sensors] are configured to produce respective sensor data streams from one or more environmental inputs such as objects, weather conditions, or road conditions external to the autonomous vehicle 3205 while the autonomous vehicle is in an operational driving state.”; para. 0336: “…the processor 3250 [i.e., via at least one of the one or more processors] is communicatively coupled with the sensors 3210 a-b via buffers 3215 a-b and multiplexers 3225 a-b.”; para. 0337: “…the anomaly detector 3240 is configured to detect an abnormal condition [i.e., identification of the unsafe driving condition of the vehicle] based on a difference between the sensor data streams being produced by respective sensors 3210 a-b.”); and
cause the at least one driving input to be overridden based upon the identification of the unsafe driving condition (Frazzoli, para. 0339: “The sensors 3210 a-b, for example, captures video of the road ahead of the autonomous vehicle 3205 at different angles such as from the left and right sides of the autonomous vehicle 3205. In one implementation, if the right-side sensor 3210 b fails [i.e., based upon the identification of the unsafe driving condition], then transformer 3220 b performs an affine transformation of the stream being produced by the left side sensor 3210 a to generate a replacement version of the stream [i.e., at least one driving input to be overridden] that was being produced by the right-side sensor 3210 b. As such, a video processing routine running on processor 3250 that is expecting two different camera angles can continue to function by using the replacement stream.”).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success, to modify the invention disclosed by Lee with the concept of implementing an autonomous vehicle safety system that treats sensor failure as an unsafe condition and overrides at least one driving input with redundant, functioning input, as taught by Frazzoli, in order to maintain safe operation of the autonomous vehicle, which relies on sensors to function properly and safely (Frazzoli, para. 0013: “Particular aspects of the foregoing techniques can provide one or more of the following advantages. Detecting and handling sensor failures are important in maintaining the safe and proper operation of an autonomous vehicle.”).
Regarding claim 6, Lee and Frazzoli teach The system of claim 4 and Lee further discloses the following:
wherein the SA module is further configured to identify failure of at least one software module implementing vehicle autonomy (Lee, para. 0063: “When a system fault occurs while autonomous driving is performed through the first autonomous driving controller 230 [i.e., at least one software module implementing vehicle autonomy], the control authority over the autonomous driving may be automatically switched over from the first autonomous driving controller 230 to the second autonomous driving controller 240 to maintain the autonomous driving.”).
Regarding claim 7, Lee and Frazzoli teach The system of claim 4 and Lee further discloses the following:
wherein the SA module is further configured to receive data from the plurality of sensors to identify the unsafe driving condition (Lee, para. 0069: “The second autonomous driving controller 240 [i.e., the SA module] may use the line information recognized through the front camera 212 to control a vehicle lateral behavior, and may use sensing information from the front radar 211 to calculate acceleration/deceleration for preventing the collision with the front vehicle in the vehicle deceleration [i.e., receive data from the plurality of sensors to identify the unsafe driving condition].”).
Regarding claim 8, Lee and Frazzoli teach The system of claim 4 and Frazzoli further teaches the following:
wherein the plurality of sensors includes at least a first sensor and a second sensor (Frazzoli, para. 0337: “…the anomaly detector 3240 is configured to detect an abnormal condition based on a difference between the sensor data streams being produced by respective sensors 3210 a-b [i.e., plurality of sensors includes at least a first sensor and a second sensor].”), and
wherein the safety system is further configured to compare inputs from the first sensor to inputs from at least the second sensor to determine whether one of the plurality of sensors is malfunctioning (Frazzoli, para. 0337: “…the anomaly detector 3240 [i.e., the safety system] is configured to detect an abnormal condition based on a difference between the sensor data streams being produced by respective sensors 3210 a-b [i.e., compare inputs from the first sensor to inputs from at least the second sensor to determine whether one of the plurality of sensors is malfunctioning].”).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success, to modify the invention disclosed by Lee, as modified by Frazzoli, with the concept of determining whether one of the sensors of an autonomous vehicle has malfunctioned or failed by comparing its sensor data to that of another sensor, as taught by Frazzoli, in order to verify the accuracy of the sensor output data, and thus maintain safe operation of the autonomous vehicle, which relies on sensors to function properly and safely (Frazzoli, para. 0013: “Particular aspects of the foregoing techniques can provide one or more of the following advantages. Detecting and handling sensor failures are important in maintaining the safe and proper operation of an autonomous vehicle.”).
Regarding claim 13, Lee discloses The apparatus of claim 9, but does not appear to explicitly disclose the following:
wherein the plurality of sensors includes at least a first sensor and a second sensor, the at least one non-transitory computer readable storage medium having further instructions stored thereon to compare inputs from the first sensor to inputs from at least the second sensor to determine whether one of the plurality of sensors is malfunctioning.
However, in the same field of endeavor, Frazzoli teaches:
wherein the plurality of sensors includes at least a first sensor and a second sensor (Frazzoli, para. 0337: “…the anomaly detector 3240 is configured to detect an abnormal condition based on a difference between the sensor data streams being produced by respective sensors 3210 a-b [i.e., plurality of sensors includes at least a first sensor and a second sensor].”),
the at least one non-transitory computer readable storage medium having further instructions stored thereon to compare inputs from the first sensor to inputs from at least the second sensor to determine whether one of the plurality of sensors is malfunctioning (Frazzoli, para. 0171: “A control module operates in accordance with a controller 1102 which includes, for example, one or more processors (e.g., one or more computer processors such as microprocessors or microcontrollers or both), short-term and/or long-term data storage (e.g., memory random-access memory or flash memory or both), and instructions stored in memory [i.e., at least one computer readable storage medium having further instructions stored thereon] that carry out operations of the controller 1102 when the instructions are executed (e.g., by the one or more processors).”; para. 0337: “…the anomaly detector 3240 is configured to detect an abnormal condition based on a difference between the sensor data streams being produced by respective sensors 3210 a-b [i.e., compare inputs from the first sensor to inputs from at least the second sensor to determine whether one of the plurality of sensors is malfunctioning].”).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success, to modify the invention disclosed by Lee with the concept of determining whether one of the sensors of an autonomous vehicle has malfunctioned or failed by comparing its sensor data to that of another sensor, as taught by Frazzoli, in order to verify the accuracy of the sensor output data, and thus maintain safe operation of the autonomous vehicle, which relies on sensors to function properly and safely (Frazzoli, para. 0013: “Particular aspects of the foregoing techniques can provide one or more of the following advantages. Detecting and handling sensor failures are important in maintaining the safe and proper operation of an autonomous vehicle.”).
Regarding claim 18, Lee discloses The method of claim 16, but does not appear to explicitly disclose the following:
further comprising comparing a first sensor output to at least a second sensor output to determine if one of a first sensor and a second sensor is defective, wherein the first and second sensors are part of the plurality of sensors.
However, in the same field of endeavor, Frazzoli teaches:
further comprising comparing a first sensor output to at least a second sensor output to determine if one of a first sensor and a second sensor is defective, wherein the first and second sensors are part of the plurality of sensors (Frazzoli, para. 0334: “Various examples of sensors 3210 a-b include LiDAR, RADAR, camera, radio frequency (RF), ultrasound, infrared, and ultraviolet. Other types of sensors are possible. While two sensors are shown, the autonomous vehicle 3205 can use any number of sensors [i.e., the first and second sensors are part of the plurality of sensors].”; para. 0337: “…the anomaly detector 3240 is configured to detect an abnormal condition based on a difference between the sensor data streams being produced by respective sensors 3210 a-b [i.e., comparing a first sensor output to at least a second sensor output to determine if one of a first sensor and a second sensor is defective].”).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention and with a reasonable likelihood of success, to modify the invention disclosed by Lee with the concept of determining whether one of the sensors of an autonomous vehicle has malfunctioned or failed by comparing its sensor data to that of another sensor, as taught by Frazzoli, in order to verify the accuracy of the sensor output data, and thus maintain safe operation of the autonomous vehicle, which relies on sensors to function properly and safely (Frazzoli, para. 0013: “Particular aspects of the foregoing techniques can provide one or more of the following advantages. Detecting and handling sensor failures are important in maintaining the safe and proper operation of an autonomous vehicle.”).
Additional Relevant Art
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US-20200017114-A1 (2020-01-16) | Para. 0049: "…compute subsystem 705 may be configured to perform automated driving tasks, while the safety companion subsystem 710 is tasked with monitoring and potentially correcting or at least mitigating malfunctions of the compute subsystem 705 detected by the safety companion subsystem 710." Relevant to claims 1, 9, and 14.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Leah N Miller whose telephone number is (703)756-1933. The examiner can normally be reached M-Th 8:30am - 5:30pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Flynn can be reached at (571) 272-9855. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/L.N.M./Examiner, Art Unit 3663
/ABBY J FLYNN/Supervisory Patent Examiner, Art Unit 3663