DETAILED ACTION
This Office action is in response to an amendment filed 3/3/2026, wherein claims 1-23 are pending and being examined. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Claim 19 has been amended to correct an indefiniteness issue. Therefore the rejection of claim 19 under 35 U.S.C. 112(b) is withdrawn.
Applicant’s arguments with respect to claims 1-23 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-6, 8-11, 13, 14, and 20-22 are rejected under 35 U.S.C. 103 as being unpatentable over Van Dan Elzen (US 2021/0044774) in view of Nix (US 2017/0113664).
In regard to claim 1, Van Dan Elzen discloses a vehicular vision system [¶0012; vehicle vision system and/or driver assist system], the vehicular vision system comprising:
a camera disposed at a vehicle equipped with the vehicular vision system and viewing exterior of the equipped vehicle, wherein the camera is operable to capture image data [Fig.1; camera (12). ¶0013; vehicle 10 includes an imaging system or vision system that includes a forward viewing camera module 12 that is disposed at and views through the windshield 14 of the vehicle and captures image data of the scene exterior and forward of the vehicle];
wherein the camera comprises a CMOS imaging array [¶0004; cameras (preferably one or more CMOS cameras)], and wherein the CMOS imaging array comprises at least one million photosensors arranged in rows and columns [¶0081; two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array (such as 1 to 2 MP) or the like)];
an electronic control unit (ECU) comprising electronic circuitry and associated software [¶0013; vision system includes a control or electronic control unit (ECU) or processor. ¶0080-¶0081];
wherein the electronic circuitry of the ECU comprises an image processor operable to process image data captured by the camera and transferred to the ECU [¶0013; process image data captured by the camera or cameras. ¶0080-¶0081; logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data];
wherein the vehicular vision system, responsive to processing by the image processor of image data captured by the camera [¶0012; object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle. ¶0080], saves the image data to a circular buffer, and wherein the circular buffer comprises volatile memory [¶0017; advanced driver assist system (ADAS) front camera that is already designed into the vehicle (such as for use in automatic or intelligent headlamp control, lane keep assist, lane departure warning, traffic sign recognition and/or the like) and can write endlessly to RAM. Claim 1; image data captured by the imaging array of the camera module and temporarily saved in volatile memory prior to the occurrence of the emergency event];
wherein the vehicular vision system, responsive to a trigger condition indicating a deficiency in an advanced driving assistance feature of the equipped vehicle [¶0030; trigger event can be … automatic emergency braking (AEB) system command by the forward camera module (FCM), or even an airbag deployment], transmits via wireless communication the saved image data [¶0027-¶0028; record video images or image data (and optionally other vehicle information or the like) to record an incident or event… recorded data can be communicated via a wireless communication. ¶0016-¶0017]; and
wherein the vehicular vision system, responsive to a user input [¶0030-¶0031; trigger event can be a user request (such as via a user input or button or switch or a user voice request or the like)], copies the saved image data from the circular buffer to non-volatile memory disposed within the equipped vehicle [¶0048; If the module receives a trigger to save video, all the video in the circular buffer, as well as a small amount of post-trigger video, will be written to a time/date coded file in flash memory. Claim 16; non-volatile memory is part of a recording device of the equipped vehicle].
Van Dan Elzen does not explicitly disclose wherein the vehicular vision system, responsive to a trigger condition indicating a deficiency in an advanced driving assistance feature of the equipped vehicle, transmits via wireless communication the saved image data from the circular buffer to a remote server located remote from the equipped vehicle, and wherein the deficiency comprises detecting a manual driving maneuver by a driver of the equipped vehicle that exceeds a threshold while the advanced driving assistance feature fails to activate. However, Nix discloses,
wherein the vehicular vision system [¶0070; Machine vision cameras 417 may capture images from the environment outside of a vehicle], responsive to a trigger condition indicating a deficiency in an advanced driving assistance feature of the equipped vehicle [¶0062; ADAS analytics module 340 may comprise an event detector 342… surprising event module 344 may identify various types of surprises, e.g., sensor inconsistencies that indicate potential sensor problems. ¶0031; defects that have been identified may be communicated to the related parties. ¶0041; collected data may be transmitted to another device… based on the context (e.g., altering operation of an ADAS system). ¶0107-¶0109; Event analysis module 612 may further include vehicle defect identification module 61], transmits via wireless communication [¶0028; wireless communication channel] the saved image data from the circular buffer to a remote server located remote from the equipped vehicle [¶0027-¶0028; When a surprising event has been detected… compiled data may include video data from a video camera 116… Data snapshots stored in the event data buffer 113 may be uploaded to a cloud server 120. ¶0064-¶0065; Responsive to an indication of a surprising event, data stored in event data buffer 346 from before and/or after the surprising event may be compiled by event data file generator 348… event data file may then be uploaded to analytics cloud server 334 via extra-vehicle communication system 324. ¶0096-¶0097; event data file comprises video (e.g., compressed video) and/or still images derived from a rear-facing camera], and wherein the deficiency comprises detecting a manual driving maneuver by a driver of the equipped vehicle that exceeds a threshold while the advanced driving assistance feature fails to activate [¶0094, Table 2; surprising event… manual override AND deceleration > .5 AND rear object detection == null. 
¶0096; a manual override event may be indicated responsive to sudden deceleration of the host vehicle when traveling in reverse, and when no object has been detected in the rearward travel path of the host vehicle. ¶0132; surprising event detection system including an event detector comprising a set of rules defining a surprising event… combination of at least two input signals additionally or alternatively includes at least one signal reflecting a vehicle operator input and at least one signal from the object detection sensor].
Van Dan Elzen discloses a vehicular vision system including one or more cameras mounted on a vehicle that capture images of the vehicle's surroundings. The captured images are temporarily saved prior to the occurrence of an emergency event. In the event of an emergency or as selected by user input, the system saves camera images to memory or transmits camera images via wireless communication. Van Dan Elzen discloses that recorded images may be transmitted to a server in at least ¶0027-¶0029.
Nix discloses a vehicle system wherein cameras and sensors capture data of the vehicle's surroundings, the data is stored temporarily in buffers, and, depending on event conditions, the stored data is selectively uploaded to a server. As shown in Table 2 and as described in ¶0096, for example, when a manual deceleration event exceeding a certain threshold level is detected ("a manual driving maneuver… that exceeds a threshold") and a rear object detection function of the vehicle fails to indicate an object ("the advanced driving assistance feature fails to activate"), it is determined that there is likely a deficiency in the rear object detection feature of the vehicle. This indicates a surprising event, which results in video and sensor data being transmitted from a buffer to a server.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the system disclosed by Van Dan Elzen with the deficiency determination and event data as disclosed by Nix in order to guide future development of ADAS and sensor improvement for vehicles [Nix ¶0004-¶0007, ¶0031, ¶0100-¶0101, ¶0111]. As disclosed by Nix, combining manual operating conditions and sensed ADAS conditions allows for a system to continuously improve advanced driving assistance systems for vehicles.
In regard to claim 2, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Van Dan Elzen in view of Nix further discloses,
wherein the trigger condition comprises determining the deficiency in the advanced driving assistance feature of the equipped vehicle via processing, by the vehicular vision system, sensor data from a sensor of the equipped vehicle, and wherein the sensor is different from the camera [Van Dan Elzen ¶0023. Nix ¶0057, ¶0084, ¶0094-¶0096, ¶0107-¶0109].
See claim 1 for motivation to combine.
In regard to claim 3, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 2. Van Dan Elzen in view of Nix further discloses,
wherein the sensor comprises at least one selected from the group consisting of (i) an acceleration sensor [Nix ¶0057, ¶0084-¶0085], (ii) a braking sensor [Nix ¶0084-¶0085] and (iii) a steering sensor [Van Dan Elzen ¶0023. Nix ¶0084-¶0085].
See claim 1 for motivation to combine.
In regard to claim 4, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Van Dan Elzen in view of Nix further discloses,
wherein the vehicular vision system, responsive to the trigger condition, transfers sensor data from one or more additional sensors from the circular buffer to the remote server [Nix ¶0027-¶0028, ¶0034, ¶0065-¶0066, ¶0096-¶0097].
See claim 1 for motivation to combine.
In regard to claim 5, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 4. Van Dan Elzen in view of Nix further discloses,
wherein the one or more additional sensors comprises at least one selected from the group consisting of (i) a GPS sensor [Nix ¶0080], (ii) a radar sensor, (iii) a lidar sensor [Van Dan Elzen ¶0081], (iv) an ultrasonic sensor [Van Dan Elzen ¶0081], (v) a temperature sensor and (vi) an ambient light sensor [Van Dan Elzen ¶0081].
See claim 1 for motivation to combine.
In regard to claim 6, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Van Dan Elzen in view of Nix further discloses,
wherein the non-volatile memory comprises internal non-volatile memory [Van Dan Elzen ¶0048. Nix ¶0035-¶0036].
See claim 1 for motivation to combine.
In regard to claim 8, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Van Dan Elzen further discloses,
wherein the user input comprises a voice command from an occupant of the equipped vehicle [¶0005, ¶0019-¶0020].
In regard to claim 9, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Van Dan Elzen further discloses,
wherein the camera comprises a forward-viewing camera disposed at the windshield of the equipped vehicle [¶0013].
In regard to claim 10, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Van Dan Elzen in view of Nix further discloses,
wherein the advanced driving assistance feature of the equipped vehicle comprises at least one selected from the group consisting of (i) an automatic emergency braking system [Nix ¶0132], (ii) an object detection system [Nix Table 2, ¶0096, ¶0132], (iii) an automatic cruise control system [Nix ¶0084] and (iv) a lane keeping system [Van Dan Elzen ¶0005, ¶0017, ¶0023, ¶0084].
See claim 1 for motivation to combine.
In regard to claim 11, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Van Dan Elzen in view of Nix further discloses
comprising a plurality of cameras, and wherein image data captured by the plurality of cameras is transferred to the ECU [Van Dan Elzen ¶0080. Nix ¶0070-¶0072], and wherein the vehicular vision system, responsive to processing by the image processor of image data captured by the plurality of cameras, saves the image data from at least two cameras of the plurality of cameras to the circular buffer [Van Dan Elzen ¶0080. Nix ¶0079, ¶0089].
See claim 1 for motivation to combine.
In regard to claim 13, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Van Dan Elzen in view of Nix further discloses,
wherein the trigger condition comprises a second user input [Van Dan Elzen ¶0005, ¶0030-¶0031. Nix ¶0081-¶0082, ¶0132].
See claim 1 for motivation to combine.
In regard to claim 14, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Van Dan Elzen in view of Nix further discloses,
wherein the trigger condition comprises an indication of an accident [Van Dan Elzen ¶0023. Nix ¶0099, ¶0116].
See claim 1 for motivation to combine.
In regard to claim 20, Van Dan Elzen discloses a vehicular vision system [¶0012; vehicle vision system and/or driver assist system], the vehicular vision system comprising:
a forward-viewing camera disposed at a vehicle equipped with the vehicular vision system and viewing exterior of the equipped vehicle, wherein the forward-viewing camera is operable to capture image data [Fig.1; camera (12). ¶0013; vehicle 10 includes an imaging system or vision system that includes a forward viewing camera module 12 that is disposed at and views through the windshield 14 of the vehicle and captures image data of the scene exterior and forward of the vehicle];
wherein the forward-viewing camera comprises a CMOS imaging array [¶0004; cameras (preferably one or more CMOS cameras)], and wherein the CMOS imaging array comprises at least one million photosensors arranged in rows and columns [¶0081; two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array (such as 1 to 2 MP) or the like)];
an electronic control unit (ECU) comprising electronic circuitry and associated software [¶0013; vision system includes a control or electronic control unit (ECU) or processor. ¶0080-¶0081];
wherein the electronic circuitry of the ECU comprises an image processor operable to process image data captured by the forward-viewing camera and transferred to the ECU [¶0013; process image data captured by the camera or cameras. ¶0080-¶0081; logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data];
wherein the vehicular vision system, responsive to processing by the image processor of image data captured by the forward-viewing camera [¶0012; object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle. ¶0080], saves the image data to a circular buffer, and wherein the circular buffer comprises volatile memory [¶0017; advanced driver assist system (ADAS) front camera that is already designed into the vehicle (such as for use in automatic or intelligent headlamp control, lane keep assist, lane departure warning, traffic sign recognition and/or the like) and can write endlessly to RAM. Claim 1; image data captured by the imaging array of the camera module and temporarily saved in volatile memory prior to the occurrence of the emergency event];
wherein the vehicular vision system, responsive to a trigger condition indicating a deficiency in an advanced driving assistance feature of the equipped vehicle [¶0030; trigger event can be … automatic emergency braking (AEB) system command by the forward camera module (FCM), or even an airbag deployment], transmits via wireless communication, (i) the saved image data [¶0027-¶0028; record video images or image data (and optionally other vehicle information or the like) to record an incident or event… recorded data can be communicated via a wireless communication. ¶0016-¶0017]; and
wherein the vehicular vision system, responsive to a user input [¶0030-¶0031; trigger event can be a user request (such as via a user input or button or switch or a user voice request or the like)], copies the saved image data from the circular buffer to non-volatile memory disposed within the equipped vehicle [¶0048; If the module receives a trigger to save video, all the video in the circular buffer, as well as a small amount of post-trigger video, will be written to a time/date coded file in flash memory. Claim 16; non-volatile memory is part of a recording device of the equipped vehicle].
Van Dan Elzen does not explicitly disclose wherein the vehicular vision system, responsive to a trigger condition indicating a deficiency in an advanced driving assistance feature of the equipped vehicle, transmits via wireless communication, (i) the saved image data from the circular buffer to a remote server located remote from the equipped vehicle and (ii) sensor data from one or more additional sensors from the circular buffer to the remote server, and wherein the deficiency comprises detecting a manual driving maneuver by a driver of the equipped vehicle that exceeds a threshold while the advanced driving assistance feature fails to activate. However, Nix discloses,
wherein the vehicular vision system [¶0070; Machine vision cameras 417 may capture images from the environment outside of a vehicle], responsive to a trigger condition indicating a deficiency in an advanced driving assistance feature of the equipped vehicle [¶0062; ADAS analytics module 340 may comprise an event detector 342… surprising event module 344 may identify various types of surprises, e.g., sensor inconsistencies that indicate potential sensor problems. ¶0031; defects that have been identified may be communicated to the related parties. ¶0041; collected data may be transmitted to another device… based on the context (e.g., altering operation of an ADAS system). ¶0107-¶0109; Event analysis module 612 may further include vehicle defect identification module 61], transmits via wireless communication [¶0028; wireless communication channel], (i) the saved image data from the circular buffer to a remote server located remote from the equipped vehicle [¶0027-¶0028; When a surprising event has been detected… compiled data may include video data from a video camera 116… Data snapshots stored in the event data buffer 113 may be uploaded to a cloud server 120. ¶0064-¶0065; Responsive to an indication of a surprising event, data stored in event data buffer 346 from before and/or after the surprising event may be compiled by event data file generator 348… event data file may then be uploaded to analytics cloud server 334 via extra-vehicle communication system 324. ¶0096-¶0097; event data file comprises video (e.g., compressed video) and/or still images derived from a rear-facing camera] and (ii) sensor data from one or more additional sensors from the circular buffer to the remote server [¶0034; intra-vehicle communication module 222 may provide a signal via a bus corresponding to any status of the vehicle… sensors. ¶0096; event data file comprises… data received from the vehicle bus (BUS). ¶0132; rolling event data buffer. 
¶0027-¶0028, ¶0065-¶0066], and wherein the deficiency comprises detecting a manual driving maneuver by a driver of the equipped vehicle that exceeds a threshold while the advanced driving assistance feature fails to activate [¶0094, Table 2; surprising event… manual override AND deceleration > .5 AND rear object detection == null. ¶0096; a manual override event may be indicated responsive to sudden deceleration of the host vehicle when traveling in reverse, and when no object has been detected in the rearward travel path of the host vehicle. ¶0132; surprising event detection system including an event detector comprising a set of rules defining a surprising event… combination of at least two input signals additionally or alternatively includes at least one signal reflecting a vehicle operator input and at least one signal from the object detection sensor].
See claim 1 for elaboration on Van Dan Elzen and Nix. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the system disclosed by Van Dan Elzen with the deficiency determination and event data as disclosed by Nix in order to guide future development of ADAS and sensor improvement for vehicles [Nix ¶0004-¶0007, ¶0031, ¶0100-¶0101, ¶0111]. As disclosed by Nix, combining manual operating conditions and sensed ADAS conditions allows for a system to continuously improve advanced driving assistance systems for vehicles.
In regard to claim 21, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 20. Van Dan Elzen in view of Nix further discloses,
wherein the one or more additional sensors comprises at least one selected from the group consisting of (i) a GPS sensor [Nix ¶0073], (ii) a radar sensor, (iii) a lidar sensor [Van Dan Elzen ¶0081], (iv) an ultrasonic sensor [Van Dan Elzen ¶0081], (v) a temperature sensor and (vi) an ambient light sensor [Van Dan Elzen ¶0081].
See claim 20 for motivation to combine.
In regard to claim 22, this claim is drawn to a vehicular vision system corresponding to the vehicular vision system of claim 11, wherein claim 22 contains the same limitations as claim 11 and is therefore rejected on the same basis.
Claims 7 and 16-19 are rejected under 35 U.S.C. 103 as being unpatentable over Van Dan Elzen (US 2021/0044774) in view of Nix (US 2017/0113664), and further in view of Han (US 2020/0226393).
In regard to claim 7, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Neither Van Dan Elzen nor Nix explicitly discloses wherein the non-volatile memory comprises external non-volatile memory that is connected to the vehicular vision system via a Universal Serial Bus (USB) connection. However, Han discloses,
wherein the non-volatile memory comprises external non-volatile memory that is connected to the vehicular vision system via a Universal Serial Bus (USB) connection [¶0052; memory embedded in the vehicle 100 may be implemented in a form of a non-volatile memory, a volatile memory, a hard disk drive (HDD), or a solid-state drive (SDD), and the memory detachable from the vehicle 100 may be implemented as a memory card (for example, a micro SD card or a USB memory) or an external memory (for example, a USB memory) connectable to a USB port].
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the system disclosed by Van Dan Elzen in view of Nix with the USB memory disclosed by Han in order to provide a portable and low-cost memory option in a vehicle recorder [Han Abstract, ¶0052]. As readily appreciated by a person of ordinary skill and as disclosed by Han, USB memory is a well-known and convenient option for memory storage depending on user preferences.
In regard to claim 16, Van Dan Elzen discloses a vehicular vision system [¶0012; vehicle vision system and/or driver assist system], the vehicular vision system comprising:
a camera disposed at a vehicle equipped with the vehicular vision system and viewing exterior of the equipped vehicle, wherein the camera is operable to capture image data [Fig.1; camera (12). ¶0013; vehicle 10 includes an imaging system or vision system that includes a forward viewing camera module 12 that is disposed at and views through the windshield 14 of the vehicle and captures image data of the scene exterior and forward of the vehicle];
wherein the camera comprises a CMOS imaging array [¶0004; cameras (preferably one or more CMOS cameras)], and wherein the CMOS imaging array comprises at least one million photosensors arranged in rows and columns [¶0081; two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array (such as 1 to 2 MP) or the like)];
an electronic control unit (ECU) comprising electronic circuitry and associated software [¶0013; vision system includes a control or electronic control unit (ECU) or processor. ¶0080-¶0081];
wherein the electronic circuitry of the ECU comprises an image processor operable to process image data captured by the camera and transferred to the ECU [¶0013; process image data captured by the camera or cameras. ¶0080-¶0081; logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data];
wherein the vehicular vision system, responsive to processing by the image processor of image data captured by the camera [¶0012; object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle. ¶0080], saves the image data to a circular buffer, and wherein the circular buffer comprises volatile memory [¶0017; advanced driver assist system (ADAS) front camera that is already designed into the vehicle (such as for use in automatic or intelligent headlamp control, lane keep assist, lane departure warning, traffic sign recognition and/or the like) and can write endlessly to RAM. Claim 1; image data captured by the imaging array of the camera module and temporarily saved in volatile memory prior to the occurrence of the emergency event];
wherein the vehicular vision system, responsive to a trigger condition indicating a deficiency in an advanced driving assistance feature of the equipped vehicle [¶0030; trigger event can be … automatic emergency braking (AEB) system command by the forward camera module (FCM), or even an airbag deployment], transmits via wireless communication the saved image data [¶0027-¶0028; record video images or image data (and optionally other vehicle information or the like) to record an incident or event… recorded data can be communicated via a wireless communication. ¶0016-¶0017], and
wherein the trigger condition comprises determining the deficiency in the advanced driving assistance feature of the equipped vehicle via processing, by the vehicular vision system, sensor data from a sensor of the equipped vehicle, and wherein the sensor is different from the camera [¶0030-¶0031; vehicle generated trigger, such as activation of an automatic emergency braking (AEB) system … or even an airbag deployment. ¶0081]; and
wherein the vehicular vision system, responsive to a user input, copies the saved image data from the circular buffer to non-volatile memory disposed within the equipped vehicle [¶0048; If the module receives a trigger to save video, all the video in the circular buffer, as well as a small amount of post-trigger video, will be written to a time/date coded file in flash memory. Claim 16; non-volatile memory is part of a recording device of the equipped vehicle].
Van Dan Elzen does not explicitly disclose transmitting via wireless communication the saved image data from the circular buffer to a remote server located remote from the equipped vehicle, wherein the trigger condition comprises determining the deficiency in the advanced driving assistance feature of the equipped vehicle via processing, by the vehicular vision system, sensor data from a sensor of the equipped vehicle, and wherein the sensor is different from the camera, and wherein the deficiency comprises detecting a manual driving maneuver by a driver of the equipped vehicle that exceeds a threshold while the advanced driving assistance feature fails to activate. However, Nix discloses,
wherein the vehicular vision system [¶0070; Machine vision cameras 417 may capture images from the environment outside of a vehicle], responsive to a trigger condition indicating a deficiency in an advanced driving assistance feature of the equipped vehicle [¶0062; ADAS analytics module 340 may comprise an event detector 342… surprising event module 344 may identify various types of surprises, e.g., sensor inconsistencies that indicate potential sensor problems. ¶0031; defects that have been identified may be communicated to the related parties. ¶0041; collected data may be transmitted to another device… based on the context (e.g., altering operation of an ADAS system). ¶0107-¶0109; Event analysis module 612 may further include vehicle defect identification module 61], transmits via wireless communication [¶0028; wireless communication channel] the saved image data from the circular buffer to a remote server located remote from the equipped vehicle [¶0027-¶0028; When a surprising event has been detected… compiled data may include video data from a video camera 116… Data snapshots stored in the event data buffer 113 may be uploaded to a cloud server 120. ¶0064-¶0065; Responsive to an indication of a surprising event, data stored in event data buffer 346 from before and/or after the surprising event may be compiled by event data file generator 348… event data file may then be uploaded to analytics cloud server 334 via extra-vehicle communication system 324. 
¶0096-¶0097; event data file comprises video (e.g., compressed video) and/or still images derived from a rear-facing camera], and wherein the trigger condition comprises determining the deficiency in the advanced driving assistance feature of the equipped vehicle via processing, by the vehicular vision system, sensor data from a sensor of the equipped vehicle, and wherein the sensor is different from the camera [¶0034; intra-vehicle communication module 222 may provide a signal via a bus corresponding to any status of the vehicle… sensors. ¶0096; event data file comprises… data received from the vehicle bus (BUS). ¶0132; rolling event data buffer. ¶0027-¶0028, ¶0065-¶0066, Table 2], and wherein the deficiency comprises detecting a manual driving maneuver by a driver of the equipped vehicle that exceeds a threshold while the advanced driving assistance feature fails to activate [¶0094, Table 2; surprising event… manual override AND deceleration > .5 AND rear object detection == null. ¶0096; a manual override event may be indicated responsive to sudden deceleration of the host vehicle when traveling in reverse, and when no object has been detected in the rearward travel path of the host vehicle. ¶0132; surprising event detection system including an event detector comprising a set of rules defining a surprising event… combination of at least two input signals additionally or alternatively includes at least one signal reflecting a vehicle operator input and at least one signal from the object detection sensor].
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the system disclosed by Van Dan Elzen with the deficiency determination and event data as disclosed by Nix in order to guide future development of ADAS and sensor improvement for vehicles [Nix ¶0004-¶0007, ¶0031, ¶0100-¶0101, ¶0111]. As disclosed by Nix, combining manual operating conditions and sensed ADAS conditions allows for a system to continuously improve advanced driving assistance systems for vehicles.
Neither Van Dan Elzen nor Nix explicitly discloses wherein the non-volatile memory comprises external non-volatile memory that is connected to the vehicular vision system via a Universal Serial Bus (USB) connection. However, Han discloses,
wherein the non-volatile memory comprises external non-volatile memory that is connected to the vehicular vision system via a Universal Serial Bus (USB) connection [¶0052; memory embedded in the vehicle 100 may be implemented in a form of a non-volatile memory, a volatile memory, a hard disk drive (HDD), or a solid-state drive (SDD), and the memory detachable from the vehicle 100 may be implemented as a memory card (for example, a micro SD card or a USB memory) or an external memory (for example, a USB memory) connectable to a USB port].
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the system disclosed by Van Dan Elzen in view of Nix with the USB memory disclosed by Han in order to provide a portable and low-cost memory option in a vehicle recorder [Han Abstract, ¶0052]. As readily appreciated by a person of ordinary skill and as disclosed by Han, USB memory is a well-known and convenient option for memory storage depending on user preferences.
In regard to claim 17, Van Dan Elzen in view of Nix in view of Han discloses the vehicular vision system of claim 16. Van Dan Elzen in view of Nix in view of Han further discloses,
wherein the sensor comprises at least one selected from the group consisting of (i) an acceleration sensor [Nix ¶0084-¶0085], (ii) a braking sensor [Nix ¶0084-¶0085] and (iii) a steering sensor [Van Dan Elzen ¶0023. Nix ¶0084-¶0085].
See claim 16 for motivation to combine.
In regard to claim 18, Van Dan Elzen in view of Nix in view of Han discloses the vehicular vision system of claim 16. Van Dan Elzen in view of Nix in view of Han further discloses,
wherein the vehicular vision system, responsive to the trigger condition, transfers sensor data from one or more additional sensors from the circular buffer to the remote server [Nix ¶0027-¶0028, ¶0034, ¶0065-¶0066, ¶0096-¶0097].
See claim 16 for motivation to combine.
In regard to claim 19, Van Dan Elzen in view of Nix in view of Han discloses the vehicular vision system of claim 16. Van Dan Elzen in view of Nix in view of Han further discloses,
further comprising internal non-volatile memory [Van Dan Elzen ¶0048. Nix ¶0035-¶0036].
See claim 16 for motivation to combine.
Claim(s) 12 and 23 is/are rejected under 35 U.S.C. 103 as being unpatentable over Van Dan Elzen (US 2021/0044774) in view of Nix (US 2017/0113664) in view of Ochiai et al. (US 2017/0251163) (hereinafter Ochiai).
In regard to claim 12, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 11. Neither Van Dan Elzen nor Nix explicitly discloses wherein the at least two cameras are selected by a user of the vehicle. However, Ochiai discloses,
wherein the at least two cameras are selected by a user of the vehicle [¶0050-¶0052; The capture of the multiple video streams may be based on user configuration settings that enable/disable the multiple cameras 134. ¶0058-¶0060; system buffers the media content including the multiple video streams and/or the one or more audio streams (212)].
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the system disclosed by Van Dan Elzen in view of Nix with user selection as disclosed by Ochiai in order to provide additional user configuration settings that allow a user to further refine the data to be stored and transmitted [Ochiai Abstract, ¶0006-¶0008, ¶0078].
In regard to claim 23, this claim is drawn to a vehicular vision system corresponding to the vehicular vision system of claim 12, wherein claim 23 contains the same limitations as claim 12 and is therefore rejected upon the same basis.
Claim(s) 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Van Dan Elzen (US 2021/0044774) in view of Nix (US 2017/0113664) in view of Ghannam et al. (US 2021/0229629) (hereinafter Ghannam).
In regard to claim 15, Van Dan Elzen in view of Nix discloses the vehicular vision system of claim 1. Neither Van Dan Elzen nor Nix explicitly discloses wherein the trigger condition comprises an object within a threshold distance of the equipped vehicle. However, Ghannam discloses,
wherein the trigger condition comprises an object within a threshold distance of the equipped vehicle [¶0042-¶0046; determining that the object in the one of the plurality of distance-based zones has moved towards the vehicle and into a second of the plurality of distance-based zones… unique security response… recording and/or broadcasting images].
Van Dan Elzen discloses in at least ¶0012 and ¶0080 detecting objects around the vehicle and generating alerts based thereon. However, as Van Dan Elzen does not explicitly disclose a trigger condition based on object threshold distance(s), Ghannam is relied upon. Ghannam discloses a device for recording the surroundings of a vehicle and conditionally transmitting collected sensor data. When an object is detected as coming into a threshold distance zone of the vehicle, a unique security alert can be generated. As noted in ¶0025-¶0030, the alerts can include transmission of images to a remote entity.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the system disclosed by Van Dan Elzen in view of Nix with the threshold distances disclosed by Ghannam in order to provide improved user customization of recorded and transmitted sensor data [Ghannam ¶0009-¶0010, ¶0032]. As disclosed by Ghannam, different users may desire different alert settings depending on personal preferences.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to REBECCA A VOLENTINE whose telephone number is (571)270-7261. The examiner can normally be reached Monday-Friday 9am - 5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Joe Ustaris can be reached at (571)272-7383. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/REBECCA A VOLENTINE/Primary Examiner, Art Unit 2483 March 21, 2026