DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to claims 1-5, 7, 9-19 and 20 have been considered but are moot because the new ground of rejection relies on the newly discovered art of Han, which was not applied in the prior rejection of record, for the teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-5, 7, 9-19 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Mahmoud et al. (US 2014/0218529), hereinafter Mahmoud, in view of Han (US 2021/0160415), hereinafter Han.
Regarding claim 1, Mahmoud teaches:
1. A computer system comprising processing circuitry configured to: control a surveillance system to operate at a power saving state for a predetermined sleep time of a first surveillance cycle, the surveillance system comprising a plurality of image sensor systems;
[0041] The data recording system may comprise one or more excitation status flags or state machines, the states of which equate to elapsed time, the environmental input and battery charge status. At cyclical wake ups, the system may enter an active or awake time phase t.sub.a, and may activate the vehicle cameras, the environmental sensors and the image processing device such as in the example shown in FIG. 2. In such an example, the processing steps may be:
[0042] (1) wake up;
[0043] (2) initialize camera(s);
[0044] (3) capture image(s);
[0045] (4) transfer image (to processing unit if not processed locally in camera);
[0046] (5) filter image (Gaus filter/Box filter/Mosaicing);
[0047] (6) load compare image (if not temporarily stored earlier);
[0048] (7) calculate difference image (or other suitable object detection);
[0049] (8) load ignore mask table;
[0050] (9) decide whether to initiate recording mode (video image capturing) jump to recording procedure (and exit the cyclical wake up mode);
[0051] (10) update ignore mask table;
[0052] (11) store image; and
[0053] (12) enter sleep phase (t.sub.i).
[0054] The captured data may initially or provisionally be stored local or at the vehicle. The storage media and the vehicle communication bus (such as, for example, a vehicle CAN bus or a vehicle LIN bus or the like) may stay asleep if not awakened or activated by entering a higher excitation state, which may be triggered by the object detection algorithm or when the local storage memory is nearly full (and not able to conceive another image capture data set). When awake (time phases t.sub.a), the data recording system may capture several images or video images or a movie, preferably capturing at least one image from one or more cameras (including remote cameras), and/or optionally the system may fetch or capture or determine a status of one or more external sensors and may compare these to earlier captured data sets.
By cyclically taking just one image at a time (such as by capturing frames of image data with a selected or determined period of time between captures, such as, for example, less than or equal to about ten frames per second (fps) or about five fps or about one fps or about 0.2 fps (one frame every five seconds) or about 0.1 fps (one frame every ten seconds) or one frame per minute or one frame per five minutes or any selected capture rate or time interval or period), a time lapse video develops of the scene encompassed by that camera's field of view. That time lapse video records the good or no-change case, when no disturbances happened nearby the parked vehicle. The time lapse may differ by the chosen wake up time gaps or time periods between image captures.
Mahmoud, 0041-0054, emphasis added.
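For clarity of the record, the cyclical wake-up procedure quoted above (steps (1)-(12) of Mahmoud, ¶¶ 0041-0053) can be sketched as follows. This is an illustrative sketch only; the function names, threshold value, and one-dimensional pixel model are hypothetical and do not appear in the reference.

```python
# Illustrative sketch of Mahmoud's cyclical wake-up steps. All names and
# values here are hypothetical placeholders chosen for explanation; they
# are not taken from the reference.

def box_filter(pixels, radius=1):
    """Step (5): a crude 1-D box filter over a list of pixel values."""
    n = len(pixels)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

def wake_up_cycle(capture, compare_image, ignore_mask, threshold=5.0):
    """One cyclical wake-up; returns 'recording' or 'sleep' (steps (9)/(12))."""
    image = capture()                                   # (3) capture image
    filtered = box_filter(image)                        # (5) filter image
    diff = [abs(a - b) for a, b in zip(filtered, compare_image)]  # (7) difference
    # (8)-(9): disregard masked pixels, then decide whether to record.
    disturbed = any(d > threshold
                    for d, masked in zip(diff, ignore_mask) if not masked)
    return "recording" if disturbed else "sleep"        # (9) or (12)

# Demo: an unchanged scene returns to sleep; a disturbance triggers recording.
baseline = box_filter([10.0] * 8)   # (6) compare image from an earlier cycle
mask = [False] * 8                  # (8) ignore mask table (nothing masked)
print(wake_up_cycle(lambda: [10.0] * 8, baseline, mask))               # sleep
print(wake_up_cycle(lambda: [10.0] * 4 + [90.0] * 4, baseline, mask))  # recording
```

The sketch mirrors the quoted control flow: the no-change case re-enters the sleep phase, while a detected difference exits the cyclical wake-up mode into recording.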
upon lapse of the predetermined sleep time, control the surveillance system to operate at a detection power state;
See Mahmoud, 0041-0054, quoted above.
at the detection power state, control the surveillance system to obtain first image data from a first image sensor subset comprising at least one image sensor system of the plurality of image sensor systems of a surrounding area of a vehicle;
See Mahmoud, 0041-0054, quoted above.
at the detection power state, process the first image data to determine whether at least one predetermined target is in the surrounding area;
See Mahmoud, 0041-0054, quoted above.
and upon determining that the predetermined target is not present in the surrounding area, control the surveillance system to operate at the power saving state for the predetermined sleep time of a second surveillance cycle following the first surveillance cycle.
See Mahmoud, 0041-0054, quoted above.
Mahmoud fails to explicitly teach the following limitations; however, Han teaches:
upon lapse of the predetermined sleep time, control the surveillance system to operate at
the detection power state; and at the detection power state, control the surveillance system to obtain second image data from a second image sensor subset comprising at least one image sensor system of the plurality of image sensor systems at the second surveillance cycle, wherein the first image sensor subset is different from the second image sensor subset.
[0064] Furthermore, the network camera 10 may be a battery-powered low-power camera. The low-power camera maintains a sleep mode at normal times and periodically wakes up to check whether an event has occurred. The low-power camera is switched to an active mode when the event occurs and returns to the sleep mode when no event occurs. As described above, the low-power camera can reduce power consumption by maintaining the active mode only when an event occurs.
Han, 0064, emphasis added.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teaching of Han with the system of Mahmoud in order to, upon lapse of the predetermined sleep time, control the surveillance system to operate at the detection power state, and, at the detection power state, control the surveillance system to obtain second image data from a second image sensor subset comprising at least one image sensor system of the plurality of image sensor systems at the second surveillance cycle, wherein the first image sensor subset is different from the second image sensor subset. As such, the low-power camera can reduce power consumption by maintaining the active mode only when an event occurs. Han, 0064.
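The claimed alternation between sensor subsets across surveillance cycles, for which Han is relied upon in combination with Mahmoud, can be sketched as a simple round-robin schedule. This is a hypothetical illustration only; the sensor names and the round-robin scheme appear in neither reference.

```python
# Hypothetical round-robin scheduling of image sensor subsets, one subset
# per surveillance cycle, so that consecutive cycles use different subsets.
from itertools import cycle

def subset_schedule(sensor_ids, subset_size):
    """Yield a sensor subset for each successive surveillance cycle."""
    ring = cycle(sensor_ids)
    while True:
        yield [next(ring) for _ in range(subset_size)]

sched = subset_schedule(["front", "rear", "left", "right"], 2)
first_subset = next(sched)    # first surveillance cycle
second_subset = next(sched)   # second surveillance cycle
print(first_subset, second_subset)
```

With four sensors taken two at a time, each wake-up activates only half of the cameras, while successive cycles together still cover the full surrounding area.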
Regarding claim 2, Mahmoud and Han teach:
2. The computer system of claim 1. Furthermore, Mahmoud teaches: wherein the processing circuitry is further configured to, during the second surveillance cycle: upon lapse of the predetermined sleep time, control the surveillance system to operate at the detection power state;
See Mahmoud, 0041-0054, quoted above.
at the detection power state, control the surveillance system to obtain the second image data of the surrounding area of the vehicle;
See Mahmoud, 0041-0054, quoted above.
at the detection power state, process the second image data to determine whether at least one predetermined target is in the surrounding area;
See Mahmoud, 0041-0054, quoted above.
and upon determining that the predetermined target is not present in the surrounding area, control the surveillance system to operate at the power saving state for the predetermined sleep time of a third surveillance cycle following the second surveillance cycle.
See Mahmoud, 0041-0054, quoted above.
Regarding claim 3, Mahmoud and Han teach:
3. The computer system of claim 1. Furthermore, Mahmoud teaches: wherein the processing circuitry is further configured to: at the detection power state, upon determining that the predetermined target is present in the surrounding area, control the surveillance system to operate at a monitoring power state.
[0041] The data recording system may comprise one or more excitation status flags or state machines, the states of which equate to elapsed time, the environmental input and battery charge status. At cyclical wake ups, the system may enter an active or awake time phase t.sub.a, and may activate the vehicle cameras, the environmental sensors and the image processing device such as in the example shown in FIG. 2. In such an example, the processing steps may be: [0042] (1) wake up; [0043] (2) initialize camera(s); [0044] (3) capture image(s); [0045] (4) transfer image (to processing unit if not processed locally in camera); [0046] (5) filter image (Gaus filter/Box filter/Mosaicing); [0047] (6) load compare image (if not temporarily stored earlier); [0048] (7) calculate difference image (or other suitable object detection); [0049] (8) load ignore mask table; [0050] (9) decide whether to initiate recording mode (video image capturing) jump to recording procedure (and exit the cyclical wake up mode); [0051] (10) update ignore mask table; [0052] (11) store image; and [0053] (12) enter sleep phase (t.sub.i). [0054] The captured data may initially or provisionally be stored local or at the vehicle. The storage media and the vehicle communication bus (such as, for example, a vehicle CAN bus or a vehicle LIN bus or the like) may stay asleep if not awakened or activated by entering a higher excitation state, which may be triggered by the object detection algorithm or when the local storage memory is nearly full (and not able to conceive another image capture data set). When awake (time phases t.sub.a), the data recording system may capture several images or video images or a movie, preferably capturing at least one image from one or more cameras (including remote cameras), and/or optionally the system may fetch or capture or determine a status of one or more external sensors and may compare these to earlier captured data sets. 
By cyclically taking just one image at a time (such as by capturing frames of image data with a selected or determined period of time between captures, such as, for example, less than or equal to about ten frames per second (fps) or about five fps or about one fps or about 0.2 fps (one frame every five seconds) or about 0.1 fps (one frame every ten seconds) or one frame per minute or one frame per five minutes or any selected capture rate or time interval or period), a time lapse video develops of the scene encompassed by that camera's field of view. That time lapse video records the good or no-change case, when no disturbances happened nearby the parked vehicle. The time lapse may differ by the chosen wake up time gaps or time periods between image captures.
Mahmoud, 0041, emphasis added.
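For illustration only (not part of the record), the cyclical wake-up procedure of steps (1)-(12) quoted above can be sketched as follows. All function and parameter names are hypothetical and are not taken from the Mahmoud reference; the filtering and ignore-mask steps are omitted for brevity.

```python
def surveillance_cycle(capture, compare_image, diff_threshold):
    """One cyclical wake-up: capture, compare against the stored image,
    and decide whether to enter recording mode (Mahmoud, steps (1)-(12))."""
    frame = capture()                       # steps (1)-(4): wake, initialize, capture, transfer
    # Step (5) would filter the frame (Gaus/box filter); identity here for brevity.
    # Step (7): difference image against the loaded compare image (step (6)).
    diff = sum(abs(a - b) for a, b in zip(frame, compare_image))
    start_recording = diff > diff_threshold  # step (9): decide whether to record
    return start_recording, frame            # step (11): store frame as next compare image
```

On an unchanged scene the decision stays False and the system would re-enter the sleep phase (step (12)); a large difference image triggers the recording procedure.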
Regarding claim 4, Mahmoud and Han teach:
4. The computer system of claim 2, furthermore, Mahmoud teaches: wherein processing the second image data to determine whether the at least one predetermined target is present in the surrounding area further comprises processing the second image data by comparison of the second image data to the first image data.
See Mahmoud, 0041, reproduced above with respect to claim 3, emphasis added.
Regarding claim 5, Mahmoud and Han teach:
5. The computer system of claim 4, furthermore, Mahmoud teaches: wherein the processing circuitry is further configured to, at the detection power state: upon determining that the second image data is substantially different from the first image data, further process the first image data by an item recognition circuitry to determine presence of the predetermined target in the surrounding area.
[0055] The data captured may be stored in a FIFO memory at which the oldest part of the lapse video and optionally other sensor data may be overwritten by the newer ones in cases where no vehicle alarm occurs, or alternatively the captured data may be stored in a memory device (in all cases), and the system may periodically back up the stored data and/or may transfer the stored data from a local memory device (such as a vision system or vehicle inherent memory device such as like a flash memory or solid state drive, which may be exchangeable by the vehicle owner) to a remote or external memory device (such as via a telematics system or other communication or data transfer system). The captured images or video and optionally other sensor data may be stored/transferred in a compressed data format or as RAW or may be stored in RAW locally and transferred compressed or may be stored compressed locally and transferred in RAW. In cases where the system employs an object detection algorithm, the system may store images in the area or areas of moving objects and/or regions of interest in a high definition and/or uncompressed format, and the system may store images in other areas or parts of less interest at or surrounding the vehicle in a low definition or compressed format for shrinking the data size for storing or transmission.
Mahmoud, 0041, 0055, emphasis added.
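For illustration only (not part of the record), the FIFO storage behavior described in Mahmoud, para. [0055], where the oldest portion of the time-lapse is overwritten by newer captures when no alarm occurs, corresponds to a simple bounded ring buffer. The capacity and identifiers below are hypothetical.

```python
from collections import deque

# Bounded FIFO: appending beyond maxlen silently evicts the oldest entry,
# mirroring the overwrite-on-no-alarm storage of Mahmoud, para. [0055].
fifo = deque(maxlen=3)          # hypothetical local memory of 3 captures
for capture_id in range(5):     # five cyclical wake-ups
    fifo.append(capture_id)     # newest capture evicts the oldest when full
```

After five captures only the three newest remain in the buffer; an alarm event would instead stop eviction and preserve the stored frames.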
Regarding claim 7, Mahmoud and Han teach:
7. The computer system of claim 1, furthermore, Mahmoud teaches: wherein the plurality of image sensor systems comprises two or more of a front view image sensor system, a rear view image sensor system, a right view image sensor system, a left view image sensor system or a birds view image sensor system.
Mahmoud, 0032, 0089
Regarding claim 9, Mahmoud and Han teach:
9. The computer system of claim 1, furthermore, Mahmoud teaches: wherein the processing circuitry is further configured to: upon a lapse of the predetermined sleep time of each surveillance cycle, control a subset of a plurality of image sensor systems of the surveillance system to operate at the detection power state, wherein each subset of the plurality of image sensor systems is cycled through in an order, the order being at least one of, a predetermined order, a variable order determined depending on targets detected, a random order.
Mahmoud, 0036, 0041
Regarding claim 10, Mahmoud and Han teach:
10. The computer system of claim 1, furthermore, Mahmoud teaches: wherein the processing circuitry is further configured to: at the detection power state, obtain first image data as a single still image.
Mahmoud, 0041
Regarding claim 11, Mahmoud and Han teach:
11. The computer system of claim 1, furthermore, Mahmoud teaches: wherein the predetermined sleep time is 30 s or less.
Mahmoud, 0041
Regarding claim 12, Mahmoud and Han teach:
12. The computer system of claim 1, furthermore, Mahmoud teaches: wherein the processing circuitry is further configured to: upon obtaining sensor data from sensor circuitry of the vehicle indicating movement at the surrounding area of the vehicle, wake up from the power saving state to the detection power state.
Mahmoud, 0041
Regarding claim 13, Mahmoud and Han teach:
13. The computer system of claim 1, furthermore, Mahmoud teaches: wherein the detection power state consumes more power than the power saving state and the monitoring power state consumes more power than the detection power state.
Mahmoud, 0033, 0040-0041
Regarding claim 14, Mahmoud and Han teach:
14. The computer system of claim 1, furthermore, Mahmoud teaches: wherein the processing circuitry is further configured to: during the second surveillance cycle, upon lapse of the predetermined sleep time, control the surveillance system to operate at the detection power state;
Mahmoud, 0041
at the detection power state, control the surveillance system to obtain the second image data of the surrounding area of the vehicle;
Mahmoud, 0032, 0041
at the detection power state, process the second image data to determine whether at least one predetermined target is in the surrounding area;
Mahmoud, 0041
upon determining that the predetermined target is not present in the surrounding area, control the surveillance system to operate at the power saving state for the predetermined sleep time of a third surveillance cycle following the second surveillance cycle;
Mahmoud, 0041
at the detection power state, upon determining that the predetermined target is present in the surrounding area, control the surveillance system to operate at a monitoring power state;
Mahmoud, 0041
at the detection power state, upon determining that the second image data is substantially different from the first image data, further process the first image data by an item recognition circuitry to determine presence of the predetermined target in the surrounding area;
Mahmoud, 0041
at the detection power state, obtain the first image data from the surveillance system, wherein the surveillance system comprises a plurality of image sensor systems;
Mahmoud, 0036, 0041
obtain the first image data from the first image sensor subset comprising the at least one image sensor system of the plurality of image sensor systems at the first surveillance cycle, and obtain the second image data from the second image sensor subset comprising the at least one image sensor system of the plurality of image sensor systems at the second surveillance cycle, wherein the first image sensor subset is different from the second image sensor subset;
Mahmoud, 0036, 0041
and upon a lapse of the predetermined sleep time of each surveillance cycle, control a subset of the plurality of image sensor systems of the surveillance system to operate at the detection power state, wherein each subset of the plurality of image sensor systems is cycled through in an order, the order being at least one of a predetermined order, a variable order determined depending on targets detected, and a random order, wherein processing the second image data to determine whether the at least one predetermined target is present in the surrounding area further comprises processing the image data by comparison of the second image data to the first image data.
Mahmoud, 0033, 0036, 0040-0041
Regarding claim 15, Mahmoud and Han teach:
15. A vehicle comprising the computer system of claim 1.
Mahmoud, 0041
Regarding claim 16, Mahmoud and Han teach:
16. The vehicle of claim 15, furthermore, Mahmoud teaches: wherein the power saving state is entered responsive to the vehicle being parked.
Mahmoud, 0039
Regarding claim 17, Mahmoud and Han teach:
17. The vehicle of claim 15, furthermore, Mahmoud teaches: wherein the vehicle is a heavy-duty vehicle.
Mahmoud, Fig. 1.
Claims 18-20 recite elements similar to those of claim 1, but in method, computer program, and non-transitory computer-readable medium form rather than system form. Therefore, the supporting rationale of the rejection of claim 1 applies equally to claims 18-20.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIEL T TEKLE whose telephone number is (571)270-1117. The examiner can normally be reached Monday-Friday 8:00-4:30 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Vaughn can be reached at 571-272-3922. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DANIEL T TEKLE/Primary Examiner, Art Unit 2481