Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over the claims of U.S. Patent No. 11800063 and U.S. Patent No. 12212892. Although the claims at issue are not identical, they are not patentably distinct from each other because they recite substantially similar surveillance operations with substantially similar sensor structure, and the patented claims merely add features such as scaling the range.
Instant Application
U.S. Patent No. 11800063
1. A surveillance control system controlling a plurality of sensors comprising: at least one memory storing instructions; and at least one processor connected to the memory that, based on the instructions, performs operations comprising:
receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors;
selecting at least one of the plurality of sensors based on map information, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image, the map information concerning wall or other space defining structures; and
actuating the selected at least one of the plurality of sensors.
2. The surveillance control system according to claim 1, the operations further comprising: selecting the at least one of the plurality of sensors, the at least one being capable of sensing the event directly.
3. The surveillance control system according to claim 1, the operations further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and selecting, as the at least one of the plurality of sensors, at least one of the sensors determined to be capable of sensing the event directly.
4. The surveillance control system according to claim 1, the operations further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and when the sensors of the one type are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, at least one of sensors of another type being capable of sensing the event from among the plurality of sensors.
5. The surveillance control system according to claim 1, the operations further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and selecting, as the at least one of the plurality of sensors, at least one of the cameras determined to be capable of sensing the event directly.
6. The surveillance control system according to claim 1, the operations further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and when the cameras are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, a microphone being capable of sensing the event from among the plurality of sensors.
7. The surveillance control system according to claim 1, the operations further comprising: comparing the coverage information with the sensor data; and selecting the at least one of the plurality of cameras based on the comparing result.
8. The surveillance control system according to claim 1, the operations further comprising: selecting the at least one of the plurality of sensors based on the type of the event.
9. The surveillance control system according to claim 1, the operations further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.
10. The surveillance control system according to claim 1, the operations further comprising: actuating at least one of the plurality of sensors by changing at least one of direction, resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors.
11. A surveillance control method controlling a plurality of sensors comprising: receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors; selecting at least one of the plurality of sensors based on map information, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image, the map information concerning wall or other space defining structures; and actuating the selected at least one of the plurality of sensors.
12. The surveillance control method according to claim 11, further comprising: selecting the at least one of the plurality of sensors, the at least one being capable of sensing the event directly.
13. The surveillance control method according to claim 11, further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and selecting, as the at least one of the plurality of sensors, at least one of the sensors determined to be capable of sensing the event directly.
14. The surveillance control method according to claim 11, further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and when the sensors of the one type are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, at least one of sensors of another type being capable of sensing the event from among the plurality of sensors.
15. The surveillance control method according to claim 11, further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and selecting, as the at least one of the plurality of sensors, at least one of the cameras determined to be capable of sensing the event directly.
16. The surveillance control method according to claim 11, further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and when the cameras are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, a microphone being capable of sensing the event from among the plurality of sensors.
17. The surveillance control method according to claim 11, further comprising: comparing the coverage information with the sensor data; and selecting the at least one of the plurality of cameras based on the comparing result.
18. The surveillance control method according to claim 11, further comprising: selecting the at least one of the plurality of sensors based on the type of the event.
19. The surveillance control method according to claim 11, further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.
20. A non-transitory computer readable storage medium causing a computer controlling a plurality of sensors to execute operations comprising: receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors; selecting at least one of the plurality of sensors based on map information, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image, the map information concerning wall or other space defining structures; and actuating the selected at least one of the plurality of sensors.
1. A surveillance control system controlling a plurality of sensors comprising: at least one memory storing instructions; and at least one processor connected to the memory that, based on the instructions, performs operations comprising:
receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors;
selecting at least one of the plurality of sensors based on map information, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image, the map information concerning wall or other space defining structures;
selecting, when it is determined that the at least one of the scale of the event and the occurrence location of the event exceeds a sensing range of the at least one of the plurality of sensors; and
actuating the selected at least one of the plurality of sensors.
2. The surveillance control system according to claim 1, the operations further comprising: selecting the at least one of the plurality of sensors, the at least one being capable of sensing the event directly.
3. The surveillance control system according to claim 1, the operations further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and selecting, as the at least one of the plurality of sensors, at least one of the sensors determined to be capable of sensing the event directly.
4. The surveillance control system according to claim 1, the operations further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and when the sensors of the one type are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, at least one of sensors of another type being capable of sensing the event from among the plurality of sensors.
5. The surveillance control system according to claim 1, the operations further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and selecting, as the at least one of the plurality of sensors, at least one of the cameras determined to be capable of sensing the event directly.
6. The surveillance control system according to claim 1, the operations further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and when the cameras are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, a microphone being capable of sensing the event from among the plurality of sensors.
7. The surveillance control system according to claim 1, the operations further comprising: comparing the coverage information with the sensor data; and selecting the at least one of the plurality of cameras based on the comparing result.
8. The surveillance control system according to claim 1, the operations further comprising: selecting the at least one of the plurality of sensors based on the type of the event.
9. The surveillance control system according to claim 1, the operations further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.
10. The surveillance control system according to claim 1, the operations further comprising: actuating at least one of the plurality of sensors by changing at least one of direction, resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors.
11. A surveillance control method controlling a plurality of sensors comprising: receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors; selecting at least one of the plurality of sensors based on map information, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image, the map information concerning wall or other space defining structures; and actuating the selected at least one of the plurality of sensors;
selecting, when it is determined that the at least one of the scale of the event and the occurrence location of the event exceeds a sensing range of the at least one of the plurality of sensors.
12. The surveillance control method according to claim 11, further comprising: selecting the at least one of the plurality of sensors, the at least one being capable of sensing the event directly.
13. The surveillance control method according to claim 11, further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and selecting, as the at least one of the plurality of sensors, at least one of the sensors determined to be capable of sensing the event directly.
14. The surveillance control method according to claim 11, further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and when the sensors of the one type are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, at least one of sensors of another type being capable of sensing the event from among the plurality of sensors.
15. The surveillance control method according to claim 11, further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and selecting, as the at least one of the plurality of sensors, at least one of the cameras determined to be capable of sensing the event directly.
16. The surveillance control method according to claim 11, further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and when the cameras are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, a microphone being capable of sensing the event from among the plurality of sensors.
17. The surveillance control method according to claim 11, further comprising: comparing the coverage information with the sensor data; and selecting the at least one of the plurality of cameras based on the comparing result.
18. The surveillance control method according to claim 11, further comprising: selecting the at least one of the plurality of sensors based on the type of the event.
19. The surveillance control method according to claim 11, further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.
20. A non-transitory computer readable storage medium causing a computer controlling a plurality of sensors to execute operations comprising: receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors; selecting at least one of the plurality of sensors based on map information, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image, the map information concerning wall or other space defining structures; and actuating the selected at least one of the plurality of sensors.
selecting, when it is determined that the at least one of the scale of the event and the occurrence location of the event exceeds a sensing range of the at least one of the plurality of sensors.
U.S. Patent No. 12212892
1. A surveillance control system controlling a plurality of sensors comprising: at least one memory storing instructions; and at least one processor connected to the memory that, based on the instructions, performs operations comprising:
receiving sensor data including event related data including sensing result from at least one microphone included in the plurality of sensors and
position information of the at least one microphone; determining if a sound sensed as the sensing result comes from inside a compartment based on the sensing data;
selecting at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and
actuating the selected at least one of the plurality of sensors.
2. The surveillance control system according to claim 1, wherein the event related data includes a direction in which the sound comes.
3. The surveillance control system according to claim 1, wherein each of the plurality of sensors is arranged in one of a plurality of compartments including, and the operation comprises: determining, among the plurality of compartments, the compartment from which the sound comes based on the sensor data; and selecting, as the at least one of the sensors, the at least one of the plurality of cameras based on the determined compartment and the coverage information.
4. The surveillance control system according to claim 3, wherein the operation comprises selecting, from among the plurality of cameras, at least one camera included in the determined compartment.
5. The surveillance control system according to claim 3, wherein at least one microphone and at least one camera are arranged in each of the plurality of compartments.
6. The surveillance control system according to claim 3, wherein the operation comprises: when the position information indicates one of the plurality of compartments, selecting, from among the plurality of cameras, an inside camera arranged in the one of the plurality of compartments; and when the position information indicates outside part of the plurality of compartments, selecting an outside camera capable of capturing the outside part.
7. The surveillance control system according to claim 1, the operations further comprising: comparing the coverage information with the sensor data; and selecting at least one of the plurality of cameras based on the comparing result.
8. The surveillance control system according to claim 1, the operations further comprising: selecting at least one of the plurality of sensors based on the type of the event.
9. The surveillance control system according to claim 1, the operations further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.
10. The surveillance control system according to claim 1, the operations further comprising: selecting at least one of the plurality of sensors by changing at least one of direction, resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors.
11. A surveillance control method controlling a plurality of sensors comprising: receiving sensor data including event related data including sensing result from at least one microphone included in the plurality of sensors and position information of the at least one microphone; determining if a sound sensed as the sensing result comes from inside a compartment based on the sensing data; selecting at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors.
12. The surveillance control method according to claim 11, wherein the event related data includes a direction in which the sound comes.
13. The surveillance control method according to claim 11, wherein each of the plurality of sensors is arranged in one of a plurality of compartments including, and the surveillance control method comprises: determining, among the plurality of compartments, the compartment from which the sound comes based on the sensor data; and selecting, as the at least one of the sensors, the at least one of the plurality of cameras based on the determined compartment and the coverage information.
14. The surveillance control method according to claim 13, further comprising selecting, from among the plurality of cameras, at least one camera included in the determined compartment.
15. The surveillance control method according to claim 13, wherein at least one microphone and at least one camera are arranged in each of the plurality of compartments.
16. The surveillance control method according to claim 13, further comprising: when the position information indicates one of the plurality of compartments, selecting, from among the plurality of cameras, an inside camera arranged in the one of the plurality of compartments; and when the position information indicates outside part of the plurality of compartments, selecting an outside camera capable of capturing the outside part.
17. The surveillance control method according to claim 11, further comprising: comparing the coverage information with the sensor data; and selecting at least one of the plurality of cameras based on the comparing result.
18. The surveillance control method according to claim 11, further comprising: selecting at least one of the plurality of sensors based on the type of the event.
19. The surveillance control method according to claim 11, further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.
20. A non-transitory computer readable storage medium storing a program causing a computer controlling a plurality of sensors to execute operations comprising: receiving sensor data including event related data including sensing result from at least one microphone included in the plurality of sensors and position information of the at least one microphone; determining if a sound sensed as the sensing result comes from inside a compartment based on the sensing data; selecting at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors.
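For illustration only (not part of the examined record), the control flow common to the independent claims charted above — receiving event-related sensor data, selecting sensors whose coverage reaches the event given map information about walls or other space-defining structures, and actuating the selected sensors — can be sketched as follows. All names, data structures, and the grid-cell coverage model are hypothetical, chosen only to make the claimed sequence concrete.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    sensor_id: str
    kind: str       # e.g. "camera" or "microphone"
    coverage: set   # hypothetical set of map cells this sensor can image

def select_sensors(sensors, event_location, map_blocked):
    """Select sensors whose coverage reaches the event location,
    honoring wall/space-defining structures from the map information."""
    selected = []
    for s in sensors:
        # A sensor qualifies only if the event cell is in its coverage
        # and the map does not mark that cell as blocked by a structure.
        if event_location in s.coverage and event_location not in map_blocked:
            selected.append(s)
    return selected

def actuate(sensor):
    # Placeholder for changing direction, resolution, or PTZ settings.
    return f"actuated {sensor.sensor_id}"

# Usage: an event at map cell (2, 3); one camera covers it, one does not.
cams = [Sensor("cam1", "camera", {(2, 3)}), Sensor("cam2", "camera", {(5, 5)})]
chosen = select_sensors(cams, (2, 3), map_blocked=set())
results = [actuate(s) for s in chosen]
```

The sketch deliberately mirrors the three recited steps (receive, select, actuate) and nothing more; it takes no position on how any actual embodiment computes coverage.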
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
1. Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Buehler (US 2015/0244992 A1, "Buehler") in view of McCoy et al. (US 2015/0116501 A1, "McCoy").
Regarding claim 1, Buehler discloses a surveillance control system controlling a plurality of sensors comprising: at least one memory storing instructions; and at least one processor connected to the memory that, based on the instructions, performs operations comprising:
receiving sensor data including event related data relating to at least one of scale of the event (BUEHLER, ¶ 16) and occurrence location of the event from at least one of the plurality of sensors (BUEHLER, ¶¶ 16, 51; claim 20, receiving one or more signals from the plurality of sensors);
selecting at least one of the plurality of sensors based on map information (BUEHLER, ¶ 43; see also the map citation in the rejection of claim 2), the map information concerning wall (BUEHLER, ¶ 46) or other space (BUEHLER, ¶ 5) defining structures (BUEHLER, Fig. 4, ¶ 43);
It is noted that Buehler is silent about the sensor data and coverage information relating to the imaging range which each of the plurality of sensors is able to image, and actuating the selected sensor(s) as claimed.
However, MCCOY discloses the sensor data and coverage information relating to imaging range (MCCOY, ¶ 51 and claim 20, receiving one or more signals from the plurality of sensors) which each of the plurality of sensors is able to image (MCCOY, ¶ 78, i.e., the processor 202 may adjust the pan, zoom, and/or tilt of the selected first camera 104a based on a direction and/or a distance of each of the first object 102a and the second object 102b, relative to the selected first camera 104a. In another embodiment, the processor 202 may adjust the pan, zoom, and/or tilt of the selected first camera 104a based on a direction and/or a distance of the first object 102a, relative to the second object 102b. For example, both the first object 102a and the second object 102b move in the same direction. In such a case, the processor 202 may zoom the selected first camera 104a to the extent that both the first object 102a and the second object 102b lie in field of view of the first camera 104a. In another example, the first object 102a and the second object 102b move in an opposite direction. In such a case, the processor 202 may zoom out the selected first camera 104a, such that both the first object 102a and the second object 102b remain in field of view of the first camera 104a), and actuating the selected (MCCOY, ¶ 60, i.e., when the object is moving away from the camera) at least one of the plurality of sensors.
Both BUEHLER and MCCOY teach systems that select cameras based on an event, and those systems are comparable to that of the instant application. Because the two cited references are analogous to the instant application, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains, to include in the BUEHLER disclosure switching to a camera based on the input of the current cameras, as taught by MCCOY. Such inclusion would have increased the usefulness of the camera system by adding tracking of objects not visible in the current field of view of the camera, and would have been consistent with the rationale of combining prior art elements according to known methods to yield predictable results to establish a prima facie case of obviousness (MPEP 2143(I)(A)) under KSR International Co. v. Teleflex Inc., 127 S. Ct. 1727, 82 USPQ2d 1385, 1395-97 (2007).
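As a purely illustrative aside (not part of the record), the zoom behavior cited from MCCOY ¶ 78 — zoom out when two tracked objects diverge, and permit zooming in when they move together, so that both remain in the camera's field of view — can be sketched in simplified one-dimensional form. The function name, the 1-D position model, and the margin parameter are all hypothetical.

```python
def adjust_zoom(fov_width, obj_a, obj_b, margin=1.0):
    """Illustrative sketch of the cited behavior: widen or narrow a
    camera's field of view so two tracked objects both stay inside it.
    Positions are 1-D coordinates; all parameters are hypothetical."""
    # Width needed to cover both objects plus a margin on each side.
    spread = abs(obj_a - obj_b) + 2 * margin
    if spread > fov_width:
        return spread                        # objects diverging: zoom out
    return max(spread, fov_width * 0.5)      # objects together: may zoom in

# Usage: objects moving apart force a wider field of view,
# while nearby objects allow a tighter one.
wide = adjust_zoom(fov_width=10.0, obj_a=0.0, obj_b=12.0)
narrow = adjust_zoom(fov_width=10.0, obj_a=4.0, obj_b=5.0)
```

This sketch captures only the keep-both-in-view rationale attributed to MCCOY above; an actual PTZ controller would also account for pan, tilt, and lens limits.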
Regarding claim 2, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control system according to claim 1, the operations further comprising: selecting the at least one of the plurality of sensors (the cameras cited above), the at least one being capable of sensing the event directly (MCCOY, ¶ 78, adjusting the pan, zoom, and/or tilt of the selected first camera 104a based on the direction and/or distance of the first object 102a and the second object 102b, as quoted in the rejection of claim 1).
Regarding claim 3, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control system according to claim 1, the operations further comprising: determining if sensors of one type are capable of sensing the event directly based on the map (BUEHLER, ¶ 43) information, (see MCCOY, i.e. zoning based on an object detection such that the distance between objects, the objects are in the different zones. see, ¶ 78, i.e., the processor 202 may adjust the pan, zoom, and/or tilt of the selected first camera 104a based on a direction and/or a distance of each of the first object 102a and the second object 102b, relative to the selected first camera 104a. In another embodiment, the processor 202 may adjust the pan, zoom, and/or tilt of the selected first camera 104a based on a direction and/or a distance of the first object 102a, relative to the second object 102b. For example, both the first object 102a and the second object 102b move in the same direction. In such a case, the processor 202 may zoom the selected first camera 104a to the extent that both the first object 102a and the second object 102b lie in field of view of the first camera 104a. In another example, the first object 102a and the second object 102b move in an opposite direction. In such a case, the processor 202 may zoom out the selected first camera 104a, such that both the first object 102a and the second object 102b remain in field of view of the first camera 104a)
the sensors of the one type being included in the plurality of sensors (as cited above, i.e., cameras); and selecting, as the at least one of the plurality of sensors, at least one of the sensors determined to be capable of sensing the event directly (see citation above, i.e., the first camera).
Regarding claim 4, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control system according to claim 1, the operations further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information (BUEHLER, ¶ 43; see also the meta data cited above, ¶ 77, obtained from a first camera, as cited above, ¶ 71),
the sensors of the one type being included in the plurality of sensors (MCCOY, ¶ 66); and when the sensors of the one type are determined not to be capable of sensing the event directly (MCCOY, ¶ 79), selecting, as the at least one of the plurality of sensors, at least one of sensors of another type being capable of sensing the event from among the plurality of sensors (MCCOY, ¶ 78).
Regarding claim 5, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control system according to claim 1, the operations further comprising: determining if cameras (as cited above, i.e., meta data, ¶ 77, obtained from a first camera, ¶ 71) in the plurality of sensors are capable of sensing the event directly based on the map information (BUEHLER, ¶ 43); and selecting, as the at least one of the plurality of sensors, at least one of the cameras determined to be capable of sensing the event directly (MCCOY, ¶ 79).
Regarding claim 6, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control system according to claim 1, the operations further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information (BUEHLER, ¶ 43); and when the cameras are determined not to be capable of sensing the event directly (MCCOY, ¶¶ 78-79), selecting, as the at least one of the plurality of sensors (see MCCOY, ¶ 66), a microphone (as cited below, i.e., a microphone sensing the voice) being capable of sensing the event from among the plurality of sensors (MCCOY, ¶ 66).
Regarding claim 7, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control system according to claim 1, the operations further comprising: comparing the coverage information with the sensor data (MCCOY, ¶ 79); and selecting the at least one of the plurality of cameras based on the comparing result (MCCOY, ¶ 78).
Regarding claim 8, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control system according to claim 1, the operations further comprising: selecting the at least one of the plurality of sensors based on the type of the event (see MCCOY, ¶ 106).
Regarding claim 9, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control system according to claim 1, the operations further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information (BUEHLER, ¶ 43) including wall or other space defining structures (as cited above).
Regarding claim 10, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control system according to claim 1, the operations further comprising: actuating at least one of the plurality of sensors by changing at least one of direction (MCCOY, ¶ 78), resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors (MCCOY, ¶ 78).
Regarding claim 11, BUEHLER/MCCOY, for the same motivation of combination, discloses a surveillance control method controlling a plurality of sensors comprising: receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors (see rejection of claim 1); selecting at least one of the plurality of sensors based on map information (BUEHLER, ¶ 43), the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image (see rejection of claim 1), the map information concerning wall or other space defining structures (see rejection of claim 1); and actuating the selected at least one of the plurality of sensors (see rejection of claim 1).
Regarding claim 12, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control method according to claim 11, further comprising: selecting the at least one of the plurality of sensors, the at least one being capable of sensing the event directly (this claim recites features similar to those of claim 2; therefore, see the rejection thereof).
Regarding claim 13, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control method according to claim 11, further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information (BUEHLER, ¶ 43), the sensors of the one type being included in the plurality of sensors; and selecting, as the at least one of the plurality of sensors, at least one of the sensors determined to be capable of sensing the event directly (this claim recites features similar to those of claim 3; therefore, see the rejection thereof).
Regarding claim 14, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control method according to claim 11, further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information (BUEHLER, ¶ 43), the sensors of the one type being included in the plurality of sensors; and when the sensors of the one type are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, at least one of sensors of another type being capable of sensing the event from among the plurality of sensors (this claim recites features similar to those of claim 4; therefore, see the rejection thereof).
Regarding claim 15, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control method according to claim 11, further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information (BUEHLER, ¶ 43); and selecting, as the at least one of the plurality of sensors, at least one of the cameras determined to be capable of sensing the event directly (this claim recites features similar to those of claim 5; therefore, see the rejection thereof).
Regarding claim 16, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control method according to claim 11, further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information (BUEHLER, ¶ 43); and when the cameras are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, a microphone being capable of sensing the event from among the plurality of sensors (this claim recites features similar to those of claim 6; therefore, see the rejection thereof).
Regarding claim 17, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control method according to claim 11, further comprising: comparing the coverage information with the sensor data; and selecting the at least one of the plurality of cameras based on the comparing result (this claim recites features similar to those of claim 7; therefore, see the rejection thereof).
Regarding claim 18, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control method according to claim 11, further comprising: selecting the at least one of the plurality of sensors based on the type of the event (this claim recites features similar to those of claim 8; therefore, see the rejection thereof).
Regarding claim 19, BUEHLER/MCCOY, for the same motivation of combination, further discloses the surveillance control method according to claim 11, further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information (BUEHLER, ¶ 43) including wall or other space defining structures (see citation above).
Regarding claim 20, BUEHLER/MCCOY, for the same motivation of combination, discloses a non-transitory computer readable storage medium causing a computer controlling a plurality of sensors to execute operations comprising: receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors (see rejection of claim 1); selecting at least one of the plurality of sensors based on map information (see rejection of claim 1), the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image (see rejection of claim 1), the map information concerning wall or other space defining structures (see rejection of claim 1); and actuating the selected at least one of the plurality of sensors (see rejection of claim 1).
Conclusion
Examiner’s note:
McCoy discloses selecting (¶ 57) a camera (as cited above) based on the coverage analysis (¶¶ 32, 34), and executing (¶¶ 69-71) the coverage analysis (¶¶ 77-78) by analyzing the location (¶¶ 72-73) of the plurality of cameras (¶¶ 74-75) and the event-related data (¶¶ 13-16, 20, 23, 26, 27, 32).
The following prior art of record is considered pertinent to applicant's disclosure.
US 20180172787 A1 (hereafter Aley et al.), see at least ¶¶ 34, 49
US9881216B2 Object tracking and alerts
US 20210099433 A1 VIDEO COMMUNICATION DATA SECURITY
US 20190068895 A1 PRESERVING PRIVACY IN SURVEILLANCE
US 20190050592 A1 SYSTEMS AND METHODS FOR PROCESSING AND HANDLING PRIVACY-SENSITIVE IMAGE DATA
US 20180268240 A1 VIDEO REDACTION METHOD AND SYSTEM
US 20180234847 A1 CONTEXT-RELATED ARRANGEMENTS
US 20180158220 A1 METADATA IN MULTI IMAGE SCENES
Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANK F HUANG whose telephone number is (571)272-0701. The examiner can normally be reached Monday-Friday, 8:30 am - 6:00 pm (Eastern Time), Federal Alternative First Friday Off.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jay Patel, can be reached at (571) 272-2988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/FRANK F HUANG/Primary Examiner, Art Unit 2485