Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over the claims of U.S. Patent No. 11,800,063 and U.S. Patent No. 12,212,892. Although the claims at issue are not identical, they are not patentably distinct from each other because they recite substantially similar surveillance operations with substantially similar sensor structures, selecting the sensor based on slightly different conditional parameters.
Instant Application
U.S. Patent No. 11,800,063
1. A surveillance control system controlling a plurality of sensors comprising: at least one memory storing instructions; and at least one processor connected to the memory that, based on the instructions, performs operations comprising:
receiving sensor data including event related data relating to at least one of a scale of the event and an occurrence location of the event from at least one of the plurality of sensors;
selecting, based on the at least one of the scale of the event, the occurrence location of the event, and a sensing range of the at least one of the plurality of sensors,
at least a second one of the plurality of sensors based on a type of the event, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least the second one of the plurality of sensors.
2. The surveillance control system according to claim 1, the operations further comprising: comparing the coverage information with the sensor data; and selecting the at least second one of the plurality of sensors based on a comparing result of comparing the coverage information with the sensor data.
3. The surveillance control system according to claim 1, the operations further comprising: in the selecting, identifying the at least second one of the plurality of sensors based on map information including wall or other space defining structures.
4. The surveillance control system according to claim 1, the operations further comprising: actuating the at least second one of the plurality of sensors by changing at least one of direction, resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors.
5. The surveillance control system according to claim 1, wherein the coverage information includes information concerning Field of View (FOV) of the plurality of sensors.
6. The surveillance control system according to claim 1, wherein the coverage information includes information concerning Field of Regard (FOR) for non-directional or movable sensors included in the plurality of sensors.
7. The surveillance control system according to claim 1, wherein the selecting includes selecting the at least second one of the plurality of sensors further based on map information concerning wall or other space defining structures.
8. The surveillance control system according to claim 1, wherein the selecting includes selecting the at least second one of the plurality of sensors further based on capabilities of sensor actuators of the plurality of sensors.
9. A surveillance control method controlling a plurality of sensors comprising: receiving sensor data including event related data relating to at least one of a scale of the event and an occurrence location of the event from at least one of the plurality of sensors; selecting, based on the at least one of the scale of the event, the occurrence location of the event, and a sensing range of the at least one of the plurality of sensors, at least a second one of the plurality of sensors based on a type of the event, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least second one of the plurality of sensors.
10. The surveillance control method according to claim 9, further comprising: comparing the coverage information with the sensor data; and selecting the at least second one of the plurality of sensors based on a comparing result of comparing the coverage information with the sensor data.
11. The surveillance control method according to claim 9, further comprising: in the selecting, identifying the at least second one of the plurality of sensors based on map information including wall or other space defining structures.
12. The surveillance control method according to claim 9, further comprising: actuating the at least second one of the plurality of sensors by changing at least one of direction, resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors.
13. The surveillance control method according to claim 9, wherein the coverage information includes information concerning Field of View (FOV) of the plurality of sensors.
14. The surveillance control method according to claim 9, wherein the coverage information includes information concerning Field of Regard (FOR) for non-directional or movable sensors included in the plurality of sensors.
15. The surveillance control method according to claim 9, wherein the selecting includes selecting the at least second one of the plurality of sensors further based on map information concerning wall or other space defining structures.
16. The surveillance control system according to claim 9, wherein the selecting includes selecting the at least second one of the plurality of sensors further based on capabilities of sensor actuators of the plurality of sensors.
17. A non-transitory computer readable storage medium causing a computer controlling a plurality of sensors to execute operations comprising: receiving sensor data including event related data relating to at least one of a scale of the event and an occurrence location of the event from at least one of the plurality of sensors; selecting, based on the at least one of the scale of the event, the occurrence location of the event, and a sensing range of the at least one of the plurality of sensors, at least one of the plurality of sensors based on a type of the event, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least second one of the plurality of sensors.
18. The non-transitory computer readable storage medium according to claim 17, the operations further comprising: comparing the coverage information with the sensor data; and selecting the at least second one of the plurality of sensors based on a comparing result of comparing the coverage information with the sensor data.
19. The non-transitory computer readable storage medium according to claim 17, the operations further comprising: in the selecting, identifying the at least second one of the plurality of sensors based on map information including wall or other space defining structures.
20. The non-transitory computer readable storage medium according to claim 17, the operations further comprising: actuating the at least second one of the plurality of sensors by changing at least one of direction, resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors.
1. A surveillance control system controlling a plurality of sensors comprising: at least one memory storing instructions; and at least one processor connected to the memory that, based on the instructions, performs operations comprising:
receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors;
selecting, when it is determined that the at least one of the scale of the event and the occurrence location of the event exceeds a sensing range of the at least one of the plurality of sensors,
at least one of the plurality of sensors based on map information, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image, the map information concerning wall or other space defining structures; and
actuating the selected at least one of the plurality of sensors.
2. The surveillance control system according to claim 1, the operations further comprising: selecting the at least one of the plurality of sensors, the at least one being capable of sensing the event directly.
3. The surveillance control system according to claim 1, the operations further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and selecting, as the at least one of the plurality of sensors, at least one of the sensors determined to be capable of sensing the event directly.
4. The surveillance control system according to claim 1, the operations further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and when the sensors of the one type are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, at least one of sensors of another type being capable of sensing the event from among the plurality of sensors.
5. The surveillance control system according to claim 1, the operations further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and selecting, as the at least one of the plurality of sensors, at least one of the cameras determined to be capable of sensing the event directly.
6. The surveillance control system according to claim 1, the operations further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and when the cameras are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, a microphone being capable of sensing the event from among the plurality of sensors.
7. The surveillance control system according to claim 1, the operations further comprising: comparing the coverage information with the sensor data; and selecting the at least one of the plurality of cameras based on the comparing result.
8. The surveillance control system according to claim 1, the operations further comprising: selecting the at least one of the plurality of sensors based on the type of the event.
9. The surveillance control system according to claim 1, the operations further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.
10. The surveillance control system according to claim 1, the operations further comprising: actuating at least one of the plurality of sensors by changing at least one of direction, resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors.
11. A surveillance control method controlling a plurality of sensors comprising: receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors; selecting, when it is determined that the at least one of the scale of the event and the occurrence location of the event exceeds a sensing range of the at least one of the plurality of sensors, at least one of the plurality of sensors based on map information, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image, the map information concerning wall or other space defining structures; and actuating the selected at least one of the plurality of sensors.
12. The surveillance control method according to claim 11, further comprising: selecting the at least one of the plurality of sensors, the at least one being capable of sensing the event directly.
13. The surveillance control method according to claim 11, further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and selecting, as the at least one of the plurality of sensors, at least one of the sensors determined to be capable of sensing the event directly.
14. The surveillance control method according to claim 11, further comprising: determining if sensors of one type are capable of sensing the event directly based on the map information, the sensors of the one type being included in the plurality of sensors; and when the sensors of the one type are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, at least one of sensors of another type being capable of sensing the event from among the plurality of sensors.
15. The surveillance control method according to claim 11, further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and selecting, as the at least one of the plurality of sensors, at least one of the cameras determined to be capable of sensing the event directly.
16. The surveillance control method according to claim 11, further comprising: determining if cameras in the plurality of sensors are capable of sensing the event directly based on the map information; and when the cameras are determined not to be capable of sensing the event directly, selecting, as the at least one of the plurality of sensors, a microphone being capable of sensing the event from among the plurality of sensors.
17. The surveillance control method according to claim 11, further comprising: comparing the coverage information with the sensor data; and selecting the at least one of the plurality of cameras based on the comparing result.
18. The surveillance control method according to claim 11, further comprising: selecting the at least one of the plurality of sensors based on the type of the event.
19. The surveillance control method according to claim 11, further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.
20. A non-transitory computer readable storage medium causing a computer controlling a plurality of sensors to execute operations comprising: receiving sensor data including event related data relating to at least one of scale of the event and occurrence location of the event from at least one of the plurality of sensors; selecting, when it is determined that the at least one of the scale of the event and the occurrence location of the event exceeds a sensing range of the at least one of the plurality of sensors, at least one of the plurality of sensors based on map information, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image, the map information concerning wall or other space defining structures; and actuating the selected at least one of the plurality of sensors.
U.S. Patent No. 12,212,892
1. A surveillance control system controlling a plurality of sensors comprising: at least one memory storing instructions; and at least one processor connected to the memory that, based on the instructions, performs operations comprising:
receiving sensor data including event related data including sensing result from at least one microphone included in the plurality of sensors and
position information of the at least one microphone; determining if a sound sensed as the sensing result comes from inside a compartment based on the sensing data;
selecting at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and
actuating the selected at least one of the plurality of sensors.
2. The surveillance control system according to claim 1, wherein the event related data includes a direction in which the sound comes.
3. The surveillance control system according to claim 1, wherein each of the plurality of sensors is arranged in one of a plurality of compartments including, and the operation comprises: determining, among the plurality of compartments, the compartment from which the sound comes based on the sensor data; and selecting, as the at least one of the sensors, the at least one of the plurality of cameras based on the determined compartment and the coverage information.
4. The surveillance control system according to claim 3, wherein the operation comprises selecting, from among the plurality of cameras, at least one camera included in the determined compartment.
5. The surveillance control system according to claim 3, wherein at least one microphone and at least one camera are arranged in each of the plurality of compartments.
6. The surveillance control system according to claim 3, wherein the operation comprises: when the position information indicates one of the plurality of compartments, selecting, from among the plurality of cameras, an inside camera arranged in the one of the plurality of compartments; and when the position information indicates outside part of the plurality of compartments, selecting an outside camera capable of capturing the outside part.
7. The surveillance control system according to claim 1, the operations further comprising: comparing the coverage information with the sensor data; and selecting at least one of the plurality of cameras based on the comparing result.
8. The surveillance control system according to claim 1, the operations further comprising: selecting at least one of the plurality of sensors based on the type of the event.
9. The surveillance control system according to claim 1, the operations further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.
10. The surveillance control system according to claim 1, the operations further comprising: selecting at least one of the plurality of sensors by changing at least one of direction, resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors.
11. A surveillance control method controlling a plurality of sensors comprising: receiving sensor data including event related data including sensing result from at least one microphone included in the plurality of sensors and position information of the at least one microphone; determining if a sound sensed as the sensing result comes from inside a compartment based on the sensing data; selecting at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors.
12. The surveillance control method according to claim 11, wherein the event related data includes a direction in which the sound comes.
13. The surveillance control method according to claim 11, wherein each of the plurality of sensors is arranged in one of a plurality of compartments including, and the surveillance control method comprises: determining, among the plurality of compartments, the compartment from which the sound comes based on the sensor data; and selecting, as the at least one of the sensors, the at least one of the plurality of cameras based on the determined compartment and the coverage information.
14. The surveillance control method according to claim 13, further comprising selecting, from among the plurality of cameras, at least one camera included in the determined compartment.
15. The surveillance control method according to claim 13, wherein at least one microphone and at least one camera are arranged in each of the plurality of compartments.
16. The surveillance control method according to claim 13, further comprising: when the position information indicates one of the plurality of compartments, selecting, from among the plurality of cameras, an inside camera arranged in the one of the plurality of compartments; and when the position information indicates outside part of the plurality of compartments, selecting an outside camera capable of capturing the outside part.
17. The surveillance control method according to claim 11, further comprising: comparing the coverage information with the sensor data; and selecting at least one of the plurality of cameras based on the comparing result.
18. The surveillance control method according to claim 11, further comprising: selecting at least one of the plurality of sensors based on the type of the event.
19. The surveillance control method according to claim 11, further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.
20. A non-transitory computer readable storage medium storing a program causing a computer controlling a plurality of sensors to execute operations comprising: receiving sensor data including event related data including sensing result from at least one microphone included in the plurality of sensors and position information of the at least one microphone; determining if a sound sensed as the sensing result comes from inside a compartment based on the sensing data; selecting at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors.
Allowable Subject Matter
The following is an examiner’s statement of reasons for allowance:
US Pub. No. US 2015/0244992 A1 discloses a video surveillance subsystem comprising a plurality of cameras each having a field of view, each field of view comprising a plurality of sub-regions; a radio-frequency identification subsystem comprising a plurality of radio-frequency stations, each radio-frequency station configured to detect radio-frequency identification tags within a respective operational radius; and an association module in communication with the video surveillance subsystem and the radio-frequency identification subsystem, the association module for inferring associations among one or more of the sub-regions and one or more of the operational radii, wherein inferring the associations comprises: determining a first probability that a first one of the sub-regions is associated with a first one of the operational radii based on a detection of a tracked object in both the first sub-region and the first operational radius at a first time; and determining a second, higher probability that the first sub-region and the first operational radius are associated based on a detection of a tracked object in both the first sub-region and the first operational radius at a second, later time.
US 20150116501 A1 discloses, in a network capable of communicatively coupling a plurality of cameras, a plurality of sensors, and a controlling device, one or more processors in said controlling device operable to: receive metadata associated with said one or more objects, wherein said metadata identifies said one or more objects; select a Radio Frequency Identification (RFID) sensor associated with said one or more objects, from said plurality of sensors, based on one or more signals received from said plurality of sensors; select a first set of cameras from said plurality of cameras to track said one or more objects based on said received metadata; and enable tracking of said one or more objects by said selected first set of cameras based on a signal received from said selected RFID sensor.
None of the cited prior art references discloses "receiving sensor data including event related data relating to at least one of a scale of the event and an occurrence location of the event from at least one of the plurality of sensors; selecting, based on the at least one of the scale of the event, the occurrence location of the event, and a sensing range of the at least one of the plurality of sensors, at least a second one of the plurality of sensors based on a type of the event, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least the second one of the plurality of sensors."
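For orientation only (this sketch is not part of the record and all names, data structures, and thresholds are illustrative assumptions, not the applicant's implementation), the quoted receiving/selecting/actuating limitation can be read as the following hypothetical control flow: a reporting sensor supplies an event with a type, scale, and location, and a second sensor is chosen whose coverage (imaging range) contains the event location and whose capabilities suit the event.

```python
from dataclasses import dataclass, field

@dataclass
class Sensor:
    name: str
    coverage: tuple   # (x_min, y_min, x_max, y_max): imaging range the sensor can image
    max_scale: float  # largest event scale this sensor can usefully capture
    types: set = field(default_factory=set)  # event types the sensor can observe

def covers(sensor, location):
    """True if the event location falls inside the sensor's coverage rectangle."""
    x, y = location
    x0, y0, x1, y1 = sensor.coverage
    return x0 <= x <= x1 and y0 <= y <= y1

def select_second_sensor(event, reporting, sensors):
    """Select a second sensor (other than the reporting one) based on the
    event type, scale, location, and the coverage information of each sensor.
    Returns None when no suitable second sensor exists."""
    for s in sensors:
        if s is reporting:
            continue
        if (event["type"] in s.types
                and covers(s, event["location"])
                and event["scale"] <= s.max_scale):
            return s  # this sensor would then be actuated (e.g., PTZ adjusted)
    return None
```

For example, if a first camera reports an intrusion just outside its own imaging range, the loop returns the neighboring camera whose coverage rectangle contains the reported location.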
Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled “Comments on Statement of Reasons for Allowance.”
Conclusion
The following prior art of record is considered pertinent to applicant's disclosure.
US 20180172787 A1 – hereinafter Aley et al.; see at least paragraphs 34 and 49.
US9881216B2 Object tracking and alerts
US 20210099433 A1 VIDEO COMMUNICATION DATA SECURITY
US 20190068895 A1 PRESERVING PRIVACY IN SURVEILLANCE
US 20190050592 A1 SYSTEMS AND METHODS FOR PROCESSING AND HANDLING PRIVACY-SENSITIVE IMAGE DATA
US 20180268240 A1 VIDEO REDACTION METHOD AND SYSTEM
US 20180234847 A1 CONTEXT-RELATED ARRANGEMENTS
US 20180158220 A1 METADATA IN MULTI IMAGE SCENES
Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANK F HUANG whose telephone number is (571)272-0701. The examiner can normally be reached Monday-Friday, 8:30 am - 6:00 pm (Eastern Time), Federal Alternative First Friday Off.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jay Patel, can be reached at (571) 272-2988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/FRANK F HUANG/Primary Examiner, Art Unit 2485