Prosecution Insights
Last updated: April 19, 2026
Application No. 19/000,080

CAMERA LISTING BASED ON COMPARISON OF IMAGING RANGE COVERAGE INFORMATION TO EVENT-RELATED DATA GENERATED BASED ON CAPTURED IMAGE

Non-Final OA §DP
Filed: Dec 23, 2024
Examiner: HUANG, FRANK F
Art Unit: 2485
Tech Center: 2400 — Computer Networks
Assignee: Cloud Byte LLC
OA Round: 1 (Non-Final)

Grant Probability: 75% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 7m
With Interview: 92%

Examiner Intelligence

Career Allow Rate: 75% (519 granted / 691 resolved; +17.1% vs TC avg; above average)
Interview Lift: +17.3% (resolved cases with interview; a strong lift)
Typical Timeline: 2y 7m average prosecution (33 applications currently pending)
Career History: 724 total applications (across all art units)

Statute-Specific Performance

Rejection rates by statute:

§101: 5.0% (-35.0% vs TC avg)
§102: 3.6% (-36.4% vs TC avg)
§103: 72.0% (+32.0% vs TC avg)
§112: 9.3% (-30.7% vs TC avg)

Tech Center averages are estimates. Based on career data from 691 resolved cases.

Office Action

§DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting, provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.

Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over the claims of U.S. Patent No. 12,219,294. Although the claims at issue are not identical, they are not patentably distinct from each other because they recite substantially similar subject matter directed to sensors, differing only in known variations of the application of such sensors.

Instant Application (claims 1-20):

1. A surveillance control system controlling a plurality of sensors comprising: at least one memory storing instructions; and at least one processor connected to the memory that, based on the instructions, performs operations comprising: receiving sensor data including event related data including sensing result and position information from any of the plurality of sensors, at least one of the plurality of sensors comprising a microphone; determining if the sensing result comes from inside a compartment; selecting the at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to a sensing range of the any of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors.

2. The surveillance control system according to claim 1, wherein the event related data includes a direction in which a sound comes.

3. The surveillance control system according to claim 1, wherein each of the plurality of sensors is arranged in one of a plurality of compartments including, and the operation comprises: determining, among the plurality of compartments, the compartment from which the sound comes based on the sensor data; and selecting, as the at least one of the sensors, the at least one of the plurality of cameras based on the determined compartment and the coverage information.

4. The surveillance control system according to claim 3, wherein the operation comprises selecting, from among the plurality of cameras, at least one camera included in the determined compartment.

5. The surveillance control system according to claim 3, wherein at least one microphone and at least one camera are arranged in each of the plurality of compartments.

6. The surveillance control system according to claim 3, wherein the operation comprises: when the position information indicates one of the plurality of compartments, selecting an inside camera arranged in the one of the plurality of compartments; and when the position information indicates outside part of the plurality of compartments, selecting an outside camera capable of capturing the outside part.

7. The surveillance control system according to claim 1, the operations further comprising: comparing the coverage information with the sensor data; and selecting the at least one of the plurality of cameras based on the comparing result.

8. The surveillance control system according to claim 1, the operations further comprising: selecting the at least one of the plurality of sensors based on a type of an event.

9. The surveillance control system according to claim 1, the operations further comprising: in the selecting, identifying the at least one of the plurality of sensors based on map information including wall or other space defining structures.

10. The surveillance control system according to claim 1, the operations further comprising: selecting the at least one of the plurality of sensors by changing at least one of direction, resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors.

11. A surveillance control method controlling a plurality of sensors comprising: receiving sensor data including event related data including sensing result and position information from any of the plurality of sensors, at least one of the plurality of sensors comprising a microphone; determining if the sensing result comes from inside a compartment; selecting the at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to a sensing range of the any of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors.

12. The surveillance control method according to claim 11, wherein the event related data includes a direction in which a sound comes.

13. The surveillance control method according to claim 11, wherein each of the plurality of sensors is arranged in one of a plurality of compartments including, and the surveillance control method comprises: determining, among the plurality of compartments, the compartment from which a sound comes based on the sensor data; and selecting, as the at least one of the sensors, the at least one of a plurality of cameras based on the determined compartment and the coverage information.

14. The surveillance control method according to claim 13, further comprising selecting, from among the plurality of cameras, at least one camera included in the determined compartment.

15. The surveillance control method according to claim 13, wherein at least one microphone and at least one camera are arranged in each of the plurality of compartments.

16. The surveillance control method according to claim 13, further comprising: when the position information indicates one of the plurality of compartments, selecting an inside camera arranged in the one of the plurality of compartments; and when the position information indicates outside part of the plurality of compartments, selecting an outside camera capable of capturing the outside part.

17. The surveillance control method according to claim 11, further comprising: comparing the coverage information with the sensor data; and selecting the at least one of the plurality of cameras based on the comparing result.

18. The surveillance control method according to claim 11, further comprising: selecting the at least one of the plurality of sensors based on a type of an event.

19. The surveillance control method according to claim 11, further comprising: in the selecting, identifying the at least one of the plurality of sensors based on map information including wall or other space defining structures.

20. A non-transitory computer readable storage medium storing a program causing a computer controlling a plurality of sensors to execute operations comprising: receiving sensor data including event related data including sensing result and position information from any of the plurality of sensors, at least one of the plurality of sensors comprising a microphone; determining if the sensing result comes from inside a compartment; selecting the at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to a sensing range of the any of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors.

U.S. Patent No. 12,219,294 (claims 1-20):

1. A surveillance control system controlling a plurality of sensors comprising: at least one memory storing instructions; and at least one processor connected to the memory that, based on the instructions, performs operations comprising: receiving sensor data including event related data including sensing result from at least one microphone included in the plurality of sensors and position information of the at least one microphone; determining if a sound sensed as the sensing result comes from inside a compartment based on the sensing data; selecting at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors.

2. The surveillance control system according to claim 1, wherein the event related data includes a direction in which the sound comes.

3. The surveillance control system according to claim 1, wherein each of the plurality of sensors is arranged in one of a plurality of compartments including, and the operation comprises: determining, among the plurality of compartments, the compartment from which the sound comes based on the sensor data; and selecting, as the at least one of the sensors, the at least one of the plurality of cameras based on the determined compartment and the coverage information.

4. The surveillance control system according to claim 3, wherein the operation comprises selecting, from among the plurality of cameras, at least one camera included in the determined compartment.

5. The surveillance control system according to claim 3, wherein at least one microphone and at least one camera are arranged in each of the plurality of compartments.

6. The surveillance control system according to claim 3, wherein the operation comprises: when the position information indicates one of the plurality of compartments, selecting, from among the plurality of cameras, an inside camera arranged in the one of the plurality of compartments; and when the position information indicates outside part of the plurality of compartments, selecting an outside camera capable of capturing the outside part.

7. The surveillance control system according to claim 1, the operations further comprising: comparing the coverage information with the sensor data; and selecting at least one of the plurality of cameras based on the comparing result.

8. The surveillance control system according to claim 1, the operations further comprising: selecting at least one of the plurality of sensors based on the type of the event.

9. The surveillance control system according to claim 1, the operations further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.

10. The surveillance control system according to claim 1, the operations further comprising: selecting at least one of the plurality of sensors by changing at least one of direction, resolution or pan-tilt-zoom (PTZ) settings of at least one of the plurality of sensors.

11. A surveillance control method controlling a plurality of sensors comprising: receiving sensor data including event related data including sensing result from at least one microphone included in the plurality of sensors and position information of the at least one microphone; determining if a sound sensed as the sensing result comes from inside a compartment based on the sensing data; selecting at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors.

12. The surveillance control method according to claim 11, wherein the event related data includes a direction in which the sound comes.

13. The surveillance control method according to claim 11, wherein each of the plurality of sensors is arranged in one of a plurality of compartments including, and the surveillance control method comprises: determining, among the plurality of compartments, the compartment from which the sound comes based on the sensor data; and selecting, as the at least one of the sensors, the at least one of the plurality of cameras based on the determined compartment and the coverage information.

14. The surveillance control method according to claim 13, further comprising selecting, from among the plurality of cameras, at least one camera included in the determined compartment.

15. The surveillance control method according to claim 13, wherein at least one microphone and at least one camera are arranged in each of the plurality of compartments.

16. The surveillance control method according to claim 13, further comprising: when the position information indicates one of the plurality of compartments, selecting, from among the plurality of cameras, an inside camera arranged in the one of the plurality of compartments; and when the position information indicates outside part of the plurality of compartments, selecting an outside camera capable of capturing the outside part.

17. The surveillance control method according to claim 11, further comprising: comparing the coverage information with the sensor data; and selecting at least one of the plurality of cameras based on the comparing result.

18. The surveillance control method according to claim 11, further comprising: selecting at least one of the plurality of sensors based on the type of the event.

19. The surveillance control method according to claim 11, further comprising: in the selecting, identifying at least one of the plurality of sensors based on map information including wall or other space defining structures.

20. A non-transitory computer readable storage medium storing a program causing a computer controlling a plurality of sensors to execute operations comprising: receiving sensor data including event related data including sensing result from at least one microphone included in the plurality of sensors and position information of the at least one microphone; determining if a sound sensed as the sensing result comes from inside a compartment based on the sensing data; selecting at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to imaging range which each of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors.

Allowable Subject Matter

The following is an examiner's statement of reasons for allowance:

Buehler (Pub. No. US 2015/0244992 A1) discloses a video surveillance subsystem comprising a plurality of cameras each having a field of view, each field of view comprising a plurality of sub-regions; a radio-frequency identification subsystem comprising a plurality of radio-frequency stations, each radio-frequency station configured to detect radio-frequency identification tags within a respective operational radius; and an association module in communication with the video surveillance subsystem and the radio-frequency identification subsystem, the association module for inferring associations among one or more of the sub-regions and one or more of the operational radii, wherein inferring the associations comprises: determining a first probability that a first one of the sub-regions is associated with a first one of the operational radii based on a detection of a tracked object in both the first sub-region and the first operational radius at a first time; and determining a second, higher probability that the first sub-region and the first operational radius are associated based on a detection of a tracked object in both the first sub-region and the first operational radius at a second, later time.

McCoy (US 2015/0116501 A1) discloses various aspects of a system and a method for tracking one or more objects, which may comprise a network capable of communicatively coupling a plurality of cameras, a plurality of sensors, and a controlling device. The controlling device may receive metadata associated with the one or more objects. The metadata identifies the one or more objects. The controlling device may select a first set of cameras from the plurality of cameras to track the one or more objects based on the received metadata. The controlling device may enable tracking the one or more objects by the selected first set of cameras.
However, none of the cited prior art discloses "receiving sensor data including event related data including sensing result and position information from any of the plurality of sensors, at least one of the plurality of sensors comprising a microphone; determining if the sensing result comes from inside a compartment; selecting the at least one of the plurality of sensors based on a result of the determining, the sensor data and coverage information relating to a sensing range of the any of the plurality of sensors is able to image; and actuating the selected at least one of the plurality of sensors."

Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled "Comments on Statement of Reasons for Allowance."

Conclusion

The following prior art of record is considered pertinent to applicant's disclosure:

- US 2018/0172787 A1 (hereafter Aley et al.), see at least paragraphs 34 and 49
- US 9,881,216 B2: Object tracking and alerts
- US 2021/0099433 A1: VIDEO COMMUNICATION DATA SECURITY
- US 2019/0068895 A1: PRESERVING PRIVACY IN SURVEILLANCE
- US 2019/0050592 A1: SYSTEMS AND METHODS FOR PROCESSING AND HANDLING PRIVACY-SENSITIVE IMAGE DATA
- US 2018/0268240 A1: VIDEO REDACTION METHOD AND SYSTEM
- US 2018/0234847 A1: CONTEXT-RELATED ARRANGEMENTS
- US 2018/0158220 A1: METADATA IN MULTI IMAGE SCENES

Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANK F HUANG, whose telephone number is (571) 272-0701. The examiner can normally be reached Monday-Friday, 8:30 am - 6:00 pm (Eastern Time), Federal Alternative First Friday Off. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jay Patel, can be reached at (571) 272-2988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/FRANK F HUANG/
Primary Examiner, Art Unit 2485
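The limitation the examiner found allowable describes an algorithmic flow: receive event-related data (a sensing result plus position information) from a microphone, determine whether the event came from inside a compartment, select a sensor whose coverage information matches, and actuate it. As a purely illustrative sketch of that flow (the class names, fields, and grid-cell coverage model below are hypothetical, not taken from the application or the patent):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical data model; names and structures are illustrative only.
@dataclass
class Camera:
    camera_id: str
    compartment: Optional[str]  # None marks an outside camera
    coverage: set               # grid cells within the camera's imaging range

@dataclass
class SoundEvent:
    position: tuple             # where the sound was localized
    compartment: Optional[str]  # None if detected outside all compartments

def select_cameras(event, cameras):
    """Mirror the claimed flow: if the sound comes from inside a compartment,
    select the inside camera(s) arranged there; otherwise select outside
    cameras whose coverage (imaging range) includes the event position."""
    if event.compartment is not None:
        return [c for c in cameras if c.compartment == event.compartment]
    return [c for c in cameras
            if c.compartment is None and event.position in c.coverage]

cams = [Camera("cam1", "room_a", {(0, 0), (0, 1)}),
        Camera("cam2", None, {(5, 5)})]
print([c.camera_id for c in select_cameras(SoundEvent((0, 1), "room_a"), cams)])  # ['cam1']
print([c.camera_id for c in select_cameras(SoundEvent((5, 5), None), cams)])      # ['cam2']
```

The selected cameras would then be "actuated" (e.g., by changing direction, resolution, or PTZ settings per dependent claim 10); that step is omitted here.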

Prosecution Timeline

Dec 23, 2024
Application Filed
Jan 08, 2026
Non-Final Rejection — §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593052
LOCAL ILLUMINATION COMPENSATION FOR VIDEO ENCODING AND DECODING USING STORED PARAMETERS
2y 5m to grant; granted Mar 31, 2026
Patent 12587725
IMAGE CAPTURING DEVICE AND IMAGE CAPTURING METHOD THEREOF
2y 5m to grant; granted Mar 24, 2026
Patent 12579815
VIDEO SURVEILLANCE SYSTEM
2y 5m to grant; granted Mar 17, 2026
Patent 12574625
SYSTEM WITH LIGHTING CONTROL INCLUDING GROUPED CHANNELS
2y 5m to grant; granted Mar 10, 2026
Patent 12568248
METHOD AND APPARATUS FOR DECODING A VIDEO SIGNAL
2y 5m to grant; granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 75%
With Interview: 92% (+17.3%)
Median Time to Grant: 2y 7m
PTA Risk: Low

Based on 691 resolved cases by this examiner. Grant probability derived from career allow rate.
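The projection figures are consistent with simple arithmetic on the examiner's career totals shown above, assuming (as the footnote states) that grant probability equals the career allow rate and that the interview figure adds the lift in percentage points:

```python
granted, resolved = 519, 691           # examiner's career totals above
interview_lift = 17.3                  # percentage-point lift with interview

allow_rate = 100 * granted / resolved  # career allow rate, in percent
print(round(allow_rate))                    # 75 (the displayed grant probability)
print(round(allow_rate + interview_lift))   # 92 (the "with interview" figure)
```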
