Prosecution Insights
Last updated: April 19, 2026
Application No. 18/616,781

AUTOMATED ANIMAL DETECTION AND RESPONSE

Non-Final OA: §102, §103

Filed: Mar 26, 2024
Examiner: WILSON, BRIAN P
Art Unit: 2689
Tech Center: 2600 — Communications
Assignee: Tyco Fire & Security GmbH
OA Round: 1 (Non-Final)

Grant Probability: 62% (Moderate)
OA Rounds: 1-2
To Grant: 2y 8m
With Interview: 99%

Examiner Intelligence

Grants 62% of resolved cases.

Career Allow Rate: 62% (495 granted / 792 resolved; +0.5% vs TC avg)
Interview Lift: +42.2% (strong; across resolved cases with an interview)
Avg Prosecution: 2y 8m typical timeline; 26 currently pending
Total Applications: 818 career history, across all art units
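The headline figures above are simple ratios of the career counts. As a rough check, a minimal sketch (assuming the career allow rate is just granted divided by resolved — the dashboard does not state its exact methodology):

```python
# Rough check of the examiner stats above. Assumes the dashboard's
# "career allow rate" is simply granted / resolved; this is an
# assumption, not a documented formula.
granted = 495
resolved = 792

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 62.5%, displayed as 62%
```

The quotient is exactly 0.625, which the dashboard rounds down to the displayed 62%.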

Statute-Specific Performance

§101: 1.7% (-38.3% vs TC avg)
§103: 48.0% (+8.0% vs TC avg)
§102: 18.8% (-21.2% vs TC avg)
§112: 24.5% (-15.5% vs TC avg)

Tech Center averages are estimates, based on career data from 792 resolved cases.
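The per-statute deltas imply a common Tech Center baseline. A small sketch (assuming each delta is simply the examiner's rate minus the TC average, which is not stated on the page) recovers it:

```python
# Recover the implied Tech Center average from each statute's rate
# and its "vs TC avg" delta, assuming delta = examiner_rate - tc_avg
# (an assumption about the dashboard's arithmetic, not a stated fact).
stats = {
    "101": (1.7, -38.3),
    "103": (48.0, +8.0),
    "102": (18.8, -21.2),
    "112": (24.5, -15.5),
}
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta  # works out to 40.0 for every statute shown
    print(f"\u00a7{statute}: implied TC avg = {tc_avg:.1f}%")
```

Notably, every row implies the same 40% Tech Center average estimate.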

Office Action

Rejections: §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-4, 9-15 and 20-22 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Read (WO 2022/266705 A1).
Regarding claim 1, Read discloses a method for automated animal detection (see at least Figures 1-6 | [0002]), comprising: receiving one or more data sets from one or more sensors associated with a security system (see at least Figure 1, items 1000, 103, 105 and 3000 | [0055] note the apparatus 1000 uses at least the infrared image sensor 103 and additional sensors 105 to detect the presence of a target animal 3000 in a monitored environment 2000 | [0075-0076] note the processor 104 runs a neural network classifier 4000 that can take a virtually unlimited number of inputs such as data from the imaging device 103, the acoustic sensor 108, additional sensors 105, etc., to produce a determination of a presence and/or classification of an animal species); generating an animal detection score (6005) for a detected animal (6004) based at least in part on the one or more data sets from the one or more sensors (see at least Figure 6, items 6004, 6005 and 6006 | [00119-00121] note a determination of whether an animal has been detected 6004 is based at least on the images from the infrared image sensor 103 and acoustic sensor 108 data | [00126-00128] note at step 6006, the processor 104 determines whether or not a target animal has been detected in the classification of step 6005, like with step 6004, this decision may be based on a confidence measure (exceeding a threshold) produced with the classification of shape or pattern recognition, acoustic recognition, thermal signature detection, and movement or motion detection, see [00121] | [0017-0018] note steps 6005-6006 can also be performed at the server | [0049] note steps 6007-6008+ can also be performed at the server); and causing autonomous execution of one or more intrusion response actions based on the animal detection score satisfying a threshold detection score (see at least Figure 6, note optional actions taken in response to step 6006, such as count animal detection event, administer poison, and/or activate nearby sensors | [00130] note that if optional steps 6007 and 6008 are not taken, then action is taken in step 6009 | [00133] note response actions that can be taken in step 6009).

Regarding claim 2, Read discloses wherein the animal detection score indicates a confidence level associated with a detection of the detected animal (see at least [00128]).

Regarding claim 3, Read discloses wherein at least a first sensor of the one or more sensors is a thermal sensor (see at least [00119] | [00121] note thermal signature detection | [0080]) and at least a second sensor of the one or more sensors is a non-thermal sensor (see at least [00121] note acoustic recognition | [0085] note the acoustic sensor 108).

Regarding claim 4, Read discloses detecting, based on a first data set of the thermal sensor and an output of a model trained for detecting animals in data from thermal sensors, the detected animal in the first data set from the thermal sensor prior to generating the animal detection score, wherein the first data set is from the one or more data sets (see at least Figure 6, items 6003-6004 occur prior to 6005-6006 | [00118-00121] note thermal signature detection can be a result from a machine learning classifier).
Regarding claim 9, Read discloses wherein each sensor of the one or more sensors is associated with a set of user-configured attributes, and wherein a corresponding data set of each sensor associated with the one or more data sets includes one or more values for the set of user-configured attributes (see at least [0064] | [00129] note target animals are stored in memory | [00126] note the animal's shape, physical appearance and/or behavior, the size of the animal, walking patterns, gait, movement speed and characteristics such as the texture, skin/fur patterns and color of the animal can be used to identify the target animal stored in the memory | [00130] note physical markings can be stored for comparison | [00139] note the server can issue software updates to the various apparatus to, for example, update the classification algorithms to more accurately classify the current target animals or change the target animals to be classified | [00144] note the server's management software may also include capability to upload and update sensor software and configurations using "over the air updates" | [00145] note capacity to connect to a network, for either upload, storage and classification of detection data, or download of new configurations and software to control the sensor).

Regarding claim 10, Read discloses wherein the one or more intrusion response actions include at least one of launching of one or more drones associated with the security system, powering on one or more lights associated with the security system, powering on one or more noise makers associated with the security system, powering on one or more sounders associated with the security system, locking one or more access doors or access gates associated with the security system, or transmitting alerts to one or more computing devices of personnel associated with the security system (see at least [00133] note alerting a server, deploying drones, issuing audio signals and alerting monitoring personnel).
Regarding claim 11, Read discloses receiving a configuration indicating a set of animals of interest, wherein the detected animal is one of the set of animals of interest (see at least [00129-00130] note a configuration indicating a set of animals of interest can include storing at least an image of a plurality of target animals, such as a fox, a feral cat and an endangered pygmy possum | [00131] note trained on images of individual animals).

Regarding claim 12, Read discloses an apparatus for automated animal detection (see at least Figures 1-6 | [0002]), comprising: one or more memories storing instructions (see at least [0062] | [0090]); and one or more processors coupled with the one or more memories (see at least [0063] | [00148-00149] | [0017-0018] note steps 6005-6006 can also be performed at the server | [0049] note steps 6007-6008+ can also be performed at the server) and, individually or in combination, configured to execute the instructions to: receive one or more data sets from one or more sensors associated with a security system (see at least Figure 1, items 1000, 103, 105 and 3000 | [0055] note the apparatus 1000 uses at least the infrared image sensor 103 and additional sensors 105 to detect the presence of a target animal 3000 in a monitored environment 2000 | [0075-0076] note the processor 104 runs a neural network classifier 4000 that can take a virtually unlimited number of inputs such as data from the imaging device 103, the acoustic sensor 108, additional sensors 105, etc., to produce a determination of a presence and/or classification of an animal species); generate an animal detection score for a detected animal based at least in part on the one or more data sets from the one or more sensors (see at least Figure 6, items 6004, 6005 and 6006 | [00119-00121] note a determination of whether an animal has been detected 6004 is based at least on the images from the infrared image sensor 103 and acoustic sensor 108 data | [00126-00128] note at step 6006, the processor 104 determines whether or not a target animal has been detected in the classification of step 6005, like with step 6004, this decision may be based on a confidence measure (exceeding a threshold) produced with the classification of shape or pattern recognition, acoustic recognition, thermal signature detection, and movement or motion detection, see [00121] | [0017-0018] note steps 6005-6006 can also be performed at the server | [0049] note steps 6007-6008+ can also be performed at the server); and cause autonomous execution of one or more intrusion response actions based on the animal detection score satisfying a threshold detection score (see at least Figure 6, note optional actions taken in response to step 6006, such as count animal detection event, administer poison, and/or activate nearby sensors | [00130] note that if optional steps 6007 and 6008 are not taken, then action is taken in step 6009 | [00133] note response actions that can be taken in step 6009).

Regarding claim 13, Read discloses wherein the animal detection score indicates a confidence level associated with a detection of the detected animal (see at least [00128]).

Regarding claim 14, Read discloses wherein at least a first sensor of the one or more sensors is a thermal sensor (see at least [00119] | [00121] note thermal signature detection | [0080]) and at least a second sensor of the one or more sensors is a non-thermal sensor (see at least [00121] note acoustic recognition | [0085] note the acoustic sensor 108).
Regarding claim 15, Read discloses detect, based on a first data set of the thermal sensor and an output of a model trained for detecting animals in data from thermal sensors, the detected animal in the first data set from the thermal sensor prior to generating the animal detection score, wherein the first data set is from the one or more data sets (see at least Figure 6, items 6003-6004 occur prior to 6005-6006 | [00118-00121] note thermal signature detection can be a result from a machine learning classifier).

Regarding claim 20, Read discloses wherein each sensor of the one or more sensors is associated with a set of user-configured attributes, and wherein a corresponding data set of each sensor associated with the one or more data sets includes one or more values for the set of user-configured attributes (see at least [0064] | [00129] note target animals are stored in memory | [00126] note the animal's shape, physical appearance and/or behavior, the size of the animal, walking patterns, gait, movement speed and characteristics such as the texture, skin/fur patterns and color of the animal can be used to identify the target animal stored in the memory | [00130] note physical markings can be stored for comparison | [00139] note the server can issue software updates to the various apparatus to, for example, update the classification algorithms to more accurately classify the current target animals or change the target animals to be classified | [00144] note the server's management software may also include capability to upload and update sensor software and configurations using "over the air updates" | [00145] note capacity to connect to a network, for either upload, storage and classification of detection data, or download of new configurations and software to control the sensor).
Regarding claim 21, Read discloses wherein the one or more intrusion response actions include at least one of launching of one or more drones associated with the security system, powering on one or more lights associated with the security system, powering on one or more noise makers associated with the security system, powering on one or more sounders associated with the security system, locking one or more access doors or access gates associated with the security system, or transmitting alerts to one or more computing devices of personnel associated with the security system (see at least [00133] note alerting a server, deploying drones, issuing audio signals and alerting monitoring personnel).

Regarding claim 22, Read discloses wherein the one or more processors, individually or in combination, are further configured to execute the instructions to: receive a configuration indicating a set of animals of interest, wherein the detected animal is one of the set of animals of interest (see at least [00129-00130] note a configuration indicating a set of animals of interest can include storing at least an image of a plurality of target animals, such as a fox, a feral cat and an endangered pygmy possum | [00131] note trained on images of individual animals).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 5, 6, 16 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Read (WO 2022/266705 A1) in view of Koishida (US 2018/0233142 A1).

Regarding claim 5, Read discloses wherein generating the animal detection score further comprises: generating, based on detecting the detected animal, a first score (see at least [00128]); detecting the detected animal in a second data set of the non-thermal sensor (see at least [00121] note acoustic recognition), wherein the second data set is from the one or more data sets (see at least [00121] | [0075] note unlimited inputs); and determining the animal detection score based at least in part on the second data set (see at least [00121] | [00128] note like with step 6004, which includes at least thermal signature detection, acoustic recognition, shape or pattern recognition, etc.). However, Read does not specifically disclose updating the first score to generate a second score; and determining the animal detection score based at least in part on the second score. It is known to identify a monitored entity in different ways. For example, Koishida teaches a system that updates a first score to generate a second score; and determines a detection score based at least in part on the second score (see at least [0046] | [0025]).
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the features of Koishida into Read. This provides the ability to fuse Read's data from multiple sensors in order to output more accurate predictions.

Regarding claim 6, Read in view of Koishida teach wherein the second data set of the non-thermal sensor is generated within a threshold period of time as the first data set from the thermal sensor (see at least [0046] of Koishida, note the sounds received correspond to facial movements of the entity visible to the camera when the sounds were received, not some period of time after the sounds were received).

Regarding claim 16, Read in view of Koishida, as addressed above, teach wherein to generate the animal detection score, the one or more processors, individually or in combination, are further configured to execute the instructions to: generate, based on detecting the detected animal, a first score; update, based on detecting the detected animal in a second data set of the non-thermal sensor, the first score to generate a second score, wherein the second data set is from the one or more data sets; and determine the animal detection score based at least in part on the second score (see at least [00128] of Read, note like with step 6004, which includes at least thermal signature detection, acoustic recognition, shape or pattern recognition, etc. | [00121] of Read, note acoustic recognition | [0075] of Read, note unlimited inputs | [0046] of Koishida | [0025] of Koishida).
Regarding claim 17, Read in view of Koishida, as addressed above, teach wherein the second data set of the non-thermal sensor is generated within a threshold period of time as the first data set from the thermal sensor (see at least [0046] of Koishida, note the sounds received correspond to facial movements of the entity visible to the camera when the sounds were received, not some period of time after the sounds were received).

Claims 7, 8, 18 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Read (WO 2022/266705 A1) in view of Roberts (US 10,733,822 B1).

Regarding claim 7, Read discloses detecting, based at least in part on the one or more data sets from the one or more sensors, a second detected animal (see at least [00129] note a plurality of target animals can be stored in the memory, such as a fox, feral cat, possum, etc.); monitoring the second detected animal within a building associated with the security system or outside of the building (see at least [00145] note a plurality of sensors spatially dispersed in the environment in order to detect more animals in a wider variety of locations, habitats and times | [0010-0011] note the environment, such as a forest, is outside of a building); and autonomously executing one or more intrusion response actions based on the second detected animal (see at least Figure 6, items 6006→optionally take action and 6009 | [00133] note actions). However, Read does not specifically disclose determining whether the second detected animal is within a building associated with the security system or outside of the building; and autonomously locking one or more access doors or access gates associated with the second detected animal in response to determining that the second detected animal is within the building. It is known to monitor animals in various environments.
For example, Roberts teaches a system that determines whether a second detected animal is within a building associated with a security system or outside of the building; and autonomously locks one or more access doors or access gates associated with the second detected animal in response to determining that the second detected animal is within the building (see at least col. 10, lines 35-51 | col. 10, line 1 | col. 10, lines 10-21 | col. 4, lines 26-53). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the features of Roberts into Read. This provides the ability to monitor Read's pets and lock them within a building if/when needed (e.g., in the event they may escape) (see [0003] of Read, note discriminating wildlife, pets and possibly certain individuals of particular species to safeguard them from, or target them for, automated control actions | [00101] of Read, note alerting an owner of a pet (if a pet is identified)).

Regarding claim 8, Read in view of Roberts teach transmitting an alert to a computing device associated with a personnel of the building in response to determining that the second detected animal is outside of the building, wherein the alert indicates at least one of a presence of the second detected animal outside of the building, or a location of the second detected animal (see at least col. 4, lines 26-41 of Roberts).
Regarding claim 18, Read in view of Roberts, as addressed above, teach wherein the one or more processors, individually or in combination, are further configured to execute the instructions to: detect, based at least in part on the one or more data sets from the one or more sensors, a second detected animal; determine whether the second detected animal is within a building associated with the security system or outside of the building; and autonomously lock one or more access doors or access gates associated with the second detected animal in response to determining that the second detected animal is within the building (see at least [00129] of Read, note a plurality of target animals can be stored in the memory, such as a fox, feral cat, possum, etc. | [00145] of Read, note a plurality of sensors spatially dispersed in the environment in order to detect more animals in a wider variety of locations, habitats and times | [0010-0011] of Read, note the environment, such as a forest, is outside of a building | Figure 6 of Read, note items 6006→optionally take action and 6009 | [00133] of Read, note actions | col. 10, lines 35-51 of Roberts | col. 10, line 1 of Roberts | col. 10, lines 10-21 of Roberts | col. 4, lines 26-53 of Roberts).

Regarding claim 19, Read in view of Roberts, as addressed above, teach wherein the one or more processors, individually or in combination, are further configured to execute the instructions to: transmit an alert to a computing device associated with a personnel of the building in response to determining that the second detected animal is outside of the building, wherein the alert indicates at least one of a presence of the second detected animal outside of the building, or a location of the second detected animal (see at least col. 4, lines 26-41 of Roberts).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRIAN WILSON whose telephone number is 571-270-5884.
The examiner can normally be reached Monday-Friday, 9:00 am-5:00 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, DAVETTA GOINS, can be reached at 571-272-2957. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BRIAN WILSON/ Primary Examiner, Art Unit 2689

Prosecution Timeline

Mar 26, 2024
Application Filed
Dec 13, 2025
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12584404
DIRECTIONAL DRILLING COMMUNICATION PROTOCOLS, APPARATUS AND METHODS
Granted Mar 24, 2026 · 2y 5m to grant

Patent 12576868
INCLEMENT WEATHER DETECTION
Granted Mar 17, 2026 · 2y 5m to grant

Patent 12567317
SYSTEM AND METHOD FOR PREVENTION OF ACCIDENTS DUE TO TRIPPING OR BUMPING ON COMMON EQUIPMENT AND OPEN DOORS
Granted Mar 03, 2026 · 2y 5m to grant

Patent 12562046
SYSTEM AND METHOD FOR MONITORING LOSS OF FISHING GEAR AND ESTIMATING LOCATION OF LOST FISHING GEAR
Granted Feb 24, 2026 · 2y 5m to grant

Patent 12542043
Dynamic Context Aware Response System for Enterprise Protection
Granted Feb 03, 2026 · 2y 5m to grant
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 62%
With Interview: 99% (+42.2%)
Median Time to Grant: 2y 8m
PTA Risk: Low

Based on 792 resolved cases by this examiner. Grant probability derived from career allow rate.
