Prosecution Insights
Last updated: April 19, 2026
Application No. 17/447,103

DISASTER PREDICTION AND PROACTIVE MITIGATION

Status: Non-Final Office Action (§103), OA Round 5
Filed: Sep 08, 2021
Examiner: YOUNG, ASHLEY YA-SHEH
Art Unit: 3625
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: International Business Machines Corporation

Grant Probability: 30% (At Risk); 47% with interview
Expected OA Rounds: 5-6
Time to Grant: 4y 2m

Examiner Intelligence

Career Allow Rate: 30% (59 granted / 196 resolved; -21.9% vs TC avg)
Interview Lift: +17.2% on resolved cases with interview
Typical Timeline: 4y 2m average prosecution; 11 applications currently pending
Career History: 207 total applications across all art units
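The headline figures above reduce to simple ratios. As a minimal sketch (the variable names are my own, and how the dashboard rounds its displayed values is an assumption):

```python
# Sketch of how the examiner stats above fit together.
# Figures come from the report; rounding behavior is assumed.

granted, resolved = 59, 196

career_allow_rate = granted / resolved
print(f"Career allow rate: {career_allow_rate:.1%}")  # 30.1%, shown as 30%

# The reported +17.2% interview lift reads as an absolute bump on that base:
interview_lift = 0.172
with_interview = career_allow_rate + interview_lift
print(f"With interview: {with_interview:.1%}")  # 47.3%, shown as 47%
```

This also explains the apparent mismatch between "30%", "47%", and "+17.2%": the lift is applied to the unrounded 30.1% base before display rounding.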

Statute-Specific Performance

§101: 33.2% (-6.8% vs TC avg)
§103: 42.6% (+2.6% vs TC avg)
§102: 15.5% (-24.5% vs TC avg)
§112: 6.0% (-34.0% vs TC avg)

Tech Center averages are estimates; based on career data from 196 resolved cases.
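A quick consistency check on the statute table: subtracting each delta from the examiner's rate should recover the same Tech Center baseline for every statute. The dict layout and the resulting ~40% figure are derived here, not stated in the report:

```python
# Per-statute rates and deltas vs the Tech Center average, from the table above.
stats = {
    "§101": (33.2, -6.8),
    "§103": (42.6, +2.6),
    "§102": (15.5, -24.5),
    "§112": (6.0, -34.0),
}

for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta  # implied Tech Center baseline for this statute
    print(f"{statute}: examiner {rate}% vs TC avg {tc_avg:.1f}%")
```

Every row implies the same ~40.0% Tech Center average, so the deltas are internally consistent.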

Office Action

§103

DETAILED ACTION

Status of Claims

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/24/25 has been entered.

Claims 1, 8, 15 have been amended. Claims 4-5, 11-12, and 18-19 have been previously canceled. No claims have been added. Claims 1-3, 6-10, 13-17, and 20 have been considered as follows.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments with respect to the rejection of the pending claims under 35 U.S.C. § 103 have been considered but are moot in view of the new ground(s) of rejection. Regarding Applicant's argument that the cited references do not teach of the newly added limitations drawn to transmitting instructions to an autonomous robotic device, Castelli et al. (US 2018/0049407 A1, herein Castelli) has been brought in, necessitated by amendment, to illustrate this aspect (see claim 1 rejection below).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1-3, 6-10, 13-17, and 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Castelli et al. (US 2018/0049407 A1, herein Castelli) in view of Houston et al. (US 2016/0144404 A1, herein Houston).

As per claim 1, Castelli teaches of a processor-implemented method, the method comprising:

generating a knowledge corpus of a grazing pattern of a livestock group in a preconfigured area (pg. 4, [0038] which describes how the camera may be configured to detect one or more animals and track locations of one or more animals to avoid overgrazing of herds by periodically capturing images of animal movement and mapping such geolocations; and pg. 8, [0073] which describes how the database may store information related to past grazing areas and/or types of animals that have grazed in particular foraging areas, where the mapping device may appropriately select a geolocation foraging zone such that overgrazing in particular zones are avoided);

generating a knowledge corpus of a growing pattern of flora for each of a plurality of sections in the preconfigured area based on a flora type and weather conditions within a preconfigured time period in the preconfigured area (pg. 2, [0017] which describes unmanned aerial vehicles for determining geolocation foraging zones for animals, including monitoring vegetation characteristics of particular geolocations; and pg. 3, [0034] which describes how the unmanned aerial vehicle may include at least one camera configured to provide visual feedback and may capture/provide images of geolocations corresponding to vegetation/foliage to identify geolocation foraging zones, where the unmanned aerial vehicle may track and/or identify vegetation characteristics (e.g., type of vegetation, quality of vegetation, quantity of vegetation, etc.) and may provide corresponding geolocation information to generate a virtual map indicating foraging zones (e.g., areas in which animals may forage and/or where grazing may be beneficial));

generating a prediction of elevated risk of a natural disaster related to flora overgrowth based on the knowledge corpuses (pg. 1, [0002-0003] which describes how overgrazing in a particular location for an extended amount of time without sufficient recovery periods may be detrimental to the land, wildlife, and livestock, where grazing systems have been employed to minimize the effects of overgrazing, where appropriate levels of grazing may be effective in at least reducing fire hazards due to forage buildup; and pg. 1, [0004] which describes a risk analysis device to evaluate a level of risk associated with each of the plurality of geolocations; and pg. 2, [0017] which describes enabling the unmanned aerial vehicle to determine a level of risk associated with foraging zones; and pg. 8, [0075-0076] which describes how the risk analysis device may be configured to evaluate a level of risk associated with a geolocation foraging zone, where other factors such as environmental factors may be employed to determine a level of risk associated with the geolocation foraging zone, including at least weather predictions, weather analysis information, and historical data such as viability of vegetation);

transmitting instructions to an autonomous robotic device to traverse the preconfigured area to a current location of the livestock group and to relocate the livestock group from the current location to a new location at a lower risk of the natural disaster than the current location based on the prediction; and

directing, by the autonomous robotic device, the livestock group to the new location using prerecorded audio cues (pg. 1, [0004] which describes a mapping device coupled to the monitoring device to select at least one geolocation foraging zone when the level of risk associated with the at least one geolocation foraging zone is below a predetermined threshold value; and pg. 2, [0017] which describes enabling the unmanned aerial vehicle to determine a level of risk associated with foraging zones and provide deterrence action such that the animals avoid particularly dangerous foraging zones; and pg. 3, [0033] which describes providing movement for the unmanned aerial vehicle to determine geolocation information for foraging zones and/or guiding one or more animals to a selected foraging zone; and pg. 5, [0048] which describes how the unmanned aerial vehicle may include a guidance generator that may perform one or more functions to guide/lead the detected animals to a selected geolocation foraging zone, such as generating a sound and/or verbal instructions/commands via a speaker and/or activating the movement mechanisms such that movement from the unmanned aerial vehicle induces the animals to relocate).

However, Castelli fails to explicitly teach of directing livestock using haptic sensations.

Houston teaches of synchronized array of vibration actuators in an integrated module, specifically including directing the livestock group to a location identified in the generated prediction using prerecorded audio cues and haptic sensations produced through focused pressure fields created in mid-air by an array of ultrasound transducers (abstract which describes the synchronized array of vibration actuators; and pg. 49, [0688] which describes using beat patterns in haptic navigation; and pg. 50, [0689] which describes how the vibration beat patterns may be simultaneously enhanced with relevant visual stimuli, relevant audio stimuli, or both, where these types of devices may be used to gently and humanely guide animals away from certain areas via vibration collars, particularly if there were certain frequencies that produced sharp responses in the given animal).

Castelli teaches of unmanned aerial vehicles for determining geolocation foraging zones. Houston teaches of synchronized array of vibration actuators in an integrated module, specifically including directing livestock to a specific location using prerecorded audio cues and haptic sensations. Both references are drawn to tracking and managing livestock. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Castelli with the prerecorded audio cues and haptic sensations as taught by Houston for the purpose of gently and humanely guiding animals away from certain areas (Houston, pg. 50, [0689]). By doing so, one would reasonably expect the overall appeal of the invention to improve in efficiency in the managing and tracking of livestock by providing additional means to direct and manage the livestock.

As per claim 8, it refers to a computer system for performing the above steps. It recites limitations already addressed by claim 1 above, and is therefore rejected under the same art and rationale. Furthermore Castelli et al. (US 2018/0049407 A1, herein Castelli) discloses the steps are performed on a system, the computer system/server operational with numerous other general purpose or special purpose computing system environments or configurations (pg. 2, [0019]; and pg. 10, [0105]).

As per claim 15, it refers to a computer program product for performing the above steps. It recites limitations already addressed by claim 1 above, and is therefore rejected under the same art and rationale. Furthermore Castelli et al. (US 2018/0049407 A1, herein Castelli) discloses the steps are performed using a non-transitory computer readable storage medium and/or a computer program product (pg. 1, [0005]; and pg. 2, [0019]).

As per claim 2, Castelli in view of Houston discloses all the elements of claim 1, and Castelli further teaches wherein the grazing pattern is determined by identifying an activity of each animal in the livestock group when the livestock group has remained in a location for a preconfigured period of time (pg. 4, [0038] which describes how the camera may be configured to detect one or more animals and track locations of one or more animals to avoid overgrazing of herds by periodically capturing images of animal movement and mapping such geolocations; and pg. 5, [0051] which describes how one or more animals may be equipped with a tracking device that may communicate with each other and/or the unmanned aerial vehicle to communicate location information and/or movement information for each animal, monitoring information associated with a respective animal, such as location of the animal (e.g., movement of the animal) and/or health characteristics of the animal (e.g., temperature, blood information, pulse rate, oxygen content etc.); and pg. 8, [0073] which describes how the database may store information related to past grazing areas and/or types of animals that have grazed in particular foraging areas, where the mapping device may appropriately select a geolocation foraging zone such that overgrazing in particular zones are avoided).

As per claim 9, it refers to the system of claim 8 used for performing the above steps. It recites limitations already addressed by claim 2 above, and is therefore rejected under the same art and rationale.

As per claim 16, it refers to the computer program product of claim 15 used for performing the above steps. It recites limitations already addressed by claim 2 above, and is therefore rejected under the same art and rationale.

As per claim 3, Castelli in view of Houston discloses all the elements of claim 2, and Castelli further teaches wherein the activity of each animal is identified using a plurality of sensors affixed to each animal or nearby the livestock group (pg. 5, [0051] which describes how one or more animals may be equipped with a tracking device that may communicate with each other and/or the unmanned aerial vehicle to communicate location information and/or movement information for each animal, monitoring information associated with a respective animal, such as location of the animal (e.g., movement of the animal) and/or health characteristics of the animal (e.g., temperature, blood information, pulse rate, oxygen content etc.)).

As per claim 10, it refers to the system of claim 9 used for performing the above steps. It recites limitations already addressed by claim 3 above, and is therefore rejected under the same art and rationale.

As per claim 17, it refers to the computer program product of claim 16 used for performing the above steps. It recites limitations already addressed by claim 3 above, and is therefore rejected under the same art and rationale.

As per claim 6, Castelli in view of Houston discloses all the elements of claim 1, and Castelli further teaches wherein the prediction further comprises identifying an elevated risk of a natural disaster is present in a section of the preconfigured area and a location towards which the livestock group is to be directed so as to prevent or mitigate the elevated risk (pg. 1, [0002-0004] which describes how overgrazing in a particular location for an extended amount of time without sufficient recovery periods may be detrimental to the land, wildlife, and livestock, where grazing systems have been employed to minimize the effects of overgrazing, where appropriate levels of grazing may be effective in at least reducing fire hazards due to forage buildup, where the unmanned aerial vehicle includes a risk analysis device to evaluate a level of risk associated with each of the plurality of geolocations and a mapping device coupled to the monitoring device to select at least one geolocation foraging zone when the level of risk associated with the at least one geolocation foraging zone is below a predetermined threshold value; and pg. 2, [0017] which describes enabling the unmanned aerial vehicle to determine a level of risk associated with foraging zones and provide deterrence action such that the animals avoid particularly dangerous foraging zones; and pg. 3-4, [0034-0035] which describes how the unmanned aerial vehicle may include at least one camera configured to provide visual feedback and may capture/provide images of geolocations corresponding to vegetation/foliage to identify geolocation foraging zones, where the unmanned aerial vehicle may track and/or identify vegetation characteristics (e.g., type of vegetation, quality of vegetation, quantity of vegetation, etc.) and may provide corresponding geolocation information to generate a virtual map indicating foraging zones (e.g., areas in which animals may forage and/or where grazing may be beneficial)); and pg. 8, [0075-0076] which describes how the risk analysis device may be configured to evaluate a level of risk associated with a geolocation foraging zone, where other factors such as environmental factors may be employed to determine a level of risk associated with the geolocation foraging zone, including at least weather predictions, weather analysis information, and historical data such as viability of vegetation).

As per claim 13, it refers to the system of claim 8 used for performing the above steps. It recites limitations already addressed by claim 6 above, and is therefore rejected under the same art and rationale.

As per claim 20, it refers to the computer program product of claim 15 used for performing the above steps. It recites limitations already addressed by claim 6 above, and is therefore rejected under the same art and rationale.

As per claim 7, Castelli in view of Houston discloses all the elements of claim 1, and Castelli further teaches wherein the growing pattern comprises flora growing rates, soil requirements, flora water requirements, and recent weather conditions for each section (pg. 2, [0017] which describes unmanned aerial vehicles for determining geolocation foraging zones for animals, including monitoring vegetation characteristics of particular geolocations; and pg. 3-4, [0034-0035] which describes how the unmanned aerial vehicle may include at least one camera configured to provide visual feedback and may capture/provide images of geolocations corresponding to vegetation/foliage to identify geolocation foraging zones, where the unmanned aerial vehicle may track and/or identify vegetation characteristics (e.g., type of vegetation, quality of vegetation, quantity of vegetation, etc.) and may provide corresponding geolocation information to generate a virtual map indicating foraging zones (e.g., areas in which animals may forage and/or where grazing may be beneficial), where the camera may also acquire information associated with a particular geolocation, such as pasture biophysical properties (e.g., biomass, leaf area index, nitrogen content, chlorophyll content, grass density, canopy height, etc.); and pg. 4-5, [0043-0045] which describes the sensor for detecting information associated with a particular geolocation, including biophysical characteristics of a geolocation, vegetation characteristics of a geolocation, environmental variables in a particular geolocation).

As per claim 14, it refers to the system of claim 8 used for performing the above steps. It recites limitations already addressed by claim 6 above, and is therefore rejected under the same art and rationale.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Biffert et al. (US 2022/0200519 A1) teaches of a livestock management system.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ASHLEY Y YOUNG whose telephone number is (571) 270-5294. The examiner can normally be reached Mondays, Tuesdays, and Thursdays, 9:00a-3:00p, EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Beth Boswell, can be reached at (571) 272-6737. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ASHLEY Y YOUNG/
Examiner, Art Unit 3625

/BETH V BOSWELL/
Supervisory Patent Examiner, Art Unit 3625

Prosecution Timeline

Sep 08, 2021: Application Filed
Oct 10, 2024: Non-Final Rejection — §103
Jan 09, 2025: Interview Requested
Jan 16, 2025: Examiner Interview Summary
Jan 17, 2025: Response Filed
Feb 14, 2025: Final Rejection — §103
Apr 02, 2025: Interview Requested
Apr 09, 2025: Examiner Interview Summary
Apr 09, 2025: Request for Continued Examination
Apr 09, 2025: Applicant Interview (Telephonic)
Apr 10, 2025: Response after Non-Final Action
May 07, 2025: Non-Final Rejection — §103
Jul 29, 2025: Interview Requested
Aug 05, 2025: Examiner Interview Summary
Aug 05, 2025: Applicant Interview (Telephonic)
Aug 06, 2025: Response Filed
Aug 29, 2025: Final Rejection — §103
Oct 16, 2025: Interview Requested
Oct 23, 2025: Examiner Interview Summary
Oct 23, 2025: Applicant Interview (Telephonic)
Oct 24, 2025: Request for Continued Examination
Nov 03, 2025: Response after Non-Final Action
Feb 19, 2026: Non-Final Rejection — §103 (current)
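As a sanity check on the timeline, the elapsed prosecution time from filing to the current Office action can be computed from the dates listed above (the year/month conversion is a rough 365/30-day approximation, my own choice):

```python
# Elapsed prosecution time, from the filing and current-OA dates above.
from datetime import date

filed = date(2021, 9, 8)
current_oa = date(2026, 2, 19)

elapsed_days = (current_oa - filed).days
years, rem = divmod(elapsed_days, 365)  # rough calendar conversion
months = rem // 30
print(f"Pending for ~{years}y {months}m")  # ~4y 5m
```

At roughly 4y 5m, the application is already past the examiner's reported 4y 2m median time to grant.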

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12488408: NETWORK SYSTEM TO FILTER REQUESTS BY DESTINATION AND DEADLINE (granted Dec 02, 2025; 2y 5m to grant)
Patent 12346853: REMOTE WORKING EXPERIENCE OPTIMIZATION SYSTEMS (granted Jul 01, 2025; 2y 5m to grant)
Patent 12162135: MANAGING SHARED ROBOTS IN A DATA CENTER (granted Dec 10, 2024; 2y 5m to grant)
Patent 12124983: AUTOMATED GUEST ACTIVITY DETECTION (granted Oct 22, 2024; 2y 5m to grant)
Patent 12124984: APPARATUS FOR IDENTIFYING AN EXCESSIVE CARBON EMISSION VALUE AND A METHOD FOR ITS USE (granted Oct 22, 2024; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 30%; 47% with interview (+17.2%)
Median Time to Grant: 4y 2m
PTA Risk: High

Based on 196 resolved cases by this examiner. Grant probability derived from career allow rate.
