Prosecution Insights
Last updated: April 19, 2026
Application No. 18/140,738

INFORMATION FUSION/OPERATION CONTROL DEVICE FOR ANTI-DRONE SYSTEM, ANTI-DRONE SYSTEM, AND OPERATION METHOD THEREOF

Final Rejection (§101, §103, §112)

Filed: Apr 28, 2023
Examiner: BUKSA, CHRISTOPHER ALLEN
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
OA Round: 2 (Final)

Grant Probability: 73% (Favorable)
Projected OA Rounds: 3-4
Projected Time to Grant: 3y 0m
Grant Probability With Interview: 94%

Examiner Intelligence

Career Allow Rate: 73% (99 granted / 136 resolved), above average, +20.8% vs TC avg
Interview Lift: +20.8% (strong), comparing resolved cases with vs. without an interview
Typical Timeline: 3y 0m avg prosecution, 38 applications currently pending
Career History: 174 total applications across all art units
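As a quick arithmetic check, the headline allow rate above is consistent with the underlying counts; a minimal sketch using only the numbers shown on this page:

```python
# Career allow rate from the dashboard's own counts (99 granted of 136 resolved).
granted, resolved = 99, 136
allow_rate = 100 * granted / resolved
print(round(allow_rate, 1))  # 72.8, displayed above as 73%
```

The +20.8% TC delta and interview-lift figures are reported by the dashboard; the underlying with/without-interview counts are not shown here, so they cannot be re-derived.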

Statute-Specific Performance

§101: 13.8% (-26.2% vs TC avg)
§103: 48.3% (+8.3% vs TC avg)
§102: 27.0% (-13.0% vs TC avg)
§112: 9.6% (-30.4% vs TC avg)

Tech Center averages are estimates based on career data from 136 resolved cases.
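The per-statute deltas imply a common baseline. Assuming each delta is simply the examiner's rate minus the Tech Center average (the sign convention is my assumption), the baseline backs out to the same value for every statute:

```python
# Examiner rates and "vs TC avg" deltas, as listed above (percent).
rates  = {"101": 13.8, "103": 48.3, "102": 27.0, "112": 9.6}
deltas = {"101": -26.2, "103": 8.3, "102": -13.0, "112": -30.4}

# Implied Tech Center average per statute: rate - delta.
tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(tc_avg)  # every statute backs out to 40.0
```

Every statute implying exactly 40.0% suggests the dashboard may be comparing against a single TC-wide figure rather than true per-statute averages.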

Office Action

Rejections under §101, §103, and §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Joint Inventors

This application currently names joint inventors. In considering patentability of the claims, the Examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention, in order for the Examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 07/08/2025 was filed after the mailing of a first Office Action on the merits but before the close of prosecution. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Priority

Acknowledgment is made of applicant's claim for foreign priority under 35 U.S.C. 119(a)-(d). Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55. The examiner has checked and verified that the foreign priority document supports the subject matter of the instant specification. As such, the foreign priority date of 05/02/2022 is granted.
Response to Amendment

The amendments filed on 07/18/2025 have been entered. Claims 1, 3-4, 7-9, 11-12, 15, and 17-18 remain pending in the application.

Claim Objections

Claims 3, 7-9, 11, 15, and 17 are objected to because of the following informalities: the claims recite the phrase "ID identification". However, "ID" already stands for "identification", so the phrase is redundant. The examiner suggests amending the claims such that the term "ID" is removed from each instance of "ID identification". Appropriate correction is required.

Specification

The disclosure is objected to because of the following informalities: the disclosure recites the phrase "ID identification" in numerous places. However, "ID" already stands for "identification", so the phrase is redundant. The examiner suggests amending the disclosure such that the term "ID" is removed from each instance of "ID identification". Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

101 Analysis - Step 1

Claim 1 is directed to an apparatus. Therefore, claim 1 is within at least one of the four statutory categories.

101 Analysis - Step 2A, Prong I

Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: (a) mathematical concepts, (b) certain methods of organizing human activity, and/or (c) mental processes.
Independent claim 1 includes limitations that recite an abstract idea (emphasized below) and will be used as the representative claim for the remainder of the 101 rejection. Claim 1 recites:

An information fusion and operation control device for an anti-drone system, comprising: a memory; and a processor configured to execute at least one instruction stored in the memory, wherein the processor is further configured to: receive detection location information on a drone obtained based on a detection beam pattern; receive identification location information on the drone obtained based on an identification beam pattern; generate neutralization area range information on the drone by fusing the detection location information and the identification location information; and sending the neutralization area range information to a neutralization device, and wherein when the drone is a target to be neutralized, calculate an error range of the detection location information; calculate an error range of the identification location information; and generate the neutralization area range information having a resolution with which the drone is to be neutralized by fusing the detection beam pattern, the error range of the detection location information, the identification beam pattern, and the error range of the identification location information, based on the resolution with which the drone is to be neutralized, wherein when the drone is a target to be not neutralized, calculate the error range of the detection location information; calculate the error range of the identification location information; and generate the neutralization area range information based on the resolution with which the drone is to be neutralized, the detection beam pattern, the error range of the detection location information, the identification beam pattern, and the error range of the identification location information, wherein the neutralization area range information is generated so that the identification beam pattern and the error range of the identification location information are not included in the neutralization area range information.

The examiner submits that the foregoing bolded limitations constitute "mathematical concepts" because, under its broadest reasonable interpretation, the claim covers a calculation of a range of numbers (see MPEP 2106.04(a)(1)).

101 Analysis - Step 2A, Prong II

Regarding Prong II of the Step 2A analysis in the 2019 PEG, the claims are analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a "practical application".
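For readers outside patent practice, the eligibility framework the examiner walks through (Step 1, Step 2A Prongs I and II, and Step 2B of the 2019 PEG) has the shape of a short decision procedure. A minimal sketch, with hypothetical boolean predicates of my own naming for each inquiry (this illustrates the framework's structure only, not legal advice):

```python
# Hedged sketch of the 2019 PEG eligibility flow; predicate names are hypothetical.
def eligible(statutory_category, recites_abstract_idea,
             integrates_practical_application, adds_significantly_more):
    if not statutory_category:                 # Step 1: process/machine/manufacture/composition?
        return False
    if not recites_abstract_idea:              # Step 2A, Prong I
        return True
    if integrates_practical_application:       # Step 2A, Prong II
        return True
    return adds_significantly_more             # Step 2B

# The examiner's findings for representative claim 1: statutory category yes,
# recites an abstract idea yes, practical application no, significantly more no.
print(eligible(True, True, False, False))  # False, i.e. rejected under 101
```

Under these findings the flow returns ineligible, matching the rejection of claim 1.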
In the present case, the additional limitations beyond the above-noted abstract idea (the portions underlined rather than bolded in the claim as reproduced above) are the recited 'memory', 'processor', and 'drone', together with the 'receive ...', 'generate ...', and 'sending ...' steps. For the following reasons, the examiner submits that these additional limitations do not integrate the above-noted abstract idea into a practical application.

Regarding the additional limitations of 'memory' and 'processor', the examiner submits that these limitations are an attempt to generally link additional elements to a technological environment. In particular, the 'memory' and 'processor' are recited at a high level of generality and merely automate the 'receive', 'generate', and 'sending' steps, therefore acting as generic computer components performing the abstract idea. The 'memory' and 'processor' are claimed generically, operate in their ordinary capacity, and do not use the judicial exception in a manner that imposes a meaningful limit on it, such that the claim is more than a drafting effort designed to monopolize the exception. The additional limitations are no more than mere instructions to apply the exception using a computer ('memory' and 'processor').

Regarding the additional limitation of 'drone', the examiner submits that this limitation does not integrate the claim into a practical application because it generally links the judicial exception to a particular technological environment or field of use (drones).

Regarding the additional limitations of 'receive ...', 'generate ...', and 'sending ...', the examiner submits that these limitations are insignificant extra-solution activity (specifically, pre-solution activity for the receiving steps and post-solution activity for the generating and sending steps) and are recited in such a manner that they amount to mere data gathering.
Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitations as an ordered combination or as a whole, the limitations add nothing that is not already present when the elements are taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field; apply or use the judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition; implement or use the judicial exception with a particular machine or manufacture that is integral to the claim; effect a transformation or reduction of a particular article to a different state or thing; or apply or use the judicial exception in some other meaningful way beyond generally linking its use to a particular technological environment. As such, the claim as a whole is no more than a drafting effort designed to monopolize the exception (MPEP 2106.05). Accordingly, the additional limitations do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.

101 Analysis - Step 2B

Regarding Step 2B of the Revised Guidance, representative claim 1 includes additional elements (considered both individually and as an ordered combination) that are not sufficient to amount to significantly more than the judicial exception, for reasons similar to those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application.

As discussed above, the additional elements of 'receive ...', 'generate ...', and 'sending ...' have been identified as insignificant extra-solution activity (see Step 2A, Prong II) and do not amount to significantly more. Specifically, the 'receive ...' and 'generate ...' steps are considered mere data gathering through pre- and post-solution activity (see MPEP 2106.05(g); specifically, obtaining and presenting information). The 'sending ...' step falls under the well-understood, routine, and conventional activity of receiving or transmitting data over a network (see MPEP 2106.05(d)(II)(i)). Additionally, the element of 'drone' does not amount to significantly more because it simply and generically links the abstract idea to a technological environment or field of use (drones). Finally, the elements of 'memory' and 'processor' amount to nothing more than mere instructions to apply the exception using a generic computer component, and mere instructions to apply an exception using generic computer components cannot provide an inventive concept. Hence, the claim is not patent eligible. Therefore, claim 1 is ineligible under 35 U.S.C. 101.

Claims 3-4 and 7-8 ultimately depend from claim 1; however, they do not recite any further limitations that bring the claims into eligibility over the abstract idea of claim 1. Claim 3 recites additional 'receive ...' and 'generate ...' steps which, as seen above in claim 1, merely amount to insignificant pre- and post-solution activity. Regarding claim 3's additional 'perform ...' limitation, this step could be performed in the human mind, as it simply recites re-evaluating whether a drone is present by comparing information. Claim 4 recites 'receive ...', 'generate ...', and 'perform ...' steps similar to those in claim 3 and, as such, adds no limitations that would bring the claims into eligibility. Claim 7 recites additional 'receive ...' and 'transmit ...' limitations; however, the 'receive ...' step is insignificant pre-solution activity, and the 'transmit ...' step is the well-understood, routine, and conventional activity of sending or receiving data over a network. Claim 8 recites additional 'perform ...' steps which, as seen above in claim 3, amount to a further mental process that can be performed in the human mind (performing a re-identification process).

Regarding claims 9 and 11-12, the claim limitations are similar to those in claims 1 and 3-4 and are rejected using the same rationale as seen above in claims 1 and 3-4. Regarding claims 15 and 17-18, the claim limitations are likewise similar to those in claims 1 and 3-4 and are rejected using the same rationale.

The examiner suggests amending all independent claims to recite a 'control' step, such as "wherein the neutralization device neutralizes the drone that is to be neutralized", or something to that effect. The examiner notes that there is support for this amendment in at least Fig. 2, item S270.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C.
112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1, 3-4, 7-9, 11-12, 15, and 17-18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Regarding claim 1, the claim recites the limitations "... by fusing the detection beam pattern, the error range of the detection location information, the identification beam pattern, and the error range of the identification location information ...", "... generate the neutralization area range information based on the resolution with which the drone is to be neutralized, the detection beam pattern, the error range of the detection location information, the identification beam pattern, and the error range of the identification location information ...", and "... wherein the neutralization area range information is generated so that the identification beam pattern and the error range of the identification location information are not included ...". However, it is unclear how the actual beam patterns (detection and identification) are being used with the error ranges. For instance, the instant specification consistently discloses use of the detection location information with the error range of the detection location information, and of the identification location information with the error range of the identification location information. Nowhere does the instant specification disclose using the beam patterns themselves with the aforementioned information (the location information and associated error ranges). As such, the claim is indefinite.

The examiner believes these limitations were meant to recite the detection location information instead of the detection beam pattern, and the identification location information instead of the identification beam pattern. For examination purposes, the term 'detection beam pattern' is interpreted as 'detection location information' and the term 'identification beam pattern' is interpreted as 'identification location information'.

Regarding claims 3-4 and 7-8, the claims ultimately depend from claim 1 and contain the same indefinite subject matter; as such, claims 3-4 and 7-8 are also rejected under 35 U.S.C. 112(b) as indefinite. Regarding claim 9, the claim limitations are similar to those in claim 1 and are rejected using the same rationale. Regarding claims 11-12, the claims ultimately depend from claim 9 and contain the same indefinite subject matter; as such, claims 11-12 are also rejected under 35 U.S.C. 112(b). Regarding claim 15, the claim limitations are similar to those in claim 1 and are rejected using the same rationale. Regarding claims 17-18, the claims ultimately depend from claim 15 and contain the same indefinite subject matter; as such, claims 17-18 are also rejected under 35 U.S.C. 112(b).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3-4, 7-9, 11-12, 15, and 17-18 are rejected under 35 U.S.C. 103 as being obvious over Ochiai et al., US 20220163670 A1 (herein "Ochiai"), in view of Van Voorst, US 20170261999 A1 (herein "Voorst").

Regarding claim 1, Ochiai discloses a memory (Fig. 2, item 230; Paragraph 0035: the control device includes a storage device, which can be considered memory) and a processor (Fig. 2, item 240; Paragraph 0035: the control device includes a processor).

Ochiai discloses receiving detection location information on a drone (Paragraphs 0034, 0057, and 0077: the system may use numerous sensors, including radar, LiDAR, an aiming sensor, etc., to obtain the location of a moving body, and the moving body may be a drone) and receiving identification location information on the drone (Paragraphs 0048 and 0057: the system may identify whether a moving body is a no-coping target or a coping target based on the location detection information, and the moving body may be a drone).

Ochiai discloses generating neutralization area range information on the drone by fusing the detection location information and the identification location information (Fig. 13; Paragraphs 0033-0034, 0048, and 0084: the range of effect of a weapon for coping may be generated based on the coping signal from the control device; the coping signal is based on data obtained from the numerous sensors (see 0057) and on the coping list that identifies coping targets, which can be considered a fusion of the location and identification data), and sending the neutralization area range information to a neutralization device (at least Paragraph 0033: coping target information, which includes data related to the range of effect of the coping device, is transmitted to the threat coping device).

For the case where the drone is a target to be neutralized, Ochiai discloses calculating an error range of the detection location information (Fig. 8; Paragraphs 0055-0056: a no-coping area may be determined for a no-coping target; this can be considered an error range of the detection location information because it is based on the location of the no-coping target and is an area for which neutralization information is not generated (no range of effect generated for coping)) and calculating an error range of the identification location information (Fig. 8; Paragraphs 0055-0056: the no-coping area can likewise be considered an error range of the identification location information because it is based on the identification of the no-coping target and is an area for which neutralization area range information may not be generated (no weapon range of effect generated for coping; if overlap is present, a suspension flag is generated to cease the coping process; see the rationale in claim 4)).

Ochiai further discloses generating the neutralization area range information having a resolution with which the drone is to be neutralized by fusing the detection location information, the error range of the detection location information, the identification location information, and the error range of the identification location information, based on that resolution (Fig. 12; Paragraphs 0033-0034, 0055-0056, and 0084: various sensors may be used with the coping list when generating the coping signal and the associated range of effect of a weapon; each sensor may have an associated accuracy, which can be considered a resolution; the range of effect of a weapon for coping may have an associated apex angle based on the type of weapon being used, and this range of effect and apex angle may be considered the resolution with which the drone is to be neutralized; the coping signal is based on data obtained from the numerous sensors (see 0057) and on the coping list, which identifies coping targets as well as areas where coping is not possible (the error ranges, per the rationale above); this can be considered the recited fusing of the detection location information, its error range, the identification location information, and its error range).

For the case where the drone is a target not to be neutralized, Ochiai discloses calculating the error range of the detection location information and the error range of the identification location information (Fig. 8; Paragraphs 0055-0056: the same no-coping-area rationale as in the to-be-neutralized case above), and generating the neutralization area range information based on the resolution with which the drone is to be neutralized, the detection location information, the error range of the detection location information, the identification location information, and the error range of the identification location information, wherein the neutralization area range information is generated so that the identification location information and its error range are not included in the neutralization area range information (Fig. 12; Paragraphs 0033-0034, 0055-0056, and 0084: in addition to the rationale above, the no-coping area may be used to indicate an area where ranges of effect of weapons for coping may not occur, and a range of effect for coping (neutralization information) may reside outside the no-coping area (see Fig. 12, range of effect of weapon for coping 14-1 and no-coping area 12); for a no-coping target, neutralization area range information may be generated based on the detection location information and its error range, as it includes areas where the weapon is not able to perform coping; and because the target is a no-coping target, the identification location information and its error range are not needed for the neutralization area range, as they are used only for targets that are to be coped).

Ochiai, however, fails to disclose receiving detection location information on a drone obtained based on a detection beam pattern, and receiving identification location information on the drone obtained based on an identification beam pattern.
However, Voorst, in an analogous field of endeavor, teaches receiving detection location information on a drone obtained based on a detection beam pattern, and receiving identification location information on the drone obtained based on an identification beam pattern (both at least at Paragraphs 0045-0047: a scanning lidar may have a specific scanning pattern (fan shape) to detect and identify target drones in a given area).

Therefore, from the teaching of Voorst, it would have been obvious to one of ordinary skill in the art before the effective filing date to have modified, with a reasonable expectation of success, the anti-drone system of Ochiai to include receiving detection location information on a drone obtained based on a detection beam pattern, and receiving identification location information on the drone obtained based on an identification beam pattern, as taught or suggested by Voorst. The motivation to do so would be to use a specific sensor pattern for determining drone information (location and identification), which can lead to higher accuracy and better consistency when locating and identifying drones.

Regarding claim 3, Ochiai in view of Voorst renders obvious all the limitations of claim 1. Ochiai further discloses receiving the detection location information and a result of identifying the drone (Paragraphs 0033-0034, 0048, and 0057: sensors may be used to determine location information of a moving body; a moving body may be identified as a no-coping or coping target, with coping targets registered on a list; and the moving body may be a drone); performing a re-identification process for the drone on the basis of the detection location information when the detection location information on the drone is present but no identified drone is found (Fig. 8; Paragraphs 0057-0058: if detection data is received on a moving body, the system may determine whether coping or no-coping targets exist based on that data; if no coping targets exist, the system may re-attempt the previous steps to identify coping and no-coping targets, which can be considered a re-identification process); and generating control information for an ID identification device so that available resources for identification of the drone are concentrated on the detection location information in the re-identification process (Fig. 8; Paragraphs 0047 and 0057-0058: an identification module is used to identify a moving body as coping or no-coping; when the system re-attempts the identification process, this can be considered generated control information concentrating available resources toward re-identification, as the system uses its available sensors and devices to perform the re-identification attempt).

Regarding claim 4, Ochiai in view of Voorst renders obvious all the limitations of claim 1. Ochiai further discloses receiving the identification location information and a result of detecting the drone (Paragraphs 0033-0034, 0048, and 0057: sensors may be used to determine location information of a moving body, which may be identified as a no-coping or coping target and may be a drone); performing a re-detection process for the drone on the basis of the identification location information when the identification location information on the drone is present but no detected drone is found (Figs. 10-11; Paragraphs 0069, 0072, and 0077-0078: the system may determine whether a coping target is present based on the current location of a moving body; if the coping target has a suspension flag, the system may repeat the coping process until the flag is lifted; and a coping target flagged due to positioning within a no-coping area can be considered not found, as it is not available for coping); and generating control information for a detection device so that available resources for detection of the drone are concentrated on the identification location information in the re-detection process (Paragraphs 0077-0078: the aiming sensor may continue to track a coping target until the suspension flag is lifted; this can be considered control information for performing a re-detection process, with available resources concentrated on the identification location information, as the system continually monitors whether the coping target is available for coping (identification being based on whether the moving body is a coping or no-coping target)).

Regarding claim 7, Ochiai in view of Voorst renders obvious all the limitations of claim 1. Ochiai further discloses receiving a result of identifying the drone and the identification location information from an ID identification device (Paragraph 0047: identification of targets, and of whether they are coping or no-coping targets, is performed); receiving a result of detecting the drone and the detection location information from a detection device (Paragraphs 0077-0078: the aiming sensor may be used to obtain the current location of a coping target); and transmitting the neutralization area range information to a neutralization device (Fig. 1; Paragraph 0033: detection device 100 may be in communication with control device 200, which can direct coping device 300 to cope with a coping target).

Regarding claim 8, Ochiai in view of Voorst renders obvious all the limitations of claim 7.
Additionally, a portion of the claim limitations are similar to those in claims 3 and 4 and are rejected using the same rationale as seen above in claims 3 and 4. Ochiai further discloses transmitting the control information for the ID identification device to the ID identification device (Paragraphs 0047, 0057-0058; identification module is used to identify a moving body as coping or no-coping, if system re-attempts the identification process, this can be considered transmitting control information being generated to concentrate available resources towards re-identification as the system uses its available sensors and devices to perform the re-identification attempt), and transmitting the control information for the detection device to the detection device (Paragraphs 0077-0078; aiming sensor may continue to track a coping target until the suspension flag is lifted, this can be considered transmitting control information for performing a re-detection process as the system is continually monitoring if the coping target is available for coping). Regarding claims 9 and 11-12, the claim limitations are similar to those in claims 1 and 3-4, and are rejected using the same rationale as seen above in claims 1 and 3-4. Regarding claims 15 and 17-18, the claim limitations are similar to those in claims 1 and 3-4, and are rejected using the same rationale as seen above in claims 1 and 3-4. Response to Arguments Applicant's arguments filed 07/18/2025 regarding 35 U.S.C. 101 have been fully considered but they are not persuasive. Applicant is arguing that the use of a detection and identification beam patterns are not abstract ideas. However, the limitations associated with the beam patterns are being considered as insignificant pre-solution activity under Step 2A Prong II and not significantly more under Step 2B for being insignificant pre-solution activity as identified under Step 2A Prong II. 
Examiner points out that these limitations merely amount to receiving data from a sensor. Furthermore, the ‘sending …’ limitations are considered as well-understood, routine, and conventional activity of sending and receiving data over a network as the neutralization range information is simply being transmitted to a neutralization device which amounts to simply sending data over a network. Regarding the Applicant’s arguments with respect to claim(s) 1, 9, and 15, they have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Conclusion Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER ALLEN BUKSA whose telephone number is (571)272-5346. The examiner can normally be reached M-F 7:30 AM-4:30 PM. 
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas Worden can be reached at (571) 272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /C.A.B./Examiner, Art Unit 3658 /JASON HOLLOWAY/Primary Examiner, Art Unit 3658
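The reply-period rules in the Conclusion above are essentially calendar arithmetic. A minimal sketch of the two outer bounds, using the October 20, 2025 mailing date from the prosecution timeline; this deliberately ignores the advisory-action carve-out and any weekend/holiday rollover under 37 CFR 1.7, so it is an illustration, not docketing advice:

```python
from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping the day to the target month's length."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

MAILING_DATE = date(2025, 10, 20)  # mailing date of this final action (from timeline)

shortened_period = add_months(MAILING_DATE, 3)   # THREE-MONTH shortened statutory period
statutory_maximum = add_months(MAILING_DATE, 6)  # absolute SIX-MONTH statutory cap

print(shortened_period)   # 2026-01-20
print(statutory_maximum)  # 2026-04-20
```

Extensions under 37 CFR 1.136(a) can move the effective deadline between these two dates, but never past the six-month cap.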

Prosecution Timeline

Apr 28, 2023
Application Filed
Apr 30, 2025
Non-Final Rejection — §101, §103, §112
Jul 18, 2025
Response Filed
Oct 20, 2025
Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578725
SELF-MAINTAINING, SOLAR POWERED, AUTONOMOUS ROBOTICS SYSTEM AND ASSOCIATED METHODS
2y 5m to grant · Granted Mar 17, 2026
Patent 12576524
CONTROL DEVICE, CONTROL METHOD, AND RECORDING MEDIUM
2y 5m to grant · Granted Mar 17, 2026
Patent 12570428
SYSTEM AND METHOD FOR MOVING AND UNBUNDLING A CARTON STACK
2y 5m to grant · Granted Mar 10, 2026
Patent 12554024
MAP-AIDED SATELLITE SELECTION
2y 5m to grant · Granted Feb 17, 2026
Patent 12534223
UNMANNED ROBOT FOR URBAN AIR MOBILITY VEHICLE AND URBAN AIR MOBILITY VEHICLE
2y 5m to grant · Granted Jan 27, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
73%
Grant Probability
94%
With Interview (+20.8%)
3y 0m
Median Time to Grant
Moderate
PTA Risk
Based on 136 resolved cases by this examiner. Grant probability derived from career allow rate.
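The headline figures in this panel reduce to simple arithmetic over the examiner's resolved-case counts shown above. A rough sketch; the 20.8-point interview lift is taken from the page as given (how it was measured is not shown here), and adding it directly to the career allow rate is an assumption about how the "with interview" figure is derived:

```python
# Reproduce the projection panel's headline percentages from the raw counts.
granted, resolved = 99, 136                  # career grants / resolved cases

career_allow_rate = granted / resolved       # 99 / 136, approximately 0.728
interview_lift = 0.208                       # reported lift for interviewed cases
with_interview = career_allow_rate + interview_lift

print(f"Grant probability: {career_allow_rate:.0%}")  # 73%
print(f"With interview:    {with_interview:.0%}")     # 94%
```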
