DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Allowable Subject Matter
Claim 16 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter:
With regards to claim 16, several of the features of this claim were known in the art as evidenced by Bar-Nahum et al (US PG Pub. No. 2020/0162489), which discloses determining whether a suspicious machine, which is an unmanned aerial vehicle not permitted to fly in the monitored airspace, has intruded into the monitored airspace, based on a sensor signal output from a detection sensor (e.g., “secondary sensor”) other than the image capturing device (e.g., “primary sensor”) among the plurality of types of detection sensors constituting the detection device and a result of detecting an unmanned aerial vehicle in the captured image at: ¶ [0018]; ¶ [0029]; ¶ [0037]; ¶¶ [0040]-[0042]; ¶¶ [0057]-[0059]; ¶¶ [0063]-[0064](“For example, an image sensor may detect the security event within its field of view …”); ¶ [0067]; ¶¶ [0084]-[0085]; ¶¶ [0095]-[0097]; ¶¶ [0113]-[0117]. However, Bar-Nahum does not disclose acquiring information posted on a social networking service (SNS) and determining that the suspicious machine has intruded into the monitored airspace based on the information posted on the SNS.
Response to Arguments
Applicant's arguments filed 05 March 2026 have been fully considered but they are not persuasive. In the prior claim set, Applicant recited a combination of a passive radar and a camera, and this combination was rejected over Bar-Nahum in view of Yoshitaka. Applicant’s arguments are drawn to the newly added limitation that recites detecting a suspicious UAV “by the object recognition processing from the captured image, even when the radio wave is not detected by the passive radar due to a state of autonomous flight of the unmanned aerial vehicle or obstruction by a building or other structure.” The only support for this amendment is found in the specification-as-filed at ¶ [0041], which describes this “limitation” as a reason for combining passive radar and a camera. By applicant’s own admission, the limitation describes an inherent advantage derived from combining passive radar and a camera, but the combination of passive radar and a camera has already been rejected. The Bar-Nahum reference, already of record, was previously cited for teaching the combined use of passive radar and a camera. Applicant’s amendments do nothing more than recite the inherent benefits of a combination of sensors already known in the art.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-4 and 8-15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bar-Nahum et al (US PG Pub. No. 2020/0162489).
With regards to claim 1, the limitations of this claim are anticipated by the prior art, as evidenced by the following reference:
The Bar-Nahum reference
Bar-Nahum discloses acquiring sensor signals output from a detection device constituted by a combination of a plurality of types of detection sensors including at least an image capturing device (“camera”) and a passive radar (e.g., “RF sensor”) that detects an unmanned aerial vehicle in a monitored airspace by detecting a radio wave used by the unmanned aerial vehicle for communication at: ¶ [0017](“Examples of sensors included in the first layer of sensors include, but are not limited to, …, RF sensors, cameras,”); ¶ [0018]; ¶ [0037](“[T]he selected sensors may be configured to detect a UAV in the event the at least one of the first layer of sensors 101 detected a UAV flying near a protected airspace… For example, a camera may detect the security event within its field of view.”); ¶¶ [0057]-[0058]; ¶¶ [0084]-[0085].
Bar-Nahum discloses detecting an unmanned aerial vehicle through object recognition processing from a captured image as a sensor signal output from the image capturing device at: ¶ [0037](“[T]he selected sensors may be configured to detect a UAV in the event the at least one of the first layer of sensors 101 detected a UAV flying near a protected airspace… For example, a camera may detect the security event within its field of view.”); ¶¶ [0087]-[0091](“At 604, an indication of a detection is provided. The indication may be provided to a security event detection system. The indication may include a zoomed-in image of the detected object. The indication may include a classification of the detected object… At 608, a classifier trained to detect objects is updated…”).
Bar-Nahum discloses determining whether a suspicious machine, which is an unmanned aerial vehicle not permitted to fly in the monitored airspace, has intruded into the monitored airspace, based on a sensor signal output from the passive radar (i.e., RF sensors are both a “primary sensor” and a “secondary sensor”) and a result of detecting an unmanned aerial vehicle in the captured image at: ¶ [0018](“ In view of this risk posed by malicious UAVs, it may be necessary to have a system to detect, monitor, track, classify, and/or assess a UAV that has entered a restricted area, such as a protected airspace.”); ¶ [0029]; ¶ [0037](“For example, a camera may detect the security event within its field of view. An unmanned aerial vehicle may detect the security event using one of its sensors. Upon detecting the security event, a selected sensor may determine additional information associated with the detected security event.”); ¶¶ [0040]-[0042](“For example, the individual may be a person that frequently pilots a UAV near a stadium without permission…. In some embodiments, the detected security event is an object approaching a protected airspace. For example, a UAV may be approaching the protected airspace around a stadium. Risk assessment module 105 may determine the amount of time before the detected security event is going to cross a boundary associated with a protected airspace.”); ¶¶ [0057]-[0059]; ¶¶ [0063]-[0064](“For example, an image sensor may detect the security event within its field of view. At least one of the selected sensors may determine additional information associated with the detected security event… The trajectory of the UAV may also be monitored and predicted. Past flight trajectory may be determined by collecting and plotting past geolocation data …e.g., from radar, …”); ¶ [0067]; ¶¶ [0084]-[0085]; ¶¶ [0095]-[0097]; ¶¶ [0113]-[0117](“A protected airspace may be associated with one or more layers. 
In the example shown, protected airspace 802 is associated with a first layer 802a, a second layer 802b, and a third layer 803a. A response to a detected security event may differ based on whether the detected security event is located in the first layer, second layer, or third layer”) and FIG. 8:
[media_image1.png (greyscale): FIG. 8 of Bar-Nahum]
Determining that the unmanned aerial vehicle is the suspicious machine in a case where the unmanned aerial vehicle is detected by the object recognition processing from the captured image, even when the radio wave is not detected by the passive radar due to a state of autonomous flight of the unmanned aerial vehicle or obstruction by a building or other structure, is found inherent in the Bar-Nahum reference at ¶¶ [0017]-[0018], where it discloses combining passive radar (i.e., an RF sensor) and a camera. By applicant’s own admission, the limitation describes an inherent advantage derived from combining passive radar and a camera, and the Bar-Nahum reference teaches that combination. Moreover, this is a functional limitation. A claim term is functional when it recites a feature "by what it does rather than by what it is" (e.g., as evidenced by its specific structure or specific ingredients). In re Swinehart, 439 F.2d 210, 212, 169 USPQ 226, 229 (CCPA 1971). These functional limitations appear to be inherent in the Bar-Nahum reference because the reference teaches the combined use of passive radar and a camera. Based upon the above structural similarities, the prior art structure is found to inherently possess the functionally defined limitations of the claimed apparatus. When a prior art structure is found to inherently possess the functionally defined limitations of a claimed apparatus, the burden shifts to applicant to establish that the prior art does not possess the characteristic relied on.
In re Schreiber, 128 F.3d at 1478, 44 USPQ2d at 1432; In re Swinehart, 439 F.2d 210, 213, 169 USPQ 226, 228 (CCPA 1971) ("where the Patent Office has reason to believe that a functional limitation asserted to be critical for establishing novelty in the claimed subject matter may, in fact, be an inherent characteristic of the prior art, it possesses the authority to require the applicant to prove that the subject matter shown to be in the prior art does not possess the characteristic relied on"). MPEP § 2112(V); MPEP § 2114(I).
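For illustration only, the determination logic at issue in claim 1, as mapped onto Bar-Nahum's combined RF sensor and camera, may be sketched as follows. The names and structure below (`SensorReadings`, `is_suspicious`) are hypothetical and appear in neither the claims nor the cited reference; this is a minimal sketch of the claimed decision, not a disclosure of either document:

```python
from dataclasses import dataclass

@dataclass
class SensorReadings:
    """One observation cycle from the combined detection device."""
    uav_in_image: bool   # result of object recognition on the captured image
    rf_detected: bool    # passive radar (RF sensor) picked up a control signal
    permitted: bool      # vehicle matches a machine permitted to fly here

def is_suspicious(r: SensorReadings) -> bool:
    # A UAV found by object recognition is treated as suspicious even when
    # no radio wave is detected (e.g., autonomous flight, or RF obstructed
    # by a building or other structure), provided it is not permitted.
    if r.permitted:
        return False
    return r.uav_in_image or r.rf_detected
```

As the sketch makes plain, the image-based branch of the disjunction fires regardless of whether the RF branch does, which is the claimed "even when the radio wave is not detected" behavior.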
With regards to claim 2, Bar-Nahum discloses detecting the unmanned aerial vehicle from the captured image using a detection model that outputs information regarding whether the unmanned aerial vehicle is present or absent in the captured image with the captured image being input to the detection model as the object recognition processing at: ¶ [0037](“[T]he selected sensors may be configured to detect a UAV in the event the at least one of the first layer of sensors 101 detected a UAV flying near a protected airspace… For example, a camera may detect the security event within its field of view.”); ¶¶ [0087]-[0091](“At 604, an indication of a detection is provided. The indication may be provided to a security event detection system. The indication may include a zoomed-in image of the detected object. The indication may include a classification of the detected object… At 608, a classifier trained to detect objects is updated…”).
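The image-in, presence/absence-out interface recited in claim 2 can be illustrated with a hypothetical sketch. The class names below are illustrative assumptions, and a trivial brightness threshold stands in for a real trained classifier such as the one updated at step 608 of Bar-Nahum:

```python
from dataclasses import dataclass

@dataclass
class DetectionResult:
    uav_present: bool    # whether a UAV is present in the captured image
    confidence: float

class DetectionModel:
    """Stand-in for a trained classifier: the captured image is the input,
    presence/absence of a UAV is the output. The brightness threshold is a
    placeholder for real object recognition."""
    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold

    def __call__(self, image) -> DetectionResult:
        # `image` is a 2-D list of pixel intensities in [0, 1].
        mean = sum(sum(row) for row in image) / sum(len(row) for row in image)
        return DetectionResult(uav_present=mean > self.threshold,
                               confidence=abs(mean - self.threshold))
```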
With regards to claim 3, Bar-Nahum discloses calculating a trajectory of the unmanned aerial vehicle in the monitored airspace based on the sensor signals output from the detection device at: ¶ [0037]; ¶¶ [0063]-[0064]; ¶ [0067]; ¶¶ [0113]-[0117](“User interface 800 may track a location associated with a detected security event over time. In the example shown, the trajectory paths of detected security events 805, 807, 809 are shown. User interface 800 may also display a predicted trajectory path associated with a detected security event.”) and FIG. 8.
Bar-Nahum discloses detecting that the suspicious machine has intruded into the monitored airspace using the trajectory of the unmanned aerial vehicle as well at: ¶ [0037]; ¶¶ [0041]-[0042]; ¶ [0067]; ¶¶ [0113]-[0117] and FIG. 8.
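As a purely illustrative sketch of the trajectory-based intrusion determination attributed to Bar-Nahum at ¶¶ [0113]-[0117], an intrusion check over a calculated track might look like the following. The function name and the circular airspace model are assumptions made for illustration, not features of the reference:

```python
from math import hypot

def intruded(track, airspace_center, radius):
    """Return True if any fix (x, y) of the calculated trajectory lies
    inside a monitored airspace modeled as a circle of the given radius."""
    cx, cy = airspace_center
    return any(hypot(x - cx, y - cy) <= radius for x, y in track)
```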
With regards to claim 4, Bar-Nahum discloses outputting information for notifying an intrusion of the suspicious machine into the monitored airspace when it is determined that the suspicious machine has intruded into the monitored airspace at: ¶ [0070](“Automatic actions associated with threat levels may be triggered when certain threat levels are assessed. Examples of actions that can be initiated based on threat level assessments include contacting law enforcement, generating a notification/message to send to specified people or organizations, triggering an alarm…”); ¶¶ [0113]-[0117](“A protected airspace may be associated with one or more layers. In the example shown, protected airspace 802 is associated with a first layer 802a, a second layer 802b, and a third layer 803a. A response to a detected security event may differ based on whether the detected security event is located in the first layer, second layer, or third layer… Each of the detected security events are represented by a circle on the user interface 800. In the example shown, each of the detected security events is associated with a color... Detected security event 809 is represented by a red circle. The color red may indicate that the detected security event is a high security risk.”) and FIG. 8 (excerpted portion below):
[media_image2.png (greyscale): excerpted portion of FIG. 8 of Bar-Nahum]
With regards to claim 8, the steps performed by the method of this claim are anticipated by Bar-Nahum for the same reasons as were provided in the discussion of claim 1, which recites an apparatus configured to perform these same steps.
With regards to claim 9, the steps stored in the computer readable medium of this claim are anticipated by Bar-Nahum for the same reasons as were provided in the discussion of claim 1, which recites an apparatus configured to perform these same steps.
With regards to claim 10, the steps performed by the method of this claim are anticipated by Bar-Nahum for the same reasons as were provided in the discussion of claim 2, which recites an apparatus configured to perform these same steps.
With regards to claim 11, the steps performed by the method of this claim are anticipated by Bar-Nahum for the same reasons as were provided in the discussion of claim 3, which recites an apparatus configured to perform these same steps.
With regards to claim 12, the steps performed by the method of this claim are anticipated by Bar-Nahum for the same reasons as were provided in the discussion of claim 4, which recites an apparatus configured to perform these same steps.
With regards to claim 13, the steps stored in the computer readable medium of this claim are anticipated by Bar-Nahum for the same reasons as were provided in the discussion of claim 2, which recites an apparatus configured to perform these same steps.
With regards to claim 14, the steps stored in the computer readable medium of this claim are anticipated by Bar-Nahum for the same reasons as were provided in the discussion of claim 3, which recites an apparatus configured to perform these same steps.
With regards to claim 15, the steps stored in the computer readable medium of this claim are anticipated by Bar-Nahum for the same reasons as were provided in the discussion of claim 4, which recites an apparatus configured to perform these same steps.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Bar-Nahum et al (US PG Pub. No. 2020/0162489) in view of Yoshitaka (Japanese PG pub. no. JP 2011211114 A).
With respect to claim 17, Bar-Nahum discloses determining whether a suspicious machine, which is an unmanned aerial vehicle not permitted to fly in the monitored airspace, has intruded into the monitored airspace, based on a sensor signal output from a detection sensor (e.g., “secondary sensor”) other than the image capturing device (e.g., “primary sensor”) among the plurality of types of detection sensors constituting the detection device and a result of detecting an unmanned aerial vehicle in the captured image at: ¶ [0018]; ¶ [0029]; ¶ [0037]; ¶¶ [0040]-[0042]; ¶¶ [0057]-[0059]; ¶¶ [0063]-[0064](“For example, an image sensor may detect the security event within its field of view. At least one of the selected sensors may determine additional information associated with the detected security event… The trajectory of the UAV may also be monitored and predicted. Past flight trajectory may be determined by collecting and plotting past geolocation data …e.g., from radar, …”); ¶ [0067]; ¶¶ [0084]-[0085]; ¶¶ [0095]-[0097]; ¶¶ [0113]-[0117]. However, Bar-Nahum does not disclose determining that a suspicious machine has intruded into monitored airspace in a case where a trajectory of the suspicious machine is different from a flight route of a permitted machine permitted to fly in the monitored airspace and in a case where the trajectory of the suspicious machine is a stray. However, this limitation was known in the art:
Yoshitaka determines a suspicious machine (“aircraft”) has intruded into monitored (“controlled”) airspace in a case where a trajectory of the suspicious machine is different from a flight route (“flight plan”) of a permitted machine permitted to fly in the monitored airspace and in a case where the trajectory of the suspicious machine is a stray at: pp. 2-3 of the English translation; pp. 5-6 of the English translation. Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to determine that a suspicious machine (“aircraft”) has intruded into monitored (“controlled”) airspace in a case where a trajectory of the suspicious machine is different from a flight route (“flight plan”), as taught by Yoshitaka, when determining whether a suspicious machine, which is an unmanned aerial vehicle not permitted to fly in the monitored airspace, has intruded into the monitored airspace, as taught by Bar-Nahum. The motivation for doing so comes from the prior art, which teaches that deviation from a flight plan is, in itself, suspicious. Therefore, it would have been obvious to combine Yoshitaka with Bar-Nahum to obtain the invention specified in this claim.
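The Yoshitaka-style determination can be illustrated with a short, hypothetical sketch: a track is flagged when it deviates from every waypoint of the permitted flight route, or when it wanders (is a "stray") with little net progress relative to the path length traveled. The function names, the waypoint-distance route model, and the thresholds below are illustrative assumptions only and appear in neither reference:

```python
from math import hypot

def deviates_from_route(track, route, tolerance):
    """True if any tracked fix is farther than `tolerance` from every
    waypoint of the permitted flight route."""
    def near_route(p):
        return any(hypot(p[0] - wx, p[1] - wy) <= tolerance for wx, wy in route)
    return any(not near_route(p) for p in track)

def is_stray(track, stray_ratio):
    # A "stray" trajectory: substantial path length but little net
    # progress, e.g. loitering or wandering within the airspace.
    if len(track) < 2:
        return False
    path = sum(hypot(track[i + 1][0] - track[i][0],
                     track[i + 1][1] - track[i][1])
               for i in range(len(track) - 1))
    net = hypot(track[-1][0] - track[0][0], track[-1][1] - track[0][1])
    return path > 0 and net < stray_ratio * path

def suspicious_by_trajectory(track, route, tolerance, stray_ratio=0.2):
    """Flag the machine when either condition of the claim holds."""
    return deviates_from_route(track, route, tolerance) or is_stray(track, stray_ratio)
```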
Conclusion
All claims are identical to or patentably indistinct from, or have unity of invention with claims in the application prior to the entry of the submission under 37 CFR 1.114 (that is, restriction (including a lack of unity of invention) would not be proper) and all claims could have been finally rejected on the grounds and art of record in the next Office action if they had been entered in the application prior to entry under 37 CFR 1.114. Accordingly, THIS ACTION IS MADE FINAL even though it is a first action after the filing of a request for continued examination and the submission under 37 CFR 1.114. See MPEP § 706.07(b). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID F DUNPHY whose telephone number is (571)270-1230. The examiner can normally be reached 9 am - 5 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chineyere Wills-Burns can be reached at (571) 272-9752. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DAVID F DUNPHY/Primary Examiner, Art Unit 2673