Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 1/6/2026 has been entered.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1, 8 and 15 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 5-8, 12-13, 15 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Wingo et al. (US 10,417,755) in view of Pilskalns (US 202/0004272), further in view of Masayuki et al. (WO 2022/070533; see attached translation), and further in view of White (US 2020/0279367).
Regarding claims 1, 8 and 15, Wingo teaches a system comprising:
a memory (col. 5, lines 46-62); and
one or more processors (col. 5, line 46 to col. 6, line 7 teaches one or more processors) in communication with the memory configured to:
inspect a cell site, in real-time, with an unmanned aerial vehicle (UAV) (at least Figs. 3 and 8-14, and step 201, which deploys a UAV) including an edge-based artificial intelligence (AI) component mounted thereon (Figs. 2 and 8-14 outline the inspection process implemented by the UAV. However, while Wingo’s system performs the inspection automatically by comparing the images to data from a previous inspection (see col. 7, lines 32-36), Wingo fails to explicitly term it an “artificial intelligence component” per se.);
capture, by one or more cameras of the UAV, a plurality of images pertaining to at least antennae and communication equipment associated with the cell site (at least Figs. 2 and 8-14, wherein the steps capture images and video of the tower and tower components such as antennae (RADs));
identify, by the AI component, in real-time, the plurality of images to dynamically apply AI inspection models thereto (col. 6, line 61 through col. 7, line 36 and Figs. 2 and 8-14 go through the entire process of inspecting the images captured by the cameras, although it is once again noted that Wingo does not explicitly call the system an “artificial intelligence component” per se); and
generate, in real-time, an inspection report based on information and data derived from applying the AI inspection models to the plurality of images captured (Figs. 8 and 14, step 806, wherein results from all of the measurements using the images are generated sequentially for all the components of the tower, the RADs and the RTAs, which meets the claimed “report”).
However, as discussed above, while Wingo’s system performs the inspection automatically by comparing the images to data from a previous inspection (see col. 7, lines 32-36), Wingo fails to explicitly term it an “artificial intelligence component” per se.
In an analogous art, Pilskalns teaches in paragraphs 81-82 and 87 the use of machine learning/AI for the purpose of comparing images captured by a UAV to past images to determine any potential changes.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the current application to incorporate the teachings of Pilskalns into the system of Wingo because said incorporation allows for the benefit of automating tasks (Pilskalns: paragraphs 81-82) and for general improvements in accuracy.
Wingo and Pilskalns teach the claimed subject matter as discussed above but fail to teach the remaining limitation. Masayuki, however, teaches in pages 16-22 and Figs. 12-14 wherein a drone’s images are put through authenticity determination means.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the current application to incorporate the teachings of Masayuki into the proposed combination of Wingo and Pilskalns because such an incorporation allows for the benefit of improving accuracy of the auditing (pages 16-22).
While the prior art above teaches an inspection device, it fails to teach the claimed “analyzing angles of the plurality of images, and in response to the angles being unacceptable, suggesting course corrections by the AI component”.
White teaches in paragraph 90 that the system is able to determine that an angle … was incorrect during the capture of one or more corresponding images, and labels the images with metadata in accordance with the insufficient quality. Paragraphs 98-102 further teach an alteration in the flight path (“course corrections”) that allows the UAV to revisit the area for further image capture.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the current application to incorporate the teachings of White into the proposed combination of Wingo, Pilskalns and Masayuki because said incorporation allows for the benefit of improving the system by making the system automatically revisit the incorrect or incomplete data (paragraphs 7-8).
Regarding claims 5, 12 and 19, Wingo and Pilskalns teach wherein a distance between a latitude and longitude of the UAV and location coordinates of the cell site is measured by the edge-based AI component (Wingo: col. 7, line 57 through col. 8, line 18; Pilskalns teaches the AI/machine learning components). The prior motivation as discussed in claim 1 above is incorporated herein.
Regarding claims 6, 13 and 20, Wingo and Pilskalns fail to teach the limitation; however, Terry teaches wherein an angle between a central axis of the communication equipment associated with the cell site and a North pointer is determined by the edge-based AI component (paragraph 101 teaches that the antenna azimuth (the angle between the central axis and a North pointer) is determined using the system). The prior motivation as discussed in claim 2 is incorporated herein, including utilizing the teaching with Pilskalns’ AI/machine learning components.
Regarding claim 7, Wingo teaches the claimed wherein an elevation of the antennae and communication equipment associated with the cell site is continuously validated by employing a magnification theorem (col. 10, line 58 through col. 11, line 6, wherein optical zoom is used to maintain/validate the elevation of the RADs or RTAs).
Claims 2-4, 9-11 and 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Wingo et al. (US 10,417,755) in view of Pilskalns (US 202/0004272), further in view of Masayuki et al. (WO 2022/070533; see attached translation), further in view of White (US 2020/0279367), and further in view of Terry (US 2018/0232871).
Regarding claims 2, 9 and 16, Wingo teaches in Fig. 3 and col. 13, line 67 maintaining a centerline point of view of the tower orthogonal to gravity, but even in conjunction with Pilskalns fails to teach the limitation. Terry, however, teaches wherein antennae down tilt is measured by capturing a first set of images of the plurality of images from a same height as that of an antenna height captured horizontally (paragraphs 98-99 and 209 teach using captured images/video to calculate the down tilt angle of an antenna; the height of the camera is set to the height of the antenna 30 to capture images of them along its flight (see para. 133), and three measurements are taken to calculate the down tilt).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the current application to incorporate the teachings of Terry into the proposed combination of Wingo, Pilskalns, Masayuki and White because such an incorporation allows for the benefit of increasing the accuracy of the measurements by taking measurements at same location and/or orientation (see paragraph 74).
Regarding claims 3, 10 and 17, Wingo and Pilskalns fail to teach the limitation; however, Terry teaches wherein azimuth measurements are determined by combining compass data with angular measurements by using a second set of images of the plurality of images captured from a top-down view of the antennae (paragraph 101 teaches using an aerial photo and a compass bearing to determine azimuth measurements of the antennae).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the current application to incorporate the teachings of Terry into the proposed combination of Wingo, Pilskalns, Masayuki and White because such an incorporation allows for the benefit of increasing the accuracy of the measurements by taking measurements at same location and/or orientation (see paragraph 74).
Regarding claims 4, 11 and 18, Wingo and Pilskalns fail to teach the limitation; however, Terry teaches wherein signal strength is measured by using AZQ drive test tools (paragraphs 167-169 teach a quality-of-service test that is implemented by the UAV; the AZQ drive test appears to be a quality-of-service test as well and is therefore equivalent).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the current application to incorporate the teachings of Terry into the proposed combination of Wingo, Pilskalns, Masayuki and White because such an incorporation allows for the benefit of network benchmarking, optimization, quality monitoring, etc. (paragraph 167).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GELEK W TOPGYAL whose telephone number is (571)272-8891. The examiner can normally be reached M-F (9:30-6 PST).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Vaughn can be reached on 571-272-3922. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GELEK W TOPGYAL/ Primary Examiner, Art Unit 2481