Prosecution Insights
Last updated: April 19, 2026
Application No. 18/952,915

METHOD AND SYSTEM FOR AUTONOMOUS EXPLORATION AND SCANNING

Non-Final OA: §102, §103, §112
Filed: Nov 19, 2024
Examiner: MILLER, PRESTON JAY
Art Unit: 3661
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Hexagon Geosystems Services AG
OA Round: 1 (Non-Final)

Grant Probability: 56% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 1m
With Interview: 75%

Examiner Intelligence

Career Allow Rate: 56% (grants 56% of resolved cases; 28 granted / 50 resolved; +4.0% vs TC avg)
Interview Lift: +18.8% (strong; allow rate for resolved cases with an interview vs. without)
Typical Timeline: 3y 1m avg prosecution; 39 applications currently pending
Career History: 89 total applications across all art units
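
These headline figures are simple ratios over the examiner's resolved docket and can be reproduced directly from the counts shown above. A minimal sketch in Python; the variable names and the additive treatment of the interview lift are illustrative assumptions, not this tool's documented implementation:

```python
# Reproducing the examiner-intelligence figures from the counts shown above.
# The additive interview-lift model is an assumption about how the dashboard
# combines the two numbers; it matches the displayed results.

granted, resolved = 28, 50
allow_rate = granted / resolved              # 0.56 -> the "56% Career Allow Rate"

interview_lift = 0.188                       # the "+18.8% Interview Lift"
with_interview = allow_rate + interview_lift # assumed additive model

print(f"base {allow_rate:.0%}, with interview {with_interview:.0%}")  # base 56%, with interview 75%
```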

Statute-Specific Performance

§101: 17.7% (-22.3% vs TC avg)
§103: 48.0% (+8.0% vs TC avg)
§102: 15.3% (-24.7% vs TC avg)
§112: 17.0% (-23.0% vs TC avg)

Black line = Tech Center average estimate. Based on career data from 50 resolved cases.
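
The statute chart reads the same way: each row is this examiner's rate for that rejection type, and the delta is measured against the Tech Center average shown as the black line. A short check, assuming the deltas are plain percentage-point differences; recovering the baseline shows that all four rows imply the same TC average of 40.0%:

```python
# Each entry pairs the examiner's statute-specific rate with the delta shown
# on the card. Subtracting the delta recovers the black-line TC average.
rates  = {"101": 0.177, "103": 0.480, "102": 0.153, "112": 0.170}
deltas = {"101": -0.223, "103": 0.080, "102": -0.247, "112": -0.230}

for statute, rate in rates.items():
    tc_avg = rate - deltas[statute]
    print(f"§{statute}: {rate:.1%} vs TC avg {tc_avg:.1%}")  # TC avg is 40.0% in every row
```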

Office Action

Statutes cited: §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

2. This Office action is in response to the application with case number 18/952,915, filed on 11/19/2024, in which claims 1-16 are presented for examination.

Priority

3. Acknowledgment is made of Applicant's claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent Application No. EP23210867.0, filed on 11/20/2023.

Information Disclosure Statement

4. The information disclosure statement(s) (IDS(s)) submitted on 11/19/2024 has/have been received and considered.

Examiner Notes

5. The Examiner has cited particular paragraphs or columns and line numbers in the references applied to the claims above for the convenience of the applicant. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested that the applicant, in preparing responses, fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner. The prompt development of a clear issue requires that the replies of the Applicant meet the objections to and rejections of the claims. Applicant should also specifically point out the support for any amendments made to the disclosure (see MPEP §2163.06). Applicant is reminded that the Examiner is entitled to give the Broadest Reasonable Interpretation (BRI) of the language of the claims. Furthermore, the Examiner is not limited to an Applicant's definition that is not specifically set forth in the claims. See MPEP 2141.02 [R-07.2015] VI, "PRIOR ART MUST BE CONSIDERED IN ITS ENTIRETY, INCLUDING DISCLOSURES THAT TEACH AWAY FROM THE CLAIMS": a prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP §2123.

6. Examiner notes that Applicants have used the phrase "and/or" in claims 3, 6, 7, 11 and 12. The Patent Trial and Appeal Board (PTAB) has held that use of the phrase "and/or" within a claim is not indefinite. According to the PTAB, "and/or" is not wrong, but it is not the preferred verbiage (see Ex Parte Gross, Appeal No. 2011-004811).

7. Nevertheless, during patent examination, the pending claims must be given their broadest reasonable interpretation (BRI) consistent with the specification (see MPEP § 2111; Phillips v. AWH Corp., 415 F.3d 1303, 1316, 75 USPQ2d 1321, 1329 (Fed. Cir. 2005)). Based upon this guidance from the MPEP and the Federal Circuit Court of Appeals, the Examiner interprets the phrase "and/or" under its broadest reasonable interpretation of "or" for purposes of examination of the instant Application.

Claim Objections

8. Claim 2 is objected to because of the following informalities: "UAV" should read "unmanned aerial vehicle (UAV)," and "a position and orientation of the laser scanner module" in line 6 should read "the position and orientation of the laser scanner module."
9. Claim 5 is objected to because of the following informalities: "a position and orientation of the laser scanner module" in line 15 should read "the position and orientation of the laser scanner module."

10. Claim 13 is objected to because of the following informalities: "A UAV" should read "An unmanned aerial vehicle (UAV)."

11. Appropriate correction is required.

Claim Rejections - 35 USC § 112

12. The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

13. Claims 1-12 and 15-16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

14. Claims 1, 5 and 6 recite the limitation "the point cloud." There is insufficient antecedent basis for this limitation in the claim. For the purpose of prior art rejection, the first instance of the limitation in claim 1 was interpreted as "a point cloud."

15. Claim 7 recites the limitation "the UAV" in line 16 of the claim. There is insufficient antecedent basis for this limitation in the claim. For the purpose of prior art rejection, the limitation was interpreted as "a UAV." To overcome the rejection, Applicant is advised to change the limitation to "an unmanned aerial vehicle (UAV)."

16. Claims 2-12 and 15-16 are rejected by virtue of their dependency on claim 1.

Claim Rejections - 35 USC § 102

17. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

18. Claims 1-2 and 11-16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kendoul et al. (US-20210278834-A1).

In regard to claim 1, Kendoul discloses a computer-implemented method for autonomously exploring, by a mobile robot, one or more objects of interest, the mobile robot comprising a computing unit and a laser scanner module for scanning surfaces of the one or more objects of interest, the laser scanner module having a field of view, comprising (Kendoul, in at least Figs. 2-3, 10, [0001 & 0073 & 0111 & 0227], discloses a method for use in performing exploration and mapping of unknown GPS-denied environments, such as indoors and underground, using an unmanned or unpiloted aerial vehicle [i.e., a computer-implemented method for autonomously exploring, by a mobile robot], beyond visual line of sight and/or beyond communication range. The aerial vehicle 210 generates range data using a range sensor 214 of the aerial vehicle 210. The range data is indicative of a range to the environment. The range sensor is provided using a Lidar sensor [i.e., the mobile robot comprising a laser scanner module]. The aerial vehicle 210 further includes a mapping and control system [i.e., the mobile robot comprising a computing unit] for facilitating functionalities for mapping an environment and autonomously controlling the flight of the aerial vehicle 210 within the environment in accordance with the map.
The control board 403 is also typically coupled to a motor 407 for controlling movement of the Lidar sensor 408, to thereby perform scanning over a field of view [i.e., the laser scanner module having a field of view]. The user defined exploration target is in the form of a target plane "F" as shown in Fig. 10, or in other forms such as a target area (not shown), a target volume; a target object [i.e., scanning surfaces of the one or more objects of interest] and/or a target point):

defining a three-dimensional exploration map, wherein the one or more objects of interest are situated in the exploration map (Kendoul, in at least Fig. 10 (reproduced and annotated below for Applicant's convenience), and [0227], discloses the user defined exploration target is in the form of a target plane "F" as shown in Fig. 10, or in other forms such as a target area (not shown), a target volume (not shown); a target object (not shown) [i.e., defining a three-dimensional exploration map, wherein the one or more objects of interest are situated in the exploration map] and/or a target point (not shown). When the user defined exploration target is the target plane "F", the aerial vehicle 210 flies autonomously toward the nearest point on the plane. Examiner notes, when the target is a target object, the one or more objects of interest are situated in the exploration map);

[Annotated Fig. 10 of Kendoul – Objects of interest]

partitioning the exploration map into a multitude of three-dimensional exploration blocks (Kendoul, in at least Figs. 4, 7A, and [0134], discloses at step 720, the processing device 401 calculates a depth map, which involves determining a minimum range to the environment for directions surrounding the vehicle. The range data will be parsed to identify a minimum range in a plurality of directions around the vehicle. At step 725, the processing device 401 calculates an occupancy grid including an occupancy in voxels for a three dimensional grid around the vehicle [i.e., partitioning the exploration map into a multitude of three-dimensional exploration blocks]. This is typically achieved by segmenting the point cloud and examining for the presence of points within the different voxels of a three dimensional grid surrounding the vehicle. This is used to identify obstacles around the vehicle, allowing paths along which the vehicle can fly to be identified); and

an autonomous exploration of the exploration map by means of the mobile robot (Kendoul, in at least Fig. 2, and [0162], discloses the aerial vehicle 210 is flying autonomously and is generating range data [i.e., an autonomous exploration of the exploration map by means of the mobile robot]. Whilst the aerial vehicle 210 is within communication range of the user processing system 220, the aerial vehicle 210 transmits to the user processing system 220, further map data generated based on the range data), wherein the exploration of the exploration map comprises the mobile robot exploring, along an exploration path, at least a subset of the exploration blocks (Kendoul, in at least Fig. 8, and [0192], discloses the map data includes at least some of the range data, a three dimensional map generated based on the range data, an occupancy grid indicative of the presence of the environment in different voxels of the grid [i.e., at least a subset of the exploration blocks], a depth map indicative of a minimum range to the environment in a plurality of directions, or a point cloud indicative of points in the environment detected by the range sensor. At step 810, the aerial vehicle 210 determines a corresponding first flight plan, and at step 820 the aerial vehicle 210 completes its flight autonomously using the flight plan [i.e., wherein the exploration of the exploration map comprises the mobile robot exploring, along an exploration path]), and the laser scanner module generating scan data while the mobile robot is travelling along the exploration path (Kendoul, in at least Figs. 9A-9C, and [0109 & 0200], discloses the range sensor 214 is a Lidar sensor [i.e., the laser scanner module generating scan data]. As mapping progresses, the windows are updated as shown in Figs. 9B and 9C, to show additional information, including expansion of the point cloud 922, together with the path 923 traversed by the vehicle and user defined waypoints 924 used to guide navigation of the vehicle [i.e., while the mobile robot is travelling along the exploration path]), wherein:

exploring an exploration block at least comprises determining, whether the respective exploration block comprises one or more points of the point cloud (Kendoul, in at least Fig. 7A, and [0134], discloses at step 725, the processing device 401 calculates an occupancy grid including an occupancy in voxels for a three dimensional grid around the vehicle. This is typically achieved by segmenting the point cloud and examining for the presence of points within the different voxels of a three dimensional grid surrounding the vehicle [i.e., exploring an exploration block at least comprises determining, whether the respective exploration block comprises one or more points of the point cloud]. This is used to identify obstacles around the vehicle, allowing paths along which the vehicle can fly to be identified); and

the computing unit of the mobile robot updates the exploration map and defines the exploration path (Kendoul, in at least [0076 & 0164], discloses the user processing system 220 obtains user defined flight instructions. The user interacts with the graphical user interface with regard to the map representation, to define waypoints, flight paths, manoeuvres or the like [i.e., defines the exploration path]. The aerial vehicle 210 transmits further map data that includes any updates to the map data [i.e., the computing unit of the mobile robot updates the exploration map and defines the exploration path], or selectively limits the further map data to only include updates to the map data in a predetermined time window, updates to the map data within a predetermined range of the aerial vehicle, or updates to the map data within a predetermined range of waypoints).

In regard to claim 2, Kendoul teaches/discloses the method according to claim 1, wherein the mobile robot is a UAV, particularly a quadcopter drone (Kendoul, in at least Figs. 2-3, and [0105-0106], discloses the aerial vehicle 210 is an unmanned aerial vehicle (UAV) [i.e., wherein the mobile robot is a UAV]. The aerial vehicle 210 is a quadrotor helicopter [i.e., particularly a quadcopter drone].
Examiner notes, the term "particularly" was interpreted based on its dictionary meaning as "specifically," and the claim was interpreted to limit the scope of the invention to a quadcopter drone).

In regard to claim 11, Kendoul discloses the method according to claim 1, wherein a three-dimensional volume is defined by a user as the three-dimensional exploration map in a graphical user interface, wherein the graphical user interface (Kendoul, in at least Fig. 1, and [0075-0076], discloses at step 120, upon receipt of the transmitted map data, the user processing system 220 displays, using a graphical user interface, a map representation based on the map data. The map data includes information regarding the environment surrounding the aerial vehicle 210 in three dimensions [i.e., wherein a three-dimensional volume]. At step 130, the user processing system 220 obtains user defined flight instructions in accordance with user interactions with the graphical user interface [i.e., defined by a user as the three-dimensional exploration map in a graphical user interface]):

is displayed on a screen of a mobile computing device; and/or allows the user to define a polyhedron, in particular a cuboid, as the three-dimensional exploration map by marking two corner points of the cuboid; and/or shows a two-dimensional representation of the one or more objects of interest (Kendoul, in at least Figs. 2-5, discloses the user processing system 220 [i.e., a mobile computing device] includes an electronic processing device, such as at least one microprocessor 500, a memory 501, an input/output device 502, such as a touch screen display [i.e., a screen of a mobile computing device]).

In regard to claim 12, Kendoul discloses the method according to claim 1, wherein exploring an exploration block comprises scanning, by the laser scanner module, particularly with a user-defined scanning accuracy, surfaces of objects present in the exploration block, in particular wherein the one or more objects of interest comprise: buildings or other man-made structures, and the surfaces include at least one of façades, roofs, pillars and pavement; and/or natural objects or scenes, particularly caves (Kendoul, in at least Fig. 10, and [0004 & 0227], discloses drones are required to collect data (mapping, inspection, images, gas, radiations, etc.) from areas that are inaccessible to humans (dangerous or not possible) such as in underground mining stopes, underground urban utility tunnels [i.e., buildings or other man-made structures], collapsed tunnels and indoor structures, etc. In these GPS-denied environments, generally there is no navigation map that the drone can use to navigate and the options are either assisted flight in line of sight, or waypoint navigation where waypoints are selected by the operator during flight, or autonomous exploration. The user defined exploration target is in the form of a target plane "F" as shown in Fig. 10, or in other forms such as a target area (not shown), a target volume (not shown) [i.e., buildings or other man-made structures]; a target object (not shown) [i.e., the surfaces include at least one of façades, roofs, pillars and pavement] and/or a target point (not shown). Examiner notes, the terms "particularly" and "in particular" were interpreted based on their dictionary meaning as "specifically," and the claim was interpreted to limit the scope of the invention by using the mentioned terms).
In regard to claim 13, Kendoul discloses a UAV comprising a computing unit and a laser scanner module, the laser scanner module having a field of view, wherein the computing unit is configured (Kendoul, in at least Figs. 2-4, and [0073 & 0105 & 0111 & 0113], discloses the aerial vehicle 210 generates range data using a range sensor 214 of the aerial vehicle 210. The range data is indicative of a range to the environment. The range sensor is provided using a Lidar sensor [i.e., a laser scanner module]. The aerial vehicle 210 is an unmanned aerial vehicle (UAV) [i.e., a UAV]. The aerial vehicle 210 further includes a mapping and control system [i.e., a computing unit] for facilitating functionalities for mapping an environment and autonomously controlling the flight of the aerial vehicle 210 within the environment in accordance with the map. The control board 403 is also typically coupled to a motor 407 for controlling movement of the Lidar sensor 408, to thereby perform scanning over a field of view [i.e., the laser scanner module having a field of view]):

a) to receive a three-dimensional exploration map that is partitioned into a multitude of three-dimensional exploration blocks, or b) to receive a three-dimensional exploration map and to partition the exploration map into a multitude of three-dimensional exploration blocks, or c) to receive exploration-map information, particularly comprising coordinates, to generate a three-dimensional exploration map based on the exploration-map information, and to partition the exploration map into a multitude of three-dimensional exploration blocks (Kendoul, in at least Figs. 6, 7A-7B, and [0122-0123 & 0132-0134], discloses at step 600, the aerial vehicle 210 receives flight instructions data from the user processing system 220 [i.e., to receive a three-dimensional exploration map]. Then, at step 610, the mapping and control system of the aerial vehicle 210 determines a flight plan based on the flight instructions data, and stores flight plan data indicative of the flight plan in the memory. At step 700, a flight plan is determined, and at step 720, the processing device 401 calculates a depth map, which involves determining a minimum range to the environment for directions surrounding the vehicle. At step 725, the processing device 401 calculates an occupancy grid including an occupancy in voxels for a three dimensional grid around the vehicle [i.e., to partition the exploration map into a multitude of three-dimensional exploration blocks]. This is typically achieved by segmenting the point cloud and examining for the presence of points within the different voxels of a three dimensional grid surrounding the vehicle. This is used to identify obstacles around the vehicle, allowing paths along which the vehicle can fly to be identified), wherein the one or more objects of interest are situated in the exploration map (Kendoul, in at least Fig. 10 (reproduced and annotated above for Applicant's convenience), and [0227], discloses the user defined exploration target is in the form of a target plane "F" as shown in Fig. 10, or in other forms such as a target area (not shown), a target volume (not shown); a target object (not shown) [i.e., wherein the one or more objects of interest are situated in the exploration map] and/or a target point (not shown). When the user defined exploration target is the target plane "F", the aerial vehicle 210 flies autonomously toward the nearest point on the plane.
Examiner notes, when the target is a target object, the one or more objects of interest are situated in the exploration map), and wherein the computing unit is configured to control an autonomous exploration of the exploration map by means of the UAV (Kendoul, in at least Figs. 2-3, and [0105 & 0224], discloses the aerial vehicle 210 is an unmanned aerial vehicle (UAV) [i.e., by means of the UAV]. Once the aerial vehicle 210 has received the flight instructions data from the user processing system 220, the aerial vehicle 210 then proceeds to fly autonomously in accordance with the flight instructions data and the range data [i.e., wherein the computing unit is configured to control an autonomous exploration of the exploration map]), wherein the exploration of the exploration map comprises the UAV exploring, along an exploration path, at least a subset of the exploration blocks (Kendoul, in at least Fig. 8, and [0192], discloses the map data includes at least some of the range data, a three dimensional map generated based on the range data, an occupancy grid indicative of the presence of the environment in different voxels of the grid [i.e., at least a subset of the exploration blocks], a depth map indicative of a minimum range to the environment in a plurality of directions, or a point cloud indicative of points in the environment detected by the range sensor. At step 810, the aerial vehicle 210 determines a corresponding first flight plan, and at step 820 the aerial vehicle 210 completes its flight autonomously using the flight plan [i.e., wherein the exploration of the exploration map comprises the UAV exploring, along an exploration path]), and the laser scanner module generating scan data while the UAV is travelling along the exploration path, the scan data relating to a point cloud (Kendoul, in at least Figs. 9A-9C, and [0109 & 0200], discloses the range sensor 214 is a Lidar sensor [i.e., the laser scanner module generating scan data]. As mapping progresses, the windows are updated as shown in Figs. 9B and 9C, to show additional information, including expansion of the point cloud 922 [i.e., the scan data relating to a point cloud], together with the path 923 [i.e., while the UAV is travelling along the exploration path] traversed by the vehicle and user defined waypoints 924 used to guide navigation of the vehicle), wherein:

exploring an exploration block at least comprises determining, based on the scan data, whether the respective exploration block comprises one or more points of the point cloud (Kendoul, in at least Fig. 7A, and [0134], discloses at step 725, the processing device 401 calculates an occupancy grid including an occupancy in voxels for a three dimensional grid around the vehicle. This is typically achieved by segmenting the point cloud and examining for the presence of points within the different voxels of a three dimensional grid surrounding the vehicle [i.e., exploring an exploration block at least comprises determining, based on the scan data, whether the respective exploration block comprises one or more points of the point cloud]. This is used to identify obstacles around the vehicle, allowing paths along which the vehicle can fly to be identified); and

the computing unit of the UAV is configured to update the exploration map and to define the exploration path (Kendoul, in at least [0076 & 0164], discloses the user processing system 220 obtains user defined flight instructions.
The user interacts with the graphical user interface with regard to the map representation, to define waypoints, flight paths, manoeuvres or the like [i.e., to define the exploration path]. The aerial vehicle 210 transmits further map data that includes any updates to the map data [i.e., the computing unit of the UAV is configured to update the exploration map and to define the exploration path], or selectively limits the further map data to only include updates to the map data in a predetermined time window, updates to the map data within a predetermined range of the aerial vehicle, or updates to the map data within a predetermined range of waypoints).

In regard to claim 14, Kendoul discloses a system for scanning surfaces of one or more objects of interest, the system comprising a UAV according to claim 13 and a mobile computing device (Kendoul, in at least Fig. 2, and [0071-0073 & 0105 & 0227], discloses the system 200 [i.e., a system] broadly includes an aerial vehicle 210 and a user processing system 220 [i.e., a mobile computing device]. The aerial vehicle 210 uses a Lidar sensor. The aerial vehicle 210 is an unmanned aerial vehicle (UAV) [i.e., the system comprising a UAV]. The user defined exploration target is in the form of a target plane "F" as shown in Fig. 10, or in other forms such as a target area (not shown), a target volume; a target object [i.e., scanning surfaces of one or more objects of interest] and/or a target point), wherein the mobile computing device is configured to receive user-input defining the three-dimensional exploration map (Kendoul, in at least Fig. 6, and [0122], discloses at step 600, the aerial vehicle 210 receives flight instructions data from the user processing system 220 [i.e., the mobile computing device is configured to receive user-input defining the three-dimensional exploration map]), wherein the one or more objects of interest are situated in the exploration map (Kendoul, in at least Fig. 10 (reproduced and annotated above for Applicant's convenience), and [0227], discloses the user defined exploration target is in the form of a target plane "F" [i.e., the one or more objects of interest are situated in the exploration map] as shown in Fig. 10, or in other forms such as a target area (not shown), a target volume; a target object and/or a target point), and to provide the exploration map or exploration-map information, particularly comprising coordinates, to the UAV (Kendoul, in at least Figs. 2, 7A-7B, and [0118 & 0138], discloses the aerial vehicle 210 will transmit map data to the user processing system 220 and the user processing system 220 will transmit flight instructions data [i.e., to provide the exploration map or exploration-map information] to the aerial vehicle 210. The processing device 401 identifies one or more manoeuvres at step 745 based on the selected flight plan and taking into account the occupancy grid, the configuration data and depth map. Having determined the manoeuvres, the processing device 401 generates control instructions at step 750, taking into account the calibration data so that instructions are translated into the coordinate frame of the vehicle [i.e., to provide the exploration map or exploration-map information, particularly comprising coordinates, to the UAV]).
In regard to claim 15, Kendoul discloses a computer program product comprising program code stored in a non-transitory machine-readable medium and having computer-executable instructions for performing the method according to claim 1 (Kendoul, in at least Fig. 5, and [0116], discloses the microprocessor 500 executes instructions in the form of applications software stored in the memory 501 to perform required processes, such as wirelessly communicating with the aerial vehicle 210 [i.e., a computer program product comprising program code stored in a non-transitory machine-readable medium and having computer-executable instructions for performing the method according to claim 1] via the communications interface 504).

In regard to claim 16, Kendoul discloses a computer program product comprising program code stored in a non-transitory machine-readable medium and having computer-executable instructions for performing the method according to claim 12 (Kendoul, in at least Fig. 5, and [0116], discloses the microprocessor 500 executes instructions in the form of applications software stored in the memory 501 to perform required processes, such as wirelessly communicating with the aerial vehicle 210 [i.e., a computer program product comprising program code stored in a non-transitory machine-readable medium and having computer-executable instructions for performing the method according to claim 12] via the communications interface 504).

Claim Rejections - 35 USC § 103

19. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

20. Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Kendoul et al. (US-20210278834-A1) in view of Iandola et al. (US-20180188733-A1).

In regard to claim 3, Kendoul discloses the method according to claim 2, wherein the field of view depends on a position and orientation of the UAV and is limited by: a position and orientation of the laser scanner module relative to the UAV (Kendoul, in at least Fig. 10 (reproduced and annotated below for Applicant's convenience), and [0113], discloses the control board is also typically coupled to a motor for controlling movement of the Lidar sensor, to thereby perform scanning over a field of view. Examiner notes, as illustrated by Fig. 10, the field of view is limited by the position and orientation of the Lidar sensor), and

[Annotated Fig. 10 of Kendoul – Field of view]

Kendoul is silent on a pre-defined radius that is smaller than a maximum scanning range of the laser scanner module, wherein the field of view is also limited by a position and orientation of the laser scanner module relative to other features of the UAV, particularly wherein the other features at least comprise rotors and/or wings of the UAV; and/or the maximum scanning range of the laser scanner module is between 15 and 150 metres, particularly between 40 and 80 metres.
However, Iandola teaches a pre-defined radius that is smaller than a maximum scanning range of the laser scanner module (Iandola, in at least [0040], teaches a LIDAR sensor specifies a minimum usable return range of 0.9 meters [i.e., a pre-defined radius that is smaller than a maximum scanning range of the laser scanner module] and a maximum usable return range of 120 meters [i.e., a maximum scanning range of the laser scanner module]. Examiner notes, any scanning range that is less than the maximum scanning range of the LiDAR system, including the minimum scanning range of the LiDAR system, limits the field of view), wherein the field of view is also limited by a position and orientation of the laser scanner module relative to other features of the UAV, particularly wherein the other features at least comprise rotors and/or wings of the UAV; and/or the maximum scanning range of the laser scanner module is between 15 and 150 metres, particularly between 40 and 80 metres (Iandola, in at least [0040], teaches a LIDAR sensor specifies a minimum usable return range of 0.9 meters and a maximum usable return range of 120 meters [i.e., the maximum scanning range of the laser scanner module is between 15 and 150 metres]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify Kendoul in view of Iandola with a reasonable expectation of success, as both inventions are directed to the same field of endeavor – light detection and ranging – and the combination would provide for observing more sensor readings, which enables it to generate more accurate predictions (Iandola, see at least [0062]). Furthermore, it would have been an obvious matter of design choice to select a range between the minimum and maximum range of the LiDAR system, since Applicant(s) has/have not disclosed that the chosen range, between 40 and 80 meters, solves any stated problem or is for any particular purpose, and it appears that the invention would perform equally well with other ranges between the minimum and maximum range of the LiDAR system.

21. Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Kendoul et al. (US-20210278834-A1).

In regard to claim 4, Kendoul discloses the method according to claim 1, wherein updating the exploration map is performed in time intervals that are selected depending on at least one of: a scanning speed of the laser scanner unit, a computing speed of the computing unit, and a current speed of the mobile robot, wherein the time intervals are between 100ms and 1s (Kendoul, in at least [0164], discloses the aerial vehicle 210 transmits further map data that includes any updates to the map data, or selectively limits the further map data to only include updates to the map data in a predetermined time window [i.e., wherein updating the exploration map is performed in time intervals], updates to the map data within a predetermined range of the aerial vehicle, or updates to the map data within a predetermined range of waypoints).

While Kendoul discloses updates to the map data in a predetermined time window (Kendoul, see at least [0164]), Kendoul is silent on a scanning speed of the laser scanner unit, a computing speed of the computing unit, and a current speed of the mobile robot, wherein the time intervals are between 100ms and 1s.
It would have been an obvious matter of design choice to use the scanning speed of the LiDAR, or any other related properties of the UAV, for updating the map and to use "time intervals […] between 100ms and 1s," since Applicant(s) has/have not disclosed that time intervals between 100ms and 1s solve any stated problem or are for any particular purpose, and it appears that the invention would perform equally well with other time intervals.

Claim Objections

22. In regard to the 35 U.S.C. § 102 or 103 rejection(s) noted above, claims 5-10 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten to overcome the claim objections and/or the claim rejection(s) under 35 USC § 112(b) set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.

Conclusion

23. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Huang (US-20250383216-A1) teaches a grid map construction method which is applied to a robot. Henry et al. (US-20210263515-A1) teaches an unmanned aerial vehicle (UAV) which employs one or more image sensors to capture images of a scan target and uses distance information from the images for determining respective locations in three-dimensional (3D) space of a plurality of points of a 3D model representative of a surface of the scan target.

24. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Preston J Miller, whose telephone number is (703) 756-1582. The examiner can normally be reached Monday through Friday, 7:30 AM - 4:30 PM EST.

25. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

26. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ramya P Burgess, can be reached at (571) 272-6011. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

27. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/P.J.M./
Examiner, Art Unit 3661

/Tarek Elarabi, Ph.D./
Primary Examiner, Art Unit 3661
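
For readers tracking the technology rather than the procedure: the anticipation case against claim 1 turns on Kendoul's occupancy grid ([0134]), i.e., segmenting a Lidar point cloud and testing each voxel of a three-dimensional grid for the presence of points. That operation reduces to a few lines of array arithmetic. A minimal sketch, assuming numpy, an axis-aligned grid, and an illustrative voxel size; this is a reading aid, not code taken from either document:

```python
import numpy as np

def occupancy_grid(points: np.ndarray, origin: np.ndarray,
                   voxel_size: float, dims: tuple[int, int, int]) -> np.ndarray:
    """Mark each voxel of a 3D grid that contains at least one scan point.

    points: (N, 3) point-cloud coordinates from the laser scanner.
    origin: (3,) corner of the grid in world coordinates.
    voxel_size: edge length of a cubic voxel (one 'exploration block').
    dims: number of voxels along x, y, z.
    """
    # Map each point to the integer index of the voxel containing it.
    idx = np.floor((points - origin) / voxel_size).astype(int)
    # Discard points that fall outside the grid.
    in_bounds = np.all((idx >= 0) & (idx < np.array(dims)), axis=1)
    grid = np.zeros(dims, dtype=bool)
    grid[tuple(idx[in_bounds].T)] = True
    return grid

# Does a given exploration block contain one or more points of the point cloud?
cloud = np.random.rand(10_000, 3) * 20.0  # stand-in for Lidar scan data
grid = occupancy_grid(cloud, origin=np.zeros(3), voxel_size=1.0, dims=(20, 20, 20))
print(grid[5, 5, 5])                      # True if that voxel is occupied
```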

Prosecution Timeline

Nov 19, 2024
Application Filed
Jan 23, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12559091: CONTROL DEVICE FOR CONTROLLING SAFETY DEVICE IN VEHICLE (2y 5m to grant; granted Feb 24, 2026)
Patent 12490678: VEHICLE LOCATION WITH DYNAMIC MODEL AND UNLOADING CONTROL SYSTEM (2y 5m to grant; granted Dec 09, 2025)
Patent 12466388: Method for Operating a Motor Vehicle Drive Train and Electronic Control Unit for Carrying Out Said Method (2y 5m to grant; granted Nov 11, 2025)
Patent 12454806: WORK MACHINE (2y 5m to grant; granted Oct 28, 2025)
Patent 12447827: Electric Vehicle Control Device, Electric Vehicle Control Method, And Electric Vehicle Control System (2y 5m to grant; granted Oct 21, 2025)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 56%
With Interview: 75% (+18.8%)
Median Time to Grant: 3y 1m
PTA Risk: Low

Based on 50 resolved cases by this examiner. Grant probability derived from career allow rate.
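
The "With Interview" projection is consistent with simply adding the interview lift to the base grant probability; a one-line sanity check, with the additive model again assumed rather than documented:

```python
print(f"{0.56 + 0.188:.0%}")  # -> 75%, matching the "With Interview" card
```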

Free tier: 3 strategy analyses per month