Prosecution Insights
Last updated: April 19, 2026
Application No. 18/593,558

MOBILITY PLATFORM FOR AUTONOMOUS NAVIGATION OF CONSTRUCTION SITES

Final Rejection (§103)
Filed: Mar 01, 2024
Examiner: ARTIMEZ, DANA FERREN
Art Unit: 3667
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Rugged Robotics Inc.
OA Round: 3 (Final)

Grant Probability: 58% (Moderate)
Projected OA Rounds: 4-5
Projected Time to Grant: 3y 2m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 58% (46 granted / 80 resolved; +5.5% vs TC avg)
Interview Lift: +43.9% (strong; resolved cases with vs. without interview)
Avg Prosecution: 3y 2m (typical timeline)
Currently Pending: 42
Total Applications: 122 (across all art units)

Statute-Specific Performance

§101: 19.0% (-21.0% vs TC avg)
§103: 46.2% (+6.2% vs TC avg)
§102: 7.3% (-32.7% vs TC avg)
§112: 24.6% (-15.4% vs TC avg)

Rates are compared against the Tech Center average estimate; based on career data from 80 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

The Examiner notes that the rejections are based on the broadest reasonable interpretation of the claim language. Applicant is kindly invited to consider each reference as a whole. References are to be interpreted as by one of ordinary skill in the art rather than as by a novice. See MPEP 2141. The relevant inquiry when interpreting a reference is therefore not what the reference expressly discloses on its face, but what the reference would teach or suggest to one of ordinary skill in the art.

Information Disclosure Statement

The information disclosure statement (IDS) filed on 08/13/2025 has been considered by the examiner.

Status of the Claims

This is a Final Office Action in response to Applicant's amendment of 08 October 2025. Claims 1-20 are pending and have been considered as follows.

Response to Amendments and/or Arguments

Applicant's amendments and arguments with respect to the rejections of Claims 1-20 under 35 U.S.C. 112(b) set forth in the Office Action of 08 July 2025 have been considered and are persuasive. The rejections of Claims 1-20 under 35 U.S.C. 112(b) set forth in the Office Action of 08 July 2025 are therefore withdrawn.

Applicant's arguments (see Pages 7-9, filed 08 October 2025) with respect to the rejections of Claims 1, 17, and 19 under 35 U.S.C. 103 set forth in the Office Action of 08 July 2025 have been fully considered and are NOT persuasive.
In summary, Applicant argues (Pages 7-9 of Applicant's Remarks filed on 08 October 2025): [Applicant's argument appears in the record as an image (media_image1.png); not reproduced here.]

The Examiner's Response: Applicant argues that the prior art references Tojima and Kyo fail to disclose (1) comparing a position accuracy value to a predetermined accuracy threshold; and (2) moving the mobility platform toward a landmark when the position accuracy value falls below the threshold. The Examiner has carefully considered Applicant's arguments and respectfully disagrees for at least the following reasons.

Tojima teaches comparing a position accuracy value to a threshold as mapped in the Non-Final Office Action (mailed 07/08/2025; see at least Page 6). Tojima discloses that when a contactless sensor detects a landmark, the control unit obtains (a) a measured position of the landmark based on relative position information, and (b) an estimated position of the mining machine determined by dead reckoning. The control unit compares the measured landmark-based position with the estimated dead-reckoning position and corrects the current position based on the comparison result. Such a comparison inherently involves evaluating whether the discrepancy between the two positions exceeds an acceptable tolerance. As would be understood by one of ordinary skill in the art, performing positional correction only when necessary implies that the discrepancy exceeds a predetermined allowable error/range, i.e., a position accuracy threshold. Therefore, Applicant's argument that Tojima does not explicitly recite a position accuracy value and a position accuracy threshold is not persuasive, since correcting position based on a comparison result necessarily requires determining that the accuracy of the estimated position is insufficient and thus below a predetermined position accuracy threshold.
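The inherency reasoning above (position is corrected only when the discrepancy between the dead-reckoning estimate and the landmark-based measurement exceeds an allowable error) can be sketched in a few lines of Python. This is purely an illustrative sketch; every name and the threshold value are hypothetical and are not taken from Tojima or the claims.

```python
import math

# Hypothetical sketch: compare a dead-reckoning estimate against a
# landmark-derived measurement, derive a position accuracy value, and
# correct only when that value falls below a predetermined threshold.

ACCURACY_THRESHOLD = 0.5  # assumed allowable positional error, in meters

def position_accuracy(estimated, measured):
    """Accuracy value: inverse of the discrepancy between the two fixes."""
    dx = estimated[0] - measured[0]
    dy = estimated[1] - measured[1]
    discrepancy = math.hypot(dx, dy)
    return 1.0 / (1.0 + discrepancy)  # 1.0 means perfect agreement

def maybe_correct(estimated, measured):
    """Adopt the landmark-based fix only when accuracy is insufficient."""
    accuracy = position_accuracy(estimated, measured)
    threshold_accuracy = 1.0 / (1.0 + ACCURACY_THRESHOLD)
    if accuracy < threshold_accuracy:   # discrepancy exceeds allowable error
        return measured                 # positional correction applied
    return estimated                    # estimate still within tolerance
```

With a 0.1 m discrepancy the estimate is kept; with a 2.0 m discrepancy the landmark-based fix is adopted, which is the "correct only when necessary" behavior the Examiner reads into Tojima.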
Kyo teaches moving toward a landmark in response to degraded position accuracy as mapped in the Non-Final Office Action (mailed 07/08/2025; see at least Pages 6-7). Kyo discloses that when the robot determines that it has lost its position, it initiates a recovery procedure that includes: (a) obtaining a current position using RFID; (b) searching for landmarks using a stereo camera; (c) verifying landmark and RFID information; and (d) moving toward the landmark position to identify its own position before resuming work or returning to an initial position. Losing position is a condition that inherently corresponds to the robot's position accuracy falling below an acceptable level. Therefore, moving toward a landmark upon determining that position is lost constitutes moving toward a landmark based on the position accuracy value falling below the predetermined position accuracy threshold, as claimed. Accordingly, Applicant's attempt to distinguish losing position from a position accuracy value falling below a threshold is not persuasive, as both describe functionally equivalent conditions, i.e., the robot's current position estimate is not reliable and lies outside a predetermined accuracy threshold.

Applicant further argues that one of ordinary skill would recognize a technical distinction between (1) comparing a position accuracy value to an accuracy threshold and (2) determining whether a position is outside of a position range. However, the Examiner finds that both approaches represent routine methods of evaluating localization reliability: a position range is simply a spatial expression of an accuracy threshold, and both serve the same purpose of determining whether corrective action is required. Accordingly, Applicant's arguments are NOT persuasive and the 35 U.S.C. 103 rejection is maintained.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-7 and 11-20 are rejected under 35 U.S.C. 103 as being unpatentable over Tojima et al. (US 2016/0349753 A1, hereinafter Tojima) in view of Kyo (JP-2007249735 A, English Translation).

Regarding Claim 1 (similarly claims 17 and 19), Tojima teaches a method for operating a mobility platform in a construction site comprising one or more landmarks (see at least Abstract), the method comprising: moving the mobility platform along a navigational path based at least partly on locations of the one or more landmarks (see at least Abstract, Fig.
[0007, 0043-0056]: a plurality of landmarks are located in the mine and arranged on each of the loading site, discharging site and conveyance path. The dump truck travels along the travel routes generated by the travel route generating unit in at least a part of the loading site, the earth discharging site and the conveyance path, wherein the dump truck obtains the position of the landmark to correct the self position when GPS cannot be used.)

determining a first position of the mobility platform based on information from at least one first sensor (see at least Fig. 1, 3-4 [0058-0078]: the travel control unit allows the dump truck to travel along the travel route set in advance based on the self position of the dump truck detected by the position detection device (e.g. GPS) as the self position detection device, and when the position detection device becomes unable to detect the self position of the dump truck, the travel control unit allows the dump truck to travel by dead reckoning navigation.)

sensing the one or more landmarks with at least one second sensor selected from a group of a stereo camera, a mono camera, and a LIDAR unit when the mobility platform is moved along the path to determine a second position of the mobility platform (see at least Fig. 1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: the contactless sensor 24 may also be an optical sensor which detects the landmark by using laser light as detection light. The contactless sensor includes an emitting unit capable of emitting the detection light and a light receiving unit capable of receiving at least a part of the detection light emitted from the emitting unit to be reflected by the landmark. The travel control unit allows the dump truck to travel based on dead reckoning navigation and determines whether the landmark is detected based on the detection result of the contactless sensor 24 (i.e. LIDAR unit).
The travel control unit obtains the measured position of the landmark based on the information regarding the relative position of the dump truck and the landmark detected by the contactless sensor, and the estimated position of the dump truck (e.g. traveled by dead reckoning) when the contactless sensor detects the landmark.)

comparing the first position to the second position to compute a position accuracy value (see at least Fig. 1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: The travel control unit obtains the measured position of the landmark based on the information regarding the relative position of the dump truck and the landmark detected by the contactless sensor, and the estimated position of the dump truck (e.g. traveled by dead reckoning) when the contactless sensor detects the landmark. The travel control unit corrects a current position of the mining machine based on a comparison result. That is, a position accuracy value is computed in order for the travel control unit to implement positional correction control.)

comparing the position accuracy value to a predetermined position accuracy threshold (see at least Fig. 1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: The travel control unit obtains the measured position of the landmark based on the information regarding the relative position of the dump truck and the landmark detected by the contactless sensor, and the estimated position of the dump truck (e.g. traveled by dead reckoning) when the contactless sensor detects the landmark. The travel control unit corrects a current position of the mining machine based on a comparison result. That is, a position accuracy value is computed and corrected (because it was under a threshold) in order for the travel control unit to implement positional correction control.)
and it may be alleged that Tojima does not explicitly teach moving the mobility platform toward a landmark of the one or more landmarks based on the position accuracy value falling below the predetermined position accuracy threshold.

Kyo is directed to a system and method for accurately recognizing and restoring the self-location of a robot when an error state occurs and the self-location is lost. Kyo teaches moving the mobility platform toward a landmark of the one or more landmarks based on the position accuracy value falling below the predetermined position accuracy threshold (see at least [0010-0014]: when the robot loses its position, it identifies its position and orientation as follows: (3-1) obtain the current position using the RFID system; (3-2) rotate the head and search for landmarks using the stereo camera; (3-3) when the search is successful, re-check the registered landmark position against the RFID result read by the RFID reader; if the result is valid, use the landmark direction and relative position information to move to the landmark position and identify the robot's own position; and (3-4) resume work from the identified position, or return to the initial position. (4) If the search fails, the administrator is called by lighting a lamp or issuing a warning sound. The administrator uses the nearby landmark numbers to instruct the robot on position and orientation information. As a result, the robot automatically moves to the landmark position and identifies its own position.)
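The Kyo recovery sequence quoted above (steps (3-1) through (3-4)) amounts to a short decision procedure. The following minimal Python sketch illustrates it under stated assumptions; every type, name, and value is a hypothetical stand-in for the reference's sensor interfaces, not anything disclosed in Kyo.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical sketch of a Kyo-style recovery: on losing its position,
# the robot takes a coarse RFID fix, searches for a landmark with the
# stereo camera, cross-checks the two, then moves toward the landmark
# to re-identify its own position. Every name here is illustrative.

@dataclass
class SeenLandmark:
    id: int                                  # landmark number read by the camera
    relative_position: Tuple[float, float]   # landmark position relative to robot

def recover_position(rfid_region: int,
                     seen: Optional[SeenLandmark],
                     registry: dict,
                     move_to) -> Optional[Tuple[float, float]]:
    """Return the re-identified absolute position, or None if recovery fails.

    registry maps landmark id -> (region, absolute_position); rfid_region
    is the coarse region obtained from the RFID fix (step 3-1).
    """
    if seen is not None:                         # (3-2) camera search succeeded
        entry = registry.get(seen.id)            # (3-3) re-check vs registered data
        if entry is not None and entry[0] == rfid_region:
            move_to(seen.relative_position)      # (3-4) move toward the landmark
            return entry[1]                      # position identified; resume work
    return None                                  # (4) search failed: call the admin
```

For example, with registry = {3: (1, (7.0, 2.0))}, a sighting of landmark 3 while the RFID fix reports region 1 drives the robot toward the landmark and returns (7.0, 2.0); a failed or inconsistent search returns None so the caller can alert the administrator.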
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Tojima's mining machine travel control system and method to incorporate the technique of moving the mobility platform toward a landmark of the one or more landmarks based on the position accuracy value falling below the predetermined position accuracy threshold, as taught by Kyo, with a reasonable expectation of success, to perform highly accurate and reliable self-position recovery without administrator intervention (Kyo [0016]).

Regarding Claim 2, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches wherein the at least one second sensor is a LIDAR unit (see at least Fig. 1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: the contactless sensor 24 may also be an optical sensor which detects the landmark by using laser light as detection light. The contactless sensor includes an emitting unit capable of emitting the detection light and a light receiving unit capable of receiving at least a part of the detection light emitted from the emitting unit to be reflected by the landmark. The travel control unit allows the dump truck to travel based on dead reckoning navigation and determines whether the landmark is detected based on the detection result of the contactless sensor 24 (i.e. LIDAR unit).)

Regarding Claim 3, the combination of Tojima in view of Kyo teaches the method of claim 2, and Tojima further teaches wherein determining the first position includes using information provided by the at least one first sensor in feedback control (see at least [0047-0077]: the position detection device 29 obtains the self position of the dump truck by using GPS. The self position obtained by the position detection device is the position of the dump truck obtained by GPS, that is to say, the GPS position and absolute position.
The dump truck 2 travels while sequentially updating the self position thereof by using azimuthal measurement with a gyroscope and the speed at which the dump truck 2 travels (hereinafter, appropriately referred to as vehicle speed). Such a method is referred to as dead reckoning navigation or autonomous navigation. Errors are accumulated in dead reckoning navigation. Therefore, in dead reckoning navigation, the self position is corrected by using the position of the dump truck positioned by using a GPS (global positioning system), for example. The dump truck 2 obtains the position of the landmark 8 to correct the self position when the GPS cannot be used. Meanwhile, the self position may also be corrected by the management device 10.)

Regarding Claim 4, the combination of Tojima in view of Kyo teaches the method of claim 3, and Tojima further teaches wherein the at least one first sensor is selected from a group of a drive system encoder, a stereo camera, an inertial measurement unit, and an optical flow sensor (see at least [0047]: The dump truck 2 travels while sequentially updating the self position thereof by using azimuthal measurement with a gyroscope and the speed at which the dump truck 2 travels); and the method further comprises integrating information from the at least one first sensor to compute the first position of the mobility platform in the construction site (see at least [0047]: The dump truck 2 travels while sequentially updating the self position thereof by using azimuthal measurement with a gyroscope and the speed at which the dump truck 2 travels (hereinafter, appropriately referred to as vehicle speed). Such a method is referred to as dead reckoning navigation or autonomous navigation. Errors are accumulated in dead reckoning navigation. Therefore, in dead reckoning navigation, the self position is corrected by using the position of the dump truck positioned by using a GPS (global positioning system), for example.
The dump truck 2 obtains the position of the landmark 8 to correct the self position when the GPS cannot be used. Meanwhile, the self position may also be corrected by the management device 10.)

Regarding Claim 5, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches using a processor to change one or more control parameters of a controller of the mobility platform based on the position accuracy value, the one or more control parameters being for integrating information from the at least one first sensor, to increase the position accuracy value above the predetermined position accuracy threshold (see at least [0145-0165]: the travel control unit 20A determines whether the position detected or measured by the GPS may be adopted. When a detection state of the GPS is not excellent, based on accuracy information included in the detection result output by the position detection device 29, the travel control unit 20A determines that the position positioned by the GPS cannot be adopted. When the GPS position may be adopted, the travel control unit 20A allows the dump truck 2 to travel by dead reckoning navigation while correcting the estimated position by using the GPS.)

Regarding Claim 6, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches wherein the one or more landmarks comprises one or more physical structures in the construction site (see at least [0046, 0096]: the landmark is a stationary object, so that it does not move from the position/place where it is located in principle. The landmark is a structure arranged on each of the loading site, the earth discharging site and the conveyance path.
)

Regarding Claim 7, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches wherein the at least one first sensor is an inertial measurement unit (see at least [0047]: The dump truck 2 travels while sequentially updating the self position thereof by using azimuthal measurement with a gyroscope and the speed at which the dump truck 2 travels); and the at least one second sensor is a LIDAR unit (see at least Fig. 1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: the contactless sensor 24 may also be an optical sensor which detects the landmark by using laser light as detection light. The contactless sensor includes an emitting unit capable of emitting the detection light and a light receiving unit capable of receiving at least a part of the detection light emitted from the emitting unit to be reflected by the landmark. The travel control unit allows the dump truck to travel based on dead reckoning navigation and determines whether the landmark is detected based on the detection result of the contactless sensor 24 (i.e. LIDAR unit).)

Regarding Claim 11, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches correcting movement of the mobility platform based on information provided by the at least one first sensor (see at least [0017]: the mining machine travels based on a detected self position, and uses dead reckoning navigation while correcting a current position of the mining machine based on a position of the landmark obtained in advance and a detected position of the landmark when the self position cannot be detected.)

Regarding Claim 12, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches correcting movement of the mobility platform based on information provided by the at least one second sensor.
the contactless sensor 24 may also be an optical sensor which detects the landmark by using laser light as detection light. The contactless sensor includes an emitting unit capable of emitting the detection light and a light receiving unit capable of receiving at least a part of the detection light emitted from the emitting unit to be reflected by the landmark. The travel control unit allows the dump truck to travel based on dead reckoning navigation and determines whether the landmark is detected based on the detection result of the contactless sensor 24 (i.e. LIDAR unit).

Regarding Claim 13, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches correcting movement of the mobility platform based on both information provided by the at least one first sensor and information provided by the at least one second sensor (see at least Fig. 1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: The travel control unit obtains the measured position of the landmark based on the information regarding the relative position of the dump truck and the landmark detected by the contactless sensor, and the estimated position of the dump truck (e.g. traveled by dead reckoning) when the contactless sensor detects the landmark. The travel control unit corrects a current position of the mining machine based on a comparison result.)

Regarding Claim 14, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches wherein the at least one second sensor comprises each of a stereo camera, a mono camera, and a LIDAR unit (see at least Fig. 1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: the contactless sensor 24 may also be an optical sensor which detects the landmark by using laser light as detection light.
The contactless sensor includes an emitting unit capable of emitting the detection light and a light receiving unit capable of receiving at least a part of the detection light emitted from the emitting unit to be reflected by the landmark. The travel control unit allows the dump truck to travel based on dead reckoning navigation and determines whether the landmark is detected based on the detection result of the contactless sensor 24 (i.e. LIDAR unit).)

Regarding Claim 15, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches wherein the position accuracy value is based on a correlation between the first position and the second position (see at least Fig. 1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: The travel control unit obtains the measured position of the landmark based on the information regarding the relative position of the dump truck and the landmark detected by the contactless sensor, and the estimated position of the dump truck (e.g. traveled by dead reckoning) when the contactless sensor detects the landmark. The travel control unit corrects a current position of the mining machine based on a comparison result. That is, a position accuracy value is computed in order for the travel control unit to implement positional correction control.)

Regarding Claim 16, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches integrating the information from the at least one first sensor to compute the first position of the mobility platform in the construction site (see at least Fig. 1, 3-4 [0058-0078]: the travel control unit allows the dump truck to travel along the travel route set in advance based on the self position of the dump truck detected by the position detection device (e.g.
GPS) as the self position detection device, and when the position detection device becomes unable to detect the self position of the dump truck, the travel control unit allows the dump truck to travel by dead reckoning navigation); and continuously detecting the one or more landmarks with the at least one second sensor to compute the second position of the mobility platform in the construction site (see at least Fig. 1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: the contactless sensor 24 may also be an optical sensor which detects the landmark by using laser light as detection light. The contactless sensor includes an emitting unit capable of emitting the detection light and a light receiving unit capable of receiving at least a part of the detection light emitted from the emitting unit to be reflected by the landmark. The travel control unit allows the dump truck to travel based on dead reckoning navigation and determines whether the landmark is detected based on the detection result of the contactless sensor 24 (i.e. LIDAR unit). The travel control unit obtains the measured position of the landmark based on the information regarding the relative position of the dump truck and the landmark detected by the contactless sensor, and the estimated position of the dump truck (e.g. traveled by dead reckoning) when the contactless sensor detects the landmark.)

Regarding Claim 18 (similarly claim 20), the combination of Tojima in view of Kyo teaches the mobility platform of claim 17, wherein Tojima further teaches: the at least one second sensor is a LiDAR unit (see at least Fig. 1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: the contactless sensor 24 may also be an optical sensor which detects the landmark by using laser light as detection light.
The contactless sensor includes an emitting unit capable of emitting the detection light and a light receiving unit capable of receiving at least a part of the detection light emitted from the emitting unit to be reflected by the landmark. The travel control unit allows the dump truck to travel based on dead reckoning navigation and determines whether the landmark is detected based on the detection result of the contactless sensor 24 (i.e. LIDAR unit)); the at least one first sensor is selected from a group of a drive system encoder, a stereo camera, an inertial measurement unit, and an optical flow sensor (see at least [0047]: The dump truck 2 travels while sequentially updating the self position thereof by using azimuthal measurement with a gyroscope and the speed at which the dump truck 2 travels); and the non-transitory computer-readable storage medium further causes the at least one processor to integrate the information from the at least one first sensor to compute the first position of the mobility platform in the construction site (see at least Fig. 1, 3-4 [0058-0078]: the travel control unit allows the dump truck to travel along the travel route set in advance based on the self position of the dump truck detected by the position detection device (e.g. GPS) as the self position detection device, and when the position detection device becomes unable to detect the self position of the dump truck, the travel control unit allows the dump truck to travel by dead reckoning navigation.)

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Tojima in view of Kyo and Wu et al. (US 2018/0135328 A1, hereinafter Wu).

Regarding Claim 8, the combination of Tojima in view of Kyo teaches the method of claim 1, and Tojima further teaches wherein the at least one second sensor is a LiDAR unit (see at least Fig.
1, 3-4, 7, 12-14 [0058-0078, 0096-0122, 0145-0168]: the contactless sensor 24 may also be an optical sensor which detects the landmark by using laser light as detection light. The contactless sensor includes an emitting unit capable of emitting the detection light and a light receiving unit capable of receiving at least a part of the detection light emitted from the emitting unit to be reflected by the landmark. The travel control unit allows the dump truck to travel based on dead reckoning navigation and determines whether the landmark is detected based on the detection result of the contactless sensor 24 (i.e. LIDAR unit).)

The combination of Tojima in view of Kyo does not explicitly teach that the at least one first sensor is a drive system encoder. Wu is directed to a transfer robot and a method for storing and retrieving a vehicle using the robot. Wu teaches the at least one first sensor is a drive system encoder (see at least [0053]: the main controller realizes position calculation with the following principles: two position calculators are adopted to complete the accurate estimation and calculation of the positions, wherein the main position calculator adopts a dead reckoning method and calculates the theoretical position of the AGV in real time based on the previous position information, encoder information (speed encoder and angle encoder) and the kinematic model of the vehicle body.)

Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Tojima and Kyo to provide a first sensor with a drive system encoder for estimating a vehicle's position, as taught by Wu, with a reasonable expectation of success, to provide a method for an AGV that can realize self navigation and free navigation, and doing so would improve the operation efficiency of the AGV (Wu [0007]).

Claims 9-10 are rejected under 35 U.S.C.
103 as being unpatentable over Tojima in view of Kyo and Busby et al. (US 2018/0043386 A1, hereinafter Busby).

Regarding claim 9, the combination of Tojima in view of Kyo teaches the method of claim 1, but does not explicitly teach depositing marking fluid on a surface of the construction site at one or more task locations. Busby is directed to a system and method for unmanned vehicle painting applications. Busby teaches depositing marking fluid on a surface of the construction site at one or more task locations (see at least [0080-0084]: if the UAV is at a location that should be painted and the location has not yet been painted, then the UAS commands the UAV to apply paint at the current position. The tasks can be accomplished using paint profiles.)

Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Tojima and Kyo to fit an unmanned vehicle with an appropriate tool capable of depositing marking fluid on a surface of the construction site at one or more task locations, as taught by Busby, with a reasonable expectation of success, to improve cost efficiency and safety (Busby [0022]).

Regarding claim 10, the combination of Tojima in view of Kyo and Busby teaches the method of claim 9. The combination of Tojima in view of Kyo does not explicitly teach wherein depositing marking fluid on a surface of the construction site at one or more task locations comprises providing commands to a controller disposed on the mobility platform to release the marking fluid with a print system based at least partly on a mobility platform velocity.
Busby is directed to a system and method for unmanned vehicle painting applications. Busby teaches wherein depositing marking fluid on a surface of the construction site at one or more task locations comprises providing commands to a controller disposed on the mobility platform to release the marking fluid with a print system based at least partly on a mobility platform velocity (see at least Fig. 14 and [0043, 0080-0086]: calculating a flight path for a UAV component of a UAS based upon a desired paint profile. In some embodiments the associated flight path is determined based upon efficiency, i.e., the flight path that requires the UAV component to consume the least amount of power; in other embodiments, the associated flight path for a particular paint profile is determined based upon speed, i.e., the flight path that completes the paint profile in the least amount of time).

Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Tojima and Kyo to fit an unmanned vehicle with an appropriate tool capable of depositing marking fluid on a surface of the construction site at one or more task locations, and to provide commands to a controller disposed on the mobility platform to release the marking fluid with a print system based at least partly on a mobility platform velocity, as taught by Busby, with a reasonable expectation of success, to improve cost efficiency and safety (Busby [0022]).

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANA F ARTIMEZ, whose telephone number is (571) 272-3410. The examiner can normally be reached M-F, 9:00 am-3:30 pm EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Faris S. Almatrahi, can be reached at (313) 446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DANA F ARTIMEZ/
Examiner, Art Unit 3667

/FARIS S ALMATRAHI/
Supervisory Patent Examiner, Art Unit 3667
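The reply deadlines recited in the conclusion reduce to simple calendar arithmetic: three months from the Jan 20, 2026 mailing date with no extension fee, six months as the absolute statutory cap. A minimal sketch of that arithmetic (stdlib only; the `add_months` helper is illustrative, and the two-month/advisory-action interplay and the 37 CFR 1.136(a) fee schedule are deliberately not modeled):

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last valid day of the target month."""
    y, m = divmod(d.month - 1 + months, 12)
    for day in (d.day, 30, 29, 28):  # clamp e.g. Jan 31 + 1 month -> Feb 28/29
        try:
            return date(d.year + y, m + 1, day)
        except ValueError:
            continue

mailed = date(2026, 1, 20)  # Final Rejection mailing date from the timeline

shortened_period = add_months(mailed, 3)  # reply due without extension fees
statutory_bar = add_months(mailed, 6)     # latest possible reply date

print(shortened_period)  # 2026-04-20
print(statutory_bar)     # 2026-07-20
```

Each month of extension past the shortened period carries an escalating fee under 37 CFR 1.17(a), which is why responses are typically filed inside the three-month window.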

Prosecution Timeline

Mar 01, 2024
Application Filed
Sep 27, 2024
Non-Final Rejection — §103
Dec 23, 2024
Examiner Interview Summary
Dec 23, 2024
Applicant Interview (Telephonic)
Dec 30, 2024
Response Filed
Jun 20, 2025
Non-Final Rejection — §103
Oct 08, 2025
Response Filed
Jan 20, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596371
SYSTEM AND METHOD FOR INTERCEPTION AND COUNTERING UNMANNED AERIAL VEHICLES (UAVS)
2y 5m to grant Granted Apr 07, 2026
Patent 12573078
METHOD AND APPARATUS FOR DETERMINING VEHICLE LOCATION BASED ON OPTICAL CAMERA COMMUNICATION
2y 5m to grant Granted Mar 10, 2026
Patent 12571646
Automated Discovery and Monitoring of Uncrewed Aerial Vehicle Ground-Support Infrastructure
2y 5m to grant Granted Mar 10, 2026
Patent 12560441
METHOD AND APPARATUS FOR OPTIMIZING A MULTI-STOP TOUR WITH FLEXIBLE MEETING LOCATIONS
2y 5m to grant Granted Feb 24, 2026
Patent 12560936
SYSTEMS AND METHODS FOR OBJECT DETECTION
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

4-5
Expected OA Rounds
58%
Grant Probability
99%
With Interview (+43.9%)
3y 2m
Median Time to Grant
High
PTA Risk
Based on 80 resolved cases by this examiner. Grant probability derived from career allow rate.
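The headline numbers in this panel follow from the examiner's career counts shown above. A minimal sketch of the arithmetic, assuming the grant probability is simply granted/resolved, the interview lift is additive in percentage points, and the displayed probability is capped at 99% (all three are assumptions about this tool's methodology, not a published formula):

```python
granted, resolved = 46, 80   # "46 granted / 80 resolved" from Examiner Intelligence
interview_lift = 0.439       # "+43.9%" interview lift, read as additive points

allow_rate = granted / resolved                          # career allow rate
with_interview = min(allow_rate + interview_lift, 0.99)  # assumed display cap

print(f"{allow_rate:.1%}")      # 57.5%, displayed rounded as 58%
print(f"{with_interview:.0%}")  # 99%
```

Note that 57.5% + 43.9 points exceeds 100%, which is consistent with the panel showing a capped 99% rather than a true conditional probability.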
