Prosecution Insights
Last updated: April 19, 2026
Application No. 17/926,224

EXCAVATION INSPECTION AND CLEARANCE SYSTEM

Status: Final Rejection (§103)
Filed: Nov 18, 2022
Examiner: CHAD, ANISS
Art Unit: 3662
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Australian Droid & Robot Pty Ltd.
OA Round: 2 (Final)
Grant Probability: 69% (Favorable)
Expected OA Rounds: 3-4
Expected Time to Grant: 3y 10m
Grant Probability with Interview: 98%

Examiner Intelligence

Career Allowance Rate: 69% (303 granted / 439 resolved; +17.0% vs Tech Center average)
Interview Lift: +28.8% on resolved cases with interview
Average Prosecution: 3y 10m; 14 applications currently pending
Total Applications: 453 across all art units

Statute-Specific Performance

§101: 20.7% (-19.3% vs TC avg)
§103: 41.8% (+1.8% vs TC avg)
§102: 15.1% (-24.9% vs TC avg)
§112: 16.8% (-23.2% vs TC avg)

Tech Center averages are estimates; based on career data from 439 resolved cases.

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA. Aniss Chad is the new examiner of record.

Status of Claims

This action is in response to the amendments filed 4/2/2025, in which claims 1-2, 5, and 7 were canceled. Claims 3-4, 6, and 8-20 stand rejected.

Response to Amendment

Applicant has amended or canceled claims to overcome the previous objections. Accordingly, the previous claim objections are withdrawn.

Response to Arguments

Applicant's arguments have been fully considered but are not persuasive. Applicant first argues that:

“a review of paragraphs 0097, 0098, and 0114 of Pack et al. reveals no discussion of having at least two autonomous agents, wherein one agent monitors an edge and a node at one portion of a graph and another agent creates and maintains a clear zone at another part of the graph. There is no disclosure of monitoring an edge and a node of a graph and there is no disclosure of maintaining a clear zone at another part of the graph. Maintaining a clear zone means that the zone is clear for a span of time. Since persons and equipment are not static and may move into areas being monitored by an autonomous agent, that autonomous agent must be capable of keeping a zone clear of such persons and equipment for a span of time. Pack et al. does not disclose that its vehicles are capable of such a task. Instead, Pack et al. is directed to a static situation, wherein personnel are prevented from entering areas where a vehicle is operating. Only when the vehicle has mapped static structures and found no threats are personnel allowed into the mapped area (see Paragraph 0094 of Pack et al.). Since there is no clear reason given in Dunbabin et al. to alter Pack et al. to use at least two autonomous agents, wherein one agent monitors an edge and a node at one portion of a graph and another agent creates and maintains a clear zone at another part of the graph, the rejection of claim 3 should be withdrawn.”

The Examiner respectfully disagrees. First, the examiner notes that, according to the instant specification, the terms “node” and “edge” are used in the context of representing the layout of an excavation as a graph for planning and navigation by autonomous agents. Second, Pack [0097] and [0098] teach the further limitation “wherein a first autonomous agent of the plurality of autonomous agents monitors at least one of an edge and a node of one portion of the graph while a second autonomous agent of the plurality of autonomous agents creates and maintains a clear zone in another part of the graph.” Pack explicitly describes multiple vehicles assigned to different roles and areas, working in parallel and sharing information.

Pack [0097] describes remote vehicles (autonomous agents) performing persistent stare and perimeter surveillance missions. In these missions, remote vehicles can autonomously follow predefined paths, use obstacle detection and avoidance, and tag maps with sensor data and images. The vehicles can move from outpost to outpost, monitoring suspicious locations for scheduled times. One vehicle can remain at a specific location (effectively “monitoring” an area, analogous to monitoring an edge or node) while another vehicle continues to patrol or clear other areas, thus maintaining surveillance or a “clear zone” elsewhere.

Pack [0098] expands on multi-agent behaviors by explaining that remote vehicles can be tasked with clearing a route, minefield, or hazardous area. The vehicles are equipped with sensors and can be commanded to sweep an area, generate maps, and share data. Multiple vehicles can be deployed by a team, with some vehicles scanning or monitoring particular areas (e.g., remaining stationary or focusing on a suspicious location, monitoring a node or edge) while others sweep or clear other sections. The system allows for coordination such that different agents are responsible for maintaining surveillance (“monitoring”) and clearing zones in different parts of the operational space.

Applicant further alleges that the cited references fail to teach "the plurality of autonomous agents are configured to navigate the excavation such that the clear zones are maintained and can be joined to form a larger clear zone in the excavation." The Examiner respectfully disagrees: as discussed above, Pack teaches maintaining a clear zone, and, as described in the previous Office action and below, Dunbabin et al. teaches maintaining clear zones.

First, Dunbabin et al. describes a system in which the excavation machinery (which may be operated autonomously or semi-autonomously) continuously generates and updates a 3D digital terrain and obstacle map using data from multiple sensors ([0006], [0037], [0039]). Dunbabin’s system identifies exclusion zones and assigns dynamic safety zones, or “safety bubbles,” around obstacles and within the operational workspace ([0007], [0043]). These safety zones are enforced in real time, meaning the machinery is controlled to avoid entering areas that are not “clear” (i.e., areas with obstacles or personnel), thus maintaining clear zones during operation.

Second, Dunbabin’s system can incorporate situational maps from other machines or off-board sensors to improve awareness and expand the mapped clear areas ([0019], [0045]). By integrating maps and data from multiple sources, the system effectively allows separate “clear zones” (areas confirmed free of obstacles) from different machines or sensors to be combined into a larger clear zone.
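The joining of per-agent clear zones that the Examiner reads onto Dunbabin's map integration can be illustrated with a small sketch. The cell-grid model, function names, and sample coordinates below are assumptions for illustration only, not structures from either reference: each agent's clear zone is modeled as a set of map cells, and two zones merge by set union once they touch.

```python
# Illustrative sketch only: clear zones as sets of grid cells, merged by union.
# The cell-grid model and function names are hypothetical, not from Pack or Dunbabin.

def adjacent(a, b):
    """4-connected adjacency between two grid cells."""
    (ax, ay), (bx, by) = a, b
    return abs(ax - bx) + abs(ay - by) == 1

def can_join(zone_a, zone_b):
    """Two clear zones can be joined once any of their cells touch."""
    return any(adjacent(a, b) for a in zone_a for b in zone_b)

def join_zones(zone_a, zone_b):
    """Join two touching clear zones into one larger clear zone."""
    assert can_join(zone_a, zone_b)
    return zone_a | zone_b

# Agent 1 has cleared a drive heading east; agent 2 a crosscut heading north.
zone_1 = {(0, 0), (1, 0), (2, 0)}
zone_2 = {(2, 1), (2, 2)}
larger = join_zones(zone_1, zone_2)   # the union covers both cleared areas
```

Under this toy model, the "larger clear zone" of the claim is simply the union of the individual zones, maintained as agents re-survey their cells.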
This joining is further enabled by the system’s ability to dynamically update and merge exclusion/safety zones as obstacles are removed or as the machinery progresses, expanding the accessible workspace ([0043], [0053]). Although Dunbabin et al. primarily describes a single machine, it explicitly contemplates the use of multiple machines and off-board sensors working together ([0019], [0044], [0045]), each contributing to the overall situational awareness and clear-zone mapping. The combined data from these agents ensures that, as each navigates and clears its assigned area, the system can maintain and merge individual clear zones into a comprehensive, larger clear zone for safe operation.

Applicant further argues that “Dunbabin et al. discloses unwanted items are present in its safety zones or bubbles while the claims do not want unwanted items in its clear zones. Thus, Dunbabin et al. teaches away from the claimed clear zones and so there is no clear reason in Dunbabin et al. to alter Pack et al. to use clear zones as recited in claim 3 and so the rejection should be withdrawn.”

The Examiner respectfully disagrees. Dunbabin et al. describes dynamically assigning safety zones, or “exclusion areas,” around detected obstacles ([0043]) to prevent the machine from entering regions where unwanted items (e.g., personnel, equipment, vehicles) are present. The functional result is that the remaining workspace, outside the safety/exclusion zones, is an area confirmed to be free of such obstacles, i.e., a clear zone as claimed. The claim requires that clear zones are defined by a lack of persons and/or equipment. Dunbabin et al. achieves this by identifying all obstacles and designating the surrounding area as off-limits (an exclusion zone), thereby ensuring the rest of the operational area is free of such obstacles and thus “clear.” The safety zone is not a region where obstacles are permitted or present; rather, it is a buffer to ensure obstacles are not encountered, and the machine operates only in areas where obstacles are absent.

Dunbabin et al. does not “teach away” from the concept of a clear zone. Instead, the reference provides a method for reliably determining and maintaining obstacle-free (clear) areas for machine operation, which is functionally equivalent to the claimed clear zone. A person of ordinary skill in the art would recognize that both approaches aim to ensure autonomous machinery operates only in areas free of hazards (persons, equipment). Therefore, it would have been obvious to adapt Pack’s system to use Dunbabin’s mapping and exclusion techniques to define and maintain clear zones to improve the machine's own awareness of the environment, thereby improving excavation safety (see the rejection below for a detailed mapping). Applicant’s arguments have been fully considered but have been found unpersuasive.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C.
103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 3-4, 6, 8, 10, and 12-20 are rejected under 35 U.S.C. 103 as being unpatentable over Pack et al. (US 20120095619 A1) in view of Dunbabin et al. (US 20100223008 A1), as cited in the Information Disclosure Statement dated 18 November 2022.

Regarding Claim 3, Pack teaches:

An excavation inspection and clearance method comprising: (see at least Pack [0015], “The present teachings also provide a method for conducting a remote vehicle mission”; and Pack [0087], “…used for building and area clearance, EOD operations, tunnel and cave exploration,” where the method for area clearance and tunnel exploration corresponds to an excavation inspection.)

navigating an excavation with a plurality of autonomous agents; (see at least Pack [0095], “The remote vehicles can autonomously explore and investigate culverts, tunnels, and caves.”)

generating a graph corresponding to the excavation, (see at least Pack [0103], “creating and/or using a top-down coordinate (e.g., Cartesian) map including at least one of coordinates, occupancy map, free space map…,” where the coordinate map including occupancy and free space corresponds to a graph of the excavation.)

wherein edges correspond to pathways of the excavation along which the plurality of autonomous agents may travel and nodes correspond to junctions joining the pathways; (see at least Pack [0143], “A third type of data package can include…topological connections of routes and waypoints…and/or free space maps in spatial graph or Voronoi diagram form,” where topological connections of routes and free space correspond to edges and nodes of pathways the vehicle may travel.)

determining a search strategy using the graph; (see at least Pack [0045], “An environment can be defined as a physical area that has a defined coordinate system with a localization strategy and a planning strategy, each of which is effective for its environment”; and Pack [0142], “data package can include families of pre-defined scripts for defining sequences of guided, assisted, and autonomous acts or steps…,” where the data package defines the mission procedures and preferred tactics, i.e., generates the search strategy using the graph; see also Pack [0102].)

searching, according to the search strategy and using one or more sensors of each of the plurality of autonomous agents…; (see at least Pack [0043], “A remote vehicle can include environment sensors”; and Pack [0044], “A remote vehicle can be completely autonomous, finding and recognizing tangible elements within its immediate environment.”)

determining clear zones around each of the one or more autonomous agents…in proximity of the autonomous agents while navigating the excavation; (see at least Pack [0094], “In tunnel and cave exploration missions, the remote vehicle can be sent in to explore caves, tunnels, or other difficult environments…”; and Pack [0091], “For building and area clearance, the remote vehicle can be used to autonomously…examine environments (both line-of-site and non-line-of-site), map terrain structures and threats, and allow investigation and elimination of identified threats,” where identification and elimination of threats in the environment around the vehicle corresponds to determining a clear zone during area clearance in a mine or tunnel, i.e., while navigating the excavation.)

wherein a first autonomous agent of the plurality of autonomous agents monitors at least one of an edge and a node of one portion of the graph while a second autonomous agent of the plurality of autonomous agents creates and maintains a clear zone in another part of the graph. (As discussed in the Response to Arguments: Pack [0097] describes remote vehicles performing persistent stare and perimeter surveillance missions in which one vehicle can remain at a specific location, effectively “monitoring” an area, analogous to monitoring an edge or node, while another vehicle continues to patrol or clear other areas, thus maintaining surveillance or a “clear zone” elsewhere. Pack [0098] describes teams of vehicles tasked with clearing a route, minefield, or hazardous area, with some vehicles scanning or monitoring particular areas while others sweep or clear other sections, such that different agents are responsible for monitoring and clearing zones in different parts of the operational space.)
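The claimed graph model (nodes as junctions, edges as traversable pathways, and a search strategy determined over them) can be made concrete with a brief sketch. The adjacency-list class, names, and layout below are illustrative assumptions, not structures taken from the application or the cited art:

```python
# Illustrative sketch: an excavation as an undirected graph with junctions
# as nodes and drives/tunnels as edges. Names and layout are hypothetical.
from collections import defaultdict

class ExcavationGraph:
    def __init__(self):
        self.adj = defaultdict(set)   # node -> set of neighboring junctions

    def add_pathway(self, junction_a, junction_b):
        """An edge: a pathway an autonomous agent may travel."""
        self.adj[junction_a].add(junction_b)
        self.adj[junction_b].add(junction_a)

    def search_order(self, start):
        """One possible search strategy: breadth-first order over junctions."""
        seen, order, queue = {start}, [], [start]
        while queue:
            node = queue.pop(0)
            order.append(node)
            for nxt in sorted(self.adj[node]):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return order

g = ExcavationGraph()
g.add_pathway("portal", "junction-1")
g.add_pathway("junction-1", "stope-A")
g.add_pathway("junction-1", "stope-B")
plan = g.search_order("portal")   # ['portal', 'junction-1', 'stope-A', 'stope-B']
```

Breadth-first order is only one candidate strategy; the point of the sketch is that once pathways and junctions are a graph, a search plan falls out of a standard traversal.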
Pack does not appear to expressly teach the following claim elements: to identify at least one of persons and equipment in proximity of the plurality of autonomous agents while navigating the excavation; determining clear zones around each of the plurality of autonomous agents according to at least one of a lack of persons and a lack of equipment in proximity of the plurality of autonomous agents while navigating the excavation; and wherein the plurality of autonomous agents are configured to navigate the excavation such that the clear zones are maintained and can be joined to form a larger clear zone in the excavation.

However, Dunbabin discloses autonomous path planning in excavation machinery that teaches (searching, using one or more sensors of the autonomous agents) to identify at least one of persons and equipment in proximity of the plurality of autonomous agents while navigating the excavation (see at least Dunbabin [0003], “detecting and avoiding obstacles such as large boulders, trucks and other equipment, as well as collision detection with the dig-face, the machine itself, other machines and ground personnel.”);

(determining clear zones around each of the one or more autonomous agents) according to a lack of persons and/or equipment (see at least Dunbabin [0015], “safety zones or "safety bubbles" may be assigned to the obstacles detected in a terrain and obstacle map to define the minimum clearance area for the machine to avoid collision”; and Dunbabin [0017], “Path planning is performed using knowledge of the machine's current and desired states and its movement in response to inputs. The collision-free, optimal path is generated.”);

and wherein the plurality of autonomous agents are configured to navigate the excavation such that the clear zones are maintained and can be joined to form a larger clear zone in the excavation (see at least Dunbabin [0042], “First, an object detection system is used to find or identify obstacles such as vehicles, equipment, rocks and the machine's own crawlers from the map. The subsystem also fills any `holes` or `gaps` in the map in which no valid sensor date is available”; Dunbabin [0043], “Once the obstacles are identified, safety zones or safety bubbles are dynamically assigned around any obstacles in the situational awareness map to define exclusion areas”; and Dunbabin [0044], “Additionally, the system is capable of not only detecting obstacles, but tracking their movement throughout the workspace. Workers and vehicles entering the workspace may carry a trackable identification tag for this purpose; allowing the excavator to detect them and the system to gather information about their movements. A large safety bubble can be assign to these trackable objects to ensure that the bucket or other part of the machine cannot collide with them.”)

In summary, Dunbabin teaches identification of detected machines and ground personnel and assigns safety zones based on the detections (determines a clear zone according to a lack of persons or equipment), fills in the map where no sensor data is available to define exclusion zones (forms a larger clear zone), and tracks movements of detected obstacles in the environment (maintains the clear zone).
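The safety-bubble logic the Examiner maps onto the clear-zone limitation amounts to a simple geometric test: a location is "clear" when it lies outside every safety bubble assigned to a detected person or piece of equipment. The sketch below is an assumption-laden illustration of that reading; the radius, grid, and names are hypothetical, not values from Dunbabin:

```python
# Illustrative sketch: a cell is "clear" when it lies outside all safety
# bubbles around detected persons/equipment. Radius and data are hypothetical.
import math

def in_clear_zone(point, obstacles, bubble_radius=5.0):
    """True when the point is outside every safety bubble."""
    return all(math.dist(point, obs) > bubble_radius for obs in obstacles)

def clear_cells(grid_cells, obstacles, bubble_radius=5.0):
    """The clear zone: every surveyed cell not covered by a safety bubble."""
    return {c for c in grid_cells if in_clear_zone(c, obstacles, bubble_radius)}

detections = [(4.0, 0.0)]                      # a tracked worker on the drive
survey = [(float(x), 0.0) for x in range(0, 12, 2)]  # cells (0,0) .. (10,0)
zone = clear_cells(survey, detections)         # only cells more than 5 m away
```

Re-running the test as detections move is one way to read "maintained": the clear zone is recomputed each cycle, shrinking or growing as persons and equipment are tracked through the workspace.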
It would have been obvious to one having ordinary skill in the art before the effective filing date of the instant application to combine the remote vehicle missions and systems taught by Pack with the autonomous path planning in excavations taught by Dunbabin, with a reasonable expectation of success, because both inventions are from the same field of endeavor of autonomous vehicle control and object detection, and modifying the method taught by Pack with the determination of clear zones based on detected machines and personnel taught by Dunbabin would (see Dunbabin [0045]) “improve the machine's own awareness of the environment,” thereby improving excavation safety.

Regarding Claim 4, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. Pack further teaches further including receiving a map of the excavation (see at least Pack [0021], “the operator control unit retrieves any available historical maps and data available for and relevant to the selected EOD mission, and sends the historical data to the robot head for use in the mission.”)

Regarding Claim 6, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above.
The combination further teaches further including determining a path through the graph for each of the plurality of autonomous agents such that the path creates and maintains a clear zone in the graph (see at least Dunbabin [0015], “safety zones or "safety bubbles" may be assigned to the obstacles detected in a terrain and obstacle map to define the minimum clearance area for the machine to avoid collision”; Pack [0095], “The remote vehicles can autonomously approach a suspicious object and allow the operator to investigate it, and can share video, sensor, and map data with other remote vehicles and units being deployed”; Pack [0096], “Remote vehicles can also perform persistent stare and perimeter surveillance missions…The remote vehicle can autonomously follow a previously-defined path while using obstacle detection and avoidance and tagging a map with images, sensor data, and other critical information and/or transmitting such data in real time…The remote vehicle can inform the operator and/or other personnel (e.g., a command center) when there is a change in a scene, sensor data, and/or other critical information”; and Dunbabin [0017], “Path planning is performed using knowledge of the machine's current and desired states and its movement in response to inputs. The collision-free, optimal path is generated.”) To summarize, Modified Pack teaches an autonomous vehicle that navigates the excavation while detecting objects and generating an optimal collision-free path through a terrain and obstacle map (determining and creating a clear-zone path through the graph), while the remote vehicles share data with other deployed vehicles (for each of the plurality of autonomous agents) so they may follow a previously defined path while performing surveillance (maintaining a clear zone).

Regarding Claim 8, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. Pack further teaches wherein the plurality of autonomous agents are configured to share their location with each other (see at least Pack [0097], “Remote vehicles that are tasked with clearing a route or area can share video, sensor data, and map data among themselves and/or with other remote vehicles and deployed units.”)

Regarding Claim 10, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination further teaches wherein a central server may control the plurality of autonomous agents (see at least Pack [0044], “A remote vehicle can be completely autonomous”; Pack [0082], “FIG. 6 shows an operator control unit (600) having…a processor unit (604) to process the functionality of the OCU”; and Dunbabin [0021], “the invention may be run…off-board, that is from a remote location that is in communication with the machine via wired or wireless link.”)

Regarding Claim 12, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination further teaches including signaling the clear zone (see at least Dunbabin [0043], “Once the obstacles are identified, safety zones or safety bubbles are dynamically assigned around any obstacles in the situational awareness map to define exclusion areas”; and Pack [0097], “Remote vehicles that are tasked with clearing a route or area can share video, sensor data, and map data among themselves and/or with other remote vehicles and deployed units,” where sharing an identified safety zone corresponds to signaling the clear zone.)

Regarding Claim 13, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination further teaches wherein the plurality of autonomous agents are configured to navigate, at least in part, according to data from the one or more sensors (see at least Pack [0018], “Using autonomous behaviors to perform exploring and mapping functions comprises one or more of using behaviors that use machine vision techniques to identify landmarks and using an IMU to conduct exploration.”)

Regarding Claim 14, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination further teaches wherein each of the plurality of autonomous agents is configured to stay with a person upon identification of a person (see at least Dunbabin [0044], “Workers and vehicles entering the workspace may carry a trackable identification tag for this purpose; allowing the excavator to detect them and the system to gather information about their movements”; and Pack [0096], “Remote vehicles can also perform persistent stare and perimeter surveillance missions, allowing operators to monitor operations from a safe standoff distance,” where tracking an identified worker from a standoff distance corresponds to staying with a person upon identification.)

Regarding Claim 15, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination further teaches wherein the plurality of autonomous agents are configured to identify equipment (see at least Dunbabin [0042], “an object detection system is used to find or identify obstacles such as vehicles, equipment…”)

Regarding Claim 16, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination further teaches wherein the sensors include one or more distance/range sensors, such as LiDAR, sonar, or radar sensors (see at least Pack [0043], “A remote vehicle can include environment sensors such as, for example, a laser range finder,” where the laser range finder corresponds to a LiDAR sensor.)

Regarding Claim 17, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination further teaches wherein the sensors include one or more image sensors, such as cameras (see at least Pack [0043], “A remote vehicle can include environment sensors such as…a stereo vision camera.”)

Regarding Claim 18, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination further teaches wherein the sensors are configured to generate three-dimensional data relating to the excavation (see at least Pack [0058], “The cameras preferably provide differing views of the remote vehicle's environment, to aid in triangulation and creation of the 3D image.”)

Regarding Claim 19, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination further teaches wherein the autonomous agents may include an inertial measurement unit (IMU) configured to sense position and orientation changes of the autonomous agents based on inertial acceleration (see at least Pack [0112], “using autonomous behaviors to sweep an area (e.g., a room, tunnel, corridor, perimeter, path, area, room, road, roadside), including using behaviors that use…an Inertial Measurement Unit (including one or more accelerometers and/or rate gyroscopes)…and/or waypoint recordings of IMU data,” where the remote vehicle uses accelerometer and gyroscope (acceleration and orientation change) data for IMU waypoint recordings (position change).)

Regarding Claim 20, claim 20 recites a system for performing the method of claim 3 and recites substantially similar limitations, and is rejected for substantially similar reasoning as discussed for claim 3 above.

Claims 9 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Pack et al. (US 20120095619 A1) in view of Dunbabin et al. (US 20100223008 A1), and further in view of Sturges et al. (US 20040054434 A1), as cited in the Information Disclosure Statement dated 18 November 2022.
Regarding Claim 9, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination fails to teach wherein the plurality of the autonomous agents are configured to determine clear zones of the excavation recursively. However, Sturges discloses (see Sturges [0002]) automated control for one or more underground mining vehicles that teaches wherein the plurality of the autonomous agents are configured to determine clear zones of the excavation recursively (see at least Sturges Fig. 6, [0042], “range data from a laser scanner allows the ability to calculate the current POSE of the MBC. By using a ‘Line-Finding Algorithm’ (LFA), the two longest straight lines are extracted from the range data using a recursive line splitting technique”; and Sturges [0043], “At step 208, the algorithm proceeds with applying a recursive line-splitting technique to split the selected group of the range data into subgroups…The group is now split into two subgroups, and the same procedure (connecting a line between the first and last points in each subgroup, computing distance from each point, etc.) is further applied to both remaining groups.”) In summary, Modified Pack teaches remote vehicles used for area clearance in tunnels (searching excavations; see the Pack [0087] discussion for claim 3 above), while Sturges teaches autonomous vehicles using recursive algorithms on range-finding data and efficient fitting of groups of vehicles through tunnels based on ranging data.
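The recursive line-splitting procedure Sturges describes (connect a line between a group's first and last points, find the farthest point from that line, split there, and recurse on both subgroups) follows the same pattern as the split step of the Ramer-Douglas-Peucker algorithm. A minimal sketch, with a hypothetical tolerance and sample scan points not taken from the reference:

```python
# Illustrative sketch of recursive line splitting over 2D range points:
# connect the first and last points, find the farthest point from that chord,
# and recurse on both halves until every point lies within a tolerance.
# The tolerance and sample points are hypothetical, not values from Sturges.
import math

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((bx - ax) * (ay - py) - (ax - px) * (by - ay))
    den = math.hypot(bx - ax, by - ay)
    return num / den if den else math.dist(p, a)

def split_lines(points, tol=0.1):
    """Return endpoints of line segments fitted by recursive splitting."""
    a, b = points[0], points[-1]
    dists = [point_line_distance(p, a, b) for p in points]
    i = max(range(len(points)), key=dists.__getitem__)
    if dists[i] <= tol or len(points) < 3:
        return [(a, b)]                       # the whole group fits one line
    # split at the farthest point and recurse on both subgroups
    return split_lines(points[: i + 1], tol) + split_lines(points[i:], tol)

# An L-shaped wall profile: two straight runs meeting at a corner.
scan = [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (3, 2)]
segments = split_lines(scan)   # two segments, meeting at the corner (3, 0)
```

In Sturges's use, the two longest extracted segments stand in for the mine walls, from which the vehicle's pose is computed each control cycle.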
It would have been obvious to one having ordinary skill in the art before the effective filing date of the instant application to modify the remote vehicle missions and systems taught by Modified Pack with the recursive technique for splitting range data into groups taught by Sturges, with a reasonable expectation of success, because both inventions are from the same field of endeavor of control of autonomous vehicles in underground environments. One would be motivated to combine the teachings of Modified Pack and Sturges in order to (see Sturges [0043]) “represent the entire profile of mine walls captured by laser scanners at any instant, and they can be used to determine the current POSE of the MBC in every control cycle,” which (see Sturges [0007]) “increases the ability to accurately determine the position” of the autonomous vehicles, thereby increasing remote vehicle reliability.

Regarding Claim 11, Pack in combination with Dunbabin teaches all the limitations of claim 3 as discussed above. The combination fails to explicitly teach wherein a master autonomous agent of the plurality of autonomous agents controls one or more other slave autonomous agents of the plurality of autonomous agents. However, Sturges discloses (see Sturges [0002]) automated control for one or more underground mining vehicles that teaches wherein a master autonomous agent of the plurality of autonomous agents controls one or more other slave autonomous agents of the plurality of autonomous agents (see at least Sturges [0022], “at least one pair of mobile bridge carrier (MBC) and piggyback conveyor (‘Pigs’) units of a continuous haulage system are automated such that navigation through an underground mine can be accomplished with little or no operator input or intervention. In one embodiment, automation is accomplished through the use of a series of sensors mounted on each MBC and an electronic controller…each MBC can operate (navigate) independently of the other MBC in the continuous miner assembly, it is contemplated that each MBC controller can exchange data and cooperate with the controllers of the other MBCs”; Sturges [0024], “The continuous miner 40 passes mined material to the first MBC 10A. The material is then conveyed to the next MBC 10B”; and Sturges [0030], “if a first MBC 10A is moving forward it will pull a trailing piggyback conveyor 30A forward. The trailing piggyback conveyor 30A will be an advancing piggyback conveyor as to a second MBC 10B,” where cooperative control of the second MBC (slave autonomous agent) is dependent on the positioning control of the first MBC (master).)

It would have been obvious to one having ordinary skill in the art before the effective filing date of the instant application to modify the remote vehicle missions and systems taught by Modified Pack with the continuous haulage system taught by Sturges, with a reasonable expectation of success, because both inventions are from the same field of endeavor of control of autonomous vehicles in underground environments. One would be motivated to combine the teachings of Modified Pack and Sturges because (see Sturges [0041]) it is more efficient to have a local controller for each MBC rather than one centralized controller for all MBCs, thereby increasing the efficiency of remote vehicle communication in a difficult environment.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension-of-time policy set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANISS CHAD whose telephone number is (571) 270-3832. The examiner can normally be reached M-F 8:00 am-4:00 pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, James Trammell, can be reached at 571-272-6712. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ANISS CHAD/
Supervisory Patent Examiner, Art Unit 3662

Prosecution Timeline

Nov 18, 2022
Application Filed
Nov 16, 2024
Non-Final Rejection — §103
Apr 02, 2025
Response Filed
Jan 02, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600355
DRIVING ASSISTANCE APPARATUS AND DRIVING ASSISTANCE APPARATUS PROCESSING METHOD
2y 5m to grant · Granted Apr 14, 2026

Patent 12559137
SYSTEM AND METHOD FOR PROVIDING A SITUATIONAL AWARENESS BASED ADAPTIVE DRIVER VEHICLE INTERFACE
2y 5m to grant · Granted Feb 24, 2026

Patent 12515522
DEVICE, SYSTEM, AND METHOD FOR CONTROLLING A VEHICLE-RELATED DISPLAY INTO AN EXITED OCCUPANT SUPPORT MODE
2y 5m to grant · Granted Jan 06, 2026

Patent 12496862
METHOD AND DEVICE FOR AN ASSISTED OR AUTOMATIC COUPLING OF A TRAILER VEHICLE TO A TOWING VEHICLE, TOWING VEHICLE, ELECTRONIC PROCESSING UNIT, AND COMPUTER PROGRAM
2y 5m to grant · Granted Dec 16, 2025

Patent 12384369
VEHICLE CONTROL DEVICE
2y 5m to grant · Granted Aug 12, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
69%
Grant Probability
98%
With Interview (+28.8%)
3y 10m
Median Time to Grant
Moderate
PTA Risk
Based on 439 resolved cases by this examiner. Grant probability derived from career allow rate.
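The projection figures above can be reproduced from the underlying counts. The sketch below assumes the reported interview lift is additive in percentage points, an assumption, but one consistent with the displayed numbers (a 69% career rate plus a 28.8-point lift is roughly the 98% "with interview" figure).

```python
# Figures from the dashboard: 303 grants out of 439 resolved cases.
granted, resolved = 303, 439
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")   # Career allow rate: 69.0%

# Reported interview lift, read as an additive percentage-point effect.
interview_lift = 0.288
with_interview = allow_rate + interview_lift
print(f"With interview: {with_interview:.1%}")  # With interview: 97.8%
```

The 97.8% result rounds to the 98% shown on the dashboard, supporting the additive reading of the lift.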
