Prosecution Insights
Last updated: April 18, 2026
Application No. 17/272,489

Exploration Of A Robot Deployment Area By An Autonomous Mobile Robot

Final Rejection (§103, §112)
Filed: Nov 23, 2021
Examiner: HEFLIN, HARRISON JAMES RIEL
Art Unit: 3665
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Papst Licensing GmbH & Co. KG
OA Round: 6 (Final)
Grant Probability: 73% (Favorable)
Predicted OA Rounds: 7-8
Predicted Time to Grant: 2y 9m
Grant Probability with Interview: 86%

Examiner Intelligence

Career Allow Rate: 73% (101 granted / 139 resolved), above average (+20.7% vs Tech Center avg)
Interview Lift: +13.0% (moderate), measured across resolved cases with an interview
Typical Timeline: 2y 9m average prosecution; 22 applications currently pending
Career History: 161 total applications across all art units

Statute-Specific Performance

Statute   Rate     vs Tech Center avg
§101      13.2%    -26.8%
§103      47.7%    +7.7%
§102      20.2%    -19.8%
§112      15.4%    -24.6%

Tech Center averages are estimates; based on career data from 139 resolved cases.

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant’s arguments, see the section titled “II. Claim Rejections under 35 U.S.C. 103” starting on page 10 of the reply filed 10/31/2025, have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Regarding the subsection titled “A. Vicenti Teaches away from Renewed Exploration” starting on page 12 of the reply filed 10/31/2025, Applicant’s arguments have been fully considered but they are not persuasive. Applicant argues that because Vicenti discloses a continuous, real-time mapping and localization process, in which the robot persistently updates its pose and coverage status as it navigates the environment, Vicenti fails to teach, suggest, or disclose renewing exploration of unmapped areas in a plurality of distinct exploration runs. Although Vicenti is no longer applied as prior art in the rejections under 35 U.S.C. 103, the newly applied Fong reference (US 2019/0212752 A1) similarly discloses the use of simultaneous localization and mapping (SLAM) techniques. The Examiner disagrees, however, that the use of such techniques teaches away from or otherwise precludes the disclosure of renewing exploration of unmapped areas in a plurality of distinct exploration runs. For example, Fong discloses maintaining a persistent map at a remote computing system by sending updated information at the end of distinct runs of operation, as expressed in the rejections below. The Examiner understands that continuously updating a map, using SLAM techniques for instance, is not mutually exclusive with saving and updating a map in accordance with distinct explorations or runs. See the rejections below.
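The compatibility point the Examiner makes here, that continuous SLAM-style map updating can coexist with saving a map at the end of each distinct run, can be sketched in a few lines. The following Python is an editorial illustration only, with hypothetical names (`PersistentMap`, `exploration_run`); it is not code from Fong or any other cited reference.

```python
# Editorial sketch (hypothetical names, not from any cited reference):
# continuous in-run map updates coexist with per-run saved snapshots.
from dataclasses import dataclass, field

@dataclass
class PersistentMap:
    version: int = 0
    cells: dict = field(default_factory=dict)  # (x, y) -> "free" / "obstacle"

saved = PersistentMap()

def exploration_run(saved_map, observations):
    """One distinct run: update a working copy continuously, save at the end."""
    working = PersistentMap(saved_map.version, dict(saved_map.cells))
    for cell, state in observations:          # continuous (SLAM-style) updating
        if working.cells.get(cell) != state:
            working.cells[cell] = state       # update on inconsistency
    working.version += 1                      # end of run: bump version, persist
    return working

saved = exploration_run(saved, [((0, 0), "free"), ((0, 1), "obstacle")])
saved = exploration_run(saved, [((0, 1), "free"), ((1, 1), "free")])  # renewed run
print(saved.version)        # 2 (two distinct runs)
print(saved.cells[(0, 1)])  # free (updated during the renewed run)
```

The per-run version bump mirrors the "distinct runs" framing even though cell updates inside a run are continuous.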
Specification

This application does not contain an abstract of the disclosure as required by 37 CFR 1.72(b). An abstract on a separate sheet is required. The Examiner notes that although the abstract appears to be present in the pre-grant publication US 2022/0074762 A1, an abstract does not appear to be presently filed.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 3 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. Claim 3 recites the limitation "the received command"; however, there is insufficient antecedent basis for this limitation in the claim. Without sufficient antecedent basis, the Examiner understands that it would not be clear to a person having ordinary skill in the art before the effective filing date of the claimed invention what qualities or properties a received command would need to possess in order to be considered "the received command" as disclosed. Therefore, claim 3 is rejected under 35 U.S.C. 112(b).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 3-6, and 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Fong (US 2019/0212752 A1), in view of Stout (US 2014/0222279 A1) and Nakata (US 2018/0079085 A1).
Regarding claim 1, Fong discloses a method comprising the following: carrying out a first exploration run of a deployment area of an autonomous mobile robot (In paragraph [0086], Fong discloses that when constructing the persistent map, the robots identify landmarks in the environment and place the landmarks on the map, and when the robot navigates in the environment, the robot has feature detectors that can recognize the landmarks to help the robot determine its location); saving a map of the deployment area of the autonomous mobile robot, wherein the map contains orientation data gathered by the autonomous mobile robot during the first exploration run that represents the structure of the environment in the deployment area and wherein the map contains metadata (In paragraph [0084], Fong discloses that the persistent map can show floor types of various regions [metadata], such as whether a region has hard floor or carpet; in paragraph [0090], Fong discloses that for example, the robot 102 can compare the version number [metadata] of the map 110 stored in the storage device 108 of the robot 102 with the version number of the map 120 stored in the storage device 118 of the remote computing system 116; in paragraphs [0094-0098], Fong discloses that as the robot 102 navigates around the environment, e.g., a home, to perform various tasks, e.g., cleaning tasks, the robot 102 uses the camera 114 and other sensors to detect conditions in the home and compares the sensed data with information on the map 110, where if the sensed data is not consistent with the map 110, the robot 102 can update the map 110, and at the end of the cleaning sessions, the robots 102a, 102b each communicates with the remote computing system 116 to provide updated map information, for example, the robots may have updated persistent maps that include data about new objects detected by the robots as they navigate around the home to perform cleaning tasks; in paragraph [0143], Fong discloses that the 
controller 706 (of mobile cleaning robot 102) uses signals from its sensor system to generate a map of the home 300 by tracking and updating positions and orientations of the mobile cleaning robot 102 over time); after ending the first exploration run, carrying out a renewed exploration of at least one part of the deployment area, wherein the robot gathers information referring to the structure of its environment in the deployment area by a sensor during the renewed exploration (In paragraph [0090], Fong discloses that in some implementations, when the mobile robot 102 powers up to start a mission, such as a cleaning mission, the robot 102 contacts the remote computing system 116 to check whether there is a new version of the persistent map, for example, the robot 102 can compare the version number of the map 110 stored in the storage device 108 of the robot 102 with the version number of the map 120 stored in the storage device 118 of the remote computing system 116; in paragraph [0094], Fong discloses that as the robot 102 navigates around the environment, e.g., a home, to perform various tasks, e.g., cleaning tasks, the robot 102 uses the camera 114 and other sensors to detect conditions in the home and compares the sensed data with information on the map 110, where if the sensed data is not consistent with the map 110, the robot 102 can update the map 110; in paragraph [0143], Fong discloses that the controller 706 (of mobile cleaning robot 102) uses signals from its sensor system to generate a map of the home 300 by tracking and updating positions and orientations of the mobile cleaning robot 102 over time); updating the map of the deployment area (In paragraph [0094], Fong discloses that as the robot 102 navigates around the environment, e.g., a home, to perform various tasks, e.g., cleaning tasks, the robot 102 uses the camera 114 and other sensors to detect conditions in the home and compares the sensed data with information on the map 110, where if the 
sensed data is not consistent with the map 110, the robot 102 can update the map 110), and saving the updated map for future deployments of the robot (In paragraphs [0094-0099], Fong discloses that as the robot 102 navigates around the environment, e.g., a home, to perform various tasks, e.g., cleaning tasks, the robot 102 uses the camera 114 and other sensors to detect conditions in the home and compares the sensed data with information on the map 110, where if the sensed data is not consistent with the map 110, the robot 102 can update the map 110, and at the end of the cleaning sessions, the robots 102a, 102b each communicates with the remote computing system 116 to provide updated map information, for example, the robots may have updated persistent maps that include data about new objects detected by the robots as they navigate around the home to perform cleaning tasks, where the map merge module 124 resolves inconsistencies in the persistent map updates, if any, and generates a new official version of the persistent map 120 that includes new map data provided by the robots 102a, 102b), wherein the updating of the map comprises the following: determining changes in the deployment area based on the information referring to the structure of the environment gathered during the renewed exploration and on the orientation data already saved in the map, wherein determining changes in the deployment area includes detecting an unmapped area within the deployment area (In paragraphs [0094-0099], Fong discloses that as the robot 102 navigates around the environment, e.g., a home, to perform various tasks, e.g., cleaning tasks, the robot 102 uses the camera 114 and other sensors to detect conditions in the home and compares the sensed data with information on the map 110, where if the sensed data is not consistent with the map 110, the robot 102 can update the map 110, and at the end of the cleaning sessions, the robots 102a, 102b each communicates with the remote 
computing system 116 to provide updated map information, for example, the robots may have updated persistent maps that include data about new objects detected by the robots as they navigate around the home to perform cleaning tasks, where the map merge module 124 resolves inconsistencies in the persistent map updates, if any, and generates a new official version of the persistent map 120 that includes new map data provided by the robots 102a, 102b; the Examiner understands the new objects detected by the robots to constitute at least “an unmapped area within the deployment area” under its broadest reasonable interpretation in that the newly updated map information about the objects was previously unmapped; in paragraph [0143], Fong discloses that the controller 706 (of mobile cleaning robot 102) uses signals from its sensor system to generate a map of the home 300 by tracking and updating positions and orientations of the mobile cleaning robot 102 over time), and updating the orientation data and the metadata based on the determined changes, wherein updating the metadata includes storing information referring to the structure of the unmapped area gathered during the renewed exploration as part of the metadata, wherein the unmapped area is an area within the deployment area that was not explored by the robot during the first exploration run (In paragraph [0084], Fong discloses that the persistent map can show floor types of various regions [metadata], such as whether a region has hard floor or carpet; in paragraphs [0094-0099], Fong discloses that as the robot 102 navigates around the environment, e.g., a home, to perform various tasks, e.g., cleaning tasks, the robot 102 uses the camera 114 and other sensors to detect conditions in the home and compares the sensed data with information on the map 110, where if the sensed data is not consistent with the map 110, the robot 102 can update the map 110, and at the end of the cleaning sessions, the robots 102a, 102b each 
communicates with the remote computing system 116 to provide updated map information, for example, the robots may have updated persistent maps that include data about new objects detected by the robots as they navigate around the home to perform cleaning tasks, where the map merge module 124 resolves inconsistencies in the persistent map updates, if any, and generates a new official version of the persistent map 120 that includes new map data provided by the robots 102a, 102b; the Examiner understands the new objects detected by the robots to be at least “not explored by the robot during the first exploration run” under its broadest reasonable interpretation in that the newly updated map information about the objects was previously unmapped; in paragraph [0143], Fong discloses that the controller 706 (of mobile cleaning robot 102) uses signals from its sensor system to generate a map of the home 300 by tracking and updating positions and orientations of the mobile cleaning robot 102 over time; see also paragraph [0111] for example where Fong discloses that if the first robot detects a large variance in the reflectivity indicating a dark colored stain, the first robot can flag that, the next time the robot navigates to that area the robot checks that area to see whether there is still a variance, and if there is still variance, then it indicates that the variance is probably permanent, and the first robot adds that information to the persistent map indicating that it is an actual variance). Fong does not explicitly disclose ending the first exploration run when there are no accessible areas left that the autonomous mobile robot has not explored or when the deployment area recorded in the map is completely surrounded by obstacles; and classifying the determined changes as temporary changes or permanent changes. 
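Fong’s power-up version check, as characterized in the mapping above ([0090]), amounts to comparing a local map version against the remote computing system’s version and keeping the newer one. A minimal editorial sketch with a hypothetical `newest_map` helper, not code from the reference:

```python
# Sketch of the power-up map version check described for Fong [0090]
# (hypothetical names; illustrates fetching the newer persistent map).
def newest_map(local_version, local_map, remote_version, remote_map):
    """Keep the local map unless the remote computing system has a newer one."""
    if remote_version > local_version:
        return remote_version, remote_map   # download the updated persistent map
    return local_version, local_map

version, m = newest_map(3, {"room": "old"}, 5, {"room": "new"})
print(version, m["room"])   # 5 new
```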
However, Stout teaches ending the first exploration run when there are no accessible areas left that the autonomous mobile robot has not explored or when the deployment area recorded in the map is completely surrounded by obstacles (In paragraph [0038], Stout teaches that "Frontier" refers to a boundary between an explored and unexplored portion of a surface; in paragraphs [0048-0050], Stout teaches a process 300 which analyzes the map or otherwise selects an un-followed edge or border (perimeter) to cover or navigate along, navigates along that perimeter, updates the map as it does so, and determines if continued perimeter covering is necessary, where if process 300 concludes that there are no perimeter frontiers in the map, it may return with condition "no perimeter frontiers", as at 157, and the mobile device may then cease its coverage control routine, as at 160). Stout is considered to be analogous to the claimed invention in that they both pertain to ensuring that all unexplored accessible areas or frontiers are explored and mapped. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Stout with the method as disclosed by Fong where “Some embodiments may invoke routine 300 to help obtain more complete coverage through discovery of previously unexplored portions of the surface” as suggested by Stout in paragraph [0090], for example. Doing so may be advantageous in that a more complete map may be obtained, increasing the accuracy of the map relative to the environment and thereby improving the complete-coverage behaviors of the mobile robot, for example.

The combination of Fong and Stout does not explicitly disclose classifying the determined changes as temporary changes or permanent changes.
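Stout’s frontier-based stopping condition, as characterized above, can be sketched on a hypothetical occupancy grid: the run ends when no unknown cell borders the explored free space. The names and grid representation below are illustrative assumptions, not code from Stout:

```python
# Sketch of the frontier-termination idea attributed to Stout's process 300
# (hypothetical grid representation; not code from the reference).
def unexplored_frontiers(grid):
    """Return unknown cells adjacent to explored free cells ("frontiers")."""
    frontiers = set()
    for (x, y), state in grid.items():
        if state != "free":
            continue
        for neighbor in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if neighbor not in grid:          # unknown neighbor = frontier
                frontiers.add(neighbor)
    return frontiers

def run_finished(grid):
    """End the exploration run when no accessible unexplored area remains,
    i.e. the mapped free area is completely surrounded by mapped cells."""
    return not unexplored_frontiers(grid)

grid = {(0, 0): "free", (1, 0): "obstacle", (-1, 0): "obstacle",
        (0, 1): "obstacle", (0, -1): "obstacle"}
print(run_finished(grid))   # True: the free area is fully enclosed
```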
However, Nakata teaches classifying the determined changes as temporary changes or permanent changes (In paragraph [0094], Nakata teaches that if the time difference between the timings for which the change over time related to the shape of the moving object movement and environmental change information is greater than a threshold value, then the relevant moving object movement and environmental change information is not a temporary environmental change due to a disturbance, but instead may be considered to be a semi-permanent or permanent change of the environment due to the movement of a stationary object or the like, and the changed portion of the environment is handled as a new map). Nakata is considered to be analogous to the claimed invention in that they both pertain to determining if an environmental change is temporary or permanent. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Nakata with the method as disclosed by the combination of Fong and Stout, where doing so may advantageously increase the accuracy of the information in the map, for example.

Regarding claim 3, Fong further discloses wherein the received command contains information that specifies the part of the deployment area that is to be newly explored, wherein the part to be newly explored may be the entire deployment area, a previously defined subarea, or an area specified by the user (In paragraphs [0081-0082], Fong discloses that the robot starts its mission with a user-defined set of rooms, and, for example, there may be a “keep-out” zone on the map 110 so that the robot 102 needs to keep out of the area, where the map merge module 124 is configured to maintain the keep-out zone in the same place after updating the map 120).
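The time-threshold test the Examiner draws from Nakata [0094] reduces to comparing how long a change has persisted against a threshold. A minimal sketch, with an assumed one-day threshold and hypothetical names (neither value nor names come from the reference):

```python
# Sketch of time-threshold change classification along the lines the Examiner
# attributes to Nakata [0094]; names and threshold are illustrative assumptions.
PERMANENT_THRESHOLD_S = 24 * 3600   # assumed: changes persisting a day are permanent

def classify_change(first_seen_s, last_seen_s, threshold_s=PERMANENT_THRESHOLD_S):
    """Classify an observed map change by how long it has persisted."""
    persisted = last_seen_s - first_seen_s
    return "permanent" if persisted > threshold_s else "temporary"

print(classify_change(0, 3600))        # temporary: seen for one hour
print(classify_change(0, 3 * 86400))   # permanent: persisted for three days
```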
Regarding claim 4, Fong further discloses wherein the determination of changes in the deployment area includes the compilation of a new map and the determination of changes by comparing the new map with the saved map (In paragraphs [0094-0099], Fong discloses that as the robot 102 navigates around the environment, e.g., a home, to perform various tasks, e.g., cleaning tasks, the robot 102 uses the camera 114 and other sensors to detect conditions in the home and compares the sensed data with information on the map 110, where if the sensed data is not consistent with the map 110, the robot 102 can update the map 110, and at the end of the cleaning sessions, the robots 102a, 102b each communicates with the remote computing system 116 to provide updated map information, for example, the robots may have updated persistent maps that include data about new objects detected by the robots as they navigate around the home to perform cleaning tasks, where the map merge module 124 resolves inconsistencies in the persistent map updates, if any, and generates a new official version of the persistent map 120 that includes new map data provided by the robots 102a, 102b), wherein metadata from the saved map is at least partially carried over into the new map (In paragraph [0084], Fong discloses that the persistent map can show floor types of various regions [metadata], such as whether a region has hard floor or carpet; the Examiner understands that any portion of the map which is not updated is carried over into the new map); wherein, after completion of the renewed exploration, the saved map is replaced by the new map (In paragraphs [0094-0099], Fong discloses that as the robot 102 navigates around the environment, e.g., a home, to perform various tasks, e.g., cleaning tasks, the robot 102 uses the camera 114 and other sensors to detect conditions in the home and compares the sensed data with information on the map 110, where if the sensed data is not consistent with the map 110, 
the robot 102 can update the map 110, and at the end of the cleaning sessions, the robots 102a, 102b each communicates with the remote computing system 116 to provide updated map information, for example, the robots may have updated persistent maps that include data about new objects detected by the robots as they navigate around the home to perform cleaning tasks, where the map merge module 124 resolves inconsistencies in the persistent map updates, if any, and generates a new official version of the persistent map 120 that includes new map data provided by the robots 102a, 102b).

Regarding claim 5, Fong further discloses wherein the updating of the orientation data and of the metadata includes: compiling a temporary work copy of the saved map and entering identified changes into the work copy (In paragraphs [0094-0099], Fong discloses that as the robot 102 navigates around the environment, e.g., a home, to perform various tasks, e.g., cleaning tasks, the robot 102 uses the camera 114 and other sensors to detect conditions in the home and compares the sensed data with information on the map 110, where if the sensed data is not consistent with the map 110, the robot 102 can update the map 110, and at the end of the cleaning sessions, the robots 102a, 102b each communicates with the remote computing system 116 to provide updated map information, for example, the robots may have updated persistent maps that include data about new objects detected by the robots as they navigate around the home to perform cleaning tasks, where the map merge module 124 resolves inconsistencies in the persistent map updates, if any, and generates a new official version of the persistent map 120 that includes new map data provided by the robots 102a, 102b).
Regarding claim 6, Stout further teaches wherein the renewed exploration of at least one part of the deployment area includes: navigating through the deployment area until the part to be explored has been completely covered by a coverage area of the sensor, and the map of the part to be explored encompasses an area enclosed at least by obstacles and by parts of the deployment area that are not to be explored (In paragraph [0038], Stout teaches that "Frontier" refers to a boundary between an explored and unexplored portion of a surface; in paragraphs [0048-0050], Stout teaches a process 300 which analyses the map or otherwise selects an un-followed edge or border (perimeter) to cover or navigate along, navigates along that perimeter, updates the map as it does so, and determines if continued perimeter covering is necessary, where if process 300 concludes that there are no perimeter frontiers in the map, it may return with condition "no perimeter frontiers", as at 157, and the mobile device may then cease its coverage control routine, as at 160). 
Regarding claim 9, Fong further discloses wherein the updating of the metadata further includes one or more of the following: adaptation of an area in which a service task is to be carried out by the robot (In paragraphs [0082-0083], Fong discloses that there may be a “keep-out” zone on the map 110 so that the robot 102 needs to keep out of the area, where the map merge module 124 is configured to maintain the keep-out zone in the same place after updating the map 120 using “anchor points,” such as corners of a room, that helps place the occupancy grid over it, where the map merge module 124 can choose the four corners of the keep-out zone as anchor points, and at the end of each mission the keep-out zone will be placed based on the anchor points, and additionally that the map merge module 124 honors the room labels provided by the user, where if the user 10 labeled a room as the “Living Room,” the map merge module 124 will try to find out which region in the new occupancy grid is the living room and associate it with the label “Living Room”); adaptation of the size, shape and/or number of subareas in the deployment area (In paragraphs [0082-0083], Fong discloses that there may be a “keep-out” zone on the map 110 so that the robot 102 needs to keep out of the area, where the map merge module 124 is configured to maintain the keep-out zone in the same place after updating the map 120 using “anchor points,” such as corners of a room, that helps place the occupancy grid over it, where the map merge module 124 can choose the four corners of the keep-out zone as anchor points, and at the end of each mission the keep-out zone will be placed based on the anchor points, and additionally that the map merge module 124 honors the room labels provided by the user, where if the user 10 labeled a room as the “Living Room,” the map merge module 124 will try to find out which region in the new occupancy grid is the living room and associate it with the label “Living Room”; see 
also paragraph [0079] where Fong discloses that, for example, the robot 102 may detect a long table in the middle of a room, thinks that the long table is a wall, and determines that there are two rooms on two sides of the wall, where the user 10 may revise the map 110 to show that there is actually only one room, and the wall is actually a long table); entering information regarding a floor covering on a newly identified surface based on adjacent surfaces (In paragraph [0084], Fong discloses that the persistent map can show floor types of various regions [metadata], such as whether a region has hard floor or carpet); or moving danger zones linked to obstacles or exclusion areas (In paragraphs [0082-0083], Fong discloses that there may be a “keep-out” zone on the map 110 so that the robot 102 needs to keep out of the area, where the map merge module 124 is configured to maintain the keep-out zone in the same place after updating the map 120 using “anchor points,” such as corners of a room, that helps place the occupancy grid over it, where the map merge module 124 can choose the four corners of the keep-out zone as anchor points, and at the end of each mission the keep-out zone will be placed based on the anchor points). 
Regarding claim 10, Fong further discloses wherein updating the orientation data includes: localizing the robot in the saved map (In paragraph [0143], Fong discloses that the controller 706 (of mobile cleaning robot 102) uses signals from its sensor system to generate a map of the home 300 by tracking and updating positions and orientations of the mobile cleaning robot 102 over time, where the mapping sensors include, for example, simultaneous localization and mapping (SLAM) sensors); and adapting the orientation data saved in the map based on the data gathered by the sensor regarding the structure in the environment of the deployment area (In paragraph [0143], Fong discloses that the controller 706 (of mobile cleaning robot 102) uses signals from its sensor system to generate a map of the home 300 by tracking and updating positions and orientations of the mobile cleaning robot 102 over time, where the mapping sensors include, for example, simultaneous localization and mapping (SLAM) sensors).

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Fong (US 2019/0212752 A1), Stout (US 2014/0222279 A1), and Nakata (US 2018/0079085 A1), in view of Furuta (US 2020/0245837 A1).

Regarding claim 2, Fong further discloses carrying out a service task by the robot, wherein the service task is, in particular, one of the following: a cleaning task (In paragraph [0090], Fong discloses that in some implementations, when the mobile robot 102 powers up to start a mission, such as a cleaning mission, the robot 102 contacts the remote computing system 116 to check whether there is a new version of the persistent map). The combination of Fong, Stout, and Nakata does not explicitly disclose wherein the orientation data contained in the saved map remains unchanged.
However, Furuta teaches carrying out a service task by the robot, wherein the orientation data contained in the saved map remains unchanged (In paragraphs [0062-0063], Furuta teaches that in the autonomous cleaning mode, the surrounding information generator 45 generates surrounding information, and compares the surrounding information with surrounding information included in the map of the cleaning target space, and, if the surrounding information included in the map is different from the surrounding information generated by the surrounding information generator 45 during the cleaning mission in the autonomous cleaning mode, the autonomous vacuum cleaner 1 displays an enquiry about whether or not to update the map of the cleaning target space together with a message to the effect that the surrounding information, that is, the target object in the cleaning target space is different from the map, for the user after the end of cleaning; the Examiner understands that the saved map will remain unchanged until the user responds to the enquiry to update the map after the end of cleaning), wherein the service task is, in particular, one of the following: a cleaning task (In paragraphs [0062-0063], Furuta teaches an autonomous cleaning mode). Furuta is considered to be analogous to the claimed invention in that they both pertain to carrying out a service task by an autonomous robot with a saved map. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Furuta with the method as disclosed by the combination of Fong, Stout, and Nakata, where allowing a user to decide whether or not to update the map after the task has been completed may be advantageous in affording more agency to the user in controlling operation of the robot.
For example, “a case where it is considered that the state of a target object has been changed due to new installation, movement, or removal of an installed object such as furniture, can be exemplified as the user's motivation to update the map” and “a case where it is considered that a non-fixed object such as an everyday item that was temporarily placed has just been detected as a target object and it is not necessary to change the map can be exemplified as a motivation to not update the map” as suggested by Furuta in paragraph [0063], where allowing the user to choose when to update the map allows the user to operate the robot according to their best contextual judgement.

Claims 7-8 are rejected under 35 U.S.C. 103 as being unpatentable over Fong (US 2019/0212752 A1), Stout (US 2014/0222279 A1), and Nakata (US 2018/0079085 A1), in view of Krishnaswamy (US 10,330,480 B1).

Regarding claim 7, the combination of Fong, Stout, and Nakata does not explicitly disclose wherein structures in areas in which changes to the deployment area have been detected are scanned with the sensor with a higher degree of accuracy than in other areas. However, Krishnaswamy teaches wherein structures in areas in which changes to the deployment area have been detected are scanned with the sensor with a higher degree of accuracy than in other areas (From column 3 line 50 to column 4 line 31, Krishnaswamy teaches that as changes in the workspace are observed based on low-resolution visual information, the on-demand cameras can be sent to corresponding locations to “take a closer look” at the changes (e.g., to gather higher resolution visual information) [structures in areas in which changes to the deployment area have been detected are scanned with the sensor with a higher degree of accuracy than in other areas]).
Krishnaswamy is considered to be analogous to the claimed invention in that both pertain to employing detection with a higher degree of accuracy in an area in which changes have been detected. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Krishnaswamy with the method as disclosed by the combination of Fong, Stout, and Nakata, where doing so “may result in the on-demand cameras gathering high resolution visual information at the corresponding locations and from different locations than was previously maintained” and where “this visual information can be used to update the global view of the map,” as suggested by Krishnaswamy from column 3, line 50 to column 4, line 31. This may yield the advantage of increasing the accuracy of the information recorded in the map, since additional, higher-quality information is used to create it.
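The selective-scanning behavior attributed to Krishnaswamy above can be illustrated with a minimal sketch. The grid model and every name in it (e.g. `scan_resolution`, the resolution constants) are hypothetical illustrations, not drawn from Krishnaswamy, the claims, or any cited reference:

```python
# Hypothetical sketch: map cells where changes were detected are assigned
# a higher scan resolution (a "closer look") than unchanged cells.

LOW_RES = 1   # samples per cell in unchanged areas
HIGH_RES = 4  # samples per cell where a change was detected

def scan_resolution(changed_cells, all_cells):
    """Map every cell to a scan resolution: changed cells get the
    high-resolution pass, all other cells keep the low-resolution default."""
    return {cell: (HIGH_RES if cell in changed_cells else LOW_RES)
            for cell in all_cells}

# Example: a 3x3 grid in which two cells show changes (e.g. moved furniture).
cells = [(x, y) for x in range(3) for y in range(3)]
changed = {(1, 1), (2, 0)}
plan = scan_resolution(changed, cells)
```

Under this sketch, only the two changed cells receive the longer, higher-resolution scan, which mirrors the claim-8 reading that the deployed sensor necessarily spends more time in those areas.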
Regarding claim 8, Krishnaswamy further teaches wherein the degree of accuracy of the detection is increased by one or more of the following: increasing the time spent in an area, increasing the scanning duration of the structures (From column 3, line 50 to column 4, line 31, Krishnaswamy teaches that as changes in the workspace are observed based on low-resolution visual information, the on-demand cameras can be sent to corresponding locations to “take a closer look” at the changes (e.g., to gather higher-resolution visual information); the Examiner understands sending on-demand cameras to the areas in which changes have been detected to be at least an example of increasing the time spent in an area or increasing the scanning duration of the structures under its broadest reasonable interpretation, where the deployed sensors must spend more time in the area to capture sensor data of the environment to operate as disclosed).

Claim 39 is rejected under 35 U.S.C. 103 as being unpatentable over Fong (US 2019/0212752 A1), Stout (US 2014/0222279 A1), and Nakata (US 2018/0079085 A1), in view of Bailey (US 8,364,309 B1).

Regarding claim 39, the combination of Fong, Stout, and Nakata does not explicitly disclose wherein classifying the determined changes as temporary changes or permanent changes includes classifying the determined changes as permanent changes based on instructions of the user.
However, Bailey teaches wherein classifying the determined changes as temporary changes or permanent changes includes classifying the determined changes as permanent changes based on instructions of the user (In column 12, lines 13-31, Bailey teaches that the user may update or correct the revised floor plan through the drawing unit 332 of the robot control utility, and, in an embodiment, the robot 102 is configured not to delete any information on the floor plan; instead, changes to the floor plan proposed or suggested by the robot 102 as a result of the discovery phase are added to the floor plan and must be confirmed by the user to be made permanent).

Bailey is considered to be analogous to the claimed invention in that both pertain to classifying permanent changes to a map detected by a robot based on user indication. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Bailey with the method as disclosed by the combination of Fong, Stout, and Nakata, where doing so allows the user to dictate which changes should be permanent, advantageously increasing the contextual accuracy of the map and the level of user control, for example.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Harrison Heflin, whose telephone number is (571) 272-5629. The examiner can normally be reached Monday - Friday, 1:00PM - 10:00PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Hunter Lonsberry, can be reached at 571-272-7298.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HARRISON HEFLIN/
Examiner, Art Unit 3665

/HUNTER B LONSBERRY/
Supervisory Patent Examiner, Art Unit 3665

Prosecution Timeline

Nov 23, 2021
Application Filed
Sep 11, 2023
Non-Final Rejection — §103, §112
Mar 15, 2024
Response Filed
Apr 15, 2024
Final Rejection — §103, §112
Aug 19, 2024
Request for Continued Examination
Aug 20, 2024
Response after Non-Final Action
Oct 15, 2024
Non-Final Rejection — §103, §112
Apr 21, 2025
Response Filed
May 09, 2025
Final Rejection — §103, §112
Aug 28, 2025
Interview Requested
Oct 28, 2025
Examiner Interview Summary
Oct 31, 2025
Request for Continued Examination
Nov 06, 2025
Response after Non-Final Action
Nov 20, 2025
Non-Final Rejection — §103, §112
Feb 09, 2026
Interview Requested
Feb 18, 2026
Examiner Interview Summary
Feb 18, 2026
Applicant Interview (Telephonic)
Mar 02, 2026
Response Filed
Apr 09, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596369
CONTROL SYSTEM, MOBILE OBJECT, CONTROL METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Apr 07, 2026
Patent 12566443
ROBOT TRAVELING IN SPECIFIC SPACE AND CONTROL METHOD THEREOF
2y 5m to grant Granted Mar 03, 2026
Patent 12559894
SYSTEMS AND METHODS TO APPLY SURFACE TREATMENTS
2y 5m to grant Granted Feb 24, 2026
Patent 12541202
UNMANNED VEHICLE AND INFORMATION PROCESSING METHOD
2y 5m to grant Granted Feb 03, 2026
Patent 12497275
APPARATUS FOR MOVING A PAYLOAD
2y 5m to grant Granted Dec 16, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

7-8
Expected OA Rounds
73%
Grant Probability
86%
With Interview (+13.0%)
2y 9m
Median Time to Grant
High
PTA Risk
Based on 139 resolved cases by this examiner. Grant probability derived from career allow rate.
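As a quick check on the figures above, the "With Interview" probability appears to be the base grant probability plus the interview lift applied as additive percentage points. This is an assumption about how the dashboard combines the two numbers, not a documented formula:

```python
# Assumption: the interview lift is added to the base grant probability
# as percentage points, then rounded to the whole percent displayed.
base_grant = 0.73      # career allow rate (73%)
interview_lift = 0.13  # interview lift (+13.0%)

with_interview_pct = round((base_grant + interview_lift) * 100)
print(with_interview_pct)  # matches the displayed 86%
```

If the lift were instead applied multiplicatively (0.73 × 1.13 ≈ 82%), the displayed 86% would not be reproduced, which supports the additive reading.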
