Prosecution Insights
Last updated: April 17, 2026
Application No. 18/591,059

AUTONOMOUS SIGNAL BOOSTING ROBOTIC DEVICE

Final Rejection — §102, §103
Filed: Feb 29, 2024
Examiner: CUMBESS, YOLANDA RENEE
Art Unit: 3651
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: unknown
OA Round: 2 (Final)

Grant Probability: 87% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 5m
Grant Probability with Interview: 96%

Examiner Intelligence

Career Allow Rate: 87% — above average (970 granted / 1113 resolved; +35.2% vs TC avg)
Interview Lift: +8.9% — moderate (resolved cases with vs. without interview)
Typical Timeline: 2y 5m average prosecution
Career History: 1138 total applications across all art units; 25 currently pending

Statute-Specific Performance

§101: 1.2% (-38.8% vs TC avg)
§103: 42.3% (+2.3% vs TC avg)
§102: 26.7% (-13.3% vs TC avg)
§112: 29.1% (-10.9% vs TC avg)
Tech Center average is an estimate • Based on career data from 1113 resolved cases
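Each of the four "vs TC avg" deltas above is consistent with a single Tech Center baseline of 40.0%, so they appear to be simple percentage-point differences. A minimal sketch reproducing the displayed figures (the 40% baseline is inferred from the numbers themselves, not stated by the tool):

```python
# Inferred baseline: all four deltas reconcile against the same 40.0% TC average.
tc_avg = 0.400
rates = {"101": 0.012, "103": 0.423, "102": 0.267, "112": 0.291}

# Delta in percentage points vs. the TC average, rounded as displayed.
deltas = {s: round((r - tc_avg) * 100, 1) for s, r in rates.items()}
print(deltas)  # {'101': -38.8, '103': 2.3, '102': -13.3, '112': -10.9}
```

The takeaway for strategy: this examiner's §103 rejection rate tracks the Tech Center average, while §101 and §102 rates sit well below it.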

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments with respect to claims 1-20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-3, 10, 12-15, 18, and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Artes et al. (US PG Pub. 2018/0292827). Relative to claims 1-3, 10, and 12-15, Artes discloses:

Claim 1) A method for operating a robot (1)(Fig. 1), comprising: capturing, with at least one sensor (130)(Fig. 9) disposed on the robot (1), data of an environment and data indicative of movement of the robot (1)(Para. 0028); generating or updating, with a processor (102)(Fig. 9) of the robot (1), a map of the environment based on at least a portion of the captured data (Para. 0028; 0085); inferring, with the processor of the robot (1), a current location of the robot (1)(Para. 0085, robot uses SLAM technology; see also Para. 0030, see "location of the robot in its area"); and actuating, with the processor (102) of the robot (1), the robot (1) to execute a first task (Para. 0085); an application of a communication device is used by a user to schedule the first task (Para. 0029, see mobile device having a suitable app providing a graphical user interface);

Claim 2) the task comprises a transportation task of at least one item from a first location to a second location (Para. 0028, tasks can include transport of objects within a building); and the application is used to schedule delivery of the at least one item to the second location (Para. 0030);

Claim 3) the robot (1) further comprises a user interface for choosing functions, adjusting settings, and scheduling tasks (Para. 0029; 0030; 0035; 0076-0077);

Claim 10) the application displays an image or a video captured by an image sensor (130) disposed on the robot (1)(Para. 0073, 0068, 0061, robot recognizes objects using image processing or cameras and shows images with mobile device graphical interface);

Claim 12) the application is further used by the user to: instruct the robot (1) to perform a particular task in an area of the environment at a day and time and provide robot settings (Para. 0030; 0061; 0072; 0077); and the application displays information relating to the robot (Para. 0029, information is displayed through graphical interface of mobile device 2);

Claim 13) actuating, with the processor (102) of the robot (1), the robot (1) to execute a particular action upon detecting the user entering the environment (Para. 0012; 0072); and the processor (102) of the robot (1) detects the user entering the environment based on a location of the communication device (Para. 0067);

Claim 14) the robot (1) generates a noise or illuminates lights to provide a notification to the user (Para. 0058, button blinks or lights up); and

Claim 15) the processor (102) of the robot (1) is configured to actuate the robot (1) to respond to environmental characteristics comprising at least: terrain, obstacle locations and obstacle density, elevation, and terrain and elevation transitions (Para. 0085, for instance, obstacle avoidance).
Relative to claims 18 and 20, the disclosure of Artes includes:

Claim 18) A robot (1) for transporting items (Para. 0028), comprising: a chassis (inherently included, see base of robot on top of wheels)(Para. 0084); a set of wheels coupled to the chassis (see wheels on robot, Fig. 1); a control system to actuate movement of the set of wheels (see drive module 101)(Para. 0084)(Fig. 9); a power supply (Para. 0084)(Fig. 9); at least one sensor (130)(Fig. 9); a processor (102)(Fig. 9); and a tangible, non-transitory, machine readable medium storing instructions (inherently included with processing module 102) that when executed by the processor (102) of the robot (1) effectuate operations (Para. 0083) comprising: capturing, with the at least one sensor (130) disposed on the robot (1), data of an environment and data indicative of movement of the robot (1)(Para. 0028; 0086); generating or updating, with the processor (102) of the robot (1), a map of the environment based on at least a portion of the captured data (Para. 0028); inferring, with the processor (102) of the robot (1), a current location of the robot (1)(Para. 0085); and actuating, with the processor (102) of the robot (1), the robot (1) to execute a first task (Para. 0085); an application of a communication device is used by a user to schedule the first task (Para. 0029); and

Claim 20) the operations further comprise: actuating, with the processor (102) of the robot (1), the robot (1) to execute a particular action upon detecting the user entering the environment (Para. 0012); and the processor (102) of the robot (1) detects the user entering the environment based on a location of the communication device (2)(Para. 0067, geo-coordinates of mobile device 2 may be determined by GPS and other methods).

Claims 1, 4, and 9 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Abhyanker (US PG Pub. 2014/0136414).
Relative to claims 1, 4, and 9, Abhyanker discloses:

Claim 1) A method for operating a robot (100)(Fig. 1A), comprising: capturing, with at least one sensor (102, 240)(Fig. 1A, 2) disposed on the robot (100), data of an environment and data indicative of movement of the robot (100)(Para. 0125); generating or updating, with a processor (202)(Fig. 2) of the robot (100), a map of the environment based on at least a portion of the captured data (Para. 0132); inferring, with the processor (202) of the robot, a current location of the robot (100)(Para. 0118); and actuating, with the processor (202) of the robot (100), the robot (100) to execute a first task (Para. 0132; 0183, vehicle performs deliveries); an application of a communication device (user interface) is used by a user to schedule the first task (Para. 0213; 0497-0498, user schedules pick-ups and deliveries);

Claim 4) the at least one item comprises items from a grocery store, items from a retail store, or a pizza (Para. 0476); and

Claim 9) actuating, with the processor of the robot (100), the robot (100) to navigate to a particular location within the environment based on a voice command from the user, the voice command comprising an instruction for the robot to navigate to the particular location (Para. 0129; 0132).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 5-8 are rejected under 35 U.S.C. 103 as being unpatentable over Artes in view of Angle et al. (US Patent No. 10,391,638). Relative to claims 5-8, Artes discloses all claim limitations mentioned above, but does not expressly disclose:

Claim 5) determining, with the processor of the robot, a type of a second task to execute and a time and a day to execute the second task based on a history of types of tasks performed and times and days the types of tasks were performed by the robot; and actuating, with the processor of the robot, the robot to execute the second task at the time and the day determined;

Claim 6) the processor of the robot actuates the robot to execute the second task at the time and the day determined only after receiving confirmation by the user; and the user provides the confirmation using the application;

Claim 7) the processor of the robot learns the type of the second task to execute and the time and the day to execute the second task using reinforcement learning; and

Claim 8) actuating, with the processor of the robot, the robot to execute a particular action when the communication device is within a particular range from the robot.

Angle teaches:

Claim 5) determining, with the processor (221)(Fig. 3) of the robot (mobile robot 200), a type of a second task to execute and a time and a day to execute the second task based on a history of types of tasks performed and times and days the types of tasks were performed by the robot (200) (the robot 200 may move from cleaning a first room, i.e., a first task, to clean another room, i.e., a second task; robot 200 is also capable of changing an end effector 242 to change cleaning tasks from a first task, such as sweeping, to a second task, such as vacuuming, mopping, etc., Col. 29, lines 65-67; Col. 30, lines 1-8; tasks may be executed based on history of types, Col. 38, lines 30-39, see "In embodiments, or in an invention disclosed herein, the mobile robot 200 monitors and learns the occupancy schedule over a period or periods of time and sets the target completion time based on the learned occupancy schedule"); and actuating, with the processor (221)(Fig. 3) of the robot (200), the robot to execute the second task at the time and the day determined (Col. 38, lines 28-32);

Claim 6) the processor (221) of the robot (200) actuates the robot to execute the second task at the time and the day determined only after receiving confirmation by the user (the user is presented the option to execute a second recommended task for a specific time/day based on collected data; the user can cancel the task if desired to avoid execution; it is inherent that the user may also confirm (as opposed to cancel) the task for the robot 200 to perform the cleaning task, Col. 44, lines 15-25); and the user provides the confirmation using the application (Col. 44, lines 15-25);

Claim 7) the processor (221) of the robot (200) learns the type of the second task to execute and the time and the day to execute the second task using reinforcement learning (Col. 12, lines 1-9; Col. 43, lines 35-40; Col. 43, line 63 to Col. 44, line 2, the system uses collected data, such as identified traffic patterns or whether the household is unoccupied, to create or determine schedules); and

Claim 8) the robot (200) executes a particular action when the communication device is within a particular range from the robot (Col. 26, lines 45-65, when the occupant is within a certain zone, the robot may alter its action and/or move to a different zone).
Angle teaches the: determining of a type of a second task to execute and a time and a day to execute the second task based on a history of types of tasks; actuating the robot to execute the second task at the time and the day determined only after receiving confirmation by the user; learning the type of the second task to execute and the time and the day to execute; and actuating the robot to execute a particular action when the communication device is within a particular range, described above, for the purpose of providing a system and method for a mobile robot that can perform tasks in a variable environment, engage in predictive and planning activities, and escape challenging situations in a household (Col. 2, lines 4-17; Col. 1, lines 29-30).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Artes with the: determining of a type of a second task to execute and a time and a day to execute it; actuating the robot to execute the second task at the time and the day only after receiving confirmation by the user; learning the type of the second task to execute and the time and the day to execute; and actuating the robot to execute a particular action when the communication device is within a particular range, described above, as taught in Angle, for the purpose of providing a system and method for a mobile robot that can perform tasks in a variable environment, engage in predictive and planning activities, and escape challenging situations in a household.

Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Artes in view of Angle et al. (US Patent No. 10,391,638).
Relative to claim 19, Artes discloses all claim limitations mentioned above, but does not expressly disclose: determining, with the processor of the robot, a type of a second task to execute and a time and a day to execute the second task based on a history of types of tasks performed and times and days the types of tasks were performed by the robot; and actuating, with the processor of the robot, the robot to execute the second task at the time and the day determined.

Angle teaches: determining, with the processor (221)(Fig. 3) of the robot (mobile robot 200), a type of a second task to execute and a time and a day to execute the second task based on a history of types of tasks performed and times and days the types of tasks were performed by the robot (the robot 200 may move from cleaning a first room, i.e., a first task, to clean another room, i.e., a second task; robot 200 is also capable of changing an end effector 242 to change cleaning tasks from a first task, such as sweeping, to a second task, such as vacuuming, mopping, etc., Col. 29, lines 65-67; Col. 30, lines 1-8; tasks may be executed based on history of types, Col. 38, lines 30-39, see "In embodiments, or in an invention disclosed herein, the mobile robot 200 monitors and learns the occupancy schedule over a period or periods of time and sets the target completion time based on the learned occupancy schedule"); and actuating, with the processor (221)(Fig. 3) of the robot (200), the robot to execute the second task at the time and the day determined (Col. 38, lines 28-32), for the purpose of providing a system and method for a mobile robot that can perform tasks in a variable environment, engage in predictive and planning activities, and escape challenging situations in a household (Col. 2, lines 4-17; Col. 1, lines 29-30).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Artes with the determining of a type of a second task to execute and a time and a day to execute it, described above, as taught in Angle, for the purpose of providing a system and method for a mobile robot that can perform tasks in a variable environment, engage in predictive and planning activities, and escape challenging situations in a household.

Claims 16 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Artes in view of Choi et al. (US PG Pub. 2018/0210445). Relative to claims 16 and 17, Artes discloses all claim limitations mentioned above, but does not expressly disclose:

Claim 16) inferring, with the processor of the robot, an item type of an item using image recognition; and actuating, with the processor of the robot, the robot to perform an action based on the item type of the item; and

Claim 17) inferring, with the processor of the robot, characteristics of at least one item, the characteristics comprising one of: size, shape, fragility, bulkiness, weight, and stability.

Choi teaches:

Claim 16) inferring, with the processor (included in control unit 440)(Para. 0068) of the robot (100), an item type of an item using image recognition (Para. 0150); and actuating, with the processor (included in Ref. 440) of the robot (100), the robot to perform an action based on the item type of the item (Para. 0152); and

Claim 17) inferring, with the processor of the robot (100), characteristics of at least one item, the characteristics comprising one of: size, shape, fragility, bulkiness, weight, and stability (Para. 0152, the obstacle's height is determined).
Choi teaches the: inferring of an item type of an item using image recognition; and inferring of characteristics of at least one item, as mentioned above, for the purpose of providing a moving robot and control method with an escape algorithm for the moving robot to avoid certain obstacles (Para. 0002; 0007).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Artes with the inferring of an item type of an item using image recognition and the inferring of characteristics of at least one item mentioned above, as taught in Choi, for the purpose of providing a moving robot and control method with an escape algorithm for the moving robot to avoid certain obstacles.

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Artes in view of Kwak et al. (US PG Pub. 2018/0133895). Relative to claim 11, Artes discloses all claim limitations mentioned above, but does not expressly disclose: at least one characteristic of an object captured in the image is provided by the user using the application.

Kwak teaches: at least one characteristic of an object captured in the image is provided by the user using the application (Para. 0110; 0181-0182), for the purpose of providing a mobile robot system and technique of receiving information about a moving space, and performing deep learning based on that information, that can actively receive or reflect actual information required for traveling to improve traveling efficiency (Para. 0002; 0010-0011).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Artes with the characteristic of an object captured in the image being provided by the user, as taught in Kwak, for the purpose of providing a mobile robot system and technique of receiving information about a moving space, and performing deep learning based on that information, that can actively receive or reflect actual information required for traveling to improve traveling efficiency.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to YOLANDA RENEE CUMBESS, whose telephone number is (571) 270-5527. The examiner can normally be reached M-F, 10-6. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Gene Crawford, can be reached at 571-272-6911. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/YOLANDA R CUMBESS/
Primary Examiner, Art Unit 3651

Prosecution Timeline

Feb 29, 2024
Application Filed
Sep 29, 2025
Non-Final Rejection — §102, §103
Dec 24, 2025
Response Filed
Mar 07, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600582
Systems and Methods for Optimized Container Loading Operations
2y 5m to grant • Granted Apr 14, 2026
Patent 12591840
Device and system for an autonomous mobile robot, drone, and/or courier to deliver, hold, protect, and return parcels for multi-users in both residential and commercial applications.
2y 5m to grant • Granted Mar 31, 2026
Patent 12550906
APPARATUS AND METHOD FOR GRADING, BATCHING AND SELECTIVELY TRANSFERRING FOOD PARTS
2y 5m to grant • Granted Feb 17, 2026
Patent 12553625
SYSTEM AND METHOD OF CIRCULATING A GAS IN AN AUTOMATED GRID BASED STORAGE AND RETRIEVAL SYSTEM
2y 5m to grant • Granted Feb 17, 2026
Patent 12544929
ROBOTIC SYSTEM WITH DEPTH-BASED PROCESSING MECHANISM AND METHODS FOR OPERATING THE SAME
2y 5m to grant • Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 87%
With Interview (+8.9%): 96%
Median Time to Grant: 2y 5m
PTA Risk: Moderate
Based on 1113 resolved cases by this examiner. Grant probability derived from career allow rate.
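The footnote states that grant probability is derived from the career allow rate (970 granted / 1113 resolved), and the with-interview figure matches adding the +8.9% lift in percentage points. A minimal sketch of that arithmetic, assuming a simple additive lift (my reading of the page, not a documented formula):

```python
granted, resolved = 970, 1113

# Headline grant probability is the career allow rate, rounded to whole percent.
allow_rate = granted / resolved              # ≈ 0.8715
grant_probability = round(allow_rate * 100)  # 87 (%)

# Assumption: the +8.9% interview lift is additive in percentage points.
with_interview = round(allow_rate * 100 + 8.9)  # 96 (%)
print(grant_probability, with_interview)  # 87 96
```

Note the implication: the projection is an examiner-level base rate, not a case-specific model of this application's claims or rejections.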
