Prosecution Insights
Last updated: April 19, 2026
Application No. 17/520,269

AUTONOMOUS MOBILE ROBOT, TRANSPORTER, AUTONOMOUS MOBILE ROBOT CONTROL METHOD, AND TRANSPORTER CONTROL METHOD

Status: Final Rejection (§103)
Filed: Nov 05, 2021
Examiner: LEVY, MERRITT E
Art Unit: 3663
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Kabushiki Kaisha Toshiba
OA Round: 6 (Final)

Grant Probability: 33% (At Risk)
Expected OA Rounds: 7-8
Median Time to Grant: 3y 7m
Grant Probability with Interview: 70%

Examiner Intelligence

Career Allow Rate: 33% (26 granted / 78 resolved; -18.7% vs TC avg)
Interview Lift: +36.6% (resolved cases with an interview)
Avg Prosecution: 3y 7m
Currently Pending: 56
Total Applications: 134 (across all art units)
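The headline figures above can be re-derived from the raw counts. A minimal sketch in Python; the assumption (not stated on the page) is that the without-interview baseline equals the career allow rate, which reproduces the displayed lift to within a tenth of a point:

```python
# Re-deriving the examiner's headline metrics from the dashboard's raw
# counts. The with-interview grant rate (70%) is taken from the page;
# the without-interview baseline is assumed to be the career allow rate.
granted, resolved = 26, 78

career_allow_rate = granted / resolved  # 26/78 ≈ 33.3%, shown as "33%"
with_interview = 0.70                   # grant rate in interviewed cases
interview_lift = with_interview - career_allow_rate  # ≈ +36.7 points

print(f"Allow rate: {career_allow_rate:.1%}, lift: {interview_lift:+.1%}")
```

The computed lift (+36.7 points) is a shade above the displayed +36.6%, which suggests the dashboard's without-interview baseline is slightly different from the overall allow rate (roughly 33.4%).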

Statute-Specific Performance

§101: 9.3% (-30.7% vs TC avg)
§103: 54.0% (+14.0% vs TC avg)
§102: 16.3% (-23.7% vs TC avg)
§112: 20.0% (-20.0% vs TC avg)

Black line = Tech Center average estimate • Based on career data from 78 resolved cases
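One detail worth noting: all four statute-specific deltas are consistent with a single Tech Center baseline. A short sketch; the 40% baseline is an inference back-solved from the displayed deltas, not a figure stated on the page:

```python
# Back-solving the Tech Center baseline implied by the statute-specific
# deltas above. All four resolve to the same 40.0% estimate, suggesting
# one TC-wide average rather than per-statute baselines.
examiner_rate = {"§101": 9.3, "§103": 54.0, "§102": 16.3, "§112": 20.0}
delta_vs_tc = {"§101": -30.7, "§103": 14.0, "§102": -23.7, "§112": -20.0}

implied_tc_avg = {s: round(examiner_rate[s] - delta_vs_tc[s], 1)
                  for s in examiner_rate}
print(implied_tc_avg)  # every statute implies a 40.0% TC average
```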

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on June 17, 2025, has been entered.

Status of Claims

This Office action is in response to the amendments filed June 17, 2025. Claims 4-7 and 19-21 are currently pending, with Claim 4 amended and Claim 21 newly added.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on June 30, 2025, is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the Examiner.

Response to Amendments

In response to Applicant's amendments, filed June 17, 2025, the Examiner withdraws the previous 35 U.S.C. 103 rejections.

Response to Arguments

Applicant's arguments, filed June 17, 2025, with respect to the rejections of Claims 4-7 and 19-20 under Levasseur, in view of Sonoura, Ulbrich, and Tanaka, have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made of Claims 4-7 and 19-21 in view of Levasseur, in view of Sonoura, Ulbrich, and Tanaka.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 4-6 and 19-21 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication No. 2021/0331905 A1, to Levasseur, et al. (hereinafter Levasseur; previously of record), in view of U.S. Patent Publication No. 2019/0202388 A1, to Sonoura, et al. (hereinafter Sonoura; previously of record), and further in view of U.S. Patent Publication No. 2024/0067510 A1, to Ulbrich, et al. (hereinafter Ulbrich; previously of record).

As per Claim 4, Levasseur discloses the features of a transporter comprising an autonomous mobile robot configured to transport a transport object (e.g. Paragraphs [0025], [0028]; where an autonomous vehicle, such as a robot, includes a body configured for movement along a surface such as the floor of a warehouse, where the autonomous vehicle comprises an end effector for mating to an element for lifting and transport of items such as a pallet or container (i.e. configured to transport a transport object)), ‘…’ the autonomous mobile robot comprising:

a driver configured to cause the autonomous mobile robot to move (e.g. Paragraphs [0027], [0028], [0031]; where the robot (10) has a control system for controlling the end-effector, the robot body, or both to move and travel along a surface);

a first detector attached to a first position (e.g. Paragraph [0036]; Figure 3; where multiple sensors are located on the robot, and the sensors may be located at different positions, for example, at different heights; and where one or more sensors (36a-e) are located on the robot in different positions), the first position being a position in height at which a surrounding existence object is not present (e.g. Paragraph [0036]; where the sensors (36a, 36c) may be located on, and movable with, the end-effector (i.e. adjustable in height) to detect and image elements, such as a pallet or container or fixed device, so that a sensor will not be blocked by the detected elements), ‘…’ the first detector being configured to obtain first data by scanning a first object at a first region around the autonomous mobile robot (e.g. Paragraphs [0034]-[0036]; where one or more sensors (36a-e) are located on the robot (10) for use in detecting the location of the robot itself, for detecting an element to pick up, and/or detecting a location on which to place the element; and where each LIDAR scanner (i.e. detector) is configured to detect objects within a sensing plane; and where a sensor located on the end-effector may enable the robot to detect image elements, such as a pallet or container, that are located directly in front of the robot (i.e. a first region around the robot));

a second detector attached to a second position (e.g. Paragraph [0036]; Figure 3; where multiple sensors are located on the robot, and the sensors may be located at different positions, for example, at different heights; and where one or more sensors (36a-e) are located on the robot in different positions), the second detector being configured to obtain second data by scanning a second object at a second region around the autonomous mobile robot (e.g. Paragraphs [0034]-[0037]; where each LIDAR scanner (i.e. detector) is configured to detect objects within a sensing plane; where one or more sensors may be located at a mid-point, bottom, or on the body of the robot (10), which enables the robot to capture elements in front of the robot (i.e. a second region) even when a sensor is blocked, and may include scanners spanning an arc of 180 degrees from one side of the robot to the opposite side of the robot; and where combinations of the data from each of the sensors may be correlated and combined to obtain 3D data representing the space in front of the robot), the second position being lower than the first position in a height direction of the transporter (e.g. Paragraph [0036]; Figure 3; where multiple sensors are located on the robot, and the sensors may be located at different positions, for example, at different heights; and where one or more sensors (36a-e) are located on the robot in different positions; and where one or more sensors may be located at a mid-point, bottom, or on the body of the robot (10)), ‘…’

a localization estimation part configured to calculate an estimated position of the autonomous mobile robot in a region in which the transporter travels, in accordance with the first data (e.g. Paragraphs [0034], [0037], [0038]-[0040]; where the sensors may assist in operations such as object detection and localization; and where the one or more sensors (36a-e) are used for detecting the location of the robot itself);

a route-generating part configured to calculate a position of the second object present around the autonomous mobile robot in accordance with the second data (e.g. Paragraphs [0028], [0039], [0040]; where the on-board control system of the robot may or may not use a pre-planned route (i.e. move on its own accord) through the map to identify where to locate an element, where the robot may move through and around the space and determine an action upon detecting an object; and where the fleet control system may coordinate operations of the robot including movement through the space), ‘…’ the route-generating part being configured to calculate a route to a target position in accordance with the estimated position of the autonomous mobile robot in the region and the position of the second object (e.g. Paragraphs [0028], [0039], [0040]; where the on-board control system of the robot may or may not use a pre-planned route (i.e. move on its own accord) through the map to identify where to locate an element, where the robot may move through and around the space and determine an action upon detecting an object; and where the fleet control system may coordinate operations of the robot including movement through and around the space to identify elements that the robot is directed to pick up and move), ‘…’

a control part configured to control the driver in accordance with the route (e.g. Paragraphs [0027], [0028], [0031]; where the robot (10) has a control system for controlling the end-effector, the robot body, or both to move and travel along a surface), ‘…’ the control part being configured to cause the autonomous mobile robot to travel to the target position (e.g. Paragraphs [0027], [0028], [0031]; where the robot (10) has a control system for controlling the end-effector, the robot body, or both to move and travel along a surface). 
Levasseur fails to disclose every feature of the transport object including: a plurality of wheels provided under the transport object; and a third detector configured to obtain third data associated with positions of the plurality of the wheels, wherein the control part is configured to control the driver in accordance with the third data to cause the autonomous mobile robot to be disposed between a pair of the wheels of the plurality of the wheels.

However, Sonoura, in the same field of endeavor, teaches the features of the transport object including a plurality of wheels provided under the transport object. Sonoura teaches an unmanned transport vehicle, where a transport object has wheels for movement, and the transport object is different than the transporter/autonomous mobile robot (e.g. Paragraph [0029]; Figure 1). It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to modify the autonomous robot system of Levasseur with the feature of having wheels on a transport object in the system of Sonoura, in order to allow the transport object to be moved freely without the use of a transporter.

Sonoura further teaches the features of a third detector configured to obtain third data associated with positions of the plurality of the wheels. Sonoura teaches an unmanned transport vehicle, where a rear monitor (50) may be a laser range finder (LRF) that can irradiate the transport-object (900) with a laser, and the rear monitor (50) acquires information relating to the width of the transport-object (900), including determining heights and positions of the caster wheels to determine if the unmanned transport vehicle (1) can enter between the casters (920) of the transport-object (e.g. Paragraphs [0037], [0052], [0094]; Figure 2). 
It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to modify the autonomous robot system of Levasseur with the feature of determining the location of the wheels in the system of Sonoura, in order to improve determination of proximity to an object for transport.

Sonoura further teaches the features of wherein the control part is configured to control the driver in accordance with the third data to cause the autonomous mobile robot to be disposed between a pair of the wheels of the plurality of the wheels. Sonoura teaches an unmanned transport vehicle, where the controller (60) controls the movement controller (61) such that the unmanned transport vehicle (1) moves toward below the loading portion (910) of the transport object (900) (for example, toward a space between two casters (920) of the transport object (900)), and the controller detects the width (W2) of the transport-object (900) (e.g. Paragraph [0052]). It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to modify the autonomous robot system of Levasseur with the feature of determining the location of the wheels in the system of Sonoura, in order to improve determination of proximity to an object for transport.

The combination of Levasseur, in view of Sonoura, fails to teach every feature of the surrounding existence object being an object other than an environment inherence object, the surrounding existence object being changeable in position or shape, the first detector being configured to scan the environment inherence object as a first object, the environment inherence object not being movable; the second object being different from the environment inherence object, the second object being a movable object. 
However, Ulbrich, in a similar field of endeavor, further teaches an autonomous industrial truck, where the sensors can detect obstacles/persons in the vicinity of the vehicle, and when a person or trolley is recognized (i.e. detects a movable object), the sensor passes the information about the person to the controller to determine if the autonomous vehicle needs to reduce speed (i.e. determines surrounding existence objects that are movable in position or shape); and where the system may evaluate detected position data of moving objects over time; and when the obstacle is determined to be static, the obstacle is evaluated to determine whether these obstacles have a straight edge, such as a wall (151) (i.e. detects environment inherence objects that are not movable) (e.g. Paragraphs [0024], [0036], [0129], [0151], [0278]). It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to further modify the autonomous robot system of Levasseur, in view of Sonoura, with the feature of detecting a movable object in the system of Ulbrich, in order to determine if the autonomous vehicle can transfer an item to a different moving vehicle.

Ulbrich further teaches the features of wherein the first detector is configured to detect that the first object is not a movable object. Ulbrich teaches an autonomous industrial truck, where the system can detect an obstacle in the vicinity of the autonomous vehicle, and determines whether the obstacle is static or temporarily static; where stationary objects are detected and evaluated to determine whether these obstacles have a straight edge, such as a wall (151) (i.e. detects objects that are not movable) (e.g. Paragraphs [0151], [0278], [0280]). 
It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to further modify the autonomous robot system of Levasseur, in view of Sonoura, with the feature of detecting a static object in the system of Ulbrich, in order to determine autonomous movement and path planning.

The combination of Levasseur, in view of Sonoura, further fails to teach every feature of the estimated position calculated by the localization estimation part not being affected by obstacles serving as the second object detected by the second detector; the route-generating part being configured to generate an obstacle-avoiding route to the target position in accordance with the first data and the second data, the obstacle-avoiding route thereby causing the transporter to avoid the obstacles.

Ulbrich teaches the features of the estimated position calculated by the localization estimation part not being affected by obstacles serving as the second object detected by the second detector. Ulbrich teaches an autonomous industrial truck, where the system can detect an obstacle in the vicinity of the autonomous vehicle, and localize itself based on a determination of static and moving objects in the location, and can determine a waiting position based on the detection and likelihood of moving objects within the path (i.e. the vehicle can localize itself based on the status of detected objects) (e.g. Paragraph [0128]). It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to further modify the autonomous robot system of Levasseur, in view of Sonoura, with the feature of localizing the vehicle in the system of Ulbrich, in order to streamline the route planning of the vehicle. 
Ulbrich further teaches the features of the route-generating part being configured to generate an obstacle-avoiding route to the target position in accordance with the first data and the second data, the obstacle-avoiding route thereby causing the transporter to avoid the obstacles. Ulbrich teaches an autonomous industrial truck, where the vehicle can localize itself to its environment, and the system determines the amount of energy required to drive around obstacles and implements obstacle avoidance (e.g. Paragraphs [0019], [0043], [0131], [0236]). It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to further modify the autonomous robot system of Levasseur, in view of Sonoura, with the feature of localizing the vehicle in the system of Ulbrich, in order to streamline the route planning of the vehicle.

As per Claim 5, Levasseur, in view of Sonoura and Ulbrich, teaches the features of Claim 4, and Levasseur further teaches the features of wherein the autonomous mobile robot is a transfer carriage configured to transport the transport object (e.g. Paragraphs [0025], [0028]; where an autonomous vehicle, such as a robot, includes a body configured for movement along a surface such as the floor of a warehouse, where the autonomous vehicle comprises an end effector for mating to an element for lifting and transport of items such as a pallet or container (i.e. configured to transport a transport object)).

As per Claim 6, Levasseur, in view of Sonoura and Ulbrich, teaches the features of Claim 4, and Levasseur further teaches the features of wherein the first detector is adjustable in height to be higher than the position in height of the surrounding existence object serving as the movable object (e.g. Paragraph [0036]; where the sensors (36a, 36c) may be located on, and movable with, the end-effector (i.e. adjustable in height) to detect and image elements, such as a pallet or container or fixed device, so that a sensor will not be blocked by the detected elements).

As per Claim 19, Levasseur, in view of Sonoura and Ulbrich, teaches the features of Claim 4, and Ulbrich further teaches the features of wherein the first object is at least one of a wall, a pillar, and a corner of a building. Ulbrich teaches an autonomous industrial truck, where the system can detect an obstacle in the vicinity of the autonomous vehicle, and determines whether the obstacle is static or temporarily static; where stationary objects are detected and evaluated to determine whether these obstacles have a straight edge, such as a wall (151) (e.g. Paragraphs [0151], [0278], [0280]). It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to further modify the autonomous robot system of Levasseur, in view of Sonoura, with the feature of detecting a static object in the system of Ulbrich, in order to determine autonomous movement and path planning.

As per Claim 20, Levasseur, in view of Sonoura and Ulbrich, teaches the features of Claim 4, and Ulbrich further teaches the features of wherein the first detector is configured to scan the first object in a predetermined angle range in a planar direction around the first detector and obtain first data at a plurality of points. Ulbrich teaches an autonomous industrial truck, where the autonomous vehicle is configured in such a way that the angle of coverage of the rear LIDAR is varied, and switched to a narrower field of view when the autonomous vehicle has picked up a load; and where the LIDARs have an angle to the vertical of between 5-20 degrees (e.g. Paragraphs [0012], [0131], [0278]). 
It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to further modify the autonomous robot system of Levasseur, in view of Sonoura, with the feature of defining a range of a sensor in the system of Ulbrich, in order to correct for sensor errors and improve path planning.

As per Claim 21, Levasseur, in view of Sonoura and Ulbrich, teaches the features of Claim 4, and Ulbrich further teaches the features of wherein the first detector and the third detector are disposed on opposite sides of the autonomous mobile robot, and the second detector is disposed at a position higher than a position of the third detector. Ulbrich teaches an autonomous industrial truck, where the autonomous vehicle is configured to have vertical LIDAR sensors (16), forward infrared sensors (28), and rear-facing sensors (17) (i.e. first, second, and third detectors) on opposite sides of the vehicle, where the vertical LIDAR (16) is higher than the rear-facing sensors (17) (e.g. Figure 1). It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to further modify the autonomous robot system of Levasseur, in view of Sonoura, with the feature of determining position information of each sensor in the system of Ulbrich, in order to provide full coverage of the surroundings of the vehicle.

Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Levasseur, in view of Sonoura and Ulbrich, as applied to Claim 4 above, and further in view of Japanese Patent Publication No. 2002200587 A, to Tanaka (hereinafter Tanaka; previously of record).

As per Claim 7, Levasseur, in view of Sonoura and Ulbrich, teaches the features of Claim 4, but the combination of Levasseur, in view of Sonoura and Ulbrich, fails to teach every feature of wherein the first detector is provided at a position in height higher than 1.8 m. 
However, Tanaka teaches a carry device for a robot, where the height of the carry device in the extended state is 1.8 m (e.g. Paragraph [0026]). It would have been obvious to a person of ordinary skill in the art before the time of the Applicant’s invention to further modify the autonomous robot system of Levasseur, in view of Sonoura and Ulbrich, with the feature of having a sensor at a predetermined height in the system of Tanaka, in order to allow the vehicle to determine the height of an object. It has been held that where the general conditions of a claim are disclosed in the prior art, discovering the optimum or workable ranges involves only routine skill in the art. In re Aller, 105 USPQ 233.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:

Ichinose, et al. (U.S. 2017/0285644 A1), which teaches an autonomous forklift, where the system adjusts the height of sensors to be able to see over a load and to be able to line up the forks to load a pallet.

Paschall, et al. (U.S. 2019/0160675 A1), which teaches a method for dynamic navigation of an autonomous vehicle by using sensors in different positions and heights to map an environment.

Sturm (U.S. 2013/0166108 A1), which teaches a method for operating a transport carriage, and aligning the wheels so as to be able to lift and transport a load.

Svensson, et al. (U.S. 2016/0090283 A1), which teaches a method for operating a lift truck, where the sensor height is adjusted to be higher than the height of the forklift.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MERRITT E LEVY whose telephone number is (571) 270-5595. The examiner can normally be reached Mon-Fri 0630-1600. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. 
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Helal Algahaim, can be reached at (571) 270-5227. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MERRITT E LEVY/
Examiner, Art Unit 3666

/HELAL A ALGAHAIM/
SPE, Art Unit 3666

Prosecution Timeline

Nov 05, 2021
Application Filed
Feb 02, 2024
Non-Final Rejection — §103
May 08, 2024
Response Filed
Jun 12, 2024
Final Rejection — §103
Sep 18, 2024
Request for Continued Examination
Oct 02, 2024
Response after Non-Final Action
Oct 21, 2024
Non-Final Rejection — §103
Feb 14, 2025
Response Filed
Mar 10, 2025
Final Rejection — §103
Jun 17, 2025
Request for Continued Examination
Jun 23, 2025
Response after Non-Final Action
Jul 14, 2025
Non-Final Rejection — §103
Nov 28, 2025
Response Filed
Dec 15, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601596
Estimation of Target Location and Sensor Misalignment Angles
2y 5m to grant • Granted Apr 14, 2026

Patent 12603005
DRIVER ASSISTANCE MODULE FOR A MOTOR VEHICLE
2y 5m to grant • Granted Apr 14, 2026

Patent 12594944
METHOD AND SYSTEM FOR VEHICLE DRIVE MODE SELECTION
2y 5m to grant • Granted Apr 07, 2026

Patent 12594960
NAVIGATIONAL CONSTRAINT CONTROL SYSTEM
2y 5m to grant • Granted Apr 07, 2026

Patent 12583382
SYNCHRONIZED LIGHTING FOR ELECTRIC VEHICLES
2y 5m to grant • Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 7-8
Grant Probability: 33%
With Interview: 70% (+36.6%)
Median Time to Grant: 3y 7m
PTA Risk: High

Based on 78 resolved cases by this examiner. Grant probability derived from career allow rate.
