Prosecution Insights
Last updated: April 19, 2026
Application No. 18/936,122

MOBILE ROBOT AND ITS OPERATION METHOD

Non-Final OA (§102, §103)
Filed: Nov 04, 2024
Examiner: ALKIRSH, AHMED
Art Unit: 3668
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: LG Electronics Inc.
OA Round: 1 (Non-Final)
Grant Probability: 54% (Moderate)
Expected OA Rounds: 1-2
Expected Time to Grant: 3y 0m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 54% of resolved cases (23 granted / 43 resolved; +1.5% vs TC avg)
Interview Lift: strong, +53.7% (allow rate of resolved cases with vs. without an interview)
Typical Timeline: 3y 0m average prosecution; 63 applications currently pending
Career History: 106 total applications across all art units
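The headline figures above are simple ratios, and they can be sanity-checked directly. The sketch below recomputes them in Python from the raw counts shown on this page; the variable names and structure are ours for illustration, not the dashboard's actual code.

```python
# Recompute the examiner metrics shown above from the raw figures.
# Names and layout are illustrative assumptions, not the dashboard's code.

granted = 23            # from "23 granted / 43 resolved"
resolved = 43

# Career allow rate: granted cases as a share of resolved cases.
allow_rate = 100.0 * granted / resolved
print(f"allow rate: {allow_rate:.1f}%")   # ~53.5%, displayed rounded as 54%

# Interview lift appears to be a percentage-point gap: the allow rate of
# resolved cases that had an interview minus the rate of those that did not.
with_interview = 99.0   # from "Grant Probability With Interview: 99%"
lift = 53.7             # from "+53.7% Interview Lift"
without_interview = with_interview - lift
print(f"implied allow rate without interview: {without_interview:.1f}%")
```

The small gap between the computed 53.5% and the displayed 54% presumably comes down to the dashboard's rounding convention.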

Statute-Specific Performance

§101: 20.2% (-19.8% vs TC avg)
§103: 54.5% (+14.5% vs TC avg)
§102: 22.5% (-17.5% vs TC avg)
§112: 2.8% (-37.2% vs TC avg)

Tech Center averages are estimates. Figures based on career data from 43 resolved cases.
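Each "vs TC avg" delta is just the examiner's statute-specific rate minus the Tech Center average estimate. The sketch below (our own illustrative code, with figures copied from the table above) recovers the implied baseline for each statute:

```python
# Recover the implied Tech Center average from each row of the table above.
# Figures are copied from the table; the dict layout is our own illustration.

rates = {  # statute: (examiner rate %, delta vs TC avg %)
    "\u00a7101": (20.2, -19.8),
    "\u00a7103": (54.5, +14.5),
    "\u00a7102": (22.5, -17.5),
    "\u00a7112": (2.8, -37.2),
}

for statute, (rate, delta) in rates.items():
    tc_avg = rate - delta  # delta = examiner rate minus TC average
    print(f"{statute}: implied TC avg = {tc_avg:.1f}%")
```

All four rows imply the same ~40.0% baseline, which suggests the comparison uses a single overall Tech Center estimate rather than per-statute averages.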

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-23 of U.S. Application No. 18/936,122, filed on 11/04/2024, have been examined.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 3-6, 11-13, 15-16 and 18-23 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Janssen et al. (US11036236B2).

Regarding Claims 1, 14 and 21: Janssen discloses a mobile robot comprising: [Abstract]: (“a light projection system for an autonomous robot”); [Col. 2, Lines 28-32]: (“autonomous robot that can indicate the robot’s intended path of travel”); a projector configured to project visual information onto one or more surfaces: [Abstract]: (“The light projection system can… project light onto the ground in front of the robot”); and a controller configured to: [Col. 5, Lines 62-64]: (“the light projection system 106 can include a controller circuit that can control the motors that move the mounting system or adjust the focusing system.”); project, via the projector, first visual information for marking a safety area onto a ground surface in a vicinity of the mobile robot while the mobile robot is traveling: [Col. 2, Lines 40-44]: (“the illumination pattern is selected to provide as much information as possible in the simplest manner possible. By projecting the robot's path of travel onto the ground ahead of the robot, the light projection system can aid the robot in safely navigating among people.”); and, in response to determining a change in at least one of a traveling state of the mobile robot or a surrounding situation of the mobile robot, generate changed first visual information and project the changed first visual information onto the ground surface: [Col. 6, Lines 51-59]: (“To indicate the robot's speed, In various examples, the light projection system 106 can change the shape of the light projected on the ground, and/or can make the projected light move.”)

Regarding Claim 3: Janssen discloses further comprising: a sensing unit configured to sense a speed of the mobile robot, [Col. 6, Lines 50-60]: (“The robot's on-board computer may determine, for example, that the robot 100 is accelerating from being stopped, or is able to go faster than the robot's current speed.”), wherein the controller is further configured to: in response to determining a change in the speed of the mobile robot, generate the changed first visual information based on changing at least one of a color of the first visual information or a size of the first visual information. [Col. 6, Lines 51-59]: (“the light projection system 106 can also be used to indicate the robot's velocity or a change in velocity.
The robot's on-board computer may determine, for example, that the robot 100 is accelerating from being stopped, or is able to go faster than the robot's current speed. To indicate the robot's speed, In various examples, the light projection system 106 can change the shape of the light projected on the ground, and/or can make the projected light move.”)

Regarding Claim 4: Janssen discloses wherein the sensing unit is further configured to sense a travel direction of the mobile robot, and [Col. 4, Lines 44-55]: (“In the example of FIG. 1A, the light projection system 106 has configured the illumination pattern 110 in the shape of a vertical bar to indicate that the robot 100 is moving forward. Specifically, the bar is oriented parallel to the robot's forward direction of travel.”) wherein the controller is further configured to: generate the changed first visual information based on elongating an image shape of the first visual information in a direction toward the travel direction. [Col. 6, Lines 51-59]: (“the light projection system 106 can change the shape of the light projected on the ground, and/or can make the projected light move.”) [Col. 4, Lines 44-55]: (“Specifically, the bar is oriented parallel to the robot's forward direction of travel.”) [Col. 6, Lines 40-50]: (“As another example, the light projection system 106 can intermittently change the projected light from a short bar or spot to a longer bar.”)

Regarding Claim 5: Janssen discloses wherein the controller is further configured to: generate the changed first visual information based on increasing or decreasing an image size of the first visual information based on the speed of the mobile robot and changing a color of the first visual information based on the speed of the mobile robot. [Col. 6, Lines 51-59]: (“the light projection system 106 can also be used to indicate the robot's velocity or a change in velocity.
The robot's on-board computer may determine, for example, that the robot 100 is accelerating from being stopped, or is able to go faster than the robot's current speed. To indicate the robot's speed, In various examples, the light projection system 106 can change the shape of the light projected on the ground, and/or can make the projected light move.”)

Regarding Claim 6: Janssen discloses further comprising: a sensing unit configured to sense an obstacle in the vicinity of the mobile robot, [Col. 3, Lines 60-67]: (“For example, when the robot's sensors indicate that an object is located within a certain distance (e.g., three feet, five feet, and/or a distance that varies with the robot's current velocity) from the front of the robot 100, the on-board computer can cause the robot 100 to slow down and/or turn right or left to navigate around the object.”) wherein the controller is further configured to: generate the changed first visual information based on changing the first visual information according to a state of the obstacle. [Col. 3, Lines 60-67]: (“when the robot's sensors indicate that an object is located within a certain distance (e.g., three feet, five feet, and/or a distance that varies with the robot's current velocity) from the front of the robot 100, the on-board computer can cause the robot 100 to slow down and/or turn right or left to navigate around the object. Once the robot's sensors indicate that the obstacle has been bypassed, the on-board computer can adjust the robot's path back to the intended course, if needed.”)

Regarding Claim 11: Janssen discloses further comprising: a sensing unit configured to sense the surrounding situation of the mobile robot at a position of the mobile robot, [Col. 6, Lines 51-59]: (“To indicate the robot's speed, In various examples, the light projection system 106 can change the shape of the light projected on the ground, and/or can make the projected light move.”) wherein the controller is further configured to: detect at least one of an upcoming crossway and an upcoming corner area based on the surrounding situation, and generate the changed first visual information by changing a size of the first visual information or a shape of the first visual information according to the at least one of the upcoming crossway and the upcoming corner area. [Col. 2, Lines 1-5]: (“For example, a robot may be programed to travel from one building in a town or city to another building, and in doing so may traverse sidewalks and cross streets.”) [Col. 4, Lines 14-18]: (“The on-board computer can then use this information to adjust the robot's speed and/or direction of travel, so that the robot 100 may be able to avoid running into people or can avoid moving faster than the flow of surrounding traffic.”)

Regarding Claim 12: Janssen discloses wherein the controller is further configured to: adjust at least one of a size and a shape of the changed first visual information as the mobile robot approaches the at least one of the upcoming crossway and the upcoming corner area, and in response to determining that the mobile robot has passed by the at least one of the upcoming crossway and the upcoming corner area, restore the at least one of the size and the shape of the changed first visual information to a previous state of the changed first visual information. [Col. 2, Lines 15-28]: (“When encountering a robot traveling along a sidewalk or crossing a street, ……….. the robot may need to make small course corrections along the way to avoid unexpected obstacles, people, uneven terrain, and/or other situations that can cause the robot to deviate from a strictly straight path.
…… and the safety of the robot, it may be desirable for the robot to indicate where the robot is going.”) [Col. 6-7, Lines 64-67 & 1-5]: (“The length of the bar can be an indicator, for example, of where the robot 100 will be in three to five seconds, or another amount of time. In some examples, the length of the bar can change actively. For example, the bar can have an initial length, which can increase to a second length, and then return to the first length, in an intermittent pattern. In some examples, the light projection system 106 can additionally cause the projected light to blink as the length of the bar changes.”)

Regarding Claim 13: Janssen discloses wherein the controller is further configured to: project the first visual information onto the ground surface before the mobile robot starts to travel, and in response to a predetermined amount of time elapsing after the mobile robot stops traveling, interrupt the projection of the first visual information. [Col. 6-7, Lines 60-67 & 1-5]: (“FIG. 1C illustrates an example of an illumination pattern 114 that can be used to indicate the robot's speed or velocity. In this example, the illumination pattern 114 includes a vertical bar that has been lengthened in proportion to the robot's velocity. The length of the bar can be an indicator, for example, of where the robot 100 will be in three to five seconds, or another amount of time. In some examples, the length of the bar can change actively. For example, the bar can have an initial length, which can increase to a second length, and then return to the first length, in an intermittent pattern.
In some examples, the light projection system 106 can additionally cause the projected light to blink as the length of the bar changes.”)

Regarding Claim 15: Janssen discloses wherein the controller determines the next operation of the mobile robot based on the obstacle approaching the mobile robot and controls the projector to project the second visual information onto the ground surface before the next scheduled operation is performed by the mobile robot. [Col. 3, Lines 55-67]: (“The obstacle may not be noted in the data the external computer uses to determine the robot's route, or may be a mobile obstacle, so that the obstacle's presence or location may not be predictable. In these and other examples, the robot's on-board computing device can include instructions for adjusting the robot's path as the robot travels a route. For example, when the robot's sensors indicate that an object is located within a certain distance (e.g., three feet, five feet, and/or a distance that varies with the robot's current velocity) from the front of the robot 100, the on-board computer can cause the robot 100 to slow down and/or turn right or left to navigate around the object.”)

Regarding Claim 16: Janssen discloses wherein the next scheduled operation includes the mobile robot traveling around the obstacle, and wherein the second visual information indicates a position of the obstacle projected onto the ground surface before the mobile robot travels around the obstacle. [Col. 3, Lines 60-67]: (“The obstacle may not be noted in the data the external computer uses to determine the robot's route, or may be a mobile obstacle, so that the obstacle's presence or location may not be predictable. In these and other examples, the robot's on-board computing device can include instructions for adjusting the robot's path as the robot travels a route.
For example, when the robot's sensors indicate that an object is located within a certain distance (e.g., three feet, five feet, and/or a distance that varies with the robot's current velocity) from the front of the robot 100, the on-board computer can cause the robot 100 to slow down and/or turn right or left to navigate around the object.”)

Regarding Claim 18: Janssen discloses wherein the controller is further configured to: in response to sensing a first obstacle and a second obstacle, generate the second visual information to include a mobile guide that indicates positions of the mobile robot and the second obstacle for guiding a traveling path of the first obstacle, and projecting the second visual information onto the ground surface. [Col. 4, Lines 4-18]: (“In these examples, to assist the robot 100 in navigating among people, the robot 100 can include an array of sensors that can detect people or objects within a certain distance from the robot 100 (e.g., three feet, five, or another distance). Using these sensors, the robot's on-board computing device may be able to an approximate number and proximity of objects around the robot 100, and possibly also the rate at which the objects are moving. The on-board computer can then use this information to adjust the robot's speed and/or direction of travel, so that the robot 100 may be able to avoid running into people or can avoid moving faster than the flow of surrounding traffic.”)

Regarding Claim 19: Janssen discloses wherein the second visual information marks a safety area based on the positions of the mobile robot and the second obstacle, and a risk area, based on the positions of the mobile robot and the second obstacle. [Col. 4, Lines 4-18]: (“In these examples, to assist the robot 100 in navigating among people, the robot 100 can include an array of sensors that can detect people or objects within a certain distance from the robot 100 (e.g., three feet, five, or another distance).
Using these sensors, the robot's on-board computing device may be able to an approximate number and proximity of objects around the robot 100, and possibly also the rate at which the objects are moving. The on-board computer can then use this information to adjust the robot's speed and/or direction of travel, so that the robot 100 may be able to avoid running into people or can avoid moving faster than the flow of surrounding traffic.”)

Regarding Claim 20: Janssen discloses wherein the controller is further configured to: determine a risk area based on the state of the ground surface, and project the second visual information on the ground surface to mark the risk area. [Col. 6, Lines 20-28]: (“The robot's on-board computing device, for example, may periodically or continuously review the robot's route or route adjustments to see where the robot 100 is supposed to be at the current moment and/or in a few seconds (e.g., three seconds, five seconds, or another number of seconds in the future). When the on-board computing device determines that the robot 100 is to make a left turn, the computing device can instruct the light projection system 106 to make adjustments to project the illumination pattern 112 that indicates a left turn. In some examples, the computing device's programming may cause the illumination pattern 112 to be projected a few seconds before the computing device instructs the robot 100 to execute the turn.”)

Regarding Claim 22: Janssen discloses wherein the condition includes at least one of a speed of the mobile robot, a traveling state of the mobile robot, a surrounding situation of the mobile robot, and a current condition of the ground surface. [Col. 6, Lines 51-59]: (“the light projection system 106 can also be used to indicate the robot's velocity or a change in velocity. The robot's on-board computer may determine, for example, that the robot 100 is accelerating from being stopped, or is able to go faster than the robot's current speed.
To indicate the robot's speed, In various examples, the light projection system 106 can change the shape of the light projected on the ground, and/or can make the projected light move.”)

Regarding Claim 23: Janssen discloses wherein the adjusting the attribute of the first visual information including varying a size, a shape, a pattern or a color of the first visual information. [Col. 9, Lines 25-45]: (“The light sources can include one or more of LEDs, halogen bulbs, lasers, other light emitting devices, or a combination of light emitting devices. In some examples, the light sources may be able to project light of different colors, and the light fixture 230 can include controls for changing the color that is projected.”) [Col. 4, Lines 62-64]: (“The light projection system 106 can also include a focusing system that is able to change the intensity, direction, and/or shape of the projected light.”)

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

Claims 7-10 are rejected under 35 U.S.C. 103 as being unpatentable over JANSSEN et al. (US 10401176 B2) in view of Holson et al. (US12436546B2), hereinafter referred to as JANSSEN and Holson respectively.

Regarding Claim 7: The mobile robot of claim 1, JANSSEN does not explicitly teach wherein the traveling state includes an operational state that varies based on at least one other moving body being connected to the mobile robot, and wherein the controller is further configured to: in response to sensing that the at least one other moving body is connected to the mobile robot, generate the changed first visual information based on information about the at least one other moving body.

However, Holson does teach wherein the traveling state includes an operational state that varies based on at least one other moving body being connected to the mobile robot, and [Col. 11, Lines 24-37]: (“According to various embodiments, the chassis 304 may include one or more rigid members providing physical support and connection between and among other components of the robots. For instance, the chassis 304 may be composed of one or more rods, shelves, bins or other elements. In some configurations, some or all of the chassis 304 may be composed of components from standardized shelving units or carts.”) wherein the controller is further configured to: in response to sensing that the at least one other moving body is connected to the mobile robot, generate the changed first visual information based on information about the at least one other moving body. [Col. 26, Lines 30-36]: (“In some embodiments, haptic rails or virtual train rails may be defined in any of various ways. For example, the robot may project rails onto the ground via a projector.”)

Both JANSSEN and Holson teach methods for projecting visual information for an area while the mobile robot travels.
However, Holson explicitly teaches wherein the traveling state includes an operational state that varies based on at least one other moving body being connected to the mobile robot, and wherein the controller is further configured to: in response to sensing that the at least one other moving body is connected to the mobile robot, generate the changed first visual information based on information about the at least one other moving body. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the visual information projection method of JANSSEN to also include wherein the traveling state includes an operational state that varies based on at least one other moving body being connected to the mobile robot, and wherein the controller is further configured to: in response to sensing that the at least one other moving body is connected to the mobile robot, generate the changed first visual information based on information about the at least one other moving body, as taught by Holson, with a reasonable expectation of success. Doing so improves safety for operating a mobile robot (with regard to this reasoning, see at least [Holson, Col. 11, Lines 24-37 and Col. 26, Lines 30-36]).

Regarding Claim 8: The mobile robot of claim 7, JANSSEN does not explicitly teach wherein the information about the at least one other moving body includes information on a number of moving bodies connected to the mobile robot, and wherein the controller is further configured to: generate the changed first visual information based on changing a size of the first visual information based on the information about the at least one other moving body or changing a shape of the first visual information based on the information about the at least one other moving body.
However, Holson does teach wherein the information about the at least one other moving body includes information on a number of moving bodies connected to the mobile robot, and [Col. 40, Lines 40-50]: (“In a multi-cart consolidation workflow, two or more robotic carts move to positions proximate to one another. The robotic carts can then coordinate to activate lighting elements to facilitate the movement of items from one bin on one of the robotic carts to another bin, which may potentially be on a different robotic cart. For instance, one or more lights may be activated to indicate a source location of an item to be moved. Then, after the item is picked up and scanned, another one or more lights may be activated to indicate a destination location for the item.”) wherein the controller is further configured to: generate the changed first visual information based on changing a size of the first visual information based on the information about the at least one other moving body or changing a shape of the first visual information based on the information about the at least one other moving body. [Col. 26, Lines 30-36]: (“In some embodiments, haptic rails or virtual train rails may be defined in any of various ways. For example, the robot may project rails onto the ground via a projector.”)

Both JANSSEN and Holson teach methods for projecting visual information for an area while the mobile robot travels. However, Holson explicitly teaches wherein the information about the at least one other moving body includes information on a number of moving bodies connected to the mobile robot, and wherein the controller is further configured to: generate the changed first visual information based on changing a size of the first visual information based on the information about the at least one other moving body or changing a shape of the first visual information based on the information about the at least one other moving body.
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the visual information projection method of JANSSEN to also include wherein the information about the at least one other moving body includes information on a number of moving bodies connected to the mobile robot, and wherein the controller is further configured to: generate the changed first visual information based on changing a size of the first visual information based on the information about the at least one other moving body or changing a shape of the first visual information based on the information about the at least one other moving body, as taught by Holson, with a reasonable expectation of success. Doing so improves safety for operating a mobile robot (with regard to this reasoning, see at least [Holson, Col. 40, Lines 40-50 and Col. 26, Lines 30-36]).

Regarding Claim 9: JANSSEN discloses wherein the controller is further configured to: project the changed first visual information onto the ground surface in a direction that corresponds to a traveling direction of the mobile robot. [Col. 4, Lines 44-55]: (“the light projection system 106 has configured the illumination pattern 110 in the shape of a vertical bar to indicate that the robot 100 is moving forward. Specifically, the bar is oriented parallel to the robot's forward direction of travel.”)

Regarding Claim 10: The mobile robot of claim 7, JANSSEN discloses wherein the controller is further configured to: determine an access restriction area based on the information on the amount of load present on the at least one other moving body. [Col. 2, Lines 40-44]: (“The light projection system can project an illumination pattern on the ground, where the illumination pattern indicates the robot's path of travel. In some examples, the illumination pattern can indicate the robot's intended direction.
Alternatively or additionally, the illumination pattern can indicate a location where the robot is estimated to be in within a few seconds. In various examples, the illumination pattern is selected to provide as much information as possible in the simplest manner possible. By projecting the robot's path of travel onto the ground ahead of the robot, the light projection system can aid the robot in safely navigating among people.”)

JANSSEN does not explicitly teach wherein the information about the at least one other moving body includes information on an amount of load present on the at least one other moving body connected to the mobile robot, and generate the changed first visual information by changing a size of the first visual information or a shape of the first visual information according to the access restriction area.

However, Holson does teach wherein the information about the at least one other moving body includes information on an amount of load present on the at least one other moving body connected to the mobile robot, and [Col. 10, Lines 28-36]: (“In some embodiments, the force sensing assembly 110 may be a force sensing handlebar assembly that is positioned between a human operator and the payload 108 to significantly reduce the effort involved in moving the payload 108 by operating drive assembly 102 via commands determined by manipulation of the force sensing assembly 110. The force sensing assembly 110 may, thus, operate the drive assembly 102 to push, pull, and/or rotate the autonomous robot 100 and, thus, payload 108.”) and generate the changed first visual information by changing a size of the first visual information or a shape of the first visual information according to the access restriction area. [Col. 38, Lines 48-58]: (“the global knowledge map 2820 may be determined based on a combination of the sensor data stored in the 2812, the environment semantic data 2814, and the environment configuration data 2816.”)

Both JANSSEN and Holson teach methods for projecting visual information for an area while the mobile robot travels. However, Holson explicitly teaches wherein the information about the at least one other moving body includes information on an amount of load present on the at least one other moving body connected to the mobile robot, and generate the changed first visual information by changing a size of the first visual information or a shape of the first visual information according to the access restriction area. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the visual information projection method of JANSSEN to also include wherein the information about the at least one other moving body includes information on an amount of load present on the at least one other moving body connected to the mobile robot, and generate the changed first visual information by changing a size of the first visual information or a shape of the first visual information according to the access restriction area, as taught by Holson, with a reasonable expectation of success. Doing so improves safety for operating a mobile robot (with regard to this reasoning, see at least [Holson, Col. 10, Lines 28-36 and Col. 26, Lines 30-36]).

Claims 2 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over JANSSEN in view of Takai et al. (US20210157326A1), hereinafter referred to as JANSSEN and Takai respectively.

Regarding Claim 2: JANSSEN discloses wherein the first visual information includes at least one of an image or text that indicates the access restriction area in a manner that is visually distinguished from surroundings of the mobile robot. [Col. 10, Lines 1-3]: (“The apertures can be used, for example, to form the projected light into the shape of arrows, letters, words, and/or other symbols.”)

JANSSEN does not explicitly teach wherein the safety area is an access restriction area determined based on a form of the mobile robot and the traveling state of the mobile robot. However, Takai does teach wherein the safety area is an access restriction area determined based on a form of the mobile robot and the traveling state of the mobile robot. [0035]: (“an area on the floor surface onto which the entry prohibited space is projected is defined as an entry prohibited area”)

Both JANSSEN and Takai teach methods for projecting visual information for an area while the mobile robot travels. However, Takai explicitly teaches wherein the safety area is an access restriction area determined based on a form of the mobile robot and the traveling state of the mobile robot. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the visual information projection method of JANSSEN to also include wherein the safety area is an access restriction area determined based on a form of the mobile robot and the traveling state of the mobile robot, as taught by Takai, with a reasonable expectation of success. Doing so improves safety for operating a mobile robot (with regard to this reasoning, see at least [Takai, 0035]).

Regarding Claim 17: JANSSEN discloses wherein the controller is further configured to: in response to determining that the mobile robot is unable to travel around the obstacle due to a condition, [Col. 3, Lines 60-67]: (“The obstacle may not be noted in the data the external computer uses to determine the robot's route, or may be a mobile obstacle, so that the obstacle's presence or location may not be predictable.
In these and other examples, the robot's on-board computing device can include instructions for adjusting the robot's path as the robot travels a route. For example, when the robot's sensors indicate that an object is located within a certain distance (e.g., three feet, five feet, and/or a distance that varies with the robot's current velocity) from the front of the robot 100, the on-board computer can cause the robot 100 to slow down and/or turn right or left to navigate around the object.”). JANSSEN does not explicitly teach generate third visual information indicating access restriction and projecting the third visual information onto the ground surface. However, Takai does teach generate third visual information indicating access restriction and projecting the third visual information onto the ground surface. [0055]: (“the illumination unit illuminates the entry restricted area”) Both JANSSEN and Takai teach methods for projecting visual information for an area while the mobile robot travels. However, Takai explicitly teaches generate third visual information indicating access restriction and projecting the third visual information onto the ground surface. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the visual information projection method of JANSSEN to also include generate third visual information indicating access restriction and projecting the third visual information onto the ground surface, as taught by Takai, with a reasonable expectation of success. Doing so improves safety for operating a mobile robot (With regard to this reasoning, see at least [Takai, 0055]). Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to AHMED ALKIRSH whose telephone number is (703) 756-4503. The examiner can normally be reached M-F 9:00 am-5:00 pm EST. 
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, FADEY JABR can be reached on (571) 272-1516. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /AA/Examiner, Art Unit 3668 /Fadey S. Jabr/Supervisory Patent Examiner, Art Unit 3668

Prosecution Timeline

Nov 04, 2024
Application Filed
Feb 04, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578724
Detection of Anomalous Trailer Behavior
2y 5m to grant Granted Mar 17, 2026
Patent 12410589
METHODS AND SYSTEMS FOR IMPLEMENTING A LOCK-OUT COMMAND ON LEVER MACHINES
2y 5m to grant Granted Sep 09, 2025
Patent 12403908
NON-SELFISH TRAFFIC LIGHTS PASSING ADVISORY SYSTEMS
2y 5m to grant Granted Sep 02, 2025
Patent 12370903
METHOD FOR TORQUE CONTROL OF ELECTRIC VEHICLE ON SLIPPERY ROAD SURFACE, AND TERMINAL DEVICE
2y 5m to grant Granted Jul 29, 2025
Patent 12325450
SYSTEMS AND METHODS FOR GENERATING MULTILEVEL OCCUPANCY AND OCCLUSION GRIDS FOR CONTROLLING NAVIGATION OF VEHICLES
2y 5m to grant Granted Jun 10, 2025
Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
54%
Grant Probability
99%
With Interview (+53.7%)
3y 0m
Median Time to Grant
Low
PTA Risk
Based on 43 resolved cases by this examiner. Grant probability derived from career allow rate.
