Prosecution Insights
Last updated: April 19, 2026

Application No.: 18/414,059
Title: ROBOT, ROBOT SYSTEM AND CONTROLLING METHOD THEREOF
Status: Non-Final OA (§103)
Filed: Jan 16, 2024
Examiner: HORNER, MINATO LEE
Art Unit: 3665
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)

Grant Probability: 80% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
Grant Probability with Interview: 99%
Examiner Intelligence

Career Allow Rate: 80% (above average): 8 granted / 10 resolved, +28.0% vs TC avg
Interview Lift: +25.0% (strong), measured across resolved cases with interview
Avg Prosecution: 2y 8m (typical timeline)
Career History: 50 total applications across all art units, 40 currently pending
Statute-Specific Performance

§101: 12.8% (-27.2% vs TC avg)
§102: 21.9% (-18.1% vs TC avg)
§103: 50.7% (+10.7% vs TC avg)
§112: 11.7% (-28.3% vs TC avg)

TC average is an estimate • Based on career data from 10 resolved cases

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This communication is in response to application No. 18/414,059, filed on 01/16/2024. Claims 1-20 are currently pending and have been examined. Claims 1-20 have been rejected as follows.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statements (IDS) filed on 01/16/2024 and 09/04/2024 have been acknowledged.

Specification

Applicant is reminded of the proper language and format for an abstract of the disclosure. The abstract should be in narrative form and generally limited to a single paragraph on a separate sheet within the range of 50 to 150 words in length. The abstract should describe the disclosure sufficiently to assist readers in deciding whether there is a need for consulting the full patent text for details. The language should be clear and concise and should not repeat information given in the title. It should avoid using phrases which can be implied, such as, “The disclosure concerns,” “The disclosure defined by this invention,” “The disclosure describes,” etc. In addition, the form and legal phraseology often used in patent claims, such as “means” and “said,” should be avoided.

The abstract of the disclosure is objected to because it is over 150 words long. A corrected abstract of the disclosure is required and must be presented on a separate sheet, apart from any other text. See MPEP § 608.01(b).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-2, 5-6, 10, 12-13, and 16-17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Haegermarck (US 10874274 B2) in view of Kwak (US 11137773 B2).

Regarding claim 1, Haegermarck teaches a robot (Fig. 3, master robot 10) comprising: a communication interface (Fig. 1, communication interface 29); a sensor configured to obtain distance data (Fig. 3, camera 23); a driver configured to control a movement of the robot (Fig. 1, electric wheel motors 15, 16); a memory storing map data corresponding to a space in which the robot travels (column 7 line 28, “Data from the images is extracted by the controller 22 and the data is typically saved in the memory 26 along with the computer program 25”); and a processor (Fig. 1, controller 22) configured to: control the sensor to output a sensing signal for sensing a distance with an external robot (column 9 line 63, “Thus, the master robot 10 emits light by means of its laser light sources 27, 28 onto the slave robot 30 and the camera 23 records images of a vicinity of the master robotic cleaning device 10 from which the slave robot 10 may be detected. Thereafter, the master robot 10 derives positional data of the detected objects from the recorded images, and positions itself in relation to the objects, including the slave robot 30”), control at least one of the driver or an operation state of the external robot based on the position information (column 1 line 8, “the master robotic cleaning device is arranged to submit commands the at least one slave robotic cleaning device to control a cleaning operation of the at least one slave robotic cleaning device, the commands being based on the derived positional data”), transmit a control signal for controlling the operation state of the external robot through the communication interface (column 3 line 42, “the master robot sends commands accordingly to the slave robot via its communication interface”), identify a target position of the robot based on the pose of the external robot and the stored map data (column 9 line 3, “the controller of the master robot 10 uses positional data derived from the obstacle detection device to position itself with respect to the surroundings, which also includes positioning the master robotic device 10 in relation to the slave robotic device 30 (being an “obstacle” in the surroundings of the master robot 10)”), and control the driver to move to the target position (Fig. 5a, master robotic cleaning device 10 positions itself in relation to the slave robotic cleaning device 30).
Haegermarck fails to teach the processor is configured to obtain position information of the external robot based on a time at which at least one echo signal is received from the external robot, and identify, based on an error occurring in communication with the external robot through the communication interface, a pose of the external robot based on a type of the at least one echo signal received from the external robot. Haegermarck instead only uses a camera and line lasers (see Fig. 5a) to obtain position information of the external robot, which could not be considered to be “based on a time” and as an “echo signal”. Haegermarck also fails to mention a communication error between the robot and external robot. However, Kwak teaches the processor is configured to obtain position information of the external robot (Fig. 5C, second autonomous mobile robot 100b) based on a time at which at least one echo signal is received from the external robot (column 21 line 22, “The second signal may include delay time (t_reply) information which is calculated based on a time at which the first mobile robot 100a has received the first signal and a time at which the first mobile terminal 100a has output the second signal”; see column 21 lines 9-26), and identify, based on an error occurring in communication with the external robot through the communication interface (column 20 line 4, “On the other hand, when the IR sensor is used, if an obstacle is present between the first mobile robot 100a and the second mobile robot 100b, the reception of the laser light is interrupted, and the relative positions of the first and second mobile robots cannot accurately be recognized”), a pose of the external robot based on a type of the at least one echo signal received from the external robot (column 20 line 9, “To solve this problem, as illustrated in FIGS. 
6A and 6B, the present invention can measure the relative positions of the first mobile robot and the second mobile robot by using UWB modules instead of the transmitting/receiving IR sensors”; column 22 line 34, “The present invention can calculate the relative positions (spatial coordinates) of the first mobile robot 100a and the second mobile robot 100b using the plurality of UWB anchors. The triangulation described in FIG. 6B will be equally/similarly applied to calculating the relative positions of the first mobile robot and the second mobile robot using three UWB anchors and one UWB tag”). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Haegermarck to incorporate the teachings of Kwak in order to determine the relative positions of both robots even when there is an obstacle in the way. Doing so would increase accuracy (column 20 line 42). Regarding claim 2, the combination of Haegermarck in view of Kwak teaches the robot of claim 1. Haegermarck further teaches the processor is further configured to: identify, while communicating with the external robot through the communication interface, the pose of the external robot based on the type of the at least one echo signal received from the external robot, and transmit a control signal for changing the pose of the external robot to the external robot through the communication interface based on the pose of the external robot and the stored map data (column 1 line 8, “the master robotic cleaning device is arranged to submit commands the at least one slave robotic cleaning device to control a cleaning operation of the at least one slave robotic cleaning device, the commands being based on the derived positional data”). Regarding claim 5, the combination of Haegermarck in view of Kwak teaches the robot of claim 1. 
Haegermarck fails to teach the external robot comprises a plurality of sensors configured to output echo signals of different types and disposed at different positions, and wherein the processor is further configured to: identify, based on an error occurring in communication with the external robot through the communication interface, positions of the plurality of sensors, which output a plurality of echo signals from among the plurality of sensors disposed in the external robot, based on types of the plurality of echo signals received from the external robot, and identify the pose of the external robot based on the positions of the plurality of sensors.

However, Kwak teaches the external robot comprises a plurality of sensors configured to output echo signals of different types and disposed at different positions (Fig. 7A, UWB anchors 710b-1, 710b-2, and 710b-3) and wherein the processor is further configured to: identify, based on an error occurring in communication with the external robot through the communication interface, positions of the plurality of sensors, which output a plurality of echo signals from among the plurality of sensors disposed in the external robot, based on types of the plurality of echo signals received from the external robot, and identify the pose of the external robot based on the positions of the plurality of sensors (column 20 line 9, “To solve this problem, as illustrated in FIGS. 6A and 6B, the present invention can measure the relative positions of the first mobile robot and the second mobile robot by using UWB modules instead of the transmitting/receiving IR sensors”; column 22 line 34, “The present invention can calculate the relative positions (spatial coordinates) of the first mobile robot 100a and the second mobile robot 100b using the plurality of UWB anchors. The triangulation described in FIG.
6B will be equally/similarly applied to calculating the relative positions of the first mobile robot and the second mobile robot using three UWB anchors and one UWB tag”). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to further incorporate the teachings of Kwak in order to determine the relative positions of both robots even when there is an obstacle in the way. Doing so would increase accuracy (column 20 line 42). Regarding claim 6, the combination of Haegermarck in view of Kwak teaches the robot of claim 1. Haegermarck further teaches the processor is further configured to identify, based on pose information being received from the external robot through the communication interface, the target position of the robot based on the pose information, the pose of the external robot, and the stored map data (column 9 line 3, “the controller of the master robot 10 uses positional data derived from the obstacle detection device to position itself with respect to the surroundings, which also includes positioning the master robotic device 10 in relation to the slave robotic device 30 (being an “obstacle” in the surroundings of the master robot 10)”). Regarding claim 10, the combination of Haegermarck in view of Kwak teaches the robot of claim 1. Haegermarck fails to explicitly teach the communication interface is configured to communicate according to a short range communication method comprising Bluetooth communication, and wherein the sensor comprises at least one of an infrared sensor or an ultra wide band (UWB) sensor. Haegermarck does teach that the robot is able to communicate with a mobile terminal via Bluetooth (column 3 line 33, “In still another embodiment, the master robot is advantageously arranged to communicate wirelessly, for example via Bluetooth or WLAN”). 
Kwak teaches the communication interface is configured to communicate according to a short range communication method comprising Bluetooth communication (column 16 line 15, “Meanwhile, the plurality of cleaners 100a and 100b may be directly connected to each other wirelessly via Zigbee, Z-wave, Blue-Tooth, Ultra-wide band, and the like”), and wherein the sensor comprises at least one of an infrared sensor or an ultra wide band (UWB) sensor (column 2 line 60, “In an embodiment disclosed herein, the transmitting optical sensor and the receiving optical sensor may be infrared (IR) sensors, and the first module and the second modules transmitting and receiving the UWB signal may be UWB modules”). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to further incorporate the teachings of Kwak. Bluetooth, infrared sensors, and UWB sensors are all well-known in the art and would have been a trivial change to Haegermarck. Regarding claim 12, Haegermarck teaches a method of controlling a robot (Fig. 3, master robot 10), the method comprising: outputting a sensing signal for sensing a distance with an external robot (column 9 line 63, “Thus, the master robot 10 emits light by means of its laser light sources 27, 28 onto the slave robot 30 and the camera 23 records images of a vicinity of the master robotic cleaning device 10 from which the slave robot 10 may be detected. 
Thereafter, the master robot 10 derives positional data of the detected objects from the recorded images, and positions itself in relation to the objects, including the slave robot 30”), driving at least one of the robot or the external robot based on the position information (column 1 line 8, “the master robotic cleaning device is arranged to submit commands the at least one slave robotic cleaning device to control a cleaning operation of the at least one slave robotic cleaning device, the commands being based on the derived positional data”); identifying a target position of the robot based on the pose of the external robot and map data; and moving the robot to the target position (column 9 line 3, “the controller of the master robot 10 uses positional data derived from the obstacle detection device to position itself with respect to the surroundings, which also includes positioning the master robotic device 10 in relation to the slave robotic device 30 (being an “obstacle” in the surroundings of the master robot 10)”). Haegermarck fails to teach obtaining position information of the external robot based on a time at which at least one echo signal is received from the external robot; and identifying, based on an error in communication with the external robot, a pose of the external robot based on a type of the at least one echo signal received from the external robot. Haegermarck instead only uses a camera and line lasers (see Fig. 5a) to obtain position information of the external robot, which could not be considered to be “based on a time” and as an “echo signal”. Haegermarck also fails to mention a communication error between the robot and external robot. However, Kwak teaches obtaining position information of the external robot (Fig. 
5C, second autonomous mobile robot 100b) based on a time at which at least one echo signal is received from the external robot (column 21 line 22, “The second signal may include delay time (t_reply) information which is calculated based on a time at which the first mobile robot 100a has received the first signal and a time at which the first mobile terminal 100a has output the second signal”; see column 21 lines 9-26); and identifying, based on an error in communication with the external robot (column 20 line 4, “On the other hand, when the IR sensor is used, if an obstacle is present between the first mobile robot 100a and the second mobile robot 100b, the reception of the laser light is interrupted, and the relative positions of the first and second mobile robots cannot accurately be recognized”), a pose of the external robot based on a type of the at least one echo signal received from the external robot (column 20 line 9, “To solve this problem, as illustrated in FIGS. 6A and 6B, the present invention can measure the relative positions of the first mobile robot and the second mobile robot by using UWB modules instead of the transmitting/receiving IR sensors”; column 22 line 34, “The present invention can calculate the relative positions (spatial coordinates) of the first mobile robot 100a and the second mobile robot 100b using the plurality of UWB anchors. The triangulation described in FIG. 6B will be equally/similarly applied to calculating the relative positions of the first mobile robot and the second mobile robot using three UWB anchors and one UWB tag”). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Haegermarck to incorporate the teachings of Kwak in order to determine the relative positions of both robots even when there is an obstacle in the way. Doing so would increase accuracy (column 20 line 42). 
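The delay-time ranging Kwak quotes (the responder reporting t_reply so the initiator can subtract the turnaround from its measured round trip) is conventional UWB two-way ranging. A minimal sketch of the distance computation, with function and timestamp names that are illustrative rather than drawn from either reference:

```python
C = 299_792_458.0  # speed of light, m/s

def twr_distance(t_send: float, t_receive: float, t_reply: float) -> float:
    """Estimate range from one two-way exchange.

    t_send    -- initiator's transmit timestamp (s)
    t_receive -- initiator's receive timestamp for the echo (s)
    t_reply   -- responder's reported turnaround delay (s), per Kwak's t_reply
    """
    # Round trip minus the responder's turnaround leaves twice the time of flight.
    time_of_flight = ((t_receive - t_send) - t_reply) / 2.0
    return C * time_of_flight
```

For a robot 3 m away and a 1 ms turnaround, the echo arrives 1 ms plus two flight times after the poll, and the function recovers roughly 3 m; in practice clock drift between the two robots dominates the error budget, which is why commercial UWB schemes add further message exchanges.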
Regarding claim 13, the combination of Haegermarck in view of Kwak teaches the method of claim 12. Haegermarck further teaches the identifying the pose of the external robot comprises identifying, while communicating with the external robot, the pose of the external robot based on the type of at least one echo signal received from the external robot, and wherein the method further comprises changing the pose of the external robot based on the pose of the external robot and the map data (column 1 line 8, “the master robotic cleaning device is arranged to submit commands the at least one slave robotic cleaning device to control a cleaning operation of the at least one slave robotic cleaning device, the commands being based on the derived positional data”). Regarding claim 16, the combination of Haegermarck in view of Kwak teaches the method of claim 12. Haegermarck fails to teach the identifying the pose of the external robot comprises: identifying, based on an error occurring in communication with the external robot through a communication interface, positions of a plurality of sensors, which output a plurality of echo signals from among the plurality of sensors disposed in the external robot, based on types of the plurality of echo signals received from the external robot, and identifying the pose of the external robot based on the positions of the plurality of sensors. However, Kwak teaches the identifying the pose of the external robot comprises: identifying, based on an error occurring in communication with the external robot through a communication interface, positions of a plurality of sensors (Fig. 
7A, UWB anchors 710b-1, 710b-2, and 710b-3), which output a plurality of echo signals from among the plurality of sensors disposed in the external robot, based on types of the plurality of echo signals received from the external robot, and identifying the pose of the external robot based on the positions of the plurality of sensors (column 20 line 9, “To solve this problem, as illustrated in FIGS. 6A and 6B, the present invention can measure the relative positions of the first mobile robot and the second mobile robot by using UWB modules instead of the transmitting/receiving IR sensors”; column 22 line 34, “The present invention can calculate the relative positions (spatial coordinates) of the first mobile robot 100a and the second mobile robot 100b using the plurality of UWB anchors. The triangulation described in FIG. 6B will be equally/similarly applied to calculating the relative positions of the first mobile robot and the second mobile robot using three UWB anchors and one UWB tag”). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to further incorporate the teachings of Kwak in order to determine the relative positions of both robots even when there is an obstacle in the way. Doing so would increase accuracy (column 20 line 42). 
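Kwak's three-anchor, one-tag arrangement reduces to planar trilateration: subtracting pairs of range-circle equations cancels the quadratic terms and leaves a 2x2 linear system for the tag position. A sketch under the assumption that the three anchor coordinates are known in a shared 2-D frame (all names hypothetical):

```python
def trilaterate(p1, p2, p3, d1, d2, d3):
    """Position of a tag from distances d1..d3 to anchors p1..p3 (2-D tuples).

    Subtracting the circle equations (xi - x)^2 + (yi - y)^2 = di^2 pairwise
    yields a linear system; it is solvable when the anchors are not collinear.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero iff anchors are collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

With anchors at (0, 0), (4, 0), and (0, 3) and exact ranges to a tag at (1, 1), the solver returns (1, 1); noisy ranges would instead call for a least-squares fit over more than three anchors.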
Regarding claim 17, the combination of Haegermarck in view of Kwak teaches the method of claim 12, wherein identifying the target position of the first robot comprises: identifying, based on pose information being received from the external robot through a communication interface, the target position of the robot based on the pose information, the pose of the external robot, and the stored map data (column 9 line 3, “the controller of the master robot 10 uses positional data derived from the obstacle detection device to position itself with respect to the surroundings, which also includes positioning the master robotic device 10 in relation to the slave robotic device 30 (being an “obstacle” in the surroundings of the master robot 10)”). Claim(s) 3-4 and 14-15 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Haegermarck in view of Kwak, and further in view of Connor (US 20170229023 A1). Regarding claim 3, the combination of Haegermarck in view of Kwak teaches the robot of claim 1. Both Haegermarck and Kwak fail to teach the processor is further configured to transmit, based on an error occurrence in communication through the communication interface being predicted based on the pose of the external robot and the stored map data, a control signal for changing the pose of the external robot to the external robot through the communication interface. However, Connor teaches the processor is further configured to transmit, based on an error occurrence in communication through the communication interface being predicted based on the pose of the external robot and the stored map data, a control signal for changing the pose of the external robot to the external robot through the communication interface (par. 26, "The communication nodes 102 can use information about their relative positioning with respect to the locations of obstacles in order to predict likely communication link obstructions for possible future states"). 
The combination of Haegermarck in view of Kwak relates to a plurality of autonomous robot cleaning devices that have a master slave relationship. Kwak explicitly teaches losing line of sight causes a communication error (column 20 line 4, “On the other hand, when the IR sensor is used, if an obstacle is present between the first mobile robot 100a and the second mobile robot 100b, the reception of the laser light is interrupted, and the relative positions of the first and second mobile robots cannot accurately be recognized”). Connor relates to a method of communication link accessibility aware navigation between communication nodes, in which one or more of the communication nodes can be an unmanned vehicle (abstract and par. 25). Connor seeks to solve the problem of impaired communication between mobile communication nodes due to losing clear line of sight for optical communication (par. 2). Therefore, both inventions relate to the known problem of losing line of sight between two autonomous mobile machines, which can cause a communication error. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to incorporate the teachings of Connor to add determining a likelihood of an error occurring in communication. Connor states, “Communication link accessibility between communication nodes may require a clear line of sight for optical or high-bandwidth communication. When one or more of the communication nodes are unmanned aerial vehicles (UAVs), a reduction or loss in communications can impede decision-making and planning capabilities” (par. 2). Predicting a likelihood of communication errors allows for “[a] modification of the path plan can be cooperatively determined with the one or more communication nodes to maintain or restore the one or more communication links” (par. 7). 
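Connor's obstruction prediction (par. 26) amounts to testing whether a mapped obstacle lies on the line of sight between two nodes at a possible future state. A toy version of that geometric test, assuming obstacles are stored in the map as circles; the function names are hypothetical, not from Connor:

```python
def segment_hits_circle(p, q, center, radius):
    """True if segment p-q passes within `radius` of `center`
    (closest-point-on-segment test, all points as 2-D tuples)."""
    px, py = p
    qx, qy = q
    cx, cy = center
    dx, dy = qx - px, qy - py
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0:
        t = 0.0
    else:
        # Parameter of the closest point on the segment, clamped to [0, 1].
        t = max(0.0, min(1.0, ((cx - px) * dx + (cy - py) * dy) / seg_len2))
    nx, ny = px + t * dx, py + t * dy  # nearest point on the segment
    return (nx - cx) ** 2 + (ny - cy) ** 2 <= radius ** 2

def link_likely_blocked(robot_a, robot_b, obstacles):
    """Predict a line-of-sight communication error if any mapped obstacle
    (center, radius) crosses the segment between the two robots."""
    return any(segment_hits_circle(robot_a, robot_b, c, r) for c, r in obstacles)
```

A planner in the spirit of Connor would run this test against candidate future poses and steer the slave robot toward poses where the link stays clear, which is the behavior the rejection maps onto claims 3-4 and 14-15.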
Regarding claim 4, the combination of Haegermarck in view of Kwak teaches the robot of claim 3. Both Haegermarck and Kwak fail to teach the processor is further configured to determine a likelihood of an error occurring in communication through the communication interface based on information on obstacles disposed in an area corresponding to a position of the external robot on the map data, the pose of the external robot, and a moving path of the external robot. However, Connor teaches the processor is further configured to determine a likelihood of an error occurring in communication through the communication interface based on information on obstacles disposed in an area corresponding to a position of the external robot on the map data, the pose of the external robot, and a moving path of the external robot (par. 26, "The communication nodes 102 can use information about their relative positioning with respect to the locations of obstacles in order to predict likely communication link obstructions for possible future states"). The combination of Haegermarck in view of Kwak relates to a plurality of autonomous robot cleaning devices that have a master slave relationship. Kwak explicitly teaches losing line of sight causes a communication error (column 20 line 4, “On the other hand, when the IR sensor is used, if an obstacle is present between the first mobile robot 100a and the second mobile robot 100b, the reception of the laser light is interrupted, and the relative positions of the first and second mobile robots cannot accurately be recognized”). Connor relates to a method of communication link accessibility aware navigation between communication nodes, in which one or more of the communication nodes can be an unmanned vehicle (abstract and par. 25). Connor seeks to solve the problem of impaired communication between mobile communication nodes due to losing clear line of sight for optical communication (par. 2). 
Therefore, both inventions relate to the known problem of losing line of sight between two autonomous mobile machines, which can cause a communication error. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to incorporate the teachings of Connor to add determining a likelihood of an error occurring in communication. Connor states, “Communication link accessibility between communication nodes may require a clear line of sight for optical or high-bandwidth communication. When one or more of the communication nodes are unmanned aerial vehicles (UAVs), a reduction or loss in communications can impede decision-making and planning capabilities” (par. 2). Predicting a likelihood of communication errors allows for “[a] modification of the path plan can be cooperatively determined with the one or more communication nodes to maintain or restore the one or more communication links” (par. 7). Regarding claim 14, the combination of Haegermarck in view of Kwak teaches the method of claim 12. Haegermarck fails to teach changing, based on an error occurrence in communication being predicted, the pose of the external robot. However, Connor teaches changing, based on an error occurrence in communication being predicted, the pose of the external robot (par. 26, "The communication nodes 102 can use information about their relative positioning with respect to the locations of obstacles in order to predict likely communication link obstructions for possible future states"). The combination of Haegermarck in view of Kwak relates to a plurality of autonomous robot cleaning devices that have a master slave relationship. 
Kwak explicitly teaches losing line of sight causes a communication error (column 20 line 4, “On the other hand, when the IR sensor is used, if an obstacle is present between the first mobile robot 100a and the second mobile robot 100b, the reception of the laser light is interrupted, and the relative positions of the first and second mobile robots cannot accurately be recognized”). Connor relates to a method of communication link accessibility aware navigation between communication nodes, in which one or more of the communication nodes can be an unmanned vehicle (abstract and par. 25). Connor seeks to solve the problem of impaired communication between mobile communication nodes due to losing clear line of sight for optical communication (par. 2). Therefore, both inventions relate to the known problem of losing line of sight between two autonomous mobile machines, which can cause a communication error. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to incorporate the teachings of Connor to add determining a likelihood of an error occurring in communication. Connor states, “Communication link accessibility between communication nodes may require a clear line of sight for optical or high-bandwidth communication. When one or more of the communication nodes are unmanned aerial vehicles (UAVs), a reduction or loss in communications can impede decision-making and planning capabilities” (par. 2). Predicting a likelihood of communication errors allows for “[a] modification of the path plan can be cooperatively determined with the one or more communication nodes to maintain or restore the one or more communication links” (par. 7). Regarding claim 15, the combination of Haegermarck in view of Kwak teaches the method of claim 14. 
Both Haegermarck and Kwak fail to teach the changing the pose of the external robot comprises determining a likelihood of an error occurring in communication based on information of obstacles disposed in an area corresponding to a position of the external robot on the map data, the pose of the external robot, and a moving path of the external robot. However, Connor teaches the changing the pose of the external robot comprises determining a likelihood of an error occurring in communication based on information of obstacles disposed in an area corresponding to a position of the external robot on the map data, the pose of the external robot, and a moving path of the external robot (par. 26, "The communication nodes 102 can use information about their relative positioning with respect to the locations of obstacles in order to predict likely communication link obstructions for possible future states"). The combination of Haegermarck in view of Kwak relates to a plurality of autonomous robot cleaning devices that have a master slave relationship. Kwak explicitly teaches losing line of sight causes a communication error (column 20 line 4, “On the other hand, when the IR sensor is used, if an obstacle is present between the first mobile robot 100a and the second mobile robot 100b, the reception of the laser light is interrupted, and the relative positions of the first and second mobile robots cannot accurately be recognized”). Connor relates to a method of communication link accessibility aware navigation between communication nodes, in which one or more of the communication nodes can be an unmanned vehicle (abstract and par. 25). Connor seeks to solve the problem of impaired communication between mobile communication nodes due to losing clear line of sight for optical communication (par. 2). Therefore, both inventions relate to the known problem of losing line of sight between two autonomous mobile machines, which can cause a communication error. 
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to incorporate the teachings of Connor to add determining a likelihood of an error occurring in communication. Connor states, “Communication link accessibility between communication nodes may require a clear line of sight for optical or high-bandwidth communication. When one or more of the communication nodes are unmanned aerial vehicles (UAVs), a reduction or loss in communications can impede decision-making and planning capabilities” (par. 2). Predicting a likelihood of communication errors allows for “[a] modification of the path plan can be cooperatively determined with the one or more communication nodes to maintain or restore the one or more communication links” (par. 7).

Claim(s) 7-8 and 18-19 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Haegermarck in view of Kwak, and further in view of Lee (US 20200023511 A1).

Regarding claim 7, the combination of Haegermarck in view of Kwak teaches the robot of claim 1. Haegermarck fails to teach the sensor comprises a light detection and ranging (LiDAR) sensor, and wherein the processor is further configured to obtain the position information of the external robot based on a sensing signal obtained by the LiDAR sensor and the time at which the at least one echo signal is received from the external robot. However, Kwak teaches obtaining the position information of the external robot based on the time at which the at least one echo signal is received (column 21 line 22, “The second signal may include delay time (t_reply) information which is calculated based on a time at which the first mobile robot 100a has received the first signal and a time at which the first mobile terminal 100a has output the second signal”; see column 21 lines 9-26).
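The delay-time (t_reply) information in the passage quoted from Kwak is the ingredient of a standard two-way ranging calculation: the round-trip interval, minus the time the responder held the signal, gives twice the one-way time of flight, from which distance follows. This is an editor's sketch of that general arithmetic under the assumption of RF propagation at the speed of light; the variable names are illustrative and not Kwak's.

```python
SPEED_OF_LIGHT = 299_792_458.0  # propagation speed in m/s (RF assumption)

def two_way_range(t_sent, t_received, t_reply):
    """Estimate distance from a two-way exchange: the first signal is sent at
    t_sent, the second (reply) signal is received at t_received, and the
    responder waited t_reply before replying. Times in seconds, result in
    metres. Illustrative only; not drawn from the cited disclosure."""
    time_of_flight = ((t_received - t_sent) - t_reply) / 2.0
    return time_of_flight * SPEED_OF_LIGHT
```

With a 1 microsecond reply delay and a reply arriving 1.2 microseconds after the first signal was sent, the one-way flight time is 100 ns, i.e. a separation of roughly 30 metres.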
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to further incorporate the teachings of Kwak in order to determine the relative positions of both robots even when there is an obstacle in the way. Doing so would increase accuracy (column 20 line 42).

Both Haegermarck and Kwak fail to teach the sensor comprises a light detection and ranging (LiDAR) sensor, and wherein the processor is further configured to obtain the position information of the external robot based on a sensing signal obtained by the LiDAR sensor. However, Lee teaches the sensor comprises a light detection and ranging (LiDAR) sensor (par. 42, sensing unit 130 can be a LiDAR sensor), and wherein the processor is further configured to obtain the position information of the external robot based on a sensing signal obtained by the LiDAR sensor (par. 51, “When moving the slave robot 200 on the basis of the space map, the control module 190 may control the input unit 120 or the sensing unit 130 to monitor movement of the slave robot 200. Accordingly, it is possible to intuitively recognize the slave robot 200 by the master robot 100”).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to incorporate the teachings of Lee to make the sensor a LiDAR sensor in order to monitor movement of the external robot. LiDAR sensors are already well-known in the field and would have been a trivial change.

Regarding claim 8, the combination of Haegermarck in view of Kwak teaches the robot of claim 1.
Haegermarck further teaches (column 9 line 3, “the controller of the master robot 10 uses positional data derived from the obstacle detection device to position itself with respect to the surroundings, which also includes positioning the master robotic device 10 in relation to the slave robotic device 30 (being an “obstacle” in the surroundings of the master robot 10)”—the communication interface is moved along with the robot).

Both Haegermarck and Kwak fail to teach the sensor comprises a light detection and ranging (LiDAR) sensor, and wherein the processor is further configured to: obtain obstacle information based on a sensing signal obtained by the LiDAR sensor. However, Lee teaches the sensor comprises a light detection and ranging (LiDAR) sensor (par. 42, sensing unit 130 can be a LiDAR sensor), and wherein the processor is further configured to: obtain obstacle information based on a sensing signal obtained by the LiDAR sensor (par. 51, “When moving the slave robot 200 on the basis of the space map, the control module 190 may control the input unit 120 or the sensing unit 130 to monitor movement of the slave robot 200. Accordingly, it is possible to intuitively recognize the slave robot 200 by the master robot 100”).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to incorporate the teachings of Lee to make the sensor a LiDAR sensor in order to monitor movement of the external robot. LiDAR sensors are already well-known in the field and would have been a trivial change.

Regarding claim 18, the combination of Haegermarck in view of Kwak teaches the method of claim 12.
Haegermarck fails to teach the obtaining position information of the external robot comprises: obtaining the position information of the external robot based on a sensing signal obtained by a light detection and ranging (LiDAR) sensor and the time at which the at least one echo signal is received from the external robot. However, Kwak teaches the obtaining position information of the external robot comprises: obtaining the position information of the external robot based on a sensing signal (column 21 line 22, “The second signal may include delay time (t_reply) information which is calculated based on a time at which the first mobile robot 100a has received the first signal and a time at which the first mobile terminal 100a has output the second signal”; see column 21 lines 9-26).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to further incorporate the teachings of Kwak in order to determine the relative positions of both robots even when there is an obstacle in the way. Doing so would increase accuracy (column 20 line 42).

Both Haegermarck and Kwak fail to teach obtaining the position information of the external robot based on a sensing signal obtained by a light detection and ranging (LiDAR) sensor. However, Lee teaches obtaining the position information of the external robot based on a sensing signal (par. 51, “When moving the slave robot 200 on the basis of the space map, the control module 190 may control the input unit 120 or the sensing unit 130 to monitor movement of the slave robot 200. Accordingly, it is possible to intuitively recognize the slave robot 200 by the master robot 100”) obtained by a light detection and ranging (LiDAR) sensor (par. 42, sensing unit 130 can be a LiDAR sensor).
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to incorporate the teachings of Lee to make the sensor a LiDAR sensor in order to monitor movement of the external robot. LiDAR sensors are already well-known in the field and would have been a trivial change.

Regarding claim 19, the combination of Haegermarck in view of Kwak teaches the method of claim 12. Haegermarck further teaches obtaining obstacle information based on a sensing signal (column 9 line 3, “the controller of the master robot 10 uses positional data derived from the obstacle detection device to position itself with respect to the surroundings, which also includes positioning the master robotic device 10 in relation to the slave robotic device 30 (being an “obstacle” in the surroundings of the master robot 10)”—the communication interface is moved along with the robot). Both Haegermarck and Kwak fail to teach a sensing signal obtained by a LiDAR sensor. However, Lee teaches a sensing signal obtained by a LiDAR sensor (par. 42, sensing unit 130 can be a LiDAR sensor). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to incorporate the teachings of Lee to make the sensor a LiDAR sensor in order to monitor movement of the external robot. LiDAR sensors are already well-known in the field and would have been a trivial change.

Claim(s) 9 and 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Haegermarck in view of Kwak, and further in view of Gu (US 20180192845 A1).

Regarding claim 9, the combination of Haegermarck in view of Kwak teaches the robot of claim 1.
Both Haegermarck and Kwak fail to teach a storage space configured to accommodate the external robot, wherein the processor is further configured to: control, based on work by the external robot being identified as necessary, an output of the external robot from the storage space, plan, based on the work by the external robot being identified as completed, a moving path of the external robot based on the pose of the external robot, and control the operation state of the external robot to accommodate the external robot in the storage space based on the moving path.

However, Gu teaches a storage space configured to accommodate the external robot (par. 40, chamber 15 for holding at least one secondary robot 2), wherein the processor is further configured to: control, based on work by the external robot being identified as necessary (par. 42, "Some of the first sensors are used to identify a target zone that the primary robot may not be able to enter by measuring its dimension (height, length, width, or depth) in comparison with those parameters of the primary robot so that a smaller-size secondary robot may be assigned"), an output of the external robot from the storage space (Fig. 9, secondary robot is released from the primary robot), plan, based on the work by the external robot being identified as completed, a moving path of the external robot based on the pose of the external robot (par. 51, "the second cleaning task is considered to be at least partially finished and the second controller 20 will also generate a signal representing the third command and transmit the signal wirelessly via the second communication module 22 to the first communication module 12 and to the first controller 10. The third command requests to move the secondary robot 2 back to the chamber of the primary robot 1”—although in this case the secondary robot controls itself, if combined with the combination of Haegermarck in view of Kwak, one of ordinary skill in the art would be able to recognize the secondary robot could be controlled by the primary robot instead), and control the operation state of the external robot to accommodate the external robot in the storage space based on the moving path (par. 51, “The third command requests to move the secondary robot 2 back to the chamber of the primary robot 1”).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to incorporate the teachings of Gu so that areas the robot cannot fit into can still be cleaned (par. 45, “But for some unusual target zones (with relative low in height, narrow in width, or with special shapes) where the primary robot is unable to enter to perform the first cleaning task, the smaller secondary robot 2 can be used to perform a second cleaning task inside each of those unusual target zones”).

Regarding claim 20, the combination of Haegermarck in view of Kwak teaches the method of claim 12. Both Haegermarck and Kwak fail to teach the method further comprises: controlling, based on work by the external robot being identified as necessary, an output of the external robot from a storage space, planning, based on the work by the external robot being identified as completed, a moving path of the external robot based on the pose of the external robot, and controlling an operation state of the external robot to accommodate the external robot in the storage space based on the moving path. However, Gu teaches the method further comprises: controlling, based on work by the external robot being identified as necessary (par.
42, "Some of the first sensors are used to identify a target zone that the primary robot may not be able to enter by measuring its dimension (height, length, width, or depth) in comparison with those parameters of the primary robot so that a smaller-size secondary robot may be assigned"), an output of the external robot from a storage space (Fig. 9, secondary robot is released from the primary robot), planning, based on the work by the external robot being identified as completed, a moving path of the external robot based on the pose of the external robot (par. 51, "the second cleaning task is considered to be at least partially finished and the second controller 20 will also generate a signal representing the third command and transmit the signal wirelessly via the second communication module 22 to the first communication module 12 and to the first controller 10. The third command requests to move the secondary robot 2 back to the chamber of the primary robot 1”—although in this case the secondary robot controls itself, if combined with the combination of Haegermarck in view of Kwak, one of ordinary skill in the art would be able to recognize the secondary robot could be controlled by the primary robot instead), and controlling an operation state of the external robot to accommodate the external robot in the storage space based on the moving path (par. 51, “The third command requests to move the secondary robot 2 back to the chamber of the primary robot 1”). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Haegermarck in view of Kwak to incorporate the teachings of Gu so that places the robot can’t fit can still be cleaned (par. 
45, “But for some unusual target zones (with relative low in height, narrow in width, or with special shapes) where the primary robot is unable to enter to perform the first cleaning task, the smaller secondary robot 2 can be used to perform a second cleaning task inside each of those unusual target zones”). Claim(s) 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Gu in view of Kwak and Haegermarck. Gu teaches a system comprising: a first robot (Fig. 3, primary robot 1); and a second robot (secondary robot 2) which is accommodated in a storage space of the first robot (chamber 15), (Fig. 9, secondary robot is released from the primary robot) based on work by the second robot being identified as necessary (par. 42, "Some of the first sensors are used to identify a target zone that the primary robot may not be able to enter by measuring its dimension (height, length, width, or depth) in comparison with those parameters of the primary robot so that a smaller-size secondary robot may be assigned"), transmit, based on the work by the second robot being identified as completed, a control signal for accommodating the second robot in the storage space to the second robot through the communication interface (par. 51, "the second cleaning task is considered to be at least partially finished and the second controller 20 will also generate a signal representing the third command and transmit the signal wirelessly via the second communication module 22 to the first communication module 12 and to the first controller 10. 
The third command requests to move the secondary robot 2 back to the chamber of the primary robot 1”—although in this case the secondary robot controls itself, if combined with the combination of Haegermarck in view of Kwak, one of ordinary skill in the art would be able to recognize the secondary robot could be controlled by the primary robot instead), output a sensing signal for sensing a distance with the second robot (par. 42, “Some first sensors are used to determine current location of the primary robot 1 and also used to determine current location of the secondary robot 2”).

Gu fails to teach the second robot comprises a plurality of sensors configured to output echo signals of different types by being disposed at different positions, identify, based on an error occurring in communication with the second robot through the communication interface, positions of the respective sensors, which output a plurality of echo signals from among the plurality of sensors disposed in the second robot, based on the types of the plurality of echo signals received from the second robot, identify a pose of the second robot based on the positions of the plurality of sensors. However, Kwak teaches the second robot (Fig.
5C, second autonomous mobile robot 100b) comprises a plurality of sensors configured to output echo signals of different types by being disposed at different positions (column 21 line 22, “The second signal may include delay time (t_reply) information which is calculated based on a time at which the first mobile robot 100a has received the first signal and a time at which the first mobile terminal 100a has output the second signal”; see column 21 lines 9-26), identify, based on an error occurring in communication with the second robot through the communication interface (column 20 line 4, “On the other hand, when the IR sensor is used, if an obstacle is present between the first mobile robot 100a and the second mobile robot 100b, the reception of the laser light is interrupted, and the relative positions of the first and second mobile robots cannot accurately be recognized”), positions of the respective sensors, which output a plurality of echo signals from among the plurality of sensors disposed in the second robot, based on the types of the plurality of echo signals received from the second robot, identify a pose of the second robot based on the positions of the plurality of sensors (column 20 line 9, “To solve this problem, as illustrated in FIGS. 6A and 6B, the present invention can measure the relative positions of the first mobile robot and the second mobile robot by using UWB modules instead of the transmitting/receiving IR sensors”; column 22 line 34, “The present invention can calculate the relative positions (spatial coordinates) of the first mobile robot 100a and the second mobile robot 100b using the plurality of UWB anchors. The triangulation described in FIG. 6B will be equally/similarly applied to calculating the relative positions of the first mobile robot and the second mobile robot using three UWB anchors and one UWB tag”). 
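The triangulation the quoted passage refers to (three UWB anchors and one UWB tag) is classic trilateration: three anchor-to-tag range measurements pin down the tag's planar coordinates by intersecting circles, which reduces to a small linear system once pairs of circle equations are subtracted. The sketch below is an editor's illustration of that geometry under noise-free assumptions; it is not code from, or a description of, Kwak's actual implementation.

```python
def trilaterate(anchors, dists):
    """Solve for the tag position (x, y) from three anchor positions and
    measured anchor-to-tag distances, by subtracting pairs of circle
    equations to obtain a 2x2 linear system (exact for noise-free ranges).
    Illustrative sketch only."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Linear equations from (circle 1 - circle 2) and (circle 1 - circle 3).
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1 ** 2 - d3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a1 * b2 - a2 * b1  # zero when the three anchors are collinear
    if det == 0:
        raise ValueError("anchors must not be collinear")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

For instance, anchors at (0, 0), (4, 0) and (0, 3) with ranges measured to a tag at (1, 1) recover exactly that point; real UWB ranges are noisy, so practical systems solve the same system in a least-squares sense.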
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Gu to incorporate the teachings of Kwak in order to determine the relative positions of both robots even when there is an obstacle in the way. Doing so would increase accuracy (column 20 line 42).

Both Kwak and Gu fail to teach identify a target position of the first robot based on the pose of the second robot and based on map data, and move to the target position. However, Haegermarck teaches identify a target position of the first robot based on the pose of the second robot and based on map data, and move to the target position (column 9 line 3, “the controller of the master robot 10 uses positional data derived from the obstacle detection device to position itself with respect to the surroundings, which also includes positioning the master robotic device 10 in relation to the slave robotic device 30 (being an “obstacle” in the surroundings of the master robot 10)”).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Gu in view of Kwak to incorporate the teachings of Haegermarck. As Haegermarck states, the second vehicle would be considered an “obstacle” to the first robot, and it is therefore necessary to monitor the position of the obstacle in order to avoid it.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MINATO LEE HORNER whose telephone number is (571)272-5425. The examiner can normally be reached M-F 8-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christian Chace, can be reached at (571) 272-4190. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/M.L.H./
Examiner, Art Unit 3665

/CHRISTIAN CHACE/
Supervisory Patent Examiner, Art Unit 3665

Prosecution Timeline

Jan 16, 2024 - Application Filed
Jan 29, 2026 - Non-Final Rejection (§103)
Mar 16, 2026 - Interview Requested

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593748 - AUTONOMOUS MACHINE HAVING VISION SYSTEM FOR NAVIGATION AND METHOD OF USING SAME (granted Apr 07, 2026; 2y 5m to grant)
Patent 12567332 - METHOD OF COLLISION POINT CALCULATION AND EMERGENCY BRAKE ASSIST DECELERATION BASED ON THE METHOD OF COLLISION POINT CALCULATION (granted Mar 03, 2026; 2y 5m to grant)
Patent 12545149 - VR-BASED SEAT CONTROL APPARATUS AND METHOD FOR VEHICLE (granted Feb 10, 2026; 2y 5m to grant)
Patent 12485815 - PATTERN-BASED INTELLIGENT PERSONALIZED CHOREOGRAPHY FOR SOFTWARE-DEFINED VEHICLE (granted Dec 02, 2025; 2y 5m to grant)
Based on this examiner's 4 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 80%
Grant Probability With Interview: 99% (+25.0%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 10 resolved cases by this examiner. Grant probability derived from career allow rate.
