Prosecution Insights
Last updated: April 19, 2026
Application No. 18/973,870

VIRTUAL ROBOT BASE

Non-Final OA (§102, §103)
Filed: Dec 09, 2024
Examiner: CAIN, AARON G
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Fastbrick IP Pty Ltd
OA Round: 1 (Non-Final)
Grant Probability: 40% (Moderate)
Predicted OA Rounds: 1-2
Estimated Time to Grant: 3y 3m
Grant Probability with Interview: 66%

Examiner Intelligence

Career Allow Rate: 40% (grants 40% of resolved cases; 52 granted / 130 resolved; -12.0% vs TC avg)
Interview Lift: +26.1% (strong lift among resolved cases with vs. without interview)
Typical Timeline: 3y 3m average prosecution; 42 applications currently pending
Career History: 172 total applications across all art units

Statute-Specific Performance

§101: 4.3% (-35.7% vs TC avg)
§103: 57.4% (+17.4% vs TC avg)
§102: 19.7% (-20.3% vs TC avg)
§112: 17.7% (-22.3% vs TC avg)
Tech Center averages are estimates. Based on career data from 130 resolved cases.
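A minimal sketch of the arithmetic behind the figures above. The inputs (52 granted of 130 resolved, 66% with interview) come from this page; the lift formula (allowance rate with interview minus rate without) and the 52% Tech Center baseline (inferred from the -12.0% delta) are assumptions, not stated facts.

```python
# Reproduce the examiner-statistics arithmetic shown on this page.
# Assumption: "interview lift" = allow rate with interview minus rate without.

def pct(x: float) -> float:
    """Express a fraction as a percentage rounded to one decimal place."""
    return round(100 * x, 1)

granted, resolved = 52, 130
career_allow_rate = pct(granted / resolved)        # 40.0

# TC-average baseline inferred from the stated -12.0% delta (assumption).
tc_avg_allow_rate = 52.0
delta_vs_tc = round(career_allow_rate - tc_avg_allow_rate, 1)   # -12.0

# Underlying with-interview rate assumed to be 66.1% (page shows "66%").
rate_with_interview = 66.1
interview_lift = round(rate_with_interview - career_allow_rate, 1)  # 26.1

print(career_allow_rate, delta_vs_tc, interview_lift)
```

The same per-statute percentages (§101, §102, §103, §112) would be computed the same way over the subset of resolved cases in which each rejection type was raised.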

Office Action

Rejections: §102, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

The Office Action is in response to the amendment filed 07/17/2025. Claims 1-15, 18-20, 30, 32 are presently pending and are presented for examination.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 08/20/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1-14, 30, and 32 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kim et al. US 20170173796 A1 (“Kim”).

Regarding Claim 1. Kim teaches a system for performing interactions within a physical environment, the system including: a) a robot base; b) a robot base actuator that moves the robot base relative to the environment; c) a robot arm mounted to the robot base, the robot arm including an end effector mounted thereon (a transfer robot comprising a robot main body, a driving unit configured to move the robot main body toward a stage, and a manipulation unit configured to pick up a target object disposed on the stage [Claim 1]. Kim also shows in FIG.
13 how the hand of the robot is attached to a robotic arm at numeral 610, which includes a first hinge at numeral 615 that is visibly connected to a base, wherein the arm of the robot is attached to the first hinge at numeral 615 and a robot hand at 620, which acts as a head mounted to the boom and the boom is mounted to a boom base); d) a tracking system that measures a tracking target position indicative of a position of a target relative to the environment, the target being mounted on the robot base (the control unit of the robot main body may be configured to receive obstacle-sensing information I1 from the obstacle-sensing unit, distance information I2 from a distance sensor unit, and to receive the first image information I3 from the first image acquisition unit, target object information I4 from a target object-sensing unit, and second image information from the second image acquisition unit. The control unit may use the received information to control the driving unit and manipulation unit to control the position and movement of the robot base and arm [paragraph 51], which means the control unit has a tracking system. The robot body may be moved toward one of a plurality of stages by the driving unit, based on the information I1-I5 [paragraph 52]. For example, when information on a target position “C” adjacent to the stage is input by a user, the robot body may be moved to the target position C by the driving unit. In one embodiment, the target position C can be located at the center of the image sensor on the first image acquisition unit, as shown in FIG. 8 [paragraph 58], which means that the target position can be mounted on the robot base. In sum, Kim teaches a tracking system that measures a tracking target position indicative of a position of a target relative to the environment, wherein the target may be mounted on the robot base. Alternatively, FIG.
1 shows a position labeled 200 which refers to an obstacle-sensing unit which, in at least one embodiment, may include a 2D laser scanner or a laser range finder, which is configured to measure a horizontal distance to an object (e.g., the obstacle O). The laser sensor may be configured to emit a laser beam in the driving direction of the body unit 100, to receive the laser beam reflected by the obstacle O, and to obtain information (hereinafter, obstacle-sensing information “I1”) regarding the presence or absence of the obstacle O or position of the obstacle O [paragraph 37]. This makes the obstacle-sensing unit at 200 a literal target of its own laser used to measure a position of the target relative to the environment, mounted on the robot base); and, e) a control system that: i) acquires an indication of an end effector destination defined relative to an environment coordinate system; ii) determines a tracking target position at least in part using signals from the tracking system (the transfer robot may further include a target object-sensing unit shown at numeral 500 of FIG. 3. The target object-sensing unit may be configured to detect a target object provided on the stage mentioned in claim 1, and may include a detection sensor to detect the object [paragraph 49]. In some embodiments, the detection sensor may include a 2D laser scanner or a laser range finder, which is configured to measure a horizontal distance to an object [paragraph 49]. Additionally, a scan unit at numeral 520 of FIG. 12 is further configured to rotate about the Z-axis to form the scan region S of FIG. 12, and can control the detection sensor so that the detection sensor can detect the target object in a scan region “S” (e.g., see FIG. 12). The scan region S may be a three-dimensional region defined by X, Y, and Z-axes.
The scan unit 520 may be configured to control a position of the detection sensor 510, and thus, it is possible for the detection sensor 510 to scan the target object in the scan region S. The target object-sensing unit may obtain target object position information “I4” on the target object using the detection sensor. Kim also teaches that the robot arm has a robotic hand at numeral 620 of FIG. 1. In claim 10, Kim teaches that the hand is part of the manipulation unit and is configured to grasp a target object, while the arm is configured to change a position of the hand, wherein the control unit is configured to calculate a grasping position, allowing the robot hand to grasp the target object, and to control the robot arm to place the robot hand at the grasping position, and the X-coordinate, the Y-coordinate and the Z-coordinate of the target object is used to calculate the grasping position [Claim 10]. The manipulation unit is shown in FIG. 3 to be controlled by the control unit at numeral 800, and this means that the control unit controls the arm with an indication of an end effector destination (the target object) relative to an environment coordinate system, after determining a tracking target position at least in part using signals from the tracking system); iii) determines a virtual robot base position offset from the robot base and defined at least partially in accordance with an end effector position (Kim teaches a transfer robot with a control unit configured to obtain an X-coordinate and a Z-coordinate of a reference point of a first mark from a first image information and to control the driving unit to allow at least one of the X-coordinate and the Z-coordinate to coincide with a predetermined one of the reference coordinates [Claim 6], which means that the system determines the virtual robot base position in a coordinate system, where the virtual robot base position is the predetermined reference coordinate or coordinates.
Also of note, the control unit is configured to optimize a driving path “P” of the transfer robot based on information on a position of a particular stage from one or more stages, which may be previously prepared using a mapping method [paragraph 55]. For example, the control unit may optimize the shortest distance between the robot body and the stage, meaning that the control unit is using the robot main body at numeral 100 as a point of origin for the path P, and this means that the control unit generates a virtual robot base position in the coordinate system to plot out a path. FIGS. 1-8 show that, as a result of the movement of the robot main body along the driving path P, the main body may be positioned at the target position “C” adjacent to the stage at numeral 20, meaning that the target position “C” may be a target position mounted on the robot base. In one embodiment, the center of the image sensor on the first image acquisition unit will coincide with the coordinates for the target position C [paragraph 58]. Here, the target position C may be selected to allow a relative distance “D” between the body unit and the stage to be substantially equal to a predetermined distance, meaning that the control system determines the virtual robot base position using the target position by transforming the target position to the virtual robot base position, which is depicted more clearly in FIG. 8 where C denotes target position (robot base position). Kim does not explicitly say that the coordinate system of the target object is a different coordinate system than the environmental coordinate system in which the robot navigates.
However, it would have been obvious to one of ordinary skill in the art to have the coordinate systems be separate under the KSR principle of this being an obvious combination of existing, known parts (having a coordinate system for the environment, a coordinate system for the manipulator, separate coordinate systems) to produce a result with a predictable chance of success (a manipulator coordinate system and a separate environmental coordinate system)); iv) calculates a robot base path extending from the virtual robot base position to the end effector destination; v) generates robot base control signals based on the robot base path; and, vi) applies the robot base control signals to the robot base actuator to cause the robot base to be moved along the robot base path (FIG. 6 shows a virtual path labeled “P” that the robot follows to reach a virtual end position marked by one of the stages at numeral 20 [FIG. 6]. The control unit for the robot may move the robot to target position C [FIGS. 7 and 8], adjacent to the stage, by the driving unit. In FIG. 8, target position C is located on the robot itself, meaning that the system can determine a virtual robot base position offset from the robot base and define the position at least partially in accordance with the robot arm position (end effector position) in paragraph 58, where Kim teaches that the robot main body is moved to the stage to pick up the target object using the manipulation unit. The control unit then generates commands to the driving unit to move the robot toward the stage along the driving path [paragraph 55], which reads on generating robot base control signals based on the robot base path and applying the control signals to the robot base actuator to cause the robot base to be moved along the robot base path.
Also of note, the control unit is configured to optimize a driving path “P” of the transfer robot based on information on a position of a particular stage from one or more stages, which may be previously prepared using a mapping method [paragraph 55]. For example, the control unit may optimize the shortest distance between the robot body and the stage, meaning that the control unit is using the robot main body at numeral 100 as a point of origin for the path P, and this means that the control unit generates a virtual robot base position in the coordinate system to plot out a path to the virtual base position, generates control signals to the robot base actuator, and causes the robot to move along the path shown in FIGS. 5 and 6).

Regarding Claim 2. Kim teaches a system according to claim 1. Kim also teaches: wherein the virtual robot base position is coincident with a reference end effector position, the reference end effector position being at least one of: a) an operative position indicative of a position of the end effector when performing an interaction in the environment; b) a pre-operative position indicative of a position of the end effector prior to commencing an interaction in the environment; and, c) a default position indicative of a position of the end effector following performing an interaction in the environment (Kim shows in FIG. 6 a virtual path labeled “P” that the robot follows to reach a virtual end position marked by one of the stages at numeral 20 [FIG. 6]. The control unit for the robot may move the robot to target position C [FIGS. 7 and 8], adjacent to the stage, by the driving unit. In FIG.
8, target position C is located on the robot itself, meaning that the system can determine a virtual robot base position offset from the robot base and define the position at least partially in accordance with the robot arm position (end effector position) in paragraph 58, where Kim teaches that the robot main body is moved to the stage to pick up the target object using the manipulation unit. The control unit then generates commands to the driving unit to move the robot toward the stage along the driving path [paragraph 55], which means that the virtual base position is coincident with a reference end effector position, and the reference end effector position can be an operative position indicative of a position of the end effector when performing an interaction in the environment).

Regarding Claim 3. Kim teaches a system according to claim 1. Kim also teaches: wherein the robot base actuator includes a boom mounted to a boom base, and the robot base includes a head mounted to the boom (FIG. 13 shows how the hand of the robot is attached to a robotic arm at numeral 610, which includes a first hinge at numeral 615 that is visibly connected to a base, wherein the arm of the robot is attached to the first hinge at numeral 615 and a robot hand at 620, which acts as a head mounted to the boom and the boom is mounted to a boom base).

Regarding Claim 4. Kim teaches a system according to claim 3. Kim also teaches: wherein the boom is attached to a vehicle (FIG. 13 shows that the robot is a vehicle, labeled 100 as the main body unit), and wherein the virtual robot base position is offset from the robot base and defined at least partially in accordance with an end effector position to allow the vehicle to be provided in different positions relative to the environment (FIG. 13 shows that the robot is a vehicle, labeled 100 as the main body unit. FIG. 14 shows the grip recesses 31a and 31b, which are the target location for the fingers at numeral 622 of the robot hand.
The control unit controls the fingers of the robot hand to grip these recesses at step S25 of FIG. 11 [paragraph 90]. The target object-sensing unit may obtain target object position information “I4” on the target object using the detection sensor [paragraph 49]. The target object recesses are connected to the target object at 30, and are positioned on a stage at numeral 20 [paragraph 13]. Target object position information I4 is received by the control unit at 800 and may be used with other received information to control the driving unit and the manipulation unit [paragraph 51]. Claim 10 teaches that the manipulation unit comprises a robot hand configured to grasp the target object; and a robot arm connected to the robot hand, and configured to change a position of the robot hand, wherein the control unit is configured to calculate a grasping position, allowing the robot hand to grasp the target object, and to control the robot arm to place the robot hand at the grasping position, and the X-coordinate, the Y-coordinate and the Z-coordinate of the target object is used to calculate the grasping position. This means that the virtual robot base position is defined at least partially in accordance with an end effector position in which the hand will grasp the target object. Additionally, as shown in FIG. 6, multiple stages where the target object will be located are depicted at numeral 20, which means that multiple end effector positions exist, allowing the vehicle to be provided in different positions relative to the environment).

Regarding Claim 5. Kim teaches a system according to claim 1. Kim also teaches: wherein the control system: a) determines the virtual robot base position in a robot base actuator coordinate system (FIGS.
1-8 show that, as a result of the movement of the robot main body along the driving path P, the main body may be positioned at the target position “C” adjacent to the stage at numeral 20, meaning that the target position “C” may be a target position mounted on the robot base. In one embodiment, the center of the image sensor on the first image acquisition unit will coincide with the coordinates for the target position C [paragraph 58]. Here, the target position C may be selected to allow a relative distance “D” between the body unit and the stage to be substantially equal to a predetermined distance, meaning that the control system determines the virtual robot base position using the target position by transforming the target position to the virtual robot base position, which is depicted more clearly in FIG. 8 where C denotes target position (robot base position). Kim does not explicitly say that the coordinate system of the target object is a different coordinate system than the environmental coordinate system in which the robot navigates. However, it would have been obvious to one of ordinary skill in the art to have the coordinate systems be separate under the KSR principle of this being an obvious combination of existing, known parts (having a coordinate system for the environment, a coordinate system for the manipulator, separate coordinate systems) to produce a result with a predictable chance of success (a manipulator coordinate system and a separate environmental coordinate system)); b) transforms the end effector destination into the robot base actuator coordinate system using the tracking target position (The robot further comprises a target object-sensing unit, configured to detect the target object disposed in a scan region and to obtain an X-coordinate, a Y-coordinate and a Z-coordinate of the target object [Claim 8].
The manipulation unit comprises a robot hand configured to grasp the target object; and a robot arm connected to the robot hand, and configured to change a position of the robot hand, wherein the control unit is configured to calculate a grasping position, allowing the robot hand to grasp the target object, and to control the robot arm to place the robot hand at the grasping position, and the X-coordinate, the Y-coordinate and the Z-coordinate of the target object is used to calculate the grasping position [Claim 10], which means that the end effector destination is transformed into the robot coordinate system using the target position); and c) calculates the path in the robot base actuator coordinate system (FIG. 6 shows a virtual path labeled “P” that the robot follows to reach a virtual end position marked by one of the stages at numeral 20 [FIG. 6]. The control unit for the robot may move the robot to target position C [FIGS. 7 and 8], adjacent to the stage, by the driving unit. In FIG. 8, target position C is located on the robot itself, meaning that the system can determine a virtual robot base position offset from the robot base and define the position at least partially in accordance with the robot arm position (end effector position) in paragraph 58, where Kim teaches that the robot main body is moved to the stage to pick up the target object using the manipulation unit. The control unit then generates commands to the driving unit to move the robot toward the stage along the driving path [paragraph 55], which reads on generating robot base control signals based on the robot base path and applying the control signals to the robot base actuator to cause the robot base to be moved along the robot base path. Also of note, the control unit is configured to optimize a driving path “P” of the transfer robot based on information on a position of a particular stage from one or more stages, which may be previously prepared using a mapping method [paragraph 55].
For example, the control unit may optimize the shortest distance between the robot body and the stage, meaning that the control unit is using the robot main body at numeral 100 as a point of origin for the path P, and this means that the control unit generates a virtual robot base position in the coordinate system to plot out a path to the virtual base position, generates control signals to the robot base actuator, and causes the robot to move along the path shown in FIGS. 5 and 6).

Regarding Claim 6. Kim teaches a system according to claim 1. Kim also teaches: wherein the tracking system measures a target position indicative of a position of a target mounted on the robot base (the control unit of the robot main body may be configured to receive obstacle-sensing information I1 from the obstacle-sensing unit, distance information I2 from a distance sensor unit, and to receive the first image information I3 from the first image acquisition unit, target object information I4 from a target object-sensing unit, and second image information from the second image acquisition unit. The control unit may use the received information to control the driving unit and manipulation unit to control the position and movement of the robot base and arm [paragraph 51], which means the control unit has a tracking system. The robot body may be moved toward one of a plurality of stages by the driving unit, based on the information I1-I5 [paragraph 52]. For example, when information on a target position “C” adjacent to the stage is input by a user, the robot body may be moved to the target position C by the driving unit. In one embodiment, the target position C can be located at the center of the image sensor on the first image acquisition unit, as shown in FIG. 8 [paragraph 58], which means that the target position can be mounted on the robot base.
In sum, Kim teaches a tracking system that measures a tracking target position indicative of a position of a target relative to the environment, wherein the target may be mounted on the robot base. Alternatively, FIG. 1 shows a position labeled 200 which refers to an obstacle-sensing unit which, in at least one embodiment, may include a 2D laser scanner or a laser range finder, which is configured to measure a horizontal distance to an object (e.g., the obstacle O). The laser sensor may be configured to emit a laser beam in the driving direction of the body unit 100, to receive the laser beam reflected by the obstacle O, and to obtain information (hereinafter, obstacle-sensing information “I1”) regarding the presence or absence of the obstacle O or position of the obstacle O [paragraph 37]. This makes the obstacle-sensing unit at 200 a literal target of its own laser used to measure a position of the target relative to the environment, mounted on the robot base) and the control system determines the virtual robot base position using the target position by transforming the target position to the virtual robot base position (Kim teaches a transfer robot with a control unit configured to obtain an X-coordinate and a Z-coordinate of a reference point of a first mark from a first image information and to control the driving unit to allow at least one of the X-coordinate and the Z-coordinate to coincide with a predetermined one of the reference coordinates [Claim 6], which means that the system determines the virtual robot base position in a coordinate system, where the virtual robot base position is the predetermined reference coordinate or coordinates. Also of note, the control unit is configured to optimize a driving path “P” of the transfer robot based on information on a position of a particular stage from one or more stages, which may be previously prepared using a mapping method [paragraph 55].
For example, the control unit may optimize the shortest distance between the robot body and the stage, meaning that the control unit is using the robot main body at numeral 100 as a point of origin for the path P, and this means that the control unit generates a virtual robot base position in the coordinate system to plot out a path. FIGS. 1-8 show that, as a result of the movement of the robot main body along the driving path P, the main body may be positioned at the target position “C” adjacent to the stage at numeral 20, meaning that the target position “C” may be a target position mounted on the robot base. In one embodiment, the center of the image sensor on the first image acquisition unit will coincide with the coordinates for the target position C [paragraph 58]. Here, the target position C may be selected to allow a relative distance “D” between the body unit and the stage to be substantially equal to a predetermined distance, meaning that the control system determines the virtual robot base position using the target position by transforming the target position to the virtual robot base position, which is depicted more clearly in FIG. 8 where C denotes target position (robot base position). Kim does not explicitly say that the coordinate system of the target object is a different coordinate system than the environmental coordinate system in which the robot navigates. However, it would have been obvious to one of ordinary skill in the art to have the coordinate systems be separate under the KSR principle of this being an obvious combination of existing, known parts (having a coordinate system for the environment, a coordinate system for the manipulator, separate coordinate systems) to produce a result with a predictable chance of success (a manipulator coordinate system and a separate environmental coordinate system)).

Regarding Claim 7. Kim teaches a system according to claim 1.
Kim also teaches: wherein the control system: a) calculates an end effector path extending to the end effector destination; b) generates robot control signals based on the end effector path; and c) applies the robot control signals to the robot arm to cause the end effector to be moved in accordance with the end effector path (in some embodiments, the method of controlling the transfer robot may include moving a robot hand, which includes a plurality of fingers configured to grasp a target object having grip recesses, to a first position using a robot arm, partially inserting the fingers into the grip recesses, with the robot hand at the first position, elevating the robot hand to a second position higher than the first position, using the robot arm, and further inserting the fingers into the grip recesses with the robot hand at the second position [paragraph 8]. The robot control unit is configured to control the robot hand to partially insert the fingers into the grip recesses, to control the robot arm to elevate the robot hand, and to control the robot hand to further insert the fingers into the grip recesses [Claim 12]. Further, Kim teaches a method of moving a robot hand to a first position using a robot arm, wherein the robot hand comprises a plurality of fingers configured to grasp a target object having grip recesses; partially inserting the fingers into the grip recesses with the robot hand at the first position; elevating the robot hand to a second position higher than the first position, using the robot arm; and further inserting the fingers into the grip recesses with the robot hand at the second position [Claim 13]. This means that a path for the hand (the end effector) is calculated, robot control signals based on the end effector path are generated by the control unit, and the signals are applied to the robot arm to cause the hand to be moved in accordance with the end effector path).

Regarding Claim 8. Kim teaches a system according to claim 7.
Kim also teaches: wherein the control system: a) determines a current robot base position using signals from the tracking system; and b) generates robot control signals based on the end effector path and the current robot base position (a control unit that is configured to optimize a driving path “P” of the robot based on information on a position of a particular stage from one or more stages, which may be previously prepared using a mapping method [paragraph 55]. For example, the control unit may optimize the shortest distance between the robot body and the stage, meaning that the control unit is using the robot main body at numeral 100 as a point of origin for the path P, and this means that the control unit generates a virtual robot base position in the coordinate system to plot out a path).

Regarding Claim 9. Kim teaches a system according to claim 7. Kim also teaches: wherein the control system calculates the end effector path in at least one of: a) the environment coordinate system; and b) the robot base coordinate system (The robot further comprises a target object-sensing unit, configured to detect the target object disposed in a scan region and to obtain an X-coordinate, a Y-coordinate and a Z-coordinate of the target object [Claim 8]. The manipulation unit comprises a robot hand configured to grasp the target object; and a robot arm connected to the robot hand, and configured to change a position of the robot hand, wherein the control unit is configured to calculate a grasping position, allowing the robot hand to grasp the target object, and to control the robot arm to place the robot hand at the grasping position, and the X-coordinate, the Y-coordinate and the Z-coordinate of the target object is used to calculate the grasping position [Claim 10], which means that the end effector destination is transformed into the robot coordinate system using the tracking target position).

Regarding Claim 10. Kim teaches a system according to claim 7.
Kim also teaches: wherein the control system repeatedly: a) calculates a robot base deviation based on the robot base position and an expected robot base position (there may be an error between a rest position of the main body (the robot base position) and the target position C (expected robot base position) [paragraph 58]. To reduce this error, the target position C may be selected to allow a relative distance “D” between the body unit and the stage to be substantially equal to a predetermined distance and moreover to allow a reference point C2 of the first mark at numeral 21 to coincide with at least a portion of a predetermined reference coordinate C1. The control unit may calculate an error value between the reference points C1 and C2 [paragraph 70]. Error values are calculated for both the X- and Y-coordinates [paragraph 70]); b) calculates a correction based on the robot base deviation, the correction being indicative of a path modification; and c) generates control signals in accordance with the correction (the control unit may control the driving unit to move the robot body by the calculated errors in the X- and Z-directions to minimize the errors during a subsequent calculation [paragraph 71]. This means that the control unit calculates the error values, uses those error values as a correction amount indicative of a path modification, and generates control signals in accordance with the correction).

Regarding Claim 11. Kim teaches a system according to claim 7. Kim also teaches: wherein the control system: a) calculates robot arm kinematics using a current robot base position and the end effector path (based on the target object position information I4 obtained by the target object-sensing unit, the control unit may control the robot arm to move the robot hand toward the target object.
For example, the control unit may calculate a grasping position allowing the robot hand to grasp the target object, based on the information on the X, Y, and Z-coordinates of the target object obtained by the target object-sensing unit [paragraph 82]. Further, Kim teaches a method of moving a robot hand to a first position using a robot arm, wherein the robot hand comprises a plurality of fingers configured to grasp a target object having grip recesses; partially inserting the fingers into the grip recesses with the robot hand at the first position; elevating the robot hand to a second position higher than the first position, using the robot arm; and further inserting the fingers into the grip recesses with the robot hand at the second position [Claim 13] (this is where Kim teaches the end effector path applicant describes in claim 7). This means that the robot control system calculates robot arm kinematics using a current position (the position prior to the robot hand grasping the target object) and the end effector path); and b) generates robot control signals based on the end effector path and the calculated robot arm kinematics (the robot arm may be controlled to move the robot hand to the calculated grasping position [paragraph 82]. The grasping position may be expressed in terms of X, Y, and Z-coordinates). Regarding Claim 12. Kim teaches a system according to claim 11. Kim also teaches: wherein the current robot base position is indicative of an origin point of the robot arm kinematics and the robot base position is determined in an environment coordinate system thereby allowing the robot arm to be controlled in the environment coordinate system (the robot arm may be controlled to move the robot hand to the calculated grasping position [paragraph 82]. The grasping position may be expressed in terms of X, Y, and Z-coordinates. 
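The coordinate-frame relationship asserted for claim 12 — the robot base position serving as the origin of the arm kinematics within an environment coordinate system — amounts, in its simplest form, to a frame translation. A minimal sketch follows; the names are hypothetical, and rotation between frames is ignored for brevity.

```python
def to_base_frame(target_env, base_env):
    """Re-express an environment-frame (X, Y, Z) point relative to the
    robot base, so the current base position serves as the origin of the
    arm kinematics. Translation only; rotation is omitted for brevity."""
    return tuple(t - b for t, b in zip(target_env, base_env))
```

For example, a grasping target at (5, 2, 1) in the environment frame, with the base at (3, 1, 0), becomes (2, 1, 1) relative to the base.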
Kim also teaches that the method for controlling the robot comprises detecting the target object and obtaining coordinate information including an X-coordinate, a Y-coordinate, and a Z-coordinate of the target object, using a target object-sensing unit, wherein the first position is calculated from the X-coordinate, the Y-coordinate and the Z-coordinate of the target object [Claim 14]. Kim also teaches in FIGS. 1-8 that, as a result of the movement of the robot main body along the driving path P, the main body may be positioned at the target position “C” adjacent to the stage at numeral 20, meaning that the target position “C” may be a target position mounted on the robot base [paragraph 58]. As shown in FIG. 8, the target position “C” lies on a coordinate plane with X, Y, and Z axes. This means that the robot control unit uses the robot’s current base position as a point of origin for the robot body kinematics, the robot base position is determined in an environment coordinate system, and the arm is likewise controlled in the environment coordinate system). Regarding Claim 13. Kim teaches a system according to claim 7. Kim also teaches: wherein the control system repeatedly: a) calculates the end effector path based on the current robot base position; and b) generates robot control signals based on the end effector path (in some embodiments, the method of controlling the transfer robot may include moving a robot hand, which includes a plurality of fingers configured to grasp a target object having grip recesses, to a first position using a robot arm, partially inserting the fingers into the grip recesses with the robot hand at the first position, elevating the robot hand to a second position higher than the first position, using the robot arm, and further inserting the fingers into the grip recesses with the robot hand at the second position [paragraph 8]. 
The robot control unit is configured to control the robot hand to partially insert the fingers into the grip recesses, to control the robot arm to elevate the robot hand, and to control the robot hand to further insert the fingers into the grip recesses [Claim 12]. Further, Kim teaches a method of moving a robot hand to a first position using a robot arm, wherein the robot hand comprises a plurality of fingers configured to grasp a target object having grip recesses; partially inserting the fingers into the grip recesses with the robot hand at the first position; elevating the robot hand to a second position higher than the first position, using the robot arm; and further inserting the fingers into the grip recesses with the robot hand at the second position [Claim 13]. Additionally, the control unit uses a target position “C” located adjacent to the stage at numeral 20 in FIGS. 1-8, which the main robot body may be positioned at in order to pick up the target object at numeral 30 [paragraph 58]. As shown in FIGS. 1-8, this target position C is in an X, Y, Z, coordinate plane, with the robot base located a distance “D” from the stage upon which the target object is located. This means that a path for the hand (the end effector) between the first and second positions is calculated based on the current robot base position at the time the robot end effector is intended to move to pick up the target object, and control signals based on the end effector path are generated by the control unit). Regarding Claim 14. Kim teaches a system according to claim 7. 
Kim also teaches: wherein the control system calculates the end effector path at least in part using a reference robot base position indicative of at least one of: a) a current robot base position; b) a predicted robot base position based on movement of the robot base from a current robot base position; c) a predicted robot base position based on movement of the robot base along the robot base path; and d) an intended robot base position when end effector reaches the end effector destination (in some embodiments, the method of controlling the transfer robot may include moving a robot hand, which includes a plurality of fingers configured to grasp a target object having grip recesses, to a first position using a robot arm, partially inserting the fingers into the grip recesses, with the robot hand at the first position, elevating the robot hand to a second position higher than the first position, using the robot arm, and further inserting the fingers into the grip recesses with the robot hand at the second position [paragraph 8]. The robot control unit is configured to control the robot hand to partially insert the fingers into the grip recesses, to control the robot arm to elevate the robot hand, and to control the robot hand to further insert the fingers into the grip recesses [Claim 12]. Further, Kim teaches a method of moving a robot hand to a first position using a robot arm, wherein the robot hand comprises a plurality of fingers configured to grasp a target object having grip recesses; partially inserting the fingers into the grip recesses with the robot hand at the first position; elevating the robot hand to a second position higher than the first position, using the robot arm; and further inserting the fingers into the grip recesses with the robot hand at the second position [Claim 13]. Additionally, the control unit uses a target position “C” located adjacent to the stage at numeral 20 in FIGS. 
1-8, which the main robot body may be positioned at in order to pick up the target object at numeral 30 [paragraph 58]. As shown in FIGS. 1-8, this target position C is in an X, Y, Z, coordinate plane, with the robot base located a distance “D” from the stage upon which the target object is located. This means that a path for the hand (the end effector) between the first and second positions is calculated based on a reference robot base position indicative of the current robot base position at the time the robot end effector is intended to move to pick up the target object, and control signals based on the end effector path are generated by the control unit). Regarding Claim 30. Kim teaches a method for performing interactions within a physical environment using system including: a) a robot base; b) a robot base actuator that moves the robot base relative to the environment; c) a robot arm mounted to the robot base, the robot arm including an end effector mounted thereon (a transfer robot comprising a robot main body, a driving unit configured to move the robot main body toward a stage, and a manipulation unit configured to pick up a target object disposed on the stage [Claim 1]. Kim also shows in FIG. 
13 how the hand of the robot is attached to a robotic arm at numeral 610, which includes a first hinge at numeral 615 that is visibly connected to a base, wherein the arm of the robot is attached to the first hinge at numeral 615 and a robot hand at 620, which acts as a head mounted to the boom and the boom is mounted to a boom base); and, d) a tracking system that measures a tracking target position indicative of a position of a target relative to the environment, the target being mounted on the robot base (the main body of the robot control unit may be configured to receive obstacle-sensing information I1 from the obstacle-sensing unit, distance information I2 from a distance sensor unit, and to receive the first image information I3 from the first image acquisition unit, target object information I4 from a target object-sensing unit, and second image information from the second image acquisition unit. The control unit may use the received information to control the driving unit and manipulation unit to control the position and movement of the robot base and arm [paragraph 51], which means the control unit has a tracking system. The robot body may be moved toward one of a plurality of stages by the driving unit, based on the information I1-I5 [paragraph 52]. For example, when information on a target position “C” adjacent to the stage is input by a user, the robot body may be moved to the target position C by the driving unit. In one embodiment, the target position C can be located at the center of the image sensor on the first image acquisition unit, as shown in FIG. 8 [paragraph 58], which means that the target position can be mounted on the robot base. In sum, Kim teaches a tracking system that measures a tracking target position indicative of a position of a target relative to the environment, wherein the target may be mounted on the robot base. Alternatively, FIG. 
1 shows a position labeled 200, which refers to an obstacle-sensing unit that, in at least one embodiment, may include a 2D laser scanner or a laser range finder configured to measure a horizontal distance to an object (e.g., the obstacle O). The laser sensor may be configured to emit a laser beam in the driving direction of the body unit 100, to receive the laser beam reflected by the obstacle O, and to obtain information (hereinafter, obstacle-sensing information “I1”) regarding the presence or absence of the obstacle O or the position of the obstacle O [paragraph 37]. This makes the obstacle-sensing unit at 200 a literal target of its own laser, used to measure a position of the target relative to the environment, mounted on the robot base), and wherein the method includes, in a control system: i) acquiring an indication of an end effector destination defined relative to an environment coordinate system; ii) determining a tracking target position at least in part using signals from the tracking system (the transfer robot may further include the target object-sensing unit shown at numeral 500 of FIG. 3. The target object-sensing unit may be configured to detect a target object provided on the stage mentioned in claim 1, and may include a detection sensor to detect the object [paragraph 49]. In some embodiments, the detection sensor may include a 2D laser scanner or a laser range finder, which is configured to measure a horizontal distance to an object [paragraph 49]. Additionally, a scan unit at numeral 520 of FIG. 12 is further configured to rotate about the Z-axis to form the scan region S of FIG. 12, and can control the detection sensor so that the detection sensor can detect the target object in a scan region “S” (e.g., see FIG. 12). The scan region S may be a three-dimensional region defined by X, Y, and Z-axes. 
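In the planar case, the laser range-finder tracking discussed above reduces to locating a sensed point from the sensor's pose and a measured distance. A hedged sketch follows; the names are hypothetical, as Kim does not disclose the computation itself.

```python
import math

def locate_target(sensor_xy, heading_rad, measured_range):
    """Convert a horizontal range measurement from a base-mounted laser
    sensor into an environment-frame (X, Y) position of the sensed target,
    given the sensor's position and heading in the environment frame."""
    return (sensor_xy[0] + measured_range * math.cos(heading_rad),
            sensor_xy[1] + measured_range * math.sin(heading_rad))
```

Because the sensor is mounted on the robot base, each such measurement also fixes the base's position relative to the sensed target.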
The scan unit 520 may be configured to control a position of the detection sensor 510, and thus, it is possible for the detection sensor 510 to scan the target object in the scan region S. The target object-sensing unit may obtain target object position information “I4” on the target object using the detection sensor. Kim also teaches that the robot arm has a robotic hand at numeral 620 of FIG. 1. In claim 10, Kim teaches that the hand is part of the manipulation unit and is configured to grasp a target object, while the arm is configured to change a position of the hand, wherein the control unit is configured to calculate a grasping position, allowing the robot hand to grasp the target object, and to control the robot arm to place the robot hand at the grasping position, and the X-coordinate, the Y-coordinate and the Z-coordinate of the target object is used to calculate the grasping position [Claim 10]. The manipulation unit is shown in FIG. 3 to be controlled by the control unit at numeral 800, and this means that the control unit controls the arm with an indication of an end effector destination (the target object) relative to an environment coordinate system, after determining a target position at least in part using signals from the tracking system); iii) determining a virtual robot base position offset from the robot base and defined at least partially in accordance with an end effector position (Kim teaches a transfer robot with a control unit configured to obtain an X-coordinate and a Z-coordinate of a reference point of a first mark from first image information and to control the driving unit to allow at least one of the X-coordinate and the Z-coordinate to coincide with a predetermined one of the reference coordinates [Claim 6], which means that the system determines the virtual robot base position in a coordinate system, where the virtual robot base position is the predetermined reference coordinate or coordinates. 
Also of note, the control unit is configured to optimize a driving path “P” of the transfer robot based on information on a position of a particular stage from one or more stages, which may be previously prepared using a mapping method [paragraph 55]. For example, the control unit may optimize the shortest distance between the robot body and the stage, meaning that the control unit is using the robot main body at numeral 100 as a point of origin for the path P, and this means that the control unit generates a virtual robot base position in the coordinate system to plot out a path. FIGS. 1-8 show that, as a result of the movement of the robot main body along the driving path P, the main body may be positioned at the target position “C” adjacent to the stage at numeral 20, meaning that the target position “C” may be a target position mounted on the robot base. In one embodiment, the center of the image sensor on the first image acquisition unit will coincide with the coordinates for the target position C [paragraph 58]. Here, the target position C may be selected to allow a relative distance “D” between the body unit and the stage to be substantially equal to a predetermined distance, meaning that the control system determines the virtual robot base position using the target position by transforming the target position to the virtual robot base position, which is depicted more clearly in FIG. 8, where C denotes the target position (robot base position). Kim does not explicitly say that the coordinate system of the target object is a different coordinate system than the environmental coordinate system in which the robot navigates. 
However, it would have been obvious to one of ordinary skill in the art to have the coordinate systems be separate under the KSR principle of combining existing, known parts (a coordinate system for the environment and a separate coordinate system for the manipulator) to yield predictable results (a manipulator coordinate system and a separate environmental coordinate system)); iv) calculating a robot base path extending from the virtual robot base position to the end effector destination; v) generating robot base control signals based on the robot base path; and, vi) applying the robot base control signals to the robot base actuator to cause the robot base to be moved along the robot base path (FIG. 6 shows a virtual path labeled “P” that the robot follows to reach a virtual end position marked by one of the stages at numeral 20 [FIG. 6]. The control unit may move the robot to target position C [FIGS. 7 and 8], adjacent to the stage, using the driving unit. In FIG. 8, target position C is located on the robot itself, meaning that the system can determine a virtual robot base position offset from the robot base and define the position at least partially in accordance with the robot arm position (end effector position) per paragraph 58, where Kim teaches that the robot main body is moved to the stage to pick up the target object using the manipulation unit. The control unit then generates commands to the driving unit to move the robot toward the stage along the driving path [paragraph 55], which reads on generating robot base control signals based on the robot base path and applying the control signals to the robot base actuator to cause the robot base to be moved along the robot base path. 
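Steps iv) through vi), as mapped onto Kim's driving path P with standoff distance D, can be sketched as straight-line path generation from the base position toward the stage that stops D short of it. This is a minimal sketch under stated assumptions; the names are hypothetical, and Kim's actual path optimization is not disclosed in the record.

```python
import math

def base_path_to_stage(base_xy, stage_xy, standoff_d, step=0.1):
    """Generate a straight-line driving path from the current base
    position toward the stage, terminating at a target position held a
    standoff distance D from the stage (cf. paragraphs 55 and 58)."""
    dx, dy = stage_xy[0] - base_xy[0], stage_xy[1] - base_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= standoff_d:
        return [base_xy]                  # already within distance D
    ux, uy = dx / dist, dy / dist         # unit vector toward the stage
    travel = dist - standoff_d            # stop D short of the stage
    n = max(int(travel / step), 1)
    return [(base_xy[0] + ux * travel * i / n,
             base_xy[1] + uy * travel * i / n) for i in range(n + 1)]
```

The resulting waypoint list would then be consumed by the drive controller, i.e., the "robot base control signals" of step v).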
Also of note, the control unit is configured to optimize a driving path “P” of the transfer robot based on information on a position of a particular stage from one or more stages, which may be previously prepared using a mapping method [paragraph 55]. For example, the control unit may optimize the shortest distance between the robot body and the stage, meaning that the control unit is using the robot main body at numeral 100 as a point of origin for the path P, and this means that the control unit generates a virtual robot base position in the coordinate system to plot out a path to the virtual base position, generates control signals to the robot base actuator, and causes the robot to move along the path shown in FIGS. 5 and 6). Regarding Claim 32. Kim teaches a computer program product including computer executable code, which when executed by a suitably programmed control system causes the control system to control a system for performing interactions within a physical environment. The system components and control-system functions recited in claim 32 are substantially identical to the system recited in claim 1 and the method steps recited in claim 30. Accordingly, claim 32 is rejected as anticipated by Kim for the same reasons set forth above with respect to claims 1 and 30. Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claim(s) 15 is rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. US 20170173796 A1 (“Kim”) as applied to claim 1 above, and further in view of Meuleau US 9405293 B2 (“Meuleau”). Regarding Claim 15. Kim teaches a system according to claim 1. Kim also teaches: wherein the control system: a) acquires an indication of a plurality of end effector destinations including the end effector destination (FIGS. 5 and 6 show a plurality of stages at numeral 20 with marks at numeral 21 in a robot’s workspace. The stages support target objects as shown in FIG. 13. 
The robot control unit is configured to control the robot hand to partially insert the fingers into the grip recesses, to control the robot arm to elevate the robot hand, and to control the robot hand to further insert the fingers into the grip recesses [Claim 12]. FIG. 16 shows this more clearly: the end effector grips the recesses at numerals 31a and 31b, and the end effector destinations are the points at which the robot grips the target objects); b) determines a robot base position at least in part using signals from the tracking system (a control unit that is configured to optimize a driving path “P” of the robot based on information on a position of a particular stage from one or more stages, which may be previously prepared using a mapping method [paragraph 55]. For example, the control unit may optimize the shortest distance between the robot body and the stage, meaning that the control unit is using the robot main body at numeral 100 as a point of origin for the path P, and this means that the control unit generates a virtual robot base position in the coordinate system to plot out a path); c) calculates the robot base path extending from the robot base position in accordance with the end effector destinations (the driving path “P” corresponding to the shortest distance is established from the position information of the stage as the destination, and the position information of the stage may include X- and Y-coordinates [paragraph 55]. This path is intended to allow the robot to reach the stages shown at numeral 20 in FIGS. 
5 and 6 along path P to allow the robotic arm to pick up the target objects [paragraph 58]), d) generates robot base control signals based on the robot base path; and e) applies the robot base control signals to the robot base actuator to cause the robot base to be moved along the robot base path (the driving unit may be controlled by the control unit to allow the robot main body to move toward the stage along the established driving path P [paragraph 55], which reads on generating robot base control signals based on the robot base path and applying the control signals to the robot base actuator to cause the robot base to be moved along the robot base path). Kim does not teach: the robot base path being configured to allow continuous movement of the robot base along the robot base path in accordance with a defined robot base path velocity profile; and the robot moves along a robot base path in accordance with the robot base path velocity profile. However, Meuleau teaches: the robot base path being configured to allow continuous movement of the robot base along the robot base path in accordance with a defined robot base path velocity profile; and the robot moves along a robot base path in accordance with the robot base path velocity profile (a vehicle trajectory optimization method for autonomous vehicles, in which the vehicle includes a trajectory planning system that can obtain information describing a current state of the vehicle and a goal state for the vehicle and, based on this information, determine and optimize a trajectory for the vehicle. The output of the trajectory planning system can be an optimized trajectory that is supplied to one or both of a steering control system and a throttle control system. In one example, the optimized trajectory can be a list of positions (e.g., nodes) along a roadway [Column 4, lines 4-27]. 
The throttle control system can receive a velocity profile from the trajectory planning system or can compute a velocity profile based on information received from the trajectory planning system, such as the vector of control inputs. The throttle control system is operable to output throttle control signals to the engine for regulating the power supplied by the engine and thus regulating the speed of the vehicle [Column 4, lines 44-53]. This allows the robot to follow a path configured to allow continuous movement along the path in accordance with a defined robot base path velocity profile, as the throttle control system can regulate the speed of the vehicle in response to the velocity profile received from the trajectory planning system to maintain a constant speed). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to combine the system of Kim with the robot base path being configured to allow continuous movement of the robot base along the robot base path in accordance with a defined robot base path velocity profile; and the robot moves along a robot base path in accordance with the robot base path velocity profile as taught by Meuleau, so as to allow the moveable robot base of Kim to maintain a constant speed while navigating between target object stages. Claim(s) 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. US 20170173796 A1 (“Kim”) in combination with Meuleau US 9405293 B2 (“Meuleau”) as applied to claim 15 above, and further in view of Letsky US 20120265391 A1 (“Letsky”). Regarding Claim 18. Kim in combination with Meuleau teaches a system according to claim 15. Kim also teaches: wherein the control system: a) monitors end effector interaction; and b) selectively modifies the robot base control signals to cause the robot base to adjust its movement, depending on results of the monitoring (Kim teaches that the target object may have marks at 32a and 32b of FIG.
14, and a second image acquisition unit at numeral 700 may be configured to obtain second image information “I5”, in which images of the second marks are contained [paragraph 50]. The second image information I5 may be transmitted from the second image acquisition unit to the control unit at numeral 800; the second image acquisition unit itself may be provided on the manipulation unit, on the body unit at numeral 100, or even directly on the robot hand at 620. The control unit may be configured to receive the second image information I5 from the second image acquisition unit, and the received information may be used to control the driving unit and the manipulation unit [paragraph 51]. This image acquisition unit monitors the end effector interaction by detecting the marks at 32a and 32b, and the control unit may use the received information to control the driving unit and the manipulation unit). Kim does not teach: wherein the control system: b) selectively modifies the robot base control signals to cause the robot base to move at a robot base velocity below the robot base path velocity profile, depending on results of the monitoring. However, Letsky teaches: wherein the control system: b) selectively modifies the robot base control signals to cause the robot base to move at a robot base velocity below the robot base path velocity profile, depending on results of the monitoring (Letsky teaches an autonomous robot with a CPU that is configured to move the autonomous robot at a first velocity when the robot is within the area of confinement, and the first velocity may be a constant speed [paragraph 169], meaning the first velocity can serve as a velocity profile.
As the robot approaches the perimeter of the area of confinement and the autonomous robot is less than or equal to a first distance from the perimeter of the area of confinement, the CPU will automatically reduce the speed of the autonomous robot to a second velocity that is less than the first velocity [paragraph 169], meaning that the control system of Letsky selectively modifies the robot base control signals to cause the robot base to move at a robot base velocity below the robot base path velocity profile, depending on monitoring results, where the monitoring results in Letsky refer to monitoring the distance between the robot and the perimeter of the area of confinement). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to combine the system of Kim with wherein the control system: b) selectively modifies the robot base control signals to cause the robot base to move at a robot base velocity below the robot base path velocity profile, depending on results of the monitoring as taught by Letsky, so that when the information received from monitoring the end effector interaction of Kim indicates that the robot needs to reduce speed, the control signals applied by the control unit to the driving unit can slow the robot's movement. Regarding Claim 19. Kim in combination with Meuleau teaches a system according to claim 15. Kim also teaches: wherein the robot base path includes an interaction window associated with each end effector destination (Kim teaches that the robot can include a distance sensor unit to obtain information on a distance between the robot main body and a stage, shown in FIG. 8.
This can include a distance “D1” obtained by a first distance sensor at numeral 310 and a second distance “D2” obtained by a second distance sensor, and together they determine whether the robot is positioned adjacent to the stage for the robot arm to perform its tasks, as shown in FIG. 8 [paragraph 60]. The distance between the stage at numeral 20 and the robot body can be for any one of the stages shown in FIG. 6), and wherein as the robot base enters an interaction window, the control system: a) controls the robot arm to commence at least one of: i) interaction; and ii) movement of the end effector along an end effector path to the end effector destination (Kim teaches in FIG. 11 that the robot control unit will control the robot arm to move the robot hand to a grasping position of a target object, using position information of a target object at S22 [FIG. 11], which reads on both an interaction and moving the end effector along an end effector path to the end effector destination); b) monitors interaction (The robot will then take an image of the second mark of the target object and obtain second image information [FIG. 11, S23], and obtain position information of the second mark from the second image information and dispose fingers of the robot hand at positions corresponding to grip recesses, using position information of the second mark [FIG. 11, S24]. This reads on monitoring the interaction with the target object). Kim does not teach: The control system: b) monitors interaction by determining if the interaction will be completed by the time the robot base approaches an exit to an interaction window; and, c) progressively reduces the robot base velocity to ensure the interaction is completed by the time the robot base reaches an exit to the interaction window.
However, Letsky teaches: The control system: b) monitors interaction by determining if the interaction will be completed by the time the robot base approaches an exit to an interaction window; and, c) progressively reduces the robot base velocity to ensure the interaction is completed by the time the robot base reaches an exit to the interaction window (Letsky teaches a robotic lawnmower that can determine areas on a lawn where the grass is thick by calculating an average current value detected by a current sensor and analyzing the average current value in the CPU [paragraphs 183-184]. If the current value is at least 10% greater than the threshold current value that is stored in the robot’s memory device, then thick grass has been detected, which means that the robot control system monitors the interaction between the robot and the grass by determining if the interaction will be completed by the time the robot base exits the area where the grass is thick (an interaction window). As a result of detecting thick grass, the CPU will apply more power to the blade motors at step 605 in FIG. 20 so that the blades at numeral 203 of FIG. 3 can rotate more quickly [paragraph 185]. The CPU will also reduce the velocity of the autonomous robot to ensure that the robot can effectively cut the thick grass, which means that the robot control system reduces the robot base velocity to ensure that the interaction is completed by the time the robot leaves the interaction window).
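For readers mapping the claim language to behavior, the progressive slow-down recited in claim 19 elements b) and c) can be sketched in a few lines. This is a minimal illustration under assumed inputs, with hypothetical function and parameter names; it is not disclosed by Kim, Letsky, or the application itself:

```python
def base_velocity_in_window(
    profile_velocity: float,       # velocity from the robot base path velocity profile (m/s)
    distance_to_exit: float,       # distance remaining to the interaction window exit (m)
    interaction_time_left: float,  # estimated time for the end effector interaction to finish (s)
    min_velocity: float = 0.05,    # assumed floor so the base creeps rather than stalls (m/s)
) -> float:
    """Progressively reduce the robot base velocity so the end effector
    interaction completes before the base reaches the window exit."""
    if interaction_time_left <= 0.0:
        # Interaction already complete: resume the profile velocity.
        return profile_velocity
    # Velocity at which the base would arrive at the exit exactly when
    # the interaction finishes.
    required = distance_to_exit / interaction_time_left
    # Never exceed the profile velocity; never drop below the floor.
    return max(min_velocity, min(profile_velocity, required))
```

For example, with 2 m left to the exit and 4 s of interaction remaining, a base running a 1 m/s profile would be slowed to 0.5 m/s so the interaction finishes at the exit.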
It would have been obvious to one of ordinary skill in the art at the time the invention was filed to combine the system of Kim with the control system: b) monitors interaction by determining if the interaction will be completed by the time the robot base approaches an exit to an interaction window; and, c) progressively reduces the robot base velocity to ensure the interaction is completed by the time the robot base reaches an exit to the interaction window as taught by Letsky so that the robot can ensure that the task is completed before exiting the task completion area. Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. US 20170173796 A1 (“Kim”) in combination with Meuleau US 9405293 B2 (“Meuleau”) as applied to claim 1 above, and further in view of Letsky US 20120265391 A1 (“Letsky”), Krogedal et al. US 20090074979 A1 (“Krogedal”), and Brooks et al. US 20140067121 A1 (“Brooks”). Regarding Claim 20. Kim in combination with Meuleau teaches a system according to claim 1. Kim does not teach: wherein the system includes: a) a first tracking system that measures a robot base position indicative of a position of the robot base relative to the environment; and b) a second tracking system that measures movement of the robot base, and wherein the control system: i) determines the robot base position at least in part using signals from the first tracking system; and ii) in the event of failure of the first tracking system: (1) determines a robot base position using signals from the second tracking system. However, Letsky teaches: wherein the system includes: a) a first tracking system that measures a robot base position indicative of a position of the robot base relative to the environment (a method of defining an area of confinement for an autonomous robot by moving the robot from a first location to a plurality of location points and recording each of the plurality of points within a memory device [paragraph 8].
In another aspect, the central processing unit can track the location of an autonomous robot based on output generated by a location tracking unit during movement of the autonomous robot [paragraph 9], which means the robot features a tracking system that measures a robot base position indicative of a position of the robot base relative to the environment. The CPU also records the robot’s movement and location in an area of confinement); and b) a second tracking system that measures movement of the robot base, and wherein the control system: i) determines the robot base position at least in part using signals from the first tracking system; and ii) in the event of failure of the first tracking system: (1) determines a robot base position using signals from the second tracking system (two sources of navigational measurements can be used to improve navigational accuracy [paragraph 99]. A first example includes differential correction of navigational measurements where one source of navigational measurements transmits corrected information to the second source of navigational measurements using radio signals, such as in the art and practice of differential GPS (DGPS). In the event that the autonomous robot travels outside the defined perimeter and detects that it is being “carried” without a secure pin having been entered, the robot will activate a security alarm and transmit its GPS location (using either the location tracking unit or a GPS unit) back to a docking station [paragraph 115]. The server will e-mail the user with information indicating that the autonomous robot may have been stolen, and will provide the user with the GPS location of the autonomous robot. In some embodiments, the autonomous robot may communicate with neighboring docking stations to transmit security information. In some embodiments, the autonomous robot may communicate with neighboring docking stations to obtain corrected navigational information, i.e., GPS correction data.
This means that this second tracking system measures movement of the robot base by determining the location of the robot using information from the first robot position tracking system, and in the event that the first tracking system fails (due to the robot traveling outside the defined perimeter, possibly because the robot is being stolen), the robot determines its location based on signals from a second tracking system). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to combine the system of Kim with wherein the system includes: a) a first tracking system that measures a robot base position indicative of a position of the robot base relative to the environment; and b) a second tracking system that measures movement of the robot base, and wherein the control system: i) determines the robot base position at least in part using signals from the first tracking system; and ii) in the event of failure of the first tracking system: (1) determines a robot base position using signals from the second tracking system as taught by Letsky so that the robot will be able to track its current location and have a failsafe in case the first tracking system fails due to unforeseen events, such as a robot being stolen. Kim also does not teach: ii) in the event of failure of the first tracking system: the second tracking system can control the robot arm to move at a reduced end effector speed. However, Brooks teaches: ii) in the event of failure of the first tracking system: the second tracking system can control the robot arm to move at a reduced end effector speed (a working robot that includes one or more sensors for monitoring the robot’s environment, including sensors for monitoring the state of the robot itself, including accelerometers or gyroscopes to keep track of the location, orientation, and configuration of its appendages [paragraph 23]. 
The robot includes a failure-response module that may monitor the robot sensors for mutual consistency of their readings, detect any sensor or robot-operation failures, issue a warning, interrupt robot operation, and/or initiate a safe shut-down procedure in case of any failure condition [paragraph 25]. The robot can treat a human entering its work area, a “zone of danger,” as a failure [paragraph 10], but the robot’s safety system may also implement self-consistency checks to monitor its safety sensors and ensure that all sensors are operating correctly; if it detects a failure, the safety system shuts down robot operation [paragraph 11]. To ensure safe shut-down in case of a failure, the robot may passively lower its arms in a slow and safe manner, avoiding any possibility of injury to a person. Additional features to increase safety and operational efficiency might include providing feedback to nearby persons to help them assess how their presence and actions affect robot operation, and thus enable them to adjust their actions to maximize robot performance [paragraph 9]. For example, a person may be alerted of his presence in the danger zone, which results in slowed-down robot operation, and the person may in response leave the danger zone if feasible. This means that the robot can still move after detecting a failure, whether due to a person entering the danger zone or a failure of the sensors that form the robot tracking system, but only at a reduced speed, instead of having to stop controlling the robot arms entirely. FIGS. 2A and 2B illustrate an exemplary robot in accordance with one embodiment of the invention, with two arms at numeral 206, and grippers at 208 which act as end effectors [paragraph 26, FIGS. 2A and 2B], which are lowered when the robot detects a failure, or at the very least are operated at a safe reduced speed).
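The claim 20 failover behavior (first tracking system fails, second takes over, arm slows) can be sketched as follows. The function, the TrackerReading structure, and the 0.25 slow-down factor are all hypothetical illustrations, not disclosures of Letsky or Brooks:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Position = Tuple[float, float]  # (x, y) in the environment coordinate system


@dataclass
class TrackerReading:
    position: Optional[Position]  # None when the tracker has no valid fix
    healthy: bool                 # result of a self-consistency check


def resolve_base_position(
    primary: TrackerReading,
    secondary: TrackerReading,
    nominal_speed: float,
    degraded_scale: float = 0.25,  # assumed slow-down factor on failover
) -> Tuple[Optional[Position], float]:
    """Prefer the first tracking system; on its failure, fall back to the
    second tracking system and command a reduced end effector speed."""
    if primary.healthy and primary.position is not None:
        return primary.position, nominal_speed
    if secondary.healthy and secondary.position is not None:
        # Failover: position from the second tracker, arm slowed for safety.
        return secondary.position, nominal_speed * degraded_scale
    # Both trackers down: no position estimate, stop the arm entirely.
    return None, 0.0
```

The design choice here mirrors the rationale of the combination: the failover keeps the robot localized, while the reduced speed limits the harm a mislocalized arm could do.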
It would have been obvious to one of ordinary skill in the art at the time the invention was filed to combine the system of Kim with ii) in the event of failure of the first tracking system: the second tracking system can control the robot arm to move at a reduced end effector speed as taught by Brooks so that the robot can avoid causing harm to people and objects in the workspace in the event that the first tracking system fails. Kim also does not teach: the second tracking system can control the robot arm to move the end effector along the end effector path at a reduced end effector speed. However, Krogedal teaches: the second tracking system can control the robot arm to move the end effector along the end effector path at a reduced end effector speed (a method for controlling an industrial robot with a manipulator arm for painting a workpiece, where paint is applied to a substantially circular or elliptical area on the surface of the workpiece, with a fixed point on a wrist section that follows a planned path to allow the tool center point to coat the surface of the workpiece [Claim 1]. The fixed wrist point (wrist center point) moves along a planned path at a constant velocity Vprog, as taught in FIG. 4, step 30, but when the wrist approaches a bend in the path, the velocity of the WCP is reduced to allow for deviation in x and y coordinates until the WCP leaves the bend and accelerates back to Vprog [FIG. 4, step 36]. This means that a tracking system for the robot arm controls the robot arm to move the end effector along the end effector path at a reduced end effector speed). 
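Krogedal's bend slow-down can likewise be sketched. Neither the Office Action nor Krogedal gives a formula for how much the wrist center point speed drops in a bend; the cosine taper and the function name below are assumptions chosen for illustration only:

```python
import math


def wrist_speed_at_bend(
    v_prog: float,          # programmed constant velocity on straight segments
    bend_angle_deg: float,  # heading change at the upcoming bend; 0 = straight
    v_min: float,           # lowest speed permitted inside a sharp bend
) -> float:
    """Scale the wrist center point (WCP) speed down as the path bend
    sharpens, returning v_prog on straight segments."""
    angle = max(0.0, min(180.0, bend_angle_deg))
    # Cosine taper: scale is 1.0 at 0 deg (straight) and 0.0 at 180 deg (U-turn).
    scale = 0.5 * (1.0 + math.cos(math.radians(angle)))
    return max(v_min, v_prog * scale)
```

On a straight segment the WCP holds v_prog; in a 90-degree bend this taper halves the speed, and the speed recovers toward v_prog as the WCP leaves the bend, matching the accelerate-back-to-Vprog behavior described above.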
It would have been obvious to one of ordinary skill in the art at the time the invention was filed to combine the system of Kim with the second tracking system can control the robot arm to move the end effector along the end effector path at a reduced end effector speed as taught by Krogedal so that the robot controller will be able to reduce the speed of the arm end effector in the event that the robot detects through one of its tracking systems that the arm needs to reduce effector speed to avoid collision or to perform its task properly. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Telleria et al. US 20180283017 A1 (“Telleria”) teaches a robotic mobile platform that moves independently of an attached robot arm, with a boom and the robot base positioned between the robot arm and the boom. Any inquiry concerning this communication or earlier communications from the examiner should be directed to AARON G CAIN whose telephone number is (571)272-7009. The examiner can normally be reached Monday to Friday, 7:30am - 4:30pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Wade Miles, can be reached at (571) 270-7777. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /AARON G CAIN/Examiner, Art Unit 3656 /WADE MILES/Supervisory Patent Examiner, Art Unit 3656

Prosecution Timeline

Dec 09, 2024 - Application Filed
Mar 10, 2026 - Non-Final Rejection (§102, §103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12573302 - METHOD FOR INFRASTRUCTURE-SUPPORTED ASSISTING OF A MOTOR VEHICLE (granted Mar 10, 2026; 2y 5m to grant)
Patent 12558790 - METHOD AND COMPUTING SYSTEMS FOR PERFORMING OBJECT DETECTION (granted Feb 24, 2026; 2y 5m to grant)
Patent 12552019 - MACHINE LEARNING METHOD AND ROBOT SYSTEM (granted Feb 17, 2026; 2y 5m to grant)
Patent 12544144 - DENTAL ROBOT AND ORAL NAVIGATION METHOD (granted Feb 10, 2026; 2y 5m to grant)
Patent 12541205 - MOVEMENT CONTROL SUPPORT DEVICE AND METHOD (granted Feb 03, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 40%
With Interview: 66% (+26.1%)
Median Time to Grant: 3y 3m
PTA Risk: Low
Based on 130 resolved cases by this examiner. Grant probability derived from career allow rate.
