DETAILED ACTION
This is a non-final Office Action on the merits in response to communications filed by Applicant on February 26, 2026. Claims 1-3, 8, 15-16, 18-21, 24, 30, 36-38, 40, 42, 46, 53, and 112 are currently pending and examined below.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The amendments filed February 26, 2026, have been entered. Claims 1 and 53 are currently amended and pending; claims 2-3, 15-16, 18-21, 24, 30, 36-38, 40, 42, and 46 are original, unamended, and pending; claim 8 is as previously presented and pending; claim 112 is new and currently pending; and claims 4-7, 9-14, 17, 22-23, 25-29, 31-35, 39, 41, 43-45, 47-52, and 54-111 have been canceled.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3, 15-16, 18-21, 24, 38, 42, 53, and 112 are rejected under 35 U.S.C. 103 as being unpatentable over US 11911918 B2 ("Tanaka") in view of US 10081106 B2 ("Rublee") in further view of US 11597084 B2 ("Johnson").
Regarding claim 1, Tanaka teaches a method comprising (Tanaka: Figures 3 and 4):
receiving, by a computing device, first location information for the mobile robot (Tanaka: Column 6 lines 26-31, “This safety detector 115 includes machine detector 121, human detector 122, and safety determiner 123. Machine detector 121 detects, using second range image 134, the position (three-dimensional position) of mobile machine 101.”);
receiving, by the computing device, second location information for a first entity in an environment of the mobile robot (Tanaka: Column 6 lines 26-31, “Human detector 122 detects, using second range image 134, the position (three-dimensional position) of person 104.”);
determining, by the computing device, based, at least in part, on the first location information and the second location information, a first distance between the mobile robot and the first entity in the environment of the mobile robot (Tanaka: Column 6 lines 32-44, “Safety determiner 123 determines safety based on the detected position of mobile machine 101 and the detected position of person 104. Specifically, safety determiner 123 detects the first distance between mobile machine 101 and person 104, based on the position of mobile machine 101 and the position of person 104.”);
determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance (Tanaka: Column 6 lines 32-44, “In addition, when safety determiner 123 determines that the distance between mobile machine 101 and person 104 is unsafe, safety determiner 123 controls operation of mobile machine 101 via machine controller 106.”); and
controlling, by the computing device, the mobile robot to operate according to the one or more operating parameters (Tanaka: Column 6 lines 32-44, “In addition, when safety determiner 123 determines that the distance between mobile machine 101 and person 104 is unsafe, safety determiner 123 controls operation of mobile machine 101 via machine controller 106.”, Column 7 lines 61-67, “When safety determiner 123 determines that the distance between mobile machine 101 and person 104 is unsafe (No in S115), safety determiner 123 controls operation of mobile machine 101 via machine controller 106 (S116). Specifically, safety detector 115 (i) stops mobile machine 101, (ii) slows down mobile machine 101, or (iii) changes (limits) a movable range of mobile machine 101.”).
Tanaka does not teach receiving, by a computing device located on board a mobile robot, first location information for the mobile robot;
receiving, by the computing device, information indicating a hardware configuration of the mobile robot;
determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot.
Rublee, in the same field of endeavor, teaches receiving, by a computing device located on board a mobile robot, first location information for the mobile robot (Rublee: Column 19 lines 25-38, “FIG. 6A is a functional block diagram of computing device 600 (e.g., system, in accordance with an example embodiment. In particular, computing device 600 shown in FIG. 6A can be configured to perform one or more functions of a robot, robotic actor, a special robot, safety-system server 180, a computing device as discussed in the context of FIGS. 5A and 5B, network 614, method 700, and one or more functions related to one or more of scenarios 200, 300, and/or 400. Computing device 600 may include a user interface module 601, a network-communication interface module 602, one or more processors 603, data storage 604, one or more sensors 620, and one or more actuators 630, all of which may be linked together via a system bus, network, or other connection mechanism 605.”, Column 23 lines 25-34, “FIG. 7 is a flowchart of method 700, in accordance with an example embodiment. Method 700 can be executed on a computing device. Example computing devices include, but are not limited to, computing device 600, a safety-system server such as safety-system server 180, a computing device aboard a herein-described robot, robotic actor, robotic device, special robot, and/or other computing device(s). In some embodiments, the computing device can be part of a safety system, such as discussed above in the context of at least FIG. 1.”, Column 23 lines 35-39, “Method 700 can begin at block 710, where the computing device can determine presence information and actor type information about any actors present within a predetermined area of an environment using one or more sensors, such as discussed above regarding as least FIGS. 1-4.”. 
The cited passages clearly teach that the computing device used to implement the method can be onboard the robot and that said computing device is configured to gather position information of the robot.).
Tanaka teaches a method comprising: receiving, by a computing device, first location information for the mobile robot; receiving, by the computing device, second location information for a first entity in an environment of the mobile robot; determining, by the computing device, based, at least in part, on the first location information and the second location information, a first distance between the mobile robot and the first entity in the environment of the mobile robot; determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance; and controlling, by the computing device, the mobile robot to operate according to the one or more operating parameters. Tanaka does not teach receiving, by a computing device located on board a mobile robot, first location information for the mobile robot. Rublee teaches receiving, by a computing device located on board a mobile robot, first location information for the mobile robot. A person of ordinary skill in the art would have had the technological capabilities required to have modified the method taught in Tanaka with receiving, by a computing device located on board a mobile robot, first location information for the mobile robot, as taught in Rublee. Furthermore, even though the process outlined in the method taught in Tanaka is not explicitly stated as being performed on a computing device mounted on board the robot, Tanaka still teaches a machine controller for controlling the robot. As such, Tanaka could easily be modified to perform the steps of the method by using a computing device on board a robot as taught in Rublee. Such a modification would only require changing the location of the computing device used to perform the method, which would not change or introduce new functionality. No inventive effort would have been required.
The combination would have yielded the predictable result of a method comprising: receiving, by a computing device located on board a mobile robot, first location information for the mobile robot.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the method taught in Tanaka with receiving, by a computing device located on board a mobile robot, first location information for the mobile robot, as taught in Rublee, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results.
Tanaka in view of Rublee does not teach receiving, by the computing device, information indicating a hardware configuration of the mobile robot;
determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot.
Johnson, in the same field of endeavor, teaches receiving, by the computing device, information indicating a hardware configuration of the mobile robot (Johnson: Column 3 lines 1-24, “Further, embodiments may determine the context by analyzing a computer based model of an environment of the robot to determine proximity of the robot to physical objects during the portion of the movement plan. Yet another embodiment identifies the context by performing a computer-based simulation of an environment including the robot using (i) the movement plan, (ii) a computer-based model of the environment, and (iii) a computer-based model of the robot.”, Column 5 lines 31-47, “In an embodiment, in order to compute the feed-forward torque for a given action, it is necessary to model the expected resistance to be encountered during the motion. In an embodiment, the expected resistance is based on a model of an environment of the robot arm. In an embodiment, the context specific torque is loaded from a model of the robot arm (including moments of inertia, gravity, friction, stiction, motor performance, and controller response), the environment (viscosity, density, fluid-based drag), and intended contacts with objects and materials. In another embodiment, the context also includes distance from potential obstacles and likelihood of interaction with human co-workers. In regions where collision is unlikely, robot operations, e.g., torque and velocity constraints, can be relaxed to enable improved processing speed. By automatically determining these regions aka ‘the context’, embodiments can enable safe operation without tedious human, i.e., manual, design.”, Column 11 lines 26-34, “In an embodiment, determining the context specific torque at 333 includes loading the context specific torque from a model of the robot, the task, and an environment in which the robot is operating to perform the task. 
In such an embodiment, the task is selected from a library of tasks, where the required torque is recorded. The model of the robot determines if the required torque can be achieved and if, given the environment, the robot is physically able to perform the task.”, Column 11 lines 35-57, “In embodiments, of the method 330 “context” may include any conditions related in any way to the robot. For example, context may include any data related to the robot, the task performed by robot, the motion of the robot, and the environment in which the robot is operating, amongst other examples. According to an example embodiment of the method 330, context is at least one of: (a) free space, (b) active contact, (c) collision, (d) anticipated collision, (e) a likelihood of accident during the portion of the movement plan based on standards for (1) speed and separation limits and (2) allowable harm, (f) the task performed by the robot, e.g., slicing, scooping, grasping, picking, or machining material, (g) a set of possible colliding objects, where each object has an object class, and the object class indicates a severity of a collision with the object, (h) a probability distribution of possible colliding objects, (i) a probability distribution of one or more possible world configurations, (j) a continuous distribution of world configurations, (k) a continuous distribution of objects, (l) a plane of motion of the robot, and (m) dimensions of the robot, amongst other examples. In embodiments, speed and separation limits and limits for allowable harm may be based upon industry standards and/or regulations such as ISO standard 15066.”. The cited passages clearly teach that a model of the robot is used by the controller. The model of the robot includes moments of inertia of the robot, gravity, friction, stiction, motor performance, the response time of the controller, and the dimensions of the robot.
One of ordinary skill in the art would recognize that all of these parameters are defined by the hardware configuration of the robot. Additionally, the system considers the task being performed by the robot, which is defined by the hardware configuration of said robot.);
determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot (Johnson: Column 9 lines 20-24, “In another example of context specific torque limits, when the robot operates with a known model of its environment and obstacles therein, it can limit the torque and velocity of the robot based on distance to known obstacles, which allows for safer operation.”, Column 9 lines 25-31, “Embodiments provide functionality to control robot operation, e.g., torque and velocity, amongst other examples, based on context. The context can include current location of the robot, distance to collision, current velocity, and position in a motion plan (e.g., a plan that indicates the robot's location and path for movements through free space and the robot's contact with material, amongst other examples.”, Column 11 lines 26-34, “In an embodiment, determining the context specific torque at 333 includes loading the context specific torque from a model of the robot, the task, and an environment in which the robot is operating to perform the task. In such an embodiment, the task is selected from a library of tasks, where the required torque is recorded. The model of the robot determines if the required torque can be achieved and if, given the environment, the robot is physically able to perform the task.”, Column 11 lines 35-57, “In embodiments, of the method 330 “context” may include any conditions related in any way to the robot. For example, context may include any data related to the robot, the task performed by robot, the motion of the robot, and the environment in which the robot is operating, amongst other examples. 
According to an example embodiment of the method 330, context is at least one of: (a) free space, (b) active contact, (c) collision, (d) anticipated collision, (e) a likelihood of accident during the portion of the movement plan based on standards for (1) speed and separation limits and (2) allowable harm, (f) the task performed by the robot, e.g., slicing, scooping, grasping, picking, or machining material, (g) a set of possible colliding objects, where each object has an object class, and the object class indicates a severity of a collision with the object, (h) a probability distribution of possible colliding objects, (i) a probability distribution of one or more possible world configurations, (j) a continuous distribution of world configurations, (k) a continuous distribution of objects, (l) a plane of motion of the robot, and (m) dimensions of the robot, amongst other examples. In embodiments, speed and separation limits and limits for allowable harm may be based upon industry standards and/or regulations such as ISO standard 15066.”. The cited passages clearly teach that the torque and velocity of the robot are set based on context data, which includes details on the hardware configuration as well as a distance from the robot to an obstacle.).
Tanaka in view of Rublee teaches a method comprising: receiving, by a computing device onboard the mobile robot, first location information for the mobile robot; receiving, by the computing device, second location information for a first entity in an environment of the mobile robot; determining, by the computing device, based, at least in part, on the first location information and the second location information, a first distance between the mobile robot and the first entity in the environment of the mobile robot; determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance; and controlling, by the computing device, the mobile robot to operate according to the one or more operating parameters. Tanaka in view of Rublee does not teach receiving, by the computing device, information indicating a hardware configuration of the mobile robot; determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot. Johnson teaches receiving, by the computing device, information indicating a hardware configuration of the mobile robot; determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot. 
A person of ordinary skill in the art would have had the technological capabilities required to have modified the method taught in Tanaka in view of Rublee with receiving, by the computing device, information indicating a hardware configuration of the mobile robot; and determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot, as taught in Johnson. Furthermore, the method taught in Tanaka in view of Rublee already teaches setting an operational parameter of the robot based on a first distance between the robot and an entity. As such, one of ordinary skill in the art would readily have been able to add the consideration of the hardware parameters of the robot in the determination of the operational parameters, as taught in Johnson, into the determination of the operational parameters taught in Tanaka in view of Rublee according to methods known in the art. Additionally, such considerations of hardware parameters would have been familiar to a person of ordinary skill in the art. As such, the combination would not have changed or introduced new functionality in either reference. No inventive effort would have been required. The combination would have yielded the predictable result of a method comprising: receiving, by the computing device, information indicating a hardware configuration of the mobile robot; determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the method taught in Tanaka in view of Rublee with receiving, by the computing device, information indicating a hardware configuration of the mobile robot; and determining, by the computing device, one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot, as taught in Johnson, with a reasonable expectation of success. A person of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results.
Regarding claim 2, Tanaka in view of Rublee in further view of Johnson teaches wherein receiving first location information for the mobile robot comprises receiving sensor data indicating a location of the mobile robot (Tanaka: Column 6 lines 9-11, “Range image generator 114 generates second range image 134 from second sensed result 133 obtained by second ranging sensor 103.”, Column 6 lines 26-31, “Machine detector 121 detects, using second range image 134, the position (three-dimensional position) of mobile machine 101.”).
Regarding claim 3, Tanaka in view of Rublee in further view of Johnson teaches wherein receiving second location information for the first entity comprises receiving an indication that the first entity is located in a region defining a safety zone of the mobile robot (Tanaka: Column 5 lines 32-41, “In addition, distance-measuring system 100 detects that person 104 has come closer to mobile machine 101. For example, distance-measuring system 100 detects, using a sensed result obtained by second ranging sensor 103 with high precision, whether person 104 has entered restricted area 105 in which mobile machine 101 is present. When person 104 is detected within restricted area 105, distance measuring system 100 restricts or stops operation of mobile machine 101. With this, it is possible to improve the safety of person 104.”).
Regarding claim 15, Tanaka in view of Rublee in further view of Johnson teaches further comprising receiving, by the computing device, a velocity of the first entity, wherein the one or more operating parameters are based on the velocity of the first entity (Tanaka: Column 12 lines 56-63, “Next, human movement predictor 127 predicts the traveling direction of person 104, based on the plurality of second range images 134 corresponding to the plurality of frames which are stored in frame memory 125 (S143). Specifically, human movement predictor 127 predicts the traveling direction and traveling speed of person 104, based on differences among the plurality of second range images 134 corresponding to the plurality of frames.”, Column 13 lines 10-17, “Next, safety determiner 123C determines safety based on the position of mobile machine 101 obtained by machine detector 121, the position of person 104 obtained by human detector 122, the traveling direction and traveling speed of mobile machine 101 obtained by machine movement predictor 126, and the traveling direction and traveling speed of person 104 obtained by human movement predictor 127 (S115C).”).
Embodiment 1 of Tanaka teaches a method for controlling a mobile robot. Embodiment 4 of Tanaka teaches further comprising receiving, by the computing device, a velocity of the first entity, wherein the one or more operating parameters are based on the velocity of the first entity. A person of ordinary skill in the art would have had the technological capabilities required to have combined the method taught in Embodiment 1 of Tanaka with receiving, by the computing device, a velocity of the first entity, wherein the one or more operating parameters are based on the velocity of the first entity, as taught in Embodiment 4 of Tanaka. The combination is a simple substitution of components or algorithms between the embodiments, or the simple addition of components or algorithms with known results to an embodiment. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, that the combination of Tanaka in view of Rublee in further view of Johnson teaches the limitations of claim 15.
Regarding claim 16, Tanaka in view of Rublee in further view of Johnson teaches further comprising receiving, by the computing device, an acceleration of the first entity, wherein the one or more operating parameters are based on the acceleration of the first entity (Tanaka: Column 13 lines 18-25, “Specifically, safety determiner 123C determines that the distance between mobile machine 101 and person 104 is unsafe, when a first distance between mobile machine 101 and person 104 is less than a predetermined first value.”, Column 13 lines 42-53, “For example, when the traveling direction of person 104 is a direction toward mobile machine 101, safety determiner 123C decreases the first value, and when the traveling direction of person 104 is a direction away from mobile machine 101, safety determiner 123C increases the first value. In addition, the amount of change in the first value increases as the traveling speed”. As can be seen from the cited passages, the system is configured to change the first value used to determine if there is a safety risk between the robot and person based on the amount of change in the person’s speed. The amount of change in speed by definition is acceleration. Furthermore, the operating parameters of the robot are changed based on the first value. Therefore, the operating parameters of the robot are determined, in part, based on the acceleration of a person.).
Embodiment 1 of Tanaka teaches a method for controlling a mobile robot. Embodiment 4 of Tanaka teaches further comprising receiving, by the computing device, an acceleration of the first entity, wherein the one or more operating parameters are based on the acceleration of the first entity. A person of ordinary skill in the art would have had the technological capabilities required to have combined the method taught in Embodiment 1 of Tanaka with receiving, by the computing device, an acceleration of the first entity, wherein the one or more operating parameters are based on the acceleration of the first entity, as taught in Embodiment 4 of Tanaka. The combination is a simple substitution of components or algorithms between the embodiments, or the simple addition of components or algorithms with known results to an embodiment. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, that the combination of Tanaka in view of Rublee in further view of Johnson teaches the limitations of claim 16.
Regarding claim 18, Tanaka in view of Rublee in further view of Johnson teaches further comprising: receiving, by the computing device, third location information for a second entity in the environment of the mobile robot (Rublee: Column 8 lines 60-67, “In scenario 200, safety-system server 180 and/or other computing device(s) (e.g., a computing device aboard a robotic actor, a mobile device carried by a human actor) can receive presence/location and identification data for human actors 210, 212, 214, 216, 218 and robotic actors 220, 222, 224, 226, 228, 230.”);
wherein the one or more operating parameters are based on a smaller distance of the first distance and the second distance (Rublee: Column 9 lines 19-29, “For example, in scenario 200, safety-system server 180 and/or other computing device(s) can classify an area A in environment 100 with: the high safety classification if area A is not otherwise classified and if area A is at least partially occupied by one or more human actors; the medium safety classification if area A is not otherwise classified and if area A is at least partially occupied by one or more robotic actors; or the low safety classification if no actors are at least partially present within area A.”, Column 9 lines 40-46, “Safety-system server 180 can provide safety rules to robotic (and perhaps other) actors in environment 100. The safety rules can correspond to the safety classifications of areas within environment 100; as such, a particular safety rule can be used in more than one area 111-157 of environment 100 based on the safety classifications of the areas 111-157 of environment 100.”, Column 9 line 62 – Column 10 line 3, “In other embodiments, the medium safety rule can specify that a robotic actor is to slow down upon entry and/or exit of one of areas 111-157 of environment 100, while the high safety rule can specify that a robotic actor is to stop for at least a predetermined amount of time (e.g., a second) upon entry and/or exit of one of areas 111-157 of environment 100, and the low safety rule can be silent about (not specify) slowing down or stopping upon entry and/or exit of one of areas 111-157.”).
Regarding claim 19, Tanaka in view of Rublee in further view of Johnson teaches further comprising: receiving, by the computing device, third location information for a second entity in the environment of the mobile robot (Rublee: Column 8 lines 60-67, “In scenario 200, safety-system server 180 and/or other computing device(s) (e.g., a computing device aboard a robotic actor, a mobile device carried by a human actor) can receive presence/location and identification data for human actors 210, 212, 214, 216, 218 and robotic actors 220, 222, 224, 226, 228, 230.”);
wherein the one or more operating parameters are based only on the first distance when it is determined that the first entity is closer to the mobile robot than the second entity (Rublee: Column 9 lines 19-29, “For example, in scenario 200, safety-system server 180 and/or other computing device(s) can classify an area A in environment 100 with: the high safety classification if area A is not otherwise classified and if area A is at least partially occupied by one or more human actors; the medium safety classification if area A is not otherwise classified and if area A is at least partially occupied by one or more robotic actors; or the low safety classification if no actors are at least partially present within area A.”, Column 9 lines 40-46, “Safety-system server 180 can provide safety rules to robotic (and perhaps other) actors in environment 100. The safety rules can correspond to the safety classifications of areas within environment 100; as such, a particular safety rule can be used in more than one area 111-157 of environment 100 based on the safety classifications of the areas 111-157 of environment 100.”, Column 9 line 62 – Column 10 line 3, “In other embodiments, the medium safety rule can specify that a robotic actor is to slow down upon entry and/or exit of one of areas 111-157 of environment 100, while the high safety rule can specify that a robotic actor is to stop for at least a predetermined amount of time (e.g., a second) upon entry and/or exit of one of areas 111-157 of environment 100, and the low safety rule can be silent about (not specify) slowing down or stopping upon entry and/or exit of one of areas 111-157.”).
Regarding claim 20, Tanaka in view of Rublee in further view of Johnson teaches wherein the environment of the mobile robot includes a plurality of entities (Rublee: Column 8 lines 60-67, “In scenario 200, safety-system server 180 and/or other computing device(s) (e.g., a computing device aboard a robotic actor, a mobile device carried by a human actor) can receive presence/location and identification data for human actors 210, 212, 214, 216, 218 and robotic actors 220, 222, 224, 226, 228, 230.”),
and wherein an entity of the plurality of entities located closest to the mobile robot is selected as the first entity (Rublee: Column 9 lines 19-29, “For example, in scenario 200, safety-system server 180 and/or other computing device(s) can classify an area A in environment 100 with: the high safety classification if area A is not otherwise classified and if area A is at least partially occupied by one or more human actors; the medium safety classification if area A is not otherwise classified and if area A is at least partially occupied by one or more robotic actors; or the low safety classification if no actors are at least partially present within area A.”, Column 9 lines 40-46, “Safety-system server 180 can provide safety rules to robotic (and perhaps other) actors in environment 100. The safety rules can correspond to the safety classifications of areas within environment 100; as such, a particular safety rule can be used in more than one area 111-157 of environment 100 based on the safety classifications of the areas 111-157 of environment 100.”, Column 9 line 62 – Column 10 line 3, “In other embodiments, the medium safety rule can specify that a robotic actor is to slow down upon entry and/or exit of one of areas 111-157 of environment 100, while the high safety rule can specify that a robotic actor is to stop for at least a predetermined amount of time (e.g., a second) upon entry and/or exit of one of areas 111-157 of environment 100, and the low safety rule can be silent about (not specify) slowing down or stopping upon entry and/or exit of one of areas 111-157.”. One of ordinary skill in the art would see that the system changes the operating parameters of the robot based only on the closest human. This is equivalent to designating the closest human as the first entity.).
Regarding claim 21, Tanaka in view of Rublee in further view of Johnson teaches wherein the first location information for the mobile robot and/or the second location information for the first entity are based on data received from one or more sensors in communication with the computing device (Tanaka: Column 6 lines 1-8, “Specifically, sensor controller 113 controls the detection position to be detected by second ranging sensor 103 so as to detect the detected position of mobile machine 101. For example, sensor controller 113 controls second ranging sensor 103 such that the position of mobile machine 101 is included in a detection range to be covered by second ranging sensor 103.”, Column 6 lines 26-31, “This safety detector 115 includes machine detector 121, human detector 122, and safety determiner 123. Machine detector 121 detects, using second range image 134, the position (three-dimensional position) of mobile machine 101. Human detector 122 detects, using second range image 134, the position (three-dimensional position) of person 104.”);
Regarding claim 24, Tanaka in view of Rublee in further view of Johnson teaches wherein the one or more sensors are attached to a sensor mount physically separate from the mobile robot (Tanaka: Figure 1, Column 5 lines 1-16, “For example, first ranging sensor 102 and second ranging sensor 103 are fixedly installed.”. As can be seen from the cited Figure and passage, the sensors are installed in a fixed location separate from the robot.).
Regarding claim 38, Tanaka teaches further comprising controlling, by the computing device, the mobile robot to perform an emergency stop when the first distance is below a threshold distance and/or when the second location information for the first entity indicates that the first entity is located in a specified safety zone (Tanaka: Column 7 lines 61-67, “When safety determiner 123 determines that the distance between mobile machine 101 and person 104 is unsafe (No in S115), safety determiner 123 controls operation of mobile machine 101 via machine controller 106 (S116). Specifically, safety detector 115 (i) stops mobile machine 101, (ii) slows down mobile machine 101, or (iii) changes (limits) a movable range of mobile machine 101.”).
Regarding claim 42, Tanaka in view of Rublee in further view of Johnson teaches wherein the second location information for the first entity is based on a presence or absence of the first entity in a safety zone in the environment of the mobile robot (Tanaka: Column 14 lines 19-32, “FIG. 12 is a diagram illustrated for describing operation of distance-measuring system 100D. FIG. 12 illustrates areas used for safety determinations. A monitoring area illustrated in FIG. 12 is a detection range covered by first ranging sensor 102, and first ranging sensor 102 detects the position of person 104 within the monitoring area. When person 104 enters the restricted area illustrated in FIG. 12, operation of mobile machine 101D is restricted. For example, mobile machine 101D slows down, or a movable range is restricted. In addition, when person 104 enters the deactivation area illustrated in FIG. 12, mobile machine 101D is deactivated. The restricted area and deactivation area are included in a detection range covered by second ranging sensor 103.”).
Embodiment 1 of Tanaka teaches a method for controlling a mobile robot. Embodiment 5 of Tanaka teaches wherein the second location information for the first entity is based on a presence or absence of the first entity in a safety zone in the environment of the mobile robot. A person of ordinary skill in the art would have had the technological capabilities required to have combined the method taught in Embodiment 1 of Tanaka with the limitation wherein the second location information for the first entity is based on a presence or absence of the first entity in a safety zone in the environment of the mobile robot taught in Embodiment 5 of Tanaka. The combination is a simple substitution of components or algorithms between the embodiments, or simply adds components or algorithms with known results to an embodiment. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, that the combination of Tanaka in view of Rublee in further view of Johnson teaches the limitations of claim 42.
Regarding claim 53, Tanaka teaches a computing system of a mobile robot, the computing system comprising (Tanaka: Column 4 lines 55-60, “FIG. 1 is a diagram schematically illustrating a configuration of distance-measuring system 100. Distance-measuring system 100 includes mobile machine 101, first ranging sensor 102, and second ranging sensor 103.”, Column 5 lines 53-57, “As illustrated in FIG. 2, distance-measuring system 100 further includes machine controller 106 and image processing device 110.”):
data processing hardware (Tanaka: Column 5 lines 53-57, “As illustrated in FIG. 2, distance-measuring system 100 further includes machine controller 106 and image processing device 110.”);
and memory hardware in communication with the data processing hardware, the memory hardware storing instructions that when executed on the data processing hardware cause the data processing hardware to perform operations comprising (Tanaka: Column 18 lines 4-11, “For example, in the embodiments above, each of the elements may be configured of a dedicated hardware or realized by running a software program suitable for the element. The elements may each be realized by a program executor such as a central processing unit (CPU) or a processor reading and running a software program stored in a recording medium such as a hard disk or a semiconductor memory”):
receiving first location information for the mobile robot (Tanaka: Column 6 lines 26-31, “This safety detector 115 includes machine detector 121, human detector 122, and safety determiner 123. Machine detector 121 detects, using second range image 134, the position (three-dimensional position) of mobile machine 101.”);
receiving second location information for a first entity in an environment of the mobile robot (Tanaka: Column 6 lines 26-31, “Human detector 122 detects, using second range image 134, the position (three-dimensional position) of person 104.”);
determining based, at least in part, on the first location information and the second location information, a first distance between the mobile robot and the first entity in the environment of the mobile robot (Tanaka: Column 6 lines 32-44, “Safety determiner 123 determines safety based on the detected position of mobile machine 101 and the detected position of person 104. Specifically, safety determiner 123 detects the first distance between mobile machine 101 and person 104, based on the position of mobile machine 101 and the position of person 104.”);
determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance (Tanaka: Column 6 lines 32-44, “In addition, when safety determiner 123 determines that the distance between mobile machine 101 and person 104 is unsafe, safety determiner 123 controls operation of mobile machine 101 via machine controller 106.”); and
controlling the mobile robot to operate according to the one or more operating parameters (Tanaka: Column 6 lines 32-44, “In addition, when safety determiner 123 determines that the distance between mobile machine 101 and person 104 is unsafe, safety determiner 123 controls operation of mobile machine 101 via machine controller 106.”, Column 7 lines 61-67, “When safety determiner 123 determines that the distance between mobile machine 101 and person 104 is unsafe (No in S115), safety determiner 123 controls operation of mobile machine 101 via machine controller 106 (S116). Specifically, safety detector 115 (i) stops mobile machine 101, (ii) slows down mobile machine 101, or (iii) changes (limits) a movable range of mobile machine 101.”).
Tanaka does not teach a computing system located onboard a mobile robot;
receiving information indicating a hardware configuration of the mobile robot;
determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot.
Rublee, in the same field of endeavor, teaches a computing system located onboard a mobile robot (Rublee: Column 19 lines 25-38, “FIG. 6A is a functional block diagram of computing device 600 (e.g., system, in accordance with an example embodiment. In particular, computing device 600 shown in FIG. 6A can be configured to perform one or more functions of a robot, robotic actor, a special robot, safety-system server 180, a computing device as discussed in the context of FIGS. 5A and 5B, network 614, method 700, and one or more functions related to one or more of scenarios 200, 300, and/or 400. Computing device 600 may include a user interface module 601, a network-communication interface module 602, one or more processors 603, data storage 604, one or more sensors 620, and one or more actuators 630, all of which may be linked together via a system bus, network, or other connection mechanism 605.”, Column 23 lines 25-34, “FIG. 7 is a flowchart of method 700, in accordance with an example embodiment. Method 700 can be executed on a computing device. Example computing devices include, but are not limited to, computing device 600, a safety-system server such as safety-system server 180, a computing device aboard a herein-described robot, robotic actor, robotic device, special robot, and/or other computing device(s). In some embodiments, the computing device can be part of a safety system, such as discussed above in the context of at least FIG. 1.”, Column 23 lines 35-39, “Method 700 can begin at block 710, where the computing device can determine presence information and actor type information about any actors present within a predetermined area of an environment using one or more sensors, such as discussed above regarding as least FIGS. 1-4.”. The cited passages clearly teach that the computing device used to implement the method can be onboard the robot and that said computing device is configured to gather position information of the robot.).
Tanaka teaches a computing system of a mobile robot, the computing system comprising: data processing hardware; and memory hardware in communication with the data processing hardware, the memory hardware storing instructions that when executed on the data processing hardware cause the data processing hardware to perform operations comprising: receiving first location information for the mobile robot; receiving second location information for a first entity in an environment of the mobile robot; determining based, at least in part, on the first location information and the second location information, a first distance between the mobile robot and the first entity in the environment of the mobile robot; determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance; and controlling the mobile robot to operate according to the one or more operating parameters. Tanaka does not teach a computing system located onboard a mobile robot. Rublee teaches a computing system located onboard a mobile robot. A person of ordinary skill in the art would have had the technological capabilities required to have modified the computing system taught in Tanaka with a computing system located onboard a mobile robot taught in Rublee. Furthermore, even though the process outlined in the computing system taught in Tanaka is not explicitly stated as being performed on a computing device mounted on board the robot, Tanaka still teaches a machine controller for controlling the robot. As such, Tanaka could easily be modified to perform the steps of the method by using a computing device on board a robot as taught in Rublee. Such a modification would only require changing the location of the computing device used to perform the method, which would not change or introduce new functionality. No inventive effort would have been required.
The combination would have yielded the predictable result of a computing system located onboard a mobile robot.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the computing system taught in Tanaka with a computing system located onboard a mobile robot taught in Rublee with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results.
Tanaka in view of Rublee does not teach receiving information indicating a hardware configuration of the mobile robot;
determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot.
Johnson, in the same field of endeavor, teaches receiving information indicating a hardware configuration of the mobile robot (Johnson: Column 3 lines 1-24, “Further, embodiments may determine the context by analyzing a computer based model of an environment of the robot to determine proximity of the robot to physical objects during the portion of the movement plan. Yet another embodiment identifies the context by performing a computer-based simulation of an environment including the robot using (i) the movement plan, (ii) a computer-based model of the environment, and (iii) a computer-based model of the robot.”, Column 5 lines 31-47, “In an embodiment, in order to compute the feed-forward torque for a given action, it is necessary to model the expected resistance to be encountered during the motion. In an embodiment, the expected resistance is based on a model of an environment of the robot arm. In an embodiment, the context specific torque is loaded from a model of the robot arm (including moments of inertia, gravity, friction, stiction, motor performance, and controller response), the environment (viscosity, density, fluid-based drag), and intended contacts with objects and materials. In another embodiment, the context also includes distance from potential obstacles and likelihood of interaction with human co-workers. In regions where collision is unlikely, robot operations, e.g., torque and velocity constraints, can be relaxed to enable improved processing speed. By automatically determining these regions aka ‘the context’, embodiments can enable safe operation without tedious human, i.e., manual, design.”, Column 11 lines 26-34, “In an embodiment, determining the context specific torque at 333 includes loading the context specific torque from a model of the robot, the task, and an environment in which the robot is operating to perform the task. In such an embodiment, the task is selected from a library of tasks, where the required torque is recorded. 
The model of the robot determines if the required torque can be achieved and if, given the environment, the robot is physically able to perform the task.”, Column 11 lines 35-57, “In embodiments, of the method 330 “context” may include any conditions related in any way to the robot. For example, context may include any data related to the robot, the task performed by robot, the motion of the robot, and the environment in which the robot is operating, amongst other examples. According to an example embodiment of the method 330, context is at least one of: (a) free space, (b) active contact, (c) collision, (d) anticipated collision, (e) a likelihood of accident during the portion of the movement plan based on standards for (1) speed and separation limits and (2) allowable harm, (f) the task performed by the robot, e.g., slicing, scooping, grasping, picking, or machining material, (g) a set of possible colliding objects, where each object has an object class, and the object class indicates a severity of a collision with the object, (h) a probability distribution of possible colliding objects, (i) a probability distribution of one or more possible world configurations, (j) a continuous distribution of world configurations, (k) a continuous distribution of objects, (l) a plane of motion of the robot, and (m) dimensions of the robot, amongst other examples. In embodiments, speed and separation limits and limits for allowable harm may be based upon industry standards and/or regulations such as ISO standard 15066.”. The cited passages clearly teach that a model of the robot is used by the controller. The model of the robot includes moments of inertia of the robot, gravity, friction, stiction, motor performance, the response time of the controller, and the dimensions of the robot. One of ordinary skill in the art would recognize that all of these parameters are defined by the hardware configuration of the robot.
Additionally, the system considers the task being performed by the robot, which is defined by the hardware configuration of said robot.);
determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot (Johnson: Column 9 lines 20-24, “In another example of context specific torque limits, when the robot operates with a known model of its environment and obstacles therein, it can limit the torque and velocity of the robot based on distance to known obstacles, which allows for safer operation.”, Column 9 lines 25-31, “Embodiments provide functionality to control robot operation, e.g., torque and velocity, amongst other examples, based on context. The context can include current location of the robot, distance to collision, current velocity, and position in a motion plan (e.g., a plan that indicates the robot's location and path for movements through free space and the robot's contact with material, amongst other examples.”, Column 11 lines 26-34, “In an embodiment, determining the context specific torque at 333 includes loading the context specific torque from a model of the robot, the task, and an environment in which the robot is operating to perform the task. In such an embodiment, the task is selected from a library of tasks, where the required torque is recorded. The model of the robot determines if the required torque can be achieved and if, given the environment, the robot is physically able to perform the task.”, Column 11 lines 35-57, “In embodiments, of the method 330 “context” may include any conditions related in any way to the robot. For example, context may include any data related to the robot, the task performed by robot, the motion of the robot, and the environment in which the robot is operating, amongst other examples. 
According to an example embodiment of the method 330, context is at least one of: (a) free space, (b) active contact, (c) collision, (d) anticipated collision, (e) a likelihood of accident during the portion of the movement plan based on standards for (1) speed and separation limits and (2) allowable harm, (f) the task performed by the robot, e.g., slicing, scooping, grasping, picking, or machining material, (g) a set of possible colliding objects, where each object has an object class, and the object class indicates a severity of a collision with the object, (h) a probability distribution of possible colliding objects, (i) a probability distribution of one or more possible world configurations, (j) a continuous distribution of world configurations, (k) a continuous distribution of objects, (l) a plane of motion of the robot, and (m) dimensions of the robot, amongst other examples. In embodiments, speed and separation limits and limits for allowable harm may be based upon industry standards and/or regulations such as ISO standard 15066.”. The cited passages clearly teach that the torque and velocity of the robot are set based on context data, which includes details on the hardware configuration as well as a distance from the robot to an obstacle.).
Tanaka in view of Rublee teaches a computing system located onboard a mobile robot, the computing system comprising: data processing hardware; and memory hardware in communication with the data processing hardware, the memory hardware storing instructions that when executed on the data processing hardware cause the data processing hardware to perform operations comprising: receiving first location information for the mobile robot; receiving second location information for a first entity in an environment of the mobile robot; determining based, at least in part, on the first location information and the second location information, a first distance between the mobile robot and the first entity in the environment of the mobile robot; determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance; and controlling the mobile robot to operate according to the one or more operating parameters. Tanaka in view of Rublee does not teach receiving information indicating a hardware configuration of the mobile robot; determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot. Johnson teaches receiving information indicating a hardware configuration of the mobile robot; determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot. 
A person of ordinary skill in the art would have had the technological capabilities required to have modified the system taught in Tanaka in view of Rublee with receiving information indicating a hardware configuration of the mobile robot; determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot taught in Johnson. Furthermore, the system taught in Tanaka in view of Rublee already teaches setting an operational parameter of the robot based on a first distance between the robot and an entity. As such, one of ordinary skill in the art would easily have been able to incorporate the consideration of the hardware parameters of the robot in the determination of the operational parameters as taught in Johnson into the determination of the operational parameters taught in Tanaka in view of Rublee according to methods known in the art. Additionally, such considerations of hardware parameters would have been familiar to a person of ordinary skill in the art. As such, the combination would not have changed or introduced new functionality to either system. No inventive effort would have been required. The combination would have yielded the predictable result of a computing system located onboard a mobile robot, the computing system comprising: receiving information indicating a hardware configuration of the mobile robot; determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the system taught in Tanaka in view of Rublee with receiving information indicating a hardware configuration of the mobile robot; determining one or more operating parameters for the mobile robot, the one or more operating parameters based on the first distance and the information indicating the hardware configuration of the mobile robot taught in Johnson with a reasonable expectation of success. A person of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results.
Regarding claim 112, Tanaka in view of Rublee in further view of Johnson teaches further comprising: receiving, by the computing device, first sensor data representing the first entity (Johnson: Column 13 lines 3-10, “According to another embodiment, identifying the context includes identifying physical objects in an environment of the robot to determine proximity of the robot to the physical objects during the portion of the movement plan. This may be done using a camera, depth sensor, lidar, or any other sensor know in the art that can detect objects in an environment. In such an embodiment, proximity can be based on the closest points of the objects.”);
determining, by the computing device, a class of the first entity based on the first sensor data (Johnson: Column 11 lines 35-57, “In embodiments, of the method 330 “context” may include any conditions related in any way to the robot. For example, context may include any data related to the robot, the task performed by robot, the motion of the robot, and the environment in which the robot is operating, amongst other examples. According to an example embodiment of the method 330, context is at least one of: … (g) a set of possible colliding objects, where each object has an object class, and the object class indicates a severity of a collision with the object, ...”, Column 11 line 58 – Column 12 line 4, “In embodiments that utilize object class, the object class can indicate a type or category of an object and embodiments may modify operations based upon the object class. For example, if the class is a person, collision severity is high, i.e., there is a great possibility of harm. If, however, the object class is a balloon, and a collision occurs, the balloon will likely move and there will be no damage to the balloon or robot. In such an embodiment, based on the collision severity, and how the robot is moving in proximity to the object, the torque may be adjusted to avoid collision with that object, i.e., add more safety margin if the object is of high value, like a person) or increase likelihood of collision (reduce safety margin to gain speed if the object is robust, such as a table surface).”, Column 13 lines 3-10, “According to another embodiment, identifying the context includes identifying physical objects in an environment of the robot to determine proximity of the robot to the physical objects during the portion of the movement plan. This may be done using a camera, depth sensor, lidar, or any other sensor know in the art that can detect objects in an environment. In such an embodiment, proximity can be based on the closest points of the objects.”);
and determining, by the computing device, one or more characteristics of the class of the first entity, wherein the one or more operating parameters for the mobile robot are based on the one or more characteristics of the class of the first entity (Johnson: Column 11 lines 26-34, “In an embodiment, determining the context specific torque at 333 includes loading the context specific torque from a model of the robot, the task, and an environment in which the robot is operating to perform the task. In such an embodiment, the task is selected from a library of tasks, where the required torque is recorded. The model of the robot determines if the required torque can be achieved and if, given the environment, the robot is physically able to perform the task.”, Column 11 lines 35-57, “In embodiments, of the method 330 “context” may include any conditions related in any way to the robot. For example, context may include any data related to the robot, the task performed by robot, the motion of the robot, and the environment in which the robot is operating, amongst other examples. According to an example embodiment of the method 330, context is at least one of: … (g) a set of possible colliding objects, where each object has an object class, and the object class indicates a severity of a collision with the object, ...”, Column 11 line 58 – Column 12 line 4, “In embodiments that utilize object class, the object class can indicate a type or category of an object and embodiments may modify operations based upon the object class. For example, if the class is a person, collision severity is high, i.e., there is a great possibility of harm. If, however, the object class is a balloon, and a collision occurs, the balloon will likely move and there will be no damage to the balloon or robot. 
In such an embodiment, based on the collision severity, and how the robot is moving in proximity to the object, the torque may be adjusted to avoid collision with that object, i.e., add more safety margin if the object is of high value, like a person) or increase likelihood of collision (reduce safety margin to gain speed if the object is robust, such as a table surface).”. The cited passages teach that the system is configured to determine the operational parameters of the robot based on the context data, wherein the context data includes the detected object and the class of said detected object (e.g., a person, a balloon, etc.).).
Claim(s) 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 11911918 B2 ("Tanaka") in view of US 10081106 B2 ("Rublee") in further view of US 11597084 B2 ("Johnson") in further view of US 10828776 B2 ("Oyama").
Regarding claim 8, Tanaka in view of Rublee in further view of Johnson does not teach wherein the one or more operating parameters comprise one or more of an operating speed limit, a stopping time limit, or an operating acceleration limit.
Oyama, in the same field of endeavor, teaches that the parameters comprise one or more of an operating speed limit, a stopping time limit, or an operating acceleration limit (Oyama: Column 5 lines 50-57, “The control device of the present embodiment controls the operation speed of the robot 1 so that the operation speed is equal to or lower than a predetermined limit speed when the robot 1 is placed at a position and orientation, at which a person is likely to be sandwiched.”).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robot control method taught in Tanaka in view of Rublee in further view of Johnson with the use of an operating speed limit taught in Oyama with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because, by imposing an operating speed limit on the robot, the effects of inertia on the robot can be mitigated, thereby preventing any harm or damage caused by the robot after a stop command has been issued (Oyama: Column 1 line 61 – Column 2 line 4, “In this regard, when the operator and the robot work in the same working area, the position and orientation of the robot change. The operator may be sandwiched between, for example, the arm of the robot and a workbench disposed around the robot. The control device stops the robot when an external force is detected. However, the robot operates due to inertia until the robot completely stops after a command for stopping the robot is issued. For example, when a stop command is issued while the direction of the arm is changing, the arm does not immediately stop. The arm stops after moving a predetermined distance due to the inertia.”).
Claim(s) 30, 36, and 37 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 11911918 B2 ("Tanaka") in view of US 10081106 B2 ("Rublee") in further view of US 11597084 B2 ("Johnson") in further view of US 10703584 B2 ("Diankov").
Regarding claim 30, Tanaka in view of Rublee in further view of Johnson does not teach wherein the sensor mount is attached to a conveyor or a ground location in the environment of the mobile robot.
Diankov, in the same field of endeavor, teaches wherein the sensor mount is attached to a conveyor or a ground location in the environment of the mobile robot (Diankov: Figure 3 first crossing sensor 316 and second crossing sensor 318, Column 9 lines 45-67, “In some embodiments, the robotic system 100 can include one or more crossing sensors (e.g., a first crossing sensor 316 and/or a second crossing sensor 318) configured to detect crossing events where an object crosses/leaves corresponding sensing line/plane.”, Column 10 lines 46-63, “For illustrative purposes, the first crossing sensor 316 is shown attached to the conveyor 306.”. As can clearly be seen from the cited figure and passages, the first and second crossing sensors are attached to the end of the conveyor belt.).
The only difference between the prior art and the claimed invention is that the prior art does not combine the robot control method and the sensors being mounted on a conveyor into a single combined reference. A person of ordinary skill in the art would have had the technological capabilities required to combine the robot control method taught in Tanaka in view of Rublee in further view of Johnson with the sensors being fixed to a conveyor belt taught in Diankov. Furthermore, the sensors in Tanaka in view of Rublee in further view of Johnson are described as being fixed at a location separate from the robot, but the references do not teach at what location they are fixed. Therefore, a person of ordinary skill in the art would have been able to fix the sensors to the end of a conveyor belt as taught in Diankov without changing or introducing new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a robot control method with sensors mounted to the conveyor.
Therefore, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robot control method taught in Tanaka in view of Rublee in further view of Johnson with the sensors being mounted to a conveyor taught in Diankov with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results.
Regarding claim 36, Tanaka in view of Rublee in further view of Johnson in further view of Diankov teaches further comprising adjusting a sensing field of the one or more sensors based on at least one of (i) a position of the conveyor, (ii) a location of the mobile robot, (iii) a location of a bay in the environment of the mobile robot, or (iv) a position of the first entity (Tanaka: Column 5 lines 17-24, “In addition, second ranging sensor 103 includes pan head 107 and sensor unit 108 installed on pan head 107. Pan head 107 allows panning, tilting, and zooming operations. With this, it is possible to change a detection range to be covered by second ranging sensor 103.”, Column 7 lines 5-11, “Next, sensor controller 113 controls, based on the position of mobile machine 101 detected by machine detector 112, a detection position (detection range) to be detected by second ranging sensor 103 (S104). Specifically, sensor controller 113 controls pan head 107 of second ranging sensor 103 such that second ranging sensor 103 detects the detected position of mobile machine 101.”).
Regarding claim 37, Tanaka in view of Rublee in further view of Johnson in further view of Diankov teaches further comprising controlling, by the computing device, the one or more sensors to sense a region located above an end of the conveyor (Diankov: Figure 3 first crossing sensor 316, Column 10 lines -45, “In some embodiments, the first crossing sensor 316 can be used to measure a height of the target object 112 during transfer. For example, the robotic system 100 can determine a gripper height 322 (e.g., a vertical position/location/coordinate of the end-effector 304 relative to a reference point, such as the ground) at the time of an entry event as detected by the first crossing sensor 316.”, Column 10 lines 46-63, “For illustrative purposes, the first crossing sensor 316 is shown attached to the conveyor 306.”. As can be seen from the cited passages and figure, the first crossing sensor is configured to sense a region above the conveyor and is placed at the end of the conveyor.).
Claim(s) 40 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 11911918 B2 ("Tanaka") in view of US 10081106 B2 ("Rublee") in further view of US 11597084 B2 ("Johnson") in further view of US 9607285 B2 ("Wellman").
Regarding claim 40, Tanaka in view of Rublee in further view of Johnson does not teach further comprising enforcing, by the mobile robot, the one or more operating parameters based on a motion plan of the mobile robot.
Wellman, in the same field of endeavor, teaches further comprising enforcing, by the mobile robot, the one or more operating parameters based on a motion plan of the mobile robot (Wellman: Column 18 lines 1-10, “In embodiments, the pathway module 404 may generate the new paths in real time to account for the working entity moving throughout the inventory management system and to provide the new paths to the plurality of mobile drive units to alter their movement path on the fly to ensure the safety of the working entity.”, Column 18 lines 11-45, “In accordance with at least one embodiment, the pathway module 404 may provide instructions to the plurality of mobile drive units to decrease their speed to a speed that would enable safe passage by the working entity to the inventory management system.”. As can be seen from the cited passages, the control system is configured to enforce the operating parameters based on the motion plan of the robot, such as when the speed is reduced while passing by an entity.).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robot control method taught in Tanaka in view of Rublee in further view of Johnson with the method of enforcing the operating parameters based on a motion plan of the robot taught in Wellman with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make these modifications because doing so allows for the safe passage of workers and allows the system to comply with safety guidelines (Wellman: Column 18 lines 11-45, “In accordance with at least one embodiment, the pathway module 404 may provide instructions to the plurality of mobile drive units to decrease their speed to a speed that would enable safe passage by the working entity to the inventory management system. For example, instructions may be provided that indicate the speed of the mobile drive units should be reduced to 0.25 meters per second or less. In some embodiments, instructions to decrease speed can be in accordance with guidelines provided by the international standards organization (ISO), the automated guided vehicle system (AGVS) organization, or any other known organizations.”).
Claim(s) 46 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 11911918 B2 ("Tanaka") in view of US 10081106 B2 ("Rublee") in further view of US 11597084 B2 ("Johnson") in further view of US 8798790 B2 ("Kamiya").
Regarding claim 46, Tanaka in view of Rublee in further view of Johnson does not teach further comprising commanding, by the computing device, a robotic arm of the mobile robot to assume a stowed position when the first entity is determined to be less than a threshold distance from the mobile robot.
Kamiya, in the same field of endeavor, teaches further comprising commanding, by the computing device, a robotic arm of the mobile robot to assume a stowed position when the first entity is determined to be less than a threshold distance from the mobile robot (Kamiya: Column 8 lines 37-67, “At Step S5, the external-force-value monitoring unit 52 monitors whether the value of the force sensor exceeds a set threshold value F0 at regular intervals. When the value of the force sensor exceeds F0, the external-force-value monitoring unit 52 determines that a contact is detected, and transmits a contact position detection signal to the position-instruction generating unit 51. Upon receiving the contact position detection signal, the position-instruction generating unit 51 performs a position control on the robot 1 to decelerate and stop the robot and retract the robot to a start position of the position detecting operation (Step S6).”. As can be seen from the cited passage, the robot is configured to retract to a start position when an excessive contact force is detected.).
The only difference between the prior art and the claimed invention is that the prior art does not combine the robot control method and the method of having the robot assume a stowed position when an entity is less than a threshold distance into a single combined reference. A person of ordinary skill in the art would have had the technological capabilities required to combine the robot control method taught in Tanaka in view of Rublee in further view of Johnson with the method of having the robot assume a stowed position when an entity is less than a threshold distance taught in Kamiya. Furthermore, even though Kamiya moves the robot based on the detected force, Tanaka in view of Rublee in further view of Johnson already teaches controlling the robot's movable range based on the determined distance of the first entity. Therefore, the method taught in Kamiya could easily be modified to work using the detected distances as taught in Tanaka in view of Rublee in further view of Johnson without changing or introducing new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a robot control method configured to cause the robot to assume a stowed position based on the distance of the first entity being less than a threshold.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robot control method taught in Tanaka in view of Rublee in further view of Johnson with the method of having the robot assume a stowed position when an entity is less than a threshold distance from the robot taught in Kamiya with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because it would have yielded predictable results.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1 and 53 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Noah W Stiebritz whose telephone number is (571)272-3414. The examiner can normally be reached Monday through Friday, 7-5 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado can be reached at (571) 270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/N.W.S./Examiner, Art Unit 3658
/Ramon A. Mercado/Supervisory Patent Examiner, Art Unit 3658