Prosecution Insights
Last updated: April 19, 2026
Application No. 18/062,019

DEVICE CONTROL BASED ON EXECUTION COMMAND AND UPDATED ENVIRONMENT INFORMATION

Non-Final OA: §103, §112
Filed
Dec 06, 2022
Examiner
YOON, ERIC
Art Unit
2118
Tech Center
2100 — Computer Architecture & Software
Assignee
Kabushiki Kaisha Yaskawa Denki
OA Round
3 (Non-Final)
Grant Probability: 58% (Moderate)
OA Rounds: 3-4
To Grant: 3y 0m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 58% (148 granted / 253 resolved; +3.5% vs TC avg)
Interview Lift: +67.0% among resolved cases with interview (strong)
Avg Prosecution: 3y 0m typical; 23 applications currently pending
Total Applications: 276 across all art units

Statute-Specific Performance

§101: 13.3% (-26.7% vs TC avg)
§102: 12.6% (-27.4% vs TC avg)
§103: 43.2% (+3.2% vs TC avg)
§112: 24.5% (-15.5% vs TC avg)
Tech Center averages are estimates; based on career data from 253 resolved cases.
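As a quick sanity check on the dashboard's headline figures, the career allow rate follows directly from the grant counts shown above (the rounding convention used for display is an assumption):

```python
# Numbers taken from the examiner stats above.
granted, resolved = 148, 253

allow_rate = granted / resolved            # career allow rate
print(f"Allow rate: {allow_rate:.1%}")     # ~58.5%, displayed as 58%

# The dashboard lists the rate as +3.5% vs the Tech Center average,
# implying a TC average estimate of roughly:
tc_avg = allow_rate - 0.035
print(f"TC average estimate: {tc_avg:.1%}")
```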

Office Action

§103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/03/2026 has been entered. Claims 2, 4, 8 and 19-21 have been canceled. New claims 23-25 were added. Claims 1, 3, 5-7, 9-18 and 22-25 are pending for examination.

Claim Rejections – 35 USC § 112(b)

The following is a quotation of 35 U.S.C. 112(b): (b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 7 and 25 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor, or for pre-AIA the applicant, regards as the invention. Claims 7 and 25 recite the limitation, "store the execution conditions including a first condition for the first task that requires a completion of the machine task and a second condition for the second condition that is not requires the completion of the machine task." It is unclear what is meant by the phrase, "for the second condition that is not requires the completion of the machine task." For the purpose of examination, Examiner interprets the above limitation as meaning that the execution conditions include a first condition for the first task that requires a completion of the machine task and a second condition.
Claim Rejections – 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3, 5, 6, 9-11 and 23 are rejected under 35 U.S.C. 103 as being unpatentable over Nihei (US 2006/0276934) in view of Francis (US 8,380,652).

Regarding claim 1, Nihei teaches a production system, comprising: a plurality of local controllers configured to control a plurality of devices that perform a plurality of tasks for one or more workpieces, the plurality of devices including a robot, the plurality of local controllers including a robot controller configured to control the robot (Figs. 1, 6, 12-14, Abstract, illustrates a system including multiple devices, including a robot and a robot controller; each device also inherently has a processor/controller to control its operations); and circuitry of a host controller communicable with the plurality of local controllers (Figs.
6, 9-11 [0141, 0142], the robot controller includes a CPU 76 that is connected to and communicates with the devices and the robot via a network interface and network), the circuitry configured to: store progress information of the plurality of tasks ([0038, 0041-0045, 0117-0121, 0124-0125, 0128-0136, 0175-0178], the system collects state information in real-time; programs/tasks are executed based on the collected state information; task/programs may be selected based on whether task starting conditions are met; see for example [0123], which describes state information collected that pertains to a workpiece and a robot); store, on a remote storage device, environment information of the one or more workpieces and the plurality of devices ([0038, 0041-0045, 0117-0121, 0124-0125, 0128-0136, 0175-0178], the system collects state information in real-time; programs/tasks are executed based on the collected state information; task/programs may be selected based on whether task starting conditions are met; see for example [0123], which describes state information collected that pertains to a workpiece and a robot; Figs. 1, 5, 7 [0038], the system includes a storage device to store such state information/data e.g., information collecting section 18; for example, controller 40 in Figs. 
5 and 7 includes a memory); receive status information from the plurality of local controllers ([0175-0178, 0141-0142], the system/robot controller detects states of the processing machines and positioning tools through digital input/output circuit; as indicated in [0141-0142], this means that the system receives state information from those devices, which are connected to the control system via cables and the digital input/output circuit, inherently via their respective controllers; based on the received status information, an external model is updated; [0158-0159] devices e.g., a processing machine, can communicate status information e.g., task completion signals, to the control system/robot controller; [0107, 0123], sensors transmit status information to the control system, to indicate the state/status of devices in the system or whether a task was completed; inherently sensors transmit such data via a controller); update the stored progress information based at least in part on the status information received from the plurality of local controllers ([0038, 0041-0045, 0117-0121, 0124-0125, 0128-0136, 0175-0178], the system collects state information in real-time; programs/tasks are executed based on the collected state information; task/programs may be selected based on whether task starting conditions are met; see for example [0123], which describes state information collected that pertains to a workpiece and a robot); update the stored environment information based at least in part on the status information received from the plurality of local controllers ([0038, 0041-0045, 0117-0121, 0124-0125, 0128-0136, 0175-0178], the system collects state information in real-time; programs/tasks are executed based on the collected state information; task/programs may be selected based on whether task starting conditions are met; see for example [0123], which describes state information collected that pertains to a workpiece and a robot); and output execution commands of
next tasks to one or more of the plurality of local controllers based on the progress information (Figs. 9-11, Abstract, [0038], the controller makes a robot execute tasks for workpieces based on collected state information; the controller selects programs to execute based on an order and whether they satisfy a starting condition), wherein the robot controller is configured to: receive an execution command that is output to the robot controller from the circuitry (Figs. 9-11, Abstract, [0038, 0117-0121], the system selects programs/actions to execute based on the environment/state information e.g., based on whether the last action was successfully completed; see also Figs. 9-11, [0149-0161], the system can control various devices to execute tasks e.g., the robot, conveyor, processing machine etc.); control the robot to execute one robot task of the robot tasks corresponding to the received execution command (Figs. 9-14, [0149-0150], the robot controller issues commands to the robot; the robot receives and thus stores commands to perform operations; see also [0038-0039], a controller may issue commands to a robot to perform multiple tasks/programs in a particular order, based on the tracked state of the working environment/state information). However, Nihei does not expressly disclose wherein the robot controller is configured to: store execution conditions predetermined for robot tasks; control the robot in response to determining that the changed environment information satisfies one of the execution conditions corresponding to the one robot task; monitor the remote storage device to detect a change in the stored environment information caused by an operation executed by one device of the plurality of devices, other than the robot, that is controlled by another local controller of the plurality of local controllers; the changed environment information being caused by the operation executed by the one device. 
In the same field of endeavor, Francis teaches wherein the robot controller is configured to: store execution conditions predetermined for robot tasks; control the robot in response to determining that the changed environment information satisfies one of the execution conditions corresponding to the one robot task (claim 1, a robot may receive commands and determine a constraint/condition for each command; the robot then autonomously determines whether the constraint has been met when determining when to execute the corresponding command; see also col. 14, line 4 to col. 15, line 44, a robot may receive commands for tasks where both tasks cannot be pursued simultaneously and the command itself does not clearly prioritize a particular task; thus, the robot will determine, based on its stored logic and analysis, which command to perform based on obtained contextual/situation data e.g., from the cloud-stored environmental data); monitor the remote storage device to detect a change in the stored environment information caused by an operation executed by one device of the plurality of devices, other than the robot, that is controlled by another local controller of the plurality of local controllers; the changed environment information, caused by the operation executed by the one device (col. 10, line 5 to col. 
11, line 17, there may be multiple robots; and any observation made by a first robot can be uploaded to the cloud i.e., causing a change in the state information in the cloud, where it can be shared with all other robots; thus, the action of the second robot may be based on the observation of the first robot, and each robot can build upon what is learned from the other robots; put another way, the robots are each monitoring the state of the cloud data in real time; it would be obvious to combine the above features of Francis with Nihei; that is, in the context of Nihei, as noted in Nihei [0158, 0038-0041, 0060-0064], there may be multiple devices/processing machines/robots that perform actions in coordination with one another; for example, as noted in Nihei [0060-0064], a first robot may have a command to unload workpieces from two machines, but the command does not specify which machine; a constraint is that a workpiece can only be unloaded from a machine when the machine has finished processing the workpiece; the completion of processing by a machine is provided to a central controller/model e.g., see Nihei [0158-0159]; per Francis, it would be obvious that a robot would autonomously recognize the constraint and make a decision between tasks e.g., selecting a target machine for unloading, based on its monitoring of changes in environmental information e.g., an indication, from the machine, that it had completed its processing). 
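The monitoring-and-dispatch behavior that the rejection maps onto Nihei and Francis can be sketched in a few lines. This is a minimal illustration under assumed names (`env_store`, `execution_conditions`, `on_environment_change` are all hypothetical), not code from either reference:

```python
# Hypothetical sketch: the robot controller holds predetermined execution
# conditions and dispatches a task only when the shared (remote/cloud)
# environment store reflects a change that satisfies that task's condition.
env_store = {"machine_1_done": False, "machine_2_done": False}

# Execution conditions predetermined per robot task (assumed shape).
execution_conditions = {
    "unload_machine_1": lambda env: env["machine_1_done"],
    "unload_machine_2": lambda env: env["machine_2_done"],
}

def on_environment_change(pending_tasks):
    """Return the first pending task whose execution condition the
    current environment information satisfies, else None."""
    for task in pending_tasks:
        if execution_conditions[task](env_store):
            return task
    return None

# A non-robot device (e.g., processing machine #1) finishes its machine
# task and updates the shared store; the monitoring controller reacts.
env_store["machine_1_done"] = True
print(on_environment_change(["unload_machine_1", "unload_machine_2"]))
# prints: unload_machine_1
```

The key point of the combination, as the examiner frames it, is that the condition check is driven by a change in shared environment state produced by a device other than the robot.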
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to have incorporated wherein the robot controller is configured to: store execution conditions predetermined for robot tasks; control the robot in response to determining that the changed environment information satisfies one of the execution conditions corresponding to the one robot task; monitor the remote storage device to detect a change in the stored environment information caused by an operation executed by one device of the plurality of devices, other than the robot, that is controlled by another local controller of the plurality of local controllers; the changed environment information being caused by the operation executed by the one device as suggested in Francis into Nihei because Nihei and Francis pertain to analogous fields of technology. Both Nihei and Francis pertain to systems where multiple machines/robots collaborate, and sometimes the system issues commands to a robot relating to multiple selectable alternative tasks, which incorporate constraints as to when the tasks can be performed. For example, as noted in Nihei [0060-0066], a robot may receive a command to unload workpieces from machines, with the constraint that a machine to be unloaded should have finished processing its respective workpiece. As noted in Nihei [0158-0159], an indication as to whether a constraint has been satisfied may be provided by the machine/robot performing an action that resulted in such satisfaction e.g., the processing machine may inform a central controller and environment information model that processing of a workpiece was completed. In Francis, as recited in claim 1, the robot determines the constraints, monitors environmental information, and based on new developments or changes, may make a decision on which task among multiple alternative tasks to perform. 
It would be desirable to incorporate such features into Nihei, to provide alternative methods for robots to select and prioritize a task, in light of multiple alternatives and various conditions/constraints (e.g., see Francis claim 1, col. 14, line 4 to col. 15, line 44; col. 10, line 5 to col. 11, line 17).

Regarding claim 3, the combination of Nihei and Francis teaches the invention as claimed in claim 1. The combination of Nihei and Francis also teaches wherein the circuitry is further configured to: acquire a production order of the one or more workpieces from a host production management system (Nihei [0048], the system obtains an operation script, which defines work units/tasks to be performed in a particular order); allocate the plurality of tasks to the one or more workpieces based on the production order (Nihei Figs. 9-11, [0055-0058], based on the script, the system can cause the robot to execute tasks related to a workpiece); and output the execution commands of the next tasks based on the allocated process and the progress information of the process (Nihei Figs. 9-11, [0055-0058], based on the script, the system can cause the robot to execute tasks related to a workpiece; Nihei Figs. 9-11, Abstract, [0038], the system selects programs/actions to execute based on the environment/state information; see, for example, Nihei [0041, 0044, 0045, 0124, 0128-0136, 0175-0177], the system uses sensors to detect abnormalities and then reacts accordingly).

Regarding claim 5, the combination of Nihei and Francis teaches the invention as claimed in claim 4. The combination of Nihei and Francis also teaches wherein the circuitry is configured to: allocate, in response to acquiring production orders of a plurality of workpieces, the plurality of tasks to the plurality of workpieces (Nihei Fig. 
5, [0139], the system can cause distinct operations to be applied to different workpieces e.g., workpieces W1 and W2; see also [0056]); and output the execution commands of the next tasks based on the progress information (Nihei Figs. 9-11, Abstract, [0038], the controller makes a robot execute tasks for workpieces; the controller selects programs to execute based on an order and whether they satisfy a starting condition), and wherein the robot controller is configured to: store two or more execution commands (Nihei Figs. 9-11, Abstract, [0038, 0117-0121], the system selects programs/actions to execute based on the environment/state information e.g., based on whether the last action was successfully completed; see also Figs. 9-11, [0149-0161], the system can control various devices to execute tasks e.g., the robot, conveyor, processing machine etc.); and control the robot to execute the one robot task of the robot tasks corresponding to one of the stored execution commands, in response to determining that the changed environment information satisfies one of the execution conditions corresponding to the one robot task (Nihei Figs. 9-14, [0149-0150], the robot controller issues commands to the robot; the robot receives and thus stores commands to perform operations; see also [0038-0039], a controller may issue commands to a robot to perform multiple tasks/programs in a particular order, based on the tracked state of the working environment/state information; Francis claim 1, a robot may receive commands and determine a constraint/condition for each command; the robot then autonomously determines whether the constraint has been met when determining when to execute the corresponding command; see also col. 14, line 4 to col. 
15, line 44, a robot may receive commands for tasks where both tasks cannot be pursued simultaneously and the command itself does not clearly prioritize a particular task; thus, the robot will determine, based on its stored logic and analysis, which command to perform based on obtained contextual/situation data e.g., from the cloud-stored environmental data).

Regarding claim 6, the combination of Nihei and Francis teaches the invention as claimed in claim 1. The combination of Nihei and Francis also teaches wherein the robot controller is configured to: store two or more received execution commands of two or more of the robot tasks including a first task and a second task having a priority lower than a priority of the first task; and control the robot to execute the first task in response to determining that the changed environment information satisfies both an execution condition of the first task and an execution condition of the second task (Francis claim 1, a robot may receive commands and determine a constraint/condition for each command; the robot then autonomously determines whether the constraint has been met when determining when to execute the corresponding command; see also Francis col. 14, line 4 to col. 15, line 44, a robot may receive commands for tasks where both tasks cannot be pursued simultaneously; Francis col. 10, line 5 to col. 
11, line 17, there may be multiple robots; and any observation made by a first robot can be uploaded to the cloud i.e., causing a change in the state information in the cloud, where it can be shared with all other robots; thus, the action of the second robot may be based on the observation of the first robot, and each robot can build upon what is learned from the other robots; put another way, the robots are each monitoring the state of the cloud data in real time; it would be obvious to combine the above features of Francis with Nihei; that is, in the context of Nihei, as noted in Nihei [0158, 0038-0041, 0060-0064], there may be multiple devices/processing machines/robots that perform actions in coordination with one another; for example, as noted in Nihei [0060-0064], a first robot may have a command to unload workpieces from two machines, but the command does not specify which machine; a constraint is that a workpiece can only be unloaded from a machine when the machine has finished processing the workpiece; the completion of processing by a machine is provided to a central controller/model e.g., see Nihei [0158-0159]; per Francis, it would be obvious that a robot would autonomously recognize the constraint and make a decision between tasks e.g., selecting a target machine for unloading, based on its monitoring of changes in environmental information e.g., an indication, from the machine, that it had completed its processing; Nihei [0060] contemplates another situation, in which there are two tasks to perform, and the conditions for both tasks have been met; for instance, in the above example, it may mean that processing has been completed for both workpieces; in that case, as noted in [0060], the command that is performed is the one with higher priority).

Regarding claim 9, the combination of Nihei and Francis teaches the invention as claimed in claim 6. 
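The priority rule described in connection with claim 6 above (when the changed environment satisfies the execution conditions of both stored commands, the higher-priority task runs first) can be sketched as follows. The function and data shapes are hypothetical illustrations, not from the claims or the references:

```python
# Hypothetical sketch of the claim 6 logic: among stored execution
# commands whose execution conditions the environment satisfies,
# the task with the higher priority is executed.
def pick_task(tasks, env):
    """tasks: list of (name, priority, condition_fn); larger priority wins."""
    ready = [t for t in tasks if t[2](env)]   # tasks whose condition is met
    if not ready:
        return None
    return max(ready, key=lambda t: t[1])[0]

env = {"m1_done": True, "m2_done": True}      # both conditions satisfied
tasks = [
    ("first_task", 2, lambda e: e["m1_done"]),   # higher priority
    ("second_task", 1, lambda e: e["m2_done"]),  # lower priority
]
print(pick_task(tasks, env))  # prints: first_task
```

If only the second task's condition is met, the same selection naturally falls through to the lower-priority task, which is the situation claim 9 then builds on.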
The combination of Nihei and Francis also teaches wherein the robot controller is configured to: control the robot to execute the second task in response to determining that the changed environment information satisfies the execution condition of the second task and does not satisfy the execution condition of the first task; and control the robot to suspend the second task in response to determining that the changed environment information satisfies an execution condition of the first task during execution of the second task (Nihei [0124-0127, 0117-0121], the system may be performing a particular task i.e., a series of actions, such as Pick on Table and Load to machine, see [0124]; and then determine during that task that there is an abnormality, which means that during the performance of the task, the system may instead suspend the task and cause the robot to perform an exception action; afterward, it may then return to the task; [0127], Nihei further teaches that while performing any task, the system may engage in a stop task i.e., emergency stop due to machine trouble; then the original task may be safely restarted).

Regarding claim 10, the combination of Nihei and Francis teaches the invention as claimed in claim 9. The combination of Nihei and Francis also teaches wherein the robot controller is configured to return the robot to a start position of the second task after suspending the second task before an execution of the first task (see remarks in connection with claim 9; as indicated in Nihei [0124], the system returns to a start position for an action in the task that it suspended e.g., a load to machine step; the system had been performing steps in an order, but then temporarily left that order and now returns to a step in that order; Nihei [0127] further teaches that while performing any task, the system may engage in a stop task i.e., emergency stop due to machine trouble; then the original task may be safely restarted). 
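The suspend-and-resume behavior mapped onto claims 9 and 10 above can be sketched as a short state machine. The class and log strings below are hypothetical, chosen only to mirror the claimed sequence (execute second task, detect the first task's condition mid-execution, suspend, return to start position, execute first task):

```python
# Hypothetical sketch of the claims 9-10 behavior.
class RobotController:
    def __init__(self):
        self.log = []

    def run_second_task(self, first_condition_met):
        """first_condition_met: callable polled during execution."""
        self.log.append("start second task")
        if first_condition_met():  # change detected mid-execution
            self.log.append("suspend second task")
            self.log.append("return to start position of second task")
            self.log.append("execute first task")
        else:
            self.log.append("complete second task")

rc = RobotController()
rc.run_second_task(lambda: True)   # first task's condition becomes satisfied
print(rc.log[-1])                  # prints: execute first task
```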
Regarding claim 11, the combination of Nihei and Francis teaches the invention as claimed in claim 9. The combination of Nihei and Francis also teaches wherein the first task includes a conveyance of a first workpiece (Nihei [0054] describes two processes/tasks involving handling two workpieces, which are handled in parallel; the steps of the two processes/tasks are intermingled, as seen in the list of work units/steps in Nihei [0054], which are arranged in particular order of performance; for example, a first task/process involves placing/conveying a workpiece to pallet #2, which occurs late in that order i.e., third to last), wherein the second task includes a conveyance of a second workpiece (Nihei [0054], the second process/task also involves handling a workpiece e.g., picking up a workpiece from Table #1, the seventh item on the ordered list; note that the two processes/tasks occur concurrently), and wherein the robot controller is configured to return the robot to a start position of the conveyance of the second workpiece after suspending the conveyance of the second workpiece before an execution of the conveyance of the first workpiece (Nihei [0127] teaches that for any given task, an abnormality may arise, where the task might have to be stopped and restarted; in [0124], picking up a workpiece from a table is given as an example; thus, this can happen with the second process/task i.e., picking up of the workpiece as noted above, where the second process/task is stopped/suspended and then restarted i.e., returned to a start position, due to detection of an abnormality; all of this occurs before the portion of the first process/task that relates to conveying a workpiece to pallet #2; all of the above occurs because the system has determined that the two processes/tasks should take place in parallel). Claim 23 corresponds to claim 9 and is rejected for the same reasons. Claims 7 and 25 are rejected under 35 U.S.C. 
103 as being unpatentable over Nihei and Francis, as applied in claim 6, and further in view of Skiba (US 2017/0286916).

Regarding claim 7, the combination of Nihei and Francis teaches the invention as claimed in claim 6. The combination of Nihei and Francis also teaches wherein the plurality of devices comprise an industrial machine other than the robot, wherein the plurality of tasks include a machine task executed by the industrial machine (Nihei [0158, 0038-0041, 0060-0064], there may be multiple devices/processing machines/robots that perform actions in coordination with one another; for example, as noted in Nihei [0060-0064], a first robot may have a command to unload workpieces from two machines, but the command does not specify which workpiece and machine; a constraint is that a workpiece can only be unloaded from a machine when the machine has finished processing the workpiece), wherein the robot controller is configured to: store the execution conditions including a first condition for the first task that requires a completion of the machine task and a second condition for the second task that is not requires the completion of the machine task (Francis claim 1, a robot may receive commands and determine a constraint/condition for each command; the robot then autonomously determines whether the constraint has been met when determining when to execute the corresponding command; see also Francis col. 14, line 4 to col. 
15, line 44; Nihei [0060-0064] describes two example tasks, unloading a workpiece from processing machine #1 and unloading a workpiece from processing machine #2; a start condition of each task is that the corresponding processing machine has finished processing the associated workpiece; put another way, that condition as it pertains to machine #1 naturally does not pertain to machine #2 and the unloading of its workpiece); store two or more received execution commands of two or more of the robot tasks (Francis claim 1, a robot may receive commands and determine a constraint/condition for each command; the robot then autonomously determines whether the constraint has been met when determining when to execute the corresponding command; see also Francis col. 14, line 4 to col. 15, line 44, a robot may receive commands for tasks where both tasks cannot be pursued simultaneously and the command itself does not clearly prioritize a particular task; thus, the robot will determine, based on its stored logic and analysis, which command to perform based on obtained contextual/situation data e.g., from the cloud-stored environmental data; as previously noted in connection with claim 1 and above, Nihei contemplates a situation where a robot/machine determines a change in an environment and uploads the change to the cloud, which all other robots/machines have access to and can act upon; thus, the combination of Nihei and Francis teaches, for example, a processing machine that finishes processing a workpiece and uploads that status to the cloud, so that a robot can access that changed status and then knows to implement a command i.e., unloading the workpiece, based on the known condition i.e., that its processing was finished). 
However, the combination of Nihei and Francis does not expressly disclose wherein the environment information includes a waiting time corresponding to a completion time of the machine task; control the robot to wait for the completion of the machine task without executing the second task in response to determining that the changed environment information satisfies the second condition but the waiting time is shorter than a predetermined threshold; and control the robot to execute the second task in response to determining that the changed environment information satisfies the second condition and the waiting time is longer than the threshold. In the same field of endeavor, Skiba teaches wherein the environment information includes a waiting time corresponding to a completion time of the machine task; control the robot to wait for the completion of the machine task without executing the second task in response to determining that the changed environment information satisfies the second condition but the waiting time is shorter than a predetermined threshold; and control the robot to execute the second task in response to determining that the changed environment information satisfies the second condition and the waiting time is longer than the threshold (Skiba [0200] describes a robot that may have the option of performing first and second tasks; if the wait time for the first task is below a threshold, then it will perform the first task; however, if the wait time is too long, then the robot may perform the alternative task). 
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to have incorporated wherein the environment information includes a waiting time corresponding to a completion time of the machine task; control the robot to wait for the completion of the machine task without executing the second task in response to determining that the changed environment information satisfies the second condition but the waiting time is shorter than a predetermined threshold; and control the robot to execute the second task in response to determining that the changed environment information satisfies the second condition and the waiting time is longer than the threshold as suggested in Skiba into Nihei and Francis because Nihei/Francis and Skiba pertain to analogous fields of technology. Nihei/Francis pertains to a robot that may perform multiple tasks, as long as they meet starting conditions, and that can evaluate whether those conditions are satisfied by consulting cloud-based environment information. Skiba also pertains to a robot confronting multiple tasks. In Skiba, a robot may perform a first task if the wait for that task to be ready is not too long; if it is too long, the robot may instead perform a second task that is ready to be acted upon. It would be desirable to incorporate this feature into Nihei/Francis so that a robot may consider multiple factors in determining how to prioritize tasks e.g., see Skiba [0200].

Claim 25 corresponds to claim 7 and is rejected for the same reasons.

Claims 13, 14 and 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Nihei and Francis, as applied in claim 5, and further in view of Madvil (US 2015/0277398).

Regarding claim 13, the combination of Nihei and Francis teaches the invention as claimed in claim 5. 
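The waiting-time threshold logic attributed to Skiba in the claim 7 rejection above reduces to a single comparison. The sketch below is a hypothetical illustration of that decision rule (names and the behavior when the second condition is unmet are assumptions, not taken from Skiba):

```python
# Hypothetical sketch of the claim 7 / Skiba [0200] logic: when the
# second task's condition is satisfied, compare the expected wait for
# the machine task against a predetermined threshold.
def choose_action(second_condition_met, waiting_time, threshold):
    if not second_condition_met:
        return "wait"                     # assumption: idle until a condition is met
    if waiting_time < threshold:
        return "wait_for_machine_task"    # short wait: hold off on the second task
    return "execute_second_task"          # wait too long: run the alternative task

print(choose_action(True, waiting_time=5, threshold=10))   # wait_for_machine_task
print(choose_action(True, waiting_time=30, threshold=10))  # execute_second_task
```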
The combination of Nihei and Francis also teaches wherein the circuitry is configured to generate at least a part of the one or more execution conditions (Nihei Figs. 9-11, Abstract, [0038, 0117-0121], the system selects programs/actions to execute based on the environment/state information e.g., based on a condition as to whether the last action was successfully completed; thus, the setting of each action/task and the success of the action in order is effectively a condition for a subsequent action; see, for example, Nihei [0041, 0044, 0045, 0117-0121, 0124, 0128-0136, 0175-0177], the system uses sensors to detect abnormalities and then reacts accordingly). However, the combination of Nihei and Francis does not expressly disclose the generating based on an operation in a virtual space of a virtual robot corresponding to the robot. In the same field of endeavor, Madvil teaches the generating based on an operation in a virtual space of a virtual robot corresponding to the robot (Madvil [0046, 0041, 0040, 0028, 0029, 0014, 0018], Madvil pertains to running simulations in virtual space with virtual robots and virtual workpieces; in Madvil, commands are issued to virtual robots to pursue routing tasks i.e., to follow paths; the paths of multiple virtual robots may overlap improperly, causing virtual collisions between virtual robots; these collisions are then resolved i.e., adjustments are made to the tasks/routes performed by the virtual robots to avoid collisions; the adjusted task/routes can be applied to real robots). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to have incorporated the generating based on an operation in a virtual space of a virtual robot corresponding to the robot as suggested in Madvil into Nihei and Francis because Nihei and Madvil pertain to analogous fields of technology. 
Both Nihei and Madvil relate to systems in which robots carry workpieces to various destinations and follow issued commands in sequence. In Nihei, a condition of each command generally is that the prior command was successfully completed. Madvil also relates to a system of robot control. In Madvil, commands for robots may be adjusted to avoid collision, by running simulations for corresponding virtual robots in a virtual space. It would be obvious to modify Nihei to incorporate such simulations, such that one or more commands and completion conditions of a robot are formed in part based on the virtual simulation techniques described in Madvil. It would be desirable to incorporate this feature into Nihei to facilitate optimal commands and conditions, e.g., see Madvil [0046, 0041, 0040, 0028, 0029, 0014, 0018].

Regarding claim 14, the combination of Nihei, Francis and Madvil teaches the invention as claimed in claim 13. The combination of Nihei, Francis and Madvil also teaches wherein the circuitry is configured to change at least a part of the execution conditions so as to shorten an execution time of the plurality of tasks processes in the virtual space (Madvil [0040], the system can create paths for the movements of robots, i.e., an earlier portion of the path is a condition for performing a later portion of the path, as noted in connection with claim 13; as noted in Madvil [0040], the reference incorporates by reference Shapiro US 2011/0153080; Shapiro [0016-0019] notes that the system can optimize paths so as to reduce cycle time, i.e., the time needed to move the robot from the origin point to the destination point).

Regarding claim 16, the combination of Nihei, Francis and Madvil teaches the invention as claimed in claim 13.
The combination of Nihei, Francis and Madvil also teaches wherein the plurality of devices include a second robot (Madvil [0046, 0041, 0040, 0028, 0029, 0014, 0018] contemplates control of multiple robots performing respective routes/tasks), wherein the plurality of tasks include a first robot task executed by the robot and a second robot task executed by the second robot (Madvil [0046, 0041, 0040, 0028, 0029, 0014, 0018] contemplates control of multiple robots performing respective routes/tasks), and wherein the circuitry is configured to generate an execution condition of the first robot task and an execution condition of the second robot task so as to avoid collision in the virtual space between the virtual robot and a second virtual robot corresponding to the second robot (Madvil [0046, 0041, 0040, 0028, 0029, 0014, 0018], Madvil pertains to running simulations in virtual space with virtual robots and virtual work pieces; in Madvil, commands are issued to virtual robots to pursue routing tasks, i.e., to follow paths; the paths of multiple virtual robots may overlap improperly, causing virtual collisions between virtual robots; these collisions are then resolved, i.e., adjustments are made to the tasks/routes performed by the virtual robots to avoid collisions; the adjusted tasks/routes can be applied to real robots).

Regarding claim 17, the combination of Nihei, Francis and Madvil teaches the invention as claimed in claim 16.
The combination of Nihei, Francis and Madvil also teaches wherein the circuitry is configured to: derive an overlapping region of an operation region of the virtual robot executing the first robot task and an operation region of the second virtual robot executing the second robot task (Madvil [0046, 0041, 0040, 0028, 0029, 0014, 0018], Madvil pertains to running simulations in virtual space with virtual robots and virtual work pieces; in Madvil, commands are issued to virtual robots to pursue routing tasks, i.e., to follow paths; the paths of multiple virtual robots may overlap improperly, causing virtual collisions between virtual robots; these collisions are then resolved, i.e., adjustments are made to the tasks/routes performed by the virtual robots to avoid collisions; the adjusted tasks/routes can be applied to real robots); generate the execution condition of the first robot task to require that the second virtual robot is not located in the overlapping region; and generate the execution condition of the second robot task to require that the virtual robot is not located in the overlapping region (Madvil [0046, 0041, 0040, 0028, 0029, 0014, 0018], as discussed above, virtual collisions between virtual robots are resolved by adjusting the tasks/routes performed by the virtual robots, and the adjusted tasks/routes can be applied to real robots).

Claim 18 corresponds to claim 1 and is rejected for the same reasons.

Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Nihei, Francis and Madvil, as applied in claim 14, and further in view of McGregor (US 2012/0138651).
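As a hypothetical illustration of the overlapping-region conditions recited in claim 17 above (not code from any cited reference), operation regions can be modeled as one-dimensional intervals and each generated execution condition as a predicate on the other robot's position:

```python
def overlap(region_a, region_b):
    """Intersection of two 1-D operation regions given as (lo, hi)
    tuples, or None if the regions do not overlap."""
    lo = max(region_a[0], region_b[0])
    hi = min(region_a[1], region_b[1])
    return (lo, hi) if lo < hi else None

def execution_condition(overlap_region):
    """Condition for one robot's task: the other robot must not be
    located in the overlapping region."""
    def satisfied(other_robot_position):
        if overlap_region is None:
            return True  # no shared region, no collision risk
        lo, hi = overlap_region
        return not (lo <= other_robot_position <= hi)
    return satisfied
```

With regions (0, 5) and (3, 8), the derived overlap is (3, 5); the first robot's task may run only while the second robot is outside that span, and symmetrically for the second robot's task.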
Regarding claim 15, the combination of Nihei, Francis and Madvil teaches the invention as claimed in claim 14. However, the combination of Nihei, Francis and Madvil does not expressly disclose wherein the circuitry is configured to: store virtual progress information of the plurality of tasks in the virtual space; store virtual environment information; output virtual execution commands of the next tasks based on the stored virtual progress information; update the stored virtual environment information in accordance with operations in a virtual space of a plurality of virtual devices corresponding to the plurality of devices; update the virtual progress information in accordance with progress of the next tasks in the virtual space; store virtual execution conditions corresponding to the execution conditions; receive a virtual execution command that is output to a virtual robot controller corresponding to the robot controller, wherein the virtual robot controller is configured to control a virtual robot corresponding to the robot; monitor the stored virtual environment information to detect a change in the stored virtual environment information caused by an operation executed by one virtual device of the plurality of virtual devices, other than the virtual robot; and control the virtual robot to virtually execute one robot task of the robot tasks corresponding to the received virtual execution command in response to determining that the changed virtual environment information satisfies one of the virtual execution conditions corresponding to the one robot task in the virtual space.
In the same field of endeavor, McGregor teaches wherein the circuitry is configured to: store virtual progress information of the plurality of tasks in the virtual space; store virtual environment information; output virtual execution commands of the next tasks based on the stored virtual progress information; update the stored virtual environment information in accordance with operations in a virtual space of a plurality of virtual devices corresponding to the plurality of devices; update the virtual progress information in accordance with progress of the next tasks in the virtual space; store virtual execution conditions corresponding to the execution conditions; receive a virtual execution command that is output to a virtual robot controller corresponding to the robot controller, wherein the virtual robot controller is configured to control a virtual robot corresponding to the robot; monitor the stored virtual environment information to detect a change in the stored virtual environment information caused by an operation executed by one virtual device of the plurality of virtual devices,
other than the virtual robot; and control the virtual robot to virtually execute one robot task of the robot tasks corresponding to the received virtual execution command in response to determining that the changed virtual environment information satisfies one of the virtual execution conditions corresponding to the one robot task in the virtual space ([0029, 0033, 0051, 0055, 0056, 0066], it is known to execute a robot simulation, where each robot and its operations are digitally mimicked; all algorithms executed on the actual robots can be used in the simulation; any mechanical or control components can also be simulated; thus, it would be obvious to virtually simulate any operations or features described in Nihei/Francis in connection with claim 1, e.g., storing cloud-based environment information, executing commands, updating environment information, updating progress information, receiving execution commands at a robot, etc.). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to have incorporated store virtual progress information of the plurality of tasks in the virtual space; store virtual environment information; output virtual execution commands of the next tasks based on the stored virtual progress information; update the stored virtual environment information in accordance with operations in a virtual space of a plurality of virtual devices corresponding to the plurality of devices; update the virtual progress information in accordance with progress of the next tasks in the virtual space; store virtual execution conditions corresponding to the execution conditions; receive a virtual execution command that is output to a virtual robot controller corresponding to the robot controller, wherein the virtual robot controller is configured to control a virtual robot corresponding to the robot; monitor the stored virtual environment information to detect a change in the stored virtual environment
information caused by an operation executed by one virtual device of the plurality of virtual devices, other than the virtual robot; and control the virtual robot to virtually execute one robot task of the robot tasks corresponding to the received virtual execution command in response to determining that the changed virtual environment information satisfies one of the virtual execution conditions corresponding to the one robot task in the virtual space, as suggested in McGregor, into Nihei, Francis and Madvil, because Nihei/Francis/Madvil and McGregor pertain to analogous fields of technology. Nihei/Francis/Madvil and McGregor pertain to using simulations to improve the operation of a robotic/industrial system. In McGregor, all features, operations and components of the robotic system can be simulated. It would be desirable to incorporate this feature into Nihei/Francis/Madvil to accurately simulate the operations of the system and reduce associated development work, e.g., see McGregor Abstract, [0029, 0033, 0051, 0055, 0056, 0066].

Allowable Subject Matter

Claims 12 and 24 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. At best, the prior art of record, specifically Nihei, teaches a system for controlling a robot and multiple manufacturing devices; the robot executes tasks on a workpiece; sensor data is gathered to determine when tasks are completed and to allow the system to react to abnormalities, e.g., see Nihei Figs. 1, 6, 12-14, Abstract, [0038, 0141-0142, 0041, 0044-0045, 0124, 0128-0136, 0175-0177]. Francis teaches a robotic device that can independently determine constraints on tasks and select/configure operations; the system uploads its sensor/environmental data to the cloud, e.g., see Francis Abstract, col. 13, lines 8-30; col. 14, lines 4-15; col. 15, lines 27-45; col. 15, lines 45-60; col. 4, lines 18-40.
Liu (WO201877032) teaches a robot that suspends performance of a task and switches to another task, e.g., see Liu pages 10-12. Kuffner (US 10,207,407) teaches simulating a device to perform tasks, where the performance of the task is compared against metrics, e.g., see Kuffner Abstract, col. 7, line 40 to col. 8, line 18; col. 27, lines 24-41.

Response to Arguments

The Examiner acknowledges the Applicant's amendments to claims 1 and 18. Regarding claims 1 and 18, Applicant alleges that the cited prior art does not teach the amended limitations, "wherein the robot controller is configured to: store execution conditions predetermined for robot tasks; receive an execution command that is output to the robot controller from the circuitry; monitor the remote storage device to detect a change in the stored environment information caused by an operation executed by one device of the plurality of devices, other than the robot, that is controlled by another local controller of the plurality of local controllers; and control the robot to execute one robot task of the robot tasks corresponding to the received execution command in response to determining that the changed environment information, caused by the operation executed by the one device, satisfies one of the execution conditions corresponding to the one robot task." More specifically, Applicant notes the following on page 17 of the reply:

The robot of Francis determines whether to clean the floor based on its own sensory data as to whether or not the guests are still present. The robot of Francis is therefore required to not only compare the command to clean the floor with its own preset operating conditions (namely how to "clean the floor" and "stay quiet"), but the robot of Francis is also required to process all of the environmental information itself based on its own sensory data.
Therefore, Francis does not disclose a role-sharing architecture in which a host centrally updates environment information in a remote storage device and the robot monitors the remote storage device to autonomously execute the next task. Nor does Nihei disclose such a role-sharing architecture; rather, Nihei teaches a centralized scheme in which the host makes all determinations.

Examiner respectfully disagrees with Applicant's interpretation of Nihei and Francis. Although the Applicant does accurately describe various embodiments of the Francis invention, the Francis invention is not limited to such embodiments. In particular, Francis Figs. 3 and 4, col. 9, line 20 to col. 11, line 31 describe a system of multiple robots, which can make observations or learn behaviors and then upload such data to the cloud. Through the cloud, the experiences of each robot are shared with all the other robots. As noted in col. 11, lines 20-30, the robot can interact with the cloud to determine the satisfaction of conditions for executing actions, i.e., the robot can autonomously resolve conflicts over what action to take next. It should be noted that Nihei teaches a somewhat similar approach. For instance, Nihei [0158, 0038-0041, 0060-0064, 0158-0159] describes situations where a machine/robot can inform a central controller/system when particular conditions have been satisfied, e.g., a machine has processed a workpiece, which allows another robot/machine to act, e.g., to unload the processed workpiece from the machine. In the view of the Examiner, the combination of Nihei and Francis therefore teaches a robot that can store conditions and receive commands, and will autonomously interpret and act on the conditions and commands based on interacting with cloud-based environmental information. Such features appear to read upon the above amended limitations.

Applicant further alleges that claims 3-17 and 20-21 are allowable in view of their dependency on claims 1 and 18.
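The role-sharing architecture disputed above (a host updating environment information in a remote store, while the robot controller monitors that store and decides autonomously when to act) can be sketched as follows. This is a hypothetical illustration; all class and method names are invented, not taken from the claims or the cited references:

```python
class RemoteStore:
    """Stand-in for the cloud-based environment storage."""
    def __init__(self):
        self.env = {}

    def update(self, key, value):
        # The host (or another device's local controller) writes
        # changed environment information here.
        self.env[key] = value


class RobotController:
    """Stores predetermined execution conditions, receives an
    execution command, and monitors the remote store; the robot task
    runs only once the changed environment satisfies its condition."""
    def __init__(self, store, conditions):
        self.store = store
        self.conditions = conditions  # task name -> predicate over env
        self.pending_task = None
        self.executed = []

    def receive_command(self, task):
        self.pending_task = task

    def poll(self):
        # Autonomous check against the monitored environment; no
        # per-step decision is requested from the host.
        task = self.pending_task
        if task and self.conditions[task](self.store.env):
            self.executed.append(task)
            self.pending_task = None
```

In this sketch the host only writes environment updates; the decision to execute rests with the robot controller, which is the division of roles at issue in the arguments above.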
Claims 3-17 and 20-21 are rejected as being taught by Nihei, Francis, McGregor and/or Madvil.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Dixon (Dixon, K., et al., "RAVE: A Real and Virtual Environment for Multiple Mobile Robot Systems," IEEE/RSJ 1999) teaches a system for simulating robots and robot programs, e.g., see pages 1360-1362.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ERIC YOON whose telephone number is (408) 918-7581. The examiner can normally be reached from 9 am to 5 pm ET, Monday through Friday. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott Baderman, can be reached at telephone number 571-272-3644. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from Patent Center. Status information for published applications may be obtained from Patent Center. Status information for unpublished applications is available through Patent Center for authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.

/ERIC J YOON/Primary Examiner, Art Unit 2118

Prosecution Timeline

Dec 06, 2022
Application Filed
May 02, 2025
Non-Final Rejection — §103, §112
Aug 07, 2025
Response Filed
Sep 02, 2025
Final Rejection — §103, §112
Nov 04, 2025
Examiner Interview (Telephonic)
Nov 04, 2025
Examiner Interview Summary
Dec 03, 2025
Request for Continued Examination
Dec 10, 2025
Response after Non-Final Action
Feb 20, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585258
A SYSTEM FOR PROCESSING INPUT DATA FROM A FOOD HANDLING LINE TO DETERMINE TRIGGER DATA FOR SAMPLING, AND A METHOD THEREOF
2y 5m to grant Granted Mar 24, 2026
Patent 12585697
Interactive Content Feedback System
2y 5m to grant Granted Mar 24, 2026
Patent 12578702
APPARATUS AND METHOD FOR CALIBRATING FOR SKEW WITHOUT INDEPENDENT REFERENCE OBJECT AND FOR COMPENSATING FOR COEFFICIENT OF THERMAL EXPANSION AND MOISTURE ABSORPTION
2y 5m to grant Granted Mar 17, 2026
Patent 12571551
SYSTEM AND METHOD FOR CONTROLLING TEMPERATURE AND WATER CONTENT OF AN AIRSTREAM
2y 5m to grant Granted Mar 10, 2026
Patent 12567613
AIRCRAFT THERMAL MANAGEMENT SYSTEM FOR AN ENERGY STORAGE SYSTEM
2y 5m to grant Granted Mar 03, 2026
Based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
58%
Grant Probability
99%
With Interview (+67.0%)
3y 0m
Median Time to Grant
High
PTA Risk
Based on 253 resolved cases by this examiner. Grant probability derived from career allow rate.
