Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
This Office action is in response to the amendment filed on 02/02/2026. Claims 1-6, 8-9, and 11-17 are currently pending with claims 1, 6, 8-9, and 14-15 being amended, claim 7 being cancelled, and claim 17 being newly added.
Response to Amendment
The amendments to the claims submitted on 02/02/2026 overcome the claim objections set forth in the previous Office action, except for those set forth in the Claim Objections section of this Office action.
Response to Arguments
Applicant's arguments filed 02/02/2026 have been fully considered but they are not persuasive.
In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). The Examiner further notes that Schultz suggests that, in response to a selection or an execution, the viewing area and the information displayed to the user change. It would have been obvious to incorporate this concept into the system taught by Takeuchi, which displays the information claimed in the instant application in a different fashion. This would allow the system to selectively display information relevant to the current operation, ensuring that the interface remains both informative and efficient.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-6, 8-9, and 13-17 are rejected under 35 U.S.C. 103 as being unpatentable over Takeuchi et al. (US 20190232493 A1), hereinafter Takeuchi, in view of Jin et al. (US 20230236804 A1), hereinafter Jin, and Schultz et al. (US 20030227483 A1), hereinafter Schultz.
Regarding claim 1, Takeuchi teaches:
1.(Currently Amended) A teaching device configured to create a control program of a robot comprising:
a processor configured to: (Paragraph 0057, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”)
generate a program creation screen for performing program creation by a plurality of commands each representing a function constituting a control program of the robot; and (Paragraph 0005, "The robot control device includes a display control unit that displays an input screen including an operation flow creation area for creating an operation flow of work on a display device; a conversion unit that converts the created operation flow into a control program; and a control execution unit that executes the control program to control the robot. The input screen is configured to display a plurality of operation objects indicating a plurality of operations including an operation using force control, and one or more conditional branch objects indicating a conditional branch, as options. The operation flow creation area is configured to create an operation flow including the conditional branch by graphically placing an operation object selected from the plurality of operation objects and the conditional branch object." As well as Paragraph 0082, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”)
cause a display to display, … ; and (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figures 6E and 7 which demonstrate that different operation objects may be selected to add to the program and information on these objects is displayed during selection.)
execute the control program in order to control the robot, (Paragraph 0005, “According to a first aspect of the invention, a robot control device that creates a control program for work of a robot with a force detector is provided. The robot control device includes a display control unit that displays an input screen including an operation flow creation area for creating an operation flow of work on a display device; a conversion unit that converts the created operation flow into a control program; and a control execution unit that executes the control program to control the robot. The input screen is configured to display a plurality of operation objects indicating a plurality of operations including an operation using force control, and one or more conditional branch objects indicating a conditional branch, as options. The operation flow creation area is configured to create an operation flow including the conditional branch by graphically placing an operation object selected from the plurality of operation objects and the conditional branch object.”) wherein the processor is further configured to cause the display to display the information … (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figures 6E and 7 which demonstrate that different operation objects may be selected to add to the program and information on these objects is displayed during selection.) …
the information related to the execution state includes at least one of a state of a sensor or equipment associated with the command, (Paragraph 0160, "FIG. 6E shows an example of a window W1a for creating an operation flow of second type work using the camera 30. The window W1a is similar to the window W1 for the first type work shown in FIG. 6D. However, in the main view area MV, the window W1a is different from the window W1 in that a camera image display area IM for displaying an image captured with the camera 30 is provided in the main view area MV. In the camera image display area IM, it is preferable to be able to designate an image processing area to be subjected to an image processing such as product checking among the images captured with the camera 30. In this way, the load of the image processing can be reduced so that the operation of the robot can be performed at high speed. The content of areas FL, SQ, PR, RS, and RN other than the window W1a for the second type work in the main view area MV are substantially the same as the content of these areas in the window W1 for the first type work. However, content different from that of the window W1 for the first type work may be displayed. The window W1 for the first type work corresponds to a “first type input screen”, and the window W1a for the second type work corresponds to a “second type input screen”." Please see areas "PR" and "RS" in Figure 6E.) information related to a detection value of the sensor associated with the command, (Paragraph 0148, "In the main view area MV, the temporal change of the force Fx in the X axis direction and the torque Tx around the X axis are displayed among the plurality of force measured by the force detector 130 at the time of executing the operation flow. In the main view area MV, it is possible to select and display any one or more temporal changes of force of the plurality of forces measured by the force detector 130. It is also possible to display the temporal change of the measured position of the TCP and the temporal change of the difference between the target position and the measured position of the TCP on the main view area MV. The period of displaying the result in the main view area MV can be an operation period of any one of the operation objects in the operation flow, or can be the entire period from the start to stop of the execution. For example, when any operation object is selected in the operation flow creation area FL, the execution result of the operation period of the operation object is displayed. When the sequence block SB1 is selected, the result of the entire period from the start to stop of the execution is displayed. The period of displaying result in the main view area MV may be an operation period over a plurality of continuing operation objects. The information of some execution results of the control program is also displayed in the result area RS. For example, for any operation object, it is possible to display the end state of the operation (success or failure), time required for the operation, force at the end of the operation, the position at the end of the operation, and the like in the result area RS. Other types of results than the one shown in FIG. 11A may be displayed in the main view area MV. For example, information related to the robot, such as the speed of the robot and the angle of each joint may be displayed.") or an internal variable of the control program related to the command. 
(Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figures 6E and 7 which demonstrate that different operation objects may be selected to add to the program and information on these objects is displayed during selection.)
Examiner Note: The currently provided claim language only requires the information to include one of:
“a state of a sensor or equipment associated with the command” (This is demonstrated in at least Figure 6E in areas “PR” and “RS” which show different sensors and the execution results as well as parameters which may be defined. The term “state” may be understood under the broadest reasonable interpretation to include presence of a sensor or equipment, inclusion in the control program being executed, successful data gathering resulting in execution results, successful performance of an operation including the sensor or equipment, any parameters associated with the sensor or equipment, etc.)
“information related to a detection value of the sensor associated with the command” (This is demonstrated at least by the “execution result” section of area “RS” as well as by subsection “IM” of the main viewing area which displays an image captured by a camera.)
“internal variable of the control program related to the command” (This is demonstrated by the parameter setting area which allows monitoring and adjusting of parameters (variables) via the interface.)
Takeuchi does not specifically teach the display being a pop-up display, or displaying information on a control block when that block is executed. However, Schultz, in the same field of endeavor of robotics, teaches:
… when a command of the plurality of commands disposed in the program creation screen is executed by execution of the control program or is selected, information related to an execution state of the selected command … related to the execution state (Paragraph 0121, “In one embodiment, when the user executes the application, the displayed icons may be highlighted to visually indicate which of the operations in the application is currently being executed. The connections between icons may also be animated, such as with "propagating bubbles", to visually indicate flow of data or flow of execution. Thus, the plurality of interconnected icons may be animated, such as in the "execution highlighting" feature of LabVIEW, to visually indicate which operations are being performed. In one embodiment, corresponding modifications may be made to the displayed image as the application is executed. The corresponding modifications made to the displayed image may be made as a respective icon corresponding to the operation being performed is highlighted. This provides the user with a visual indication of which operation is being executed, and how this operation is affecting the image. This provides an improved debugging or feedback tool to the user.”) …
However, Jin, in the same field of endeavor of robotics, teaches:
… in a pop-up display, … (Paragraph 0017, "As another example, some embodiments include an apparatus (10) for automatic programming, comprising: a user interaction module (201), configured to receive a user's request on creating a first global parameter of a specific type by detecting the user's dragging a control corresponding to the specific type from a first user interface (701) and dropping the control on a second user interface (702); a displaying module (202), configured to display a first popup window (804) for editing the first global parameter on the second user interface (702); a processing module (203), configured to receive the user's editing operations on the first global parameter in the first popup window (804); and create the first global parameter according to the user's editing operations on the first global parameter; and the displaying module (202), further configured to display a new added first card (802) of the first global parameter in a list (802′) of global parameters on a third user interface (703).")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the workflow generation and robotic system as taught by Takeuchi with the functionality taught by Schultz to highlight current operations during execution of the program and with the ability to use a pop-up display as taught by Jin. This would enable a user to continue their work without disruption while accessing focused information within the pop-up window and would further allow the user to efficiently follow the operation program and debug the program as needed.
Regarding claim 2, where all the limitations of claim 1 are discussed above, Takeuchi further teaches:
2. (previously presented) The teaching device according to claim 1, wherein the program creation screen includes a first region that displays a list of the plurality of commands, and (Paragraph 0063-0065, "(1) Main view area MV is an area for displaying options of operation objects and conditional branch objects to be described later, execution results of a control program, and the like.
(2) Operation flow creation area FL is an area for displaying the operation flows in which a plurality of objects are graphically placed in an editable manner. The work represented by the operation flow is also called “sequence”.
(3) Sequence display area SQ is an area for displaying a tree structure of the sequence." As well as Paragraph 0082, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”)
a second region for creating the control program by disposing one or more commands selected from the first region, and (Paragraph 0064, "Operation flow creation area FL is an area for displaying the operation flows in which a plurality of objects are graphically placed in an editable manner. The work represented by the operation flow is also called “sequence”.")
the processor (Paragraph 0057, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”) is further configured to cause the display to displays, in response to selection on a command in the first region or the second region or execution instruction on a command disposed in the second region, … (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figures 6E and 7 which demonstrate that different operation objects may be selected to add to the program and information on these objects is displayed during selection.)
While Takeuchi does teach displaying an execution information area, Takeuchi does not explicitly state that this is the execution state information. However, Schultz, in the same field of endeavor of robotics, teaches:
… the information. (Paragraph 0121, “In one embodiment, when the user executes the application, the displayed icons may be highlighted to visually indicate which of the operations in the application is currently being executed. The connections between icons may also be animated, such as with "propagating bubbles", to visually indicate flow of data or flow of execution. Thus, the plurality of interconnected icons may be animated, such as in the "execution highlighting" feature of LabVIEW, to visually indicate which operations are being performed. In one embodiment, corresponding modifications may be made to the displayed image as the application is executed. The corresponding modifications made to the displayed image may be made as a respective icon corresponding to the operation being performed is highlighted. This provides the user with a visual indication of which operation is being executed, and how this operation is affecting the image. This provides an improved debugging or feedback tool to the user.”)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the workflow generation and robotic system as taught by Takeuchi with the functionality taught by Schultz to highlight current operations during execution of the program and with the ability to use a pop-up display as taught by Jin. This would enable a user to efficiently follow the operation program and debug the program as needed.
Regarding claim 3, where all the limitations of claim 1 are discussed above, Takeuchi further teaches:
3. (previously presented) The teaching device according to claim 1, wherein the processor (Paragraph 0057, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”) is further configured to cause the display to displays, during selection of a command disposed in the program creation screen, … (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figures 6E and 7 which demonstrate that different operation objects may be selected to add to the program and information on these objects is displayed during selection.)
While Takeuchi does teach displaying an execution information area, Takeuchi does not explicitly state that this is the execution state information. However, Schultz, in the same field of endeavor of robotics, teaches:
… the information. (Paragraph 0121, “In one embodiment, when the user executes the application, the displayed icons may be highlighted to visually indicate which of the operations in the application is currently being executed. The connections between icons may also be animated, such as with "propagating bubbles", to visually indicate flow of data or flow of execution. Thus, the plurality of interconnected icons may be animated, such as in the "execution highlighting" feature of LabVIEW, to visually indicate which operations are being performed. In one embodiment, corresponding modifications may be made to the displayed image as the application is executed. The corresponding modifications made to the displayed image may be made as a respective icon corresponding to the operation being performed is highlighted. This provides the user with a visual indication of which operation is being executed, and how this operation is affecting the image. This provides an improved debugging or feedback tool to the user.”) …
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the workflow generation and robotic system as taught by Takeuchi with the functionality taught by Schultz to highlight current operations during execution of the program and with the ability to use a pop-up display as taught by Jin. This would enable a user to efficiently follow the operation program and debug the program as needed.
Regarding claim 4, where all the limitations of claim 1 are discussed above, Takeuchi further teaches:
4. (previously presented) The teaching device according to claim 1, wherein the processor (Paragraph 0057, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”) is further configured to cause the display to display, in response to an execution instruction on a command disposed in the program creation screen, (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figure11A which shows the display and the information shown as a result of the generated program being executed.) …
While Takeuchi does teach displaying an execution information area, Takeuchi does not explicitly state that this is the execution state information. However, Schultz, in the same field of endeavor of robotics, teaches:
… the information during execution, or after execution of the command. (Paragraph 0121, “In one embodiment, when the user executes the application, the displayed icons may be highlighted to visually indicate which of the operations in the application is currently being executed. The connections between icons may also be animated, such as with "propagating bubbles", to visually indicate flow of data or flow of execution. Thus, the plurality of interconnected icons may be animated, such as in the "execution highlighting" feature of LabVIEW, to visually indicate which operations are being performed. In one embodiment, corresponding modifications may be made to the displayed image as the application is executed. The corresponding modifications made to the displayed image may be made as a respective icon corresponding to the operation being performed is highlighted. This provides the user with a visual indication of which operation is being executed, and how this operation is affecting the image. This provides an improved debugging or feedback tool to the user.”) …
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the workflow generation and robotic system as taught by Takeuchi with the functionality taught by Schultz to highlight current operations during execution of the program and with the ability to use a pop-up display as taught by Jin. This would enable a user to efficiently follow the operation program and debug the program as needed.
Regarding claim 5, where all the limitations of claim 4 are discussed above, Takeuchi further teaches:
5. (previously presented) The teaching device according to claim 4, wherein the processor (Paragraph 0057, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”) is further configured to cause the display to display (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figure11A which shows the display and the information shown as a result of the generated program being executed.) …
While Takeuchi does teach displaying an execution information area, Takeuchi does not explicitly state that this is the execution state information. However, Schultz, in the same field of endeavor of robotics, teaches:
… the information … during the execution of the command while the command is executed. (Paragraph 0121, “In one embodiment, when the user executes the application, the displayed icons may be highlighted to visually indicate which of the operations in the application is currently being executed. The connections between icons may also be animated, such as with "propagating bubbles", to visually indicate flow of data or flow of execution. Thus, the plurality of interconnected icons may be animated, such as in the "execution highlighting" feature of LabVIEW, to visually indicate which operations are being performed. In one embodiment, corresponding modifications may be made to the displayed image as the application is executed. The corresponding modifications made to the displayed image may be made as a respective icon corresponding to the operation being performed is highlighted. This provides the user with a visual indication of which operation is being executed, and how this operation is affecting the image. This provides an improved debugging or feedback tool to the user.”) …
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the workflow generation and robotic system as taught by Takeuchi with the functionality taught by Schultz to highlight current operations during execution of the program and with the ability to use a pop-up display as taught by Jin. This would enable a user to efficiently follow the operation program and debug the program as needed.
Regarding claim 6, where all the limitations of claim 4 are discussed above, Takeuchi further teaches:
6. (Currently Amended) The teaching device according to claim 4, wherein the processor (Paragraph 0057, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”) is further configured to cause the display to display the information related to the execution state, which further includes an execution result of the execution of the command, after the command is executed. (Paragraph 0067, "Result area RS is an area for displaying execution results of the control program.")
Regarding claim 8, where all the limitations of claim 1 are discussed above, Takeuchi further teaches:
8. (currently amended) The teaching device according to claim 1, wherein the sensor is a visual sensor. (Paragraph 0160, "FIG. 6E shows an example of a window W1a for creating an operation flow of second type work using the camera 30. The window W1a is similar to the window W1 for the first type work shown in FIG. 6D. However, in the main view area MV, the window W1a is different from the window W1 in that a camera image display area IM for displaying an image captured with the camera 30 is provided in the main view area MV. In the camera image display area IM, it is preferable to be able to designate an image processing area to be subjected to an image processing such as product checking among the images captured with the camera 30. In this way, the load of the image processing can be reduced so that the operation of the robot can be performed at high speed. The content of areas FL, SQ, PR, RS, and RN other than the window W1a for the second type work in the main view area MV are substantially the same as the content of these areas in the window W1 for the first type work. However, content different from that of the window W1 for the first type work may be displayed. The window W1 for the first type work corresponds to a “first type input screen”, and the window W1a for the second type work corresponds to a “second type input screen”." Please see areas "PR" and "RS" in Figure 6E.)
Regarding claim 9, where all the limitations of claim 1 are discussed above, Takeuchi further teaches:
9. (currently amended) The teaching device according to claim 1, wherein the sensor is a force sensor. (Paragraph 0054, "The robot 100 can set the end effectors at any positions in any orientations within a movable range of the arm 110. A force detector 130 and an end effector 140 are installed on the arm flange 120. In the present embodiment, the end effector 140 is a gripper, but any other type of end effector can be used. The force detector 130 is a six-axis sensor that measures three-axis force acting on the end effector 140 and torque acting around the three axes. The force detector 130 measures magnitude of force parallel to three measurement axes orthogonal to each other in a sensor coordinate system which is a unique coordinate system, and the magnitude of torque around the three measurement axes. A force sensor as a force detector may be provided at any one or more joints J1 to J5 other than the joint J6. The force detector may only measure the force and torque in a direction of control, and a unit for directly measuring the force and torque like the force detector 130 or a unit for measuring the torque of the joint of the robot to obtain the force and the torque indirectly may be used. The force detector may measure the force and torque only in the direction of controlling force." Please also see Figure 11B)
Regarding claim 13, where all the limitations of claim 1 are discussed above, Takeuchi further teaches:
13. (previously presented) The teaching device according to claim 1, wherein each of the plurality of commands is represented by an icon or a statement. (Paragraph 0076, "Main view area MV: a plurality of categories indicating the operations constituting the operation flow and the categories of the conditional branch, name and icon of the object belonging to each of the categories, the description of the contents of the object, and an image showing the outline of the object are displayed. The object displayed on the main view area MV can be arbitrarily added to the operation flow in the operation flow creation area FL by a work such as drag and drop.")
Regarding claim 14, Takeuchi teaches:
14. (Currently Amended) A teaching device configured to create a control program of a robot comprising:
a processor (Paragraph 0057, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”) configured to:
generate a program creation screen for performing program creation by a plurality of commands each representing a function constituting a control program of the robot; (Paragraph 0005, "The robot control device includes a display control unit that displays an input screen including an operation flow creation area for creating an operation flow of work on a display device; a conversion unit that converts the created operation flow into a control program; and a control execution unit that executes the control program to control the robot. The input screen is configured to display a plurality of operation objects indicating a plurality of operations including an operation using force control, and one or more conditional branch objects indicating a conditional branch, as options. The operation flow creation area is configured to create an operation flow including the conditional branch by graphically placing an operation object selected from the plurality of operation objects and the conditional branch object." As well as Paragraph 0082, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”)
cause a display to display, … ; (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figures 6E and 7 which demonstrate that different operation objects may be selected to add to the program and information on these objects is displayed during selection.) and
execute the control program in order to control the robot, (Paragraph 0005, “According to a first aspect of the invention, a robot control device that creates a control program for work of a robot with a force detector is provided. The robot control device includes a display control unit that displays an input screen including an operation flow creation area for creating an operation flow of work on a display device; a conversion unit that converts the created operation flow into a control program; and a control execution unit that executes the control program to control the robot. The input screen is configured to display a plurality of operation objects indicating a plurality of operations including an operation using force control, and one or more conditional branch objects indicating a conditional branch, as options. The operation flow creation area is configured to create an operation flow including the conditional branch by graphically placing an operation object selected from the plurality of operation objects and the conditional branch object.”)
wherein the information related to the execution state includes at least one of a state of a sensor or equipment associated with the command, (Paragraph 0160, "FIG. 6E shows an example of a window W1a for creating an operation flow of second type work using the camera 30. The window W1a is similar to the window W1 for the first type work shown in FIG. 6D. However, in the main view area MV, the window W1a is different from the window W1 in that a camera image display area IM for displaying an image captured with the camera 30 is provided in the main view area MV. In the camera image display area IM, it is preferable to be able to designate an image processing area to be subjected to an image processing such as product checking among the images captured with the camera 30. In this way, the load of the image processing can be reduced so that the operation of the robot can be performed at high speed. The content of areas FL, SQ, PR, RS, and RN other than the window W1a for the second type work in the main view area MV are substantially the same as the content of these areas in the window W1 for the first type work. However, content different from that of the window W1 for the first type work may be displayed. The window W1 for the first type work corresponds to a “first type input screen”, and the window W1a for the second type work corresponds to a “second type input screen”." Please see areas "PR" and "RS" in Figure 6E.) information related to a detection value of the sensor associated with the command, (Paragraph 0148, "In the main view area MV, the temporal change of the force Fx in the X axis direction and the torque Tx around the X axis are displayed among the plurality of force measured by the force detector 130 at the time of executing the operation flow. In the main view area MV, it is possible to select and display any one or more temporal changes of force of the plurality of forces measured by the force detector 130. It is also possible to display the temporal change of the measured position of the TCP and the temporal change of the difference between the target position and the measured position of the TCP on the main view area MV. The period of displaying the result in the main view area MV can be an operation period of any one of the operation objects in the operation flow, or can be the entire period from the start to stop of the execution. For example, when any operation object is selected in the operation flow creation area FL, the execution result of the operation period of the operation object is displayed. When the sequence block SB1 is selected, the result of the entire period from the start to stop of the execution is displayed. The period of displaying result in the main view area MV may be an operation period over a plurality of continuing operation objects. The information of some execution results of the control program is also displayed in the result area RS. For example, for any operation object, it is possible to display the end state of the operation (success or failure), time required for the operation, force at the end of the operation, the position at the end of the operation, and the like in the result area RS. Other types of results than the one shown in FIG. 11A may be displayed in the main view area MV. For example, information related to the robot, such as the speed of the robot and the angle of each joint may be displayed.") or an internal variable of the control program related to the command. 
(Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figures 6E and 7 which demonstrate that different operation objects may be selected to add to the program and information on these objects is displayed during selection.)
Examiner Note: The currently provided claim language only requires the information to include one of:
“a state of a sensor or equipment associated with the command” (This is demonstrated in at least Figure 6E in areas “PR” and “RS” which show different sensors and the execution results as well as parameters which may be defined. The term “state” may be understood under the broadest reasonable interpretation to include presence of a sensor or equipment, inclusion in the control program being executed, successful data gathering resulting in execution results, successful performance of an operation including the sensor or equipment, any parameters associated with the sensor or equipment, etc.)
“information related to a detection value of the sensor associated with the command” (This is demonstrated at least by the “execution result” section of area “RS” as well as by subsection “IM” of the main viewing area which displays an image captured by a camera.)
“internal variable of the control program related to the command” (This is demonstrated by the parameter setting area which allows monitoring and adjusting of parameters (variables) via the interface.)
Takeuchi does not specifically teach the display being a pop-up display, or displaying information on a control block when that block is executed. However, Schultz, in the same field of endeavor of robotics, teaches:
… when a command of the plurality of commands disposed in the program creation screen is executed by execution of the control program or is selected, information related to an execution state of the executed or selected command (Paragraph 0121, “In one embodiment, when the user executes the application, the displayed icons may be highlighted to visually indicate which of the operations in the application is currently being executed. The connections between icons may also be animated, such as with "propagating bubbles", to visually indicate flow of data or flow of execution. Thus, the plurality of interconnected icons may be animated, such as in the "execution highlighting" feature of LabVIEW, to visually indicate which operations are being performed. In one embodiment, corresponding modifications may be made to the displayed image as the application is executed. The corresponding modifications made to the displayed image may be made as a respective icon corresponding to the operation being performed is highlighted. This provides the user with a visual indication of which operation is being executed, and how this operation is affecting the image. This provides an improved debugging or feedback tool to the user.”) …
However, Jin, in the same field of endeavor of robotics, teaches:
… in a pop-up display. (Paragraph 0017, "As another example, some embodiments include an apparatus (10) for automatic programming, comprising: a user interaction module (201), configured to receive a user's request on creating a first global parameter of a specific type by detecting the user's dragging a control corresponding to the specific type from a first user interface (701) and dropping the control on a second user interface (702); a displaying module (202), configured to display a first popup window (804) for editing the first global parameter on the second user interface (702); a processing module (203), configured to receive the user's editing operations on the first global parameter in the first popup window (804); and create the first global parameter according to the user's editing operations on the first global parameter; and the displaying module (202), further configured to display a new added first card (802) of the first global parameter in a list (802′) of global parameters on a third user interface (703).")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the workflow generation and robotic system as taught by Takeuchi with the functionality taught by Schultz to highlight current operations during execution of the program and with the ability to use a pop-up display as taught by Jin. This would enable a user to continue their work without disruption while accessing focused information within the pop-up window and would further allow the user to efficiently follow the operation program and debug the program as needed.
Regarding claim 15, Takeuchi teaches:
15. (currently amended) A method of creating a control program of a robot, comprising:
generating a program creation screen for performing program creation by a plurality of commands each representing a function constituting a control program of the robot; (Paragraph 0005, "The robot control device includes a display control unit that displays an input screen including an operation flow creation area for creating an operation flow of work on a display device; a conversion unit that converts the created operation flow into a control program; and a control execution unit that executes the control program to control the robot. The input screen is configured to display a plurality of operation objects indicating a plurality of operations including an operation using force control, and one or more conditional branch objects indicating a conditional branch, as options. The operation flow creation area is configured to create an operation flow including the conditional branch by graphically placing an operation object selected from the plurality of operation objects and the conditional branch object." As well as Paragraph 0082, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”)
causing a display to display, (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figures 6E and 7 which demonstrate that different operation objects may be selected to add to the program and information on these objects is displayed during selection.) … and
executing the control program in order to control the robot, (Paragraph 0005, “According to a first aspect of the invention, a robot control device that creates a control program for work of a robot with a force detector is provided. The robot control device includes a display control unit that displays an input screen including an operation flow creation area for creating an operation flow of work on a display device; a conversion unit that converts the created operation flow into a control program; and a control execution unit that executes the control program to control the robot. The input screen is configured to display a plurality of operation objects indicating a plurality of operations including an operation using force control, and one or more conditional branch objects indicating a conditional branch, as options. The operation flow creation area is configured to create an operation flow including the conditional branch by graphically placing an operation object selected from the plurality of operation objects and the conditional branch object.”)
wherein the information related to the execution state includes at least one of a state of a sensor or equipment associated with the command, (Paragraph 0160, "FIG. 6E shows an example of a window W1a for creating an operation flow of second type work using the camera 30. The window W1a is similar to the window W1 for the first type work shown in FIG. 6D. However, in the main view area MV, the window W1a is different from the window W1 in that a camera image display area IM for displaying an image captured with the camera 30 is provided in the main view area MV. In the camera image display area IM, it is preferable to be able to designate an image processing area to be subjected to an image processing such as product checking among the images captured with the camera 30. In this way, the load of the image processing can be reduced so that the operation of the robot can be performed at high speed. The content of areas FL, SQ, PR, RS, and RN other than the window W1a for the second type work in the main view area MV are substantially the same as the content of these areas in the window W1 for the first type work. However, content different from that of the window W1 for the first type work may be displayed. The window W1 for the first type work corresponds to a “first type input screen”, and the window W1a for the second type work corresponds to a “second type input screen”." Please see areas "PR" and "RS" in Figure 6E.) information related to a detection value of the sensor associated with the command, (Paragraph 0148, "In the main view area MV, the temporal change of the force Fx in the X axis direction and the torque Tx around the X axis are displayed among the plurality of force measured by the force detector 130 at the time of executing the operation flow. In the main view area MV, it is possible to select and display any one or more temporal changes of force of the plurality of forces measured by the force detector 130. It is also possible to display the temporal change of the measured position of the TCP and the temporal change of the difference between the target position and the measured position of the TCP on the main view area MV. The period of displaying the result in the main view area MV can be an operation period of any one of the operation objects in the operation flow, or can be the entire period from the start to stop of the execution. For example, when any operation object is selected in the operation flow creation area FL, the execution result of the operation period of the operation object is displayed. When the sequence block SB1 is selected, the result of the entire period from the start to stop of the execution is displayed. The period of displaying result in the main view area MV may be an operation period over a plurality of continuing operation objects. The information of some execution results of the control program is also displayed in the result area RS. For example, for any operation object, it is possible to display the end state of the operation (success or failure), time required for the operation, force at the end of the operation, the position at the end of the operation, and the like in the result area RS. Other types of results than the one shown in FIG. 11A may be displayed in the main view area MV. For example, information related to the robot, such as the speed of the robot and the angle of each joint may be displayed.") or an internal variable of the control program related to the command. 
(Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figures 6E and 7 which demonstrate that different operation objects may be selected to add to the program and information on these objects is displayed during selection.)
Examiner Note: The currently provided claim language only requires the information to include one of:
“a state of a sensor or equipment associated with the command” (This is demonstrated in at least Figure 6E in areas “PR” and “RS” which show different sensors and the execution results as well as parameters which may be defined. The term “state” may be understood under the broadest reasonable interpretation to include presence of a sensor or equipment, inclusion in the control program being executed, successful data gathering resulting in execution results, successful performance of an operation including the sensor or equipment, any parameters associated with the sensor or equipment, etc.)
“information related to a detection value of the sensor associated with the command” (This is demonstrated at least by the “execution result” section of area “RS” as well as by subsection “IM” of the main viewing area which displays an image captured by a camera.)
“internal variable of the control program related to the command” (This is demonstrated by the parameter setting area which allows monitoring and adjusting of parameters (variables) via the interface.)
Takeuchi does not specifically teach displaying information on a control block when that block is executed. However, Schultz, in the same field of endeavor of robotics, teaches:
… when a command of the plurality of commands disposed in the program creation screen is executed by execution of the control program or is selected, information related to an execution state of the executed or selected command; (Paragraph 0121, “In one embodiment, when the user executes the application, the displayed icons may be highlighted to visually indicate which of the operations in the application is currently being executed. The connections between icons may also be animated, such as with "propagating bubbles", to visually indicate flow of data or flow of execution. Thus, the plurality of interconnected icons may be animated, such as in the "execution highlighting" feature of LabVIEW, to visually indicate which operations are being performed. In one embodiment, corresponding modifications may be made to the displayed image as the application is executed. The corresponding modifications made to the displayed image may be made as a respective icon corresponding to the operation being performed is highlighted. This provides the user with a visual indication of which operation is being executed, and how this operation is affecting the image. This provides an improved debugging or feedback tool to the user.”) …
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the workflow generation and robotic system as taught by Takeuchi with the functionality taught by Schultz to highlight current operations during execution of the program. This would enable a user to continue their work without disruption, to efficiently follow the operation program, and to debug the program as needed.
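Examiner Note (illustration only): As a hypothetical sketch of the execution-highlighting concept relied upon from Schultz (this code is not taken from Schultz or Takeuchi; all names are illustrative assumptions), the command currently being executed in a graphically created program could be visually marked while the program runs:

import time

# Hypothetical command sequence as it might appear in an operation flow.
commands = ["approach", "contact", "insert", "retract"]


def render(commands, current_index):
    # Print the flow, highlighting the command that is currently executing.
    marks = [
        f"[{name.upper()}]" if i == current_index else name
        for i, name in enumerate(commands)
    ]
    print(" -> ".join(marks))


# Simulated execution: each command is highlighted while it "runs".
for i, name in enumerate(commands):
    render(commands, i)
    time.sleep(0.1)  # placeholder for the actual robot operation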
Regarding claim 16, where all the limitations of claim 1 are discussed above, Takeuchi further teaches:
16. (previously presented) The teaching device according to claim 1, wherein the execution state includes information related to a state, during execution of the executed command, of at least one of a force sensor, a visual sensor, an arc welding, a current or voltage sensor, and an I/O signal. (Paragraph 0148, "In the main view area MV, the temporal change of the force Fx in the X axis direction and the torque Tx around the X axis are displayed among the plurality of force measured by the force detector 130 at the time of executing the operation flow. In the main view area MV, it is possible to select and display any one or more temporal changes of force of the plurality of forces measured by the force detector 130. It is also possible to display the temporal change of the measured position of the TCP and the temporal change of the difference between the target position and the measured position of the TCP on the main view area MV. The period of displaying the result in the main view area MV can be an operation period of any one of the operation objects in the operation flow, or can be the entire period from the start to stop of the execution. For example, when any operation object is selected in the operation flow creation area FL, the execution result of the operation period of the operation object is displayed. When the sequence block SB1 is selected, the result of the entire period from the start to stop of the execution is displayed. The period of displaying result in the main view area MV may be an operation period over a plurality of continuing operation objects. The information of some execution results of the control program is also displayed in the result area RS. For example, for any operation object, it is possible to display the end state of the operation (success or failure), time required for the operation, force at the end of the operation, the position at the end of the operation, and the like in the result area RS. Other types of results than the one shown in FIG. 11A may be displayed in the main view area MV. For example, information related to the robot, such as the speed of the robot and the angle of each joint may be displayed.")
Regarding claim 17, where all the limitations of claim 1 are discussed above, Takeuchi further teaches:
17. (New) The teaching device according to claim 1, wherein the program creation screen includes a first region that displays a list of the plurality of commands, (Paragraph 0063-0065, "(1) Main view area MV is an area for displaying options of operation objects and conditional branch objects to be described later, execution results of a control program, and the like.
(2) Operation flow creation area FL is an area for displaying the operation flows in which a plurality of objects are graphically placed in an editable manner. The work represented by the operation flow is also called “sequence”.
(3) Sequence display area SQ is an area for displaying a tree structure of the sequence." As well as Paragraph 0082, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”) and
a second region for creating the control program by disposing one or more commands selected from the first region, (Paragraph 0064, "Operation flow creation area FL is an area for displaying the operation flows in which a plurality of objects are graphically placed in an editable manner. The work represented by the operation flow is also called “sequence”.")
wherein the processor (Paragraph 0057, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”) is further configured to cause the display to display the information … the program creation screen (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figures 6E and 7 which demonstrate that different operation objects may be selected to add to the program and information on these objects is displayed during selection.) such that … window does not overlap the first region and the second region. (Please see at least Figures 6A and 11A which demonstrate that the plurality of regions are displayed as separate windows none of which overlap.)
While Takeuchi does teach displaying an execution information area, Takeuchi does not explicitly state that this is the execution state information triggered by selection or execution, or that it is displayed in a pop-up window. However, Schultz, in the same field of endeavor of robotics, teaches:
… related to the execution state (Paragraph 0121, “In one embodiment, when the user executes the application, the displayed icons may be highlighted to visually indicate which of the operations in the application is currently being executed. The connections between icons may also be animated, such as with "propagating bubbles", to visually indicate flow of data or flow of execution. Thus, the plurality of interconnected icons may be animated, such as in the "execution highlighting" feature of LabVIEW, to visually indicate which operations are being performed. In one embodiment, corresponding modifications may be made to the displayed image as the application is executed. The corresponding modifications made to the displayed image may be made as a respective icon corresponding to the operation being performed is highlighted. This provides the user with a visual indication of which operation is being executed, and how this operation is affecting the image. This provides an improved debugging or feedback tool to the user.”) …
However, Jin, in the same field of endeavor of robotics, teaches:
… in a pop-up window that is displayed as a separate window from … the pop-up … (Paragraph 0017, "As another example, some embodiments include an apparatus (10) for automatic programming, comprising: a user interaction module (201), configured to receive a user's request on creating a first global parameter of a specific type by detecting the user's dragging a control corresponding to the specific type from a first user interface (701) and dropping the control on a second user interface (702); a displaying module (202), configured to display a first popup window (804) for editing the first global parameter on the second user interface (702); a processing module (203), configured to receive the user's editing operations on the first global parameter in the first popup window (804); and create the first global parameter according to the user's editing operations on the first global parameter; and the displaying module (202), further configured to display a new added first card (802) of the first global parameter in a list (802′) of global parameters on a third user interface (703).")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the workflow generation and robotic system as taught by Takeuchi with the functionality taught by Schultz to highlight current operations during execution of the program, and with the ability to use a pop-up display as taught by Jin. It would have been obvious to incorporate the “pop-up” style window, which may be triggered by a selection or execution, into the interface as taught by Takeuchi, where all windows are displayed in a non-overlapping manner. This would allow new windows to be triggered by different actions while maintaining the visibility of the previous windows with which a user may still be interacting. This would enable a user to continue their work without disruption while accessing focused information within the pop-up window, and would further allow a user to efficiently follow the operation program and debug the program as needed.
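Examiner Note (illustration only): The following hypothetical sketch, which is not disclosed by any cited reference and which uses assumed names and coordinates, illustrates how a pop-up window triggered by selecting or executing a command could be placed so that it does not overlap a first region (the command list) or a second region (the program creation area):

from dataclasses import dataclass


@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def overlaps(self, other: "Rect") -> bool:
        # Axis-aligned rectangle overlap test.
        return not (
            self.x + self.w <= other.x or other.x + other.w <= self.x
            or self.y + self.h <= other.y or other.y + other.h <= self.y
        )


# Assumed screen layout.
first_region = Rect(0, 0, 300, 800)     # list of commands
second_region = Rect(300, 0, 700, 800)  # program creation area


def place_popup(width, height, candidates):
    # Return the first candidate position at which the pop-up overlaps neither region.
    for x, y in candidates:
        popup = Rect(x, y, width, height)
        if not popup.overlaps(first_region) and not popup.overlaps(second_region):
            return popup
    raise RuntimeError("no non-overlapping position available")


# Triggered, for example, when a command is selected or executed.
popup = place_popup(300, 200, candidates=[(1000, 0), (0, 820), (1000, 600)])
print("pop-up placed at:", popup)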
Claim(s) 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Takeuchi in view of Schultz and Jin and in further view of Inaba et al. (US 20210060772 A1), hereinafter Inaba.
Regarding claim 11, where all the limitations of claim 1 are discussed above, Takeuchi further teaches:
11. (previously presented) The teaching device according to claim 1, further comprising wherein the processor (Paragraph 0057, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”) is further configured to … when the information related to the command is displayed in response to an execution instruction on the command. (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figure 11A, which shows the display and the information shown as a result of the generated program being executed.)
Takeuchi does not specifically teach defining the display content to be displayed. However, Inaba, in the same field of endeavor of robotic programming, teaches:
… set, for a command among the plurality of commands, display attribute information that defines a display content to be displayed as the information related to the command, and determine a display content according to the display attribute information set for the command … (Paragraph 0099, "In the robot programming device 100 described in (1) or (2), the display control unit 102 may accept an instruction to display setting content relating to any operation unit block, and display setting content relating to the operation unit block selected by the instruction in the advanced setting region 320.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the robotic programming interface as taught by Takeuchi with the ability to instruct the system to display different content as taught by Inaba. This would allow the display to be personalized for either a specific user or a specific operation, providing more relevant data and allowing for a more efficient workflow.
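Examiner Note (illustration only): As a hypothetical sketch of per-command display attributes (not taken from Inaba or Takeuchi; all names are illustrative assumptions), a teaching device could store, for each command, which items of execution information are to be shown, and filter the displayed content accordingly:

# Hypothetical display-attribute table: for each command, the items that should
# be shown when that command is selected or executed.
display_attributes = {
    "move_to_pose": ["position", "speed"],
    "insert_part": ["force", "result"],
}

# Hypothetical execution-state data gathered while the program runs.
execution_state = {
    "move_to_pose": {"position": (0.42, 0.10, 0.30), "speed": 250, "force": 0.0},
    "insert_part": {"force": 12.5, "result": "success", "position": (0.42, 0.10, 0.05)},
}


def display_content(command):
    # Return only the items selected by the command's display attributes.
    wanted = display_attributes.get(command, [])
    state = execution_state.get(command, {})
    return {key: state[key] for key in wanted if key in state}


# Usage example: content shown when "insert_part" is executed or selected.
print(display_content("insert_part"))  # {'force': 12.5, 'result': 'success'}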
Claim(s) 12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Takeuchi in view of Schultz, Jin and Inaba and in further view of Ogawa et al. (WO 2017149667 A1), hereinafter Ogawa.
Regarding claim 12, where all the limitations of claim 11 are discussed above, Takeuchi further teaches:
12. (previously presented) The teaching device according to claim 11, wherein the processor (Paragraph 0057, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”) is further configured to …
determine a display content, … when the information related to the command is displayed in response to an execution instruction on the command. (Paragraph 0066, "Parameter setting area PR is an area for setting work parameters related to the entire work or operation parameters related to individual operations." and Paragraph 0081, " In FIG. 6D, when one of the blocks SB1 and OB1 to OB4 placed in the operation flow creation area FL is selected, the parameters corresponding to the selected block are displayed in the parameter setting area PR. For example, when the sequence block SB1 is selected, the work parameters related to the entire sequence are displayed. When one the objects OB1 to OB4 of the object is selected, the parameters related to the objects are displayed. In the example of FIG. 6D, the parameters related to the conditional branch block OB2 are displayed. These parameters are changed as necessary." Please also see Figure 11A, which shows the display and the information shown as a result of the generated program being executed.)
Takeuchi does not specifically teach displaying the progress status of the system on the display. However, Ogawa, in the same field of endeavor of interfaces for robotic systems, teaches:
… set the execution state attribute being information related to an execution state of the command, and … based on the execution state attribute … (Page 4, Paragraphs 2-4, "In this machine tool management apparatus, as shown in FIGS. 4, 5, and 6, a status bar 96 including a plurality of status display icons 90, 92, and 94 is displayed on any of the processing screens. The status bar 96 is displayed in a state extending in the left-right direction at the upper end on the processing screen, and a plurality of status display icons 90, 92, and 94 are arranged side by side. The plurality of status display icons 90, 92, and 94 indicate the state of the lathe body 40 that is a machine tool for a certain matter, and are roughly classified into three types.
First, the first type of status display icon 90 indicates the status of the lathe body 40 for matters that can be changed by the operator, and corresponds to the four status display icons 90a, 90b, 90c, and 90d from the left. . These four status display icons 90a, 90b, 90c, and 90d respectively indicate the part number of the selected workpiece, the part that indicates the currently executed program number, and the turret that is indexed for the selected tool. This indicates the number and the control mode of the NC control currently selected.
In this machine tool management apparatus, when each of these status display icons 90a, 90b, 90c, and 90d is touch-operated by an operator, a dialog box (a kind of window) for changing the items indicated by the respective icons. Is displayed on the currently displayed processing screen. FIG. 7 is a display screen of the display 30 when the state display icon 90d is touch-operated by the operator on the processing screen 50 shown in FIG. The status display icon 90d is an NC mode icon indicating the currently selected NC machining control mode. In this machine tool management apparatus, there are seven control modes for NC machining control for machining a workpiece, and the operating state of the lathe body 40 for machining the workpiece is selected. The operating state is set in accordance with a control mode selected alternatively from the seven control modes. As shown in FIG. 7, a dialog box 100 is displayed on the processing screen 50 to allow the operator to select an NC machining control mode from seven modes. In the dialog box 100, as shown in FIG. 7, seven icons 102 corresponding to seven control modes are arranged, and when one of these seven icons 102 is selected by the operator, the lathe body The 40 operating states are set as operating states corresponding to the control mode corresponding to the selected icon 102. When any of the seven icons 102 is selected, the NC mode icon 90d displayed on the processing screen is changed to an illustration icon representing the selected control mode.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the robotic system interface as taught by Takeuchi with the ability to display the progress state of the system as taught by Ogawa. This would allow the user to closely monitor an operation so that they may manage the system status and interact with it as necessary.
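Examiner Note (illustration only): The following hypothetical sketch (not disclosed by Ogawa or any other cited reference; all names and modes are illustrative assumptions) shows how status items reflecting an execution state could open a dialog of selectable options when an item is selected, in the manner relied upon above:

# Hypothetical status-bar items, each carrying a current state and, for
# operator-changeable items, a set of selectable options.
status_items = {
    "program_number": {"state": "O1234", "options": None},
    "control_mode": {"state": "single-block", "options": ["auto", "single-block", "manual"]},
}


def open_dialog(item_name, selection):
    # Simulate selecting a status icon: show its options and apply a selection.
    item = status_items[item_name]
    if not item["options"]:
        print(f"{item_name}: {item['state']} (display only)")
        return
    print(f"dialog for {item_name}: options = {item['options']}")
    if selection in item["options"]:
        item["state"] = selection
        print(f"{item_name} changed to: {selection}")


# Usage example: the operator selects the control-mode icon and picks a new mode.
open_dialog("control_mode", "auto")
print(status_items["control_mode"]["state"])  # auto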
Conclusion
The Examiner has cited particular paragraphs or columns and line numbers in the references applied to the claims above for the convenience of the Applicant. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested of the Applicant, in preparing responses, to fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner. See MPEP 2141.02 [R-07.2015] VI. A prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP § 2123.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
/H.J.K./Examiner, Art Unit 3657
/ADAM R MOTT/Supervisory Patent Examiner, Art Unit 3657