Prosecution Insights
Last updated: April 19, 2026
Application No. 18/873,331

ROBOT SYSTEM AND ROBOT CONTROL DEVICE

Status: Non-Final OA (§101, §103)
Filed: Dec 10, 2024
Examiner: STIEBRITZ, NOAH WILLIAM
Art Unit: 3658
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Fanuc Corporation
OA Round: 1 (Non-Final)

Grant Probability: 67% (Favorable)
Predicted OA Rounds: 1-2
Predicted Time to Grant: 2y 6m
Grant Probability with Interview: 51%

Examiner Intelligence

Career Allow Rate: 67% (12 granted / 18 resolved), +14.7% vs TC avg (above average)
Interview Lift: -15.6% (minimal; based on resolved cases with interview)
Typical Timeline: 2y 6m avg prosecution
Career History: 62 total applications across all art units, 44 currently pending
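The headline figures above are simple ratios over the counts shown on this page. A short sketch (Python chosen purely for illustration; the 51% with-interview allow rate is read off the dashboard, not computed) reproduces them:

```python
# Reconstruct the headline examiner stats from the raw counts above.
# The with-interview rate (51%) is taken from the dashboard header.

def pct(numerator: float, denominator: float) -> float:
    """Share of cases, as a percentage rounded to one decimal."""
    return round(100.0 * numerator / denominator, 1)

career_allow_rate = pct(12, 18)   # 12 granted out of 18 resolved cases
with_interview = 51.0             # allow rate with interview, per dashboard
interview_lift = round(with_interview - career_allow_rate, 1)

print(f"Career allow rate: {career_allow_rate}%")  # ~67% as displayed
print(f"Interview lift:    {interview_lift:+}%")   # negative: fewer grants with interview
```

The small gap between the -15.7% computed here and the -15.6% shown above suggests the dashboard subtracts unrounded rates.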

Statute-Specific Performance

§101: 18.6% (-21.4% vs TC avg)
§103: 61.7% (+21.7% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 8.0% (-32.0% vs TC avg)

TC avg = Tech Center average estimate. Based on career data from 18 resolved cases.
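The listed deltas are all consistent with a single Tech Center baseline of about 40% per statute (e.g. 18.6 - 40.0 = -21.4). A minimal check, with the 40% figure inferred from the listed numbers rather than stated anywhere on the page:

```python
# Per-statute examiner rates (%) from the section above; TC_AVG is the
# Tech Center average implied by the listed deltas (all four give ~40%).
TC_AVG = 40.0

examiner_rate = {"101": 18.6, "103": 61.7, "102": 11.1, "112": 8.0}

delta_vs_tc = {s: round(r - TC_AVG, 1) for s, r in examiner_rate.items()}
for statute, delta in sorted(delta_vs_tc.items()):
    print(f"Section {statute}: {delta:+.1f}% vs TC avg")
```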

Office Action

Rejections: §101, §103
DETAILED ACTION

This is a non-final Office Action on the merits in response to communications filed by Applicant on December 10th, 2024. Claims 1-16 are currently pending and examined below.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

The amendments to the Claims, filed on December 10th, 2024, have been entered. Claims 4-5 and 12-13 are currently amended and pending, and claims 1-3, 6-11, and 14-16 are original, unamended, and pending. The amendments to the Abstract, filed on December 10th, 2024, have been entered. The amendments to the Specifications, filed on December 10th, 2024, have been entered.

Information Disclosure Statement

The Information Disclosure Statement(s) filed on 12/10/2024 is/are being considered by the examiner.

Drawings

The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference character(s) not mentioned in the description: reference numbers 511 and 512 in Figure 2 and reference number S13 in Figure 9. Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference character(s) in the description in compliance with 37 CFR 1.121(b), are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
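The quoted three-prong test is effectively a small decision procedure with rebuttable presumptions. As a rough illustration only (the prong answers are legal judgments a practitioner supplies, and the function and parameter names here are invented, not anything from the MPEP), it can be sketched as:

```python
# Toy model of the MPEP § 2181 three-prong test quoted above.
# The boolean inputs are legal judgment calls supplied by the reader;
# nothing here parses actual claim language.

def interpreted_under_112f(uses_means_or_nonce_term: bool,
                           has_functional_language: bool,
                           recites_sufficient_structure: bool) -> bool:
    """All three prongs must be met for § 112(f) treatment."""
    return (uses_means_or_nonce_term
            and has_functional_language
            and not recites_sufficient_structure)

# "determination unit configured to determine ..." (claim 1): nonce term,
# functional language, no recited structure -> treated under § 112(f).
print(interpreted_under_112f(True, True, False))   # True

# Reciting sufficient structure rebuts the presumption.
print(interpreted_under_112f(True, True, True))    # False
```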
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:

Claim 1 – force detection unit, determination unit, force control setting unit
Claim 5 – area setting unit
Claim 7 – area monitoring unit
Claim 9 – force detection unit, determination unit, force control setting unit
Claim 13 – area setting unit
Claim 15 – area monitoring unit

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim(s) 1-16 is/are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

STEP 1: Do the claims fall within one of the statutory categories? Yes, claim(s) 1 and 9 are directed towards a system and controller respectively.

STEP 2A (PRONG 1): Is the claim directed to a law of nature, a natural phenomenon, or an abstract idea? Yes, the claims are directed to an abstract idea.

The system of claim 1 is directed towards a mental process, and as such, is directed towards an abstract idea. Claim 1 recites the limitation “wherein the robot controller includes: a determination unit configured to determine, based on the robot program, an operation mode of force control using the force detection unit, the force control being executed in the robot program”. MPEP § 2106.04(a)(2)(III) states that mental processes include observations, evaluations, and judgments. The limitation as claimed is clearly a process involving making an evaluation (i.e. determining an operation mode of force control) based on an observation (i.e. the robot program). Determining what force control mode to use based on the robot program is a process that can be entirely performed in the human mind by simply observing the robot program. Therefore, claim 1 is clearly directed towards a mental process, and as such, is directed towards an abstract idea.

The controller of claim 9 is directed towards a mental process, and as such, is directed towards an abstract idea. Claim 9 recites the limitation “a determination unit configured to determine, based on the robot program, an operation mode of force control using a force detection unit, the force control being executed in the robot program”.
MPEP § 2106.04(a)(2)(III) states that mental processes include observations, evaluations, and judgments. The limitation as claimed is clearly a process involving making an evaluation (i.e. determining an operation mode of force control) based on an observation (i.e. the robot program). Determining what force control mode to use based on the robot program is a process that can be entirely performed in the human mind by simply observing the robot program. Therefore, claim 9 is clearly directed towards a mental process, and as such, is directed towards an abstract idea.

STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application? Claim(s) 1 and 9 do not recite any of the exemplary considerations that are indicative of an abstract idea having been integrated into a practical application.

Claim 1 recites the additional limitation “a robot”. A robot is generic linking. The limitation “a robot controller configured to execute a robot program and control the robot” is generic linking. The limitation “a force detection unit configured to detect a force acting on the robot” is generic linking. The limitation “a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit” is recited at a high level of generality and, as such, is considered to be apply it level.

Claim 9 recites the additional limitation “a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit”. A force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit is recited at a high level of generality and, as such, is considered to be apply it level.
Therefore, it is clear the abstract idea consists of generic linking and apply it level, which is not indicative of having been integrated into a practical application.

STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? Claim(s) 1 and 9 do not recite additional elements that amount to significantly more than the judicial exception. Claim(s) 1 and 9 do not recite any specific limitations that are not considered to be generic linking or apply it level. A robot is generic linking. A robot controller configured to execute a robot program and control the robot is generic linking. A force detection unit configured to detect a force acting on the robot is generic linking. A force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit is recited at a high level of generality and, as such, is considered to be apply it level.

In conclusion, claim(s) 1 and 9 are rejected under 35 U.S.C. 101 because they: (a) are directed toward an abstract idea, (b) do not recite additional elements that integrate the judicial exception into a practical application, and (c) do not recite additional elements that amount to significantly more than the judicial exception; it is therefore clear that the claims are directed toward non-statutory subject matter.

Regarding claim 2, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because further comprising a conveyance device is generic linking. Wherein the determination unit determines, based on the robot program, whether the operation mode of the force control is an operation mode of executing force control while tracking an article conveyed on the conveyance device or an operation mode of regular force control executed without tracking an article is a part of the abstract idea of claim 1.
Therefore, claim 2 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 3, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because wherein the determination unit determines whether the operation mode of the force control is an operation mode executing force control while tracking an article or an operation mode of regular force control by determining whether the robot program is associated with setting data relating to tracking operation is a part of the abstract idea of claim 1. Therefore, claim 3 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 4, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because wherein the force control setting unit automatically sets a force control parameter according to the determined operation mode of the force control is merely specifying the data to be manipulated and is considered to be insignificant extra solution activity and is recited at a high level of generality and, as such, is considered to be apply it level. Therefore, claim 4 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 5, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because further comprising a visual sensor is generic linking. Wherein the robot controller further includes an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by the visual sensor is insignificant pre-solution data gathering.
Therefore, claim 5 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 6, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because wherein, when the work area is not set, the area setting unit sets the work area in an image capture range of the visual sensor, based on position information of the visual sensor and the robot is insignificant pre-solution data gathering. Therefore, claim 6 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 7, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because wherein the robot controller further includes an area monitoring unit configured to detect entry of an obstacle into the work area, based on an image captured by the visual sensor is insignificant pre-solution data gathering. The area setting unit performs re-setting of the work area in such a way that an area where the obstacle exists in the work area is excluded from the work area is recited at a high level of generality and, as such, is considered to be apply it level. Therefore, claim 7 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 10, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because wherein the determination unit determines, based on the robot program, whether the operation mode of the force control is an operation mode of executing force control while tracking an article conveyed on the conveyance device or an operation mode of regular force control executed without tracking an article is a part of the abstract idea of claim 1.
Therefore, claim 10 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 11, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because wherein the determination unit determines whether the operation mode of the force control is an operation mode executing force control while tracking an article or an operation mode of regular force control by determining whether the robot program is associated with setting data relating to tracking operation is a part of the abstract idea of claim 1. Therefore, claim 11 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 12, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because wherein the force control setting unit automatically sets a force control parameter according to the determined operation mode of the force control is merely specifying the data to be manipulated and is considered to be insignificant extra solution activity and is recited at a high level of generality and, as such, is considered to be apply it level. Therefore, claim 12 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 13, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because further comprising an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by a visual sensor is insignificant pre-solution data gathering. Therefore, claim 13 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding claim 14, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because wherein, when the work area is not set, the area setting unit sets the work area in an image capture range of the visual sensor, based on position information of the visual sensor and the robot is insignificant pre-solution data gathering. Therefore, claim 14 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 15, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because further comprising an area monitoring unit configured to detect entry of an obstacle into the work area, based on an image captured by the visual sensor is insignificant pre-solution data gathering. Wherein the area setting unit performs re-setting of the work area in such a way that an area where the obstacle exists in the work area is excluded from the work area is recited at a high level of generality and, as such, is considered to be apply it level. Therefore, claim 15 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Therefore, claims 2-7 and 10-15 do not include additional elements that are sufficient to amount to significantly more than the judicial exception, and are therefore rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Claim 8 was not rejected under 35 U.S.C. § 101 because it recites the limitation “wherein the robot controller further includes an area monitoring unit configured to perform control in such a way as to discontinue the force control when the robot is out of the work area”.
This limitation clearly recites an active control step of the robot using the information generated using the abstract idea, and is therefore indicative of integration into a practical application.

Claim 16 was not rejected under 35 U.S.C. § 101 because it recites the limitation “further comprising an area monitoring unit configured to perform control in such a way as to discontinue the force control when the robot is out of the work area”. This limitation clearly recites an active control step of the robot using the information generated using the abstract idea, and is therefore indicative of integration into a practical application.

The 35 U.S.C. § 101 rejection of the independent claims 1 and 9 can be overcome by amending the claims to recite an active control step of the robot using the information generated from the abstract idea. Active control steps of the same or similar phrasing as those recited in claims 8 and 16 would overcome the 35 U.S.C. § 101 rejection of the independent claims 1 and 9.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-3 and 9-11 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 11167417 B2 ("Takeuchi") in view of US 11230010 B2 ("Ueda").
Regarding claim 1, Takeuchi teaches a robot system comprising (Takeuchi: Figure 1, Abstract, “A robot control device that creates a control program for work of a robot with a force detector, the device includes a processor, wherein the processor is configured to: display an input screen including an operation flow creation area for creating an operation flow of work on a display device; convert the created operation flow into a control program; and execute the control program to control the robot, wherein the input screen is configured to display a plurality of operation objects indicating a plurality of operations including an operation using force control, and one or more conditional branch objects indicating a conditional branch, as options, and wherein the operation flow creation area is configured to create an operation flow including the conditional branch by graphically placing an operation object selected from the plurality of operation objects and the conditional branch object.”, Column 4 lines 10-14, “FIG. 1 is a perspective view of a robot system in a first embodiment. The robot system includes a camera 30, a transport device 50, a robot 100, and a robot control device 200. The robot 100 and the robot control device 200 are communicably connected via a cable or radio.”):

a robot (Takeuchi: Figure 1 robot 100, Column 4 lines 10-14, “FIG. 1 is a perspective view of a robot system in a first embodiment. The robot system includes a camera 30, a transport device 50, a robot 100, and a robot control device 200. The robot 100 and the robot control device 200 are communicably connected via a cable or radio.”, Column 4 lines 15-29, “The robot 100 is a single arm robot that is used by attaching various end effectors on an arm flange 120 at a tip end of an arm 110. The arm 110 has six joints J1 to J6. The joints J2, J3, and J5 are bending joints and the joints J1, J4, and J6 are twisting joints.
Various end effectors for performing work such as gripping and processing on an object (workpiece) are installed on the arm flange 120 at the tip end of the joint J6. A point in a vicinity of the tip end of the arm 110 can be set as a tool center point (TCP). The TCP is a position used as a reference of the positions of the end effectors, and can be set at any position. For example, a predetermined position on a rotation axis of the joint J6 can be set as the TCP. In the present embodiment, a six-axis robot is used, but a robot having another joint mechanism may be used.”);

a robot controller configured to execute a robot program and control the robot (Takeuchi: Figure 1 robot control device 200, Column 4 lines 10-14, “FIG. 1 is a perspective view of a robot system in a first embodiment. The robot system includes a camera 30, a transport device 50, a robot 100, and a robot control device 200. The robot 100 and the robot control device 200 are communicably connected via a cable or radio.”, Column 5 lines 15-19, “The robot control device 200 controls the arm 110, the end effector 140, the transport device 50, and the camera 30. The functions of the robot control device 200 are realized, for example, by a computer with a processor and a memory executing a computer program.”);

and a force detection unit configured to detect a force acting on the robot (Takeuchi: Figure 1 force detector 130, Column 4 lines 30-49, “The robot 100 can set the end effectors at any positions in any orientations within a movable range of the arm 110. A force detector 130 and an end effector 140 are installed on the arm flange 120. In the present embodiment, the end effector 140 is a gripper, but any other type of end effector can be used. The force detector 130 is a six-axis sensor that measures three-axis force acting on the end effector 140 and torque acting around the three axes.
The force detector 130 measures magnitude of force parallel to three measurement axes orthogonal to each other in a sensor coordinate system which is a unique coordinate system, and the magnitude of torque around the three measurement axes. A force sensor as a force detector may be provided at any one or more joints J1 to J5 other than the joint J6. The force detector may only measure the force and torque in a direction of control, and a unit for directly measuring the force and torque like the force detector 130 or a unit for measuring the torque of the joint of the robot to obtain the force and the torque indirectly may be used. The force detector may measure the force and torque only in the direction of controlling force.”),

wherein the robot controller includes: a determination unit configured to determine, based on the robot program, an operation mode of force control using the force detection unit, the force control being executed in the robot program (Takeuchi: Figure 10, Column 7 lines 19-32, “FIG. 6D shows a state in which the teacher created the operation flow in the operation flow creation area FL on the window W1. In this example, blocks of a contacting object OB1, a conditional branch object OB2, a pressing and probing object OB3, and a pressing and moving object OB4 are placed in this order below the sequence block OB1. In the block of each object, the name and icon of the object are displayed. Among the four objects OB1 to OB4, the three objects OB1, OB3, and OB4 are operation objects except for the conditional branch object OB2. The categories of the operation and the operation objects will be described later. In the operation flow, any object displayed in the main view area MV can be arbitrarily added, and any object in the operation flow can be deleted.”, Column 7 lines 44-49, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS.
8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”, Column 8 lines 33-53, “Category 4: Pressing Pressing is an operation of pressing with designated force in the designated direction. The category of the pressing operation includes the following two types of operation objects. (a) Pressing (simple pressing) object is an operation of pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. (b) Pressing and moving object is an operation of moving while pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. As shown in FIG. 8D, in the pressing and moving object, the end effector 140 is moved in the designated direction DD and pressed with designated force, and then, is moved in a direction different from the designated direction while maintaining (that is copying) the pressing with designated force. In the example of FIG. 8D, an operation of inserting the workpiece WKa held by the end effector 140 into the hole Hb of a workpiece WKb is executed by the pressing and moving.”, Column 11 lines 6-20, “When the work parameters are set for the work in which the operation flow is created as shown in FIG. 6D, it is preferable that some of the operation parameters of the operation included in the operation flow are automatically set from the work parameters.”, Column 11 lines 21-40, “FIG. 10 is an explanatory diagram showing an example of a relationship between work parameters and operation parameters. Here, an example of work parameters displayed in the parameter setting area PR on the window W1 (FIG. 6D) is shown.
These work parameters are, for example, displayed in the parameter setting area PR by selecting the sequence block OB1 in the operation flow creation area FL. In FIG. 10, when clicking a relationship display button BT of a specific parameter in the parameter setting area PR, a dialog DL1 showing operation parameters influenced by the work parameters is displayed. In this example, among the work parameters of cylinder fitting work, when the relationship display button BT of the fitting direction is clicked, it is displayed in the dialog DL1 the fitting direction affects the contacting direction of contacting operation, the pressing direction of the pressing and probing operation, the moving direction of the pressing and moving operation, and the force control in the six axes directions. The teacher can check the relationship between the work parameters and the operation parameters of each operation from the dialog DL1.”, Column 11 lines 41-63, “The operation parameter of the force control operation automatically set from the work parameter is not limited to the direction of force control, and other operation parameters may be automatically set according to the work parameter. For example, the moving amount while pressing of the pressing and moving operation (moving amount in the -Z direction at the right end of FIG. 8D) may be automatically set from workpiece information (for example, fitting depth of workpiece) of the work parameter. As an automatic parameter setting mode of the parameter, one mode may be selected from a first mode in which only the work parameter can be changed, a second mode in which only the operation parameter can be changed, and a third mode in which both of the work parameter and the operation parameter can be changed. In this way, a novice can create a work sequence using only the work parameter, and an expert can perform further detailed corrections using the operation parameter. 
In a case where the work parameter and the operation parameter are inconsistent, for example, in a case where the contacting direction is set in the -X direction when the fitting direction is set in the -Z direction in the example of FIG. 10, it is preferable that the operation parameter is edited with the work parameter as correct.”. The cited passages clearly show that the work process of the robot can include multiple different categories of operations and that each operation changes the parameters used to control the robot (See Column 8 line 60 – Column 10 line 23 for a list of parameters for each operation category). Additionally, the cited passages show that the operational parameters are automatically set. The cited passages and Figure 10 show that each operation has different operational parameters that are automatically set. One of ordinary skill in the art would therefore recognize that the system clearly determines the operation being performed in order to properly set the necessary parameters.); and a control setting unit configured to perform setting of the control according to the operation mode of the control determined by the determination unit (Takeuchi: Column 7 lines 44-49, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”, Column 11 lines 6-20, “When the work parameters are set for the work in which the operation flow is created as shown in FIG. 6D, it is preferable that some of the operation parameters of the operation included in the operation flow are automatically set from the work parameters.”, Column 11 lines 21-40, “FIG. 10 is an explanatory diagram showing an example of a relationship between work parameters and operation parameters. 
Here, an example of work parameters displayed in the parameter setting area PR on the window W1 (FIG. 6D) is shown. These work parameters are, for example, displayed in the parameter setting area PR by selecting the sequence block OB1 in the operation flow creation area FL. In FIG. 10, when clicking a relationship display button BT of a specific parameter in the parameter setting area PR, a dialog DL1 showing operation parameters influenced by the work parameters is displayed. In this example, among the work parameters of cylinder fitting work, when the relationship display button BT of the fitting direction is clicked, it is displayed in the dialog DL1 the fitting direction affects the contacting direction of contacting operation, the pressing direction of the pressing and probing operation, the moving direction of the pressing and moving operation, and the force control in the six axes directions. The teacher can check the relationship between the work parameters and the operation parameters of each operation from the dialog DL1.”, Column 11 lines 41-63, “The operation parameter of the force control operation automatically set from the work parameter is not limited to the direction of force control, and other operation parameters may be automatically set according to the work parameter. For example, the moving amount while pressing of the pressing and moving operation (moving amount in the -Z direction at the right end of FIG. 8D) may be automatically set from workpiece information (for example, fitting depth of workpiece) of the work parameter. As an automatic parameter setting mode of the parameter, one mode may be selected from a first mode in which only the work parameter can be changed, a second mode in which only the operation parameter can be changed, and a third mode in which both of the work parameter and the operation parameter can be changed. 
In this way, a novice can create a work sequence using only the work parameter, and an expert can perform further detailed corrections using the operation parameter. In a case where the work parameter and the operation parameter are inconsistent, for example, in a case where the contacting direction is set in the -X direction when the fitting direction is set in the -Z direction in the example of FIG. 10, it is preferable that the operation parameter is edited with the work parameter as correct.”. As can clearly be seen from the cited passages, the operational parameters of the force-controlled operation can be automatically set by the system based on the type of force-controlled operation and the objects involved.). Takeuchi does not teach and a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit. Ueno, in the same field of endeavor, teaches a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit (Ueno: Abstract, “A robot system that performs work of coupling a flexible cable to a connector provided on a board, includes a robot in which a gripping unit that grips the cable and a force detection unit that detects a force acting on the gripping unit are provided, a control unit that controls the robot to perform a conveyance action to grip the cable using the gripping unit and convey the cable onto the board, and an insertion action to insert the cable into the connector by force control based on a detection result in the force detection unit, an insertion speed entry part in which an insertion speed of the cable into the connector at the insertion action is entered, and a determination unit that can determine force control information necessary for the force control in the insertion action according to the insertion speed.”, Column 14 
lines 7-17, “The control apparatus 200 includes a determination unit 203C that can determine force control information necessary for force control at the second stage of the conveyance action according to the conveyance speed V1. The force control information includes e.g. the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force. The virtual coefficient of inertia, the virtual coefficient of viscosity, and the virtual modulus of elasticity are the parameters expressed by the above described equation (A). The target force may be set to e.g. the first threshold value.”, Column 14 lines 37-53, “As described above, the robot system 100 includes the conveyance speed entry part 332 in which the conveyance speed V1 of the cable 93 to the board 91 at the second stage of the conveyance action is entered. Then, the determination unit 203C can determine the force control information at the second stage of the conveyance action according to the conveyance speed V1 based on the table 41 as the calibration curve 4. That is, the unit may determine the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force suitable for the conveyance speed V1 at the second stage. Thereby, for adjustment of the conveyance speed V1 regardless of the degree of skill of a programmer, the determination unit 203C may accurately change and determine the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force according to the adjustment.”. The cited passages clearly teach a determination unit configured to determine force parameters for a force control operation of a robot. 
One of ordinary skill in the art would recognize that this clearly teaches a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit). Takeuchi teaches a robot system comprising: a robot; a robot controller configured to execute a robot program and control the robot; and a force detection unit configured to detect a force acting on the robot, wherein the robot controller includes: a determination unit configured to determine, based on the robot program, an operation mode of force control using the force detection unit, the force control being executed in the robot program; and a control setting unit configured to perform setting of the control according to the operation mode of the control determined by the determination unit. Furthermore, Takeuchi teaches automatically setting the operational parameters for a force control operation based on the operation type and the workpieces, though does not explicitly teach that parameters specifically regarding the force control aspect of the operation are set. Ueno teaches a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit. A person of ordinary skill in the art would have had the technological capabilities required to have modified the system taught in Takeuchi with a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit. As previously stated, Takeuchi teaches automatically setting a variety of parameters of the operation, though does not explicitly teach that parameters specifically regarding the force control aspect of the operation are set. 
As such, a person of ordinary skill in the art would have been able to have modified the system of Takeuchi to automatically set parameters of the force control operation as taught in Ueno according to methods known in the art. Such a combination would not have changed or introduced new functionality. Therefore, it would have been obvious to one of ordinary skill in the art that the combination of Takeuchi in view of Ueno teaches the limitations of claim 1. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the system taught in Takeuchi with a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit taught in Ueno with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because such a method of setting the force control by a determination unit allows the force control parameters to be accurately set regardless of the skill of the user. (Ueno: Column 14 lines 37-53, “Thereby, for adjustment of the conveyance speed V1 regardless of the degree of skill of a programmer, the determination unit 203C may accurately change and determine the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force according to the adjustment.”). Regarding claim 2, Takeuchi in view of Ueno teaches further comprising a conveyance device (Takeuchi: Figure 1 transport device 50, Column 4 lines 10-14, “FIG. 1 is a perspective view of a robot system in a first embodiment. The robot system includes a camera 30, a transport device 50, a robot 100, and a robot control device 200. 
The robot 100 and the robot control device 200 are communicably connected via a cable or radio.”, Column 4 line 66 – Column 5 line 14, “In the present embodiment, a workpiece WK2 is transported by the transport device 50. The transport device 50 includes transport rollers 50a and 50b. The transport device 50 can transport the workpiece WK2 placed on a transport surface by moving the transport surface by rotating these transport rollers 50a and 50b. The camera 30 is installed above the transport device 50. The camera 30 is installed such that the workpiece WK2 on the transport surface is in the visual field. A fitting hole H2 is formed on a surface of the workpiece WK2. The end effector 140 can perform work of fitting a workpiece WK1 gripped by the end effector 140 into the fitting hole H2 of the workpiece WK2. The fitting work may be performed in a state in which the transport surface is stopped, or, may be executed while moving the transport surface. The transport device 50 and the camera 30 can be omitted.”), wherein the determination unit determines, based on the robot program, whether the operation mode of the force control is an operation mode of executing force control while tracking an article conveyed on the conveyance device or an operation mode of regular force control executed without tracking an article (Takeuchi: Column 4 line 66 – Column 5 line 14, “In the present embodiment, a workpiece WK2 is transported by the transport device 50. The transport device 50 includes transport rollers 50a and 50b. The transport device 50 can transport the workpiece WK2 placed on a transport surface by moving the transport surface by rotating these transport rollers 50a and 50b. The camera 30 is installed above the transport device 50. The camera 30 is installed such that the workpiece WK2 on the transport surface is in the visual field. A fitting hole H2 is formed on a surface of the workpiece WK2. 
The end effector 140 can perform work of fitting a workpiece WK1 gripped by the end effector 140 into the fitting hole H2 of the workpiece WK2. The fitting work may be performed in a state in which the transport surface is stopped, or, may be executed while moving the transport surface. The transport device 50 and the camera 30 can be omitted.”, Column 7 lines 44-49, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”, Column 8 lines 33-53, “Category 4: Pressing Pressing is an operation of pressing with designated force in the designated direction. The category of the pressing operation includes the following two types of operation objects. (a) Pressing (simple pressing) object is an operation of pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. (b) Pressing and moving object is an operation of moving while pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. As shown in FIG. 8D, in the pressing and moving object, the end effector 140 is moved in the designated direction DD and pressed with designated force, and then, is moved in a direction different from the designated direction while maintaining (that is copying) the pressing with designated force. In the example of FIG. 
8D, an operation of inserting the workpiece WKa held by the end effector 140 into the hole Hb of a workpiece WKb is executed by the pressing and moving.”, Column 13 lines 14-22, “When the operation flow is completed as described above, the conversion unit 244 converts the operation flow into a control program according to the indication of the teacher in step S130 in FIG. 5. The indication can be performed, for example, by selecting "create control program" from the context menu of the operation flow creation area FL. It is preferable that any one method of the following three types of methods can be selectively performed for the conversion from the operation flow to a control program and execution.”, Column 11 lines 21-40, “FIG. 10 is an explanatory diagram showing an example of a relationship between work parameters and operation parameters. Here, an example of work parameters displayed in the parameter setting area PR on the window W1 (FIG. 6D) is shown. These work parameters are, for example, displayed in the parameter setting area PR by selecting the sequence block OB1 in the operation flow creation area FL. In FIG. 10, when clicking a relationship display button BT of a specific parameter in the parameter setting area PR, a dialog DL1 showing operation parameters influenced by the work parameters is displayed. In this example, among the work parameters of cylinder fitting work, when the relationship display button BT of the fitting direction is clicked, it is displayed in the dialog DL1 the fitting direction affects the contacting direction of contacting operation, the pressing direction of the pressing and probing operation, the moving direction of the pressing and moving operation, and the force control in the six axes directions. 
The teacher can check the relationship between the work parameters and the operation parameters of each operation from the dialog DL1.”, Column 11 lines 41-63, “The operation parameter of the force control operation automatically set from the work parameter is not limited to the direction of force control, and other operation parameters may be automatically set according to the work parameter. For example, the moving amount while pressing of the pressing and moving operation (moving amount in the -Z direction at the right end of FIG. 8D) may be automatically set from workpiece information (for example, fitting depth of workpiece) of the work parameter. As an automatic parameter setting mode of the parameter, one mode may be selected from a first mode in which only the work parameter can be changed, a second mode in which only the operation parameter can be changed, and a third mode in which both of the work parameter and the operation parameter can be changed. In this way, a novice can create a work sequence using only the work parameter, and an expert can perform further detailed corrections using the operation parameter. In a case where the work parameter and the operation parameter are inconsistent, for example, in a case where the contacting direction is set in the -X direction when the fitting direction is set in the -Z direction in the example of FIG. 10, it is preferable that the operation parameter is edited with the work parameter as correct.”. The cited passages clearly show that the work process of the robot can include multiple different categories of operations and that each operation changes the parameters used to control the robot (See Column 8 line 60 – Column 10 line 23 for a list of parameters for each operation category). 
Furthermore, two such operations include a standard pressing operation (performed without the robot moving in a direction other than the direction required to press the held object into the workpiece) and an operation of moving while pressing (performed while moving in a direction other than the direction required to press the held object into the workpiece). One of ordinary skill in the art would recognize that, because the system can be configured to perform the work on the workpiece while it is moving on the conveying device, the moving while pressing operation comprises an operation mode of executing force control while tracking an article conveyed on the conveyance device. Additionally, the cited passages show that the operational parameters are automatically set. The cited passages and Figure 10 show that each operation has different operational parameters that are automatically set. One of ordinary skill in the art would therefore recognize that the system clearly determines the operation being performed in order to properly set the necessary parameters.). Regarding claim 3, Takeuchi in view of Ueno teaches wherein the determination unit determines whether the operation mode of the force control is an operation mode executing force control while tracking an article or an operation mode of regular force control by determining whether the robot program is associated with setting data relating to tracking operation (Takeuchi: Figure 10, Column 7 lines 44-49, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”, Column 8 lines 33-53, “Category 4: Pressing Pressing is an operation of pressing with designated force in the designated direction. 
The category of the pressing operation includes the following two types of operation objects. (a) Pressing (simple pressing) object is an operation of pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. (b) Pressing and moving object is an operation of moving while pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. As shown in FIG. 8D, in the pressing and moving object, the end effector 140 is moved in the designated direction DD and pressed with designated force, and then, is moved in a direction different from the designated direction while maintaining (that is copying) the pressing with designated force. In the example of FIG. 8D, an operation of inserting the workpiece WKa held by the end effector 140 into the hole Hb of a workpiece WKb is executed by the pressing and moving.”, Column 13 lines 14-22, “When the operation flow is completed as described above, the conversion unit 244 converts the operation flow into a control program according to the indication of the teacher in step S130 in FIG. 5. The indication can be performed, for example, by selecting "create control program" from the context menu of the operation flow creation area FL. It is preferable that any one method of the following three types of methods can be selectively performed for the conversion from the operation flow to a control program and execution.”, Column 11 lines 21-40, “FIG. 10 is an explanatory diagram showing an example of a relationship between work parameters and operation parameters. Here, an example of work parameters displayed in the parameter setting area PR on the window W1 (FIG. 6D) is shown. These work parameters are, for example, displayed in the parameter setting area PR by selecting the sequence block OB1 in the operation flow creation area FL. In FIG. 
10, when clicking a relationship display button BT of a specific parameter in the parameter setting area PR, a dialog DL1 showing operation parameters influenced by the work parameters is displayed. In this example, among the work parameters of cylinder fitting work, when the relationship display button BT of the fitting direction is clicked, it is displayed in the dialog DL1 the fitting direction affects the contacting direction of contacting operation, the pressing direction of the pressing and probing operation, the moving direction of the pressing and moving operation, and the force control in the six axes directions. The teacher can check the relationship between the work parameters and the operation parameters of each operation from the dialog DL1.”, Column 11 lines 41-63, “The operation parameter of the force control operation automatically set from the work parameter is not limited to the direction of force control, and other operation parameters may be automatically set according to the work parameter. For example, the moving amount while pressing of the pressing and moving operation (moving amount in the -Z direction at the right end of FIG. 8D) may be automatically set from workpiece information (for example, fitting depth of workpiece) of the work parameter. As an automatic parameter setting mode of the parameter, one mode may be selected from a first mode in which only the work parameter can be changed, a second mode in which only the operation parameter can be changed, and a third mode in which both of the work parameter and the operation parameter can be changed. In this way, a novice can create a work sequence using only the work parameter, and an expert can perform further detailed corrections using the operation parameter. 
In a case where the work parameter and the operation parameter are inconsistent, for example, in a case where the contacting direction is set in the -X direction when the fitting direction is set in the -Z direction in the example of FIG. 10, it is preferable that the operation parameter is edited with the work parameter as correct.”. The cited figure and passages show that each operation has its own operational parameters that are automatically set by the system. One of ordinary skill in the art would recognize that the system is clearly configured to determine if the operation is associated with a setting relating to tracking in order to automatically set said parameter. Such a parameter would include the moving direction for the moving and pressing operation as shown in Figure 10.). Regarding claim 4, Takeuchi in view of Ueno teaches wherein the force control setting unit automatically sets a force control parameter according to the determined operation mode of the force control (Takeuchi: Column 7 lines 44-49, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”, Column 11 lines 6-20, “When the work parameters are set for the work in which the operation flow is created as shown in FIG. 6D, it is preferable that some of the operation parameters of the operation included in the operation flow are automatically set from the work parameters.”, Column 11 lines 21-40, “FIG. 10 is an explanatory diagram showing an example of a relationship between work parameters and operation parameters. Here, an example of work parameters displayed in the parameter setting area PR on the window W1 (FIG. 6D) is shown. 
These work parameters are, for example, displayed in the parameter setting area PR by selecting the sequence block OB1 in the operation flow creation area FL. In FIG. 10, when clicking a relationship display button BT of a specific parameter in the parameter setting area PR, a dialog DL1 showing operation parameters influenced by the work parameters is displayed. In this example, among the work parameters of cylinder fitting work, when the relationship display button BT of the fitting direction is clicked, it is displayed in the dialog DL1 the fitting direction affects the contacting direction of contacting operation, the pressing direction of the pressing and probing operation, the moving direction of the pressing and moving operation, and the force control in the six axes directions. The teacher can check the relationship between the work parameters and the operation parameters of each operation from the dialog DL1.”, Column 11 lines 41-63, “The operation parameter of the force control operation automatically set from the work parameter is not limited to the direction of force control, and other operation parameters may be automatically set according to the work parameter. For example, the moving amount while pressing of the pressing and moving operation (moving amount in the -Z direction at the right end of FIG. 8D) may be automatically set from workpiece information (for example, fitting depth of workpiece) of the work parameter. As an automatic parameter setting mode of the parameter, one mode may be selected from a first mode in which only the work parameter can be changed, a second mode in which only the operation parameter can be changed, and a third mode in which both of the work parameter and the operation parameter can be changed. In this way, a novice can create a work sequence using only the work parameter, and an expert can perform further detailed corrections using the operation parameter. 
In a case where the work parameter and the operation parameter are inconsistent, for example, in a case where the contacting direction is set in the -X direction when the fitting direction is set in the -Z direction in the example of FIG. 10, it is preferable that the operation parameter is edited with the work parameter as correct.”, Ueno: Abstract, “A robot system that performs work of coupling a flexible cable to a connector provided on a board, includes a robot in which a gripping unit that grips the cable and a force detection unit that detects a force acting on the gripping unit are provided, a control unit that controls the robot to perform a conveyance action to grip the cable using the gripping unit and convey the cable onto the board, and an insertion action to insert the cable into the connector by force control based on a detection result in the force detection unit, an insertion speed entry part in which an insertion speed of the cable into the connector at the insertion action is entered, and a determination unit that can determine force control information necessary for the force control in the insertion action according to the insertion speed.”, Column 14 lines 7-17, “The control apparatus 200 includes a determination unit 203C that can determine force control information necessary for force control at the second stage of the conveyance action according to the conveyance speed V1. The force control information includes e.g. the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force. The virtual coefficient of inertia, the virtual coefficient of viscosity, and the virtual modulus of elasticity are the parameters expressed by the above described equation (A). The target force may be set to e.g. 
the first threshold value.”, Column 14 lines 37-53, “As described above, the robot system 100 includes the conveyance speed entry part 332 in which the conveyance speed V1 of the cable 93 to the board 91 at the second stage of the conveyance action is entered. Then, the determination unit 203C can determine the force control information at the second stage of the conveyance action according to the conveyance speed V1 based on the table 41 as the calibration curve 4. That is, the unit may determine the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force suitable for the conveyance speed V1 at the second stage. Thereby, for adjustment of the conveyance speed V1 regardless of the degree of skill of a programmer, the determination unit 203C may accurately change and determine the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force according to the adjustment.”). 
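For readers mapping the claim 4 citations to the cited mechanism: Ueno's determination unit looks up force control information (virtual coefficient of inertia, virtual coefficient of viscosity, virtual modulus of elasticity, and target force) from a calibration table keyed by an entered speed. The following is a minimal illustrative sketch of that kind of table-driven lookup; all names, numeric values, and the nearest-entry selection rule are hypothetical assumptions, not taken from Ueno or Takeuchi.

```python
# Hypothetical sketch of table-driven force control parameter
# determination in the style Ueno describes. Values are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class ForceControlInfo:
    inertia: float       # virtual coefficient of inertia
    viscosity: float     # virtual coefficient of viscosity
    elasticity: float    # virtual modulus of elasticity
    target_force: float  # target force, in newtons

# Stand-in for the calibration table ("table 41 as the calibration
# curve 4" in the reference); keys are conveyance speeds in mm/s.
CALIBRATION_TABLE = {
    10.0: ForceControlInfo(0.5, 8.0, 120.0, 2.0),
    20.0: ForceControlInfo(0.8, 12.0, 150.0, 3.0),
    40.0: ForceControlInfo(1.2, 18.0, 200.0, 5.0),
}

def determine_force_control_info(speed: float) -> ForceControlInfo:
    """Return the calibration entry for the nearest tabulated speed."""
    nearest = min(CALIBRATION_TABLE, key=lambda v: abs(v - speed))
    return CALIBRATION_TABLE[nearest]

info = determine_force_control_info(22.0)
print(info.target_force)  # nearest tabulated speed is 20.0, so 3.0
```

The point of the sketch is only that the parameters follow mechanically from the entered speed, which is the basis for the examiner's "regardless of the degree of skill of a programmer" motivation statement.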
Regarding claim 9, Takeuchi teaches a robot controller for executing a robot program and controlling a robot, the robot controller comprising (Takeuchi: Figure 1, Abstract, “A robot control device that creates a control program for work of a robot with a force detector, the device includes a processor, wherein the processor is configured to: display an input screen including an operation flow creation area for creating an operation flow of work on a display device; convert the created operation flow into a control program; and execute the control program to control the robot, wherein the input screen is configured to display a plurality of operation objects indicating a plurality of operations including an operation using force control, and one or more conditional branch objects indicating a conditional branch, as options, and wherein the operation flow creation area is configured to create an operation flow including the conditional branch by graphically placing an operation object selected from the plurality of operation objects and the conditional branch object.”, Column 4 lines 10-14, “FIG. 1 is a perspective view of a robot system in a first embodiment. The robot system includes a camera 30, a transport device 50, a robot 100, and a robot control device 200. The robot 100 and the robot control device 200 are communicably connected via a cable or radio.”): a determination unit configured to determine, based on the robot program, an operation mode of force control using a force detection unit, the force control being executed in the robot program (Takeuchi: Figure 10, Column 7 lines 19-32, “FIG. 6D shows a state in which the teacher created the operation flow in the operation flow creation area FL on the window W1. In this example, blocks of a contacting object OB1, a conditional branch object OB2, a pressing and probing object OB3, and a pressing and moving object OB4 are placed in this order below the sequence block OB1. 
In the block of each object, the name and icon of the object are displayed. Among the four objects OB1 to OB4, the three objects OB1, OB3, and OB4 are operation objects except for the conditional branch object OB2. The categories of the operation and the operation objects will be described later. In the operation flow, any object displayed in the main view area MV can be arbitrarily added, and any object in the operation flow can be deleted.”, Column 7 lines 44-49, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”, Column 8 lines 33-53, “Category 4: Pressing Pressing is an operation of pressing with designated force in the designated direction. The category of the pressing operation includes the following two types of operation objects. (a) Pressing (simple pressing) object is an operation of pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. (b) Pressing and moving object is an operation of moving while pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. As shown in FIG. 8D, in the pressing and moving object, the end effector 140 is moved in the designated direction DD and pressed with designated force, and then, is moved in a direction different from the designated direction while maintaining (that is copying) the pressing with designated force. In the example of FIG. 
8D, an operation of inserting the workpiece WKa held by the end effector 140 into the hole Hb of a workpiece WKb is executed by the pressing and moving.”, Column 11 lines 6-20, “When the work parameters are set for the work in which the operation flow is created as shown in FIG. 6D, it is preferable that some of the operation parameters of the operation included in the operation flow are automatically set from the work parameters.”, Column 11 lines 21-40, “FIG. 10 is an explanatory diagram showing an example of a relationship between work parameters and operation parameters. Here, an example of work parameters displayed in the parameter setting area PR on the window W1 (FIG. 6D) is shown. These work parameters are, for example, displayed in the parameter setting area PR by selecting the sequence block OB1 in the operation flow creation area FL. In FIG. 10, when clicking a relationship display button BT of a specific parameter in the parameter setting area PR, a dialog DL1 showing operation parameters influenced by the work parameters is displayed. In this example, among the work parameters of cylinder fitting work, when the relationship display button BT of the fitting direction is clicked, it is displayed in the dialog DL1 the fitting direction affects the contacting direction of contacting operation, the pressing direction of the pressing and probing operation, the moving direction of the pressing and moving operation, and the force control in the six axes directions. The teacher can check the relationship between the work parameters and the operation parameters of each operation from the dialog DL1.”, Column 11 lines 41-63, “The operation parameter of the force control operation automatically set from the work parameter is not limited to the direction of force control, and other operation parameters may be automatically set according to the work parameter. 
For example, the moving amount while pressing of the pressing and moving operation (moving amount in the -Z direction at the right end of FIG. 8D) may be automatically set from workpiece information (for example, fitting depth of workpiece) of the work parameter. As an automatic parameter setting mode of the parameter, one mode may be selected from a first mode in which only the work parameter can be changed, a second mode in which only the operation parameter can be changed, and a third mode in which both of the work parameter and the operation parameter can be changed. In this way, a novice can create a work sequence using only the work parameter, and an expert can perform further detailed corrections using the operation parameter. In a case where the work parameter and the operation parameter are inconsistent, for example, in a case where the contacting direction is set in the -X direction when the fitting direction is set in the -Z direction in the example of FIG. 10, it is preferable that the operation parameter is edited with the work parameter as correct.”. The cited passages clearly show that the work process of the robot can include multiple different categories of operations and that each operation changes the parameter used to control the robot (See Column 8 line 60 – Column 10 line 23 for a list of parameters for each operation category). Additionally, the cited passages show that the operational parameters are automatically set. The cited passages and Figure 10 show that each operation has different operational parameters that are automatically set. One of ordinary skill in the art would therefore recognize that the system clearly determines the operation being performed in order to properly set the necessary parameters.); and a control setting unit configured to perform setting of the control according to the operation mode of the control determined by the determination unit (Takeuchi: Column 7 lines 44-49, “FIG. 
7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”, Column 11 lines 6-20, “When the work parameters are set for the work in which the operation flow is created as shown in FIG. 6D, it is preferable that some of the operation parameters of the operation included in the operation flow are automatically set from the work parameters.”, Column 11 lines 21-40, “FIG. 10 is an explanatory diagram showing an example of a relationship between work parameters and operation parameters. Here, an example of work parameters displayed in the parameter setting area PR on the window W1 (FIG. 6D) is shown. These work parameters are, for example, displayed in the parameter setting area PR by selecting the sequence block OB1 in the operation flow creation area FL. In FIG. 10, when clicking a relationship display button BT of a specific parameter in the parameter setting area PR, a dialog DL1 showing operation parameters influenced by the work parameters is displayed. In this example, among the work parameters of cylinder fitting work, when the relationship display button BT of the fitting direction is clicked, it is displayed in the dialog DL1 the fitting direction affects the contacting direction of contacting operation, the pressing direction of the pressing and probing operation, the moving direction of the pressing and moving operation, and the force control in the six axes directions. 
The teacher can check the relationship between the work parameters and the operation parameters of each operation from the dialog DL1.”, Column 11 lines 41-63, “The operation parameter of the force control operation automatically set from the work parameter is not limited to the direction of force control, and other operation parameters may be automatically set according to the work parameter. For example, the moving amount while pressing of the pressing and moving operation (moving amount in the -Z direction at the right end of FIG. 8D) may be automatically set from workpiece information (for example, fitting depth of workpiece) of the work parameter. As an automatic parameter setting mode of the parameter, one mode may be selected from a first mode in which only the work parameter can be changed, a second mode in which only the operation parameter can be changed, and a third mode in which both of the work parameter and the operation parameter can be changed. In this way, a novice can create a work sequence using only the work parameter, and an expert can perform further detailed corrections using the operation parameter. In a case where the work parameter and the operation parameter are inconsistent, for example, in a case where the contacting direction is set in the -X direction when the fitting direction is set in the -Z direction in the example of FIG. 10, it is preferable that the operation parameter is edited with the work parameter as correct.”. As can clearly be seen from the cited passages, the operational parameters of the force controlled operation can be automatically set by the system based on the type of force controlled operation and the objects involved.). Takeuchi does not teach and a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit. 
Ueno, in the same field of endeavor, teaches a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit (Ueno: Abstract, “A robot system that performs work of coupling a flexible cable to a connector provided on a board, includes a robot in which a gripping unit that grips the cable and a force detection unit that detects a force acting on the gripping unit are provided, a control unit that controls the robot to perform a conveyance action to grip the cable using the gripping unit and convey the cable onto the board, and an insertion action to insert the cable into the connector by force control based on a detection result in the force detection unit, an insertion speed entry part in which an insertion speed of the cable into the connector at the insertion action is entered, and a determination unit that can determine force control information necessary for the force control in the insertion action according to the insertion speed.”, Column 14 lines 7-17, “The control apparatus 200 includes a determination unit 203C that can determine force control information necessary for force control at the second stage of the conveyance action according to the conveyance speed V1. The force control information includes e.g. the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force. The virtual coefficient of inertia, the virtual coefficient of viscosity, and the virtual modulus of elasticity are the parameters expressed by the above described equation (A). The target force may be set to e.g. the first threshold value.”, Column 14 lines 37-53, “As described above, the robot system 100 includes the conveyance speed entry part 332 in which the conveyance speed V1 of the cable 93 to the board 91 at the second stage of the conveyance action is entered. 
Then, the determination unit 203C can determine the force control information at the second stage of the conveyance action according to the conveyance speed V1 based on the table 41 as the calibration curve 4. That is, the unit may determine the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force suitable for the conveyance speed V1 at the second stage. Thereby, for adjustment of the conveyance speed V1 regardless of the degree of skill of a programmer, the determination unit 203C may accurately change and determine the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force according to the adjustment.”. The cited passages clearly teach a determination unit configured to determine force parameters for a force control operation of a robot. One of ordinary skill in the art would recognize that this clearly teaches a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit). Takeuchi teaches a robot controller for executing a robot program and controlling a robot, the robot controller comprising: a determination unit configured to determine, based on the robot program, an operation mode of force control using the force detection unit, the force control being executed in the robot program; and a control setting unit configured to perform setting of the control according to the operation mode of the control determined by the determination unit. Furthermore, Takeuchi teaches automatically setting the operational parameters for a force control operation based on the operation type and the workpieces, though does not explicitly teach that parameters specifically regarding the force control aspect of the operation are set. 
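To make the mapped claim-9 architecture concrete, the following sketch shows one way a determination unit could infer a force-control operation mode from a robot program and a setting unit could then apply mode-specific parameters. All names, modes, and parameter values are hypothetical illustrations assumed for this sketch, and are not taken from Takeuchi, Ueno, or the claims themselves.

```python
# Illustrative sketch only (hypothetical names and values): a determination
# unit that picks the force-control operation mode out of a robot program,
# and a setting unit that applies parameters chosen per mode.

# Hypothetical per-mode force-control parameters.
OPERATION_MODE_PARAMS = {
    "contacting": {"target_force": 5.0, "virtual_viscosity": 30.0},
    "pressing_and_moving": {"target_force": 10.0, "virtual_viscosity": 15.0},
}

def determine_operation_mode(robot_program: list[str]) -> str:
    """Determination unit: find the force-control operation named in the program."""
    for instruction in robot_program:
        if instruction in OPERATION_MODE_PARAMS:
            return instruction
    raise ValueError("no force-control operation found in program")

def set_force_control(mode: str) -> dict:
    """Force control setting unit: apply the parameters for the determined mode."""
    return dict(OPERATION_MODE_PARAMS[mode])

# Usage: the mode is read from the program, and the parameters follow from it.
program = ["move_to_start", "pressing_and_moving", "retract"]
params = set_force_control(determine_operation_mode(program))
```

The sketch simply pairs the two claimed units: the mode is determined from the program text, and the force-control parameters are then set according to that determined mode.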
Ueno teaches a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit. A person of ordinary skill in the art would have had the technological capabilities required to have modified the controller taught in Takeuchi with a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit. As previously stated, Takeuchi teaches automatically setting a variety of parameters of the operation, though does not explicitly teach that parameters specifically regarding the force control aspect of the operation are set. As such, a person of ordinary skill in the art would have been able to have modified the controller of Takeuchi to automatically set parameters of the force control operation as taught in Ueno according to methods known in the art. Such a combination would not have changed or introduced new functionality. Therefore, it would have been obvious to one of ordinary skill in the art that the combination of Takeuchi in view of Ueno teaches the limitations of claim 9. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the controller taught in Takeuchi with a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit taught in Ueno with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because such a method of setting the force control by a determination unit allows the force control parameters to be accurately set regardless of the skill of the user. 
(Ueno: Column 14 lines 37-53, “Thereby, for adjustment of the conveyance speed V1 regardless of the degree of skill of a programmer, the determination unit 203C may accurately change and determine the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force according to the adjustment.”). Regarding claim 10, Takeuchi in view of Ueno teaches wherein the determination unit determines, based on the robot program, whether the operation mode of the force control is an operation mode of executing force control while tracking an article conveyed on a conveyance device or an operation mode of regular force control executed without tracking an article (Takeuchi: Figure 1 transport device 50, Column 4 lines 10-14, “FIG. 1 is a perspective view of a robot system in a first embodiment. The robot system includes a camera 30, a transport device 50, a robot 100, and a robot control device 200. The robot 100 and the robot control device 200 are communicably connected via a cable or radio.”, Column 4 line 66 – Column 5 line 14, “In the present embodiment, a workpiece WK2 is transported by the transport device 50. The transport device 50 includes transport rollers 50a and 50b. The transport device 50 can transport the workpiece WK2 placed on a transport surface by moving the transport surface by rotating these transport rollers 50a and 50b. The camera 30 is installed above the transport device 50. The camera 30 is installed such that the workpiece WK2 on the transport surface is in the visual field. A fitting hole H2 is formed on a surface of the workpiece WK2. The end effector 140 can perform work of fitting a workpiece WK1 gripped by the end effector 140 into the fitting hole H2 of the workpiece WK2. The fitting work may be performed in a state in which the transport surface is stopped, or, may be executed while moving the transport surface. 
The transport device 50 and the camera 30 can be omitted.”, Column 7 lines 44-49, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”, Column 8 lines 33-53, “Category 4: Pressing Pressing is an operation of pressing with designated force in the designated direction. The category of the pressing operation includes the following two types of operation objects. (a) Pressing (simple pressing) object is an operation of pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. (b) Pressing and moving object is an operation of moving while pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. As shown in FIG. 
8D, in the pressing and moving object, the end effector 140 is moved in the designated direction DD and pressed with designated force, and then, is moved in a direction different from the designated direction while maintaining (that is copying) the pressing with designated force. In the example of FIG. 8D, an operation of inserting the workpiece WKa held by the end effector 140 into the hole Hb of a workpiece WKb is executed by the pressing and moving.”, Column 13 lines 14-22, “When the operation flow is completed as described above, the conversion unit 244 converts the operation flow into a control program according to the indication of the teacher in step S130 in FIG. 5. The indication can be performed, for example, by selecting "create control program" from the context menu of the operation flow creation area FL. It is preferable that any one method of the following three types of methods can be selectively performed for the conversion from the operation flow to a control program and execution.”, Column 11 lines 21-40, “FIG. 10 is an explanatory diagram showing an example of a relationship between work parameters and operation parameters. Here, an example of work parameters displayed in the parameter setting area PR on the window W1 (FIG. 6D) is shown. These work parameters are, for example, displayed in the parameter setting area PR by selecting the sequence block OB1 in the operation flow creation area FL. In FIG. 10, when clicking a relationship display button BT of a specific parameter in the parameter setting area PR, a dialog DL1 showing operation parameters influenced by the work parameters is displayed. 
In this example, among the work parameters of cylinder fitting work, when the relationship display button BT of the fitting direction is clicked, it is displayed in the dialog DL1 the fitting direction affects the contacting direction of contacting operation, the pressing direction of the pressing and probing operation, the moving direction of the pressing and moving operation, and the force control in the six axes directions. The teacher can check the relationship between the work parameters and the operation parameters of each operation from the dialog DL1.”, Column 11 lines 41-63, “The operation parameter of the force control operation automatically set from the work parameter is not limited to the direction of force control, and other operation parameters may be automatically set according to the work parameter. For example, the moving amount while pressing of the pressing and moving operation (moving amount in the -Z direction at the right end of FIG. 8D) may be automatically set from workpiece information (for example, fitting depth of workpiece) of the work parameter. As an automatic parameter setting mode of the parameter, one mode may be selected from a first mode in which only the work parameter can be changed, a second mode in which only the operation parameter can be changed, and a third mode in which both of the work parameter and the operation parameter can be changed. In this way, a novice can create a work sequence using only the work parameter, and an expert can perform further detailed corrections using the operation parameter. In a case where the work parameter and the operation parameter are inconsistent, for example, in a case where the contacting direction is set in the -X direction when the fitting direction is set in the -Z direction in the example of FIG. 10, it is preferable that the operation parameter is edited with the work parameter as correct.”. 
The cited passages clearly show that the work process of the robot can include multiple different categories of operations and that each operation changes the parameter used to control the robot (See Column 8 line 60 – Column 10 line 23 for a list of parameters for each operation category). Furthermore, two such operations include a standard pressing operation (performed without the robot moving in a direction other than the direction required to press the held object into the workpiece) and an operation of moving while pressing (performed while moving in a direction other than the direction required to press the held object into the workpiece). One of ordinary skill in the art would recognize that, because the system can be configured to perform the work on the workpiece while it is moving on the conveying device, the moving while pressing operation comprises an operation mode of executing force control while tracking an article conveyed on the conveyance device. Additionally, the cited passages show that the operational parameters are automatically set. The cited passages and Figure 10 show that each operation has different operational parameters that are automatically set. One of ordinary skill in the art would therefore recognize that the system clearly determines the operation being performed in order to properly set the necessary parameters.). Regarding claim 11, Takeuchi in view of Ueno teaches wherein the determination unit determines whether the operation mode of the force control is an operation mode executing force control while tracking an article or an operation mode of regular force control by determining whether the robot program is associated with setting data relating to tracking operation (Takeuchi: Figure 10, Column 7 lines 44-49, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. 
A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”, Column 8 lines 33-53, “Category 4: Pressing Pressing is an operation of pressing with designated force in the designated direction. The category of the pressing operation includes the following two types of operation objects. (a) Pressing (simple pressing) object is an operation of pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. (b) Pressing and moving object is an operation of moving while pressing with designated force in the designated direction. In this operation, "copying" operation can be executed with respect to other designated axes. As shown in FIG. 8D, in the pressing and moving object, the end effector 140 is moved in the designated direction DD and pressed with designated force, and then, is moved in a direction different from the designated direction while maintaining (that is copying) the pressing with designated force. In the example of FIG. 8D, an operation of inserting the workpiece WKa held by the end effector 140 into the hole Hb of a workpiece WKb is executed by the pressing and moving.”, Column 13 lines 14-22, “When the operation flow is completed as described above, the conversion unit 244 converts the operation flow into a control program according to the indication of the teacher in step S130 in FIG. 5. The indication can be performed, for example, by selecting "create control program" from the context menu of the operation flow creation area FL. It is preferable that any one method of the following three types of methods can be selectively performed for the conversion from the operation flow to a control program and execution.”, Column 11 lines 21-40, “FIG. 10 is an explanatory diagram showing an example of a relationship between work parameters and operation parameters. 
Here, an example of work parameters displayed in the parameter setting area PR on the window W1 (FIG. 6D) is shown. These work parameters are, for example, displayed in the parameter setting area PR by selecting the sequence block OB1 in the operation flow creation area FL. In FIG. 10, when clicking a relationship display button BT of a specific parameter in the parameter setting area PR, a dialog DL1 showing operation parameters influenced by the work parameters is displayed. In this example, among the work parameters of cylinder fitting work, when the relationship display button BT of the fitting direction is clicked, it is displayed in the dialog DL1 the fitting direction affects the contacting direction of contacting operation, the pressing direction of the pressing and probing operation, the moving direction of the pressing and moving operation, and the force control in the six axes directions. The teacher can check the relationship between the work parameters and the operation parameters of each operation from the dialog DL1.”, Column 11 lines 41-63, “The operation parameter of the force control operation automatically set from the work parameter is not limited to the direction of force control, and other operation parameters may be automatically set according to the work parameter. For example, the moving amount while pressing of the pressing and moving operation (moving amount in the -Z direction at the right end of FIG. SD) may be automatically set from workpiece information (for example, fitting depth of workpiece) of the work parameter. As an automatic parameter setting mode of the parameter, one mode may be selected from a first mode in which only the work parameter can be changed, a second mode in which only the operation parameter can be changed, and a third mode in which both of the work parameter and the operation parameter can be changed. 
In this way, a novice can create a work sequence using only the work parameter, and an expert can perform further detailed corrections using the operation parameter. In a case where the work parameter and the operation parameter are inconsistent, for example, in a case where the contacting direction is set in the -X direction when the fitting direction is set in the -Z direction in the example of FIG. 10, it is preferable that the operation parameter is edited with the work parameter as correct.”. The cited figure and passages show that each operation has its own operational parameters that are automatically set by the system. One of ordinary skill in the art would recognize that the system is clearly configured to determine if the operation is associated with a setting relating to tracking in order to automatically set said parameter. Such a parameter would include the moving direction for the moving and pressing operation as shown in Figure 10.). Regarding claim 12, Takeuchi in view of Ueno teaches wherein the force control setting unit automatically sets a force control parameter according to the determined operation mode of the force control (Takeuchi: Column 7 lines 44-49, “FIG. 7 shows an example of operation classifications and operation objects constituting an operation flow, and FIGS. 8A to 8D show outlines of operations of some operation objects. A plurality of operation objects can be categorized into the following four categories. All of these operations involve force control.”, Column 11 lines 6-20, “When the work parameters are set for the work in which the operation flow is created as shown in FIG. 6D, it is preferable that some of the operation parameters of the operation included in the operation flow are automatically set from the work parameters.”, Column 11 lines 21-40, “FIG. 10 is an explanatory diagram showing an example of a relationship between work parameters and operation parameters. 
Here, an example of work parameters displayed in the parameter setting area PR on the window W1 (FIG. 6D) is shown. These work parameters are, for example, displayed in the parameter setting area PR by selecting the sequence block OB1 in the operation flow creation area FL. In FIG. 10, when clicking a relationship display button BT of a specific parameter in the parameter setting area PR, a dialog DL1 showing operation parameters influenced by the work parameters is displayed. In this example, among the work parameters of cylinder fitting work, when the relationship display button BT of the fitting direction is clicked, it is displayed in the dialog DL1 the fitting direction affects the contacting direction of contacting operation, the pressing direction of the pressing and probing operation, the moving direction of the pressing and moving operation, and the force control in the six axes directions. The teacher can check the relationship between the work parameters and the operation parameters of each operation from the dialog DL1.”, Column 11 lines 41-63, “The operation parameter of the force control operation automatically set from the work parameter is not limited to the direction of force control, and other operation parameters may be automatically set according to the work parameter. For example, the moving amount while pressing of the pressing and moving operation (moving amount in the -Z direction at the right end of FIG. 8D) may be automatically set from workpiece information (for example, fitting depth of workpiece) of the work parameter. As an automatic parameter setting mode of the parameter, one mode may be selected from a first mode in which only the work parameter can be changed, a second mode in which only the operation parameter can be changed, and a third mode in which both of the work parameter and the operation parameter can be changed. 
In this way, a novice can create a work sequence using only the work parameter, and an expert can perform further detailed corrections using the operation parameter. In a case where the work parameter and the operation parameter are inconsistent, for example, in a case where the contacting direction is set in the -X direction when the fitting direction is set in the -Z direction in the example of FIG. 10, it is preferable that the operation parameter is edited with the work parameter as correct.”, Ueno: Abstract, “A robot system that performs work of coupling a flexible cable to a connector provided on a board, includes a robot in which a gripping unit that grips the cable and a force detection unit that detects a force acting on the gripping unit are provided, a control unit that controls the robot to perform a conveyance action to grip the cable using the gripping unit and convey the cable onto the board, and an insertion action to insert the cable into the connector by force control based on a detection result in the force detection unit, an insertion speed entry part in which an insertion speed of the cable into the connector at the insertion action is entered, and a determination unit that can determine force control information necessary for the force control in the insertion action according to the insertion speed.”, Column 14 lines 7-17, “The control apparatus 200 includes a determination unit 203C that can determine force control information necessary for force control at the second stage of the conveyance action according to the conveyance speed Vl. The force control information includes e.g. the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force. The virtual coefficient of inertia, the virtual coefficient of viscosity, and the virtual modulus of elasticity are the parameters expressed by the above described equation (A). The target force may be set to e.g. 
the first threshold value.”, Column 14 lines 37-53, “As described above, the robot system 100 includes the conveyance speed entry part 332 in which the conveyance speed V1 of the cable 93 to the board 91 at the second stage of the conveyance action is entered. Then, the determination unit 203C can determine the force control information at the second stage of the conveyance action according to the conveyance speed V1 based on the table 41 as the calibration curve 4. That is, the unit may determine the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force suitable for the conveyance speed V1 at the second stage. Thereby, for adjustment of the conveyance speed V1 regardless of the degree of skill of a programmer, the determination unit 203C may accurately change and determine the virtual coefficient of inertia, the virtual coefficient of viscosity, the virtual modulus of elasticity, and the target force according to the adjustment.”). Claim(s) 5-8 and 13-16 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 11167417 B2 ("Takeuchi") in view of US 11230010 B2 ("Ueno") in further view of US 20200391385 A1 ("Oka"). Regarding claim 5, Takeuchi in view of Ueno teaches further comprising a visual sensor (Takeuchi: Column 4 lines 10-14, “FIG. 1 is a perspective view of a robot system in a first embodiment. The robot system includes a camera 30, a transport device 50, a robot 100, and a robot control device 200. The robot 100 and the robot control device 200 are communicably connected via a cable or radio.”, Column 5 lines 1-14, “The camera 30 is installed above the transport device 50. The camera 30 is installed such that the workpiece WK2 on the transport surface is in the visual field.”). 
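The Ueno passages quoted above for claim 12 describe a determination unit (203C) that selects impedance parameters from an entered speed via a calibration table. As a minimal illustrative sketch only, with entirely hypothetical table values, units, and function names (none are taken from Ueno), the speed-to-parameter lookup could be modeled as:

```python
# Hedged sketch of a calibration-table lookup in the spirit of Ueno's
# determination unit 203C. All names and numeric values here are
# hypothetical illustrations, not taken from the patent.
from bisect import bisect_left

# Each row: (upper speed bound in mm/s,
#            (virtual inertia, virtual viscosity, virtual elasticity, target force in N))
CALIBRATION_TABLE = [
    (5.0,  (1.0, 40.0, 200.0, 2.0)),
    (10.0, (1.2, 55.0, 260.0, 3.0)),
    (20.0, (1.5, 75.0, 330.0, 4.5)),
]

def determine_force_control_info(speed_mm_s: float):
    """Pick impedance parameters suited to an entered speed.

    The programmer enters only a speed; the force-control information is
    looked up from the table, so the chosen parameters do not depend on
    the programmer's degree of skill.
    """
    bounds = [bound for bound, _ in CALIBRATION_TABLE]
    # Find the first row whose speed bound covers the entered speed,
    # clamping to the last row for speeds beyond the table.
    idx = min(bisect_left(bounds, speed_mm_s), len(CALIBRATION_TABLE) - 1)
    return CALIBRATION_TABLE[idx][1]
```

The table stands in for Ueno's "table 41 as the calibration curve 4"; a real implementation might interpolate between rows rather than selecting a discrete row.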
Takeuchi in view of Ueno does not teach wherein the robot controller further includes an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by the visual sensor. Oka, in the same field of endeavor, teaches wherein the robot controller further includes an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by the visual sensor (Oka: Abstract, “An object handling control device includes one or more processors configured to acquire at least object information and status information representing an initial position and a destination of an object; set, when a grasper grasping the object moves from the initial position to the destination, a first region, a second region, and a third region in accordance with the object information and the status information; and calculate a moving route along which the object is moved from the initial position to the destination with reference to the first region, the second region, and the third region.”, ¶ 0043, “According to one embodiment, in general, an object handling control device includes one or more processors. 
The one or more processors are configured to acquire at least object information and status information, the object information representing an object grasped by a grasper, the status information representing an initial position and a destination of the object; set, when the grasper grasping the object moves from the initial position to the destination, a first region, a second region, and a third region in accordance with the object information and the status information, the first region being a region in which the grasper is allowed to move without being restricted by an obstacle present in a space between the initial position and the destination, the second region being a region in which the grasper is restricted from moving due to the obstacle, the third region at least part of which is set below the second region, the third region being a region in which the grasper is operated under force control; and calculate a moving route along which the object is moved from the initial position to the destination with reference to the first region, the second region, and the third region.”, ¶ 0064, “To move the hand 22 grasping the object OBJ from the initial position HP to the moving destination RP, the region setter 55 sets regions in a space between the initial position HP and the moving destination RP with reference to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b. The region setter 55 sets a first region, a second region, and a third region, for example. In the first region the hand 22 is allowed to move without being restricted by obstacles such as the containers 14a and 14b and a previously set object in the space between the initial position HP and the moving destination RP. In the second region the hand 22 is restricted from moving due to presence of obstacles. 
At least part of the third region is set below the second region, and in the third region the hand 22 is moved under force control. In the first region the hand 22 is movable at a higher speed, for example. In the second region the hand 22 is restricted or prohibited from passing. In the third region the force sensor 31 detects force, allowing the hand 22 to correct the moving route under repulsive control if the object OBJ or the hand 22 interferes with the obstacle. Additionally, in the third region, the moving speed of the hand 22 (object OBJ) may be lowered, or the force sensor 31 may be temporarily improved in terms of sensing accuracy, for example.”. The cited passages clearly show that the system defines multiple work areas for the robot, wherein these work areas define how the robot is permitted to move.). Takeuchi in view of Ueno teaches a robot system comprising: a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit, and a visual sensor. Takeuchi in view of Ueno does not teach wherein the robot controller further includes an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by the visual sensor. Oka teaches wherein the robot controller further includes an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by the visual sensor. A person of ordinary skill in the art would have had the technological capabilities required to have modified the system taught in Takeuchi in view of Ueno with the feature wherein the robot controller further includes an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by the visual sensor, as taught in Oka. 
Furthermore, the system taught in Takeuchi in view of Ueno is already configured with a visual sensor and is configured to use the visual sensor in the control of the robot (Takeuchi: Column 13 lines 61-67, “In the above-described description, a procedure of creating an operation flow and a control program of work using the force detector 130 is described. However, in the present embodiment, it is also possible to create an operation flow and a control program of work using the camera 30 (imaging device).”, Column 14 lines 4-26, “FIG. 6E shows an example of a window W1a for creating an operation flow of second type work using the camera 30. The window W1a is similar to the window W1 for the first type work shown in FIG. 6D. However, in the main view area MV, the window W1a is different from the window W1 in that a camera image display area IM for displaying an image captured with the camera 30 is provided in the main view area MV. In the camera image display area IM, it is preferable to be able to designate an image processing area to be subjected to an image processing such as product”). As such, one of ordinary skill in the art would have been able to modify the system taught in Takeuchi in view of Ueno with the method of setting a work area of the robot as a monitoring target by the visual sensor according to methods known in the art. Such a combination would not have changed or introduced new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a robot system in which the robot controller further includes an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by the visual sensor. 
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the system taught in Takeuchi in view of Ueno with the feature wherein the robot controller further includes an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by the visual sensor, as taught in Oka, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Regarding claim 6, Takeuchi in view of Ueno in further view of Oka teaches wherein, when the work area is not set, the area setting unit sets the work area in an image capture range of the visual sensor, based on position information of the visual sensor and the robot (Oka: ¶ 0054, “The camera 32a is located in the initial position to image the object OBJ and the surroundings thereof from above at the initial position HP of the object OBJ to be grasped and moved or conveyed and acquire object information (such as a shape or a size) and status information (such as a stationary pose) on the object OBJ. At the initial position HP, the object OBJ is housed in a container 14a such as a stowage or a palette. In such a case, the camera 32a generates an image of all or part of inside the container 14a. The initial position HP may also be referred to as a motion start position or a departure position of the object OBJ. In FIG. 1, the container 14a is placed on a conveying mechanism such as a conveyor belt for exemplary purpose only, and the location thereof is not limited thereto.”, ¶ 0063, “The grasp plan generator 54 calculates a grasping method and a grasping pose of the object OBJ at the initial position HP, and a moving route and via points along which the manipulator 20 or hand 22 is moved to the initial position HP. 
The grasp plan generator 54 also calculates a moving route and via points of the hand 22 to grasp a next intended object OBJ after releasing the object OBJ at the moving destination RP. In these cases, the object information acquired by the camera 32a is utilized in calculation of the moving route and via points to move the hand 22 without interfering with surrounding obstacles such as wall surfaces of the containers 14a and 14b or an object or objects other than the currently moved object OBJ.”, ¶ 0064, “To move the hand 22 grasping the object OBJ from the initial position HP to the moving destination RP, the region setter 55 sets regions in a space between the initial position HP and the moving destination RP with reference to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b. The region setter 55 sets a first region, a second region, and a third region, for example. In the first region the hand 22 is allowed to move without being restricted by obstacles such as the containers 14a and 14b and a previously set object in the space between the initial position HP and the moving destination RP. In the second region the hand 22 is restricted from moving due to presence of obstacles. At least part of the third region is set below the second region, and in the third region the hand 22 is moved under force control. In the first region the hand 22 is movable at a higher speed, for example. In the second region the hand 22 is restricted or prohibited from passing. In the third region the force sensor 31 detects force, allowing the hand 22 to correct the moving route under repulsive control if the object OBJ or the hand 22 interferes with the obstacle. Additionally, in the third region, the moving speed of the hand 22 (object OBJ) may be lowered, or the force sensor 31 may be temporarily improved in terms of sensing accuracy, for example.”. 
The cited passages clearly show that the area setting unit (i.e. the region setter) sets the work area in an image capture range of the cameras and uses positional information of the sensors, objects, and robot to set said work areas.). Regarding claim 7, Takeuchi in view of Ueno in further view of Oka teaches wherein the robot controller further includes an area monitoring unit configured to detect entry of an obstacle into the work area, based on an image captured by the visual sensor (Oka: ¶ 0064, “To move the hand 22 grasping the object OBJ from the initial position HP to the moving destination RP, the region setter 55 sets regions in a space between the initial position HP and the moving destination RP with reference to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b. The region setter 55 sets a first region, a second region, and a third region, for example. In the first region the hand 22 is allowed to move without being restricted by obstacles such as the containers 14a and 14b and a previously set object in the space between the initial position HP and the moving destination RP. In the second region the hand 22 is restricted from moving due to presence of obstacles. At least part of the third region is set below the second region, and in the third region the hand 22 is moved under force control. In the first region the hand 22 is movable at a higher speed, for example. In the second region the hand 22 is restricted or prohibited from passing. In the third region the force sensor 31 detects force, allowing the hand 22 to correct the moving route under repulsive control if the object OBJ or the hand 22 interferes with the obstacle. 
Additionally, in the third region, the moving speed of the hand 22 (object OBJ) may be lowered, or the force sensor 31 may be temporarily improved in terms of sensing accuracy, for example.”, ¶ 0102, “The region setter 55 serves to set the first region, the second region, and the third region including the first control region and the second control region according to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b.”. One of ordinary skill in the art would recognize that the second region, which is defined by the presence of an obstacle, is defined using the object information acquired by the cameras. This clearly shows that the system is configured to determine the presence of obstacles using an image captured by the vision sensors.), and the area setting unit performs re-setting of the work area in such a way that an area where the obstacle exists in the work area is excluded from the work area (Oka: ¶ 0098, “In the second region the hand 22 is restricted from moving due to presence of obstacles. For example, the hand 22 is restricted from entering or moving from above to below the obstacle. To avoid interference, the fingertip TCP of the hand 22 is prohibited from passing the second region while moving from upward to downward following a moving-route plan. In the second region the hand 22 may be additionally restricted from moving from downward to upward or moving laterally. The second region may be set to a motion prohibited region in which the hand 22 or the object OBJ is prohibited from entering or moving.”. The cited passages show that the system can prohibit the robot hand from entering the region defined by the obstacles. One of ordinary skill in the art would recognize that this clearly teaches excluding the obstacle from the work area (i.e. the first region)). 
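The Oka passages quoted above describe a region setter that monitors a work area and, when an obstacle is present inside it, restricts the area the robot may use. Under the simplifying assumptions of axis-aligned rectangular regions and a single cut along one axis (the function names and the cutting strategy are illustrative assumptions, not Oka's disclosed implementation), the re-setting of a work area to exclude an obstacle could be sketched as:

```python
# Hedged sketch of Oka-style work-area re-setting. Regions are modeled as
# axis-aligned 2-D boxes (xmin, ymin, xmax, ymax); all names are hypothetical.

def boxes_overlap(a, b):
    """True if the two axis-aligned boxes share any interior area."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def reset_work_area(work_area, obstacle):
    """Shrink the work area so it no longer contains the obstacle.

    Minimal strategy: cut the work area at the obstacle's nearest x-edge
    and keep the larger remaining slab. A real region setter would
    instead partition space into first/second/third regions with
    different motion and force-control rules.
    """
    if not boxes_overlap(work_area, obstacle):
        return work_area  # obstacle outside the work area; nothing to do
    left_width = obstacle[0] - work_area[0]
    right_width = work_area[2] - obstacle[2]
    if left_width >= right_width:
        # keep the slab to the left of the obstacle
        return (work_area[0], work_area[1], obstacle[0], work_area[3])
    # keep the slab to the right of the obstacle
    return (obstacle[2], work_area[1], work_area[2], work_area[3])
```

For example, with a work area (0, 0, 10, 10) and an obstacle at (8, 2, 9, 4), the re-set work area becomes (0, 0, 8, 10), excluding the region the obstacle occupies.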
Regarding claim 8, Takeuchi in view of Ueno in further view of Oka teaches wherein the robot controller further includes an area monitoring unit configured to perform control in such a way as to discontinue the force control when the robot is out of the work area (Oka: ¶ 0064, “To move the hand 22 grasping the object OBJ from the initial position HP to the moving destination RP, the region setter 55 sets regions in a space between the initial position HP and the moving destination RP with reference to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b. The region setter 55 sets a first region, a second region, and a third region, for example. In the first region the hand 22 is allowed to move without being restricted by obstacles such as the containers 14a and 14b and a previously set object in the space between the initial position HP and the moving destination RP. In the second region the hand 22 is restricted from moving due to presence of obstacles. At least part of the third region is set below the second region, and in the third region the hand 22 is moved under force control. In the first region the hand 22 is movable at a higher speed, for example. In the second region the hand 22 is restricted or prohibited from passing. In the third region the force sensor 31 detects force, allowing the hand 22 to correct the moving route under repulsive control if the object OBJ or the hand 22 interferes with the obstacle. Additionally, in the third region, the moving speed of the hand 22 (object OBJ) may be lowered, or the force sensor 31 may be temporarily improved in terms of sensing accuracy, for example.”, ¶ 0099, “At least part of the third region is set below and adjacent to the second region, and includes a first control region and a second control region, for example. In the third region the hand 22 is operated under the force control as described later. 
In setting the third region below the second region, at least part of the first region may be set between the second region and the third region. The first control region of the third region is set below the second region along the obstacle, and includes a less margin with respect to the obstacle than the second region, that is, has a smaller lateral width than the second region. In the lateral direction the first control region is set entirely adjacent to the bottom of the second region. The hand 22 enters the first control region from the lateral direction with respect to the obstacle. In other words, setting the first control region below the second control region prevents the object OBJ or the hand 22 from approaching the first control region from above the second region. Also, the object OBJ or the hand 22 enters and approaches the inside of the first control region from the lateral direction alone. The first control region is laterally adjacent to the obstacle below the second region, and the hand 22 is operated under the pressing control of the force control therein. As in a second region 100, the hand 22 may be prohibited in principle from moving in or entering the first control region, such as from upward to downward motion. In this case, if such motion or entry prohibition makes it difficult to create the route or a created route is inefficient, the hand 22 may be exceptionally allowed to move in or enter the first control region under the condition that the hand 22 is operated under the pressing control.”. One of ordinary skill in the art would recognize from the cited passages that only the third work area (i.e. third region) is set to allow force control of the robot, and that the force control ends when the robot exits the third region.). Regarding claim 13, Takeuchi in view of Ueno does not teach further comprising an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by a visual sensor. 
Oka, in the same field of endeavor, teaches further comprising an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by a visual sensor (Oka: Abstract, “An object handling control device includes one or more processors configured to acquire at least object information and status information representing an initial position and a destination of an object; set, when a grasper grasping the object moves from the initial position to the destination, a first region, a second region, and a third region in accordance with the object information and the status information; and calculate a moving route along which the object is moved from the initial position to the destination with reference to the first region, the second region, and the third region.”, ¶ 0043, “According to one embodiment, in general, an object handling control device includes one or more processors. The one or more processors are configured to acquire at least object information and status information, the object information representing an object grasped by a grasper, the status information representing an initial position and a destination of the object; set, when the grasper grasping the object moves from the initial position to the destination, a first region, a second region, and a third region in accordance with the object information and the status information, the first region being a region in which the grasper is allowed to move without being restricted by an obstacle present in a space between the initial position and the destination, the second region being a region in which the grasper is restricted from moving due to the obstacle, the third region at least part of which is set below the second region, the third region being a region in which the grasper is operated under force control; and calculate a moving route along which the object is moved from the initial position to the destination with reference to the first region, the second 
region, and the third region.”, ¶ 0064, “To move the hand 22 grasping the object OBJ from the initial position HP to the moving destination RP, the region setter 55 sets regions in a space between the initial position HP and the moving destination RP with reference to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b. The region setter 55 sets a first region, a second region, and a third region, for example. In the first region the hand 22 is allowed to move without being restricted by obstacles such as the containers 14a and 14b and a previously set object in the space between the initial position HP and the moving destination RP. In the second region the hand 22 is restricted from moving due to presence of obstacles. At least part of the third region is set below the second region, and in the third region the hand 22 is moved under force control. In the first region the hand 22 is movable at a higher speed, for example. In the second region the hand 22 is restricted or prohibited from passing. In the third region the force sensor 31 detects force, allowing the hand 22 to correct the moving route under repulsive control if the object OBJ or the hand 22 interferes with the obstacle. Additionally, in the third region, the moving speed of the hand 22 (object OBJ) may be lowered, or the force sensor 31 may be temporarily improved in terms of sensing accuracy, for example.”. The cited passages clearly show that the system defines multiple work areas for the robot, wherein these work areas define how the robot is permitted to move.). Takeuchi in view of Ueno teaches a robot controller for executing a robot program and controlling a robot, the robot controller comprising: a force control setting unit configured to perform setting of the force control according to the operation mode of the force control determined by the determination unit, and a visual sensor. 
Takeuchi in view of Ueno does not teach further comprising an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by a visual sensor. Oka teaches further comprising an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by a visual sensor. A person of ordinary skill in the art would have had the technological capabilities required to have modified the controller taught in Takeuchi in view of Ueno with an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by a visual sensor, as taught in Oka. Furthermore, the controller taught in Takeuchi in view of Ueno is already configured with a visual sensor and is configured to use the visual sensor in the control of the robot (Takeuchi: Column 13 lines 61-67, “In the above-described description, a procedure of creating an operation flow and a control program of work using the force detector 130 is described. However, in the present embodiment, it is also possible to create an operation flow and a control program of work using the camera 30 (imaging device).”, Column 14 lines 4-26, “FIG. 6E shows an example of a window W1a for creating an operation flow of second type work using the camera 30. The window W1a is similar to the window W1 for the first type work shown in FIG. 6D. However, in the main view area MV, the window W1a is different from the window W1 in that a camera image display area IM for displaying an image captured with the camera 30 is provided in the main view area MV. In the camera image display area IM, it is preferable to be able to designate an image processing area to be subjected to an image processing such as product”). 
As such, one of ordinary skill in the art would have been able to modify the controller taught in Takeuchi in view of Ueno with the method of setting a work area of the robot as a monitoring target by the visual sensor according to methods known in the art. Such a combination would not have changed or introduced new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a robot controller further comprising an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by a visual sensor. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the controller taught in Takeuchi in view of Ueno with an area setting unit configured to perform setting relating to a work area of the robot as a monitoring target by a visual sensor, as taught in Oka, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Regarding claim 14, Takeuchi in view of Ueno in further view of Oka teaches wherein, when the work area is not set, the area setting unit sets the work area in an image capture range of the visual sensor, based on position information of the visual sensor and the robot (Oka: ¶ 0054, “The camera 32a is located in the initial position to image the object OBJ and the surroundings thereof from above at the initial position HP of the object OBJ to be grasped and moved or conveyed and acquire object information (such as a shape or a size) and status information (such as a stationary pose) on the object OBJ. At the initial position HP, the object OBJ is housed in a container 14a such as a stowage or a palette. In such a case, the camera 32a generates an image of all or part of inside the container 14a. 
The initial position HP may also be referred to as a motion start position or a departure position of the object OBJ. In FIG. 1, the container 14a is placed on a conveying mechanism such as a conveyor belt for exemplary purpose only, and the location thereof is not limited thereto.”, ¶ 0063, “The grasp plan generator 54 calculates a grasping method and a grasping pose of the object OBJ at the initial position HP, and a moving route and via points along which the manipulator 20 or hand 22 is moved to the initial position HP. The grasp plan generator 54 also calculates a moving route and via points of the hand 22 to grasp a next intended object OBJ after releasing the object OBJ at the moving destination RP. In these cases, the object information acquired by the camera 32a is utilized in calculation of the moving route and via points to move the hand 22 without interfering with surrounding obstacles such as wall surfaces of the containers 14a and 14b or an object or objects other than the currently moved object OBJ.”, ¶ 0064, “To move the hand 22 grasping the object OBJ from the initial position HP to the moving destination RP, the region setter 55 sets regions in a space between the initial position HP and the moving destination RP with reference to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b. The region setter 55 sets a first region, a second region, and a third region, for example. In the first region the hand 22 is allowed to move without being restricted by obstacles such as the containers 14a and 14b and a previously set object in the space between the initial position HP and the moving destination RP. In the second region the hand 22 is restricted from moving due to presence of obstacles. At least part of the third region is set below the second region, and in the third region the hand 22 is moved under force control. 
In the first region the hand 22 is movable at a higher speed, for example. In the second region the hand 22 is restricted or prohibited from passing. In the third region the force sensor 31 detects force, allowing the hand 22 to correct the moving route under repulsive control if the object OBJ or the hand 22 interferes with the obstacle. Additionally, in the third region, the moving speed of the hand 22 (object OBJ) may be lowered, or the force sensor 31 may be temporarily improved in terms of sensing accuracy, for example.”. The cited passages clearly show that the area setting unit (i.e. the region setter) sets the work area in an image capture range of the cameras and uses positional information of the sensors, objects, and robot to set said work areas.).

Regarding claim 15, Takeuchi in view of Ueno in further view of Oka teaches further comprising an area monitoring unit configured to detect entry of an obstacle into the work area, based on an image captured by the visual sensor (Oka: ¶ 0064, “To move the hand 22 grasping the object OBJ from the initial position HP to the moving destination RP, the region setter 55 sets regions in a space between the initial position HP and the moving destination RP with reference to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b. The region setter 55 sets a first region, a second region, and a third region, for example. In the first region the hand 22 is allowed to move without being restricted by obstacles such as the containers 14a and 14b and a previously set object in the space between the initial position HP and the moving destination RP. In the second region the hand 22 is restricted from moving due to presence of obstacles. At least part of the third region is set below the second region, and in the third region the hand 22 is moved under force control. In the first region the hand 22 is movable at a higher speed, for example.
In the second region the hand 22 is restricted or prohibited from passing. In the third region the force sensor 31 detects force, allowing the hand 22 to correct the moving route under repulsive control if the object OBJ or the hand 22 interferes with the obstacle. Additionally, in the third region, the moving speed of the hand 22 (object OBJ) may be lowered, or the force sensor 31 may be temporarily improved in terms of sensing accuracy, for example.”, ¶ 0102, “The region setter 55 serves to set the first region, the second region, and the third region including the first control region and the second control region according to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b.”. One of ordinary skill in the art would recognize that the second region, which is defined by the presence of an obstacle, is defined using the object information acquired by the cameras. This clearly shows that the system is configured to determine the presence of obstacles using an image captured by the vision sensors.), wherein the area setting unit performs re-setting of the work area in such a way that an area where the obstacle exists in the work area is excluded from the work area (Oka: ¶ 0098, “In the second region the hand 22 is restricted from moving due to presence of obstacles. For example, the hand 22 is restricted from entering or moving from above to below the obstacle. To avoid interference, the fingertip TCP of the hand 22 is prohibited from passing the second region while moving from upward to downward following a moving-route plan. In the second region the hand 22 may be additionally restricted from moving from downward to upward or moving laterally. The second region may be set to a motion prohibited region in which the hand 22 or the object OBJ is prohibited from entering or moving.”.
The cited passages show that the system can prohibit the robot hand from entering the region defined by the obstacles. One of ordinary skill in the art would recognize that this clearly teaches excluding the obstacle from the work area (i.e. the first region)).

Regarding claim 16, Takeuchi in view of Ueno in further view of Oka teaches further comprising an area monitoring unit configured to perform control in such a way as to discontinue the force control when the robot is out of the work area (Oka: ¶ 0064, “To move the hand 22 grasping the object OBJ from the initial position HP to the moving destination RP, the region setter 55 sets regions in a space between the initial position HP and the moving destination RP with reference to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b. The region setter 55 sets a first region, a second region, and a third region, for example. In the first region the hand 22 is allowed to move without being restricted by obstacles such as the containers 14a and 14b and a previously set object in the space between the initial position HP and the moving destination RP. In the second region the hand 22 is restricted from moving due to presence of obstacles. At least part of the third region is set below the second region, and in the third region the hand 22 is moved under force control. In the first region the hand 22 is movable at a higher speed, for example. In the second region the hand 22 is restricted or prohibited from passing. In the third region the force sensor 31 detects force, allowing the hand 22 to correct the moving route under repulsive control if the object OBJ or the hand 22 interferes with the obstacle.
Additionally, in the third region, the moving speed of the hand 22 (object OBJ) may be lowered, or the force sensor 31 may be temporarily improved in terms of sensing accuracy, for example.”, ¶ 0099, “At least part of the third region is set below and adjacent to the second region, and includes a first control region and a second control region, for example. In the third region the hand 22 is operated under the force control as described later. In setting the third region below the second region, at least part of the first region may be set between the second region and the third region. The first control region of the third region is set below the second region along the obstacle, and includes a less margin with respect to the obstacle than the second region, that is, has a smaller lateral width than the second region. In the lateral direction the first control region is set entirely adjacent to the bottom of the second region. The hand 22 enters the first control region from the lateral direction with respect to the obstacle. In other words, setting the first control region below the second control region prevents the object OBJ or the hand 22 from approaching the first control region from above the second region. Also, the object OBJ or the hand 22 enters and approaches the inside of the first control region from the lateral direction alone. The first control region is laterally adjacent to the obstacle below the second region, and the hand 22 is operated under the pressing control of the force control therein. As in a second region 100, the hand 22 may be prohibited in principle from moving in or entering the first control region, such as from upward to downward motion. In this case, if such motion or entry prohibition makes it difficult to create the route or a created route is inefficient, the hand 22 may be exceptionally allowed to move in or enter the first control region under the condition that the hand 22 is operated under the pressing control.”. 
One of ordinary skill in the art would recognize from the cited passages that only the third work area (i.e. third region) is set to allow force control of the robot, and that the force control ends when the robot exits the third region.).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Noah W Stiebritz whose telephone number is (571)272-3414. The examiner can normally be reached Monday thru Friday 7-5 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado, can be reached at (571) 270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/N.W.S./Examiner, Art Unit 3658
/Ramon A. Mercado/Supervisory Patent Examiner, Art Unit 3658
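The region scheme the examiner cites from Oka (a free first region, an obstacle-bounded second region the hand may not pass, and a force-controlled third region below it) can be illustrated with a minimal sketch. The zone layout, coordinates, and names below are hypothetical illustrations for understanding the rejection, not Oka's actual implementation.

```python
from dataclasses import dataclass
from enum import Enum

class Region(Enum):
    FIRST = "free"           # unrestricted motion; higher speed allowed
    SECOND = "prohibited"    # obstacle present; hand restricted from passing
    THIRD = "force_control"  # set below the second region; force control active

@dataclass
class Zone:
    region: Region
    x_min: float
    x_max: float
    z_min: float
    z_max: float

    def contains(self, x: float, z: float) -> bool:
        return self.x_min <= x <= self.x_max and self.z_min <= z <= self.z_max

def classify(zones: list[Zone], x: float, z: float) -> Region:
    """Return the region at (x, z); positions outside any set zone default to
    the first region, matching the examiner's reading for claim 16 that force
    control applies only while the hand is inside the third region."""
    for zone in zones:
        if zone.contains(x, z):
            return zone.region
    return Region.FIRST

# Hypothetical layout: an obstacle spans x in [1, 2]; the force-control band
# sits directly below it, as described in the cited Oka ¶ 0099.
zones = [
    Zone(Region.SECOND, 1.0, 2.0, 0.5, 1.0),
    Zone(Region.THIRD, 1.0, 2.0, 0.0, 0.5),
]

assert classify(zones, 0.5, 0.8) is Region.FIRST   # clear of the obstacle
assert classify(zones, 1.5, 0.8) is Region.SECOND  # inside the obstacle region
assert classify(zones, 1.5, 0.2) is Region.THIRD   # force control engaged
```

Under this reading, claim 16's "discontinue the force control when the robot is out of the work area" maps to the classifier returning something other than `Region.THIRD` once the hand leaves the third region.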

Prosecution Timeline

Dec 10, 2024
Application Filed
Feb 23, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602063
LOAD HANDLING SYSTEM AND LOAD HANDLING METHOD
2y 5m to grant Granted Apr 14, 2026
Patent 12575900
Steerable Eversion Robot System and Method of Operating the Steerable Eversion Robot System
2y 5m to grant Granted Mar 17, 2026
Patent 12552043
METHOD FOR CONTROLLING ROBOTIC ARM, ELECTRONIC DEVICE, AND COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant Granted Feb 17, 2026
Patent 12472640
CONTROL METHOD AND SYSTEM FOR ARTICLE TRANSPORTATION BASED ON MOBILE ROBOT
2y 5m to grant Granted Nov 18, 2025
Patent 12467759
VEHICLE WITH SWITCHABLE FORWARD AND BACKWARD CONFIGURATIONS, CONTROL METHOD, AND CONTROL PROGRAM
2y 5m to grant Granted Nov 11, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
67%
Grant Probability
51%
With Interview (-15.6%)
2y 6m
Median Time to Grant
Low
PTA Risk
Based on 18 resolved cases by this examiner. Grant probability derived from career allow rate.
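The projection figures reduce to simple arithmetic on the examiner's career record (12 grants out of 18 resolved cases, with an observed interview lift of -15.6%). The sketch below assumes the lift is applied additively to the career allow rate, which reproduces the displayed numbers but is an assumption about the tool's method.

```python
# Career allow rate from the examiner's resolved cases.
granted, resolved = 12, 18
career_allow_rate = granted / resolved  # ~0.667

# Observed interview lift in this examiner's resolved cases (negative here),
# applied additively as an assumption.
interview_lift = -0.156
with_interview = career_allow_rate + interview_lift

print(f"{career_allow_rate:.0%}")  # 67%
print(f"{with_interview:.0%}")     # 51%
```

A negative lift is unusual; with only 18 resolved cases the interview subset is small, so this figure may reflect selection effects (interviews requested in harder cases) rather than interviews hurting outcomes.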
