Prosecution Insights
Last updated: April 19, 2026
Application No. 18/547,447

CONTROL DEVICE, CONTROL SYSTEM, CONTROL METHOD, AND PROGRAM

Final Rejection: §103, §112
Filed: Aug 22, 2023
Examiner: HOQUE, SHAHEDA SHABNAM
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Omron Corporation
OA Round: 2 (Final)
Grant Probability: 43% (Moderate)
Expected OA Rounds: 3-4
Expected Time to Grant: 3y 1m
Grant Probability with Interview: 81%

Examiner Intelligence

Career Allowance Rate: 43% (25 granted / 58 resolved; -8.9% vs Tech Center average)
Interview Lift: +37.9% on resolved cases with an interview (strong)
Average Prosecution Time: 3y 1m
Currently Pending: 38 applications
Career Total: 96 applications across all art units

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§102: 16.9% (-23.1% vs TC avg)
§103: 61.8% (+21.8% vs TC avg)
§112: 10.2% (-29.8% vs TC avg)

Tech Center averages are estimates. Based on career data from 58 resolved cases.

Office Action

Grounds: §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 07/23/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Response to Arguments

The objection to the drawings is withdrawn in view of the amendments. The invocation of 35 U.S.C. § 112(f) is maintained upon further review. Previously, in the Non-Final Office Action, claim 4 was rejected under 35 U.S.C. § 112(b) as indefinite; the claim 4 limitations have now been incorporated into claim 1, but those limitations have not been amended so as to overcome the indefiniteness. The Applicant's arguments filed on 07/29/2025 have been fully considered, but they are not persuasive. The rejections of the independent claims, which are based on Okahara in view of Hirayama, are maintained. The same reasoning applied to the independent claims also applies to their corresponding dependent claims.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
This application includes one or more claim limitations that do not use the word “means” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitations use a generic placeholder coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: the control unit in claims 1 and 11; the judgment unit in claims 1 and 11; the prediction unit in claims 1, 6, 7, 8, 9, 10, and 11; and the modification unit in claims 1 and 11.

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to avoid having them so interpreted (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1 and 6-14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Applicant provides the following limitation in claims 1 and 11: "the judgment unit in the first case in which the first distance is shorter than the first predetermined distance, transmits a second signal to the modification unit and judges whether a third distance between the position of the robot and the position of the person detected after transmitting the second signal to the modification unit is shorter than the first predetermined distance, and in a case in which the third distance is equal to or longer than the first predetermined distance, transmits a third signal to the modification unit, and the modification unit does not modify the movement trajectory information in a period from after receiving the second signal and until receiving the third signal", and in claim 13: “the first case in which the first distance is shorter than the first predetermined distance, transmits a second signal to the modification unit and judges whether a third distance between the position of the robot and the position of the person detected after transmitting the second signal to the modification unit is shorter than the first predetermined distance, and in a case in which the third distance is equal to or longer than the first predetermined distance, transmits a third signal to the modification unit, and the modification unit does not modify the movement trajectory information in a period from after receiving the second signal and until receiving the third signal”. Based on the claim language as currently presented, however, it is unclear what the metes and bounds of the claimed language encompass, and claims 1, 11, and 13 are therefore rendered indefinite. Accordingly, appropriate correction and/or clarification are earnestly solicited. Dependent claims 6-12 and 14 are also rejected by virtue of their dependency on rejected base claims 1 and 13.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office Action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 6, 9, and 11-14 are rejected under 35 U.S.C. 103 as being unpatentable over Okahara (JP2019206080A) in view of Hirayama et al. (JP2020046773A) (hereinafter Hirayama).
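The second/third-signal limitation at issue in the § 112(b) rejection describes, in effect, a small state machine: entering the first predetermined distance triggers the second signal and suspends trajectory modification, and moving back out triggers the third signal and resumes it. A minimal sketch under that reading follows; all names, the threshold value, and the control structure are hypothetical illustrations, not taken from the application's disclosure.

```python
# Hypothetical sketch of the disputed limitation, read as a hysteresis gate
# on trajectory modification. Illustrative only; names and the threshold
# are assumptions, not drawn from the application's actual disclosure.

FIRST_PREDETERMINED_DISTANCE = 1.0  # assumed units


class ModificationUnit:
    def __init__(self):
        self.suspended = False  # True between the second and third signals

    def receive_second_signal(self):
        self.suspended = True

    def receive_third_signal(self):
        self.suspended = False

    def maybe_modify(self, trajectory, modified_trajectory):
        # "does not modify the movement trajectory information in a period
        # from after receiving the second signal and until receiving the
        # third signal"
        return trajectory if self.suspended else modified_trajectory


def judgment_step(distance, mod_unit, in_first_case):
    """One detection cycle of the judgment unit; returns the new state."""
    if not in_first_case:
        if distance < FIRST_PREDETERMINED_DISTANCE:
            mod_unit.receive_second_signal()  # enter the "first case"
            return True
        return False
    # Already in the first case: the newly detected distance plays the
    # role of the claimed "third distance".
    if distance >= FIRST_PREDETERMINED_DISTANCE:
        mod_unit.receive_third_signal()  # person has moved away again
        return False
    return True
```

Whether the specification ties the claim language to a definite structure like this is precisely the metes-and-bounds question the examiner raises.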
Regarding Claim 1, Okahara teaches a control device comprising: a memory configured to store computer-executable instructions (See at least Para [0066] “The storage device 152 includes a ROM (Read Only Memory), a HDD (Hard Disk Drive), and the like.”); and a processor configured to execute the computer-executable instructions stored in the memory (See at least Para [0066] “…As shown in FIG. 9, part or all of the robot control device 1 specifically includes a CPU 151 (Central Processing Unit)”, Para [0067] “Each process of the robot control device 1 is executed by the CPU 151…”) to implement: a control unit configured to control a movement of a robot on the basis of movement trajectory information indicating a movement trajectory of the robot (See at least Para [0040] “The robot operation control unit 108 outputs a control signal 10 to the robot 3 in each control cycle based on the command trajectory 1070.”, Para [0022] “FIG. 2 is a block diagram showing a configuration of the robot control device 1 shown in FIG. The robot control device 1 illustrated in FIG. 2 includes a hand information acquisition unit 101, a target point information acquisition unit 102, a person position information acquisition unit 103, a first trajectory calculation unit 104, a collision possibility estimation unit 105, and a correction necessity determination unit 106. , A trajectory correction unit 107, and a robot operation control unit 108.”); a judgment unit configured to judge whether a first distance between a position of the robot and a position of a person is shorter than a first predetermined distance (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030. 
Next, in step S32A, the robot operation control unit 108A calculates the possibility of collision at a preset cycle based on the current hand information 1010 and the human position information 1030 using the above-described collision evaluation index. Here, the current hand information 1010 and the person position information 1030 indicate the latest hand information 1010 and the person position information 1030 that have been acquired. In step S33A, robot operation control unit 108A determines whether or not the calculated collision probability is equal to or smaller than a threshold. If the possibility of collision is equal to or smaller than the threshold (YES) in step S33A, the robot operation control unit 108A generates the control signal 10. On the other hand, if the possibility of collision is larger than the threshold in step S33A (NO), the process proceeds to step S35A, where the robot operation control unit 108A generates a replan command 1080.”, Para [0050] “FIG. 6 is a block diagram illustrating an example of the configuration of the collision possibility estimation unit 105. 
The collision possibility estimation unit 105 includes a hand position acquisition unit 1051, an approach distance calculation unit 1052, a hand speed acquisition unit 1053, a relative speed calculation unit 1054, and a collision possibility calculation unit 1055.”, Para [0078] “According to the present embodiment, if the worker 5 is operating so as to approach the command trajectory of the robot 3 in order to determine the possibility of collision in consideration of the moving speed of the worker 5, Since the approach distance is shorter than in the first embodiment, the possibility of collision is calculated to be higher”), the position of the robot and the position of the person being detected by a sensor (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030…”, Para [0020] “Here, the position detection device 2 will be described in more detail. The position detection device 2 includes sensors such as a range sensor, an RGB-D (Red Green Blue-Depth) sensor, an ultrasonic sensor, and a capacitance sensor. 
An additional sensor such as a mat switch or a light curtain may be used to acquire the detection information 20 of an area that cannot be detected by the above-described sensor or to improve the detection accuracy of the worker 5.”); a prediction unit configured to predict a future position of the robot on a basis of the position of the robot and predict a future position of the person on the basis of the position of the person (See at least Para [0030] “Here, the first trajectory 1040 includes a hand position (predicted position), a predicted posture, and a hand position of the robot hand 3b at respective times (t = t1, t2,..., Tn-1, tn,...)…”); and a modification unit configured to judge whether a second distance between the future position of the robot and the future position of the person is shorter than a second predetermined distance and, in a case in which the second distance is shorter than the second predetermined distance, modify the movement trajectory information such that the second distance is equal to or longer than the second predetermined distance (See at least Para [0048] “In step S25, the trajectory correction unit 107 receives the input of the correction instruction signal 1060, corrects the first trajectory 1040 to a second trajectory having a lower possibility of collision, and sets the second trajectory as the command trajectory 1070.”), wherein … the sensor detects the position of the robot and the position of the person at regular or irregular intervals (See at least Para [0030] “Here, the first trajectory 1040 includes a hand position (predicted position), a predicted posture, and a hand position of the robot hand 3b at respective times (t = t1, t2,..., Tn-1, tn,...)”, Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030…”), the judgment unit judges whether the first distance is shorter than the first predetermined distance each time the 
position of the robot and the position of the person are detected (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030. Next, in step S32A, the robot operation control unit 108A calculates the possibility of collision at a preset cycle based on the current hand information 1010 and the human position information 1030 using the above-described collision evaluation index. Here, the current hand information 1010 and the person position information 1030 indicate the latest hand information 1010 and the person position information 1030 that have been acquired. In step S33A, robot operation control unit 108A determines whether or not the calculated collision probability is equal to or smaller than a threshold. If the possibility of collision is equal to or smaller than the threshold (YES) in step S33A, the robot operation control unit 108A generates the control signal 10. On the other hand, if the possibility of collision is larger than the threshold in step S33A (NO), the process proceeds to step S35A, where the robot operation control unit 108A generates a replan command 1080.”, Para [0050] “FIG. 6 is a block diagram illustrating an example of the configuration of the collision possibility estimation unit 105. 
The collision possibility estimation unit 105 includes a hand position acquisition unit 1051, an approach distance calculation unit 1052, a hand speed acquisition unit 1053, a relative speed calculation unit 1054, and a collision possibility calculation unit 1055.”, Para [0078] “According to the present embodiment, if the worker 5 is operating so as to approach the command trajectory of the robot 3 in order to determine the possibility of collision in consideration of the moving speed of the worker 5, Since the approach distance is shorter than in the first embodiment, the possibility of collision is calculated to be higher…”), the prediction unit predicts the future position of the robot and the future position of the person each time the position of the robot and the position of the person are detected (See at least Para [0030] “Here, the first trajectory 1040 includes a hand position (predicted position), a predicted posture, and a hand position of the robot hand 3b at respective times (t = t1, t2,..., Tn-1, tn,...)…”), … However, Okahara does not explicitly spell out … the control unit in a first case in which the first distance is shorter than the first predetermined distance, stops or decelerates the movement of the robot, regardless of whether the movement trajectory information is modified, and in a second case in which the first distance is equal to or longer than the first predetermined distance and the movement trajectory information is modified, controls the movement of the robot on the basis of the movement trajectory information after modification … the judgment unit in the first case in which the first distance is shorter than the first predetermined distance, transmits a second signal to the modification unit and judges whether a third distance between the position of the robot and the position of the person detected after transmitting the second signal to the modification unit is shorter than the first predetermined distance, and in a case in which 
the third distance is equal to or longer than the first predetermined distance, transmits a third signal to the modification unit, and the modification unit does not modify the movement trajectory information in a period from after receiving the second signal and until receiving the third signal. Hirayama teaches … the control unit in a first case in which the first distance is shorter than the first predetermined distance, stops or decelerates the movement of the robot, regardless of whether the movement trajectory information is modified (See at least Page 23 Para 2 “In the temporary stop function 113, the reverse trajectory generated by the reverse trajectory generation function 112 may cause the robot 100 to be unable to move due to the presence of a fixed obstacle on the route or the like. If it is impossible to avoid the trunk contact of the robot 100, an action of temporarily stopping the robot 100 is selected.”), and in a second case in which the first distance is equal to or longer than the first predetermined distance and the movement trajectory information is modified, controls the movement of the robot on the basis of the movement trajectory information after modification (See at least Page 7 Para 6 “The tuning action area A1 is an area where the determination relative distance LRH is within a predetermined boundary distance with respect to the person H. As shown in FIG. 7, the boundary distance is a critical avoidance distance LIL that is a shortest avoidance distance LPSE longer than the personal space critical distance LPS and a preset work preparation time Tla, as described later. . 
This critical avoidance distance LIL is obtained by the following equations (10) to (12).”, Page 8 Para 1 “The asserted action area A2 is longer than the critical avoidance distance LIL, which is a distance from the human H to the boundary of the synchronized action area A1, and the relative distance LRH for determination is the furthest distance at which the action of the robot 10 is effective”)… the judgment unit (See at least Page 21 Para 12 “As shown in FIG. 20, the interference time planning unit 103 includes an approach state determination unit 105 that determines whether the robot 100 at the current position is present in the personal space of the human, and a robot 100 and the human who Contact possibility judging unit 106 for judging the possibility of physical trunk contact, preparation time judging unit 107 for judging whether there is a work preparation time required for the robot 100 to work on humans, and an emergency collision An emergency avoidance action selection unit 108 that selects an emergency avoidance action that is an action of the robot 100 that performs an avoidance response, and a normal avoidance action selection unit that selects the normal avoidance action that is an action of the robot 100 that performs a collision avoidance response when it is not an emergency 109.”) in the first case in which the first distance is shorter than the first predetermined distance, transmits a second signal to the modification unit and judges whether a third distance between the position of the robot and the position of the person detected after transmitting the second signal to the modification unit is shorter than the first predetermined distance (See at least Page 22 Para “Further, when the reference distance Lb substituted into the above equation (17) is RLR + RLH which is the sum of the radii of the robot 10 and the personal space of the human, it is shorter than RLR + RLH and longer than the trunk width of the human. 
It is a predetermined distance. Accordingly, an approach trajectory is generated in which the robot 100 can pass between the robot 100 and the human without touching the human while the robot 100 enters the personal space of the human. As illustrated in FIG. 21, as the approach trajectory, the trajectory with the shorter moving distance is selected from the two types of approach paths PT1 so that the human H can pass on the left and right sides. .”), and in a case in which the third distance is equal to or longer than the first predetermined distance, transmits a third signal to the modification unit (See at least Page 24 Para 2 “That is, when the space beside the human has a width equal to or greater than the above-mentioned width when the robot 100 comes closest to the human, the through space is “present”, otherwise, the through space “absent”. It”), and the modification unit does not modify the movement trajectory information in a period from after receiving the second signal and until receiving the third signal (See at least Page 3 Para 12 “The achievement estimating means 24, the plan adjusting means 25 for changing the action plan of the robot 10 based on the estimation result of the achievement degree, and the robot 10 operating based on the determined action plan. In, and a operation command unit 26 for operation command to the operation unit 11”). 
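Read together, the proposed Okahara/Hirayama combination amounts to a two-threshold control cycle: the current-position distance gates Hirayama's stop-or-decelerate response, and the predicted-position distance gates Okahara's replanning. A minimal sketch of that combined loop, with hypothetical names, threshold values, and 1-D positions used purely for illustration:

```python
# Sketch of the control cycle the combination describes: stop/decelerate
# when the person is already too close, otherwise replan when the
# *predicted* positions come too close. Hypothetical names and thresholds.

FIRST_PREDETERMINED_DISTANCE = 1.0   # gate on current positions
SECOND_PREDETERMINED_DISTANCE = 0.5  # gate on predicted positions


def control_cycle(robot_pos, person_pos, predict, modify, trajectory):
    """One detection cycle; returns (action, trajectory)."""
    first_distance = abs(robot_pos - person_pos)
    if first_distance < FIRST_PREDETERMINED_DISTANCE:
        # First case: stop or decelerate, regardless of whether the
        # movement trajectory information is modified.
        return "stop_or_decelerate", trajectory

    second_distance = abs(predict(robot_pos) - predict(person_pos))
    if second_distance < SECOND_PREDETERMINED_DISTANCE:
        # Predicted positions too close: replan so that the predicted
        # gap is at least the second predetermined distance.
        trajectory = modify(trajectory)
        # Second case: follow the trajectory after modification.
        return "follow_modified", trajectory

    return "follow", trajectory
```

The claim's point of novelty, per the examiner's own mapping, is not this loop itself but the signal handshake that freezes `modify` while the person remains inside the first threshold.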
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Okahara with the teachings of Hirayama to include the features of: stopping or decelerating the movement of the robot in a case in which the first distance is shorter than the first predetermined distance; transmitting a second signal to the modification unit and judging whether a third distance between the position of the robot and the position of the person detected after transmitting the second signal to the modification unit is shorter than the first predetermined distance; in a case in which the third distance is equal to or longer than the first predetermined distance, transmitting a third signal to the modification unit; and the modification unit not modifying the movement trajectory information in a period from after receiving the second signal and until receiving the third signal, thereby enhancing safety around the robot (See at least Page 1 Para 9 “an object of the present invention is to select a trajectory with high movement efficiency from trajectory candidates for avoiding interference with a moving obstacle.”).

Regarding Claim 6, modified Okahara teaches all the elements of claim 1. Okahara further teaches the control device according to claim 1, wherein the prediction unit predicts the future position of the robot on the basis of a plurality of positions of the robot (See at least Para [0030] “Here, the first trajectory 1040 includes a hand position (predicted position), a predicted posture, and a hand position of the robot hand 3b at respective times (t = t1, t2,..., Tn-1, tn,...)…”).

Regarding Claim 9, modified Okahara teaches all the elements of claim 1.
Okahara further teaches the control device according to claim 1, wherein the prediction unit predicts the future position of the person on the basis of a plurality of positions of the person (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030…”, Para [0087] “In the flowchart of FIG. 16, when the replanning command is input, steps S25 to S3B are repeated until the possibility of collision of the command trajectory 1070 becomes equal to or less than the threshold.”, which discloses repeating the process and thus generating a plurality of positions of the person).

Regarding Claim 11, Okahara teaches a control device comprising: a memory configured to store computer-executable instructions (See at least Para [0066] “The storage device 152 includes a ROM (Read Only Memory), a HDD (Hard Disk Drive), and the like.”); and a processor configured to execute the computer-executable instructions stored in the memory (See at least Para [0066] “…As shown in FIG. 9, part or all of the robot control device 1 specifically includes a CPU 151 (Central Processing Unit)”, Para [0067] “Each process of the robot control device 1 is executed by the CPU 151…”) to implement: a control unit configured to control a movement of a robot on the basis of movement trajectory information indicating a movement trajectory of the robot (See at least Para [0040] “The robot operation control unit 108 outputs a control signal 10 to the robot 3 in each control cycle based on the command trajectory 1070.”, Para [0022] “FIG. 2 is a block diagram showing a configuration of the robot control device 1 shown in FIG. The robot control device 1 illustrated in FIG. 2 includes a hand information acquisition unit 101, a target point information acquisition unit 102, a person position information acquisition unit 103, a first trajectory calculation unit 104, a collision possibility estimation unit 105, and a correction necessity determination unit 106, a trajectory correction unit 107, and a robot operation control unit 108.”); a judgment unit configured to judge whether a first distance between a position of the robot and a position of a person is shorter than a first predetermined distance (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030. Next, in step S32A, the robot operation control unit 108A calculates the possibility of collision at a preset cycle based on the current hand information 1010 and the human position information 1030 using the above-described collision evaluation index. Here, the current hand information 1010 and the person position information 1030 indicate the latest hand information 1010 and the person position information 1030 that have been acquired. In step S33A, robot operation control unit 108A determines whether or not the calculated collision probability is equal to or smaller than a threshold. If the possibility of collision is equal to or smaller than the threshold (YES) in step S33A, the robot operation control unit 108A generates the control signal 10. On the other hand, if the possibility of collision is larger than the threshold in step S33A (NO), the process proceeds to step S35A, where the robot operation control unit 108A generates a replan command 1080.”, Para [0050] “FIG. 6 is a block diagram illustrating an example of the configuration of the collision possibility estimation unit 105.
The collision possibility estimation unit 105 includes a hand position acquisition unit 1051, an approach distance calculation unit 1052, a hand speed acquisition unit 1053, a relative speed calculation unit 1054, and a collision possibility calculation unit 1055.”, Para [0078] “According to the present embodiment, if the worker 5 is operating so as to approach the command trajectory of the robot 3 in order to determine the possibility of collision in consideration of the moving speed of the worker 5, Since the approach distance is shorter than in the first embodiment, the possibility of collision is calculated to be higher”), the position of the robot and the position of the person being detected by a sensor (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030…”, Para [0020] “Here, the position detection device 2 will be described in more detail. The position detection device 2 includes sensors such as a range sensor, an RGB-D (Red Green Blue-Depth) sensor, an ultrasonic sensor, and a capacitance sensor. 
An additional sensor such as a mat switch or a light curtain may be used to acquire the detection information 20 of an area that cannot be detected by the above-described sensor or to improve the detection accuracy of the worker 5.”); the first prediction unit configured to predict the future position of a robot on a basis of the position of the robot (See at least Para [0030] “Here, the first trajectory 1040 includes a hand position (predicted position), a predicted posture, and a hand position of the robot hand 3b at respective times (t = t1, t2,..., Tn-1, tn,...)…”); the second prediction unit configured to predict the future position of the person on a basis of the position of the person (See at least Para [0030] “Here, the first trajectory 1040 includes a hand position (predicted position), a predicted posture, and a hand position of the robot hand 3b at respective times (t = t1, t2,..., Tn-1, tn,...)…”), and a modification unit configured to judge whether a second distance between the future position of the robot and the future position of the person is shorter than a second predetermined distance and, in a case in which the second distance is shorter than the second predetermined distance, modify the movement trajectory information such that the second distance is equal to or longer than the second predetermined distance (See at least Para [0048] “In step S25, the trajectory correction unit 107 receives the input of the correction instruction signal 1060, corrects the first trajectory 1040 to a second trajectory having a lower possibility of collision, and sets the second trajectory as the command trajectory 1070.”), wherein … the sensor detects the position of the robot and the position of the person at regular or irregular intervals (See at least Para [0030] “Here, the first trajectory 1040 includes a hand position (predicted position), a predicted posture, and a hand position of the robot hand 3b at respective times (t = t1, t2,..., Tn-1, tn,...)”, Para 
[0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030…”), the judgment unit judges whether the first distance is shorter than the first predetermined distance each time the position of the robot and the position of the person are detected (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030. Next, in step S32A, the robot operation control unit 108A calculates the possibility of collision at a preset cycle based on the current hand information 1010 and the human position information 1030 using the above-described collision evaluation index. Here, the current hand information 1010 and the person position information 1030 indicate the latest hand information 1010 and the person position information 1030 that have been acquired. In step S33A, robot operation control unit 108A determines whether or not the calculated collision probability is equal to or smaller than a threshold. If the possibility of collision is equal to or smaller than the threshold (YES) in step S33A, the robot operation control unit 108A generates the control signal 10. On the other hand, if the possibility of collision is larger than the threshold in step S33A (NO), the process proceeds to step S35A, where the robot operation control unit 108A generates a replan command 1080.”, Para [0050] “FIG. 6 is a block diagram illustrating an example of the configuration of the collision possibility estimation unit 105. 
The collision possibility estimation unit 105 includes a hand position acquisition unit 1051, an approach distance calculation unit 1052, a hand speed acquisition unit 1053, a relative speed calculation unit 1054, and a collision possibility calculation unit 1055.”, Para [0078] “According to the present embodiment, if the worker 5 is operating so as to approach the command trajectory of the robot 3 in order to determine the possibility of collision in consideration of the moving speed of the worker 5, Since the approach distance is shorter than in the first embodiment, the possibility of collision is calculated to be higher…”), However, Okahara does not explicitly spell out … the control unit in a first case in which the first distance is shorter than the first predetermined distance, stops or decelerates the movement of the robot, regardless of whether the movement trajectory information is modified, and in a second case in which the first distance is equal to or longer than the first predetermined distance and the movement trajectory information is modified, controls the movement of the robot on the basis of the movement trajectory information after modification … the judgment unit in the first case in which the first distance is shorter than the first predetermined distance, transmits a second signal to the modification unit and judges whether a third distance between the position of the robot and the position of the person detected after transmitting the second signal to the modification unit is shorter than the first predetermined distance, and in a case in which the third distance is equal to or longer than the first predetermined distance, transmits a third signal to the modification unit, and the modification unit does not modify the movement trajectory information in a period from after receiving the second signal and until receiving the third signal. 
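For orientation only — this is one reading of the claim language at issue, not text from Okahara or Hirayama, and every identifier below is hypothetical — the two-threshold scheme recited in the claim can be sketched as a per-detection control step:

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def control_step(robot_pos, person_pos, robot_future, person_future,
                 trajectory, first_dist, second_dist, replan):
    """One detection cycle of the claimed two-threshold scheme.

    first_dist:  first predetermined distance, compared against the
                 measured robot-person separation.
    second_dist: second predetermined distance, compared against the
                 predicted (future) separation.
    replan:      hypothetical helper returning a modified trajectory
                 whose predicted separation is >= second_dist.
    """
    # First case: measured separation below the first threshold ->
    # stop or decelerate, regardless of any trajectory modification.
    if dist(robot_pos, person_pos) < first_dist:
        return "STOP", trajectory

    # Modification unit: if the predicted separation falls below the
    # second threshold, modify the movement trajectory information.
    if dist(robot_future, person_future) < second_dist:
        trajectory = replan(trajectory)

    # Second case: separation adequate -> follow the (possibly
    # modified) trajectory.
    return "FOLLOW", trajectory
```

Each call corresponds to one sensor detection, so the first-distance judgment and the future-position check both run at the "regular or irregular intervals" recited in the claim.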
Hirayama teaches … the control unit in a first case in which the first distance is shorter than the first predetermined distance, stops or decelerates the movement of the robot, regardless of whether the movement trajectory information is modified (See at least Page 23 Para 2 “In the temporary stop function 113, the reverse trajectory generated by the reverse trajectory generation function 112 may cause the robot 100 to be unable to move due to the presence of a fixed obstacle on the route or the like. If it is impossible to avoid the trunk contact of the robot 100, an action of temporarily stopping the robot 100 is selected.”), and in a second case in which the first distance is equal to or longer than the first predetermined distance and the movement trajectory information is modified, controls the movement of the robot on the basis of the movement trajectory information after modification (See at least Page 7 Para 6 “The tuning action area A1 is an area where the determination relative distance LRH is within a predetermined boundary distance with respect to the person H. As shown in FIG. 7, the boundary distance is a critical avoidance distance LIL that is a shortest avoidance distance LPSE longer than the personal space critical distance LPS and a preset work preparation time Tla, as described later. . This critical avoidance distance LIL is obtained by the following equations (10) to (12).”, Page 8 Para 1 “The asserted action area A2 is longer than the critical avoidance distance LIL, which is a distance from the human H to the boundary of the synchronized action area A1, and the relative distance LRH for determination is the furthest distance at which the action of the robot 10 is effective”)… the judgment unit (See at least Page 21 Para 12 “As shown in FIG. 
20, the interference time planning unit 103 includes an approach state determination unit 105 that determines whether the robot 100 at the current position is present in the personal space of the human, and a robot 100 and the human who Contact possibility judging unit 106 for judging the possibility of physical trunk contact, preparation time judging unit 107 for judging whether there is a work preparation time required for the robot 100 to work on humans, and an emergency collision An emergency avoidance action selection unit 108 that selects an emergency avoidance action that is an action of the robot 100 that performs an avoidance response, and a normal avoidance action selection unit that selects the normal avoidance action that is an action of the robot 100 that performs a collision avoidance response when it is not an emergency 109.”) in the first case in which the first distance is shorter than the first predetermined distance, transmits a second signal to the modification unit and judges whether a third distance between the position of the robot and the position of the person detected after transmitting the second signal to the modification unit is shorter than the first predetermined distance (See at least Page 22 Para “Further, when the reference distance Lb substituted into the above equation (17) is RLR + RLH which is the sum of the radii of the robot 10 and the personal space of the human, it is shorter than RLR + RLH and longer than the trunk width of the human. It is a predetermined distance. Accordingly, an approach trajectory is generated in which the robot 100 can pass between the robot 100 and the human without touching the human while the robot 100 enters the personal space of the human. As illustrated in FIG. 21, as the approach trajectory, the trajectory with the shorter moving distance is selected from the two types of approach paths PT1 so that the human H can pass on the left and right sides. 
.”), and in a case in which the third distance is equal to or longer than the first predetermined distance, transmits a third signal to the modification unit (See at least Page 24 Para 2 “That is, when the space beside the human has a width equal to or greater than the above-mentioned width when the robot 100 comes closest to the human, the through space is “present”, otherwise, the through space “absent”. It”), and the modification unit does not modify the movement trajectory information in a period from after receiving the second signal and until receiving the third signal (See at least Page 3 Para 12 “The achievement estimating means 24, the plan adjusting means 25 for changing the action plan of the robot 10 based on the estimation result of the achievement degree, and the robot 10 operating based on the determined action plan. In, and a operation command unit 26 for operation command to the operation unit 11”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Okahara with the teachings of Hirayama to include the features of stopping or decelerating the movement of the robot in a case in which the first distance is shorter than the first predetermined distance, transmitting a second signal to the modification unit and judging whether a third distance between the position of the robot and the position of the person detected after transmitting the second signal to the modification unit is shorter than the first predetermined distance, transmitting a third signal to the modification unit in a case in which the third distance is equal to or longer than the first predetermined distance, and not modifying the movement trajectory information in a period from after receiving the second signal until receiving the third signal, thereby enhancing safety around the robot (See at least Page 1 Para 9 “an object of the present invention is to
select a trajectory with high movement efficiency from trajectory candidates for avoiding interference with a moving obstacle.”). Regarding Claim 12, modified Okahara teaches all the elements of claim 11. Okahara further teaches a control system comprising: the control device according to claim 1 (See at least Para [0040] “The robot operation control unit 108 outputs a control signal 10 to the robot 3 in each control cycle based on the command trajectory 1070.”, Para [0022] “FIG. 2 is a block diagram showing a configuration of the robot control device 1 shown in FIG. The robot control device 1 illustrated in FIG. 2 includes a hand information acquisition unit 101, a target point information acquisition unit 102, a person position information acquisition unit 103, a first trajectory calculation unit 104, a collision possibility estimation unit 105, and a correction necessity determination unit 106. , A trajectory correction unit 107, and a robot operation control unit 108.”, Examiner notes see also claim 1); the robot (See at least Para [0040] “The robot operation control unit 108 outputs a control signal 10 to the robot 3 in each control cycle based on the command trajectory 1070.”); and the sensor (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030…”, Para [0020] “Here, the position detection device 2 will be described in more detail. The position detection device 2 includes sensors such as a range sensor, an RGB-D (Red Green Blue-Depth) sensor, an ultrasonic sensor, and a capacitance sensor. An additional sensor such as a mat switch or a light curtain may be used to acquire the detection information 20 of an area that cannot be detected by the above-described sensor or to improve the detection accuracy of the worker 5.”). 
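The second-signal/third-signal interplay mapped above behaves like a latch: trajectory modification is held off from the moment the judgment unit reports the too-close condition until it reports that separation has been restored. A minimal sketch under that reading — all names are hypothetical and not drawn from either reference:

```python
import math

class ModificationUnit:
    """Holds trajectory modification between the second and third signals."""

    def __init__(self):
        self.suspended = False  # True between second and third signals

    def second_signal(self):
        # First case: first distance < first predetermined distance.
        self.suspended = True

    def third_signal(self):
        # Third distance >= first predetermined distance again.
        self.suspended = False

    def modify(self, trajectory, replan):
        # No modification in the period between the two signals.
        if self.suspended:
            return trajectory
        return replan(trajectory)

def judgment_step(robot_pos, person_pos, first_dist, unit):
    """Judgment-unit side: emit the appropriate signal each detection."""
    if math.dist(robot_pos, person_pos) < first_dist:
        unit.second_signal()
    elif unit.suspended:
        # Detected after the second signal and now clear of the threshold.
        unit.third_signal()
```

On this reading, the robot is stopped or decelerated while `suspended` is set, and replanning resumes only after the third signal clears the latch.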
Regarding Claim 13, Okahara teaches a control method executed by a computer, the control method comprising: a control step of controlling a movement of a robot on a basis of movement trajectory information indicating a movement trajectory of the robot (See at least Para [0040] “The robot operation control unit 108 outputs a control signal 10 to the robot 3 in each control cycle based on the command trajectory 1070.”, Para [0022] “FIG. 2 is a block diagram showing a configuration of the robot control device 1 shown in FIG. The robot control device 1 illustrated in FIG. 2 includes a hand information acquisition unit 101, a target point information acquisition unit 102, a person position information acquisition unit 103, a first trajectory calculation unit 104, a collision possibility estimation unit 105, and a correction necessity determination unit 106. , A trajectory correction unit 107, and a robot operation control unit 108.”); a judgment step of judging whether a first distance between a position of the robot and a position of a person is shorter than a first predetermined distance (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030. Next, in step S32A, the robot operation control unit 108A calculates the possibility of collision at a preset cycle based on the current hand information 1010 and the human position information 1030 using the above-described collision evaluation index. Here, the current hand information 1010 and the person position information 1030 indicate the latest hand information 1010 and the person position information 1030 that have been acquired. In step S33A, robot operation control unit 108A determines whether or not the calculated collision probability is equal to or smaller than a threshold. 
If the possibility of collision is equal to or smaller than the threshold (YES) in step S33A, the robot operation control unit 108A generates the control signal 10. On the other hand, if the possibility of collision is larger than the threshold in step S33A (NO), the process proceeds to step S35A, where the robot operation control unit 108A generates a replan command 1080.”, Para [0050] “FIG. 6 is a block diagram illustrating an example of the configuration of the collision possibility estimation unit 105. The collision possibility estimation unit 105 includes a hand position acquisition unit 1051, an approach distance calculation unit 1052, a hand speed acquisition unit 1053, a relative speed calculation unit 1054, and a collision possibility calculation unit 1055.”, Para [0078] “According to the present embodiment, if the worker 5 is operating so as to approach the command trajectory of the robot 3 in order to determine the possibility of collision in consideration of the moving speed of the worker 5, Since the approach distance is shorter than in the first embodiment, the possibility of collision is calculated to be higher…”), the position of the robot and the position of the person being detected by a sensor (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030…”, Para [0020] “Here, the position detection device 2 will be described in more detail. The position detection device 2 includes sensors such as a range sensor, an RGB-D (Red Green Blue-Depth) sensor, an ultrasonic sensor, and a capacitance sensor.
An additional sensor such as a mat switch or a light curtain may be used to acquire the detection information 20 of an area that cannot be detected by the above-described sensor or to improve the detection accuracy of the worker 5.”), the positions being detected by a detection unit; a prediction step of predicting a future position of the robot on the basis of the position of the robot and predicting a future position of the person on a basis of the position of the person (See at least Para [0030] “Here, the first trajectory 1040 includes a hand position (predicted position), a predicted posture, and a hand position of the robot hand 3b at respective times (t = t1, t2,..., Tn-1, tn,...)…”); and a modification step of judging whether a second distance between the future position of the robot and the future position of the person is shorter than a second predetermined distance and, in a case in which the second distance is shorter than the second predetermined distance, modifying the movement trajectory information such that the second distance is equal to or longer than the second predetermined distance (See at least Para [0048] “In step S25, the trajectory correction unit 107 receives the input of the correction instruction signal 1060, corrects the first trajectory 1040 to a second trajectory having a lower possibility of collision, and sets the second trajectory as the command trajectory 1070.”), wherein … the sensor detects the position of the robot and the position of the person at regular or irregular intervals (See at least Para [0030] “Here, the first trajectory 1040 includes a hand position (predicted position), a predicted posture, and a hand position of the robot hand 3b at respective times (t = t1, t2,..., Tn-1, tn,...)”, Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030…”), whether the first distance is shorter than the first predetermined distance 
is judged each time the position of the robot and the position of the person are detected (See at least Para [0083] “First, in step S31A, the robot operation control unit 108A acquires the current hand information 1010 and the person position information 1030. Next, in step S32A, the robot operation control unit 108A calculates the possibility of collision at a preset cycle based on the current hand information 1010 and the human position information 1030 using the above-described collision evaluation index. Here, the current hand information 1010 and the person position information 1030 indicate the latest hand information 1010 and the person position information 1030 that have been acquired. In step S33A, robot operation control unit 108A determines whether or not the calculated collision probability is equal to or smaller than a threshold. If the possibility of collision is equal to or smaller than the threshold (YES) in step S33A, the robot operation control unit 108A generates the control signal 10. On the other hand, if the possibility of collision is larger than the threshold in step S33A (NO), the process proceeds to step S35A, where the robot operation control unit 108A generates a replan command 1080.”, Para [0050] “FIG. 6 is a block diagram illustrating an example of the configuration of the collision possibility estimation unit 105. 
The collision possibility estimation unit 105 includes a hand position acquisition unit 1051, an approach distance calculation unit 1052, a hand speed acquisition unit 1053, a relative speed calculation unit 1054, and a collision possibility calculation unit 1055.”, Para [0078] “According to the present embodiment, if the worker 5 is operating so as to approach the command trajectory of the robot 3 in order to determine the possibility of collision in consideration of the moving speed of the worker 5, Since the approach distance is shorter than in the first embodiment, the possibility of collision is calculated to be higher…”), the future position of the robot and the future position of the person are predicted each time the position of the robot and the position of the person are detected (See at least Para [0030] “Here, the first trajectory 1040 includes a hand position (predicted position), a predicted posture, and a hand position of the robot hand 3b at respective times (t = t1, t2,..., Tn-1, tn,...)…”), However, Okahara does not explicitly spell out … in the control step, in a first case in which the first distance is shorter than the first predetermined distance, stops or decelerates the movement of the robot, regardless of whether the movement trajectory information is modified, and in a second case in which the first distance is equal to or longer than the first predetermined distance and the movement trajectory information is modified, controls the movement of the robot on the basis of the movement trajectory info

Prosecution Timeline

Aug 22, 2023
Application Filed
May 01, 2025
Non-Final Rejection — §103, §112
Jul 29, 2025
Response Filed
Oct 05, 2025
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12569992
AUTOMATIC DETERMINATION OF ROBOT SETTLING STATES
2y 5m to grant Granted Mar 10, 2026
Patent 12539597
ROBOT SYSTEM, AND CONTROL METHOD FOR SAME
2y 5m to grant Granted Feb 03, 2026
Patent 12514143
AGRICULTURAL MACHINE, AGRICULTURAL WORK ASSISTANCE APPARATUS, AND AGRICULTURAL WORK ASSISTANCE SYSTEM
2y 5m to grant Granted Jan 06, 2026
Patent 12485538
METHOD AND SYSTEM FOR DETERMINING A WORKPIECE LOADING LOCATION IN A CNC MACHINE WITH A ROBOTIC ARM
2y 5m to grant Granted Dec 02, 2025
Patent 12479107
METHOD AND AN ASSEMBLY UNIT FOR PERFORMING ASSEMBLING OPERATIONS
2y 5m to grant Granted Nov 25, 2025
Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
43%
Grant Probability
81%
With Interview (+37.9%)
3y 1m
Median Time to Grant
Moderate
PTA Risk
Based on 58 resolved cases by this examiner. Grant probability derived from career allow rate.
