Prosecution Insights
Last updated: April 19, 2026
Application No. 18/010,571

ROBOT CONTROL DEVICE

Non-Final OA §103
Filed: Dec 15, 2022
Examiner: MILES, JONATHAN WADE
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Fanuc Corporation
OA Round: 3 (Non-Final)

Grant Probability: 70% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 70% (406 granted / 578 resolved; +18.2% vs TC avg, above average)
Interview Lift: +48.1% (strong; grant rate of resolved cases with vs. without interview)
Avg Prosecution: 3y 2m typical timeline (5 applications currently pending)
Total Applications: 583 (across all art units)
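The headline figures above follow directly from the career counts. A minimal sketch, assuming the tool computes the allow rate as granted divided by resolved and the application total as resolved plus pending (the tool's exact methodology is not published on this page):

```python
# Reproduce the examiner's headline stats from the career counts shown above.
# Assumption: "Career Allow Rate" = granted / resolved, rounded for display.
granted = 406
resolved = 578
pending = 5

allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.1f}%")  # ~70.2%, shown as 70%

total_applications = resolved + pending
print(f"Total applications: {total_applications}")  # 583, matching the card
```

The counts are internally consistent: 578 resolved plus 5 pending equals the 583 total applications reported.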

Statute-Specific Performance

§101: 0.6% (-39.4% vs TC avg)
§103: 38.1% (-1.9% vs TC avg)
§102: 33.1% (-6.9% vs TC avg)
§112: 21.8% (-18.2% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 578 resolved cases
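The per-statute deltas are internally consistent with a single Tech Center average. A quick check, assuming each delta is the examiner's rate minus the TC average (the page does not define the delta explicitly):

```python
# Recover the implied Tech Center average from each statute's examiner rate
# and its "vs TC avg" delta (assumption: delta = examiner rate - TC average).
rates = {"101": (0.6, -39.4), "103": (38.1, -1.9),
         "102": (33.1, -6.9), "112": (21.8, -18.2)}

for statute, (examiner_rate, delta) in rates.items():
    tc_avg = examiner_rate - delta
    print(f"§{statute}: implied TC average = {tc_avg:.1f}%")
# Every statute implies the same 40.0% TC average, consistent with the single
# "Tech Center average estimate" line drawn on the chart.
```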

Office Action

§103
DETAILED ACTION

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on December 12, 2025, has been entered.

Response to Amendment

This Office action is in response to the claims filed December 12, 2025. Claims 1, 6, and 7 are amended, and claims 1-7 are pending and addressed below.

Response to Arguments

Applicant's arguments are directed towards the claims as amended. The amendments have necessitated new grounds of rejection as detailed below.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-2 are rejected under 35 U.S.C. 103 as being unpatentable over Kamiya et al. ("Kamiya", US 20110270443) in view of Hsu et al. ("Hsu", TW1753209; citations made to the specification of the provided English translation; figures may be found in the original filing).

Regarding claim 1, Kamiya teaches a robot control device (see Kamiya at least fig. 1; robot controller 5) comprising: a memory configured to store a program (54); and a processor configured to execute the program ([0042]: "The robot controller 5 can execute a function that is required for causing the robot 1 to detect the position of the workpiece 2.") and control the robot control device to: acquire force data indicating external force to a tool attached to a robot as detected by a sensor disposed on the robot (see Kamiya at least par. 39; "external force detector 3 detects an external force applied to the probe 4"); calculate a point of action of the external force based on the force data acquired (see Kamiya at least par. 42; "contact-position calculating unit 55 calculates, when it is determined that the probe 4 is in the contact state by the external-force-value monitoring unit 52, an actual contact position based on the external force value"); and set the point of action of the external force as a tool tip point of the robot (see Kamiya at least par. 57; "probe-end-position calculating unit 53 outputs a probe end position calculated at the time when the current conduction, that is, the contact, is detected by the conduction monitoring unit as the actual contact position"); but fails to disclose wherein the external force is external forces in two directions, that differ from each other, applied by a user to a tool attached to a robot as detected by a sensor disposed on the robot without operating the robot.
However, Hsu discloses setting the point of action of the external force as a tool tip point based on data indicating external forces in two directions, that differ from each other (abstract and throughout the document; "contact force and contact torque"), applied by a user to a tool attached to a robot (see Fig. 3) as detected by a sensor disposed on the robot without operating the robot (Page 2 further discloses either touching the tool tip with the user's finger as shown in Fig. 3, or driving, i.e., operating the robot arm to touch the tool tip to an object as shown in Fig. 4, and both methods result in calibrating the tool center point. Because Hsu mentions no driving or operating in conjunction with touching the tool tip with the user's finger and differentiates touching the tool tip to an object by driving the robot arm, one having ordinary skill in the art will conclude that the driving or operation of the robot arm is not occurring in Fig. 3 when touched by the user's finger.)

It would have been obvious to one having ordinary skill in the art at the time of the applicant's effective filing date to combine the tool center point calibration taught by Hsu with the robot control device of Kamiya because Hsu discloses both touching the tool tip with the user's finger and operating the robot arm to touch the tool tip as viable methods for calibrating the tool tip. Therefore, it would have been obvious to calibrate the tool tip using the user's finger to touch the tip, as it would save wear and tear on the mechanics of the robot arm by reducing movements needed to achieve proper calibration and would also save energy, as extra energy is not consumed to move the robot arm into contact with the object in order to calibrate the tool tip.
Regarding claim 2, Kamiya in view of Hsu already teaches the robot control device according to claim 1, and Hsu further teaches wherein the sensor is a torque sensor (abstract; the sensor detects torque of the tool and torque of the contact force, and thus is a torque sensor).

Claims 3 and 5 are rejected under 35 U.S.C. 103 as being unpatentable over Kamiya in view of Hsu and further in view of Arata et al. (US 20080004633).

Regarding claim 3, Kamiya in view of Hsu already teaches the robot control device according to claim 1. Additionally, Kamiya teaches wherein the memory stores the point of action calculated ([0042]: "recording unit 54 records therein the value of the force detected by the external force detector 3 and the position of the probe calculated by the probe-end-position calculating unit 53"), and wherein… when the processor is storing two points of action (see Kamiya at least par. 51; "obtain the actual contact position at Steps S82 and S83 in the process"), Kamiya does not explicitly teach the configuration unit sets… a midpoint on a straight line connecting the two points of action as the tool tip point. However, Arata, who discloses calibrating a tip position, teaches the configuration unit sets… a midpoint on a straight line connecting the two points of action as the tool tip point (see Arata at least par. 122; "take a number of burr tip positions, calculate the average of the positions"). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the device of claim 1 by calculating an average of the tip positions in order to "make the verification process as accurate as possible" (Arata at least par. 122).

Regarding claim 5, Kamiya in view of Hsu already teaches the robot control device according to claim 1. Additionally, Kamiya teaches wherein the memory stores the point of action calculated (see Kamiya at least par. 42; "recording unit 54 records therein the value of the force detected by the external force detector 3 and the position of the probe calculated by the probe-end-position calculating unit 53"). Kamiya does not explicitly teach wherein the processor sets, when the memory is storing three or more points of action as a plurality of points of action, a midpoint in a polygonal shape formed by connecting the plurality of points of action as the tool tip point. However, Arata, who discloses calibrating a tip position, teaches wherein the processor sets, when the memory is storing three or more points of action as a plurality of points of action, a midpoint in a polygonal shape formed by connecting the plurality of points of action as the tool tip point (see Arata at least par. 122, par. 89, and fig. 18; "take a number of burr tip positions, calculate the average of the positions"; "displays the acquired tip positions on the screen 86g as dots 900"). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the device of claim 1 by calculating an average of the tip positions as is done in Arata in order to "make the verification process as accurate as possible" (Arata at least par. 122).

Claims 4, 6, and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Kamiya in view of Hsu and Arata, and further in view of Ikeda et al. (US 6522949).

Regarding claim 4, Kamiya in view of Hsu and Arata already teaches the robot control device according to claim 3, and Arata further teaches a display unit configured to display a screen (see Arata at least fig. 19 and fig. 21; screen [86h] shows dots [900] and screen [186a] shows the tool). Additionally, Ikeda, who discloses a robot controller for correcting a position, teaches indicating a positional relationship between the straight line connecting the two points of action and the robot (see Ikeda at least fig. 5A; tool [18] with points [P11, P12, P13…] and lines); and an input unit (see Ikeda at least fig. 2; input unit [17]) configured to designate a desired position on the straight line displayed on the screen (see Ikeda at least fig. 5A, fig. 8A, col. 4 lines 61-64, and col. 6 lines 55-57; "Supposing the actual position of block 402' to be the position indicated by broken line in FIG. 5B, the teaching coordinates of P33 shown in FIG. 5A must be corrected to the position of P33' shown in FIG. 5B"; "operator selects a corresponding welding line, and further selects a welding point from the selected welding line"). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the device of claim 3 by using a screen as is done in Arata and having an operator select points on a line as is done in Ikeda in order to "guide the user manipulating the haptic device 30 through the procedure" (Arata par. 78) and to "correct position of teaching points" (Ikeda col. 6 line 29).

Regarding claim 6, Kamiya in view of Hsu and Arata already teaches the robot control device according to claim 5, and Arata teaches a processor displaying a screen (see Arata at least fig. 19 and fig. 21; screen [86h] shows dots [900] and screen [186a] shows the tool). Additionally, Ikeda, who discloses a robot controller for correcting a position, discloses indicating a positional relationship between the polygonal shape formed by connecting the plurality of points of action and the robot (see Ikeda at least fig. 5A; tool [18] with points [P11, P12, P13…] and lines; examiner note: the series of lines connecting the points shown in fig. 5A is analogous to a polygonal shape); and wherein the processor receives a designation of an input unit (see Ikeda at least fig. 2; input unit [17]) configured to designate a desired position in the polygonal shape displayed on the screen (see Ikeda at least fig. 5A, fig. 8A, col. 4 lines 61-64, and col. 6 lines 55-57; "Supposing the actual position of block 402' to be the position indicated by broken line in FIG. 5B, the teaching coordinates of P33 shown in FIG. 5A must be corrected to the position of P33' shown in FIG. 5B"; "operator selects a corresponding welding line, and further selects a welding point from the selected welding line"). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the device of claim 5 by using a screen as is done in Arata and having an operator select points on lines as is done in Ikeda in order to "guide the user manipulating the haptic device 30 through the procedure" (Arata par. 78) and to "correct position of teaching points" (Ikeda col. 6 line 29).

Regarding claim 7, Kamiya in view of Hsu already teaches the robot control device according to claim 1, and Kamiya further teaches the processor calculates a straight line passing through the point of action of the external force (see Kamiya at least par. 42; "contact-position calculating unit 55 calculates, when it is determined that the probe 4 is in the contact state by the external-force-value monitoring unit 52, an actual contact position based on the external force value"). Kamiya in view of Hsu does not explicitly teach a display; and wherein the processor: displays a screen indicating a positional relationship between the straight line and the robot, receives a designation of a desired position on the straight line displayed on the screen, and sets the designated desired position as the tool tip point of the robot. However, Arata, who discloses calibrating a tip position, teaches a processor displaying a screen (see Arata at least fig. 19 and fig. 21; screen [86h] shows dots [900] and screen [186a] shows the tool)… Additionally, Ikeda, who discloses a robot controller for correcting a position, discloses an input unit (see Ikeda at least fig. 2; input unit [17]), wherein a processor calculates a straight line passing through a point of action (see Ikeda at least fig. 5A, col. 4 lines 42-44; "track passing through the welding start point P12, welding middle point P13, and welding end point P14 is called a first welding line")…, … indicating a positional relationship between the straight line and the robot (see Ikeda at least fig. 5A; tool [18] with welding lines), receiving a designation of a desired position on the straight line displayed on the screen (see Ikeda at least fig. 5A, fig. 8A, col. 4 lines 61-64, and col. 6 lines 55-57; "Supposing the actual position of block 402' to be the position indicated by broken line in FIG. 5B, the teaching coordinates of P33 shown in FIG. 5A must be corrected to the position of P33' shown in FIG. 5B"; "operator selects a corresponding welding line, and further selects a welding point from the selected welding line"), and sets the designated desired position as the tool tip point of the robot (see Ikeda at least col. 3 lines 64-66; "creates a motion track of the RM 11 based on the teaching data, and sends an operation instruction to the servo 14 based on the created motion track"). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the device of claim 1 by using a screen as is done in Arata and having an operator select points on a line as is done in Ikeda in order to "guide the user manipulating the haptic device 30 through the procedure" (Arata par. 78) and to "correct position of teaching points" (Ikeda col. 6 line 29).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Wade Miles, whose telephone number is (571) 270-7777. The examiner can normally be reached Monday-Friday 10:00 am - 7:00 pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/WADE MILES/
Supervisory Patent Examiner, Art Unit 3656

Prosecution Timeline

Dec 15, 2022: Application Filed
Feb 24, 2025: Non-Final Rejection — §103
Jun 03, 2025: Response Filed
Sep 28, 2025: Final Rejection — §103
Dec 12, 2025: Request for Continued Examination
Dec 20, 2025: Response after Non-Final Action
Mar 20, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12446892: DUAL SIDE SPRING V-CLIP FOR SURGICAL TREATMENT OF LEFT ATRIAL APPENDAGE (granted Oct 21, 2025; 2y 5m to grant)
Patent 12193935: PULL-THROUGH CHORDAE TENDINEAE SYSTEM (granted Jan 14, 2025; 2y 5m to grant)
Patent 12193680: OCCLUSION CLIP (granted Jan 14, 2025; 2y 5m to grant)
Patent 12185959: ASPIRATION THROMBECTOMY SYSTEM AND METHODS FOR THROMBUS REMOVAL WITH ASPIRATION CATHETER (granted Jan 07, 2025; 2y 5m to grant)
Patent 12161343: Implant Detachment (granted Dec 10, 2024; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 70%
With Interview: 99% (+48.1%)
Median Time to Grant: 3y 2m
PTA Risk: High
Based on 578 resolved cases by this examiner. Grant probability derived from career allow rate.
