Prosecution Insights
Last updated: April 19, 2026
Application No. 18/856,925

REMOTE CONTROL SYSTEM, ROBOT REMOTE CONTROL METHOD, AND REMOTE CONTROL PROGRAM

Non-Final OA: §102, §103, §112
Filed: Oct 15, 2024
Examiner: LE, TIEN MINH
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Kawasaki Jukogyo Kabushiki Kaisha
OA Round: 1 (Non-Final)
Grant Probability: 68% (Favorable)
OA Rounds: 1-2
Time to Grant: 2y 12m
With Interview: 92%

Examiner Intelligence

Career Allow Rate: 68%, above average (55 granted / 81 resolved; +15.9% vs TC avg)
Interview Lift: +23.8%, strong (resolved cases with interview)
Avg Prosecution: 2y 12m (typical timeline)
Currently Pending: 30
Career History: 111 total applications (across all art units)
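The headline figures in this panel are simple ratios over the examiner's resolved cases. A minimal sketch of the arithmetic, assuming only the counts shown above (the function names and the with/without-interview split are illustrative, not taken from any real tool):

```python
# Hypothetical sketch of how the dashboard's headline examiner metrics
# could be computed from raw outcome counts; names are illustrative.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate: share of resolved applications that were granted."""
    return granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Allow-rate difference between interviewed and non-interviewed cases."""
    return rate_with - rate_without

# Figures shown above: 55 granted out of 81 resolved -> ~68%.
print(f"{allow_rate(55, 81):.1%}")  # 67.9%
```

The +23.8% interview lift would be the same kind of difference, taken between the allow rates of resolved cases with and without an examiner interview.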

Statute-Specific Performance

§101: 8.1% (-31.9% vs TC avg)
§103: 51.7% (+11.7% vs TC avg)
§102: 18.5% (-21.5% vs TC avg)
§112: 18.8% (-21.2% vs TC avg)

Baseline: Tech Center average estimate • Based on career data from 81 resolved cases

Office Action

§102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-11 as originally filed are pending and have been considered as follows.

Priority

1. Acknowledgement is made that the present application is a national phase conversion of PCT/JP2023/015227, filed on 04/14/2023, which claims priority to foreign application JP2022-067838, filed on 04/15/2022.

Information Disclosure Statement

2. The applicant filed an IDS on 10/15/2024. It has been annotated and considered.

Claim Interpretation

3. The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

4. The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 5. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “haptic stimulator” in claims 1, 2, 4, 5, 6, 9, 10, and 11. Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. A review of the specification shows the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph limitations: “haptic stimulator” in claims 1, 2, 4, 5, 6, 9, 10, and 11 corresponds to “a haptic stimulator 8” [0020] and Fig. 1. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

6. The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

7. Claims 4 and 11 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding claim 4, it is unclear what is meant by the phrase “the number of types of contact occurred…”. It is unclear when a “number of types of contact occurred…” was first introduced. For examination purposes, the examiner is interpreting “the number of types of contact occurred…” as “a number of types of contact occurred…”.

Regarding claim 11, it is unclear what is meant by the phrase “a user's sense of touch if the robot…”. It is unclear if this is referring to a new user's sense of touch or the same user's sense of touch introduced earlier in the claim. For examination purposes, the examiner is interpreting “a user's sense of touch if the robot…” as “the user's sense of touch if the robot …”.
In the art rejection below, the claims have been treated as best understood by the examiner. Any claim not explicitly rejected under this heading is rejected as being dependent on an indefinite claim.

Claim Rejections - 35 USC § 102

8. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

9. Claims 1-3, 5-7, and 9-11 are rejected under 35 U.S.C. 102(a)(2)/(a)(1) as being anticipated by Rossano et al. (US 20180154518, hereinafter Rossano).

Regarding claim 1, Rossano teaches a remote control system (see at least Abstract: “A teleoperated robotic system that utilizes a graphical user interface (GUI) to perform work on a workpiece(s) using a robot.”) comprising: a robot (see at least Fig.
1 and [0019]: “According to certain embodiments, the robot station 102 includes one or more robots 106 having one or more degrees of freedom.”); an operator that receives input from a user (see at least [0025]: “The at least one teleoperation member 122 can be structured to receive commands or other input information from the operator of the teleoperation member 122, including, but not limited to, commands provided through the physical engagement and/or movement of at least a portion of the teleoperation member 122 and/or operator.”); a display that presents an image of the robot to the user (see at least [0028]: “According to certain embodiments, such information provided to the management system 122 from the robot station 102 and/or the communication network or link 118 may be processed by the management system 122 in a manner that provides a visual indication or information regarding the operation of the robot 106 and/or work on the workpiece that can be displayed on the display 124, either as a captured image(s) and/or as a digital representation.”); a haptic stimulator that applies a stimulus to a user's sense of touch (see at least [0026]: “According to certain embodiments, the teleoperation member 122 can include haptic capabilities and/or can be a haptic device, including, for example, a teleoperation member 122 device that experiences feedback in the form of vibrations sensed by the operator through the teleoperation member 122, among other sensory feedback to the operator, relating to forces experienced and/or encountered by the robot 106 and/or end effector 108.”; [0037]: “FIGS. 3A and 3B illustrate an exemplary graphical user interface 136 (GUI), as shown on the display 124, that is fully integrated with the operation of a teleoperation member 122 in the form of a haptic joystick 138.”); and a controller that controls the robot based on the input to the operator (see at least Fig. 
1 and [0021]: “The controller 112 can be configured to provide a variety of functions, including, for example, be utilized in the selective delivery of electrical power to the robot 106 and/or to control the movement of the robot 106 and/or the operation of other equipment that is mounted to the robot 106, including, for example, the end effector 108.”), wherein the controller operates the haptic stimulator if the robot contacts another object (see at least Figs. 1-2 and [0034]: “Moreover, the force sensor 130 can be used to detect and/or determine forces being experienced by, and/or imparted onto, the end effector 108 and/or robot 106 and can directly or indirectly provide corresponding information or data to the controller 112, the computation member 116 of the robot system 102, and/or the management system 122 of the operator station 104. The controller 112, computation member 116, and/or the management system 122 can utilize such data or related information to provide a signal to the teleoperation member 122 that is used to provide a visual, auditory, or physical signal to the operator of the teleoperation member 122 that may provide an indication of the relative degree of the force sensed by the force sensor 130.”; [0035]: “As the robot 106 is maneuvered by operation of the teleoperation member 122, and the subpart 132 is moved into contact with the main part 134, at least a portion of the forces generated by such contact between the subpart 132 and main part 134 are sensed by the force sensor 130. Force feedback information or data relating to, or indicative of, the forces sensed by the force sensor 130 can be communicated to the operator via the haptic capabilities of the teleoperation member 122, as previously discussed.”). Regarding claim 2, Rossano teaches the limitations of claim 1. 
Rossano further teaches wherein the controller causes the robot to perform a job including steps based on the input to the operator (see at least [0025]: “The at least one teleoperation member 122 can be structured to receive commands or other input information from the operator of the teleoperation member 122, including, but not limited to, commands provided through the physical engagement and/or movement of at least a portion of the teleoperation member 122 and/or operator….For example, according to certain embodiments in which the teleoperation member 122 is a joystick, movement of the joystick or a stick of the joystick can be converted to electronic signals indicative of the direction(s) or movement that are translated into corresponding similar movement of the robot 106 and/or a speed of movement of the robot 106.”; [0044]: “Referencing FIGS. 4A-4D, for multi-stage assemblies, the GUI 136 can be structured to graphically depict on the display 124, and/or provide a textual or graphical indication(s), of the current stage of the assembly of the subpart 132 to the main part 134. Further, as previously discussed, FIGS. 
4A-4D further demonstrate the haptic joystick 138 being synchronized to the GUI 136 such that the movement or jogging of the joystick 138 correlates to the GUI 136 depicting the movement of a first reference indicator 142 that corresponds to a particular location and/or orientation of the robot 106 and/or subpart 132.”), and operates the haptic stimulator if the robot contacts another object upon completion of at least one of the steps (see at least [0026]: “According to certain embodiments, the teleoperation member 122 can include haptic capabilities and/or can be a haptic device, including, for example, a teleoperation member 122 device that experiences feedback in the form of vibrations sensed by the operator through the teleoperation member 122, among other sensory feedback to the operator, relating to forces experienced and/or encountered by the robot 106 and/or end effector 108. Alternatively, or additionally, the teleoperation member 122 can include haptic capabilities in which the teleoperation member 122 includes one or more tactile sensors that can measure or sense forces exerted on the teleoperation member 122 by the operator. Such forces can be translated into the movement or operation of the robot 106, including, for example, the force an end effector 108 used by the robot 106 exerting a corresponding force on the workpiece, such as a clamping, gripping, and/or striking force.”). Regarding claim 3, Rossano teaches the limitations of claim 2. 
Rossano further teaches wherein the operator receives commands corresponding to the steps (see at least [0025]: “The at least one teleoperation member 122 can be structured to receive commands or other input information from the operator of the teleoperation member 122, including, but not limited to, commands provided through the physical engagement and/or movement of at least a portion of the teleoperation member 122 and/or operator.”), and the controller causes the robot to perform a step corresponding to a command input to the operator (see at least Fig. 1 and [0021]: “The controller 112 can be configured to provide a variety of functions, including, for example, be utilized in the selective delivery of electrical power to the robot 106 and/or to control the movement of the robot 106 and/or the operation of other equipment that is mounted to the robot 106, including, for example, the end effector 108.”; [0044]: “As shown by the illustrated example, adjustments of the haptic joystick 138 can be correlated to the movement of the first reference indicator 142 and/or second reference indicator 146 as the subpart 132 is moved from a position that is not part of the multi-stage assemblies (FIG. 4A), along the first two stage of assembly (FIGS. 4B and 4C), and eventually to a target indicator 148 that corresponds to the insertion of the subpart 132 into a cavity 154 of the main part 134 at a final assembly stage (FIG. 4D).”). Regarding claim 5, Rossano teaches the limitations of claim 1. Rossano further teaches wherein the controller determines a type of contact of the robot (see at least Figs. 
1-2 and [0034]: “The controller 112, computation member 116, and/or the management system 122 can utilize such data or related information to provide a signal to the teleoperation member 122 that is used to provide a visual, auditory, or physical signal to the operator of the teleoperation member 122 that may provide an indication of the relative degree of the force sensed by the force sensor 130. Moreover, the degree of haptic feedback provided by the teleoperation member 122, such as whether the force feedback is relatively large or small, may provide an indication of the extent of force sensed by the force sensor 130…Further, according to certain embodiments, the haptic feedback of the teleoperation member 122 is provided by movement and/or resistance to movement of the teleoperation member 122, such as, for example, vibration(s) or degree of resistance to movement of a haptic joystick of the teleoperation member 122.”), and operates the haptic stimulator for each of different types of contact (see at least [0034]: “Further, according to certain embodiments, the haptic feedback of the teleoperation member 122 is provided by movement and/or resistance to movement of the teleoperation member 122, such as, for example, vibration(s) or degree of resistance to movement of a haptic joystick of the teleoperation member 122. Further, the extent and/or duration of such movement or resistance can correspond to the degree of the force sensed by the force sensor 130.”). Regarding claim 6, Rossano teaches the limitations of claim 1. Rossano further teaches wherein the controller determines a magnitude of a contact force of the robot (see at least Figs. 
1-2 and [0034]: “The controller 112, computation member 116, and/or the management system 122 can utilize such data or related information to provide a signal to the teleoperation member 122 that is used to provide a visual, auditory, or physical signal to the operator of the teleoperation member 122 that may provide an indication of the relative degree of the force sensed by the force sensor 130. Moreover, the degree of haptic feedback provided by the teleoperation member 122, such as whether the force feedback is relatively large or small, may provide an indication of the extent of force sensed by the force sensor 130…Further, according to certain embodiments, the haptic feedback of the teleoperation member 122 is provided by movement and/or resistance to movement of the teleoperation member 122, such as, for example, vibration(s) or degree of resistance to movement of a haptic joystick of the teleoperation member 122.”), and changes an operation mode of the haptic stimulator according to the magnitude of the contact force (see at least [0034]: “Further, according to certain embodiments, the haptic feedback of the teleoperation member 122 is provided by movement and/or resistance to movement of the teleoperation member 122, such as, for example, vibration(s) or degree of resistance to movement of a haptic joystick of the teleoperation member 122. Further, the extent and/or duration of such movement or resistance can correspond to the degree of the force sensed by the force sensor 130.”). Regarding claim 7, Rossano teaches the limitations of claim 1. Rossano further teaches a sensor that detects a force acting on the robot (see at least Figs. 
1-2 and [0034]: “Moreover, the force sensor 130 can be used to detect and/or determine forces being experienced by, and/or imparted onto, the end effector 108 and/or robot 106 and can directly or indirectly provide corresponding information or data to the controller 112, the computation member 116 of the robot system 102, and/or the management system 122 of the operator station 104.”), wherein the controller determines contact of the robot with another object based on a detection result of the sensor (see at least Figs. 1-2 and [0034]: “The controller 112, computation member 116, and/or the management system 122 can utilize such data or related information to provide a signal to the teleoperation member 122 that is used to provide a visual, auditory, or physical signal to the operator of the teleoperation member 122 that may provide an indication of the relative degree of the force sensed by the force sensor 130.”). Regarding claim 9, Rossano teaches a remote control system (see at least Abstract: “A teleoperated robotic system that utilizes a graphical user interface (GUI) to perform work on a workpiece(s) using a robot.”) comprising: a robot (see at least Fig. 
1 and [0019]: “According to certain embodiments, the robot station 102 includes one or more robots 106 having one or more degrees of freedom.”); an operator that receives input from a user (see at least [0025]: “The at least one teleoperation member 122 can be structured to receive commands or other input information from the operator of the teleoperation member 122, including, but not limited to, commands provided through the physical engagement and/or movement of at least a portion of the teleoperation member 122 and/or operator.”); a display that presents an image of the robot to the user (see at least [0028]: “According to certain embodiments, such information provided to the management system 122 from the robot station 102 and/or the communication network or link 118 may be processed by the management system 122 in a manner that provides a visual indication or information regarding the operation of the robot 106 and/or work on the workpiece that can be displayed on the display 124, either as a captured image(s) and/or as a digital representation.”); a haptic stimulator that applies a stimulus to a user's sense of touch (see at least [0026]: “According to certain embodiments, the teleoperation member 122 can include haptic capabilities and/or can be a haptic device, including, for example, a teleoperation member 122 device that experiences feedback in the form of vibrations sensed by the operator through the teleoperation member 122, among other sensory feedback to the operator, relating to forces experienced and/or encountered by the robot 106 and/or end effector 108.”; [0037]: “FIGS. 3A and 3B illustrate an exemplary graphical user interface 136 (GUI), as shown on the display 124, that is fully integrated with the operation of a teleoperation member 122 in the form of a haptic joystick 138.”); and a controller that causes the robot to perform a job including steps based on the input to the operator (see at least Fig. 
1 and [0021]: “The controller 112 can be configured to provide a variety of functions, including, for example, be utilized in the selective delivery of electrical power to the robot 106 and/or to control the movement of the robot 106 and/or the operation of other equipment that is mounted to the robot 106, including, for example, the end effector 108.”; [0025]: “The at least one teleoperation member 122 can be structured to receive commands or other input information from the operator of the teleoperation member 122, including, but not limited to, commands provided through the physical engagement and/or movement of at least a portion of the teleoperation member 122 and/or operator….For example, according to certain embodiments in which the teleoperation member 122 is a joystick, movement of the joystick or a stick of the joystick can be converted to electronic signals indicative of the direction(s) or movement that are translated into corresponding similar movement of the robot 106 and/or a speed of movement of the robot 106.”; [0044]: “Referencing FIGS. 4A-4D, for multi-stage assemblies, the GUI 136 can be structured to graphically depict on the display 124, and/or provide a textual or graphical indication(s), of the current stage of the assembly of the subpart 132 to the main part 134. Further, as previously discussed, FIGS. 
4A-4D further demonstrate the haptic joystick 138 being synchronized to the GUI 136 such that the movement or jogging of the joystick 138 correlates to the GUI 136 depicting the movement of a first reference indicator 142 that corresponds to a particular location and/or orientation of the robot 106 and/or subpart 132.”), wherein the controller operates the haptic stimulator upon completion of at least one of the steps (see at least [0026]: “According to certain embodiments, the teleoperation member 122 can include haptic capabilities and/or can be a haptic device, including, for example, a teleoperation member 122 device that experiences feedback in the form of vibrations sensed by the operator through the teleoperation member 122, among other sensory feedback to the operator, relating to forces experienced and/or encountered by the robot 106 and/or end effector 108. Alternatively, or additionally, the teleoperation member 122 can include haptic capabilities in which the teleoperation member 122 includes one or more tactile sensors that can measure or sense forces exerted on the teleoperation member 122 by the operator. Such forces can be translated into the movement or operation of the robot 106, including, for example, the force an end effector 108 used by the robot 106 exerting a corresponding force on the workpiece, such as a clamping, gripping, and/or striking force.”; [0044]: “Referencing FIGS. 4A-4D, for multi-stage assemblies, the GUI 136 can be structured to graphically depict on the display 124, and/or provide a textual or graphical indication(s), of the current stage of the assembly of the subpart 132 to the main part 134. Further, as previously discussed, FIGS. 
4A-4D further demonstrate the haptic joystick 138 being synchronized to the GUI 136 such that the movement or jogging of the joystick 138 correlates to the GUI 136 depicting the movement of a first reference indicator 142 that corresponds to a particular location and/or orientation of the robot 106 and/or subpart 132…As shown by the illustrated example, adjustments of the haptic joystick 138 can be correlated to the movement of the first reference indicator 142 and/or second reference indicator 146 as the subpart 132 is moved from a position that is not part of the multi-stage assemblies (FIG. 4A), along the first two stage of assembly (FIGS. 4B and 4C), and eventually to a target indicator 148 that corresponds to the insertion of the subpart 132 into a cavity 154 of the main part 134 at a final assembly stage (FIG. 4D).”). Regarding claim 10, Rossano teaches a robot remote control method for controlling a robot via an operator (see at least Fig. 1 and [0006]: “Another aspect of the present application is a method for providing a graphic user interface for a teleoperation robotic system having a teleoperated member at an operator station and a robot at a robot station.”), comprising: receiving input from a user via the operator (see at least [0025]: “The at least one teleoperation member 122 can be structured to receive commands or other input information from the operator of the teleoperation member 122, including, but not limited to, commands provided through the physical engagement and/or movement of at least a portion of the teleoperation member 122 and/or operator.”); moving the robot based on the input to the operator (see at least Fig. 
1 and [0021]: “The controller 112 can be configured to provide a variety of functions, including, for example, be utilized in the selective delivery of electrical power to the robot 106 and/or to control the movement of the robot 106 and/or the operation of other equipment that is mounted to the robot 106, including, for example, the end effector 108.”); presenting an image of the moving robot to the user via a display (see at least [0028]: “According to certain embodiments, such information provided to the management system 122 from the robot station 102 and/or the communication network or link 118 may be processed by the management system 122 in a manner that provides a visual indication or information regarding the operation of the robot 106 and/or work on the workpiece that can be displayed on the display 124, either as a captured image(s) and/or as a digital representation.”); and operating a haptic stimulator that applies a stimulus to a user's sense of touch if the robot contacts another object (see at least Figs. 1-2 and [0026]: “According to certain embodiments, the teleoperation member 122 can include haptic capabilities and/or can be a haptic device, including, for example, a teleoperation member 122 device that experiences feedback in the form of vibrations sensed by the operator through the teleoperation member 122, among other sensory feedback to the operator, relating to forces experienced and/or encountered by the robot 106 and/or end effector 108.”; [0034]: “Moreover, the force sensor 130 can be used to detect and/or determine forces being experienced by, and/or imparted onto, the end effector 108 and/or robot 106 and can directly or indirectly provide corresponding information or data to the controller 112, the computation member 116 of the robot system 102, and/or the management system 122 of the operator station 104. 
The controller 112, computation member 116, and/or the management system 122 can utilize such data or related information to provide a signal to the teleoperation member 122 that is used to provide a visual, auditory, or physical signal to the operator of the teleoperation member 122 that may provide an indication of the relative degree of the force sensed by the force sensor 130.”; [0035]: “As the robot 106 is maneuvered by operation of the teleoperation member 122, and the subpart 132 is moved into contact with the main part 134, at least a portion of the forces generated by such contact between the subpart 132 and main part 134 are sensed by the force sensor 130. Force feedback information or data relating to, or indicative of, the forces sensed by the force sensor 130 can be communicated to the operator via the haptic capabilities of the teleoperation member 122, as previously discussed.”). Regarding claim 11, Rossano teaches a non-transitory storage medium storing a remote control program causing a computer to implement a function of controlling a remote control system (see at least [0022]: “Operations, instructions, and/or commands determined and/or transmitted from the controller 112 can be based on one or more models stored in non-transient computer readable media in a controller 112, other computer, and/or memory that is accessible or in electrical communication with the controller 112”) including a robot (see at least Fig. 
1 and [0019]: “According to certain embodiments, the robot station 102 includes one or more robots 106 having one or more degrees of freedom.”), an operator that receives input from a user (see at least [0025]: “The at least one teleoperation member 122 can be structured to receive commands or other input information from the operator of the teleoperation member 122, including, but not limited to, commands provided through the physical engagement and/or movement of at least a portion of the teleoperation member 122 and/or operator.”), and a haptic stimulator that applies a stimulus to a user's sense of touch (see at least [0026]: “According to certain embodiments, the teleoperation member 122 can include haptic capabilities and/or can be a haptic device, including, for example, a teleoperation member 122 device that experiences feedback in the form of vibrations sensed by the operator through the teleoperation member 122, among other sensory feedback to the operator, relating to forces experienced and/or encountered by the robot 106 and/or end effector 108.”; [0037]: “FIGS. 3A and 3B illustrate an exemplary graphical user interface 136 (GUI), as shown on the display 124, that is fully integrated with the operation of a teleoperation member 122 in the form of a haptic joystick 138.”), the remote control program further causing the computer to implement a function of receiving the input from the user via the operator (see at least [0025]: “The at least one teleoperation member 122 can be structured to receive commands or other input information from the operator of the teleoperation member 122, including, but not limited to, commands provided through the physical engagement and/or movement of at least a portion of the teleoperation member 122 and/or operator.”), a function of moving the robot based on the input to the operator (see at least Fig. 
1 and [0021]: “The controller 112 can be configured to provide a variety of functions, including, for example, be utilized in the selective delivery of electrical power to the robot 106 and/or to control the movement of the robot 106 and/or the operation of other equipment that is mounted to the robot 106, including, for example, the end effector 108.”), and a function of operating a haptic stimulator that applies a stimulus to a user's sense of touch if the robot contacts another object (see at least Figs. 1-2 and [0034]: “Moreover, the force sensor 130 can be used to detect and/or determine forces being experienced by, and/or imparted onto, the end effector 108 and/or robot 106 and can directly or indirectly provide corresponding information or data to the controller 112, the computation member 116 of the robot system 102, and/or the management system 122 of the operator station 104. The controller 112, computation member 116, and/or the management system 122 can utilize such data or related information to provide a signal to the teleoperation member 122 that is used to provide a visual, auditory, or physical signal to the operator of the teleoperation member 122 that may provide an indication of the relative degree of the force sensed by the force sensor 130.”; [0035]: “As the robot 106 is maneuvered by operation of the teleoperation member 122, and the subpart 132 is moved into contact with the main part 134, at least a portion of the forces generated by such contact between the subpart 132 and main part 134 are sensed by the force sensor 130. Force feedback information or data relating to, or indicative of, the forces sensed by the force sensor 130 can be communicated to the operator via the haptic capabilities of the teleoperation member 122, as previously discussed.”). Claim Rejections - 35 USC § 103 10. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 
102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. 11. Claim 4 is/are rejected under 35 U.S.C. 103 as being unpatentable over Rossano et al. (US 20180154518, hereinafter Rossano) in view of Devengenzo (US 9919424, hereinafter Devengenzo). Regarding claim 4, Rossano teaches the limitations of claim 3. Rossano further teaches wherein the controller changes an operation mode of the haptic stimulator (see at least Figs. 1-2 and [0034]: “Moreover, the degree of haptic feedback provided by the teleoperation member 122, such as whether the force feedback is relatively large or small, may provide an indication of the extent of force sensed by the force sensor 130…Further, according to certain embodiments, the haptic feedback of the teleoperation member 122 is provided by movement and/or resistance to movement of the teleoperation member 122, such as, for example, vibration(s) or degree of resistance to movement of a haptic joystick of the teleoperation member 122. Further, the extent and/or duration of such movement or resistance can correspond to the degree of the force sensed by the force sensor 130.”). 
Rossano fails to explicitly teach changing an operation mode of the haptic stimulator according to the number of types of contact occurred simultaneously. However, Devengenzo teaches a method and system for performing automated tasks with a robot system that changes an operation mode of the haptic stimulator according to the number of types of contact occurred simultaneously (see at least Fig. 2 and Col. 13, Lines 41-56: “In haptic feedback implementations, the object-feedback system 42 can include one or more haptic actuators configured to provoke operator stimulation through the sense of touch…According to some examples, the haptic actuator(s) can be actuated according to various haptic profiles. The haptic profiles can include information relating to the frequency, intensity, and/or relative synchronizations for actuating the haptic actuator(s). In particular, distinct haptic profiles can be associated with different aspects of the interaction between the end-effector 12 and the object such that, by learning to recognize the different haptic profiles, the operator can receive information about the interaction through haptic stimulation.”; Col. 14, lines 48-52: “For instance, the object-feedback system 42 can provide one or more haptic profiles to the operator warning that the amount of force applied by the end-effector 12 to the object is approaching or has exceeded the threshold.”). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rossano to incorporate the teachings of Devengenzo and provide a means to change an operation mode of the haptic stimulator according to the number of types of contact occurred simultaneously, with a reasonable expectation of success, in order to provide a warning to the user of an amount of force that is close to a threshold. Claim Rejections - 35 USC § 103 12. Claim 8 is/are rejected under 35 U.S.C. 
103 as being unpatentable over Rossano et al. (US 20180154518, hereinafter Rossano) in view of Sodeyama et al. (US 20210394362, hereinafter Sodeyama). Regarding claim 8, Rossano teaches the limitations of claim 7. Rossano further teaches wherein the robot includes a robot arm and a hand having an end effector and coupled to the robot arm (see at least Figs. 1, 2, and [0019]: “For example, according to certain embodiments, the end effector 108 can be mounted to a wrist or arm 110 of the robot 106. Further, as discussed below, at least portions of the wrist, arm 110, and/or end effector 108 can be moveable relative to other portions of the robot 106 and/or a workpiece via remote operation of the robot 106 and/or end effector 108 by an operator of the operator station 104, as discussed below.”; [0020]: “A variety of different types of end effectors 108 can be utilized by the teleoperation robotic system 100, including, for example, an end effector 108 that is a painting or coating spraying device or tool, welding gun, gripper(s), fixture, spotlight(s), conveyor, and/or a milling or drilling tool, among other types of tools or devices that can perform work on, grip, displace, and/or perform other functions relating to a workpiece)”), the controller causes the robot to perform a workpiece picking job (see at least [0020]: “A variety of different types of end effectors 108 can be utilized by the teleoperation robotic system 100, including, for example, an end effector 108 that is a painting or coating spraying device or tool, welding gun, gripper(s), fixture, spotlight(s), conveyor, and/or a milling or drilling tool, among other types of tools or devices that can perform work on, grip, displace, and/or perform other functions relating to a workpiece)”; [0038]: “According to such an approach, once the subpart 132 is gripped, held, or otherwise operably coupled by/to the robot 106, the position and/or orientation of the subpart 132 is known as such a position and/or 
orientation would stay in a known relationship to the end effector 108 of the robot 106 until the subpart 132 is released from the robot 106 and/or end effector 108.”), and the sensor includes a first sensor that detects a force acting on the end effector from a surface on which a workpiece is placed (see at least [0034]: “Moreover, the force sensor 130 can be used to detect and/or determine forces being experienced by, and/or imparted onto, the end effector 108 and/or robot 106 and can directly or indirectly provide corresponding information or data to the controller 112, the computation member 116 of the robot system 102, and/or the management system 122 of the operator station 104. The controller 112, computation member 116, and/or the management system 122 can utilize such data or related information to provide a signal to the teleoperation member 122 that is used to provide a visual, auditory, or physical signal to the operator of the teleoperation member 122 that may provide an indication of the relative degree of the force sensed by the force sensor 130.”). Rossano fails to explicitly teach a finger that performs opening-closing movement, and the sensor includes a first sensor that detects a force acting on the finger from a surface, and a second sensor that detects a force acting on the finger from the workpiece in a finger opening-closing direction. However, Sodeyama teaches an apparatus and method for enabling a physical interaction between an autonomous robot and a person that comprises a hand having a finger that performs opening-closing movement and coupled to the robot arm (see at least Figs.
1, 5, and [0057]: “Each of the manipulators 44L and 44R (hereinafter, their reference numerals are simply 44 in a case where the manipulators 44L and 44R are not distinguished from each other) includes an upper arm portion 441 attached to a place corresponding to a shoulder of the body portion 42, a forearm portion 442 attached to the upper arm portion 441 at a place corresponding to an elbow of the manipulator 44, and a hand portion 443 attached to the forearm portion 442 at a place corresponding to a wrist of the manipulator 44.”), and a sensor includes a first sensor that detects a force acting on the finger from a surface (see at least Fig. 5 and [0087]: “The 6-axis force sensor 501 is attached to, for example, a wrist part of the manipulator 44, and detects a magnitude and a direction of a force and a torque applied to the wrist part.”; [0130]: “Next, when the physical interaction execution unit 55 lifts the object B1, the object recognition unit 52 calculates a load caused by lifting the object B1 based on the sensor data detected by the various sensors (the 6-axis force sensor 501, the 3-axis force sensor 502, the slip sensor 503, and the like) of the manipulator 44 (Step S103), and recognizes or estimates the characteristics of the object B1, for example, the coefficient of static friction, the coefficient of dynamic friction, the mass, the rigidity, the strength, the temperature, the humidity, and the like, based on the calculated load (Step S104).”), and a second sensor that detects a force acting on the finger from the workpiece in a finger opening-closing direction (see at least Fig. 
5 and [0088]: “The 3-axis force sensor 502 is attached to, for example, each knuckle in the hand portion 443 and detects a magnitude and a direction of a force or a torque applied to the knuckle.”; [0130]: “Next, when the physical interaction execution unit 55 lifts the object B1, the object recognition unit 52 calculates a load caused by lifting the object B1 based on the sensor data detected by the various sensors (the 6-axis force sensor 501, the 3-axis force sensor 502, the slip sensor 503, and the like) of the manipulator 44 (Step S103), and recognizes or estimates the characteristics of the object B1, for example, the coefficient of static friction, the coefficient of dynamic friction, the mass, the rigidity, the strength, the temperature, the humidity, and the like, based on the calculated load (Step S104).”). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rossano to incorporate the teachings of Sodeyama and provide a hand having a finger that performs opening-closing movement, and the sensor includes a first sensor that detects a force acting on the finger from a surface, and a second sensor that detects a force acting on the finger from the workpiece in a finger opening-closing direction, with a reasonable expectation of success, in order to allow the robot to perform more intricate manipulations using fingers. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Shahoian et al. (US 20050128186) teaches an apparatus and method for haptic feedback control with a handheld remote to control a slave robot. Norton et al. (US 20170255301) teaches a system and apparatus for a user interface using haptic gloves to communicate and control a robot. 
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TIEN MINH LE whose telephone number is (571)272-3903. The examiner can normally be reached Monday to Friday (8:30am-5:30pm eastern time). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached on (571)272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /T.M.L./Examiner, Art Unit 3656 /KHOI H TRAN/Supervisory Patent Examiner, Art Unit 3656

Prosecution Timeline

Oct 15, 2024 · Application Filed
Jan 07, 2026 · Non-Final Rejection — §102, §103, §112
Mar 26, 2026 · Interview Requested
Apr 07, 2026 · Applicant Interview (Telephonic)
Apr 07, 2026 · Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12566070
DETERMINATION APPARATUS AND DETERMINATION METHOD
2y 5m to grant · Granted Mar 03, 2026
Patent 12528325
A CONTROL SYSTEM FOR A VEHICLE
2y 5m to grant · Granted Jan 20, 2026
Patent 12508704
Marker Detection Apparatus and Robot Teaching System
2y 5m to grant · Granted Dec 30, 2025
Patent 12509122
VEHICLE SELECTION DEVICE AND VEHICLE SELECTION METHOD
2y 5m to grant · Granted Dec 30, 2025
Patent 12466074
IMAGE PROCESSING METHOD, IMAGE PROCESSING APPARATUS, ROBOT-MOUNTED TRANSFER DEVICE, AND SYSTEM
2y 5m to grant · Granted Nov 11, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 68%
With Interview: 92% (+23.8%)
Median Time to Grant: 2y 12m
PTA Risk: Low
Based on 81 resolved cases by this examiner. Grant probability derived from career allow rate.
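The projection figures above follow directly from the examiner's career statistics: the 68% grant probability is the career allow rate (55 granted of 81 resolved cases), and the 92% with-interview figure adds the observed +23.8-point interview lift to that base rate. A minimal sketch of that arithmetic, with illustrative function names (not from any real API):

```python
# Reproduce the projection arithmetic from the examiner's career stats.
# Inputs (55 granted / 81 resolved, +23.8-point interview lift) come
# from the figures shown above; function names are illustrative only.

def grant_probability(granted: int, resolved: int) -> float:
    """Career allow rate: share of resolved cases that granted."""
    return granted / resolved

def with_interview(base: float, lift: float) -> float:
    """Add the interview lift (in probability points), capped at 100%."""
    return min(base + lift, 1.0)

base = grant_probability(55, 81)       # ~0.679, displayed as 68%
boosted = with_interview(base, 0.238)  # ~0.917, displayed as 92%

print(f"{base:.0%}")     # 68%
print(f"{boosted:.0%}")  # 92%
```

Note this is a simple additive lift on the base rate; the tool may weight by art unit or statute mix, which this sketch does not attempt.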
