Prosecution Insights
Last updated: April 19, 2026
Application No. 17/757,306

HANDHELD DEVICE FOR TRAINING AT LEAST ONE MOVEMENT AND AT LEAST ONE ACTIVITY OF A MACHINE, SYSTEM AND METHOD

Status: Non-Final OA (§103)
Filed: Jun 14, 2022
Examiner: KASPER, BYRON XAVIER
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Wandelbots GmbH
OA Round: 5 (Non-Final)

Grant Probability: 70% (Favorable)
Expected OA Rounds: 5-6
Expected Time to Grant: 3y 0m
Grant Probability With Interview: 88%

Examiner Intelligence

Career Allow Rate: 70% (72 granted / 103 resolved), +17.9% vs TC avg (above average)
Interview Lift: +18.4% higher allowance among resolved cases with an interview (a strong lift)
Avg Prosecution: 3y 0m (typical timeline)
Currently Pending: 36 applications
Total Applications: 139 (across all art units)
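
The headline numbers above are simple ratios over the examiner's resolved cases and can be sanity-checked directly. A minimal sketch (assuming the interview lift is the gap between the with-interview and baseline grant probabilities shown above; the analytics provider may compute it from per-group counts, which would explain the more precise +18.4% figure):

```python
# Figures taken from the report above.
granted, resolved = 72, 103

# Career allow rate: granted / resolved.
allow_rate = granted / resolved            # 72/103 = 0.699..., shown as 70%

# Interview lift, assumed here to be the difference between the
# with-interview grant probability (88%) and the baseline (70%).
interview_lift = 0.88 - 0.70               # about +18 points

print(f"career allow rate: {allow_rate:.1%}")
print(f"interview lift:    {interview_lift:+.1%}")
```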

Statute-Specific Performance

§101: 10.9% (-29.1% vs TC avg)
§102: 11.9% (-28.1% vs TC avg)
§103: 56.3% (+16.3% vs TC avg)
§112: 16.4% (-23.6% vs TC avg)

TC averages are estimates; figures are based on career data from 103 resolved cases.
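
A small check on the arithmetic behind the per-statute deltas: subtracting each reported delta from its rate implies the same 40% baseline for every statute, suggesting the dashboard compares each figure against a single Tech Center average estimate rather than per-statute averages (an inference from the numbers, not something the report states):

```python
# Per-statute figures from the table above (§101, §102, §103, §112).
rates  = {"101": 0.109, "102": 0.119, "103": 0.563, "112": 0.164}
deltas = {"101": -0.291, "102": -0.281, "103": 0.163, "112": -0.236}

# Implied Tech Center baseline for each statute: rate - delta.
baselines = {k: round(r - deltas[k], 3) for k, r in rates.items()}
print(baselines)   # every implied baseline is 0.4 (i.e. 40%)
```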

Office Action

§103
Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

2. This communication is responsive to Application No. 17/757,306 and the amendments filed on 2/5/2026.

3. Claims 26-31 and 34-46 are presented for examination.

Information Disclosure Statement

4. The information disclosure statements (IDS) submitted on 6/14/2022, 11/10/2022, 9/18/2023, and 11/1/2024 have been fully considered by the Examiner.

Response to Arguments

5. Applicant’s arguments with respect to the rejection of claims 26-31 and 33-46 under 35 U.S.C. 103 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Regarding independent claim 26, the Examiner agrees that the combination of US 20180345495 A1 to Aldridge, US 20060271263 A1 to Self, and WO 2017036520 A1 to Dai fails to teach all of the amendments of the claim. However, in light of the amendments and the Applicant’s remarks, an updated search was conducted, and a new ground of rejection for claim 26 has been determined, as described below. Regarding independent claims 42 and 44, because these claims contain limitations similar to those of claim 26, they remain rejected for reasons similar to those for claim 26, as described below. Regarding dependent claims 27-31, 34-41, 43, and 45-46, because all of these claims depend from claim 26, 42, or 44, they likewise remain rejected, as described below. Regarding dependent claim 33, this claim has been cancelled and is therefore withdrawn from further consideration.

6. The Examiner notes that, in the previous Non-Final Office Action mailed 4/22/2025, claims 26, 31, 42, and 44 comprise limitations interpreted under 35 U.S.C. 112(f) for using a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, where the generic placeholder is not preceded by a structural modifier. The Applicant’s remarks filed 2/5/2026 do not contain any arguments against the Examiner’s interpretation of these claims under 35 U.S.C. 112(f), nor do the amended claims filed 2/5/2026 address these interpretations. As such, the Examiner maintains the 35 U.S.C. 112(f) interpretations of claims 26, 31, 42, and 44 stated in the Non-Final Rejection mailed 4/22/2025.

7. The Examiner notes Applicant’s request for an interview if any issues remain that would prevent allowance of the application. However, because the new ground of rejection over the newly found prior art was necessitated by Applicant’s amendments to the claims, the Examiner would like to provide the Applicant with an opportunity to review the new ground of rejection before an interview. Once the Applicant has reviewed the new ground of rejection, the Examiner is willing to conduct an interview to discuss any remaining issues with the application.

Claim Rejections - 35 USC § 103

8. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

9. The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

10. Claims 26, 28, 30, 31, 34, 35, 38, 39, 42, 44, and 46 are rejected under 35 U.S.C. 103 as being unpatentable over Aldridge et al. (US 20180345495 A1, hereinafter Aldridge) in view of Lee et al. (US 20120026408 A1, hereinafter Lee) and Dai et al. (WO 2017036520 A1, hereinafter Dai; provided by Applicant's IDS).

Regarding Claim 26, Aldridge teaches a handheld device ([0018] via “In one embodiment, a robotic point capture and motion control system may enable the capturing of one or more points in space associated with a handheld controller device without having to move the robot to one or more points in space during programming.”) for training at least one movement and at least one activity of a machine ([0073] via “In one embodiment, the operator may move the controller device 320 to make contact at a surface with the probe tip 312. This allows the robotic point capture and motion control system to teach the robot by capturing one or more points in space that may be defined for one or more purposes. For example, the operator may hold the controller device 320 by hand and may move the probe tip 312 onto one or more surfaces (e.g., planes, complex surfaces, cylinders, or any other surface) or even points in space that may define a space boundary.
The operator may then press a trigger on the controller device 320 to capture (e.g., learn) the point and orientation of the probe 310 and the probe tip 312. The point and orientation may be stored for use in the robot program.”), the handheld device comprising: a handle ([0073] via “For example, the operator may hold the controller device 320 by hand and may move the probe tip 312 onto one or more surfaces (e.g., planes, complex surfaces, cylinders, or any other surface) or even points in space that may define a space boundary.”), (Note: See Figures 3A and 3B of Aldridge where the handheld controller device 320 has a handle on the left side of the controller device.); an input unit configured to input activation information for activating the training of the machine ([0053] via “In one embodiment, a robotic point capture and motion control system may facilitate a training mode such that the controller device is capable of learning and capturing points in space at various locations being traversed using the controller device. The user may press the pressure sensitive trigger to gain control of the robot. The robot may be moved into the desired position and orientation of a point in space and then the trigger is released.”), (Note: The Examiner interprets the trigger as the input unit.); an output unit configured to output the activation information for activating the training of the machine to an external device that is external to the handheld device ([0041] via “The controller device 102 may control the robotic device 120 by transmitting control signals to the robotic device 120 through a wire or through wireless signals and vice versa.”), ([0045] via “The robotic device 120 may receive the control signal and may be controlled by the received control signal. The control signal may be received directly from the controller device 102, or may be received through the motion capture input device 123. 
For example, the control signal may cause the robotic device 120 to apply or remove pneumatic air from a robotic gripper of the robotic device 120, or any kind of input/output or generic gripper or any device to communicate to on the robot.”), (Note: The Examiner interprets the handheld controller device 102 having a control signal output as the output unit.); and a coupling structure for releasably coupling an interchangeable attachment configured in accordance with the at least one activity ([0068] via “Referring to FIG. 3A, there is shown a control device 320 having one or more probes 310 attached to a connector 309 of the control device 320. A probe 310 may be a replica of an end effector of a robot, a point device, a tool, or any other suitable devices. In one example, the probe 310 may have a probe tip 312 that may be used to designate a point in space.”), (Note: The Examiner interprets the probe 310 as the coupling structure since it is able to be attached to the handheld control device 320 (See Figures 3A and 3B of Aldridge).). Aldridge is silent on the handheld device comprising: one or more sensors configured to detect one or more locating signals initiated externally to the handheld device; wherein at least one sensor of the one or more sensors comprises an optoelectronic sensor configured to capture the location signal; one or more processors configured to determine location information of the handheld device based on the one or more locating signals; and one or more transmitters for transmitting the location information as training data for training the machine. 
However, Lee teaches a handheld device comprising: one or more sensors configured to detect one or more locating signals initiated externally to the handheld device; wherein at least one sensor of the one or more sensors comprises an optoelectronic sensor configured to capture the location signal ([0036] via “In this embodiment, the remote controller 10 has an internal structure shown schematically in FIG. 2A, including an optical sensor 11, ….”), ([0037] via “Referring to FIGS. 1, 2A, and 2B, the system according to this embodiment operates in the following way: First, the light source 30 emits a light signal, such as an infrared (IR) signal. Next, the optical sensor 11 senses the light signal to detect a relative position of the optical sensor to the light source, that is, a relative position of the remote controller to the light source, wherein the positional information can be two-dimensional information (x and y directions in FIG. 1) or three-dimensional information (x, y, and z directions in FIG. 1). The optical sensor 11 generates a first electronic signal representative of the positional information according to the sensed light signal, and transmits the first electronic signal to the first processor and transceiver 12.”), (Note: See Figures 1 and 2A-B of Lee as well. The Examiner interprets the optical sensor 11 of Lee as the optoelectronic sensor.); and one or more processors configured to determine location information of the handheld device based on the one or more locating signals ([0037] via “The optical sensor 11 generates a first electronic signal representative of the positional information according to the sensed light signal, and transmits the first electronic signal to the first processor and transceiver 12. The first processor and transceiver 12 processes the first electronic signal to convert it to a first wireless signal RF1 and transmits the wireless signal RF1 to the transceiver device 20 via the first antenna 13.
In the transceiver device 20, after the second antenna 23 receives the first wireless signal RF1 from the remote controller 10, the second processor and transceiver 22 converts it to a second electronic signal, which is transmitted to the video display host 40 via a second and a first connectors 21 and 41.”). Further, Dai teaches a handheld device comprising: one or more transmitters for transmitting the location information as training data for training the machine (Page 8 line 33 – Page 9 line 11, where “According to a further variant of the method the robot program comprises data describing a desired movement path of the robot arm. It is also possible to teach position vectors describing the course of the movement path by use of the robot teaching system according to the invention in the case that it has an integrated position determination device. Thus the robot teaching device can be held at desired locations on a reference workpiece wherein a force might be applied thereon through the mechanical interaction tool and wherein not only data values describing the applied force but also data values describing the determined position and/or orientation of the hand-held teaching device are provided by the communication interface to the computing unit. Thus according to a further embodiment of the invention the hand held teaching device further comprises means for determining its position and/or orientation wherein the desired movement path of the robot arm is teached by use of the hand held teaching device.”). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Lee wherein the handheld device comprises: one or more sensors configured to detect one or more locating signals initiated externally to the handheld device; wherein at least one sensor of the one or more sensors comprises an optoelectronic sensor configured to capture the location signal; and one or more processors configured to determine location information of the handheld device based on the one or more locating signals. Doing so allows for the determination of the position of the handheld device in three dimensions from a set point, as stated above by Lee in paragraph [0037], which also allows additional parameters of the handheld device to be determined, as stated by Lee ([0039] via “As mentioned earlier, the positional information of the remote controller relative to the light source can be obtained in the present invention. Hence, a displacement, velocity, or acceleration of the movement of the remote controller 10 can be calculated according to the positional information at different time points, and a time difference therebetween.”). In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Dai wherein the handheld device comprises: one or more transmitters for transmitting the location information as training data for training the machine. Doing so teaches the machine the desired positions and orientations to complete its movement path based on location demonstration from the handheld device, as stated above by Dai.
Regarding Claim 28, modified reference Aldridge teaches the handheld device of claim 26, wherein the output unit comprises a wireless communication device for communicating between the handheld device and the external device ([0035] via “In one embodiment, and with reference to FIG. 1, a robotic device 120 may communicate directly with the controller device 102. For example, the two devices may communicate through a wired or a wireless connection (e.g., magnetic, optical, wireless technology based communication, cables, etc.).”).

Regarding Claim 30, modified reference Aldridge teaches the handheld device of claim 26, wherein the input unit is configured to capture a handling of the handheld device while performing the activity ([0030] via “The user may select the position in free space, moves out of the way, and then initiates the robot's move to the selected position by modulating the speed with the trigger on the controller. That is the user may make the robot move from slow to fast based on gently pressing the trigger to firmly pressing the trigger.”), ([0053] via “In one embodiment, a robotic point capture and motion control system may facilitate a training mode such that the controller device is capable of learning and capturing points in space at various locations being traversed using the controller device. The user may press the pressure sensitive trigger to gain control of the robot. The robot may be moved into the desired position and orientation of a point in space and then the trigger is released. A button is pressed on the controller device to add the point.”).
Regarding Claim 31, modified reference Aldridge teaches the handheld device of claim 26, the handheld device further comprising a feedback unit configured to generate feedback to a user of the handheld device ([0044] via “The controller device 102 may also contain haptic feedback devices to provide vibration for certain events, like adding a point, or to communicate robot inertia to the hand of the operator. This may be applicable to a “spray paint” mode to help the operator understand on the fly the kinds of accelerations they are “asking” the robot to do while the robot is being taught.”).

Regarding Claim 34, modified reference Aldridge teaches the handheld device of claim 26, wherein the location information comprises one or more of the following: spatial information pertaining to at least one of a movement of the handheld device in space, a position of the handheld device in space, and an orientation of the handheld device in space ([0069] via “In one or more embodiments, the probe 310 may have an offset distance (e.g., distance d) from a location on the controller device 320. The controller device may be a position-sensing and orientation-sensing hand-held controller device. For example, the controller device 320 may contain circuitry comprising a sensor that can sense and capture the position and orientation of the control device relative to a coordinate system.”).

Regarding Claim 35, modified reference Aldridge teaches the handheld device of claim 26, but is silent on wherein the one or more sensors comprises a motion sensor or a position sensor. However, Lee teaches wherein the one or more sensors comprises a motion sensor or a position sensor ([0037] via “… First, the light source 30 emits a light signal, such as an infrared (IR) signal.
Next, the optical sensor 11 senses the light signal to detect a relative position of the optical sensor to the light source, that is, a relative position of the remote controller to the light source, wherein the positional information can be two-dimensional information (x and y directions in FIG. 1) or three-dimensional information (x, y, and z directions in FIG. 1).”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Lee wherein the one or more sensors comprises a motion sensor or a position sensor. Doing so allows for the determination of the position of the handheld device in three dimensions from a set point, as stated above by Lee in paragraph [0037], which also allows additional parameters of the handheld device to be determined, as stated by Lee ([0039] via “As mentioned earlier, the positional information of the remote controller relative to the light source can be obtained in the present invention. Hence, a displacement, velocity, or acceleration of the movement of the remote controller 10 can be calculated according to the positional information at different time points, and a time difference therebetween.”).

Regarding Claim 38, modified reference Aldridge teaches the handheld device of claim 26, wherein the coupling structure is arranged on a first end face of the handheld device ([0068] via “Referring to FIG. 3A, there is shown a control device 320 having one or more probes 310 attached to a connector 309 of the control device 320. A probe 310 may be a replica of an end effector of a robot, a point device, a tool, or any other suitable devices.
In one example, the probe 310 may have a probe tip 312 that may be used to designate a point in space.”), (Note: See Figures 3A and 3B of Aldridge, where the probe 310 and probe tip 312 are arranged on the opposite side from the handle of the handheld device, facing towards the robot, which the Examiner interprets as the first end face of the handheld device.).

Regarding Claim 39, modified reference Aldridge teaches the handheld device of claim 26, wherein the interchangeable attachment comprises an end portion of a tool that represents the activity of the machine ([0023] via “In one embodiment, a robotic point capture and motion control system may facilitate a single point and orientation capture in 3D space using a handheld controller and touch probe. … For example, if the end effector of the robot is an attachment that includes a gripper, the touch probe on the controller device may also be a gripper that may be used by a user to capture one or more points and orientations in the 3D space. These captured points and orientations may then be used to program the robot.”).

Regarding Claim 42, Aldridge teaches a handheld device ([0018] via “In one embodiment, a robotic point capture and motion control system may enable the capturing of one or more points in space associated with a handheld controller device without having to move the robot to one or more points in space during programming.”) for training a movement and an activity of a machine ([0073] via “In one embodiment, the operator may move the controller device 320 to make contact at a surface with the probe tip 312. This allows the robotic point capture and motion control system to teach the robot by capturing one or more points in space that may be defined for one or more purposes.
For example, the operator may hold the controller device 320 by hand and may move the probe tip 312 onto one or more surfaces (e.g., planes, complex surfaces, cylinders, or any other surface) or even points in space that may define a space boundary. The operator may then press a trigger on the controller device 320 to capture (e.g., learn) the point and orientation of the probe 310 and the probe tip 312. The point and orientation may be stored for use in the robot program.”), the handheld device comprising: a handle ([0073] via “For example, the operator may hold the controller device 320 by hand and may move the probe tip 312 onto one or more surfaces (e.g., planes, complex surfaces, cylinders, or any other surface) or even points in space that may define a space boundary.”), (Note: See Figures 3A and 3B of Aldridge where the handheld controller device 320 has a handle on the left side of the controller device.); an input unit configured to input activation information for activating the training of the machine ([0053] via “In one embodiment, a robotic point capture and motion control system may facilitate a training mode such that the controller device is capable of learning and capturing points in space at various locations being traversed using the controller device. The user may press the pressure sensitive trigger to gain control of the robot. 
The robot may be moved into the desired position and orientation of a point in space and then the trigger is released.”), (Note: The Examiner interprets the trigger as the input unit.); an output unit configured to output the activation information for activating the training of the machine to a device external to the handheld device ([0041] via “The controller device 102 may control the robotic device 120 by transmitting control signals to the robotic device 120 through a wire or through wireless signals and vice versa.”), ([0045] via “The robotic device 120 may receive the control signal and may be controlled by the received control signal. The control signal may be received directly from the controller device 102, or may be received through the motion capture input device 123. For example, the control signal may cause the robotic device 120 to apply or remove pneumatic air from a robotic gripper of the robotic device 120, or any kind of input/output or generic gripper or any device to communicate to on the robot.”), (Note: The Examiner interprets the handheld controller device 102 having a control signal output as the output unit.); and a tool configured according to the at least one activity ([0023] via “In one embodiment, a robotic point capture and motion control system may facilitate a single point and orientation capture in 3D space using a handheld controller and touch probe. … For example, if the end effector of the robot is an attachment that includes a gripper, the touch probe on the controller device may also be a gripper that may be used by a user to capture one or more points and orientations in the 3D space. These captured points and orientations may then be used to program the robot.”), ([0068] via “Referring to FIG. 3A, there is shown a control device 320 having one or more probes 310 attached to a connector 309 of the control device 320.
A probe 310 may be a replica of an end effector of a robot, a point device, a tool, or any other suitable devices. In one example, the probe 310 may have a probe tip 312 that may be used to designate a point in space.”), (Note: The Examiner interprets the probe as the tool.). Aldridge is silent on the handheld device comprising: one or more sensors configured to detect one or more locating signals initiated externally to the handheld device; wherein at least one sensor of the one or more sensors comprises an optoelectronic sensor configured to capture the location signal; one or more processors configured to determine location information of the handheld device based on the one or more locating signals; and one or more transmitters for transmitting location information as training data for training the machine. However, Lee teaches a handheld device comprising: one or more sensors configured to detect one or more locating signals initiated externally to the handheld device; wherein at least one sensor of the one or more sensors comprises an optoelectronic sensor configured to capture the location signal ([0036] via “In this embodiment, the remote controller 10 has an internal structure shown schematically in FIG. 2A, including an optical sensor 11, ….”), ([0037] via “Referring to FIGS. 1, 2A, and 2B, the system according to this embodiment operates in the following way: First, the light source 30 emits a light signal, such as an infrared (IR) signal. Next, the optical sensor 11 senses the light signal to detect a relative position of the optical sensor to the light source, that is, a relative position of the remote controller to the light source, wherein the positional information can be two-dimensional information (x and y directions in FIG. 1) or three-dimensional information (x, y, and z directions in FIG. 1).
The optical sensor 11 generates a first electronic signal representative of the positional information according to the sensed light signal, and transmits the first electronic signal to the first processor and transceiver 12.”), (Note: See Figures 1 and 2A-B of Lee as well. The Examiner interprets the optical sensor 11 of Lee as the optoelectronic sensor.); and one or more processors configured to determine location information of the handheld device based on the one or more locating signals ([0037] via “The optical sensor 11 generates a first electronic signal representative of the positional information according to the sensed light signal, and transmits the first electronic signal to the first processor and transceiver 12. The first processor and transceiver 12 processes the first electronic signal to convert it to a first wireless signal RF1 and transmits the wireless signal RF1 to the transceiver device 20 via the first antenna 13. In the transceiver device 20, after the second antenna 23 receives the first wireless signal RF1 from the remote controller 10, the second processor and transceiver 22 converts it to a second electronic signal, which is transmitted to the video display host 40 via a second and a first connectors 21 and 41.”). Further, Dai teaches a handheld device comprising: one or more transmitters for transmitting location information as training data for training the machine (Page 8 line 33 – Page 9 line 11, where “According to a further variant of the method the robot program comprises data describing a desired movement path of the robot arm. It is also possible to teach position vectors describing the course of the movement path by use of the robot teaching system according to the invention in the case that it has an integrated position determination device. 
Thus the robot teaching device can be held at desired locations on a reference workpiece wherein a force might be applied thereon through the mechanical interaction tool and wherein not only data values describing the applied force but also data values describing the determined position and/or orientation of the hand-held teaching device are provided by the communication interface to the computing unit. Thus according to a further embodiment of the invention the hand held teaching device further comprises means for determining its position and/or orientation wherein the desired movement path of the robot arm is teached by use of the hand held teaching device.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Lee wherein the handheld device comprises: one or more sensors configured to detect one or more locating signals initiated externally to the handheld device; wherein at least one sensor of the one or more sensors comprises an optoelectronic sensor configured to capture the location signal; and one or more processors configured to determine location information of the handheld device based on the one or more locating signals. Doing so allows for the determination of the position of the handheld device in three dimensions from a set point, as stated above by Lee in paragraph [0037], which also allows additional parameters of the handheld device to be determined, as stated by Lee ([0039] via “As mentioned earlier, the positional information of the remote controller relative to the light source can be obtained in the present invention. Hence, a displacement, velocity, or acceleration of the movement of the remote controller 10 can be calculated according to the positional information at different time points, and a time difference therebetween.”).
In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Dai wherein the handheld device comprises: one or more transmitters for transmitting location information as training data for training the machine. Doing so teaches the machine the desired positions and orientations to complete its movement path based on location demonstration from the handheld device, as stated above by Dai.

Regarding Claim 44, Aldridge teaches a system for training a movement and an activity of a machine ([0073] via “In one embodiment, the operator may move the controller device 320 to make contact at a surface with the probe tip 312. This allows the robotic point capture and motion control system to teach the robot by capturing one or more points in space that may be defined for one or more purposes. For example, the operator may hold the controller device 320 by hand and may move the probe tip 312 onto one or more surfaces (e.g., planes, complex surfaces, cylinders, or any other surface) or even points in space that may define a space boundary. The operator may then press a trigger on the controller device 320 to capture (e.g., learn) the point and orientation of the probe 310 and the probe tip 312.
The point and orientation may be stored for use in the robot program.”), the system comprising: a handheld device ([0018] via “In one embodiment, a robotic point capture and motion control system may enable the capturing of one or more points in space associated with a handheld controller device without having to move the robot to one or more points in space during programming.”) comprising: a handle ([0073] via “For example, the operator may hold the controller device 320 by hand and may move the probe tip 312 onto one or more surfaces (e.g., planes, complex surfaces, cylinders, or any other surface) or even points in space that may define a space boundary.”), (Note: See Figures 3A and 3B of Aldridge where the handheld controller device 320 has a handle on the left side of the controller device.); and a coupling structure for releasably coupling an interchangeably attachable tool ([0068] via “Referring to FIG. 3A, there is shown a control device 320 having one or more probes 310 attached to a connector 309 of the control device 320. A probe 310 may be a replica of an end effector of a robot, a point device, a tool, or any other suitable devices. In one example, the probe 310 may have a probe tip 312 that may be used to designate a point in space.”), wherein the interchangeably attachable tool is configured to perform the activity ([0068] via “The coordinates of the probe tip 312 may indicate the location of the probe tip 312. The control device 320 may determine the coordinates of the probe tip 312 based on a profile associated with the probe 310 that may have been determined based on the type of the probe 310 used. The profile associated with probe 310 may include length of the probe 310 and positioning of the probe 310 when installed on the control device 320. The profile associated with the probe 310 may be inputted to the robotic point capture and motion control system during installation of the probe 310. 
The profile of the probe 310 may be sent to a base station (e.g., motion capture input device 123 of FIG. 1) to determine specific actions that may be performed by the robotic point capture and motion control system.”), (Note: The Examiner interprets the probe 310 as the coupling structure (See Figures 3A and 3B of Aldridge).); an external device that is external to the handheld device ([0042] via “The motion capture input device 123 may be a stand-alone device, or may be included in the robotic device 120. The controller device 102 may communicate its position and orientation data to the motion capture input device 123.”), (Note: The Examiner interprets the combination of the motion capture input device 123 and the robotic device 120 as the external device, as they are able to perform the functions of the input and output units described below.), the external device comprising: an input unit configured to input activation information for activating the movement for training the machine ([0045] via “The robotic device 120 may receive the control signal and may be controlled by the received control signal. The control signal may be received directly from the controller device 102, or may be received through the motion capture input device 123. For example, the control signal may cause the robotic device 120 to apply or remove pneumatic air from a robotic gripper of the robotic device 120, or any kind of input/output or generic gripper or any device to communicate to on the robot.”), (Note: The Examiner interprets the motion capture input device 123 as the input unit of the external device, based on receiving input of a control signal that can be used to activate the robot 120.); and an output unit configured to output the activation information activating the movement for training the machine ([0045] via “The robotic device 120 may receive the control signal and may be controlled by the received control signal. 
The control signal may be received directly from the controller device 102, or may be received through the motion capture input device 123. For example, the control signal may cause the robotic device 120 to apply or remove pneumatic air from a robotic gripper of the robotic device 120, or any kind of input/output or generic gripper or any device to communicate to on the robot. Further, the control signal may cause the robotic device 120 to move to a new position in space.”), (Note: The Examiner interprets the robotic device 120 as the output unit, since it receives the control signals from the motion capture input device 123 (input unit) and performs actions.). Aldridge is silent on the handheld device comprising: one or more sensors configured to detect one or more locating signals initiated externally to the handheld device; wherein at least one sensor of the one or more sensors comprises an optoelectronic sensor configured to capture the location signal; one or more processors configured to determine location information of the handheld device based on the one or more locating signals; and one or more transmitters for transmitting the location information as training data for training the machine. However, Lee teaches a handheld device comprising: one or more sensors configured to detect one or more locating signals initiated externally to the handheld device; wherein at least one sensor of the one or more sensors comprises an optoelectronic sensor configured to capture the location signal ([0036] via “In this embodiment, the remote controller 10 has an internal structure shown schematically in FIG. 2A, including an optical sensor 11, ….”), ([0037] via “Referring to FIGS. 1, 2A, and 2B, the system according to this embodiment operates in the following way: First, the light source 30 emits a light signal, such as an infrared (IR) signal. 
Next, the optical sensor 11 senses the light signal to detect a relative position of the optical sensor to the light source, that is, a relative position of the remote controller to the light source, wherein the positional information can be two-dimensional information (x and y directions in FIG. 1) or three-dimensional information (x, y, and z directions in FIG. 1). The optical sensor 11 generates a first electronic signal representative of the positional information according to the sensed light signal, and transmits the first electronic signal to the first processor and transceiver 12.”), (Note: See Figures 1 and 2A-B of Lee as well. The Examiner interprets the optical sensor 11 of Lee as the optoelectronic sensor.); and one or more processors configured to determine location information of the handheld device based on the one or more locating signals ([0037] via “The optical sensor 11 generates a first electronic signal representative of the positional information according to the sensed light signal, and transmits the first electronic signal to the first processor and transceiver 12. The first processor and transceiver 12 processes the first electronic signal to convert it to a first wireless signal RF1 and transmits the wireless signal RF1 to the transceiver device 20 via the first antenna 13. In the transceiver device 20, after the second antenna 23 receives the first wireless signal RF1 from the remote controller 10, the second processor and transceiver 22 converts it to a second electronic signal, which is transmitted to the video display host 40 via a second and a first connectors 21 and 41.”). Further, Dai teaches a handheld device comprising: one or more transmitters for transmitting the location information as training data for training the machine (Page 8 line 33 – Page 9 line 11, where “According to a further variant of the method the robot program comprises data describing a desired movement path of the robot arm. 
It is also possible to teach position vectors describing the course of the movement path by use of the robot teaching system according to the invention in the case that it has an integrated position determination device. Thus the robot teaching device can be held at desired locations on a reference workpiece wherein a force might be applied thereon through the mechanical interaction tool and wherein not only data values describing the applied force but also data values describing the determined position and/or orientation of the hand-held teaching device are provided by the communication interface to the computing unit. Thus according to a further embodiment of the invention the hand held teaching device further comprises means for determining its position and/or orientation wherein the desired movement path of the robot arm is teached by use of the hand held teaching device.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Lee wherein the handheld device comprises: one or more sensors configured to detect one or more locating signals initiated externally to the handheld device; wherein at least one sensor of the one or more sensors comprises an optoelectronic sensor configured to capture the location signal; and one or more processors configured to determine location information of the handheld device based on the one or more locating signals. Doing so allows for the determination of the position of the handheld device in three dimensions from a set point, as stated above by Lee in paragraph [0037], which also allows additional parameters of the handheld device to be determined, as stated by Lee ([0039] via “As mentioned earlier, the positional information of the remote controller relative to the light source can be obtained in the present invention. 
Hence, a displacement, velocity, or acceleration of the movement of the remote controller 10 can be calculated according to the positional information at different time points, and a time difference therebetween.”). In addition, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Dai wherein the handheld device comprises: one or more transmitters for transmitting the location information as training data for training the machine. Doing so teaches the machine the desired positions and orientations to complete its movement path based on location demonstration from the handheld device, as stated above by Dai. Regarding Claim 46, modified reference Aldridge teaches the handheld device of claim 26, but is silent on wherein the one or more sensors are infrared sensors. However, Lee teaches wherein the one or more sensors are infrared sensors ([0050] via “In the embodiments shown in FIGS. 1, 2A, and 2B, it is not required to connect the remote controller 10 to an external image capturing device or to include such additional image capturing device inside the remote controller 10, since the remote controller 10 is provided with the optical sensor 11 already, which can be used for sensing infrared signals and general images. When the baby monitor function is not required, the optical sensor 11 receives light signals through an infrared (IR) pass filter to sense the infrared signal emitted by the light source 30.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Lee wherein the one or more sensors are infrared sensors. 
Doing so determines the position of the handheld device within the operating space from a light source set point, wherein the locating light source is chosen as an example from the finite number of light ranges of the electromagnetic spectrum, as stated by Lee ([0037] via “Referring to FIGS. 1, 2A, and 2B, the system according to this embodiment operates in the following way: First, the light source 30 emits a light signal, such as an infrared (IR) signal. Next, the optical sensor 11 senses the light signal to detect a relative position of the optical sensor to the light source, that is, a relative position of the remote controller to the light source, wherein the positional information can be two-dimensional information (x and y directions in FIG. 1) or three-dimensional information (x, y, and z directions in FIG. 1).”). 11. Claim(s) 27, 29, 36, 37, 40, 41, 43, and 45 is/are rejected under 35 U.S.C. 103 as being unpatentable over Aldridge et al. (US 20180345495 A1 hereinafter Aldridge) in view of Lee et al. (US 20120026408 A1 hereinafter Lee) and Dai et al. (WO 2017036520 A1 hereinafter Dai (Provided by Applicant's IDS)), and further in view of Riedel (DE 102015206575 A1 hereinafter Riedel (Provided by Applicant's IDS, however an English translation was used and is attached herein)). Regarding Claim 27, modified reference Aldridge teaches the handheld device of claim 26, but is silent on the handheld device further comprising an interface configured to implement a function in conjunction with circuitry of the interchangeable attachment. However, Riedel teaches an interface configured to implement a function in conjunction with circuitry of the interchangeable attachment (Page 12 paragraphs 5-6 via “As accessory modules 31 are also force sensors 31b conceivable on the handle, which make it possible to couple with the handle directly on the vehicle and manually move this force-controlled. 
In the 14 are accordingly various accessory modules 31 shown schematically, which can increase the functionality of the control handle for some applications targeted. These accessory modules 31 For example, the front and the bottom could be coupled mechanically and, if necessary, also electrically, to the handle at the top and bottom. … Additional functions could, for example, additional batteries 31c , Sensors for power and / or inclination, cameras 31d , more buttons 31e or input options, such as a space mouse 31f , mechanical coupling elements, such as adjustable angle 31a or custom versions. The accessory modules 31 For example, they could be formed by additional 3D or 6D force sensors, by means of which a mobile robot platform and / or a robot arm, in particular a robot arm, which does not have its own force sensors, can be manually guided, in particular in a compliance control.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Riedel wherein the handheld device further comprises an interface configured to implement a function in conjunction with circuitry of the interchangeable attachment. By connecting the accessory modules to the handheld device, the functionality of the handheld device for controlling different applications increases, as stated above by Riedel. Regarding Claim 29, modified reference Aldridge teaches the handheld device of claim 26, but is silent on the handheld device further comprising a battery. However, Riedel teaches the handheld device further comprising a battery (Page 10 paragraph 1 via “Due to the lack of cable connection, a battery or a battery for powering the operating handle is present in the base body, which can be contact-based or inductively charged when coupled with various coupling partners, such as cell frame or mobile platform.”). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Riedel wherein the handheld device further comprises a battery. Doing so allows for the direct powering of the handheld device without the need for a cable connection, increasing the mobility of the handheld device, as implied above by Riedel. Regarding Claim 36, modified reference Aldridge teaches the handheld device of claim 26, but is silent on wherein the coupling structure is configured to positively couple the interchangeable attachment. However, Riedel teaches wherein the coupling structure is configured to positively couple the interchangeable attachment (Page 4 paragraph 9 via “The plug connection means can form a positive connection means in this respect in cooperation with the mating connector means. This positive connection means can be secured against accidental release with an additional manually operated locking means.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Riedel wherein the coupling structure is configured to positively couple the interchangeable attachment. Doing so prevents an accidental release of the coupling structure from the handheld device, as stated above by Riedel. Regarding Claim 37, modified reference Aldridge teaches the handheld device of claim 27, but is silent on wherein the interface is configured to provide electrical power to the circuit or to communicate with the circuit. However, Riedel teaches wherein the interface is configured to provide electrical power to the circuit or to communicate with the circuit (Page 12 paragraph 6 via “In the 14 are accordingly various accessory modules 31 shown schematically, which can increase the functionality of the control handle for some applications targeted. 
These accessory modules 31 For example, the front and the bottom could be coupled mechanically and, if necessary, also electrically, to the handle at the top and bottom.”), (Note: The Examiner interprets that, since there is an electrical connection between the handheld device and the accessory module, there is some electrical power communication between the two over the interface connection in between.). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Riedel wherein the interface is configured to provide electrical power to the circuit or to communicate with the circuit. Doing so allows for the functionality of the accessory modules to be realized and performed when attached to the handheld device, as stated above by Riedel. Regarding Claim 40, modified reference Aldridge teaches the handheld device of claim 26, but is silent on the handheld device further comprising an additional interchangeable attachment, wherein the additional interchangeable attachment is configured according to a second activity of the machine that is different from a first activity of the at least one activity of the machine, and wherein the additional interchangeable attachment and the interchangeable attachment are configured to be interchangeable with each other. However, Riedel teaches an additional interchangeable attachment, wherein the additional interchangeable attachment is configured according to a second activity of the machine that is different from a first activity of the at least one activity of the machine, and wherein the additional interchangeable attachment and the interchangeable attachment are configured to be interchangeable with each other (Page 12 paragraphs 5-6 via “As accessory modules 31 are also force sensors 31b conceivable on the handle, which make it possible to couple with the handle directly on the vehicle and manually move this force-controlled. 
In the 14 are accordingly various accessory modules 31 shown schematically, which can increase the functionality of the control handle for some applications targeted. These accessory modules 31 For example, the front and the bottom could be coupled mechanically and, if necessary, also electrically, to the handle at the top and bottom. … Additional functions could, for example, additional batteries 31c , Sensors for power and / or inclination, cameras 31d , more buttons 31e or input options, such as a space mouse 31f , mechanical coupling elements, such as adjustable angle 31a or custom versions. The accessory modules 31 For example, they could be formed by additional 3D or 6D force sensors, by means of which a mobile robot platform and / or a robot arm, in particular a robot arm, which does not have its own force sensors, can be manually guided, in particular in a compliance control.”), (Note: The Examiner interprets the plurality of different accessory modules 31a-31f implementing different functions when attached to the handheld device as satisfying the limitations of the claim.). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Riedel wherein the handheld device further comprises an additional interchangeable attachment, wherein the additional interchangeable attachment is configured according to a second activity of the machine that is different from a first activity of the at least one activity of the machine, and wherein the additional interchangeable attachment and the interchangeable attachment are configured to be interchangeable with each other. By connecting different accessory modules to the handheld device with different functionalities, the functionality of the handheld device for controlling different applications increases, as stated above by Riedel. 
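For illustration only (hypothetical names, not part of the record), the interchangeable-attachment arrangement discussed for claim 40, where each attachment configures the handheld device for a different machine activity and any attachment can be swapped for another, can be sketched as:

```python
# Hypothetical sketch of interchangeable attachments in the spirit of
# Riedel's accessory modules 31a-31f. All names are invented.

class Attachment:
    """Base class: each attachment is configured for one machine activity."""
    activity = "none"

    def function(self):
        raise NotImplementedError


class ForceSensorAttachment(Attachment):
    activity = "force-guided teaching"

    def function(self):
        return "report 3D/6D force readings"


class CameraAttachment(Attachment):
    activity = "visual inspection"

    def function(self):
        return "stream images"


class HandheldDevice:
    def __init__(self):
        self.attachment = None

    def couple(self, attachment):
        # Releasable coupling: replacing the current attachment models
        # the attachments being interchangeable with each other.
        self.attachment = attachment

    def perform_activity(self):
        if self.attachment is None:
            return "no attachment coupled"
        return f"{self.attachment.activity}: {self.attachment.function()}"
```

Coupling a `ForceSensorAttachment` and then a `CameraAttachment` to the same `HandheldDevice` yields two different activities from one device, which is the behavior the claim language describes.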
Regarding Claim 41, modified reference Aldridge teaches the handheld device of claim 26, but is silent on wherein the interchangeable attachment comprises circuitry configured to provide a function. However, Riedel teaches wherein the interchangeable attachment comprises circuitry configured to provide a function (Page 12 paragraphs 5-6 via “As accessory modules 31 are also force sensors 31b conceivable on the handle, which make it possible to couple with the handle directly on the vehicle and manually move this force-controlled. In the 14 are accordingly various accessory modules 31 shown schematically, which can increase the functionality of the control handle for some applications targeted. These accessory modules 31 For example, the front and the bottom could be coupled mechanically and, if necessary, also electrically, to the handle at the top and bottom. … Additional functions could, for example, additional batteries 31c , Sensors for power and / or inclination, cameras 31d , more buttons 31e or input options, such as a space mouse 31f , mechanical coupling elements, such as adjustable angle 31a or custom versions. The accessory modules 31 For example, they could be formed by additional 3D or 6D force sensors, by means of which a mobile robot platform and / or a robot arm, in particular a robot arm, which does not have its own force sensors, can be manually guided, in particular in a compliance control.”), (Note: The Examiner interprets the circuitry as a sensor, as likewise stated on at least page 52 line 32 – page 53 line 14 of the specification of the instant application for circuitry 852.). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Riedel wherein the interchangeable attachment comprises circuitry configured to provide a function. 
By connecting different accessory modules to the handheld device with different sensing functionalities, the functionality of the handheld device for controlling different applications increases, as stated above by Riedel. Regarding Claim 43, modified reference Aldridge teaches the handheld device of claim 42, but is silent on the handheld device further comprising an interface configured to, together with a circuit of the tool, implement the function. However, Riedel teaches wherein the handheld device further comprises an interface configured to, together with a circuit of the tool, implement the function (Page 12 paragraphs 5-6 via “As accessory modules 31 are also force sensors 31b conceivable on the handle, which make it possible to couple with the handle directly on the vehicle and manually move this force-controlled. In the 14 are accordingly various accessory modules 31 shown schematically, which can increase the functionality of the control handle for some applications targeted. These accessory modules 31 For example, the front and the bottom could be coupled mechanically and, if necessary, also electrically, to the handle at the top and bottom. … Additional functions could, for example, additional batteries 31c , Sensors for power and / or inclination, cameras 31d , more buttons 31e or input options, such as a space mouse 31f , mechanical coupling elements, such as adjustable angle 31a or custom versions. The accessory modules 31 For example, they could be formed by additional 3D or 6D force sensors, by means of which a mobile robot platform and / or a robot arm, in particular a robot arm, which does not have its own force sensors, can be manually guided, in particular in a compliance control.”). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Riedel wherein the handheld device further comprises an interface configured to, together with a circuit of the tool, implement the function. By connecting the accessory modules to the handheld device, the functionality of the handheld device for controlling different applications increases, as stated above by Riedel. Regarding Claim 45, modified reference Aldridge teaches the system of claim 44, but is silent on wherein the handheld device further comprises an interface configured to implement a function together with a circuit of the interchangeably attachable tool. However, Riedel teaches wherein the handheld device further comprises an interface configured to implement a function together with a circuit of the interchangeably attachable tool (Page 12 paragraphs 5-6 via “As accessory modules 31 are also force sensors 31b conceivable on the handle, which make it possible to couple with the handle directly on the vehicle and manually move this force-controlled. In the 14 are accordingly various accessory modules 31 shown schematically, which can increase the functionality of the control handle for some applications targeted. These accessory modules 31 For example, the front and the bottom could be coupled mechanically and, if necessary, also electrically, to the handle at the top and bottom. … Additional functions could, for example, additional batteries 31c , Sensors for power and / or inclination, cameras 31d , more buttons 31e or input options, such as a space mouse 31f , mechanical coupling elements, such as adjustable angle 31a or custom versions. 
The accessory modules 31 For example, they could be formed by additional 3D or 6D force sensors, by means of which a mobile robot platform and / or a robot arm, in particular a robot arm, which does not have its own force sensors, can be manually guided, in particular in a compliance control.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Riedel wherein the handheld device further comprises an interface configured to implement a function together with a circuit of the interchangeably attachable tool. By connecting the accessory modules to the handheld device, the functionality of the handheld device for controlling different applications increases, as stated above by Riedel. Examiner’s Note 12. The Examiner has cited particular paragraphs or columns and line numbers in the references applied to the claims above for the convenience of the Applicant. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested of the Applicant in preparing responses, to fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner. See MPEP 2141.02 [R-07.2015] VI. A prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP §2123. Conclusion 13. Any inquiry concerning this communication or earlier communications from the examiner should be directed to BYRON X KASPER whose telephone number is (571)272-3895. 
The examiner can normally be reached Monday - Friday 8 am - 5 pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Mott can be reached on (571) 270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /BYRON XAVIER KASPER/Examiner, Art Unit 3657 /ADAM R MOTT/Supervisory Patent Examiner, Art Unit 3657

Prosecution Timeline

Jun 14, 2022
Application Filed
Jul 01, 2024
Non-Final Rejection — §103
Oct 08, 2024
Response Filed
Oct 30, 2024
Final Rejection — §103
Jan 06, 2025
Response after Non-Final Action
Feb 03, 2025
Request for Continued Examination
Feb 04, 2025
Response after Non-Final Action
Apr 10, 2025
Non-Final Rejection — §103
Jun 25, 2025
Response Filed
Oct 29, 2025
Final Rejection — §103
Jan 06, 2026
Interview Requested
Jan 15, 2026
Applicant Interview (Telephonic)
Jan 15, 2026
Examiner Interview Summary
Feb 05, 2026
Request for Continued Examination
Feb 20, 2026
Response after Non-Final Action
Mar 03, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594964
METHOD OF AND SYSTEM FOR GENERATING REFERENCE PATH OF SELF DRIVING CAR (SDC)
2y 5m to grant Granted Apr 07, 2026
Patent 12594137
HARD STOP PROTECTION SYSTEM AND METHOD
2y 5m to grant Granted Apr 07, 2026
Patent 12583101
METHOD FOR OPERATING A MODULAR ROBOT, MODULAR ROBOT, COLLISION AVOIDANCE SYSTEM, AND COMPUTER PROGRAM PRODUCT
2y 5m to grant Granted Mar 24, 2026
Patent 12576529
ROBOT SIMULATION DEVICE
2y 5m to grant Granted Mar 17, 2026
Patent 12564962
ROBOT REMOTE OPERATION CONTROL DEVICE, ROBOT REMOTE OPERATION CONTROL SYSTEM, ROBOT REMOTE OPERATION CONTROL METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM
2y 5m to grant Granted Mar 03, 2026
Based on the 5 most recent grants.

Prosecution Projections

5-6
Expected OA Rounds
70%
Grant Probability
88%
With Interview (+18.4%)
3y 0m
Median Time to Grant
High
PTA Risk
Based on 103 resolved cases by this examiner. Grant probability derived from career allow rate.
