Prosecution Insights
Last updated: April 18, 2026
Application No. 18/931,582

MANIPULATOR SYSTEM, MANAGEMENT DEVICE FOR MANAGING OPERATION STATE OF MANIPULATOR, AND MANAGEMENT METHOD FOR MANAGING OPERATION STATE OF MANIPULATOR

Status: Non-Final OA (§103)
Filed: Oct 30, 2024
Examiner: LE, TIEN MINH
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Hitachi, Ltd.
OA Round: 1 (Non-Final)

Grant Probability: 68% (Favorable)
Expected OA Rounds: 1-2
Expected Time to Grant: 2y 12m
Grant Probability with Interview: 92%

Examiner Intelligence

Career Allow Rate: 68% (55 granted / 81 resolved; +15.9% vs TC avg) — above average
Interview Lift: +23.8% across resolved cases with interview — strong
Avg Prosecution (typical timeline): 2y 12m
Currently Pending: 30
Total Applications (career history): 111, across all art units
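The headline figures in this panel are simple ratios over the examiner's resolved cases. The short sketch below (variable names are ours; the tool's exact no-interview baseline for the lift figure is not disclosed) reproduces them from the raw counts shown above:

```python
# Reproduce the examiner panel's headline figures from the raw counts above.
granted = 55    # "55 granted"
resolved = 81   # "81 resolved"

allow_rate = granted / resolved          # Career Allow Rate
print(f"Career allow rate: {allow_rate:.1%}")   # 67.9%, displayed as 68%

# Interview lift: with-interview allowance (92%) minus the overall rate.
# The panel reports +23.8%, so its internal baseline differs slightly from
# the rounded career rate used here.
lift = 0.92 - allow_rate
print(f"Interview lift:    {lift:+.1%}")
```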

Statute-Specific Performance

§101: 8.1% (-31.9% vs TC avg)
§103: 51.7% (+11.7% vs TC avg)
§102: 18.5% (-21.5% vs TC avg)
§112: 18.8% (-21.2% vs TC avg)

Tech Center averages are estimates. Based on career data from 81 resolved cases.
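One sanity check on these rows: all four per-statute deltas point back to a single Tech Center baseline. A few lines of arithmetic on the figures shown (the dict literal merely restates the panel) make this explicit:

```python
# Back out the implied Tech Center average from each statute row above:
# examiner rate minus the stated delta. All four rows imply the same 40% baseline.
panel = {  # statute: (examiner rate %, delta vs TC avg %)
    "§101": (8.1, -31.9),
    "§103": (51.7, +11.7),
    "§102": (18.5, -21.5),
    "§112": (18.8, -21.2),
}

implied_tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in panel.items()}
print(implied_tc_avg)  # every statute backs out to 40.0
```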

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-12 as originally filed are pending and have been considered as follows.

Priority

1. Acknowledgement is made of applicant’s claim for foreign priority application No. JP2023-201902 filed on 11/29/2023.

Information Disclosure Statement

2. The information disclosure statement (IDS) filed on 10/30/2024 is being considered by the examiner.

Claim Objections

3. Claims 6 and 9 are objected to because of the following informalities: In claim 6, the phrase “sequentially real time” should be replaced with “sequentially in real time” or similar language. In claim 9, the phrase “video of a manipulation target object” should be replaced with “video of the manipulation target object” or similar language, since a manipulation target object was introduced earlier in the claim. Appropriate correction is required.

Claim Interpretation

4. The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

5.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

6. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:

“drive unit” in claim 1;
“measurement unit” in claim 1;
“memory unit” in claims 1 and 10;
“reception unit” in claims 1 and 10;
“calculation unit” in claims 1, 4, 6, and 10;
“identification unit” in claims 1, 2, 3, 10, and 11;
“control unit” in claims 2, 3, 7, 8, and 11;
“output unit” in claims 2 and 3;
“information providing unit” in claim 3;
“calculation-use signal generating unit” in claim 5;
“remote control unit” in claim 8;
“observing unit” in claim 9; and
“providing unit” in claim 9.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
A review of the specification shows the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, limitations:

“drive unit” in claim 1 corresponds to “drive unit 1” [0022] and Fig. 1;
“measurement unit” in claim 1 corresponds to “drive unit output angle measuring units 11” [0022] and Fig. 1;
“memory unit” in claims 1 and 10 corresponds to “memory unit 43” [0026] and Fig. 1;
“reception unit” in claims 1 and 10 corresponds to “reception unit 41” [0026] and Fig. 1;
“calculation unit” in claims 1, 4, 6, and 10 corresponds to “calculation unit 42” [0026] and Fig. 1;
“identification unit” in claims 1, 2, 3, 10, and 11 corresponds to “identification unit 44” [0026] and Fig. 1;
“control unit” in claims 2, 3, 7, 8, and 11 corresponds to “control unit 47” [0026] and Fig. 1;
“output unit” in claims 2 and 3 corresponds to “output unit 46” [0026] and Fig. 1;
“information providing unit” in claim 3 corresponds to “information providing unit 100” [0066] and Fig. 6;
“calculation-use signal generating unit” in claim 5 corresponds to “calculation-use signal generating unit 45” [0026] and Fig. 1;
“remote control unit” in claim 8 corresponds to “remote control unit” [0048];
“observing unit” in claim 9 corresponds to “an observation means 7” [0024]; and
“providing unit” in claim 9 corresponds to “information providing unit 100” [0066] and Fig. 6.

Paragraph [0026] indicates that “The computer 4 is a management device that manages an operation state of the manipulator.” Therefore, the examiner is interpreting “measurement unit”, “reception unit”, “calculation unit”, “identification unit”, “control unit”, “output unit”, “information providing unit”, “calculation-use signal generating unit”, “remote control unit”, “observing unit”, and “providing unit” as software modules running on a computer.

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C.
112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 103

7. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

8. Claims 1-5 and 7-12 are rejected under 35 U.S.C. 103 as being unpatentable over Takahashi et al. (US 20230191592, hereinafter Takahashi) in view of Spenninger et al. (US 20220266447, hereinafter Spenninger).

Regarding claim 1, Takahashi teaches a manipulator system provided with a drive unit that drives a manipulator (see at least Fig.
1 and [0048]: hand tool 252) and a measurement unit that measures a position of the manipulator (see at least [0053]: position sensors (angle sensors) 551 to 556; [0171-0172]: distance sensor 902), the manipulator system comprising: a memory unit configured to store a mass of the manipulator before the operation of manipulator and a spring coefficient of the manipulator before the operation of the manipulator (see at least [0061]: “In addition, the control apparatus 300 includes a read only memory (ROM) 302, a random-access memory (RAM) 303, and a hard disk drive (HDD) 304, which serve as storage portions.”; [0077]: “The force control portion 505 receives signals representing the robot model 503 (virtual mass M.sub.ref), the force target value F.sub.ref, the position target value P.sub.ref, a stiffness coefficient (spring coefficient) K.sub.ref, a viscosity coefficient (damper coefficient) D.sub.ref, the current position P(t), and the force F. Then the force control portion 505 uses these values, and calculates torque command values τ.sub.MFref1 to τ.sub.MFref6 for the joints J.sub.1 to J.sub.6.”; [0108]: “In addition, the CPU 301 calculates torques applied to the joints to Jo by the self weight of the robot arm 251. 
Specifically, the CPU 301 calculates the torques by using pre-stored mechanical-model information (the mechanical model includes a model of the end effector attached to the distal end of the robot).”); a reception unit configured to receive an input signal that drives the manipulator (see at least [0083]: “First, an operator inputs the force target value F.sub.ref and the position target value P.sub.ref into the teaching pendant 400 (S1).”), and an output signal of the manipulator that is operated in response to the input signal (see at least [0085]: “The switch control portions 511 to 516 output the torque command values (force) τ.sub.MFref1 to τ.sub.MFref6 to the motor control portions 531 to 536, as the torque command values τ.sub.Mref1 to τ.sub.Mref6 (S3).”); a calculation unit configured to calculate a force of the manipulator during the operation of the manipulator (see at least [0110]: “Then, the CPU 301 detects the external force applied to the first to the sixth links 10 to 15 or the distal-end flange 16 (that is, an operation performed by a person), depending on the detection results from the torque sensors 541 to 546 of the joints J.sub.1 to J.sub.6 (S21).”; [0127]: “In addition, the torque sensor 542 of the joint J.sub.2 that detects the base-end-side external force detects external force applied in a direction in which the self weight of the robot 200 is supported.”) and a spring coefficient of the manipulator during the operation of the manipulator using the input signal and output signal (see at least [0113]: “If it is detected that a person is contacting a robot at a plurality of (two) positions, that is, if the person is holding the robot with both hands (S22: Yes), then the CPU 301 changes the above-described damper coefficient D and the spring coefficient K, which are force control parameters (S23) (second resistance control).”); and an identification unit configured to identify an operation state of the manipulator by comparing the force of the 
manipulator before the operation of manipulator and the spring coefficient of the manipulator before the operation of the manipulator both of which the memory unit stores (see at least [0110]: “If the external force is not detected (S21: No), then the CPU 301 checks whether the direct teach is being continued (S24).”; [0114]: “That is, the CPU 301 performs the force control on the electric motors 211 to 216 of the joints J.sub.1 to J.sub.6, and sets the resistance value of each of the joints J.sub.1 to J.sub.6 so that the resistance value is a second resistance value that is larger than the first resistance value, and that is produced in accordance with the damper coefficient D and the spring coefficient K that have been changed from the default values.”), and the force of the manipulator during the operation of the manipulator and the spring coefficient of the manipulator during the operation of the manipulator both of which the calculation unit calculates with each other, respectively (see at least [0141]: “Then, the CPU 301 determines whether a person is contacting the robot arm at a plurality of positions, depending on the contact detection results from the contact sensors 710 to 716 (S29). That is, the CPU 301 determines which of one hand or both hands an operator is holding the robot arm with. If a plurality of (two) contact positions is detected, that is, if the robot arm is being held with both hands (S29: Yes), then the CPU 301 changes the damper coefficient D and the spring coefficient K, which are force control parameters, to a damper coefficient and a spring coefficient used for a case where an operator holds the robot arm at a plurality of positions (S31). That is, if an operator is holding the robot arm with both hands, the CPU 301 changes the force control parameters so that the resistance value of the joints J.sub.1 to J.sub.6 is a second resistance value. 
As a result, the joints J.sub.1 to J.sub.6 are made difficult to move, so that the first to the sixth links 10 to 15 are made difficult to move. Then the CPU 301 resets the counter N (S32), and returns to Step S27.”).

Takahashi fails to explicitly teach calculating a mass of the manipulator during the operation of the manipulator and identifying an operation state of the manipulator by comparing the mass of the manipulator before and during the operation of the manipulator. However, Spenninger teaches a method and system for operating a robot manipulator that calculates a mass of a manipulator during operation of the manipulator and identifies an operation state of the manipulator by comparing the mass of the manipulator before and during the operation of the manipulator (see at least Figs. 1-2 and [0020]: “According to a further advantageous embodiment, the weight force of the mass of the load is ascertained by static or dynamic system identification.”; [0038]: “The weight force of the mass of the load 5 is ascertained by static system identification. That is to say, after the load 5 is arranged by a user on the end effector 3, the torque sensors arranged in the joints of the robot manipulator 1 detect a torque and the mass of the load 5 is obtained via the current joint angle of the robot manipulator 1 from the known mass distribution of the robot manipulator 1 and the end effector 3.”; [0039]: “Due to the completely known mass distribution of all elements of the robot manipulator 1 including the load 5, a torque on the base of the robot manipulator 1 is known by way of the integral of all mass elements over the radii. The maximum permissible workspace and the maximum permissible velocity and acceleration of the end effector 3 are now generated by a gradient-based search method.
In the gradient-based method, further search points are ascertained around a starting point at a certain distance in the target function to comply with the predetermined metric, namely the limiting value in the torque on the base of the robot manipulator 1.”). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Takahashi to incorporate the teachings of Spenninger and provide a means to calculate a mass of a manipulator during operation of the manipulator and identify an operation state of the manipulator by comparing the mass of the manipulator before and during the operation of the manipulator, with a reasonable expectation of success, in order to recognize a maximum permissible workspace and maximum permissible velocity and acceleration of an end effector [0039]. Regarding claim 2, modified Takahashi teaches the limitation of claim 1. Takahashi further teaches a control unit configured to decide a control mode of the manipulator using the operation state that the identification unit identifies (see at least [0141]: “Then, the CPU 301 determines whether a person is contacting the robot arm at a plurality of positions, depending on the contact detection results from the contact sensors 710 to 716 (S29). That is, the CPU 301 determines which of one hand or both hands an operator is holding the robot arm with.”); and an output unit configured to output the control mode of the manipulator to the manipulator in response to the operation state of the manipulator that the identification unit identifies (see at least [0141]: “That is, the CPU 301 determines which of one hand or both hands an operator is holding the robot arm with. 
If a plurality of (two) contact positions is detected, that is, if the robot arm is being held with both hands (S29: Yes), then the CPU 301 changes the damper coefficient D and the spring coefficient K, which are force control parameters, to a damper coefficient and a spring coefficient used for a case where an operator holds the robot arm at a plurality of positions (S31).”). Regarding claim 3, modified Takahashi teaches the limitation of claim 1. Takahashi further teaches a control unit configured to decide information to be provided to a user using the operation state of the manipulator that the identification unit identifies (see at least [0141]: “That is, if an operator is holding the robot arm with both hands, the CPU 301 changes the force control parameters so that the resistance value of the joints J.sub.1 to J.sub.6 is a second resistance value. As a result, the joints J.sub.1 to J.sub.6 are made difficult to move, so that the first to the sixth links 10 to 15 are made difficult to move. Then the CPU 301 resets the counter N (S32), and returns to Step S27.”); and an output unit configured to output the information to be provided to an information providing unit of the user in response to the operation state of the manipulator that the identification unit identifies (see at least Figs.28-29 and [0198]: “In this manner, the image processing portion performs a known edge-extraction process on an image captured by the image capture apparatus 500, and can easily obtain a rough position (or rough positions) of a hand (or hands) of a user in an image, as illustrated in FIGS. 29A and 29B. As illustrated in FIGS. 29A and 29B, an image in which a rough position (or positions) of a hand (or hands) of a user is indicated by a marker (or markers) 501 may be displayed on the monitor 321 in the direct teach for the robot arm 251. 
Thus, a contact position (or contact positions) at which a user is contacting the robot arm 251 can be obtained, based on the rough position (positions) of the hand (hands) of the user and the number of contact positions.”). Regarding claim 4, modified Takahashi teaches the limitation of claim 1. Takahashi further teaches wherein the calculation unit is configured to calculate a parameter of a physical model of the manipulator system to which modeling is applied preliminarily using the input signal that drives the manipulator and the output signal of the manipulator (see at least [0102]: “If the robot arm is operated with a human hand, force is produced and detected. The force target value produced in accordance with the force is determined by parameters (hereinafter referred to as force control parameters) of force control, which are set in advance. Thus, regardless of which of one hand and both hands the robot arm is operated with, the same force target value is produced if the force produced by the operation with a human hand (or human hands) is the same, and if the force control parameters are not changed.”; [0107]: “Next, the control performed in the direct teach of the first embodiment will be described with reference to FIG. 9A. If the direct teach is started by, the control apparatus 300, the CPU 301 reads default values of various force-control parameters (S19). Note that the default values of various parameters, read in S19, are set in advance by a user.”). Regarding claim 5, modified Takahashi teaches the limitation of claim 1. 
Takahashi further teaches further comprising a calculation-use signal generating unit configured to output a signal for performing calculation of the spring coefficient and the force in the calculation unit (see at least [0141]: “If a plurality of (two) contact positions is detected, that is, if the robot arm is being held with both hands (S29: Yes), then the CPU 301 changes the damper coefficient D and the spring coefficient K, which are force control parameters, to a damper coefficient and a spring coefficient used for a case where an operator holds the robot arm at a plurality of positions (S31).”). Takahashi fails to explicitly teach output a signal for performing calculation of the mass in the calculation unit. However, Spenninger teaches a method and system for operating a robot manipulator that outputs a signal for performing calculation of a mass in a calculation unit (see at least Figs. 1-2 and [0020]: “According to a further advantageous embodiment, the weight force of the mass of the load is ascertained by static or dynamic system identification.”; [0038]: “The weight force of the mass of the load 5 is ascertained by static system identification. That is to say, after the load 5 is arranged by a user on the end effector 3, the torque sensors arranged in the joints of the robot manipulator 1 detect a torque and the mass of the load 5 is obtained via the current joint angle of the robot manipulator 1 from the known mass distribution of the robot manipulator 1 and the end effector 3.”; [0039]: “Due to the completely known mass distribution of all elements of the robot manipulator 1 including the load 5, a torque on the base of the robot manipulator 1 is known by way of the integral of all mass elements over the radii. The maximum permissible workspace and the maximum permissible velocity and acceleration of the end effector 3 are now generated by a gradient-based search method. 
In the gradient-based method, further search points are ascertained around a starting point at a certain distance in the target function to comply with the predetermined metric, namely the limiting value in the torque on the base of the robot manipulator 1.”).

Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Takahashi to incorporate the teachings of Spenninger and provide a means to output a signal for performing calculation of a mass in a calculation unit, with a reasonable expectation of success, in order to recognize a maximum permissible workspace and maximum permissible velocity and acceleration of an end effector [0039].

Regarding claim 7, modified Takahashi teaches the limitation of claim 1. Takahashi further teaches a control unit configured to calculate the input signal for operating the manipulator to a desired position based on a measured position of the manipulator (see at least [0096]: “By driving the electric motors 211 to 216 in accordance with the above-described flow, the hand position P of the robot 200 can be controlled so that the hand position P follows the desired position target value P.sub.ref.”).

Regarding claim 8, modified Takahashi teaches the limitation of claim 1. Takahashi further teaches a control unit configured to calculate the input signal for operating the manipulator to a desired position based on a measured position of the manipulator (see at least [0096]: “By driving the electric motors 211 to 216 in accordance with the above-described flow, the hand position P of the robot 200 can be controlled so that the hand position P follows the desired position target value P.sub.ref.”); and a remote control unit configured to receive a manipulation performed by a human (see at least [0162]: “FIG. 23A illustrates a teaching pendant 410. FIG.
23B illustrates a state where an operator performs the direct teach on the robot arm 251 while holding the teaching pendant 410. FIG. 24 is a flowchart illustrating the control performed in a direct teach mode of the sixth embodiment.”), wherein the control unit is configured to calculate an instruction of a position or a force that the manipulator outputs based on an operation of the remote control unit (see at least [0165]: “If the enable button of the teaching pendant 410 is ON in Step S41 (S41: Yes), then the CPU 301 proceeds to Step S21. Then, the CPU 301 detects the external force applied to the first to the sixth links 10 to 15 or the distal-end flange 16 (that is, an operation performed by a person), depending on the detection results from the torque sensors 541 to 546 of the joints J.sub.1 to J.sub.6. If the external force is not detected (S21: No), then the CPU 301 proceeds to the above-described Step S24.”; [0166]: “If the external force (or an operation performed by a person) is detected (S21: Yes), then the CPU 301 proceeds to Step S22, and determines whether a person is contacting the robot arm at a plurality of (two) positions, depending on the contact detection results from the contact sensors 710 to 716 or on the output values from the torque sensors 541 to 546. If a person is contacting the robot arm at a plurality of positions (S22: Yes), then the CPU 301 returns to Step S41.”). Regarding claim 9, modified Takahashi teaches the limitation of claim 1. Takahashi further teaches an observing unit configured to observe a manipulation target object of the manipulator (see at least [0196]: “For example, as illustrated in FIG. 28, contact positions at which a user is contacting the robot arm 251 may be obtained by using an image capture apparatus 500 installed at a position at which the image capture apparatus 500 can capture an image of the robot arm 251. In FIG. 
28, the image capture apparatus 500 is connected to the control apparatus 300; and the control apparatus 300 includes an image processing portion that performs image processing on an image captured by the image capture apparatus 500.”); and a providing unit configured to be capable of providing a user with an image or a video of a manipulation target object acquired by the observing unit (see at least Fig. 19 and [0155]: “In a case where the hand tool 252 is operated (for example, the hand tool 252 is opened and closed), a work plane (or the position and posture of the hand tool 252 in some cases) on which the hand tool 252 performs work on a workpiece 910 (which is an object on which the work is to be performed) is often fixed.”; Figs. 28-29B and [0198]: “In this manner, the image processing portion performs a known edge-extraction process on an image captured by the image capture apparatus 500, and can easily obtain a rough position (or rough positions) of a hand (or hands) of a user in an image, as illustrated in FIGS. 29A and 29B. As illustrated in FIGS. 29A and 29B, an image in which a rough position (or positions) of a hand (or hands) of a user is indicated by a marker (or markers) 501 may be displayed on the monitor 321 in the direct teach for the robot arm 251.”). Regarding claim 10, Takahashi teaches a management device that manages an operation state of a manipulator (see at least Fig. 
1), the management device comprising: a memory unit configured to store a mass of the manipulator before the operation of manipulator and a spring coefficient of the manipulator before the operation of manipulator (see at least [0061]: “In addition, the control apparatus 300 includes a read only memory (ROM) 302, a random-access memory (RAM) 303, and a hard disk drive (HDD) 304, which serve as storage portions.”; [0077]: “The force control portion 505 receives signals representing the robot model 503 (virtual mass M.sub.ref), the force target value F.sub.ref, the position target value P.sub.ref, a stiffness coefficient (spring coefficient) K.sub.ref, a viscosity coefficient (damper coefficient) D.sub.ref, the current position P(t), and the force F. Then the force control portion 505 uses these values, and calculates torque command values τ.sub.MFref1 to τ.sub.MFref6 for the joints J.sub.1 to J.sub.6.”; [0108]: “In addition, the CPU 301 calculates torques applied to the joints to Jo by the self weight of the robot arm 251. 
Specifically, the CPU 301 calculates the torques by using pre-stored mechanical-model information (the mechanical model includes a model of the end effector attached to the distal end of the robot).”); a reception unit configured to receive an input signal that drives the manipulator (see at least [0083]: “First, an operator inputs the force target value F.sub.ref and the position target value P.sub.ref into the teaching pendant 400 (S1).”), and an output signal of the manipulator that is operated in response to the input signal (see at least [0085]: “The switch control portions 511 to 516 output the torque command values (force) τ.sub.MFref1 to τ.sub.MFref6 to the motor control portions 531 to 536, as the torque command values τ.sub.Mref1 to τ.sub.Mref6 (S3).”); a calculation unit configured to calculate a force of the manipulator during an operation of the manipulator (see at least [0110]: “Then, the CPU 301 detects the external force applied to the first to the sixth links 10 to 15 or the distal-end flange 16 (that is, an operation performed by a person), depending on the detection results from the torque sensors 541 to 546 of the joints J.sub.1 to J.sub.6 (S21).”; [0127]: “In addition, the torque sensor 542 of the joint J.sub.2 that detects the base-end-side external force detects external force applied in a direction in which the self weight of the robot 200 is supported.”) and a spring coefficient of the manipulator during an operation of the manipulator using the input signal and the output signal (see at least [0113]: “If it is detected that a person is contacting a robot at a plurality of (two) positions, that is, if the person is holding the robot with both hands (S22: Yes), then the CPU 301 changes the above-described damper coefficient D and the spring coefficient K, which are force control parameters (S23) (second resistance control).”); and an identification unit
configured to identify an operation state of the manipulator by comparing the force of the manipulator before the operation of the manipulator and the spring coefficient of the manipulator before the operation of the manipulator, both of which the memory unit stores (see at least [0110]: “If the external force is not detected (S21: No), then the CPU 301 checks whether the direct teach is being continued (S24).”; [0114]: “That is, the CPU 301 performs the force control on the electric motors 211 to 216 of the joints J.sub.1 to J.sub.6, and sets the resistance value of each of the joints J.sub.1 to J.sub.6 so that the resistance value is a second resistance value that is larger than the first resistance value, and that is produced in accordance with the damper coefficient D and the spring coefficient K that have been changed from the default values.”), and the force of the manipulator during the operation of the manipulator and the spring coefficient of the manipulator during the operation of the manipulator, both of which the calculation unit calculates, with each other, respectively (see at least [0141]: “Then, the CPU 301 determines whether a person is contacting the robot arm at a plurality of positions, depending on the contact detection results from the contact sensors 710 to 716 (S29). That is, the CPU 301 determines which of one hand or both hands an operator is holding the robot arm with. If a plurality of (two) contact positions is detected, that is, if the robot arm is being held with both hands (S29: Yes), then the CPU 301 changes the damper coefficient D and the spring coefficient K, which are force control parameters, to a damper coefficient and a spring coefficient used for a case where an operator holds the robot arm at a plurality of positions (S31). That is, if an operator is holding the robot arm with both hands, the CPU 301 changes the force control parameters so that the resistance value of the joints J.sub.1 to J.sub.6 is a second resistance value.
As a result, the joints J.sub.1 to J.sub.6 are made difficult to move, so that the first to the sixth links 10 to 15 are made difficult to move. Then the CPU 301 resets the counter N (S32), and returns to Step S27.”).

Takahashi fails to explicitly teach calculating a mass of the manipulator during the operation of the manipulator and identifying an operation state of the manipulator by comparing the mass of the manipulator before and during the operation of the manipulator. However, Spenninger teaches a method and system for operating a robot manipulator that calculates a mass of a manipulator during operation of the manipulator and identifies an operation state of the manipulator by comparing the mass of the manipulator before and during the operation of the manipulator (see at least Figs. 1-2 and [0020]: “According to a further advantageous embodiment, the weight force of the mass of the load is ascertained by static or dynamic system identification.”; [0038]: “The weight force of the mass of the load 5 is ascertained by static system identification. That is to say, after the load 5 is arranged by a user on the end effector 3, the torque sensors arranged in the joints of the robot manipulator 1 detect a torque and the mass of the load 5 is obtained via the current joint angle of the robot manipulator 1 from the known mass distribution of the robot manipulator 1 and the end effector 3.”; [0039]: “Due to the completely known mass distribution of all elements of the robot manipulator 1 including the load 5, a torque on the base of the robot manipulator 1 is known by way of the integral of all mass elements over the radii. The maximum permissible workspace and the maximum permissible velocity and acceleration of the end effector 3 are now generated by a gradient-based search method.
In the gradient-based method, further search points are ascertained around a starting point at a certain distance in the target function to comply with the predetermined metric, namely the limiting value in the torque on the base of the robot manipulator 1.”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Takahashi to incorporate the teachings of Spenninger and provide a means to calculate a mass of a manipulator during operation of the manipulator and identify an operation state of the manipulator by comparing the mass of the manipulator before and during the operation of the manipulator, with a reasonable expectation of success, in order to recognize a maximum permissible workspace and maximum permissible velocity and acceleration of an end effector ([0039]).

Regarding claim 11, modified Takahashi teaches the limitation of claim 1. Takahashi further teaches a control unit configured to decide a control mode of the manipulator or information to be provided to a user using the operation state of the manipulator that the identification unit identifies (see at least [0141]: “Then, the CPU 301 determines whether a person is contacting the robot arm at a plurality of positions, depending on the contact detection results from the contact sensors 710 to 716 (S29). That is, the CPU 301 determines which of one hand or both hands an operator is holding the robot arm with.”).

Regarding claim 12, Takahashi teaches a management method for managing an operation state of a manipulator (see at least Fig.
1), the management method comprising the steps of: storing a mass of the manipulator before the operation of the manipulator and a spring coefficient of the manipulator before the operation of the manipulator (see at least [0061]: “In addition, the control apparatus 300 includes a read only memory (ROM) 302, a random-access memory (RAM) 303, and a hard disk drive (HDD) 304, which serve as storage portions.”; [0077]: “The force control portion 505 receives signals representing the robot model 503 (virtual mass M.sub.ref), the force target value F.sub.ref, the position target value P.sub.ref, a stiffness coefficient (spring coefficient) K.sub.ref, a viscosity coefficient (damper coefficient) D.sub.ref, the current position P(t), and the force F. Then the force control portion 505 uses these values, and calculates torque command values τ.sub.MFref1 to τ.sub.MFref6 for the joints J.sub.1 to J.sub.6.”; [0108]: “In addition, the CPU 301 calculates torques applied to the joints J.sub.1 to J.sub.6 by the self weight of the robot arm 251.
Specifically, the CPU 301 calculates the torques by using pre-stored mechanical-model information (the mechanical model includes a model of the end effector attached to the distal end of the robot).”); receiving an input signal that drives the manipulator (see at least [0083]: “First, an operator inputs the force target value F.sub.ref and the position target value P.sub.ref into the teaching pendant 400 (S1).”), and an output signal of the manipulator that is operated in response to the input signal (see at least [0085]: “The switch control portions 511 to 516 output the torque command values (force) τ.sub.MFref1 to τ.sub.MFref6 to the motor control portions 531 to 536, as the torque command values τ.sub.Mref1 to τ.sub.Mref6 (S3).”); calculating a force of the manipulator during an operation of the manipulator (see at least [0110]: “Then, the CPU 301 detects the external force applied to the first to the sixth links 10 to 15 or the distal-end flange 16 (that is, an operation performed by a person), depending on the detection results from the torque sensors 541 to 546 of the joints J.sub.1 to J.sub.6 (S21).”; [0127]: “In addition, the torque sensor 542 of the joint J.sub.2 that detects the base-end-side external force detects external force applied in a direction in which the self weight of the robot 200 is supported.”) and a spring coefficient of the manipulator during an operation of the manipulator using the input signal and the output signal (see at least [0113]: “If it is detected that a person is contacting a robot at a plurality of (two) positions, that is, if the person is holding the robot with both hands (S22: Yes), then the CPU 301 changes the above-described damper coefficient D and the spring coefficient K, which are force control parameters (S23) (second resistance control).”); and identifying an operation state of the manipulator by comparing the force of the manipulator before the operation of the manipulator and the spring coefficient of the
manipulator before the operation of the manipulator, both of which are stored (see at least [0110]: “If the external force is not detected (S21: No), then the CPU 301 checks whether the direct teach is being continued (S24).”; [0114]: “That is, the CPU 301 performs the force control on the electric motors 211 to 216 of the joints J.sub.1 to J.sub.6, and sets the resistance value of each of the joints J.sub.1 to J.sub.6 so that the resistance value is a second resistance value that is larger than the first resistance value, and that is produced in accordance with the damper coefficient D and the spring coefficient K that have been changed from the default values.”), and the force of the manipulator during the operation of the manipulator and the spring coefficient of the manipulator during the operation of the manipulator, both of which are calculated, with each other, respectively (see at least [0141]: “Then, the CPU 301 determines whether a person is contacting the robot arm at a plurality of positions, depending on the contact detection results from the contact sensors 710 to 716 (S29). That is, the CPU 301 determines which of one hand or both hands an operator is holding the robot arm with. If a plurality of (two) contact positions is detected, that is, if the robot arm is being held with both hands (S29: Yes), then the CPU 301 changes the damper coefficient D and the spring coefficient K, which are force control parameters, to a damper coefficient and a spring coefficient used for a case where an operator holds the robot arm at a plurality of positions (S31). That is, if an operator is holding the robot arm with both hands, the CPU 301 changes the force control parameters so that the resistance value of the joints J.sub.1 to J.sub.6 is a second resistance value. As a result, the joints J.sub.1 to J.sub.6 are made difficult to move, so that the first to the sixth links 10 to 15 are made difficult to move.
Then the CPU 301 resets the counter N (S32), and returns to Step S27.”).

Takahashi fails to explicitly teach calculating a mass of the manipulator during the operation of the manipulator and identifying an operation state of the manipulator by comparing the mass of the manipulator before and during the operation of the manipulator. However, Spenninger teaches a method and system for operating a robot manipulator that calculates a mass of a manipulator during operation of the manipulator and identifies an operation state of the manipulator by comparing the mass of the manipulator before and during the operation of the manipulator (see at least Figs. 1-2 and [0020]: “According to a further advantageous embodiment, the weight force of the mass of the load is ascertained by static or dynamic system identification.”; [0038]: “The weight force of the mass of the load 5 is ascertained by static system identification. That is to say, after the load 5 is arranged by a user on the end effector 3, the torque sensors arranged in the joints of the robot manipulator 1 detect a torque and the mass of the load 5 is obtained via the current joint angle of the robot manipulator 1 from the known mass distribution of the robot manipulator 1 and the end effector 3.”; [0039]: “Due to the completely known mass distribution of all elements of the robot manipulator 1 including the load 5, a torque on the base of the robot manipulator 1 is known by way of the integral of all mass elements over the radii. The maximum permissible workspace and the maximum permissible velocity and acceleration of the end effector 3 are now generated by a gradient-based search method. In the gradient-based method, further search points are ascertained around a starting point at a certain distance in the target function to comply with the predetermined metric, namely the limiting value in the torque on the base of the robot manipulator 1.”).
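Spenninger's static system identification (quoted at [0038]) amounts, for a single joint, to subtracting the arm's pre-modeled self-weight torque from the measured joint torque and dividing by the load's moment arm at the current joint angle. The sketch below is a minimal one-joint simplification of that idea; the function name and every numeric value are hypothetical illustrations, not from the reference, which works with the full mass distribution of a multi-joint arm.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def identify_load_mass(tau_measured, tau_arm_self_weight, joint_angle_rad, link_length_m):
    """One-joint analogue of static load identification: the torque
    attributable to the load alone is the measured joint torque minus
    the torque produced by the arm's own (known) weight; dividing by
    the load's horizontal moment arm at the current joint angle gives
    the load mass."""
    lever = link_length_m * math.cos(joint_angle_rad)  # horizontal moment arm
    tau_load = tau_measured - tau_arm_self_weight      # torque from the load alone
    return tau_load / (G * lever)

# Illustrative numbers: a 0.5 m link at 30 degrees, 2.0 N*m of arm
# self-weight torque, 10.5 N*m measured -> roughly a 2 kg load.
mass = identify_load_mass(10.5, 2.0, math.radians(30.0), 0.5)
```

In the multi-joint setting of the reference, the same subtraction is done against the known mass distribution of the whole manipulator and end effector, using the torque sensors in every joint.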
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Takahashi to incorporate the teachings of Spenninger and provide a means to calculate a mass of a manipulator during operation of the manipulator and identify an operation state of the manipulator by comparing the mass of the manipulator before and during the operation of the manipulator, with a reasonable expectation of success, in order to recognize a maximum permissible workspace and maximum permissible velocity and acceleration of an end effector ([0039]).

Claim Rejections - 35 USC § 103

9. Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Takahashi et al. (US 20230191592, hereinafter Takahashi) and Spenninger et al. (US 20220266447, hereinafter Spenninger) in view of Cohen et al. (US 20210290261, hereinafter Cohen).

Regarding claim 6, modified Takahashi teaches the limitation of claim 1. Takahashi further teaches wherein the calculation unit is configured to calculate the spring coefficient and the force during the operation of the manipulator system (see at least [0110]: “Then, the CPU 301 detects the external force applied to the first to the sixth links 10 to 15 or the distal-end flange 16 (that is, an operation performed by a person), depending on the detection results from the torque sensors 541 to 546 of the joints J.sub.1 to J.sub.6 (S21).”; [0113]: “If it is detected that a person is contacting a robot at a plurality of (two) positions, that is, if the person is holding the robot with both hands (S22: Yes), then the CPU 301 changes the above-described damper coefficient D and the spring coefficient K, which are force control parameters (S23) (second resistance control).”; [0127]: “In addition, the torque sensor 542 of the joint J.sub.2 that detects the base-end-side external force detects external force applied in a direction in which the self weight of the robot 200 is supported.”).
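The spring coefficient K and damper coefficient D that Takahashi's [0113] adjusts are the parameters of a standard spring-damper (impedance) control law: the resisting force grows with displacement scaled by K and with velocity scaled by D, so raising both coefficients makes the joints harder to back-drive by hand. A minimal sketch of that relationship (the function and all numeric values are hypothetical illustrations, not Takahashi's actual per-joint torque computation):

```python
def impedance_force(k_spring, d_damper, pos_error, vel_error):
    """Spring-damper impedance law: restoring force is stiffness times
    position error plus damping times velocity error. Larger K and D
    (Takahashi's 'second resistance') yield a larger resisting force
    for the same motion."""
    return k_spring * pos_error + d_damper * vel_error

# Hypothetical default vs. raised (two-handed) force-control parameters,
# evaluated at the same displacement and velocity:
f_default = impedance_force(k_spring=100.0, d_damper=10.0, pos_error=0.02, vel_error=0.1)
f_raised = impedance_force(k_spring=400.0, d_damper=40.0, pos_error=0.02, vel_error=0.1)
assert f_raised > f_default  # larger K and D produce more resistance
```

Takahashi applies this kind of law per joint, converting the result into torque command values for the joint motors.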
Takahashi fails to explicitly teach calculating the mass during the operation of the manipulator system. However, Spenninger teaches a method and system for operating a robot manipulator that calculates a mass during operation of a manipulator system (see at least Figs. 1-2 and [0020]: “According to a further advantageous embodiment, the weight force of the mass of the load is ascertained by static or dynamic system identification.”; [0038]: “The weight force of the mass of the load 5 is ascertained by static system identification. That is to say, after the load 5 is arranged by a user on the end effector 3, the torque sensors arranged in the joints of the robot manipulator 1 detect a torque and the mass of the load 5 is obtained via the current joint angle of the robot manipulator 1 from the known mass distribution of the robot manipulator 1 and the end effector 3.”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Takahashi to incorporate the teachings of Spenninger and provide a means to calculate a mass of a manipulator during operation of the manipulator and identify an operation state of the manipulator by comparing the mass of the manipulator before and during the operation of the manipulator, with a reasonable expectation of success, in order to recognize a maximum permissible workspace and maximum permissible velocity and acceleration of an end effector ([0039]).

The combination of Takahashi and Spenninger fails to explicitly teach calculating data sequentially in real time during operation.
However, Cohen teaches a method and system for a robotically-manipulated device that calculates data sequentially in real time during operation (see at least [0131]: “Such control may be based on a predetermined path within the patient's body, or under real-time control of the physician, e.g., using a joystick to steer the cannula as it lengthens, and a throttle (e.g., a 1-axis joystick) which controls the speed of forward or reverse motion (extension/growth/assembly, or else retraction/disassembly)….Instead of having a large number of sensors and/or a large number of wires, a stylet equipped with sensors or merely electrical contacts can be slid within the controller to measure its shape (or transmit data from fixed sensors) sequentially (or more at a time).”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Takahashi and Spenninger to incorporate the teachings of Cohen and provide a means to calculate data sequentially in real time during operation, with a reasonable expectation of success, in order to have instantaneous feedback and updates in an orderly fashion.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Inagaki et al. (US 20190255709) teaches an apparatus and method for measuring a vibration of an end effector supported by a distal end of a robot that takes into account mass and inertial mass of the robot.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TIEN MINH LE whose telephone number is (571)272-3903. The examiner can normally be reached Monday to Friday (8:30am-5:30pm eastern time). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached on (571)272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /T.M.L./Examiner, Art Unit 3656 /KHOI H TRAN/Supervisory Patent Examiner, Art Unit 3656

Prosecution Timeline

Oct 30, 2024
Application Filed
Apr 02, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12566070: DETERMINATION APPARATUS AND DETERMINATION METHOD
Granted Mar 03, 2026 (2y 5m to grant)

Patent 12528325: A CONTROL SYSTEM FOR A VEHICLE
Granted Jan 20, 2026 (2y 5m to grant)

Patent 12508704: Marker Detection Apparatus and Robot Teaching System
Granted Dec 30, 2025 (2y 5m to grant)

Patent 12509122: VEHICLE SELECTION DEVICE AND VEHICLE SELECTION METHOD
Granted Dec 30, 2025 (2y 5m to grant)

Patent 12466074: IMAGE PROCESSING METHOD, IMAGE PROCESSING APPARATUS, ROBOT-MOUNTED TRANSFER DEVICE, AND SYSTEM
Granted Nov 11, 2025 (2y 5m to grant)
Based on this examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 68%
With Interview: 92% (+23.8%)
Median Time to Grant: 2y 12m
PTA Risk: Low
Based on 81 resolved cases by this examiner. Grant probability derived from career allow rate.
