DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Status of Claims
Claims 1-6 are currently pending and are examined herein.
Response to Amendments / Remarks
Any reference to the prior office action refers to the nonfinal rejection dated 20 October 2025.
The rejection under 35 U.S.C. 112(b) of Claim 3 from the prior office action is withdrawn.
Applicant’s arguments, filed 18 December 2025, with respect to the prior art rejections (35 U.S.C. 102/35 U.S.C. 103) from the prior office action are persuasive. Therefore, the prior art rejections (35 U.S.C. 102/35 U.S.C. 103) from the prior office action have been withdrawn. However, upon further consideration, Applicant's amendment necessitated new ground(s) of rejection under 35 U.S.C. 103 (see below).
Joint Inventors
This application currently names joint inventors. In considering the patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3 and 5-6 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Pub. No. 2021/0394362 (Sodeyama et al., hereinafter, Sodeyama) in view of U.S. Patent No. 10,471,591 (hereinafter, Hinkle), and further in view of U.S. Pub. No. 2014/0365009 (hereinafter, Wettels).
Regarding Claim 1, Sodeyama discloses A control system (see at least FIG. 4: control unit 10) for a manipulator (see at least FIG. 1: autonomous robot 1) comprising an arm (see at least FIG. 1: manipulator 44R), and a hand attached to the arm and configured to grasp an object to be grasped (see at least FIG. 1: hand portion 443 is grasping object B1), the control system for the manipulator further comprising:
at least one memory storing instructions (see at least [0112], FIG. 4, and FIG. 7: ROM 14), and
at least one processor configured to execute the instructions (see at least [0112], FIG. 4, and FIG. 7: CPU 12) to:
recognize a nearby object in an environment around the manipulator… (see at least [0132] and FIG. 9: “the transfer target person recognition unit 51 recognizes a position of the hand H1 of the transfer target person by analyzing, for example, the image data acquired by the camera 19 (Step S108)”);
control a movement of the arm (see at least [0132] and FIG. 9: “the transfer action planning unit 53 creates a posture control plan from the position of the hand H1 of the transfer target person recognized by the transfer target person recognition unit 51 to a determined posture of the transfer of the autonomous robot 1 (Step S109). Then, the physical interaction execution unit 55 controls a posture of the autonomous robot 1 according to the posture control plan created by the transfer action planning unit 53 (Step S110). Note that the posture control of the autonomous robot 1 can include, for example, profile control of the manipulator 44, tilt control of the body portion 42 and the head portion 41, or the like”);
predict a contact state between the grasped object and the nearby object, and adjust a grasping force of the hand to be weakened from a current grasping force to a predetermined value in accordance with the prediction of the contact state, the predetermined value corresponding to a value of the grasping force… measured by a sensor (see at least [0136]-[0137], [0151], and FIG. 12: “in the release operation according to the first example, first, the grip information acquisition unit 56 determines whether or not a component in a direction opposite to gravity, a direction opposite to a rotational moment due to the gravity, or a position direction of the hand H1 of the transfer target person (hereinafter, these directions are referred to as a specific direction) in, for example, the slip amount U or the initial slip amount u detected by the slip sensor 503 or the slip amount U detected by the distance measuring sensor 504 has become larger than zero (Step S121). That is, it is determined whether or not the receiving operation by the transfer target person has started”; “in a case where the slip amount U or the initial slip amount u in the specific direction has becomes larger than zero (YES in Step S121), that is, in a case where the receiving operation by the transfer target person has started, the grip information acquisition unit 56 specifies the transfer direction A1 by analyzing, for example, image data acquired by the camera 504b in the distance measuring sensor 504 (Step S122). The specified transfer direction A1 is input from the grip information acquisition unit 56 to the physical interaction execution unit 55 together with a start trigger of the release operation based on the fact that the slip amount U or the initial slip amount u in the specific direction has become larger than zero. In response to this, the physical interaction execution unit 55 starts the release operation of the object B1 in the transfer direction A1 (Step S123).”; “In Step S324, the physical interaction execution unit 55 executes the grip force control of the hand portion 443, the arm operation control of the arm portion, and the whole body operation control of the autonomous robot 1 blended with one another in the blend ratio specified based on the learned model. Specifically, the physical interaction execution unit 55 controls (grip force control) a change amount in a force in a unit time when a grip force F generated in the hand portion 443 in order to grip the object B1 is decreased for the release, controls a change amount in a position per unit time when a position of the object B1 is moved in the transfer direction A1 by changing a posture of the arm portion gripping the object B1 and arranging the object B1 in a target coordinate space, and moves a position of the object B1 in the transfer direction A1 by changing a position and a posture of the autonomous robot 1 gripping the object B1 and arranging the object B1 in the target coordinate space. At that time, a decrease amount in the grip force F per unit time, a change amount in the posture of the arm portion, and a change amount in the position and the posture of the autonomous robot 1 are blended with one another in the above-mentioned blend ratio so that a moving speed of the object B1 in the transfer direction A1 maintains continuity.”); and
adjust the grasping force of the hand at a predetermined period after a timing at which the contact state is predicted (see at least [0060], [0071]-[0075], [0119]-[0121], [0138]-[0139], and [0143]: “In addition, by continuously measuring the change in the slip amount of the object B1 in the transfer direction A1 (Step S125) after the physical interaction execution unit 55 starts the release operation and controlling the decrease amount in the grip force F generated in the hand portion 443 in a unit time based on the measured change in the slip amount of the object B1 in the transfer direction A1, it is possible to reduce a sudden change in displacement of the object B1 in the transfer direction A1. Therefore, it becomes possible to transfer the object B1 more smoothly, and it becomes possible to reduce an erroneous fall of the object B1 or unnatural vertical displacement of the object B1 within the hand portion 443.”).
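For illustration only, the slip-paced grip-force release described in the passages quoted above may be sketched as follows. The sketch is not drawn from Sodeyama; the hand and slip_sensor interfaces, the control period, and the rate constants are all hypothetical.

# Illustrative sketch only; not Sodeyama's implementation.
# hand, slip_sensor, and all gains/periods below are hypothetical.
def release_when_receiving(hand, slip_sensor, target_force=0.0,
                           period_s=0.05, base_rate=2.0):
    """Begin weakening the grip once slip in the transfer direction is observed,
    then pace the per-period force decrease by the measured slip change."""
    # Wait for the receiving operation: slip in the transfer direction > 0
    while slip_sensor.slip_in_transfer_direction() <= 0.0:
        hand.wait(period_s)
    force = hand.current_grip_force()
    while force > target_force:
        slip_rate = abs(slip_sensor.slip_rate())  # change in slip per second
        # One possible relation: decrease the force more gently while the object
        # is already slipping quickly, to avoid a sudden change in displacement.
        decrement = base_rate * period_s / (1.0 + slip_rate)
        force = max(target_force, force - decrement)
        hand.set_grip_force(force)
        hand.wait(period_s)

Pacing the per-period decrement by the measured slip change is one possible reading of controlling "the decrease amount in the grip force F generated in the hand portion 443 in a unit time based on the measured change in the slip amount."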
Sodeyama does not explicitly disclose recognize a nearby object in an environment around the manipulator by using an object recognition engine including a deep learning model.
Additionally, Sodeyama discloses a tactile sensor (see at least [0089], [0139]-[0141], [0169]-[0172], FIG. 12, and FIG. 21: “The slip sensor 503 is attached to, for example, a part of the hand portion 443 in contact with the object B1 to be gripped, such as the palm, the finger pad, or the like, and detects a magnitude (slip amount) and a direction of shear slip between the object B1 and the part in contact with the object B1. In addition, the slip sensor 503 may detect a magnitude (initial slip amount) and a direction of initial slip generated between the object B1 and the part in contact with the object B1. As the slip sensor 503, for example, a vision sensor that observes deformation of a viscoelastic body attached to the part of the hand portion 443 in contact with the object B1 and having a predetermined shape, a pressure distribution sensor that measures a two-dimensional distribution of a pressure, or the like, can be used.”; “a pressure distribution sensor can also be used as a sensor (corresponding to the slip sensor 503) for measuring initial slip”; when the release operation is completed the grip force F is 0 and the pressure distribution sensor will detect no grasping force from pressure and friction force), but does not explicitly disclose a value of the grasping force obtained from pressure and friction force exerted in a measurement area of a tactile sensor.
Hinkle, in the same field of robot controls, and therefore analogous art, teaches recognize a nearby object in an environment around the manipulator by using an object recognition engine including a deep learning model (see at least column 22 lines 35-50 and column 23 lines 1-11: “In order to identify an actor, robotic device 701 may be configured to fit a digital virtual (i.e., a digital representation approximating an actual/physical bone skeleton) to the sensor data received from the sensor in robotic head 712. Successfully fitting a virtual skeleton to this sensor data may indicate the presence of an actor within the portion of the environment represented by the sensor data. Additionally or alternatively, robotic device 701 may use other computer vision, machine learning, or artificial intelligence techniques to identify actors in the sensor data.”).
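For illustration only, a recognition step of this kind may be sketched with an off-the-shelf deep-learning detector. The sketch is not drawn from Hinkle; the use of torchvision's pretrained Faster R-CNN and the 0.8 confidence threshold are assumptions.

# Illustrative sketch only; not Hinkle's implementation. Uses torchvision's
# pretrained Faster R-CNN as a stand-in for an object recognition engine that
# includes a deep learning model; labels follow the COCO category indices.
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def recognize_nearby_objects(image_chw, score_threshold=0.8):
    """image_chw: float tensor, shape (3, H, W), values in [0, 1]."""
    with torch.no_grad():
        detections = model([image_chw])[0]
    keep = detections["scores"] > score_threshold
    return detections["boxes"][keep], detections["labels"][keep]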
It would have been obvious, before the effective filing date of the claimed invention, with a reasonable expectation of success, to one having ordinary skill in the art, to combine the teachings of Sodeyama with the teachings of Hinkle. Sodeyama recites the object recognition broadly, and one of ordinary skill in the art would have been motivated to substitute the teachings of Hinkle into Sodeyama to provide a specific technique for recognizing the environment around the robot and determining when to initiate the hand-over (see at least Hinkle column 22 and column 23).
Wettels, in the same field of robot controls, and therefore analogous art, teaches a value of the grasping force obtained from pressure and friction force exerted in a measurement area of a tactile sensor (see at least [0011], [0075], [0083]-[0085], [0087], FIG. 2, and FIG. 3: “Sensing elastomer pad 210 can be applied to gripper finger 204 of gripper 200 and configured to observe changes in applied forces over time, and in particular to provide near-real time observation of forces at pad 210. Elastomer pad 210 in combination with sensing electronics as described herein provides information on the contact between finger 204 and object 208. For example, the information may include normal force applied, sheer or slipping force, and data about the orientation of object 208. This information is useful in grasping delicate objects where it is essential to control the gripping force and ascertain adequate grip before attempting to move object 208.”).
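For illustration only, obtaining a grasping-force value from pressure and friction (shear) readings over a tactile measurement area may be sketched as follows. The sketch is not drawn from Wettels; the taxel layout, units, and example values are hypothetical.

# Illustrative sketch only; not Wettels' implementation. All names/values are hypothetical.
import math

def grasp_force_from_taxels(pressures_pa, shear_x_pa, shear_y_pa, taxel_area_m2):
    """Estimate normal and friction (shear) force over a tactile measurement area.

    pressures_pa, shear_x_pa, shear_y_pa: per-taxel readings in pascals.
    taxel_area_m2: area of one sensing element.
    """
    normal = sum(pressures_pa) * taxel_area_m2
    shear_x = sum(shear_x_pa) * taxel_area_m2
    shear_y = sum(shear_y_pa) * taxel_area_m2
    friction = math.hypot(shear_x, shear_y)
    return normal, friction

# Example with four hypothetical taxels of 1e-6 m^2 each:
normal_n, friction_n = grasp_force_from_taxels(
    [2e5, 1.5e5, 1e5, 0.5e5], [1e4, 1e4, 0.0, 0.0], [0.0, 5e3, 5e3, 0.0], 1e-6)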
It would have been obvious, before the effective filing date of the claimed invention, with a reasonable expectation of success, to one having ordinary skill in the art, to combine the teachings of Wettels with the Sodeyama and Hinkle combination, with the motivation of sensing the gripping force at the finger of the gripper with a tactile sensor, as opposed to at a joint as disclosed in Sodeyama, because knowing the force at the finger “is useful in grasping delicate objects” (see at least Wettels [0083]).
Regarding Claim 2, the Sodeyama, Hinkle, and Wettels combination teaches the limitations of Claim 1. Additionally, Sodeyama further discloses wherein the at least one processor is further configured to execute the instructions to: ... adjust the grasping force of the hand so as to be weakened from the current grasping force (see at least [0071] and [0077]-[0078]: “the release operation can be started at an initial stage of the transfer operation, and a fluctuation in a load applied to the object B1 or the grip force of the hand portion 443 can thus be suppressed to the minimum, such that it becomes possible to transfer the object B1 more smoothly.”).
Additionally, Hinkle further teaches wherein the at least one processor is further configured to execute the instructions to: control the arm so as to place the grasped object on the nearby object (see at least FIG. 12A and FIG. 12C: cup 702 is placed on hand 1200 by robotic arm 708); and
determine whether or not a difference between a position coordinate of the nearby object and that of the hand grasping the grasped object becomes equal to or smaller than a predetermined threshold, and when determining that the difference becomes equal to or smaller than the predetermined threshold, adjust the grasping force of the hand so as to be weakened from the current grasping force (see at least column 32 lines 25-65: “When hand 1200 is detected within the second unobstructed portion of the field of view (e.g., the bottom region between lines 1202 and 1204), as illustrated in FIG. 12B, movement of arm 708 may be stopped. Gripper 710 may subsequently be opened, as indicated by lines 1206 in FIG. 12C, to release cup 702 into hand 1200. Detecting hand 1200 in the second portion of the field of view may involve detecting hand 1200 in an area of an image corresponding to the field of view, determining that a distance between the palm of gripper 710 and hand 1200 is smaller than a distance between the palm of gripper 710 and cup 702 (i.e., determining that hand 1200 in under cup 702), or a combination thereof. Notably, by relying on detecting hand 1200 within the second portion of the field of view, robotic device 701 inherently determines that hand 1200 is within a threshold vertical distance of cup 702, since moving hand 1200 down would remove it from within the second portion of the field of view.”; opening the gripper is weakening the grasping force).
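For illustration only, the threshold test on the positional difference described above may be sketched as follows. The sketch is not drawn from Hinkle; the 0.05 m threshold, coordinates, and force values are hypothetical.

# Illustrative sketch only; not Hinkle's implementation. All names/values are hypothetical.
import math

def maybe_release(hand_xyz, gripper_xyz, current_force, threshold_m=0.05,
                  weakened_force=0.0):
    """Weaken the grip when the recipient's hand is within a threshold distance of the gripper."""
    dist = math.dist(hand_xyz, gripper_xyz)
    if dist <= threshold_m:
        return weakened_force   # difference within threshold: weaken the grasp
    return current_force        # otherwise keep the current grasping force

force = maybe_release((0.40, 0.02, 0.90), (0.41, 0.00, 0.92), current_force=5.0)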
It would have been obvious, before the effective filing date of the claimed invention, with a reasonable expectation of success, to one having ordinary skill in the art, to further combine the additional teachings in the same embodiment of Hinkle with the Sodeyama, Hinkle, and Wettels combination, with the motivation of providing another hand-over option for the robot and, during that hand-over, avoiding inadvertently dropping the object (see at least Hinkle column 9 lines 25-65).
Regarding Claim 3, the Sodeyama, Hinkle, and Wettels combination teaches the limitations of Claim 1. Additionally, Sodeyama further discloses wherein the at least one processor is further configured to execute the instructions to determine a relationship between the grasped object and that of the nearby object, and adjust the grasping force of the hand based on the relationship (see at least [0061]-[0070]: “the release operation executed by the autonomous robot does not have a predetermined pattern, and an optimal release operation differs for each object or each transfer target person. For example, the optimal release operation differs between a case of transferring a teacup in which hot water is put and a case of transferring a tennis ball”; “ a movement amount of the object B1 in the transfer direction A1 per unit time changes depending on a magnitude of an external force applied to the object B1 by the transfer target person in the receiving operation, but this change in the movement amount differs depending on a difference in characteristics peculiar to the object B1, for example, a coefficient of static friction, a coefficient of dynamic friction, a mass, a shape dimension, a rigidity, a strength, a temperature, a humidity, and the like.”; “an operation that enable a high-quality physical interaction, specifically, the transfer of the object, between the person and the autonomous robot in an environment in which the object which is a target to be transferred is unknown or an environment in which a behavior, a situation, or the like of the transfer target person is unknown, such as a home, a nursing care facility, a store, or the like, where the person and the autonomous robot coexist, that is, an environment where it is not possible to create a model of the optimal release operation in advance”; “it is possible to perform the optimal transfer operation according to the object B1 to be transferred or the behavior, the situation, or the like of the transfer target person”; “the optimal transfer operation may be, for example, that a change rate in a slip amount in the transfer direction at the time of transferring the object maintains continuity (see, for example, FIG. 2). However, the optimal transfer operation is not limited thereto, and various transfer operations in which the transfer target person can receive the object from the autonomous robot 1 without stress can be defined as the optimal transfer operation”; “machine learning using a change rate of a grip force of the hand portion 443, information detected by various sensors mounted on the hand portion 443, or the like, as an input and using continuity of a moving speed of the object B1 in the transfer direction A1 as an output is performed. Therefore, it is possible to enable smooth transfer of the object B1 depending on characteristics (a coefficient of static friction, a coefficient of dynamic friction, a mass, a shape dimension, a rigidity, a strength, a temperature, a humidity, and the like) of the object B1 and the behavior, the situation, or the like of the transfer target person”).
Regarding Claim 5, Sodeyama discloses A method for controlling a manipulator comprising an arm, and a hand attached to the arm and configured to grasp an object to be grasped (see at least [0054] and FIG. 1). All other limitations of Claim 5 are substantially similar to the limitations of Claim 1; see the rejection of Claim 1 above.
Regarding Claim 6, Sodeyama discloses A non-transitory computer-readable medium storing a program for causing a computer to control a manipulator comprising an arm, and a hand attached to the arm and configured to grasp an object to be grasped (see at least [0054], [0112], FIG. 1, FIG. 4, and FIG. 7). All other limitations of Claim 6 are substantially similar to the limitations of Claim 1; see the rejection of Claim 1 above.
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Sodeyama in view of Hinkle and Wettels, and further in view of U.S. Pub. No. 2011/0166709 (Kim et al., hereinafter, Kim).
Regarding Claim 4, the Sodeyama, Hinkle, and Wettels combination teaches the limitations of Claim 1. The Sodeyama, Hinkle, and Wettels combination does not explicitly teach wherein the at least one processor is further configured to execute the instructions to predict the contact state between the grasped object and the nearby object by using movement information of the manipulator, and adjust the grasping force of the hand according to the prediction of the contact state.
Kim, in the same field of robot controls, and therefore analogous art, teaches wherein the at least one processor is further configured to execute the instructions to predict the contact state between the grasped object and the nearby object by using movement information of the manipulator, and adjust the grasping force of the hand according to the prediction of the contact state (see at least [0047], [0055]-[0057], FIG. 4E, FIG. 4F, and FIG. 6: “FIG. 4E illustrates that the human may pull the article from the robot hand 8 to take the article from the robot hand 8 opposite to an initial intention if it is judged that the robot hand 8 stably grips the article. The robot hand 8 may pull the article toward the robot 1 by the first designated distance, as shown in FIG. 4B. The robot hand 8 may sense pulling force from the robot hand 8, and understand that the human does not intend to transfer the article to the robot hand 8 but rather intends to take the article from the robot hand 8. Then, as shown in FIG. 4F, the robot hand 8 may push the article toward the human in the direction of the pulling force while reducing its gripping force, thereby allowing the human to take the article.”).
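For illustration only, predicting a contact (pulling) state from the manipulator's movement information and weakening the grip accordingly may be sketched as follows. The sketch is not drawn from Kim; the velocity-mismatch heuristic, thresholds, and reduction factor are assumptions.

# Illustrative sketch only; not Kim's implementation. All names/thresholds are hypothetical.
def adjust_grip_from_motion(commanded_vel, measured_vel, grip_force,
                            pull_threshold=0.02, reduction=0.2):
    """Infer an external pull from the mismatch between commanded and measured
    end-effector velocity and, if a pull (contact) is predicted, reduce the grip force."""
    deviation = [m - c for m, c in zip(measured_vel, commanded_vel)]
    pull_magnitude = sum(d * d for d in deviation) ** 0.5
    if pull_magnitude > pull_threshold:   # contact/pull state predicted
        return max(0.0, grip_force - reduction * grip_force)
    return grip_force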
It would have been obvious, before the effective filing date of the claimed invention, with a reasonable expectation of success, to one having ordinary skill in the art, to combine the teachings of Kim with the Sodeyama, Hinkle, and Wettels combination, with the motivation of improving the robot's judgment as to whether a transfer should occur (see at least Kim [0006]).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALEXANDRA ROBYN MORFORD whose telephone number is (571)272-6109. The examiner can normally be reached Monday - Friday 8:00 AM - 4:00 PM ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas Worden can be reached at (571) 272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JASON HOLLOWAY/Primary Examiner, Art Unit 3658
/A.R.M./ Examiner, Art Unit 3658