Prosecution Insights
Last updated: April 19, 2026
Application No. 18/850,183

ROBOT SYSTEM, PROCESSING METHOD, AND RECORDING MEDIUM

Non-Final OA — §101, §103

Filed: Sep 24, 2024
Examiner: STIEBRITZ, NOAH WILLIAM
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: NEC Corporation
OA Round: 1 (Non-Final)

Grant Probability: 67% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
Grant Probability With Interview: 51%

Examiner Intelligence

Career Allow Rate: 67% (12 granted / 18 resolved) — above average, +14.7% vs TC avg
Interview Lift: -15.6% (minimal; based on resolved cases with interview)
Avg Prosecution: 2y 6m typical timeline; 44 applications currently pending
Total Applications: 62 across all art units (career history)

Statute-Specific Performance

§101: 18.6% (-21.4% vs TC avg)
§103: 61.7% (+21.7% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 8.0% (-32.0% vs TC avg)

Deltas are relative to the Tech Center average estimate • Based on career data from 18 resolved cases

Office Action

§101 §103
DETAILED ACTION

This is a non-final Office Action on the merits in response to communications filed by Applicant on September 24, 2024. Claims 1-2 and 4-12 are currently pending and examined below.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendments to the Claims filed on September 24, 2024 have been entered. Claims 1-2 and 4-12 are currently amended and pending, and claim 3 has been canceled. The amendments to the Specification filed on September 24, 2024 have been entered. The amendments to the Abstract filed on September 24, 2024 have been entered.

Information Disclosure Statement

The Information Disclosure Statement(s) filed on 09/24/2024 is/are being considered by the examiner.

Specification

The abstract of the disclosure is objected to because the abstract is less than 50 words in length (42 words in length). A corrected abstract of the disclosure is required and must be presented on a separate sheet, apart from any other text. See MPEP § 608.01(b). The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.

Claim Objections

Claim 12 is objected to because of the following informalities: in claim 12, line 3, there appears to be a typographical error in the phrase “object, to:”. It is suggested that this phrase be corrected to “object to:” to improve clarity. Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim(s) 1-2 and 4-12 is/are rejected under 35 U.S.C.
101 because the claimed invention is directed to an abstract idea without significantly more.

STEP 1: Do the claims fall within one of the statutory categories? Yes, claims 1, 11, and 12 are directed towards a system, method, and non-transitory computer readable medium, respectively.

STEP 2A (PRONG 1): Is the claim directed to a law of nature, a natural phenomenon, or an abstract idea? Yes, the claims are directed to an abstract idea. The system of claim 1, the method of claim 11, and the non-transitory computer readable medium of claim 12 are directed towards a mental process, and as such, are directed towards an abstract idea.

Claim 1 recites the limitation “identify a timing at which the target object is released by the robot arm”. Identifying a timing to release an object gripped by a robot arm is a process that can be performed in the human mind by simply observing the object gripped by the robot and the location at which the robot is to release the object. Said process is commonly performed in the human mind during manual control of such a robotic manipulator. Therefore, claim 1 is clearly directed towards a mental process, and as such, an abstract idea.

Claim 11 recites the limitation “identifying a timing at which the target object is released by the robot arm”. Identifying a timing to release an object gripped by a robot arm is a process that can be performed in the human mind by simply observing the object gripped by the robot and the location at which the robot is to release the object. Said process is commonly performed in the human mind during manual control of such a robotic manipulator. Therefore, claim 11 is clearly directed towards a mental process, and as such, an abstract idea.

Claim 12 recites the limitation “identify a timing at which the target object is released by the robot arm”.
Identifying a timing to release an object gripped by a robot arm is a process that can be performed in the human mind by simply observing the object gripped by the robot and the location at which the robot is to release the object. Said process is commonly performed in the human mind during manual control of such a robotic manipulator. Therefore, claim 12 is clearly directed towards a mental process, and as such, an abstract idea.

STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application? Claims 1, 11, and 12 do not recite any of the exemplary considerations that are indicative of an abstract idea having been integrated into a practical application.

Claim 1 recites the additional limitation “a robot arm configured to move a target object”. A robot arm configured to move a target object is generic linking. The additional limitation “a memory configured to store instructions” is generic linking. The additional limitation “and a processor configured to execute the instructions to” is generic linking. The additional limitation “based on a physical quantity generated by the target object measured at a movement destination of the target object” is insignificant pre-solution data gathering and merely describes the data to be collected, which is considered to be insignificant extra-solution activity.

Claim 11 recites the additional limitation “including a robot arm configured to move a target object”. A robot arm configured to move a target object is generic linking. The additional limitation “based on a physical quantity generated by the target object measured at a movement destination of the target object” is insignificant pre-solution data gathering and merely describes the data to be collected, which is considered to be insignificant extra-solution activity.

Claim 12 recites the additional limitation “which includes a robot system including a robot arm configured to move a target object”.
A robot arm configured to move a target object is generic linking. The additional limitation “based on a physical quantity generated by the target object measured at a movement destination of the target object” is insignificant pre-solution data gathering and merely describes the data to be collected, which is considered to be insignificant extra-solution activity. Therefore, it is clear the additional elements consist of generic linking and insignificant extra-solution activity, which is not indicative of the abstract idea having been integrated into a practical application.

STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? Claims 1, 11, and 12 do not recite additional elements that amount to significantly more than the judicial exception. Claims 1, 11, and 12 do not recite any specific limitations that are not considered to be generic linking or insignificant extra-solution activity. A robot arm configured to move a target object is generic linking. A memory configured to store instructions is generic linking. A processor configured to execute the instructions is generic linking. Measuring, at a movement destination of the target object, a physical quantity generated by the target object is insignificant pre-solution data gathering and merely describes the data to be collected, which is considered to be insignificant extra-solution activity.

In conclusion, claims 1, 11, and 12: (a) are directed toward an abstract idea, (b) do not recite additional elements that integrate the judicial exception into a practical application, and (c) do not recite additional elements that amount to significantly more than the judicial exception. It is therefore clear that the claims are directed toward non-statutory subject matter, and they are rejected under 35 U.S.C. 101.
Regarding claim 2, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception, because “comprising a weighing scale configured to measure the physical quantity” is generic linking. Therefore, claim 2 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 4, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception, because “wherein the processor is configured to identify the timing at which the target object is released by the robot arm, based on acceleration obtained by performing a time derivative for a result of measuring the physical quantity” is a part of the abstract idea of claim 1. Additionally, the acceleration obtained by performing a time derivative for a result of measuring the physical quantity is insignificant pre-solution data gathering and merely describes the data to be collected, which is considered to be insignificant extra-solution activity. Therefore, claim 4 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 5, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception, because “wherein the processor is configured to identify a timing at which a statistical value of a result of measuring the physical quantity in a predetermined previous time period exceeds a threshold value as the timing at which the target object is released by the robot arm” is a part of the abstract idea of claim 1 and is insignificant pre-solution data gathering. Therefore, claim 5 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
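For readers outside the signal-processing context, the limitations recited in claims 4 and 5 describe two simple detection schemes: a finite-difference time derivative of the measured quantity compared against a threshold, and a windowed statistic over the preceding samples compared against a threshold. The sketch below is illustrative only; the function names, finite-difference scheme, and sample values are hypothetical and not taken from the application.

```python
def release_index_by_derivative(samples, dt, threshold):
    """Claim-4-style check (sketch): first index where the magnitude of the
    finite-difference time derivative of the measured quantity exceeds the
    threshold. Returns None if no sample qualifies."""
    for i in range(1, len(samples)):
        if abs((samples[i] - samples[i - 1]) / dt) > threshold:
            return i
    return None


def release_index_by_window_stat(samples, window, threshold):
    """Claim-5-style check (sketch): first index where a statistical value
    (here, the mean) of the previous `window` samples exceeds the threshold."""
    for i in range(window, len(samples) + 1):
        if sum(samples[i - window:i]) / window > threshold:
            return i - 1
    return None
```

A weight signal that jumps when the object lands on a scale, e.g. `[0, 0, 0, 0, 5, 5, 5]` with `dt=1.0`, would trigger the derivative check at the jump and the windowed check once the window fills with post-landing samples.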
Regarding claim 6, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception, because “wherein the processor is configured to identify the timing in a period indicating a status in which the target object is released by the robot arm” is insignificant pre-solution data gathering. Therefore, claim 6 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 7, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception, because “wherein the robot arm is configured to change a movement speed of the target object at the movement destination in accordance with a type of the target object” is a part of the abstract idea of claim 1 and is generic linking, and “wherein the processor is configured to identify the timing at which the target object is released by the robot arm, based on the physical quantity corresponding to the movement speed changed by the robot arm” is a part of the abstract idea of claim 1 and is insignificant pre-solution data gathering. Therefore, claim 7 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 8, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception, because “wherein the type of the target object is classified based on a degree of fragility due to falling” merely specifies the data to be manipulated/gathered and is considered to be insignificant extra-solution activity, and “wherein the robot arm is configured to make a change so that the movement speed of the target object at the movement destination slows down as the degree increases” is a part of the abstract idea of claim 1.
Therefore, claim 8 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 10, this claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception, because “wherein the physical quantity is a force, a weight, or acceleration generated by the target object” merely describes the data to be gathered and is considered to be insignificant extra-solution activity. Therefore, claim 10 is also rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Therefore, claims 2, 4-8, and 10 do not include additional elements that are sufficient to amount to significantly more than the judicial exception, and are therefore rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Claim 9 was not rejected under 35 U.S.C. § 101 because it recites the limitation “wherein the processor is configured to release the target object at a release timing”. Such a limitation describes an active control step of the system using the information determined by the abstract idea, which clearly shows integration into a practical application. Such a limitation, or one of similar phraseology, if added to the independent claims, would overcome the 35 U.S.C. 101 rejection of the independent claims.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1, 4, and 7-12 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 12090661 B2 ("Sun '661") in view of JP 2005161507 A ("Hirano").

Regarding claim 1, Sun ‘661 teaches a robot system comprising (Sun ‘661: Figure 1, Abstract, “A velocity control-based robotic system is disclosed. In various embodiments, sensor data is received from one or more sensors deployed in a physical space in which a robot is located. A processor is used to determine based at least in part on the sensor data an at least partly velocity-based trajectory along which to move an element comprising the robot. A command to implement the velocity-based trajectory is sent to the robot.”, Column 4 lines 22-33, “In the example shown in FIG.
1A, robotic arm 116 is mounted on carriage 118, which is configured to ride along a rail or other linear guide 120 disposed alongside and substantially parallel to the conveyor 108, on a side opposite the kitting machines 102, 104, and 106. In various embodiments, a motor, belt, chain, or other source of motive force is applied via a controller (not shown in FIG. 1) to move the carriage 118 and attached robotic arm 116 along the rail or guide 120 to facilitate the automated retrieval of items from the kitting machines 102, 104, and 106 and the placement of items in boxes 112, 114 as they are moved along conveyor 108.”): a robot arm configured to move a target object (Sun ‘661: Column 4 lines 22-33, “In the example shown in FIG. 1A, robotic arm 116 is mounted on carriage 118, which is configured to ride along a rail or other linear guide 120 disposed alongside and substantially parallel to the conveyor 108, on a side opposite the kitting machines 102, 104, and 106. In various embodiments, a motor, belt, chain, or other source of motive force is applied via a controller (not shown in FIG. 1) to move the carriage 118 and attached robotic arm 116 along the rail or guide 120 to facilitate the automated retrieval of items from the kitting machines 102, 104, and 106 and the placement of items in boxes 112, 114 as they are moved along conveyor 108.”, Column 4 line 56 – Column 5 line 14, “The control computer 122 controls the carriage 118 and/or robotic arm 116 as needed to position the robotic arm 116 to retrieve the first one or more items from the associated one(s) of the kitting machines 102, 104, and 106. Control computer 122 may control the kitting machines 102, 104, and 106, e.g., to ensure the require item(s) in the required quantities are present in the pickup zone at the end of kitting machines 102, 104, and 106 nearest to the conveyor 108 and robotic arm 116. 
Control computer 122 controls robotic arm 116 to retrieve the item(s) from the corresponding pickup zone(s) and places them in the box ( e.g., 112, 114) before moving on to perform coordinated retrieval and packing of any further items required to be included in that particular kit.”. The robotic arm is clearly configured to grip and move the items.); a memory configured to store instructions (Sun ‘661: Column 2 lines 47-64, “The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term 'processor' refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.”); and a processor configured to execute the instructions to (Sun ‘661: Column 2 lines 47-64): based on a physical quantity generated by the target object measured at a movement destination of the target object (Sun ‘661: Column 5 lines 15-37, “In various embodiments, a plurality of cameras may be deployed in a number of locations, including in the environment and on the respective elements comprising system 100, to facilitate automated (and, if needed, human assisted) kitting operations. 
In various embodiments, sensors other than cameras may be deployed, including without limitation contact or limit switches, pressure sensors, weight sensors, and the like.”, Column 6 line 55 – Column 7 line 2, “In various embodiments, additional sensors not shown, e.g., weight or force sensors embodied in and/or adjacent to conveyor 134 and/or robotic arm 132, force sensors in the x-y plane and/or z-direction (vertical direction) of suction cups 140, etc. may be used to identify, determine attributes of, grasp, pick up, move through a determined trajectory, and/or place in a destination location on or in receptacle 136 items on conveyor 134 and/or other sources and/or staging areas in which items may be located and/or relocated, e.g., by system 130.”, Column 9 lines 40-57, “In various embodiments, control module 308 is configured to use velocity-based control as disclosed herein to perform robotic operations. For example, to perform a task to grasp an item A and move it to a destination D, in various embodiments control module 308 determines a trajectory that is at least in part velocity-based, e.g., to move an end effector (suction or pincer/finger type gripper) to a position to grasp the item A and/or move the end effector with the item A in its grasp to the destination D. In various embodiments, the trajectory includes a sequence of one or more velocity vectors ( e.g., magnitude/speed and direction in three-dimensional space) along and/or according to which the end effector is to be moved. The trajectory may indicate the desired velocity (magnitude and direction) for each of set of one or more phases or segments and/or may indicate for each segment and/or each transition between segments a desired and/or maximum rate of acceleration and/or other higher order derivatives of position ( e.g., jerk, etc.).”, Column 15 lines 44-67, “FIG. 7A is a flow diagram illustrating an embodiment of a process to determine and impose limits to control a robotic system. 
In various embodiments, the process 700 of FIG. 7A is implemented by a control computer, such as computer 122 of FIG. 1A, computer 148 of FIG. 1B, and/or computer 212 of FIG. 2. In the example shown, sensor data received at 702 is used to determine at 704 applicable limits on the velocity, acceleration, jerk, etc., to avoid damage. For example, image, weight, or other sensor data received at 702 may be used to determine how heavy, fragile, or rigid an object that is in the grasp of the robot or to be grasped, or other attributes that may affect how securely the robot has or will be able to have the object in its grasp.”. The cited passage clearly shows that the robot is configured to determine a trajectory in order to perform a pick and place operation of an object. The cited passages additionally show that the trajectory is determined in part based on how much the object weighs (i.e. how heavy the object is), its fragility, and its rigidity. Furthermore, the sensors used to gather this data are in a location separate from the robot, such as in the conveyor belt or in areas adjacent to the robot or the conveyor belt.).

Sun ‘661 does not teach identify a timing at which the target object is released by the robot arm. Hirano, in the same field of endeavor, teaches identifying a timing at which the target object is released by the robot arm (Hirano: ¶ 0008, “In this robot hand device, the reaction force is calculated from the difference between the weight of the object and the gripping force of the robot hand, and when the reaction force calculated by this calculation exceeds a threshold value, the device performs control to release the object. Therefore, this robot hand device can easily obtain the reaction force and can determine the timing to release the hand without any means for actually detecting the reaction force.”, ¶ 0029, “The object weight estimation control in the control device 7 will be described.
By knowing the weight of an object, it is possible to set an optimum gripping force when gripping the object with the robot hand 3, and also to determine the timing to release the object when placing it down. Therefore, in the object weight estimation control, the weight of the object is estimated with high accuracy based on the resultant fingertip force and the hand posture.”. The cited passage clearly shows that the system is configured to determine the timing to release a gripped object.).

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robot system taught in Sun ‘661 with identifying a timing at which the target object is released by the robot arm, as taught in Hirano, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because determining the proper timing to release a gripped object allows the robot to more safely and accurately place said object (Hirano: ¶ 0044, “Next, the control of placing an object in the control device 7 will be described. When placing an object, the timing at which the robot hand 3 releases the object is important. For example, if released too quickly, the object may fall to the floor and be damaged. Conversely, if the release is too slow, the placing action continues even after the object hits the floor, which may damage the object due to the force pressing the object against the floor. Therefore, in order to place an object on the floor without causing an impact, it is necessary to determine whether or not the object will hit the floor.”).
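To make the Hirano release mechanism concrete: ¶ 0008 describes computing a reaction force as the difference between the object's weight and the gripping force borne by the hand, and releasing once that reaction exceeds a threshold. The sketch below is a hypothetical rendering of that rule; the function name, parameter names, and values are invented for illustration and are not from Hirano.

```python
def hirano_style_release(object_weight, grip_force, threshold):
    """Sketch of the release rule described in Hirano ¶ 0008: estimate the
    reaction force as the difference between the object's weight and the
    hand's gripping force, and release once it exceeds the threshold
    (i.e., the support surface has begun carrying the object)."""
    reaction = object_weight - grip_force
    return reaction > threshold
```

With hypothetical values, a hand still carrying most of a 10 N object (`grip_force=9.0`) would not release, while one carrying little of it (`grip_force=3.0`) would.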
Regarding claim 4, Sun ‘661 in view of Hirano teaches wherein the processor is configured to identify the timing at which the target object is released by the robot arm (Hirano: ¶ 0008, “In this robot hand device, the reaction force is calculated from the difference between the weight of the object and the gripping force of the robot hand, and when the reaction force calculated by this calculation exceeds a threshold value, the device performs control to release the object. Therefore, this robot hand device can easily obtain the reaction force and can determine the timing to release the hand without any means for actually detecting the reaction force.”, ¶ 0029, “The object weight estimation control in the control device 7 will be described. By knowing the weight of an object, it is possible to set an optimum gripping force when gripping the object with the robot hand 3, and also to determine the timing to release the object when placing it down. Therefore, in the object weight estimation control, the weight of the object is estimated with high accuracy based on the resultant fingertip force and the hand posture.”. The cited passage clearly shows that the system is configured to determine the timing to release a gripped object.), based on acceleration obtained by performing a time derivative for a result of measuring the physical quantity (Sun ‘661: Column 9 lines 40-57, “In various embodiments, control module 308 is configured to use velocity-based control as disclosed herein to perform robotic operations. For example, to perform a task to grasp an item A and move it to a destination D, in various embodiments control module 308 determines a trajectory that is at least in part velocity-based, e.g., to move an end effector (suction or pincer/finger type gripper) to a position to grasp the item A and/or move the end effector with the item A in its grasp to the destination D. 
In various embodiments, the trajectory includes a sequence of one or more velocity vectors (e.g., magnitude/speed and direction in three-dimensional space) along and/or according to which the end effector is to be moved. The trajectory may indicate the desired velocity (magnitude and direction) for each of set of one or more phases or segments and/or may indicate for each segment and/or each transition between segments a desired and/or maximum rate of acceleration and/or other higher order derivatives of position (e.g., jerk, etc.).”, Column 10 line 54 – Column 11 line 9, “In various embodiments, a robotic system as disclosed herein performs adaptive/intelligent trajectory generation, which makes use of the ability to control higher derivatives (velocity, and acceleration) and not just position of robot and/or elements comprising the robot. When trying to follow a real time motion plan accurately, the system uses velocity and/or acceleration control to follow the simulated robot very exactly, enabling the system to react more quickly to changing environment. If velocity and acceleration tracking were not used, as disclosed herein the real (not simulated) robot would lag behind the desired path/position (as determined using (only) position control). But if desired position is changing, the real robot cannot effectively track the path dynamically using only position control; therefore, in various embodiments, velocity and/or acceleration control are used. In various embodiments, velocity and acceleration tracking allow the robot to instantly know when and how to accelerate without waiting for a large position error, allowing these higher derivatives (i.e., velocity, acceleration, etc.)
to be tracked accurately.”, Column 11 lines 59-62, “Higher derivative control is used, in various embodiments, to smoothly slow down the robot and switch directions, but as fast as possible given limits on object/robot to reduce cycle time”, Column 13 lines 19-24, “In various embodiments, a trajectory as described above (e.g., in terms of velocity and/or acceleration for example, and not just position) is followed using velocity (and/or other higher derivative) control, to achieve a desired state and motion, such as a motion/trajectory determined by simulation”. The cited passages clearly show that the system is configured to determine higher order derivatives of position, including acceleration.). Regarding claim 7, Sun ‘661 in view of Hirano teaches wherein the robot arm is configured to change a movement speed of the target object at the movement destination in accordance with a type of the target object (Sun ‘661: Column 9 lines 40-57, “In various embodiments, control module 308 is configured to use velocity-based control as disclosed herein to perform robotic operations. For example, to perform a task to grasp an item A and move it to a destination D, in various embodiments control module 308 determines a trajectory that is at least in part velocity-based, e.g., to move an end effector (suction or pincer/finger type gripper) to a position to grasp the item A and/or move the end effector with the item A in its grasp to the destination D. In various embodiments, the trajectory includes a sequence of one or more velocity vectors ( e.g., magnitude/speed and direction in three-dimensional space) along and/or according to which the end effector is to be moved. 
The trajectory may indicate the desired velocity (magnitude and direction) for each of set of one or more phases or segments and/or may indicate for each segment and/or each transition between segments a desired and/or maximum rate of acceleration and/or other higher order derivatives of position ( e.g., jerk, etc.).”, Column 15 line 44 – Column 16 line 6, “FIG. 7A is a flow diagram illustrating an embodiment of a process to determine and impose limits to control a robotic system. In various embodiments, the process 700 of FIG. 7A is implemented by a control computer, such as computer 122 of FIG. 1A, computer 148 of FIG. 1B, and/or computer 212 of FIG. 2. In the example shown, sensor data received at 702 is used to determine at 704 applicable limits on the velocity, acceleration, jerk, etc., to avoid damage. For example, image, weight, or other sensor data received at 702 may be used to determine how heavy, fragile, or rigid an object that is in the grasp of the robot or to be grasped, or other attributes that may affect how securely the robot has or will be able to have the object in its grasp. The grasp may be monitored actively, such as by monitoring pressures or air flow associated with a suction gripper or monitoring or estimating shear forces to detect actual or potential/imminent slippage of an item from the robot's grasp. Sensor data may be used to classify the object, e.g., by type or class (based on volume, weight, etc.), and application limits (e.g., on velocity or other higher order derivatives of position) may be determined based on the classification. At 706, the limits determined at 704 are implemented and enforced. For example, a trajectory determined to perform velocity-based control, as disclosed herein, may be determined by the control system taking into consideration the limits determined at 704. For example, a trajectory may be determined that ensures that at all times the end effector remains below a velocity limit determined at 704. 
Processing continues as described (702, 704, 706) until done (708), e.g., until objects are no longer being moved by the system.”), and wherein the processor is configured to identify the timing at which the target object is released by the robot arm, based on the physical quantity corresponding to the movement speed changed by the robot arm (Hirano: ¶ 0008, “In this robot hand device, the reaction force is calculated from the difference between the weight of the object and the gripping force of the robot hand, and when the reaction force calculated by this calculation exceeds a threshold value, the device performs control to release the object. Therefore, this robot hand device can easily obtain the reaction force and can determine the timing to release the hand without any means for actually detecting the reaction force.”, ¶ 0029, “The object weight estimation control in the control device 7 will be described. By knowing the weight of an object, it is possible to set an optimum gripping force when gripping the object with the robot hand 3, and also to determine the timing to release the object when placing it down. Therefore, in the object weight estimation control, the weight of the object is estimated with high accuracy based on the resultant fingertip force and the hand posture.”. The cited passage clearly shows that the system is configured to determine the timing to release a gripped object.). Sun ‘661 teaches a robot system configured to plan a trajectory and velocity profile for a pick an place operation of an object based on the sensed attributes of said object. Hirano teaches determining the timing a released a gripped object by a robot. A person of ordinary skill in the art would have had the technological capabilities required to have combine the robot system taught in Sun ‘661 with the method of determining a release timing for an object gripped by a robot taught in Hirano. 
Furthermore, even though not explicitly stated in Sun ‘661, the time at which to release the object must be determined as a part of the trajectory. Additionally, one of ordinary skill in the art would recognize that the movement speed of the robot would affect the release timing. Hirano teaches determining the timing to release an object when the robot is moving at a constant speed. One of ordinary skill in the art would recognize that the determined timing would change based on the speed. As such, because Sun ‘661 teaches determining the movement speed of the robot arm based on a physical property of the object, when modified with the timing determination method taught in Hirano, the speed of the robot arm would clearly affect the timing of the release. Therefore, the combination of Sun ‘661 in view of Hirano clearly teaches the limitations of claim 7. Regarding claim 8, Sun ‘661 in view of Hirano teaches wherein the type of the target object is classified based on a degree of fragility due to falling (Sun ‘661: Column 15 line 44 – Column 16 line 6, “FIG. 7A is a flow diagram illustrating an embodiment of a process to determine and impose limits to control a robotic system. In various embodiments, the process 700 of FIG. 7A is implemented by a control computer, such as computer 122 of FIG. 1A, computer 148 of FIG. 1B, and/or computer 212 of FIG. 2. In the example shown, sensor data received at 702 is used to determine at 704 applicable limits on the velocity, acceleration, jerk, etc., to avoid damage. For example, image, weight, or other sensor data received at 702 may be used to determine how heavy, fragile, or rigid an object that is in the grasp of the robot or to be grasped, or other attributes that may affect how securely the robot has or will be able to have the object in its grasp.
The grasp may be monitored actively, such as by monitoring pressures or air flow associated with a suction gripper or monitoring or estimating shear forces to detect actual or potential/imminent slippage of an item from the robot's grasp. Sensor data may be used to classify the object, e.g., by type or class (based on volume, weight, etc.), and application limits (e.g., on velocity or other higher order derivatives of position) may be determined based on the classification. At 706, the limits determined at 704 are implemented and enforced. For example, a trajectory determined to perform velocity-based control, as disclosed herein, may be determined by the control system taking into consideration the limits determined at 704. For example, a trajectory may be determined that ensures that at all times the end effector remains below a velocity limit determined at 704. Processing continues as described (702, 704, 706) until done (708), e.g., until objects are no longer being moved by the system.”. The cited passage clearly shows that the velocity is set, based in part, on the fragility of the object.), and wherein the robot arm is configured to make a change so that the movement speed of the target object at the movement destination slows down as the degree increases (Sun ‘661: Column 15 line 44 – Column 16 line 6. One of ordinary skill in the art would have recognized that because the velocity limits are determined based on the attributes and class of the object (such as fragility), more fragile objects would have a lower movement speed limit than less fragile objects.).
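For illustration, the classification-based limit scheme described in the cited passages of Sun ‘661 (Column 15 line 44 – Column 16 line 6) can be paraphrased as a simple lookup from an object class to a speed cap. The following sketch is editorial; the class names, numeric limits, and classification rule are hypothetical examples and do not appear in the reference.

```python
# Illustrative sketch only: Sun '661 describes classifying an object from
# sensor data and deriving velocity limits from the class. All names and
# numeric values below are hypothetical, not taken from the reference.

# More fragile classes map to lower maximum end-effector speeds (m/s).
SPEED_LIMITS = {"rigid": 1.5, "semi_fragile": 0.8, "fragile": 0.3}

def classify_object(weight_kg: float, is_fragile: bool) -> str:
    """Classify an object from sensor-derived attributes (hypothetical rule)."""
    if is_fragile:
        return "fragile"
    return "rigid" if weight_kg > 1.0 else "semi_fragile"

def velocity_limit(weight_kg: float, is_fragile: bool) -> float:
    """Return the applicable end-effector speed limit for the object."""
    return SPEED_LIMITS[classify_object(weight_kg, is_fragile)]

print(velocity_limit(2.0, False))  # heavy rigid object: higher limit
print(velocity_limit(0.2, True))   # fragile object: lower limit
```

Under this paraphrase, the trajectory planner would treat the returned value as the cap the end effector must remain below at all times, consistent with step 706 of the cited process.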
Regarding claim 9, Sun ‘661 in view of Hirano teaches wherein the processor is configured to release the target object at a release timing (Hirano: ¶ 0008, “In this robot hand device, the reaction force is calculated from the difference between the weight of the object and the gripping force of the robot hand, and when the reaction force calculated by this calculation exceeds a threshold value, the device performs control to release the object. Therefore, this robot hand device can easily obtain the reaction force and can determine the timing to release the hand without any means for actually detecting the reaction force.”, ¶ 0029, “The object weight estimation control in the control device 7 will be described. By knowing the weight of an object, it is possible to set an optimum gripping force when gripping the object with the robot hand 3, and also to determine the timing to release the object when placing it down. Therefore, in the object weight estimation control, the weight of the object is estimated with high accuracy based on the resultant fingertip force and the hand posture.”, ¶ 0053, “When the placing operation is to be ended, the control device 7 sends a command signal CS to the motor driver 20 to stop the arm 2 (stopping the motors 21, ... of each joint), and after the arm 2 has stopped, sends a command signal CS to the motor driver 30 to release the grip of the object by the robot hand 3 (rotatingly driving each motor 31, ...). As shown in FIG. 7, when the object comes into contact with the floor, the resultant fingertip force F<sub>z</sub> in the extension direction decreases, so the floor reaction force F increases and exceeds the floor reaction force threshold. When the arm 2 exceeds this point, the arm 2 is stopped, and immediately after the arm 2 stops, the robot hand 3 releases its grip.”. The robot clearly releases the object at the determined timing.).
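Hirano's release determination (¶¶ 0008, 0053) amounts to a threshold test: the floor reaction force is inferred from the drop in the measured fingertip force relative to the object's weight, and the grip is released once that inferred force exceeds a threshold. The sketch below is an editorial paraphrase; the function name, units, and threshold value are hypothetical and do not come from the reference.

```python
def should_release(object_weight_n: float, fingertip_force_n: float,
                   reaction_threshold_n: float) -> bool:
    """Hypothetical paraphrase of Hirano's release test: the floor reaction
    force is estimated as the difference between the object's weight and the
    force still carried by the hand; release once it exceeds the threshold."""
    floor_reaction = object_weight_n - fingertip_force_n
    return floor_reaction > reaction_threshold_n

# As the object contacts the floor, the fingertip force drops and the
# inferred reaction force rises past the threshold, triggering release.
print(should_release(9.8, 9.5, 2.0))  # object still fully carried by the hand
print(should_release(9.8, 6.0, 2.0))  # object partially resting on the floor
```

Note that this matches Hirano's stated advantage: no sensor directly measures the reaction force; it is derived from quantities the hand already measures.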
Regarding claim 10, Sun ‘661 in view of Hirano teaches wherein the physical quantity is a force, a weight, or acceleration generated by the target object (Column 15 lines 44-67, “FIG. 7A is a flow diagram illustrating an embodiment of a process to determine and impose limits to control a robotic system. In various embodiments, the process 700 of FIG. 7A is implemented by a control computer, such as computer 122 of FIG. 1A, computer 148 of FIG. 1B, and/or computer 212 of FIG. 2. In the example shown, sensor data received at 702 is used to determine at 704 applicable limits on the velocity, acceleration, jerk, etc., to avoid damage. For example, image, weight, or other sensor data received at 702 may be used to determine how heavy, fragile, or rigid an object that is in the grasp of the robot or to be grasped, or other attributes that may affect how securely the robot has or will be able to have the object in its grasp.” Hirano: ¶ 0008, “In this robot hand device, the reaction force is calculated from the difference between the weight of the object and the gripping force of the robot hand, and when the reaction force calculated by this calculation exceeds a threshold value, the device performs control to release the object. Therefore, this robot hand device can easily obtain the reaction force and can determine the timing to release the hand without any means for actually detecting the reaction force.”, ¶ 0029, “The object weight estimation control in the control device 7 will be described. By knowing the weight of an object, it is possible to set an optimum gripping force when gripping the object with the robot hand 3, and also to determine the timing to release the object when placing it down. Therefore, in the object weight estimation control, the weight of the object is estimated with high accuracy based on the resultant fingertip force and the hand posture.”). 
Regarding claim 11, Sun ‘661 teaches a processing method to be performed by a robot system including a robot arm configured to move a target object, the processing method comprising (Sun ‘661: Figure 1, Abstract, “A velocity control-based robotic system is disclosed. In various embodiments, sensor data is received from one or more sensors deployed in a physical space in which a robot is located. A processor is used to determine based at least in part on the sensor data an at least partly velocity-based trajectory along which to move an element comprising the robot. A command to implement the velocity-based trajectory is sent to the robot.”, Column 4 lines 22-33, “In the example shown in FIG. 1A, robotic arm 116 is mounted on carriage 118, which is configured to ride along a rail or other linear guide 120 disposed alongside and substantially parallel to the conveyor 108, on a side opposite the kitting machines 102, 104, and 106. In various embodiments, a motor, belt, chain, or other source of motive force is applied via a controller (not shown in FIG. 1) to move the carriage 118 and attached robotic arm 116 along the rail or guide 120 to facilitate the automated retrieval of items from the kitting machines 102, 104, and 106 and the placement of items in boxes 112, 114 as they are moved along conveyor 108.”, Column 4 line 56 – Column 5 line 14, “The control computer 122 controls the carriage 118 and/or robotic arm 116 as needed to position the robotic arm 116 to retrieve the first one or more items from the associated one(s) of the kitting machines 102, 104, and 106. Control computer 122 may control the kitting machines 102, 104, and 106, e.g., to ensure the require item(s) in the required quantities are present in the pickup zone at the end of kitting machines 102, 104, and 106 nearest to the conveyor 108 and robotic arm 116. 
Control computer 122 controls robotic arm 116 to retrieve the item(s) from the corresponding pickup zone(s) and places them in the box ( e.g., 112, 114) before moving on to perform coordinated retrieval and packing of any further items required to be included in that particular kit.”. The robotic arm is clearly configured to grip and move the items.): based on a physical quantity generated by the target object measured at a movement destination of the target object (Sun ‘661: Column 5 lines 15-37, “In various embodiments, a plurality of cameras may be deployed in a number of locations, including in the environment and on the respective elements comprising system 100, to facilitate automated (and, if needed, human assisted) kitting operations. In various embodiments, sensors other than cameras may be deployed, including without limitation contact or limit switches, pressure sensors, weight sensors, and the like.”, Column 6 line 55 – Column 7 line 2, “In various embodiments, additional sensors not shown, e.g., weight or force sensors embodied in and/or adjacent to conveyor 134 and/or robotic arm 132, force sensors in the x-y plane and/or z-direction (vertical direction) of suction cups 140, etc. may be used to identify, determine attributes of, grasp, pick up, move through a determined trajectory, and/or place in a destination location on or in receptacle 136 items on conveyor 134 and/or other sources and/or staging areas in which items may be located and/or relocated, e.g., by system 130.”, Column 9 lines 40-57, “In various embodiments, control module 308 is configured to use velocity-based control as disclosed herein to perform robotic operations. 
For example, to perform a task to grasp an item A and move it to a destination D, in various embodiments control module 308 determines a trajectory that is at least in part velocity-based, e.g., to move an end effector (suction or pincer/finger type gripper) to a position to grasp the item A and/or move the end effector with the item A in its grasp to the destination D. In various embodiments, the trajectory includes a sequence of one or more velocity vectors ( e.g., magnitude/speed and direction in three-dimensional space) along and/or according to which the end effector is to be moved. The trajectory may indicate the desired velocity (magnitude and direction) for each of set of one or more phases or segments and/or may indicate for each segment and/or each transition between segments a desired and/or maximum rate of acceleration and/or other higher order derivatives of position ( e.g., jerk, etc.).”, Column 15 lines 44-67, “FIG. 7A is a flow diagram illustrating an embodiment of a process to determine and impose limits to control a robotic system. In various embodiments, the process 700 of FIG. 7A is implemented by a control computer, such as computer 122 of FIG. 1A, computer 148 of FIG. 1B, and/or computer 212 of FIG. 2. In the example shown, sensor data received at 702 is used to determine at 704 applicable limits on the velocity, acceleration, jerk, etc., to avoid damage. For example, image, weight, or other sensor data received at 702 may be used to determine how heavy, fragile, or rigid an object that is in the grasp of the robot or to be grasped, or other attributes that may affect how securely the robot has or will be able to have the object in its grasp.”. The cited passage clearly shows that the robot is configured to determine a trajectory in order to perform a pick and place operation of an object. The cited passages additionally show that the trajectory is determined in part based on how much the object weighs (i.e. 
how heavy the object is), its fragility, and its rigidity. Furthermore, the sensors used to gather this data are in a location separate from the robot, such as in the conveyor belt or in areas adjacent to the robot or the conveyor belt.). Sun ‘661 does not teach identifying a timing at which the target object is released by the robot arm. Hirano, in the same field of endeavor, teaches identifying a timing at which the target object is released by the robot arm (Hirano: ¶ 0008, “In this robot hand device, the reaction force is calculated from the difference between the weight of the object and the gripping force of the robot hand, and when the reaction force calculated by this calculation exceeds a threshold value, the device performs control to release the object. Therefore, this robot hand device can easily obtain the reaction force and can determine the timing to release the hand without any means for actually detecting the reaction force.”, ¶ 0029, “The object weight estimation control in the control device 7 will be described. By knowing the weight of an object, it is possible to set an optimum gripping force when gripping the object with the robot hand 3, and also to determine the timing to release the object when placing it down. Therefore, in the object weight estimation control, the weight of the object is estimated with high accuracy based on the resultant fingertip force and the hand posture.”. The cited passage clearly shows that the system is configured to determine the timing to release a gripped object.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the processing method taught in Sun ‘661 with identifying a timing at which the target object is released by the robot arm taught in Hirano with a reasonable expectation of success.
One of ordinary skill in the art would have been motivated to make this modification because determining the proper timing to release a gripped object allows the robot to more safely and accurately place said object (Hirano: ¶ 0044, “Next, the control of placing an object in the control device 7 will be described. When placing an object, the timing at which the robot hand 3 releases the object is important. For example, if released too quickly, the object may fall to the floor and be damaged. Conversely, if the release is too slow, the placing action continues even after the object hits the floor, which may damage the object due to the force pressing the object against the floor. Therefore, in order to place an object on the floor without causing an impact, it is necessary to determine whether or not the object will hit the floor.”). Regarding claim 12, Sun ‘661 teaches a non-transitory recording medium storing a program for causing a computer, which includes a robot system including a robot arm configured to move a target object, to (Sun ‘661: Figure 1, Abstract, “A velocity control-based robotic system is disclosed. In various embodiments, sensor data is received from one or more sensors deployed in a physical space in which a robot is located. A processor is used to determine based at least in part on the sensor data an at least partly velocity-based trajectory along which to move an element comprising the robot. A command to implement the velocity-based trajectory is sent to the robot.”, Column 2 lines 47-64, “The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. 
In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term 'processor' refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.”, Column 4 lines 22-33, “In the example shown in FIG. 1A, robotic arm 116 is mounted on carriage 118, which is configured to ride along a rail or other linear guide 120 disposed alongside and substantially parallel to the conveyor 108, on a side opposite the kitting machines 102, 104, and 106. In various embodiments, a motor, belt, chain, or other source of motive force is applied via a controller (not shown in FIG. 1) to move the carriage 118 and attached robotic arm 116 along the rail or guide 120 to facilitate the automated retrieval of items from the kitting machines 102, 104, and 106 and the placement of items in boxes 112, 114 as they are moved along conveyor 108.”, Column 4 line 56 – Column 5 line 14, “The control computer 122 controls the carriage 118 and/or robotic arm 116 as needed to position the robotic arm 116 to retrieve the first one or more items from the associated one(s) of the kitting machines 102, 104, and 106. Control computer 122 may control the kitting machines 102, 104, and 106, e.g., to ensure the require item(s) in the required quantities are present in the pickup zone at the end of kitting machines 102, 104, and 106 nearest to the conveyor 108 and robotic arm 116. 
Control computer 122 controls robotic arm 116 to retrieve the item(s) from the corresponding pickup zone(s) and places them in the box ( e.g., 112, 114) before moving on to perform coordinated retrieval and packing of any further items required to be included in that particular kit.”. The robotic arm is clearly configured to grip and move the items.): based on a physical quantity generated by the target object measured at a movement destination of the target object (Sun ‘661: Column 5 lines 15-37, “In various embodiments, a plurality of cameras may be deployed in a number of locations, including in the environment and on the respective elements comprising system 100, to facilitate automated (and, if needed, human assisted) kitting operations. In various embodiments, sensors other than cameras may be deployed, including without limitation contact or limit switches, pressure sensors, weight sensors, and the like.”, Column 6 line 55 – Column 7 line 2, “In various embodiments, additional sensors not shown, e.g., weight or force sensors embodied in and/or adjacent to conveyor 134 and/or robotic arm 132, force sensors in the x-y plane and/or z-direction (vertical direction) of suction cups 140, etc. may be used to identify, determine attributes of, grasp, pick up, move through a determined trajectory, and/or place in a destination location on or in receptacle 136 items on conveyor 134 and/or other sources and/or staging areas in which items may be located and/or relocated, e.g., by system 130.”, Column 9 lines 40-57, “In various embodiments, control module 308 is configured to use velocity-based control as disclosed herein to perform robotic operations. 
For example, to perform a task to grasp an item A and move it to a destination D, in various embodiments control module 308 determines a trajectory that is at least in part velocity-based, e.g., to move an end effector (suction or pincer/finger type gripper) to a position to grasp the item A and/or move the end effector with the item A in its grasp to the destination D. In various embodiments, the trajectory includes a sequence of one or more velocity vectors ( e.g., magnitude/speed and direction in three-dimensional space) along and/or according to which the end effector is to be moved. The trajectory may indicate the desired velocity (magnitude and direction) for each of set of one or more phases or segments and/or may indicate for each segment and/or each transition between segments a desired and/or maximum rate of acceleration and/or other higher order derivatives of position ( e.g., jerk, etc.).”, Column 15 lines 44-67, “FIG. 7A is a flow diagram illustrating an embodiment of a process to determine and impose limits to control a robotic system. In various embodiments, the process 700 of FIG. 7A is implemented by a control computer, such as computer 122 of FIG. 1A, computer 148 of FIG. 1B, and/or computer 212 of FIG. 2. In the example shown, sensor data received at 702 is used to determine at 704 applicable limits on the velocity, acceleration, jerk, etc., to avoid damage. For example, image, weight, or other sensor data received at 702 may be used to determine how heavy, fragile, or rigid an object that is in the grasp of the robot or to be grasped, or other attributes that may affect how securely the robot has or will be able to have the object in its grasp.”. The cited passage clearly shows that the robot is configured to determine a trajectory in order to perform a pick and place operation of an object. The cited passages additionally show that the trajectory is determined in part based on how much the object weighs (i.e. 
how heavy the object is), its fragility, and its rigidity. Furthermore, the sensors used to gather this data are in a location separate from the robot, such as in the conveyor belt or in areas adjacent to the robot or the conveyor belt.). Sun ‘661 does not teach identifying a timing at which the target object is released by the robot arm. Hirano, in the same field of endeavor, teaches identifying a timing at which the target object is released by the robot arm (Hirano: ¶ 0008, “In this robot hand device, the reaction force is calculated from the difference between the weight of the object and the gripping force of the robot hand, and when the reaction force calculated by this calculation exceeds a threshold value, the device performs control to release the object. Therefore, this robot hand device can easily obtain the reaction force and can determine the timing to release the hand without any means for actually detecting the reaction force.”, ¶ 0029, “The object weight estimation control in the control device 7 will be described. By knowing the weight of an object, it is possible to set an optimum gripping force when gripping the object with the robot hand 3, and also to determine the timing to release the object when placing it down. Therefore, in the object weight estimation control, the weight of the object is estimated with high accuracy based on the resultant fingertip force and the hand posture.”. The cited passage clearly shows that the system is configured to determine the timing to release a gripped object.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the processing method taught in Sun ‘661 with identifying a timing at which the target object is released by the robot arm taught in Hirano with a reasonable expectation of success.
One of ordinary skill in the art would have been motivated to make this modification because determining the proper timing to release a gripped object allows the robot to more safely and accurately place said object (Hirano: ¶ 0044, “Next, the control of placing an object in the control device 7 will be described. When placing an object, the timing at which the robot hand 3 releases the object is important. For example, if released too quickly, the object may fall to the floor and be damaged. Conversely, if the release is too slow, the placing action continues even after the object hits the floor, which may damage the object due to the force pressing the object against the floor. Therefore, in order to place an object on the floor without causing an impact, it is necessary to determine whether or not the object will hit the floor.”). Claim(s) 2 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 12090661 B2 ("Sun '661") in view of JP 2005161507 A ("Hirano") in further view of US 12319517 B2 ("Sun '517 "). Regarding claim 2, Sun ‘661 in view of Hirano teaches comprising a weight sensor configured to measure the physical quantity (Sun ‘661: Column 5 lines 15-37, “In various embodiments, a plurality of cameras may be deployed in a number of locations, including in the environment and on the respective elements comprising system 100, to facilitate automated (and, if needed, human assisted) kitting operations. In various embodiments, sensors other than cameras may be deployed, including without limitation contact or limit switches, pressure sensors, weight sensors, and the like.”, Column 6 line 55 – Column 7 line 2, “In various embodiments, additional sensors not shown, e.g., weight or force sensors embodied in and/or adjacent to conveyor 134 and/or robotic arm 132, force sensors in the x-y plane and/or z-direction (vertical direction) of suction cups 140, etc. 
may be used to identify, determine attributes of, grasp, pick up, move through a determined trajectory, and/or place in a destination location on or in receptacle 136 items on conveyor 134 and/or other sources and/or staging areas in which items may be located and/or relocated, e.g., by system 130.”). Sun ‘661 in view of Hirano does not teach comprising a weighing scale configured to measure the physical quantity. Sun ‘517, in the same field of endeavor, teaches comprising a weighing scale configured to measure the physical quantity (Sun ‘517: Column 8 lines 1-42, “In the example shown in FIG. 2A, system 200 includes image sensors, including in this example 3D cameras 214 and 216. In various embodiments, other types of sensors may be used (individually or in combination) in a singulation system as disclosed herein, including a camera, an infrared sensor array, a laser array, a scale, a gyroscope, a current sensor, a voltage sensor, a power sensor, a force sensor, a pressure sensor, a weight sensor, and the like.”, Column 13 lines 26-34, “According to various embodiments, system 200 may include one or more sensors other than or in addition to a plurality of cameras, such as one or more of an infrared sensor array, a laser array, a scale, a gyroscope, a current sensor, a voltage sensor, a power sensor, and the like. Information received from the various other sensors is used in determining one or more attributes of the item to be singulated and/or attributes of another item or object within the workspace, etc.”. The cited passages clearly show that a scale is used to determine an attribute of the object.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robot system taught in Sun ‘661 in view of Hirano with comprising a weighing scale configured to measure the physical quantity taught in Sun ‘517 with a reasonable expectation of success.
One of ordinary skill in the art would have been motivated to make this modification because it would have required the simple substitution of one known sensor for another. Both Sun ‘661 and Sun ‘517 teach sensors located in areas adjacent to the robot arm that are used to determine an attribute of the object the robot is to grasp. Additionally, both a weight sensor and a weighing scale are used to determine physical properties of an object such as weight. A person of ordinary skill in the art would have had the technological capabilities required to have substituted the weight sensor taught in Sun ‘661 for the weighing scale taught in Sun ‘517. Such a modification would not have changed or introduced new functionality to either. No inventive effort would have been required. Claim(s) 5 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 12090661 B2 ("Sun '661") in view of JP 2005161507 A ("Hirano") in further view of JP 2018155555 A ("Shiraishi"). Regarding claim 5, Sun ‘661 in view of Hirano teaches wherein the processor is configured to identify a timing at which a statistical value of a result of measuring the physical quantity in a predetermined previous time period (Hirano: ¶ 0043, “In reality, the value of W<sub>O</sub> changes with each sampling due to the influence of noise from the six-axis force sensor 6 and the rotary encoder 5, so W<sub>O</sub> calculated over a predetermined time period (for example, one second) is averaged, and the average value is used as an estimate of the weight of the object.”). Sun ‘661 in view of Hirano does not teach wherein the processor is configured to identify a timing at which a statistical value of a result of measuring the physical quantity in a predetermined previous time period exceeds a threshold value as the timing at which the target object is released by the robot arm.
Shiraishi, in the same field of endeavor, teaches wherein the processor is configured to identify a timing at which a statistical value of a result of measuring the physical quantity in a predetermined previous time period exceeds a threshold value as the timing at which the target object is released by the robot arm (Shiraishi: ¶ 0035, “Note that the stop determination unit 43 determines whether or not the stop condition is satisfied by directly using the output value without performing the filtering process on the output value of the force sensor 34 or the output value of the acceleration sensor 35. However, the stop determination unit 43 may determine whether or not the stop condition is satisfied based on a value obtained by subjecting output values ​​of the force sensor 34 and the acceleration sensor 35 to a predetermined filtering process. Here, as shown in FIG. 3, a delay occurs with respect to the output value A of the force sensor 34 for the filtered value B. The delay time differs depending on the type of filter used and the like. The stop determination unit 43 uses a filter (for example, a moving average filter or the like) having a delay time shorter than the filter used by the mass calculation unit 42 when a predetermined filter process is performed on the output value of the force sensor 34 or the like. Thus, even when the filter is used, the stop determination unit 43 can quickly determine whether or not the stop condition is satisfied.”, ¶ 0036, “When the stop determining unit 43 determines that the stopping condition is satisfied while the holding mechanism 20 is being moved by the robot arm 10, the operation control unit 41 stops the operation of the robot arm 10. 
When it is determined that the stop condition is satisfied by the stop determination unit 43 while the holding mechanism 20 is being moved by the robot arm 10, the operation control unit 41 releases holding of the article W by the holding mechanism 20 (that is, release the article W).”, ¶ 0040, “As described above, in the mass measurement device 1, when the output value of the force sensor 34 exceeds the reference force measurement value, or when the output value of the acceleration sensor 35 exceeds the reference acceleration measurement value, it is determined that the stop condition is satisfied. When the change rate of the output value of the force sensor 34 exceeds the reference force change rate or when the output value of the acceleration sensor 35 exceeds the reference acceleration change rate, the stop determination unit 43 determines that the stop condition is satisfied. The operation control unit 41 stops the operation of the robot arm 10 when it is determined that the stop condition is satisfied.”. As can be seen from the cited passage, the robot is configured to release an object when a stop condition is satisfied. Said stop condition is when the rate of change of a measured value of the object exceeds a threshold. The measured value can additionally be an average value of the measurement taken by the sensor.). Sun ‘661 in view of Hirano teaches a robot system comprising: wherein the processor is configured to identify a timing at which a statistical value of a result of measuring the physical quantity in a predetermined previous time period. Sun ‘661 in view of Hirano does not teach wherein the processor is configured to identify a timing at which a statistical value of a result of measuring the physical quantity in a predetermined previous time period exceeds a threshold value as the timing at which the target object is released by the robot arm.
A person of ordinary skill in the art would have had the technological capabilities required to have modified the system taught in Sun ‘661 in view of Hirano with wherein the processor is configured to identify a timing at which a statistical value of a result of measuring the physical quantity in a predetermined previous time period exceeds a threshold value as the timing at which the target object is released by the robot arm taught in Shiraishi. Furthermore, the system taught in Sun ‘661 in view of Hirano is already configured to determine the timing of the release based on an average value of a measured property of the object. Modifying the method to determine the timing based on the time it takes for the average value to exceed a threshold as taught in Shiraishi would require the simple addition of the threshold. A person of ordinary skill in the art would have had the technological capabilities to implement such a threshold into the system. Such a modification would not have changed or introduced new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a robot system comprising: wherein the processor is configured to identify a timing at which a statistical value of a result of measuring the physical quantity in a predetermined previous time period exceeds a threshold value as the timing at which the target object is released by the robot arm. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robot system taught in Sun ‘661 in view of Hirano with wherein the processor is configured to identify a timing at which a statistical value of a result of measuring the physical quantity in a predetermined previous time period exceeds a threshold value as the timing at which the target object is released by the robot arm taught in Shiraishi with a reasonable expectation of success.
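The release-detection scheme the claim 5 combination describes (Hirano's averaging of a measured physical quantity over a predetermined previous time period, plus Shiraishi's threshold test on that statistic) can be sketched as follows. This is an illustrative approximation only; the class name, window size, and threshold value are hypothetical and appear in neither reference.

```python
from collections import deque

class ReleaseDetector:
    """Illustrative sketch (not from the record): flags the release timing
    when the moving average of a measured physical quantity (e.g., a wrist
    force reading) over a predetermined previous time period exceeds a
    threshold value."""

    def __init__(self, window_size, threshold):
        # Samples falling within the "predetermined previous time period"
        self.window = deque(maxlen=window_size)
        self.threshold = threshold

    def update(self, sample, t):
        """Add one sensor sample taken at time t. Returns t if the windowed
        average (the statistical value) exceeds the threshold, i.e., the
        identified release timing; otherwise returns None."""
        self.window.append(sample)
        if len(self.window) == self.window.maxlen:
            avg = sum(self.window) / len(self.window)  # averaging per Hirano
            if avg > self.threshold:                   # threshold test per Shiraishi
                return t
        return None
```

Fed a stream of sensor samples at the sampling rate, the first non-None return is the identified timing; the windowed average suppresses the per-sample noise Hirano describes, while the threshold supplies the release criterion Shiraishi adds.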
One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Claim(s) 6 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 12090661 B2 ("Sun '661") in view of JP 2005161507 A ("Hirano") in further view of JP 2004188242 A ("Suzuki"). Regarding claim 6, Sun ‘661 in view of Hirano does not teach wherein the processor is configured to identify the timing in a period indicating a status in which the target object is released by the robot arm. Suzuki teaches wherein the processor is configured to identify the timing in a period indicating a status in which the target object is released by the robot arm (Suzuki: Figure 6, ¶ 0042, “In Figure 6, between times Ta and Tb, the measured sorting objects S measured by the measuring device M are placed sequentially in the bucket 3 without being removed, and after the placed measured sorting object S is detected by the presence/absence sensor 24, the measurement information of the next sorting object S is obtained from the measuring device M.”. As can be seen from the cited passage and figures, the system is configured to determine the timing to release a gripped object in the time period Ta to Tb, which is the time period indicating the status that the object is to be released in the bucket.). Sun ‘661 in view of Hirano teaches a robot system configured to determine the trajectory and velocity of a robot arm in order to perform a pick and place operation on an object and further determine the release timing of said object; it does not, however, teach identifying the timing in a period indicating a status in which the target object is released by the robot arm, the limitation Suzuki supplies.
A person of ordinary skill in the art would have had the technological capabilities required to have modified the system taught in Sun ‘661 with wherein the processor is configured to identify the timing in a period indicating a status in which the target object is released by the robot arm taught in Suzuki with a reasonable expectation of success. Furthermore, the system taught in Sun ‘661 in view of Hirano is already configured to determine the timing for releasing the object based on when the robot reaches the container the object is to be placed in (Sun ‘661: Column 14 lines 18-36, “FIG. 5A is a diagram illustrating an example of velocity control in an embodiment of a velocity control-based robotic system. In the example environment and system 500, an end effector 502, e.g., mounted at the distal/operative end of a robotic arm (not shown in FIG. 5A) with an object 504 in its grasp is tasked to place the object 504 in a receptacle 508. If the receptacle 508 were stationary, the system could use position control to move the end effector 502 and object 504 to a static position, e.g., point 506 in the example shown, above the receptacle 508 and release the object 504 to place it in the receptacle 508.”, Column 14 lines 37-46, “In various embodiments, a robotic system as disclosed herein uses velocity control to more quickly and accurately place the object 504 into the moving receptacle 508. For example, in some embodiments, a robotic control system as disclosed herein would observe and determine the velocity 510 of the receptacle 508 and would compute and implement via velocity control, as disclosed herein, a trajectory to intercept and then move in parallel with receptacle 508, to enable the object 504 to be placed successfully into the receptacle 508, as both continued to move at velocity 510.”). The timing of the release is clearly variable and dependent on velocity.
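The velocity-control placement Sun ‘661 describes in the cited passage (observe the receptacle's velocity, compute a trajectory that intercepts it, then move in parallel with it to release) can be sketched as a constant-velocity intercept calculation. This is an illustrative approximation only; the function, the 2-D simplification, and the closed-form quadratic are hypothetical and not drawn from Sun ‘661.

```python
import math

def intercept_time(p_e, p_r, v_r, arm_speed):
    """Illustrative sketch (not from the record): earliest time t at which an
    end effector moving at arm_speed from position p_e can reach a receptacle
    starting at p_r and moving at constant velocity v_r. Positions and
    velocities are 2-D tuples (x, y). Returns None if no intercept exists."""
    dx, dy = p_r[0] - p_e[0], p_r[1] - p_e[1]
    # Intercept condition |d + v_r * t| = arm_speed * t expands to the
    # quadratic a*t^2 + b*t + c = 0 in t.
    a = v_r[0] ** 2 + v_r[1] ** 2 - arm_speed ** 2
    b = 2 * (dx * v_r[0] + dy * v_r[1])
    c = dx ** 2 + dy ** 2
    if abs(a) < 1e-12:  # arm speed equals receptacle speed: linear case
        return -c / b if b < 0 else None
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    roots = sorted([(-b - math.sqrt(disc)) / (2 * a),
                    (-b + math.sqrt(disc)) / (2 * a)])
    times = [t for t in roots if t > 0]
    return times[0] if times else None
```

Once an intercept time exists, commanding the end effector at the receptacle's velocity from the intercept point keeps the two moving in parallel, and the release timing can then be chosen anywhere within that matched-velocity window, which is consistent with the examiner's observation that the release timing is variable and velocity-dependent.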
As such, a person of ordinary skill in the art could add a period of time indicating a release status as taught in Suzuki according to known methods. Such an addition of an allowed period of time to release the object would not change or introduce new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a robot system comprising: wherein the processor is configured to identify the timing in a period indicating a status in which the target object is released by the robot arm. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robot system taught in Sun ‘661 in view of Hirano with wherein the processor is configured to identify the timing in a period indicating a status in which the target object is released by the robot arm taught in Suzuki with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to Noah W Stiebritz whose telephone number is (571)272-3414. The examiner can normally be reached Monday thru Friday 7-5 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado, can be reached at (571) 270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /N.W.S./Examiner, Art Unit 3658 /TRUC M DO/Primary Examiner, Art Unit 3658

Prosecution Timeline

Sep 24, 2024
Application Filed
Jan 05, 2026
Non-Final Rejection — §101, §103
Apr 02, 2026
Applicant Interview (Telephonic)
Apr 06, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602063
LOAD HANDLING SYSTEM AND LOAD HANDLING METHOD
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12575900
Steerable Eversion Robot System and Method of Operating the Steerable Eversion Robot System
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12552043
METHOD FOR CONTROLLING ROBOTIC ARM, ELECTRONIC DEVICE, AND COMPUTER-READABLE STORAGE MEDIUM
Granted Feb 17, 2026 (2y 5m to grant)
Patent 12472640
CONTROL METHOD AND SYSTEM FOR ARTICLE TRANSPORTATION BASED ON MOBILE ROBOT
Granted Nov 18, 2025 (2y 5m to grant)
Patent 12467759
VEHICLE WITH SWITCHABLE FORWARD AND BACKWARD CONFIGURATIONS, CONTROL METHOD, AND CONTROL PROGRAM
Granted Nov 11, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 67%
With Interview: 51% (-15.6%)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 18 resolved cases by this examiner. Grant probability derived from career allow rate.
