Prosecution Insights
Last updated: April 19, 2026
Application No. 18/127,789

VISUAL HEIGHT ESTIMATOR WITH ROBOTIC POURING CONTROLLER FOR GRANULAR MEDIA

Final Rejection §103
Filed: Mar 29, 2023
Examiner: STIEBRITZ, NOAH WILLIAM
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Samsung Electronics Co., Ltd.
OA Round: 2 (Final)

Grant Probability: 67% (Favorable)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 2y 6m
Grant Probability with Interview: 51%

Examiner Intelligence

Career Allowance Rate: 67% (12 granted / 18 resolved; +14.7% vs TC average), above average
Interview Lift: -15.6% (minimal; based on resolved cases with interview)
Avg Prosecution: 2y 6m (44 applications currently pending)
Total Applications: 62 (across all art units)

Statute-Specific Performance

§101: 18.6% (-21.4% vs TC avg)
§103: 61.7% (+21.7% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 8.0% (-32.0% vs TC avg)
Tech Center averages are estimates; based on career data from 18 resolved cases.

Office Action

§103
DETAILED ACTION

This is a Final Office Action on the merits in response to communications filed by Applicant on January 27th, 2026. Claims 1-20 are currently pending and examined below.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendments to the claims, filed on January 27th, 2026, have been entered. Claims 1, 9, and 17 are currently amended and pending; claims 2-8, 10-16, and 18-20 are original, unamended, and pending.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-4, 7-12, and 15-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 11897138 B2 ("Tsuboi") in view of US 12090627 B2 ("Hwang") in further view of CN 113031437 A ("You") in further view of KR 101390819 B1 ("Kim").

Regarding claim 1, Tsuboi teaches a method for controlling a robotic device for pouring a granular media, the method comprising (Tsuboi: Abstract, “An apparatus capable of smoothly injecting contents in an object into another object, an injection method, and an injection program. The apparatus includes a robotic arm device configured to grip a first container, and circuitry configured to recognize a flowrate of contents while injecting an amount of the contents from the first container into a second container, and control a tilt of the first container using the robotic arm device to inject the contents into the second container according to the recognized flowrate of the contents.”, Column 3 lines 54-62, “FIG. 1 is a diagram illustrating the injection apparatus 10 gripping a container 21. The injection apparatus 10 is a robot having two arms (arms 131 and 132). The arm 131 has an arm part 131a and a grip part 131b. The arm 132 includes an arm 132a and a grip part 132b. Each of the arm parts 131a and 132a has joint parts. Each of the grip parts 131b and 132b is capable of gripping a container. In the example of FIG. 1, the injection apparatus 10 grips the container 21 by the grip part 131b of the arm 131.”, Column 17 lines 10-23, “Further, the injection apparatus 10 may determine whether the contents of the container 21 are grain or liquid while viewing the container 22 by a sensor such as the vision sensor.
Then, the injection apparatus 10 may change the way to inject the contents of the container 21 depending on whether the contents of the container 21 are grain or liquid. For example, the injection apparatus 10 may change the tilt amount of the container 21 depending on whether the contents of the container 21 are liquid or grain. Further, the injection apparatus 10 may change the target injection time or the acceleration/deceleration time depending on whether the contents of the container 21 are liquid or grain. The contents can be smoothly injected into the container 22 according to the kind of the contents.”, Column 18 lines 26-30, “Further, various embodiments can be combined as needed when the processing contents are not incompatible. Further, the order of the respective steps illustrated in the sequence diagrams or the flowcharts of various embodiments can be changed as needed.”. The cited passages teach that the robot can be configured to pour both liquid and granular media, and that the various embodiments described can be combined as needed.): obtaining an image of a receiving container (Tsuboi: Column 14 lines 24-34, “In a case where the container 21 or the container 22 includes a transparent material, the injection apparatus 10 can observe the contents in the container 21 or the container 22 by use of the vision sensor. In this case, the injection apparatus 10 measures the liquid level height of the liquid in the container 21 or the container 22 by the vision sensor. FIG. 17 is a diagram illustrating how the liquid level height is measured by the vision sensor. Also in this method, the injection apparatus 10 can recognize the liquid level height, thereby smoothly injecting the contents into the container 22.”. The cited passage clearly teaches receiving an image of the receiving container.); identifying, using the image, a current height of granular media in the receiving container (Tsuboi: Column 10 lines 59-67, “FIG. 
9 is a diagram illustrating an exemplary model of the container 22. The example of FIG. 9 illustrates a glass in a truncated cone shape with the height he, the radius Re of the opening, and the radius re of the bottom as a model of the container 22. A method for estimating the amount of liquid in the container 22 will be described below by way of the model of the container 22 illustrated in FIG. 9.”, Column 11 lines 6-10, “At first, the recognition part 142 measures the height le of the opening of the container 22 from the liquid level on the basis of the information from the sensors provided in the measurement part 11. For example, the recognition part 142 measures the height le by a depth image.”, Column 11 lines 11-18, “The recognition part 142 then estimates the volume Vw of the liquid in the container 22 on the basis of the height le. For example, the recognition part 142 calculates the depth de [m] of the liquid in the container 22 in Equation (6).”, Column 14 lines 24-34, “In a case where the container 21 or the container 22 includes a transparent material, the injection apparatus 10 can observe the contents in the container 21 or the container 22 by use of the vision sensor. In this case, the injection apparatus 10 measures the liquid level height of the liquid in the container 21 or the container 22 by the vision sensor. FIG. 17 is a diagram illustrating how the liquid level height is measured by the vision sensor. Also in this method, the injection apparatus 10 can recognize the liquid level height, thereby smoothly injecting the contents into the container 22.”. As can be seen from the cited passages, the system is configured to determine the amount of liquid in the receiving container based on an image of said receiving container.
One of ordinary skill in the art would recognize that the depth of the liquid in the receiving container is the same as the height of the liquid in the receiving container.); identifying a terminal height of the granular media in the receiving container (Tsuboi: Column 11 lines 53-58, “The planning part 143 plans the liquid level height [m] by the volume flowrate target value Vref. The planning part 143 then plans the tilt amount [rad] by the planned liquid level height. FIG. 11 is a diagram illustrating specific examples of the liquid level height and the tilt amount in the examples of FIG. 10A to FIG. 10C.”); determining an input trajectory signal to the robotic device for pouring a non-granular media to the terminal height of the granular media based on the current height of the granular media (Tsuboi: Column 7 line 56 – Column 8 line 7, “Subsequently, the planning part 143 in the injection apparatus 10 determines the amount of liquid to be injected into the container 22 on the basis of the estimated value of the amount of liquid in the container 22 (step S104). The planning part 143 then determines a flowrate plan (step S105). FIG. 5 is a diagram illustrating an exemplary flowrate plan. The vertical axis indicates volume flowrate Vfr and the horizontal axis indicates time. In the example of FIG. 5, the planning part 143 determines a plan in which the flowrate increases until time t1, the flowrate remains constant (volume flowrate Vfr1) until time t2, the flowrate decreases, and then the injection ends at time t3. By way of example, the injection amount is 100 ml, the time t1 is 2 sec, the time t2 is 10 sec, time t3 is 12 sec, and the volume flowrate Vfr1 is 10 ml/sec. The values indicated in the flowrate plan are target values of the flowrate. Additionally, in the example of FIG.
5, the flowrate plan is based on the volume of the liquid, but the flowrate plan may be based on the weight of the liquid.”, Column 8 lines 8-16, “Subsequently, the recognition part 142 in the injection apparatus 10 measures the center position of the container 22 (such as the center position of the glass) (step S106). The planning part 143 in the injection apparatus 10 then determines a tip trajectory plan of the injection port of the container 21 (such as a pot tip trajectory plan) (step S107). The drive control part 146 in the injection apparatus 10 then moves the tip position of the container 21 according to the tip trajectory plan (step S108).”, Column 8 lines 17-24, “The injection apparatus 10 then performs an injection operation on the container 21 (step S109). For example, the determination part 144 in the injection apparatus 10 determines the tilt amount of the container 21, and the tilt control part 145 in the injection apparatus 10 performs tilt control of the container 21 on the basis of the determined tilt amount. The drive control part 146 drives the arm 131 under control of the tilt control part 145.”, Column 11 lines 53-58, “The planning part 143 plans the liquid level height [m] by the volume flowrate target value Vref. The planning part 143 then plans the tilt amount [rad] by the planned liquid level height. FIG. 11 is a diagram illustrating specific examples of the liquid level height and the tilt amount in the examples of FIG. 10A to FIG. 10C.”. The cited passages clearly teach that a trajectory used to pour a non-granular media is determined based on the height of the media in the receiving container and the desired height of the media in the receiving container.); and controlling the robotic device to tilt a source container according to the wrist tilt command signal (Tsuboi: Column 8 lines 17-24, “The injection apparatus 10 then performs an injection operation on the container 21 (step S109).
For example, the determination part 144 in the injection apparatus 10 determines the tilt amount of the container 21, and the tilt control part 145 in the injection apparatus 10 performs tilt control of the container 21 on the basis of the determined tilt amount. The drive control part 146 drives the arm 131 under control of the tilt control part 145.”). Tsuboi does not teach identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container; determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media; and controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal. Hwang, in the same field of endeavor, teaches controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal (Hwang: Column 9 lines 53-59, “The first dropping operation S121 includes an operation of dropping the coffee powder by rotating the coffee powder container 220 gripped by the robot arm 160 in a vertical direction. Here, the vertical rotation means that an opened inlet of the coffee powder container 220 is rotated toward the dripper 210, that is, toward the floor, and the coffee powder may be naturally dropped by gravity.”, Column 9 line 60 – Column 10 line 10, “The second dropping operation S122 may include an operation in which the robot arm 160 performs an upward and downward motion at least once while gripping the coffee powder container 220 in the rotated state. In the second dropping operation S122, the remaining coffee powder stuck in the coffee powder container 220 that is not dropped may also be provided to the dripper 210.
In addition, in the second dropping operation S122, a downward motion of the upward and downward motion may include a stop motion in a state in which the robot arm 160 is accelerated. When the robot arm 160 gripping the coffee powder container 220 performs a downward motion, if the robot arm 160 is stopped while accelerating, that is, if the robot arm 160 suddenly stops during acceleration, the remaining coffee powders in the coffee powder container 220 may be separated from the container due to inertia. Accordingly, the remaining coffee powder may be more effectively provided to the dripper 210.”. The cited passages clearly teach controlling the wrist of the robot to tilt and vibrate a source container containing a granular media. One of ordinary skill in the art would recognize that the upward and downward motion applied to the source container would vibrate the granular media stored within.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the method for controlling a robotic device taught in Tsuboi with controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal taught in Hwang with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because it allows any granular media that was stuck in the source container to be poured into the receiving container. This allows for the more effective delivery of the granular media (Hwang: Column 9 line 60 – Column 10 line 10, “The second dropping operation S122 may include an operation in which the robot arm 160 performs an upward and downward motion at least once while gripping the coffee powder container 220 in the rotated state. In the second dropping operation S122, the remaining coffee powder stuck in the coffee powder container 220 that is not dropped may also be provided to the dripper 210.
In addition, in the second dropping operation S122, a downward motion of the upward and downward motion may include a stop motion in a state in which the robot arm 160 is accelerated. When the robot arm 160 gripping the coffee powder container 220 performs a downward motion, if the robot arm 160 is stopped while accelerating, that is, if the robot arm 160 suddenly stops during acceleration, the remaining coffee powders in the coffee powder container 220 may be separated from the container due to inertia. Accordingly, the remaining coffee powder may be more effectively provided to the dripper 210.”). Tsuboi in view of Hwang does not teach identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container; determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. You, in the same field of endeavor, teaches identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container (You: ¶ 0059, “Step 2: Use a deep neural network to identify the robot, source container, and target container, and simultaneously obtain the relative position information between the source and target containers, the liquid type, and the liquid level information to complete the acquisition of status information.”, ¶ 0060, “After acquiring images in space based on the depth camera, the deep convolutional neural network is used to extract spatial information such as the center of mass, edges, and deflection angles of each target object in the image. After processing, the relative coordinate information (Δx, Δy) with the robot as the origin in the space and the relative tilt angle α of the source container and the target container are obtained. 
At the same time, the liquid type n and the detected liquid level height hr are judged, the liquid type n is encoded as a one-hot vector, and the detected liquid level height is corrected according to the liquid type to obtain the final estimated liquid level h. Finally, the above information is combined to obtain the state vector s, that is, (Δx, Δy, h, α).”. The cited passages clearly teach that the current height of the media in the receiving container is determined using an image and a convolutional neural network.). Tsuboi in view of Hwang teaches a method for controlling a robotic device for pouring a granular media, the method comprising: identifying, using the image, a current height of granular media in the receiving container. Tsuboi in view of Hwang does not teach identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container. You teaches identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container. A person of ordinary skill in the art would have had the technological capabilities required to have modified the method taught in Tsuboi in view of Hwang with identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container taught in You. Furthermore, the method taught in Tsuboi in view of Hwang is already configured to determine the height of the granular media in the receiving container using an image of said receiving container. Modifying the method taught in Tsuboi in view of Hwang to use a convolutional neural network to determine the height of the granular media as taught in You would only require the simple addition of a known algorithm. Additionally, a convolutional neural network would have been known to one of ordinary skill in the art and one of ordinary skill in the art would have had the technological capabilities to implement such an algorithm.
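The state-vector assembly quoted from You (¶ 0060) can be sketched in outline. The convolutional network itself is out of scope here, so detection is assumed to have already produced a raw height; the media vocabulary and per-type correction factors below are illustrative assumptions, not values from the reference.

```python
# Sketch of the state-vector assembly described in You (CN 113031437 A,
# ¶¶ 0059-0060). Detection is assumed done; the media types and height
# correction factors are hypothetical placeholders.

MEDIA_TYPES = ["water", "rice", "sand"]          # hypothetical vocabulary

# You ¶ 0060 corrects the detected level by media type but gives no
# numbers; these factors are assumed for illustration.
HEIGHT_CORRECTION = {"water": 1.00, "rice": 0.95, "sand": 0.90}

def one_hot(media):
    """Encode the media type n as a one-hot vector (You ¶ 0060)."""
    return [1.0 if m == media else 0.0 for m in MEDIA_TYPES]

def corrected_height(h_detected, media):
    """Correct the detected fill height h_r by the media type."""
    return h_detected * HEIGHT_CORRECTION[media]

def state_vector(dx, dy, h_detected, alpha, media):
    """Combine relative position, corrected height h, and tilt angle
    into the state vector s = (dx, dy, h, alpha) of You ¶ 0060."""
    return (dx, dy, corrected_height(h_detected, media), alpha)

s = state_vector(0.12, -0.05, 0.08, 0.0, "rice")
```

The one-hot encoding and the correction step mirror the quoted passage; everything numeric is a stand-in.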
Such a modification would not have changed or introduced new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a method for controlling a robotic device for pouring a granular media, the method comprising: identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the method taught in Tsuboi in view of Hwang with identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container taught in You with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Tsuboi in view of Hwang in further view of You does not teach determining a wrist tilt command signal by modulating the input trajectory signal by a square wave comprising a frequency and an amplitude corresponding to a type of the granular media. Kim, in the same field of endeavor, teaches determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media (Kim: Figures 5 and 6, Abstract, “The present invention provides a trajectory generation system for converting a target trajectory to a follow-up trajectory to which a robot actually follows up.
The trajectory generation system comprises: a target trajectory generation module for generating the target trajectory of the robot corresponding to the inputted motion of a user; and a follow-up trajectory generation module for converting square waves per unit area to the follow-up trajectory by convoluting the square waves per unit area to the inputted trajectory. When the second motion of the user is inputted to the robot during the operation of the robot operated by following up a first follow-up trajectory, which is generated by converting a first target trajectory according to an inputted first motion of the user, the target trajectory generation module generates a second target trajectory corresponding to an inputted second motion and the follow-up trajectory generation module generates first and second convolution trajectories by convoluting the first and second target trajectories, respectively. Accordingly, the system can generate a new follow-up trajectory by connecting the end point of the first convolution trajectory and the start point of the second convolution trajectory at a time when the second motion is inputted. Also, the present invention provides a trajectory generation method using the trajectory system.”, ¶ 0026, “Referring to Fig. 2 (a), the target trajectory generation module 10 generates a target trajectory P0 (t) corresponding to a target point input of the user. The target trajectory P0 (t) is an ideal trajectory in which the robot moves the distance S to reach the target point without any time lag, but is a trajectory in which the actual robot cannot follow.”, ¶ 0027, “Therefore, the target trajectory P0 (t) is converted into the following trajectory that the robot can actually follow by the trajectory generating module 20.”, ¶ 0029, “2 (a), the convolution operation module 21 convolutes the convolution function h0 (t), which is a square wave function of the unit area, on the target trajectory P0 (t), and outputs the trajectory P1 . 
The trajectory determination module 22 considers the differential functions of the trajectory P1 (t) to determine whether the trajectory P1 (t) is a trajectory suitable for the robot to follow.”, ¶ 0032, “As shown in Fig. 3, the convolution function used in the second convolution is a convolution function h1 (t) which is a square wave function of a unit area.”, ¶ 0069, “Referring to Fig. 5 (a), the target trajectory generation module 10 generates a target trajectory y0 (t) corresponding to a target point input of the user. Specifically, the target trajectory generation module 10 generates a target trajectory y0 (t) in the form of a square waveform function having a time t0 and a velocity V1 so that the robot can move the distance S and reach the target point d0. V1 is selected so that it does not exceed the maximum value Vmax of the speed that the actuator of the robot can generate.”, ¶ 0075, “As shown in Fig. 6, the convolution function used for the second convolution is a convolution function h2 (t) which is a square wave function of a unit area.”. The cited passages clearly teach that the command signal of the robot (i.e. the target trajectory) is modified using different square waves. Additionally, the cited figures and passages clearly show that the control signal can be modified by square waves of different frequency and amplitude.). Tsuboi in view of Hwang in further view of You teaches a method for controlling a robotic device for pouring a granular media. Tsuboi in view of Hwang in further view of You does not teach determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. Kim teaches determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media.
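The claimed modulation, an input tilt trajectory perturbed by a square wave whose frequency and amplitude are selected by media type, can be sketched as follows. The per-media parameters are hypothetical, not taken from the claims or from Kim.

```python
import math

# Hedged sketch of square-wave modulation of a wrist-tilt trajectory.
# All numeric values are illustrative assumptions.

MEDIA_PARAMS = {          # hypothetical (frequency in Hz, amplitude in rad)
    "rice": (4.0, 0.02),
    "sugar": (8.0, 0.01),
}

def square_wave(t, freq, amp):
    """Square wave of the given frequency and amplitude at time t."""
    return amp if math.sin(2.0 * math.pi * freq * t) >= 0.0 else -amp

def wrist_tilt_command(t, nominal_tilt, media):
    """Modulate the input trajectory signal by the media-specific wave."""
    freq, amp = MEDIA_PARAMS[media]
    return nominal_tilt + square_wave(t, freq, amp)

# Sampled tilt ramp over 2 s, vibrated with the "rice" parameters
ramp = [(0.01 * k, 0.26 * 0.01 * k) for k in range(201)]
command = [wrist_tilt_command(t, tilt, "rice") for t, tilt in ramp]
```

The additive perturbation keeps the nominal pour trajectory while superimposing the vibration; a multiplicative scheme would be an equally plausible reading of "modulating".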
A person of ordinary skill in the art would have had the technological capabilities required to have modified the method taught in Tsuboi in view of Hwang in further view of You with determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media taught in Kim. Furthermore, the method taught in Tsuboi in view of Hwang in further view of You is already configured to determine the type of media being poured and modify the pouring process based on the type and properties of the media (Tsuboi: Column 17 lines 24-37, “Further, the injection apparatus 10 may determine whether the contents of the container 21 are grain or powder while viewing the container 22 by a sensor such as the vision sensor. Then, the injection apparatus 10 may change the way to inject the contents of the container 21 depending on whether the contents of the container 21 are grain or powder. For example, the injection apparatus 10 may change the tilt amount of the container 21 depending on whether the contents of the container 21 are powder or grain. Further, the injection apparatus 10 may change the target injection time or the acceleration/deceleration time depending on whether the contents of the container 21 are powder or grain. The contents can be smoothly injected into the container 22 according to the kind of the contents.”, Column 17 lines 38-46, “Of course, the injection apparatus 10 may change the way to inject the contents of the container 21 depending on whether the contents of the container 21 are grain, powder, or liquid. The injection apparatus 10 can smoothly inject the contents into the container 22 according to the kind of the contents of the container 21. 
The target injection time or the acceleration/deceleration time may be changed depending on whether the contents of the container 21 are grain, powder, or liquid.”, Column 22 lines 21-24, “The apparatus according to any one of (25) to (29), wherein the circuitry is further configured to determine one or more characteristics related to the contents contained by the first container.”, Column 22 lines 26-30, “The apparatus according to any one of (25) to (30), wherein the one or more characteristics related to the contents includes at least one of type, weight, height, or viscosity of the contents.”, Column 22 lines 37-42, “The apparatus according to any one of (25) to (32), wherein the circuitry determines a flowrate plan for injecting the amount of the contents from the first container into the second container according to the one or more determined characteristics related to the first container.”). Additionally, Kim teaches using multiple square waves to modify the trajectory of the robot, wherein the square waves have different frequencies and amplitudes. Therefore, one of ordinary skill in the art would have been able to modify the methods such that a different square wave is used for each type of media. A person of ordinary skill in the art would have also been able to modify the method taught in Tsuboi in view of Hwang in further view of You such that the trajectory is modified using a square wave taught in Kim, as such a modification would only require the simple addition of the mathematical equations taught in Kim. Such modifications would not have changed or introduced new functionality. No inventive effort would have been required.
The combination would have yielded the predictable result of a method for controlling a robotic device for pouring a granular media comprising: determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the method taught in Tsuboi in view of Hwang in further view of You with determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media taught in Kim with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results.

Regarding claim 2, Tsuboi in view of Hwang in further view of You in further view of Kim teaches further comprising: while controlling the robotic device to tilt and vibrate the source container, identifying the current height of the granular media in the receiving container at predetermined time intervals (You: ¶ 0091, “Step 4: After the policy network converges, the robot's posture information and state information are input into the policy network, and the robot's action strategy is output, specifically:” ¶ 0092, “After the policy network converges, the relative coordinate information (Δx, Δy) of the source and target containers, the relative tilt angle α, and the corrected liquid level h of the target container are input into the policy network.
The policy network outputs the robot's motion control vector a and transmits it to the robot to make the robot perform the corresponding action.”, ¶ 0093, “Step 5: Use the robot motion strategy predicted in step 4 to drive the robot to complete the water pouring action. The robot is set with a minimum action duration threshold ε. The robot continues to pour water according to the received action control vector a. The duration t is not less than the minimum action duration threshold ε. After completing the current action, it waits for a new action control vector until the water pouring task is completed.”. The cited passages clearly teach that after a time t spent pouring, the method is configured to repeat the process in order to determine a new control vector (which includes a tilt angle) until such a time that the pouring task is complete. One of ordinary skill in the art would recognize that in order for a control vector to be determined, the previous steps of the method would be performed and, therefore, an image of the receiving container would be captured and the height of the media in said container would be determined again.).

Regarding claim 3, Tsuboi in view of Hwang in further view of You in further view of Kim teaches further comprising modifying the wrist tilt command signal based on the identified current height of the granular media in the receiving container at each predetermined time interval (You: ¶ 0091, “Step 4: After the policy network converges, the robot's posture information and state information are input into the policy network, and the robot's action strategy is output, specifically:” ¶ 0092, “After the policy network converges, the relative coordinate information (Δx, Δy) of the source and target containers, the relative tilt angle α, and the corrected liquid level h of the target container are input into the policy network.
The policy network outputs the robot's motion control vector a and transmits it to the robot to make the robot perform the corresponding action.”, ¶ 0093, “Step 5: Use the robot motion strategy predicted in step 4 to drive the robot to complete the water pouring action. The robot is set with a minimum action duration threshold ε. The robot continues to pour water according to the received action control vector a. The duration t is not less than the minimum action duration threshold ε. After completing the current action, it waits for a new action control vector until the water pouring task is completed.”. One of ordinary skill in the art would see that because a new control vector and height of the media in the receiving container is determined after each time t, the tilt command would be updated based on the current height of the media in the receiving container.). Regarding claim 4, Tsuboi in view of Hwang in further view of You in further view of Kim teaches wherein the modifying the wrist tilt command signal comprises modifying the wrist tilt command signal using a proportional-derivative (PD) controller (Tsuboi: Column 9 lines 5-17, “The determination part 144 then determines the tilt amount of the container 21 on the basis of the calculated difference (step S202). For example, the determination part 144 determines the tilt amount by proportional control (P control) in which the calculated different is multiplied by a proportional gain, PI control in which the calculated difference is integrated, multiplied by an integral gain, and added to a proportional control term, PID control in which the differential of the calculated difference is multiplied by a differential gain and added to a PI control term, or the like. The tilt amount determined by the determination part 144 is output as a tilt amount instruction value θref to the tilt control part 145.”. 
One of ordinary skill in the art would have recognized that a PD controller is an obvious variation of a PID controller. One of ordinary skill in the art would have had the technological capabilities required to have implemented an obvious variation of a PID controller such as a PD controller. Additionally, the obvious variation of a PD controller can be easily achieved by simply not considering an integral term or by setting said integral term to zero.). Regarding claim 7, Tsuboi in view of Hwang in further view of You in further view of Kim teaches wherein obtaining the image comprises obtaining the image using a camera connected to the robotic device (Tsuboi: Column 5 lines 21-32, “Further, the measurement part 11 may include a sensor for giving visual information to the injection apparatus 10. For example, the measurement part 11 may include a vision sensor for visually capturing the container 21 or the container 22. Further, the measurement part 11 may include a camera for acquiring a thermographic image. Of course, the camera provided in the measurement part 11 may be a typical camera for capturing a visible ray. The vision sensor or the camera may be arranged at a different position from the main body of the injection apparatus 10. Also in this case, the sensors (including the camera) can be regarded as part of the injection apparatus 10.”). Regarding claim 8, Tsuboi in view of Hwang in further view of You in further view of Kim teaches wherein the image is an RGB image (Tsuboi: Column 5 lines 21-32, “Further, the measurement part 11 may include a sensor for giving visual information to the injection apparatus 10. For example, the measurement part 11 may include a vision sensor for visually capturing the container 21 or the container 22. Further, the measurement part 11 may include a camera for acquiring a thermographic image. Of course, the camera provided in the measurement part 11 may be a typical camera for capturing a visible ray. 
The vision sensor or the camera may be arranged at a different position from the main body of the injection apparatus 10. Also in this case, the sensors (including the camera) can be regarded as part of the injection apparatus 10.”. One of ordinary skill in the art would recognize that a camera configured to capture a visible ray would produce an RGB image.). Regarding claim 9, Tsuboi teaches a robotic device comprising (Tsuboi: Abstract, “An apparatus capable of smoothly injecting contents in an object into another object, an injection method, and an injection program. The apparatus includes a robotic arm device configured to grip a first container, and circuitry configured to recognize a flowrate of contents while injecting an amount of the contents from the first container into a second container, and control a tilt of the first container using the robotic arm device to inject the contents into the second container according to the recognized flowrate of the contents.”, Column 3 lines 54-62, “FIG. 1 is a diagram illustrating the injection apparatus 10 gripping a container 21. The injection apparatus 10 is a robot having two arms (arms 131 and 132). The arm 131 has an arm part 131a and a grip part 131b. The arm 132 includes an arm 132a and a grip part 132b. Each of the arm parts 131a and 132a has joint parts. Each of the grip parts 131b and 132b is capable of gripping a container. In the example of FIG. 1, the injection apparatus 10 grips the container 21 by the grip part 131b of the arm 131.”, Column 17 lines 10-23, “Further, the injection apparatus 10 may determine whether the contents of the container 21 are grain or liquid while viewing the container 22 by a sensor such as the vision sensor. Then, the injection apparatus 10 may change the way to inject the contents of the container 21 depending on whether the contents of the container 21 are grain or liquid. 
For example, the injection apparatus 10 may change the tilt amount of the container 21 depending on whether the contents of the container 21 are liquid or grain. Further, the injection apparatus 10 may change the target injection time or the acceleration/deceleration time depending on whether the contents of the container 21 are liquid or grain. The contents can be smoothly injected into the container 22 according to the kind of the contents.”, Column 18 lines 26-30, “Further, various embodiments can be combined as needed when the processing contents are not incompatible. Further, the order of the respective steps illustrated in the sequence diagrams or the flowcharts of various embodiments can be changed as needed.”. The cited passages teach that the robot can be configured to pour both liquid and granular media, and that the various embodiments described can be combined as needed.): at least one memory storing instructions (Tsuboi: Column 6 lines 3-14, “The control part 14 is a controller for controlling each part in the injection apparatus 10. The control part 14 is realized by a processor such as central processing unit (CPU) or micro processing unit (MPU), for example. For example, the control part 14 is realized when the processor executes various programs stored in the storage apparatus in the injection apparatus 10 by use of a random access memory (RAM) or the like as a work area. Additionally, the control part 14 may be realized by an integrated circuit such as application specific integrated circuit (ASIC) or field programmable gate array (FPGA). 
All of CPU, MPU, ASIC, and FPGA may be regarded as controller.”); and at least one processor configured to execute the instructions to (Tsuboi: Column 6 lines 3-14): obtain an image of a receiving container (Tsuboi: Column 14 lines 24-34, “In a case where the container 21 or the container 22 includes a transparent material, the injection apparatus 10 can observe the contents in the container 21 or the container 22 by use of the vision sensor. In this case, the injection apparatus 10 measures the liquid level height of the liquid in the container 21 or the container 22 by the vision sensor. FIG. 17 is a diagram illustrating how the liquid level height is measured by the vision sensor. Also in this method, the injection apparatus 10 can recognize the liquid level height, thereby smoothly injecting the contents into the container 22.”. The cited passage clearly teaches receiving an image of the receiving container.); identify, using the image, a current height of granular media in the receiving container (Tsuboi: Column 10 lines 59-67, “FIG. 9 is a diagram illustrating an exemplary model of the container 22. The example of FIG. 9 illustrates a glass in a truncated cone shape with the height he, the radius Re of the opening, and the radius re of the bottom as a model of the container 22. A method for estimating the amount of liquid in the container 22 will be described below by way of the model of the container 22 illustrated in FIG. 9.”, Column 11 lines 6-10, “At first, the recognition part 142 measures the height le of the opening of the container 22 from the liquid level on the basis of the information from the sensors provided in the measurement part 11. For example, the recognition part 142 measures the height le by a depth image.”, Column 11 lines 11-18, “The recognition part 142 then estimates the volume Vw of the liquid in the container 22 on the basis of the height le. 
For example, the recognition part 142 calculates the depth de [m] of the liquid in the container 22 in Equation (6).”, Column 14 lines 24-34, “In a case where the container 21 or the container 22 includes a transparent material, the injection apparatus 10 can observe the contents in the container 21 or the container 22 by use of the vision sensor. In this case, the injection apparatus 10 measures the liquid level height of the liquid in the container 21 or the container 22 by the vision sensor. FIG. 17 is a diagram illustrating how the liquid level height is measured by the vision sensor. Also in this method, the injection apparatus 10 can recognize the liquid level height, thereby smoothly injecting the contents into the container 22.”. As can be seen from the cited passages, the system is configured to determine the amount of liquid in the receiving container based on an image of said receiving container. One of ordinary skill in the art would recognize that the depth of the liquid in the receiving container is the same as the height of the liquid in the receiving container.); identify a terminal height of the granular media in the receiving container (Tsuboi: Column 11 lines 53-58, “The planning part 143 plans the liquid level height [m] by the volume flowrate target value Vref. The planning part 143 then plans the tilt amount [rad] by the planned liquid level height. FIG. 11 is a diagram illustrating specific examples of the liquid level height and the tilt amount in the examples of FIG. 10A to FIG. 10C.”); determine an input trajectory signal to the robotic device for pouring a non-granular media to the terminal height of the granular media based on the current height of the granular media (Tsuboi: Column 7 line 56 – Column 8 line 7, “Subsequently, the planning part 143 in the injection apparatus 10 determines the amount of liquid to be injected into the container 22 on the basis of the estimated value of the amount of liquid in the container 22 (step S104). The planning part 143 then determines a flowrate plan (step S105). FIG. 5 is a diagram illustrating an exemplary flowrate plan. The vertical axis indicates volume flowrate Vfr and the horizontal axis indicates time. In the example of FIG. 5, the planning part 143 determines a plan in which the flowrate increases until time t1, the flowrate remains constant (volume flowrate Vfr1) until time t2, the flowrate decreases, and then the injection ends at time t3. By way of example, the injection amount is 100 ml, the time t1 is 2 sec, the time t2 is 10 sec, time t3 is 12 sec, and the volume flowrate Vfr1 is 10 ml/sec. The values indicated in the flowrate plan are target values of the flowrate. Additionally, in the example of FIG. 5, the flowrate plan is based on the volume of the liquid, but the flowrate plan may be based on the weight of the liquid.”, Column 8 lines 8-16, “Subsequently, the recognition part 142 in the injection apparatus 10 measures the center position of the container 22 (such as the center position of the glass) (step S106). The planning part 143 in the injection apparatus 10 then determines a tip trajectory plan of the injection port of the container 21 (such as a pot tip trajectory plan) (step S107). The drive control part 146 in the injection apparatus 10 then moves the tip position of the container 21 according to the tip trajectory plan (step S108).”, Column 8 lines 17-24, “The injection apparatus 10 then performs an injection operation on the container 21 (step S109). 
For example, the determination part 144 in the injection apparatus 10 determines the tilt amount of the container 21, and the tilt control part 145 in the injection apparatus 10 performs tilt control of the container 21 on the basis of the determined tilt amount. The drive control part 146 drives the arm 131 under control of the tilt control part 145.”, Column 11 lines 53-58, “The planning part 143 plans the liquid level height [m] by the volume flowrate target value Vref. The planning part 143 then plans the tilt amount [rad] by the planned liquid level height. FIG. 11 is a diagram illustrating specific examples of the liquid level height and the tilt amount in the examples of FIG. 10A to FIG. 10C.”. The cited passages clearly teach that a trajectory used to pour a non-granular media is determined based on the height of the media in the receiving container and the desired height of the media in the receiving container.); and control the robotic device to tilt a source container according to the wrist tilt command signal (Tsuboi: Column 8 lines 17-24, “The injection apparatus 10 then performs an injection operation on the container 21 (step S109). For example, the determination part 144 in the injection apparatus 10 determines the tilt amount of the container 21, and the tilt control part 145 in the injection apparatus 10 performs tilt control of the container 21 on the basis of the determined tilt amount. The drive control part 146 drives the arm 131 under control of the tilt control part 145.”). Tsuboi does not teach identify, using the image and a convolutional neural network, a current height of granular media in the receiving container; determine a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media; and control the robotic device to tilt and vibrate a source container according to the wrist tilt command signal. 
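For context, the claimed square-wave modulation of the wrist tilt signal can be sketched as follows. This is a minimal illustration only; the additive form of the modulation, the media parameter table, and all numeric values are assumptions for illustration, not taken from the application or the cited references.

```python
# Hypothetical per-media square-wave parameters: (frequency [Hz], amplitude [rad]).
# These values are illustrative assumptions, not from the record.
MEDIA_PARAMS = {
    "rice": (4.0, 0.02),
    "sugar": (8.0, 0.01),
}

def square_wave(t, freq, amp):
    # Square wave: +amp for the first half of each period, -amp for the second.
    return amp if (t * freq) % 1.0 < 0.5 else -amp

def wrist_tilt_command(trajectory_angle, t, media):
    # One reading of "modulating the input trajectory signal by a square wave":
    # superimpose a media-specific vibration on the planned tilt trajectory.
    freq, amp = MEDIA_PARAMS[media]
    return trajectory_angle + square_wave(t, freq, amp)
```

Under this sketch, the same planned tilt angle yields a different commanded angle depending on the granular media, since the vibration's frequency and amplitude are looked up per media type.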
Hwang, in the same field of endeavor, teaches controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal (Hwang: Column 9 lines 53-59, “The first dropping operation S121 includes an operation of dropping the coffee powder by rotating the coffee powder container 220 gripped by the robot arm 160 in a vertical direction. Here, the vertical rotation means that an opened inlet of the coffee powder container 220 is rotated toward the dripper 210, that is, toward the floor, and the coffee powder may be naturally dropped by gravity.”, Column 9 line 60 – Column 10 line 10, “The second dropping operation S122 may include an operation in which the robot arm 160 performs an upward and downward motion at least once while gripping the coffee powder container 220 in the rotated state. In the second dropping operation S122, the remaining coffee powder stuck in the coffee powder container 220 that is not dropped may also be provided to the dripper 210. In addition, in the second dropping operation S122, a downward motion of the upward and downward motion may include a stop motion in a state in which the robot arm 160 is accelerated. When the robot arm 160 gripping the coffee powder container 220 performs a downward motion, if the robot arm 160 is stopped while accelerating, that is, if the robot arm 160 suddenly stops during acceleration, the remaining coffee powders in the coffee powder container 220 may be separated from the container due to inertia. Accordingly, the remaining coffee powder may be more effectively provided to the dripper 210.”. The cited passages clearly teach controlling the wrist of the robot to tilt and vibrate a source container containing a granular media. One of ordinary skill in the art would recognize that the upward and downward motion applied to the source container would vibrate the granular media stored within.). 
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robotic device taught in Tsuboi with controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal as taught in Hwang with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because it allows any granular media that was stuck in the source container to be poured into the receiving container. This allows for the more effective delivery of the granular media (Hwang: Column 9 line 60 – Column 10 line 10, “The second dropping operation S122 may include an operation in which the robot arm 160 performs an upward and downward motion at least once while gripping the coffee powder container 220 in the rotated state. In the second dropping operation S122, the remaining coffee powder stuck in the coffee powder container 220 that is not dropped may also be provided to the dripper 210. In addition, in the second dropping operation S122, a downward motion of the upward and downward motion may include a stop motion in a state in which the robot arm 160 is accelerated. When the robot arm 160 gripping the coffee powder container 220 performs a downward motion, if the robot arm 160 is stopped while accelerating, that is, if the robot arm 160 suddenly stops during acceleration, the remaining coffee powders in the coffee powder container 220 may be separated from the container due to inertia. Accordingly, the remaining coffee powder may be more effectively provided to the dripper 210.”). 
Tsuboi in view of Hwang does not teach identify, using the image and a convolutional neural network, a current height of granular media in the receiving container; determine a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. You, in the same field of endeavor, teaches identify, using the image and a convolutional neural network, a current height of granular media in the receiving container (You: ¶ 0059, “Step 2: Use a deep neural network to identify the robot, source container, and target container, and simultaneously obtain the relative position information between the source and target containers, the liquid type, and the liquid level information to complete the acquisition of status information.”, ¶ 0060, “After acquiring images in space based on the depth camera, the deep convolutional neural network is used to extract spatial information such as the center of mass, edges, and deflection angles of each target object in the image. After processing, the relative coordinate information (Δx, Δy) with the robot as the origin in the space and the relative tilt angle α of the source container and the target container are obtained. At the same time, the liquid type n and the detected liquid level height h<sub>r</sub> are judged, the liquid type n is encoded as a one-hot vector, and the detected liquid level height is corrected according to the liquid type to obtain the final estimated liquid level h. Finally, the above information is combined to obtain the state vector s, that is, (Δx, Δy, h, α).”. The cited passages clearly teach that the current height of the media in the receiving container is determined using an image and a convolutional neural network.). Tsuboi in view of Hwang teaches a robotic device comprising: identify, using the image, a current height of granular media in the receiving container. 
Tsuboi in view of Hwang does not teach identify, using the image and a convolutional neural network, a current height of granular media in the receiving container. You teaches identify, using the image and a convolutional neural network, a current height of granular media in the receiving container. A person of ordinary skill in the art would have had the technological capabilities required to have modified the device taught in Tsuboi in view of Hwang with identify, using the image and a convolutional neural network, a current height of granular media in the receiving container taught in You. Furthermore, the device taught in Tsuboi in view of Hwang is already configured to determine the height of the granular media in the receiving container using an image of said receiving container. Modifying the device taught in Tsuboi in view of Hwang to use a convolutional neural network to determine the height of the granular media as taught in You would only require the simple addition of a known algorithm. Additionally, a convolutional neural network would have been known to one of ordinary skill in the art and one of ordinary skill in the art would have had the technological capabilities to implement such an algorithm. Such a modification would not have changed or introduced new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a robotic device comprising: identify, using the image and a convolutional neural network, a current height of granular media in the receiving container. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the device taught in Tsuboi in view of Hwang with identify, using the image and a convolutional neural network, a current height of granular media in the receiving container taught in You with a reasonable expectation of success. 
One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Tsuboi in view of Hwang in further view of You does not teach a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. Kim, in the same field of endeavor, teaches a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media (Kim: Abstract, “The present invention provides a trajectory generation system for converting a target trajectory to a follow-up trajectory to which a robot actually follows up. The trajectory generation system comprises: a target trajectory generation module for generating the target trajectory of the robot corresponding to the inputted motion of a user; and a follow-up trajectory generation module for converting square waves per unit area to the follow-up trajectory by convoluting the square waves per unit area to the inputted trajectory. When the second motion of the user is inputted to the robot during the operation of the robot operated by following up a first follow-up trajectory, which is generated by converting a first target trajectory according to an inputted first motion of the user, the target trajectory generation module generates a second target trajectory corresponding to an inputted second motion and the follow-up trajectory generation module generates first and second convolution trajectories by convoluting the first and second target trajectories, respectively. Accordingly, the system can generate a new follow-up trajectory by connecting the end point of the first convolution trajectory and the start point of the second convolution trajectory at a time when the second motion is inputted. 
Also, the present invention provides a trajectory generation method using the trajectory system.”, ¶ 0026, “Referring to Fig. 2 (a), the target trajectory generation module 10 generates a target trajectory P0 (t) corresponding to a target point input of the user. The target trajectory P0 (t) is an ideal trajectory in which the robot moves the distance S to reach the target point without any time lag, but is a trajectory in which the actual robot cannot follow.”, ¶ 0027, “Therefore, the target trajectory P0 (t) is converted into the following trajectory that the robot can actually follow by the trajectory generating module 20.”, ¶ 0029, “2 (a), the convolution operation module 21 convolutes the convolution function h0 (t), which is a square wave function of the unit area, on the target trajectory P0 (t), and outputs the trajectory P1 . The trajectory determination module 22 considers the differential functions of the trajectory P1 (t) to determine whether the trajectory P1 (t) is a trajectory suitable for the robot to follow.”, ¶ 0032, “As shown in Fig. 3, the convolution function used in the second convolution is a convolution function h1 (t) which is a square wave function of a unit area.”, ¶ 0069, “Referring to Fig. 5 (a), the target trajectory generation module 10 generates a target trajectory y0 (t) corresponding to a target point input of the user. Specifically, the target trajectory generation module 10 generates a target trajectory y0 (t) in the form of a square waveform function having a time t0 and a velocity V1 so that the robot can move the distance S and reach the target point do. V1 is selected so that it does not exceed the maximum value Vmax of the speed that the actuator of the robot can generate.”, ¶ 0075, “As shown in Fig. 6, the convolution function used for the second convolution is a convolution function h2 (t) which is a square wave function of a unit area.”. The cited passages clearly teach that the command signal of the robot (i.e. 
the target trajectory) is modified using different square waves. Additionally, the cited figures and passages clearly show that the control signal can be modified by square waves of different frequency and amplitude.). Tsuboi in view of Hwang in further view of You teaches a robotic device for pouring a granular media. Tsuboi in view of Hwang in further view of You does not teach determine a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. Kim teaches determine a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. A person of ordinary skill in the art would have had the technological capabilities required to have modified the device taught in Tsuboi in view of Hwang in further view of You with determine a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media taught in Kim. Furthermore, the device taught in Tsuboi in view of Hwang in further view of You is already configured to determine the type of media being poured and modify the pouring process based on the type and properties of the media (Tsuboi: Column 17 lines 24-37, “Further, the injection apparatus 10 may determine whether the contents of the container 21 are grain or powder while viewing the container 22 by a sensor such as the vision sensor. Then, the injection apparatus 10 may change the way to inject the contents of the container 21 depending on whether the contents of the container 21 are grain or powder. For example, the injection apparatus 10 may change the tilt amount of the container 21 depending on whether the contents of the container 21 are powder or grain. 
Further, the injection apparatus 10 may change the target injection time or the acceleration/deceleration time depending on whether the contents of the container 21 are powder or grain. The contents can be smoothly injected into the container 22 according to the kind of the contents.”, Column 17 lines 38-46, “Of course, the injection apparatus 10 may change the way to inject the contents of the container 21 depending on whether the contents of the container 21 are grain, powder, or liquid. The injection apparatus 10 can smoothly inject the contents into the container 22 according to the kind of the contents of the container 21. The target injection time or the acceleration/deceleration time may be changed depending on whether the contents of the container 21 are grain, powder, or liquid.”, Column 22 lines 21-24, “The apparatus according to any one of (25) to (29), wherein the circuitry is further configured to determine one or more characteristics related to the contents contained by the first container.”, Column 22 lines 26-30, “The apparatus according to any one of (25) to (30), wherein the one or more characteristics related to the contents includes at least one of type, weight, height, or viscosity of the contents.”, Column 22 lines 37-42, “The apparatus according to any one of (25) to (32), wherein the circuitry determines a flowrate plan for injecting the amount of the contents from the first container into the second container according to the one or more determined characteristics related to the first container.”). Additionally, Kim teaches using multiple square waves to modify the trajectory of the robot, wherein the square waves have different frequencies and amplitudes. Therefore, one of ordinary skill in the art would have been able to modify the device such that a different square wave is used for each type of media. 
A person of ordinary skill in the art would have also been able to modify the device taught in Tsuboi in view of Hwang in further view of You such that the trajectory is modified using a square wave taught in Kim, as such a modification would only require the simple addition of the mathematical equations taught in Kim. Such modifications would not have changed or introduced new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a robotic device comprising: determine a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the device taught in Tsuboi in view of Hwang in further view of You with determine a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media taught in Kim with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. 
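Kim's conversion of a target trajectory into a follow-up trajectory by convolution with a unit-area square wave (¶¶ 0026-0032) can be illustrated with a short sketch; the step-shaped target and the window width below are assumptions chosen for illustration, not values from Kim.

```python
def follow_up_trajectory(target, width):
    # Discrete convolution of the target trajectory with a unit-area
    # square-wave (boxcar) function, per Kim's follow-up trajectory scheme.
    h = [1.0 / width] * width
    return [
        sum(h[k] * target[n - k] for k in range(width) if n - k >= 0)
        for n in range(len(target))
    ]

# An instantaneous step target (which the robot cannot track as-is)
# becomes a finite-slope ramp the actuators can actually follow.
target = [0.0] * 5 + [1.0] * 10      # step at sample 5
smooth = follow_up_trajectory(target, 4)
```

Because the square wave has unit area, the convolution preserves the final commanded position while limiting the rate of change, which is the property Kim relies on to make the trajectory followable.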
Regarding claim 10, Tsuboi in view of Hwang in further view of You in further view of Kim teaches wherein the at least one processor is further configured to execute the instructions to: while controlling the robotic device to tilt and vibrate the source container, identify the current height of the granular media in the receiving container at predetermined time intervals (You: ¶ 0091, “Step 4: After the policy network converges, the robot's posture information and state information are input into the policy network, and the robot's action strategy is output, specifically:” ¶ 0092, “After the policy network converges, the relative coordinate information (Δx, Δy) of the source and target containers, the relative tilt angle α, and the corrected liquid level h of the target container are input into the policy network. The policy network outputs the robot's motion control vector a and transmits it to the robot to make the robot perform the corresponding action.”, ¶ 0093, “Step 5: Use the robot motion strategy predicted in step 4 to drive the robot to complete the water pouring action. The robot is set with a minimum action duration threshold ε. The robot continues to pour water according to the received action control vector a. The duration t is not less than the minimum action duration threshold ε. After completing the current action, it waits for a new action control vector until the water pouring task is completed.”. The cited passages clearly teach that after a time t spent pouring, the method is configured to repeat the process in order to determine a new control vector (which includes a tilt angle) until such a time that the pouring task is complete. One of ordinary skill in the art would recognize that in order for a control vector to be determined, the previous steps of the method would be performed and, therefore, an image of the receiving container would be captured and the height of the media in said container would be determined again.). 
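The closed-loop behavior attributed to You for claims 2 and 10 — re-identifying the media height at each predetermined interval until the pour completes — can be sketched as follows. The per-interval height readings stand in for the image capture and height identification steps; the values and the terminal height are assumptions.

```python
def pour_loop(height_readings, terminal_height):
    # Each reading represents one predetermined time interval: capture an
    # image, identify the current media height, then decide whether to
    # continue tilting/vibrating or stop.
    intervals = 0
    for current_height in height_readings:
        intervals += 1
        if current_height >= terminal_height:
            break  # pouring task complete
    return intervals

# Simulated height estimates (cm) at successive intervals; assumed values.
intervals_used = pour_loop([1.0, 2.2, 3.1, 4.0, 4.9], terminal_height=4.0)
```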
Regarding claim 11, Tsuboi in view of Hwang in further view of You in further view of Kim teaches further comprising modifying the wrist tilt command signal based on the identified current height of the granular media in the receiving container at each predetermined time interval (You: ¶ 0091, “Step 4: After the policy network converges, the robot's posture information and state information are input into the policy network, and the robot's action strategy is output, specifically:” ¶ 0092, “After the policy network converges, the relative coordinate information (Δx, Δy) of the source and target containers, the relative tilt angle α, and the corrected liquid level h of the target container are input into the policy network. The policy network outputs the robot's motion control vector a and transmits it to the robot to make the robot perform the corresponding action.”, ¶ 0093, “Step 5: Use the robot motion strategy predicted in step 4 to drive the robot to complete the water pouring action. The robot is set with a minimum action duration threshold ε. The robot continues to pour water according to the received action control vector a. The duration t is not less than the minimum action duration threshold ε. After completing the current action, it waits for a new action control vector until the water pouring task is completed.”. One of ordinary skill in the art would recognize that because a new control vector and a new height of the media in the receiving container are determined after each time t, the tilt command would be updated based on the current height of the media in the receiving container.). 
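The height-feedback update discussed above, implemented as a PD-style controller (i.e., a PID controller with the integral term set to zero), can be sketched as follows. This is an illustrative example only; the gain values and sampling interval are hypothetical and are not drawn from Tsuboi or any other cited reference.

```python
class PDTiltController:
    """Compute a tilt correction from the height error using PD control,
    i.e., a PID controller with the integral gain set to zero."""

    def __init__(self, kp, kd, dt):
        self.kp, self.kd, self.dt = kp, kd, dt
        self.prev_error = 0.0

    def update(self, terminal_height, current_height):
        # Error between the desired (terminal) height and the measured height
        error = terminal_height - current_height
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative
```

In this sketch, calling `update` at each predetermined time interval with the newly identified media height yields a revised tilt command, which mirrors the claimed periodic modification of the wrist tilt command signal.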
Regarding claim 12, Tsuboi in view of Hwang in further view of You in further view of Kim teaches wherein the at least one processor is further configured to modify the wrist tilt command signal comprises modifying the wrist tilt command signal using a proportional-derivative (PD) controller (Tsuboi: Column 9 lines 5-17, “The determination part 144 then determines the tilt amount of the container 21 on the basis of the calculated difference (step S202). For example, the determination part 144 determines the tilt amount by proportional control (P control) in which the calculated difference is multiplied by a proportional gain, PI control in which the calculated difference is integrated, multiplied by an integral gain, and added to a proportional control term, PID control in which the differential of the calculated difference is multiplied by a differential gain and added to a PI control term, or the like. The tilt amount determined by the determination part 144 is output as a tilt amount instruction value θref to the tilt control part 145.”. One of ordinary skill in the art would have recognized that a PD controller is an obvious variation of a PID controller. One of ordinary skill in the art would have had the technological capabilities required to have implemented an obvious variation of a PID controller such as a PD controller. Additionally, the obvious variation of a PD controller can be easily achieved by simply omitting the integral term or by setting said integral term to zero.). Regarding claim 15, Tsuboi in view of Hwang in further view of You in further view of Kim teaches wherein the at least one processor is further configured to execute the instructions to obtain the image using a camera connected to the robotic device (Tsuboi: Column 5 lines 21-32, “Further, the measurement part 11 may include a sensor for giving visual information to the injection apparatus 10. 
For example, the measurement part 11 may include a vision sensor for visually capturing the container 21 or the container 22. Further, the measurement part 11 may include a camera for acquiring a thermographic image. Of course, the camera provided in the measurement part 11 may be a typical camera for capturing a visible ray. The vision sensor or the camera may be arranged at a different position from the main body of the injection apparatus 10. Also in this case, the sensors (including the camera) can be regarded as part of the injection apparatus 10.”). Regarding claim 16, Tsuboi in view of Hwang in further view of You in further view of Kim teaches wherein the image is an RGB image (Tsuboi: Column 5 lines 21-32, “Further, the measurement part 11 may include a sensor for giving visual information to the injection apparatus 10. For example, the measurement part 11 may include a vision sensor for visually capturing the container 21 or the container 22. Further, the measurement part 11 may include a camera for acquiring a thermographic image. Of course, the camera provided in the measurement part 11 may be a typical camera for capturing a visible ray. The vision sensor or the camera may be arranged at a different position from the main body of the injection apparatus 10. Also in this case, the sensors (including the camera) can be regarded as part of the injection apparatus 10.”. One of ordinary skill in the art would recognize that a camera configured to capture a visible ray would produce an RGB image.). Regarding claim 17, Tsuboi teaches a non-transitory computer readable storage medium that stores instructions to be executed by at least one processor to perform a method for controlling a robotic device for pouring a granular media, the method comprising (Tsuboi: Abstract, “An apparatus capable of smoothly injecting contents in an object into another object, an injection method, and an injection program. 
The apparatus includes a robotic arm device configured to grip a first container, and circuitry configured to recognize a flowrate of contents while injecting an amount of the contents from the first container into a second container, and control a tilt of the first container using the robotic arm device to inject the contents into the second container according to the recognized flowrate of the contents.”, Column 3 lines 54-62, “FIG. 1 is a diagram illustrating the injection apparatus 10 gripping a container 21. The injection apparatus 10 is a robot having two arms (arms 131 and 132). The arm 131 has an arm part 131a and a grip part 131b. The arm 132 includes an arm 132a and a grip part 132b. Each of the arm parts 131a and 132a has joint parts. Each of the grip parts 131b and 132b is capable of gripping a container. In the example of FIG. 1, the injection apparatus 10 grips the container 21 by the grip part 131b of the arm 131.”, Column 6 lines 3-14, “The control part 14 is a controller for controlling each part in the injection apparatus 10. The control part 14 is realized by a processor such as central processing unit (CPU) or micro processing unit (MPU), for example. For example, the control part 14 is realized when the processor executes various programs stored in the storage apparatus in the injection apparatus 10 by use of a random access memory (RAM) or the like as a work area. Additionally, the control part 14 may be realized by an integrated circuit such as application specific integrated circuit (ASIC) or field programmable gate array (FPGA). All of CPU, MPU, ASIC, and FPGA may be regarded as controller.”, Column 17 lines 10-23, “Further, the injection apparatus 10 may determine whether the contents of the container 21 are grain or liquid while viewing the container 22 by a sensor such as the vision sensor. 
Then, the injection apparatus 10 may change the way to inject the contents of the container 21 depending on whether the contents of the container 21 are grain or liquid. For example, the injection apparatus 10 may change the tilt amount of the container 21 depending on whether the contents of the container 21 are liquid or grain. Further, the injection apparatus 10 may change the target injection time or the acceleration/deceleration time depending on whether the contents of the container 21 are liquid or grain. The contents can be smoothly injected into the container 22 according to the kind of the contents.”, Column 18 lines 26-30, “Further, various embodiments can be combined as needed when the processing contents are not incompatible. Further, the order of the respective steps illustrated in the sequence diagrams or the flowcharts of various embodiments can be changed as needed.”. The cited passages teach that the robot can be configured to pour both liquid and granular media, and that the various embodiments described can be combined as needed.): obtaining an image of a receiving container (Tsuboi: Column 14 lines 24-34, “In a case where the container 21 or the container 22 includes a transparent material, the injection apparatus 10 can observe the contents in the container 21 or the container 22 by use of the vision sensor. In this case, the injection apparatus 10 measures the liquid level height of the liquid in the container 21 or the container 22 by the vision sensor. FIG. 17 is a diagram illustrating how the liquid level height is measured by the vision sensor. Also in this method, the injection apparatus 10 can recognize the liquid level height, thereby smoothly injecting the contents into the container 22.”. The cited passage clearly teaches receiving an image of the receiving container.); identifying, using the image, a current height of granular media in the receiving container (Tsuboi: Column 10 lines 59-67, “FIG. 
9 is a diagram illustrating an exemplary model of the container 22. The example of FIG. 9 illustrates a glass in a truncated cone shape with the height he, the radius Re of the opening, and the radius re of the bottom as a model of the container 22. A method for estimating the amount of liquid in the container 22 will be described below by way of the model of the container 22 illustrated in FIG. 9.”, Column 11 lines 6-10, “At first, the recognition part 142 measures the height le of the opening of the container 22 from the liquid level on the basis of the information from the sensors provided in the measurement part 11. For example, the recognition part 142 measures the height le by a depth image.”, Column 11 lines 11-18, “The recognition part 142 then estimates the volume Vw of the liquid in the container 22 on the basis of the height le. For example, the recognition part 142 calculates the depth de [m] of the liquid in the container 22 in Equation (6).”, Column 14 lines 24-34, “In a case where the container 21 or the container 22 includes a transparent material, the injection apparatus 10 can observe the contents in the container 21 or the container 22 by use of the vision sensor. In this case, the injection apparatus 10 measures the liquid level height of the liquid in the container 21 or the container 22 by the vision sensor. FIG. 17 is a diagram illustrating how the liquid level height is measured by the vision sensor. Also in this method, the injection apparatus 10 can recognize the liquid level height, thereby smoothly injecting the contents into the container 22.”. As can be seen from the cited passages, the system is configured to determine the amount of liquid in the receiving container based on an image of said receiving container. 
One of ordinary skill in the art would recognize that the depth of the liquid in the receiving container is the same as the height of the liquid in the receiving container.); identifying a terminal height of the granular media in the receiving container (Tsuboi: Column 11 lines 53-58, “The planning part 143 plans the liquid level height [m] by the volume flowrate target value Vref. The planning part 143 then plans the tilt amount [rad] by the planned liquid level height. FIG. 11 is a diagram illustrating specific examples of the liquid level height and the tilt amount in the examples of FIG. 10A to FIG. 10C.”); determining an input trajectory signal to the robotic device for pouring a non-granular media to the terminal height of the granular media based on the current height of the granular media (Tsuboi: Column 7 line 56 – Column 8 line 7, “Subsequently, the planning part 143 in the injection apparatus 10 determines the amount of liquid to be injected into the container 22 on the basis of the estimated value of the amount of liquid in the container 22 (step S104). The planning part 143 then determines a flowrate plan (step S105). FIG. 5 is a diagram illustrating an exemplary flowrate plan. The vertical axis indicates volume flowrate Vfr and the horizontal axis indicates time. In the example of FIG. 5, the planning part 143 determines a plan in which the flowrate increases until time t1, the flowrate remains constant (volume flowrate Vfr1) until time t2, the flowrate decreases, and then the injection ends at time t3. By way of example, the injection amount is 100 ml, the time t1 is 2 sec, the time t2 is 10 sec, time t3 is 12 sec, and the volume flowrate Vfr1 is 10 ml/sec. The values indicated in the flowrate plan are target values of the flowrate. Additionally, in the example of FIG. 
5, the flowrate plan is based on the volume of the liquid, but the flowrate plan may be based on the weight of the liquid.”, Column 8 lines 8-16, “Subsequently, the recognition part 142 in the injection apparatus 10 measures the center position of the container 22 (such as the center position of the glass) (step S106). The planning part 143 in the injection apparatus 10 then determines a tip trajectory plan of the injection port of the container 21 (such as a pot tip trajectory plan) (step S107). The drive control part 146 in the injection apparatus 10 then moves the tip position of the container 21 according to the tip trajectory plan (step S108).”, Column 8 lines 17-24, “The injection apparatus 10 then performs an injection operation on the container 21 (step S109). For example, the determination part 144 in the injection apparatus 10 determines the tilt amount of the container 21, and the tilt control part 145 in the injection apparatus 10 performs tilt control of the container 21 on the basis of the determined tilt amount. The drive control part 146 drives the arm 131 under control of the tilt control part 145.”, Column 11 lines 53-58, “The planning part 143 plans the liquid level height [m] by the volume flowrate target value Vref. The planning part 143 then plans the tilt amount [rad] by the planned liquid level height. FIG. 11 is a diagram illustrating specific examples of the liquid level height and the tilt amount in the examples of FIG. 10A to FIG. 10C.”. The cited passages clearly teach that a trajectory used to pour a non-granular media is determined based on the height of the media in the receiving container and the desired height of the media in the receiving container.); and controlling the robotic device to tilt a source container according to the wrist tilt command signal (Tsuboi: Column 8 lines 17-24, “The injection apparatus 10 then performs an injection operation on the container 21 (step S109). 
For example, the determination part 144 in the injection apparatus 10 determines the tilt amount of the container 21, and the tilt control part 145 in the injection apparatus 10 performs tilt control of the container 21 on the basis of the determined tilt amount. The drive control part 146 drives the arm 131 under control of the tilt control part 145.”). Tsuboi does not teach identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container; determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media; and controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal. Hwang, in the same field of endeavor, teaches controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal (Hwang: Column 9 lines 53-59, “The first dropping operation S121 includes an operation of dropping the coffee powder by rotating the coffee powder container 220 gripped by the robot arm 160 in a vertical direction. Here, the vertical rotation means that an opened inlet of the coffee powder container 220 is rotated toward the dripper 210, that is, toward the floor, and the coffee powder may be naturally dropped by gravity.”, Column 9 line 60 – Column 10 line 10, “The second dropping operation S122 may include an operation in which the robot arm 160 performs an upward and downward motion at least once while gripping the coffee powder container 220 in the rotated state. In the second dropping operation S122, the remaining coffee powder stuck in the coffee powder container 220 that is not dropped may also be provided to the dripper 210. 
In addition, in the second dropping operation S122, a downward motion of the upward and downward motion may include a stop motion in a state in which the robot arm 160 is accelerated. When the robot arm 160 gripping the coffee powder container 220 performs a downward motion, if the robot arm 160 is stopped while accelerating, that is, if the robot arm 160 suddenly stops during acceleration, the remaining coffee powders in the coffee powder container 220 may be separated from the container due to inertia. Accordingly, the remaining coffee powder may be more effectively provided to the dripper 210.”. The cited passages clearly teach controlling the wrist of the robot to tilt and vibrate a source container containing a granular media. One of ordinary skill in the art would recognize that the upward and downward motion applied to the source container would vibrate the granular media stored within.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the non-transitory computer readable storage medium that stores instructions to be executed by at least one processor to perform a method for controlling a robotic device for pouring a granular media taught in Tsuboi with controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal, as taught in Hwang, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because it allows any granular media that was stuck in the source container to be poured into the receiving container. This allows for the more effective delivery of the granular media (Hwang: Column 9 line 60 – Column 10 line 10, “The second dropping operation S122 may include an operation in which the robot arm 160 performs an upward and downward motion at least once while gripping the coffee powder container 220 in the rotated state. 
In the second dropping operation S122, the remaining coffee powder stuck in the coffee powder container 220 that is not dropped may also be provided to the dripper 210. In addition, in the second dropping operation S122, a downward motion of the upward and downward motion may include a stop motion in a state in which the robot arm 160 is accelerated. When the robot arm 160 gripping the coffee powder container 220 performs a downward motion, if the robot arm 160 is stopped while accelerating, that is, if the robot arm 160 suddenly stops during acceleration, the remaining coffee powders in the coffee powder container 220 may be separated from the container due to inertia. Accordingly, the remaining coffee powder may be more effectively provided to the dripper 210.”). Tsuboi in view of Hwang does not teach identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container; or determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. You, in the same field of endeavor, teaches identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container (You: ¶ 0059, “Step 2: Use a deep neural network to identify the robot, source container, and target container, and simultaneously obtain the relative position information between the source and target containers, the liquid type, and the liquid level information to complete the acquisition of status information.”, ¶ 0060, “After acquiring images in space based on the depth camera, the deep convolutional neural network is used to extract spatial information such as the center of mass, edges, and deflection angles of each target object in the image. 
After processing, the relative coordinate information (Δx, Δy) with the robot as the origin in the space and the relative tilt angle α of the source container and the target container are obtained. At the same time, the liquid type n and the detected liquid level height h_r are judged, the liquid type n is encoded as a one-hot vector, and the detected liquid level height is corrected according to the liquid type to obtain the final estimated liquid level h. Finally, the above information is combined to obtain the state vector s, that is, (Δx, Δy, h, α).”. The cited passages clearly teach that the current height of the media in the receiving container is determined using an image and a convolutional neural network.). Tsuboi in view of Hwang teaches a non-transitory computer readable storage medium that stores instructions to be executed by at least one processor to perform a method for controlling a robotic device for pouring a granular media, the method comprising: identifying, using the image, a current height of granular media in the receiving container. Tsuboi in view of Hwang does not teach identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container. You teaches identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container. A person of ordinary skill in the art would have had the technological capabilities required to have modified the non-transitory computer readable storage medium taught in Tsuboi in view of Hwang with identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container taught in You. Furthermore, the non-transitory computer readable storage medium taught in Tsuboi in view of Hwang is already configured to determine the height of the granular media in the receiving container using an image of said receiving container. 
Modifying the non-transitory computer readable storage medium taught in Tsuboi in view of Hwang to use a convolutional neural network to determine the height of the granular media as taught in You would only require the simple addition of a known algorithm. Additionally, a convolutional neural network would have been known to one of ordinary skill in the art and one of ordinary skill in the art would have had the technological capabilities to implement such an algorithm. Such a modification would not have changed or introduced new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a non-transitory computer readable storage medium that stores instructions to be executed by at least one processor to perform a method for controlling a robotic device for pouring a granular media, the method comprising: identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the non-transitory computer readable storage medium taught in Tsuboi in view of Hwang with identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container taught in You with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Tsuboi in view of Hwang in further view of You does not teach determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. 
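As a simplified stand-in for the convolutional neural network taught in You, the sketch below locates a fill level by convolving a 1-D vertical-gradient kernel down each column of a grayscale image and picking the row with the strongest response. This is illustrative only: a practiced implementation would use a trained CNN rather than this fixed kernel, and the image format (a list of rows, top row first, one intensity value per pixel) is assumed.

```python
def estimate_fill_height(image):
    """Estimate the media fill height, in rows, from a grayscale image
    (list of rows, top row first) by convolving the vertical-gradient
    kernel [-1, 0, 1] down each column and selecting the row with the
    strongest mean response as the media surface."""
    rows, cols = len(image), len(image[0])
    best_row, best_score = 0, -1.0
    for r in range(1, rows - 1):
        # 1-D vertical convolution: response = |row below - row above|
        score = sum(abs(image[r + 1][c] - image[r - 1][c]) for c in range(cols)) / cols
        if score > best_score:
            best_row, best_score = r, score
    # Height of media measured up from the container bottom
    return rows - 1 - best_row
```

The fixed gradient kernel here plays the role that learned convolutional filters play in You's network; the point is only that a convolution over the image suffices to recover the surface row, from which the current height follows.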
Kim, in the same field of endeavor, teaches determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media (Kim: Abstract, “The present invention provides a trajectory generation system for converting a target trajectory to a follow-up trajectory to which a robot actually follows up. The trajectory generation system comprises: a target trajectory generation module for generating the target trajectory of the robot corresponding to the inputted motion of a user; and a follow-up trajectory generation module for converting square waves per unit area to the follow-up trajectory by convoluting the square waves per unit area to the inputted trajectory. When the second motion of the user is inputted to the robot during the operation of the robot operated by following up a first follow-up trajectory, which is generated by converting a first target trajectory according to an inputted first motion of the user, the target trajectory generation module generates a second target trajectory corresponding to an inputted second motion and the follow-up trajectory generation module generates first and second convolution trajectories by convoluting the first and second target trajectories, respectively. Accordingly, the system can generate a new follow-up trajectory by connecting the end point of the first convolution trajectory and the start point of the second convolution trajectory at a time when the second motion is inputted. Also, the present invention provides a trajectory generation method using the trajectory system.”, ¶ 0026, “Referring to Fig. 2 (a), the target trajectory generation module 10 generates a target trajectory P0 (t) corresponding to a target point input of the user. 
The target trajectory P0 (t) is an ideal trajectory in which the robot moves the distance S to reach the target point without any time lag, but is a trajectory in which the actual robot cannot follow.”, ¶ 0027, “Therefore, the target trajectory P0 (t) is converted into the following trajectory that the robot can actually follow by the trajectory generating module 20.”, ¶ 0029, “2 (a), the convolution operation module 21 convolutes the convolution function h0 (t), which is a square wave function of the unit area, on the target trajectory P0 (t), and outputs the trajectory P1. The trajectory determination module 22 considers the differential functions of the trajectory P1 (t) to determine whether the trajectory P1 (t) is a trajectory suitable for the robot to follow.”, ¶ 0032, “As shown in Fig. 3, the convolution function used in the second convolution is a convolution function h1 (t) which is a square wave function of a unit area.”, ¶ 0069, “Referring to Fig. 5 (a), the target trajectory generation module 10 generates a target trajectory y0 (t) corresponding to a target point input of the user. Specifically, the target trajectory generation module 10 generates a target trajectory y0 (t) in the form of a square waveform function having a time t0 and a velocity V1 so that the robot can move the distance S and reach the target point do. V1 is selected so that it does not exceed the maximum value Vmax of the speed that the actuator of the robot can generate.”, ¶ 0075, “As shown in Fig. 6, the convolution function used for the second convolution is a convolution function h2 (t) which is a square wave function of a unit area.”. The cited passages clearly teach that the command signal of the robot (i.e., the target trajectory) is modified using different square waves. Additionally, the cited figures and passages clearly show that the control signal can be modified by square waves of different frequency and amplitude.). 
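Kim's unit-area square-wave convolution can be illustrated as follows: convolving a sampled trajectory with a unit-area box (square-wave) function is equivalent to a moving average, so a step-shaped target trajectory becomes a ramp the actuators can actually track. This is a minimal sketch; the sampling interval and box width below are illustrative and not drawn from Kim.

```python
def convolve_unit_square(trajectory, width, dt):
    """Convolve a sampled trajectory with a unit-area square-wave (box)
    function of the given width: because the box has unit area, the
    convolution reduces to a moving average over the samples under it."""
    n = max(1, int(round(width / dt)))  # samples under the box
    out = []
    for i in range(len(trajectory)):
        window = trajectory[max(0, i - n + 1): i + 1]
        out.append(sum(window) / n)  # implicit zero-padding before t = 0
    return out
```

For example, a unit step convolved with a two-sample box becomes a two-sample ramp, which is the smoothing effect Kim relies on to produce a follow-up trajectory within actuator limits.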
Tsuboi in view of Hwang in further view of You teaches a non-transitory computer readable storage medium that stores instructions to be executed by at least one processor to perform a method for controlling a robotic device for pouring a granular media. Tsuboi in view of Hwang in further view of You does not teach determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. Kim teaches determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. Additionally, Tsuboi teaches adapting the pouring operation to the type of the contents being poured (Tsuboi: Column 17 lines 24-37, “Further, the injection apparatus 10 may determine whether the contents of the container 21 are grain or powder while viewing the container 22 by a sensor such as the vision sensor. Then, the injection apparatus 10 may change the way to inject the contents of the container 21 depending on whether the contents of the container 21 are grain or powder. For example, the injection apparatus 10 may change the tilt amount of the container 21 depending on whether the contents of the container 21 are powder or grain. Further, the injection apparatus 10 may change the target injection time or the acceleration/deceleration time depending on whether the contents of the container 21 are powder or grain. The contents can be smoothly injected into the container 22 according to the kind of the contents.”, Column 17 lines 38-46, “Of course, the injection apparatus 10 may change the way to inject the contents of the container 21 depending on whether the contents of the container 21 are grain, powder, or liquid. The injection apparatus 10 can smoothly inject the contents into the container 22 according to the kind of the contents of the container 21. 
The target injection time or the acceleration/deceleration time may be changed depending on whether the contents of the container 21 are grain, powder, or liquid.”, Column 22 lines 21-24, “The apparatus according to any one of (25) to (29), wherein the circuitry is further configured to determine one or more characteristics related to the contents contained by the first container.”, Column 22 lines 26-30, “The apparatus according to any one of (25) to (30), wherein the one or more characteristics related to the contents includes at least one of type, weight, height, or viscosity of the contents.”, Column 22 lines 37-42, “The apparatus according to any one of (25) to (32), wherein the circuitry determines a flowrate plan for injecting the amount of the contents from the first container into the second container according to the one or more determined characteristics related to the first container.”). A person of ordinary skill in the art would have had the technological capabilities required to have modified the non-transitory computer readable storage medium taught in Tsuboi in view of Hwang in further view of You with determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media taught in Kim. Furthermore, the non-transitory computer readable storage medium taught in Tsuboi in view of Hwang in further view of You is already configured to determine the type of media being poured and modify the pouring process based on the type and properties of the media. Additionally, Kim teaches using multiple square waves to modify the trajectory of the robot, wherein the square waves have different frequencies and amplitudes. Therefore, one of ordinary skill in the art would have been able to modify the non-transitory computer readable storage medium such that a different square wave is used for each type of media. 
A person of ordinary skill in the art would also have been able to modify the non-transitory computer readable storage medium taught in Tsuboi in view of Hwang in further view of You such that the trajectory is modified using a square wave as taught in Kim, as such a modification would only require the simple addition of the mathematical equations taught in Kim. Such modifications would not have changed or introduced new functionality. No inventive effort would have been required. The combination would have yielded the predictable result of a non-transitory computer readable storage medium that stores instructions to be executed by at least one processor to perform a method for controlling a robotic device for pouring a granular media, the method comprising: determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the non-transitory computer readable storage medium taught in Tsuboi in view of Hwang in further view of You with determining a wrist tilt command signal by modulating the input trajectory signal by a square wave, the square wave comprising a frequency and an amplitude corresponding to a type of the granular media taught in Kim with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. 
Regarding claim 18, Tsuboi in view of Hwang in further view of You in further view of Kim teaches further comprising: while controlling the robotic device to tilt and vibrate the source container, identifying the current height of the granular media in the receiving container at predetermined time intervals (You: ¶ 0091, “Step 4: After the policy network converges, the robot's posture information and state information are input into the policy network, and the robot's action strategy is output, specifically:” ¶ 0092, “After the policy network converges, the relative coordinate information (Δx, Δy) of the source and target containers, the relative tilt angle α, and the corrected liquid level h of the target container are input into the policy network. The policy network outputs the robot's motion control vector a and transmits it to the robot to make the robot perform the corresponding action.”, ¶ 0093, “Step 5: Use the robot motion strategy predicted in step 4 to drive the robot to complete the water pouring action. The robot is set with a minimum action duration threshold ε. The robot continues to pour water according to the received action control vector a. The duration t is not less than the minimum action duration threshold ε. After completing the current action, it waits for a new action control vector until the water pouring task is completed.”. The cited passages clearly teach that after a time t spent pouring, the method is configured to repeat the process in order to determine a new control vector (which includes a tilt angle) until such a time that the pouring task is complete. One of ordinary skill in the art would recognize that in order for a control vector to be determined, the previous steps of the method would be performed and, therefore, an image of the receiving container would be captured and the height of the media in said container would be determined again.). 
Regarding claim 19, Tsuboi in view of Hwang in further view of You in further view of Kim teaches further comprising modifying the wrist tilt command signal based on the identified current height of the granular media in the receiving container at each predetermined time interval (You: ¶ 0091, “Step 4: After the policy network converges, the robot's posture information and state information are input into the policy network, and the robot's action strategy is output, specifically:” ¶ 0092, “After the policy network converges, the relative coordinate information (Δx, Δy) of the source and target containers, the relative tilt angle α, and the corrected liquid level h of the target container are input into the policy network. The policy network outputs the robot's motion control vector a and transmits it to the robot to make the robot perform the corresponding action.”, ¶ 0093, “Step 5: Use the robot motion strategy predicted in step 4 to drive the robot to complete the water pouring action. The robot is set with a minimum action duration threshold ε. The robot continues to pour water according to the received action control vector a. The duration t is not less than the minimum action duration threshold ε. After completing the current action, it waits for a new action control vector until the water pouring task is completed.”. One of ordinary skill in the art would see that because a new control vector and height of the media in the receiving container are determined after each time t, the tilt command would be updated based on the current height of the media in the receiving container.). 
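The closed-loop behavior attributed to the combination for claims 18-20 (re-measuring the fill height at fixed intervals and updating the wrist tilt command, here via a PD controller, i.e., a PID controller of the kind Tsuboi describes with the integral gain set to zero) can be sketched as follows. The gains, time step, and the toy fill model are assumptions for illustration only and do not come from the cited references.

```python
def pd_tilt_controller(kp, kd, dt):
    """Return a PD update function: PID with the integral term omitted."""
    prev_error = 0.0

    def step(target_height, current_height):
        nonlocal prev_error
        error = target_height - current_height      # remaining fill height
        derivative = (error - prev_error) / dt      # rate of change of error
        prev_error = error
        # Proportional + derivative terms only; integral gain is zero,
        # i.e., the PD variation of a PID controller.
        return kp * error + kd * derivative

    return step

# Toy closed-loop simulation: each iteration stands in for one
# predetermined time interval at which the height is re-measured and
# the tilt command is modified.
controller = pd_tilt_controller(kp=0.8, kd=0.05, dt=0.1)
height, target = 0.0, 10.0
for _ in range(200):
    tilt = controller(target, height)
    # Assumed fill model: media flows in proportionally to the tilt.
    height += max(tilt, 0.0) * 0.1
```

The loop converges to the target height because each re-measured height shrinks the error term driving the tilt command, mirroring the repeat-until-complete behavior quoted from You.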
Regarding claim 20, Tsuboi in view of Hwang in further view of You in further view of Kim teaches wherein the modifying the wrist tilt command signal comprises modifying the wrist tilt command signal using a proportional-derivative (PD) controller (Tsuboi: Column 9 lines 5-17, “The determination part 144 then determines the tilt amount of the container 21 on the basis of the calculated difference (step S202). For example, the determination part 144 determines the tilt amount by proportional control (P control) in which the calculated different is multiplied by a proportional gain, PI control in which the calculated difference is integrated, multiplied by an integral gain, and added to a proportional control term, PID control in which the differential of the calculated difference is multiplied by a differential gain and added to a PI control term, or the like. The tilt amount determined by the determination part 144 is output as a tilt amount instruction value θref to the tilt control part 145.”. One of ordinary skill in the art would have recognized that a PD controller is an obvious variation of a PID controller. One of ordinary skill in the art would have had the technological capabilities required to have implemented an obvious variation of a PID controller such as a PD controller. Additionally, the obvious variation of a PD controller can be easily achieved by simply not considering an integral term or by setting said integral term to zero.).

Claim(s) 5-6 and 13-14 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 11897138 B2 ("Tsuboi") in view of US 12090627 B2 ("Hwang") in further view of CN 113031437 A ("You") in further view of KR 101390819 B1 ("Kim") in further view of JP 2021181878 A ("Wada"). Regarding claim 5, Tsuboi in view of Hwang in further view of You in further view of Kim teaches further comprising receiving a user input to the robotic device (Tsuboi: Column 7 lines 24-32, “FIG. 
4 is a flowchart illustrating an injection processing according to an embodiment of the present disclosure. The injection processing is a processing in which the injection apparatus 10 injects a liquid in the container 21 into the container 22. The injection apparatus 10 starts the injection processing when receiving a user instruction.”). Tsuboi in view of Hwang in further view of You in further view of Kim does not teach further comprising receiving a user input to the robotic device for identifying the type of the granular media. Wada, in the same field of endeavor, teaches further comprising receiving a user input to the robotic device for identifying the type of the granular media (Wada: ¶ 0054, “The control voice information is an instruction issued by voice including a cooking operation that the cookware is to perform. For example, "turn off the stove", "make it low", "make it high", "make it medium", "light the stove", "cut", "wash", "chop", and "chill". Instructions such as "to", "to room temperature", and "mix" are some specific examples of control voice information. As will be described in detail later, the question voice information acquisition unit S may be configured to acquire "question response voice information" in addition to the control voice information (question response voice information acquisition means S). It may be configured to acquire "time-series cooking utensil control information" (time-series cooking utensil control concession acquisition means S), or may be configured to acquire "cooking target-dependent cooking utensil control information". (Cooking target-dependent cooking utensil control information acquisition means S).”, ¶ 0140, “Further, as shown in FIG. 27, cooking equipment control. 
The cooking utensil control information acquisition unit S of the server device (2700) acquires cooking utensil-dependent cooking utensil control information which is cooking utensil control information for controlling the cooking utensil based on the information indicating the cooking target included in the control voice information. It may be configured as the voice-controlled cooking utensil platform according to any one of the inventions of the basic configuration having the cooking object-dependent cooking utensil control information acquisition means S (2701), and the other configurations 2 to 6. .. The "cooking target-dependent cooking utensil control information means S" acquires cooking object-dependent cooking utensil control information, which is cooking utensil control information that controls the cooking utensil based on the information indicating the cooking target included in the control voice information. The "cooking target" is an ingredient, a seasoning, oil, water, a combination thereof, a dish, or the like. Dependence on the cooking target means that the control voice information based on the voice of the user of the cooking target and the cooking method to be performed for the cooking target is used, and the cooking utensil control information suitable for the cooking target is automatically acquired. Then, the cooking utensil is controlled according to the acquired cooking utensil control information suitable for the cooking object. Control of this cookware may also include time series control. Therefore, the cooking object-dependent cooking utensil control information acquisition means may be configured to be able to use an appropriate cooking utensil control information database associated with each cooking object. It is preferable that this cookware control information database is prepared for each cookware for each cookware manufacturer. 
In addition, it can be configured so that the user can select which cooking method and cooking procedure to select for each cooking target. That is, a plurality of cookware control information may be selectively associated with one cooking target.”, ¶ 0141, “For example, if you give a control voice to "boil broccoli", it automatically selects 3 minutes, which is the generally appropriate boiling time for broccoli, and medium heat, which is generally appropriate. And control the cooking utensils. For example, if you give a control voice to "boil somen noodles", the general boiling time of somen noodles is 1 minute and 30 seconds, and the medium heat, which is generally considered appropriate, is automatically selected and cooked. Control the equipment.”. The cited passages clearly teach that the system is configured to take a user input regarding the “cooking target”. Paragraph 0101 lists potential ingredients that are “cooking targets”, many of which are considered granular media.). Tsuboi in view of Hwang in further view of You in further view of Kim teaches a method for controlling a robotic device further comprising receiving a user input to the robotic device. Tsuboi in view of Hwang in further view of You in further view of Kim does not teach further comprising receiving a user input to the robotic device for identifying the type of the granular media. Wada teaches further comprising receiving a user input to the robotic device for identifying the type of the granular media. A person of ordinary skill in the art would have had the technological capabilities required to have modified the method taught in Tsuboi in view of Hwang in further view of You in further view of Kim with further comprising receiving a user input to the robotic device for identifying the type of the granular media taught in Wada. Furthermore, the method taught in Tsuboi in view of Hwang in further view of You in further view of Kim already teaches receiving a user input. 
Additionally, the method taught in Tsuboi in view of Hwang in further view of You in further view of Kim is also configured to determine the type of granular media. As such, modifying the method to receive a user input specifying the granular media would require the simple addition of the method taught in Wada. Such a modification would not have changed or introduced new functionality to either. No inventive effort would have been required. The combination would have yielded the predictable result of a method for controlling a robotic device further comprising receiving a user input to the robotic device for identifying the type of the granular media. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the method taught in Tsuboi in view of Hwang in further view of You in further view of Kim with receiving a user input to the robotic device for identifying the type of the granular media taught in Wada with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Regarding claim 6, Tsuboi in view of Hwang in further view of You in further view of Kim in further view of Wada teaches wherein the user input is a voice command (Wada: ¶ 0054, “The control voice information is an instruction issued by voice including a cooking operation that the cookware is to perform. For example, "turn off the stove", "make it low", "make it high", "make it medium", "light the stove", "cut", "wash", "chop", and "chill". Instructions such as "to", "to room temperature", and "mix" are some specific examples of control voice information. 
As will be described in detail later, the question voice information acquisition unit S may be configured to acquire "question response voice information" in addition to the control voice information (question response voice information acquisition means S). It may be configured to acquire "time-series cooking utensil control information" (time-series cooking utensil control concession acquisition means S), or may be configured to acquire "cooking target-dependent cooking utensil control information". (Cooking target-dependent cooking utensil control information acquisition means S).”). Regarding claim 13, Tsuboi in view of Hwang in further view of You in further view of Kim teaches wherein the at least one processor is further configured to execute the instructions to receive a user input to the robotic device (Tsuboi: Column 7 lines 24-32, “FIG. 4 is a flowchart illustrating an injection processing according to an embodiment of the present disclosure. The injection processing is a processing in which the injection apparatus 10 injects a liquid in the container 21 into the container 22. The injection apparatus 10 starts the injection processing when receiving a user instruction.”). Tsuboi in view of Hwang in further view of You in further view of Kim does not teach wherein the at least one processor is further configured to execute the instructions to receive a user input to the robotic device for identifying the type of the granular media. Wada, in the same field of endeavor, teaches wherein the at least one processor is further configured to execute the instructions to receive a user input to the robotic device for identifying the type of the granular media (Wada: ¶ 0054, “The control voice information is an instruction issued by voice including a cooking operation that the cookware is to perform. For example, "turn off the stove", "make it low", "make it high", "make it medium", "light the stove", "cut", "wash", "chop", and "chill". 
Instructions such as "to", "to room temperature", and "mix" are some specific examples of control voice information. As will be described in detail later, the question voice information acquisition unit S may be configured to acquire "question response voice information" in addition to the control voice information (question response voice information acquisition means S). It may be configured to acquire "time-series cooking utensil control information" (time-series cooking utensil control concession acquisition means S), or may be configured to acquire "cooking target-dependent cooking utensil control information". (Cooking target-dependent cooking utensil control information acquisition means S).”, ¶ 0140, “Further, as shown in FIG. 27, cooking equipment control. The cooking utensil control information acquisition unit S of the server device (2700) acquires cooking utensil-dependent cooking utensil control information which is cooking utensil control information for controlling the cooking utensil based on the information indicating the cooking target included in the control voice information. It may be configured as the voice-controlled cooking utensil platform according to any one of the inventions of the basic configuration having the cooking object-dependent cooking utensil control information acquisition means S (2701), and the other configurations 2 to 6. .. The "cooking target-dependent cooking utensil control information means S" acquires cooking object-dependent cooking utensil control information, which is cooking utensil control information that controls the cooking utensil based on the information indicating the cooking target included in the control voice information. The "cooking target" is an ingredient, a seasoning, oil, water, a combination thereof, a dish, or the like. 
Dependence on the cooking target means that the control voice information based on the voice of the user of the cooking target and the cooking method to be performed for the cooking target is used, and the cooking utensil control information suitable for the cooking target is automatically acquired. Then, the cooking utensil is controlled according to the acquired cooking utensil control information suitable for the cooking object. Control of this cookware may also include time series control. Therefore, the cooking object-dependent cooking utensil control information acquisition means may be configured to be able to use an appropriate cooking utensil control information database associated with each cooking object. It is preferable that this cookware control information database is prepared for each cookware for each cookware manufacturer. In addition, it can be configured so that the user can select which cooking method and cooking procedure to select for each cooking target. That is, a plurality of cookware control information may be selectively associated with one cooking target.”, ¶ 0141, “For example, if you give a control voice to "boil broccoli", it automatically selects 3 minutes, which is the generally appropriate boiling time for broccoli, and medium heat, which is generally appropriate. And control the cooking utensils. For example, if you give a control voice to "boil somen noodles", the general boiling time of somen noodles is 1 minute and 30 seconds, and the medium heat, which is generally considered appropriate, is automatically selected and cooked. Control the equipment.”. The cited passages clearly teach that the system is configured to take a user input regarding the “cooking target”. Paragraph 0101 lists potential ingredients that are “cooking targets”, many of which are considered granular media.). 
Tsuboi in view of Hwang in further view of You in further view of Kim teaches a robotic device wherein the at least one processor is further configured to execute the instructions to receive a user input to the robotic device. Tsuboi in view of Hwang in further view of You in further view of Kim does not teach wherein the at least one processor is further configured to execute the instructions to receive a user input to the robotic device for identifying the type of the granular media. Wada teaches wherein the at least one processor is further configured to execute the instructions to receive a user input to the robotic device for identifying the type of the granular media. A person of ordinary skill in the art would have had the technological capabilities required to have modified the device taught in Tsuboi in view of Hwang in further view of You in further view of Kim with wherein the at least one processor is further configured to execute the instructions to receive a user input to the robotic device for identifying the type of the granular media taught in Wada. Furthermore, the device taught in Tsuboi in view of Hwang in further view of You in further view of Kim already teaches receiving a user input. Additionally, the device taught in Tsuboi in view of Hwang in further view of You in further view of Kim is also configured to determine the type of granular media. As such, modifying the device to receive a user input specifying the granular media would require the simple addition of the method taught in Wada. Such a modification would not have changed or introduced new functionality to either. No inventive effort would have been required. The combination would have yielded the predictable result of a robotic device wherein the at least one processor is further configured to execute the instructions to receive a user input to the robotic device for identifying the type of the granular media. 
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the device taught in Tsuboi in view of Hwang in further view of You in further view of Kim with wherein the at least one processor is further configured to execute the instructions to receive a user input to the robotic device for identifying the type of the granular media taught in Wada with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Regarding claim 14, Tsuboi in view of Hwang in further view of You in further view of Kim in further view of Wada teaches wherein the user input is a voice command (Wada: ¶ 0054, “The control voice information is an instruction issued by voice including a cooking operation that the cookware is to perform. For example, "turn off the stove", "make it low", "make it high", "make it medium", "light the stove", "cut", "wash", "chop", and "chill". Instructions such as "to", "to room temperature", and "mix" are some specific examples of control voice information. As will be described in detail later, the question voice information acquisition unit S may be configured to acquire "question response voice information" in addition to the control voice information (question response voice information acquisition means S). It may be configured to acquire "time-series cooking utensil control information" (time-series cooking utensil control concession acquisition means S), or may be configured to acquire "cooking target-dependent cooking utensil control information". (Cooking target-dependent cooking utensil control information acquisition means S).”).

Response to Arguments

Applicant's arguments filed January 27th, 2026, have been fully considered but they are not persuasive. 
On Pages 8-11, Applicant argues that the prior art on record fails to teach the limitations of the amended independent claims. Specifically, Applicant argues that the prior art fails to teach the amended limitation “determining a wrist tilt command signal by modulating the input trajectory signal by a square wave that comprising a frequency and an amplitude corresponding to a type of the granular media; and controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal”. The Examiner respectfully disagrees. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). As stated in the previous Non-Final Office action mailed October 27th, 2025, and above in the 35 U.S.C. § 103 rejection section, the primary reference Tsuboi teaches a method for controlling a robotic device for pouring a granular media, the method comprising (Tsuboi: Abstract, Column 3 lines 54-62, Column 17 lines 10-23, Column 18 lines 26-30): obtaining an image of a receiving container (Tsuboi: Column 14 lines 24-34); identifying, using the image, a current height of granular media in the receiving container (Tsuboi: Column 10 lines 59-67, Column 11 lines 6-10, Column 11 lines 11-18, Column 14 lines 24-34); identifying a terminal height of the granular media in the receiving container (Tsuboi: Column 11 lines 53-58); determining an input trajectory signal to the robotic device for pouring a non-granular media to the terminal height of the granular media based on the current height of the granular media (Tsuboi: Column 7 line 56 – Column 8 line 7, Column 8 lines 8-16, Column 8 lines 17-24, Column 11 lines 53-58); and controlling the robotic device to tilt a source container according to the wrist tilt 
command signal (Tsuboi: Column 8 lines 17-24). The secondary reference Hwang teaches controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal (Hwang: Column 9 lines 53-59, Column 9 line 60 – Column 10 line 10). The secondary reference You teaches identifying, using the image and a convolutional neural network, a current height of granular media in the receiving container (You: ¶ 0059, ¶ 0060). The secondary reference Kim teaches determining a wrist tilt command signal by modulating the input trajectory signal by a square wave that comprising a frequency and an amplitude corresponding to a type of the granular media (Kim: Abstract, ¶ 0026, ¶ 0027, ¶ 0032, ¶ 0069, ¶ 0075). The primary reference Tsuboi teaches a method of pouring a media, wherein the media can be granular in nature (Tsuboi: Column 17 lines 24-37, Column 17 lines 38-46), by controlling the wrist to tilt the container according to the wrist tilt command. The secondary reference Hwang teaches a robot that is configured to pour a granular media and causes the robot to perform a shaking motion while gripping the container storing the granular media in order to facilitate pouring. One of ordinary skill in the art would recognize that this shaking motion would induce a vibration into the granular media stored in the container, causing said granular media to move. Therefore, the combination of Tsuboi in view of Hwang in further view of You in further view of Kim teaches the limitation “controlling the robotic device to tilt and vibrate a source container according to the wrist tilt command signal”. 
The primary reference Tsuboi teaches a method of pouring a media, wherein the media can be granular in nature (Tsuboi: Column 17 lines 24-37, Column 17 lines 38-46) and modifying the tilt command of the wrist of the robot based on the properties of the media being poured (Tsuboi: Column 17 lines 24-37, Column 17 lines 38-46, Column 22 lines 21-24, Column 22 lines 26-30, Column 22 lines 37-42). The secondary reference Kim teaches modifying robot command signals according to various different square waves. One of ordinary skill in the art would have recognized that the square waves used to modulate the robot control signal taught in Kim have different amplitudes and frequencies. The primary reference already teaches modifying the wrist control signal used to pour the media based on the type and characteristics of said media and, as such, one of ordinary skill in the art would have been able to modify the system such that the wrist control signal is modulated based on a square wave having a different frequency and amplitude based on the media as taught in Kim. A person of ordinary skill in the art would have been familiar with and had the knowledge to implement a square wave as a control input to a robot before the effective filing date of the claimed invention. Additionally, Kim teaches the use of multiple square waves having different amplitudes and frequencies. Therefore, the combination of Tsuboi in view of Hwang in further view of You in further view of Kim teaches the limitation “determining a wrist tilt command signal by modulating the input trajectory signal by a square wave that comprising a frequency and an amplitude corresponding to a type of the granular media”. In conclusion, for the reasons stated herein and above in the 35 U.S.C. § 103 section, the 35 U.S.C. § 103 rejection of the independent claims 1, 9, and 17 is maintained.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). 
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Noah W Stiebritz whose telephone number is (571)272-3414. The examiner can normally be reached Monday through Friday 7-5 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado, can be reached at (571) 270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. 
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /N.W.S./ Examiner, Art Unit 3658 /Ramon A. Mercado/Supervisory Patent Examiner, Art Unit 3658

Prosecution Timeline

Mar 29, 2023
Application Filed
Oct 20, 2025
Non-Final Rejection — §103
Jan 27, 2026
Response Filed
Mar 09, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602063
LOAD HANDLING SYSTEM AND LOAD HANDLING METHOD
2y 5m to grant Granted Apr 14, 2026
Patent 12575900
Steerable Eversion Robot System and Method of Operating the Steerable Eversion Robot System
2y 5m to grant Granted Mar 17, 2026
Patent 12552043
METHOD FOR CONTROLLING ROBOTIC ARM, ELECTRONIC DEVICE, AND COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant Granted Feb 17, 2026
Patent 12472640
CONTROL METHOD AND SYSTEM FOR ARTICLE TRANSPORTATION BASED ON MOBILE ROBOT
2y 5m to grant Granted Nov 18, 2025
Patent 12467759
VEHICLE WITH SWITCHABLE FORWARD AND BACKWARD CONFIGURATIONS, CONTROL METHOD, AND CONTROL PROGRAM
2y 5m to grant Granted Nov 11, 2025
Based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
67%
Grant Probability
51%
With Interview (-15.6%)
2y 6m
Median Time to Grant
Moderate
PTA Risk
Based on 18 resolved cases by this examiner. Grant probability derived from career allow rate.
