DETAILED ACTION
This action is responsive to the application filed on 03/04/2024, in which claims 1-26 are pending.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed. One suggestion would be to change the current title to “Generating Animatronic Animation.”
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-26 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of copending Application No. 19/200,927 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because substantially all the limitations of the instant application are anticipated by the reference application. The additional limitations that have not been explicitly anticipated by the copending application have been addressed by the prior art cited in the prior art rejection sections of this Office action.
Claims 1-26 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-6 of copending Application No. 17/224,012 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because substantially all the limitations of the instant application are anticipated by the reference application. The additional limitations that have not been explicitly anticipated by the copending application have been addressed by the prior art cited in the prior art rejection sections of this Office action.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-2, 5-13, 15-18, and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bacher et al. (US 20220226987 A1).
With respect to claim 1:
Bacher teaches:
A method for generating animatronic animation (see abstract), the method comprising: generating an animation for an animatronic based on one or more creative requirements (See para. “[0029] The IK solver 230 is configured (e.g., with the new IK formulation described herein) to perform one or more algorithms to use the motion inputs 242 to provide retargeted motions 248 that are used to generate the control signals 221. The control over mechanical joints 216 is based on constraints 244 (or is a constraint-based formulation for retargeting) while the control over IK targets in the motion inputs 242 is based on objectives 246 (or is an objective-based formulation for retargeting). The retargeted motions 248 may include end effector and IK target trajectories. Further, the retargeted motions 248 may include a pathway or trajectory for the CoM of the robot 210 as may be determined by the CoM module 234 to keep the CoM within the support spanned by the contact of the robot 210 with the ground. Additionally, the IK solver 230 may call or use the kinematic singularities module 238 so as to ensure that the retargeted motions 248 are generated so as to safeguard against kinematic singularities.”), the animation imitating one or more input motions and limited by the one or more creative requirements (See para. “[0032] To this end, the inventors recognized that a controller is desired that uses an inverse kinematics formulation that: (1) transfers motion onto complex assemblies that may contain loops; (2) ensures that all the robotic constraints are fulfilled even when the transferred motion cannot be followed exactly by the robotic figure; and (3) provides the closest motion that the robot supports while keeping the motion smooth. Further, in the case of these and also autonomous robots, it is desirable to provide the artists/designers providing input animations or motions with real-time feedback on whether a particular motion is physically feasible (e.g., is balance maintained?), and the inventors recognized it would be desirable for the new IK-based controller to provide direct control over the center of mass (CoM) of the robot.”); retargeting the animation onto the animatronic based on mechanical constraints of the animatronic, such that the animation is physically feasible on the animatronic (See para. [0032], as quoted above.); and outputting the animation to the animatronic (See para. “[0007] The controller includes an inverse kinematics (IK) module that implements a versatile IK formulation for the retargeting of motions, including expressive motions, onto mechanical systems (i.e., robots with loops and/or without loops). Further, the controller is configured (e.g., via design of the IK module or routines called by this module) to support the precise control of the position and orientation of end effectors and the center of mass (CoM) (such as of walking robots). The formulation of the algorithms carried out by the IK module safeguards against a disassembly when IK targets are moved outside the workspace of the robot. Further, even if every IK target is within the workspace, a designer or operator could ask for a motion the robot cannot perform, and the new IK module addresses such situations. For example, two IK targets could be attached to the robot at the ends of the two arms. If they are too far apart, the robot cannot perform the motion even though the individual IK targets could be matched perfectly with a sideways motion. A regularizer (which may be called by the IK module or be a subroutine of this module) is included in the controller that smoothly circumvents kinematic singularities where velocities go to infinity. The inventors have performed validation examples on physical robots to demonstrate the versatility and efficacy of the IK module and its algorithm designs in controlling (via its output or control signals) overactuated systems (including those with loops) and in retargeting (as part of generating the control outputs) an expressive motion onto a bipedal robot.”)
With respect to claim 2:
Bacher teaches the method of claim 1, and further teaches:
further comprising receiving animation source content for the animation, wherein the animation source content defines the one or more input motions (See para. “[0030] At this point in the description, it may be helpful to describe an example problem being addressed by the inventors and how their new IK formulation for robot controllers addresses this particular problem. When one is designing a new audio-animatronic figure, a first Maya rig may be created to explore how many functions should be provided to meet the artistic intent of the figure or robot designer. Thereafter, a first mechanical representation (e.g., a digital model in Solidworks or the like) is created and refined. To animate this mechanical representation, a second Maya rig may be manually created that implements the robotic constraints, and this second rig is refined and revised along with the mechanical representation through the design phase of the figure or robot. Instead of retargeting motion from an animated character onto a figure, the retargeting of motion data, which may be captured from an actress or actor performing the target animation, can also be used as input motions as part of the animation task of figures or robots.”)
With respect to claim 5:
Bacher teaches the method of claim 1, and further teaches:
wherein the one or more creative requirements comprise a music timing requirement, a facial requirement, or an artistic requirement (See para. “[0034] The IK formulation has been proven by the inventors to be able to transfer rich motions onto complex assemblies with loops with only a very few tracked points and/or frames. In one demonstration, the inventors were able to show use of the IK formulation on an autonomous robot to control the center of mass position over time, which enabled the rapid creation of animations that were physically feasible and that could be blended in when the robot was moving dynamically. The IK formulation implements the kinematic constraints of the robot as hard constraints, which means they are satisfied and ensures the robot is kept in a feasible state at any moment in time. The IK targets are implemented as soft constraints or objectives, which means they are satisfied as closely as possible given the robot kinematics. The operator or artist providing the input motions can be provided feedback during the animation tasks, and, if the IK targets cannot be fulfilled exactly, the artist can be provided control over which targets to give higher priority (e.g., by weighting of one or more of input motions or trajectories).”)
With respect to claim 6:
Bacher teaches the method of claim 1, and further teaches:
wherein the mechanical constraints comprise physical limitations of the animatronic itself (See para. “[0009] In some implementations, the soft constraints for the IK target are met during the solving of the inverse kinematics by minimizing distances between a target trajectory and a retargeted trajectory for the IK target while ensuring the retargeted input motions are physically feasible for the kinematic structure. In the same or other cases, the kinematic structure is configured to include at least one kinematic loop.” See also Fig. 2, which shows the constraints on the mechanical joints.)
With respect to claim 7:
Bacher teaches a system configured to perform the method of claim 1, wherein the system comprises the animatronic (See Fig. 2; see also the rejection of claim 1 above.)
With respect to claim 8:
Bacher teaches the system of claim 7, and further teaches wherein the method is performed on the animatronic in real-time (See para, “[0028] The system 200 includes a motion design system 260 that may be any useful computer or computer system operable by an operator (e.g., an artist or director) to create and provide input motions or animations 261 to the robot 210. The robot 210 is shown to also include a motion planner 250 that processes or parses these input motions/animations 261 (such as the input motion 110 shown in FIG. 1) to obtain a set of input trajectories 251 that are output to the robot controller 220 for storage as motion inputs 242. In some cases, it is desirable for the robotic system 200 to provide real-time planning of motions for the robot 210, and, to this end, the system 200 may include a sensing system 270 with one or more sensors (e.g., video cameras and corresponding processing hardware and software) providing sensor data 271. This data 271 may provide feedback on the environment surrounding the robot 210 in real-time, and the motion planner 250 may be adapted to generate responsive motions suited to this sensed environment, with input trajectories 251 being output based on these determined or responsive motions.”)
With respect to claim 9:
Bacher teaches:
A method for animatronic animation generation (see abstract), the method comprising: receiving animation source content (See para. [0030], “… Instead of retargeting motion from an animated character onto a figure, the retargeting of motion data, which may be captured from an actress or actor performing the target animation, can also be used as input motions as part of the animation task of figures or robots.”); generating an animation for an animatronic based on the animation source content and based on at least one first constraint or requirement (See para. [0029], as quoted in the rejection of claim 1 above.); retargeting the animation onto the animatronic based on at least one second constraint or requirement (See para. [0032], as quoted in the rejection of claim 1 above.); and outputting the animation to the animatronic (See para. [0007], as quoted in the rejection of claim 1 above.)
With respect to claim 10:
Bacher teaches the method of claim 9, and further teaches wherein the receiving comprises receiving a motion capture dataset of one or more motions performed by an actor (See para. [0030], “… Instead of retargeting motion from an animated character onto a figure, the retargeting of motion data, which may be captured from an actress or actor performing the target animation, can also be used as input motions as part of the animation task of figures or robots.”)
With respect to claim 11:
Bacher teaches the method of claim 9, and further teaches wherein the receiving comprises receiving existing animation performed by the animatronic or another animatronic (para 0030, “… A user can specify motion targets, e.g., in the form of points or frames, that are tracked over time or trajectories for the center of mass or a point in space the robot is supposed to track. The trajectories of points or frames can be extracted, in some cases by a motion planner, from a target animation that does not implement the robotic constraints or from motion capture data. The IK-based controller is configured to track corresponding points and frames on the robot and to minimize differences to target motions while guaranteeing that all constraints are satisfied at all times.”)
With respect to claim 12:
Bacher teaches the method of claim 9, and further teaches wherein: the animation is generated to imitate or match the animation source content irrespective of kinematic constraints of the animatronic (See para. [0028], as quoted in the rejection of claim 8 above.); and the retargeting comprises determining motions of the animatronic based on the kinematic constraints (See para. [0029], as quoted in the rejection of claim 1 above.)
With respect to claim 13:
Bacher teaches the method of claim 9, and further teaches wherein the animation is a first animation (para. [0028] teaches providing input animations (plural), and thus the animations can be first, second, or third; see also para. “[0030] At this point in the description, it may be helpful to describe an example problem being addressed by the inventors and how their new IK formulation for robot controllers addresses this particular problem. When one is designing a new audio-animatronic figure, a first Maya rig may be created to explore how many functions should be provided to meet the artistic intent of the figure or robot designer. Thereafter, a first mechanical representation (e.g., a digital model in Solidworks or the like) is created and refined….”), further comprising: generating a second animation for the animatronic based on the animation source content and based on the at least one first constraint or requirement (See para. [0030], “…To animate this mechanical representation, a second Maya rig may be manually created that implements the robotic constraints, and this second rig is refined and revised along with the mechanical representation through the design phase of the figure or robot.…”), wherein the second animation is different from the first animation (see para. [0030]); and retargeting the second animation onto the animatronic based on the at least one second constraint or requirement (see para. [0030], as cited above.)
With respect to claim 15:
Bacher teaches the method of claim 9, and further teaches wherein the at least one second constraint or requirement comprises a kinematic constraint of the animatronic (para 0008, “….To this end, the controller may include an inverse kinematics (IK) solver, and the controller processes input motions with the IK solver by solving inverse kinematics to retarget the input motions onto the kinematic structure with the control signals. Additionally, the IK solver may, in some embodiments, solve the inverse kinematics by applying hard constraints to at least one of the mechanical joints and soft constraints to an IK target associated with a point on the kinematic structure.”)
With respect to claim 16:
Bacher teaches:
A method for generating new animations for an animatronic (see abstract), the method comprising: receiving animation source content defining input motions (See para. [0030], “… Instead of retargeting motion from an animated character onto a figure, the retargeting of motion data, which may be captured from an actress or actor performing the target animation, can also be used as input motions as part of the animation task of figures or robots.”); generating an animation similar to the input motions and based on at least one creative requirement for the animation (See para. [0029], as quoted in the rejection of claim 1 above.); retargeting the animation onto the animatronic based on at least one kinematic or dynamic constraint of the animatronic (See para. [0032], as quoted in the rejection of claim 1 above.); and deploying the animation to the animatronic for performance by the animatronic (See para. [0007], as quoted in the rejection of claim 1 above.)
With respect to claim 17:
Bacher teaches the method of claim 16, and further teaches wherein the retargeting comprises transferring the animation onto the animatronic using an inverse kinematics and/or inverse dynamics tool (para 0008, “….To this end, the controller may include an inverse kinematics (IK) solver, and the controller processes input motions with the IK solver by solving inverse kinematics to retarget the input motions onto the kinematic structure with the control signals. Additionally, the IK solver may, in some embodiments, solve the inverse kinematics by applying hard constraints to at least one of the mechanical joints and soft constraints to an IK target associated with a point on the kinematic structure.”)
With respect to claim 18:
Bacher teaches the method of claim 17, and further teaches wherein the inverse kinematics and/or inverse dynamics tool validates the animation against all physical constraints of the animatronic (See para. [0009], as quoted in the rejection of claim 6 above. In addition, see para. “[0071] The new IK-based controller or IK processing pipeline was tested or validated with three examples that highlight different aspects and functionality of the IK formulation. For all the examples, the input to the pipeline was a digital animation created using Autodesk Maya, from which the relevant tracking frames were extracted using a custom script.”)
With respect to claim 20:
Bacher teaches the method of claim 16, and further teaches wherein the input motions are created by a human or robot actor (para 0030, “…Instead of retargeting motion from an animated character onto a figure, the retargeting of motion data, which may be captured from an actress or actor performing the target animation, can also be used as input motions as part of the animation task of figures or robots.”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 3-4, 21-22, 24, and 26 are rejected under 35 U.S.C. 103 as being unpatentable over Bacher et al. (US 20220226987 A1) in view of Peng et al. (US 20250232504 A1).
With respect to claim 3:
Bacher teaches the method of claim 1; however, Bacher does not explicitly teach wherein the generating comprises generating the animation using a trained model.
Peng teaches wherein the generating comprises generating the animation using a trained model (See, para “[0056] The model 260 can generate an output response 270 responsive to receiving the model-compliant input from the dataset generator 256. The output response 270 can be in a suitable motion and/or gesture file format that presents an animation or movement of a human character in a kinematic model, planar models, or volumetric model through space defined by a coordinate system and over a period of time. The output response 270 can be rendered by a suitable graphical processor to be displayed on a display screen as animation, video, and so on. Through rendering, surfaces and textual for the human characters can be added.”)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to incorporate generating the animation using a trained model, as taught by Peng, into Bacher.
One would be motivated to do so to generate the motion model using machine learning models (see abstract and title of Peng).
With respect to claim 4:
Bacher teaches the method of claim 3; however, Bacher does not explicitly teach wherein the trained model is based on imitation learning or generative artificial intelligence.
Peng teaches wherein the trained model is based on imitation learning or generative artificial intelligence (see para 0041, “The training system 200 can train (e.g., update) the human motion foundation model 110 by applying as input the training data 204. The training data 204 can include one or more of the mocap data 102 and the video reconstruction data 104. In some examples, the training data 204 can also include text prompts. The human motion foundation model 110 (e.g., the generative model) is trained or updated using the training data 204 to allow the human motion foundation model 110 to output the output data 206 (e.g., the generated motion 120).” Also, see para 0028 regarding imitation learning.)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to incorporate a trained model based on imitation learning or generative artificial intelligence, as taught by Peng, into Bacher.
One would be motivated to do so to generate the motion model using machine learning models (see abstract and title of Peng).
Note: Regarding claims 3-4, also see the rejections under Mitchell with regard to explicit teaching of training the model.
With respect to claim 21:
Bacher teaches:
A method for training a model used to generate animations for an animatronic (see abstract), the method comprising: receiving a set of constraints of an animatronic and a set of motion data (para 0034, “The IK formulation implements the kinematic constraints of the robot as hard constraints, which means they are satisfied and ensures the robot is kept in a feasible state at any moment in time. The IK targets are implemented as soft constraints or objectives, which means they are satisfied as closely as possible given the robot kinematics. The operator or artist providing the input motions can be provided feedback during the animation tasks, and, if the IK targets cannot be fulfilled exactly, the artist can be provided control over which targets to give higher priority (e.g., by weighting of one or more of input motions or trajectories).”); and [training] a model based on the set of motion data and the set of constraints to generate an animation that is physically feasible on an animatronic (para 0034, “The IK formulation has been proven by the inventors to be able to transfer rich motions onto complex assemblies with loops with only a very few tracked points and/or frames. In one demonstration, the inventors were able to show use of the IK formulation on an autonomous robot to control the center of mass position over time, which enabled the rapid creation of animations that were physically feasible and that could be blended in when the robot was moving dynamically. The IK formulation implements the kinematic constraints of the robot as hard constraints, which means they are satisfied and ensures the robot is kept in a feasible state at any moment in time. The IK targets are implemented as soft constraints or objectives, which means they are satisfied as closely as possible given the robot kinematics. The operator or artist providing the input motions can be provided feedback during the animation tasks, and, if the IK targets cannot be fulfilled exactly, the artist can be provided control over which targets to give higher priority (e.g., by weighting of one or more of input motions or trajectories).”).
However, Bacher does not explicitly teach training the model using the motion data and the constraints.
Peng teaches training a model based on the set of motion data and the set of constraints to generate an animation (para 0023, “To provide data in addition to the mocap data 102, scalable techniques such as human motion data (e.g., video reconstruction data 104) reconstructed from videos can be used. In some examples, the video reconstruction data 104 can be treated using reinforcement learning (RL), physics-based character simulations, and user feedback information 130. Such video reconstruction data 104 can generalize and expand the motion repertoire beyond what is available through the mocap data 102. As shown, the video reconstruction data 104 is applied as input to a motion imitation controller 106, which is trained (e.g., updated) using RL and physics-based character simulations. The mocap data 102 and the output of the motion imitation controller 106 are applied as supervision to train the human motion foundation model 110, for example, as inputs into a training pipeline of the human motion foundation model 110….” In addition, see Fig. 2. Also, see para 0034, “In some examples, the user feedback information 130 can include a rating or a score indicative of the quality of the generated motion 120 as perceived by a user. For example, the user can be instructed to provide a score indicating the relevance of the generated motion 120 to a text prompt used to generate the generated motion 120. During training, the text prompt can be inputted into the human motion foundation model 110 as a constraint, or the text prompt can be a label or description of the video reconstruction data 104 or the mocap data 102….”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to incorporate training a model based on the set of motion data and the set of constraints, as taught by Peng, into Bacher.
One would be motivated to do so to generate the motion model using machine learning models (see abstract and title of Peng).
With respect to claim 22:
Bacher as modified by Peng teaches the method of claim 21, and Bacher further teaches wherein: the motion data comprises a set of input motions that is performed or has been previously performed by an actor (para 0030, “Instead of retargeting motion from an animated character onto a figure, the retargeting of motion data, which may be captured from an actress or actor performing the target animation, can also be used as input motions as part of the animation task of figures or robots.”); and the training comprises: analyzing kinematic inputs of the input motions; simulating, based on the analysis, kinematic and dynamic motion of the animatronic (para 0075, “The new pipeline or IK solver process supports arbitrary kinematic structures, which includes serial robots. As an example, the inventors retargeted an arm motion onto a 4-DoF arm of a boxer robot. FIGS. 8A-8C illustrate a simulated boxer character or robot 810, a physical robot 820 (both in neutral poses), and a plot 830 of the joint velocities over time (for the unregularized version (shown with dotted lines) and regularized versions (shown with solid lines). The inset views in plot/graph 830 show the configuration change of the robot before and after the velocity spike, with motors with lighter shading indicating that the velocity limit is exceeded.”); and determining whether the kinematic and dynamic motion is feasible on the animatronic, wherein feasibility is defined as being within the set of constraints (claim 2, “..wherein the soft constraints for the IK target are met during the solving of the inverse kinematics by minimizing distances between a target trajectory and a retargeted trajectory for the IK target while ensuring the retargeted input motions are physically feasible for the kinematic structure.”).
Note: In the alternative, Peng describes training comprising analyzing kinematic inputs of the input motions and simulating, based on the analysis, kinematic and dynamic motion of the animatronic (see at least para 0028-0031). The same rationale can be used for the combination, as described with regard to claim 21.
With respect to claim 24:
Bacher as modified by Peng teaches the method of claim 22, and further teaches wherein the actor is a human actor, the animatronic, or a second animatronic (para 0030, “Instead of retargeting motion from an animated character onto a figure, the retargeting of motion data, which may be captured from an actress or actor performing the target animation, can also be used as input motions as part of the animation task of figures or robots.”).
With respect to claim 26:
Bacher as modified by Peng teaches the method of claim 21, and Bacher further teaches wherein the set of constraints comprises at least one of a kinematic constraint, a dynamic constraint, or a creative requirement (para 0008, “….To this end, the controller may include an inverse kinematics (IK) solver, and the controller processes input motions with the IK solver by solving inverse kinematics to retarget the input motions onto the kinematic structure with the control signals. Additionally, the IK solver may, in some embodiments, solve the inverse kinematics by applying hard constraints to at least one of the mechanical joints and soft constraints to an IK target associated with a point on the kinematic structure.”).
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Bacher in view of Lavalley et al. (US 20200369333 A1).
With respect to claim 14:
Bacher teaches the method of claim 9; however, Bacher does not explicitly teach wherein the at least one first constraint or requirement comprises a creative requirement defined by existing music, lighting, or narration.
Lavalley teaches wherein the at least one first constraint or requirement comprises a creative requirement defined by existing music, lighting, or narration (para 0058, “FIG. 2 is a functional block diagram of software architecture 200 for a robot actor of the present description (such as robot 160 of FIG. 1), which may be used to implement a control module 170 of the robot 160 of FIG. 1. The novel software architecture 200 was developed to transform a legged character platform or robot into a capable actor. The associated requirements for such an actor include the ability to: (a) move and locomote in style and emotion that matches one or several character performances; (b) interact with nearby people in a life-like manner while navigating freely through an environment; and (c) execute complex narratives that may branch and/or loop based on external inputs from the environment or a remote operator (not shown in FIG. 1 but understood by those skilled in the arts as providing wireless control signals to the robot 160).” Also see para 0035 for story-telling.)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to incorporate a narration requirement, as taught by Lavalley, into Bacher.
One would be motivated to do so to configure the animatronic to perform multiple roles (para 0034, “…The robot disclosed herein is sometimes referred to as a “capable robot actor” meaning that it can be configured to perform multiple disparate characters and roles and that it can perform those characters with sufficient fidelity that an observer's attention is directed increasingly on the illusion of a living character and decreasingly on the reality of an actor portraying the character.”).
Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Bacher in view of Mitchell et al. (US 20210110001 A1).
With respect to claim 19:
Bacher teaches the method of claim 16; however, it does not explicitly teach further comprising receiving an operator command approving the animation.
Mitchell teaches further comprising receiving an operator command approving the animation (see Fig. 5).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to incorporate receiving an operator command approving the animation, as taught by Mitchell, into Bacher.
One would be motivated to do so to revise the design until it is approved (see para 0057).
Claim 23 is rejected under 35 U.S.C. 103 as being unpatentable over Bacher as modified by Peng, and further in view of Mitchell et al. (US 20210110001 A1).
With respect to claim 23:
Bacher as modified by Peng teaches the method of claim 22; however, it does not explicitly teach wherein the training further comprises repeating the simulation with one or more adjustments to the kinematic or dynamic motion based on the determination.
Mitchell teaches wherein the training further comprises repeating the simulation with one or more adjustments to the kinematic or dynamic motion based on the determination (see Fig. 5).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to incorporate repeating the simulation with one or more adjustments, as taught by Mitchell, into Bacher as modified by Peng.
One would be motivated to do so to revise the design until it is approved (see para 0057).
Claim 25 is rejected under 35 U.S.C. 103 as being unpatentable over Bacher as modified by Peng, and further in view of Wang et al. (US 20240201677 A1).
With respect to claim 25:
Bacher as modified by Peng teaches the method of claim 21; however, it does not explicitly teach wherein the training comprises learning an encoder and a decoder for the set of motion data.
Wang teaches wherein the training comprises learning an encoder and a decoder for the set of motion data (para 0012, “FIG. 5 is a block diagram illustration depicting how encoder and decoder neural networks are initially trained based on state and action data from the human demonstration, according to an embodiment of the present disclosure;” in addition, see para 0006).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to incorporate an encoder and a decoder, as taught by Wang, into Bacher as modified by Peng.
One would be motivated to do so to compute actions based on motion state (see para 0006).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US 20250387906 A1: “Provided are a robot control method and apparatus, an electronic device, a storage medium, and a program product. The method includes: acquiring state data configured for indicating a current motion state of a robot, and acquiring environmental data configured for indicating an environment where the robot is currently located; predicting, based on the state data, an initial action parameter configured for controlling the robot to imitate an object action of a target object; predicting, based on the environmental data, an environmental impact parameter configured for representing impact generated by the environment on imitation of the object action by the robot….”
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WASEEM ASHRAF whose telephone number is (571)270-3948. The examiner can normally be reached Monday-Friday 09:30 A.M-06:00 P.M.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tariq Hafiz can be reached at 571-272-5250. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WASEEM ASHRAF/Supervisory Patent Examiner, Art Unit 3621