DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
Applicant's request for reconsideration of the rejection in the last Office action is persuasive and, therefore, the 35 U.S.C. 103 rejection of the previous Office action is withdrawn. However, new art has been found, and a new rejection is set forth below.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-21 are rejected under 35 U.S.C. 103 as being unpatentable over Asikainen et al. (US 2022/0296966 A1) in view of Chaney (US 2020/0294298 A1).
Regarding claim 1, Asikainen discloses a method for providing an immersive and interactive fitness experience to a user ([0036], a system for tracking physical activity of a user performing exercise movements and providing feedback and recommendations relating to performing the exercise movements), the method comprising: identifying at least one of a pose and a movement corresponding to the user performing one or more activities in a real-world environment ([0046], a personal training application may include software and/or logic to provide the functionality for tracking physical activity of a user performing exercise movements); rendering the at least one of the pose and the movement identified in the avatar using an Extended Reality (XR) technique ([0039], interactive electronic display may comprise a frameless touch screen configured to morph the reflected image on the full-length mirrored surface and overlay graphical content on and/or beside the reflected image); monitoring a pattern of the one or more activities being performed by the user in the real-world environment using an Artificial Intelligence (AI) model ([0051], the interactive personal training device is configured to process and analyze the stream of sensor data using trained machine learning algorithms); and dynamically providing a feedback to at least one of the user and the avatar based on the monitoring ([0051], provide feedback in real time on the user performing the exercise movement, the feedback may include the weight moved in exercise movement pattern, the number of repetitions performed in the exercise movement pattern, the number of sets completed in the exercise movement pattern, the power generated by the exercise movement pattern, etc.), wherein the feedback is provided through an AI-assisted virtual expert in at least one of the real-world environment and the metaverse or the virtual environment ([0051], the feedback may include a comparison of the exercise form of the user against conditions of an ideal or correct exercise form predefined for the exercise movement and providing a visual overlay on the interactive display of the interactive personal training device to guide the user to perform the exercise movement correctly).
Although Asikainen teaches a gamification engine to facilitate users to "level up" a virtual self or avatar based on their preferred physical body representation by following or performing exercise programs or routines ([0107]), Asikainen differs from the claimed invention in not specifically teaching the steps of generating, in a metaverse or a virtual environment, an avatar corresponding to the user of the real-world environment based on the at least one of the pose and the movement identified, wherein the avatar and the associated virtual environment is customizable based on one of a user's physical appearance, user preferences, or user performance, and monitoring the corresponding avatar in the metaverse or virtual environment. However, Chaney teaches a method for generating three-dimensional avatars of multiple telepresence participants and rendering the avatars in a virtual three-dimensional space, with capture devices that capture data about their corresponding users and, optionally, data about the real-world environments in which those users are located, such that a computing device may use movement data captured by the capture device to understand movements made by the user and to make the avatar make the same movements as the user ([0027] and [0031]-[0032]) in order to improve telepresence technologies.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Asikainen to include the steps of generating, in a metaverse or a virtual environment, an avatar corresponding to the user of the real-world environment based on the at least one of the pose and the movement identified, wherein the avatar and the associated virtual environment is customizable based on one of a user's physical appearance, user preferences, or user performance, and monitoring the corresponding avatar in the metaverse or virtual environment, as per the teaching of Chaney, in order to improve telepresence technologies.
Regarding claim 2, Asikainen discloses capturing a physical appearance of the user in the real-world environment via a capturing module ([0051], sensor(s) configured to capture a video of a scene in which user is performing the exercise movement).
Regarding claim 3, Asikainen discloses tracking, by the AI model, motion of the user and the corresponding image performing the one or more activities in the real-world environment and the metaverse or the virtual environment, respectively; and determining, by the AI model, a type of each of the one or more activities performed by the user and the corresponding image based on the tracking (figure 1B and [0051], the interactive personal training device equipped with the sensor(s) configured to capture a video of a scene in which user is performing the exercise movement and provide feedback in real time on the user performing the exercise movement, wherein the feedback may include a comparison of the exercise form of the user against conditions of an ideal or correct exercise form predefined for the exercise movement and providing a visual overlay on the interactive display of the interactive personal training device to guide the user 106 to perform the exercise movement correctly). Asikainen differs from the claimed invention in not specifically teaching tracking the corresponding avatar and determining the corresponding avatar based on the tracking. However, Chaney teaches these features ([0031]-[0032]) in order to improve telepresence technologies. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Asikainen to track the corresponding avatar and to determine the corresponding avatar based on the tracking, as per the teaching of Chaney, in order to improve telepresence technologies.
Regarding claim 4, Asikainen discloses that the feedback comprises one of a positive encouragement feedback provided to the user or the corresponding avatar when performing the one or more activities correctly, a warning or a critical error feedback provided to the user or the corresponding avatar when performing the one or more activities incorrectly, or a feedback provided in a form of accessible progress report of the user or the corresponding avatar performing the one or more activities ([0051] and [0098], provide feedback in real time on the user performing the exercise movement, the feedback may include the weight moved in exercise movement pattern, the number of repetitions performed in the exercise movement pattern, the number of sets completed in the exercise movement pattern, the power generated by the exercise movement pattern; and recommendation engine instructs the user interface engine to generate an alert on the interactive screen of the interactive personal training device informing the user to decrease force in the direction of the actual motion path to avoid injury).
Regarding claim 5, Asikainen discloses that the feedback is provided in one of a visual form, an audio form, or a haptic form ([0040] and [0051], to create haptic feedback including vibrations or a rumble in the equipment or to provide feedback including computation of classical force exerted by the user in the exercise movement and providing an audible and/or visual instruction to the user).
Regarding claim 6, Asikainen discloses assisting, by the AI-assisted virtual expert, the user and the corresponding avatar to perform the one or more activities individually; and providing, by the AI-assisted virtual expert, a personalized feedback to the user and the corresponding avatar based on performance of the one or more activities ([0138], the feedback may be displaying a graphical representation of a personal trainer 'avatar' correctly performing a squat exercise movement next to the 3D model of the user performing the same exercise movement for comparison).
Regarding claim 7, Asikainen discloses assisting, by the AI-assisted virtual expert, a set of users and corresponding avatars to perform the one or more activities in a group ([0134], the personal training engine may instantiate a channel for the platform provided by each one of the third-party partners); and providing, by the AI-assisted virtual expert, a generalized feedback to each of the set of users and the corresponding avatars based on performance of the one or more activities, wherein the set of avatars performing the group activities are presented as if they are physically present in the real-world, but the activities are actually performed at different times or experienced solely in the metaverse or virtual environment when the user is actively engaged ([0138], the feedback may be a green tick mark displayed on the interactive screen for a perfect repetition of the exercise movement, a yellow tick mark displayed for an acceptable repetition of the exercise movement, and a red strike mark displayed for an incorrect form in the repetition of the exercise movement, and the program enhancement engine provides the third-party partner with base acceptable thresholds prepopulated for a common set of exercise movements from exercise and fitness literature and associated feedback to be relayed to the user, wherein the third-party partner has the ability to review and revise the base acceptable thresholds and associated feedback to match their training methodology or principles).
Regarding claim 8, Asikainen discloses when the user is actively engaged in the metaverse or virtual environment, the set of users and corresponding avatars performing the group activities are presented as if physically present in the real-world environment at the same time, for the activities performed at different times and locations ([0107], avatar including the real time information about the user's fitness activity may be shared with or made visible to other users via the interactive personal training device or via a social media application).
Regarding claim 9, Asikainen discloses that the metaverse or the virtual environment is one of a virtual reality (VR) environment, an augmented reality (AR) environment, a mixed reality (MR) environment, or an XR environment ([0100], instruct the user interface engine to generate an augmented reality).
Regarding claim 10, Asikainen discloses that the one or more activities comprises of high knees, leg raises, crunches, jumping jacks, lateral squats, lunges, squats, burpees, overhead triceps, push-ups, dumbbell squat press, core scissors, elbow knee, a band lateral raise, a band lateral stretch, a hook, an uppercut, boxing, kettlebell, deadlift, dead bug, squat thrusters, yoga, or high-intensity interval training (HIIT) ([0051] and [0070], each equipment, i.e. barbell, plate, kettlebell, dumbbell, medical ball, accessories, etc., includes an IMU sensor, and workout programs may include one or more exercise movements based on, cardio, yoga, strength training, weight training, bodyweight exercises, dancing, toning, stretching, martial arts, Pilates, core strengthening, or a combination thereof).
Regarding claim 11, the limitations of the claim are rejected for the same reasons as set forth in claim 1.
Regarding claim 12, the limitations of the claim are rejected for the same reasons as set forth in claim 2.
Regarding claim 13, the limitations of the claim are rejected for the same reasons as set forth in claim 3.
Regarding claim 14, the limitations of the claim are rejected for the same reasons as set forth in claim 4.
Regarding claim 15, the limitations of the claim are rejected for the same reasons as set forth in claim 5.
Regarding claim 16, the limitations of the claim are rejected for the same reasons as set forth in claim 6.
Regarding claim 17, the limitations of the claim are rejected for the same reasons as set forth in claim 7.
Regarding claim 18, the limitations of the claim are rejected for the same reasons as set forth in claim 8.
Regarding claim 19, the limitations of the claim are rejected for the same reasons as set forth in claim 9.
Regarding claim 20, the limitations of the claim are rejected for the same reasons as set forth in claim 10.
Regarding claim 21, the limitations of the claim are rejected for the same reasons as set forth in claim 1.
Response to Arguments
Applicant’s arguments, see pages 9-17, filed 11/13/2025, with respect to claims 1-2, 5, 9-12, 15 and 19-21 have been fully considered and are persuasive. The rejections of claims 1-21 have been withdrawn.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Trehan (US 12,246,241 B2) discloses a method for capturing and coordinating physical activities of multiple users and providing overall feedback for one or more users performing one or more physical activities at one or more locations (abstract and figures 1-2).
Todasco et al. (US 12,032,732 B2) discloses a method for automated configuration of augmented and virtual reality avatars for user specific behaviors having a service provider to automate avatar configurations and presentations without requiring user inputs and based on real-time data (abstract and claim 1).
Trehan (US 2022/0072381 A1) discloses a method for training users to perform physical activities (abstract and [0010]-[0011]).
Vissa et al. (US 2020/0038709 A1) discloses a method for real-time AR activity feedback having an avatar to represent previous best performance of a person on a route depicted by an AR path, allowing the person to try and beat their previous best performance (abstract).
Dobbins et al. (US 8,217,995 B2) discloses a motion capture system to track the movements and interactions of multiple users within the virtual reality simulation such that the multiple users are represented by avatars in real time within the simulation (abstract).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GEORGE ENG whose telephone number is (571)272-7495. The examiner can normally be reached Flex M to F, 7 am to 3 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alford Kindred can be reached at 571-272-4037. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GEORGE ENG/ Supervisory Patent Examiner, Art Unit 2699