Prosecution Insights
Last updated: April 19, 2026
Application No. 18/095,394

ADAPTIVE WEIGHT PRESCRIPTION SYSTEM AND METHOD FOR RESISTANCE TRAINING

Status: Final Rejection (§102)
Filed: Jan 10, 2023
Examiner: ATKINSON, GARRETT K
Art Unit: 3784
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Nautilus Inc.
OA Round: 2 (Final)

Grant Probability: 71% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 3m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 71% (above average; 542 granted / 759 resolved; +1.4% vs TC avg)
Interview Lift: +35.5% (strong; measured across resolved cases with an interview)
Typical Timeline: 2y 3m average prosecution; 17 applications currently pending
Career History: 776 total applications across all art units
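The headline figures above reduce to simple ratios over the examiner's resolved cases. A minimal sketch of the arithmetic follows; note that only the 542 granted / 759 resolved totals come from this report, while the with/without-interview split below is a hypothetical illustration of how an interview lift is computed, not the report's underlying data.

```python
# Career allow rate and interview lift from raw case counts.
# 542 granted / 759 resolved comes from the report above; the
# interview split is hypothetical, chosen only for illustration.

def allow_rate(granted: int, resolved: int) -> float:
    """Allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

career = allow_rate(542, 759)  # shown in the report as 71%
print(f"Career allow rate: {career:.1f}%")

# Interview lift = allow rate among cases with an interview
# minus allow rate among cases without one (hypothetical counts).
with_iv = allow_rate(149, 150)
without_iv = allow_rate(393, 609)
print(f"Interview lift: {with_iv - without_iv:+.1f} points")
```

The two hypothetical subsets sum back to the reported totals (149 + 393 = 542 granted; 150 + 609 = 759 resolved), so the sketch stays internally consistent with the panel.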

Statute-Specific Performance

§101: 2.4% (-37.6% vs TC avg)
§103: 25.6% (-14.4% vs TC avg)
§102: 42.0% (+2.0% vs TC avg)
§112: 24.2% (-15.8% vs TC avg)

Deltas are measured against a Tech Center average estimate. Based on career data from 759 resolved cases.
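Each delta is just the examiner's statute-specific rate minus the Tech Center estimate. Notably, all four printed deltas back-derive to the same 40.0% Tech Center average; that flat 40% figure is inferred from the deltas here, not stated anywhere in the report:

```python
# Statute-specific rates vs. the Tech Center average.
# Examiner rates are taken from the panel above; the 40.0% TC
# average is back-derived from the printed deltas
# (e.g., 2.4% - 40.0% = -37.6%), not given in the report.

examiner_rates = {"101": 2.4, "103": 25.6, "102": 42.0, "112": 24.2}
TC_AVERAGE = 40.0  # inferred; the report calls it an "estimate"

for statute, rate in examiner_rates.items():
    delta = rate - TC_AVERAGE
    print(f"§{statute}: {rate:.1f}% ({delta:+.1f}% vs TC avg)")
```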

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-7, 9, 12, and 14-23 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Asikainan et al. (US 2021/0008413). Asikainan teaches regarding claim:

1. A computer-implemented method comprising: generating a workout plan for a user having a user profile, the workout plan comprising an exercise selected from a set of exercises stored in the user profile, the exercise having a prescribed weight and a prescribed number of exercise repetitions (Asikainan paragraphs 64 and 66 disclose generating workouts based on the user profile. Asikainan paragraph 92 further discloses that the workouts include a weight [to be used] and a number of repetitions, both of which can be modified: “[0092] The recommendation engine 210 generates on-the-fly recommendation to modify or alter the user exercise workout based on the state or level of fatigue of the user. For example, the recommendation engine 210 may recommend to the user to push for As Many Repetitions As Possible (AMRAP) in the last set of an exercise movement if the level of fatigue of the user is low. In another example, the recommendation engine 210 may recommend to the user to reduce the number of repetitions from 10 to five on a set of exercise movements if the level of fatigue of the user is high.
In another example, the recommendation engine 210 may recommend to the user to increase weight on the weight equipment by 10 pounds if the level of fatigue of the user is low. In yet another example, the recommendation engine 210 may recommend to the user to decrease weight on the weight equipment by 20 pounds if the level of fatigue of the user is high. The recommendation engine 210 may also take into account any personally set objectives, a last occurrence of a workout session, number of repetitions of one or more exercise movements, heart rate, breathing rate, facial expression, weight volume, etc. to generate a recommendation to modify the user exercise workout to prevent a risk of injury. For example, the recommendation engine 210 uses Heart Rate Variability (PPG-HRV) in conjunction with exercise analysis to recommend a change in exercise patterns (e.g. if PPG-HRV is poor, recommend a lighter workout). In some implementations, the recommendation engine 210 instructs the user interface engine 216 to display the recommendation on the interactive screen of the interactive personal training device 108 after the user completes a set of repetitions or at the end of the workout session. Example recommendations may include a set amount of weight to pull or push, a number of repetitions to perform (e.g., Push for one more rep), a set amount of weight to increase on an exercise movement (e.g., Add 10 pound plate for barbell deadlift), a set amount of weight to decrease on an exercise movement (e.g., Remove 20 pound plate for barbell squat), a change in an order of exercise movements, change a cadence of the repetition, increase a speed of an exercise movement, decrease a speed of an exercise movement (e.g. Reduce the duration of eccentric movement by 1 second to achieve 10% strength gain over 2 weeks), an alternative exercise movement (e.g., Do goblet squat instead) to achieve a similar exercise objective, a next exercise movement, a stretching mobility exercise to improve a range of motion, etc.”); capturing user performance data for the exercise during execution of the workout plan by the user in an exercise environment, the user performance data comprising an actual weight used in performance of the exercise by the user and an actual number of exercise repetitions performed by the user (Asikainan paragraph 87 discloses the performance tracker which, amongst others, receives the used weight and the number of repetitions.); determining a new value for the prescribed weight based on a comparison between a stored value of the prescribed weight in the user profile and the actual weight, and a comparison between the prescribed number of exercise repetitions and the actual number of exercise repetitions; and automatically adjusting the stored value of the prescribed weight in the user profile to the new value for the prescribed weight (Asikainan paragraphs 64, 80-90, 92 disclose updating the user profile, comparing data, and determining changes to the routine based on the sensor and user data and subsequently modifying the used weight and the repetitions).

2. The computer-implemented method of claim 1, wherein the exercise has an equipment type, wherein a set of weight graduation levels is defined for the equipment type, and wherein determining the new value comprises: determining a weight graduation level in the set of weight graduation levels corresponding to the stored value; and determining the new value relative to the weight graduation level corresponding to the stored value (Asikainan paragraph 92 discloses adding or removing a set amount of weight based on the values of the plates, i.e. weight steps/graduation levels).

3. The computer-implemented method of claim 1, further comprising presenting the prescribed weight and the prescribed number of exercise repetitions in the exercise environment prior to capturing the user performance data for the exercise (via the virtual personal trainer instructing and motivating the user as discussed in paragraph 34: “[0034] The interactive personal training devices 108a . . . 108n may be computing devices with data processing and communication capabilities. In the example of FIG. 1A, the interactive personal training device 108 is configured to implement a personal training application 110. The interactive personal training device 108 may comprise an interactive electronic display mounted behind and visible through a reflective, full-length mirrored surface. The full-length mirrored surface reflects a clear image of the user and performance of any physical movement in front of the interactive personal training device 108. The interactive electronic display may comprise a frameless touch screen configured to morph the reflected image on the full-length mirrored surface and overlay graphical content (e.g., augmented reality content) on and/or beside the reflected image. Graphical content may include, for example, a streaming video of a personal trainer performing an exercise movement. The interactive personal training devices 108a . . . 108n may be voice, motion, and/or gesture activated and revert back to a mirror when not in use. The interactive personal training devices 108a . . . 108n may be accessed by users 106a . . . 106n to access on-demand and live workout sessions, track user performance of the exercise movements, and receive feedback and recommendation accordingly.
The interactive personal training device 108 may include a memory, a processor, a camera, a communication unit capable of accessing the network 105, a power source, and/or other software and/or hardware components, such as a display (for viewing information provided by the entities 120 and 140), graphics processing unit (for handling general graphics and multimedia processing), microphone array, audio exciters, audio amplifiers, speakers, sensor(s), sensor hub, firmware, operating systems, drivers, wireless transceivers, a subscriber identification module (SIM) or other integrated circuit to support cellular communication, and various physical connection interfaces (e.g., HDMI, USB, USB-C, USB Micro, etc.).”).

4. The computer-implemented method of claim 1, wherein capturing the user performance data for the exercise comprises receiving voice commands from the exercise environment and processing the voice commands to obtain at least a portion of the user performance data (generation and processing of associated voice commands as discussed in paragraphs 34 and 46: “[0046] FIG. 1B is a diagram illustrating an example configuration for tracking physical activity of a user performing exercise movements and providing feedback and recommendations relating to performing the exercise movements. As depicted, the example configuration includes the interactive personal training device 108 equipped with the sensor(s) 109 configured to capture a video of a scene in which user 106 is performing the exercise movement using the barbell equipment 134a. For example, the sensor(s) 109 may comprise one or more of a high definition (HD) camera, a regular 2D camera, a RGB camera, a multi-spectral camera, a structured light 3D camera, a time-of-flight 3D camera, a stereo camera, a radar sensor, a LiDAR scanner, an infrared sensor, or a combination of one or more of the foregoing sensors.
The sensor(s) 109 comprising of one or more cameras may provide a wider field of view (e.g., field of view >120 degrees) for capturing the video of the scene in which user 106 is performing the exercise movement and acquiring depth information (R, G, B, X, Y, Z) from the scene. The depth information may be used to identify and track the exercise movement even when there is an occlusion of keypoints while the user is performing a bodyweight exercise movement or weight equipment-based exercise movement. A keypoint refers to a human joint, such as an elbow, a knee, a wrist, a shoulder, hip, etc. The depth information may be used to determine a reference plane of the floor on which the exercise movement is performed to identify the occluded exercise movement. The depth information may be used to determine relative positional data for calculating metrics such as force and time-under-tension of the exercise movement. Concurrently, the IMU sensor 132 on the equipment 134a in motion and the wearable device 130 on the person of the user are communicatively coupled with the interactive personal training device 108 to transmit recorded IMU sensor data and recorded vital signs and health status information (e.g., heart rate, blood pressure, etc.) during the performance of the exercise movement to the interactive personal training device 108. For example, the IMU sensor 132 records the velocity and acceleration, 3D positioning, and orientation of the equipment 134a during exercise movement. Each equipment 134 (e.g., barbell, plate, kettlebell, dumbbell, medical ball, accessories, etc.) include an IMU sensor 132. The interactive personal training device 108 is configured to process and analyze the stream of sensor data using trained machine learning algorithms and provide feedback in real time on the user 106 performing the exercise movement. 
For example, the feedback may include the weight moved in exercise movement pattern, the number of repetitions performed in the exercise movement pattern, the number of sets completed in the exercise movement pattern, the power generated by the exercise movement pattern, etc. In another example, the feedback may include a comparison of the exercise form of the user 106 against conditions of an ideal or correct exercise form predefined for the exercise movement and providing a visual overlay on the interactive display of the interactive personal training device to guide the user 106 to perform the exercise movement correctly. In another example, the feedback may include computation of classical force exerted by the user in the exercise movement and providing an audible and/or visual instruction to the user to increase or decrease force in a direction using motion path guidance on the interactive display of the interactive personal training device. The feedback may be provided visually on the interactive display screen of the interactive personal training device 108, audibly through the speakers of the interactive personal training device 108, or a combination of both. In some implementations, the interactive personal training device 108 may cause one or more light strips on its frame to pulse to provide the user with visual cues (e.g., repetition counting, etc.) representing a feedback. The user 106 may interact with the interactive personal training device 108 using voice commands or gesture-based commands. It should be understood that the sensor(s) 109 on the interactive personal training device 108 may be configured to track movements of multiple people at the same time. Although the example configuration in FIG. 1B is illustrated in the context of tracking physical activity of a user performing exercise movements and providing feedback and recommendations relating to performing the exercise movements, it should be understood that the configuration may apply to other contexts in vertical fields, such as medical diagnosis (e.g., health practitioner reviewing vital signs of a user, volumetric scanning, 3D imaging in medicine, etc.), physical therapy (e.g. physical therapist checking adherence to physio protocols during rehabilitation), and enhancing user experience in commerce including fashion, clothing, and accessories (e.g., virtual shopping with augmented reality try-ons), and body composition scanning in a personal training or coaching capacity.”).

5. The computer-implemented method of claim 1, wherein capturing the user performance data from the exercise environment comprises capturing video images of the exercise environment and processing the video images to obtain at least a portion of the user performance data (via motion tracking as discussed in paragraphs 34 and 46).

6. The computer-implemented method of claim 1, further comprising playing a video associated with the exercise in the exercise environment contemporaneously with capturing the user performance data for the exercise in the exercise environment (instructor video to motivate user as discussed in paragraph 34).

7. The computer-implemented method of claim 2, wherein determining the new value for the prescribed weight comprises: determining that the actual number of exercise repetitions is equal to or greater than the prescribed number of exercise repetitions; determining that the actual weight used is equal to or greater than the prescribed weight; determining that a ratio of an actual workout volume based on the actual weight and the actual number of exercise repetitions to a prescribed workout volume based on the prescribed weight and the prescribed number of exercise repetitions is equal to or greater than a threshold in a range from 1 to 1.5; and selecting a weight graduation level from the set of weight graduation levels that is greater than the stored value of the prescribed weight as the new value for the prescribed weight (see AMRAP workout discussion and subsequent increase in prescribed weight in paragraph 92).

9. The computer-implemented method of claim 1, wherein determining the new value for the prescribed weight comprises: determining that the actual number of exercise repetitions is equal to or greater than the prescribed number of exercise repetitions; determining that the actual weight is lower than the prescribed weight; and setting the new value for the prescribed weight to the actual weight (when monitoring the user and adjusting the workout during the course of normal use as described in paragraph 92).

12. The computer-implemented method of claim 1, wherein determining the new value for the prescribed weight comprises: determining that the actual number of exercise repetitions is equal to or greater than the prescribed number of exercise repetitions; determining that the actual weight is higher than the prescribed weight; and setting the new value for the prescribed weight to the actual weight (when monitoring the user and adjusting the workout during the course of normal use as described in paragraph 92).

14. The computer-implemented method of claim 1, wherein the exercise has an exercise code and an exercise multiplier associated with the exercise code, and further comprising: determining a user capability index for the exercise based on the new value for the prescribed weight and the exercise multiplier; and storing the user capability index in the user profile in association with the exercise (when monitoring the user and adjusting the workout to appropriately challenge the user during the course of normal use as described in paragraph 92).

15. The computer-implemented method of claim 1, further comprising: generating an initial workout plan comprising a set of base movement patterns; assigning prescribed weights to the set of base movement patterns based on a body weight of the user; capturing initial user performance data for the set of base movement patterns during execution of the initial workout plan by the user, the initial user performance data comprising the actual number of exercise repetitions performed by the user for the set of base movement patterns; and determining initial values for the prescribed weights of exercises in the set of exercises stored in the user profile based on the initial user performance data (as discussed in paragraphs 34 and 92).

16. The computer-implemented method of claim 15, wherein determining the initial values for the prescribed weights of the exercises comprises: determining a subset of the set of exercises having a primary movement pattern that matches a first movement pattern from the set of base movement patterns; and determining initial values for the prescribed weights of the subset of the set of exercises based on a portion of the initial user performance data corresponding to the first movement pattern (as discussed in paragraphs 34 and 92).

17. The computer-implemented method of claim 1, further comprising: capturing an image of a user assuming an initial user position; and confirming that the initial user position is correct based on the image (via the motion tracking and the virtual trainer as discussed in at least paragraph 34).

18. The computer-implemented method of claim 1, further comprising: setting default weights values; receiving user responses to a series of questions; responsive to the user responses, adjusting the default weights values; and implementing the default weights values in the workout plan (based on initial user inputs and assessment as discussed in 34 and 92).

19. The computer-implemented method of claim 1, further comprising: receiving an indication of a plurality of problems with form based on video captured during execution of the workout plan; and selectively announcing a problem out of the problems with form based on remaining time in an interval, whether the problem has been observed before, or a priority of the problem (via user monitoring and the virtual instructor as discussed in paragraphs 34 and 92).

20. A computing system comprising: at least one hardware processor; at least one memory coupled to the at least one hardware processor; and one or more non-transitory computer-readable media having stored therein computer-executable instructions that, when executed by the computing system, cause the computing system to perform: generating a workout plan for a user having a user profile, the workout plan comprising an exercise selected from a set of exercises stored in the user profile, the exercise having a prescribed weight and a prescribed number of exercise repetitions; capturing user performance data for the exercise during execution of the workout plan by the user in an exercise environment, the user performance data comprising an actual weight used in performance of the exercise by the user and an actual number of exercise repetitions performed by the user; determining a new value for the prescribed weight based on a stored value of the prescribed weight in the user profile, the prescribed number of exercise repetitions, and the user performance data; and adjusting the stored value of the prescribed weight in the user profile to the new value for the prescribed weight (all as discussed and cited above).

21. One or more non-transitory computer-readable media storing computer-executable instructions that when executed cause a computing system to perform operations comprising: generating a workout plan for a user having a user profile, the workout plan comprising an exercise selected from a set of exercises stored in the user profile, the exercise having a prescribed weight and a prescribed number of exercise repetitions; capturing user performance data for the exercise during execution of the workout plan by the user in an exercise environment, the user performance data comprising an actual weight used in performance of the exercise by the user and an actual number of exercise repetitions performed by the user, wherein capturing the user performance data comprising capturing video images of the exercise environment during execution of the workout plan by the user in the exercise environment and processing the video images to obtain at least a portion of the user performance data; determining a new value for the prescribed weight based on a stored value of the prescribed weight in the user profile, the prescribed number of exercise repetitions, and the user performance data; and adjusting the stored value of the prescribed weight in the user profile to the new value for the prescribed weight (all as discussed and cited above).

22. The one or more non-transitory computer-readable media of claim 21, wherein the exercise has an exercise code and an exercise multiplier associated with the exercise code, and wherein the operations further comprise: determining a user capability index for the exercise based on the new value for the prescribed weight and the exercise multiplier; and storing the user capability index in the user profile in association with the exercise (as discussed above).

23. The one or more non-transitory computer-readable media of claim 21, wherein the operations further comprise: generating an initial workout plan comprising a set of base movement patterns; assigning prescribed weights to the set of base movement patterns based on a body weight of the user; capturing initial user performance data for the set of base movement patterns during execution of the initial workout plan by the user, the initial user performance data comprising the number of exercise repetitions performed by the user for the set of base movement patterns; and determining a subset of the set of exercises having a primary movement pattern that matches a first movement pattern from the set of base movement patterns; and determining initial values for the prescribed weights of the subset of the set of exercises based on a portion of the initial user performance data corresponding to the first movement pattern (as discussed and cited above).

Response to Arguments

Applicant's arguments filed 7/28/2025 have been fully considered but they are not persuasive. Applicant has argued Asikainen does not teach making a recommendation based on the claimed comparisons, nor automatically updating the user profile with new prescribed values. The Office disagrees. Both are taught: [0064] discusses updating the user profile with new exercise data, which would include the prescribed values, and [0088]-[0090] discuss recommendations based on the claimed comparisons.

Conclusion

Applicant's amendment necessitated any new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to GARRETT K ATKINSON whose telephone number is (571) 272-8117. The examiner can normally be reached 0800-1800 M-F.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, LoAn Jimenez, can be reached at (571) 272-4966. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/GARRETT K ATKINSON/
Primary Examiner, Art Unit 3784
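The adjustment logic recited in claims 1, 2, and 7 (compare actual vs. prescribed reps and weight, check a workout-volume ratio against a threshold between 1 and 1.5, then step up to the next weight graduation level) can be sketched in code. Everything below is illustrative only; the function name, the dumbbell graduation values, and the default threshold are assumptions, not taken from the application's specification:

```python
# Illustrative sketch of the claimed weight-adjustment method, not the
# applicant's actual implementation. Claim 7's conditions: actual reps >=
# prescribed reps, actual weight >= prescribed weight, and the ratio of
# actual volume (weight x reps) to prescribed volume >= a threshold in
# the range [1, 1.5]; if all hold, select the next graduation level above
# the stored prescribed weight (claim 2's per-equipment level set).

from bisect import bisect_right

# Hypothetical graduation levels for a dumbbell rack (lbs).
DUMBBELL_LEVELS = [5, 10, 15, 20, 25, 30, 35, 40, 50, 60]

def adjust_prescribed_weight(prescribed_weight, prescribed_reps,
                             actual_weight, actual_reps,
                             levels=DUMBBELL_LEVELS, threshold=1.0):
    """Return the new prescribed weight per the claim 7 conditions."""
    volume_ratio = ((actual_weight * actual_reps)
                    / (prescribed_weight * prescribed_reps))
    if (actual_reps >= prescribed_reps
            and actual_weight >= prescribed_weight
            and volume_ratio >= threshold):  # threshold in [1, 1.5]
        # Step up to the next graduation level above the stored value.
        i = bisect_right(levels, prescribed_weight)
        if i < len(levels):
            return levels[i]
    return prescribed_weight  # conditions not met: keep stored value

# User was prescribed 20 lb x 10 reps and completed 20 lb x 12 reps,
# so the prescription steps up to the next level (25 lb).
new_weight = adjust_prescribed_weight(20, 10, 20, 12)
```

Claims 9 and 12 cover the other branches (actual weight below or above the prescribed weight with reps met), where the new value is simply set to the actual weight used.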

Prosecution Timeline

Jan 10, 2023
Application Filed
Apr 30, 2025
Non-Final Rejection — §102
Jul 17, 2025
Interview Requested
Jul 24, 2025
Examiner Interview Summary
Jul 24, 2025
Applicant Interview (Telephonic)
Jul 28, 2025
Response Filed
Nov 15, 2025
Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599810: TREADMILL, TREADMILL SYSTEM AND TREADMILL ASSEMBLY METHOD
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12589286: EASY-TO-ASSEMBLE TRAMPOLINE WITH SPRAY ELEMENTS
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12589280: DEVICE AND SYSTEM FOR EXERCISING AND MONITORING THE PELVIC FLOOR MUSCLES
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12589273: WEIGHTED VEST
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12569391: MULTI-FUNCTION STRETCHING APPARATUS
Granted Mar 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 71%
With Interview: 99% (+35.5%)
Median Time to Grant: 2y 3m
PTA Risk: Moderate

Based on 759 resolved cases by this examiner. Grant probability derived from career allow rate.
