Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-20 have been examined. Claims 1 and 11 have been amended.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Sahin (US 2015/0099946 A1) in view of Bradley et al. (US 2019/0034494 A1, hereinafter Bradley) and further in view of Torres (US 2019/0333629 A1).
With respect to claim 1, Sahin teaches a system comprising:
a trainer computing device configured for operation by a trainer providing training to a trainee, the trainee having a developmental disability, the trainer computing device being configured for communication with a server that stores data for at least one task having a plurality of task steps, each task step having associated instructions for performing the task step (‘946; Para 0038: as in Fig. 1A, Sahin describes evaluating an individual 102 for autism spectrum disorder using a wearable data collection device 104 worn by the individual 102 (trainee) and/or a wearable data collection device 108 worn by a caregiver 106 (trainer), such that data 116 related to the interactions between the individual 102 and the caregiver 106 are recorded by at least one wearable data collection device 104, 108 and uploaded to a network 110 for analysis, archival, and/or real-time sharing with a remotely located evaluator 114; Para 0082: the session and evaluation data may be uploaded to long-term storage in a server farm or cloud storage area; Paras 0087, 0089: as in Fig. 4, an example method 400 for conducting an evaluation session uses a wearable data collection device donned by a caregiver of an individual being evaluated for Autism Spectrum Disorder; upon powering and donning the wearable data collection device, or launching an evaluation session application, the evaluation session may be initiated, and initiation of the evaluation session may include, in some embodiments, establishment of a communication channel between the wearable data collection device and a remote computing system);
a trainee computing device configured for operation by the trainee, the trainee computing device being configured for communication with the server and configured to output the associated instructions for each task step of the plurality of task steps while the trainee is performing the at least one task (‘946; Para 0038: as in Fig. 1A, Sahin describes evaluating an individual 102 for autism spectrum disorder using a wearable data collection device 104 worn by the individual 102 (trainee) and/or a wearable data collection device 108 worn by a caregiver 106 (trainer), such that data 116 related to the interactions between the individual 102 and the caregiver 106 are recorded by at least one wearable data collection device 104, 108 and uploaded to a network 110 for analysis, archival, and/or real-time sharing with a remotely located evaluator 114; Para 0050: the wearable data collection device 104, in some implementations, is configured to monitor physiological functions of the individual 102. In some examples, the wearable data collection device 104 may collect heart and/or breathing rate data 116e (or, optionally, electrocardiogram (EKG) data), electroencephalogram (EEG) data 116f, and/or electromyography (EMG) data 116i. The wearable data collection device 104 may interface with one or more peripheral devices, in some embodiments, to collect the physiological data. For example, the wearable data collection device 104 may have a wired or wireless connection with a separate heart rate monitor, EEG unit, or EMG unit. In other embodiments, at least a portion of the physiological data is collected via built-in monitoring systems. Unique methods for non-invasive physiological monitoring are described in greater detail in relation to FIG. 11. Optional onboard and peripheral sensor devices for use in monitoring physiological data are described in relation to FIG. 12.);
wherein:
the trainer computing device is further configured to: enable the trainer to: monitor progress of the trainee operating the trainee computing device while the trainee is performing the plurality of task steps of the at least one task; and assign a task step score from a plurality of task step scores for each task step of the plurality of task steps after the trainee has completed each task step, the task step score indicating a level of guidance provided by the trainer to the trainee while the trainee was completing each task step; receive the task step scores for the plurality of task steps assigned by the trainer; and communicate the task step scores for the plurality of task steps assigned by the trainer to the server (‘946; Para 0055: During an evaluation session, the caregiver 106 is tasked with performing interactive tasks with the individual 102. Video recording data 116j collected by the caregiver wearable data collection device 108 is supplied to a computing system of the evaluator 114 in real-time via the analysis and data management system 118 such that the evaluator 114 is able to see the individual 102 more or less “through the eyes of” the caregiver 106 during the evaluation session. The evaluator 114 may also receive voice recording data 116a from either the caregiver wearable data collection device 108 or the subject wearable data collection device 104; Para 0199: effectiveness of the presented guidance is determined (748). For example, based upon recorded video and/or audio data, the socially relevant event identifier can identify a socially relevant response invoked by the individual and compare the response to the prompted response. This step and the following steps 748 and 750, in one example, may be performed at least in part by features of the social acceptability coach algorithm 540b, described in relation to FIG. 5B); and
Bradley discloses
receive the task step scores for the plurality of task steps assigned by the trainer; and communicate the task step scores for the plurality of task steps assigned by the trainer to the server (‘494; Abstract: Bradley describes tracking goal progression. Input establishing accounts for providers serving a client is received. The client is an individual receiving treatment or assistance. The accounts are stored in a server available through one or more networks. The client is assigned to one or more of the providers in response to selections from an administrator. Goals and a plan of action are established for the client in response to treatments and assistance required. Data associated with the client received from the providers is compiled; Para 0106: Bradley further describes that the setup for task analysis 302 may request two settings, a prompt hierarchy and a task. Task analysis 302 may also be referred to as ratings. To set up the prompt hierarchy, an administrator or user may identify a field including a number of prompt types or numbers (e.g., hand over hand, touch prompt, gestural, verbal, independent, etc.), additional comments, color code or priority, and so forth. Task analysis 302 may be set up by establishing the number of tasks or steps, a narrative label for each step, optional comments for more description if needed, and scores. In one embodiment, the tasks may be associated with a particular goal or objective.)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system for evaluation and management of autism spectrum disorder of Sahin with the technique of a platform for tracking goal progression as taught by Bradley, and the motivation is to compile task step scores received from the trainer and store them in a server so that the trainee's progress toward established goals can be tracked.
Torres discloses
the server is configured to select a complexity level from a plurality of complexity levels to be used by the trainer during a subsequent training session when the trainee performs the at least one task based on the task step scores for the plurality of task steps, the complexity level indicating training conditions and an amount of oversight to be provided by the trainer to the trainee during the subsequent training session (‘629; Para 0020: FIG. 12A through FIG. 12D show that the intended and automatic modes of control are indistinguishable in the ASD case. FIG. 12A: The prediction of control mode (movement class) from 6 participants (5 novices and the expert) is accurate for both the technique and movement type. FIG. 12B: Worst individual performance (well above chance, ⅛ from 4 techniques and 2 movement types) still does not confuse the two movement types at all. Rows are actual values while columns are assigned values from the leave-one-out cross-validation. FIG. 12C: The ASD case performs at a comparable level to the novices for predicting each individual technique, but the goal-directed segments of the techniques are generally confused with the supplemental segments when using the maximum curvature from the hand trajectories as input to the linear classifier. FIG. 12D: The predictive accuracy of the classifier drops for each technique, and the goal-directed vs. spontaneous supplemental movements are indistinguishable when using the maximum speed as input to the classifier; Para 0066: A mobile child-machine interface system has been developed that enables one to visit classroom settings and have TD children interact with touch screens and perform cognitive-driven tasks adapted from their curricula. The initial version of this interface was in open loop. Children responded to stimuli presented on the touch screen and pointed at the correct target that matched a given sample evoked by their touch of the screen. The stimuli had perceptual and cognitive features that varied in increasing levels of complexity, from purely visual (e.g., color) to more abstract (e.g., geometric shapes) and even to more complex features that required mental rotation to correctly match the given sample).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system for evaluation and management of autism spectrum disorder of Sahin/Bradley with the technique of diagnosis and treatment of neurological disorders as taught by Torres, and the motivation is to provide a complexity level, selected from a plurality of complexity levels, to be used by the trainer during a subsequent training session.
Claim 11 is rejected for the same reasons as claim 1.
With respect to claim 2, the combined art teaches the system of claim 1, wherein the associated instructions for each of the plurality of task steps output by the trainee computing device to the trainee while the trainee is performing the at least one task include at least one of: video output; audio output; image output; and textual output (‘946; Paras 0135-0136, 0205-0206).
Claim 12 is rejected for the same reasons as claim 2.
With respect to claim 3, the combined art teaches the system of claim 1, wherein the plurality of task step scores include an independent score indicating that the trainee was able to independently complete the associated task step without assistance (‘946; Para 0055).
Claim 13 is rejected for the same reasons as claim 3.
With respect to claim 4, the combined art teaches the system of claim 3, wherein the server is configured to select the complexity level based on a number of task steps from the plurality of task steps that were assigned with the independent score by the trainer (‘946; Para 0257).
Claim 14 is rejected for the same reasons as claim 4.
With respect to claim 5, the combined art teaches the system of claim 3, wherein the plurality of task step scores additionally include at least one of: a full physical score indicating that the trainer provided full physical assistance to the trainee while the trainee was performing the associated task step; a partial physical score indicating that the trainer provided partial physical assistance to the trainee while the trainee was performing the associated task step; and a gesture score indicating that the trainer provided a gesture to the trainee to assist the trainee while the trainee was performing the associated task step (‘629; Para 0103).
Claim 15 is rejected for the same reasons as claim 5.
With respect to claim 6, the combined art teaches the system of claim 1, wherein the trainer computing device is configured to display the associated instructions that are also being output to the trainee with the trainee computing device (‘946; Paras 0039-0040).
Claim 16 is rejected for the same reasons as claim 6.
With respect to claim 7, the combined art teaches the system of claim 1, wherein the trainer computing device is configured to direct the trainee computing device, through communication with the server, to move to a different task step within the plurality of task steps based on input received from the trainer with the trainer computing device (‘946; Para 0019).
Claim 17 is rejected for the same reasons as claim 7.
With respect to claim 8, the combined art teaches the system of claim 1, wherein each of the plurality of complexity levels indicate at least one of: a trainer-to-trainee ratio; whether the trainer is in close proximity to the trainee; whether the trainer is in a room with the trainee; and whether verbal praise is provided after each step (‘946; Para 0169).
Claim 18 is rejected for the same reasons as claim 8.
With respect to claim 9, the combined art teaches the system of claim 1, wherein the trainee computing device is configured to provide additional instructions to the trainee in response to the trainee encountering a problem during performance of a task step (‘946; Para 0069).
Claim 19 is rejected for the same reasons as claim 9.
With respect to claim 10, the combined art teaches the system of claim 1, wherein the trainer computing device and the trainee computing device are each one of a laptop, a tablet, and a smartphone (‘946; Para 0107).
Claim 20 is rejected for the same reasons as claim 10.
Response to Arguments
Applicant’s arguments with respect to claim(s) have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HIEP VAN NGUYEN whose telephone number is (571) 270-5211. The examiner can normally be reached Monday through Friday between 8:00 AM and 5:00 PM EST.
Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason B Dunham, can be reached at 571-272-8109. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HIEP V NGUYEN/Primary Examiner, Art Unit 3686