Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
1. This Final Office action is in reply to Applicants’ amendment filed on 16 September 2025.
2. Claims 1, 3, 11, and 17 have been amended.
3. Claims 1-20 are currently pending and have been examined.
Response to Amendment
Response to Arguments
Applicants’ arguments filed 16 September 2025 have been fully considered, but they are not persuasive. In the remarks regarding the 35 U.S.C. § 101 rejection of Claims 1-20, Applicants argue that: (1) the claims are not directed to an abstract idea and, even if they were, they would amount to significantly more than the abstract idea. The Examiner respectfully disagrees. Consistent with the two-part subject matter eligibility framework of the Supreme Court’s decision in Alice Corp. Pty. Ltd. v. CLS Bank International (Alice), the 2019 Revised Patent Subject Matter Eligibility Guidance (2019 PEG), the October 2019 Update: Subject Matter Eligibility (“October 2019 Update”), and the 2024 Guidance Update on Patent Subject Matter Eligibility, Including on Artificial Intelligence (July 2024), the Examiner details the maintained rejection under 35 U.S.C. 101 below with further explanation. As amended, Applicants essentially argue that: the “Claims do not Recite[] a Judicial Exception”; the claims are “not directed to an abstract idea because amended claim 1 as a whole integrates the alleged judicial exceptions into a practical application”; and “amended claim 1 is patent eligible because it recites additional elements that are ‘unconventional or otherwise more than what is well-understood, routine, conventional activity in the field’” (see Remarks/Arguments pages 7-11). The Examiner respectfully disagrees. Starting with Step 2A, Prong One, the claims still recite:
Certain methods of organizing human activity – managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions). At least the elements involving "monitoring... user interactions" and managing "tasks for accomplishing the goal" fall under this category. Managing a schedule or task list to reach a goal is a fundamental human activity for organizing productivity and behavior.

Mental processes – concepts performed in the human mind (including an observation, evaluation, judgment, or opinion). The limitations regarding "determining whether the content is related to the goal," "determining whether the content conflicts," and "automatically revising a node" are classified as mental processes. A human can observe user interactions (monitoring), evaluate whether they relate to a goal (determining relationship), judge whether there is a conflict (determining conflict), and update a list of tasks (revising a node/updating the progress of the goal). Even though performed "automatically" by a computer, the underlying logic is a mental process. See MPEP § 2106.04(a)(2)(III)(C). Hence, the claims recite a judicial exception under Step 2A, Prong One. Furthermore, the dependent claims are merely directed to the particulars of the abstract idea and likewise do not add significantly more to the above-identified judicial exception. The limitations of the claims do not transform the abstract idea into patent-eligible subject matter because the claims simply instruct the practitioner to implement the abstract idea using generically recited computer components.
Step 2A, Prong Two: Claims 1-20: With regard to this step of the analysis (as explained in MPEP § 2106.04(d)), the judicial exception is not integrated into a practical application. Independent Claims 1, 11, and 17 recite additional elements directed to “at least one processor; computing devices; memory storing instructions” (e.g., see Applicants’ published Specification ¶¶ 18, 90-105). The claims therefore contain computer components that are recited at a high level of generality and are merely invoked as tools to perform the abstract idea. Simply implementing an abstract idea on a computer is not a practical application of the abstract idea. Furthermore, the dependent claims are merely directed to the particulars of the abstract idea and likewise do not add significantly more to the above-identified judicial exception. The limitations of the claims do not transform the abstract idea into patent-eligible subject matter because the claims simply instruct the practitioner to implement the abstract idea using generically recited computer components, and they do not amount to an improvement to a computer or any other technology. See MPEP §§ 2106.05(f) and (h). Step 2B: As explained in MPEP § 2106.05, Claims 1-20 do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements, considered both individually and as an ordered combination, do not amount to significantly more than the abstract idea, nor do they integrate the judicial exception into a practical application. The additional elements of “at least one processor; computing devices; memory storing instructions,” etc., are generically recited computer-related elements that amount to a mere instruction to “apply” the abstract idea on those elements (see MPEP § 2106.05(f) – Mere Instructions to Apply an Exception).
These additional elements are recited at a high level of generality and merely limit the field of use of the judicial exception (see MPEP § 2106.05(h) – Field of Use and Technological Environment). There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Furthermore, the dependent claims are merely directed to the particulars of the abstract idea and likewise do not add significantly more to the above-identified judicial exception. In summary, as detailed through Steps 1-2B in the rejection below, the recitation of a computer to perform the claim limitations amounts to no more than a mere instruction to apply the exception using generic computer components. Even when considered in combination, these additional elements represent mere instructions to implement an abstract idea on a computer and insignificant extra-solution activity, which do not provide an inventive concept. For at least these reasons, the rejection is maintained.
Regarding the prior art rejections, Applicants submit that: (2) Bower et al. (Bower) (US 2019/0216392), and Bower in view of Groenewegen et al. (Groenewegen) (US 2023/0141807), do not teach or suggest the limitations of representative amended Claim 1; essentially, “…the cited reference does not disclose all of the elements of the rejected claims; … Groenewegen fails to remedy the deficiency in the teachings of Bower” (see Remarks pages 11-13). With regard to argument (2), the Examiner respectfully disagrees. As seen below in the maintained rejections, Bower, and Bower in view of Groenewegen, teach Applicants’ claim limitations, even as amended. Additionally, Applicants’ arguments fail to comply with 37 CFR 1.111(b) because they amount to a general allegation that the claims define a patentable invention without specifically pointing out how the language of the claims patentably distinguishes them from the references. It is noted that any citations to specific pages, columns, paragraphs, lines, or figures in the prior art references, and any interpretation of the references, should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. See MPEP § 2123. The Examiner has a duty and responsibility to the public and to Applicants to interpret the claims as broadly as reasonably possible during prosecution. In re Prater, 415 F.2d 1393, 1404-05, 162 USPQ 541, 550-51 (CCPA 1969). For at least these reasons, the rejections are maintained.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. The claims as a whole recite a grouping of abstract ideas and are analyzed under the following step process:
Step 1: Claims 1-20 are each directed to a statutory category of invention (i.e., a system or a method).
Step 2A, Prong One: Claims 1-20 recite limitations that set forth an abstract idea. The claims recite steps for, generally, “revising a data structure representing a goal to be accomplished by a user”, and encompass processing information by:
“monitoring a plurality of user interactions with one or more computing devices;
recording content associated with at least one user interaction;
determining whether the content is related to the goal, wherein the goal is associated with one or more tasks for accomplishing the goal, wherein the goal is associated with one or more tasks for completing the goal and new tasks are automatically generated;
based on determining the content is related to at least one task associated with accomplishing the goal, determining whether the content conflicts with the at least one task; and
based on determining the content conflicts with the at least one task, automatically revising a node representing the at least one task in a data structure representing the goal, wherein revising the node representing the at least one task includes associating at least a portion of the content with the node”
The claim limitations quoted above fall under the following categories:
Certain methods of organizing human activity – managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions). At least the elements involving "monitoring... user interactions" and managing "tasks for accomplishing the goal" fall under this category. Managing a schedule or task list to reach a goal is a fundamental human activity for organizing productivity and behavior.
Mental processes – concepts performed in the human mind (including an observation, evaluation, judgment, or opinion). The limitations regarding "determining whether the content is related to the goal," "determining whether the content conflicts," and "automatically revising a node" are classified as mental processes. A human can observe user interactions (monitoring), evaluate whether they relate to a goal (determining relationship), judge whether there is a conflict (determining conflict), and update a list of tasks (revising a node/updating the progress of the goal). Even though performed "automatically" by a computer, the underlying logic is a mental process. See MPEP § 2106.04(a)(2)(III)(C). Hence, the claims recite a judicial exception under Step 2A, Prong One. Furthermore, the dependent claims are merely directed to the particulars of the abstract idea and likewise do not add significantly more to the above-identified judicial exception. The limitations of the claims do not transform the abstract idea into patent-eligible subject matter because the claims simply instruct the practitioner to implement the abstract idea using generically recited computer components.
Prong Two: Claims 1-20: With regard to this step of the analysis (as explained in MPEP § 2106.04(d)), the judicial exception is not integrated into a practical application. Independent Claims 1, 11, and 17 recite additional elements directed to “at least one processor; computing devices; memory storing instructions” (e.g., see Applicants’ published Specification ¶¶ 18, 90-105). The claims therefore contain computer components that are recited at a high level of generality and are merely invoked as tools to perform the abstract idea. Simply implementing an abstract idea on a computer is not a practical application of the abstract idea. Furthermore, the dependent claims are merely directed to the particulars of the abstract idea and likewise do not add significantly more to the above-identified judicial exception. The limitations of the claims do not transform the abstract idea into patent-eligible subject matter because the claims simply instruct the practitioner to implement the abstract idea using generically recited computer components, and they do not amount to an improvement to a computer or any other technology. See MPEP §§ 2106.05(f) and (h).
Step 2B: As explained in MPEP § 2106.05, Claims 1-20 do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements, considered both individually and as an ordered combination, do not amount to significantly more than the abstract idea, nor do they integrate the judicial exception into a practical application. The additional elements of “at least one processor; computing devices; memory storing instructions,” etc., are generically recited computer-related elements that amount to a mere instruction to “apply” the abstract idea on those elements (see MPEP § 2106.05(f) – Mere Instructions to Apply an Exception). These additional elements are recited at a high level of generality and merely limit the field of use of the judicial exception (see MPEP § 2106.05(h) – Field of Use and Technological Environment). There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Furthermore, the dependent claims are merely directed to the particulars of the abstract idea and likewise do not add significantly more to the above-identified judicial exception; their limitations simply instruct the practitioner to implement the abstract idea using generically recited computer components and do not amount to an improvement to a computer or any other technology. The claims are therefore ineligible.
The Examiner interprets that the steps of the claimed invention, both individually and as an ordered combination, amount to mere instructions to apply a judicial exception (see MPEP § 2106.05(f)). The claims recite only the idea of a solution or outcome, with no restriction on how the result is accomplished and no description of the mechanism used for accomplishing the result. Here, the claims use a computer or other machinery (e.g., “system 100”; see Applicants’ published Specification ¶¶ 18, 90-105, describing existing computer processors as well as program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon) in its ordinary capacity to perform tasks (e.g., to receive, analyze, transmit, and display data), and/or use computer components after the fact to apply an abstract idea (e.g., a fundamental economic practice or certain methods of organizing human activity), and do not provide significantly more. See Affinity Labs of Tex., LLC v. DIRECTV, LLC, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016). Software implementations are accomplished with standard programming techniques, with logic to perform connection steps, processing steps, comparison steps, and decision steps. The claims are directed to a commonplace business method applied on a general-purpose computer (see Alice Corp. Pty. Ltd. v. CLS Bank Int’l, 134 S. Ct. 2347, 2357, 110 USPQ2d 1976, 1983 (2014); Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)) and require the use of software, such as via a server, to tailor information and provide it to the user on a generic computer.
Based on the foregoing, the Examiner finds that, when viewed either individually or in combination, these additional claim elements do not provide meaningful limitations that rise to the standard of eligibility by transforming the abstract idea into a patent-eligible application such that the claims amount to significantly more than the abstract idea itself. Accordingly, Claims 1-20 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e., an abstract idea) without significantly more.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-5, 7-13, and 15-19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bower et al. (Bower) (US 2019/0216392).
With regard to Claims 1, 11, and 17, Bower teaches a system/method comprising: at least one processor; and memory storing instructions that, when executed by the at least one processor, cause the system to perform a set of operations (an apparatus for generating a quantifier of cognitive skills in an individual is provided. The apparatus includes a user interface; a memory to store processor-executable instructions; and a processing unit communicatively coupled to the user interface and the memory, in which upon execution of the processor-executable instructions by the processing unit, the processing unit is configured to render a first instance of a primary task with an interference at the user interface, requiring a first response from the individual to the first instance of the primary task in the presence of the interference, where the interference comprises one or both of an interruptor or a distraction, and to render a second instance of the primary task without the interference at the user interface, requiring a second response from the individual to the second instance of the primary task. The processing unit is configured to (i) receive a secondary response to the interference at substantially the same time as the processing unit receives the second response; or (ii) receive the secondary response to the interference that is an interruptor at substantially the same time as the processing unit receives the first response and not receive the secondary response to the interference that is a distraction at substantially the same time that the processing unit receives the first response. The processing unit is further configured to receive data indicative of at least one physiological profile of the individual, the physiological profile being based on one or more measurements of the at least one physiological component, the at least one physiological component being coupled to measure a physiological measurement of the individual.
The processing unit is further configured to receive data indicative of the first response, the second response, and the at least one physiological profile, and analyze the differences in the individual's performance from performing the primary task without interference and with interference at least in part by determining a difference between the data indicative of the first response and the data indicative of the second response relative to the at least one physiological profile to determine a performance metric of the individual, the performance metric comprising an indicator of the cognitive ability of the individual) (see at least paragraphs 3-10), the set of operations comprising:
monitoring a plurality of user interactions with one or more computing devices (An example system, method, and apparatus according to the principles herein includes a platform product (including using an APP) that is configured to monitor physiological measurement indicating signals of user engagement and to analyze data indicative of a score of an individual in interacting with the tasks and/or interference to determine the effects of the adjustment of the visual or auditory messages and/or the adjustments to visual or auditory characteristics used with the tasks and/or interference on the user's physiological measurements, to determine the type messages and/or adjustments are more likely to have a desired effect on the individual (such as increased user engagement)) (see at least paragraphs 123-127);
recording content associated with at least one user interaction (The results of the analysis can be used to control the processing unit to adjust the visual or auditory messages and/or to adjust the visual or auditory characteristics used with the tasks and/or interference, until the physiological measurement of the individual change to levels indicative of sufficient user engagement) (see at least paragraphs 35, 123-127);
determining whether the content is related to a goal to be accomplished by a user (the term “task” refers to a goal and/or objective to be accomplished by an individual. Using the example systems, methods, and apparatus described herein, the computerized task is rendered using programmed computerized components, and the individual is instructed (e.g., using a computing device) as to the intended goal or objective from the individual for performing the computerized task. The task may require the individual to provide or withhold a response to a particular stimulus, using at least one component of the computing device (e.g., one or more sensor components of the computing device). The “task” can be configured as a baseline cognitive function that is being measured), wherein the goal is associated with one or more tasks for accomplishing the goal, wherein the goal is associated with one or more tasks for completing the goal and new tasks are automatically generated (the term “interference” refers to a type of stimulus presented to the individual such that it interferes with the individual's performance of a primary task. In any example herein, an interference is a type of task that is presented/rendered in such a manner that it diverts or interferes with an individual's attention in performing another task (including the primary task). In some examples herein, the interference is configured as a secondary task that is presented simultaneously with a primary task, either over a discrete time period (e.g., a short, discrete time period) or over an extended time period (e.g., less than the time frame over which the primary task is presented), or over the entire period of time of the primary task. In any example herein, the interference can be presented/rendered continuously, or continually (i.e., repeated in a certain frequency, irregularly, or somewhat randomly). 
For example, the interference can be presented at the end of the primary task or at discrete, interim periods during presentation of the primary task. The degree of interference can be modulated based on the type, amount, and/or temporal length of presentation of the interference relative to the primary task) (see at least paragraphs 35-38, 145, 272);
based on determining the content is related to the goal, determining whether the content updates/conflicts with the at least one task (progressively adjust; adjust; updated; a platform product (including using an APP) that uses a cognitive platform configured to render and integrate at least one simultaneous conflicting computer-implemented time-varying element(s) into different tasks during a MTG. This could be used for the purpose of assessing or improving measures of cognition related to the user interaction with the platform product indicating the user's handling of conflicting emotional information) the goal (uses a cognitive platform configured to render at least one computer-implemented time-varying element with levels of valence determined based on previous user responses to computer-implemented time-varying element at one or more level of valence. This may apply an adaptive algorithm to progressively adjust the level of valence to achieve specific goals, such as creating a psychometric curve of expected user performance on a task across stimulus or difficulty levels, or determining the specific level at which a user’s task performance would meet a specific criterion like 50% accuracy in a Go/No-Go task; Example storage device 1534 can also store one or more databases for storing any suitable information required to implement example systems and methods. The databases can be updated manually or automatically at any suitable time to add, delete, and/or update one or more items in the databases) (see at least paragraphs 35, 107-113, 145, 356);
based on determining the content updates the goal, updating/revising/dynamically (the computing system can be configured to modify the difficulty level using adaptive thresholding methods, such as but not limited to using psychometric staircase algorithms, to dynamically and rapidly maintain the performance of the individual at a specific performance level. For example, the thresholding algorithm can be implemented to achieve as close to about 80% accuracy in the performance of the individual in the primary task (such as but not limited to a visuomotor tracking task) and/or the interference (such as but not limited to a target discrimination (or target detection) task) from the individual by adjusting the difficulty levels appropriately) revising a data structure/node representing the goal/task based on the content/in a data structure representing the goal (Example storage device 1534 can also store one or more databases for storing any suitable information required to implement example systems and methods. The databases can be updated manually or automatically at any suitable time to add, delete, and/or update one or more items in the databases; the computing system can be configured to compare the cData and/or nData measured and collected from a test individual that is to be assessed and/or trained to the user physiological profile(s), to compute a weighting factor to be applied to a computed performance metric for the test individual to determine a weighted performance metric. The weighted performance metric can be used in place of the actual performance metric to determine the adjustment (adapting) of the difficulty levels from one trial to another and/or from one session to another.
The CSIs can be modified such that cData and/or nData measured and collected from a test individual's performance of the tasks and/or interference correlates highly with (including substantially matching with) the user physiological profile(s), thereby replicating the user physiological profile(s). That is, the difficulty level of the tasks and/or interference can be adjusted such that the cData and/or nData indicative of the response from the test individual more closely correlates with a predetermined user physiological profile) (see at least paragraphs 35, 113, 145, 249, 356);
updating a progress (progress milestones) of the goal (may apply an adaptive algorithm to progressively adjust the level of valence to achieve specific goals, such as creating a psychometric curve of expected user performance on a task across stimulus or difficulty levels, or determining the specific level at which a user's task performance would meet a specific criterion like 50% accuracy in a Go/No-Go task) (see at least paragraphs 70-74, 145).
With regard to Claims 2, 12, Bower teaches: determining the content is related to existing content in the data structure representing the goal (see at least paragraph 35).
With regard to Claims 3, 18, Bower teaches:
determining that the content relates to at least one task of the one or more tasks (see at least paragraphs 4, 35);
associating at least a portion of the content with the node representing the at least one task (see at least paragraphs 4, 35).
With regard to Claims 4, 13, 19, Bower teaches:
associating, with the node representing the at least one task, a reasoning chain supporting the determination that the content relates to the at least one task (see at least paragraphs 86, 292).
With regard to Claim 5, Bower teaches: determining that the content completes the at least one task (see at least paragraph 69); and flagging the at least one task as complete in the data structure representing the goal (see at least paragraph 69).
With regard to Claim 7, Bower teaches: determining the content is related to an existing goal (see at least paragraph 145); or determining the content is related to a new goal (see at least paragraph 145).
With regard to Claim 8, Bower teaches: determining one or more tasks for completing the new goal (see at least paragraphs 58-61); and generating a new data structure representing the new goal, the new data structure including a node representing each task of the determined one or more tasks (see at least paragraphs 58-61).
With regard to Claim 9, Bower teaches: determining whether the content conflicts with the goal (see at least paragraphs 106, 107); based on determining that the content conflicts with the goal, determining whether to revise the goal (see at least paragraphs 106, 107); based on determining to revise the goal, updating the data structure to represent the revised goal (see at least paragraphs 106, 107, 356).
With regard to Claim 10, Bower teaches: upon updating the goal, determining that another goal should be updated (see at least paragraphs 106, 107); and updating the other goal (see at least paragraphs 106, 107, 356).
With regard to Claim 15, Bower teaches: based on determining the content conflicts with the at least one task, automatically generating a new node representing a new task in the data structure representing the goal (see at least paragraphs 106, 107, 356).
With regard to Claim 16, Bower teaches: requesting approval from the user before generating the new node representing the new task (see at least paragraph 258).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 6, 14, 20 are rejected under 35 U.S.C. 103 as being unpatentable over Bower as seen above for Claims 1-5, 7-13, 15-19 in view of Groenewegen et al. (Groenewegen) (US 2023/0141807).
With regard to Claims 6, 14, 20, Bower does not specifically teach receiving a request to surface the data structure; surfacing the data structure in one of an application or an interface based on the computer-readable instructions. Groenewegen teaches: receiving a request to surface the data structure (perform a feature surfacing method that includes detecting edit patterns or other interaction patterns that are done inefficiently or otherwise non-optimally “by hand” (e.g., using more gestures than necessary) and can be done “automatically” instead (e.g., using fewer gestures). Some embodiments analyze patterns and identify which patterns can be optimized for a user, e.g., using a mapping data structure);
surfacing the data structure in one of an application or an interface based on the computer-readable instructions (Some tool feature surfacing functionality 210 taught herein includes or uses a library 220 of automatable edit sequences 222. A given entry in the library 220 includes an edit graph data structure and one or more corresponding temporal edit patterns (TEPs) 228. When the system 202 matches the edit graph to user inputs, the system may recommend that a corresponding TEP be applied to make changes in the target content 404. The TEP 228 may thus be viewed as a kind of transform 218, which is associated with an edit graph) in the analogous art of interactions between a user and a system, with attendant metadata as to the current goal of the interaction and how that goal relates (or fails to relate) to other goals of a tool usage session, for the purpose of: “software 300 to provide tool feature surfacing functionality 210. For example, software 300 may perform any one or more of the methods illustrated in FIG. 9 (which incorporates FIG. 8). In particular, software 300 may surface a tool feature 212 by obtaining 802 user-tool interaction context data 462, detecting 808 a pattern 304 in the interactions 302, mapping 810 the pattern 304 to an interaction optimization 306, and offering 812 the user a suggestion 214 which identifies a feature 212 that can provide the desired result 310 (inferred from the pattern 304) in a better way” (see at least paragraphs 27, 48, 52, 262-272).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the surfacing of underutilized tool features as taught by Groenewegen in the system of Bower, since the claimed invention is merely a combination of old elements, and in the combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Conclusion
The prior art made of record and not relied upon is considered pertinent to Applicant's disclosure:
Pachauri et al. (US 2020/0265485)
THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to THOMAS L MANSFIELD whose telephone number is (571)270-1904. The examiner can normally be reached M-Thurs, alt. Fri. (9-6).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Patricia Munson can be reached at (571) 270-5396. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
THOMAS L. MANSFIELD
Examiner
Art Unit 3623
/THOMAS L MANSFIELD/Primary Examiner, Art Unit 3624