Prosecution Insights
Last updated: April 19, 2026
Application No. 17/120,343

SYSTEM AND METHOD FOR TRIGGERING A TRAINING EVENT

Non-Final OA (§101, §103)
Filed: Dec 14, 2020
Examiner: HATCHER, DEIRDRE D
Art Unit: 3625
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: ATS Automation Tooling Systems Inc.
OA Round: 5 (Non-Final)

Grant Probability: 28% (At Risk)
Projected OA Rounds: 5-6
Projected Time to Grant: 3y 10m
Grant Probability with Interview: 53%

Examiner Intelligence

Career Allow Rate: 28% (98 granted / 357 resolved; -24.5% vs Tech Center average)
Interview Lift: +25.9% (allowance rises to 53% for resolved cases with an interview)
Avg Prosecution: 3y 10m (typical timeline)
Currently Pending: 45
Total Applications: 402 (career, across all art units)
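The headline percentages above follow from the raw counts with simple arithmetic. The sketch below assumes the rounded figures shown in this report (98 granted of 357 resolved; 53% allowance with interview); the dashboard's +25.9% lift is presumably computed against the no-interview allowance rate, which is not shown, so the sketch uses the career rate instead and differs slightly.

```python
# Sketch reproducing the headline examiner statistics from the raw counts
# shown above. Small rounding differences vs. the dashboard are expected.
granted = 98
resolved = 357
allow_rate_with_interview = 0.53

career_allow_rate = granted / resolved                 # ~27.5%, shown as 28%
interview_lift = allow_rate_with_interview - career_allow_rate

print(f"career allow rate: {career_allow_rate:.1%}")
print(f"interview lift:    {interview_lift:+.1%}")
```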

Statute-Specific Performance

§101: 40.0% (+0.0% vs TC avg)
§103: 37.1% (-2.9% vs TC avg)
§102: 8.4% (-31.6% vs TC avg)
§112: 11.9% (-28.1% vs TC avg)

Tech Center averages are estimates; figures based on career data from 357 resolved cases.

Office Action

§101, §103
DETAILED ACTION

This communication is a Non-Final Rejection Office Action in response to the 12/14/2025 submission filed in Application 17/120,343. Claims 1, 10 have been amended. Claims 1, 4-10, 13-16, 18 are now presented. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/14/2025 has been entered.

Response to Arguments

Applicant's arguments filed 10/3/2025 with respect to the prior art have been considered but are moot because the arguments do not apply to the new grounds of rejection that were necessitated by amendment. Applicant's remaining arguments have been fully considered but they are not persuasive. Regarding the rejection under § 101, the Applicant argues "Applicant submits that claims 1 and 10 are not directed solely to a method of organizing human activity and/or mental processes. In particular, claims 1 and 10 involve a separate data collection device in addition to the programmable logic device, the automatic detection of an automation element, and the calculation of metrics in relation to a trigger event and training materials, including a feed-back score." The Examiner respectfully disagrees.
The following limitations recited in the claims can be performed mentally: detecting a trigger event based on the automation data, wherein the detecting the trigger event further comprises detecting an associated automation element; determining if there is an associated training component related to correcting the trigger event stored in a database, if yes, retrieving the training component associated with the trigger event, if no, determining the most relevant training component based on the trigger event; automatically determining efficiency data associated with the training component based on the feed-back and from monitoring the automation element during and after the addressing of the triggering event; and correlating the efficiency data and the feed-back to determine a feed-back score. Further, the following limitations recited in the claims amount to providing the appropriate training, which amounts to teaching and following instructions: detecting a trigger event based on the automation data, wherein the detecting the trigger event further comprises detecting an associated automation element; and providing access to the most relevant training component to a maintenance user and directing the end user to the associated automation element. As such, the claims as amended still recite abstract ideas. Limitations that fall into the abstract idea groupings cannot also amount to an improvement to the technology or technical field. Regarding the rejection under § 101, the Applicant argues "Applicant further draws the Examiner's attention to page 2 of the Memorandum issued by the USPTO on August 4, 2025, entitled 'Reminders on evaluating subject matter eligibility of claims under 35 U.S.C. 101.' In particular, the Memorandum notes 'The courts consider a mental process (thinking) that "can be performed in the human mind, or by a human using a pen and paper," to be an abstract idea.
The USPTO subject matter eligibility analysis follows this precedent and instructs examiners to determine that a claim recites a mental process when it contains limitation(s) that can practically be performed in the human mind, including, for example, observations, evaluations, judgments, and opinions. On the other hand, a claim does not recite a mental process when it contains limitation(s) that cannot practically be performed in the human mind, for instance when the human mind is not equipped to perform the claim limitation(s).' (emphasis added). In the present application, Applicant submits that the claims represent an improvement in the operation and maintenance of a manufacturing line by providing a technological system/method to determine a trigger event in real-time and then automatically select the most appropriate training materials based on factors such as feed-back score. This is not something that the human mind can do practically given, at least, the time constraints and the volume of data involved. Thus, it is believed that the claims are not directed to an abstract idea, as the claims do not recite matter that falls within the enumerated groupings of abstract ideas noted in the 2019 Revised Patent Subject Matter Eligibility Guidance." The Examiner respectfully disagrees. The claims do not recite any particular way the detection of the trigger event is implemented. As such, under the broadest reasonable interpretation, a human can detect a trigger event by using observation. Further, the claims are not limited to any particular time constraints. A human can also select the most appropriate training materials for a user. The Examiner maintains that under the broadest reasonable interpretation, the claims are directed to mental processes and methods of organizing human activity.
Regarding the rejection under § 101, the Applicant argues "As noted above, the presently claimed combination of features demonstrates a technology-rooted solution to a problem in detecting a trigger event and determining the most appropriate training materials to keep an automation system running in an efficient manner. Applicant submits that the claims include various elements that amount to significantly more than human activity/mental processes. Further, the presently claimed combination of features is integrated into a practical application. For example, the presently claimed combination recites features that facilitate the detection of a trigger event in an automation system and the automatic selection of materials to allow an automation system to be repaired/maintained as quickly and efficiently as possible. The use of a feed-back score to assist with training material selection is a technical feature that addresses the technical problem of automatic selection of materials. Applicant submits that this is a practical application of a new metric determined by technology that is more than just an abstract idea and a statement of 'apply it'. These and other features are also other than what is well-understood, routine, and conventional in the field." The Examiner respectfully disagrees. Limitations that fall into one of the abstract idea groupings cannot also provide an improvement to the technology. The limitations that are beyond the abstract idea include: receiving automation data from a programmable logic controller associated with the at least one automation element and from a data collection device associated with the at least one automation element; a processor to perform the abstract idea; automatically retrieving online and third party sources; and storing, in the database, the feed-back and feed-back score in association with the training component.
Viewing the broadly recited data gathering and data storage in combination with the generic computer does not add more than viewing the elements individually. Accordingly, the combination of additional elements does not integrate the abstract idea into a practical application or provide an inventive concept because it does not impose any meaningful limits on practicing the abstract idea.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

This application includes one or more claim limitations that do not use the word "means," but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: a data acquisition module; a data collection device trigger; a training module; and a notification module in claims 10, 12, 14, 15. Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. Para. 41 of the Applicant's specification discloses "Figure 2 is a block diagram illustrating an embodiment of a system for triggering training events 300 for automation systems. The system 300 includes a processor 305, a storage device (such as database 310 or a data store), a data acquisition module 315, a data collection device trigger, a training module 325, and a notification module 330. The system 300 may further be operatively connected to a data store 335, which may be physically connected to the system, may be wirelessly accessible by the system or may be accessible via a network connection. The system 300 may be a standalone system or may be seen as part of the production monitoring server 120, the production controller 115 and/or the data collection device 205 and/or any combination thereof. The system 300 is intended to interact with an end user 340 and provide a training event for the end user 340." The Examiner interprets the recited modules and device trigger to be implemented by the disclosed processor. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 101

35 U.S.C.
101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1, 4-10, 13-16, 18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. When considering subject matter eligibility under 35 U.S.C. 101, in Step 1 it must be determined whether the claim is directed to one of the four statutory categories of invention, i.e., process, machine, manufacture, or composition of matter. If the claim does fall within one of the statutory categories, in Step 2A Prong 1 it must then be determined whether the claim recites a judicial exception (i.e., law of nature, natural phenomenon, or abstract idea). If the claim recites a judicial exception, under Step 2A Prong 2 it must additionally be determined whether the claim recites additional elements that integrate the judicial exception into a practical application. If a claim does not integrate the abstract idea into a practical application, under Step 2B it must then be determined whether the claim provides an inventive concept. In the instant case, claims 1, 4-9 are directed toward a method of triggering a training event in a manufacturing line. Claims 10, 14-16, 18 are directed toward a system for triggering a training event in a manufacturing line. As such, each of the claims is directed to one of the four statutory categories of invention. The 2019 Revised Patent Subject Matter Eligibility Guidance (2019 PEG) explains that in Step 2A Prong 1 examiners are to evaluate claims to determine if they recite an abstract idea. The guidance explains that claims that recite mathematical concepts, mental processes, and methods of organizing human activity recite abstract ideas.
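The step-by-step framework described above can be sketched as a decision flow. This is an editor's illustration, not USPTO tooling; the function name and boolean inputs are invented.

```python
# Editor's sketch of the Alice/Mayo eligibility flow the Office Action
# applies: Step 1, Step 2A Prongs 1-2, then Step 2B.
def eligible_under_101(statutory_category: bool,
                       recites_judicial_exception: bool,
                       practical_application: bool,
                       inventive_concept: bool) -> bool:
    if not statutory_category:            # Step 1: process/machine/manufacture/composition?
        return False
    if not recites_judicial_exception:    # Step 2A, Prong 1: judicial exception recited?
        return True
    if practical_application:             # Step 2A, Prong 2: integrated into practical application?
        return True
    return inventive_concept              # Step 2B: inventive concept?

# The Examiner's findings here (abstract idea; no integration; no
# inventive concept) yield ineligibility:
print(eligible_under_101(True, True, False, False))  # False -> rejected under § 101
```

Note that a claim reciting no judicial exception, or one integrating the exception into a practical application, short-circuits to eligible without reaching Step 2B.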
As per Step 2A Prong 1 of the eligibility analysis, claim 1 recites the abstract idea of detecting a need to administer training and then providing access to the training, which falls into the abstract idea categories of certain methods of organizing human activity and mental processes. The elements of claim 1 that represent the abstract idea include: a method for triggering a training event in a manufacturing line having at least one automation element, the method comprising: detecting a trigger event based on the automation data, wherein the detecting the trigger event further comprises detecting an associated automation element; determining if there is an associated training component related to correcting the trigger event stored in a database, if yes, retrieving the training component associated with the trigger event, if no, determining the most relevant training component based on the trigger event; providing access to the most relevant training component to a maintenance user and directing the maintenance user to the associated automation element; determining feed-back associated with the training component, wherein determining feed-back comprises: receiving feed-back associated with the training component from the maintenance user; automatically determining efficiency data associated with the training component based on the feed-back and from monitoring the automation element during and after the addressing of the triggering event; and correlating the efficiency data and the feed-back to determine a feed-back score. The 2019 PEG states certain methods of organizing human activity, including managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions), are abstract. The instant claims are directed to determining a need for training and then providing the training, which amounts to teaching and following rules or instructions.
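The claimed control flow recited above (stored component, else "most relevant" fallback, then a feed-back score) can be sketched as follows. Every identifier, the keyword-overlap relevance stub, and the mean used for the score are the editor's assumptions for illustration, not taken from the application's disclosure.

```python
# Hypothetical sketch of the claimed flow; names and stub logic are assumed.
def select_training_component(trigger_event: str, database: dict) -> str:
    # If a training component associated with the trigger is stored,
    # retrieve it; otherwise fall back to a "most relevant" component
    # (keyword overlap stands in for the claimed relevance determination).
    if trigger_event in database:
        return database[trigger_event]
    return max(database.values(),
               key=lambda c: len(set(c.split()) & set(trigger_event.split())),
               default="generic-training")

def feedback_score(efficiency: float, user_feedback: float) -> float:
    # The claims do not specify how efficiency data and feed-back are
    # correlated; a simple mean is assumed here purely for illustration.
    return (efficiency + user_feedback) / 2

db = {"conveyor jam": "conveyor jam clearance module"}
print(select_training_component("conveyor jam", db))     # stored component retrieved
print(select_training_component("robot arm fault", db))  # fallback to most relevant
print(round(feedback_score(0.8, 0.6), 3))                # 0.7
```

The sketch highlights the Examiner's point that the claims recite the lookup and scoring at a functional level without specifying any particular implementation.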
Further, the claim recites mental processes including observation, evaluation, judgment, and opinion. For example, the detecting step, determining if there is an associated training component, and determining feed-back associated with the training component are drawn to observation and evaluation. A human can review online and third party sources to identify relevant training. Further, receiving feed-back associated with the training component from the end user, automatically determining efficiency data associated with the training component, and correlating the efficiency data and the feed-back to determine a feed-back score can be performed mentally. For example, a human can provide feedback, determine efficiency data associated with the training component, and correlate the efficiency data and the feed-back to determine a feed-back score that indicates relevancy. As such, the claim recites at least one abstract idea. Under Step 2A Prong 2, the examiner must then determine if the recited abstract idea is integrated into a practical application.
The 2019 PEG states that additional elements that are indicative of integration into a practical application include:

- Improvements to the functioning of a computer, or to any other technology or technical field - see MPEP 2106.05(a)
- Applying or using a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition - see Vanda Memo
- Applying the judicial exception with, or by use of, a particular machine - see MPEP 2106.05(b)
- Effecting a transformation or reduction of a particular article to a different state or thing - see MPEP 2106.05(c)
- Applying or using the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception - see MPEP 2106.05(e) and Vanda Memo

Limitations that are not indicative of integration into a practical application include:

- Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)
- Adding insignificant extra-solution activity to the judicial exception - see MPEP 2106.05(g)
- Generally linking the use of the judicial exception to a particular technological environment or field of use - see MPEP 2106.05(h)

In the instant case, this judicial exception is not integrated into a practical application. In particular, claim 1 recites the additional elements of: receiving automation data from a programmable logic controller associated with the at least one automation element and from a data collection device associated with the at least one automation element; a processor to perform the abstract idea; automatically retrieving online and third party sources; and storing, in the database, the feed-back and feed-back score in association with the training component.
Claim 10 further recites the additional elements of a data acquisition module, a data collection device trigger, a training module, and a notification module, which the specification discloses as being implemented by a processor. The claims have been amended to recite a programmable logic controller associated with the at least one automation element. However, the processor and programmable logic controller are recited at a high level of generality (i.e., as a generic processor performing generic computer functions) such that they amount to no more than mere instructions to apply the exception using a generic computer component. Further, MPEP 2106.05(g) explains that data gathering can be considered pre-solution activity. See MPEP 2106.05(g), which states: "An example of pre-solution activity is a step of gathering data for use in a claimed process, e.g., a step of obtaining information about credit card transactions, which is recited as part of a claimed process of analyzing and manipulating the gathered information by a series of steps in order to detect whether the transactions were fraudulent." In the instant case, the receipt of automation data and the retrieval from online and third party sources amount to such pre-solution data gathering. Further, MPEP 2106.05(g) states that when determining whether an additional element is insignificant extra-solution activity, examiners may consider whether the limitation is significant (i.e., it imposes meaningful limits on the claim such that it is not nominally or tangentially related to the invention). In the instant case, the step of storing feedback in a database does not add a meaningful limitation to the process of triggering a training event. Viewing the broadly recited data gathering and data storage in combination with the generic computer does not add more than viewing the elements individually. Accordingly, the combination of additional elements does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea.
In Step 2B, the examiner must determine whether the claim adds a specific limitation other than what is well-understood, routine, conventional activity in the field - see MPEP 2106.05(d). As discussed with respect to Step 2A Prong 2, the additional elements in the claim amount to no more than mere instructions to apply the exception using a generic computer component. The same analysis applies here in Step 2B, i.e., mere instructions to apply an exception on a generic computer cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept in Step 2B. Further, the receipt of data is recited broadly in the claims. MPEP 2106.05(d) states receiving or transmitting data over a network, e.g., using the Internet to gather data, is conventional when claimed generically (see Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network)). As such, the broadly claimed receipt of automation data is considered well-known and conventional as established by the MPEP and relevant case law. Further, MPEP 2106.05(d) states storing and retrieving information in memory are well-understood, routine, and conventional functions when they are claimed in a merely generic manner (see Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93).
Viewing the broadly recited data gathering and data storage in combination with the generic computer does not add more than viewing the elements individually. Accordingly, the additional elements do not provide an inventive concept because they do not impose any meaningful limits on practicing the abstract idea. Further, claims 4-9 further limit the mental processes and methods of organizing human activity recited in the parent claim, but fail to remedy the deficiencies of the parent claim as they do not impose any additional elements that amount to significantly more than the abstract idea itself. Accordingly, the Examiner concludes that there are no meaningful limitations in claims 1, 4-9 that transform the judicial exception into a patent-eligible application such that the claim amounts to significantly more than the judicial exception itself. Claim 9 further recites the additional element of machine learning. However, the broadly recited machine learning attempts to cover any solution to the identified problem with no restriction on how the result is accomplished and no description of the mechanism for accomplishing the result, which does not integrate a judicial exception into a practical application or provide significantly more because this type of recitation is equivalent to the words "apply it". For example, the claims do not state how the machine learning is performed.
As such, the broadly recited machine learning does not integrate a judicial exception into a practical application or provide significantly more. Further, similar to the analysis with respect to Step 2A Prong 2, recitation of claim limitations that attempt to cover any solution to an identified problem with no restriction on how the result is accomplished cannot provide an inventive concept under Step 2B of the eligibility analysis. The analysis above applies to all statutory categories of invention. The presentment of claim 1 otherwise styled as a computer program product or system, for example, would be subject to the same analysis. As such, claims 10, 13-16, 18 are also rejected.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1, 7, 8, 9, 10, 16, 18 is/are rejected under 35 U.S.C.
103 as being unpatentable over Asenjo (US 2018/0012510 A1) in view of Vaughn (US 2019/0347117 A1), further in view of Powers (US 10757132 B1), and further in view of Gomes Pereira (US 2019/0347148 A1).

As per Claim 1, Asenjo teaches a method for triggering a training event in a manufacturing line having at least one automation element, the method comprising: receiving automation data from a programmable logic controller associated with the at least one automation element; (Asenjo para. 45 teaches the collection component 102 can monitor or track the operation of the industrial automation system 104 and users associated with the industrial automation system 104 (e.g., operators, technicians, managers, engineers, etc., interacting or working with the industrial automation system 104). The collection component 102 can receive, obtain, detect, capture, or collect data relating to the operation of the industrial automation system 104, the users associated with the industrial automation system 104, and the network component 112. The collection component also can receive, obtain, detect, capture, or collect data from other sources, such as extrinsic sources. Para. 37 teaches the use of a programmable logic controller) and from a data collection device associated with the at least one automation element; (Asenjo para. 46 teaches the collection component 102 can receive, obtain, detect, capture, or collect data relating to the work or interactions of users with the industrial automation system 104 and the network component 112. For example, the collection component 102 can receive and/or capture data relating to the respective work or interactions of respective users with the industrial automation system 104, including the respective work or interactions of respective users with the various industrial devices 106, processes 108, HMIs, control programs, other assets 110, the network component 112, etc., associated with the industrial automation system 104.
The collection component 102 can comprise or be associated with various sensor components (not shown in FIG. 1) that can facilitate sensing, detecting, obtaining, or capturing data relating to the work or interactions of users with the industrial automation system 104 and the network component 112, the operation of the industrial automation system 104, and the operation of the network component 112. The sensor components can comprise, for example, video sensor components that can be distributed across the industrial automation system 104 and can sense or capture visual data, audio sensor components that can be distributed across the industrial automation system 104 and can sense or capture audio data, motion sensor components that can be distributed across the industrial automation system 104 and can sense or capture motion data, operational sensor components that can be distributed across the industrial automation system 104 and can sense or capture various operational aspects or parameters (e.g., status, temperature, quantity, quality, etc.) relating to the industrial automation system 104, location sensor components that can sense the respective locations of respective users (e.g., based at least in part on determining the respective locations of their respective mobile communication devices or tags (e.g., radio-frequency identification (RFID) tags), etc. For example, the collection component 102, e.g., via the various sensor components, can facilitate capturing user actions and/or behavior with respect to the various portions (e.g., various industrial devices 106, processes 108, HMIs, control programs, other assets 110, the network component 112, etc.) of the industrial automation system 104.) detecting a trigger event based on the automation data; (Asenjo para. 
123 teaches at 1102, a set of data relating to an industrial automation system can be analyzed to facilitate performing one or more functions or operations relating to enhancing performance of a user associated with the industrial automation system or enhancing performance of the industrial automation system, wherein the set of data is maintained in a cloud-based platform (e.g., comprising the cloud-based data store). The performance enhancement component can analyze the set of data relating to the industrial automation system. The set of data can comprise data associated with the industrial automation system and/or one or more other industrial automation systems that can comprise an industrial device(s), industrial process(es), or industrial asset(s) that can be the same as or similar to an industrial device(s), industrial process(es), or industrial asset(s) associated with the industrial automation system. The set of data also can comprise data relating to one or more users (e.g., operators, managers, technicians, etc.) that are associated with the industrial automation system and the one or more other industrial automation systems. Para. 127 teaches as desired, the performance enhancement component also can identify or determine other instances of less favorable or relatively poor performance of the industrial automation system, or portion thereof, by other users (or the user at other times), based at least in part on the data analysis results, in accordance with the set of defined performance criteria, wherein such less favorable or relatively poor performance of the industrial automation system, or portion thereof, can be associated with user interactions or behaviors with respect to the industrial automation system that are different than the interactions or behaviors of the user that had resulted in the favorable performance of the industrial automation system, or portion thereof. 
The performance enhancement component can compare the user interactions and behaviors of the user that are associated with the favorable performance of the industrial automation system, or portion thereof, with the different user interactions and behaviors of other users (or the user at other times) that are associated with less favorable or relatively poor performance of the industrial automation system, or portion thereof. Such a comparison by the performance enhancement component can facilitate enabling the performance enhancement component to determine which user interactions or behaviors of the user with respect to the industrial automation system, or portion thereof, were at least partially responsible for the favorable performance of the industrial automation system, or portion thereof. This can facilitate enabling the performance enhancement component to determine or identify the correlation between the user action(s) of the user with respect to the industrial automation system and favorable performance of the industrial automation system.) wherein the detecting the trigger event further comprises detecting an associated automation element; (Asenjo para. 151-152 teaches At 1410, one or more alternate user actions, which can be performed to complete the work task(s) in place of a preferred user action(s) of the one or more preferred user actions, can be determined based at least in part on the differences between the physical elements, mental elements, and/or other elements involved in the performance of the one or more preferred user actions to complete the work task(s) and the physical skills, work skills, intelligence, and/or other factors of the second user.
The performance enhancement component can determine the one or more alternate user actions, which can be performed by the second user (or another user) in place of a preferred user action(s) of the one or more preferred user actions, to facilitate completing the work task(s) to achieve the same or substantially same result in a different way (e.g., to achieve the same or substantially same favorable performance as can be achieved by performing the one or more preferred user actions). For example, the preferred user actions may comprise four user actions, wherein the performance enhancement component can determine that the second user is able to perform three of the user actions in a manner that is the same or substantially the same as the first user, but due to physical limitations of the second user, the second user is not able to perform the fourth preferred user action in the way that the first user was able to perform it. The performance enhancement component can determine one or more alternate user actions that the second user can perform (e.g., can be physically capable of performing) in place of the fourth preferred user action, wherein performing the one or more alternate user actions can achieve the same or substantially the same result as performing the fourth preferred user action. Employing the method 1400, the performance enhancement component can determine different types of alternate user actions with regard to a work task for different users based at least in part on the respective physical skills, work skills, intelligence, and/or other factors of the different users. At 1412, a training presentation, comprising information relating to the one or more alternate user actions, can be generated. 
The training presentation can be or comprise, for example, a video (e.g., an animated video), a visual illustration, an audio presentation, printed materials (e.g., written instructions), and/or a placard, presenting or illustrating the one or more alternate user actions and/or one or more of the preferred user actions (e.g., the sequence of alternate user actions and/or preferred user actions that the second user is able to perform to complete the work task(s)). The training presentation can be used to facilitate training the second user and/or another user(s) to perform the work task(s) associated with the industrial automation system.) determining, at a processor, if there is an associated training component related to correcting the trigger event stored in a database, if yes, retrieving the training component associated with the trigger event and determining the most relevant training component based on the trigger event; (Asenjo para. 128 teaches in some implementations, based at least in part on the determined or identified correlation between the user action(s) of the user and the determined favorable performance of the industrial automation system, the performance enhancement component can facilitate determining training-related or other functions or operations (e.g., determining preferred user actions in relation to the industrial automation system, determining poor or unsafe user practices in relation to the industrial automation system, training users to perform their tasks more efficiently with respect to the industrial automation system (e.g., based at least in part on the preferred user actions), automating preferred user actions, etc.) that can facilitate training users or automating preferred user actions in connection with the industrial automation system. Para. 
64 teaches The training presentation can be or comprise, for example, a video (e.g., an animated video), a visual illustration, an audio presentation, a training model, an interactive training simulation, printed materials (e.g., written instructions, training manual or guide, etc.), a searchable training or troubleshooting database (e.g., knowledgebase), a poster, a placard, and/or another suitable training presentation, presenting or illustrating the preferred user action(s) (or alternate user action(s), as disclosed herein). The training presentation or training module can be used to facilitate training users (e.g., new or inexperienced user, poor performing user) to perform their work tasks (e.g., work duties, work assignments, etc.) more efficiently when working with the industrial automation system 104 (e.g., based at least in part on the preferred user action(s)). Para. 66 teaches The performance enhancement component 116 can generate a training presentation or training module illustrating or presenting the poor or unsafe user action(s) and/or comprising a set of instructions, guidelines, or recommendations that can facilitate training instructing users to not perform the poor or unsafe user action(s) in connection with the operation(s) or event(s) associated with the industrial automation system 104 to facilitate mitigating poor work performance habits of under-performing users. The performance enhancement component 116 can store this training presentation or training module, or a portion thereof, in the data store 114, and/or can provide this training presentation or training module for use (e.g., presentation) in training users. This training presentation or training module can be employed to facilitate training users (e.g., new or inexperienced user, poor performing user) to perform their tasks more efficiently and/or safely with respect to the industrial automation system 104. Para.
136 teaches At 1210, a training presentation can be generated based at least in part on the preferred user action(s). The performance enhancement component can generate a training presentation that can comprise information relating to the preferred user action(s). The training presentation can be used to facilitate training other users (e.g., inexperienced or poorer performing users) to perform better and/or more efficiently when performing the work tasks associated with the preferred user action(s) while working with the industrial automation system. The training presentation can be or comprise, for example, a video (e.g., an animated video, a video of a user performing or discussing performance of a work task), a visual illustration, an audio presentation, printed materials (e.g., written instructions), a poster, and/or a placard, presenting or illustrating the preferred user action(s).) providing access to the most relevant training component to a maintenance user and directing the maintenance user to the associated automation element; (Asenjo para. 152 teaches at 1412, a training presentation, comprising information relating to the one or more alternate user actions, can be generated. The training presentation can be or comprise, for example, a video (e.g., an animated video), a visual illustration, an audio presentation, printed materials (e.g., written instructions), and/or a placard, presenting or illustrating the one or more alternate user actions and/or one or more of the preferred user actions (e.g., the sequence of alternate user actions and/or preferred user actions that the second user is able to perform to complete the work task(s)). The training presentation can be used to facilitate training the second user and/or another user(s) to perform the work task(s) associated with the industrial automation system. Para.
51 teaches the disclosed subject matter, including the various aspects and implementations regarding the performance enhancement component 116, collection component 102, and data store 114, can be employed to facilitate the set up and deployment of an industrial automation system(s), improvement of an industrial automation system(s), performance analysis relating to operation of an industrial automation system(s), cost analysis (e.g., production cost analysis) relating to an industrial automation system(s), training of users (e.g., operators, technicians, managers, engineers, maintenance personnel, etc.) associated with an industrial automation system(s), etc.) Asenjo does not teach if no, automatically retrieving online and third party sources and determining the most relevant training component; However, Vaughn para. 36 teaches If the correct pictorial sequence 196 to correct the problem is not found in the content database 112, the comparison module 133 may utilize the web crawling service 111 to locate various troubleshooting information, including step-by-step instructions, product manuals, troubleshooting websites, diagrams, blogs, etc. to develop a correct pictorial sequence 196. Both Asenjo and Vaughn are directed to troubleshooting. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the Applicant’s invention to modify the teachings of Asenjo to include if no, automatically retrieving online and third party sources and determining the most relevant training component to find the most relevant solution to the problem. Asenjo does not teach determining feed-back associated with the training component; However, Powers column 19, lines 42-60 teach the scenario feedback component 585 provides automatic feedback mechanism 586 and manual feedback mechanism 588.
The automatic feedback mechanism 586 may provide an automatic feedback statement 587 to the scenario developer 13 for each completed training exercise. In an aspect, the statement 587 may be provided only for training exercises that involved a fidelity step down or for a training exercise that was completed with a perfect or near perfect score. The former condition may indicate a training exercise whose indications were too obtuse; the latter condition may indicate a training exercise that was not sufficiently realistic. The manual feedback mechanism 588 allows the scenario developer 13 to generate, in advance of a training exercise, a feedback statement 589 (which may be completed by a trainee 11 or an observer/instructor 12) that will provide the scenario developer 13 specific data from the completed training exercise. wherein determining feed-back comprises: receiving feed-back associated with the training component from the user; However, Powers column 19, lines 42-60, as reproduced above, teaches the scenario feedback component 585 providing the automatic feedback mechanism 586, which may provide an automatic feedback statement 587 for each completed training exercise, and the manual feedback mechanism 588, which allows a feedback statement 589 to be completed by a trainee 11 or an observer/instructor 12 to provide the scenario developer 13 specific data from the completed training exercise.
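Powers' conditional for the automatic feedback statement 587 (generated only when an exercise involved a fidelity step down or was completed with a perfect or near-perfect score) reduces to a simple predicate. A minimal Python sketch for illustration only; the function name and the 0.95 cutoff are assumptions, not values disclosed by Powers:

```python
def should_generate_auto_feedback(score: float, fidelity_step_down: bool,
                                  near_perfect: float = 0.95) -> bool:
    """Emit an automatic feedback statement only when the exercise involved a
    fidelity step down (its indications may have been too obtuse) or was
    completed with a perfect or near-perfect score (it may not have been
    sufficiently realistic)."""
    return fidelity_step_down or score >= near_perfect

# An ordinary exercise (no step down, mid-range score) produces no statement.
print(should_generate_auto_feedback(0.70, False))  # False
print(should_generate_auto_feedback(0.70, True))   # True
print(should_generate_auto_feedback(0.98, False))  # True
```

Under this reading, both branches flag exercises whose fidelity may need adjustment, one in each direction.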
automatically determining efficiency data associated with the training component based on the feed-back and from monitoring the automation element during and after the addressing of the triggering event; and correlating the efficiency data and the feed-back to determine a feed-back score; Powers column 3, lines 37-55 teach one current assumption about training fidelity is that a higher-fidelity training scenario will produce more effective training. This assumption has many weaknesses. First, while the training effectiveness of a high-fidelity training scenario may be higher than the training effectiveness of a low-fidelity training scenario, the training efficiency—effectiveness per unit cost—may be lower. Costs may include, for example, costs to apply a training scenario, such as trainee time, and costs to develop a scenario. If a high-fidelity training exercise takes one hour and produces a 20% skill improvement, a low-fidelity training exercise takes a half hour and produces a 15% improvement, and the same low-fidelity training exercise performed twice (taking one hour) produces a 25% improvement, the low-fidelity training exercise empirically is more efficient. Powers column 11, lines 35-50 teach a trainee's progress report may be used by a live observer/instructor 12 to monitor the trainee's progress, may be used in an after-action report by the observer/instructor 12 or the trainee 11, and/or may be used to programmatically determine the trainee's score for the training exercise 430. Powers column 15, lines 12-25 teach the protocol may be extended to create a training impact model 504 that in turn is used to determine a training scenario impact evaluation mechanism 506, where the training impact model 504 is used to evaluate a single scenario that is under development. The training impact model 504 also may use only easily-measurable observables to ensure the accuracy of the impact evaluation mechanism 506. 
The training impact model 504 then is used during the creation of training scenarios to estimate training efficiency and to optimize training efficiency balanced with the costs of developing and delivering the training. To facilitate model development, program 500 includes toolkit 503, which among other functions implements a drag and drop mechanism and a pop-up window mechanism. Column 5, lines 5-25 teach to develop a training impact model, the training platform may be invoked to host and conduct many training scenarios with similar goals but different levels of fidelity and to streamline the process of conducting experimental training exercises using these training scenarios. The platform also is capable of gathering and recording data about the training exercise, both detailed exercise progress metrics and external data, including trainee behavioral and sensory data. These data enable the construction of a training impact model that uses the measurable data to estimate training effectiveness. Furthermore, to apply the training impact model, the training platform may host and conduct training exercises, gather the impact metrics, and use the impact score (compared to the impact scores of related training scenarios) to provide guidance to the training scenario developer as to whether the training scenario is effective. Further, column 11, line 65 through column 12, line 5 teach The program 500 instruments the training exercise 330 with sensors that record not only progress toward exercise goals, but also ancillary data, such as trainee sensory and behavioral data, that could be of value in a training impact model. In an aspect, the computer network 300 may employ other sensors, such as video, audio, motion, and environmental sensors to gather information related to the trainee's experience and performance.)
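Powers' column 3 comparison of training efficiency (effectiveness per unit cost) can be verified arithmetically. A short Python sketch under the assumption that trainee time in hours is the only cost; the function name is hypothetical:

```python
def training_efficiency(skill_improvement: float, cost_hours: float) -> float:
    """Efficiency = effectiveness per unit cost (Powers, column 3)."""
    return skill_improvement / cost_hours

high_fidelity = training_efficiency(0.20, 1.0)       # 20% gain in one hour
low_fidelity = training_efficiency(0.15, 0.5)        # 15% gain in a half hour
low_fidelity_twice = training_efficiency(0.25, 1.0)  # 25% gain in one hour

# Per unit of trainee time, both low-fidelity options beat the
# high-fidelity exercise, matching Powers' worked example.
print(low_fidelity > high_fidelity, low_fidelity_twice > high_fidelity)  # True True
```

This is only a restatement of the numbers Powers recites; Powers discloses no formula or code.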
Further, column 3 lines 5-35 teach For consistency of description and ease of description, this disclosure refers generally to the following terms and their definitions. However, these terms and their definitions are not intended to be limiting, and similar terms and their definitions are encompassed by the disclosure. Trainee typically is a human. A trainee may be an individual or a member of a trainee group. Training effectiveness is a measure of the increase or maintenance of proficiency, skill, knowledge, or performance (referred to hereafter simply as skill) of an individual trainee or trainee group. Training effectiveness may be expressed as a percentage increase in skill. Training effectiveness may be measured by a post-exercise test (immediately thereafter or/and at some later time), for example. Training effectiveness also may be measured during the course of a training exercise. Training efficiency may be a measure of the cost of developing a training program or a training scenario. storing, in the database, the feed-back and feed-back score in association with the training component; However, Powers column 22, lines 30-40 teach the operation 800 continues in block 855 where, following execution of a training exercise corresponding to a scenario, the program 500 executes to generate an automatic feedback statement 587 and store the statement in the scenario database 128 with the corresponding scenario, and/or, receive from an observer/instructor 12, or trainee 11, a feedback statement 589. In block 860, the program 500 executes to adjust the scenario, as appropriate, based on the automatic and/or manual feedback. In block 865, the program 500 executes to save the adjusted scenarios in the scenario database 128. The operation 800 then ends. Both Asenjo and Powers are directed toward training programs.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the Applicant’s invention to modify the teachings of Asenjo to include determining feed-back associated with the training component, automatically determining efficiency data associated with the training component based on the feed-back and from monitoring the automation element during the addressing of the triggering event; and correlating the efficiency data and the feed-back to determine a feed-back score, and storing, in the database, the feed-back in association with the training component as taught by Powers to allow systematic training program evaluation considering the interaction of training effectiveness, fidelity, and efficiency (see column 1, lines 55-60). Asenjo in view of Powers does not explicitly disclose an indication of relevancy of the training component; However, Gomes Pereira para. 43 teaches this component provides, as examples, (i) ranking of relevant information that may serve as a response to a technical issue based upon the identified parameters, (ii) determining a response to a user as the root cause for the incident, and (iii) further actions that can be taken to fix the issues. Accordingly, the structured data analysis accesses a collection of candidate solutions and ranks those solutions. The solutions can be correlated to the issues they are applied to, to form issue-solution combinations. It compares the current situation (indicated by the collected data) against the information from previous incidents, the documentation, and other collected information. Based on this, it compares or ranks solutions and determines the best solution documented for the situation. It can present to a support team 238 recommended actions to perform to address the issue.
In cases where confidence that the top-ranked solution is correct is below some confidence threshold, the method can refrain from recommending the actions to implement that solution, or could recommend but refrain from automatically implementing those actions. This may be particularly applicable if the experienced technical issue is a newly identified issue, solutions to which have not been fully tested. With regard to automated implementation of actions, the method could automatically (using automation tools or other tools) implement recommended actions absent involvement of a user. Such actions could be any actions performed by or on assets of the environment, for instance actions to set configuration settings or perform specific activities using those assets, as examples. Further, para. 78 teaches feedback from users can be provided as part of the results fed into the training process (at 420) to further train the classification model in identifying best solutions to identified issues. The user feedback can contribute to ranks of the candidate issue-solution combinations in future rankings. In this manner, the ongoing training provided by the feedback loop can affect the rankings of issue-solution combinations. If a particular solution is found based on user feedback to not properly address a problem, the training can correct this to devalue or eliminate the issue-solution combination. Feedback that a proposed solution successfully addressed a given technical issue might boost the model's confidence that that issue-solution combination accurately addresses the root cause of that technical issue. Both Asenjo in view of Powers and Gomes Pereira are directed to providing information to a user to assist in performance of a task or operation.
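The ranking-and-threshold behavior Gomes Pereira describes (recommend the top-ranked issue-solution combination only when its confidence clears a threshold, otherwise refrain) can be sketched as follows; the dictionary shape, the names, and the threshold value are illustrative assumptions, not structures disclosed by the reference:

```python
def recommend_solution(ranked, threshold=0.6):
    """Return the top-ranked issue-solution combination, or None when its
    confidence falls below the threshold (the method refrains from
    recommending, e.g., for a newly identified, untested issue)."""
    if not ranked:
        return None
    best = max(ranked, key=lambda s: s["confidence"])
    return best if best["confidence"] >= threshold else None

candidates = [
    {"solution": "restart service", "confidence": 0.82},
    {"solution": "reinstall driver", "confidence": 0.41},
]
print(recommend_solution(candidates))       # recommends 'restart service'
print(recommend_solution(candidates, 0.9))  # None: best confidence is below 0.9
```

User feedback (para. 78) would then feed back into the confidence values used for future rankings.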
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the Applicant’s invention to modify the teachings of Asenjo in view of Powers to include the feedback represents an indication of relevancy of the training component as taught by Gomes Pereira to identify and guide users to the solution that will most likely resolve the identified issue (as suggested by para. 78). As per Claim 7 Asenjo teaches a method according to claim 1 wherein the trigger event is detected via monitoring collected operation data of the manufacturing line. (Asenjo para. 34 teaches For instance, a well-performing operator (e.g., experienced or skilled operator) can accrue a detailed working knowledge of how to manage or process for optimal performance given a range of operating scenarios (e.g., how to quickly clear a particular fault, which preventative operations will maximize a machine or process uptime, etc.). This can include not only knowledge of the manufacturing process itself, which may be common across multiple customers working in similar industries, but also knowledge of the idiosyncrasies of a customer's particular system configuration (e.g., the particular combination of machines, automation devices, and software running the process). Para. 110 teaches FIG. 8 illustrates a block diagram of an example device model 800 according to various aspects and implementations of the disclosed subject matter. In the illustrated example model 800, the device model 806 can be associated with a cloud-aware industrial device 802 (e.g., a programmable logic controller, a variable frequency drive, an HMI, a vision camera, a barcode marking system, etc.).
As a cloud-aware device, the industrial device 802 can be configured to automatically detect and communicate with the cloud platform 808 upon installation at a plant facility, simplifying integration with existing cloud-based data storage, analysis, and applications (e.g., as performed by the training system described herein). When added to an existing industrial automation system, the industrial device 802 can communicate with the cloud platform and can send identification and configuration information in the form of the device model 806 to the cloud platform 808. The device model 806 can be received by the training system 810, which can update the customer's device data 812 based on the device model 806. In this way, the training system 810 can leverage the device model 806 to facilitate integrating the new industrial device 802 into the greater system as a whole. This integration can include the training system 810 updating cloud-based applications or services to recognize the new industrial device 802, adding the new industrial device 802 to a dynamically updated data model of the customer's industrial enterprise or plant, modifying a sequence of preferred user actions or alternate user actions for performing a work task in connection with an industrial automation system, modifying a training presentation or module, modifying a component, process, technique, or algorithm that automated an action that had been a preferred user action, modifying a work assignment of a user, modifying a simulation model of the industrial automation system to integrate, incorporate, or include a simulation or an emulation of the new industrial device 802 based on the identification and configuration information (or other data), determining or predicting a response of the modified industrial automation system based on a modified simulation model that integrates the new industrial device 802, making other devices on the plant floor aware of the new industrial device 802, or other 
desired integration or updating functions. Once deployed, some data items comprising the device model 806 can be collected and monitored by the training system 810 on a real-time or near real-time basis.) As per Claim 8 Asenjo teaches a method according to claim 1 wherein the trigger event is an event associated with: machine stoppages, faulty part detection, out of specification operations, a machine not responding within a set time period, a new operator, new equipment, general repair and maintenance, or a combination thereof. (Asenjo para. 110, reproduced above with respect to Claim 7, teaches the cloud-aware industrial device 802 automatically detecting and communicating with the cloud platform 808 upon installation at a plant facility, the training system 810 updating the customer's device data 812 and modifying training presentations or modules when the new industrial device 802 is added, and the collection and monitoring of data items comprising the device model 806 on a real-time or near real-time basis.) As per Claim 9 Asenjo teaches a method according to claim 1 wherein at least one of the trigger event and the training component is determined via machine learning. (Asenjo para. 93 teaches in some implementations, one or more components of the system 300 (e.g., communicator component 302, aggregator component 304, monitor component 306, . . . , performance enhancement component 318) can comprise software instructions that can be stored in the data store 322 and executed by the processor component 320. Para.
113 teaches the performance enhancement component 318 can leverage (e.g., use) a large amount of historical data relating to the asset or asset type that has been gathered (e.g., collected and/or aggregated) from many different industrial automation systems to facilitate learning or determining common operating characteristics of many diverse configurations of industrial assets or asset types at a relatively high degree of granularity and under many different operating contexts. The performance enhancement component 318 can use the learned or determined operating characteristics relating to the industrial assets or asset types to facilitate performing training-related or other services in connection with an industrial automation system.) Claims 10, 16, 18 recite similar limitations to those recited in Claims 1, 7, 8, 9 and are rejected for similar reasons. Further, Asenjo teaches a system for triggering a training event in a manufacturing line comprising: a data acquisition module; a data collection device trigger; a training module; and a notification module (see Asenjo para. 91). Claim(s) 4, 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Asenjo US 2018/0012510 A1 in view of Vaughn US 20190347117 A1 in view of Powers US 10757132 B1 in view of Gomes Pereira US 2019/0347148 A1 as applied to claim 1 and in further view of Rumi US 2006/0224254 A1. As per Claim 4 Asenjo does not explicitly disclose a method according to claim 1 wherein the efficiency data comprises at least one of: a length of time the manufacturing line experienced stoppage, a length of time the end user took to address the trigger event, and frequency of the trigger event. However, Rumi para. 33 teaches The present invention collects and sorts data by company, plant, production line or area, system, or machine.
It automatically acquires all signals, counts, time, codes and states of operation including uptime and stoppage/downtime as they relate to manual or automated data time stamping, cleaning, configuring, grouping, organizing. The present invention further verifies data with filters and algorithms for conversion into interpreted data that are then automatically transferred into primary filters, fuzzy logic, artificial intelligence, algorithms to give displays, reports, what ifs, costs, interpretations of system performance parameters and efficiency metrics. The present invention further auto-generates recommendations such as enlightenment, decisions and proposed actions by using secondary filters, algorithms, decision matrices, fuzzy logic, what ifs, costs and artificial intelligence to undertake directed efforts. Both Asenjo and Rumi are directed toward training programs. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the Applicant’s invention to modify the teachings of Asenjo to include wherein the efficiency data comprises at least one of: a length of time the manufacturing line experienced stoppage, a length of time the end user took to address the trigger event, and frequency of the trigger event as taught by Rumi to evaluate the effectiveness and efficiency of the training program that was used to obtain the observed result and using this data to undertake the proper remedial action to improve the worker's performance and also assist in reviewing and improving the training program (see para. 900). Claim 13 recites similar limitations to those recited in Claim 4 and is rejected for similar reasons. Further, Asenjo teaches a system for triggering a training event in a manufacturing line comprising: a data acquisition module; a data collection device trigger; a training module; and a notification module (see Asenjo para. 91). Claim(s) 6, 15 is/are rejected under 35 U.S.C.
103 as being unpatentable over Asenjo US 2018/0012510 A1 in view of Vaughn US 20190347117 A1 in view of Powers US 10757132 B1 in view of Gomes Pereira US 2019/0347148 A1 in view of view of Rumi US 2006/0224254 A1 as applied to claims 4, 13 and in further view of Paramoure US 2010/0250318 A1. As per Claim 6 Asenjo does not explicitly disclose a method according to claim 4, wherein determining the training component associated with the trigger event comprises: reviewing the feed-back score associated with the training component; and if the feed-back score is below a predetermined reject threshold, disregarding the training component; and retrieving a further training component associated with the trigger event; otherwise, providing access to the training component to the end user. However, Paramoure para, 65 teaches at this point, the pre-test and post-test scores are reviewed and analyzed to determine if an acceptable level of learning occurred as a result of the training program (Block 154). Learning is measured by an increase in the score from the pre-test to the post-test. The comparison of the two scores shows the effectiveness of the training program. In addition, the post-test score is compared against the pre-established pass/fail criteria to determine if an acceptable level of learning has occurred. If learning did not occur at an acceptable level, the training program design and/or implementation may be analyzed to determine methods of improvement to the training program. If the training can be modified such that learning can occur at an acceptable level, the training program is modified and is re-administered (Block 156). However, if the training cannot be modified, the process may be halted. Both Asenjo in view of Powers and Paramoure are directed to providing employee training. 
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the Applicant's invention to modify the teachings of Asenjo to include wherein determining the training component associated with the trigger event comprises: reviewing the feed-back score associated with the training component; if the feed-back score is below a predetermined reject threshold, disregarding the training component and retrieving a further training component associated with the trigger event; otherwise, providing access to the training component to the end user, as taught by Paramoure, to better determine methods of improvement to the training program (see Paramoure para. 65).

Claim 15 recites similar limitations to those recited in Claim 6 and is rejected for similar reasons. Further, Asenjo teaches a system for triggering a training event in a manufacturing line comprising: a data acquisition module; a data collection device trigger; a training module; and a notification module (see Asenjo para. 91).

Claims 5 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Asenjo (US 2018/0012510 A1) in view of Vaughn (US 2019/0347117 A1), Powers (US 10,757,132 B1), and Gomes Pereira (US 2019/0347148 A1) as applied to claims 1 and 10, and further in view of Paramoure (US 2010/0250318 A1).

As per Claim 5, Asenjo does not explicitly disclose a method according to claim 1, wherein determining the training component associated with the trigger event comprises: reviewing the feed-back score associated with the training component; if the feed-back score is below a predetermined reject threshold, disregarding the training component and retrieving a further training component associated with the trigger event; otherwise, providing access to the training component to the end user.
However, Paramoure (para. 65) teaches that, at this point, the pre-test and post-test scores are reviewed and analyzed to determine if an acceptable level of learning occurred as a result of the training program (Block 154). Learning is measured by an increase in the score from the pre-test to the post-test. The comparison of the two scores shows the effectiveness of the training program. In addition, the post-test score is compared against the pre-established pass/fail criteria to determine if an acceptable level of learning has occurred. If learning did not occur at an acceptable level, the training program design and/or implementation may be analyzed to determine methods of improvement to the training program. If the training can be modified such that learning can occur at an acceptable level, the training program is modified and re-administered (Block 156). However, if the training cannot be modified, the process may be halted.

Both Asenjo in view of Powers and Paramoure are directed to providing employee training. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the Applicant's invention to modify the teachings of Asenjo to include wherein determining the training component associated with the trigger event comprises: reviewing the feed-back score associated with the training component; if the feed-back score is below a predetermined reject threshold, disregarding the training component and retrieving a further training component associated with the trigger event; otherwise, providing access to the training component to the end user, as taught by Paramoure, to better determine methods of improvement to the training program (see Paramoure para. 65).

Claim 14 recites similar limitations to those recited in Claim 5 and is rejected for similar reasons.
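The feed-back-score gate recited in claims 5 and 6 reduces to a simple threshold loop over candidate training components. The sketch below is illustrative only; the function and component names are hypothetical and appear in neither the claims nor the cited references:

```python
def select_training_component(components, scores, reject_threshold):
    """Walk candidate training components for a trigger event, disregarding any
    whose feed-back score falls below the predetermined reject threshold."""
    for component in components:
        if scores.get(component, 0.0) < reject_threshold:
            continue  # disregard this component; retrieve a further one
        return component  # provide access to this component to the end user
    return None  # every candidate fell below the threshold

scores = {"lockout_basics": 0.4, "lockout_refresher": 0.9}
chosen = select_training_component(
    ["lockout_basics", "lockout_refresher"], scores, reject_threshold=0.5
)
# chosen == "lockout_refresher": the first candidate is rejected at 0.4 < 0.5
```

Returning `None` when every candidate is rejected mirrors Paramoure's fallback of halting the process when the training cannot be modified.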
Further, Asenjo teaches a system for triggering a training event in a manufacturing line comprising: a data acquisition module; a data collection device trigger; a training module; and a notification module (see Asenjo para. 91).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DEIRDRE D HATCHER, whose telephone number is (571) 270-5321. The examiner can normally be reached Monday-Friday, 8-4:30.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Brian Epstein, can be reached at 571-270-5389. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/DEIRDRE D HATCHER/
Primary Examiner, Art Unit 3625

Prosecution Timeline

Dec 14, 2020
Application Filed
Jan 04, 2024
Non-Final Rejection — §101, §103
Apr 09, 2024
Response Filed
May 13, 2024
Final Rejection — §101, §103
Sep 16, 2024
Request for Continued Examination
Sep 20, 2024
Response after Non-Final Action
Sep 27, 2024
Non-Final Rejection — §101, §103
Feb 28, 2025
Response Filed
May 31, 2025
Final Rejection — §101, §103
Dec 03, 2025
Request for Continued Examination
Dec 12, 2025
Response after Non-Final Action
Jan 06, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591902
METHOD FOR PREDICTING BUSINESS PERFORMANCE USING MACHINE LEARNING AND APPARATUS USING THE SAME
2y 5m to grant Granted Mar 31, 2026
Patent 12572867
DIGITAL PROCESSING SYSTEMS AND METHODS FOR MANAGING WORKFLOWS
2y 5m to grant Granted Mar 10, 2026
Patent 12536488
DETERMINING MACHINE LEARNING MODEL ANOMALIES AND IMPACT ON BUSINESS OUTPUT DATA
2y 5m to grant Granted Jan 27, 2026
Patent 12530703
DELIVERY OF DATA-DRIVEN & CROSS-PLATFORM EXPERIENCES BASED ON BEHAVIORAL COHORTS & IDENTITY RESOLUTION
2y 5m to grant Granted Jan 20, 2026
Patent 12462210
Performance Measuring System Measuring Sustainable Development Relevant Properties Of An Object, and Method Thereof
2y 5m to grant Granted Nov 04, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
28%
Grant Probability
53%
With Interview (+25.9%)
3y 10m
Median Time to Grant
High
PTA Risk
Based on 357 resolved cases by this examiner. Grant probability derived from career allow rate.
