DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on February 6, 2026 has been entered.
Status of Claims
This Office action is in response to the arguments and amendments entered on February 6, 2026 for patent application 18/370,066, originally filed on September 19, 2023. Claims 1 and 12 are amended. Claims 1-22 are pending. The first Office action of February 25, 2025 and the second Office action of August 6, 2025 are fully incorporated by reference into this Non-Final Office Action.
Claim Rejections - 35 USC § 101
35 U.S.C. § 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-22 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Step 1 – “Statutory Category Identification”
Claim 1 is directed to “a method” (i.e., a process) and claim 12 is directed to “a system” (i.e., a machine); hence, the claims are directed to one of the four statutory categories (i.e., process, machine, manufacture, or composition of matter). In other words, Step 1 of the subject-matter eligibility analysis is “Yes.”
Step 2A, Prong 1 – “Abstract Idea Identification”
However, the claims are drawn to the abstract idea of “improving security training,” either in the form of “certain methods of organizing human activity,” i.e., managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions), or in the form of “mental processes,” i.e., processes that can be performed in the human mind (including an observation, evaluation, judgment, or opinion). Under either grouping, the claims recite the following limitations:
Per claim 1
“communicating…one or more simulated phishing communications of a simulated phishing campaign to …one or more users in a format consistent with a specific messaging platform;
receiving… a failure event from the simulated phishing campaign manager that one or more users failed one or more simulated phishing communications of a simulated phishing campaign by interacting with the one or more simulated phishing communications;
receiving… feedback from one or more users that interacted with and failed the one or more simulated phishing communications, the feedback inputted …identifying one or more reasons that the one or more users interacted with an exploit type of the one or more simulated phishing communications…retrieving one or more attributes of the one or more users from a user records storage, the one or more attributes comprising at least one of a job role, risk score or completion of prior training of the one or more users;
categorizing… different types of the feedback into one or more predefined categories of a plurality of predefined categories based at least on the one or more attributes of the one or more users from which feedback was requested, wherein the user feedback categorization engine maps each of the different types of feedback to the one or more predefined categories based at least on content of the feedback and the retrieved one or more user attributes;
collating… the categorized feedback into one or more classifications selected from a plurality of classifications based at least on the retrieved one or more attributes;
communicating… a selection of a template… based at least on a categorized type of feedback of the different types of feedback mapped to the one or more predefined categories and the one or more classifications;
creating…a second one or more simulated phishing communications to the one or more users in accordance with the template selected … based at least on a categorized type of feedback of the different types of feedback mapped to the one or more predefined categories and the one or more classification; and
communicating… the second one or more simulated phishing communications to …the one or more users in a format consistent with a specific messaging platform.”
Per claim 12
“… communicating one or more simulated phishing communications of a simulated phishing campaign to …one or more users in a format consistent with a specific messaging platform;
“… receive a failure event … that one or more users failed the one or more simulated phishing communications of a simulated phishing campaign by interacting with the one or more simulated phishing communications and receive, …feedback inputted …from one or more users that interacted with and failed the one or more simulated phishing communications, the feedback identifying one or more reasons that the one or more users interacted with an exploit type of the one or more simulated phishing communications, …retrieving one or more attributes of the one or more users from a user records storage, the one or more attributes comprising at least one of a job role, risk score or completion of prior training of the one or more users;
…categorize the different types of feedback into one or more predefined categories of a plurality of predefined categories based at least on the one or more attributes of the one or more users from which feedback was requested, wherein the user feedback categorization engine maps each of the different types of feedback to the one or more predefined categories based at least on content of the feedback and the retrieved one or more user attributes;
…collate the categorized feedback into one or more classifications selected from a plurality of classifications based at least on the retrieved one or more attributes;
… communicate …a selection of a template based at least on a categorized type of feedback of the different types of feedback mapped to the one or more predefined categories and the one or more classifications;
…create a second one or more simulated phishing communications to the one or more users in accordance with the template selected by the recommendation engine based at least on a categorized type of feedback of the different types of feedback mapped to the one or more predefined categories and the one or more classification; and
wherein …to communicate a second one or more simulated phishing communications to the one or more users in a format consistent with a specific messaging platform.”
These limitations simply describe a process of data gathering and manipulation, which is analogous to “collecting information, analyzing it, and displaying certain results of the collection and analysis.” Electric Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 119 U.S.P.Q.2d 1739 (Fed. Cir. 2016). Hence, these limitations are akin to concepts that the courts have identified, among non-limiting examples, as abstract ideas. In other words, Step 2A, Prong 1 of the subject-matter eligibility analysis is “Yes.”
Step 2A, Prong 2 – “Practical Application”
Furthermore, the claims do not include additional elements that, either alone or in combination, are sufficient to integrate the judicial exception into a practical application. The additional elements of, e.g., “a user interface,” “storage,” “one or more devices” and “one or more servers” merely generally link the use of the judicial exception (e.g., pre-solution activity of data gathering and post-solution activity of presenting data) to (1) a particular technological environment or (2) a field of use, per MPEP § 2106.05(h), and amount to mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea, per MPEP § 2106.05(f). In other words, the claimed “improving security training” does not provide a practical application; thus, Step 2A, Prong 2 of the subject-matter eligibility analysis is “No.”
Step 2B – “Significantly More”
Likewise, the claims do not include additional elements that, either alone or in combination, are sufficient to amount to significantly more than the judicial exception. To the extent that, e.g., “a user interface,” “storage,” “one or more devices” and “one or more servers” are claimed, these are generic, well-known, and conventional data gathering computing elements. As evidence, the Applicant’s specification discloses these additional elements in a manner that indicates they are sufficiently well-known that the specification does not need to describe the particulars of such additional elements to satisfy 35 U.S.C. § 112(a), per MPEP § 2106.07(a)(III)(A). As such, the Examiner’s evidentiary burden under the Berkheimer memorandum is satisfied.
Specifically, the Applicant’s claimed “user interface,” as described in paras. [0086] and [0087] of the written description of the specification as originally filed, discloses the following:
“[0086] FIG. 3 is an illustration depicting a graphical user interface 302 showing a first alert 306 in conjunction with a suspected phishing threat, according to certain aspects of the present disclosure. The graphical user interface 302 can be implemented on any suitable device, such as a user device (e.g., user device 104 of FIG. 1).”
“[0087] The graphical user interface 302 can include a window 304 for displaying a received digital communication. The window 304 can be presented in any suitable fashion, such as a separate window or part of another window (e.g., a viewing pane within a window of an email application).” As such, the Applicant’s claimed “user interface,” is reasonably interpreted as a generic, well-known, and conventional data gathering computing element.
Also, the Applicant’s claimed “storage,” as described in para. [0161] of the written description of the specification as originally filed, discloses the following:
“[0161] Computer-readable medium 1022 can be any medium that participates in providing instructions to processor 1006 for execution, including without limitation, non-volatile storage media (e.g., optical disks, magnetic disks, flash drives, etc.) or volatile media (e.g., SDRAM, ROM, etc.). The computer-readable medium (e.g., storage devices, mediums, and memories) can include, for example, a cable or wireless signal containing a bit stream and the like. However, when mentioned, non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.” As such, the Applicant’s claimed “storage,” is reasonably interpreted as a generic, well-known, and conventional data gathering computing element.
Further, the Applicant’s claimed “one or more devices,” as described in para. [0159] of the written description of the specification as originally filed, discloses the following:
“[0159] FIG. 10 is a block diagram of an example system architecture 1002 for implementing features and processes of the present disclosure, such as those presented with reference to processes 200, 500, 600, 700, 800, 900 of FIGS. 2, 5, 6, 7, 8, 9, respectively. The features and processes disclosed herein can be implemented using one or multiple instances of system architecture 1002. The system architecture 1002 can be used to implement a server (e.g., a cloud-accessible server), a user device (e.g., a smartphone or personal computer), or any other suitable device for performing some or all of the aspects of the present disclosure. The system architecture 1002 can be implemented on any electronic device that runs software applications derived from compiled instructions, including without limitation personal computers, servers, smart phones, electronic tablets, game consoles, email devices, and the like. In some implementations, the system architecture 1002 can include one or more processors 1006, one or more input devices 1014, one or more display devices 1012, one or more network interfaces 1010, and one or more computer-readable media 1022. Each of these components can be coupled by bus 1020.” As such, the Applicant’s claimed “one or more devices,” is reasonably interpreted as a generic, well-known, and conventional data gathering computing element.
Finally, the Applicant’s claimed “one or more servers,” as described in paras. [0047], [0049], [0051] and [0168] of the written description of the specification as originally filed, discloses the following:
“[0047] In some embodiments, the system may include multiple, logically grouped servers 106. In one of these embodiments, the logical group of servers may be referred to as a server farm or a machine farm. In another of these embodiments, the servers 106 may be geographically dispersed. In other embodiments, a machine farm may be administered as a single entity. In still other embodiments, the machine farm includes a plurality of machine farms. The servers 106 within each machine farm can be heterogeneous—one or more of the servers 106 or machines 106 can operate according to one type of operating system platform (e.g., Windows, manufactured by Microsoft Corp. of Redmond, Washington), while one or more of the other servers 106 can operate according to another type of operating system platform (e.g., Unix, Linux, or Mac OSX).”
“[0049] The servers 106 of each machine farm do not need to be physically proximate to another server 106 in the same machine farm. Thus, the group of servers 106 logically grouped as a machine farm may be interconnected using a wide-area network (WAN) connection or a metropolitan-area network (MAN) connection. For example, a machine farm may include servers 106 physically located in different continents or different regions of a continent, country, state, city, campus, or room. Data transmission speeds between servers 106 in the machine farm can be increased if the servers 106 are connected using a local-area network (LAN) connection or some form of direct connection. Additionally, a heterogeneous machine farm may include one or more servers 106 operating according to a type of operating system, while one or more other servers execute one or more types of hypervisors rather than operating systems. In these embodiments, hypervisors may be used to emulate virtual hardware, partition physical hardware, virtualize physical hardware, and execute virtual machines that provide access to computing environments, allowing multiple operating systems to run concurrently on a host computer. Native hypervisors may run directly on the host computer. Hypervisors may include VMware ESX/ESXi, manufactured by VMWare, Inc., of Palo Alto, California; the Xen hypervisor, an open source product whose development is overseen by Citrix Systems, Inc. of Fort Lauderdale, Florida; the HYPER-V hypervisors provided by Microsoft, or others. Hosted hypervisors may run within an operating system on a second software level. Examples of hosted hypervisors may include VMWare Workstation and VirtualBox, manufactured by Oracle Corporation of Redwood City, California.”
“[0051] Server 106 may be a file server, application server, web server, proxy server, appliance, network appliance, gateway, gateway server, virtualization server, deployment server, SSL VPN server, or firewall. In one embodiment, a plurality of servers 106 may be in the path between any two communicating servers 106.”
“[0168] The features can be implemented in a computing system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination thereof. The components of the system can be connected by any form or medium of digital data communication such as a communication network. Examples of communication networks include, e.g., a LAN, a WAN, and the computers and networks forming the Internet.”
The claimed “one or more servers” element is reasonably interpreted as a generic networking computer with generic computer components, which provides no details of anything beyond ubiquitous, standard, off-the-shelf equipment. As such, the Applicant’s specification discloses ubiquitous standard equipment within modern computer networking and does not provide anything significantly more. Therefore, Step 2B of the subject-matter eligibility analysis is “No.”
In addition, dependent claims 2-11 and 13-22 neither provide a practical application nor amount to significantly more than the judicial exception. As such, dependent claims 2-11 and 13-22 are also rejected under 35 U.S.C. § 101 based on their respective dependence on claim 1 or claim 12. Therefore, claims 1-22 are rejected under 35 U.S.C. § 101 as being directed to non-statutory subject matter.
Response to Arguments
The Applicant’s arguments filed on February 6, 2026 with respect to claims 1-22 have been fully considered but are not persuasive.
CLAIM REJECTIONS UNDER 35 U.S.C. § 101
The Applicant respectfully argues “The rejection characterizes the subject matter as "providing user feedback on receiving simulated phishing communications," paraphrasing the claim steps at a high level. That framing ignores that the claims expressly require operations performed "by one or more servers comprising a message generator," receiving "a failure event from the simulated phishing campaign manager," causing "a user interface prompt to be rendered on the one or more devices," retrieving "one or more attributes" from "a user records storage," mapping feedback "to the one or more predefined categories based at least on content of the feedback and the retrieved one or more user attributes," collating "into one or more classifications," communicating "a selection of a template based at least on" those mapped categories and classifications, creating "a second one or more simulated phishing communications in accordance with the template," and communicating the second simulated phishing communications "in a format consistent with a specific messaging platform." None of those limitations describes or recites a mental process or method of organizing human activity; they are expressly tied to the stated server components and operations”.
“A mental process is limited to steps that can practically be performed in the human mind, but the explicit claim language requires operations such as receiving "a failure event" from "the simulated phishing campaign manager," causing "a user interface prompt to be rendered," retrieving "one or more attributes" from a storage source, mapping feedback to predefined categories "based at least on content of the feedback and the retrieved one or more user attributes," collating the results "into one or more classifications," and creating and communicating subsequent simulated phishing communications "in a format consistent with a specific messaging platform." These are not tasks that can practically be performed mentally, and the claims do not recite mental steps.”
“The claims do not set forth rules for behavior, teaching, or interpersonal organization. The claimed limitation set is directed to operations performed by "one or more servers comprising a message generator," by "the simulated phishing campaign manager," by "the user feedback requestor," by "the user feedback categorization engine," by "the user feedback analytics engine," and by "the recommendation engine." The only human-related action appearing in the claims is that feedback is received after the system causes "a user interface prompt to be rendered." That is merely an input to the expressly recited system operations. Nothing in the claim language recites an activity that constitutes organizing human behavior; the human feedback is simply data used by the recited modules.”
The Examiner respectfully disagrees. It is worth noting that MPEP § 2106, under “II. Certain Methods Of Organizing Human Activity,” explains that certain activity between a person and a computer (for example, a method of anonymous loan shopping that a person conducts using a mobile phone) may fall within the “certain methods of organizing human activity” grouping. As applied here, a person interacting with a computer for the purpose of “improving security training” reasonably places the Applicant’s claims within the abstract-idea grouping of “certain methods of organizing human activity.”
Likewise, with respect to mental processes, actual mental performance of the abstract idea is not required. Further, MPEP § 2106.04(a)(2)(III)(C) states that “claims can recite a mental process even if they are claimed as being performed on a computer” and that “examiners should review the specification to determine if the claimed invention is described as a concept that is performed in the human mind and Applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept. In these situations, the claim is considered to recite a mental process.” In the present case, the claimed steps are performed on a generic computer and/or in a computer environment, and merely use a computer as a tool to perform the concept. As such, the argument is not persuasive.
The Applicant respectfully argues “At Prong 2, the claim must be evaluated in its entirety. Here, the claims expressly require that a "user feedback categorization engine" map different types of feedback "to the one or more predefined categories based at least on content of the feedback and the retrieved one or more user attributes," that a "user feedback analytics engine" collate the categorized feedback "into one or more classifications," that a "recommendation engine" communicate "a selection of a template based at least on" those mapped categories and classifications, that the "message generator" create the second simulated phishing communications "in accordance with the template," and that the simulated phishing campaign manager communicate them "in a format consistent with a specific messaging platform." The additional elements are not incidental; they define how data flows through specifically identified components to produce the claimed result. This constitutes integration into a practical application.”
The Examiner respectfully disagrees. The Applicant’s argument describes various software modules that carry out the abstract idea of “improving security training,” which does not constitute a practical application. The claims do not integrate the abstract idea into a practical application because they do not provide any of the following:
• An improvement in the functioning of a computer, or an improvement to other technology or technical field, as discussed in MPEP §§ 2106.04(d)(1) and 2106.05(a);
• Applying or using a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, as discussed in MPEP § 2106.04(d)(2);
• Implementing a judicial exception with, or using a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim, as discussed in MPEP § 2106.05(b);
• Effecting a transformation or reduction of a particular article to a different state or thing, as discussed in MPEP § 2106.05(c); and
• Applying or using the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception, as discussed in MPEP § 2106.05(e).
Furthermore, several factors indicate that the Applicant’s claims do not integrate the abstract idea into a practical application, including:
• Merely reciting the words "apply it" (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea, as discussed in MPEP § 2106.05(f);
• Adding insignificant extra-solution activity to the judicial exception, as discussed in MPEP § 2106.05(g); and
• Generally linking the use of a judicial exception to a particular technological environment or field of use, as discussed in MPEP § 2106.05(h).
Here, the Applicant’s claims do not provide any of the technological advancements described in the first five bulleted factors and, as described above in the rejection, the claims merely use a computer as a tool to perform an abstract idea and generally link the use of a judicial exception to a particular technological environment or field of use. As such, the argument is not persuasive.
The Applicant respectfully argues “The claim language is not phrased as general automation of an idea. It details a sequence beginning with receiving "a failure event," causing "a user interface prompt to be rendered," retrieving specified attributes, mapping feedback to predefined categories based on both feedback content and retrieved attributes, collating to classifications, selecting a template "based at least on" those categories and classifications, creating the second simulated phishing communications "in accordance with the template," and communicating them "in a format consistent with a specific messaging platform." This detailed chain of specifically recited operations constitutes a particular way of achieving the claimed result.”
The Examiner respectfully disagrees. The Applicant’s claims illustrate the automation of an abstract idea, and automating an abstract idea on generic computer components does not amount to a particular, eligible way of achieving the claimed result. As such, the argument is not persuasive.
The Applicant respectfully argues “The explicit claim text sets out a defined pipeline in which different machine executable components operate on data in a specified order, using predefined categories, classifications, and templates, and producing second simulated phishing communications formatted for a specific messaging platform. By reciting how these components must act, how data is mapped and classified, how templates are selected "based at least on" the prior steps, and how the output must conform to platform-specific formatting, the claims reflect a particular solution grounded in the technical field of server-implemented generation and communication of simulated phishing communications. The claimed operations constitute an improvement in the way such systems process inputs and adapt to such inputs to generate outputs.”
The Examiner respectfully disagrees. The asserted “improvements” are nothing more than an arrangement of logical steps that carry out the abstract idea using basic technology. As such, the argument is not persuasive.
The Applicant respectfully argues “Even if analysis proceeds to Step 2B, the claims do not merely list conventional components. The claim language requires an ordered combination in which: a "failure event" is received from "the simulated phishing campaign manager"; a "user interface prompt" is caused to be rendered; feedback is received and user attributes are retrieved; the "user feedback categorization engine" maps to predefined categories based on feedback content and retrieved attributes; the "user feedback analytics engine" collates into classifications; the "recommendation engine" communicates a template selection "based at least on" those mapped categories and classifications; the "message generator" creates the second simulated phishing communications "in accordance with the template"; and the simulated phishing campaign manager communicates them "in a format consistent with a specific messaging platform." The inventive concept lies in these recited dependencies and the required flow through the enumerated components, which together amount to significantly more than any alleged abstract idea.”
The Examiner respectfully disagrees. The Applicant’s “enumerated components,” consisting of “a user interface,” “storage,” “one or more devices” and “one or more servers,” are nothing more than generic, well-known, and conventional data gathering computing elements, as previously described above in the rejection. As such, the argument is not persuasive.
The Applicant respectfully argues “The USPTO instructs that, if eligibility is a "close call," the Examiner should only reject if it is more likely than not that the claim is ineligible. Although Applicant submits that is not a close call, to the extent the Examiner believes the same, the technical details and improvements described in the Claims should easily tip the balance in favor of eligibility.”
The Examiner respectfully disagrees. Any question of a “close call” is moot in view of the Applicant’s current claim set, which recites the abstract idea of “improving security training.” Nothing claimed or argued provides anything substantial to “easily tip the balance in favor of eligibility.” Instead, the Examiner respectfully suggests using the pre-appeal conference process, which is available at no cost to the Applicant. In short, the pre-appeal conference process requires the Examiner, the Examiner’s Supervisory Patent Examiner, and a neutral third party to review the merits of the last rejection in view of a five-page brief containing arguments submitted by the Applicant. As such, the argument is not persuasive.
The Applicant respectfully argues “For at least all of the above reasons, Applicant requests reconsideration and submits independent Claims 1 and 12, as well as Claims 2-11 that depend on Claim 1 and Claims 13-22 that depend on Claim 12 recite patent eligible subject matter. Accordingly, Applicant respectfully requests withdrawal of the rejection of Claims 1-22 under 35 U.S.C. § 101.”
The Examiner respectfully disagrees, for the reasons stated here and above. Therefore, the rejections under 35 U.S.C. §101 are not withdrawn.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT P. BULLINGTON, whose telephone number is (313) 446-4841. The examiner can normally be reached Monday through Friday from 8 A.M. to 4 P.M. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Peter Vasat, can be reached at (571) 270-7625. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://portal.uspto.gov/external/portal. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at (866) 217-9197 (toll-free).
/Robert P Bullington, Esq./ Primary Examiner, Art Unit 3715