Prosecution Insights
Last updated: April 19, 2026
Application No. 18/588,492

SYSTEMS AND METHODS FOR GENERATING WORKFLOWS BASED ON NATURAL LANGUAGE INPUTS USING LARGE LANGUAGE MODELS

Final Rejection: §101, §103, §112
Filed: Feb 27, 2024
Examiner: GOLDBERG, IVAN R
Art Unit: 3619
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: ServiceNow Inc.
OA Round: 2 (Final)
Grant Probability: 35% (At Risk)
Projected OA Rounds: 3-4
Est. Time to Grant: 4y 8m
Grant Probability With Interview: 72%

Examiner Intelligence

Career Allow Rate: 35% (128 granted / 365 resolved; -16.9% vs TC avg)
Interview Lift: +36.9% for resolved cases with interview
Avg Prosecution: 4y 8m (57 applications currently pending)
Total Applications: 422 (across all art units)

Statute-Specific Performance

§101: 27.7% (-12.3% vs TC avg)
§103: 40.4% (+0.4% vs TC avg)
§102: 3.4% (-36.6% vs TC avg)
§112: 20.7% (-19.3% vs TC avg)
Tech Center averages are estimates • Based on career data from 365 resolved cases
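As a sanity check, the dashboard's headline percentages are mutually consistent and can be reproduced from the raw counts shown above. A minimal sketch, assuming the "vs TC avg" deltas and the interview lift are simple percentage-point differences (an interpretation of the report, not a stated formula):

```python
# Reproduce the report's headline figures from its raw counts.
granted, resolved = 128, 365
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")      # 35.1%, shown as 35%

# Tech Center average implied by the stated -16.9% delta
tc_avg = allow_rate + 0.169
print(f"Implied TC average: {tc_avg:.1%}")         # 52.0%

# The +36.9% interview lift matches the 72% with-interview figure:
# 72% minus the 35% baseline allow rate.
with_interview = 0.72
print(f"Implied lift: {with_interview - allow_rate:+.1%}")  # +36.9%

# TC averages implied by each statute-specific (rate, delta) pair
stats = {"101": (0.277, -0.123), "103": (0.404, 0.004),
         "102": (0.034, -0.366), "112": (0.207, -0.193)}
for statute, (rate, delta) in stats.items():
    print(f"S{statute}: implied TC avg {rate - delta:.1%}")
```

Note the internal consistency: the with-interview figure minus the career allow rate reproduces the stated lift almost exactly, which suggests all three numbers come from the same underlying dataset of 365 resolved cases.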

Office Action

Rejections under §101, §103, and §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Notice to Applicant

The following is a Final Office action. In response to Examiner's Non-Final Rejection of 9/17/25, Applicant, on 12/17/25, amended claims. Claims 1-15 and 17-20 are pending in this application and have been rejected below.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 9/30/25 and 2/9/26 are being considered by the examiner.

Response to Amendment

The 112(b) rejection is withdrawn in light of the amendments. New 112(b) rejections are necessitated by the amendments.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 2-3, 10, and 18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 2 recites the limitation "an approval". There is insufficient antecedent basis for this limitation in the claim, as claim 1 is now amended to recite "an approval." It is unclear if this is a second approval, or is referring to the same limitation now in claim 1. For purposes of applying prior art only, the Examiner interprets this as referring to the same approval as in claim 1.
Claim 18 recites similar limitations and is rejected for the same reasons. Claim 3 depends from claim 2 and is rejected for the same reasons.

Claims 3 and 10 recite the limitation "a client device". There is insufficient antecedent basis for this limitation in the claim, as claims 1 and 9 are now amended to recite "a client device." It is unclear if this is a second client device, or is referring to the same limitation now in claims 1 and 9. For purposes of applying prior art only, the Examiner interprets this as referring to the same client device as in claims 1 and 9.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-15 and 17-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., an abstract idea) without reciting significantly more.

Step One - First, pursuant to step 1 in MPEP 2106.03, claim 1 is directed to a method, which is a statutory category.
Step 2A, Prong One - MPEP 2106.04 - Claim 1 recites: "A method comprising: receiving… a natural language request to generate a workflow, wherein the natural language request specifies at least one characteristic of the workflow; generating… using the one or more… language models…, a first placeholder activity of a skeleton workflow based on the at least one characteristic, and wherein the first placeholder activity comprises a first placeholder value for a first property of the first placeholder activity; generating… using the one or more… language models…, based on the at least one characteristic and the first placeholder activity, a second placeholder activity of the skeleton workflow, wherein the second placeholder activity comprises a second placeholder value for a second property; generating and transmitting, …for display on a graphical …, the skeleton workflow; receiving, …an input requesting to modify the skeleton workflow; based on the input and an activity library, generating, …, a recommended first activity to replace the first placeholder activity and a recommended second activity to replace the second placeholder activity; receiving, …, an approval of the recommended first activity and the recommended second activity; and updating, …, the skeleton workflow based on the approval to replace the first placeholder activity with the recommended first activity and replace the second placeholder activity with the recommended second activity."

As drafted, this is, under its broadest reasonable interpretation, within the abstract idea grouping of "certain methods of organizing human activity" (managing relationships between people, including… teaching, and following rules or instructions). Here, a description of a workflow is given by a user (e.g., Applicant's FIG. 5 gives the example of "create a travel expense reimbursement process for managing employee travel expenses efficiently"; Applicant's [0077] as published gives many other example topics: "workflow may be related to credit card fraud investigation, employee onboarding, employee training, accounting, financial close, employee reviews, product testing, invoicing/billing, quality control, … purchasing, inventory, logistics, employee benefit management… supply chain management, vendor onboarding"), then a placeholder activity of a template/skeleton workflow is generated based on the specifics in the requested workflow, including activities (e.g., FIG. 12 and [0055-0056] as published: "capture case details," then "evaluate fraud," then "send notification"), then a second placeholder activity is generated; the workflow is displayed (e.g., a procedure for reimbursement); a recommended first activity is generated based on input to modify the skeleton/template workflow, along with a recommended second activity; approval [from a person] is received for the recommended first and second activities; then the template/skeleton is updated based on the further input. Accordingly, claim 1 is directed to an abstract idea of forming a workflow or business process and modifying it based on suggestions.

Step 2A, Prong Two - MPEP 2106.04 - This judicial exception is not integrated into a practical application.
Claim 1 recites additional elements: "A method comprising: receiving, via a processor, a natural language request to generate a workflow, wherein the natural language request specifies at least one characteristic of the workflow; generating, via the processor, using one or more large language models (LLMs), a first placeholder activity of a skeleton workflow… generating, via the processor, using the one or more LLMs, …a second placeholder activity of the skeleton workflow…; generating and transmitting, via the processor, for display on a graphical user interface (GUI) of a client device, the skeleton workflow; … [each step is "via the processor"] receiving, "via the processor", an input requesting to modify the skeleton workflow; and updating, "via the processor", the skeleton workflow based on the input."

At this time, individually or in combination, the additional elements are: the processor, the large language models (LLMs), and displaying on a GUI of a client device. The specification in [0046] as published states "As used herein, a large language model (LLMs) is a probabilistic model of a natural language used for general-purpose language generation. LLMs typically include one or more artificial neural networks having a transformer-base architecture." The specification has made it unclear what the LLM requires or does not require.

The additional elements of a computer, generating a skeleton workflow [the abstract idea portion] using one or more LLMs, and displaying it on a client device are considered "apply it [the abstract idea] on a computer" (see MPEP 2106.05(f); see also MPEP 2106.05(h), field of use, for the "generating, using one or more large language models (LLMs)," the computer, and the GUI of a client device). Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
The claim also fails to recite any improvement to another technology or technical field, improvement to the functioning of the computer itself, use of a particular machine, effecting a transformation or reduction of a particular article to a different state or thing, and/or an additional element that applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. See 84 Fed. Reg. 55. The claim is directed to an abstract idea.

Step 2B in MPEP 2106.05 - The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of a computer, "a large language model (LLM)," and "display on a GUI of a client device" are treated under MPEP 2106.05(f) (Mere Instructions to Apply an Exception: "Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible." Alice Corp., 134 S. Ct. at 235) and MPEP 2106.05(h) (field of use). Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept.

Independent claim 9 is directed to a system at step 1, which is a statutory category. Claim 9 recites similar limitations as claim 1 and is rejected for the same reasons at step 2A, prong one; step 2A, prong two; and step 2B. Claim 9 further recites "A system, comprising: processing circuitry; and a memory, accessible by the processing circuitry, and storing instructions that, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising" for performing each step.
At step 2A, prong two, and step 2B, this is considered "apply it [the abstract idea] on a computer" (see MPEP 2106.05(f)) and "field of use" (MPEP 2106.05(h)). The claim is not patent eligible. Claim 9 further recites limitations that are in dependent claim 2, of receiving approval [from a person] of the updated skeleton/template workflow and generating the workflow (which can be steps for how to handle reimbursement for travel expenses).

Independent claim 17 is directed to an article of manufacture at step 1, which is a statutory category. Claim 17 recites similar limitations as claims 1 and 9 and is rejected for the same reasons at step 2A, prong one; step 2A, prong two; and step 2B. Notably, it is similar to claim 9 in that it at least clarifies that a computer performs each operation.

Claims 2 and 18 narrow the abstract idea by receiving approval [from a person] of the updated skeleton/template workflow and generating the workflow (which can be steps for how to handle reimbursement for travel expenses).

Claims 3 and 20 have an additional element of generating additional GUIs on a client device to display the workflow as it is carried out. The workflow could be anything (e.g., Applicant's [0077] gives examples: "may be related to credit card fraud investigation, employee onboarding, employee training, accounting, financial close, employee reviews, product testing, invoicing/billing, quality control, IT security, purchasing, inventory, logistics, employee benefit management, software development, supply chain management, vendor onboarding"). Having some information displayed related to various business processes is considered "apply it [the abstract idea] on a computer" (see MPEP 2106.05(f)) and "field of use" (MPEP 2106.05(h)). Claim 20 differs from claim 3 in that it recites the display is for an "additional client device"; nonetheless, the same reasoning applies.

Claims 4, 11, and 19 narrow the abstract idea by having placeholder values based on the natural language request (i.e., words/descriptions). The same additional element of a computer/LLM is considered "apply it [the abstract idea] on a computer" (see MPEP 2106.05(f)) and "field of use" (MPEP 2106.05(h)), similar to the independent claims above, as there are no details given as to how the LLM is queried or how it generates the workflow.

Claims 5 and 12 narrow the abstract idea by describing the first property as being EITHER an input, output, actions, label, description, a rule, a trigger, or an advanced property. This just narrows the abstract idea of instructions for people to follow to conduct a business process. Claim 6 narrows the abstract idea by having EITHER a value for an additional property, adding a third property, removing an activity, or replacing an activity.

Claims 7 and 13 have an additional element of using a "chat interface." Claim 1 does not require a computer, so it appears claim 7 is referring to just the input area for text on a display screen for how the text is entered. This is considered "apply it [the abstract idea] on a computer" (see MPEP 2106.05(f)) and "field of use" (MPEP 2106.05(h)). Claim 13 does require a computer for each step, and the same reasoning applies.

Claim 8 narrows the abstract idea by having EITHER other workflows, business process model and notation conventions, operating procedures, best practices, OR publications. The data from ONE of these items is used to "train" the LLM. The "training the LLM" is considered an additional element and is considered "apply it [the abstract idea] on a computer" (see MPEP 2106.05(f)) and "field of use" (MPEP 2106.05(h)) at step 2A, prong two, and step 2B. No details or explanation of how the "training" is conducted are given at this time.
Claim 10 has additional elements where the steps are executed on a "cloud-based client instance," which is interpreted as "by a computer" even if at a data center, and specifies that "a request to generate the workflow," the input to modify the workflow, and the approval are "received from a client device." The cloud-based client instance (e.g., a computer at a data center) and receiving some data "from a client device" are considered additional elements and are considered "apply it [the abstract idea] on a computer" (see MPEP 2106.05(f)) and "field of use" (MPEP 2106.05(h)) at step 2A, prong two, and step 2B. At step 2B, the fact that data is "received" from a client device and then executed at a "cloud-based client instance" is also considered a conventional computer function. See MPEP 2106.05(d)(II): "Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362."

Claim 14 depends from claim 13 [chat interface] and narrows the abstract idea by stating it gives a person a recommendation for modifying the skeleton workflow. Claim 15 has an additional element of a "popup window" for displaying a recommendation for modifying the skeleton/template workflow. This is considered "apply it [the abstract idea] on a computer" (see MPEP 2106.05(f)) and "field of use" (MPEP 2106.05(h)) at step 2A, prong two, and step 2B. Providing another area for how text is displayed, by itself or in combination with the other limitations, is not considered an improvement in the display.

Therefore, the claims are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter. For more information on 101 rejections, see MPEP 2106.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-14 and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Tripathy (US 2025/0139417) in view of Wilson (US 2025/0111334).

Concerning claim 1, Tripathy describes: A method (Tripathy – see par 35 - FIG. 4 depicts an example method 400 of workflow assistance; see par 61-62 – FIG. 10 is an example system 1000 to perform various aspects, such as the methods in FIGS. 4-5; processing system 1000 is generally an example of an electronic device configured to execute computer-executable instructions) comprising: receiving, via a processor (Tripathy – see par 28 - FIG.
2 illustrates a block diagram of an example implementation of the workflow assistant system 130 briefly described in FIG. 1. The map component 210, score component 220, and data acquisition component 230 can be implemented by at least one processor coupled to at least one memory that stores instructions that, when executed by the at least one processor, cause the processor to perform the functionality of each component when executed.), a natural language request to generate a workflow (Tripathy – see par 21 - In one embodiment, an intelligent virtual assistant or chatbot can receive a text problem statement from a user and invoke a generative pre-trained transformer (e.g., GPT-4) to determine and create an appropriate workflow for the user; See FIG. 4, 410 – “receive a request”; see par 35 - method 400 can be implemented by the workflow assistant system 130 of FIG. 1 [which in par 28 above, is explained as being executed by a processor]; par 36 - For example, a user could ask a question or state a problem the user is dealing with in natural language text or voice that is received by the workflow assistant system 130; in one embodiment, a conversational chatbot or virtual intelligent assistance can be employed to receive the request as well as request and receive further information), wherein the natural language request specifies at least one characteristic of the workflow (Tripathy -see par 22 - The workflow assistant system 130 can utilize a machine learning model 150 trained to recognize the natural problem statement, for example, utilizing natural language processing, and identify a workflow template corresponding to the problem statement, such as a bill reminder workflow template. The same or a different machine learning model 150 can initiate generation of the workflow template by mapping data in the problem statement to template parameters and requesting missing template values; see par 37, FIG. 
4, step 420 - In one embodiment, one or more machine learning models 150 can be trained to identify the problem statement utilizing natural language processing to classify the problem statement based on historical data..); generating, via the processor, using one or more large language models (LLMs), a first placeholder activity (Tripathy – see par 19 - the application can include or interact with a workflow system 120. The workflow system 120 enables automation and optimization of tasks, processes, or activities including creation and execution of workflows. In other words, a workflow can comprise an orchestrated and repeatable pattern of activities and provide a way to automate tasks. In a financial management application domain, workflows can automate tasks such as processing recurring invoices, sending reminders for overdue payments, categorizing and tracking expenses, and generating financial reports, among other things. see par 32 - generative artificial intelligence can refer to any artificial intelligence or machine learning model capable of generating content. Accordingly, other approaches are also possible that produce similar results, such as using a large language model (LLM) tuned to generate workflows based on tokenized input of a user; see par 37 – step 420 - Further, the same or a different machine learning model 150 can use learned mappings between problem statements or classes of problem statements and workflow templates as a basis for inferring a match) of a skeleton workflow based on the at least one characteristic (Tripathy –see par 46 - A parameter can be a placeholder or variable in a template that is assigned a specific value or content. In accordance with one embodiment, the template parameters can be fetched from a repository or the workflow system 130. 
The parameters can be workflow-dependent but generally correspond to conditions, triggers, actions, and other details), wherein the first placeholder activity comprises a first placeholder value for a first property of the first placeholder activity (Tripathy – see par 15 - Some workflow systems utilize templates, a pre-defined framework or structure that provides a starting point for creating customized workflows. Templates provide a foundation for generating new workflows to expedite creation and reduce the learning curve for new users; see par 21 - In one embodiment, an intelligent virtual assistant or chatbot can receive a text problem statement from a user and invoke a generative pre-trained transformer (e.g., GPT-4) to determine and create an appropriate workflow for the user. In one instance, the workflow assistant system 130 can transform a problem into prompts for a generative pre-trained transformer to aid in identifying a corresponding template and generating data to complete the template. See par 25 - In one instance, the machine learning model 150 can converse with the user in natural language to clarify the problem statement by requesting additional information until the confidence score satisfies the threshold. Additionally, or alternatively, the workflow assistant system 130 can output the template partially completed as a suggestion. See par 31 - For example, generative artificial intelligence techniques, such as those associated with virtual intelligent agents or chatbots, can be utilized to request data in the user's natural language. Furthermore, the request can be conversational rather than a formal and impersonal request for the data. Such requests can be termed prompt engineering. Template parameters or attributes that may need to be provided include, but are not limited to, conditions or triggers, task assignment, when and how notifications or alerts are sent, and the content or format of notifications. 
Once data is acquired, a template can be opened and dynamically populated with the data); generating, via the processor, using the one or more LLMs, based on the at least one characteristic and the first placeholder activity, a second placeholder activity of the skeleton workflow, wherein the second placeholder activity comprises a second placeholder value for a second property (Tripathy – see par 34 - a user can specify a problem statement 342 by way of the customer layer 340, which is received by the assistance system 332 of the workflow plugin layer 330. The assistance system 332 can identify a template associated with a workflow that addresses the problem statement 342. The assistance system 332 initiates generation of the workflow template by invoking the generator 322 of the generative AI layer 320 with a built prompt. The generator 322 can then invoke the context builder to determine template parameters to be generated. In accordance with one embodiment, the parameters can be specified and built into a vector that is passed to Chat GPT to fetch a response and score. See par 47 - Method 500 starts at block 510 with determining a first set of parameters or attributes associated with a workflow template. In accordance with one aspect, the workflow template can be selected as a match for a user problem statement. Parameters can be determined by analyzing a template or metadata associated with the template. A parameter can be a placeholder or variable in a template that is assigned a specific value or content. In accordance with one embodiment, the template parameters can be fetched from a repository or the workflow system 130; See par 49 - The method continues at block 530 by comparing the first set of parameters from user input and the second set of parameters from a workflow template. 
A result of the comparison can be zero or more parameters missing from the first set of parameters with respect to the second set of parameters); generating and transmitting, via the processor, for display on a graphical user interface (GUI) of a client device, the skeleton workflow (Tripathy – see par 20 - The workflow assistant system 130 can also be communicatively coupled to a user by way of a user computing device 140 (e.g., tablet, desktop computer, terminal, laptop computer, smartphone, etc.); FIG. 1 – device 140 has a screen; see par 56 - FIGS. 6-9 depict example screenshots associated with workflow assistance. The example screenshots illustrate how a user can interact with the workflow assistant system 130, the application 110, and workflow system 120 of FIG. 1 through a graphical user interface. Further, the screenshots can illustrate aspects of the workflow page 344 of FIG. 3 in accordance with certain embodiments. See par 58, FIG. 7 - The workflow is set to “ON” rather than “OFF,” indicating that the workflow is currently active and able to be triggered based on specified conditions. Finally, the actions column is an icon that, when activated or selected, allows further inspection and editing of details regarding the workflow; see par 59 – FIG. 8 – screenshot of edit workflow interface; subject of “invoice reminder workflow”; series of options and conditions and actions to specify in 830, 832, 840). Wilson also discloses the “skeleton workflow” on “a GUI” (Wilson – see par 78 – generative output engine may be a large language model (LLM); See FIG. 11, par 295 - the selection panel 1102 can include a defined set of tiles 1104 that are associated with a particular category (e.g., IT, HR, Legal, and so on) and/or associated with particular intake requests (e.g., password troubleshooting, onboarding, and so on). The set of tiles 1104 may include default tiles that may are configured with generalized or common workflows. 
Additionally or alternatively, the set of tiles 1104 may include workflows that have previously been configurated as part of an already generated intake flow. In some cases, the set of tiles 1104 may include workflows that were created as part of a generative output from a generative output engine as described herein.) Tripathy and Wilson disclose: receiving, via the processor, an input requesting to modify the skeleton workflow (Tripathy – see par 25 - In this manner, a user can verify and optionally add or change the information in the template before triggering the generation of a corresponding workflow. In one instance, the machine learning model 150 can converse with the user in natural language to clarify the problem statement by requesting additional information until the confidence score satisfies the threshold.); and based on the input and an activity library, generating, via the processor, a recommended first activity to replace the first placeholder activity and a recommended second activity (Tripathy – see par 29 - In one instance, the map component 210 can retrieve template data or metadata from a template repository of the workflow system 120 and map the statement to a similar template based on the template data. see par 32 - data regarding one or more templates can be acquired from a template repository or store. In one particular instance, parameters needed to build context can be retrieved, tuned, and built as a vector that can be passed to a model (e.g., chat GPT) to fetch a response and a score. Actions can be taken based on the value of the score. For example, if the score equals 100, the workflow can be built and enabled. If the confidence score equals 50, the workflow can be built but in a disabled state (e.g., not enabled/inactive). If the score is less than 5 and additional information is needed, the user can be prompted for the additional information). 
to replace the second placeholder activity (Tripathy – see par 30 - the workflow assistant system 130 can output one or more templates as a suggestion, which a user can select and complete. see par 31 - generative artificial intelligence techniques, such as those associated with virtual intelligent agents or chatbots, can be utilized to request data in the user's natural language; see par 32 - In one particular instance, parameters needed to build context can be retrieved, tuned, and built as a vector that can be passed to a model (e.g., chat GPT) to fetch a response and a score. Actions can be taken based on the value of the score. For example, if the score equals 100, the workflow can be built and enabled. If the confidence score equals 50, the workflow can be built but in a disabled state (e.g., not enabled/inactive). If the score is less than 5 and additional information is needed, the user can be prompted for the additional information. If the score is less than 5 and additional information is unnecessary, the workflow template page can be opened with current information, allowing the rest to be prefilled by a user. This approach uses generative artificial intelligence to recommend and build a workflow for a user (e.g., customer); see par 33 - The workflow plugin layer 330 includes assistance system 332, which corresponds to the workflow assistant system 130 of FIG. 1, and is configured to converse with a user in natural language to receive a problem statement and identify workflows, and corresponding templates, that address the problem statement);

Tripathy discloses using "chat GPT" to build the workflow (see par 32) and discloses a "generative AI layer" to determine template parameters to be generated (see par 34). It is unclear whether the later portions on editing the workflow (FIGS. 7-9) are just the input on the screen, though FIG. 7 has a "give feedback" button.
Wilson discloses based on the “input” and an activity library, generating, via the processor, a recommended first activity to “replace” the first placeholder activity and a recommended second activity to replace the second placeholder activity (Wilson – see par 56 - system may include an option to utilize a generative output engine to make recommendations about the fields. in response to initiating a generative output process, the system may create a prompt for the generative output engine which may be a preconfigured prompt and/or include user input such as natural language commands. The system may submit the prompt to the generative output engine and receive a generative response. The generative response may be used to suggest one or more changes to the fields included in the portal intake flow. see par 295 - The set of tiles 1104 may include default tiles that may be configured with generalized or common workflows; see par 297 - the preview panel 1106 may include a dynamic preview panel that allows a user to modify the workflow (e.g. “HR Starter (Default)”) using the schematic 1108. For example, the user may modify one or more process steps, conditions for progressing an issue, a process flow for an issue and/or other parameters associated with the corresponding workflow. Accordingly, user updates to the preview panel 1106 may be used to configure the workflow for the portal intake flow that is generated). Tripathy and Wilson disclose: receiving, via the processor, an approval of the recommended first activity and the recommended second activity (Tripathy – see par 25 - Template parameters capture template context, and parameter values can be identified automatically based on user-provided data. Subsequently, the workflow system 120 can generate a workflow based on a completed template. 
In one instance, the machine learning model 150 can converse with the user in natural language to clarify the problem statement by requesting additional information until the confidence score satisfies the threshold. Additionally, or alternatively, the workflow assistant system 130 can output the template partially completed as a suggestion. In this manner, a user can verify and optionally add or change the information in the template. Alternatively, the workflow can be generated but in an inactive or unenabled state, allowing a user to review the workflow and make changes before workflow execution; See also Wilson par 199 - . In this example, it may be appropriate for the output processor to direct the output of the generative engine service 116 to the frontend (e.g., rendered on the client device 104, as one example) so that a user of the client device 104 can approve the content before it is prepended to the document. see par 273 - the system may display in the first UI, a first preview panel including a dynamic preview of an intake interface for the request type associated with the selected tile. The intake interface may include intake fields that are used to collect information related to the particular request type as part of the intake portal flow. For example, the fields may include fields for identifying the requestor or a user associated with the request, fields for providing a summary of the issue/request, and so on. The fields displayed in the dynamic preview may be a set of default fields associated with a particular request type. For example, an IT request type may be different fields than an HR request type. see par 298 - In response to a selection of a particular workflow type and/or user input to the user interface 1100 confirming selection of a workflow, the system can create the new portal intake flow for use in a web based service. 
The new portal intake flow can be configured to generate a new issue using an intake interface that is configured in accordance with the intake request type and having a workflow configured in accordance with the selected workflow.); updating, via the processor, the skeleton workflow based on the approval to replace the first placeholder activity with the recommended first activity and replace the second placeholder activity with the recommended second activity (Tripathy – See par 27 - Moreover, the workflows can be generated from templates that are automatically generated based on a natural language description of a problem. Continuing with the previous example, a user can state that they are having trouble with bills being paid on time. The workflow assistant system 130 can utilize natural language processing to determine a problem statement and match a bill reminder template to the problem statement. The workflow assistant system 130 can also request additional information to aid in completing the bill reminder template; see par 54 - It is further to be appreciated that various default information or details can also be prefilled but can be updated by a user, such as the content and format of notifications, among other things. In this manner, user requests can be limited to what is needed, such as what clients the user considers high-value clients for bill review and processing). Wilson – see par 55 - In some cases, the prompt may include a command to return portal intake flows to use in a specific project management interface, for example, portal intake flows for a project management interface used by an HR team. The generative output may return a list of recommended portal intake flows and the system may generate a project management interface that includes the recommended portal intake flows. In some cases, the system can display a dynamic preview of the project management interface, which may allow a user to modify aspects of the interface. 
For example, the user may change an arrangement of the portal intake flows, add and/or remove various portal intake flow and so on. Additionally or alternatively, a user may submit feedback on the generated portal intake flow which may be used to generate a second prompt for a generative output engine. The second prompt may request the generative output engine to provide suggested changes based on the user input. see par 297 - user may modify one or more process steps, conditions for progressing an issue, a process flow for an issue and/or other parameters associated with the corresponding workflow. Accordingly, user updates to the preview panel 1106 may be used to configure the workflow for the portal intake flow that is generated.). Tripathy and Wilson are analogous art as they are directed to processes/workflows and taking input (e.g. text data) and performing natural language processing in forming workflow/processes (see Tripathy Abstract; Wilson Abstract, par 56, 78, 295 (IT, HR, Legal, and so on)). Tripathy discloses displaying data and having workflows set to “active” and “editing” details regarding the workflow (See par 20, 58). Tripathy discloses outputting templates as a suggestion, using “chat GPT” to “recommend” and build the workflow (See par 30, 32), and disclosing “generative AI layer” to determine template parameters to be generated (See par 34). Wilson improves upon Tripathy by disclosing having displaying “default tiles” for common workflows for different categories and using natural language user input then having generative response to suggest changes to fields in a portal intake flow (See par 56, 295, 297). One of ordinary skill in the art would be motivated to further include having default tiles for common workflows and generative responses to suggest changes to fields to efficiently improve upon the workflows and suggestions in Tripathy. 
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention for receiving problem statements in natural language and using workflow templates in Tripathy, to further include having workflows with natural language inputs and graphical representation of default flows as disclosed in Wilson, since the claimed invention is merely a combination of old elements, and in combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable and there is a reasonable expectation of success. Concerning claims 2 and 18, Tripathy and Wilson disclose: The method of claim 1, comprising: receiving an approval of the updated skeleton workflow (Tripathy – see par 25 - Template parameters capture template context, and parameter values can be identified automatically based on user-provided data. Subsequently, the workflow system 120 can generate a workflow based on a completed template. In one instance, the machine learning model 150 can converse with the user in natural language to clarify the problem statement by requesting additional information until the confidence score satisfies the threshold. Additionally, or alternatively, the workflow assistant system 130 can output the template partially completed as a suggestion. In this manner, a user can verify and optionally add or change the information in the template. Alternatively, the workflow can be generated but in an inactive or unenabled state, allowing a user to review the workflow and make changes before workflow execution.); and generating, in response to receiving the approval of the skeleton workflow, the workflow based on the approved updated skeleton workflow (Tripathy – see par 25 - Template parameters capture template context, and parameter values can be identified automatically based on user-provided data. 
Subsequently, the workflow system 120 can generate a workflow based on a completed template. In one instance, the machine learning model 150 can converse with the user in natural language to clarify the problem statement by requesting additional information until the confidence score satisfies the threshold. Additionally, or alternatively, the workflow assistant system 130 can output the template partially completed as a suggestion. In this manner, a user can verify and optionally add or change the information in the template before triggering the generation of a corresponding workflow; See also Wilson par 199 - . In this example, it may be appropriate for the output processor to direct the output of the generative engine service 116 to the frontend (e.g., rendered on the client device 104, as one example) so that a user of the client device 104 can approve the content before it is prepended to the document. see par 273 - the system may display in the first UI, a first preview panel including a dynamic preview of an intake interface for the request type associated with the selected tile. The intake interface may include intake fields that are used to collect information related to the particular request type as part of the intake portal flow. For example, the fields may include fields for identifying the requestor or a user associated with the request, fields for providing a summary of the issue/request, and so on. The fields displayed in the dynamic preview may be a set of default fields associated with a particular request type. For example, an IT request type may be different fields than an HR request type. see par 298 - In response to a selection of a particular workflow type and/or user input to the user interface 1100 confirming selection of a workflow, the system can create the new portal intake flow for use in a web based service. 
The new portal intake flow can be configured to generate a new issue using an intake interface that is configured in accordance with the intake request type and having a workflow configured in accordance with the selected workflow). Concerning independent claim 9, Tripathy and Wilson disclose: A system (Tripathy – see par 35 - FIG. 4 depicts an example method 400 of workflow assistance. see par 61-62 – FIG. 10 is example system 1000 to perform various aspects, such as methods in FIG. 4-5; Processing system 1000 is generally an example of an electronic device configured to execute computer-executable instructions), comprising: processing circuitry (Tripathy – see par 95 - the various operations of methods described above may be performed by any suitable means capable of performing the corresponding functions. The means may include various hardware or software component(s) or module(s), including, but not limited to a circuit, an application specific integrated circuit (ASIC), or processor); and a memory, accessible by the processing circuitry, and storing instructions that, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising (Tripathy – see par 65 - Processor(s) 1002 are generally configured to retrieve and execute instructions stored in one or more memories, including local memories like the computer-readable medium 1012, as well as remote memories and data stores.)… The remaining limitations are similar to claims 1 and 2 above. Tripathy and Wilson disclose the limitations for the same reasons as above. Concerning independent claim 17, Tripathy and Wilson disclose: A non-transitory, computer readable medium comprising instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations comprising (Tripathy – see par 35 - FIG. 4 depicts an example method 400 of workflow assistance. see par 61-62 – FIG. 
10 is example system 1000 to perform various aspects, such as methods in FIG. 4-5; Processing system 1000 is generally an example of an electronic device configured to execute computer-executable instructions; see par 65 - Processor(s) 1002 are generally configured to retrieve and execute instructions stored in one or more memories, including local memories like the computer-readable medium 1012, as well as remote memories and data stores). The remaining limitations are similar to claims 1 and 2 above. Tripathy and Wilson disclose the limitations for the same reasons as above. Concerning claims 3 and 20, Tripathy and Wilson disclose: The method of claim 2, comprising generating one or more graphical user interfaces (GUIs) configured to be displayed via a client device… (Tripathy – see par 20 - The workflow assistant system 130 can also be communicatively coupled to a user by way of a user computing device 140 (e.g., tablet, desktop computer, terminal, laptop computer, smartphone, etc.); FIG. 1 – device 140 has a screen; see par 56 - FIGS. 6-9 depict example screenshots associated with workflow assistance. The example screenshots illustrate how a user can interact with the workflow assistant system 130, the application 110, and workflow system 120 of FIG. 1 through a graphical user interface. Further, the screenshots can illustrate aspects of the workflow page 344 of FIG. 3 in accordance with certain embodiments. See par 58, FIG. 7 - The workflow is set to “ON” rather than “OFF,” indicating that the workflow is currently active and able to be triggered based on specified conditions. Finally, the actions column is an icon that, when activated or selected, allows further inspection and editing of details regarding the workflow.) Tripathy discloses displaying data and having workflows set to “active” (See par 20, 58). 
Wilson discloses the recited content on the GUI: The method of claim 2, comprising generating one or more graphical user interfaces (GUIs) configured to be displayed via a client device “as the workflow is carried out” (Wilson – see par 294 - For example, the workflow types can define a current status, what happens when the current status satisfies one or more criteria, a progression of the issue through different steps that are performed to resolve the issue and so on. In some cases, a workflow type can define timelines for an issue, inputs required from various users, manage approvals and/or other processes for advancing an issue; See FIG. 11, par 295 - the selection panel 1102 can include a defined set of tiles 1104 that are associated with a particular category (e.g., IT, HR, Legal, and so on) and/or associated with particular intake requests (e.g., password troubleshooting, onboarding, and so on).) Obvious to combine Tripathy and Wilson for the same reasons as in claim 1. In addition, Tripathy discloses displaying data and having workflows set to “active” (See par 20, 58). Wilson improves upon Tripathy by disclosing having status (e.g. current status; progression) for different steps in a workflow as part of its graphical representation. One of ordinary skill in the art would be motivated to further include having status (e.g. current status; progression) for different steps in a workflow as part of its graphical representation to efficiently improve upon the workflows that can be “active” in Tripathy. 
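The status/progression model quoted from Wilson par 294 (a current status that advances through steps when it satisfies one or more criteria) can be sketched minimally as follows. The class, method, and step names are hypothetical illustrations, not identifiers from the reference.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class WorkflowStep:
    name: str
    # Criterion the issue must satisfy before progressing past this step.
    criterion: Callable[[dict], bool]


@dataclass
class IssueWorkflow:
    """Sketch of Wilson par 294: a workflow type defines a current status
    and a progression of the issue through steps gated by criteria."""
    steps: list[WorkflowStep]
    current: int = 0

    @property
    def current_status(self) -> str:
        return self.steps[self.current].name

    def advance(self, issue: dict) -> bool:
        """Move the issue to the next step if the current criterion is met."""
        at_last_step = self.current >= len(self.steps) - 1
        if not at_last_step and self.steps[self.current].criterion(issue):
            self.current += 1
            return True
        return False
```

For example, a flow with steps “Open” (criterion: triaged), “In Progress” (criterion: resolved), and “Done” would report its current status at each stage, which is the kind of per-step state a GUI could render as the workflow is carried out.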
Concerning claims 4, 11, and 19, Tripathy describes: The method of claim 1, wherein generating the first and second placeholder activities comprises: setting, for the first placeholder activity, using the one or more LLMs, based on the natural language request, the first placeholder value for the first property of the first placeholder activity (Tripathy – see par 48 - The method 500 continues at block 520 by determining a second set of parameters from user input. For example, a problem statement can be analyzed for context information corresponding to one or more conditions, actions, or triggers. A condition can be an expression, such as “TxnAmount>1000.” Here, the expression includes a variable (e.g., transaction amount), an operator (e.g., greater than, less than, equal), and a value (e.g., 1000). An action can correspond to approval, reminder, or notification, among other things. Further, the action can have an assignee to whom the action is assigned. For example, the assignee can be an individual to whom a reminder is sent. An action can specify a step or task to perform, such as emailing a customer or generating a mobile notification, among other things. A trigger is an event that initiates the performance of a corresponding action. One or more conditions can define when a trigger occurs, such as when an invoice is due in a predetermined number of days. The one or more conditions, actions, or triggers, among other things, can correspond to the second set of attributes. See par 54 – method 500… It is further to be appreciated that various default information or details can also be prefilled but can be updated by a user, such as the content and format of notifications, among other things. 
In this manner, user requests can be limited to what is needed, such as what clients the user considers high-value clients for bill review and processing.); setting, for the second placeholder activity, using the one or more LLMs, based on the natural language request and the first placeholder activity, a second placeholder value for a second property of the second placeholder activity (Tripathy -see par 51 - The method 500 continues at block 550, translating missing parameters into a natural language. In other words, the missing parameters can be translated into a natural language request for particular data and sent to the user. see par 52- The method 500 continues at block 560 by receiving user input in response to the request. The method can subsequently loop back to block 520, where one or more parameters or other details are identified from the user input; see par 53 - The second set of parameters or parameter values can be determined from user input at block 520, which can first correspond to the problem statement. The name of the person specifying the problem can be determined from the user name or other means associated with the problem statement at block 520. The first and second sets of parameters can be compared at block 530. The determination at block 540 can indicate missing parameters, such as bill identification. At block 550, a request can be generated to request bill identification. However, the request can be specified in natural language based on context data. For example, the request can ask, “What do you consider high-valued bills?” Input can be received at block 560 in response to the request. Subsequently, the response can be analyzed to determine one or more parameters at block 520; see also Wilson see par 229 - Once the raw user input is transformed into a string prompt, the prompt may be provided as input to a request queue 238 that orders different user request for input from the generative output engine 228. 
Output of the request queue 238 can be provided as input to a prompt hydrator 240 configured to populate template fields, add context identifiers, supplement the prompt, and perform other normalization operations described herein. see par 255 - For example, a project related to software development for a videogame may have different service requirements from a project related to hardware changes within an enterprise. Accordingly, a first request type may prompt the user to provide details on the version number of the software and a description of the issue. A second request, relating to hardware changes, may prompt the user to provide details on their current hardware and role within the organization. Generally, the issue intake builder allows administrators to determine which fields are visible in a frontend application 404a generated by the help desk service 412 when filling out an issue item for a given project. see par 297-298 - a dynamic preview panel that allows a user to modify the workflow using the schematic 1108; In response to a selection of a particular workflow type and/or user input to the user interface 1100 confirming selection of a workflow, the system can create the new portal intake flow for use in a web based service. The new portal intake flow can be configured to generate a new issue using an intake interface that is configured in accordance with the intake request type and having a workflow configured in accordance with the selected workflow) Obvious to combine Tripathy and Wilson for the same reasons as in claim 1. 
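The missing-parameter loop the rejection cites from Tripathy's method 500 (blocks 520-560: compare the parameters a template requires against those extracted from user input, then translate each missing parameter into a natural-language request) can be sketched as below. The function name and parameter keys are hypothetical illustrations; only the bill-reminder example questions echo the examiner's quoted example.

```python
def missing_parameter_requests(
    required: dict[str, str], provided: dict[str, object]
) -> list[str]:
    """Sketch of Tripathy's method 500, blocks 530-550: detect missing
    template parameters and return a natural-language request for each.

    `required` maps each parameter name to the question used to request it.
    """
    return [question for name, question in required.items() if name not in provided]


# Hypothetical bill-reminder template mirroring the examiner's example:
required = {
    "bill_identification": "What do you consider high-valued bills?",
    "reminder_days": "How many days before a bill is due should a reminder be sent?",
}
# Parameters already extracted from the user's problem statement:
provided = {"reminder_days": 3}
# Only the bill-identification question remains to be asked of the user.
```

In the reference's flow, the user's answer would then be fed back into parameter extraction (block 520) until no requests remain; that loop is omitted here for brevity.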
Concerning claims 5, 12, Tripathy and Wilson disclose: The method of claim 1, wherein the first property of the first placeholder activity comprises an input to the first placeholder activity, an output of the first placeholder activity, one or more actions that take place to generate the output of the first placeholder activity based on the input to the first placeholder activity, a label of the first placeholder activity, a description of the first placeholder activity, a rule to apply during performance of the first placeholder activity, a trigger that initiates the first placeholder activity, or an execution property of the first placeholder activity (Examiner notes the claim is in the alternative. Tripathy – see par 40 - The method 400 continues to block 450, where information or data is received in response to the request. The data can relate to one or more conditions associated with triggering a workflow, one or more actions performed in response to a trigger, task assignments, notification information, and when the workflow terminates, among other things. See par 48 - for example, a problem statement can be analyzed for context information corresponding to one or more conditions, actions, or triggers. Concerning claim 6, Tripathy and Wilson disclose: The method of claim 1, wherein the input modifying the skeleton workflow comprises providing a value for an additional property of the first placeholder activity, adding a third activity, removing the second placeholder activity, replacing the second placeholder activity with a fourth activity selected from the activity library, or any combination thereof (Examiner notes the claim is in the alternative. Tripathy – see par 48 - The method 500 continues at block 520 by determining a second set of parameters from user input. For example, a problem statement can be analyzed for context information corresponding to one or more conditions, actions, or triggers. 
A condition can be an expression, such as “TxnAmount>1000.” Here, the expression includes a variable (e.g., transaction amount), an operator (e.g., greater than, less than, equal), and a value (e.g., 1000). An action can correspond to approval, reminder, or notification, among other things. Further, the action can have an assignee to whom the action is assigned. For example, the assignee can be an individual to whom a reminder is sent. An action can specify a step or task to perform, such as emailing a customer or generating a mobile notification, among other things. A trigger is an event that initiates the performance of a corresponding action.). Concerning claims 7, 13, Tripathy and Wilson disclose: The method of claim 1, wherein the input requesting to modify the skeleton workflow is provided via a chat interface (Tripathy – see par 21 - In one embodiment, an intelligent virtual assistant or chatbot can receive a text problem statement from a user and invoke a generative pre-trained transformer (e.g., GPT-4) to determine and create an appropriate workflow for the user. see par 31 - In accordance with one embodiment, the data acquisition component 230 can request data from a user. For example, generative artificial intelligence techniques, such as those associated with virtual intelligent agents or chatbots, can be utilized to request data in the user's natural language. Furthermore, the request can be conversational rather than a formal and impersonal request for the data. Such requests can be termed prompt engineering. Template parameters or attributes that may need to be provided include, but are not limited to, conditions or triggers, task assignment, when and how notifications or alerts are sent, and the content or format of notifications. 
See par 36 - In one embodiment, a conversational chatbot or virtual intelligent assistance can be employed to receive the request as well as request and receive further information; Wilson – see par 345 - The first input region can receive typed/written input provided in a natural language for input into a generative output model, as described herein. For example, the input region can be configured to receive a description of how the project management interface should be configured including portal intake flows or other features that should be included, configurations for the portal intake flows or other features, a description of the arrangement and/or any other suitable input from the user.). Obvious to combine Tripathy and Wilson for the same reasons as in claim 1. Concerning claim 8, Tripathy and Wilson disclose: The method of claim 1, wherein the one or more LLMs are trained on one or more other workflows, one or more business process model and notation (BPMN) conventions, one or more industry standard operating procedures, one or more industry best practices, one or more publications, or any combination thereof (Examiner notes the claim is in the alternative. Tripathy – see par 37 - One or more machine learning models 150 can be executed to infer, predict, or otherwise determine a workflow template matching the problem statement. In one embodiment, one or more machine learning models 150 can be trained to identify the problem statement utilizing natural language processing to classify the problem statement based on historical data (disclosing 1st alternative)). Concerning claim 10, Tripathy and Wilson disclose: The system of claim 9, wherein the processing circuitry is configured to execute a cloud-based client instance (Tripathy – see par 28 - computing device can be configured to be a special-purpose device or appliance that implements the functionality of the workflow assistant system 130. 
Further, all or portions of the workflow assistant system 130 can be distributed across computing devices or made accessible through a network service.), and wherein the natural language request to generate the workflow, the input requesting to modify the skeleton workflow, the approval of the recommended first activity and the recommended second activity, and the approval of the updated skeleton workflow are received from a client device (Tripathy – see par 20 - The workflow assistant system 130 can also be communicatively coupled to a user by way of a user computing device 140 (e.g., tablet, desktop computer, terminal, laptop computer, smartphone, etc.). The workflow assistant system 130 can thus be an intermediary between a user computing device 140 and the workflow system 120; see par 22, fig. 1 - -where user gives “problem/response” – “a user, by way of the user computing device 140, can specify a problem statement somewhat abstractly in a natural language”; see par 25 - Template parameters capture template context, and parameter values can be identified automatically based on user-provided data. Subsequently, the workflow system 120 can generate a workflow based on a completed template. In one instance, the machine learning model 150 can converse with the user in natural language to clarify the problem statement by requesting additional information until the confidence score satisfies the threshold. Additionally, or alternatively, the workflow assistant system 130 can output the template partially completed as a suggestion. 
In this manner, a user can verify and optionally add or change the information in the template before triggering the generation of a corresponding workflow; Wilson par 199 - In this example, it may be appropriate for the output processor to direct the output of the generative engine service 116 to the frontend (e.g., rendered on the client device 104, as one example) so that a user of the client device 104 can approve the content before it is prepended to the document. see par 273 - the system may display in the first UI, a first preview panel including a dynamic preview of an intake interface for the request type associated with the selected tile. The intake interface may include intake fields that are used to collect information related to the particular request type as part of the intake portal flow. For example, the fields may include fields for identifying the requestor or a user associated with the request, fields for providing a summary of the issue/request, and so on. … For example, an IT request type may be different fields than an HR request type. see par 298 - In response to a selection of a particular workflow type and/or user input to the user interface 1100 confirming selection of a workflow, the system can create the new portal intake flow for use in a web based service..). Obvious to combine Tripathy and Wilson for the same reasons as in claim 1. Concerning claim 14, Tripathy and Wilson disclose: The system of claim 13, wherein the operations comprise displaying, via the chat interface, one or more recommendations for modifying the skeleton workflow (Tripathy see par 30 -the workflow assistant system 130 can output one or more templates as a suggestion, which a user can select and complete. 
see par 31 - generative artificial intelligence techniques, such as those associated with virtual intelligent agents or chatbots, can be utilized to request data in the user's natural language; see par 32 - In one particular instance, parameters needed to build context can be retrieved, tuned, and built as a vector that can be passed to a model (e.g., chat GPT) to fetch a response and a score. Actions can be taken based on the value of the score. For example, if the score equals 100, the workflow can be built and enabled. If the confidence score equals 50, the workflow can be built but in a disabled state (e.g., not enabled/inactive). If the score is less than 5 and additional information is needed, the user can be prompted for the additional information. If the score is less than 5 and additional information is unnecessary, the workflow template page can be opened with current information, allowing the rest to be prefilled by a user. This approach uses generative artificial intelligence to recommend and build a workflow for a user (e.g., customer); see par 33 - The workflow plugin layer 330 includes assistance system 332, which corresponds to the workflow assistant system 130 of FIG. 1, and is configured to converse with a user in natural language to receive a problem statement and identify workflows, and corresponding templates, that address the problem statement). Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Tripathy (US 2025/0139417) in view of Wilson (US 2025/0111334), as applied to claims 1-14, and 17-20 above, and further in view of Botha (US 11,194,576). Concerning claim 15, Tripathy discloses The system of claim 9, wherein the operations comprise displaying, … a recommendation for modifying the skeleton workflow (Tripathy – see par 30 - the workflow assistant system 130 can output one or more templates as a suggestion, which a user can select and complete. 
see par 31 - generative artificial intelligence techniques, such as those associated with virtual intelligent agents or chatbots, can be utilized to request data in the user's natural language; see par 32 - In one particular instance, parameters needed to build context can be retrieved, tuned, and built as a vector that can be passed to a model (e.g., chat GPT) to fetch a response and a score. Actions can be taken based on the value of the score. For example, if the score equals 100, the workflow can be built and enabled. If the confidence score equals 50, the workflow can be built but in a disabled state (e.g., not enabled/inactive). If the score is less than 5 and additional information is needed, the user can be prompted for the additional information. If the score is less than 5 and additional information is unnecessary, the workflow template page can be opened with current information, allowing the rest to be prefilled by a user. This approach uses generative artificial intelligence to recommend and build a workflow for a user (e.g., customer); see par 33 - The workflow plugin layer 330 includes assistance system 332, which corresponds to the workflow assistant system 130 of FIG. 1, and is configured to converse with a user in natural language to receive a problem statement and identify workflows, and corresponding templates, that address the problem statement). Wilson discloses generative response suggesting changes to fields in portal intake flow (See par 56) and “For example, in response to determining a user selection of the second tile 1204b, the system can cause the preview panel 1206 to display the second interface corresponding to an interface for a feature corresponding to the selected second tile 1204b (e.g., a user interface for the request portal). The system may cause the preview panel to display an overlay window 1110 that generates a summary corresponding to the second interface 1208a displayed in the preview panel 1206” (See par 303). 
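The score-threshold behavior quoted from Tripathy par 32 amounts to a simple dispatch over a confidence score. A minimal sketch of that logic follows; the thresholds (100, 50, less than 5) come from the quoted passage, while the function name, return strings, and the handling of unquoted score ranges are hypothetical:

```python
def handle_workflow_score(score: int, needs_more_info: bool) -> str:
    """Dispatch over the confidence score described in Tripathy par 32.

    Thresholds (100, 50, <5) are taken from the quoted passage; behavior
    for other scores is not specified there, so a fallthrough is assumed.
    """
    if score == 100:
        return "build workflow and enable it"
    if score == 50:
        return "build workflow in a disabled state"
    if score < 5:
        if needs_more_info:
            return "prompt user for additional information"
        return "open template page with current information for user to prefill"
    return "unspecified in the quoted passage"
```

Note that the quoted passage leaves scores between 5 and 100 (other than 50) unaddressed; the sketch surfaces that gap rather than inventing behavior.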
Botha discloses: The system of claim 9, wherein the operations comprise displaying, “via a popup window,” a recommendation for modifying the skeleton workflow (Botha – see col. 27, lines 5-18 - The interface comprises elements that allows an administrator to input data and/or instructions (e.g. to select an option, to substitute a part of the workflow program document with a correction or update, etc.) and optionally elements that allows output of information to the administrator (e.g. options for exception handling, prompts for input, error messages for invalid inputs, etc.); see col. 27, lines 19-31 – The exception handling interface may be displayed to an exception handler such as a predetermined actor (or group of actors), for example the actor associated with the sequence that leads to the exception, to a designated administrator, and/or to an AI agent trained to handle exceptions. The exception handling interface may be displayed in conjunction with the workflow program document, e.g. in the form of a pop-up window; see col. 32, lines 37-53 - The MLA (machine learning algorithms) may identify repeated or similar sequences. Based on this, the MLA may suggest edits to a workflow program document. For example, as a process is being automated, the MLA may suggest steps to encode a component which is executed similarly to another component, or may suggest steps for use in a component of another process. Alternatively, small differences in similar sequences may be identified, for example, highlighted, to assist a user in determining if differences were intentional or if the processes might usefully be made more consistent.)

Tripathy, Wilson, and Botha are analogous art as they are directed to business processes/workflows (see Tripathy Abstract; Wilson Abstract, par 56, 78, 295 (IT, HR, Legal, and so on); Botha Abstract, Col. 1, lines 26-35 – “business or process analysts”; Col. 9, lines 33-53 – example process can be … legal approval with dependency condition).
Tripathy discloses suggesting a workflow template that a user can further complete (see par 30), and having a template opened with some information and the rest filled by a user (see par 31-32). Wilson discloses a generative response suggesting changes to fields in a portal intake flow (see par 56) and having an “overlay window” with a preview (see par 303). Botha improves upon Tripathy and Wilson by disclosing using a pop-up window to output messages on issues/exceptions with the workflow and AI/machine learning to identify sequences to suggest edits to a workflow (see col. 27; col. 32, lines 37-53). One of ordinary skill in the art would be motivated to include a known pop-up window for providing suggestions/modifications to a workflow, to efficiently improve upon the suggested workflows in Tripathy and Wilson. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, for the receiving of problem statements in natural language and the use of workflow templates in Tripathy, to further include the suggestions and overlay window in Wilson, and to further have a pop-up window to suggest edits to a workflow as disclosed in Botha, since the claimed invention is merely a combination of old elements, and in combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable and there is a reasonable expectation of success.

Response to Arguments

Applicant’s arguments of 12/17/25 have been considered but are not persuasive and/or moot over the revised rejections. With regards to 101, Applicant argues that “amended independent claim 1 does not recite any humans, nor any activity performed by humans.” Remarks, page 12. In response, Examiner respectfully disagrees with this analysis. The arguments are moot over the revised rejections necessitated by the amendments.
Furthermore, the claims involve asking for input from a client device and then getting approval from a client/person. The “skeleton workflows” are for any business topic falling within a “certain method of organizing human activity” - Applicant’s FIG. 5 gives the example of “create a travel expense reimbursement process for managing employee travel expenses efficiently”; Applicant’s [0077] as published gives many other example topics: “workflow may be related to credit card fraud investigation, employee onboarding, employee training, accounting, financial close, employee reviews, product testing, invoicing/billing, quality control, … purchasing, inventory, logistics, employee benefit management… supply chain management, vendor onboarding.” See also Applicant’s FIG. 12 showing how a person sets up a business process – “case capture; fraud investigation; case disposition.” Examiner also notes that having a “computer” perform each limitation is not enough for a claim to be eligible, as a “computer” is analyzed at Step 2A, Prong Two, and Step 2B. See MPEP 2106.05(f) (apply it [abstract idea] on a computer).

Applicant then argues that the claims are eligible at Step 2A, Prong Two, similar to McRo, since [0047] states “Traditionally, generating and modifying workflows for processes using workflow generation tools has been tedious and time consuming,” and the claims here have many limitations for “lower processor utilization and reduced computational costs associated with less time spent designing workflows… reduce[d] human hours spent designing workflows, as well as problems with workflows resulting from human error,” citing to [0086]. Remarks, pages 13-14. In response, Examiner respectfully disagrees.
First, eligibility under 101 does not turn simply on whether any “specific” limitations or “rules” are recited in the claim – the claim needs to offer a particular solution that “improve[s] a computer or other technology.” McRo, as explained in MPEP 2106.05(a)(II) (“Improvements to Any Other Technology or Technical Field”), had a specific way to solve the problem of producing accurate and realistic lip synchronization and facial expressions in animated characters. As stated in MPEP 2106.05(a)(I) (“Improvement to Computer Functionality”), in computer-related technologies, the examiner should determine whether the claim purports to improve computer capabilities or, instead, invokes computers merely as a tool. Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1336, 118 USPQ2d 1684, 1689 (Fed. Cir. 2016).

Here, [0047] relays a problem [making a workflow is tedious and time-consuming for a person], and ends with an aspiration: “Techniques for making the creation of workflows faster, more efficient, and a better experience for workflow designers are needed. Faster and more efficient workflow generation is associated with lower processor utilization and reduces computational costs.” The “faster and more efficient” language in [0047] does not appear to be connected with limitations from the claims, let alone “additional elements” of the claims. Rather, it seems to be stating that it will save the user time. Similarly, [0086] states “Technical effects of the disclosed techniques may include lower processor utilization and reduced computational costs associated with less time spent designing workflows. Further, deployment of the presently disclosed techniques may reduce human hours spent designing workflows, as well as problems with workflows resulting from human error.” As with [0047], it is unclear how this relates to “lower processor utilization and reduced computational costs”. The claim is using a “large language model”.
There are no details on how it is using the LLM in a manner to “lower processor utilization” or reduce computational burdens. The claims here are still viewed at this time as ineligible even in view of Enfish and Desjardins. See MPEP 2106.04(d)(1): “The specification need not explicitly set forth the improvement, but it must describe the invention such that the improvement would be apparent to one of ordinary skill in the art. Conversely, if the specification explicitly sets forth an improvement but in a conclusory manner (i.e., a bare assertion of an improvement without the detail necessary to be apparent to a person of ordinary skill in the art), the examiner should not determine the claim improves technology.” The LLM gives a skeleton workflow, and it gives a placeholder activity; there is no further detail at this time explaining how an “LLM outputting workflow and placeholder activity” of a business process workflow is improving computing technology itself.

Applicant then argues that the claims are eligible at Step 2B, similar to Bascom, since [0047] and [0086] show the claims provide “lower processor utilization and reduced computational costs” and the claims have “unconventional means.” Remarks, pages 15-16. In response, Examiner respectfully disagrees. First, it is unclear what limitations are being alleged as unconventional. Second, with regards to Step 2B, only those additional elements (analyzed under Step 2B) that are deemed “conventional” need to comply with Berkheimer. When elements are just part of “apply it” [abstract idea] on a computer under MPEP 2106.05(f), or “field of use” under MPEP 2106.05(h), no evidence is needed. Third, [0047] and [0086] were addressed above – a “bare assertion” from the Specification with no details is not sufficient for arguing an improvement to computing technology.
See also MPEP 2106.05(a): “if the specification explicitly sets forth an improvement but in a conclusory manner (i.e., a bare assertion of an improvement without the detail necessary to be apparent to a person of ordinary skill in the art), the examiner should not determine the claim improves technology. An indication that the claimed invention provides an improvement can include a discussion in the specification that identifies a technical problem and explains the details of an unconventional technical solution expressed in the claim, or identifies technical improvements realized by the claim over the prior art.”

Arguments regarding prior art are moot in view of new rejections necessitated by the amendments.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to IVAN R GOLDBERG whose telephone number is (571) 270-7949. The examiner can normally be reached 8:30 AM - 4:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anita Coupe, can be reached at 571-270-3614. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/IVAN R GOLDBERG/
Primary Examiner, Art Unit 3619

Prosecution Timeline

Feb 27, 2024
Application Filed
Sep 15, 2025
Non-Final Rejection — §101, §103, §112
Nov 25, 2025
Interview Requested
Dec 01, 2025
Applicant Interview (Telephonic)
Dec 01, 2025
Examiner Interview Summary
Dec 17, 2025
Response Filed
Feb 12, 2026
Final Rejection — §101, §103, §112
Mar 27, 2026
Applicant Interview (Telephonic)
Mar 27, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596970
SYSTEM AND METHOD FOR INTERMODAL FACILITY MANAGEMENT
2y 5m to grant Granted Apr 07, 2026
Patent 12591826
SYSTEM FOR CREATING AND MANAGING ENTERPRISE USER WORKFLOWS
2y 5m to grant Granted Mar 31, 2026
Patent 12586020
DETERMINING IMPACTS OF WORK ITEMS ON REPOSITORIES
2y 5m to grant Granted Mar 24, 2026
Patent 12579493
SYSTEMS AND METHODS FOR CLIENT INTAKE AND MANAGEMENT USING HIERARCHICAL CONFLICT ANALYSIS
2y 5m to grant Granted Mar 17, 2026
Patent 12555055
CENTRALIZED ORCHESTRATION OF WORKFLOW COMPONENT EXECUTIONS ACROSS SOFTWARE SERVICES
2y 5m to grant Granted Feb 17, 2026
Based on this examiner's 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
35%
Grant Probability
72%
With Interview (+36.9%)
4y 8m
Median Time to Grant
Moderate
PTA Risk
Based on 365 resolved cases by this examiner. Grant probability derived from career allow rate.
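The headline figures in this panel follow from the examiner's career counts shown above (128 granted of 365 resolved). A minimal sketch of the arithmetic; the 72% with-interview rate is taken from the page as given, and only the allow rate and the interview lift are recomputed:

```python
# Career counts shown in the Examiner Intelligence panel above
granted, resolved = 128, 365
career_allow_rate = granted / resolved   # baseline grant probability

# With-interview rate as reported on the page (not recomputed here)
with_interview_rate = 0.72

# Interview lift = with-interview rate minus baseline, in percentage points
interview_lift = with_interview_rate - career_allow_rate

print(f"{career_allow_rate:.1%}")        # ~35.1%, shown rounded to 35%
print(f"{interview_lift:+.1%}")          # ~+36.9%, matching the panel
```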
