DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/7/2025 has been entered.
Notice to Applicant
The following is a Non-Final Office action. In response to Examiner's Final Rejection of 8/7/25, Applicant amended the claims on 11/7/25. Claims 1-2, 4-6, 8-9, 11-12, 14-16, and 18-19 are pending in this application and have been rejected below.
Response to Amendment
Applicant’s amendments are acknowledged.
The double patenting rejections are withdrawn in light of the amendments made to 18/165,343; 18/167,075; and 18/335,151.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-2, 4-6, 8-9, 12, and 14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 1 recites the limitation "to set the processor" in line 10, but in line 12 it recites “a processor coupled to the storage device.” There is insufficient antecedent basis for the limitation in line 10, as it is unclear whether there are one or two different “processors.” For purposes of applying prior art only, Examiner interprets the claim as referring to the same processor.
Claims 2, 4-6, and 8-9 depend from claim 1 and are rejected for the same reasons.
Claims 2 and 12 recite the limitation "a management knowledge graph." However, there is insufficient antecedent basis for this limitation because claims 1 and 11, as amended, now include “a management knowledge graph,” and it is unclear whether there are one or two different “management knowledge graphs.” For purposes of applying prior art only, Examiner interprets claims 2 and 12 as reciting “the management knowledge graph.”
Claims 4 and 14 recite the limitation “the task engine module.” However, there is insufficient antecedent basis for this limitation because claims 1 and 11 do not introduce it. Previously, claims 3 and 13 recited “a task engine module,” but those claims are now canceled. Examiner suggests properly introducing “a task engine module,” referring to previous claim 3 for how to incorporate it.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-2, 4-6, 8-9, 11-12, 14-16, and 18-19 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e. an abstract idea) without reciting significantly more.
Step One - First, pursuant to Step 1 in MPEP 2106.03, claim 1 is directed to a system comprising a processor performing operations, which is a statutory category.
Step 2A, Prong One - MPEP 2106.04 - Claim 1 recites:
“A data-driven system based on a data-driven model, comprising:
a storage …, storing a data change sensing…, a data-driven …, and a data footprint… and a knowledge map…comprising a management knowledge graph, wherein the management knowledge graph is configured to output a data management model to the data driven …, such that the data driven … obtains task data comprising an execution logic and an execution rule according to the data management model, wherein the execution logic corresponds to business activities, and the execution rule corresponds to a relationship with business activities during data changes, and the execution logic and the execution rule are configured to …. to drive a execution of a task when data changes;
…
executes the data change sensing… to detect the enterprise resource planning system according to detection definition provided by the knowledge map … to determine whether a changed data occurs,
wherein when… determines the changed data occurs…executes the data driven… to convert the changed data into the task data and output the task data (one of Applicant’s examples of “convert” is [0026] as published, where “purchase requisition data is converted to purchase data through a requisition to purchase task” (a task through which the data was processed)) to the enterprise resource planning system to execute the task in the enterprise resource planning system, so that the enterprise resource planning system executes corresponding service computation of the task to generate business data and feedback the business data to the data driven module, and the data driven … is configured to generate an execution result based on the business data and output the execution result…, wherein the data driven … queries and analyzes a task definition to the knowledge map module according to the business data,
… executes the data footprint … to record previous task data and processing data generated during execution of the task according to the business data, wherein the previous task data is used to perform a current … service in a subsequent service without re-executing a previously performed task operation and the task definition is provided by the knowledge map …,
wherein the recorded processing data carries a data identifier, and the data identifier is configured to indicate a data conversion process during execution of the task”.
As drafted, this is, under its broadest reasonable interpretation, within the abstract idea grouping of “certain methods of organizing human activity” (managing relationships between people). At this time, the claims are interpreted as: storing data regarding a business process/operations (e.g. [0021] as published) with knowledge of a business process (e.g. a sequence of activities for business purchases); having logic and rules for business activities; detecting a “change” in business data in enterprise resource planning based on a detection definition; executing a task to produce a task result; and analyzing a task definition to the knowledge map according to business data (e.g. tasks can be just business activities [0021]; [0026] tasks are… purchases), where previous task data is used to perform a current service in a subsequent service without re-executing previous tasks (interpreted as describing a business process, such as Applicant’s example of “purchases” from ERP (Enterprise Resource Planning)), and a data identifier indicates to a person that data is converted during the purchasing task ([0026] as published indicates this can refer to purchase data). The claim is directed to “certain methods of organizing human activity”: a) commercial or legal interactions (sales activities); and b) “managing personal behavior” (including following rules or instructions), because it is a process of a series of manual tasks performed by users with business data, for enterprise resource planning, involving purchases/tasks and a “change” in business data where there is a “knowledge map” of the business process. The second set of actions need not repeat some of the first actions that were already performed.
Step 2A, Prong Two - MPEP 2106.04 - This judicial exception is not integrated into a practical application. In particular, claim 1 recites the following additional elements:
“A data-driven system connected to a terminal device and an enterprise resource planning system, comprising:
a storage device storing a data change sensing module, a data-driven module, and a data footprint module, and a knowledge map module…module [for a number of limitations]… and the execution logic and the execution rule are configured to …set the processor to drive a execution of a task when data changes; and
a processor coupled to the storage device,
wherein the processor executes the data change sensing module to detect …,
wherein when the processor determines the changed data occurs, the processor executes the data driven module to… configured to generate an execution result based on the business data and output the execution result to the terminal device, wherein the data driven module queries and analyzes a task definition to the knowledge map module according to the business data;
wherein the processor executes the data footprint module to record previous task data and processing data generated during execution of the task according to the business data, wherein the previous task data is used to perform a current call service in a subsequent service without re-executing a previously performed task operation and the task definition is provided by the knowledge map module,
wherein the recorded processing data carries a data identifier, and the data identifier is configured to indicate a data conversion process during execution of the task.
The additional elements, “processor,” “storage device,” “module,” and “call service” ([0026] as published appears to refer to “calling” another computer), when viewed individually or in combination, are viewed as computer elements that “apply it [abstract idea] on a computer” (MPEP 2106.05(f)) and a “field of use” (MPEP 2106.05(h)). The claim limitations, individually or as a whole, when viewed in light of the specification, fall within the MPEP 2106.05(a) example that is not sufficient to show an improvement in computer functionality: accelerating a process of analyzing audit log data when the increased speed comes solely from the capabilities of a general-purpose computer, FairWarning IP, LLC v. Iatric Sys., 839 F.3d 1089, 1095, 120 USPQ2d 1293, 1296 (Fed. Cir. 2016); as here, there are no technical details. The same process would occur manually: a first task is performed, and for the second task the user would “not repeat duplicate steps,” for a purchase based on business data using “enterprise resource planning.”
Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim also fails to recite any improvement to another technology or technical field, any improvement to the functioning of the computer itself, use of a particular machine, a transformation or reduction of a particular article to a different state or thing, or an additional element that applies or uses the judicial exception in some other meaningful way beyond generally linking its use to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. See 84 Fed. Reg. 55. The claim is directed to an abstract idea.
Step 2B in MPEP 2106.05 - The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of a processor/computer, a storage device, modules, and a current call service and subsequent service to execute operations fall under MPEP 2106.05(f) (Mere Instructions to Apply an Exception: “Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible.” Alice Corp., 134 S. Ct. at 235) and “field of use” (MPEP 2106.05(h)). Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept.
In addition, to the extent the purchase/business data results are “output… to terminal device,” and the “call service” and “subsequent service” involve sending messages between computers (through an API in the example in specification [0026], but not yet claimed), both are considered a conventional computer function. See MPEP 2106.05(d): receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321. In addition, the steps involving “record previous task data” are viewed as conventional functions at Step 2B (see MPEP 2106.05(d)(II)(iv): storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306).
The claim fails to recite any improvements to another technology or technical field, improvements to the functioning of the computer itself, use of a particular machine, effecting a transformation or reduction of a particular article to a different state or thing, adding unconventional steps that confine the claim to a particular useful application, and/or meaningful limitations beyond generally linking the use of an abstract idea to a particular environment. See 84 Fed. Reg. 55. The claim is not patent eligible. Viewed individually or as a whole, these additional claim element(s) do not provide meaningful limitation(s) to transform the abstract idea into a patent eligible application of the abstract idea such that the claim(s) amounts to significantly more than the abstract idea itself.
Independent claim 11 is directed to a method at Step 1, which is a statutory category. Claim 11 recites similar limitations as claim 1 and is rejected for the same reasons at Step 2A Prong One, Step 2A Prong Two, and Step 2B. The limitations of “processor, module” are viewed as “apply it [abstract idea] on a computer” at Step 2A Prong Two and Step 2B. The remaining limitations are similar to those of claim 1 above. The claim is not patent eligible.
Dependent claims 2, 4-6, 8-9, 12, 14-16, and 18-19 recite limitations regarding knowledge maps, graphs, detection definitions, converting data into tasks, action logic graphs, management knowledge graphs, scheduling, and determining a change in the business data. These are viewed at this time as data items describing the tasks and merely narrow the abstract idea. To the extent any limitation is performed “by a computer,” it is also viewed as MPEP 2106.05(f) “apply it on a computer.”
Dependent claims 6 and 16 recite a “service orchestration module” for “providing” business data. To the extent this names a computer function, it is considered MPEP 2106.05(f) “apply it on a computer” at Step 2A Prong Two and Step 2B. It is also considered a conventional computer function at Step 2B; see MPEP 2106.05(d), “Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321.”
Therefore, the claim(s) are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter.
For more information on 101 rejections, see MPEP 2106.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-2, 4-6, 8-9, 11-12, 14-16, and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Rao (US 2021/0342723) in view of Kennis (US 2005/0209876) and Addala (US 2011/0218922).
Concerning claim 1, Rao discloses:
A data-driven system connected to a terminal device …(Rao – see par 28, FIG. 1 – architecture 100 for automating enterprise processes; See par 32 - The discovery module 111 may view the process and trace the processes from the front end, to the fulfillment center. The discovery module 111 may generate a digital representation of the process and identify the different applications or systems that the process uses. see par 60 - The discovery platform 110 may use the event and system logs, audit trails, input and output data from the execution of the intelligent automation applications 135 to generate new updated digital representations of the business process. The business process may then be analyzed to determine if there was improvement in the efficiency of the business process, and the optimization module 133 may then adjust the model to further improve the performance of the business process; see par 62-63, FIG. 2 – service architecture 200 comprises client device which may be desktop computer , laptop, smartphone or other computing device),
Rao discloses a number of different applications for business domains (Finance, HR, etc.) (see par 29, FIG. 1).
Kennis discloses:
A data-driven system connected to a terminal device “and an enterprise resource planning system” (Kennis – see par 110 - ERP System: Enterprise Resource Planning system, generally the software, system, and/or Applications responsible for planning and tracking the financial, logistical and human operations of an Enterprise. see par 161, FIG. 1 – enterprise computing environment 10 in which a TIM (Transaction integrity monitoring system) is operative; computing environment includes an ERP system 110 as exemplary of a type of computer system; See par 162 - Users 101 of the TIM system interact with the system via a user interface (UI) comprising a personal computer or terminal and associated display for configuring the system).
Rao and Kennis disclose:
A data driven system… comprising:
a storage device (Rao – see par 28, FIG. 1 – architecture includes Automation platform 130; discovery platform 110; AI & ML Core Technology Platform 120, etc; See FIG. 11, par 111-114 – example machine of a computer system to perform any of the methodologies herein; includes data storage devices 1118; The processor device 1102 is configured to execute instructions 1126 for performing the operations and steps discussed herein; par 116 - The data storage device 1118 may include a machine-readable storage medium 1124 (also known as a computer-readable medium) on which is stored one or more sets of instructions or software 1126 embodying any one or more of the methodologies or functions described herein), storing a data change sensing module ([0020] as published states “When the data change sensing module 121 detects the changed data, in step S320, the processor 110 may execute the data driven module 122 to obtain business data according to the changed data, and execute a task according to the business data to generate an execution result. In this embodiment, the data driven module 122 may query and analyze a project task definition to the knowledge map module 124 according to the business data, and execute a corresponding task according to a corresponding definition and rule.”
Rao discloses this limitation based on the broadest reasonable interpretation in light of the specification – see FIG. 1, par 45 – automation platform 130 comprising operation module 131, automation module 132, and optimization module 133),
a data-driven module (Rao – See par 28 - the automation of enterprise business processes may utilize an integrated approach of machine learning and artificial intelligence to automate the full spectrum of business processes in an Enterprise. The entire automation stack covers several components that operate in an integrated fashion to deliver a continuously learning and intelligent automation solution to address an enterprise need. See par 29, 45-46 – The automation platform 130 may comprise an operation module 131, automation module 132 and optimization module 133. The automation platform 130 may leverage the machine learning models generated by the AI & ML core technology platform 120 to deliver intelligent automation for complex processes in specific business domains like finance, enterprise security, governance risk and compliance (GRC), HR and other enterprise business activities.), and a data footprint module ([0032] as published states – “In operation S604, the data footprint module 123 may provide the corresponding historical business data according to the requirement of the task engine module 1211.”) (Rao – see par 31 - Discovery module 111 may use process mining techniques applied to application and user footprints, digital exhaust and event logs to discover and create a digital representation of the business process)
a knowledge map module (Rao – see par 33, FIG. 1 – discovery platform includes discovery module 111; The discovery module 111 may access event logs, system logs, application logs, or any log of user activity during the performing a process. The log data may then be fed into a process mining stack or machine learning algorithm to identify the process flow. This mining process may be used in generating the digital representation of the process. see par 35 – Discovery platform 110 also includes Visualization module 112 to generate visualization of simulated activity)
comprising a management knowledge graph (Applicant’s [0025] as published states “action logic graph 1242 abstracts the business logic into an action as one primary node type in the graph, and adds data as another primary node type. The action logic graph 1242 may describe the relationship between actions and data through edges that connect action nodes to data nodes.”
Rao discloses the limitations based on broadest reasonable interpretation in light of the specification – see par 79 - Domain-specific labels 610 may belong to one or more categories relevant to validation or generation of a financial report, such as an earnings report. In some embodiments, the validation documents 605 can be initially tagged by the user 604 with one or more relevant validation categories, so the machine learning models 615 can determine which set of domain-specific elements E# to look for and identify in the structured input data 603. In other embodiments, the categories can be preconfigured in the automation system 600 based on the domain to which the domain-specific labels 610 belong; see par 89-90 - At step 801, the method includes receiving historical process data associated with one or more computer systems executing the process; The process mining techniques may include one or more machine learning algorithms designed to identify business processes and their workflows. These machine learning algorithms may include petri nets, process trees, casual nets, state machines, BPMN models, declarative models, deep belief networks, Bayesian belief networks or other machine learning models.
see also Kennis – par 169 - A log extractor 141d is responsive to the provision of audit data logs by certain types of ERP database systems, in the form of records indicating the addition or change of records within particular ERP data. see par 171 - A mapper 150 is operative to retrieve data from the staging database 155 and normalize, transform or map that information into a predetermine format to comprise monitoring entities, which are then stored in a monitoring database 175. The monitoring database 175 stores monitoring entities, both support and transactional, identified by table and field names in accordance with mapping data stored in a knowledge base 165. The mapping data (e.g. in the form of mapping files and ontology files) establishes relationships between monitoring entities stored in the monitoring database and monitored entities from the ERP databases.), wherein the management knowledge graph is configured to output a data management model to the data driven module, such that the data driven module obtains task data comprising an execution logic and an execution rule according to the data management model, wherein the execution logic corresponds to business activities, and the execution rule corresponds to a relationship with business activities during data changes (specification [0030] states “the action logic map 1222 may be a business logic representation model based on the map model. That is, the business logic may be abstracted into the action to serve as the main node type in the map, and the data may be added as another main node type, so that the relationship between the action and the data may be described through the edge connecting the action node and the data node.” [0032] states “ In operation S602, the data footprint module 123 may obtain the business data and the data status (i.e., the processing data) of the current task execution result from the task engine module 1211. 
In operation S603, the data footprint module 123 may establish the status change history and the transformation relationship of the data according to the business metadata definition, the business data, and the data status (that is, associating the processing data with the corresponding historical data).” Rao discloses the limitations based on broadest reasonable interpretation in light of the specification –See par 37 - the AI & ML core technology platform 120 may use machine learning and AI algorithms such as petri nets, neural networks, … or other algorithms to tag, label, classify or identify elements from data sources, as well as learn, predict and validate data related to the process being automated. These algorithms may be used in both validation and generation of financial, HR, security, compliance and other enterprise processes. See par 76 - The validation process may generate large amounts of metadata 506, which may be stored along with a detailed accounting of sources accessed, date and time accessed, validation status of each element, and other details required to create and store an audit trail 507. The audit trail 507 can include a history of sources and data used to arrive at a validation assessment. See par 89 - At step 801, the method includes receiving historical process data associated with one or more computer systems executing the process. The historical process data may be unstructured documents or images. The historical process data may also include event logs, system logs, and other data related to the process being performed. See par 90 - At step 802, the method includes applying process mining techniques to the historical process data; See par 92 - At step 804, the method includes generating a digital representation of the process based on the process model. The digital representation may allow for the process to be replayed visually to give the user a better understanding of the entire process from beginning to end. 
This may visually highlight bottlenecks in the process;
see also Kennis – see par 122 - Knowledge Base: a collection of Frames representing the compliance policies of the Enterprise, e.g. Policy Frames, stored in a data store or database. see par 145 - Transaction: a set of system actions that result in a completed business activity. For example, a transaction includes the actions associated with adding or deleting a new vendor within an A/P system, or changing the name of an existing vendor from one name to another, or creating a purchase order. see par 172 - A knowledge base 165 stores information required by the extractor 140 (extraction data in the form of extractor files), information required by the mapper 150 (mapping data in the form of mapping files and ontology files), and a plurality of computer-executable policy statements or frames 167 that constitutes the rules and/or logic for determining exceptions.), and the execution logic and the execution rule are configured to set the processor to drive a execution of a task when data changes (Kennis – See par 172 - A collaborative rules engine (CORE) 160, also called a transaction analysis engine, is operative in accordance with aspects of the invention to execute policy statements, which constitutes one or more logical rules and/or expressions, against the monitoring database 175 to determine whether there is a violation of policies or rules (i.e. exceptions). see par 258 - In accordance with the invention, data corresponding to these various transactions generated by the responsible and monitored ERP system are (1) extracted from the data source in which electronic transaction representing these business activities are stored, (2) stored in the staging database, (3) transformed by the mapper into monitoring entities, and (4) stored in the monitoring database as monitoring entities
see also Addala – see par 98 - Configurable units are services that are built and defined by a customer. For example, a wrapper is provided around a service that is configured by a user. For example, a customer may want a shipping service that is specific to the customer's company. Accordingly, the service performed by the configurable unit may be defined and built by a customer, but the wrapper allows runtime engine 312 to invoke the service automatically. This allows customers to define services that are needed for their individual organizations. [0099] For example, for each business process, different parameters may be provided (i.e., different products may be ordered for different prices, etc.). This causes different input arguments to be inputted into the service. The common signature defines a data structure that allows the service to be re-used for different executable processes 310. Thus, the same deployed service is used to process different input arguments for the different orders, but different results may be obtained. In this way, the order fulfillment process can be abstracted); and
a processor, coupled to the storage device (Rao – See par 26 – embodiments implemented by computer system that includes processor, memory, and non-transitory computer-readable medium; see par 116 - the instructions 1126 may also reside, completely or at least partially, within the main memory 1104 and/or within the processor device 1102 during execution thereof by the computer system 110),
wherein the processor executes the data change sensing module to detect the enterprise resource planning system according to… ([0027] as published states “if the changed data is a purchase order, the scheduling engine module 1211 may initiate a detection instance at a specific time of day according to the scheduling definition, so as to detect whether there is new purchase order information through the detection engine module 1212.” [0033] as published states “The metadata parser 544 may analyze the business data, and identify the project task change information 541 (information on project task status changes in the business data),” [0039] as published states “The data driven system of the disclosure may also dynamically adapt to changes in data as the task proceeds to recommend the most appropriate task. The data driven system of the disclosure may also automatically detect business data changes based on the knowledge map.”) (Rao – See par 33 - the discovery module 111 may access event logs, system logs, application logs, or any log of user activity during the performing of a process; See par 47 - By implementing machine learning and AI within the domain-specific modules, the intelligent automation applications 135 are able to deliver automation at scale to handle rapidly changing business processes)
Rao discloses identifying process flow from log data (See par 33).
Kennis discloses aspects for ERP (enterprise resource planning) as well as the following limitation, and Addala discloses other portions of the limitation as well:
wherein the processor executes the data change sensing module to detect “the enterprise resource planning system according to a detection definition provided by the knowledge map module to determine whether a changed data occurs” (Kennis – see par 167, FIG. 1 – TIM system includes mapper 150, knowledge base 165 and ERP systems 110 that are monitored; see par 169 - A log extractor 141d is responsive to the provision of audit data logs by certain types of ERP database systems, in the form of records indicating the addition or change of records within particular ERP data. An environmental source extractor 141e is operative to obtain data from an enterprise's environment 133 (e.g. internal systems such as its information technology (IT) infrastructure). see par 171 - The mapping data (e.g. in the form of mapping files and ontology files) establishes relationships between monitoring entities stored in the monitoring database and monitored entities from the ERP databases. see par 181 - the extractor 140 and mapper 150 may be dispersed and located at various remote sites and in different configurations. Specifically for example, the extractor 140a and mapper 150a in connection with ERP 1 can be physically located at the ERP site 301a, instead of locally proximate to the TIM system 100.
see also Addala see par 70 - Users are provided the flexibility to define business processes in a central place configured to enter and capture all information required for orchestration and fulfilling an order. The business process may identify one or more services that define steps to be performed in the order fulfillment process. A run-time engine then uses the definition to dynamically invoke the services based on the definition of the business process. See par 81 - fulfillment workbench 180 allows users to make mass order information changes related to fulfillment including making single line or mass line changes to fulfillment information (e.g., dates, etc.). Fulfillment workbench 180 may further allow for the monitoring of orchestration processes, such as …, as well as status of individual tasks and corresponding fulfillment lines and people lines. Fulfillment workbench 180, in one embodiment, includes mechanisms for maintaining order fulfillment processing and allows an order processing user to control a process associated with an order including pause, edit, cancel, etc).
Rao, Kennis, and Addala disclose:
wherein when the processor determines the changed data occurs, the processor executes the data driven module (Rao – see par 33 - The log data may then be fed into a process mining stack or machine learning algorithm to identify the process flow. see par 56 - Multiple business process models from different domains may be used in combination when performing validation or generation processes for hybrid applications that span more than one domain. See par 79 - As described, configuring the automation tools, in some embodiments, may include training machine learning tools to label portions of the structured input data 603 with categories and elements relevant to validation of a financial report or other tasks as may be applicable in the context of the environment or domain in which the automation system 600 is deployed. see par 86 - The automation system 700 can identify domain-specific labels 710 corresponding to param#s and relevant to the query response 735. Machine learning models 715 can be trained to find and label portions of the structured input data 703 which contain underlying data corresponding to the identified domain-specific labels 710) to convert the changed data into the task data and output the task data… (One of Applicant’s examples of “convert” is [0026] as published, where “purchase requisition data is converted to purchase data through a requisition to purchase task” (a task through which the data was processed). Rao – see par 55 - Recognition may be performed on invoices of any format. Different vendors and suppliers may use different invoice formats, and when adding a new vendor or supplier with a new invoice format, the bot may recognize the document as an invoice without needing to retrain the bot to incorporate a new invoice format.)
With regards to “enterprise resource planning system,” Kennis discloses the limitations:
wherein when the processor determines the changed data occurs, the processor executes the data driven module to convert the changed data into the task data and output the task data “to the enterprise resource planning system to execute the task in the enterprise resource planning system” (Kennis see par 131 - Normalize: a process of transforming data items expressed in a first data item naming schema (e.g. of an enterprise database) into data items expressed in a different data item naming schema (e.g. associated with a monitoring database); see par 247, FIG. 15 – log extractor 141d (FIG. 1) is responsive to transaction log file 1500 (FIG. 15) to process log file and extract data from an ERP system… Relevant database updates are also provided as a part of the entries 1502 and identify particular fields and values of the changes to data indicated by each record in the transaction log file. See par 259 - example of a static monitoring entity in that the creation of a vendor account within an ERP system will tend to persist for an extended period of time. The subsequent transactions with monitoring entity names PO Issued 1814, Received Purchase 1816, Invoice Received 1818, and Payment Issued 1820 are considered transient or transactional entities. see par 272 - The mapping, transformation, and renaming of data from the source data format (monitored database) to the target data format (monitoring database) is also referred to as normalizing), so that the enterprise resource planning system executes corresponding service computation of the task to generate business data and feedback the business data to the data driven module (Kennis – see par 171 - The mapping data (e.g. in the form of mapping files and ontology files) establishes relationships between monitoring entities stored in the monitoring database and monitored entities from the ERP databases. 
A principal function of the mapper 150 is to transform data from various and disparate (and possibly heterogenous) data sources into a shared schema or ontology, so that an analysis engine can examine and correlate data across the disparate systems and facilitate the preparation of policy statements that consider information from different data sources. see par 247, FIG. 15 – log extractor 141d (FIG. 1) is responsive to transaction log file 1500 (FIG. 15) to process log file and extract data from an ERP system; An exemplary transaction log file 1100 comprises a plurality of entries 1502, each typically including certain log file metadata 1504 relating to the information such as a timestamp, identification of a user or actor responsible for making the change or creating the log entry, and other relevant information. Relevant database updates are also provided as a part of the entries 1502 and identify particular fields and values of the changes to data indicated by each record in the transaction log file. For example, the entry 1502a indicates that a vendor AAA Inc. was added to a vendor table, entry 1502b indicates that a purchase order was issued to the vendor identified as AAA Inc;)
To any extent that Rao does not execute a task according to the “business data” to then “generate an execution result,” Kennis and Addala disclose:
and the data driven module is configured to “generate an execution result based on the business data and output the execution result to the terminal device (Kennis – see par 174 - An analysis & reporting server 180 provides a user interface (UI) to users, e.g., users 101, for purposes of receiving reports regarding exceptions and implications thereof, as well as providing a user interface to a case management component 190 also as will be described in greater detail below. Users 101 can be of different types, having different authorizations for different purposes. For example, certain users such as user 101a may have access to receive reports and manage certain cases or learn of certain exceptions; See FIG. 15, par 166 – ERP system from accounts payable communicating with computer terminal 111 of FIG. 1) wherein the data driven module queries and analyzes a task definition to the knowledge map module according to the business data” (Kennis – see par 172 - A knowledge base 165 stores information required by the extractor 140 (extraction data in the form of extractor files), information required by the mapper 150 (mapping data in the form of mapping files and ontology files), and a plurality of computer-executable policy statements or frames 167 that constitutes the rules and/or logic for determining exceptions. A collaborative rules engine (CORE) 160, also called a transaction analysis engine, is operative in accordance with aspects of the invention to execute policy statements, which constitutes one or more logical rules and/or expressions, against the monitoring database 175 to determine whether there is a violation of policies or rules (i.e. exceptions)). see par 179 - monitored entities are extracted and mapped to constitute one or more monitoring entities stored in the monitoring database as described herein, so as to allow the detection of exceptions that are not in compliance with policies or procedures of the enterprise; see par 242 - FIG. 13 is a pseudocode listing of exemplary computer-implemented process 1300 for a programmatic extractor. The illustrated process … corresponds to the process 1210 in FIG. 12. Essentially, this process queries for changed data fields to any updated tables within the ERP system, and exports or transmits that data to the TIM system 100, where it is received by the programmatic extractor.
see also Addala – see par 96 - A service library 306 that includes multiple services that can be included in a business process. In one embodiment, a service library 306 includes services that can be performed in an order fulfillment business process. Order fulfillment involves processes that are performed to fulfill an order. For example, an order may be received from an order capture module. The order may be for a good, service, etc. Different services may be performed to fulfill the order, such as shipment, installation, invoicing, etc. The order fulfillment process may be characterized in these different services; see par 103 - FIG. 4 illustrates an example of an interface 308 according to one embodiment. Process level table 416 summarizes different business processes that have been modeled. As shown, the business processes--Carpet Installation and Process 1--have been modeled by a user; FIG. 4 also shows status 426 of executable processes 310; see par 155 - In an embodiment of the invention, OPM 1040 is also capable of querying a list of task services to be performed at header level and perform them in a sequence defined by the user).
Rao, Kennis, and Addala disclose:
wherein the processor executes the data footprint module to record previous task data (Rao – par 31 - Discovery module 111 may use process mining techniques applied to application and user footprints, digital exhaust and event logs to discover and create a digital representation of the business process; see par 32 – For example, in the domain of e-commerce, events from the time a user logs into a system, orders a product and receives a product are logged. The discovery module 111 may view the process and trace the processes from the front end, to the fulfillment center. The discovery module 111 may generate a digital representation of the process and identify the different applications or systems that the process uses. This representation, which in some embodiments can be an end-to-end representation, may be used to detect bottlenecks and alternate routes that the process may take. See par 33 – in some embodiments, discovery module 111 may be automated. see par 60 - The discovery platform 110 may use the event and system logs, audit trails, input and output data from the execution of the intelligent automation applications 135 to generate new updated digital representations of the business process)
Kennis discloses having an API and protocol to poll for new information (See par 230). Kennis discloses a synchronous database query protocol can be used and it can be possible to “reuse” existing algorithms to handle merge and loading behaviors (See par 239).
Addala discloses:
and processing data generated during execution of the task according to the business data, wherein the previous task data is used to perform a current call service in a subsequent service (Addala – see par 76 - external interface layer 150 may receive the standard goods and/or services request(s) output by the task layer services 140 and provide a single layer transform of the request(s) if needed to match the format of fulfillment systems. The transformation performed by external interface layer maps the data to the content and format required by the integrated fulfillment systems. Transformation by decomposition layer 120 converts the data to the internal format used by system 100. External interface layer 150 may map the data structure from task layer services 140 to the external format. External interface layer 150 provides flexible routing so that request(s) are routed to specific fulfillment systems based on business rules. See par 77 - The external interface layer determines the format used by an external system and transforms the message. For example, metadata defined by a user may be used to determine the format to be used. In one example, mappings to what external systems call a product that was ordered are used to translate the message; see par 323 - At step 2' the decomposition module transforms the new order, creates a new DOO order, and identifies the new DOO order as corresponding to the original DOO order. The decomposition module also assigns separate executable processes for the lines of the new DOO order as necessary. The decomposition module then calls a group API in change mode. ) without re-executing a previously performed task operation and the task definition is provided by the knowledge map module (Addala - see par 114 - For example, the metadata is used to determine the input arguments such that the services can process an order for the business process. 
Also, any partner links are determined using the metadata to allow the services to interact with external systems. Executable process 310 (FIG. 3) is assembled based on the definition of steps in the business process. Because services are reusable, the same code for a service can be used for different business processes. However, the input arguments or partner links may be different. Because the same code is re-used, automatic assembly of executable process 310 is provided. see par 294-295 - In an embodiment, an orchestration system is capable of establishing a rollback checkpoint which identifies a step in an executable process where adjustment activities are no longer required prior to the identified step in the executable process. Thus, when an original executable process is being executed, and an orchestration system receives a change request, the orchestration system need only adjust the most recent steps of the original executable process up to an identified rollback checkpoint. … system is able to avoid unnecessary adjustment steps when it adjusts the steps of an original executable process upon receiving a change request)
wherein the recorded processing data carries a data identifier, and the data identifier is configured to indicate a data conversion process during execution of the task (Rao – See par 53 - The bot may access financial systems within an enterprise and/or on external networks, as well as query databases and financial platforms. Communication may be accomplished through the use of API calls for the different systems and platforms being accessed. The earnings report generation bot would access the same data that the validation bot accessed. When a user runs the report generation bot, some of the parameters can be identical or similar to those parameters accessed by the validation bot. See FIG. 5, par 75 - API queries 502 may serve as the communication between different modules, processes or models used in the processing of reports. Both user-powered tagged documents 501 and API queries 502 can be used as sources of validation 503. Additionally, sources of validation 503 may include data from systems, databases and applications within the enterprise, data on outside networks or data on a cloud infrastructure)
See also Kennis – see par 207 - configuration involves establishment and expression of the enterprise policies in one or more enterprise policy statements, determination of the manner of extraction of data from monitored databases and providing extraction data, and determination of the mapping or normalization of data in the monitored databases to the enterprise ontology, so that extracted data can be stored in the monitoring database for out-of-band operations.
See also Addala –see par 66 - decomposition layer 120 may take the received order and translate it to the order format and order content required by the other layers of the distributed order orchestration system 100, such as the fulfillment layer 160. See par 76 - Transformation by decomposition layer 120 converts the data to the internal format used by system 100. See par 149 - the creation of an order by decomposition module 1020 involves the following steps. First, a header is created. Next, one or more lines are created and associated with the header. Subsequently, for each line, one or more fulfillment lines are created, where a fulfillment line may be only associated with one line. Next, a service is invoked that assigns a separate executable process for each line. However, in certain embodiments of the invention, … a single executable process is used to process the entire DOO order. In either scenario, decomposition module 1020 selects an executable process based on the name and creation date of the executable process. see par 154 - According to the embodiment, decomposition module 1020 invokes OPM 1040 of orchestration module 1030 by passing in the header identity of the DOO order. OPM 1040 is capable of launching one or more executable processes, and is also capable of interacting with, and controlling, the one or more executable processes. see par 195, FIG. 14 - Rollback action column 1410 identifies the rollback action for each step. For example, rollback action column 1410 identifies that the rollback action for step 10 is "Update Schedule," the rollback action for step 20 is "CancelToDo," and the rollback action for step 40 is "UpdateShipment." Redo after rollback column 1420 identifies whether to redo the step after a rollback for each step.).
Rao, Kennis, and Addala are analogous art as they are directed to automating business tasks and to business process modeling/mining (see Rao Abstract; Kennis Abstract; par 24-25 – automated data collection and monitoring; par 169 – programmatic extractor from ERP system; Addala Abstract; par 69, 98, 161). 1) Rao discloses having a number of different applications for business domains – Finance, HR, etc (See par 29, FIG. 1) and client computing devices (See FIG. 2, 201). Rao discloses identifying process flow from log data (See par 33) and domain-specific labels for validation of a financial report (See par 79). Kennis improves upon Rao by disclosing an ERP system in an enterprise computing system that includes auditing and extraction of change of records within ERP data as well as mapping data with relationships from systems monitored (See par 167, 169, 171), rules and expressions against a monitoring database (See par 172), and extracting data from an ERP system (See par 247, FIG. 15). Kennis also improves upon Rao by disclosing normalizing and transforming data items into a different data item (See par 131, 207, 272). One of ordinary skill in the art would be motivated to further include an enterprise resource planning (ERP) system that detects changes to data to efficiently improve upon the business domains and process flow from log data in Rao. 2) Rao discloses identifying process flow from log data (See par 33) and making API calls to the different systems being accessed (See par 53). Kennis discloses having an API and protocol to poll for new information (See par 230). Kennis discloses a synchronous database query protocol can be used and it can be possible to “reuse” existing algorithms to handle merge and loading behaviors (See par 239). 
Addala improves upon Rao and Kennis by disclosing converting (or decomposing) data to the format used by a system when executing orders that use a “header”/name (See par 76, 154, 195), using metadata and re-using some code (See par 114), executing a change while adjusting only the “most recent steps” using rollback checkpoints (see par 195, 294-295), and “calling” an API while decomposing/converting (See par 323). Addala also improves upon Rao by disclosing a configurable service defined and built by a customer with different parameters for each business process to cause different input arguments to be processed for different orders (See par 98-99). One of ordinary skill in the art would be motivated to further include adjusting order fulfillment using different formats and calls, as well as processing different parameters for different business processes, to efficiently improve upon the process flow in Rao and the API and reuse of existing algorithms in Kennis.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the domain-specific labels that are part of automating business process models in Rao to further have an ERP system as disclosed in Kennis, and to further have adjusting of order fulfillment using different formats and calls as disclosed in Addala, since the claimed invention is merely a combination of old elements, and in combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable and there is a reasonable expectation of success.
Concerning independent claim 11, Rao, Kennis, and Addala disclose:
A data-driven method… (Rao – same as claim 1 -see par 28, FIG. 1 – architecture 100 for automating enterprise processes; See par 32 - The discovery module 111 may view the process and trace the processes from the front end, to the fulfillment center. The discovery module 111 may generate a digital representation of the process and identify the different applications or systems that the process uses. see par 60 - The discovery platform 110 may use the event and system logs, audit trails, input and output data from the execution of the intelligent automation applications 135 to generate new updated digital representations of the business process. The business process may then be analyzed to determine if there was improvement in the efficiency of the business process, and the optimization module 133 may then adjust the model to further improve the performance of the business process).
The remaining limitations are similar to claim 1 above. It would be obvious to combine Rao, Kennis, and Addala for the same reasons as claim 1.
Concerning claims 2 and 12, Rao discloses:
The data driven system according to claim 1, wherein the knowledge map module comprises a management knowledge graph (Rao – see par 79 - Domain-specific labels 610 may belong to one or more categories relevant to validation or generation of a financial report, such as an earnings report. In some embodiments, the validation documents 605 can be initially tagged by the user 604 with one or more relevant validation categories, so the machine learning models 615 can determine which set of domain-specific elements E# to look for and identify in the structured input data 603. In other embodiments, the categories can be preconfigured in the automation system 600 based on the domain to which the domain-specific labels 610 belong;
see also Kennis – par 169 - A log extractor 141d is responsive to the provision of audit data logs by certain types of ERP database systems, in the form of records indicating the addition or change of records within particular ERP data. see par 171 - A mapper 150 is operative to retrieve data from the staging database 155 and normalize, transform or map that information into a predetermine format to comprise monitoring entities, which are then stored in a monitoring database 175. The monitoring database 175 stores monitoring entities, both support and transactional, identified by table and field names in accordance with mapping data stored in a knowledge base 165. The mapping data (e.g. in the form of mapping files and ontology files) establishes relationships between monitoring entities stored in the monitoring database and monitored entities from the ERP databases), wherein the management knowledge graph provides the detection definition to the data change sensing module (Kennis – see par 169 - A log extractor 141d is responsive to the provision of audit data logs by certain types of ERP database systems, in the form of records indicating the addition or change of records within particular ERP data. An environmental source extractor 141e is operative to obtain data from an enterprise's environment 133 (e.g. internal systems such as its information technology (IT) infrastructure). see par 172 - A knowledge base 165 stores information required by the extractor 140 (extraction data in the form of extractor files), information required by the mapper 150 (mapping data in the form of mapping files and ontology files), and a plurality of computer-executable policy statements or frames 167 that constitutes the rules and/or logic for determining exceptions;
see also Addala - See par 74 - Distributed order orchestration system 100 may further include a task layer services 140 to provide encapsulated services used to control processing logic for each orchestration process stage. In particular, task layer services 140 may provide task-specific business logic to wrap logic around a certain request such that the system 100 knows what logical tasks are associated with a particular request. The steps that need to be performed in the executable process from orchestration may require tasks to be performed. For example, task layer services 140 can provide and control processing logic for scheduling a shipment; see par 85 - The administration user interface has an integrated setup that includes process sequence, planning, jeopardy, change management, and workbench display. The administration user interface also allows for user-defined status transitions for tasks, processes, and fulfillment lines, and business rules configuration for configuring constraints, transformation rules, and routing rules.).
Obvious to combine Rao, Kennis, and Addala as in claim 1 above. In addition, Rao discloses identifying process flow from log data (See par 33). Kennis and Addala improve upon Rao by disclosing logic required for tasks and the handling of changes/transitions. One of ordinary skill in the art would be motivated to further include logic for required tasks and the handling of changes/transitions to efficiently improve upon the process flow in Rao.
Concerning claims 4 and 14, Rao discloses:
The data driven system according to claim 3, wherein the data driven module further comprises a data pull engine module (Rao- see par 46 - Intelligent automation applications 135 may include modules for automating domain-specific processes. These modules may comprise a finance module 136, security module 137, HR module 138, and modules for other domains within the enterprise environment. The intelligent automation applications 135 may incorporate the machine learning and AI components created by the automation platform 130 to handle automation of complicated enterprise processes), and the data pull engine module processes the business data according to an action logic model to generate processed business data to the task engine module ([0025] as published states “In operation S406, the data pull engine module 1222 may process the business data according to the action logic model to generate processed business data to the task engine module 1221, so that the task engine module 1221 may generate an another execution result. In this regard, in step S411, the action logic graph 1242 may provide the action logic model to the data pull engine module 1222. The action logic graph 1242 may provide an instance of the action logic model to the data pull engine module 1222, and the data pull engine module 1222 may execute a sequence of actions (e.g., call services and/or tasks, etc.) according to a definition of the action logic graph 1242. In this embodiment, the action logic graph 1242 abstracts the business logic into an action as one primary node type in the graph, and adds data as another primary node type. The action logic graph 1242 may describe the relationship between actions and data through edges that connect action nodes to data nodes.” Rao – see par 75 - Validation topic 504 may be extracted from the user-powered tagged documents 501, API queries 502, sources of validation 503 or supplied as separate user input. 
The validation process may generate large amounts of metadata 506, which may be stored along with a detailed accounting of sources accessed, date and time accessed, validation status of each element, and other details required to create and store an audit trail 507. The audit trail 507 can include a history of sources and data used to arrive at a validation assessment; see par 31 - Discovery module 111 may use process mining techniques applied to application and user footprints, digital exhaust and event logs to discover and create a digital representation of the business process; see par 46 - The intelligent automation applications 135 may incorporate the machine learning and AI components created by the automation platform 130 to handle automation of complicated enterprise processes. To handle the automation of enterprise processes, modules are configured to automate domain-specific activities and tasks. The modules are built and trained to intelligently automate activities and tasks in the domains of finance, security, HR and GRC. The finance module 136, security module 137 and HR module 138 may all be trained separately and on data sets that are specific to the domain being automated), such that the task engine module generates an another execution result according to the processed business data ([0031] as published states “In step S504, the data pulling engine module 1212 may execute the data pulling execution path (the action logic) to obtain the target data (i.e., the requested data). In step S505, the data pulling engine module 1212 may output (return) the target data to the task engine module 1211, in which the target data may also be, for example, a metadata.”
Rao – see par 54 - an instance of finance module 136 may be run by a bot and configured to validate invoices. Invoices may be uploaded as images or documents, then read by the bot. The values extracted from the invoice may then be compared to threshold values or other metrics to determine approval or denial of the invoice. See par 82 – The AVM 625 can compare the aggregated and combined figures with those reported in the validation document 605 and determine whether the reported figure matches the aggregated and combined elements. If there is a match, the reported figure is validated. If there is not a match, the reported figure is marked as failed validation. Various validation thresholds depending on the desired tolerances of validation can be programmed in the automation system 600 to determine successful and failed validations (disclosing task execution results – success or failed validations).
Kennis – see par 164 - Transactions generated by the various applications in the ERP system 110 are stored in these databases, and information from such databases is extracted, processed, and analyzed in accordance with aspects of the invention; see par 284 - FIG. 26 illustrates an exemplary policy and exception that might be made the subject of an enterprise policy and expressed as a computer-executable policy statement (or frame) in accordance with aspects of the present invention. Valid and invalid business transaction sequences are shown in the figure, so as to illustrate certain exemplary transactions that might occur during typical business activity and how an invalid sequence might be detected.).
It would be obvious to combine Rao, Kennis, and Addala for the same reasons as claim 1.
Concerning claims 5 and 15, Rao discloses:
The data driven system according to claim 4, wherein the knowledge map module comprises an action logic graph, wherein the action logic graph provides the action logic model to the data pull engine module (Rao – see par 79 - Domain-specific labels 610 may belong to one or more categories relevant to validation or generation of a financial report, such as an earnings report. In some embodiments, the validation documents 605 can be initially tagged by the user 604 with one or more relevant validation categories, so the machine learning models 615 can determine which set of domain-specific elements E# to look for and identify in the structured input data 603. In other embodiments, the categories can be preconfigured in the automation system 600 based on the domain to which the domain-specific labels 610 belong;
see also Kennis – see par 172 - A knowledge base 165 stores information required by the extractor 140 (extraction data in the form of extractor files), information required by the mapper 150 (mapping data in the form of mapping files and ontology files), and a plurality of computer-executable policy statements or frames 167 that constitutes the rules and/or logic for determining exceptions; see par 286 - a frame can be constructed to track the logical business activity steps and impose the discipline (i.e. enterprise policy) of a particular sequence in a business process of an enterprise, and to indicate a policy exception in the event that a portion of a business transaction sequence is out of order.).
It would be obvious to combine Rao and Kennis for the same reasons as claim 1.
Concerning claims 6 and 16, Rao discloses:
The data driven system according to claim 4, wherein the enterprise resource planning system (Kennis – see par 163 - ERP systems 110 with which the embodiments of the present invention are operative include both disparate heterogeneous and stand alone computer systems that run individual ERP applications on behalf of an enterprise, such as account payables systems, human resources systems, accounts receivable, general ledger, inventory management, and like.) comprises a service orchestration module, and the service orchestration module provides the business data ([0025] as published states “In this regard, the service orchestration module 203 may call the service, and sequentially access relevant business processing interface to implement business logic processing, and the service orchestration module 203 may return the business data of the processing result to the data pull engine module 1222.”) (Rao – see par 75, FIG. 5 - API queries 502 may serve as the communication between different modules, processes or models used in the processing of reports. Both user-powered tagged documents 501 and API queries 502 can be used as sources of validation 503. Additionally, sources of validation 503 may include data from systems, databases and applications within the enterprise, data on outside networks or data on a cloud infrastructure; Validation topic 504 may be extracted from the user-powered tagged documents 501, API queries 502, sources of validation 503 or supplied as separate user input;
see Kennis – see par 176 - FIG. 2 illustrates a different (integrated) enterprise environment 10' with which the TIM system 100 is also operative. The known SAP system is an example of such an integrated ERP system. In such integrated environments, an ERP system 210 may consist of an integrated suite of different applications that provide enterprise functions such as accounts payable, accounts receivable, human resources, etc.
see also Addala – see par 68 - Distributed order orchestration system 100, as illustrated in FIG. 1, further includes an orchestration layer 130. Orchestration layer 130 provides individual orchestration processes to manage order and/or service line items. For example, orchestration layer 130 may provide business process management functionality to support planning of steps within a process, including step duration and calculation or recalculation of completion dates. Orchestration layer 130 may also provide external task execution functionality to support creation, update, release, and monitoring of external tasks. External tasks are those that are carried out by the fulfillment systems).
It would be obvious to combine Rao, Kennis, and Addala for the same reasons as claim 1.
Concerning claims 8 and 18, Rao, Kennis, and Addala disclose:
The data driven system according to claim 1, wherein the data change sensing module comprises … a detection engine module (Rao – see par 75, FIG. 5 - API queries 502 may serve as the communication between different modules, processes or models used in the processing of reports. Both user-powered tagged documents 501 and API queries 502 can be used as sources of validation 503. Additionally, sources of validation 503 may include data from systems, databases and applications within the enterprise, data on outside networks or data on a cloud infrastructure).
Addala discloses:
The data driven system according to claim 1, wherein the data change sensing module comprises “a scheduling engine module” and a detection engine module, “and the scheduling engine module periodically initiates a detection instance to trigger the detection engine module according to a corresponding scheduling definition to execute a data change sensing operation according to a detection definition matching the scheduling definition” (Addala - See par 74 - Distributed order orchestration system 100 may further include a task layer services 140 to provide encapsulated services used to control processing logic for each orchestration process stage. In particular, task layer services 140 may provide task-specific business logic to wrap logic around a certain request such that the system 100 knows what logical tasks are associated with a particular request. The steps that need to be performed in the executable process from orchestration may require tasks to be performed. For example, task layer services 140 can provide and control processing logic for scheduling a shipment; see par 78 - The external systems may be systems that perform the task related to processing an order, such as a scheduling system, shipping system, etc. When the task is performed, the result of the task is determined. The result may be a date when a shipment is scheduled, a date when a good is shipped, etc. The result is then sent back to external interface layer 150.)
It would be obvious to combine Rao, Kennis, and Addala for the same reasons as claim 1. Rao discloses identifying process flow from log data (see par 33) and validating (see par 75), and Kennis discloses checking policies for valid and invalid business transaction sequences (see par 284, FIG. 26). Addala improves upon Rao and Kennis by disclosing scheduling shipments and performing tasks “when” a good is shipped (see par 74, 78). One of ordinary skill in the art would be motivated to further include scheduling and performing tasks “when” goods ship to efficiently improve upon the process flow in Rao and the business transaction sequences in Kennis.
Concerning claims 9 and 19, Rao, Kennis, and Addala disclose:
The data driven system according to claim 8, wherein the detection engine module obtains data of a business system (Rao – see par 75, FIG. 5 - API queries 502 may serve as the communication between different modules, processes or models used in the processing of reports. Both user-powered tagged documents 501 and API queries 502 can be used as sources of validation 503. Additionally, sources of validation 503 may include data from systems, databases and applications within the enterprise, data on outside networks or data on a cloud infrastructure; Validation topic 504 may be extracted from the user-powered tagged documents 501, API queries 502, sources of validation 503 or supplied as separate user input) through the enterprise resource planning system, and determines a change in the data of the business system to detect the changed data (Kennis – see par 164 - Data generated from enterprise transactions is stored in ERP database system 120, such as databases 121a, 121b, . . . 121n. Such databases are considered monitored databases in accordance with aspects of the invention. Transactions generated by the various applications in the ERP system 110 are stored in these databases, and information from such databases is extracted, processed, and analyzed in accordance with aspects of the invention; see par 169 - A log extractor 141d is responsive to the provision of audit data logs by certain types of ERP database systems, in the form of records indicating the addition or change of records within particular ERP data. An environmental source extractor 141e is operative to obtain data from an enterprise's environment 133 (e.g. internal systems such as its information technology (IT) infrastructure). Finally, an external source extractor 141f is operative to access and retrieve information from external data sources 132 via an external network 131 such as the Internet.
see also Addala – see par 72 - the services invoked are encapsulated and reusable. The metadata is used to determine how and when to invoke services; see par 248 - The delta comprises a set of pre-defined order attributes, identified as "delta attributes." A delta attribute is an attribute that denotes a change in an order, and triggers an adjustment of the order that is being orchestrated. Thus, the delta is computed based on a well-defined set of delta attributes).
It would be obvious to combine Rao, Kennis, and Addala for the same reasons as claims 1, 2, and 8 above.
Response to Arguments
Applicant's arguments filed 11/7/25 have been fully considered but they are not persuasive and/or are moot in view of the new rejections.
Applicant’s 101 arguments are moot in view of the new rejections necessitated by the amendments.
Applicant argues there is a “complex process of a specific data model” where the “computational load, speed, and accuracy far exceed what the human mind can do.” Remarks, page 11. In response, Examiner respectfully disagrees. The claims currently recite modules of a computer for processing business data and performing processes related to purchases. The additional elements are not complex or specific at this time. There is no mention in the specification, nor a requirement in the claims, of a particular computational load or accuracy. The abstract idea grouping identified here is not “mental,” so that argument is also not persuasive.
Applicant argues that “it is well known in the art that ERP systems are used to store massive amounts of data and handle massive data access operation,” and that the claims therefore improve “computer system technology.” Remarks, page 12. In response, Examiner respectfully disagrees. There is no special definition in Applicant’s specification requiring “massive amounts of data,” and there are no details in the claims related to improvement of “massive data” operations.
Applicant argues that [0026] as filed states “the service orchestration module 203 may use previous task data recorded in the database 1231 for the current call service in the subsequent call service without re-executing the previously performed task operation, thus effectively improving the efficiency and execution speed of the service call” and that, therefore, the computer is improved. Remarks, pages 12-13. In response, Examiner respectfully disagrees. The claims and the specification only state that the previous activity on business data (related to a purchase) does not need to be performed again. This is the same as in a manual process – choosing not to repeat earlier steps. Accordingly, at this time, this is viewed in accordance with MPEP 2106.04(d)(1): “Conversely, if the specification explicitly sets forth an improvement only in a conclusory manner (i.e., a bare assertion of an improvement without the detail necessary to be apparent to a person of ordinary skill in the art), the examiner should not determine that the claim improves technology or a technical field.”
Applicant’s 103 arguments are moot in view of the new rejections (112(b), 103) necessitated by the amendments. Applicant argues claim 1 recites a “command-based architecture.” Remarks, page 15. Examiner notes that it is unclear which limitation(s) or portions of the specification Applicant is referring to.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IVAN R GOLDBERG whose telephone number is (571)270-7949. The examiner can normally be reached 8:30 AM - 4:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anita Coupe can be reached at 571-270-3614. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/IVAN R GOLDBERG/Primary Examiner, Art Unit 3619