DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Notice to Applicant
The following is a non-final, first Office Action responsive to Applicant’s communication of 7/8/24, by which the application was filed. Claims 1-8 are pending in the instant application and are rejected below.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 7/8/24 has been considered by the examiner.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
The following title is suggested: Information Processing of Operation Logs. Other titles may also be acceptable.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1-6 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. It is unclear how the disclosure supports amended claim 1, which recites “an analysis unit, comprising one or more processors” relative to “a visualization unit, comprising one or more processors” and “a cooperation unit, comprising one or more processors”. It appears from FIGS. 1 and 33; [0045] – “The information processing apparatus 10 is an information processing apparatus that is realized by, for example, a general-purpose computer such as a personal computer and executes processing…”; [0052] – “The control unit 12 includes an internal memory for storing a program defining various processing procedures and the like and required data, and performs various types of processing using the program and the data. For example, the control unit 12 includes an analysis unit 12a, a visualization unit 12b, and an RPA cooperation unit 12c”; and [0214] – “The hard disk drive 1031 stores an operating system (OS) 1091, an application program 1092, a program module 1093, and program data 1094, for example. In other words, a program that defines each processing of the information processing apparatus 10 is implemented as the program module 1093 in which a code that is executable by the computer 1000 is described.” that there would instead be support for reciting something like: “An information processing apparatus comprising a processor and a memory storing a program comprising: an analysis unit …”.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-6 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 1 recites the limitation “an analysis unit, comprising one or more processors” relative to “a visualization unit, comprising one or more processors” and “a cooperation unit, comprising one or more processors”. There is insufficient antecedent basis for this limitation in the claim, and it is unclear whether the “visualization unit” and the “cooperation unit” are intended to refer to the same processors or to different processors. In light of what appears to be support for arranging the apparatus in this manner, Examiner suggests reciting: “An information processing apparatus comprising a processor and a memory storing a program comprising: an analysis unit …”.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-8 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., an abstract idea) without reciting significantly more.
Step One - Pursuant to Step 1 in MPEP 2106.03, claim 1 is directed to an apparatus, which is a statutory category.
Step 2A, Prong One (MPEP 2106.04) - Claim 1 recites:
“An information processing … comprising:
an analysis unit, …, configured to acquire an operation log regarding operation information, analyze an operation log, and specify attribute information of the operation log;
a visualization unit, …, configured to visualize an image including an object on the basis of the attribute information specified by the analysis unit, the object being an object of operation logs in predetermined units and being selectable by an operation of a user; and
a cooperation unit, …, configured to generate a feasible automatic operation program on the basis of a selected object in a case where the object included in the image visualized by the visualization unit is selected by the operation of the user.”
As drafted, this falls, under its broadest reasonable interpretation, within the abstract idea groupings of “mathematical relationships” and “certain methods of organizing human activity” (managing relationships between people – including… teaching, and following rules or instructions), because the claim recites acquiring an operation log of operation information, which can cover cases and business operations (see FIG. 3), in one example context for travel purposes [0059 as published], with attributes [0055 – e.g., business operation ID, work ID, case ID, operation type ID]; visualizing the operation log, such as in a time line format as in FIG. 28 (see [0187]-[0189]); and then generating a feasible operation on the basis of what is selected ([0206]-[0208]). Accordingly, claim 1 is directed to an abstract idea: analyzing people’s activities as they conduct a business operation, putting the activities in a visualization (or timeline), and generating what would be feasible to automate based on a person’s selections.
Step 2A, Prong Two (MPEP 2106.04) - This judicial exception is not integrated into a practical application. Claim 1 recites the following additional elements:
“An information processing apparatus comprising:
an analysis unit, comprising one or more processors, configured to acquire an operation log regarding operation information, analyze an operation log, and specify attribute information of the operation log;
a visualization unit, comprising one or more processors, configured to visualize an image including an object on the basis of the attribute information specified by the analysis unit, the object being an object of operation logs in predetermined units and being selectable by an operation of a user; and
a cooperation unit, comprising one or more processors, configured to generate a feasible automatic operation program on the basis of a selected object in a case where the object included in the image visualized by the visualization unit is selected by the operation of the user.”
The additional elements of the apparatus, the processors, and the “automatic operation program” are considered mere instructions to apply the abstract idea on a computer (see MPEP 2106.05(f)). To the extent the “visualization” amounts to putting a display before a user, this is likewise considered applying the abstract idea on a computer (MPEP 2106.05(f)) and a field of use limitation (MPEP 2106.05(h)). Notably, even the last step merely outputs or generates what would be a “feasible” automatic operation program, in contrast to the computer automatically performing some of the earlier processing of the operation logs.
Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim also fails to recite any improvement to another technology or technical field, improvement to the functioning of the computer itself, use of a particular machine, effecting a transformation or reduction of a particular article to a different state or thing, or an additional element that applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. See 84 Fed. Reg. 55. The claim is directed to an abstract idea.
Step 2B (MPEP 2106.05) - The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of a computer, a processor, and a feasible automatic operation program are treated under MPEP 2106.05(f) (Mere Instructions to Apply an Exception – “Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible.” Alice Corp., 134 S. Ct. at 2358) and as “field of use” (MPEP 2106.05(h)). Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept.
Regarding independent claim 7, it is directed to a method at Step 1, which is a statutory category. Claim 7 recites limitations similar to those of claim 1 and is rejected for the same reasons at Step 2A, Prong One; Step 2A, Prong Two; and Step 2B.
Independent claim 8 is directed to an article of manufacture at Step 1, which is a statutory category. Claim 8 recites limitations similar to those of claims 1 and 7 and is rejected for the same reasons at Step 2A, Prong One; Step 2A, Prong Two; and Step 2B.
Claim 2 narrows the abstract idea by describing/specifying content of a business operation, work information, and case information, and putting this information in the visualization/display. Similar to claim 1, this is considered at Step 2A, Prong Two and Step 2B to fall under MPEP 2106.05(f) (Mere Instructions to Apply an Exception – “Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible.” Alice Corp., 134 S. Ct. at 2358) and “field of use” (MPEP 2106.05(h)).
Claim 3 narrows the abstract idea by stating there is a “time line” with at least two hierarchies or timelines, along with the attributes, and with at least one of a terminal (e.g., a name for a user computer), a date, a name, a title, and file names displayed to a user. The content is merely displayed to a user. To the extent a computer is used for the display, at Step 2A, Prong Two and Step 2B this is also MPEP 2106.05(f) (apply it [abstract idea] on a computer) and “field of use” (MPEP 2106.05(h)).
Claim 4 narrows the abstract idea by having the visualization include nodes indicating “operation content” and edges indicating “operation orders” to show the sequence of activities. A flowchart can be formed in the same manner manually and used to direct people to follow it. To the extent a computer is used for the display, at Step 2A, Prong Two and Step 2B this is also MPEP 2106.05(f) (apply it [abstract idea] on a computer) and “field of use” (MPEP 2106.05(h)).
Claim 5 narrows the abstract idea by visualizing something related to a captured screen in a chronological order, based on the attribute information. To the extent this is “using a computer” and displaying, it is considered MPEP 2106.05(f) (apply it [abstract idea] on a computer) and MPEP 2106.05(h) (field of use).
Claim 6 narrows the abstract idea by displaying the nodes and having a user connect the nodes together. To the extent this is performed “by a computer” and displayed, it is considered MPEP 2106.05(f) (apply it [abstract idea] on a computer) and MPEP 2106.05(h) (field of use).
Therefore, claims 1-8 are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter.
For more information on 101 rejections, see MPEP 2106.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-2 and 4-8 are rejected under 35 U.S.C. 103 as being unpatentable over Urabe et al., “Visualizing user action data to discover business process,” in 2019 20th Asia-Pacific Network Operations and Management Symposium (APNOMS), IEEE, 2019, pages 1-4, and Berg (US 2021/0086354).
Concerning claim 1, Urabe discloses:
An information processing … comprising (Urabe – see page 2, col. 1, section III - visualization tool… visualizes a flowchart of user actions; the second feature enables analyzers to modify the visualization granularity on the visualization display; collected user actions on the timing of UI (user interface) state change).
Urabe discloses software “visualization tool” (See page 2, col. 1, section III) and the business process is retrieved “from a computer” (See Abstract).
Berg discloses:
An information processing “apparatus” (Berg ‘354 see par 27 – computing system or environment 140; see par 35 - A module may be at least partially implemented in software for execution by various types of processors.).
Urabe and Berg ‘354 disclose:
an analysis unit, comprising one or more processors (Berg see par 27, 35, 80 - In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor.), configured to acquire an operation log regarding operation information, analyze an operation log, and specify attribute information of the operation log (Applicant’s [0055] as published states “The analysis unit 12a specifies and provides a business operation ID, a work ID, a case ID, and an operation type ID as each piece of attribute information.”
Urabe 2019 discloses the limitations based on broadest reasonable interpretation in light of the specification – see page 1, col. 2, 2nd paragraph - Visualizing system logs and event logs as a flowchart is known to be effective to understand the whole actual state of operational processes. There are already several tools (e.g. Celonis, Disco) in the market that can visualize business processes from these data [3], [4]. See page 2, col. 1, 4th paragraph - We collected user actions on the timing of UI (user interface) state change based on Nakajima’s idea [6]. The information we collected were, user name (text), action time (text), case ID (text), application (text), window title (text), UI location (text), and screenshot of the window (image) (hereinafter referred to as “screenshot”). User name is a user who executed the action. Action time is the execution time of the action. Case ID is a unique identifier of individual cases. Application and window title are information of window which the action was executed. UI location is the actual location of the UI state change. Screenshot is an image of the window which the action was executed;
See also Berg ‘354 – see par 41 - a log file may include one or more screenshots, screenshots per customer, tree data, domain object model (DOM) tree data, UI elements, one or more selectors, an action type, clipboard content, user computing activity, or the like. The log file or record file may be uploaded for centralized, cloud, or the like analysis. An analysis may display the progress of the processes);
a visualization unit, comprising one or more processors (Berg see par 27, 35, 80 - In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor), configured to visualize an image including an object on the basis of the attribute information specified by the analysis unit, the object being an object of operation logs in predetermined units (Urabe – see page 2, col. 1, last paragraph - Fig. 1 is an example of the visualization result which two cases are visualized together as one flowchart. Actions that have the same application, window title, and UI location were expressed as one node and visualized using screenshot. Edges were created by following the action time of each action from each case. We added start node before the first action in each case and end node after the last action in each case.) and being selectable by an operation of a user (Urabe 2019 – See page 2, col. 2, 2nd paragraph - We also created a function that enables analyzers to modify the visualization granularity according to their analytical purposes. This function lets analyzers to select nodes and group them into one node to show a process in higher granularity. On the other hand, analyzers can choose the grouped node and ungroup them to visualize more detailed information. In this example, an analyzer selects two nodes (Fig. 2, blue square in the upper box) and click “group” and add a label “Type Destination”. Then, the visualization result will be modified to the yellow node shown in the bottom box of Fig. 2.); and
a cooperation unit, comprising one or more processors (Berg see par 27, 35, 80 - In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor), configured to generate a feasible automatic operation program on the basis of a selected object in a case where the object included in the image visualized by the visualization unit is selected by the operation of the user (Applicant’s [0198] as published states “where the object included in the image visualized by the visualization unit 12b has been selected by an operation of the user, the RPA cooperation unit 12c generates a feasible automatic operation program on the basis of the selected object. The RPA cooperation unit 12c has a function capable of creating the automatic operation flow on the basis of an object selected by the user.”
Urabe 2019 – see page 3, col. 2, 3rd paragraph - the analyzer discovered that users were taking time entering business trip purpose which was the only UI that required keyboard input while others were either selection boxes or buttons in Input basic information. In Input route, users were taking time entering routes of their business trips (e.g. station names, transportation type, and transportation fee). Since this information can be found on websites where users search for routes, the analyzer discovered that these actions can be automated by RPA tools. This discovery was found based on the analyzer’s experience and from carefully observing operational patterns from the visualization result. Since the analyzers we target in this tool do not have skills in performing BPM, we believe that our tool will be more supportive for them if it can automatically display areas for improvement on the visualization result;
see also Berg ‘354 see par 17 - Conductor 104 may instruct or command robot(s) or automation executor 106 to execute or monitor a workflow in a mainframe, web, virtual machine, remote machine, virtual desktop, enterprise platform, desktop app(s), browser, or the like client, application, or program. par 50 - Discovered distinct processes and state transitions may also be generated for visual validation (3). Validation of a visual may utilize manual input or additional robotic actions. Discovered distinct processes and state transitions may be ranked based on criticality, priority, value, or the like for RPA automation).
Both Urabe and Berg are analogous art, as they are directed to understanding processes for automation in robotic process automation (RPA) (see Urabe Abstract, page 1, col. 2, 2nd paragraph; Berg Abstract). Urabe discloses a software “visualization tool” (see page 2, col. 1, section III), and the business process is retrieved “from a computer” (see Abstract). Berg improves upon Urabe by explicitly executing software in a processor (see par 27, 35, 80) and by analyzing log files (see par 41). One of ordinary skill in the art would have been motivated to include an explicit processor executing a program to efficiently improve upon Urabe’s visualization tool for user actions on a UI.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the visualization tool for user actions on a UI for different case IDs in Urabe to execute software in a processor (see Berg par 27, 35, 80) and to analyze log files (see Berg par 41) as disclosed in Berg, since the claimed invention is merely a combination of old elements, in combination each element merely would have performed the same function as it did separately, one of ordinary skill in the art would have recognized that the results of the combination were predictable, and there is a reasonable expectation of success.
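For illustration only – a hypothetical editorial sketch, not any party’s actual implementation, with all names invented – the flowchart construction Urabe describes (actions sharing application, window title, and UI location collapse into one node; edges follow each case’s action times; synthetic start/end nodes bracket each case) can be expressed as:

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class UserAction:
        # Fields mirror the data Urabe reports collecting (page 2, col. 1).
        user_name: str
        action_time: str      # e.g. ISO-8601 text such as "2019-03-14T09:30:00"
        case_id: str
        application: str
        window_title: str
        ui_location: str
        screenshot_path: str

    def build_flowchart(actions):
        """Group actions into nodes keyed by (application, window title,
        UI location); create edges by following action time within each case,
        adding START before the first action and END after the last."""
        def node_of(a):
            return (a.application, a.window_title, a.ui_location)
        by_case = defaultdict(list)
        for a in actions:
            by_case[a.case_id].append(a)
        edges = defaultdict(int)  # (from_node, to_node) -> traversal count
        for case_actions in by_case.values():
            case_actions.sort(key=lambda a: a.action_time)
            prev = "START"
            for a in case_actions:
                edges[(prev, node_of(a))] += 1
                prev = node_of(a)
            edges[(prev, "END")] += 1
        nodes = {node_of(a) for a in actions} | {"START", "END"}
        return nodes, dict(edges)

Under these assumptions, two cases that traverse the same screens produce one shared node per screen, matching the “visualized together as one flowchart” behavior quoted above.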
Concerning independent claim 7, Urabe and Berg disclose:
An information processing method that is executed by an information processing apparatus, the method comprising (Urabe – see page 2, col. 1, section III - visualization tool… visualizes a flowchart of user actions; second feature enables analyzers to modify the visualization granularity on the visualization display; collected user actions on the timing of UI (User interface);
Berg ‘354 see par 27 – computing system or environment 140; see par 35 - A module may be at least partially implemented in software for execution by various types of processors) to:
The remaining limitations are similar to claim 1 above. It would be obvious to combine Urabe and Berg for the same reasons as claim 1.
Concerning independent claim 8, Urabe and Berg disclose:
A non-transitory computer readable medium storing a program, wherein execution of the program causes a computer to perform operations comprising (Urabe – see page 2, col. 1, section III - visualization tool… visualizes a flowchart of user actions; second feature enables analyzers to modify the visualization granularity on the visualization display; collected user actions on the timing of UI (User interface);
Berg ‘354 see par 27 – computing system or environment 140; see par 35 - A module may be at least partially implemented in software for execution by various types of processors; See par 80 - methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor) to:
The remaining limitations are similar to claim 1 above. It would be obvious to combine Urabe and Berg for the same reasons as claim 1.
Concerning claim 2, Urabe and Berg disclose:
The information processing apparatus according to claim 1,
wherein the analysis unit is configured to specify business operation information indicating content of a business operation of the operation log (Urabe 2019 – see page 2, col. 2, last paragraph – page 3, col. 1, 1st paragraph - The targeted operational process we used for the evaluation was to submit business trip applications on a web-based system. Therefore, we used DOM (Document Object Model) [8] to recognize UI state change and collected user action data and screenshots of the active window. We collected 29 submissions by seven people who had business trips in March and April in 2019; see page 2, col. 1, Section III – collected user actions include “case ID (text)”; See page 3, col. 2, 2nd paragraph – “input basic information – choose expense type”;
see also Berg par 67 - Discovery, mining, or identification of distinct processes and state transitions using sequence or pattern extraction may be desirable for small and medium enterprises (SMEs), centers of excellence (COEs), single user, business analysis (BA), RPA development; see par 68 – documenting human business operations), specify work information indicating content of work of the operation log included in each piece of the specified business operation information (Berg see par 41 - A log file may include one or more screenshots, screenshots per customer, tree data, domain object model (DOM) tree data, UI elements, one or more selectors, an action type, clipboard content, user computing activity, or the like. See par 65 - Sequence or pattern extraction may be repeated to determine a hierarchical flow of tasks, actions, or the like. Actions described using hierarchical flow representation may result in multi-level abstraction based on keywords and obfuscated versions of screenshots. A discovered flow may be prioritized for the RPA.), and specify case information of the operation log included in each piece of the specified content of work (Urabe 2019, page 2, col. 1, 4th paragraph - We collected user actions on the timing of UI (user interface) state change based on Nakajima’s idea [6]. The information we collected were, user name (text), action time (text), case ID (text), application (text), window title (text), UI location (text), and screenshot of the window (image) (hereinafter referred to as “screenshot”). User name is a user who executed the action. Action time is the execution time of the action. Case ID is a unique identifier of individual cases. Application and window title are information of window which the action was executed. UI location is the actual location of the UI state change. Screenshot is an image of the window which the action was executed.), and
the visualization unit is configured to visualize an image that includes objects of operation logs in predetermined units, which are selectable by the operation of the user, on the basis of the work information, the business operation information, and the case information specified by the analysis unit (Urabe – see page 2, col. 2, 2nd paragraph - We also created a function that enables analyzers to modify the visualization granularity according to their analytical purposes. This function lets analyzers to select nodes and group them into one node to show a process in higher granularity. On the other hand, analyzers can choose the grouped node and ungroup them to visualize more detailed information. The function also updates the visualization result along with the modification so that analyzers can check the result in an instance. Fig. 2 is an example of a user’s modification on the visualization display. The initial visualization result is visualized per user action (Fig. 2, upper box) and the goal of the visualization result is to visualize per task. In this example, an analyzer selects two nodes (Fig. 2, blue square in the upper box) and click “group” and add a label “Type Destination”.).
It would be obvious to combine Urabe and Berg for the same reasons as claim 1. In addition, Urabe discloses having “case ID (text), application (text).” See page 2, col. 1. Berg improves upon Urabe by also having “clipboard content” and describing actions which can use keywords (See par 41, 65).
Concerning claim 4, Urabe and Berg disclose:
The information processing apparatus according to claim 1, wherein the visualization unit is configured to visualize an image including nodes indicating operation content (Urabe – see page 2, col. 1, last paragraph - Fig. 1 is an example of the visualization result which two cases are visualized together as one flowchart. Actions that have the same application, window title, and UI location were expressed as one node and visualized using screenshot; see page 2, col. 2, 1st paragraph - We also displayed the number of case ID the node occurred on the node to show how common the certain node occurs in the process) and edges indicating operation orders on the basis of the attribute information, as the image including the object (Urabe – see page 2, col. 1, last paragraph – Edges were created by following the action time of each action from each case. We added start node before the first action in each case and end node after the last action in each case; see page 2, col. 2, 1st paragraph - We displayed transition possibility between nodes and average action time on edges which is important for analyzers to discover frequent actions and the amount of time to execute actions. For example, “100% 6 sec” in Fig. 1 means that all actions of node “(1)” transit to node “(2)” and it takes 6 seconds on average to execute node “(1)”; see page 3, col. 1, 1st paragraph - As an initial visualization result (Fig. 3), there were 101 nodes (type of actions including start and end) and 278 edges between nodes.).
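The “100% 6 sec” edge annotation quoted above reduces to per-edge aggregation. As a hypothetical sketch (invented names; not Urabe’s code):

    from collections import defaultdict

    def edge_statistics(traversals):
        """traversals: iterable of (from_node, to_node, seconds), where seconds
        is the time spent executing from_node before moving to to_node.
        Returns {(from, to): (transition_percent, avg_seconds)}."""
        out_counts = defaultdict(int)      # total departures from a node
        edge_counts = defaultdict(int)     # departures along each edge
        edge_seconds = defaultdict(float)  # accumulated execution time
        for frm, to, secs in traversals:
            out_counts[frm] += 1
            edge_counts[(frm, to)] += 1
            edge_seconds[(frm, to)] += secs
        return {e: (100.0 * n / out_counts[e[0]], edge_seconds[e] / n)
                for e, n in edge_counts.items()}

    # If every traversal of node "(1)" goes to "(2)" taking 6 s on average:
    # edge_statistics([("(1)", "(2)", 6.0)]) -> {("(1)", "(2)"): (100.0, 6.0)}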
Concerning claim 5, Urabe and Berg disclose:
The information processing apparatus according to claim 1, wherein the visualization unit is configured to visualize captured screens of an operation screen in a chronological order on the basis of the attribute information, as the image including the object (Urabe, page 2, col. 1, 4th paragraph – The information we collected were, user name (text), action time (text), case ID (text), application (text), window title (text), UI location (text), and screenshot of the window (image) (hereinafter referred to as “screenshot”). Screenshot is an image of the window which the action was executed. Urabe, page 2, col. 1, last paragraph - Actions that have the same application, window title, and UI location were expressed as one node and visualized using screenshot. See page 3, col. 1, 1st paragraph and FIG. 2 - nodes (type of actions including start and end) and 278 edges between nodes; see FIG. 2, 3 – a series of screenshots from “Start” on left to “end” on right;
Berg – see par 65 - Sequence or pattern extraction may be repeated to determine a hierarchical flow of tasks, actions, or the like. Actions described using hierarchical flow representation may result in multi-level abstraction based on keywords and obfuscated versions of screenshots).
It would be obvious to combine Urabe and Berg for the same reasons as claim 1.
Concerning claim 6, Urabe and Berg disclose:
The information processing apparatus according to claim 4, wherein the cooperation unit is configured to display the nodes visualized by the visualization unit and connect the nodes on the basis of an operation of the user (Applicant’s [0199] as published states “As illustrated in FIG. 31, the RPA cooperation unit 12c has a function capable of displaying the operation nodes transmitted from the visualization unit 12b and connecting the nodes by the operation of the user in order for the user to create the desired automatic operation flow. The RPA cooperation unit 12c displays the nodes visualized by the visualization unit 12b and connects the nodes to each other on the basis of a user operation. For example, as illustrated as an example in FIG. 31, the RPA cooperation unit 12c can edit the automatic operation flow by moving objects and connecting objects by arrows on the basis of the objects transmitted from the visualization unit 12b.”). Urabe discloses the limitations under the broadest reasonable interpretation in light of the specification – see page 3, col. 2, 4th paragraph - it is necessary to consider the order of the nodes selected by users to realize this way of grouping.
Berg ‘354 also discloses the limitations under the broadest reasonable interpretation in light of the specification – see par 65 - Sequence or pattern extraction may be repeated to determine a hierarchical flow of tasks, actions, or the like. Actions described using hierarchical flow representation may result in multi-level abstraction based on keywords and obfuscated versions of screenshots. A discovered flow may be prioritized for the RPA; see par 66 - In certain configurations, sequence or pattern extraction may check occurrences in the log file or record file based on action type and frequency. Actions that occur in a time period may be labeled, characterized, distinguished, or the like. A set of subsequent actions may be pre-defined, set, user defined, or the like; see par 79 - A sequencing model for the captured manual workflow may be determined or selected (604).
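The claim 6 behavior addressed above – displaying visualized nodes and connecting them by user operations to build an automatic operation flow (cf. Applicant’s [0199], FIG. 31) – can be modeled, purely hypothetically and with invented names, as incremental edge insertion:

    class OperationFlowEditor:
        """Minimal model: the user draws arrows between displayed nodes,
        and the ordered arrows define the automatic operation flow."""

        def __init__(self, nodes):
            self.nodes = set(nodes)
            self.arrows = []  # ordered (source, target) pairs

        def connect(self, source, target):
            # Invoked when the user connects two displayed nodes by an arrow.
            if source not in self.nodes or target not in self.nodes:
                raise ValueError("both endpoints must be visualized nodes")
            self.arrows.append((source, target))

        def flow(self):
            # The resulting sequence is the edited automatic operation flow.
            return list(self.arrows)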
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Urabe et al., “Visualizing user action data to discover business process,” in 2019 20th Asia-Pacific Network Operations and Management Symposium (APNOMS), IEEE, 2019, pages 1-4, and Berg (US 2021/0086354), as applied to claims 1-2 and 4-8 above, and further in view of Luo (US 2018/0113783).
Concerning claim 3, Urabe and Berg disclose:
The information processing apparatus according to claim 1, wherein the visualization unit is configured to visualize a time line as the object in a plurality of hierarchies on the basis of any one or more of attributes from among the user who has performed the operation, a terminal with which the operation has been performed, a date on which the operation has been performed, a name of an application as an operation target, a window title of the operation target, and a file name of the operation target, as the image including the object.
First, Examiner interprets the language “any one or more of attributes from among…” as reciting, in the alternative, the limitations of “terminal, date, name, window title, file name, target” being in the image.
Second, Examiner notes that this content is nonfunctional descriptive material not entitled to patentable weight, see MPEP 2111.05, as these items (terminal (listing the computer of a user), date, name, window title, file name, target) in the image are only directed to conveying a message or meaning to a human reader independent of the recited computer system/processor. They do not have a functional relationship with the computer.
Nonetheless, for purposes of compact prosecution, art is still applied:
Urabe – see page 3, col. 1, 1st paragraph - As an initial visualization result (Fig. 3), there were 101 nodes (type of actions including start and end) and 278 edges between nodes; see page 3, FIG. 3 – showing a timeline with “start” on the left to “end” on the right, and a number of different hierarchies.
[Two greyscale screen images (media_image1.png, media_image2.png) are reproduced here in the original.]
Berg discloses that visualizations can be generated with “TimelinePI” for automatic workflow and processing understanding, and the query systems of record (SOR) may be utilized for building workflows for RPA automation (See par 45).
To any extent Urabe and Berg do not disclose the limitations, Luo discloses:
The information processing apparatus according to claim 1, wherein the visualization unit is configured to visualize a time line as the object “in a plurality of hierarchies” on the basis of any one or more of attributes from among the user who has performed the operation, a terminal with which the operation has been performed, a date on which the operation has been performed, a name of an application as an operation target, a window title of the operation target, and a file name of the operation target, as the image including the object (Luo – see par 77 – log analyzer analyzes log data; The log analyzer 24 may include instructions to construct an S3 graph, or similar visualization, and instantiate the S3 graph with object instances. Such a graph may have nodes defined by object identifiers. Events associated with the object identifiers may be determined and included in the visual representation. This may take the form of an event timeline; see par 25 – “identifier” can include “process ID”; see par 114 - An example GUI 26 is shown in FIG. 12. Objects are organized hierarchically allowing users to understand the system’s structure as they drill down on each object. Each line represents an object with its IDs listed in the left panel. Users can drill down to objects at the next level by selecting (e.g., clicking) on the object. Each circle in the right panel represents an individual event, and its shading indicates the host 12 where the corresponding log message was outputted. See par 115 - Clearly visible is that user1’s jobs start processing as soon as user3 releases its containers. The vertical lines show the interactions among objects, which are inferred from the events that included multiple objects. It shows that user1’s Query 0437 has attempts that were created early on but only received containers much later).
Urabe, Berg, and Luo are analogous art, as they are directed to understanding processes from logs (see Urabe Abstract, page 1, col. 2, 2nd paragraph; Berg Abstract; Luo Abstract, par 25). Urabe discloses having a visualization with nodes for start and end (see page 3, FIG. 3). Berg discloses that visualizations can be generated with “TimelinePI” for automatic workflow and processing understanding, and that query systems of record (SOR) may be utilized for building workflows for RPA automation (see par 45). Luo improves upon Urabe and Berg by showing a timeline relative to vertically, hierarchically arranged activities of a user (see FIG. 12, 13). One of ordinary skill in the art would have been motivated to include vertically, hierarchically arranged activities along a timeline to efficiently improve upon Urabe’s visualization tool for user actions on a UI along a timeline with a set of hierarchies and the “TimelinePI”-generated visualization in Berg.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the visualization tool for user actions on a UI for different case IDs in Urabe to execute software in a processor (see Berg par 27, 35, 80) and to analyze log files (see Berg par 41) as disclosed in Berg, and to further visualize objects in a vertical hierarchy along a timeline as disclosed in Luo, since the claimed invention is merely a combination of old elements, in combination each element merely would have performed the same function as it did separately, one of ordinary skill in the art would have recognized that the results of the combination were predictable, and there is a reasonable expectation of success.
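To illustrate the claim 3 arrangement discussed above – a time line visualized in a plurality of hierarchies keyed to attributes such as user, terminal, date, application name, window title, or file name – a hypothetical sketch (invented names; not drawn from Urabe, Berg, or Luo) is:

    from collections import defaultdict

    def timeline_hierarchy(entries, attribute_chain):
        """entries: dicts with keys such as "user", "terminal", "date",
        "application", "window_title", "file_name", and a sortable "time".
        attribute_chain: one attribute per hierarchy level, e.g.
        ["user", "application"]. Returns nested dicts whose leaves are
        chronologically sorted entries (one timeline per innermost group)."""
        if not attribute_chain:
            return sorted(entries, key=lambda e: e["time"])
        head, *rest = attribute_chain
        groups = defaultdict(list)
        for e in entries:
            groups[e[head]].append(e)
        return {k: timeline_hierarchy(v, rest) for k, v in groups.items()}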
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Cho (US 2022/0114512) – directed to obtaining a log file generated while a user performs a task to look for repetitive tasks (See Abstract)
Kunnath (US 2022/00664422) – directed to using a visualization of a workflow for RPA (robotic process automation)
Ma (US 2020/0206920) – directed to recording event streams of a human interacting with a computing device to identify candidate processes for robotic automation (See Abstract)
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IVAN R GOLDBERG whose telephone number is (571) 270-7949. The examiner can normally be reached 8:30 AM - 4:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anita Coupe can be reached at 571-270-3614. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/IVAN R GOLDBERG/ Primary Examiner, Art Unit 3619