DETAILED ACTION
This is the initial Office action based on the application filed on December 30, 2022.
Claims 1-20 are pending.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
Claim 18 is rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention.
Claim 18 recites the limitation “the attributes of the model of the attributes” in line 2 and the limitation “the attributes of a model of attributes” in lines 3-4. There is insufficient antecedent basis for these limitations in the claim. In the interest of compact prosecution, the Examiner interprets the limitations as --attributes of the model of the attributes-- and --attributes of a model of attributes--, respectively, in Claim 18.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-3 and 5-13 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Claim Interpretation: Under the broadest reasonable interpretation (BRI), the limitations of Claim 1 are presumed to have their plain meaning consistent with the specification as it would be interpreted by one of ordinary skill in the art. See MPEP § 2111.
Step 1: Claim 1 is directed to a method, which is a process (a series of steps or acts), and falls within one of the statutory categories of invention.
Step 2A, Prong One: Claim 1 recites the limitation:
constructing a model of the attributes of the user interface elements including an attribute indicating functionality of at least one user interface element.
This recited step, under the broadest reasonable interpretation (BRI), covers performance of the step in the human mind alone or with the aid of pen and paper. That is, other than reciting:
by a processor set.
Nothing in the claim precludes the step from practically being performed in the human mind alone using observation, evaluation, judgment, and opinion or with the aid of pen and paper. For example, the constructing limitation, in the context of the claim, encompasses a human observing and evaluating attributes of user interface elements to mentally construct a model of those attributes with the aid of pen and paper. See MPEP § 2106.04(a)(2)(III).
If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the human mind alone or with the aid of pen and paper but for the recitation of generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claim recites an abstract idea.
Step 2A, Prong Two: This judicial exception is not integrated into a practical application. In particular, the claim recites the additional element:
by a processor set.
The additional element (1) is recited at a high-level of generality such that it amounts to no more than mere instructions to apply the judicial exception using generic computer components. The processor set is used as a tool to perform the receiving, constructing, and storing steps of the claim. See MPEP § 2106.05(f).
Also, the claim recites the additional elements:
receiving user interface context identification information of attributes of user interface elements in user interface code of an application;
obtaining from the model of the attributes of the user interface elements an identification of the at least one user interface element referenced in an action command for performing the functionality of the at least one user interface element in a robotic process automation code; and
storing the model of the attributes of the user interface elements in persistent storage.
The additional elements (2) to (4) are mere data gathering/transmitting recited at a high level of generality, and thus are insignificant extra-solution activities. See MPEP § 2106.05(g). Furthermore, all uses of the recited judicial exception require such data gathering/transmitting, and, as such, the additional elements do not impose any meaningful limits on the claim. The additional elements amount to necessary data gathering/transmitting. See MPEP § 2106.05.
Accordingly, even when viewed in combination, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements when considered both individually and as a combination do not amount to significantly more than the abstract idea. As discussed above with respect to integration of the abstract idea into a practical application, the claim recites the additional element:
by a processor set.
The additional element (1) amounts to no more than mere instructions to apply the judicial exception using generic computer components. Mere instructions to apply a judicial exception using generic computer components cannot provide an inventive concept.
Also, the claim recites the additional elements:
receiving user interface context identification information of attributes of user interface elements in user interface code of an application;
obtaining from the model of the attributes of the user interface elements an identification of the at least one user interface element referenced in an action command for performing the functionality of the at least one user interface element in a robotic process automation code; and
storing the model of the attributes of the user interface elements in persistent storage.
The additional elements (2) to (4) simply append well-understood, routine, and conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, which is not indicative of an inventive concept. MPEP § 2106.05(d)(II) expressly states that the courts have recognized the computer function of receiving or transmitting data over a network, e.g., using the Internet to gather data, as a well-understood, routine, and conventional computer function when it is claimed in a merely generic manner (e.g., at a high level of generality) or as an insignificant extra-solution activity. Thus, a person of ordinary skill in the art would readily comprehend that it is well-understood, routine, and conventional in the computing art to receive user interface context identification information of UI attributes, obtain an identification of a UI element from a model of UI attributes, and store the model of attributes in persistent storage. Therefore, the limitations remain insignificant extra-solution activities even upon reconsideration and do not amount to significantly more.
Thus, taken alone, the additional elements do not amount to significantly more than the above-identified judicial exception (the abstract idea). Looking at the additional elements as a combination adds nothing that is not already present when looking at the additional elements taken individually. Even when considered in combination, the additional elements represent mere instructions to apply a judicial exception using generic computer components and insignificant extra-solution activities, and therefore do not provide an inventive concept. The claim is not patent eligible.
Claims 2, 3, and 5-13 are rejected under 35 U.S.C. 101 as directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more for at least the reasons stated above.
Claim 2 recites the limitations:
by the processor set;
generating the robotic process automation code referencing the at least one user interface element identified in the model of the attributes of the user interface elements.
Claim 3 recites the limitation:
executing, by the processor set, the robotic process automation code referencing the at least one user interface element identified in the model of the attributes of the user interface elements.
Claim 5 recites the limitations:
by the processor set;
scanning the user interface code of the application.
Claim 6 recites the limitations:
by the processor set;
collecting the user interface context identification information of the attributes of the user interface elements, including relationships of the user interface elements to parent, children and peer user interface panels.
Claim 7 recites the limitations:
by the processor set;
formatting the user interface context identification information of the attributes of the user interface elements.
Claim 8 recites the limitation:
the attributes of the user interface elements in the user interface code of the application are specified by a markup language.
Claim 9 recites the limitation:
the attributes of the user interface elements in the user interface code comprise an element type.
Claim 10 recites the limitation:
the attributes of the user interface elements in the user interface code comprise an element label.
Claim 11 recites the limitation:
the attributes of the user interface elements in the user interface code comprise an element event.
Claim 12 recites the limitation:
the attributes of the user interface elements in the user interface code comprise an element dependency.
Claim 13 recites the limitations:
by the processor set
storing the robotic process automation code in persistent storage.
These claims depend directly or indirectly from Claim 1, but do not add any feature or subject matter that would cure the judicial exception deficiencies of Claim 1.
Claims 2, 5, and 7 (additional element (b) recited in Claim 2, additional element (b) recited in Claim 5, and additional element (b) recited in Claim 7) recite further mental steps that can be practically performed in the human mind alone using observation, evaluation, judgment, and opinion or with the aid of pen and paper, and thus fail to make the claims any less abstract (see MPEP § 2106.04(a)(2)(III)).
Claims 2, 3, and 5-13 recite further additional elements that do not integrate the judicial exception into a practical application. Specifically, the additional element (a) recited in each of Claims 2, 3, 5, 6, 7, and 13 fails to meaningfully limit the claim because it amounts to no more than mere instructions to apply the judicial exception using generic computer components/functions. See MPEP § 2106.05(f). The additional element (a) recited in each of Claims 8, 9, 10, 11, and 12 fails to meaningfully limit the claim because it amounts to merely indicating a field of use or technological environment in which to apply the judicial exception, which does not amount to significantly more than the exception itself and cannot integrate the judicial exception into a practical application. See MPEP § 2106.05(h). The additional element (b) recited in Claim 6 and the additional element (b) recited in Claim 13 fail to meaningfully limit the claims because they are mere data gathering/transmitting recited at a high level of generality, and thus are insignificant extra-solution activities. See MPEP § 2106.05(g). Therefore, Claims 2, 3, and 5-13, when considered both individually and as a combination, fail to integrate the abstract idea into a practical application.
The additional elements recited in Claims 2, 3, and 5-13 are also not sufficient to amount to significantly more than the judicial exception. Specifically, the additional element (a) recited in each of Claims 2, 3, 5, 6, 7, and 13 does not amount to significantly more than the abstract idea because it amounts to no more than mere instructions to apply the judicial exception using generic computer components/functions, which cannot provide an inventive concept. See MPEP § 2106.05(f). The additional element (a) recited in each of Claims 8, 9, 10, 11, and 12 does not amount to significantly more because it amounts to merely indicating a field of use or technological environment in which to apply the judicial exception, which does not amount to significantly more than the exception itself. See MPEP § 2106.05(h). The additional element (b) recited in Claim 6 and the additional element (b) recited in Claim 13 do not amount to significantly more because they are mere data gathering/transmitting recited at a high level of generality, and thus are insignificant extra-solution activities that simply append well-understood, routine, and conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, which is not indicative of an inventive concept. MPEP § 2106.05(d)(II) expressly states that the courts have recognized the computer function of receiving or transmitting data over a network, e.g., using the Internet to gather data, as a well-understood, routine, and conventional computer function when it is claimed in a merely generic manner (e.g., at a high level of generality) or as an insignificant extra-solution activity. Therefore, Claims 2, 3, and 5-13 do not add any steps or additional elements, when considered both individually and as a combination, that amount to significantly more than the above-identified judicial exception and that would convert Claim 1 into patent-eligible subject matter.
Claims 1-3 and 5-13 are therefore not drawn to patent-eligible subject matter as they are directed to an abstract idea without significantly more.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 7-10, 13, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over US 2020/0401431 (hereinafter “Rashid”) in view of US 2022/0229762 (hereinafter “Stan”).
As per Claim 1, Rashid discloses:
A method, comprising:
receiving, by a processor set, user interface context identification information of attributes of user interface elements in [webpage] (paragraph [0060], “At 510, the method can retrieve [receive] the DOM properties for a current webpage (e.g., by requesting them from a DOM interface). At 520, the method can harvest supplemental properties for the webpage (e.g., from internal or external services). Then at 530, the method can combine the DOM properties and supplemental properties for the current webpage and save them into the current master data frame (emphasis added).”; paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties associated with the UI control elements [user interface context identification information] can be obtained from the DOM which can be stored as a tree structure and represent the logical structure of a current webpage (emphasis added).”; paragraph [0124], “The processing units 1910, 1915 execute computer-executable instructions, such as for implementing the features described in the examples herein (emphasis added).”);
constructing, by the processor set, a model of the attributes of the user interface elements including an attribute indicating functionality of at least one user interface element (paragraph [0060], “FIG. 5 is a flowchart 500 of an example method of constructing a current master data frame [model] and can be implemented, for example, by the system of FIG. 4. At 510, the method can retrieve the DOM properties for a current webpage (e.g., by requesting them from a DOM interface). At 520, the method can harvest supplemental properties for the webpage (e.g., from internal or external services). Then at 530, the method can combine the DOM properties and supplemental properties [attributes of UI elements] for the current webpage and save them into the current master data frame [model]. The current master data frame can then be used for matching purposes (e.g., against archived master data frames that have been previously saved for the user interface) (emphasis added).”; paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties [attributes] associated with the UI control elements can be obtained from the DOM which can be stored as a tree structure and represent the logical structure of a current webpage (emphasis added).”; paragraph [0062], “Thus, the master data frame described herein provides an internal data structure that maintains a unique identification profile for a UI control element in a webpage. Such unique identification profile can include not only direct properties obtained from the DOM (DOM properties), but also supplemental properties such as the ARIA properties that are extrinsic to the DOM structure. As an example, the following lists some properties [attributes] of a unique identification “profile” corresponding to a “Save” button in a webpage:
Internal_id = fvkihs8y8syijsbjsbkj
/* Example direct properties obtained from DOM */
Type = clickable [indicating functionality]
Role = button
Label = Save
Id = UI5_98799
Class = [‘class1’ ‘class2’, . .. . ] […] (emphasis added).”; paragraph [0068], “Incorporating the ARIA properties [attributes of UI elements] into the master data frame can facilitate efficient identification of a missing target UI control element. For example, a webpage upgrade may render a button to a link, or change a field to a dropdown list, or alter the appearance of a label, etc […] However, because the ARIA properties associated with such UI control elements remain unchanged, an ARIA-based RPA engine can reliably find those UI control elements based on their functional equivalence in terms of ARIA properties (emphasis added).”; paragraph [0124], “The processing units 1910, 1915 execute computer-executable instructions, such as for implementing the features described in the examples herein (emphasis added).”);
obtaining from the model of the attributes of the user interface elements an identification of the at least one user interface element referenced in an action command for performing the functionality of the at least one user interface element in a robotic process automation code (paragraph [0032], “When activated, the RPA engine 110 can run the RPA scripts 140 to automatically identify and interact with a plurality of UI control elements in the currently displaying webpage. In practice, a script can instruct the RPA engine 110 to interact with and monitor the user interface (e.g., enter values, activate graphical buttons, read fields, and the like) [action command for performing the functionality of at least one user interface element in a robotic process automation code]. As described herein, a script can refer to user interface elements via an identifier (emphasis added).”; paragraph [0074], “At 810, during robotic process automation (RPA) processing (e.g., execution of an RPA script that interacts with the currently displayed user interface that is represented by a current DOM), the adaptive control finder can receive a request [action command] from the RPA engine to identify a target UI control element in a current webpage. The current webpage can be represented by a current master data frame [model of attributes of UI elements], which can include a current DOM (or at least contain properties of the current DOM) (emphasis added).”; paragraph [0077], “At 840, the adaptive control finder can find an equivalent UI control element in the current master data frame (e.g., in the current UI represented by the current DOM) based at least on the archived version of the target UI control element as described further in FIG. 9. As described herein, the equivalent user interface control element can be an element in the current web page that is functionally equivalent to the target user interface control element in the archived webpage (emphasis added).”; paragraph [0078], “At 850, the adaptive control finder can output the equivalent UI control element found at 840 (e.g., by returning an identifier associated with the equivalent UI control element that is in the current DOM). The identifier can then be used to find the UI control element and perform the instructed operation on the equivalent UI control element (in place of the missing one) (emphasis added).”); and
storing, by the processor set, the model of the attributes of the user interface elements in persistent storage (paragraph [0061], “In any of the examples herein, the constructed current master data frame [model of attributes of UI elements] can be stored in archived master data frames at 540 (e.g., for later matching purposes) (emphasis added).”; paragraph [0131], “Any of the computer-readable media herein can be non-transitory (e.g., volatile memory such as DRAM or SRAM, nonvolatile memory such as magnetic storage, optical storage, or the like) and/or tangible. Any of the storing actions described herein can be implemented by storing in one or more computer-readable media (e.g., computer-readable storage media or other tangible media). Any of the things (e.g., data created and used during implementation) described as stored can be stored in one or more computer-readable media (e.g., computer-readable storage media or other tangible media) (emphasis added).”; paragraph [0124], “The processing units 1910, 1915 execute computer-executable instructions, such as for implementing the features described in the examples herein (emphasis added).”).
Rashid does not explicitly disclose:
in user interface code of an application.
However, Stan discloses:
in user interface code of an application (paragraph [0047], “In some embodiments, element IDs are included in a source code of user interface 38, for instance as a set of attribute-value pairs […] Exemplary source code comprises an HTML document which is rendered as a webpage by a web browser application (emphasis added).”).
Rashid and Stan are both within the same field of endeavor as the claimed invention, namely the use of robotic process automation (RPA) to locate an updated or alternative user interface element.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “in user interface code of an application.” The modification would be obvious because one of ordinary skill in the art would be motivated to use attributes of user interface elements in user interface code of an application, such as an element ID, to “enable a successful and ideally unambiguous identification by RPA robot” of a target user interface element (Stan, paragraph [0047]).
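For illustration only, the Examiner provides the following sketch of how a model of user interface element attributes, including an attribute indicating functionality, might be represented and queried to resolve the element referenced in an action command. The sketch is the Examiner's own, uses hypothetical names and values, and is not drawn from Rashid, Stan, or the instant application.

    # Illustrative only: hypothetical attribute model and lookup; names are the Examiner's own.
    # A "model" of user interface element attributes, analogous to a profile of
    # attribute name/value pairs for each element.
    ui_attribute_model = [
        {"internal_id": "btn-001", "type": "clickable", "role": "button", "label": "Save"},
        {"internal_id": "fld-002", "type": "editable", "role": "textbox", "label": "Name"},
    ]

    def resolve_target(model, label, functionality):
        # Return the identifier of the element whose attributes match the requested label
        # and whose functionality-indicating attribute matches the requested functionality.
        for element in model:
            if element["label"] == label and element["type"] == functionality:
                return element["internal_id"]
        return None

    # An action command in an RPA script might then reference the resolved identifier,
    # e.g., click(target_id), to perform the functionality of the identified element.
    target_id = resolve_target(ui_attribute_model, label="Save", functionality="clickable")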
As per Claim 2, the rejection of Claim 1 is incorporated; and Rashid discloses “the model of the attributes of the user interface elements (paragraph [0060], “FIG. 5 is a flowchart 500 of an example method of constructing a current master data frame [model] and can be implemented, for example, by the system of FIG. 4. At 510, the method can retrieve the DOM properties for a current webpage (e.g., by requesting them from a DOM interface). At 520, the method can harvest supplemental properties for the webpage (e.g., from internal or external services). Then at 530, the method can combine the DOM properties and supplemental properties [attributes of UI elements] for the current webpage and save them into the current master data frame [model]. The current master data frame can then be used for matching purposes (e.g., against archived master data frames that have been previously saved for the user interface) (emphasis added).”; paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties [attributes] associated with the UI control elements can be obtained from the DOM which can be stored as a tree structure and represent the logical structure of a current webpage (emphasis added).”),” but does not explicitly disclose:
generating, by the processor set, the robotic process automation code referencing the at least one user interface element identified in the model of the attributes of the user interface elements.
However, Stan discloses:
generating, by the processor set, the robotic process automation code referencing the at least one user interface element identified in [user input] (paragraph [0080], “In some embodiments, in response to receiving user input indicating a new target element for a selected activity, robot 12 may transmit a target update indicator 66 to a script editing module 38, which may execute on same RPA host as robot 12 or remotely, on another host being part of the respective RPA environment. Target update indicator 66 may comprise an indicator of a selected RPA script 40, an indicator of a selected RPA activity, and an indicator of a selected target UI element (e.g., a runtime selector characterizing the respective target). These indicators may collectively communicate to script editing module 38 to update the respective RPA script by changing the current target for the respective RPA activity to the new target specified by target update indicator 66. Module 38 may carry out the respective edit, to produce [generate] an updated RPA script 140 which may be further distributed to robots executing the respective automation (emphasis added).”; paragraph [0022], “According to some embodiments, the present invention provides, inter alia, computer systems comprising hardware (e.g. one or more processors) programmed to perform the methods described herein, as well as computer-readable media encoding instructions to perform the methods described herein (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “generating, by the processor set, the robotic process automation code referencing the at least one user interface element identified in the model of the attributes of the user interface elements.” The modification would be obvious because one of ordinary skill in the art would be motivated to generate and execute RPA code referencing a new/alternative target UI element to “provide a straightforward and intuitive way of fixing the robot by redirecting it to the correct target element so that it may continue executing the current automation” which helps solve the problem associated with costly/cumbersome debugging when a target UI of an interface changes (Stan, paragraphs [0085 & 0091]).
As per Claim 3, the rejection of Claim 1 is incorporated; and Rashid discloses “the model of the attributes of the user interface elements (paragraph [0060], “FIG. 5 is a flowchart 500 of an example method of constructing a current master data frame [model] and can be implemented, for example, by the system of FIG. 4. At 510, the method can retrieve the DOM properties for a current webpage (e.g., by requesting them from a DOM interface). At 520, the method can harvest supplemental properties for the webpage (e.g., from internal or external services). Then at 530, the method can combine the DOM properties and supplemental properties [attributes of UI elements] for the current webpage and save them into the current master data frame [model]. The current master data frame can then be used for matching purposes (e.g., against archived master data frames that have been previously saved for the user interface) (emphasis added).”; paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties [attributes] associated with the UI control elements can be obtained from the DOM which can be stored as a tree structure and represent the logical structure of a current webpage (emphasis added).”),” but does not explicitly disclose:
executing, by the processor set, the robotic process automation code referencing the at least one user interface element identified in the model of the attributes of the user interface elements.
However, Stan discloses:
executing, by the processor set, the robotic process automation code referencing the at least one user interface element identified in [user input] (Figure 9: 214, 226; paragraph [0078], “In such embodiments, in a step 226 robot 12 may receive user input indicating a selected candidate as runtime target for the current RPA activity. In response, robot 12 may attempt to carry out [execute] the respective activity on the selected candidate UI element (emphasis added).”; paragraph [0080], “In some embodiments, in response to receiving user input indicating a new target element for a selected activity, robot 12 may transmit a target update indicator 66 to a script editing module 38 […] Target update indicator 66 may comprise an indicator of a selected RPA script 40, an indicator of a selected RPA activity, and an indicator of a selected target UI element (e.g., a runtime selector characterizing the respective target). These indicators may collectively communicate to script editing module 38 to update the respective RPA script by changing the current target for the respective RPA activity to the new target specified by target update indicator 66. Module 38 may carry out the respective edit, to produce an updated RPA script 140 which may be further distributed to robots executing the respective automation (emphasis added).”; paragraph [0091], “A second manner in which displaying alternative targets may facilitate debugging is that it may provide a straightforward and intuitive way of fixing the robot by redirecting it to the correct target element so that it may continue executing the current automation (emphasis added).”; paragraph [0022], “According to some embodiments, the present invention provides, inter alia, computer systems comprising hardware (e.g. one or more processors) programmed to perform the methods described herein, as well as computer-readable media encoding instructions to perform the methods described herein (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “executing, by the processor set, the robotic process automation code referencing the at least one user interface element identified in the model of the attributes of the user interface elements.” The modification would be obvious because one of ordinary skill in the art would be motivated to generate and execute RPA code referencing a new/alternative target UI element to “provide a straightforward and intuitive way of fixing the robot by redirecting it to the correct target element so that it may continue executing the current automation” which helps solve the problem associated with costly/cumbersome debugging when a target UI of an interface changes (Stan, paragraphs [0085 & 0091]).
As per Claim 7, the rejection of Claim 1 is incorporated; and Rashid further discloses:
formatting, by the processor set, the user interface context identification information of the attributes of the user interface elements (paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties associated with the UI control elements [user interface context identification information] can be obtained from the DOM which can be stored as a tree structure [formatted as a tree structure] and represent the logical structure of a current webpage (emphasis added).”).
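For illustration only, the following sketch (the Examiner's own, with hypothetical names) shows one way that user interface context identification information could be formatted into a tree structure reflecting parent/child relationships, consistent with Rashid's description of the DOM being stored as a tree structure; it is not asserted to be the specific formatting of Rashid or the instant application.

    # Illustrative only: flat user interface context identification information,
    # i.e., attribute records for each element, with hypothetical values.
    ui_context = [
        {"id": "panel-1", "parent": None, "role": "panel"},
        {"id": "btn-save", "parent": "panel-1", "role": "button", "label": "Save"},
        {"id": "fld-name", "parent": "panel-1", "role": "textbox", "label": "Name"},
    ]

    def format_as_tree(records):
        # Group the flat attribute records under their parent elements, yielding a
        # nested (tree) structure of the user interface context information.
        nodes = {record["id"]: {**record, "children": []} for record in records}
        roots = []
        for node in nodes.values():
            parent_id = node.get("parent")
            if parent_id is None:
                roots.append(node)
            else:
                nodes[parent_id]["children"].append(node)
        return roots

    formatted = format_as_tree(ui_context)  # a single root panel with two child elements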
As per Claim 8, the rejection of Claim 1 is incorporated; and Rashid discloses “the attributes of the user interface elements […] are specified by a markup language (paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties associated with the UI control elements can be obtained from the DOM […] (emphasis added).”; paragraph [0051], “In practice, the change to the webpage (e.g., the HTML representation) results in a change to the DOM that represents the webpage. If an RPA script was written to rely on an identifier of the UI control element, it may have changed, and thus not be found during script execution (emphasis added).”; paragraph [0062], “Such unique identification profile can include not only direct properties obtained from the DOM (DOM properties), but also supplemental properties such as the ARIA properties that are extrinsic to the DOM structure. As an example, the following lists some properties of a unique identification “profile” corresponding to a “Save” button in a webpage:
Internal_id = fvkihs8y8syijsbjsbkj
/* Example direct properties obtained from DOM */
Type = clickable
Role = button
Label = Save
Id = UI5_98799
Class = [‘class1’ ‘class2’, . .. . ]
/* Example ARIA properties */
Aria_label = Save
Aria_role = button […] (emphasis added).”) [Examiner’s Remarks: Note that Rashid discloses UI elements being associated with properties that define attributes of a UI element, an example of a webpage being an HTML representation, the DOM that represents a webpage, and obtaining DOM properties that include UI element properties such as the example DOM and ARIA properties of a “Save” button. One of ordinary skill in the art would readily comprehend that the attributes of the UI elements from the UI element properties obtained from the DOM are specified by HTML (a markup language) since the DOM represents a webpage that is an HTML representation.],” but does not explicitly disclose:
in the user interface code of the application.
However, Stan discloses:
in the user interface code of the application (paragraph [0047], “In some embodiments, element IDs are included in a source code of user interface 38, for instance as a set of attribute-value pairs […] Exemplary source code comprises an HTML document which is rendered as a webpage by a web browser application (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “in the user interface code of the application.” The modification would be obvious because one of ordinary skill in the art would be motivated to use attributes of user interface elements in user interface code of an application, such as an element ID, to “enable a successful and ideally unambiguous identification by RPA robot” of a target user interface element (Stan, paragraph [0047]).
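For illustration only, the following sketch (the Examiner's own) shows user interface element attributes specified by a markup language (HTML) as attribute name/value pairs in source code, and their extraction with a generic parser. The sample markup reuses example attribute values quoted from Rashid (e.g., Id = UI5_98799, Aria_label = Save) but is otherwise hypothetical and does not represent the source code of either reference or of the instant application.

    # Illustrative only: hypothetical user interface source code in which element
    # attributes are specified by a markup language as attribute name/value pairs.
    from html.parser import HTMLParser

    ui_source = '<button id="UI5_98799" class="class1 class2" aria-label="Save">Save</button>'

    class AttributeCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.elements = []

        def handle_starttag(self, tag, attrs):
            # attrs arrives as (attribute name, value) pairs taken from the markup.
            self.elements.append({"tag": tag, **dict(attrs)})

    collector = AttributeCollector()
    collector.feed(ui_source)
    # collector.elements -> [{'tag': 'button', 'id': 'UI5_98799',
    #                         'class': 'class1 class2', 'aria-label': 'Save'}]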
As per Claim 9, the rejection of Claim 1 is incorporated; and Rashid discloses “the attributes of the user interface elements […] comprise an element type (paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In another example, the UI control element can be associated with a type property characterizing the type of the UI control element, e.g., {Type=checkbox} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties associated with the UI control elements can be obtained from the DOM […] (emphasis added).”; paragraph [0062], “Such unique identification profile can include not only direct properties obtained from the DOM (DOM properties), but also supplemental properties such as the ARIA properties that are extrinsic to the DOM structure. As an example, the following lists some properties of a unique identification “profile” corresponding to a “Save” button in a webpage:
Internal_id = fvkihs8y8syijsbjsbkj
/* Example direct properties obtained from DOM */
Type = clickable
Role = button […] (emphasis added).”),” but does not explicitly disclose:
in the user interface code.
However, Stan discloses:
in the user interface code (paragraph [0047], “In some embodiments, element IDs are included in a source code of user interface 38, for instance as a set of attribute-value pairs (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “in the user interface code.” The modification would be obvious because one of ordinary skill in the art would be motivated to use attributes of user interface elements in user interface code of an application, such as an element ID, to “enable a successful and ideally unambiguous identification by RPA robot” of a target user interface element (Stan, paragraph [0047]).
As per Claim 10, the rejection of Claim 1 is incorporated; and Rashid discloses “the attributes of the user interface elements […] comprise an element label (paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties associated with the UI control elements can be obtained from the DOM […] (emphasis added).”; paragraph [0062], “Such unique identification profile can include not only direct properties obtained from the DOM (DOM properties), but also supplemental properties such as the ARIA properties that are extrinsic to the DOM structure. As an example, the following lists some properties of a unique identification “profile” corresponding to a “Save” button in a webpage:
Internal_id = fvkihs8y8syijsbjsbkj
/* Example direct properties obtained from DOM */
Type = clickable
Role = button
Label = Save […] (emphasis added).”),” but does not explicitly disclose:
in the user interface code.
However, Stan discloses:
in the user interface code (paragraph [0047], “In some embodiments, element IDs are included in a source code of user interface 38, for instance as a set of attribute-value pairs (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “in the user interface code.” The modification would be obvious because one of ordinary skill in the art would be motivated to use attributes of user interface elements in user interface code of an application, such as an element ID, to “enable a successful and ideally unambiguous identification by RPA robot” of a target user interface element (Stan, paragraph [0047]).
As per Claim 13, the rejection of Claim 2 is incorporated; and Rashid does not explicitly disclose:
storing, by the processor set, the robotic process automation code in persistent storage.
However, Stan discloses:
storing, by the processor set, the robotic process automation code in persistent storage (paragraph [0039], “Database server 16 is configured to selectively store and/or retrieve data related to RPA environment 10 in/from database 18. Such data may include configuration parameters of various robots 12a-c, robot groups, as well as data characterizing workflows executed by various robots, and data characterizing users, roles, schedules, queues, etc […] Database server 16 and database 18 may employ any data storage protocol and format known in the art, such as structured query language (SQL), ElasticSearch®, and Redis®, among others (emphasis added).”; paragraph [0028], “Once a workflow is developed, it may be encoded in computer-readable form as a set of RPA scripts 40 (FIG. 2) (emphasis added).”; paragraph [0022], “According to some embodiments, the present invention provides, inter alia, computer systems comprising hardware (e.g. one or more processors) programmed to perform the methods described herein, as well as computer-readable media encoding instructions to perform the methods described herein (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “storing, by the processor set, the robotic process automation code in persistent storage.” The modification would be obvious because one of ordinary skill in the art would be motivated to store RPA code in persistent storage to ensure data is preserved and consistent, especially for automated repetitive tasks that improve productivity by freeing human operators to “perform more intellectually sophisticated and/or creative activities” (Stan, paragraphs [0003 & 0039]).
As per Claim 17, Rashid discloses:
A system comprising:
a processor set (paragraph [0124], “With reference to FIG. 19, the computing system 1900 includes one or more processing units 1910, 1915 and memory 1920, 1925 (emphasis added).”), one or more computer readable storage media, and program instructions collectively stored on the one or more computer readable storage media (paragraph [0132], “Any of the methods described herein can be implemented by computer-executable instructions in (e.g., stored on, encoded on, or the like) one or more computer-readable media (e.g., computer-readable storage media or other tangible media) or one or more computer-readable storage devices (e.g., memory, magnetic storage, optical storage, or the like) (emphasis added).”), the program instructions executable to:
execute an action command in a robotic process automation code referencing a user interface element of a previous version of an application (paragraph [0032], “When activated, the RPA engine 110 can run [execute] the RPA scripts 140 to automatically identify and interact with a plurality of UI control elements in the currently displaying webpage. In practice, a script can instruct the RPA engine 110 to interact with and monitor the user interface (e.g., enter values, activate graphical buttons, read fields, and the like) [action command]. As described herein, a script can refer to user interface elements via an identifier (emphasis added).”; paragraph [0074], “At 810, during robotic process automation (RPA) processing (e.g., execution of an RPA script that interacts with the currently displayed user interface that is represented by a current DOM), the adaptive control finder can receive a request [action command] from the RPA engine to identify a target UI control element in a current webpage. The current webpage can be represented by a current master data frame, which can include a current DOM (or at least contain properties of the current DOM). As described herein, the request can include an identifier of the target UI control element (emphasis added).”; paragraph [0075], “At 820, the adaptive control finder can determine that a target UI control element identifier associated with the target UI control element is absent in the current DOM (emphasis added).”; paragraph [0076], “At 830, the adaptive control finder can retrieve an archived version of the target UI control element from an archived master data frame [user interface element of a previous version]. For example, an identifier (e.g. URL) of the web page can be used to retrieve a previously stored archived master data frame for the web page being rendered [previous version of application] (emphasis added).”) [Examiner’s Remarks: Note that Rashid discloses executing an RPA script that interacts with the user interface and when the target UI element referenced in the action command isn’t found in the current DOM/webpage, the adaptive control finder retrieves an archived (previous) version of the target UI element from a previously stored archived master data frame of the webpage. One of ordinary skill in the art would readily comprehend that retrieving an archived version of the target UI element since the target UI element is missing from the current DOM of the webpage means the executed action command references a user interface element of a previous version of a webpage application.];
receive an indication of failure to identify the user interface element during execution of the action command in the robotic process automation code using an updated version of the application (paragraph [0075], “At 820, the adaptive control finder can determine that a target UI control element identifier associated with the target UI control element is absent in the current DOM [updated version of the application] (emphasis added).”; paragraph [0059], “In practice, the archived master data frames 480 can be referenced by URL or some other identifier so that they can be retrieved at a later time for use to find absent UI control elements as disclosed herein (emphasis added).”; abstract, “A computer-implemented method can receive a request [action command] from a robotic process automation engine to identify a target user interface control element in a webpage represented by a current master data frame. The current master data frame comprises a current document object model (DOM). The method can determine that a target user interface control element identifier associated with the target user interface control element is absent in the current DOM. The method can retrieve an archived version of the target user interface control element from an archived master data frame of the webpage [previous version of application]. The method can find an equivalent user interface control element within the current master data frame based at least on the archived version of the target user interface control element […] (emphasis added).”; paragraph [0050], “As an example, FIG. 3 shows a schematic diagram 300 illustrating an original webpage 310 that received an upgrade 330 at a certain point of time and became a modified webpage 360 […] The original webpage 310 includes a target UI control element 320 and other UI control elements 350. After the upgrade 330, while the other UI control elements 350 remain unchanged, the target UI control element 320 is changed [updated] to the modified UI control element 340, which is also termed “equivalent UI control element” as described herein (emphasis added).”; paragraph [0060], “The current master data frame can then be used for matching purposes (e.g., against archived master data frames that have been previously saved for the user interface) (emphasis added).”) [Examiner’s Remarks: Note that Rashid discloses an archived master data frame of the webpage (previous version of the application), a current master data frame of a current webpage that includes a current DOM, and an example of a webpage receiving an upgrade that changes the original target UI element to a modified UI element. One of ordinary skill in the art would readily comprehend that the current webpage is an updated version of the application that’s different from the archived master data frame of the webpage which represents a previous version of the application. Moreover, Rashid discloses that the adaptive control finder is able to determine that a target UI element is absent within the current DOM and that archived master data frames can be retrieved to find absent UI elements. 
One of ordinary skill in the art would readily comprehend that determining whether a target UI element is absent within the current DOM and finding absent UI elements includes receiving an indication of failure to identify the user interface element during execution of the action command in the robotic process automation code using an updated version of the application (current DOM of current webpage) in order to find the absent UI elements.];
construct a model of attributes of user interface elements […] of the updated version of the application including an attribute indicating functionality of at least one updated user interface element (paragraph [0050], “As an example, FIG. 3 shows a schematic diagram 300 illustrating an original webpage 310 that received an upgrade 330 at a certain point of time and became a modified [updated] webpage 360 […] The original webpage 310 includes a target UI control element 320 and other UI control elements 350. After the upgrade 330, while the other UI control elements 350 remain unchanged, the target UI control element 320 is changed [updated] to the modified UI control element 340, which is also termed “equivalent UI control element” as described herein (emphasis added).”; paragraph [0060], “FIG. 5 is a flowchart 500 of an example method of constructing a current master data frame [model] and can be implemented, for example, by the system of FIG. 4. At 510, the method can retrieve the DOM properties for a current webpage (e.g., by requesting them from a DOM interface). At 520, the method can harvest supplemental properties for the webpage (e.g., from internal or external services). Then at 530, the method can combine the DOM properties and supplemental properties [attributes of UI elements] for the current webpage [updated version of the application] and save them into the current master data frame [model]. The current master data frame can then be used for matching purposes (e.g., against archived master data frames that have been previously saved for the user interface) (emphasis added).”; paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties [attributes] associated with the UI control elements can be obtained from the DOM […] (emphasis added).”; paragraph [0062], “Thus, the master data frame described herein provides an internal data structure that maintains a unique identification profile for a UI control element in a webpage. Such unique identification profile can include not only direct properties obtained from the DOM (DOM properties), but also supplemental properties such as the ARIA properties that are extrinsic to the DOM structure. As an example, the following lists some properties [attributes] of a unique identification “profile” corresponding to a “Save” button in a webpage:
Internal_id = fvkihs8y8syijsbjsbkj
/* Example direct properties obtained from DOM */
Type = clickable [indicating functionality]
Role = button
Label = Save […] (emphasis added).”; paragraph [0068], “Incorporating the ARIA properties [attributes of UI elements] into the master data frame can facilitate efficient identification of a missing target UI control element. For example, a webpage upgrade may render a button to a link, or change a field to a dropdown list, or alter the appearance of a label, etc […] However, because the ARIA properties associated with such UI control elements remain unchanged, an ARIA-based RPA engine can reliably find those UI control elements based on their functional equivalence in terms of ARIA properties (emphasis added).”; paragraph [0077], “At 840, the adaptive control finder can find an equivalent UI control element [updated user interface element] in the current master data frame (e.g., in the current UI represented by the current DOM [updated version of the application]) based at least on the archived version of the target UI control element as described further in FIG. 9. As described herein, the equivalent user interface control element can be an element in the current web page that is functionally equivalent to the target user interface control element in the archived webpage [previous version of the application] (emphasis added).”);
identify from the model of the attributes of the user interface elements the at least one updated user interface element that performs the functionality of the user interface element (Figure 8: 840; paragraph [0077], “At 840, the adaptive control finder can find [identify] an equivalent [updated] UI control element in the current master data frame [model of attributes of the user interface elements] (e.g., in the current UI represented by the current DOM) based at least on the archived version of the target UI control element as described further in FIG. 9. As described herein, the equivalent [updated] user interface control element can be an element in the current web page that is functionally equivalent to the target [previous] user interface control element in the archived webpage (emphasis added).”; paragraph [0050], “As an example, FIG. 3 shows a schematic diagram 300 illustrating an original webpage 310 that received an upgrade 330 at a certain point of time and became a modified webpage 360 […] The original webpage 310 includes a target UI control element 320 and other UI control elements 350. After the upgrade 330, while the other UI control elements 350 remain unchanged, the target UI control element 320 is changed [updated] to the modified UI control element 340, which is also termed “equivalent UI control element” as described herein (emphasis added).”).
Rashid does not explicitly disclose:
in user interface code;
continue execution of the action command in the robotic process automation code to perform the functionality of the at least one updated user interface element.
However, Stan discloses:
in user interface code (paragraph [0047], “In some embodiments, element IDs are included in a source code of user interface 38, for instance as a set of attribute-value pairs (emphasis added).”);
continue execution of the action command in the robotic process automation code to perform the functionality of the at least one updated user interface element (abstract, “In some embodiments, a robotic process automation (RPA) robot is configured to identify a runtime target of an automation activity [action command] (e.g., a button to click, a form field to fill in, etc.) by searching a user interface for a UI element matching a set of characteristics of the target defined at design-time. When the target identification fails […] display for selection by the user a set of alternative [updated] target elements of the runtime interface (emphasis added).”; paragraph [0080], “In some embodiments, in response to receiving user input indicating a new [updated] target element for a selected activity, robot 12 may transmit a target update indicator 66 to a script editing module 38 […] Target update indicator 66 may comprise an indicator of a selected RPA script 40, an indicator of a selected RPA activity, and an indicator of a selected target UI element (e.g., a runtime selector characterizing the respective target). These indicators may collectively communicate to script editing module 38 to update the respective RPA script by changing the current target for the respective RPA activity to the new target specified by target update indicator 66. Module 38 may carry out the respective edit, to produce an updated RPA script 140 which may be further distributed to robots executing the respective automation (emphasis added).”; paragraph [0091], “A second manner in which displaying alternative [updated] targets may facilitate debugging is that it may provide a straightforward and intuitive way of fixing the robot by redirecting it to the correct target element so that it may continue executing the current automation [action command]. In some embodiments, the display of alternative targets is interactive, in the sense that clicking on an alternative target automatically instructs the robot to apply the current automation activity to the respective alternative [updated] target. Furthermore, the RPA code of the robot may be automatically updated to reflect redirection of the respective automation activity to the new target UI element (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “in user interface code; continue execution of the action command in the robotic process automation code to perform the functionality of the at least one updated user interface element.” The modification would be obvious because one of ordinary skill in the art would be motivated to use attributes of user interface elements in user interface code of an application, such as an element ID, to “enable a successful and ideally unambiguous identification by RPA robot” of a target user interface element (Stan, paragraph [0047]). Moreover, the modification would be obvious because one of ordinary skill in the art would be motivated to continue execution of the action command with the updated user interface element in order to fix a robot performing the action command by “redirecting it to the correct target element” which helps solve the problem associated with costly/cumbersome debugging when a target UI of an interface changes (Stan, paragraphs [0085 & 0091]).
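For purposes of illustration only, and not as a representation of the actual code of Rashid or Stan, the following minimal Python sketch shows one way the concepts mapped above could be embodied: a unique identification profile of attribute name/value pairs for a UI control element, and a lookup of a functionally equivalent (updated) element whose DOM identifier has changed while its functional attributes remain the same. All names and the matching criterion are hypothetical assumptions introduced for illustration.

from dataclasses import dataclass, field

@dataclass
class ElementProfile:
    # Hypothetical "unique identification profile" of attribute name/value pairs
    internal_id: str
    attributes: dict = field(default_factory=dict)

def find_equivalent(archived, current_profiles, keys=("Type", "Role", "Label")):
    # Return the current-version element whose functional attributes match the archived target
    for candidate in current_profiles:
        if all(candidate.attributes.get(k) == archived.attributes.get(k) for k in keys):
            return candidate
    return None

# Example: a "Save" button whose DOM id changed after an upgrade of the application
archived = ElementProfile("fvkihs8y8syijsbjsbkj",
                          {"Type": "clickable", "Role": "button", "Label": "Save", "Id": "UI5_98799"})
current = [ElementProfile("new_id_001",
                          {"Type": "clickable", "Role": "button", "Label": "Save", "Id": "UI6_00001"})]
print(find_equivalent(archived, current).internal_id)  # -> new_id_001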
As per Claim 18, the rejection of Claim 17 is incorporated; and Rashid discloses “compare the attributes of the model of the attributes of the user interface elements […] of the updated version of the application with the attributes of a model of attributes of user interface elements […] of the previous version of the application (Figure 10; paragraph [0081], “At 920, the adaptive control finder can compare the candidate UI control element in the current master data frame [updated version of the application] to an archived [previous] version of the target UI control element (emphasis added).”; paragraph [0084], “As described below, the determination of matching between the candidate UI control element and the archived version of the target UI control element can be based on multiple attributes, or some other criteria (emphasis added).”; abstract, “A computer-implemented method can receive a request from a robotic process automation engine to identify a target user interface control element in a webpage represented by a current master data frame [updated version of the application]. The current master data frame comprises a current document object model (DOM) […] The method can retrieve an archived version of the target user interface control element from an archived master data frame of the webpage [previous version of the application]. The method can find an equivalent user interface control element within the current master data frame based at least on the archived version of the target user interface control element […] (emphasis added).”; paragraph [0050], “As an example, FIG. 3 shows a schematic diagram 300 illustrating an original webpage 310 that received an upgrade 330 at a certain point of time and became a modified webpage 360 […] The original webpage 310 includes a target UI control element 320 and other UI control elements 350. After the upgrade 330, while the other UI control elements 350 remain unchanged, the target UI control element 320 is changed [updated] to the modified UI control element 340, which is also termed “equivalent UI control element” as described herein (emphasis added).”; paragraph [0060], “FIG. 5 is a flowchart 500 of an example method of constructing a current master data frame [model] and can be implemented, for example, by the system of FIG. 4. At 510, the method can retrieve the DOM properties for a current webpage (e.g., by requesting them from a DOM interface). At 520, the method can harvest supplemental properties for the webpage (e.g., from internal or external services). Then at 530, the method can combine the DOM properties and supplemental properties [attributes of UI elements] for the current webpage [updated version of the application] and save them into the current master data frame. The current master data frame can then be used for matching purposes (e.g., against archived master data frames that have been previously saved for the user interface) (emphasis added).”),” but does not explicitly disclose:
in user interface code.
However, Stan discloses:
in user interface code (paragraph [0047], “In some embodiments, element IDs are included in a source code of user interface 38, for instance as a set of attribute-value pairs (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “in user interface code.” The modification would be obvious because one of ordinary skill in the art would be motivated to use attributes of user interface elements in user interface code of an application, such as an element ID, to “enable a successful and ideally unambiguous identification by RPA robot” of a target user interface element (Stan, paragraph [0047]).
As per Claim 19, the rejection of Claim 17 is incorporated; and Rashid does not explicitly disclose:
update the robotic process automation code to reference the at least one updated user interface element of the updated version of the application.
However, Stan discloses:
update the robotic process automation code to reference the at least one updated user interface element of the updated version of the application (paragraph [0079], “In some embodiments, error reporting window 70 may include an interactive element (e.g., a button as illustrated by item 79 in FIG. 10) which enables a user to indicate an alternative [updated] target UI directly on the runtime UI (emphasis added).”; paragraph [0080], “In some embodiments, in response to receiving user input indicating a new [updated] target element for a selected activity, robot 12 may transmit a target update indicator 66 to a script editing module 38, which may execute on same RPA host as robot 12 or remotely, on another host being part of the respective RPA environment. Target update indicator 66 may comprise an indicator of a selected RPA script 40, an indicator of a selected RPA activity, and an indicator of a selected target UI element (e.g., a runtime selector characterizing the respective target). These indicators may collectively communicate to script editing module 38 to update the respective RPA script by changing the current target for the respective RPA activity to the new target specified by target update indicator 66. Module 38 may carry out the respective edit, to produce an updated RPA script 140 which may be further distributed to robots executing the respective automation (emphasis added).”; paragraph [0085], “The automatic identification of activity targets, i.e., user interface elements acted upon by robotic software, poses a substantial technical problem because typically, the target user interface (e.g., an e-commerce webpage, an accounting interface, etc.) is developed and maintained independently of the robot designed to interact with the respective interface. Therefore, the functionality and/or appearance of the target UI may change [update] without the knowledge of RPA developers (emphasis added).”; paragraph [0090], “Displaying alternative [updated] target(s) may facilitate debugging in at least two ways. First, it may enable a developer to pinpoint and interpret the differences between the design-time and runtime user interfaces, or stated otherwise, to determine whether and how the respective interface has changed between design time and runtime. For instance, displaying an alternative target may help the developer understand whether the respective failure was caused by an RPA coding error or by a change [update] in the code of the respective user interface (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “update the robotic process automation code to reference the at least one updated user interface element of the updated version of the application.” The modification would be obvious because one of ordinary skill in the art would be motivated to update RPA code by referencing an updated/alternative target UI element to “provide a straightforward and intuitive way of fixing the robot by redirecting it to the correct target element so that it may continue executing the current automation” which helps solve the problem associated with costly/cumbersome debugging when a target UI of an interface changes (Stan, paragraphs [0085 & 0091]).
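For purposes of illustration only, the following hypothetical Python sketch shows, under assumed data structures, how robotic process automation code could be updated to reference an updated target user interface element so that execution of the action command can continue; it does not reproduce the code of any cited reference, and all identifiers are illustrative.

def redirect_activity_target(rpa_script, activity_name, new_selector):
    # Update the named activity so its action command references the updated UI element
    for activity in rpa_script["activities"]:
        if activity["name"] == activity_name:
            activity["target_selector"] = new_selector
    return rpa_script

# Example: the "ClickSave" activity is redirected to the updated "Save" button
script = {"activities": [{"name": "ClickSave",
                          "target_selector": "role=button; label=Save; id=UI5_98799"}]}
script = redirect_activity_target(script, "ClickSave",
                                  "role=button; label=Save; id=UI6_00001")
print(script["activities"][0]["target_selector"])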
As per Claim 20, the rejection of Claim 19 is incorporated; and Rashid does not explicitly disclose:
store the robotic process automation code in persistent storage.
However, Stan discloses:
store the robotic process automation code in persistent storage (paragraph [0039], “Database server 16 is configured to selectively store and/or retrieve data related to RPA environment 10 in/from database 18. Such data may include configuration parameters of various robots 12a-c, robot groups, as well as data characterizing workflows executed by various robots, and data characterizing users, roles, schedules, queues, etc […] Database server 16 and database 18 may employ any data storage protocol and format known in the art, such as structured query language (SQL), ElasticSearch®, and Redis®, among others (emphasis added).”; paragraph [0028], “Once a workflow is developed, it may be encoded in computer-readable form as a set of RPA scripts 40 (FIG. 2) (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “store the robotic process automation code in persistent storage.” The modification would be obvious because one of ordinary skill in the art would be motivated to store RPA code in persistent storage to ensure data is preserved and consistent, especially for automated repetitive tasks that improve productivity by freeing human operators to “perform more intellectually sophisticated and/or creative activities” (Stan, paragraphs [0003 & 0039]).
Claims 4 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Rashid in view of Stan as applied to Claim 1 above, and further in view of US 2022/0391227 (hereinafter “Grigore”).
As per Claim 4, the rejection of Claim 1 is incorporated; and the combination of Rashid and Stan does not explicitly disclose:
building, by the processor set, a robotic process automation robot deployable in a production environment using the robotic process automation code; and
deploying, by the processor set, the robotic process automation robot in the production environment.
However, Grigore discloses:
building, by the processor set, a robotic process automation robot deployable in a production environment using the robotic process automation code (paragraph [0007], “The generated automation is configured to be executed by a production RPA robot at runtime remotely in an operating system session, a VM, or a container (emphasis added).”; paragraph [0059], “After an automation is functioning properly and ready for deployment on a production server 140, the automation is deployed to a container, a VM, or a server operating system session of production server 140 in the form of machine-readable code or a script (emphasis added).”; paragraph [0094], “If the validation succeeds, the RPA workflow/project may be published for production (runtime) use at 560. For instance, this may involve generating the automation code and running the automation code via an RPA robot in a VM, a container, or an operating system session (emphasis added).”; paragraph [0066], “FIG. 2 is an architectural diagram illustrating a computing system 200 configured to implement part or all of a web-based RPA designer system, according to an embodiment of the present invention […] Computing system 200 includes a bus 205 or other communication mechanism for communicating information, and processor(s) 210 coupled to bus 205 for processing information (emphasis added).”) [Examiner’s Remarks: Note that Grigore discloses generating automation code and executing it by a production RPA robot. One of ordinary skill in the art would readily comprehend that the generated automation code is used in building the RPA robot in order for the RPA robot to perform the automation.]; and
deploying, by the processor set, the robotic process automation robot in the production environment (paragraph [0059], “After an automation is functioning properly and ready for deployment on a production server 140, the automation is deployed to a container, a VM, or a server operating system session of production server 140 in the form of machine-readable code or a script (emphasis added).”; paragraph [0094], “If the validation succeeds, the RPA workflow/project may be published for production (runtime) use at 560. For instance, this may involve generating the automation code and running the automation code via an RPA robot in a VM, a container, or an operating system session (emphasis added).”; paragraph [0066], “FIG. 2 is an architectural diagram illustrating a computing system 200 configured to implement part or all of a web-based RPA designer system, according to an embodiment of the present invention […] Computing system 200 includes a bus 205 or other communication mechanism for communicating information, and processor(s) 210 coupled to bus 205 for processing information (emphasis added).”).
Grigore is within the same field of endeavor as the claimed invention regarding the building and deployment of RPA robots.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Grigore into the combined teachings of Rashid and Stan to include “building, by the processor set, a robotic process automation robot deployable in a production environment using the robotic process automation code; and deploying, by the processor set, the robotic process automation robot in the production environment.” The modification would be obvious because one of ordinary skill in the art would be motivated to build and deploy RPA robots using a web-based RPA designer system that allows developers to sign in through the cloud and obtain a list of template projects, developer-designed projects, etc. in order to reduce “the local processing and memory requirements on a user's computing system” and centralize “RPA designer functionality, enabling better compliance” (Grigore, abstract).
As per Claim 14, Rashid discloses:
A computer program product comprising one or more computer readable storage media having program instructions collectively stored on the one or more computer readable storage media (paragraph [0132], “Any of the methods described herein can be implemented by computer-executable instructions in (e.g., stored on, encoded on, or the like) one or more computer-readable media (e.g., computer-readable storage media or other tangible media) or one or more computer-readable storage devices (e.g., memory, magnetic storage, optical storage, or the like) (emphasis added).”), the program instructions executable to:
construct a model of attributes of user interface elements […] including an attribute indicating functionality of at least one user interface element (paragraph [0060], “FIG. 5 is a flowchart 500 of an example method of constructing a current master data frame [model] and can be implemented, for example, by the system of FIG. 4. At 510, the method can retrieve the DOM properties for a current webpage (e.g., by requesting them from a DOM interface). At 520, the method can harvest supplemental properties for the webpage (e.g., from internal or external services). Then at 530, the method can combine the DOM properties and supplemental properties [attributes of UI elements] for the current webpage and save them into the current master data frame [model]. The current master data frame can then be used for matching purposes (e.g., against archived master data frames that have been previously saved for the user interface) (emphasis added).”; paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties [attributes] associated with the UI control elements can be obtained from the DOM which can be stored as a tree structure and represent the logical structure of a current webpage (emphasis added).”; paragraph [0062], “Thus, the master data frame described herein provides an internal data structure that maintains a unique identification profile for a UI control element in a webpage. Such unique identification profile can include not only direct properties obtained from the DOM (DOM properties), but also supplemental properties such as the ARIA properties that are extrinsic to the DOM structure. As an example, the following lists some properties [attributes] of a unique identification “profile” corresponding to a “Save” button in a webpage:
Internal_id = fvkihs8y8syijsbjsbkj
/* Example direct properties obtained from DOM */
Type = clickable [indicating functionality]
Role = button
Label = Save
Id = UI5_98799
Class = [‘class1’ ‘class2’, . .. . ] […] (emphasis added).”; paragraph [0068], “Incorporating the ARIA properties [attributes of UI elements] into the master data frame can facilitate efficient identification of a missing target UI control element. For example, a webpage upgrade may render a button to a link, or change a field to a dropdown list, or alter the appearance of a label, etc […] However, because the ARIA properties associated with such UI control elements remain unchanged, an ARIA-based RPA engine can reliably find those UI control elements based on their functional equivalence in terms of ARIA properties (emphasis added).”);
obtain from the model of the attributes of the user interface elements an identification of the at least one user interface element referenced in an action command for performing the functionality of the at least one user interface element in a robotic process automation code (paragraph [0032], “When activated, the RPA engine 110 can run the RPA scripts 140 to automatically identify and interact with a plurality of UI control elements in the currently displaying webpage. In practice, a script can instruct the RPA engine 110 to interact with and monitor the user interface (e.g., enter values, activate graphical buttons, read fields, and the like) [action command for performing the functionality of at least one user interface element in a robotic process automation code]. As described herein, a script can refer to user interface elements via an identifier (emphasis added).”; paragraph [0074], “At 810, during robotic process automation (RPA) processing (e.g., execution of an RPA script that interacts with the currently displayed user interface that is represented by a current DOM), the adaptive control finder can receive a request [action command] from the RPA engine to identify a target UI control element in a current webpage. The current webpage can be represented by a current master data frame [model of attributes of UI elements], which can include a current DOM (or at least contain properties of the current DOM) (emphasis added).”; paragraph [0077], “At 840, the adaptive control finder can find an equivalent UI control element in the current master data frame (e.g., in the current UI represented by the current DOM) based at least on the archived version of the target UI control element as described further in FIG. 9. As described herein, the equivalent user interface control element can be an element in the current web page that is functionally equivalent to the target user interface control element in the archived webpage (emphasis added).”; paragraph [0078], “At 850, the adaptive control finder can output the equivalent UI control element found at 840 (e.g., by returning an identifier associated with the equivalent UI control element that is in the current DOM). The identifier can then be used to find the UI control element and perform the instructed operation on the equivalent UI control element (in place of the missing one) (emphasis added).”).
Rashid does not explicitly disclose:
in user interface code;
generate the robotic process automation code referencing the at least one user interface element identified in the model of the attributes of the user interface elements;
build a robotic process automation robot deployable in a production environment using the generated robotic process automation code; and
deploy the robotic process automation robot in the production environment.
However, Stan discloses:
in user interface code (paragraph [0047], “In some embodiments, element IDs are included in a source code of user interface 38, for instance as a set of attribute-value pairs (emphasis added).”);
generate the robotic process automation code referencing the at least one user interface element identified in [user input] (paragraph [0080], “In some embodiments, in response to receiving user input indicating a new target element for a selected activity, robot 12 may transmit a target update indicator 66 to a script editing module 38, which may execute on same RPA host as robot 12 or remotely, on another host being part of the respective RPA environment. Target update indicator 66 may comprise an indicator of a selected RPA script 40, an indicator of a selected RPA activity, and an indicator of a selected target UI element (e.g., a runtime selector characterizing the respective target). These indicators may collectively communicate to script editing module 38 to update the respective RPA script by changing the current target for the respective RPA activity to the new target specified by target update indicator 66. Module 38 may carry out the respective edit, to produce [generate] an updated RPA script 140 which may be further distributed to robots executing the respective automation (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Stan into the teaching of Rashid to include “in user interface code; generate the robotic process automation code referencing the at least one user interface element identified in the model of the attributes of the user interface elements.” The modification would be obvious because one of ordinary skill in the art would be motivated to use attributes of user interface elements in user interface code of an application, such as an element ID, to “enable a successful and ideally unambiguous identification by RPA robot” of a target user interface element (Stan, paragraph [0047]). Moreover, the modification would be obvious because one of ordinary skill in the art would be motivated to generate and execute RPA code referencing a new/alternative target UI element to “provide a straightforward and intuitive way of fixing the robot by redirecting it to the correct target element so that it may continue executing the current automation” which helps solve the problem associated with costly/cumbersome debugging when a target UI of an interface changes (Stan, paragraphs [0085 & 0091]).
The combination of Rashid and Stan does not explicitly disclose: build a robotic process automation robot deployable in a production environment using the generated robotic process automation code; and deploy the robotic process automation robot in the production environment. However, Grigore discloses:
build a robotic process automation robot deployable in a production environment using the generated robotic process automation code (paragraph [0007], “The generated automation is configured to be executed by a production RPA robot at runtime remotely in an operating system session, a VM, or a container (emphasis added).”; paragraph [0059], “After an automation is functioning properly and ready for deployment on a production server 140, the automation is deployed to a container, a VM, or a server operating system session of production server 140 in the form of machine-readable code or a script (emphasis added).”; paragraph [0094], “If the validation succeeds, the RPA workflow/project may be published for production (runtime) use at 560. For instance, this may involve generating the automation code and running the automation code via an RPA robot in a VM, a container, or an operating system session (emphasis added).”) [Examiner’s Remarks: Note that Grigore discloses generating automation code and executing it by a production RPA robot. One of ordinary skill in the art would readily comprehend that the generated automation code is used in building the RPA robot in order for the RPA robot to perform the automation.]; and
deploy the robotic process automation robot in the production environment (paragraph [0059], “After an automation is functioning properly and ready for deployment on a production server 140, the automation is deployed to a container, a VM, or a server operating system session of production server 140 in the form of machine-readable code or a script (emphasis added).”; paragraph [0094], “If the validation succeeds, the RPA workflow/project may be published for production (runtime) use at 560. For instance, this may involve generating the automation code and running the automation code via an RPA robot in a VM, a container, or an operating system session (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Grigore into the combined teachings of Rashid and Stan to include “build a robotic process automation robot deployable in a production environment using the generated robotic process automation code; and deploy the robotic process automation robot in the production environment.” The modification would be obvious because one of ordinary skill in the art would be motivated to build and deploy RPA robots using a web-based RPA designer system that allows developers to sign in through the cloud and obtain a list of template projects, developer-designed projects, etc. in order to reduce “the local processing and memory requirements on a user's computing system” and centralize “RPA designer functionality, enabling better compliance” (Grigore, abstract).
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Rashid in view of Stan as applied to Claim 1 above, and further in view of US 8,381,186 (hereinafter “Khalil”).
As per Claim 5, the rejection of Claim 1 is incorporated; and the combination of Rashid and Stan does not explicitly disclose:
scanning, by the processor set, the user interface code of the application.
However, Khalil discloses:
scanning, by the processor set, the user interface code of the application (col. 4 lines 21-23, “UI element parser 410 may scan the source code of the input application for elements in the source code that relate to UI elements (emphasis added).”; col. 3 lines 51-53, “Block 310 may generally include scanning the UI of target application 120 to extract the user interface elements (emphasis added).”; col. 2 lines 59-60, “As illustrated, workstation 110 may include a bus 210, a processing unit 220 […] (emphasis added).”).
Khalil is within the same field of endeavor as the claimed invention regarding the extraction of UI element information from UI code of an application.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Khalil into the combined teachings of Rashid and Stan to include “scanning, by the processor set, the user interface code of the application.” The modification would be obvious because one of ordinary skill in the art would be motivated to scan the user interface code of a target application to automatically extract UI elements for creation of a training application in order to “efficiently present portions of the user interface of the target application in a way that facilitates training for target application” (Khalil, col. 3 lines 51-53 & lines 58-63).
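For purposes of illustration only, the following Python sketch (using the standard html.parser module) shows one hypothetical way user interface code could be scanned to extract UI elements and their attribute/value pairs, in the general manner described above; it is not a representation of Khalil's parser, and the tag names selected are assumptions.

from html.parser import HTMLParser

class UIElementScanner(HTMLParser):
    # Collect attribute/value pairs for UI control elements found in UI source code
    def __init__(self):
        super().__init__()
        self.elements = []

    def handle_starttag(self, tag, attrs):
        if tag in ("button", "input", "select", "a"):
            self.elements.append({"tag": tag, **dict(attrs)})

scanner = UIElementScanner()
scanner.feed('<button id="UI5_98799" role="button" aria-label="Save">Save</button>')
print(scanner.elements)  # [{'tag': 'button', 'id': 'UI5_98799', 'role': 'button', 'aria-label': 'Save'}]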
Claims 6 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Rashid in view of Stan as applied to Claim 1 above, and further in view of US 10,628,109 (hereinafter “Murphy”).
As per Claim 6, the rejection of Claim 1 is incorporated; and Rashid discloses “collecting, by the processor set, the user interface context identification information of the attributes of the user interface elements, including [type] (paragraph [0060], “At 510, the method can retrieve [collect] the DOM properties for a current webpage (e.g., by requesting them from a DOM interface). At 520, the method can harvest supplemental properties for the webpage (e.g., from internal or external services). Then at 530, the method can combine the DOM properties and supplemental properties for the current webpage and save them into the current master data frame (emphasis added).”; paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In another example, the UI control element can be associated with a type property characterizing the type of the UI control element, e.g., {Type=checkbox} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties associated with the UI control elements [user interface context identification information] can be obtained from the DOM which can be stored as a tree structure and represent the logical structure of a current webpage (emphasis added).”; paragraph [0124], “The processing units 1910, 1915 execute computer-executable instructions, such as for implementing the features described in the examples herein (emphasis added).”),” but the combination of Rashid and Stan does not explicitly disclose:
collecting, by the processor set, the user interface context identification information of the attributes of the user interface elements, including relationships of the user interface elements to parent, children and peer user interface panels.
However, Murphy discloses:
relationships of the user interface elements to parent, children and peer user interface panels (col. 9 lines 16-28, “Logical relationships [attributes] between panels for a user interface may be identified (310). For example, the interface generator 120 may obtain [collect] relationship information that describes three panels [user interface elements] are to be displayed and the three panels are in a co-planar relationship with equal importance. In some implementations, identifying logical relationships between panels includes identifying that one or more of a first panel is a child of a second panel, that a first panel is a sibling [peer] of a second panel, that a first panel is dependent on a second panel, that a first panel is embedded within a second panel, that a first panel is serial to a second panel, that a first panel controls a second panel, or that a first panel was created from a second panel (emphasis added).”; col. 4 lines 64-67 to col. 5 lines 1-5, “The relationship information 110 may describe logical relationships between panels for display on the user device 102. The logical relationships may be one or more of dependence, hierarchy, or similarity among panels. For example, a logical relationship between two panels may be a parent and child relationship where a first panel is a parent and the second panel is a child that is shown in response to a user interacting with the first panel (emphasis added).”).
Murphy is within the same field of endeavor as the claimed invention regarding collecting information on relationships between UI elements.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Murphy into the combined teachings of Rashid and Stan to include “relationships of the user interface elements to parent, children and peer user interface panels.” The modification would be obvious because one of ordinary skill in the art would be motivated to collect information regarding logical (dependency) relationships of UI elements to other panels in order to generate a “graphical user interface that includes one or more panels arranged according to a spatial relationship described by a particular interface pattern” based on a logical relationship to overcome the problem of “graphical user interfaces that are difficult to use” when displaying panels in a predetermined manner specified by applications (Murphy, col. 1 lines 33-36 & lines 41-50).
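For purposes of illustration only, the following hypothetical Python sketch models user interface context identification information that records a panel's relationships to parent, children, and peer (sibling) panels, as the claim language and Murphy's disclosure describe in general terms; the class and method names are assumptions introduced for illustration.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PanelNode:
    # A UI panel with logical relationships to parent, children, and peer panels
    name: str
    parent: Optional["PanelNode"] = None
    children: List["PanelNode"] = field(default_factory=list)

    def add_child(self, child: "PanelNode") -> None:
        child.parent = self
        self.children.append(child)

    def peers(self) -> List["PanelNode"]:
        # Sibling panels that share the same parent
        return [p for p in self.parent.children if p is not self] if self.parent else []

settings = PanelNode("SettingsPanel")
general, advanced = PanelNode("GeneralTab"), PanelNode("AdvancedTab")
settings.add_child(general)
settings.add_child(advanced)
print([p.name for p in general.peers()])  # ['AdvancedTab']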
As per Claim 12, the rejection of Claim 1 is incorporated; and the combination of Rashid and Stan discloses “the attributes of the user interface elements in the user interface code comprise an element identifier (Stan, paragraph [0047], “In some embodiments, element IDs are included in a source code of user interface 38, for instance as a set of attribute-value pairs (emphasis added).”),” but does not explicitly disclose:
the attributes of the user interface elements in the user interface code comprise an element dependency.
However, Murphy discloses:
an element dependency (col. 4 lines 64-67 to col. 5 lines 1-5, “The relationship information 110 may describe logical relationships between panels [UI elements] for display on the user device 102. The logical relationships may be one or more of dependence, hierarchy, or similarity among panels. For example, a logical relationship between two panels may be a parent and child relationship where a first panel is a parent and the second panel is a child that is shown in response to a user interacting with the first panel (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Murphy into the combined teachings of Rashid and Stan to include “an element dependency.” The modification would be obvious because one of ordinary skill in the art would be motivated to collect information regarding logical (dependency) relationships of UI elements to other panels in order to generate a “graphical user interface that includes one or more panels arranged according to a spatial relationship described by a particular interface pattern” based on a logical relationship to overcome the problem of “graphical user interfaces that are difficult to use” when displaying panels in a predetermined manner specified by applications (Murphy, col. 1 lines 33-36 & lines 41-50).
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Rashid in view of Stan as applied to Claim 1 above, and further in view of US 2020/0206920 (hereinafter “Ma”).
As per Claim 11, the rejection of Claim 1 is incorporated; and the combination of Rashid and Stan discloses “the attributes of the user interface elements in the user interface code comprise an element identifier (Stan, paragraph [0047], “In some embodiments, element IDs are included in a source code of user interface 38, for instance as a set of attribute-value pairs (emphasis added).”),” but does not explicitly disclose:
the attributes of the user interface elements in the user interface code comprise an element event.
However, Ma discloses:
an element event (paragraph [0084], “In a preferred but illustrative implementation, the event table representing sequences of events recorded in operation 302 may include fields and corresponding data including […]”; paragraph [0088], “an “event type” field and corresponding description, e.g. a key press, mouse click, touch gesture, audio input, visual gesture, API call, etc (emphasis added).”).
Ma is within the same field of endeavor as the claimed invention regarding robotic process automation.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Ma into the combined teachings of Rashid and Stan to include “an element event.” The modification would be obvious because one of ordinary skill in the art would be motivated to keep track of an element event since “it would be beneficial to analyze the recorded interactions to identify different sequences of actions taken by the users to accomplish each type of task, and optimize among the various potential solutions and generate a model for more efficient performance of identified valuable automation opportunities/tasks” (Ma, paragraph [0007]).
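For purposes of illustration only, the following hypothetical Python sketch shows an element-event record of the kind described above, pairing a user interface element with an event type; the field names are assumptions and do not reproduce Ma's event table.

from dataclasses import dataclass

@dataclass
class UIElementEvent:
    # Hypothetical record associating a UI element with an event attribute
    element_id: str
    event_type: str   # e.g., key press, mouse click, touch gesture
    timestamp: float

event = UIElementEvent(element_id="UI5_98799", event_type="mouse click", timestamp=1672444800.0)
print(event)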
Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Rashid in view of Stan and Grigore as applied to Claim 14 above, and further in view of Khalil.
As per Claim 15, the rejection of Claim 14 is incorporated; and the combination of Rashid, Stan, and Grigore does not explicitly disclose:
receive the user interface code specifying the attributes of the user interface elements of a user interface of an application; and
identify the attributes of the user interface elements in the user interface code.
However, Khalil discloses:
receive the user interface code specifying the attributes of the user interface elements of a user interface of an application (col. 4 lines 15-16 & lines 21-23, “In one implementation, UI element parser 410 may receive the source code of target application 120 […] UI element parser 410 may scan the source code of the input application for elements in the source code that relate to UI elements (emphasis added).”; col. 4 lines 43-49, “UI component log 420 may identify each UI element located by UI element parser 410 and may also include the properties [attributes] associated with each element. The properties of an element may refer to the customizable parameters of an element that determine the specific visual appearance or other behavioral aspects of the UI element (emphasis added).”; col. 5 lines 9-16, “The properties may include any information relating to the display or behavior of the corresponding UI element. For example, text box entry 550 may include properties [attributes] that include the size of the text box, the font used in the text box, and other information relating to the display of the text box. The properties stored in properties field 520 may correspond to the properties of the UI element that were extracted from target application 120 (emphasis added).”); and
identify the attributes of the user interface elements in the user interface code (col. 4 lines 21-23, “UI element parser 410 may scan the source code of the input application for elements in the source code that relate to UI elements (emphasis added).”; col. 4 lines 43-49, “UI component log 420 may identify each UI element located by UI element parser 410 and may also include the properties [attributes] associated with each element. The properties of an element may refer to the customizable parameters of an element that determine the specific visual appearance or other behavioral aspects of the UI element (emphasis added).”; col. 5 lines 9-16, “The properties may include any information relating to the display or behavior of the corresponding UI element. For example, text box entry 550 may include properties [attributes] that include the size of the text box, the font used in the text box, and other information relating to the display of the text box. The properties stored in properties field 520 may correspond to the properties of the UI element that were extracted from target application 120 (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Khalil into the combined teachings of Rashid, Stan, and Grigore to include “receive the user interface code specifying the attributes of the user interface elements of a user interface of an application; and identify the attributes of the user interface elements in the user interface code.” The modification would be obvious because one of ordinary skill in the art would be motivated to receive the user interface code of a target application specifying attributes to automatically extract UI elements for creation of a training application in order to “efficiently present portions of the user interface of the target application in a way that facilitates training for target application” (Khalil, col. 3 lines 51-53 & lines 58-63).
Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Rashid in view of Stan and Grigore as applied to Claim 14 above, and further in view of Murphy.
As per Claim 16, the rejection of Claim 14 is incorporated; and Rashid discloses “collect UI context identification information for the attributes of the user interface elements, including [type] (paragraph [0060], “At 510, the method can retrieve [collect] the DOM properties for a current webpage (e.g., by requesting them from a DOM interface). At 520, the method can harvest supplemental properties for the webpage (e.g., from internal or external services). Then at 530, the method can combine the DOM properties and supplemental properties for the current webpage and save them into the current master data frame (emphasis added).”; paragraph [0040], “Each of the UI control elements can have one or more associated properties (or attributes), each being defined by a pair of attribute name and value, e.g., {attribute name=value} […] In another example, the UI control element can be associated with a type property characterizing the type of the UI control element, e.g., {Type=checkbox} […] In still further examples, a UI control element can be associated with properties that define the location, size, color or other attributes of the UI control element (emphasis added).”; paragraph [0041], “In some embodiments, the properties associated with the UI control elements [user interface context identification information] can be obtained from the DOM which can be stored as a tree structure and represent the logical structure of a current webpage (emphasis added).”),” but the combination of Rashid, Stan, and Grigore does not explicitly disclose:
collect UI context identification information for the attributes of the user interface elements, including relationships of the user interface elements to parent, children and peer user interface panels.
However, Murphy discloses:
relationships of the user interface elements to parent, children and peer user interface panels (col. 9 lines 16-28, “Logical relationships [attributes] between panels for a user interface may be identified (310). For example, the interface generator 120 may obtain [collect] relationship information that describes three panels [user interface elements] are to be displayed and the three panels are in a co-planar relationship with equal importance. In some implementations, identifying logical relationships between panels includes identifying that one or more of a first panel is a child of a second panel, that a first panel is a sibling [peer] of a second panel, that a first panel is dependent on a second panel, that a first panel is embedded within a second panel, that a first panel is serial to a second panel, that a first panel controls a second panel, or that a first panel was created from a second panel (emphasis added).”; col. 4 lines 64-67 to col. 5 lines 1-5, “The relationship information 110 may describe logical relationships between panels for display on the user device 102. The logical relationships may be one or more of dependence, hierarchy, or similarity among panels. For example, a logical relationship between two panels may be a parent and child relationship where a first panel is a parent and the second panel is a child that is shown in response to a user interacting with the first panel (emphasis added).”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of Murphy into the combined teachings of Rashid, Stan, and Grigore to include “relationships of the user interface elements to parent, children and peer user interface panels.” The modification would be obvious because one of ordinary skill in the art would be motivated to collect information regarding logical (dependency) relationships of UI elements to other panels in order to generate a “graphical user interface that includes one or more panels arranged according to a spatial relationship described by a particular interface pattern” based on a logical relationship to overcome the problem of “graphical user interfaces that are difficult to use” when displaying panels in a predetermined manner specified by applications (Murphy, col. 1 lines 33-36 & lines 41-50).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
US 2007/0043701 (hereinafter “Klementiev”) discloses UI element attributes and relationships of the UI elements to parent, children, and sibling elements.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to FEVEN H HURUY, whose telephone number is (571) 272-3826. The examiner can normally be reached Mon.-Fri., 7:30am-3:45pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Wei Mui, can be reached at (571) 272-3708. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/F.H.H./
Examiner, Art Unit 2191

/WEI Y MUI/
Supervisory Patent Examiner, Art Unit 2191