Prosecution Insights
Last updated: April 19, 2026
Application No. 18/506,415

SYSTEMS AND METHODS FOR VISUALIZING MACHINE INTELLIGENCE

Non-Final OA (§101, §103)
Filed: Nov 10, 2023
Examiner: LE, JOHN H
Art Unit: 2857
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: DataRobot Inc.
OA Round: 1 (Non-Final)

Grant Probability: 88% (Favorable)
Expected OA Rounds: 1-2
Estimated Time to Grant: 2y 8m
Grant Probability With Interview: 95%

Examiner Intelligence

Career Allow Rate: 88% (1286 granted / 1464 resolved), above average, +19.8% vs Tech Center avg
Interview Lift: +7.3% higher allow rate on resolved cases with interview (moderate lift)
Typical Timeline: 2y 8m average prosecution; 53 applications currently pending
Career History: 1517 total applications across all art units
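The headline figures above are internally consistent; as a quick sanity check, a minimal sketch (assuming "resolved" counts granted plus abandoned cases and "pending" is simply total filings minus resolved):

```python
# Examiner career statistics as reported above
granted = 1286
resolved = 1464
total_applications = 1517

# Career allow rate: granted as a share of resolved cases
allow_rate = granted / resolved * 100
print(f"Allow rate: {allow_rate:.1f}%")  # 87.8%, displayed as 88%

# Pending applications: total filed minus resolved
pending = total_applications - resolved
print(f"Currently pending: {pending}")  # 53
```

The 53 pending figure reported above falls directly out of the 1517 total and 1464 resolved counts, which suggests the dashboard derives it the same way.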

Statute-Specific Performance

§101: 28.6% (-11.4% vs TC avg)
§103: 26.2% (-13.8% vs TC avg)
§102: 20.5% (-19.5% vs TC avg)
§112: 15.4% (-24.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 1464 resolved cases.
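The four deltas above are consistent with a single Tech Center baseline; a small check (a sketch, assuming each "vs TC avg" figure is an arithmetic difference between the examiner's per-statute rate and the estimated Tech Center average):

```python
# Per-statute rates and deltas vs the Tech Center average,
# as reported above (percent)
stats = {
    "101": (28.6, -11.4),
    "103": (26.2, -13.8),
    "102": (20.5, -19.5),
    "112": (15.4, -24.6),
}

# Implied Tech Center baseline: examiner rate minus the reported delta
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta
    print(f"§{statute}: implied TC avg ≈ {tc_avg:.1f}%")  # 40.0% for every statute
```

Every statute implies the same ~40.0% baseline, which supports reading the deltas as simple differences against one Tech Center estimate rather than per-statute baselines.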

Office Action

Rejections under §101 and §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.

Step 1: According to the first part of the analysis, in the instant case, claims 1-14 are directed to a system, claims 15-18 are directed to a method, and claims 19-20 are directed to a computer readable medium that stores instructions. Thus, each of the claims falls within one of the four statutory categories (i.e., process, machine, manufacture, or composition of matter).

Regarding claim 15: A method, comprising: deploying, by a data processing system comprising one or more processors, coupled with memory, for a machine learning project, a plurality of virtual sensors at a first location of a plurality of locations to detect metadata of a data source of the machine learning project, at a second location of the plurality of locations to detect deployment information of a model trained for the machine learning project, and at a third location of the plurality of locations to detect learning session information for creation of the model; collecting, by the data processing system, via the plurality of virtual sensors deployed at the plurality of locations, data for the machine learning project; and translating, by the data processing system, for render on a computing system, the data collected via the plurality of virtual sensors into a plurality of graphics including a graphic representing the metadata of the data source, a graphic representing the deployment of the model, and
a graphic representing the learning session.

Step 2A Prong 1: "deploying, by a data processing system comprising one or more processors, coupled with memory, for a machine learning project, a plurality of virtual sensors at a first location of a plurality of locations to detect metadata of a data source of the machine learning project, at a second location of the plurality of locations to detect deployment information of a model trained for the machine learning project, and at a third location of the plurality of locations to detect learning session information for creation of the model" is directed to math because virtual sensors are not physical hardware; they are software sensors that use mathematical, data-driven, or physics-based models (e.g., linear regression, neural networks, random forests) to predict information from other data points. Detecting metadata often involves statistical analysis of the dataset, such as calculating means, variances, or distributions to ensure data quality. Detecting deployment information and model performance involves statistics to monitor for drift or degradation. Tracking model creation (training) involves mathematical optimization techniques like gradient descent, cost function minimization, and hyperparameter tuning.

"collecting, by the data processing system, via the plurality of virtual sensors deployed at the plurality of locations, data for the machine learning project" is directed to the mental step of data gathering.

"translating, by the data processing system, for render on a computing system, the data collected via the plurality of virtual sensors into a plurality of graphics including a graphic representing the metadata of the data source, a graphic representing the deployment of the model, and a graphic representing the learning session" is directed to math because translating raw data into a visual graphic involves mapping data points onto a multi-dimensional coordinate system using matrices and vectors.
Representing a "learning session" or "model deployment" involves tracking loss functions, accuracy metrics, and probability distributions to quantify the model's performance over time. Creating a graphic for a data source requires calculating metadata statistics to summarize the underlying data structure. Rendering on a computer system uses algorithms to calculate pixel position, lighting, and scaling, all of which are governed by geometric equations.

Each limitation recited in the claim is a process that, under BRI, covers performance of the limitation in the mind but for the recitation of a generic "virtual sensor," which is a mere indication of the field of use. Nothing in the claim elements precludes the steps from practically being performed in the mind. Thus, the claim recites a mental process.

Further, the claim recites the steps of "deploying, by a data processing system comprising one or more processors, coupled with memory, for a machine learning project, a plurality of virtual sensors at a first location of a plurality of locations to detect metadata of a data source of the machine learning project, at a second location of the plurality of locations to detect deployment information of a model trained for the machine learning project, and at a third location of the plurality of locations to detect learning session information for creation of the model; translating, by the data processing system, for render on a computing system, the data collected via the plurality of virtual sensors into a plurality of graphics including a graphic representing the metadata of the data source, a graphic representing the deployment of the model, and a graphic representing the learning session," which, as drafted, under BRI recites a mathematical calculation. The grouping of "mathematical concepts" in the 2019 PEG includes "mathematical calculations" as an exemplar of an abstract idea. 2019 PEG Section I, 84 Fed. Reg. at 52.
Thus, the recited limitation falls into the "mathematical concepts" grouping of abstract ideas. This limitation also falls into the "mental process" group of abstract ideas, because the recited mathematical calculation is simple enough that it can be practically performed in the human mind, e.g., scientists and engineers have been solving the Arrhenius equation in their minds since it was first proposed in 1889. Note that even if most humans would use a physical aid (e.g., pen and paper, a slide rule, or a calculator) to help them complete the recited calculation, the use of such a physical aid does not negate the mental nature of this limitation. See October Update at Section I(C)(i) and (iii).

Additional Elements, Step 2A Prong 2:

"deploying, by a data processing system comprising one or more processors, coupled with memory, for a machine learning project, a plurality of virtual sensors at a first location of a plurality of locations to detect metadata of a data source of the machine learning project, at a second location of the plurality of locations to detect deployment information of a model trained for the machine learning project, and at a third location of the plurality of locations to detect learning session information for creation of the model" does not integrate the judicial exception into a practical application. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

"collecting, by the data processing system, via the plurality of virtual sensors deployed at the plurality of locations, data for the machine learning project" does not integrate the judicial exception into a practical application. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).
"translating, by the data processing system, for render on a computing system, the data collected via the plurality of virtual sensors into a plurality of graphics including a graphic representing the metadata of the data source, a graphic representing the deployment of the model, and a graphic representing the learning session" does not integrate the judicial exception into a practical application. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

The claim is merely selecting data, manipulating or analyzing the data using math and mental processes, and displaying the results. This is similar to the electric power example of MPEP 2106.05(h)(vi): limiting the abstract idea of collecting information, analyzing it, and displaying certain results of the collection and analysis to data related to the electric power grid, because limiting application of the abstract idea to power-grid monitoring is simply an attempt to limit the use of the abstract idea to a particular technological environment. Electric Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1354, 119 USPQ2d 1739, 1742 (Fed. Cir. 2016).

A further consideration is whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data), or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation), does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Automotive, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit).
Similarly, "claiming the improved speed or efficiency inherent with applying the abstract idea on a computer" does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.

Claim 15 recites the additional element(s) of using generic AI/ML technology, i.e., a machine learning project, to perform data evaluations or calculations, as identified under Prong 1 above. The claims do not recite any details regarding how the AI/ML algorithm or model functions or is trained. Instead, the claims are found to utilize the AI/ML algorithm as a tool that provides nothing more than mere instructions to implement the abstract idea on a general purpose computer. See MPEP 2106.05(f). Additionally, the use of the machine learning project merely indicates a field of use or technological environment in which the judicial exception is performed. See MPEP 2106.05(h). Therefore, the use of the machine learning project to perform steps that are otherwise abstract does not integrate the abstract idea into a practical application. See the 2024 Guidance Update on Patent Subject Matter Eligibility, Including on Artificial Intelligence; and Example 47, ineligible claim 2.
The claim as a whole does not meet any of the following criteria to integrate the judicial exception into a practical application:
- an additional element reflects an improvement in the functioning of a computer, or an improvement to other technology or technical field;
- an additional element applies or uses a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition;
- an additional element implements a judicial exception with, or uses a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim;
- an additional element effects a transformation or reduction of a particular article to a different state or thing; and
- an additional element applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.

Step 2B:

"deploying, by a data processing system comprising one or more processors, coupled with memory, for a machine learning project, a plurality of virtual sensors at a first location of a plurality of locations to detect metadata of a data source of the machine learning project, at a second location of the plurality of locations to detect deployment information of a model trained for the machine learning project, and at a third location of the plurality of locations to detect learning session information for creation of the model" does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

"collecting, by the data processing system, via the plurality of virtual sensors deployed at the plurality of locations, data for the machine learning project" does not amount to significantly more than the judicial exception in the claim.
This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

"translating, by the data processing system, for render on a computing system, the data collected via the plurality of virtual sensors into a plurality of graphics including a graphic representing the metadata of the data source, a graphic representing the deployment of the model, and a graphic representing the learning session" does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

The claim is therefore ineligible under 35 USC 101.

Claim 1 is similar to claim 15 but recites a system comprising a data processing system comprising one or more processors, coupled with memory, which performs the steps as in claim 15. These additional elements fail to integrate the abstract idea into a practical application. These limitations are recited at a high level of generality and do not add significantly more to the judicial exception. These elements are generic computing devices that perform generic functions. Using generic computer elements to perform an abstract idea does not integrate an abstract idea into a practical application. See 2019 Guidance, 84 Fed. Reg. at 55. Moreover, "the mere recitation of a generic computer cannot transform a patent-ineligible abstract idea into a patent-eligible invention." Alice, 573 U.S. at 223; see also FairWarning IP, LLC v. Iatric Sys., Inc., 839 F.3d 1089, 1096 (Fed. Cir. 2016) (citation omitted) ("[T]he use of generic computer elements like a microprocessor or user interface do not alone transform an otherwise abstract idea into patent-eligible subject matter"). On the record before us, we are not persuaded that the hardware of claim 1 integrates the abstract idea into a practical application.
Nor are we persuaded that the additional elements are anything more than well-understood, routine, and conventional so as to impart subject matter eligibility to claim 1.

Claim 19 recites a computer-readable medium that stores instructions thereon that, when executed by one or more processors, cause the one or more processors to perform the steps as in claim 15. This amounts to nothing more than instructions to implement the abstract idea on a computer, which fails to integrate the abstract idea into a practical application. See 2019 Guidance, 84 Fed. Reg. at 55. Additionally, using instructions to implement an abstract idea on a generic computer "is not 'enough' to transform an abstract idea into a patent-eligible invention." Alice, 573 U.S. at 226. Therefore, claim 19 is rejected for the same reasons discussed above with regard to the rejection of claim 15.

Regarding claims 2, 16, and 20, "wherein a virtual sensor of the plurality of virtual sensors: monitors values of a data element of the machine learning project; and streams the values of the data element to the data processing system" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

Regarding claims 3 and 17, "wherein a virtual sensor of the plurality of virtual sensors includes a web-hook that monitors a data element of the machine learning project" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).
Regarding claims 4 and 18, "apply a visualization rule to the data collected based on a virtual sensor of the plurality of virtual sensors; identify a visual appearance of one or more of the plurality of graphics based on the visualization rule and the data; and generate the one or more of the plurality of graphics to include the visual appearance" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

Regarding claim 5, "receive, from the computing system, a selection of the graphic representing the learning session; and provide, for render on the computing system, a plurality of entities within the graphic representing the learning session, the plurality of entities representing components of the learning session" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

Regarding claim 6, "determine, based on the data, a health level of a component of the machine learning project; compare the health level to a threshold; and update a visual appearance of at least one of the plurality of graphics or connections between the plurality of graphics responsive to the health level satisfying the threshold" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).
Regarding claim 7, "determine, based on the data, a health level of a connection of connections between the plurality of graphics; compare the health level to a threshold; generate an update to the machine learning project; and modify a visual appearance of the connection" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

Regarding claim 8, "wherein the learning session designs and trains the model of the machine learning project based on a machine learning problem received from the computing system" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

Regarding claim 9, "generate data causing the computing system to display a time control element; receive a selection of the time control element from the computing system; and animate at least one of the plurality of graphics or connections between the plurality of graphics based on a historical record of a plurality of states of the machine learning project at a plurality of points in time" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

Regarding claim 10, "animate at least one of the plurality of graphics or the connections by adding, removing, or adjusting entities of the plurality of graphics based on the historical record" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).
Regarding claim 11, "wherein the graphic representing the metadata of the data source includes a first spherical portion including a metadata entity representing metadata of the data source of the machine learning project; and wherein the graphic representing the deployment of the model is a second spherical portion including a deployment entity representing the deployment of the model trained for the machine learning project" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

Regarding claim 12, "draw, based on the data, a first connection between the metadata entity and the graphic representing the learning session and a second connection between the graphic representing the learning session and the deployment entity; and wherein the first connection and the second connection indicate that the learning session uses data of the data source to produce the deployment of the model" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

Regarding claim 13, "generate, based on the data, a third spherical portion, the third spherical portion including a decision entity indicating a decision produced by the deployment of the model of the machine learning project; and draw, based on the data, a connection between the deployment entity and the decision entity; wherein the connection indicates that the deployment of the model of the machine learning project produces the decision" does not integrate the judicial exception into a practical application.
It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

Regarding claim 14, "generate the first spherical portion to be a semi-sphere; generate, based on the data, a third spherical portion, the third spherical portion including a decision entity indicating a decision produced by the deployment of the model of the machine learning project; and generate the second spherical portion and the third spherical portion to be quarter-spheres" does not integrate the judicial exception into a practical application. It does not amount to significantly more than the judicial exception in the claim. This additional element is merely using a computer as a tool to perform an abstract idea (see MPEP 2106.05(h)).

Hence, claims 1-20 are treated as ineligible subject matter under 35 U.S.C. § 101.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1-8 and 15-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sharma et al. (US 2020/0327371 A1) in view of Elprin et al. (US 2021/0133632 A1).

Regarding claim 1, Sharma et al.
discloses a system, comprising: a data processing system comprising one or more processors, coupled with memory (para [0055] - "A computer-readable medium may include any medium that participates in providing instructions to one or more processors for execution... Non-volatile media can include, for example, flash memory... or magnetic disks.") to: deploy, for a machine learning project, a plurality of virtual sensors at a first location of a plurality of locations to detect metadata of a data source of the machine learning project (para [0117] - "The virtual sensors can be employed to provide access to enriched real-time data and intelligence information derived therefrom."; para [0122] - "Deployment context information can be captured as metadata and stored with an application."; para [0130] - "However, the virtual sensors also may operate on outputs of other virtual sensors and on input data that was previously aggregated and stored, for example, in the time-series database. The latter can be useful in creating and training machine learning models on the edge computing platform without the need to transmit to a remote network."; para [0138] - "The software layer executes analytics expressions on the sensor stream data and matches the sensor data with semantic descriptions of occurrences of specific conditions through an expression language made available by the software layer."; para [0177] - "The first step in providing machine learning capability to the example edge-computing platform is developed a selected ML model... Finally, the edge-converted model is deployed to the edge platform." - The virtual sensors are employed across the layers to receive data from the machine learning model.), at a second location of the plurality of locations to detect deployment information of a model trained for the machine learning project (para [0179] - "The development and training of new ML models typically involves the use of...
model creating, and model training processes and components."; see also para [0117], [0130] and [0138]), and at a third location of the plurality of locations to detect learning session information for creation of the model (para [0117], [0130], [0138] and [0179]); collect, via the plurality of virtual sensors deployed at the plurality of locations, data for the machine learning project (para [0117], [0130], [0138] and [0179]); and translate, for render on a computing system, the data collected via the plurality of virtual sensors into a plurality of graphics (para [0095] - "The data processing layer 515 described further below is implemented as a dynamic computation domain specific language (DSL)-based directed acyclic graph (DAG) model."; para [0208] - "The example translator and method preferably translates PMML input to... output in three phases. In Phase 1, the XML content is parsed and representation is built in memory capturing the semantics, relationships, transforms and ML algorithm technique to build a computation flow graph."; para [0258] - "Thus, the system and method described herein envision the entire running edge system/device as a data-flow graph with continuous endless streams." - The directed graphs are represented visually to the user via the interface. The graph represents the data using a directed graph containing nodes and edges which indicate the flow of data as the model is trained.). Sharma et al. does not disclose including a graphic representing the metadata of the data source, a graphic representing the deployment of the model, and a graphic representing the learning session. However, Elprin et al.
disclose including a graphic representing the metadata of the data source (para [0038] - "The model monitor methods and systems may manage, track and monitor various aspects of data science lifecycles such as the technologies or processes for developing, validating, delivering model... Various aspects of models in different stages of the processes can be visualized via a graphical user interface."; para [0072] - "A model monitor system may monitor... performance of a model in different phases (e.g., development, deployment, etc.)." - Fig. 4 specifies the model monitoring includes training, hyperparameters, integrity, and deployment.), a graphic representing the deployment of the model (para [0038] and [0072]), and a graphic representing the learning session (para [0038] and [0072]). It would have been obvious to one skilled in the art to include a graphic representation of all phases of model development, as taught by Elprin et al., to the system of Sharma et al., as it would allow the system to present the training sessions to the user, as specified by the tenant application (see Elprin et al. para [0038] and [0072]).

Regarding claim 2, Sharma et al., in view of Elprin et al., disclose the system of claim 1. Sharma et al. further discloses wherein a virtual sensor of the plurality of virtual sensors (para [0117], [0130], [0138] and [0179]): monitors values of a data element of the machine learning project (para [0117], [0130], [0138] and [0179]); and streams the values of the data element to the data processing system (para [0117], [0130], [0138] and [0179]).

Regarding claim 3, Sharma et al., in view of Elprin et al., disclose the system of claim 1. Sharma et al. further discloses wherein a virtual sensor of the plurality of virtual sensors includes (para [0117], [0130], [0138] and [0179]), that monitors a data element of the machine learning project (para [0117], [0130], [0138] and [0179]). Elprin et al.
further discloses a web-hook (para [0063] - "The alert may be delivered in any suitable forms (e.g., audio, visual alert in a GUI, webhooks that can be integrated into other applications, etc.) or via any suitable communication channels (e.g., email, Slack, SMS).").

Regarding claim 4, Sharma et al., in view of Elprin et al., disclose the system of claim 1. Sharma et al. further discloses comprising the data processing system to: apply a visualization rule to the data collected based on a virtual sensor of the plurality of virtual sensors (para [0117], [0130], [0138] and [0179]); identify a visual appearance of one or more of the plurality of graphics based on the visualization rule and the data (para [0268] - "A user can edit the parameters associated with a machine learning model that is already part of an existing workflow and also can identify and add a new machine learning model to an existing workflow." - The system determines the shape of the flow graph based on the virtual sensor data.; see also para [0117], [0130], [0138] and [0179]); and generate the one or more of the plurality of graphics to include the visual appearance (para [0117], [0130], [0138] and [0179]).

Regarding claim 5, Sharma et al., in view of Elprin et al., disclose the system of claim 1. Sharma et al. further discloses comprising the data processing system to: receive, from the computing system, a selection of the graphic representing the learning session (para [0117], [0130], [0138] and [0179]); and provide, for render on the computing system, a plurality of entities within the graphic representing the learning session (para [0099] - "There are source nodes... in the graph, where directed edges represent the information flow between various nodes."; see also para [0208] and [0258]), the plurality of entities representing components of the learning session (para [0117], [0130], [0138], [0179], [0208] and [0258]).
Regarding claim 6, Sharma et al., in view of Elprin et al., disclose the system of claim 1. Elprin et al. further discloses comprising the data processing system to: determine, based on the data, a health level of a component of the machine learning project (para [0087] - "FIG. 6A shows an example of various features monitored for a set of models. For example, a model health status (e.g., not healthy, healthy, etc.), result of last health check (e.g., drift in data or performance, range of drift, etc.) can be displayed to users via a suitable format (e.g., scatter plot, bar graph, etc.). The model monitoring user interface may display information related to model drift such as a drift trends chart for each model... for calculating the drift (e.g., test type, threshold, etc.)."); compare the health level to a threshold (para [0087]); update a visual appearance of at least one of the plurality of graphics or connections between the plurality of graphics responsive to the health level satisfying the threshold (para [0040] - "In some cases, each of the models may be fully versioned, such that they can reflect changes across time... In some cases, every time a model is created or updated, a previous version of the model... may be archived and a more recent version of the model may be generated."; see also para [0038], [0072] and [0087]).

Regarding claim 7, Sharma et al., in view of Elprin et al., disclose the system of claim 1. Sharma et al.
further discloses comprising the data processing system to: determine, based on the data, a health level of a connection of connections between the plurality of graphics (para [0099] - "There are source nodes in the graph, where directed edges represent the information flow between various nodes."); generate an update to the machine learning project (para [0159] - "Once a model is trained, it can be "edge-fitted" as described, with the updated models pushed back to the edge in a highly iterative, closed-loop fashion."); and modify a visual appearance of the connection (para [0232] - "Once an edge-converted ML model is deployed to the edge platform... it may be desirable to periodically evaluate the accuracy... and iteratively update the model as necessary."; see also para [0159]). Elprin et al. further discloses compare the health level to a threshold (para [0087]); generate an update to the machine learning project (para [0038], [0040], [0072], and [0087]).

Regarding claim 8, Sharma et al., in view of Elprin et al., disclose the system of claim 1. Sharma et al. further discloses wherein the learning session designs and trains the model of the machine learning project based on a machine learning problem received from the computing system (para [0230] - "The machine learning platform can be accessed via a suitable user interface (UI) and incorporates functionality for selecting one or more standard or custom machine learning models and updating models as necessary."; para [0208] - "The code is then generated to implement the most optimal program for the target platform. Any test cases embedded by the ML model creator in the PMML file are also generated as static expectations as part of the generated code itself deployed to the example edge platform." - The model may be custom fitted to the problem being tested.).

Regarding claim 15, Sharma et al.
discloses a method, comprising: deploying, by a data processing system comprising one or more processors (para [0055]), coupled with memory (para [0055]), for a machine learning project, a plurality of virtual sensors at a first location of a plurality of locations to detect metadata of a data source of the machine learning project (para [0117], [0122], [0130], [0138] and [0177]), at a second location of the plurality of locations to detect deployment information of a model trained for the machine learning project (para [0117], [0122], [0130], [0138] and [0177]), and at a third location of the plurality of locations to detect learning session information for creation of the model (para [0117], [0122], [0130], [0138] and [0177]); collecting, by the data processing system, via the plurality of virtual sensors deployed at the plurality of locations (para [0117], [0130], [0138] and [0179]), data for the machine learning project (para [0117], [0130], [0138] and [0179]); and translating, by the data processing system, for render on a computing system, the data collected via the plurality of virtual sensors into a plurality of graphics (para [0095], [0208] and [0258]). Sharma et al. does not disclose including a graphic representing the metadata of the data source, a graphic representing the deployment of the model, and a graphic representing the learning session. However, Elprin et al. does disclose including a graphic representing the metadata of the data source (para [0038] and [0072]), a graphic representing the deployment of the model (para [0038] and [0072]), and a graphic representing the learning session (para [0038] and [0072]). It would have been obvious to one skilled in the art to include a graphic representation of all phases of model development, as taught by Elprin et al., to the system of Sharma et al., as it would allow the system to present the training sessions to the user, as specified by the instant application (see Elprin et al.
para [0038] and [0072]).

Regarding claim 16, Sharma et al., in view of Elprin et al., disclose the method of claim 15. Sharma et al. further discloses wherein a virtual sensor of the plurality of virtual sensors: monitors values of a data element of the machine learning project (para [0117], [0130], [0138] and [0179]); and streams the values of the data element to the data processing system (para [0117], [0130], [0138] and [0179]).

Regarding claim 17, Sharma et al. discloses the method of claim 15. Sharma et al. further discloses wherein a virtual sensor of the plurality of virtual sensors includes (para [0117], [0130], [0138] and [0179]), that monitors a data element of the machine learning project (para [0117], [0130], [0138] and [0179]). Elprin et al. further discloses a web-hook (para [0063]).

Regarding claim 18, Sharma et al., in view of Elprin et al., disclose the method of claim 15. Sharma et al. further discloses comprising: applying, by the data processing system, a visualization rule to the data collected based on a virtual sensor of the plurality of virtual sensors (para [0117], [0130], [0138] and [0179]); identifying, by the data processing system, a visual appearance of one or more of the plurality of graphics based on the visualization rule and the data (para [0117], [0130], [0138], [0179] and [0268]); and generating, by the data processing system, the one or more of the plurality of graphics to include the visual appearance (para [0117], [0130], [0138] and [0179]).

Regarding claim 19, Sharma et al.
disclose a computer readable medium that stores instructions thereon, that, when executed by one or more processors, cause the one or more processors to (para [0055]): deploy, for a machine learning project, a plurality of virtual sensors at a first location of a plurality of locations to detect metadata of a data source of the machine learning project (para [0117], [0122], [0130], [0138] and [0177]), at a second location of the plurality of locations to detect deployment information of a model trained for the machine learning project (para [0117], [0130], [0138] and [0179]), and at a third location of the plurality of locations to detect learning session information for creation of the model (para [0117], [0130], [0138] and [0179]); collect, via the plurality of virtual sensors deployed at the plurality of locations, data for the machine learning project (para [0117], [0130], [0138] and [0179]); and translate, for render on a computing system, the data collected via the plurality of virtual sensors into a plurality of graphics (para [0208] and [0258]). Sharma et al. does not disclose including a graphic representing the metadata of the data source, a graphic representing the deployment of the model, and a graphic representing the learning session. However, Elprin et al. does disclose including a graphic representing the metadata of the data source (para [0038] and [0072]), a graphic representing the deployment of the model (para [0038] and [0072]), and a graphic representing the learning session (para [0038] and [0072]). It would have been obvious to one skilled in the art to include a graphic representation of all phases of model development, as taught by Elprin et al., to the system of Sharma et al., as it would allow the system to present the training sessions to the user, as specified by the instant application (see Elprin et al. para [0038] and [0072]).

Regarding claim 20, Sharma et al., in view of Elprin et al., disclose the computer readable medium of claim 19. Sharma et al. further discloses wherein a virtual sensor of the plurality of virtual sensors: monitors values of a data element of the machine learning project (para [0117], [0130], [0138] and [0179]); and streams the values of the data element to the one or more processors (para [0117], [0130], [0138] and [0179]).

Claim(s) 9-10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sharma et al. (US 2020/0327371 A1) in view of Elprin et al. (US 2021/0133632 A1) as applied to claim 1 above, and further in view of Wee et al. (US 2016/0357525 A1).

Regarding claim 9, Sharma et al., in view of Elprin et al., disclose the system of claim 1. Sharma et al. further discloses comprising the data processing system to: generate data causing the computing system to display a time control element (para [0114] - "The time-series database...
is a software system that is optimized for handling time series data comprising arrays of numbers indexed by time..."); receive a selection of the time control element from the computing system (para [0130] - "It is also important to note that the data aggregated by virtual SXL-sensors can be stored in the time-series database on the edge platform..."); based on a historical record of a plurality of states of the machine learning project at a plurality of points in time (para [0183] - "The development and training data stored in the cloud also may include other historical sensor data from the edge platform 406... or a combination."). Sharma et al. does not disclose animating at least one of the plurality of graphics or connections between the plurality of graphics. However, Wee et al. does disclose animating at least one of the plurality of graphics or connections between the plurality of graphics (para [0066] - "Any appropriate animations can then be displayed to show the data flow and policy, and any visualizations... virtualized acts... may also be presented within the IoT IDE."). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include animation, as taught by Wee et al., to the system of Sharma et al. in view of Elprin et al., as it would provide more graphical representation of the system, as specified by the instant application (see Wee et al. para [0066]).

Regarding claim 10, Sharma et al., in view of Elprin et al., in further view of Wee et al., disclose the system of claim 9. Sharma et al. further discloses comprising the data processing system to: at least one of the plurality of graphics or the connections by adding, removing, or adjusting entities of the plurality of graphics based on the historical record (para [0183] - "The development and training data stored in the cloud also may include other historical sensor data from the edge platform 406... or a combination.
Various predictions, inferences, analytical results, and other intelligence information... and machine learning models executing on the edge platform also may be sent to the cloud... in creating, verifying, evaluating and updating models as further described below." - Updating a model based on the historical data may add or remove nodes in the graphical representation; see also para [0099]). Wee et al. further discloses animate (para [0066]).

Claim(s) 11 and 14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sharma et al. (US 2020/0327371 A1) in view of Elprin et al. (US 2021/0133632 A1) as applied to claim 1 above, and further in view of Limberg et al. (US 2020/0335011 A1).

Regarding claim 11, Sharma et al., in view of Elprin et al., disclose the system of claim 1. Sharma et al. further discloses wherein the graphic representing the metadata of the data source includes (para [0117], [0130], [0138] and [0179]), including a metadata entity representing metadata of the data source of the machine learning project (para [0122], [0138] and [0177]); and wherein the graphic representing the deployment of the model (para [0117], [0122], [0130], [0138], [0177] and [0179]), including a deployment entity representing the deployment of the model trained for the machine learning project (para [0117], [0122], [0130], [0138], [0177] and [0179]). Sharma et al. does not disclose a first spherical portion, and a second spherical portion. However, Limberg et al. does disclose a first spherical portion (para [0023] - "The quantum state visualization devices discussed herein includes a portion of a spherical shell (e.g., a whole sphere, a half sphere, a three-quarter sphere, etc.)"), and a second spherical portion (para [0023]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to utilize a spherical visualization, as taught by Limberg et al.
to the system of Sharma et al., in view of Elprin et al., as it would provide the system with specific graphical elements to represent the machine learning development, as specified by the instant application (see Limberg et al., para [0023]).

Regarding claim 14, Sharma et al., in view of Elprin et al., in further view of Limberg et al., discloses the system of claim 11. Sharma et al. further discloses comprising the data processing system to: including a decision entity indicating a decision produced by the deployment of the model of the machine learning project (para [0268] - "A user may edit the parameters associated with a machine learning model that is already part of an existing workflow... For example, a user can enter certain machine learning model information and parameters 1106, such as model type... algorithm type (e.g., decision tree) using the user interface."; see also para [0117], [0122], [0130], [0138], [0177] and [0179]). Limberg et al. further discloses generating the first spherical portion to be a semi-sphere (para [0023]); generate, based on the data, a third spherical portion (para [0023]), the third spherical portion (para [0023]), generate the second spherical portion and the third spherical portion to be quarter spheres (para [0023]).

Claim(s) 12 and 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sharma et al. (US 2020/0327371 A1) in view of Elprin et al. (US 2021/0133632 A1) and Limberg et al. (US 2020/0335011 A1) as applied to claim 11 above, and further in view of Fletcher et al. (US 2016/0103908 A1).

Regarding claim 12, Sharma et al., in view of Elprin et al., in further view of Limberg et al. (US 2020/0335011 A1), disclose the system of claim 11. Sharma et al.
further discloses comprising the data processing system to: based on the data, a first connection between the metadata entity and the graphic representing the learning session and a second connection between the graphic representing the learning session and the deployment entity (para [0117], [0122], [0130], [0138], [0177] and [0179]); and wherein the first connection and the second connection indicate that the learning session uses data of the data source to produce the deployment of the model (para [0117], [0122], [0130], [0138], [0177] and [0179]). Sharma et al., Elprin et al., and Limberg et al. do not disclose to draw. However, Fletcher et al. does disclose to draw (para [0230] - "Users can be provided the ability to design and draw the service-monitoring dashboard and customize each of the KPI widgets."). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include drawing, as taught by Fletcher et al., to the system of Sharma et al., in view of Elprin et al., in further view of Limberg et al., as it would allow the system to provide custom connections, as specified by the instant application (see Fletcher et al., para [0230]).

Regarding claim 13, Sharma et al., in view of Elprin et al. and Limberg et al., discloses the system of claim 11. Sharma et al. further discloses comprising the data processing system to: generate, based on the data, a decision entity indicating a decision produced by the deployment of the model of the machine learning project (para [0268]); based on the data, a connection between the deployment entity and the decision entity (para [0117], [0122], [0130], [0138], [0177] and [0179]); wherein the connection indicates that the deployment of the model of the machine learning project produces the decision (para [0117], [0122], [0130], [0138], [0177] and [0179]). Limberg et al. further discloses a third spherical portion, the third spherical portion (para [0023]).
Sharma et al. and Limberg et al. do not disclose to draw. However, Fletcher et al. does disclose to draw (para [0230] - "Users can be provided the ability to design and draw the service-monitoring dashboard and customize each of the KPI widgets."). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include drawing, as taught by Fletcher et al., to the system of Sharma et al., in view of Elprin et al., in further view of Limberg et al., as it would allow the system to provide custom connections, as specified by the instant application (see Fletcher et al., para [0230]).

Other Prior Art

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Srinivasan et al. (WO 2021058526 A1) disclose a method for training machine learning models (51), having the steps of: detecting (S10) data (70) in the form of time series data using one or more computers (52), said data being obtained by means of one or more measuring devices (60-62), in each case in the form of a sensor for measuring a physical variable; receiving (S12) multiple classification data units relating to the data (70) using the one or more computers (52); receiving (S13) a selected part (71) of the data (70) using the one or more computers (52) for each of the classification data units; and training (S14) multiple machine learning models (51) using the one or more computers (52), in each case on the basis of at least one of the classification data units and the at least one corresponding selected part (71) of the data, wherein the multiple machine learning models (51) represent multiple instances of the same machine learning model.

Contact Information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN H LE whose telephone number is (571) 272-2275. The examiner can normally be reached on Monday-Friday from 7:00am – 3:30pm Eastern Time.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Shelby A. Turner, can be reached on (571) 272-6334. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JOHN H LE/
Primary Examiner, Art Unit 2857

Prosecution Timeline

Nov 10, 2023
Application Filed
Feb 02, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601756
MATCHING METHOD FOR SEMICONDUCTOR TOPOGRAPHY MEASUREMENT AND PROCESSING DEVICE USING THE SAME
2y 5m to grant Granted Apr 14, 2026
Patent 12590570
BLADE FAULT DIAGNOSIS METHOD, APPARATUS AND SYSTEM, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 31, 2026
Patent 12585255
METHODS AND SYSTEMS FOR DETECTION IN AN INDUSTRIAL INTERNET OF THINGS DATA COLLECTION ENVIRONMENT WITH NOISE PATTERN RECOGNITION FOR BOILER AND PIPELINE SYSTEMS
2y 5m to grant Granted Mar 24, 2026
Patent 12585565
SELECTING A RUNTIME CONFIGURATION BASED ON MODEL PERFORMANCE
2y 5m to grant Granted Mar 24, 2026
Patent 12585566
MAINTENANCE PREDICTION FOR MODULES OF A MICROSCOPE
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
88%
Grant Probability
95%
With Interview (+7.3%)
2y 8m
Median Time to Grant
Low
PTA Risk
Based on 1464 resolved cases by this examiner. Grant probability derived from career allow rate.
