Prosecution Insights
Last updated: April 19, 2026
Application No. 18/132,325

SENSOR READING CORRECTION

Final Rejection (§101, §103)
Filed: Apr 07, 2023
Examiner: FORRISTALL, JOSHUA L
Art Unit: 2857
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: Aquatic Informatics ULC
OA Round: 2 (Final)
Grant Probability: 69% (Favorable)
Projected OA Rounds: 3-4
Projected Time to Grant: 3y 3m
Grant Probability with Interview: 92%

Examiner Intelligence

Career Allow Rate: 69% (above average; 40 granted / 58 resolved; +1.0% vs TC avg)
Interview Lift: +23.4% among resolved cases with interview
Typical Timeline: 3y 3m avg prosecution; 45 applications currently pending
Career History: 103 total applications across all art units
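The headline examiner figures above are internally consistent and can be sanity-checked with simple arithmetic. A minimal sketch, assuming the allow rate is simply granted divided by resolved and that the Tech Center average (68.0% here) is whatever value yields the stated +1.0% delta; neither formula is documented by the source:

```python
# Hypothetical reconstruction of the examiner metrics shown above.
granted, resolved = 40, 58      # from the "Career History" figures
allow_rate = granted / resolved * 100
tc_avg = 68.0                   # assumed TC average implied by the +1.0% delta
delta_vs_tc = allow_rate - tc_avg
print(f"Career allow rate: {allow_rate:.0f}% ({delta_vs_tc:+.1f}% vs TC avg)")
```

Running this reproduces the dashboard's "69%" and "+1.0% vs TC avg" after rounding.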

Statute-Specific Performance

§101: 18.7% (-21.3% vs TC avg)
§103: 48.8% (+8.8% vs TC avg)
§102: 9.0% (-31.0% vs TC avg)
§112: 22.1% (-17.9% vs TC avg)

Deltas are relative to Tech Center average estimates • Based on career data from 58 resolved cases
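The per-statute deltas can be cross-checked against the stated rates. A short sketch (the dict literal simply restates the figures above; the subtraction is an assumption about how the deltas were computed):

```python
# Per-statute allowance rates and deltas vs the Tech Center average,
# restated from the figures above.
stats = {
    "101": (18.7, -21.3),
    "103": (48.8, +8.8),
    "102": (9.0, -31.0),
    "112": (22.1, -17.9),
}
for statute, (rate, delta) in stats.items():
    implied_tc_avg = rate - delta  # each (rate, delta) pair implies the TC average
    print(f"§{statute}: {rate}% (implied TC avg {implied_tc_avg:.1f}%)")
```

Notably, every pair implies the same Tech Center average of 40.0%, which suggests a single TC-wide baseline was used for all four statutes.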

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

Applicant’s amendments to the claims, filed 02/17/2026, are accepted and appreciated by the Examiner.

Response to Arguments

Applicant's arguments, see Remarks, filed 11/13/2025, with respect to the rejection(s) of claims 1 and 19 under 35 U.S.C. 101 have been fully considered but they are not persuasive. There is no indication in the claim that the features of claims 1 and 19 cannot be performed in the human mind. Claims 1 and 19 amount to applying and generating detection rules to received data. As seen on specification pages 8 and 17, the detection rules amount to manipulating data, modifying data, and logically representing anomalies and data errors by using heuristic rules. The detection rules can therefore be viewed as an algorithm. According to MPEP 2106.04(C), “Claims can recite a mental process even if they are claimed as being performed on a computer. The Supreme Court recognized this in Benson, determining that a mathematical algorithm for converting binary coded decimal to pure binary within a computer’s shift register was an abstract idea. The Court concluded that the algorithm could be performed purely mentally even though the claimed procedures "can be carried out in existing computers long in use, no new machinery being necessary." 409 U.S. at 67, 175 USPQ at 675. See also Mortgage Grader, 811 F.3d at 1324, 117 USPQ2d at 1699 (concluding that concept of "anonymous loan shopping" recited in a computer system claim is an abstract idea because it could be "performed by humans without a computer").” The claim does not recite a specific technical architecture; it merely recites generic sensors and retrieving data from said sensors. Retrieving or collecting data amounts to mere data gathering. 
As seen above, retrieving, collecting, and displaying data or detection rules do not integrate the claim into a practical application, as they amount to using a computer as a tool and mere data gathering. Although modifying detection rules may improve the result output by a computer, there is no improvement in the functioning of the computer. Similar reasoning was applied in the Federal Circuit decision in Electric Power Group, LLC v. Alstom S.A. of August 1, 2016, page 8: “In Enfish, we applied the distinction to reject the § 101 challenge at stage one because the claims at issue focused not on asserted advances in uses to which existing computer capabilities could be put, but on a specific improvement—a particular database technique—in how computers could carry out one of their basic functions of storage and retrieval of data. Enfish, 822 F.3d at 1335–36; see Bascom, 2016 WL 3514158, at *5; cf. Alice, 134 S. Ct. at 2360 (noting basic storage function of generic computer). The present case is different: the focus of the claims is not on such an improvement in computers as tools, but on certain independently abstract ideas that use computers as tools.” No specific computer improvement, such as to how computers could carry out one of their basic functions of storage and retrieval of data, is present in the claims of the instant application; therefore, the claims in the instant application are an example of an abstract idea that uses computers as tools. For at least these reasons, Applicant's arguments are not persuasive.

Applicant’s arguments, see Remarks, filed 11/13/2025, with respect to the rejection(s) of claims 1 and 19 under 35 U.S.C. 102 have been fully considered and are persuasive in light of the amendments. 
Agerstam (US 20190138423 A1) does not explicitly teach “collecting, by one or more processors of a computing system, a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data.” Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Agerstam (US 20190138423 A1) and Jordon (US 20240330759 A1).

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-21 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. With respect to claims 1 and 19, the following bold limitations are considered abstract: “collecting, by one or more processors of a computing system, a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data; receiving, by the one or more processors of the computing system, via one or more input devices of one or more computing devices, auxiliary data corresponding to a set of first modified sensor data, wherein the auxiliary data includes at least one of (i) one or more changes to the set of first modified sensor data, or (ii) one or more labels applicable to the set of first modified sensor data, the set of first modified sensor data having been obtained or created at least in part by application of a first set of one or more detection rules to the set of first sensor data causing display of, by the one or more processors, via a user device, the first set of one or more detection rules and the set of first modified sensor data; receiving, by the one or more processors from the user device, a user modification of the first 
set of one or more detection rules, in response to display of the first set of one or more detection rules and the set of first modified sensor data; generating, by the one or more processors, a second set of one or more detection rules at least in part by modifying the first set of one or more detection rules based at least in part on the auxiliary data and the user modification; determining, by the one or more processors, that the second set of one or more detection rules is applicable to the set of second sensor data; and modifying, by the one or more processors, the set of second sensor data by applying the second set of one or more detection rules to the set of second sensor data, the second set of one or more detection rules correcting the set of second sensor data.” The bolded limitations are directed to abstract ideas and would fall within the “Mental Process” grouping of abstract ideas. Applying detection rules to data, generating new detection rules from auxiliary data, and determining that the new detection rules are applicable to a second set of data can be performed in the human mind using evaluation, judgment, and opinion. As seen in specification pages 8 and 17, the detection rules amount to manipulating data, modifying data, and logically representing anomalies and data errors by using heuristic rules. This can be viewed as an algorithm. According to MPEP 2106.04(C), “Claims can recite a mental process even if they are claimed as being performed on a computer. The Supreme Court recognized this in Benson, determining that a mathematical algorithm for converting binary coded decimal to pure binary within a computer’s shift register was an abstract idea. The Court concluded that the algorithm could be performed purely mentally even though the claimed procedures "can be carried out in existing computers long in use, no new machinery being necessary." 409 U.S. at 67, 175 USPQ at 675. 
See also Mortgage Grader, 811 F.3d at 1324, 117 USPQ2d at 1699 (concluding that concept of "anonymous loan shopping" recited in a computer system claim is an abstract idea because it could be "performed by humans without a computer").” MPEP 2106.04 further teaches “The courts consider a mental process (thinking) that "can be performed in the human mind, or by a human using a pen and paper" to be an abstract idea. CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1372, 99 USPQ2d 1690, 1695 (Fed. Cir. 2011). As the Federal Circuit explained, "methods which can be performed mentally, or which are the equivalent of human mental work, are unpatentable abstract ideas the ‘basic tools of scientific and technological work’ that are open to all.’" 654 F.3d at 1371, 99 USPQ2d at 1694 (citing Gottschalk v. Benson, 409 U.S. 63, 175 USPQ 673 (1972)). See also Mayo Collaborative Servs. v. Prometheus Labs. Inc., 566 U.S. 66, 71, 101 USPQ2d 1961, 1965 (2012) ("‘[M]ental processes [] and abstract intellectual concepts are not patentable, as they are the basic tools of scientific and technological work’" (quoting Benson, 409 U.S. at 67, 175 USPQ at 675)); Parker v. Flook, 437 U.S. 584, 589, 198 USPQ 193, 197 (1978) (same). Accordingly, the "mental processes" abstract idea grouping is defined as concepts performed in the human mind, and examples of mental processes include observations, evaluations, judgments, and opinions.” This judicial exception is not integrated into a practical application. 
In particular, the claim recites the additional elements – “collecting, by one or more processors of a computing system, a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data; receiving, by the one or more processors of the computing system, via one or more input devices of one or more computing devices, auxiliary data corresponding to a set of first modified sensor data, wherein the auxiliary data includes at least one of (i) one or more changes to the set of first modified sensor data, or (ii) one or more labels applicable to the set of first modified sensor data, causing display of, by the one or more processors, via a user device, the first set of one or more detection rules and the set of first modified sensor data; receiving, by the one or more processors from the user device, a user modification of the first set of one or more detection rules, in response to display of the first set of one or more detection rules and the set of first modified sensor data;” Examiner views these limitations as amounting to generally linking the use of the judicial exception to a particular technological environment or field of use - see MPEP 2106.05(h). As such, Examiner does NOT view that the claims:
- Improve the functioning of a computer, or any other technology or technical field
- Apply the judicial exception with, or by use of, a particular machine - see MPEP 2106.05(b)
- Effect a transformation or reduction of a particular article to a different state or thing - see MPEP 2106.05(c)
- Apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception - see MPEP 2106.05(e) and Vanda Memo. 
Moreover, Examiner views the claims to be merely generally linking the use of the judicial exception to a generic computer system and generic data. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of “collecting, by one or more processors of a computing system, a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data; receiving, by the one or more processors of the computing system, via one or more input devices of one or more computing devices, auxiliary data corresponding to a set of first modified sensor data, wherein the auxiliary data includes at least one of (i) one or more changes to the set of first modified sensor data, or (ii) one or more labels applicable to the set of first modified sensor data, causing display of, by the one or more processors, via a user device, the first set of one or more detection rules and the set of first modified sensor data; receiving, by the one or more processors from the user device, a user modification of the first set of one or more detection rules, in response to display of the first set of one or more detection rules and the set of first modified sensor data;” amount to using a computer as a tool to perform an abstract idea and mere data gathering. Collecting and receiving data is viewed as mere data gathering and displaying detection rules is viewed as necessary data outputting. See MPEP 2106.05(g). Furthermore, modifying generic data can be seen as adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f). 
Examiner further notes that such additional elements are viewed to be well-known, routine, and conventional, as evidenced by Agerstam (US 20190138423 A1) and Tzur (US 20200041316 A1). The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Considering the claim as a whole, one of ordinary skill in the art would not know the practical application of the present invention since the claims do not apply or use the judicial exception in some meaningful way. As currently claimed, Examiner views that the additional elements do not apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, because the claims fail to recite clearly how the judicial exception is applied in a manner that does not monopolize the exception; the limitations regarding “collecting, by one or more processors of a computing system, a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data; receiving, by the one or more processors of the computing system, via one or more input devices of one or more computing devices, auxiliary data corresponding to a set of first modified sensor data, wherein the auxiliary data includes at least one of (i) one or more changes to the set of first modified sensor data, or (ii) one or more labels applicable to the set of first modified sensor data, causing display of, by the one or more processors, via a user device, the first set of one or more detection rules and the set of first modified sensor data; receiving, by the one or more processors from the user device, a user modification of the first set of one or more detection rules, in response to display of the first set of one or more detection rules and the set of first modified sensor data.” can be viewed as a general system or devices that apply an abstract procedure to generic data. 
Dependent claims 2-18 and 20-21, when analyzed as a whole, are held to be patent ineligible under 35 U.S.C. 101 because the additional recited limitation(s) fail(s) to establish that the claims are not directed to an abstract idea, as detailed below: they amount to further limiting how the detection rules are generated, which is a mental process. There are no additional element(s) in the dependent claims that add a meaningful limitation to the abstract idea to make the claim significantly more than the judicial exception (abstract idea). Dependent claims 2-18 and 20-21 further limit the abstract idea with an abstract idea and thus, the claims are still directed to an abstract idea without significantly more.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-14 and 19-21 are rejected under 35 U.S.C. 103 as being unpatentable over Agerstam (US 20190138423 A1) as modified by Jordon (US 20240330759 A1). 
With respect to claim 1, Agerstam teaches, A method comprising: receiving, by the one or more processors of the computing system, via one or more input devices of one or more computing devices, auxiliary data corresponding to a set of first modified sensor data, wherein the auxiliary data comprises a field visit data sample at least one of (i) one or more changes to the set of first modified sensor data, or (ii) one or more labels applicable to the set of first modified sensor data, the set of first modified sensor data having been obtained or created at least in part by application of a first set of one or more detection rules to a set of first sensor data from a first set of one or more sensors; (Para. [0039] teaches “is provided with the data processor 308 to transform the planning data 302 and/or the sensor data 304 collected by the data interface 110 into an information format useful for analysis by the data analyzer 112” (i.e. processors and modified sensor data.) Para. [0057] teaches “An example method to implement the model trainer 404 for heterogeneous sensors is to utilize the model trainer 404 to train a model based on features of a first sensor of a first type to predict the sensor data from a second sensor of a second type.” And “The trained model may determine, by analyzing the features of an image provided by the camera, that an object is located in the path of travel of the vehicle. The model trainer 404 provides the model to be published and stored.” (i.e. model is first set of detection rules.) causing display of, by the one or more processors, via a user device, the first set of one or more detection rules and the set of first modified sensor data; (Para. 
[0077] teaches “For example, the system applicator 118 may issue a request for a user, via an electronic display”) receiving, by the one or more processors from the user device, a user modification of the first set of one or more detection rules, in response to display of the first set of one or more detection rules and the set of first modified sensor data; (Para. [0077] teaches “For example, if the user authorizes use of the alternative operation noted in the system warning, the system applicator 118 may apply system B and continue normal operation using system B for system A (block 820). In some examples, the user may decline the alternative option noted in the system warning (block 822), and the system applicator 118 may determine other suitable available alternative/similar services (block 814) to use based on the predicted data” generating, by the one or more processors, a second set of one or more detection rules at least in part by modifying the first set of one or more detection rules based at least in part on the auxiliary data and the user modification; (Para. [0057] teaches “The example model trainer 404 may also train a model to determine the output of features of sensors S17 and S18 of the example sensor deployment 200 which are infrared (IR) sensors located on the transportation track to detect the obstruction in the track. The example trained model may determine, by analyzing the features, that when an IR sensor outputs a logic 1 (e.g., a digital value that represents a voltage greater than 0), there is an obstruction on the track, and when the IR sensor outputs a logic 0 (e.g., a digital value representing a voltage of zero or very small value) there is not an obstruction on the path” (i.e. s17 and s18 are the second set.) 
collecting, by the one or more processors, a set of second sensor data obtained from the first set of one or more sensors or from a second set of one or more sensors; (Abstract teaches “second sensor data from a second sensor of a monitored system;”) determining, by the one or more processors, that the second set of one or more detection rules is applicable to the set of second sensor data; (Para. [0057] teaches “The example model trainer 404 may then utilize the model published for the sensor S21 to validate the prediction of an obstruction that was determined by the IR-sensor-trained model. By utilizing this technique of data transfer method, the example model trainer 404 is more likely to provide the example anomaly detector 114 with an accurate and precise model of a future prediction of sensor data of a first sensor type to predict a deviation of a correct output operation of sensor data of a second type in the example sensor deployment 200.”) and modifying, by the one or more processors, the set of second sensor data by applying the second set of one or more detection rules to the set of second sensor data, the second set of one or more detection rules correcting the set of second sensor data. (Para. [0057] teaches “By utilizing this technique of data transfer method, the example model trainer 404 is more likely to provide the example anomaly detector 114 with an accurate and precise model of a future prediction of sensor data of a first sensor type to predict a deviation of a correct output operation of sensor data of a second type in the example sensor deployment 200.”)” Para. [0061] teaches “The example feature extractor 602 extracts data of sensors that may be anomalous. 
For example, the feature extractor 602 may receive sensor readings from a first sensor of a first type and determine the first sensor readings are possibly anomalous and the feature extractor 602 starts to extract features from a second sensor (of a first type or second type) to verify the first sensor data is anomalous. In some examples, the feature vector generated by the feature extractor 602 may represent data that is skewed, incorrect, anomalous, etc. compared to data that a sensor is intended to output.” Para. [0030] teaches “provided with a description of an anomaly and determines how to modify operation of the monitored system 108 to correct the error detected by the anomaly detector 114.”)

Agerstam does not explicitly teach collecting, by one or more processors of a computing system, a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data; the set of second sensor data comprising time series data. Jordon teaches, collecting, by one or more processors of a computing system, a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data; the set of second sensor data comprising time series data; (Para. [0005] teaches “method can include operations executed by one or more processors.” Para. [0029] teaches “As another example in which the object 104 is a machine, the time series 102a-n may each correspond to different sensors configured for measuring parameters of the machine. Examples of the sensors may include temperature sensors, pressure sensors, inclinometers, accelerometers, fluid-flow sensors, voltmeters, ammeters, vibration sensors, humidity sensors, position sensors, force sensors, or any combination of these. 
In some such examples, the data points in each of the time series 102a-n may indicate sensor values measured by a respective sensor during the prior time window.”) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Agerstam to include collecting, by one or more processors of a computing system, a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data, and the set of second sensor data comprising time series data, as taught by Jordon. One of ordinary skill would have been motivated to modify Agerstam, because time series analysis enables the analysis of trends, cycles, and patterns over specific time intervals, which allows for accurate forecasting, anomaly detection, and informed, proactive decision-making about the sensor data. Furthermore, Agerstam teaches gathering data over time periods as seen in paragraphs [0090] and [0091].

With respect to claim 2, Agerstam further teaches, The method of claim 1, further comprising generating, by the one or more processors, the first set of one or more detection rules. (Para. [0054] teaches “The example feature extractor 402 may generate a feature vector of a secondary sensor (e.g., a temperature sensor) used to obtain data of the same physical device (e.g., a holding vessel) as a primary sensor (e.g., a thermal imaging camera). In some examples, the feature extractor 402 may also generate a feature vector of a primary sensor.”)

With respect to claim 3, Agerstam further teaches, The method of claim 2, wherein the first set of one or more detection rules is generated based in part on at least one of (i) first sensor data from the first set of one or more sensors, or (ii) prior sensor data from the first set of one or more sensors or from another set of one or more sensors. (Para. [0054] teaches “The example batch model updater 312 of FIG. 
5 is provided with the feature extractor 402 to generate a feature vector based on a query for retrieving historical data from the operational database 310. The feature extractor 402 of FIG. 5 operates in a similar manner as the feature extractor 402 of FIG. 4 described in above examples. However, the feature vectors generated by the example feature extractor 402 of FIG. 5 represent features of different types of historical sensor data. For example, the feature extractor 402 queries the operational database 310 to retrieve historical data from sensors of heterogenous (e.g., different) types and produces output feature vectors corresponding to those heterogeneous sensors.”)

With respect to claim 4, Agerstam further teaches, The method of claim 3, wherein the first set of one or more detection rules is generated based on prior auxiliary data that includes at least one of (i) one or more changes to the prior sensor data, or (ii) one or more labels applicable to the prior sensor data. (Para. [0054] teaches “The example batch model updater 312 of FIG. 5 is provided with the feature extractor 402 to generate a feature vector based on a query for retrieving historical data from the operational database 310. The feature extractor 402 of FIG. 5 operates in a similar manner as the feature extractor 402 of FIG. 4 described in above examples. However, the feature vectors generated by the example feature extractor 402 of FIG. 5 represent features of different types of historical sensor data. 
For example, the feature extractor 402 queries the operational database 310 to retrieve historical data from sensors of heterogenous (e.g., different) types and produces output feature vectors corresponding to those heterogeneous sensors.”)

With respect to claim 5, Agerstam further teaches, The method of claim 4, wherein generating the first set of one or more detection rules comprises analyzing the prior sensor data and the prior auxiliary data by: selecting, by the one or more processors, based on the auxiliary data, a subset of the prior sensor data; (Para. [0068] teaches “At block 708, the example model trainer 404 determines if there is additional historical data to obtain corresponding to the example monitored system 108 of FIG. 1. For example, the model trainer 404 needs to receive historical data of the monitored system 108 that is sufficient to generate a sufficiently accurate and/or precise model.” and determining, by the one or more processors, one or more classes of modifications made to the subset of the prior sensor data. (Para. [0070] teaches “At block 720, the example feature extractor 602 generates a feature vector corresponding to extracted features of sensor data. For example, the feature vector may provide a number of features corresponding to the sensor data of the monitored system 108 such as temperature variables provided by multiple temperature sensors in proximity to one another, variables of an image captured by a camera and digital logic variables of an IR sensor, etc. At block 722, the example feature extractor 602 provides a feature vector to the published model in the example inference generator 604 (FIG. 6).”)

With respect to claim 6, Agerstam further teaches, The method of claim 5, wherein generating the first set of detection rules comprises formulating an expression according to the one or more classes of modifications. (Para. 
[0045] teaches “For example, the batch model updater 312 retrieves historical sensor data from the operational database 310 and generates a feature vector(s) corresponding to the numeric or symbolic characteristics of a set of data (e.g., the historical sensor data). The example batch model updater 312 generates an inference (e.g., a conclusion, a classification, a deduction, etc.) of how a sensor, a system, etc. in the manufacturing environment 102 is operating.”)

With respect to claim 7, Agerstam further teaches, The method of claim 6, wherein the subset of the prior sensor data is selected based on one or more characteristics of the first set of one or more sensors or the another set of one or more sensors. (Para. [0054] teaches “For example, the feature extractor 402 queries the operational database 310 to retrieve historical data from sensors of heterogenous (e.g., different) types and produces output feature vectors corresponding to those heterogeneous sensors.”)

With respect to claim 8, Agerstam further teaches, The method of claim 7, wherein the one or more characteristics corresponds to one or more locations of the first set of one or more sensors or the another set of one or more sensors. (Para. [0033] teaches “In the illustrated example of FIG. 2, the sensor deployment 200 is illustrated with an example first boundary 204, an example second boundary 206, and an example third boundary 208 to illustrate maximum distance ranges that a gateway 214, 216, 218 can communicate with sensors. For example, the first boundary 204 includes sensors S11, S12, S13, S14, S15, S16, S17, S18 that are within the first boundary 204 and are managed by the first gateway 214. In some examples, sensors can be located in more than one boundary. For example, sensor S17, sensor S18, and sensor S21 are in a first overlap 210, the first overlap 210 indicating a crossover between the first boundary 204 and the second boundary 206. 
If a sensor is located in more than one boundary (e.g., an overlap), the sensor may be managed by more than one gateway. For example, the first gateway 214 manages the sensors located in the first boundary 204, and the second gateway 216 manages the sensors located in the second boundary 206. The sensors S17, S18, S21 located in the first overlap 210 can be managed by either gateway 214 and/or gateway 216.” Para. [0057] further teaches “example sensor deployment 200 which are infrared (IR) sensors located on the transportation track”)

With respect to claim 9, Agerstam further teaches, The method of claim 1, wherein the modifying the set of second sensor data comprises applying the second set of one or more detection rules to generate data missing for one or more points in time. (Para. [0041] teaches “As used herein, filters are used to extract pertinent features of a given set of data by removing missing data values or replacing missing data values with placeholders, setting a range of predetermined data values and keeping only the data values within that range, etc. The filters can be predetermined and/or learned over time. The example data analyzer 112 is further described below in conjunction with FIG. 6.”)

With respect to claim 10, Agerstam further teaches, The method of claim 1, wherein the modifying the set of second sensor data comprises applying the second set of one or more detection rules to refine the set of second sensor data. (Para. [0039] teaches “The example apparatus 300 of FIG. 3 is provided with the data processor 308 to transform the planning data 302 and/or the sensor data 304 collected by the data interface 110 into an information format useful for analysis by the data analyzer 112. For example, the data processor 308 can process sensor data 304 to generate transformed sensor data that is non-corrupt, accurate, and/or refined.” Para. 
[0057] teaches “By utilizing this technique of data transfer method, the example model trainer 404 is more likely to provide the example anomaly detector 114 with an accurate and precise model of a future prediction of sensor data of a first sensor type to predict a deviation of a correct output operation of sensor data of a second type in the example sensor deployment 200.”) With respect to claim 11, Agerstam further teaches, The method of claim 1, wherein the auxiliary data comprises metadata corresponding to the first set of modified sensor data. (Para. [0160] teaches “a comparator to determine if the second output data is anomalous by comparing metadata of the second output data to a reference input, the reference input representing non-anomalous data.”) With respect to claim 12, Agerstam further teaches, The method of claim 1, wherein the auxiliary data is indicative of how the first set of modified sensor data was modified by one or more users. (Para. [0128] teaches “a user to enter data and/or commands into the processor 1012. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.”) With respect to claim 13, Agerstam further teaches, The method of claim 1, wherein the auxiliary data comprises data from at least one sensor that is not included in either of the first set of one or more sensors or the second set of one or more sensors. (Para. [0033] teaches “For example, the first gateway 214 manages the sensors located in the first boundary 204, and the second gateway 216 manages the sensors located in the second boundary 206. The sensors S17, S18, S21 located in the first overlap 210 can be managed by either gateway 214 and/or gateway 216.” Para. 
[0036] teaches “A hybrid group or set includes at least two sensors in which a first one of the sensors is of a first sensor type (e.g., a mechanical float-based fluid level sensor) and a second one of the sensors is a second sensor type (e.g., a camera) different from the first sensor type. Such a hybrid sensor group can be located in the first boundary 204, the first overlap 210, or the second boundary 206, and the first gateway 214 collects sensor data therefrom and provides the sensor data to the data interface 110 of FIG. 1. For example, sensor S17, sensor S18, and sensor S21 are located in the first overlap 210 as illustrated in the FIG. 2. Sensor S17 and sensor S18 may be temperature sensors and sensor S21 may be an infrared sensor implemented using a camera. In such example, the three sensors to measure the temperature of a light fixture in the manufacturing environment 102.” Para. [0153] teaches “In Example 19, the subject matter of any one of Examples 17-18 can optionally include determining that the first and second sensors form a heterogenous sensor group and are in proximity to one another, and providing the historical sensor data of the heterogenous sensor group to a data interface”) With respect to claim 14, Agerstam further teaches, The method of claim 1, further comprising presenting, by the one or more processors, via one or more output devices, the second set of one or more detection rules, wherein presenting the second set of one or more detection rules comprises presenting, by the one or more processors, via the one or more output devices of the one or more computing devices, at least one of (i) a first indication of how the first set of one or more detection rules is modified to obtain the second set of one or more detection rules, (ii) a description, definition, summary, or representation of at least a portion of the second set of one or more detection rules, or (iii) a second indication of how applying the second set of one or more detection 
rules to the set of second sensor data modifies the set of second sensor data. (Para. [0043] teaches “In some examples, the notifier 116 is a notification service that is integrated with OT which delivers anomaly alerts through frameworks (e.g., mobile push notifications, logs, emails, etc.) selected for the manufacturing environment 102. In other examples, the notifier 116 can receive data from the data analyzer 112, the model publisher 314, or any other processor, hardware, software, and/or memory that could provide data to the notifier 116.”) With respect to claim 19, Agerstam teaches, A computing system comprising one or more processing circuits configured to: receive, via one or more input devices of one or more computing devices, auxiliary data corresponding to a set of first modified sensor data, wherein the auxiliary data comprises a field visit data sample at least one of (i) one or more changes to the set of first modified sensor data, or (ii) one or more labels applicable to the set of first modified sensor data, the set of first modified sensor data having been obtained or created at least in part by application of a first set of one or more detection rules to a set of first sensor data from a first set of one or more sensors; (Para. [0039] teaches “is provided with the data processor 308 to transform the planning data 302 and/or the sensor data 304 collected by the data interface 110 into an information format useful for analysis by the data analyzer 112” (i.e. processors and modified sensor data.) Para. [0057] teaches “An example method to implement the model trainer 404 for heterogeneous sensors is to utilize the model trainer 404 to train a model based on features of a first sensor of a first type to predict the sensor data from a second sensor of a second type.” And “The trained model may determine, by analyzing the features of an image provided by the camera, that an object is located in the path of travel of the vehicle. 
The model trainer 404 provides the model to be published and stored.” (i.e. model is first set of detection rules.) cause display of, via a user device, the first set of one or more detection rules and the set of first modified sensor data; (Para. [0077] teaches “For example, the system applicator 118 may issue a request for a user, via an electronic display”) receive, from the user device, a user modification of the first set of one or more detection rules, in response to display of the first set of one or more detection rules and the set of first modified sensor data; (Para. [0077] teaches “For example, if the user authorizes use of the alternative operation noted in the system warning, the system applicator 118 may apply system B and continue normal operation using system B for system A (block 820). In some examples, the user may decline the alternative option noted in the system warning (block 822), and the system applicator 118 may determine other suitable available alternative/similar services (block 814) to use based on the predicted data”) generate a second set of one or more detection rules at least in part by modifying the first set of one or more detection rules based at least in part on the auxiliary data and the user modification; (Para. [0057] teaches “The example model trainer 404 may also train a model to determine the output of features of sensors S17 and S18 of the example sensor deployment 200 which are infrared (IR) sensors located on the transportation track to detect the obstruction in the track. The example trained model may determine, by analyzing the features, that when an IR sensor outputs a logic 1 (e.g., a digital value that represents a voltage greater than 0), there is an obstruction on the track, and when the IR sensor outputs a logic 0 (e.g., a digital value representing a voltage of zero or very small value) there is not an obstruction on the path” (i.e. s17 and s18 are the second set.) 
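The IR-sensor behavior in the Para. [0057] passage quoted above amounts to a simple threshold-style detection rule. A minimal sketch of such a rule is shown below; all function and variable names are hypothetical illustrations, not taken from Agerstam.

```python
# Illustrative sketch only: a detection rule of the kind described in
# Agerstam's Para. [0057], where an IR sensor's logic 1 output indicates
# an obstruction on the track and logic 0 indicates a clear track.

def ir_obstruction_rule(reading: int) -> bool:
    """Return True when a digital IR reading signals an obstruction."""
    return reading == 1

def apply_rule(readings, rule):
    """Apply one detection rule across a sequence of sensor readings."""
    return [rule(r) for r in readings]

obstruction_flags = apply_rule([0, 1, 0, 0, 1], ir_obstruction_rule)
```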
collect a set of second sensor data obtained from the first set of one or more sensors or from a second set of one or more sensors; (Abstract teaches “second sensor data from a second sensor of a monitored system;”) determine that the second set of one or more detection rules is applicable to the set of second sensor data; (Para. [0057] teaches “The example model trainer 404 may then utilize the model published for the sensor S21 to validate the prediction of an obstruction that was determined by the IR-sensor-trained model. By utilizing this technique of data transfer method, the example model trainer 404 is more likely to provide the example anomaly detector 114 with an accurate and precise model of a future prediction of sensor data of a first sensor type to predict a deviation of a correct output operation of sensor data of a second type in the example sensor deployment 200.”) and modify the set of second sensor data by applying the second set of one or more detection rules to the set of second sensor data, the second set of one or more detection rules correcting the set of second sensor data. (Para. [0057] teaches “By utilizing this technique of data transfer method, the example model trainer 404 is more likely to provide the example anomaly detector 114 with an accurate and precise model of a future prediction of sensor data of a first sensor type to predict a deviation of a correct output operation of sensor data of a second type in the example sensor deployment 200.”) Para. [0061] teaches “The example feature extractor 602 extracts data of sensors that may be anomalous. For example, the feature extractor 602 may receive sensor readings from a first sensor of a first type and determine the first sensor readings are possibly anomalous and the feature extractor 602 starts to extract features from a second sensor (of a first type or second type) to verify the first sensor data is anomalous. 
In some examples, the feature vector generated by the feature extractor 602 may represent data that is skewed, incorrect, anomalous, etc. compared to data that a sensor is intended to output.” Para. [0030] teaches “provided with a description of an anomaly and determines how to modify operation of the monitored system 108 to correct the error detected by the anomaly detector 114.” Agerstam does not explicitly teach, collect, a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data; the set of second sensor data comprising time series data; Jordon teaches, collect, a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data; the set of second sensor data comprising time series data; (Para. [0005] teaches “method can include operations executed by one or more processors.” Para. [0029] teaches “As another example in which the object 104 is a machine, the time series 102a-n may each correspond to different sensors configured for measuring parameters of the machine. Examples of the sensors may include temperature sensors, pressure sensors, inclinometers, accelerometers, fluid-flow sensors, voltmeters, ammeters, vibration sensors, humidity sensors, position sensors, force sensors, or any combination of these. In some such examples, the data points in each of the time series 102a-n may indicate sensor values measured by a respective sensor during the prior time window.”) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Agerstam to collect a set of first sensor data from a first set of one or more sensors, the set of first sensor data comprising historical time series data, and the set of second sensor data comprising time series data, such as that of Jordon. 
One of ordinary skill would have been motivated to modify Agerstam, because time series analysis enables the identification of trends, cycles, and patterns over specific time intervals, which allows for accurate forecasting, anomaly detection, and informed, proactive decision-making about the sensor data. Furthermore, Agerstam teaches gathering data over time periods as seen in paragraphs [0090] and [0091]. With respect to claim 20, Agerstam further teaches, The computing system of claim 19, the one or more processing circuits further configured to generate the first set of one or more detection rules based in part on (i) prior sensor data from the first set of one or more sensors or from another set of one or more sensors, and (ii) prior auxiliary data that includes at least one of (i) one or more changes to the prior sensor data, or (ii) one or more labels applicable to the prior sensor data. (Para. [0054] teaches “The example batch model updater 312 of FIG. 5 is provided with the feature extractor 402 to generate a feature vector based on a query for retrieving historical data from the operational database 310. The feature extractor 402 of FIG. 5 operates in a similar manner as the feature extractor 402 of FIG. 4 described in above examples. However, the feature vectors generated by the example feature extractor 402 of FIG. 5 represent features of different types of historical sensor data. 
For example, the feature extractor 402 queries the operational database 310 to retrieve historical data from sensors of heterogenous (e.g., different) types and produces output feature vectors corresponding to those heterogeneous sensors.”) With respect to claim 21, Agerstam further teaches, The computing system of claim 19, the one or more processing circuits further configured to present, via one or more output devices, the second set of one or more detection rules, wherein presenting the second set of one or more detection rules comprises at least one of: presenting, via the one or more output devices of the one or more computing devices, a first indication of how the first set of one or more detection rules is modified to obtain the second set of one or more detection rules; presenting, via the one or more output devices of the one or more computing devices, a description, definition, summary, or representation of at least a portion of the second set of one or more detection rules; or presenting, via the one or more output devices of the one or more computing devices, a second indication of how applying the second set of one or more detection rules to the set of second sensor data modifies the set of second sensor data. (Para. [0043] teaches “In some examples, the notifier 116 is a notification service that is integrated with OT which delivers anomaly alerts through frameworks (e.g., mobile push notifications, logs, emails, etc.) selected for the manufacturing environment 102. In other examples, the notifier 116 can receive data from the data analyzer 112, the model publisher 314, or any other processor, hardware, software, and/or memory that could provide data to the notifier 116.”) Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Agerstam (US 20190138423 A1) and Jordon (US 20240330759 A1) as applied to claim 1 above, and further in view of Adler (US 20190204080 A1). 
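The feature-vector generation that the Examiner maps to claims 20-21 (Agerstam Para. [0054], quoted above) can be sketched roughly as follows. The sensor names and the particular summary statistics are assumptions for illustration only, not details taken from the reference.

```python
# Rough sketch (assumed names, not Agerstam's code) of producing feature
# vectors from historical data of heterogeneous sensor types, as in the
# Para. [0054] passage quoted above.

from statistics import mean

def feature_vector(history):
    """Summarize one sensor's historical readings as numeric features."""
    return [min(history), max(history), mean(history)]

def extract_features(historical_data):
    """Build a feature vector per sensor, regardless of sensor type."""
    return {sensor: feature_vector(values)
            for sensor, values in historical_data.items()}

features = extract_features({
    "S17_temperature": [20.0, 21.5, 19.5],  # analog temperature readings
    "S21_infrared": [0, 1, 0, 1],           # digital IR readings
})
```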
With respect to claim 15, The combination of Agerstam and Jordon does not explicitly teach, The method of claim 1, wherein the first set of one or more sensors and the second set of one or more sensors are sensors detecting conditions of one or more bodies of water. Adler teaches, The method of claim 1, wherein the first set of one or more sensors and the second set of one or more sensors are sensors detecting conditions of one or more bodies of water. (Para. [0008] teaches “accessing input data derived from one or more sensors, wherein the one or more sensors are configured to provide data representative of wave activity in a body of water, wherein the one or more sensors include sensors provided on one or more buoys located in a wave approach region for a notification zone” Para. [0132] teaches “For example, wave observations during a calibration analysis period are calibrated using secondary sensors and/or visual observations, and that calibration data is used to assist in analysis of wave data during a subsequent period”) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Agerstam and Jordon wherein the first set of one or more sensors and the second set of one or more sensors are sensors detecting conditions of one or more bodies of water such as that of Adler. One of ordinary skill would have been motivated to modify the combination of Agerstam and Jordon, because as seen in Para. [0002] of Agerstam “Anomaly detection is the process of identifying outliers in the inputs for a problem domain (e.g., data accuracy in an IoT environment). Example anomalies include, but are not limited to, a faulty sensor, a credit card fraud, a failure in a mechanical part or control system, etc. 
Some example impacts of anomalies include, but are not limited to, monetary losses, property damage, loss of life, etc.” The sensors in Adler collect anomalous data as the sensors must be calibrated and the detected waves could be dangerous and cause loss of life as seen in Para. [0062] of Adler. Claims 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Agerstam (US 20190138423 A1) and Jordon (US 20240330759 A1) as applied to claim 1 above, and further in view of Leverich (US 10992560 B2). With respect to claim 16, the combination of Agerstam and Jordon does not explicitly teach, The method of claim 1, wherein the second set of one or more detection rules comprises a second plurality of detection rules, and wherein generating the second plurality of detection rules comprises generating a second sequential order in which the second plurality of rules are to be applied to the set of second sensor data. Leverich teaches, wherein the second set of one or more detection rules comprises a second plurality of detection rules, and wherein generating the second plurality of detection rules comprises generating a second sequential order in which the second plurality of rules are to be applied to the set of second sensor data. (Col. 26 Ln(s). [28-30] teaches “In an embodiment, an anomaly detection configuration of an anomaly detection definition may provide information that is necessary to establish and configure the sequential set of data points of a signal (or in the case of a cohesive anomaly detection procedure, a plurality of signals), update signals based on incoming data streams 904, perform pre-loading or backfilling of signal based on historical data 908, dictate anomaly detection procedures associated with the anomaly detection definition and one or more signals, provide configuration parameters for the signals and anomaly detection procedures” Col. 38 Ln(s). 
[23-31] teaches “In some embodiments, processing of the new data points may include reordering the data points (e.g., signal data chunks 1124.sub.1-1124.sub.M) in a buffer (e.g., a reorder buffer 1134.sub.1-1134.sub.M), although, in some embodiments, ordering maybe performed at other times such as at the time of anomaly detection. In some embodiments, one or more of the data points within a signal may be also be updated based on backfill data” It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Agerstam and Jordon wherein the second set of one or more detection rules comprises a second plurality of detection rules, and wherein generating the second plurality of detection rules comprises generating a second sequential order in which the second plurality of rules are to be applied to the set of second sensor data such as that of Leverich. One of ordinary skill would have been motivated to modify the combination of Agerstam and Jordon, because as seen in Para. [0050] of Agerstam “As used herein, temporal correlation is defined as the nature of the physical phenomenon between each observation of a sensor in the example sensor deployment 200 (e.g., the sensors observing the same type of physical phenomenon). In some examples, identifying the spatial-temporal correlation between sensors in the example sensor deployment 200 can reduce the data traffic generated in the example gateways 214, 216, and 218 by collecting data of two sensors spatially and temporally correlated rather than collecting the data of the plurality of sensors associated with one of the example gateways 214, 216, 218 to detect an anomaly.” A temporal correlation suggests a certain sequence of detection. Furthermore, Col. 28 Ln(s). 
[43-48] teach “The common data structure may allow for the anomaly detection configurations to be accessed, implemented, invoked, and updated in a similar manner.” With respect to claim 17, The combination of Agerstam and Jordon does not explicitly teach, The method of claim 16, wherein the second sequential order is based on at least one of an attribute or an action of each of the second plurality of detection rules. Leverich teaches, The method of claim 16, wherein the second sequential order is based on at least one of an attribute or an action of each of the second plurality of detection rules. (Col. 26 Ln(s). [28-30] teaches “In an embodiment, an anomaly detection configuration of an anomaly detection definition may provide information that is necessary to establish and configure the sequential set of data points of a signal (or in the case of a cohesive anomaly detection procedure, a plurality of signals), update signals based on incoming data streams 904, perform pre-loading or backfilling of signal based on historical data 908, dictate anomaly detection procedures associated with the anomaly detection definition and one or more signals, provide configuration parameters for the signals and anomaly detection procedures” Col. 38 Ln(s). [23-31] teaches “In some embodiments, processing of the new data points may include reordering the data points (e.g., signal data chunks 1124.sub.1-1124.sub.M) in a buffer (e.g., a reorder buffer 1134.sub.1-1134.sub.M), although, in some embodiments, ordering maybe performed at other times such as at the time of anomaly detection. 
In some embodiments, one or more of the data points within a signal may be also be updated based on backfill data” It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Agerstam and Jordon wherein the second sequential order is based on at least one of an attribute or an action of each of the second plurality of detection rules such as that of Leverich. One of ordinary skill would have been motivated to modify the combination of Agerstam and Jordon, because as seen in Para. [0050] of Agerstam “As used herein, temporal correlation is defined as the nature of the physical phenomenon between each observation of a sensor in the example sensor deployment 200 (e.g., the sensors observing the same type of physical phenomenon). In some examples, identifying the spatial-temporal correlation between sensors in the example sensor deployment 200 can reduce the data traffic generated in the example gateways 214, 216, and 218 by collecting data of two sensors spatially and temporally correlated rather than collecting the data of the plurality of sensors associated with one of the example gateways 214, 216, 218 to detect an anomaly.” A temporal correlation suggests a certain sequence of detection. Furthermore, Leverich Col. 28 Ln(s). [43-48] teach “The common data structure may allow for the anomaly detection configurations to be accessed, implemented, invoked, and updated in a similar manner.” With respect to claim 18, the combination of Agerstam and Jordon does not explicitly teach, The method of claim 17, further comprising applying the second plurality of detection rules to the set of second sensor data according to the second sequential order. Leverich teaches, further comprising applying the second plurality of detection rules to the set of second sensor data according to the second sequential order. (Col. 26 Ln(s). 
[28-30] teaches “In an embodiment, an anomaly detection configuration of an anomaly detection definition may provide information that is necessary to establish and configure the sequential set of data points of a signal (or in the case of a cohesive anomaly detection procedure, a plurality of signals), update signals based on incoming data streams 904, perform pre-loading or backfilling of signal based on historical data 908, dictate anomaly detection procedures associated with the anomaly detection definition and one or more signals, provide configuration parameters for the signals and anomaly detection procedures” Col. 38 Ln(s). [23-31] teaches “In some embodiments, processing of the new data points may include reordering the data points (e.g., signal data chunks 1124.sub.1-1124.sub.M) in a buffer (e.g., a reorder buffer 1134.sub.1-1134.sub.M), although, in some embodiments, ordering maybe performed at other times such as at the time of anomaly detection. In some embodiments, one or more of the data points within a signal may be also be updated based on backfill data” It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Agerstam and Jordon, further comprising applying the second plurality of detection rules to the set of second sensor data according to the second sequential order such as that of Leverich. One of ordinary skill would have been motivated to modify the combination of Agerstam and Jordon, because as seen in Para. [0050] of Agerstam “As used herein, temporal correlation is defined as the nature of the physical phenomenon between each observation of a sensor in the example sensor deployment 200 (e.g., the sensors observing the same type of physical phenomenon). 
In some examples, identifying the spatial-temporal correlation between sensors in the example sensor deployment 200 can reduce the data traffic generated in the example gateways 214, 216, and 218 by collecting data of two sensors spatially and temporally correlated rather than collecting the data of the plurality of sensors associated with one of the example gateways 214, 216, 218 to detect an anomaly.” A temporal correlation suggests a certain sequence of detection. Furthermore, Leverich, Col. 28 Ln(s). [43-48] teach “The common data structure may allow for the anomaly detection configurations to be accessed, implemented, invoked, and updated in a similar manner.”) Conclusion Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOSHUA L FORRISTALL whose telephone number is 703-756-4554. The examiner can normally be reached Monday-Friday 8:30 AM- 5 PM. 
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Andrew Schechter can be reached on 571-272-2302. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /JOSHUA L FORRISTALL/Examiner, Art Unit 2857 /ANDREW SCHECHTER/Supervisory Patent Examiner, Art Unit 2857

Prosecution Timeline

Apr 07, 2023: Application Filed
Aug 07, 2025: Non-Final Rejection — §101, §103
Oct 27, 2025: Examiner Interview Summary
Oct 27, 2025: Applicant Interview (Telephonic)
Nov 13, 2025: Response Filed
Feb 18, 2026: Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12572161: METHOD AND CONTROL DEVICE FOR CONTROLLING A ROTATIONAL SPEED (granted Mar 10, 2026; 2y 5m to grant)
Patent 12546581: CAPACITIVE DETECTION OF FOLD ANGLE FOR FOLDABLE DEVICES (granted Feb 10, 2026; 2y 5m to grant)
Patent 12516599: MONITORING CORROSION IN DOWNHOLE EQUIPMENT (granted Jan 06, 2026; 2y 5m to grant)
Patent 12481043: SYSTEMS AND TECHNIQUES FOR DEICING SENSORS (granted Nov 25, 2025; 2y 5m to grant)
Patent 12455392: METHOD TO CORRECT VSP DATA (granted Oct 28, 2025; 2y 5m to grant)
Based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 69%
With Interview: 92% (+23.4%)
Median Time to Grant: 3y 3m
PTA Risk: Moderate
Based on 58 resolved cases by this examiner. Grant probability derived from career allow rate.
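The headline figures above are consistent with a simple derivation from the career data shown on this page. The sketch below is an inference from the page's own labels (40 granted of 58 resolved, +23.4% interview lift), not the tool's documented formula.

```python
# Assumed derivation of the displayed statistics from the career data
# shown above. This is an inference from the labels on the page.

granted, resolved = 40, 58
allow_rate = granted / resolved              # career allow rate
interview_lift = 0.234                       # observed lift with interviews

with_interview = min(allow_rate + interview_lift, 1.0)

print(f"Grant probability: {allow_rate:.0%}")        # 69%
print(f"With interview:    {with_interview:.0%}")    # 92%
```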
