This Office action is in response to application filed on 04/07/2023.
DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA .
Information Disclosure Statement
2. The Information Disclosure Statements (IDSs) filed 04/07/2023, 10/03/2024, 09/17/2025, and 01/16/2026 have been considered.
Abstract Objection
3. The abstract filed on 04/07/2023 is objected to because of the following informalities:
The language of the abstract should be clear and concise and should not repeat information given in the title or in the body of the claims. It should avoid phrases such as “disclosed are…”, “Embodiments include…”, “The present disclosure…”, or “This object is achieved by…” See MPEP § 608.01(b) and 37 C.F.R. 1.438. The purpose of the abstract is to enable the Office and the public generally to determine quickly from a cursory inspection the nature and gist of the technical disclosure. See MPEP § 608.01(b) and 37 C.F.R. 1.72.
Appropriate correction is required.
Note: The abstract should be labeled with “Current Amendment”.
Claim Objections
4. Claims 1, 11-15, and 20 are objected to because of the following informalities:
Claims 1, 13-14, and 20 recite the term “would be,” which is not a positive limitation. It is suggested to rewrite the limitation, e.g., claim 1, lines 7-9, as: “presenting, via one or more output devices of one or more computing devices, an indication of [[how]] at least a portion of the first sensor data being modified through application of the one or more detection rules.” Claims 13-14 and 20 can be amended in the same manner as claim 1.
In claims 11-12, “the second sensor data” should read “a second sensor data”. Further, the period “.” is missing at the end of claim 11.
In claims 12-14, “via one or more input devices” should read “via the one or more input devices”.
In claims 13-14, “via one or more output devices” should read “via the one or more output devices”. Further, there is an extra “and” at the end of line 4 of claim 13 and line 7 of claim 14.
Claim 15, line 7, is missing “and” before the word “wherein”.
Claim 20 recites “collect (i) first sensor data”, where “(i)” is unnecessary and should be removed.
Appropriate correction is required.
Claim Rejections - 35 USC § 101
5. 35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
6. Claims 1-23 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Under Step 1 of the 2019 Revised Patent Subject Matter Eligibility Guidance, the claims are directed to a process (claim 1) and a system/apparatus (claim 20), which are statutory categories.
However, evaluating claims 1 and 20 under Step 2A, Prong One, the claims are directed to the judicial exception of an abstract idea under the grouping of mental processes, including “generating or identifying one or more detection rules applicable to the first sensor data, for refining or supplementing the first sensor data, by analyzing at least a portion of the first sensor data; presenting an indication of how at least a portion of the first sensor data would be refined or supplemented through application of the one or more detection rules; and applying the one or more detection rules to the first sensor data to refine or supplement the first sensor data.”
Next, Step 2A, Prong Two evaluates whether additional elements of the claims
"integrate the abstract idea into a practical application" in a manner that imposes a
meaningful limit on the judicial exception, such that the claims are more than a drafting effort designed to monopolize the exception. The additional limitations of “collecting, by one or more processors of a computing system, first sensor data comprising a time series of data obtained from a first set of one or more sensors; and receiving, by the one or more processors, via one or more input devices of the one or more computing devices, approval of the application of the one or more detection rules to the first sensor data” are merely data gathering using generic sensors, which is a form of insignificant extra-solution activity, and use of a processor, which is a generic computer component recited at a high level of generality. These additional limitations do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims are directed to the abstract idea. Thus, claims 1 and 20 are not patent eligible.
At Step 2B, consideration is given to whether any additional elements amount to significantly more than the abstract idea. Under Step 2B, there are no additional elements that make the claims significantly more than the abstract idea.
The additional limitations, as recited above under Step 2A, Prong Two, are considered insignificant extra-solution activities and mere computer implementation using generic computer elements, which do not provide significantly more under Step 2B.
Dependent claims 2-19 and 21-23 do not integrate the claims into a practical application or amount to “significantly more” because they merely add details to the algorithm that forms the abstract idea and/or include additional limitations that are insignificant extra-solution activities and mere computer implementation using generic computer elements. Thus, the dependent claims are also ineligible.
Claim Rejections - 35 USC § 112
7. The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
8. Claims 21-23 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
The recitation in claims 21-23 of “the computer system of claim 17” lacks antecedent basis because claim 17 is a method claim. For purposes of examination, claims 21-23 are assumed to depend on claim 20.
Claim Rejections - 35 USC § 102
9. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless -
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on
sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
10. Claims 1-5, 7-15, and 17-23 are rejected under 35 U.S.C. 102(a)(1) and 35 U.S.C. 102(a)(2) as being anticipated by US 2019/0138423 of Agerstam et al., hereinafter Agerstam (IDS of record).
As per Claim 1, Agerstam teaches a method comprising:
collecting, by one or more processors of a computing system (Fig 12, [0124]), first sensor data comprising a time series of data obtained from a first set of one or more sensors (Fig 10A: collector 1014 collects data from “camera C1” considered first set of sensors, step 1016 shows a list of sensor data in a first time, where sensor data corresponding with “time stamps” or “time series”, see [0090], [0108], [0111]);
generating or identifying, by the one or more processors, one or more detection rules applicable to the first sensor data (analyzer 112 applies rules-based system to generate scores in step A1020, see [0092], [0110], Fig 11A step 1108 using rules for each sensor data [0109]), for refining or supplementing the first sensor data, by analyzing at least a portion of the first sensor data (generating transformed sensor data and refined are considered “analyzed data” [0039], [0030], step 1020 shows a list of scores corresponds to sensors, i.e., N_C1 and N_C2 considered “analyzed data of first sensor data” [0088]);
presenting, by the one or more processors, via one or more output devices of one or more computing devices, an indication of how at least a portion of the first sensor data would be refined or supplemented through application of the one or more detection rules (Fig 1: notifier 116 “output device” generates a type of an anomaly detected, see [0029], notifier 116 can send a trigger to calibrate “refine” data, see [0044], [0030]);
receiving, by the one or more processors, via one or more input devices of the one or more computing devices, approval of the application of the one or more detection rules to the first sensor data (Fig 1: system applicator 118 receives anomalous report and corrects the system [0029], i.e., analyzed sensor data needs to be modified to correct, see [0063] [0072]); and
applying, by the one or more processors, the one or more detection rules to the first sensor data to refine or supplement the first sensor data (using rules for each sensor data, see [0109], [0063]).
As per Claim 2, Agerstam teaches the method of claim 1, further comprising
collecting, by the one or more processors, auxiliary data corresponding at least in part to the first sensor data, wherein the one or more detection rules are generated based at least on the auxiliary data (temperature data of camera data considered “auxiliary data”, see [0036], i.e., a thermal imaging camera generates redundant data of a temperature sensor, see [0053], [0055]) by: selecting, by the one or more processors, based on the auxiliary data, a subset of the first sensor data (consecutive images are captured at “a time resolution of milliseconds/minutes” apart considered “a subset of camera data”, see [0090]); and
determining, by the one or more processors, one or more classes of modifications made to the subset of the first sensor data (identifying the physical damage of sensor data from the camera’s images [0053]; “features of sensors of infrared data” considered a “subset of temperature data from camera data”, see [0036], [0057]; Fig 11C, step 1160: classify sensor data, detect malfunction and modify, see [0120], [0082], [0099]).
As per Claim 3, Agerstam teaches the method of claim 2, wherein generating the one or more detection rules comprises formulating, by the one or more processors, an expression according to the one or more classes of modifications (i.e., assign “1” for an anomalous score, see [0092], [0110], [0118]).
As per Claim 4, Agerstam teaches the method of claim 2, wherein the subset of the first sensor data is selected based on one or more characteristics of one or more of the set of one or more sensors from which the first sensor data was obtained (features of data of sensors considered “characteristics of sensors”, see [0027]-[0028], [0045]).
As per Claim 5, Agerstam teaches the method of claim 4, wherein the one or more characteristics corresponds to one or more locations of the one or more of the set of one or more sensors from which the first sensor data was obtained (identify features in sensor data correspond to location, see [0048], [0035], [0053]).
As per Claim 7, Agerstam teaches the method of claim 1, further comprising collecting, by the one or more processors, a set of metadata corresponding to the first sensor data (see [0098], [0100], [0114]-[0115]), wherein the one or more detection rules are further generated based on the set of metadata (see [0162], [0174]).
As per Claim 8, Agerstam teaches the method of claim 1, further comprising collecting, by the one or more processors, auxiliary data comprising a transformation of at least part of the first sensor data (Fig 10A, step 1016: analyzer 112 collects sensor data; the step A output shows N_C1, considered “transformed data” of the first set of sensors C).
As per Claim 9, Agerstam teaches the method of claim 1, further comprising collecting, by the one or more processors, auxiliary data comprising a modified time series of sensor data corresponding at least in part to the first sensor data (Fig 11B, steps 1116, 1104- obtain current/future sensor data at time T and T+1 “time series”, see [0090], [0108]), wherein at least a subset of the first sensor data is modified or deleted, wherein the auxiliary data is indicative of changes made to the first sensor data (extract temperature value vector considered “a change in data”, see [0048], [0069]).
As per Claim 10, Agerstam teaches the method of claim 1, further comprising collecting, by the one or more processors, auxiliary data comprising data from at least one sensor that is not included in the first set of one or more sensors (distance values considered an auxiliary data of radar sensors, which are not included in the camera set or “first set of sensors”, see [0089]).
As per Claim 11, Agerstam teaches the method of claim 1, wherein the indication includes a description, definition, summary, or representation of at least one of (i) at least a portion of the one or more detection rules, or (ii) at least a portion of the second sensor data based on application of the one or more detection rules (generate a score for each sensor, i.e., a score between “0” and “1” that represents the “detection rules”, see [0092], [0109]-[0111]).
As per Claim 12, Agerstam teaches the method, wherein the method further comprises receiving, by the one or more processors, via one or more input devices of the one or more computing devices, a user supplied modification of the one or more detection rules (Fig 8 - a request for a user and permit to modify, see [0077]), and wherein the application of the one or more detection rules to modify the second sensor data includes application of the user supplied modification (a second sensor data, i.e., camera “C2” as shown in Fig 10A, see [0071]; the second sensor data with 0.81 deviation as anomalous, see [0072]; the sensors under analysis need to be modified to correct, see [0063], [0135]).
As per Claim 13, Agerstam teaches the method of claim 1, further comprising:
receiving, by the one or more processors, via one or more input devices of the one or more computing devices (Fig 10A, data collected from cameras C1, C2, …), at least one of (i) one or more modifications to the first sensor data (extract temperature value vector considered “a change in data”, see [0048], [0069]), or (ii) one or more labels applied to the first sensor data; and
generating, by the one or more processors, one or more detection rules based on at least one of (i) the one or more modifications to the first sensor data (generate a feature vector and extract temperature value vector considered “a change in data”, see [0048], [0069]), or (ii) the one or more labels applied to the first sensor data;
collecting, by the one or more processors, second sensor data obtained from the first set of one or more sensors (Fig 10A - collect C2 considered a second sensor data of a first set of camera sensors) or a second set of one or more sensors;
presenting, by the one or more processors, via one or more output devices of one or more computing devices, an indication of how the second sensor data would be at least one of modified or labeled through application of the one or more detection rules (Fig 1: notifier 116 “output device” generates a type of an anomaly detected, see [0029], notifier 116 can send a trigger to calibrate “refine” data, see [0044], [0030]);
receiving, by the one or more processors, via one or more input devices of the one or more computing devices, approval of the application of the one or more detection rules to the second sensor data (receiving inputs and rules from, i.e., camera 2 “second sensor data”, see [0092], [0109]); and
applying, by the one or more processors, the one or more detection rules to the second sensor data to refine or supplement the second sensor data (see [0072], [0104]).
As per Claim 14, Agerstam teaches the method, wherein generating the one or more detection rules comprises:
selecting, by the one or more processors, based on one or more modifications to the first sensor data or one or more labels applied to the first sensor data, a subset of the first sensor data (consecutive images are captured at “a time resolution of milliseconds/minutes” apart considered “a subset of camera data”, see [0090]);
determining, by the one or more processors, one or more classes of modifications made to the subset of the first sensor data (identifying the physical damage of sensor data from the camera’s images [0053]; “features of sensors of infrared data” considered a “subset of temperature data from camera data”, see [0036], [0057]; Fig 11C, step 1160: classify sensor data, detect malfunction and modify, see [0120], [0082], [0099]); and
formulating, by the one or more processors, an expression according to the one or more classes of modifications (i.e., assign “1” for an anomalous score, see [0092], [0110], [0118]); and wherein the method further comprises:
presenting, by the one or more processors, via one or more output devices of one or more computing devices, an indication of how second sensor data would be at least one of modified or labeled through application of the one or more detection rules (Fig 1: notifier 116 “output device” generates a type of an anomaly detected, see [0029], notifier 116 can send a trigger to calibrate “refine” data, see [0044], [0030]), the second sensor data obtained from one or more sensors in at least one of the first set of one or more sensors or a second set of one or more sensors (Fig 10A - collect C2 considered a second sensor data of a first set of camera sensors); and
receiving, by the one or more processors, via one or more input devices of the one or more computing devices, a user supplied modification of the set of one or more detection rules (Fig 8 - a request for a user and permit to modify, see [0077]), wherein the application of the one or more detection rules to modify the second sensor data includes application of the user supplied modification (a second sensor data, i.e. camera “C2” as shown in Fig 10A, see [0071], the second sensor data with 0.81 deviation as anomalous, see [0072], the sensors under analysis need to be modified to correct, see [0063], [0135]).
As per Claim 15, Agerstam teaches the method of claim 1, wherein generating the one or more detection rules comprises:
selecting, by the one or more processors, based on one or more modifications to the first sensor data or one or more labels applied to the first sensor data, a subset of the first sensor data (consecutive images are captured at “a time resolution of milliseconds/minutes” apart considered “a subset of camera data”, see [0090]); and
determining, by the one or more processors, one or more classes of modifications made to the subset of the first sensor data (identifying the physical damage of sensor data from the camera’s images [0053]; “features of sensors of infrared data” considered a “subset of temperature data from camera data”, see [0036], [0057]; Fig 11C, step 1160: classify sensor data, detect malfunction and modify, see [0120], [0082], [0099]), wherein the subset of the first sensor data is selected based on one or more characteristics of the set of one or more sensors (features of data of sensors considered “characteristics of sensors”, see [0027]-[0028], [0045]), wherein the one or more characteristics corresponds to at least one of (i) one or more locations of the first set of one or more sensors (identify features in sensor data correspond to location, see [0048], [0035], [0053]), or (ii) one or more bodies of water from which sensor readings are collected using the first set of one or more sensors.
As per Claim 17, Agerstam teaches the method of claim 1, wherein the one or more detection rules comprises a plurality of detection rules (rules repository [0092], [0105]), and wherein generating or identifying the plurality of detection rules comprises generating or identifying a sequential order in which the plurality of detection rules are to be applied to the first sensor data (identify a pattern intended data attribute, a pattern data considered sequential order, see [0049],[0081], analyzer 112 applies rules-based system to generate scores in step A1020, see [0092], [0109]-[0110]).
As per Claim 18, Agerstam teaches the method of claim 17, wherein the sequential order is based on at least one of an attribute or an action of each of the plurality of detection rules (identify a pattern intended data attribute, a pattern data considered sequential order [0049]).
As per Claim 19, Agerstam teaches the method of claim 17, further comprising applying the plurality of detection rules to the first sensor data according to the sequential order (a list of scores corresponds to sensors [0088], assign “1”/”0” for an anomalous/abnormal score, see [0092], [0110], [0118]).
Claim 20 is rejected for the same rationale as in claim 1.
As per Claim 21, Agerstam teaches the computing system of claim 17, the one or more processing circuits further configured to communicate with at least one of: (A) a second computing system to collect at least one of (i) the first sensor data or (ii) the modification data; or (B) the first set of one or more sensors (a second camera 1004 considered a “second system” that captures images [0089], see Fig 2 for communication [0036]).
As per Claim 22, Agerstam teaches the computing system of claim 17, the one or more processing circuits further configured to collect auxiliary data corresponding at least in part to the first sensor data (temperature data of camera data considered “auxiliary data”, see [0036], [0070]).
Claim 23 is rejected for the same rationale as in claim 2.
Claim Rejections - 35 USC § 103
11. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
12. Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Agerstam in view of US 2020/0055577 of Shoemake et al., hereinafter Shoemake.
As per Claim 6, Agerstam teaches the method of claim 4, but does not teach wherein the one or more characteristics correspond to one or more bodies of water from which sensor readings are collected using the set of one or more sensors from which the first sensor data was obtained, and wherein the set of one or more sensors detect conditions of the one or more bodies of water. Shoemake teaches the one or more characteristics correspond to one or more bodies of water from which sensor readings are collected using the set of one or more sensors from which the first sensor data was obtained, and wherein the set of one or more sensors detect conditions of the one or more bodies of water (a water sensor system comprising one or more water sensors, Table 1 shows water sensors include pH/ salinity/ temperature/ water depth sensors, when the sensor passes through water the light collected [0072]-[0073]).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the teaching of Agerstam with the characteristics, i.e., the frequency measurement of the sensor, as taught by Shoemake, in order to compensate for the effect that would create an incorrect measurement of the fluorescent light intensity, in which a simultaneous measurement of visible light transmitted through the oil/water sample is used to mathematically correct for the decreased fluorescent light measurement (Shoemake, [0075]).
13. Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Agerstam in view of US 2021/0357292 of Pickering et al., hereinafter Pickering.
As per Claim 16, Agerstam teaches the method of claim 1, but does not teach wherein applying the one or more detection rules generates data missing from the first sensor data for one or more points in time. Pickering teaches applying the one or more detection rules to generate data missing from the first sensor data for one or more points in time (generate missing data model, data stream associated with “time stamps” considered “points in time” [0027]; the missing data points based upon rules based on verification [0031], [0046]). It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the teaching of Agerstam to generate missing data as taught by Pickering, which would facilitate processing a data stream and estimating time values for the missing data points (Pickering, [0039]).
Conclusion
14. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
US 2022/0026313 of Straehle, Method and device for detecting anomalies in sensor recordings of a technical system.
US Patent 11,544,161 of Yarlagadda et al., Identifying anomalous sensors.
US 2017/0184416 of Kohlenberg et al., Technologies for managing sensor anomalies.
15. Any inquiry concerning this communication or earlier communications from the examiner should be directed to LYNDA DINH, whose telephone number is (571) 270-7150. The examiner can normally be reached M-F, 10 AM-6 PM ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an
interview, applicant is encouraged to use the USPTO Automated Interview Request
(AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Arleen M Vazquez can be reached on 571-272-2619. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the
Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only.
For more information about the PAIR system, see https://ppairmy.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LYNDA DINH/Examiner, Art Unit 2857
/LINA CORDERO/Primary Examiner, Art Unit 2857