Prosecution Insights
Last updated: April 19, 2026
Application No. 18/060,501

CONFIGURABLE DIGITAL BLOCK FOR INFRARED SENSORS

Final Rejection: §101, §103, §112
Filed: Nov 30, 2022
Examiner: TIMILSINA, SHARAD
Art Unit: 2857
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: STMicroelectronics
OA Round: 2 (Final)
Grant Probability: 79% (Favorable)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 2y 9m
Grant Probability With Interview: 94%

Examiner Intelligence

Career Allow Rate: 79% (above average; 112 granted / 141 resolved; +11.4% vs TC avg)
Interview Lift: +14.6% on resolved cases with interview (moderate, roughly +15%)
Avg Prosecution: 2y 9m (typical timeline)
Total Applications: 185 across all art units (44 currently pending)

Statute-Specific Performance

§101: 23.2% (-16.8% vs TC avg)
§103: 42.4% (+2.4% vs TC avg)
§102: 11.3% (-28.7% vs TC avg)
§112: 18.0% (-22.0% vs TC avg)
Tech Center averages are estimates. Based on career data from 141 resolved cases.

Office Action

Rejections: §101, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments/Amendment

Amendment and arguments filed on 08/13/2025 are considered.

Amendment: Claims 1-9, 12-16, and 18-20 are amended. Claims 10 and 11 are cancelled.

Claim rejection under 35 U.S.C. 112: Applicant amended the rejected claims to overcome the rejection. Therefore, the rejection is withdrawn. Examiner noted a new clarity issue in this response; please see the claim rejections under 35 U.S.C. 112 section below.

Claim rejection under 35 U.S.C. 101: Applicant argues "Claim 1 does not merely recite a desired result, but instead claims an improved sensor device with a multi-circuit sensor device in the technical field sensor devices. The other independent claims recite similar features. For at least these reasons, Applicant respectfully submits that claims 1-20 are directed to patent eligible subject matter."

Examiner respectfully disagrees, because the sensor device with a multi-circuit device (first, second, and third circuits) purely represents a mathematical concept performing desired results (i.e., mathematical calculations or mental processes). Applicant recites the sensor device with multiple circuits in the independent claims as if these circuits are real or physical circuits. The first circuit (i.e., feature generator), second circuit (i.e., neural network), and third circuit (i.e., finite state machine) recited in independent claims 1, 14, and 18 and dependent claims 2, 15, and 19 are mathematical models for computing mathematical relationships widely used in the field of art, and are therefore considered abstract. Even if these circuits were real physical circuits, examiner considers the amended limitations with the first, second, and third circuits to be using, or used on, a generic computer or computer part to perform the abstract ideas, i.e., mathematical or mental steps.
MPEP 2106.04 III C and D also suggest that claims can recite a mental process even if they are claimed as being performed on a computer. The claimed sensor device includes multiple mathematical calculators or models (which the applicant has called circuits), does nothing more than high-level mathematical calculations or mental processes, and does not provide an improved sensor in the field of art in view of the prior art. Therefore, the amended independent and dependent claims are not patent eligible, as the amendments direct the claims further towards abstract ideas.

Claim rejection under 35 U.S.C. 103: Regarding the amended independent claims, the combination of the following prior art is applied in their rejections: Vigren et al. (US 20210103781 A1), Chowdhary et al. (US 20210272025 A1), Zhou et al. (CN 114818788 A), and Lin et al. (US 20120191967 A1).

Claim Objections

Claim 1 is objected to because of the following informality: Claim 1 recites "an analog to digital converter coupled so the sensor…" This can be corrected by replacing "so" with "to". Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-9 and 12-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Independent claims 1, 14, and 18 recite "a first memory stores a configuration data." Referring to the specification, examiner noted that a first memory block 136 is configured to store feature generation data in paragraphs [0072] and [0073]. Therefore, the claimed first memory storing configuration data is unclear in view of the specification.

The feature generator 122 includes a memory 134. The memory 134 includes a first memory block 136 and a second memory block 138. The feature generator does not include a third memory. It appears the first and second memory blocks in the specification are referred to as the second and third memories, respectively, in the language of claims 4 and 5. Applicant is requested to particularly point out and distinctly claim the subject matter as in the specification for the purpose of clarity.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, natural phenomenon, or an abstract idea) without significantly more.
Claim 1 recites:

A sensor device, comprising:
a sensor configured to generate sensor signals;
an analog to digital converter coupled so the sensor and configured to receive the sensor signals and to generate sensor data;
a digital signal processor coupled to the analog to digital converter and configured to receive the sensor data;
a configurable digital analysis block coupled to the digital signal processor and configured to receive the sensor data from the digital processor; the configurable digital analysis block including:
a first memory configured to store configuration data;
a first circuit including a filter, one or more summers, a gain block, and a second memory and configured to generate feature data from the sensor data;
a second circuit coupled to the first circuit and configured to generate first classification data based on the sensor data;
a third circuit coupled to the first circuit and the second circuit and including a finite state machine including a state memory configured to store logic conditions and a command memory configured to store arc commands and state commands, the third circuit configured to generate second classification data based on the sensor data,
wherein the configurable data analysis block is configured and to generate final classification data based on the sensor data by selectively including or excluding, based on the configuration data, each of the first, second, and third circuits, from participating in generating the final classification.

The claim limitations in the abstract idea have been highlighted in bold above.

Under step 1 of the eligibility analysis, it is determined whether the claims are drawn to a statutory category by considering whether the claimed subject matter falls within the four statutory categories of patentable subject matter identified by 35 U.S.C. 101: process, machine, manufacture, or composition of matter. The above claim is considered to be in the statutory category of a machine.
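For a concrete reading of the wherein clause, the claimed selective participation of the three circuits can be sketched in Python. This is purely illustrative of the claim language, not the applicant's implementation; the function names, configuration flags, and toy stand-in circuits are all invented for the example.

```python
# Hypothetical sketch of the claimed configurable digital analysis block:
# configuration data selectively includes or excludes each of the three
# circuits from participating in generating the final classification.

def analysis_block(sensor_data, config):
    """config is a dict of flags standing in for the claimed
    'configuration data' stored in the first memory."""
    feature_data = None
    first_class = None

    if config.get("use_feature_generator"):      # first circuit
        feature_data = feature_generator(sensor_data)

    if config.get("use_neural_network"):         # second circuit
        first_class = neural_network(feature_data or sensor_data)

    if config.get("use_fsm"):                    # third circuit
        # the FSM consumes whatever upstream outputs participate
        return fsm_step(sensor_data, feature_data, first_class)

    # with the third circuit excluded, the final classification is
    # whatever the included circuits produced
    return first_class if first_class is not None else feature_data

# toy stand-ins so the sketch runs (invented for the example)
def feature_generator(x):  return [sum(x) / len(x)]        # e.g. a mean feature
def neural_network(x):     return "moving" if x[0] > 0.5 else "still"
def fsm_step(s, f, c):     return c or "idle"

print(analysis_block([0.9, 0.8], {"use_feature_generator": True,
                                  "use_neural_network": True,
                                  "use_fsm": True}))        # prints "moving"
```

The point of the sketch is the gating: each circuit's participation is decided by the stored configuration data, not hard-wired into the data path.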
Under step 2A, prong one, it is considered whether the claim recites a judicial exception (abstract idea). In the above claim, the highlighted portion constitutes an abstract idea because, under a broadest reasonable interpretation, it recites limitations that fall into an abstract idea exception. Specifically, under the 2019 Revised Patent Subject Matter Eligibility Guidance, it falls into the groupings of subject matter that cover mathematical concepts (mathematical relationships, mathematical formulas or equations, mathematical calculations) and mental processes, i.e., concepts performed in the human mind including an observation, evaluation, judgment, and/or opinion.

For example: a sensor configured to generate sensor signals (considered a mathematical step); an analog to digital converter coupled so the sensor and configured to receive the sensor signals and to generate sensor data (considered a mathematical step); a first memory configured to store configuration data (considered a mental step); a first circuit including a filter, one or more summers, a gain block, and a second memory and configured to generate feature data from the sensor data (considered a mathematical step); a second circuit coupled to the first circuit and configured to generate first classification data based on the sensor data (considered a mathematical step); and a third circuit coupled to the first circuit and the second circuit and including a finite state machine including a state memory configured to store logic conditions and a command memory configured to store arc commands and state commands, the third circuit configured to generate second classification data based on the sensor data, wherein the configurable data analysis block is configured and to generate final classification data based on the sensor data by selectively including or excluding, based on the configuration data, each of the first, second, and third circuits, from participating in generating the final classification (considered a mathematical step).

These mathematical and mental steps represent limitations that, under their broadest reasonable interpretation, cover performance in the mind; that is, nothing in the claim elements precludes the steps from practically being performed in the mind. Similar limitations comprise the abstract ideas of independent claims 14 and 18.

Next, under step 2A, prong two, it is considered whether the claim that recites a judicial exception integrates the exception into a practical application. In this step, it is evaluated whether the claim recites meaningful additional elements that integrate the exception into a practical application of that exception. In claim 1, the additional elements/steps are: sensor, analog to digital converter, digital signal processor, configurable digital analysis block, memory, first circuit, a filter, summers, a gain block, second circuit, and third circuit. The above additional elements/steps (hardware or software components) are recited in generality and represent extra-solution activity to the judicial exception. The additional elements/steps "an analog to digital converter coupled and a configurable digital analysis block configured to receive the sensor data", "a configurable digital analysis block coupled to the digital signal processor and configured to receive…", "a second circuit coupled to the first circuit", and "a third circuit coupled to the first circuit and the second circuit" are also recited in generality and appear to merely gather or transfer data without performing any kind of inventive step that would provide a meaningful additional element. This also represents extra-solution activity to the judicial exception; all uses of the judicial exception require it. In claim 14, the additional elements/steps recited are similar to those of claim 1.
The additional elements/steps (program/software method) are recited in generality and represent extra-solution activity to the judicial exception. The additional elements/steps "receiving the sensor data… and receiving the digital sensor data…", "a second circuit coupled to the first circuit", and "a third circuit coupled to the first circuit and the second circuit" are also recited in generality and appear to merely gather and transfer data without performing any kind of inventive step that would provide a meaningful additional element. This also represents extra-solution activity to the judicial exception; all uses of the judicial exception require it.

In claim 18, the additional elements are: infrared sensor, analog to digital converter, digital signal processor, configurable digital analysis block, memory, first circuit, a filter, summers, a gain block, second circuit, and third circuit. These additional elements/steps are recited in generality and represent extra-solution activity to the judicial exception. The additional elements/steps "generating with a sensor…", "receiving the sensor data…", and "receiving the feature data…" are also recited in generality and appear to merely gather and transfer data without performing any kind of inventive step that would provide a meaningful additional element. This also represents extra-solution activity to the judicial exception; all uses of the judicial exception require it.

In conclusion, the above additional elements, considered individually and in combination with the other claim elements, do not reflect an improvement to other technology or a technical field and, therefore, do not integrate the judicial exception into a practical application. Therefore, the claims are directed to a judicial exception and require further analysis under step 2B.
Considering the claim as a whole, one of ordinary skill in the art would not know the practical application of the present invention, since the claims do not apply or use the judicial exception in some meaningful way. The independent claims, therefore, are not patent eligible.

With regard to the dependent claims, claims 2-9, 12, 13, 15-17, and 19-20 comprise analogous subject matter and additional features/steps that are part of an expanded abstract idea of independent claims 1, 14, and 18 (additionally comprising mathematical relationship/mental process steps). The dependent claims are therefore not eligible, lacking additional elements that reflect a practical application or qualify as significantly more, for substantially the same reasons discussed with regard to the independent claims.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-6, 14, 15, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Vigren et al. (US 20210103781 A1), hereinafter "Vigren", in view of Chowdhary et al. (US 20210272025 A1), hereinafter "Chowdhary", Zhou et al. (CN 114818788 A), hereinafter "Zhou", and Lin et al. (US 20120191967 A1), hereinafter "Lin".
Regarding claim 1, Vigren teaches a sensor device, comprising:

a sensor configured to generate sensor signals (para [0022]: "Apparatus 100 comprises image sensor 10, radar sensor 20, and processing unit 40…. Similarly, radar sensor 20 is configured to capture radar data 25 for the same object 90 and provide radar data 25 to processing unit 40");

an analog to digital converter coupled so the sensor and configured to receive the sensor signals and to generate sensor data (para [0025]: "The processing unit may include one or more communication interfaces, …as well as one or more data acquisition devices, such as an A/D converter"). Examiner views the A/D converter as converting the sensor's analog signal to digital in the processing unit.

a digital signal processor coupled to the analog to digital converter and configured to receive the sensor data (para [0025]: "The processing unit may include one or more processing units, e.g. a CPU ('Central Processing Unit'), a GPU ('Graphics Processing Unit'), an AI accelerator chip, a DSP ('Digital Signal Processor')"). Examiner views the digital signal processor as processing or digitizing the sensor's signal from the A/D converter in the processing unit.

a configurable digital analysis block coupled to the digital signal processor and configured to receive the sensor data from the digital processor (from Fig. 1 and paragraph [0053], examiner views the object classification apparatus 100 or classifier as the configurable digital analysis block; apparatus 100 includes a digital signal processor in the processing unit 40, and examiner views the apparatus 100 as connected to the digital signal processor);

the configurable digital analysis block including: a first memory configured to store configuration data (para [0025]: "The processing unit may further include a system memory and a system bus that couples various system components including the system memory to the processing unit."). Paragraph [0074] of the present application suggests sensor data as the feature data; examiner views the processing unit, which has memory, as storing the sensor data or feature data.

Vigren does not clearly teach a first circuit including a filter, one or more summers, a gain block, and a second memory and configured to generate feature data from the sensor data; a second circuit coupled to the first circuit and configured to generate first classification data based on the sensor data; a third circuit coupled to the first circuit and the second circuit and including a finite state machine including a state memory configured to store logic conditions and a command memory configured to store arc commands and state commands, the third circuit configured to generate second classification data based on the sensor data, wherein the configurable data analysis block is configured and to generate final classification data based on the sensor data by selectively including or excluding, based on the configuration data, each of the first, second, and third circuits, from participating in generating the final classification.

Chowdhary teaches a first circuit including a filter, one or more summers, a gain block, and a second memory and configured to generate feature data from the sensor data (paragraph [0020]; in Fig. 2, examiner views unit 104 as generating feature data from the sensors 170 and 172; within unit 104, Fig. 2 includes a filter 176, an arithmetic logic unit (ALU) 178 (i.e., summer), an amplifier 174 (gain block), and feature registers 180 as memory to store feature values). The present invention groups the filter, summer, amplifier, and memory into a circuit called a first circuit.

Chowdhary also teaches a second circuit coupled to the first circuit and configured to generate first classification data based on the sensor data ([0022]: "The sensor unit 104 also includes a classifier 112. The classifier 112 receives the feature data for the various computed features. The classifier 112 generates classification data that classifies a context of the electronic device 102 based on the feature data." In Fig. 2, examiner views classifier 112 as the second circuit, which receives feature data from feature register 180 that includes sensor data from sensors 170 and 172, and which generates a first classification.)

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing of the invention to have incorporated Chowdhary into Vigren for the purpose of having a summer, filter, amplifier, and memory in a circuit so that the feature data (values) can be calculated and stored in the memory for retrieval, and so that a first classification can be generated.

The combination of Vigren and Chowdhary does not teach a third circuit coupled to the first circuit and the second circuit and including a finite state machine including a state memory configured to store logic conditions and a command memory configured to store arc commands and state commands, the third circuit configured to generate second classification data based on the sensor data, wherein the configurable data analysis block is configured and to generate final classification data based on the sensor data by selectively including or excluding, based on the configuration data, each of the first, second, and third circuits, from participating in generating the final classification.
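As an illustration only of the kind of filter/summer/gain-block chain the examiner maps to Chowdhary's unit 104, a minimal sketch follows. The gain value, filter taps, and register name are assumed for the example and come from neither reference.

```python
# Hypothetical sketch of a feature-generation chain of the kind mapped to
# Chowdhary's unit 104: amplifier (gain block) -> filter -> summer (ALU),
# with the result latched into a feature register (second memory).

GAIN = 2.0                      # gain block coefficient (assumed)
FIR_TAPS = [0.25, 0.5, 0.25]    # simple low-pass filter taps (assumed)

def fir_filter(samples, taps):
    # convolve, keeping only fully overlapped outputs
    n = len(taps)
    return [sum(samples[i + j] * taps[j] for j in range(n))
            for i in range(len(samples) - n + 1)]

def generate_feature(sensor_samples):
    amplified = [GAIN * s for s in sensor_samples]   # gain block
    filtered = fir_filter(amplified, FIR_TAPS)       # filter
    feature_value = sum(filtered)                    # summer / ALU
    return feature_value

feature_registers = {}                               # second memory
feature_registers["energy"] = generate_feature([0.0, 1.0, 1.0, 0.0])
```

The sketch only shows the dataflow roles (amplify, filter, accumulate, store); real hardware would implement each stage as a fixed-point circuit rather than Python arithmetic.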
Zhou teaches a third circuit coupled to the first circuit and the second circuit, the third circuit configured to generate second classification data based on the sensor data, wherein the configurable data analysis block is configured and to generate final classification data based on the sensor data by selectively including or excluding, based on the configuration data, each of the first, second, and third circuits, from participating in generating the final classification (page 12, line 24: "after obtaining the output of the neural network model, inputting the obtained classification prediction result and prediction probability into the finite state machine, the finite state opportunity to obtain the tracking target final state according to the self structure after logically judging: the final state of the tracking target is a transformation relationship between the state and the predicted probability value is greater than the preset probability minimum threshold value state, or is not changed state…." Page 13, line 12: "…finite state machine finishes state migration according to the output and self-structure by means of logic judgment, obtaining the final state of the target, the real-time state of the user can be identified, comprising walking, running, sitting, lying, jumping, station and falling." Page 13, line 18: "when the obtained measured personnel data reaches the frame number n set by the time window size, inputting the data processed by the tested personnel into the neural network model for action classification, after classifying by the neural network model, obtaining the prediction result of action classification and prediction probability thereof, taking the output of the neural network the finite state machine state machine checks all the outgoing side of the node corresponding to the current state, using the output of the neural network model and the condition defined on the out side to match, if the matching is successful, performing the state transition along the edge, if all sides are matched and mismatched, spin, The state machine outputs the final state of the target").

Herein, the final output, state, or classification after choosing or selecting the finite state machine (third circuit) is to selectively receive one or more of the sensor data and the feature data (i.e., measured data from the first circuit) and classification data (from the second circuit or neural network) to generate a second and/or final classification data (i.e., identifications of walking, running, sitting, lying, jumping, standing, and falling).

Examiner views the combination of the first circuit, second circuit, and third circuit together as a configurable digital analysis block; see Figs. 1 and 3 of the instant application, where all the circuits are within 106. In the configurable digital analysis block, when the finite state machine (third circuit) is selected (or configured or signaled, i.e., configuration data), it uses feature data from the first circuit (i.e., feature generator), the second circuit (i.e., neural network), and the third circuit (i.e., finite state machine) to generate a final classification. Therefore, Zhou provides all the steps required by the configurable digital analysis block to provide a final classification, as discussed above.
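The Zhou flow quoted above (the neural network's predicted label and probability are fed to a finite state machine, which transitions along a matching outgoing edge only when the probability exceeds a preset minimum threshold, and otherwise "spins" in its current state) can be sketched as follows. The states, labels, and threshold value are invented for the example.

```python
# Hypothetical sketch of the NN-to-FSM flow described in Zhou: transition
# only when the prediction probability exceeds a preset minimum threshold
# AND the predicted label matches an outgoing edge of the current state.

MIN_PROB = 0.6  # preset probability minimum threshold (assumed)

# outgoing edges per state: {current_state: {predicted_label: next_state}}
EDGES = {
    "standing": {"walking": "walking", "falling": "fallen"},
    "walking":  {"running": "running", "standing": "standing"},
}

def fsm_update(state, predicted_label, probability):
    edges = EDGES.get(state, {})
    if probability > MIN_PROB and predicted_label in edges:
        return edges[predicted_label]   # state transition along the edge
    return state                        # no match or low confidence: hold

state = "standing"
state = fsm_update(state, "walking", 0.9)   # transitions to "walking"
state = fsm_update(state, "running", 0.4)   # below threshold: stays put
```

The design choice the quoted passage reflects is that the FSM acts as a temporal filter on the classifier: low-confidence or structurally impossible predictions cannot change the reported state.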
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing of the invention to have incorporated Zhou into Vigren for the purpose of obtaining sensor data from a sensor, feature data from a feature generator, and classification data from a neural network, and using the classification data in a finite state machine so that a second or final classification or state of a monitored person can be accurately determined.

The combination of Vigren, Chowdhary, and Zhou does not clearly teach a finite state machine including a state memory configured to store logic conditions and a command memory configured to store arc commands and state commands.

Lin teaches a finite state machine including a state memory configured to store logic conditions and a command memory configured to store arc commands and state commands (para [0093]: "FIG. 15A illustrates exemplary states of a reconfigurable finite state machine (FSM). As shown in FIG. 15A, when the current state of the FSM is state A (1501), if the input is i(0) (1507), then the FSM moves to state B (1502). When the current state of the FSM is state B (1502), if the input is j(0) (1508), then the FSM moves to state A (1501)…." [0094]: "FIG. 15B illustrates an exemplary configuration for a reconfigurable FSM. As shown in FIG. 15B, the reconfigurable FSM includes a random access memory 1515, a current state register 1521, and a reconfigurable multiplexer and reconfigurable random logic 1519. Each memory line of random access memory 1515 may include an input value 1516 of an input table, a state change value 1517 of a state change table, and an output control value 1518 of an output control table."). In the above paragraphs and figures, examiner views the finite state machine as including a random-access memory (RAM); the RAM stores logic conditions, control output values (i.e., command values that include arc commands), and state change values (i.e., state commands).
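Lin's Fig. 15B arrangement, as the examiner reads it, can be sketched as a table-driven FSM in which each memory line pairs an input value (logic condition) with a state-change value (state command) and an output-control value (command). The states, inputs, and output encodings below are invented for the example.

```python
# Hypothetical sketch of a RAM-backed reconfigurable FSM per the examiner's
# reading of Lin Fig. 15B: each memory line holds a logic condition
# (current state + input) together with a state-change value and an
# output-control value. Reconfiguring the FSM means rewriting the table.

# one memory line per (current_state, input): (next_state, output_control)
RAM = {
    ("A", "i0"): ("B", "out_1"),   # arc A -> B taken on input i0
    ("B", "j0"): ("A", "out_0"),   # arc B -> A taken on input j0
}

def fsm_cycle(current_state, input_value):
    line = RAM.get((current_state, input_value))
    if line is None:
        return current_state, None      # no stored arc: hold state
    next_state, output_control = line
    return next_state, output_control
```

Storing the arcs in memory rather than in fixed logic is what makes the machine reconfigurable; whether the table lives in one RAM or in separate state and command memories is the design detail the rejection turns on.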
For arc commands, examiner views the arc command as representing a transition from one state to another when a condition is met. Paragraph [0093] above provides an example: "As shown in FIG. 15A, when the current state of the FSM is state A (1501), if the input is i(0) (1507), then the FSM moves to state B (1502). When the current state of the FSM is state B (1502), if the input is j(0) (1508), then the FSM moves to state A (1501)…." Lin suggests having a memory to store the logic condition, the command or control with arc representation, and the state command. One of ordinary skill would have been motivated to use separate memories for the logic conditions and the commands so as to meet the specific requirements of the invention.

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing of the invention to have incorporated Lin into Vigren for the purpose of using memory for storing the commands and states of a finite state machine so that the respective commands and states can be appropriately stored.

Claims 14 and 18 are rejected as claim 1 above, having the same claim elements/limitations.

Regarding claim 2, the combination of Vigren, Chowdhary, Zhou and Lin teaches the sensor device of claim 1. Chowdhary teaches wherein the first circuit is a feature generator configured, when selected for participation in generating the final classification (para [0025]: "The classifier 112 includes one or more classification algorithms that expect as input the selected features and that classify the context of the personal electronic device into one of a fixed number of possible classifications."), to receive the sensor data, to generate the feature data including a plurality of features from the sensor data, and to output the feature data as a feature vector including the plurality of features (para [0044]: "In one embodiment, the feature sets 116 are feature vectors. Each data field of the feature vector corresponds to a particular feature type."
[0168]: "The device includes a feature computation module configured to generate feature data from the raw sensor data."). In the above, the prior art teaches or suggests the steps of the feature generator (which applicant calls a first circuit).

Chowdhary also teaches that the second circuit includes a neural network configured, when selected for participation in generating the final classification, to selectively receive either the sensor data or the feature vector, and to selectively generate the first classification (para [0046]: "During the machine learning process, the feature sets are provided to the model." Para [0047]: "…The feature sets are then passed through the model again and classified." Para [0122]: "The analysis model can include a neural network or other types of networks for generating new sensor configuration data 106."). Herein, the output of the machine learning or neural network (second circuit) is viewed as generating classification data based on the feature data set, feature vector, or sensor data when the neural network is selected or receives feature data.

Zhou teaches that the third circuit is configured, when selected for participation in generating the final classification, to selectively receive one or more of the sensor data, the feature data, and the first classification data and to generate the final classification data (page 12, line 24: "after obtaining the output of the neural network model, inputting the obtained classification prediction result and prediction probability into the finite state machine, the finite state opportunity to obtain the tracking target final state according to the self structure after logically judging: the final state of the tracking target is a transformation relationship between the state and the predicted probability value is greater than the preset probability minimum threshold value state, or is not changed state…." Page 13, line 12: "…finite state machine finishes state migration according to the output and self-structure by means of logic judgment, obtaining the final state of the target, the real-time state of the user can be identified, comprising walking, running, sitting, lying, jumping, station and falling." Page 13, line 18: "when the obtained measured personnel data reaches the frame number n set by the time window size, inputting the data processed by the tested personnel into the neural network model for action classification, after classifying by the neural network model, obtaining the prediction result of action classification and prediction probability thereof, taking the output of the neural network the finite state machine state machine checks all the outgoing side of the node corresponding to the current state, using the output of the neural network model and the condition defined on the out side to match, if the matching is successful, performing the state transition along the edge, if all sides are matched and mismatched, spin, The state machine outputs the final state of the target").

Herein, the final output or state after choosing or selecting the finite state machine (third circuit) is to selectively receive one or more of the sensor data and the feature data (i.e., measured data) and classification data to generate a final classification data (i.e., identifications of walking, running, sitting, lying, jumping, standing, and falling).

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing of the invention to have incorporated Chowdhary and Zhou into Vigren for the purpose of obtaining classification data from a neural network and using the classification data in a finite state machine to determine a final classification or state of a monitored person.
Regarding claim 3, the combination of Vigren, Chowdhary, Zhou and Lin teach the sensor device of claim 2. Chowdhary further teaches wherein the second memory is configured to store, for each of the plurality of features, data for generating the feature (para [0112] The feature calculation stage 1002 updates the staging buffer with the new feature data.). Fig. 2 in Chowdhary corresponds to Fig. 5 in the present application. Examiner views these feature data as stored in a memory or buffer (i.e., viewed as the second memory) of sensor configuration system 106, which includes feature calculation 1002; see Fig. 10. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated Chowdhary into Vigren for the purpose of storing feature data in a memory of a sensor device so that faster calculation and updating of feature data can be performed. Regarding claim 4, the combination of Vigren, Chowdhary, Zhou and Lin teach the sensor device of claim 3. Chowdhary further teaches wherein the feature generator includes a third memory configured to store values of the features (Fig. 12 and 5, para [0101] The feature registers 180 store the feature values in feature sets or feature vectors. [0151] The feature computation module 1202 may include a staging buffer 1204. The staging buffer 1204 stores the feature data generated by the feature computation module 1202). Herein, examiner views the feature computation module as including a memory (i.e., viewed as the third memory), i.e., buffer 1204, for storing feature values or feature data of sensor configuration system 106. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated Chowdhary into Vigren for the purpose of storing feature data in one or more separate memories of a sensor device so that faster calculation and updating of feature data (values) can be performed. 
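The storage arrangement the rejection maps onto Chowdhary's staging buffer (computed feature values are written into a buffer, viewed as the third memory, from which feature sets are assembled for the classifier) can be sketched as follows. The class, feature names, and formulas here are illustrative assumptions, not Chowdhary's implementation.

```python
# Sketch of the staging-buffer arrangement cited from Chowdhary
# (para [0151]): feature values computed from raw sensor samples are
# staged in a buffer (the "third memory") so the classifier can read
# them as a feature set without recomputation. Names are hypothetical.

class FeatureComputationModule:
    def __init__(self):
        self.staging_buffer = {}  # holds the latest value of each feature

    def update(self, samples):
        """Compute features from raw sensor samples and stage the results."""
        self.staging_buffer["mean"] = sum(samples) / len(samples)
        self.staging_buffer["peak"] = max(samples)
        self.staging_buffer["range"] = max(samples) - min(samples)

    def feature_vector(self, names):
        """Assemble a feature set, in register order, for the classifier."""
        return [self.staging_buffer[n] for n in names]

module = FeatureComputationModule()
module.update([1.0, 4.0, 2.0, 5.0])
print(module.feature_vector(["mean", "peak", "range"]))  # [3.0, 5.0, 4.0]
```

Keeping computed values staged this way is what supports the "faster calculation and update" rationale: downstream stages read the buffer instead of re-deriving each feature from the raw samples.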
Regarding claim 5, the combination of Vigren, Chowdhary, Zhou and Lin teach the sensor device of claim 4. Chowdhary teaches wherein the feature data includes, for at least one of the features, a pointer indicating an address of a feature value in the third memory for computing the at least one of the features (para [0101] The feature registers 180 store the feature values in feature sets or feature vectors. The feature sets can be provided to the classifier 112 so that the classifier 112 can classify a context of the electronic device for each feature set. The feature registers 180 are configurable in accordance with the configuration data 110. The feature registers 180 can be adjusted to store selected features in accordance with the current state of the classifier 112. If the classifier is updated to receive different numbers of features and/or different types of features, then the feature registers 180 are also updated to store the feature values in their proper sets. The feature registers 180 provide feature sets including feature values for each of a plurality of features to the classifier 112 and to the data registers 184). Examiner views the feature data as including, for at least one feature in a feature data set, a register (i.e., viewed as a pointer; a register also indicates an address in memory) that holds a memory address of a feature value in a memory (i.e., the second memory) for calculating or updating at least one feature value. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated Chowdhary into Vigren for the purpose of storing feature data in one or more memories of a sensor device and pointing to or indicating a feature value in the feature data by using a register (or a pointer). Regarding claim 6, the combination of Vigren, Chowdhary, Zhou and Lin teach the sensor device of claim 5. 
Vigren further teaches wherein the second memory is a random access memory (para [0025] The system memory may include computer storage media in the form of volatile and/or non-volatile memory such as read only memory (ROM), random access memory (RAM) and flash memory). The prior art teaches a RAM, which examiner views as the second memory. Claim 15 is rejected as claim 2, having the same claim elements/limitations. Regarding claim 17, the combination of Vigren, Chowdhary, Zhou and Lin teach the method of claim 15. Vigren further teaches training the neural network with a machine learning process to generate the classification (para [0002] As the neural network is trained with more images representing human beings, the neural network becomes more and more accurate with its output of the object classification. para [0004] The object classifier may be any type of neural network, artificial intelligence, or machine learning scheme). Claim 19 is rejected as claim 2, having the same limitations. Regarding claim 20, the combination of Vigren, Chowdhary, Zhou and Lin teach the method of claim 18. Zhou further teaches wherein the second classification data indicates whether or not a person is present (page 12, line 24. after obtaining the output of the neural network model, inputting the obtained classification prediction result and prediction probability into the finite state machine, the finite state opportunity to obtain the tracking target final state according to the self structure after logically judging: the final state of the tracking target is a transformation relationship between the state and the predicted probability value is greater than the preset probability minimum threshold value state, or is not changed state…. 
Page 13 line 12…finite state machine finishes state migration according to the output and self-structure by means of logic judgment, obtaining the final state of the target, the real-time state of the user can be identified, comprising walking, running, sitting, lying, jumping, station and falling.). Herein, the final output or state of the finite state machine is viewed as the second classification based on the input (or first classification) from the neural network. The second classification data indicates if a person is walking, running, sitting, and so on. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated Zhou and Chowdhary into Vigren for the purpose of determining a second classification by inputting feature data from the output of a neural network into a finite state machine so that a state or condition of a monitored person can be accurately determined (i.e., whether a person is present, walking, running, etc.). Claim(s) 7-9 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Vigren, Chowdhary, Zhou and Lin in view of Donnelly (US 20220044109 A1). Regarding claim 7, the combination of Vigren, Chowdhary, Zhou and Lin teach the sensor device of claim 2. The combination does not explicitly teach wherein the neural network is a quantized neural network trained with a machine learning process to generate the first classification data. Donnelly teaches wherein the neural network is a quantized neural network trained with a machine learning process to generate the first classification data (para [0033] The quantized neural network can be configured to perform any machine learning task. 
For example, the quantized neural network can be a feedforward neural network that is configured to process a network input to generate a network output, e.g., a classification output that includes a respective score corresponding to each of multiple categories for the network input.). A quantized neural network trained with machine learning generates a first classification output. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated Donnelly into Vigren, Chowdhary and Zhou for the purpose of using a quantized neural network trained with a machine learning process to perform a classification of data so that more accurate pre-classification data can be generated than with a conventional neural network. Regarding claim 8, the combination of Vigren, Chowdhary, Zhou, Lin and Donnelly teach the sensor device of claim 7. Vigren further teaches wherein the first classification data includes a probability score for each possible class (para [0034] The probability value may be a single probability value indicative of the reliability of the entire classification or one or more probability values indicative of the reliability of each of the image-based object classification 115 variables.). Regarding claim 9, the combination of Vigren, Chowdhary, Zhou, Lin and Donnelly teach the sensor device of claim 8. Vigren further teaches wherein the neural network includes a max operator configured to receive the first classification data and to output, as the classification, the class with the highest probability score (para [0002] As the neural network is trained with more images representing human beings, the neural network becomes more and more accurate with its output of the object classification. Para [0004] The object classifier may be any type of neural network, artificial intelligence, or machine learning scheme. 
para [0052] In this example, for each object, the classifier may be configured to output the object classification with the highest probability value along with the corresponding probability value.). Herein, examiner views the neural network as utilizing input sensor data (i.e., an image) and outputting the classification, or class, with the highest probability value, necessarily found by a max operator. Claim(s) 12-13, 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Vigren, Chowdhary, Zhou, and Lin in view of Clifford et al. (US 20220110546 A1), hereinafter Clifford. Regarding claim 12, the combination of Vigren, Chowdhary, Zhou and Lin teach the sensor device of claim 1. Vigren further discloses a pyroelectric sensor (Fig. 1). Vigren does not explicitly teach wherein the sensor is a passive infrared sensor and the configurable digital analysis block is configured to generate the final classification indicating whether or not a person is in a field of view of the passive infrared sensor. Clifford teaches wherein the sensor is a passive infrared sensor (para [0023] FIG. 3A is a side view of an example of a passive infrared sensor for detecting movement events of a subject according to certain embodiments.) and the configurable digital analysis block is configured to generate the final classification indicating whether or not a person is in a field of view of the passive infrared sensor (para [0016] In addition, the machine-learning-based classifiers can automatically extract various statistical, time-domain, and/or frequency-domain features from the measured movement data, can select features that can provide the best classification sensitivity and specificity, and can generate classification results quickly. para [0059] In the example shown in FIG. 
2A, motion sensor 200 may include a passive infrared sensor that can detect whether a person has moved into or out of a field of view of the passive infrared sensor based on the measurement of infrared light emanating from the person in the field of view.). Examiner views the configurable digital block (i.e., classifier) as generating final classifications, based on a passive infrared sensor measurement, of whether a person is present in the field of view of the passive infrared sensor. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated Clifford into Vigren for the purpose of detecting whether a person is present in a field of view of a passive infrared sensor by implementing a classification technique by machine learning (i.e., a classifier) on the passive infrared sensor data. Using a passive infrared sensor has the advantage of generating a signal indicating movement or presence of a person by using thermal energy emitted by the person. Regarding claim 13, the combination of Vigren, Chowdhary, Zhou and Lin teach the sensor device of claim 1, but do not teach wherein the sensor is a passive infrared sensor and the configurable digital analysis block is configured to generate the final classification indicating whether or not a person has crossed through a field of view of the passive infrared sensor. 
Clifford teaches wherein the sensor is a passive infrared sensor and the configurable digital analysis block is configured to generate the final classification indicating whether or not a person has crossed through a field of view of the passive infrared sensor (para [0016] In addition, the machine-learning-based classifiers can automatically extract various statistical, time-domain, and/or frequency-domain features from the measured movement data, can select features that can provide the best classification sensitivity and specificity, and can generate classification results quickly. para [0059] In the example shown in FIG. 2A, motion sensor 200 may include a passive infrared sensor that can detect whether a person has moved into or out of a field of view of the passive infrared sensor based on the measurement of infrared light emanating from the person in the field of view.). Examiner views the configurable digital block (i.e., classifier) as generating final classifications, based on a passive infrared sensor measurement, of whether a person has moved into or out of (i.e., crossed) a field of view of the passive infrared sensor. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated Clifford into Vigren for the purpose of detecting whether a person has crossed a field of view of a passive infrared sensor by implementing a classification technique by machine learning (i.e., a classifier) on the passive infrared sensor data. Using a passive infrared sensor has the advantage of generating a signal indicating movement or presence of a person by using thermal energy emitted by the person. Regarding claim 16, the combination of Vigren, Chowdhary, Zhou and Lin teach the method of claim 15; however, the combination does not clearly teach wherein the sensor data includes an object temperature and an ambient temperature. 
Clifford teaches wherein generating sensor data includes generating an object temperature (para [0059] At the normal body temperature, a person may radiate most strongly in the infrared band, such as infrared light with wavelengths around 10 μm. A passive infrared sensor may include one or more pyroelectric sensors (e.g., made of a ceramic material) that can generate surface charges when exposed to infrared radiation) and an ambient temperature (para [0060] The differential pair configuration may compensate for offsets caused by environmental temperature changes and other output variations that may be common to the two pyroelectric sensors, and thus may provide a better sensitivity for detecting small changes in the spatial temperature pattern.). Herein, measuring environmental temperature change is viewed as sensing an ambient temperature of the environment. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated Clifford into Vigren, Chowdhary, and Zhou for the purpose of detecting an object temperature and an ambient temperature by using pyroelectric sensors, so that detection can be improved by canceling out offsets or noise caused by environmental/ambient temperature changes. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Matsugu (US 20020038294 A1) discusses recognizing a pattern or detecting a particular object by using a neural network. Valencia (US 20140337862 A1) discusses a method and device for communicating behavioral analysis using machine learning. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). 
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHARAD TIMILSINA whose telephone number is (571)272-7104. The examiner can normally be reached Monday-Friday 9:00-5:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Catherine Rastovski can be reached at 571-270-0349. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. 
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /SHARAD TIMILSINA/Examiner, Art Unit 2863 /Catherine T. Rastovski/Supervisory Primary Examiner, Art Unit 2863

Prosecution Timeline

Nov 30, 2022
Application Filed
May 05, 2025
Non-Final Rejection — §101, §103, §112
Jul 15, 2025
Interview Requested
Jul 29, 2025
Applicant Interview (Telephonic)
Aug 02, 2025
Examiner Interview Summary
Aug 13, 2025
Response Filed
Oct 31, 2025
Final Rejection — §101, §103, §112
Dec 11, 2025
Interview Requested
Dec 30, 2025
Examiner Interview Summary
Dec 30, 2025
Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601252
Predictions of Gas Concentrations In A Subterranean Formation
2y 5m to grant Granted Apr 14, 2026
Patent 12571667
FILL-LEVEL MEASUREMENT DEVICE
2y 5m to grant Granted Mar 10, 2026
Patent 12553704
METHOD AND SYSTEM FOR REAL-TIME MONITORING OF WALL THINNING AND ASCERTAINING OF WALL ATTRIBUTES USING FIBER BRAGG GRATING (FBG) SENSORS
2y 5m to grant Granted Feb 17, 2026
Patent 12531510
LOCATION UPDATE METHOD AND APPARATUS OF PHOTOVOLTAIC STRING
2y 5m to grant Granted Jan 20, 2026
Patent 12498215
CALCULATION METHOD FOR MEASURING FLATNESS OF CROSS-SECTION OF TUNNEL SEGMENT BASED ON SPATIAL POINT-TO-PLANE RELATION
2y 5m to grant Granted Dec 16, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
79%
Grant Probability
94%
With Interview (+14.6%)
2y 9m
Median Time to Grant
Moderate
PTA Risk
Based on 141 resolved cases by this examiner. Grant probability derived from career allow rate.
