Prosecution Insights
Last updated: April 19, 2026
Application No. 17/822,709

AUTOMATIC PROCESSING CHAIN GENERATION

Non-Final OA: §101, §102, §103
Filed: Aug 26, 2022
Examiner: GOLAN, MATTHEW BRYCE
Art Unit: 2123
Tech Center: 2100 — Computer Architecture & Software
Assignee: TDK Corporation
OA Round: 3 (Non-Final)

Grant Probability: 0% (At Risk)
OA Rounds: 3-4
To Grant: 3y 3m
With Interview: 0%

Examiner Intelligence

Grants only 0% of cases
Career Allow Rate: 0% (0 granted / 3 resolved; -55.0% vs TC avg)
Interview Lift: +0.0% (minimal lift; based on resolved cases with interview)
Avg Prosecution: 3y 3m (typical timeline)
Total Applications: 39 (career history across all art units; 36 currently pending)

Statute-Specific Performance

§101: 27.5% (-12.5% vs TC avg)
§103: 37.5% (-2.5% vs TC avg)
§102: 8.3% (-31.7% vs TC avg)
§112: 23.7% (-16.3% vs TC avg)

Black line = Tech Center average estimate • Based on career data from 3 resolved cases
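The per-statute deltas above are mutually consistent: each implies the same Tech Center baseline. A short sketch (hypothetical variable names; assuming the dashboard computes each delta as the examiner's overcome rate minus the TC average) recovers that baseline:

```python
# Hypothetical reconstruction of the "vs TC avg" deltas shown above.
# Assumption: delta = examiner_rate - tc_avg (variable names are illustrative).

examiner_rates = {"101": 27.5, "103": 37.5, "102": 8.3, "112": 23.7}
deltas = {"101": -12.5, "103": -2.5, "102": -31.7, "112": -16.3}

# Implied Tech Center average for each statute.
tc_avgs = {s: round(examiner_rates[s] - deltas[s], 1) for s in examiner_rates}
print(tc_avgs)  # every statute implies the same 40.0% TC baseline

# Same arithmetic for the career allow rate: 0.0% with a -55.0% delta
# implies a TC average allowance rate of 55.0%.
tc_allow_avg = round(0.0 - (-55.0), 1)
print(tc_allow_avg)
```

If the tool instead computed the deltas the other way around (TC average minus examiner rate), the signs would flip; the source does not state which convention it uses.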

Office Action

§101 §102 §103
DETAILED ACTION

This Office Action is in response to a communication filed on February 19, 2026 for Application No. 17/822,709, in which claims 1-2, 4, and 6-17 are presented for examination. The amendments filed on February 19, 2026 have been entered; claims 1, 4, 6, and 7 are amended and claims 3 and 5 are canceled.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/19/2026 has been entered.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-2, 4, and 6-17 are rejected under 35 U.S.C. 101 because the claimed invention is directed to abstract ideas without significantly more.

Regarding Claim 1:

Step 1: Claim 1 is a method claim, i.e., a process, which is a statutory category. Therefore, claims 1-2, 4, and 6-17 are directed to a statutory category of eligible subject matter.

Step 2A Prong 1: If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the "Mental Processes" grouping of abstract ideas. Here, steps of the claimed method are mental processes. Specifically, the claim recites “A method for designing a processing chain . . . 
the method comprising” (mental process – amounts to exercising judgment to form an opinion on process design, which may be aided by pen and paper); “a desired application comprising at least one activity” (mental process – amounts to an opinion); “automatically generating a processing chain . . . for executing the desired application based on the desired application and the plurality of raw sensor data” (mental process – amounts to exercising judgment based on observed information to form an opinion on how to process information, the details of which may be generated automatically based on an individual’s knowledge); “wherein the automatically generating a processing chain of a sensor system for executing the desired application based on the desired application and the plurality of raw sensor data comprises: automatically selecting a sensor of the at least one sensor” (mental process – amounts to exercising judgment to form an opinion on a sensor, with reference to known or observed information and aims, which may happen automatically based on preferences); and “automatically selecting the at least one feature to extract from the sensor” (mental process – amounts to exercising judgment to form an opinion on which feature(s) to extract, which may happen automatically based on knowledge of the task).

Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim recites the additional elements: “of a sensor system . . . for a sensor system to monitor, the sensor system comprising at least one sensor capable of generating sensor data based on sensing the at least one activity . . . of the sensor system . . . deploying the desired application for execution at an electronic device comprising the sensor system” (amounts to mere instructions to apply the judicial exception on generic and unspecialized computer components, which do not impose any meaningful limits on practicing the abstract idea); “receiving a desired application. 
. . . accessing a database” (amounts to insignificant extra-solution activity; merely receiving a desired application and accessing information in a database are incidental to the functioning of the claimed method); and “comprising at least one activity . . . comprising a plurality of raw sensor data and a plurality of annotations, the plurality of annotations identifying activities corresponding to the plurality of raw sensor data . . . the processing chain for processing the sensor data and for extracting at least one feature from the sensor data for use in sensing the at least one activity . . . for use in the sensing of the activity based on the desired application . . . data for use in sensing the at least one activity” (amounts to merely generally linking the use of the judicial exception to a particular technological environment or field of use, which does not impose any meaningful limits on practicing the abstract idea).

Step 2B: The claim does not include additional elements, considered individually and in combination, that are sufficient to amount to significantly more than the judicial exception. “of a sensor system . . . for a sensor system to monitor, the sensor system comprising at least one sensor capable of generating sensor data based on sensing the at least one activity . . . of the sensor system . . . deploying the desired application for execution at an electronic device comprising the sensor system” (mere instructions to apply the exception using generic computer components cannot provide an inventive concept); “receiving a desired application . . . accessing a database” (receiving and transmitting data, such as through a network (see buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014)) or accessing information in memory (see Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 
2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93), is well-understood, routine, and conventional; it is recited here with a high level of generality, and remains insignificant extra-solution activity even upon reconsideration); and “comprising at least one activity . . . comprising a plurality of raw sensor data and a plurality of annotations, the plurality of annotations identifying activities corresponding to the plurality of raw sensor data . . . the processing chain for processing the sensor data and for extracting at least one feature from the sensor data for use in sensing the at least one activity . . . for use in the sensing of the activity based on the desired application . . . data for use in sensing the at least one activity” (merely generally linking the use of the judicial exception to a particular technological environment or field of use does not provide an inventive concept).

For the reasons above, Claim 1 is rejected as being directed to an abstract idea without significantly more. This rejection applies equally to dependent claims 2, 4, and 6-17. The additional limitations of the dependent claims are addressed below.

Regarding Claim 2:

Step 2A Prong 1: See the rejection of Claim 1 above, on which Claim 2 depends.

Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim recites the additional element: “wherein the raw sensor data comprises timestamped data and axial signal values for a plurality of sensors” (amounts to merely generally linking the use of the judicial exception to a particular technological environment or field of use, which does not impose any meaningful limits on practicing the abstract idea).

Step 2B: The claim does not include additional elements, considered individually and in combination, that are sufficient to amount to significantly more than the judicial exception. 
The claim recites the additional element: “wherein the raw sensor data comprises timestamped data and axial signal values for a plurality of sensors” (merely generally linking the use of the judicial exception to a particular technological environment or field of use does not provide an inventive concept).

Accordingly, Claim 2 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 4:

Step 2A Prong 1: See the rejection of Claim 1 above, on which Claim 4 depends. Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “wherein the automatically generating a processing chain of a sensor system for executing the desired application based on the desired application and the plurality of raw sensor data further comprises: automatically selecting at least one axis of the sensor for use in the sensing of the desired application” (mental process – amounts to exercising judgment to form an opinion on which sensor axis to use, which may happen automatically based on knowledge of the task).

Step 2A Prong 2 & Step 2B: There are no elements left for consideration of integration into a practical application or for consideration, individually or in combination, of significantly more.

Accordingly, Claim 4 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 6:

Step 2A Prong 1: See the rejection of Claim 1 above, on which Claim 6 depends. Here, the claim recites additional elements that are mental processes. 
Specifically, the claim recites “wherein the automatically generating a processing chain of a sensor system for executing the desired application based on the desired application and the plurality of raw sensor data comprises: automatically selecting at least one preprocessing operation to apply to the sensor data for extracting the at least one feature” (mental process – amounts to exercising judgment to form an opinion on which preprocessing operation to use for feature extraction, which may happen automatically based on knowledge of the dataset).

Step 2A Prong 2 & Step 2B: There are no elements left for consideration of integration into a practical application or for consideration, individually or in combination, of significantly more.

Accordingly, Claim 6 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 7:

Step 2A Prong 1: See the rejection of Claim 1 above, on which Claim 7 depends. Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “wherein the automatically generating a processing chain of a sensor system for executing the desired application based on the desired application and the plurality of raw sensor data comprises: automatically selecting at least one filter to apply to the sensor data for extracting the at least one feature” (mental process – amounts to exercising judgment to form an opinion on which filter to use for feature extraction, which may happen automatically based on knowledge of the dataset).

Step 2A Prong 2 & Step 2B: There are no elements left for consideration of integration into a practical application or for consideration, individually or in combination, of significantly more.

Accordingly, Claim 7 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 8:

Step 2A Prong 1: See the rejection of Claim 1 above, on which Claim 8 depends. 
Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “wherein the automatically generating a processing chain of a sensor system for executing the desired application based on the desired application and the plurality of raw sensor data comprises: automatically selecting the optimal data segment length and inference length” (mental process – amounts to exercising judgment to form an opinion on which data segment length and inference length to use, which may happen automatically based on knowledge of the dataset and task).

Step 2A Prong 2 & Step 2B: There are no elements left for consideration of integration into a practical application or for consideration, individually or in combination, of significantly more.

Accordingly, Claim 8 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 9:

Step 2A Prong 1: See the rejection of Claim 1 above, on which Claim 9 depends. Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “tracking system requirements of the at least one feature for sensing the at least one activity of the desired application” (mental process – except for the “tracking” itself, which may require a particular technological environment, amounts to repeatedly observing system requirements during sensing an application activity, which may be aided by pen and paper).

Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim recites the additional element: “tracking system requirements” (tracking, while potentially requiring a particular technological environment where system requirements can be observed, amounts to merely generally linking the use of the judicial exception to a particular technological environment or field of use, which does not impose any meaningful limits on practicing the abstract idea). 
Step 2B: The claim does not include additional elements, considered individually and in combination, that are sufficient to amount to significantly more than the judicial exception. The claim recites the additional element: “tracking system requirements” (merely generally linking the use of the judicial exception to an environment and field of use cannot provide an inventive concept).

Accordingly, Claim 9 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 10:

Step 2A Prong 1: See the rejection of Claim 9 above, on which Claim 10 depends. Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “wherein the system requirements comprise memory consumption for the at least one sensor, computation consumption for the at least one sensor, power consumption for the at least one sensor” (mental process – amounts to repeatedly observing specific system requirements during sensing an application activity, which may be aided by pen and paper).

Step 2A Prong 2 & Step 2B: There are no elements left for consideration of integration into a practical application or for consideration, individually or in combination, of significantly more.

Accordingly, Claim 10 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 11:

Step 2A Prong 1: See the rejection of Claim 9 above, on which Claim 11 depends. Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “wherein the sensing the at least one activity utilizes a plurality of features” (mental process – amounts to observing or otherwise experiencing sensory inputs and utilizing multiple input features to form an opinion, such as recognizing an observed object based on size and color). 
Step 2A Prong 2 & Step 2B: There are no elements left for consideration of integration into a practical application or for consideration, individually or in combination, of significantly more.

Accordingly, Claim 11 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 12:

Step 2A Prong 1: See the rejection of Claim 11 above, on which Claim 12 depends. Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “wherein the tracking system requirements of the at least one feature for sensing the at least one activity of the desired application comprises: determining a relative importance of the plurality of features used in sensing the at least one activity of the desired application” (mental process – except for the “tracking” itself, which may require a particular technological environment, amounts to exercising judgment to form an opinion on the importance of data features).

Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim recites the additional element: “tracking system requirements” (amounts to merely generally linking the use of the judicial exception to a particular technological environment or field of use, which does not impose any meaningful limits on practicing the abstract idea).

Step 2B: The claim does not include additional elements, considered individually and in combination, that are sufficient to amount to significantly more than the judicial exception. The claim recites the additional element: “tracking system requirements” (merely generally linking the use of the judicial exception to an environment and field of use cannot provide an inventive concept).

Accordingly, Claim 12 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 13:

Step 2A Prong 1: See the rejection of Claim 12 above, on which Claim 13 depends. 
Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “wherein the tracking system requirements of the at least one feature for sensing the at least one activity of the desired application comprises: aggregating the system requirements for the at least one feature for sensing the at least one activity of the desired application” (mental process – except for the “tracking” itself, which may require a particular technological environment, amounts to observing system requirements and grouping observed data, which may be aided by pen and paper).

Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim recites the additional element: “tracking system requirements” (amounts to merely generally linking the use of the judicial exception to a particular technological environment or field of use, which does not impose any meaningful limits on practicing the abstract idea).

Step 2B: The claim does not include additional elements, considered individually and in combination, that are sufficient to amount to significantly more than the judicial exception. The claim recites the additional element: “tracking system requirements” (merely generally linking the use of the judicial exception to an environment and field of use cannot provide an inventive concept).

Accordingly, Claim 13 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 14:

Step 2A Prong 1: See the rejection of Claim 11 above, on which Claim 14 depends.

Step 2A Prong 2: This judicial exception is not integrated into a practical application. 
The claim recites the additional elements: “displaying the system requirements” (amounts to presenting information and gathering statistics, which is insignificant extra-solution activity because it is incidental to the claimed subject matter) and “for the plurality of features for sensing the at least one activity of the desired application” (amounts to merely generally linking the use of the judicial exception to a particular technological environment or field of use, which does not impose any meaningful limits on practicing the abstract idea).

Step 2B: The claim does not include additional elements, considered individually and in combination, that are sufficient to amount to significantly more than the judicial exception. The claim recites the additional elements: “displaying the system requirements” (presenting information and gathering statistics is well-understood, routine, and conventional, see OIP Techs., 788 F.3d at 1362-63, 115 USPQ2d at 1092-93; it is recited here with a high level of generality, and remains insignificant extra-solution activity even upon reconsideration) and “displaying the system requirements for the plurality of features for sensing the at least one activity of the desired application” (merely generally linking the use of the judicial exception to an environment and field of use cannot provide an inventive concept).

Accordingly, Claim 14 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 15:

Step 2A Prong 1: See the rejection of Claim 14 above, on which Claim 15 depends. Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “updating the plurality of features to remove the deselected feature from the plurality of features for sensing the activity . . . 
without considering the deselected feature” (mental process – except for the “updating” itself, which may require a particular technological environment, amounts to disregarding an observed data feature, which may be aided by pen and paper).

Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim recites the additional elements: “receiving a selection of a feature of the plurality of features for deselection from sensing the activity” (amounts to insignificant extra-solution activity; merely receiving a selected feature is incidental to the functioning of the claimed method); “updating the plurality of features” (updating features, while potentially requiring a particular technological environment where features can be modified, amounts to merely generally linking the use of the judicial exception to an environment and field of use); “displaying the system requirements” (amounts to presenting information and gathering statistics, which is insignificant extra-solution activity because it is incidental to the claimed subject matter); and “for the plurality of features for sensing the at least one activity of the desired application” (amounts to merely generally linking the use of the judicial exception to a particular technological environment or field of use, which does not impose any meaningful limits on practicing the abstract idea).

Step 2B: The claim does not include additional elements, considered individually and in combination, that are sufficient to amount to significantly more than the judicial exception. The claim recites the additional elements: “receiving a selection of a feature of the plurality of features for deselection from sensing the activity” (receiving and transmitting data, such as through a network (see buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 
2014)), is well-understood, routine, and conventional; it is recited here with a high level of generality, and remains insignificant extra-solution activity even upon reconsideration); “updating the plurality of features” (merely generally linking the use of the judicial exception to an environment and field of use cannot provide an inventive concept); “displaying the system requirements” (presenting information and gathering statistics is well-understood, routine, and conventional, see OIP Techs., 788 F.3d at 1362-63, 115 USPQ2d at 1092-93; it is recited here with a high level of generality, and remains insignificant extra-solution activity even upon reconsideration); and “for the plurality of features for sensing the at least one activity of the desired application” (merely generally linking the use of the judicial exception to an environment and field of use cannot provide an inventive concept).

Accordingly, Claim 15 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 16:

Step 2A Prong 1: See the rejection of Claim 9 above, on which Claim 16 depends. Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “wherein the system requirements comprise memory consumption for the at least one feature, computation consumption for the at least one feature, power consumption for the at least one feature” (mental process – amounts to repeatedly observing specific system requirements of features, which may be aided by pen and paper).

Step 2A Prong 2 & Step 2B: There are no elements left for consideration of integration into a practical application or for consideration, individually or in combination, of significantly more.

Accordingly, Claim 16 is rejected as being directed to an abstract idea without significantly more.

Regarding Claim 17:

Step 2A Prong 1: See the rejection of Claim 11 above, on which Claim 17 depends. 
Here, the claim recites additional elements that are mental processes. Specifically, the claim recites “tracking system requirements of the at least one feature for sensing the at least one activity of the desired application” (mental process – except for the “tracking” itself, which may require a particular technological environment, amounts to repeatedly observing system requirements, which may be aided by pen and paper); “determining a relative importance of the plurality of features used in sensing the at least one activity of the desired application based on the at least one resource limitation” (mental process – amounts to exercising judgment to form an opinion on the relative importance of each feature); and “automatically deselecting at least one feature based on the at least one resource limitation” (mental process – amounts to exercising judgment to form an opinion on at least one feature, which may happen automatically based on preferences).

Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim recites the additional elements: “tracking system requirements” (field of use and technological environment – tracking, while potentially requiring a particular technological environment where system requirements can be observed, amounts to merely generally linking the use of the judicial exception to an environment and field of use) and “receiving a selection of a feature of the plurality of features for deselection from sensing the activity” (amounts to insignificant extra-solution activity; merely receiving a resource limitation is incidental to the functioning of the claimed method).

Step 2B: The claim does not include additional elements, considered individually and in combination, that are sufficient to amount to significantly more than the judicial exception. 
The claim recites the additional elements: “tracking system requirements” (merely generally linking the use of the judicial exception to an environment and field of use cannot provide an inventive concept) and “receiving a selection of a feature of the plurality of features for deselection from sensing the activity” (receiving and transmitting data, such as through a network (see buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014)), is well-understood, routine, and conventional; it is recited here with a high level of generality, and remains insignificant extra-solution activity even upon reconsideration).

Accordingly, Claim 17 is rejected as being directed to an abstract idea without significantly more.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-2, 4, 6-9, 11, 14, and 15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Edge Impulse contributors (hereinafter Edge Impulse) (“Building a Continuous Motion Recognition System with Embedded Machine Learning” [online]; published on Jan. 24, 2020 [retrieved on June 12, 2025]; retrieved from https://www.youtube.com/watch?v=FseGCn-oBA0). 
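For orientation only, the kind of "processing chain" at issue in Claim 1 (raw sensor data, a preprocessing/feature-extraction stage, and features for activity classification) can be sketched as below. This is a minimal illustration with hypothetical names and parameters; it is not code from the application and is not Edge Impulse's implementation.

```python
# Minimal illustrative "processing chain": raw 3-axis accelerometer data
# -> fixed-length windows -> per-axis spectral features, ready for an
# activity classifier. All names and parameters are hypothetical.
import numpy as np

def extract_spectral_features(window: np.ndarray) -> np.ndarray:
    """Per-axis magnitude-spectrum summary features for one data window."""
    feats = []
    for axis in range(window.shape[1]):
        spectrum = np.abs(np.fft.rfft(window[:, axis]))
        feats += [spectrum.mean(), spectrum.max(), float(spectrum.argmax())]
    return np.array(feats)

def processing_chain(raw: np.ndarray, window_len: int = 64) -> np.ndarray:
    """Segment raw (n_samples, n_axes) data and extract features per window."""
    n_windows = len(raw) // window_len
    return np.stack([
        extract_spectral_features(raw[i * window_len:(i + 1) * window_len])
        for i in range(n_windows)
    ])

# 256 samples of synthetic 3-axis data -> 4 windows x (3 axes * 3 features).
rng = np.random.default_rng(0)
features = processing_chain(rng.standard_normal((256, 3)))
print(features.shape)  # (4, 9)
```

The window length and the choice of spectral features stand in for the "data segment length" and "feature" selections discussed in the rejections; a real system would choose them automatically, which is the behavior the claims recite.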
Regarding Claim 1, Edge Impulse teaches:

[IMAGE 1]

a method for designing a processing chain (TS. [0:56-0:57], IMAGE 1, where the on-screen display includes “Creating your first impulse . . . Connect a device and acquire data . . . Design an impulse . . . Deploy”, where the “Creating” process is within the broadest reasonable interpretation of a method for designing a processing chain)

[IMAGE 2]

of a sensor system (TS. [0:57-1:05], IMAGE 2, where the on-screen display includes “Your devices” with a category for “sensors” such as “Built-in accelerometer”), the method comprising:

[IMAGE 3]

receiving a desired application comprising at least one activity for a sensor system to monitor (TS. [0:56-0:57], IMAGE 1, where the on-screen display includes “Design an impulse[:] Teach the model to . . . categorize new data, or to find anomalies in sensor readings”, with options to select “getting started: continuous motion recognition” and “getting started recognizing sounds from audio”, demonstrating the method receives a desired application from the user for a sensor system to monitor; TS. [1:42-2:36], IMAGE 3, where the “sensor” captures the activity of “updown” motion for the application of “continuous motion recognition”), the sensor system comprising at least one sensor capable of generating sensor data based on sensing the at least one activity (TS. [0:57-1:05], IMAGE 2, “I can sample data from the device straight from the UI” where the on-screen display includes “Your devices” with a category for “sensors” such as “Built-in accelerometer”; TS. 
[1:42-2:36], IMAGE 3, where the “sensor” captures the activity of “updown” motion for the application of “continuous motion recognition”);

[IMAGE 4]

accessing a database comprising a plurality of raw sensor data and a plurality of annotations corresponding to the plurality of raw sensor data, the plurality of annotations identifying activities corresponding to the plurality of raw sensor data (TS. [2:27-2:57], IMAGE 4, “we now have collected nine minutes of data actually split between four labels . . . snake . . . updown”, where the on-screen display includes “Collected data”, which is within the broadest reasonable interpretation of a database, comprised of rows of “raw data” captured by the “sensor” and including “label” annotations that identify activities, such as “snake”);

[IMAGE 5]

automatically generating a processing chain of the sensor system for executing the desired application (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes “An impulse takes raw data, uses signal processing to extract features, and then uses the learning block to classify new data”, and selecting “Save Impulse” automatically generates a preliminary processing chain for “classify[ing] new data”, which is the desired application of motion recognition in this instance, and which can be returned to in order to alter the impulse at any stage in the process, as shown by the functionality of the left column toolbar, see IMAGE 2, “Design Impulse”; see also TS. [16:41-18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . 
so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, discussed in detail below, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”) based on the desired application and the plurality of raw sensor data (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes a “raw data” component, “spectral analysis” component and a “save impulse” component; selecting “Save Impulse” automatically generates the preliminary processing chain; based on the raw sensor data, see TS. [1:42-2:36], IMAGE 3, where the “sensor” captures the activity of “updown” motion; and the desired application see TS. [4:56-5:02], “we have a spectral analysis block that is great for analyzing your repetitive motion”; see also TS. [16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, including operations based on the desired application and plurality of raw sensor data, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”), the processing chain for processing the sensor data and for extracting at least one feature from the sensor data for use in sensing the at least one activity (TS. [4:04-4:15], “we create an impulse. An impulse takes raw data and it uses a signal processing block very standard signal processing to extract features”, where, as discussed above, the raw data is sensor data, see TS. [1:42-2:36], IMAGE 3, where the “sensor” captures the activity of “updown” motion), wherein the automatically generating a processing chain of a sensor system for executing the desired application (TS. 
[5:10-5:13], IMAGE 5, where the on-screen display includes “An impulse takes raw data, uses signal processing to extract features, and then uses the learning block to classify new data”, and selecting “Save Impulse” automatically generates a preliminary processing chain for “classify[ing] new data”, which is the desired application of motion recognition in this instance, and which can be returned to in order to alter the impulse at any stage in the process, as shown by the functionality of the left column toolbar, see IMAGE 2, “Design Impulse”; see also TS. [16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, discussed in detail below, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”) based on the desired application and the plurality of raw sensor data (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes a “raw data” component, “spectral analysis” component and a “save impulse” component; selecting “Save Impulse” automatically generates the preliminary processing chain; based on the raw sensor data, see TS. [1:42-2:36], IMAGE 3, where the “sensor” captures the activity of “updown” motion; and the desired application see TS. [4:56-5:02], “we have a spectral analysis block that is great for analyzing your repetitive motion”) comprises (wherein the automatic generating using the “Save impulse”, discussed above see TS. [5:10-5:13], IMAGE 5, comprises the steps of “connect a device and acquire data”, see generally TS. [0:55-4:07] discussed in detail below, because the generated impulse is comprised of the selected components; see also TS. [16:41 – 18:00], “Let’s actually deploy this back to the device . . . 
we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, discussed in detail below, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”): [IMAGE 6] automatically selecting a sensor of the at least one sensor for use in the sensing of the activity based on the desired application (TS. [1:00-1:10], IMAGE 3, where the on-screen display includes a prepopulated value for “sensor” after a device is connected, which is within the broadest reasonable interpretation of automatically selecting, which, as discussed above, is used for sensing the activity based on the desired application, see TS. [0:56-57], IMAGE 1, where the on-screen display includes “Design an impulse[:] Teach the model to . . . categorize new data, or to find anomalies in sensor readings”, with options to select “getting started: continuous motion recognition” and “getting started recognizing sounds from audio”, demonstrating the method receives a desired application from the user for a sensor system to monitor; TS. [1:42-2:36], IMAGE 3, where the “sensor” captures the activity of “updown” motion for the application of “continuous motion recognition”, which is a comprising precondition of generating the finalized processing chain and could, if desired, be used as part of an iterative process for generating the preliminary processing chain see, TS. 
[5:10-5:13], IMAGE 5, where the on-screen display includes “An impulse takes raw data, uses signal processing to extract features, and then uses the learning block to classify new data”, and selecting “Save Impulse” automatically generates a preliminary processing chain for “classify[ing] new data”, which is the desired application of motion recognition in this instance, and which can be returned to in order to alter the impulse at any stage in the process, as shown by the functionality of the left column toolbar, see IMAGE 2, “Design Impulse”; see also TS. [16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, including sensor selection, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”); and automatically selecting the at least one feature (TS. [6:42-6:49], IMAGE 6, where the on-screen display includes a button to “generate features”, without requiring users to select features to generate, demonstrating that features are automatically selected, which is a comprising precondition of generating the finalized processing chain and could, if desired, be used as part of an iterative process for generating the preliminary processing chain see, TS. 
[5:10-5:13], IMAGE 5, where the on-screen display includes “An impulse takes raw data, uses signal processing to extract features, and then uses the learning block to classify new data”, and selecting “Save Impulse” automatically generates a preliminary processing chain for “classify[ing] new data”, which is the desired application of motion recognition in this instance, and which can be returned to in order to alter the impulse at any stage in the process, as shown by the functionality of the left column toolbar, see IMAGE 2, “Design Impulse”; see also TS. [16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, including feature selection, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”) to extract from the sensor data for use in sensing the at least one activity (TS. [6:50-7:19], “we run the feature extraction code . . . a . . . feature explorer which allows you to look at . . . interesting features we extracted”); and deploying the desired application for execution at an electronic device comprising the sensor system (TS. [0:56-0:57], IMAGE 1, where the on-screen display includes “ Design an Impulse . . . [for example] CONTINUOUS MOTION RECOGNITION . . . Package the complete impulse up . . . and deploy it on your device”, where the “device” is the electronic device, which the desired application, such as “CONTINUOUS MOTION RECOGNITION”, is “deploy[ed]” to, and where, as discussed above, the device comprises the sensor system, see TS. [0:56-57], IMAGE 1, where the on-screen display includes “Design an impulse[:] Teach the model to . . . categorize new data, or to find anomalies in sensor readings”; see also TS. 
[16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, discussed in detail above, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”). Regarding Claim 2, Edge Impulse teaches the method of Claim 1, wherein the raw sensor data comprises timestamped data (TS. [2:25-4:05], IMAGE 4, where the on-screen display includes “added” value, such as “Today, 15:12:44”, for each raw sensor datapoint, which is within the broadest reasonable interpretation of timestamped data) and axial signal values (TS. [5:10-5:13], IMAGE 5, where clicking on a raw data sample displays “accX”, “accY”, and “accZ” for the data sample) for a plurality of sensors (TS. [2:25-4:05], IMAGE 3, where the on-screen display includes a “sensor” dropdown selection to allow for raw data to be used for a plurality of sensors). Regarding Claim 4, Edge Impulse teaches the method of Claim 1, wherein the automatically generating a processing chain of a sensor system for executing the desired application (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes “An impulse takes raw data, uses signal processing to extract features, and then uses the learning block to classify new data”, and selecting “Save Impulse” automatically generates a preliminary processing chain for “classify[ing] new data”, which is the desired application of motion recognition in this instance, and which can be returned to in order to alter the impulse at any stage in the process, as shown by the functionality of the left column toolbar, see IMAGE 2, “Design Impulse”; see also TS. [16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . 
and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, discussed in detail below, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”) based on the desired application and the plurality of raw sensor data (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes a “raw data” component, “spectral analysis” component and a “save impulse” component; selecting “Save Impulse” automatically generates the preliminary processing chain; based on the raw sensor data, see TS. [1:42-2:36], IMAGE 3, where the “sensor” captures the activity of “updown” motion; and the desired application see TS. [4:56-5:02], “we have a spectral analysis block that is great for analyzing your repetitive motion”) comprises (wherein the automatic generating using the “Save impulse”, discussed above see TS. [5:10-5:13], IMAGE 5, comprises the steps of “connect a device and acquire data”, see generally TS. [0:55-4:07], where the generated impulse is comprised of the selected components; see also TS. [16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, discussed in detail below, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”): automatically selecting at least one axis of the sensor for use in the sensing of the desired application (TS. [4:07-4:37], IMAGE 5, where the on-screen display is prepopulated with “Axis[:] accX, AccY, accZ”, which is within the broadest reasonable interpretation of automatically selecting). 
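[Editor's note] The automatic axis selection the examiner maps to Claim 4 — the UI prepopulating the input axes (e.g., accX, accY, accZ) from the connected device's collected samples — can be sketched as follows. This is a minimal illustration; the function and sample names are assumptions for this note, not taken from Edge Impulse or the record.

```python
def select_axes(samples):
    """Return the axis names present in every collected sample,
    e.g. ['accX', 'accY', 'accZ'] for a built-in accelerometer."""
    if not samples:
        return []
    common = set(samples[0])
    for sample in samples[1:]:
        common &= set(sample)
    # Stable, sorted ordering so the prepopulated value is deterministic.
    return sorted(common)

# Hypothetical raw samples from a connected device:
samples = [
    {"accX": 0.1, "accY": 0.0, "accZ": 9.8},
    {"accX": 0.2, "accY": -0.1, "accZ": 9.7},
]
print(select_axes(samples))  # → ['accX', 'accY', 'accZ']
```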
Regarding Claim 6, Edge Impulse teaches the method of Claim 1, wherein the automatically generating a processing chain of a sensor system for executing the desired application (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes “An impulse takes raw data, uses signal processing to extract features, and then uses the learning block to classify new data”, and selecting “Save Impulse” automatically generates a preliminary processing chain for “classify[ing] new data”, which is the desired application of motion recognition in this instance, and which can be returned to in order to alter the impulse at any stage in the process, as shown by the functionality of the left column toolbar, see IMAGE 2, “Design Impulse”; see also TS. [16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, discussed in detail below, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”) based on the desired application and the plurality of raw sensor data (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes a “raw data” component, “spectral analysis” component and a “save impulse” component; selecting “Save Impulse” automatically generates the preliminary processing chain; based on the raw sensor data, see TS. [1:42-2:36], IMAGE 3, where the “sensor” captures the activity of “updown” motion; and the desired application see TS. [4:56-5:02], “we have a spectral analysis block that is great for analyzing your repetitive motion”) comprises (wherein selections of preprocessing operations must comprise the generation of the “impulse” for it to “uses signal processing to extract features”, as discussed above, see TS. 
[5:10-5:13], IMAGE 5; see also wherein the automatic generating using the “Save impulse”, discussed above see TS. [5:10-5:13], IMAGE 5, comprises the steps of “connect a device and acquire data”, see generally TS. [0:55-4:07] discussed in detail below, because the generated impulse is comprised of the selected components; see also TS. [16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, discussed in detail below, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”): [IMAGE 7] automatically selecting at least one preprocessing operation to apply to the sensor data for extracting the at least one feature (TS. [4:52-5:08], IMAGE 7, where the on-screen display includes a “recommended” star next to the “spectral analysis” “processing block” for “raw data . . . preprocessing”, which is within the broadest reasonable interpretation of automatically selecting). Regarding Claim 7, Edge Impulse teaches the method of Claim 1, wherein the automatically generating a processing chain of a sensor system for executing the desired application (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes “An impulse takes raw data, uses signal processing to extract features, and then uses the learning block to classify new data”, and selecting “Save Impulse” automatically generates a preliminary processing chain for “classify[ing] new data”, which is the desired application of motion recognition in this instance, and which can be returned to in order to alter the impulse at any stage in the process, as shown by the functionality of the left column toolbar, see IMAGE 2, “Design Impulse”; see also TS. 
[16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, discussed in detail below, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”) based on the desired application and the plurality of raw sensor data (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes a “raw data” component, “spectral analysis” component and a “save impulse” component; selecting “Save Impulse” automatically generates the preliminary processing chain; based on the raw sensor data, see TS. [1:42-2:36], IMAGE 3, where the “sensor” captures the activity of “updown” motion; and the desired application see TS. [4:56-5:02], “we have a spectral analysis block that is great for analyzing your repetitive motion”) comprises (wherein selections of preprocessing, which includes filtering (see TS. [5:15-5:23], “there’s three stages involved here the first one [is] that we apply a filter”), must comprise the generation of the “impulse” for it to “use[] signal processing to extract features”, as discussed above, see TS. [5:10-5:13], IMAGE 5): [IMAGE 8] automatically selecting at least one filter to apply to the sensor data (TS. [5:15-6:41], IMAGE 8, where the on-screen display includes prepopulated “filter” values, which is within the broadest reasonable interpretation of automatically selected) for extracting the at least one feature (TS. [5:52-6:05], IMAGE 8, where the on-screen display updates the values for “processed features” in response to updates to filter parameters, demonstrating the filter is used to extract the features). 
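[Editor's note] The filter-then-extract flow cited for Claims 6-7 — a filter stage applied to the raw signal, with features then computed from the processed signal — can be sketched as follows. The moving-average filter and the two features shown (RMS and dominant frequency) are illustrative assumptions for this note, not the actual parameters of the Edge Impulse spectral-analysis block.

```python
import numpy as np

def extract_spectral_features(signal, fs=62.5, filter_len=5):
    """Apply a simple moving-average (low-pass) filter, then extract
    RMS and the dominant non-DC frequency via an FFT."""
    kernel = np.ones(filter_len) / filter_len
    filtered = np.convolve(signal, kernel, mode="same")
    rms = float(np.sqrt(np.mean(filtered ** 2)))
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    peak = freqs[1 + int(np.argmax(spectrum[1:]))]  # skip the DC bin
    return {"rms": rms, "peak_freq_hz": float(peak)}

t = np.arange(0, 2, 1 / 62.5)          # two seconds of samples
sig = np.sin(2 * np.pi * 3.0 * t)      # a 3 Hz repetitive "updown" motion
print(extract_spectral_features(sig))
```

Changing the filter parameters changes the "processed features" downstream, which mirrors the cited behavior of IMAGE 8.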
Regarding Claim 8, Edge Impulse teaches the method of Claim 1, wherein the automatically generating a processing chain of a sensor system for executing the desired application (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes “An impulse takes raw data, uses signal processing to extract features, and then uses the learning block to classify new data”, and selecting “Save Impulse” automatically generates a preliminary processing chain for “classify[ing] new data”, which is the desired application of motion recognition in this instance, and which can be returned to in order to alter the impulse at any stage in the process, as shown by the functionality of the left column toolbar, see IMAGE 2, “Design Impulse”; see also TS. [16:41 – 18:00], “Let’s actually deploy this back to the device . . . we can take all the code we just generated . . . and generate highly optimized binary code specifically for this board . . . so I can run the impulse”, where, ultimately, after finalizing the impulse with respect to subsequent operations, discussed in detail below, the processing chain is generated in finalized form, as an executable that can then be “deploy[ed]” “back to the device”) based on the desired application and the plurality of raw sensor data (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes a “raw data” component, “spectral analysis” component and a “save impulse” component; selecting “Save Impulse” automatically generates the preliminary processing chain; based on the raw sensor data, see TS. [1:42-2:36], IMAGE 3, where the “sensor” captures the activity of “updown” motion; and the desired application see TS. [4:56-5:02], “we have a spectral analysis block that is great for analyzing your repetitive motion”) comprises (wherein the automatic generating using the “Save impulse”, discussed above see TS. [5:10-5:13], IMAGE 5, comprises the steps of “connect a device and acquire data”, see generally TS. 
[0:55-4:07] discussed in detail below, because the generated impulse is comprised of the selected components): automatically selecting the optimal data segment length (TS. [4:08-4:28], IMAGE 5, “the first thing we do is slice the data up into two second windows”, where the on-screen display prepopulates “window size”, which is within the broadest reasonable interpretation of automatically selecting, and where the “window size” is within the broadest reasonable interpretation of data segment length) and inference length (TS. [1:05-1:40], IMAGE 3, where the on-screen display prepopulates “sample length”, which is within the broadest reasonable interpretation of automatically selecting, and where the “sample length” is within the broadest reasonable interpretation of inference length). Regarding Claim 9, Edge Impulse teaches the method of Claim 1, further comprising: [IMAGE 9] tracking system requirements of the at least one feature for sensing the at least one activity of the desired application (TS. [8:51-9:29], IMAGE 9, where the on-screen display includes “inferencing time” and “peak memory usage” of the system, which is within the broadest reasonable interpretation of tracking system requirements; and where the requirements are for the application of “motion recognition” of activity inferencing, for example “updown”, which as discussed above are based on sensed features). Regarding Claim 11, Edge Impulse teaches the method of Claim 9, wherein the sensing the at least one activity utilizes a plurality of features (TS. [6:48-7:44], IMAGE 6, where the on-screen display includes “5351” “generat[ed] features”, which correspond with windows, and a “Feature explorer” to explore features such as “X Axis”, “Y Axis”, and “Z Axis” and associated sub features for each of the “5,351 samples”). 
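[Editor's note] The windowing the examiner cites for Claim 8 — “slice the data up into two second windows” using a prepopulated window size — can be illustrated with a minimal sketch. The function name and the stride parameter are assumptions for this note; the cited video only shows a window size.

```python
def slice_windows(samples, window_size, stride=None):
    """Slice a sample stream into fixed-length windows (e.g. two-second
    windows at a given sampling rate); stride defaults to non-overlapping."""
    stride = stride or window_size
    windows = []
    start = 0
    while start + window_size <= len(samples):
        windows.append(samples[start:start + window_size])
        start += stride
    return windows

stream = list(range(10))
print(slice_windows(stream, 4, stride=2))
# → [[0, 1, 2, 3], [2, 3, 4, 5], [4, 5, 6, 7], [6, 7, 8, 9]]
```

Each window then becomes one unit of feature extraction and inference, which is why the window size maps to "data segment length" and the sample length bounds the "inference length".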
Regarding Claim 14, Edge Impulse teaches the method of Claim 11, further comprising: displaying the system requirements for the plurality of features for sensing the at least one activity of the desired application (TS. [8:51-9:29], IMAGE 9, where the on-screen display includes “inferencing time” and “peak memory usage” of the system, which is within the broadest reasonable interpretation of tracking system requirements; and where the requirements are for the application of “motion recognition” of activity inferencing, for example “updown”, which as discussed above are based on sensed features). Regarding Claim 15, Edge Impulse teaches the method of Claim 14, further comprising: receiving a selection of a feature of the plurality of features for deselection from sensing the activity (TS. [5:14-5:17], IMAGE 5, where the on-screen display includes a sidebar button “create impulse”, which allows users to return to the generation page and deselect “Input axes accX, accY, accZ” that correspond with features such as “X Axis”, “Y Axis”, and “Z Axis” and associated sub features, discussed above, see TS. [6:48-7:44], IMAGE 6); updating the plurality of features to remove the deselected feature from the plurality of features for sensing the activity (TS. [5:10-5:13], IMAGE 5, where the on-screen display includes “Save Impulse”, which can be clicked again to update the “impulse . . . to extract features” that would no longer include the deselected feature); and displaying the system requirements for the plurality of features for sensing the at least one activity of the desired application without considering the deselected feature (TS. 
[8:51-9:29], IMAGE 9, where the on-screen display includes “inferencing time” and “peak memory usage” of the system, which is within the broadest reasonable interpretation of tracking system requirements; and where the requirements are for the application of motion recognition of activity inferencing, for example “updown”, which as discussed above are based on sensed features – which only includes features not deselected). Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Edge Impulse in view of Lockhart et al. (hereinafter Lockhart) (“Design Considerations for the WISDM Smart Phone-based Sensor Mining Architecture”). Regarding Claim 10, Edge Impulse teaches the method of Claim 9. Edge Impulse does not teach . . . wherein the system requirements comprise memory consumption for the at least one sensor, computation consumption for the at least one sensor, power consumption for the at least one sensor. However, Lockhart teaches . . . wherein the system requirements comprise (Pg. 1, Abstract, “there are obstacles to sensor mining applications, due to the severe resource limitations (e.g., power, memory, bandwidth) faced by mobile devices”; Pg. 8, Col. 1, Para. 3, “5. RESOURCE USAGE . . . in this section . . . 
we present actual performance results associated with our WISDM Data Collector application,”) memory consumption for the at least one sensor (Pg. 8, Col. 2, Para. 6, “The WISDM Data Collector uses 18MB of RAM, of which 12MB is reserved for data and 6MB is shared. Full RAM usage is approximately 3.5% of the HTC EVO's 512 MB RAM total”; where the “WISDM (WIreless Sensor Data Mining)” must contain at least one sensor, see Pg. 1, Abstract), computation consumption for the at least one sensor (Pg. 8, Col. 2, Para. 4, “the WISDM Data Collector's service spends approximately 3.4% of its uptime as the active process, which translates to approximately 2% of the CPU's total potential”; where the “WISDM (WIreless Sensor Data Mining)” must contain at least one sensor, see Pg. 1, Abstract), power consumption for the at least one sensor (Pg. 8, Col. 1, Para. 6, “Our measurements indicate that on the HTC EVO our Data Collector application consumes 35-51 milliwatts per second, and on an idle device with the screen powered off this corresponds to 6.6% of the device's power consumption”; where the “WISDM (WIreless Sensor Data Mining)” must contain at least one sensor, see Pg. 1, Abstract). Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to combine the designing of a processing chain for a sensor system, including tracking system requirements for the use of features to sense activities of an application of Edge Impulse, with the tracking of sensor memory consumption, computational consumption, and power consumption of Lockhart, in order to design systems that can utilize the sensors of devices with limited resource capacities (Lockhart, Pg. 1, Abstract, “Smart phone . . . 
devices provide unprecedented opportunities for sensor mining since they include a large variety of sensors, including an: acceleration sensor (accelerometer), location sensor (GPS), direction sensor (compass), audio sensor (microphone), image sensor (camera), proximity sensor, light sensor, and temperature sensor”; Lockhart, Pg. 1, Col. 2, Para. 1, “The design considerations that we identify will be useful to others who build smart phone-based data mining applications and can also be used to evaluate future smart phone-based sensor mining platforms”). Claims 12-13 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Edge Impulse in view of Chang et al. (hereinafter Chang) (“Dropout Feature Ranking for Deep Learning Models”). Regarding Claim 12, Edge Impulse teaches the method of Claim 11, wherein the tracking system requirements of the at least one feature for sensing the at least one activity of the desired application comprises: . . . sensing the at least one activity of the desired application (TS. [8:51-9:29], IMAGE 9, where the on-screen display includes the “inferencing time” and “peak memory usage” requirements of the system when using the sensed features, “input layer (33 features)”, for the “motion recognition” application of sensing “idle”, “snake”, “updown”, and “wave” activities). Edge Impulse does not teach . . . determining a relative importance of the plurality of features used in . . . (where the tracking of system requirements for sensing the at least one activity of the desired application is taught without reference to the relative importance of the features). However, Chang teaches . . . determining a relative importance of the plurality of features used in (Pg. 2, Col. 1, Para. 7, “To analyze which features are important for a given pre-trained model M to correctly predict its target variable y, we introduce Dropout Feature Ranking (Dropout FR) method”; see also Pg. 7, Table 5, where the “top features” are ranked “1-10”) . . . . 
Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to combine the designing of a processing chain for a sensor system, including tracking system requirements for the use of features to sense activities of an application of Edge Impulse, with the ranking of the features used in the application based on relative importance of Chang, in order to remove resources that are less important for tasks where resources are limited (Chang, Pg. 1, Abstract, “in the resource-constraint setting, it is critical to design tests relying on fewer more informative features leading to high accuracy performance within reasonable budget”). Regarding Claim 13, Edge Impulse in view of Chang teaches the method of Claim 12, wherein the tracking system requirements of the at least one feature for sensing the at least one activity of the desired application (Edge Impulse, TS. [8:51-9:29], IMAGE 9, where the on-screen display shows the tracked “inferencing time” and “peak memory usage” requirements of the system when using the sensed features, “input layer (33 features)” (see IMAGE 10), for the “motion recognition” application of sensing “idle”, “snake”, “updown”, and “wave” activities) comprises: aggregating the system requirements for the at least one feature for sensing the at least one activity of the desired application (Edge Impulse, TS. [8:51-9:29], IMAGE 9, where the on-screen display aggregates “inferencing time” and “peak memory usage” requirements of the system when using the sensed features, “input layer (33 features)” (see IMAGE 10), for the “motion recognition” application of sensing “idle”, “snake”, “updown”, and “wave” activities). 
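[Editor's note] The combination the examiner articulates for Claims 12 and 17 — ranking features by relative importance and deselecting those that do not fit a resource limitation — can be sketched as a simplified greedy selection. This is an illustration of the articulated rationale only; it is not Chang's Dropout Feature Ranking procedure (which learns per-feature dropout rates during training), and all names and cost figures are assumptions.

```python
def deselect_features(importance, cost, budget):
    """Keep the most important features whose cumulative cost fits the
    budget; every other feature is deselected."""
    ranked = sorted(importance, key=importance.get, reverse=True)
    kept, used = [], 0.0
    for name in ranked:
        if used + cost[name] <= budget:
            kept.append(name)
            used += cost[name]
    return kept

importance = {"rms_x": 0.9, "peak_y": 0.7, "mean_z": 0.2}   # hypothetical ranks
cost = {"rms_x": 4.0, "peak_y": 4.0, "mean_z": 4.0}          # e.g. bytes of RAM each
print(deselect_features(importance, cost, budget=8.0))  # keeps the top two
```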
Regarding Claim 17, Edge Impulse in view of Chang teaches [IMAGE 10] the method of Claim 11, wherein the tracking system requirements of the at least one feature for sensing the at least one activity of the desired application (Edge Impulse, TS. [8:51-9:29], IMAGE 9, where the on-screen display includes the “inferencing time” and “peak memory usage” requirements of the system when using the sensed features, “input layer (33 features)” (see IMAGE 10), for the “motion recognition” application of sensing “idle”, “snake”, “updown”, and “wave” activities) comprises: receiving at least one resource limitation (Edge Impulse, TS. [7:54-8:52], IMAGE 10, where the on-screen display includes an input box for “Number of training cycles”, which is within the broadest reasonable interpretation of a resource limitation); determining a relative importance of the plurality of features used (Chang, Pg. 2, Col. 1, Para. 7, “To analyze which features are important for a given pre-trained model M to correctly predict its target variable y, we introduce Dropout Feature Ranking (Dropout FR) method”; see also Chang, Pg. 7, Table 5, where the “top features” are ranked “1-10”) in sensing the at least one activity of the desired application (Edge Impulse, TS. [8:51-9:29], IMAGE 9, where the on-screen display includes requirements of the system when using the sensed features, “input layer (33 features)” (see IMAGE 10), for the “motion recognition” application of sensing “idle”, “snake”, “updown”, and “wave” activities) based on the at least one resource limitation (Edge Impulse, TS. [7:54-8:52], IMAGE 10, where the input for “Number of training cycles” is a basis for the tracked system requirements “inferencing time” and “peak memory usage”); and automatically deselecting at least one feature based on the at least one resource limitation (Edge Impulse, TS. 
[7:54-8:52], IMAGE 10, where a series of steps happens automatically in response to inputting the “Number of training cycles” resource limitation and selecting “Start training”; Chang, Pg. 3, Col. 2, Para. 5, “Result In Table 2 . . . comparing full features (Top 40), only the informative features (Top 20) and the top 5 most informative features (Top 5)”, where the plurality of features below a varying threshold are deselected, see also Chang, Pg. 4, Table 2). The reasons for obviousness have been noted in the rejection of Claim 12 and remain applicable here. Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Edge Impulse in view of Gao et al. (hereinafter Gao) (“End-to-End Evaluation of Federated Learning and Split Learning for Internet of Things”). Regarding Claim 16, Edge Impulse teaches the method of Claim 9, wherein the system requirements comprise memory consumption for the at least one feature (TS. [8:30-9:23], IMAGE 9, where the on-screen display includes “peak memory usage”, corresponding to “Input layer (33 features)”, see IMAGE 10), computation consumption for the at least one feature (TS. [8:30-9:23], IMAGE 10, where the on-screen display includes “Neural network architecture[:] Input layer (33 features)[,] Dense layer (20 neurons)[,] Dense layer (10 neurons)[,] Output layer (4 features)” and “Epoch 1/1”, where tracking the computation architecture (“layers”, “dens[ity]”, and “neurons”) and the number of cycles feature inputs are fed through the architecture is within the broadest reasonable interpretation of computational consumption for features) . . . . Edge Impulse does not teach . . . power consumption for the at least one feature (where the on-screen display does not include power consumption). However, Gao teaches . . . wherein the system requirements comprise . . . power consumption for the at least one feature (Pg. 92, Col. 1, Para. 5, “We take the first step toward fairly training performance comparisons between FL and SplitNN . 
. . We provide detailed performance overhead evaluations of . . . amount of power consumed . . . peak power . . . to serve as a reference for practitioners”). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to combine the designing of a processing chain for a sensor system, including tracking system requirements for the use of features to sense activities of an application, such as memory consumption and computational consumption, of Edge Impulse with the tracking of system power consumption for a machine learning application of Gao in order to determine whether a designed machine learning process could feasibly be implemented on a device with limited access to power (Gao, Pg. 99, Col. 2, Para. 3, “Overall, for the IoT scenario, the FL would be a more practical recommendation because it requires less overall . . . and power consumption overhead when a simple 1D CNN model is used . . . for both FL and SplitNN, the use of more complicated models would still be infeasible to mount training on low-capacity IoT devices such as Raspberry Pi”). Response to Arguments Applicant's arguments filed on February 19, 2026 have been fully considered. Each argument is addressed in detail below. I. Applicant argues the rejections of claims 1-2, 4, and 6-17, under 35 USC § 101, should be withdrawn (Amendment and Response to Final Office Action Accompanying a Request for Continued Examination (RCE), 02/19/2026, Pg. 6-11, Section “Claim Rejections – 35 U.S.C. § 101”). 1. First, Applicant argues the pending claims are not directed to an abstract idea because they do not recite a judicial exception (Step 2A, Prong One). 
Specifically, Applicant argues “automatically generating a processing chain of the sensor system for executing the desired application based on the desired application and the plurality of raw sensor data, the processing chain for processing the sensor data and for extracting at least one feature from the sensor data for use in sensing the at least one activity, wherein the automatically generating a processing chain of a sensor system for executing the desired application based on the desired application and the plurality of raw sensor data comprises: automatically selecting a sensor of the at least one sensor for use in the sensing of the activity based on the desired application; and automatically selecting the at least one feature to extract from the sensor data for use in sensing the at least one activity” (claim 1, ln. 9-19) (emphasis added by Applicant) cannot practically be performed in the human mind because it does not merely involve observations, evaluations, judgements, or opinions. Instead, per the Applicant, it requires generation of a processing chain of a sensor system that can be deployed on specialized hardware for executing the desired application. In support of this position, Applicant cites MPEP 2106.04(a), which is reproduced below, as well as guidance documents and caselaw in support of the position that limitations the human mind is not equipped to perform, such as analysis of network packets to detect suspicious activity, do not contain mental processes (see SRI Int'l, Inc. v. Cisco Systems, Inc., 930 F.3d 1295, 1304 (Fed. Cir. 2019); see also CyberSource, 654 F.3d at 1376, 99 USPQ2d at 1699; SiRF Tech., Inc. v. Int'l Trade Comm'n, 601 F.3d 1319, 94 USPQ2d 1607 (Fed. Cir. 2010); “Reminders on evaluating subject matter eligibility of claims under 35 U.S.C. 101”, Aug. 4, 2025; Example 39, Claim 1 of the PEG Examples 37-42). 
According to MPEP 2106.04(a), “Claims do not recite a mental process when they do not contain limitations that can practically be performed in the human mind, for instance when the human mind is not equipped to perform the claim limitations . . . [but] A Claim That Requires a Computer May Still Recite a Mental Process . . . examiners should review the specification to determine if the claimed invention is described as a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept. In these situations, the claim is considered to recite a mental process”. Here, the claimed subject matter includes concepts that are performed in the human mind, “automatically generating a processing chain . . . automatically selecting a sensor . . . automatically selecting the at least one feature”. In contrast with Applicant’s assertions that the claim requires specialized hardware for executing the desired application, as currently formulated, the claims merely recite that the mental processes are performed on a generic computer or in a computer environment, “of the sensor system . . . deploying the desired application for execution at an electronic device comprising the sensor system”. Regarding the comparison to subject matter deemed eligible because it cannot practically be performed in the mind, analyzing network packets to detect suspicious activity requires evaluating vast amounts of data, potentially transported through hundreds of ports, in a format not readily interpretable by the human mind. As a result, a person observing the vast amounts of data traveling through a network in packets would be unequipped to detect which packets were indicative of suspicious activity. 
On the other hand, a person with known or observed information on a desired application could readily exercise judgement to form an opinion on what processes, sensors, and features should be used. For example, a person seeking to detect motion could exercise judgement to form an opinion that visual sensors and coordinate features should be used. Further, a person with knowledge of the constraints and fault tolerance of the application could similarly readily exercise judgement to form an opinion on the processes that should be used. Therefore, the comparison between the subject matter of the instant application and cases where the claimed subject matter was deemed patent eligible is not persuasive. Instead, the claimed subject matter amounts to the mental processes of observation, evaluation, judgement, and opinion. As a result, the argument is not persuasive. 2. Second, Applicant argues the pending claims, if reciting an abstract idea, integrate the judicial exception into a practical application and impose meaningful limits on the judicial exception because they do not wholly preempt the designing of a processing chain for a sensor (Step 2A, Prong Two). 
Specifically, Applicant asserts that the recitations of “automatically generating a processing chain of the sensor system for executing the desired application based on the desired application and the plurality of raw sensor data, the processing chain for processing the sensor data and for extracting at least one feature from the sensor data for use in sensing the at least one activity, wherein the automatically generating a processing chain of a sensor system for executing the desired application based on the desired application and the plurality of raw sensor data comprises: automatically selecting a sensor of the at least one sensor for use in the sensing of the activity based on the desired application; and automatically selecting the at least one feature to extract from the sensor data for use in sensing the at least one activity” (claim 1, ln. 9-19) (emphasis added by Applicant) constitutes practical integration. In support of this position, Applicant cites guidance documentation that stands for the position that all claim limitations should be taken into consideration as a whole when evaluating whether a judicial exception is integrated into a practical application (see Example 39, Claim 1 of the PEG Examples 37-42). According to MPEP 2106.05(f), “Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words "apply it" (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer . . . A claim having broad applicability across many fields of endeavor may not provide meaningful limitations that integrate a judicial exception into a practical application or amount to significantly more”. 
Here, the Applicant-cited additional elements of the claimed invention are generic computer components, such as the “sensor system” and the “electronic device”, which have broad applicability across many fields of endeavor. Therefore, the recitation of these elements does not integrate the recited judicial exception into a practical application or impose meaningful limitations on the judicial exception, as in Example 39, Claim 1 of the PEG Examples 37-42. Specifically, while the Applicant argues they do not wholly preempt the designing of a processing chain for a sensor, the additional elements merely recite “sensor[s]”, a “sensor system”, and an “electronic device”, which would be present in any claimed process for designing of a processing chain for a sensor. As a result, the argument is not persuasive. 3. Third, Applicant argues the pending claims, if directed to an abstract idea, amount to an inventive concept because they recite a particular way of designing a processing chain of a sensor system, which amounts to more than generally linking to a particular technological environment (Step 2B). Specifically, Applicant asserts that the recitations of “automatically generating a processing chain of the sensor system for executing the desired application based on the desired application and the plurality of raw sensor data, the processing chain for processing the sensor data and for extracting at least one feature from the sensor data for use in sensing the at least one activity, wherein the automatically generating a processing chain of a sensor system for executing the desired application based on the desired application and the plurality of raw sensor data comprises: automatically selecting a sensor of the at least one sensor for use in the sensing of the activity based on the desired application; and automatically selecting the at least one feature to extract from the sensor data for use in sensing the at least one activity” (claim 1, ln. 
9-19) (emphasis added by Applicant) amount to an inventive concept by circumscribing the claimed embodiments in a manner that improves performance in an unconventional way and with specialized hardware. According to MPEP 2106.05, “An inventive concept cannot be furnished by the unpatentable law of nature (or natural phenomenon or abstract idea) itself . . . Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words "apply it" (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer . . . claiming the improved speed or efficiency inherent with applying the abstract idea on a computer does not integrate a judicial exception into a practical application or provide an inventive concept” (internal quotation marks omitted). Here, the Applicant-cited additional elements of the claimed invention are generic computer components, such as the “sensor system” and the “electronic device”, which are inherent to applying the judicial exception on a computer, and do not rise to the level of specialized hardware. Additionally, as discussed in detail above, the asserted circumscribing components are themselves mental processes, and, as a result, cannot amount to significantly more. As a result, the argument is not persuasive. II. Applicant argues the rejections of claims 1-2, 4, 6-9, 11, 14, and 15, under 35 USC § 102, should be withdrawn (Amendment and Response to Final Office Action Accompanying a Request for Continued Examination (RCE), 02/19/2026, Pg. 11-16, Section “Claim Rejections – 35 U.S.C. § 102”). 
Specifically, Applicant argues that Edge Impulse fails to anticipate claim 1 because it does not disclose “automatically generating a processing chain of the sensor system for executing the desired application based on the desired application and the plurality of raw sensor data, the processing chain for processing the sensor data and for extracting at least one feature from the sensor data for use in sensing the at least one activity, wherein the automatically generating a processing chain of a sensor system for executing the desired application based on the desired application and the plurality of raw sensor data comprises: automatically selecting a sensor of the at least one sensor for use in the sensing of the activity based on the desired application; and automatically selecting the at least one feature to extract from the sensor data for use in sensing the at least one activity” (claim 1, ln. 9-19) (emphasis added by Applicant). As a result, Applicant argues Edge Impulse also fails to anticipate claims 2, 4, 6-9, 11, 14, and 15, which depend on claim 1 and recite similar limitations. In support of this position, Applicant cites MPEP 2131, which requires each and every element of a claim to be disclosed in a prior art reference in order for the claim to be anticipated, and argues Edge Impulse fails to disclose the above underlined limitations of claim 1. According to MPEP 2145 (VI), “Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims” (see also In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993)). Additionally, according to MPEP 2111, “During patent examination, the pending claims must be given their broadest reasonable interpretation consistent with the specification” (internal quotation marks omitted) (see also Phillips v. AWH Corp., 415 F.3d 1303, 1316, 75 USPQ2d 1321, 1329 (Fed. Cir. 2005)). 
Here, Applicant characterizes Edge Impulse as disclosing customizable design of a processing pipeline, where processing blocks, parameters, and sensors can be adjusted from default options using a graphical user interface. However, the claims, as currently recited, do not prohibit manual adjustments of the automatically generated outputs, and any such prohibitions, which may or may not be recited in the specification, are not read into the claims. Therefore, even if Applicant’s characterization of Edge Impulse were assumed to be true, it would not serve as evidence against the anticipation of claim 1 by Edge Impulse. As a result, the argument is not persuasive. Additionally, Applicant asserts that Edge Impulse does not disclose that selecting “Save Impulse” initiates a process where a sensor is selected for inclusion with a processing chain because the selection of a sensor is a precondition for selecting “Save Impulse”. Here, the question at issue is whether “automatically generating a processing chain of a sensor system . . . comprises: automatically selecting a sensor” (Claim 1, ln. 9-16). As discussed above and acknowledged by the Applicant, the automatic selection of a sensor is a prerequisite for the generation of the processing chain of the sensor system. This in isolation is dispositive evidence that the automatically selecting a sensor is a comprising component of the automatically generating a processing chain. Additionally, as also discussed above, the generation of the processing chain finalizes the selection of the sensors. Specifically, the method includes both a preliminary selection of a sensor, which occurs prior to selecting “Save Impulse”, and the final selection of a sensor, which occurs when the “Save[d] Impulse” is generated as an executable. Both of these selections are within the broadest reasonable interpretation of selecting. 
Therefore, Edge Impulse discloses, and does not teach away from, the limitations of amended claim 1; the teaching-away argument, raised in regard to the rejections under 35 U.S.C. 103, is more appropriately addressed within the context of the rejections under 35 U.S.C. 102. As a result, the argument is not persuasive. Similarly, Applicant asserts that Edge Impulse does not disclose feature extraction generation as a component of the generation of the processing chain because the selection of “Save Impulse” occurs before the selection of “Generate Features”. However, as discussed in detail above, the user interface allows for the iterative return to the “Save Impulse” functionality, which, as also discussed above, generates a preliminary processing chain. As a result, the method executed by the system of Edge Impulse allows for the selection of features as a comprising component of the generation of the preliminary processing chain. Furthermore, as also discussed above, the step of feature selection is a necessary precondition of generating the finalized processing chain, which is finalized after all selections are made. Therefore, even if a user does not utilize the functionality to generate the preliminary processing chain with reference to the feature selections, they will ultimately generate the finalized processing chain using all selections, which will finalize those selections. As a result, the argument is not persuasive. III. Applicant argues the rejections of claims 10, 12, 13, 16, and 17, under 35 USC § 103, should be withdrawn (Amendment and Response to Final Office Action Accompanying a Request for Continued Examination (RCE), 02/19/2026, Pg. 16-127, Section “Claim Rejections – 35 U.S.C. § 103”). Regarding Claim 10, Applicant argues that Edge Impulse in view of Lockhart do not render claim 10 obvious because they fail to teach or suggest each and every limitation of amended claim 1. 
However, as discussed above, the arguments against Edge Impulse’s disclosure of each and every limitation of amended claim 1 are not persuasive. Additionally, Lockhart is not relied upon to teach any limitations of amended claim 1. As a result, the argument is not persuasive. Regarding Claims 12-13 and 17, Applicant argues that Edge Impulse in view of Chang do not render claims 12-13 and 17 obvious because they fail to teach or suggest each and every limitation of amended claim 1. However, as discussed above, the arguments against Edge Impulse’s disclosure of each and every limitation of amended claim 1 are not persuasive. Additionally, Chang is not relied upon to teach any limitations of amended claim 1. Additionally, for clarity of the record, it is worth pointing out that Lockhart is not relied upon to teach Claim 13, as incorrectly asserted by Applicant. As a result, the argument is not persuasive. Regarding Claim 16, Applicant argues that Edge Impulse in view of Gao do not render claim 16 obvious because they fail to teach or suggest each and every limitation of amended claim 1. However, as discussed above, the arguments against Edge Impulse’s disclosure of each and every limitation of amended claim 1 are not persuasive. Additionally, Gao is not relied upon to teach any limitations of amended claim 1. As a result, the argument is not persuasive. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATTHEW BRYCE GOLAN whose telephone number is (571)272-5159. The examiner can normally be reached Monday through Friday, 8:00 AM to 5:00 PM ET. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. 
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alexey Shmatov can be reached at (571) 270-3428. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /MATTHEW BRYCE GOLAN/Examiner, Art Unit 2123 /ALEXEY SHMATOV/Supervisory Patent Examiner, Art Unit 2123
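The §103 rejection over Edge Impulse in view of Chang turns on ranking features by relative importance and deselecting those below a threshold when resources are limited. As a rough illustration only, a top-k cutoff over precomputed importance scores can be sketched as follows; the function name, feature names, and scores are hypothetical and are not drawn from Chang's Dropout Feature Ranking implementation or from Edge Impulse.

```python
# Hypothetical sketch of resource-constrained feature selection, in the
# spirit of the feature-ranking idea cited from Chang in the rejection.
# All names and scores below are illustrative, not from either reference.

def select_features(importance, budget):
    """Keep the most important features that fit within a resource budget.

    importance: dict mapping feature name -> importance score (higher = better)
    budget: maximum number of features the constrained device can afford
    """
    # Rank features from most to least important.
    ranked = sorted(importance, key=importance.get, reverse=True)
    selected = ranked[:budget]    # top-k features are kept
    deselected = ranked[budget:]  # features below the cutoff are dropped
    return selected, deselected

# Illustrative scores for made-up motion-sensing features.
scores = {"accel_x_rms": 0.9, "accel_y_rms": 0.7,
          "gyro_z_mean": 0.4, "mag_x_peak": 0.1}
kept, dropped = select_features(scores, budget=2)
print(kept)     # ['accel_x_rms', 'accel_y_rms']
print(dropped)  # ['gyro_z_mean', 'mag_x_peak']
```

Chang's actual method derives the importance scores by learning per-feature dropout rates against a pre-trained model; the sketch above only shows the downstream cutoff step that the rejection maps to "automatically deselecting at least one feature".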


Prosecution Timeline

Aug 26, 2022
Application Filed
Jun 16, 2025
Non-Final Rejection — §101, §102, §103
Oct 23, 2025
Response Filed
Nov 14, 2025
Final Rejection — §101, §102, §103
Feb 19, 2026
Request for Continued Examination
Feb 28, 2026
Response after Non-Final Action
Mar 10, 2026
Non-Final Rejection — §101, §102, §103 (current)


Prosecution Projections

3-4
Expected OA Rounds
0%
Grant Probability
0%
With Interview (+0.0%)
3y 3m
Median Time to Grant
High
PTA Risk
Based on 3 resolved cases by this examiner. Grant probability derived from career allow rate.
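The note above says the grant probability is derived from the examiner's career allow rate. As a minimal sketch of what such a derivation could look like, assuming a simple additive interview lift (the actual dashboard model is not disclosed, and the function below is hypothetical):

```python
# Illustrative sketch, NOT the dashboard's actual model: estimate grant
# probability from an examiner's career outcomes, with an optional
# additive lift for cases that included an interview.

def grant_probability(granted, resolved, interview_lift=0.0, with_interview=False):
    """Estimate grant probability from career counts.

    granted / resolved: examiner's career outcome counts
    interview_lift: additive lift observed in cases with an interview
    """
    if resolved == 0:
        return None  # no resolved cases, nothing to estimate from
    base = granted / resolved
    if with_interview:
        return min(1.0, base + interview_lift)  # cap at 100%
    return base

# The figures shown on this page: 0 granted of 3 resolved, +0.0% lift,
# which yields the 0% probability displayed with or without interview.
print(grant_probability(0, 3))                            # 0.0
print(grant_probability(0, 3, 0.0, with_interview=True))  # 0.0
```

With only 3 resolved cases, a real model would likely also widen the uncertainty band rather than report a bare point estimate, which is why the page flags the small sample size.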
