Prosecution Insights
Last updated: April 19, 2026
Application No. 18/322,643

UNSUPERVISED MACHINE LEARNING LEVERAGING HUMAN COGNITIVE ABILITY LEARNING LOOP WORKFLOW

Non-Final OA: §101, §103

Filed: May 24, 2023
Examiner: MARU, MATIYAS T
Art Unit: 2148
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Booz Allen Hamilton Inc.
OA Round: 1 (Non-Final)
Grant Probability: 58% (Moderate)
Expected OA Rounds: 1-2
Estimated Time to Grant: 4y 6m
Grant Probability With Interview: 70%

Examiner Intelligence

Career Allow Rate: 58% (23 granted / 40 resolved; +2.5% vs TC avg)
Interview Lift: +12.5% allowance lift for resolved cases with an interview (moderate)
Avg Prosecution: 4y 6m (typical timeline)
Currently Pending: 39
Total Applications: 79 (career history, across all art units)

Statute-Specific Performance

§101: 35.9% (-4.1% vs TC avg)
§103: 50.9% (+10.9% vs TC avg)
§102: 1.9% (-38.1% vs TC avg)
§112: 11.3% (-28.7% vs TC avg)

Tech Center averages are estimates. Based on career data from 40 resolved cases.
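The headline figures above are simple ratios. A minimal sketch of how they are derived, with the page's own numbers hard-coded; the per-statute Tech Center averages are back-calculated from the deltas shown here (each works out to 40.0%) and are an assumption, not USPTO data:

```python
# Derivation of the dashboard's headline figures. Inputs are the values
# shown on this page; tc_avg is back-calculated from the stated deltas.
granted, resolved = 23, 40
allow_rate = granted / resolved          # 0.575, displayed as 58%

examiner_rate = {"101": 0.359, "103": 0.509, "102": 0.019, "112": 0.113}
tc_avg = {"101": 0.400, "103": 0.400, "102": 0.400, "112": 0.400}

for statute, rate in examiner_rate.items():
    delta = rate - tc_avg[statute]       # e.g. §101: 35.9% - 40.0% = -4.1%
    print(f"§{statute}: {rate:.1%} ({delta:+.1%} vs TC avg)")
```

The interview-lift figure is the difference in allowance rate between resolved cases with and without an examiner interview, computed the same way.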

Office Action

§101 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-14 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., an abstract idea) without significantly more.

In Step 1 of the 101 analysis set forth in MPEP 2106, the Examiner has determined that the claims recite a process that, under the broadest reasonable interpretation, falls within one or more statutory categories (processes).

In Step 2A, Prong 1, of the 101 analysis set forth in MPEP 2106, the Examiner has determined that the following limitations, under the broadest reasonable interpretation, cover a mental process but for the recitation of generic computer components:

Regarding claim 1: grouping each data point into one or more groups [ ] (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves observing characteristics of data points, evaluating similarities or criteria, and deciding how to organize the data into groups; see MPEP 2106.04); assigning each data point an index based on one or more groups into which each data point is grouped (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves evaluating group membership and deciding an identifying reference or index for each data point; see MPEP 2106.04);
classifying all indexed-data points of a group and labelling the classified indexed-data points of the group with the same label (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves observing indexed data points, evaluating them according to classification criteria, and deciding on a common label; see MPEP 2106.04).

If the claim limitations, under their broadest reasonable interpretation, cover performance of the limitations as a mental process but for the recitation of generic computer components, they fall within the mental processes grouping. Accordingly, the claim recites an abstract idea.

In Step 2A, Prong 2, of the 101 analysis set forth in MPEP 2106, the Examiner has determined that the following additional elements do not integrate this judicial exception into a practical application:

receiving plural data points: insufficient to transform the judicial exception into a patentable invention because this limitation is directed to mere data gathering, which is insignificant extra-solution activity. See MPEP 2106.05(g).

via a clustering algorithm: insufficient to transform the judicial exception into a patentable invention because this limitation amounts to no more than a recitation of the words "apply it" (or an equivalent), i.e., mere instructions to implement an abstract idea on a computer. See MPEP 2106.05(f).
In Step 2B of the 101 analysis set forth in the 2019 PEG, the Examiner has determined that the claim does not include additional elements sufficient to amount to significantly more than the judicial exception:

Regarding limitation (I): the additional elements considered extra/post-solution activity, as analyzed above, are well-understood, routine, and conventional activity; specifically, the courts have recognized computer functions such as receiving or transmitting data over a network, e.g., using the Internet to gather data, as well-understood, routine, and conventional. Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network). See MPEP 2106.05(d)(II).

Regarding limitation (II): this limitation recites mere application of the abstract idea, or mere instructions to implement an abstract idea on a computer, which is insufficient to transform the judicial exception into a patentable invention because it generally applies a generic computer and/or process to the judicial exception. See MPEP 2106.05(f).

As analyzed above, the additional elements do not integrate the noted judicial exception into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea.
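Editor's note, for orientation only (not part of the record): the claim-1 workflow the rejection characterizes, i.e. cluster the points, index each point by its group, classify the group, and propagate one label to every member, can be sketched as below. The threshold "clusterer" and the "c1"/"c2" labels are placeholders standing in for whatever clustering algorithm and labels the claims cover, not anything disclosed in the application.

```python
# Illustrative sketch of the characterized claim-1 workflow: receive points,
# group them via a (trivial) clustering rule, index each point by its group,
# and give every indexed point of a group the same group-level label.

def cluster(points, threshold=0.5):
    """Group each data point into one of two groups (stand-in clusterer)."""
    return {p: (0 if p < threshold else 1) for p in points}

def label_groups(indexed, group_labels):
    """Label every indexed data point of a group with that group's label."""
    return {p: group_labels[g] for p, g in indexed.items()}

points = [0.1, 0.2, 0.8, 0.9]                        # receiving plural data points
indexed = cluster(points)                            # index = group assignment
labels = label_groups(indexed, {0: "c1", 1: "c2"})   # Kanta-style dummy labels
print(labels)   # {0.1: 'c1', 0.2: 'c1', 0.8: 'c2', 0.9: 'c2'}
```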
Regarding claim 2: claim 2 depends from claim 1 and fails to resolve the deficiencies identified above by integrating the judicial exception into a practical application or introducing significantly more than the judicial exception. The claim recites: classifying involves classifying each indexed-data points of a group simultaneously (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves considering multiple data points together, evaluating them under common criteria, and deciding their classification at the same time; see MPEP 2106.04). Claim 8 recites similar subject matter as claim 2, so it is rejected under the same rationale.

Regarding claim 3: claim 3 depends from claim 1 and fails to resolve the deficiencies identified above by integrating the judicial exception into a practical application or introducing significantly more than the judicial exception. The claim recites: classifying all indexed-data points of a first group and labelling the classified indexed-data points of the first group with a first label; and classifying all indexed-data points of a second group and labelling the classified indexed-data points of the second group with a second label (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves observing different groups, evaluating their characteristics, and deciding distinct labels for each group; see MPEP 2106.04). Claim 9 recites similar subject matter as claim 3, so it is rejected under the same rationale.

Regarding claim 4: claim 4 depends from claim 1 and fails to resolve the deficiencies identified above by integrating the judicial exception into a practical application or introducing significantly more than the judicial exception. The claim recites: encoding each data point before grouping each data point
(under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves converting information into a different representation before assigning it to groups; see MPEP 2106.04). Claim 10 recites similar subject matter as claim 4, so it is rejected under the same rationale.

Regarding claim 5: claim 5 depends from claim 4 and fails to resolve the deficiencies identified above by integrating the judicial exception into a practical application or introducing significantly more than the judicial exception. The claim recites: wherein: encoding each data point involves one-hot encoding. This additional limitation merely links the judicial exception to a field of use and/or technological environment; see MPEP 2106.05(h). Limitations directed to a field of use cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept at Step 2B. Claim 11 recites similar subject matter as claim 5, so it is rejected under the same rationale.

Regarding claim 6: claim 6 depends from claim 1 and fails to resolve the deficiencies identified above by integrating the judicial exception into a practical application, or introducing significantly more than the judicial exception. The claim recites: performing dimensionality reduction of the encoded data points (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves evaluating multiple variables, deciding which aspects are relevant, and reducing complexity by focusing on fewer dimensions; see MPEP 2106.04). Claim 12 recites similar subject matter as claim 6, so it is rejected under the same rationale.
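Editor's note, for orientation only (not part of the record): the dependent-claim steps addressed above, one-hot encoding each data point (claims 4-5) followed by dimensionality reduction of the encoded points (claim 6), can be illustrated with a plain NumPy sketch. The sample data and the SVD-based projection are placeholders, not material from the application:

```python
# Illustrative sketch: one-hot encode categorical data points, then reduce
# the dimensionality of the encoding via an SVD-based (PCA-style) projection.
import numpy as np

data = ["red", "green", "blue", "green", "red"]

# One-hot encoding: one column per distinct category
categories = sorted(set(data))                       # ['blue', 'green', 'red']
encoded = np.array([[1.0 if c == x else 0.0 for c in categories] for x in data])

# Dimensionality reduction: project the centered encoding onto its
# top-2 principal directions
centered = encoded - encoded.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:2].T                        # shape (5, 2)
print(encoded.shape, "->", reduced.shape)            # (5, 3) -> (5, 2)
```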
Regarding claim 7, in Step 2A, Prong 1: group each data point into one or more groups [ ] (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves observing characteristics of data points, evaluating similarities or criteria, and deciding how to organize the data into groups; see MPEP 2106.04); assign each data point an index based on one or more groups into which each data point is grouped (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves evaluating group membership and deciding an identifying reference or index for each data point; see MPEP 2106.04); label each indexed-data points of the group with that label, the label being based on a classification of all indexed-data points of the group (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves observing a group-level classification result, evaluating how it applies to individual data points, and assigning labels accordingly; see MPEP 2106.04).

If the claim limitations, under their broadest reasonable interpretation, cover performance of the limitations as a mental process but for the recitation of generic computer components, they fall within the mental processes grouping. Accordingly, the claim recites an abstract idea.

In Step 2A, Prong 2, of the 101 analysis set forth in MPEP 2106, the Examiner has determined that the following additional elements do not integrate this judicial exception into a practical application:

a processor; computer memory having instructions stored thereon that when executed will cause the processor to: insufficient to transform the judicial exception into a patentable invention because this limitation amounts to no more than a recitation of the words "apply it" (or an equivalent), i.e., mere instructions to implement an abstract idea on a computer. See MPEP 2106.05(f).
receive plural data points: insufficient to transform the judicial exception into a patentable invention because this limitation is directed to mere data gathering, which is insignificant extra-solution activity. See MPEP 2106.05(g).

via a clustering algorithm: insufficient to transform the judicial exception into a patentable invention because this limitation amounts to no more than a recitation of the words "apply it" (or an equivalent), i.e., mere instructions to implement an abstract idea on a computer. See MPEP 2106.05(f).

store plural indexed data points in memory: insufficient to transform the judicial exception into a patentable invention because this limitation is directed to storing information, which is insignificant extra-solution activity. See MPEP 2106.05(g).

receive a label for a group: insufficient to transform the judicial exception into a patentable invention because this limitation is directed to mere data gathering, which is insignificant extra-solution activity. See MPEP 2106.05(g).

In Step 2B of the 101 analysis set forth in the 2019 PEG, the Examiner has determined that the claim does not include additional elements sufficient to amount to significantly more than the judicial exception:

Regarding limitations (I and III): these recite mere application of the abstract idea, or mere instructions to implement an abstract idea on a computer, which is insufficient to transform the judicial exception into a patentable invention because the limitations generally apply a generic computer and/or process to the judicial exception. See MPEP 2106.05(f).
Regarding limitations (II and V): the additional elements considered extra/post-solution activity, as analyzed above, are well-understood, routine, and conventional activity; specifically, the courts have recognized computer functions such as receiving or transmitting data over a network, e.g., using the Internet to gather data, as well-understood, routine, and conventional. Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network). See MPEP 2106.05(d)(II).

Regarding limitation (IV): the additional element considered extra/post-solution activity, as analyzed above, is well-understood, routine, and conventional activity; specifically, the courts have recognized storing and retrieving information in memory as a well-understood, routine, and conventional computer function. Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93.

As analyzed above, the additional elements do not integrate the noted judicial exception into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea.
Regarding claim 13, in Step 2A, Prong 1: comparing the incoming data points to a corpus of labelled data points, the corpus of labelled data points including data points that have been grouped into a group (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves observing new data points, comparing them against known examples, and evaluating similarities or matches to data that have been grouped; see MPEP 2106.04); each data point of the group labelled with a same label (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves deciding to assign an identical label to all members of a group based on their grouping; see MPEP 2106.04); labeling an incoming data point with a label based on a match between the incoming data point and a labelled data point (under the broadest reasonable interpretation, this recites an abstract idea, a mental process: it involves observing a match between new data points and labeled data to decide on a corresponding label; see MPEP 2106.04).

If the claim limitations, under their broadest reasonable interpretation, cover performance of the limitations as a mental process but for the recitation of generic computer components, they fall within the mental processes grouping. Accordingly, the claim recites an abstract idea.

In Step 2A, Prong 2, of the 101 analysis set forth in MPEP 2106, the Examiner has determined that the following additional elements do not integrate this judicial exception into a practical application:

receiving incoming data points: insufficient to transform the judicial exception into a patentable invention because this limitation is directed to mere data gathering, which is insignificant extra-solution activity. See MPEP 2106.05(g).
via a clustering algorithm: insufficient to transform the judicial exception into a patentable invention because this limitation amounts to no more than a recitation of the words "apply it" (or an equivalent), i.e., mere instructions to implement an abstract idea on a computer. See MPEP 2106.05(f).

In Step 2B of the 101 analysis set forth in the 2019 PEG, the Examiner has determined that the claim does not include additional elements sufficient to amount to significantly more than the judicial exception:

Regarding limitation (I): the additional elements considered extra/post-solution activity, as analyzed above, are well-understood, routine, and conventional activity; specifically, the courts have recognized computer functions such as receiving or transmitting data over a network, e.g., using the Internet to gather data, as well-understood, routine, and conventional. Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network). See MPEP 2106.05(d)(II).

Regarding limitation (II): this recites mere application of the abstract idea, or mere instructions to implement an abstract idea on a computer, which is insufficient to transform the judicial exception into a patentable invention because the limitation generally applies a generic computer and/or process to the judicial exception. See MPEP 2106.05(f).
As analyzed above, the additional elements do not integrate the noted judicial exception into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea. Claim 14 recites similar subject matter as claim 13, so it is rejected under the same rationale.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3, 7, and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Kanta, Pub. No. US20220343115A1, in view of Bras, Pub. No. US20040215428A1.

Regarding claim 1, Kanta teaches: A method for developing a model to classify data, the method comprising: receiving plural data points; grouping each data point into one or more groups via a clustering algorithm; (Kanta, “[0012] The unsupervised classification model may implement a clustering algorithm to assign each data record [receiving plural data points] of the dataset to one or more of the groups [grouping each data point into one or more groups] based on similarities between the data records that are assigned to the same group. The clustering algorithm [via a clustering algorithm] used may depend on the nature of the dataset and/or based on user preference.
If the dataset is numerical, the unsupervised classification model may use a clustering algorithm that works better with numerical data (e.g., k-means clustering)...”)

and labelling the classified indexed-data points of the group with the same label. (Kanta, “[0038] At block 220, the processing device may then assign a label to each of the data records based on the groups. The labels may be randomly generated, selected from an existing list of predefined labels, or may be sequential integer numbers, for example. For example, if the processing device divided the data set into C number of clusters, the processing device may assign the label “c1” to the data records belonging to the first cluster, the label “c2” to the data records belonging the second cluster, and so on [labelling the classified indexed-data points of the group with the same label] (i.e., labeling all data records belonging to the same cluster with the same label). In embodiments, the processing device may add a column to the data set, wherein the column contains the dummy label of each corresponding data record.”)

Kanta does not teach: assigning each data point an index based on one or more groups into which each data point is grouped; and classifying all indexed-data points of a group.

Bras teaches: assigning each data point an index based on one or more groups into which each data point is grouped; and classifying all indexed-data points of a group (Bras, “[0011] According to another aspect of the invention, a computer-implemented method of generating a finite-element mesh incorporating model-specific response to produce variable resolution in the mesh comprises: assigning an index value to each of a group of original elements [assigning each data point an index based on one or more groups into which each data point is grouped] based on an index function providing a heuristic measure of impact on a model to produce an indexed element for each original element; grouping the indexed elements into at
least two groups based on the index value of each element [classifying all indexed-data points of a group]; selecting a subset of indexed elements from each of the groups based on a selection function; creating a finite-element mesh for each of the groups using the corresponding original elements of each of the subset of indexed elements selected by the selection function; and combining the finite-element mesh from each of the groups into a final finite-element mesh.”)

Bras and Kanta are related to the same field of endeavor (i.e., neural network optimization). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to combine the teachings of Bras with the teachings of Kanta to add model-specific mechanisms (an index function, a selection function, and variable-resolution representations) to improve how data is grouped and selected. (Bras, Abstract.)

Regarding claim 3, Kanta in view of Bras teach the method of claim 1. Kanta further teaches: classifying all indexed-data points of a first group and labelling the classified indexed-data points of the first group with a first label; and classifying all indexed-data points of a second group and labelling the classified indexed-data points of the second group with a second label. (Kanta, “[0038] At block 220, the processing device may then assign a label to each of the data records based on the groups [classifying all indexed-data points]. The labels may be randomly generated, selected from an existing list of predefined labels, or may be sequential integer numbers, for example.
For example, if the processing device divided the data set into C number of clusters, the processing device may assign the label “c1” [and labelling the classified indexed-data points of the first group with a first label] to the data records belonging to the first cluster [of a first group], the label “c2” [and labelling the classified indexed-data points of the second group with a second label] to the data records belonging the second cluster [classifying all indexed-data points of a second group], and so on. In embodiments, the processing device may add a column to the data set, wherein the column contains the dummy label of each corresponding data record.”)

Claim 9 recites limitations analogous to claim 3, so it is rejected under the same rationale.

Regarding claim 7, Kanta teaches: A system for developing a model to classify data, the system (Kanta, “[0010] Aspects of the present disclosure address the above-noted and other deficiencies by implementing an unsupervised classification model that uses unlabeled data as the input by converting the unlabeled data to labeled data [a model to classify data].
”) comprising: a processor; computer memory having instructions stored thereon that when executed will cause the processor to: (Kanta, “[0065] The example computer system 500 may include a processing device 502, a main memory 504 (e.g., read-only memory (ROM) [a processor; computer memory having instructions stored thereon that when executed will cause the processor to], flash memory, dynamic random access memory (DRAM) (such as synchronous DRAM (SDRAM), etc.), a static memory 506 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 518, which communicate with each other via a bus 530.”)

receive a label for a group and label each indexed-data points of the group with that label, (Kanta, “[0038] At block 220, the processing device may then assign a label to each of the data records based on the groups [receive a label for a group]. The labels may be randomly generated, selected from an existing list of predefined labels, or may be sequential integer numbers, for example. For example, if the processing device divided the data set into C number of clusters, the processing device may assign the label “c1” to the data records belonging to the first cluster, the label “c2” to the data records belonging the second cluster, and so on [label each indexed-data points of the group with that label]. In embodiments, the processing device may add a column to the data set, wherein the column contains the dummy label of each corresponding data record.”)

Kanta does not teach: store plural indexed data points in memory; and the label being based on a classification of all indexed-data points of the group.

Bras teaches: store plural indexed data points in memory; and (Bras, “[0046] Once index values have been calculated for each data point, the points are grouped (act 504) based on their respective index values.
In the illustrated embodiment, the DEM data points are first sorted into groups and placed in “bins” in memory based on their respective index values [store plural indexed data points in memory], in a predetermined way.”)

the label being based on a classification of all indexed-data points of the group. (Bras, “[0011] According to another aspect of the invention, a computer-implemented method of generating a finite-element mesh incorporating model-specific response to produce variable resolution in the mesh comprises: assigning an index value to each of a group of original elements based on an index function providing a heuristic measure of impact on a model to produce an indexed element for each original element; grouping the indexed elements into at least two groups based on the index value of each element [the label being based on a classification of all indexed-data points of the group]; selecting a subset of indexed elements from each of the groups based on a selection function; creating a finite-element mesh for each of the groups using the corresponding original elements of each of the subset of indexed elements selected by the selection function; and combining the finite-element mesh from each of the groups into a final finite-element mesh.”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the present application to combine the teachings of Bras with the teachings of Kanta for the same reasons disclosed for claim 1.

Claims 2 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Kanta in view of Bras and in further view of GRINIS et al., Pub. No. US20230306051A1.

Regarding claim 2, Kanta in view of Bras teach the method of claim 1. Kanta in view of Bras do not teach: wherein: classifying involves classifying each indexed-data points of a group simultaneously.

GRINIS teaches: wherein: classifying involves classifying each indexed-data points of a group simultaneously.
(GRINIS, “[0022] In an embodiment, the multi-label classifier may be configured to rapidly process textual data, such as but not limited to, emails, text messages, chats, and the like, and simultaneously determine label scores and decisions for all revenue-based labels. The simultaneous identification of all revenue-based labels by the trained multi-label classifier may be performed at similar processing times [classifying involves classifying each indexed-data points of a group simultaneously] to that of identifying one revenue-based label. It should be appreciated that enhanced computer efficiency is achieved with the significantly reduced processing time.”)

GRINIS, Kanta, and Bras are related to the same field of endeavor (i.e., neural network optimization). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to combine the teachings of GRINIS with the teachings of Kanta and Bras to add a more structured classification framework by specifying how multiple classifiers are trained and combined to produce multiple labels and confidence values from a single input. (GRINIS, Abstract.)

Claim 8 recites limitations analogous to claim 2, so it is rejected under the same rationale.

Claims 4-6 and 10-12 are rejected under 35 U.S.C. 103 as being unpatentable over Kanta in view of Bras and in further view of Elkind et al., Pub. No. US20190273510A1.

Regarding claim 4, Kanta in view of Bras teach the method of claim 1. Kanta in view of Bras do not teach: comprising: encoding each data point before grouping each data point.

Elkind teaches: comprising: encoding each data point before grouping each data point.
(Elkind, “[0127] The encoder RNN 725 takes as input a sequence of numbers or sequence of vectors of numbers that are generated from the input data [encoding each data point before grouping each data point] either through a vector space embedding, either learned or not, a one-hot or other encoding, or the sequence of single numbers, scaled, normalized, otherwise transformed, or not. Encoder RNN 725 may include any number of layers of one or more types of RNN cells, each layer including an LSTM, GRU, or other RNN cell type. Additionally, each layer may have multiple RNN cells, the output of which is combined in some way before being sent to the next layer up if present. The learned weights within each cell may vary depending on the cell type. Encoder RNN 725 may produce an output after each element of the input sequence is provided to it in addition to the internal states of the RNN cells, all of which can be sent to the decoder RNN during training or used for embedding or classification.”)

Elkind, Kanta, and Bras are related to the same field of endeavor (i.e., neural network optimization). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to combine the teachings of Elkind with the teachings of Kanta and Bras to add the ability to classify variable-length or complex data types, rather than just assigning labels to grouped data. (Elkind, Abstract.)

Claim 10 recites limitations analogous to claim 4, so it is rejected under the same rationale.

Regarding claim 5, Kanta in view of Bras and Elkind teach the method of claim 4. Elkind further teaches: wherein: encoding each data point involves one-hot encoding.
(Elkind, “[0127] The encoder RNN 725 takes as input a sequence of numbers or sequence of vectors of numbers that are generated from the input data either through a vector space embedding, either learned or not, a one-hot or other encoding [ encoding each data point involves one-hot encoding ], or the sequence of single numbers, scaled, normalized, otherwise transformed, or not. Encoder RNN 725 may include any number of layers of one or more types of RNN cells, each layer including an LSTM, GRU, or other RNN cell type. Additionally, each layer may have multiple RNN cells, the output of which is combined in some way before being sent to the next layer up if present. The learned weights within each cell may vary depending on the cell type. Encoder RNN 725 may produce an output after each element of the input sequence is provided to it in addition to the internal states of the RNN cells, all of which can be sent to the decoder RNN during training or used for embedding or classification.”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the present application to combine the teachings of Elkind with the teachings of Kanta and Bras for the same reasons disclosed for claim 4.

Claim 11 recites limitations analogous to claim 5, and is rejected under the same rationale.

Regarding claim 6, Kanta in view of Bras teach the method of claim 1. Kanta in view of Bras do not teach: comprising: performing dimensionality reduction of the encoded data points. Elkind teaches: comprising: performing dimensionality reduction of the encoded data points.

(Elkind, “[0121] This system characterizes its input by identifying a reduced number of features of the input. The system uses encoder RNN 725 to reduce (compress) the dimensionality of the input source data [ performing dimensionality reduction of the encoded data points ] so that a classifier can analyze this reduced vector space to determine the characteristics of the source data.
An encoder RNN whose output includes less nodes than the sequential input data creates a compressed version of the input.”)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the present application to combine the teachings of Elkind with the teachings of Kanta and Bras for the same reasons disclosed for claim 4.

Claim 12 recites limitations analogous to claim 6, and is rejected under the same rationale.

Claim(s) 13 – 14 are rejected under 35 U.S.C. 103 as being unpatentable over Kanta in view of Mars, Pub. No. US20190130244A1.

Regarding claim 13, Kanta teaches: A method for classifying data, the method comprising: receiving incoming data points;

(Kanta, “[0038] At block 220, the processing device may then assign a label to each of the data records based on the groups [ receiving incoming data points ]. The labels may be randomly generated, selected from an existing list of predefined labels, or may be sequential integer numbers, for example...”)

the corpus of labelled data points including data points that have been grouped into a group via a clustering algorithm

(Kanta, “[0012] The unsupervised classification model may implement a clustering algorithm to assign each data record [ the corpus of labelled data points including data points ] of the dataset to one or more of the groups [ that have been grouped into a group ] based on similarities between the data records that are assigned to the same group. The clustering algorithm [ via a clustering algorithm ] used may depend on the nature of the dataset and/or based on user preference. If the dataset is numerical, the unsupervised classification model may use a clustering algorithm that works better with numerical data (e.g., k-means clustering)...”)

each data point of the group labelled with a same label; and

(Kanta, “[0038] At block 220, the processing device may then assign a label to each of the data records based on the groups.
The labels may be randomly generated, selected from an existing list of predefined labels, or may be sequential integer numbers, for example. For example, if the processing device divided the data set into C number of clusters, the processing device may assign the label “c1” to the data records belonging to the first cluster, the label “c2” to the data records belonging the second cluster, and so on [ each data point of the group labelled with a same label ] (i.e.: labeling all data records belonging to the same cluster with the same label). In embodiments, the processing device may add a column to the data set, wherein the column contains the dummy label of each corresponding data record.”)

Kanta does not teach: comparing the incoming data points to a corpus of labelled data points, labeling an incoming data point with a label based on a match between the incoming data point and a labelled data pack. Mars teaches: comparing the incoming data points to a corpus of labelled data points,

(Mars, “[0070] In a first implementation, S240 may function to implement or use a predetermined reference table to identify or determine a machine and/or program-comprehensible object or operation to map to each slot and associated one or more slot classification labels of the user input data. In such implementation, S240 implementing the slot extractor functions to match (or compare) [ comparing ] the slot (value or data) [ the incoming data points ] and the associated slot classification label(s) to the predetermined reference table [ to a corpus of labelled data points ] to identify the program-comprehensible object or operation that should be mapped to the slot and the associated slot classification label.”)

labeling an incoming data point with a label based on a match between the incoming data point and a labelled data pack.
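The comparison operation just recited (matching an incoming data point against a corpus of labelled data points and adopting the label of the closest match) can be sketched generically in Python. This is an illustrative nearest-match example only, not code from Mars or any other cited reference; the function name, corpus, and labels are hypothetical.

```python
import numpy as np

def label_by_nearest_match(incoming, corpus_points, corpus_labels):
    """Label an incoming point with the label of its closest match
    in a labelled corpus, using Euclidean distance.

    Generic sketch of 'comparing the incoming data points to a corpus
    of labelled data points' and labelling based on the match.
    """
    distances = np.linalg.norm(corpus_points - incoming, axis=1)
    return corpus_labels[int(np.argmin(distances))]

# Two labelled corpus points; the incoming point is nearest to "c2".
corpus = np.array([[0.0, 0.0], [10.0, 10.0]])
labels = ["c1", "c2"]
label_by_nearest_match(np.array([9.0, 9.5]), corpus, labels)  # "c2"
```

The cluster-derived labels ("c1", "c2", ...) from the Kanta passage above would serve as `corpus_labels` in this kind of workflow.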
(Mars, “[0045] In operation, S220 implementing the competency classification deep machine learning algorithm may function to analyze the user input data and generate a classification label. Specifically, based on the features, meaning and semantics of the words and phrases in the user input data, the competency classification deep machine learning algorithm may function to calculate and output a competency classification label [ labeling an incoming data point with a label ] having a highest probability of matching an intent of the user input data [ based on a match between the incoming data point and a labelled data pack ]. For example, the classification machine learning model generate, based on user input data, a classification label of “Income” having a probability of intent match of “89%” for a given query or command of the user input data, as shown by way of example in FIG. 3B.”)

Mars and Kanta are related to the same field of endeavor (i.e., neural network optimization). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to combine the teachings of Mars with the teachings of Kanta to add the use of classification results, where generated labels are used to trigger further processing or actions. (Mars, Abstract).

Claim 14 recites limitations analogous to claim 13, and is rejected under the same rationale.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Li et al., Pub. No. US11216491B2. Li teaches an automatic data input and query system that is controlled by well-defined control data. Certain control data may relate to data schemas and direct operations performed by the system to extract fields from machine data.

DUPLESSIS et al., Pub. No. US12417407B2.
DUPLESSIS teaches monitoring and automated assessment as a computational approach that can be used to computationally monitor deployed models that are being used in production (e.g., client-facing or in-use) systems, as a warning mechanism tuned to issue alerts or cause downstream model changes upon detecting mismatches.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATIYAS T MARU, whose telephone number is (571) 270-0902, or via email: matiyas.maru@uspto.gov. The examiner can normally be reached Monday through Friday, 8:00 am to 4:00 pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Michelle Bechtold, can be reached at (571) 431-0762. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/M.T.M./ Examiner, Art Unit 2148
/MICHELLE T BECHTOLD/ Supervisory Patent Examiner, Art Unit 2148
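As an aside for readers unfamiliar with the technique at the center of these §103 rejections, the cluster-then-label workflow quoted from Kanta (group records with k-means, then give every record in a cluster a shared label such as "c1", "c2", ...) can be sketched in Python. This is a generic illustration only, not code from Kanta or any cited reference; the function name and deterministic initialisation are choices made here for reproducibility.

```python
import numpy as np

def cluster_and_label(points, k, iters=10):
    """Group data points with a minimal k-means, then assign every
    point in the same group a shared label ("c1", "c2", ...).

    Generic sketch of the cluster-then-label workflow; uses the
    first k points as initial centroids so the result is repeatable.
    """
    centroids = points[:k].copy()  # deterministic init: first k points
    for _ in range(iters):
        # Distance from every point to every centroid, then assign
        # each point to its nearest centroid.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        for j in range(k):
            members = points[assign == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return [f"c{j + 1}" for j in assign]

# Two tight groups: points near the origin share one label,
# points near (5, 5) share the other.
pts = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9]])
cluster_and_label(pts, k=2)
```

In the claimed workflow, the labels produced here would form the "corpus of labelled data points" against which incoming points are later matched.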

Prosecution Timeline

May 24, 2023
Application Filed
Mar 23, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586114
GENERATING DIGITAL RECOMMENDATIONS UTILIZING COLLABORATIVE FILTERING, REINFORCEMENT LEARNING, AND INCLUSIVE SETS OF NEGATIVE FEEDBACK
2y 5m to grant Granted Mar 24, 2026
Patent 12572796
METHODS AND SYSTEMS FOR GENERATING RECOMMENDATIONS FOR COUNTERFACTUAL EXPLANATIONS OF COMPUTER ALERTS THAT ARE AUTOMATICALLY DETECTED BY A MACHINE LEARNING ALGORITHM
2y 5m to grant Granted Mar 10, 2026
Patent 12567004
METHOD OF MACHINE LEARNING TRAINING FOR DATA AUGMENTATION
2y 5m to grant Granted Mar 03, 2026
Patent 12561588
Methods and Systems for Generating Example-Based Explanations of Link Prediction Models in Knowledge Graphs
2y 5m to grant Granted Feb 24, 2026
Patent 12561584
TEACHING DATA PREPARATION DEVICE, TEACHING DATA PREPARATION METHOD, AND PROGRAM
2y 5m to grant Granted Feb 24, 2026
Based on this examiner's 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
58%
Grant Probability
70%
With Interview (+12.5%)
4y 6m
Median Time to Grant
Low
PTA Risk
Based on 40 resolved cases by this examiner. Grant probability derived from career allow rate.
