Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement
Acknowledgment is made of the information disclosure statements filed May 11, 2023, which comply with 37 CFR 1.97. As such, the information disclosure statements have been placed in the application file and the information referred to therein has been considered by the examiner.

Specification
The disclosure is objected to because of the following informalities: In paragraph [0060], the term “The user device” is labeled as “430” in reference to step 254 in Figure 2. No label “430” is found in Figure 2 or in Figures 1-9; Figure 2 has the term “user device” with the label “230”. The term “The user device” therefore appears to be improperly labeled within the specification and may have been intended to read “230”. Appropriate correction is required.

Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1: Claim 1 is a process type claim. Claim 16 is a machine type claim. Claim 20 is a CRM claim. Therefore, claims 1-20 are directed to a process, machine, manufacture, or composition of matter.

As per claim 1,
2A Prong 1: “obtaining, for each feature in a set of features, a respective noise model” The user, mentally or with pencil and paper, creates a noise model for each feature. “a selection of features from the set of features;” The user, mentally or with pencil and paper, selects features from a set of features.
“for each entity of a first plurality of entities used for training, obtaining a value for each of the selected features;” The user, mentally or with pencil and paper, retrieves a data value for every selected feature. “using a first set of training pairs, each training pair in the first set of training pairs comprising respective entity data paired with corresponding distorted entity data,” The user, mentally or with pencil and paper, applies the training pairs of original data and distorted data as input to the machine learning model. “wherein: the respective entity data comprises values of the selected features for a respective entity in the first plurality of entities, and the corresponding distorted entity data is obtained by distorting a value of at least one of the selected features for the respective entity based on the respective noise model for the at least one of the selected features;” The user, mentally or with pencil and paper, manipulates data using the developed noise model. “values of the selected features for a particular entity to obtain a representation of the values of the selected features for that entity;” The user, mentally or with pencil and paper, creates a representation of the data values from the selected features. “and determining a similarity between the particular entity and another entity based on the representation for the particular entity.” The user, mentally or with pencil and paper, determines a similarity of an entity and another entity using the data representations.

2A Prong 2: Additional elements: “receiving, from a user device” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).
“training a machine-learning model” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of training a machine learning model using a first set of training pairs of the original data and distorted data.) “and inputting, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to obtain a representation of values of the selected features.)

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “receiving, from a user device” (MPEP 2106.05(d)(II) indicates that merely "transmitting or receiving data" is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed receiving steps are well-understood, routine, conventional activity is supported under Berkheimer). “training a machine-learning model” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of training a machine learning model using a first set of training pairs of the original data and distorted data.)
“and inputting, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to obtain a representation of values of the selected features.)

As per claim 2,
2A Prong 1: “The computer-implemented method of claim 1, wherein the first plurality of entities includes at least one of the particular entity or the other entity.” Examiner’s note: Rejected under 101 as this limitation further elaborates on the mental process recited in claim 1, “for each entity of a first plurality of entities used for training, obtaining a value for each of the selected features;”. The limitation in claim 2 denotes that the particular entity or the other entity is included in the first plurality of entities when obtaining a value for each of the selected features for each entity of a first plurality of entities used for training.

2A Prong 2: The claim does not recite any additional elements beyond the judicial exception.

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.

As per claim 3,
2A Prong 1: “values of the selected features for the other entity to obtain a representation of the values of the selected features for the other entity,” The user, mentally or with pencil and paper, creates a representation of the data values from the selected features. “wherein the similarity between the particular entity and the other entity is determined based on the representation for the particular entity and the representation for the other entity” The user, mentally or with pencil and paper, determines a similarity of an entity and another entity using the data representations.
2A Prong 2: Additional elements: “inputting, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to obtain a representation of values of the selected features.)

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “inputting, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to obtain a representation of values of the selected features.)

As per claim 4,
2A Prong 1: “wherein determining the similarity between the particular entity and the two or more other entities comprises:” The user, mentally or with pencil and paper, determines a similarity of an entity and another entity. “values of the selected features for the two or more other entities to obtain representations of the values of the selected features for the two or more other entities;” The user, mentally or with pencil and paper, creates a representation of the data values from the selected features for the two or more other entities. “and determining similarity scores for the two or more other entities compared to the particular entity based on the representations for the particular entity and the two or more other entities,” The user, mentally or with pencil and paper, compares and determines the similarity scores of entities using the data representations.
2A Prong 2: Additional elements: “inputting, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to obtain a representation of values of features for entities by inputting the selected features.) “the method further comprising: identifying a subset of the two or more other entities as similar to the particular entity based on their respective similarity scores.” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “inputting, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to obtain a representation of values of features for entities by inputting the selected features.) “the method further comprising: identifying a subset of the two or more other entities as similar to the particular entity based on their respective similarity scores.” (MPEP 2106.05(d)(II) indicates that merely "identifying undeliverable mail items, decoding data on those mail items and creating output data" is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed identifying step is well-understood, routine, conventional activity is supported under Berkheimer).
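For reference, the limitation pattern analyzed in claims 1 and 4 above — obtaining representations of feature values for a particular entity and two or more other entities, determining similarity scores from those representations, and identifying a similar subset — can be sketched as follows. This is a minimal hypothetical illustration for the record only; the function names and the cosine metric are assumptions of this sketch and are not recited in the claims or taken from the application.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length representation vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def most_similar(particular, others, k=1):
    # Score each other entity against the particular entity, then
    # identify the top-k subset by similarity score.
    scores = {name: cosine_similarity(particular, rep) for name, rep in others.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Hypothetical representations obtained from a trained model.
particular = [1.0, 0.0, 1.0]
others = {"A": [0.9, 0.1, 1.1], "B": [-1.0, 0.5, 0.0]}
subset = most_similar(particular, others, k=1)  # entity "A" is most similar
```

The sketch shows only that the identifying step operates on previously obtained representations; it does not reproduce any model architecture from the claims.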
As per claim 5,
2A Prong 1: A judicial exception is not recited in the claim, as it does not recite an abstract idea (mathematical concepts, certain methods of organizing human activity, or mental processes), a law of nature, or a natural phenomenon.

2A Prong 2: Additional elements: providing information relating to the subset of the two or more other entities for output at a user interface of the user device. (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: providing information relating to the subset of the two or more other entities for output at a user interface of the user device. (MPEP 2106.05(d)(II) indicates that merely "transmitting or receiving data over a network" is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed providing step is well-understood, routine, conventional activity is supported under Berkheimer).

As per claim 6,
2A Prong 1: “filtering the subset of the two or more other entities to obtain a filtered set of entities;” The user, mentally or with pencil and paper, groups together similar entities based on the representation of data to get a set of similar entities.

2A Prong 2: Additional elements: “receiving, from the user device” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)). “an indication of one or more filters to be applied;” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)). “and providing information relating to the filtered set of entities for output at a user interface of the user device.” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).
2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “receiving, from the user device” (MPEP 2106.05(d)(II) indicates that merely "transmitting or receiving data over a network" is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed receiving steps are well-understood, routine, conventional activity is supported under Berkheimer). “an indication of one or more filters to be applied;” (MPEP 2106.05(d)(II) indicates that merely "transmitting or receiving data over a network" is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed receiving steps are well-understood, routine, conventional activity is supported under Berkheimer). “and providing information relating to the filtered set of entities for output at a user interface of the user device.” (MPEP 2106.05(d)(II) indicates that merely "transmitting or receiving data over a network" is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed providing step is well-understood, routine, conventional activity is supported under Berkheimer).
As per claim 7,
2A Prong 1: “and wherein determining the similarity between the particular entity and the other entity comprises determining the similarity between the particular entity and the other entity …” The user, mentally or with pencil and paper, determines the similarity of an entity and another entity.

2A Prong 2: Additional elements: “to cluster the second plurality of entities into one or more groups based on the representations for the second plurality of entities.” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to group entities based on the representations of another group of entities.)

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “to cluster the second plurality of entities into one or more groups based on the representations for the second plurality of entities.” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to group entities based on the representations of another group of entities.)

As per claim 8,
2A Prong 1: “and selecting the particular entity from a larger group of entities based on the one or more filters.” The user, mentally or with pencil and paper, selects or chooses an entity from a group of entities.

2A Prong 2: Additional elements: “receiving, from the user device” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).
“an indication of one or more filters to be applied;” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “receiving, from the user device” (MPEP 2106.05(d)(II) indicates that merely "transmitting or receiving data over a network" is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed receiving steps are well-understood, routine, conventional activity is supported under Berkheimer). “an indication of one or more filters to be applied;” (MPEP 2106.05(d)(II) indicates that merely "transmitting or receiving data over a network" is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed receiving steps are well-understood, routine, conventional activity is supported under Berkheimer).

As per claim 9,
2A Prong 1: “the first label indicating that, for each training pair in the first set of training pairs, the entity data and the distorted entity data in that training pair are similar to one another.” The user, mentally or with pencil and paper, provides labels for the training data, which indicate if the original data and distorted data are similar.

2A Prong 2: Additional elements: “wherein training the machine-learning model using the first set of training pairs comprises:” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of training a machine learning model using a first set of training pairs.)
“training the machine-learning model using the first set of training pairs and a first label” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of training a machine learning model using a first set of training pairs and a first label.)

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “wherein training the machine-learning model using the first set of training pairs comprises:” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of training a machine learning model using a first set of training pairs.) “training the machine-learning model using the first set of training pairs and a first label” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of training a machine learning model using a first set of training pairs and a first label.)

As per claim 10,
2A Prong 1: “the second label indicating that, for each training pair in the second set of training pairs, the entity data for the different entities are dissimilar to one another.” The user, mentally or with pencil and paper, provides a second label for the second training data, which indicates that the entity data for the different entities are dissimilar.
2A Prong 2: Additional elements: “wherein a second set of training pairs and a second label are also used to train the machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of training a machine learning model using a second set of training pairs and labels.) “each training pair in the second set of training pairs comprising entity data including values of the selected features for each of two different entities in the first plurality of entities,” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “wherein a second set of training pairs and a second label are also used to train the machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of training a machine learning model using a second set of training pairs and labels.) “each training pair in the second set of training pairs comprising entity data including values of the selected features for each of two different entities in the first plurality of entities,” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).
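For reference, the training-pair structure recited in claims 9 and 10 — a first set of pairs of entity data and distorted entity data carrying a first "similar" label, and a second set of pairs of entity data for two different entities carrying a second "dissimilar" label — can be sketched as follows. The names, the per-feature standard deviations, and the use of Gaussian noise are hypothetical assumptions of this sketch (Gaussian noise is consistent with, but broader than, what claim 14 recites):

```python
import random

def make_training_pairs(entities, noise_models, rng):
    # entities: {entity_id: {feature: value}}; noise_models: {feature: std dev}.
    ids = list(entities)
    first_set, second_set = [], []
    for eid in ids:
        data = entities[eid]
        distorted = dict(data)
        feature = rng.choice(sorted(data))  # distort at least one selected feature
        distorted[feature] = data[feature] + rng.gauss(0.0, noise_models[feature])
        first_set.append((data, distorted, 1))  # first label: pair is similar
    for a, b in zip(ids, ids[1:]):
        # second label: data for two different entities, marked dissimilar
        second_set.append((entities[a], entities[b], 0))
    return first_set, second_set

rng = random.Random(0)
entities = {"e1": {"f1": 2.0, "f2": 5.0}, "e2": {"f1": 3.0, "f2": 1.0}}
first, second = make_training_pairs(entities, {"f1": 0.5, "f2": 0.5}, rng)
```

The sketch only illustrates how the two labeled pair sets are assembled; the subsequent training step is the high level "apply it" activity addressed under MPEP 2106.05(f) above.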
As per claim 11,
2A Prong 1: “wherein obtaining, for each feature in a set of features, the respective noise model comprises obtaining the noise model for a particular feature by performing operations including:” The user, mentally or with pencil and paper, creates a noise model for each feature. “obtaining values of the particular feature for a second plurality of entities;” The user, mentally or with pencil and paper, retrieves values of features. “and determining the noise model for the particular feature based on the values of the particular feature for the second plurality of entities.” The user, mentally or with pencil and paper, uses data values from features to determine a noise model.

2A Prong 2: The claim does not recite any additional elements beyond the judicial exception.

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.

As per claim 12,
2A Prong 1: “wherein the noise model for the particular feature in the set of features is based on a distribution of the values of the particular feature for the second plurality of entities.” The user, mentally or with pencil and paper, uses the distribution of feature value data to create a noise model.

2A Prong 2: The claim does not recite any additional elements beyond the judicial exception.

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.

As per claim 13,
2A Prong 1: “wherein determining the noise model for the particular feature comprises: encoding, for each entity in the second plurality of entities the value of the particular feature to obtain respective numeric data for that entity;” The user, mentally or with pencil and paper, converts categorical or non-numeric data of features into numeric data to determine a noise model.
“and determining the noise model for the particular feature based on a distribution of the numeric data for the second plurality of entities.” The user, mentally or with pencil and paper, uses the distribution of numeric data to create a noise model.

2A Prong 2: The claim does not recite any additional elements beyond the judicial exception.

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.

As per claim 14,
2A Prong 1: “wherein the noise model for the particular feature is based on a Gaussian distribution of the numeric data for the second plurality of entities.” The user, mentally or with pencil and paper, uses a Gaussian distribution of the numeric data to create a noise model.

2A Prong 2: The claim does not recite any additional elements beyond the judicial exception.

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.

As per claim 15,
2A Prong 1: A judicial exception is not recited in the claim, as it does not recite an abstract idea (mathematical concepts, certain methods of organizing human activity, or mental processes), a law of nature, or a natural phenomenon.

2A Prong 2: Additional elements: “The computer-implemented method of claim 13, wherein the particular feature comprises a category” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “The computer-implemented method of claim 13, wherein the particular feature comprises a category” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).

As per claim 16,
2A Prong 1: “obtaining, for each feature in a set of features, a respective noise model” The user, mentally or with pencil and paper, creates a noise model for each feature.
“a selection of features from the set of features;” The user, mentally or with pencil and paper, selects features from a set of features. “for each entity of a first plurality of entities used for training, obtaining a value for each of the selected features;” The user, mentally or with pencil and paper, retrieves a data value for every selected feature. “using a first set of training pairs, each training pair in the first set of training pairs comprising respective entity data paired with corresponding distorted entity data,” The user, mentally or with pencil and paper, applies the training pairs of original data and distorted data as input to the machine learning model. “wherein: the respective entity data comprises values of the selected features for a respective entity in the first plurality of entities, and the corresponding distorted entity data is obtained by distorting a value of at least one of the selected features for the respective entity based on the respective noise model for the at least one of the selected features;” The user, mentally or with pencil and paper, manipulates data using the developed noise model. “values of the selected features for a particular entity to obtain a representation of the values of the selected features for that entity;” The user, mentally or with pencil and paper, creates a representation of the data values from the selected features. “and determining a similarity between the particular entity and another entity based on the representation for the particular entity.” The user, mentally or with pencil and paper, determines a similarity of an entity and another entity using the data representations.

2A Prong 2: Additional elements: “a network interface”, “and a processor to:” (mere instructions to apply an exception using a generic computer component). “receiving, from a user device” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).
“training a machine-learning model” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of training a machine learning model using a first set of training pairs of the original data and distorted data.) “and inputting, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to obtain a representation of values of the selected features.)

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “a network interface”, “and a processor to:” (mere instructions to apply an exception using a generic computer component). “receiving, from a user device” (MPEP 2106.05(d)(II) indicates that merely "transmitting or receiving data" is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed receiving steps are well-understood, routine, conventional activity is supported under Berkheimer). “training a machine-learning model” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of training a machine learning model using a first set of training pairs of the original data and distorted data.)
“and inputting, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to obtain a representation of values of the selected features.)

As per claim 17,
2A Prong 1: “The system of claim 16, wherein the first plurality of entities includes at least one of the particular entity or the other entity.” Examiner’s note: Rejected under 101 as this limitation further elaborates on the mental process recited in claim 16, “for each entity of a first plurality of entities used for training, obtaining a value for each of the selected features;”. The limitation in claim 17 denotes that the particular entity or the other entity is included in the first plurality of entities when obtaining a value for each of the selected features for each entity of a first plurality of entities used for training.

2A Prong 2: The claim does not recite any additional elements beyond the judicial exception.

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.

As per claim 18,
2A Prong 1: “values of the selected features for the other entity to obtain a representation of the values of the selected features for the other entity,” The user, mentally or with pencil and paper, creates a representation of the data values from the selected features. “wherein the similarity between the particular entity and the other entity is determined based on the representation for the particular entity and the representation for the other entity.” The user, mentally or with pencil and paper, determines a similarity of an entity and another entity using the data representations.
2A Prong 2: Additional elements: “input, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to obtain a representation of values of the selected features.)

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “input, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high level application of a previously trained model to obtain a representation of values of the selected features.)

As per claim 19,
2A Prong 1: “obtaining values of the particular feature for a second plurality of entities;” The user, mentally or with pencil and paper, retrieves values of features. “and determining the noise model for the particular feature based on the values of the particular feature for the second plurality of entities.” The user, mentally or with pencil and paper, uses data values from features to determine a noise model.

2A Prong 2: The claim does not recite any additional elements beyond the judicial exception.

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.

As per claim 20,
2A Prong 1: “obtaining, for each feature in a set of features, a respective noise model” The user, mentally or with pencil and paper, creates a noise model for each feature.
“a selection of features from the set of features;” The user, mentally or with pencil and paper, selects features from a set of features. “for each entity of a first plurality of entities used for training, obtaining a value for each of the selected features;” The user, mentally or with pencil and paper, retrieves a data value for every selected feature. “using a first set of training pairs, each training pair in the first set of training pairs comprising respective entity data paired with corresponding distorted entity data,” The user, mentally or with pencil and paper, applies the training pairs of original data and distorted data as input to the machine learning model. “wherein: the respective entity data comprises values of the selected features for a respective entity in the first plurality of entities, and the corresponding distorted entity data is obtained by distorting a value of at least one of the selected features for the respective entity based on the respective noise model for the at least one of the selected features;” The user, mentally or with pencil and paper, manipulates data using the developed noise model. “values of the selected features for a particular entity to obtain a representation of the values of the selected features for that entity;” The user, mentally or with pencil and paper, creates a representation of the data values from the selected features. “and determining a similarity between the particular entity and another entity based on the representation for the particular entity.” The user, mentally or with pencil and paper, determines a similarity of an entity and another entity using the data representations. 2A Prong 2: Additional elements: “A non-transitory computer readable medium” (mere instructions to apply an exception using a generic computer component). “having stored thereon computer-executable instructions that,” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)).
“when executed by a computer, cause the computer to perform operations comprising:” (mere instructions to apply an exception using a generic computer component). “receiving, from a user device” (Adding insignificant extra-solution activity to the judicial exception – see MPEP 2106.05(g)). “training a machine-learning model” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high-level application of training a machine learning model using a first set of training pairs of the original data and distorted data.) “and inputting, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high-level application of a previously trained model to obtain a representation of values of the selected features.) 2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional elements: “A non-transitory computer readable medium” (mere instructions to apply an exception using a generic computer component). “having stored thereon computer-executable instructions that,” (MPEP 2106.05(d)(II) indicates that merely “storing and retrieving information in memory” is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed storing step is well-understood, routine, conventional activity is supported under Berkheimer). “when executed by a computer, cause the computer to perform operations comprising:” (mere instructions to apply an exception using a generic computer component).
“receiving, from a user device” (MPEP 2106.05(d)(II) indicates that merely “transmitting or receiving data” is a well-understood, routine, conventional function when it is claimed in a merely generic manner (as it is in the present claim). Thereby, a conclusion that the claimed receiving steps are well-understood, routine, conventional activity is supported under Berkheimer). “training a machine-learning model” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high-level application of training a machine learning model using training pairs of the original data and distorted data.) “and inputting, to the trained machine-learning model,” (Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea – see MPEP 2106.05(f) – Examiner’s note: high-level application of a previously trained model to obtain a representation of values of the selected features.) Claim Rejections - 35 USC § 102 In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention. Claims 1-6, 8-12, and 14-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Gietema (U.S. Patent Application Publication No. US20220351532A1). As per claim 1, A computer-implemented method comprising: obtaining, for each feature in a set of features, a respective noise model; ([Paragraph 0023] – “In some embodiments, the noise model may be determined separately for printed features, spacings between printed features and positions of printed features”, Gietema discloses a process of creating noise models using feature value data.) receiving, from a user device, a selection of features from the set of features; ([Paragraph 0015], Figure 6 – “The next step in the disclosed method is selecting at least one of a plurality of printed features, plurality of spacings between the plurality of printed features in the first digital images and plurality of positions of a plurality of printed features in the second digital images… this selection is done by a human and only sharp images are selected at this stage”, The flow chart teaches a user-input network interface that allows for selecting a plurality of features.)
for each entity of a first plurality of entities used for training, obtaining a value for each of the selected features; ([Paragraph 0008] – “The printed features can be alphanumeric characters, holograms, watermark, flags, a particular field in a document such as a name or an address field or a signature strip, or any other features that are printed on official documents such as driving licenses and passports. In essence, using the methods of the present disclosure, any printed feature, its position in a document, and its spacing relative to other printed features can be annotated”, This denotes the values that are taken from the features of the documents in Gietema, which describes the process of obtaining a value recited within the limitation.) training a machine-learning model using a first set of training pairs, each training pair in the first set of training pairs comprising respective entity data paired with corresponding distorted entity data, wherein: ([Paragraph 0073] – “The modified and original reference landmark locations are then used to generate input for the machine learning model and the machine learning model is trained using the input data. The training of the machine learning model can be a supervised training”, Gietema discloses training a machine learning model using pairs of original and modified (distorted) data as inputs to train the machine learning model.
) the respective entity data comprises values of the selected features for a respective entity in the first plurality of entities, ([Paragraph 0007] – “The first stage of training the machine learning model is annotating a plurality of printed features and/or a plurality of spacings between the plurality of printed features in digital images of a first set of genuine documents of one or more classes of documents and/or a plurality of positions of a plurality of printed features in digital images of a second set of genuine documents to obtain “original reference landmark locations” for these printed features, spacings and positions”, This denotes the data values of selected features that correspond to the respective set of entities, where Gietema annotates the features of a first set of documents.) and the corresponding distorted entity data is obtained by distorting a value of at least one of the selected features for the respective entity based on the respective noise model for the at least one of the selected features; ([Paragraph 0060] – “With reference to FIG. 1A, at step 105, the annotated transformed printed features, spacings and/or positions are further “augmented” by combining them with a noise model. Here, combining with a noise model can, for example, be imitating at least one of geometric transformations and radiometric transformations of the respective annotated transformed printed features, annotated transformed spacings and/or annotated transformed positions. In this way, for each transformed printed feature, spacing and/or position, yet more “genuine” printed features, spacings and/or positions are generated. The geometric transformations can, for example, include imitating perspective transformations or image distortions, while radiometric transformations may include imitating reflected or emitted radiation measured by a sensor during image capture, which leads to distortions on a document due to document capturing conditions.
As a result, not only is the number of training data increased, the geometric and radiometric transformations can represent noise in genuine documents, which means that the machine learning model is ultimately trained to account for certain types of noise in genuine documents”, This denotes the process of using the noise model to distort the data value of the selected features to obtain the manipulated training data as corresponding distorted data for an entity.) and inputting, to the trained machine-learning model, values of the selected features for a particular entity to obtain a representation of the values of the selected features for that entity; ([Paragraph 0030] – “Input data corresponding to the document for the trained machine learning model is generated using at least one of a plurality of printed features, a plurality of spacings between the plurality of printed features, and a plurality of positions of the plurality of printed features in this digital image. Once these are inputted to the trained machine learning model, the output of the model is a plurality of annotations on printed features, spacings between the printed features, and/or positions of printed features on the digital image. These annotations indicate respective landmark locations on the respective annotated printed feature, spacing and/or position”, This denotes inputting the values of the selected features to the trained machine learning model to generate representations of the selected features for the corresponding entity.) and determining a similarity between the particular entity and another entity based on the representation for the particular entity.
([Paragraph 0032] – “In order to determine whether or not the document is fraudulent, a geometric score is calculated for each landmark location on the annotated printed feature, spacing and/or position in the digital image of the document based on the distances between the landmark locations on the printed feature, spacing and/or location and the respective original reference landmark locations. A first threshold is defined for each geometric score, above which the respective printed feature, spacing and/or position is identified as “fake.” Thus, if a geometric score is above the associated first threshold, then the document is determined to be fraudulent. In this way, not only are fake features, wrong spacings and mispositions detected, the misorientations of the printed features are also detected since a misorientation leads to annotations on features, spacings and/or positions to be outside the associated thresholds”, This denotes the step of the process of using the obtained feature-data representations of the entities for comparison.) As per claim 2, The computer-implemented method of claim 1, wherein the first plurality of entities includes at least one of the particular entity or the other entity. ([Paragraph 0014] – “The first step in the disclosed method is obtaining digital images of the first and second sets of genuine documents. The first set of genuine documents may comprise one or more documents which may or may not be of the class of documents to which the document to be authenticated belongs, whereas the second set of genuine documents are of the same class to which the document to be authenticated belongs.”, Gietema discloses a grouping and the images retrieved from the first and second sets of documents.)
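Examiner’s note (for illustration only): the claimed sequence of distorting a feature value with a per-feature noise model to form a training pair, and comparing entity representations for similarity, may be sketched as follows. All function names and the choice of a Gaussian noise model are hypothetical and are not drawn from the claims or from Gietema; the sketch merely illustrates the kind of operations the claim language recites.

```python
import math
import random

def fit_noise_models(values_per_feature):
    # Hypothetical noise model per feature: zero-mean Gaussian whose
    # scale is estimated from observed values of that feature.
    return {f: (sum(v) / len(v), _std(v)) for f, v in values_per_feature.items()}

def _std(v):
    mean = sum(v) / len(v)
    return math.sqrt(sum((x - mean) ** 2 for x in v) / len(v))

def make_training_pair(entity, noise_models, rng):
    # Distort at least one selected feature using its noise model,
    # yielding (original entity data, distorted entity data).
    feature = rng.choice(sorted(entity))
    _, scale = noise_models[feature]
    distorted = dict(entity)
    distorted[feature] = entity[feature] + rng.gauss(0.0, scale)
    return entity, distorted

def cosine_similarity(a, b):
    # Compare two entity representations (plain feature vectors here).
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)
```

For example, `make_training_pair({"width": 1.0, "height": 2.0}, models, rng)` returns the original feature values paired with a copy in which one feature value has been perturbed by its noise model, the kind of original/distorted pair the claim recites for training.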
As per claim 3, The computer-implemented method of claim 1, further comprising: inputting, to the trained machine-learning model, values of the selected features for the other entity to obtain a representation of the values of the selected features for the other entity, ([Paragraph 0030] – “Input data corresponding to the document for the trained machine learning model is generated using at least one of a plurality of printed features, a plurality of spacings between the plurality of printed features, and a plurality of positions of the plurality of printed features in this digital image. Once these are inputted to the trained machine learning model, the output of the model is a plurality of annotations on printed features, spacings between the printed features, and/or positions of printed features on the digital image. These annotations indicate respective landmark locations on the respective annotated printed feature, spacing and/or position.”, This teaches the process of using the trained model with values of selected features to obtain representations of those feature values.) wherein the similarity between the particular entity and the other entity is determined based on the representation for the particular entity and the representation for the other entity. ([Paragraph 0032] – “In order to determine whether or not the document is fraudulent, a geometric score is calculated for each landmark location on the annotated printed feature, spacing and/or position in the digital image of the document based on the distances between the landmark locations on the printed feature, spacing and/or location and the respective original reference landmark locations. A first threshold is defined for each geometric score, above which the respective printed feature, spacing and/or position is identified as “fake.” Thus, if a geometric score is above the associated first threshold, then the document is determined to be fraudulent.
In this way, not only are fake features, wrong spacings and mispositions detected, the misorientations of the printed features are also detected since a misorientation leads to annotations on features, spacings and/or posi