DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed 11/28/2025 have been fully considered but they are not persuasive.
Regarding claim 1, Applicant alleges that Liu and/or McCourt does not disclose the amended limitation “wherein the at least one binary classifier includes two or more binary classifiers, and wherein the preset condition includes at least two binary classifiers among the two or more binary classifiers outputting True or a positive result, or the preset condition includes all of the two or more binary classifiers outputting False or a negative result.” After further reviewing the McCourt reference, the Examiner respectfully disagrees. McCourt discloses (¶0017) that the performance score determiner automatically determines a meaningful performance score for trained binary signal classifiers, and (¶0051-¶0056) that the process is repeated for different classifiers, where the system determines whether the outputs of the classifiers are true/positive or false/negative outputs.
Furthermore, Applicant alleges that McCourt fails to disclose the claimed feature of determining whether to execute the multi-classifier based on the True/False results of the binary classifiers. The Examiner respectfully disagrees and notes that the claim requires inputting the binary classifiers’ data to a multi-classifier; it does not recite anywhere that the use or execution of the multi-classifier depends on the results of the binary classifiers. Applicant’s argument is therefore moot. See the updated rejection below.
In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., controlling the execution of the multi-classifier based on the preset condition, performing during real-time inference) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).
Regarding amendment to claim 9, see the new rejection made below.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4, 7-8, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over US Patent 7,809,723 to Liu (“Liu”) in view of US PG Pub 2020/0019883 to McCourt (“McCourt”).
Regarding claim 1, “A method for performing classification” reads on the method/system for providing distributed training of a hierarchical classifier for classifying documents using a classification hierarchy (abstract) disclosed by Liu and represented in Fig. 8.
As to “the method comprising: inputting data to at least one binary classifier configured to perform binary classification to generate a binary classification result” Liu discloses (7:56-8:7; claim 1) that a computing device with a processor trains a hierarchical classifier for classification of documents into a classification hierarchy by providing training data for training the classifiers; (4:64-5:5; 7:5-22; claim 3) a binary classifier for a classification classifies documents with a certain confidence.
As to “in response to the binary classification result output by the at least one binary classifier satisfying a preset condition, inputting the data to a multi-classifier configured to perform multi-classification for generating a multi-classification result and outputting a classification result based on the multi-classification result output generated by the multi-classifier” Liu discloses (9:16-35; 6:36-7:2) that the training system trains a final classifier/classifier component (multi-classifier) using all the training data, classified by the binary classifiers, to give the final classifier as represented in Fig. 8.
Liu meets all the limitations of the claim except “wherein the at least one binary classifier includes two or more binary classifiers, and wherein the preset condition includes at least two binary classifiers among the two or more binary classifiers outputting True or a positive result, or the preset condition includes all of the two or more binary classifiers outputting False or a negative result.” However, McCourt discloses (¶0017) that the performance score determiner automatically determines a meaningful performance score for trained binary signal classifiers, and (¶0051-¶0056) that the process is repeated for different classifiers, where the system determines whether the outputs of the classifiers are true/positive or false/negative outputs, as represented in Fig. 6. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Liu’s system by outputting true/positive or false/negative results from the classifiers, as taught by McCourt, in order to determine the accuracy of the predictions of such a trained binary signal classifier operating on actual input data and to reflect the same accuracy score as an accuracy determined by comparing the predicted outputs to observed outputs (McCourt - ¶0005).
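For illustration of the claimed preset condition only (this is not code from Liu or McCourt and is not part of the record), the condition can be sketched in Python. The classifier callables, the `preset_condition_met` helper, and the single-positive fallback are all hypothetical:

```python
def preset_condition_met(binary_outputs):
    """Claimed condition: at least two classifiers output True/positive,
    or all classifiers output False/negative."""
    positives = sum(1 for out in binary_outputs if out)
    return positives >= 2 or positives == 0

def classify(data, binary_classifiers, multi_classifier):
    # Run every binary classifier on the same input data.
    outputs = [clf(data) for clf in binary_classifiers]
    if preset_condition_met(outputs):
        # Ambiguous (two or more positives) or empty (all negatives)
        # binary result: input the data to the multi-classifier.
        return multi_classifier(data)
    # Exactly one positive: that classifier's class index is the result.
    return outputs.index(True)
```

Under this reading, the multi-classifier resolves only the cases the binary stage cannot decide on its own.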
Regarding claim 2, “The method of claim 1, wherein a number of the at least one binary classifier corresponds to a number of classes to be classified for the data” Liu discloses (6:36-7:4) that the training system sets N to the number of training documents so that each part has at least one document within the classification; the training system trains N classifiers for each classification, holding out each part once for each classifier.
Regarding claim 3, “The method of claim 2, wherein the number of classes to be classified is n, the number of the at least one binary classifier is n, and a number of the multi-classifier is at least one, and wherein n is a natural number” Liu discloses (6:36-7:4) that the training system sets N to the number of training documents so that each part has at least one document within the classification; the training system trains N classifiers for each classification, holding out each part once for each classifier; the training system trains a final classifier using all the training data (e.g., N parts) to give the final classifier.
Regarding claim 4, “The method of claim 1, further comprising: determining whether to perform the multi-classification on the data with the multi-classifier based on the binary classification result output from the at least one binary classifier” Liu discloses (7:1-4) that the training system then trains a final classifier using all the training data (i.e., N parts) to give the final classifier; the training system averages the N classifiers previously trained to give the final classifier.
Regarding claim 7, “The method of claim 1, wherein the preset condition includes the at least one binary classifier only outputting False or only outputting a negative result” McCourt discloses (¶0022, ¶0053, claim 1) that the trained binary signal classifier may be a binary signal classifier that has been trained with training data to classify only one input signal including a true/false signal.
Regarding claim 8, “The method of claim 1, wherein the at least one binary classifier includes two or more binary classifiers, and wherein the two or more binary classifiers perform binary classification on different classes” Liu discloses (6:36-7:4) that the training system divides the training data for that classification into multiple parts or N parts. The training system trains the classifier for that classification using the training data of all but one part or N-1 parts; the training system may train N classifiers for each classification holding out each part once for each classifier.
Regarding claim 15, “The method of claim 1, wherein the at least one binary classifier and the multi-classifier are implemented in a processor or individual hardware components” Liu discloses (claim 10) a plurality of agents implemented as instructions stored in the memory for execution by the processor, such that the select-features-for-classifier component, for each classification of the classification hierarchy, identifies features of the documents of the training data that are to be used for training a classifier for that classification.
Claims 5, 16, and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Liu in view of McCourt, and further in view of US Patent 7,020,337 to Viola (“Viola”).
Regarding claim 5, the combination of Liu and McCourt meets all the limitations of the claim except “The method of claim 4, further comprising: in response to the binary classification result output from the at least one binary classifier not satisfying a preset condition, preventing the multi-classifier from performing multi-classification on the data and outputting a final classification result based on the binary classification result output from the at least one binary classifier.” However, Viola discloses (3:64-67; 5:55-6:19) that the system processes a set of data using the classifiers to produce acceptance thresholds (preset condition) for the classifiers; the acceptance threshold that achieves the performance goals is selected as the Nth threshold, where all examples (binary result) that fall below the acceptance threshold are rejected; the acceptance threshold is the lowest value such that it accepts only those negative examples which are accepted by the final threshold; in essence, classification is terminated and a final decision is produced when the example’s score is less than the acceptance threshold, as represented in Fig. 3. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Liu’s and McCourt’s systems by outputting a final classification result using the binary result and terminating the classification process early, as taught by Viola, in order to improve efficiency by avoiding unnecessary multi-class processing when a binary result is sufficient.
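As a rough sketch of the early-termination behavior attributed to Viola (illustrative only; the score functions, threshold values, and `REJECT` sentinel are hypothetical and are not Viola’s implementation):

```python
REJECT = "reject"  # hypothetical sentinel for a final binary-stage decision

def cascade_classify(score_fns, acceptance_thresholds, data, multi_classifier):
    """If any stage's score falls below its acceptance threshold, terminate
    and output a final result from the binary stage; otherwise proceed to
    the multi-classifier."""
    for score_fn, threshold in zip(score_fns, acceptance_thresholds):
        if score_fn(data) < threshold:
            return REJECT  # multi-classification is prevented
    # All stages passed their thresholds: run the multi-classifier.
    return multi_classifier(data)
```

The design choice mirrors the examiner’s efficiency rationale: the expensive multi-class stage never executes for inputs the binary stage already resolves.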
Regarding claim 16, “A method for controlling an artificial intelligence device to perform classification” reads on the method/system for providing distributed training of a hierarchical classifier for classifying documents using a classification hierarchy (abstract) disclosed by Liu and represented in Fig. 8.
As to “the method comprising: receiving data by a processor in the artificial intelligence device” Liu discloses (7:56-8:7; claim 1) that a computing device with a processor trains a hierarchical classifier for classification of documents into a classification hierarchy by providing training data for training the classifiers; (2:65; 4:64-66) the training system uses binary classifiers to implement a hierarchical classifier by training the binary classifiers using a linear SVM.
As to “performing, by the processor, binary classification on the data based on at least one binary classifier to generate a binary classification result” Liu discloses (4:64-5:5; 7:5-22; claim 3) that a binary classifier for a classification classifies documents with a certain confidence.
As to “in response to the binary classification result satisfying the preset condition, performing, by the processor, multi-classification on the data based on a multi-classifier to generate a multi-classification result and outputting the final classification result based on the multi-classification result” Liu discloses (9:16-35; 6:36-7:2) that the training system trains a final classifier/classifier component (multi-classifier) using all the training data, classified by the binary classifiers, to give the final classifier as represented in Fig. 8.
Liu meets all the limitations of the claim except “wherein the at least one binary classifier includes two or more binary classifiers, and wherein the preset condition includes at least two binary classifiers among the two or more binary classifiers outputting True or a positive result, or the preset condition includes all of the two or more binary classifiers outputting False or a negative result.” However, McCourt discloses (¶0017) that the performance score determiner automatically determines a meaningful performance score for trained binary signal classifiers, and (¶0051-¶0056) that the process is repeated for different classifiers, where the system determines whether the outputs of the classifiers are true/positive or false/negative outputs, as represented in Fig. 6. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Liu’s system by outputting true/positive or false/negative results from the classifiers, as taught by McCourt, in order to determine the accuracy of the predictions of such a trained binary signal classifier operating on actual input data and to reflect the same accuracy score as an accuracy determined by comparing the predicted outputs to observed outputs (McCourt - ¶0005).
The combination of Liu and McCourt meets all the limitations of the claim except “in response to the binary classification result not satisfying a preset condition, outputting, by the processor, a final classification result based on the binary classification result.” However, Viola discloses (3:64-67; 5:55-6:19) that the system processes a set of data using the classifiers to produce acceptance thresholds (preset condition) for the classifiers; the acceptance threshold that achieves the performance goals is selected as the Nth threshold, where all examples (binary result) that fall below the acceptance threshold are rejected; the acceptance threshold is the lowest value such that it accepts only those negative examples which are accepted by the final threshold; in essence, classification is terminated and a final decision is produced when the example’s score is less than the acceptance threshold, as represented in Fig. 3. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Liu’s and McCourt’s systems by outputting a final classification result using the binary result and terminating the classification process early, as taught by Viola, in order to improve efficiency by avoiding unnecessary multi-class processing when a binary result is sufficient.
Regarding claim 18, see rejection similar to claim 7.
Regarding claim 19, see rejection similar to claim 16.
Claims 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Liu in view of McCourt, and further in view of US PG Pub 2020/0160196 to Ramakrishnan (“Ramakrishnan”).
Regarding claim 9, “The method of claim 8, further comprising: varying a number of binary classifiers among the two or more classifiers that perform binary classification on the data based on an execution order of the two or more binary classifiers” Liu discloses (6:36-7:4) that the training system divides the training data for that classification into multiple parts or N parts. The training system trains the classifier for that classification using the training data of all but one part or N-1 parts; the training system may train N classifiers for each classification holding out each part once for each classifier.
The combination of Liu and McCourt meets all the limitations of the claim except “in response to a binary classification result output by at least one of the binary classifiers satisfying a preset condition, terminating remaining binary classifiers without performing further classification.” However, Ramakrishnan discloses (¶0048-¶0052) that the binary classifier identifies content and only passes it to second-step classifiers if it determines the content to be a specific type of content; otherwise, the content is not sent to any classifiers for further determination, as represented in Fig. 1. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Liu’s and McCourt’s systems by terminating the remaining binary classifiers without performing further classification when the first binary classifier satisfies the condition, as taught by Ramakrishnan, in order to verify the factual accuracy of information by determining whether the content needs further fact checking (Ramakrishnan - ¶0001).
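The early-exit behavior recited in claim 9 can be sketched as follows (illustrative only; this is not Ramakrishnan’s implementation, and the `condition` predicate and classifier callables are hypothetical):

```python
def run_until_satisfied(classifiers, data, condition):
    """Execute binary classifiers in order; once one output satisfies the
    preset condition, terminate the remaining classifiers."""
    results = []
    for clf in classifiers:
        out = clf(data)
        results.append(out)
        if condition(out):
            break  # remaining classifiers are never executed
    return results
```

Because execution stops at the first satisfying output, the number of classifiers actually run varies with their execution order, which is the behavior claim 10 builds on.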
Regarding claim 10, “The method of claim 9, further comprising: executing a first number of binary classifiers among the two or more binary classifiers to perform binary classification on the data when the two or more binary classifiers are executed in a first order; and executing a second number of binary classifiers among the two or more binary classifiers to perform binary classification on the data when the two or more binary classifiers are executed in a second order different from the first order, the first number being different than the second number” Liu discloses (5:40-6:5) that the training system assigns classifiers to agents based on the anticipated complexity or load of training of each classifier; the training system then assigns classifiers to agents based on the complexity of the classifiers already assigned to the agents. For example, if there are 10 agents, the training system may first assign to each agent one of the classifiers with the 10 highest complexities. The training system then assigns classifiers to agents in order of complexity of the classifiers. The training system repeatedly assigns the unassigned classifier with the highest complexity to the agent whose total complexity of assigned classifiers is the lowest until all the classifiers are assigned.
Claims 11-14 are rejected under 35 U.S.C. 103 as being unpatentable over Liu in view of McCourt, and further in view of US PG Pub 2023/0177441 to Durvasula (“Durvasula”).
Regarding claim 11, the combination of Liu and McCourt meets all the limitations of the claim except “The method of claim 1, further comprising: before outputting the binary classification result using the at least one binary classifier, preprocessing the data to generate preprocessed data and replicating the preprocessed data to generate replicated preprocessed data.” However, Durvasula discloses (¶0039, ¶0044, ¶0055) that the database stores a training data set which is used to train ML/AI models, where (¶0049) machine learning involves identifying and recognizing patterns in existing data in order to facilitate classifications and identifications for subsequent data; (¶0045) training data may be divided into training, validation, and testing data; for example, 20% of the training data set may be held back for later validation and/or testing, 80% of the training data set may be used for training, and the training data set may be shuffled before being so divided; and (¶0101) the system analyzes the one or more experimental ML models of the user and/or related data sets, where the implementation module includes additional instructions that cause the computer to replicate the one or more experimental ML models and model dependencies in a computing environment. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Liu’s and McCourt’s systems by preprocessing the data and generating replicated preprocessed data, as taught by Durvasula, in order to facilitate making predictions, classifications, and/or identifications for subsequent data (such as using the models to determine or generate a classification or prediction for, or associated with, applying a data governance engine to train a descriptive analytics model) (Durvasula - ¶0049).
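A minimal sketch of the claimed preprocess-then-replicate sequence (illustrative only; the max-normalization step and the classifier predicates are hypothetical and drawn from no cited reference):

```python
import copy

def preprocess(data):
    # Hypothetical preprocessing: scale values into the [0, 1] range.
    peak = max(data)
    return [x / peak for x in data]

def replicate(preprocessed, n):
    # One independent copy per binary classifier, so each copy can be
    # consumed in parallel without sharing state.
    return [copy.deepcopy(preprocessed) for _ in range(n)]

binary_classifiers = [lambda d: sum(d) > 1.0, lambda d: len(d) > 2]
prepped = preprocess([2.0, 4.0, 8.0])
copies = replicate(prepped, len(binary_classifiers))
results = [clf(c) for clf, c in zip(binary_classifiers, copies)]
```

Handing each classifier its own copy is one way to read the parallel application recited in claim 12.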
Regarding claim 12, “The method of claim 11, wherein the at least one binary classifier includes two or more binary classifiers, and wherein the method further comprises applying the replicated preprocessed data to each of the two or more binary classifiers in parallel” Liu discloses (7:5-22) that the controller assigns the training of classifiers to the agents, where the classifier store contains the classifiers trained by the agents along with their confidence thresholds and represents the hierarchical classifier, and Durvasula discloses (¶0101) that the system analyzes the one or more experimental ML models of the user and/or related data sets, where the implementation module includes additional instructions that cause the computer to replicate the one or more experimental ML models and model dependencies in a computing environment.
Regarding claim 13, “The method of claim 11, further comprising classifying the data for each class to solve a classification problem by using the replicated preprocessed data to generate classified data” Liu discloses (7:56-8:7; claim 1) that a computing device with a processor trains a hierarchical classifier for classification of documents into a classification hierarchy by providing training data for training the classifiers; (4:64-5:5; 7:5-22; claim 3) a binary classifier for a classification classifies documents with a certain confidence.
As to “performing machine learning of the at least one binary classifier for each class using the classified data, wherein the machine learning of the at least one binary classifier includes adjusting a threshold or a condition of one or more of the at least one binary classifier” Durvasula discloses (¶0027) that the machine learning is directed to classification, mapping, and recommendation; (¶0094) the system compares the performance of the ML models and adjusts the mapping, or adds additional models for scoring and ranking; (¶0045) the weights may be initialized to random values and then adjusted as the network is successively trained, by using one or more gradient descent algorithms.
Regarding claim 14, “The method of claim 11, further comprising performing machine learning of the multi-classifier using the replicated data, wherein the machine learning of the multi-classifier includes adjusting a probability or a condition of the multi-classifier” Durvasula discloses (¶0027) that the machine learning is directed to classification, mapping, and recommendation; (¶0094) the system compares the performance of the ML models and adjusts the mapping, or adds additional models for scoring and ranking; (¶0045) the weights may be initialized to random values and then adjusted as the network is successively trained, by using one or more gradient descent algorithms.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PINKAL R CHOKSHI whose telephone number is (571)270-3317. The examiner can normally be reached Monday - Friday, 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, BRIAN T PENDLETON can be reached at (571)272-7527. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PINKAL R CHOKSHI/Primary Examiner, Art Unit 2425