Prosecution Insights
Last updated: April 19, 2026
Application No. 18/077,173

METHOD FOR TREE-BASED MACHINE LEARNING MODEL REDUCTION AND ELECTRONIC DEVICE USING THE SAME

Final Rejection — §103, §112
Filed: Dec 07, 2022
Examiner: BRAHMACHARI, MANDRITA
Art Unit: 2144
Tech Center: 2100 — Computer Architecture & Software
Assignee: Industrial Technology Research Institute
OA Round: 2 (Final)
Grant Probability: 76% (Favorable)
Expected OA Rounds: 3-4
Median Time to Grant: 3y 0m
Grant Probability With Interview: 99%

Examiner Intelligence

Grants 76% — above average

Career Allow Rate: 76% (311 granted / 407 resolved; +21.4% vs TC avg)
Interview Lift: +29.8% among resolved cases with interview (a strong, roughly +30% lift)
Typical Timeline: 3y 0m average prosecution; 27 applications currently pending
Career History: 434 total applications across all art units

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§102: 7.8% (-32.2% vs TC avg)
§103: 54.5% (+14.5% vs TC avg)
§112: 17.9% (-22.1% vs TC avg)

Compared against Tech Center average estimates • Based on career data from 407 resolved cases

Office Action

Rejections: §103, §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

This action is in response to the claims dated 11/18/2025. Claims pending in the case: 1-24.

IDS notes from the previous office action: the NPL documents submitted on 7/10/2023 and 11/7/2023 have not been considered because English translations of the documents have not been submitted.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-24 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for pre-AIA the inventor(s), at the time the application was filed, had possession of the claimed invention.

Claim 1 recites "extracting at least one continuous tree subset from the subtrees according to the subtree importance of each of the subtrees and the ordinal number of each of the subtrees". The examiner was unable to find support for this limitation in the current specification. While the specification recites that subtrees may have an ordinal number, there is no mention of using this number for extraction; the specification suggests extracting according to importance only. The applicant is requested to identify the paragraphs and lines in the specification that support this limitation. All claims dependent on this claim are also rejected under 35 U.S.C. 112(a) by virtue of their respective direct and indirect dependencies.
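To make the disputed limitation concrete: below is a minimal, hypothetical sketch of an extraction that uses both subtree importance and each subtree's ordinal number, i.e. a contiguous slice anchored on an important tree. The function name, the anchoring rule, and the window size are all assumptions for illustration; none of this is taken from the specification.

```python
# Hypothetical illustration of the disputed claim 1 limitation: extracting a
# continuous tree subset using both subtree importance and ordinal number.
# The anchoring strategy is an assumption, not the applicant's disclosure.

def extract_continuous_subset(importances, window=3):
    """Return a contiguous run of subtree ordinals centered on the most
    important subtree. importances[i] is the importance of the subtree
    whose ordinal number is i (its position in the boosting sequence)."""
    anchor = max(range(len(importances)), key=lambda i: importances[i])
    start = max(0, anchor - window // 2)
    end = min(len(importances), start + window)
    return list(range(start, end))  # ordinal numbers of the extracted trees

# Subtree 2 is most important, so the slice keeps ordinals 1..3, relying on
# the order relation that boosting imposes on the subtrees.
print(extract_continuous_subset([0.1, 0.4, 0.9, 0.3, 0.2]))  # [1, 2, 3]
```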
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 4-14, and 16-24 are rejected under 35 U.S.C. 103 as being unpatentable over Zhuo (CN 111291896) in view of Doan (CN 113554178) and daSilveira (US 20230342612). Please refer to the attached English translations for claim mappings. daSilveira was not used in the prior office action.

Regarding claim 1, Zhuo teaches a method for tree-based machine learning model reduction, suitable for an electronic device comprising a processor, comprising: obtaining a … tree model comprising a plurality of subtrees … (Zhuo: Pg. 6 [2]: obtain random forest model); determining subtree importance of each of the subtrees according to feature importance information respectively corresponding to a plurality of model features of the … tree model (Zhuo: Pg. 6 [5-6]: obtain subtree performance index); … extracting at least one continuous tree subset from the subtrees according to the subtree importance of each of the subtrees …, wherein the at least one continuous tree subset comprises at least one of the subtrees (Zhuo: Pg. 3 [1], Pg. 6 [4-5]: extract subtree to evaluate performance and remove it); and obtaining at least one reduced … tree model of the boosting tree model according to the at least one continuous tree subset (Zhuo: Pg. 3 [1]: obtain reduced random forest model by de-selecting some of the subtrees).

However, Zhuo does not specifically teach: a boosting tree model; wherein each of the subtrees has an ordinal number based on a specific order relation; wherein the feature importance information respectively corresponding to the model features is generated by training the boosting tree model; extracting … the ordinal number of each of the subtrees.

Doan teaches a boosting tree model (Doan: Pg. 2 [4]: gradient lifting decision tree stacking trees by reducing error in each step (boosting tree model); Pg. 3 [4], Pg. 5 [3-4]: constructing by feature reduction), wherein each of the subtrees has an ordinal number based on a specific order relation (Doan: abstract, Pg. 2 [4]: successively stacking trees (trees have ordinal relation)).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Zhuo and Doan because the arts are related to the same technical content of ensemble learning teaching reduction of tree models, and the combination would enable applying the reduction method taught in Zhuo to a boosting tree model. One of ordinary skill in the art would have been motivated to combine the teachings because the combination would enable using an identified process of subtree screening in other tree-based models.

The examiner also finds that, broadly interpreted, the limitation "extracting … the ordinal number of each of the subtrees" is obvious over the combined teachings of Zhuo and Doan: since Zhuo teaches that portions of the tree may be extracted, and a boosting tree is an ordered sequence, extracting a portion may also include the corresponding information, which may be its ordinal number.

daSilveira further teaches wherein the feature importance information respectively corresponding to the model features is generated by training the boosting tree model (daSilveira: [16, 18, 30, 71]: determine feature importance at training time for dimensionality reduction; [3, 100]: dimensionality reduction based on feature importance).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Zhuo, Doan, and daSilveira because the combination would enable determining feature importance during training time. One of ordinary skill in the art would have been motivated to combine the teachings because the combination would enable using a determined feature importance for an efficient model reduction with reduced computational resources (see daSilveira [3-4]).
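As a reference point for the claim 1 mapping: a boosting model's prediction is its initial estimate plus a learning-rate-scaled sum of the trees' outputs, so a reduced model can be evaluated as a partial sum over a contiguous slice of trees. A minimal sketch using scikit-learn's GradientBoostingRegressor (the attributes estimators_, feature_importances_, init_, and learning_rate exist in scikit-learn; the slicing itself is an assumed strategy, not the applicant's method):

```python
# Sketch: reduce a trained boosting model by keeping a contiguous run of
# trees. Which slice to keep, and why, is an assumption for illustration.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=300, n_features=8, random_state=0)
model = GradientBoostingRegressor(n_estimators=50, random_state=0).fit(X, y)

# Feature importance falls out of training (cf. the daSilveira mapping).
print(model.feature_importances_.round(3))

def predict_with_slice(model, X, start, end):
    """Predict using only the trees whose ordinal numbers lie in [start, end).
    A boosting prediction is init + learning_rate * sum of tree outputs, so
    dropping trees leaves a cheaper, approximate partial sum."""
    pred = model.init_.predict(X).astype(float).ravel()
    for m in range(start, end):
        pred += model.learning_rate * model.estimators_[m, 0].predict(X)
    return pred

# A "reduced boosting tree model" built from the continuous subset [0, 20).
approx = predict_with_slice(model, X, 0, 20)
print(float(np.mean((approx - model.predict(X)) ** 2)))  # approximation gap
```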
Regarding claim 2, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 1 above, and wherein determining the subtree importance of each of the subtrees according to the feature importance information respectively corresponding to the model features of the boosting tree model comprises: obtaining feature importance of each of the model features of the boosting tree model; selecting at least one important model feature from the model features according to the feature importance of each of the model features; and determining the subtree importance of each of the subtrees according to the feature importance information of the at least one important model feature (Zhuo: Pg. 3 [1], Pg. 6 [4-6]: subtree to evaluate performance) (Doan: Pg. 2 [4]: gradient lifting decision tree stacking trees by reducing error in each step (boosting tree model); Pg. 3 [4], Pg. 5 [3-4]: constructing by feature reduction).

Regarding claim 4, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 2 above, and wherein selecting the at least one important model feature from the model features according to the feature importance of each of the model features comprises: performing a statistical operation on the feature importance of each of the model features to obtain a feature importance statistic value; and selecting the at least one important model feature according to the feature importance statistic value, wherein the feature importance of the at least one important model feature is greater than the feature importance statistic value (Zhuo: Pg. 6 [5], Pg. 7 [2, 6], Pg. 8 [2]: evaluate and select based on calculated performance value (expected statistic value)) (Doan: Pg. 5 [4]: decision based on a calculated prediction value compared with a threshold value (statistic value)).
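Claim 4 (and claim 6 below, which repeats the pattern at the subtree level) reduces to: compute a statistic over the importances, then keep items strictly above it. A hedged sketch, assuming the mean as the statistic; the claims only require "a statistical operation", so the choice of mean is illustrative:

```python
import numpy as np

def select_important(importances):
    """Claims 4/6 pattern: derive a statistic value from the importances and
    keep only items whose importance exceeds it. The mean is an assumed
    choice of statistical operation."""
    stat = float(np.mean(importances))
    selected = [i for i, v in enumerate(importances) if v > stat]
    return selected, stat

idx, stat = select_important([0.125, 0.25, 0.5, 0.0, 0.125])
print(stat, idx)  # 0.2 [1, 2] -- items 1 and 2 exceed the mean
```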
Regarding claim 5, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 1 above, and wherein extracting the at least one continuous tree subset from the subtrees according to the subtree importance of each of the subtrees comprises: selecting at least one important subtree from the subtrees according to the subtree importance of each of the subtrees; and obtaining the at least one continuous tree subset by performing slicing selection on the subtrees according to the at least one important subtree (Zhuo: Pg. 3 [1], Pg. 6 [4-5], Pg. 8 [2]: extract subtree to evaluate performance and remove it).

Regarding claim 6, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 5 above, and wherein selecting the at least one important subtree from the subtrees according to the subtree importance of each of the subtrees comprises: performing a statistical operation on the subtree importance of each of the subtrees to obtain a subtree importance statistic value; and selecting the at least one important subtree according to the subtree importance statistic value, wherein the subtree importance of the at least one important subtree is greater than the subtree importance statistic value (Zhuo: Pg. 6 [5], Pg. 7 [2, 6], Pg. 8 [2]: evaluate and select based on calculated performance value (expected statistic value)).

Regarding claim 7, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 6 above, and wherein the subtrees comprise a plurality of first subtrees, the subtree importance of each of the first subtrees is less than the subtree importance statistic value, and selecting the at least one important subtree from the subtrees according to the subtree importance of each of the subtrees comprises: when the first subtrees have the same subtree importance as one another, and a number of subtrees of the first subtrees is greater than or equal to a threshold value, selecting one of the first subtrees as the at least one important subtree (Zhuo: Pg. 6 [5], Pg. 7 [2], Pg. 9 [6]: screening condition based on calculated performance indicator). It would have been obvious to one skilled in the art to select a group of a specific performance level; this limitation is an obvious variation of screening criteria.

Regarding claim 8, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 5 above, and wherein selecting the at least one important subtree from the subtrees according to the subtree importance of each of the subtrees comprises: selecting an initial subtree of the subtrees as the at least one important subtree (Zhuo: Pg. 3 [1], Pg. 6 [4-5], Pg. 7 [2]: select subtree based on performance).

Regarding claim 9, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 5 above, and wherein obtaining the at least one continuous tree subset by performing the slicing selection on the subtrees according to the at least one important subtree comprises: performing the slicing selection on the subtrees by using the at least one important subtree as a beginning subtree or a trailing subtree, wherein the beginning subtree or the trailing subtree of the at least one continuous tree subset is the at least one important subtree (Doan: Pg. 3 [4], Pg. 5 [3-4]: construction and reduction of gradient boosting decision tree by selecting the important subtrees).

Regarding claim 10, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 5 above, and wherein obtaining the at least one continuous tree subset by performing the slicing selection on the subtrees according to the at least one important subtree comprises: obtaining at least one unimportant subtree by excluding the at least one important subtree from the subtrees; and performing the slicing selection on the subtrees by avoiding using the at least one unimportant subtree as a beginning subtree or a trailing subtree, wherein the beginning subtree or the trailing subtree of the at least one continuous tree subset is not the at least one unimportant subtree (Doan: Pg. 3 [4], Pg. 5 [3-4]: construction and reduction of gradient boosting decision tree by selecting and deselecting based on importance) (Zhuo: Pg. 3 [1], Pg. 6 [4-5]: extract subtree to evaluate performance and remove it).
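Claims 9 and 10 constrain where a slice may begin or end: an important subtree must sit at the boundary (claim 9), or equivalently an unimportant subtree must not (claim 10). A hypothetical sketch of enumerating candidate slices under that constraint (the helper and its parameters are invented for illustration):

```python
def candidate_slices(important, n, min_len=2):
    """Enumerate contiguous [start, end) slices over n subtrees whose
    beginning or trailing subtree is important (the claims 9/10 pattern).
    `important` is the set of important subtree ordinal numbers."""
    slices = []
    for start in range(n):
        for end in range(start + min_len, n + 1):
            if start in important or (end - 1) in important:
                slices.append((start, end))
    return slices

# With subtrees 0..4 and only subtree 2 important, every candidate slice
# begins or ends at ordinal 2.
print(candidate_slices({2}, 5))  # [(0, 3), (1, 3), (2, 4), (2, 5)]
```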
Regarding claim 11, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 1 above, and wherein obtaining the at least one reduced boosting tree model of the boosting tree model according to the at least one continuous tree subset comprises: obtaining a model evaluation metric of each of the at least one continuous tree subset; and selecting at least one of the at least one continuous tree subset as the at least one reduced boosting tree model according to the model evaluation metric of each of the at least one continuous tree subset (Zhuo: Pg. 3 [1], Pg. 6 [4-5]: extract subtree based on evaluated performance).

Regarding claim 12, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 1 above, further comprising: displaying the reduced boosting tree model through an operation interface (Zhuo: Pg. 3 [1]: obtain reduced random forest model by de-selecting some of the subtrees using an interactive user interface) (Doan: Pg. 2 [4]: gradient lifting decision tree stacking trees by reducing error in each step (boosting tree model)).

Regarding claims 13-14 and 16-24, these claims are similar in scope to claims 1-2 and 4-12, respectively. Therefore, these claims are rejected under the same rationale.

Claims 3 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Zhuo (CN 111291896) in view of Doan (CN 113554178), daSilveira (US 20230342612), and ZhouY (US 12008484).

Regarding claim 3, Zhuo, Doan, and daSilveira teach the invention as claimed in claim 2 above, and wherein determining the subtree importance of each of the subtrees according to the feature importance information of the at least one important model feature comprises: obtaining a feature usage count of each of the model features used by each of the subtrees; and determining the subtree importance of each of the subtrees according to the feature usage … of each of the subtrees using the at least one important model feature and the feature importance of the at least one important model feature (Zhuo: Pg. 6 [5], Pg. 8 [2-3]: evaluate random forest feature). Zhuo, Doan, and daSilveira do not specifically teach a feature usage count. ZhouY teaches a feature usage count (ZhouY: col. 14, lines 5-43: feature count may be used to evaluate feature importance value). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Zhuo, Doan, daSilveira, and ZhouY because the combination would enable using statistical analysis and feature count to evaluate feature importance. One of ordinary skill in the art would have been motivated to combine the teachings because the combination would improve the process of finding influential features in a dataset by using a technique known in the field of data science.

Regarding claim 15, this claim is similar in scope to claim 3. Therefore, this claim is rejected under the same rationale.
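Claim 3's "feature usage count" weights subtree importance by how often each subtree actually splits on an important feature. The aggregation below is one plausible reading, sketched with invented data structures; it is not the specification's definition:

```python
def subtree_importance(usage_counts, feature_importance, important_features):
    """Claim 3 pattern: score each subtree by summing, over the important
    features, (how often the subtree uses the feature) x (that feature's
    importance). usage_counts[t][f] = splits of subtree t testing feature f."""
    return [
        sum(counts.get(f, 0) * feature_importance[f] for f in important_features)
        for counts in usage_counts
    ]

usage = [{0: 2, 1: 1}, {1: 3}, {0: 1, 2: 4}]   # per-subtree split counts
fi = {0: 0.5, 1: 0.25, 2: 0.125}               # per-feature importance
print(subtree_importance(usage, fi, important_features=[0, 1]))  # [1.25, 0.75, 0.5]
```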
Response to Arguments

Applicants' amendments to the claims have been fully considered and overcome the 35 U.S.C. § 101 rejection. These rejections are respectfully withdrawn.

Applicants' prior art arguments have been fully considered, but since they pertain to the amended sections of the claims, they are considered moot in view of the new grounds of rejection presented above.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure in the attached 892. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MANDRITA BRAHMACHARI, whose telephone number is (571) 272-9735. The examiner can normally be reached Monday to Friday, 11 am to 8 pm EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Tamara Kyle, can be reached at (571) 272-4241. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Mandrita Brahmachari/
Primary Examiner, Art Unit 2144

Prosecution Timeline

Dec 07, 2022
Application Filed
Aug 28, 2025
Non-Final Rejection — §103, §112
Nov 18, 2025
Response Filed
Jan 22, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596746
AUDIO PREVIEWING METHOD, APPARATUS AND STORAGE MEDIUM
Granted Apr 07, 2026 • 2y 5m to grant
Patent 12596469
COMBINED DATA DISPLAY WITH HISTORIC DATA ANALYSIS
Granted Apr 07, 2026 • 2y 5m to grant
Patent 12591358
DAMAGE DETECTION PORTAL
Granted Mar 31, 2026 • 2y 5m to grant
Patent 12585979
MANAGING DATA DRIFT AND OUTLIERS FOR MACHINE LEARNING MODELS TRAINED FOR IMAGE CLASSIFICATION
Granted Mar 24, 2026 • 2y 5m to grant
Patent 12585992
MACHINE LEARNING WITH ATTRIBUTE FEEDBACK BASED ON EXPRESS INDICATORS
Granted Mar 24, 2026 • 2y 5m to grant
Study what changed in these cases to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 76%
With Interview: 99% (+29.8%)
Median Time to Grant: 3y 0m
PTA Risk: Moderate
Based on 407 resolved cases by this examiner. Grant probability derived from career allow rate.
