Prosecution Insights
Last updated: April 19, 2026
Application No. 18/117,552

LEARNING SYSTEM AND LEARNING METHOD

Non-Final OA §101
Filed
Mar 06, 2023
Examiner
GONZALES, VINCENT
Art Unit
2124
Tech Center
2100 — Computer Architecture & Software
Assignee
Hitachi, Ltd.
OA Round
1 (Non-Final)
78%
Grant Probability
Favorable
1-2
OA Rounds
3y 6m
To Grant
89%
With Interview

Examiner Intelligence

Grants 78% — above average
78%
Career Allow Rate
410 granted / 522 resolved
+23.5% vs TC avg
+10.5%
Interview Lift
Moderate lift, based on resolved cases with interview
Typical timeline
3y 6m
Avg Prosecution
26 currently pending
Career history
548
Total Applications
across all art units

Statute-Specific Performance

§101
21.2%
-18.8% vs TC avg
§103
39.9%
-0.1% vs TC avg
§102
13.2%
-26.8% vs TC avg
§112
14.6%
-25.4% vs TC avg
Black line = Tech Center average estimate • Based on career data from 522 resolved cases

Office Action

§101
DETAILED ACTION

This action is written in response to the application filed 3/6/23. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Subject Matter Eligibility

Independent claims 1 and 8 each recite a “learning system” comprising various “devices” and “servers”. However, none of these components is defined in the Applicant’s written description. The broadest reasonable interpretation of these components (as well as the “learning system” as a whole) encompasses software per se, which is not a process, machine, manufacture, or composition of matter, and is therefore nonstatutory subject matter. Claims 1 and 8 are accordingly rejected under §101. Dependent claims 2-7 and 9-12 are rejected for the same reason. Claims 13-15 recite a method and are not rejected.

Allowable Claims and Allowable Subject Matter

Claims 13-15 are allowed. Claims 1-12 are allowable over the prior art (but are rejected under §101). Below are the closest cited references, each of which discloses various aspects of the claimed invention:

Papadaki discloses a traditional federated learning system featuring one central server for training (and updating) a global model, as well as a plurality of client devices which manage (and update) local models that are subsequently used to update the global model. However, it does not disclose classifying an individual model at a central/global server (e.g., at a “training server”). (Afroditi Papadaki, et al., "Federating for Learning Group Fair Models", 35th Conference on Neural Information Processing Systems Workshop, 2021. <https://arxiv.org/abs/2110.01999>. Cited by Applicant on IDS dated 3/6/23.)

[Image: Papadaki, p. 3, fig. 1 (excerpt).] See also p. 2, “The clients do not share data with one another or with the server; instead the clients only share focused updates with the server, the server then updates a global model, and distributes the updated model to the clients, with the process carried out over multiple rounds or iterations.”

Sattler discloses a federated learning system which classifies the individual client models as either benign or adversarial. However, it does not disclose classifying a global model received from client devices. (Sattler, Felix, Klaus-Robert Müller, Thomas Wiegand, and Wojciech Samek. "On the byzantine robustness of clustered federated learning." In ICASSP 2020-2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 8861-8865. IEEE, 2020.) P. 8863, “However, as we assume in the byzantine setting that the majority of clients belongs to one single benign cluster and all other clients are considered adversarial, we can of course save computation effort by excluding all clients from training, which do not belong to the largest cluster (10).”

Ye presents a survey of techniques to address heterogeneity in federated learning. Although Ye is not prior art under §102, it presents the state of the art around the time of filing. Topics discussed include different types of data heterogeneity (label skew, feature skew, quality skew, and quantity skew; see p. 8) as well as device heterogeneity (see p. 10). (Ye, Mang, Xiuwen Fang, Bo Du, Pong C. Yuen, and Dacheng Tao. "Heterogeneous federated learning: State-of-the-art and research challenges." ACM Computing Surveys 56, no. 3 (2023): 1-44.)

However, none of the prior art references of record—alone or in combination—discloses or suggests the combined features recited in the independent claims, including specifically (for claim 1): receive the common model from the training server, update the common model and the individual model on the basis of the individual data, and transmit the updated common model and individual model to the training server, and the training server classifies the common model and the individual model transmitted from the plurality of client devices on the basis of the individual model transmitted from the training data management server, and updates the common model and the individual model in accordance with a classification result. Independent claims 8 and 13 are allowable for the same reason as claim 1.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Vincent Gonzales, whose telephone number is (571) 270-3837. The examiner can normally be reached Monday-Friday, 7 a.m. to 4 p.m. MT. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Miranda Huang, can be reached at (571) 270-7092. Information regarding the status of an application may be obtained from the USPTO Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.

/Vincent Gonzales/
Primary Examiner, Art Unit 2124
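The "traditional" federated learning round quoted from Papadaki (clients send focused updates to the server; the server folds them into a global model and redistributes it over multiple rounds) can be sketched as a toy loop. This is purely illustrative: scalar "models," made-up function names (`client_update`, `federated_round`), and toy data are all assumptions, not the claimed invention or any reference's actual code.

```python
# Toy sketch of one federated learning round, per the Papadaki excerpt:
# clients never share raw data, only model updates; the server averages
# the updates into a new global model and sends it back out.

def client_update(global_model, local_data, lr=0.1):
    """A client nudges the received global model toward its local data mean."""
    local_mean = sum(local_data) / len(local_data)
    return global_model + lr * (local_mean - global_model)

def federated_round(global_model, clients_data):
    """The server collects client updates and averages them (FedAvg-style)."""
    updates = [client_update(global_model, d) for d in clients_data]
    return sum(updates) / len(updates)  # updated global model

# Three hypothetical clients, each with private local data.
clients = [[1.0, 2.0], [3.0], [2.0, 2.0]]
global_model = 0.0
for _ in range(100):  # "multiple rounds or iterations," as in the excerpt
    global_model = federated_round(global_model, clients)
```

With enough rounds the toy global model settles at the average of the clients' local means, which is the intuition behind the server-side aggregation step the excerpt describes.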

Prosecution Timeline

Mar 06, 2023
Application Filed
Oct 10, 2025
Non-Final Rejection — §101 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585920
PREDICTING OPTIMAL PARAMETERS FOR PHYSICAL DESIGN SYNTHESIS
2y 5m to grant Granted Mar 24, 2026
Patent 12580040
DIFFUSION MODEL FOR GENERATIVE PROTEIN DESIGN
2y 5m to grant Granted Mar 17, 2026
Patent 12566984
METHODS AND SYSTEMS FOR EXPLAINING ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
2y 5m to grant Granted Mar 03, 2026
Patent 12561402
IDENTIFICATION OF A SECTION OF BODILY TISSUE FOR PATHOLOGY TESTS
2y 5m to grant Granted Feb 24, 2026
Patent 12547647
Unsupervised Machine Learning System to Automate Functions On a Graph Structure
2y 5m to grant Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

1-2
Expected OA Rounds
78%
Grant Probability
89%
With Interview (+10.5%)
3y 6m
Median Time to Grant
Low
PTA Risk
Based on 522 resolved cases by this examiner. Grant probability derived from career allow rate.