Prosecution Insights
Last updated: April 19, 2026
Application No. 17/933,348

GAN-BASED DATA GENERATION FOR CONTINUOUS CENTRALIZED ML TRAINING

Non-Final OA (§103, §112)

Filed: Sep 19, 2022
Examiner: ROY, SANCHITA
Art Unit: 2146
Tech Center: 2100 — Computer Architecture & Software
Assignee: DELL PRODUCTS, L.P.
OA Round: 1 (Non-Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 3m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 72% (228 granted / 316 resolved; +17.2% vs TC avg, above average)
Interview Lift: +46.0% (resolved cases with vs. without an interview)
Avg Prosecution: 3y 3m (typical timeline; 19 applications currently pending)
Total Applications: 335 (across all art units)
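As a sanity check on the figures above, the career allow rate and the interview-adjusted rate can be reproduced with a few lines of arithmetic. This is a minimal sketch; it assumes the +46.0% lift is multiplicative relative to the without-interview allow rate, which the dashboard does not state explicitly.

```python
granted, resolved = 228, 316       # career totals from the examiner's record
career_allow = granted / resolved  # overall career allow rate

with_interview = 0.99              # reported grant probability with an interview
lift = 0.46                        # reported interview lift (+46.0%)
# Implied allow rate for cases WITHOUT an interview, if the lift is multiplicative:
without_interview = with_interview / (1 + lift)

print(f"career allow rate: {career_allow:.1%}")                # 72.2%
print(f"implied rate w/o interview: {without_interview:.1%}")  # 67.8%
```

The implied without-interview figure (~68%) sits below the 72% career rate, which is consistent with interview cases pulling the overall average up.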

Statute-Specific Performance

§101: 11.3% (-28.7% vs TC avg)
§103: 45.4% (+5.4% vs TC avg)
§102: 10.9% (-29.1% vs TC avg)
§112: 27.3% (-12.7% vs TC avg)
Tech Center averages are estimates • Based on career data from 316 resolved cases
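The per-statute deltas above are internally consistent: subtracting each delta from the examiner's rate recovers the same implied Tech Center average for every statute. A quick check (variable names are hypothetical; figures are taken from the table above):

```python
# Examiner's per-statute rate vs. the reported delta against the TC average
examiner_rate = {"101": 11.3, "103": 45.4, "102": 10.9, "112": 27.3}
delta_vs_tc   = {"101": -28.7, "103": 5.4, "102": -29.1, "112": -12.7}

# Implied Tech Center average per statute: examiner rate minus delta
tc_avg = {s: round(examiner_rate[s] - delta_vs_tc[s], 1) for s in examiner_rate}
print(tc_avg)  # every statute implies the same 40.0% TC average
```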

Office Action

§103, §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-20 are presented for examination.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor, or for pre-AIA the applicant, regards as the invention.

Claim(s) 1, 9 and 17 each recite "storing the data in a data repository associated with the machine learning service, which is configured to retrain the machine learning model". It is unclear whether (a) the data or (b) the machine learning service is configured to retrain the machine learning model, rendering the claim(s) indefinite. For examination purposes the examiner has interpreted this limitation to be "storing the data in a data repository associated with the machine learning service, wherein the data is configured to retrain the machine learning model".

Claim(s) 1, 9 and 17 each recite "wherein a machine learning model operates at each of the nodes... which is configured to retrain the machine learning model, ... training models associated with the nodes, wherein each of the nodes is associated with a different one of the models and wherein each of the models is trained with data from the associated node, wherein each of the models includes a generator that is configured to generate synthetic data; retraining the machine learning model using the data stored in the data repository and the synthetic data generated by one or more of the generators when necessary; and deploying the retrained machine learning model to each of the nodes". It is unclear whether (a) "a machine learning model ... at each of the nodes" is the same as the "different one of the models" associated with each node, (b) "a machine learning model ... at each of the nodes" is identical at each node, or (c) "a machine learning model ... at each of the nodes" is a different version or copy at each node. Because the above gives rise to the possibility of multiple machine learning models, it is unclear which of the possible multiple machine learning models is retrained, rendering the claim(s) indefinite.

Claim(s) 1, 9 and 17 each recite "the synthetic data generated by one or more of the generators when necessary". It is unclear (a) what constitutes necessary versus unnecessary in the context of the claim, and (b) whether (i) retraining the machine learning model or (ii) generating the synthetic data by one or more of the generators is performed when necessary, rendering the claim(s) indefinite.

Claim(s) 2-8, 10-16 and 18-20 do not contain claim limitations that cure the indefiniteness of claim(s) 1, 9 and 17, respectively, and therefore are also indefinite under 35 U.S.C. 112(b).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Ickin (US 20230088561 A1) in view of Wouhaybi (US 20220222583 A1).

Regarding claim 1, Ickin teaches a method comprising (Ickin [162]): receiving data from nodes at a machine learning service, wherein a machine learning model operates at each of the nodes (Ickin Fig. 2, [93, 64]: multiple worker models are at worker nodes; workers send data to the central node (master)); ...receiving... the data ... at... the machine learning service, which is configured to retrain the machine learning model, wherein the machine learning service is centrally located with respect to the nodes and wherein the data repository stores data received from the plurality of nodes (Ickin [93-96]: the central node may be retrained based on information from workers); training models associated with the nodes, wherein each of the nodes is associated with a different one of the models and wherein each of the models is trained with data from the associated node, wherein each of the models includes a generator that is configured to generate synthetic data (Ickin [5, 93], Fig. 2: models at each worker node may be trained separately; worker nodes have generators); retraining the machine learning model using the data stored in the data repository and the synthetic data generated by one or more of the generators when necessary; and deploying the retrained machine learning model to each of the nodes (Ickin [93-96, 98]: the central node determines for each worker node whether synthetic data is required; based on the determination, synthetic data may be added; the model(s) may be retrained using the dataset and the synthetic data; the model aggregated from the workers may be sent to the workers).

Ickin does not specifically teach storing the data in a data repository associated with the machine learning service. However, Wouhaybi teaches storing the data in a data repository associated with the machine learning service (Wouhaybi [28, 30, 77]: data from remote nodes may be sent to a central node; model data may be stored). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the concept taught by Wouhaybi of storing the data in a data repository associated with the machine learning service into the invention suggested by Ickin, since both inventions are directed towards generating an aggregate model based on remote models; incorporating the teaching of Wouhaybi into the invention suggested by Ickin would provide the added advantage of allowing information to be stored so it may be used when required, and the combination would perform with a reasonable expectation of success (Wouhaybi [28, 30, 77]).

Regarding claim 2, Ickin and Wouhaybi teach the invention as claimed in claim 1 above. Ickin further teaches retraining the machine learning model with the synthetic data only from generators that are enabled (Ickin [93, 97]: workers that were directed to use synthetic data are the only ones that contribute synthetic data to retraining).

Regarding claim 3, Ickin and Wouhaybi teach the invention as claimed in claim 2 above. Ickin further teaches wherein the generators are enabled when corresponding discriminators in the models cannot distinguish between a real data sample and a synthetic sample (Ickin [23, 89, 60, 97]: generator data may be used when the discriminator is satisfied and cannot tell real from synthetic data).

Regarding claim 4, Ickin and Wouhaybi teach the invention as claimed in claim 1 above. Ickin further teaches determining, for each of the nodes, an amount of synthetic data to be used in retraining the machine learning models (Ickin [64]: worker synthetic data is the same size as real data).

Regarding claim 5, Ickin and Wouhaybi teach the invention as claimed in claim 4 above. Ickin further teaches contributing an amount of synthetic data to ensure that the amount of training data from each of the nodes is within a standard deviation of a mean amount of training data contributed from each of the nodes (Ickin [64]: worker synthetic data is the same size as real data; Ickin [122]: workers may generate data to compensate for a different distribution).

Regarding claim 6, Ickin and Wouhaybi teach the invention as claimed in claim 1 above. Ickin further teaches wherein synthetic data is included to ensure that each of the nodes contributes the amount of training data (Ickin [64]: worker synthetic data is the same size as real data; Ickin [122]: workers may generate data to compensate for a different distribution).

Regarding claim 7, Ickin and Wouhaybi teach the invention as claimed in claim 1 above. Ickin further teaches wherein each of the models is configured to learn a distribution of data from a corresponding node (Ickin [122]: workers may generate data to compensate for a different distribution compared to other nodes).

Regarding claim 8, Ickin and Wouhaybi teach the invention as claimed in claim 1 above. Ickin does not specifically teach deleting the data repository after retraining the machine learning model. However, Wouhaybi teaches deleting the data repository after retraining the machine learning model (Wouhaybi [28, 30]: information may be updated after retraining).

Claim 9 is directed towards a medium storing instructions similar in scope to the instructions performed by the method of claim 1, and is rejected under the same rationale. Ickin further teaches a non-transitory storage medium having stored therein instructions that are executable by one or more hardware processors to perform operations (Ickin [162]).

Claims 10-16 are dependent on claim 9 above, are directed towards a medium storing instructions similar in scope to the instructions performed by the method of claims 2-8, respectively, and are rejected under the same rationale.

Claim 17 is directed towards a method performing instructions similar in scope to the instructions performed by the method of claim 1 and is rejected under the same rationale. Ickin does not specifically teach wherein the nodes are grouped into groups, each of the groups including one or more of the nodes, wherein each of the groups is associated with a different one of the models. However, Wouhaybi teaches wherein the nodes are grouped into groups, each of the groups including one or more of the nodes, wherein each of the groups is associated with a different one of the models (Wouhaybi [41, 67], Fig. 12: nodes may have partial or full machine learning models; nodes may be grouped).

Claims 18, 19 and 20 are dependent on claim 17 above, are directed towards a method performing instructions similar in scope to the instructions performed by the method of claims 2, 3 and 5, respectively, and are rejected under the same rationale.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SANCHITA ROY, whose telephone number is (571) 272-5310. The examiner can normally be reached Monday-Friday, 12-8. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Usmaan Saeed, can be reached at (571) 272-4046. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SANCHITA ROY/
Primary Examiner, Art Unit 2146
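The generator-gating idea discussed in the claim 3 rejection (a node's generator is enabled only once its discriminator can no longer tell real samples from synthetic ones) can be sketched as a simple near-chance-accuracy test. The function name and the tolerance value below are hypothetical illustrations, not taken from the application or the cited references:

```python
def generator_enabled(discriminator_accuracy: float, tolerance: float = 0.05) -> bool:
    """A GAN generator is treated as 'enabled' when its discriminator performs
    near chance (50% accuracy), i.e. it cannot reliably distinguish real
    samples from synthetic ones. The 0.05 tolerance is a hypothetical choice."""
    return abs(discriminator_accuracy - 0.5) <= tolerance

# A confident discriminator (90% accuracy) means the generator is not ready:
print(generator_enabled(0.90))  # False
# Near-chance accuracy means synthetic data is realistic enough to contribute:
print(generator_enabled(0.52))  # True
```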

Prosecution Timeline

Sep 19, 2022: Application Filed
Jan 10, 2026: Non-Final Rejection (§103, §112)
Apr 13, 2026: Examiner Interview Summary
Apr 13, 2026: Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599476: AI-BASED VIDEO ANALYSIS OF CATARACT SURGERY FOR DYNAMIC ANOMALY RECOGNITION AND CORRECTION (granted Apr 14, 2026; 2y 5m to grant)
Patent 12585966: INTELLIGENT DEVICE SELECTION USING HISTORICAL INTERACTIONS (granted Mar 24, 2026; 2y 5m to grant)
Patent 12585870: READER MODE-OPTIMIZED ATTENTION APPLICATION (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579656: MACHINE LEARNING DENTAL SEGMENTATION SYSTEM AND METHODS USING GRAPH-BASED APPROACHES (granted Mar 17, 2026; 2y 5m to grant)
Patent 12562275: INTERACTIVE SUBGROUP DISCOVERY (granted Feb 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 72%
With Interview: 99% (+46.0%)
Median Time to Grant: 3y 3m
PTA Risk: Low
Based on 316 resolved cases by this examiner. Grant probability derived from career allow rate.
