Prosecution Insights
Last updated: April 19, 2026
Application No. 18/394,509

GENERATING ANNOTATED DATA SAMPLES FOR TRAINING USING TRAINED GENERATIVE MODEL

Final Rejection — §101, §102, §103, §112
Filed
Dec 22, 2023
Examiner
ORANGE, DAVID BENJAMIN
Art Unit
2663
Tech Center
2600 — Communications
Assignee
International Business Machines Corporation
OA Round
2 (Final)
Grant Probability: 34% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 3y 7m
With Interview: 63%

Examiner Intelligence

Career Allow Rate: 34% (51 granted / 151 resolved; -28.2% vs TC avg)
Interview Lift: +29.4% (resolved cases with interview)
Typical Timeline: 3y 7m avg prosecution; 51 currently pending
Career History: 202 total applications across all art units

Statute-Specific Performance

§101: 13.1% (-26.9% vs TC avg)
§102: 20.2% (-19.8% vs TC avg)
§103: 29.0% (-11.0% vs TC avg)
§112: 32.0% (-8.0% vs TC avg)
Tech Center averages are estimates • Based on career data from 151 resolved cases

Office Action

§101 §102 §103 §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Examiner Note

A copy of Pobitzer, M., Janicki, F., Rigotti, M., & Malossi, C. (2024). Outline-guided object inpainting with diffusion models. arXiv preprint arXiv:2402.16421 is attached because, while published after the priority date, it describes the same invention.

Claim Interpretation

The claims recite both “erase” and “erode.” A review of the art shows that “erode” is a technical term, see, e.g., https://en.wikipedia.org/wiki/Erosion_(morphology). The specification describes erosion as morphological erosion at [0014] and [0042]. “Erase” is understood with its plain meaning, i.e., to remove information.

“Segmentation learning model” is any machine learning model that segments or learns from segments. See, e.g., specification [0041] “For example, the trained segmentation machine learning model 408 may be a neural network, such as … among other suitable networks.” Here, the specification has defined the term to include (at least) neural networks that are suitable, i.e., that perform this function. In other words, the segmentation learning model is defined by what it does, rather than what it is (i.e., it is not defined as a particular neural network architecture).

“Blend” is interpreted under its plain meaning, that is, blending an edge is interpreted as the edge between the two images being aesthetically similar.

Drawings

The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore, the “inpainting model trained on top of a latent diffusion model” of claim 3 must be shown or the feature(s) canceled from the claim(s). No new matter should be entered. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application.
Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Claim Objections

The below claims are objected to because of the following informalities: Claim 1 recites “a processor to.” The claim should instead recite that there are stored instructions to carry out these steps (whereas “a processor to” is understood as intended use, such that claims 1-7 read on any processor). Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-20 are rejected under 35 U.S.C.
112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claims 1, 8, and 15 recite “a predetermined number of times,” but this is subjective because the determination is not specified. MPEP 2173.05(b)(IV).

Claims 1, 8, 14, 15, and 20 recite various steps occurring “via” a processor or a generative model. It is unclear what role the processor or generative model has to be “via.” For example, if in claim 1, the generative model generates a prompt that is used to fill out an area of the erased object, is that level of involvement “via?”

Claim 3 recites “inpainting model trained on top of a latent diffusion model,” but it is unclear what relationship is meant by “on top of.” For example, are the models in series, is one boosting the other?

Dependent claims are likewise rejected.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 15-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claims do not fall within at least one of the four categories of patent eligible subject matter because claim 15 recites a “computer program product.” The broadest reasonable interpretation of “computer program product” includes software per se. MPEP 2106.03(I). The most relevant section of the specification is [0016], which discusses transitory signals in the context of a “computer readable storage medium” and a “storage device,” but not a “computer program product.” Claims 16-20 are likewise rejected.

Claims 1-20 are rejected under 35 U.S.C.
101 because the claimed invention is directed to an abstract idea (mental process) without significantly more.

Step 1: Claim 1 (and its dependents) recites a system, and machines are eligible subject matter. Claim 8 (and its dependents) recites a method, and processes are eligible subject matter. Claim 15 (and its dependents) is not directed to a statutory category.

Step 2A, prong one: All of the elements of claims 1-20 are a mental process because a person can look at a CT image and decide if there’s lung cancer. Further, the various models are also mental processes, see example 47, claim 2, element (d) (from the July 2024 AI subject matter eligibility examples). MPEP 2106.04(a)(2)(III)(C) explains that use of a generic computer or in a computer environment is still a mental process. In particular, this section begins by citing Gottschalk v. Benson, 409 US 63 (1972): “The Supreme Court recognized this in Benson, determining that a mathematical algorithm for converting binary coded decimal to pure binary within a computer’s shift register was an abstract idea.” In Benson the Supreme Court did not separately analyze the computer hardware at issue; the specifics of what hardware was claimed is only included in an appendix to the decision.

Because there are no additional elements, no further analysis is required for Step 2A, prong two or Step 2B.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-11, 13-17, and 20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by US20240169622A1 (“Xie”).

As highlighted by the claim objection, claim 1 and its dependents require only a processor “to” perform certain operations. The operations are not particular to the structure of the processor; they appear to be performable with appropriate programming by any conventional processor. Xie shows a processor 405 in Fig. 4 that appears to be suitable for the intended use of claims 1-8. There is no associated memory claimed; there is no particular structure of a processor claimed; this is just intended use. In addition:

1. A system, comprising a processor to: receive an annotated data sample comprising an object contained in an annotated mask; (Xie, Fig. 2, step 205) partially erase the object contained in the annotated mask; and (Xie, Fig. 2, step 210. Xie’s partially noisy map teaches the claimed partially erase because the specification uses “erase” broadly (see, e.g., specification [0037] that eroding is a type of erasing), and Xie’s noise means that information has been removed.) fill out an erased area of the object a predetermined number of times via a trained generative model to generate an additional annotated data sample. (Xie, Fig. 2, step 215. Xie’s denoising teaches the claimed filling out. If you look closely, you see that the cat in the bottom right is wearing a different hat than the original cat in the upper left. Fig. 12, step 1225.
Xie’s training the model teaches the claimed additional annotated data sample.)

2. The system of claim 1, wherein the trained generative model comprises a diffusion-based inpainting model. (Xie, [0003] “The multi-modal image editing system performs image inpainting by replacing a region of the image corresponding to the mask with noise and using a diffusion model … ”)

3. The system of claim 1, wherein the generative model comprises an inpainting model trained on top of a latent diffusion model. (Xie, [0011] “FIG. 5 shows an example of a guided latent diffusion architecture according to aspects of the present disclosure.”)

4. The system of claim 1, wherein the generated additional annotated data sample comprises an image-mask pair that comprises the annotated mask of the annotated data sample. (Xie, Fig. 2, step 215. [0171] “The loss function provides a value (a “loss”) for how close the predicted annotation data is to the actual annotation data.” See also, claim 20 “generate a predicted mask based on the composite image map using a mask network.”)

5. The system of claim 1, wherein the processor is to erode the annotated mask to generate an eroded mask and erase an area of the object within the eroded mask to partially erase the object. (Xie, Fig. 2, step 210. Xie’s partially noisy map teaches the claimed partially erase because Xie’s noise is the same as the specification’s eroding (see, specification [0037] that eroding is a type of erasing).)

6. The system of claim 1, wherein the processor is to blend an edge of the filled out area of the object with an original outer portion of the object that was not erased using a Gaussian filter. (Xie, Fig. 7, step 755. This is the same cat image as Fig. 2, but easier to see. That the picture looks relatively unedited teaches the claimed blended edges.)

7. The system of claim 1, wherein the processor is to input the associated class from the annotated mask as a text guidance into the diffusion-based inpainting model.
(Xie, Fig. 2, step 205. Xie’s text prompt teaches the claimed associated class because specification [0043] identifies text input as a type of class.)

Claim 8 is rejected as per claim 1.

9. The computer-implemented method of claim 8, further comprising training a segmentation learning model using the generated additional annotated data sample. (Xie, Fig. 12, step 1225. See also, [0082] “training component 445 fine-tunes the pretrained diffusion model based on the segmentation training data … .”)

10. The computer-implemented method of claim 8, wherein the generated additional annotated data sample comprises the annotated mask from the annotated data sample. (Xie, Fig. 12, step 1215.)

11. The computer-implemented method of claim 8, further comprising receiving a text prompt and filling out the erased area using a diffusion-based inpainting model guided by the text prompt. (Xie, Fig. 11, step 1125.)

13. The computer-implemented method of claim 8, wherein filling out the erased area comprises blending an edge of the filled out area of the object with an original outer portion of the object that was not erased using a Gaussian filter. (Xie, Fig. 13, step 1325.)

14. The computer-implemented method of claim 8, further comprising filling out, via the processor, the erased area of the object via the generative model a predetermined number of times to generate a plurality of additional annotated data samples having the same annotation as the annotated data sample. (Xie, Fig. 3. Xie’s composite images (plural) disclose the claimed plurality of samples. Xie, Fig. 12, step 1225. Xie’s training teaches that these are the claimed annotated data samples.)

For claims 15-20, see also, Xie, [0004] “non-transitory computer readable medium.” Claim 15 is rejected as per claim 1. Claim 16 is rejected as per claim 9. Claim 17 is rejected as per claim 11.

20.
The computer program product of claim 15, further comprising program code executable by the processor to fill the erased area of the object via the generative model a predetermined number of times to generate a plurality of additional annotated data samples having the same annotation as the annotated data sample. (Xie, Fig. 3. Xie’s composite images (plural) disclose the claimed plurality of samples. Xie, Fig. 12, step 1225. Xie’s training teaches that these are the claimed annotated data samples.)

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 12, 18, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over US20240169622A1 (“Xie”) in view of the Wikipedia article Erosion (morphology) as of October 27, 2023, retrieved from https://en.wikipedia.org/w/index.php?title=Erosion_(morphology)&oldid=1182092397 (“Wikipedia”).

12.
The computer-implemented method of claim 8, wherein partially erasing the object comprises eroding the annotated mask to generate an eroded mask and erasing an area of the object within the eroded mask. (Li, [100] “erosion kernel can be preset during the preprocessing stage … Similarly, the mask container table can also be preset in the image material preprocessing stage.”) (Wikipedia, “The erosion operation usually uses a structuring element for probing and reducing the shapes contained in the input image.” Wikipedia’s reducing teaches the claimed erasing, see also the example under “binary erosion” and “otherwise it gets deleted.” Wikipedia’s input image teaches the claimed mask; further, Wikipedia’s input can be used with Xie’s mask because Xie teaches processing the masked and unmasked images separately.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to apply the teachings of Wikipedia to the teachings of Xie such that Wikipedia’s erosion is used to implement Li’s erasing for the purpose of a technique for deleting data that is specific to shapes in images (Wikipedia, “reducing the shapes contained in the input image”), simple substitution (MPEP 2143), or art recognized equivalence for the same purpose (MPEP 2144.06(II)) because erosion is “one of two fundamental operations (the other being dilation) in morphological image processing” (Wikipedia) and this technique deletes data (i.e., it is a known substitute for other types of removing information). Based on the above, this is an example of “combining prior art elements according to known methods to yield predictable results.” MPEP 2143.

Claim 18 is rejected as per claim 12. See also, Xie, [0004] “non-transitory computer readable medium.”

19. The computer program product of claim 15, further comprising program code executable by the processor to receive an erosion kernel and erode the annotated mask based on the erosion kernel.
(Wikipedia, Binary erosion, “Example … B is a 3 x 3 matrix.” Wikipedia’s example uses B as an erosion kernel, and B is analogous to specification [0048] “a conservative erosion kernel of 12x12 pixels.”)

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

US20210158523A1 – [0220] “The remaining mask was dilated by a kernel size of 6 to reverse the effects of the initial erosion kernel”
US11570398B2 – Fig. 8

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID ORANGE whose telephone number is (571)270-1799. The examiner can normally be reached Mon-Fri, 9-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Gregory Morse, can be reached at 571-272-3838. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DAVID ORANGE/
Primary Examiner, Art Unit 2663
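The morphological erosion at the center of the §103 rejection (a structuring element probing a binary mask; a foreground pixel survives only if the element fits entirely inside the shape, "otherwise it gets deleted") can be made concrete in a few lines. A minimal sketch in NumPy, using an illustrative 7x7 mask and 3x3 structuring element (the specification's cited kernel is 12x12; all values here are assumptions for demonstration):

```python
import numpy as np

def binary_erosion(mask: np.ndarray, selem: np.ndarray) -> np.ndarray:
    """Erode a binary mask: a pixel survives only if the structuring
    element, centered on it, lies entirely inside the foreground."""
    sh, sw = selem.shape
    ph, pw = sh // 2, sw // 2
    # Zero-pad so border pixels are eroded rather than wrapped around.
    padded = np.pad(mask, ((ph, ph), (pw, pw)), constant_values=0)
    out = np.zeros_like(mask)
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            window = padded[i:i + sh, j:j + sw]
            # Every 1 in the structuring element must land on foreground.
            out[i, j] = int(np.all(window[selem == 1] == 1))
    return out

mask = np.zeros((7, 7), dtype=int)
mask[1:6, 1:6] = 1                  # 5x5 square of foreground
selem = np.ones((3, 3), dtype=int)  # 3x3 "erosion kernel"
eroded = binary_erosion(mask, selem)
print(eroded.sum())                 # 5x5 square shrinks to 3x3 -> 9
```

This is why eroding an annotated mask before erasing (claims 5 and 12) leaves an un-erased rim of the original object: erosion shrinks the mask inward by roughly the kernel radius.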
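Claims 6 and 13 recite blending the edge of the filled-out area into the un-erased portion using a Gaussian filter. One common way to realize that, sketched here as an assumption rather than the application's actual method, is alpha-blending with a Gaussian-smoothed mask (the kernel radius, sigma, and helper names are all illustrative):

```python
import numpy as np

def gaussian_kernel1d(sigma: float, radius: int) -> np.ndarray:
    """Normalized 1-D Gaussian kernel of length 2*radius + 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def smooth_mask(mask: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """Separable Gaussian blur of a binary mask -> soft alpha in [0, 1]."""
    radius = int(3 * sigma)
    k = gaussian_kernel1d(sigma, radius)
    padded = np.pad(mask.astype(float), radius, mode="edge")
    # Convolve rows, then columns (Gaussian blur is separable).
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)

def blend(original: np.ndarray, inpainted: np.ndarray,
          mask: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """Feather the inpainted region into the original along the mask edge."""
    alpha = smooth_mask(mask, sigma)
    return alpha * inpainted + (1 - alpha) * original
```

Inside the filled region alpha is near 1 (the inpainted pixels dominate); outside it is near 0 (the original survives); along the edge it ramps smoothly, which is the "aesthetically similar" edge the claim-interpretation section describes.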

Prosecution Timeline

Dec 22, 2023
Application Filed
Nov 21, 2025
Non-Final Rejection — §101, §102, §103
Feb 19, 2026
Interview Requested
Feb 27, 2026
Examiner Interview Summary
Feb 27, 2026
Applicant Interview (Telephonic)
Mar 03, 2026
Response Filed
Apr 10, 2026
Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12567126
INFRASTRUCTURE-SUPPORTED PERCEPTION SYSTEM FOR CONNECTED VEHICLE APPLICATIONS
2y 5m to grant Granted Mar 03, 2026
Patent 11300964
METHOD AND SYSTEM FOR UPDATING OCCUPANCY MAP FOR A ROBOTIC SYSTEM
2y 5m to grant Granted Apr 12, 2022
Patent 10816794
METHOD FOR DESIGNING ILLUMINATION SYSTEM WITH FREEFORM SURFACE
2y 5m to grant Granted Oct 27, 2020
Patent 10433126
METHOD AND APPARATUS FOR SUPPORTING PUBLIC TRANSPORTATION BY USING V2X SERVICES IN A WIRELESS ACCESS SYSTEM
2y 5m to grant Granted Oct 01, 2019
Patent 10285010
ADAPTIVE TRIGGERING OF RTT RANGING FOR ENHANCED POSITION ACCURACY
2y 5m to grant Granted May 07, 2019
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 34%
With Interview: 63% (+29.4%)
Median Time to Grant: 3y 7m
PTA Risk: Moderate
Based on 151 resolved cases by this examiner. Grant probability derived from career allow rate.
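The headline percentages can be reproduced from the reported counts. A quick check, assuming (per the note above) that grant probability equals the career allow rate and that the interview lift is added in percentage points:

```python
# Reproduce the dashboard's headline numbers from its reported counts.
granted, resolved = 51, 151
allow_rate = granted / resolved
print(f"{allow_rate:.1%}")          # -> 33.8%, displayed rounded as 34%

# Assumption: the +29.4% interview lift is additive in percentage points.
with_interview = allow_rate + 0.294
print(f"{with_interview:.1%}")      # -> 63.2%, displayed rounded as 63%
```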
