Prosecution Insights
Last updated: April 19, 2026
Application No. 18/039,898

IMAGE AUGMENTATION TECHNIQUES FOR AUTOMATED VISUAL INSPECTION

Status: Final Rejection — §103
Filed: Jun 01, 2023
Examiner: THOMPSON, JAMES A
Art Unit: 2615
Tech Center: 2600 — Communications
Assignee: Amgen, Inc.
OA Round: 2 (Final)

Grant Probability: 85% (Favorable)
Expected OA Rounds: 3-4
Median Time to Grant: 2y 11m
Grant Probability with Interview: 89%

Examiner Intelligence

Career Allow Rate: 85% — above average (612 granted / 717 resolved; +23.4% vs TC avg)
Interview Lift: +3.7% (minimal, ~+4%) across resolved cases with interview
Typical Timeline: 2y 11m average prosecution; 11 applications currently pending
Career History: 728 total applications across all art units

Statute-Specific Performance

§101: 8.8% (-31.2% vs TC avg)
§103: 54.4% (+14.4% vs TC avg)
§102: 25.0% (-15.0% vs TC avg)
§112: 7.9% (-32.1% vs TC avg)

Tech Center averages are estimates. Based on career data from 717 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

2. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Response to Arguments

3. Applicant's arguments filed 3 December 2025 have been fully considered but they are not persuasive.

Applicant argues that ‘the combination of Tremblay and Sheremeteva does not teach normalizing the feature matrix relative to a portion of the feature matrix that does not represent the feature.’ Applicant further argues that, ‘in Sheremeteva, the feature matrix represents a matrix of "features" or vectors of "geometrically pixel-by-pixel aligned" data generated across n different scans in n different spectral ranges. Sheremeteva, p. 6. As such, in Sheremeteva, there is no concept of "a feature image depicting the feature," as claimed, because a "feature" in Sheremeteva refers to a single pixel. Said another way, in Sheremeteva, each element of the feature matrix corresponds to a different feature, not "a different pixel of the feature image," as claimed.’

Examiner replies firstly that one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). 
Furthermore, the test for obviousness is not whether the features of a secondary reference may be bodily incorporated into the structure of the primary reference; nor is it that the claimed invention must be expressly suggested in any one or all of the references. Rather, the test is what the combined teachings of the references would have suggested to those of ordinary skill in the art. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981).

Examiner has not rejected the disputed feature based on either reference individually, but rather on the combination of references. Tremblay teaches weighting the feature matrix in paragraphs [0052] and [0128]-[0132], wherein the 3D geometric shapes and 3D objects are weighted during training until a difference between training data and output data reaches a predetermined value. This general teaching is modified by the more specific teaching of Sheremeteva, which teaches, on page 6, line 22 to page 9, line 2 (of the original Russian document) and page 4, last paragraph to page 7, paragraph 3 (of the English translation), normalizing the feature matrix relative to a portion of the feature matrix that does not represent the feature, and generating the synthetic image based on the normalized feature matrix. Therein, Sheremeteva shows Euclidean distance normalizations according to a determined standard, which is in a portion of the feature matrix that does not represent the feature.

The “feature matrix” of claim 1 corresponds to the “matrix of features” in the cited portion of Sheremeteva. While this matrix of features is processed pixel-by-pixel, it is still a matrix of features, and thus corresponds to the “feature matrix” of claim 1. In the cited portion of Sheremeteva, a Euclidean distance normalization scheme is used on the matrix of features, and done so based on a portion of the matrix of features that does not represent the feature. A synthetic image (“synthesized image” in Sheremeteva) is then generated accordingly. 
Thus, the teachings of Sheremeteva are combined with the more generalized weighting scheme of Tremblay to teach the disputed limitation.

Applicant argues that ‘To the extent that the Office Action asserts that the 3D geometric shapes of Tremblay are comparable to the recited "feature matrix" (Present Action, p. 4, a comparison with which the Applicant disagrees), the feature matrix of Sheremeteva represents fundamentally distinct data such that the feature matrix processing techniques (including the asserted normalization techniques) of Sheremeteva would not have been applied to the feature matrix of Tremblay. To this end, the Euclidean distances relied upon in Sheremeteva represent the distances (or weighted distances) from a reference across the n different scans. See equations (1) and (3) of Sheremeteva referring to elements xi to xn. Because the feature matrix of Tremblay is based on a single image, these equations would not have been applied to the feature matrix of Tremblay with a reasonable expectation of success, and one ordinarily skilled in the art would have had no motivation to seek out the techniques of Sheremeteva related to combining data points across multiple scans.’

Examiner replies that, again, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Furthermore, the test for obviousness is not whether the features of a secondary reference may be bodily incorporated into the structure of the primary reference; nor is it that the claimed invention must be expressly suggested in any one or all of the references. Rather, the test is what the combined teachings of the references would have suggested to those of ordinary skill in the art. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981). 
The normalization scheme of Sheremeteva is applied in the context of the teachings of Tremblay, and not merely plugged in mindlessly into the system of Tremblay.

Applicant argues that ‘the Present Action references Tremblay assertedly disclosing "weighting a feature matrix." Present Action, p. 4. Claim 1 does not refer to "weighting" a feature matrix, but "normalizing" a feature matrix. Moreover, the "weights" of Tremblay relate to the weights of a neural network that are adjusted during a machine learning training process, not a feature matrix. Thus, the "weighting" of Tremblay cannot otherwise provide a motivation related to normalizing a feature matrix.’

Examiner replies that, again, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Furthermore, the test for obviousness is not whether the features of a secondary reference may be bodily incorporated into the structure of the primary reference; nor is it that the claimed invention must be expressly suggested in any one or all of the references. Rather, the test is what the combined teachings of the references would have suggested to those of ordinary skill in the art. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981).

Furthermore, "A person of ordinary skill in the art is also a person of ordinary creativity, not an automaton." KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 421, 82 USPQ2d 1385, 1397 (2007). "[I]n many cases a person of ordinary skill will be able to fit the teachings of multiple patents together like pieces of a puzzle." Id. at 420, 82 USPQ2d at 1397. Office personnel may also take into account "the inferences and creative steps that a person of ordinary skill in the art would employ." Id. at 418, 82 USPQ2d at 1396. See also MPEP § 2141.03. 
As stated above, the teachings of Sheremeteva are applied in the context of Tremblay, thus producing the invention as recited in claim 1.

Applicant argues that ‘Li does not disclose or suggest "normalizing the feature matrix relative to a portion of the feature matrix that does not represent the feature," as claimed, and cannot resolve the deficiencies with the Tremblay-Sheremeteva combination.’ Examiner replies that the combination of Tremblay and Sheremeteva is shown above to fully teach the disputed limitation. Thus, the Li reference is not required to reject claim 1.

Conclusion: The disputed features are shown above to be fully taught by the cited references, and there are no new grounds of rejection. Accordingly, the present Office Action is made final.

Claim Rejections - 35 USC § 103

4. The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

5. Claims 1, 5, 6, 12, 14, 16-18, 20-22, 27 and 29 are rejected under 35 U.S.C. 103 as being unpatentable over Tremblay (US-2019/0251397) in view of Sheremeteva (RU-2267232-C1 – English translation with original Russian attached). 
Regarding claim 1: Tremblay discloses a method of generating a synthetic image by transferring a feature onto an original image (figs 2A-2C, [0037] of Tremblay), the method comprising:

receiving or generating a feature matrix that is a numeric representation of a feature image depicting the feature, with each element of the feature matrix corresponding to a different pixel of the feature image (fig 2A (“Geometric Shapes”), fig 2B (“Rendered 3D Geometric Shapes”), and [0039]-[0042] of Tremblay – data corresponding to Geometric Shapes (features) generated by the GPU and received by the input image generator as rendered data (feature matrix), the rendered data corresponding to a 2D matrix of pixels for the features);

receiving or generating a surrogate area matrix that is a numeric representation of an area, within the original image, to which the feature will be transferred, with each element of the surrogate area matrix corresponding to a different pixel of the original image (fig 2A (“3D object(s)”, “Background Image”), fig 2B (“Task-Specific Training Data”), and [0039]-[0042] of Tremblay – surrogate areas established with 3D objects rendered in the surrogate areas as 2D rendered areas (surrogate area matrix), which are 2D matrices of pixels, one matrix for each area);

weighting the feature matrix ([0052] and [0128]-[0132] of Tremblay – 3D geometric shapes and 3D objects weighted during training until a difference between training data and output data reaches a predetermined value); and

generating the synthetic image based on (i) the surrogate area matrix and (ii) the weighted feature matrix (figs 2B-2D, [0041]-[0044], [0050], and [0053]-[0054] of Tremblay – 3D Geometric Shapes (feature matrix) and 3D objects (surrogate matrix) combined to generate synthetic image). 
Tremblay does not disclose normalizing the feature matrix relative to a portion of the feature matrix that does not represent the feature, and generating the synthetic image based on the normalized feature matrix.

Sheremeteva discloses normalizing the feature matrix relative to a portion of the feature matrix that does not represent the feature, and generating the synthetic image based on the normalized feature matrix (page 6, line 22 to page 9, line 2 (original Russian); page 4, last paragraph to page 7, paragraph 3 (translation) of Sheremeteva – equations shown clearly in original Russian portion; Euclidean distance normalizations according to determined standard, which is in a portion of the feature matrix that does not represent the feature).

Tremblay and Sheremeteva are analogous art because they are from the same field of endeavor, namely synthetic image generation. Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to normalize the feature matrix relative to a portion of the feature matrix that does not represent the feature, and to generate the synthetic image based on the normalized feature matrix, as taught by Sheremeteva. The motivation for doing so would have been to provide an image which can more readily be visually interpreted by an observer/user. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Tremblay according to the relied-upon teachings of Sheremeteva to obtain the invention as specified in claim 1.

Regarding claim 5: Tremblay in view of Sheremeteva discloses the method of claim 1 (as rejected above), wherein normalizing the feature matrix includes normalizing the feature matrix on a per-row or per-column basis (page 7, lines 18-35 (original Russian); page 5, paragraph beginning “If to the feature matrix element” (translation) of Sheremeteva). 
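To make the disputed limitation concrete (normalizing a feature matrix relative to a portion of the matrix that does not represent the feature, then generating a synthetic image from the result), here is a minimal illustrative sketch in Python. Everything in it is a hypothetical stand-in: the background-selection rule, the min-max style normalization, and the alpha-blend compositing are chosen for illustration only, and are neither the claimed algorithm nor Sheremeteva's Euclidean-distance scheme.

```python
import numpy as np

def transfer_feature(original, feature, top, left):
    """Illustrative sketch: normalize a feature matrix against its
    non-feature (background) elements, then blend it into a surrogate
    area of the original image. Not the claimed algorithm itself."""
    feature = feature.astype(float)
    # Hypothetical rule: treat near-zero pixels as the portion of the
    # feature matrix that does NOT represent the feature.
    background = feature[feature < 0.1 * feature.max()]
    baseline = background.mean() if background.size else 0.0
    # Normalize relative to that non-feature portion.
    normalized = (feature - baseline) / (feature.max() - baseline + 1e-9)
    normalized = np.clip(normalized, 0.0, 1.0)

    # Surrogate area: the region of the original the feature lands on.
    h, w = feature.shape
    synthetic = original.astype(float).copy()
    area = synthetic[top:top + h, left:left + w]
    # Alpha-blend using the normalized feature matrix as the weight.
    synthetic[top:top + h, left:left + w] = (
        (1.0 - normalized) * area + normalized * feature.max()
    )
    return synthetic

original = np.full((8, 8), 50.0)          # plain background image
feature = np.zeros((3, 3)); feature[1, 1] = 200.0   # one bright defect pixel
out = transfer_feature(original, feature, 2, 2)
```

Only the bright element of the feature patch survives the blend; background elements of the patch normalize to zero weight and leave the original pixels untouched.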
Tremblay and Sheremeteva are combined for the reasons set forth above with respect to claim 1.

Regarding claim 6: Tremblay in view of Sheremeteva discloses the method of claim 5 (as rejected above), wherein normalizing the feature matrix on a per-row or per-column basis includes, for each row or column of the feature matrix: generating a feature row histogram of element values for the row or column of the feature matrix (page 5, line 50 to page 6, line 3; and page 7, lines 18-35 (original Russian); page 4, 4th paragraph; and page 5, paragraph beginning “If to the feature matrix element” (translation) of Sheremeteva). Tremblay and Sheremeteva are combined for the reasons set forth above with respect to claim 1.

Regarding claim 12: Tremblay in view of Sheremeteva discloses the method of claim 1 (as rejected above), wherein receiving or generating the feature matrix includes rotating the feature matrix or the feature image, and wherein rotating the feature matrix or the feature image includes rotating the feature matrix or the feature image by an amount that is based on (i) a rotation of the feature depicted in the feature image and (ii) a desired rotation of the feature depicted in the feature image (fig 1B, fig 2B, [0032], and [0043] of Tremblay).

Regarding claim 14: Tremblay in view of Sheremeteva discloses the method of claim 12 (as rejected above), further comprising: determining the desired rotation based on a position of the area to which the feature will be transferred (fig 1B and [0032] of Tremblay).

Regarding claim 16: Tremblay in view of Sheremeteva discloses the method of claim 1 (as rejected above), further comprising: repeating the method for each of a plurality of features corresponding to different features in a feature library (fig 1B (“Rendered Images of the Object of Interest”), and [0030]-[0031] of Tremblay – repeated for plurality of features, including color, texture, and others). 
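Claims 5 and 6 above (and the parallel system claims 21 and 22) recite per-row or per-column normalization driven by a histogram of each row's element values. The sketch below shows one way such a scheme could work, using a per-row histogram to pick the normalization span. The bin count and the span rule are assumptions for illustration, not the method recited in the claims or taught by Sheremeteva.

```python
import numpy as np

def normalize_rows_by_histogram(feature_matrix, bins=16):
    """Sketch of per-row normalization driven by a per-row histogram:
    build a histogram of each row's element values, then rescale the
    row by the occupied histogram range. Hypothetical scheme."""
    out = np.empty_like(feature_matrix, dtype=float)
    for i, row in enumerate(feature_matrix):
        counts, edges = np.histogram(row, bins=bins)
        # Use the span of non-empty bins as the normalization range.
        occupied = np.nonzero(counts)[0]
        lo, hi = edges[occupied[0]], edges[occupied[-1] + 1]
        out[i] = (row - lo) / (hi - lo) if hi > lo else 0.0
    return out

# Each row is rescaled by its own histogram span, so rows with very
# different dynamic ranges end up on a common [0, 1] scale.
m = np.array([[0.0, 5.0, 10.0], [100.0, 150.0, 200.0]])
norm = normalize_rows_by_histogram(m)
```

Applying the same loop to columns (e.g. by passing `feature_matrix.T`) gives the per-column variant.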
Regarding claim 17: Tremblay in view of Sheremeteva discloses the method of claim 1 (as rejected above), further comprising: generating a plurality of synthetic images by repeating the method for each of a plurality of original images ([0020]-[0021] of Tremblay – performed for multiple images/data sets to train the deep neural network).

Regarding claim 18: Tremblay in view of Sheremeteva discloses the method of claim 17 (as rejected above), further comprising: training a neural network for automated visual inspection using the plurality of synthetic images and the plurality of original images ([0020]-[0021] of Tremblay).

Regarding claim 20: Tremblay discloses a system (fig 3 and [0055] of Tremblay) comprising: one or more processors (fig 3(300) and [0056] of Tremblay); and one or more non-transitory, computer-readable media storing instructions that, when executed by the one or more processors, cause the system (fig 3(304), [0057], and [0065] of Tremblay) to receive or generate a feature matrix that is a numeric representation of a feature image depicting the feature, with each element of the feature matrix corresponding to a different pixel of the feature image (fig 2A (“Geometric Shapes”), fig 2B (“Rendered 3D Geometric Shapes”), and [0039]-[0042] of Tremblay – data corresponding to Geometric Shapes (features) generated by the GPU and received by the input image generator as rendered data (feature matrix), the rendered data corresponding to a 2D matrix of pixels for the features);

receive or generate a surrogate area matrix that is a numeric representation of an area, within the original image, to which the feature will be transferred, with each element of the surrogate area matrix corresponding to a different pixel of the original image (fig 2A (“3D object(s)”, “Background Image”), fig 2B (“Task-Specific Training Data”), and [0039]-[0042] of Tremblay – surrogate areas established with 3D objects rendered in the surrogate areas as 2D rendered areas (surrogate area 
matrix), which are 2D matrices of pixels, one matrix for each area);

weight the feature matrix ([0052] and [0128]-[0132] of Tremblay – 3D geometric shapes and 3D objects weighted during training until a difference between training data and output data reaches a predetermined value); and

generate a synthetic image based on (i) the surrogate area matrix and (ii) the weighted feature matrix (figs 2B-2D, [0041]-[0044], [0050], and [0053]-[0054] of Tremblay – 3D Geometric Shapes (feature matrix) and 3D objects (surrogate matrix) combined to generate synthetic image).

Tremblay does not disclose normalizing the feature matrix relative to a portion of the feature matrix that does not represent the feature, and generating a synthetic image based on the normalized feature matrix.

Sheremeteva discloses normalizing the feature matrix relative to a portion of the feature matrix that does not represent the feature, and generating a synthetic image based on the normalized feature matrix (page 6, line 22 to page 9, line 2 (original Russian); page 4, last paragraph to page 7, paragraph 3 (translation) of Sheremeteva – equations shown clearly in original Russian portion; Euclidean distance normalizations according to determined standard, which is in a portion of the feature matrix that does not represent the feature).

Tremblay and Sheremeteva are analogous art because they are from the same field of endeavor, namely synthetic image generation. Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to normalize the feature matrix relative to a portion of the feature matrix that does not represent the feature, and to generate a synthetic image based on the normalized feature matrix, as taught by Sheremeteva. The motivation for doing so would have been to provide an image which can more readily be visually interpreted by an observer/user. 
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Tremblay according to the relied-upon teachings of Sheremeteva to obtain the invention as specified in claim 20.

Regarding claim 21: Tremblay in view of Sheremeteva discloses the system of claim 20 (as rejected above), wherein normalizing the feature matrix includes normalizing the feature matrix on a per-row or per-column basis (page 7, lines 18-35 (original Russian); page 5, paragraph beginning “If to the feature matrix element” (translation) of Sheremeteva). Tremblay and Sheremeteva are combined for the reasons set forth above with respect to claim 20.

Regarding claim 22: Tremblay in view of Sheremeteva discloses the system of claim 21 (as rejected above), wherein normalizing the feature matrix on a per-row or per-column basis includes, for each row or column of the feature matrix: generating a feature row histogram of element values for the row or column of the feature matrix (page 5, line 50 to page 6, line 3; and page 7, lines 18-35 (original Russian); page 4, 4th paragraph; and page 5, paragraph beginning “If to the feature matrix element” (translation) of Sheremeteva). Tremblay and Sheremeteva are combined for the reasons set forth above with respect to claim 20.

Regarding claim 27: Tremblay in view of Sheremeteva discloses the system of claim 20 (as rejected above), wherein receiving or generating the feature matrix includes rotating the feature matrix or the feature image, and wherein rotating the feature matrix or the feature image includes rotating the feature matrix or the feature image by an amount that is based on (i) a rotation of the feature depicted in the feature image and (ii) a desired rotation of the feature depicted in the feature image (fig 1B, fig 2B, [0032], and [0043] of Tremblay). 
Regarding claim 29: Tremblay in view of Sheremeteva discloses the system of claim 27 (as rejected above), wherein the instructions further cause the system to: determine the desired rotation based on a position of the area to which the feature will be transferred (fig 1B and [0032] of Tremblay).

6. Claims 2 and 3 are rejected under 35 U.S.C. 103 as being unpatentable over Tremblay (US-2019/0251397) in view of Sheremeteva (RU-2267232-C1 – English translation with original Russian attached), and further in view of Li (US-2020/0126210).

Regarding claim 2: Tremblay in view of Sheremeteva discloses the method of claim 1 (as rejected above). Tremblay in view of Sheremeteva does not disclose wherein: the original image is an image of a container; and the feature is a defect associated with the container or contents of the container.

Li discloses wherein: the original image is an image of a container; and the feature is a defect associated with the container or contents of the container (fig 3 and [0025] of Li – images of vials, some of which have defects detected using computer vision). Tremblay and Li are analogous art because they are from the same field of endeavor, namely machine learning for image data processing and analysis. Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to have the original image be an image of a container, and the feature be a defect associated with the container or contents of the container, as taught by Li. The motivation for doing so would have been to provide an efficient analysis of images pertaining to a particular application of the imaging system. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Tremblay further according to the relied-upon teachings of Li to obtain the invention as specified in claim 2. 
Regarding claim 3: Tremblay in view of Sheremeteva, and further in view of Li, discloses the method of claim 2 (as rejected above), wherein: (i) the container is a syringe, and the feature is a defect associated with a barrel of the syringe, a plunger of the syringe, a needle shield of the syringe, or a fluid within the syringe (list recited in the alternative; possibility (ii) is taught by Li); or (ii) the container is a vial, and the feature is a defect associated with a wall of the vial, a cap of the vial, a crimp of the vial, or a fluid or lyophilized cake within the vial (figs 3-5, [0025], [0027], and [0030]-[0033] of Li). Tremblay and Li are combined for the reasons set forth above with respect to claim 2.

Allowable Subject Matter

7. Claims 7-11 and 23-26 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

8. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to James A Thompson whose telephone number is (571) 272-7441. 
The examiner can normally be reached M-F 8am-6pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington, can be reached at 571-272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JAMES A THOMPSON/
Primary Examiner, Art Unit 2615

Prosecution Timeline

Jun 01, 2023 — Application Filed
Jul 30, 2025 — Non-Final Rejection — §103
Dec 03, 2025 — Response Filed
Feb 03, 2026 — Final Rejection — §103
Apr 06, 2026 — Examiner Interview Summary
Apr 06, 2026 — Applicant Interview (Telephonic)
Apr 15, 2026 — Response after Non-Final Action

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602893 — VOLUMETRIC HEAT MAPS FOR INTELLIGENTLY DISTRIBUTING VIRTUAL OBJECTS OF INTEREST (2y 5m to grant; granted Apr 14, 2026)
Patent 12592011 — Digital Representation of Intertwined Vector Objects (2y 5m to grant; granted Mar 31, 2026)
Patent 12585379 — Methods and systems for generating image tool recommendations (2y 5m to grant; granted Mar 24, 2026)
Patent 12579681 — METHOD AND DEVICE FOR DRAWING SPATIAL MAP, CAMERA EQUIPMENT AND STORAGE MEDIUM (2y 5m to grant; granted Mar 17, 2026)
Patent 12579743 — ALIGNED AUGMENTED REALITY VIEWS (2y 5m to grant; granted Mar 17, 2026)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 85%
With Interview: 89% (+3.7%)
Median Time to Grant: 2y 11m
PTA Risk: Moderate

Based on 717 resolved cases by this examiner. Grant probability derived from career allow rate.
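The headline figures in this panel follow from the examiner's raw career counts shown earlier (612 granted of 717 resolved, with a reported +3.7% interview lift). A quick arithmetic check, assuming the lift is simply added to the base rate (the tool may compute it differently):

```python
granted, resolved = 612, 717             # career counts from the examiner card
allow_rate = granted / resolved          # ~0.854, displayed as 85%
with_interview = allow_rate + 0.037      # assumed additive +3.7% lift, ~0.891

print(round(allow_rate * 100), round(with_interview * 100))  # prints: 85 89
```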
