Prosecution Insights
Last updated: April 19, 2026
Application No. 17/827,724

SYSTEM AND METHOD FOR AUTHORING HIGH QUALITY RENDERINGS AND GENERATING MANUFACTURING OUTPUT OF CUSTOM PRODUCTS

Final Rejection — §103, §112
Filed: May 29, 2022
Examiner: MOLL, NITHYA JANAKIRAMAN
Art Unit: 2189
Tech Center: 2100 — Computer Architecture & Software
Assignee: Zazzle Inc.
OA Round: 2 (Final)

Grant Probability: 67% (Favorable)
Expected OA Rounds: 3-4
Median Time to Grant: 3y 10m
Grant Probability With Interview: 81%

Examiner Intelligence

Career Allow Rate: 67% (355 granted / 530 resolved; +12.0% vs TC avg) — above average
Interview Lift: +13.6% among resolved cases with interview (moderate)
Typical Timeline: 3y 10m average prosecution; 24 applications currently pending
Career History: 554 total applications across all art units

Statute-Specific Performance

§101: 24.0% (-16.0% vs TC avg)
§103: 37.3% (-2.7% vs TC avg)
§102: 15.5% (-24.5% vs TC avg)
§112: 19.5% (-20.5% vs TC avg)

Tech Center averages are estimates. Based on career data from 530 resolved cases.
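As a sanity check, each statute's rate minus its stated delta should recover the Tech Center average it was compared against. A minimal sketch of that arithmetic, using only the figures from the table above (the dictionary layout and variable names are illustrative, not part of the tool):

```python
# Per-statute rates and their stated deltas vs the Tech Center average,
# taken from the table above (all values in percent).
statute_stats = {
    "101": (24.0, -16.0),
    "103": (37.3, -2.7),
    "102": (15.5, -24.5),
    "112": (19.5, -20.5),
}

# rate = tc_avg + delta, so the implied TC average is rate - delta.
implied_tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in statute_stats.items()}

print(implied_tc_avg)
# Every statute implies the same 40.0% Tech Center average estimate,
# so the rates and deltas shown are mutually consistent.
```

If any statute implied a different average, the displayed delta (or rate) would be inconsistent with the chart's baseline.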

Office Action

Rejections: §103, §112
DETAILED ACTION

This action is in response to the submission filed on 12/30/2025. Claims 1-20 are presented for examination.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments - 35 USC § 112

Applicant's arguments filed 12/30/2025 have been fully considered. The rejections regarding claims 3, 7, 10, 14 and 17 are withdrawn in light of the amendments. Applicant has not addressed the rejections to claims 5, 12 and 19. The rejection stands.

Response to Arguments - 35 USC § 102

Applicant's arguments have been considered but are moot in view of the new grounds of rejection necessitated by the amendments.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 5, 12, and 19 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claims 5, 12 and 19 recite "controlling a flow of data associated with the one or more filters based on one or more controlling functionalities implemented in the one or more filters". It is unclear how a filter can have 'controlling functionalities'. It is unknown how filters can have a 'flow of data'. It is unknown how to interpret this limitation.
For the purposes of examination it is interpreted to merely mean that filters are used.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over US 20110292451 A1 ("Harvill") in view of US 11205023 B2 ("Bowen").

Regarding claims 1, 8 and 15, Harvill teaches:

A method for applying filters to a composite image file (Harvill: para [0045], "Logic 212 may include or implement image filter logic 216"), the method comprising:

receiving a composite image file generated for a physical product (Harvill: Fig. 1, 102, "Customer uploads a finished digital graphical image to a server computer");

selecting, one or more filters, from a plurality of filters, for a layer or a plurality of layers in the image file, based on a link to corresponding customizable product area, to be applied to the composite image file (Harvill: Fig. 2A; para [0053], "Filter the image so that contiguous color areas that are close in color space are preserved, and represented by a good candidate color"; para [0059], "Step 264 represents initiating a loop in which steps 266, 268, 270 are performed for each 3×3 sample area in the image. Thus, for each sample area in the source image, in step 266 a sample pixel is selected. In step 268, the process selects or finds the closest candidate pixel within the sample area that is not the sample pixel. 'Closest,' in this context, refers to closeness in color space. The particular pixel of the destination image that maps to the current sample area is set to the value of the found candidate pixel, at step 270. Processing continues at step 266 until all sample areas of the source image are processed"; para [0061], "Referring now to FIG. 2C, in step 280 a Color Count value is obtained. The Color Count value determines a number of colors, which correspond to layers of screen print ink, are to be used for printing the final image on a product");

wherein each filter, of the plurality of filters, is a processing component that operates on data (Harvill: para [0053], "Filter the image so that contiguous color areas that are close in color space are preserved, and represented by a good candidate color … Color areas that are not contiguous, which could be called color noise, are filtered out"; the filter is operating on the contiguous color areas data), and processes data (Harvill: para [0053], "Filter the image so that contiguous color areas that are close in color space are preserved, and represented by a good candidate color … Color areas that are not contiguous, which could be called color noise, are filtered out"; the filter is processing the contiguous color areas data);

applying the one or more filters to the composite image file to (Harvill: para [0059], "Step 264 represents initiating a loop in which steps 266, 268, 270 are performed for each 3×3 sample area in the image. Thus, for each sample area in the source image, in step 266 a sample pixel is selected. In step 268, the process selects or finds the closest candidate pixel within the sample area that is not the sample pixel. 'Closest,' in this context, refers to closeness in color space. The particular pixel of the destination image that maps to the current sample area is set to the value of the found candidate pixel, at step 270. Processing continues at step 266 until all sample areas of the source image are processed"; para [0053], "Filter the image so that contiguous color areas that are close in color space are preserved, and represented by a good candidate color"):

automatically segment the composite image file into a plurality of parts of the composite image file (Harvill: para [0058], "In step 262, the process logically tiles the source image specified by the user into sample areas of 3×3 pixels in size. The 3×3 sample areas may overlap in the source image. In an embodiment, there is a one to one mapping between each 3×3 sample area in the source image and one different particular pixel of a destination image. FIG. 2D, 2E illustrate mapping sample areas of a source image to particular pixels of a destination image. FIG. 2D illustrates a relationship of a source image A to a destination image B in which a first 3×3 sample area C maps to a pixel E of the destination image. As seen in FIG. 2E, similarly, a second 3×3 sample area of the same source image A maps to a different pixel E of the destination image"); and

automatically apply the one or more filters to the plurality of parts of the composite image file to generate a digital representation of the physical product (Harvill: para [0053], "Filter the image so that contiguous color areas that are close in color space are preserved, and represented by a good candidate color"; para [0021], "producing custom screen printed products based on digital images. Disclosed techniques include a work flow for receiving user-supplied images and transforming the images into color separation data and related metadata that can drive screen printing, including previewing the appearance of the products as they will appear when screen printed, for user approval; finding key color components using a filtering and grouping approach; image resolution limiting; structured error diffusion halftoning with an improved random number generator for use in adding blue noise; error diffusion dot spread correction; and others");

generating a digital representation of the physical product as a series of user interface elements for the user to control, modify, and view the digital representation of the physical product using one or more common user interface classes available on a digital platform (Harvill: Fig. 1, 114, "User views translation of original image into screen print image and adjust settings if needed"; 116, "User selects substrate type, sizes, colors"; 118, "Customer image display on image of substrate"; 120, "Product purchase"; 122, "Server computer delivers digital files to screen print service").

Harvill does not teach but Bowen does teach:

wherein each filter, of the plurality of filters, is a processing component that creates data (Bowen: col. 60, lines 52-63, "filters (e.g., vintage filters, vivid filters, sepia, black and white, pop-art/saturated colors, graphite, invert colors, texture, grain, etc.)"; modifying an image using a filter is creating data, for example applying a sepia filter is creating sepia image data, inverting colors is creating inverted color data, etc.).

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined Harvill (directed to applying filters to a composite image file) and Bowen (directed to filtering that creates data) and arrived at applying filters to a composite image file using filtering that creates data.
One of ordinary skill in the art would have been motivated to make such a combination because, as Bowen explains, "A computer-aided design system enables physical articles to be customized via printing or embroidering and enables digital content to be customized" (Bowen: col. 1).

Regarding claims 2, 9 and 16, Harvill and Bowen teach:

The method of Claim 1, wherein a filter, of the plurality of filters, is a computer program that comprises a set of executable commands which, when executed by a computer processor, cause the computer processor to traverse paths defined in a corresponding layout of the composite image file (paragraph [0474] of Applicant's disclosure indicates that a 'path' merely captures the characteristics of a custom design; Harvill: Fig. 1, 102, "Customer uploads a finished digital graphical image to a server computer"; Fig. 1, 114, "User views translation of original image into screen print image and adjust settings if needed"; 116, "User selects substrate type, sizes, colors"; 118, "Customer image display on image of substrate"; 120, "Product purchase"; 122, "Server computer delivers digital files to screen print service").

Regarding claims 3, 10 and 17, Harvill and Bowen teach:

The method of Claim 1, wherein the composite image file is represented as a data structure generated for the physical product (Harvill: Fig. 1, 102, "Customer uploads a finished digital graphical image to a server computer"; Fig. 1, 114, "User views translation of original image into screen print image and adjust settings if needed"; 116, "User selects substrate type, sizes, colors"; 118, "Customer image display on image of substrate"; 120, "Product purchase"; 122, "Server computer delivers digital files to screen print service");

wherein the composite image file is a layered image file that includes information about one or more components of the physical product and one or more characteristics of the physical product (Harvill: Fig. 1, 102, "Customer uploads a finished digital graphical image to a server computer"; para [0061], "Referring now to FIG. 2C, in step 280 a Color Count value is obtained. The Color Count value determines a number of colors, which correspond to layers of screen print ink, are to be used for printing the final image on a product");

wherein the composite image file comprises information about one or more layers (Harvill: para [0024], "A customer provides a draft image or sketch of the design they wish to be made into a screen print product to a screen printing service. This is usually done by digital means"; Abstract, "a layer or a plurality of layers in the image file,").

Regarding claims 4, 11 and 18, Harvill and Bowen teach:

The method of Claim 1, wherein the digital representation comprises layers data and paths data that, in combination, represent an interactive digital design corresponding to the physical product (Harvill: Fig. 1, 114, "User views translation of original image into screen print image and adjust settings if needed"; 116, "User selects substrate type, sizes, colors"; 118, "Customer image display on image of substrate"; 120, "Product purchase"; 122, "Server computer delivers digital files to screen print service");

wherein the digital representation is compatible with the one or more functionalities of the website (Harvill: Fig. 1, 114, "User views translation of original image into screen print image and adjust settings if needed"; 116, "User selects substrate type, sizes, colors"; 118, "Customer image display on image of substrate"; 120, "Product purchase"; 122, "Server computer delivers digital files to screen print service").
Regarding claims 5, 12 and 19, Harvill and Bowen teach:

The method of Claim 1, wherein the generating of the digital representation of the physical product comprises: controlling a flow of data associated with the one or more filters based on one or more controlling functionalities implemented in the one or more filters as they applied to the plurality of parts of the composite image file (see rejection under 35 USC 112; Harvill: Fig. 2B).

Regarding claims 6, 13 and 20, Harvill and Bowen teach:

The method of Claim 1, wherein the generating of the digital representation of the physical product comprises: applying the one or more filters to the plurality of parts of the composite image file to cause the one or more filters to process, in parallel, the plurality of parts of the composite image file and to generate a data structure that captures the composite image file of the physical product (Harvill: Fig. 2A; para [0053], "Filter the image so that contiguous color areas that are close in color space are preserved, and represented by a good candidate color"; para [0021], "producing custom screen printed products based on digital images. Disclosed techniques include a work flow for receiving user-supplied images and transforming the images into color separation data and related metadata that can drive screen printing, including previewing the appearance of the products as they will appear when screen printed, for user approval; finding key color components using a filtering and grouping approach; image resolution limiting; structured error diffusion halftoning with an improved random number generator for use in adding blue noise; error diffusion dot spread correction; and others");

wherein the processing of the plurality of parts of the composite image file by a filter of the one or more filters comprises: evaluating the plurality of parts of the composite image file to determine a type of processing for the plurality of parts of the composite image file (Harvill: para [0059], "Step 264 represents initiating a loop in which steps 266, 268, 270 are performed for each 3×3 sample area in the image. Thus, for each sample area in the source image, in step 266 a sample pixel is selected. In step 268, the process selects or finds the closest candidate pixel within the sample area that is not the sample pixel. 'Closest,' in this context, refers to closeness in color space. The particular pixel of the destination image that maps to the current sample area is set to the value of the found candidate pixel, at step 270. Processing continues at step 266 until all sample areas of the source image are processed"),

generating, based on the type of processing for the plurality of parts of the composite image file, result data for the plurality of parts of the composite image file (Harvill: para [0059], "Step 264 represents initiating a loop in which steps 266, 268, 270 are performed for each 3×3 sample area in the image. Thus, for each sample area in the source image, in step 266 a sample pixel is selected. In step 268, the process selects or finds the closest candidate pixel within the sample area that is not the sample pixel. 'Closest,' in this context, refers to closeness in color space. The particular pixel of the destination image that maps to the current sample area is set to the value of the found candidate pixel, at step 270. Processing continues at step 266 until all sample areas of the source image are processed"), and

transmitting the result data to one or more of the one or more filters (Harvill: para [0045], "In an embodiment, server computer 208 comprises key color component finding logic 212, resolution limiting logic 220, structured error diffusion halftoning logic 222, substrate selection and display logic 226, presentation logic 224, and screen print data generating logic 228. In an embodiment, key color component finding logic 212 comprises circuits, logic, stored computer programs or other software elements, or a combination, that are configured to perform a process of finding key color components in the user image file 204 for purposes of preparing the image for screen printing, and may implement the processes of FIG. 2B, FIG. 2C. Logic 212 may include or implement image filter logic 216 and grouping logic 218 for filtering image data and grouping pixels using the techniques herein to result in creating and storing a destination image 214 and related metadata identifying key color components of the user image file 204").

Regarding claims 7 and 14, Harvill and Bowen teach:

The method of Claim 1, wherein the composite image file that captures the physical product is a combination of automatically processed photography and manually processed photography (see rejection under 35 USC 112; Harvill: para [0007], "FIG. 1 illustrates an example automated process of screen printing using user-provided digital graphic images"; para [0023], "2. Example Manual Screen Printing Process").
Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NITHYA J. MOLL whose telephone number is (571) 270-1003. The examiner can normally be reached Monday-Friday 10am-6pm EST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Rehana Perveen, can be reached at 571-272-3676. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/NITHYA J. MOLL/
Primary Examiner, Art Unit 2189

Prosecution Timeline

May 29, 2022
Application Filed
Dec 10, 2025
Non-Final Rejection — §103, §112
Dec 30, 2025
Response Filed
Mar 02, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585839
PROCESS VARIABILITY SIMULATOR FOR MANUFACTURING PROCESSES
2y 5m to grant; granted Mar 24, 2026
Patent 12579338
TRANSITIONING SIMULATION ENTITIES BETWEEN SMART ENTITY STATUS AND DISCRETE ENTITY STATUS
2y 5m to grant; granted Mar 17, 2026
Patent 12579339
JET AIRCRAFT MANEUVERING CHARACTERISTIC SIMULATION SYSTEM FOR SINGLE PROPELLER AIRCRAFT AND SINGLE PROPELLER AIRCRAFT
2y 5m to grant; granted Mar 17, 2026
Patent 12558748
METHOD AND VARIABLE SYSTEM FOR ADJUSTING WORKPIECE-SUPPORTING MODULE
2y 5m to grant; granted Feb 24, 2026
Patent 12560739
MACHINE LEARNING OF GEOLOGY BY PROBABILISTIC INTEGRATION OF LOCAL CONSTRAINTS
2y 5m to grant; granted Feb 24, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 67%
With Interview: 81% (+13.6%)
Median Time to Grant: 3y 10m
PTA Risk: Moderate
Based on 530 resolved cases by this examiner. Grant probability derived from career allow rate.
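The footnote states that the grant probability is derived from the examiner's career allow rate, with the interview figure layering the observed lift on top. A minimal sketch of that derivation, using the counts shown earlier (355 granted of 530 resolved); treating the +13.6-point lift as additive is an assumption about how the tool combines the two numbers:

```python
granted, resolved = 355, 530
lift_points = 13.6  # observed interview lift, in percentage points

# Baseline grant probability = career allow rate.
base_pct = granted / resolved * 100
print(f"{base_pct:.0f}%")  # 67%

# With-interview projection: baseline plus the additive lift
# (assumption: the dashboard adds the lift in percentage points).
with_interview_pct = base_pct + lift_points
print(f"{with_interview_pct:.0f}%")  # 81%
```

Both printed values match the dashboard figures above, which supports reading the lift as additive rather than multiplicative.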
