Prosecution Insights
Last updated: April 19, 2026
Application No. 18/739,791

SYSTEMS AND METHODS FOR GENERATING CUSTOMIZABLE THREE-DIMENSIONAL ACCESSORY MODELS FROM TWO-DIMENSIONAL IMAGES OF AN ACCESSORY DESIGN

Non-Final OA — §103

Filed: Jun 11, 2024
Examiner: SUN, HAI TAO
Art Unit: 2616
Tech Center: 2600 — Communications
Assignee: Perfect Mobile Corp.
OA Round: 1 (Non-Final)

Grant Probability: 73% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 7m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% (347 granted / 476 resolved; +10.9% vs TC avg; above average)
Interview Lift: +26.6% among resolved cases with an interview (strong)
Avg Prosecution: 2y 7m typical timeline; 35 applications currently pending
Career History: 511 total applications across all art units

Statute-Specific Performance

§101: 6.9% (-33.1% vs TC avg)
§103: 65.8% (+25.8% vs TC avg)
§102: 2.3% (-37.7% vs TC avg)
§112: 15.9% (-24.1% vs TC avg)

Tech Center averages are estimates. Based on career data from 476 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-4, 7-11, 14-18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Stuyck (US 20230088866 A1) in view of Hancock (US 20210232720 A1).

Regarding claim 1, Stuyck discloses a method implemented in a computing device for virtual try-on (Fig. 1; [0018]: the computing device 101 processes the one or more accessed images 110 to generate a three-dimensional digital garment 120; generate a three-dimensional digital garment based on one or more 2D images; Fig. 5; [0025]: virtual try-on system; the generated digital garment 120 is used for an application such as a virtual try-on; Fig. 9; [0032-0033]: computer system 900 includes a processor 902, memory 904, storage 906), comprising:

processing a two-dimensional (2D) image depicting an image of a product design ([0018]: the computing device 101 processes the one or more accessed images 110 to generate a three-dimensional digital product; [0019]: the computing device 101 generates a front-segmentation mask identifying the front of the garment in the one or more accessed images; the front-segmentation mask may be two-dimensional; Fig. 1; [0020]: the computing device 101 generates a back panel of the garment; the back panel is a duplicate of the front panel; the computing device 101 accesses one or more second images comprising a back of the garment to generate the back panel of the garment; generate a back-segmentation mask identifying the back of the garment in the one or more second images);

extracting target design features of the product design depicted in the 2D image ([0019]: identify a front of the garment in the one or more RGB images; [0020]: identify the back of the garment in the one or more second images; [0022]: identify, i.e., extract, one or more pairs of boundary segments of the front panel 205 and the back panel 207; identify one or more pairs of boundary segments of the front panel and back panel in any suitable manner; boundary segments are target design features; Fig. 8; [0029]: identify, i.e., extract, one or more pairs of boundary segments of the front panel and the back panel);

converting the target design features into a three-dimensional (3D) product model ([0021]: the computing device 101 aligns the front panel 205 and the back panel 207 in a three-dimensional space so that the front panel is in front of a three-dimensional body 209 and the back panel 207 is behind the three-dimensional body 209; the three-dimensional body 209 is a three-dimensional template of a human body; Fig. 8; [0029]: align the front panel and the back panel in a three-dimensional space so that the front panel is in front of a three-dimensional body and the back panel is behind the three-dimensional body);

generating a surface attributes map based on the product design depicted in the 2D image (Fig. 2; [0019]: generate a front panel 205 of the garment corresponding to the front-segmentation mask 203; the front panel 205 comprises three-dimensional meshes; surface attributes map with color as illustrated in Fig. 2; Fig. 1; [0021]: a depth map of the user wearing the garment; Fig. 5; [0026]: observe collisions between meshes corresponding to the avatar 501 and meshes corresponding to the digital garment 503);

generating a joint based on the target design features and the 3D product model (Fig. 3; [0022]: the computing device 101 draws virtual lines that perpendicularly intersect with boundaries of the front panel 205 and the back panel 207; determine a continuous portion of the boundaries as a boundary segment 310; Fig. 4A; Fig. 4B; [0023]: the computing device 101 decreases the distance between each pair of boundary segments 310 of the front panel 205 and the back panel 207 at each iteration; join each pair of boundary segments); and

performing virtual application of the 3D product model with the surface attributes map and the joint on a user (Fig. 4A; Fig. 4B; [0023]: the garment has been deformed to fit to the three-dimensional body 209; the generated digital garment 120 is represented by three-dimensional meshes; [0024]: the computing device 101 deforms the digital garment 120 according to the second pose of the three-dimensional body 209; Fig. 5; [0025]: the generated digital garment 120 is used for an application such as a virtual try-on, where a user may drape the digital garment 120 to an avatar representing the user; [0026]: the computing device 101 detects the user's action to the piece of the digital garment by observing collisions between meshes corresponding to the avatar 501 and meshes corresponding to the digital garment 503).

Stuyck fails to explicitly disclose that the product is an accessory.
In the same field of endeavor, Hancock teaches that the product is an accessory ([0035]: the product is an article of jewelry such as a pendant, ring, earring, brooch, bracelet, necklace, watch, or the like; [0038]: enable more efficient customization of intricate, elaborate jewelry such as pendants, rings, earrings, brooches, bracelets, necklaces, etc.; Fig. 6A-6D; [0050-0051]: the information 140 is associated with the generic Base Model 210 and generic Subparts 230).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Stuyck such that the product is an accessory, as taught by Hancock. The motivation for doing so would have been to improve the process by which the 2D and 3D data is passed back and forth from the manufacturing entity to the entity customizing the product and back to the manufacturing entity; to display the product and create the digital models for manufacturing; to design generic and/or customized Base Model(s) and Subparts; and to create a custom brand by a jewelry designer, as taught by Hancock in paragraphs [0007], [0036], [0041], and [0054].

Regarding claim 2, Stuyck in view of Hancock discloses the method of claim 1, wherein the accessory design comprises at least one of: an earring, a necklace, a bracelet, or a wristband (Hancock; [0035]: the product is an article of jewelry such as a pendant, ring, earring, brooch, bracelet, necklace, watch, or the like; [0038]: enable more efficient customization of intricate, elaborate jewelry such as pendants, rings, earrings, brooches, bracelets, necklaces, etc.). The same motivation as for claim 1 applies here.

Regarding claim 3, Stuyck in view of Hancock discloses the method of claim 1, wherein the target design features comprise: a segmentation mask of the product design depicted in the 2D image (Stuyck; Fig. 2; [0019]: the computing device 101 generates a front-segmentation mask identifying the front of the garment in the one or more accessed images; the front-segmentation mask is two-dimensional; Fig. 8; [0029]: identify one or more pairs of boundary segments of the front panel and the back panel), an object contour of the product design depicted in the 2D image, and a skeletonization graph of the product design depicted in the 2D image (Stuyck; Fig. 2; [0020]: the computing device 101 may generate a back-segmentation mask identifying the back of the garment in the one or more second images; Fig. 5; [0025]: the generated digital garment 120 is used for an application such as a virtual try-on, where a user may drape the digital garment 120 to an avatar representing the user; [0026]: the computing device 101 detects the user's action to the piece of the digital garment by observing collisions between meshes corresponding to the avatar 501 and meshes corresponding to the digital garment 503; Fig. 8; [0029]: identify one or more pairs of boundary segments of the front panel and the back panel). Stuyck in view of Hancock further discloses that the product is an accessory (Hancock; [0035]: the product is an article of jewelry such as a pendant, ring, earring, brooch, bracelet, necklace, watch, or the like; [0038]: enable more efficient customization of intricate, elaborate jewelry such as pendants, rings, earrings, brooches, bracelets, necklaces, etc.). The same motivation as for claim 1 applies here.
Regarding claim 4, Stuyck in view of Hancock discloses the method of claim 1, wherein converting the target design features into the 3D accessory model (same as rejected in claim 1) comprises:

generating a 3D flat mesh in an x-y plane based on one of the target design features (Stuyck; [0019]: the front panel comprises three-dimensional flat meshes; [0020]: the back panel comprises three-dimensional flat meshes);

adjusting a depth of each vertex in the 3D flat mesh along a z-axis to generate a 3D curve mesh (Stuyck; [0021]: generate the three-dimensional body 209 based on a depth map of the user wearing the garment; align the front panel and the back panel in the three-dimensional space; Fig. 4A; Fig. 4B; [0023]: intermediate iteration among the plurality of simulation iterations for generating a digital garment; the computing device 101 decreases and adjusts the distance between each pair of boundary segments 310 of the front panel 205 and the back panel 207 at each iteration), wherein adjusting the depth of each vertex comprises adjusting the depth of each vertex along the z-axis based on a threshold and a distance in the x-y plane between the vertex in the 3D flat mesh and an object contour of target design features (Stuyck; Fig. 4A; Fig. 4B; [0023]: intermediate iteration among the plurality of simulation iterations for generating a digital garment; the computing device 101 decreases and adjusts the distance between each pair of boundary segments 310 of the front panel 205 and the back panel 207 at each iteration);

generating a mirrored duplicate of the 3D curve mesh along the z-axis (Stuyck; [0020]: the back panel is a duplicate of the front panel; the computing device 101 generates a back panel 207 by duplicating the front panel 205; Fig. 4B; [0023]: a final iteration of the plurality of simulation iterations for generating a digital garment; each pair of boundary segments 310 has been attached together; 3D curve mesh); and

merging the 3D curve mesh with the mirrored duplicate of the 3D curve mesh to generate the 3D product model (Stuyck; [0021]: the computing device 101 aligns the front panel 205 and the back panel 207 in a three-dimensional space so that the front panel is in front of a three-dimensional body 209 and the back panel 207 is behind the three-dimensional body 209; the three-dimensional body 209 may be a three-dimensional template of a human body; Fig. 4B; [0023]: a final iteration of the plurality of simulation iterations for generating a digital garment; each pair of boundary segments 310 has been attached together).

Stuyck in view of Hancock further discloses that the product is an accessory (Hancock; [0035]: the product is an article of jewelry such as a pendant, ring, earring, brooch, bracelet, necklace, watch, or the like; [0038]: enable more efficient customization of intricate, elaborate jewelry such as pendants, rings, earrings, brooches, bracelets, necklaces, etc.). The same motivation as for claim 1 applies here.

Regarding claim 7, Stuyck in view of Hancock discloses the method of claim 1, wherein generating the joint based on the target design features and the 3D accessory model (same as rejected in claim 1) comprises: determining the joint in the 3D product model by mapping the 2D joint to 3D space (Stuyck; Fig. 2; [0019]: the computing device 101 generates a front-segmentation mask identifying the front of the garment in the one or more accessed images; the front-segmentation mask is two-dimensional; [0021]: the computing device 101 aligns the front panel 205 and the back panel 207 in a three-dimensional space so that the front panel is in front of a three-dimensional body 209 and the back panel 207 is behind the three-dimensional body 209; the three-dimensional body 209 is a three-dimensional template of a human body).

Stuyck in view of Hancock further discloses: identifying a cycle portion and a non-cycle portion based on connectivity of a skeletonization graph of the target design features (Hancock; [0046]: specify 450 a Template 156 or one or more generic Assets 240; [0048]: the visualization Assets 240 are generated using a lower CPU; Fig. 6A-6D; [0050-0051]: the information 140 is associated with the generic Base Model 210, generic Subparts 230, and generic Assets 240); identifying a 2D joint comprising a connection between the cycle portion and the non-cycle portion (Hancock; Fig. 6A-6D; [0050-0051]: the information 140 is associated with the generic Base Model 210, generic Subparts 230, and generic Assets 240; the joint and connection are shown in Fig. 6A-6D); and determining the joint in the 3D accessory model by mapping the 2D joint to 3D space (Hancock; [0006]: bridge three-dimensional ("3D") and two-dimensional ("2D") technology to allow consumers to easily, quickly, and conveniently engage with products by customizing them online; [0036]: bridging three-dimensional ("3D") and two-dimensional ("2D") technology; [0037]: customization happens in 2D via image manipulation while being tracked in 3D for optimized efficiency). The same motivation as for claim 1 applies here.
Regarding claim 8, Stuyck discloses a system (Fig. 1; [0018]: the computing device 101 processes the one or more accessed images 110 to generate a three-dimensional digital garment 120; generate a three-dimensional digital garment based on one or more 2D images; Fig. 5; [0025]: virtual try-on system; the generated digital garment 120 is used for an application such as a virtual try-on; Fig. 9; [0032-0033]: computer system 900 includes a processor 902, memory 904, storage 906), comprising: a memory storing instructions (Fig. 9; [0032]: computer system 900 includes a processor 902, memory 904, storage 906; [0033]: processor 902 includes hardware for executing instructions; processor 902 may retrieve the instructions from an internal register, an internal cache, memory 904); and a processor coupled to the memory and configured by the instructions to at least (Fig. 9; [0032]: computer system 900 includes a processor 902, memory 904, storage 906; [0033]: processor 902 includes hardware for executing instructions). The remaining claim limitations are similar to those recited in claim 1; therefore, the same rationale used to reject claim 1 also applies to claim 8.

Regarding claim 9, Stuyck in view of Hancock discloses the system of claim 8. The remaining claim limitations are similar to those recited in claim 2; therefore, the same rationale used to reject claim 2 also applies to claim 9.

Regarding claim 10, Stuyck in view of Hancock discloses the system of claim 8. The remaining claim limitations are similar to those recited in claim 3; therefore, the same rationale used to reject claim 3 also applies to claim 10.

Regarding claim 11, Stuyck in view of Hancock discloses the system of claim 8, wherein the processor is configured to convert the target design features into the 3D accessory model (same as rejected in claim 8). The remaining claim limitations are similar to those recited in claim 4; therefore, the same rationale used to reject claim 4 also applies to claim 11.

Regarding claim 14, Stuyck in view of Hancock discloses the system of claim 8, wherein the processor is configured to generate the joint based on the target design features and the 3D accessory model (same as rejected in claim 8). The remaining claim limitations are similar to those recited in claim 7; therefore, the same rationale used to reject claim 7 also applies to claim 14.

Regarding claim 15, Stuyck discloses a non-transitory computer-readable storage medium storing instructions to be implemented by a computing device having a processor, wherein the instructions, when executed by the processor, cause the computing device to at least ([0019]: the computing device 101 generates a front-segmentation mask identifying the front of the garment in the one or more accessed images; the front-segmentation mask may be two-dimensional; Fig. 1; [0020]: the computing device 101 generates a back panel of the garment; the back panel is a duplicate of the front panel; the computing device 101 accesses one or more second images comprising a back of the garment to generate the back panel of the garment; generate a back-segmentation mask identifying the back of the garment in the one or more second images; Fig. 9; [0032]: computer system 900 includes a processor 902, memory 904, storage 906; [0033]: processor 902 includes hardware for executing instructions; processor 902 may retrieve the instructions from an internal register, an internal cache, memory 904). The remaining claim limitations are similar to those recited in claim 1; therefore, the same rationale used to reject claim 1 also applies to claim 15.

Regarding claim 16, Stuyck in view of Hancock discloses the non-transitory computer-readable storage medium of claim 15. The remaining claim limitations are similar to those recited in claim 2.
Therefore, the same rationale used to reject claim 2 also applies to claim 16.

Regarding claim 17, Stuyck in view of Hancock discloses the non-transitory computer-readable storage medium of claim 15. The remaining claim limitations are similar to those recited in claim 3; therefore, the same rationale used to reject claim 3 also applies to claim 17.

Regarding claim 18, Stuyck in view of Hancock discloses the non-transitory computer-readable storage medium of claim 15, wherein the processor is configured by the instructions (same as rejected in claim 15). The remaining claim limitations are similar to those recited in claim 4; therefore, the same rationale used to reject claim 4 also applies to claim 18.

Regarding claim 20, Stuyck in view of Hancock discloses the non-transitory computer-readable storage medium of claim 15, wherein the processor is configured by the instructions (same as rejected in claim 15). The remaining claim limitations are similar to those recited in claim 7; therefore, the same rationale used to reject claim 7 also applies to claim 20.

Claims 5-6, 12-13, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Stuyck (US 20230088866 A1) in view of Hancock (US 20210232720 A1), and further in view of Tung (US 11158121 B1).

Regarding claim 5, Stuyck in view of Hancock discloses the method of claim 1, but fails to explicitly disclose wherein the surface attributes map comprises at least one of: an albedo map, metallic appearance attributes, a roughness map, and a normal map. In the same field of endeavor, Tung teaches wherein the surface attributes map comprises at least one of: an albedo map, metallic appearance attributes, a roughness map, and a normal map (col. 4, lines 5-20: fine geometric details are added to normal maps generated by using a conditional adversarial network; tackle 3D surface geometry refinement using deep neural network on normal maps for realistic garment reconstruction). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Stuyck in view of Hancock to include wherein the surface attributes map comprises at least one of: an albedo map, metallic appearance attributes, a roughness map, and a normal map, as taught by Tung. The motivation for doing so would have been to tackle 3D surface geometry refinement using a deep neural network on normal maps for realistic garment reconstruction, and to improve the realism of the final rendered product, as taught by Tung in col. 4, lines 5-20 and col. 13, lines 25-35.

Regarding claim 6, Stuyck in view of Hancock and Tung discloses the method of claim 5, wherein the surface attributes map is generated by an artificial intelligence (AI) model (Tung; col. 4, lines 5-20: fine geometric details are added to normal maps generated by using a conditional adversarial network, i.e., artificial intelligence; tackle 3D surface geometry refinement using deep neural network on normal maps for realistic garment reconstruction). The same motivation as for claim 5 applies here.

Regarding claim 12, Stuyck in view of Hancock discloses the system of claim 8. The claim limitations are similar to those recited in claim 5; therefore, the same rationale used to reject claim 5 also applies to claim 12.

Regarding claim 13, Stuyck in view of Hancock and Tung discloses the system of claim 12. The claim limitations are similar to those recited in claim 6; therefore, the same rationale used to reject claim 6 also applies to claim 13.
Regarding claim 19, Stuyck in view of Hancock discloses the non-transitory computer-readable storage medium of claim 15. The remaining claim limitations are similar to those recited in claim 5; therefore, the same rationale used to reject claim 5 also applies to claim 19.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Hai Tao Sun, whose telephone number is (571) 272-5630. The examiner can normally be reached 9:00 AM-6:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Daniel Hajnik, can be reached at (571) 272-7642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HAI TAO SUN/
Primary Examiner, Art Unit 2616
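The claim 4 pipeline the examiner maps above (generate a 3D flat mesh in the x-y plane, push each vertex along the z-axis based on a threshold and its in-plane distance to the object contour, mirror the result along z, and merge the two halves) can be sketched as follows. This is a hypothetical illustration only: the function names and the clamped-distance depth heuristic are assumptions, not code from the application or from either cited reference.

```python
import numpy as np

def inflate(verts_xy, contour_xy, max_depth=1.0, threshold=0.5):
    """Assign each x-y vertex a z depth from its distance to the contour."""
    # Distance from each vertex to the nearest contour point.
    d = np.min(
        np.linalg.norm(verts_xy[:, None, :] - contour_xy[None, :, :], axis=2),
        axis=1,
    )
    # Depth grows with distance to the contour, clamped by the threshold.
    z = max_depth * np.clip(d / threshold, 0.0, 1.0)
    return np.column_stack([verts_xy, z])  # the 3D curve mesh

def mirror_and_merge(curve):
    mirrored = curve * np.array([1.0, 1.0, -1.0])  # mirrored duplicate along z
    return np.vstack([curve, mirrored])            # merged 3D model

verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.5]])
contour = np.array([[0.0, 0.0], [1.0, 0.0]])
model = mirror_and_merge(inflate(verts, contour))
print(model.shape)  # (6, 3): front half plus its mirrored back half
```

Contour vertices stay at z = 0 under this heuristic, so the two halves meet at the silhouette, which is where the claimed method joins the boundary segments.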

Prosecution Timeline

Jun 11, 2024: Application Filed
Feb 12, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602816: SIMULATED CONFIGURATION EVALUATION APPARATUS AND METHOD (granted Apr 14, 2026; 2y 5m to grant)
Patent 12603024: DISPLAY CONTROL DEVICE (granted Apr 14, 2026; 2y 5m to grant)
Patent 12586310: APPARATUS AND METHOD WITH IMAGE PROCESSING (granted Mar 24, 2026; 2y 5m to grant)
Patent 12578846: GENERATING MASKED REGIONS OF AN IMAGE USING A PREDICTED USER INTENT (granted Mar 17, 2026; 2y 5m to grant)
Patent 12579727: APPARATUS AND METHOD FOR ASYNCHRONOUS RAY TRACING (granted Mar 17, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 73%
With Interview: 99% (+26.6%)
Median Time to Grant: 2y 7m
PTA Risk: Low

Based on 476 resolved cases by this examiner. Grant probability is derived from the career allow rate.
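The headline projection figures are simple ratios over the examiner's resolved cases. A minimal sketch of how they plausibly fit together (the variable names and the additive interview-lift model are assumptions, not the report's actual methodology):

```python
# Hypothetical reconstruction of the report's headline figures.
granted, resolved = 347, 476          # career data cited above

allow_rate = granted / resolved       # ~0.729, shown as "73% Grant Probability"
interview_lift = 0.266                # "+26.6%" lift for cases with an interview
with_interview = allow_rate + interview_lift  # ~0.995, shown as "99%"

print(f"{allow_rate:.0%} base, {with_interview:.0%} with interview")
# -> 73% base, 99% with interview
```

The "+10.9% vs TC avg" delta would follow the same pattern, comparing the 73% allow rate against an estimated Tech Center average of roughly 62%.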
