Prosecution Insights
Last updated: April 19, 2026
Application No. 18/421,534

METHOD AND APPARATUS FOR ESTIMATING PHYSICAL PROPERTY PARAMETER OF TARGET FABRIC

Status: Non-Final OA (§103)
Filed: Jan 24, 2024
Examiner: LEE, BENEDICT E
Art Unit: 2665
Tech Center: 2600 — Communications
Assignee: Clo Virtual Fashion Inc.
OA Round: 1 (Non-Final)
Grant Probability: 87% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 0m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 87% (92 granted / 106 resolved; +24.8% vs TC avg) — above average
Interview Lift: +14.8% for resolved cases with interview (moderate, ~+15% lift)
Typical Timeline: 3y 0m average prosecution; 16 currently pending
Career History: 122 total applications across all art units
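For readers auditing these figures, the headline allow rate can be reproduced as a simple ratio of granted to resolved cases. The sketch below assumes that reading, and treats "+24.8% vs TC avg" as a percentage-point delta; neither assumption is documented by the dashboard itself.

```python
# Sketch: reproduce the dashboard's career allow rate from its raw counts.
# Assumes the figure is granted / resolved, and that "+24.8% vs TC avg" is a
# percentage-point delta -- both are readings of the dashboard, not documented.
granted, resolved = 92, 106

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 86.8%, displayed as 87%

implied_tc_avg = allow_rate - 0.248
print(f"Implied Tech Center average: {implied_tc_avg:.1%}")  # about 62.0%
```

The implied ~62% Tech Center average is only a consistency check on the stated delta, not a figure the dashboard reports directly.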

Statute-Specific Performance

§101: 7.6% (-32.4% vs TC avg)
§103: 50.7% (+10.7% vs TC avg)
§102: 31.8% (-8.2% vs TC avg)
§112: 7.3% (-32.7% vs TC avg)
Deltas are vs. the Tech Center average estimate • Based on career data from 106 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. § 119(a)-(d). The certified copy has been filed in parent Application No. KR10-2023-0010372, filed on 01/26/2023.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1–2, 11–13 and 15–16 are rejected under 35 U.S.C. § 103 as being unpatentable over Chao (U.S. 12,154,316 B2) in view of Ayush et al. (U.S. 11,080,817 B2).

Regarding claim 1, Chao discloses a method of estimating one or more physical property parameters of a target fabric, the method comprising: receiving a two-dimensional (2D) image capturing a draped shape of the target fabric and basic information of the target fabric. (Per Fig. 1, Chao’s image capturing apparatus 110 receives a two-dimensional image of a fabric and analyzes its information. The image processing module 121 may automatically analyze the fabric image to obtain a plurality of pieces of corresponding fabric information. Chao col. 3 lines 8–33.)

However, Chao fails to specifically disclose estimating the one or more physical property parameters of the target fabric by applying the 2D image and the basic information of the target fabric to a neural network model, the one or more physical property parameters comprising at least one of a stretching parameter of the target fabric or a bending parameter of the target fabric; and outputting the one or more physical property parameters of the target fabric.

In related art, Ayush discloses estimating the one or more physical property parameters of the target fabric by applying the 2D image and the basic information of the target fabric to a neural network model, the one or more physical property parameters comprising at least one of a bending parameter of the target fabric¹; and (Per Fig. 4, Ayush’s regression network 410 applies transformation parameters to warping module 412 such that related parameters are rendered to analyze the warped 2D image of the target clothing 402. Ayush col. 11 line 62 – col. 12 line 6. The TPS warping module 412 uses the spatial parameters to warp the two-dimensional image of the target clothing 402 and output the warped target clothing 210.) outputting the one or more physical property parameters of the target fabric. (Per Fig. 4, Ayush discloses the warped target clothing 210 after passing the parameter through his regression network 410. Id. The TPS warping module 412 first estimates an affine transformation for generating the warped target clothing 210.)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teachings of Ayush into the teachings of Chao to create a warped image of a target cloth using multi-scale patch adversarial loss. Id. col. 1 lines 46–67.

Regarding claim 15, Chao discloses a non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to: receive a two-dimensional (2D) image capturing a draped shape of a target fabric and basic information of the target fabric. (Per Fig. 1, Chao’s image capturing apparatus 110 receives a two-dimensional image of a fabric and analyzes its information. The image processing module 121 may automatically analyze the fabric image to obtain a plurality of pieces of corresponding fabric information. Chao col. 3 lines 8–33.)

However, Chao fails to specifically disclose estimate one or more physical property parameters of the target fabric by applying the 2D image and the basic information of the target fabric to a neural network model, the one or more physical property parameters comprising at least one of a stretching parameter of the target fabric or a bending parameter of the target fabric; and output the one or more physical property parameters of the target fabric.

In related art, Ayush discloses estimate one or more physical property parameters of the target fabric by applying the 2D image and the basic information of the target fabric to a neural network model, the one or more physical property parameters comprising at least one of a bending parameter of the target fabric; and (Per Fig. 4, Ayush’s regression network 410 applies transformation parameters to warping module 412 such that related parameters are rendered to analyze the warped 2D image of the target clothing 402. Ayush col. 11 line 62 – col. 12 line 6. The TPS warping module 412 uses the spatial parameters to warp the two-dimensional image of the target clothing 402 and output the warped target clothing 210.) output the one or more physical property parameters of the target fabric. (Per Fig. 4, Ayush discloses the warped target clothing 210 after passing the parameter through his regression network 410. Id. The TPS warping module 412 first estimates an affine transformation for generating the warped target clothing 210.)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teachings of Ayush into the teachings of Chao to create a warped image of a target cloth using multi-scale patch adversarial loss. Id. col. 1 lines 46–67.

Regarding claim 16, Chao discloses an apparatus for estimating one or more physical property parameters of a target fabric, the apparatus comprising: a communication interface configured to receive a two-dimensional (2D) image capturing a draped shape of the target fabric and basic information of the target fabric. (Per Fig. 1, Chao’s image capturing apparatus 110 receives a two-dimensional image of a fabric and analyzes its information. The image processing module 121 may automatically analyze the fabric image to obtain a plurality of pieces of corresponding fabric information. Chao col. 3 lines 8–33.)

However, Chao fails to specifically disclose a processor configured to estimate the one or more physical property parameters of the target fabric by applying the 2D image and the basic information of the target fabric to a neural network model, the one or more physical property parameters comprising at least one of a stretching parameter of the target fabric or a bending parameter of the target fabric; and an output device configured to output the one or more physical property parameters.

In related art, Ayush discloses a processor configured to estimate the one or more physical property parameters of the target fabric by applying the 2D image and the basic information of the target fabric to a neural network model, the one or more physical property parameters comprising at least one of a bending parameter of the target fabric; and (Per Fig. 4, Ayush’s regression network 410 applies transformation parameters to warping module 412 such that related parameters are rendered to analyze the warped 2D image of the target clothing 402. Ayush col. 11 line 62 – col. 12 line 6. The TPS warping module 412 uses the spatial parameters to warp the two-dimensional image of the target clothing 402 and output the warped target clothing 210.) an output device configured to output the one or more physical property parameters. (Per Fig. 4, Ayush discloses the warped target clothing 210 after passing the parameter through his regression network 410. Id. The TPS warping module 412 first estimates an affine transformation for generating the warped target clothing 210.)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teachings of Ayush into the teachings of Chao to create a warped image of a target cloth using multi-scale patch adversarial loss. Id. col. 1 lines 46–67.

Regarding claim 2, Chao as modified by Ayush discloses the method, wherein the basic information of the target fabric comprises: at least one of a type of the target fabric, composition of the target fabric, a density of the target fabric, a weight of the target fabric, a thickness of the target fabric, dyeing of the target fabric, and adding of printing to the target fabric. (Per Fig. 5, Ayush discloses a composition mask 504 to analyze a synthesized cloth image. Ayush col. 13 lines 20–30. The composition mask 504 is useable by the try-on module 120 to generate the synthesized image 110, which includes the person image 506 and the warped target clothing 210 fused together as a synthesized final result.)

Regarding claim 11, Chao as modified by Ayush discloses the method, wherein the outputting the one or more physical property parameters of the target fabric comprises: displaying a 3D drape simulation result corresponding to 3D clothes by applying the one or more physical property parameters to the target fabric used for the 3D clothes draped on an object. (Per Fig. 6, Chao discloses a 3D fabric model built by a normal map and a roughness map. Chao col. 8 line 54 – col. 9 line 8. [t]he three-dimensional model modeling software may perform three-dimensional model modeling according to the normal map and roughness map in each fabric file to generate a simulated three-dimensional fabric model.)

Regarding claim 12, Chao as modified by Ayush discloses the method, wherein the stretching parameter comprises at least one of a weft stretch force parameter, a warp stretch force parameter, or a shear parameter. (Per Fig. 1, Ayush’s sampling module 116 discloses a ground truth warped cloth. Ayush col. 5 line 55 – col. 6 line 8. The sampling module 116 is configured to receive the warped cloth and the ground truth warped cloth from the representation module 114 and sample pairs of patches from corresponding same location of the warped cloth and the ground truth warped cloth.)

Regarding claim 13, it has been rejected in the same manner as claim 12.

Claims 3–6, 10 and 14 are rejected under 35 U.S.C. § 103 as being unpatentable over Chao in view of Ayush and further in view of Wang (CN114925600A).

Regarding claim 3, Chao as modified by Ayush discloses the method, wherein the neural network model comprises at least one of: a regression model configured to estimate the stretching parameter related to stretching of the target fabric, based on the basic information of the target fabric. (Chao discloses a roughness map 305 in his generative network model discriminating different weaved fabric in order that features of a physical fabric are generated. Chao col. 7 lines 38–60. [t]he image processing module 321 provided by the disclosure may automatically generate a light and shadow feature (normal map 304) and a gray-scale feature (roughness map 305) that may faithfully reflect the features of the physical fabric.)

However, Chao as modified by Ayush fails to specifically disclose an estimation model trained to estimate the bending parameter related to bending of the target fabric, based on the latent vectors and the stretching parameter.

In related art, Wang discloses an estimation model trained to estimate the bending parameter related to bending of the target fabric, based on the latent vectors and the stretching parameter. (Wang discloses a parameter vector from his trained VAE model to obtain bending stiffness. Wang ¶ 0066. [a]nd the bending stiffness of the real fabric to be measured is obtained using the learned deep neural network.)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teachings of Wang into the teachings of Chao and Ayush to measure bending stiffness of draped fabrics. Id. ¶ 0007.

Regarding claim 4, it has been rejected in the same manner as claim 3.

Regarding claim 5, Chao as modified by Ayush and Wang discloses the method, wherein the estimating the stretching parameter comprises: encoding the basic information of the target fabric into features; and (Per Fig. 1, Chao’s image capturing apparatus 110 receives a two-dimensional image of a fabric and analyzes its information. The image processing module 121 may automatically analyze the fabric image to obtain a plurality of pieces of corresponding fabric information. Chao col. 3 lines 8–33.) estimating the stretching parameter by feeding the encoded features to the regression model. (Per Fig. 4, Ayush’s regression network 410 applies transformation parameters to warping module 412 such that related parameters are rendered to analyze the warped 2D image of the target clothing 402. Ayush col. 11 line 62 – col. 12 line 6. The TPS warping module 412 uses the spatial parameters to warp the two-dimensional image of the target clothing 402 and output the warped target clothing 210.)

Regarding claim 6, Chao as modified by Ayush and Wang discloses the method, wherein the encoding comprises encoding the features indicating a type of the target fabric among the basic information of the target fabric. (Per Fig. 1, Chao’s image capturing apparatus 110 receives a two-dimensional image of a fabric and analyzes its information. The image processing module 121 may automatically analyze the fabric image to obtain a plurality of pieces of corresponding fabric information. Chao col. 3 lines 8–33.)

Regarding claim 10, Chao as modified by Ayush discloses the claimed invention, but fails to specifically disclose the method, wherein the stretching parameter correlates with the basic information of the fabric, and the bending parameter correlates with the draped shape of the target fabric.

In related art, Wang discloses the method, wherein the stretching parameter correlates with the basic information of the fabric, and the bending parameter correlates with the draped shape of the target fabric. (Per Fig. 3, Wang discloses a bending stiffness of draped fabrics. Wang ¶ 0065. [a] learning-based method for measuring the bending stiffness of draped fabrics includes:)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teachings of Wang into the teachings of Chao and Ayush to measure bending stiffness of draped fabrics. Id. ¶ 0007.

Regarding claim 14, Chao as modified by Ayush discloses the claimed invention, but fails to specifically disclose the method, wherein the 2D image comprises a top view image capturing the draped shape of a circular specimen of the target fabric.

In related art, Wang discloses the method, wherein the 2D image comprises a top view image capturing the draped shape of a circular specimen of the target fabric. (Wang discloses a multi-view map simulating fabric dataset. Wang ¶ 0066. [a] multi-view depth map is generated using the simulation dataset;)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teachings of Wang into the teachings of Chao and Ayush to measure bending stiffness of draped fabrics. Id. ¶ 0007.

Allowable Subject Matter

Claims 7–9 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Das et al. (U.S. 10,990,718 B2) discloses a method and system for generating physical design parameters of an object.

Contact

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BENEDICT LEE whose telephone number is (571) 270-0390. The examiner can normally be reached 10:00-16:00 (EST).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Stephen R. Koziol, can be reached at (408) 918-7630. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BENEDICT E LEE/
Examiner, Art Unit 2665

/Stephen R Koziol/
Supervisory Patent Examiner, Art Unit 2665

¹ Under broadest reasonable interpretation (BRI), Examiner construes a bending parameter of the target fabric as a shear bending force parameter. See Applicant’s ¶ 0058. Ayush analogously teaches modeling shear of the warped target clothing in his neural network model 412. See his Fig. 4 and col. 11 line 62 – col. 12 line 6.

Prosecution Timeline

Jan 24, 2024
Application Filed
Feb 23, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12567243
METHOD FOR OPTIMIZING DATA TO BE USED TO TRAIN OBJECT RECOGNITION MODEL, METHOD FOR BUILDING OBJECT RECOGNITION MODEL, AND METHOD FOR RECOGNIZING AN OBJECT
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12561958
METHOD OF TRAINING SEMICONDUCTOR PROCESS IMAGE GENERATOR
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12561215
GRAPH MACHINE LEARNING FOR CASE SIMILARITY
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12548170
METHOD, DEVICE AND SYSTEM FOR REAL-TIME MULTI-CAMERA TRACKING OF A TARGET OBJECT
Granted Feb 10, 2026 (2y 5m to grant)
Patent 12541999
METHOD FOR EMOTION RECOGNITION BASED ON HUMAN-OBJECT TIME-SPACE INTERACTION BEHAVIOR
Granted Feb 03, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 87%
With Interview: 99% (+14.8%)
Median Time to Grant: 3y 0m
PTA Risk: Low
Based on 106 resolved cases by this examiner. Grant probability derived from career allow rate.
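The 99% "with interview" projection is consistent with treating the +14.8% lift as a percentage-point addition to the 87% baseline, capped at a 99% display ceiling. That formula is an assumption made to match the displayed numbers, not the tool's published model; a minimal sketch:

```python
# Hypothetical projection formula (an assumption, not the vendor's documented
# model): baseline grant probability plus the interview lift in percentage
# points, capped so the displayed value never exceeds 99%.
def project_with_interview(baseline: float, lift_points: float, cap: float = 0.99) -> float:
    return min(baseline + lift_points, cap)

p = project_with_interview(0.87, 0.148)  # 0.87 + 0.148 = 1.018, capped
print(f"Projected grant probability with interview: {p:.0%}")  # 99%
```

Because the uncapped sum exceeds 100%, the additive-with-cap reading is only one plausible reconstruction; a relative (multiplicative) lift would land near the same displayed figure.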
