Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
2. Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Claim Rejections - 35 USC § 102
3. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
4. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
5. Claims 1-6 and 13-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by US 20240420225 A1 (Vidith et al., hereinafter Vidith).
Regarding claim 1, Vidith teaches A computer-implemented method to validate a clothing item for a three-dimensional (3D) avatar usable in a virtual experience, the computer-implemented method comprising: performing at least one static validation check on the clothing item to validate the clothing item when layered over an underlying surface, based on at least one property of the clothing item from the group comprising: an inner cage of the clothing item, an outer cage of the clothing item, a reference mesh of the clothing item, and combinations thereof; (Par 45 “The algorithm may also include comparing the calculated fit value to an ideal fit value (step 708) and providing feedback to the user based on the comparison (step 718). For example, the user may be provided with information via the display 122 that indicates the selected size of the garment is a good fit if the calculated fit value does not deviate from the ideal fit value by more than a given threshold, which threshold may be set by the programmer. Alternatively, the algorithm may include suggesting a different size of the garment to the user in response to the calculated fit value deviating from the ideal fit value by more than the given threshold.”)
and in response to detecting at least one failure result from the at least one static validation check, providing an identified issue based on the at least one static validation check having the at least one failure result. (Par 66 “However, as noted above, if the best calculated fit value is not within a threshold of the ideal fit value, the selected size is not ideal for the scaled base model. Dependent on whether the deviation from the ideal fit value is positive or negative, the user may be provided with a suggestion of a different size to try at step 718.”)
Regarding claim 2, Vidith teaches The computer-implemented method of claim 1, wherein providing the identified issue comprises providing to a user at least one from the group comprising: a description of the identified issue, a description of a remedy for the identified issue, a visualization of an area of the clothing item affected by the identified issue, and combinations thereof. (Par 45 “The algorithm may also include comparing the calculated fit value to an ideal fit value (step 708) and providing feedback to the user based on the comparison (step 718). For example, the user may be provided with information via the display 122 that indicates the selected size of the garment is a good fit if the calculated fit value does not deviate from the ideal fit value by more than a given threshold, which threshold may be set by the programmer. Alternatively, the algorithm may include suggesting a different size of the garment to the user in response to the calculated fit value deviating from the ideal fit value by more than the given threshold.”)
Regarding claim 3, Vidith teaches The computer-implemented method of claim 1, further comprising automatically performing an automatic remedy for the identified issue. (Par 46 “It may be that the initial superimposition of the base product mesh of the garment on the base model does not represent the best fit, but that the actual garment itself will fit the user well based on material constants of the actual garment. For example, if a garment is very stretchy, it may fit a user even if the initial superimposition shows the base product mesh will be too small for the base model. Thus, the algorithm may include adjusting locations of the predefined reference points on the inside face of the base product mesh based on given material constants associated with the base product mesh, as shown at 712.”)
Regarding claim 4, Vidith teaches The computer-implemented method of claim 1, further comprising flagging a manual remedy for the identified issue for performance by a user. (Par 45 “The algorithm may also include comparing the calculated fit value to an ideal fit value (step 708) and providing feedback to the user based on the comparison (step 718). For example, the user may be provided with information via the display 122 that indicates the selected size of the garment is a good fit if the calculated fit value does not deviate from the ideal fit value by more than a given threshold, which threshold may be set by the programmer. Alternatively, the algorithm may include suggesting a different size of the garment to the user in response to the calculated fit value deviating from the ideal fit value by more than the given threshold.”)
Regarding claim 5, Vidith teaches the computer-implemented method of claim 1, wherein performing the at least one static validation check comprises performing at least one from the group comprising: an import check, a cage edit check, a user-generated content (UGC) check, and combinations thereof. (Par 64 “the algorithm will adjust the points of the base product mesh by scaling and changing the coordinates based on the material constants of each material type in each layer of the garment. The algorithm will thereafter return to step 706 and recalculate the fit value of the base product mesh after adjusting the locations of the predefined reference points on the inside face of the base product mesh.”)
Regarding claim 6, Vidith teaches the computer-implemented method of claim 1, wherein performing the at least one static validation check comprises identifying at least one from the group comprising: a cage UV modification, a cage and mesh intersection, a modification of an outer cage area that does not correspond to an accessory, a presence of a bloating cage, a presence of non-manifold and hole occurrences, and combinations thereof. (Par 50 “Each base product mesh representing a garment will have also a fit value, as noted hereinabove. For each coordinate of the garment, the distance between the base product mesh and the skin (i.e. the “collision”) is calculated while fitting the base product mesh on the scaled base model (see step 706).”)
Regarding claim 13, Vidith teaches the computer-implemented method of claim 1, wherein in response to at least one result of the at least one static validation check indicating that the clothing item satisfies a threshold number of static validation checks of the at least one static validation check, importing the clothing item into the virtual experience, wherein after the importing, the clothing item is available to be worn by an avatar that participates in the virtual experience (Par 68 “The method 800 is carried out by a processor and comprises scaling a base model depending on a brassiere band size and a brassiere cup size of the user, as shown at 802. The method includes rendering a garment on the scaled base model by superimposing a product mesh on the scaled base model, as shown at 804. … The method includes calculating a fit value of the product mesh on the scaled base model by calculating distances between predefined reference points on a skin of the scaled base model and corresponding predefined reference points on an inside face of the product mesh, as shown at 806. The method includes comparing the calculated fit value to an ideal fit value, as shown at 808, and providing feedback to the user based on the comparison, as shown at 810”).
Regarding claim 14, the non-transitory CRM claim 14 is similar in scope to the method claim 1, and is rejected under similar rationale.
Regarding claim 15, the non-transitory CRM claim 15 is similar in scope to the method claim 2, and is rejected under similar rationale.
Regarding claim 16, the non-transitory CRM claim 16 is similar in scope to the method claim 5, and is rejected under similar rationale.
Regarding claim 17, the non-transitory CRM claim 17 is similar in scope to the method claim 6, and is rejected under similar rationale.
Regarding claim 18, the system claim 18 is similar in scope to the method claim 1, and is rejected under similar rationale.
Regarding claim 19, the system claim 19 is similar in scope to the method claim 5, and is rejected under similar rationale.
Regarding claim 20, the system claim 20 is similar in scope to the method claim 6, and is rejected under similar rationale.
Claim Rejections - 35 USC § 103
6. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
7. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
8. Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Vidith as applied to claim 6 above, and further in view of US 20070273711 A1 (Maffei), and further in view of Carmen Cincotti (hereinafter Cincotti).
Regarding claim 7, Vidith teaches The computer-implemented method of claim 6, but fails to explicitly teach wherein identifying the cage UV modification comprises: creating a spatial hash map for a UV map of the inner cage of the clothing item based on values for vertices of the UV map of the inner cage of the clothing item; and detecting collisions of the UV map of the outer cage of the clothing item with corresponding vertices of the UV map of the inner cage of the clothing item based on the spatial hash map. In related field of endeavor, Maffei teaches detecting collisions of the UV map of the outer cage of the clothing item with corresponding vertices of the UV map of the inner cage of the clothing item (Par 8 “determining if one or more slicer polygons associated with the outermost layer clothing model intersect the inner layer clothing model if the inner layer clothing model is encapsulated by the outermost layer clothing model, and excluding further processing of the inner layer clothing model if none of the slicer polygons associated with the outermost layer clothing model intersect the inner layer clothing model.”)
It would have been obvious to one of ordinary skill in the art at the time of filing to have modified Vidith to include detecting collisions of the UV map of the outer cage of the clothing item with corresponding vertices of the UV map of the inner cage of the clothing item. Doing so would allow for testing for visual occlusions of 3D clothing models on a 3D body model (Par 8 “there is a method of testing for visual occlusion of layered three dimensional (3D) clothing models on a 3D body model”)
Further regarding claim 7, Vidith as modified by Maffei fails to explicitly teach creating a spatial hash map for a UV map of the inner cage of the clothing item based on values for vertices of the UV map of the inner cage of the clothing item, and detecting collisions … based on the spatial hash map. In related field of endeavor, Cincotti teaches creating a spatial hash map of a clothing item to detect self collisions (Par 2 “Spatial hash maps will allow us to implement a very important feature of cloth simulations : self collisions.”).
It would have been obvious to one of ordinary skill in the art at the time of filing to have modified Vidith as modified by Maffei as described above to create a spatial hash map of a clothing item to detect self collisions. Doing so would allow nearby vertices to be detected while keeping time complexity fast (How to query a spatial hash table “queries to a spatial hash table in order to detect particles close to each other, while keeping time complexity fast.”).
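For clarity, the spatial-hashing technique Cincotti describes can be illustrated with a minimal Python sketch. This is purely illustrative and not drawn from the cited reference: the function names, the cell size, and the use of 2D UV coordinates are all assumptions. The idea is to quantize each UV vertex into a grid cell so that nearby vertices can be found by inspecting only a cell and its immediate neighbors, rather than comparing all vertex pairs.

```python
from collections import defaultdict

def build_spatial_hash(vertices, cell_size):
    """Map each UV vertex index into a grid cell keyed by quantized coordinates."""
    grid = defaultdict(list)
    for i, (u, v) in enumerate(vertices):
        key = (int(u // cell_size), int(v // cell_size))
        grid[key].append(i)
    return grid

def query_nearby(grid, point, cell_size):
    """Return vertex indices in the query point's cell and its eight neighbors."""
    cu, cv = int(point[0] // cell_size), int(point[1] // cell_size)
    hits = []
    for du in (-1, 0, 1):
        for dv in (-1, 0, 1):
            hits.extend(grid.get((cu + du, cv + dv), []))
    return hits
```

Each query then touches at most nine cells, which is what keeps the time complexity fast relative to an all-pairs comparison.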
9. Claims 8 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Vidith as applied to claim 6 above, and further in view of US 20070273711 A1 (Maffei).
Regarding claim 8, Vidith teaches The computer-implemented method of claim 6, but fails to explicitly teach wherein identifying the cage and mesh intersection comprises: finding correspondences between vertices in the inner cage of the clothing item and vertices of the outer cage of the clothing item; and performing raycasting between the corresponding vertices to identify intersecting vertices that indicate the cage and mesh intersection. In related field of endeavor, Maffei teaches identifying the cage and mesh intersection comprises: finding correspondences between vertices in the inner cage of the clothing item and vertices of the outer cage of the clothing item; and performing raycasting between the corresponding vertices to identify intersecting vertices that indicate the cage and mesh intersection (Par 105 “To test if any vertices from the outer layer fall inside an inner layer, a special intersection test is performed. Specifically, a copy of the outer layer is created. This outer layer is then scaled with respect to the body model to create a larger version of the clothing model. For each vertex in the clothing model, a line segment is defined that passes from the scaled model to the original model. If this line segment intersects any inner layer polygon, then that polygon must lie between the original outer layer and the scaled outer layer.”)
It would have been obvious to one of ordinary skill in the art at the time of filing to have modified Vidith to include identifying the cage and mesh intersection comprises: finding correspondences between vertices in the inner cage of the clothing item and vertices of the outer cage of the clothing item; and performing raycasting between the corresponding vertices to identify intersecting vertices that indicate the cage and mesh intersection as taught by Maffei. Doing so would allow intersections between outer and inner layers to be determined (Par 105 “To test if any vertices from the outer layer fall inside an inner layer, a special intersection test is performed.”)
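The line-segment intersection test Maffei describes (a segment constructed between corresponding outer- and inner-layer vertices, tested against the polygons of the other layer) can be sketched with a standard Möller–Trumbore segment-triangle test. This is an illustrative Python sketch only; the function names and tolerance are assumptions, not taken from the cited reference.

```python
def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def _dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def segment_hits_triangle(p, q, tri, eps=1e-9):
    """Moller-Trumbore: does the segment p -> q cross triangle tri = (v0, v1, v2)?"""
    d = _sub(q, p)                      # segment direction (unnormalized)
    v0, v1, v2 = tri
    e1, e2 = _sub(v1, v0), _sub(v2, v0)
    h = _cross(d, e2)
    a = _dot(e1, h)
    if abs(a) < eps:                    # segment parallel to the triangle plane
        return False
    f = 1.0 / a
    s = _sub(p, v0)
    u = f * _dot(s, h)
    if u < 0.0 or u > 1.0:
        return False
    qv = _cross(s, e1)
    v = f * _dot(d, qv)
    if v < 0.0 or u + v > 1.0:
        return False
    t = f * _dot(e2, qv)                # intersection parameter along p -> q
    return eps < t < 1.0                # hit strictly inside the segment
```

A vertex pair whose connecting segment returns true against any polygon of the other layer would indicate an intersection of the kind Maffei tests for.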
Regarding claim 9, Vidith teaches The computer-implemented method of claim 6, but fails to explicitly teach wherein identifying the modification of the outer cage area that does not correspond to an accessory comprises: finding correspondences between vertices in the inner cage of the clothing item and vertices in the outer cage of the clothing item; and analyzing line segments that connect the corresponding vertices. In related field of endeavor, Maffei teaches wherein identifying the modification of the outer cage area that does not correspond to an accessory comprises: finding correspondences between vertices in the inner cage of the clothing item and vertices in the outer cage of the clothing item; and analyzing line segments that connect the corresponding vertices (Par 108 “For each vertex, at state 3350 a line segment is constructed from the original outer layer to the contracted layer. At decision state 3355, the process determines which polygon of the inner layer is intersected by the line segment. At state 3360, the process determines the distance of the vertex from the intersected polygon.”)
It would have been obvious to one of ordinary skill in the art at the time of filing to have modified Vidith to include wherein identifying the modification of the outer cage area that does not correspond to an accessory comprises: finding correspondences between vertices in the inner cage of the clothing item and vertices in the outer cage of the clothing item; and analyzing line segments that connect the corresponding vertices as taught by Maffei. Doing so would help to prevent bleedthrough problems during animation (Par 108 “ensures that all of the outer layer vertices are a minimum distance from the inner layer. This aids in correcting bleedthrough problems that can occur during animations when an outer layer vertex is very close to an inner layer polygon.”)
10. Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Vidith as applied to claim 6 above, and further in view of US 20210124333 A1 (Gonzalez et al., hereinafter Gonzalez).
Regarding claim 11, Vidith teaches The computer-implemented method of claim 6, but fails to explicitly teach wherein identifying the presence of the non-manifold and hole occurrences comprises detecting the non-manifold and hole occurrences using edge loops using half angle and half edge information in at least one from the group comprising: the inner cage of the clothing item, the outer cage of the clothing item, and combinations thereof. In related field of endeavor, Gonzalez teaches identifying the presence of the non-manifold and hole occurrences comprises detecting the non-manifold and hole occurrences using edge loops using half angle and half edge information in at least one from the group comprising: the inner cage of the clothing item, the outer cage of the clothing item, and combinations thereof (Par 20 “Mesh models for an object may be inspected prior to 3D printing/object generation to determine whether they contain at least one mesh error, which in some examples may comprise determining if a model comprises a mesh error which would result in an object generation error. Such errors may for example comprise any or any combination of an isolated polygon or vertex, an empty or negative mesh volume, a hole in the mesh (the mesh is not ‘watertight’, which may mean detecting holes of at least a threshold size as smaller holes may ‘close up’ on object generation), inconsistent polygon orientation, overlapping polygons, duplicated polygons or vertices, zero area polygons, non-manifold vertices or edges, and mesh intersections”, Par 22 “Every triangle edge in the mesh shares common vertex endpoints with the edge of exactly one other triangle (Manifold Edge rule). An inspection in relation to this rule may comprise traversing all the polygon edges and counting the number of edges which are not shared by more than one polygon (boundary edges). If any of these values is greater than 0, this rule may be considered violated.”)
It would have been obvious to one of ordinary skill in the art at the time of filing to have modified Vidith to include identifying the presence of the non-manifold and hole occurrences comprises detecting the non-manifold and hole occurrences using edge loops using half angle and half edge information in at least one from the group comprising: the inner cage of the clothing item, the outer cage of the clothing item, and combinations thereof as taught by Gonzalez. Doing so would allow errors which would result in generation errors to be detected. (Par 20 “Mesh models for an object may be inspected prior to 3D printing/object generation to determine whether they contain at least one mesh error, which in some examples may comprise determining if a model comprises a mesh error which would result in an object generation error.”)
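The Manifold Edge rule quoted from Gonzalez (traverse all polygon edges and count those not shared by exactly two polygons) can be sketched as follows. This Python sketch is illustrative only; the function names are assumptions, not from the cited reference. Edges belonging to exactly one face lie on a hole boundary, and edges shared by three or more faces are non-manifold.

```python
from collections import Counter

def edge_face_counts(faces):
    """Count how many triangles share each undirected edge."""
    counts = Counter()
    for tri in faces:
        for i in range(3):
            a, b = tri[i], tri[(i + 1) % 3]
            counts[tuple(sorted((a, b)))] += 1
    return counts

def mesh_issues(faces):
    """Return (boundary edges, non-manifold edges) for a triangle mesh."""
    counts = edge_face_counts(faces)
    boundary = [e for e, n in counts.items() if n == 1]     # hole perimeter
    nonmanifold = [e for e, n in counts.items() if n > 2]   # shared by 3+ faces
    return boundary, nonmanifold
```

A watertight manifold mesh, such as a closed tetrahedron, yields two empty lists; any boundary or non-manifold edge would violate the quoted rule.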
11. Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Vidith as applied to claim 1 above, and further in view of US 11829686 B2 (Abunojaim et al., hereinafter Abunojaim).
Regarding claim 12, Vidith teaches The computer-implemented method of claim 1, but fails to explicitly teach further comprising categorizing respective static validation checks of the at least one static validation check as one of an error, a warning, a visual quality suggestion check, or a user-generated content (UGC) check. In related field of endeavor, Abunojaim teaches categorizing respective static validation checks of the at least one static validation check as one of an error, a warning, a visual quality suggestion check, or a user-generated content (UGC) check (Par 23 “For instance, certain printability issues that can cause undesirable artifacts in the resulting printed object but do not prevent the 3D model to be printed can be displayed in a first color (e.g., yellow). Other more serious printability issues that can prevent the 3D model (or a portion thereof) from being printed can be displayed in a second color (e.g., red). Furthermore, portions of the 3D model having no detected printability issues can be displayed in a third color (e.g., green). In one implementation, portions of the unprocessed 3D model that are detected by the 3D model system to correspond to non-manifold geometries are displayed in yellow, detected boundary edges are displayed in red, and portions of the 3D model that meet all validity criteria analyzed by the 3D model system are displayed in green.”)
It would have been obvious to one of ordinary skill in the art at the time of filing to have modified Vidith to include categorizing respective static validation checks of the at least one static validation check as one of an error, a warning, a visual quality suggestion check, or a user-generated content (UGC) check as taught by Abunojaim. Doing so would allow issues in a model to be highlighted (Par 23 “Still further, the unprocessed 3D model can be displayed within the first user interface feature in a manner that highlights the detected printability issues.”)
Allowable Subject Matter
12. Claim 10 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
13. The following is a statement of reasons for the indication of allowable subject matter: Regarding claim 10, the closest prior art of Maffei teaches measuring a distance between vertices of an outer cage and a polygon, and comparing that distance to a threshold value (Maffei Par 108 “For each vertex, at state 3350 a line segment is constructed from the original outer layer to the contracted layer. At decision state 3355, the process determines which polygon of the inner layer is intersected by the line segment. At state 3360, the process determines the distance of the vertex from the intersected polygon. If the distance is less than a prescribed threshold”). However, Maffei fails to teach the combined limitation below as a whole, “wherein identifying the presence of the bloating cage comprises: measuring cage mesh distances as distances between vertices of the reference mesh of the clothing item and corresponding vertices of the outer cage of the clothing item; and building a Gaussian distribution based on the cage mesh distances, wherein if the cage mesh distances for predetermined vertices exceed a predetermined number of standard deviations of the Gaussian distribution, the predetermined vertices are identified as part of the bloating cage.” Furthermore, no prior art of record either alone or in combination teaches the above limitation as a whole. Therefore claim 10 is considered to be allowable.
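For clarity of the record, the claim-10 limitation found allowable (measuring cage-mesh distances, fitting a Gaussian distribution, and flagging vertices beyond a predetermined number of standard deviations) can be illustrated with a short Python sketch. The function name and the threshold parameter k are assumptions for illustration only, not drawn from the claim or any cited reference.

```python
from statistics import mean, pstdev

def bloating_vertices(mesh_pts, cage_pts, k=2.0):
    """Flag outer-cage vertices whose distance to the corresponding
    reference-mesh vertex exceeds k standard deviations of all distances."""
    dists = [((mx - cx)**2 + (my - cy)**2 + (mz - cz)**2) ** 0.5
             for (mx, my, mz), (cx, cy, cz) in zip(mesh_pts, cage_pts)]
    mu, sigma = mean(dists), pstdev(dists)   # Gaussian fit: mean and std dev
    return [i for i, d in enumerate(dists) if d > mu + k * sigma]
```

Vertices returned by such a check would correspond to the claimed "bloating cage" region; no cited reference discloses this statistical-outlier formulation.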
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN PATRICK GOCO whose telephone number is (571)272-5872. The examiner can normally be reached M-Th, 7:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Chan can be reached at (571) 272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOHN P GOCO/Examiner, Art Unit 2619
/JASON CHAN/Supervisory Patent Examiner, Art Unit 2619