DETAILED ACTION
1. Claims 1-19 have been presented for examination.
Notice of Pre-AIA or AIA Status
2. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
3. Acknowledgment is made that this application is a national stage entry under 35 U.S.C. § 371 of PCT/US2021/072805, filed 12/08/2021.
Information Disclosure Statement
4. The information disclosure statement (IDS) submitted on 11/14/22 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the Examiner has considered the IDS as to the merits.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
5. Claims 7-12 are rejected under 35 U.S.C. 101.
i) As per claims 7-12, they are rejected because the applicant has provided evidence that the applicant intends the term "nontransitive storage medium" to include non-statutory subject matter. Specifically, the claimed term is not defined in the specification. Rather, the applicant recites the term "non-transitory storage medium," as in paragraph 5 of the specification. It is noted that the words "storage" and/or "recording" are insufficient to convey only statutory embodiments to one of ordinary skill in the art absent an explicit and deliberate limiting definition, or a clear differentiation between storage media and transitory media, in the disclosure. As such, the claims are drawn to a form of energy. Energy is not one of the four categories of invention, and therefore these claims are not statutory. Energy is not a series of steps and thus is not a process. Energy is not a physical article or object and as such is not a machine or manufacture. Energy is not a combination of substances and therefore is not a composition of matter. The Examiner suggests amending the claims to recite a "non-transitory machine-readable storage medium" to align with the accepted term and with the Applicant's own recitation in the specification.
Appropriate correction is required.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
6. Claims 1-19 are rejected under 35 U.S.C. 102(a)(1) as being clearly anticipated by U.S. Patent Application Publication No. 2019/0221034.
Regarding Claim 1: The reference discloses A method comprising:
receiving input mesh data representing a mesh for a three-dimensional object, the mesh including a plurality of vertices, a plurality of edges, and a plurality of faces; (“[0003] In computer games and other areas, such as computer-aided manufacturing (CAM) processes, polygonal meshes may be used. A polygonal mesh may be a collection of vertices, edges, and faces that define the shape and/or boundary of an object. The faces may consist of various polygonal shapes such as triangles, quadrilaterals, convex polygons, concave polygons, regular polygons (e.g. polygons which may have equal length sides and may have equal angles) and/or irregular polygons (e.g. polygons which may not have equal length sides and may not have equal angles).”)
for each of a subset of a grid of voxels,
determining a respective set of intersection points from at least one intersection between that voxel and the mesh; and (“[0024] Determining the silhouette volume of the object may comprise representing the object as a volumetric grid, and filling any unfilled regions of the volumetric grid that are completely enclosed by filled regions with filled voxels to produce a solid voxel volume; extracting the boundary of the voxel volume as a closed mesh; intersecting the boundary of the voxel volume with a plurality of 2D planes to form a plurality of intersection loops; computing the convex hulls of the intersection loops; determining the voxels which lie near a plane yet outside its convex hulls; determining that the remaining voxels are within or on the boundary of the silhouette volume of the object.”)
generating a convex hull of the respective set of intersection points for that voxel; and (“[0024] Determining the silhouette volume of the object may comprise representing the object as a volumetric grid, and filling any unfilled regions of the volumetric grid that are completely enclosed by filled regions with filled voxels to produce a solid voxel volume; extracting the boundary of the voxel volume as a closed mesh; intersecting the boundary of the voxel volume with a plurality of 2D planes to form a plurality of intersection loops; computing the convex hulls of the intersection loops; determining the voxels which lie near a plane yet outside its convex hulls; determining that the remaining voxels are within or on the boundary of the silhouette volume of the object.
[0025] A second aspect of this specification discloses apparatus comprising one or more processors and a memory, the memory comprising instructions that, when executed by the one or more processors, cause the apparatus to perform: receiving a computer representation of a 3D object; determining a silhouette volume of the object, wherein the silhouette volume is the maximal volume of space having a silhouette from every viewing direction which is identical to the silhouette of the object from the same viewing direction, and wherein points of the object which lie on the boundary of the silhouette volume also lie on the boundary of the object's projected silhouette from at least one viewing direction; and determining, based on the silhouette volume, the extent to which features of the object are silhouette features.
[0026] The instructions, when executed by the one or more processors, may cause the apparatus to perform: determining, for a plurality of planes and for a plurality of different axes, at least one intersection loop, wherein each intersection loop corresponds to a planar cross-section of the boundary of the object in its respective plane; and determining the convex hull of each intersection loop.”)
aggregating the respective convex hulls for the subset of the grid of voxels to produce a proxy mesh of the three-dimensional object. (“[0027] The instructions, when executed by the one or more processors, may cause the apparatus to perform: classifying any 3D points which are located in or near a given plane, yet outside the convex hulls of the intersection loops formed within that plane, as being outside the silhouette volume; and determining that any points not classified as outside the silhouette volume form part of the silhouette volume.
[0028] This specification also discloses a non-transitory computer readable storage medium having instructions that, when executed by a processing device, cause the processing device to perform operations comprising: receiving a computer representation of a 3D object; determining a silhouette volume of the object, wherein the silhouette volume is the maximal volume of space having a silhouette from every viewing direction which is identical to the silhouette of the object from the same viewing direction, and wherein points of the object which lie on the boundary of the silhouette volume also lie on the boundary of the object's projected silhouette from at least one viewing direction; and determining, based on the silhouette volume, the extent to which features of the object are silhouette features.”)
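For context only, the per-voxel convex-hull aggregation recited in claim 1 can be sketched generically as follows. This is a hypothetical illustration using `scipy.spatial.ConvexHull`; the function name `proxy_mesh_from_voxels` and the data layout are illustrative assumptions, and the sketch is not of record in either the instant application or the cited reference:

```python
import numpy as np
from scipy.spatial import ConvexHull

def proxy_mesh_from_voxels(voxel_points):
    """Aggregate per-voxel convex hulls into one triangle soup.

    voxel_points: dict mapping a voxel index to the intersection
    points (Nx3) found for that voxel. Returns (vertices, triangles)
    of the aggregated proxy mesh.
    """
    all_verts = []
    all_tris = []
    for points in voxel_points.values():
        pts = np.asarray(points, dtype=float)
        if len(pts) < 4:        # a 3D convex hull needs at least 4 points
            continue
        hull = ConvexHull(pts)  # convex hull of this voxel's points
        base = len(all_verts)
        all_verts.extend(pts[hull.vertices])
        # remap hull facet indices to positions of the kept vertices
        remap = {v: base + i for i, v in enumerate(hull.vertices)}
        all_tris.extend(tuple(remap[v] for v in tri)
                        for tri in hull.simplices)
    return np.array(all_verts), all_tris
```

This sketch omits the deduplication of shared vertices and boundary triangles addressed in dependent claims 5 and 6.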
Regarding Claim 2: The reference discloses The method as in claim 1, wherein each voxel of the subset includes a respective interior and a respective boundary, and wherein determining the respective set of intersection points includes: identifying points on the boundary of that voxel through which an edge of the plurality of edges or a vertex of the plurality of vertices of the mesh intersect the boundary of that voxel. (“[0024] Determining the silhouette volume of the object may comprise representing the object as a volumetric grid, and filling any unfilled regions of the volumetric grid that are completely enclosed by filled regions with filled voxels to produce a solid voxel volume; extracting the boundary of the voxel volume as a closed mesh; intersecting the boundary of the voxel volume with a plurality of 2D planes to form a plurality of intersection loops; computing the convex hulls of the intersection loops; determining the voxels which lie near a plane yet outside its convex hulls; determining that the remaining voxels are within or on the boundary of the silhouette volume of the object.”)
Regarding Claim 3: The reference discloses The method as in claim 1, wherein each voxel of the subset includes a respective interior and a respective boundary, and wherein determining the respective set of intersection points includes: identifying vertices of the plurality of vertices of the mesh located in the interior of that voxel, the identified vertices being included in the respective set of intersection points for that voxel. (“[0100] The intersection loops can be projected from 3D to 2D along the axis that is the normal of the intersecting plane, without losing any information, as the points of the intersection loop all lie in the plane. Therefore, subsequent processing of the intersection loops can be performed entirely in 2D. This can be advantageous as calculations are simpler and utilize less processing resource in 2D than do corresponding calculations in 3D, and it is mathematically simpler to describe regions of an object as being convex, and points in the 2D plane as being inside or outside the convex hull of the shape.
[0101] The loops may be used to determine whether to reject points (where the points may correspond to voxels, for example) as outside the silhouette volume. A given point can be tested to determine whether it lies inside any of the hulled loops (e.g. the hulled loops of FIG. 9). This may be done, for example, using a 2D point-inside-convex-hull test, which exploits the fact that the hulls are convex. The 2D point-inside-convex-hull test is implemented as a software routine.”)
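For illustration only, the quoted 2D point-inside-convex-hull test can be sketched generically as follows. The function name and the counter-clockwise vertex-ordering assumption are illustrative; this is not the reference's actual routine:

```python
def point_in_convex_hull_2d(point, hull):
    """Return True if `point` lies inside or on a convex polygon.

    `hull` is a list of 2D vertices in counter-clockwise order.
    Exploits convexity: the point must lie on the left of (or on)
    every directed edge of the polygon.
    """
    px, py = point
    n = len(hull)
    for i in range(n):
        ax, ay = hull[i]
        bx, by = hull[(i + 1) % n]
        # z-component of the cross product (b - a) x (p - a)
        cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
        if cross < 0:  # point is strictly to the right of this edge
            return False
    return True
```

Because the polygon is convex, a single pass over its edges suffices, which is why the reference can describe the test as cheap relative to a general 3D containment query.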
Regarding Claim 4: The reference discloses The method as in claim 1, wherein each convex hull for the voxel includes a set of triangles, each of the set of triangles separating an interior volume of the proxy mesh from an exterior volume of the proxy mesh, and wherein aggregating the respective convex hulls for the subset of the grid of voxels includes: adding each of the set of triangles to the proxy mesh. (“[0066] In step S201, the method comprises receiving a computer representation of an object. The computer representation of the object may comprise a polygonal mesh. The polygonal mesh may be formed of polygons of any suitable size and shape. In some examples, the mesh may be formed of triangles. In other examples, the mesh may be formed of quadrilaterals. In other examples, the mesh may be formed of polygons having any suitable number of vertices and edges. However, it will be recognized that the computer representation of the object may be any suitable representation, and is not limited to being a polygonal mesh.
[0067] In step S202, the method may comprise identifying features of the polygonal mesh. This step may be performed by identifying mesh features such as vertices, edges, or faces, if the object is a polygonal mesh. In this case, the features may be recorded as a list of vertices of the object which may be stored in a memory as part of the computer representation of the object.
[0068] In step S203, the method may comprise determining the extent to which features are silhouette features, i.e. which are located on the boundary of silhouette of the object when viewed from one or more angles. If the object is a polygonal mesh, then the vertices, edges and/or faces may be the features of the mesh analyzed to determine the extent to which they are silhouette. An example of how the vertices located on the silhouette are identified is described in more detail below. An example of silhouette vertices and non-silhouette vertices is described in more detail below with reference to FIG. 3.
[0069] In some embodiments according to the present disclosure, in order to identify the silhouette features of the object, a silhouette volume is determined. Determining the silhouette volume of the object may comprise forming a large number of planar cross-sections of the boundary of the object. Each planar cross-section captures the extracted cross-section of the boundary of the object in that plane.
[0070] Once intersection loops characterizing the intersection of a plane with the boundary of the object have been computed, they form a set of closed loops projected in 2D in the intersecting plane. The loops may then be replaced with corresponding computed 2D convex hulls. These convex hulls may then be used to categorize areas of space within the plane that lie outside all convex hulls as outside the silhouette volume of the object.”)
Regarding Claim 5: The reference discloses The method as in claim 1, wherein aggregating the respective convex hulls for the subset of the grid of voxels includes: merging the convex hulls of each of the respective convex hulls for the grid of voxels such that duplicate vertices and edges are deleted. (“[0058] During simplification, the processor is configured to select the edge candidate at the front of the queue at each simplification step. The selected edge candidate is collapsed. The collapse changes the mesh locally around the collapsed edges. Following collapse of the selected edge, the processor is configured to update the queue by removing any candidates which are no longer available, and by adding any new available edge collapse candidates. In addition, the costs of the available edge collapses are recomputed for any changes to the cost.”)
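For illustration only, the merging of aggregated hulls with deletion of duplicate vertices, as recited in claim 5, can be sketched generically as follows. The function name, tolerance value, and data layout are illustrative assumptions (duplicate-edge removal then follows from the index remapping); this is not the applicant's or the reference's actual implementation:

```python
import numpy as np

def merge_duplicate_vertices(vertices, triangles, tol=1e-9):
    """Merge coincident vertices in a triangle soup.

    Snaps coordinates to a tolerance grid, keeps one copy of each
    distinct vertex, and remaps triangle indices accordingly.
    """
    verts = np.asarray(vertices, dtype=float)
    keys = np.round(verts / tol).astype(np.int64)  # quantize coordinates
    seen = {}
    remap = np.empty(len(verts), dtype=int)
    unique = []
    for i, key in enumerate(map(tuple, keys)):
        if key not in seen:          # first occurrence of this position
            seen[key] = len(unique)
            unique.append(verts[i])
        remap[i] = seen[key]
    new_tris = [tuple(remap[v] for v in tri) for tri in triangles]
    return np.array(unique), new_tris
```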
Regarding Claim 6: The reference discloses The method as in claim 1, wherein each voxel of the subset includes a respective interior and a respective boundary, wherein each convex hull for the voxel includes a set of triangles, and wherein aggregating the respective convex hulls for the subset of the grid of voxels includes: for each voxel of the subset of the grid of voxels, identifying a triangle of the set of triangles of the convex hull for that voxel not embedded in the boundary of that voxel; and adding the identified triangle to the proxy mesh. (“[0066] In step S201, the method comprises receiving a computer representation of an object. The computer representation of the object may comprise a polygonal mesh. The polygonal mesh may be formed of polygons of any suitable size and shape. In some examples, the mesh may be formed of triangles. In other examples, the mesh may be formed of quadrilaterals. In other examples, the mesh may be formed of polygons having any suitable number of vertices and edges. However, it will be recognized that the computer representation of the object may be any suitable representation, and is not limited to being a polygonal mesh.
[0067] In step S202, the method may comprise identifying features of the polygonal mesh. This step may be performed by identifying mesh features such as vertices, edges, or faces, if the object is a polygonal mesh. In this case, the features may be recorded as a list of vertices of the object which may be stored in a memory as part of the computer representation of the object.
[0068] In step S203, the method may comprise determining the extent to which features are silhouette features, i.e. which are located on the boundary of silhouette of the object when viewed from one or more angles. If the object is a polygonal mesh, then the vertices, edges and/or faces may be the features of the mesh analyzed to determine the extent to which they are silhouette. An example of how the vertices located on the silhouette are identified is described in more detail below. An example of silhouette vertices and non-silhouette vertices is described in more detail below with reference to FIG. 3.
[0069] In some embodiments according to the present disclosure, in order to identify the silhouette features of the object, a silhouette volume is determined. Determining the silhouette volume of the object may comprise forming a large number of planar cross-sections of the boundary of the object. Each planar cross-section captures the extracted cross-section of the boundary of the object in that plane.
[0070] Once intersection loops characterizing the intersection of a plane with the boundary of the object have been computed, they form a set of closed loops projected in 2D in the intersecting plane. The loops may then be replaced with corresponding computed 2D convex hulls. These convex hulls may then be used to categorize areas of space within the plane that lie outside all convex hulls as outside the silhouette volume of the object.”)
Regarding Claim 7: The reference discloses A computer program product comprising a nontransitive storage medium, the computer program product including code that, when executed by processing circuitry, causes the processing circuitry to perform a method, the method comprising: receiving input mesh data representing a mesh for a three-dimensional object, the mesh including a plurality of vertices, a plurality of edges, and a plurality of faces; for each of a subset of a grid of voxels, determining a respective set of intersection points from at least one intersection between that voxel and the mesh; and generating a convex hull of the respective set of intersection points for that voxel; and aggregating the respective convex hulls for the subset of the grid of voxels to produce a proxy mesh of the three-dimensional object. (See rejection for claim 1)
Regarding Claim 8: The reference discloses The computer program product as in claim 7, wherein each voxel of the subset includes a respective interior and a respective boundary, and wherein determining the respective set of intersection points includes: identifying points on the boundary of that voxel through which an edge of the plurality of edges or a vertex of the plurality of vertices of the mesh intersect the boundary of that voxel. (See rejection for claim 2)
Regarding Claim 9: The reference discloses The computer program product as in claim 7, wherein each voxel of the subset includes a respective interior and a respective boundary, and wherein determining the respective set of intersection points includes: identifying vertices of the plurality of vertices of the mesh located in the interior of that voxel, the identified vertices being included in the respective set of intersection points for that voxel. (See rejection for claim 3)
Regarding Claim 10: The reference discloses The computer program product as in claim 7, wherein each convex hull for the voxel includes a set of triangles, each of the set of triangles separating an interior volume of the proxy mesh from an exterior volume of the proxy mesh, and wherein aggregating the respective convex hulls for the subset of the grid of voxels includes: adding each of the set of triangles to the proxy mesh. (See rejection for claim 4)
Regarding Claim 11: The reference discloses The computer program product as in claim 7, wherein aggregating the respective convex hulls for the subset of the grid of voxels includes: merging the convex hulls of each of the respective convex hulls for the grid of voxels such that duplicate vertices and edges are deleted. (See rejection for claim 5)
Regarding Claim 12: The reference discloses The computer program product as in claim 7, wherein each voxel of the subset includes a respective interior and a respective boundary, wherein each convex hull for the voxel includes a set of triangles, and wherein aggregating the respective convex hulls for the subset of the grid of voxels includes: for each voxel of the subset of the grid of voxels, identifying a triangle of the set of triangles of the convex hull for that voxel not embedded in the boundary of that voxel; and adding the identified triangle to the proxy mesh. (See rejection for claim 6)
Regarding Claim 13: The reference discloses An apparatus, the apparatus comprising: memory; and controlling circuitry coupled to the memory, the controlling circuitry being configured to: receive input mesh data representing a mesh for a three- dimensional object, the mesh including a plurality of vertices, a plurality of edges, and a plurality of faces; for each of a subset of a grid of voxels, determine a respective set of intersection points from at least one intersection between that voxel and the mesh; and generate a convex hull of the respective set of intersection points for that voxel; and aggregate the respective convex hulls for the subset of the grid of voxels to produce a proxy mesh of the three-dimensional object. (See rejection for claim 1)
Regarding Claim 14: The reference discloses The apparatus as in claim 13, wherein each voxel of the subset includes a respective interior and a respective boundary, and wherein the controlling circuitry configured to determine the respective set of intersection points is further configured to: identify points on the boundary of that voxel through which an edge of the plurality of edges or a vertex of the plurality of vertices of the mesh intersect the boundary of that voxel. (See rejection for claim 2)
Regarding Claim 15: The reference discloses The apparatus as in claim 13, wherein each voxel of the subset includes a respective interior and a respective boundary, and wherein the controlling circuitry configured to determine the respective set of intersection points is further configured to: identify vertices of the plurality of vertices of the mesh located in the interior of that voxel, the identified vertices being included in the respective set of intersection points for that voxel. (See rejection for claim 3)
Regarding Claim 16: The reference discloses The apparatus as in claim 13, wherein each convex hull for the voxel includes a set of triangles, each of the set of triangles separating an interior volume of the proxy mesh from an exterior volume of the proxy mesh, and wherein the controlling circuitry configured to aggregate the respective convex hulls for the subset of the grid of voxels is further configured to: add each of the set of triangles to the proxy mesh. (See rejection for claim 4)
Regarding Claim 17: The reference discloses The apparatus as in claim 13, wherein the controlling circuitry configured to aggregate the respective convex hulls for the subset of the grid of voxels is further configured to: merge the convex hulls of each of the respective convex hulls for the grid of voxels such that duplicate vertices and edges are deleted. (See rejection for claim 5)
Regarding Claim 18: The reference discloses The apparatus as in claim 13, wherein each voxel of the subset includes a respective interior and a respective boundary, wherein each convex hull for the voxel includes a set of triangles, and wherein the controlling circuitry configured to aggregate the respective convex hulls for the subset of the grid of voxels is further configured to: for each voxel of the subset of the grid of voxels, identify a triangle of the set of triangles of the convex hull for that voxel not embedded in the boundary of that voxel; and add the identified triangle to the proxy mesh. (See rejection for claim 6)
Regarding Claim 19: The reference discloses The apparatus as in claim 13, wherein the controlling circuitry is further configured to: display the proxy mesh on a display device. (Figure 3)
Conclusion
7. All claims are rejected.
8. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
i) Reveillès, Jean-Pierre. "The geometry of the intersection of voxel spaces." Electronic Notes in Theoretical Computer Science 46 (2001): 285-308.
ii) Zimmer, Henrik, Marcel Campen, and Leif Kobbelt. "Efficient computation of shortest path-concavity for 3D meshes." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2013.
iii) U.S. Patent No. 6,405,151
9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Saif A. Alhija, whose telephone number is (571) 272-8635. The examiner can normally be reached Monday through Friday, 10:00 a.m. to 6:00 p.m.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Renee Chavez, can be reached at (571) 270-1104. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300. Informal or draft communications (please label them PROPOSED or DRAFT) may additionally be sent to the Examiner's fax number, (571) 273-8635.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
SAA
/SAIF A ALHIJA/Primary Examiner, Art Unit 2186