DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Examiner’s Notes
2. This is in response to the applicant's After-Final response filed on 04/06/2026. The Examiner has reviewed the arguments presented in that response and has not found them persuasive. Claims 9-18 are pending and being examined. Claims 9, 12, 15, and 16 are in independent form.
Nonstatutory Double Patenting
3. The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP §§ 706.02(l)(1) - 706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
4. Claims 9-18 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-8 of U.S. Patent No. 11,902,576. Although the claims at issue are not identical, they are not patentably distinct from each other because the respective claims of the instant application and of U.S. Patent No. 11,902,576 describe the same invention. The examiner has explained in detail, in the comparison below, how claim 9 of the instant application is unpatentable over claim 1 of U.S. Patent No. 11,902,576. The examiner does not detail the minor differences and the mapping between each of the remaining instant application claims and its corresponding patented claim in U.S. Patent No. 11,902,576; however, should applicant request such a detailed breakdown, the examiner will be happy to provide one in a subsequent Office Action.
Instant application 18/397,223, claim 9:
9. A three-dimensional data encoding method, comprising:
[1] assigning three-dimensional points included in point cloud data to layers, based on geometry information of the three-dimensional points;
[2] generating first information used for specifying whether to permit referring to, for a current three-dimensional point included in the three-dimensional points, attribute information of another three-dimensional point belonging to a same layer as the current three-dimensional point; and
[3] encoding attribute information of the current three-dimensional point to generate a bitstream, by or without referring to the attribute information of the other three-dimensional point according to the first information,
[4] wherein the bitstream includes the first information, and each of the layers is a level of detail.
U.S. Patent No. 11,902,576 (the '576 patent), claim 1:
1. A three-dimensional data encoding method, comprising:
[a] classifying three-dimensional points included in point cloud data into layers, based on geometry information of the three-dimensional points;
[b] generating first information indicating whether to permit referring to, for a current three-dimensional point included in the three-dimensional points, attribute information of another three-dimensional point belonging to a same layer as the current three-dimensional point; and
[c] encoding attribute information of the current three-dimensional point to generate a bitstream, by or without referring to the attribute information of the other three-dimensional point according to the first information,
[d] wherein the bitstream includes the first information.
The examiner's explanation:
[1] is taught, in the obviousness (§ 103) sense, by [a] of claim 1 of the '576 patent.
[2] is disclosed, in the obviousness (§ 103) sense, by [b] of claim 1 of the '576 patent.
[3] is taught, in the obviousness (§ 103) sense, by [c] in view of [a]-[b] of claim 1 of the '576 patent.
[4] is taught, in the obviousness (§ 103) sense, by [a]-[d] of claim 1 of the '576 patent. This is because the bitstream generated in [c] includes "attribute information of the current three-dimensional point" belonging to a same layer classified in [a]; thus, the attribute information of the current three-dimensional point is "a level of detail" of the same layer.
Claim Rejections - 35 USC § 102
5. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
6. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
7. Claims 9-18 are rejected under 35 U.S.C. 102(a)(1)/102(a)(2) as being anticipated by Lee (US 2005/0180340, hereinafter “Lee”).
Regarding claim 9, Lee discloses a three-dimensional data encoding (the method and the apparatus for encoding 3D data; see abstract and fig.3A, fig.3B, and fig.4) method, comprising:
assigning three-dimensional points included in point cloud data to layers, based on geometry information of the three-dimensional points (see volume data converting unit 300->320 in fig.3A, where volume 3D data is converted into voxel data and in turn converted into the octree data; wherein an adaptive octree includes multiple layers in parent-child relationship, denoted by ‘S’, ‘W’, ‘B’, and ‘E’ in different layers (i.e., nodes) shown by figs.8-9, see para.56. It should be noticed that a point cloud includes a discrete set of data points representing a 3D shape or object in space.);
generating first information used for specifying whether to permit referring to, for a current three-dimensional point included in the three-dimensional points, attribute information of another three-dimensional point belonging to a same layer as the current three-dimensional point (see “node encoding unit 340” in fig.3A, see para.60: the adaptive octree is created by labeling `S` (split), `B` (black), and `W` (white) nodes as shown in figs.8-9 and Table 1; see para.75: “If SOP indicates that the current node is an `S` node, as shown in illustration (c) of FIG. 14, DIB includes the average color of the current node area and up to eight flags indicating whether the child nodes are `W.`” In other words, if the current node is an `S` node, the method may calculate the average color value of the current node area, i.e., of the same layer, as detailed information bits (DIB)); and
encoding attribute information of the current three-dimensional point to generate a bitstream, by or without referring to the attribute information of the other three-dimensional point according to the first information, wherein the bitstream includes the first information, and each of the layers is a level of detail (see “bitstream generating unit 360” in fig.3A, see fig.14 and para.74: “The bitstream of each of the nodes is composed of SOP (`S` or `P`) and DIB (detailed information bits) as shown in illustration (b) of FIG. 14.”).
Regarding claim 10, Lee discloses the three-dimensional data encoding method according to claim 9, wherein when the first information specifies to permit referring to the attribute information of the other three-dimensional point, the attribute information of the current three-dimensional point is encoded by referring to attribute information of an encoded three-dimensional point included in three-dimensional points belonging to the same layer as the current three-dimensional point (see para.75: “If SOP indicates that the current node is an `S` node, as shown in illustration (c) of FIG. 14, DIB includes the average color of the current node area and up to eight flags indicating whether the child nodes are `W.`”).
Regarding claim 11, Lee discloses the three-dimensional data encoding method according to claim 9, wherein in the encoding, the attribute information of the current three-dimensional point is encoded by referring to attribute information of a three-dimensional point belonging to a layer higher than a layer to which the current three-dimensional point belongs (see para.76: “If SOP indicates that the current node is a `P` node, as shown in illustration (d) of FIG. 14, the depth information of the voxels [included in the DIB] within the current node area is PPM-encoded”, wherein the PPM-encoding operation encodes the voxels of the current depth (i.e., layer) k according to the contexts of the previous depth (i.e., layer) k-1 and the current depth (i.e., layer) k as shown by fig.13. See para.72: “FIG. 13 shows contexts used for PPM-encoding voxels. A two-dimensional plane with a depth of `k` on the right side is a section where there are voxels previously encoded and to be currently encoded, and a two-dimensional plane with a depth of `k-1` on the left side is a neighboring section where there are voxels previously encoded.”), regardless of the first information (note that the PPM-encoding operation does not depend on the header information of ‘width’ and ‘height’; see fig.14).
Regarding claim 12, Lee discloses a three-dimensional data decoding method (the method and the apparatus for decoding 3D data; see abstract and fig.5A, fig.5B, and fig.6), comprising: assigning three-dimensional points included in point cloud data to layers, based on geometry information of the three-dimensional points (see bitstream reading unit 500 and node decoding unit 520 in fig.5A, see para.79: “the node decoding unit 520 decodes an `S` or a `P` node from the read nodes.”);
obtaining first information from a bitstream, the first information used for specifying whether to permit referring to, for a current three-dimensional point included in the three-dimensional points, attribute information of another three-dimensional point belonging to a same layer as the current three-dimensional point (see 540 of fig.5A, and para.80: “The adaptive octree recovering unit 540 recovers the adaptive octree from the recovered nodes, and in turn converts the recovered adaptive octree into an octree. While the adaptive octree has five kinds of labels, the octree does not have these labels but has only black or white value.” See para.84: “If the SOP indicates that the current node is an `S` node, as shown in illustration (c) of FIG. 14, DIB is decoded into the average color of the current node area”); and
decoding attribute information of the current three-dimensional point from the bitstream, by or without referring to the attribute information of the other three-dimensional point according to the first information, wherein each of the layers is a level of detail (see 560 of fig.5A, and para.81: “the volume data recovering unit 560 recovers the original 3D object data from the input octree data, where the original 3D object data is any one of PointTexture, voxel, and octree data.” See para.83: “The bitstream reading unit 500 reads nodes of a tree from bitstreams of the encoded 3D data (operation 50)”).
Regarding claim 13, Lee discloses the three-dimensional data decoding method according to claim 12, wherein when the first information specifies to permit referring to the attribute information of the other three-dimensional point, the attribute information of the current three-dimensional point is decoded by referring to attribute information of a decoded three-dimensional point included in three-dimensional points belonging to the same layer as the current three-dimensional point (See para.84: “If the SOP indicates that the current node is an `S` node, as shown in illustration (c) of FIG. 14, DIB is decoded into the average color of the current node area”).
Regarding claim 14, Lee discloses the three-dimensional data decoding method according to claim 12, wherein in the decoding, the attribute information of the current three-dimensional point is decoded by referring to attribute information of a three-dimensional point belonging to a layer higher than a layer to which the current three-dimensional point belongs (see para.91: “FIG. 13 shows contexts used for PPM-decoding voxels. A two-dimensional plane with a depth of `k` on the right side is a section where there are voxels previously decoded and to be currently decoded, and a two-dimensional plane with a depth of `k-1` on the left side is a neighboring section where there are voxels previously decoded.”), regardless of the first information (note that the PPM-decoding operation does not depend on the header information of ‘width’ and ‘height’; see fig.14).
Regarding claim 15, claim 15 parallels claim 9 and recites substantially the same limitations as claim 9; it is therefore interpreted and rejected for the reasons set forth in the rejection of claim 9.
Regarding claim 16, claim 16 parallels claim 12 and recites substantially the same limitations as claim 12; it is therefore interpreted and rejected for the reasons set forth in the rejection of claim 12.
Regarding claims 17 and 18, Lee discloses wherein, in the levels of detail, a higher layer has a sparser point cloud in which three-dimensional points are more distant, and a lower layer has a denser point cloud in which three-dimensional points are closer (e.g., see (c) of fig.8, wherein the node ‘S’ in the 2nd layer has only 3 voxels representing objects out of 4 voxels, while the node ‘B’ in the 3rd layer (i.e., a layer lower than the 2nd layer) has 1 voxel representing objects out of 1 voxel; therefore, the 3rd layer has a denser point cloud while the 2nd layer has a sparser point cloud).
Response to Arguments
9. On page 7 of applicant’s response, with respect to claim 9 rejected under nonstatutory double patenting, Applicant submits:
Applicant submits the above-noted feature of claim 9 is not included or rendered obvious by the claims of the '576 patent. In this regard, while the claims of the '576 patent include a feature indicating that "the bitstream includes first information", it is respectfully submitted that this feature does not render obvious the above-noted feature recited in claim 9 which indicates that "each of the layers is a level of detail".
(The emphases added by the examiner.)
The examiner respectfully disagrees with the argument. See the “Examiner’s Explanation” in Section 4 above.
10. On page 8 of applicant’s response, with respect to claim 9 rejected under 35 USC 102(a)(1)/102(a)(2), Applicant submits:
In this regard, however, while paragraph [0074] of Lee discloses that the bitstream of each of the nodes is composed of SOP and DIB (detailed information bits), it is respectfully submitted that Lee does not disclose or even suggest that "each of the layers is a level of detail", as recited in claim 9.
The examiner respectfully disagrees with the applicant’s argument. As explained in the rejection of the claim, Lee,
[0074] discloses:
Referring to illustration (a) of FIG. 14, the header information is encoded into width, height, and depth, which are the resolution information of volume containing a 3D object, to generate a bitstream. Next, nodes are sequentially encoded one by one starting from a root node to generate bitstreams. The nodes with the total number of `N` to be encoded can be represented by Node-1, Node-2, . . . , Node-N. The bitstream of each of the nodes is composed of SOP (`S` or `P`) and DIB (detailed information bits) as shown in illustration (b) of FIG. 14.
[0061] discloses:
The adaptive octree is created as follows. First, if the bounding volume for a 3D object contains an object, the root node is labeled with `S` and the volume is subdivided into eight equal size volumes. If a subdivided volume contains only white voxels or only black voxels, the corresponding node is labeled with `W` or `B`, respectively. Otherwise, the node is set to `S` and the volume is further subdivided into eight smaller volumes. This procedure is repeated until the tree is grown to a pre-specified depth. At this depth, if a node contains both black and white voxels, it is labeled with `P` and its voxel values are encoded by the prediction by partial matching (PPM) scheme.
(The emphases added by the examiner.)
It is apparent that the bitstream of the root node (i.e., the parent ‘S’), say in layer 1 of an octree, is composed of the node type ‘S’ and its detailed information bits (i.e., for the parent's 8 children: Ch1, Ch2, ..., Ch8). If one of the 8 children in layer 2 is labeled ‘S’ again, then the bitstream of that child in layer 2 of the octree is composed of the node type ‘S’ and its detailed information bits, including the “average color” of the same layer. See row (c) of fig.14. If one of the 8 children in layer 2 is labeled ‘P’, then the bitstream of that child in layer 2 of the octree is composed of the node type ‘P’ and the respective detailed information bits, including the “depth information PPM bits” across the different layers and the “color bits”. See row (d) of fig.14. Lee therefore clearly discloses “each of the layers is a level of detail”, and the applicant's argument is not persuasive.
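For clarity, the recursive labeling procedure quoted from Lee's para. [0061] may be sketched in pseudocode-style Python as follows. This sketch is purely illustrative and is not Lee's implementation; the function name, the voxel representation, and the toy 2x2x2 example are the present writer's own, and the PPM encoding of `P` nodes is omitted.

```python
# Illustrative sketch (not Lee's implementation) of the adaptive-octree
# labeling quoted from para. [0061]: a cubic volume is recursively
# subdivided; a node is labeled 'W' (only white voxels), 'B' (only black
# voxels), 'S' (mixed, split into eight children), or 'P' (mixed at the
# pre-specified maximum depth, where its voxels would be PPM-encoded).

def label_node(black, origin, size, depth, max_depth):
    """Label one octree node covering the cube at `origin` with edge `size`.
    `black` is the set of (x, y, z) coordinates of black voxels.
    Returns (label, children), where children is empty for leaf nodes."""
    ox, oy, oz = origin
    cells = [(x, y, z)
             for x in range(ox, ox + size)
             for y in range(oy, oy + size)
             for z in range(oz, oz + size)]
    n_black = sum(c in black for c in cells)
    if n_black == 0:
        return ('W', [])                 # only white voxels
    if n_black == len(cells):
        return ('B', [])                 # only black voxels
    if depth == max_depth:
        return ('P', [])                 # mixed at max depth -> PPM-encoded
    half = size // 2
    children = [label_node(black,
                           (ox + dx * half, oy + dy * half, oz + dz * half),
                           half, depth + 1, max_depth)
                for dx in (0, 1) for dy in (0, 1) for dz in (0, 1)]
    return ('S', children)               # mixed above max depth -> split

# A 2x2x2 volume with one black voxel: the root node is 'S' (layer 1),
# and its eight layer-2 children are one 'B' node and seven 'W' nodes.
root = label_node({(0, 0, 0)}, (0, 0, 0), 2, 0, 1)
```

In this sketch each recursion depth corresponds to one layer of the octree, which is consistent with the examiner's explanation above that the node labels and their detailed information bits are organized layer by layer.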
Conclusion
11. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
12. Any inquiry concerning this communication or earlier communications from the examiner should be directed to RUIPING LI whose telephone number is (571)270-3376. The examiner can normally be reached 8:30am-5:30pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, HENOK SHIFERAW can be reached on (571)272-4637. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. See https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RUIPING LI/Primary Examiner, Ph.D., Art Unit 2676