DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Status
2. Claims 9 and 11-15 are currently pending.
Claims 8 and 10 stand canceled.
Claims 1-7 are withdrawn from consideration.
Response to Arguments
3. Applicant’s arguments with respect to claims 9 and 11-15 have been considered but are moot in view of the new ground(s) of rejection.
The remarks of 02/20/2026 are directed to the amended claims. A new search and consideration have been carried out.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application does not currently name joint inventors.
4. Claims 9 and 11-15 are rejected under 35 U.S.C. 103 as being unpatentable over Hyunmook Oh et al. (hereinafter Oh) (US 2023/0388557), which claims the benefit of Prov. No. 63/088,999 dated 10/07/2020, and Sebastien Lasserre et al. (hereinafter Lasserre) (US 2021/0218969), in view of Gary Bradski et al. (hereinafter Bradski) (US 11,507,193).
1-7. (Withdrawn)
8 and 10. (Cancelled)
Re Claim 9. (Original) Oh discloses a point cloud geometry decoding method (a point cloud coding method and apparatus, Abstract), comprising:
obtaining a geometric information bitstream (a geometry bitstream Par.[0183, 0190] Fig.1 to 9 containing signaling information Par.[0215]) and
decoding the geometric information bitstream to obtain decoded data (receiving information from the geometry bitstream over slice header signaling Par.[0404-0408] or by high level signaling syntax, SPS, information for decoding syntax structure related to the projection information projection_info( ), Par.[0414-0419], to the coordinate position and conversion, coordinate_conversion_type, Par.[0420-0427], including geometric, attribute, metadata signaling for decoding process, including prediction mode and geolocation (GPS) information, Par.[0459-0522], etc.), wherein the decoded data comprises a prediction mode for a current node (obtaining and decoding at unit 20003 geometric and attribute information from the transmitted bitstream, per Fig.2 Par.[0032, 0102, 0202], regarding the current node coding i.e., the prediction mode, at Par.[0159] and indicating the current node being coded with contexts determined from neighboring nodes, Par.[0520], or e.g., a planar mode signaled by a flag, geometry_planar_mode_flag Par.[0506-0511] or directly inferred from inferred_direct_coding_mode_enabled_flag in the geometry node syntax, Par.[0520-0521]);
performing geometric prediction on the current node according to the prediction mode to obtain predicted residuals (the geometric prediction residual is used to represent, among others, the attributes, e.g., by the syntax attribute_pred_residual_separate_encoding_flag[i] field, Par.[0586]), wherein the predicted residuals comprise a predicted residual in a cylindrical coordinate system and a predicted residual in a Cartesian coordinate system (including the Cartesian Par.[0587-0591] associated with a cylindrical coordinate system Par.[0258, 0267, 0281-0288, 0293, 0673, 0677, 0689, 0702, 0716] Figs.15-19);
reconstructing a prediction tree structure according to the predicted residual in the cylindrical coordinate system (reconstructing cylindrical coordinate system based on the prediction residual and the octree analyzer 40002 and a geometry reconstructor 40005, and reconstructed octree geometry at Par.[0120-0129]), and
performing coordinate conversion on points in the prediction tree structure to obtain predicted Cartesian coordinates of the current node (performing coordinate conversion at unit 1640, Figs.16, 17, 18, 24, 28, for geometry/attribute decoding Fig.46 or Figs.50-53, and described at Par.[0254, 0267-0268, 0278, 0280]; cylindrical coordinate conversion at the encoder or decoder Par.[0287-0288, 0291], Eq.1811, Fig.18, Par.[0293] Fig.19 of a cylindrical coordinate system Par.[0258, 0267, 0281-0288, 0293, 0673, 0677, 0689, 0702, 0716] Figs.15-19, and obtaining Cartesian coordinates, Par.[0019, 0258, 0267-0268, 0281, 0473-0479] Figs.15-17 or Par.[0587-0591, 0677, 0689] Eq.24, Par.[0702, 0709, 0716] Figs.15-17, determined at the current node coordinates, per Fig.6 Par.[0036, 0142-0146] of the current node Par.[0159], or as determined for the current node from neighboring nodes Par.[0520]); and
reconstructing a point cloud according to the predicted residual in the Cartesian coordinate system and the predicted Cartesian coordinates to obtain reconstructed point cloud data (reconstructing at a decoder, the predicted Cartesian coordinates derived from the predicted residual to obtain the original point cloud data, at unit 11003 in Fig.11, according to the octree reconstruction process at unit 13003, Fig.13 Par.[0216-0217], to reconstruct the point cloud data per Figs.44, 45, 46 Par.[0022, 0107, 0319, 0330, 0348, 0353, 0389] at the point cloud decoder per Figs.10, 11, 13, Par.[0416], and, not exclusively, at Par.[0657-0665]).
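For clarity of the record, the decoding flow mapped above for claim 9 (add the decoded cylindrical residual to the prediction, convert to Cartesian coordinates, then add the decoded Cartesian residual) can be sketched as follows. This is an illustrative sketch only; the function name, argument layout, and the use of the standard cylindrical-to-Cartesian conversion x = r·cos φ, y = r·sin φ, z = z are the examiner's assumptions, not text taken from Oh or Lasserre.

```python
import math

def reconstruct_point(pred_cyl, res_cyl, res_cart):
    """Illustrative sketch of the claimed decoding flow (claim 9).

    pred_cyl : predicted cylindrical coordinates (r, phi, z) of the current node
    res_cyl  : decoded residual in the cylindrical coordinate system
    res_cart : decoded residual in the Cartesian coordinate system
    """
    # Reconstruct cylindrical coordinates from prediction plus residual
    r = pred_cyl[0] + res_cyl[0]
    phi = pred_cyl[1] + res_cyl[1]
    z = pred_cyl[2] + res_cyl[2]
    # Coordinate conversion: cylindrical -> predicted Cartesian coordinates
    x_pred = r * math.cos(phi)
    y_pred = r * math.sin(phi)
    z_pred = z
    # Final reconstructed point adds the Cartesian residual to the prediction
    return (x_pred + res_cart[0], y_pred + res_cart[1], z_pred + res_cart[2])
```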
It is determined that the art to Oh teaches each and every limitation of the claimed matter, as a person of ordinary skill in the art would have found the scope of the claimed matter obvious, whether expressly disclosed or inferred from the mapped cited embodiments.
However, in order to adapt the examination to the specific claim language, it is found that the art to Lasserre similarly teaches, emphasizing the details of the limitations reciting:
obtaining a geometric information bitstream and decoding the geometric information bitstream to obtain decoded data (receiving at decoder 50 the input bitstream, Par.[0068] comprising the point cloud data producing a tree, e.g., octree, representing the geometry of the volumetric space Par.[0053]), wherein the decoded data comprises a prediction mode for a current node (including prediction modes Par.[0122]);
performing geometric prediction on the current node according to the prediction mode to obtain predicted residuals (identifying and selecting a geometric prediction, Par.[0005, 0006]), wherein the predicted residuals comprise a predicted residual (residual Par.[0119, 0128]) in a cylindrical coordinate system and a predicted residual in a Cartesian coordinate system (a coordinate system but not specifically cylindrical, Par.[0002, 0046, 0055, 0094], e.g., 3D coordinates, Par.[0097, 0099]);
reconstructing a prediction tree structure according to the predicted residual (reconstructing the point cloud data by a tree reconstructor 56, Par.[0068], Fig.1) in the cylindrical coordinate system, and
performing coordinate conversion on points in the prediction tree structure to obtain predicted Cartesian coordinates of the current node (the coordinate system is represented in Cartesian or other system, Par.[0046]); and
reconstructing a point cloud according to the predicted residual in the Cartesian coordinate system and the predicted Cartesian coordinates to obtain reconstructed point cloud data (using the Cartesian system in coding the point cloud data, Par.[0046]).
However, Oh and Lasserre do not expressly teach the amended subject matter.
In an analogous art, Bradski teaches,
wherein the geometric information (according to geometric information Col.81 Lin.17-48, of a geometric map in Fig.34) bitstream corresponds to a regularized structure of original point cloud data (performing data regularization of the original as digital representation of the cloud data, Col.81 Lin.38-43 or Col.250 Lin.13-47, i.e., the original data of the key frames and associated cloud points of the geometric information, Col.87 Lin.1-3),
wherein the original point cloud data is represented by Cartesian coordinates (the point cloud data referring to Cartesian coordinate directions per Fig.73, Col.148 Lin.53-56), and
wherein the regularized structure comprises a regularized distribution in horizontal and vertical directions (performing the regularization process in vertical and horizontal directions per Equation A*[w/2, h/2, 1] and defined by the matrix at Col.250 Lin.48-65 and Col.251 Lin.1-26).
A person of ordinary skill in the art would have found it obvious to combine prior art elements of similar devices or methods in order to improve compression of point cloud data, per Lasserre (Par.[0128-0129]), in geometry-based prediction as found in Oh (Par.[0520]) indicating the geometry node occupancy, thereby applying known techniques to known methods to obtain an improvement identified in Bradski, helping to improve three-dimensional perception (Col.34 Lin.44-49), and yielding predictable results, per MPEP 2143 Rationales (A), (B), (D).
Re Claim 11. (Previously Presented) Oh, Lasserre and Bradski disclose the point cloud geometry decoding method according to claim 9, wherein the reconstructing the prediction tree structure according to the predicted residual in the cylindrical coordinate system, and the performing the coordinate conversion on the points in the prediction tree structure to obtain the predicted Cartesian coordinates of the current node comprises:
Oh teaches calculating reconstructed cylindrical coordinates of the current node based on a cylindrical coordinate residual obtained by decoding and predicted cylindrical coordinates of the current node (calculating the cylindrical coordinates of the current node by inversely quantizing the residual at each point cloud, Par.[0170], and computing the respective coordinates at Par.[0689] Eq.24, Par.[0691] Eq.25, Par.[0694] Eq.26, according to the coding mode and the signaled information to convert the coordinates from cylindrical to Cartesian, or inversely from Cartesian to cylindrical, Par.[0673, 0677, 0702, 0722] and Eq.5 to 11).
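The step mapped for claim 11 (inverse-quantizing the decoded cylindrical residual and adding it to the predicted cylindrical coordinates of the current node) can be sketched as follows. The uniform inverse quantization by a single step size, and all names, are the examiner's illustrative assumptions, not text from Oh.

```python
def reconstruct_cylindrical(pred, quantized_res, qstep):
    """Illustrative sketch of claim 11: inverse-quantize the decoded
    cylindrical residual (multiply by the quantization step) and add it
    to the predicted cylindrical coordinates (r, phi, z)."""
    return tuple(p + q * qstep for p, q in zip(pred, quantized_res))
```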
Re Claim 12. (Previously Presented) Oh, Lasserre and Bradski disclose the point cloud geometry decoding method according to claim 9, wherein the reconstructing the point cloud according to the predicted residual in the Cartesian coordinate system and the predicted Cartesian coordinates to obtain the reconstructed point cloud data comprises: Oh teaches calculating reconstructed Cartesian coordinates of the current node based on a Cartesian coordinate residual obtained by decoding and the predicted Cartesian coordinates of the current node (the current node Cartesian coordinates are derived from the residual transmitted to the decoder for each point, Par.[0170-0171], Tables 2 and 3, based on attribute residual information Par.[0273, 0586]).
Re Claim 13. (Currently Amended) This claim represents the point cloud geometry decoding apparatus (considering the apparatus and respective embodiments being implemented by a single chip per Par.[0754, 0756]), implementing each and every limitation of method claim 9. Hence it is rejected on the same evidentiary basis, mutatis mutandis.
Re Claim 14. (Previously Presented) This claim represents the point cloud geometry decoding apparatus (considering the apparatus and respective embodiments being implemented by a single chip per Par.[0754, 0756]), implementing each and every limitation of method claim 11. Hence it is rejected on the same evidentiary basis, mutatis mutandis.
Re Claim 15. (Previously Presented) This claim represents the point cloud geometry decoding apparatus (considering the apparatus and respective embodiments being implemented by a single chip per Par.[0754, 0756]), implementing each and every limitation of method claim 12. Hence it is rejected on the same evidentiary basis, mutatis mutandis.
Conclusion
5. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVE J CZEKAJ. The examiner can normally be reached 8:00-6:00 Monday-Thursday and every other Friday.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, David Czekaj can be reached at (571)272-7327. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DRAMOS KALAPODAS/Primary Examiner, Art Unit 2487