Prosecution Insights
Last updated: April 19, 2026
Application No. 18/917,651

METHOD, APPARATUS, AND MEDIUM FOR POINT CLOUD CODING

Final Rejection — §102, §103
Filed: Oct 16, 2024
Examiner: RAHAMAN, SHAHAN UR
Art Unit: 2426
Tech Center: 2400 — Computer Networks
Assignee: Bytedance Inc.
OA Round: 2 (Final)

Grant Probability: 76% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 11m
Grant Probability With Interview: 88%

Examiner Intelligence

Career Allow Rate: 76% — above average (479 granted / 633 resolved; +17.7% vs TC avg)
Interview Lift: +12.6% on resolved cases with interview (moderate, ~+13%)
Typical Timeline: 2y 11m average prosecution; 51 applications currently pending
Career History: 684 total applications across all art units

Statute-Specific Performance

§101: 4.7% (-35.3% vs TC avg)
§103: 50.0% (+10.0% vs TC avg)
§102: 14.7% (-25.3% vs TC avg)
§112: 15.1% (-24.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 633 resolved cases.

Office Action

§102, §103
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. The following prior art references are considered pertinent to applicant's disclosure: US 20230045663 A1 (hereinafter HUR); US 20210314611 A1 (hereinafter Iguchi); US 20150189298 A1 (hereinafter Ye); US 20120170649 A1; US 20210099711 A1 (Fig. 1B).

Response to Remarks/Arguments

The rejection made under 35 U.S.C. § 112 has been withdrawn. Applicant's arguments with respect to the prior art rejections have been fully considered but are not persuasive for the following reasons.

Re: Prior art rejection of independent claims. Applicant argued in substance that determination of an "I" or "P" frame does not teach whether an inter prediction is applied to the target frame, because a "P" frame does not indicate that the whole frame has been encoded with inter prediction. Examiner respectfully submits that this logic is flawed. The limitation is whether an inter prediction is applied to a frame, meaning that if any part of the frame uses inter prediction (as in a P-frame), the limitation is taught; if no part is inter predicted, the frame is an "I" frame.

Applicant further argued that para. 351 does not teach "determining information regarding whether an attribute inter prediction is enabled…." Examiner respectfully disagrees and submits that Hur teaches "information regarding whether an attribute inter prediction is enabled" because in Hur the "information" is the inter-prediction-based encoding applied to the geometry, and this information allows/enables Hur to apply attribute inter prediction. Therefore, applicant's arguments are not persuasive.

Re: Prior art rejection of dependent claims. Applicant has presented no additional arguments beyond those already presented with respect to the independent claims. Therefore, the arguments are similarly not persuasive.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-9, 11-12, and 17-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by HUR. HUR teaches with respect to the following claims:

1. A method for point cloud coding [(Fig. 21 of HUR)], comprising: determining, for a conversion between a target frame of a point cloud sequence and a bitstream of the point cloud sequence, a first indication indicating whether an inter prediction is applied to the target frame [(#53002; P-frame is inter prediction frame)]; determining information regarding whether an attribute inter prediction is enabled to the target frame [(#55005; #55004; para. 351)]; and performing the conversion based on the first indication and the information [(#55008 performs the conversion based on 55004 as well as 55007/55005; 55007/55005 is based on 53002)].

2. The method of claim 1, wherein the inter prediction is not applied to the target frame, or wherein the inter prediction is applied to the target frame [(53002 P-frame; note that teaching only one of the alternatives would suffice)], and/or wherein at least one first indication is used to indicate whether the inter prediction is applied to the target frame, or wherein a motion estimation is applied to the reference frame before a motion compensation process [(para. 333)].

3.
The method of claim 2, wherein the target frame is one of: a random access point frame, or an I-frame [(para. 330)], or wherein the at least one first indication is indicated to a decoder side, or wherein the at least one first indication is coded with at least one of: a fixed-length coding, a unary coding, or a truncated unary coding, or wherein the at least one first indication is coded in a predictive way.

4. The method of claim 1, wherein determining the information regarding whether the attribute inter prediction is enabled to the target frame comprises: determining whether the attribute inter prediction is enabled to the target frame based on a second indication [(geometry inter prediction at 55004 is determined based on a second indication at 53005, which in turn determines attribute information inter prediction)]; or determining whether the attribute inter prediction is enabled to the target frame based on motion information between a reference frame and a current frame, and wherein the target frame is the reference frame.

5. The method of claim 4, wherein if the inter prediction is applied to the target frame and the attribute inter prediction is enabled, the attribute inter prediction is applied to the target frame. [(#53002 and 55005)]

6. The method of claim 4, wherein at least one second indication is used to indicate whether the attribute inter prediction is enabled to the target frame [(geometry inter prediction at 55004 is determined based on a second indication at 53005, which in turn determines attribute information inter prediction)]; and/or wherein the at least one second indication is indicated to a decoder side, and/or wherein the at least one second indication is coded with at least one of: a fixed-length coding, a unary coding, or a truncated unary coding, or wherein the at least one second indication is coded in a predictive way.

7.
The method of claim 4, wherein if a geometry move of the reference frame is less than a threshold, the attribute inter prediction is only applied to the reference frame [(para. 332)], or wherein the motion information comprises a translation component and a rotation component, or wherein the motion information is a rigid motion or a non-rigid motion, or wherein the motion information is represented by a matrix, or wherein if the motion information is smaller than a motion information threshold, the attribute inter prediction is applied to the target frame [(para. 332, 330; rate of change between a frame and its reference frame teaches motion)].

8. The method of claim 7, wherein the geometry move is indicated by the motion information, and/or wherein the motion information is provided by a source that provides point cloud data, or wherein the motion information is generated by a motion estimation process [(para. 332; rate of change between a frame and its reference frame teaches motion, and estimating it is motion estimation)], or wherein the motion information is indicated by a compensated motion, and/or wherein the translation component is represented by a motion matrix or a motion vector, or wherein the rotation component is represented by a rotation matrix or a rotation vector, or wherein the rotation component is represented by one of: an Euler angle, a quaternion angle, or a rotation angle; and/or wherein if the translation component is smaller than a translation threshold, the attribute inter prediction is applied to the target frame, or wherein if the rotation component is smaller than a rotation threshold, the attribute inter prediction is applied to the target frame.

9.
The method of claim 4, wherein at least one of the following is pre-defined: a motion threshold [("specific threshold" {para. 330})], a translation threshold, or a rotation threshold, and/or wherein at least one of the following is indicated in the bitstream: the motion threshold, the translation threshold, or the rotation threshold.

11. The method of claim 1, wherein an attribute inter prediction is applied to the target frame after a geometry offset process [(#53006 and #53007 each are an offset process; or see Fig. 33 geom_slice_qp_offset; or bounding box offset in x, y, and z {para. 472-474})].

12. The method of claim 11, wherein a geometry offset is applied to geometry information of a reference frame and a current frame before the attribute inter prediction, and wherein the target frame is the reference frame or the current frame [(#53006 and #53007 each use the reference frame from #53009; or see Fig. 33 geom_slice_qp_offset; or bounding box offset in x, y, and z {para. 472-474})], or wherein there is no geometry offset to geometry information of a reference frame and a current frame before the attribute inter prediction, and wherein the target frame is the reference frame or the current frame.

17. The method of claim 1, wherein the conversion includes encoding the target frame into the bitstream, or wherein the conversion includes decoding the target frame from the bitstream. [(Fig. 1)]

18. An apparatus for point cloud coding comprising a processor and a non-transitory memory with instructions thereon, wherein the instructions upon execution by the processor cause the processor to: determine, for a conversion between a target frame of a point cloud sequence and a bitstream of the point cloud sequence, a first indication indicating whether an inter prediction is applied to the target frame; determine information regarding whether an attribute inter prediction is enabled to the target frame; and perform the conversion based on the first indication and the information.
[(see analysis of claim 1 and Fig. 1 of HUR)]

19. A non-transitory computer-readable storage medium storing instructions that cause a processor to: determine, for a conversion between a target frame of a point cloud sequence and a bitstream of the point cloud sequence, a first indication indicating whether an inter prediction is applied to the target frame; determine information regarding whether an attribute inter prediction is enabled to the target frame; and perform the conversion based on the first indication and the information. [(see analysis of claim 1 and para. 668 of HUR)]

20. A method for storing a bitstream of a point cloud sequence, comprising: determining a first indication indicating whether an inter prediction is applied to a target frame of the point cloud sequence; determining information regarding whether an attribute inter prediction is enabled to the target frame; and generating the bitstream based on the first indication and the information, and storing the bitstream in a non-transitory computer-readable recording medium. [(see analysis of claim 1 and para. 80 of HUR)]

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C.
103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 13-16 are rejected under 35 U.S.C. 103 as being unpatentable over HUR in view of Iguchi.

Regarding Claim 13: HUR does not explicitly show that a first geometry offset for the reference frame and a second geometry offset for the current frame are the same, or wherein a first geometry offset for the reference frame and a second geometry offset for the current frame are different. However, in the same/related field of endeavor, Iguchi teaches a first geometry offset for the reference frame and a second geometry offset for the current frame that are the same [(Iguchi para. 670 and Fig. 94; bounding box position shift or offset of the minimum value, stored in the SPS/sequence parameter set, and therefore applying to all frames in the sequence {para. 331})], or wherein a first geometry offset for the reference frame and a second geometry offset for the current frame are different. Therefore, in light of the above discussion, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of the prior art references because such a combination would provide a predictable result with no change to their respective functionalities.

HUR in view of Iguchi additionally teaches with regard to claim 14.
The method of claim 13, wherein the first geometry offset and the second geometry offset are a minimum position of the geometry information of the reference frame and the current frame in a specific coordinate system [(Iguchi para. 670 and Fig. 94; bounding box position shift or offset of the minimum value; note that all frames have these, and HUR teaches current and reference frames)], or wherein the first geometry offset and the second geometry offset are a minimum position of geometry information of a plurality of frames in a specific coordinate system.

HUR in view of Iguchi additionally teaches with regard to claim 15. The method of claim 14, wherein the plurality of frames comprises all decoded frames and the current frame [(Iguchi para. 670 and Fig. 94; bounding box position shift or offset of the minimum value, stored in the SPS/sequence parameter set, and therefore applying to all frames in the sequence {para. 331}; also see HUR para. 472-474)], or wherein the plurality of frames comprises a number of previously decoded frames and the current frame, or wherein the plurality of frames comprises all decoded frames after a last random access point frame, the last random access point frame, and the current frame.

HUR in view of Iguchi additionally teaches with regard to claim 16. The method of claim 13, wherein the first geometry offset is a minimum position of the geometry information of the reference frame in a specific coordinate system [(Iguchi para. 670 and Fig. 94; bounding box position shift or offset of the minimum value, stored in the SPS/sequence parameter set, and therefore applying to all frames in the sequence {para. 331}; also see HUR para. 472-474)], or wherein the second geometry offset is a minimum position of the geometry information of the current frame in the specific coordinate system.

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over HUR in view of Ye.
Regarding Claim 10: HUR does not explicitly show that the motion threshold, the translation threshold, or the rotation threshold is coded with one of a fixed-length coding, a unary coding, or a truncated unary coding, or wherein at least one of the motion threshold, the translation threshold, or the rotation threshold is coded in a predictive way. However, in the same/related field of endeavor, Ye teaches that the motion threshold, the translation threshold, or the rotation threshold is coded with one of a fixed-length coding, a unary coding, or a truncated unary coding [(a threshold for motion amount determination is transmitted from encoder to decoder {para. 135}; fixed-length coding and entropy coding {para. 172}. Fixed-length coding is a simple form of coding data for an encoder to send to a decoder; unary is another simple entropy coding)], or wherein at least one of the motion threshold, the translation threshold, or the rotation threshold is coded in a predictive way. Therefore, in light of the above discussion, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of the prior art references because such a combination would provide a predictable result with no change to their respective functionalities.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Shahan Rahaman, whose telephone number is (571) 270-1438. The examiner can normally be reached 7am - 3:30pm. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Nasser Goodarzi, can be reached at (571) 272-4195. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.

Information regarding the status of an application may be obtained from Patent Center. Status information for unpublished applications is available through Patent Center for authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.

/SHAHAN UR RAHAMAN/ Primary Examiner, Art Unit 2426
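The Claim 10 rejection above turns on how a threshold value can be signaled with fixed-length, unary, or truncated unary coding. As a minimal illustration (not from the record; the function names and values are ours), the three binarizations work like this:

```python
def fixed_length(n: int, bits: int) -> str:
    """Binarize n using a fixed number of bits."""
    return format(n, f"0{bits}b")

def unary(n: int) -> str:
    """Binarize n as n ones followed by a terminating zero."""
    return "1" * n + "0"

def truncated_unary(n: int, cmax: int) -> str:
    """Unary coding, except the terminating zero is dropped when n equals
    the largest possible value cmax (the decoder already knows no larger
    value can follow)."""
    return "1" * n if n == cmax else "1" * n + "0"

# e.g. signaling a value of 3 with 3 bits, or with a known maximum of 5:
print(fixed_length(3, 3))      # -> 011
print(unary(3))                # -> 1110
print(truncated_unary(3, 5))   # -> 1110
print(truncated_unary(5, 5))   # -> 11111 (terminator omitted at the max)
```

Truncated unary saves one bit only at the maximum value, which is why it is typically paired with a known upper bound such as a signaled threshold range.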

Prosecution Timeline

Oct 16, 2024
Application Filed
Oct 24, 2025
Non-Final Rejection — §102, §103
Jan 28, 2026
Response Filed
Mar 02, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599294 — IMAGE-RECORDING DEVICE FOR IMPROVED LOW LIGHT INTENSITY IMAGING AND ASSOCIATED IMAGE-RECORDING METHOD (2y 5m to grant; granted Apr 14, 2026)
Patent 12602765 — DEFECT INSPECTION SYSTEM AND DEFECT INSPECTION METHOD (2y 5m to grant; granted Apr 14, 2026)
Patent 12598328 — VIDEO SIGNAL PROCESSING METHOD AND DEVICE (2y 5m to grant; granted Apr 07, 2026)
Patent 12593035 — IMAGE ENCODING/DECODING METHOD AND DEVICE (2y 5m to grant; granted Mar 31, 2026)
Patent 12586224 — THREE-DIMENSIONAL SCANNING SYSTEM AND METHOD FOR OPERATING SAME (2y 5m to grant; granted Mar 24, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 76%
With Interview: 88% (+12.6%)
Median Time to Grant: 2y 11m
PTA Risk: Moderate

Based on 633 resolved cases by this examiner. Grant probability derived from career allow rate.
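The headline figures can be reproduced from the counts shown on this page. A small sketch of the arithmetic (the 88.3% with-interview rate is a hypothetical stand-in, since only the rounded 88% and the +12.6% lift appear in the data):

```python
# Career allow rate, from the counts shown above (479 granted / 633 resolved)
granted, resolved = 479, 633
allow_rate = granted / resolved        # ~0.757, displayed as 76%
print(round(allow_rate * 100))         # -> 76

# Hypothetical: an allow rate of ~88.3% on interviewed cases would yield
# the +12.6 percentage-point lift shown (0.883 is assumed, not from the page)
with_interview = 0.883
lift_points = (with_interview - allow_rate) * 100
print(round(lift_points, 1))           # -> 12.6
```

Note that the lift reads as a difference in percentage points between interviewed and non-interviewed cases, not a multiplicative factor; under that reading the numbers on this page are internally consistent.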
