DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/06/2026 has been entered.
Response to Amendment and Argument
Applicant’s amendments and arguments with respect to pending claims 1-16, filed on 01/15/2026, have been fully considered but are moot in view of the new ground(s) of rejection set forth below.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-8 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. (US 20130294499 A1) in view of Alghafli et al., "Identification and recovery of video fragments for forensics file carving," 2016 11th International Conference for Internet Technology and Secured Transactions (ICITST), Barcelona, Spain, 2016, pp. 267-272, and Na et al., "Frame-Based Recovery of Corrupted Video Files Using Video Codec Specifications," IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 23, NO. 2, FEBRUARY 2014, pp. 517-526.
Regarding claim 1, Wang discloses a method for generating a Sequence Parameter Set and a Picture Parameter Set for decoding an H.264 coded video file fragment, comprising: identifying a start of frame data (¶0102-0103, 0130-0131: a video decoder may determine to perform random access including an instantaneous decoder refresh (IDR) picture); creating a parameter dictionary (¶0009, 0099, 0054-0055: encoding a parameter set; storing a parameter set); detecting an entropy coding-mode (¶0050-0052: sending syntax data related to the coding mode of the encoder to be used by the decoder); updating the parameter dictionary (¶0054-0055, 0170-0171: a parameter set update); identifying a plurality of headers (¶0123-0125, 0127: different types of parameters identified using different IDs, e.g., SPS ID, PPS ID); and verifying a generated picture (¶0022, 0054, 0073: an indication of whether a parameter set update can occur in a portion of a bitstream, i.e., a coded video sequence of the bitstream or a group of pictures (GOP) in the coded video sequence of the bitstream; the indication enables the decoder to determine whether to update a stored parameter set).
Wang does not disclose verifying that start codes of the start of frame data are followed by a header identifier…wherein the method is configured to decode an H.264 coded video file fragment that is missing a file header.
Alghafli teaches wherein the method is configured to decode an H.264 coded video file fragment that is missing a file header (Section I: In this research we will focus on solving the problem of recovering video file fragments without making use of file system metadata…Section IV-A: This method applies in the cases where codec specifications are overwritten. In other words, in this case there will be no meta-data box to figure out the samples sizes which are usually used to identify the header length. We are going to use the redundancy that is added by the syntax description. Moreover, we are going to recover video fragments based on the format of video NAL unit. Our approach will search for the header of the video NAL unit rather than the header length or NAL unit start code prefix of H.264 bitstream).
Alghafli further discloses “Na et al. proposed a frame-based technique to recover video fragments (a frame being the smallest meaningful unit of a video file) [6]. Their technique consists of two main steps: identification and connection of video frames. The video frames were identified based on the start code, such as 0x00000001 for H.264 bitstreams.” Page 268.
Na et al. “Frame-Based Recovery of Corrupted Video Files Using Video Codec Specifications” IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 23, NO. 2, FEBRUARY 2014 paper discloses “H.264 requires SPS, PPS, and frame data (slices) for decoding. Figure 3(b) shows SPS, PPS, frame data (slice) encoded by H.264_Start in an unallocated space. The red box indicates decoding header information consisting of SPS and PPS, if a 1-byte code following a start code (0x00000001) is 0x67, or PPS if it is 0x68…Instantaneous Decoder Refresh (IDR) frames and P/B frames. An IDR frame, as an independent frame, can also express pictures...In H.264 standard, if the last 5 bits of the first byte of a NAL unit is 5, then it denotes IDR frame (or P/B frame if 1). In figure 3(b), the data in the blue box denotes IDR frame data because the code 0x65 comes after the start code. H.264_Start finds the start code (0x00000001) in an unallocated space in the same manner as in MPEG-4 Visual, checks the last 5 bits after the start code. If the last 5 bits are 7, then SPS (PPS if 8, IDR frame if 5, and P/B frame if 1).” Page 521. Thus, Alghafli-Na teaches verifying that start codes of the start of frame data are followed by a header identifier.
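The byte-level identification quoted from Na et al. above can be sketched in a few lines. The following is a minimal, illustrative sketch only; the function name and the 4-byte start-code assumption are the editor's, and neither reference's actual implementation is reproduced:

```python
from typing import Optional

# Four-byte Annex B start code assumed; three-byte (0x000001) start
# codes, which H.264 also permits, are not handled in this sketch.
START_CODE = b"\x00\x00\x00\x01"

def classify_nal_after_start_code(data: bytes) -> Optional[str]:
    """Return the type of the first NAL unit found after a start code.

    Per the Na et al. excerpt: the last 5 bits of the byte following the
    start code give nal_unit_type (7 = SPS, 8 = PPS, 5 = IDR slice,
    1 = non-IDR slice). Returns None if no start code, no payload byte,
    or a type other than the four discussed above is found.
    """
    idx = data.find(START_CODE)
    if idx < 0 or idx + len(START_CODE) >= len(data):
        return None
    nal_unit_type = data[idx + len(START_CODE)] & 0x1F  # last 5 bits
    return {7: "SPS", 8: "PPS", 5: "IDR", 1: "non-IDR"}.get(nal_unit_type)
```

For example, 0x67 & 0x1F is 7, so a 0x67 byte immediately after the start code marks an SPS, consistent with the quoted passage (likewise 0x68 marks a PPS and 0x65 an IDR frame).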
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Wang by incorporating the teaching of Alghafli as noted above, in order to obtain a method of recovering video files in cases where the video codec specifications were overwritten (Alghafli: abstract).
Regarding claim 2, Wang discloses the method of claim 1, wherein the start of frame data is identified by identifying an Instantaneous Decoder Refresh (¶0100-0101, 0103, 0113: the decoder configured to decode an indication corresponding to the IDR).
Regarding claim 3, Wang discloses the method of claim 1, wherein the parameter dictionary comprises a plurality of tuples (¶0057, 0073, 0097: parameter sets).
Regarding claim 4, Wang discloses the method of claim 3, wherein the plurality of tuples represents a plurality of parameters in the Sequence Parameter Set, and wherein the plurality of tuples represents a plurality of parameters in the Picture Parameter Set (¶0073, 0095-0097: Parameter sets, including VPSs, SPSs, PPSs, and/or APSs).
Regarding claim 5, Wang discloses the method of claim 4, wherein the plurality of parameters in the Sequence Parameter and the plurality of parameters in the Picture Parameter Set comprise core parameters (¶0073, 0095-0097: Parameter sets, including VPSs, SPSs, PPSs, and/or APSs).
Regarding claim 6, Wang discloses the method of claim 1, wherein the entropy coding-mode is detected with a classifier (¶0050-0052: sending syntax data related to coding mode of the encoder to be used by the decoder).
Regarding claim 7, Wang discloses the method of claim 6, wherein the classifier differentiates between a first entropy coding mode using a CABAC coding method and a second entropy coding mode using a CAVLC coding method (¶0050-0052: CABAC, CAVLC modes).
Regarding claim 8, Wang discloses the method of claim 1, wherein a correlation between the generated picture decoded with a generated header and an original picture decoded using an original header is used to verify the generated picture (¶0127-0128, 0137, 0167-0171: a flag may indicate whether parameter sets of can be updated in the coded video sequence, and updating the parameter set based on comparison between content of parameter sets).
Regarding claim 16, Wang discloses a method for generating a plurality of parameters used for decoding a file fragment, comprising: inputting a plurality of I frames and a parameter dictionary into a system (¶0076, 0078, 0103, 0130-0131: a decoder receiving a video bitstream including IDR pictures; ¶0127: receiving a parameter set), wherein the system executes the method comprising: detecting an entropy coding-mode (¶0050-0052: sending syntax data related to the coding mode of the encoder to be used by the decoder); updating the parameter dictionary (¶0054-0055, 0170-0171: a parameter set update); identifying a plurality of headers (¶0123-0125, 0127, 0153: different types of parameters identified using different IDs, e.g., SPS ID, PPS ID); and verifying a generated picture (¶0022, 0054, 0073: an indication of whether a parameter set update can occur in a portion of a bitstream, i.e., a coded video sequence of the bitstream or a group of pictures (GOP) in the coded video sequence of the bitstream; the indication enables the decoder to determine whether to update a stored parameter set).
Wang does not disclose wherein the system verifies that start codes of the plurality of I frames are followed by a header identifier…wherein the method is configured to decode an H.264 coded video file fragment that is missing a file header.
Alghafli teaches wherein the method is configured to decode an H.264 coded video file fragment that is missing a file header (Section I: In this research we will focus on solving the problem of recovering video file fragments without making use of file system metadata…Section IV-A: This method applies in the cases where codec specifications are overwritten. In other words, in this case there will be no meta-data box to figure out the samples sizes which are usually used to identify the header length. We are going to use the redundancy that is added by the syntax description. Moreover, we are going to recover video fragments based on the format of video NAL unit. Our approach will search for the header of the video NAL unit rather than the header length or NAL unit start code prefix of H.264 bitstream).
Alghafli further discloses “Na et al. proposed a frame-based technique to recover video fragments (a frame being the smallest meaningful unit of a video file) [6]. Their technique consists of two main steps: identification and connection of video frames. The video frames were identified based on the start code, such as 0x00000001 for H.264 bitstreams.” Page 268.
Na et al. “Frame-Based Recovery of Corrupted Video Files Using Video Codec Specifications” IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 23, NO. 2, FEBRUARY 2014 paper discloses “H.264 requires SPS, PPS, and frame data (slices) for decoding. Figure 3(b) shows SPS, PPS, frame data (slice) encoded by H.264_Start in an unallocated space. The red box indicates decoding header information consisting of SPS and PPS, if a 1-byte code following a start code (0x00000001) is 0x67, or PPS if it is 0x68…Instantaneous Decoder Refresh (IDR) frames and P/B frames. An IDR frame, as an independent frame, can also express pictures...In H.264 standard, if the last 5 bits of the first byte of a NAL unit is 5, then it denotes IDR frame (or P/B frame if 1). In figure 3(b), the data in the blue box denotes IDR frame data because the code 0x65 comes after the start code. H.264_Start finds the start code (0x00000001) in an unallocated space in the same manner as in MPEG-4 Visual, checks the last 5 bits after the start code. If the last 5 bits are 7, then SPS (PPS if 8, IDR frame if 5, and P/B frame if 1).” Page 521. Alghafli-Na teaches wherein the system verifies that start codes of the plurality of I frames are followed by a header identifier.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Wang by incorporating the teaching of Alghafli as noted above, in order to obtain a method of recovering video files in cases where the video codec specifications were overwritten (Alghafli: abstract).
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 9-15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Altinisik et al., "Automatic Generation of H.264 Parameter Sets to Recover Video File Fragments," IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, VOL. 16, 2021, already of record.
Regarding claim 9, Altinisik teaches a method for generating a file header for decoding a file fragment, comprising: constructing a dataset (page 4859: we built a comprehensive dataset of videos derived from publicly available sources); sorting header entries of the dataset in order of decreasing priority based on at least a prioritization of a combination of core parameter values in the dataset and a frequency of each parameter value (page 4862: Header entries are initially sorted in order of decreasing priority based on two criteria. The first criterion prioritizes the combination of core parameter values seen in the design set. Out of the 5,115 unique SPS and PPS headers that cover the design set, we observed 2,663 unique 10-tuples. Therefore, the first 2.6K header entries incorporate these values sorted based on their frequency in the design set. The second criterion determines the sorting of subsequent entries based on frequency of each parameter value); extracting a plurality of core parameters in the dataset (page 4862: B. Parameter Dictionary Creation …we identify as core parameters determine the complexity of generating SPS and PPS headers blindly); and using the plurality of core parameters to create a parameter dictionary (page 4862: Overall the dictionary contains around 3.5 billion entries considering possible values for the 10 core parameters. See table II), wherein the file header is generated for a file fragment that is missing a file header (abstract: We address the problem of decoding video file fragments when the necessary encoding parameters are missing. With this objective, we propose a method that automatically generates H.264 video headers containing these parameters and extracts coded pictures in the partially available compressed video data).
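The two-criterion ordering Altinisik describes (observed core-parameter combinations first, by frequency; remaining entries next, scored by per-parameter value frequency) can be sketched as follows. This is an editor's illustration under stated assumptions: all names and data shapes are hypothetical, and the paper's actual implementation is not reproduced.

```python
from collections import Counter

def order_dictionary_entries(observed_core_tuples, candidate_entries, value_counts):
    """Illustrative two-stage ordering of header-dictionary entries.

    observed_core_tuples: core-parameter tuples seen in a design set.
    candidate_entries: further candidate entries (tuples of values).
    value_counts: one Counter per parameter position, giving how often
    each value was observed; used to score entries not seen as a whole.
    """
    # Criterion 1: combinations actually observed, most frequent first.
    tuple_freq = Counter(observed_core_tuples)
    seen = [t for t, _ in tuple_freq.most_common()]

    # Criterion 2: unseen entries scored by summed per-parameter value
    # frequencies (one plausible realization of "frequency of each
    # parameter value"), highest score first.
    def score(entry):
        return sum(value_counts[i][v] for i, v in enumerate(entry))

    unseen = [e for e in candidate_entries if e not in tuple_freq]
    unseen.sort(key=score, reverse=True)
    return seen + unseen
```

In this sketch, any header whose core-parameter tuple appeared in the design set is ranked ahead of all unseen combinations, mirroring the paper's description of the first ~2.6K entries.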
Regarding claim 10, Altinisik teaches the method of claim 9, wherein the file fragment is a video file fragment encoded using an H.264 format (abstract).
Regarding claim 11, Altinisik teaches the method of claim 9, wherein the file header comprises a Sequence Parameter Set (page 4859: SPS and PPS headers).
Regarding claim 12, Altinisik teaches the method of claim 9, wherein the file header comprises a Picture Parameter Set (page 4859: SPS and PPS headers).
Regarding claim 13, Altinisik teaches the method of claim 9, wherein the parameter dictionary comprises a plurality of tuples (page 4862: The parameter dictionary is a collection of 63-tuples with each entry H = (p1, p2,..., p63) representing a realization of all parameters in SPS and PPS).
Regarding claim 14, Altinisik teaches the method of claim 13, wherein the plurality of tuples represents a plurality of parameters in the Sequence Parameter Set, and wherein the plurality of tuples represents a plurality of parameters in the Picture Parameter Set (page 4862: The parameter dictionary is a collection of 63-tuples with each entry H = (p1, p2,..., p63) representing a realization of all parameters in SPS and PPS).
Regarding claim 15, Altinisik teaches the method of claim 14, wherein the plurality of parameters in the Sequence Parameter Set and the plurality of parameters in the Picture Parameter Set comprise core parameters (page 4862: The parameter dictionary is a collection of 63-tuples with each entry H = (p1, p2,..., p63) representing a realization of all parameters in SPS and PPS…what we identify as core parameters determine the complexity of generating SPS and PPS headers blindly. See table II).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NATHNAEL AYNALEM whose telephone number is (571)270-1482. The examiner can normally be reached M-F 9AM-5:30 PM ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, SATH PERUNGAVOOR can be reached at 571-272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NATHNAEL AYNALEM/Primary Examiner, Art Unit 2488