DETAILED ACTION
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1 – 20 are rejected under 35 U.S.C. 103 as being unpatentable over Pettersson et al. (Publication: US 2017/0238011 A1) in view of Umeyama et al. (Publication: US 2020/0145570 A1) and Yang et al. (Publication: US 2021/0314588 A1).
Regarding claim 1, Pettersson discloses identifying, for each image frame of a plurality of image frames, a dynamic maximum supported range of motion ([0061] - The motion vector search range may be different depending on the scaling factor of the scaled reference picture. [0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time, and conversely, small scaling factors may be omitted for reference pictures 2 that are farther apart in presentation time, “maximum supported range of motion”.
[0059] For each scaling factor, the search area (i.e. the reference template areas 6 and the prediction area 5) of the reference picture 2 may be scaled to the new scale. The scaling, i.e. the resizing of the search area, may be performed in different ways. The scaling may be performed using, for each respective scaling factor, an interpolation filter to generate a filtered reference sample that corresponds to the sample of the current picture 1. This gives a set of scaled samples (each scaled sample having a particular scaling factor) of the search area, “frames, several frames”.); and
generating motion vectors for each image frame of the plurality of image frames based on the corresponding dynamic maximum supported range of motion ([0006], [0061] - generate a motion vector, the motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture “motion vectors for each image frame of the plurality of image frames”.
[0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time.).
Pettersson does not disclose the following limitation; however, Umeyama discloses:
generating, at the processing unit ([0282] - the generating unit generates metadata indicating a characteristic amount of each frame of the moving image.).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Pettersson with generating, at the processing unit, as taught by Umeyama. The motivation for doing so is to enable moving image production.
Pettersson in view of Umeyama does not disclose the following limitation; however, Yang discloses:
identifying, for the image frame, a dynamic maximum supported range of motion based on data indicating an expected amount of motion in the plurality of image frames (
[0103] - motion vector predictors are enabled in the plurality of image frames.
[0006] - a prediction mode enabled flag and an affine enabled flag both corresponding to one or more image frames; determining, from the bitstream, a maximum index corresponding to the one or more image frames when the affine enabled flag is true, wherein an index value of the maximum index is in an index range determined based on the prediction mode enabled flag; determining a maximum number of zero or more subblock-based merging motion vector prediction (MVP) candidates based on the maximum index when the affine enabled flag is true; and reconstructing the one or more image frames based on the maximum number of the zero or more subblock-based merging MVP candidates.
Thus, the enabled flag and disabled flag correspond to “dynamic”;
the expected amount of motion is determined by the enabled or disabled flag; and
identifying the image frame whose flag is enabled corresponds to “dynamic,” with a maximum index range determined based on the prediction mode enabled flag.).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Pettersson in view of Umeyama with identifying, for the image frame, a dynamic maximum supported range of motion based on data indicating an expected amount of motion in the plurality of image frames, as taught by Yang. The motivation for doing so is to perform the coding more efficiently with the help of the flag and index.
Regarding claim 2, see rejection on claim 9.
Regarding claim 3, see rejection on claim 10.
Regarding claim 4, see rejection on claim 11.
Regarding claim 5, see rejection on claim 12.
Regarding claim 6, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 5.
Pettersson discloses wherein the pre-pass comprises a search for matching blocks between two or more image frames of the plurality of image frames ([0057] FIG. 4c illustrates the scaled reference picture 2′ when the picture has been scaled with a factor of 1.33. The scaled reference template areas 6′ now match better with the template areas 4 of the current picture 1, “blocks”. In practice the optimal scaling factor is likely to be close or very close to 1 for reference pictures that are close to each other in presentation time.
[0059] For each scaling factor, the search area (i.e. the reference template areas 6 and the prediction area 5) of the reference picture 2 may be scaled to the new scale.
[0061] - The scaling may be performed either as a pre-pass.).
Regarding claim 7, see rejection on claim 14.
Regarding claim 8, Pettersson discloses a method, comprising ([0154] FIG. 15 illustrates schematically an encoder and a decoder and means for implementing embodiments according to the present teachings. The encoder 40 and the decoder 50 each comprises a processor 41, 51 comprising any combination of one or more of a central processing unit (CPU), multiprocessor, microcontroller, digital signal processor (DSP), application specific integrated circuit etc. capable of executing software instructions stored in a memory 42, 52 which can thus be a computer program product. The processor 41 of the encoder 40 can be configured to execute any of the following methods):
identifying a first maximum supported range of motion for a first frame of a plurality of image frames and a second maximum supported range of motion for a second frame of the plurality of image frames ([0061] - The motion vector search range may be different depending on the scaling factor of the scaled reference picture. [0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time, and conversely, small scaling factors may be omitted for reference pictures 2 that are farther apart in presentation time, “maximum supported range of motion”.
[0059] For each scaling factor, the search area (i.e. the reference template areas 6 and the prediction area 5) of the reference picture 2 may be scaled to the new scale. The scaling, i.e. the resizing of the search area, may be performed in different ways. The scaling may be performed using, for each respective scaling factor, an interpolation filter to generate a filtered reference sample that corresponds to the sample of the current picture 1. This gives a set of scaled samples (each scaled sample having a particular scaling factor) of the search area, “first frame, second frame”.);
generating motion vectors for the first frame based on the first maximum supported range of motion ([0006] , [0061] - generate a motion vector, the motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture “different, motion vectors for the first frame and second frame”.
[0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time. ); and
generating motion vectors for the second frame based on the second maximum supported range of motion ([0006], [0061] - generate a motion vector, the motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture “different, motion vectors for the first frame and second frame”.
[0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time.).
Pettersson does not disclose the following limitation; however, Umeyama discloses:
generating, at the processing unit ([0282] - the generating unit generates metadata indicating a characteristic amount of each frame of the moving image.).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Pettersson with generating, at the processing unit, as taught by Umeyama. The motivation for doing so is to enable moving image production.
Pettersson in view of Umeyama does not disclose the following limitation; however, Yang discloses:
identifying, for the image frame, a dynamic maximum supported range of motion based on data indicating an expected amount of motion in the plurality of image frames (
[0103] - motion vector predictors are enabled in the plurality of image frames.
[0006] - a prediction mode enabled flag and an affine enabled flag both corresponding to one or more image frames; determining, from the bitstream, a maximum index corresponding to the one or more image frames when the affine enabled flag is true, wherein an index value of the maximum index is in an index range determined based on the prediction mode enabled flag; determining a maximum number of zero or more subblock-based merging motion vector prediction (MVP) candidates based on the maximum index when the affine enabled flag is true; and reconstructing the one or more image frames based on the maximum number of the zero or more subblock-based merging MVP candidates.
Thus, the enabled flag and disabled flag correspond to “dynamic”;
the expected amount of motion is determined by the enabled or disabled flag; and
identifying the image frame whose flag is enabled corresponds to “dynamic,” with a maximum index range determined based on the prediction mode enabled flag.).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Pettersson in view of Umeyama with identifying, for the image frame, a dynamic maximum supported range of motion based on data indicating an expected amount of motion in the plurality of image frames, as taught by Yang. The motivation for doing so is to perform the coding more efficiently with the help of the flag and index.
Regarding claim 9, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 8.
Pettersson discloses generating the motion vectors for the first frame comprises setting a first search range based on the first maximum supported range of motion ([0061] The motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture “different, motion vectors for the first and second”.
[0006] - generate a motion vector.); and generating the motion vectors for the second frame comprises setting a second search range based on the second maximum supported range of motion ([0061] The motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture “different, motion vectors for the first and second”.
[0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time, and conversely, small scaling factors may be omitted for reference pictures 2 that are farther apart in presentation time, “maximum supported range of motion”
[0006] - generate a motion vector.),
the second search range different from the first search range ([0059] For each scaling factor, the search area (i.e. the reference template areas 6 and the prediction area 5) of the reference picture 2 may be scaled to the new scale. The scaling, i.e. the resizing of the search area, may be performed in different ways. The scaling may be performed using, for each respective scaling factor, an interpolation filter to generate a filtered reference sample that corresponds to the sample of the current picture 1. This gives a set of scaled samples (each scaled sample having a particular scaling factor) of the search area, “first frame, second frame”
[0061] The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture “different, motion vectors for the first and second”.).
Regarding claim 10, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 8.
Pettersson discloses generating the motion vectors for the first frame comprises setting a first block size based on the first maximum supported range of motion ([0061] The motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture “different, motion vectors for the first and second”.
[0006] - generate a motion vector.); and generating the motion vectors for the second frame comprises setting a second block size based on the second maximum supported range of motion, the second block size different from the first block size ([0061] The motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture “different, motion vectors for the first and second”.
[0006] - generate a motion vector.).
Regarding claim 11, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 8, including the data indicating an expected amount of motion in the plurality of image frames.
Pettersson discloses wherein the data includes information provided by an executing application ([0061] - The motion vector search range may be different depending on the scaling factor of the scaled reference picture. [0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time, and conversely, small scaling factors may be omitted for reference pictures 2 that are farther apart in presentation time, “maximum supported range of motion”.
[0011] – the method above is performed by a program.).
Regarding claim 12, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 8, including the data indicating an expected amount of motion in the plurality of image frames.
Pettersson discloses wherein the data is generated based on a pre-pass of the first frame (
[0085] By scaling the reference picture(s) 12, 13 with the center of the zoom as the origin, the displacement of a block 15, 16 due to zoom will be corrected. The scaled total motion vectors tv′.sub.0 and tv′.sub.1 will be calculated.
[0061] - The motion vector search range may be different depending on the scaling factor of the scaled reference picture. [0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time
[0061] - The scaling is performed as a pre-pass.).
Yang discloses that the data indicating an expected amount of motion in the plurality of image frames is generated based on the data (
[0103] - motion vector predictors are enabled in the plurality of image frames.
[0006] - a prediction mode enabled flag and an affine enabled flag both corresponding to one or more image frames; determining, from the bitstream, a maximum index corresponding to the one or more image frames when the affine enabled flag is true, wherein an index value of the maximum index is in an index range determined based on the prediction mode enabled flag; determining a maximum number of zero or more subblock-based merging motion vector prediction (MVP) candidates based on the maximum index when the affine enabled flag is true; and reconstructing the one or more image frames based on the maximum number of the zero or more subblock-based merging MVP candidates.
Thus, the enabled flag and disabled flag are the “data indicating” the expected amount of motion; and
identifying the image frame whose flag is enabled corresponds to “dynamic,” with a maximum index range determined based on the prediction mode enabled flag.).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Pettersson in view of Umeyama with the data indicating an expected amount of motion in the plurality of image frames being generated based on the data, as taught by Yang. The motivation for doing so is to perform the coding more efficiently with the help of the flag and index.
Regarding claim 13, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 12.
Pettersson discloses wherein the pre-pass comprises a search for matching blocks between the first frame and a reference frame ([0057] FIG. 4c illustrates the scaled reference picture 2′ when the picture has been scaled with a factor of 1.33. The scaled reference template areas 6′ now match better with the template areas 4 of the current picture 1, “blocks”. In practice the optimal scaling factor is likely to be close or very close to 1 for reference pictures that are close to each other in presentation time.
[0059] For each scaling factor, the search area (i.e. the reference template areas 6 and the prediction area 5) of the reference picture 2 may be scaled to the new scale.
[0061] - The scaling may be performed either as a pre-pass.).
Regarding claim 14, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 8, including the data indicating an expected amount of motion in the plurality of image frames.
Pettersson discloses wherein the data is provided by at least one selected from a group consisting of an application, an application type, a power setting, and a quality-of-service parameter ([0061] - The motion vector search range may be different depending on the scaling factor of the scaled reference picture. [0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time, and conversely, small scaling factors may be omitted for reference pictures 2 that are farther apart in presentation time, “maximum supported range of motion”.
[0011] – the method above is performed by a program.).
Regarding claim 15, Pettersson discloses a processing system comprising a processor including one or more processor cores configured to:
([0154] FIG. 15 illustrates schematically an encoder and a decoder and means for implementing embodiments according to the present teachings. The encoder 40 and the decoder 50 each comprises a processor 41, 51 comprising any combination of one or more of a central processing unit (CPU), multiprocessor, microcontroller, digital signal processor (DSP), application specific integrated circuit etc. capable of executing software instructions stored in a memory 42, 52 which can thus be a computer program product. The processor 41 of the encoder 40 can be configured to execute any of the following methods)
identify, for each image frame of a plurality of image frames, a dynamic maximum supported range of motion ([0061] - The motion vector search range may be different depending on the scaling factor of the scaled reference picture. [0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time, and conversely, small scaling factors may be omitted for reference pictures 2 that are farther apart in presentation time, “maximum supported range of motion”.
[0059] For each scaling factor, the search area (i.e. the reference template areas 6 and the prediction area 5) of the reference picture 2 may be scaled to the new scale. The scaling, i.e. the resizing of the search area, may be performed in different ways. The scaling may be performed using, for each respective scaling factor, an interpolation filter to generate a filtered reference sample that corresponds to the sample of the current picture 1. This gives a set of scaled samples (each scaled sample having a particular scaling factor) of the search area, “first frame, second frame”.); and
generate motion vectors for each image frame of the plurality of image frames based on the corresponding maximum supported range of motion ([0006] , [0061] - generate a motion vector, the motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture “different, motion vectors for the first frame and second frame”.
[0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time.).
Pettersson does not disclose the following limitation; however, Umeyama discloses:
generate, at a processing unit ([0125] - The identification information may be automatically generated or generated manually in accordance with a user operation. Generation of the identification information may be switched between automatic generation and manual generation in accordance with the photography mode set by the CPU 2509.).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Pettersson with generating, at a processing unit, as taught by Umeyama. The motivation for doing so is to enable moving image production.
Pettersson in view of Umeyama does not disclose the following limitation; however, Yang discloses identifying, for each image frame of a plurality of image frames, a dynamic maximum supported range of motion based on data indicating an expected amount of motion in the plurality of image frames, as discussed in the rejection of claim 1 above.
Regarding claim 16, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 15.
Pettersson discloses setting a search range based on the corresponding maximum supported range of motion ([0061] The motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture.
[0006] - generate a motion vector.); and
generating the motion vectors based on the search range ( [0006] - generate the motion vector on matching block. The motion vectors MV0 and MV1 are proportional to the temporal differences TD0 and TD1. The motion vectors along the motion trajectory that minimizes the prediction error is selected.
[0061] Template matching may be applied in order to find the best template match for the scaled reference picture 2′. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search may be selected such that it is proportional to the current scaling factor, i.e. the input candidate motion vector may be scaled according to the scaling factor, “based on the search range for each of the plurality of image frames”.).
Pettersson does not disclose the following limitation; however, Umeyama discloses:
for each of the plurality of image frames ([0230] indicating a characteristic amount (a frame maximum brightness) of each frame of a moving image can be generated and added to moving image data.),
generating for each of the plurality of image frames ([0282] - the generating unit generates metadata indicating a characteristic amount of each frame of the moving image.).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Pettersson with, for each of the plurality of image frames, generating for each of the plurality of image frames, as taught by Umeyama. The motivation for doing so is to enable moving image production.
Regarding claim 17, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 15.
Pettersson discloses setting, for each of the plurality of image frames, a block size based on the corresponding maximum supported range of motion ([0061] The motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture); and
generating the motion vectors based on the block size for each of the plurality of image frames ([0061] The motion vector search range may be different depending on the scaling factor of the scaled reference picture. An initial motion vector for the search is selected such that it is proportional to the current scaling factor. The search is performed within a selected motion vector search range, i.e. the motion vector range defines the search area. The motion vector search range may be different depending on the scaling factor of the scaled reference picture “different, motion vectors for the first and second”.
[0006] - generate a motion vector.).
Regarding claim 18, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 15, including the data indicating an expected amount of motion in the plurality of image frames.
Pettersson discloses wherein the data is based on information provided by an executing application ([0061] - The motion vector search range may be different depending on the scaling factor of the scaled reference picture. [0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time, and conversely, small scaling factors may be omitted for reference pictures 2 that are farther apart in presentation time, “maximum supported range of motion”.
[0011] – the method above is performed by a program.).
Regarding claim 19, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 15, including the data indicating an expected amount of motion in the plurality of image frames.
Pettersson discloses wherein the data is generated based on a pre-pass of the plurality of image frames ([0085] By scaling the reference picture(s) 12, 13 with the center of the zoom as the origin, the displacement of a block 15, 16 due to zoom will be corrected. The scaled total motion vectors tv′.sub.0 and tv′.sub.1 will be calculated.
[0061] - The motion vector search range may be different depending on the scaling factor of the scaled reference picture. [0058] - the scaling factors may be selected according to the (temporal) distance to the reference picture 2 such that large scaling factors may be omitted for reference pictures 2 that are closer in presentation time.
[0059] For each scaling factor, the search area (i.e. the reference template areas 6 and the prediction area 5) of the reference picture 2 may be scaled to the new scale, “plurality of image frames” .
[0061] - The scaling is performed as a pre-pass.).
Regarding claim 20, Pettersson in view of Umeyama and Yang discloses all the limitations of claim 19.
Pettersson discloses wherein the pre-pass comprises a search for matching blocks between the plurality of image frames ([0057] FIG. 4c illustrates the scaled reference picture 2′ when the picture has been scaled with a factor of 1.33. The scaled reference template areas 6′ now match better with the template areas 4 of the current picture 1. In practice the optimal scaling factor is likely to be close or very close to 1 for reference pictures that are close to each other in presentation time.
[0059] For each scaling factor, the search area (i.e. the reference template areas 6 and the prediction area 5) of the reference picture 2 may be scaled to the new scale.
[0061] - The scaling may be performed as a pre-pass.).
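The search for matching blocks cited from Pettersson ¶¶[0057]–[0059] is, in substance, a block-matching search over a (scaled) reference picture. The following sketch shows one common form of such a search, using a sum-of-absolute-differences (SAD) criterion; the function name, the SAD criterion, and the exhaustive search strategy are illustrative assumptions and are not taken from the reference:

```python
# Illustrative sketch (hypothetical names): an exhaustive SAD-based search
# for a block in a (scaled) reference frame that best matches a block of
# the current frame, confined to a motion vector search range.
import numpy as np

def best_match(cur_block: np.ndarray, ref: np.ndarray,
               top: int, left: int, search: int) -> tuple[int, int]:
    """Return the (dy, dx) displacement, within +/- search pixels of
    (top, left), minimizing the sum of absolute differences."""
    h, w = cur_block.shape
    best = (0, 0)
    best_sad = float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > ref.shape[0] or x + w > ref.shape[1]:
                continue  # candidate block falls outside the reference picture
            cand = ref[y:y + h, x:x + w].astype(int)
            sad = np.abs(cand - cur_block.astype(int)).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best
```

In practice the search range passed as `search` would be the scaling-factor-dependent range discussed above, and `ref` would be the scaled reference picture.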
Response to Arguments
Claim Rejection Under 35 U.S.C. 103
The Examiner suggests amending the claims to recite a specific element such that, when the claims are read in light of the invention, they are directed to a unique technology. The Examiner can be reached at 571-270-0724 for further discussion.
Applicant asserts “Petterson discloses that a motion vector search range depends on the scaling factor of a scaled reference picture. However, Petterson does not disclose or suggest identifying a dynamic maximum supported range of motion for each frame in a set of frames based on data indicating an expected amount of motion for that set of frames.”
The argument has been fully considered and is persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of the Yang reference.
Regarding claims 2 – 7, 9 – 14, and 16 – 20, the Applicant asserts that these claims are allowable based on their dependency from independent claims 1, 8, and 15, respectively. The Examiner respectfully cannot concur with the Applicant, for the same reasons noted in the Examiner’s response to the arguments asserted for claims 1, 8, and 15, respectively.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Ming Wu whose telephone number is (571) 270-0724. The examiner can normally be reached on Monday - Friday.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Devona Faulk can be reached on 571-272-7515. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Ming Wu/
Primary Examiner, Art Unit 2616