DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Remarks
1. It should be noted that with respect to claims 2-19, 21-30, and 32-34, the claims recite a generation unit, a transmission unit, a segmentation unit, a file generation unit, a partial region file generation information generation unit, a request processing unit, and an acquisition unit, all of which invoke 35 U.S.C. 112(f). These limitations are therefore limited to the hardware implementations, or combinations of hardware and software, and equivalents thereof, disclosed in applicant's specification, namely a CPU executing the functions and algorithms described in the claim limitations (see applicant's specification at [0060]-[0142] and Figs. 6-8).
Claim Rejections - 35 USC § 112
2. The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
3. Claim 20 is rejected under 35 U.S.C. 112(b) as being vague and indefinite for failing to particularly point out and distinctly claim the subject matter which applicant regards as the invention for the reasons stated below. Claim 20 recites “another apparatus” without antecedent basis for any other apparatus being present. Appropriate correction is required.
Claim Rejections - 35 USC § 102
4. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
5. Claims 20-27, 29, and 31 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Di et al., US 2019/0238933.
Regarding claim 20, Di teaches an image processing method comprising:
generating a bitstream of a partial region constituted by some of subpictures among the subpictures included in a picture on a basis of a request from another apparatus (See [0165]-[0179], request from the client for a region of interest; Di, Fig. 4 and [0161], which disclose multiple client devices); and transmitting the bitstream of the partial region to the another apparatus (See [0165]-[0179]; Di, Fig. 4 and [0161], which disclose multiple client devices).
Regarding claim 21, Di teaches an image processing apparatus comprising:
a partial region file generation information generation unit that generates, on a basis of data of content, partial region file generation information for generating a file for storing a bitstream of a partial region of a picture; and a file generation unit that generates a control file for controlling distribution of a content file to store the partial region file generation information (See [0010]-[0013], [0019]-[0025], and [0165]-[0179], which disclose the MPD/XML DASH file that stores the spatial regions and region-of-interest information used to send the requested regions to the client based on the request).
Regarding claim 22, Di teaches the image processing apparatus according to claim 21, wherein the partial region file generation information generation unit further generates partial region information indicating a position of a subpicture constituting the partial region in the picture, and the file generation unit stores the partial region information in the control file (See [0010]-[0013], [0019]-[0025], and [0165]-[0179], which disclose the position/coordinates of the regions).
Regarding claim 23, Di teaches the image processing apparatus according to claim 22, wherein the file generation unit stores pieces of the partial region information of candidates for the partial region in mutually different adaptation sets of the control file (See [0019]-[0025], which disclose the partial regions in different adaptation sets within the MPD/XML file).
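For context, partial regions placed in separate adaptation sets are conventionally signaled in a DASH MPD using the Spatial Relationship Description (SRD) scheme of ISO/IEC 23009-1. The fragment below is a generic illustration only; it is not taken from the Di reference, and the tile layout and identifiers are hypothetical:

```xml
<!-- Illustrative only; not from the Di reference. Two subpicture
     regions of a 3840x1920 picture, each in its own AdaptationSet,
     signaled with the DASH SRD scheme (urn:mpeg:dash:srd:2014).
     value = "source_id, object_x, object_y, object_width,
              object_height, total_width, total_height" -->
<AdaptationSet id="1">
  <SupplementalProperty schemeIdUri="urn:mpeg:dash:srd:2014"
                        value="0,0,0,1920,1920,3840,1920"/>
  <Representation id="tile-left" bandwidth="5000000"/>
</AdaptationSet>
<AdaptationSet id="2">
  <SupplementalProperty schemeIdUri="urn:mpeg:dash:srd:2014"
                        value="0,1920,0,1920,1920,3840,1920"/>
  <Representation id="tile-right" bandwidth="5000000"/>
</AdaptationSet>
```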
Regarding claim 24, Di teaches the image processing apparatus according to claim 22, wherein the partial region information further includes information indicating a position of the subpicture in the partial region (See [0010]-[0013], [0019]-[0025], and [0165]-[0179], which disclose the position/coordinates of the regions).
Regarding claim 25, Di teaches the image processing apparatus according to
claim 22, wherein the partial region information further includes information indicating a size of the subpicture (See [0068], [0179], and [0200]-[0201] size information).
Regarding claim 26, Di teaches the image processing apparatus according to claim 22, wherein the file generation unit stores pieces of the partial region information of candidates for the partial region in one adaptation set of the control file (See [0019]-[0025], which disclose the partial regions in adaptation sets within the MPD/XML file).
Regarding claim 27, Di teaches the image processing apparatus according to claim 22, wherein the partial region information further includes link information with respect to information indicating a position of the subpicture in the partial region (See [0020]-[0025], URI/URL for information of at least the spatial position).
Regarding claim 29, Di teaches the image processing apparatus according to claim 22, wherein the partial region information further includes link information with respect to the partial region information (See [0020]-[0025], URI/URL for information of the regions).
Regarding claim 31, Di teaches an image processing method comprising:
generating partial region file generation information for generating a file for storing a bitstream of a partial region of a picture on a basis of data of content; and generating a control file for controlling distribution of a content file to store the partial region file generation information (See analysis of claims 20-22; [0010]-[0013], [0019]-[0025], and [0165]-[0179]).
Claim Rejections - 35 USC § 103
6. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
7. Claims 1-19 and 32-34 are rejected under 35 U.S.C. 103 as being unpatentable over Di et al., US 2019/0238933 in view of Ramaswamy et al., WO 2018/152437.
Regarding claim 1, Di teaches an image processing system comprising a server and a client apparatus (See Fig. 4, client and server), wherein
the client apparatus
requests the server for a bitstream of a partial region of a picture (See [0165]-[0178], the client sends a request to the server for the bitstream of the region of interest) on a basis of partial region file generation information included in a control file for controlling distribution of a content file to generate a file for storing the bitstream of the partial region (See [0010]-[0013] and [0019]-[0025], which disclose the MPD/XML DASH file that stores the spatial regions and region-of-interest information used to send the desired regions to the client based on the request),
the server
generates the bitstream of the partial region constituted by some of subpictures among the subpictures included in the picture on a basis of the request by the client apparatus (See [0165]-[0179], the server generates and multiplexes the bitstream of the selected regions of interest and transmits it to the client based on the client request), and
transmits the bitstream of the partial region to the client apparatus, and the client apparatus receives the bitstream of the partial region transmitted from the server (See [0165]-[0179], the server generates and multiplexes the bitstream of the selected regions of interest and transmits it to the client based on the client request, wherein the client receives the bitstream).
Di is silent with respect to wherein the client segments a segmented region from the bitstream of the partial region.
However, in the same field of endeavor, Ramaswamy teaches wherein the client segments a segmented region from the bitstream of the partial region (See [0009]-[0011], [0015], [0055], and [0066]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Di to have incorporated the teachings of Ramaswamy for the mere benefit of being able to display the segments in a desired manner.
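The server-side operation mapped above, selecting the subpictures that make up a client-requested partial region, can be sketched in code. The following is an illustrative sketch only, not the method of the Di or Ramaswamy references; the tile layout, identifiers, and function names are hypothetical:

```python
# Illustrative sketch only -- not the method of the Di or Ramaswamy
# references. Models a server that, given a client-requested region,
# selects the subpictures (tiles) of a picture that intersect it.

from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: int
    y: int
    w: int
    h: int

    def intersects(self, other: "Rect") -> bool:
        # Two axis-aligned rectangles overlap iff they overlap on both axes.
        return (self.x < other.x + other.w and other.x < self.x + self.w and
                self.y < other.y + other.h and other.y < self.y + self.h)

def select_subpictures(subpictures: dict, request: Rect) -> list:
    """Return the ids of subpictures that overlap the requested region."""
    return [sid for sid, r in subpictures.items() if r.intersects(request)]

# A 2x2 tiling of a 3840x1920 picture (hypothetical layout).
tiles = {
    "tl": Rect(0, 0, 1920, 960),   "tr": Rect(1920, 0, 1920, 960),
    "bl": Rect(0, 960, 1920, 960), "br": Rect(1920, 960, 1920, 960),
}

# A client request straddling the two left tiles selects both of them.
print(select_subpictures(tiles, Rect(500, 500, 800, 800)))  # -> ['tl', 'bl']
```

The selected subpicture bitstreams would then be extracted or merged into the partial-region bitstream and transmitted to the requesting client.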
Regarding claim 2, the combination teaches an image processing apparatus comprising:
a generation unit that generates a bitstream of a partial region constituted by some of subpictures among the subpictures included in a picture on a basis of a request from another apparatus; and a transmission unit that transmits a bitstream of the partial region to the another apparatus (See analysis of claim 1; Di, Fig. 4 and [0161], which disclose multiple client devices; Ramaswamy, [0053], multiple clients).
Regarding claim 3, the combination teaches the image processing apparatus according to claim 2, wherein the generation unit extracts the subpictures constituting the partial region from a bitstream of the picture on a basis of the request, and generates a bitstream of the partial region (See Di, [0019]-[0027], [0168]-[0187], and [0232]; Ramaswamy, [0005]-[0015], [0045]-[0054], [0061], and [0069]).
Regarding claim 4, the combination teaches the image processing apparatus according to claim 3, wherein the generation unit identifies a subpicture including a segmented region designated using a coordinate in the request (See analysis of claim 1; Di, [0168]-[0173] coordinates in client request), selects metadata from which
the identified subpicture is extractable, extracts the identified subpicture from the bitstream of the picture using the selected metadata, and generates a bitstream of
the partial region constituted by the extracted subpicture (See Di, [0019]-[0027], [0168]-[0187], and [0232]; Ramaswamy, [0005]-[0015], [0045]-[0054], [0061], and [0069]).
Regarding claim 5, the combination teaches the image processing apparatus according to claim 4, wherein the generation unit generates a bitstream of the partial region in a case where a bitstream of the subpicture is requested (See Di, [0019]-[0027], [0168]-[0187], and [0232]; Ramaswamy, [0005]-[0015], [0045]-[0054], [0061], and [0069]), and decodes the bitstream of the picture (See Di, [0186], decoding the bitstream by the client) and segments the segmented region from the picture and encodes the segmented region to generate a bitstream of the segmented region in a case where the bitstream of the segmented region is requested (See Ramaswamy, analysis of claim 1 and [0009], encoding of the segmented regions being requested such that the content is displayed according to the user's desires at the client device).
Regarding claim 6, the combination teaches the image processing apparatus according to claim 3, wherein the generation unit identifies a subpicture including a segmented region designated using information regarding a display region in the request, selects metadata from which the identified subpicture is extractable, extracts the identified subpicture from the bitstream of the picture using the selected metadata, and generates a bitstream of the partial region constituted by the extracted subpicture (See Di, [0012]-[0027], [0165]-[0187], and [0232], coordinates of the display region are requested, and at least metadata from the MPD file and the resolution/bandwidth of the pictures are used to generate the bitstream of the desired regions; Ramaswamy, [0005]-[0015], [0045]-[0054], [0061], and [0069]).
Regarding claim 7, the combination teaches the image processing apparatus according to claim 3, wherein the generation unit selects metadata from which a subpicture constituting the partial region designated using identification information in the request is extractable, extracts the subpicture from the bitstream of the picture using the selected metadata, and generates a bitstream of the partial region constituted by the extracted subpicture (See Di, [0012]-[0027], [0165]-[0187], and [0232], coordinates of the display region are requested, and at least metadata from the MPD file and the resolution/bandwidth of the pictures are used to generate the bitstream of the desired regions).
Regarding claim 8, the combination teaches the image processing apparatus according to claim 3, wherein the generation unit selects metadata from which a subpicture designated using identification information in the request is extractable, extracts the subpicture from the bitstream of the picture using the selected metadata, and generates a bitstream of the partial region constituted by the extracted subpicture (See Di, [0012]-[0027], [0165]-[0187], and [0232], coordinates of the display region are requested, and at least metadata from the MPD file and the resolution/bandwidth of the pictures are used to generate the bitstream of the desired regions).
Regarding claim 9, the combination teaches the image processing apparatus according to claim 3, wherein the generation unit selects metadata from which a subpicture constituting a partial region designated using identification information defined in a control file for controlling distribution of a content file is extractable, extracts the subpicture from the bitstream of the picture using the selected metadata, and generates a bitstream of the partial region (See Di, [0012]-[0027], [0165]-[0187], and [0232], coordinates of the display region are requested, and at least metadata from the MPD file and the resolution/bandwidth of the pictures are used to generate the bitstream of the desired regions).
Regarding claim 10, the combination teaches the image processing apparatus according to claim 3, wherein the generation unit selects metadata from which a subpicture constituting a partial region, designated in the request among partial regions (See Di, [0012]-[0027], [0165]-[0187], and [0232]) defined as an adaptation set in a control file for controlling distribution of a content file, is extractable, extracts the subpicture from the bitstream of the picture using the selected metadata (See Di, [0013]-[0023], adaptation sets within the MPD file; Ramaswamy, [0054]-[0058]), and generates a bitstream of the designated partial region (See Di, [0012]-[0027], [0165]-[0187], and [0232]).
Regarding claim 11, the combination teaches the image processing apparatus according to claim 3, wherein the generation unit uses metadata for extracting a subpicture designated in the request to extract the subpicture from the bitstream of the picture, and generates a bitstream of the partial region constituted by the extracted subpicture (See Di, [0012]-[0027], [0165]-[0187], and [0232]).
Regarding claim 12, the combination teaches the image processing apparatus according to claim 2, wherein the generation unit merges bitstreams of the subpictures constituting the partial region on a basis of the request and generates a bitstream of the partial region (See Di, [0012]-[0027], [0165]-[0187], and [0232]).
Regarding claim 13, the combination teaches the image processing apparatus according to claim 12, wherein the generation unit identifies subpictures including a segmented region designated using a coordinate in the request (See Di, [0168]-[0173], coordinate in request), selects metadata capable of merging the identified subpictures, merges the identified subpictures using the selected metadata, and generates a bitstream of the partial region constituted by the merged subpictures (See Di, [0012]-[0027], [0165]-[0187], and [0232], merging/multiplexing/combining the multiple regions of interest).
Regarding claim 14, the claim has been analyzed and rejected for the same reasons set forth in the rejection of claim 5.
Regarding claim 15, the combination teaches the image processing apparatus according to claim 12, wherein the generation unit identifies subpictures including a segmented region designated using information regarding a display region in the request (See Di, [0012]-[0027], [0165]-[0187], and [0232]; Ramaswamy, [0005]-[0015], [0045]-[0054], [0061], and [0069]), selects metadata capable of merging the identified subpictures, merges bitstreams of the identified subpictures using the selected metadata, and generates a bitstream of the partial region constituted by the merged subpictures (See Di, [0012]-[0027], [0165]-[0187], and [0232]).
Regarding claim 16, the combination teaches the image processing apparatus according to claim 12, wherein the generation unit selects metadata capable of merging subpictures constituting the partial region designated using identification information in the request, merges the subpictures using the selected metadata, and generates a bitstream of the partial region constituted by the merged subpictures (See Di, [0012]-[0027], [0165]-[0187], and [0232], coordinates of the display region are requested, and at least metadata from the MPD file and the resolution/bandwidth of the pictures are used to generate the bitstream of the desired regions; Ramaswamy, [0005]-[0015], [0045]-[0054], [0061], and [0069]).
Regarding claim 17, the combination teaches the image processing apparatus according to claim 12, wherein the generation unit selects metadata capable of merging subpictures constituting a partial region designated using identification information defined in a control file for controlling distribution of a content file, and merges the subpictures using the selected metadata, and generates a bitstream of the partial region (See analysis of claim 1; Di, [0012]-[0027], [0165]-[0187], and [0232]; Ramaswamy, [0005]-[0015], [0045]-[0054], [0061], and [0069]).
Regarding claim 18, the combination teaches the image processing apparatus according to claim 12, wherein the generation unit selects metadata capable of merging subpictures constituting a partial region, designated in the request among partial regions defined as an adaptation set in a control file for controlling distribution of a content file, merges the subpictures using the selected metadata, and generates a bitstream of the designated partial region (See analysis of claim 10; See Di, [0012]-[0027], [0165]-[0187], and [0232]; Ramaswamy, [0005]-[0015], [0045]-[0054], [0061], and [0069]).
Regarding claim 19, the combination teaches the image processing apparatus according to claim 12, wherein the generation unit merges subpictures using metadata for merging the subpictures designated in the request, and generates a bitstream of the partial region constituted by the merged subpictures (See Di, [0012]-[0027], [0165]-[0187], and [0232]; Ramaswamy, [0005]-[0015], [0045]-[0054], [0061], and [0069]).
Regarding claim 32, the claim has been analyzed and rejected for the same reasons set forth in the rejection of claim 1.
Regarding claim 33, the combination teaches the image processing apparatus according to claim 32, wherein the segmentation unit segments the segmented region on a basis of partial region information that is included in the control file and indicates a position in the picture of a subpicture constituting the partial region (See Di, [0020]-[0024]; Ramaswamy, [0009]-[0014] and [0066]-[0080]).
Regarding claim 34, the claim has been analyzed and rejected for the same reasons set forth in the rejection of claim 1.
8. Claims 28 and 30 are rejected under 35 U.S.C. 103 as being unpatentable over Di et al., US 2019/0238933 in view of Westerlund et al., WO 2018/087311.
Regarding claim 28, Di teaches the image processing apparatus according to claim 27, wherein the file generation unit further generates the content file, and stores the information indicating the position of the subpicture in the partial region of the content file (See analysis of claims 20-22).
Di is silent with respect to the storing being in a sample entry of a movie box.
However, in the same field of endeavor, Westerlund teaches storing in a sample entry of a movie box (See [00131]-[00136]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Di to have incorporated the teachings of Westerlund for the mere benefit of compatibility with different types of formats and media files.
Regarding claim 30, the combination teaches the image processing apparatus according to claim 29, wherein the file generation unit further generates the content file, and stores the partial region information as a sample group in a movie fragment box of the content file (See analysis of claim 28; Westerlund, [00131]-[00136]).
Contact
9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Ricky Chin whose telephone number is 571-270-3753. The examiner can normally be reached on M-F 8:30-6:00.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Benjamin Bruckart can be reached on 571-272-3982. The fax phone number for the organization where this application or proceeding is assigned is 703-872-9306.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
/Ricky Chin/
Primary Examiner
AU 2424
(571) 270-3753
Ricky.Chin@uspto.gov