Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Claims 1–20 have been submitted for examination.
Claims 1–4, 7–9, 15–16, and 20 have been examined and rejected.
Claims 5–6, 10–14, and 17–19 are objected to.
Allowable Subject Matter
Claims 5–6, 10–14, and 17–19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1–4, 7–9, 15–16, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Drouin et al. (US 2015/0358662) in view of Heo et al. (US 2008/0177865).
Regarding claims 1, 16, and 20, Drouin discloses:
A signal processing device comprising:
generate list information (“manifest”) including information on a plurality of first units of data (“all available parts”) based on the received streaming data, (Drouin, ¶ [0034], “The manifest contains a list of the various streams that are available in the given asset.”) and output (“indicate”) the generated list information; (Drouin, ¶ [0006], “To indicate to the end user what content is available, a manifest of the asset may be provided. The manifest lists all of the available parts of the asset that the user can select to stream.”) and
a decoder configured to receive the list information (“Using the manifest”) and decode the plurality of first units of data based on the list information, wherein the streaming data processor is configured to output data decoded by the decoder. (Drouin, ¶ [0035], “Using the manifest, a client device 106 can request portions of an asset to stream. However, some client devices may attempt to stream portions of the asset that they are unable to stream in a satisfactory way, either because the device itself does not have the appropriate codecs or processing power; or because the network bandwidth is too low to allow delivery of the asset portion. The client device will then have to iteratively try lower and lower quality or size portions of the asset; or differently encoded portions to identify a portion that the client device 106 is able to playback.”)
Drouin does not explicitly teach "a streaming data processor configured to receive streaming data."
In a similar field of endeavor, Heo teaches:
a streaming data processor configured to receive streaming data, (Heo, ¶ [0021], “The media processing unit 12 includes the header parser 14 and the decoder 16. Among a media data source input to the media processing unit 12, as described above, only a header portion thereof is parsed through the header parser 14. Thereafter, the parsed header information is stored in the queue buffer 22 and memory unit 20. The header information stored in the queue buffer 22 is provided when a request for media data sharing is received from the client terminal 110. Also, the header information stored in the memory unit 20 is used to reproduce media data in the server terminal 100. In this case, the header information includes decoder generation information, such as a codec type of a file, a frame size, a frame rate, a number of bits per pixel, and a quantization precision.”)
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the system for generating manifest data and decoding a stream as taught by Drouin with the system for receiving streaming data as taught by Heo. The motivation for doing so would have been "for providing a real-time streaming service from the server terminal to the client terminal," as taught by Heo (¶ [0032]).
Regarding claim 2, the combination of Drouin and Heo teaches:
The signal processing device of claim 1, wherein the decoder is configured to decode the plurality of first units of data related to the streaming data (Drouin, ¶ [0034], “The manifest contains a list of the various streams that are available in the given asset.”) based on the information on the first units of data in the list information. (Drouin, ¶ [0035], “Using the manifest, a client device 106 can request portions of an asset to stream. However, some client devices may attempt to stream portions of the asset that they are unable to stream in a satisfactory way, either because the device itself does not have the appropriate codecs or processing power; or because the network bandwidth is too low to allow delivery of the asset portion. The client device will then have to iteratively try lower and lower quality or size portions of the asset; or differently encoded portions to identify a portion that the client device 106 is able to playback.”)
Regarding claim 3, the combination of Drouin and Heo teaches:
The signal processing device of claim 1, further comprising: a memory configured to store the streaming data, wherein the decoder is configured to split the streaming data from the memory into the first units of data based on the number information (Drouin, ¶ [0034], “The manifest contains a list of the various streams that are available in the given asset.”) and address information of the first units of data in the list information (Drouin, ¶ [0035], “Using the manifest, a client device 106 can request portions of an asset to stream. However, some client devices may attempt to stream portions of the asset that they are unable to stream in a satisfactory way, either because the device itself does not have the appropriate codecs or processing power; or because the network bandwidth is too low to allow delivery of the asset portion. The client device will then have to iteratively try lower and lower quality or size portions of the asset; or differently encoded portions to identify a portion that the client device 106 is able to playback.”) and decode the plurality of first units of data based on the split first units of data. (Heo, ¶ [0021], “The media processing unit 12 includes the header parser 14 and the decoder 16. Among a media data source input to the media processing unit 12, as described above, only a header portion thereof is parsed through the header parser 14. Thereafter, the parsed header information is stored in the queue buffer 22 and memory unit 20. The header information stored in the queue buffer 22 is provided when a request for media data sharing is received from the client terminal 110. Also, the header information stored in the memory unit 20 is used to reproduce media data in the server terminal 100. In this case, the header information includes decoder generation information, such as a codec type of a file, a frame size, a frame rate, a number of bits per pixel, and a quantization precision.”)
Regarding claim 4, the combination of Drouin and Heo teaches:
The signal processing device of claim 1, further comprising: a memory configured to store the first units of data related to the streaming data, wherein the decoder is configured to access the first units of data corresponding to the memory based on the number information (Drouin, ¶ [0034], “The manifest contains a list of the various streams that are available in the given asset.”) and address information of the first units of data in the list information (Drouin, ¶ [0035], “Using the manifest, a client device 106 can request portions of an asset to stream. However, some client devices may attempt to stream portions of the asset that they are unable to stream in a satisfactory way, either because the device itself does not have the appropriate codecs or processing power; or because the network bandwidth is too low to allow delivery of the asset portion. The client device will then have to iteratively try lower and lower quality or size portions of the asset; or differently encoded portions to identify a portion that the client device 106 is able to playback.”) and decode the plurality of first units of data based on the accessed first units of data. (Heo, ¶ [0021], “The media processing unit 12 includes the header parser 14 and the decoder 16. Among a media data source input to the media processing unit 12, as described above, only a header portion thereof is parsed through the header parser 14. Thereafter, the parsed header information is stored in the queue buffer 22 and memory unit 20. The header information stored in the queue buffer 22 is provided when a request for media data sharing is received from the client terminal 110. Also, the header information stored in the memory unit 20 is used to reproduce media data in the server terminal 100. In this case, the header information includes decoder generation information, such as a codec type of a file, a frame size, a frame rate, a number of bits per pixel, and a quantization precision.”)
Regarding claim 7, the combination of Drouin and Heo teaches:
The signal processing device of claim 1, wherein the streaming data processor is configured to extract the plurality of first units of data by parsing a second unit of data greater than the first unit, (Drouin, ¶ [0035], “Using the manifest, a client device 106 can request portions of an asset to stream. However, some client devices may attempt to stream portions of the asset that they are unable to stream in a satisfactory way, either because the device itself does not have the appropriate codecs or processing power; or because the network bandwidth is too low to allow delivery of the asset portion. The client device will then have to iteratively try lower and lower quality or size portions of the asset; or differently encoded portions to identify a portion that the client device 106 is able to playback.”) and generate the list information including the information on the plurality of first units of data. (Drouin, ¶ [0034], “The manifest contains a list of the various streams that are available in the given asset.”)
Regarding claim 8, the combination of Drouin and Heo teaches:
The signal processing device of claim 1, wherein the streaming data processor is configured to convert the first units of data into parameter information (Drouin, ¶ [0035], “Using the manifest, a client device 106 can request portions of an asset to stream. However, some client devices may attempt to stream portions of the asset that they are unable to stream in a satisfactory way, either because the device itself does not have the appropriate codecs or processing power; or because the network bandwidth is too low to allow delivery of the asset portion. The client device will then have to iteratively try lower and lower quality or size portions of the asset; or differently encoded portions to identify a portion that the client device 106 is able to playback.”) and transmit update information of the list information and the parameter information to the decoder. (Heo, ¶ [0021], “The media processing unit 12 includes the header parser 14 and the decoder 16. Among a media data source input to the media processing unit 12, as described above, only a header portion thereof is parsed through the header parser 14. Thereafter, the parsed header information is stored in the queue buffer 22 and memory unit 20. The header information stored in the queue buffer 22 is provided when a request for media data sharing is received from the client terminal 110. Also, the header information stored in the memory unit 20 is used to reproduce media data in the server terminal 100. In this case, the header information includes decoder generation information, such as a codec type of a file, a frame size, a frame rate, a number of bits per pixel, and a quantization precision.”)
Regarding claim 9, the combination of Drouin and Heo teaches:
The signal processing device of claim 1, wherein the streaming data processor is configured to update at least a portion of the list information with parameter information (Drouin, ¶ [0035], “Using the manifest, a client device 106 can request portions of an asset to stream. However, some client devices may attempt to stream portions of the asset that they are unable to stream in a satisfactory way, either because the device itself does not have the appropriate codecs or processing power; or because the network bandwidth is too low to allow delivery of the asset portion. The client device will then have to iteratively try lower and lower quality or size portions of the asset; or differently encoded portions to identify a portion that the client device 106 is able to playback.”) and transmit the updated parameter information, as the list information, to the decoder. (Heo, ¶ [0021], “The media processing unit 12 includes the header parser 14 and the decoder 16. Among a media data source input to the media processing unit 12, as described above, only a header portion thereof is parsed through the header parser 14. Thereafter, the parsed header information is stored in the queue buffer 22 and memory unit 20. The header information stored in the queue buffer 22 is provided when a request for media data sharing is received from the client terminal 110. Also, the header information stored in the memory unit 20 is used to reproduce media data in the server terminal 100. In this case, the header information includes decoder generation information, such as a codec type of a file, a frame size, a frame rate, a number of bits per pixel, and a quantization precision.”)
Regarding claim 15, the combination of Drouin and Heo teaches:
The signal processing device of claim 1, wherein the number of communications between the streaming data processor (Drouin, ¶ [0035], “Using the manifest, a client device 106 can request portions of an asset to stream. However, some client devices may attempt to stream portions of the asset that they are unable to stream in a satisfactory way, either because the device itself does not have the appropriate codecs or processing power; or because the network bandwidth is too low to allow delivery of the asset portion. The client device will then have to iteratively try lower and lower quality or size portions of the asset; or differently encoded portions to identify a portion that the client device 106 is able to playback.”) and the decoder is inversely proportional to the number of first units of data in the list information. (Heo, ¶ [0021], “The media processing unit 12 includes the header parser 14 and the decoder 16. Among a media data source input to the media processing unit 12, as described above, only a header portion thereof is parsed through the header parser 14. Thereafter, the parsed header information is stored in the queue buffer 22 and memory unit 20. The header information stored in the queue buffer 22 is provided when a request for media data sharing is received from the client terminal 110. Also, the header information stored in the memory unit 20 is used to reproduce media data in the server terminal 100. In this case, the header information includes decoder generation information, such as a codec type of a file, a frame size, a frame rate, a number of bits per pixel, and a quantization precision.”)
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHAEL B PIERORAZIO whose telephone number is (571)270-3679. The examiner can normally be reached on Monday - Thursday, 8am - 5pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Nasser Goodarzi, can be reached on 571-270-4195. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MICHAEL B. PIERORAZIO/Primary Examiner, Art Unit 2426