Prosecution Insights
Last updated: April 19, 2026
Application No. 18/757,337

HARDWARE SPECIFIC GRAPHICAL USER INTERFACE CODEC

Non-Final OA: §101, §102, §103
Filed: Jun 27, 2024
Examiner: MUNG, ON S
Art Unit: 2486
Tech Center: 2400 — Computer Networks
Assignee: Amazon Technologies, Inc.
OA Round: 1 (Non-Final)
Grant Probability: 74% (Favorable)
OA Rounds: 1-2
To Grant: 2y 8m
With Interview: 83%

Examiner Intelligence

Career Allow Rate: 74% (above average; 507 granted / 683 resolved; +16.2% vs TC avg)
Interview Lift: +9.2% (moderate) for resolved cases with interview
Typical Timeline: 2y 8m average prosecution; 33 applications currently pending
Career History: 716 total applications across all art units
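The headline figures above fit together with simple arithmetic: the 74% career allow rate is 507 grants out of 683 resolved cases, and the 83% interview figure is that base rate plus the +9.2% interview lift. A minimal sketch of that derivation follows; the function names are illustrative, not from any real analytics API.

```python
# Sketch of how the dashboard's headline examiner statistics follow
# from the raw career counts shown above (hypothetical helper names).

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def with_interview(base_rate: float, interview_lift: float) -> float:
    """Grant probability assuming the additive interview lift applies."""
    return base_rate + interview_lift

base = allow_rate(507, 683)  # 507 granted of 683 resolved
print(f"Career allow rate: {base:.1f}%")                       # ~74.2%, shown as 74%
print(f"With interview:    {with_interview(base, 9.2):.1f}%")  # ~83.4%, shown as 83%
```

The dashboard evidently rounds both figures to whole percentages.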

Statute-Specific Performance

§101: 6.8% (-33.2% vs TC avg)
§103: 41.2% (+1.2% vs TC avg)
§102: 30.2% (-9.8% vs TC avg)
§112: 7.2% (-32.8% vs TC avg)
Comparisons are against the Tech Center average estimate. Based on career data from 683 resolved cases.
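The "vs TC avg" deltas can be checked against the rates themselves: subtracting each delta from its statute's rate recovers the Tech Center average estimate the chart compares against, and all four statutes imply the same ~40% baseline. A quick arithmetic check (variable names are mine, not the tool's):

```python
# Recover the implied Tech Center average from each statute's rate
# and its "vs TC avg" delta, using the figures shown above.
examiner_rate = {"§101": 6.8, "§103": 41.2, "§102": 30.2, "§112": 7.2}
vs_tc_delta   = {"§101": -33.2, "§103": 1.2, "§102": -9.8, "§112": -32.8}

implied_tc_avg = {s: round(examiner_rate[s] - vs_tc_delta[s], 1)
                  for s in examiner_rate}
print(implied_tc_avg)  # every statute implies the same 40.0% baseline
```

This suggests the tool benchmarks all four statutes against a single TC-wide figure rather than per-statute averages.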

Office Action

Grounds of rejection: §101, §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

2. Claims 1-2, 4, and 9 are objected to because of the following informalities: the limitations “RGBA”, “RGB”, and “GPU” as cited in claims 1, 4, and 9 should be written out as “Red Green Blue Alpha (RGBA)”, “Red Green Blue (RGB)”, and “Graphics Processing Unit (GPU)” in their first occurrence. It is noted that these limitations are insufficient by themselves to give public notice of what an applicant regards as the invention. The limitation “CPU” as cited in claim 2 should be written out as “Central Processing Unit (CPU)” in its first occurrence. Appropriate correction is required.

Claim Rejections - 35 USC § 101

3. 35 U.S.C. § 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

4. Claims 4-6, 8, 14, and 15 are rejected under 35 U.S.C. § 101 because the claimed inventions are directed to non-statutory subject matter, as follows. In view of the specification (see paragraphs 0095, 0114, 0122 of the U.S. published application), applicant has provided evidence that applicant intends computer readable media to include transmission type media and signal type media; as such, the claims are drawn to a form of energy. The claims can be broadly interpreted as a signal per se. Hence, they are rejected as being directed to nonstatutory subject matter. The rejections above can be overcome by amending the claims to add the limitation “non-transitory”.
The applicant is directed to guidance provided in the document titled “Subject Matter Eligibility of Computer Readable Media,” dated January 26, 2010 and available on the USPTO public website at http://www.uspto.gov/patents/law/notices/101_crm_20100127.pdf

Claim Rejections - 35 USC § 102

5. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

6. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

7. Claims 4-15 and 18-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Huang et al. (US 2019/0222623 A1) (hereinafter Huang).

Regarding claim 4, Huang discloses an encoder device (e.g., see Figs. 1-3: encoder/decoder) comprising: at least one processor (e.g., see Fig. 16, paragraphs 0209-0214: processor or CPU); at least one computer readable media containing computer executable instructions (e.g., see Fig.
16, paragraphs 0006, 0209, 0210: computer readable medium and processor; also see paragraphs 0207, 0211: program instructions); which, when executed using the at least one processor (e.g., see Fig. 16, paragraphs 0209-0214: processor or CPU), cause the encoder device to perform operations comprising: dividing an RGBA bitstream into RGB data and Alpha data (e.g., see Fig. 1, paragraphs 0032-0034: the RGBA data is separate into RGB data and Alpha data; also see Fig. 2, paragraphs 0040, 0116; Figs. 13-15, paragraphs 0187-0189: separation module); generating encoded RGB data by encoding the RGB data using a first encoder, wherein the first encoder utilizes at least one video codec (e.g., see paragraphs 0024, 0034: JPEG, PNG, GIF); generating encoded Alpha data by encoding the Alpha data using a second encoder (e.g., see Fig.1, paragraphs 0031-0036: compress the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data);, wherein the second encoder utilizes a GPU compression format (e.g., see paragraphs 0024, 0034: compression format; Fig. 16, paragraphs 0206, 0207: user interface 1502); and transmitting the encoded RGB data and the encoded Alpha data to a decoder device (e.g., see Figs. 6-7, paragraphs 0103, 0111, 0117: output stream data to client devices; Figs. 1, paragraphs 0031-0036: transmitting data from encoder side to decoder side), wherein the encoded RGB data and the encoded Alpha data comprise at least one block of a graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502). Regarding claim 5, Huang discloses the encoder device of claim 4, wherein the at least one computer readable media containing the computer executable instructions which, when executed using the at least one processor (e.g., see Fig. 
16, paragraphs 0006, 0209, 0210: computer readable medium and processor), cause the encoder device to perform operations comprising: receiving a full frame update request from the decoder device (e.g., see Fig. 1, paragraphs 0032-0034: full frame or all frame update; Fig. 2, paragraphs 0042-0048: full frame such as I-frame); generating encoded additional RGB data by encoding additional RGB data using a first encoder (e.g., see Fig. 1, paragraphs 0031-0034: video encoders for RGA and Alpha data), wherein the additional RGB data is representative of pixel color information for a full frame of the graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU); generating encoded additional Alpha data by encoding additional Alpha data using a second encoder (e.g., see Fig. 1, paragraphs 0031-0034: video encoders for RGA and Alpha data), wherein the additional Alpha data is representative of pixel opaqueness information (e.g., see paragraphs 0025, 0030, 0104: transparency represents completely opaque or translucent) for the full frame of the graphical user interface; and transmitting the encoded additional RGB data and the encoded additional Alpha data to the decoder device (e.g., see Figs. 6-7, paragraphs 0103, 0111, 0117: output stream data to client devices; Figs. 1, paragraphs 0031-0036: transmitting data from encoder side to decoder side), wherein the encoded additional RGB data and the encoded additional Alpha data are representative of the full frame of the graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU). Regarding claim 6, Huang discloses the encoder device of claim 4, wherein the at least one computer readable media containing the computer executable instructions which, when executed using the at least one processor (e.g., see Fig. 
16, paragraphs 0006, 0209, 0210: computer readable medium and processor), cause the encoder device to perform operations comprising: receiving a partial frame update request from the decoder device (e.g., see Fig. 1, paragraphs 0032-0034: partial frame or all frame update; Fig. 2, paragraphs 0042-0048: frame such as I-frame); generating encoded additional RGB data by encoding additional RGB data using a first encoder (e.g., see Fig. 1, paragraphs 0031-0034: video encoders for RGA and Alpha data), wherein the additional RGB data is representative of pixel color information for at least one block of the graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU); generating encoded additional Alpha data by encoding additional Alpha data using a second encoder (e.g., see Fig. 1, paragraphs 0031-0034: video encoders for RGA and Alpha data), wherein the additional Alpha data is representative of pixel opaqueness information for the at least one block of the graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU); and transmitting the encoded additional RGB data and the encoded additional Alpha data to the decoder device(e.g., see Figs. 6-7, paragraphs 0103, 0111, 0117: output stream data to client devices; Figs. 1, paragraphs 0031-0036: transmitting data from encoder side to decoder side), wherein the encoded additional RGB data and the encoded additional Alpha data are representative of the at least one block of the graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU). Regarding claim 7, Huang discloses the encoder device of claim 4, wherein encoding the Alpha data with the second encoder (e.g., see Fig. 
1, paragraphs 0031-0034: video encoders for RGA and Alpha data) comprises: performing a solid color optimization process (e.g., see paragraphs 0028, 0029: analyzing/adjusting colors; Figs. 2-3, paragraphs 0041, 0042, 0047: color space conversion) comprising: skipping encoding of the Alpha data for individual pixels (e.g., see paragraphs 0031: skipping encoding/decoding process; Fig. 4, paragraphs 0059, 0062, 0110); generating a full screen Alpha value frame (e.g., see paragraphs 0025, 0083: full screen or fully transparent); and generating at least one additional Alpha value update frame (e.g., see Fig. 1, paragraphs 0032-0034: generating RGA and Alpha data; also see Fig. 2, paragraphs 0038-0044: processes of encoding RGB and Alpha data). Regarding claim 8, The encoder device of claim 4, wherein the at least one computer readable media containing the computer executable instructions which, when executed using the at least one processor (e.g., see Fig. 16, paragraphs 0006, 0209, 0210: computer readable medium and processor), cause the encoder device to perform operations comprising: identifying a decoder device for receipt of the RGBA bitstream (e.g., see Fig.1, paragraphs 0031-0036: decoder for decompressing the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data); determining a GPU specification associated with the decoder device (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; Figs. 1-3: decoder/decoding); and selecting the second encoder based on the GPU specification associated with the decoder device (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; Figs. 1-3: decoder/decoding), wherein the second encoder comprises at least one of a software encoder or a hardware encoder compatible with the GPU specification (e.g., see Fig. 16, paragraphs 0208-2011: software/instructions; Figs. 1-3: decoder/decoding). 
Regarding claim 9, this claim is a method claim of an encoder device version as applied to claim 4 above, wherein the device performs the same limitations cited in claim 4, the rejections of which are incorporated herein. Regarding claim 10, it contains the limitations of claims 5 and 9, and is analyzed as previously discussed with respect to those claims. Regarding claim 11, it contains the limitations of claims 6 and 9, and is analyzed as previously discussed with respect to those claims. Regarding claim 12, it contains the limitations of claims 7 and 9, and is analyzed as previously discussed with respect to those claims. Regarding claim 13, it contains the limitations of claims 8 and 9, and is analyzed as previously discussed with respect to those claims. Regarding claim 14, Huang discloses an electronic device (e.g., see Figs. 12-16) comprising: one or more central processing units (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502); a graphics processing unit (e.g., see paragraphs 0209-0214: processor or CPU); a display (e.g., see paragraph 0025: a picture is displayed); one or more computer readable media storing processor executable instructions which, when executed using the one or more central processing units (e.g., see Fig. 16, paragraphs 0006, 0209, 0210: computer readable medium and processor), cause the electronic device to perform operations comprising receiving first encoded data representing a first frame of video (e.g., see Fig.1, paragraphs 0031-0036: decoder for decompressing the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data), receiving second encoded data representing pixel color data for a graphical user interface associated with the first frame of video (e.g., see Fig. 
16, paragraphs 0206, 0207: user interface 1502), receiving third encoded data representing pixel transparency data (e.g., see abstract, paragraphs 0004, 0005, 0038: transparency data; paragraphs 0025, 0030, 0104: transparency) for the graphical user interface associated with the first frame of video (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502), decoding the first encoded data into the first frame of video using a video decoding scheme (e.g., see Fig.1, paragraphs 0031-0036: decoder for decompressing the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data), decoding the second encoded data into the pixel color data (e.g., see paragraphs 0025, 0028, 0034: pixel color and alpha) for the graphical user interface using a video decoding scheme (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502), determining, based on the third encoded data (e.g., see Fig. 1-3), one or more compressed blocks representing the pixel transparency data (e.g., see paragraphs 0025, 0028, 0034: pixel color and alpha) for the graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502), sending the one or more compressed blocks to the graphics processing unit for rendering (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502), and causing the display to display the first frame of video (e.g., see paragraph 0025: a picture is displayed) and the graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502). Regarding claim 15, Huang discloses the encoder device of claim 14, wherein the one or more computer readable media store processor executable instructions which, when executed using the one or more central processing units (e.g., see paragraphs 0209-0214: processor or CPU), cause the electronic device (e.g., see Figs. 
12-16) to perform operations comprising causing the graphics processing unit to render the graphical user interface without decompressing the one or more compressed blocks (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502). Regarding claim 18, this claim is a method claim of a device version as applied to claim 14 above, wherein the system performs the same limitations cited in claim 14, the rejections of which are incorporated herein. Regarding claim 19, Huang discloses the encoder device of claim 18, wherein the method comprises determining the decoding device, and, based on the determining of the decoding device, determining the second encoding scheme (e.g., see Figs. 1-3: encoding/decoding device). Regarding claim 20, Huang discloses the encoder device of claim 18, wherein the encoded pixel color data and the encoded pixel transparency data is sent to the decoding device (e.g., see Figs. 1-3: encoding/decoding device; paragraphs 0025, 0028, 0034: pixel color and alpha). Claim Rejections - 35 USC § 103 8. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. 9. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. 
Patentability shall not be negated by the manner in which the invention was made. 10. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. 11. Claim 1-3, 16-17 are rejected under 35 U.S.C. 103 as being unpatentable over Huang et al. (US 2019/0222623A1) in view of Holland et al. (US 2015/0042659A1) (hereinafter Holland). Regarding claim 1, Huang discloses a video streaming system (e.g., see abstract; Figs. 1-2) comprising: a streaming server comprising a plurality of encoders (e.g., see Fig. 1, paragraphs 0033, 0034: video encoders for RGA and Alpha data; paragraphs 0103, 0117, 0136: server), wherein the streaming server is configured to: identify a client device for receipt of an RGBA bitstream (e.g., see Fig. 1, paragraphs 0032-0034: RGBA data), wherein the RGBA bitstream is representative of a graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502); divide the RGBA bitstream into an RGB channel buffer and an Alpha channel buffer (e.g., see Fig. 1, paragraphs 0032-0034: the RGBA data is separate into RGB data and Alpha data; also see Fig. 2, paragraphs 0040, 0116; Figs. 13-15, paragraphs 0187-0189: separation module); generate encoded RGB data by encoding RGB data from the RGB channel buffer (e.g., see abstract, paragraphs 0004, 0029: encoding color data of the picture; Fig. 1, paragraphs 0031-0034: video encoder for encoding RGB and Alpha data; also see Fig. 2, paragraphs 0038-0044: processes of encoding RGB and Alpha data) using a first encoder of the plurality of encoders (e.g., see Fig. 
1, paragraphs 0033, 0034: video encoders for RGA and Alpha data); compress the encoded RGB data (e.g., see Fig.1, paragraphs 0031-0036: compress the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data); determine a second encoder from the plurality of encoders (e.g., see Fig. 1, paragraphs 0033, 0034: video encoders for RGA and Alpha data) based on compatibility with a GPU of the client device (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502); generate encoded Alpha data by encoding Alpha data from the Alpha channel buffer with the second encoder (e.g., see Fig. 1, paragraphs 0032-0034: video encoders for RGA and Alpha data; also see Fig. 2, paragraphs 0038-0044: processes of encoding RGB and Alpha data), wherein the second encoder utilizes a GPU tile based framebuffer compression format that is specific to the GPU of the client device (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502); and transmit the encoded RGB data and the encoded Alpha data to the client device (e.g., see Figs. 6-7, paragraphs 0103, 0111, 0117: output stream data to client devices; Figs. 1, paragraphs 0031-0036: transmitting data from encoder side to decoder side), wherein the encoded RGB data and the encoded Alpha data comprise at least one block of at least one frame of the graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502). Huang does not explicitly disclose a graphical user interface comprising 1080p60 resolution; wherein the first encoder utilizes H.264 video compression format. However, Holland discloses a graphical user interface comprising 1080p60 resolution (e.g., see paragraphs 0006, 0021, 0024: 1080p resolution; also see paragraphs 0023-0025: various resolutions); wherein the first encoder utilizes H.264 video compression format (e.g., see paragraphs 0051: H.264/AVC). 
It would have been obvious to one of ordinary skill in the art before the effective filling date of the claimed invention to modify the system disclosed by Huang to add the teachings of Holland as above, in order to provide improved methods for modifying video encoding formats (see paragraph 0002: Holland). Regarding claim 2, Huang and Holland disclose all the limitations of claim 1, and are analyzed as previously discussed with respect to that claim. Furthermore, Huang discloses comprising: the client device comprising a decoder (e.g., see Figs. 1, 3), a CPU, and the GPU (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU), wherein the client device is configured to: receive the encoded RGB data and the encoded Alpha data from the streaming server (e.g., see Fig.1, paragraphs 0031-0036: receiving the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data); generate the RGB data by decoding the encoded RGB data using the decoder and the CPU of the client device (e.g., see Fig.1, paragraphs 0031-0036: compress/decompress the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data); decompress the RGB data using the CPU (e.g., see Fig.1, paragraphs 0031-0036: compress/decompress the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data); generate the Alpha data by decoding the encoded Alpha data using the decoder (e.g., see Fig.1, paragraphs 0031-0036: compress/decompress the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data) and the GPU of the client device, wherein the Alpha data retains the GPU tile based framebuffer compression format (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU); generate output data by merging the RGB data and the Alpha data (e.g., see Fig. 
2, step S206: combining/merging RGB data and Alpha data; Fig. 3, step S304: Combine the Alpha data in S303 and the RGB data in S302 to generate RGBA data); render, at least in part, the graphical user interface from the RGB data of the output data (e.g., see paragraphs 0004-0006: outputting RGB and Alpha data; also see paragraphs 0032, 0033, 0036), using the GPU of the client device (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU); and natively render, at least in part, the graphical user interface using the Alpha data of the output data (e.g., see paragraphs 0004-0006: outputting RGB and Alpha data; also see paragraphs 0032, 0033, 0036), wherein the Alpha data comprises the GPU tile based framebuffer compression format, using the GPU of the client device (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU). Regarding claim 3, Huang and Holland disclose all the limitations of claim 2, and are analyzed as previously discussed with respect to that claim. Furthermore, Huang discloses wherein the streaming server (e.g., see paragraphs 0103, 0117, 0136: server) is configured to: transmit an at least partial frame update to the client device (e.g., see Figs. 6-7, paragraphs 0103, 0111, 0117: output stream data to client devices; Figs. 1, paragraphs 0031-0036: transmitting data from encoder side to decoder side), wherein the at least partial frame update comprises at least one block that has changed in the at least one frame of the graphical user interface (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU) of the encoded RGB data and the encoded Alpha data, wherein the at least one block comprises additional encoded RGB data and additional encoded Alpha data (e.g., see Fig.1, paragraphs 0031-0036: compress/decompress the encoded Alpha and RGB data; Fig. 
3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data); and wherein the client device (e.g., see Figs. 6-7, paragraphs 0103, 0111, 0117: output stream data to client devices) is configured to: receive the at least partial frame update from the streaming server (e.g., see Fig. 1, paragraphs 0033, 0034: video encoders for RGA and Alpha data; paragraphs 0103, 0117, 0136: server); generate additional RGB data by decoding the additional encoded RGB data (e.g., see Fig.1, paragraphs 0031-0036: compress/decompress the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data) using the decoder and the CPU of the client device (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU); generate additional Alpha data by decoding the additional encoded Alpha data using the decoder (e.g., see Fig.1, paragraphs 0031-0036: compress/decompress the encoded Alpha and RGB data; Fig. 3, paragraphs 0045-0048: decoding the encoded RGB and Alpha data) and the GPU of the client device, wherein the additional Alpha data retains the GPU tile based framebuffer compression format (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU); generate additional output data by merging the additional RGB data and the additional Alpha data (e.g., see paragraphs 0004-0006: outputting RGB and Alpha data; also see paragraphs 0032, 0033, 0036); re-render, at least in part, the graphical user interface from the RGB data of the output data (e.g., see paragraphs 0004-0006: outputting RGB and Alpha data; also see paragraphs 0032, 0033, 0036), using the GPU of the client device (e.g., see Fig. 
16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU); and natively re-render, at least in part, the graphical user interface from the additional Alpha data of the output data (e.g., see paragraphs 0004-0006: outputting RGB and Alpha data; also see paragraphs 0032, 0033, 0036; Figs. 1-2), wherein the Alpha data comprises the GPU tile based framebuffer compression format, using the GPU of the client device (e.g., see Fig. 16, paragraphs 0206, 0207: user interface 1502; paragraphs 0209-0214: processor or CPU).

Regarding claim 16, Huang does not explicitly disclose the electronic device of claim 14, wherein the decoding of the first encoded data into the first frame of video uses H.264, and wherein the decoding of the second encoded data into the pixel color data uses H.264. However, Holland discloses wherein the decoding of the first encoded data into the first frame of video uses H.264, and wherein the decoding of the second encoded data into the pixel color data uses H.264 (e.g., see paragraph 0051: H.264/AVC). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system disclosed by Huang to add the teachings of Holland as above, in order to provide improved methods for modifying video encoding formats (see paragraph 0002: Holland).

Regarding claim 17, Huang does not explicitly disclose the electronic device of claim 14, wherein the decoding of the first encoded data into the first frame of video does not use H.264, and wherein the decoding of the second encoded data into the pixel color data uses H.264. However, Holland discloses wherein the decoding of the first encoded data into the first frame of video does not use H.264, and wherein the decoding of the second encoded data into the pixel color data uses H.264 (e.g., see paragraph 0051: H.264/AVC). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system disclosed by Huang to add the teachings of Holland as above, in order to provide improved methods for modifying video encoding formats (see paragraph 0002: Holland).

Conclusion

12. Any inquiry concerning this communication or earlier communications from the examiner should be directed to ON MUNG whose telephone number is (571) 270-7557 and whose direct fax number is (571) 270-8557. The examiner can normally be reached on Mon-Fri 9am - 6pm (ET). If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, JAMIE ATALA, can be reached on (571) 272-7384. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ON S MUNG/
Primary Examiner, Art Unit 2486
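The operation at the center of the claims mapped above is dividing an RGBA bitstream into RGB data and Alpha data, encoding each with a different encoder, and re-merging them on the decoder side. A minimal, hypothetical sketch of that channel separation (editor's illustration, not from the application or the cited art; the function names are invented):

```python
# Illustrative sketch of RGBA channel separation: split interleaved
# RGBA bytes into an RGB buffer and an Alpha buffer so each could be
# handed to a different encoder, then re-interleave on the decoder side.

def split_rgba(rgba: bytes) -> tuple[bytes, bytes]:
    """Split an interleaved RGBA byte stream into RGB data and Alpha data."""
    assert len(rgba) % 4 == 0, "RGBA stream must be a whole number of pixels"
    rgb, alpha = bytearray(), bytearray()
    for i in range(0, len(rgba), 4):
        rgb += rgba[i:i + 3]       # R, G, B for this pixel
        alpha.append(rgba[i + 3])  # A for this pixel
    return bytes(rgb), bytes(alpha)

def merge_rgba(rgb: bytes, alpha: bytes) -> bytes:
    """Re-interleave RGB and Alpha buffers into an RGBA stream (decoder side)."""
    out = bytearray()
    for p, a in enumerate(alpha):
        out += rgb[3 * p:3 * p + 3]
        out.append(a)
    return bytes(out)

# Two pixels: opaque red, half-transparent green.
stream = bytes([255, 0, 0, 255, 0, 255, 0, 128])
rgb, alpha = split_rgba(stream)
assert merge_rgba(rgb, alpha) == stream  # round-trip is lossless
```

The dispute in the OA is not over this split itself (Huang's separation module is cited for it) but over the encoder-specific limitations layered on top, such as the GPU tile based framebuffer compression format and H.264.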

Prosecution Timeline

Jun 27, 2024
Application Filed
Nov 19, 2025
Response after Non-Final Action
Feb 07, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications with similar technology granted by the same examiner

Patent 12593064 — IMAGE ENCODING/DECODING METHOD AND APPARATUS, AND RECORDING MEDIUM STORING BITSTREAM
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12587688 — Signaling of Picture Header in Video Coding
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12578560 — CAMERA SYSTEM FOR GENERATING A GAPLESS OPTICAL IMAGE
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12581197 — EXTENDED SCENE VIEW
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12574503 — METHOD AND DEVICE FOR ENCODING/DECODING IMAGE, AND RECORDING MEDIUM STORING BIT STREAM
Granted Mar 10, 2026 (2y 5m to grant)
Based on this examiner's 5 most recent grants; study what changed in each to get past this examiner.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 74%
With Interview: 83% (+9.2%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 683 resolved cases by this examiner. Grant probability derived from career allow rate.
