Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
MPEP 608.01(n)(III) states:
Any claim which is in dependent form but which is so worded that it, in fact, is not a proper dependent claim, as for example it does not include every limitation of the claim on which it depends, will be required to be canceled as not being a proper dependent claim; and cancellation of any further claim depending on such a dependent claim will be similarly required. The applicant may thereupon amend the claims to place them in proper dependent form, or may redraft them as independent claims, upon payment of any necessary additional fee.
Claim 18 is such a claim because it is directed to a system rather than a method as in referenced claim 1. MPEP 608.01(n)(III). While claim 18 has been examined in the interest of compact prosecution, claim 18 is required to be canceled.
Claim 2 is objected to for reciting “produce an further” with “an” instead of “a.”
Examiner Notes
1) Applicant’s 63/053,807 includes what appear to be copies of documents that were made public prior to the earliest asserted priority date (e.g., blog posts, white papers, scholarly publications). Some of the documents have dates, but others do not; some of the documents have authors, but others do not. The examiner would appreciate a list of citations for the underlying documents (e.g., authorship and earliest availability). The examiner notes that some of the URLs in these documents are no longer available, such as various posts from https://deeprender.blogin.co or https://github.com/deeprender/compression.
2) Applicant’s priority documents address a wide range of technologies, and the present claims are directed to many of them. While some of the claims may be directed to more than one invention (e.g., claim 6’s “target function is a perceptual metric” versus claim 7’s “target function is a runtime device proxy,” or claims 5 and 10 reciting different inputs to the proxy network), because all of the claims are anticipated, there is presently no search burden.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Drawings
The drawings are objected to because Figs. 6 and 7 are labeled with “distorted” and “non-distorted,” but the distortion is not apparent. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-18 are rejected on the ground of nonstatutory double patenting as being unpatentable over the claims of each of US 12026924 B1, US 12113985 B2, US 12167003 B2, US 11936866 B2, US 11881003 B2, US 11677948 B2, US 11985319 B2, US 12015776 B2, US 12022077 B2, US 12028525 B2, US 12075053 B2, US 12081759 B2, US 12095994 B2, US 12160579 B2, US 12256075 B2, US 12323593 B2, US 11599972 B1, US 11558620 B2, US 11606560 B2, US 11843777 B2, US 11544881 B1, US 11532104 B2, and US 11893762 B2 in view of the prior art as applied below.
Both the pending claims and the conflicting patents are directed to lossy data compression. Therefore, all of the conflicting patents are directed to the same problem as the present application. Further, any differences between the present claims and the claims in any of the conflicting patents are obvious in view of the prior art as applied below. It would have been obvious to one of ordinary skill in the art, before the effective filing date, to combine the below prior art with any of the conflicting patents for implementation details (especially as the patent claims lack implementation details). Based on the findings herein, this is an example of “(A) Combining prior art elements according to known methods to yield predictable results.” MPEP 2143.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1-18 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Claims 1 and 12 recite the use of various neural networks without specifying the architectures (i.e., “neural network” and “proxy network”). MPEP 2173.05(g) explains “Further, without reciting the particular structure, materials or steps that accomplish the function or achieve the result, all means or methods of resolving the problem may be encompassed by the claim.” Here, because the claims do not specify the structure that accomplishes the function, this is an example of unlimited functional claiming. Amending the claims to recite the layers shown in Fig. 3, which are sufficiently specific and well-known, is expected to overcome this rejection.
Claim 1 recites “evaluating a function,” but this is unlimited functional claiming. MPEP 2173.05(g).
Dependent claims are likewise rejected. Claim 18 is rejected as per claim 1.
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 1, 2, and 12 recite “approximation,” but this is a relative term. MPEP 2173.05(b). While the specification repeatedly refers to approximations, it does not provide sufficient guidance to determine if a particular image or function is an approximation of another image or function.
Claims 1 and 12 recite “proxy network,” but “proxy” is subjective. MPEP 2173.05(b)(IV). Here, whether a network can substitute for a second network (i.e., be a proxy for it) is a matter of opinion.
Claims 1 and 12 recite “configured to approximate a target function,” but because the target function is not specified, there is insufficient guidance as to whether the network is configured to approximate the unspecified target function or not.
Claim 1 recites “evaluating a function based on the output of the differentiable proxy network.” It is not clear what “based on the output” entails. Specifying that the output is an input to the function is expected to overcome this rejection.
Claim 1 recites “updating the parameters of the differentiable proxy network based on the evaluated function.” It is not clear what “based on” entails. Specifying what the evaluated function is, and that this function serves as the loss function for training, is expected to overcome this rejection.
Claims 2, 8, and 10 recite “the trained differentiable proxy network,” but this lacks sufficient antecedent basis because claim 1 twice recites “a trained differentiable proxy network.” One option is to amend claim 1 to specify that the two references to proxy networks are both to the same network, whereas other approaches (such as having one proxy network train a second proxy network) do not appear to have support in the priority documents.
Claims 4 and 13 recite “gradient intractable function,” but this is new terminology. MPEP 2173.05(a). “Gradient intractable function” is not a term of art, and the plain meaning appears to contradict itself (i.e., if it is intractable, how can it be a gradient function?).
Claim 5 recites “is the input image and the output image,” but this conflicts with parent claim 1 because parent claim 1 defines the input and output image as different images.
Claim 7 recites “runtime device proxy,” but this is new terminology. MPEP 2173.05(a).
Claim 12 recites first and second “trained” networks, but does not specify what the training was; thus one cannot determine whether the first and second networks are “trained,” because the training could be for anything.
Claims 9 and 11, as well as claims 15 and 17, recite both “quantization function” and “rounding function,” but the specification states that these terms are synonymous. See, e.g., specification p. 40. The lack of distinction between these terms renders the claims indefinite.
Dependent claims are likewise rejected. Claim 18 is rejected as per claim 1.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea (mental process) without significantly more.
Step 1: Claims 1 and 12 (and their dependents) recite methods, and processes are eligible subject matter.
Claim 18 recites a system, and machines are eligible subject matter.
Step 2A, prong one: All of the elements of claims 1-18 recite a mental process because a person can remember roughly what something looked like. Further, the various models are also mental processes; see Example 47, claim 2, element (d) (from the July 2024 AI subject matter eligibility examples). MPEP 2106.04(a)(2)(III)(C) explains that use of a generic computer, or performance in a computer environment, is still a mental process. In particular, this section begins by citing Gottschalk v. Benson, 409 U.S. 63 (1972): “The Supreme Court recognized this in Benson, determining that a mathematical algorithm for converting binary coded decimal to pure binary within a computer’s shift register was an abstract idea.” In Benson, the Supreme Court did not separately analyze the computer hardware at issue; the specifics of what hardware was claimed are included only in an appendix to the decision.
Because there are no additional elements, no further analysis is required for Step 2A, prong two or Step 2B.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-18 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Chen LH, Bampis CG, Li Z, Norkin A, Bovik AC, “ProxIQA: A proxy approach to perceptual optimization of learned image compression,” IEEE Transactions on Image Processing, vol. 30, pp. 360-73 (Nov. 13, 2020); October 19, 2019 preprint retrieved from https://arxiv.org/pdf/1910.08845v1 (“Chen”).
1. A method of training one or more neural networks, the one or more neural networks being for use in lossy image or video encoding, transmission and decoding, the method comprising the steps of:
receiving an input image at a first computer system; (Chen, Fig. 1. Also, to the extent that Fig. 1 is hypothetical (as opposed to an actual computer), Fig. 11 teaches the actual computer.)
encoding the input image using a first neural network to produce a latent representation; (Chen, Fig. 1)
decoding the latent representation using a second neural network to produce an output image, wherein the output image is an approximation of the input image; (Chen, Fig. 1)
wherein the method further comprises a step of generating an output using a trained differentiable proxy network, (Chen, III A, “As depicted in Fig. 4, the ProxIQA network may be as simple as a shallow CNN consisting of three stages of convolution, ReLU as an activation function, and maxpooling.” Chen’s ProxIQA teaches the claimed proxy network (abstract), and convolutional neural networks are differentiable, see also Chen, III stating that this network can be back-propagated, and this property also teaches the claimed differentiability. Chen, V, “experimentally demonstrated” teaches the claimed generating an output.)
where the differentiable proxy network is configured to approximate a target function; (Chen, IV C, “using three perceptual IQA models as optimization targets.” Chen’s IQA models teach the claimed target functions.)
evaluating a function based on the output of the differentiable proxy network; (Chen, Fig. 11, caption, “VMAF BD-rate change (improvement) against the number of training steps and learning rate, using the Kodak images.”)
updating the parameters of the differentiable proxy network based on the evaluated function; and (Chen, Fig. 11, caption, “VMAF BD-rate change (improvement) against the number of training steps and learning rate, using the Kodak images.”)
repeating the above steps using a set of input images to produce a trained differentiable proxy network. (Chen, Fig. 11)
2. The method of claim 1, further comprising the steps of, after obtaining the trained differentiable proxy network:
receiving a further input image at a first computer system; (Chen, Fig. 11. Chen’s millions of training steps teach the claimed further image.)
encoding the further input image using a first neural network to produce a latent representation; (Chen, Fig. 1, Generative Network)
decoding the latent representation using a second neural network to produce an further output image, wherein the further output image is an approximation of the further input image; (Chen, Fig. 1, caption, “A generative network takes an image x as input and outputs a reconstructed image yˆ = fg(x; θg).” Chen’s reconstruction teaches the claimed approximation.)
evaluating a function based on a difference between the further output image and the further input image; (Chen, Fig. 1, caption, “Given an image quality measurement M, the ProxIQA network is learned as its proxy, where the output Mˆproxy represents M(y, yˆ).”)
updating the parameters of the first neural network and the second neural network based on the evaluated function; and (Chen, Fig. 1, caption, “Given an image quality measurement M, the ProxIQA network is learned as its proxy, where the output Mˆproxy represents M(y, yˆ).” Chen’s learning teaches the claimed updating.)
repeating the above steps using a further set of input images to produce a first trained neural network and a second trained neural network. (Chen, Fig. 1, caption, “Given an image quality measurement M, the ProxIQA network is learned as its proxy, where the output Mˆproxy represents M(y, yˆ).” Chen’s learning teaches the claimed training and repeating.)
3. The method of claim 1, wherein the function is additionally based on a difference between the output image and the input image; and (Chen, Fig. 1, caption, “Given an image quality measurement M, the ProxIQA network is learned as its proxy, where the output Mˆproxy represents M(y, yˆ).” Chen’s image quality measurement teaches the claimed difference.)
the parameters of the first neural network and the second neural network are additionally updated based on the evaluated function to obtain a first trained neural network and a second trained neural network. (Chen, Fig. 1, caption, “Given an image quality measurement M, the ProxIQA network is learned as its proxy, where the output Mˆproxy represents M(y, yˆ).” Chen’s learning teaches the claimed training.)
4. The method of claim 1, wherein the target function is a gradient intractable function. (Chen, I, “other image quality indexes have never been adopted as deep network loss functions, because they are generally non-differentiable and functionally complex.” Chen’s non-differentiable teaches the claimed gradient intractable. The mapping of claim 1 details that the target function is a perceptual assessment as discussed here.)
5. The method of claim 1, wherein the input to the differentiable proxy network is the input image and the output image. (Chen, Fig. 1)
6. The method of claim 1, wherein the target function is a perceptual metric. (Chen, IV C, “using three perceptual IQA models as optimization targets.” Chen’s IQA models teach the claimed target functions.)
7. The method of claim 1, wherein the target function is a runtime device proxy. (Chen, Fig. 12)
8. The method of claim 1, wherein the output of the trained differentiable proxy network is used to obtain the output image. (Chen, Fig. 1)
9. The method of claim 1, wherein the target function is a quantization function. (Chen, IV C, “using three perceptual IQA models as optimization targets.” Chen’s IQA models teach the claimed target functions.)
10. The method of claim 1, wherein the input to the trained differentiable proxy network is the latent representation. (Chen, Fig. 1)
11. The method of claim 1, wherein the target function is a rounding function. (Chen, IV C, “using three perceptual IQA models as optimization targets.” Chen’s IQA models teach the claimed target functions. Specification, p. 40 equates quantization and rounding.)
Claims 12-17 are rejected as per the corresponding method claims.
Claim 18 is rejected as per claim 1. Chen, Figs. 11 and 12 teach the claimed system.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
US 10965948 B2 – titled “Hierarchical auto-regressive image compression system”
US 11693372 B2 – abstract, “maps the feature space to a non-Euclidean perceptual space”
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID ORANGE whose telephone number is (571)270-1799. The examiner can normally be reached Mon-Fri, 9-5.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Gregory Morse can be reached at 571-272-3838. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DAVID ORANGE/Primary Examiner, Art Unit 2663