DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 20 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent eligible subject matter because it is directed to a “computer readable storage device” and the specification does not explicitly exclude transitory signals. Examiner suggests amending the claim to recite a “non-transitory computer readable medium”.
Claims 1-6 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claims are drawn to an apparatus but fail to recite any structure. The recited neural networks are implemented in software, and “software per se” is not statutory.
Claim Rejections - 35 USC § 112(b)
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-6 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for pre-AIA applications, the applicant) regards as the invention. The claims are drawn to an apparatus but fail to recite any structure, and the metes and bounds of a structure for implementing the software (neural network) are not clear from the claims.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-6 and 8-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by D1.1
With regard to claim 1, D1 teaches an apparatus for ultra-low-dose (ULD) computed tomography (CT) reconstruction (see figs. 1, 11; ¶ 76), the apparatus comprising: a low dimensional estimation neural network configured to receive sparse sinogram data, and to reconstruct a low dimensional estimated image based, at least in part, on the sparse sinogram data (see fig. 9: coarse resolution inputs into a neural network that outputs an enhanced resolution image; fig. 11: an algorithm that receives a sinogram and uses a neural network for sparse-view reconstruction); a high dimensional refinement neural network configured to receive the sparse sinogram data and intermediate image data, and to reconstruct relatively high resolution CT image data, wherein the intermediate image data is related to the low dimensional estimated image (see fig. 9: coarse resolution inputs into a neural network that outputs an enhanced resolution image; fig. 11: an algorithm that receives a sinogram and uses a neural network for sparse-view reconstruction; fig. 21: versions of the algorithm in which the sinogram is run through a neural network to create a coarse [sparse] resolution reconstruction, which can be used with the original sinogram [related to the sparse sinogram data] through a different neural network to reconstruct an enhanced [higher resolution] image).
With regard to claim 2, D1 teaches wherein each neural network comprises an image reconstruction module (RM), a deep estimation module (DM), and an error correction module (EM) (see ¶¶ 49, 72-73: reconstruction module; see ¶ 58, fig. 9: a new DNN architecture, referred to as an “H-net”, which provides improvements in medical image analysis, and in particular medical image signal recovery; fig. 9: H-net 900 combines two deep image-to-image networks; see ¶ 63: error correction).
With regard to claim 3, D1 teaches wherein each neural network is configured to implement a split-Bregman technique (see ¶¶ 80, 87: split Bregman algorithm).
With regard to claim 4, D1 teaches the apparatus further comprising a filtered back projection (FBP) module configured to produce an FBP output based, at least in part, on the sparse sinogram data, the low dimensional estimated image being reconstructed based, at least in part, on the FBP output (see fig. 9: coarse resolution inputs into a neural network that outputs an enhanced resolution image; fig. 11: an algorithm that receives a sinogram and uses a neural network for sparse-view reconstruction; fig. 21: versions of the algorithm in which the sinogram is run through a neural network to create a coarse [sparse] resolution reconstruction, which can be used with the original sinogram [related to the sparse sinogram data] through a different neural network to reconstruct an enhanced [higher resolution] image; see also ¶¶ 69, 72, 81: back projection module).
With regard to claim 5, D1 teaches the apparatus further comprising an up-sampling module configured to produce the intermediate image data based, at least in part, on the low dimensional estimated image (see fig. 9, ¶¶ 58, 60-61).
With regard to claim 6, D1 teaches wherein the low dimensional estimation neural network and the high dimensional refinement neural network are trained based, at least in part, on normal dose (ND) CT image data (see ¶¶ 63, 76, 97, 108: training data including training input images and corresponding ground truth output images).
With regard to claims 8 and 14, see discussion of claim 1.
With regard to claims 9-13, see discussion of claims 2-6, respectively.
With regard to claims 15-19, see discussion of corresponding claims 2-6.
With regard to claim 20, see discussion of claim 1.
Claim 7 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AVINASH YENTRAPATI whose telephone number is (571) 270-7982. The examiner can normally be reached 8AM-5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sumati Lefkowitz can be reached on (571) 272-3638. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AVINASH YENTRAPATI/Primary Examiner, Art Unit 2672
1 US Publication No. 2021/0035338.