DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/2/23 has been entered.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1, 3-14, and 16-22 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
In claims 1 and 14, the limitation “determine using a second neural network, a probability that at least one anatomical landmark is present in the at least one image frame based on evaluating a position of the ultrasound transducer relative to the target region, wherein the probability is provided to the first neural network and the confidence metric is based at least in part on the probability, and wherein the at least one anatomical landmark is separate from the anatomical feature” is not described in the originally filed specification in such a way as to reasonably convey possession of the claimed invention.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 3-6, 8-12, 14, 16, and 18-22 are rejected under 35 U.S.C. 103 as being unpatentable over Ardon et al. (US 2017/0181730; hereinafter Ardon) in view of Carneiro et al. (US 2009/0093717; hereinafter Carneiro), Georgescu et al. (US 2016/0174902; hereinafter Georgescu), and Cadieu et al. (US 2018/0153505; hereinafter Cadieu).
Ardon shows an ultrasound imaging system and method (abstract) comprising: an ultrasound transducer configured to acquire echo signals responsive to ultrasound pulses transmitted toward a target region of a patient ([0048]); a graphical user interface configured to display a biometry tool widget for acquiring a measurement of an anatomical feature within the target region from at least one image frame generated from the ultrasound echoes (alert user when initiating a measurement in a region of low confidence; [0072]-[0073], [0078]); and one or more processors in communication with the ultrasound transducer and configured to: determine a confidence metric indicative of an accuracy of the measurement ([0059]); and cause the graphical user interface to display a graphical indicator corresponding to the confidence metric ([0061]-[0062]).
Ardon also shows wherein the graphical user interface is not physically coupled to the ultrasound transducer (Figure 1); wherein the processors are further configured to: apply a threshold to the confidence metric to determine whether the measurement should be re-acquired ([0069]-[0073]); and cause the graphical user interface to display an indication of whether the measurement should be re-acquired ([0069]-[0073]); wherein the anatomical feature is a feature associated with a fetus or a uterus ([0049], Figure 1).
Ardon also shows wherein the quality level of the image frame is based on a distance of the anatomical feature from the ultrasound transducer, an orientation of the biometry tool widget relative to the ultrasound transducer, a distance of a beam focus region to the anatomical feature, a noise estimate obtained via frequency analysis, or combinations thereof (quality dependent on anatomical structures and propagation direction of ultrasound wave; [0059]).
Ardon fails to show determine using a second neural network, a probability that at least one anatomical landmark is present in the at least one image frame based on evaluating a position of the ultrasound transducer relative to the target region, wherein the probability is provided to the first neural network and the confidence metric is based at least in part on the probability, and wherein the at least one anatomical landmark is separate from the anatomical feature.
Ardon fails to show wherein the one or more processors are further configured to perform: evaluate a quality of the at least one image frame using a third neural network, wherein the third neural network is trained with imaging data comprising quality data for a plurality of ultrasound images, and wherein the confidence metric is based on the evaluation of the quality of the at least one image frame.
Ardon fails to show wherein the first neural network comprises a set of neurons arranged in an input layer, multiple hidden layers, and an output layer, and wherein the confidence metric is based at least in part on determining a prediction variation associated with at least one path comprising a subset of the set of neurons.
Ardon fails to show wherein the processors are configured to determine the confidence metric by inputting the at least one image frame into a first neural network trained with imaging data comprising the anatomical feature; wherein the processors are further configured to determine the confidence metric by inputting a patient statistic, a prior measurement of the anatomical feature, a derived measurement based on the prior measurement, a probability that the image frame contains an anatomical landmark associated with the anatomical feature, a quality level of the image frame, a setting of the ultrasound transducer, or combinations thereof, into the first neural network; wherein the probability that the image frame contains the anatomical landmark indicates whether a correct imaging plane has been obtained for measuring the anatomical feature; wherein the anatomical feature is a feature associated with a fetus or a uterus, and the derived measurement comprises a gestational age or an age-adjusted risk of a chromosomal abnormality; wherein the biometry tool widget comprises a caliper, a trace tool, an ellipse tool, a curve tool, an area tool, a volume tool, or combinations thereof; wherein the processors are further configured to determine a gestational age and/or a weight estimate based on the measurement.
Carneiro discloses an automated fetal-measurement from three-dimensional ultrasound data. Carneiro teaches determine the confidence metric by inputting the at least one image frame into a first neural network trained with imaging data comprising the anatomical feature (processor identifies fetal anatomy based on neural network; [0028]-[0029]); wherein the processors are further configured to determine the confidence metric by inputting a patient statistic, a prior measurement of the anatomical feature, a derived measurement based on the prior measurement, a probability that the image frame contains an anatomical landmark associated with the anatomical feature, a quality level of the image frame, a setting of the ultrasound transducer, or combinations thereof, into the first neural network ([0030]-[0035]); wherein the probability that the image frame contains the anatomical landmark indicates whether a correct imaging plane has been obtained for measuring the anatomical feature (probabilistic models, [0030]-[0031], [0039]); wherein the anatomical feature is a feature associated with a fetus or a uterus, and the derived measurement comprises a gestational age or an age-adjusted risk of a chromosomal abnormality ([0103]); wherein the biometry tool widget comprises a caliper, a trace tool, an ellipse tool, a curve tool, an area tool, a volume tool, or combinations thereof ([0044]); wherein the processors are further configured to determine a gestational age and/or a weight estimate based on the measurement ([0103]).
Georgescu discloses a method and system for anatomical object detection using neural networks. Georgescu teaches determine using a second neural network, a probability that at least one anatomical landmark is present in the at least one image frame, wherein the probability is provided to the first neural network and the confidence metric is based at least in part on the probability, and wherein the at least one anatomical landmark is separate from the anatomical feature (probability score for each landmark candidate, where at least one of the landmark candidates is separate from the anatomical feature; [0133]). Georgescu also teaches wherein the first neural network comprises a set of neurons arranged in an input layer, multiple hidden layers, and an output layer ([0037], [0044]-[0045], [0049]-[0050]), and wherein the confidence metric is based at least in part on determining a prediction variation associated with at least one path comprising a subset of the set of neurons ([0049], [0054], [0057], [0061]-[0065], [0077]-[0078], [0082]). Georgescu also teaches using second and third neural networks, wherein the second neural network is trained with imaging data comprising landmarks associated with the anatomical feature ([0073], [0079]-[0082]).
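For illustration only (forming no part of the rejection), the claimed "prediction variation associated with at least one path comprising a subset of the set of neurons" can be sketched as a Monte-Carlo-dropout-style computation: each random mask keeps a different subset of hidden neurons, and the spread of the resulting predictions serves as a variation-based confidence signal. All weights, dimensions, and the confidence formula below are hypothetical stand-ins, not taken from any cited reference.

```python
import random
import statistics

random.seed(1)

N_IN, N_HID = 8, 16
# Toy weights standing in for a trained network (input layer -> hidden layer -> output layer).
W1 = [[random.gauss(0, 1) for _ in range(N_IN)] for _ in range(N_HID)]
W2 = [random.gauss(0, 1) for _ in range(N_HID)]

def forward(x, keep):
    # `keep` selects a subset of the hidden neurons, so each call follows a
    # different "path" through the network.
    h = [max(sum(w * xi for w, xi in zip(row, x)), 0.0) if k else 0.0
         for row, k in zip(W1, keep)]
    return sum(w * hi for w, hi in zip(W2, h))

x = [random.gauss(0, 1) for _ in range(N_IN)]   # toy image-frame features
preds = [forward(x, [random.random() < 0.8 for _ in range(N_HID)])
         for _ in range(100)]
variation = statistics.pstdev(preds)     # prediction variation across paths
confidence = 1.0 / (1.0 + variation)     # lower variation -> higher confidence
```

The point of the sketch is only that a single input can yield many predictions, one per neuron subset, whose dispersion is usable as a confidence metric.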
Cadieu discloses a system for guided navigation of an ultrasound probe. Cadieu teaches evaluate a position of the ultrasound transducer relative to the target region using a neural network ([0009], [0016], [0022]-[0023], [0027], [0033]-[0034]), evaluate a quality of the at least one image frame using a neural network, wherein the neural network is trained with imaging data comprising quality data for a plurality of ultrasound images (train with imaging of target organ with a known probe pose deviation from the optimal pose, where the deviations and optimal pose correspond with poor quality and high quality of the image desired, and the neural network can thus guide a user to obtain a higher quality image by obtaining the optimal pose; [0016], [0005], [0022]-[0023], [0034]).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the invention of Ardon to utilize artificial intelligence such as neural networks as taught by Carneiro, to more efficiently and accurately diagnose the fetus by utilizing the processing power of artificial intelligence networks.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Ardon and Carneiro to utilize second and third neural networks and to train the neural networks with anatomical landmarks as taught by Georgescu, as the additional neural networks and training with anatomical landmarks will help to further refine the data to accurately locate/determine the anatomical object, and to obtain a higher accuracy and quality of the image.
Furthermore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Ardon and Carneiro to utilize a probability that an anatomical landmark is present as taught by Georgescu, as Georgescu teaches that the probability provides a computerized score ([0133]) which can be further utilized by the system/operator when analyzing the image to improve the accuracy of the result.
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Ardon, Carneiro, and Georgescu to evaluate the position of the ultrasound transducer using a neural network as taught by Cadieu, as this will aid the operator in further locating the anatomical object by correctly orienting the pose of the ultrasound transducer corresponding to the anatomical object, increasing the accuracy and quality of the images.
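For illustration only (forming no part of the rejection), the combined two-network arrangement can be sketched as follows: a second network scores the frame together with the transducer position and outputs a landmark probability, and that probability is one input to a first network that outputs the confidence metric. All function names, weights, and feature vectors are hypothetical stand-ins, not taken from any cited reference.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def landmark_probability(frame, probe_pos, w):
    # Stand-in "second neural network": scores frame features together with
    # the transducer position relative to the target region, outputting the
    # probability that the anatomical landmark is present.
    x = frame + probe_pos
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)))

def confidence_metric(frame, measurement, landmark_prob, w):
    # Stand-in "first neural network": the landmark probability is appended
    # to its inputs, so the confidence metric depends in part on it.
    x = frame + [measurement, landmark_prob]
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)))

frame = [random.gauss(0, 1) for _ in range(8)]   # toy image-frame features
pos = [random.gauss(0, 1) for _ in range(3)]     # toy transducer position
w2 = [random.gauss(0, 1) for _ in range(11)]     # toy trained weights
w1 = [random.gauss(0, 1) for _ in range(10)]
p = landmark_probability(frame, pos, w2)
conf = confidence_metric(frame, 4.2, p, w1)
```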
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Ardon and Carneiro to utilize a neural network having an input layer, multiple hidden layers, and an output layer and a confidence metric associated with the layers as taught by Georgescu, as Georgescu teaches that this is a typical neural network arrangement ([0037]) and will allow for accurately detecting an anatomical object in a medical image ([0049]).
Claims 7 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Ardon et al. (US 2017/0181730; hereinafter Ardon) in view of Carneiro et al. (US 2009/0093717; hereinafter Carneiro), Georgescu et al. (US 2016/0174902; hereinafter Georgescu), and Cadieu et al. (US 2018/0153505; hereinafter Cadieu) as applied to claims 3 and 16 above, and further in view of Cuckle (US 2011/0208053).
Ardon fails to show wherein the patient statistic comprises a maternal age, a patient weight, a patient height, or combinations thereof.
Cuckle discloses systems and methods for assessing risk of chromosomal disorders using ultrasound. Cuckle teaches wherein the patient statistic comprises a maternal age, a patient weight, a patient height, or combinations thereof ([0035]). Cuckle also teaches comparing the representation to a manifold of population-based data (ethnicity factors, [0035], [0041]).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Ardon, Carneiro, Georgescu, and Cadieu to additionally utilize parameters related to the patient/mother as taught by Cuckle, in order to more accurately diagnose the fetus and any potential abnormality.
Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Ardon et al. (US 2017/0181730; hereinafter Ardon) in view of Carneiro et al. (US 2009/0093717; hereinafter Carneiro), Georgescu et al. (US 2016/0174902; hereinafter Georgescu), and Cadieu et al. (US 2018/0153505; hereinafter Cadieu) as applied to claim 1 above, and further in view of Zhang et al. (US 2020/0345330; hereinafter Zhang) and Cuckle (US 2011/0208053).
Ardon fails to show wherein the first neural network comprises a multilayer perceptron network configured to perform supervised learning with stochastic dropout, or an autoencoder network configured to generate a compressed representation of the image frame and the measurement, and compare the compressed representation to a manifold of population-based data.
Zhang discloses a method for optimizing ultrasonic imaging system parameters based on deep learning. Zhang teaches wherein the first neural network comprises a multilayer perceptron network configured to perform supervised learning with stochastic dropout, or an autoencoder network configured to generate a compressed representation of the image frame and the measurement ([0071]-[0078]).
Cuckle discloses systems and methods for assessing risk of chromosomal disorders using ultrasound. Cuckle teaches wherein the patient statistic comprises a maternal age, a patient weight, a patient height, or combinations thereof ([0035]). Cuckle also teaches comparing the representation to a manifold of population-based data (ethnicity factors, [0035], [0041]).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the combined invention of Ardon, Carneiro, Georgescu, and Cadieu to compress the data as taught by Zhang, as this allows the system to obtain the benefits associated with data compression in computer processing systems by minimizing the size of the data to be acted upon. Compression is particularly beneficial in the context of a neural network, which processes a large amount of data simultaneously, and Zhang illustrates that it is known to incorporate an autoencoder with a neural network to more efficiently process the data in the neural network. Regarding the limitation of comparing the representation to a manifold of population-based data, this feature is taught by Cuckle, which compares to population data for further weighting, and it would have been obvious to further compare to population data in order to improve the accuracy of the measurements (Cuckle, [0035], [0041]).
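For illustration only (forming no part of the rejection), the autoencoder alternative of claim 13 can be sketched as follows: an encoder produces a compressed representation of the image frame and measurement, which is then compared against a reference "manifold" of population-based data via a nearest-neighbour distance. The linear encoder, dimensions, and distance comparison below are hypothetical stand-ins, not taken from any cited reference.

```python
import math
import random

random.seed(2)

D_IN, D_Z = 9, 3   # 8 toy frame features + 1 measurement -> 3-d code
# Toy linear map standing in for the encoder half of a trained autoencoder.
E = [[random.gauss(0, 1) for _ in range(D_IN)] for _ in range(D_Z)]

def encode(frame, measurement):
    # Produce a compressed representation of the frame and the measurement.
    x = frame + [measurement]
    return [sum(e * xi for e, xi in zip(row, x)) for row in E]

# Stand-in "manifold" of population-based data: encoded reference exams.
population = [encode([random.gauss(0, 1) for _ in range(8)],
                     random.uniform(2.0, 6.0)) for _ in range(50)]

z = encode([random.gauss(0, 1) for _ in range(8)], 4.0)
# Compare the representation to the manifold via nearest-neighbour distance;
# a large distance would flag a measurement unlike the reference population.
distance = min(math.dist(z, q) for q in population)
```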
Response to Arguments
Applicant’s arguments with respect to the claim(s) have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
The examiner cites new teachings of Georgescu which describe calculating a probability score for each landmark candidate, where at least one of the landmark candidates is separate from the anatomical feature ([0133]).
The examiner maintains that Cadieu teaches that it is known in the ultrasound imaging arts to utilize neural networks to evaluate the position of the transducer, in order to improve the accuracy of the result.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN CWERN whose telephone number is (571)270-1560. The examiner can normally be reached Monday - Friday, 8:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christopher Koharski, can be reached at (571) 272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JONATHAN CWERN/Primary Examiner, Art Unit 3797