DETAILED ACTION
This Office action is responsive to communications filed on 02/03/2026. Claims 1-3, 5, 8, 11-13, 15-16, and 18-20 have been amended. Claims 4 and 14 have been canceled. Presently, Claims 1-3, 5-13, and 15-20 remain pending and are hereinafter examined on the merits.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Previous rejections under 35 USC § 112(b) are withdrawn in view of the amendments filed on 02/03/2026.
Previous claim objections are withdrawn in view of the amendments filed on 02/03/2026.
The Applicant’s arguments with respect to the rejections under 35 USC § 101 have been fully considered, but they are not persuasive.
The Examiner directs the Applicant’s attention to the grounds for rejection of the claims under 35 U.S.C. 101 set forth in this Office Action in view of the amendments filed on 02/03/2026. Specifically, the Examiner’s response is set forth in the rejection under 35 U.S.C. 101 below.
The Applicant’s arguments with respect to the rejections under 35 USC § 102 have been fully considered, but they are not persuasive.
As an initial matter, the Applicant’s arguments rely on whether Mienkina explicitly labels certain display elements as information of a scanning view, rather than addressing what Mienkina teaches and renders apparent to one of ordinary skill in the art. Claim 1 does not require any labeling or visual format for the identification information of the scanning view. In fact, the claims do not even define what “identification information” refers to in the context of the claim, other than being associated with the scanning view, leaving the interpretation open in view of the prior art. Instead, the claim broadly recites generating ultrasound images supplemented with identification information of the scanning view. As set forth in the Office Action, Mienkina expressly discloses automatically detecting a target view corresponding to the acquired ultrasound image view using trained models and presenting that view-related information to the user during the scanning session. The “target view” is continuously referred to throughout Mienkina’s disclosure (see ¶Abstract, ¶0060, ¶0067-0068, ¶0070, ¶0072, Claim 1, Claims 7-8, Claim 15: Mienkina discloses acquiring, by an ultrasound system, an ultrasound image view. Subsequently, the system automatically detects, by at least one processor 132, 140 of the ultrasound system, a target view from a set of target views. The target view corresponds with the ultrasound image view. ¶0036, ¶0038-0039, ¶0040, ¶0060-0061: the detection relies on trained models, which are referred to as artificial intelligence image analysis techniques; the components responsible for this are the view detection processor 140 and the anatomical structure detection processor 150. These techniques include AI image analysis algorithms and machine learning processing functionalities. The deep neural network processing is described as identifying a target view with a high degree of probability, ¶0038.)
The target view in Mienkina is the scanning view identified by the system, and the displayed user interface elements (i.e., lists, identifiers, and structural overlays) are generated because the scanning view has been identified. The claims do not exclude a structural or graphical representation from constituting identification information of a scanning view, nor do they require express textual labeling.
Furthermore, the Applicant’s argument that Mienkina is concerned with matching an image view with a target view rather than identifying the scanning view is not persuasive. The Applicant’s argument relies on a narrower interpretation that is not reflected in the claims. Mienkina automatically detects the target view using trained models and presents view-related information in real time. Whether that information is presented as a list, as markers, or overlaid on anatomical structures does not negate that it conveys identification information of the detected scanning view.
For these reasons, the 35 USC § 102 rejection is maintained.
Examiner’s Notes
Applicant is reminded of the manner of making amendments in an application according to 37 C.F.R. 1.121(c). The amendments to the claims filed on 02/03/2026 (specifically, claim 18) do not show each amendment as added text.
Claims filed on 12/12/2024, Claim 18: lines 9-10: “generate ultrasound images supplemented with a list of one or more anatomical features captured in the ultrasound imagery during the ultrasound examination”
Claims filed on 02/03/2026, Claim 18, lines 11-13: “generate ultrasound images supplemented with a list of one or more anatomical features captured in the ultrasound imagery during the ultrasound examination and identification information of the scanning view.”
The term “and identification information of the scanning view” was not previously recited in the claims filed on 12/12/2024; thus, this added subject matter should have been, but was not, shown by underlining the added text.
The text of any added subject matter must be shown by underlining the added text. The text of any deleted matter must be shown by strike-through, except that double brackets placed before and after the deleted characters may be used to show deletion of five or fewer consecutive characters. The text of any deleted subject matter must be shown by being placed within double brackets if strike-through cannot be easily perceived. See MPEP 714, subsection II.C.(A).
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-3, 5-13, 15-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 of the subject matter eligibility test (see MPEP 2106.03).
Claims 1-3 and 5-11 are drawn to a “system,” which describes one of the four statutory categories of patentable subject matter, i.e., a machine.
Claims 12-13 and 15-17 are directed to a “method,” which describes one of the four statutory categories of patentable subject matter, i.e., a process.
Claims 18-20 are directed to an “apparatus,” which describes one of the four statutory categories of patentable subject matter, i.e., a machine.
Step 2A of the subject matter eligibility test (see MPEP 2106.04).
Prong One:
Claim 1 recites (“sets forth” or “describes”) the abstract idea of “a mental process” (MPEP 2106.04(a)(2).III.) and the abstract idea of “mathematical concepts” (MPEP 2106.04(a)(2).I.), substantially as follows:
“identify anatomical features captured in the ultrasound imagery during the ultrasound examination; identify a scanning view captured during the ultrasound examination by applying trained models to the ultrasound images; and”
Claim 12 recites (“sets forth” or “describes”) the abstract idea of “a mental process” (MPEP 2106.04(a)(2).III.) and the abstract idea of “mathematical concepts” (MPEP 2106.04(a)(2).I.), substantially as follows:
“identifying, [...], anatomical features captured in the ultrasound imagery during the ultrasound examination; identifying, [...], a scanning view captured during the ultrasound examination by applying trained models to the ultrasound images; and”
Claim 18 recites (“sets forth” or “describes”) the abstract idea of “a mental process” (MPEP 2106.04(a)(2).III.) and the abstract idea of “mathematical concepts” (MPEP 2106.04(a)(2).I.), substantially as follows:
“identify anatomical features captured in the ultrasound imagery during the ultrasound examination; identify a scanning view captured during the ultrasound examination by applying trained models to the ultrasound images; and”
In claims 1, 12, and 18, the identified limitations recite abstract ideas because they set forth both the abstract idea of “a mental process” (MPEP 2106.04(a)(2).III.) and the abstract idea of “mathematical concepts” (MPEP 2106.04(a)(2).I.). Specifically, the limitations directed to identifying anatomical features in ultrasound imagery and identifying a scanning view by applying trained models describe activities that fundamentally involve evaluation, judgment, and recognition of patterns in visual information. A medical professional can mentally review ultrasound images, compare observed shapes and structures to known anatomical knowledge, and determine what organs or features are present, as well as recognize the scanning view based on experience. These acts constitute a mental process that can be practically performed in the human mind without the aid of a computer. In addition, the recitation directed to applying trained models to identify features or views describes mathematical relationships and calculations, such as classification, comparison, or pattern matching, that are abstract mathematical concepts. The claimed limitations identified above broadly cover the conceptual steps of analyzing image data and labeling results. Accordingly, these limitations set forth abstract ideas of mental processes and mathematical concepts. There is nothing recited in the claims to suggest an undue level of complexity in how the identification is done.
Prong Two: Claims (1, 12, 18) do not include additional elements that integrate the mental process into a practical application.
This judicial exception is not integrated into a practical application. In particular, the claims recite (1) additional steps of “a memory that stores instructions; a processor that executes the instructions; and a display, wherein in response to being executed by the processor, the instructions cause the ultrasound system to: control an ultrasound probe to capture ultrasound imagery during an ultrasound examination;” (claim 1); “A method for supplementing ultrasound images, the method comprising: controlling an ultrasound probe to capture capturing ultrasound imagery during an ultrasound examination; [...], by a controller with a processor executing instructions from a memory, [...]; [...], by the controller, [...]; and” (claim 12); and “a memory that stores instructions; and a processor that executes the instructions, wherein in response to being executed by the processor, the instructions cause the controller to: control an ultrasound probe to capture ultrasound imagery during an ultrasound examination;” (claim 18); and
(2) a further additional step of generat[ing] ultrasound images supplemented with a list of one or more anatomical features captured in the ultrasound imagery during the ultrasound examination and identification information of the scanning view (claims 1, 12, 18).
The steps in (1) represent mere data gathering or pre-solution activities that are necessary for use of the recited judicial exception and are recited at a high level of generality with conventionally used tools (see Step 2B below for further details). Data gathering and mere instructions to implement an abstract idea on a computer do not integrate a judicial exception into a practical application (MPEP 2106.05(f) and (g)). Regarding the limitations of claim 12 directed to the controller with a processor executing instructions from a memory (i.e., “identifying, by a controller with a processor executing instructions from a memory, anatomical features captured in the ultrasound imagery during the ultrasound examination; and [...] identifying, by the controller, a scanning view captured during the ultrasound examination by applying trained models to the ultrasound images; and”), this language is treated as a generic computer implementation, which falls under mere instructions to apply the abstract idea on a computer, and therefore does not place the abstract idea into a practical application that solves a technological problem in a meaningful way or improves the functionality of the technology or the generic computer “itself.” Simply put, it is a generic computer implementation of a mental process rather than a meaningful limitation. Because the processor language is written at such a high level of generality, without structural limitations, it amounts to a generic computer component with mere instructions to implement the abstract idea on a computer.
The step in (2) amounts to no more than insignificant post-solution activity and is recited at a high level of generality. It is insignificant post-solution activity because it amounts to no more than presenting the desired results of the abstract mental process, rather than integrating the abstract idea into a practical application that solves a technological problem in a meaningful way.
As a whole, the additional elements merely serve to gather and feed information to the abstract idea and to output a notification based on the abstract idea, while generically implementing it on conventionally used tools. There is no practical application because the abstract idea is not applied, relied on, or used in a meaningful way. No improvement to the technology is evident, and the information is not outputted in any way such that a practical benefit is realized. Therefore, the additional elements, alone or in combination, do not integrate the abstract idea into a practical application.
Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Further, there is no evidence of record that would support an assertion that this step is an improvement to a computer or a technological solution to a technological problem. Ultimately, the Applicant describes an improvement in the process of displaying anatomical features of interest, but this is not an improvement in the function of a computer or other technology (see MPEP 2106.05(a)(II): “the court determined that the claimed user interface simply provided a trader with more information to facilitate market trades, which improved the business process of market trading but did not improve computers or technology”; see also MPEP 2106.04(d)(1), 2106.05(a), and 2106.05(f)). The claims are directed to the abstract idea. Also, there does not appear to be any particular structure or machine, treatment or prophylaxis, transformation, or any other meaningful application that would render the claims eligible at Step 2A, Prong Two.
Step 2B of the subject matter eligibility test (see MPEP 2106.05).
Claims 1, 12, and 18 do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the claims recite additional steps of instructions to control an ultrasound probe for capturing ultrasound imagery during an examination. These steps represent mere data gathering, data outputting, or pre-/post-/extra-solution activities that are necessary for use of the recited judicial exception and are recited at a high level of generality. Furthermore, as discussed above, the limitations with respect to the processor language amount to mere instructions to implement the abstract idea on a computer. As discussed with respect to Step 2A, Prong Two, the additional elements in the claims amount to no more than insignificant extra-solution activity and mere instructions to apply the exception using a generic computer component. The same analysis applies here in Step 2B and does not provide an inventive concept. The data gathering steps that were considered insignificant extra-solution activity in Step 2A, Prong Two, have been re-evaluated in Step 2B and determined to be well-understood, routine, conventional activity in the field.
As evidence, Popovic et al. (US 2019/0290247 A1) discloses:
¶0044, ‘endoscope controller 30 is structurally configured as well known in the art of the present disclosure for controlling an operation of an endoscope in generating an endoscopic image, LUS probe controller 40 is structurally configured as well known in the art for controlling an operation of a LUS probe in generating ultrasound image(s), and display controller 60 is structurally configured as well known in the art for controlling an operation of a display/monitor.’
For these reasons, there is no inventive concept, and the claims are not patent eligible. Even when viewed as a whole, nothing in the claims adds significantly more to the abstract idea.
Dependent Claims
The following dependent claims merely further define the abstract idea and, therefore, recite an abstract idea for similar reasons:
Defining identifying anatomical organs captured during the ultrasound examination (claims 5, 15, and 20): directed to the aforementioned mental concepts and therefore to the abstract idea. The step of supplementing the ultrasound images with the anatomical organs as one or more anatomical features captured in the ultrasound imagery during the ultrasound examination is directed to data gathering and insignificant post-solution activity, which is recited conventionally and at a high level of generality. As such, the abstract idea is not applied, relied on, or used in a meaningful way. No improvement to the technology is evident, and the determined visualization of context is not outputted in any way such that a practical benefit is realized.
Defining updating the list of one or more anatomical features captured in the ultrasound imagery in response to the captured ultrasound imagery changing during the ultrasound examination, thereby capturing at least one previously uncaptured anatomical feature (claims 8 and 16): directed to the abstract idea of mental observation and mentally updating the list to keep track of the anatomical features. The claims do not preclude this interpretation of the abstract idea.
Defining populating a template with the list of one or more anatomical features (claim 11).
The following dependent claims merely further describe the extra-solution activities and, therefore, do not amount to significantly more than the judicial exception or integrate the abstract idea into a practical application for similar reasons:
Describing that the instructions cause the ultrasound system further to: receive information from an external processor, wherein the ultrasound images are supplemented with the information received from the external processor (claim 2).
Describing displaying the ultrasound images supplemented with the information received from the processor and the list of one or more anatomical features captured in the ultrasound imagery during the ultrasound examination (claims 3, 13, and 19), whereas the transmitting and receiving of ultrasound beams is pre-solution data gathering.
Describing connecting to an external record system during the ultrasound examination to retrieve data, and supplementing the ultrasound images with the data retrieved from the external record system (claim 6).
Describing connecting to an external record system during the ultrasound examination to retrieve data, and uploading data to the external record system during the ultrasound examination (claim 7).
Describing merging the ultrasound images into an output file with subject-specific information (claim 9).
Describing displaying information indicating completion of one or more scans for one or more scanning views captured during the ultrasound examination (claims 10 and 17). The Examiner agrees that displaying information on a monitor cannot be performed in the human mind; hence, it is not part of the abstract idea. However, it is not a practical application either. It is merely insignificant post-solution activity. In addition, the abstract idea is not applied, relied on, or used in a meaningful way. No improvement to the technology is evident, and the determined visualization of context is not outputted in any way such that a practical benefit is realized.
In claims 2-3, 6-7, 9-10, 13, 17, and 19, the data gathering steps and pre-/post-solution and/or insignificant extra-solution activities are recited conventionally and at a high level of generality. As such, the abstract idea is not applied, relied on, or used in a meaningful way. No improvement to the technology is evident, and the determined visualization of context is not outputted in any way such that a practical benefit is realized.
Taken alone and in combination, the additional elements do not integrate the judicial exception into a practical application, at least because the abstract idea is not applied, relied on, or used in a meaningful way. They also do not add anything significantly more than the abstract idea. Their collective functions merely provide computer/electronic implementation and processing, with no additional elements beyond those of the abstract idea. Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements individually. There is no indication that the combination of elements improves the functioning of a computer or output device, or improves technology outside the technical field of the claimed invention. Therefore, the claims are rejected as being directed to non-statutory subject matter.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as failing to set forth the subject matter which the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the applicant, regards as the invention.
Claim 1: line 11, “the ultrasound images,” in view of line 12, “generate ultrasound images,” renders the claim indefinite. First, there is insufficient antecedent basis for this limitation in line 11 of the claim, as required by MPEP 2173.05(e). Second, it is unclear whether the ultrasound images in line 11 refer to, or are separate from, the generated ultrasound images in line 12. For examination purposes, the Examiner assumes they are the same images. Appropriate correction is required.
The above rejections to claim 1 apply to claim 12 and claim 18 for substantially identical claim limitations recited in those claims. Accordingly, proper ordinal numbering and/or antecedent basis is required.
Claim 5 recites “the one or more anatomical features.” It is unclear whether the phrase refers to, or is separate from, the list of the one or more anatomical features or the identified anatomical features. Consistent claim language is required when referring to the same term. For examination purposes, the Examiner assumes it refers to the list of the one or more anatomical features (i.e., supplement[ing] the ultrasound images with the list of the one or more anatomical features captured in the ultrasound imagery during the ultrasound examination). Appropriate correction is required.
The above rejections to claim 5 apply to claim 15 and claim 20 for substantially identical claim limitations recited in those claims. Accordingly, proper ordinal numbering and/or antecedent basis is required.
The dependent claims of the above rejected claims are rejected due to their dependency.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 5, 8, 11-12, 15-16, and 18-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Mienkina (US 2022/0071595 A1).
Claim 1: Mienkina discloses, An ultrasound system (ultrasound system 100), comprising (¶Abstract – real-time anatomical structure recognition in acquired ultrasound image views is provided.)
a memory that stores instructions; (Archive 138 includes databases and sets of information and instructions for detecting anatomical and/or image features in the acquired ultrasound image views, ¶0056)
a processor (signal processor 132 (i.e., controller) includes processor 140, 150, and 160) that executes the instructions; and (¶0036, ‘The signal processor 132, including the view detection processor 140, the anatomical structure detection processor 150, and the user interface element processor 160, may be capable of executing any of the method(s) and/or set(s) of instructions discussed herein in accordance with the various embodiments, for example.’, see also - ¶0056, ¶0070)
a display (display system 134), wherein in response to being executed by the processor, the instructions cause the ultrasound system to: (¶0027, ¶0054-0055, ¶0070, Claim 8 – regarding the display system is configured to present the ultrasound image view and user interface elements)
control an ultrasound probe to capture ultrasound imagery during an ultrasound examination; (Claims 1, 7-8, & 15, ¶0035, ¶0067-0068, ¶0070, - The system 100 comprises an ultrasound probe 104 configured to acquire an ultrasound image view, which is during a scanning session. The system and method also includes the step of acquiring an ultrasound image view.)
With regard to controlling an ultrasound probe, Mienkina explicitly teaches a transmitter 102 operable to drive an ultrasound probe 104 and a transmit beamformer 110 operable to control the transmitter, ¶0028-0029. The system is operable to continuously acquire ultrasound scan data via the probe, which captures the imagery, ¶0028-0029.
identify anatomical features captured in the ultrasound imagery during the ultrasound examination; and (Claims 8, 15, ¶0039-0040, ¶0061, ¶0070: The processor is configured to automatically determine one or both of a presence or absence of a plurality of anatomical features associated with a detected target view in the acquired ultrasound image view. Specifically, the anatomical structure detection processor 150 performs image analysis techniques, such as artificial intelligence or deep neural network image analysis, to determine the features present and absent.)
identify a scanning view captured during the ultrasound examination by applying trained models to the ultrasound images; and
(¶Abstract, ¶0060, ¶0067-0068, ¶0070, ¶0072, Claim 1, Claim 15, Claims 7-8: Mienkina discloses acquiring, by an ultrasound system, an ultrasound image view. Subsequently, the system automatically detects, by at least one processor 132, 140 of the ultrasound system, a target view from a set of target views. The target view corresponds with the ultrasound image view. ¶0036, ¶0038-0039, ¶0040, ¶0060-0061: the detection relies on trained models, which are referred to as artificial intelligence image analysis techniques; the components responsible for this are the view detection processor 140 and the anatomical structure detection processor 150. These techniques include AI image analysis algorithms and machine learning processing functionalities. The deep neural network processing is described as identifying a target view with a high degree of probability, ¶0038.)
generate ultrasound images supplemented with a list of one or more of the anatomical features captured in the ultrasound imagery during the ultrasound examination and identification information of the scanning view. (¶0042, ¶0070, Claims 4-6, 8, 12-14, 17-19: The processor, specifically the user interface element processor 160, is configured to generate at least one user interface element (220-270) indicating the presence or absence of each of the anatomical features. One type of user interface element is a list (230). This list 230 contains the anatomical and/or image features present 232 and/or missing 234 in the acquired ultrasound image view 210 corresponding to a target view. The display system 134 presents the ultrasound image view 210 along with the user interface elements, including the list. The system of Mienkina operates in real time during a scanning session, detecting the presence or absence of these features as the echo signals are received, ¶0035, ¶0039, ¶0061. As previously discussed, Mienkina’s system includes the view detection processor 140, which automatically detects a target view (i.e., the scanning view) corresponding to the acquired image. The system then supplements the image with identification information regarding this view, such as a 3D anatomical model having a representation 262 of a location of the acquired ultrasound image view, ¶0038, ¶0042, ¶0053, ¶0060, ¶0065.)
Claim 5: Mienkina discloses all the elements above in claim 1. Mienkina further discloses: wherein in response to being executed by the processor, the instructions cause the ultrasound system to: (¶Abstract, ¶0036, ¶0039, ¶0058, ¶0061, ¶0072, Claim 1, Claims 8-9, Claim 15: Mienkina discloses acquiring, by the ultrasound system, an ultrasound image view. The system then automatically detects a target view corresponding to the acquired image. After the target view is identified, the system automatically determines one or both of a presence or absence of a plurality of anatomical features associated with the target view in the ultrasound image view. This determination is performed by the anatomical structure detection processor 150. The anatomical features are components of organs or specific structures for protocol adherence during an examination. In the head transcerebellar plane view of a second trimester obstetric fetal examination, the features include anatomical features, such as a cerebellum, cavum septum pellucidum, cisterna magna, midline falx, and brain symmetry, and image features, such as a particular magnification of the acquired ultrasound image view.)
supplement the ultrasound images with the anatomical organs as the one or more anatomical features captured in the ultrasound imagery during the ultrasound examination. (¶0042, ¶0044, ¶0062, ¶0069, ¶0071, Claims 4-5, 12-14, 18-19: Mienkina discloses that the supplementation is providing a list 230 that indicates the presence or absence of the anatomical and/or image features captured or expected to be captured in the view. This list provides feedback regarding whether the image is protocol adherent by identifying the features that are present and/or missing during the ultrasound examination.)
Claim 8: Mienkina discloses all the elements above in claim 1. Mienkina further discloses: wherein in response to being executed by the processor, the instructions cause the ultrasound system to:
-The entire method and system of Mienkina is designed for adapting the user interface elements based on real-time anatomical structure recognition in acquired ultrasound image views, ¶Abstract, ¶0005, ¶0019, ¶0035-0036, ¶0067-0068. The user interface elements include the list 230 that indicates the presence 232 and/or absence 234 of anatomical features, generated based on the current determination of the structures present in the image view, ¶0041-0042, ¶0044, ¶0069. The system automatically determines the presence and/or absence of a plurality of anatomical features in the ultrasound image view, ¶Abstract, ¶0068, Claim 1. The system provides user feedback for manipulating an ultrasound probe to acquire a protocol-adherent ultrasound image view. This feedback includes instructions 270 based on what features are currently present and/or missing, ¶0042, ¶0065-0066. The operator follows the instructions 270 for manipulating the probe position or orientation, and the acquired ultrasound image view changes, ¶0067-0068. When the image view changes due to manipulation, the process is designed to return to step 1304 to acquire a different ultrasound image view 210. Steps 1306 and 1308 then re-analyze the new image view to detect the view and determine the presence and/or absence of the anatomical features, ¶0060-0061, ¶0067-0068. Subsequently, the system presents identification 220-234, which includes the list 230, of the anatomical structures present and/or absent, ¶0062.
Claim 11: Mienkina discloses all the elements above in claim 1, Mienkina discloses: wherein supplementing the ultrasound images comprises populating a template with the list of the one or more anatomical features. (¶Abstract, ¶0041, ¶0070: -supplementing the ultrasound image 210 involves presenting user interface elements 220-270. One of these elements is the list of anatomical and/or image features. The examination protocol defines a number of specific target views and criteria for adherence based on the presence of certain anatomical features, ¶0038, ¶0058, ¶0060. This defined set of required features is a structural template for the displayed list. See also ¶0041, ‘the user interface element processor 160 may be configured to register a pictogram template 240 or structural overlay template 250 with the acquired ultrasound image view 210 and present the pictogram template 240 or structural overlay template 250 overlaid on the acquired ultrasound image view 210 with markers 224, 226 or other identifiers indicating the presence 224 and/or absence 226 of anatomical and/or image features in the acquired ultrasound image view 210.’)
Claim 12: Mienkina discloses, A method for supplementing ultrasound images, the method comprising: (¶Abstract – real-time anatomical structure recognition in acquired ultrasound image views is provided, where supplemental 3D models, ¶0065, pictograms, ¶0041 and list, ¶0042, are provided.)
controlling an ultrasound probe to capture ultrasound imagery during an ultrasound examination; (Claims 1, 7-8, & 15, ¶0035, ¶0067-0068, ¶0070, - The system 100 comprises an ultrasound probe 104 configured to acquire an ultrasound image view, which occurs during a scanning session. The system and method also include the step of acquiring an ultrasound image view.)
With regard to controlling an ultrasound probe, Mienkina explicitly teaches including a transmitter 102 operable to drive an ultrasound probe 104 and a transmit beamformer 110 operable to control the transmitter, ¶0028-0029. The system is operable to continuously acquire ultrasound scan data via the probe, which captures the imagery, ¶0028-0029.
identifying, by a controller (signal processor 132 (i.e., controller) includes processors 140, 150, and 160) with a processor (processor 140, 150, 160 - (¶0036, ‘The signal processor 132, including the view detection processor 140, the anatomical structure detection processor 150, and the user interface element processor 160, may be capable of executing any of the method(s) and/or set(s) of instructions discussed herein in accordance with the various embodiments, for example.’, see also - ¶0056, ¶0070)) executing instructions from a memory, (Archive 138 includes databases and sets of information and instructions for detecting anatomical and/or image features in the acquired ultrasound image views-¶0056) anatomical features captured in the ultrasound imagery during the ultrasound examination; and (Claims 8, 15, ¶0039-0040, ¶0061, ¶0070, -The processor is configured to automatically determine one or both of a presence or absence of a plurality of anatomical features associated with a detected target view in the acquired ultrasound image view. Specifically, the anatomical structure detection processor 150 performs image analysis techniques, such as artificial intelligence or deep neural network image analysis, to determine the features present and absent.)
identifying, by the controller, a scanning view captured during the ultrasound examination by applying trained models to the ultrasound images; and
(¶Abstract, ¶0060, ¶0067-0068, ¶0070, ¶0072, Claim 1, Claim 15, Claims 7-8: -Mienkina discloses acquiring, by an ultrasound system, an ultrasound image view. Subsequently, the system automatically detects, by at least one processor 132, 140 of the ultrasound system, a target view from a set of target views. The target view corresponds with the ultrasound image view. ¶0036, ¶0038-0039, ¶0040, ¶0060-0061: -The detection relies on trained models, which are referred to as artificial intelligence image analysis techniques; the processors responsible for this are the view detection processor 140 and the anatomical structure detection processor 150. These specific techniques include the AI image analysis algorithms and machine learning processing functionalities. The deep neural network processing is described as identifying a target view with a high degree of probability, ¶0038.)
generating ultrasound images supplemented with a list of one or more of the anatomical features captured in the ultrasound imagery during the ultrasound examination and identification information of the scanning view. (¶0042, ¶0070, Claims 4-6, 8, 12-14, 17-19, - The processor, specifically the user interface element processor 160, is configured to generate at least one user interface element (220-270) indicating the presence or absence of each of the anatomical features. One type of user interface element is a list (230). This list 230 contains the anatomical and/or image features present 232 and/or missing 234 in the acquired ultrasound image view 210 corresponding to a target view. The display system 134 presents the ultrasound image view 210 along with the user interface elements, including the list. The system of Mienkina operates in real-time during a scanning session, detecting the presence or absence of these features as the echo signals are received, ¶0035, ¶0039, ¶0061. As previously discussed, Mienkina’s system includes the “view detection processor 140,” which automatically detects a target view (i.e., the scanning view) corresponding to the acquired image. The system then supplements the image with identification information regarding this view, such as a 3D anatomical model having a representation 262 of a location of the acquired ultrasound image view, ¶0038, ¶0042, ¶0053, ¶0060, ¶0065.)
Claim 15: Mienkina discloses all the elements above in claim 12, Mienkina discloses: further comprising: identifying anatomical organs captured during the ultrasound examination; and
(¶Abstract, ¶0036, ¶0039, ¶0058, ¶0061, ¶0072, Claim 1, Claims 8-9, Claim 15: - Mienkina discloses acquiring, by the ultrasound system, an ultrasound image view. The system then automatically detects a target view corresponding to the acquired image. After the target view is identified, the system automatically determines one or both of a presence or absence of a plurality of anatomical features associated with the target view in the ultrasound image view. This determination is performed by the anatomical structure detection processor 150. The anatomical features are components of organs or specific structures for protocol adherence during an examination. In the head transcerebellar plane view of a second trimester obstetric fetal examination, the features include anatomical features, such as a cerebellum, cavum septum pellucidum, cisterna magna, midline falx, and brain symmetry, and image features, such as a particular magnification of the acquired ultrasound image view.)
supplementing the ultrasound images with the anatomical organs as the one or more anatomical features captured in the ultrasound imagery during the ultrasound examination. (¶0042, ¶0044, ¶0062, ¶0069, ¶0071, Claims 4-5, 12-14, 18-19: -Mienkina discloses that the supplementation is provided as a list 230 that indicates the presence or absence of the anatomical and/or image features captured or expected to be captured in the view. This list provides feedback regarding whether the image is protocol adherent by identifying the features that are present and/or missing during the ultrasound examination.)
Claim 16: Mienkina discloses all the elements above in claim 12, Mienkina discloses: further comprising: updating the list of the one or more anatomical features captured in the ultrasound imagery in response to the captured ultrasound imagery changing during the ultrasound examination thereby capturing at least one previously uncaptured anatomical feature.
-The entire method and system of Mienkina is designed for adapting the user interface elements based on real-time anatomical structure recognition in acquired ultrasound image views, ¶Abstract, ¶0005, ¶0019, ¶0035-0036, ¶0067-0068. The user interface elements include the list 230 that indicates the presence 232 and/or absence 234 of anatomical features, generated based on the current determination of structures present in the image view, ¶0041-0042, ¶0044, ¶0069. The system automatically determines the presence and/or absence of a plurality of anatomical features in the ultrasound image view, ¶Abstract, ¶0068, Claim 1. The system provides user feedback for manipulating an ultrasound probe to acquire a protocol adherent ultrasound image view. This feedback includes instructions 270 based on what features are currently present and/or missing, ¶0042, ¶0065-0066. The operator follows the instructions 270 for manipulating the probe position or orientation, and the acquired ultrasound image view changes, ¶0067-0068. When the image view changes due to manipulation, the process is designed to return to step 1304 to acquire a different ultrasound image view 210. Steps 1306 and 1308 then re-analyze the new image view to detect the view and determine the presence and/or absence of the anatomical features, ¶0060-0061, ¶0067-0068. Subsequently, the system presents identification 220-234, which includes the list 230, of the anatomical structures present and/or absent, ¶0062.
Claim 18: Mienkina discloses, A controller (signal processor 132 (i.e., controller) includes processors 140, 150, and 160) for an ultrasound system (ultrasound system 100), comprising: a memory that stores instructions; and (¶Abstract – real-time anatomical structure recognition in acquired ultrasound image views is provided, where supplemental 3D models, ¶0065, pictograms, ¶0041, and lists, ¶0042, are provided.) (Archive 138 includes databases and sets of information and instructions for detecting anatomical and/or image features in the acquired ultrasound image views-¶0056)
a processor that executes the instructions, wherein in response to being executed by the processor, the instructions cause the controller to: (¶0036, ‘The signal processor 132, including the view detection processor 140, the anatomical structure detection processor 150, and the user interface element processor 160, may be capable of executing any of the method(s) and/or set(s) of instructions discussed herein in accordance with the various embodiments, for example.’, see also - ¶0056, ¶0070)
control an ultrasound probe to capture ultrasound imagery during an ultrasound examination; (Claims 1, 7-8, & 15, ¶0035, ¶0067-0068, ¶0070, - The system 100 comprises an ultrasound probe 104 configured to acquire an ultrasound image view, which occurs during a scanning session. The system and method also include the step of acquiring an ultrasound image view.)
-The entire method and system of Mienkina is designed for adapting the user interface elements based on real-time anatomical structure recognition in acquired ultrasound image views, ¶Abstract, ¶0005, ¶0019, ¶0035-0036, ¶0067-0068. The user interface elements include the list 230 that indicates the presence 232 and/or absence 234 of anatomical features, generated based on the current determination of structures present in the image view, ¶0041-0042, ¶0044, ¶0069. The system automatically determines the presence and/or absence of a plurality of anatomical features in the ultrasound image view, ¶Abstract, ¶0068, Claim 1. The system provides user feedback for manipulating an ultrasound probe to acquire a protocol adherent ultrasound image view. This feedback includes instructions 270 based on what features are currently present and/or missing, ¶0042, ¶0065-0066. The operator follows the instructions 270 for manipulating the probe position or orientation, and the acquired ultrasound image view changes, ¶0067-0068. When the image view changes due to manipulation, the process is designed to return to step 1304 to acquire a different ultrasound image view 210. Steps 1306 and 1308 then re-analyze the new image view to detect the view and determine the presence and/or absence of the anatomical features, ¶0060-0061, ¶0067-0068. Subsequently, the system presents identification 220-234, which includes the list 230, of the anatomical structures present and/or absent, ¶0062.
With regard to controlling an ultrasound probe, Mienkina explicitly teaches including a transmitter 102 operable to drive an ultrasound probe 104 and a transmit beamformer 110 operable to control the transmitter, ¶0028-0029. The system is operable to continuously acquire ultrasound scan data via the probe, which captures the imagery, ¶0028-0029.
identify anatomical features captured in the ultrasound imagery during the ultrasound examination; (Claims 8, 15, ¶0039-0040, ¶0061, ¶0070, -The processor is configured to automatically determine one or both of a presence or absence of a plurality of anatomical features associated with a detected target view in the acquired ultrasound image view. Specifically, the anatomical structure detection processor 150 performs image analysis techniques, such as artificial intelligence or deep neural network image analysis, to determine the features present and absent.)
identify a scanning view captured during the ultrasound examination by applying trained models to the ultrasound images; and
(¶Abstract, ¶0060, ¶0067-0068, ¶0070, ¶0072, Claim 1, Claim 15, Claims 7-8: -Mienkina discloses acquiring, by an ultrasound system, an ultrasound image view. Subsequently, the system automatically detects, by at least one processor 132, 140 of the ultrasound system, a target view from a set of target views. The target view corresponds with the ultrasound image view. ¶0036, ¶0038-0039, ¶0040, ¶0060-0061: -The detection relies on trained models, which are referred to as artificial intelligence image analysis techniques; the processors responsible for this are the view detection processor 140 and the anatomical structure detection processor 150. These specific techniques include the AI image analysis algorithms and machine learning processing functionalities. The deep neural network processing is described as identifying a target view with a high degree of probability, ¶0038.)
generate ultrasound images supplemented with a list of one or more of the anatomical features captured in the ultrasound imagery during the ultrasound examination and identification information of the scanning view. (¶0042, ¶0070, Claims 4-6, 8, 12-14, 17-19, - The processor, specifically the user interface element processor 160, is configured to generate at least one user interface element (220-270) indicating the presence or absence of each of the anatomical features. One type of user interface element is a list (230). This list 230 contains the anatomical and/or image features present 232 and/or missing 234 in the acquired ultrasound image view 210 corresponding to a target view. The display system 134 presents the ultrasound image view 210 along with the user interface elements, including the list. The system of Mienkina operates in real-time during a scanning session, detecting the presence or absence of these features as the echo signals are received, ¶0035, ¶0039, ¶0061. As previously discussed, Mienkina’s system includes the “view detection processor 140,” which automatically detects a target view (i.e., the scanning view) corresponding to the acquired image. The system then supplements the image with identification information regarding this view, such as a 3D anatomical model having a representation 262 of a location of the acquired ultrasound image view, ¶0038, ¶0042, ¶0053, ¶0060, ¶0065.)
Claim 19: Mienkina discloses all the elements above in claim 18, Mienkina discloses:
wherein in response to being executed by the processor, the instructions cause the controller further to: transmit, by the ultrasound probe, ultrasound imaging beams; receive and detect, by the ultrasound probe, feedback from the ultrasound imaging beams to capture the ultrasound imagery during the ultrasound examination.
-Mienkina teaches a beamformer 110 that controls a transmitter drive to a group of transmit transducer elements 106 within the ultrasound probe, ¶0028-0029. The transmitted signals are back-scattered from the structures in the object of interest to produce echoes. These echoes (i.e., feedback) are received by the receive transducer elements 108 in the ultrasound probe, ¶0029-0030.
Claim 20: Mienkina discloses all the elements above in claim 18, Mienkina discloses: wherein in response to being executed by the processor, the instructions cause the ultrasound system (¶Abstract, ¶0036, ¶0039, ¶0058, ¶0061, ¶0072, Claim 1, Claims 8-9, Claim 15: - Mienkina discloses acquiring, by the ultrasound system, an ultrasound image view. The system then automatically detects a target view corresponding to the acquired image. After the target view is identified, the system automatically determines one or both of a presence or absence of a plurality of anatomical features associated with the target view in the ultrasound image view. This determination is performed by the anatomical structure detection processor 150. The anatomical features are components of organs or specific structures for protocol adherence during an examination. In the head transcerebellar plane view of a second trimester obstetric fetal examination, the features include anatomical features, such as a cerebellum, cavum septum pellucidum, cisterna magna, midline falx, and brain symmetry, and image features, such as a particular magnification of the acquired ultrasound image view.)
supplement the ultrasound images with the anatomical organs as the one or more anatomical features captured in the ultrasound imagery during the ultrasound examination. (¶0042, ¶0044, ¶0062, ¶0069, ¶0071, Claims 4-5, 12-14, 18-19: -Mienkina discloses that the supplementation is provided as a list 230 that indicates the presence or absence of the anatomical and/or image features captured or expected to be captured in the view. This list provides feedback regarding whether the image is protocol adherent by identifying the features that are present and/or missing during the ultrasound examination.)
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 2, 3, 13 are rejected under 35 U.S.C. 103 as being unpatentable over Mienkina (US 2022/0071595 A1), as applied to claim 1 & 12 respectively, in further view of Lundberg et al (US 2019/0269384 A1).
Claim 2: Mienkina discloses all the elements above in claim 1, Mienkina discloses, further comprising: the ultrasound probe, wherein in response to being executed by the processor, the instructions cause the ultrasound system further to: receive information from an external processor; wherein the ultrasound images are supplemented with the information received from the external processor,
¶0038, ¶0040 – In determining the presence and absence of anatomical and/or image features, additionally and/or alternatively, a remote processor is communicatively coupled to the ultrasound system 100. This received information (whether local or remote) is then used by the user interface element processor 160 to generate and present the user interface elements identifying the anatomical structures present and/or absent from the acquired ultrasound image view, ¶0041, ¶0062. These user interface elements, which supplement the ultrasound images, include the list 230, pictogram 240, and 3D anatomical model 260.
Mienkina fails to explicitly disclose: an external monitor
However, Lundberg, in the context of annotating ultrasound examinations, discloses: an external monitor (¶0016, ¶0018, ¶0024-0025, ¶003-0033 – pictographs presented to the operator and associated with the ultrasound image are retrieved from a memory of a remote source over a wired or wireless communication. The processor of the ultrasound imaging system can request transmission of pictographs stored on a remote computing device. The pictographs are blended into or stored in association with the ultrasound image in a patient record.)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the external processor of Mienkina to be configured as an external monitor as taught by Lundberg. This modification yields predictable results, such as providing additional computing power for the imaging system, as suggested by Lundberg, ¶0016.
Claim 3: Modified Mienkina discloses all the elements above in claim 2, Mienkina discloses, wherein in response to being executed by the processor, the instructions cause the ultrasound system further to: display the ultrasound images supplemented with the information received from the processor and the list of the one or more anatomical features captured in the ultrasound imagery during the ultrasound examination. (¶0042, ¶0070, Claims 4-6, 8, 12-14, 17-19, - The processor specifically the user interface elements processor 160 is configured to generate at least one user interface element (220-270) indicating the presence or absence of each of the anatomical features. One type of user interface element is a list (230). This list 230 contains the anatomical and/or image features present 232 and/or missing 234 in the acquired ultrasound image view 210 corresponding to a target view. The display system 134 presents the ultrasound image view 210 along with the user interface elements, including the list.)
Mienkina fails to explicitly disclose: display the ultrasound images supplemented with the information received from the external monitor.
However, Lundberg, as relied upon above, discloses: display the ultrasound images supplemented with the information received from the external monitor. (-The ultrasound image is stored with metadata that indicates which pictograph is to be displayed with the image and where the pictograph is to appear as an overlay on the image, ¶0016, ¶0025)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the display of modified Mienkina to include displaying the ultrasound images supplemented with the information received from the external monitor as taught by Lundberg. This modification yields predictable results, such as providing additional computing power for the imaging system, as suggested by Lundberg, ¶0016.
Claim 13: Mienkina discloses all the elements above in claim 12, Mienkina discloses, further comprising: transmitting, by the ultrasound probe, ultrasound imaging beams; receiving and detecting, by the ultrasound probe, feedback from the ultrasound imaging beams to capture the ultrasound imagery during the ultrasound examination.
-Mienkina teaches a beamformer 110 that controls a transmitter drive to a group of transmit transducer elements 106 within the ultrasound probe, ¶0028-0029. The transmitted signals are back-scattered from the structures in the object of interest to produce echoes. These echoes (i.e., feedback) are received by the receive transducer elements 108 in the ultrasound probe, ¶0029-0030.
receiving information from an external monitor; and displaying the ultrasound images supplemented with the information received from the processor and the list of the one or more anatomical features captured in the ultrasound imagery during the ultrasound examination, (¶0042, ¶0070, Claims 4-6, 8, 12-14, 17-19, - The processor specifically the user interface elements processor 160 is configured to generate at least one user interface element (220-270) indicating the presence or absence of each of the anatomical features. One type of user interface element is a list (230). This list 230 contains the anatomical and/or image features present 232 and/or missing 234 in the acquired ultrasound image view 210 corresponding to a target view. The display system 134 presents the ultrasound image view 210 along with the user interface elements, including the list.)
Mienkina fails to explicitly disclose: an external monitor & displaying the ultrasound images supplemented with the information received from the external monitor
However, Lundberg, in the context of annotating ultrasound examinations, discloses: an external monitor (¶0016, ¶0018, ¶0024-0025, ¶003-0033 – pictographs presented to the operator and associated with the ultrasound image are retrieved from a memory of a remote source over a wired or wireless communication. The processor of the ultrasound imaging system can request transmission of pictographs stored on a remote computing device. The pictographs are blended into or stored in association with the ultrasound image in a patient record.) & displaying the ultrasound images supplemented with the information received from the external monitor (-The ultrasound image is stored with metadata that indicates which pictograph is to be displayed with the image and where the pictograph is to appear as an overlay on the image, ¶0016, ¶0025)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the external processor of modified Mienkina to be configured as an external monitor as taught by Lundberg. This modification yields predictable results, such as providing additional computing power for the imaging system, as suggested by Lundberg, ¶0016.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the display of modified Mienkina to include displaying the ultrasound images supplemented with the information received from the external monitor as taught by Lundberg. This modification yields predictable results, such as providing additional computing power for the imaging system, as suggested by Lundberg, ¶0016.
Claims 6-7, 9 are rejected under 35 U.S.C. 103 as being unpatentable over Mienkina (US 2022/0071595 A1), as applied to claim 1, in further view of Tanabe (US 2020/0265930 A1).
Claim 6: Mienkina discloses all the elements above in claim 1, Mienkina fails to disclose: wherein in response to being executed by the processor, the instructions cause the ultrasound system further to: connect to an external record system during the ultrasound examination to retrieve data, and supplement the ultrasound images with the data retrieved from the external record system.
However, Tanabe, in the context of medical examination apparatuses and medical examination systems, discloses: connect to an external record system during the ultrasound examination to retrieve data, and supplement the ultrasound images with the data retrieved from the external record system. (¶0040, ‘The number of electronic medical record terminals 10A, 10B, and 10C and ultrasound diagnostic apparatuses 20X, 20Y, and 20Z is not particularly limited. The medical examination system 100 is installed in a hospital. Each apparatus configuring the medical examination system 100 conforms to the digital image and communications in medicine (DICOM) standard, and communication between apparatuses is performed according to the DICOM.’; ¶0043, ‘The ultrasound diagnostic apparatus 20 generates supplementary information on the generated ultrasound image data based on the order information. The ultrasound diagnostic apparatus 20 generates an image file in accordance with the DICOM standard by adding the supplementary information to the ultrasound image data, and transmits the generated image file to PACS 30. DICOM is an abbreviation for Digital Imaging and Communication in Medicine.’) – This supplementary information is then added to the ultrasound image data to generate an image file configured to include DICOM image data (i.e., conforming to the standard used for medical communication). Therefore, the data received from the external record system is explicitly used to supplement the ultrasound images.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the processor of Mienkina to be further configured to connect to an external record system during the ultrasound examination to retrieve data, and supplement the ultrasound images with the data retrieved from the external record system as taught by Tanabe. This modification yields predictable results, such as improving security, for example, by preventing leakage of personal information, as suggested by Tanabe, ¶0091.
Claim 7: Mienkina discloses all the elements above in claim 1, Mienkina fails to disclose: wherein in response to being executed by the processor the instructions cause the ultrasound system
However, Tanabe, in the context of medical examination apparatuses and medical examination systems, discloses: connect to an external record system during the ultrasound examination to retrieve data, and upload data to the external record system during the ultrasound examination. (¶0040, ‘The number of electronic medical record terminals 10A, 10B, and 10C and ultrasound diagnostic apparatuses 20X, 20Y, and 20Z is not particularly limited. The medical examination system 100 is installed in a hospital. Each apparatus configuring the medical examination system 100 conforms to the digital image and communications in medicine (DICOM) standard, and communication between apparatuses is performed according to the DICOM.’; ¶0043, ‘The ultrasound diagnostic apparatus 20 generates supplementary information on the generated ultrasound image data based on the order information. The ultrasound diagnostic apparatus 20 generates an image file in accordance with the DICOM standard by adding the supplementary information to the ultrasound image data, and transmits the generated image file to PACS 30. DICOM is an abbreviation for Digital Imaging and Communication in Medicine.’) – This supplementary information is then added to the ultrasound image data to generate an image file configured to include DICOM image data (i.e., conforming to the standard used for medical communication). Therefore, the data received from the external record system is explicitly used to supplement the ultrasound images.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the processor of Mienkina to be further configured to connect to an external record system during the ultrasound examination to retrieve data, and upload data to the external record system during the ultrasound examination, as taught by Tanabe. The motivation for doing so is that the modification yields predictable results, such as improved security, for example, preventing leakage of personal information, as suggested by Tanabe, ¶0091.
Claim 9: Mienkina discloses all the elements above in claim 1. Mienkina fails to disclose: wherein in response to being executed by the processor, the instructions cause the ultrasound system to merge the ultrasound images in an output file with subject-specific information.
However, Tanabe, in the context of medical examination apparatuses and medical examination systems, discloses: merge the ultrasound images in an output file with subject-specific information. (¶0040, ‘The number of electronic medical record terminals 10A, 10B, and 10C and ultrasound diagnostic apparatuses 20X, 20Y, and 20Z is not particularly limited. The medical examination system 100 is installed in a hospital. Each apparatus configuring the medical examination system 100 conforms to the digital image and communications in medicine (DICOM) standard, and communication between apparatuses is performed according to the DICOM.’; ¶0043, ‘The ultrasound diagnostic apparatus 20 generates supplementary information on the generated ultrasound image data based on the order information. The ultrasound diagnostic apparatus 20 generates an image file in accordance with the DICOM standard by adding the supplementary information to the ultrasound image data, and transmits the generated image file to PACS 30. DICOM is an abbreviation for Digital Imaging and Communication in Medicine.’, see also ¶0040-0044) – The apparatus adds the supplementary information to the ultrasound image data to generate an image file configured to include DICOM image data conforming to the DICOM standard.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the processor of Mienkina to be further configured to merge the ultrasound images in an output file with subject-specific information, as taught by Tanabe. The motivation for doing so is that the modification yields predictable results, such as improved security, for example, preventing leakage of personal information, as suggested by Tanabe, ¶0091.
Claims 10 & 17 are rejected under 35 U.S.C. 103 as being unpatentable over Mienkina (US 2022/0071595 A1), as applied to claims 1 and 12 above, and further in view of Choi et al. (US 2018/0161010 A1).
Claim 10: Mienkina discloses all the elements above in claim 1. Mienkina fails to disclose: wherein in response to being executed by the processor, the instructions cause the ultrasound system to:
display information indicating completion of one or more scans for one or more scanning views captured during the ultrasound examination.
However, Choi in the context of ultrasound image processing discloses: display information indicating completion of one or more scans for one or more scanning view captured during the ultrasound examination. (¶0036, ‘Throughout the specification, “imaging status information” refers to imaging status information regarding target regions included in an imaging list, which includes pieces of information such as a target region of which imaging is completed, a target region of which imaging has been mistakenly omitted, a quality value for an acquired ultrasound image, progression of imaging being performed on the entire imaging list, etc.’; ¶0136, ‘When the target region E is currently being imaged, since imaging of the two (2) target regions A and B among a total of five (5) target regions is completed, the ultrasound image processing apparatus 300 may display the third imaging status information as the progress bar 820a or pie chart 820b indicating that about 40% of the ultrasound imaging has been completed.’)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the display of Mienkina such that it is configured to display information indicating completion of one or more scans for one or more scanning views captured during the ultrasound examination, as taught by Choi. The motivation for doing so is that the modification yields predictable results, such as preventing omission of imaging due to human errors that may occur during an ultrasound scan for acquiring a larger number of images of target regions or standard views, thereby improving the accuracy of the ultrasound scan, as suggested by Choi, ¶0129.
Claim 17: Mienkina discloses all the elements above in claim 12. Mienkina fails to disclose: further comprising: displaying information indicating completion of one or more scans for one or more scanning views captured during the ultrasound examination.
However, Choi in the context of ultrasound image processing discloses: displaying information indicating completion of one or more scans for one or more scanning views captured during the ultrasound examination. (¶0036, ‘Throughout the specification, “imaging status information” refers to imaging status information regarding target regions included in an imaging list, which includes pieces of information such as a target region of which imaging is completed, a target region of which imaging has been mistakenly omitted, a quality value for an acquired ultrasound image, progression of imaging being performed on the entire imaging list, etc.’; ¶0136, ‘When the target region E is currently being imaged, since imaging of the two (2) target regions A and B among a total of five (5) target regions is completed, the ultrasound image processing apparatus 300 may display the third imaging status information as the progress bar 820a or pie chart 820b indicating that about 40% of the ultrasound imaging has been completed.’)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the display of Mienkina such that it is configured for displaying information indicating completion of one or more scans for one or more scanning views captured during the ultrasound examination, as taught by Choi. The motivation for doing so is that the modification yields predictable results, such as preventing omission of imaging due to human errors that may occur during an ultrasound scan for acquiring a larger number of images of target regions or standard views, thereby improving the accuracy of the ultrasound scan, as suggested by Choi, ¶0129.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Nicholas Robinson whose telephone number is (571)272-9019. The examiner can normally be reached M-F 9:00AM-5:00PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Pascal Bui-Pho can be reached at (571) 272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/N.A.R./Examiner, Art Unit 3798
/PASCAL M BUI PHO/Supervisory Patent Examiner, Art Unit 3798