Prosecution Insights
Last updated: April 19, 2026
Application No. 18/663,484

ULTRASOUND CREDENTIALING SYSTEM

Final Rejection: §101, §103, §112
Filed: May 14, 2024
Examiner: ASGHAR, AMINAH
Art Unit: 3797
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Fujifilm Corporation
OA Round: 2 (Final)

Grant Probability: 63% (Moderate)
OA Rounds: 3-4
To Grant: 3y 11m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 63% of resolved cases (102 granted / 163 resolved; -7.4% vs TC avg)
Interview Lift: +46.8% (strong), measured across resolved cases with an interview
Typical Timeline: 3y 11m average prosecution; 46 applications currently pending
Career History: 209 total applications across all art units
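The headline examiner metrics above are simple ratios over the raw counts shown. A minimal sketch of how they could be reproduced (the TC-average allow rate is an assumption back-computed from the displayed -7.4% delta, not sourced data):

```python
# Reproduce the examiner summary metrics from the raw counts shown above.
GRANTED = 102        # granted applications among resolved cases
RESOLVED = 163       # total resolved cases
TC_AVG_ALLOW = 70.0  # assumed TC-average allow rate (%), back-computed from the delta

allow_rate = 100 * GRANTED / RESOLVED    # career allow rate, percent
delta_vs_tc = allow_rate - TC_AVG_ALLOW  # signed gap vs. Tech Center average

print(f"Career allow rate: {allow_rate:.0f}%")   # prints 63%
print(f"Delta vs TC avg: {delta_vs_tc:+.1f}%")   # prints -7.4%
```

The rounding matters: the displayed "63%" is 102/163 = 62.6% rounded, so the delta is computed from the unrounded rate.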

Statute-Specific Performance

§101: 6.5% (-33.5% vs TC avg)
§103: 45.8% (+5.8% vs TC avg)
§102: 12.9% (-27.1% vs TC avg)
§112: 32.9% (-7.1% vs TC avg)
Tech Center averages are estimates. Based on career data from 163 resolved cases.
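Each per-statute delta is just the examiner's rate minus the Tech Center estimate. A sketch of that computation, assuming (back-computed from the displayed deltas, not sourced) a common TC-average value of 40.0%:

```python
# Per-statute examiner rates (%) from the table above, compared against an
# assumed common Tech Center average of 40.0% (back-computed from the deltas).
EXAMINER_RATE = {"101": 6.5, "103": 45.8, "102": 12.9, "112": 32.9}
TC_AVG = 40.0  # assumption: every displayed delta is consistent with this value

for statute, rate in EXAMINER_RATE.items():
    delta = rate - TC_AVG
    print(f"§{statute}: {rate:.1f}% ({delta:+.1f}% vs TC avg)")
```

Running this reproduces the four displayed deltas (-33.5, +5.8, -27.1, -7.1), which is why a single 40.0% baseline is a plausible reading of the chart.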

Office Action

§101 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This action is in response to the remarks filed on 06/24/2025. The amendments filed on 06/24/2025 have been entered. Accordingly, claims 1-20 remain pending. Claims 1, 4, 6-9, 11, and 19 are presently amended.

The previous claim interpretation under 35 U.S.C. 112(f) has been withdrawn in light of applicant’s amendments to claim 1 deleting the limitation “candidate credentialing application”. The previous rejections of claims 9 and 19 under 35 U.S.C. 112(b) have been withdrawn in light of applicant's amendments to claims 9 and 19. However, the amendments introduce new issues of indefiniteness, which are detailed below.

Response to Arguments

Applicant's arguments filed 06/24/2025 regarding the 35 U.S.C. 101 rejection of the claims have been fully considered but they are not persuasive.

First, applicant argues (see excerpt from page 9 of the remarks below) that the judicial exception is integrated into a practical application.

[media_image1.png: excerpt from page 9 of applicant's remarks]

Examiner respectfully disagrees. The alleged practical application described in the above argument is achieved by merely applying the abstract idea on a computer. This does not integrate the judicial exception into a practical application. See MPEP 2106.04(d).

Applicant also argues that the claims impose meaningful limits on the use of the abstract idea beyond “mere instructions”. In response, examiner notes that, as previously stated in the rejection, the abstract ideas recited in the claims are mental processes, e.g., concepts performed in the human mind (including an observation, evaluation, judgment, opinion). Applicant appears to be referring to portions of the step 2B analysis instead of the abstract idea itself.
Applicant’s arguments, see remarks, filed 06/24/2025, with respect to the prior art rejection of independent claims 1 and 11 under 35 U.S.C. 102 have been fully considered and are persuasive, in part. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of newly discovered prior art Shoudy et al. Examiner notes that although the Olivier reference is still used in the new ground of rejection, Olivier is not relied on to teach the newly amended limitations regarding the grip map.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-10 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 1 recites the limitation "the first subset of the ultrasound images" in line 10 and again in line 11. There is insufficient antecedent basis for this limitation in the claim. Further clarification is required.

Claim 9 recites the limitation "the candidate credentialing application" in line 2. There is insufficient antecedent basis for this limitation in the claim. For the present purposes of examination, the limitation has been interpreted as the processing logic of claim 1. Further clarification is required.
Claims dependent upon a claim rejected under 35 U.S.C. 112(b) are also rejected under the same statute because they each inherit the indefiniteness of the claim(s) they respectively depend upon.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more.

Analysis step 1 of Subject Matter Eligibility Test

The claims are directed to a machine (i.e., a credentialing system) in claims 1-10 and a process (i.e., a computer implemented method) in claims 11-20.

Analysis step 2A, Prong I

The claims recite abstract ideas, in particular mental processes, e.g., concepts performed in the human mind (including an observation, evaluation, judgment, opinion).

Claim 1 recites “generate image quality scores for the first subset of the ultrasound images based on the grip map”, which encompasses mental observations or evaluations, e.g., a computer programmer’s mental identification of image quality scores of the first subset of ultrasound images. It is noted that a neural network generates the image quality scores as described in at least paragraph [0060] of the pre-grant publication of the instant application. Claim 11 recites an analogous limitation of “generating [...] image quality scores for the first subset of the ultrasound images based on the grip map”.
Claim 4 recites “wherein the processing logic is implemented to determine, based on at least one of the image quality scores being below a threshold score, guidance for the sonography candidate to improve the at least one of the image quality scores”, which encompasses mental observations or evaluations, e.g., comparing an image quality score with a threshold score and a computer programmer’s mental identification of particular guidance. It is noted that a neural network is used to generate the guidance as described in at least paragraph [0063] of the pre-grant publication of the instant application. Claim 14 recites an analogous limitation.

Claim 6 recites “wherein the processing logic is implemented to communicate the second subset of the ultrasound images to the reviewer computing device based on at least one of a percentage of the image quality scores being above a threshold score”, which encompasses mental observations or evaluations. Claim 16 recites an analogous limitation.

Claim 7 recites “wherein the processing logic is configured to implement a neural network to generate the image quality scores”, which encompasses mental observations or evaluations. Claim 17 recites an analogous limitation.

Claim 9 recites “generate an ultrasound examination score based on the image quality scores”, which encompasses mental observations or evaluations. Claim 19 recites analogous limitations.

Analysis step 2A, Prong II

The judicial exception is not integrated into a practical application because the additional elements merely add insignificant extra-solution activity and are mere instructions to implement an abstract idea on a computer. See MPEP 2106.05(f) and (g).
Claim 1 recites “an ultrasound system configured to generate ultrasound images; an ultrasound probe coupled to the ultrasound system to generate ultrasound data, the ultrasound probe comprising a grip portion to grip by the sonography candidate”, which is insignificant extra-solution activity, in particular mere data gathering.

Claim 1 also recites “receive the ultrasound data and a grip map indicating locations on the grip portion gripped by the sonography candidate”, which is insignificant extra-solution activity, in particular mere data gathering.

Claim 1 also recites “generate the first subset of the ultrasound images based on the ultrasound data”, which is insignificant extra-solution activity, in particular selecting a particular data source or type of data to be manipulated.

Claim 1 also recites “communicate, based on the image quality scores, a second subset of the ultrasound images to a reviewer computing device”, which is insignificant extra-solution activity, in particular insignificant application.

Claim 1 also recites “a processing logic implemented at least partially in hardware of the credentialing system”, which results in mere instructions to implement an abstract idea on a computer. Claim 11 recites analogous limitations.

Claim 2 recites “the reviewer computing device”, which results in mere instructions to implement an abstract idea on a computer. Claims 3 and 13 merely further define the first and second subset. Claim 4 recites “a display device implemented to display a visual representation of the guidance”, which is insignificant extra-solution activity, in particular insignificant application. Claim 14 recites an analogous limitation. Claims 5 and 15 merely further define the visual representation. Claims 6 and 16 further define the first subset. Claims 8 and 18 merely further define the input of the neural network. Claim 12 recites “communicating [...]
the image quality scores to the reviewer computing device”, which is insignificant extra-solution activity, in particular insignificant application. Claims 10 and 20 merely further define the second subset.

Analysis step 2B

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the processor is an additional element that merely results in instructions to implement an abstract idea on a computer, which is well-understood, routine, and conventional activity previously known to the industry. The remaining additional elements merely add insignificant extra-solution activity (in particular mere data gathering, selecting a particular data source or type of data to be manipulated, and insignificant application) to the judicial exception; these are well-understood, routine, and conventional activities previously known to the industry. Claims 1-20 are therefore directed to a judicial exception without significantly more. The claims are not patent eligible.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary.
Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-2, 7, 9, 11-12, 17, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Olivier et al. (WO 2022/096471, May 12, 2022, applicant submitted prior art via the IDS; citations refer to corresponding US 2023/0401719, hereinafter “Olivier”) in view of Shoudy et al. (US 2020/0214667, July 9, 2020).

Regarding claim 1, Olivier discloses a credentialing system for issuing a sonographer credential to a sonography candidate (“A method of analyzing an ultrasound image involves assessing the quality of the image in terms of which features of interest have been identified in the image and assessing a segmentation quality relating to the quality of a segmentation of the image. The two quality assessments are combined to derive and output an overall quality assessment for biometry measurements obtained from the image.” Abstract; also see Fig. 5 and corresponding description), the credentialing system comprising:

an ultrasound system configured to generate ultrasound images (“ultrasound imaging system” [0046]);

an ultrasound probe coupled to the ultrasound system to generate ultrasound data, the ultrasound probe comprising a grip portion to grip by the sonography candidate (“an ultrasound probe adapted to acquire ultrasound images of an imaging region” [0047]; examiner notes that the handle of the ultrasound probe 50 in Fig. 5 is interpreted as the grip portion);

a processing logic implemented at least partially in hardware of the credentialing system (“the system makes use of processor to perform the data processing.
The processor can be implemented in numerous ways, with software and/or hardware, to perform the various functions required.” [0140]) and configured to:

receive the ultrasound data (“In step 30, receiving an ultrasound image of an imaging region” [0132]; also see Fig. 4 and corresponding description);

generate the first subset of the ultrasound images based on the ultrasound data (the leftmost images, top and bottom, in Fig. 3, reproduced below, and corresponding description; e.g., see “The left images show the original image” [0120]);

generate image quality scores (“An image content quality assessment for the image is determined in step 20. This may be considered to be an “image quality score”.” [0072]; also see segmentation quality score 24 in Fig. 1 and corresponding description) for the first subset of the ultrasound images (the leftmost images, top and bottom, in Fig. 3, reproduced below, and corresponding description; e.g., see “The left images show the original image” [0120]); and

communicate, based on the image quality scores, a second subset of the ultrasound images to a reviewer computing device (“The bottom set of images show a high number of highlighted pixels on the confidence map hence a low confidence score, and the DNN is not confident in the quality of a biometric prediction. The user is warned and asked to perform a manual review.” [0122]; the bottom right image in Fig. 3 and corresponding description are the second subset of ultrasound images).

[media_image2.png: Fig. 3 of Olivier, reproduced]

Olivier fails to disclose receiving a grip map indicating locations on the grip portion gripped by the sonography candidate; and the image quality scores for the first subset of the ultrasound images being based on the grip map.
However, Shoudy teaches, in the same field of endeavor, receiving a grip map indicating locations on the grip portion gripped by a sonography candidate (“Additionally, or alternatively, the controller 24 may compare the current position and/or orientation of the ultrasound probe 15, based at least in part on the signals received from the probe position sensor 36, to the position and/or orientation of the haptic feedback device 14 based at least in part on the signals received from the position sensor 37. Based on the comparison, the controller 24 may determine the grip style of the operator's hand on the ultrasound probe 15. For example, the position and/or orientation data of the ultrasound probe 15 may include a plurality of probe reference points (e.g., x-coordinate, y-coordinate, z-coordinate) relative to the imaging space 35, and the position and/or orientation data of the haptic feedback device 14 may include a plurality haptic feedback device reference points (e.g., x-coordinate, y-coordinate, z-coordinate) relative to the imaging space 35. The controller 24 may compare the relative positions of the probe reference points and the haptic feedback device reference points in the imaging space 35 to lookup tables stored in the memory 32 to determine the grip style of the operator's hand on the ultrasound probe 15.” [0039]).

Shoudy further teaches that the ultrasound image quality is based on the grip map (“After determining the grip style used by the operator with the ultrasound probe 15, the controller 24 may determine whether the position and/or orientation of the ultrasound probe 15 corresponds to the appropriate position and/or orientation to image the desired anatomy in the desired scan plane at step 166.” [0081]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Olivier with receiving a grip map indicating locations on the grip portion gripped by the sonography candidate, and the image quality scores for the first subset of the ultrasound images being based on the grip map, as taught by Shoudy in order to accommodate different users with different grip styles while still providing accurate imaging of desired anatomy ([0075], [0081] of Shoudy).

Regarding claim 11, Olivier discloses a computer implemented method to issue a sonographer credential to a sonography candidate (“A method of analyzing an ultrasound image involves assessing the quality of the image in terms of which features of interest have been identified in the image and assessing a segmentation quality relating to the quality of a segmentation of the image. The two quality assessments are combined to derive and output an overall quality assessment for biometry measurements obtained from the image.” Abstract; also see Fig. 5 and corresponding description, [0140]-[0142]), the method comprising:

generating, by a processor (“FIG. 1 shows the various functional units involved in the method. The various functions are all performed by a processor of an ultrasound imaging system” [0059]; also see [0140]), a first subset of ultrasound images (“An ultrasound image 10 is received of an imaging region” [0061], also see Fig. 1 and corresponding description; also see the leftmost images, top and bottom, in Fig. 3, reproduced below, and corresponding description; e.g., see “The left images show the original image” [0120]) based on ultrasound data generated by an ultrasound probe comprising a grip portion to grip by the sonography candidate (“an ultrasound probe adapted to acquire ultrasound images of an imaging region” [0047]; examiner notes that the handle of the ultrasound probe 50 in Fig.
5 is interpreted as the grip portion);

generating, by the processor, image quality scores (“An image content quality assessment for the image is determined in step 20. This may be considered to be an “image quality score”.” [0072]; also see segmentation quality score 24 in Fig. 1 and corresponding description) for the first subset of the ultrasound images (the leftmost images, top and bottom, in Fig. 3, reproduced below, and corresponding description; e.g., see “The left images show the original image” [0120]); and

communicating, by the processor, a second subset of the ultrasound images to a reviewer computing device based on the image quality scores (“The bottom set of images show a high number of highlighted pixels on the confidence map hence a low confidence score, and the DNN is not confident in the quality of a biometric prediction. The user is warned and asked to perform a manual review.” [0122]; the bottom right image in Fig. 3 and corresponding description are the second subset of ultrasound images).

[media_image2.png: Fig. 3 of Olivier, reproduced]

Olivier fails to disclose receiving, by the processor, a grip map indicating locations on the grip portion gripped by the sonography candidate; and the image quality scores for the first subset of the ultrasound images being based on the grip map.

However, Shoudy teaches, in the same field of endeavor, receiving, by the processor, a grip map indicating locations on the grip portion gripped by a sonography candidate (“Additionally, or alternatively, the controller 24 may compare the current position and/or orientation of the ultrasound probe 15, based at least in part on the signals received from the probe position sensor 36, to the position and/or orientation of the haptic feedback device 14 based at least in part on the signals received from the position sensor 37. Based on the comparison, the controller 24 may determine the grip style of the operator's hand on the ultrasound probe 15.
For example, the position and/or orientation data of the ultrasound probe 15 may include a plurality of probe reference points (e.g., x-coordinate, y-coordinate, z-coordinate) relative to the imaging space 35, and the position and/or orientation data of the haptic feedback device 14 may include a plurality haptic feedback device reference points (e.g., x-coordinate, y-coordinate, z-coordinate) relative to the imaging space 35. The controller 24 may compare the relative positions of the probe reference points and the haptic feedback device reference points in the imaging space 35 to lookup tables stored in the memory 32 to determine the grip style of the operator's hand on the ultrasound probe 15.” [0039]).

Shoudy further teaches that the ultrasound image quality is based on the grip map (“After determining the grip style used by the operator with the ultrasound probe 15, the controller 24 may determine whether the position and/or orientation of the ultrasound probe 15 corresponds to the appropriate position and/or orientation to image the desired anatomy in the desired scan plane at step 166.” [0081]).

Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Olivier with receiving, by the processor, a grip map indicating locations on the grip portion gripped by the sonography candidate, and the image quality scores for the first subset of the ultrasound images being based on the grip map, as taught by Shoudy in order to accommodate different users with different grip styles while still providing accurate imaging of desired anatomy ([0075], [0081] of Shoudy).

Regarding claim 2, Olivier further discloses the reviewer computing device (display 54 and processor 52 in Fig. 5 and corresponding description; also see [0140]-[0142]).
Regarding claims 7 and 17, Olivier further discloses wherein the processing logic is configured to implement a neural network to generate the image quality scores (“performing the segmentation, generating the segmentation quality assessment and generating the image content quality assessment may be performed using deep learning, such as using a deep neural network (DNN), for example one or several stochastic deep neural networks.” [0043]; also see [0118]).

Regarding claims 9 and 19, as best understood in light of the 35 U.S.C. 112(b) rejection stated above, Olivier further discloses wherein the candidate credentialing application is configured to: generate an ultrasound examination score based on the image quality scores (“The overall quality assessment indicates whether or not (or to what extent) the biometry measurements may be relied upon. Thus, a user is provided with a simple indication of the quality of the biometry measurements, taking into account whether or not the image meets standardized image acquisition requirements.” [0016]).

Regarding claim 12, Olivier further discloses communicating, by the processor, the image quality scores to the reviewer computing device (“The bottom set of images show a high number of highlighted pixels on the confidence map hence a low confidence score, and the DNN is not confident in the quality of a biometric prediction. The user is warned and asked to perform a manual review. The user may also be given a numeric or graphical confidence score.” [0122]).

Claims 3, 10, 13, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Olivier in view of Shoudy as applied to claims 1 and 11 above, and further in view of Ayinde et al. (US 2024/0050069, filed August 10, 2022, hereinafter “Ayinde”).
Regarding claims 3 and 13, Olivier modified by Shoudy discloses the limitations of claims 1 and 11, respectively, as stated above but fails to disclose wherein the first subset and the second subset are disjoint with respect to one another.

However, Ayinde teaches, in the same field of endeavor, wherein the first subset and the second subset are disjoint with respect to one another (“the ultrasound imaging system 100 initially attempts to automatically record an ultrasound clip of higher quality images [first subset], but if the quality scores of the acquired ultrasound images fail to reach the higher, first quality threshold, for a set of contiguous image frames of at least a first predetermined size, the ultrasound imaging system 100 provides an smart capture option to record an ultrasound clip comprised of an alternate set [second subset] of contiguous image frames of at least a second predetermined size (e.g., as a “best available quality” clip) where the ultrasound images in the alternate set meet a lesser, second quality threshold. The second predetermined size of the alternate set of contiguous image frames may be smaller than, or in some cases equal to or larger than, the first predetermined size of the (initial) set of contiguous image frames.” [0045]).

Therefore, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Olivier with wherein the first subset and the second subset are disjoint with respect to one another as taught by Ayinde in order to provide images with the best available image quality ([0045] of Ayinde).
Regarding claims 10 and 20, Olivier modified by Shoudy discloses the limitations of claims 1 and 11, respectively, as stated above but fails to disclose wherein the second subset includes one or more of at least one ultrasound image of a second anatomy, at least one ultrasound image generated with the ultrasound system in a second imaging mode, or at least one ultrasound image generated according to a second examination protocol.

However, Ayinde teaches, in the same field of endeavor, wherein the second subset includes one or more of at least one ultrasound image of a second anatomy, at least one ultrasound image generated with the ultrasound system in a second imaging mode, or at least one ultrasound image generated according to a second examination protocol (“an ultrasound clip comprised of an alternate set [second subset] of contiguous image frames of at least a second predetermined size (e.g., as a “best available quality” clip) where the ultrasound images in the alternate set meet a lesser, second quality threshold. The second predetermined size of the alternate set of contiguous image frames may be smaller than, or in some cases equal to or larger than, the first predetermined size of the (initial) set of contiguous image frames.” [0045]).

Therefore, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Olivier with wherein the second subset includes one or more of at least one ultrasound image of a second anatomy, at least one ultrasound image generated with the ultrasound system in a second imaging mode, or at least one ultrasound image generated according to a second examination protocol as taught by Ayinde in order to provide images with the best available image quality ([0045] of Ayinde).

Claims 4, 5, 8, 14, 15, and 18 are rejected under 35 U.S.C.
103 as being unpatentable over Olivier in view of Shoudy as applied to claims 1, 7, 11, and 17 above, and further in view of Patil et al. (US 2022/0087644, March 24, 2022, hereinafter “Patil”).

Regarding claims 4 and 14, Olivier modified by Shoudy discloses the limitations of claims 1 and 11, respectively, as stated above but fails to disclose wherein the processing logic is implemented to determine, based on at least one of the image quality scores being below a threshold score, guidance for the sonography candidate to improve the at least one of the image quality scores, and further comprising a display device implemented to display a visual representation of the guidance (although examiner notes that Olivier does disclose a display device; see display 54 in Fig. 5 and corresponding description).

However, Patil teaches, in the same field of endeavor, wherein the candidate credentialing application is implemented to determine, based on at least one of the image quality scores being below a threshold score, guidance for the sonography candidate to improve the at least one of the image quality scores (“if a scan plane of an acquired image is not of a desired quality (e.g., if a difference between a scan plane and a target scan plane exceeds a threshold difference), user guidance may automatically be displayed to the user.” [0088]), and further comprising a display device implemented to display a visual representation of the guidance (“The real-time visual guidance display 500 may include a guidance cue 502, which may indicate to a user a suggested adjustment to the ultrasound device in order to more accurately acquire an ultrasound image. For example, the guidance cue 502 may include a visual representation of an ultrasound device with a visual indication of a direction to move the ultrasound device in, an adjustment to be made to the pressure applied to the ultrasound device, etc.” [0114]).
Therefore, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Olivier with wherein the candidate credentialing application is implemented to determine, based on at least one of the image quality scores being below a threshold score, guidance for the sonography candidate to improve the at least one of the image quality scores, and further comprising a display device implemented to display a visual representation of the guidance as taught by Patil in order to aid an operator in achieving a target scan plane ([0048] of Patil).

Regarding claims 5 and 15, Olivier modified by Shoudy and Patil discloses the limitations of claims 4 and 14, respectively, as stated above; in particular, Patil was relied on to teach the visual representation. Patil further teaches, in the same field of endeavor, wherein the visual representation includes at least one of a training video, an icon of an ultrasound probe, an arrow to indicate a direction to move the ultrasound probe, and an icon of a grip orientation for holding the ultrasound probe (“the guidance cue 502 may include a visual representation of an ultrasound device with a visual indication of a direction to move the ultrasound device in, an adjustment to be made to the pressure applied to the ultrasound device, etc.” [0114]; also see [0029], [0049], [0069], Fig. 5A, reproduced below, and corresponding description, Figs. 6A-B and corresponding descriptions).
[Fig. 5A of Patil, reproduced]

Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the invention of Olivier with wherein the visual representation includes at least one of a training video, an icon of an ultrasound probe, an arrow to indicate a direction to move the ultrasound probe, and an icon of a grip orientation for holding the ultrasound probe, as taught by Patil, in order to aid an operator in achieving a target scan plane ([0048] of Patil).

Regarding claims 8 and 18, Olivier modified by Shoudy discloses the limitations of claims 7 and 17, respectively, as stated above, but fails to disclose wherein the neural network is implemented to generate the image quality scores based on at least one of the grip map indicating a hand grip orientation of the sonography candidate holding the grip portion of the ultrasound probe of the ultrasound system, an amount of movement of the ultrasound probe, an amount of movement of the sonography candidate, audio content, an amount of pressure applied by the ultrasound probe to a patient, and an amount of time taken by the sonography candidate to operate the ultrasound system to generate the ultrasound images.
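The claim limitation above lists several candidate inputs to the scoring step (grip orientation, probe movement, applied pressure, elapsed time). As a toy illustration, such normalized inputs could be combined into a single score; the weights, input names, and the use of a plain weighted sum in place of the claimed neural network are all assumptions invented for illustration.

```python
# Toy stand-in for the claimed scoring: a weighted sum over normalized (0-1)
# inputs such as grip orientation match, probe steadiness, pressure
# consistency, and acquisition speed. Weights and names are invented; the
# claims recite a neural network, which this sketch merely approximates.

WEIGHTS = {
    "grip_orientation_match": 0.35,
    "probe_steadiness": 0.30,
    "pressure_consistency": 0.20,
    "acquisition_speed": 0.15,
}

def image_quality_score(inputs):
    """Weighted average of normalized scoring inputs."""
    return sum(WEIGHTS[name] * value for name, value in inputs.items())

score = image_quality_score({
    "grip_orientation_match": 0.9,
    "probe_steadiness": 0.8,
    "pressure_consistency": 0.7,
    "acquisition_speed": 0.6,
})
```

A trained network would learn such a mapping from examples rather than use fixed weights, but the input/output shape of the computation is the same.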
However, Patil teaches, in the same field of endeavor, wherein the neural network is implemented to generate the image quality scores based on at least one of the grip map indicating a hand grip orientation of the sonography candidate holding the grip portion of the ultrasound probe of the ultrasound system, an amount of movement of the ultrasound probe, an amount of movement of the sonography candidate, audio content, an amount of pressure applied by the ultrasound probe to a patient, and an amount of time taken by the sonography candidate to operate the ultrasound system to generate the ultrasound images (“The proficiency score may be determined and/or updated based on a performance of the user in accordance with a plurality of proficiency parameters, such as a quality of an acquired image, an accuracy of a scan plane, a speed of image acquisition, a reproducibility and/or repeatability of image acquisition, etc.” [0027]; also see [0021], [0045], [0140]).

Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the invention of Olivier with wherein the neural network is implemented to generate the image quality scores based on at least one of the grip map indicating a hand grip orientation of the sonography candidate holding the grip portion of the ultrasound probe of the ultrasound system, an amount of movement of the ultrasound probe, an amount of movement of the sonography candidate, audio content, an amount of pressure applied by the ultrasound probe to a patient, and an amount of time taken by the sonography candidate to operate the ultrasound system to generate the ultrasound images, as taught by Patil, in order to provide guidance according to an operator’s proficiency level ([0005] of Patil).

Claims 6 and 16 are rejected under 35 U.S.C.
103 as being unpatentable over Olivier in view of Shoudy as applied to claims 1 and 11 above, and further in view of Jorgensen et al. (US 2024/0074729, corresponding PCT filed December 20, 2021, hereinafter “Jorgensen”).

Regarding claims 6 and 16, Olivier modified by Shoudy discloses the limitations of claims 1 and 11, respectively, as stated above but fails to disclose wherein the processing logic is implemented to communicate the second subset of the ultrasound images to the reviewer computing device based on at least one of a percentage of the image quality scores being above a threshold score, the first subset including one or more of at least one ultrasound image of a first anatomy, at least one ultrasound image generated with the ultrasound system in a first imaging mode, or at least one ultrasound image generated according to a first examination protocol.

However, Jorgensen teaches, in the same field of endeavor, wherein the processing logic is implemented to communicate the second subset of the ultrasound images to the reviewer computing device based on at least one of a percentage of the image quality scores being above a threshold score, the first subset including one or more of at least one ultrasound image of a first anatomy, at least one ultrasound image generated with the ultrasound system in a first imaging mode, or at least one ultrasound image generated according to a first examination protocol (“The score assigned may be indicative of a picture quality and/or whether an anatomical feature is correctly captured. The control unit 6 sorts 616 the plurality of first ultrasound images into a first group or a second group.
The sorting may be based on whether if the score assigned a first ultrasound image of the plurality of first ultrasound images is under a quality threshold the first ultrasound image is sorted into the second group, wherein if the score assigned the first ultrasound image is over the quality threshold the first ultrasound image is sorted into the first group. The one or more first ultrasound images sorted into the second group may simple be discarded as not useful for further purposes. The control unit 6 determines 617 one or more acquiring positions and one or more acquiring orientations corresponding to a position and/or an orientation of the ultrasound probe 31 and/or the positioning device 4 when the one or more first ultrasound image sorted in the first group was obtained. The control unit 6 generates 618 the second movement instruction comprising the one or more acquiring positions and the one or more acquiring orientations for the positioning device 4 for moving the ultrasound probe 41 to obtain one or more second ultrasound images of the body part.” [0102]).

Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the invention of Olivier with wherein the processing logic is implemented to communicate the second subset of the ultrasound images to the reviewer computing device based on at least one of a percentage of the image quality scores being above a threshold score, the first subset including one or more of at least one ultrasound image of a first anatomy, at least one ultrasound image generated with the ultrasound system in a first imaging mode, or at least one ultrasound image generated according to a first examination protocol, as taught by Jorgensen, in order to obtain more detailed images of an anatomy of interest ([0102] of Jorgensen).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action.
Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMINAH ASGHAR, whose telephone number is (571) 272-0527. The examiner can normally be reached M-W, F 9am-5pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christopher Koharski, can be reached at (571) 272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/A.A./
Examiner, Art Unit 3797

/CHRISTOPHER KOHARSKI/
Supervisory Patent Examiner, Art Unit 3797
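The reply-period rules in the Conclusion (a THREE-MONTH shortened statutory period, the two-month advisory-action rule, and the SIX-MONTH statutory maximum) reduce to date arithmetic. The sketch below is a simplified illustration, not legal advice; the function names are invented, the mailing date is taken from the prosecution timeline, and edge cases such as weekends and federal holidays are ignored.

```python
# Simplified sketch of the final-action reply periods described above.
# Illustration only, not legal advice; ignores weekend/holiday rules.
from datetime import date

def add_months(d, months):
    """Advance a date by whole calendar months, clamping to month end."""
    m = d.month - 1 + months
    year, month = d.year + m // 12, m % 12 + 1
    day = d.day
    while True:
        try:
            return date(year, month, day)
        except ValueError:
            day -= 1  # e.g., Jan 31 + 1 month -> Feb 28/29

def reply_deadlines(mailed, advisory_mailed=None, first_reply=None):
    """Return (shortened statutory period end, absolute statutory deadline)."""
    ssp = add_months(mailed, 3)        # THREE MONTHS from the mailing date
    statutory = add_months(mailed, 6)  # never later than SIX MONTHS
    # Two-month rule: if the first reply was filed within two months and the
    # advisory action issued after the three-month SSP, the SSP runs to the
    # advisory action's mailing date (capped at the statutory deadline).
    if (first_reply is not None and advisory_mailed is not None
            and first_reply <= add_months(mailed, 2)
            and advisory_mailed > ssp):
        ssp = min(advisory_mailed, statutory)
    return ssp, statutory

# Assuming the Feb 05, 2026 final rejection date from the timeline as the
# mailing date:
ssp, statutory = reply_deadlines(date(2026, 2, 5))
```

With no extension, the reply here would be due May 5, 2026, with the absolute statutory deadline on August 5, 2026; extension fees under 37 CFR 1.136(a) buy back the gap between those two dates.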

Prosecution Timeline

May 14, 2024: Application Filed
Mar 18, 2025: Non-Final Rejection — §101, §103, §112
Jun 24, 2025: Response Filed
Aug 26, 2025: Applicant Interview (Telephonic)
Aug 26, 2025: Examiner Interview Summary
Feb 05, 2026: Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582483: Tracking Apparatus For Tracking A Patient Limb (granted Mar 24, 2026; 2y 5m to grant)
Patent 12533109: Systems and Methods of Determining Dimensions of Structures in Medical Images (granted Jan 27, 2026; 2y 5m to grant)
Patent 12357162: Videostroboscopy of Vocal Cords with a Hyperspectral, Fluorescence, and Laser Mapping Imaging System (granted Jul 15, 2025; 2y 5m to grant)
Patent 12350526: Doppler Guided Ultrasound Therapy (granted Jul 08, 2025; 2y 5m to grant)
Patent 12329534: Biological Examination Device and Biological Information Analysis Method (granted Jun 17, 2025; 2y 5m to grant)
Based on the examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 63% (99% with interview; +46.8% lift)
Median Time to Grant: 3y 11m
PTA Risk: Moderate
Based on 163 resolved cases by this examiner. Grant probability derived from career allow rate.
