Prosecution Insights
Last updated: April 19, 2026
Application No. 18/170,074

ULTRASONIC IMAGE DISPLAY SYSTEM AND STORAGE MEDIA

Status: Non-Final Office Action (§101, §103, §112)
Filed: Feb 16, 2023
Examiner: ASGHAR, AMINAH
Art Unit: 3797
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: GE Precision Healthcare LLC
OA Round: 3 (Non-Final)

Predictions:
Grant probability: 63% (moderate)
Expected OA rounds: 3-4
Expected time to grant: 3y 11m
With interview: 99%

Examiner Intelligence

Career allow rate: 63% (102 granted / 163 resolved; -7.4% vs TC avg)
Interview lift: +46.8% (strong), comparing resolved cases with vs. without an interview
Typical timeline: 3y 11m average prosecution
Currently pending: 46 applications
Career history: 209 total applications across all art units

Statute-Specific Performance

§101: 6.5% (-33.5% vs TC avg)
§103: 45.8% (+5.8% vs TC avg)
§102: 12.9% (-27.1% vs TC avg)
§112: 32.9% (-7.1% vs TC avg)

Tech Center averages are estimates. Based on career data from 163 resolved cases.

Office Action

§101 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 05/27/2025 has been entered.

Response to Amendment

This action is in response to the remarks filed on 05/27/2025. The amendments filed on 05/27/2025 have been entered. Applicant has canceled claims 2, 4, 6, and 22. Accordingly, claims 1, 3, 5, 7-9, 11-17, and 19-21 are pending. Claims 1, 3, 5, 7, 13, 19, and 20 are presently amended.

The previous objections to claims 2 and 19 have been withdrawn in light of applicant's cancellation of claim 2 and the amendment to claim 19. The previous rejections of claims 1, 6, and all dependents thereof under 35 U.S.C. 112(b) have been withdrawn in light of applicant's amendments to claim 1 and the cancellation of claim 6. The previous rejection of claim 3 under 35 U.S.C. 112(b) has not been addressed by applicant's amendments or remarks and is therefore maintained. Additionally, the amendments introduce new issues of indefiniteness that are detailed below.

Response to Arguments

Applicant's arguments with respect to the prior art rejections of the claims have been fully considered and are persuasive, in part. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of newly discovered prior art Kato et al. Examiner notes that although Lundberg is still relied on as the primary reference in the rejection, it is not relied on for any teaching or matter specifically challenged in the argument, i.e., the claimed first and second thresholds.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1, 3, 5, 7-9, 11-17, and 19-21 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 1 recites the limitation “based on the probability being between a first threshold and a second threshold, display a message on the display recommending that the user change the selected preset based on a determination that a preset change should be recommended to the user,” which renders the claim indefinite. It is unclear from this limitation whether the displaying step is performed based on the probability being between the first and second thresholds, based on the determination that a preset change should be recommended to the user, or both. For the present purposes of examination, the limitation has been interpreted as the display step being based on the probability being between a first and a second threshold. Further clarification is required.

Claim 3 recites the limitation “the plurality of categories” in line 3. There is insufficient antecedent basis for this limitation in the claim. Further clarification is required. Further, claim 3 recites the limitation “a plurality of examination locations” in line 4, which renders the claim indefinite because it is unclear whether this is the same or a different plurality of examination locations previously recited in claim 1. For the present purposes of examination, the limitations have been interpreted as being the same. Further clarification is required.

Claims dependent upon a claim rejected under 35 U.S.C. 112(b) are also rejected under the same statute because they each inherit the indefiniteness of the claim(s) they respectively depend upon.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1, 3, 5, 7-9, 11-17, and 19-21 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more.

Analysis step 1 of Subject Matter Eligibility Test

The claims are directed to a machine (i.e., an ultrasonic image display system) in claims 1, 3, 5, 7-9, 11-17, 19, and 21, and a manufacture (i.e., a non-transitory computer readable storage medium) in claim 20.

Analysis step 2A, Prong I

The claims recite abstract ideas, in particular mental processes, e.g., concepts performed in the human mind (including an observation, evaluation, judgment, or opinion), and mathematical concepts.
In particular, claim 1 recites the limitations “select a preset used in examination from among a plurality of presets for a plurality of examination locations based on a signal input through the user interface,” “determine whether to recommend that a user change the selected preset to a preset of the obtained examination location based on the selected preset examination location and the obtained examination location,” and “based on the probability being above the second threshold, change the selected preset to the preset for the obtained examination location.” Claim 20 recites analogous limitations. Claim 13 recites “weigh the probability,” which is a mathematical concept, e.g., a mathematical calculation.

The dependent claims further limit the abstract ideas. Claim 5 recites the limitation “wherein based on the probability being lower than the first threshold, the one or more processors do not recommend the preset change,” which involves further observation, evaluation, and judgment. Claim 7 recites “chang[ing] the selected preset” based on the observation of “transmission and reception of ultrasonic waves being interrupted.” Claim 8 merely further defines the prescribed operation. Claim 9 recites “based on the preset being changed [...] stop obtaining the examination location,” which involves observation, evaluation, and judgment. Claim 11 recites performing a subsequent evaluation based on the observation of the “preset change being recommended to the user.” Claim 12 recites changing the “selected preset” based on the observation of “the prescribed operation by the user.” Claim 21 recites an evaluation and judgment.

Analysis step 2A, Prong II

The judicial exception is not integrated into a practical application because the additional elements merely add insignificant extra-solution activity to the judicial exception and are mere instructions to implement an abstract idea on a computer. See MPEP 2106.05(f) and (g).
Claim 1 recites “an ultrasonic probe,” “a user interface,” and “obtain an examination location of a subject using a trained model by inputting to the trained model an input image created based on an ultrasonic image obtained by scanning the subject via the ultrasonic probe, the obtaining the examination location comprises obtaining probability indicating the confidence level of the obtained examination location and determining the examination location of the subject based on the probability,” which is insignificant extra-solution activity, in particular mere data gathering.

Claim 1 also recites “a display” and “based on the probability being between a first threshold and a second threshold, display a message on the display recommending that the user change the selected preset based on a determination that a preset change should be recommended to the user,” which is insignificant extra-solution activity, in particular insignificant application.

Finally, claim 1 recites “a memory storing instructions” and “one or more processors for communicating with the ultrasonic probe, the user interface, and the display, the one or more processors are configured to execute the instructions to,” which results in mere instructions to implement an abstract idea on a computer. Claim 20 recites analogous limitations.

Dependent claims 14-15 also recite insignificant extra-solution activity, in particular mere data gathering. Dependent claims 16 and 19 also recite insignificant extra-solution activity, in particular insignificant application. Claim 17 recites insignificant extra-solution activity, in particular selecting a particular data source or type of data to be manipulated.
Analysis step 2B

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The memory and processors are additional elements that merely result in instructions to implement an abstract idea on a computer, which is well-understood, routine, and conventional activity previously known to the industry. The remaining additional elements merely add insignificant extra-solution activity, in particular mere data gathering and insignificant application, to the judicial exception; these are well-understood, routine, and conventional activities previously known to the industry.

Claims 1, 3, 5, 7-9, 11-17, and 19-21 are therefore directed to a judicial exception without significantly more. The claims are not patent eligible.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary.
Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 3, 5, 7-9, 11-17, and 20-21 are rejected under 35 U.S.C. 103 as being unpatentable over Lundberg et al. (US 2019/0269384, September 5, 2019) in view of Kato et al. (US 2021/0158513, May 27, 2021).

Regarding claims 1 and 20, as best understood in light of the 35 U.S.C. 112(b) rejection stated above, Lundberg discloses an ultrasonic image display system and corresponding non-transitory computer readable storage medium (“ultrasound imaging system” Abstract; also see Fig. 1, reproduced below, and corresponding description; also see [0049]-[0050]), comprising:

[Lundberg Fig. 1 was reproduced here in the original Office Action.]

an ultrasonic probe (imaging transducer 20 in Fig. 1; also see “imaging probe” in [0017]);

a user interface (“an operator can operate the imaging system with a graphical user interface on the display or by using more conventional input devices on the imaging system itself such as a keyboard, trackball, touch pad, buttons, voice commands etc.” [0016]);

a display (“The ultrasound imaging system 10 generally has one or more video displays” [0016]);

a memory storing instructions (“The processor is configured to execute a series of instructions that are stored in a processor-readable memory” [0017]; also see [0050], claims 1, 15, and 20); and

one or more processors for communicating with the ultrasonic probe, the user interface, and the display (“An ultrasound imaging system includes a processor” Abstract; also see [0017], [0019], [0020], [0030]), the one or more processors are configured to execute the instructions to:

select a preset used in examination from among a plurality of presets set for a plurality of examination locations based on a signal input through the user interface (“pre-set imaging parameters (such as but not limited to: gain, frame rate, line density, acoustic power, sector size, available worksheets etc.) set on the ultrasound imaging system or the type of examination selected. For example, if the operator has selected imaging parameters that are optimized for liver imaging and the tissue identified by the neural network is heart tissue, then the ultrasound system can prompt the user to either confirm that the correct set of ultrasound imaging parameters are selected or that the correct type of examination set at 256.” [0048]; examiner notes that the pre-set imaging parameters are interpreted as the claimed presets and the liver, heart tissue, etc. are interpreted as the claimed plurality of examination locations; also see [0019]-[0021]);

obtain an examination location of a subject using a trained model by inputting to the trained model an input image created based on an ultrasonic image obtained by scanning the subject via the ultrasonic probe (“Beginning at 250, the processor in the ultrasound system supplies a saved image to a trained neural network to identify the type of tissue that is shown in the image. At 252, the processor receives the type of tissue identified back from the trained neural network.” [0048]), the obtaining the examination location comprises obtaining probability indicating the confidence level of the obtained examination location and determining the examination location of the subject based on the probability (“The neural network 40 is trained to classify an input ultrasound image (or portion of the image) based on image features that are present (or not present) in the image. For example, images can be classified as one of several different tissue types or image features (heart tissue, liver tissue, breast tissue, abdominal tissue, bladder tissue, kidney tissue, heart valves, vessels etc.). In one embodiment, the neural network 40 returns a list of calculated values representing how likely the image corresponds to a number of particular classifications (tissue type, image feature, lack of a particular feature in an image or other criteria that the neural network is trained to recognize). Such calculated values may be a probability that an image is a particular tissue type (e.g. cardiac tissue=0.72) or may be a probability that the image contains a particular anatomical feature (carotid artery=0.87) or lacks an image feature (no kidney tissue=0.87) etc. Upon receipt of the determined probabilities from the neural network, the processor is programmed to recall one or more pictographs that are stored in a pictograph library 50 or other memory of the ultrasound imaging system and that correspond to the classified image.” [0017]; also see “confidence value” in [0024]);

determine whether to recommend that a user change the selected preset to a preset of the obtained examination location based on the selected preset examination location and the obtained examination location (“At 254, the processor determines if the type of tissue identified by the trained neural network corresponds to pre-set imaging parameters (such as but not limited to: gain, frame rate, line density, acoustic power, sector size, available worksheets etc) set on the ultrasound imaging system or the type of examination selected. For example, if the operator has selected imaging parameters that are optimized for liver imaging and the tissue identified by the neural network is heart tissue, then the ultrasound system can prompt the user to either confirm that the correct set of ultrasound imaging parameters are selected or that the correct type of examination set at 256. If the imaging parameters or the type of examination on the ultrasound imaging system correspond to the detected tissue type, then the process ends at 258 with no recommendation to confirm/modify the imaging parameters or the examination type.” [0048]);

based on the probability, provide a message recommending that the user change the selected preset based on a determination that a preset change should be recommended to the user (“The processor is therefore programmed to prompt the operator to change the imaging parameters (or have the imaging system select the imaging parameters) to better suit the type of tissue detected” [0020]; also see [0048]); and

based on the probability, change the selected preset to the preset for the obtained examination location (“The processor is therefore programmed to prompt the operator to change the imaging parameters (or have the imaging system select the imaging parameters) to better suit the type of tissue detected” [0020]; also see [0048]).

Although, in the embodiment relied upon above, Lundberg suggests displaying the message on the display (e.g., see [0016], [0019], [0023]-[0025]), Lundberg fails to explicitly disclose displaying the message on the display. However, in a separate embodiment, Lundberg teaches displaying a message on the display (“At 200, the processor determines if the image data corresponds to a required view. If so, the ultrasound system preferably alerts the operator that a desired view has been obtained. Examples of such an alert can include an audible or visual queue that a corresponding image has been obtained. If the desired views are represented by pictographs, the alert can place a check by the pictograph corresponding to the desired view or show the pictograph in a different color. Other alerts or indications can include printed messages or audible cues provided to the user on a display screen or from a speaker etc. At 204, the processor determines if all required views are obtained, the system provides an alert or indication to the user that all required images are obtained. If not, the user can be alerted to the fact that one or more required views are not yet obtained and processing returns to 194 and more images are obtained. If all required views are obtained, then the examination can stop at 206.” [0043]).

Therefore, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Lundberg’s first embodiment with displaying the message on the display, as taught by a separate embodiment of Lundberg, in order to visually prompt the user.

Lundberg fails to disclose “based on the probability being between a first threshold and a second threshold” and “based on the probability being above the second threshold.” However, Kato teaches, in the same field of endeavor, performing various processing based on a probability being between a first threshold and a second threshold and based on the probability being above the second threshold (“a pixel having the probability that is equal to or greater than a first threshold, and the area generator is configured to form the mammary gland area by filling in a missing area with respect to the candidate pixel map” [0011]; also see “the mammary gland pixel detector is configured to detect a pixel, as the mammary gland pixel, having the probability that is equal to or greater than a second threshold, and the probability of the mammary gland area is higher when the second threshold is used than when the first threshold is used.” [0025]).
Before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Lundberg with processing “based on the probability being between a first threshold and a second threshold” and “based on the probability being above the second threshold,” as taught by Kato, in order to provide intervention only when needed to limit unnecessary additional imaging (e.g., see [0002] of Kato).

Regarding claim 3, as best understood in light of the 35 U.S.C. 112(b) rejection stated above, Lundberg further discloses wherein obtaining an examination location of the subject includes determining into which category among the plurality of categories, including a plurality of examination locations, a location of the input image is to be classified, and obtaining a probability of the location shown by the input image being classified into each category (“Depending on how the networks are designed, such an output can comprise the probability of the input image corresponding to one or more tissue types. In some embodiments, the neural networks return a number of values (e.g. 0.05, 0.05, 0.05, 0.8, 0.05 etc.) that represent the probabilities that the image is of a certain tissue type.” [0031]; also see [0017], [0019]-[0021], [0026], [0032]).

Regarding claim 5, Lundberg modified by Kato discloses the limitations of claim 1 as stated above; in particular, Kato was relied on to teach the first threshold. Lundberg further discloses wherein based on the probability being lower than the first threshold, the one or more processors do not recommend the preset change (“Such calculated values may be a probability that an image is a particular tissue type (e.g. cardiac tissue=0.72)” [0017]; also see “If the imaging parameters or the type of examination on the ultrasound imaging system correspond to the detected tissue type, then the process ends at 258 with no recommendation to confirm/modify the imaging parameters or the examination type.” [0048]; also see [0020]).
Regarding claim 7, Lundberg further discloses wherein based on transmission and reception of ultrasonic waves being interrupted by a prescribed operation of the user (“images are sent to the neural network when the user hits a “freeze” button or similar feature” [0034]), the one or more processors are further configured to execute an instruction to change the selected preset to the preset for the obtained examination location (“Some ultrasound imaging systems allow the operator to program the system for a particular examination type or to adjust imaging parameters for particular types of tissue.” [0020]).

Regarding claim 8, Lundberg further discloses wherein the prescribed operation is one of a freeze operation, a screen storing operation, and a depth change operation (“images are sent to the neural network when the user hits a “freeze” button or similar feature” [0034]).

Regarding claim 9, Lundberg further discloses wherein based on the preset being changed by the user, the one or more processors are further configured to execute the instructions to stop obtaining the examination location (step 256 in Fig. 8, reproduced below, proceeds to END).

[Lundberg Fig. 8 was reproduced here in the original Office Action.]

Regarding claim 11, Lundberg further discloses wherein based on the preset change being recommended to the user, the one or more processors are configured to execute the instructions to determine whether the preset was changed by the user (“At 254, the processor determines if the type of tissue identified by the trained neural network corresponds to pre-set imaging parameters (such as but not limited to: gain, frame rate, line density, acoustic power, sector size, available worksheets etc) set on the ultrasound imaging system or the type of examination selected. For example, if the operator has selected imaging parameters that are optimized for liver imaging” [0048]).
Regarding claim 12, Lundberg further discloses wherein in response to the prescribed operation by the user, the one or more processors are configured to execute the instructions to change the selected preset to the preset for the obtained examination location (“If the type of examination set on the imaging system or the imaging parameters do not agree with the classification of the image by the neural network, the processor may ask the operator to check the settings on the ultrasound imaging system and may suggest that the operator change the settings to those that are optimal for the type of tissue identified by the neural network.” [0020]).

Regarding claim 13, Lundberg modified by Kato discloses the limitations of claim 1 as stated above. Lundberg fails to disclose wherein the one or more processors are configured to execute the instructions to weigh the probability. However, Kato further teaches, in the same field of endeavor, wherein the one or more processors are configured to execute the instructions to weigh the probability (“When a lot of learning is done with the mammography image adjusted based on the first viewpoint and the weight coefficients are fixed, information processing devices may not provide an accurate output (the probability map)” [0111]; also see [0076], [0126]). Therefore, before the effective filing date of the claimed invention, it would have been obvious for one of ordinary skill in the art to modify the invention of Lundberg with wherein the one or more processors are configured to execute the instructions to weigh the probability, as taught by Kato, since it was known in the art that weighting can be used to classify data using trained models.
Regarding claim 14, Lundberg further discloses wherein the one or more processors are further configured to execute the instructions to obtain results including the plurality of categories and the probability of the location indicated by the input image being classified into each category, which are displayed (“For example, if the neural network returns a probability value such as “cardiac tissue=0.98,” the processor executes program steps to retrieve one or more of the pictographs associated with cardiac tissue from a folder in the pictograph library, from a pictograph database or from a remote computing device. In some embodiments, the use of more than one neural network allows more specific pictographs to be retrieved. For example, if a first neural network is trained to identify the type of tissue and returns a value such as “cardiac tissue=0.98,” then the classified image can be provided to a second cardiac-specific neural network that is configured to return probability values of how likely the image is from a particular view. If the second neural network returns a value such as “apical view=0.92,” then one or more of the pictographs corresponding to apical views of the heart can be retrieved and presented on the video display screen for the operator to select in response to a command to retrieve the corresponding pictograph(s).” [0019]; also see [0024], Figs. 4, 6, and corresponding descriptions).

Regarding claim 15, Lundberg further discloses wherein the obtained results include an indicator corresponding to a value of the probability (“the pictographs are color coded in a manner that indicates how likely each pictograph corresponds to the classified image (green=most likely, red=least likely etc.) In yet another embodiment, the pictographs are shown with a visual cue (number, score, word description such as most likely, least likely etc.) that indicates how likely each pictograph corresponds to the classified image. Presentations of pictographs may involve showing all those pictographs corresponding to an identified tissue type.” [0024]).

Regarding claim 16, Lundberg further discloses wherein the indicator is displayed in a color based on the value of the probability or a length is displayed based on the value of the probability (“the pictographs are color coded in a manner that indicates how likely each pictograph corresponds to the classified image (green=most likely, red=least likely etc.)” [0024]).

Regarding claim 17, Lundberg further discloses wherein a first examination location included in the plurality of categories includes a plurality of sub-examination locations, the obtained results include the plurality of sub-examination locations, and the location shown by the input image is classified into each sub-examination location (“For example, cardiac tissue images can be classified in a neural network that is trained to identify the views with which such images are obtained such as parasternal, apical, subcostal or suprasternal notch views. The processor receives the likelihood that an image represents each of these different views and can retrieve the pictograph representing the most likely determined tissue type and view.” [0021]).

Regarding claim 21, Lundberg further discloses wherein the one or more processors are further configured to execute the instructions to evaluate the obtained examination location against the selected preset examination location to determine a match or mismatch (“For example, if the operator has selected imaging parameters that are optimized for liver imaging and the tissue identified by the neural network is heart tissue [mismatch], then the ultrasound system can prompt the user to either confirm that the correct set of ultrasound imaging parameters are selected or that the correct type of examination set at 256. If the imaging parameters or the type of examination on the ultrasound imaging system correspond to the detected tissue type [match]” [0048]).

Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Lundberg and Kato as applied to claims 1, 3, and 14 above, and further in view of Casciaro (US 2024/0252140, corresponding PCT filed May 4, 2021).

Regarding claim 19, Lundberg modified by Kato discloses the limitations of claim 14 as stated above. Lundberg further discloses wherein an operating mode of the ultrasonic image display system includes a color mode for displaying a color image, and based on the color mode being activated (“In the case of a color ultrasound image, each pixel generally stores pixel intensity values for red, green and blue color components.” [0021]), the one or more processors are further configured to execute the instructions to display the color image and the obtained examination location results on the display (“The ultrasound imaging system 10 generally has one or more video displays on which the ultrasound images are displayed.” [0016]; also see “A pictograph is a simplified ultrasound image or other symbol representing a tissue type or an image feature seen from a particular location and/or viewing angle. In some embodiments, a pictograph can also be or include a text annotation such as “Liver”, “Heart”, “Mitral Valve” or the like.” [0003]; also see Figs. 4, 6, and corresponding descriptions).

Although Lundberg discloses a color image, Lundberg does not explicitly disclose a color image in which blood flow is shown in color. However, Casciaro teaches, in the same field of endeavor, a color image in which blood flow is shown in color (“according to a technique commonly known as Color Doppler” [0092]; also see [0096]).
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the invention of Lundberg with a color image in which blood flow is shown in color, as taught by Casciaro, in order to evaluate blood flow using color Doppler, which is a technique known in the art.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMINAH ASGHAR, whose telephone number is (571) 272-0527. The examiner can normally be reached M-W, F 9am-5pm EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Christopher Koharski, can be reached at (571) 272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/A.A./Examiner, Art Unit 3797
/SHAHDEEP MOHAMMED/Primary Examiner, Art Unit 3797
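The claim 16 and claim 17 mappings above turn on Lundberg's scheme of ranking per-view classification likelihoods and color coding the corresponding pictographs (green = most likely, red = least likely). As a minimal sketch of that prior-art behavior — not code from the application or from Lundberg; the function names, thresholds, and example scores are invented purely for illustration:

```python
def color_for(probability: float) -> str:
    """Map a likelihood to a display color (thresholds are illustrative)."""
    if probability >= 0.5:
        return "green"
    if probability >= 0.2:
        return "yellow"
    return "red"

def rank_pictographs(likelihoods: dict[str, float]) -> list[tuple[str, float, str]]:
    """Return (view, likelihood, color) tuples, most likely view first.

    Mirrors Lundberg [0021]: the processor receives a likelihood per view
    and retrieves the pictograph for the most likely one; per [0024], each
    pictograph is color coded by how likely it matches the image.
    """
    ranked = sorted(likelihoods.items(), key=lambda kv: kv[1], reverse=True)
    return [(view, p, color_for(p)) for view, p in ranked]

# Hypothetical classifier output for one cardiac image.
scores = {"parasternal": 0.62, "apical": 0.25,
          "subcostal": 0.09, "suprasternal notch": 0.04}
ranked = rank_pictographs(scores)
best_view = ranked[0][0]  # pictograph shown for the most likely view
```

Under this sketch, the claim 21 match/mismatch check reduces to comparing `best_view` (or the detected tissue type) against the operator's preset examination location.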

Prosecution Timeline

Feb 16, 2023
Application Filed
Sep 26, 2024
Non-Final Rejection — §101, §103, §112
Jan 28, 2025
Response Filed
Feb 13, 2025
Final Rejection — §101, §103, §112
May 27, 2025
Request for Continued Examination
Jun 03, 2025
Response after Non-Final Action
Dec 22, 2025
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582483
Tracking Apparatus For Tracking A Patient Limb
2y 5m to grant · Granted Mar 24, 2026
Patent 12533109
Systems and Methods of Determining Dimensions of Structures in Medical Images
2y 5m to grant · Granted Jan 27, 2026
Patent 12357162
VIDEOSTROBOSCOPY OF VOCAL CORDS WITH A HYPERSPECTRAL, FLUORESCENCE, AND LASER MAPPING IMAGING SYSTEM
2y 5m to grant · Granted Jul 15, 2025
Patent 12350526
DOPPLER GUIDED ULTRASOUND THERAPY
2y 5m to grant · Granted Jul 08, 2025
Patent 12329534
BIOLOGICAL EXAMINATION DEVICE AND BIOLOGICAL INFORMATION ANALYSIS METHOD
2y 5m to grant · Granted Jun 17, 2025
Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 63%
With Interview (+46.8% lift): 99%
Median Time to Grant: 3y 11m
PTA Risk: High
Based on 163 resolved cases by this examiner. Grant probability derived from career allow rate.
