Prosecution Insights
Last updated: April 18, 2026
Application No. 18/319,408

IMAGE DISPLAY APPARATUS AND CONTROL METHOD OF IMAGE DISPLAY APPARATUS

Status: Non-Final OA (§103)
Filed: May 17, 2023
Examiner: VIRK, ADIL PARTAP S
Art Unit: 3798
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Fujifilm Corporation
OA Round: 5 (Non-Final)

Grant Probability: 48% (Moderate)
OA Rounds: 5-6
To Grant: 3y 2m
With Interview: 89%

Examiner Intelligence

Career Allow Rate: 48% (102 granted / 213 resolved; -22.1% vs TC avg)
Interview Lift: +41.3% (resolved cases with vs. without interview)
Avg Prosecution: 3y 2m typical timeline; 44 applications currently pending
Total Applications: 257 across all art units (career history)

Statute-Specific Performance

§101: 13.0% (-27.0% vs TC avg)
§102: 13.6% (-26.4% vs TC avg)
§103: 38.8% (-1.2% vs TC avg)
§112: 31.0% (-9.0% vs TC avg)

Tech Center averages are estimates. Based on career data from 213 resolved cases.
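The headline figures above are simple ratios over the examiner's resolved cases. A quick sketch reproduces them from the raw counts shown; the with/without-interview subgroup counts are not given, so the last step only back-solves from the displayed lift:

```python
# Reproduce the examiner metrics from the counts shown above.
granted, resolved = 102, 213

allow_rate = granted / resolved          # career allow rate
tc_avg = allow_rate + 0.221              # implied TC average (rate shown as -22.1% vs TC avg)

print(f"Career allow rate: {allow_rate:.1%}")   # ~47.9%, displayed as 48%
print(f"Implied TC average: {tc_avg:.1%}")      # ~70.0%

# Interview lift: allow-rate difference between resolved cases with and
# without an interview. Back-solve the without-interview rate from the
# displayed 89% with-interview figure and +41.3% lift.
without_interview = 0.89 - 0.413
print(f"Allow rate without interview: {without_interview:.1%}")  # ~47.7%
```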

Office Action

§103
DETAILED ACTION

This office action is in response to the communication received on 03/31/2026 concerning application no. 18/319,408 filed on 05/17/2023. Claims 1-6, 9-11, and 14-20 are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 03/31/2026 has been entered. Claims 1-6, 9-11, and 14-20 are pending.

Response to Arguments

Applicant's arguments with respect to claims 1 and 20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-6, 9-11, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Caluser et al. (PGPUB No. US 2015/0051489) in view of Shinohara (PGPUB No. US 2011/0224550) further in view of Lee et al. (US Patent No. 10,395,346).

Regarding claim 1, Caluser teaches an image display apparatus, comprising: a monitor (Fig. 1 shows a display); a memory (Paragraph 0019 teaches a memory); and a processor (Paragraph 0091 teaches that the software is processed via a processor for the utilization of the apparatus) configured to: analyze a plurality of ultrasound images in which a lesion part in an inside of a subject is imaged to acquire probe positional information of an ultrasound probe at a time of imaging (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged), and imaging feature information, the imaging feature information including lesion part position information regarding a position of the lesion part of the subject in each respective image of the plurality of images (Paragraph 0022 teaches that the image set is according to each pixel with respect to the position of a target pixel. The target pixel selection can be made at the time of capture, before the image save, or at a later time. Paragraph 0117 teaches that the image can be displayed in relation to the target region that is defined by the pixels. Paragraph 0120 teaches that each position of the pixel is performed in relation to the anatomical references. Paragraph 0096 teaches that the position coordinates of the lesion can be obtained. Paragraph 0115 teaches that the real size of the body part is known and the position tracking allows for the assessment of when the probe is outside of the range. Fig. 15 shows a range of images over the body position. Paragraph 0138 teaches that the spatial range can be used in the assessment of the lesions); extract an ultrasound image conforming to a display layout that is set by a user, the display layout including a plurality of screen divisions of the monitor and a display order, from the plurality of ultrasound images by referring to the probe positional information and the imaging feature information acquired (Paragraph 0023 teaches that the user is guided to display the target in a body diagram and adjust the probe position in real time. Paragraph 0078 teaches that the user can use a single monitor or multiple monitors for display. Paragraph 0083 teaches that the ultrasound images and the associated position information can be displayed at a later time if decided. Paragraph 0138 teaches that multiple images of the lesion can be captured and recorded with the position and orientation of the probe. The lesion can be determined via image interpretation and tolerate only images with the lesion. Paragraph 0140 teaches real time scanning and display with respect to a body diagram. Paragraph 0100 teaches the display of the ultrasound image and the tagged image that can be displayed and recorded in any combination or order. See Figs. 60, 62, and 69-70); and display the extracted ultrasound image on the monitor according to the display layout (Paragraph 0138 teaches that the images with the lesion that are within the tolerance are contained and can be grouped and displayed. Fig. 62 shows the display of the probe pose information and the ultrasound image), wherein the processor is further configured to: add a body mark plotted with a probe mark indicating a position and an orientation of the ultrasound probe on each respective image of the plurality of images, the body mark schematically representing a body part of the subject (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging. Paragraph 0078 teaches the coregistration of the body location and the tracked probe position and orientation with respect to the real time ultrasound images. Anatomical references can also be used for the coregistration. See at least Figs. 6, 30, 60, 62, and 69); and acquire the probe positional information on the basis of the position and the orientation of the ultrasound probe indicated by the probe mark (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging. Paragraph 0078 teaches the coregistration of the body location and the tracked probe position and orientation with respect to the real time ultrasound images. Anatomical references can also be used for the coregistration. See at least Figs. 6, 30, 60, 62, and 69); the extracted ultrasound image is stored in the memory along with tag information associated with the extracted ultrasound image (Paragraph 0083 teaches that the ultrasound images with their associated positional information can be stored. Paragraph 0091 teaches that the storage is done with respect to the memory and can be used for later retrieval).

While Caluser teaches compatibility with DICOM systems, Caluser is silent regarding an image display apparatus, extract an ultrasound image conforming to a display layout that is set by a user, the display layout including a display magnification, and the processor is further configured to convert the extracted ultrasound image into image data in a Digital Imaging and Communications In Medicine (DICOM) format including the tag information.

In an analogous imaging field of endeavor, regarding ultrasound image processing, Shinohara teaches an image display apparatus, the processor is further configured to convert the extracted ultrasound image into image data in a Digital Imaging and Communications In Medicine (DICOM) format including the tag information (Paragraph 0047 teaches that the 3D image data is converted from the image data to the DICOM format and stored. The information includes the image position information and the inclination information of the slice image data. Paragraphs 0056-0071 teach that the image's 3D arrangement is assigned according to tagging that is considerate of pixel spacing. Paragraph 0053 teaches the pixel spacing and the intervoxel distancing. This information is stored according to the DICOM format. Abstract teaches an ultrasound system. See Fig. 6).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Caluser with Shinohara's teaching of DICOM formatting of image data and the tag information. This modified apparatus would allow the user to track and correlate position information with high accuracy (Paragraph 0016 of Shinohara). Furthermore, the modification can assist in easy comparison via a standard image data structure, treatment planning, or progress observation (Paragraph 0018 of Shinohara).
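Neither the claim nor the cited art fixes a byte layout, but the "DICOM format including the tag information" limitation maps directly onto standard DICOM data elements. A minimal stdlib sketch of explicit-VR little-endian element encoding follows; the tag choices and values are illustrative only (a real implementation would use a library such as pydicom):

```python
import struct

def encode_element(group: int, elem: int, vr: bytes, value: bytes) -> bytes:
    """Encode one DICOM data element (explicit VR little endian, short form):
    2-byte group, 2-byte element, 2-char VR, 2-byte length, then the value."""
    if len(value) % 2:                 # DICOM values are padded to even length
        value += b"\x00" if vr == b"OB" else b" "
    return struct.pack("<HH2sH", group, elem, vr, len(value)) + value

# Tag information of the kind the claims associate with each extracted image
# (tag numbers are standard DICOM; using them for probe/lesion metadata here
# is an assumption, not taken from the application).
elements = (
    encode_element(0x0010, 0x0020, b"LO", b"PAT-001")            # Patient ID
    + encode_element(0x0020, 0x0032, b"DS", b"12.5\\40.0\\0.0")  # Image Position (Patient)
)
print(elements.hex())
```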
However, Shinohara is silent regarding an image display apparatus, extract an ultrasound image conforming to a display layout that is set by a user, the display layout including a display magnification.

In an analogous imaging field of endeavor, regarding ultrasound image processing, Lee teaches an image display apparatus, extract an ultrasound image conforming to a display layout that is set by a user, the display layout including a plurality of screen divisions of the monitor, a display order, and a display magnification (Col. 12, lines 5-45 teaches the display of a predetermined number of frame images among the frame images during a time period. The images can be organized and placed in a particular order. Col. 2, lines 37-41 teaches that a particular image can be magnified. See Fig. 7).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Caluser and Shinohara with Lee's teaching of a display magnification. This modified apparatus would allow the user to improve ultrasound diagnosis accuracy (Col. 2, lines 19-26 of Lee). Furthermore, the modification provides precise diagnosis of observed lesions (Col. 9, lines 27-35 of Lee).

Regarding claim 2, modified Caluser teaches the image display apparatus in claim 1, as discussed above. Caluser further teaches an image display apparatus, further comprising: the ultrasound probe (Fig. 1 shows an ultrasound probe), wherein the processor is configured to: acquire the plurality of ultrasound images in which the lesion part of the subject is imaged by using the ultrasound probe (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging); and perform the analyzation of the plurality of ultrasound images acquired (Paragraph 0096 teaches that the position coordinates of the lesion can be obtained. Paragraph 0115 teaches that the real size of the body part is known and the position tracking allows for the assessment of when the probe is outside of the range. Fig. 15 shows a range of images over the body position. Paragraph 0138 teaches that the spatial range can be used in the assessment of the lesions).

Regarding claim 3, modified Caluser teaches the image display apparatus in claim 1, as discussed above. Caluser further teaches an image display apparatus, wherein the processor is configured to set the display layout on the basis of an instruction from the user (Paragraph 0023 teaches that the user is guided to display the target in a body diagram and adjust the probe position in real time. Paragraph 0078 teaches that the user can use a single monitor or multiple monitors for display. Paragraph 0083 teaches that the ultrasound images and the associated position information can be displayed at a later time if decided. Paragraph 0147 teaches the review and feedback by the user and the transfer of data for later assessment. Paragraph 0091 teaches that the data can be manipulated and stored by the user and the user can accurately review, evaluate, and compare examination results).

Regarding claim 4, modified Caluser teaches the image display apparatus in claim 2, as discussed above. Caluser further teaches an image display apparatus, wherein the processor is configured to set the display layout on the basis of an instruction from the user (Paragraph 0023 teaches that the user is guided to display the target in a body diagram and adjust the probe position in real time. Paragraph 0078 teaches that the user can use a single monitor or multiple monitors for display. Paragraph 0083 teaches that the ultrasound images and the associated position information can be displayed at a later time if decided. Paragraph 0147 teaches the review and feedback by the user and the transfer of data for later assessment. Paragraph 0091 teaches that the data can be manipulated and stored by the user and the user can accurately review, evaluate, and compare examination results).

Regarding claim 5, modified Caluser teaches the image display apparatus in claim 1, as discussed above. Caluser further teaches an image display apparatus, wherein the processor is configured to: add the probe positional information and the imaging feature information acquired to each of the plurality of ultrasound images (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging. Paragraph 0078 teaches the coregistration of the body location and the tracked probe position and orientation with respect to the real time ultrasound images); and perform the extraction of the ultrasound image by referring to the probe positional information and the imaging feature information added to the plurality of ultrasound images (Paragraph 0023 teaches that the user is guided to display the target in a body diagram and adjust the probe position in real time. Paragraph 0078 teaches that the user can use a single monitor or multiple monitors for display. Paragraph 0083 teaches that the ultrasound images and the associated position information can be displayed at a later time if decided. Paragraph 0138 teaches that the multiple images of the lesion can be captured and recorded with the position and orientation of the probe. The lesion can be determined via image interpretation and tolerate only images with the lesion).

Regarding claim 6, modified Caluser teaches the image display apparatus in claim 2, as discussed above. Caluser further teaches an image display apparatus, wherein the processor is configured to: add the probe positional information and the imaging feature information acquired to each of the plurality of ultrasound images (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging. Paragraph 0078 teaches the coregistration of the body location and the tracked probe position and orientation with respect to the real time ultrasound images); and perform the extraction of the ultrasound image by referring to the probe positional information and the imaging feature information added to the plurality of ultrasound images (Paragraph 0023 teaches that the user is guided to display the target in a body diagram and adjust the probe position in real time. Paragraph 0078 teaches that the user can use a single monitor or multiple monitors for display. Paragraph 0083 teaches that the ultrasound images and the associated position information can be displayed at a later time if decided. Paragraph 0138 teaches that the multiple images of the lesion can be captured and recorded with the position and orientation of the probe. The lesion can be determined via image interpretation and tolerate only images with the lesion).

Regarding claim 9, modified Caluser teaches the image display apparatus in claim 2, as discussed above. Caluser further teaches an image display apparatus, wherein the ultrasound probe has a position sensor that detects the position of the ultrasound probe, and the processor acquires the probe positional information on the basis of the position of the ultrasound probe detected by the position sensor (Paragraph 0078 teaches the use of a sensor that is able to provide position tracking information of the ultrasound probe. Fig. 1 shows the position sensor connected to the ultrasound probe).
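The extraction step argued across claims 1, 5, and 6 — selecting, from many stored frames, the images whose lesion and probe-position metadata fit a user-set layout — can be sketched as follows. The record fields and the ordering criterion are hypothetical; the claims do not fix a selection rule:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    image_id: str
    probe_pos: tuple   # (x, y) probe position plotted on the body mark
    lesion_pos: tuple  # lesion coordinates within the frame, or None if absent

def extract_for_layout(frames, n_divisions, origin=(0.0, 0.0)):
    """Pick one frame per screen division: keep frames that actually show
    the lesion, then sort by probe distance from a reference point so the
    layout fills in a stable display order."""
    with_lesion = [f for f in frames if f.lesion_pos is not None]
    ordered = sorted(
        with_lesion,
        key=lambda f: (f.probe_pos[0] - origin[0]) ** 2
                      + (f.probe_pos[1] - origin[1]) ** 2,
    )
    return ordered[:n_divisions]

frames = [
    Frame("a", (3.0, 1.0), (10, 12)),
    Frame("b", (1.0, 0.5), None),      # no lesion visible: excluded
    Frame("c", (0.5, 0.5), (8, 9)),
]
print([f.image_id for f in extract_for_layout(frames, 2)])  # ['c', 'a']
```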
Regarding claim 10, modified Caluser teaches the image display apparatus in claim 1, as discussed above. Caluser further teaches an image display apparatus, wherein the processor performs image analysis on each of the plurality of ultrasound images to detect the lesion part and acquire the lesion part position information (Paragraph 0096 teaches the accurate collection and reproduction of the position coordinate information of the lesion. Paragraph 0138 teaches that the lesion is captured and recorded and the system is able to determine the lesion in the images according to the position information via image interpretation).

Regarding claim 11, modified Caluser teaches the image display apparatus in claim 2, as discussed above. Caluser further teaches an image display apparatus, wherein the processor performs image analysis on each of the plurality of ultrasound images to detect the lesion part and acquire the lesion part position information (Paragraph 0096 teaches the accurate collection and reproduction of the position coordinate information of the lesion. Paragraph 0138 teaches that the lesion is captured and recorded and the system is able to determine the lesion in the images according to the position information via image interpretation).

Regarding claim 20, Caluser teaches a control method of an image display apparatus, the control method comprising: inputting a plurality of ultrasound images in which a lesion part in an inside of a subject is imaged (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging); analyzing the plurality of ultrasound images to acquire probe positional information of an ultrasound probe, and imaging feature information, the imaging feature information including lesion part position information regarding a position of the lesion part of the subject in each respective image of the plurality of images (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0022 teaches that the image set is according to each pixel with respect to the position of a target pixel. The target pixel selection can be made at the time of capture, before the image save, or at a later time. Paragraph 0117 teaches that the image can be displayed in relation to the target region that is defined by the pixels. Paragraph 0120 teaches that each position of the pixel is performed in relation to the anatomical references. Paragraph 0096 teaches that the position coordinates of the lesion can be obtained. Paragraph 0115 teaches that the real size of the body part is known and the position tracking allows for the assessment of when the probe is outside of the range. Fig. 15 shows a range of images over the body position. Paragraph 0138 teaches that the spatial range can be used in the assessment of the lesions); extracting an ultrasound image conforming to a display layout that is set by a user, the display layout including a plurality of screen divisions of a monitor and a display order, from the plurality of ultrasound images by referring to the acquired probe positional information and the acquired imaging feature information (Paragraph 0023 teaches that the user is guided to display the target in a body diagram and adjust the probe position in real time. Paragraph 0078 teaches that the user can use a single monitor or multiple monitors for display. Paragraph 0083 teaches that the ultrasound images and the associated position information can be displayed at a later time if decided. Paragraph 0138 teaches that the multiple images of the lesion can be captured and recorded with the position and orientation of the probe. The lesion can be determined via image interpretation and tolerate only images with the lesion. Paragraph 0140 teaches real time scanning and display with respect to a body diagram. Paragraph 0100 teaches the display of the ultrasound image and the tagged image that can be displayed and recorded in any combination or order. See Figs. 60, 62, and 69-70); and display the extracted ultrasound image on the monitor according to the display layout (Paragraph 0138 teaches that the images with the lesion that are within the tolerance are contained and can be grouped and displayed. Fig. 62 shows the display of the probe pose information and the ultrasound image), wherein the processor is further configured to: add a body mark plotted with a probe mark indicating a position and an orientation of the ultrasound probe on each respective image of the plurality of images, the body mark schematically representing a body part of the subject (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging. Paragraph 0078 teaches the coregistration of the body location and the tracked probe position and orientation with respect to the real time ultrasound images. Anatomical references can also be used for the coregistration. See at least Figs. 6, 30, 60, 62, and 69); and acquire the probe positional information on the basis of the position and the orientation of the ultrasound probe indicated by the probe mark (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging. Paragraph 0078 teaches the coregistration of the body location and the tracked probe position and orientation with respect to the real time ultrasound images. Anatomical references can also be used for the coregistration. See at least Figs. 6, 30, 60, 62, and 69); the extracted ultrasound image is stored in the memory along with tag information associated with the extracted ultrasound image (Paragraph 0083 teaches that the ultrasound images with their associated positional information can be stored. Paragraph 0091 teaches that the storage is done with respect to the memory and can be used for later retrieval).

While Caluser teaches compatibility with DICOM systems, Caluser is silent regarding a method, extracting an ultrasound image conforming to a display layout that is set by a user, the display layout including a display magnification; the method further comprises converting the extracted ultrasound image into image data in a Digital Imaging and Communications In Medicine (DICOM) format including the tag information.

In an analogous imaging field of endeavor, regarding ultrasound image processing, Shinohara teaches a method, the method further comprises converting the extracted ultrasound image into image data in a Digital Imaging and Communications In Medicine (DICOM) format including the tag information (Paragraph 0047 teaches that the 3D image data is converted from the image data to the DICOM format and stored. The information includes the image position information and the inclination information of the slice image data. Paragraphs 0056-0071 teach that the image's 3D arrangement is assigned according to tagging that is considerate of pixel spacing. Paragraph 0053 teaches the pixel spacing and the intervoxel distancing. This information is stored according to the DICOM format. Abstract teaches an ultrasound system. See Fig. 6).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Caluser with Shinohara's teaching of DICOM formatting of image data and the tag information. This modified method would allow the user to track and correlate position information with high accuracy (Paragraph 0016 of Shinohara). Furthermore, the modification can assist in easy comparison via a standard image data structure, treatment planning, or progress observation (Paragraph 0018 of Shinohara).

However, Shinohara is silent regarding a method, extracting an ultrasound image conforming to a display layout that is set by a user, the display layout including a display magnification.

In an analogous imaging field of endeavor, regarding ultrasound image processing, Lee teaches a method, extracting an ultrasound image conforming to a display layout that is set by a user, the display layout including a display magnification (Col. 12, lines 5-45 teaches the display of a predetermined number of frame images among the frame images during a time period. The images can be organized and placed in a particular order. Col. 2, lines 37-41 teaches that a particular image can be magnified. See Fig. 7).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Caluser and Shinohara with Lee's teaching of a display magnification. This modified method would allow the user to improve ultrasound diagnosis accuracy (Col. 2, lines 19-26 of Lee). Furthermore, the modification provides precise diagnosis of observed lesions (Col. 9, lines 27-35 of Lee).

Claims 14-17 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Caluser et al. (PGPUB No. US 2015/0051489) in view of Shinohara (PGPUB No. US 2011/0224550) further in view of Lee et al. (US Patent No. 10,395,346) further in view of Takimoto (PGPUB No. US 2015/0359506).
Regarding claim 14, modified Caluser teaches the image display apparatus in claim 1, as discussed above. However, the combination of Caluser, Shinohara, and Lee is silent regarding an image display apparatus, wherein the processor is configured to: acquire image type information indicating whether the ultrasound image is a B-mode image or an ultrasound image other than a B-mode image, together with the probe positional information and the imaging feature information, and perform the extraction of the ultrasound image by referring to the probe positional information, the image type information, and the imaging feature information that are acquired.

In an analogous imaging field of endeavor, regarding ultrasound image acquisition in relation to probe positioning, Takimoto teaches an image display apparatus, wherein the processor is configured to: acquire image type information indicating whether the ultrasound image is a B-mode image or an ultrasound image other than a B-mode image, together with the probe positional information and the imaging feature information (Paragraph 0039 teaches that the system is able to confirm if the switching has passed through B-mode data. The system determines if the data has been generated in an order that includes the B-mode in relation to other modes of imaging. Paragraphs 0046-47 teach the assessment of the position of the ultrasonic probe with respect to the patient and ensuring that the imaging is performed for the target), and perform the extraction of the ultrasound image by referring to the probe positional information, the image type information, and the imaging feature information that are acquired (Paragraph 0049 teaches that the performance of the imaging with the order to include B-mode and the use of the probe in relation to the patient results in the display of image information that is associated to the patient physiology. See Fig. 2).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Caluser, Shinohara, and Lee with Takimoto's teaching of assessment and confirmation that the B-mode has been passed through. This modified apparatus would allow the user to acquire patient information without operation by the user and ensures aliasing-free information (Paragraph 0049 of Takimoto). Furthermore, the modification prevents display of an improper waveform caused by the operator pressing the button in haste when wanting to check the entire waveform quickly (Paragraph 0049 of Takimoto).

Regarding claim 15, modified Caluser teaches the image display apparatus in claim 2, as discussed above. However, the combination of Caluser, Shinohara, and Lee is silent regarding an image display apparatus, wherein the processor is configured to: acquire image type information indicating whether the ultrasound image is a B-mode image or an ultrasound image other than a B-mode image, together with the probe positional information and the imaging feature information, and perform the extraction of the ultrasound image by referring to the probe positional information, the image type information, and the imaging feature information that are acquired.

In an analogous imaging field of endeavor, regarding ultrasound image acquisition in relation to probe positioning, Takimoto teaches an image display apparatus, wherein the processor is configured to: acquire image type information indicating whether the ultrasound image is a B-mode image or an ultrasound image other than a B-mode image, together with the probe positional information and the imaging feature information (Paragraph 0039 teaches that the system is able to confirm if the switching has passed through B-mode data. The system determines if the data has been generated in an order that includes the B-mode in relation to other modes of imaging. Paragraphs 0046-47 teach the assessment of the position of the ultrasonic probe with respect to the patient and ensuring that the imaging is performed for the target), and perform the extraction of the ultrasound image by referring to the probe positional information, the image type information, and the imaging feature information that are acquired (Paragraph 0049 teaches that the performance of the imaging with the order to include B-mode and the use of the probe in relation to the patient results in the display of image information that is associated to the patient physiology. See Fig. 2).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Caluser, Shinohara, and Lee with Takimoto's teaching of assessment and confirmation that the B-mode has been passed through. This modified apparatus would allow the user to acquire patient information without operation by the user and ensures aliasing-free information (Paragraph 0049 of Takimoto). Furthermore, the modification prevents display of an improper waveform caused by the operator pressing the button in haste when wanting to check the entire waveform quickly (Paragraph 0049 of Takimoto).

Regarding claim 16, modified Caluser teaches the image display apparatus in claim 14, as discussed above. Caluser further teaches an image display apparatus, wherein the processor acquires the image type information on the basis of tag information added to each of the plurality of ultrasound images (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging. Paragraph 0078 teaches the coregistration of the body location and the tracked probe position and orientation with respect to the real time ultrasound images).
Regarding claim 17, modified Caluser teaches the image display apparatus in claim 15, as discussed above. Caluser further teaches an image display apparatus, wherein the processor acquires the image type information on the basis of tag information added to each of the plurality of ultrasound images (Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging. Paragraph 0078 teaches the coregistration of the body location and the tracked probe position and orientation with respect to the real-time ultrasound images).

Regarding claim 19, modified Caluser teaches the image display apparatus in claim 14, as discussed above. Caluser further teaches an image display apparatus, wherein the processor is configured to: extract B-mode images from the plurality of acquired ultrasound images on the basis of the image type information (Fig. 62 shows a B-mode image. Paragraph 0100 teaches that the position and coordinates of the target, anatomy, and the image itself are recorded. The position representations and corresponding alphanumerical values can be displayed and recorded. Paragraph 0019 teaches that the apparatus is able to perform automated ultrasound probe position registration in real time. Paragraph 0096 teaches that the lesion can be imaged. Paragraph 0021 teaches the ultrasound imaging); and perform the analyzation of the extracted B-mode images as the ultrasound images (Paragraph 0096 teaches that the position coordinates of the lesion can be obtained. Paragraph 0115 teaches that the real size of the body part is known and that the position tracking allows for assessment of when the probe is outside of the range. Fig. 15 shows a range of images over the body position. Paragraph 0138 teaches that the spatial range can be used in the assessment of the lesions).

Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Caluser et al. (PGPUB No. US 2015/0051489) in view of Shinohara (PGPUB No. US 2011/0224550), further in view of Lee et al. (US Patent No. 10,395,346), further in view of Takimoto (PGPUB No. US 2015/0359506), further in view of Kobayashi et al. (PGPUB No. US 2019/0090855).

Regarding claim 18, modified Caluser teaches the image display apparatus in claim 14, as discussed above. However, the combination of Caluser, Shinohara, Lee, and Takimoto is silent regarding an image display apparatus, wherein the processor acquires the image type information on the basis of RGB signal values of each of the plurality of ultrasound images.

In an analogous imaging field of endeavor, regarding ultrasound image acquisition in relation to probe positioning, Kobayashi teaches an image display apparatus, wherein the processor acquires the image type information on the basis of RGB signal values of each of the plurality of ultrasound images (Paragraph 0039 teaches that the images undergo an RGB conversion for the generation of the ultrasound images. The images are displayed).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Caluser, Shinohara, Lee, and Takimoto with Kobayashi's teaching of the use of RGB signal values. This modified apparatus would provide the user with images of a predetermined resolution and frame rate (Paragraph 0039 of Kobayashi). Furthermore, the modification ensures that the operator can constantly recognize the acquisition position of the echo data (Paragraph 0159 of Kobayashi).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Lee et al. (PGPUB No. US 2014/0276057): Teaches display with screen divisions, display order, and display magnification. Ito et al. (PGPUB No. US 2009/0299181): Teaches display with screen divisions, display order, and display magnification.
Chiang et al. (PGPUB No. US 2014/0121524): Teaches display with screen divisions, display order, and display magnification.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ADIL PARTAP S VIRK, whose telephone number is (571) 272-8569. The examiner can normally be reached Mon-Fri 8-5.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Pascal Bui-Pho, can be reached at 571-272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ADIL PARTAP S VIRK/
Primary Examiner, Art Unit 3798

Prosecution Timeline

May 17, 2023
Application Filed
Sep 23, 2024
Non-Final Rejection — §103
Nov 25, 2024
Interview Requested
Dec 17, 2024
Examiner Interview Summary
Dec 17, 2024
Applicant Interview (Telephonic)
Jan 21, 2025
Response Filed
Apr 07, 2025
Final Rejection — §103
Jul 01, 2025
Applicant Interview (Telephonic)
Jul 01, 2025
Examiner Interview Summary
Aug 11, 2025
Request for Continued Examination
Aug 13, 2025
Response after Non-Final Action
Aug 20, 2025
Non-Final Rejection — §103
Oct 07, 2025
Interview Requested
Oct 15, 2025
Examiner Interview Summary
Oct 15, 2025
Applicant Interview (Telephonic)
Nov 24, 2025
Response Filed
Jan 06, 2026
Final Rejection — §103
Mar 13, 2026
Examiner Interview Summary
Mar 13, 2026
Applicant Interview (Telephonic)
Mar 31, 2026
Request for Continued Examination
Apr 07, 2026
Non-Final Rejection — §103
Apr 07, 2026
Response after Non-Final Action

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599313
Health Trackers for Autonomous Targeting of Tissue Sampling Sites
2y 5m to grant Granted Apr 14, 2026
Patent 12569221
Systems and Methods for Infrared-Enhanced Ultrasound Visualization
2y 5m to grant Granted Mar 10, 2026
Patent 12569228
ULTRASOUND DIAGNOSTIC APPARATUS AND CONTROL METHOD FOR ULTRASOUND DIAGNOSTIC APPARATUS
2y 5m to grant Granted Mar 10, 2026
Patent 12569304
OPTICAL COHERENCE TOMOGRAPHY GUIDED ROBOTIC OPHTHALMIC PROCEDURES
2y 5m to grant Granted Mar 10, 2026
Patent 12564384
SYSTEM AND METHODS FOR JOINT SCAN PARAMETER SELECTION
2y 5m to grant Granted Mar 03, 2026
Based on the 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
48%
Grant Probability
89%
With Interview (+41.3%)
3y 2m
Median Time to Grant
High
PTA Risk
Based on 213 resolved cases by this examiner. Grant probability derived from career allow rate.
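The projection figures above are simple arithmetic on the examiner's career statistics. A minimal sketch of the derivation, assuming the interview lift is additive in percentage points (variable names are illustrative, not from any real data source):

```python
# Inputs shown on this page: 102 granted of 213 resolved cases,
# and a +41.3 percentage-point interview lift.
granted, resolved = 102, 213
interview_lift_pts = 41.3

allow_rate = granted / resolved                     # career allow rate
grant_probability = round(allow_rate * 100)         # displayed as 48%
with_interview = round(allow_rate * 100 + interview_lift_pts)  # displayed as 89%

print(grant_probability, with_interview)  # 48 89
```

This matches the "48% Grant Probability" and "89% With Interview" figures, suggesting the with-interview number is the base rate plus the lift rather than a separately measured cohort rate.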
