Prosecution Insights
Last updated: April 19, 2026
Application No. 18/628,160

ULTRASOUND IMAGING APPARATUS AND METHOD FOR PROVIDING USER INTERFACE

Final Rejection (§103, §112)
Filed: Apr 05, 2024
Examiner: LI, JOHN DENNY
Art Unit: 3798
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Samsung Electronics
OA Round: 2 (Final)
Grant Probability: 64% (Moderate)
OA Rounds: 3-4
To Grant: 3y 6m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 64% (grants 158 of 246 resolved cases; -5.8% vs TC avg)
Interview Lift: +48.7% (allowance rate in resolved cases with an interview vs. without)
Avg Prosecution: 3y 6m (typical timeline); 36 applications currently pending
Total Applications: 282 across all art units (career history)
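The headline figures above are simple ratios. As a sanity check, a short sketch (the granted/resolved counts are copied from the card; backing out the Tech Center average from the stated -5.8% delta is our inference):

```python
# Examiner career stats from the card above.
granted, resolved = 158, 246

allow_rate = granted / resolved * 100      # career allowance rate
tc_avg_estimate = allow_rate + 5.8         # card shows -5.8% vs TC avg

print(f"Career allow rate: {allow_rate:.1f}%")        # 64.2%
print(f"Implied TC average: {tc_avg_estimate:.1f}%")  # ~70.0%
```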

Statute-Specific Performance

§101: 6.5% (-33.5% vs TC avg)
§103: 47.7% (+7.7% vs TC avg)
§102: 12.2% (-27.8% vs TC avg)
§112: 29.7% (-10.3% vs TC avg)

Tech Center average is an estimate • Based on career data from 246 resolved cases
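Each per-statute delta above implies a Tech Center baseline by simple subtraction. A quick sketch (rates and deltas copied from the table; notably, every delta backs out the same ~40% TC-average estimate):

```python
# Per-statute rates and their "vs TC avg" deltas, copied from the table above.
rates = {
    "§101": (6.5, -33.5),
    "§103": (47.7, +7.7),
    "§102": (12.2, -27.8),
    "§112": (29.7, -10.3),
}

for statute, (rate, delta) in rates.items():
    tc_avg = rate - delta  # back out the Tech Center average estimate
    print(f"{statute}: {rate:.1f}% (TC avg ~ {tc_avg:.1f}%)")
```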

Office Action

§103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendment filed on 2/3/2026 has been entered. Claims 1-20 remain pending in the application.

Response to Arguments

Applicant's arguments filed on 2/3/2026 have been fully considered, but they are not persuasive or are moot.

Applicant argues on page 7 that the amendments to claim 20 overcome the §112(d) rejection. The Examiner respectfully disagrees. Although the instructions on the CRM in claim 20, when executed, perform the method of claim 11, it is possible to simply possess the CRM of claim 20 without performing the method steps of claim 11, the claim upon which claim 20 depends. The Examiner recommends amending claim 20 to recite the steps of claim 11 rather than depending on claim 11. Accordingly, this argument is not persuasive.

Applicant argues on pages 9-11 that the rejection fails to address the newly added limitations to the independent claims related to the display UI. This argument is moot in view of the new grounds of rejection necessitated by amendment, which rely on Imai and on newly cited portions of Jun to disclose these limitations in the claims.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(d):

(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

The following is a quotation of pre-AIA 35 U.S.C. 112, fourth paragraph:

Subject to the following paragraph [i.e., the fifth paragraph of pre-AIA 35 U.S.C. 112], a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

Claim 20 is rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Regarding claim 20, the claim fails to include all the limitations of the claim upon which it depends because the claim does not actually require performing the method of claim 11. The Examiner recommends amending claim 20 to recite the steps of claim 11. Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claims 1-3, 6, 11-13, 16, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Jun et al. (US20180188946, hereafter Jun), Imai (US20200178933), and Li et al. (US20160062581, hereafter Li). Regarding claims 1 and 11, Jun discloses a method and an ultrasound imaging apparatus (Jun, Para 2; “The present disclosure relates to medical image display apparatuses and methods of providing a user interface via the medical image display apparatuses, and more particularly, to medical image display apparatuses and methods of providing a user interface suitable for a user”) comprising: a display; a motion sensor (Jun, Para 233; “a magnetic sensor, an acceleration sensor, a gyroscope sensor, a proximity sensor, an optical sensor, a depth sensor, an infrared sensor, or an ultrasound sensor included therein”); an input interface (Jun, Para 120; “The medical image display apparatus 100 may display various pieces of information processed by the various devices 11 through 14, as well as the medical image, via a graphical user interface (GUI)”); and at least one processor configured to display, via the display, a user interface (UI) image indicating a measuring position, on an ultrasound image (Jun, Para 145-146; “For example, the medical image display apparatus 100 may display a GUI 201 for displaying information about the medical image 210 currently being displayed […] The 
information about the medical image 210 may include any desired information, for example a position of a region of an object depicted in the medical image 210, a position or a size of a lesion, a position or a size of a region of interest (ROI) designated by a user, information about a text or a body marker stored together with the medical image 210, or any other desired information or combination thereof.”) (Jun, Para 120; “The medical image display apparatus 100 may display various pieces of information processed by the various devices 11 through 14, as well as the medical image, via a graphical user interface (GUI)”) (Jun, Para 117; “According to various exemplary embodiments, the medical image display apparatus 100 may receive medical image data stored therein or in the various devices 11 through 14 and display a medical image generated using the received medical image data. For example, the medical image display apparatus 100 may display an ultrasound image”), receive a user input for setting the position of the UI image within the ultrasound image, via the input interface (Jun, Para 129; “The medical image display apparatus 100 may display an ultrasound image by using ultrasound image data received from the ultrasound probe 12. The medical image display apparatus 100 may also provide a GUI for controlling the medical image display apparatus 100 or for setting a function related to an operation of displaying an ultrasound image by the medical image display apparatus 100. Furthermore, the medical image display apparatus 100 may provide a GUI for controlling the ultrasound probe 12 or for setting a function related to an operation of acquiring ultrasound image data by the ultrasound probe 12”) (Jun, Para 183; ““Drag” may occur when a user places a finger or touch instrument on a screen moves the finger or the touch tool to another location on the screen without lifting it from the screen. 
The drag gesture may be used to move an object or perform a pan gesture as described below.”) (Jun, Para 146; “The information about the medical image 210 may include any desired information, for example a position of a region of an object depicted in the medical image 210, a position or a size of a lesion, a position or a size of a region of interest (ROI) designated by a user, information about a text or a body marker stored together with the medical image 210, or any other desired information or combination thereof.”), and obtain a measurement value for an object of the ultrasound and display the measurement value for the object of the ultrasound (Jun, Para 129; “The medical image display apparatus 100 may display an ultrasound image by using ultrasound image data received from the ultrasound probe 12. The medical image display apparatus 100 may also provide a GUI for controlling the medical image display apparatus 100 or for setting a function related to an operation of displaying an ultrasound image by the medical image display apparatus 100. Furthermore, the medical image display apparatus 100 may provide a GUI for controlling the ultrasound probe 12 or for setting a function related to an operation of acquiring ultrasound image data by the ultrasound probe 12”) (Jun, Para 146; “The information about the medical image 210 may include any desired information, for example a position of a region of an object depicted in the medical image 210, a position or a size of a lesion, a position or a size of a region of interest (ROI) designated by a user, information about a text or a body marker stored together with the medical image 210, or any other desired information or combination thereof.”). 
Jun does not clearly and explicitly disclose setting the position of the UI image within the ultrasound image as the measurement position for the ultrasound image, obtaining the measurement value for the object of the ultrasound image based on the measurement position set on the ultrasound image, and changing, via the motion sensor, a position of the UI image within the ultrasound image according to movement of the ultrasound imaging apparatus. In an analogous ultrasound diagnostic device field of endeavor, Imai discloses setting a position of a UI image within an ultrasound image as a measurement position for an ultrasound image and obtaining a measurement value for an object of the ultrasound image based on the measurement position set on the ultrasound image (Imai, Para 20; “the measurement target designation receiving unit receives designation of a measurement target selected by the user through the operation unit from the plurality of measurement target candidates displayed on the display unit.”) (Imai, Para 74; “In step S13, the measurement target designation receiving unit 16 receives designation of the measurement target selected by the user through the operation unit 17 from the plurality of measurement targets displayed on the display unit 8 in step S12. In a case where the designation of the measurement target is received in this manner, the process proceeds to step S5 in which a measurement algorithm is set for the designated measurement target. In a case where the automatic measurement for the measurement target is performed in step S6, the measurement result is displayed on the display unit 8 in step S7. 
In this manner, the measurement operation in the ultrasound diagnostic apparatus 1 ends.”) (Imai, Para 93; “For example, by displaying the measurement line on the display unit 8 so that the measurement line is manually adjusted by the user through the operation unit 17, the measurement line correction receiving unit 25 can receive the correction of the measurement line. For example, in a case where the measurement line is a line segment, the measurement line is corrected by adjusting the position of the end point of the line segment. In this case, a measurement value for the corrected measurement line is calculated.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun to include setting the position of the UI image within the ultrasound image as the measurement position for the ultrasound image and obtaining the measurement value for the object of the ultrasound image based on the measurement position set on the ultrasound image, in order to allow a user to make adjustments as needed to improve diagnosis as taught by Imai (Imai, Para 93-96). In an analogous portable device for displaying information field of endeavor, Li discloses in Figure 1a changing, via a motion sensor, a position of a displayed UI within a display according to movement of the apparatus (Li, Para 33-34; “In step 101, when a target document is displayed on a screen of a terminal device, detecting a displacement of the terminal device and when such displacement is detected, determining a direction of the displacement. In step 102, translating the target document on the screen according to the direction of the displacement.”). 
The use of the techniques of using motion sensor data to manipulate UIs taught by Li in the invention of an ultrasound imaging UI would have comprised only application of a known technique to a known device ready for improvement to yield the predictable result of controlling the UI of the ultrasound device via the motion sensor; and similar modifications have previously been held to involve only routine skill in the art. KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 415-421, 82 USPQ2d 1385, 1395-97 (2007). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun to include changing, via the motion sensor, a position of the UI image within the ultrasound image according to movement of the ultrasound imaging apparatus in order to allow a user to manipulate the UI in an improved manner which is more convenient as taught by Li (Li, Para 18). Regarding claims 2 and 12, Jun as modified by Imai and Li above discloses all of the limitations of claims 1 and 11 as discussed above. Jun does not clearly and explicitly disclose wherein the at least one processor is further configured to, in response to a user input for changing at least one of a position or an angle of the ultrasound imaging apparatus, move the ultrasound image in a lateral direction, thereby moving the UI image within the ultrasound image in the lateral direction. Li further discloses wherein a processor is configured to, in response to a user input for changing at least one of a position or an angle of an apparatus, move a document in a lateral direction, thereby moving a UI image within the image in the lateral direction (Li, Para 33-34; “In step 101, when a target document is displayed on a screen of a terminal device, detecting a displacement of the terminal device and when such displacement is detected, determining a direction of the displacement. 
In step 102, translating the target document on the screen according to the direction of the displacement.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun wherein the at least one processor is further configured to, in response to a user input for changing at least one of a position or an angle of the ultrasound imaging apparatus, move the ultrasound image in a lateral direction, thereby moving the UI image within the ultrasound image in the lateral direction, in order to allow a user to manipulate the UI in an improved manner which is more convenient as taught by Li (Li, Para 18).

Regarding claims 3 and 13, Jun as modified by Imai and Li above discloses all of the limitations of claims 1 and 11 as discussed above. Jun does not clearly and explicitly disclose wherein the at least one processor is further configured to, in response to a user input for changing at least one of a position or an angle of the ultrasound imaging apparatus, move the ultrasound image in an axial direction, thereby moving the UI image within the ultrasound image in the axial direction. Li further discloses wherein a processor is configured to, in response to a user input for changing at least one of a position or an angle of the apparatus, move a document in an axial direction, thereby moving a UI image within the document in the axial direction (Li, Para 33-34; “In step 101, when a target document is displayed on a screen of a terminal device, detecting a displacement of the terminal device and when such displacement is detected, determining a direction of the displacement. In step 102, translating the target document on the screen according to the direction of the displacement.”). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun wherein the at least one processor is further configured to, in response to a user input for changing at least one of a position or an angle of the ultrasound imaging apparatus, move the ultrasound image in an axial direction, thereby moving the UI image within the ultrasound image in the axial direction, in order to allow a user to manipulate the UI in an improved manner which is more convenient as taught by Li (Li, Para 18).

Regarding claims 6 and 16, Jun as modified by Imai and Li above discloses all of the limitations of claims 1 and 11 as discussed above. Jun does not clearly and explicitly disclose wherein the changing of the position of the UI image within the ultrasound image according to the movement of the ultrasound imaging apparatus comprises, in response to a user input for changing at least one of a position or an angle of the ultrasound imaging apparatus, moving the ultrasound image up, down, left, or right, thereby moving the position of the UI image within the ultrasound image up, down, left, or right. Li further discloses wherein a processor is configured to, in response to a user input for changing at least one of a position or an angle of an apparatus, move a document up, down, left, or right, thereby moving the position of a UI image within the document up, down, left, or right (Li, Para 33-34; “In step 101, when a target document is displayed on a screen of a terminal device, detecting a displacement of the terminal device and when such displacement is detected, determining a direction of the displacement. In step 102, translating the target document on the screen according to the direction of the displacement.”). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun wherein the at least one processor is further configured to, in response to a user input for changing at least one of a position or an angle of the ultrasound imaging apparatus, move the ultrasound image up, down, left, or right, thereby moving the position of the UI image within the ultrasound image up, down, left, or right, in order to allow a user to manipulate the UI in an improved manner which is more convenient as taught by Li (Li, Para 18).

Regarding claim 20, Jun as modified by Imai and Li above discloses all of the limitations of claim 11 as discussed above. Jun further discloses a non-transitory computer-readable recording medium having recorded thereon instructions that, when executed by at least one processor of an ultrasound imaging apparatus, cause the ultrasound imaging apparatus to perform the method (Jun, Para 49; “According to another aspect of an exemplary embodiment, non-transitory computer-readable recording medium may have recorded thereon a program for performing the methods above”).

Claims 4 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Jun, Imai, and Li as applied to claims 1 and 11, and in further view of Adams et al. (US20130286573, hereafter Adams). Regarding claims 4 and 14, Jun as modified by Imai and Li above discloses all of the limitations of claims 1 and 11 as discussed above. Jun does not clearly and explicitly disclose wherein the at least one processor is further configured to, in response to a user input for changing at least one of a position or an angle of the ultrasound imaging apparatus, change a size of the UI image within the ultrasound image. 
In an analogous portable device for displaying information field of endeavor Adams discloses wherein a processor is configured to, in response to a user input for changing at least one of a position or an angle of an apparatus, change a size of a UI within a displayed image of the apparatus (Adams, Para 34-36; “For example, a tilt may be associated with a keyboard transformation function to change the keyboard by resizing the keys of the keyboard […] When the tilt of the device to the left is identified at 206 to be associated with a keyboard transformation function at 208, the keyboard is changed at 210 such that one or more keys proximate (located near) a first, tilted up side of the device are resized to be larger than one or more keys proximate a second, tilted down side of the device. In this example, the keyboard is changed at 210 to increase the size of the keys on the left side of the keyboard, i.e., the keys proximate a first, tilted up side of the device. The keyboard is also changed to reduce the size of the keys on the right side of the keyboard, i.e., the keys proximate a second, tilted down side of the device, to simulate an effect of gravity on the keys of the keyboard”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun wherein the at least one processor is further configured to, in response to a user input for changing at least one of a position or an angle of the ultrasound imaging apparatus, change a size of the UI image within the ultrasound image in order to modify the display as needed for different functions as taught by Adams (Adams, Para 3-4) which makes it easier to use. Claims 5 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Jun, Imai, and Li as applied to claims 1 and 11, and in further view of Bhatt et al. (US20130016122, hereafter Bhatt). 
Regarding claims 5 and 15, Jun as modified by Imai and Li above discloses all of the limitations of claims 1 and 11 as discussed above. Jun does not clearly and explicitly disclose wherein the at least one processor is further configured to, in response to a user input for changing at least one of a position or an angle of the ultrasound imaging apparatus, change an angle of the UI image within the ultrasound image. In an analogous portable device for displaying information field of endeavor Bhatt discloses wherein a processor is configured to, in response to a user input for changing at least one of a position or an angle of the apparatus, change an angle of a UI within a display (Bhatt, Para 3; “The user can enter the zoom, pan and rotate instructions by using multi-touch gestures, gyroscope/accelerometer sensitive gestures or mouse/keyboard actions with respect to the selected image content within the cropping environment”) (Bhatt, Para 29; “The image content 102-A shown within the overlaid crop-box can be panned 145, zoomed 125, rotated 135 by the multifunctional cropping utility 110 as requested by the user via respective instructions 142, 122, 132 entered using multi-touch gestures (or gyroscope/ accelerometer sensitive gestures or keyboard/mouse instructions)”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun wherein the at least one processor is further configured to, in response to a user input for changing at least one of a position or an angle of the ultrasound imaging apparatus, change an angle of the UI image within the ultrasound image as taught by Bhatt in order to allow a user to manipulate the UI without clicking any buttons which increases the usability of the device in an intuitive manner. Claims 7 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Jun, Imai, and Li as applied to claims 1 and 11, and in further view of Ozaki (US20110182137). 
Regarding claims 7 and 17, Jun as modified by Imai and Li above discloses all of the limitations of claims 1 and 11 as discussed above. Jun does not clearly and explicitly disclose wherein the measurement parameter includes a plurality of measurement parameters having a predetermined setting order, and the at least one processor is further configured to, in response to receiving a user input for setting a first measurement parameter from among the plurality of measurement parameters, display a UI image corresponding to a second measurement parameter that is a next parameter following the first measurement parameter. In an analogous portable ultrasound device for displaying information field of endeavor Ozaki discloses a plurality of measurement parameters having a predetermined setting order, wherein a processor is configured to, in response to receiving a user input for setting a first measurement parameter from among the plurality of measurement parameters, display a UI image corresponding to a second measurement parameter that is a next parameter following the first measurement parameter (Ozaki, Para 80; “display unit 80 determines the input of the first control parameter on second display unit 70 executed by the operator, generates the menu for setting a second control parameter which is to be the item next to the determined first control parameter on a part of the first display unit 100, controls the display of the generated menu on first display unit 100, corresponds the menu to the switch and displays the corresponded menu and the switch on second display unit 70”) (Ozaki, Para 34; “Display control unit 80 inputs the first control parameter by an operator to second display unit 70, determines the input of the first control parameter, generates the menu for setting the second control parameter which is to be the item next to the determined first control parameter on a part of the first display unit 100, controls the display of the generated menu on first 
display unit 100, and controls the display of a switch being corresponded to the generated menu on the second display unit 70”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun wherein the measurement parameter includes a plurality of measurement parameters having a predetermined setting order, and the at least one processor is further configured to, in response to receiving a user input for setting a first measurement parameter from among the plurality of measurement parameters, display a UI image corresponding to a second measurement parameter that is a next parameter following the first measurement parameter in order to allow a user to input a plurality of information without changing line of sight as taught by Ozaki (Ozaki, Para 15-16) which increases ease of use. Claims 8 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Jun, Imai, and Li as applied to claims 1 and 11, and in further view of Funakubo (US20180289360). Regarding claims 8 and 18, Jun as modified by Imai and Li above discloses all of the limitations of claims 1 and 11 as discussed above. Jun does not clearly and explicitly disclose wherein the at least one processor is further configured to display the ultrasound image, and based on receiving a user input for entering a measurement mode, enlarge a region of the displayed ultrasound image, and display the UI image on the enlarged region. 
In an analogous ultrasound device for displaying information field of endeavor Funakubo discloses wherein a processor is configured to display an ultrasound image, and based on receiving a user input for entering a measurement mode, enlarge a region of the displayed ultrasound image, and display a UI image on the enlarged region (Funakubo, Para 68; “Here, to change an operation mode from the distance measurement mode, the control unit 38 causes the mode setting unit 36 to change the setting to the enlargement/reduction mode and gives a command to the ultrasound-image generating unit 32 so as to enlarge the image”) (Funakubo, Figure 10). The use of the techniques of enlarging an image when performing measurements on the image taught by Funakubo in the invention of an ultrasound imaging UI would have comprised only application of a known technique to a known device ready for improvement to yield the predictable result of measuring the larger image; and similar modifications have previously been held to involve only routine skill in the art. KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 415-421, 82 USPQ2d 1385, 1395-97 (2007). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun wherein the at least one processor is further configured to display the ultrasound image, and based on receiving a user input for entering a measurement mode, enlarge a region of the displayed ultrasound image, and display the UI image on the enlarged region as taught by Funakubo in order to allow the measured image to be seen and interacted with more easily by the user when performing the measurement. Claims 9 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Jun, Imai, and Li as applied to claims 1 and 11, and in further view of Taniguchi (US20180348975). Regarding claims 9 and 19, Jun as modified by Imai and Li above discloses all of the limitations of claims 1 and 11 as discussed above. 
Jun does not clearly and explicitly disclose wherein the at least one processor is further configured to display guidance information that guides a direction of movement of the ultrasound imaging apparatus for moving the UI image within the ultrasound image. In an analogous portable device for displaying information field of endeavor, Taniguchi discloses wherein a processor is configured to display guidance information that guides a direction of movement of an apparatus for moving a UI image displayed on the apparatus (Taniguchi, Para 13; “a tilt detection unit configured to detect a tilt of the touch panel display unit, and the control unit configured to change the display of the touch panel display unit according to […] the tilt of the touch panel display unit detected by the tilt detection unit […] depicts a fore edge of a paper-based book and extends in a vertical direction of the touch panel display unit, in a right or left edge portion of the touch panel display unit with which the indicator is in contact when recognizing that an operation of tilting the touch panel display unit has been performed based on a detection situation of the tilt detection unit”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun wherein the at least one processor is further configured to display guidance information that guides a direction of movement of the ultrasound imaging apparatus for moving the UI image within the ultrasound image, in order to facilitate easier use of the device as taught by Taniguchi (Taniguchi, Para 11-13).

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Jun, Imai, and Li as applied to claim 1, and in further view of Wigdor (US20070186192). Regarding claim 10, Jun as modified by Imai and Li above discloses all of the limitations of claim 1 as discussed above. 
Jun does not clearly and explicitly disclose wherein the at least one processor is further configured to, when a predetermined hardware button is pressed, receive a user input for changing at least one of the position or an angle of the ultrasound imaging apparatus, and in response to receiving a user input for releasing the pressed predetermined hardware button, determine the position of the UI image within the ultrasound image as a value of the measurement parameter. In an analogous portable device for displaying information field of endeavor, Wigdor discloses wherein a processor is configured to, when a predetermined hardware button is pressed, receive a user input for changing at least one of the position or the angle of an apparatus, and in response to receiving a user input for releasing the pressed predetermined hardware button, make adjustments based on the position or the angle of the apparatus (Wigdor, Para 30; “the amount of tilt is calculated as the difference in the value of the tilt sensors at key down and key up. This requires the user to carry out three distinct movements once the button has been located: push the button, tilt the phone, and release the button”) (Wigdor, Para 16; “A user may tilt the phone 200 along a first axis 206 and/or a second axis 208 to assist in entering text data into the phone 200.”). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Jun wherein the at least one processor is further configured to, when a predetermined hardware button is pressed, receive a user input for changing at least one of the position or the angle of the ultrasound imaging apparatus, and in response to receiving a user input for releasing the pressed predetermined hardware button, determine the position of the UI image within the ultrasound image as a value of the measurement parameter as taught by Wigdor in order to allow a device to be controlled more precisely by controlling when position and orientation are used to manipulate the device.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to John Li whose telephone number is (313) 446-4916. The examiner can normally be reached Monday to Thursday, 5:30 AM to 3:30 PM Eastern.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Pascal Bui-Pho, can be reached at (571) 272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JOHN D LI/
Primary Examiner, Art Unit 3798

Prosecution Timeline

Apr 05, 2024
Application Filed
Nov 03, 2025
Non-Final Rejection — §103, §112
Feb 03, 2026
Response Filed
Feb 25, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12588954
ARTICULATING GUIDE WITH INTEGRAL POSITION SENSOR
2y 5m to grant Granted Mar 31, 2026
Patent 12575885
AUGMENTED REALITY GUIDANCE SYSTEM FOR CARDIAC INTERVENTIONAL SURGERY
2y 5m to grant Granted Mar 17, 2026
Patent 12569301
SURGICAL NAVIGATION SYSTEM FOR ALIGNMENT OF A SURGICAL INSTRUMENT
2y 5m to grant Granted Mar 10, 2026
Patent 12564368
NUCLEAR MEDICINE DIAGNOSIS APPARATUS, ACQUISITION PERIOD EXTENDING METHOD, AND NON-TRANSITORY COMPUTER-READABLE MEDIUM
2y 5m to grant Granted Mar 03, 2026
Patent 12558067
TEMPERATURE INSENSITIVE BACKING STRUCTURE FOR INTRALUMINAL IMAGING DEVICES
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
64%
Grant Probability
99%
With Interview (+48.7%)
3y 6m
Median Time to Grant
Moderate
PTA Risk
Based on 246 resolved cases by this examiner. Grant probability derived from career allow rate.
