Prosecution Insights
Last updated: April 19, 2026
Application No. 18/748,328

ULTRASOUND SITUATED DISPLAY IN AN AUGMENTED REALITY ENVIRONMENT

Non-Final OA (§103)

Filed: Jun 20, 2024
Examiner: WILSON, NICHOLAS R
Art Unit: 2611
Tech Center: 2600 — Communications
Assignee: Medivis Inc.
OA Round: 1 (Non-Final)

Grant Probability: 87% (Favorable)
OA Rounds: 1-2
To Grant: 1y 12m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 87%, above average (467 granted / 537 resolved; +25.0% vs TC avg)
Interview Lift: +12.1%, a moderate lift (allowance among resolved cases with vs. without interview)
Avg Prosecution: 1y 12m, a fast prosecutor (25 currently pending)
Career History: 562 total applications across all art units
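
For reference, the career allow rate and its delta against the Tech Center can be reproduced from the counts shown above. A minimal Python sketch follows; the Tech Center baseline used here is back-solved from the displayed +25.0% delta and is an assumption, not an independently published figure.

```python
# Reproduce the headline examiner statistics shown above (illustrative only).
granted = 467             # career grants among resolved cases
resolved = 537            # career resolved cases
tc_avg_allow = 0.62       # assumed TC 2600 baseline, back-solved from the +25.0% delta

career_allow_rate = granted / resolved           # ~0.870 -> "87%"
delta_vs_tc = career_allow_rate - tc_avg_allow   # ~+0.250 -> "+25.0% vs TC avg"

print(f"Career allow rate: {career_allow_rate:.0%}")
print(f"Delta vs TC average: {delta_vs_tc:+.1%}")
```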

Statute-Specific Performance

§101: 9.5% (-30.5% vs TC avg)
§103: 41.1% (+1.1% vs TC avg)
§102: 24.0% (-16.0% vs TC avg)
§112: 14.8% (-25.2% vs TC avg)
Tech Center averages are estimates based on career data from 537 resolved cases.
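
The per-statute deltas follow the same arithmetic. Back-solving from the displayed figures, each comparison implies a Tech Center baseline of roughly 40%; the short sketch below uses that assumed baseline and is illustrative only.

```python
# Per-statute rates from the table, compared to an assumed 40% Tech Center baseline
# (back-solved from the displayed deltas; not an independently published figure).
examiner_rates = {"§101": 0.095, "§103": 0.411, "§102": 0.240, "§112": 0.148}
tc_baseline = 0.400

for statute, rate in examiner_rates.items():
    delta = rate - tc_baseline
    print(f"{statute}: {rate:.1%} ({delta:+.1%} vs TC avg)")
```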

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claim 7 is objected to because of the following informalities: the limitation “the same first image plane display orientation” appears to be a typographical error of “the first image plane display orientation” as same was not previously recited. Appropriate correction is required.

Claim 8 is objected to because of the following informalities: the limitation “the same first ultrasound image.” appears to be a typographical error of “the first ultrasound image” as same was not previously recited. Appropriate correction is required.

Claim 15 is objected to because of the following informalities: the limitation “The same first image plane display orientation” appears to be a typographical error of “the first image plane display orientation” as same was not previously recited. Appropriate correction is required.

Claim 16 is objected to because of the following informalities: the limitation “the same first ultrasound image.” appears to be a typographical error of “the first ultrasound image” as same was not previously recited. Appropriate correction is required.

Claim 20 is objected to because of the following informalities: the limitation “the same first image plane display orientation” appears to be a typographical error of “the first image plane display orientation” as same was not previously recited. Appropriate correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 1, 2, 5-10, 13-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Rothberg et al. (US 2017/0360412) (hereinafter referred to as Rothberg) in view of Martin et al. (US 2014/0078138) (hereinafter referred to as Martin).
Regarding claim 1, Rothberg teaches A computer-implemented method (Aspects of the technology described herein relate to techniques for guiding an operator to use an ultrasound device. Thereby, operators with little or no experience operating ultrasound devices may capture medically relevant ultrasound images and/or interpret the contents of the obtained ultrasound images. For example, some of the techniques disclosed herein may be used to identify a particular anatomical view of a subject to image with an ultrasound device, guide an operator of the ultrasound device to capture an ultrasound image of the subject that contains the particular anatomical view, and/or analyze the captured ultrasound image to identify medical information about the subject. See abstract) comprising: defining a display position of an image plane proximate to a current position of an ultrasound probe instrument (An example of such an augmented reality interface is shown in FIG. 6 being displayed on a display 606 of a computing device 604. The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). See paragraph [0208])(See figure 6, element 610), the ultrasound probe instrument visible in an Augmented Reality (AR) environment generated by an AR headset device (An example of such an augmented reality interface is shown in FIG. 6 being displayed on a display 606 of a computing device 604. The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). See paragraph [0208])(See figure 6, element 610) (For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])(In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005]); rendering an AR situated view on the image plane in the AR environment, the situated view portraying ultrasound imagery captured by the ultrasound probe instrument (The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. 
Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]); determining an AR display orientation of the image plane based on one or more detected movements of the AR headset device (For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])(In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])(Camera is what captures the view and the camera changes based on detected head movements, position would change); but is silent to and registering one or more portions of the ultrasound imagery as being representative of respective portions of a three-dimensional (3D) medical model. Martin teaches capturing ultrasound image data and registering ultrasound imagery with a 3d model (The ultrasound device 116 can be operable to capture 2D ultrasound images of the patient pre-operatively and/or intra-operatively. For example, the ultrasound device 116 can be configured to capture one or more images of one or more salient anatomical features of the patient. The processing device 110 can register the ultrasound images to the 3D model. In some embodiments, the ultrasound device 116 can image the ablation instruments 113 and/or 114 intra-operatively, which can provide data to the processing device 110 sufficient for the processing device to transform the location or position of the ablation instruments 113 and/or 114 in the physical space to the 3D model. See paragraph [0026])(In some embodiments, such as in embodiments in which the 3D model is not updated in real time, the physical space (e.g., the operating room and/or the patient) can be registered to the 3D model, at 240, by conducting a pre-operative registration imaging, at 230. For example, at 230, an ultrasound and/or other medical image can be captured and used to register the patient, an ultrasound device, one or more surgical instruments (e.g., ablation probes), and/or an optical tracking detector to the 3D model. Once the physical space has been registered to the model, at 240, the model can be operable to track the motion of a surgical instrument in the physical space, at 245 (e.g., using the optical tracking system 115 and/or ultrasound device 116, as described above with reference to FIG. 1). See paragraph [0035])( For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]) Rothberg and Martin teach ultrasound imaging and Martin teaches that the ultrasound image can be registered to a 3d model, therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the system of Rothberg with the 3D registration techniques of Martin such that the ultrasound image data could be registered to a 3D model of the patient and updated in real-time. 
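
For orientation, the claim 1 limitations at issue (an image plane defined proximate to a tracked probe, with its display orientation driven by detected headset movement) can be pictured with a minimal geometric sketch. This is an illustrative example only; the function and variable names are hypothetical and are not drawn from the application, Rothberg, or Martin.

```python
import numpy as np

def image_plane_pose(probe_pos, probe_dir, headset_pos, offset=0.05):
    """Place an image plane just beyond the probe tip and face it toward the headset.

    probe_pos, headset_pos: 3-vectors in AR world coordinates.
    probe_dir: unit vector along the probe's scan axis.
    Returns (plane_center, plane_normal). Illustrative only.
    """
    # Display position "proximate to" the probe: a small offset along the scan axis.
    plane_center = np.asarray(probe_pos) + offset * np.asarray(probe_dir)

    # Display orientation driven by headset movement: turn the plane so its
    # normal points back at the viewer (a simple billboard rule).
    to_viewer = np.asarray(headset_pos) - plane_center
    plane_normal = to_viewer / np.linalg.norm(to_viewer)
    return plane_center, plane_normal

# Example: probe at the patient's torso, headset roughly half a meter away.
center, normal = image_plane_pose(
    probe_pos=[0.0, 0.0, 0.0],
    probe_dir=[0.0, 0.0, -1.0],
    headset_pos=[0.2, 0.5, 0.4],
)
print(center, normal)
```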
Regarding claim 2, Rothberg in view of Martin teaches The computer-implemented method of claim 1, wherein registering one or more portions of the ultrasound imagery comprises: receiving one or more adjustments to a position and an orientation of a rendering of the 3D medical model displayed in the AR environment, the 3D medical model representing physical anatomy portrayed by the ultrasound imagery of the situated view (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) (Martin; In some embodiments, such as in embodiments in which the 3D model is not updated in real time, the physical space (e.g., the operating room and/or the patient) can be registered to the 3D model, at 240, by conducting a pre-operative registration imaging, at 230. For example, at 230, an ultrasound and/or other medical image can be captured and used to register the patient, an ultrasound device, one or more surgical instruments (e.g., ablation probes), and/or an optical tracking detector to the 3D model. Once the physical space has been registered to the model, at 240, the model can be operable to track the motion of a surgical instrument in the physical space, at 245 (e.g., using the optical tracking system 115 and/or ultrasound device 116, as described above with reference to FIG. 1). See paragraph [0035])( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]); and receiving a selection indicating alignment of the 3D medical model and the ultrasound imagery of the situated view (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. 
This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]). Regarding claim 5, Rothberg in view of Martin teaches The computer-implemented method of claim 1, wherein determining the AR display orientation of the image plane comprises: determining a first display orientation of the image plane based on a current position and orientation of an AR headset device (Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. 
See paragraph [0034]); detecting one or more changes to the current position and orientation of an AR headset device ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]); and determining a second display orientation of the image plane based on the one or more changes to the position and orientation of an AR headset device ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. 
In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]). Regarding claim 6, Rothberg in view of Martin teaches The computer-implemented method of claim 5, wherein rendering the situated view at the image plane comprises: rendering first ultrasound image content portrayed by the situated view at the first display position of the image plane according to the first image plane display orientation, the first display position of the image plane and the first ultrasound image content corresponding to a first position of the ultrasound probe instrument ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. 
In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]). Regarding claim 7, Rothberg in view of Martin teaches The computer-implemented method of claim 6, wherein rendering the situated view comprises: rendering second ultrasound image content portrayed by the situated view at the second display position of the image plane according to the same first image plane display orientation, the second display position of the image plane and the second ultrasound image content corresponding to a subsequent second position of the ultrasound probe instrument ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034])(As probe moves image tracks probe and is updated live.). 
Regarding claim 8, Rothberg in view of Martin teaches the computer-implemented method of claim 7, further comprising: rendering the same first ultrasound image content portrayed by the situated view at the first display position of the image plane according to the second image plane display orientation, the first display position of the image plane and the first ultrasound image content corresponding to a first position of the ultrasound probe instrument, the second image plane display orientation corresponding to one or more changes to the current position and orientation of an AR headset device ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034])(As probe moves image tracks probe and is updated live.) (As user moves view of camera the ultrasound image is still projected based on camera position of device in the case of an HMD). Regarding claim 9, Rothberg teaches A system (FIG. 1 shows an example ultrasound system 100 that is configured to guide an operator of an ultrasound device 102 to obtain an ultrasound image of a target anatomical view of a subject 101. As shown, the ultrasound system 100 comprises an ultrasound device 102 that is communicatively coupled to the computing device 104 by a communication link 112. The computing device 104 may be configured to receive ultrasound data from the ultrasound device 102 and use the received ultrasound data to generate an ultrasound image 110. 
The computing device 104 may analyze the ultrasound image 110 to provide guidance to an operator of the ultrasound device 102 regarding how to reposition the ultrasound device 102 to capture an ultrasound image containing a target anatomical view. For example, the computing device 104 may analyze the ultrasound image 110 to determine whether the ultrasound image 110 contains a target anatomical view, such as a PLAX anatomical view. If the computing device 104 determines that the ultrasound image 110 contains the target anatomical view, the computing device 104 may provide an indication to the operator using a display 106 that the ultrasound device 102 is properly positioned. Otherwise, the computing device 104 may provide an instruction 108 using the display 106 to the operator regarding how to reposition the ultrasound device 102. See paragraph [0181]) comprising one or more processors, and a non-transitory computer-readable medium including one or more sequences of instructions that, when executed by the one or more processors, cause the system to perform operations (The computing device 104 may comprise one or more processing elements (such as a processor) to, for example, process ultrasound data received from the ultrasound device 102. Additionally, the computing device 104 may comprise one or more storage elements (such as a non-transitory computer readable medium) to, for example, store instructions that may be executed by the processing element(s) and/or store all or any portion of the ultrasound data received from the ultrasound device 102. See paragraph [0184]) comprising: defining a display position of an image plane proximate to a current position of an ultrasound probe instrument (An example of such an augmented reality interface is shown in FIG. 6 being displayed on a display 606 of a computing device 604. The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). See paragraph [0208])(See figure 6, element 610), the ultrasound probe instrument visible in an Augmented Reality (AR) environment generated by an AR headset device (An example of such an augmented reality interface is shown in FIG. 6 being displayed on a display 606 of a computing device 604. The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). See paragraph [0208])(See figure 6, element 610) (For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])(In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). 
See paragraph [0005]); rendering an AR situated view on the image plane in the AR environment, the situated view portraying ultrasound imagery captured by the ultrasound probe instrument (The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]); determining an AR display orientation of the image plane based on one or more detected movements of the AR headset device (For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])(In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])(Camera is what captures the view and the camera changes based on detected head movements, position would change), but is silent to and registering one or more portions of the ultrasound imagery as being representative of respective portions of a three-dimensional (3D) medical model. Martin teaches capturing ultrasound image data and registering ultrasound imagery with a 3d model (The ultrasound device 116 can be operable to capture 2D ultrasound images of the patient pre-operatively and/or intra-operatively. For example, the ultrasound device 116 can be configured to capture one or more images of one or more salient anatomical features of the patient. The processing device 110 can register the ultrasound images to the 3D model. In some embodiments, the ultrasound device 116 can image the ablation instruments 113 and/or 114 intra-operatively, which can provide data to the processing device 110 sufficient for the processing device to transform the location or position of the ablation instruments 113 and/or 114 in the physical space to the 3D model. See paragraph [0026])(In some embodiments, such as in embodiments in which the 3D model is not updated in real time, the physical space (e.g., the operating room and/or the patient) can be registered to the 3D model, at 240, by conducting a pre-operative registration imaging, at 230. For example, at 230, an ultrasound and/or other medical image can be captured and used to register the patient, an ultrasound device, one or more surgical instruments (e.g., ablation probes), and/or an optical tracking detector to the 3D model. 
Once the physical space has been registered to the model, at 240, the model can be operable to track the motion of a surgical instrument in the physical space, at 245 (e.g., using the optical tracking system 115 and/or ultrasound device 116, as described above with reference to FIG. 1). See paragraph [0035])( For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]) Rothberg and Martin teach ultrasound imaging and Martin teaches that the ultrasound image can be registered to a 3d model, therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the system of Rothberg with the 3D registration techniques of Martin such that the ultrasound image data could be registered to a 3D model of the patient and updated in real-time. Regarding claim 10, Rothberg in view of Martin teaches The system of claim 9, wherein registering one or more portions of the ultrasound imagery comprises: receiving one or more adjustments to a position and an orientation of a rendering of the 3D medical model displayed in the AR environment, the 3D medical model representing physical anatomy portrayed by the ultrasound imagery of the situated view (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) (Martin; In some embodiments, such as in embodiments in which the 3D model is not updated in real time, the physical space (e.g., the operating room and/or the patient) can be registered to the 3D model, at 240, by conducting a pre-operative registration imaging, at 230. For example, at 230, an ultrasound and/or other medical image can be captured and used to register the patient, an ultrasound device, one or more surgical instruments (e.g., ablation probes), and/or an optical tracking detector to the 3D model. Once the physical space has been registered to the model, at 240, the model can be operable to track the motion of a surgical instrument in the physical space, at 245 (e.g., using the optical tracking system 115 and/or ultrasound device 116, as described above with reference to FIG. 1). 
See paragraph [0035])( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]); and receiving a selection indicating alignment of the 3D medical model and the ultrasound imagery of the situated view (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]). Regarding claim 13, Rothberg in view of Martin teaches The system of claim 9, wherein determining the AR display orientation of the image plane comprises: determining a first display orientation of the image plane based on a current position and orientation of an AR headset device (Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. 
This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]); detecting one or more changes to the current position and orientation of an AR headset device ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]); and determining a second display orientation of the image plane based on the one or more changes to the position and orientation of an AR headset device ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). 
See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]). Regarding claim 14, Rothberg in view of Martin teaches The system of claim 13, wherein rendering the situated view at the image plane comprises: rendering first ultrasound image content portrayed by the situated view at the first display position of the image plane according to the first image plane display orientation, the first display position of the image plane and the first ultrasound image content corresponding to a first position of the ultrasound probe instrument ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. 
In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]). Regarding claim 15, Rothberg in view of Martin teaches The system of claim 14, wherein rendering the situated view comprises: rendering second ultrasound image content portrayed by the situated view at the second display position of the image plane according to the same first image plane display orientation, the second display position of the image plane and the second ultrasound image content corresponding to a subsequent second position of the ultrasound probe instrument ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. 
In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034])(As probe moves image tracks probe and is updated live.). Regarding claim 16, Rothberg in view of Martin teaches The system of claim 15, further comprising: rendering the same first ultrasound image content portrayed by the situated view at the first display position of the image plane according to the second image plane display orientation, the first display position of the image plane and the first ultrasound image content corresponding to a first position of the ultrasound probe instrument, the second image plane display orientation corresponding to one or more changes to the current position and orientation of an AR headset device ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034])(As probe moves image tracks probe and is updated live.) (As user moves view of camera the ultrasound image is still projected based on camera position of device in the case of an HMD). 
Regarding claim 17, Rothberg teaches A computer program product comprising a non-transitory computer-readable medium having a computer-readable program code embodied therein to be executed by one or more processors, the program code including instructions (The computing device 104 may comprise one or more processing elements (such as a processor) to, for example, process ultrasound data received from the ultrasound device 102. Additionally, the computing device 104 may comprise one or more storage elements (such as a non-transitory computer readable medium) to, for example, store instructions that may be executed by the processing element(s) and/or store all or any portion of the ultrasound data received from the ultrasound device 102. See paragraph [0184]) to perform: defining a display position of an image plane proximate to a current position of an ultrasound probe instrument (An example of such an augmented reality interface is shown in FIG. 6 being displayed on a display 606 of a computing device 604. The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). See paragraph [0208])(See figure 6, element 610), the ultrasound probe instrument visible in an Augmented Reality (AR) environment generated by an AR headset device (An example of such an augmented reality interface is shown in FIG. 6 being displayed on a display 606 of a computing device 604. The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). See paragraph [0208])(See figure 6, element 610) (For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])(In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005]); rendering an AR situated view on the image plane in the AR environment, the situated view portraying ultrasound imagery captured by the ultrasound probe instrument (The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. 
Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]); determining an AR display orientation of the image plane based on one or more detected movements of the AR headset device (For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])(In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])(Camera is what captures the view and the camera changes based on detected head movements, position would change), but is silent to and registering one or more portions of the ultrasound imagery as being representative of respective portions of a three-dimensional (3D) medical model. Martin teaches capturing ultrasound image data and registering ultrasound imagery with a 3d model (The ultrasound device 116 can be operable to capture 2D ultrasound images of the patient pre-operatively and/or intra-operatively. For example, the ultrasound device 116 can be configured to capture one or more images of one or more salient anatomical features of the patient. The processing device 110 can register the ultrasound images to the 3D model. In some embodiments, the ultrasound device 116 can image the ablation instruments 113 and/or 114 intra-operatively, which can provide data to the processing device 110 sufficient for the processing device to transform the location or position of the ablation instruments 113 and/or 114 in the physical space to the 3D model. See paragraph [0026])(In some embodiments, such as in embodiments in which the 3D model is not updated in real time, the physical space (e.g., the operating room and/or the patient) can be registered to the 3D model, at 240, by conducting a pre-operative registration imaging, at 230. For example, at 230, an ultrasound and/or other medical image can be captured and used to register the patient, an ultrasound device, one or more surgical instruments (e.g., ablation probes), and/or an optical tracking detector to the 3D model. Once the physical space has been registered to the model, at 240, the model can be operable to track the motion of a surgical instrument in the physical space, at 245 (e.g., using the optical tracking system 115 and/or ultrasound device 116, as described above with reference to FIG. 1). See paragraph [0035])( For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]) Rothberg and Martin teach ultrasound imaging and Martin teaches that the ultrasound image can be registered to a 3d model, therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the system of Rothberg with the 3D registration techniques of Martin such that the ultrasound image data could be registered to a 3D model of the patient and updated in real-time. 
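The registration limitation supplied by Martin can be pictured as a single rigid transform that carries ultrasound pixels into the coordinate frame of the 3D medical model. A minimal sketch under that assumption follows (the function name, the millimetre-per-pixel scale, and the frame conventions are placeholders for illustration, not Martin's actual data structures):

```python
# Illustrative sketch only: mapping ultrasound pixels into a 3D medical model's
# coordinate frame via a rigid registration transform.
import numpy as np

def register_pixel_to_model(col, row, probe_to_model, mm_per_pixel):
    """Map an ultrasound pixel (column, row) to a 3D point in the model frame.

    probe_to_model : 4x4 homogeneous transform from the image/probe frame to the
                     3D medical model frame (assumed output of a registration step).
    mm_per_pixel   : physical size represented by one pixel.
    """
    # Pixel indices -> millimetres on the image plane (lateral = x, depth = y, z = 0).
    p_image = np.array([col * mm_per_pixel, row * mm_per_pixel, 0.0, 1.0])
    p_model = np.asarray(probe_to_model, dtype=float) @ p_image
    return p_model[:3]

# Example: with an identity registration and 0.1 mm/pixel, pixel (100, 200)
# corresponds to the model-frame point (10 mm, 20 mm, 0 mm):
# register_pixel_to_model(100, 200, np.eye(4), 0.1) -> array([10., 20., 0.])
```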
Regarding claim 18, Rothberg in view of Martin teaches The computer program product of claim 17, wherein registering one or more portions of the ultrasound imagery comprises: receiving one or more adjustments to a position and an orientation of a rendering of the 3D medical model displayed in the AR environment, the 3D medical model representing physical anatomy portrayed by the ultrasound imagery of the situated view (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) (Martin; In some embodiments, such as in embodiments in which the 3D model is not updated in real time, the physical space (e.g., the operating room and/or the patient) can be registered to the 3D model, at 240, by conducting a pre-operative registration imaging, at 230. For example, at 230, an ultrasound and/or other medical image can be captured and used to register the patient, an ultrasound device, one or more surgical instruments (e.g., ablation probes), and/or an optical tracking detector to the 3D model. Once the physical space has been registered to the model, at 240, the model can be operable to track the motion of a surgical instrument in the physical space, at 245 (e.g., using the optical tracking system 115 and/or ultrasound device 116, as described above with reference to FIG. 1). See paragraph [0035])( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]); and receiving a selection indicating alignment of the 3D medical model and the ultrasound imagery of the situated view (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. 
This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]). Regarding claim 19, Rothberg in view of Martin teaches The computer program product of claim 17,wherein determining the AR display orientation of the image plane comprises: determining a first display orientation of the image plane based on a current position and orientation of an AR headset device (Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. 
See paragraph [0034]); detecting one or more changes to the current position and orientation of an AR headset device ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]); and determining a second display orientation of the image plane based on the one or more changes to the position and orientation of an AR headset device ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. 
In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]). Regarding claim 20, Rothberg in view of Martin teaches The computer program product of claim 19, further comprising: rendering first ultrasound image content portrayed by the situated view at the first display position of the image plane according to the first image plane display orientation, the first display position of the image plane and the first ultrasound image content corresponding to a first position of the ultrasound probe instrument ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. 
In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034]); and rendering second ultrasound image content portrayed by the situated view at the second display position of the image plane according to the same first image plane display orientation, the second display position of the image plane and the second ultrasound image content corresponding to a subsequent second position of the ultrasound probe instrument ((Rothberg; For example, the computing device 504 may be implemented as a wearable headset and/or a pair of smart glasses ( e.g., GOOGLE GLASS, APPLE AR glasses, and MICROSOFT HOLOLENS). See paragraph [0203])( (Rothberg; In some embodiments, these techniques may be embodied in a software application (hereinafter "App") that may be installed on a computing device ( e.g., a mobile smartphone, a tablet, a laptop, a smart watch, virtual reality (VR) headsets, augmented reality (AR) headsets, smart wearable devices, etc.). See paragraph [0005])( (Rothberg; Camera is what captures the view and the camera changes based on detected head movements, position would change) (Rothberg; The augmented reality interface overlays the ultrasound image 610 and an ultrasound device symbol 608 onto an image of an ultrasound device 602 being used to image the subject 601 (e.g., captured from a front-facing camera in the handheld device computing device 604). As shown, the ultrasound image 610 is overlaid onto the portion of the subject 601 that is being imaged by the ultrasound device 602. In particular, the ultrasound image 610 has been positioned and oriented so as to be extending from the ultrasound device 602 into the subject 601. This position and orientation of the ultrasound image 610 may indicate to the operator the particular portion of the subject 601 that is being imaged. For example, the ultrasound device 602 may be positioned on an upper torso of the subject 601 and the ultrasound image 610 may extend from an end of the ultrasound device 602 in contact with the subject 601 into the upper torso of the subject 601. Thereby, the operator may be informed that the captured image is that of a 2D cross-section of body tissue in the upper torso of subject 601. See paragraph [0208]) ( Martin; For example, the scan, at 210, and or the model generation at 220, can occur simultaneously with, or otherwise overlapping in time with, a surgical intervention. In this manner, the diagnostic scan and/or 3D model can be updated in real time (e.g., "live") or close to real time as the surgical intervention progresses. See paragraph [0034])(As probe moves image tracks probe and is updated live.). Claim(s) 3 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Rothberg et al. (US 2017/0360412)(Hereinafter referred to as Rothberg) in view of Martin et al. (US 2014/0078138)(Hereinafter referred to as Martin) in view of Yu et al. (US 2010/0331699)(Hereinafter referred to as Yu). Regarding claim 3, Rothberg in view of Martin teaches the computer-implemented method of claim 1, but is silent to further comprising: receiving a calibration selection that corresponds to ultrasound imagery depth. Yu teaches calibrating ultrasound gain/depth for an ultrasound system (Software interface. See FIG. 6. 
Ultrasound beamformer independent software generic control interface: Probe insertion/removal detection and notification; Probe and application selection; Scanning modality selection; Ultrasound beam gain/depth calibration; Ultrasound beam lateral field of view control; Ultrasound beam depth control; Ultrasound tissue dependent acquisition parameters; Ultrasound image position tagging: Transducer parameter reprogramming. See paragraph [0049]). Rothberg in view of Martin and Yu teach performing ultrasound imaging and Yu teaches calibrating the probe for gain/depth; therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the system of Rothberg in view of Martin with the calibration techniques of Yu such that the system could be properly calibrated for the type of ultrasound probe and procedure being performed. Regarding claim 11, Rothberg in view of Martin teaches The system of claim 9, but is silent to further comprising: receiving a calibration selection that corresponds to ultrasound imagery depth. Yu teaches calibrating ultrasound gain/depth for an ultrasound system (Software interface. See FIG. 6. Ultrasound beamformer independent software generic control interface: Probe insertion/removal detection and notification; Probe and application selection; Scanning modality selection; Ultrasound beam gain/depth calibration; Ultrasound beam lateral field of view control; Ultrasound beam depth control; Ultrasound tissue dependent acquisition parameters; Ultrasound image position tagging: Transducer parameter reprogramming. See paragraph [0049]). Rothberg in view of Martin and Yu teach performing ultrasound imaging and Yu teaches calibrating the probe for gain/depth; therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the system of Rothberg in view of Martin with the calibration techniques of Yu such that the system could be properly calibrated for the type of ultrasound probe and procedure being performed. Claim(s) 4 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Rothberg et al. (US 2017/0360412)(Hereinafter referred to as Rothberg) in view of Martin et al. (US 2014/0078138)(Hereinafter referred to as Martin) in view of Yu et al. (US 2010/0331699)(Hereinafter referred to as Yu) in view of Kniest (US 2002/0156864)(Hereinafter referred to as Kniest). Regarding claim 4, Rothberg in view of Martin in view of Yu teaches the computer-implemented method of claim 3, but is silent to further comprising: receiving an assignment of a measure of distance per pixel of the situated view. Kniest teaches the ability to measure an object on a display knowing the distance per pixel (It may be desirable to actually "measure" an object on the display. WebAngel allows the user to place cursors on an image and show the distance between them (based on the information provided for distance per pixel). Areas, velocities (e.g. blood flow) and even volumes of objects on images are estimated using a variety of measurement schemes already developed for medical imaging devices (e.g. ultrasound machines). See paragraph [0352]).
Rothberg in view of Martin in view of Yu and Kniest teach ultrasound imaging and Kniest teaches that knowing the distance per pixel the user can measure the distance between objects on screen; therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the system of Rothberg in view of Martin in view of Yu with the distance per pixel measuring techniques of Kniest such that the user would have the ability to measure desired objects on screen for analysis and diagnosis purposes. Regarding claim 12, Rothberg in view of Martin in view of Yu teaches the system of claim 11, but is silent to further comprising: receiving an assignment of a measure of distance per pixel of the situated view. Kniest teaches the ability to measure an object on a display knowing the distance per pixel (It may be desirable to actually "measure" an object on the display. WebAngel allows the user to place cursors on an image and show the distance between them (based on the information provided for distance per pixel). Areas, velocities (e.g. blood flow) and even volumes of objects on images are estimated using a variety of measurement schemes already developed for medical imaging devices (e.g. ultrasound machines). See paragraph [0352]). Rothberg in view of Martin in view of Yu and Kniest teach ultrasound imaging and Kniest teaches that knowing the distance per pixel the user can measure the distance between objects on screen; therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the system of Rothberg in view of Martin in view of Yu with the distance per pixel measuring techniques of Kniest such that the user would have the ability to measure desired objects on screen for analysis and diagnosis purposes. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to NICHOLAS R WILSON whose telephone number is (571)272-0936. The examiner can normally be reached M-F 7:30-5:00PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kee Tung, can be reached at (572)-272-7794. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /NICHOLAS R WILSON/Primary Examiner, Art Unit 2611
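The Yu and Kniest features addressed in claims 3, 4, 11, and 12 reduce to scale arithmetic: a depth-calibration selection fixes how much distance each pixel of the situated view represents, and two cursors placed on that view can then report a physical measurement. A minimal sketch under those assumptions (the linear depth-to-pixel mapping and the function names are illustrative, not taken from Yu or Kniest):

```python
# Illustrative sketch only: a depth calibration selection yields a distance-per-pixel
# value, which in turn supports simple on-screen caliper measurements.
import math

def mm_per_pixel(selected_depth_mm, image_height_px):
    """Distance represented by one pixel, implied by the selected imaging depth."""
    return selected_depth_mm / image_height_px

def measure_mm(cursor_a, cursor_b, scale_mm_per_px):
    """Physical distance between two cursor positions (in pixels) on the situated view."""
    dx = (cursor_a[0] - cursor_b[0]) * scale_mm_per_px
    dy = (cursor_a[1] - cursor_b[1]) * scale_mm_per_px
    return math.hypot(dx, dy)

# Example: a 60 mm depth selection on a 600-pixel-tall view gives 0.1 mm/pixel,
# so two cursors 250 pixels apart report a 25 mm structure:
# measure_mm((100, 100), (100, 350), mm_per_pixel(60, 600)) -> 25.0
```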

Prosecution Timeline

Jun 20, 2024
Application Filed
Dec 23, 2025
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602869
APPARATUS, SYSTEMS AND METHODS FOR PROCESSING IMAGES
2y 5m to grant • Granted Apr 14, 2026
Patent 12602891
TELEPORTATION SYSTEM COMBINING VIRTUAL REALITY AND AUGMENTED REALITY
2y 5m to grant • Granted Apr 14, 2026
Patent 12579605
INFORMATION PROCESSING DEVICE AND METHOD OF CONTROLLING DISPLAY DEVICE
2y 5m to grant • Granted Mar 17, 2026
Patent 12567215
SYSTEM AND METHOD OF CONTROLLING SYSTEM
2y 5m to grant • Granted Mar 03, 2026
Patent 12561911
3D CAGE GENERATION USING SIGNED DISTANCE FUNCTION APPROXIMANT
2y 5m to grant • Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

1-2
Expected OA Rounds
87%
Grant Probability
99%
With Interview (+12.1%)
1y 12m
Median Time to Grant
Low
PTA Risk
Based on 537 resolved cases by this examiner. Grant probability derived from career allow rate.
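The interview figure is consistent with simple addition of the displayed inputs: an 87% career allow rate plus the 12.1-point interview lift gives roughly 99.1%, shown above rounded to 99%.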
